author    Joey Hess <joeyh@debian.org>  2013-07-10 21:49:23 -0400
committer Joey Hess <joeyh@debian.org>  2013-07-10 21:49:23 -0400
commit    1fc3f034191d3eec78b4d5da343e282092a221be (patch)
tree      d381dca05a61ec159803b92417e5393d8c10ed2b /doc
download  ikiwiki-1fc3f034191d3eec78b4d5da343e282092a221be.tar
          ikiwiki-1fc3f034191d3eec78b4d5da343e282092a221be.tar.gz
ikiwiki (3.20130711) unstable; urgency=low
  * Deal with git behavior change in 1.7.2 and newer that broke support
    for commits with an empty commit message.
  * Pass --no-edit when used with git 1.7.8 and newer.

# imported from the archive
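The changelog entry above gates behavior on the installed git version: `--no-edit` only exists in git 1.7.8 and newer, so it must not be passed to older gits. A minimal shell sketch of such a version check follows; this is an illustration only, not ikiwiki's actual implementation (ikiwiki is written in Perl, and the helper name `ver_ge` here is hypothetical):

```shell
# Return success (0) if dotted version $1 >= $2, using sort -V ordering.
ver_ge() {
    [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# Third field of "git version X.Y.Z"; empty (treated as 0) if git is absent.
git_version=${GIT_VERSION:-$(git --version 2>/dev/null | awk '{print $3}')}

no_edit=
if ver_ge "${git_version:-0}" 1.7.8; then
    no_edit=--no-edit   # safe to pass only on git >= 1.7.8
fi
echo "would run: git commit $no_edit"
```

The same pattern handles the other fix in this commit: detect the git version once, then branch on it wherever a flag or behavior changed between releases.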
Diffstat (limited to 'doc')
-rw-r--r--doc/GPL340
-rw-r--r--doc/TourBusStop.mdwn30
-rw-r--r--doc/anchor.mdwn11
-rw-r--r--doc/backlinks.mdwn2
-rw-r--r--doc/banned_users.mdwn10
-rw-r--r--doc/banned_users/discussion.mdwn31
-rw-r--r--doc/basewiki.mdwn26
-rw-r--r--doc/basewiki/index.mdwn7
-rw-r--r--doc/basewiki/sandbox.mdwn32
-rw-r--r--doc/blog.mdwn4
-rw-r--r--doc/branches.mdwn25
-rw-r--r--doc/bugs.mdwn13
-rw-r--r--doc/bugs/2.45_Compilation_error.mdwn198
-rw-r--r--doc/bugs/404_plugin_and_lighttpd.mdwn45
-rw-r--r--doc/bugs/404_plugin_should_handle_403.mdwn16
-rw-r--r--doc/bugs/404_when_cancel_create_page.mdwn60
-rw-r--r--doc/bugs/4_spaces_after_bullet.mdwn18
-rw-r--r--doc/bugs/Add_a_footer_div_on_all_pages_to_improve_theming.mdwn149
-rw-r--r--doc/bugs/Add_permissions_for_suggesting__47__accepting_edits.mdwn15
-rw-r--r--doc/bugs/Aggregated_Atom_feeds_are_double-encoded.mdwn22
-rw-r--r--doc/bugs/Allow_overriding_of_symlink_restriction.mdwn139
-rw-r--r--doc/bugs/Another_UTF-8_problem.mdwn16
-rw-r--r--doc/bugs/Attachment_plug-in_not_committing_files.mdwn18
-rw-r--r--doc/bugs/Broken_URL_to_your_blog_script.mdwn10
-rw-r--r--doc/bugs/Broken_access_to_Ikiwiki_gitweb.mdwn19
-rw-r--r--doc/bugs/Building_a_sidebar_does_not_regenerate_the_subpages.mdwn8
-rw-r--r--doc/bugs/CGI__44___formbuilder__44___non-existent_field_address.mdwn59
-rw-r--r--doc/bugs/CGI_edit_and_slash_in_page_title.mdwn18
-rw-r--r--doc/bugs/CGI_problem_with_some_webservers.mdwn108
-rw-r--r--doc/bugs/CGI_showed_HTML_when_perl_error.mdwn40
-rw-r--r--doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn28
-rw-r--r--doc/bugs/CamelCase_and_Recent_Changes_create_spurious_Links.mdwn11
-rw-r--r--doc/bugs/Can__39__t_build_2.49__63__.mdwn35
-rw-r--r--doc/bugs/Can__39__t_connect_to_ikiwiki_git_repo_through_http.mdwn18
-rw-r--r--doc/bugs/Can__39__t_create_root_page.mdwn69
-rw-r--r--doc/bugs/Can__39__t_deplete_page__63__.mdwn8
-rw-r--r--doc/bugs/Can__39__t_rebuild_wiki_pages_with_ikiwiki_2.49.mdwn44
-rw-r--r--doc/bugs/Cannot_inline_pages_with_apostrophes_in_title.mdwn7
-rw-r--r--doc/bugs/Changing_language.mdwn9
-rw-r--r--doc/bugs/Checksum_errors_on_the_pristine-tar_branch.mdwn112
-rw-r--r--doc/bugs/Command-line_arguments_should_override_settings_in_the_setup_file.mdwn32
-rw-r--r--doc/bugs/Comments_are_not_sorted_by_their_date_attribute.mdwn71
-rw-r--r--doc/bugs/Comments_dissapeared.mdwn69
-rw-r--r--doc/bugs/Comments_link_is_to_index.html_if_usedirs_is_on.mdwn5
-rw-r--r--doc/bugs/Convert___34__somehost.com/user__34___OpenID_at_RecentChanges_page.mdwn44
-rw-r--r--doc/bugs/Disappearing_Pages.mdwn41
-rw-r--r--doc/bugs/Discussion_link_not_translated_after_page_update.mdwn27
-rw-r--r--doc/bugs/Discussion_link_not_translated_in_post.mdwn67
-rw-r--r--doc/bugs/Discussion_of_main_page_generates_invalid_link.mdwn3
-rw-r--r--doc/bugs/Does_IkiWiki::Setup::load__40____41___really_return_a_hash__63__.mdwn10
-rw-r--r--doc/bugs/Dupe_entry_in_Bundle::IkiWiki::Extras.pm.mdwn5
-rw-r--r--doc/bugs/Encoding_problem_in_calendar_plugin.mdwn73
-rw-r--r--doc/bugs/Error:_OpenID_failure:_time_bad_sig:.mdwn83
-rw-r--r--doc/bugs/Error:_Your_login_session_has_expired._.mdwn46
-rw-r--r--doc/bugs/Error:_no_text_was_copied_in_this_page_--_missing_page_dependencies.mdwn46
-rw-r--r--doc/bugs/Exception:_Unknown_function___96__this__39___.mdwn70
-rw-r--r--doc/bugs/Excessive_list_nesting_in_map_output_for_subpages.mdwn3
-rw-r--r--doc/bugs/Existing_Discussion_pages_appear_as_non-existing.mdwn5
-rw-r--r--doc/bugs/External_link:_underscore_conversion.mdwn25
-rw-r--r--doc/bugs/External_links_with_Creole.mdwn5
-rw-r--r--doc/bugs/Fancy_characters_get_munged_on_page_save.mdwn29
-rw-r--r--doc/bugs/Feeds_get_wrong_timezone..mdwn10
-rw-r--r--doc/bugs/Feeds_link_to_index.html_instead_of_directory.mdwn37
-rw-r--r--doc/bugs/Filenames_with_colons_cause_problems_for_Windows_users.mdwn75
-rw-r--r--doc/bugs/Git:_changed_behavior_w.r.t._timestamps.mdwn214
-rw-r--r--doc/bugs/Git:_web_commit_message_not_utf-8.mdwn17
-rw-r--r--doc/bugs/Graphviz_plug-in_directive_changed_in_2.60.mdwn11
-rw-r--r--doc/bugs/HTML_for_parentlinks_makes_theming_hard.mdwn45
-rw-r--r--doc/bugs/HTML_inlined_into_Atom_not_necessarily_well-formed.mdwn35
-rw-r--r--doc/bugs/HTML_is_not_update_nor_created_when_editing_markdown_via_CGI.mdwn43
-rw-r--r--doc/bugs/Highlight_extension_uses_hard_coded_paths.mdwn3
-rw-r--r--doc/bugs/Http_error_500_when_using_mercurial_backend.mdwn31
-rw-r--r--doc/bugs/Hyperestraier_search_plug-in_defective.mdwn55
-rw-r--r--doc/bugs/INC_location_not_set_correctly_in_make_test.mdwn24
-rw-r--r--doc/bugs/IkiWiki::Setup::load__40____41___broken_outside_ikiwiki__63__.mdwn20
-rw-r--r--doc/bugs/IkiWiki::Wrapper_should_use_destdir.mdwn23
-rw-r--r--doc/bugs/IkiWiki::Wrapper_should_use_destdir/discussion.mdwn4
-rw-r--r--doc/bugs/IkiWiki_does_not_use_file__39__s_mtime_for_Last_Edited.mdwn21
-rw-r--r--doc/bugs/Index_files_have_wrong_permissions.mdwn14
-rw-r--r--doc/bugs/Inline_doesn__39__t_wikilink_to_pages.mdwn96
-rw-r--r--doc/bugs/Insecure_dependency_in_eval_while_running_with_-T_switch.mdwn98
-rw-r--r--doc/bugs/Insecure_dependency_in_mkdir.mdwn160
-rw-r--r--doc/bugs/Insecure_dependency_in_utime.mdwn14
-rw-r--r--doc/bugs/Linkmap_doesn__39__t_support_multiple_linkmaps_on_a_single_page.mdwn3
-rw-r--r--doc/bugs/Links_to_missing_pages_should_always_be_styled.mdwn5
-rw-r--r--doc/bugs/Links_with_symbols_can__39__t_be_edited.mdwn22
-rw-r--r--doc/bugs/MTIME_not_set_for_inline_or_archive_entries.mdwn22
-rw-r--r--doc/bugs/Map_sorts_subtags_under_a_different_tag.mdwn49
-rw-r--r--doc/bugs/Mercurial_example_diffurl_should_read_r2__44___not_changeset.mdwn11
-rw-r--r--doc/bugs/Meta_plugin_does_not_respect_htmlscrubber__95__skip_setting.___40__patch__41__.mdwn11
-rw-r--r--doc/bugs/Missing_build-dep_on_perlmagick__63__.mdwn14
-rw-r--r--doc/bugs/Missing_constant_domain_at_IkiWiki.pm_line_842.mdwn50
-rw-r--r--doc/bugs/Monotone_rcs_support.mdwn58
-rw-r--r--doc/bugs/More_permission_checking.mdwn17
-rw-r--r--doc/bugs/Navbar_does_not_link_to_page_being_commented_on_while_commenting.mdwn11
-rw-r--r--doc/bugs/New_comments_are_not_always_displayed__59___need_page_refresh_to_appear.mdwn35
-rw-r--r--doc/bugs/No___34__sid__34___in_forms_resulting_in_Error:_Your_login_session_has_expired..mdwn39
-rw-r--r--doc/bugs/No_categories_in_RSS__47__Atom_feeds.mdwn7
-rw-r--r--doc/bugs/No_link_for_blog_items_when_filename_contains_a_colon.mdwn76
-rw-r--r--doc/bugs/No_numbacklinks_setting_for___34__no_limit__34__.mdwn4
-rw-r--r--doc/bugs/No_progress_in_progress_bar.mdwn43
-rw-r--r--doc/bugs/Not_all_comments_are_listed_by___33__map_or___33__inline.mdwn68
-rw-r--r--doc/bugs/Obsolete_templates__47__estseek.conf.mdwn3
-rw-r--r--doc/bugs/OpenID_delegation_fails_on_my_server.mdwn53
-rw-r--r--doc/bugs/PNG_triggers_UTF-8_error_in_MimeInfo.pm.mdwn25
-rw-r--r--doc/bugs/PREFIX_not_honoured_for_underlaydir.mdwn44
-rw-r--r--doc/bugs/Pages_with_non-ascii_characters_like_öäå_in_name_not_found_directly_after_commit.mdwn145
-rw-r--r--doc/bugs/Patch:_Fix_error_in_style.css.mdwn37
-rw-r--r--doc/bugs/Perl_scripts_depend_on___47__usr__47__bin__47__perl.mdwn6
-rw-r--r--doc/bugs/Please_avoid_using___39__cp_-a__39___in_Makefile.PL.mdwn77
-rw-r--r--doc/bugs/Please_don__39__t_refer_to_offsite_openid_image.mdwn19
-rw-r--r--doc/bugs/Preview_removes_page_location_drop-down_options.mdwn10
-rw-r--r--doc/bugs/Problem_with_displaying_smileys_on_inline_page.mdwn25
-rw-r--r--doc/bugs/Problem_with_editing_page_after_first_SVN_commit.mdwn209
-rw-r--r--doc/bugs/Problem_with_toc.pm_plug-in.mdwn37
-rw-r--r--doc/bugs/Problem_with_umlauts_and_friends.mdwn16
-rw-r--r--doc/bugs/Problems_using_cygwin.mdwn20
-rw-r--r--doc/bugs/Problems_with_graphviz.pm_plug-in.mdwn43
-rw-r--r--doc/bugs/Problems_with_graphviz.pm_plug-in_previews.mdwn54
-rw-r--r--doc/bugs/RecentChanges_broken_with_empty_svnpath.mdwn26
-rw-r--r--doc/bugs/RecentChanges_contains_invalid_XHTML.mdwn55
-rw-r--r--doc/bugs/RecentChanges_links_to_deleted_pages.mdwn15
-rw-r--r--doc/bugs/Remove_redirect_pages_from_inline_pages.mdwn15
-rw-r--r--doc/bugs/Renaming_a_file_via_the_web_is_failing_when_using_subversion.mdwn28
-rw-r--r--doc/bugs/Running_on_an_alternative_port_fails.mdwn93
-rw-r--r--doc/bugs/SSI_include_stripped_from_mdwn.mdwn21
-rw-r--r--doc/bugs/SVG_files_not_recognized_as_images.mdwn39
-rw-r--r--doc/bugs/Search_Help_doesn__39__t_exist.mdwn8
-rw-r--r--doc/bugs/Search_results_should_point_to_dir__44___not_index.html__44___when_use__95__dirs_is_enabled.mdwn15
-rw-r--r--doc/bugs/Search_summary_includes_text_from_navigational_elements.mdwn22
-rw-r--r--doc/bugs/Site_title_not_clickable_while_adding_a_comment.mdwn9
-rw-r--r--doc/bugs/Slow_Filecheck_attachments___34__snails_it_all__34__.mdwn47
-rw-r--r--doc/bugs/Slow_Filecheck_attachments___34__snails_it_all__34__/discussion.mdwn141
-rw-r--r--doc/bugs/Smileys_in_the_block_code.mdwn34
-rw-r--r--doc/bugs/Spaces_in_link_text_for_ikiwiki_links.mdwn53
-rw-r--r--doc/bugs/Spurious___60__p__62___elements_added_to_tags_in_inliine_pages.mdwn43
-rw-r--r--doc/bugs/Spurious___60__p__62___elements_added_to_tags_in_inliine_pages/discussion.mdwn9
-rw-r--r--doc/bugs/Sub-Discussion_pages_have_a_broken___34__FormattingHelp__34___link.mdwn3
-rw-r--r--doc/bugs/Symlinked_srcdir_requires_trailing_slash.mdwn81
-rw-r--r--doc/bugs/Tab_delimited_tables_don__39__t_work.mdwn22
-rw-r--r--doc/bugs/Template_variable_not_passed_as-is__63____33__.mdwn23
-rw-r--r--doc/bugs/Titles_are_lower-cased_when_creating_a_page.mdwn37
-rw-r--r--doc/bugs/Toc_map_and_template_plugins_do_not_play_well_together.mdwn30
-rw-r--r--doc/bugs/Trailing_slash_breaks_links.mdwn9
-rw-r--r--doc/bugs/URLs_with_parentheses_displayed_badly.mdwn19
-rw-r--r--doc/bugs/UTF-16_and_UTF-32_are_unhandled.mdwn29
-rw-r--r--doc/bugs/UTF-8_BOM_showing_up_inside_a_page__63__.mdwn38
-rw-r--r--doc/bugs/UTF-8_in_attachment_filenames.mdwn25
-rw-r--r--doc/bugs/Unable_to_add_attachments_to_some_pages.mdwn31
-rw-r--r--doc/bugs/Undefined_subroutine_IkiWiki::escapeHTML.mdwn27
-rw-r--r--doc/bugs/Undefined_subroutine_IkiWiki::refresh.mdwn7
-rw-r--r--doc/bugs/Underscores_in_links_don__39__t_appear.mdwn18
-rw-r--r--doc/bugs/Use_install__40__1__41___instead_of_cp__40__1__41___for_installing_files.mdwn32
-rw-r--r--doc/bugs/W3MMode_still_uses_http:__47____47__localhost__63__.mdwn34
-rw-r--r--doc/bugs/Warns_about_use_of_uninitialized_value_if_prefix__95__directives_is_on_and_a_directive_does_not_contain_a_space.mdwn19
-rw-r--r--doc/bugs/Weird_interaction_between_toc_plugin_and_page_sections.mdwn40
-rw-r--r--doc/bugs/Wrong_permissions_on_4_smileys.mdwn10
-rw-r--r--doc/bugs/XHTML_needs_xmlns_attribute_on_html_element.mdwn5
-rw-r--r--doc/bugs/__33__inline_sort__61____34__meta__40__date__41____34___ignored.mdwn41
-rw-r--r--doc/bugs/__34__Currently_enabled_SSH_keys:__34___shows_only_first_139_characters_of_each_key.mdwn12
-rw-r--r--doc/bugs/__34__First_post__34___deletion_does_not_refresh_front_page.mdwn6
-rw-r--r--doc/bugs/__34__more__34___doesn__39__t_work.mdwn17
-rw-r--r--doc/bugs/__34__skipping_bad_filename__34___error_when_src_path_contains_spaces.mdwn5
-rw-r--r--doc/bugs/__36__ENV__123__PATH__125___should_include_PREFIXbin.mdwn19
-rw-r--r--doc/bugs/__38__uuml__59___in_markup_makes_ikiwiki_not_un-escape_HTML_at_all.mdwn47
-rw-r--r--doc/bugs/__60__br__62___tags_are_removed_from_markdown_inline_HTML.mdwn31
-rw-r--r--doc/bugs/__63__Discussion_when_not_CGI_mode.mdwn9
-rw-r--r--doc/bugs/__91__PATCH__93___Use_correct_perl_when_running_make.html17
-rw-r--r--doc/bugs/__91__SOLVED__93___Pandoc_plugin_and_UTF-8:_IkiWiki_and_UTF-8.mdwn11
-rw-r--r--doc/bugs/__91__SOLVED__93___Pandoc_plugin_and_UTF-8:_IkiWiki_and_UTF-8/discussion.mdwn23
-rw-r--r--doc/bugs/__96____96__clear:_both__39____39___for___96__.page__42____39____63__.mdwn35
-rw-r--r--doc/bugs/__96__wiki__95__file__95__chars__96___setting_not_propagated_to_CGI_wrapper.mdwn28
-rw-r--r--doc/bugs/absolute_sizes_in_default_CSS.mdwn39
-rw-r--r--doc/bugs/aggregate_generates_long_filenames.mdwn40
-rw-r--r--doc/bugs/aggregate_global_feed_names.mdwn13
-rw-r--r--doc/bugs/aggregate_plugin_errors.mdwn62
-rw-r--r--doc/bugs/aggregate_plugin_errors/discussion.mdwn6
-rw-r--r--doc/bugs/aggregate_plugin_should_honour_a_post__39__s_mctime.mdwn15
-rw-r--r--doc/bugs/aggregate_removed_feeds_linger.mdwn11
-rw-r--r--doc/bugs/aggregateinline_planets_wrongly_link_to_posts.mdwn17
-rw-r--r--doc/bugs/align_doesn__39__t_always_work_with_img_plugin_.mdwn7
-rw-r--r--doc/bugs/anonok_vs._httpauth.mdwn118
-rw-r--r--doc/bugs/argument_isn__39__t_numeric:_mixing_templates_and_creation__95__date.mdwn62
-rw-r--r--doc/bugs/attachment:_escaping_underscores_in_filename__63__.mdwn22
-rw-r--r--doc/bugs/attachment:_failed_to_get_filehandle.mdwn115
-rw-r--r--doc/bugs/attachment_plugin_enabled_by_default__63__.mdwn19
-rw-r--r--doc/bugs/attachment_upload_does_not_work_for_windows_clients.mdwn34
-rw-r--r--doc/bugs/backlink__40__.__41___doesn__39__t_work.mdwn57
-rw-r--r--doc/bugs/backlinks_onhover_thing_can_go_weird.mdwn43
-rw-r--r--doc/bugs/barfs_on_recentchange_entry_for_a_change_removing_an_invalid_pagespec.mdwn48
-rw-r--r--doc/bugs/basewiki_uses_meta_directives_but_meta_is_not_enabled_by_default.mdwn5
-rw-r--r--doc/bugs/beautify__95__urlpath_will_add_.__47___even_if_it_is_already_present.mdwn3
-rw-r--r--doc/bugs/bestlink_change_update_issue.mdwn32
-rw-r--r--doc/bugs/bestlink_returns_deleted_pages.mdwn75
-rw-r--r--doc/bugs/blog_posts_not_added_to_mercurial_repo.mdwn50
-rw-r--r--doc/bugs/blog_spam_plugin_not_allowing_non-ASCII_chars__63__.mdwn15
-rw-r--r--doc/bugs/blogspam__95__options_whitelist_vs._IPv6__63__.mdwn4
-rw-r--r--doc/bugs/blogspam_marks_me_as_spam_on_ipv6.mdwn8
-rw-r--r--doc/bugs/both_inline_and_comment_create_elements_id__61__feedlink.mdwn15
-rw-r--r--doc/bugs/broken_page_after_buggy_remove.mdwn4
-rw-r--r--doc/bugs/broken_parentlinks.mdwn50
-rw-r--r--doc/bugs/brokenlinks_accumulates_duplicate_items.mdwn27
-rw-r--r--doc/bugs/brokenlinks_false_positives.mdwn6
-rw-r--r--doc/bugs/bug_in_cgiurl_port.mdwn15
-rw-r--r--doc/bugs/bug_when_toggling_in_a_preview_page.mdwn29
-rw-r--r--doc/bugs/bugfix_for:___34__mtn:_operation_canceled:_Broken_pipe__34_____40__patch__41__.mdwn24
-rw-r--r--doc/bugs/build_fails_oddly_when_older_ikiwiki_is_installed.mdwn31
-rw-r--r--doc/bugs/build_in_opensolaris.mdwn74
-rw-r--r--doc/bugs/bzr-update-syntax-error.mdwn11
-rw-r--r--doc/bugs/bzr_2.0_breaks_bzr_plugin.mdwn87
-rw-r--r--doc/bugs/bzr_RecentChanges_dates_start_from_1969.mdwn16
-rw-r--r--doc/bugs/bzr_plugin_does_not_define_rcs__95__diff.mdwn27
-rw-r--r--doc/bugs/can__39__t_mix_template_vars_inside_directives.mdwn61
-rw-r--r--doc/bugs/cannot_clone_documented_git_repo.mdwn16
-rw-r--r--doc/bugs/cannot_preview_shortcuts.mdwn17
-rw-r--r--doc/bugs/cannot_reliably_use_meta_in_template.mdwn18
-rw-r--r--doc/bugs/cannot_revert_page_deletion.mdwn10
-rw-r--r--doc/bugs/capitalized_attachment_names.mdwn14
-rw-r--r--doc/bugs/cgi_does_not_use_templatedir_overlay.mdwn26
-rw-r--r--doc/bugs/cgi_wrapper_always_regenerated.mdwn16
-rw-r--r--doc/bugs/class_parameter_of_img_directive_behave_not_as_documented.mdwn31
-rw-r--r--doc/bugs/clearenv_not_present_at_FreeBSD_.mdwn5
-rw-r--r--doc/bugs/clearenv_not_present_at_FreeBSD_/discussion.mdwn1
-rw-r--r--doc/bugs/clearing_email_in_prefs.mdwn5
-rw-r--r--doc/bugs/colon:problem.mdwn12
-rw-r--r--doc/bugs/colon:problem/discussion.mdwn1
-rw-r--r--doc/bugs/comments_appear_two_times.mdwn24
-rw-r--r--doc/bugs/comments_not_searchable.mdwn19
-rw-r--r--doc/bugs/comments_preview_unsafe_with_allowdirectives.mdwn8
-rw-r--r--doc/bugs/comments_produce_broken_links_in_RecentChanges.mdwn9
-rw-r--r--doc/bugs/complex_wiki-code___40__braces__41___in_wikilink-text_breaks_wikilinks.mdwn9
-rw-r--r--doc/bugs/conditional_preprocess_during_scan.mdwn57
-rw-r--r--doc/bugs/conflicts.mdwn32
-rw-r--r--doc/bugs/correct_published_and_updated_time_information_for_the_feeds.mdwn113
-rw-r--r--doc/bugs/creating_page_from_comment_creates_a_comment.mdwn9
-rw-r--r--doc/bugs/cutpaste.pm:_missing_filter_call.mdwn55
-rw-r--r--doc/bugs/ddate_plugin_causes_websetup_to_change_timeformat__44___even_when_disabled.mdwn7
-rw-r--r--doc/bugs/debbug_shortcut_should_expand_differently.mdwn17
-rw-r--r--doc/bugs/debbug_shortcut_should_expand_differently/discussion.mdwn11
-rw-r--r--doc/bugs/debian_package_doesn__39__t_pull_in_packages_required_for_openid.mdwn9
-rw-r--r--doc/bugs/default__95__pageext_not_working.mdwn16
-rw-r--r--doc/bugs/definition_lists_should_be_bold.mdwn27
-rw-r--r--doc/bugs/defintion_lists_appear_to_be_disabled.mdwn54
-rw-r--r--doc/bugs/deletion_warnings.mdwn89
-rw-r--r--doc/bugs/depends_simple_mixup.mdwn88
-rw-r--r--doc/bugs/diff_links_to_backtrace.mdwn9
-rw-r--r--doc/bugs/disable_sub-discussion_pages.mdwn63
-rw-r--r--doc/bugs/disabling_backlinks.mdwn32
-rw-r--r--doc/bugs/discussion.mdwn18
-rw-r--r--doc/bugs/discussion_of_what__63__.mdwn7
-rw-r--r--doc/bugs/discussion_pages_with_uppercase_characters_break_the_detection_of_the_best_location.mdwn6
-rw-r--r--doc/bugs/discussion_removal.mdwn16
-rw-r--r--doc/bugs/done.mdwn3
-rw-r--r--doc/bugs/dumpsetup_does_not_save_destdir.mdwn3
-rw-r--r--doc/bugs/edit_preview_resolves_links_differently_from_commit.mdwn23
-rw-r--r--doc/bugs/editmessage_turned_off_in_web_interface__63__.mdwn10
-rw-r--r--doc/bugs/edits_not_showing_up_in_compiled_pages.mdwn20
-rw-r--r--doc/bugs/edittemplate_seems_not_to_be_working.mdwn7
-rw-r--r--doc/bugs/emails_should_not_be_considered_as_broken_links.mdwn12
-rw-r--r--doc/bugs/encoding_issue_in_blogspam_plugin.mdwn34
-rw-r--r--doc/bugs/entirely_negated_pagespec_matches_internal_pages.mdwn30
-rw-r--r--doc/bugs/enumerations_of_dates_not_formatted_correctly.mdwn43
-rw-r--r--doc/bugs/errors_with_ampersand_in_filename.mdwn21
-rw-r--r--doc/bugs/example_Mercurial_historyurl_doesn__39__t_show_file_history.mdwn17
-rw-r--r--doc/bugs/external_links_inside_headings_don__39__t_work.mdwn24
-rw-r--r--doc/bugs/external_plugins_cannot_access_ARGV_needed_for_getopt.mdwn14
-rw-r--r--doc/bugs/feedfile_does_the_wrong_thing_from_index.mdwn2.mdwn7
-rw-r--r--doc/bugs/feedpages_does_not_prevent_tags_from_being_aggregated.mdwn32
-rw-r--r--doc/bugs/feeds_get_removed_in_strange_conditions.mdwn57
-rw-r--r--doc/bugs/filecheck_failing_to_find_files.mdwn65
-rw-r--r--doc/bugs/find:_invalid_predicate___96__-L__39__.mdwn26
-rw-r--r--doc/bugs/find_gnuism.mdwn7
-rw-r--r--doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn14
-rw-r--r--doc/bugs/format_bug.mdwn25
-rw-r--r--doc/bugs/formbuilder_3.0401_broken.mdwn73
-rw-r--r--doc/bugs/git.pm_should_prune_remote_branches_when_fetching.mdwn14
-rw-r--r--doc/bugs/git_commit_adds_files_that_were_not_tracked.mdwn19
-rw-r--r--doc/bugs/git_fails_to_compile.mdwn32
-rw-r--r--doc/bugs/git_mail_notification_race.mdwn57
-rw-r--r--doc/bugs/git_stderr_output_causes_problems.mdwn45
-rw-r--r--doc/bugs/git_utf8.mdwn12
-rw-r--r--doc/bugs/gitremotes_script_picks_up_tags_from_anywhere.mdwn22
-rw-r--r--doc/bugs/gitweb_deficiency_w.r.t._log_messages.mdwn14
-rw-r--r--doc/bugs/gitweb_deficiency_w.r.t._newly_created_pages.mdwn13
-rw-r--r--doc/bugs/goto_with_bad_page_name.mdwn25
-rw-r--r--doc/bugs/graphviz_demo_generates_empty_graph.mdwn15
-rw-r--r--doc/bugs/hardcoded___34__Discussion__34___link.mdwn44
-rw-r--r--doc/bugs/helponformatting_link_disappears.mdwn5
-rw-r--r--doc/bugs/html5_support.mdwn117
-rw-r--r--doc/bugs/html5_time_element__39__s_pubdate_wrong_when_using_xhtml5___34__mode__34__.mdwn46
-rw-r--r--doc/bugs/html_errors.mdwn5
-rw-r--r--doc/bugs/htmlbalance_fails_with_HTML-Tree_v4.mdwn18
-rw-r--r--doc/bugs/htmlscrubber_breaks_multimarkdown_footnotes.mdwn18
-rw-r--r--doc/bugs/htmlscrubber_still_scrubbing_HTML_from_mdwn_pages.mdwn21
-rw-r--r--doc/bugs/htmlscrubber_undoes_email_obfuscation_by_Text::Markdown.mdwn37
-rw-r--r--doc/bugs/htmltidy_has_no_possibilty_to_use_an_alternative_config_file_which_may_break_other_usages.mdwn26
-rw-r--r--doc/bugs/http_proxy_for_openid.mdwn86
-rw-r--r--doc/bugs/httpauth_conflicts_with_git_anon_push.mdwn25
-rw-r--r--doc/bugs/ikiwiki-mass-rebuild_fails_to_drop_privileges_and_execute_ikiwiki.mdwn30
-rw-r--r--doc/bugs/ikiwiki-transition_does_not_set_perl_moduels_path_properly.mdwn17
-rw-r--r--doc/bugs/ikiwiki.setup:_syntax_error_at___40__eval_5__41___line_120__44___at_EOF.mdwn9
-rw-r--r--doc/bugs/ikiwiki.setup_require_blank_rcs_to_work_as_cgi_only.mdwn46
-rw-r--r--doc/bugs/ikiwiki__39__s_ViewVC_down.mdwn3
-rw-r--r--doc/bugs/ikiwiki_cgi_fails_to_build_on_Solaris_due_to_missing_LOCK__95__EX.mdwn43
-rw-r--r--doc/bugs/ikiwiki_ignores_PATH_environment.mdwn24
-rw-r--r--doc/bugs/ikiwiki_lacks_a_--quiet.mdwn29
-rw-r--r--doc/bugs/ikiwiki_overzealously_honours_locks_when_asked_for_forms.mdwn34
-rw-r--r--doc/bugs/ikiwiki_renders___39__28__39___if_external_plugins_return_nothing.mdwn12
-rw-r--r--doc/bugs/images_in_inlined_pages_have_wrong_relative_URL.mdwn15
-rw-r--r--doc/bugs/img_plugin_and_class_attr.mdwn27
-rw-r--r--doc/bugs/img_plugin_and_missing_heigth_value.mdwn7
-rw-r--r--doc/bugs/img_plugin_causes_taint_failure.mdwn20
-rw-r--r--doc/bugs/img_plugin_renders___60__img__62___tag_without_src_attribute_post-2.20.mdwn36
-rw-r--r--doc/bugs/img_plugin_should_pass_through_class_attribute.mdwn49
-rw-r--r--doc/bugs/img_vs_align.mdwn38
-rw-r--r--doc/bugs/img_with_alt_has_extra_double_quote.mdwn32
-rw-r--r--doc/bugs/index.html__63__updated.mdwn15
-rw-r--r--doc/bugs/index.html_is_made_visible_by_various_actions.mdwn16
-rw-r--r--doc/bugs/iniline_breaks_toc_plugin.mdwn64
-rw-r--r--doc/bugs/inline_action_buttons_circumvent_exclude_criteria_from_edittemplate__39__s_match__61____34____34___pagespec.mdwn15
-rw-r--r--doc/bugs/inline_archive_crash.mdwn6
-rw-r--r--doc/bugs/inline_breaks_PERMALINK_variable.mdwn25
-rw-r--r--doc/bugs/inline_from_field_empty_if_rootpage_doesn__39__t_exist.mdwn20
-rw-r--r--doc/bugs/inline_page_not_updated_on_removal.mdwn9
-rw-r--r--doc/bugs/inline_plugin_rootpage_option_is_not_case_insensitive.mdwn9
-rw-r--r--doc/bugs/inline_plugin_sets_editurl_even_when_editpage_is_disabled.html16
-rw-r--r--doc/bugs/inline_raw_broken_on_unknown_pagetype.mdwn29
-rw-r--r--doc/bugs/inline_skip_causes_empty_inline.mdwn10
-rw-r--r--doc/bugs/inline_sort-by-title_issues.mdwn57
-rw-r--r--doc/bugs/inline_sort_order_and_meta_date_value.mdwn314
-rw-r--r--doc/bugs/install_into_home_dir_fails.mdwn57
-rw-r--r--doc/bugs/installing_from_svn_copies_.svn_directories.mdwn28
-rw-r--r--doc/bugs/internal_error:_smileys.mdwn_cannot_be_found.mdwn41
-rw-r--r--doc/bugs/ipv6_address_in_comments.mdwn19
-rw-r--r--doc/bugs/jquery-ui.min.css_missing_some_image_files.mdwn14
-rw-r--r--doc/bugs/librpc-xml-perl_0.69_breaks_XML-RPC_plugins.mdwn13
-rw-r--r--doc/bugs/linkingrules_should_document_how_to_link_to_page_at_root_if_non-root_page_exists.mdwn6
-rw-r--r--doc/bugs/linkmap_displays_underscore_escapes.mdwn18
-rw-r--r--doc/bugs/linkmap_displays_underscore_escapes/the_patch.pl68
-rw-r--r--doc/bugs/links_from_sidebars.mdwn14
-rw-r--r--doc/bugs/links_from_sidebars/discussion.mdwn5
-rw-r--r--doc/bugs/links_misparsed_in_CSV_files.mdwn27
-rw-r--r--doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn34
-rw-r--r--doc/bugs/lockedit_plugin_should_alert_user_about_an_invalid_pagespec_in_preferences.mdwn21
-rw-r--r--doc/bugs/locking_fun.mdwn105
-rw-r--r--doc/bugs/login_page_non-obvious_with_openid.mdwn47
-rw-r--r--doc/bugs/login_page_should_note_cookie_requirement.mdwn39
-rw-r--r--doc/bugs/logout_in_ikiwiki.mdwn44
-rw-r--r--doc/bugs/mailto:_links_not_properly_generated_in_rssatom_feeds.mdwn29
-rw-r--r--doc/bugs/map_does_not_link_directory_for_which_a_file_also_exists.mdwn26
-rw-r--r--doc/bugs/map_does_not_link_directory_for_which_a_file_also_exists/discussion.mdwn3
-rw-r--r--doc/bugs/map_doesn__39__t_calculate___34__common__95__prefix__34___correctly.mdwn70
-rw-r--r--doc/bugs/map_fails_to_close_ul_element_for_empty_list.mdwn93
-rw-r--r--doc/bugs/map_generates_malformed_HTML.mdwn36
-rw-r--r--doc/bugs/map_is_inconsistent_about_bare_directories.mdwn86
-rw-r--r--doc/bugs/map_sorts_by_pagename_and_not_title_when_show__61__title_is_used.mdwn20
-rw-r--r--doc/bugs/maps_with_nested_directories_sometimes_make_ugly_lists.mdwn62
-rw-r--r--doc/bugs/markdown_bug:_email_escaping_and_plus_addresses.mdwn37
-rw-r--r--doc/bugs/markdown_module_location.mdwn49
-rw-r--r--doc/bugs/mercurial_fail_to_add.mdwn34
-rw-r--r--doc/bugs/merging_to_basewiki_causes_odd_inconsistencies.mdwn6
-rw-r--r--doc/bugs/messed_up_repository.mdwn21
-rw-r--r--doc/bugs/meta_inline.mdwn4
-rw-r--r--doc/bugs/methodResponse_in_add__95__plugins.mdwn41
-rw-r--r--doc/bugs/minor:_tiny_rendering_error.mdwn5
-rw-r--r--doc/bugs/misctemplate_does_not_respect_the_current_page___40__if_any__41__.mdwn101
-rw-r--r--doc/bugs/monotone_backend_does_not_support_srcdir_in_subdir.mdwn5
-rw-r--r--doc/bugs/more_and_RSS_generation.mdwn20
-rw-r--r--doc/bugs/multiple_encoding_issues_in_atom.mdwn8
-rw-r--r--doc/bugs/multiple_pages_with_same_name.mdwn76
-rw-r--r--doc/bugs/multiple_rss_feeds_per_page.mdwn31
-rw-r--r--doc/bugs/must_save_before_uploading_more_than_one_attachment.mdwn44
-rw-r--r--doc/bugs/nested_inlines_produce_no_output.mdwn12
-rw-r--r--doc/bugs/nested_raw_included_inlines.mdwn51
-rw-r--r--doc/bugs/newfile-test.mdwn11
-rw-r--r--doc/bugs/no_commit_mails_for_new_pages.mdwn10
-rw-r--r--doc/bugs/no_easy_way_to_wrap_HTML_container_around_a_set_of_inlined_pages.mdwn23
-rw-r--r--doc/bugs/no_search_button__44___hence_can__39__t_do_search_in_w3m_at_ikiwiki.info.mdwn32
-rw-r--r--doc/bugs/nonexistent_pages_in_inline_pagenames_do_not_add_a_dependency.mdwn44
-rw-r--r--doc/bugs/octal_umask_setting_is_unintuitive.mdwn55
-rw-r--r--doc/bugs/opendiscussion_should_respect_the_discussion_option.mdwn11
-rw-r--r--doc/bugs/opendiscussion_should_respect_the_discussion_option/discussion.mdwn26
-rw-r--r--doc/bugs/openid_incompatability_with_pyblosxom_openid_server_plugin_when_used_with_simple_registration_extension.mdwn3
-rw-r--r--doc/bugs/openid_no_longer_pretty-prints_OpenIDs.mdwn17
-rw-r--r--doc/bugs/openid_postsignin_failure.mdwn52
-rw-r--r--doc/bugs/osm_KML_maps_do_not_display_properly_on_google_maps.mdwn14
-rw-r--r--doc/bugs/osm_KML_maps_icon_path_have_a_trailing_slash.mdwn34
-rw-r--r--doc/bugs/osm_linkto__40____41___usage_breaks_map_rendering.mdwn23
-rw-r--r--doc/bugs/osm_sometimes_looses_some_nodes.mdwn5
-rw-r--r--doc/bugs/output_of_successful_rename_should_list_the_full_path_to_affected_pages.mdwn14
-rw-r--r--doc/bugs/package_build_fails_in_non-English_environment.mdwn11
-rw-r--r--doc/bugs/page_is_not_rebuilt_if_it_changes_extension.mdwn27
-rw-r--r--doc/bugs/page_preview_does_not_work_on_new_page_with_a_table.mdwn3
-rw-r--r--doc/bugs/pagecount_is_broken.mdwn4
-rw-r--r--doc/bugs/pagemtime_in_refresh_mode.mdwn28
-rw-r--r--doc/bugs/pages_missing_top-level_directory.mdwn78
-rw-r--r--doc/bugs/pages_under_templates_are_invalid.mdwn16
-rw-r--r--doc/bugs/pagespec:_tagged__40____41____44___globbing.mdwn36
-rw-r--r--doc/bugs/pagespec_can__39__t_match___123__curly__125___braces.mdwn44
-rw-r--r--doc/bugs/pagespec_error_on_refresh_but_not_rebuild.mdwn32
-rw-r--r--doc/bugs/pagespec_parsing_chokes_on_function__40____41__.mdwn64
-rw-r--r--doc/bugs/pagestats_plugin_broken.mdwn29
-rw-r--r--doc/bugs/pagetitle_function_does_not_respect_meta_titles.mdwn289
-rw-r--r--doc/bugs/parsing_for_WikiWords_should_only_be_done_outside_html_tags.mdwn17
-rw-r--r--doc/bugs/password_deletion.mdwn7
-rw-r--r--doc/bugs/perl:_double_free_or_corruption.mdwn14
-rw-r--r--doc/bugs/pipe-symbol_in_taglink_target.mdwn25
-rw-r--r--doc/bugs/pipe_in_tables_as_characters.mdwn16
-rw-r--r--doc/bugs/plugin___96__rename__96___fails_if___96__attachment__96___is_not_enabled.mdwn7
-rw-r--r--doc/bugs/plugins__47__relativedate_depends_on_locale_at_setup_file.mdwn16
-rw-r--r--doc/bugs/po:__apache_config_serves_index_directory_for_index.mdwn85
-rw-r--r--doc/bugs/po:_apache_config_serves_index.rss_for_index.mdwn36
-rw-r--r--doc/bugs/po:_double_commits_of_po_files.mdwn22
-rw-r--r--doc/bugs/po:_markdown_link_parse_bug.mdwn21
-rw-r--r--doc/bugs/po:_might_not_add_translated_versions_of_all_underlays.mdwn16
-rw-r--r--doc/bugs/po:_new_pages_not_translatable.mdwn12
-rw-r--r--doc/bugs/po:_plugin_should_not_override_the_title_on_the_homepage.mdwn58
-rw-r--r--doc/bugs/po:_po4a_too_strict_on_html_pages.mdwn22
-rw-r--r--doc/bugs/po:_po_files_instead_of_html_files.mdwn30
-rw-r--r--doc/bugs/po:_ugly_messages_with_empty_files.mdwn6
-rw-r--r--doc/bugs/po:broken_links_to_translatable_basewiki_pages_that_lack_po_fies.mdwn73
-rw-r--r--doc/bugs/po_plugin_adds_new_dependency.mdwn38
-rw-r--r--doc/bugs/po_plugin_cannot_add_po_files_into_git.mdwn34
-rw-r--r--doc/bugs/po_vs_templates.mdwn48
-rw-r--r--doc/bugs/poll_plugin:_can__39__t_vote_for_non-ascii_options.mdwn7
-rw-r--r--doc/bugs/poll_plugin_uses_GET.mdwn8
-rw-r--r--doc/bugs/possibly_po_related_error.mdwn20
-rw-r--r--doc/bugs/post-commit_hangs.mdwn47
-rw-r--r--doc/bugs/post-update_hook_can__39__t_be_compiled_with_tcc.mdwn19
-rw-r--r--doc/bugs/prettydate_with_weekday-date_inconsistency.mdwn32
-rw-r--r--doc/bugs/preview_base_url_should_be_absolute.mdwn53
-rw-r--r--doc/bugs/preview_pagestate.mdwn13
-rw-r--r--doc/bugs/previewing_new_page_can_leave_files_dangling.mdwn53
-rw-r--r--doc/bugs/previewing_with_an_edittemplate_reverts_edit_box.mdwn5
-rw-r--r--doc/bugs/problem_adding_tag_from_template.mdwn10
-rw-r--r--doc/bugs/proxy.py_utf8_troubles.mdwn35
-rw-r--r--  doc/bugs/prune_causing_taint_mode_failures.mdwn | 35
-rw-r--r--  doc/bugs/pruning_is_too_strict.mdwn | 12
-rw-r--r--  doc/bugs/quieten_mercurial.mdwn | 34
-rw-r--r--  doc/bugs/raw_html_in-page_and___91____91____33__included__93____93__.mdwn | 100
-rw-r--r--  doc/bugs/rebuild_after_changing_the_underlaydir_config_option.mdwn | 12
-rw-r--r--  doc/bugs/recentchanges_escaping.mdwn | 5
-rw-r--r--  doc/bugs/recentchanges_feed_links.mdwn | 107
-rw-r--r--  doc/bugs/recentchanges_sets_has__95__diffurl__61__1_when_diffurl_is_empty.mdwn | 18
-rw-r--r--  doc/bugs/recentchangesdiff_crashes_on_commits_which_remove_a_lot_of_files.mdwn | 46
-rw-r--r--  doc/bugs/relative_date_weird_results.mdwn | 4
-rw-r--r--  doc/bugs/removal_of_transient_pages.mdwn | 78
-rw-r--r--  doc/bugs/remove_orphaned_sparkline-php_from_Suggests.mdwn | 22
-rw-r--r--  doc/bugs/remove_plugin_and_untracked_files.mdwn | 6
-rw-r--r--  doc/bugs/removing_pages_with_utf8_characters.mdwn | 51
-rw-r--r--  doc/bugs/rename_fixup_not_attributed_to_author.mdwn | 12
-rw-r--r--  doc/bugs/renaming_a_page_destroyed_some_links.mdwn | 12
-rw-r--r--  doc/bugs/resized_img_with_only_width_or_height_breaks_ie.mdwn | 9
-rw-r--r--  doc/bugs/rss_feed_cleanup_on_delete.mdwn | 6
-rw-r--r--  doc/bugs/rss_feeds_do_not_use_recommended_encoding_of_entities_for_some_fields.mdwn | 52
-rw-r--r--  doc/bugs/rss_output_relative_links.mdwn | 3
-rw-r--r--  doc/bugs/rst_fails_on_file_containing_only_a_number.mdwn | 29
-rw-r--r--  doc/bugs/rst_plugin_hangs_on_utf-8.mdwn | 20
-rw-r--r--  doc/bugs/rst_plugin_has_python_hardcode_in_shebang_line.mdwn | 15
-rw-r--r--  doc/bugs/rst_plugin_traceback_with_SimpleXMLRPCDispatcher_from_pyhton_2.5.mdwn | 13
-rw-r--r--  doc/bugs/rst_tweak.mdwn | 52
-rw-r--r--  doc/bugs/search:___34__link__34___and___34__title__34___fields_are_incorrectly_specified.mdwn | 29
-rw-r--r--  doc/bugs/search_creates_configuration_files_many_times_on_rebuild.mdwn | 9
-rw-r--r--  doc/bugs/search_for_locale_data_in_the_installed_location.mdwn | 25
-rw-r--r--  doc/bugs/search_plugin_and_CGI_preview.mdwn | 19
-rw-r--r--  doc/bugs/search_plugin_finds_no_results_with_xapian_1.2.7.mdwn | 14
-rw-r--r--  doc/bugs/search_plugin_uses_wrong_css_path.mdwn | 14
-rw-r--r--  doc/bugs/search_template_missing_dep.mdwn | 4
-rw-r--r--  doc/bugs/several_entries_in_docs__47__bugs_contain_colons_in_the_filename.mdwn | 15
-rw-r--r--  doc/bugs/shortcut_encoding.mdwn | 28
-rw-r--r--  doc/bugs/shortcut_plugin_will_not_work_without_shortcuts.mdwn.mdwn | 33
-rw-r--r--  doc/bugs/shortcuts_don__39__t_escape_from_Markdown.mdwn | 7
-rw-r--r--  doc/bugs/sidebar_is_obscured_by_recentchanges.mdwn | 59
-rw-r--r--  doc/bugs/sidebar_not_updated_in_unedited_subpages.mdwn | 9
-rw-r--r--  doc/bugs/sitemap_includes_images_directly.mdwn | 8
-rw-r--r--  doc/bugs/some_but_not_all_meta_fields_are_stored_escaped.mdwn | 44
-rw-r--r--  doc/bugs/some_strings_are_not_internationalized.mdwn | 47
-rw-r--r--  doc/bugs/space_in_a___91____91__page_link__93____93___doesn__39__t_make_link.mdwn | 32
-rw-r--r--  doc/bugs/special_characters_in_tag_names_need_manual_escaping.mdwn | 3
-rw-r--r--  doc/bugs/ssl_certificates_not_checked_with_openid.mdwn | 85
-rw-r--r--  doc/bugs/strange_hook_id_in_skeleton.pm.mdwn | 5
-rw-r--r--  doc/bugs/stray___60____47__p__62___tags.mdwn | 17
-rw-r--r--  doc/bugs/support_for_openid2_logins.mdwn | 24
-rw-r--r--  doc/bugs/svn+ssh_commit_fail.mdwn | 5
-rw-r--r--  doc/bugs/svn-commit-hanging.mdwn | 7
-rw-r--r--  doc/bugs/svn_commit_failures_interpreted_as_merge_conflicts.mdwn | 21
-rw-r--r--  doc/bugs/svn_fails_to_update.mdwn | 89
-rw-r--r--  doc/bugs/svn_post-commit_wrapper_can__39__t_find_IkiWiki.pm_if_not_installed.mdwn | 22
-rw-r--r--  doc/bugs/syntax_error_in_aggregate.mdwn | 11
-rw-r--r--  doc/bugs/table_external_file_links.mdwn | 9
-rw-r--r--  doc/bugs/table_plugin_does_not_handle___92__r__92__n_lines_in_CSV_files.mdwn | 5
-rw-r--r--  doc/bugs/tag_behavior_changes_introduced_by_typed_link_feature.mdwn | 16
-rw-r--r--  doc/bugs/tag_plugin:_autotag_vs._staged_changes.mdwn | 17
-rw-r--r--  doc/bugs/tagged__40____41___matching_wikilinks.mdwn | 35
-rw-r--r--  doc/bugs/tags__44___backlinks_and_3.x.mdwn | 34
-rw-r--r--  doc/bugs/tags_base_dir_not_used_when_creating_new_tags.mdwn | 43
-rw-r--r--  doc/bugs/taint_and_-T.mdwn | 30
-rw-r--r--  doc/bugs/taint_issue_with_regular_expressions.mdwn | 35
-rw-r--r--  doc/bugs/tbasewiki__95__brokenlinks.t_broken.mdwn | 60
-rw-r--r--  doc/bugs/tbasewiki__95__brokenlinks.t_broken/discussion.mdwn | 2
-rw-r--r--  doc/bugs/templateForRecentChangesMissingCloseSpan.mdwn | 26
-rw-r--r--  doc/bugs/template_creation_error.mdwn | 111
-rw-r--r--  doc/bugs/teximg_does_not_work_Preview.mdwn | 12
-rw-r--r--  doc/bugs/teximg_fails_if_same_tex_is_used_on_multiple_pages.mdwn | 24
-rw-r--r--  doc/bugs/textile_plugin_dies_if_input_has_a_non-utf8_character.mdwn | 14
-rw-r--r--  doc/bugs/the_login_page_is_unclear_when_multiple_methods_exist.mdwn | 16
-rw-r--r--  doc/bugs/title__40____41___in_a_PageSpec__44___with_meta_enabled__44___causes_a_crash.mdwn | 3
-rw-r--r--  doc/bugs/toc_displays_headings_from_sidebar.mdwn | 3
-rw-r--r--  doc/bugs/toc_in_sidebar.mdwn | 21
-rw-r--r--  doc/bugs/toggle_expects_body_element_without_attributes.mdwn | 3
-rw-r--r--  doc/bugs/toggle_fails_on_Safari.mdwn | 58
-rw-r--r--  doc/bugs/trail_excess_dependencies.mdwn | 95
-rw-r--r--  doc/bugs/trail_shows_on_cgi_pages.mdwn | 12
-rw-r--r--  doc/bugs/trail_test_suite_failures.mdwn | 97
-rw-r--r--  doc/bugs/transient_autocreated_tagbase_is_not_transient_autoindexed.mdwn | 31
-rw-r--r--  doc/bugs/transitive_dependencies.mdwn | 94
-rw-r--r--  doc/bugs/trouble_with_base_in_search.mdwn | 60
-rw-r--r--  doc/bugs/txt_plugin_having_problems_with_meta_directives.mdwn | 19
-rw-r--r--  doc/bugs/typo_in_ikiwiki.setup.mdwn | 9
-rw-r--r--  doc/bugs/typo_in_skeleton.pm:_sessionncgi.mdwn | 5
-rw-r--r--  doc/bugs/undefined_tags_or_mismatched_tags_won__39__t_get_converted.mdwn | 46
-rw-r--r--  doc/bugs/undefined_value_as_a_HASH_reference.mdwn | 68
-rw-r--r--  doc/bugs/underlaydir_file_expose.mdwn | 13
-rw-r--r--  doc/bugs/unicode_chars_in_wikiname_break_auth.mdwn | 20
-rw-r--r--  doc/bugs/unicode_encoded_urls_and_recentchanges.mdwn | 38
-rw-r--r--  doc/bugs/unrecognized___34__do__61__blog__34___CGI_parameter_when_creating_todo_item.mdwn | 18
-rw-r--r--  doc/bugs/unwanted_discussion_links_on_discussion_pages.mdwn | 36
-rw-r--r--  doc/bugs/urlto_API_change_breaks_wikis_with_po_plugin.mdwn | 98
-rw-r--r--  doc/bugs/urlto__40____34____34____44___...__44___1__41___not_absolute.mdwn | 9
-rw-r--r--  doc/bugs/user_links_on_recentchanges_pages_problem.mdwn | 12
-rw-r--r--  doc/bugs/utf-8_bug_in_websetup.pm.mdwn | 22
-rw-r--r--  doc/bugs/utf8_html_templates.mdwn | 22
-rw-r--r--  doc/bugs/utf8_svn_log.mdwn | 11
-rw-r--r--  doc/bugs/web_reversion_on_ikiwiki.info.mdwn | 14
-rw-r--r--  doc/bugs/websetup_eats_setupconf_and_allow__95__symlinks__95__before__95__srcdir.mdwn | 21
-rw-r--r--  doc/bugs/weird_signature_in_match__95__included__40____41__.mdwn | 7
-rw-r--r--  doc/bugs/weird_syntax_in_aggregate.pm.mdwn | 9
-rw-r--r--  doc/bugs/wiki_formatting_does_not_work_between_toc_and_an_inline.mdwn | 30
-rw-r--r--  doc/bugs/wiki_links_still_processed_inside_code_blocks.mdwn | 67
-rw-r--r--  doc/bugs/wiki_rebuild_should_throw_errors_if_the_configured_underlaydir_or_templatedir_don__39__t_exist.mdwn | 15
-rw-r--r--  doc/bugs/wikilink_in_table.mdwn | 36
-rw-r--r--  doc/bugs/word_wrap.mdwn | 16
-rw-r--r--  doc/bugs/wrapper_can__39__t_find_the_perl_modules.mdwn | 16
-rw-r--r--  doc/bugs/wrong_attachment_size.mdwn | 8
-rw-r--r--  doc/bugs/wrong_discussion_page_created.mdwn | 12
-rw-r--r--  doc/bugs/wrong_link_in_recentchanges_when_reverting_an_ikiwiki_outside_git_root.mdwn | 8
-rw-r--r--  doc/bugs/wrong_permissions_on_some_files_in_source.mdwn | 11
-rw-r--r--  doc/bugs/wrong_rss_url_when_inside_another_blog-like_page.mdwn | 36
-rw-r--r--  doc/bugs/xgettext_issue.mdwn | 73
-rw-r--r--  doc/bugs/yaml:xs_codependency_not_listed.mdwn | 13
-rw-r--r--  doc/bugs/yaml_setup_file_does_not_support_UTF-8_if_XS_is_installed.mdwn | 104
-rw-r--r--  doc/cgi.mdwn | 5
-rw-r--r--  doc/cgi/discussion.mdwn | 22
-rw-r--r--  doc/commit-internals.mdwn | 20
-rw-r--r--  doc/competition.mdwn | 19
-rw-r--r--  doc/consultants.mdwn | 9
-rw-r--r--  doc/contact.mdwn | 11
-rw-r--r--  doc/contact/discussion.mdwn | 14
-rw-r--r--  doc/convert.mdwn | 9
-rw-r--r--  doc/css.mdwn | 24
-rw-r--r--  doc/css/discussion.mdwn | 18
-rw-r--r--  doc/css_market.mdwn | 68
-rw-r--r--  doc/css_market/02_template.css | 307
-rw-r--r--  doc/css_market/02_template.tmpl | 20
-rw-r--r--  doc/css_market/bma.css | 108
-rw-r--r--  doc/css_market/cstamas.css | 69
-rw-r--r--  doc/css_market/discussion.mdwn | 37
-rw-r--r--  doc/css_market/embeddedmoose.css | 13
-rw-r--r--  doc/css_market/kirkambar.css | 142
-rw-r--r--  doc/css_market/zack.css | 193
-rw-r--r--  doc/download.mdwn | 52
-rw-r--r--  doc/examples.mdwn | 12
-rw-r--r--  doc/examples/blog.mdwn | 26
-rw-r--r--  doc/examples/blog/archives.mdwn | 8
-rw-r--r--  doc/examples/blog/comments.mdwn | 10
-rw-r--r--  doc/examples/blog/discussion.mdwn | 13
-rw-r--r--  doc/examples/blog/index.mdwn | 11
-rw-r--r--  doc/examples/blog/posts.mdwn | 3
-rw-r--r--  doc/examples/blog/posts/first_post.mdwn | 2
-rw-r--r--  doc/examples/blog/sidebar.mdwn | 10
-rw-r--r--  doc/examples/blog/tags.mdwn | 3
-rw-r--r--  doc/examples/softwaresite.mdwn | 19
-rw-r--r--  doc/examples/softwaresite/Makefile | 15
-rw-r--r--  doc/examples/softwaresite/bugs.mdwn | 4
-rw-r--r--  doc/examples/softwaresite/bugs/done.mdwn | 3
-rw-r--r--  doc/examples/softwaresite/bugs/fails_to_frobnicate.mdwn | 4
-rw-r--r--  doc/examples/softwaresite/bugs/hghg.mdwn | 1
-rw-r--r--  doc/examples/softwaresite/bugs/needs_more_bugs.mdwn | 3
-rw-r--r--  doc/examples/softwaresite/contact.mdwn | 7
-rw-r--r--  doc/examples/softwaresite/doc.mdwn | 5
-rw-r--r--  doc/examples/softwaresite/doc/faq.mdwn | 11
-rw-r--r--  doc/examples/softwaresite/doc/install.mdwn | 10
-rw-r--r--  doc/examples/softwaresite/doc/setup.mdwn | 4
-rw-r--r--  doc/examples/softwaresite/download.mdwn | 5
-rw-r--r--  doc/examples/softwaresite/index.mdwn | 13
-rw-r--r--  doc/examples/softwaresite/news.mdwn | 5
-rw-r--r--  doc/examples/softwaresite/news/version_1.0.mdwn | 1
-rw-r--r--  doc/examples/softwaresite/templates/release.mdwn | 7
-rw-r--r--  doc/favicon.ico | bin 0 -> 371 bytes
-rw-r--r--  doc/features.mdwn | 181
-rw-r--r--  doc/forum.mdwn | 11
-rw-r--r--  doc/forum/404_-_not_found.mdwn | 22
-rw-r--r--  doc/forum/404_-_not_found/comment_1_3dea2600474f77fb986767da4d507d62._comment | 8
-rw-r--r--  doc/forum/404_-_not_found/comment_2_948e4678be6f82d9b541132405351a2c._comment | 31
-rw-r--r--  doc/forum/404_-_not_found/comment_3_4c7b1fa88776815bbc6aa286606214c2._comment | 8
-rw-r--r--  doc/forum/Accessing_meta_values_in_pages__63__.mdwn | 8
-rw-r--r--  doc/forum/Adding_new_markup_to_markdown.mdwn | 11
-rw-r--r--  doc/forum/Allow_only_specific_OpenIDs_to_login.mdwn | 7
-rw-r--r--  doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__.mdwn | 3
-rw-r--r--  doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__/comment_1_8a5acbb6234104b607c8c4cf16124ae4._comment | 8
-rw-r--r--  doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__/comment_2_155e5823860a91989647ede8b5c9224a._comment | 16
-rw-r--r--  doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__/comment_3_317f1202a3da1bfc845d4becbac4bba8._comment | 10
-rw-r--r--  doc/forum/Apache_XBitHack.mdwn | 28
-rw-r--r--  doc/forum/Are_these_revisions_legit_or_accidentally_committed_jiberish__63__.mdwn | 12
-rw-r--r--  doc/forum/Are_these_revisions_legit_or_accidentally_committed_jiberish__63__/comment_1_b425823f800fba82ad2aaaa0dbe6686a._comment | 10
-rw-r--r--  doc/forum/Asciidoc_plugin.mdwn | 14
-rw-r--r--  doc/forum/Attachment_and_sub-directory.mdwn | 5
-rw-r--r--  doc/forum/Background_picture_and_css.mdwn | 8
-rw-r--r--  doc/forum/Blog_posting_times_and_ikiwiki_state.mdwn | 28
-rw-r--r--  doc/forum/Blog_posting_times_and_ikiwiki_state/comment_1_87304dfa2caea7e526cdb04917524e8c._comment | 8
-rw-r--r--  doc/forum/Broken_after_upgrading_Ikiwiki.mdwn | 10
-rw-r--r--  doc/forum/Broken_after_upgrading_Ikiwiki/comment_1_3d0588a845c58b3aedc35970e8dcc811._comment | 14
-rw-r--r--  doc/forum/Broken_after_upgrading_Ikiwiki/comment_2_fd65d4b87a735b67543bb0cf4053b652._comment | 10
-rw-r--r--  doc/forum/Broken_after_upgrading_Ikiwiki/comment_3_7c8b46eabdb25cbc01c56c7b53ed3b91._comment | 8
-rw-r--r--  doc/forum/CGI_script_and_HTTPS.mdwn | 29
-rw-r--r--  doc/forum/CGI_script_and_HTTPS/comment_1_3f8ef438ca7de11635d4e40080e7baa9._comment | 43
-rw-r--r--  doc/forum/Calendar:_listing_multiple_entries_per_day.mdwn | 21
-rw-r--r--  doc/forum/Calendar:_listing_multiple_entries_per_day/comment_1_d3dd0b97c63d615e3dee22ceacaa5a30._comment | 83
-rw-r--r--  doc/forum/Calendar:_listing_multiple_entries_per_day/comment_2_2311b96483bb91dc25d5e3695bbca513._comment | 12
-rw-r--r--  doc/forum/Calendar:_listing_multiple_entries_per_day/comment_3_d23f0cedd0b9e937eaf200eef55ac457._comment | 166
-rw-r--r--  doc/forum/Calendar:_listing_multiple_entries_per_day/comment_4_4be39c2043821848d4b25d0bf946a718._comment | 15
-rw-r--r--  doc/forum/Calendar:_listing_multiple_entries_per_day/comment_5_de545ebb6376066674ef2aaae4757b9c._comment | 97
-rw-r--r--  doc/forum/Can_I_change_the_default_menu_items__63__.mdwn | 6
-rw-r--r--  doc/forum/Can_I_change_the_default_menu_items__63__/comment_2_eb56fed3b5fc19c8dd49af4444a049c5._comment | 31
-rw-r--r--  doc/forum/Can_I_have_different_favicons_for_each_folder__63__.mdwn | 1
-rw-r--r--  doc/forum/Can_I_have_different_favicons_for_each_folder__63__/comment_1_a01112ba235e2f44a7655c36ef680e7e._comment | 19
-rw-r--r--  doc/forum/Can_I_have_different_favicons_for_each_folder__63__/comment_2_b8ccd3c29249eca73766f567bce12569._comment | 8
-rw-r--r--  doc/forum/Can_Ikiwiki_recognize_multimarkdown_meta_tags__63__.mdwn | 4
-rw-r--r--  doc/forum/Can_OpenID_users_be_adminusers__63__.mdwn | 69
-rw-r--r--  doc/forum/Can__39__t_get_comments_plugin_working.mdwn | 16
-rw-r--r--  doc/forum/Can__39__t_get_ikiwiki_working_again_after_reinstall.mdwn | 16
-rw-r--r--  doc/forum/Can__39__t_get_ikiwiki_working_again_after_reinstall/comment_1_87a360155ff0502fe08274911cc6a53f._comment | 8
-rw-r--r--  doc/forum/Can__39__t_login_using_Google__44___or_openID__44___but_can_use_Ikiwiki_login.mdwn | 5
-rw-r--r--  doc/forum/Can__39__t_login_using_Google__44___or_openID__44___but_can_use_Ikiwiki_login/comment_1_79127e3c09a1d798146088dee5a67708._comment | 10
-rw-r--r--  doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__.mdwn | 8
-rw-r--r--  doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__/comment_1_d1e79825dfb5213d2d1cba2ace1707b1._comment | 8
-rw-r--r--  doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__/comment_2_8177ede5a586b1a573a13fd26f8d3cc0._comment | 8
-rw-r--r--  doc/forum/Can_ikiwiki_be_configured_as_multi_user_blog__63__.mdwn | 7
-rw-r--r--  doc/forum/Can_not_advance_past_first_page_of_results_using_search_plugin.mdwn | 26
-rw-r--r--  doc/forum/Can_one_tell_if_a_page_is_added_rather_than_changed__63__.mdwn | 5
-rw-r--r--  doc/forum/Can_one_tell_if_a_page_is_added_rather_than_changed__63__/comment_1_1397feebfb0fb7cc57af2f8b74ce047e._comment | 8
-rw-r--r--  doc/forum/Can_one_tell_if_a_page_is_added_rather_than_changed__63__/comment_2_ad36c945f59fe525428fc30246911ff5._comment | 10
-rw-r--r--  doc/forum/Cannot_write_to_commitlock.mdwn | 28
-rw-r--r--  doc/forum/Chinese_file_name_corruption.mdwn | 5
-rw-r--r--  doc/forum/Chinese_file_name_corruption/comment_1_765ac8b6f70083bb5aaaaac5beab461f._comment | 10
-rw-r--r--  doc/forum/Clarification_on_--cgi_option.mdwn | 4
-rw-r--r--  doc/forum/Clarification_on_--cgi_option/comment_1_deda457e4bff7dfe630dbc0192dfddea._comment | 11
-rw-r--r--  doc/forum/Commiting_all_moderated_comments_into_special_branch__63__.mdwn | 8
-rw-r--r--  doc/forum/Commiting_all_moderated_comments_into_special_branch__63__/comment_1_8403e8ff9c5c8dddb6d744632322f7bc._comment | 12
-rw-r--r--  doc/forum/Darcs_as_the_RCS___63__.mdwn | 13
-rw-r--r--  doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm.mdwn | 19
-rw-r--r--  doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm/comment_1_aec4bf4ca7d04d580d2fa83fd3f7166f._comment | 8
-rw-r--r--  doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm/comment_4_c682ebb0e8e72088a8f92356dc31ef37._comment | 13
-rw-r--r--  doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__.mdwn | 7
-rw-r--r--  doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_1_5e916c8fa90470909064ea73531f79d4._comment | 12
-rw-r--r--  doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_2_2fa15f0eaf8c860b82e366130c8563c7._comment | 8
-rw-r--r--  doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_3_c5af589dcdfe4f91dba50243762065e5._comment | 12
-rw-r--r--  doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_4_3090da7bafbf92a825edec8ffc45af20._comment | 12
-rw-r--r--  doc/forum/Define_custom_commands.mdwn | 1
-rw-r--r--  doc/forum/Define_custom_commands/comment_1_7d82637bc8c706b69e4a55585677f6bf._comment | 11
-rw-r--r--  doc/forum/Different_behaviour_when_editing_with_editor_than_with_web-edit.mdwn | 24
-rw-r--r--  doc/forum/Different_behaviour_when_editing_with_editor_than_with_web-edit/comment_1_ac6bda46ad00bfe980bc76c4a39aa796._comment | 9
-rw-r--r--  doc/forum/Different_behaviour_when_editing_with_editor_than_with_web-edit/comment_3_10a46f8ee23c8935e20c70842671cee4._comment | 13
-rw-r--r--  doc/forum/Different_templates_for_subdirectories__63_____40__Blogging_and_Wiki_pages__41__.mdwn | 7
-rw-r--r--  doc/forum/Different_templates_for_subdirectories__63_____40__Blogging_and_Wiki_pages__41__/comment_1_15651796492a6f04a19f4a481947c97c._comment | 16
-rw-r--r--  doc/forum/Disable_account_creation_for_new_users.mdwn | 9
-rw-r--r--  doc/forum/Disable_account_creation_for_new_users/comment_1_adafddb0aff7c2c1f4574101c4cf9073._comment | 8
-rw-r--r--  doc/forum/Disable_account_creation_for_new_users/comment_2_865591f77966f1657a9a4b2426318c51._comment | 12
-rw-r--r--  doc/forum/Disable_account_creation_for_new_users/comment_3_05193e563682f634f13691ee0a8359db._comment | 8
-rw-r--r--  doc/forum/Discussion_PageSpec__63__.mdwn | 3
-rw-r--r--  doc/forum/Doing_related_links_based_on_tags.mdwn | 31
-rw-r--r--  doc/forum/Dump_plugin.mdwn | 4
-rw-r--r--  doc/forum/Dump_plugin/comment_1_bfce80b3f5be78ec28692330843d4ae1._comment | 14
-rw-r--r--  doc/forum/Email_notifications_for_comment_moderation.mdwn | 3
-rw-r--r--  doc/forum/Email_notifications_for_comment_moderation/comment_1_668bf6a21310dcc8b882bc60a130ba06._comment | 12
-rw-r--r--  doc/forum/Empty_sha1sum_messages.mdwn | 11
-rw-r--r--  doc/forum/Empty_sha1sum_messages/comment_1_b260b5e6b4c4f4c203b01183fee9fd69._comment | 10
-rw-r--r--  doc/forum/Empty_sha1sum_messages/comment_2_d6a47838a3c81d0a75e6fc22e786c976._comment | 10
-rw-r--r--  doc/forum/Encoding_problem_in_french_with_ikiwiki-calendar.mdwn | 20
-rw-r--r--  doc/forum/Error:_CGI::tmpFileName_failed_to_return_the_uploaded_file_name.mdwn | 11
-rw-r--r--  doc/forum/Error:_CGI::tmpFileName_failed_to_return_the_uploaded_file_name/comment_1_66c321b9eb618d20872cee7d6ca9e44c._comment | 8
-rw-r--r--  doc/forum/Error:_CGI::tmpFileName_failed_to_return_the_uploaded_file_name/comment_2_80296d67c7f1dd75b56b85c14f5efa3b._comment | 12
-rw-r--r--  doc/forum/Error:___34__do__34___parameter_missing.mdwn | 13
-rw-r--r--  doc/forum/Error:___34__do__34___parameter_missing/comment_1_3a51c303ba1670f1567f323349b53837._comment | 16
-rw-r--r--  doc/forum/Error:___34__do__34___parameter_missing/comment_2_c5f24a8c4d2de0267cf0de1908480e82._comment | 12
-rw-r--r--  doc/forum/Error:_bad_page_name.mdwn | 46
-rw-r--r--  doc/forum/Error:_cannot_write_to___47__home__47__user__47__myiki2__47__.ikiwiki__47__lockfile:_Permission_denied_.mdwn | 5
-rw-r--r--  doc/forum/Error:_cannot_write_to___47__home__47__user__47__myiki2__47__.ikiwiki__47__lockfile:_Permission_denied_/comment_1_64146f306ec8c10614521359b6de4f82._comment | 10
-rw-r--r--  doc/forum/Error:_cannot_write_to___47__home__47__user__47__myiki2__47__.ikiwiki__47__lockfile:_Permission_denied_/comment_2_ed2b4b8f7122b42bbde1189fbd2969dd._comment | 23
-rw-r--r--  doc/forum/Error_Code_1.mdwn | 7
-rw-r--r--  doc/forum/Error_Code_1/comment_1_0459afcc383aad382df67a19eaf2e731._comment | 8
-rw-r--r--  doc/forum/Everyone_can_remove_comments.mdwn | 1
-rw-r--r--  doc/forum/Flowplayer.mdwn | 1
-rw-r--r--  doc/forum/Flowplayer/comment_1_75d13cd915a736422db47e00dbe46671._comment | 9
-rw-r--r--  doc/forum/Flowplayer/comment_2_1b2d3891006a87a4773bd126baacddfc._comment | 8
-rw-r--r--  doc/forum/Forward_slashes_being_escaped_as_252F.mdwn | 33
-rw-r--r--  doc/forum/Forward_slashes_being_escaped_as_252F/comment_1_7702cf6d354ab600d6643b075b9f09da._comment | 12
-rw-r--r--  doc/forum/Google_searches_of_ikiwiki.info_are_broken._:__40__.mdwn | 14
-rw-r--r--  doc/forum/Have_creation_time_for_new_pages_fetched_from_the_VCS.mdwn | 7
-rw-r--r--  doc/forum/Have_creation_time_for_new_pages_fetched_from_the_VCS/comment_1_9572dd6f7a2f6f630b12f24bb5c4a8ce._comment | 8
-rw-r--r--  doc/forum/Help_with_tag__95__autocreate.mdwn | 9
-rw-r--r--  doc/forum/Hide_text.mdwn | 3
-rw-r--r--  doc/forum/Hide_text/comment_1_f21d21c130f97a7b21d8a317178e2e0c._comment | 8
-rw-r--r--  doc/forum/Hide_text/comment_2_5a878865f34f78a89c4ec91a9425a085._comment | 8
-rw-r--r--  doc/forum/How_best_to_manage_a_bilingual_ikiwiki_site__63__.mdwn | 15
-rw-r--r--  doc/forum/How_best_to_manage_a_bilingual_ikiwiki_site__63__/comment_1_4389d65b14fa1b7134098e0ffe3bf055._comment | 10
-rw-r--r--  doc/forum/How_can_I_prevent_spam__63__.mdwn | 17
-rw-r--r--  doc/forum/How_can_I_prevent_spam__63__/comment_1_fd26fb7f1569e8c44ba8262794f938db._comment | 19
-rw-r--r--  doc/forum/How_can_I_prevent_spam__63__/comment_2_d098124f005976ee815d25c883bc9106._comment | 16
-rw-r--r--  doc/forum/How_can_I_prevent_spam__63__/comment_3_deb434d01aaefa18d2791e48d6c824ae._comment | 8
-rw-r--r--  doc/forum/How_do_I_enable_OpenID__63__.mdwn | 1
-rw-r--r--  doc/forum/How_does_ikiwiki_remember_times__63__.mdwn | 98
-rw-r--r--  doc/forum/How_is_TITLE_evaluated_in_inline_archive_templates__63__.mdwn | 11
-rw-r--r--  doc/forum/How_long_does_server_delay_newly_pushed_revisions__63__.mdwn | 10
-rw-r--r--  doc/forum/How_to_add_a_mouse-over_pop-up_label_for_a_text__63__.mdwn | 8
-rw-r--r--  doc/forum/How_to_add_additional_links_to_the_gray_horizontable_bar_under_page_title__63__.mdwn | 3
-rw-r--r--  doc/forum/How_to_add_additional_links_to_the_gray_horizontable_bar_under_page_title__63__/comment_1_f2e52d38f60888c7d5142de853123540._comment | 8
-rw-r--r--  doc/forum/How_to_add_link_to_previous_and_next_blog_on_blog_pages__63__.mdwn | 14
-rw-r--r--  doc/forum/How_to_add_link_to_previous_and_next_blog_on_blog_pages__63__/comment_1_aad510f45be505efaabcb6fb860665a4._comment | 23
-rw-r--r--  doc/forum/How_to_add_link_to_previous_and_next_blog_on_blog_pages__63__/comment_2_ee65792a5b796caa216f4e7a653fc668._comment | 23
-rw-r--r--  doc/forum/How_to_allow_.markdown_and_.md_at_the_same_time_as_valid_extensions_for_source_files__63__.mdwn | 1
-rw-r--r--  doc/forum/How_to_apply_a_background_color_to_a_page__63__.mdwn | 1
-rw-r--r--  doc/forum/How_to_change_registration_page.mdwn | 9
-rw-r--r--  doc/forum/How_to_change_registration_page/comment_1_43758a232e4360561bc84f710862ff40._comment | 14
-rw-r--r--  doc/forum/How_to_change_registration_page/comment_2_8176ef231cf901802fc60b6d414018e6._comment | 8
-rw-r--r--  doc/forum/How_to_configure_po_plugin__63__.mdwn | 21
-rw-r--r--  doc/forum/How_to_configure_po_plugin__63__/comment_1_5e0cc4cdfd126f2f4af64104f02102d6._comment | 9
-rw-r--r--  doc/forum/How_to_create_a_WikiLink_to_a_page_in_a_subdirectory__63__.mdwn | 26
-rw-r--r--  doc/forum/How_to_create_a_WikiLink_to_a_page_in_a_subdirectory__63__/comment_1_d20ee1d8d7a3e77a445f8b887e807119._comment | 11
-rw-r--r--  doc/forum/How_to_create_first_translation_page_using_po_plugin__63__.mdwn | 24
-rw-r--r--  doc/forum/How_to_customize_page_title__63__.mdwn | 6
-rw-r--r--  doc/forum/How_to_customize_page_title__63__/comment_1_403e1f866b5e04e5899021f54bbdd1ed._comment | 10
-rw-r--r--  doc/forum/How_to_fix___34__does_not_map_to_Unicode__34___errors__63__.mdwn | 20
-rw-r--r--  doc/forum/How_to_format___91____91__foobar__93____93___in_code_blocks__63__.mdwn | 9
-rw-r--r--  doc/forum/How_to_format___91____91__foobar__93____93___in_code_blocks__63__/comment_1_ad000d39fd1dc05aa8ef6eb19d8d999b._comment | 8
-rw-r--r--  doc/forum/How_to_generate_blog_archive_pages_in___47__blog_subdir_but_not_ikiwiki_root_path__63__.mdwn | 26
-rw-r--r--  doc/forum/How_to_inline_a_page_from_another_git_repository.mdwn | 5
-rw-r--r--  doc/forum/How_to_list_all_pages_in_a_wiki_instance__63__.mdwn | 5
-rw-r--r--  doc/forum/How_to_list_all_pages_in_a_wiki_instance__63__/comment_1_920bcc70fe6d081cf27aa2cc7c6136f4._comment | 8
-rw-r--r--  doc/forum/How_to_list_new_pages__44___inline__63__.mdwn | 5
-rw-r--r--  doc/forum/How_to_list_new_pages__44___inline__63__/comment_1_e989b18bade34a92a9c8fe7099036e15._comment | 13
-rw-r--r--  doc/forum/How_to_lock_all_pages_except_discussions_and_comments_to_blog_posts__63__.mdwn | 11
-rw-r--r--  doc/forum/How_to_lock_all_pages_except_discussions_and_comments_to_blog_posts__63__/comment_1_e153beb17b6ada69c6ab09d1f491d112._comment | 8
-rw-r--r--  doc/forum/How_to_make_a_table_of_content_at_the_top_of_page__63__.mdwn | 3
-rw-r--r--  doc/forum/How_to_make_a_table_of_content_at_the_top_of_page__63__/comment_1_6dedc31dd1145490bb5fa4ad14cc4c63._comment | 8
-rw-r--r--  doc/forum/How_to_migrate_multiple_pages_in_root_directory_into_a_new_subdirectory__63__.mdwn | 10
-rw-r--r--  doc/forum/How_to_migrate_multiple_pages_in_root_directory_into_a_new_subdirectory__63__/comment_1_a83a1a33afbf245971733b4128809365._comment | 15
-rw-r--r--  doc/forum/How_to_set_the_meta_author_field_from_user_name__63__.mdwn | 3
-rw-r--r--  doc/forum/How_to_set_the_meta_author_field_from_user_name__63__/comment_1_0906e1f3eb8b826a7730233b95cb5ddd._comment | 10
-rw-r--r--  doc/forum/How_to_set_up_a_page_as_internal__63__.mdwn | 5
-rw-r--r--  doc/forum/How_to_set_up_git_repository_hook___63__.mdwn | 19
-rw-r--r--  doc/forum/How_to_show_recent_changes_for_individual_pages__63__.mdwn | 1
-rw-r--r--  doc/forum/How_to_show_recent_changes_for_individual_pages__63__/comment_1_cd34affc6883f4e4bc5e7e7b711cc8ba._comment | 10
-rw-r--r--  doc/forum/How_to_specify_repository_is_on_a_remote_host__63__.mdwn | 3
-rw-r--r--  doc/forum/How_to_specify_repository_is_on_a_remote_host__63__/comment_1_0c71e17ae552cbab1056ac96fbd36c59._comment | 9
-rw-r--r--  doc/forum/How_to_specify_repository_is_on_a_remote_host__63__/comment_2_b309302a084fbd8bcd4cd9bd2509cf5a._comment | 10
-rw-r--r--  doc/forum/How_to_style_main_sidebar_and_SubPage_sidebar_differently_using_CSS__63__.mdwn | 13
-rw-r--r--  doc/forum/How_to_use___126____47__bin__47__multimarkdown_instead_of_Text::MultiMarkdown.mdwn | 5
-rw-r--r--  doc/forum/How_to_use_number_as_bullet_labels_but_not_letter_in_toc_plugin.mdwn | 8
-rw-r--r--  doc/forum/Howto_add_tag_from_plugin_code.mdwn | 12
-rw-r--r--  doc/forum/Howto_add_tag_from_plugin_code/comment_1_c61454825874a6fe1905cb549386deb0._comment | 77
-rw-r--r--  doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__.mdwn | 14
-rw-r--r--  doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_1_a66fd9d7ab4359784a5420cd899a1057._comment | 8
-rw-r--r--  doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_2_3351ff773fea3f640f4036bb8c7c7efd._comment | 10
-rw-r--r--  doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_3_273b2b63a9af2bc4eeb030e026436687._comment | 12
-rw-r--r--  doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_4_546771c13ea1b550301586e187d82cb5._comment | 8
-rw-r--r--  doc/forum/Ikiwiki_CGI_not_working_on_my_server__44___and_it__39__s_a_binary_file__63__.mdwn | 33
-rw-r--r--  doc/forum/Ikiwiki_themes_for_mobile_devices__63__.mdwn | 7
-rw-r--r--  doc/forum/Include_attachment_in_a_page.mdwn | 9
-rw-r--r--  doc/forum/Include_attachment_in_a_page/comment_1_275aad6ca3b2972749b7f6636b130035._comment | 12
-rw-r--r--  doc/forum/Is_it_possible_to_change_default_mdwn_suffix__63__.mdwn | 1
-rw-r--r--  doc/forum/Is_it_possible_to_change_default_mdwn_suffix__63__/comment_1_2a449c6017ecdb4f557963266fb4ec41._comment | 8
-rw-r--r--  doc/forum/Is_there_a_pagespec_for_creation_dates_relative_to_today__63__.mdwn | 26
-rw-r--r--  doc/forum/LaTeX_Error.mdwn | 66
-rw-r--r--  doc/forum/Last_visited_pages.mdwn | 1
-rw-r--r--  doc/forum/Last_visited_pages/comment_1_e34650064dd645b35da98e80c0311df9._comment | 8
-rw-r--r--  doc/forum/Last_visited_pages/comment_2_2a0c4e844da1deaa2c286e87c8eab84d._comment | 8
-rw-r--r--  doc/forum/Link_to_a_local_pdf_file.mdwn | 1
-rw-r--r--  doc/forum/Link_to_a_local_pdf_file/comment_1_b6c57588042373f8e1f187041c1a8530._comment | 8
-rw-r--r--  doc/forum/Log_in_error.mdwn | 5
-rw-r--r--  doc/forum/Log_in_error/comment_1_0ef13ea01a413160d81951636c15c3e6._comment | 10
-rw-r--r--  doc/forum/Map_Plugin__44___would_like_to_add___63__updated_to_all_links.mdwn | 3
-rw-r--r--  doc/forum/Map_Plugin__44___would_like_to_add___63__updated_to_all_links/comment_1_3fe4c5967e704355f9b594aed46baf67._comment | 13
-rw-r--r--  doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated.mdwn | 27
-rw-r--r--  doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_2_b44a492c7f10395a31f3c0830ef33f0c._comment | 10
-rw-r--r--  doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_2_f9240b217b2d1ee8d51dada9cb1186b3._comment | 28
-rw-r--r--  doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_3_c3c5c41a4c220793c6d16f3fd6132272._comment | 15
-rw-r--r--  doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_4_1f6f9e3939a454c1eb8d2fb29bd519de._comment | 16
-rw-r--r--  doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_5_8611fc62797e70a0d2a61d94fcb03170._comment | 22
-rw-r--r--  doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn | 54
-rw-r--r--  doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__/comment_1_e5ce524c5d34b1d4218172296bd99100._comment | 8
-rw-r--r--  doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__/comment_3_65c4a4895f6541ff0ff2d094ff447bba._comment | 8
-rw-r--r--  doc/forum/Moving_wiki.git_folder__63__.mdwn | 17
-rw-r--r--  doc/forum/Moving_wiki.git_folder__63__/comment_1_05238461520613f4ed1b0d02ece663bd._comment | 11
-rw-r--r--  doc/forum/Moving_wiki.git_folder__63__/comment_2_72b2b842dfa0cfaf899fe7af12977519._comment | 10
-rw-r--r--  doc/forum/Multiple_urls.mdwn | 8
-rw-r--r--  doc/forum/Multiple_urls/comment_1_e4c1256346d5a421161c20e344d8bada._comment | 22
-rw-r--r--  doc/forum/Need_help_installing_h1title_plugin.mdwn | 5
-rw-r--r--  doc/forum/Need_help_setting_up_ikiwiki_CGI.mdwn | 16
-rw-r--r--  doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_1_0fc4573568711c56a0df4af620110c2f._comment | 12
-rw-r--r--  doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_3_89f2cd7d874a6257786478e4cae1e2bc._comment | 16
-rw-r--r--  doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_3_cbc20267fe5f0531f63db881d50596d1._comment | 20
-rw-r--r--  doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_4_2eaf53935eecd0a918755d728450a642._comment | 8
-rw-r--r--  doc/forum/Need_some_help_on_starting_to_use_po_plugin_for_creating_pages_in_multiple_languages.mdwn | 6
-rw-r--r--  doc/forum/Need_something_more_powerful_than_Exclude.mdwn | 5
-rw-r--r--  doc/forum/Need_something_more_powerful_than_Exclude/comment_2_0019cd6b34c8d8678b2532de57a92d15._comment | 12
-rw-r--r--  doc/forum/Need_something_more_powerful_than_Exclude/comment_2_f577ab6beb9912471949d8d18c790267._comment | 11
-rw-r--r--  doc/forum/Need_something_more_powerful_than_Exclude/comment_3_1ed260b0083a290688425a006a83f603._comment | 8
-rw-r--r--  doc/forum/Need_something_more_powerful_than_Exclude/comment_4_c39bdaf38e1e20db74eb26f0560bd673._comment | 10
-rw-r--r--  doc/forum/Need_something_more_powerful_than_Exclude/comment_5_39b01857f7e0b388a6e7a3c1cf5388d5._comment | 9
-rw-r--r--  doc/forum/Need_something_more_powerful_than_Exclude/comment_6_1dccdfebad31446200213a2cae25f0e2._comment | 10
-rw-r--r--  doc/forum/News_site_where_articles_are_submitted_and_then_reviewed_before_posting.mdwn | 23
-rw-r--r--  doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__.mdwn | 12
-rw-r--r--  doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_1_bf1bec748d6ab419276a73a7001024cf._comment | 8
-rw-r--r--  doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_2_14a1b269be6dbcc9b2068d3e18b55711._comment | 10
-rw-r--r--  doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_3_f581afcdb4481ea5d65bcc33bdbab99a._comment | 25
-rw-r--r--  doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_4_b0d39d30852bca1525ab9612a7532670._comment | 8
-rw-r--r--  doc/forum/PERL5LIB__44___wrappers_and_homedir_install.mdwn | 38
-rw-r--r--  doc/forum/PageSpec_results_from_independent_checkout.mdwn | 8
-rw-r--r--  doc/forum/Parent_Links_all_link_to_root.mdwn | 18
-rw-r--r--  doc/forum/Parent_Links_all_link_to_root/comment_1_4b5ed25cceb7740f64ee08aba00a1d91._comment | 8
-rw-r--r--  doc/forum/Perhaps_I__39__m_doing_it_wrong_-_tracking_non-post_files_in_a_blog.mdwn | 7
-rw-r--r--  doc/forum/Perhaps_I__39__m_doing_it_wrong_-_tracking_non-post_files_in_a_blog/comment_1_45ecaf6efa2065837fa54a42737f0a66._comment | 18
-rw-r--r--  doc/forum/Perhaps_I__39__m_doing_it_wrong_-_tracking_non-post_files_in_a_blog/comment_2_45ca7ef4190c281d703c8c7ca6979298._comment | 12
-rw-r--r--  doc/forum/Possible_to_use_meta_variables_in_templates__63__.mdwn | 11
-rw-r--r--  doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_1_556078a24041289d8f0b7ee756664690._comment | 20
-rw-r--r--  doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_2_e7e954218d39bc310015b95aa1a5212c._comment | 10
-rw-r--r--  doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_3_8b16c563c89eb6980ad6a5539d934d7a._comment | 8
-rw-r--r--  doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_4_76eadf93cce4e2168960131d4677c5fc._comment | 8
-rw-r--r--  doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_5_ddabe4a005042d19c7669038b49275c1._comment | 12
-rw-r--r--  doc/forum/Problem_with_gitweb.mdwn | 3
-rw-r--r--  doc/forum/Problem_with_gitweb/comment_2_23cc0d87448d3cbdac20a005e9191589._comment | 10
-rw-r--r--  doc/forum/Problem_with_gitweb/comment_3_697c6038009249e6a49d9e458a5ba271._comment | 47
-rw-r--r--  doc/forum/Problem_with_gitweb/comment_3_6a5b96f7e0d6b169c090e3df7281d938._comment | 8
-rw-r--r--  doc/forum/Problem_with_gitweb/comment_5_8a79b879205bd265d54e30f0eee2ac63._comment | 8
-rw-r--r--  doc/forum/Problem_with_local_git_commit.mdwn | 42
-rw-r--r--  doc/forum/Processing_non-pages.mdwn | 7
-rw-r--r--  doc/forum/Recent_changes_on_main_site_or_on_a_sidebar.mdwn | 1
-rw-r--r--  doc/forum/Recent_changes_on_main_site_or_on_a_sidebar/comment_1_018b977ff7ee59fc53838e0c20c3a9a7._comment | 11
-rw-r--r--  doc/forum/Recent_changes_on_main_site_or_on_a_sidebar/comment_2_927c11f18315baa39f08ca4982ed2ab1._comment | 8
-rw-r--r--  doc/forum/Refresh_or_recreate_style.css__63__.mdwn | 40
-rw-r--r--  doc/forum/Refresh_or_recreate_style.css__63__/comment_1_3274be931d0b543c7f7cf641810817aa._comment | 8
-rw-r--r--  doc/forum/Regex_for_Valid_Characters_in_Filenames.mdwn | 19
-rw-r--r--  doc/forum/Render_more_than_one_dest_page_from_same_source_page.mdwn | 51
-rw-r--r--  doc/forum/Revision_history_for_single_pages.mdwn | 3
-rw-r--r--  doc/forum/Revision_history_for_single_pages/comment_1_d509d5d726cd7eab9472d723013f5ec4._comment | 8
-rw-r--r--  doc/forum/Revision_history_for_single_pages/comment_2_d39a6177fc4c1e3c3c2c4e2592be9e3d._comment | 8
-rw-r--r--  doc/forum/Revision_history_for_single_pages/comment_3_aecf2b031ace001afaa2a0f2b5f50c82._comment | 8
-rw-r--r--  doc/forum/Run_script_on_markdown_source.mdwn | 1
-rw-r--r--  doc/forum/See_rendered_old_revisions_via_pagehistory.mdwn | 1
-rw-r--r--  doc/forum/Setting_http__95__proxy.mdwn | 22
-rw-r--r--  doc/forum/Setting_http__95__proxy/comment_1_350a7c4834c9f422e107b646cdbae3b0._comment | 20
-rw-r--r--  doc/forum/Setting_template_variable_from_config_file__63__.mdwn | 1
-rw-r--r--  doc/forum/Setting_template_variable_from_config_file__63__/comment_1_bb4b5a7a49f33d660b5116fc0ce3c92d._comment | 8
-rw-r--r--  doc/forum/Setting_up_a_development_environment.mdwn | 32
-rw-r--r--  doc/forum/Should_files_in_.ikiwiki_be_committed_and_pushed__63__.mdwn | 14
-rw-r--r--  doc/forum/Should_files_in_.ikiwiki_be_committed_and_pushed__63__/comment_1_8e65d7d8298e3c31d2a16446a71c8049._comment | 10
-rw-r--r--  doc/forum/Should_not_create_an_existing_page.mdwn | 15
-rw-r--r--  doc/forum/Sidebar_with_links__63__.mdwn | 58
-rw-r--r--  doc/forum/Slow_ikiwiki_after_first_run.mdwn | 1
-rw-r--r--  doc/forum/Spaces_in_wikilinks.mdwn | 104
-rw-r--r--  doc/forum/Split_a_wiki.mdwn | 21
-rw-r--r--  doc/forum/Split_a_wiki/comment_1_1599c26891b2071a2f1ca3fd90627fc4._comment | 8
-rw-r--r--  doc/forum/Split_a_wiki/comment_2_1c54d3594f0350340f8dfb3e95c29ffd._comment | 20
-rw-r--r--  doc/forum/Split_a_wiki/comment_3_9eac1d1b93df27d849acc574b1f0f26d._comment | 8
-rw-r--r--  doc/forum/Split_a_wiki/comment_4_e193ba447c0188f72ba589180b5d529e._comment | 8
-rw-r--r--  doc/forum/TMPL__95__VAR_IS__95__ADMIN.mdwn | 1
-rw-r--r--  doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_1_3172568473e9b79ad7ab623afd19411a._comment | 13
-rw-r--r--doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_2_4302d56a6fe68d17cc42d26e6f3566c2._comment8
-rw-r--r--doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_3_4cc44e61b9c28a2d524fa874f115041a._comment14
-rw-r--r--doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_4_33143bad68f3f6beae963a3d0ec5d0bd._comment53
-rw-r--r--doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_5_ef790766456d723670f52cc9e3955e90._comment12
-rw-r--r--doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_6_3db50264e01c8fad2e5567b5a9c7b6dc._comment23
-rw-r--r--doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_7_bdc5c96022fdb8826b57d68a41ef6ca0._comment8
-rw-r--r--doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server..mdwn11
-rw-r--r--doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_1_d36ce6fab90e0a086ac84369af38d205._comment16
-rw-r--r--doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_2_5836bba08172d2ddf6a43da87ebb0332._comment7
-rw-r--r--doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_3_4eec15c8c383275db5401c8e3c2d9242._comment9
-rw-r--r--doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_4_43ac867621efb68affa6ae2b92740cad._comment10
-rw-r--r--doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_5_e098723bb12adfb91ab561cae21b492b._comment8
-rw-r--r--doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_6_101183817ca4394890bd56a7694bedd9._comment10
-rw-r--r--doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_7_2f514e6ba78d43d90e7ff4ae387e65e0._comment10
-rw-r--r--doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_8_098bb7a3112751a7e6167483dde626bb._comment10
-rw-r--r--doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_9_fbf403255c38da93caa5b98589fbb285._comment8
-rw-r--r--doc/forum/Translating_ikiwiki_interface.mdwn8
-rw-r--r--doc/forum/Upgrade_steps_from_RecentChanges_CGI_to_static_page.mdwn10
-rw-r--r--doc/forum/Various_ways_to_use_Subversion_with_ikiwiki.mdwn23
-rw-r--r--doc/forum/What__39__s_the_difference_between_tag_and_taglink__63__.mdwn9
-rw-r--r--doc/forum/What__39__s_the_difference_between_tag_and_taglink__63__/comment_1_b3553d65d12af4c4a87f1f66f961c8d9._comment49
-rw-r--r--doc/forum/What_is_wrong_with_my_recentchange_page___63__.mdwn13
-rw-r--r--doc/forum/When_do_tags_like_a__47__b_get_listed_as_a__47__b_and_not_only_b__63__.mdwn6
-rw-r--r--doc/forum/When_do_tags_like_a__47__b_get_listed_as_a__47__b_and_not_only_b__63__/comment_1_cd5ea3aac8a59793ece5bf01a6190b53._comment9
-rw-r--r--doc/forum/Wikilink_to_a_symbolic_link.mdwn1
-rw-r--r--doc/forum/Wikilink_to_a_symbolic_link/comment_1_e3ad5099491e0c84cd7729eba82ce552._comment8
-rw-r--r--doc/forum/Wikilink_to_a_symbolic_link/comment_2_46848020b1e3d0cd55bc1ec0ba382aad._comment8
-rw-r--r--doc/forum/Wikilink_to_section_of_a_wikipage.mdwn1
-rw-r--r--doc/forum/Wikilink_to_section_of_a_wikipage/comment_1_c1409a3c07dfc4ed7274560c962aba75._comment11
-rw-r--r--doc/forum/Wikilink_to_section_of_a_wikipage/comment_2_8a04eb7b0d7f17b9e5bb4cd04ba45871._comment8
-rw-r--r--doc/forum/Xapian_search:_empty_postlist_table.mdwn34
-rw-r--r--doc/forum/Xapian_search:_empty_postlist_table/comment_1_de9a7c94beec2707eda0924ca58be9df._comment8
-rw-r--r--doc/forum/Xapian_search:_empty_postlist_table/comment_2_55f191e4b1306a318a30319f01802229._comment15
-rw-r--r--doc/forum/Xapian_search:_empty_postlist_table/comment_3_0bd424a89c3a52ff393a1e7e00c806be._comment24
-rw-r--r--doc/forum/Xapian_search:_empty_postlist_table/comment_4_40479ac2cfbca609f5f423e539a20ee0._comment8
-rw-r--r--doc/forum/Xapian_search:_empty_postlist_table/comment_5_397443138da276e11c2e9b9fa7b51406._comment8
-rw-r--r--doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__.mdwn13
-rw-r--r--doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__/comment_1_953bd716373dcf51fa444ac098b7f971._comment8
-rw-r--r--doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__/comment_2_c7360852f9bf069f28c193373333c9a8._comment8
-rw-r--r--doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__/comment_3_6ffc30e27387366b48112198b66c01fa._comment8
-rw-r--r--doc/forum/access_restrictions:_for_extranet.mdwn8
-rw-r--r--doc/forum/access_restrictions:_for_extranet/comment_1_a0666c3c15661fb0fff70f313cd0d47d._comment29
-rw-r--r--doc/forum/access_restrictions:_for_extranet/comment_2_563040aa099c9366dc5701eb4bc9c10d._comment20
-rw-r--r--doc/forum/an_alternative_approach_to_structured_data.mdwn63
-rw-r--r--doc/forum/appear_if_you_are_login_or_not_in_a_page.mdwn36
-rw-r--r--doc/forum/attachments_fail_to_upload.mdwn8
-rw-r--r--doc/forum/attachments_fail_to_upload/comment_1_577adde1dfa49463dfa8e169c462fc42._comment10
-rw-r--r--doc/forum/attachments_fail_to_upload/comment_2_473f38c6d523496fac8dad13ac6d20c3._comment12
-rw-r--r--doc/forum/attachments_fail_to_upload/comment_3_799a2f1b7b259157e97fd31ec76fb845._comment10
-rw-r--r--doc/forum/attachments_fail_to_upload/comment_4_e37d1497acafd3fda547462f000636e3._comment8
-rw-r--r--doc/forum/attachments_fail_to_upload/comment_5_da03f9c4917cb1ef52de984b8ba86b68._comment11
-rw-r--r--doc/forum/attachments_fail_to_upload/comment_6_04498946a300ddb652dec73c2950f48f._comment19
-rw-r--r--doc/forum/bashman.mdwn7
-rw-r--r--doc/forum/build_error:_Cannot_decode_string_with_wide_characters.mdwn12
-rw-r--r--doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_1_83fbb415dd3ae6a19ed5ea5f82065c28._comment8
-rw-r--r--doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_2_d258536c98538d4744f66eb3132439a9._comment20
-rw-r--r--doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_3_d62173d0ae220ab7b063631952856587._comment10
-rw-r--r--doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_4_d5d0174e09a94359c23fd9c006a22bbc._comment50
-rw-r--r--doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_5_e652027a8f90ebef6f21613b5784ded2._comment8
-rw-r--r--doc/forum/chinese_character_problem.mdwn21
-rw-r--r--doc/forum/cleaning_up_discussion_pages_and_the_like.mdwn11
-rw-r--r--doc/forum/copyright_and_license_template_variables___40__where_are_they_set__63____41__.mdwn13
-rw-r--r--doc/forum/create_download_link.mdwn4
-rw-r--r--doc/forum/create_download_link/comment_1_4797493157c569f8893b53b5e5a58e73._comment14
-rw-r--r--doc/forum/cutpaste.pm_not_only_file-local.mdwn14
-rw-r--r--doc/forum/cutpaste.pm_not_only_file-local/comment_1_497c62f21fd1b87625b806407c72dbad._comment8
-rw-r--r--doc/forum/debian_backports_update_someone_please.mdwn18
-rw-r--r--doc/forum/discussion.mdwn7
-rw-r--r--doc/forum/double_forward_slash___39____47____47____39___in_the_address_bar.mdwn3
-rw-r--r--doc/forum/editing_a_comment.mdwn11
-rw-r--r--doc/forum/editing_the_style_sheet.mdwn18
-rw-r--r--doc/forum/error_302___40__Found__41___when_editing_page.mdwn59
-rw-r--r--doc/forum/ever-growing_list_of_pages.mdwn29
-rw-r--r--doc/forum/field__95__tags_not_linking.mdwn66
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_10_7c1540e6eb6aafd2e1c9c7016e6e6249._comment10
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_11_0c03cbaa4f748d2fb932fda08fe6e966._comment8
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_12_9f3a402173f9584d8a36bc61e5755f6d._comment8
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_13_455a2f921059f9ecca810bb8afed0fda._comment10
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_14_b82294c290a215d9aa6774ee20b5a552._comment10
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_15_57fb279ad50f8460341dc0f217acef06._comment10
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_16_8dae1024e80cf6ea765dee0318324d71._comment8
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_1_76a4fb4def8f13b906c848814de91660._comment12
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_2_64d51cc9ba953e7fed609c380e30bb7d._comment13
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_3_7a6eac4e216133f1cf6fc12336fc2496._comment8
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_4_e6941a0df00fb9f45563c30e01efa622._comment9
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_5_f08ded5a946458aeba59a2c4cec29b2f._comment8
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_6_6ea7de20c3db96589c05adbe97d57cfd._comment8
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_7_8ad385b61c46389d87c88b17430ab1f2._comment8
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_8_c3c5eced158babd8c3acb493a86b6ecb._comment8
-rw-r--r--doc/forum/field__95__tags_not_linking/comment_9_9bd4b3df18a28a7ab3bbef5013856987._comment11
-rw-r--r--doc/forum/field_and_forms.mdwn13
-rw-r--r--doc/forum/field_and_forms/comment_1_a0e976cb79f03dcff5e9a4511b90d160._comment19
-rw-r--r--doc/forum/formating:_how_to_align_text_to_the_right.mdwn15
-rw-r--r--doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__.mdwn6
-rw-r--r--doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__/comment_1_747cc477584028ce2c7bc198070b1221._comment10
-rw-r--r--doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__/comment_2_a230861b26dba6d61461862bfedbc09c._comment8
-rw-r--r--doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__/comment_3_848b4801fc7887906a21a676e802023c._comment10
-rw-r--r--doc/forum/google_openid_broken__63__.mdwn82
-rw-r--r--doc/forum/how_can_I_use___39____47____39___as_tagbase__63__.mdwn13
-rw-r--r--doc/forum/how_can_I_use___39____47____39___as_tagbase__63__/comment_1_e7897651ba8d9156526d36d6b7744eae._comment8
-rw-r--r--doc/forum/how_could_i_generate_a_flat_textfile_from_metadata_in_multiple_pages.mdwn3
-rw-r--r--doc/forum/how_do_I_revert_edits_in_the_web_mode__63__.mdwn46
-rw-r--r--doc/forum/how_do_I_revert_edits_in_the_web_mode__63__/comment_1_e4720e8e4fe74bd6cba746e8259832e6._comment8
-rw-r--r--doc/forum/how_do_I_translate_a_TWiki_site.mdwn44
-rw-r--r--doc/forum/how_to_add_post_titles_in_ikiwiki_blog__63__.mdwn28
-rw-r--r--doc/forum/how_to_enable_multimarkdown__63__.mdwn9
-rw-r--r--doc/forum/how_to_enable_multimarkdown__63__/comment_1_037f858c4d0bcbb708c3efd264379500._comment14
-rw-r--r--doc/forum/how_to_enable_multimarkdown__63__/comment_2_b7d512a535490dabf8d6ce55439741c7._comment8
-rw-r--r--doc/forum/how_to_get_nice_pdf_from_ikiwiki_pages__63__.mdwn3
-rw-r--r--doc/forum/how_to_get_nice_pdf_from_ikiwiki_pages__63__/comment_1_332d32850c3dc0d45f5cc50434205f39._comment8
-rw-r--r--doc/forum/how_to_load_an_external_page_and_still_have_it_under_ikiwiki_template.mdwn3
-rw-r--r--doc/forum/how_to_login_as_admin.mdwn18
-rw-r--r--doc/forum/how_to_login_as_admin/comment_1_295e130c6400a2d7336758e82bcd5647._comment10
-rw-r--r--doc/forum/how_to_setup_ikiwiki_on_a_remote_host.mdwn35
-rw-r--r--doc/forum/howto_install_the_pagedown_plugin.mdwn1
-rw-r--r--doc/forum/howto_install_the_pagedown_plugin/comment_1_158fbcef24d20920c40968da8f10442a._comment8
-rw-r--r--doc/forum/html_source_pages_in_version_3.20100704.mdwn8
-rw-r--r--doc/forum/ikiwiki_+_mathjax.mdwn1
-rw-r--r--doc/forum/ikiwiki_+_mathjax/comment_1_8426a985ecfbb02d364116503ef3a0d4._comment8
-rw-r--r--doc/forum/ikiwiki_+_mathjax/comment_2_ddb7a4d59bbe7145167d122a146e8f65._comment11
-rw-r--r--doc/forum/ikiwiki_+_mathjax/comment_3_5a118654bc008bbb118285ff141eb6f1._comment8
-rw-r--r--doc/forum/ikiwiki_--setup_creates_tmp__47___directory_in_destdir.mdwn10
-rw-r--r--doc/forum/ikiwiki__39__s_notion_of_time.mdwn35
-rw-r--r--doc/forum/ikiwiki_and_big_files.mdwn102
-rw-r--r--doc/forum/ikiwiki_and_big_files/comment_1_df8a9f4249af435cc335f77768a3278d._comment8
-rw-r--r--doc/forum/ikiwiki_and_big_files/comment_2_2d996f1124aedc10f345139c3d8b11df._comment19
-rw-r--r--doc/forum/ikiwiki_and_big_files/comment_3_dfbd38e2b457ea3c4f70266dbf8fbeab._comment8
-rw-r--r--doc/forum/ikiwiki_development_environment_tips.mdwn68
-rw-r--r--doc/forum/ikiwiki_generates_html_files_with_600_permission..mdwn8
-rw-r--r--doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_1_6d73d412a9cc6f6ae426b62885c1f157._comment19
-rw-r--r--doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_2_1392fcde369d11a264f31f6b8993ccec._comment8
-rw-r--r--doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_3_962306f22ceb17afb4150e766e9a05b3._comment10
-rw-r--r--doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_4_8b988d85cfde123798238d0348764c79._comment22
-rw-r--r--doc/forum/ikiwiki_over_database__63__.wiki11
-rw-r--r--doc/forum/ikiwiki_vim_integration.mdwn17
-rw-r--r--doc/forum/ikiwiki_vim_syntaxfile.mdwn26
-rw-r--r--doc/forum/index_attachments.mdwn9
-rw-r--r--doc/forum/index_attachments/comment_1_18b9531d273292b45051eef6a306ca26._comment10
-rw-r--r--doc/forum/index_attachments/comment_2._comment31
-rw-r--r--doc/forum/index_attachments/comment_3_050e5847641a27e0c14232632f3e700a._comment10
-rw-r--r--doc/forum/index_attachments/comment_4._comment10
-rw-r--r--doc/forum/installation_and_setup_questions.mdwn52
-rw-r--r--doc/forum/installation_as_non-root_user.mdwn7
-rw-r--r--doc/forum/installation_of_selected_docs.mdwn29
-rw-r--r--doc/forum/is_it_possible_to_NOT_add_openid2_meta_tags.mdwn67
-rw-r--r--doc/forum/java_script_slideshow.mdwn11
-rw-r--r--doc/forum/java_script_slideshow/comment_1_3eba0b2f3c12acc991dc3069d2b83d49._comment8
-rw-r--r--doc/forum/java_script_slideshow/comment_2_59d90f42b2ca2a5cc71a4d9ba9b9ee9f._comment10
-rw-r--r--doc/forum/java_script_slideshow/comment_3_820a86db38231cff7239f0a88b1925fd._comment21
-rw-r--r--doc/forum/java_script_slideshow/comment_4_a68972e3dd20b65119211d4ab120b294._comment10
-rw-r--r--doc/forum/link_autocompletion_in_vim.mdwn22
-rw-r--r--doc/forum/link_to_an_image_inside_the_wiki_without_inlining_it.mdwn72
-rw-r--r--doc/forum/links_to_diff_on_recentchanges__63__.mdwn1
-rw-r--r--doc/forum/links_to_diff_on_recentchanges__63__/comment_1_1dbc723cc2794f6d45de9cbd2fc2e0fd._comment8
-rw-r--r--doc/forum/links_to_diff_on_recentchanges__63__/comment_2_4349c85d92cf9c1acf2e7678371ab12a._comment10
-rw-r--r--doc/forum/lockedit:_pages_don__39__t_get_locked.mdwn12
-rw-r--r--doc/forum/lockedit:_pages_don__39__t_get_locked/comment_1_bacffb831e5ce7ece7e670c55ad9f3af._comment14
-rw-r--r--doc/forum/lockedit:_pages_don__39__t_get_locked/comment_2_ad268d3f2cd3d529cfff281e0ecb2f16._comment8
-rw-r--r--doc/forum/lockedit:_pages_don__39__t_get_locked/comment_3_da2fb41c5313763e4393cdd921a3f36e._comment10
-rw-r--r--doc/forum/lockedit:_pages_don__39__t_get_locked/comment_4_d0de7964db26cb6f3e81d6e8c29d860d._comment16
-rw-r--r--doc/forum/lockedit:_pages_don__39__t_get_locked/comment_5_d60727c53197d1c667b59bc7250afd9f._comment10
-rw-r--r--doc/forum/managing_todo_lists.mdwn44
-rw-r--r--doc/forum/missing_pages_redirected_to_search-SOLVED.mdwn36
-rw-r--r--doc/forum/missing_pages_redirected_to_search-SOLVED/comment_1_aa03c337b31d7acb95761eb51caab1ef._comment44
-rw-r--r--doc/forum/move_pages.mdwn1
-rw-r--r--doc/forum/move_pages/comment_1_3f1b9563af1e729a7311e869cf7a7787._comment11
-rw-r--r--doc/forum/move_pages/comment_2_22b1c238faacbf10df5f03f415223b49._comment8
-rw-r--r--doc/forum/multi-user_setup_of_ikiwiki__44___gitosis_and_apache2_in_Debian_Sid.mdwn96
-rw-r--r--doc/forum/multi_domain_setup_possible__63__.mdwn15
-rw-r--r--doc/forum/multi_domain_setup_possible__63__/comment_1_43f5df30d09046ccc4f7c44703979a11._comment17
-rw-r--r--doc/forum/multi_domain_setup_possible__63__/comment_2_75d6581f81b71fb8acbe3561047ea759._comment16
-rw-r--r--doc/forum/navigation_of_wiki_pages_on_local_filesystem_with_vim.mdwn130
-rw-r--r--doc/forum/nginx:_404_plugin_not_working.mdwn12
-rw-r--r--doc/forum/nginx:_404_plugin_not_working/comment_1_02a82e468676ae64374cc91ec87e39d6._comment15
-rw-r--r--doc/forum/nginx:_404_plugin_not_working/comment_2_ce6bd8e98e4be08316522182f5f85a11._comment11
-rw-r--r--doc/forum/nginx:_404_plugin_not_working/comment_3_52b05c3274455db7bee3c1765776fd52._comment8
-rw-r--r--doc/forum/nginx:_404_plugin_not_working/comment_4_5a8c2987f442106c68eb822c5bce3bf1._comment23
-rw-r--r--doc/forum/nginx:_404_plugin_not_working/comment_5_0720cd8842dc1cb338b74a0e6fdb2aac._comment8
-rw-r--r--doc/forum/pandoc-iki_plugin.mdwn5
-rw-r--r--doc/forum/pandoc-iki_plugin/comment_1_11eef903493378fd704a6bd92e968508._comment8
-rw-r--r--doc/forum/pandoc-iki_plugin/comment_2_2c437577390cffe3401f5cc2f08a2ab1._comment8
-rw-r--r--doc/forum/paths_to_files_outside_the_wiki_root.mdwn34
-rw-r--r--doc/forum/perl5lib_and_wrappers.mdwn13
-rw-r--r--doc/forum/po_plugin_doesn__39__t_create_po_files___40__only_pot__41__..mdwn11
-rw-r--r--doc/forum/possible_utf-8_problem__63__.mdwn26
-rw-r--r--doc/forum/postsignin_redirect_not_working.mdwn30
-rw-r--r--doc/forum/problem_with_git_after_a_commit_of_ikiwiki.mdwn4
-rw-r--r--doc/forum/problem_with_git_after_a_commit_of_ikiwiki/comment_1_2b9986717769419a8ae0f730c36b7e65._comment22
-rw-r--r--doc/forum/recentchanges_dir_should_be_under_control_of_RCS__63__.mdwn105
-rw-r--r--doc/forum/recovering_original_title_with_meta_directive.mdwn1
-rw-r--r--doc/forum/remove_css__63__.mdwn5
-rw-r--r--doc/forum/report_pagination.mdwn18
-rw-r--r--doc/forum/screenplay_plugin.mdwn1
-rw-r--r--doc/forum/screenplay_plugin/comment_1_6c353acfc80b972ee3a34c8bb09dede3._comment8
-rw-r--r--doc/forum/screenplay_plugin/comment_2_1868aeebebefae80531f2031ffba35d3._comment8
-rw-r--r--doc/forum/section_editing.mdwn1
-rw-r--r--doc/forum/section_editing/comment_1_b193caa886a47c685ac7dafaf60c1761._comment12
-rw-r--r--doc/forum/speeding_up_ikiwiki.mdwn90
-rw-r--r--doc/forum/square_brackets_inside_backticks_generates_incorrect_html___40__interpreted_as_wikilinks__41____63__.html19
-rw-r--r--doc/forum/suppressing_output_of_pages_included_only_for_their_side_effects.mdwn16
-rw-r--r--doc/forum/tag_plugin:_rebuilding_autocreated_pages.mdwn11
-rw-r--r--doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds.mdwn28
-rw-r--r--doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_1_ec4ffab10e60510b53660b70908d1bd8._comment14
-rw-r--r--doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_2_a47884ffd749df980cd62f4c1e3167ce._comment10
-rw-r--r--doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_3_6c4affdbc637946506d0c28a8648dc6e._comment32
-rw-r--r--doc/forum/teximg_not_working.mdwn26
-rw-r--r--doc/forum/teximg_not_working/comment_2_35e2ebf3893fc0c7966490e1fef1e6cf._comment10
-rw-r--r--doc/forum/transition_from_handwritten_html_to_ikiwiki.mdwn90
-rw-r--r--doc/forum/two_new_contrib_plugins:_newpage__44___jssearchfield.mdwn20
-rw-r--r--doc/forum/understanding_filter_hooks.mdwn17
-rw-r--r--doc/forum/upgrade_steps.mdwn147
-rw-r--r--doc/forum/use_php-markdown-extra_with_ikiwiki__63__.mdwn3
-rw-r--r--doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_1_66d48218361caa4c07bd714b82ed0021._comment8
-rw-r--r--doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_2_f2ee0a4dce571d329f795e52139084c0._comment8
-rw-r--r--doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_3_e388714f457ccb6ef73630179914558c._comment9
-rw-r--r--doc/forum/usedirs___38___indexpages_using_problem.mdwn17
-rw-r--r--doc/forum/users/acodispo.mdwn2
-rw-r--r--doc/forum/using_l10n__39__d_basewiki.mdwn7
-rw-r--r--doc/forum/using_l10n__39__d_basewiki/comment_1_eaab671848ee6129f6fe9399474eeac0._comment8
-rw-r--r--doc/forum/using_l10n__39__d_basewiki/comment_2_d907676a1db1210ca59506673c564359._comment10
-rw-r--r--doc/forum/using_l10n__39__d_basewiki/comment_3_5e9d5bc5ecaf63f9bfe3315b09a279aa._comment10
-rw-r--r--doc/forum/using_svn+ssh_with_ikiwiki.mdwn11
-rw-r--r--doc/forum/utf8_warnings_for___34____92__xAB__34__.mdwn47
-rw-r--r--doc/forum/w3mmode___91__Save_Page__93___results_in_403.mdwn9
-rw-r--r--doc/forum/web_service_API__44___fastcgi_support.mdwn18
-rw-r--r--doc/forum/what_is_the_easiest_way_to_implement_order:_disallow_all__44___allow_chosen__95__few_page_editing_policy__63__.mdwn3
-rw-r--r--doc/forum/where_are_the_tags.mdwn9
-rw-r--r--doc/forum/where_are_the_tags/comment_1_6a559c3bfe72011c45b006d33176da3d._comment14
-rw-r--r--doc/forum/which_file_ikiwiki_--setup_is_processing_right_now__63__.mdwn4
-rw-r--r--doc/forum/which_file_ikiwiki_--setup_is_processing_right_now__63__/comment_1_4f52f8fc083982bd5a572742cf35c74f._comment7
-rw-r--r--doc/forum/wiki_clones_on_dynamic_IPs.mdwn10
-rw-r--r--doc/forum/wiki_name_in_page_titles.mdwn32
-rw-r--r--doc/forum/wishlist-discussion:_Editformular_showing_existing_tags.mdwn15
-rw-r--r--doc/forum/wishlist:_Allow_OpenID_users_to_set_a_display_name.mdwn3
-rw-r--r--doc/forum/wishlist:_support_staging_area.mdwn12
-rw-r--r--doc/forum/wmd_editor_double_preview.mdwn3
-rw-r--r--doc/forum/wmd_editor_double_preview/comment_1_0d3acf67f3c35f8c4156228f96dcd975._comment8
-rw-r--r--doc/freesoftware.mdwn11
-rw-r--r--doc/freesoftware/discussion.mdwn3
-rw-r--r--doc/git.mdwn84
-rw-r--r--doc/ikiwiki-calendar.mdwn53
-rw-r--r--doc/ikiwiki-calendar/discussion.mdwn36
-rw-r--r--doc/ikiwiki-makerepo.mdwn44
-rw-r--r--doc/ikiwiki-makerepo/discussion.mdwn1
-rw-r--r--doc/ikiwiki-mass-rebuild.mdwn33
-rw-r--r--doc/ikiwiki-mass-rebuild/discussion.mdwn1
-rw-r--r--doc/ikiwiki-transition.mdwn75
-rw-r--r--doc/ikiwiki-update-wikilist.mdwn33
-rw-r--r--doc/ikiwiki.mdwn17
-rw-r--r--doc/ikiwiki/directive.mdwn56
-rw-r--r--doc/ikiwiki/directive/aggregate.mdwn57
-rw-r--r--doc/ikiwiki/directive/aggregate/discussion.mdwn10
-rw-r--r--doc/ikiwiki/directive/brokenlinks.mdwn14
-rw-r--r--doc/ikiwiki/directive/brokenlinks/discussion.mdwn3
-rw-r--r--doc/ikiwiki/directive/calendar.mdwn60
-rw-r--r--doc/ikiwiki/directive/color.mdwn25
-rw-r--r--doc/ikiwiki/directive/comment.mdwn40
-rw-r--r--doc/ikiwiki/directive/commentmoderation.mdwn9
-rw-r--r--doc/ikiwiki/directive/copy.mdwn3
-rw-r--r--doc/ikiwiki/directive/cut.mdwn3
-rw-r--r--doc/ikiwiki/directive/cutpaste.mdwn50
-rw-r--r--doc/ikiwiki/directive/date.mdwn16
-rw-r--r--doc/ikiwiki/directive/edittemplate.mdwn34
-rw-r--r--doc/ikiwiki/directive/flattr.mdwn45
-rw-r--r--doc/ikiwiki/directive/format.mdwn29
-rw-r--r--doc/ikiwiki/directive/fortune.mdwn8
-rw-r--r--doc/ikiwiki/directive/graph.mdwn32
-rw-r--r--doc/ikiwiki/directive/graph/discussion.mdwn27
-rw-r--r--doc/ikiwiki/directive/haiku.mdwn15
-rw-r--r--doc/ikiwiki/directive/if.mdwn50
-rw-r--r--doc/ikiwiki/directive/img.mdwn39
-rw-r--r--doc/ikiwiki/directive/img/discussion.mdwn34
-rw-r--r--doc/ikiwiki/directive/inline.mdwn126
-rw-r--r--doc/ikiwiki/directive/inline/discussion.mdwn163
-rw-r--r--doc/ikiwiki/directive/linkmap.mdwn29
-rw-r--r--doc/ikiwiki/directive/listdirectives.mdwn20
-rw-r--r--doc/ikiwiki/directive/map.mdwn21
-rw-r--r--doc/ikiwiki/directive/map/discussion.mdwn99
-rw-r--r--doc/ikiwiki/directive/meta.mdwn206
-rw-r--r--doc/ikiwiki/directive/meta/discussion.mdwn69
-rw-r--r--doc/ikiwiki/directive/more.mdwn21
-rw-r--r--doc/ikiwiki/directive/orphans.mdwn15
-rw-r--r--doc/ikiwiki/directive/osm.mdwn69
-rw-r--r--doc/ikiwiki/directive/osm/discussion.mdwn13
-rw-r--r--doc/ikiwiki/directive/pagecount.mdwn10
-rw-r--r--doc/ikiwiki/directive/pagestats.mdwn40
-rw-r--r--doc/ikiwiki/directive/pagestats/discussion.mdwn18
-rw-r--r--doc/ikiwiki/directive/pagetemplate.mdwn13
-rw-r--r--doc/ikiwiki/directive/paste.mdwn3
-rw-r--r--doc/ikiwiki/directive/ping.mdwn18
-rw-r--r--doc/ikiwiki/directive/poll.mdwn27
-rw-r--r--doc/ikiwiki/directive/polygen.mdwn11
-rw-r--r--doc/ikiwiki/directive/postsparkline.mdwn45
-rw-r--r--doc/ikiwiki/directive/progress.mdwn18
-rw-r--r--doc/ikiwiki/directive/shortcut.mdwn9
-rw-r--r--doc/ikiwiki/directive/sidebar.mdwn20
-rw-r--r--doc/ikiwiki/directive/sidebar/discussion.mdwn10
-rw-r--r--doc/ikiwiki/directive/sparkline.mdwn52
-rw-r--r--doc/ikiwiki/directive/table.mdwn53
-rw-r--r--doc/ikiwiki/directive/table/discussion.mdwn1
-rw-r--r--doc/ikiwiki/directive/tag.mdwn35
-rw-r--r--doc/ikiwiki/directive/tag/discussion.mdwn13
-rw-r--r--doc/ikiwiki/directive/taglink.mdwn3
-rw-r--r--doc/ikiwiki/directive/template.mdwn91
-rw-r--r--doc/ikiwiki/directive/testpagespec.mdwn24
-rw-r--r--doc/ikiwiki/directive/testpagespec/discussion.mdwn6
-rw-r--r--doc/ikiwiki/directive/teximg.mdwn23
-rw-r--r--doc/ikiwiki/directive/toc.mdwn27
-rw-r--r--doc/ikiwiki/directive/toggle.mdwn34
-rw-r--r--doc/ikiwiki/directive/toggleable.mdwn3
-rw-r--r--doc/ikiwiki/directive/trailitem.mdwn9
-rw-r--r--doc/ikiwiki/directive/trailitems.mdwn25
-rw-r--r--doc/ikiwiki/directive/traillink.mdwn16
-rw-r--r--doc/ikiwiki/directive/trailoptions.mdwn18
-rw-r--r--doc/ikiwiki/directive/version.mdwn12
-rw-r--r--doc/ikiwiki/directive/waypoint.mdwn6
-rw-r--r--doc/ikiwiki/formatting.mdwn106
-rw-r--r--doc/ikiwiki/formatting/discussion.mdwn20
-rw-r--r--doc/ikiwiki/markdown.mdwn11
-rw-r--r--doc/ikiwiki/openid.mdwn28
-rw-r--r--doc/ikiwiki/pagespec.mdwn86
-rw-r--r--doc/ikiwiki/pagespec/attachment.mdwn38
-rw-r--r--doc/ikiwiki/pagespec/attachment/discussion.mdwn15
-rw-r--r--doc/ikiwiki/pagespec/discussion.mdwn170
-rw-r--r--doc/ikiwiki/pagespec/po.mdwn23
-rw-r--r--doc/ikiwiki/pagespec/sorting.mdwn30
-rw-r--r--doc/ikiwiki/searching.mdwn20
-rw-r--r--doc/ikiwiki/subpage.mdwn12
-rw-r--r--doc/ikiwiki/subpage/linkingrules.mdwn33
-rw-r--r--doc/ikiwiki/wikilink.mdwn29
-rw-r--r--doc/ikiwiki/wikilink/discussion.mdwn93
-rw-r--r--doc/ikiwikiusers.mdwn198
-rw-r--r--doc/ikiwikiusers/discussion.mdwn39
-rw-r--r--doc/index.mdwn29
-rw-r--r--doc/index/discussion.mdwn1
-rw-r--r--doc/index/openid/discussion.mdwn62
-rw-r--r--doc/install.mdwn48
-rw-r--r--doc/install/discussion.mdwn358
-rw-r--r--doc/local.css3
-rw-r--r--doc/logo.mdwn69
-rw-r--r--doc/logo/discussion.mdwn1
-rw-r--r--doc/logo/favicon.svgzbin0 -> 1412 bytes
-rw-r--r--doc/logo/ikiwiki.pngbin0 -> 1157 bytes
-rw-r--r--doc/logo/ikiwiki.svgzbin0 -> 6276 bytes
-rw-r--r--doc/logo/ikiwiki_button.pngbin0 -> 974 bytes
-rw-r--r--doc/logo/ikiwiki_large.pngbin0 -> 1952 bytes
-rw-r--r--doc/logo/ikiwiki_old.pngbin0 -> 335 bytes
-rw-r--r--doc/logo/ikiwiki_old2.pngbin0 -> 1275 bytes
-rw-r--r--doc/logo/ikiwiki_old2.svgzbin0 -> 28488 bytes
-rw-r--r--doc/news.mdwn9
-rw-r--r--doc/news/Article_on_Ikiwiki_as_a_BTS.mdwn1
-rw-r--r--doc/news/code_swarm.mdwn34
-rw-r--r--doc/news/code_swarm/code_swarm.config51
-rwxr-xr-xdoc/news/code_swarm/code_swarm_log.pl25
-rw-r--r--doc/news/code_swarm/discussion.mdwn3
-rw-r--r--doc/news/code_swarm/screenshot.pngbin0 -> 64320 bytes
-rw-r--r--doc/news/consultant_list.mdwn17
-rw-r--r--doc/news/discussion.mdwn35
-rw-r--r--doc/news/donations.mdwn1
-rw-r--r--doc/news/git_push_to_this_wiki.mdwn3
-rw-r--r--doc/news/git_push_to_this_wiki/discussion.mdwn37
-rw-r--r--doc/news/ikiwiki-hosting.mdwn16
-rw-r--r--doc/news/ikiwiki_accepted_for_Summer_of_Code.mdwn5
-rw-r--r--doc/news/ikiwiki_screencast.mdwn12
-rw-r--r--doc/news/ikiwiki_screencast/discussion.mdwn8
-rw-r--r--doc/news/ikiwiki_version_2.0.mdwn32
-rw-r--r--doc/news/ikiwiki_version_3.0.mdwn42
-rw-r--r--doc/news/irc_channel.mdwn6
-rw-r--r--doc/news/moved_to_git.mdwn10
-rw-r--r--doc/news/moved_to_git/discussion.mdwn43
-rw-r--r--doc/news/new_domain_name.mdwn1
-rw-r--r--doc/news/no_more_email_notifications.mdwn14
-rw-r--r--doc/news/openid.mdwn13
-rw-r--r--doc/news/openid/discussion.mdwn96
-rw-r--r--doc/news/server_move.mdwn9
-rw-r--r--doc/news/server_move_2009.mdwn6
-rw-r--r--doc/news/server_speed.mdwn9
-rw-r--r--doc/news/server_speed/discussion.mdwn1
-rw-r--r--doc/news/stylesheets.mdwn16
-rw-r--r--doc/news/stylesheets/discussion.mdwn3
-rw-r--r--doc/news/version_3.20121212.mdwn6
-rw-r--r--doc/news/version_3.20130212.mdwn18
-rw-r--r--doc/news/version_3.20130504.mdwn11
-rw-r--r--doc/news/version_3.20130518.mdwn9
-rw-r--r--doc/news/version_3.20130710.mdwn23
-rw-r--r--doc/pagehistory.mdwn8
-rw-r--r--doc/patch.mdwn12
-rw-r--r--doc/patch/core.mdwn7
-rw-r--r--doc/plugins.mdwn17
-rw-r--r--doc/plugins/404.mdwn24
-rw-r--r--doc/plugins/404/discussion.mdwn3
-rw-r--r--doc/plugins/aggregate.mdwn57
-rw-r--r--doc/plugins/aggregate/discussion.mdwn137
-rw-r--r--doc/plugins/amazon_s3.mdwn68
-rw-r--r--doc/plugins/amazon_s3/discussion.mdwn18
-rw-r--r--doc/plugins/anonok.mdwn19
-rw-r--r--doc/plugins/attachment.mdwn27
-rw-r--r--doc/plugins/autoindex.mdwn10
-rw-r--r--doc/plugins/autoindex/discussion.mdwn84
-rw-r--r--doc/plugins/blogspam.mdwn32
-rw-r--r--doc/plugins/brokenlinks.mdwn9
-rw-r--r--doc/plugins/calendar.mdwn36
-rw-r--r--doc/plugins/calendar/discussion.mdwn23
-rw-r--r--doc/plugins/camelcase.mdwn13
-rw-r--r--doc/plugins/color.mdwn5
-rw-r--r--doc/plugins/comments.mdwn55
-rw-r--r--doc/plugins/comments/discussion.mdwn232
-rw-r--r--doc/plugins/conditional.mdwn5
-rw-r--r--doc/plugins/conditional/discussion.mdwn76
-rw-r--r--doc/plugins/contrib.mdwn7
-rw-r--r--doc/plugins/contrib/album.mdwn140
-rw-r--r--doc/plugins/contrib/album/discussion.mdwn458
-rw-r--r--doc/plugins/contrib/asymptote.mdwn141
-rw-r--r--doc/plugins/contrib/asymptote/ikiwiki/directive/asymptote.mdwn27
-rw-r--r--doc/plugins/contrib/attach.mdwn47
-rw-r--r--doc/plugins/contrib/attach/discussion.mdwn18
-rw-r--r--doc/plugins/contrib/bibtex.mdwn59
-rw-r--r--doc/plugins/contrib/created_in_future.mdwn18
-rw-r--r--doc/plugins/contrib/default_content_for___42__copyright__42___and___42__license__42__.mdwn62
-rw-r--r--doc/plugins/contrib/dynamiccookies.mdwn12
-rw-r--r--doc/plugins/contrib/field.mdwn219
-rw-r--r--doc/plugins/contrib/field/discussion.mdwn407
-rw-r--r--doc/plugins/contrib/flattr.mdwn48
-rw-r--r--doc/plugins/contrib/flattr/discussion.mdwn9
-rw-r--r--doc/plugins/contrib/ftemplate.mdwn25
-rw-r--r--doc/plugins/contrib/ftemplate/discussion.mdwn33
-rw-r--r--doc/plugins/contrib/ftemplate/ikiwiki/directive/ftemplate.mdwn106
-rw-r--r--doc/plugins/contrib/gallery.mdwn39
-rw-r--r--doc/plugins/contrib/gallery/discussion.mdwn45
-rw-r--r--doc/plugins/contrib/getfield.mdwn137
-rw-r--r--doc/plugins/contrib/getfield/discussion.mdwn79
-rw-r--r--doc/plugins/contrib/googlemaps.mdwn21
-rw-r--r--doc/plugins/contrib/googlemaps/discussion.mdwn16
-rw-r--r--doc/plugins/contrib/groupfile.mdwn105
-rw-r--r--doc/plugins/contrib/highlightcode.mdwn10
-rw-r--r--doc/plugins/contrib/ikiwiki/directive/album.mdwn56
-rw-r--r--doc/plugins/contrib/ikiwiki/directive/albumimage.mdwn26
-rw-r--r--doc/plugins/contrib/ikiwiki/directive/albumsection.mdwn29
-rw-r--r--doc/plugins/contrib/ikiwiki/directive/jssearchfield.mdwn42
-rw-r--r--doc/plugins/contrib/ikiwiki/directive/ymlfront.mdwn17
-rw-r--r--doc/plugins/contrib/ikiwiki/directive/ymlfront/discussion.mdwn37
-rw-r--r--doc/plugins/contrib/imailhide.mdwn65
-rw-r--r--doc/plugins/contrib/img.mdwn14
-rw-r--r--doc/plugins/contrib/img/discussion.mdwn51
-rw-r--r--doc/plugins/contrib/jscalendar.mdwn45
-rw-r--r--doc/plugins/contrib/jssearchfield.mdwn35
-rw-r--r--doc/plugins/contrib/justlogin.mdwn52
-rw-r--r--doc/plugins/contrib/linguas.mdwn107
-rw-r--r--doc/plugins/contrib/livefyre.mdwn14
-rw-r--r--doc/plugins/contrib/localfavicon.mdwn7
-rw-r--r--doc/plugins/contrib/mailbox.mdwn18
-rw-r--r--doc/plugins/contrib/mailbox/discussion.mdwn8
-rw-r--r--doc/plugins/contrib/mandoc.mdwn12
-rw-r--r--doc/plugins/contrib/mathjax.mdwn13
-rw-r--r--doc/plugins/contrib/mediawiki.mdwn10
-rw-r--r--doc/plugins/contrib/mediawiki/discussion.mdwn9
-rw-r--r--doc/plugins/contrib/monthcalendar.mdwn23
-rw-r--r--doc/plugins/contrib/mscgen.mdwn52
-rw-r--r--doc/plugins/contrib/navbar.mdwn40
-rw-r--r--doc/plugins/contrib/navbar/discussion.mdwn2
-rw-r--r--doc/plugins/contrib/newpage.mdwn29
-rw-r--r--doc/plugins/contrib/newpage/discussion.mdwn10
-rw-r--r--doc/plugins/contrib/opml.mdwn11
-rw-r--r--doc/plugins/contrib/opml/discussion.mdwn4
-rw-r--r--doc/plugins/contrib/pagespec_alias.mdwn28
-rw-r--r--doc/plugins/contrib/pandoc.mdwn6
-rw-r--r--doc/plugins/contrib/plusone.mdwn35
-rw-r--r--doc/plugins/contrib/pod.mdwn38
-rw-r--r--doc/plugins/contrib/pod/discussion.mdwn14
-rw-r--r--doc/plugins/contrib/postal.mdwn35
-rw-r--r--doc/plugins/contrib/postal/discussion.mdwn24
-rw-r--r--doc/plugins/contrib/proxies.mdwn13
-rw-r--r--doc/plugins/contrib/report.mdwn26
-rw-r--r--doc/plugins/contrib/report/discussion.mdwn80
-rw-r--r--doc/plugins/contrib/report/ikiwiki/directive/report.mdwn175
-rw-r--r--doc/plugins/contrib/sar.mdwn109
-rw-r--r--doc/plugins/contrib/screenplay.pm.mdwn320
-rw-r--r--doc/plugins/contrib/siterel2pagerel.mdwn30
-rw-r--r--doc/plugins/contrib/sourcehighlight.mdwn30
-rw-r--r--doc/plugins/contrib/syntax.mdwn65
-rw-r--r--doc/plugins/contrib/syntax/discussion.mdwn23
-rw-r--r--doc/plugins/contrib/tex4ht.mdwn15
-rw-r--r--doc/plugins/contrib/texinfo.mdwn122
-rw-r--r--doc/plugins/contrib/tracking.mdwn30
-rw-r--r--doc/plugins/contrib/unixauth.mdwn21
-rw-r--r--doc/plugins/contrib/unixauth/discussion.mdwn38
-rw-r--r--doc/plugins/contrib/unixrelpagespec.mdwn42
-rw-r--r--doc/plugins/contrib/video.mdwn25
-rw-r--r--doc/plugins/contrib/video/discussion.mdwn3
-rw-r--r--doc/plugins/contrib/wc.mdwn22
-rw-r--r--doc/plugins/contrib/xslt.mdwn39
-rw-r--r--doc/plugins/contrib/xslt/discussion.mdwn49
-rw-r--r--doc/plugins/contrib/ymlfront.mdwn143
-rw-r--r--doc/plugins/contrib/ymlfront/discussion.mdwn31
-rw-r--r--doc/plugins/creole.mdwn22
-rw-r--r--doc/plugins/creole/discussion.mdwn22
-rw-r--r--doc/plugins/cutpaste.mdwn7
-rw-r--r--doc/plugins/date.mdwn6
-rw-r--r--doc/plugins/ddate.mdwn10
-rw-r--r--doc/plugins/discussion.mdwn42
-rw-r--r--doc/plugins/editdiff.mdwn13
-rw-r--r--doc/plugins/editdiff/discussion.mdwn5
-rw-r--r--doc/plugins/editpage.mdwn6
-rw-r--r--doc/plugins/editpage/discussion.mdwn24
-rw-r--r--doc/plugins/edittemplate.mdwn6
-rw-r--r--doc/plugins/embed.mdwn53
-rw-r--r--doc/plugins/favicon.mdwn7
-rw-r--r--doc/plugins/favicon/discussion.mdwn19
-rw-r--r--doc/plugins/filecheck.mdwn17
-rw-r--r--doc/plugins/filecheck/discussion.mdwn85
-rw-r--r--doc/plugins/flattr.mdwn9
-rw-r--r--doc/plugins/format.mdwn9
-rw-r--r--doc/plugins/format/discussion.mdwn15
-rw-r--r--doc/plugins/fortune.mdwn14
-rw-r--r--doc/plugins/getsource.mdwn14
-rw-r--r--doc/plugins/getsource/discussion.mdwn3
-rw-r--r--doc/plugins/goodstuff.mdwn29
-rw-r--r--doc/plugins/goodstuff/discussion.mdwn8
-rw-r--r--doc/plugins/google.mdwn11
-rw-r--r--doc/plugins/google/discussion.mdwn25
-rw-r--r--doc/plugins/goto.mdwn10
-rw-r--r--doc/plugins/graphviz.mdwn25
-rw-r--r--doc/plugins/haiku.mdwn12
-rw-r--r--doc/plugins/haiku/discussion.mdwn5
-rw-r--r--doc/plugins/headinganchors.mdwn7
-rw-r--r--doc/plugins/headinganchors/discussion.mdwn49
-rw-r--r--doc/plugins/highlight.mdwn77
-rw-r--r--doc/plugins/highlight/discussion.mdwn23
-rw-r--r--doc/plugins/hnb.mdwn6
-rw-r--r--doc/plugins/hnb/discussion.mdwn28
-rw-r--r--doc/plugins/html.mdwn11
-rw-r--r--doc/plugins/htmlbalance.mdwn9
-rw-r--r--doc/plugins/htmlbalance/discussion.mdwn10
-rw-r--r--doc/plugins/htmlscrubber.mdwn51
-rw-r--r--doc/plugins/htmlscrubber/discussion.mdwn18
-rw-r--r--doc/plugins/htmltidy.mdwn11
-rw-r--r--doc/plugins/httpauth.mdwn37
-rw-r--r--doc/plugins/img.mdwn14
-rw-r--r--doc/plugins/img/discussion.mdwn12
-rw-r--r--doc/plugins/inline.mdwn6
-rw-r--r--doc/plugins/install.mdwn19
-rw-r--r--doc/plugins/link.mdwn5
-rw-r--r--doc/plugins/linkmap.mdwn14
-rw-r--r--doc/plugins/listdirectives.mdwn16
-rw-r--r--doc/plugins/localstyle.mdwn12
-rw-r--r--doc/plugins/lockedit.mdwn23
-rw-r--r--doc/plugins/lockedit/discussion.mdwn18
-rw-r--r--doc/plugins/map.mdwn11
-rw-r--r--doc/plugins/map/discussion.mdwn49
-rw-r--r--doc/plugins/mdwn.mdwn23
-rw-r--r--doc/plugins/mdwn/discussion.mdwn7
-rw-r--r--doc/plugins/meta.mdwn5
-rw-r--r--doc/plugins/meta/discussion.mdwn18
-rw-r--r--doc/plugins/mirrorlist.mdwn22
-rw-r--r--doc/plugins/moderatedcomments.mdwn12
-rw-r--r--doc/plugins/more.mdwn6
-rw-r--r--doc/plugins/more/discussion.mdwn7
-rw-r--r--doc/plugins/notifyemail.mdwn14
-rw-r--r--doc/plugins/notifyemail/discussion.mdwn5
-rw-r--r--doc/plugins/opendiscussion.mdwn11
-rw-r--r--doc/plugins/openid.mdwn32
-rw-r--r--doc/plugins/openid/discussion.mdwn26
-rw-r--r--doc/plugins/orphans.mdwn16
-rw-r--r--doc/plugins/orphans/discussion.mdwn22
-rw-r--r--doc/plugins/osm.mdwn31
-rw-r--r--doc/plugins/otl.mdwn6
-rw-r--r--doc/plugins/pagecount.mdwn11
-rw-r--r--doc/plugins/pagestats.mdwn6
-rw-r--r--doc/plugins/pagetemplate.mdwn6
-rw-r--r--doc/plugins/parentlinks.mdwn5
-rw-r--r--doc/plugins/passwordauth.mdwn33
-rw-r--r--doc/plugins/passwordauth/discussion.mdwn151
-rw-r--r--doc/plugins/pingee.mdwn11
-rw-r--r--doc/plugins/pingee/discussion.mdwn9
-rw-r--r--doc/plugins/pinger.mdwn20
-rw-r--r--doc/plugins/po.mdwn260
-rw-r--r--doc/plugins/po/discussion.mdwn721
-rw-r--r--doc/plugins/poll.mdwn5
-rw-r--r--doc/plugins/poll/discussion.mdwn1
-rw-r--r--doc/plugins/polygen.mdwn25
-rw-r--r--doc/plugins/postsparkline.mdwn14
-rw-r--r--doc/plugins/prettydate.mdwn20
-rw-r--r--doc/plugins/progress.mdwn5
-rw-r--r--doc/plugins/rawhtml.mdwn13
-rw-r--r--doc/plugins/rawhtml/discussion.mdwn7
-rw-r--r--doc/plugins/recentchanges.mdwn32
-rw-r--r--doc/plugins/recentchanges/discussion.mdwn17
-rw-r--r--doc/plugins/recentchangesdiff.mdwn9
-rw-r--r--doc/plugins/relativedate.mdwn11
-rw-r--r--doc/plugins/remove.mdwn7
-rw-r--r--doc/plugins/rename.mdwn12
-rw-r--r--doc/plugins/repolist.mdwn17
-rw-r--r--doc/plugins/rst.mdwn18
-rw-r--r--doc/plugins/rst/discussion.mdwn81
-rw-r--r--doc/plugins/rsync.mdwn19
-rw-r--r--doc/plugins/rsync/discussion.mdwn79
-rw-r--r--doc/plugins/search.mdwn18
-rw-r--r--doc/plugins/search/discussion.mdwn1
-rw-r--r--doc/plugins/shortcut.mdwn9
-rw-r--r--doc/plugins/shortcut/discussion.mdwn18
-rw-r--r--doc/plugins/sidebar.mdwn28
-rw-r--r--doc/plugins/sidebar/discussion.mdwn12
-rw-r--r--doc/plugins/signinedit.mdwn5
-rw-r--r--doc/plugins/smiley.mdwn9
-rw-r--r--doc/plugins/sortnaturally.mdwn6
-rw-r--r--doc/plugins/sparkline.mdwn22
-rw-r--r--doc/plugins/table.mdwn12
-rw-r--r--doc/plugins/table/discussion.mdwn73
-rw-r--r--doc/plugins/tag.mdwn24
-rw-r--r--doc/plugins/tag/discussion.mdwn31
-rw-r--r--doc/plugins/template.mdwn7
-rw-r--r--doc/plugins/testpagespec.mdwn6
-rw-r--r--doc/plugins/teximg.mdwn15
-rw-r--r--doc/plugins/teximg/discussion.mdwn5
-rw-r--r--doc/plugins/textile.mdwn6
-rw-r--r--doc/plugins/theme.mdwn18
-rw-r--r--doc/plugins/theme/discussion.mdwn26
-rw-r--r--doc/plugins/toc.mdwn5
-rw-r--r--doc/plugins/toc/discussion.mdwn10
-rw-r--r--doc/plugins/toggle.mdwn7
-rw-r--r--doc/plugins/toggle/discussion.mdwn43
-rw-r--r--doc/plugins/trail.mdwn76
-rw-r--r--doc/plugins/trail/discussion.mdwn105
-rw-r--r--doc/plugins/transient.mdwn24
-rw-r--r--doc/plugins/txt.mdwn19
-rw-r--r--doc/plugins/txt/discussion.mdwn33
-rw-r--r--doc/plugins/type/auth.mdwn4
-rw-r--r--doc/plugins/type/bundle.mdwn3
-rw-r--r--doc/plugins/type/chrome.mdwn3
-rw-r--r--doc/plugins/type/comments.mdwn3
-rw-r--r--doc/plugins/type/core.mdwn3
-rw-r--r--doc/plugins/type/date.mdwn3
-rw-r--r--doc/plugins/type/format.mdwn3
-rw-r--r--doc/plugins/type/fun.mdwn3
-rw-r--r--doc/plugins/type/html.mdwn3
-rw-r--r--doc/plugins/type/link.mdwn3
-rw-r--r--doc/plugins/type/meta.mdwn3
-rw-r--r--doc/plugins/type/slow.mdwn5
-rw-r--r--doc/plugins/type/special-purpose.mdwn3
-rw-r--r--doc/plugins/type/tags.mdwn3
-rw-r--r--doc/plugins/type/web.mdwn3
-rw-r--r--doc/plugins/type/widget.mdwn4
-rw-r--r--doc/plugins/typography.mdwn12
-rw-r--r--doc/plugins/underlay.mdwn14
-rw-r--r--doc/plugins/userlist.mdwn6
-rw-r--r--doc/plugins/version.mdwn7
-rw-r--r--doc/plugins/websetup.mdwn27
-rw-r--r--doc/plugins/wikitext.mdwn23
-rw-r--r--doc/plugins/wmd.mdwn16
-rw-r--r--doc/plugins/wmd/discussion.mdwn73
-rw-r--r--doc/plugins/write.mdwn1396
-rw-r--r--doc/plugins/write/discussion.mdwn46
-rw-r--r--doc/plugins/write/external.mdwn146
-rw-r--r--doc/plugins/write/tutorial.mdwn189
-rw-r--r--doc/plugins/write/tutorial/discussion.mdwn20
-rw-r--r--doc/post-commit.mdwn19
-rw-r--r--doc/post-commit/discussion.mdwn123
-rw-r--r--doc/quotes.mdwn3
-rw-r--r--doc/quotes/pizza.mdwn4
-rw-r--r--doc/quotes/pizza/discussion.mdwn1
-rw-r--r--doc/quotes/sold.mdwn3
-rw-r--r--doc/rcs.mdwn44
-rw-r--r--doc/rcs/bzr.mdwn8
-rw-r--r--doc/rcs/cvs.mdwn46
-rw-r--r--doc/rcs/cvs/discussion.mdwn191
-rw-r--r--doc/rcs/darcs.mdwn15
-rw-r--r--doc/rcs/details.mdwn292
-rw-r--r--doc/rcs/details/discussion.mdwn15
-rw-r--r--doc/rcs/git.mdwn153
-rw-r--r--doc/rcs/git/discussion.mdwn129
-rw-r--r--doc/rcs/git/wiki_edit_flow.svg705
-rw-r--r--doc/rcs/mercurial.mdwn18
-rw-r--r--doc/rcs/monotone.mdwn24
-rw-r--r--doc/rcs/svn.mdwn9
-rw-r--r--doc/rcs/svn/discussion.mdwn13
-rw-r--r--doc/rcs/tla.mdwn13
-rw-r--r--doc/recentchanges.mdwn7
-rw-r--r--doc/reviewed.mdwn7
-rw-r--r--doc/roadmap.mdwn92
-rw-r--r--doc/roadmap/discussion.mdwn32
-rw-r--r--doc/robots.txt2
-rw-r--r--doc/sandbox.mdwn84
-rw-r--r--doc/sandbox/NewPage.mdwn1
-rw-r--r--doc/sandbox/hmm__44___what_kind_of_a_blog_is_this__63____41__.mdwn3
-rw-r--r--doc/security.mdwn499
-rw-r--r--doc/security/discussion.mdwn33
-rw-r--r--doc/setup.mdwn152
-rw-r--r--doc/setup/byhand.mdwn202
-rw-r--r--doc/setup/byhand/discussion.mdwn7
-rw-r--r--doc/setup/discussion.mdwn271
-rw-r--r--doc/shortcuts.mdwn83
-rw-r--r--doc/shortcuts/discussion.mdwn21
-rw-r--r--doc/sitemap.mdwn5
-rw-r--r--doc/smileys.mdwn56
-rw-r--r--doc/smileys/alert.pngbin0 -> 220 bytes
-rw-r--r--doc/smileys/angry.pngbin0 -> 295 bytes
-rw-r--r--doc/smileys/attention.pngbin0 -> 164 bytes
-rw-r--r--doc/smileys/biggrin.pngbin0 -> 173 bytes
-rw-r--r--doc/smileys/checkmark.pngbin0 -> 133 bytes
-rw-r--r--doc/smileys/devil.pngbin0 -> 354 bytes
-rw-r--r--doc/smileys/frown.pngbin0 -> 168 bytes
-rw-r--r--doc/smileys/icon-error.pngbin0 -> 397 bytes
-rw-r--r--doc/smileys/icon-info.pngbin0 -> 171 bytes
-rw-r--r--doc/smileys/idea.pngbin0 -> 372 bytes
-rw-r--r--doc/smileys/neutral.pngbin0 -> 239 bytes
-rw-r--r--doc/smileys/ohwell.pngbin0 -> 167 bytes
-rw-r--r--doc/smileys/prio1.pngbin0 -> 153 bytes
-rw-r--r--doc/smileys/prio2.pngbin0 -> 158 bytes
-rw-r--r--doc/smileys/prio3.pngbin0 -> 153 bytes
-rw-r--r--doc/smileys/question.pngbin0 -> 302 bytes
-rw-r--r--doc/smileys/redface.pngbin0 -> 306 bytes
-rw-r--r--doc/smileys/sad.pngbin0 -> 182 bytes
-rw-r--r--doc/smileys/smile.pngbin0 -> 356 bytes
-rw-r--r--doc/smileys/smile2.pngbin0 -> 334 bytes
-rw-r--r--doc/smileys/smile3.pngbin0 -> 326 bytes
-rw-r--r--doc/smileys/smile4.pngbin0 -> 275 bytes
-rw-r--r--doc/smileys/star_off.pngbin0 -> 297 bytes
-rw-r--r--doc/smileys/star_on.pngbin0 -> 370 bytes
-rw-r--r--doc/smileys/thumbs-up.pngbin0 -> 118 bytes
-rw-r--r--doc/smileys/tired.pngbin0 -> 157 bytes
-rw-r--r--doc/smileys/tongue.pngbin0 -> 176 bytes
-rw-r--r--doc/soc.mdwn20
-rw-r--r--doc/soc/application.mdwn96
-rw-r--r--doc/soc/discussion.mdwn2
-rw-r--r--doc/soc/ideas.mdwn8
-rw-r--r--doc/style.css551
-rw-r--r--doc/tags.mdwn26
-rw-r--r--doc/tags/discussion.mdwn20
-rw-r--r--doc/templates.mdwn94
-rw-r--r--doc/templates/discussion.mdwn27
-rw-r--r--doc/templates/gitbranch.mdwn16
-rw-r--r--doc/templates/links.mdwn16
-rw-r--r--doc/templates/note.mdwn11
-rw-r--r--doc/templates/plugin.mdwn19
-rw-r--r--doc/templates/popup.mdwn16
-rw-r--r--doc/theme_market.mdwn13
-rw-r--r--doc/themes.mdwn34
-rw-r--r--doc/themes/actiontabs_small.pngbin0 -> 19202 bytes
-rw-r--r--doc/themes/blueview_small.pngbin0 -> 18543 bytes
-rw-r--r--doc/themes/discussion.mdwn20
-rw-r--r--doc/themes/goldtype_small.pngbin0 -> 19240 bytes
-rw-r--r--doc/themes/monochrome_small.pngbin0 -> 21054 bytes
-rw-r--r--doc/themes/none_small.pngbin0 -> 18516 bytes
-rw-r--r--doc/tipjar.mdwn25
-rw-r--r--doc/tips.mdwn5
-rw-r--r--doc/tips/Adding_Disqus_to_your_wiki.mdwn30
-rw-r--r--doc/tips/Adding_Disqus_to_your_wiki/discussion.mdwn1
-rw-r--r--doc/tips/DreamHost.mdwn192
-rw-r--r--doc/tips/DreamHost/discussion.mdwn18
-rw-r--r--doc/tips/Emacs_and_markdown.html16
-rw-r--r--doc/tips/Git_repository_and_web_server_on_different_hosts.mdwn61
-rw-r--r--doc/tips/Google_custom_search.mdwn12
-rw-r--r--doc/tips/Importing_posts_from_Wordpress.mdwn102
-rw-r--r--doc/tips/Importing_posts_from_Wordpress/discussion.mdwn44
-rw-r--r--doc/tips/JavaScript_to_add_index.html_to_file:_links.mdwn63
-rw-r--r--doc/tips/JavaScript_to_add_index.html_to_file:_links/discusion.mdwn3
-rw-r--r--doc/tips/JavaScript_to_add_index.html_to_file:_links/discussion.mdwn2
-rw-r--r--doc/tips/Make_calendar_start_week_on_Monday.mdwn9
-rw-r--r--doc/tips/Make_calendar_start_week_on_Monday/discussion.mdwn1
-rw-r--r--doc/tips/add_chatterbox_to_blog.mdwn24
-rw-r--r--doc/tips/add_chatterbox_to_blog/discussion.mdwn43
-rw-r--r--doc/tips/blog_script.mdwn6
-rw-r--r--doc/tips/comments_feed.mdwn17
-rw-r--r--doc/tips/convert_blogger_blogs_to_ikiwiki.mdwn5
-rw-r--r--doc/tips/convert_mediawiki_to_ikiwiki.mdwn286
-rw-r--r--doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn669
-rw-r--r--doc/tips/convert_moinmoin_to_ikiwiki.mdwn109
-rw-r--r--doc/tips/convert_moinmoin_to_ikiwiki/discussion.mdwn5
-rw-r--r--doc/tips/distributed_wikis.mdwn46
-rw-r--r--doc/tips/distributed_wikis/discussion.mdwn7
-rw-r--r--doc/tips/dot_cgi.mdwn111
-rw-r--r--doc/tips/dot_cgi/discussion.mdwn51
-rw-r--r--doc/tips/emacs_syntax_highlighting.mdwn3
-rw-r--r--doc/tips/embedding_content.mdwn35
-rw-r--r--doc/tips/follow_wikilinks_from_inside_vim.mdwn47
-rw-r--r--doc/tips/github.mdwn64
-rw-r--r--doc/tips/howto_avoid_flooding_aggregators.mdwn28
-rw-r--r--doc/tips/howto_limit_to_admin_users.mdwn9
-rw-r--r--doc/tips/htaccess_file.mdwn27
-rw-r--r--doc/tips/html5.mdwn27
-rw-r--r--doc/tips/ikiwiki_as_a_requirements_management_tool.mdwn95
-rw-r--r--doc/tips/ikiwiki_as_a_requirements_management_tool/discussion.mdwn21
-rw-r--r--doc/tips/ikiwiki_on_mac_os_x.mdwn218
-rw-r--r--doc/tips/ikiwiki_via_gopher.mdwn22
-rw-r--r--doc/tips/ikiwiki_via_gopher/discussion.mdwn8
-rw-r--r--doc/tips/importing_posts_from_typo.mdwn1
-rw-r--r--doc/tips/importing_posts_from_wordpress/ikiwiki-wordpress-import.mdwn468
-rw-r--r--doc/tips/inside_dot_ikiwiki.mdwn91
-rw-r--r--doc/tips/inside_dot_ikiwiki/discussion.mdwn66
-rw-r--r--doc/tips/integrated_issue_tracking_with_ikiwiki.mdwn277
-rw-r--r--doc/tips/integrated_issue_tracking_with_ikiwiki/discussion.mdwn32
-rw-r--r--doc/tips/laptop_wiki_with_git.mdwn71
-rw-r--r--doc/tips/laptop_wiki_with_git/discussion.mdwn15
-rw-r--r--doc/tips/laptop_wiki_with_git_extended.mdwn43
-rw-r--r--doc/tips/laptop_wiki_with_git_extended/discussion.mdwn1
-rw-r--r--doc/tips/mailman_subscription_form.mdwn10
-rw-r--r--doc/tips/markdown_and_eclipse.mdwn4
-rw-r--r--doc/tips/mathopd_permissions.mdwn15
-rw-r--r--doc/tips/nearlyfreespeech.mdwn108
-rw-r--r--doc/tips/nearlyfreespeech/discussion.mdwn22
-rw-r--r--doc/tips/optimising_ikiwiki.mdwn188
-rw-r--r--doc/tips/parentlinks_style.mdwn143
-rw-r--r--doc/tips/psgi.mdwn21
-rw-r--r--doc/tips/redirections_for_usedirs.mdwn39
-rw-r--r--doc/tips/spam_and_softwaresites.mdwn87
-rw-r--r--doc/tips/spam_and_softwaresites/discussion.mdwn8
-rw-r--r--doc/tips/switching_to_usedirs.mdwn28
-rw-r--r--doc/tips/switching_to_usedirs/discussion.mdwn24
-rw-r--r--doc/tips/untrusted_git_push.mdwn114
-rw-r--r--doc/tips/untrusted_git_push/discussion.mdwn33
-rw-r--r--doc/tips/upgrade_to_3.0.mdwn95
-rw-r--r--doc/tips/using_the_web_interface_with_a_real_text_editor.mdwn17
-rw-r--r--doc/tips/using_the_web_interface_with_a_real_text_editor/discussion.mdwn2
-rw-r--r--doc/tips/vim_and_ikiwiki.mdwn28
-rw-r--r--doc/tips/vim_syntax_highlighting.mdwn20
-rw-r--r--doc/tips/vim_syntax_highlighting/discussion.mdwn8
-rw-r--r--doc/tips/vim_syntax_highlighting/ikiwiki.vim71
-rw-r--r--doc/tips/wikiannounce.mdwn8
-rw-r--r--doc/tips/yaml_setup_files.mdwn12
-rw-r--r--doc/todo.mdwn21
-rw-r--r--doc/todo/ACL.mdwn98
-rw-r--r--doc/todo/A_page_that_inlines_pages__61____34____42____34___results_in_unnecessary_feed_generation.mdwn80
-rw-r--r--doc/todo/Account-creation_password.mdwn6
-rw-r--r--doc/todo/Account_moderation.mdwn12
-rw-r--r--doc/todo/Add_DATE_parameter_for_use_in_templates.mdwn86
-rw-r--r--doc/todo/Add_HTML_support_to_po_plugin.mdwn9
-rw-r--r--doc/todo/Add_a_plugin_to_list_available_pre-processor_commands.mdwn141
-rw-r--r--doc/todo/Add_basename_in_edittemplate.mdwn8
-rw-r--r--doc/todo/Add_camelcase_exclusions.mdwn23
-rw-r--r--doc/todo/Add_instructive_commit_messages_for_add__47__edit_pages.mdwn43
-rw-r--r--doc/todo/Add_instructive_commit_messages_for_removing_pages.mdwn32
-rw-r--r--doc/todo/Add_label_to_search_form_input_field.mdwn56
-rw-r--r--doc/todo/Add_nicer_math_formatting.mdwn28
-rw-r--r--doc/todo/Add_showdown_GUI_input__47__edit.mdwn31
-rw-r--r--doc/todo/Add_space_before_slash_in_parent_links.mdwn156
-rw-r--r--doc/todo/Add_support_for_latest_Text::Markdown_as_found_on_CPAN.mdwn45
-rw-r--r--doc/todo/Adjust_goodstuff.mdwn12
-rw-r--r--doc/todo/Allow_TITLE_to_include_part_of_the_path_in_addition_to_the_basename.mdwn79
-rw-r--r--doc/todo/Allow_change_of_wiki_file_types.mdwn85
-rw-r--r--doc/todo/Allow_disabling_edit_and_preferences_links.mdwn81
-rw-r--r--doc/todo/Allow_edittemplate_to_set_file_type.mdwn44
-rw-r--r--doc/todo/Allow_filenames_that_are_all_type.mdwn41
-rw-r--r--doc/todo/Allow_per-page_template_selection.mdwn43
-rw-r--r--doc/todo/Allow_web_edit_form_comment_field_to_be_mandatory.mdwn22
-rw-r--r--doc/todo/Attempt_to_extend_Mercurial_backend_support.mdwn258
-rw-r--r--doc/todo/Auto-setup_and_maintain_Mercurial_wrapper_hooks.mdwn240
-rw-r--r--doc/todo/Auto-setup_should_default_to_YAML.mdwn3
-rw-r--r--doc/todo/Automatic_aggregate_setup_from_wikilist_in_Debian_package_.mdwn29
-rw-r--r--doc/todo/BTS_integration.mdwn11
-rw-r--r--doc/todo/Bestdir_along_with_bestlink_in_IkiWiki.pm.mdwn51
-rw-r--r--doc/todo/Bestdir_along_with_bestlink_in_IkiWiki.pm/discussion.mdwn6
-rw-r--r--doc/todo/Better_bug_tracking_support.mdwn71
-rw-r--r--doc/todo/Better_reporting_of_validation_errors.mdwn2
-rw-r--r--doc/todo/BibTeX.mdwn74
-rw-r--r--doc/todo/BrowserID.mdwn25
-rw-r--r--doc/todo/CGI_method_to_pullrefresh.mdwn7
-rw-r--r--doc/todo/CSS_classes_for_links.mdwn138
-rw-r--r--doc/todo/CVS_backend.mdwn16
-rw-r--r--doc/todo/Calendar:_listing_multiple_entries_per_day_.mdwn94
-rw-r--r--doc/todo/Case.mdwn4
-rw-r--r--doc/todo/Commit_emails:_ones_own_changes.mdwn9
-rw-r--r--doc/todo/Configurable_minimum_length_of_log_message_for_web_edits.mdwn5
-rw-r--r--doc/todo/Configureable_separator_of_page_name.mdwn12
-rw-r--r--doc/todo/Debian_package_could_Recommend_gcc_+_libc6-dev__44___not_Depend.mdwn22
-rw-r--r--doc/todo/Default_text_for_new_pages.mdwn104
-rw-r--r--doc/todo/Does_not_support_non-UTF8_files.mdwn7
-rw-r--r--doc/todo/Editing_po_files.mdwn5
-rw-r--r--doc/todo/Enable_filtering_of_files_indexed_for_search.mdwn7
-rw-r--r--doc/todo/Extensible_inlining.mdwn263
-rw-r--r--doc/todo/Feature_parity_with_Trac.mdwn22
-rw-r--r--doc/todo/Fenced_code_blocks___40__from_GitHub_Flavored_Markdown__41__.mdwn44
-rw-r--r--doc/todo/Fix_CSS_to_not_put_a_border_around_image_links.mdwn7
-rw-r--r--doc/todo/Fix_selflink_in_po_plugin.mdwn21
-rw-r--r--doc/todo/FormBuilder__95__Template__95__patch.mdwn10
-rw-r--r--doc/todo/FormattingHelp_should_open_new_window.mdwn1
-rw-r--r--doc/todo/Gallery.mdwn83
-rw-r--r--doc/todo/Give_access_to_more_TMPL__95__VAR_variables_in_templates_inserted_by_the_template_plugin.mdwn111
-rw-r--r--doc/todo/Google_Analytics_support.mdwn31
-rw-r--r--doc/todo/Google_Sitemap_protocol.mdwn60
-rw-r--r--doc/todo/Have_xapian_index_pdf__44___openoffice__44___documents.mdwn5
-rw-r--r--doc/todo/IRC_topic.mdwn10
-rw-r--r--doc/todo/Improve_display_of_OpenIDs.mdwn5
-rw-r--r--doc/todo/Improve_markdown_speed.mdwn33
-rw-r--r--doc/todo/Improve_signin_form_layout.mdwn44
-rw-r--r--doc/todo/Improving_the_efficiency_of_match__95__glob.mdwn228
-rw-r--r--doc/todo/Inline_plugin_option_to_show_full_page_path.mdwn30
-rw-r--r--doc/todo/Location_of_pages_starting_with___36__tagbase_should_be_in__by_default.mdwn15
-rw-r--r--doc/todo/Mailing_list.mdwn36
-rw-r--r--doc/todo/Make_example_setup_file_consistent.mdwn33
-rw-r--r--doc/todo/Mercurial_backend_update.mdwn969
-rw-r--r--doc/todo/Modern_standard_layout.mdwn39
-rw-r--r--doc/todo/More_flexible_po-plugin_for_translation.mdwn5
-rw-r--r--doc/todo/Move_teximg_latex_preamble_to_config_file.mdwn156
-rw-r--r--doc/todo/Moving_Pages.mdwn222
-rw-r--r--doc/todo/Multiple_categorization_namespaces.mdwn103
-rw-r--r--doc/todo/New_preprocessor_directive_syntax.mdwn21
-rw-r--r--doc/todo/New_preprocessor_directive_syntax/discussion.mdwn19
-rw-r--r--doc/todo/OpenSearch.mdwn38
-rw-r--r--doc/todo/Option_to_disable_date_footer_for_inlines.mdwn31
-rw-r--r--doc/todo/Option_to_make_title_an_h1__63__.mdwn14
-rw-r--r--doc/todo/Overlay_directory_for_pagetemplates.mdwn9
-rw-r--r--doc/todo/Pagination_next_prev_links.mdwn68
-rw-r--r--doc/todo/Plugins_to_provide___34__add_to__34___links_for_popular_feed_readers.mdwn6
-rw-r--r--doc/todo/Post-compilation_inclusion_of_the_sidebar.mdwn67
-rw-r--r--doc/todo/Print_link.mdwn73
-rw-r--r--doc/todo/RSS_fields.mdwn25
-rw-r--r--doc/todo/RSS_links.mdwn17
-rw-r--r--doc/todo/Raw_view_link.mdwn19
-rw-r--r--doc/todo/RecentChanges_page_links_without_cgi_wrapper.mdwn26
-rw-r--r--doc/todo/Render_multiple_destinations_from_one_source.mdwn85
-rw-r--r--doc/todo/Resolve_native_reStructuredText_links_to_ikiwiki_pages.mdwn333
-rw-r--r--doc/todo/Restrict_formats_allowed_for_comments.mdwn99
-rw-r--r--doc/todo/Restrict_page_viewing.mdwn42
-rw-r--r--doc/todo/Separate_OpenIDs_and_usernames.mdwn55
-rw-r--r--doc/todo/Set_arbitrary_date_to_be_used_by_calendar_plugin.mdwn210
-rw-r--r--doc/todo/Set_arbitrary_date_to_be_used_by_calendar_plugin/discussion.mdwn44
-rw-r--r--doc/todo/Set_templates_for_whole_sections_of_the_site.mdwn41
-rw-r--r--doc/todo/Short_wikilinks.mdwn104
-rw-r--r--doc/todo/Shorter_feeds.mdwn11
-rw-r--r--doc/todo/Silence_monotone_warning.mdwn17
-rw-r--r--doc/todo/Split_plugins_with_external_dependencies_into_separate_Debian_packages.mdwn40
-rw-r--r--doc/todo/Suggested_location_should_be_subpage_if_siblings_exist.mdwn26
-rw-r--r--doc/todo/Support_MultiMarkdown_3.X.mdwn10
-rw-r--r--doc/todo/Support_XML-RPC-based_blogging.mdwn17
-rw-r--r--doc/todo/Support__47__Switch_to_MultiMarkdown.mdwn35
-rw-r--r--doc/todo/Support_preprocessing_CSS.mdwn1
-rw-r--r--doc/todo/Support_subdirectory_of_a_git_repo.mdwn9
-rw-r--r--doc/todo/Support_tab_insertion_in_textarea.mdwn15
-rw-r--r--doc/todo/Support_wildcard_inside_of_link__40____41___within_a_pagespec.mdwn45
-rw-r--r--doc/todo/Tags_list_in_page_footer_uses_basename.mdwn11
-rw-r--r--doc/todo/Track_Markdown_Standardisation_Efforts.mdwn7
-rw-r--r--doc/todo/Unit_tests.mdwn10
-rw-r--r--doc/todo/Untrusted_push_in_Monotone.mdwn28
-rw-r--r--doc/todo/Updated_bug_tracking_example.mdwn136
-rw-r--r--doc/todo/Using_page_titles_in_internal_links.mdwn3
-rw-r--r--doc/todo/Wikilink_to_a_symbolic_link.mdwn5
-rw-r--r--doc/todo/Wrapper_config_with_multiline_regexp.mdwn36
-rw-r--r--doc/todo/Zoned_ikiwiki.mdwn64
-rw-r--r--doc/todo/__34__subscribe_to_this_page__34___checkbox_on_edit_form.mdwn10
-rw-r--r--doc/todo/__42__forward__42__ing_functionality_for_the_meta_plugin.mdwn70
-rw-r--r--doc/todo/__47___should_point_to_top-level_index.mdwn3
-rw-r--r--doc/todo/a_navbar_based_on_page_properties.mdwn48
-rw-r--r--doc/todo/abbreviation.mdwn7
-rw-r--r--doc/todo/ability_to_force_particular_UUIDs_on_blog_posts.mdwn24
-rw-r--r--doc/todo/absolute_urls_in_wikilinks.mdwn20
-rw-r--r--doc/todo/access_keys.mdwn286
-rw-r--r--doc/todo/ad-hoc_plugins.mdwn66
-rw-r--r--doc/todo/add_forward_age_sorting_option_to_inline.mdwn34
-rw-r--r--doc/todo/adding_new_pages_by_using_the_web_interface.mdwn79
-rw-r--r--doc/todo/adjust_commit_message_for_rename__44___remove.mdwn5
-rw-r--r--doc/todo/aggregate_401_handling.mdwn20
-rw-r--r--doc/todo/aggregate_locking.mdwn64
-rw-r--r--doc/todo/aggregate_to_internal_pages.mdwn59
-rw-r--r--doc/todo/aggregation.mdwn3
-rw-r--r--doc/todo/alias_directive.mdwn72
-rw-r--r--doc/todo/allow_CGI_to_create_dynamic_pages.mdwn3
-rw-r--r--doc/todo/allow_TMPL__95__LOOP_in_template_directives.mdwn278
-rw-r--r--doc/todo/allow_banning_a_user_when_moderating_a_comment.mdwn1
-rw-r--r--doc/todo/allow_creation_of_non-existent_pages.mdwn13
-rw-r--r--doc/todo/allow_disabling_backlinks.mdwn18
-rw-r--r--doc/todo/allow_displaying_number_of_comments.mdwn30
-rw-r--r--doc/todo/allow_full_post_from_the___34__add_a_new_post__34___form.mdwn12
-rw-r--r--doc/todo/allow_plugins_to_add_sorting_methods.mdwn304
-rw-r--r--doc/todo/allow_site-wide_meta_definitions.mdwn169
-rw-r--r--doc/todo/allow_wiki_syntax_in_commit_messages.mdwn21
-rw-r--r--doc/todo/anon_push_of_comments.mdwn14
-rw-r--r--doc/todo/anti-spam_protection.mdwn30
-rw-r--r--doc/todo/apache_404_ErrorDocument_handler.mdwn25
-rw-r--r--doc/todo/applydiff_plugin.mdwn110
-rw-r--r--doc/todo/assumes_system_perl.mdwn20
-rw-r--r--doc/todo/attachments.mdwn22
-rw-r--r--doc/todo/auto-create_tag_pages_according_to_a_template.mdwn270
-rw-r--r--doc/todo/auto_getctime_on_fresh_build.mdwn13
-rw-r--r--doc/todo/auto_publish_expire.mdwn33
-rw-r--r--doc/todo/auto_rebuild_on_template_change.mdwn78
-rw-r--r--doc/todo/autoindex_should_use_add__95__autofile.mdwn120
-rw-r--r--doc/todo/automatic_rebuilding_of_html_pages.mdwn5
-rw-r--r--doc/todo/automatic_use_of_syntax_plugin_on_source_code_files.mdwn17
-rw-r--r--doc/todo/automatic_use_of_syntax_plugin_on_source_code_files/discussion.mdwn215
-rw-r--r--doc/todo/avatar.mdwn31
-rw-r--r--doc/todo/avatar/discussion.mdwn1
-rw-r--r--doc/todo/avoid_attachement_ui_if_upload_not_allowed.mdwn25
-rw-r--r--doc/todo/avoid_thrashing.mdwn22
-rw-r--r--doc/todo/backlinks_result_is_lossy.mdwn12
-rw-r--r--doc/todo/basewiki_should_be_self_documenting.mdwn40
-rw-r--r--doc/todo/be_more_selective_about_running_hooks.mdwn68
-rw-r--r--doc/todo/beef_up_sidebar_to_allow_for_multiple_sidebars.mdwn125
-rw-r--r--doc/todo/beef_up_signin_page.mdwn17
-rw-r--r--doc/todo/block_external_links.mdwn16
-rw-r--r--doc/todo/blocking_ip_ranges.mdwn7
-rw-r--r--doc/todo/blogging.mdwn137
-rw-r--r--doc/todo/blogpost_plugin.mdwn156
-rw-r--r--doc/todo/blogs.mdwn4
-rw-r--r--doc/todo/blogspam_training.mdwn31
-rw-r--r--doc/todo/break_up_page_template_into_subfiles.mdwn36
-rw-r--r--doc/todo/brokenlinks_should_group_links_to_a_page.mdwn21
-rw-r--r--doc/todo/bzr.mdwn194
-rw-r--r--doc/todo/cache_backlinks.mdwn25
-rw-r--r--doc/todo/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn124
-rw-r--r--doc/todo/calendar_with___34__create__34___links.mdwn10
-rw-r--r--doc/todo/calendar_with___34__create__34___links/incomplete_patch.pl36
-rw-r--r--doc/todo/call_git-update-server-info_from_post-udpate_hook.mdwn15
-rw-r--r--doc/todo/canonical_feed_location.mdwn16
-rw-r--r--doc/todo/capitalize_title.mdwn31
-rw-r--r--doc/todo/cas_authentication.mdwn184
-rw-r--r--doc/todo/cdate_and_mdate_available_for_templates.mdwn15
-rw-r--r--doc/todo/cgi_hooks_get_session_objects.mdwn5
-rw-r--r--doc/todo/clear_page_to_delete.mdwn33
-rw-r--r--doc/todo/clickable-openid-urls-in-logs.mdwn23
-rw-r--r--doc/todo/color_plugin.mdwn231
-rw-r--r--doc/todo/comment_by_mail.mdwn3
-rw-r--r--doc/todo/comment_by_mail/discussion.mdwn25
-rw-r--r--doc/todo/comment_moderation_feed.mdwn16
-rw-r--r--doc/todo/comments.mdwn170
-rw-r--r--doc/todo/conditional_text_based_on_ikiwiki_features.mdwn128
-rw-r--r--doc/todo/conditional_underlay_files.mdwn29
-rw-r--r--doc/todo/configurable_markdown_path.mdwn64
-rw-r--r--doc/todo/configurable_tidy_command_for_htmltidy.mdwn8
-rw-r--r--doc/todo/configurable_timezones.mdwn7
-rw-r--r--doc/todo/conflict_free_comment_merges.mdwn23
-rw-r--r--doc/todo/consistent_smileys.mdwn22
-rw-r--r--doc/todo/copyright_based_on_pagespec.mdwn10
-rw-r--r--doc/todo/correct_published_and_updated_time_information_for_the_feeds.mdwn113
-rw-r--r--doc/todo/countdown_directive.mdwn5
-rw-r--r--doc/todo/credentials_page.mdwn33
-rw-r--r--doc/todo/ctime_on_blog_post_pages_.mdwn11
-rw-r--r--doc/todo/custom_location_for_openlayers.mdwn17
-rw-r--r--doc/todo/darcs.mdwn53
-rw-r--r--doc/todo/datearchives-plugin.mdwn77
-rw-r--r--doc/todo/default_content_for_new_post.mdwn66
-rw-r--r--doc/todo/default_name_for_new_post.mdwn3
-rw-r--r--doc/todo/dependency_types.mdwn579
-rw-r--r--doc/todo/description_meta_param_passed_to_templates.mdwn10
-rw-r--r--doc/todo/different_search_engine.mdwn332
-rw-r--r--doc/todo/directive_docs.mdwn79
-rw-r--r--doc/todo/discuss_without_login.mdwn19
-rw-r--r--doc/todo/discussion_page_as_blog.mdwn33
-rw-r--r--doc/todo/discussion_page_as_blog/discussion/castle.mdwn3
-rw-r--r--doc/todo/discussion_page_as_blog/discussion/castle/discussion.mdwn1
-rw-r--r--doc/todo/discussion_page_as_blog/discussion/castle/discussion/Don__39__t_like_foo.mdwn3
-rw-r--r--doc/todo/discussion_page_as_blog/discussion/castle/discussion/Don__39__t_like_foo/how_about_bar.mdwn1
-rw-r--r--doc/todo/discussion_page_as_blog/discussion/castle/discussion/Don__39__t_like_foo/sdf.mdwn5
-rw-r--r--doc/todo/discussion_page_as_blog/discussion/castle/discussion/foo_is_ok.mdwn1
-rw-r--r--doc/todo/discussion_page_as_blog/discussion/castle/discussion/test.mdwn1
-rw-r--r--doc/todo/do_not_make_links_backwards.mdwn95
-rw-r--r--doc/todo/done.mdwn3
-rw-r--r--doc/todo/double-click_protection_for_form_buttons.mdwn5
-rw-r--r--doc/todo/doxygen_support.mdwn7
-rw-r--r--doc/todo/dynamic_rootpage.mdwn35
-rw-r--r--doc/todo/ease_archivepage_styling.mdwn59
-rw-r--r--doc/todo/edit_form:_no_fixed_size_for_textarea.mdwn52
-rw-r--r--doc/todo/edittemplate_should_look_in_templates_directory_by_default.mdwn8
-rw-r--r--doc/todo/edittemplate_should_support_uuid__44___date_variables.mdwn19
-rw-r--r--doc/todo/else_parameter_for_map_plugin.mdwn56
-rw-r--r--doc/todo/enable-htaccess-files.mdwn80
-rw-r--r--doc/todo/enable_arbitrary_markup_for_directives.mdwn47
-rw-r--r--doc/todo/etherpad_support.mdwn22
-rw-r--r--doc/todo/excluding_commit_mails.mdwn19
-rw-r--r--doc/todo/fancypodcast.mdwn330
-rw-r--r--doc/todo/fastcgi_or_modperl_installation_instructions.mdwn18
-rw-r--r--doc/todo/feed_enhancements_for_inline_pages.mdwn132
-rw-r--r--doc/todo/fileupload.mdwn63
-rw-r--r--doc/todo/fileupload/discussion.mdwn45
-rw-r--r--doc/todo/fileupload/soc-proposal.mdwn71
-rw-r--r--doc/todo/fileupload/soc-proposal/discussion.mdwn46
-rw-r--r--doc/todo/filtering_content_when_inlining.mdwn16
-rw-r--r--doc/todo/finer_control_over___60__object___47____62__s.mdwn98
-rw-r--r--doc/todo/firm_up_plugin_interface.mdwn96
-rw-r--r--doc/todo/for_amazon_s3_pre-gzip-encode_safe_files.mdwn17
-rw-r--r--doc/todo/format_escape.mdwn292
-rw-r--r--doc/todo/fortune:_select_options_via_environment.mdwn34
-rw-r--r--doc/todo/friendly_markup_names.mdwn13
-rw-r--r--doc/todo/generated_po_stuff_not_ignored_by_git.mdwn6
-rw-r--r--doc/todo/generic___39__do__61__goto__39___for_CGI.mdwn35
-rw-r--r--doc/todo/generic_insert_links.mdwn24
-rw-r--r--doc/todo/geotagging.mdwn7
-rw-r--r--doc/todo/git-rev-list_requires_relative_path___40__fixes_git_ctime__41__.mdwn24
-rw-r--r--doc/todo/git_attribution.mdwn9
-rw-r--r--doc/todo/git_attribution/discussion.mdwn98
-rw-r--r--doc/todo/git_recentchanges_should_not_show_merges.mdwn20
-rw-r--r--doc/todo/graphviz.mdwn19
-rw-r--r--doc/todo/hard-coded_location_for_man_pages_and_w3m_cgi_wrapper.mdwn94
-rw-r--r--doc/todo/headless_git_branches.mdwn113
-rw-r--r--doc/todo/hidden_links__47__tags.mdwn13
-rw-r--r--doc/todo/hook_to_detect_markdown_links_to_wiki_pages.mdwn1
-rw-r--r--doc/todo/html.mdwn6
-rw-r--r--doc/todo/htmlvalidation.mdwn47
-rw-r--r--doc/todo/htpasswd_mirror_of_the_userdb.mdwn29
-rw-r--r--doc/todo/http_bl_support.mdwn67
-rw-r--r--doc/todo/httpauth_example.mdwn8
-rw-r--r--doc/todo/httpauth_example/discussion.mdwn1
-rw-r--r--doc/todo/httpauth_feature_parity_with_passwordauth.mdwn28
-rw-r--r--doc/todo/hyphenation.mdwn32
-rw-r--r--doc/todo/ikibot.mdwn9
-rw-r--r--doc/todo/improve_globlists.mdwn8
-rw-r--r--doc/todo/improved_mediawiki_support.mdwn9
-rw-r--r--doc/todo/improved_parentlinks_styling.mdwn9
-rw-r--r--doc/todo/index.html_allowed.mdwn126
-rw-r--r--doc/todo/inline:_numerical_ordering_by_title.mdwn254
-rw-r--r--doc/todo/inline_directive_should_support_pagination.mdwn8
-rw-r--r--doc/todo/inline_option_for_pagespec-specific_show__61__N.mdwn3
-rw-r--r--doc/todo/inline_plugin:_ability_to_override_feed_name.mdwn29
-rw-r--r--doc/todo/inline_plugin:_hide_feed_buttons_if_empty.mdwn7
-rw-r--r--doc/todo/inline_plugin:_specifying_ordered_page_names.mdwn19
-rw-r--r--doc/todo/inline_postform_autotitles.mdwn67
-rw-r--r--doc/todo/inline_raw_files.mdwn115
-rw-r--r--doc/todo/inlines_inheriting_links.mdwn39
-rw-r--r--doc/todo/integration_with_Firefox_and_Iceweasel_feed_subscription_mechanism.mdwn13
-rw-r--r--doc/todo/interactive_todo_lists.mdwn49
-rw-r--r--doc/todo/internal_definition_list_support.mdwn54
-rw-r--r--doc/todo/l10n.mdwn84
-rw-r--r--doc/todo/language_definition_for_the_meta_plugin.mdwn118
-rw-r--r--doc/todo/latex.mdwn244
-rw-r--r--doc/todo/latex/discussion.mdwn6
-rw-r--r--doc/todo/let_inline_plugin_use_pagetemplates.mdwn5
-rw-r--r--doc/todo/limit_the_markup_formats_available_for_editing.mdwn8
-rw-r--r--doc/todo/link_map.mdwn6
-rw-r--r--doc/todo/link_plugin_perhaps_too_general__63__.mdwn25
-rw-r--r--doc/todo/linkbase.mdwn16
-rw-r--r--doc/todo/linkify_and_preprocessor_ordering.mdwn24
-rw-r--r--doc/todo/linktitle.mdwn19
-rw-r--r--doc/todo/lists.mdwn3
-rw-r--r--doc/todo/location_of_external_plugins.mdwn24
-rw-r--r--doc/todo/location_of_ikiwiki-w3m.cgi.mdwn3
-rw-r--r--doc/todo/logo.mdwn4
-rw-r--r--doc/todo/lucene_search_engine.mdwn1
-rw-r--r--doc/todo/mailnotification.mdwn59
-rw-r--r--doc/todo/mailnotification/discussion.mdwn14
-rw-r--r--doc/todo/make_html-parser_use_encode_entities_numeric.mdwn19
-rw-r--r--doc/todo/make_link_target_search_all_paths_as_fallback.mdwn27
-rw-r--r--doc/todo/manpages.mdwn4
-rw-r--r--doc/todo/mark_edit_as_trivial__44___identify__47__filter_on_trivial_changes.mdwn11
-rw-r--r--doc/todo/matching_different_kinds_of_links.mdwn196
-rw-r--r--doc/todo/mbox.mdwn18
-rw-r--r--doc/todo/mdwn_itex.mdwn22
-rw-r--r--doc/todo/mdwn_preview.mdwn339
-rw-r--r--doc/todo/mdwn_preview/discussion.mdwn1
-rw-r--r--doc/todo/mercurial.mdwn129
-rw-r--r--doc/todo/mercurial/discussion.mdwn9
-rw-r--r--doc/todo/meta_rcsid.mdwn51
-rw-r--r--doc/todo/metadata.mdwn19
-rw-r--r--doc/todo/minor_adjustment_to_setup_documentation_for_recentchanges_feeds.mdwn28
-rw-r--r--doc/todo/mirrorlist_with_per-mirror_usedirs_settings.mdwn103
-rw-r--r--doc/todo/missingparents.pm.mdwn261
-rw-r--r--doc/todo/modify_page_filename_in_plugin.mdwn35
-rw-r--r--doc/todo/monochrome_theme.mdwn48
-rw-r--r--doc/todo/more_class__61____34____34___for_css.mdwn83
-rw-r--r--doc/todo/more_customisable_titlepage_function.mdwn42
-rw-r--r--doc/todo/more_flexible_inline_postform.mdwn23
-rw-r--r--doc/todo/mtime.mdwn16
-rw-r--r--doc/todo/multi-thread_ikiwiki.mdwn89
-rw-r--r--doc/todo/multiple_output_formats.mdwn17
-rw-r--r--doc/todo/multiple_repository_support.mdwn15
-rw-r--r--doc/todo/multiple_simultaneous_rcs.mdwn26
-rw-r--r--doc/todo/multiple_simultaneous_rcs/discussion.mdwn15
-rw-r--r--doc/todo/multiple_template_directories.mdwn73
-rw-r--r--doc/todo/multiple_templates.mdwn13
-rw-r--r--doc/todo/natural_sorting.mdwn21
-rw-r--r--doc/todo/need_global_renamepage_hook.mdwn115
-rw-r--r--doc/todo/nested_preprocessor_directives.mdwn69
-rw-r--r--doc/todo/online_configuration.mdwn28
-rw-r--r--doc/todo/openid_enable_cache.mdwn4
-rw-r--r--doc/todo/openid_user_filtering.mdwn13
-rw-r--r--doc/todo/optimisations.mdwn15
-rw-r--r--doc/todo/optimize_simple_dependencies.mdwn95
-rw-r--r--doc/todo/optional_underlaydir_prefix.mdwn46
-rw-r--r--doc/todo/org_mode.mdwn36
-rw-r--r--doc/todo/org_mode/Discussion.mdwn7
-rw-r--r--doc/todo/osm__95__optimisations__95__and__95__fixes.mdwn27
-rw-r--r--doc/todo/osm_arbitrary_layers.mdwn43
-rw-r--r--doc/todo/overriding_displayed_modification_time.mdwn27
-rw-r--r--doc/todo/page_edit_disable.mdwn53
-rw-r--r--doc/todo/pagedeletion.mdwn3
-rw-r--r--doc/todo/pagedown_plugin.mdwn5
-rw-r--r--doc/todo/pageindexes.mdwn5
-rw-r--r--doc/todo/pagespec_aliases.mdwn169
-rw-r--r--doc/todo/pagespec_aliases/discussion.mdwn13
-rw-r--r--doc/todo/pagespec_expansions.mdwn151
-rw-r--r--doc/todo/pagespec_relative_to_a_target.mdwn101
-rw-r--r--doc/todo/pagespec_to_disable_ikiwiki_directives.mdwn5
-rw-r--r--doc/todo/pagestats_among_a_subset_of_pages.mdwn28
-rw-r--r--doc/todo/pal_plugin.mdwn9
-rw-r--r--doc/todo/parse_debian_packages.mdwn70
-rw-r--r--doc/todo/passwordauth:_sendmail_interface.mdwn61
-rw-r--r--doc/todo/paste_plugin.mdwn36
-rw-r--r--doc/todo/pastebin.mdwn11
-rw-r--r--doc/todo/pdf_output.mdwn22
-rw-r--r--doc/todo/pdfshare_plugin.mdwn1
-rw-r--r--doc/todo/pedigree_plugin.mdwn194
-rw-r--r--doc/todo/per_page_ACLs.mdwn18
-rw-r--r--doc/todo/pingback_support.mdwn41
-rw-r--r--doc/todo/please_add_some_table_styles.mdwn8
-rw-r--r--doc/todo/pluggablerenderers.mdwn3
-rw-r--r--doc/todo/plugin.mdwn118
-rw-r--r--doc/todo/plugin_data_storage.mdwn94
-rw-r--r--doc/todo/plugin_dependency_calulation.mdwn24
-rw-r--r--doc/todo/po:_add_lang_name_and_code_template_variables.mdwn7
-rw-r--r--doc/todo/po:_avoid_rebuilding_to_fix_meta_titles.mdwn60
-rw-r--r--doc/todo/po:_better_documentation.mdwn3
-rw-r--r--doc/todo/po:_better_links.mdwn12
-rw-r--r--doc/todo/po:_better_translation_interface.mdwn5
-rw-r--r--doc/todo/po:_remove_po_files_when_disabling_plugin.mdwn13
-rw-r--r--doc/todo/po:_rethink_pagespecs.mdwn40
-rw-r--r--doc/todo/po:_should_cleanup_.pot_files.mdwn8
-rw-r--r--doc/todo/po:_transifex_integration.mdwn13
-rw-r--r--doc/todo/po:_translation_of_directives.mdwn8
-rw-r--r--doc/todo/po_needstranslation_pagespec.mdwn12
-rw-r--r--doc/todo/preprocessor_directive_for_proposed_changes.mdwn60
-rw-r--r--doc/todo/pretty-print_OpenIDs_even_if_not_enabled.mdwn29
-rw-r--r--doc/todo/preview_changes.mdwn14
-rw-r--r--doc/todo/preview_changes_before_git_commit.mdwn17
-rw-r--r--doc/todo/progressbar_plugin.mdwn132
-rw-r--r--doc/todo/provide_a_mailing_list.mdwn40
-rw-r--r--doc/todo/provide_inline_diffs_in_recentchanges.mdwn27
-rw-r--r--doc/todo/provide_sha1_for_git_diffurl.mdwn26
-rw-r--r--doc/todo/publishing_in_the_future.mdwn127
-rw-r--r--doc/todo/quieten-bzr.mdwn28
-rw-r--r--doc/todo/rcs.mdwn25
-rw-r--r--doc/todo/rcs__95__diff_implementation_for_Mercurial_backend__44___based_on_Git_backend.mdwn40
-rw-r--r--doc/todo/rcs__95__get__123__c__44__m__125__time_implementation_for_Mercurial_backend__44___based_on_Git_backend.mdwn157
-rw-r--r--doc/todo/rcs_updates_needed.mdwn10
-rw-r--r--doc/todo/recentchanges.mdwn144
-rw-r--r--doc/todo/recentchanges_feed_with_comment.mdwn5
-rw-r--r--doc/todo/recentchanges_path.mdwn9
-rw-r--r--doc/todo/recommend_libtext-markdown-discount_instead_of_depending.mdwn25
-rw-r--r--doc/todo/redirect_automatically_after_rename.mdwn10
-rw-r--r--doc/todo/refreshing_recentchanges_page.mdwn20
-rw-r--r--doc/todo/rel__61__nofollow_on_external_links.mdwn4
-rw-r--r--doc/todo/rel_attribute_for_links.mdwn19
-rw-r--r--doc/todo/relative_pagespec_deficiency.mdwn8
-rw-r--r--doc/todo/remove_basewiki_redir_pages.mdwn4
-rw-r--r--doc/todo/replace_HTML::Template_with_Template_Toolkit.mdwn115
-rw-r--r--doc/todo/require_CAPTCHA_to_edit.mdwn327
-rw-r--r--doc/todo/review_mechanism.mdwn35
-rw-r--r--doc/todo/rewrite_ikiwiki_in_haskell.mdwn65
-rw-r--r--doc/todo/rewrite_ikiwiki_in_haskell/discussion.mdwn61
-rw-r--r--doc/todo/rss_title_description.mdwn35
-rw-r--r--doc/todo/rst_plugin_python_rewrite.mdwn7
-rw-r--r--doc/todo/salmon_protocol_for_comment_sharing.mdwn21
-rw-r--r--doc/todo/search.mdwn5
-rw-r--r--doc/todo/search_terms.mdwn7
-rw-r--r--doc/todo/section-numbering.mdwn7
-rw-r--r--doc/todo/selective_more_directive.mdwn28
-rw-r--r--doc/todo/shortcut_link_text.mdwn19
-rw-r--r--doc/todo/shortcut_optional_parameters.mdwn46
-rw-r--r--doc/todo/shortcut_with_different_link_text.mdwn67
-rw-r--r--doc/todo/shortcut_with_no_url_parameter__44___only_desc.mdwn23
-rw-r--r--doc/todo/should_optimise_pagespecs.mdwn313
-rw-r--r--doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn61
-rw-r--r--doc/todo/sigs.mdwn25
-rw-r--r--doc/todo/sigs/discussion.mdwn1
-rw-r--r--doc/todo/simple_text_parsing_or_regex_in_template_or_shortcut.mdwn32
-rw-r--r--doc/todo/skip_option_for_inline_plugin.mdwn8
-rw-r--r--doc/todo/smarter_sorting.mdwn141
-rw-r--r--doc/todo/smileys_do_not_work_in_PreprocessorDirective_arguments.mdwn18
-rw-r--r--doc/todo/softlinks.mdwn14
-rw-r--r--doc/todo/sort_parameter_for_map_plugin_and_directive.mdwn53
-rw-r--r--doc/todo/sort_parameter_for_map_plugin_and_directive/incomplete_patch.pl.pl77
-rw-r--r--doc/todo/sort_parameter_for_map_plugin_and_directive/python_algorithms.py86
-rw-r--r--doc/todo/sortable_tables.mdwn1
-rw-r--r--doc/todo/sortbylastcomment_plugin.mdwn13
-rw-r--r--doc/todo/sorting_by_path.mdwn18
-rw-r--r--doc/todo/source_link.mdwn135
-rw-r--r--doc/todo/spell_check_plug-in.mdwn12
-rw-r--r--doc/todo/strftime.mdwn4
-rw-r--r--doc/todo/structured_page_data.mdwn633
-rw-r--r--doc/todo/structured_page_data/discussion.mdwn1
-rw-r--r--doc/todo/stylesheet_suggestion_for_verbatim_content.mdwn33
-rw-r--r--doc/todo/submodule_support.mdwn15
-rw-r--r--doc/todo/support_creole_markup.mdwn18
-rw-r--r--doc/todo/support_dicts_in_setup.mdwn26
-rw-r--r--doc/todo/support_for_SDF_documents.mdwn8
-rw-r--r--doc/todo/support_for_plugins_written_in_other_languages.mdwn56
-rw-r--r--doc/todo/support_includes_in_setup_files.mdwn10
-rw-r--r--doc/todo/support_link__40__.__41___in_pagespec.mdwn21
-rw-r--r--doc/todo/support_multiple_perl_libraries.mdwn11
-rw-r--r--doc/todo/supporting_comments_via_disussion_pages.mdwn222
-rw-r--r--doc/todo/svg.mdwn77
-rw-r--r--doc/todo/syntax_highlighting.mdwn120
-rw-r--r--doc/todo/syntax_highlighting/discussion.mdwn28
-rw-r--r--doc/todo/syslog_should_show_wiki_name.mdwn8
-rw-r--r--doc/todo/table_with_header_column.mdwn7
-rw-r--r--doc/todo/tag_pagespec_function.mdwn41
-rw-r--r--doc/todo/tagging_with_a_publication_date.mdwn71
-rw-r--r--doc/todo/tags.mdwn12
-rw-r--r--doc/todo/target_filter_for_brokenlinks.mdwn9
-rw-r--r--doc/todo/terminalclient.mdwn10
-rw-r--r--doc/todo/test_coverage.mdwn24
-rw-r--r--doc/todo/themes_should_ship_with_templates.mdwn19
-rw-r--r--doc/todo/tidy_git__39__s_ctime_debug_output.mdwn15
-rw-r--r--doc/todo/tla.mdwn7
-rw-r--r--doc/todo/tmplvars_plugin.mdwn75
-rw-r--r--doc/todo/tmplvars_plugin/discussion.mdwn1
-rw-r--r--doc/todo/toc-with-human-readable-anchors.mdwn7
-rw-r--r--doc/todo/toc_plugin:_set_a_header_ceiling___40__opposite_of_levels__61____41__.mdwn48
-rw-r--r--doc/todo/toc_plugin_to_skip_one_level.mdwn23
-rw-r--r--doc/todo/toggle_initial_state.mdwn6
-rw-r--r--doc/todo/toplevel_index.mdwn37
-rw-r--r--doc/todo/tracking_bugs_with_dependencies.mdwn680
-rw-r--r--doc/todo/transient_pages.mdwn318
-rw-r--r--doc/todo/translation_links.mdwn46
-rw-r--r--doc/todo/turn_edittemplate_verbosity_off_by_default.mdwn34
-rw-r--r--doc/todo/two-way_convert_of_wikis.mdwn18
-rw-r--r--doc/todo/typography_plugin_configuration.mdwn6
-rw-r--r--doc/todo/unaccent_url_instead_of_encoding.mdwn24
-rw-r--r--doc/todo/underlay.mdwn13
-rw-r--r--doc/todo/unified_temporary_file__47__directory_handling.mdwn19
-rw-r--r--doc/todo/untrusted_git_push_hooks.mdwn12
-rw-r--r--doc/todo/upgradehooks.mdwn8
-rw-r--r--doc/todo/use_secure_cookies_for_ssl_logins.mdwn36
-rw-r--r--doc/todo/use_templates_for_the_img_plugin.mdwn29
-rw-r--r--doc/todo/usedirs__95__redir_proposed_additional_module.mdwn8
-rw-r--r--doc/todo/user-defined_templates_outside_the_wiki.mdwn10
-rw-r--r--doc/todo/user-subdir_mechanism_like_etc_ikiwiki_wikilist.mdwn3
-rw-r--r--doc/todo/userdir_links.mdwn5
-rw-r--r--doc/todo/utf8.mdwn18
-rw-r--r--doc/todo/varioki_--_add_template_variables___40__with_closures_for_values__41___in_ikiwiki.setup.mdwn272
-rw-r--r--doc/todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both.mdwn376
-rw-r--r--doc/todo/wanted_pages_plugin.mdwn3
-rw-r--r--doc/todo/wdiffs_in_recentchanges.mdwn1
-rw-r--r--doc/todo/web-based_image_editing.mdwn3
-rw-r--r--doc/todo/web_gui_for_managing_tags.mdwn12
-rw-r--r--doc/todo/web_reversion.mdwn73
-rw-r--r--doc/todo/websetup_should_link_to_plugin_descriptions.mdwn3
-rw-r--r--doc/todo/wiki-formatted_comments_with_syntax_plugin.mdwn9
-rw-r--r--doc/todo/wikilink_titles.mdwn4
-rw-r--r--doc/todo/wikilinkfeatures.mdwn4
-rw-r--r--doc/todo/wikitrails.mdwn49
-rw-r--r--doc/todo/wikitrails/discussion.mdwn84
-rw-r--r--doc/todo/wikiwyg.mdwn71
-rw-r--r--doc/todo/wikiwyg/discussion.mdwn181
-rw-r--r--doc/todo/wmd_editor_live_preview.mdwn11
-rw-r--r--doc/todo/wrapperuser.mdwn7
-rw-r--r--doc/translation.mdwn46
-rw-r--r--doc/translation/discussion.mdwn121
-rw-r--r--doc/usage.mdwn389
-rw-r--r--doc/usage/discussion.mdwn1
-rw-r--r--doc/users.mdwn11
-rw-r--r--doc/users/BerndZeimetz.mdwn8
-rw-r--r--doc/users/Christine_Spang.mdwn1
-rw-r--r--doc/users/DamianSmall.mdwn1
-rw-r--r--doc/users/Daniel_Andersson.mdwn3
-rw-r--r--doc/users/DavidBremner.mdwn1
-rw-r--r--doc/users/David_Riebenbauer.mdwn8
-rw-r--r--doc/users/Edward_Betts.mdwn4
-rw-r--r--doc/users/Erkan_Yilmaz.mdwn2
-rw-r--r--doc/users/Gianpaolo_Macario.mdwn14
-rw-r--r--doc/users/GiuseppeBilotta.mdwn6
-rw-r--r--doc/users/HenrikBrixAndersen.mdwn3
-rw-r--r--doc/users/Jamie.mdwn1
-rw-r--r--doc/users/JeremieKoenig.mdwn3
-rw-r--r--doc/users/Jimmy_Tang.mdwn1
-rw-r--r--doc/users/JoshBBall.mdwn3
-rw-r--r--doc/users/Kai_Hendry.mdwn5
-rw-r--r--doc/users/KarlMW.mdwn3
-rw-r--r--doc/users/KarlMW/discussion.mdwn27
-rw-r--r--doc/users/KathrynAndersen.mdwn8
-rw-r--r--doc/users/KathrynAndersen/discussion.mdwn20
-rw-r--r--doc/users/Larry_Clapp.mdwn3
-rw-r--r--doc/users/LucaCapello.mdwn5
-rw-r--r--doc/users/MatthiasIhrke.mdwn4
-rw-r--r--doc/users/Mick_Pollard.mdwn1
-rw-r--r--doc/users/NeilSmithline.mdwn1
-rw-r--r--doc/users/NicolasLimare.mdwn1
-rw-r--r--doc/users/Oblomov.mdwn1
-rw-r--r--doc/users/Olea.mdwn4
-rw-r--r--doc/users/OscarMorante.mdwn3
-rw-r--r--doc/users/Perry.mdwn1
-rw-r--r--doc/users/Ramsey.mdwn3
-rw-r--r--doc/users/Remy.mdwn1
-rw-r--r--doc/users/RickOwens.mdwn1
-rw-r--r--doc/users/Simon_Michael.mdwn8
-rw-r--r--doc/users/Stefano_Zacchiroli.mdwn1
-rw-r--r--doc/users/StevenBlack.mdwn5
-rw-r--r--doc/users/TaylorKillian.mdwn9
-rw-r--r--doc/users/TaylorKillian/discussion.mdwn5
-rw-r--r--doc/users/The_TOVA_Company.mdwn32
-rw-r--r--doc/users/TimBosse.mdwn1
-rw-r--r--doc/users/Tim_Lavoie.mdwn1
-rw-r--r--doc/users/Will.mdwn28
-rw-r--r--doc/users/acathur.mdwn3
-rw-r--r--doc/users/adamshand.mdwn7
-rw-r--r--doc/users/ajt.mdwn20
-rw-r--r--doc/users/aland.mdwn1
-rw-r--r--doc/users/alexander.mdwn1
-rw-r--r--doc/users/alexandredupas.mdwn7
-rw-r--r--doc/users/anarcat.mdwn31
-rw-r--r--doc/users/anarcat.wiki1
-rw-r--r--doc/users/arpitjain.mdwn7
-rw-r--r--doc/users/bartmassey.mdwn5
-rw-r--r--doc/users/bbb.mdwn5
-rw-r--r--doc/users/blipvert.mdwn1
-rw-r--r--doc/users/bstpierre.mdwn1
-rw-r--r--doc/users/cfm.mdwn1
-rw-r--r--doc/users/chris.mdwn7
-rw-r--r--doc/users/chrismgray.mdwn4
-rw-r--r--doc/users/chrysn.mdwn4
-rw-r--r--doc/users/cord.mdwn1
-rw-r--r--doc/users/cstamas.mdwn4
-rw-r--r--doc/users/dark.mdwn3
-rw-r--r--doc/users/dato.mdwn3
-rw-r--r--doc/users/dirk.mdwn1
-rw-r--r--doc/users/dom.mdwn3
-rw-r--r--doc/users/donmarti.mdwn2
-rw-r--r--doc/users/emptty.mdwn2
-rw-r--r--doc/users/ericdrechsel.mdwn1
-rw-r--r--doc/users/fil.mdwn1
-rw-r--r--doc/users/fmarier.mdwn6
-rw-r--r--doc/users/harishcm.mdwn1
-rw-r--r--doc/users/harningt.mdwn11
-rw-r--r--doc/users/hb.mdwn11
-rw-r--r--doc/users/hb/discussion.mdwn6
-rw-r--r--doc/users/hendry.mdwn1
-rw-r--r--doc/users/intrigeri.mdwn4
-rw-r--r--doc/users/iustin.mdwn1
-rw-r--r--doc/users/ivan_shmakov.mdwn54
-rw-r--r--doc/users/jasonblevins.mdwn89
-rw-r--r--doc/users/jasonriedy.mdwn1
-rw-r--r--doc/users/jaywalk.mdwn5
-rw-r--r--doc/users/jcorneli.mdwn3
-rw-r--r--doc/users/jeanprivat.mdwn1
-rw-r--r--doc/users/jelmer.mdwn1
-rw-r--r--doc/users/jeremyreed.mdwn3
-rw-r--r--doc/users/jerojasro.mdwn3
-rw-r--r--doc/users/jmtd.mdwn1
-rw-r--r--doc/users/joey.mdwn8
-rw-r--r--doc/users/jogo.mdwn5
-rw-r--r--doc/users/jon.mdwn65
-rw-r--r--doc/users/jonassmedegaard.mdwn5
-rw-r--r--doc/users/josephturian.mdwn10
-rw-r--r--doc/users/joshtriplett.mdwn16
-rw-r--r--doc/users/joshtriplett/discussion.mdwn68
-rw-r--r--doc/users/jrblevin.mdwn1
-rw-r--r--doc/users/justint.mdwn1
-rw-r--r--doc/users/jwalzer.mdwn3
-rw-r--r--doc/users/kyle.mdwn2
-rw-r--r--doc/users/madduck.mdwn9
-rw-r--r--doc/users/marcelomagallon.mdwn3
-rw-r--r--doc/users/mathdesc.mdwn190
-rw-r--r--doc/users/michaelrasmussen.wiki1
-rw-r--r--doc/users/neale.mdwn10
-rw-r--r--doc/users/nil.mdwn8
-rw-r--r--doc/users/nolan.mdwn1
-rw-r--r--doc/users/patrickwinnertz.mdwn10
-rw-r--r--doc/users/pdurbin.mdwn1
-rw-r--r--doc/users/pelle.mdwn1
-rw-r--r--doc/users/perolofsson.mdwn7
-rw-r--r--doc/users/peteg.mdwn7
-rw-r--r--doc/users/peter_woodman.mdwn1
-rw-r--r--doc/users/ptecza.mdwn21
-rw-r--r--doc/users/rubykat.mdwn1
-rw-r--r--doc/users/sabr.mdwn32
-rw-r--r--doc/users/sabr/sub1.mdwn1
-rw-r--r--doc/users/sabr/sub2.mdwn1
-rw-r--r--doc/users/schmonz-web-ikiwiki.mdwn1
-rw-r--r--doc/users/schmonz.mdwn32
-rw-r--r--doc/users/seanh.mdwn1
-rw-r--r--doc/users/simonraven.mdwn7
-rw-r--r--doc/users/smcv.mdwn10
-rw-r--r--doc/users/smcv/gallery.mdwn4
-rw-r--r--doc/users/smcv/gallery/discussion.mdwn18
-rw-r--r--doc/users/solofo.mdwn1
-rw-r--r--doc/users/sphynkx.mdwn1
-rw-r--r--doc/users/sunny256.mdwn15
-rw-r--r--doc/users/svend.mdwn4
-rw-r--r--doc/users/tbm.mdwn3
-rw-r--r--doc/users/tjgolubi.mdwn3
-rw-r--r--doc/users/tschwinge.mdwn151
-rw-r--r--doc/users/ttw.mdwn1
-rw-r--r--doc/users/tupyakov_vladimir.mdwn1
-rw-r--r--doc/users/tychoish.mdwn10
-rw-r--r--doc/users/ulrik.mdwn3
-rw-r--r--doc/users/undx.mdwn7
-rw-r--r--doc/users/victormoral.mdwn6
-rw-r--r--doc/users/weakish.mdwn3
-rw-r--r--doc/users/weakishjiang.mdwn4
-rw-r--r--doc/users/wentasah.mdwn9
-rw-r--r--doc/users/wiebel.mdwn5
-rw-r--r--doc/users/wtk.mdwn6
-rw-r--r--doc/users/xma/discussion.mdwn18
-rw-r--r--doc/users/xtaran.mdwn5
-rw-r--r--doc/users/yds.mdwn1
-rw-r--r--doc/w3mmode.mdwn11
-rw-r--r--doc/w3mmode/ikiwiki.setup31
-rw-r--r--doc/whyikiwiki.mdwn15
-rw-r--r--doc/wikiicons/diff.pngbin0 -> 219 bytes
-rw-r--r--doc/wikiicons/openidlogin-bg.gifbin0 -> 336 bytes
-rw-r--r--doc/wikiicons/revert.pngbin0 -> 397 bytes
-rw-r--r--doc/wikiicons/search-bg.gifbin0 -> 74 bytes
-rw-r--r--doc/wishlist.mdwn6
-rw-r--r--doc/wishlist/watched_pages.mdwn1
2369 files changed, 83302 insertions, 0 deletions
diff --git a/doc/GPL b/doc/GPL
new file mode 100644
index 000000000..b7b5f53df
--- /dev/null
+++ b/doc/GPL
@@ -0,0 +1,340 @@
+ GNU GENERAL PUBLIC LICENSE
+ Version 2, June 1991
+
+ Copyright (C) 1989, 1991 Free Software Foundation, Inc.
+ 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
+ Everyone is permitted to copy and distribute verbatim copies
+ of this license document, but changing it is not allowed.
+
+ Preamble
+
+ The licenses for most software are designed to take away your
+freedom to share and change it. By contrast, the GNU General Public
+License is intended to guarantee your freedom to share and change free
+software--to make sure the software is free for all its users. This
+General Public License applies to most of the Free Software
+Foundation's software and to any other program whose authors commit to
+using it. (Some other Free Software Foundation software is covered by
+the GNU Library General Public License instead.) You can apply it to
+your programs, too.
+
+ When we speak of free software, we are referring to freedom, not
+price. Our General Public Licenses are designed to make sure that you
+have the freedom to distribute copies of free software (and charge for
+this service if you wish), that you receive source code or can get it
+if you want it, that you can change the software or use pieces of it
+in new free programs; and that you know you can do these things.
+
+ To protect your rights, we need to make restrictions that forbid
+anyone to deny you these rights or to ask you to surrender the rights.
+These restrictions translate to certain responsibilities for you if you
+distribute copies of the software, or if you modify it.
+
+ For example, if you distribute copies of such a program, whether
+gratis or for a fee, you must give the recipients all the rights that
+you have. You must make sure that they, too, receive or can get the
+source code. And you must show them these terms so they know their
+rights.
+
+ We protect your rights with two steps: (1) copyright the software, and
+(2) offer you this license which gives you legal permission to copy,
+distribute and/or modify the software.
+
+ Also, for each author's protection and ours, we want to make certain
+that everyone understands that there is no warranty for this free
+software. If the software is modified by someone else and passed on, we
+want its recipients to know that what they have is not the original, so
+that any problems introduced by others will not reflect on the original
+authors' reputations.
+
+ Finally, any free program is threatened constantly by software
+patents. We wish to avoid the danger that redistributors of a free
+program will individually obtain patent licenses, in effect making the
+program proprietary. To prevent this, we have made it clear that any
+patent must be licensed for everyone's free use or not licensed at all.
+
+ The precise terms and conditions for copying, distribution and
+modification follow.
+
+ GNU GENERAL PUBLIC LICENSE
+ TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
+
+ 0. This License applies to any program or other work which contains
+a notice placed by the copyright holder saying it may be distributed
+under the terms of this General Public License. The "Program", below,
+refers to any such program or work, and a "work based on the Program"
+means either the Program or any derivative work under copyright law:
+that is to say, a work containing the Program or a portion of it,
+either verbatim or with modifications and/or translated into another
+language. (Hereinafter, translation is included without limitation in
+the term "modification".) Each licensee is addressed as "you".
+
+Activities other than copying, distribution and modification are not
+covered by this License; they are outside its scope. The act of
+running the Program is not restricted, and the output from the Program
+is covered only if its contents constitute a work based on the
+Program (independent of having been made by running the Program).
+Whether that is true depends on what the Program does.
+
+ 1. You may copy and distribute verbatim copies of the Program's
+source code as you receive it, in any medium, provided that you
+conspicuously and appropriately publish on each copy an appropriate
+copyright notice and disclaimer of warranty; keep intact all the
+notices that refer to this License and to the absence of any warranty;
+and give any other recipients of the Program a copy of this License
+along with the Program.
+
+You may charge a fee for the physical act of transferring a copy, and
+you may at your option offer warranty protection in exchange for a fee.
+
+ 2. You may modify your copy or copies of the Program or any portion
+of it, thus forming a work based on the Program, and copy and
+distribute such modifications or work under the terms of Section 1
+above, provided that you also meet all of these conditions:
+
+ a) You must cause the modified files to carry prominent notices
+ stating that you changed the files and the date of any change.
+
+ b) You must cause any work that you distribute or publish, that in
+ whole or in part contains or is derived from the Program or any
+ part thereof, to be licensed as a whole at no charge to all third
+ parties under the terms of this License.
+
+ c) If the modified program normally reads commands interactively
+ when run, you must cause it, when started running for such
+ interactive use in the most ordinary way, to print or display an
+ announcement including an appropriate copyright notice and a
+ notice that there is no warranty (or else, saying that you provide
+ a warranty) and that users may redistribute the program under
+ these conditions, and telling the user how to view a copy of this
+ License. (Exception: if the Program itself is interactive but
+ does not normally print such an announcement, your work based on
+ the Program is not required to print an announcement.)
+
+These requirements apply to the modified work as a whole. If
+identifiable sections of that work are not derived from the Program,
+and can be reasonably considered independent and separate works in
+themselves, then this License, and its terms, do not apply to those
+sections when you distribute them as separate works. But when you
+distribute the same sections as part of a whole which is a work based
+on the Program, the distribution of the whole must be on the terms of
+this License, whose permissions for other licensees extend to the
+entire whole, and thus to each and every part regardless of who wrote it.
+
+Thus, it is not the intent of this section to claim rights or contest
+your rights to work written entirely by you; rather, the intent is to
+exercise the right to control the distribution of derivative or
+collective works based on the Program.
+
+In addition, mere aggregation of another work not based on the Program
+with the Program (or with a work based on the Program) on a volume of
+a storage or distribution medium does not bring the other work under
+the scope of this License.
+
+ 3. You may copy and distribute the Program (or a work based on it,
+under Section 2) in object code or executable form under the terms of
+Sections 1 and 2 above provided that you also do one of the following:
+
+ a) Accompany it with the complete corresponding machine-readable
+ source code, which must be distributed under the terms of Sections
+ 1 and 2 above on a medium customarily used for software interchange; or,
+
+ b) Accompany it with a written offer, valid for at least three
+ years, to give any third party, for a charge no more than your
+ cost of physically performing source distribution, a complete
+ machine-readable copy of the corresponding source code, to be
+ distributed under the terms of Sections 1 and 2 above on a medium
+ customarily used for software interchange; or,
+
+ c) Accompany it with the information you received as to the offer
+ to distribute corresponding source code. (This alternative is
+ allowed only for noncommercial distribution and only if you
+ received the program in object code or executable form with such
+ an offer, in accord with Subsection b above.)
+
+The source code for a work means the preferred form of the work for
+making modifications to it. For an executable work, complete source
+code means all the source code for all modules it contains, plus any
+associated interface definition files, plus the scripts used to
+control compilation and installation of the executable. However, as a
+special exception, the source code distributed need not include
+anything that is normally distributed (in either source or binary
+form) with the major components (compiler, kernel, and so on) of the
+operating system on which the executable runs, unless that component
+itself accompanies the executable.
+
+If distribution of executable or object code is made by offering
+access to copy from a designated place, then offering equivalent
+access to copy the source code from the same place counts as
+distribution of the source code, even though third parties are not
+compelled to copy the source along with the object code.
+
+ 4. You may not copy, modify, sublicense, or distribute the Program
+except as expressly provided under this License. Any attempt
+otherwise to copy, modify, sublicense or distribute the Program is
+void, and will automatically terminate your rights under this License.
+However, parties who have received copies, or rights, from you under
+this License will not have their licenses terminated so long as such
+parties remain in full compliance.
+
+ 5. You are not required to accept this License, since you have not
+signed it. However, nothing else grants you permission to modify or
+distribute the Program or its derivative works. These actions are
+prohibited by law if you do not accept this License. Therefore, by
+modifying or distributing the Program (or any work based on the
+Program), you indicate your acceptance of this License to do so, and
+all its terms and conditions for copying, distributing or modifying
+the Program or works based on it.
+
+ 6. Each time you redistribute the Program (or any work based on the
+Program), the recipient automatically receives a license from the
+original licensor to copy, distribute or modify the Program subject to
+these terms and conditions. You may not impose any further
+restrictions on the recipients' exercise of the rights granted herein.
+You are not responsible for enforcing compliance by third parties to
+this License.
+
+ 7. If, as a consequence of a court judgment or allegation of patent
+infringement or for any other reason (not limited to patent issues),
+conditions are imposed on you (whether by court order, agreement or
+otherwise) that contradict the conditions of this License, they do not
+excuse you from the conditions of this License. If you cannot
+distribute so as to satisfy simultaneously your obligations under this
+License and any other pertinent obligations, then as a consequence you
+may not distribute the Program at all. For example, if a patent
+license would not permit royalty-free redistribution of the Program by
+all those who receive copies directly or indirectly through you, then
+the only way you could satisfy both it and this License would be to
+refrain entirely from distribution of the Program.
+
+If any portion of this section is held invalid or unenforceable under
+any particular circumstance, the balance of the section is intended to
+apply and the section as a whole is intended to apply in other
+circumstances.
+
+It is not the purpose of this section to induce you to infringe any
+patents or other property right claims or to contest validity of any
+such claims; this section has the sole purpose of protecting the
+integrity of the free software distribution system, which is
+implemented by public license practices. Many people have made
+generous contributions to the wide range of software distributed
+through that system in reliance on consistent application of that
+system; it is up to the author/donor to decide if he or she is willing
+to distribute software through any other system and a licensee cannot
+impose that choice.
+
+This section is intended to make thoroughly clear what is believed to
+be a consequence of the rest of this License.
+
+ 8. If the distribution and/or use of the Program is restricted in
+certain countries either by patents or by copyrighted interfaces, the
+original copyright holder who places the Program under this License
+may add an explicit geographical distribution limitation excluding
+those countries, so that distribution is permitted only in or among
+countries not thus excluded. In such case, this License incorporates
+the limitation as if written in the body of this License.
+
+ 9. The Free Software Foundation may publish revised and/or new versions
+of the General Public License from time to time. Such new versions will
+be similar in spirit to the present version, but may differ in detail to
+address new problems or concerns.
+
+Each version is given a distinguishing version number. If the Program
+specifies a version number of this License which applies to it and "any
+later version", you have the option of following the terms and conditions
+either of that version or of any later version published by the Free
+Software Foundation. If the Program does not specify a version number of
+this License, you may choose any version ever published by the Free Software
+Foundation.
+
+ 10. If you wish to incorporate parts of the Program into other free
+programs whose distribution conditions are different, write to the author
+to ask for permission. For software which is copyrighted by the Free
+Software Foundation, write to the Free Software Foundation; we sometimes
+make exceptions for this. Our decision will be guided by the two goals
+of preserving the free status of all derivatives of our free software and
+of promoting the sharing and reuse of software generally.
+
+ NO WARRANTY
+
+ 11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
+FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
+OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
+PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
+OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
+MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
+TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
+PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
+REPAIR OR CORRECTION.
+
+ 12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
+WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
+REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
+INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
+OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
+TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
+YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
+PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
+POSSIBILITY OF SUCH DAMAGES.
+
+ END OF TERMS AND CONDITIONS
+
+ How to Apply These Terms to Your New Programs
+
+ If you develop a new program, and you want it to be of the greatest
+possible use to the public, the best way to achieve this is to make it
+free software which everyone can redistribute and change under these terms.
+
+ To do so, attach the following notices to the program. It is safest
+to attach them to the start of each source file to most effectively
+convey the exclusion of warranty; and each file should have at least
+the "copyright" line and a pointer to where the full notice is found.
+
+ <one line to give the program's name and a brief idea of what it does.>
+ Copyright (C) <year> <name of author>
+
+ This program is free software; you can redistribute it and/or modify
+ it under the terms of the GNU General Public License as published by
+ the Free Software Foundation; either version 2 of the License, or
+ (at your option) any later version.
+
+ This program is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+ GNU General Public License for more details.
+
+ You should have received a copy of the GNU General Public License
+ along with this program; if not, write to the Free Software
+ Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
+
+
+Also add information on how to contact you by electronic and paper mail.
+
+If the program is interactive, make it output a short notice like this
+when it starts in an interactive mode:
+
+ Gnomovision version 69, Copyright (C) year name of author
+ Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
+ This is free software, and you are welcome to redistribute it
+ under certain conditions; type `show c' for details.
+
+The hypothetical commands `show w' and `show c' should show the appropriate
+parts of the General Public License. Of course, the commands you use may
+be called something other than `show w' and `show c'; they could even be
+mouse-clicks or menu items--whatever suits your program.
+
+You should also get your employer (if you work as a programmer) or your
+school, if any, to sign a "copyright disclaimer" for the program, if
+necessary. Here is a sample; alter the names:
+
+ Yoyodyne, Inc., hereby disclaims all copyright interest in the program
+ `Gnomovision' (which makes passes at compilers) written by James Hacker.
+
+ <signature of Ty Coon>, 1 April 1989
+ Ty Coon, President of Vice
+
+This General Public License does not permit incorporating your program into
+proprietary programs. If your program is a subroutine library, you may
+consider it more useful to permit linking proprietary applications with the
+library. If this is what you want to do, use the GNU Library General
+Public License instead of this License.
diff --git a/doc/TourBusStop.mdwn b/doc/TourBusStop.mdwn
new file mode 100644
index 000000000..8e382bd77
--- /dev/null
+++ b/doc/TourBusStop.mdwn
@@ -0,0 +1,30 @@
+This is the **ikiwiki** TourBus stop.
+
+This wiki serves as the home for the ikiwiki wiki engine, providing collaborative development, documentation, and support.
+
+[[ikiwiki|/index]] provides a wiki engine with several [[/features]] unique or uncommon amongst wiki engines:
+
+* Rather than inventing yet another simplistic, linear version control system, ikiwiki uses a standard version control system such as [[Subversion|rcs/svn]] or [[rcs/Git]]. You can edit a wiki by committing to your repository, as well as through a traditional web interface. This makes ikiwiki ideal for collaborative software development; just keep your wiki in version control next to your software. You can also take full advantage of the features of these systems; for instance, you can keep a local branch of your wiki via [[rcs/Git]].
+
+* You can turn any set of pages into a [[blog]] or similar news feed,
+  complete with RSS and Atom support. You can run your weblog on ikiwiki (and [[many_people_do|ikiwikiusers]]), run a Planet-like [[aggregator|plugins/aggregate]] for external feeds, or keep a [[TODO]] and [[bug|bugs]] list with tags for completed items.
+
+* ikiwiki provides a wiki compiler, designed to transform your wiki content into a set of static pages. You can then serve these pages as static content. ikiwiki will not fall over during a Slashdotting, because page views don't require the ikiwiki CGI; as long as your web server can keep up, your site will survive. Furthermore, you can choose whether you want to run the ikiwiki CGI for web edits or only handle commits to the underlying version control system; you can even run ikiwiki privately and just manually copy the content to another server. So if you want to put a wiki up on a server without installing any software on that server, try ikiwiki.
+
+![Picture of the TourBus](http://www.wikiservice.at/image/wikibus.gif)
+
+Bus connections
+===============
+
+* Bus Nr. 02 - **Wiki Developers Tour** - next stop: [MeatballWiki TourBusStop](http://www.usemod.com/cgi-bin/mb.pl?TourBusStop)
+* Bus Nr. 42 - **Software Developers Tour** - next stop: [Ward's Wiki TourBusStop](http://c2.com/cgi/wiki?TourBusStop)
+
+[[!meatballwiki TourBusMap]]
+
+Famous sights to visit here at **ikiwiki**
+==========================================
+
+* [[features]]: See what ikiwiki can do, and why you might choose it for your wiki.
+* [[ikiwikiusers]]: The list of projects, personal sites, and blogs that use ikiwiki.
+* [[plugins]]: See the many ways people have extended ikiwiki.
+* [[examples]]: Example sites built using ikiwiki.
diff --git a/doc/anchor.mdwn b/doc/anchor.mdwn
new file mode 100644
index 000000000..12d193fe9
--- /dev/null
+++ b/doc/anchor.mdwn
@@ -0,0 +1,11 @@
+ikiwiki works with anchors in various situations.
+
+You can insert anchors directly in the body of a page and it will be used on the resulting HTML, for example:
+
+ <a name="anchor"></a>
+
+... will make the link [[anchor#anchor]] work.
+
+<a name="anchor"></a>
+
+This page accumulates links to the concept of anchors.
diff --git a/doc/backlinks.mdwn b/doc/backlinks.mdwn
new file mode 100644
index 000000000..192bc742a
--- /dev/null
+++ b/doc/backlinks.mdwn
@@ -0,0 +1,2 @@
+BackLinks are links from a page back to pages that link to it. They're automatically added, and you'll see them at the bottom of most pages in the wiki.
+This aids in navigating around and finding related stuff. \ No newline at end of file
diff --git a/doc/banned_users.mdwn b/doc/banned_users.mdwn
new file mode 100644
index 000000000..c44f8c587
--- /dev/null
+++ b/doc/banned_users.mdwn
@@ -0,0 +1,10 @@
+Banned users can be configured in the setup file via the `banned_users`
+setting. This is a list of user names, or [[PageSpecs|ikiwiki/PageSpec]]
+to ban. Using a PageSpec is useful to block an IP address.
+
+For example:
+
+ banned_users => ['evilspammer', 'ip(192.168.1.1)'],
+
+If a banned user attempts to use the ikiwiki CGI, they will receive a 403
+Forbidden webpage indicating they are banned.
diff --git a/doc/banned_users/discussion.mdwn b/doc/banned_users/discussion.mdwn
new file mode 100644
index 000000000..ca873e64c
--- /dev/null
+++ b/doc/banned_users/discussion.mdwn
@@ -0,0 +1,31 @@
+Jeremy, what about OpenID users? Do they also have Preferences?
+I don't use openid plugin in my ikiwiki, so I can't check
+whether OpenID users are stored in .ikiwiki/userdb file. --[[Paweł|ptecza]]
+
+> OpenID users are first-class users. It's even possible to set the
+> adminuser config option to an OpenID. The preferences work the same as
+> for any other user. --[[Joey]]
+
+>> Thanks a lot for your explanation! But I have another question :)
+>> What about the passwords of OpenID users? I can see on the Preferences
+>> page that I can change my password there. Does that make sense?
+>> After all, I store my OpenID password on the OpenID server and I don't
+>> want to store it on the ikiwiki site as well. --[[Paweł|ptecza]]
+
+>>> This wiki is currently configured to use both passwordauth and openid, so
+>>> it's possible to create an account by signing in with an openid, then
+>>> enter a password for that account, and then log in using the openid and
+>>> the password. Though I can't imagine why you'd want to do that. The
+>>> password stuff goes away if passwordauth is disabled.
+
+>>>> OK, I see it now :) The reason I won't do it (since I don't need it)
+>>>> is very simple: why put my password in two places if I can
+>>>> do it in one? ;)
+
+>>>> BTW, did you get any sleep last night? ;) Here in Poland the time is
+>>>> 1:30 PM (in your timezone it's 7:30 AM), but we've been discussing for
+>>>> about 3 hours... --[[Paweł|ptecza]]
+
+----
+
+I would be quite interested in inheriting a banned users list from ikiwiki. More generally, the ikiwiki community might benefit from sharing ban lists amongst each other. Some way to achieve that as part of the existing work/data flows (git pulls etc.) would be interesting. That might require defining banned users in other places than just the setup file, though. -- [[Jon]]
diff --git a/doc/basewiki.mdwn b/doc/basewiki.mdwn
new file mode 100644
index 000000000..8392884eb
--- /dev/null
+++ b/doc/basewiki.mdwn
@@ -0,0 +1,26 @@
+The basewiki is a standard set of wiki pages that are included by default in
+all wikis that ikiwiki builds, unless the wiki overrides them with its own
+versions.
+
+It currently includes these pages:
+
+* [[index]]
+* [[sandbox]]
+* [[shortcuts]]
+* [[templates]]
+* [[ikiwiki/formatting]]
+* [[ikiwiki/markdown]]
+* [[ikiwiki/openid]]
+* [[ikiwiki/pagespec]]
+* [[ikiwiki/directive]]
+* [[ikiwiki/subpage]]
+* [[ikiwiki/wikilink]]
+
+As well as a few other files, like [[favicon.ico]], [[local.css]],
+[[style.css]], and some icons.
+
+Note that an important property of the basewiki is that it should be
+self-contained. That means that the pages listed above cannot link
+to pages outside the basewiki. Ikiwiki's test suite checks that the
+basewiki is self-contained, and from time to time links have to be
+removed (or replaced with `iki` [[shortcuts]]) to keep this invariant.
diff --git a/doc/basewiki/index.mdwn b/doc/basewiki/index.mdwn
new file mode 100644
index 000000000..4187c1162
--- /dev/null
+++ b/doc/basewiki/index.mdwn
@@ -0,0 +1,7 @@
+Welcome to your new wiki.
+
+All wikis are supposed to have a [[SandBox]], so this one does too.
+
+----
+
+This wiki is powered by [[ikiwiki]].
diff --git a/doc/basewiki/sandbox.mdwn b/doc/basewiki/sandbox.mdwn
new file mode 100644
index 000000000..c66534fc2
--- /dev/null
+++ b/doc/basewiki/sandbox.mdwn
@@ -0,0 +1,32 @@
+This is the SandBox, a page anyone can edit to learn how to use the wiki.
+
+----
+
+Here's a paragraph.
+
+Here's another one with *emphasised* text.
+
+# Header
+
+## Subheader
+
+> This is a blockquote.
+>
+> This is the first level of quoting.
+>
+> > This is a nested blockquote.
+>
+> Back to the first level.
+
+Numbered list
+
+1. First item.
+1. Another.
+1. And another.
+
+Bulleted list
+
+* *item*
+* item
+
+[[ikiwiki/WikiLink]]
diff --git a/doc/blog.mdwn b/doc/blog.mdwn
new file mode 100644
index 000000000..c4a379fdb
--- /dev/null
+++ b/doc/blog.mdwn
@@ -0,0 +1,4 @@
+Ikiwiki allows turning any page into a weblog, by using the
+[[ikiwiki/directive/inline]] [[ikiwiki/directive]]. For example:
+
+ \[[!inline pages="blog/* and !*/Discussion" show="10" rootpage="blog"]]
diff --git a/doc/branches.mdwn b/doc/branches.mdwn
new file mode 100644
index 000000000..232f2ce6a
--- /dev/null
+++ b/doc/branches.mdwn
@@ -0,0 +1,25 @@
+In order to refer to a branch in one of the [[git]] repositories, for
+example when submitting a [[patch]], you can use the
+[[templates/gitbranch]] template. For example:
+
+ \[[!template id=gitbranch branch=yourrepo/amazingbranch author="\[[yourname]]"]]
+
+Branches that have been [[reviewed]] and need work will not be listed
+here.
+
+Branches referred to in open [[bugs]] and [[todo]]:
+
+[[!inline pages="(todo/* or bugs/*) and link(/branches) and !link(bugs/done)
+and !link(todo/done) and !*/*/*" show=0 archive=yes]]
+
+Long-lived branches in the main git repository:
+
+* `debian-stable` is used for updates to the old version included in
+ Debian's stable release, and `debian-testing` is used for updates to
+ Debian's testing release. (These and similar branches will be rebased.)
+* `ignore` gets various branches merged to it that [[Joey]] wishes to ignore
+ when looking at everyone's unmerged changes.
+* `pristine-tar` contains deltas that
+ [pristine-tar](http://joeyh.name/code/pristine-tar)
+ can use to recreate released tarballs of ikiwiki
+* `setup` contains the ikiwiki.setup file for this site
diff --git a/doc/bugs.mdwn b/doc/bugs.mdwn
new file mode 100644
index 000000000..f16a4f8e1
--- /dev/null
+++ b/doc/bugs.mdwn
@@ -0,0 +1,13 @@
+If you've found a bug in ikiwiki, post about it here. [[TODO]] items go
+elsewhere. Link items to [[bugs/done]] when done.
+
+Also see the [Debian bugs](http://bugs.debian.org/ikiwiki).
+
+There are [[!pagecount pages="bugs/* and !bugs/done and !bugs/discussion and
+!link(patch) and !link(bugs/done) and !bugs/*/*"
+feedpages="created_after(bugs/no_commit_mails_for_new_pages)"]] "open" bugs:
+
+[[!inline pages="bugs/* and !bugs/done and !bugs/discussion and
+!link(patch) and !link(bugs/done) and !bugs/*/*"
+feedpages="created_after(bugs/no_commit_mails_for_new_pages)"
+actions=yes rootpage="bugs" postformtext="Add a new bug titled:" show=0]]
diff --git a/doc/bugs/2.45_Compilation_error.mdwn b/doc/bugs/2.45_Compilation_error.mdwn
new file mode 100644
index 000000000..63147b656
--- /dev/null
+++ b/doc/bugs/2.45_Compilation_error.mdwn
@@ -0,0 +1,198 @@
+I have perl 5.10.0. Ikiwiki 2.44 compiles fine. Compiling 2.45 fails after 'make':
+
+ perl -Iblib/lib ikiwiki.out -libdir . -setup docwiki.setup -refresh
+ refreshing wiki..
+ docwiki.setup: Failed to load plugin IkiWiki::Plugin::goodstuff: Failed to load plugin IkiWiki::Plugin::shortcut: Too many arguments for IkiWiki::srcfile at IkiWiki/Plugin/shortcut.pm line 16, near "1)"
+ Compilation failed in require at (eval 31) line 2.
+ BEGIN failed--compilation aborted at (eval 31) line 2.
+ BEGIN failed--compilation aborted at (eval 23) line 2.
+ BEGIN failed--compilation aborted at (eval 10) line 21.
+ make: *** [extra_build] Error 255
+
+> I can't reproduce this. It looks like your IkiWiki.pm is out of sync with
+> your IkiWiki/Plugin/shortcut.pm. The ones distributed in 2.45 are in
+> sync. Or your perl is failing to use the right version of Ikiwiki.pm,
+> perhaps using a previously installed version. But the -Iblib/lib
+> instructs perl to look in that directory first, and the Makefile
+> puts Ikiwiki.pm there. --[[Joey]]
+
+>> I removed all traces of the previous installation, and now 2.45 compiles.
+>> I don't know why it was picking up the old version of Ikiwiki.pm, but now it
+>> works. Please close this bug, and thanks for the help.
+
+>>> Where were the files from the old installation? I still don't
+>>> understand why they would be seen, since -Iblib/lib is passed to perl.
+>>> --[[Joey]]
+
+>>>> They were under /usr/local/{bin,lib,share}. I can try to provide more info,
+>>>> or try to reproduce it, if you need me to.
+
+>>>>> Well, here are some things to try.
+
+ perl -Iblib/lib -V
+
+>>>>> This should have blib/lib first in the listed @INC
+
+ joey@kodama:~/src/ikiwiki>strace perl -Iblib/lib -e 'use IkiWiki' 2>&1 |grep IkiWiki.pm
+ stat64("blib/lib/IkiWiki.pmc", 0xbfa1594c) = -1 ENOENT (No such file or directory)
+ stat64("blib/lib/IkiWiki.pm", {st_mode=S_IFREG|0444, st_size=31982, ...}) = 0
+ open("blib/lib/IkiWiki.pm", O_RDONLY|O_LARGEFILE) = 5
+
+>>>>> This is how perl finds IkiWiki.pm here. Note that I've run "make" first.
+
+OK, this is what I'm getting:
+
+ $ perl -Iblib/lib -V
+ @INC:
+ blib/lib
+ /usr/lib/perl5/site_perl/5.10.0
+ /usr/share/perl5/site_perl/5.10.0
+ /usr/lib/perl5/vendor_perl
+ /usr/share/perl5/vendor_perl
+ /usr/share/perl5/vendor_perl
+ /usr/lib/perl5/core_perl
+ /usr/share/perl5/core_perl
+ /usr/lib/perl5/current
+ /usr/lib/perl5/site_perl/current
+
+I ran the following in my current 2.45 source dir, where the `make` already succeeded. If you need it, I can post the output
+in the case where `make` fails.
+
+ $ strace perl -Iblib/lib -e 'use IkiWiki' 2>&1 |grep IkiWiki.pm
+ stat64("blib/lib/IkiWiki.pmc", 0xbfa6167c) = -1 ENOENT (No such file or directory)
+ stat64("blib/lib/IkiWiki.pm", {st_mode=S_IFREG|0444, st_size=31901, ...}) = 0
+ open("blib/lib/IkiWiki.pm", O_RDONLY|O_LARGEFILE) = 3
+
+> I need to see it in the case where it's failing. --[[Joey]]
+
+I finally had some time to look into this again.
+
+I wiped ikiwiki off my system, and then installed version 2.41. I tried installing
+2.46 and get the same error as above, so I'll be using 2.46 below. (BTW, the Debian
+page still lists 2.45 as current; I had to fiddle with the download link to get 2.46).
+
+After running `./Makefile.PL` I get:
+
+ $ perl -Iblib/lib -V
+ [bunch of lines snipped]
+ @INC:
+ blib/lib
+ [bunch of paths snipped]
+
+Running the strace:
+
+ $ strace perl -Iblib/lib -e 'use IkiWiki' 2>&1 |grep IkiWiki.pm
+
+I get a bunch of ENOENTs and then at the end:
+
+ stat64("./IkiWiki.pmc", 0xbfa2fe5c) = -1 ENOENT (No such file or directory)
+ stat64("./IkiWiki.pm", {st_mode=S_IFREG|0644, st_size=31987, ...}) = 0
+ open("./IkiWiki.pm", O_RDONLY|O_LARGEFILE) = 3
+
+After running `make` (and having it fail as described above):
+
+ $ strace perl -Iblib/lib -e 'use IkiWiki' 2>&1 |grep IkiWiki.pm
+ stat64("blib/lib/IkiWiki.pmc", 0xbfd7999c) = -1 ENOENT (No such file or directory)
+ stat64("blib/lib/IkiWiki.pm", {st_mode=S_IFREG|0444, st_size=31901, ...}) = 0
+ open("blib/lib/IkiWiki.pm", O_RDONLY|O_LARGEFILE) = 3
+
+I don't know what is going on, but I'll run any more tests you need me to.
+
+> No help.
+> The only further thing I can think to try is `strace -f` the entire failing
+> `make` run (or the ikiwiki command that's failing in it, if you can
+> reproduce the failure at the command line). --[[Joey]]
+
+I have 2.46 installed and I can reproduce the bug reported against 2.49. The command that fails is:
+
+ $ /usr/bin/perl -Iblib/lib ikiwiki.out -libdir . -setup docwiki.setup -refresh
+ docwiki.setup: Failed to load plugin IkiWiki::Plugin::inline: Too many arguments for IkiWiki::htmlize at IkiWiki/Plugin/inline.pm line 359, near "))"
+ Compilation failed in require at (eval 14) line 2.
+ BEGIN failed--compilation aborted at (eval 14) line 2.
+ BEGIN failed--compilation aborted at (eval 10) line 21.
+
+strace -f produces a 112K file. I don't know enough to be comfortable analyzing it.
+However, lines like:
+
+ stat64("/usr/local/share/perl5/site_perl/5.10.0/IkiWiki.pm", {st_mode=S_IFREG|0444, st_size=31982, ...}) = 0
+
+make me think the make process is not completely independent of a previous
+installation. Joey, should I email you the strace log file?
+
+> Email it (joey@ikiwiki.info), or post it to a website somewhere.
+> --[[Joey]]
+
+> The relevant part of the file is:
+
+ execve("/usr/bin/perl", ["/usr/bin/perl", "-Iblib/lib", "ikiwiki.out", "-libdir", ".", "-setup", "docwiki.setup", "-refresh"], [/* 55 vars */]) = 0
+ [...]
+ stat64("blib/lib/5.10.0/i686-linux-thread-multi", 0xbfa72240) = -1 ENOENT (No such file or directory)
+ stat64("blib/lib/5.10.0", 0xbfa72240) = -1 ENOENT (No such file or directory)
+ stat64("blib/lib/i686-linux-thread-multi", 0xbfa72240) = -1 ENOENT (No such file or directory)
+ [...]
+ stat64("/usr/local/share/perl5/site_perl/5.10.0/IkiWiki.pmc", 0xbfa71e5c) = -1 ENOENT (No such file or directory)
+ stat64("/usr/local/share/perl5/site_perl/5.10.0/IkiWiki.pm", {st_mode=S_IFREG|0444, st_size=31982, ...}) = 0
+ open("/usr/local/share/perl5/site_perl/5.10.0/IkiWiki.pm", O_RDONLY|O_LARGEFILE) = 4
+
+> So it doesn't look for IkiWiki.pm in blib at all. But it clearly has been asked to look in blib, since it
+> looks for the 3 directories in it. When I run the same thing locally, I get:
+
+ execve("/usr/bin/perl", ["/usr/bin/perl", "-Iblib/lib", "ikiwiki.out", "-libdir", ".", "-setup", "docwiki.setup", "-refresh"], [/* 55 vars */]) = 0
+ [...]
+ stat64("blib/lib/5.10.0/i486-linux-gnu-thread-multi", 0xbf84f320) = -1 ENOENT (No such file or directory)
+ stat64("blib/lib/5.10.0", 0xbf84f320) = -1 ENOENT (No such file or directory)
+ stat64("blib/lib/i486-linux-gnu-thread-multi", 0xbf84f320) = -1 ENOENT (No such file or directory)
+ [...]
+ stat64("blib/lib/IkiWiki.pmc", 0xbf84ef4c) = -1 ENOENT (No such file or directory)
+ stat64("blib/lib/IkiWiki.pm", {st_mode=S_IFREG|0444, st_size=32204, ...}) = 0
+ open("blib/lib/IkiWiki.pm", O_RDONLY|O_LARGEFILE) = 6
+
+> The thing I really don't understand is why, on the system where perl fails
+> to look in blib when straced as above, we've already established it *does*
+> look for it when `perl -Iblib/lib -e 'use IkiWiki'` is straced.
+>
+> The only differences between the two calls to perl seem to be:
+> * One runs `perl`, and the other `/usr/bin/perl` -- are these really
+> the same program? Does `perl -lblib/lib ikiwiki.out -libdir . -setup docwiki.setup -refresh`
+> fail the same way as the `/usr/bin/perl` variant?
+> * The `-libdir .`, which causes ikiwiki to modify `@INC`, adding "." to
+> the front of it.
+>
+> I'm entirely at a loss as to why I cannot reproduce this with the same
+> versions of perl and ikiwiki as the two people who reported it. There must
+> be something unusual about your systems that we have not figured out yet. --[[Joey]]
+
+Joey, thanks for your time and effort looking into this.
+
+I checked with `which`: `perl` is indeed `/usr/bin/perl`. The commands fail similarly when
+calling `perl` and `/usr/bin/perl`.
+
+However, you might be onto something with your `libdir` idea. If I remove it from the
+command line, the command succeeds. In other words, if I run
+
+ perl -Iblib/lib ikiwiki.out -setup docwiki.setup -refresh
+
+then it works perfectly.
+
+> Well, that's just weird, because `libdir` is handled by code in IkiWiki.pm.
+> So I don't see how setting it could affect its searching for IkiWiki.pm at all,
+> actually. It could only affect its searching for files loaded later. Anyway,
+> can I get a strace of it succeeding this way?
+>
+> Also, can you show me the first 15 lines of your `ikiwiki.out`? It's occurred to me
+> you might have an unusual `use lib` line in it.
+
+By the way, I'm running Arch linux. The perl build script is a bit long, but I
+see they install a patch to modify @INC: <http://repos.archlinux.org/viewvc.cgi/perl/repos/core-i686/perl-5.10.0-archlinux-inc-order.patch?revision=1&view=markup>
+
+Would you suggest I try rebuilding perl without this patch? Debian has a huge perl patch (102K!);
+it's not straightforward for me to see if they do something similar to Arch.
+
+> I think Debian has a similar patch.
+
+---
+
+[[done]] -- apparently this was a problem due to a distribution's
+customisation to perl, or something. Seems too late now to track down what,
+unfortunately. And ikiwiki's Makefile no longer uses the "-libdir" switch
+that seemed to trigger the bug. --[[Joey]]
diff --git a/doc/bugs/404_plugin_and_lighttpd.mdwn b/doc/bugs/404_plugin_and_lighttpd.mdwn
new file mode 100644
index 000000000..8508d0dcd
--- /dev/null
+++ b/doc/bugs/404_plugin_and_lighttpd.mdwn
@@ -0,0 +1,45 @@
+Lighttpd apparently sets REDIRECT_STATUS=200 for the server.error-handler-404 page. This breaks the [[plugins/404]] plugin which checks this variable for 404 before processing the URI. It also doesn't seem to set REDIRECT_URL.
+
+> For what it's worth, the first half is <http://redmine.lighttpd.net/issues/1828>.
+> One workaround would be to make this script your 404 handler:
+>
+> #!/bin/sh
+> REDIRECT_STATUS=404; export REDIRECT_STATUS
+> REDIRECT_URL="$SERVER_NAME$REQUEST_URI"; export REDIRECT_URL
+> exec /path/to/your/ikiwiki.cgi "$@"
+>
+> --[[smcv]]
+
+I was able to fix my server to check the REQUEST_URI for ikiwiki.cgi and to continue processing if it was not found, passing $ENV{SERVER_NAME} . $ENV{REQUEST_URI} as the first parameter to cgi_page_from_404. However, my perl is terrible and I just made it work rather than figuring out exactly what to do to get it to work on both lighttpd and apache.
+
+This is with lighttpd 1.4.19 on Debian.
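
The URL composition described above can be sketched in plain shell; the values here are illustrative, not from a real request:

```shell
# Illustrative sketch: how a 404 handler wrapper can rebuild the requested
# URL from the CGI environment when the server does not set REDIRECT_URL.
SERVER_NAME=example.com
REQUEST_URI=/some/missing/page
REDIRECT_URL="$SERVER_NAME$REQUEST_URI"
export REDIRECT_URL
echo "$REDIRECT_URL"   # prints example.com/some/missing/page
```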
+
+> /cgi-bin/ikiwiki.cgi?do=goto also provides redirection in the same way,
+> if that's any help? You might need to set the lighttpd 404 handler to
+> that, then compose REDIRECT_URL from other variables if necessary.
+>
+> I originally wrote the plugin for Apache; [[weakish]] contributed the
+> lighttpd docs and might know more about how to make it work there.
+> --[[smcv]]
+
+>> As I said, I got it working for me, but somebody who knows perl should probably look at it with the aim of making it work for everyone.
+>> I considered having lighttpd construct a proper URL for the 404 redirect itself, but I don't know if it can do something like that or not.
+>> For what it's worth, here's the change I made to the module:
+
+ sub cgi ($) {
+ my $cgi=shift;
+ if ($ENV{REQUEST_URI} !~ /ikiwiki\.cgi/) {
+ my $page = cgi_page_from_404(
+ Encode::decode_utf8($ENV{SERVER_NAME} . $ENV{REQUEST_URI}),
+ $config{url}, $config{usedirs});
+ IkiWiki::Plugin::goto::cgi_goto($cgi, $page);
+ }
+
+ # if (exists $ENV{REDIRECT_STATUS} &&
+ # $ENV{REDIRECT_STATUS} eq '404') {
+ # my $page = cgi_page_from_404(
+ # Encode::decode_utf8($ENV{REDIRECT_URL}),
+ # $config{url}, $config{usedirs});
+ # IkiWiki::Plugin::goto::cgi_goto($cgi, $page);
+ # }
+ }
diff --git a/doc/bugs/404_plugin_should_handle_403.mdwn b/doc/bugs/404_plugin_should_handle_403.mdwn
new file mode 100644
index 000000000..50288e525
--- /dev/null
+++ b/doc/bugs/404_plugin_should_handle_403.mdwn
@@ -0,0 +1,16 @@
+Apache will return 403 (Forbidden) instead of 404 (Not Found) if the
+`Indexes` option is turned off. This is because with `Indexes` turned on,
+it considers it something it *might* be able to serve in the future. With
+`Indexes` off, it will never serve that page in the future (unless
+`Indexes` is turned back on).
+
+The [[404 plugin|plugins/404]] code only checks for 404, not 403. It should check for both.
+
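+For context, on Apache the error handler is wired up with `ErrorDocument`; a
+hypothetical snippet covering both statuses, as this report requests (the CGI
+path is illustrative, not from this page), would be:
+
+    ErrorDocument 404 /cgi-bin/ikiwiki.cgi
+    ErrorDocument 403 /cgi-bin/ikiwiki.cgi
+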
+> There are plenty of reasons a web server might 403. In most of those
+> cases, trying to create a page where the forbidden content is is not the
+> right thing for ikiwiki to do. --[[Joey]]
+
+See Also:
+
+ * [StackOverflow: 404-vs-403](http://stackoverflow.com/questions/5075300/404-vs-403-when-directory-index-is-missing)
+ * [[404 plugin discussion|plugins/404/discussion]]
diff --git a/doc/bugs/404_when_cancel_create_page.mdwn b/doc/bugs/404_when_cancel_create_page.mdwn
new file mode 100644
index 000000000..ee7b07f8a
--- /dev/null
+++ b/doc/bugs/404_when_cancel_create_page.mdwn
@@ -0,0 +1,60 @@
+If you
+
+ * Add a link to a non-existent page and save. (e.g. somewhere-over-the-rainbow)
+ * Click the question mark to create the page.
+ * Click the cancel button.
+
+You get a 404 as the page doesn't exist. This patch redirects to the from location
+if it is known.
+
+
+ === modified file 'IkiWiki/CGI.pm'
+ --- IkiWiki/CGI.pm
+ +++ IkiWiki/CGI.pm
+ @@ -427,7 +427,11 @@
+ }
+
+ if ($form->submitted eq "Cancel") {
+ - redirect($q, "$config{url}/".htmlpage($page));
+ + if ( $newpage && defined $from ) {
+ + redirect($q, "$config{url}/".htmlpage($from));
+ + } else {
+ + redirect($q, "$config{url}/".htmlpage($page));
+ + }
+ return;
+ }
+ elsif ($form->submitted eq "Preview") {
+
+> I think you mean to use `$newfile`? I've applied a modified version
+> that also deals with creating a new page with no defined $from location.
+> [[bugs/done]] --[[Joey]]
+
+>> Yes of course, that's what I get for submitting an untested patch!
+>> I must stop doing that.
+
+[P.S. just above that is
+
+ $type=$form->param('type');
+ if (defined $type && length $type && $hooks{htmlize}{$type}) {
+ $type=possibly_foolish_untaint($type);
+ }
+ ....
+ $file=$page.".".$type;
+
+I'm a little worried by the `possibly_foolish_untaint` (good name for it by the way,
+makes it stick out). I don't think much can be done to exploit this (if anything),
+but it seems like you could have a very strict regex there rather than the untaint,
+as there aren't going to be many possible extensions. Something like `/(.\w+)+/`
+(groups of dot-separated alphanumeric chars, if my perl-foo isn't failing me). You could
+at least exclude `/` and `..`. I'm happy to turn this in to a patch if you agree.]
+
+> The reason it's safe to use `possibly_foolish_untaint` here is because
+> of the check for $hooks{htmlize}{$type}. This limits it to types
+> that have a registered htmlize hook (mdwn, etc), and not whatever random
+> garbage an attacker might try to put in. If it wasn't for that check,
+> using `possibly_foolish_untaint` there would be _very_ foolish indeed..
+> --[[Joey]]
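Joey's safety argument can be sketched stand-alone. The `%hooks` table and `untaint_type` helper below are hypothetical illustrations, not ikiwiki's actual code: the point is that the hook lookup acts as a whitelist, so the untaint regex only ever sees values that were registered in advance.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical hook table: only types with a registered htmlize hook exist.
my %hooks = (htmlize => { mdwn => 1, html => 1 });

# Return a safe type, or undef. The $hooks{htmlize}{$type} lookup is the
# real gate: untainting afterwards can't pass attacker-chosen values.
sub untaint_type {
    my $type = shift;
    return undef unless defined $type && length $type;
    return undef unless $hooks{htmlize}{$type};
    $type =~ /^([a-zA-Z0-9_]+)$/ or return undef;
    return $1;    # the captured group is untainted
}
```

With the gate in place, a registered type like `mdwn` passes, while arbitrary input such as a path traversal string is rejected before the untaint regex is even consulted.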
+
+>> Nice, sorry I missed it.
+>> I must say thank you for creating ikiwiki.
+>> The more I look at it, the more I admire what you are doing with it and how you are going about it.
diff --git a/doc/bugs/4_spaces_after_bullet.mdwn b/doc/bugs/4_spaces_after_bullet.mdwn
new file mode 100644
index 000000000..6a579c206
--- /dev/null
+++ b/doc/bugs/4_spaces_after_bullet.mdwn
@@ -0,0 +1,18 @@
+this example
+
+ * bla
+ * bla2
+
+ this should be treated as code block
+
+and it is not ...
+
+ but if bullets are not above this, it works
+
+Is this a markdown limitation? I know that 8 spaces are required to make a code block appear under a bullet, but I don't want it to be under the bullet. I want it outside of the bullet, shown normally (as a code block).
+
+> Yes, this is an ambiguity and limitation in markdown. I'm
+> not sure if it can be fixed, but I've filed a
+> [bug](http://bugs.debian.org/444309) in the Debian BTS
+> about it. I'm going to [[close|done]] this bug in the ikiwiki BTS, since
+> it's not a bug in ikiwiki's own code. --[[Joey]]
diff --git a/doc/bugs/Add_a_footer_div_on_all_pages_to_improve_theming.mdwn b/doc/bugs/Add_a_footer_div_on_all_pages_to_improve_theming.mdwn
new file mode 100644
index 000000000..e0dd100fa
--- /dev/null
+++ b/doc/bugs/Add_a_footer_div_on_all_pages_to_improve_theming.mdwn
@@ -0,0 +1,149 @@
+The following patch adds a footer div on all pages to ease CSS theming.
+Indeed, the misc.tmpl, recentchanges.tmpl and editpage.tmpl templates lack such a div.
+
+> So, the problem with this is that the default css inserts a horizontal
+> line at the top of the footer div, and putting an empty footer on these
+> other pages looks a bit weird. Any idea how to get around that?
+> --[[Joey]]
+
+>> Sorry I didn't see that. It definitely looks weird. We could add text
+>> in all footers or change the CSS stylesheet, but it's not clean IMHO.
+
+>> The idea was to ease theming by giving all the pages the same
+>> structure. The current structure is the following one:
+
+>> div header - div actions ... div content - div footer (sometimes)
+
+>> So we could add some new divs in all templates. By default, they will
+>> be empty and no CSS will be defined for them. This way ikiwiki
+>> standard appearance is not changed but theming will be eased.
+>> The new page structure could be:
+
+>> * div banner (to show up a logo for example)
+
+>> * div content-wrap containing div header, div actions, ... div content
+>> and div footer
+
+>> * div realfooter (to put credits, "Powered by ikiwiki", "Valid XHTML"...)
+
+>> From my tests, it works: just adding the divs, without touching the stylesheet,
+>> doesn't change ikiwiki's appearance. And theming is eased :-)
+
+>> I can update the patch, if you want to follow and test this idea. --Fred
+
+>>> Sure, go ahead --[[Joey]]
+
+>>>>Here is an updated patch against current svn. --Fred
+
+ Index: templates/recentchanges.tmpl
+ ===================================================================
+ --- templates/recentchanges.tmpl (révision 3575)
+ +++ templates/recentchanges.tmpl (copie de travail)
+ @@ -12,7 +12,11 @@
+ </TMPL_IF>
+ </head>
+ <body>
+ +<div id="banner">
+ +</div>
+
+ +<div id="content-wrap">
+ +
+ <div class="header">
+ <span>
+ <TMPL_VAR INDEXLINK>/ <TMPL_VAR TITLE>
+ @@ -65,5 +69,10 @@
+
+ <!-- from <TMPL_VAR NAME=WIKINAME> -->
+
+ +</div>
+ +
+ +<div id="realfooter">
+ +</div>
+ +
+ </body>
+ </html>
+ Index: templates/page.tmpl
+ ===================================================================
+ --- templates/page.tmpl (révision 3575)
+ +++ templates/page.tmpl (copie de travail)
+ @@ -13,7 +13,11 @@
+ <TMPL_IF NAME="META"><TMPL_VAR META></TMPL_IF>
+ </head>
+ <body>
+ +<div id="banner">
+ +</div>
+
+ +<div id="content-wrap">
+ +
+ <div class="header">
+ <span>
+ <TMPL_LOOP NAME="PARENTLINKS">
+ @@ -95,5 +99,10 @@
+ <TMPL_IF EXTRAFOOTER><TMPL_VAR EXTRAFOOTER></TMPL_IF>
+ </div>
+
+ +</div>
+ +
+ +<div id="realfooter">
+ +</div>
+ +
+ </body>
+ </html>
+ Index: templates/editpage.tmpl
+ ===================================================================
+ --- templates/editpage.tmpl (révision 3575)
+ +++ templates/editpage.tmpl (copie de travail)
+ @@ -12,6 +12,11 @@
+ </TMPL_IF>
+ </head>
+ <body>
+ +<div id="banner">
+ +</div>
+ +
+ +<div id="content-wrap">
+ +
+ <TMPL_IF NAME="PAGE_CONFLICT">
+ <p>
+ <b>Your changes conflict with other changes made to the page.</b>
+ @@ -86,5 +91,11 @@
+ <TMPL_VAR PAGE_PREVIEW>
+ </div>
+ </TMPL_IF>
+ +
+ +</div>
+ +
+ +<div id="realfooter">
+ +</div>
+ +
+ </body>
+ </html>
+ Index: templates/misc.tmpl
+ ===================================================================
+ --- templates/misc.tmpl (révision 3575)
+ +++ templates/misc.tmpl (copie de travail)
+ @@ -12,7 +12,11 @@
+ </TMPL_IF>
+ </head>
+ <body>
+ +<div id="banner">
+ +</div>
+
+ +<div id="content-wrap">
+ +
+ <div class="header">
+ <span>
+ <TMPL_VAR INDEXLINK>/ <TMPL_VAR TITLE>
+ @@ -23,5 +27,10 @@
+ <TMPL_VAR PAGEBODY>
+ </div>
+
+ +</div>
+ +
+ +<div id="realfooter">
+ +</div>
+ +
+ </body>
+ </html>
+
+> I took a more intrusive approach to avoid ugly names like "realfooter".
+> [[done]] --[[Joey]]
diff --git a/doc/bugs/Add_permissions_for_suggesting__47__accepting_edits.mdwn b/doc/bugs/Add_permissions_for_suggesting__47__accepting_edits.mdwn
new file mode 100644
index 000000000..cbe53ad7a
--- /dev/null
+++ b/doc/bugs/Add_permissions_for_suggesting__47__accepting_edits.mdwn
@@ -0,0 +1,15 @@
+Wikis are great tools for collaborative content of all types, but website creators who want some level of collaboration seem to have to choose between a static website, a wiki that anyone (or all members) can edit, or an overkill customized web app.
+
+A simple innovation that needs to propagate through wiki software is adding the ability to suggest edits and accept those edits. Perhaps you want a wiki that anyone can suggest and edit, but only registered users can edit freely or accept edits. Or you want anyone, including members, to only be able to suggest edits, and only have moderators able to approve edits and edit freely. Etc, etc.
+
+> Ikiwiki already has some work in this area; there is
+> the moderatedcomments plugin and the `checkcontent` hook.
+> The hook allows, for example, a plugin to reject changes
+> with spam links or swear words. A plugin could also use
+> it to save the diff for later moderation.
+>
+> I think the difficulty
+> is in the moderation interface, which would need to apply the diff
+> and show the resulting page with the changes somehow evident (for users
+> who can't just read diffs), and would have to deal with conflicting
+> edits, etc. --[[Joey]]
diff --git a/doc/bugs/Aggregated_Atom_feeds_are_double-encoded.mdwn b/doc/bugs/Aggregated_Atom_feeds_are_double-encoded.mdwn
new file mode 100644
index 000000000..fbdc58d5d
--- /dev/null
+++ b/doc/bugs/Aggregated_Atom_feeds_are_double-encoded.mdwn
@@ -0,0 +1,22 @@
+The Atom feed from <http://planet.collabora.co.uk/>
+gets "double-encoded" (UTF-8 is decoded as Latin-1 and re-encoded as
+UTF-8) when aggregated with IkiWiki on Debian unstable. The RSS 1.0
+and RSS 2.0 feeds from the same Planet are fine. All three files
+are in fact correct UTF-8, but IkiWiki mis-parses the Atom.
+
+This turns out to be a bug in XML::Feed, or (depending on your point
+of view) XML::Feed failing to work around a design flaw in XML::Atom.
+When parsing RSS it returns Unicode strings, but when parsing Atom
+it delegates to XML::Atom's behaviour, which by default is to strip
+the UTF8 flag from strings that it outputs; as a result, they're
+interpreted by IkiWiki as byte sequences corresponding to the UTF-8
+encoding. IkiWiki then treats these as if they were Latin-1 and
+encodes them into UTF-8 for output.
+
+I've filed a bug against XML::Feed on CPAN requesting that it sets
+the right magical variable to change this behaviour. IkiWiki can
+also apply the same workaround (and doing so should be harmless even
+when XML::Feed is fixed); please consider merging my 'atom' branch,
+which does so. --[[smcv]]
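For reference, the workaround described amounts to flipping one package variable before parsing. This is a sketch, assuming XML::Feed and XML::Atom are installed; `$XML::Atom::ForceUnicode` is the flag XML::Atom documents for returning decoded character strings, and the feed URL is just the one from this report.

```perl
use XML::Feed;
use XML::Atom;
use URI;

# Make XML::Atom return decoded Perl character strings (UTF8 flag set),
# matching what XML::Feed already does when parsing RSS. This should be
# harmless even once XML::Feed itself is fixed to do the same.
$XML::Atom::ForceUnicode = 1;

my $feed = XML::Feed->parse(URI->new("http://planet.collabora.co.uk/atom.xml"))
    or die XML::Feed->errstr;
```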
+
+[[!tag patch done]]
diff --git a/doc/bugs/Allow_overriding_of_symlink_restriction.mdwn b/doc/bugs/Allow_overriding_of_symlink_restriction.mdwn
new file mode 100644
index 000000000..efdd9004e
--- /dev/null
+++ b/doc/bugs/Allow_overriding_of_symlink_restriction.mdwn
@@ -0,0 +1,139 @@
+There is currently a restriction in ikiwiki that there cannot be any symlinks in the source path. This is to deal with a security issue discussed [[here|security#index29h2]]. The issue, as I understand it, is that someone might change a symlink and so cause things on the server to be published when the server admin doesn't want them to be.
+
+I think there are two issues here:
+
+ - Symlinks in the source dir path itself, and
+ - Symlinks inside the source directory.
+
+The first appears to me to be less of a security issue. If there is a way for a malicious person to change where that path points, then you have problems this check isn't going to solve. The second is quite clearly a security issue - if someone were to commit a symlink into the source dir they could cause lots of stuff to be published that shouldn't be.
+
+> Correct. However, where does the revision controlled source directory end? Ikiwiki has no way
+> of knowing. It cannot assume that `srcdir` is in revision control, and
+> everything outside is not. For example, ikiwiki's own source tree has the
+> doc wiki source inside `ikiwiki/doc`. So to fully close the source dir
+> symlink issue, it's best to, by default, assume that the revision
+> controlled directories could go down arbitrarily deep, down to the root of
+> the filesystem. --[[Joey]]
+
+>> Fair point.
+
+The current code seems to check this constraint at the top of IkiWiki/Render.pm at the start of refresh(). It seems to only check the source dir itself, not the subdirs. Then it uses File::Find to recurse, which doesn't follow symlinks.
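The check being discussed can be sketched as a stand-alone helper (simplified from the loop in `refresh()`; it assumes an absolute path, as `srcdir` is by the time the check runs):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Basename qw(dirname);

# Walk from $test up to the filesystem root, refusing if any path
# component is a symlink. Mirrors the srcdir check in refresh().
sub check_no_symlinks_in_path {
    my $test = shift;
    while (length $test) {
        die "symlink found in srcdir path ($test)\n" if -l $test;
        unless ($test =~ s/\/+$//) {   # strip trailing slashes, else...
            $test = dirname($test);    # ...step up one directory level
        }
    }
    return 1;
}
```

File::Find's default behaviour of not following symlinks then covers everything below `srcdir` itself.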
+
+Now my problem: I have a hosted server where I cannot avoid having a symlink in the source path. I've made a patch to optionally turn off the symlink checking in the source path itself. The patch would still not follow symlinks inside the source dir. This would seem to be ok security-wise for me as I know that path is ok and it isn't going to change on me.
+
+> BTW, if you have a problem, please file it in [[todo]] or [[bugs]] in the
+> future. Especially if you also have a patch. :-) --[[Joey]]
+
+>> Well, I wasn't sure whether I was missing something. I wanted to discuss the concept of the patch as much as submit the patch. But, ok :)
+
+Is there a huge objection to this patch?
+
+>>> [[patch]] updated.
+
+ diff --git a/IkiWiki/Render.pm b/IkiWiki/Render.pm
+ index 990fcaa..0fb78ba 100644
+ --- a/IkiWiki/Render.pm
+ +++ b/IkiWiki/Render.pm
+ @@ -260,13 +260,15 @@ sub prune ($) {
+
+ sub refresh () {
+ # security check, avoid following symlinks in the srcdir path
+ - my $test=$config{srcdir};
+ - while (length $test) {
+ - if (-l $test) {
+ - error("symlink found in srcdir path ($test)");
+ - }
+ - unless ($test=~s/\/+$//) {
+ - $test=dirname($test);
+ + if (! $config{allow_insecure_symlinks_in_path_to_srcdir}) {
+ + my $test=$config{srcdir};
+ + while (length $test) {
+ + if (-l $test) {
+ + error("symlink found in srcdir path ($test)");
+ + }
+ + unless ($test=~s/\/+$//) {
+ + $test=dirname($test);
+ + }
+ }
+ }
+
+ diff --git a/doc/ikiwiki.setup b/doc/ikiwiki.setup
+ index 10cb3da..eb86e49 100644
+ --- a/doc/ikiwiki.setup
+ +++ b/doc/ikiwiki.setup
+ @@ -203,4 +203,10 @@ use IkiWiki::Setup::Standard {
+ # For use with the attachment plugin, a program that returns
+ # nonzero if its standard input contains an virus.
+ #virus_checker => "clamdscan -",
+ +
+ + # The following setting allows symlinks in the path to your
+ + # srcdir. Symlinks are still not followed within srcdir.
+ + # Allowing symlinks to be followed, even in the path to srcdir,
+ + # will make some setups insecure.
+ + #allow_insecure_symlinks_in_path_to_srcdir => 0,
+ }
+
+> No, I don't have a big objection to such an option, as long as it's
+> extremely well documented that it will make many setups insecure.
+> It would be nice to come up with an option name that makes clear that
+> it's allowing symlinks in the path to the `srcdir`, but still not inside
+> the `srcdir`.
+> --[[Joey]]
+
+>> Slightly modified version of patch applied. --[[Joey]]
+
+>> Ok, I'll try to get it cleaned up and documented.
+
+There is a second location where this can be an issue. That is in the
+front of the wrapper. There the issue is that the path to the source dir
+as seen on the cgi server and on the git server are different - each has
+symlinks in place to support the other. The current wrapper gets the
+absolute path to the source dir, and that breaks things for me. This is a
+slightly different, albeit related, issue to the one above. The following
+patch fixes things. Again, patch inline. Again, this patch could be
+cleaned up :). I just wanted to see if there was any chance of a patch
+like this being accepted before I bothered.
+
+>>> Patch updated:
+
+ index 79b9eb3..ce1c395 100644
+ --- a/IkiWiki/Wrapper.pm
+ +++ b/IkiWiki/Wrapper.pm
+ @@ -4,14 +4,14 @@ package IkiWiki;
+
+ use warnings;
+ use strict;
+ -use Cwd q{abs_path};
+ use Data::Dumper;
+ use IkiWiki;
+ +use File::Spec;
+
+ sub gen_wrapper () {
+ - $config{srcdir}=abs_path($config{srcdir});
+ - $config{destdir}=abs_path($config{destdir});
+ - my $this=abs_path($0);
+ + $config{srcdir}=File::Spec->rel2abs($config{srcdir});
+ + $config{destdir}=File::Spec->rel2abs($config{destdir});
+ + my $this=File::Spec->rel2abs($0);
+ if (! -x $this) {
+ error(sprintf(gettext("%s doesn't seem to be executable"), $this
+ }
+
+> ikiwiki uses absolute paths for `srcdir`, `destdir` and `this` because
+> the wrapper could be run from any location, and if any of them happen to
+> be a relative path, it would crash and burn.
+
+>> Which makes perfect sense. It is annoying that abs_path() is also
+>> expanding symlinks.
+
+> I think the thing to do might be to make it check if `srcdir` and
+> `destdir` look like an absolute path (ie, start with "/"). If so, it can
+> skip running `abs_path` on them.
+
+>> I'll do that. I assume something like <code> File::Spec->file_name_is_absolute( $path ); </code> would have more cross-platformy goodness.
+>> hrm. I might see if <code> File::Spec->rel2abs( $path ) ; </code> will give an absolute path without expanding symlinks.
+>>> Patch using rel2abs() works well - it no longer expands symlinks.
+
+>>>> That patch is applied now. --[[Joey]]
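The distinction the thread settles on is easy to demonstrate in isolation: `Cwd::abs_path` canonicalizes through the filesystem (expanding symlinks), while `File::Spec->rel2abs` is a pure string operation that only prepends the current directory. A minimal sketch:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Cwd qw(abs_path getcwd);
use File::Spec;

# Pure string operation: prepends getcwd() to a relative path and never
# consults the filesystem, so symlinked components are left unexpanded.
my $abs = File::Spec->rel2abs("srcdir");
print "rel2abs: $abs\n";

# abs_path, by contrast, resolves every symlinked component to its
# target, which is what broke setups whose path to srcdir goes
# through a symlink.
print "abs_path of cwd: ", abs_path(getcwd()), "\n";
```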
+
+[[!tag done]]
diff --git a/doc/bugs/Another_UTF-8_problem.mdwn b/doc/bugs/Another_UTF-8_problem.mdwn
new file mode 100644
index 000000000..d67ed2fa0
--- /dev/null
+++ b/doc/bugs/Another_UTF-8_problem.mdwn
@@ -0,0 +1,16 @@
+Web-edit the [[Sandbox]], select *Preview* and watch all UTF-8 characters
+getting garbled (would also get committed like this). Or is it a problem
+with my pretty standard Ubuntu gutsy Firefox installation? --[[tschwinge]]
+
+> Fixed, but I wish I knew what changed to break this. My guess is it might
+> have changed in the new upstream release of FormBuilder. All forms using
+> formbuilder were affected, none of them were utf-8 clean, and I know that
+> ikiwiki used to be fully utf-8 clean. The symptom of the problem is that
+> in `decode_form_utf8`, `Encode::is_utf8` says that the form field value
+> is already valid utf-8, when in fact it's not yet been decoded. So I
+> removed that line to fix it. --[[Joey]]
+
+[[!tag done]]
+
+Now we test it for Cyrillic and Western letters:
+Протестируем кириллицу и ещё «_другие_» буквы: grüne Öl & hôtel — 3² × 2° --Shoorick
diff --git a/doc/bugs/Attachment_plug-in_not_committing_files.mdwn b/doc/bugs/Attachment_plug-in_not_committing_files.mdwn
new file mode 100644
index 000000000..aaba13326
--- /dev/null
+++ b/doc/bugs/Attachment_plug-in_not_committing_files.mdwn
@@ -0,0 +1,18 @@
+I've added the attachment plug-in to our wiki. I am able to add files to the working copy of the website on the server, but none of the files are being checked into the SVN repository. Using logging, I've tracked the problem to line 293 of attachment.pm:
+
+ IkiWiki::rcs_add($_) foreach @attachments;
+
+Here it is trying to add an absolute path to the file when rcs_add is expecting a path relative to the SVN root.
+
+From this code it looks like $dest needs to be absolute and that a relative path needs to be pushed to @attachments:
+
+ rename($filename, $dest);
+ push @attachments, $dest;
+
+I'm using ikiwiki version 3.20120202ubuntu1.
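One way to produce the repository-relative path that `rcs_add` expects is `File::Spec->abs2rel`. The `$srcdir` and file names below are made up for illustration; this is a sketch of the idea, not the actual attachment.pm fix:

```perl
use strict;
use warnings;
use File::Spec;

my $srcdir = "/srv/wiki/srcdir";          # hypothetical checkout root
my $dest   = "$srcdir/pics/photo.jpg";    # absolute path used by rename()

# Derive the path relative to the SVN working-copy root, which is the
# form an SVN add expects.
my $rel = File::Spec->abs2rel($dest, $srcdir);
print "$rel\n";    # pics/photo.jpg
```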
+
+> I don't think this affects git, just because it happens to
+> allow adding with an absolute path.
+>
+> So, this is an interesting way svn support can bit rot if nothing
+> is testing it! [[fixed|done]] --[[Joey]]
diff --git a/doc/bugs/Broken_URL_to_your_blog_script.mdwn b/doc/bugs/Broken_URL_to_your_blog_script.mdwn
new file mode 100644
index 000000000..3d6661d9c
--- /dev/null
+++ b/doc/bugs/Broken_URL_to_your_blog_script.mdwn
@@ -0,0 +1,10 @@
+Joey, I would like to see your blog script I've found
+at [[Tips|tips/blog_script]] page, but it seems that the URL
+(http://git.kitenet.net/?p=joey/home;a=blob_plain;f=bin/blog)
+to its Git repo is broken:
+
+ 403 Forbidden - No such project
+
+--[[Paweł|ptecza]]
+
+> [[fixed|done]] --[[Joey]]
diff --git a/doc/bugs/Broken_access_to_Ikiwiki_gitweb.mdwn b/doc/bugs/Broken_access_to_Ikiwiki_gitweb.mdwn
new file mode 100644
index 000000000..902e4086f
--- /dev/null
+++ b/doc/bugs/Broken_access_to_Ikiwiki_gitweb.mdwn
@@ -0,0 +1,19 @@
+I can't check the last changes in Ikiwiki using
+gitweb. It looks like an XML
+validation problem with an HTML entity.
+
+When I click an appropriate link on a [[git]] page, then I can only
+see the following error message. --[[Paweł|ptecza]]
+
+ <div class="title">&nbsp;</div>
+ -------------------^
+
+> I don't see or understand the problem. I've tried History links as well
+> as the diff links in RecentChanges, both seem to be working. --[[Joey]]
+
+>> Hm. It's strange. I really could see the error message like the one above
+>> when I sent my report. It seems that it
+>> works now. So, we should be happy that it was a self-fixed bug ;)
+>> --[[Paweł|ptecza]]
+
+>>> If it happens again, maybe take a full dump of the page? [[done]]
diff --git a/doc/bugs/Building_a_sidebar_does_not_regenerate_the_subpages.mdwn b/doc/bugs/Building_a_sidebar_does_not_regenerate_the_subpages.mdwn
new file mode 100644
index 000000000..419292930
--- /dev/null
+++ b/doc/bugs/Building_a_sidebar_does_not_regenerate_the_subpages.mdwn
@@ -0,0 +1,8 @@
+If sandbox/page.mdwn has been generated and sandbox/sidebar.mdwn is created, the sidebar is only added to sandbox and none of the subpages. --[[TaylorKillian]]
+
+> Yes, a known bug. As noted in the code: --[[Joey]]
+
+ # FIXME: This isn't quite right; it won't take into account
+ # adding a new sidebar page. So adding such a page
+ # currently requires a wiki rebuild.
+ add_depends($page, $sidebar_page);
diff --git a/doc/bugs/CGI__44___formbuilder__44___non-existent_field_address.mdwn b/doc/bugs/CGI__44___formbuilder__44___non-existent_field_address.mdwn
new file mode 100644
index 000000000..ef74deb91
--- /dev/null
+++ b/doc/bugs/CGI__44___formbuilder__44___non-existent_field_address.mdwn
@@ -0,0 +1,59 @@
+Error received when clicking on the "edit" link:
+
+> `Error: [CGI::FormBuilder::AUTOLOAD] Fatal: Attempt to address
+> non-existent field 'text' by name at
+> /home/tealart/bin/share/perl/5.8.4/IkiWiki/CGI.pm line 112`
+
+Error received when following a "Create New Page" (eg. ?) link:
+
+> `Error: [CGI::FormBuilder::AUTOLOAD] Fatal: Attempt to address
+> non-existent field 'param' by name at
+> /home/tealart/bin/share/perl/5.8.4/IkiWiki/Plugin/editpage.pm line 122`
+
+I could probably find several other flavors of this error if I went
+looking, but I trust you get the idea.
+
+The CGI starts to render (this isn't the "you forgot to set the
+permissions/turn on the CGI" error) and then fails.
+
+Further details:
+
+- Running on shared hosting (dreamhost; but everything compiles,
+ dependencies installed, the site generates perfectly, other CGIs
+ work, the file permissions work).
+
+- It's running perl 5.8.4, but I did upgrade gettext to 0.17
+
+- the server is running gcc v3.3.5 (at this point, this is the main
+ difference between the working system and my box.)
+
+- I've removed the locale declarations from both the config file and
+ the environment variable.
+
+- I've also modified the page template and have my templates in a non
+ standard location. The wiki compiles fine, with the template, but
+ might this be an issue? The CGI script doesn't (seem) to load under
+ the new template, but I'm not sure how to address this issue.
+
+- All of the required/suggested module dependencies are installed
+ (finally) to the latest version including (relevantly)
+ CGI::FormBuilder 3.0501.
+
+- I'm running ikiwiki v3.08. Did I mention that it works perfectly in
+  nearly every other way that I've managed to test thus far?
+
+----
+
+> I suspect that your perl is too old and is incompatible with the version of CGI::FormBuilder you have installed.
+>
+> If so, it seems likely that the same error message can be reproduced by running a simple command like this at the command line:
+>
+> perl -e 'use warnings; use strict; use CGI::FormBuilder; my $form=CGI::FormBuilder->new; $form->text("boo")'
+>
+> --[[Joey]]
+
+> > nope, that command produces no output. :/
+> >
+> > I considered downgrading CGI::FormBuilder but I saw evidence of previous versions being incompatible with ikiwiki so I decided against that.
+> >
+> > -- [[tychoish]]
diff --git a/doc/bugs/CGI_edit_and_slash_in_page_title.mdwn b/doc/bugs/CGI_edit_and_slash_in_page_title.mdwn
new file mode 100644
index 000000000..ec5763924
--- /dev/null
+++ b/doc/bugs/CGI_edit_and_slash_in_page_title.mdwn
@@ -0,0 +1,18 @@
+Try clicking the Edit link for <http://ikiwiki.info/todo/Add_showdown_GUI_input__47__edit/>
+
+The link produces a query string that the edit CGI interprets to
+mean `edit.mdwn` in an `Add showdown GUI input` subpage.
+
+There's something there now, but only because I created it. When
+I first tried it, it came up blank. I tried several different ways
+of altering the escaping of the query string to get the real page to
+come up, but I never succeeded, so I just grabbed the original text
+from git and pasted it into the new page....
+
+So somehow the generation of Edit links and the CGI for doing the
+editing need to get in agreement on just how they're going to
+escape slashes in a page title.
+
+--Chapman Flack
+
+> bleh. [[Fixed|done]] --[[joey]]
diff --git a/doc/bugs/CGI_problem_with_some_webservers.mdwn b/doc/bugs/CGI_problem_with_some_webservers.mdwn
new file mode 100644
index 000000000..e4b0fd448
--- /dev/null
+++ b/doc/bugs/CGI_problem_with_some_webservers.mdwn
@@ -0,0 +1,108 @@
+The "ikiwiki.cgi?page=index&do=edit" function has a problem
+when running with [[!debpkg thttpd]] or [[!debpkg mini-httpd]]:
+for some reason the headers ikiwiki outputs are transmitted
+as the page content. Surprisingly, the "do=prefs" function
+works as expected.
+
+Here is what it looks like in iceweasel:
+
+ Set-Cookie: ikiwiki_session_apnkit=99dad8d796bc6c819523649ef25ea447; path=/
+ Date: Tue, 14 Aug 2007 17:16:32 GMT
+ Content-Type: text/html; charset=utf-8
+
+ <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
+ "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
+ <html>
+ (...)
+
+Ikiwiki runs fine with [[!debpkg boa]].
+
+--[[JeremieKoenig]]
+
+It doesn't work for signin either.
+What is the reason for these "header => 1" in FormBuilder initialisations?
+Why do they appear twice with conflicting values in the very same hashes?
+
+--[[JeremieKoenig]]
+
+> Clearly those duplicate header settings are a mistake. But in all cases, the
+> `header => 0` came second, so it _should_ override the other value and
+> can't be causing this problem. (cgi_signin only sets it to 0, too).
+>
+> What version of formbuilder are you using? If you run ikiwiki.cgi at the
+> command line, do you actually see duplicate headers? I don't:
+
+ joey@kodama:~/html>REQUEST_METHOD=GET QUERY_STRING="page=index&do=edit" ./ikiwiki.cgi
+ Set-Cookie: ikiwiki_session_joey=41a847ac9c31574c1e8f5c6081c74d12; path=/
+ Date: Tue, 14 Aug 2007 18:04:06 GMT
+ Content-Type: text/html; charset=utf-8
+
+ <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
+
+> Do thttpd and mini-httpd perhaps not realize that Set-Cookie is the start of
+> the headers? --[[Joey]]
+
+>> Thanks for your help: I think I found the problem!
+>> Ikiwiki outputs (in my case) the following
+>> error message on stderr, followed by an empty line:
+
+ /srv/ikiwiki/wc/index.mdwn: (Not a versioned resource)
+
+>> Probably thttpd and mini-httpd read stderr as well as stdout, while apache
+>> and boa don't. When using a shell-script wrapper as the CGI,
+>> which redirects ikiwiki's error output to /dev/null, it works better.
+
+>> The edit still fails to commit, because in my wiki, index.mdwn is
+>> pulled from the base wiki and somehow ikiwiki wants to change it
+>> rather than create it.
+
+>> --[[JeremieKoenig]]
+
+>>> If thttpd and mini-httpd interpret CGI's stderr as stdout, then
+>>> they're not properly following the CGI spec, and will break with tons
+>>> of cgi scripts besides ikiwiki. And of course there are many many cases
+>>> where ikiwiki might output to stderr, and that's the right thing to do.
+>>> So I don't see any way to address this in ikiwiki. --[[Joey]]
+
+>>>> (reported as [[!debbug 437927]] and [[!debbug 437932]]) --[[JeremieKoenig]]
+
+Marking [[done]] since it's not really an ikiwiki bug. --[[Joey]]
+
+----
+
+I'm using boa and getting some odd behaviour if I don't set the `umask`
+option in the config file. Editing a page through the web interface and
+hitting "Save Page" regenerates the `index.html` file with no world-read
+permissions. As a result, the server serves a "403 - Forbidden" error page
+instead of the page I was expecting to return to.
+
+There are only two ways I found to work around this: adding a `umask 022`
+option to the config file, or re-compiling the wiki from the command line
+using `ikiwiki --setup`. Setting up a git back-end and re-running `ikiwiki
+--setup` from inside a hook had no effect; it needed to be at the terminal.
+--Paul
+
+> Since others seem to have gotten ikiwiki working with boa,
+> I'm guessing that this is not a generic problem with boa, but that
+> your boa was started from a shell that had an unusual umask and inherited
+> that. --[[Joey]]
+
+>> That's right; once I'd worked out what was wrong, it was clear that any
+>> webserver should have been refusing to serve the page. I agree about the
+>> inherited umask; I hadn't expected that. Even if it's unusual, though, it
+>> probably won't be uncommon - this was a stock Ubuntu 9.04 install. --Paul
+
+(I'm new to wiki etiquette - would it be more polite to leave these details
+on the wiki, or to remove them and only leave a short summary? Thanks.
+--Paul)
+
+> Well, I just try to keep things understandable and clear, whether than
+> means deleting bad old data or not. That said, this page is a bug report,
+> that was already closed. It's generally better to open a new bug report
+> rather than edit an old closed one. --[[Joey]]
+
+>> Thanks for the feedback, I've tidied up my comment accordingly. I see
+>> your point about the bug; sorry for cluttering the page up. I doubt it's
+>> worth opening a new page at this stage, but will do so if there's a next
+>> time. The solution seems worth leaving, though, in case anyone else in my
+>> situation picks it up. --Paul
diff --git a/doc/bugs/CGI_showed_HTML_when_perl_error.mdwn b/doc/bugs/CGI_showed_HTML_when_perl_error.mdwn
new file mode 100644
index 000000000..b222b297d
--- /dev/null
+++ b/doc/bugs/CGI_showed_HTML_when_perl_error.mdwn
@@ -0,0 +1,40 @@
+I didn't have Time/Duration.pm installed when I clicked RecentChanges. The
+perl failed. The CGI output the Content-type: text/html header and the complete
+HTML, which included the error inside the paragraph tags. Maybe a newline
+was sent before that Content-type line. The web browser didn't render the HTML
+but just showed the source.
+
+> I can't reproduce this, I get a properly formatted error page.
+> If you'd like to send me the page, I can try to figure out what
+> happened. --[[Joey]]
+
+>> The page is fine. I can reproduce by just putting a typo or error in a
+>> plugin. I used tcpdump. When I am missing a plugin I get a newline 0a
+>> before Content-Type:
+
+ 0x0030: 0000 0003 0000 0000 0a43 6f6e 7465 6e74 .........Content
+
+>> And with it working, no newline:
+
+ 0x0030: 0000 0003 0000 0000 436f 6e74 656e 742d ........Content-
+
+>> I am using mini_httpd. I guess I could try another webserver real quick.
+>>
+>> --JeremyReed
+
+Here's what I see, taking the web server out of the picture:
+
+ joey@kodama:~>~/html/ikiwiki.cgi 2>/dev/null |hexdump -C|head -1
+ 00000000 43 6f 6e 74 65 6e 74 2d 74 79 70 65 3a 20 74 65 |Content-type: te|
+
+No spurious 0a. With apache:
+
+ 0100 75 6e 6b 65 64 0d 0a 43 6f 6e 74 65 6e 74 2d 54 unked..C ontent-T
+
+Here the 0d 0a is a CRLF, and note that it's output by the web server, not
+ikiwiki. It's perfectly valid, while a lone 0a, just a linefeed, is not valid
+HTTP. Conclusion: this was your web server; it's not uncommon for hacky
+little web servers to not use proper CRLFs, and it works _some_ of the time,
+depending on how strict the browser is.
+
+I'm calling this [[bugs/done]] --[[Joey]]
diff --git a/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn b/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
new file mode 100644
index 000000000..81a5abf28
--- /dev/null
+++ b/doc/bugs/CGI_wrapper_doesn__39__t_store_PERL5LIB_environment_variable.mdwn
@@ -0,0 +1,28 @@
+If you wish to install ikiwiki in your home directory (for example because you don't have root access), you need to set environment variables (such as PATH and PERL5LIB) to point to these directories that contain your personal copy of IkiWiki.
+
+The CGI wrapper remembers PATH, but not the environment variable PERL5LIB. Consequently, it will look for plugins and so on in the usual system directories, not in your personal copy. This is particularly insidious if you have a system copy of a different version installed, as your CGI wrapper may then load in code from this version.
+
+I think the CGI wrapper should remember PERL5LIB too.
+
+-- Martin
+
+Thanks a lot for pointing me to this location in the code. I had been looking for it for some time.
+
+This brutal patch implements your solution as a temporary fix.
+
+ *** Wrapper.pm.old 2012-08-25 16:41:41.000000000 +0200
+ --- Wrapper.pm 2012-10-01 17:33:17.582956524 +0200
+ ***************
+ *** 149,154 ****
+ --- 149,155 ----
+ $envsave
+ newenviron[i++]="HOME=$ENV{HOME}";
+ newenviron[i++]="PATH=$ENV{PATH}";
+ + newenviron[i++]="PERL5LIB=$ENV{PERL5LIB}";
+ newenviron[i++]="WRAPPED_OPTIONS=$configstring";
+
+ #ifdef __TINYC__
+
+As I am not sure that remembering `PERL5LIB` is a good idea, I think that a prettier solution would be to add a config variable (let's say `cgi_wrapper_perllib`) which, if set, contains the `PERL5LIB` value to include in the wrapper, or another (let's say `cgi_wrapper_remember_libdir`) which, if set, remembers the current `PERL5LIB`.
+
+-- Bruno
diff --git a/doc/bugs/CamelCase_and_Recent_Changes_create_spurious_Links.mdwn b/doc/bugs/CamelCase_and_Recent_Changes_create_spurious_Links.mdwn
new file mode 100644
index 000000000..de95fb7d3
--- /dev/null
+++ b/doc/bugs/CamelCase_and_Recent_Changes_create_spurious_Links.mdwn
@@ -0,0 +1,11 @@
+Hi folks,
+
+This is a fairly fresh wiki. I recently noticed that the Links: section at the bottom looked like this:
+
+Links: index recentchanges/change 0b2f03d3d21a3bb21f6de75d8711c73df227e17c recentchanges/change 1c5b830b15c4f2f0cc97ecc0adfd60a1f1578918 recentchanges/change 20b20b91b90b28cdf2563eb959a733c6dfebea7a recentchanges/change 3377cedd66380ed416f59076d69f546bf12ae1e4 recentchanges/change 4c53d778870ea368931e7df2a40ea67d00130202 recentchanges/change 7a9f3c441a9ec7e189c9df322851afa21fd8b00c recentchanges/change 7dcaea1be47308ee27a18f893ff232a8370e348a recentchanges/change 963245d4e127159e12da436dea30941ec371c6be recentchanges/change cd489ff4abde8dd611f7e42596b93953b38b9e1c ...
+
+All of those "recentchanges/change xxxxxxx" links are clickable, but all yield 404 when clicked.
+
+When I disable the CamelCase plugin and rebuild the wiki, all the Links other than index disappear, as they should. Re-enable CamelCase, and they're back.
+
+This is a very simple wiki. Just fresh, only one page other than index (this one), and nothing at all fancy/weird about it.
diff --git a/doc/bugs/Can__39__t_build_2.49__63__.mdwn b/doc/bugs/Can__39__t_build_2.49__63__.mdwn
new file mode 100644
index 000000000..46f5ed2fd
--- /dev/null
+++ b/doc/bugs/Can__39__t_build_2.49__63__.mdwn
@@ -0,0 +1,35 @@
+<strong>Solved - see Buo's suggestion below</strong>
+
+I'm using ikiwiki on a shell account/hosting site where I can build but have no root privileges. I went to build 2.49 last night, but make fails with the following error:
+
+<pre><code>/home/telemachus/bin/perl -Iblib/lib ikiwiki.out -libdir . -setup docwiki.setup -refresh
+docwiki.setup: Failed to load plugin IkiWiki::Plugin::inline: Too many arguments for IkiWiki::htmlize at IkiWiki/Plugin/inline.pm line 359, near "))"
+Compilation failed in require at (eval 14) line 2.
+BEGIN failed--compilation aborted at (eval 14) line 2.
+
+BEGIN failed--compilation aborted at (eval 10) line 21.
+
+make: *** [extra_build] Error 255</code></pre>
+
+I looked at that patch of inline.pm, but nothing obvious jumped out at me, and I'm reluctant to tinker with it on my own. Other details that may be useful:
+
+* perl is 5.10.0 (built myself on the account, rather than the system perl)
+* The host system is Debian Etch
+* Kernel is a custom 2.6.16-xen kernel
+* Gcc 4.1.2
+
+Any thoughts or suggestions are appreciated. Thanks.
+
+> See this report: [[2.45_Compilation_error]]
+
+> I don't know if this is the same problem you're seeing, but it looks similar.
+> The issue is unresolved AFAICT, but I haven't had a chance to try 2.49 yet.
+
+> The only workaround I've found is to completely remove any previous ikiwiki
+> installation that might be present on the system. --Buo
+
+Thanks, Buo. I removed the traces of the previous ikiwiki, and I was able to install. No luck getting the new search plugin to work, but we'll leave that for another day. Thanks again.
+
+Quick follow-up: 2.50 compiled and installed without a hitch. Just thought I would mention it...
+
+> [[closing|done]] as a duplicate of [[2.45_Compilation_error]]
diff --git a/doc/bugs/Can__39__t_connect_to_ikiwiki_git_repo_through_http.mdwn b/doc/bugs/Can__39__t_connect_to_ikiwiki_git_repo_through_http.mdwn
new file mode 100644
index 000000000..decfd7269
--- /dev/null
+++ b/doc/bugs/Can__39__t_connect_to_ikiwiki_git_repo_through_http.mdwn
@@ -0,0 +1,18 @@
+When I try to clone or pull the ikiwiki repo via the http interface described on the [[git]] page,
+I get the following error message:
+
+ fatal: http://git.ikiwiki.info/ikiwiki.git/info/refs not found: did you run git update-server-info on the server?
+
+Using the "git" interface still works, but I can't access that through a firewall.
+
+Also, clicking on the "gitweb" link gets me to the front page of ikiwiki.info, not the gitweb page.
+
+-- [[KathrynAndersen]]
+
+> I've updated the gitweb link. Now that ikiwiki's wiki is managed by
+> ikiwiki-hosting, [this bug](http://ikiwiki-hosting.branchable.com/todo/readonly_git-http-backend/)
+> needs to be fixed there before git http can be used. In the meantime,
+> the github mirror does provide it: <http://github.com/joeyh/ikiwiki.git>
+> --[[Joey]] [[!tag done]]
+
+>> Thanks. -- [[KathrynAndersen]]
diff --git a/doc/bugs/Can__39__t_create_root_page.mdwn b/doc/bugs/Can__39__t_create_root_page.mdwn
new file mode 100644
index 000000000..91c2eae60
--- /dev/null
+++ b/doc/bugs/Can__39__t_create_root_page.mdwn
@@ -0,0 +1,69 @@
+This is a link to a non-existent page in the root directory: [[/root_page_test]] (\[[/root\_page\_test]])
+
+When you click on the question mark to create the page, you get *Error: bad page name*. It's a valid [[wikilink]], shouldn't it create the page? --[[sabr]]
+
+> Is it a valid wikilink? Should Iki prevent the page from being created? --[[sabr]], 2 months later
+
+This type of page name (with a leading slash) also gets created by the aggregate plugin, e.g. /cgi-bin/ikiwiki.cgi?page=%2FCalculated_Risk&from=news%2FAll_Stories&do=create. I'm now pretty convinced that Iki should handle this without error. I'll investigate if I can find the time.
+
+> As documented on [[ikiwiki/subpage/linkingrules]], such an absolute
+> link works perfectly when the linked page already exists.
+>
+> The CGI behaviour is thus not consistent with the general linking
+> rules, which is annoying for me: I'm using templates to generate
+> links to pages that may not exist yet, and I would like the "right"
+> path to be selected by default, instead of the usual
+> <current>/subdir/subpage, when a user clicks the "?" link to create
+> the missing page; that's why I'm using absolute paths.
+>
+>> Totally agree, this had only not been addressed due to lack of time on
+>> my part. (I have about 50 ikiwiki things on my todo list.) --[[Joey]]
+>
+> Anyway, having the CGI consider invalid an otherwise valid wikilink
+> seems a bit weird to me, so I had a look at the code, and here is a
+> patch that should fix this issue; I proceeded the only way I could
+> find to prevent side-effects: the only place where I use `$origpage`
+> is in a match, so no function at all is fed a `$page` with a
+> leading slash:
+>
+> -- intrigeri
+
+
+ diff --git a/IkiWiki/CGI.pm b/IkiWiki/CGI.pm
+ index 99cead6..23d9616 100644
+ --- a/IkiWiki/CGI.pm
+ +++ b/IkiWiki/CGI.pm
+ @@ -305,9 +305,11 @@ sub cgi_editpage ($$) {
+ my $page=$form->field('page');
+ $page=possibly_foolish_untaint($page);
+ if (! defined $page || ! length $page ||
+ - file_pruned($page, $config{srcdir}) || $page=~/^\//) {
+ + file_pruned($page, $config{srcdir})) {
+ error("bad page name");
+ }
+ + my $origpage=$page;
+ + $page =~ s#^/##;
+
+ my $baseurl=$config{url}."/".htmlpage($page);
+
+ @@ -425,6 +427,7 @@ sub cgi_editpage ($$) {
+ $from ne $form->field('from') ||
+ file_pruned($from, $config{srcdir}) ||
+ $from=~/^\// ||
+ + $origpage=~/^\// ||
+ $form->submitted eq "Preview") {
+ @page_locs=$best_loc=$page;
+ }
+
+
+> [[Applied|done]]. BTW, I also accept full git changesets, if you like
+> having your name in commit logs. :-)
+
+>> Thanks. I'm considering setting up a public Git repository with topic branches, so that:
+
+>> - I can simply ask you to pull from there, next time
+>> - I have a tool to go on learning the beast (i.e. Git)
+
+>> -- intrigeri
+
+[[!tag patch]]
diff --git a/doc/bugs/Can__39__t_deplete_page__63__.mdwn b/doc/bugs/Can__39__t_deplete_page__63__.mdwn
new file mode 100644
index 000000000..5ae034e99
--- /dev/null
+++ b/doc/bugs/Can__39__t_deplete_page__63__.mdwn
@@ -0,0 +1,8 @@
+The issue from [[plugins/shortcut/discussion|plugins/shortcut/discussion]] has been fixed.
+I wanted to remove the old text from that page, as it serves no further value.
+On web-editing I erased all the text, entered a change notice and selected
+*Save Page*.
+I was, however, thrown back to the web-editing frame, with the old text in it
+restored, instead of the page being cleared.
+
+>> [[done]] --[[Joey]]
diff --git a/doc/bugs/Can__39__t_rebuild_wiki_pages_with_ikiwiki_2.49.mdwn b/doc/bugs/Can__39__t_rebuild_wiki_pages_with_ikiwiki_2.49.mdwn
new file mode 100644
index 000000000..86e719ea3
--- /dev/null
+++ b/doc/bugs/Can__39__t_rebuild_wiki_pages_with_ikiwiki_2.49.mdwn
@@ -0,0 +1,44 @@
+I've just upgraded ikiwiki from version 2.45 to 2.49, using my own Ubuntu Gutsy
+backport. Now I can't rebuild all my wiki pages. Both methods of rebuilding pages fail:
+
+ $ sudo ikiwiki-mass-rebuild
+ Processing /home/ptecza/path/to/ikiwiki.setup as user ptecza ...
+ failed to set egid 1000 4 20 24 25 29 30 44 46 104 109 110 119 1000 (got back 1000 1000 119 110 109 104 46 44 30 29 25 24 20 4) at /usr/sbin/ikiwiki-mass-rebuild line 38, <$list> line 13.
+ Processing /home/ptecza/path/to/ikiwiki.setup as user ptecza failed with code 65280
+
+ $ ikiwiki --setup ikiwiki.setup
+ pomyślnie utworzono /var/www/path/to/ikiwiki.cgi
+ pomyślnie utworzono /home/ptecza/path/to/hooks/post-commit.ikiwiki
+ terminate called after throwing an instance of 'Xapian::InvalidArgumentError'
+ Aborted
+
+I've installed all the needed packages for the new search engine and added the path
+to the `omega` binary in my `ikiwiki.setup` file.
+
+Any ideas how to fix that problem? --[[Paweł|ptecza]]
+
+> Well, it's two separate problems. Xapian is crashing in the C code when
+> asked to create a stemmer for `pl`. This is a Xapian bug, but I've put
+> in a workaround.
+>
+> For the first problem, it looks like I need a more robust grouplist
+> comparator -- fixed in git.
+>
+> [[done]]
+> --[[Joey]]
+
+>> Thanks a lot for the rapid fix, Joey! Now my ikiwiki works well for me :)
+>>
+>> BTW, why have you replaced Hyper Estraier with Xapian? It seems the new
+>> search engine is faster, but I'm not sure it has as wide a syntax.
+>> Also I can't see how to change the number of hits per page... --[[Paweł|ptecza]]
+
+>>> Xapian indexes more quickly, and with the perl interface I was able to
+>>> make updates for changed pages quite efficient. My experience with
+>>> Hyper Estraier has not been good, with its database often breaking, and
+>>> it sometimes crashing. Xapian also does a ranked search, and supports
+>>> searching for specific metadata like "title:foo". --[[Joey]]
+
+>>>> Thank you very much for the reply! I have never had problems with
+>>>> Hyper Estraier, but I'm not a long-time user of that search engine.
+>>>> It's good to know about Xapian's pros and Hyper Estraier's cons. --[[Paweł|ptecza]]
diff --git a/doc/bugs/Cannot_inline_pages_with_apostrophes_in_title.mdwn b/doc/bugs/Cannot_inline_pages_with_apostrophes_in_title.mdwn
new file mode 100644
index 000000000..3e1fe823e
--- /dev/null
+++ b/doc/bugs/Cannot_inline_pages_with_apostrophes_in_title.mdwn
@@ -0,0 +1,7 @@
+If I create a page whose title contains an apostrophe, then inlining that
+page produces nothing. It looks like the inline plugin is failing to do
+the translation from apostrophe to `_39_` that other parts of the system do, so although one can make wikilinks to such pages and have them detected as existing (for instance, by the conditional plugin), inline looks in the wrong place and doesn't see the page.
+
+> I can't reproduce that (btw, an apostrophe would be `__39__`) --[[Joey]]
+
+[[done]]
diff --git a/doc/bugs/Changing_language.mdwn b/doc/bugs/Changing_language.mdwn
new file mode 100644
index 000000000..42fa1c97e
--- /dev/null
+++ b/doc/bugs/Changing_language.mdwn
@@ -0,0 +1,9 @@
+Sorry, this is certainly easy but I can't find the way to change the default language.
+
+My $LANG was fr_FR.UTF-8 when I installed the 1.40 package. It didn't take it.
+
+How can I set up ikiwiki to display French (or German) translated strings?
+
+>Oops, sorry, it's OK, thanks; it's just in the setup file...
+
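+> For reference, the relevant setting in `ikiwiki.setup` looks like this
+> (the French value is just an example):
+>
+>     locale => 'fr_FR.UTF-8',
+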
+>> Ok, marking this [[bugs/done]] --[[Joey]]
diff --git a/doc/bugs/Checksum_errors_on_the_pristine-tar_branch.mdwn b/doc/bugs/Checksum_errors_on_the_pristine-tar_branch.mdwn
new file mode 100644
index 000000000..471dad98c
--- /dev/null
+++ b/doc/bugs/Checksum_errors_on_the_pristine-tar_branch.mdwn
@@ -0,0 +1,112 @@
+I'm in the process of installing ikiwiki on my home page (hooray), and want to have the newest stable version available. I suppose that's the one on the `pristine-tar` branch.
+
+> You can check out the latest released version with:
+>
+> git tag # outputs a list of tags
+> git checkout 3.20110124 # or use the latest one, if different
+>
+> If you're using git already, there's no need to use pristine-tar,
+> unless you particularly want a tarball for some reason.
+>
+> Downloading the tarball from Debian is the other recommended way to
+> [[download]] the source code. --[[smcv]]
+
+>> Thanks for your responses, smcv. I'll use that method and install the newest version when I'm more familiar with the way ikiwiki works. For now I'm using version 3.20100122 installed with apt-get. Works great so far, but I'm looking forward to the new install. -- [[sunny256]] 2011-02-22 19:30+0100
+
+But I'm unable to recreate the newest `.tar` file; in fact there are errors in all these `.tar.gz` files on that branch:
+
+* `ikiwiki_2.20.tar.gz`
+* `ikiwiki_2.30.tar.gz`
+* `ikiwiki_2.31.1.tar.gz`
+* `ikiwiki_2.46.tar.gz`
+* `ikiwiki_2.47.tar.gz`
+* `ikiwiki_2.48.tar.gz`
+* `ikiwiki_2.49.tar.gz`
+* `ikiwiki_2.50.tar.gz`
+* `ikiwiki_2.51.tar.gz`
+* `ikiwiki_2.62.1.tar.gz`
+* `ikiwiki_2.62.tar.gz`
+* `ikiwiki_3.20101129.tar.gz`
+* `ikiwiki_3.20101201.tar.gz`
+* `ikiwiki_3.20101231.tar.gz`
+* `ikiwiki_3.20110105.tar.gz`
+* `ikiwiki_3.20110122.tar.gz`
+* `ikiwiki_3.20110123.tar.gz`
+* `ikiwiki_3.20110124.tar.gz`
+
+The operation fails on these files with a "Checksum validation failed" error from `xdelta`(1). The `pristine-tar`(1) version is 1.00, installed with `apt-get` on Ubuntu 10.04.2 LTS. Is this version too old, or are there some errors on this branch?
+
+> I get similar errors on Debian unstable, but not on all of the same versions;
+> for instance, my `ikiwiki_3.20110124.tar.gz` is OK. In some cases, xdelta
+> complains, but the tarball is produced successfully. However, I do see actual
+> failures for 2.62 and 2.62.1, for instance. --[[smcv]]
+
+> Yes, on Debian unstable I got failures on only old ones, but not in
+> contiguous blocks: --[[Joey]]
+>
+> ikiwiki_2.20.tar.gz
+> ikiwiki_2.30.tar.gz
+> ikiwiki_2.31.1.tar.gz
+> ikiwiki_2.46.tar.gz
+> ikiwiki_2.47.tar.gz
+> ikiwiki_2.48.tar.gz
+> ikiwiki_2.49.tar.gz
+> ikiwiki_2.50.tar.gz
+> ikiwiki_2.51.tar.gz
+> ikiwiki_2.62.1.tar.gz
+> ikiwiki_2.62.tar.gz
+>
+> Probably what would help debug this problem is if someone can
+> reproduce with one or more of the other ones that do **not** fail
+> for me, pass `-dk` to pristine-tar, and send me a copy of its temp directory
+> (joey@kitenet.net), and the versions of pristine-tar, tar, gzip.
+> Then I can compare the good and bad recreated
+> tarballs and identify the difference. Or pass them to the tar developers,
+> who have helped before.
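+>
+> For example (a sketch; `-d` turns on debug output and `-k` keeps the
+> temporary directory around for inspection):
+>
+>     pristine-tar -d -k checkout ikiwiki_2.62.tar.gz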
+>
+> The only cause that I can think of is that perhaps tar's output
+> has changed compared with the version used to create those. The
+> only tar output change I know of involved filenames that were
+> exactly 100 bytes long -- and pristine-tar 1.11 works around that
+> when run with tar 1.25-2 on Debian. FWIW, I am only seeing
+> this in ikiwiki's pristine-tar info, not other packages'.
+> (Checked all of debhelper's and alien's and etckeeper's
+> and pristine-tar's tarballs.) --[[Joey]]
+>
+>> It looks as though I only get the same failures as you, so that's no help
+>> (reassuring, though, since we're presumably both running recent Debian).
+>> sunny256's failure cases might just result from the older tar and pristine-tar
+>> on Ubuntu 10.04? --[[smcv]]
+
+>>> Yes, I can reproduce the same failures sunny256 saw using Debian oldstable. Once I
+>>> upgrade pristine-tar and tar, it goes away, so I think it is the 100
+>>> byte filename bug affecting those.
+>>>
+>>> As to the ones we all see fail, I dunno what it is, but probably
+>>> has to do with some kind of historical issue in the versions of
+>>> pristine-tar/tar used to create them. We may never know what went wrong
+>>> there. --[[Joey]] [[done]]
+
+A complete output of the "pristine-tar checkout" of all files is stored at <https://gist.github.com/836720>.
+
+For now, I'll download the `.tar.gz` from <http://packages.debian.org/unstable/source/ikiwiki>, or maybe install `ikiwiki_3.20110124_all.deb`. Would you recommend using that `.deb` file on Ubuntu 10.04.2 LTS, or is it Debian-specific? -- [[sunny256]] 2011-02-21 08:42+0100
+
+> The .deb from Debian unstable is likely to work on Ubuntu; I've
+> generally been able to compile snapshots on Debian unstable and
+> install them onto Debian lenny (older than that Ubuntu release)
+> without modification. If in doubt, build it from source. --[[smcv]]
+
+> > The .deb file `ikiwiki_3.20110124_all.deb` from Debian unstable seems to
+> > work great. I'm now the happy user of the newest stable version, yay. There
+> > were some errors or warnings, though. This is the first one:
+
+> > > `You are overwriting a locally defined method (finished) with an accessor
+> > > at /usr/lib/perl5/Moose/Meta/Attribute.pm line 570`
+
+> > Along with loads of other suspicious stuff. Have posted the whole output at
+> > <https://gist.github.com/842789>. I'll dig around a bit in the source to
+> > see if there's something I need to worry about. It looks good so far.
+> > --&nbsp;[[sunny256]]&nbsp;<small>2011-02-24&nbsp;20:27Z</small>
+
+> > > Looks like a bug in [[!cpan Net::Amazon::S3::Client::Bucket]] or in something
+> > > it uses, rather than in ikiwiki itself. --[[smcv]]
diff --git a/doc/bugs/Command-line_arguments_should_override_settings_in_the_setup_file.mdwn b/doc/bugs/Command-line_arguments_should_override_settings_in_the_setup_file.mdwn
new file mode 100644
index 000000000..22b1c2cb5
--- /dev/null
+++ b/doc/bugs/Command-line_arguments_should_override_settings_in_the_setup_file.mdwn
@@ -0,0 +1,32 @@
+In setting up my wiki I followed the [[setup]] instructions, which point
+to an ikiwiki.setup file that contains "verbose => 0".
+
+I hadn't noticed that setting in there, but later when I changed my
+standard command of:
+
+ ikiwiki --setup ikiwiki.setup
+
+to:
+
+ ikiwiki --verbose --setup ikiwiki.setup
+
+I was quite surprised that that change didn't have any effect.
+
+So two suggestions to fix this:
+
+1. Make command-line arguments override settings in the setup file
+
+> This is difficult to do, since reading a setup file replaces values for
+> config items with the values in the setup file. Also, when you say
+> --setup foo, you're asking ikiwiki to set up the wiki using the
+> configuration in file foo. Which is what it does.
+
+2. Comment out all settings in the example setup file that are simply
+ setting options to their default values. That way, the file will
+ still be self-documenting, but command-line arguments will at least
+ work for these settings while they remain commented out.
+
+> I've done that, I also fixed some issues with --verbose handling earlier.
+> I'm pretty sure that those fixes fix the real issue, so calling this
+> [[done]].
+> --[[Joey]]
diff --git a/doc/bugs/Comments_are_not_sorted_by_their_date_attribute.mdwn b/doc/bugs/Comments_are_not_sorted_by_their_date_attribute.mdwn
new file mode 100644
index 000000000..5a4c4b2ae
--- /dev/null
+++ b/doc/bugs/Comments_are_not_sorted_by_their_date_attribute.mdwn
@@ -0,0 +1,71 @@
+(brief, sorry, via phone; more details to follow)
+
+I am gradually splitting discussion pages into separate comment pages, each containing a comment directive.
+
+The "date" attribute is being set to the date output by git for a commit. (I'd hope this is parseable.)
+
+The presentation of the resulting comments is not sorted by this date, as I would hope/expect, but instead by the ctime or mtime of the file at the other end, as best I can tell.
+
+-- [[Jon]]
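+
+For reference, such a split-out comment page contains a directive along
+these lines (the values here are illustrative):
+
+    \[[!comment format=mdwn
+     username="jon"
+     date="2011-01-01T12:00:00Z"
+     content="""
+     Example comment body.
+     """]]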
+
+> Yes, comments are displayed via an inline, and usual [[pagespec/sorting]]
+> (eg, default of when the file was first seen) is used. The comment
+> date only affects the date displayed.
+>
+> > That's not what I intended - it's meant to be more or less just
+> > syntactic sugar for `\[[!meta date=foo]]`, setting the `%pagectime`.
+> > The code looks as though it ought to work, but perhaps it's buggy?
+> > (edited to add: it is, see below) --[[smcv]]
+>
+> The only time I've seen this be much problem personally is when moving
+> a page, which means moving its comments directory, which tends to
+> jumble the order. (And --gettime does not help, as ikiwiki does not
+> tell git to follow renames for speed reasons.)
+>
+> I wonder if it wouldn't be best to just get rid of the extra date
+> inside the comment, and rely on the file date as is done for other pages.
+> Thoughts [[smcv]]?
+>
+> Alternatively, since comments tend to be named "comment_N_.....",
+> adding a new [[pagespec/sorting]] method that sorts by filename,
+> rather than by title, and using it by default for comments might be
+> better than the current situation. --[[Joey]]
+
+>> Since git does not track file time, I tend to prefer to encode date
+>> stuff inside files where possible. For other pages, I put an explicit
+>> [[plugins/meta]] date into the source when I create the page. I've
+>> had to reconstruct ordering after moving to a different git checkout
+>> after a server move before, it was painful ☺
+>>
+>> In my current situation, I could live with by-filename ordering. By-title
+>> ordering would also be workable. — [[Jon]]
+
+>>> I agree with Jon's reasons for embedding an explicit date in the file.
+>>> As I said, this is *meant* to work, but it might not.
+>>>
+>>> Sorting by filename would only be useful with
+>>> [[!cpan Sort::Naturally]], since normal `cmp` ordering would break pages
+>>> with more than 9 comments. --s
+
+----
+
+[[!template id=gitbranch author="[[smcv]]" branch=smcv/comments-metadata]]
+
+I thought that, as internal pages, comments were not preprocessed
+(and so their date attributes did not have a chance to take effect) until
+they were already being inlined, by which time they have already been
+sorted by the files' ctimes. Actually, I was wrong about that - internal
+pages have a special case elsewhere - but they did skip the `scan` hook,
+which is also fixed in my branch.
+
+The real bug was that the preprocess hook for comments didn't run
+in the scan phase; my branch fixes that, streamlines that hook a bit
+when run in the scan phase (so it doesn't htmlize, and only runs nested
+directives in scan mode), and adds a regression test. --[[smcv]]
+
+[[!tag patch]]
+
+> Thanks.. I am not 100% sure if I just forgot to scan internal pages
+> or left it out as some kind of optimisation since none needed to be
+> scanned. Anyway, if it was an optimisation it was not much of one
+> since they were preprocessed. All applied, [[done]]. --[[Joey]]
diff --git a/doc/bugs/Comments_dissapeared.mdwn b/doc/bugs/Comments_dissapeared.mdwn
new file mode 100644
index 000000000..787f18c98
--- /dev/null
+++ b/doc/bugs/Comments_dissapeared.mdwn
@@ -0,0 +1,69 @@
+Although I have comments enabled and I have been using them successfully for ages now, I've come to notice that they have stopped working in the last week or two.
+
+I am running version 3.20100312 with the following configuration:
+
+<http://static.natalian.org/2010-03-27/natalian.txt>
+
+In my (HTML5-modified) page.tmpl it doesn't seem to enter the "TMPL_IF COMMENTS" block anymore. I tried the stock page.tmpl and comments didn't seem to work either, so the variable name hasn't changed, has it?
+
+Any other ideas? With thanks,
+
+ comments_pagespec => 'archives/* and !*/Discussion',
+
+> Your setup file only allows comments on pages under archives. That
+> seems unlikely to be right, so I guess it is causing your problem.
+> --[[Joey]]
+
+That's the only place where I want comments. <http://natalian.org/archives/>
+Has the pagespec changed? Is it `archives/*/*` or something like that?
+
+It worked just fine with this configuration. I swear I have not modified it. :) -- [[Kai Hendry]]
+
+> No changes that I can think of. 'archives/*' will match *all* pages under
+> archives. Anyway, I can see in your site's rss feed that comments are
+> enabled for posts, since they have comments tags there. And
+> in fact I see comments on eg
+> <http://natalian.org/archives/2010/03/25/BBC_News_complaints/>.
+>
+> So I suspect you have simply not rebuilt your wiki after making some
+> change that fixed the comments, and so only newer pages are getting them.
+> --[[Joey]]
+
+I have tried rebuilding on my squeeze system and comments still don't appear. Any clues on how to debug this?
+<http://natalian.org/comments/>
+
+I was worried it was due to a time-skew problem I was experiencing on my VPS in the last month, though the time is right now and comments still do not appear on blog posts like <http://natalian.org/archives/2010/03/25/BBC_News_complaints/>
+
+# Debugging templates
+
+`sudo apt-get install libhtml-template-compiled-perl`
+
+ hendry@webconverger templates$ cat test-template.perl
+ #!/usr/bin/perl
+ use HTML::Template::Compiled;
+ local $HTML::Template::Compiled::DEBUG = 1;
+ my $htc = HTML::Template::Compiled->new(
+ filename => "$ARGV[0]",
+ );
+ eval {
+ print $htc->output;
+ };
+ if ($@) {
+ # reports as text
+ my $msg = $htc->debug_code;
+ # reports as a html table
+ my $msg_html = $htc->debug_code('html');
+ }
+ hendry@webconverger templates$ ./test-template.perl page.tmpl
+ Missing closing tag for 'IF' atend of page.tmpl line 159
+
+
+I think the problem is that it used to be `<TMPL_IF COMMENTS>` and now it is `<TMPL_IF NAME="COMMENTS">`?
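+
+For reference, the two spellings look like this in a template; plain
+HTML::Template accepts both, but stricter engines such as
+HTML::Template::Compiled may only accept the explicit `NAME=` form:
+
+    <TMPL_IF COMMENTS>...</TMPL_IF>
+    <TMPL_IF NAME="COMMENTS">...</TMPL_IF>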
+
+# Solved
+
+Merging the templates in master into my [html5](http://git.webconverger.org/?p=ikiwiki;a=shortlog;h=refs/heads/html5) branch looks like it has solved the problem. Also see [[bugs/html5_support]].
+
+[[bugs/done]]
diff --git a/doc/bugs/Comments_link_is_to_index.html_if_usedirs_is_on.mdwn b/doc/bugs/Comments_link_is_to_index.html_if_usedirs_is_on.mdwn
new file mode 100644
index 000000000..6df3ccd9c
--- /dev/null
+++ b/doc/bugs/Comments_link_is_to_index.html_if_usedirs_is_on.mdwn
@@ -0,0 +1,5 @@
+When a page links to its own #comments anchor you get a link like
+"index.html#comments" rather than "./#comments". Fixed in commit 0844bd0b
+on my 'comments' branch. --[[smcv]]
+
+[[!tag patch done]]
diff --git a/doc/bugs/Convert___34__somehost.com/user__34___OpenID_at_RecentChanges_page.mdwn b/doc/bugs/Convert___34__somehost.com/user__34___OpenID_at_RecentChanges_page.mdwn
new file mode 100644
index 000000000..c331bf3a7
--- /dev/null
+++ b/doc/bugs/Convert___34__somehost.com/user__34___OpenID_at_RecentChanges_page.mdwn
@@ -0,0 +1,44 @@
+Don't panic, Joey! ;) It's not a bug. It's only my small wish :)
+
+Could you also please convert "somehost.com/user" OpenIDs to
+"user [somehost.com]" on the RecentChanges page? You already do it
+for "user.somehost.com" OpenIDs. I think that the same display form
+for all OpenIDs is a good idea. What's your opinion?
+
+> It's probably ok to do this, although there's the potential it might
+> misfire on some url. I've implemented it on a trial basis. [[bugs/done]] --[[Joey]]
+
+>> Thanks! What version of ikiwiki do you use at the ikiwiki.info
+>> site? I can still see my "getopenid.com/ptecza" OpenID on the
+>> RecentChanges page there... --[[Paweł|ptecza]]
+
+>>> I use the latest version I trust enough..
+
+>>>> Joey, your regexp is not good enough. Now I can see "tecza [getopenid.com]"
+>>>> instead of "ptecza [getopenid.com]" (you cut the first character of my login).
+>>>> --[[Paweł|ptecza]]
+
+>>>>> fixed now
+
+>>>>>> Yes, now it's OK. Thank you very much! --[[Paweł|ptecza]]
+
+BTW, Happy New Year to you, Debian 'etch', and all ikiwiki
+users! :) --[[Paweł|ptecza]]
+
+>> I've just noticed that the '/' character in the title of my wish
+>> wasn't escaped, and ikiwiki created two pages instead of one.
+>> It's a bug, of course. --[[Paweł|ptecza]]
+
+>>> Sorry for the confusion! I meant two files (a directory and a page),
+>>> instead of one (a page).
+
+>>>> Yep, fixed. --[[Joey]]
+
+>>>>> Awesome! :) Thanks a lot!
+
+>>> BTW, I have some problems with signing in using the getopenid.com
+>>> server, so I also use the myopenid.com server :) --[[Paweł|ptecza]]
+
+>>>> They were having some glitches last week.. --[[Joey]]
+
+>>>>> It's good to know :) --[[Paweł|ptecza]]
diff --git a/doc/bugs/Disappearing_Pages.mdwn b/doc/bugs/Disappearing_Pages.mdwn
new file mode 100644
index 000000000..5ad198d37
--- /dev/null
+++ b/doc/bugs/Disappearing_Pages.mdwn
@@ -0,0 +1,41 @@
+I have a problem where pages within the rendered wiki become empty. The
+headers, footers, and side panel are there, but the body is completely
+missing. If I do a web edit and change anything (adding whitespace is
+enough), committing the change updates the page and the body appears. If
+I then do a rebuild of the wiki from the command line, I get the blank
+pages again. I have debug turned up, but I don't see anything that makes me
+suspect anything. When I do a rebuild from the command line I get the
+following warning.
+
+>Use of uninitialized value in substitution (s///) at /usr/share/perl5/IkiWiki/Plugin/inline.pm line 234.
+
+The odd thing is that I have 5 other wikis on this same system and none of
+them seem to be experiencing the same problems. The only difference seems
+to be the use of sidebars and google calendar in the affected wiki.
+
+> Could you post a tarball of the wiki and any setup file you use somewhere
+> so I can try to reproduce your problem? --[[Joey]]
+>>[The Wiki](http://www.lcsee.wvu.edu/~scotth/sysstaff.tar.gz)
+>> I think it has something to do with the plugin selection. --[[ScottHenson]]
+
+>>> Ok, I built your wiki, and got no contentless pages here. I also
+>>> didn't see the uninitialized value warning, which could be connected.
+>>> However, I that uninitialized value come from an inline directive,
+>>> and the wiki source doesn't seem to use inlining at all, so I'm confused
+>>> about that. --[[Joey]]
+
+>>>> Sorry, that's my fault. The wiki that was having the problem had some
+>>>> information that I couldn't distribute, so I reproduced the bug on
+>>>> another wiki and sent you that. Those warnings don't seem to have any
+>>>> effect on the disappearing content. Sorry for the confusion. --[[ScottHenson]]
+
+>>>> That's ok, but since I couldn't reproduce it with the data you sent,
+>>>> I can't really fix it. --[[Joey]]
+
+>>>>> Can you not reproduce the warning or not reproduce the disappearing
+>>>>> pages? The warning does not seem to have anything to do with the
+>>>>> error of the disappearing pages. --[[ScottHenson]]
+
+>>>>>> I can't reproduce any disappearing text. --[[Joey]]
+
+Marking this [[unreproducible|done]]. --[[Joey]]
diff --git a/doc/bugs/Discussion_link_not_translated_after_page_update.mdwn b/doc/bugs/Discussion_link_not_translated_after_page_update.mdwn
new file mode 100644
index 000000000..e32c71972
--- /dev/null
+++ b/doc/bugs/Discussion_link_not_translated_after_page_update.mdwn
@@ -0,0 +1,27 @@
+Joey, I have a problem with the translation of the Discussion link in my backport
+of ikiwiki 1.38.
+
+I've just noticed that it's not translated after a page update. In syslog
+I can see **English** ikiwiki messages. It's strange, but the problem
+doesn't occur if I rebuild all my ikiwiki pages via the command line
+(`ikiwiki --setup ikiwiki.setup`). In syslog I can see **Polish** messages then.
+
+Unfortunately I don't know another Polish user of ikiwiki, so I can't ask him
+to confirm the problem. Maybe Victor Moral can do it? He probably uses ikiwiki
+with his Spanish translation.
+
+--[[Paweł|ptecza]]
+
+> Well, do you have your setup file configured to use a polish locale?
+> --[[Joey]]
+
+>> Now I have :) I've just generated pl\_PL.UTF-8 locale via
+>> `dpkg-reconfigure locales` and set locale hash to 'pl_PL.UTF-8'
+>> in my `ikiwiki.setup` file. I also ran `ikiwiki --setup ikiwiki.setup`
+>> and restarted my Apache2 server. Unfortunately, I can still see
+>> the "Discussion" link instead of the "Dyskusja" link after any page update
+>> via WWW. --[[Paweł|ptecza]]
+
+>>> A setlocale issue. Now [[bugs/done]] --[[Joey]]
+
+>>>> I can confirm. Now it works :) Thanks a lot for the fix! --[[Paweł|ptecza]] \ No newline at end of file
diff --git a/doc/bugs/Discussion_link_not_translated_in_post.mdwn b/doc/bugs/Discussion_link_not_translated_in_post.mdwn
new file mode 100644
index 000000000..60e87524a
--- /dev/null
+++ b/doc/bugs/Discussion_link_not_translated_in_post.mdwn
@@ -0,0 +1,67 @@
+In the post I sent (without Polish characters in the title, of course ;) ) I can
+still see the "Discussion" link instead of the Polish "Dyskusja" link. --[[Paweł|ptecza]]
+
+> I don't know what post you're referring to; more details, please.
+> --[[Joey]]
+
+>> Sorry for the laconic bug report. I meant my blog and the post I sent
+>> to it. It works exactly like your blog on the bugs page, and I created it
+>> for my ikiwiki users to report bugs :)
+
+>> So, I sent the post to my blog and I can see that the "Discussion" link
+>> from the `inlinepage.tmpl` file is not translated into Polish. Now I hope
+>> you know what I mean :) --[[Paweł|ptecza]]
+
+>>> Joey, what about my bug report? ;) --[[Paweł|ptecza]]
+
+>>>> Found and [[fixed|bugs/done]] --[[Joey]].
+
+>>>>> Hm. I can't see any changes. I've built an ikiwiki 1.41 Debian package
+>>>>> from the latest SVN repo sources and installed it on my machine.
+>>>>> I've also rebuilt all my ikiwiki pages (`ikiwiki --setup ikiwiki.setup`).
+
+>>>>> I added a few debug lines to the changed block of code from
+>>>>> `/usr/share/perl5/IkiWiki.pm` file:
+>>>>>
+>>>>> open(LOG, ">>/var/log/ikiwiki.log");
+>>>>> print LOG "(1) \$config{locale}=$config{locale}\n";
+>>>>> print LOG "(1) \$ENV{LANG}=$ENV{LANG}\n";
+>>>>> if (defined $config{locale}) {
+>>>>> eval q{use POSIX};
+>>>>> error($@) if $@;
+>>>>> print LOG "(2) \$config{locale}=$config{locale}\n";
+>>>>> print LOG "(2) \$ENV{LANG}=$ENV{LANG}\n";
+>>>>> if (POSIX::setlocale(&POSIX::LC_ALL, $config{locale})) {
+>>>>> $ENV{LANG}=$config{locale};
+>>>>> $gettext_obj=undef;
+>>>>> print LOG "(3) \$config{locale}=$config{locale}\n";
+>>>>> print LOG "(3) \$ENV{LANG}=$ENV{LANG}\n";
+>>>>> }
+>>>>> }
+>>>>> close(LOG);
+>>>>>
+>>>>> Here is a piece of result after rebuild:
+>>>>>
+>>>>> (1) $config{locale}=pl_PL.UTF-8
+>>>>> (1) $ENV{LANG}=pl_PL.UTF-8
+>>>>> (2) $config{locale}=pl_PL.UTF-8
+>>>>> (2) $ENV{LANG}=pl_PL.UTF-8
+>>>>> (3) $config{locale}=pl_PL.UTF-8
+>>>>> (3) $ENV{LANG}=pl_PL.UTF-8
+>>>>>
+>>>>> Is this useful information for you? :) --[[Paweł|ptecza]]
+
+>>>>>> Not really... I was able to reproduce the problem you described and
+>>>>>> my changes fixed the problem I reproduced. --[[Joey]]
+
+> Found and fixed one more, when per-post discussion links are used in a
+> blog. --[[Joey]]
+
+>> Yes, now it's fixed :) Thank you very much, Joey!
+
+>> BTW, what about translating buttons to editing page ("Save Page",
+>> "Preview" and "Cancel")? They are still hard-coded :( --[[Paweł|ptecza]]
+
+>>> Those come via the formbuilder form, and currently my best plan for
+>>> them is to add templates for them, as described in [[translation]]
+>>> --[[Joey]]
diff --git a/doc/bugs/Discussion_of_main_page_generates_invalid_link.mdwn b/doc/bugs/Discussion_of_main_page_generates_invalid_link.mdwn
new file mode 100644
index 000000000..5080ac62e
--- /dev/null
+++ b/doc/bugs/Discussion_of_main_page_generates_invalid_link.mdwn
@@ -0,0 +1,3 @@
+The [[index/discussion]] for the main page is located at /index/discussion. In the header ikiwiki generates "ikiwiki / index / discussion" instead of "ikiwiki / discussion" like it should. /index/ contains no index.html file, since it is at /index.html instead. The link should either be removed, or a copy of /index.html should be placed in /index/. --[[TaylorKillian]]
+
+> [[done]] since a while ago --[[Joey]]
diff --git a/doc/bugs/Does_IkiWiki::Setup::load__40____41___really_return_a_hash__63__.mdwn b/doc/bugs/Does_IkiWiki::Setup::load__40____41___really_return_a_hash__63__.mdwn
new file mode 100644
index 000000000..6facec896
--- /dev/null
+++ b/doc/bugs/Does_IkiWiki::Setup::load__40____41___really_return_a_hash__63__.mdwn
@@ -0,0 +1,10 @@
+Yes, it's me again :-)
+
+Shouldn't this print a bunch of output (admittedly not very nicely formatted)?
+<pre>
+[ 10 rocinante ~/tmp ] ikiwiki --dumpsetup foo.setup
+[ 11 rocinante ~/tmp ] perl -M'IkiWiki::Setup' -e 'print IkiWiki::Setup::load("foo.setup");'
+</pre>
+I get nothing with ikiwiki 2.63 [[DavidBremner]]
+
+> The docs were wrong, it populates `%config`. --[[Joey]] [[done]]
diff --git a/doc/bugs/Dupe_entry_in_Bundle::IkiWiki::Extras.pm.mdwn b/doc/bugs/Dupe_entry_in_Bundle::IkiWiki::Extras.pm.mdwn
new file mode 100644
index 000000000..236112786
--- /dev/null
+++ b/doc/bugs/Dupe_entry_in_Bundle::IkiWiki::Extras.pm.mdwn
@@ -0,0 +1,5 @@
+Authen::Passphrase
+
+is entered twice in the .pm file.
+
+[[done]]
diff --git a/doc/bugs/Encoding_problem_in_calendar_plugin.mdwn b/doc/bugs/Encoding_problem_in_calendar_plugin.mdwn
new file mode 100644
index 000000000..80e9f2c82
--- /dev/null
+++ b/doc/bugs/Encoding_problem_in_calendar_plugin.mdwn
@@ -0,0 +1,73 @@
+Hello,
+
+I studied this [[guy's problem|forum/Encoding_problem_in_french_with_ikiwiki-calendar]] and I propose here a (dirty) hack to correct it.
+
+Bug summary: when using the [[calendar plugin|plugins/calendar]] in French (`LANG=fr_FR.UTF-8`), "Décembre" (French for "December") is rendered as "DÃ©cembre".
+
+I managed to track this problem down to an encoding problem of `POSIX::strftime` in `Ikiwiki/Plugin/calendar.pm`. I used [[this guy's solution|http://www.perlmonks.org/?node_id=857018]] to solve the problem (the diff is printed below).
+
+The problem is that I do not know Perl, encoding is one of the things I would be happy not to dive into, and this is the first time I am contributing to Ikiwiki: I copied and made a few changes to code I found without fully understanding it. So I am not sure that my code is neat, or works in every situation. Feel free to (help me) improve it!
+
+Cheers,
+Louis
+
+> Yes, this seems basically right. I've applied a modified version of this.
+> [[done]]
+> --[[Joey]]
+
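The symptom is classic double-encoding: UTF-8 bytes reinterpreted as Latin-1. A small Python illustration of both the mojibake and the idea behind the `strftime_utf8` wrapper below (decode raw byte output using the encoding named by the locale); the helper name is illustrative, not part of ikiwiki:

```python
# Symptom: UTF-8 bytes for "Décembre" reinterpreted as Latin-1 give mojibake.
mojibake = "Décembre".encode("utf-8").decode("latin-1")
print(mojibake)  # DÃ©cembre

# The fix mirrors the strftime_utf8 wrapper in the patch: decode strftime's
# raw byte output with the encoding named by the locale (e.g. the "UTF-8"
# part of "fr_FR.UTF-8"), falling back to ASCII when the locale names none.
def decode_by_locale(raw, locale_name):
    encoding = locale_name.split(".", 1)[1] if "." in locale_name else "ascii"
    return raw.decode(encoding)

print(decode_by_locale("Décembre".encode("utf-8"), "fr_FR.UTF-8"))  # Décembre
```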
+
+ diff --git a/IkiWiki/Plugin/calendar.pm b/IkiWiki/Plugin/calendar.pm
+ index c7d2b7c..1345939 100644
+ --- a/IkiWiki/Plugin/calendar.pm
+ +++ b/IkiWiki/Plugin/calendar.pm
+ @@ -22,7 +22,14 @@ use warnings;
+ use strict;
+ use IkiWiki 3.00;
+ use Time::Local;
+ -use POSIX ();
+ +
+ +use POSIX qw/setlocale LC_TIME strftime/;
+ +use Encode;
+ +my ($strftime_encoding)= setlocale(LC_TIME)=~m#\.([^@]+)#;
+ +sub strftime_utf8 {
+ +# try to return an utf8 value from strftime
+ + $strftime_encoding ? Encode::decode($strftime_encoding, &strftime) : &strftime;
+ +}
+
+ my $time=time;
+ my @now=localtime($time);
+ @@ -123,10 +130,10 @@ sub format_month (@) {
+ }
+
+ # Find out month names for this, next, and previous months
+ - my $monthabbrev=POSIX::strftime("%b", @monthstart);
+ - my $monthname=POSIX::strftime("%B", @monthstart);
+ - my $pmonthname=POSIX::strftime("%B", localtime(timelocal(0,0,0,1,$pmonth-1,$pyear-1900)));
+ - my $nmonthname=POSIX::strftime("%B", localtime(timelocal(0,0,0,1,$nmonth-1,$nyear-1900)));
+ + my $monthabbrev=strftime_utf8("%b", @monthstart);
+ + my $monthname=strftime_utf8("%B", @monthstart);
+ + my $pmonthname=strftime_utf8("%B", localtime(timelocal(0,0,0,1,$pmonth-1,$pyear-1900)));
+ + my $nmonthname=strftime_utf8("%B", localtime(timelocal(0,0,0,1,$nmonth-1,$nyear-1900)));
+
+ my $archivebase = 'archives';
+ $archivebase = $config{archivebase} if defined $config{archivebase};
+ @@ -182,7 +189,7 @@ EOF
+ my %dowabbr;
+ for my $dow ($week_start_day..$week_start_day+6) {
+ my @day=localtime(timelocal(0,0,0,$start_day++,$params{month}-1,$params{year}-1900));
+ - my $downame = POSIX::strftime("%A", @day);
+ + my $downame = strftime_utf8("%A", @day);
+ my $dowabbr = substr($downame, 0, 1);
+ $downame{$dow % 7}=$downame;
+ $dowabbr{$dow % 7}=$dowabbr;
+ @@ -329,8 +336,8 @@ EOF
+ for (my $month = 1; $month <= 12; $month++) {
+ my @day=localtime(timelocal(0,0,0,15,$month-1,$params{year}-1900));
+ my $murl;
+ - my $monthname = POSIX::strftime("%B", @day);
+ - my $monthabbr = POSIX::strftime("%b", @day);
+ + my $monthname = strftime_utf8("%B", @day);
+ + my $monthabbr = strftime_utf8("%b", @day);
+ $calendar.=qq{\t<tr>\n} if ($month % $params{months_per_row} == 1);
+ my $tag;
+ my $mtag=sprintf("%02d", $month);
diff --git a/doc/bugs/Error:_OpenID_failure:_time_bad_sig:.mdwn b/doc/bugs/Error:_OpenID_failure:_time_bad_sig:.mdwn
new file mode 100644
index 000000000..2fa4a4759
--- /dev/null
+++ b/doc/bugs/Error:_OpenID_failure:_time_bad_sig:.mdwn
@@ -0,0 +1,83 @@
+Sometimes when I try to login (to edit a page) here then I can see
+the following error message:
+
+ Error: OpenID failure: time_bad_sig:
+
+Please note that the authorization process succeeds on the OpenID server side.
+
+It occurs both for [getopenid.com](http://www.getopenid.com/)
+and [myopenid.com](http://www.myopenid.com/) servers I use.
+
+I'm reporting this, but I'm not sure whether a problem is with your
+ikiwiki or my OpenID servers. --[[Paweł|ptecza]]
+
+> I've seen this too, once or twice (using myopenid), and reauthenticating
+> fixed it -- so I can't reproduce it reliably to work on it. I think I've
+> seen it both on this wiki and on the one running on my laptop.
+>
+> The perl openid client module seems
+> to fail with time_bad_sig if the time in the signature from the other end
+> is "faked". I'm not 100% sure what this code does yet:
+
+ # check age/signature of return_to
+ my $now = time();
+ {
+ my ($sig_time, $sig) = split(/\-/, $self->args("oic.time") || "");
+ # complain if more than an hour since we sent them off
+ return $self->_fail("time_expired") if $sig_time < $now - 3600;
+    # also complain if the signature is from the future by more than 30 seconds,
+ # which compensates for potential clock drift between nodes in a web farm.
+ return $self->_fail("time_in_future") if $sig_time - 30 > $now;
+ # and check that the time isn't faked
+ my $c_secret = $self->_get_consumer_secret($sig_time);
+ my $good_sig = substr(OpenID::util::hmac_sha1_hex($sig_time, $c_secret), 0, 20);
+ return $self->_fail("time_bad_sig") unless $sig eq $good_sig;
+ }
+
+> At least it doesn't seem to be a time sync problem since the test for too
+> early/too late times have different error messages.. --[[Joey]]
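The quoted check signs the timestamp with a consumer secret and compares against the first 20 hex characters of an HMAC-SHA1 digest. A rough Python sketch of the same logic (the real code is Perl in the Net::OpenID consumer module; function and constant names here are illustrative):

```python
import hashlib
import hmac
import time

def check_oic_time(value, secret, now=None):
    """Validate an "oic.time" value of the form "<sig_time>-<sig>"."""
    now = int(time.time()) if now is None else now
    sig_time_str, _, sig = value.partition("-")
    sig_time = int(sig_time_str)
    if sig_time < now - 3600:   # sent more than an hour ago
        return "time_expired"
    if sig_time - 30 > now:     # from the future beyond allowed clock drift
        return "time_in_future"
    # "check that the time isn't faked": recompute the truncated HMAC
    good_sig = hmac.new(secret, sig_time_str.encode(), hashlib.sha1).hexdigest()[:20]
    return "ok" if hmac.compare_digest(sig, good_sig) else "time_bad_sig"

t = int(time.time())
sig = hmac.new(b"consumer-secret", str(t).encode(), hashlib.sha1).hexdigest()[:20]
print(check_oic_time(f"{t}-{sig}", b"consumer-secret"))  # ok
```

So `time_bad_sig` means the timestamp round-tripped intact but the HMAC didn't match, e.g. because the consumer secret changed between sending the user off and their return.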
+
+I've had this problem too, but with my track record of reporting OpenID bugs
+I thought it best if I held my tongue. I usually experience this the first
+time I sign in on any ikiwiki installation of {ikiwiki.kitenet, ikidev,
+betacantrips}, and I think re-logging in always works. --Ethan
+
+> Does seem easier to repro than I thought.
+> Ok, fixed it.. done --[[Joey]] (reopened for new instance of same error
+> message below)
+
+----
+
+## the return of the nasty bug
+
+Hmmmmm, looks like it is not entirely fixed. I am getting it on my own
+[blog](http://blog.tobez.org/). Just upgraded to 3.20110430, same same.
+I am using custom openid with redirection to myopenid.com.
+Please tell me if you need more info. The same openid worked fine to login to *this* site to post this.
+-- Anton
+
+> Well, this bug is from 2007. Probably you are not encountering the same
+> bug.
+>
+> I also have a openid delegation to myopenid, and I can reproduce the
+> problem when logging into your site.
+>
+> What version of the
+> Net::OpenID::Consumer perl library do you have installed? --[[Joey]]
+
+>> It is the latest version from FreeBSD's ports collection,
+>> which happens to be a [slightly patched up](http://www.freebsd.org/cgi/cvsweb.cgi/~checkout~/ports/net/p5-Net-OpenID-Consumer/files/patch-Consumer.pm?rev=1.2;content-type=text%2Fplain)
+>> variant of an
+>> ["unauthorized" CPAN release 1.06](http://search.cpan.org/~gugu/Net-OpenID-Consumer-1.06/)
+>>
+>> Do you think it might be a good idea to try with 1.03 or with an unpatched 1.06?
+>> -- Anton
+
+>>> Absolutely. --[[Joey]]
+
+>>>> 1.03 fails with "Error: login failed, perhaps you need to turn on cookies?" (needless to say cookies are enabled).
+>>>> Unpatched 1.06 fails with "Error: login failed, perhaps you need to turn on cookies?".
+>>>> 1.03 with the same patch fails with "Error: OpenID failure: time_bad_sig:" -- Anton.
+
+>>>>> Investigation revealed it was a bug in the freebsd patch, which I
+>>>>> understand is going to be dealt with. [[done]] --[[Joey]]
diff --git a/doc/bugs/Error:_Your_login_session_has_expired._.mdwn b/doc/bugs/Error:_Your_login_session_has_expired._.mdwn
new file mode 100644
index 000000000..b993cd8e7
--- /dev/null
+++ b/doc/bugs/Error:_Your_login_session_has_expired._.mdwn
@@ -0,0 +1,46 @@
+I keep getting:
+
+ Error: Your login session has expired.
+
+Whilst trying to edit http://hugh.vm.bytemark.co.uk/ikiwiki.cgi via OpenID. Any ideas?
+
+
+ iki@hugh:~$ dpkg -l | grep openid
+    ii  libnet-openid-consumer-perl   0.14-4   library for consumers of OpenID identities
+ iki@hugh:~$
+
+> This error occurs if ikiwiki sees something that looks like a CSRF
+> attack. It checks for such an attack by embedding your session id on the
+> page edit form, and comparing that id with the session id used to post
+> the form.
+>
+> So, somehow your session id has changed between opening the edit form and
+> posting it. A few ways this could happen:
+>
+> * Genuine CSRF attack (unlikely)
+> * If you logged out and back in, in another tab, while the edit form was
+> open.
+> * If `.ikiwiki/sessions.db` was deleted/corrupted while you were in the
+> midst of the edit.
+> * If some bug in CGI::Session caused your session not to be saved to the
+> database somehow.
+> * If your browser didn't preserve the session cookie across the edit
+> process, for whatever local reason.
+> * If you were using a modified version of `editpage.tmpl`, and
+> it did not include `FIELD-SID`.
+> * If you upgraded from an old version of ikiwiki, before `FIELD-SID` was
+> added (<= 2.41), and had an edit form open from that old version, and
+> tried to save it using the new.
+>
+> I don't see the problem editing the sandbox there myself, FWIW.
+> (BTW, shouldn't you enable the meta plugin so RecentChanges displays
+> better?)
+> --[[joey]]
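The check Joey describes can be modelled in a few lines of Python (ikiwiki itself does this in Perl via CGI::Session and the `FIELD-SID` template variable; the function names here are illustrative):

```python
def render_edit_form(session_id):
    # The current session id is embedded in the edit form as a hidden
    # field (what the FIELD-SID template variable provides).
    return {"sid": session_id, "content": ""}

def handle_post(form, current_session_id):
    # If the sid baked into the form no longer matches the session posting
    # it, the session changed mid-edit (or this is a forged cross-site
    # request), so the post is rejected.
    if form.get("sid") != current_session_id:
        raise PermissionError("Your login session has expired.")
    return "saved"

form = render_edit_form("abc123")
print(handle_post(form, "abc123"))  # saved
# handle_post(form, "def456") would raise PermissionError instead.
```

Any of the causes listed above (deleted `sessions.db`, re-login in another tab, a template missing `FIELD-SID`) makes the two ids diverge, which is why they all produce the same error.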
+
+
+Thanks for your excellent analysis. The bug was due to old pre-3.0 **templates** lying around. After deleting them, ikiwiki defaults to its own templates. Clever. :-)
+
+Great, this saved me big time! It is the first Google hit. I had the same problem from accidentally using old templates. Thanks! --[[cstamas]]
+
+[[bugs/done]]
diff --git a/doc/bugs/Error:_no_text_was_copied_in_this_page_--_missing_page_dependencies.mdwn b/doc/bugs/Error:_no_text_was_copied_in_this_page_--_missing_page_dependencies.mdwn
new file mode 100644
index 000000000..0082eed4d
--- /dev/null
+++ b/doc/bugs/Error:_no_text_was_copied_in_this_page_--_missing_page_dependencies.mdwn
@@ -0,0 +1,46 @@
+That one has bitten me for some time; here is the minimal testcase. There is
+also an equivalent (I suppose) problem when using another plugin, but I hope
+it's enough to track it down for this one.
+
+ $ tar -xj < [bug-dep_order.tar.bz2](http://schwinge.homeip.net/~thomas/tmp/bug-dep_order.tar.bz2)
+ $ cd bug-dep_order/
+ $ ./render_locally
+ [...]
+ $ find "$PWD".rendered/ -print0 | xargs -0 grep 'no text was copied'
+ $ [no output]
+ $ touch news/2010-07-31.mdwn
+ $ ./render_locally
+ refreshing wiki..
+ scanning news/2010-07-31.mdwn
+ building news/2010-07-31.mdwn
+ building news.mdwn, which depends on news/2010-07-31
+ building index.mdwn, which depends on news/2010-07-31
+ done
+ $ find "$PWD".rendered/ -print0 | xargs -0 grep 'no text was copied'
+ /home/thomas/tmp/hurd-web/bug-dep_order.rendered/news.html:<p>[[!paste <span class="error">Error: no text was copied in this page</span>]]</p>
+ /home/thomas/tmp/hurd-web/bug-dep_order.rendered/news.html:<p>[[!paste <span class="error">Error: no text was copied in this page</span>]]</p>
+
+This error shows up only for *news.html*, but not in *news/2010-07-31* or for
+the aggregation in *index.html* or its RSS and atom files.
+
+--[[tschwinge]]
+
+> So the cutpaste plugin, in order to support pastes
+> that come before the corresponding cut in the page,
+> relies on the scan hook being called for the page
+> before it is preprocessed.
+>
+> In the case of an inline, this doesn't happen, if
+> the page in question has not changed.
+>
+> Really though it's not just inline, it's potentially anything
+> that preprocesses content. None of those things guarantee that
+> scan gets re-run on it first.
+>
+> I think cutpaste is going beyond the intended use of scan hooks,
+> which is to gather link information, not do arbitrary data collection.
+> Requiring scan be run repeatedly could be a lot more work.
+>
+> Using `%pagestate` to store the cut content when scanning would be
+> one way to fix this bug. It would mean storing potentially big chunks
+> of page content in the indexdb. [[done]] --[[Joey]]
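The `%pagestate` fix amounts to collecting cut snippets during the scan pass and resolving pastes against that state at preprocess time, so a paste can precede (or live on a different page than) its cut. A toy two-pass model in Python (the real plugin is Perl and stores into ikiwiki's `%pagestate`; the directive parsing here is deliberately simplified):

```python
import re

CUT = re.compile(r"\[\[!cut id=(\w+) text=\"([^\"]*)\"\]\]")
PASTE = re.compile(r"\[\[!paste id=(\w+)\]\]")

def scan(pagestate, page, content):
    # Pass 1: record every cut snippet, wherever it appears on the page.
    for snip_id, text in CUT.findall(content):
        pagestate.setdefault(page, {})[snip_id] = text

def preprocess(pagestate, page, content):
    # Pass 2: pastes resolve against state gathered during scanning, so a
    # paste placed before its cut still finds the text -- provided scan
    # actually ran, which is exactly what fails to happen for an inline
    # of an unchanged page.
    def repl(match):
        return pagestate.get(page, {}).get(match.group(1),
                                           "Error: no text was copied")
    return PASTE.sub(repl, CUT.sub("", content))

state = {}
src = '[[!paste id=tip]] ... [[!cut id=tip text="remember to refresh"]]'
scan(state, "news", src)
print(preprocess(state, "news", src))  # starts with "remember to refresh"
```

Skipping the `scan` call before `preprocess` reproduces the "no text was copied" error from the testcase.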
diff --git a/doc/bugs/Exception:_Unknown_function___96__this__39___.mdwn b/doc/bugs/Exception:_Unknown_function___96__this__39___.mdwn
new file mode 100644
index 000000000..189ba740f
--- /dev/null
+++ b/doc/bugs/Exception:_Unknown_function___96__this__39___.mdwn
@@ -0,0 +1,70 @@
+I'm very excited to try out ikiwiki, since it should fit my purposes extremely well, but I'm having trouble with the search plugin. I'm pretty sure that right after I installed ikiwiki and the needed dependencies, the search plugin was working fine. However, now when I try to search, I get an "Exception: Unknown function `this'" error on a blank page. I'm not sure how to go about debugging this issue - my server's error log (I use Lighttpd 1.4.22) has no mention of the exception, and there's nothing in /var/log/syslog either.
+
+What might be causing this exception, and how might I go about debugging it?
+
+> Appears to be coming from your xapian omega cgi binary. If you
+> run `strings /usr/lib/cgi-bin/omega/omega` you can see it has
+> "Exception: " in it, and I have found some similar (but not identical)
+> error messages from xapian in a web search.
+>
+> I don't know what to suggest, other than upgrade/downgrade/reinstall
+> xapian-omega, and contacting the xapian developers for debugging.
+> You could try rebuilding your wiki in case it is somehow
+> caused by a problem with the xapian database. Failing everything, you
+> could switch to [[google_search_plugin|plugins/google]]. --[[Joey]]
+
+>> Thanks, Joey. With your help I was able to figure out what was wrong. It's a fun little bug (or feature): my wiki's title contained the string `$this`, and that's what was causing the omega binary to choke. My wiki's title was inserted without escaping into the query template used by omega. Omega treated `$this` in the title as a function name and threw an exception because no such function was defined. To avoid this behavior, I used an HTML entity in the title, so `$this` became `&#36;this`. I don't think the wiki title should be inserted into the template without escaping - it can produce an error that's not trivial to debug. If users want to modify the HTML in the title, they should edit the respective templates, not type HTML into the wiki title input. What do you think? --[[dkobozev]]
+
+>>> Sounds like a bug in omega, and one that probably would affect other
+>>> users of omega too. Ikiwiki could work around it by pre-escaping
+>>> data before passing it to xapian. I have not quite managed to reproduce it though;
+>>> tried setting a page title to '$this' and 'foo $this'.
+>>> That's with version 1.0.18 of omega.
+>>> --[[Joey]]
+
+>>>> I tried it with both omega 1.0.13 and omega 1.0.18 and the issue is present in both. If I view the contents of {$srcdir}/.ikiwiki/xapian/templates/query, I can see that the wiki title is inserted verbatim and there are calls to `$setmap`, `$set` and `$def` etc in the template. --[[dkobozev]]
+
+>>>>> I don't see how that's relevant. It would help if you showed me
+>>>>> exactly something that could be inserted into a page to cause the
+>>>>> problem. --[[Joey]]
+
+>>>>>> Correct me if I'm wrong: ikiwiki generates an Omega template from its own templates, such as searchquery.tmpl and puts it into {$srcdir}/.ikiwiki/xapian/templates/query. Omega has its own template syntax, where function names are prefixed with dollar signs (`$`). So, when I call my wiki `$foobar`, ikiwiki generates an Omega template that looks like this snippet:
+
+ <div id="container">
+ <div class="pageheader">
+ <div class="header">
+ <span>
+        <a href="http://example.com">$foobar</a>/search
+ </span>
+ </div>
+ </div> <!-- .pageheader -->
+
+ <div id="content">
+ $setmap{prefix,title,S}
+ $setmap{prefix,link,XLINK}
+ $set{thousand,$.}$set{decimal,.}$setmap{BN,,Any Country,uk,England,fr,France}
+ ${
+ $def{PREV,
+ $if{$ne{$topdoc,0},<INPUT TYPE=image NAME="&lt;" ALT="&lt;"
+ SRC="/images/xapian-omega/prev.png" BORDER=0 HEIGHT=30 WIDTH=30>,
+ <IMG ALT="" SRC="/images/xapian-omega/prevoff.png" HEIGHT=30 WIDTH=30>}
+
+>>>>>> So `$foobar` clashes with Omega's template tags. Does this help?
+
+>>>>>>> Ahh. I had somehow gotten it into my head that you were talking
+>>>>>>> about the title of a single page, not of the whole wiki. But
+>>>>>>> you were clear all along it was the wiki title. Sorry for
+>>>>>>> misunderstanding. I've put in a complete fix for this problem.
+>>>>>>> If this were in [[bugs]], I'd close it. :) --[[Joey]]
+
+>>>>>>>> Rather than escaping `$` as an HTML entity, it would be more natural
+>>>>>>>> to escape it as `$$` (since you are escaping it for Omega, not for
+>>>>>>>> the web browser).
+>>>>>>>>
+>>>>>>>> Also if ikiwiki can put arbitrary text inside the parameters of an
+>>>>>>>> OmegaScript command, you should also escape `{`, `}` and `,` as
+>>>>>>>> `$(`, `$)` and `$.`. It's only necessary to do so inside the
+>>>>>>>> parameters of a command, but it will work and be easier to escape
+>>>>>>>> them in any substituted text. --OllyBetts
+
+[[done]]
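OllyBetts' suggestion translates directly into a small escaping helper: replace each OmegaScript-significant character with its literal form before substituting wiki-supplied text into the query template. A hedged sketch in Python (the actual fix is in ikiwiki's Perl code; only the four escapes named above are handled):

```python
# OmegaScript literal escapes: "$$" -> "$", "$(" -> "{", "$)" -> "}", "$." -> ","
OMEGA_ESCAPES = {"$": "$$", "{": "$(", "}": "$)", ",": "$."}

def omega_escape(text):
    # Escape wiki-supplied text before it is substituted into an Omega
    # query template, so a title like "$this" is no longer parsed as an
    # OmegaScript function call.
    return "".join(OMEGA_ESCAPES.get(ch, ch) for ch in text)

print(omega_escape("$foobar"))   # $$foobar
print(omega_escape("a {b}, c"))  # a $(b$)$. c
```

As noted, escaping `{`, `}` and `,` is only strictly required inside command parameters, but escaping them everywhere is harmless and simpler.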
diff --git a/doc/bugs/Excessive_list_nesting_in_map_output_for_subpages.mdwn b/doc/bugs/Excessive_list_nesting_in_map_output_for_subpages.mdwn
new file mode 100644
index 000000000..8678098e7
--- /dev/null
+++ b/doc/bugs/Excessive_list_nesting_in_map_output_for_subpages.mdwn
@@ -0,0 +1,3 @@
+At <http://phd.martin-krafft.net/wiki/factors/tag/community/>, which iterates `link(tag/community)`, the list with the factors is nested within *three* other `<ul>` tags. That seems a bit excessive.
+
+> Think I [[fixed|done]] this as part of fixing the other bug. --[[Joey]]
diff --git a/doc/bugs/Existing_Discussion_pages_appear_as_non-existing.mdwn b/doc/bugs/Existing_Discussion_pages_appear_as_non-existing.mdwn
new file mode 100644
index 000000000..9ba4ede6e
--- /dev/null
+++ b/doc/bugs/Existing_Discussion_pages_appear_as_non-existing.mdwn
@@ -0,0 +1,5 @@
+If you look at [[todo/org mode]], the link to the Discussion page is not there (has a question mark), as if it didn't exist. But--through the search--I discovered that the Discussion page does exist actually: [[todo/org mode/Discussion]].
+
+So, there is a bug that prevents a link to the existing Discussion page from appearing in the correct way on the corresponding main page. --Ivan Z.
+
+Perhaps, this has something to do with the same piece of code/logic (concerning case-sensitivity) as the fixed [[bugs/unwanted discussion links on discussion pages]]? --Ivan Z.
diff --git a/doc/bugs/External_link:_underscore_conversion.mdwn b/doc/bugs/External_link:_underscore_conversion.mdwn
new file mode 100644
index 000000000..6ea421d84
--- /dev/null
+++ b/doc/bugs/External_link:_underscore_conversion.mdwn
@@ -0,0 +1,25 @@
+Hi,
+
+found one strange thing here:
+
+If I enter a link like this
+
+ [#Wikipedia:Mollison]: <http://www.tagari.com/bills_journal>
+
+the underscore appears like this (I inserted a space in the underscore string to make it 'visible'):
+
+ <a href="http://www.tagari.com/billsb14a7b8059d9c05 5954c92674ce60032journal">http://www.tagari.com/billsb14a7b8059d9c05 5954c92674ce60032journal</a>
+
+Am I doing something wrong?
+
+Thanks for your support and best wishes,
+Tobias.
+
+> I believe you're hitting some kind of Markdown-processing bug (so not
+> strictly Ikiwiki related). Could you provide a minimal page source
+> exhibiting the problem, and mention the exact nature of the processor
+> you use? (Markdown, MultiMarkdown, pandoc, ...) --GB
+
+> Insertion of weird hashes into some output is a [known bug](http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=380212) in the old
+> perl markdown. This is one of the main reasons why use of Text::Markdown
+> instead is recommended. --[[Joey]] [[done]]
diff --git a/doc/bugs/External_links_with_Creole.mdwn b/doc/bugs/External_links_with_Creole.mdwn
new file mode 100644
index 000000000..0d743e685
--- /dev/null
+++ b/doc/bugs/External_links_with_Creole.mdwn
@@ -0,0 +1,5 @@
+When using Creole for markup, creating an external link appears to be impossible. Neither \[[Outside URL|http://example.com]] nor <<http://example.com>> nor \[Outside URL]\(http://example.com) works. The first gets rendered as a broken WikiLink, the second gets eaten, and the last is not parsed in any way, so you end up with that exact text in your page.
+
+I'd have made this page in Creole as a practical demonstration, but that doesn't seem possible here. Here's a page with an example: <https://www.icanttype.org//demo/CreoleExternalLinks>
+
+Looking at my example it seems that this is now [[done]] at least in 3.20100815.7 (Debian)
diff --git a/doc/bugs/Fancy_characters_get_munged_on_page_save.mdwn b/doc/bugs/Fancy_characters_get_munged_on_page_save.mdwn
new file mode 100644
index 000000000..d20e5b1f8
--- /dev/null
+++ b/doc/bugs/Fancy_characters_get_munged_on_page_save.mdwn
@@ -0,0 +1,29 @@
+It doesn't happen on ikiwiki.info, but on my site if I enter any of the fancy typography characters (“,”,…, etc.) then when I save the page they are displayed as garbage characters. I've put an example at the top of [this page](http://adam.shand.net/iki/2007/Test_for_possible_ikiwiki_bug.html?updated).
+
+Sorry, I know the title is ridiculously vague for a bug, but I don't know what the proper name for the fancy typography characters is. :-/
+
+-- [[AdamShand]]
+
+> The page is fine if I download it and view it locally, or put it on my
+> laptop's web server. I imagine that the problem is due to your web server
+> not saying that the encoding of the page is utf8. For apache you could
+> try setting this:
+>
+> AddDefaultCharset UTF-8
+>
+> Although I don't need that setting here. Maybe your Apache config has
+> some problem, because your web server is doing this: --[[Joey]]
+
+ Accept-Ranges: bytes
+ Content-Length: 2135
+ Content-Type: text/html; charset=iso-8859-1
+
+ <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
+ "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
+ <html>
+ <head>
+ <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
+
+Thanks for that. I assumed it would be Perl foo. I'll look into the web server config. Though, just thinking some more ... it's weird that it does the correct thing on preview but not on display. Hrm ... -- Adam.
+
+**[[FIXED|done]]** - Adding "AddDefaultCharset UTF-8" to my Apache config fixed the problem. Thanks for the help Joey (especially since it had nothing to do with ikiwiki!). - Adam.
diff --git a/doc/bugs/Feeds_get_wrong_timezone..mdwn b/doc/bugs/Feeds_get_wrong_timezone..mdwn
new file mode 100644
index 000000000..87757883d
--- /dev/null
+++ b/doc/bugs/Feeds_get_wrong_timezone..mdwn
@@ -0,0 +1,10 @@
+I have a file with an mtime of 0312 BST, or 0212 GMT. If the atom/rss feed is
+generated while the system timezone is set to GMT, it's correctly shown as 0212
+GMT. If the system time is set to BST, it's incorrectly shown as having been
+created/modified at 0312 GMT.
+
+This occurs in both RSS and Atom feeds, but not in the HTML pages (which
+suggests it's not the server at fault, or IkiWiki overall, but rather
+whatever's generating the RSS and Atom is converting timezones wrongly).
+
+> [[done]], I *think* (description was not entirely clear) --[[Joey]]
diff --git a/doc/bugs/Feeds_link_to_index.html_instead_of_directory.mdwn b/doc/bugs/Feeds_link_to_index.html_instead_of_directory.mdwn
new file mode 100644
index 000000000..b7efa6a37
--- /dev/null
+++ b/doc/bugs/Feeds_link_to_index.html_instead_of_directory.mdwn
@@ -0,0 +1,37 @@
+When --usedirs is used, RSS and Atom feeds seem to link to the index.html directly, both for the site and for the feed items, instead of the directory, as pages otherwise do.
+
+Thanks, that had been annoying me too. [[done]] --[[Joey]]
+
+Patch:
+
+<pre>
+Index: IkiWiki/Plugin/inline.pm
+===================================================================
+--- IkiWiki/Plugin/inline.pm (revision 3241)
++++ IkiWiki/Plugin/inline.pm (working copy)
+@@ -312,13 +312,13 @@
+ my $page=shift;
+ my @pages=@_;
+
+- my $url=URI->new(encode_utf8($config{url}."/".htmlpage($page)));
++ my $url=URI->new(encode_utf8($config{url}."/".urlto($page, "")));
+
+ my $itemtemplate=template($feedtype."item.tmpl", blind_cache => 1);
+ my $content="";
+ my $lasttime = 0;
+ foreach my $p (@pages) {
+- my $u=URI->new(encode_utf8($config{url}."/".htmlpage($p)));
++ my $u=URI->new(encode_utf8($config{url}."/".urlto($p, "")));
+
+ my $pcontent = absolute_urls(get_inline_content($p, $page), $url);
+
+@@ -415,7 +415,7 @@
+
+ foreach my $page (keys %toping) {
+ my $title=pagetitle(basename($page), 0);
+- my $url="$config{url}/".htmlpage($page);
++ my $url="$config{url}/".urlto($page, "");
+ foreach my $pingurl (@{$config{pingurl}}) {
+ debug("Pinging $pingurl for $page");
+ eval {
+</pre>
diff --git a/doc/bugs/Filenames_with_colons_cause_problems_for_Windows_users.mdwn b/doc/bugs/Filenames_with_colons_cause_problems_for_Windows_users.mdwn
new file mode 100644
index 000000000..7559e6d0a
--- /dev/null
+++ b/doc/bugs/Filenames_with_colons_cause_problems_for_Windows_users.mdwn
@@ -0,0 +1,75 @@
+(Note: feel free to say "not a bug" or offer a workaround rather than changing ikiwiki.)
+
+As reported by a Windows user trying ikiwiki: because Windows doesn't support filenames with colons, he couldn't check out the ikiwiki svn repository. More generally, ikiwiki doesn't encode colons in filenames for wiki pages, but to support Windows users perhaps it should.
+
+Windows does not support filenames containing any of these characters: `/ \ * : ? " < > |`
+
+> I take it this is a problem when checking out a wiki in windows, not when
+> browsing to urls that have colons in them from windows? --[[Joey]]
+
+>> Correct. You can't directly check out a wiki's repository from Windows if it includes filenames with those characters; you will get errors on those filenames.
+
+>>> Ok, first, if a windows user fails to check out ikiwiki's own svn^Wgit
+>>> repo on windows due to the colons, that seems to be a bug in svn^Wgit
+>>> on windows -- those programs should deal with colons in filenames being
+>>> checked in/out somehow. Like they deal with windows using backslash
+>>> rather than slash, presumably. And there's nothing ikiwiki can do if
+>>> the source repo it's working on has a file with a problem character
+>>> added to it, since the breakage will happen at the revision control
+>>> system level.
+
+>>>> Just a quick note that the version control community generally doesn't
+>>>> agree with that view. They'll store what you ask them to store. If you
+>>>> want to work cross platform, then you need to make sure that all
+>>>> your file names work on all the platforms you're interested in. (Note: many systems will
+>>>> warn on commit, but not all. Many systems also have a way to fix
+>>>> the problem without checking out, but not all.) Another common place for this to
+>>>> arise is case insensitive file systems. If you have two files committed
+>>>> that differ only in case, then you cannot check out on a Mac in most systems.
+
+>>> OTOH, there are some simple mods to ikiwiki that can make it escape
+>>> colons etc the same way it already escapes other problem characters
+>>> like "*", "?", etc. Without actually testing it, it should suffice to
+>>> edit `IkiWiki.pm` and modify `titlepage` and `linkpage`, removing the
+>>> colon from the character class in each. Also modify the
+>>> `wiki_file_regexp` similarly. Then ikiwiki will read and
+>>> write files with escaped colons, avoiding the problem.
+>>>
+>>> So that's a simple fix, but on the gripping hand, I can't just make
+>>> that change, because it would break all existing unix-based
+>>> wikis that actually contain colons in their filenames, requiring an
+>>> annoying transition. I could do a OS test and do it in Windows, but then
+>>> there would be interop problems if a Windows and non-windows system both
+>>> acted on the same wiki source.
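+The escaping described above could be sketched like this (a Python illustration of the assumed `__<ordinal>__` convention, not ikiwiki's actual Perl; the safe character class below already omits the colon, as the proposed fix would):
+
```python
import re

# Illustrative sketch only: escape every character outside the allowed
# class as __<ordinal>__ (space becomes '_'), and reverse the mapping.
# The exact encoding ikiwiki uses is an assumption here.
SAFE = r'-A-Za-z0-9+/._'   # note: no ':' in the class

def escape_title(title):
    return re.sub(rf'[^{SAFE}]',
                  lambda m: '_' if m.group() == ' ' else f'__{ord(m.group())}__',
                  title)

def unescape_title(name):
    return re.sub(r'__(\d+)__', lambda m: chr(int(m.group(1))), name)

# escape_title("a:b") yields "a__58__b", which is a legal Windows filename.
```
+
+As noted below, the catch is round-tripping: a page whose name already contains the literal escape sequence would change names.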
+
+>>>> I haven't checked the source, but does this need to break existing wikis?
+>>>> I can imagine a system where a colon gets converted to something safe,
+>>>> and the safe encoding gets converted back to a colon. But if you
+>>>> already have a colon, that doesn't get converted and stays a colon, and
+>>>> so it should still work shouldn't it? The only
+>>>> problem would be with pages that already have the 'safe encoding for a colon'.
+>>>> They'll suddenly change names. Well, I should finish frying my current fish
+>>>> before taking on something new, so I'll shut up now :). -- [[Will]]
+
+>>>>> If `linkpage()` is changed to escape colons, then links to pages
+>>>>> with literal colons in their names will stop working; ikiwiki will
+>>>>> instead look for page names with escaped colons. --[[Joey]]
+
+>>> So, I guess it has to be a config option, possibly defaulting on
+>>> when the OS is Windows. And if being able to checkout/etc the wiki
+>>> source on windows systems is desired, you'd have to remember to turn
+>>> that on when setting up a wiki, even if the wiki was hosted on unix.
+>>>
+>>> Ok, `wiki_file_chars` config option added, set to
+>>> `"-[:alnum:]+/._"` to exclude colons from filenames read or written by
+>>> ikiwiki. [[done]]
+>>>
+>>> BTW, I suspect there are lots of other problems with actually running
+>>> ikiwiki on windows, including its assumption that the directory
+>>> separator is "/". Windows will be supported when someone sends me a
+>>> comprehensive and not ugly or performance-impacting patch. :-) --[[Joey]]
+
+> Speaking of Windows filename problems, how do I keep directories ending in a
+> period from being created? The following didn't seem to work.
+> `wiki_file_chars => "-[:alnum:]+/._",`
+> `wiki_file_regex => '[-[:alnum:]+_]$',`
diff --git a/doc/bugs/Git:_changed_behavior_w.r.t._timestamps.mdwn b/doc/bugs/Git:_changed_behavior_w.r.t._timestamps.mdwn
new file mode 100644
index 000000000..164e62075
--- /dev/null
+++ b/doc/bugs/Git:_changed_behavior_w.r.t._timestamps.mdwn
@@ -0,0 +1,214 @@
+After some months, I just updated my local ikiwiki sources, and rebuilt
+the Hurd web pages, <http://git.savannah.gnu.org/cgit/hurd/web.git/>.
+
+I was confused, having switched to the new automatic (thanks!) --gettime
+mechanism, why on some pages the timestamps had changed compared to my
+previous use of --getctime and setting files' mtimes (using a script)
+according to the last Git commit. For example:
+
+community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.html
+
+old:
+
+ Last edited <span class="date">2008-09-11 18:11:53 UTC</span>
+ <!-- Created <span class="date">2008-09-11 17:47:08 UTC</span> -->
+
+new:
+
+ Last edited <span class="date">2008-09-11 18:12:22 UTC</span>
+ <!-- Created <span class="date">2008-09-11 17:47:50 UTC</span> -->
+
+
+I had a look at what git.pm is doing, and began to manually replay /
+investigate:
+
+ $ git log --pretty=fuller --name-only --relative -- community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn
+ commit 8f1b97bfe45b2f173e3a7d55dee226a9e289a695
+ Author: arnebab <arne_bab@web.de>
+ AuthorDate: Thu Sep 11 20:11:53 2008 +0200
+ Commit: arnebab <arne_bab@web.de>
+ CommitDate: Thu Sep 11 20:11:53 2008 +0200
+
+ Added a link to the X.org guide in this wiki.
+
+ community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn
+
+ commit 3ef8b7d80d80572c436c4c60c71879bc74409816
+ Author: arnebab <arne_bab@web.de>
+ AuthorDate: Thu Sep 11 19:47:08 2008 +0200
+ Commit: arnebab <arne_bab@web.de>
+ CommitDate: Thu Sep 11 19:47:08 2008 +0200
+
+ Minor update on the enty trying to get X working -> 'watch this place for updates'
+
+ community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn
+
+OK, these are my old dates.
+
+ $ git log --pretty=format:%ci --name-only --relative -- community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn
+ 2008-09-11 20:11:53 +0200
+ community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn
+
+ 2008-09-11 19:47:08 +0200
+ community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn
+
+ $ git log --pretty=format:%ct --name-only --relative -- community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn
+ 1221156713
+ community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn
+
+ 1221155228
+ community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn
+
+ $ date -d @1221156713
+ Thu Sep 11 18:11:53 UTC 2008
+ $ date -d @1221155228
+ Thu Sep 11 17:47:08 UTC 2008
+
+That's all consistent.
+
+
+But:
+
+ $ perl -le 'use Storable; my $index=Storable::retrieve("indexdb"); use Data::Dumper; print Dumper $index'
+ [...]
+ 'community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn' => {
+ 'ctime' => '1221155270',
+ 'dest' => [
+ 'community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.html'
+ ],
+ 'typedlinks' => {
+ 'tag' => {}
+ },
+ 'mtime' => 1221156742,
+ 'depends_simple' => {
+ 'sidebar' => 1
+ },
+ 'links' => [
+ 'community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x/discussion',
+ 'Hurd/DebianXorg'
+ ],
+ 'state' => {
+ [...]
+
+ $ date -d @1221156742
+ Thu Sep 11 18:12:22 UTC 2008
+ $ date -d @1221155270
+ Thu Sep 11 17:47:50 UTC 2008
+
+That's different, and it matches what the new ikiwiki writes into the
+HTML file.
+
+
+Back to Git again, this time without specifying the file:
+
+ $ git log --pretty=format:%ct --name-only --relative
+ [...]
+ 1221255713
+ 1221255655
+ unsorted/PortingIssues.mdwn
+
+ 1221156742 [Thu Sep 11 18:12:22 UTC 2008]
+ 1221156713 [Thu Sep 11 18:11:53 UTC 2008]
+ community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn
+
+ 1221156267
+ 1221156235
+ index.mdwn
+
+ 1221156122
+ 1221156091
+ index.mdwn
+
+ 1221155942
+ 1221155910
+ index.mdwn
+
+ 1221155270 [Thu Sep 11 17:47:50 UTC 2008]
+ 1221155228 [Thu Sep 11 17:47:08 UTC 2008]
+ community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn
+
+ 1221154986
+ community/gsoc.mdwn
+ community/gsoc/project_ideas.mdwn
+
+ 1221147244
+ whatsnew.html
+ [...]
+
+Aha!
+
+... and some more detail:
+
+ $ git log --pretty=fuller --name-only --relative
+ [...]
+ commit e4e89e1683012c879012522105a3471a00714613
+ Author: Samuel Thibault <samuel.thibault@ens-lyon.org>
+ AuthorDate: Fri Sep 12 23:40:55 2008 +0200
+ Commit: Samuel Thibault <samuel.thibault@ens-lyon.org>
+ CommitDate: Fri Sep 12 23:40:55 2008 +0200
+
+ MSG_NOSIGNAL and IPV6_PKTINFO got fixed
+
+ unsorted/PortingIssues.mdwn
+
+ commit c389fae98dff86527be62f895ff7272e4ab1932c
+ Merge: 0339e3e 8f1b97b
+ Author: GNU Hurd wiki engine <web-hurd@gnu.org>
+ AuthorDate: Thu Sep 11 18:12:22 2008 +0000
+ Commit: GNU Hurd wiki engine <web-hurd@gnu.org>
+ CommitDate: Thu Sep 11 18:12:22 2008 +0000
+
+ Merge branch 'master' of wiki@192.168.10.50:wiki
+
+ commit 8f1b97bfe45b2f173e3a7d55dee226a9e289a695
+ Author: arnebab <arne_bab@web.de>
+ AuthorDate: Thu Sep 11 20:11:53 2008 +0200
+ Commit: arnebab <arne_bab@web.de>
+ CommitDate: Thu Sep 11 20:11:53 2008 +0200
+
+ Added a link to the X.org guide in this wiki.
+
+ community/weblogs/ArneBab/2008-08-02-gnu_hurd_and_x.mdwn
+ [...]
+
+So, merges are involved there.
+
+What the new ikiwiki code does is use the timestamp when the merge was
+done instead of the timestamp when the commit was done. Is this
+intentional? Otherwise I could supply a patch.
+
+--[[tschwinge]]
+
+> In order to be nice and fast, the git backend runs git log once
+> and records data for all files, rather than looking at the log for a
+> given file. So among other things, it does not follow renames.
+>
+> AFAICS, git log only shows merges modifying files if it was a conflicted
+> merge. As the file is then actually modified to resolve the merge
+> I think it makes sense to count the merge as the last modification in
+> that case. --[[Joey]]
+
+>> That'd be reasonable, but `git log` will also show merges that are not
+>> conflicting (as in my case).
+
+>>> Actually when displaying a merge, `git log --stat` only lists files that
+>>> were actually modified in a new way as part of the merge resolution.
+>>> Ie, if the merge resolution only joins together some of the parent
+>>> hunks, the file is not listed as having been modified.
+>>>
+>>> So, no, ikiwiki's use of git log will not show files modified in
+>>> non-conflicting merges.
+>>> --[[Joey]]
+
+>> Yet, I'm not totally disagreeing with your choice. With this `git
+>> log` invocation, you're not able to tell from its output whether a
+>> conflict was resolved or not.
+
+>> Also, it's a bit like the *should we use the **author timestamp** or
+>> **commit timestamp*** discussion. Your code will always use the
+>> latest timestamp.
+
+>> I guess I'll get my head wrapped around that, and it's fine, so this is
+>> [[done]].
+
+>> --[[tschwinge]]
diff --git a/doc/bugs/Git:_web_commit_message_not_utf-8.mdwn b/doc/bugs/Git:_web_commit_message_not_utf-8.mdwn
new file mode 100644
index 000000000..08247dded
--- /dev/null
+++ b/doc/bugs/Git:_web_commit_message_not_utf-8.mdwn
@@ -0,0 +1,17 @@
+The message generated for web commits:
+
+> web commit by mädduck
+
+is not utf-8 encoded before being passed to Git (which uses utf-8 as the default encoding for commit messages). This causes a wrongly-encoded log entry, and makes ikiwiki spew warnings as it creates `recentchanges`:
+
+ utf8 "\xF6" does not map to Unicode at /usr/share/perl5/IkiWiki/Rcs/git.pm line 36, <$OUT> line 57.
+ Malformed UTF-8 character (unexpected non-continuation byte 0x6e, immediately after start byte 0xf6) in pattern match (m//) at /usr/share/perl5/IkiWiki/Rcs/git.pm line 393.
+ utf8 "\xF6" does not map to Unicode at /usr/share/perl5/IkiWiki/Rcs/git.pm line 36, <$OUT> line 5.
+
+(This is version 2.53.3~bpo40+1 for lack of a newer backport for sarge)
+
+Please make sure that commit messages for Git are always utf-8.
+
+This is a change by user `mädduck` to trigger the error.
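+The failure mode can be reproduced in miniature (a Python sketch, not ikiwiki code; the byte values are just the standard Latin-1 and UTF-8 encodings of `ä`):
+
```python
# A latin-1 byte string handed to git is not valid UTF-8, so anything
# reading the log back as UTF-8 hits "unexpected non-continuation byte"
# errors — the same class of error ikiwiki printed above.
name = "mädduck"

latin1_bytes = name.encode("latin-1")   # b'm\xe4dduck'
try:
    latin1_bytes.decode("utf-8")
    decoded_ok = True
except UnicodeDecodeError:
    decoded_ok = False                  # 0xE4 followed by 'd' is malformed UTF-8

utf8_bytes = name.encode("utf-8")       # b'm\xc3\xa4dduck' — safe to hand to git
assert utf8_bytes.decode("utf-8") == name
```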
+
+> [[Fixed|done]] both on the commit and log sides. --[[Joey]]
diff --git a/doc/bugs/Graphviz_plug-in_directive_changed_in_2.60.mdwn b/doc/bugs/Graphviz_plug-in_directive_changed_in_2.60.mdwn
new file mode 100644
index 000000000..2924951ff
--- /dev/null
+++ b/doc/bugs/Graphviz_plug-in_directive_changed_in_2.60.mdwn
@@ -0,0 +1,11 @@
+The directive for the graphviz plug-in changed from 'graph' to 'graphviz' in ikiwiki-2.60. Was this intentional?
+
+If yes, the [[plugins/graphviz]] plug-in documentation needs updating, and a heads-up on the news page might not be a bad idea either.
+
+Personally, I like the new directive name better since it will allow us to add other graph plug-ins later.
+
+ -- [[HenrikBrixAndersen]]
+
+> No, that change was not made intentionally. I don't want to bother people
+> with such a transition. Though it would be nice if it had a less generic
+> name. Changed back. [[!tag done]] --[[Joey]]
diff --git a/doc/bugs/HTML_for_parentlinks_makes_theming_hard.mdwn b/doc/bugs/HTML_for_parentlinks_makes_theming_hard.mdwn
new file mode 100644
index 000000000..bc934d109
--- /dev/null
+++ b/doc/bugs/HTML_for_parentlinks_makes_theming_hard.mdwn
@@ -0,0 +1,45 @@
+I'm trying to make a pretty theme for ikiwiki and I'm making progress (or at least I think I am :-). However I've noticed an issue when it comes to theming. On the front page the wiki name is put inside the "title" span and on all the other pages, it's put in the "parentlinks" span. See here:
+
+From [my dev home page](http://adam.shand.net/iki-dev/):
+
+<code>
+&lt;div class="header">
+&lt;span>
+&lt;span class="parentlinks">
+
+&lt;/span>
+&lt;span class="title">
+adam.shand.net/iki-dev
+&lt;/span>
+&lt;/span>&lt;!--.header-->
+
+&lt;/div>
+</code>
+
+From a sub-page of [my dev home page](http://adam.shand.net/iki-dev/recipes/navajo_fry_bread/):
+
+<code>
+&lt;div class="header">
+&lt;span>
+&lt;span class="parentlinks">
+
+&lt;a href="../">adam.shand.net/iki-dev</a>/
+
+&lt;/span>
+&lt;span class="title">
+recipes
+&lt;/span>
+&lt;/span>&lt;!--.header-->
+
+&lt;/div>
+</code>
+
+I understand the logic behind doing this (on the front page it is the title as well as the name of the wiki) however if you want to do something different with the title of a page vs. the name of the wiki it makes things pretty tricky.
+
+I'll just modify the templates for my own site but I thought I'd report it as a bug in the hopes that it will be useful to others.
+
+Cheers,
+[[AdamShand]].
+
+----
+> I just noticed that it's also different on the comments, preferences and edit pages. I'll come up with a diff and see what you guys think. -- Adam.
diff --git a/doc/bugs/HTML_inlined_into_Atom_not_necessarily_well-formed.mdwn b/doc/bugs/HTML_inlined_into_Atom_not_necessarily_well-formed.mdwn
new file mode 100644
index 000000000..d2f8ca3dc
--- /dev/null
+++ b/doc/bugs/HTML_inlined_into_Atom_not_necessarily_well-formed.mdwn
@@ -0,0 +1,35 @@
+If a blog entry contains a HTML named entity, such as the `&mdash;` produced by [[plugins/rst]] for blockquote citations, it's pasted into the Atom feed as-is. However, Atom feeds don't have a DTD, so named entities beyond `&lt;`, `&gt;`, `&quot;`, `&amp;` and `&apos;` aren't well-formed XML.
+
+Possible solutions:
+
+* Put HTML in Atom feeds as type="html" (and use ESCAPE=HTML) instead
+
+> Are there any particular downsides to doing that ..? --[[Joey]]
+
+>> It's the usual XHTML/HTML distinction. type="html" will always be interpreted as "tag soup", I believe - this may lead to it being rendered differently in some browsers. In general ikiwiki seems to claim to produce XHTML (at least, the default page.tmpl makes it claim to be XHTML Strict). On the other hand, this is a much simpler solution... see escape-feed-html branch in my repository, which I'm now using instead --[[smcv]]
+
+>>> Of course, browsers [probably don't treat xhtml pages as xhtml anyway](http://hixie.ch/advocacy/xhtml).
+>>> And the same content will be treated as html (probably as tag soup) if it's
+>>> in a rss feed.
+
+>>> [[merged|done]]
+
+* Keep HTML in Atom feeds as type="xhtml", but replace named entities with numeric ones,
+ like in the re-escape-entities branch in my repository ([diff here](http://git.debian.org/?p=users/smcv/ikiwiki.git;a=commitdiff;h=c0eb041c65d0653bacf0d4acb7a602e9bda8888e))
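+The entity rewrite in that branch might be sketched as follows (a hypothetical Python illustration, not the branch's actual Perl):
+
```python
import html
import re

# Rewrite named entities into numeric character references, leaving the
# five XML-predefined ones alone so the result stays well-formed XHTML.
XML_PREDEFINED = {"lt", "gt", "amp", "quot", "apos"}

def numeric_entities(text):
    def repl(m):
        name = m.group(1)
        if name in XML_PREDEFINED:
            return m.group(0)
        expanded = html.unescape(m.group(0))
        if expanded == m.group(0):      # unknown entity: leave untouched
            return m.group(0)
        return "".join(f"&#{ord(c)};" for c in expanded)
    return re.sub(r"&(\w+);", repl, text)

# numeric_entities("a &mdash; b") yields "a &#8212; b"
```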
+
+>> I can see why you think this is excessively complex! --[[smcv]]
+
+(Also, the HTML in RSS feeds would probably get better interoperability if it was escaped with ESCAPE=HTML rather than being in a CDATA section?)
+
+> Can't see why? --[[Joey]]
+
+>> For a start, `]]>` in content wouldn't break the feed :-) but I was really thinking of non-XML, non-SGML parsers (more tag soup) that don't understand CDATA (I've suffered from CDATA damage when feeding generated code through gtkdoc, for instance). --[[smcv]]
+
+>>> FWIW, the htmlscrubber escapes the `]]>`. (Wouldn't hurt to make that
+>>> more robust tho.)
+>>>
+>>> ikiwiki has used CDATA from the beginning -- this is the first time
+>>> I've heard about rss 2.0 parsers that didn't know about CDATA.
+>>>
+>>> (IIRC, I used CDATA because the result is more space-efficient and less
+>>> craptacular to read manually.)
diff --git a/doc/bugs/HTML_is_not_update_nor_created_when_editing_markdown_via_CGI.mdwn b/doc/bugs/HTML_is_not_update_nor_created_when_editing_markdown_via_CGI.mdwn
new file mode 100644
index 000000000..b00dd4621
--- /dev/null
+++ b/doc/bugs/HTML_is_not_update_nor_created_when_editing_markdown_via_CGI.mdwn
@@ -0,0 +1,43 @@
+I don't know if I have missed some configuration. But under 1.35, using the CGI to create a login, to log in, and to edit and save the markdown works fine. But the resulting HTML is not generated (updated or created). Using ikiwiki from the command line will then update the HTML for me.
+
+An example of the problem: an edit worked to update the source *.mdwn file, but the www version was not updated,
+so when there is a new page it has a "?Page" link. (So the CGI is working that much.) I edit ("create") by clicking the ? question mark and save, but then receive an HTTP 404 error because the testingpage.html?updated file is not found.
+
+How to get that file generated automatically on save (via edit)?
+
+If this is documented, sorry I missed it.
+
+> If a revision control system is configured, ikiwiki relies on a hook
+> being triggered by its commit to the RCS, which then runs ikiwiki again
+> to do the build, same as happens when a commit is made to the RCS
+> directly. If the appropriate hook is not uncommented and configured in
+> the setup file, you could see the behavior you describe.
+>
+> If no revision control system is used, ikiwiki handles the build after
+> writing the file.
+>
+> --[[Joey]]
+
+>> Thanks for the info. I added --no-rcs to my command-line and it was fixed.
+>> I assumed that using --setup with my configuration file that had no RCS configured
+>> would override the default (subversion). I never configured anything with subversion,
+>> so I don't even know if the content was being committed and if so I don't know where.
+>> Maybe when using --setup the defaults should not be used. Or maybe the manpage can
+>> mention that if no rcs is configured in --setup configuration then it will default to svn.
+>> I wonder what else is going to defaults not in my configuration file.
+>> I don't see any way in my configuration file for disabling this.
+>> (I should figure out what happened with the subversion commits too...)
+>>
+>> --JeremyReed
+
+>>> Defaulting that to svn is a bug. Fixed. In general, there are defaults
+>>> for various things configurable by the config file, but they should not
+>>> cause surprising behavior like this. For example, it defaults to
+>>> supporting discussion pages. (You can see all the defaults near
+>>> the top of `IkiWiki.pm`).
+>>>
+>>> The svn backend would have noticed that your wiki is not in svn, and
+>>> avoided doing anything, BTW.
+>>>
+>>> I've changed the default setting that was making it use svn, so this
+>>> bug is [[bugs/done]] --[[Joey]]
diff --git a/doc/bugs/Highlight_extension_uses_hard_coded_paths.mdwn b/doc/bugs/Highlight_extension_uses_hard_coded_paths.mdwn
new file mode 100644
index 000000000..275661fb8
--- /dev/null
+++ b/doc/bugs/Highlight_extension_uses_hard_coded_paths.mdwn
@@ -0,0 +1,3 @@
+The [[plugins/highlight]] plugin hard-codes some paths at the top of the plugin. This means that you need to edit the ikiwiki source if you have highlight installed in a non-standard location (e.g. if you have done a user-level install of the highlight package).
+
+> configurable now, [[done]] --[[Joey]]
diff --git a/doc/bugs/Http_error_500_when_using_mercurial_backend.mdwn b/doc/bugs/Http_error_500_when_using_mercurial_backend.mdwn
new file mode 100644
index 000000000..638def337
--- /dev/null
+++ b/doc/bugs/Http_error_500_when_using_mercurial_backend.mdwn
@@ -0,0 +1,31 @@
+This is [[done]]
+
+The mercurial backend is broken when no changelog message is given.
+
+Here is a quick patch, partialy copying the svn backend.
+
+ --- /usr/share/perl5/IkiWiki/Rcs/mercurial.pm 2007-03-18 23:19:40.000000000 +0100
+ +++ ./mercurial.pm 2007-03-24 13:11:36.000000000 +0100
+ @@ -70,12 +70,15 @@
+
+ if (defined $user) {
+ $user = possibly_foolish_untaint($user);
+ + $message="web commit by $user".(length $message ? ": $message" : "");
+ }
+ elsif (defined $ipaddr) {
+ $user = "Anonymous from $ipaddr";
+ + $message="web commit from $ipaddr".(length $message ? ": $message" : "");
+ }
+ else {
+ $user = "Anonymous";
+ + $message="web commit by Anonymous".(length $message ? ": $message" : "");
+ }
+
+ $message = possibly_foolish_untaint($message);
+
+> The svn backend puts the user info in the message because that's the only
+> way to store the user info, unlike with mercurial. The svn plugin also
+> removes that info when getting the RecentChanges info. Since mercurial
+> does not do that, it seemed better to me to test for an empty message and
+> set it to a dummy commit string, which I've [[bugs/done]]. --[[Joey]]
+>>Thanks for the correct fix, Joey. --hb \ No newline at end of file
diff --git a/doc/bugs/Hyperestraier_search_plug-in_defective.mdwn b/doc/bugs/Hyperestraier_search_plug-in_defective.mdwn
new file mode 100644
index 000000000..783006535
--- /dev/null
+++ b/doc/bugs/Hyperestraier_search_plug-in_defective.mdwn
@@ -0,0 +1,55 @@
+The map() function used in the hyperestraier search plug-in doesn't work as intended, as illustrated by this simple script:
+
+ #!/usr/bin/perl -w
+ use strict;
+
+ my @foo = (
+ [ qw/foo bar baz/ ],
+ [ qw/fee faa fum/ ],
+ );
+
+ # similar to current ikiwiki code (defective):
+ my @bar = map { "/path/to/$_" foreach @{$_} } @foo;
+
+ # this works:
+ #my @bar = map { map { "/path/to/$_" } @{$_} } @foo;
+
+ foreach (@bar) {
+ print "$_\n";
+ }
+
+Expected output:
+
+ /path/to/foo
+ /path/to/bar
+ /path/to/baz
+ /path/to/fee
+ /path/to/faa
+ /path/to/fum
+
+Current output:
+
+ Useless use of string in void context at perl-map.pl line 10.
+
+The patch below fixes this issue:
+
+ --- IkiWiki/Plugin/search.pm.orig Thu Feb 1 23:52:03 2007
+ +++ IkiWiki/Plugin/search.pm Thu Feb 1 23:52:41 2007
+ @@ -64,8 +64,9 @@
+ debug(gettext("updating hyperestraier search index"));
+ estcmd("gather -cm -bc -cl -sd",
+ map {
+ - Encode::encode_utf8($config{destdir}."/".$_)
+ - foreach @{$renderedfiles{pagename($_)}};
+ + map {
+ + Encode::encode_utf8($config{destdir}."/".$_)
+ + } @{$renderedfiles{pagename($_)}};
+ } @_
+ );
+ estcfg();
+
+[[bugs/done]]; thanks for the patch. Surprised it worked at all since the
+bad code was added (did it?) --[[Joey]]
+
+Thank you for accepting my patch. I can't see how it could ever have worked
+with the previous code, no. --[[Brix|HenrikBrixAndersen]]
diff --git a/doc/bugs/INC_location_not_set_correctly_in_make_test.mdwn b/doc/bugs/INC_location_not_set_correctly_in_make_test.mdwn
new file mode 100644
index 000000000..1d396c85b
--- /dev/null
+++ b/doc/bugs/INC_location_not_set_correctly_in_make_test.mdwn
@@ -0,0 +1,24 @@
+'make test' has the following errors:
+
+Can't locate Locale/gettext.pm in @INC (@INC contains: /home/turian/utils//lib/perl5/site_perl/5.8.8/i386-linux-thread-multi /home/turian/utils//lib/perl5/site_perl/5.8.8 . /usr/lib/perl5/site_perl/5.8.8/i386-linux-thread-multi /usr/lib/perl5/site_perl/5.8.7/i386-linux-thread-multi /usr/lib/perl5/site_perl/5.8.6/i386-linux-thread-multi /usr/lib/perl5/site_perl/5.8.5/i386-linux-thread-multi /usr/lib/perl5/site_perl/5.8.8 /usr/lib/perl5/site_perl/5.8.7 /usr/lib/perl5/site_perl/5.8.6 /usr/lib/perl5/site_perl/5.8.5 /usr/lib/perl5/site_perl /usr/lib/perl5/vendor_perl/5.8.8/i386-linux-thread-multi /usr/lib/perl5/vendor_perl/5.8.7/i386-linux-thread-multi /usr/lib/perl5/vendor_perl/5.8.6/i386-linux-thread-multi /usr/lib/perl5/vendor_perl/5.8.5/i386-linux-thread-multi /usr/lib/perl5/vendor_perl/5.8.8 /usr/lib/perl5/vendor_perl/5.8.7 /usr/lib/perl5/vendor_perl/5.8.6 /usr/lib/perl5/vendor_perl/5.8.5 /usr/lib/perl5/vendor_perl /usr/lib/perl5/5.8.8/i386-linux-thread-multi /usr/lib/perl5/5.8.8) at (eval 254) line 2.
+
+What's weird is that I already have gettext.pm:
+ /home/turian/utils/lib/perl5/lib/i386-linux-thread-multi/Locale/gettext.pm
+
+That directory should be part of @INC, because I have:
+ export PERL5LIB="$PERL5LIB:$UTILS/lib/perl5/lib/i386-linux-thread-multi/"
+in my .bashrc. However, /home/turian/utils/lib/perl5/lib/i386-linux-thread-multi/ does not appear in that @INC line.
+
+How do I get the proper @INC locations set?
+
+> Nothing in ikiwiki touches whatever PERL5LIB setting you may have,
+> so AFAICS, this must be some sort of local configuration problem.
+> How do
+> `/home/turian/utils//lib/perl5/site_perl/5.8.8/i386-linux-thread-multi`
+> and `/home/turian/utils//lib/perl5/site_perl/5.8.8` get into the
+> displayed `@INC`? The likely way seems to be that something in your
+> system sets PERL5LIB to contain those directories, clobbering
+> the earlier setting in your `.bashrc`.
+> --[[Joey]]
+
+[[!tag done]]
diff --git a/doc/bugs/IkiWiki::Setup::load__40____41___broken_outside_ikiwiki__63__.mdwn b/doc/bugs/IkiWiki::Setup::load__40____41___broken_outside_ikiwiki__63__.mdwn
new file mode 100644
index 000000000..e1e29f0a4
--- /dev/null
+++ b/doc/bugs/IkiWiki::Setup::load__40____41___broken_outside_ikiwiki__63__.mdwn
@@ -0,0 +1,20 @@
+Hmm, according to the docs, shouldn't the following work?
+<pre>
+[ 74 ikipostal/test ] ikiwiki --dumpsetup out.setup foo bar
+[ 75 ikipostal/test ] perl -MIkiWiki::Setup -e'IkiWiki::Setup::load("out.setup");'
+out.setup: Can't use an undefined value as an ARRAY reference at /usr/share/perl5/IkiWiki/Setup.pm line 37.
+BEGIN failed--compilation aborted at (eval 15) line 233.
+</pre>
+
+From looking at the code, it seems that a global hash %config should be
+initialized somehow.
+
+This is in ikiwiki 2.62.1. I think this call used to work in 2.54 (when you first refactored the setup IIRC)
+
+[[DavidBremner]]
+>> Updated:
+>> It seems that `%config = IkiWiki::defaultsetup();IkiWiki::Setup::load("file");`
+>> works (after `use IkiWiki; use IkiWiki::Setup;`). Of course the other api
+>> is nicer.
+
+[[done]], sorry for trouble --[[Joey]]
diff --git a/doc/bugs/IkiWiki::Wrapper_should_use_destdir.mdwn b/doc/bugs/IkiWiki::Wrapper_should_use_destdir.mdwn
new file mode 100644
index 000000000..6b02c4186
--- /dev/null
+++ b/doc/bugs/IkiWiki::Wrapper_should_use_destdir.mdwn
@@ -0,0 +1,23 @@
+In IkiWiki/Wrapper.pm, the gen_wrapper function finds out what srcdir and
+destdir are set to in the config, but does not use them.
+
+Later in the sub, when a new wiki.cgi wrapper is being created when calling
+ikiwiki --setup /path/to/setup, it will only work if cgi\_wrapper in the
+config file is set to the full path. Otherwise, it creates wiki.cgi in the
+current working directory. It works with the other wrapper it sets up in
+my config - post\_update (using git), as that shows in the config with a
+full path.
+
+One workaround would be to mention in the setup file that cgi_wrapper has
+to be the full path, not just the file name, but that seems silly when
+destdir is also specified in that file and that's where it should go, and
+$config{destdir} is a known value in the Wrapper.pm file.
+
+> Nowhere in any documentation does
+> it say that cgi\_wrapper is relative to the destdir.
+> As noted in [[discussion]], there are web server setups
+> that require the cgi be located elsewhere.
+> [[done]] --[[Joey]]
+
+>> A comment in the generated setup file that all paths should be full
+>> would prevent my (admittedly dumb) error without any drawbacks.
diff --git a/doc/bugs/IkiWiki::Wrapper_should_use_destdir/discussion.mdwn b/doc/bugs/IkiWiki::Wrapper_should_use_destdir/discussion.mdwn
new file mode 100644
index 000000000..870fa7a66
--- /dev/null
+++ b/doc/bugs/IkiWiki::Wrapper_should_use_destdir/discussion.mdwn
@@ -0,0 +1,4 @@
+Just as a point of information, I do not put my cgi wrapper in the dest
+directory. Instead I configure Apache to relate a specific URI to the cgi via
+ScriptAlias. I would not like things to be changed so that the cgi was put in
+the destdir, so I'd vote instead for a comment in the `setup_file`. -- [[Jon]]
diff --git a/doc/bugs/IkiWiki_does_not_use_file__39__s_mtime_for_Last_Edited.mdwn b/doc/bugs/IkiWiki_does_not_use_file__39__s_mtime_for_Last_Edited.mdwn
new file mode 100644
index 000000000..440596318
--- /dev/null
+++ b/doc/bugs/IkiWiki_does_not_use_file__39__s_mtime_for_Last_Edited.mdwn
@@ -0,0 +1,21 @@
+Is there any way to get IkiWiki to display a file's Last Edited date from the input file's mtime?
+
+I have a bunch of pages that have been imported from another wiki. Their Last Edited dates are all shown as today, the day they were imported, even though their mtimes are all two years ago or older. It's important to me that these files display the correct Last Edited date so that it's obvious that they all appear suitably stale.
+
+The *actual* mtime of the output file can be whatever works best for IkiWiki (as discussed on [[todo/mtime]]). I'd just like IkiWiki to display the correct Last Edited date.
+
+Thanks for any hints! -- [[sabr]]
+
+> I can't reproduce this. ikiwiki _does_ look at the mtime, and that's what
+> it puts in the last edited date. Example:
+>
+> joey@kodama:~/src/ikiwiki>touch -t 198501011111 doc/bar.mdwn
+> joey@kodama:~/src/ikiwiki>make
+> [...]
+> joey@kodama:~/src/ikiwiki>grep Last\ edit html/bar.html
+> Last edited Tue Jan 1 11:11:00 1985
+>
+> Note that if you then touch the file to be even older, it won't notice
+> in a refresh. You'd have to force a rebuild in that case. --[[Joey]]
+
+>> You're right. I thought I *had* rebuilt it but, when I started over from scratch, everything worked. Sorry for the noise! Marking this [[invalid|done]]. --[[sabr]]
diff --git a/doc/bugs/Index_files_have_wrong_permissions.mdwn b/doc/bugs/Index_files_have_wrong_permissions.mdwn
new file mode 100644
index 000000000..7493772bd
--- /dev/null
+++ b/doc/bugs/Index_files_have_wrong_permissions.mdwn
@@ -0,0 +1,14 @@
+ikiwiki has files that are not group-readable:
+
+ -rw------- 1 joseph users 541 Aug 18 12:02 index.atom
+ -rw------- 1 joseph users 2328 Aug 18 12:02 index.html
+ -rw------- 1 joseph users 282 Aug 18 12:02 index.rss
+
+I chmod a+r them, but then when I edit something through the web, its permissions are reverted to only user-readable. How do I resolve this?
+
+ -- [[JosephTurian]]
+
+> My index files have the correct permissions. Have you tried setting 'umask => 022' in your ikiwiki.setup file?
+> -- [[HenrikBrixAndersen]]
+
+> > Thanks Henrik, that worked. Bug is [[bugs/done]]. -- [[JosephTurian]]
diff --git a/doc/bugs/Inline_doesn__39__t_wikilink_to_pages.mdwn b/doc/bugs/Inline_doesn__39__t_wikilink_to_pages.mdwn
new file mode 100644
index 000000000..13b80b436
--- /dev/null
+++ b/doc/bugs/Inline_doesn__39__t_wikilink_to_pages.mdwn
@@ -0,0 +1,96 @@
+It seems that the [[ikiwiki/directive/inline]] directive doesn't generate wikilinks to the pages it includes. For example I would expect the following to inline all bugs inside this bug report:
+
+\[[!inline pages="bugs/* and !*/discussion and backlink(bugs)" feeds=no postform=no archive=yes show="10"]]
+
+and note that it only included the 'normal' wikilinks (and also note that this page is not marked done even though the done page is inlined).
+One might also wonder if inline would make this page link to any internal links on those inlined pages too, but I think
+that would be overkill.
+
+I'm not even really sure if this is a bug or defined behaviour, but I thought it might work and it didn't. Regardless,
+the correct behaviour, whichever is decided, should be documented. -- [[Will]]
+
+It appears that [[ikiwiki/directive/map]] also doesn't wikilink to the pages it links. Perhaps in each of these
+cases there should be another parameter to the directive that allows linking to switched on. Just switching
+it on universally at this point might break a number of people's pagespecs. -- [[Will]]
+
+> There's a simple reason why these directives don't generate a record of a
+> wikilink between them and the pages they include: Semantically, inlining
+> a page is not the same as writing a link to it. Nor is generating a map that
+> lists a page the same as linking to it. I don't think this is a bug.
+> --[[Joey]]
+
+>> Fair enough. I guess we can mark this as [[done]] then.
+>>
+>> Just a bit of background on where I was going here... I was looking for
+>> a simpler way of attacking [[todo/tracking_bugs_with_dependencies]].
+>> In particular, rather than introducing changes to the pagespec definition,
+>> I wondered if you could use wiki pages as the defined pagespec and
+>> introduce a 'match_mutual' function which matches whenever two pages
+>> link to the same third page, then you don't need to alter the pagespec
+>> handling code.
+>>
+>> But that requires being able to use a pagespec to decide what pages
+>> are linked to. e.g. I want to make an 'openbugs' page that links to all
+>> open bugs. Then I could make a 'readybugs' page that links to
+>> `backlink(openbugs) and !mutualLink(openbugs)`. That is, all bugs
+>> that are open and do not themselves link to an open bug.
+>>
+>> The problem with all this is that it introduces an ordering dependency,
+>> as I noted below. I think the original proposal is better, because it
+>> handles that ordering dependency in the definition of the pagespecs.
+>> --[[Will]]
+
+Here is a patch to make map link to its linked pages (when passed `link="yes"`). It is a bit problematic in that it uses a pagespec
+to decide what to link to (which is why I wanted it). However, at the time the pagespec is used the links
+for each page haven't finished being calculated (we're using the pagespec to figure out those links,
+remember). This means that some pagespec match functions may not work correctly. Sigh.
+It would be nice to find a topological ordering of the pages and scan them in that order
+so that everything we need is found before we need it, but this patch doesn't do that (it would be
+complex).
+
+If you just use simple pagespecs you'll be fine. Unfortunately I really wanted this for more complex
+pagespecs. -- [[Will]]
+
+ diff --git a/IkiWiki/Plugin/map.pm b/IkiWiki/Plugin/map.pm
+ index 3284931..57c0a7a 100644
+ --- a/IkiWiki/Plugin/map.pm
+ +++ b/IkiWiki/Plugin/map.pm
+ @@ -13,7 +13,7 @@ use IkiWiki 3.00;
+
+ sub import {
+ hook(type => "getsetup", id => "map", call => \&getsetup);
+ - hook(type => "preprocess", id => "map", call => \&preprocess);
+ + hook(type => "preprocess", id => "map", call => \&preprocess, scan => 1);
+ }
+
+ sub getsetup () {
+ @@ -27,7 +27,9 @@ sub getsetup () {
+ sub preprocess (@) {
+ my %params=@_;
+ $params{pages}="*" unless defined $params{pages};
+ -
+ +
+ + return if (!defined wantarray && !IkiWiki::yesno($params{link}));
+ +
+ my $common_prefix;
+
+ # Get all the items to map.
+ @@ -42,6 +44,9 @@ sub preprocess (@) {
+ else {
+ $mapitems{$page}='';
+ }
+ + if (!defined wantarray && IkiWiki::yesno($params{link})) {
+ + push @{$links{$params{page}}}, $page;
+ + }
+ # Check for a common prefix.
+ if (! defined $common_prefix) {
+ $common_prefix=$page;
+ @@ -62,6 +67,8 @@ sub preprocess (@) {
+ }
+ }
+
+ + return if ! defined wantarray;
+ +
+ # Common prefix should not be a page in the map.
+ while (defined $common_prefix && length $common_prefix &&
+ exists $mapitems{$common_prefix}) {
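With Will's patch applied, the map directive would be invoked with the new parameter like this (hypothetical, since the parameter was never merged):

```
\[[!map pages="bugs/*" link="yes"]]
```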
diff --git a/doc/bugs/Insecure_dependency_in_eval_while_running_with_-T_switch.mdwn b/doc/bugs/Insecure_dependency_in_eval_while_running_with_-T_switch.mdwn
new file mode 100644
index 000000000..c3beb8219
--- /dev/null
+++ b/doc/bugs/Insecure_dependency_in_eval_while_running_with_-T_switch.mdwn
@@ -0,0 +1,98 @@
+Hello,
+
+I had to fix the location of one page in my repo via `svn move` and now I have
+a problem with rebuilding my ikiwiki pages:
+
+ ikiwiki --setup ikiwiki.setup
+ [...]
+ tworzenie strony ubuntu/linki.mdwn
+ tworzenie strony knoppix/bootowalny_pendrive_usb.mdwn
+ tworzenie strony helponformatting.mdwn
+ Insecure dependency in eval while running with -T switch at /usr/share/perl5/IkiWiki/Plugin/conditional.pm line 30.
+ BEGIN failed--compilation aborted at (eval 5) line 110.
+
+I've temporarily disabled the conditional plugin to avoid that bug.
+
+PS. I still use ikiwiki 1.50 backported for Debian 'sarge'.
+
+--[[Paweł|ptecza]]
+
+---
+
+Hello again :)
+
+I've just successfully built an ikiwiki 2.00 package for Debian 'sarge'.
+Unfortunately, I still can't rebuild my ikiwiki pages:
+
+ ikiwiki --setup ikiwiki.setup
+ [...]
+ renderowanie ikiwiki/backport.mdwn
+ renderowanie ikiwiki/instalacja.mdwn
+ renderowanie ikiwiki/problemy.mdwn
+ Insecure dependency in eval while running with -T switch at /usr/share/perl5/IkiWiki.pm line 1005.
+ BEGIN failed--compilation aborted at (eval 5) line 111.
+
+I didn't apply your following old patch against the `IkiWiki.pm` file:
+
+ --- IkiWiki.pm-orig 2007-05-10 11:16:47.000000000 +0200
+ +++ IkiWiki.pm 2007-05-10 11:16:07.000000000 +0200
+ @@ -993,7 +993,18 @@
+ my $spec=shift;
+ my $from=shift;
+
+ - return eval pagespec_translate($spec);
+ + my $pagespec = pagespec_translate($spec);
+ +
+ + my $newpagespec;
+ +
+ + local($1);
+ + if ($pagespec =~ /(.*)/) {
+ + $newpagespec = $1;
+ + } else {
+ + die "oh";
+ + }
+ +
+ + return eval $newpagespec;
+ }
+
+ package IkiWiki::PageSpec;
+
+because `patch` command fails:
+
+ patch -p0 < ../IkiWiki.pm.patch
+ patching file IkiWiki.pm
+ Hunk #1 FAILED at 993.
+ 1 out of 1 hunk FAILED -- saving rejects to file IkiWiki.pm.rej
+
+Could you please fix that patch? I can guess how to do it, but I don't want
+to break the code I distribute in my backport ;)
+
+--[[Paweł|ptecza]]
+
+> It's not my patch.. IIRC my suggestion was simply to do this: --[[Joey]]
+
+ Index: IkiWiki.pm
+ ===================================================================
+ --- IkiWiki.pm (revision 3565)
+ +++ IkiWiki.pm (working copy)
+ @@ -1005,7 +1005,7 @@
+ unshift @params, "location";
+ }
+
+ - my $ret=eval pagespec_translate($spec);
+ + my $ret=eval possibly_foolish_untaint(pagespec_translate($spec));
+ return IkiWiki::FailReason->new("syntax error") if $@;
+ return $ret;
+ }
+
+>> Thanks a lot, Joey! It works :)
+>>
+>> BTW, I was quite sure that you sent me the old patch via e-mail a long time ago.
+>> Maybe I found it at old ikiwiki home page? I don't remember it now.
+>>
+>> --[[Paweł|ptecza]]
+
+----
+
+I'm marking this [[done]] since it only affects sarge. Sarge users should
+use the patch above. --[[Joey]]
diff --git a/doc/bugs/Insecure_dependency_in_mkdir.mdwn b/doc/bugs/Insecure_dependency_in_mkdir.mdwn
new file mode 100644
index 000000000..46011a7e8
--- /dev/null
+++ b/doc/bugs/Insecure_dependency_in_mkdir.mdwn
@@ -0,0 +1,160 @@
+Joey, please see RecentChanges and note that this is my second bug report,
+because the first was unsuccessful (bad characters in the post title?).
+Could you please tidy it up?
+
+> I've fixed that and the bug that caused the dup.
+
+>> Thanks a lot! :)
+
+I've just upgraded my ikiwiki from version 2.11 to the latest version 2.15.
+I use my own rebuilt ikiwiki package on an Ubuntu Gutsy box. Now I can't rebuild
+all my ikiwiki pages, because of the following bug:
+
+ ptecza@anahaim:~/blog$ ikiwiki --setup ikiwiki.setup --getctime --verbose
+ [...]
+ scanning post/2007/12/09/pink-freud-w-cafe-kulturalna.mdwn
+ ikiwiki.setup: Insecure dependency in mkdir while running with -T switch at /usr/share/perl5/IkiWiki.pm line 355.
+ BEGIN failed--compilation aborted at (eval 5) line 151.
+
+I have write permission to the ikiwiki destination directory:
+
+ ptecza@anahaim:~/blog$ ls -ld /var/www/blog/
+ drwxr-xr-x 2 ptecza ptecza 4096 2007-12-17 10:48 /var/www/blog/
+
+I've read the ikiwiki changelog for the previous releases and unfortunately
+I can't see any related entries. Any ideas?
+
+--[[Paweł|ptecza]]
+
+> **Update**: I've gone back to ikiwiki 2.11 and... the bug still exists!
+> Probably the reason is that I've removed all content of `/var/www/blog/`
+> before mass rebuilding. --[[Paweł|ptecza]]
+
+> I can't reproduce this bug with a setup file that tells ikiwiki to
+> write to /var/www/blog, which doesn't exist. I get a "Permission denied"
+> since I can't write to /var/www. If I make the permissions allow me to
+> write to /var/www, it happily creates the blog subdirectory. If the blog
+> subdirectory is already there and I can write to it, that of course also
+> works.
+>
+> I'll need enough information to reproduce the problem before I can fix
+> it. Probably a copy of your setup file, wiki source, and information
+> about how your /var/www is set up. --[[Joey]]
+
+>> Thanks for your efforts, Joey! I sent my `ikiwiki.setup` file to you.
+>> What source do you need? My entire ikiwiki or only some pages?
+>>
+>> These are the settings of the `/var/www/` directory on my Ubuntu Gutsy box:
+>>
+>> ptecza@anahaim:~$ ls -al /var/www/
+>> total 16
+>> drwxr-xr-x 4 root root 4096 2007-11-06 16:25 .
+>> drwxr-xr-x 14 root root 4096 2007-11-06 16:13 ..
+>> drwxr-xr-x 2 root root 4096 2007-11-06 16:13 apache2-default
+>> drwxr-xr-x 5 ptecza ptecza 4096 2007-12-17 16:54 blog
+>>
+>> --[[Paweł|ptecza]]
+
+>> I need a set of files that you know I can use to reproduce the bug.
+>> --[[Joey]]
+
+>>> OK, I've just sent you the URL where you can find all files you need :)
+>>>
+>>> I think I know how to reproduce the bug. You have to erase all files from
+>>> `/var/www/blog` before mass rebuilding. This is my `mass-rebuild.sh` script:
+>>>
+>>> #!/bin/bash
+>>>
+>>> rm -rf /var/www/blog/*
+>>> ikiwiki --setup ikiwiki.setup --getctime --verbose
+>>>
+>>> I noticed that the bug was "resolved" when I added a new entry to my blog
+>>> and committed the changes. Before that I had created all the directories and
+>>> touched empty `*.html` files in the `/var/www/blog` directory. That's probably
+>>> not necessary, because without a new blog revision the bug still existed
+>>> and `ikiwiki` still failed.
+>>>
+>>> --[[Paweł|ptecza]]
+
+>> I'd forgotten about [this perl bug](http://bugs.debian.org/411786).
+>> All I can do is work around it by disabling the taint checking. :-(
+>> (Which I've [[done]].) --[[Joey]]
+
+>>> Ubuntu Gutsy also has Perl 5.8.8-7, so probably it has the bug too.
+>>> --[[Paweł|ptecza]]
+
+>>>> I just got it while building my latest version of git.ikiwiki.info + my stuff.
+>>>> The only thing different in my version of IkiWiki.pm is that I moved a &lt;/a> over
+>>>> a word (for createlink), and disabled the lowercasing of created pages. Running
+>>>> Lenny's Perl. --[[simonraven]]
+
+>>>> Simon, I'm not clear what version of ikiwiki you're using.
+>>>> Since version 2.40, taint checking has been disabled by
+>>>> default due to the underlying perl bug. Unless you
+>>>> build ikiwiki with NOTAINT=0. --[[Joey]]
+
+>>>> Hi, nope not doing this. Um, sorry, v. 3.13. I've no idea why it suddenly started doing this.
+>>>> It wasn't before. I've been messing around in IkiWiki.pm to see if I can set
+>>>> a umask for `mkdir`.
+
+>>>> At line 775 and down:
+>>>>
+>>>>     + umask ($config{umask} || 0022);
+>>>>
+>>>> I figured it *might* be the `umask`, but I'll see when/if it gets past that in the build. No; I keep getting garbage during the brokenlinks test:
+
+<pre>
+t/basewiki_brokenlinks.....Insecure dependency in mkdir while running with -T switch at IkiWiki.pm line 776.
+
+# Failed test at t/basewiki_brokenlinks.t line 11.
+
+# Failed test at t/basewiki_brokenlinks.t line 19.
+
+
+broken links found
+&lt;li>shortcut from &lt;a href="./shortcuts/">shortcuts&lt;/a>&lt;/li>&lt;/ul>
+
+
+
+# Failed test at t/basewiki_brokenlinks.t line 25.
+Insecure dependency in mkdir while running with -T switch at IkiWiki.pm line 776.
+
+# Failed test at t/basewiki_brokenlinks.t line 11.
+
+# Failed test at t/basewiki_brokenlinks.t line 25.
+# Looks like you failed 5 tests of 12.
+dubious
+ Test returned status 5 (wstat 1280, 0x500)
+</pre>
+
+>>>> I get this over and over... I haven't touched that AFAICT, at all. --[[simonraven]]
+
+>>>>> Take a look at your `/usr/bin/ikiwiki`. The first
+>>>>> line should not contain -T. If it does, remove it,
+>>>>> and maybe try to work out or give details about how
+>>>>> you installed ikiwiki and why it got the -T in there,
+>>>>> which certainly doesn't happen by default when ikiwiki
+>>>>> is installed by the Makefile.PL or by any package I know of.
+>>>>> (If there's
+>>>>> no -T, then something *really* weird is going on..)
+>>>>> --[[Joey]]
+
+>>>>>> nope, no -T in the hashbang line at all. Haven't added any;
+>>>>>> only thing I did there was change `use lib` to `/usr/share/perl5`,
+>>>>>> otherwise I'd get bogus errors about CGI::Cookie_al or some such thing.
+>>>>>>
+>>>>>> How I installed it was in non-public directories in various sites, then
+>>>>>> made it publish stuff to a public dir in the relevant site. Or do you
+>>>>>> mean installed, as in the whole thing? From a .deb I made based on the git tree, with `git-buildpackage`.
+>>>>>>
+>>>>>> This issue is recent, after a `git pull` IIRC. It has never happened before. It's also puzzling me.
+>>>>>>
+>>>>>> You can check it out for yourself by pulling my fork of this, at github or my local repo.
+>>>>>> github will probably be faster for you: git://github.com/kjikaqawej/ikiwiki-simon.git --[[simonraven]]
+
+>>>>>>> I don't know what I'm supposed to see in your github tree.. it
+>>>>>>> looks identical to an old snapshot of ikiwiki's regular git repo?
+>>>>>>> If you want to put up the .deb you're using, I could examine that.
+>>>>>>>
+>>>>>>> I was in fact able to reproduce the insecure dependency in mkdir
+>>>>>>> message -- but only if I run 'perl -T ikiwiki'.
+>>>>>>> --[[Joey]]
diff --git a/doc/bugs/Insecure_dependency_in_utime.mdwn b/doc/bugs/Insecure_dependency_in_utime.mdwn
new file mode 100644
index 000000000..330479d22
--- /dev/null
+++ b/doc/bugs/Insecure_dependency_in_utime.mdwn
@@ -0,0 +1,14 @@
+    ikiwiki.setup: Insecure dependency in utime while running with -T switch at /usr/pkg/lib/perl5/vendor_perl/5.8.0/IkiWiki/Plugin/recentchanges.pm line 158.
+    BEGIN failed--compilation aborted at (eval 5) line 164.
+
+This was in ikiwiki\_2.32.3.
+
+I worked around this by doing:
+
+ utime IkiWiki::possibly_foolish_untaint($change->{when}), IkiWiki::possibly_foolish_untaint($change->{when}), "$config{srcdir}/$file";
+
+> Don't build ikiwiki with taint checking. It's known to be broken in
+> apparently all versions of perl, apparently leaking taint flags at random.
+> See [[Insecure_dependency_in_mkdir]] --[[Joey]]
+
+[[!tag done]]
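For reference, the `utime` call that workaround untaints sets a file's access and modification times; the shell equivalent looks like this (GNU `touch`/`stat` assumed, epoch value hypothetical):

```shell
# Set both atime and mtime to a fixed epoch second, as the plugin's
# utime($when, $when, $file) call does, then read the mtime back.
cd "$(mktemp -d)"
touch page.mdwn
touch -d @1200000000 page.mdwn
stat -c %Y page.mdwn    # prints 1200000000
```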
diff --git a/doc/bugs/Linkmap_doesn__39__t_support_multiple_linkmaps_on_a_single_page.mdwn b/doc/bugs/Linkmap_doesn__39__t_support_multiple_linkmaps_on_a_single_page.mdwn
new file mode 100644
index 000000000..a0645477e
--- /dev/null
+++ b/doc/bugs/Linkmap_doesn__39__t_support_multiple_linkmaps_on_a_single_page.mdwn
@@ -0,0 +1,3 @@
+If I use the linkmap directive twice on a single page, I get the same image appearing in both locations, even though the parameters for the two directives may have been different.
+
+-- Martin
diff --git a/doc/bugs/Links_to_missing_pages_should_always_be_styled.mdwn b/doc/bugs/Links_to_missing_pages_should_always_be_styled.mdwn
new file mode 100644
index 000000000..73213209a
--- /dev/null
+++ b/doc/bugs/Links_to_missing_pages_should_always_be_styled.mdwn
@@ -0,0 +1,5 @@
+When the CGI URL is not defined, links to missing pages appear as plain, unstyled text. I think the 'createlink' span should always wrap this text, even when the actual question mark linking to the CGI for the create action is missing. This ensures consistent styling regardless of whether the CGI is available or not (and is thus useful for example when the same wiki has clones with the CGI link and clones without).
+
+A proposed patch is available [on my ikiwiki clone](http://git.oblomov.eu/ikiwiki/patch/290d1b498f00f63e6d41218ddb76d87e68ed5081)
+
+[[!tag patch cgi done]]
diff --git a/doc/bugs/Links_with_symbols_can__39__t_be_edited.mdwn b/doc/bugs/Links_with_symbols_can__39__t_be_edited.mdwn
new file mode 100644
index 000000000..b9fe734b7
--- /dev/null
+++ b/doc/bugs/Links_with_symbols_can__39__t_be_edited.mdwn
@@ -0,0 +1,22 @@
+We just installed 2.52 on debian testing, and all edit links with symbols (including spaces) are coming up with empty text fields because the URL to the mdwn file is wrong.
+
+For example, the existing page:
+
+ wiki/bugs/__39__Existing_Subject__39___vs.___39__Browse__39__/
+
+displays the bug "'Existing Subject' vs. 'Browse'". But if we click 'Edit' on that page, we get:
+
+ wiki/ikiwiki.cgi?page=bugs%2F'Existing%20Subject'%20vs.%20'Browse'&do=edit
+
+.. which of course opens with a blank edit box. Note that manually typing in the correct URL:
+
+ wiki/ikiwiki.cgi?page=bugs%2F__39__Existing_Subject__39___vs.___39__Browse__39__&do=edit
+
+does work.
+
+> You need to rebuild your wiki on upgrade to 2.52. The *old* edit links
+> looked like the first link above, and ikiwiki has changed so that it
+> needs the new link above. To get those links on static pages, you need to
+> rebuild the wiki, like the [[news]] says to.
+>
+> [[closing|done]] as user error --[[Joey]]
diff --git a/doc/bugs/MTIME_not_set_for_inline_or_archive_entries.mdwn b/doc/bugs/MTIME_not_set_for_inline_or_archive_entries.mdwn
new file mode 100644
index 000000000..89947b544
--- /dev/null
+++ b/doc/bugs/MTIME_not_set_for_inline_or_archive_entries.mdwn
@@ -0,0 +1,22 @@
+My <code>page.tmpl</code> can contain:
+
+ Created <TMPL_VAR CTIME>. Last edited <TMPL_VAR MTIME>.
+
+and that works. However, if I have the same line in <code>inlinepage.tmpl</code>
+or <code>archivepage.tmpl</code>, then only the <code>CTIME</code> works - the <code>MTIME</code> is blank.
+This leads to an annoying inconsistency.
+
+Update - even though I'm not a Perl programmer, this patch seems right:
+
+ --- /home/bothner/ikiwiki/ikiwiki/IkiWiki/Plugin/inline.pm 2008-10-01 14:29:11.000000000 -0700
+ +++ ./inline.pm 2008-10-12 13:26:11.000000000 -0700
+ @@ -316,6 +316,7 @@
+ $template->param(pageurl => urlto(bestlink($params{page}, $page), $params{destpage}));
+ $template->param(title => pagetitle(basename($page)));
+ $template->param(ctime => displaytime($pagectime{$page}, $params{timeformat}));
+ + $template->param(mtime => displaytime($pagemtime{$page}, $params{timeformat}));
+ $template->param(first => 1) if $page eq $list[0];
+ $template->param(last => 1) if $page eq $list[$#list];
+
+
+> [[done]], thanks
diff --git a/doc/bugs/Map_sorts_subtags_under_a_different_tag.mdwn b/doc/bugs/Map_sorts_subtags_under_a_different_tag.mdwn
new file mode 100644
index 000000000..fe634e9c1
--- /dev/null
+++ b/doc/bugs/Map_sorts_subtags_under_a_different_tag.mdwn
@@ -0,0 +1,49 @@
+ikiwiki's map directive is full of surprises. I have put a snapshot of the
+site as it was when I saw the following problem at
+<http://scratch.madduck.net/web__phd.martin-krafft.net__map-bug-2.tgz> and can
+revert there any time, but I need to move on.
+
+I have a few tags starting with `a` (abridged list):
+
+ $ ls wiki/factors/tag/a*
+ [...]
+ wiki/factors/tag/active/:
+ index.html
+
+ wiki/factors/tag/affects/:
+ contributors/ developers/ users/
+ [...]
+
+In `wiki-wc/factors/tag.mdwn`, I have a map for these tags:
+
+ \[[!map pages="factors/tag/*"]]
+
+and this works, except that for *whatever* reason, it actually sorts the three
+`affects/*` tags under `active`:
+
+ $ w3m -dump wiki/factors/tag/index.html | grep active -A3
+ ○ active
+ ■ contributors
+ ■ developers
+ ■ users
+
+And this is actually in the HTML code:
+
+ <li><a href="active/">active</a>
+ <ul>
+ <li><a href="affects/contributors/">contributors</a>
+ </li>
+ <li><a href="affects/developers/">developers</a>
+ </li>
+ <li><a href="affects/users/">users</a>
+ </li></ul>
+ </li>
+ <li><a href="approach/">approach</a>
+ </li>
+
+So it's not that the `<ul>` has an empty parent `<li>`; the three tags are
+*really* children of `active`.
+
+This really blows my mind. :)
+
+Rendering issue. [[fixed|done]] --[[Joey]]
diff --git a/doc/bugs/Mercurial_example_diffurl_should_read_r2__44___not_changeset.mdwn b/doc/bugs/Mercurial_example_diffurl_should_read_r2__44___not_changeset.mdwn
new file mode 100644
index 000000000..99e6e29a4
--- /dev/null
+++ b/doc/bugs/Mercurial_example_diffurl_should_read_r2__44___not_changeset.mdwn
@@ -0,0 +1,11 @@
+In the example ikiwiki.setup:
+
+ diffurl => "http://localhost:8000/?fd=\[[changeset]];file=\[[file]]",
+
+should be:
+
+ diffurl => "http://localhost:8000/?fd=\[[r2]];file=\[[file]]",
+
+(changeset doesn't get expanded; r2 does, to what appears to be The Right Thing)
+
+[[done]]
diff --git a/doc/bugs/Meta_plugin_does_not_respect_htmlscrubber__95__skip_setting.___40__patch__41__.mdwn b/doc/bugs/Meta_plugin_does_not_respect_htmlscrubber__95__skip_setting.___40__patch__41__.mdwn
new file mode 100644
index 000000000..0e40da551
--- /dev/null
+++ b/doc/bugs/Meta_plugin_does_not_respect_htmlscrubber__95__skip_setting.___40__patch__41__.mdwn
@@ -0,0 +1,11 @@
+I have been trying to include some meta info using the link setting something like the below
+
+ meta link="http://www.example.com/" rel="command" name="Example"
+
+This gets removed by the htmlscrubber as you would expect.
+
+Setting htmlscrubber_skip to the pagespec should stop this getting scrubbed but it does not.
+
+Below is a patch to fix that. It seems to work but I am not sure if it is the correct thing to do.
+
+> [[done]], thanks for the patch --[[Joey]]
diff --git a/doc/bugs/Missing_build-dep_on_perlmagick__63__.mdwn b/doc/bugs/Missing_build-dep_on_perlmagick__63__.mdwn
new file mode 100644
index 000000000..f6c0266ba
--- /dev/null
+++ b/doc/bugs/Missing_build-dep_on_perlmagick__63__.mdwn
@@ -0,0 +1,14 @@
+Trying to build current Git master in a (two weeks old - no DSL here) sid chroot triggers:
+
+ rendering news.mdwn
+ Can't locate Image/Magick.pm in @INC (@INC contains: . blib/lib /etc/perl /usr/local/lib/perl/5.10.0 /usr/local/share/perl/5.10.0 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.10 /usr/share/perl/5.10 /usr/local/lib/site_perl .) at (eval 175) line 2.
+ BEGIN failed--compilation aborted at (eval 175) line 2.
+
+ make[1]: *** [extra_build] Error 2
+ make[1]: Leaving directory `/tmp/buildd/ikiwiki-2.54'
+ make: *** [build-stamp] Error 2
+
+Adding perlmagick to the build-deps fixes it. I read somewhere in debian/changelog that this build-dep was not needed, but...
+
+> It's not needed by the test suite, but once I added a img to the source
+> wiki, it became needed. [[done]] --[[Joey]]
diff --git a/doc/bugs/Missing_constant_domain_at_IkiWiki.pm_line_842.mdwn b/doc/bugs/Missing_constant_domain_at_IkiWiki.pm_line_842.mdwn
new file mode 100644
index 000000000..6ed035232
--- /dev/null
+++ b/doc/bugs/Missing_constant_domain_at_IkiWiki.pm_line_842.mdwn
@@ -0,0 +1,50 @@
+I can't build my ikiwiki 1.40 backport for Debian 'sarge':
+
+ $ LANG=C dpkg-buildpackage -us -uc -rfakeroot
+ [...]
+ ./pm_filter /usr 1.40 /usr/share/perl5 <IkiWiki/Plugin/favicon.pm >blib/lib/IkiWiki/Plugin/favicon.pm
+ ./pm_filter /usr 1.40 /usr/share/perl5 <IkiWiki/Plugin/shortcut.pm >blib/lib/IkiWiki/Plugin/shortcut.pm
+ ./pm_filter /usr 1.40 /usr/share/perl5 <IkiWiki/Plugin/aggregate.pm >blib/lib/IkiWiki/Plugin/aggregate.pm
+ ./ikiwiki.in doc html --templatedir=templates --underlaydir=basewiki \
+ --wikiname="ikiwiki" --verbose --no-rcs \
+ --exclude=/discussion --no-discussion --userdir=users \
+ --plugin=goodstuff \
+ --plugin=haiku --plugin=polygen --plugin=fortune
+ Missing constant domain at IkiWiki.pm line 842
+ make[1]: *** [extra_build] Error 22
+ make[1]: Leaving directory `/home/ptecza/svn/ikiwiki'
+ make: *** [build-stamp] Error 2
+
+--[[Paweł|ptecza]]
+
+This is because of an old version of Locale::gettext which doesn't
+include the OO interface. I had this problem too, but installing a
+new version of Locale::gettext fixed it. --Ethan
+
+> I suppose I should document it needing a new enough version. Or find a
+> way to use the non-OO version while still getting proper UTF-8 strings,
+> which is why I began to use the OO version in the first place..
+>
+> Looks like the OO interface was added in version 1.04
+>
+> And there's no good way to get utf-8 strings w/o the OO interface, that I
+> can see.
+>
+> So, what I've done is documented that it needs Locale::gettext 1.04, and
+> made it not crash if run with an older version, though it also won't
+> gettext anything in that case. Might be a bit confusing if someone misses
+> the docs about it needing the newer version and wonders why gettext
+> doesn't work, but I consider it good enough to mark this [[bugs/done]].
+> --[[Joey]]
+
+>> Thanks for the hint, guys! :) I've just backported liblocale-gettext-perl
+>> 1.05 package and it seems that now I can build my ikiwiki successfully
+>> and it even works :) --[[Paweł|ptecza]]
+
+Thanks for the note on this; it made it very easy to figure out what was going on. Just pointing out, though, that the Debian package doesn't require a new version of liblocale-gettext-perl. I just got bitten by this bug setting up a dev ikiwiki box on a mixed stable/testing box. -- [[Adam]]
+
+> The Debian package has this:
+>
+> Suggests: [...], liblocale-gettext-perl (>= 1.05-1), [...]
+>
+> --[[JoshTriplett]] \ No newline at end of file
diff --git a/doc/bugs/Monotone_rcs_support.mdwn b/doc/bugs/Monotone_rcs_support.mdwn
new file mode 100644
index 000000000..8687e7983
--- /dev/null
+++ b/doc/bugs/Monotone_rcs_support.mdwn
@@ -0,0 +1,58 @@
+The Monotone module still lacks support for setting up a post-commit hook,
+so commits made via monotone will not automatically update the wiki.
+
+Here for future reference is the most recent version of support for
+that I've been sent. It's not yet working; there are path issues. --[[Joey]]
+
+> I think this was fixed in version 2.40. --[[Joey]] [[!tag done]]
+
+<pre>
+diff --git a/IkiWiki/Rcs/monotone.pm b/IkiWiki/Rcs/monotone.pm
+index cde6029..34f8f96 100644
+--- a/IkiWiki/Rcs/monotone.pm
++++ b/IkiWiki/Rcs/monotone.pm
+@@ -186,8 +186,9 @@ sub rcs_update () {
+ check_config();
+
+ if (defined($config{mtnsync}) && $config{mtnsync}) {
++ check_mergerc();
+ if (system("mtn", "--root=$config{mtnrootdir}", "sync",
+- "--quiet", "--ticker=none",
++ "--quiet", "--ticker=none", "--rcfile", $config{mtnmergerc},
+ "--key", $config{mtnkey}) != 0) {
+ debug("monotone sync failed before update");
+ }
+@@ -604,4 +605,9 @@ __DATA__
+ return true
+ end
+ }
++ function note_netsync_revision_received(new_id, revision, certs, session_id)
++ if (program_exists_in_path("ikiwiki-netsync-hook")) then
++ execute("ikiwiki-netsync-hook", new_id)
++ end
++ end
+ EOF
+diff --git a/IkiWiki/Wrapper.pm b/IkiWiki/Wrapper.pm
+index 2103ea5..cff718c 100644
+diff --git a/doc/ikiwiki.setup b/doc/ikiwiki.setup
+index 1377315..0cbe27e 100644
+--- a/doc/ikiwiki.setup
++++ b/doc/ikiwiki.setup
+@@ -88,6 +88,16 @@ use IkiWiki::Setup::Standard {
+ # # Enable mail notifications of commits.
+ # notify => 1,
+ #},
++ #{
++ # # The monotone netsync revision received wrapper.
++ # # Note that you also need to install a lua
++ # # hook into monotone to make this work
++ # # see: http://ikiwiki.info/rcs/monotone/
++ # wrapper => "/usr/local/bin/ikiwiki-netsync-hook",
++ # wrappermode => "04755",
++ # # Enable mail notifications of commits.
++ # notify => 1,
++ #},
+ ],
+
+ # Generate rss feeds for blogs?
+</pre>
diff --git a/doc/bugs/More_permission_checking.mdwn b/doc/bugs/More_permission_checking.mdwn
new file mode 100644
index 000000000..6cd6cb0ec
--- /dev/null
+++ b/doc/bugs/More_permission_checking.mdwn
@@ -0,0 +1,17 @@
+I'm often confused about permissions and I wish ikiwiki could stamp its foot down and ensure all the permissions are correctly (canonically?) set up.
+
+I keep ending up having to `sudo chown -R :www-data` and `sudo chmod -R g+w` on srcdir, destdir. I'm never quite sure what the best practice for the srcdir's `/srv/git/` is. Currently everything looks like `hendry:www-data` with ug+rw.
+
+I think I've triggered these problems by (not thinking and) running `ikiwiki --rebuild --setup /home/hendry/.ikiwiki/mywiki.setup` as my user.
+
+I don't know if there can be some lookup with `/etc/ikiwiki/wikilist`. Though shouldn't everything be under the `www-data` group in reality?
+
+Also when I use `sudo ikiwiki -setup /etc/ikiwiki/auto.setup`, I think I create a ton of problems for myself since everything is created as the root user, right? And `/etc/ikiwiki/wikilist` doesn't seem to have the latest created wiki added. I have to reluctantly manually do this.
+
+> You should never make files be owned by www-data user or group.
+> Ikiwiki is designed to run as a single user, which can just
+> be your login user; all files should be owned by that user, the
+> ikiwiki.cgi and other wrappers suid to that user. And then there are
+> never any permissions problems. --[[Joey]]
+
+[[done]]
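In setup-file terms, Joey's single-user model corresponds to building the CGI wrapper suid to the owning user, e.g. (paths hypothetical):

```perl
# Fragment of ikiwiki.setup: the wrapper is built mode 06755, so web
# edits run as the wiki's owner and no www-data ownership is needed.
cgi_wrapper => "/var/www/wiki/ikiwiki.cgi",
cgi_wrappermode => "06755",
```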
diff --git a/doc/bugs/Navbar_does_not_link_to_page_being_commented_on_while_commenting.mdwn b/doc/bugs/Navbar_does_not_link_to_page_being_commented_on_while_commenting.mdwn
new file mode 100644
index 000000000..4c7b12e8c
--- /dev/null
+++ b/doc/bugs/Navbar_does_not_link_to_page_being_commented_on_while_commenting.mdwn
@@ -0,0 +1,11 @@
+Say you are commenting on this report. The Navbar on top will look like
+
+[ikiwiki](http://ikiwiki.info/)/ [bugs](http://ikiwiki.info/bugs/)/ commenting on Navbar does not link to page being commented on while commenting
+
+while either of those two options would be better:
+
+[ikiwiki](http://ikiwiki.info/)/ [bugs](http://ikiwiki.info/bugs/)/ commenting on [Navbar does not link to page being commented on while commenting](http://ikiwiki.info/bugs/Navbar_does_not_link_to_page_being_commented_on_while_commenting/)
+
+[ikiwiki](http://ikiwiki.info/)/ [bugs](http://ikiwiki.info/bugs/)/ [Navbar does not link to page being commented on while commenting](http://ikiwiki.info/bugs/Navbar_does_not_link_to_page_being_commented_on_while_commenting/) / New comment
+
+-- RichiH
diff --git a/doc/bugs/New_comments_are_not_always_displayed__59___need_page_refresh_to_appear.mdwn b/doc/bugs/New_comments_are_not_always_displayed__59___need_page_refresh_to_appear.mdwn
new file mode 100644
index 000000000..ac079f5b8
--- /dev/null
+++ b/doc/bugs/New_comments_are_not_always_displayed__59___need_page_refresh_to_appear.mdwn
@@ -0,0 +1,35 @@
+I have noticed this a few times in Google Chrome 12 (dev channel) already:
+
+I added a comment to
+
+ http://git-annex.branchable.com/forum/performance_improvement:_git_on_ssd__44___annex_on_spindle_disk/
+
+and left the page. Later, I revisited
+
+ http://git-annex.branchable.com/forum/
+
+and clicked on
+
+ http://git-annex.branchable.com/forum/performance_improvement:_git_on_ssd__44___annex_on_spindle_disk/
+
+My own comment did not appear. I pressed F5 and, hey presto, it appeared.
+
+My assumption is that ikiwiki does not tell Chrome to reload the page when its cached copy is stale.
+
+
+Richard
+
+> There is some lurking bug with certain web browsers, web servers, or
+> combination of the two that makes modifications to html files not
+> always be noticed by web browsers. See
+> [[bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info]]
+> see also <http://bugs.debian.org/588623>.
+>
+> On Branchable, we work around this problem with an apache configuration:
+> «ExpiresByType text/html "access plus 0 seconds"»
+>
+> There seems to be no way to work around it in ikiwiki's generated html,
+> aside from using the cache-control setting that is not allowed in html5.
+>
+> And, which browsers/web servers have the problem, and where the bug is,
+> seems very hard to pin down. --[[Joey]]
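The Branchable workaround Joey quotes would look roughly like this in an Apache configuration (assuming `mod_expires` is enabled):

```apache
# Needs mod_expires (on Debian: a2enmod expires)
<IfModule mod_expires.c>
    ExpiresActive On
    # Force browsers to revalidate HTML on every access
    ExpiresByType text/html "access plus 0 seconds"
</IfModule>
```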
diff --git a/doc/bugs/No___34__sid__34___in_forms_resulting_in_Error:_Your_login_session_has_expired..mdwn b/doc/bugs/No___34__sid__34___in_forms_resulting_in_Error:_Your_login_session_has_expired..mdwn
new file mode 100644
index 000000000..06bbce91a
--- /dev/null
+++ b/doc/bugs/No___34__sid__34___in_forms_resulting_in_Error:_Your_login_session_has_expired..mdwn
@@ -0,0 +1,39 @@
+I logged in and edited a page in the textbox. I clicked Preview and then Save Page, and it errored with:
+
+Error: Your login session has expired.
+
+I also did the same without previewing and got the same error.
+
+I added debugging to print the sid and session->id on the error. The sid was empty; the session->id was a long value (maybe base64).
+
+Viewing the source of the edit page (or the preview+edit page) shows no "sid" input value. (I do see it when editing here on your ikiwiki site, but not on mine.)
+
+> Further info: in the "prefs" dialog, it does have a hidden "sid" defined. Viewing the HTML source also shows it is in the fb_hidden class and "Generated by CGI::FormBuilder v3.0501". I tried multiple times and never see "sid" in the HTML source when editing a page. --[[JeremyReed]]
+
+>> Found the problem: I needed to update my editpage.tmpl to add TMPL_VAR FIELD-SID. This bug can be closed once that is documented -- and that documentation is obvious to find.
+
+>>> Whenever you choose to locally copy an ikiwiki template and modify it,
+>>> it's really up to you to keep it up-to-date. I did consider adding a
+>>> news item about this rather than just mentioning it in the changelog,
+>>> since I knew it would break locally modified templates -- but I've
+>>> never documented template changes in the news file before, and most of
+>>> them do lead to breakage of one kind or another if a locally modified
+>>> template is not kept up-to-date. I don't think that bloating the news
+>>> file with mentions of every single change to every template file would
+>>> be a win. --[[Joey]]
+
+>>>> I should have mentioned: yes, I already read the recent CHANGELOG
+>>>> entries. If it (like changes for 2.42) had indicated this was a
+>>>> template change, I would have known and wouldn't have filed the bug.
+>>>> Also maybe the manpage for ikiwiki can mention about local template
+>>>> modifications (I can fix that if not done.)
+
+> Perhaps what I should do is put in a template version check. --[[Joey]]
+
+ <TMPL_UNLESS IKIWIKI_TEMPLATE_REVISION_20080428>
+ <p><b>This template is not up-to-date with the installed version of
+ ikiwiki, and may not behave correctly until updated.</b></p>
+    </TMPL_UNLESS>
+
+> Well, that doesn't look like as good an idea today.. I've documented the
+> recent template change. --[[Joey]] [[!tag done]]
diff --git a/doc/bugs/No_categories_in_RSS__47__Atom_feeds.mdwn b/doc/bugs/No_categories_in_RSS__47__Atom_feeds.mdwn
new file mode 100644
index 000000000..cb9c2612e
--- /dev/null
+++ b/doc/bugs/No_categories_in_RSS__47__Atom_feeds.mdwn
@@ -0,0 +1,7 @@
+RSS and Atom feeds don't have any categories listed in them, even though the
+templates have a loop for the categories (I'm assuming that it's the tags
+that it's supposed to be looping through?).
+
+The tags are showing up as expected in the HTML output, just not any feeds.
+
+> Yurk! [[bugs/done]] --[[Joey]]
diff --git a/doc/bugs/No_link_for_blog_items_when_filename_contains_a_colon.mdwn b/doc/bugs/No_link_for_blog_items_when_filename_contains_a_colon.mdwn
new file mode 100644
index 000000000..bb3f92f9c
--- /dev/null
+++ b/doc/bugs/No_link_for_blog_items_when_filename_contains_a_colon.mdwn
@@ -0,0 +1,76 @@
+Since upgrading from Ikiwiki 2.20 to 2.32.3 (from Debian Lenny), I no longer get hyperlinks to items which have a colon in their name. This applies to both the normal and the archive view. Before the update, the links were created as relative links, so they weren't usable either, as the browser tried to take the part before the colon as a protocol specification (as in e.g. `http:`). IIRC, this applied to the archive view only; links in the normal view were ok (I will check this again if it's important). Is there a way to quote each colon as %xy in the hyperlinks? Perhaps it's only a problem with my config, and not an actual bug...?
+
+EDIT: I just found that in this wiki, under <http://ikiwiki.info/bugs/done/>, the entry "mailto: links not properly generated in rss/atom feeds" also doesn't have a hyperlink -- so at least it's not a problem with my config only ;-)
+
+[[madduck]]: I traced this down to `htmlscrubber`. If disabled, it works. If
+enabled, then `$safe_url_regexp` determines the URL unsafe because of the
+colon and hence removes the `src` attribute.
+
+Digging into this, I find that [[!rfc 3986]] pretty much discourages colons in
+filenames:
+
+> A path segment that contains a colon character (e.g., "this:that") cannot be
+> used as the first segment of a relative-path reference, as it would be
+> mistaken for a scheme name. Such a segment must be preceded by
+> a dot-segment (e.g., "./this:that") to make a relative-path reference.
+
+The solution seems to be not to use colons.
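The scheme confusion described in the RFC quote can be seen with any URL parser; here is a minimal standalone sketch in Python (not ikiwiki code), using `urllib.parse`:

```python
# Demonstrates the RFC 3986 point quoted above: a relative link whose
# first path segment contains a colon parses as scheme:path, while a
# "./" prefix keeps it an ordinary relative path.
from urllib.parse import urlparse

print(urlparse("this:that").scheme)    # "this" -- mistaken for a scheme
print(urlparse("./this:that").scheme)  # ""     -- plain relative path
```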
+
+In any case, `htmlscrubber` should get a new regexp, courtesy of dato:
+`[^:]+($|\/)`. I have tested and verified this.
+
+[Commit/patch
+be0b4f60](http://git.madduck.net/v/code/ikiwiki.git?a=commit;h=be0b4f603f918444b906e42825908ddac78b7073) fixes this.
+
+
+**July 21 2008:** I am updating this bug report as it still seems to be an issue: e.g. when creating a subpage whose name contains
+a colon by inserting an appropriate wikilink in the parent page, the new page can be created using that link, but afterwards
+there won't be a link to this page. Like madduck said above, it seems to be htmlscrubber removing this link. However, everything
+works fine if the same page is linked to from another subpage, because in that case the resulting link starts with `../`.
+
+At the moment I see two possible solutions:
+
+1. Let all relative links at least start with `./`. I haven't tested this.
+
+2. Escape the colon in page titles. I created the following patch which worked for me:
+
+ --- IkiWiki.pm.2.53-save 2008-07-08 15:56:38.000000000 +0200
+ +++ IkiWiki.pm 2008-07-21 20:41:35.000000000 +0200
+ @@ -477,13 +477,13 @@
+
+ sub titlepage ($) {
+ my $title=shift;
+ - $title=~s/([^-[:alnum:]:+\/.])/$1 eq ' ' ? '_' : "__".ord($1)."__"/eg;
+ + $title=~s/([^-[:alnum:]+\/.])/$1 eq ' ' ? '_' : "__".ord($1)."__"/eg;
+ return $title;
+ }
+
+ sub linkpage ($) {
+ my $link=shift;
+ - $link=~s/([^-[:alnum:]:+\/._])/$1 eq ' ' ? '_' : "__".ord($1)."__"/eg;
+ + $link=~s/([^-[:alnum:]+\/._])/$1 eq ' ' ? '_' : "__".ord($1)."__"/eg;
+ return $link;
+ }
+
+What do you think about that? Does the patch have any side-effects I didn't see?
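For illustration, here is a hypothetical Python rendering of `titlepage()` with the patch applied (ikiwiki itself is Perl; the character class is transcribed from the patch above, with POSIX `[:alnum:]` written out):

```python
import re

# Python sketch of ikiwiki's titlepage() as patched above: ':' is no
# longer in the allowed character class, so it gets the __<ord>__
# escape like other special characters, while ' ' becomes '_'.
def titlepage(title):
    def esc(m):
        c = m.group(0)
        return '_' if c == ' ' else '__%d__' % ord(c)
    return re.sub(r'[^-A-Za-z0-9+/.]', esc, title)

print(titlepage("test: with a colon"))  # test__58___with_a_colon
```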
+
+> I almost really fixed this in 2.53, but missed one case. All fixed now
+> AFAICS. --[[Joey]]
+
+>> Hmm, did you fix it now in 2.54? If so, I suspect there is still one little case left (might well be the last one,
+>> at least I hope so ;-) ): I just created a test post in the sandbox here: [[sandbox/test: with a colon in its name]]
+>> (btw, why doesn't this get a hyperlink here?).
+>>
+>>> Because wikilinks cannot have spaces; they are converted to underscores.
+>>> --[[Joey]]
+>>
+>> As it is put in the list of blog posts as a relative link, it starts
+>> with `<word><colon>` -- this makes the browser think that "test" is a protocol specification which is to replace `http`,
+>> so it complains (at least Opera and Firefox/Iceweasel on my Debian Etch do). What I described above for subpages
+>> with this name pattern also still happens on my local install (ikiwiki 2.54 on Debian Etch), but this is basically
+>> the same problem.
+>>
+>> I think the cleanest solution would be to quote colons in page names (like it is currently done for slashes)?
+>> Starting the links with "`./`", as I proposed above, now seems a bit ugly to me... --Mathias
+
+>>> No, it's all [[done]] in 2.55. --[[Joey]]
diff --git a/doc/bugs/No_numbacklinks_setting_for___34__no_limit__34__.mdwn b/doc/bugs/No_numbacklinks_setting_for___34__no_limit__34__.mdwn
new file mode 100644
index 000000000..230befcdd
--- /dev/null
+++ b/doc/bugs/No_numbacklinks_setting_for___34__no_limit__34__.mdwn
@@ -0,0 +1,4 @@
+No value for `numbacklinks` will achieve a setting of "no limit".
+--[[JoshTriplett]]
+
+0 [[done]] --[[Joey]]
diff --git a/doc/bugs/No_progress_in_progress_bar.mdwn b/doc/bugs/No_progress_in_progress_bar.mdwn
new file mode 100644
index 000000000..d67c55c8e
--- /dev/null
+++ b/doc/bugs/No_progress_in_progress_bar.mdwn
@@ -0,0 +1,43 @@
+I've just upgraded my Ikiwiki to the backported version 2.64 for Ubuntu Hardy
+and wanted to see the progress bar in action. Unfortunately, I can't see any progress.
+
+This is my example Ikiwiki syntax:
+
+ [[!progress percent=75]]
+
+And here is the HTML result:
+
+ <div class="progress">
+ <div class="progress-done"></div>
+ </div>
+
+It seems that the progress plugin works, but there is a problem with passing
+the progress value.
+
+Can anyone confirm the bug? --[[Paweł|ptecza]]
+
+> You are correct. The above example does generate the HTML you suggest. The
+> plugin requires a % sign:
+
+[[!progress percent="75%"]]
+
+> This could probably be improved. Certainly the documentation could be. -- [[Will]]
+
+>> Thanks for the hint, Will! I should have checked the code more carefully... However, in my opinion
+>> the '%' sign is confusing here and should be dropped. I hope it's clear to all
+>> people that the "percent" parameter passes values in percentages. --[[Paweł|ptecza]]
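A tolerant way to handle this (a hypothetical sketch, not the plugin's actual code) would be to accept the value with or without the '%' sign:

```python
import re

# Hypothetical tolerant parser for a percent="..." parameter:
# accepts "75", "75%", or "75.5%", and returns None otherwise.
def parse_percent(value):
    m = re.fullmatch(r'\s*(\d+(?:\.\d+)?)\s*%?\s*', value)
    return float(m.group(1)) if m else None

print(parse_percent("75"))     # 75.0
print(parse_percent("75%"))    # 75.0
print(parse_percent("banana")) # None
```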
+
+>>> [[fixed|done]] --[[Joey]]
+
+>>> I forgot to add that the HTML code is OK now, but I can see only
+>>> the "75%" string on a white background without any border. I need to look closer
+>>> at the CSS styles for the progress bar. --[[Paweł|ptecza]]
+
+>>>> You need the `div.progress` and `div.progress-done` from ikiwiki's
+>>>> default `style.css`. --[[Joey]]
+
+>>>>> Thank you for the fix, Joey!
+
+>>>>> I had `div.progress*` in the `style.css` file, but my Epiphany didn't want
+>>>>> to display the progress bar... Now it's OK and I can see beautiful progress,
+>>>>> though I've not changed anything. --[[Paweł|ptecza]]
diff --git a/doc/bugs/Not_all_comments_are_listed_by___33__map_or___33__inline.mdwn b/doc/bugs/Not_all_comments_are_listed_by___33__map_or___33__inline.mdwn
new file mode 100644
index 000000000..a65306fcb
--- /dev/null
+++ b/doc/bugs/Not_all_comments_are_listed_by___33__map_or___33__inline.mdwn
@@ -0,0 +1,68 @@
+While working on our forum I was surprised to see that some of the comments were not appearing in the RSS feed created by `!inline pages="forum/* or comment(forum/*)" feedonly="yes" show="25"`.
+
+> I'm seeing some comments in the rss feed. The feed is limited to the 25
+> most recent items, you can increase that with feedshow. --[[Joey]]
+
+>> Of course, but in the feed, some of the new comments don't show up.
+>> Most does but not all. For example, none of the comments of this thread
+>> appears in the RSS, even though they should according to the « 25 most
+>> recent items » criteria:
+>> <https://tails.boum.org/forum/Security_Updates:_apt-get_Sufficient__63__/> --sajolida
+
+>>> Of course this is a moving target, so I checked out
+>>> 4a787aecb142f346190ddaef59938799818c964b, which is from the same day
+>>> the above was written.. The comments in question appeared in
+>>> the rss feed when I ran `ikiwiki -setup ikiwiki.setup -gettime`
+>>> (after configuring the setup file to use git and rss and setting
+>>> `gitorigin_branch: ''`)
+>>>
+>>> So I suppose I'd need a testcase in a tarball to reproduce
+>>> any problem. --[[Joey]]
+
+>>>> Once I set `rcs: git` and `gitorigin_branch: ''`, I also get a
+>>>> perfect RSS feed that contains the items git log makes me expect,
+>>>> in the correct order. So this is not a ikiwiki bug after all,
+>>>> sorry for the annoyance. (For the record, I think we have two
+>>>> problems: first, our ikiwiki.setup does not enable a RCS, mainly
+>>>> to avoid local refresh to create ugly "updated PO files" Git
+>>>> commits; this explains the issue sajolida noticed while locally
+>>>> building the wiki. Second, the RSS feed on our online ikiwiki is
+>>>> correct right now, was probably cured by a rebuild at some point.
+>>>> --[[intrigeri]]
+
+Then I found out that a map directive such as `!map pages="forum/* or
+comment(forum/*)"` was bringing a weird result too. The output is a map
+with quite a few broken links.
+
+> This is the same as if you tried to link to a comment page or other
+> internal page with a [[ikiwiki/WikiLink]] -- you'd get a broken link
+> or a create link because these are not true wiki pages. --[[Joey]]
+
+>> So I don't understand why 90 % of the comments are linked well and 10 %
+>> are broken links. Why does this map behave differently for only a few comments? --sajolida
+
+>>> I checked the first 50% or so of the comments, and every one was a
+>>> broken link. --[[Joey]]
+
+>>>> I now observe the same behaviour as Joey, which seems totally
+>>>> logical to me after all given that `forum/*/comment_*.html` are
+>>>> not generated. I wonder how we could have observed map generating
+>>>> working links to comments in the first place; sajolida, can you
+>>>> please try reproducing it? If you cannot reproduce it, I think we
+>>>> can close this bug. --[[intrigeri]]
+
+>>>>> Closing as apparently operator error (and while it's a bit confusing
+>>>>> that map generates broken links for internal pages, it is *sorta* what
+>>>>> was requested by the pagespec, so I don't see a real reason to change
+>>>>> it). Please reopen if new data emerges. [[done]] --[[Joey]]
+
+Plus, some broken links in the map do match the comments missing on the RSS feed but some others do not.
+
+Unfortunately, I couldn't find an obvious pattern for this failure.
+
+We think it's a bug in ikiwiki. Our git repo is publicly available at
+`git://git.immerda.ch/amnesia.git` (the ikiwiki source is in `/wiki/src`)
+and the corresponding online version is available at
+<https://tails.boum.org/forum/>. The buggy `!inline` is already included in
+the original `forum.mdwn`. The buggy `!map` is not, but the bug can be
+reproduced by just including it in the source of the forum.
diff --git a/doc/bugs/Obsolete_templates__47__estseek.conf.mdwn b/doc/bugs/Obsolete_templates__47__estseek.conf.mdwn
new file mode 100644
index 000000000..99330a115
--- /dev/null
+++ b/doc/bugs/Obsolete_templates__47__estseek.conf.mdwn
@@ -0,0 +1,3 @@
+The templates/estseek.conf file can safely be removed now that ikiwiki has switched to using xapian-omega.
+
+> Thanks for the reminder, [[done]] --[[Joey]]
diff --git a/doc/bugs/OpenID_delegation_fails_on_my_server.mdwn b/doc/bugs/OpenID_delegation_fails_on_my_server.mdwn
new file mode 100644
index 000000000..25cc47b18
--- /dev/null
+++ b/doc/bugs/OpenID_delegation_fails_on_my_server.mdwn
@@ -0,0 +1,53 @@
+When I use my OpenID, http://thewordnerd.info, I am redirected to
+http://thewordnerd.myopenid.com, the identity to which thewordnerd.info
+delegates. That is, I'm redirected to the exact identity URL, not to an
+authorization link.
+
+I am successfully using thewordnerd.info as my identity on many sites, so I
+know the delegation is pretty standard. It's stock WordPress with the
+delegation plugin. I also just attempted registration on http://identi.ca
+and successfully exchanged sreg data. So it seems like something is broken
+when using a delegate specifically with ikiwiki, and while I can use
+thewordnerd.myopenid.com, I'd rather use my delegate and free myself to
+switch to other providers in the future.
+
+> Hmm, I entered http://thewordnerd.info as the openid, and ended up at
+> http://thewordnerd.myopenid.com/ , which seems right? --[[Joey]]
+
+Sorry, didn't notice this edit. But, no, that is incorrect. Entering http://thewordnerd.info or thewordnerd.info should do the exact same thing that entering http://thewordnerd.myopenid.com does--in your case, prompt you to log in, in mine, ask if I want to verify the request. It's redirecting to the page itself, not using it as an OpenID provider.
+
+Unfortunately I don't speak or understand enough Perl to fix this, nor do I understand how to use its debugger, but it looks as if the consumer should support delegation. Not sure why it's behaving incorrectly here.
+
+> Your openid delegation is wrong.
+>
+> Here is a working openid delegation (from http://joey.kitenet.net:)
+> <link href="http://www.myopenid.com/server" rel="openid.server" />
+> <link href="http://www.myopenid.com/server" rel="openid2.provider" />
+> <link href="https://joeyh.myopenid.com/" rel="openid.delegate" />
+> <link href="https://joeyh.myopenid.com/" rel="openid2.local_id" />
+>
+> The above is generated by ikiwiki, using the meta openid directive:
+>
+> \[[meta openid="https://joeyh.myopenid.com/" server="http://www.myopenid.com/server"]]
+>
+> Here is your delegation:
+>
+> <meta http-equiv="X-XRDS-Location" content="http://thewordnerd.myopenid.com/xrds" />
+> <link rel="openid.server" href="http://thewordnerd.myopenid.com" />
+> <link rel="openid.delegate" href="http://thewordnerd.myopenid.com" />
+>
+> So, your openid.server is set wrong; when logging in, ikiwiki redirects to
+> the specified url, which is not behaving as an openid server at all. If it's changed
+> to use http://www.myopenid.com/server, it would work the same as mine.
+>
+> I suspect that it was working for you on other sites that support openid
+> 2.0 and XRDS, since the xrds file on your site seems to have the correct
+> http://www.myopenid.com/server url in it. Ikiwiki, however, uses perl
+> modules that do not support openid 2.0 or XRDS, and so the incorrect
+> openid 1.0 delegation is used. --[[Joey]]
+
+[[done]]
+
+Seems so, thanks.
+
+For future reference, and in case anyone has a similar problem and searches here first as I did, I set my OpenID settings using the examples shown for the WordPress OpenID delegation plugin, which seem to work fine on a whole bunch of other sites but a) not here and b) are inaccurate according to the MyOpenID FAQ. I'll file a bug against that plugin to either update its example or remove it entirely. So not an ikiwiki bug, but someone's bug nonetheless.
diff --git a/doc/bugs/PNG_triggers_UTF-8_error_in_MimeInfo.pm.mdwn b/doc/bugs/PNG_triggers_UTF-8_error_in_MimeInfo.pm.mdwn
new file mode 100644
index 000000000..0a1299993
--- /dev/null
+++ b/doc/bugs/PNG_triggers_UTF-8_error_in_MimeInfo.pm.mdwn
@@ -0,0 +1,25 @@
+If a PNG image matches the [[ikiwiki/PageSpec]] of an [[ikiwiki/directive/inline]] directive, the page throws the following error:
+
+> \[[!inline Error: Malformed UTF-8 character (fatal) at /usr/local/lib/perl5/site_perl/5.8.8/File/MimeInfo.pm line 120.]]
+
+Individual posts display fine, and moving the offending image outside the scope of the [[ikiwiki/directive/inline]] directive's PageSpec eliminates the error.
+
+> I tried to reproduce this with a random png and File::MimeInfo
+> version 0.15, but could not. The png was included in the generated feed
+> via an enclosure, as it should be; no warnings or errors.
+>
+> Looking at the source to File::MimeInfo and its changelog,
+> I'm pretty sure that this problem was fixed in version
+> 0.14:
+>> - Fixed bug with malformed utf8 chars in default() method
+>
+> The code involved in that fix looks like this:
+>
+>> no warnings; # warnings can be thrown when input not ascii
+>> if ($] < 5.008 or ! utf8::valid($line)) {
+>> use bytes; # avoid invalid utf8 chars
+>
+> I guess that your locally installed version of File::MimeInfo is older than
+> this. So closing this bug [[done]]. If you still see the problem with a current
+> version of File::MimeInfo, please reopen and include where I can get a png file
+> that triggers the problem. --[[Joey]]
diff --git a/doc/bugs/PREFIX_not_honoured_for_underlaydir.mdwn b/doc/bugs/PREFIX_not_honoured_for_underlaydir.mdwn
new file mode 100644
index 000000000..11557c822
--- /dev/null
+++ b/doc/bugs/PREFIX_not_honoured_for_underlaydir.mdwn
@@ -0,0 +1,44 @@
+I built ikiwiki using
+
+    % perl Makefile.PL PREFIX=/home/ed
+    % make
+    % make install
+
+This installs the files under /home/ed, for example one of the lines it prints is
+
+    cp -a basewiki/* /home/ed/share/ikiwiki/basewiki
+
+However when I try to run ikiwiki I get an error as follows:
+
+    % ikiwiki --verbose ~/wikiwc/ ~/public_html/wiki/ --url=http://membled.com/wiki/
+    Can't stat /usr/share/ikiwiki/basewiki: No such file or directory
+     at /home/ed/lib/perl5/site_perl/5.8.7/IkiWiki/Render.pm line 349
+
+The PREFIX specified at build time should also affect the share directory -
+it shouldn't try to use /usr/share here.
+
+> Actually, the PREFIX, no matter where you specify it, is only
+> intended to control where files are _installed_, not where they're
+> looked for at runtime.
+
+> There's a good reason not to make PREFIX be used to actually
+> change the program's behavior: Most packaging systems use PREFIX
+> when building the package, to make it install into a temporary
+> directory which gets packaged up.
+
+This is not the case. That is the difference between PREFIX and DESTDIR.
+
+DESTDIR does what you describe; it causes the files to be installed into some
+directory you specify, which may not be the same place you'd eventually
+run it from.
+
+PREFIX means build the software to run under the location given. Normally it
+will also affect the location files are copied to, so that 'make install'
+installs a working system.
+
+At least, that's the way I've always understood it; the MakeMaker documentation
+isn't entirely clear (perhaps because ordinary Perl modules do not need to be
+configured at build time depending on the installation directory). It does mention
+that DESTDIR is the thing used by packaging tools.
+
+> Thanks for clarifying that. [[bugs/done]] --[[Joey]]
diff --git a/doc/bugs/Pages_with_non-ascii_characters_like_öäå_in_name_not_found_directly_after_commit.mdwn b/doc/bugs/Pages_with_non-ascii_characters_like_öäå_in_name_not_found_directly_after_commit.mdwn
new file mode 100644
index 000000000..8fb09f9d6
--- /dev/null
+++ b/doc/bugs/Pages_with_non-ascii_characters_like_öäå_in_name_not_found_directly_after_commit.mdwn
@@ -0,0 +1,145 @@
+At least my setup on kapsi.fi always prints 404 Not Found after adding a page with non-ascii characters in the name. But the page exists and is visible after the 404 with url encoding, and the blog page is inlined correctly on the feed page.
+
+Apparently ikiwiki.info does not complain with 404. Should the character encoding be set in wiki config?
+
+Happens also after editing the page. Here's an example:
+
+ * page name displayed in 404: http://mcfrisk.kapsi.fi/skiing/posts/Iso-Sy%F6te%20Freeride%202011%20Teaser.html?updated
+ * page name in the blog feed: http://mcfrisk.kapsi.fi/skiing/posts/Iso-Sy%C3%B6te%20Freeride%202011%20Teaser.html
+
+The difference is in the word Iso-Syöte. Perhaps the browser is also part of
+the game; I use Iceweasel from Debian unstable with default settings.
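The two URLs differ only in how the "ö" was percent-encoded: `%F6` is the single ISO-8859-1 byte, `%C3%B6` the two UTF-8 bytes. A quick standalone Python check (illustrative only, not part of ikiwiki):

```python
# Percent-encode "ö" under the two charsets seen in the URLs above.
from urllib.parse import quote

print(quote("ö", encoding="latin-1"))  # %F6     (the broken 404 URL)
print(quote("ö", encoding="utf-8"))    # %C3%B6  (the working feed URL)
```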
+
+> I remember seeing this problem twice before, and both times it was caused
+> by a bug in the *web server* configuration. I think at least one case it was
+> due to an apache rewrite rule that did a redirect and mangled the correct
+> encoding.
+>
+> I recommend you check there. If you cannot find the problem with your web
+> server, I recommend you get a http protocol dump while saving the page,
+> and post it here for analysis. You could use tcpdump, or one of the
+> browser plugins that allows examining the http protocol. --[[Joey]]
+
+The server runs Debian 5.0.8, but I don't have access to the Apache configs. Here's the TCP stream from wireshark without cookie data; the page name is testiä.html. I guess the page name is sent in UTF-8, but in the redirect after the POST it is given back to the browser in ISO-8859-1.
+
+ POST /ikiwiki.cgi HTTP/1.1
+ Host: mcfrisk.kapsi.fi
+ User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.16) Gecko/20110107 Iceweasel/3.5.16 (like Firefox/3.5.16)
+ Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
+ Accept-Language: en-us,en;q=0.5
+ Accept-Encoding: gzip,deflate
+ Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
+ Keep-Alive: 300
+ Connection: keep-alive
+ Referer: http://mcfrisk.kapsi.fi/ikiwiki.cgi
+ Cookie: XXXX
+ Content-Type: multipart/form-data; boundary=---------------------------138059850619952014921977844406
+ Content-Length: 1456
+
+ -----------------------------138059850619952014921977844406
+ Content-Disposition: form-data; name="_submitted"
+
+ 2
+ -----------------------------138059850619952014921977844406
+ Content-Disposition: form-data; name="do"
+
+ edit
+ -----------------------------138059850619952014921977844406
+ Content-Disposition: form-data; name="sid"
+
+ 93c956725705aa0bbdff98e57efb28f4
+ -----------------------------138059850619952014921977844406
+ Content-Disposition: form-data; name="from"
+
+
+ -----------------------------138059850619952014921977844406
+ Content-Disposition: form-data; name="rcsinfo"
+
+ 5419fbf402e685643ca965d577dff3dafdd0fde9
+ -----------------------------138059850619952014921977844406
+ Content-Disposition: form-data; name="page"
+
+ testi..
+ -----------------------------138059850619952014921977844406
+ Content-Disposition: form-data; name="type"
+
+ mdwn
+ -----------------------------138059850619952014921977844406
+ Content-Disposition: form-data; name="editcontent"
+
+ test
+ -----------------------------138059850619952014921977844406
+ Content-Disposition: form-data; name="editmessage"
+
+
+ -----------------------------138059850619952014921977844406
+ Content-Disposition: form-data; name="_submit"
+
+ Save Page
+ -----------------------------138059850619952014921977844406
+ Content-Disposition: form-data; name="attachment"; filename=""
+ Content-Type: application/octet-stream
+
+
+ -----------------------------138059850619952014921977844406--
+ HTTP/1.1 302 Found
+ Date: Wed, 02 Feb 2011 19:45:49 GMT
+ Server: Apache/2.2
+ Location: /testi%E4.html?updated
+ Content-Length: 0
+ Keep-Alive: timeout=5, max=500
+ Connection: Keep-Alive
+ Content-Type: text/plain
+
+ GET /testi%E4.html?updated HTTP/1.1
+ Host: mcfrisk.kapsi.fi
+ User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.16) Gecko/20110107 Iceweasel/3.5.16 (like Firefox/3.5.16)
+ Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
+ Accept-Language: en-us,en;q=0.5
+ Accept-Encoding: gzip,deflate
+ Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
+ Keep-Alive: 300
+ Connection: keep-alive
+ Referer: http://mcfrisk.kapsi.fi/ikiwiki.cgi
+ Cookie: XXXX
+
+ HTTP/1.1 404 Not Found
+ Date: Wed, 02 Feb 2011 19:45:55 GMT
+ Server: Apache/2.2
+ Content-Length: 279
+ Keep-Alive: timeout=5, max=499
+ Connection: Keep-Alive
+ Content-Type: text/html; charset=iso-8859-1
+
+ <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
+ <html><head>
+ <title>404 Not Found</title>
+ </head><body>
+ <h1>Not Found</h1>
+ <p>The requested URL /testi..html was not found on this server.</p>
+ <hr>
+ <address>Apache/2.2 Server at mcfrisk.kapsi.fi Port 80</address>
+ </body></html>
+
+Getting the pages has worked every time:
+
+ GET /testi%C3%A4.html HTTP/1.1
+ Host: mcfrisk.kapsi.fi
+ User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.16) Gecko/20110107 Iceweasel/3.5.16 (like Firefox/3.5.16)
+ Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
+ Accept-Language: en-us,en;q=0.5
+ Accept-Encoding: gzip,deflate
+ Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
+ Keep-Alive: 300
+ Connection: keep-alive
+ Cookie: XXXX
+ If-Modified-Since: Wed, 02 Feb 2011 19:45:54 GMT
+ If-None-Match: "1b518d-7c0-49b51e5a55c5f"
+ Cache-Control: max-age=0
+
+ HTTP/1.1 304 Not Modified
+ Date: Wed, 02 Feb 2011 20:01:43 GMT
+ Server: Apache/2.2
+ Connection: Keep-Alive
+ Keep-Alive: timeout=5, max=500
+ ETag: "1b518d-7c0-49b51e5a55c5f"
diff --git a/doc/bugs/Patch:_Fix_error_in_style.css.mdwn b/doc/bugs/Patch:_Fix_error_in_style.css.mdwn
new file mode 100644
index 000000000..3a160454e
--- /dev/null
+++ b/doc/bugs/Patch:_Fix_error_in_style.css.mdwn
@@ -0,0 +1,37 @@
+[[!tag patch css]]
+[[!template id=gitbranch branch=sunny256/css-fix author="[[sunny256]]"]]
+
+This trivial patch fixes an error in `styles.css` and is ready to be merged from the `css-fix` branch at `git://github.com/sunny256/ikiwiki.git` :
+
+ From e3b5eab2971109d18332fe44fd396322bb148cfc Mon Sep 17 00:00:00 2001
+ From: =?UTF-8?q?=C3=98yvind=20A.=20Holm?= <sunny@sunbase.org>
+ Date: Tue, 22 Feb 2011 18:14:21 +0100
+ Subject: [PATCH] style.css: Replace obsolete -moz-outline-style property with outline-style
+
+ The "-moz-outline-style" property generates an error at the W3C CSS
+ validator, saying the property doesn't exist. According to
+ <https://developer.mozilla.org/en/CSS/-moz-outline-style>, this property
+ is obsolete and the use of "outline-style" is preferred.
+ ---
+ doc/style.css | 2 +-
+ 1 files changed, 1 insertions(+), 1 deletions(-)
+
+ diff --git a/doc/style.css b/doc/style.css
+ index 922b82a..fa413cf 100644
+ --- a/doc/style.css
+ +++ b/doc/style.css
+ @@ -485,7 +485,7 @@ a.openid_large_btn:focus {
+ outline: none;
+ }
+ a.openid_large_btn:focus {
+ - -moz-outline-style: none;
+ + outline-style: none;
+ }
+ .openid_selected {
+ border: 4px solid #DDD;
+ --
+ 1.7.4.1.55.gdca3d
+
+--[[sunny256]] 2011-02-22 20:11+0100
+
+> [[Applied|done]]. --[[Joey]]
diff --git a/doc/bugs/Perl_scripts_depend_on___47__usr__47__bin__47__perl.mdwn b/doc/bugs/Perl_scripts_depend_on___47__usr__47__bin__47__perl.mdwn
new file mode 100644
index 000000000..d68d506f7
--- /dev/null
+++ b/doc/bugs/Perl_scripts_depend_on___47__usr__47__bin__47__perl.mdwn
@@ -0,0 +1,6 @@
+> On FreeBSD, perl defaults to installation in `/usr/local/bin/perl` since it is not a part of the base system. If the option to create symlinks in `/usr/bin` is not selected, building and running ikiwiki will fail because the shebang lines use `#!/usr/bin/perl [args]`. Changing this to `#!/usr/bin/env -S perl [args]` fixes the issue.
+
+I think this should be a concern of ikiwiki's official FreeBSD port.
+
+At any rate, if it is decided that ikiwiki should be fixed, it is probably better to use
+`$installbin/perl` from `-MConfig` rather than the `env` hack.
diff --git a/doc/bugs/Please_avoid_using___39__cp_-a__39___in_Makefile.PL.mdwn b/doc/bugs/Please_avoid_using___39__cp_-a__39___in_Makefile.PL.mdwn
new file mode 100644
index 000000000..8a83081c3
--- /dev/null
+++ b/doc/bugs/Please_avoid_using___39__cp_-a__39___in_Makefile.PL.mdwn
@@ -0,0 +1,77 @@
+In ikiwiki-2.60, external plug-ins are yet again installed using 'cp -a' instead of 'install -m 755'. This poses a problem on at least FreeBSD 6.x, since the cp(1) command doesn't support the '-a' flag.
+
+The change in question (from 2.56 to 2.60) can be seen here:
+
+ - for file in `find plugins -maxdepth 1 -type f ! -wholename plugins/.\*`; do \
+ - install -m 755 $$file $(DESTDIR)$(PREFIX)/lib/ikiwiki/plugins; \
+ - done; \
+ + for file in `find plugins -maxdepth 1 -type f ! -wholename plugins/.\* | grep -v demo`; do \
+ + cp -a $$file $(DESTDIR)$(PREFIX)/lib/ikiwiki/plugins; \
+ + done \
+
+Please restore the old behaviour of using 'install' :-)
+
+ -- [[HenrikBrixAndersen]]
+
+> I use cp -a because I don't want non-executable files to be installed
+> executable. (Causes breakage with setup file creation code) I really
+> wish *BSD could get out of the 70's in this area..
+> --[[Joey]]
+
+>> Well, really what's happening here is that *BSD (along with, for
+>> example, Solaris) is adhering rather closely to the Single UNIX
+>> Specification, whereas `-a` is a nonstandard option added to the
+>> GNU variant of `cp` (a habit Richard Stallman never really got under
+>> control). To install ikiwiki on Solaris I had to replace all uses not
+>> only of `cp` but also of `install` and `xgettext` with the GNU
+>> embrace-and-extend variants, and make sure I had those installed.
+>> That really is a bit of a PITA.
+
+>> I think there's an opportunity here for a really clean solution, though.
+
+>> Why not do the installation in pure Perl?
+
+>> The file manipulations being done by `cp` and `install` would be
+>> straightforward to code in Perl, and there really isn't a complicated
+>> build requiring the full functionality of `gmake`. `gxgettext` I'm
+>> not so sure about, but even getting rid of _almost_ all the
+>> nonstandard-utility dependencies would be a win.
+
+>> The idea is that if you're distributing a Perl-based app, one thing
+>> you'll always be absolutely certain of in the target environment is a
+>> working Perl. The fact that the current build starts out in Perl, but
+>> uses it to write a Makefile and then hands off to other utilities that
+>> are less dependably compatible across platforms is a disadvantage.
+
+>> A pure-Perl install can also query the very Perl it's running in to
+>> determine the proper places to install files, and that will be less
+>> error-prone than making a human edit the right paths into some files.
+>> It would be quite useful here, actually, where we have several distinct
+>> Perl builds installed at different paths, and ikiwiki could be correctly
+>> installed for any one of them simply by using the chosen Perl to run the
+>> install. That means this would also be a complete solution to
+>> [[todo/assumes_system_perl|todo/assumes_system_perl]].
+>> --ChapmanFlack
+
+>>> Joey: How about the following patch, then? -- [[HenrikBrixAndersen]]
+
+ --- Makefile.PL.orig 2008-08-16 14:57:00.000000000 +0200
+ +++ Makefile.PL 2008-08-16 15:03:45.000000000 +0200
+ @@ -67,9 +67,12 @@ extra_install:
+ done
+
+ install -d $(DESTDIR)$(PREFIX)/lib/ikiwiki/plugins
+ - for file in `find plugins -maxdepth 1 -type f ! -wholename plugins/.\* | grep -v demo`; do \
+ - cp -a $$file $(DESTDIR)$(PREFIX)/lib/ikiwiki/plugins; \
+ - done \
+ + for file in `find plugins -maxdepth 1 -type f ! -wholename plugins/.\* ! -name \*demo\* -name \*.py`; do \
+ + install -m 644 $$file $(DESTDIR)$(PREFIX)/lib/ikiwiki/plugins; \
+ + done
+ + for file in `find plugins -maxdepth 1 -type f ! -wholename plugins/.\* ! -name \*demo\* ! -name \*.py`; do \
+ + install -m 755 $$file $(DESTDIR)$(PREFIX)/lib/ikiwiki/plugins; \
+ + done
+
+ install -d $(DESTDIR)$(PREFIX)/share/man/man1
+ install -m 644 ikiwiki.man $(DESTDIR)$(PREFIX)/share/man/man1/ikiwiki.1
+
+[[!tag done]]
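The `install`-based loop adopted above can be exercised standalone. Everything below (plugin file names, the `stage` directory) is a stand-in for illustration, not ikiwiki's real tree:

```shell
# Portable install: `cp -a` is a GNU extension, but `install -m` is
# available on the BSDs and Solaris too, and makes the mode explicit,
# so non-executable .py plugins land as 0644 and the rest as 0755.
set -e
mkdir -p plugins stage/lib/ikiwiki/plugins
printf '# python external plugin stub\n' > plugins/helper.py
printf '#!/usr/bin/perl\n# perl plugin stub\n' > plugins/helper
dest=stage/lib/ikiwiki/plugins
for file in plugins/*; do
    [ -f "$file" ] || continue
    case "$file" in
        *demo*) ;;                            # skip demo plugins
        *.py)   install -m 644 "$file" "$dest" ;;
        *)      install -m 755 "$file" "$dest" ;;
    esac
done
ls -l "$dest"
```

The explicit modes also address Joey's concern: nothing becomes executable merely because it was executable in the source tree.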
diff --git a/doc/bugs/Please_don__39__t_refer_to_offsite_openid_image.mdwn b/doc/bugs/Please_don__39__t_refer_to_offsite_openid_image.mdwn
new file mode 100644
index 000000000..561cd6771
--- /dev/null
+++ b/doc/bugs/Please_don__39__t_refer_to_offsite_openid_image.mdwn
@@ -0,0 +1,19 @@
+In style.css, please don't refer to the OpenID image on an external site.
+This reference allows that site to track users of ikiwikis and other sites
+supporting OpenID. Furthermore, this reference also opens up cross-site
+scripting vulnerabilities if the external site did something malicious. If
+the image has a Free Software license, please include it in ikiwiki, in the
+basewiki (preferably converted from gif to png). If the image does not
+have a Free Software license, please omit it, and allow users to choose to
+add it to their CSS themselves if they find the risks acceptable.
+--[[JoshTriplett]]
+
+> I wasn't able to get a clear statement of the license of that graphic,
+> back when I was writing the openid support although I didn't try very hard
+> (asked on irc on their irc channel, didn't seem to get anyone who was
+> familiar with DFSG). Googling around, they seem to have not yet decided
+> on a license:
+> <http://openid.net/pipermail/general/2007-January/001421.html>
+> <http://lists.danga.com/pipermail/yadis/2005-June/000990.html>
+>
+> Changed things around .. [[bugs/done]] --[[Joey]]
diff --git a/doc/bugs/Preview_removes_page_location_drop-down_options.mdwn b/doc/bugs/Preview_removes_page_location_drop-down_options.mdwn
new file mode 100644
index 000000000..71b3c7147
--- /dev/null
+++ b/doc/bugs/Preview_removes_page_location_drop-down_options.mdwn
@@ -0,0 +1,10 @@
+If there is more than one option in the Page Location drop-down box when creating a new page, then when you preview, all other options are removed and you can only select your original selection.
+
+The problem exists on Ikiwiki's wiki, but manifests itself differently, in that subpages are still displayed, so maybe that's the actual bug :)
+
+Anyway, to reproduce, edit any page, view the page location options, preview, and some will disappear.
+
+> Yeah, this is a dark corner. :-) It'd be nice if preview did preserve the
+> list, but it's complex to do so. So for now the bug is that it added
+> userdirs to the list, which I've fixed, so calling this [[done]]
+> --[[Joey]]
diff --git a/doc/bugs/Problem_with_displaying_smileys_on_inline_page.mdwn b/doc/bugs/Problem_with_displaying_smileys_on_inline_page.mdwn
new file mode 100644
index 000000000..db244457d
--- /dev/null
+++ b/doc/bugs/Problem_with_displaying_smileys_on_inline_page.mdwn
@@ -0,0 +1,25 @@
+I've noticed that my browser doesn't display the smileys on a page
+where I used the inline plugin. However I can see them when I click the link
+to the inlined "subpage".
+
+I checked the HTML source and it seems that Ikiwiki always generates the same
+relative path to the smiley image file (`../../../../../smileys/smile.png`),
+regardless of the page location. Are you sure this is working correctly?
+
+What if I have a main inline page, for example
+<http://www.domain.com/blog/>, with a subpage
+<http://www.domain.com/blog/post/2008/06/27/foo/> containing the smiley?
+
+Do you have any idea how to fix it? I don't want to have multiple
+`smileys` directories, of course :) --[[Paweł|ptecza]]
+
+> I see that I broke this in commit
+> 0b9e849aba38f0695491ad5ca27de11632627ffe, presumably because a) sanitize
+> filters didn't get destpage at the time and b) I didn't think through
+> what that meant. Luckily, in the meantime, I added destpage to sanitize's
+> parameters, so it was easy to fix. [[done]] --[[Joey]]
+
+>> Great! Thank you very much, Joey!
+
+>> BTW, I love Git (and other distributed SCMs') commit ids. They're so human
+>> friendly ;) --[[Paweł|ptecza]]
diff --git a/doc/bugs/Problem_with_editing_page_after_first_SVN_commit.mdwn b/doc/bugs/Problem_with_editing_page_after_first_SVN_commit.mdwn
new file mode 100644
index 000000000..b8a3e40ef
--- /dev/null
+++ b/doc/bugs/Problem_with_editing_page_after_first_SVN_commit.mdwn
@@ -0,0 +1,209 @@
+I have a strange problem with editing any page after its first SVN commit.
+I'm not sure whether it's a bug in my ikiwiki backport or a misunderstanding
+of how ikiwiki works.
+
+Assume that I have a Foo page with some content and I want to put a link
+to a Bar page there and then create that page. I do the following steps:
+
+1. Click Edit link on Foo page
+
+2. Put the link to Bar page there and commit it by clicking "Save Page"
+ button
+
+   The Foo page is rendered correctly and now I can see the ?Bar link. The URL
+ in the address bar of my browser is
+
+ http://my.host.com/wiki/foo.html?updated
+
+3. Click ?Bar link
+
+   Now I can see the textarea for editing the page. It's empty, of course.
+
+   The page doesn't exist in my SVN repo yet and my Apache server knows
+   nothing about it:
+
+ $ find /my/ikiwiki/src/dir/ -type f -name bar.mdwn
+ $ find /my/ikiwiki/dst/dir/ -type f -name bar.html
+
+4. Add some initial content and click "Save Page" button
+ to commit changes
+
+   The Bar page is also rendered correctly and now I can see what I wrote.
+ The URL in the address bar of my browser is
+
+ http://my.host.com/wiki/bar.html?updated
+
+ The page was added to the SVN repo and my Apache is able to serve it now:
+
+ $ find /my/ikiwiki/src/dir/ -type f -name bar.mdwn
+ /my/ikiwiki/src/dir/bar.mdwn
+ $ find /my/ikiwiki/dst/dir/ -type f -name bar.html
+ /my/ikiwiki/dst/dir/bar.html
+
+5. Change the content of Bar page by clicking Edit link
+
+   I can't do it, because the textarea is empty again. I have to run the
+   `ikiwiki --setup ikiwiki.setup` command by hand to rebuild the page.
+   Then I can edit it.
+
+Where is my mistake?
+
+--[[Paweł|ptecza]]
+
+> It's not clear which Edit link you clicked in step 5. Is it the link on
+> the new page, or the old link back on page Foo that you clicked on before
+> to create Bar? It would also be good to see the URL you're at in step 5.
+> --[[Joey]]
+
+>> It was the Edit link on the new Bar page, of course. The URL in step 5 was
+>> `http://my.host.com/wiki/ikiwiki.cgi?page=bar&do=edit`.
+
+>> I forgot to mention in my previous post that $pagesources{$page}
+>> (cgi_editpage subroutine of /usr/share/perl5/IkiWiki/CGI.pm file)
+>> doesn't exist in step 5. It exists after rebuilding all ikiwiki
+>> pages by hand.
+
+>> BTW, where does ikiwiki store information about rendered pages?
+>> Is it `/my/ikiwiki/src/dir/.ikiwiki/` directory?
+
+>> --[[Paweł|ptecza]]
+
+>>> Well, the missing %pagesources value explains the symptom for sure.
+>>> ikiwiki stores its state in .ikiwiki/index, and that should include
+>>> info about the new page you've created, including the source file for
+>>> it, which is where the data in %pagesources comes from.
+>>>
+>>> It sounds to me like somehow, when you commit a change to svn by
+>>> saving the page, it rebuilds the wiki, but does not update the index
+>>> file. Maybe it's crashing before it can save the index file. Or maybe
+>>> it's misconfigured, and updating a different index file in
+>>> a different copy of the source? You should be able to figure out what's
+>>> going on by looking at how the index file changes (or not) when you
+>>> create the new page. --[[Joey]]
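One way to observe that, sketched here with a stand-in file in place of the real `.ikiwiki/index` so it runs anywhere:

```shell
# Checksum the state file before and after a web edit; if the checksum
# is unchanged, saveindex() never ran. The web edit is simulated below
# to keep the sketch self-contained.
statefile=./index.demo           # stand-in for .ikiwiki/index
echo "state v1" > "$statefile"
before=$(cksum < "$statefile")
echo "state v2" > "$statefile"   # a successful save via the CGI would do this
after=$(cksum < "$statefile")
if [ "$before" = "$after" ]; then
    echo "index NOT updated"
else
    echo "index updated"
fi
```

On a real wiki you would record the checksum, save a page through the browser, and compare again.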
+
+>>>> I've checked that my ikiwiki really doesn't touch `.ikiwiki/index` file
+>>>> when I create and save a new page. In `error.log` file of my Apache2
+>>>> server I can't see any "Permission denied" messages, but I suspect
+>>>> that a reason of my problem can be the bad access permissions:
+
+>>>> root@my.host:/my/ikiwiki/src/dir# ls -ld .ikiwiki/
+>>>> drwxrwsr-x 2 www-data src 4096 2007-01-11 10:00 .ikiwiki/
+>>>> root@my.host:/my/ikiwiki/src/dir# cd .ikiwiki/
+>>>> root@my.host:/my/ikiwiki/src/dir/.ikiwiki# ls -l
+>>>> total 48
+>>>> -rw-rw-r-- 1 www-data src 17353 2007-01-11 10:00 index
+>>>> -rw-rw-r-- 1 www-data src 0 2007-01-11 10:17 lockfile
+>>>> -rw------- 1 www-data src 24576 2007-01-11 10:17 sessions.db
+>>>> -rw------- 1 www-data src 0 2006-11-15 14:45 sessions.db.lck
+>>>> -rw------- 1 www-data src 404 2007-01-08 10:24 userdb
+
+>>>> What do you think about it? Does it look good? My ikiwiki runs
+>>>> under control of Apache2 server and it's configured to run
+>>>> as `www-data` user and `www-data` group. --[[Paweł|ptecza]]
+
+>>>>> It's a bit weird to run ikiwiki as www-data. This means that www-data
+>>>>> can write to your subversion repository? And the svn post-commit hook
+>>>>> _also_ runs as www-data? It certainly could be some permissions issue
+>>>>> that is not being reported properly. --[[Joey]]
+
+>>>>>> No, my SVN `post-commit` hook runs as `root` (uid) and `www-data` (gid).
+>>>>>> Only `root` user and `src` group have write permissions to my SVN repo.
+
+>>>>>> Could you please show me your permissions for `repodir`, `srcdir`
+>>>>>> and `destdir`, and how your Apache server runs? --[[Paweł|ptecza]]
+
+>>>>>>> Ugh, root? My permissions setup is simple, ikiwiki runs as a single
+>>>>>>> user, and that same user can commit to the svn repo and write to
+>>>>>>> all files. --[[Joey]]
+
+>>>>>>>> What's your user? Please show me the result of `ls -ld dir` for
+>>>>>>>> the directories above :) --[[Paweł|ptecza]]
+
+>>>>>>>>> All my directories are 755 joey:joey. --[[Joey]]
+
+>>>>>>>>>> Thanks! But I have another situation: a multiuser system and a few
+>>>>>>>>>> ikiwiki committers. --[[Paweł|ptecza]]
+
+>>>>>>>>>>> Joey, I think I've just fixed the permissions, but my ikiwiki still
+>>>>>>>>>>> doesn't update my `.ikiwiki/index` file. Could you please explain to me
+>>>>>>>>>>> when ikiwiki calls the `saveindex()` subroutine? My ikiwiki doesn't do it
+>>>>>>>>>>> when I create a new page and save it or when I update and save
+>>>>>>>>>>> an existing page. It does it only when I run
+>>>>>>>>>>> `ikiwiki --setup ikiwiki.setup` and I'm getting desperate...
+
+>>>>>>>>>>> BTW, where should I store my `ikiwiki.setup` file? Must it be placed
+>>>>>>>>>>> under the `$srcdir/.ikiwiki/` directory, or doesn't it matter?
+>>>>>>>>>>> Does `ikiwiki.cgi` wrapper know where the `ikiwiki.setup` file
+>>>>>>>>>>> is stored? --[[Paweł|ptecza]]
+
+Sorry, I am not indenting my reply (in my browser the responses are very narrow).
+
+I also had a problem with no web pages getting generated via the CGI unless I ran ikiwiki manually to regenerate them.
+I can't find the discussion about it on the ikiwiki website though. I think it was removed and now I can't find it in the history.
+My problem was caused by not having a revision control system defined, so it defaulted to Subversion (which I didn't have installed).
+
+> Note that that confusing default to svn has been changed.. And you're
+> right about how the setup file is used below, BTW. --[[Joey]]
+
+As for your .setup file, you can put it anywhere. I don't think the CGI knows where it is, because its settings are compiled into the "wrapper".
+In my case, my setup file is in a different home directory and owned by a different user than the CGI or my generated website. By the way, I also don't keep my .ikiwiki private directory in my source directory; I set wikistatedir instead (which doesn't seem to be documented).
+
+--[[JeremyReed]]
+
+> Never mind about indentation, Jeremy! :) Thanks a lot for your interest in
+> my problem and for trying to help me.
+
+> I use an RCS backend and store my ikiwiki sources in an SVN repo. Here are
+> my SVN-related settings:
+>
+> rcs => "svn",
+> svnrepo => "/var/lib/svn/ikiwiki",
+> svnpath => "trunk/pages",
+>
+> I've noticed the following piece of code in `/usr/share/perl5/IkiWiki/CGI.pm`
+> file (`cgi_editpage()` subroutine):
+>
+> # save page
+> page_locked($page, $session);
+>
+> my $content=$form->field('editcontent');
+>
+> $content=~s/\r\n/\n/g;
+> $content=~s/\r/\n/g;
+> writefile($file, $config{srcdir}, $content);
+>
+> if ($config{rcs}) {
+> # Here is RCS stuff
+> # ...
+> }
+> else {
+> require IkiWiki::Render;
+> refresh();
+> saveindex();
+> }
+>
+> # The trailing question mark tries to avoid broken
+> # caches and get the most recent version of the page.
+> redirect($q, "$config{url}/".htmlpage($page)."?updated");
+>
+> As you can see, ikiwiki calls the `saveindex()` subroutine only if the `rcs`
+> variable is not defined. I don't understand it, because this way ikiwiki
+> doesn't update my `.ikiwiki/index` file. Joey, could you please
+> enlighten me here ;)
+>
+> BTW, I also noticed `wikistatedir` variable in the ikiwiki code
+> and I couldn't find any information about it in ikiwiki docs :) --[[Paweł|ptecza]]
+
+>> wikistatedir is a non-configurable internal value.
+>>
+>> What happens during an edit with the code you quoted is that the "rcs
+>> stuff" results in a commit of the page to svn. This results in the
+>> ikiwiki svn post-commit hook running. The post-commit hook updates the
+>> wiki, and calls saveindex. That's why it's not called in the RCS path in
+>> the code above.
+>>
+>> It sounds like your post-commit hook is still not set up, or is failing
+>> for some reason (permissions perhaps?) --[[Joey]]
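Running the hook by hand is the quickest check. Below is a self-contained mock of that procedure; on a real setup the hook lives at `$REPO/hooks/post-commit` and should be run as the CGI's user:

```shell
# svn invokes hooks/post-commit with two arguments: the repository
# path and the new revision number. If running it manually fails, or
# leaves the index file untouched, the hook is where the problem is.
mkdir -p repo/hooks
cat > repo/hooks/post-commit <<'EOF'
#!/bin/sh
# A real hook would run the ikiwiki post-commit wrapper here; touching
# a marker file stands in for saveindex() rewriting .ikiwiki/index.
touch index.updated
EOF
chmod +x repo/hooks/post-commit
./repo/hooks/post-commit repo 42   # args: REPOS REV, as svn passes them
ls -l index.updated
```

If the manual run updates the index but web commits don't, compare the environment and user the web server gives the hook.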
+
+>>> OK, [[bugs/done]]! It was a problem with permissions and a non-upgraded
+>>> `editpage.tmpl` template :) --[[Paweł|ptecza]] \ No newline at end of file
diff --git a/doc/bugs/Problem_with_toc.pm_plug-in.mdwn b/doc/bugs/Problem_with_toc.pm_plug-in.mdwn
new file mode 100644
index 000000000..6be5f89b5
--- /dev/null
+++ b/doc/bugs/Problem_with_toc.pm_plug-in.mdwn
@@ -0,0 +1,37 @@
+The toc.pm plug-in currently renders empty 'a' tag elements. This seems to confuse at least Firefox, possibly others. The result is that the following text is rendered as an anchor (visible if you style 'a' elements in a different color).
+
+Here is a patch for toc.pm for producing non-empty 'a' elements.
+
+> Thanks for the patch, but I already fixed this in 2.4 using a different
+> approach. I think your patch is slightly broken: an anchor tag isn't
+> really meant to enclose all the html it anchors to, but just to be stuck in
+> front of it. --[[Joey]] [[!tag done]]
+
+ --- IkiWiki/Plugin/toc.pm.orig Thu Jun 7 11:53:53 2007
+ +++ IkiWiki/Plugin/toc.pm Thu Jun 7 13:00:00 2007
+ @@ -47,7 +47,7 @@ sub format (@) {
+ if ($tagname =~ /^h(\d+)$/i) {
+ my $level=$1;
+ my $anchor="index".++$anchors{$level}."h$level";
+ - $page.="$text<a name=\"$anchor\" />";
+ + $page.="$text<a name=\"$anchor\">";
+
+ # Take the first header level seen as the topmost level,
+ # even if there are higher levels seen later on.
+ @@ -90,6 +90,16 @@ sub format (@) {
+ "</a>\n";
+ $p->handler(text => undef);
+ }, "dtext");
+ + }
+ + else {
+ + $page.=$text;
+ + }
+ + }, "tagname, text");
+ + $p->handler(end => sub {
+ + my $tagname=shift;
+ + my $text=shift;
+ + if ($tagname =~ /^h(\d+)$/i) {
+ + $page.="</a>$text";
+ }
+ else {
+ $page.=$text;
diff --git a/doc/bugs/Problem_with_umlauts_and_friends.mdwn b/doc/bugs/Problem_with_umlauts_and_friends.mdwn
new file mode 100644
index 000000000..1223ef4b3
--- /dev/null
+++ b/doc/bugs/Problem_with_umlauts_and_friends.mdwn
@@ -0,0 +1,16 @@
+I can't tell yet if this is a problem with ikiwiki, or rather with the web server,
+or `w3m`, or whatever.
+
+To reproduce:
+
+ $ LC_ALL=C w3m http://ikiwiki.info/sandbox/
+
+Select *Edit*, log in, have `w3m` spawn an editor for editing the page and notice
+that all umlauts and friends have disappeared. /!\ If the user now saves the page,
+the mangled page will be entered into the RCS, so don't do this on the ikiwiki
+sandbox page.
+
+> Yes, if you run a web browser in a non-utf8 locale, it can neither
+> display nor properly edit unicode. --[[Joey]]
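The effective character map follows the locale, which is easy to check from the shell before starting `w3m`:

```shell
# Under LC_ALL=C the charmap is plain ASCII, so any multibyte UTF-8
# text is mis-decoded before the spawned editor ever sees it.
LC_ALL=C locale charmap    # typically ANSI_X3.4-1968 or US-ASCII
locale charmap             # whatever your normal locale uses
```

If the second command doesn't report UTF-8, switch to a UTF-8 locale before editing wiki pages.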
+
+[[notabug|done]]
diff --git a/doc/bugs/Problems_using_cygwin.mdwn b/doc/bugs/Problems_using_cygwin.mdwn
new file mode 100644
index 000000000..9a49e2587
--- /dev/null
+++ b/doc/bugs/Problems_using_cygwin.mdwn
@@ -0,0 +1,20 @@
+I'd like to run ikiwiki under cygwin. I'm new to ikiwiki and have tried to follow the setup tutorial as best I could. I got all the way up to step 7, but I can't get the CGI to run successfully (step 8).
+
+> Moved the formbuilder bug to [[formbuilder_3.0401_broken]] --[[Joey]]
+
+-----
+
+A different problem has reared its ugly head. When I click on "RecentChanges", the CGI complains about an undefined subroutine:
+
+<pre>
+==> apache2/error_log <==
+[Thu Oct 12 16:20:52 2006] [error] [client 192.168.0.125] Undefined subroutine &IkiWiki::XMLin called at /usr/lib/perl5/site_perl/5.8/IkiWiki/Rcs/svn.pm line 143., referer: http://imrisws36/wiki/index.html?updated
+[Thu Oct 12 16:20:52 2006] [error] [client 192.168.0.125] Premature end of script headers: ikiwiki.cgi, referer: http://imrisws36/wiki/index.html?updated
+</pre>
+
+Indeed there is no such routine IkiWiki::XMLin(). I don't understand how this can possibly work -- as it manifestly does on Linux.
+
+> XMLin is supposed to be exported by XML::Simple. My guess is that, due to a missing error check, XML::Simple is failing to load, and it's not aborting then. You probably need to install that module; in the meantime, I've fixed the missing error check in svn. --[[Joey]]
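A one-liner makes the missing-module case obvious up front instead of surfacing as a CGI crash later. XML::Simple is the module this page is about; the messages are illustrative:

```shell
# perl's exit status tells you whether the module is loadable at all.
if perl -MXML::Simple -e 1 2>/dev/null; then
    echo "XML::Simple is installed"
else
    echo "XML::Simple is missing - install it from packages or CPAN"
fi
```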
+
+
+[[bugs/done]]
diff --git a/doc/bugs/Problems_with_graphviz.pm_plug-in.mdwn b/doc/bugs/Problems_with_graphviz.pm_plug-in.mdwn
new file mode 100644
index 000000000..bc80125ad
--- /dev/null
+++ b/doc/bugs/Problems_with_graphviz.pm_plug-in.mdwn
@@ -0,0 +1,43 @@
+The graphviz.pm plug-in currently attempts to read PNG data in UTF-8 mode, which sometimes fails with a message similar to the following (depending on the input):
+
+ utf8 "\x89" does not map to Unicode at /usr/local/lib/perl5/site_perl/5.8.8/IkiWiki/Plugin/graphviz.pm line 53, <IN> chunk 1.
+ Wide character in subroutine entry at /usr/local/lib/perl5/site_perl/5.8.8/IkiWiki/Plugin/graphviz.pm line 68.
+
+> Ok, will remove the binmode IN then. done --[[Joey]]
+
+>> Thanks --[[HenrikBrixAndersen]]
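The reason the `:utf8` layer chokes is visible in the PNG signature itself: its first byte, 0x89, is not a valid UTF-8 lead byte, which is exactly the byte the first error message quotes.

```shell
# Print the 8-byte PNG signature in hex: 89 50 4e 47 0d 0a 1a 0a.
# A filehandle opened with binmode(IN, ':utf8') rejects that 0x89,
# so image output must be read as raw bytes.
printf '\211PNG\r\n\032\n' | od -An -tx1
```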
+
+It also generates image URLs relative to the page being rendered, which means the URLs won't work when previewing a graph from the CGI script.
+
+(preview bug split to [[Problems_with_graphviz.pm_plug-in_previews]])
+
+>> Here is an updated patch against ikiwiki-2.5:
+
+>>> [[Applied|done]], thanks. --[[Joey]]
+
+The patch below fixes these two issues.
+
+ --- graphviz.pm.orig Thu Jun 7 15:45:16 2007
+ +++ graphviz.pm Fri Jun 8 12:03:38 2007
+ @@ -41,7 +41,6 @@ sub render_graph (\%) {
+ $pid=open2(*IN, *OUT, "$params{prog} -Tpng");
+
+ # open2 doesn't respect "use open ':utf8'"
+ - binmode (IN, ':utf8');
+ binmode (OUT, ':utf8');
+
+ print OUT $src;
+ @@ -70,7 +69,12 @@ sub render_graph (\%) {
+ }
+ }
+
+ - return "<img src=\"".urlto($dest, $params{page})."\" />\n";
+ + if ($params{preview}) {
+ + return "<img src=\"".urlto($dest, "")."\" />\n";
+ + }
+ + else {
+ + return "<img src=\"".urlto($dest, $params{page})."\" />\n";
+ + }
+ }
+
+ sub graph (@) {
diff --git a/doc/bugs/Problems_with_graphviz.pm_plug-in_previews.mdwn b/doc/bugs/Problems_with_graphviz.pm_plug-in_previews.mdwn
new file mode 100644
index 000000000..c77bbeeaf
--- /dev/null
+++ b/doc/bugs/Problems_with_graphviz.pm_plug-in_previews.mdwn
@@ -0,0 +1,54 @@
+(split from [[Problems_with_graphviz.pm_plug-in]])
+
+[graphviz] generates image URLs relative to the page being rendered, which means the URLs wont work when previewing a graph from the CGI script.
+
+>> Here is an updated patch against ikiwiki-2.5:
+
+>>> Applied, thanks. --[[Joey]]
+
+ --- IkiWiki/Plugin/graphviz.pm.orig 2007-07-27 11:35:05.000000000 +0200
+ +++ IkiWiki/Plugin/graphviz.pm 2007-07-27 11:36:02.000000000 +0200
+ @@ -69,7 +69,12 @@ sub render_graph (\%) {
+ }
+ }
+
+ - return "<img src=\"".urlto($dest, $params{page})."\" />\n";
+ + if ($params{preview}) {
+ + return "<img src=\"".urlto($dest, "")."\" />\n";
+ + }
+ + else {
+ + return "<img src=\"".urlto($dest, $params{page})."\" />\n";
+ + }
+ }
+
+ sub graph (@) {
+
+
+>> --[[HenrikBrixAndersen]]
+
+>>> Despite this patch I am still experiencing the problem. Normal page source for a graph contains:
+
+ <div id="content">
+ <p><img src="./graph-c9fd2a197322feb417bdedbca5e99f5aa65b3f06.png" /></p>
+
+ </div>
+
+>>> preview contains
+
+ <div id="preview">
+ <p><img src="./demo/diagrams/graph-c9fd2a197322feb417bdedbca5e99f5aa65b3f06.png" /></p>
+
+ </div>
+
+>>> I don't quite understand why; this makes sense from the CGI path (in my
+>>> case from the root of the site). The browser appears to be trying to fetch
+>>> `/demo/diagrams/demo/diagrams/graph-c9fd2a197322feb417bdedbca5e99f5aa65b3f06.png`
+>>> (i.e., prepending the required relpath twice). -- [[Jon]]
+
+>>>> Yeah, that patch may have been right once, but it's wrong now;
+>>>> preview mode uses `<base>` to make urls work the same as they would
+>>>> when viewing the html page.
+>>>>
+>>>> Perhaps this was not noticed for a while because it only
+>>>> shows up if previewing an *unchanged* graph on a page that has already
+>>>> been built before. Fixed now. [[done]] --[[Joey]]
diff --git a/doc/bugs/RecentChanges_broken_with_empty_svnpath.mdwn b/doc/bugs/RecentChanges_broken_with_empty_svnpath.mdwn
new file mode 100644
index 000000000..c852df5e9
--- /dev/null
+++ b/doc/bugs/RecentChanges_broken_with_empty_svnpath.mdwn
@@ -0,0 +1,26 @@
+The [[RecentChanges]] page is broken (doesn't show any history at all) when used with an empty svnpath in the ikiwiki.setup file.
+
+Say you have the following configuration:
+
+ rcs => "svn",
+ svnrepo => "ssh+svn://foo.bar.com/wiki",
+ svnpath => "",
+
+In the above, $svnpath needs to be either empty or "/" - both trigger the 'next unless' check in IkiWiki/Rcs/svn.pm:rcs_recentchanges() as shown in the patch below, thus causing all files to be ignored for [[RecentChanges]].
+
+I cannot see why this check is needed in the first place, so here's a patch for removing it :)
+
+ diff -upr ikiwiki-1.49.orig/IkiWiki/Rcs/svn.pm ikiwiki-1.49/IkiWiki/Rcs/svn.pm
+ --- ikiwiki-1.49.orig/IkiWiki/Rcs/svn.pm Mon Apr 16 15:15:09 2007
+ +++ ikiwiki-1.49/IkiWiki/Rcs/svn.pm Mon Apr 16 15:15:47 2007
+ @@ -176,7 +176,6 @@ sub rcs_recentchanges ($) {
+ }
+
+ foreach (keys %{$logentry->{paths}}) {
+ - next unless /^\/\Q$config{svnpath}\E\/([^ ]+)(?:$|\s)/;
+ my $file=$1;
+ my $diffurl=$config{diffurl};
+ $diffurl=~s/\[\[file\]\]/$file/g;
+
+> It's necessary for wikis, such as this one, that keep other things in the
+> same svn repository. Bug [[fixed|done]]. --[[Joey]]
diff --git a/doc/bugs/RecentChanges_contains_invalid_XHTML.mdwn b/doc/bugs/RecentChanges_contains_invalid_XHTML.mdwn
new file mode 100644
index 000000000..eb95e9992
--- /dev/null
+++ b/doc/bugs/RecentChanges_contains_invalid_XHTML.mdwn
@@ -0,0 +1,55 @@
+The final `</div>` in `recentchanges.tmpl` gets wrapped in a
+`<p>` tag for some reason, resulting in the following invalid XHTML at
+the end of the [[RecentChanges]] page
+
+ <p></div></p>
+
+> I'll bet this is fixed if you use the markdown 1.2 prerelease, which has
+> a much less buggy html parser. (Ah, I see below that was the case.)
+> --[[Joey]]
+
+Also, there is a problem with the `<img>` tags generated by the smiley
+plugin which end up wrapped in a `<pre>` tag in the inline diff output.
+`<img>` tags are not allowed within a `<pre>` block. Maybe the smiley
+plugin should be disabled on [[RecentChanges]]?
+
+> See [[Smileys_in_the_block_code]], which is now fixed. --[[Joey]]
+
+See the [validator output][validate] for more details.
+
+ [validate]: http://validator.w3.org/check?uri=http://ikiwiki.info/recentchanges/
+
+- - -
+
+I'll add this here since it's related. I also noticed that the meta tags for
+redirected pages need to be closed in order to be valid XHTML:
+
+ <meta http-equiv="refresh" content="10; URL=../ikiwiki/pagespec/">
+
+I'm noticing these problems because I'm serving ikiwiki-generated
+content as `application/xhtml+xml` (as opposed to `text/html`) in order
+to include inline MathML. Any invalid XHTML causes Firefox to halt all
+processing and throw an error. &mdash;[Jason Blevins](http://jblevins.org/)
+
+- - -
+
+Here is a simple patch for the refresh problem. I haven't figured out
+what's causing the recentchanges bug yet.
+
+--[[JasonBlevins]]
+
+> Thanks, applied that patch. --[[Joey]]
+
+- - -
+
+It turns out that the invalid XHTML on the recent changes page is due to
+a bug in Markdown. I was using the packaged version of markdown in
+Ubuntu (Gutsy and markdown 1.0.1-6). Everything is fine
+after installing the most recent version of Text::Markdown from CPAN.
+
+Note that the above patch for the redirect tag is still applicable and
+the smiley issue remains open. --[[JasonBlevins]]
+
+> This bug is [[done]], all issues are fixed. --[[Joey]]
+
+[[!tag patch]]
diff --git a/doc/bugs/RecentChanges_links_to_deleted_pages.mdwn b/doc/bugs/RecentChanges_links_to_deleted_pages.mdwn
new file mode 100644
index 000000000..0eaeafb0c
--- /dev/null
+++ b/doc/bugs/RecentChanges_links_to_deleted_pages.mdwn
@@ -0,0 +1,15 @@
+[[RecentChanges]] should not link to pages that are being deleted. For
+example, see the change with the title 'add news item for ikiwiki 2.60',
+which includes the deletion of "news/version 2.52". Maybe it should be made
+clear in RecentChanges that the change to the file was its deletion.
+
+> It needs to link to the deleted page so that you can recreate the page if
+> desired.
+>
+> The link is not of the normal form used for a link to a nonexistent page,
+> instead it redirects through a CGI. This is done because updating the
+> links would require rebuilding all change pages each time, which would be
+> 100x as slow as the current method.
+>
+> I don't feel that being 100 times faster at the expense of a marginally
+> inconsistent, but still usable interface is a bug. --[[Joey]] [[done]]
diff --git a/doc/bugs/Remove_redirect_pages_from_inline_pages.mdwn b/doc/bugs/Remove_redirect_pages_from_inline_pages.mdwn
new file mode 100644
index 000000000..a43bd408f
--- /dev/null
+++ b/doc/bugs/Remove_redirect_pages_from_inline_pages.mdwn
@@ -0,0 +1,15 @@
+[[!tag bugs wishlist]]
+
+
+I accidentally made a typo spelling "surprises" and changed my URL from
+
+<http://natalian.org/archives/2012/12/04/Singapore_banking_suprises/>
+to
+<http://natalian.org/archives/2012/12/04/Singapore_banking_suprises/>
+
+Using a meta redir. However, the meta redir page now appears in the index of <http://natalian.org/>
+
+Any ideas how to handle this situation?
+
+> Well, you can adjust the inline's pagespec to exclude it, or even tag it
+> with a tag that the pagespec is adjusted to exclude. --[[Joey]]
diff --git a/doc/bugs/Renaming_a_file_via_the_web_is_failing_when_using_subversion.mdwn b/doc/bugs/Renaming_a_file_via_the_web_is_failing_when_using_subversion.mdwn
new file mode 100644
index 000000000..1a737df0a
--- /dev/null
+++ b/doc/bugs/Renaming_a_file_via_the_web_is_failing_when_using_subversion.mdwn
@@ -0,0 +1,28 @@
+I'm using ikiwiki 3.12 on Mac OS X (installed via mac ports)
+
+When trying to rename a file via the web interface (using the rename plugin) I get the following error:
+
+Error: Undefined subroutine &IkiWiki::Plugin::svn::dirname called at /opt/local/lib/perl5/vendor_perl/5.8.9/IkiWiki/Plugin/svn.pm line 246.
+
+Applying the following patch fixed it:
+
+ --- IkiWiki/Plugin/svn.pm.orig 2009-07-08 12:25:23.000000000 -0400
+ +++ IkiWiki/Plugin/svn.pm 2009-07-08 12:28:36.000000000 -0400
+ @@ -243,10 +243,10 @@
+
+ if (-d "$config{srcdir}/.svn") {
+ # Add parent directory for $dest
+ - my $parent=dirname($dest);
+ + my $parent=IkiWiki::dirname($dest);
+ if (! -d "$config{srcdir}/$parent/.svn") {
+ while (! -d "$config{srcdir}/$parent/.svn") {
+ - $parent=dirname($dest);
+ + $parent=IkiWiki::dirname($parent);
+ }
+ if (system("svn", "add", "--quiet", "$config{srcdir}/$parent") != 0) {
+ warn("svn add $parent failed\n");
+
+
+> Thank you very much for the patch, which I've applied. I wonder how
+> that snuck in (aside from the obvious, that the svn plugin is not often
+> used and the code was added w/o being tested..). [[done]] --[[Joey]]
diff --git a/doc/bugs/Running_on_an_alternative_port_fails.mdwn b/doc/bugs/Running_on_an_alternative_port_fails.mdwn
new file mode 100644
index 000000000..942700ba3
--- /dev/null
+++ b/doc/bugs/Running_on_an_alternative_port_fails.mdwn
@@ -0,0 +1,93 @@
+I can't get the 'wiki' functions (i.e. editing) running when ikiwiki is running on a port other than the default (port 80). Somewhere in the processing it considers the base URL to exclude the port number, and the web server throws back an error finding the page.
+
+For example, if you run on 'http://my.gear.xxx:8080/' then after clicking login (using default password auth) it will process and try to redirect you to 'http://my.gear.xxx/cgi-bin/ikiwiki.cgi'. I'm assuming that somewhere we've used the 'path' and the 'host' and dropped the remainder. I can't figure out where this is yet, but I'll post back if I get lucky.
+
+ -- fergus
+
+NB: both the 'url' and the 'cgiurl' include the port and removing the port element provides the expected functionality.
+
+---
+
+> I tried to reproduce this by making my laptop's web server use port
+> 8080. Set up ikiwiki to use that in cgiurl and url, and had
+> no problem with either openid or password auth login.
+>
+> Ikiwiki has had some changes in this area in the past year; you don't say
+> what version you were using. It could also be a problem with your web
+> server, conceivably, if it didn't correctly communicate the port to the cgi
+> program. --[[Joey]]
+
+---
+
+>> I did think of that, so I threw in a 'printenv' script to check that the
+>> port was arriving correctly.
+
+>>> SERVER_PORT=8181
+>>> HTTP_HOST=zippy0.ie0.cobbled.net
+
+[ ... ]
+
+>>>> In apache, `HTTP_HOST` includes the port. This is not part of the CGI
+>>>> spec it seems, but perl's `CGI` module seems to rely on it,
+>>>> in `virtual_port`:
+
+>>>>> my $vh = $self->http('x_forwarded_host') || $self->http('host');
+>>>>> my $protocol = $self->protocol;
+>>>>> if ($vh) {
+>>>>> return ($vh =~ /:(\d+)$/)[0] || ($protocol eq 'https' ? 443 : 80);
+
+>>>> The `CGI` module only looks at `SERVER_PORT` when there's no
+>>>> `HTTP_HOST`. So this is either a bug in perl's CGI or thttpd.
+>>>> --[[Joey]]
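Joey's description of `virtual_port` can be sketched outside Perl; a hypothetical Python rendition of the same lookup order (not CGI.pm itself) shows why a `Host:` header with the port stripped yields port 80:

```python
import re

def virtual_port(env):
    # Mimic CGI.pm's lookup order: a port embedded in the Host: header
    # wins; SERVER_PORT is only consulted when there is no Host header.
    host = env.get("HTTP_X_FORWARDED_HOST") or env.get("HTTP_HOST")
    https = env.get("HTTPS", "").lower() == "on"
    if host:
        m = re.search(r":(\d+)$", host)
        return int(m.group(1)) if m else (443 if https else 80)
    return int(env.get("SERVER_PORT", "443" if https else "80"))

# thttpd strips the port, so the fallback is the protocol default:
print(virtual_port({"HTTP_HOST": "zippy0.ie0.cobbled.net", "SERVER_PORT": "8181"}))  # 80
```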
+
+[ ... ]
+
+---
+
+>>>>> This is interesting. If HTTP_HOST is wrong then
+
+>>>>> 0. the client header must be wrong (i.e. not including the PORT)
+>>>>> 0. `perl`'s doing something bad[tm] (or at least lazy)
+>>>>> 0. `apache` is adding it
+>>>>> 0. `thttpd` is stripping it
+
+>>>>> Quick hack shows that `thttpd` must be stripping the port
+>>>>> number from the `Host:` header. That can be fixed.
+
+>>>>> Thanks for the assist. -- fergus
+
+---
+
+Patch for `thttpd-2.25b` for posterity and completeness
+
+[[!format patch """
+
+diff --git a/libhttpd.c b/libhttpd.c
+index 73689be..039b7e3 100644
+--- a/libhttpd.c
++++ b/libhttpd.c
+@@ -2074,9 +2074,6 @@ httpd_parse_request( httpd_conn* hc )
+ cp = &buf[5];
+ cp += strspn( cp, " \t" );
+ hc->hdrhost = cp;
+- cp = strchr( hc->hdrhost, ':' );
+- if ( cp != (char*) 0 )
+- *cp = '\0';
+ if ( strchr( hc->hdrhost, '/' ) != (char*) 0 || hc->hdrhost[0] == '.' )
+ {
+ httpd_send_err( hc, 400, httpd_err400title, "", httpd_err400form, "" );
+
+"""]]
+
+-- fergus
+
+---
+
+I've gone ahead and filed a bug on CGI.pm too:
+<https://rt.cpan.org/Ticket/Display.html?id=72678> --[[Joey]]
+
+---
+
+That'll be an interesting discussion, as I'd suggest that `HTTP_` headers are defined in the CGI specification as client headers, and thus what `thttpd` is doing is wrong (i.e. mangling the client's own representation). Whether a CGI program should trust the `HTTP_` headers over the server's variables is probably already settled by convention.
+
+-- fergus
diff --git a/doc/bugs/SSI_include_stripped_from_mdwn.mdwn b/doc/bugs/SSI_include_stripped_from_mdwn.mdwn
new file mode 100644
index 000000000..270da86d3
--- /dev/null
+++ b/doc/bugs/SSI_include_stripped_from_mdwn.mdwn
@@ -0,0 +1,21 @@
+If I have a &lt;!--#include virtual="foo" --&gt; in some file, it gets stripped, even though other HTML comments don't get stripped. I imagine it's some plugin doing it, or IkiWiki itself, or an IkiWiki dependency, but I haven't found where this is happening. I'm using SSI to work around my sidebars forcing a rebuild of the wiki every day when the date changes - I use the calendar plugin.
+
+> It is probably the [[plugins/htmlscrubber]] plugin. -- [[Jon]]
+
+> htmlscrubber does strip these, because they look like
+> a html tag to it, not a html comment. (html comments start
+> with `<!--` .. of course, they get stripped too, because
+> they can be used to hide javascript..)
+>
+> Anyway, it makes sense for the htmlscrubber to strip server-side
+> includes because otherwise your wiki could be attacked
+> by them being added to it. If you want to use both the htmlscrubber and
+> SSI together, I'd suggest you modify the [[templates]]
+> and put the SSI on there.
+>
+> Ie, `page.tmpl` has a
+> div that the sidebar is put into; if you just replace
+> that with the SSI that includes your static sidebar,
+> you should be good to go. --[[Joey]]
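For illustration, the change Joey describes might look something like this in `page.tmpl` (the div id and include path here are hypothetical, and SSI must be enabled for the generated pages in your web server):

```html
<!-- replace the div the sidebar plugin fills in with a server-side include -->
<div id="sidebar">
<!--#include virtual="/sidebar.html" -->
</div>
```

Since the template is not run through the htmlscrubber (only page content is sanitized), the include survives into the output.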
+
+[[done]]
diff --git a/doc/bugs/SVG_files_not_recognized_as_images.mdwn b/doc/bugs/SVG_files_not_recognized_as_images.mdwn
new file mode 100644
index 000000000..b14cd9b94
--- /dev/null
+++ b/doc/bugs/SVG_files_not_recognized_as_images.mdwn
@@ -0,0 +1,39 @@
+In ikiwiki 2.66, SVG images are not recognized as images. In ikiwiki.pm,
+the hardcoded list of image file extensions does not include ".svg", which
+it probably should unless there's some other issue about rendering SVGs?
+
+The 'img' plugin also seems to not support SVGs.
+
+> SVG images can only be included via an `<object>`, `<embed>`, or
+> `<iframe>` tag. Or, perhaps as [inline SVG](http://wiki.svg.org/Inline_SVG).
+> The [[plugins/htmlscrubber]] strips all three tags since they can easily
+> be used maliciously. If doing inline SVG, I'd worry that the svg file
+> could be malformed and mess up the html, or even inject javascript. So,
+> the only options seem to be only supporting svgs on wikis that do not
+> sanitize their html, or assuming that svgs are trusted content and
+> embedding them inline. None of which seem particularly palatable.
+>
+> I suppose the other option would be converting the svg file to a static
+> image (png). The img plugin could probably do that fairly simply.
+> --[[Joey]]
+
+>> This seems to have improved since; at least chromium can display svg
+>> images from `<img>` tags. Firefox 3.5.19 did not in my testing.
+>>
+>> So, svgs can now be included on pages by linking to them, or by using
+>> the img directive. The most portable thing is to use the img directive
+>> plus some size, which forces them to be resized and a png to actually
+>> be displayed.
+>>
+>> I have not yet tried to do anything with sanitizing them. --[[Joey]]
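For example, a directive along these lines (the filename and size are made up) should force a resized PNG to be generated and displayed:

```
\[[!img diagram.svg size="600x"]]
```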
+
+>> I'm working on inline SVG and MathML support in ikiwiki and I've
+>> modified my htmlscrubber to sanitize SVG and MathML using the
+>> whitelists from html5lib. Here's a [patch][]. I've also made some
+>> notes about this here: [[todo/svg]].
+>>
+>> I suspect that this bug may have caught the eye of anyone interested
+>> in this sort of thing. I'll elaborate a bit on my user page to avoid
+>> getting off-topic here. --[[JasonBlevins]], October 21, 2008
+
+ [patch]: http://xbeta.org/gitweb/?p=xbeta/ikiwiki.git;a=blobdiff;f=IkiWiki/Plugin/htmlscrubber.pm;h=3c0ddc8f25bd8cb863634a9d54b40e299e60f7df;hp=3bdaccea119ec0e1b289a0da2f6d90e2219b8d66;hb=fe333c8e5b4a5f374a059596ee698dacd755182d;hpb=be0b4f603f918444b906e42825908ddac78b7073
diff --git a/doc/bugs/Search_Help_doesn__39__t_exist.mdwn b/doc/bugs/Search_Help_doesn__39__t_exist.mdwn
new file mode 100644
index 000000000..9db466d62
--- /dev/null
+++ b/doc/bugs/Search_Help_doesn__39__t_exist.mdwn
@@ -0,0 +1,8 @@
+When you click on [Help](http://ikiwiki.info/ikiwiki.cgi?navi=2) in the search results, all you get is a message saying:
+
+ The file "estseek.help" is not found.
+
+This is true both on my personal wiki and on ikiwiki.info. --[[sabr]]
+
+> [[fixed|done]] by putting the full path to the file in the template.
+> (assuming it's always in the same place...) --[[Joey]]
diff --git a/doc/bugs/Search_results_should_point_to_dir__44___not_index.html__44___when_use__95__dirs_is_enabled.mdwn b/doc/bugs/Search_results_should_point_to_dir__44___not_index.html__44___when_use__95__dirs_is_enabled.mdwn
new file mode 100644
index 000000000..5f36e21df
--- /dev/null
+++ b/doc/bugs/Search_results_should_point_to_dir__44___not_index.html__44___when_use__95__dirs_is_enabled.mdwn
@@ -0,0 +1,15 @@
+When using the search plugin, results are linked to `foo/bar/index.html`,
+whether or not use_dirs is enabled; when use_dirs is enabled, the link should
+point to `foo/bar/` instead.
+
+> Similarly, after editing a page `foo/bar/`, the user is redirected to
+> `foo/bar/index.html?updated` instead of `foo/bar/?updated`.
+> --[Jason Blevins](http://jblevins.org/)
+
+>> Even with `usedirs`, there is no reason why the `index.html` should be called directly, and it might break content negotiation. Please just direct to the directory. --[[madduck]]
+
+> This bug affects the [[plugins/amazon_s3]] plugin -- when using that
+> plugin plus the search plugin, you need to enable `amazon_s3_dupindex`.
+> So this definitely should be fixed. --[[Joey]]
+
+> [[done]], the new xapian search uses nice urls
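The idea behind the fix can be sketched in a few lines of Python (a hypothetical helper, not ikiwiki's actual code):

```python
def nice_url(url):
    # With usedirs, every page is rendered as foo/bar/index.html, so a
    # search hit can simply link to the enclosing directory instead.
    suffix = "/index.html"
    if url.endswith(suffix):
        return url[:-len("index.html")]  # keep the trailing slash
    return url
```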
diff --git a/doc/bugs/Search_summary_includes_text_from_navigational_elements.mdwn b/doc/bugs/Search_summary_includes_text_from_navigational_elements.mdwn
new file mode 100644
index 000000000..b774c4531
--- /dev/null
+++ b/doc/bugs/Search_summary_includes_text_from_navigational_elements.mdwn
@@ -0,0 +1,22 @@
+Each listed result for a search shows some example text from the beginning of the linked page. HTML elements are stripped out, but any navigational text items remain.
+
+For example, each search result on ikiwiki.info shows "(title) ikiwiki/ (title) Edit RecentChanges History Preferences Discussion" at the start of its results.
+
+A way to name, in the ikiwiki setup file, some CSS ids that should be removed from search results would work. Here's something similar that a friend proposed:
+
+http://leaf.dragonflybsd.org/mailarchive/users/2009-11/msg00077.html
+
+(bin attachment on that page is actually a .diff.)
+
+> So I was looking at this and I realized that while the search plugin used
+> to use the format hook, and so there was no way to avoid it seeing all
+> the gunk around the page body, it was changed a while ago for different
+> reasons to use its own hook, postscan. So there's really no reason not
+> to move postscan so it runs before said gunk is added to the page.
+> (Aside from a small risk of breaking other third-party plugins that
+> somehow use postscan.)
+>
+> I've implemented that in git, and it drops the navigation elements nicely.
+> It's perhaps less general than allowing specific divs to be skipped from
+> search, but it seems good enough. Please thank the dragonfly guys for their
+> work on this. [[done]] --[[Joey]]
diff --git a/doc/bugs/Site_title_not_clickable_while_adding_a_comment.mdwn b/doc/bugs/Site_title_not_clickable_while_adding_a_comment.mdwn
new file mode 100644
index 000000000..1347be4b0
--- /dev/null
+++ b/doc/bugs/Site_title_not_clickable_while_adding_a_comment.mdwn
@@ -0,0 +1,9 @@
+When I add a comment to a page, its title should be a hyperlink. This would make it easier to re-open the page to re-read parts of it.
+
+I.e. when adding a comment to this page, the last part should be a hyperlink, as well:
+
+ ikiwiki/ bugs/ creating Site title not clickable while adding a comment
+
+
+
+Richard
diff --git a/doc/bugs/Slow_Filecheck_attachments___34__snails_it_all__34__.mdwn b/doc/bugs/Slow_Filecheck_attachments___34__snails_it_all__34__.mdwn
new file mode 100644
index 000000000..e93f4e546
--- /dev/null
+++ b/doc/bugs/Slow_Filecheck_attachments___34__snails_it_all__34__.mdwn
@@ -0,0 +1,47 @@
+Saving a wiki page in ikiwiki, or running
+<tt>ikiwiki --setup wiki.setup --rebuild</tt>, takes a **dozen minutes** on a tiny tiny wiki (10 user-added pages)!
+
+I profiled ikiwiki with [[!cpan Devel::SmallProf]] : see [[users/mathdesc]] for details.
+
+And I came to the conclusion that [[plugins/filecheck]] on attachments was the only cause.
+It always falls through to the fallback code using the time-consuming `file` command,
+even though that doesn't look successful either.
+
+<pre>
+ # Get the mime type.
+ #
+ # First, try File::Mimeinfo. This is fast, but doesn't recognise
+ # all files.
+ eval q{use File::MimeInfo::Magic};
+ my $mimeinfo_ok=! $@;
+ my $mimetype;
+ if ($mimeinfo_ok) {
+ my $mimetype=File::MimeInfo::Magic::magic($file);
+ }
+
+ # Fall back to using file, which has a more complete
+ # magic database.
+ if (! defined $mimetype) {
+ open(my $file_h, "-|", "file", "-bi", $file);
+ $mimetype=<$file_h>;
+ chomp $mimetype;
+ close $file_h;
+ }
+ if (! defined $mimetype || $mimetype !~s /;.*//) {
+ # Fall back to default value.
+ $mimetype=File::MimeInfo::Magic::default($file)
+ if $mimeinfo_ok;
+ if (! defined $mimetype) {
+ $mimetype="unknown";
+ }
+ }
+</pre>
+
+I found on [[plugins/filecheck/discussion/]] what [[users/DavidBremner/]] described as:
+> no way to detect text/plain using File::MimeInfo::Magic::magic()
+But I can't figure out whether my issue is broader and includes this or not.
+
+Any ideas or solutions are more than welcome. :)
+
+> [[done]], as isbear noted in [[discussion]], there was a bug that
+> prevented File::MimeInfo::Magic from ever being used. --[[Joey]]
diff --git a/doc/bugs/Slow_Filecheck_attachments___34__snails_it_all__34__/discussion.mdwn b/doc/bugs/Slow_Filecheck_attachments___34__snails_it_all__34__/discussion.mdwn
new file mode 100644
index 000000000..629aba71e
--- /dev/null
+++ b/doc/bugs/Slow_Filecheck_attachments___34__snails_it_all__34__/discussion.mdwn
@@ -0,0 +1,141 @@
+## Foreword
+Disabling filecheck is not actually possible, because doing so causes attachment.pm to malfunction,
+along with any pagespec that contains a *mimetype* condition.
+
+attachment.pm imports filecheck "statically", so actually disabling it should be *forbidden*.
+
+<pre>
+sub import {
+ add_underlay("attachment");
+ add_underlay("javascript");
+ add_underlay("jquery");
+ hook(type => "getsetup", id => "attachment", call => \&getsetup);
+ hook(type => "checkconfig", id => "attachment", call => \&checkconfig);
+ hook(type => "formbuilder_setup", id => "attachment", call => \&formbuilder_setup);
+ hook(type => "formbuilder", id => "attachment", call => \&formbuilder, last => 1);
+ IkiWiki::loadplugin("filecheck");
+}
+</pre>
+
+----
+
+## How bad is it?
+
+So I tried, on three pages, to inline <tt>!mimetype(image/*)</tt> while allowing attachment of <tt>mimetype(image/*)</tt>.
+
+My profiling tests in the bug report show that most of the time is spent in the "fall back to using file" block of code,
+so I tried commenting that block out to see how it would perform. Obviously this is much, much faster ... but is the mimetype
+then discovered using only *File::MimeInfo*?
+
+
+Dumping some strings to STDERR before returning, then rebuilding. This is just a [[!toggle id="code-test" text="dumpdebug addition"]]
+
+[[!toggleable id="code-test" text="""
+<pre>
+sub match_mimetype ($$;@) {
+ my $page=shift;
+ my $wanted=shift;
+
+ my %params=@_;
+ my $file=exists $params{file} ? $params{file} : IkiWiki::srcfile($IkiWiki::pagesources{$page});
+ if (! defined $file) {
+ return IkiWiki::ErrorReason->new("file does not exist");
+ }
+
+ # Get the mime type.
+ #
+ # First, try File::Mimeinfo. This is fast, but doesn't recognise
+ # all files.
+ eval q{use File::MimeInfo::Magic};
+ my $mimeinfo_ok=! $@;
+ my $mimetype;
+ print STDERR " --- match_mimetype (".$file.")\n";
+ if ($mimeinfo_ok) {
+ my $mimetype=File::MimeInfo::Magic::magic($file);
+ }
+
+ # Fall back to using file, which has a more complete
+ # magic database.
+ #if (! defined $mimetype) {
+ # open(my $file_h, "-|", "file", "-bi", $file);
+ # $mimetype=<$file_h>;
+ # chomp $mimetype;
+ # close $file_h;
+ #}
+
+ if (! defined $mimetype || $mimetype !~s /;.*//) {
+ # Fall back to default value.
+ $mimetype=File::MimeInfo::Magic::default($file)
+ if $mimeinfo_ok;
+ if (! defined $mimetype) {
+ $mimetype="unknown";
+ }
+ }
+
+ my $regexp=IkiWiki::glob2re($wanted);
+ if ($mimetype!~$regexp) {
+ print STDERR " xxx MIME unknown ($mimetype - $wanted - $regexp ) \n";
+ return IkiWiki::FailReason->new("file MIME type is $mimetype, not $wanted");
+ }
+ else {
+ print STDERR " vvv MIME found\n";
+ return IkiWiki::SuccessReason->new("file MIME type is $mimetype");
+ }
+}
+</pre>
+"""]]
+
+The resulting dump to stderr (redirected to a file called, say, *mime*) looks like this:
+<pre>
+--- match_mimetype (/usr/share/ikiwiki/attachment/ikiwiki/jquery.fileupload-ui.js)
+ xxx MIME unknown (text/plain - image/* - (?i-xsm:^image\/.*$) )
+ --- match_mimetype (/usr/share/ikiwiki/locale/fr/directives/ikiwiki/directive/fortune.mdwn)
+ xxx MIME unknown (text/plain - image/* - (?i-xsm:^image\/.*$) )
+ --- match_mimetype (/usr/share/ikiwiki/locale/fr/basewiki/shortcuts.mdwn)
+ xxx MIME unknown (text/plain - image/* - (?i-xsm:^image\/.*$)
+ --- match_mimetype (/usr/share/ikiwiki/smiley/smileys/alert.png)
+ xxx MIME unknown (application/octet-stream - image/* - (?i-xsm:^image\/.*$) )
+ --- match_mimetype (/usr/share/ikiwiki/attachment/ikiwiki/images/ui-bg_flat_75_ffffff_40x100.png)
+ xxx MIME unknown (application/octet-stream - image/* - (?i-xsm:^image\/.*$)
+</pre>
+
+<tt>---</tt> prefix marks the file under analysis<br/>
+<tt>xxx</tt> prefix marks a failure return: the MIME type is unknown, the match failed<br/>
+<tt>vvv</tt> prefix marks a success return.<br/>
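The `glob2re` translation visible in the log above, where `image/*` becomes `(?i-xsm:^image\/.*$)`, can be sketched in Python (an approximation of `IkiWiki::glob2re`, not the real code):

```python
import re

def glob2re(glob):
    # Anchor the glob and translate its wildcards into regex form,
    # case-insensitively, matching the (?i-xsm:^image\/.*$) output above.
    pattern = re.escape(glob).replace(r"\*", ".*").replace(r"\?", ".")
    return re.compile("^" + pattern + "$", re.IGNORECASE)
```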
+
+
+These are nasty, scary results! Am I missing something, or is this MIME filecheck plain broken?
+
+*Question 1*: How many files have been analysed? **3055** (and on a tiny tiny wiki!)
+<pre>grep "^ --- " mime | wc -l
+3055
+</pre>
+
+*Question 2*: How many times does it fail? *All the time.*
+<pre>
+ grep "^ xxx " mime | wc -l
+3055
+</pre>
+
+*Question 1bis*: Doh, BTW, how many files have been re-analysed? **2835** (only 220 of the 3055 are unique). OMG!!
+<pre>grep "^ --- " mime | sort -u | wc -l
+220
+</pre>
+
+## Conclusion
+
+- Only the system command *file -bi* works. While it **should** be easy on the CPU, it's also hard on the I/O -> VM :(
+- Something is nasty with the MIME implementation and/or my system configuration -> hints? :D
+- Caching is needed during the rebuild: the same page need not be rechecked for its MIME type while it's locked!
+
+
+--mathdesc
+
+> > if ($mimeinfo_ok) {
+> > my $mimetype=File::MimeInfo::Magic::magic($file);
+> > }
+>
+> That seems strange to me: `my` restricts the scope of `$mimetype` to the enclosing if block, so the assigned value is dropped. I think that is the problem.
+> Try removing that stray `my`.
+>
+> --isbear
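For illustration, the corrected block would assign to the outer variable (a sketch of isbear's suggestion, not the actual committed fix):

```perl
if ($mimeinfo_ok) {
	# no "my" here: assign to the $mimetype declared above, instead of
	# to a new variable that vanishes at the end of this block
	$mimetype=File::MimeInfo::Magic::magic($file);
}
```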
diff --git a/doc/bugs/Smileys_in_the_block_code.mdwn b/doc/bugs/Smileys_in_the_block_code.mdwn
new file mode 100644
index 000000000..b7854705b
--- /dev/null
+++ b/doc/bugs/Smileys_in_the_block_code.mdwn
@@ -0,0 +1,34 @@
+My backported ikiwiki 1.48 converts smileys inside code blocks incorrectly:
+I see the HTML code of the smiley image, instead of the smiley image itself.
+
+For example, I'd like to save a thread from the courier-users mailing list
+that interests me. Please look below to see what ikiwiki does with its smileys:
+
+ From: Bernd Wurst <bernd@bw...>
+ Subject: Re: [courier-users] Uploaded my SRS implementation for courier to
+ the web
+ To: courier-users@li...
+ Date: Sat, 17 Mar 2007 19:02:10 +0100
+
+ Hi.
+
+ Am Samstag, 17. Mrz 2007 schrieb Matthias Wimmer:
+ > See the graphic on http://www.openspf.org/SRS at the bottom on the left
+ > side. You will find an example there how rewriting an already rewritten
+ > address works.
+
+ Ah, ok, didn't know that. :)
+ Thanks for the pointer!
+
+ cu, Bernd
+
+BTW, maybe converting smileys inside code blocks should be disabled entirely?
+
+--[[Paweł|ptecza]]
+
+> Looks similar to [[wiki_links_still_processed_inside_code_blocks]]; in both
+> cases, substitution happens in a code block, which it shouldn't.
+> --[[JoshTriplett]]
+
+> [[fixed|done]], via some super duper regexp fun to notice if the smiley
+> is inside a pre or code tag. --[[Joey]]
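The approach can be sketched in Python (a toy renderer with a made-up smiley map, not ikiwiki's actual regexp):

```python
import re

SMILEYS = {":)": "<img src=\"smileys/smile.png\" alt=\":)\" />"}  # hypothetical map

def render_smileys(html):
    # Split out <pre>/<code> blocks and substitute smileys only outside them.
    parts = re.split(r"(<(?:pre|code)>.*?</(?:pre|code)>)", html, flags=re.S)
    out = []
    for part in parts:
        if not part.startswith(("<pre>", "<code>")):
            for smiley, img in SMILEYS.items():
                part = part.replace(smiley, img)
        out.append(part)
    return "".join(out)
```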
diff --git a/doc/bugs/Spaces_in_link_text_for_ikiwiki_links.mdwn b/doc/bugs/Spaces_in_link_text_for_ikiwiki_links.mdwn
new file mode 100644
index 000000000..8aea5cd29
--- /dev/null
+++ b/doc/bugs/Spaces_in_link_text_for_ikiwiki_links.mdwn
@@ -0,0 +1,53 @@
+Versions 2.0 and 2.1 of ikiwiki, and I think earlier versions as well,
+allowed wiki links to have spaces in the link text. For example, [[!ikiwiki
+logo page|logo]] should create an anchor tag referencing the logo page, and
+[[!ikiwiki logo|logo/ikiwiki.png]] should create an image tag referencing
+the logo.
+
+As of version 2.2, this no longer works. I think the pattern \\[[...|...]]
+should allow spaces before the pipe. I suspect this is the same problem as
+reported in [[index/discussion#index11h1]].
+
+> The above examples are ambiguous, only worked due to a bug, and were
+> never documented to work. So I'm not inclined to re-add support for them.
+>
+> If you look at [[ikiwiki/WikiLink]], it is clear that spaces cannot be used in
+> WikiLinks. It also shows how to use underscores in the link text if you
+> want multiple words.
+>
+> This was a decision I made a long time ago due to the ambiguity between a
+> WikiLink and a [[ikiwiki/Directive]]. Is "\[[foo bar|baz]]" a wikilink to
+> baz with a link text of "foo bar", or an instance of preprocessor
+> directive "foo" with a parameter of "bar|baz"? If it's interpreted as a
+> wikilink today, that could change tomorrow if a new preprocessor directive
+> is added.
+>
+> Before version 2.2, ikiwiki actually first treated it as a preprocessor
+> directive. If that failed, it output the preprocessor directive back onto
+> the page, and next the wikilink code tried treating it as a wikilink.
+> In 2.2, I fixed several problems with the way an unhandled preprocessor
+> directive was re-output onto the page, by prefixing it with a '\' ...
+> which makes it not be treated as a WikiLink.
+>
+> If WikiLinks had ever been documented to work with spaces in them, then
+> I'd feel I needed to support the pre 2.2 behavior, but I don't feel that
+> I have to support old behavior that was never documented and happened due
+> to a bug, so I currently have no plans to bring the old behavior back.
+> --[[Joey]]
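The ambiguity can be seen mechanically: with two naive patterns (illustrative only, not ikiwiki's real parser), the same text matches both readings:

```python
import re

text = "[[foo bar|baz]]"

# Reading 1: a preprocessor directive "foo" with the parameter "bar|baz"
directive = re.match(r"\[\[(\w+)\s+(.+)\]\]", text)

# Reading 2: a wikilink to "baz" with the link text "foo bar"
wikilink = re.match(r"\[\[([^\]]+)\|([^\]]+)\]\]", text)
```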
+
+>> I agree that the grammar should be unambiguous. It seems to me that the
+>> problem with spaces-in-wikilinks is caused by overloading the wikilink
+>> and preprocessor syntax to use the same symbols. If they didn't (and is
+>> there much advantage in them using the same symbols? I know in some
+>> cases you have something which is a wikilink and a preprocessor directive,
+>> but how often?) there'd be no problem with spaces.
+>>
+>> If there was ever a future, syntax-breaking major release of ikiwiki
+>> (similar to python3000) I'd like to see this fixed as part of that.
+>> --[[users/Jon]]
+
+>>> You can enable `prefix_directives` and get the disambiguated behavior
+>>> and spaces in wikilinks today. It will become the default in 3.0.
+>>> --[[Joey]]
+
+[[done]]
diff --git a/doc/bugs/Spurious___60__p__62___elements_added_to_tags_in_inliine_pages.mdwn b/doc/bugs/Spurious___60__p__62___elements_added_to_tags_in_inliine_pages.mdwn
new file mode 100644
index 000000000..e3b1d858d
--- /dev/null
+++ b/doc/bugs/Spurious___60__p__62___elements_added_to_tags_in_inliine_pages.mdwn
@@ -0,0 +1,43 @@
+[[!tag patch]]
+
+When a page containing tags and using the [[syntax_(3rd_party)_plugin|plugins/contrib/syntax]] (though pages using other preprocessors may also be affected) is rendered as an inline page, some extra `<p>` elements are added.
+
+Example output:
+
+ <p><span class="tags">
+ Tags:</p>
+
+ <p><span class="selflink">XML</span></p>
+
+ <p></span></p>
+
+Expected output:
+
+ <p><span class="tags">
+ Tags:
+
+ <span class="selflink">XML</span>
+ </span></p>
+
+A fix is to change inlinepage.tmpl to remove new lines around tag links, as follows:
+
+ --- templates/inlinepage.tmpl (revision 4626)
+ +++ templates/inlinepage.tmpl (working copy)
+ @@ -24,9 +24,7 @@
+ <TMPL_IF NAME="TAGS">
+ <span class="tags">
+ Tags:
+ -<TMPL_LOOP NAME="TAGS">
+ -<TMPL_VAR NAME=LINK>
+ -</TMPL_LOOP>
+ +<TMPL_LOOP NAME="TAGS"> <TMPL_VAR NAME=LINK></TMPL_LOOP>
+ </span>
+ </TMPL_IF>
+
+> I'm sure this is only working around a symptom; the problem must be that
+> markdown gets confused by the html generated by the syntax plugin.
+> Have you tried markdown 1.0.2? This version has a more robust html
+> parser.
+>
+> I don't have the prerequisites for the syntax plugin installed here
+> to debug it myself. --[[Joey]]
diff --git a/doc/bugs/Spurious___60__p__62___elements_added_to_tags_in_inliine_pages/discussion.mdwn b/doc/bugs/Spurious___60__p__62___elements_added_to_tags_in_inliine_pages/discussion.mdwn
new file mode 100644
index 000000000..32a647597
--- /dev/null
+++ b/doc/bugs/Spurious___60__p__62___elements_added_to_tags_in_inliine_pages/discussion.mdwn
@@ -0,0 +1,9 @@
+Actually, expected output would be:
+
+ <span class="tags">
+ Tags:
+
+ <span class="selflink">XML</span>
+ </span>
+
+This is consistent with tags in other inlinepages. --[[sward]] \ No newline at end of file
diff --git a/doc/bugs/Sub-Discussion_pages_have_a_broken___34__FormattingHelp__34___link.mdwn b/doc/bugs/Sub-Discussion_pages_have_a_broken___34__FormattingHelp__34___link.mdwn
new file mode 100644
index 000000000..8f87329ae
--- /dev/null
+++ b/doc/bugs/Sub-Discussion_pages_have_a_broken___34__FormattingHelp__34___link.mdwn
@@ -0,0 +1,3 @@
+For an example of what I mean, go to [[TourBusStop]]. Click the Discussion link. Click the <code>FormattingHelp</code> link. You'll be sent to [TourBusStop/ikiwiki/formatting](/TourBusStop/ikiwiki/formatting/) which of course doesn't exist.
+
+> A bug introduced in the last release. [[fixed|done]] --[[Joey]]
diff --git a/doc/bugs/Symlinked_srcdir_requires_trailing_slash.mdwn b/doc/bugs/Symlinked_srcdir_requires_trailing_slash.mdwn
new file mode 100644
index 000000000..cd74c2496
--- /dev/null
+++ b/doc/bugs/Symlinked_srcdir_requires_trailing_slash.mdwn
@@ -0,0 +1,81 @@
+If the srcdir is a symlink, Ikiwiki will not render the pages unless the srcdir has a trailing slash.
+
+For example:
+
+ #!/bin/sh
+ set -x
+
+ REALSRCDIR=~/tmp/ikiwiki/wikiwc2
+ SRCDIR=~/tmp/ikiwiki/wikiwc
+ DESTDIR=~/tmp/ikiwiki/public_html/wiki/
+
+ echo "*** Testing without trailing slash."
+
+ rm -rf $REALSRCDIR $SRCDIR $DESTDIR
+
+ # Create the real srcdir and link the srcdir to it
+ mkdir -p $REALSRCDIR
+ ln -s $REALSRCDIR $SRCDIR
+
+ mkdir -p $DESTDIR
+
+ echo Test > $SRCDIR/index.mdwn
+
+ # No trailing slash after $SRCDIR
+ ikiwiki --verbose $SRCDIR $DESTDIR --url=http://example.org/~you/wiki/ --underlaydir /dev/null
+
+ echo "*** Testing with trailing slash."
+
+ rm -rf $REALSRCDIR $SRCDIR $DESTDIR
+
+ # Create the real srcdir and link the srcdir to it
+ mkdir -p $REALSRCDIR
+ ln -s $REALSRCDIR $SRCDIR
+
+ mkdir -p $DESTDIR
+
+ echo Test > $SRCDIR/index.mdwn
+
+ # Trailing slash after $SRCDIR
+ ikiwiki --verbose $SRCDIR/ $DESTDIR --url=http://example.org/~you/wiki/ --underlaydir /dev/null
+
+My output:
+
+ + REALSRCDIR=/home/svend/tmp/ikiwiki/wikiwc2
+ + SRCDIR=/home/svend/tmp/ikiwiki/wikiwc
+ + DESTDIR=/home/svend/tmp/ikiwiki/public_html/wiki/
+ + echo '*** Testing without trailing slash.'
+ *** Testing without trailing slash.
+ + rm -rf /home/svend/tmp/ikiwiki/wikiwc2 /home/svend/tmp/ikiwiki/wikiwc /home/svend/tmp/ikiwiki/public_html/wiki/
+ + mkdir -p /home/svend/tmp/ikiwiki/wikiwc2
+ + ln -s /home/svend/tmp/ikiwiki/wikiwc2 /home/svend/tmp/ikiwiki/wikiwc
+ + mkdir -p /home/svend/tmp/ikiwiki/public_html/wiki/
+ + echo Test
+ + ikiwiki --verbose /home/svend/tmp/ikiwiki/wikiwc /home/svend/tmp/ikiwiki/public_html/wiki/ --url=http://example.org/~you/wiki/ --underlaydir /dev/null
+ + echo '*** Testing with trailing slash.'
+ *** Testing with trailing slash.
+ + rm -rf /home/svend/tmp/ikiwiki/wikiwc2 /home/svend/tmp/ikiwiki/wikiwc /home/svend/tmp/ikiwiki/public_html/wiki/
+ + mkdir -p /home/svend/tmp/ikiwiki/wikiwc2
+ + ln -s /home/svend/tmp/ikiwiki/wikiwc2 /home/svend/tmp/ikiwiki/wikiwc
+ + mkdir -p /home/svend/tmp/ikiwiki/public_html/wiki/
+ + echo Test
+ + ikiwiki --verbose /home/svend/tmp/ikiwiki/wikiwc/ /home/svend/tmp/ikiwiki/public_html/wiki/ --url=http://example.org/~you/wiki/ --underlaydir /dev/null
+ scanning index.mdwn
+ rendering index.mdwn
+
+Note that index.mdwn was only rendered when srcdir had a trailing slash.
+
+> There are potential [[security]] issues with ikiwiki following a symlink,
+> even if it's just a symlink at the top level of the srcdir.
+> Consider ikiwiki.info's own setup, where the srcdir is ikiwiki/doc,
+> checked out of revision control. A malicious committer could convert
+> ikiwiki/doc into a symlink to /etc, then ikiwiki would happily publish
+> all of /etc to the web.
+>
+> This kind of attack is why ikiwiki does not let File::Find follow
+> symlinks when scanning the srcdir. By appending the slash, you're
+> actually bypassing that check. Ikiwiki should not let you set
+> up a potentially insecure configuration like that. More discussion of
+> this hole [[here|security#index29h2]], and I've had to release
+> a version of ikiwiki that explicitly checks for that, and fails to work.
+> Sorry, but security trumps convenience. [[done]] --[[Joey]]
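The trailing-slash bypass is visible at the system-call level too: on POSIX systems a trailing slash forces symlink resolution, so an `lstat`-based symlink check misses it. A small Python demonstration (paths in a throwaway temp dir):

```python
import os
import tempfile

d = tempfile.mkdtemp()
real = os.path.join(d, "wikiwc2")
link = os.path.join(d, "wikiwc")
os.mkdir(real)
os.symlink(real, link)

# lstat sees the symlink itself...
print(os.path.islink(link))         # True
# ...but a trailing slash makes the kernel resolve it first
print(os.path.islink(link + "/"))   # False
```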
diff --git a/doc/bugs/Tab_delimited_tables_don__39__t_work.mdwn b/doc/bugs/Tab_delimited_tables_don__39__t_work.mdwn
new file mode 100644
index 000000000..39d57a4fe
--- /dev/null
+++ b/doc/bugs/Tab_delimited_tables_don__39__t_work.mdwn
@@ -0,0 +1,22 @@
+Table directive should support tab-delimited data, especially important since this is the format you will get if copy/pasting from an HTML table or spreadsheet (Gnumeric, OO Calc, Excel). Test case which fails:
+
+[[!table format=dsv delimiter="\t" data="""
+1 2
+2 4
+"""]]
+
+> They do work, but C-style backslash escapes aren't recognised,
+> so the syntax `delimiter="\t"` (as in your test case) looks
+> for the literal string `\t`. Replacing `\t` with a literal
+> tab character makes it work - here's a test (I changed the data
+> to make the table layout more obvious):
+>
+> [[!table format=dsv delimiter=" " data="""
+left 2
+2 right
+alpha beta
+"""]]
+>
+> So, I think this can be considered [[not_a_bug|done]]? --[[smcv]]
+
+>> I've clarified the documentation. --[[smcv]]
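The distinction smcv describes is language-independent; a tiny Python illustration of why `delimiter="\t"` never splits a row:

```python
# What delimiter="\t" actually hands the plugin: a backslash and a "t",
# two characters that never occur between the tab-separated fields.
delim_from_directive = "\\t"
literal_tab = "\t"

assert delim_from_directive != literal_tab
assert len(delim_from_directive) == 2 and len(literal_tab) == 1

# With a real tab character, the DSV row splits as expected.
assert "left\t2".split(literal_tab) == ["left", "2"]
```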
diff --git a/doc/bugs/Template_variable_not_passed_as-is__63____33__.mdwn b/doc/bugs/Template_variable_not_passed_as-is__63____33__.mdwn
new file mode 100644
index 000000000..d70dff418
--- /dev/null
+++ b/doc/bugs/Template_variable_not_passed_as-is__63____33__.mdwn
@@ -0,0 +1,23 @@
+I have a part of a template that looks like:
+
+ <TMPL_VAR level> <TMPL_VAR string>
+
+Calling the template with:
+
+\[[!template id=templateid string="some string" level="##"]]
+
+Results in:
+
+ <h1 id="z-">#</h1>
+
+ <p>some string</p>
+
+While I expected:
+
+ <h2 id="some_string">some string</h2>
+
+>> Have you tried TMPL_VAR raw_level, raw_string? — [[Jon]]
+
+> Thanks. I should read the docs more closely the next time.
+
+[[not a bug|done]]
diff --git a/doc/bugs/Titles_are_lower-cased_when_creating_a_page.mdwn b/doc/bugs/Titles_are_lower-cased_when_creating_a_page.mdwn
new file mode 100644
index 000000000..351c2c1a1
--- /dev/null
+++ b/doc/bugs/Titles_are_lower-cased_when_creating_a_page.mdwn
@@ -0,0 +1,37 @@
+When you click on a broken link to create a new page, Ikiwiki lower-cases the new page's filename. I wish it wouldn't.
+
+If I click on "Czars in Russia", I'd like Ikiwiki to create "Czars\_in\_Russia.mdwn", not "czars\_in\_russia.mdwn". Is this possible? --[[sabr]]
+
+> There's a simple patch that can do this:
+
+<pre>
+-- a/IkiWiki.pm
++++ b/IkiWiki.pm
+@@ -584,7 +584,7 @@ sub htmllink ($$$;@) {
+ return "&lt;span class=\"createlink\">&lt;a href=\"".
+ cgiurl(
+ do => "create",
+- page => pagetitle(lc($link), 1),
++ page => pagetitle($link, 1),
+ from => $lpage
+ ).
+ "\">?&lt;/a>$linktext&lt;/span>"
+</pre>
+
+> This is fine if you don't mind mixed or randomly cased filenames getting
+> created. Otoh, if the link happened to start a sentence and so had its
+> first letter upper-cased, that might not be desired.
+>
+> Of course ikiwiki is case insensitive, and there are other ways
+> of creating pages that don't lower-case them, including using the create
+> a page form on a blog (as was done for this page..).
+>
+> I'm undecided about making the above change by default though, or about making
+> it a config option. Maybe it would be better to include both capitalisations
+> in the select list that is used to pick the name for the newly created page.
+> Then, which one is the default wouldn't much matter. (The non-lower cased
+> one would probably be the best choice.) --[[Joey]]
+>> Either of your proposed solutions (make it the default or include both in the pop-up menu) sounds fine to me. Which one is easier? :) --[[sabr]]
+
+>>> [[Done]]; it now defaults to the mixed case title and provides
+>>> the lower-case one as an option in the select box. --[[Joey]]
diff --git a/doc/bugs/Toc_map_and_template_plugins_do_not_play_well_together.mdwn b/doc/bugs/Toc_map_and_template_plugins_do_not_play_well_together.mdwn
new file mode 100644
index 000000000..4849edd63
--- /dev/null
+++ b/doc/bugs/Toc_map_and_template_plugins_do_not_play_well_together.mdwn
@@ -0,0 +1,30 @@
+The following renders incorrectly:
+
+ \[[!toc ]]
+
+ # header1
+
+ content1
+
+ # header2
+
+ \[[!map pages="sandbox"]]
+
+
+Removing the `\[[!toc]]` directive, or moving it to the end of the page,
+makes the whole wiki page render as expected.
+
+Hint: in all cases, the non-interpreted markdown code is copied as-is
+into the HTML output, without any leading `<p>` or any HTML formatting.
+
+> You're using the old version of `markdown`, which is known to have a broken block
+> html parser that will get confused if markdown is present between two
+> separate html blocks, and will not format the markdown.
+>
+> This is fixed in [[!cpan Text::MarkDown]] 1.0.19. markdown 1.0.2 also
+> fixes the problem. Install either one. I'm going to make ikiwiki's
+> dependencies list Text::Markdown before markdown, since people keep
+> stumbling over this. (The downside is that the old broken markdown is
+> faster). --[[Joey]]
+
+[[done]]
diff --git a/doc/bugs/Trailing_slash_breaks_links.mdwn b/doc/bugs/Trailing_slash_breaks_links.mdwn
new file mode 100644
index 000000000..11ef49ca3
--- /dev/null
+++ b/doc/bugs/Trailing_slash_breaks_links.mdwn
@@ -0,0 +1,9 @@
+A trailing slash on a [[ikiwiki/wikilink]] breaks the link:
+
+`\[[todo/Gallery/]]` appears as [[todo/Gallery/]].
+
+`\[[todo/Gallery]]` appears as [[todo/Gallery]].
+
+--[[JoshTriplett]]
+
+[[done]]
diff --git a/doc/bugs/URLs_with_parentheses_displayed_badly.mdwn b/doc/bugs/URLs_with_parentheses_displayed_badly.mdwn
new file mode 100644
index 000000000..59b67d493
--- /dev/null
+++ b/doc/bugs/URLs_with_parentheses_displayed_badly.mdwn
@@ -0,0 +1,19 @@
+I've noticed that Ikiwiki displays URLs with parentheses badly. The problem occurs
+in the latest version 3.00 and older versions. Please look at the link to following
+Polish entry about C programming language at Wikipedia (it seems that URLs with
+parentheses are popular there):
+
+[Język programowania C](http://pl.wikipedia.org/wiki/C_(j%C4%99zyk_programowania))
+
+I need to escape a closing parenthesis of the URL to fix the problem.
+
+[Język programowania C](http://pl.wikipedia.org/wiki/C_(j%C4%99zyk_programowania\))
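+
+Percent-encoding the parentheses also sidesteps the bug, since markdown's link
+parser only trips over a literal `)`. For illustration (plain Python, not part
+of ikiwiki):
+
```python
from urllib.parse import quote

url = "http://pl.wikipedia.org/wiki/C_(j%C4%99zyk_programowania)"
# Encode only the parentheses; ":", "/", "%", "_" and "." stay as-is.
safe = quote(url, safe=":/%_.")
assert safe == "http://pl.wikipedia.org/wiki/C_%28j%C4%99zyk_programowania%29"
```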
+
+--[[Paweł|users/ptecza]]
+
+> This is a bug in markdown version 1. It is fixed in [[!cpan Text::Markdown]],
+> which ikiwiki will use if it's installed. [[done]] --[[Joey]]
+
+>> Thanks a lot for the hint, Joey! I've installed `libtext-markdown-perl` package
+>> (Aptitude has removed `markdown` package to satisfy dependencies) and now
+>> I don't need to escape Wikipedia URLs with parentheses :) --[[Paweł|users/ptecza]]
diff --git a/doc/bugs/UTF-16_and_UTF-32_are_unhandled.mdwn b/doc/bugs/UTF-16_and_UTF-32_are_unhandled.mdwn
new file mode 100644
index 000000000..9e8fba4b9
--- /dev/null
+++ b/doc/bugs/UTF-16_and_UTF-32_are_unhandled.mdwn
@@ -0,0 +1,29 @@
+Wide characters should probably be supported, or, at the very least, warned about.
+
+Test case:
+
+ mkdir -p ikiwiki-utf-test/raw ikiwiki-utf-test/rendered
+ for page in txt mdwn; do
+ echo hello > ikiwiki-utf-test/raw/$page.$page
+ for text in 8 16 16BE 16LE 32 32BE 32LE; do
+ iconv -t UTF$text ikiwiki-utf-test/raw/$page.$page > ikiwiki-utf-test/raw/$page-utf$text.$page;
+ done
+ done
+ ikiwiki --verbose --plugin txt --plugin mdwn ikiwiki-utf-test/raw/ ikiwiki-utf-test/rendered/
+ www-browser ikiwiki-utf-test/rendered/ || x-www-browser ikiwiki-utf-test/rendered/
+ # rm -r ikiwiki-utf-test/ # some browsers rather stupidly daemonize themselves, so this operation can't easily be safely automated
+
+BOMless LE and BE input is probably a lost cause.
+
+Optimally, UTF-16 (which is ubiquitous in the Windows world) and UTF-32 should be fully supported, probably by converting to mostly-UTF-8 and using `&#xXXXX;` or `&#DDDDD;` XML escapes where necessary.
+
+Suboptimally, UTF-16 and UTF-32 should be converted to UTF-8 where cleanly possible and a warning printed where impossible.
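+
+As a sketch of that conversion in Python (illustrative only; the sample text is
+made up):
+
```python
# Bytes as a Windows tool might produce them: UTF-16 with a BOM.
utf16_bytes = "héllo wörld".encode("utf-16")

# Decoding as "utf-16" consumes the BOM and picks the right endianness;
# re-encoding as UTF-8 is then lossless for any valid input.
text = utf16_bytes.decode("utf-16")
utf8_bytes = text.encode("utf-8")
assert utf8_bytes.decode("utf-8") == "héllo wörld"

# BOM-less UTF-16LE/BE has no in-band marker, which is why the report
# calls that case "probably a lost cause".
```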
+
+----
+Reading the wikipedia pages about [[!wikipedia UTF-8]] and [[!wikipedia UTF-16]], all valid Unicode characters are representable in UTF-8, UTF-16 and UTF-32, and the only errors possible with UTF-16/32 -> UTF-8 translation are when there are encoding errors in the original document.
+
+Of course, it's entirely possible that not all browsers support utf-8 correctly, and we might need to support the option of encoding into [[!wikipedia CESU-8]] instead, which has the side-effect of allowing the transcription of UTF-16 or UTF-32 encoding errors into the output byte-stream, rather than pedantically removing those bytes.
+
+An interesting question would be how to determine the character set of an arbitrary new file added to the repository, unless the repository itself handles character-encoding, in which case, we can just ask the repository to hand us a UTF-8 encoded version of the file.
+
+-- [[Martin Rudat|http://www.toraboka.com/~mrudat]]
diff --git a/doc/bugs/UTF-8_BOM_showing_up_inside_a_page__63__.mdwn b/doc/bugs/UTF-8_BOM_showing_up_inside_a_page__63__.mdwn
new file mode 100644
index 000000000..3b9c9662d
--- /dev/null
+++ b/doc/bugs/UTF-8_BOM_showing_up_inside_a_page__63__.mdwn
@@ -0,0 +1,38 @@
+I have a git-backed ikiwiki install, and when I commit and push a file from an x86 host (LANG=en_US.UTF-8) to the Ikiwiki box, which is Debian GNU/Linux on Sparc, I sometimes get unusual characters (ef bb bf) before the first character of the wiki text. It seems that this is a UTF-8 "byte order mark" that is getting inserted automatically into the .wiki file by my editor: http://vim.wikia.com/wiki/VimTip246#Tip:_.23246_-_Working_with_Unicode
+
+Example:
+
+ http://monkey.linuxworld.com/lwce-2007/
+
+Is there any way for ikiwiki to spot when .wiki files have this BOM and
+deal with it, or should I make sure to strip it out before committing?
+
+> It would be easy to make ikiwiki strip out the BOM. For example, a simple
+> plugin could be written to s/// them out as a filter.
+>
+> I'm unsure if ikiwiki should do this by default. --[[Joey]]
+
+> Looked at this some more. It seems this would be a browser bug, after
+> all, it's not displaying the BOM properly.
+> To test, I've added a BOM to this file.
+>
+> Well, this page looks ok in epiphany and w3m, even with the BOM. Epiphany
+> incorrectly displays it as a space (not zero-width). In w3m in a unicode
+> xterm, it's invisible. What's going on is that <FEFF> is only a BOM at
+> the very beginning of the file. Otherwise, it should be treated as a
+> zero-width, non-breaking space. Ie, invisible. Any browsers that display
+> it otherwise seem to be broken.
+>
+> I'm having a hard time with the idea that any program that reads utf-8
+> data from a file and sticks it in the middle of another, output, utf-8
+> file, is broken if it doesn't strip the BOM. It could be argued that
+> programs should do that; it could be argued that perl should strip the
+> BOM from the beginning of a file whenever reading a file in utf8 mode, to
+> avoid all perl programs needing to do this on their own. Or it could be
+> argued that requiring all programs do this is silly, and that the BOM was
+> designed so you didn't need to strip it.
+>
+> After consideration, I prefer this last argument, so I prefer not to
+> make ikiwiki strip utf8 BOMs. Calling this bug [[done]].
+>
+> --[[Joey]]
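+
+The filter approach discussed above only needs to drop a *leading* U+FEFF; a
+Python sketch of the idea (an actual ikiwiki plugin would be Perl):
+
```python
BOM = "\ufeff"  # encoded in UTF-8 as the bytes EF BB BF

def strip_bom(content: str) -> str:
    # Only a leading U+FEFF is a byte order mark; anywhere else it is
    # a zero-width no-break space and should be left alone.
    return content[1:] if content.startswith(BOM) else content

assert strip_bom("\ufeffhello") == "hello"
assert strip_bom("he\ufeffllo") == "he\ufeffllo"
```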
diff --git a/doc/bugs/UTF-8_in_attachment_filenames.mdwn b/doc/bugs/UTF-8_in_attachment_filenames.mdwn
new file mode 100644
index 000000000..07fff88d2
--- /dev/null
+++ b/doc/bugs/UTF-8_in_attachment_filenames.mdwn
@@ -0,0 +1,25 @@
+I have ikiwiki_3.20111229 installed on Debian Squeeze (Perl 5.10.1, UTF-8
+locale). The attachment plugin mangles UTF8-encoded attachment filenames if
+the name contains multibyte characters, e.g. "lää.png" becomes "lÃ¤Ã¤.png".
+Apparently glob returns byte strings which are subject to implicit
+upgrading when concatenated with Perl strings. The following patch fixes
+the problem for me:
+
+----
+
+ diff -r -U 1 a/attachment.pm b/attachment.pm
+ --- a/attachment.pm 2012-01-13 23:07:29.000000000 +0200
+ +++ b/attachment.pm 2012-01-13 23:33:07.000000000 +0200
+ @@ -274,2 +274,3 @@
+ foreach my $filename (glob("$dir/*")) {
+ + $filename=Encode::decode_utf8($filename);
+ next unless -f $filename;
+ @@ -347,2 +348,3 @@
+ foreach my $file (glob("$dir/*")) {
+ + $file = Encode::decode_utf8($file);
+ next unless -f $file;
+
+> Seems it only mangled display of the just-uploaded attachment's filename,
+> the attachment was otherwise saved to disk with a valid UTF-8 name, and
+> doing other stuff with it also was ok. In any case, I applied your patch,
+> thanks. [[done]] --[[Joey]]
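+
+The mangling pattern is classic double encoding; in Python terms (illustrative
+only, not the Perl internals):
+
```python
name = "lää.png"
raw = name.encode("utf-8")       # the byte string that glob() effectively returns
mangled = raw.decode("latin-1")  # implicit upgrade reads those bytes as Latin-1
assert mangled == "lÃ¤Ã¤.png"

fixed = raw.decode("utf-8")      # what Encode::decode_utf8 does in the patch
assert fixed == "lää.png"
```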
diff --git a/doc/bugs/Unable_to_add_attachments_to_some_pages.mdwn b/doc/bugs/Unable_to_add_attachments_to_some_pages.mdwn
new file mode 100644
index 000000000..c7fe0bd15
--- /dev/null
+++ b/doc/bugs/Unable_to_add_attachments_to_some_pages.mdwn
@@ -0,0 +1,31 @@
+I can add attachments to some pages within an ikiwiki site (for example the index page), but I'm unable to add attachments to other child pages.
+
+When I try, I get the error message "Error: bad attachment filename". I can successfully attach the same file to the index page.
+
+I'm running
+
+ikiwiki version 3.20100815.7 on Debian Squeeze.
+
+Please advise.
+
+
+> I get the following error in apache error.log:
+> `Died at /usr/share/perl5/IkiWiki/CGI.pm line 466.`
+> -- [[aland]]
+
+> Well, what subpages are you trying to add it to? What is your
+> `allowed_attachments` PageSpec set to in your setup file? --[[Joey]]
+
+>> I can reproduce this by creating a new ikiwiki using the auto.setup.
+
+>> I create a subpage of index called "projects" and a subpage of projects called "Firewall Replacement 2010-2011",
+>> and when I add an attachment to these subpages I get the error.
+
+>> I don't have an `allowed_attachments` section in my setup file.
+>> I also tried setting `allowed_attachments => 'maxsize(40000kb) and mimetype(*)',` and I still get the error.
+>> Thanks --[[aland]]
+
+>>> Being a subpage of index is the problem. It's not usual to have
+>>> any other page as a subpage of index, since there's really no reason to
+>>> do that, so some code broke in that special case. [[fixed|done]]
+>>> --[[Joey]]
diff --git a/doc/bugs/Undefined_subroutine_IkiWiki::escapeHTML.mdwn b/doc/bugs/Undefined_subroutine_IkiWiki::escapeHTML.mdwn
new file mode 100644
index 000000000..6a123fbf8
--- /dev/null
+++ b/doc/bugs/Undefined_subroutine_IkiWiki::escapeHTML.mdwn
@@ -0,0 +1,27 @@
+Trying to upgrade to IkiWiki 2.41+ (git head), whenever I try to edit a page I get:
+
+ [Mon Apr 07 16:53:33 2008] [error] [client 68.122.117.135] Undefined subroutine &IkiWiki::escapeHTML called at /root/ikiwiki/install/share/perl/5.8.8/IkiWiki.pm line 610.
+ [Mon Apr 07 16:53:33 2008] [error] [client 68.122.117.135] Premature end of script headers: wrapper.cgi
+
+This patch appears to fix it for me:
+
+ --- IkiWiki.pm 2008-04-07 17:05:04.000000000 -0400
+ +++ /usr/share/perl5/IkiWiki.pm 2008-04-07 18:03:55.000000000 -0400
+ @@ -621,6 +619,9 @@
+ return "<a href=\"$user\">$oiduser</a>";
+ }
+ else {
+ + eval q{use CGI 'escapeHTML'};
+ + error($@) if $@;
+ +
+ return htmllink("", "", escapeHTML(
+ length $config{userdir} ? $config{userdir}."/".$user : $user
+ ), noimageinline => 1);
+
+That's dirty and wrong though... Can you suggest a better fix? -- [[sabr]]
+
+> Hmm, I think I've not noticed this because the openid plugin hides it.
+> Bet you have openid disabled.
+>
+> Anyway, your fix is fine, [[applied|done]]. --[[Joey]]
+>> Actually, I do have openid enabled. Passwordauth is disabled, dunno if that matters. My setup file is here: <http://iki.u32.net/iki.setup>
diff --git a/doc/bugs/Undefined_subroutine_IkiWiki::refresh.mdwn b/doc/bugs/Undefined_subroutine_IkiWiki::refresh.mdwn
new file mode 100644
index 000000000..c0cc3fd9a
--- /dev/null
+++ b/doc/bugs/Undefined_subroutine_IkiWiki::refresh.mdwn
@@ -0,0 +1,7 @@
+After building a fresh deb from current Git master (9b62dac4bcf62f3a1f76ec5a7ed5a90db16ea1c8) :
+
+ $ ikiwiki --setup ~/ikiwiki.setup --rebuild
+ Undefined subroutine &IkiWiki::refresh called at /usr/share/perl5/IkiWiki/Setup.pm line 113.
+
+> [[done]], it just needed "require IkiWiki::Render" before it started
+> rendering. --[[smcv]]
diff --git a/doc/bugs/Underscores_in_links_don__39__t_appear.mdwn b/doc/bugs/Underscores_in_links_don__39__t_appear.mdwn
new file mode 100644
index 000000000..b25dfb7fe
--- /dev/null
+++ b/doc/bugs/Underscores_in_links_don__39__t_appear.mdwn
@@ -0,0 +1,18 @@
+Observed behavior:
+
+When I create a link like \[[cmd_test]] , the link appears as 'cmd test'.
+
+Expected behavior:
+
+I would like to be able to create links with underscores. I realize this is a feature, and I searched for ways to escape the underscore so it would appear, but I didn't find any.
+
+> as a workaround, you can use \[[cmd\_\_95\_\_test|cmd_test]] (which will link to a page named "cmd test" at the url location "cmd\_test") or \[[cmd\_\_95\_\_test]] (which will link to a page named "cmd\_test" at the url location "cmd\_\_95\_\_test"). i would, from my limited understanding of ikiwiki internals, consider the bug valid, and suggest that
+>
+> * explicit link text be not subject to de-escaping (why should it; this would be the short term solution)
+> * escaped page names never be used in user visible parts of ikiwiki (in my opinion, a user should not need to know about those internals, especially as they are configuration-dependent (wiki_file_regexp))
+>
+> note that in [[ikiwiki/wikilink]], that very behavior is documented; it says that "\[[foo\_bar|Sandbox]]" will show as "foo bar". (although you can't tell that apart from "foo\_bar" easily because it's a hyperlink).
+>
+> i assume that this behavior stems from times when wikilinks and [[ikiwiki/directive]]s were not distinguished by \[[ vs \[[! but by the use of whitespace in directives, so whitespace had to be avoided in wikilinks.
+>
+> --[[chrysn]]
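+
+The `__95__` forms above follow ikiwiki's general escaping scheme: a disallowed
+character is replaced by `__` plus its decimal code point plus `__`. Roughly
+(Python, for illustration only):
+
```python
def escape_char(c: str) -> str:
    # "_" has decimal code point 95, hence cmd__95__test for cmd_test.
    return "__%d__" % ord(c)

assert escape_char("_") == "__95__"
assert escape_char("?") == "__63__"   # as seen in this wiki's page file names
```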
diff --git a/doc/bugs/Use_install__40__1__41___instead_of_cp__40__1__41___for_installing_files.mdwn b/doc/bugs/Use_install__40__1__41___instead_of_cp__40__1__41___for_installing_files.mdwn
new file mode 100644
index 000000000..12c0ad07f
--- /dev/null
+++ b/doc/bugs/Use_install__40__1__41___instead_of_cp__40__1__41___for_installing_files.mdwn
@@ -0,0 +1,32 @@
+Currently ikiwiki uses cp(1) with GNU extensions in Makefile.PL for installing files, thus causing problems on FreeBSD which doesn't have a cp(1) with GNU extensions in the base system.
+
+Here is a patch against ikiwiki-1.51 for using find(1) and install(1) instead of cp(1).
+
+ --- Makefile.PL.orig Sun Apr 29 12:57:51 2007
+ +++ Makefile.PL Sun Apr 29 13:08:38 2007
+ @@ -47,8 +47,12 @@ extra_clean:
+
+ extra_install:
+ install -d $(DESTDIR)$(PREFIX)/share/ikiwiki
+ - find basewiki templates \( -type f -or -type l \) ! -regex '.*\.svn.*' \
+ - -exec cp --parents -aL {} $(DESTDIR)$(PREFIX)/share/ikiwiki \;
+ + for dir in `find -L basewiki templates -type d ! -regex '.*\.svn.*'`; do \
+ + install -d $(DESTDIR)$(PREFIX)/share/ikiwiki/$$dir; \
+ + for file in `find -L $$dir -maxdepth 1 -type f`; do \
+ + install -m 644 $$file $(DESTDIR)$(PREFIX)/share/ikiwiki/$$dir; \
+ + done; \
+ + done
+
+ install -d $(DESTDIR)$(PREFIX)/share/man/man1
+ install -m 644 ikiwiki.man $(DESTDIR)$(PREFIX)/share/man/man1/ikiwiki.1
+
+> Couldn't it just use install -D ? --[[Joey]]
+
+>> No, apparently FreeBSD `install` does not support `-D`. See [the FreeBSD install manpage](http://www.freebsd.org/cgi/man.cgi?query=install&apropos=0&sektion=0&manpath=FreeBSD+6.2-RELEASE&format=html). --[[JoshTriplett]]
+
+>> Patch applied; [[done]]. --[[JoshTriplett]]
+
+There are still/again "cp -a"s in the Makefile as of 3.00
+
+> It's a cp -a || install. Is that causing you a problem somehow?
+> --[[Joey]]
diff --git a/doc/bugs/W3MMode_still_uses_http:__47____47__localhost__63__.mdwn b/doc/bugs/W3MMode_still_uses_http:__47____47__localhost__63__.mdwn
new file mode 100644
index 000000000..34eecef8c
--- /dev/null
+++ b/doc/bugs/W3MMode_still_uses_http:__47____47__localhost__63__.mdwn
@@ -0,0 +1,34 @@
+My setup matches w3mmode [[w3mmode/ikiwiki.setup]] exactly.
+My doc/index.mdwn just has a line or two of plain text.
+When I try to edit that page in w3m, it works fine until I push [Save Page].
+Then I just get a page that only contains "403".
+
+ikiwiki version is 3.20110715ubuntu1.
+w3m is 0.5.3.
+
+-- [[terry|tjgolubi]]
+
+I made it work, though probably not completely, by renaming
+~/.ikiwiki/wrappers/ikiwiki.cgi to ikiwiki2.cgi and replacing it with:
+
+ #!/bin/bash
+ /home/tjgolubi/.ikiwiki/wrappers/ikiwiki2.cgi "$@" | sed -e 's,http://localhost,file://,g'
+
+I'm afraid that this hack may have bad side-effects, but I hope it points you to the cause/solution.
+Of course, the next time I rerun ikiwiki --setup, it will overwrite my wrapper-wrapper.
+
+-- [[terry|tjgolubi]]
+
+I made a logfile of all the args, env, and stdin/stdout to/from my wrapper. If you're interested, I'll email it to you. I wasn't able to attach it here.
+
+-- [[terry|tjgolubi]]
+
+I confirm that the supplied w3mmode setup appears not to work. When I try to edit a page and save it, w3m tries to access an URL beginning http://localhost/ . The HTML source of the edit page contains a BASE URL beginning with http://localhost. It should not. Maybe this is a result of changes a while back, where use of absolute URLs was enforced in various places in Ikiwiki.
+
+-- Martin
+
+The problem is that IkiWiki::CGI::cgitemplate() and IkiWiki::CGI::redirect() use Perl's CGI::url() to determine the absolute URL of the CGI script when it is being executed. url() generates an URL beginning http://localhost. As w3m's serverless CGI mode is rather unusual, presumably there's no provision for the URL of a CGI script beginning file:///, even if there's a way to specify that.
+
+A quick workaround might be to force the use of $config{url} instead of $cgi->url as a base for URLs when w3mmode is set.
+
+-- Martin
diff --git a/doc/bugs/Warns_about_use_of_uninitialized_value_if_prefix__95__directives_is_on_and_a_directive_does_not_contain_a_space.mdwn b/doc/bugs/Warns_about_use_of_uninitialized_value_if_prefix__95__directives_is_on_and_a_directive_does_not_contain_a_space.mdwn
new file mode 100644
index 000000000..efb5c70b8
--- /dev/null
+++ b/doc/bugs/Warns_about_use_of_uninitialized_value_if_prefix__95__directives_is_on_and_a_directive_does_not_contain_a_space.mdwn
@@ -0,0 +1,19 @@
+In `IkiWiki::preprocess`, the last capturing group in the regex used to parse directives in prefix_directives mode is of the form `(\s+...)?\]\]`, which will not be matched if the directive is something without arguments or whitespace, like `\[[!orphans]]`. As a result, its value is undef instead of being an empty string, causing a warning when it is used in the anonymous sub `$handle`. A trivial fix is to treat it as "" if it is undef.
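+
+The same behaviour is easy to see outside Perl; e.g. in Python, where an
+unmatched optional group is `None` (regex simplified for illustration):
+
```python
import re

# Simplified stand-in for the prefix_directives regex: optional argument group.
m = re.match(r"\[\[!(\w+)(\s+(.*))?\]\]", "[[!orphans]]")
assert m.group(1) == "orphans"
assert m.group(2) is None        # Perl's undef: warns when interpolated
args = m.group(2) or ""          # the trivial fix: fall back to ""
assert args == ""
```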
+
+[[patch]] in the master branch of my git repository, and quoted here. --[[smcv]]
+
+ diff --git a/IkiWiki.pm b/IkiWiki.pm
+ index 241a7c0..d2c35a2 100644
+ --- a/IkiWiki.pm
+ +++ b/IkiWiki.pm
+ @@ -1167,7 +1167,8 @@ sub preprocess ($$$;$$) {
+ }sx;
+ }
+
+ - $content =~ s{$regex}{$handle->($1, $2, $3, $4)}eg;
+ + # $4 can be undef if the directive was \[[!foo]]
+ + $content =~ s{$regex}{$handle->($1, $2, $3, ($4 or ""))}eg;
+ return $content;
+ }
+
+[[cherry-picked|done]] --[[Joey]]
diff --git a/doc/bugs/Weird_interaction_between_toc_plugin_and_page_sections.mdwn b/doc/bugs/Weird_interaction_between_toc_plugin_and_page_sections.mdwn
new file mode 100644
index 000000000..76944104d
--- /dev/null
+++ b/doc/bugs/Weird_interaction_between_toc_plugin_and_page_sections.mdwn
@@ -0,0 +1,40 @@
+[[!meta title="Weird interaction between toc plugin and markdown that follows"]]
+
+<http://vcs-pkg.org/people/> goes really weird when I enable the `toc` plugin. It renders the page sections fine without it, but as soon as I turn it on, the following output is generated:
+
+ \[[!meta title="My Name"]]
+
+ 1. Fedora
+ 2. Ubuntu
+ 3. People grouped by version control system
+ 4. GNU Arch (tla/baz)
+ 5. Bazaar-NG
+ 6. Git users
+ 7. Mercurial
+ 8. Subversion
+ 9. Everyone, sorted by first name
+
+ People grouped by distro ======================== Debian ––
+
+ * jelmer
+ * madduck
+
+ Fedora
+
+I have no idea what's eating the first two headers. But as you can see from the current page, it's not just headers but any markdown...
+
+--[[madduck]]
+
+> This is a markdown bug. It occurs with 1.0.1, but not with the 1.0.2
+> beta. Apparently markdown is getting confused by the div inserted for the
+> toc, not realizing that it has ended, and so not marking up the text as
+> markdown. 1.0.2 fixes many such div-related markdown bugs. I have to call
+> this [[done]] since it's not an ikiwiki bug, recommend upgrading markdown
+> if the slower speed of 1.0.2 doesn't hurt too badly. --[[Joey]]
+
+>> Note how it works on <http://madduck.net/docs/mailfilter/> but adding a rule like in that doc to the people page didn't fix it.
+
+>>> Yes, it's probably confused by the two divs so close to each other, and
+>>> doesn't realize that the text in between is not part of either and is
+>>> markdown. Problem is that the old markdown doesn't have a real html
+>>> parser, it just fakes it --[[Joey]]
diff --git a/doc/bugs/Wrong_permissions_on_4_smileys.mdwn b/doc/bugs/Wrong_permissions_on_4_smileys.mdwn
new file mode 100644
index 000000000..f9a03ff01
--- /dev/null
+++ b/doc/bugs/Wrong_permissions_on_4_smileys.mdwn
@@ -0,0 +1,10 @@
+The following four PNG files have permissions 600 instead of 644 in the source tarball of ikiwiki-1.43:
+
+* basewiki/smileys/icon-info.png
+* basewiki/smileys/prio1.png
+* basewiki/smileys/prio2.png
+* basewiki/smileys/prio3.png
+
+[[bugs/done]] in my sources --[[Joey]]
+
+Thank you --[[Brix|HenrikBrixAndersen]]
diff --git a/doc/bugs/XHTML_needs_xmlns_attribute_on_html_element.mdwn b/doc/bugs/XHTML_needs_xmlns_attribute_on_html_element.mdwn
new file mode 100644
index 000000000..751aaf064
--- /dev/null
+++ b/doc/bugs/XHTML_needs_xmlns_attribute_on_html_element.mdwn
@@ -0,0 +1,5 @@
+XHTML needs `xmlns="http://www.w3.org/1999/xhtml"` on the `html` element;
+otherwise, it will not validate.
+--[[JoshTriplett]]
+
+[[done]] --[[JoshTriplett]]
diff --git a/doc/bugs/__33__inline_sort__61____34__meta__40__date__41____34___ignored.mdwn b/doc/bugs/__33__inline_sort__61____34__meta__40__date__41____34___ignored.mdwn
new file mode 100644
index 000000000..31bd3dfec
--- /dev/null
+++ b/doc/bugs/__33__inline_sort__61____34__meta__40__date__41____34___ignored.mdwn
@@ -0,0 +1,41 @@
+I am trying to do an !inline and sort the pages after meta(date)
+
+ \[[!inline pages="blog/* and !*/Discussion" sort="meta(date)" show="0" rootpage="blog" archive="yes"]]
+
+There are a few pages inside blog/* and I would like to give the !meta line as example for two of them:
+
+page 1: blog/get_http.mdwn
+
+ \[[!meta title="HTTP GET method" date="2010-09-17 00:00:00"]]
+
+page 2: blog/nagios.mdwn
+
+ \[[!meta title="Nagios 3" date="2010-09-09 00:00:00"]]
+
+page 3: blog/using_macos.mdwn
+
+ \[[!meta title="How I am using Mac OS X" date="2010-06-10 00:00:00"]]
+
+The ordering which is created can be seen at <http://www.michael-hammer.at/blog_all> and is
+
+page 1 -> page 3 -> page 2
+
+which is obviously not correct. I can say that, regardless of the sort="" argument given to !inline, the ordering is done by ctime. This is really annoying, as ctime is hard to recover if one has to move the blog from one machine to another.
+
+- What am I doing wrong?
+- Is this a bug? If not: Why is meta(date) ignored?
+
+    % ikiwiki --version
+    ikiwiki version 3.20100815.7
+
+> You're not using the [[meta directive|ikiwiki/directive/meta]] correctly.
+> As it says at the top of that page,
+
+>> You can have only one field
+>> per `meta` directive, use more directives if you want to specify more fields.
+
+> So, \[[!meta title="Nagios 3"]] \[[!meta date="2010-09-09 00:00:00"]]
+> and you should be good to go. --[[Joey]] [[done]]
+
+>> Thank you for your help. Sometimes the solution is too easy. Sorry for the PEBKAC bug report. --mueli
diff --git a/doc/bugs/__34__Currently_enabled_SSH_keys:__34___shows_only_first_139_characters_of_each_key.mdwn b/doc/bugs/__34__Currently_enabled_SSH_keys:__34___shows_only_first_139_characters_of_each_key.mdwn
new file mode 100644
index 000000000..3c3352f66
--- /dev/null
+++ b/doc/bugs/__34__Currently_enabled_SSH_keys:__34___shows_only_first_139_characters_of_each_key.mdwn
@@ -0,0 +1,12 @@
+At least at http://free-thursday.pieni.net/ikiwiki.cgi the "SSH keys" page shows only the first 139 characters of each SSH key. I'm using iceweasel in 1024x768 resolution and there are no scrollbars visible.
+
+Please contact me at timo.lindfors@iki.fi
+
+> I have access to the same wiki, and do not see the problem Timo sees. I see 380 chars of the SSH keys, and do have a scrollbar.
+> Weird. --liw
+
+> Also, that's a Branchable.com site and the bug, if any is
+> in ikiwiki-hosting's plugin, not ikiwiki proper. Moved
+> [here](http://ikiwiki-hosting.branchable.com/bugs/__34__Currently_enabled_SSH_keys:__34___shows_only_first_139_characters_of_each_key/) --[[Joey]]
+
+[[!tag done]]
diff --git a/doc/bugs/__34__First_post__34___deletion_does_not_refresh_front_page.mdwn b/doc/bugs/__34__First_post__34___deletion_does_not_refresh_front_page.mdwn
new file mode 100644
index 000000000..2367335a7
--- /dev/null
+++ b/doc/bugs/__34__First_post__34___deletion_does_not_refresh_front_page.mdwn
@@ -0,0 +1,6 @@
+When I created an ikiwiki site (on Branchable) using the blog template, it added a "First post", which was fine.
+Deleting that post removed it, but the front page did not get re-generated, so it was still there.
+--[[liw]]
+
+> This is a bug involving the `page()` pagespec. Deleted
+> pages matching this pagespec are not noticed. --[[Joey]] [[done]]
diff --git a/doc/bugs/__34__more__34___doesn__39__t_work.mdwn b/doc/bugs/__34__more__34___doesn__39__t_work.mdwn
new file mode 100644
index 000000000..b2d929f13
--- /dev/null
+++ b/doc/bugs/__34__more__34___doesn__39__t_work.mdwn
@@ -0,0 +1,17 @@
+As one can see at [[plugins/more/discussion]], the [[plugins/more]] plugin doesn't work --- it renders as:
+
+ <p><a name="more"></a></p>
+
+ <p>This is the rest of my post. Not intended for people catching up on
+ their blogs at 30,000 feet. Because I like to make things
+ difficult.</p>
+
+No way to toggle visibility.
+-- Ivan Z.
+
+> More is not about toggling visibility. Perhaps you want
+> [[plugins/toggle]]. More is about displaying the whole page
+> content when it's a standalone page, and only displaying a fragment when
+> it's inlined into a blog. --[[Joey]] [[done]]
+
+>> I see, thanks for bothering with the reply, I didn't understand this. --Ivan Z.
diff --git a/doc/bugs/__34__skipping_bad_filename__34___error_when_src_path_contains_spaces.mdwn b/doc/bugs/__34__skipping_bad_filename__34___error_when_src_path_contains_spaces.mdwn
new file mode 100644
index 000000000..470d4eebd
--- /dev/null
+++ b/doc/bugs/__34__skipping_bad_filename__34___error_when_src_path_contains_spaces.mdwn
@@ -0,0 +1,5 @@
+After installing IkiWiki 2.16 on Mac OS X 10.4 server I attempted to use "/Library/Application\ Support/IkiWiki/Working\ Copies" as the parent of my $SRCPATH and got "skipping bad filename" errors for any .mdwn file in that directory:
+
+ skipping bad filename /Library/Application Support/IkiWiki/Working Copies/ikiwikinewt/index.mdwn
+
+The .ikiwiki directory is correctly created in that directory. I switched to using a path with no spaces and it works correctly.
diff --git a/doc/bugs/__36__ENV__123__PATH__125___should_include_PREFIXbin.mdwn b/doc/bugs/__36__ENV__123__PATH__125___should_include_PREFIXbin.mdwn
new file mode 100644
index 000000000..edc411c85
--- /dev/null
+++ b/doc/bugs/__36__ENV__123__PATH__125___should_include_PREFIXbin.mdwn
@@ -0,0 +1,19 @@
+Running ikiwiki had error (ignored and continued):
+
+    rebuilding wiki..
+    Can't exec "ikiwiki-transition": No such file or directory at /usr/pkg/lib/perl5/vendor_perl/5.8.0/IkiWiki.pm line 917.
+    scanning ....
+
+ikiwiki.in and ikiwiki.out have:
+
+    $ENV{PATH}="/usr/local/bin:/usr/bin:/bin";
+
+Please use the full path to the installed "ikiwiki-transition" in the system() function, or adjust the PATH to include PREFIX/bin.
+
+Maybe pm_filter can handle this.
+
+ikiwiki installer already knows about PREFIX.
+
+[[done]]
+
+Not that it matters, but why "if" instead of "elsif"?
diff --git a/doc/bugs/__38__uuml__59___in_markup_makes_ikiwiki_not_un-escape_HTML_at_all.mdwn b/doc/bugs/__38__uuml__59___in_markup_makes_ikiwiki_not_un-escape_HTML_at_all.mdwn
new file mode 100644
index 000000000..eb3450a7e
--- /dev/null
+++ b/doc/bugs/__38__uuml__59___in_markup_makes_ikiwiki_not_un-escape_HTML_at_all.mdwn
@@ -0,0 +1,47 @@
+I'm experimenting with using Ikiwiki as a feed aggregator.
+
+The Planet Ubuntu RSS 2.0 feed (<http://planet.ubuntu.com/rss20.xml>) as of today
+has someone whose name contains the character u-with-umlaut. In HTML 4.0, this is
+specified as the character entity uuml. Ikiwiki 2.47 running on Debian etch does
+not seem to understand that entity, and decides not to un-escape any markup in
+the feed. This makes the feed hard to read.
+
+The following is the test input:
+
+ <rss version="2.0">
+ <channel>
+ <title>testfeed</title>
+ <link>http://example.com/</link>
+ <language>en</language>
+ <description>example</description>
+ <item>
+ <title>&uuml;</title>
+ <guid>http://example.com</guid>
+ <link>http://example.com</link>
+ <description>foo</description>
+ <pubDate>Tue, 27 May 2008 22:42:42 +0000</pubDate>
+ </item>
+ </channel>
+ </rss>
+
+When I feed this to ikiwiki, it complains:
+"processed ok at 2008-05-29 09:44:14 (invalid UTF-8 stripped from feed) (feed entities escaped)"
+
+Note also that the test input contains only pure ASCII, no UTF-8 at all.
+
+If I remove the ampersand in the title, ikiwiki has no problem. However, the entity is
+valid HTML, so it would be good for ikiwiki to understand it. At the minimum, stripping
+the offending entity but un-escaping the rest seems like a reasonable thing to do,
+unless that has security implications.
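+
+For reference, `&uuml;` is a standard named entity, so a conforming unescaper
+handles it even in pure-ASCII input; e.g. in Python:
+
```python
import html

# &uuml; is HTML 4.0's named entity for u-with-umlaut (U+00FC);
# no UTF-8 bytes need to appear in the input for it to be valid.
assert html.unescape("&uuml;") == "\u00fc"
assert html.unescape("&uuml;") == "ü"
```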
+
+> I tested on unstable, and ikiwiki handled that sample rss fine,
+> generating a `ü.html`. --[[Joey]]
+
+>> I confirm that it works with ikiwiki 2.50, at least partially. The HTML output is
+>> OK, but the aggregate plugin still reports this:
+>>
+>> processed ok at 2008-07-01 21:24:29 (invalid UTF-8 stripped from feed) (feed entities escaped)
+>>
+>> I hope that's just a minor blemish. --liw
+
+>>> Sounds like this is [[done]] --[[Joey]]
diff --git a/doc/bugs/__60__br__62___tags_are_removed_from_markdown_inline_HTML.mdwn b/doc/bugs/__60__br__62___tags_are_removed_from_markdown_inline_HTML.mdwn
new file mode 100644
index 000000000..2c3fdea3e
--- /dev/null
+++ b/doc/bugs/__60__br__62___tags_are_removed_from_markdown_inline_HTML.mdwn
@@ -0,0 +1,31 @@
+I am trying to add a post address to a document:
+
+<address>
+ First line<br/>
+ Second line
+</address>
+
+As you can see, the `<br/>` is being removed. I disabled [[plugins/htmlscrubber]], but that was not it. The [markdown Dingus](http://daringfireball.net/projects/markdown/dingus) on its homepage processes the inline HTML just fine.
+
+I tried searching the web and wiki but could not find any information on why `<br/>` would be removed.
+
+> It does work if you use `<br />`:
+>
+> First line<br />
+> Second line
+>
+> Or, as we've just been told in #ikiwiki: put two spaces at the end of the first line.
+>
+> First line
+> Second line
+>
+> --[[tschwinge]]
+
+> > `<br/>` is also valid, so this is a bug still. --[[madduck]]
+
+>>> It _is_ the htmlscrubber that removes that. It's due to [[!debbug 365971]],
+>>> basically the [[!cpan HTML::Scrubber]] doesn't understand xhtml tags
+>>> of this sort at all. I hacked it to support `<br />` by telling it to treat
+>>> the "/" as an attribute, but if there's no space, it doesn't see it as
+>>> an attribute. Hmm, I could also add `br` as a tag name, that would catch both cases.
+>>> Ok, [[done]] --[[Joey]]
diff --git a/doc/bugs/__63__Discussion_when_not_CGI_mode.mdwn b/doc/bugs/__63__Discussion_when_not_CGI_mode.mdwn
new file mode 100644
index 000000000..e08abaff3
--- /dev/null
+++ b/doc/bugs/__63__Discussion_when_not_CGI_mode.mdwn
@@ -0,0 +1,9 @@
+I used ikiwiki without my --setup configuration that defined my --cgiurl. This regenerated my HTML pages without the "Edit" link as is documented ("Required when building the wiki for links to the cgi script to be generated").
+
+But I still had a "?Discussion" link, and that points to the CGI.
+
+So the bug (I think) is that it has a hyperlink to a CGI when CGI is not enabled.
+
+> [[bugs/done]] -- [[Joey]]
+
+(Note that my title above has `__63__` in it, but it was supposed to be a question mark.)
diff --git a/doc/bugs/__91__PATCH__93___Use_correct_perl_when_running_make.html b/doc/bugs/__91__PATCH__93___Use_correct_perl_when_running_make.html
new file mode 100644
index 000000000..9de2a0fa4
--- /dev/null
+++ b/doc/bugs/__91__PATCH__93___Use_correct_perl_when_running_make.html
@@ -0,0 +1,17 @@
+If the Perl used to run Makefile.PL is not first on the PATH, it will not be the one used when make is run. The patch below fixes this.
+
+[[done]], thanks
+
+<pre>
+--- Makefile.PL.orig 2008-06-02 10:33:41.000000000 -0500
++++ Makefile.PL 2008-06-02 10:34:00.000000000 -0500
+@@ -31,7 +31,7 @@
+ chmod +x ikiwiki.out
+
+ extra_build: ikiwiki.out
+- perl -Iblib/lib $(extramodules) $(tflag) ikiwiki.out -libdir . -setup docwiki.setup -refresh
++ $(PERL) -Iblib/lib $(extramodules) $(tflag) ikiwiki.out -libdir . -setup docwiki.setup -refresh
+ ./mdwn2man ikiwiki 1 doc/usage.mdwn > ikiwiki.man
+ ./mdwn2man ikiwiki-mass-rebuild 8 doc/ikiwiki-mass-rebuild.mdwn > ikiwiki-mass-rebuild.man
+ ./mdwn2man ikiwiki-makerepo 1 doc/ikiwiki-makerepo.mdwn > ikiwiki-makerepo.man
+</pre>
diff --git a/doc/bugs/__91__SOLVED__93___Pandoc_plugin_and_UTF-8:_IkiWiki_and_UTF-8.mdwn b/doc/bugs/__91__SOLVED__93___Pandoc_plugin_and_UTF-8:_IkiWiki_and_UTF-8.mdwn
new file mode 100644
index 000000000..7282a71b8
--- /dev/null
+++ b/doc/bugs/__91__SOLVED__93___Pandoc_plugin_and_UTF-8:_IkiWiki_and_UTF-8.mdwn
@@ -0,0 +1,11 @@
+    import os
+    os.environ['LANG'] = 'it_IT.utf-8'
+
+Does that sound plausible?
+
+[GitHub pykipandoc](https://github.com/temmen/pykipandoc) -- Temmen
+
+> The place to put contrib plugins is in [[plugins/contrib]].
+>
+> Closing this bug report as whatever it is that was fixed is apparently not an ikiwiki
+> bug.. I guess. [[done]] --[[Joey]]
diff --git a/doc/bugs/__91__SOLVED__93___Pandoc_plugin_and_UTF-8:_IkiWiki_and_UTF-8/discussion.mdwn b/doc/bugs/__91__SOLVED__93___Pandoc_plugin_and_UTF-8:_IkiWiki_and_UTF-8/discussion.mdwn
new file mode 100644
index 000000000..9cdc9e746
--- /dev/null
+++ b/doc/bugs/__91__SOLVED__93___Pandoc_plugin_and_UTF-8:_IkiWiki_and_UTF-8/discussion.mdwn
@@ -0,0 +1,23 @@
+ # plugins to add to the default configuration
+ add_plugins => [qw{
+ brokenlinks
+ map
+ orphans
+ pagecount
+ pagestats
+ tag
+ template
+ openid
+ attachment
+ edittemplate
+ remove
+ listdirectives
+ shortcut
+ pykipandoc}],
+ # plugins to disable
+ disable_plugins => [qw{passwordauth mdwn}],
+
+and
+
+ # UTF-8 locale to use
+ locale => 'it_IT.UTF-8',
diff --git a/doc/bugs/__96____96__clear:_both__39____39___for___96__.page__42____39____63__.mdwn b/doc/bugs/__96____96__clear:_both__39____39___for___96__.page__42____39____63__.mdwn
new file mode 100644
index 000000000..a1b5ba94a
--- /dev/null
+++ b/doc/bugs/__96____96__clear:_both__39____39___for___96__.page__42____39____63__.mdwn
@@ -0,0 +1,35 @@
+Please have a look at
+<http://www.bddebian.com/~wiki/hurd/running/debian/faq/>.
+There is (on a sufficiently small display) a large free spacing between the
+*vmstat* line and the first *Posted* line.
+Even without any `local.css`.
+This is because of `clear: both` in ikiwiki's `style.css`, line 109,
+for `.pagedate, .pagelicense, .pagecopyright`.
+
+I can override this in `local.css`, but what was the original reason for
+adding this `clear: both`?
+
+> Without investigating in detail, I think it was probably because any
+> of the items can be enabled or disabled. So any of them might be the
+> first item after the horizontal rule, and any might be the last item
+> before the modification date. So all of them have to clear both above and
+> below. I'm sure there are better ways for the CSS to handle that.
+> --[[Joey]]
+
+>> There is indeed a better way - all the optional things below the
+>> content are wrapped in `<div id="footer">`, so to have the browser wait
+>> until all floating boxes have finished before rendering the footer, it
+>> would be sufficient to have `#footer { clear: both; }` and remove all
+>> the other footer-related `clear` attributes. I'm not sure what you mean
+>> by "clear above and below" - the clear attribute takes values none, left,
+>> right or both, and its purpose is to stop floating boxes (sidebars,
+>> mainly) from overlapping with footers.
+>>
+>> ... oh, I see what you mean - this affects inlines too. In inlinepage.tmpl
+>> we could wrap the "pseudo-footer" in `<div class="inlinefooter">` too?
+>> Then sites could choose whether to set clear:both on the inlinefooter
+>> or not, and this would be separate from the same styling on whole pages.
+>>
+>> [[done]] --[[smcv]]
+
+[[patch]]
diff --git a/doc/bugs/__96__wiki__95__file__95__chars__96___setting_not_propagated_to_CGI_wrapper.mdwn b/doc/bugs/__96__wiki__95__file__95__chars__96___setting_not_propagated_to_CGI_wrapper.mdwn
new file mode 100644
index 000000000..f04b3404b
--- /dev/null
+++ b/doc/bugs/__96__wiki__95__file__95__chars__96___setting_not_propagated_to_CGI_wrapper.mdwn
@@ -0,0 +1,28 @@
+I've set `wiki_file_chars` to a non-standard value in the setup file:
+
+ wiki_file_chars => "-[:alnum:]+/.:_\x{1f310}\x{1f430}",
+
+(In case you're wondering, [this is the page](http://xn--9dbdkw.se/🌐/).)
+
+ikiwiki recognises my pages when I run it from the command line, but
+when I edit something through the CGI "script", ikiwiki would suddenly
+not recognise them.
+
+By running `strings` on the CGI wrapper I found that the option
+`wiki_file_regexp` was still at its original setting. So as a workaround,
+I added this to the setup file and everything worked:
+
+ wiki_file_regexp => qr/(^[-[:alnum:]+\/.:_\x{1f310}\x{1f430}]+$)/,
+
+Maybe the CGI wrapper should specially call `checkconfig`, which is
+the function responsible for updating `wiki_file_regexp`?
+
+--[[legoscia]]
+
+> You have to regenerate the cgi wrapper after changing your setup file
+> for the configuration changes to take effect.
+>
+> I tested it, setting `wiki_file_chars => "moocow"`,
+> running `ikiwiki -refresh -wrappers my.setup`, and looking at strings:
+> `'wiki_file_regexp' => qr/(?-xism:(^[moocow]+$))/`
+> So, this appears to have been user error. [[done]] --[[Joey]]
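The discussion above turns on `wiki_file_chars` being compiled into `wiki_file_regexp`. A rough Python sketch of that derivation (the real code is Perl, and the exact character class here is an assumption, with `[:alnum:]` approximated as `A-Za-z0-9`):

```python
import re

# Rough Python analogue of how ikiwiki derives wiki_file_regexp from
# wiki_file_chars: the configured character set becomes an anchored
# character class. The Perl original is qr/(^[$wiki_file_chars]+$)/.
def file_regexp(chars: str) -> "re.Pattern[str]":
    return re.compile("^[" + chars + "]+$")

# Hypothetical setting matching the one in the report above:
wiki_file_chars = r"-A-Za-z0-9+/.:_\U0001F310\U0001F430"
pattern = file_regexp(wiki_file_chars)
```

With this, `pattern.match("🌐/index.mdwn")` succeeds while a name containing a space or `!` is rejected — the behaviour the stale CGI wrapper lost until it was regenerated.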
diff --git a/doc/bugs/absolute_sizes_in_default_CSS.mdwn b/doc/bugs/absolute_sizes_in_default_CSS.mdwn
new file mode 100644
index 000000000..bb3c0c7a0
--- /dev/null
+++ b/doc/bugs/absolute_sizes_in_default_CSS.mdwn
@@ -0,0 +1,39 @@
+While toying around with some font sizes on my personal ikiwiki, I discovered that some font sizes in the default CSS are fixed rather than relative. Here's a git patch that replaces them with relative font sizes (assuming the default 12pt/16px base font size recommended by the W3C):
+
+[[done]] --[[Joey]]
+
+<pre>
+From 01c14db255bbb727d8dd1e72c3f6f2f25f07e757 Mon Sep 17 00:00:00 2001
+From: Giuseppe Bilotta &lt;giuseppe.bilotta@gmail.com&gt;
+Date: Tue, 17 Aug 2010 00:48:24 +0200
+Subject: [PATCH] Use relative font-sizes
+
+---
+ doc/style.css | 4 ++--
+ 1 files changed, 2 insertions(+), 2 deletions(-)
+
+diff --git a/doc/style.css b/doc/style.css
+index 66d962b..fa4b2a3 100644
+--- a/doc/style.css
++++ b/doc/style.css
+@@ -14,7 +14,7 @@ nav {
+
+ .header {
+ margin: 0;
+- font-size: 22px;
++ font-size: 140%;
+ font-weight: bold;
+ line-height: 1em;
+ display: block;
+@@ -22,7 +22,7 @@ nav {
+
+ .inlineheader .author {
+ margin: 0;
+- font-size: 18px;
++ font-size: 112%;
+ font-weight: bold;
+ display: block;
+ }
+--
+1.7.2.rc0.231.gc73d
+</pre>
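The conversion behind the patch is just px ÷ base × 100; a quick sketch of the arithmetic (assuming the 16px base mentioned above — the patch then rounds to taste, 22px → 140% and 18px → 112%):

```python
# Convert an absolute px font size to a percentage of a base font size,
# as done by hand in the patch above.
def px_to_percent(px: float, base: float = 16.0) -> float:
    return px / base * 100.0
```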
diff --git a/doc/bugs/aggregate_generates_long_filenames.mdwn b/doc/bugs/aggregate_generates_long_filenames.mdwn
new file mode 100644
index 000000000..33c300bd2
--- /dev/null
+++ b/doc/bugs/aggregate_generates_long_filenames.mdwn
@@ -0,0 +1,40 @@
+The [[plugins/aggregate]] plugin mashes the `title` of an aggregated post into a filename. This results in long filenames; I have hit a filesystem length limitation on several occasions. Some (ab)uses of RSS, e.g., twitter,
+generate long titles. Especially once you throw escaping into the mix:
+
+ $ ikiwiki --setup testsetup --aggregate --refresh
+ failed to write ./test/lifestream/Hidden_Features_Of_Perl__44___PHP__44___Javascript__44___C__44___C++__44___C__35____44___Java__44___Ruby___46____46____46__._aggregated.ikiwiki-new: File name too long
+ aggregation failed with code 9216
+ $ echo $?
+ 25
+
+It would also appear this abruptly terminates aggregate processing (if not ikiwiki itself). Only after moving my test repo to `/tmp` to shorten the filename did I see newer RSS feeds (from a totally different source) picked up.
+
+
+-- [[Jon]]
+
+> I have to wonder what filesystem you have there where 147 characters
+> is a long filename. Ikiwiki already uses `POSIX::pathconf` on the srcdir
+> to look up `_PC_NAME_MAX`
+> to see if the filename is too long, and shortens it, so it seems
+> that, in addition to having a rather antique long filename limit, your
+> system also doesn't properly expose it via pathconf. Not sure what
+> ikiwiki can do here. --[[Joey]]
+
+>> This is an ext4 filesystem with default settings (which appears to mean
+>> 256 bytes for pathnames). Despite the error saying file name, it's
+>> definitely a path issue since moving my test repo to `/tmp` from
+>> `/home/jon/wd/mine/www` hides the problem. I note the following comment
+>> in `aggregate.pm`:
+
+ # Make sure that the file name isn't too long.
+ # NB: This doesn't check for path length limits.
+
+>> I don't fully grok the aggregate source yet, but I wouldn't rule out
+>> a bug in the path length checking, personally. I'm happy to try and
+>> find it myself though :) -- [[Jon]]
+
+>>> Path length seems unlikely, since the max is 4096 there.
+>>> --[[Joey]]
+
+>>>> Aggregate now uses an "if it crashes, it must be too long" strategy.
+>>>> [[done]] --[[Joey]]
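The `NAME_MAX` lookup Joey mentions can be sketched as follows (a hypothetical helper; the real check lives in Perl in `aggregate.pm` and, as the quoted comment notes, covers only the file name, not the whole path):

```python
import os

# Ask the filesystem holding srcdir for its maximum file-name length and
# truncate the mashed-up title to fit, keeping the suffix. This mirrors
# the name check described above; it deliberately ignores path length.
def shorten(srcdir: str, name: str, suffix: str = "._aggregated") -> str:
    try:
        name_max = os.pathconf(srcdir, "PC_NAME_MAX")
    except (OSError, ValueError):
        name_max = 255  # typical limit on ext4 and friends
    return name[: name_max - len(suffix)] + suffix
```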
diff --git a/doc/bugs/aggregate_global_feed_names.mdwn b/doc/bugs/aggregate_global_feed_names.mdwn
new file mode 100644
index 000000000..27127ce27
--- /dev/null
+++ b/doc/bugs/aggregate_global_feed_names.mdwn
@@ -0,0 +1,13 @@
+[[plugins/aggregate]] takes a name parameter that specifies a global name
+for a feed. This causes some problems:
+
+* If a site has multiple pages that aggregate, and they use the same
+  name, one will win and get the global name; the other will claim it's
+  working, but it will really show what the first one aggregated.
+* If an aggregate directive is moved from page A to page B, and the wiki
+ refreshed, aggregate does not realize the feed moved, and so it will
+ keep aggregated pages under `A/feed_name/*`. To work around this bug,
+ you have to delete A, refresh (maybe with --aggregate?), and then add B.
+
+Need to find a way to not make the name be global. Perhaps it needs to
+include the name of the page that contains the directive?
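The fix suggested above amounts to namespacing feed state by the containing page. A minimal sketch (the dict layout and names are assumptions, not `aggregate.pm`'s actual state):

```python
# Key feed state by the page that contains the directive as well as the
# feed name, so two pages using the same name no longer collide on one
# global entry.
feeds: dict = {}

def register_feed(page: str, name: str, url: str) -> str:
    key = f"{page}/{name}"  # per-page namespace instead of a global name
    feeds[key] = url
    return key
```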
diff --git a/doc/bugs/aggregate_plugin_errors.mdwn b/doc/bugs/aggregate_plugin_errors.mdwn
new file mode 100644
index 000000000..aa36bdd09
--- /dev/null
+++ b/doc/bugs/aggregate_plugin_errors.mdwn
@@ -0,0 +1,62 @@
+I'm using the aggregate plugin (which is wonderful, thanks!) at this URL http://adam.shand.net/blog/planet/
+
+When I run ikiwiki with "--aggregate" I get this error:
+
+    ronin(adam)$ /usr/bin/ikiwiki --setup ~adam/.ikiwiki/asni.setup --aggregate
+ successfully generated /var/www/adam.shand.net/iki.cgi
+ Use of uninitialized value in subroutine entry at /usr/share/perl5/IkiWiki/Plugin/aggregate.pm line 414.
+ Use of uninitialized value in subroutine entry at /usr/share/perl5/IkiWiki/Plugin/aggregate.pm line 414.
+ Use of uninitialized value in subroutine entry at /usr/share/perl5/IkiWiki/Plugin/aggregate.pm line 414.
+ Use of uninitialized value in subroutine entry at /usr/share/perl5/IkiWiki/Plugin/aggregate.pm line 414.
+ Use of uninitialized value in subroutine entry at /usr/share/perl5/IkiWiki/Plugin/aggregate.pm line 414.
+ Use of uninitialized value in subroutine entry at /usr/share/perl5/IkiWiki/Plugin/aggregate.pm line 414.
+ Use of uninitialized value in subroutine entry at /usr/share/perl5/IkiWiki/Plugin/aggregate.pm line 414.
+ Use of uninitialized value in subroutine entry at /usr/share/perl5/IkiWiki/Plugin/aggregate.pm line 414.
+
+> Fixed, this occurred when a feed did not include any body content tag.
+> --[[Joey]]
+
+Also, feeds from DokuWiki seem to crash the aggregate plugin completely;
+it's not a completely valid feed, but presumably crashing is still bad. The
+feed I'm seeing this with is http://www.wirelesscommons.org/feed.php
+
+> This is a bug in XML::Parser. Unfortunately, perl does not have a feed
+> parser that handles invalid feeds, and in particular, XML::Parser has
+> issues with feeds that claim to be encoded in utf-8 and contain invalid
+> utf sequences, as well as other encoding issues. See also [[!debbug 380426]].
+> Note though that this invalid feed does not really crash the aggregate plugin,
+> it just notes that XML::Parser crashed on it and continues. This is the
+> best I can do in ikiwiki. I have filed a bug on XML::Parser about this,
+> it's [[!debbug 420636]]. I've also put in a workaround, so [[done]].
+
+**Wonderful**, thanks Joey! -- Adam.
+
+ -- System Information:
+ Debian Release: 3.1
+ APT prefers testing
+ APT policy: (650, 'testing')
+ Architecture: i386 (i686)
+ Kernel: Linux 2.4.25-1um
+ Locale: LANG=C, LC_CTYPE=C (charmap=ANSI_X3.4-1968)
+
+ Versions of packages ikiwiki depends on:
+ ii gcc [c-compiler] 4:3.3.5-3 The GNU C compiler
+ ii gcc-2.95 [c-compiler] 1:2.95.4-22 The GNU C compiler
+ ii gcc-3.3 [c-compiler] 1:3.3.5-13 The GNU C compiler
+ ii libc6-dev [libc-dev] 2.3.6-7 GNU C Library: Development Librari
+ ii libcgi-formbuilder-perl 3.03.01-1 Easily generate and process statef
+ ii libcgi-session-perl 4.14-1 Persistent session data in CGI app
+ ii libhtml-parser-perl 3.45-2 A collection of modules that parse
+ ii libhtml-scrubber-perl 0.08-3 Perl extension for scrubbing/sanit
+ ii libhtml-template-perl 2.8-1 HTML::Template : A module for usin
+ ii libmail-sendmail-perl 0.79-1 Send email from a perl script
+ ii libtime-duration-perl 1.02-1 Time::Duration -- rounded or exact
+ ii libtimedate-perl 1.1600-4 Time and date functions for Perl
+ ii liburi-perl 1.35-1 Manipulates and accesses URI strin
+ ii libxml-simple-perl 2.14-5 Perl module for reading and writin
+ ii markdown 1.0.1-3 Text-to-HTML conversion tool
+ ii perl 5.8.8-6.1 Larry Wall's Practical Extraction
+
+
+Cheers,
+--[[AdamShand]]
diff --git a/doc/bugs/aggregate_plugin_errors/discussion.mdwn b/doc/bugs/aggregate_plugin_errors/discussion.mdwn
new file mode 100644
index 000000000..3425b6d16
--- /dev/null
+++ b/doc/bugs/aggregate_plugin_errors/discussion.mdwn
@@ -0,0 +1,6 @@
+I have the same problem here when I use a feed from Google's shared-items feed: http://www.google.com/reader/public/atom/user/04715560304044435944/state/com.google/broadcast
+
+john
+
+> I cannot reproduce any problem with this feed. Can you provide details?
+> --[[Joey]]
diff --git a/doc/bugs/aggregate_plugin_should_honour_a_post__39__s_mctime.mdwn b/doc/bugs/aggregate_plugin_should_honour_a_post__39__s_mctime.mdwn
new file mode 100644
index 000000000..865637ea4
--- /dev/null
+++ b/doc/bugs/aggregate_plugin_should_honour_a_post__39__s_mctime.mdwn
@@ -0,0 +1,15 @@
+It would be nice if the [[aggregate_plugin|plugins/aggregate]] would try to
+extract the m/ctime out of each post and touch the files on the filesystem
+appropriately, so that ikiwiki reflects the actual time of the post via the
+[[inline_plugin|plugins/inline]], rather than the time when the aggregation ran to pull the post in. --[[madduck]]
+
+> Like this? (Existing code in aggregate.pm...) --[[Joey]]
+
+ # Set the mtime, this lets the build process get the right creation
+ # time on record for the new page.
+ utime $mtime, $mtime, pagefile($guid->{page})
+ if defined $mtime && $mtime <= time;
+
+>> I'll have to debug this, it's not working here... and this is an ikiwiki aggregator scraping another ikiwiki site.
+
+>>> Any news about this? --[[Joey]]
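For comparison, the Perl snippet above translated into a Python sketch (hypothetical helper name):

```python
import os
import time

# Stamp the aggregated page file with the post's own mtime, so that
# inline sorts by the post's time rather than by when aggregation ran;
# mtimes from the future are ignored, as in the Perl snippet above.
def set_post_time(pagefile: str, mtime: float) -> None:
    if mtime is not None and mtime <= time.time():
        os.utime(pagefile, (mtime, mtime))
```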
diff --git a/doc/bugs/aggregate_removed_feeds_linger.mdwn b/doc/bugs/aggregate_removed_feeds_linger.mdwn
new file mode 100644
index 000000000..3e856e26d
--- /dev/null
+++ b/doc/bugs/aggregate_removed_feeds_linger.mdwn
@@ -0,0 +1,11 @@
+When the [[plugins/aggregate]] plugin was used for a feed and the feed is removed (or the
+same feed name is given a different rss feed), the old entries don't
+automatically vanish.
+
+I think that if it was just removed, they are never GC'd, because the
+expiry code works on the basis of existing feeds. And if it was replaced,
+old items won't go away until expirecount or expireage is met.
+
+To fix it probably needs an explicit check for items aggregated by feeds
+that no longer provide them. Catching old items for feeds that were changed
+to a different url may be harder yet. --[[Joey]]
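The explicit check described above could look something like this (a sketch under assumed data structures, not ikiwiki's real state layout):

```python
# Drop aggregated items whose feed no longer exists in the setup: the
# garbage collection that the expiry code, which only walks existing
# feeds, currently never performs.
def gc_items(items: dict, live_feeds: set) -> dict:
    return {guid: it for guid, it in items.items()
            if it["feed"] in live_feeds}
```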
diff --git a/doc/bugs/aggregateinline_planets_wrongly_link_to_posts.mdwn b/doc/bugs/aggregateinline_planets_wrongly_link_to_posts.mdwn
new file mode 100644
index 000000000..58c247631
--- /dev/null
+++ b/doc/bugs/aggregateinline_planets_wrongly_link_to_posts.mdwn
@@ -0,0 +1,17 @@
+[[!meta title="aggregate/inline planets wrongly link to posts"]]
+
+Please see
+<http://vcs-pkg.org/planet/>. The headers of posts link to the HTML pages, which ikiwiki scraped.
+I believe that the headers should link to the posts directly, not the "cached" copies ikiwiki keeps around.
+
+> As far as I can see, that problem no longer exists.
+
+Also, the `\[[meta]]` titles and author directives aren't processed, but included inline.
+
+> Hmm, I don't see that either.
+
+What's also not ideal is that the cached copies can be edited. Any edits there will never make it to the VCS and thus won't show up in recentchanges.
+
+> That can be disabled now by enabling `aggregateinternal` --[[Joey]]
+
+> Calling this [[done]], please let me know if I missed something.
diff --git a/doc/bugs/align_doesn__39__t_always_work_with_img_plugin_.mdwn b/doc/bugs/align_doesn__39__t_always_work_with_img_plugin_.mdwn
new file mode 100644
index 000000000..e986bdc82
--- /dev/null
+++ b/doc/bugs/align_doesn__39__t_always_work_with_img_plugin_.mdwn
@@ -0,0 +1,7 @@
+Using the img plugin to inline an image, the "align" parameter doesn't work as expected if you also include a "caption".
+
+As best as I can tell, this is because the "caption" parameter works by wrapping the image inside a table, which means that the "align" parameter is aligning within the table cell rather than the page itself.
+
+-- AdamShand
+
+> I agree, this is annoying... and [[done]]! --[[Joey]]
diff --git a/doc/bugs/anonok_vs._httpauth.mdwn b/doc/bugs/anonok_vs._httpauth.mdwn
new file mode 100644
index 000000000..bff37e18b
--- /dev/null
+++ b/doc/bugs/anonok_vs._httpauth.mdwn
@@ -0,0 +1,118 @@
+I've got a wiki where editing requires [[plugins/httpauth]] (with
+`cgiauthurl` working nicely). I now want to let the general public
+edit Discussion subpages, so I enabled [[plugins/anonok]] and set
+`anonok_pagespec` to `'*/Discussion'`, but HTTP auth is still being
+required for those.
+
+(Actually, what I'll really want to do is probably [[plugins/lockedit]]
+and a whitelist of OpenIDs in `locked_pages`...)
+
+--[[schmonz]]
+
+> The only way I can see to support this combination is for httpauth with
+> cgiauthurl to work more like other actual login types. Which would mean
+> that on editing a page that needs authentication, ikiwiki would redirect
+> them to the Signin page, which would then have a link they could follow
+> to bounce through the cgiauthurl and actually sign in. This would be
+> significantly different than the regular httpauth process, in which the
+> user signs in in passing. --[[Joey]]
+
+>> My primary userbase has grown accustomed to the seamlessness of
+>> httpauth with SPNEGO, so I'd rather not reintroduce a seam into
+>> their web-editing experience in order to let relatively few outsiders
+>> edit relatively few pages. When is the decision made about whether
+>> the current page can be edited by the current user (if any)? What
+>> if there were a way to require particular auth plugins for particular
+>> PageSpecs? --[[schmonz]]
+
+>>> The decision about whether a user can edit a page is made by plugins
+>>> such as signinedit and lockedit, that also use canedit hooks to redirect
+>>> the user to a signin page if necessary.
+>>>
+>>> A tweak on my earlier suggestion would be to have httpauth notice when the
+>>> Signin page is being built and immediately redirect to the cgiauthurl
+>>> before the page can be shown to the user. This would, though, not play
+>>> well with other authentication methods like openid, since the user
+>>> would never see the Signin form. --[[Joey]]
+
+>>>> Would I be able to do what I want with a local plugin that
+>>>> abuses canedit (and auth) to reach in and call the appropriate
+>>>> plugin's auth method -- e.g., if the page matches */Discussion,
+>>>> call `openid:auth()`, else `httpauth:auth()`? --[[schmonz]]
+
+>>>>> That seems it would be
+>>>>> annoying for httpauth users (who were not currently authed),
+>>>>> as they would then see the openid signin form when going to edit a
+>>>>> Discussion page.
+>>>>> --[[Joey]]
+
+>>>>>> I finally see the problem, I think. When you initially
+>>>>>> suggested "a link they could follow to bounce through the
+>>>>>> cgiauthurl", presumably this could _be_ the Edit link for
+>>>>>> non-Discussion pages, so that the typical case of an httpauth
+>>>>>> user editing an editable-only-by-httpauth page doesn't visibly
+>>>>>> change. And then the Edit link for Discussion subpages could do
+>>>>>> as you suggest, adding one click for the httpauth user, who won't
+>>>>>> often need to edit those subpages. --[[schmonz]]
+
+>> On reflection, I've stopped being bothered by the
+>> redirect-to-signin-page approach. (It only needs to happen once per
+>> browser session, anyway.) Can we try that? --[[schmonz]]
+
+Here is an attempt. With this, httpauth will only redirect to the
+`cgiauthurl` when a page is edited, and it will defer to other plugins
+like anonok first. I have not tested this. --[[Joey]]
+
+<pre>
+diff --git a/IkiWiki/Plugin/httpauth.pm b/IkiWiki/Plugin/httpauth.pm
+index 127c321..a18f8ca 100644
+--- a/IkiWiki/Plugin/httpauth.pm
++++ b/IkiWiki/Plugin/httpauth.pm
+@@ -9,6 +9,8 @@ use IkiWiki 3.00;
+ sub import {
+ hook(type => "getsetup", id => "httpauth", call => \&getsetup);
+ hook(type => "auth", id => "httpauth", call => \&auth);
++ hook(type => "canedit", id => "httpauth", call => \&canedit,
++ last => 1);
+ }
+
+ sub getsetup () {
+@@ -33,9 +35,21 @@ sub auth ($$) {
+ if (defined $cgi->remote_user()) {
+ $session->param("name", $cgi->remote_user());
+ }
+- elsif (defined $config{cgiauthurl}) {
+- IkiWiki::redirect($cgi, $config{cgiauthurl}.'?'.$cgi->query_string());
+- exit;
++}
++
++sub canedit ($$$) {
++ my $page=shift;
++ my $cgi=shift;
++ my $session=shift;
++
++ if (! defined $cgi->remote_user() && defined $config{cgiauthurl}) {
++ return sub {
++ IkiWiki::redirect($cgi, $config{cgiauthurl}.'?'.$cgi->query_string());
++ exit;
++ };
++ }
++ else {
++ return undef;
+ }
+ }
+
+</pre>
+
+> With `anonok` enabled, this works for anonymous editing of an
+> existing Discussion page. auth is still needed to create one. --[[schmonz]]
+
+>> Refreshed above patch to fix that. --[[Joey]]
+
+>> Remaining issue: This patch will work with anonok, but not openid or
+>> passwordauth, both of which want to display a login page at the same
+>> time that httpauth is redirecting to the cgiauthurl. As mentioned above,
+>> the only way to deal with that would be to add a link to the signin page
+>> that does the httpauth signin. --[[Joey]]
+
+>>> That's dealt with in final version. [[done]] --[[Joey]]
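The control flow of the final patch can be summarised in a few lines (a Python sketch of the hook contract, with hypothetical names):

```python
# httpauth's canedit hook, registered to run last: if the user has no
# HTTP auth and a cgiauthurl is configured, return a deferred action
# that redirects to it. Returning None instead lets earlier plugins'
# decisions (e.g. anonok allowing */Discussion) stand.
def canedit(remote_user, cgiauthurl):
    if remote_user is None and cgiauthurl is not None:
        def redirect():
            return "redirect:" + cgiauthurl
        return redirect  # only invoked if no other plugin allowed the edit
    return None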
diff --git a/doc/bugs/argument_isn__39__t_numeric:_mixing_templates_and_creation__95__date.mdwn b/doc/bugs/argument_isn__39__t_numeric:_mixing_templates_and_creation__95__date.mdwn
new file mode 100644
index 000000000..ff98ba55f
--- /dev/null
+++ b/doc/bugs/argument_isn__39__t_numeric:_mixing_templates_and_creation__95__date.mdwn
@@ -0,0 +1,62 @@
+I get the following error when building my wiki
+
+ Argument "\x{3c}\x{54}..." isn't numeric in numeric eq (==) at /usr/share/perl5/IkiWiki.pm line 2547.
+ Argument "\x{3c}\x{54}..." isn't numeric in numeric eq (==) at /usr/share/perl5/IkiWiki.pm line 2547.
+
+that line corresponds to
+
+ sub match_creation_year ($$;@) {
+ if ((localtime($IkiWiki::pagectime{shift()}))[5] + 1900 == shift) { <-- this one
+ return IkiWiki::SuccessReason->new('creation_year matched');
+ }
+
+A git bisect shows that the offending commit introduced this hunk
+
+
+ --- /dev/null
+ +++ b/templates/all_entry.mdwn
+ @@ -0,0 +1,23 @@
+ +## <TMPL_VAR year>
+ +
+ +There
+ +<TMPL_IF current>
+ +have been
+ +<TMPL_ELSE>
+ +were
+ +</TMPL_IF>
+ +[[!pagecount pages="
+ +log/* and !tagged(aggregation) and !*/Discussion and !tagged(draft)
+ +and creation_year(<TMPL_VAR year>)
+ +and !*.png and !*.jpg
+ +"]] posts
+ +<TMPL_IF current>
+ +so far
+ +</TMPL_IF>
+ +in <TMPL_VAR year>.
+ +
+ +[[!inline pages="
+ + log/* and !tagged(aggregation) and !*/Discussion and !tagged(draft)
+ + and creation_year(<TMPL_VAR year>)
+ + and !*.png and !*.jpg
+ + " archive=yes feeds=no]]
+
+The lines which feature `creation_year(<TMPL_VAR year>)` are most likely the culprits. That would explain why the error was repeated twice, and would tally with the file in `templates/` being rendered directly, rather than only the pages that include it.
+
+A workaround is to move the template outside of the srcdir into the external templates directory and include the file suffix when using it, e.g.
+
+ \[[!template id=all_entry.tmpl year=2010 current=true]]
+
+I believed (until I tested) that the [[ikiwiki/directive/if]] directive, with the `included()` test, would be an option here. E.g.
+
+ \[[!if test="included()" then="""
+ ...template...
+ """ else="""
+ Nothing to see here.
+ """]]
+
+However this doesn't work. I assume "included" in this context means e.g. via an `inline` or `map`, not template trans-clusion. -- [[Jon]]
+
+> As far as I know, this bug was fixed in
+> 4a75dee651390b79ce4ceb1d951b02e28b3ce83a on October 20th. [[done]] --[[Joey]]
+
+>> Sorry Joey, I'll make sure to reproduce stuff against master in future. [[Jon]]
diff --git a/doc/bugs/attachment:_escaping_underscores_in_filename__63__.mdwn b/doc/bugs/attachment:_escaping_underscores_in_filename__63__.mdwn
new file mode 100644
index 000000000..4ce4ac5ee
--- /dev/null
+++ b/doc/bugs/attachment:_escaping_underscores_in_filename__63__.mdwn
@@ -0,0 +1,22 @@
+I've just noticed that `attachment` plugin escapes the underscore
+characters in attached filenames. For example, when I wanted to add
+the `foo_bar_baz.txt` file, Ikiwiki added the file `foo__95__bar__95__baz.txt`
+to my Subversion repo. I hope I'm not the only one who finds that filename
+terribly ugly ;)
+
+Is it a bug or security feature? --[[Paweł|ptecza]]
+
+>> Update: It's not only problem with attached filenames. I have
+>> `mysql/myisam_vs_ndb.mdwn` page in my wiki and attached two
+>> images (`myisam_vs_ndb_sql.png` and `myisam_vs_ndb_cpu.png`)
+>> and one OpenDocument file (`myisam_vs_ndb.ods`). Ikiwiki placed
+>> them into `myisam__95__vs__95__ndb` subdirectory as
+>> `myisam__95__vs__95__ndb__95__sql.png`, `myisam__95__vs__95__ndb__95__cpu.png`
+>> and `myisam__95__vs__95__ndb.ods` files. When I click "Attachments" link,
+>> I can't see my uploaded files, because they are in another subdirectory
+>> (`myisam__95__vs__95__ndb` instead of `myisam_vs_ndb`). --[[Paweł|ptecza]]
+
+> [[done]], uses `linkpage` now.
+
+>> It seems that now Ikiwiki doesn't escape filenames with underscore(s).
+>> Thank you very much for the fast fix! --[[Paweł|ptecza]]
diff --git a/doc/bugs/attachment:_failed_to_get_filehandle.mdwn b/doc/bugs/attachment:_failed_to_get_filehandle.mdwn
new file mode 100644
index 000000000..acfc60078
--- /dev/null
+++ b/doc/bugs/attachment:_failed_to_get_filehandle.mdwn
@@ -0,0 +1,115 @@
+I can't add any attachment to my wiki. When I select a file using the "Browse"
+button and click "Upload Attachment", `ikiwiki.cgi` displays
+an error message like the one below ("Błąd" is Polish for "error"):
+
+ Błąd: failed to get filehandle
+
+> Can you do some debugging? If you edit attachment.pm line 136, to print
+> out what it did get, and show me what that yields, maybe I can figure
+> this out.
+
+ error("failed to get filehandle ($fh)");
+
+>> Sure. I've done the change and it seems that $fh variable is undefined:
+
+>>     Use of uninitialized value in concatenation (.) or string at /usr/share/perl5/IkiWiki/Plugin/attachment.pm line 135.
+>>     failed to get filehandle ()
+
+> Also, what web server and version of perl is this? --[[Joey]]
+
+>> It's Apache2 2.2.8-1ubuntu0.3 and Perl 5.8.8-12 from Ubuntu Hardy. --[[Paweł|ptecza]]
+
+>>> Hmm, is your CGI.pm perhaps creating the attachment temp file, but
+>>> not providing an open filehandle to it via the `upload` method?
+>>> Change the debugging line to this: --[[Joey]]
+
+ error("failed to get filehandle:$fh ; file:$filename ; is ref:".ref($q->param('attachment')));
+
+>>>> Now my Ikiwiki returns:
+
+>>>>     failed to get filehandle: ; file:sandbox/test.txt ; is ref:
+
+>>>> Is it helpful for you? --[[Paweł|ptecza]]
+
+>>>>> Yes, this suggests that CGI.pm's `upload` function is not working,
+>>>>> but that it *is* returning a filehandle pointing at the attachment
+>>>>> using the old method. Hmm, so I'll bet you have a CGI.pm version
+>>>>> older than 2.47. Can you find your system's CGI.pm and grep for
+>>>>> "VERSION" in it to determine the version? I checked Debian stable,
+>>>>> and its perl 5.8.8 has version 3.15, so it is not affected, I think.
+
+>>>>>> I have CGI.pm 3.15 too:
+
+>>>>>>     $ grep VERSION= /usr/share/perl/5.8.8/CGI.pm
+>>>>>>     $CGI::VERSION='3.15';
+
+>>>>> I've just checked in a fix that should work, can you test it?
+>>>>> 71f10579c00a8ddc20ada1a1efd33aac25a3da7e --[[Joey]]
+
+>>>>>> I've patched `attachment.pm` module, but the bug still occurs.
+>>>>>> However I can see a little progress. I changed invoking `error()`
+>>>>>> subroutine like you showed me before and now Ikiwiki prints
+
+>>>>>>     failed to get filehandle:test.txt ; file:sandbox/test.txt ; is ref:
+
+>>>>>> --[[Paweł|ptecza]]
+
+>>>>>>> Well then, your CGI.pm is somehow not behaving as its documentation
+>>>>>>> describes, in two ways:
+>>>>>>> 1. `upload()` is not returning a reference to the filehandle
+>>>>>>> 2. The filename returned by `param("attachment")` is not also
+>>>>>>> a file handle.
+>>>>>>> That seems very broken. I can try to work around it some more
+>>>>>>> though. I've checked in a second try at dealing with things, can
+>>>>>>> you try it? --[[Joey]]
+
+>>>>>>>> Do you mean that 66f35e30dcea03c631a293e2341771277543b4ae?
+>>>>>>>> If so, then it causes "Internal Server Error" for me:
+
+>>>>>>>>     Can't use string ("test.txt") as a symbol ref while "strict refs" in use at /usr/share/perl5/IkiWiki/Plugin/attachment.pm line 144.
+
+>>>>>>>> I can rebuild Debian stable source package with CGI for Perl. Maybe it will help me? What do you think? --[[Paweł|ptecza]]
+
+>>>>>>>>> Silly thinko on my part, fixed that in git.. --[[Joey]]
+
+>>>>>>>>>> Thanks for the fix, Joey! Now CGI doesn't fail, but still no success with attaching a file:
+
+>>>>>>>>>>     failed to open : No such file or directory
+
+>>>>>>>>>> Do you have any another idea how to resolve that problem? I can try with rebuilding
+>>>>>>>>>> package `perl-modules` if it's necessary in that situation. --[[Paweł|ptecza]]
+
+>>>>>>>>>>> If CGI.pm is not creating a temp file, and not providing a
+>>>>>>>>>>> filehandle by either of its documented methods, then it's just
+>>>>>>>>>>> broken; ikiwiki can't deal with that level of brokenness.
+>>>>>>>>>>> I need to find out if this affects stable in general, or just
+>>>>>>>>>>> you/ubuntu. --[[Joey]]
+
+>>>>>>>>>>>> Same thing on FreeBSD using CGI.pm 3.15. Looks like $self->{'.tmpfiles'} in CGI.pm
+>>>>>>>>>>>> is not populated with the information about the uploaded file, causing tmpFileName()
+>>>>>>>>>>>> to return '' (uploadInfo(), which uses the same lookup method, fails in the same manner),
+>>>>>>>>>>>> but I have yet to find out why this happens. --[[HenrikBrixAndersen]]
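+
+For reference, a brief sketch of the documented CGI.pm ways of getting at
+an uploaded file (these are real CGI.pm methods; whether they behave as
+documented is exactly what is in question in this thread):
+
+    use CGI;
+    my $q = CGI->new;
+    # 1. upload() should return a reference to a filehandle,
+    #    or undef if the parameter is not an uploaded file:
+    my $fh = $q->upload('attachment');
+    # 2. The value returned by param() stringifies to the client-side
+    #    filename, but should also be usable directly as a filehandle:
+    my $filename = $q->param('attachment');
+    # 3. tmpFileName() maps that value to the server-side temp file
+    #    that CGI.pm spooled the upload into:
+    my $tmpfile = $q->tmpFileName($filename);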
+
+I can see the same message in the Apache log file. There is also the
+following warning:
+
+ Use of uninitialized value in length at /usr/share/perl5/IkiWiki/Plugin/attachment.pm line 36.
+
+> This is unrelated, I've fixed the warning message. --[[Joey]]
+
+Is it an Ikiwiki bug, or is my attachment plugin misconfigured? --[[Paweł|ptecza]]
+
+> I've reproduced the bug, and it does seem to be a bug with the perl in
+> debian stable/ubuntu hardy. Trying to figure it out --[[Joey]]
+
+> This was amazingly disgusting, see commit message for the full horror of
+> the details. I think it's [[done]] -- at least it works on debian stable
+> now. --[[Joey]]
+
+>> Wow! It's probably the biggest Ikiwiki commit message I've ever seen :)
+
+>> Yes, I can confirm that now the plugin works for me and I'm able to add
+>> attachments to my wiki. Yupiii! :D
+>> Thanks a lot, Joey! You're really great! :) --[[Paweł|ptecza]]
+
+>> Thank you very much for your effort, Joey! :) --[[Paweł|ptecza]]
diff --git a/doc/bugs/attachment_plugin_enabled_by_default__63__.mdwn b/doc/bugs/attachment_plugin_enabled_by_default__63__.mdwn
new file mode 100644
index 000000000..0b6d73300
--- /dev/null
+++ b/doc/bugs/attachment_plugin_enabled_by_default__63__.mdwn
@@ -0,0 +1,19 @@
+At the [[attachment|plugins/attachment]] plugin page I can see
+that it's enabled by default in Ikiwiki. Is that true?
+
+> No, typo ([[done]]). I don't want to enable it by default because it requires
+> site-specific configuration to be made secure. --[[Joey]]
+
+>> Thanks for your reply! That's what I guessed :) --[[Paweł|ptecza]]
+
+I have backported Ikiwiki 2.52 and I need to add that plugin to the
+`add_plugins` variable in my `ikiwiki.setup` file (and rebuild
+my wiki, of course) to see the new upload buttons when I edit a page
+and click the "Attachments" link.
+
+> FWIW, you don't need to rebuild the whole wiki, --refresh --wrappers is enough.
+
+>> It's good to know. Thank you for the hint!
+
+Maybe I should enable attachment handling in a different way?
+--[[Paweł|users/ptecza]]
diff --git a/doc/bugs/attachment_upload_does_not_work_for_windows_clients.mdwn b/doc/bugs/attachment_upload_does_not_work_for_windows_clients.mdwn
new file mode 100644
index 000000000..4e8c7bdcf
--- /dev/null
+++ b/doc/bugs/attachment_upload_does_not_work_for_windows_clients.mdwn
@@ -0,0 +1,34 @@
+It seems as if windows clients (IE) submit filenames with backslash as directory separator.
+(no surprise :-).
+
+But the attachment plugin translates these backslashes to underscores, making the
+whole path a single filename.
+
+> As far as I can see, that just means that the file will be saved with
+> a filename something like `c:__92__My_Documents__92__somefile`.
+> I don't see any "does not work" here. Error message?
+>
+> Still, I don't mind adding a special case, though obviously not in
+> `basename`. [[done]] --[[Joey]]
+
+>> Well, it's probably something else also; I get **bad attachment filename**.
+>> Now, that could really be a bad filename; the problem is that it wasn't. I even
+>> tried applying the **wiki_file_prune_regexps** one by one to see what was
+>> causing it. No problem there. The strange thing is that the error shows up
+>> when using firefox on windows too. But the backslash hack fixes at least the
+>> incorrect filename from IE (firefox on windows gave me the correct filename).
+>> I'll do some more digging... :-) /jh
+
+This little hack fixed the backslash problem, although I wonder if that
+really is the problem?
+(Everything works perfectly from linux clients of course. :-)
+
+ sub basename ($) {
+ my $file=shift;
+
+ $file=~s!.*/+!!;
+ $file=~s!.*\\+!!;
+ return $file;
+ }
+
+Should probably be `$file=~s!.*[/\\]+!!` :-)
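+
+A sketch of that combined form, stripping everything up to the last slash
+*or* backslash in one pass (illustration only, not the code as shipped):
+
+    sub basename ($) {
+    	my $file=shift;
+
+    	# Strip everything up to the last run of / or \ separators,
+    	# so both Unix and Windows (IE) style paths yield a bare name.
+    	$file=~s!.*[/\\]+!!;
+    	return $file;
+    }
+
+    # basename("/home/user/somefile")        returns "somefile"
+    # basename("c:\\My Documents\\somefile") returns "somefile"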
diff --git a/doc/bugs/backlink__40__.__41___doesn__39__t_work.mdwn b/doc/bugs/backlink__40__.__41___doesn__39__t_work.mdwn
new file mode 100644
index 000000000..534e5a01f
--- /dev/null
+++ b/doc/bugs/backlink__40__.__41___doesn__39__t_work.mdwn
@@ -0,0 +1,57 @@
+It seems `backlink(.)` doesn't work, that is, it doesn't match pages linked
+to from the current page.
+
+If I have two test pages, `foo`, which links to `bar`, then (on the `foo`
+page):
+
+ * backlink(foo) lists 'bar'
+ * backlink(.) lists nothing
+
+tested with 3.20120109.
+
+— [[Jon]]
+
+> The attached patch should fix it:
+
+>> [[applied|done]] thanks --[[Joey]]
+
+ From 30512ac5f6a724bafb1095ab246e0648999f7b6c Mon Sep 17 00:00:00 2001
+ From: Giuseppe Bilotta <giuseppe.bilotta@gmail.com>
+ Date: Fri, 13 Jan 2012 11:02:11 +0100
+ Subject: [PATCH] backlink(.) should behave like backlink(<current page>)
+
+ Since commit c4d4cad3befbbd444d094cbeb0b6ebba3910a025, the single dot in
+ a pagespec can be used to mean the current page. While this worked
+ correctly in link() it didn't work in backlink(). Fix this by explicitly
+ checking the testpage in backlink against . and replacing it with the
+ current location if necessary.
+ ---
+ IkiWiki.pm | 10 ++++++++--
+ 1 files changed, 8 insertions(+), 2 deletions(-)
+
+ diff --git a/IkiWiki.pm b/IkiWiki.pm
+ index 08e242a..bc56501 100644
+ --- a/IkiWiki.pm
+ +++ b/IkiWiki.pm
+ @@ -2647,8 +2647,14 @@ sub match_link ($$;@) {
+ }
+
+ sub match_backlink ($$;@) {
+ - my $ret=match_link($_[1], $_[0], @_);
+ - $ret->influences($_[1] => $IkiWiki::DEPEND_LINKS);
+ + my $page=shift;
+ + my $testpage=shift;
+ + my %params=@_;
+ + if ($testpage eq '.') {
+ + $testpage = $params{'location'}
+ + }
+ + my $ret=match_link($testpage, $page, @_);
+ + $ret->influences($testpage => $IkiWiki::DEPEND_LINKS);
+ return $ret;
+ }
+
+ --
+ 1.7.8.rc2.253.gdbf3
+
+
+> (you need to re-make IkiWiki for it to work)
diff --git a/doc/bugs/backlinks_onhover_thing_can_go_weird.mdwn b/doc/bugs/backlinks_onhover_thing_can_go_weird.mdwn
new file mode 100644
index 000000000..415e6af91
--- /dev/null
+++ b/doc/bugs/backlinks_onhover_thing_can_go_weird.mdwn
@@ -0,0 +1,43 @@
+I was just hovering over the '...' next to the backlinks on a page on
+<http://ikiwiki.info/>. In terms of the size of my browser window, this was
+towards the bottom-right of the screen.
+
+When I hovered over the '...', the additional backlinks float appeared. This
+caused the page length to grow downwards, meaning a vertical scrollbar was added
+to the page. This meant the text reflowed, and the '...' moved out from under my
+mouse pointer.
+
+This caused an infinite loop of box appears... text moves, box disappears...
+box re-appears.. which was not very visually pleasant.
+
+In general I think that the onhover float is a bit of bad UI. Even a truncated
+list of backlinks looks cluttered due to there being no delimiters. I moved to
+having an always-complete list of backlinks and having them as LI elements
+inside a UL to make it look neater, although I appreciate that would make some
+pages very long indeed.
+
+How about doing something a little like [[plugins/toggle]] for the excess
+items instead?
+
+-- [[Jon]]
+
+----
+
+An additional, related issue: if the box expands beyond the bottom of the
+page, you might move your mouse pointer to the scrollbar in order to move
+further down the list, but of course then you are outside the hover region.
+
+-- [[Jon]]
+
+> I agree, browser handling of this CSS is often not good.
+>
+> A toggle would be the perfect UI, but the heaviness of needing
+> to include 30 lines of javascript to do it, plus then it only working
+> with javascript enabled, is also not optimal.
+>
+> Another idea would be to make the "..." a link to the ikiwiki cgi.
+> The cgi could then have a mode that displays all the backlinks of a page
+> in a list.
+>
+> Yet another idea: Find some more refined CSS for handling a variable
+> size popup.. --[[Joey]]
diff --git a/doc/bugs/barfs_on_recentchange_entry_for_a_change_removing_an_invalid_pagespec.mdwn b/doc/bugs/barfs_on_recentchange_entry_for_a_change_removing_an_invalid_pagespec.mdwn
new file mode 100644
index 000000000..2f21d71c3
--- /dev/null
+++ b/doc/bugs/barfs_on_recentchange_entry_for_a_change_removing_an_invalid_pagespec.mdwn
@@ -0,0 +1,48 @@
+I have a commit doing
+
+ -\[[map pages="link(tag/<TMPL_VAR name>) and !papers/*"]]
+ +\[[map pages="link(sourcepage()) and !papers/*"]]
+
+ikiwiki now fails to compile the site, barfing:
+
+ Use of uninitialized value in subroutine entry at /usr/share/perl5/IkiWiki.pm line 1288.
+ ikiwiki.setup: Can't use string ("") as a subroutine ref while "strict refs" in use at /usr/share/perl5/IkiWiki.pm line 1288.
+ BEGIN failed--compilation aborted at (eval 6) line 200.
+
+after forcefully entering the Perl mode of thinking, I reduced this to line
+1285 of IkiWiki.pm (2.53), which apparently returns `undef`:
+
+ my $sub=pagespec_translate($spec);
+
+Why does it even bother parsing the diffs of `recentchanges`?
+
+I have not recompiled this site in ages, so I am not sure when this problem
+was introduced, but it wasn't there when I worked on the site last about
+a year ago in September 2007.
+
+-- [[madduck]]
+
+> I can't reproduce this problem. When I try, the generated
+> `recentchanges/change_$sha1._change` file has the diff properly escaped,
+> so that the map is not expanded at all.
+>
+> I also tried de-escaping that, and still failed to reproduce any crash.
+> The bogus pagespec simply expands to nothing. The line directly after the
+> line you quoted checks for syntax errors in the pagespec translation
+> eval and seems to be working fine:
+>
+> joey@kodama:~>perl -e 'use IkiWiki; my
+> $sub=IkiWiki::pagespec_translate("link(tag/<TMPL_VAR name>) and !papers/*"); print "caught failure:".$@'
+> caught failure:syntax error at (eval 14) line 1, near "|| &&"
+>
+> Based on your line numbers, you are not running a current version of
+> ikiwiki. (Doesn't quite seem to be version 2.53.x either) Try with a current
+> version, and see if you can send me a source tree that can reproduce the
+> problem? --[[Joey]]
+
+Did not hear back, so calling this [[done]], unless I hear differently.
+--[[Joey]]
+
+Just in case someone else sees this same error message:
+I was able to reproduce this by having an incomplete (not upgraded) rcs backend that didn't provide rcs_commit_staged() when attempting to submit a blog comment.
+--[[JeremyReed]]
diff --git a/doc/bugs/basewiki_uses_meta_directives_but_meta_is_not_enabled_by_default.mdwn b/doc/bugs/basewiki_uses_meta_directives_but_meta_is_not_enabled_by_default.mdwn
new file mode 100644
index 000000000..62931d8bc
--- /dev/null
+++ b/doc/bugs/basewiki_uses_meta_directives_but_meta_is_not_enabled_by_default.mdwn
@@ -0,0 +1,5 @@
+[[plugins/meta]] is not enabled by default, yet some pages in the default basewiki include [[the_meta_directive|ikiwiki/directive/meta]], notably the [[ikiwiki]] hierarchy.
+
+This means that the default output of "ikiwiki src dest", for two empty directories src and dest, results in the meta directive being displayed inline with the page text.
+
+> [[done]], meta now enabled by default.
diff --git a/doc/bugs/beautify__95__urlpath_will_add_.__47___even_if_it_is_already_present.mdwn b/doc/bugs/beautify__95__urlpath_will_add_.__47___even_if_it_is_already_present.mdwn
new file mode 100644
index 000000000..8e96b1f56
--- /dev/null
+++ b/doc/bugs/beautify__95__urlpath_will_add_.__47___even_if_it_is_already_present.mdwn
@@ -0,0 +1,3 @@
+beautify_urlpath will prepend a useless "./" to the URL "./foo". Fixed in commit 5b1cf21a on my comments branch. --[[smcv]]
+
+[[!tag patch done]]
diff --git a/doc/bugs/bestlink_change_update_issue.mdwn b/doc/bugs/bestlink_change_update_issue.mdwn
new file mode 100644
index 000000000..c26e40d10
--- /dev/null
+++ b/doc/bugs/bestlink_change_update_issue.mdwn
@@ -0,0 +1,32 @@
+* Has bugs updating things if the bestlink of a page changes due to
+ adding/removing a page. For example, if Foo/Bar links to "Baz", which is
+ Foo/Baz, and Foo/Bar/Baz gets added, it will update the links in Foo/Bar
+ to point to it, but will forget to update the backlinks in Foo/Baz.
+
+ The buggy code is in `refresh()`, when it determines what
+ links, on what pages, have changed. It only looks at
+ changed/added/deleted pages when doing this. But when Foo/Bar/Baz
+  is added, Foo/Bar is not changed -- so the change in its
+ backlinks is not noticed.
+
+  To fix this, it needs to consider, when rebuilding Foo/Bar for the changed
+  links, what oldlinks Foo/Bar had. If one of the oldlinks pointed to
+  Foo/Baz, and the page now links to Foo/Bar/Baz, it could then rebuild Foo/Baz.
+
+ Problem is that in order to do that, it needs to be able to tell that
+ the oldlinks linked to Foo/Baz. Which would mean either calculating
+ all links before the scan phase, or keeping a copy of the backlinks
+ from the last build, and using that. The first option would be a lot
+ of work for this minor issue.. it might be less expensive to just rebuild
+ *all* pages that Foo/Bar links to.
+
+ Keeping a copy of the backlinks has some merit. It could also be
+ incrementally updated.
+
+ This old bug still exists as of 031d1bf5046ab77c796477a19967e7c0c512c417.
+
+* And if Foo/Bar/Baz is then removed, Foo/Bar gets a broken link,
+ instead of changing back to linking to Foo/Baz.
+
+ This part was finally fixed by commit
+ f1ddf4bd98821a597d8fa1532092f09d3d9b5483.
diff --git a/doc/bugs/bestlink_returns_deleted_pages.mdwn b/doc/bugs/bestlink_returns_deleted_pages.mdwn
new file mode 100644
index 000000000..874f18ead
--- /dev/null
+++ b/doc/bugs/bestlink_returns_deleted_pages.mdwn
@@ -0,0 +1,75 @@
+To reproduce:
+
+1. Add the backlinkbug plugin below to ikiwiki.
+2. Create a page named test.mdwn somewhere in the wiki.
+3. Refresh ikiwiki in verbose mode. Pages whose bestlink is the test.mdwn page will be printed to the terminal.
+4. Delete test.mdwn.
+5. Refresh ikiwiki in verbose mode again. The same pages will be printed to the terminal again.
+6. Refresh ikiwiki in verbose mode another time. Now no pages will be printed.
+
+bestlink() checks %links (and %pagecase) to confirm the existence of the page.
+However, find_del_files() does not remove the deleted page from %links (and %pagecase).
+
+Since find_del_files removes the deleted page from %pagesources and %destsources,
+wouldn't it make sense for bestlink() to check %pagesources first? --[[harishcm]]
+
+> This same problem turned out to also be the root of half of ikiwiki's
+> second-oldest bug, [[bestlink_change_update_issue]].
+>
+> Fixing it is really a bit involved, see commit
+> f1ddf4bd98821a597d8fa1532092f09d3d9b5483. The fix I committed fixes
+> bestlink to not return deleted pages, but only *after* the needsbuild and
+> scan hooks are called. So I was able to fix it for every case except the
+> one you gave! Sorry for that. To fix it during needsbuild and scan,
+> a much more involved approach would be needed. AFAICS, no existing plugin
+> in ikiwiki uses bestlink in needsbuild or scan though.
+>
+> If the other half of [[bestlink_change_update_issue]] is fixed,
+> maybe by keeping a copy of the old backlinks info, then that fix could be
+> applied here too. --[[Joey]]
+
+>> Cool that was fast! Well at least half the bug is solved :) For now I'll
+>> probably try a workaround when using bestlink within the needsbuild
+>> or scan hooks. Maybe by testing if pagemtime equals zero. --[[harishcm]]
+
+>>> Yeah, and bestlink could also do that. However, it feels nasty to have
+>>> it need to look at pagemtime. --[[Joey]]
+
+----
+
+ #!/usr/bin/perl
+ # Plugin to reproduce bestlink returning deleted pages.
+ # Run with ikiwiki in verbose mode.
+
+ package IkiWiki::Plugin::bestlinkbug;
+
+ use warnings;
+ use strict;
+ use IkiWiki 3.00;
+
+ sub import {
+ hook(type => "getsetup", id => "bestlinkbug", call => \&getsetup);
+ hook(type => "needsbuild", id => "bestlinkbug", call => \&needsbuild);
+ }
+
+ sub getsetup () {
+ return
+ plugin => {
+ safe => 1,
+ rebuild => 0,
+ },
+ }
+
+ sub needsbuild (@) {
+ my $needsbuild=shift;
+
+ foreach my $page (keys %pagestate) {
+ my $testpage=bestlink($page, "test") || next;
+
+ debug("$page");
+ }
+ }
+
+ 1
+
+
diff --git a/doc/bugs/blog_posts_not_added_to_mercurial_repo.mdwn b/doc/bugs/blog_posts_not_added_to_mercurial_repo.mdwn
new file mode 100644
index 000000000..eead716d5
--- /dev/null
+++ b/doc/bugs/blog_posts_not_added_to_mercurial_repo.mdwn
@@ -0,0 +1,50 @@
+I am using mercurial as RCS backend and ikiwiki 2.40.
+
+It seems that, when adding a blog post, it is not immediately committed to the mercurial repo. I have a page with this directive:
+
+ \[[!inline pages="journal/blog2008/* and !*/Discussion" show="0" feeds="no" actions="yes" rootpage="journal/blog2008"]]
+
+When I add a blog post, I see it on the wiki but it doesn't appear on `History` or `RecentChanges`. If I run `hg status` on the wiki source dir, I see the new file has been marked as `A` (i.e., a new file that has not been committed).
+
+If I then edit the blog post, **then** the file gets committed and I can see the edit on `History` and `RecentChanges`. The creation of the file remains unrecorded. --[[buo]]
+
+> Ikiwiki calls `rcs_add()` if the page is new, followed by `rcs_commit()`.
+> For mercurial, these run respectively `hg add` and `hg commit`. If the
+> add or commit fails, it will print a warning to stderr, you might check
+> apache's error.log to see if there's anything there. --[[Joey]]
+
+>>The problem was using accented characters (é, í) in the change comments. I didn't have
+>>a UTF-8 locale enabled in my setup file. By coincidence this happened for the first time
+>>in a couple of consecutive blog posts, so I was mistaken about the root of the problem. I don't know if
+>>you will consider this behavior a bug, since it's strictly speaking a misconfiguration but it
+>>still causes ikiwiki's mercurial backend to fail. A quick note in the docs might be a good idea. For my part, please
+>>close this bug, and thanks for the help. --[[buo]]
+
+>>> So, in a non-utf8 locale, mercurial fails to commit if the commit
+>>> message contains utf8? --[[Joey]]
+
+>>>> (Sorry for the delay, I was AFK for a while.) What I am seeing is this: in a non-utf8 locale, using mercurial "stand-alone" (no ikiwiki involved), mercurial fails to commit if the commit message has characters such as á. If the locale is utf8, mercurial works fine (this is with mercurial 1.0).
+
+>>>> However, the part that seems a bit wrong to me is this: even if my locale is utf8, I have to explicitly set a utf8 locale in the wiki's setup file, or the commit fails. It looks like ikiwiki is not using this machine's default locale, which is utf8. Also, I'm not getting any errors in apache's error log.
+
+>>>> Wouldn't it make sense to use the machine's default locale if 'locale' is commented out in the setup file?
+
+>>>>> Ikiwiki wrappers only allow whitelisted environment variables
+>>>>> through, and the locale environment variables are not included
+>>>>> currently.
+>>>>>
+>>>>> But that's not the whole story, because "machine's default locale"
+>>>>> is not very well defined. For example, my laptop is a Debian system.
+>>>>> It has a locale setting in /etc/environment (`LANG="en_US.UTF-8"`).
+>>>>> But even if I start apache, making sure that LANG is set and exported
+>>>>> in the environment, CGI scripts apache runs do not see LANG in their
+>>>>> environment. (I notice that `/etc/init.d/apache` explicitly
+>>>>> forces LANG=C. But CGI scripts don't see the C value either.)
+>>>>> Apache simply does not propagate its runtime environment to CGI
+>>>>> scripts, and this is probably to comply with the CGI specification
+>>>>> (although it doesn't seem to completely rule out CGIs being passed
+>>>>> other variables).
+>>>>>
+>>>>> If mercurial needs a utf-8 locale, I guess the mercurial plugin needs
+>>>>> to check if it's not in one, and do something sane (either fail
+>>>>> earlier, or complain, or strip utf-8 out of comments). --[[Joey]]
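+
+A minimal sketch of the "fail earlier, or complain" option (hypothetical;
+the actual mercurial plugin may handle this differently, if at all):
+
+    # In the mercurial plugin's checkconfig hook: warn if no UTF-8
+    # locale is in effect, since hg commits whose messages contain
+    # non-ASCII characters fail otherwise.
+    sub checkconfig () {
+    	my $locale = $ENV{LC_ALL} || $ENV{LC_CTYPE} || $ENV{LANG} || "";
+    	if ($locale !~ /utf-?8/i) {
+    		warn "mercurial: no UTF-8 locale set; ".
+    		     "non-ASCII commit messages may fail\n";
+    	}
+    }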
diff --git a/doc/bugs/blog_spam_plugin_not_allowing_non-ASCII_chars__63__.mdwn b/doc/bugs/blog_spam_plugin_not_allowing_non-ASCII_chars__63__.mdwn
new file mode 100644
index 000000000..59bf93d14
--- /dev/null
+++ b/doc/bugs/blog_spam_plugin_not_allowing_non-ASCII_chars__63__.mdwn
@@ -0,0 +1,15 @@
+Hi,
+
+I'm trying to add a comment, and ikiwiki fails with this error message:
+
+ Error: HTTP::Message content must be bytes at /usr/share/perl5/RPC/XML/Client.pm line 308
+
+This seems to happen because I had a non-ASCII character in the comment (an ellipsis, …).
+The interesting part is that the comment preview works fine, just the save fails. Probably
+this means that the blogspam plugin is the culprit (hence the error in RPC::XML::Client library).
+I'm using version 3.20100815~bpo50+. Thanks!
+
+> I've filed an upstream bug about this on RPC::XML:
+> <https://rt.cpan.org/Ticket/Display.html?id=61333>
+>
+> Worked around it in blogspam by decoding. [[done]] --[[Joey]]
diff --git a/doc/bugs/blogspam__95__options_whitelist_vs._IPv6__63__.mdwn b/doc/bugs/blogspam__95__options_whitelist_vs._IPv6__63__.mdwn
new file mode 100644
index 000000000..f3a39c02b
--- /dev/null
+++ b/doc/bugs/blogspam__95__options_whitelist_vs._IPv6__63__.mdwn
@@ -0,0 +1,4 @@
+This is possibly/probably due to my weird setup, which is that I have apache behind nginx, with the result that apache sees the client's IPv4 address as having been mapped to IPv6, i.e. <tt>::ffff:10.11.12.13</tt>. That being the case, I currently need to specify that (with the <tt>::ffff:</tt> prepended) if I want to whitelist (or, more importantly, blacklist) an IPv4 address.
+
+It strikes me that this is liable to become more of a problem as people finally start using IPv6, so it might be worth ensuring that the code that compares IP addresses be able to treat the two formats (with and without the ffff's) as equivalent. --[[fil]]
+
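+A sketch of what the suggested comparison amounts to: strip the
+IPv4-mapped prefix before matching against the whitelist/blacklist
+(illustrative only; the real option handling may need more than this):
+
+    # Treat ::ffff:10.11.12.13 and 10.11.12.13 as the same address.
+    sub canonicalize_ip ($) {
+    	my $ip=shift;
+    	$ip=~s/^::ffff://i;
+    	return $ip;
+    }
+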
diff --git a/doc/bugs/blogspam_marks_me_as_spam_on_ipv6.mdwn b/doc/bugs/blogspam_marks_me_as_spam_on_ipv6.mdwn
new file mode 100644
index 000000000..9b415a84a
--- /dev/null
+++ b/doc/bugs/blogspam_marks_me_as_spam_on_ipv6.mdwn
@@ -0,0 +1,8 @@
+I just got this message trying to post to this wiki:
+
+ Error: Sorry, but that looks like spam to blogspam: No reverse DNS entry for 2001:1928:1:9::1
+
+So yeah, it seems I have no reverse DNS for my IPv6 address, which may
+be quite common for emerging IPv6 deployments...
+
+This may be related to [[blogspam_options whitelist vs. IPv6?]].
diff --git a/doc/bugs/both_inline_and_comment_create_elements_id__61__feedlink.mdwn b/doc/bugs/both_inline_and_comment_create_elements_id__61__feedlink.mdwn
new file mode 100644
index 000000000..170f3810e
--- /dev/null
+++ b/doc/bugs/both_inline_and_comment_create_elements_id__61__feedlink.mdwn
@@ -0,0 +1,15 @@
+The [[plugins/inline]] and [[plugins/comments]] plugins both generate feed links.
+
+In both cases, the generated markup include an element with `id="feedlink"`.
+
+[XHTML 1.0 Strict](http://www.w3.org/TR/xhtml1/#h-4.10) (Ikiwiki's default output type) forbids multiple elements with the same ID:
+
+> In XML, fragment identifiers are of type ID, and there can only be a single attribute of type ID per element. Therefore, in XHTML 1.0 the id attribute is defined to be of type ID. In order to ensure that XHTML 1.0 documents are well-structured XML documents, XHTML 1.0 documents MUST use the id attribute when defining fragment identifiers on the elements listed above. See the HTML Compatibility Guidelines for information on ensuring such anchors are backward compatible when serving XHTML documents as media type text/html.
+
+As does [W3C's HTML5](http://www.w3.org/TR/html5/elements.html#the-id-attribute).
+
+Any page with both a comments feed and an inline feed will be invalid XHTML 1.0 Strict or HTML 5.
+
+-- [[Jon]]
+
+> [[news/version_3.2011012]] suggests this is fixed for `inline`, at least; I will test to see if it is cleared up for comments too. -- [[Jon]]
diff --git a/doc/bugs/broken_page_after_buggy_remove.mdwn b/doc/bugs/broken_page_after_buggy_remove.mdwn
new file mode 100644
index 000000000..c85d22cc5
--- /dev/null
+++ b/doc/bugs/broken_page_after_buggy_remove.mdwn
@@ -0,0 +1,4 @@
+Hi, I created \[[sandbox/subpage]] then deleted it with the "remove" button.
+After confirmation there was a message about a xapian error (my bad, I did not write down the exact error message).
+Now, accessing [[sandbox/subpage|sandbox/subpage]] makes my browser complain about a redirect loop. [[JeanPrivat]]
+>Uh. Now the redirect-loop bug seems to have solved itself. However, I don't know if the xapian error needs to be investigated. But I found another [[bug|cannot revert page deletion]]. [[JeanPrivat]]
diff --git a/doc/bugs/broken_parentlinks.mdwn b/doc/bugs/broken_parentlinks.mdwn
new file mode 100644
index 000000000..556d89b65
--- /dev/null
+++ b/doc/bugs/broken_parentlinks.mdwn
@@ -0,0 +1,50 @@
+The header of a subpage always links to its "superpage", even if it doesn't
+exist. I'm not sure if this is a feature or a bug, but I would certainly prefer
+that superpages weren't mandatory.
+
+For example, if you are in 'example/page.html', the header will be something
+like 'wiki / example / page'. Now, if 'example.html' doesn't exist, you'll have
+a dead link for every subpage.
+
+---
+
+This is a bug, but fixing it is very tricky. Consider what would happen if
+example.mdwn were created: example/page.html and the rest of example/*
+would need to be updated to change the parentlink from a bare word to a
+link to the new page. Now if example.mdwn were removed again, they'd need
+to be updated again. So example/* depends on example. But it's even more
+tricky, because if example.mdwn is modified, we _don't_ want to rebuild
+example/*!
+
+ikiwiki doesn't have a way to represent this dependency and can't get one
+without a lot of new complex code being added.
+
+> Note that this code has now been added. In new terms, example/* has a
+> presence dependency on example. So this bug is theoretically fixable now.
+> --[[Joey]]
+
+For now the best thing to do is to make sure that you always create
+example if you create example/foo. Which is probably a good idea anyway..
+
+----
+
+Note that this bug does not exist if the wiki is built with the "usedirs"
+option, since in that case, the parent link will link to a subdirectory,
+that will just be missing the index.html file, but still nicely usable.
+--[[Joey]]
+
+----
+
+<http://www.gnu.org/software/hurd/hurd/translator/writing.html> does not exist.
+Then, on
+<http://www.gnu.org/software/hurd/hurd/translator/writing/example.html>, in the
+*parentlinks* line, *writing* links to the top-level *index* file. It should
+rather not link anywhere at all. --[[tschwinge]]
+
+> So, the bug has changed behavior a bit. Rather than a broken link, we get
+> a link to the toplevel page. This, FWIW, is because the template now
+> uses this for each parentlink:
+
+ <a href="<TMPL_VAR URL>"><TMPL_VAR PAGE></a>/
+
+> Best workaround is still to enable usedirs. --[[Joey]]
diff --git a/doc/bugs/brokenlinks_accumulates_duplicate_items.mdwn b/doc/bugs/brokenlinks_accumulates_duplicate_items.mdwn
new file mode 100644
index 000000000..7fe92f7a9
--- /dev/null
+++ b/doc/bugs/brokenlinks_accumulates_duplicate_items.mdwn
@@ -0,0 +1,27 @@
+After several runs of ikiwiki --refresh, the page on which I use the [[plugin/brokenlinks]] directive accumulates multiple repeated lists of pages on the RHS. For example:
+
+ * ?freebies from free kilowatts, free kilowatts, free kilowatts, free kilowatts, free kilowatts
+
+In this case the page "free kilowatts" has one link to "freebies" (it's tagged freebies).
+
+I think this may just be links caused by tags, actually.
+
+ikiwiki version 3.14159265.
+
+-- [[Jon]]
+
+> Is it possible that you upgraded from a version older than 3.12,
+> and have not rebuilt your wiki since, but just refreshed? And did not run
+> `ikiwiki-transition deduplinks`? If so, suggest you rebuild the wiki,
+> or run that, either would probably fix the problem.
+>
+> If you can get to the problem after rebuilding with the current ikiwiki,
+> and then refreshing a few times, I guess I will need a copy of the wiki
+> source and the `.ikiwiki` directory to reproduce this. --[[Joey]]
+
+>> Hi Joey, thanks for your response. I've reproduced it post-rebuild, after having run ikiwiki-transition and many refreshes (both resulting from content changes and otherwise), unfortunately, with ikiwiki 3.14159265 (different machine to the above report, though). I will contact you privately to provide a git URL and a copy of my .ikiwiki. -- [[Jon]]
+
+>>> Found the bug that was causing duplicates to get in, and fixed it.
+>>> [[done]] --[[Joey]]
+
+>>>> Good work Joey, thanks! -- [[Jon]]
diff --git a/doc/bugs/brokenlinks_false_positives.mdwn b/doc/bugs/brokenlinks_false_positives.mdwn
new file mode 100644
index 000000000..3fdc43c40
--- /dev/null
+++ b/doc/bugs/brokenlinks_false_positives.mdwn
@@ -0,0 +1,6 @@
+The [[plugins/brokenlinks]] plugin falsely complains that
+[[ikiwiki/formatting]] has a broken link to [[smileys]] if the smiley plugin
+is disabled. While the page links to it inside a
+conditional, and so doesn't show the link in this case, ikiwiki scans for
+links without looking at conditionals and so still thinks the page contains the
+link.
diff --git a/doc/bugs/bug_in_cgiurl_port.mdwn b/doc/bugs/bug_in_cgiurl_port.mdwn
new file mode 100644
index 000000000..373657814
--- /dev/null
+++ b/doc/bugs/bug_in_cgiurl_port.mdwn
@@ -0,0 +1,15 @@
+I think there's a bug in the code that determines if the cgiurl is relative
+to the url. If one has a different port than the other, they're not
+relative, and I hear Fil encountered an issue where the wrong port was then
+used. --[[Joey]]
+
+> I tested, setting cgiurl to a nonstandard port. After rebuilding,
+> pages used the full url. So I don't see a bug here, or am missing
+> something from my memory of the report (which was done the bad way, on
+> IRC). [[done]] --[[Joey]]
+
+> > Sorry about wittering on IRC instead of reporting proper bugs.
+> >
+> > The setup I have is nginx in front of apache, so that nginx is listening on port 80, apache is on port 81, and ikiwiki is being served by apache. After upgrading to 3.20120203 (backported to squeeze) I found that the URLs in the edit page all have the port set as :81 ... but now that I look at it more closely, that is the case for several ikiwiki-hosting controlled sites, but not for a few other sites that are also on the same machine, so it must be some difference between the settings for the sites, either in ikiwiki, or apache, or perhaps even nginx. Anyway, on the affected sites, explicitly including a port :80 in the cgiurl fixes the problem.
+
+> > So, for the moment, this bug report is a bit useless, until I find out what is causing the ikiwiki-hosting sites to be befuddled, so it should probably stay closed -[[fil]]
diff --git a/doc/bugs/bug_when_toggling_in_a_preview_page.mdwn b/doc/bugs/bug_when_toggling_in_a_preview_page.mdwn
new file mode 100644
index 000000000..2f1b5f68c
--- /dev/null
+++ b/doc/bugs/bug_when_toggling_in_a_preview_page.mdwn
@@ -0,0 +1,29 @@
+When toggling an item while being in a web-editing session in the *Preview* frame,
+you'll lose the context of the editing session and will be directed to the wiki's
+main page instead. --[[tschwinge]]
+
+Making toggles actually work in preview is hard: The toggle plugin uses
+a format hook to add javascript to the page, after htmlscrubber runs. Page
+preview does not currently run the format hook.
+
+I think that is not done because the format hook is supposed to get the
+entire html file contents, including the html head and body elements, and
+in the case of page preview, such a full page is not being generated,
+instead it's just inlining the previewed page into the edit form.
+
+If the format hook were called on this partial data, hooks that looked for
+body tags etc would break. OTOH, if in preview mode it were run on the
+whole edit form page, ones like toc that parse the page would have
+unexpected results, since they would also parse the edit form.
+
+(Also, if format were run in preview mode then plugins like linkmap, which
+generate object files in their format hook, would need to be changed to not
+do this during preview (to avoid preview mode writing files to the wiki).
+So the format hook would need to be passed a flag indicating preview mode.)
+
+So I don't see a good way to call the format hook in preview mode.
+Failing that, the best I can do is make the toggle plugin detect preview
+mode, and generate nonfunctional toggles that warn they're not toggleable
+in preview mode. I've [[done]] that, which also fixes the incidental issue of
+the toggle link pointing to the wrong place, which was due to the use of the
+&lt;base&gt; tag in the preview page template. --[[Joey]]
diff --git a/doc/bugs/bugfix_for:___34__mtn:_operation_canceled:_Broken_pipe__34_____40__patch__41__.mdwn b/doc/bugs/bugfix_for:___34__mtn:_operation_canceled:_Broken_pipe__34_____40__patch__41__.mdwn
new file mode 100644
index 000000000..b7f38fd29
--- /dev/null
+++ b/doc/bugs/bugfix_for:___34__mtn:_operation_canceled:_Broken_pipe__34_____40__patch__41__.mdwn
@@ -0,0 +1,24 @@
+When using monotone as revision control system, a "mtn: operation canceled: Broken pipe" message is printed. Reason is that, in a call to mtn, the pipe is closed before mtn has done all its output. This patch fixes the problem.
+
+ diff -up ikiwiki/IkiWiki/Plugin/monotone.pm.orig ikiwiki/IkiWiki/Plugin/monotone.pm
+ --- ikiwiki/IkiWiki/Plugin/monotone.pm.orig 2008-11-12 23:45:24.000000000 +0100
+ +++ ikiwiki/IkiWiki/Plugin/monotone.pm 2008-12-16 12:41:38.000000000 +0100
+ @@ -525,13 +525,12 @@ sub rcs_recentchanges ($) {
+ my $child = open(MTNLOG, "-|");
+ if (! $child) {
+ exec("mtn", "log", "--root=$config{mtnrootdir}", "--no-graph",
+ - "--brief") || error("mtn log failed to run");
+ + "--brief", "--last=$num") || error("mtn log failed to run");
+ }
+
+ - while (($num >= 0) and (my $line = <MTNLOG>)) {
+ + while (my $line = <MTNLOG>) {
+ if ($line =~ m/^($sha1_pattern)/) {
+ push @revs, $1;
+ - $num -= 1;
+ }
+ }
+ close MTNLOG || debug("mtn log exited $?");
+
+> Thanks for the patch, and for testing the monotone backend.
+> applied [[done]] --[[Joey]]
diff --git a/doc/bugs/build_fails_oddly_when_older_ikiwiki_is_installed.mdwn b/doc/bugs/build_fails_oddly_when_older_ikiwiki_is_installed.mdwn
new file mode 100644
index 000000000..7b252031b
--- /dev/null
+++ b/doc/bugs/build_fails_oddly_when_older_ikiwiki_is_installed.mdwn
@@ -0,0 +1,31 @@
+I got this failure when trying to build ikiwiki version 3.20100403:
+
+ $ perl Makefile.PL INSTALL_BASE=/opt/ikiwiki PREFIX=
+ Writing Makefile for IkiWiki
+ $ make
+
+*...snip...*
+
+ ./pm_filter /opt/ikiwiki 3.20100403 /opt/ikiwiki/lib/perl5 < ikiwiki.in > ikiwiki.out
+ chmod +x ikiwiki.out
+ ./pm_filter /opt/ikiwiki 3.20100403 /opt/ikiwiki/lib/perl5 < ikiwiki-transition.in > ikiwiki-transition.out
+ chmod +x ikiwiki-transition.out
+ ./pm_filter /opt/ikiwiki 3.20100403 /opt/ikiwiki/lib/perl5 < ikiwiki-calendar.in > ikiwiki-calendar.out
+ chmod +x ikiwiki-calendar.out
+ HOME=/home/me /usr/bin/perl -Iblib/lib ikiwiki.out -libdir . -dumpsetup ikiwiki.setup
+ Use of uninitialized value $IkiWiki::Setup::config{"setuptype"} in concatenation (.) or string at IkiWiki/Setup.pm line 53.
+ Can't locate IkiWiki/Setup/.pm in @INC (@INC contains: . /opt/ikiwiki/lib/perl5/i486-linux-gnu-thread-multi /opt/ikiwiki/lib/perl5 blib/lib /etc/perl /usr/local/lib/perl/5.10.1 /usr/local/share/perl/5.10.1 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.10 /usr/share/perl/5.10 /usr/local/lib/site_perl .) at (eval 35) line 3.
+
+ make: *** [ikiwiki.setup] Error 2
+
+Note that I had been trying to upgrade with an installed ikiwiki 3.20091114
+already in place under /opt/ikiwiki. The build does not fail for me
+if I first remove the old ikiwiki installation, nor does it fail with
+3.20100403 or newer installed at /opt/ikiwiki. Hence this is not
+really a critical bug, although it's somewhat perplexing to me why it
+ought to make a difference.
+
+> So, using INSTALL_BASE causes a 'use lib' to be hardcoded into the `.out`
+> files; which overrides the -libdir and the -I, and so the old version
+> of IkiWiki.pm is used.
+> [[fixed|done]] --[[Joey]]
diff --git a/doc/bugs/build_in_opensolaris.mdwn b/doc/bugs/build_in_opensolaris.mdwn
new file mode 100644
index 000000000..faf5d9980
--- /dev/null
+++ b/doc/bugs/build_in_opensolaris.mdwn
@@ -0,0 +1,74 @@
+I've learned I'm not yet clever enough to get IkiWiki to build in OpenSolaris (as running on a [Joyent Accelerator][ja]). Anyone figured this out already?
+
+I think problem lies mostly, if not entirely, in getting ikiwiki.cgi.c to compile in an OpenSolaris context (this is ikiwiki-2.2):
+
+> <code>$ ikiwiki --setup ~/etc/ikiwiki/ikiwiki-edit.setup
+> [...]
+> Error: failed to compile /home/username/domains/domain.tld/cgi-bin/ikiwiki.cgi.c at /opt/local/lib/perl5/site_perl/IkiWiki.pm line 104.
+> BEGIN failed--compilation aborted at (eval 3) line 145.</code>
+
+More specifically,
+
+> <code>$ /usr/sfw/bin/gcc ikiwiki.cgi.c
+> Undefined first referenced
+> symbol in file
+> asprintf /var/tmp//cczPaG7R.o
+> ld: fatal: Symbol referencing errors. No output written to a.out
+> collect2: ld returned 1 exit status</code>
+
+[ja]: <http://www.joyent.com/accelerator/technical-specifications/>
+
+Thanks, Joey et al., for a really cool tool.
+
+--Mike
+
+> Looks like the ikiwiki wrapper uses asprintf. glibc has that, and I think some other libc implementations have that, but apparently the Solaris libc does not. The same problem will come up on other platforms that don't use glibc. The ikiwiki wrapper needs to either avoid asprintf or use a portable asprintf implementation from somewhere like gnulib. --[[JoshTriplett]]
+
+>> I used asprintf because it was easy, and safe. That is a good reason for
+>> C libraries to support asprintf, IMHO. Note that both linux and *BSD
+>> support asprintf.
+>>
+>> Of the possible patches to make this more portable, I'd generally prefer
+>> one that uses portable functions (safely), rather than one that includes
+>> an asprintf implementation in ikiwiki. --[[Joey]]
+
+> I got ikiwiki working (sort of) on OpenSolaris today. I ran into this problem too, and wrote a version of asprintf() from scratch which uses more portable APIs:
+
+<code>
+ #include &lt;stdarg.h&gt;
+
+ int
+ asprintf(char **string_ptr, const char *format, ...)
+ {
+ va_list arg;
+ char *str;
+ int size;
+ int rv;
+
+ va_start(arg, format);
+ size = vsnprintf(NULL, 0, format, arg);
+ size++;
+ /* reusing a va_list requires va_end before the next va_start */
+ va_end(arg);
+ va_start(arg, format);
+ str = malloc(size);
+ if (str == NULL) {
+ va_end(arg);
+ /*
+ * Strictly speaking, GNU asprintf doesn't do this,
+ * but the caller isn't checking the return value.
+ */
+ fprintf(stderr, "failed to allocate memory\\n");
+ exit(1);
+ }
+ rv = vsnprintf(str, size, format, arg);
+ va_end(arg);
+
+ *string_ptr = str;
+ return (rv);
+ }
+
+</code>
+
+> I added this after the rest of the #include's in Wrapper.pm, and it seems to work. --Blake
+
+>> I have this marked [[bugs/done]] already, because I changed it in svn to
+>> not use asprintf --[[Joey]]
diff --git a/doc/bugs/bzr-update-syntax-error.mdwn b/doc/bugs/bzr-update-syntax-error.mdwn
new file mode 100644
index 000000000..bf715a29e
--- /dev/null
+++ b/doc/bugs/bzr-update-syntax-error.mdwn
@@ -0,0 +1,11 @@
+On line 46 of the `bzr` plugin there's a mistake. Instead of:
+
+ my @cmdline = ("bzr", $config{srcdir}, "update");
+
+It should be:
+
+ my @cmdline = ("bzr", "update", $config{srcdir});
+
+The former produces errors such as "_bzr: ERROR: unknown command "/home/user/ikiwiki/posts_", "_'bzr /home/user/ikiwiki/posts update' failed: Inappropriate ioctl for device at /usr/share/perl5/IkiWiki/Rcs/bzr.pm line 48._".
+
+[[done]], thanks --[[Joey]]
diff --git a/doc/bugs/bzr_2.0_breaks_bzr_plugin.mdwn b/doc/bugs/bzr_2.0_breaks_bzr_plugin.mdwn
new file mode 100644
index 000000000..39500af20
--- /dev/null
+++ b/doc/bugs/bzr_2.0_breaks_bzr_plugin.mdwn
@@ -0,0 +1,87 @@
+Version 2.0 of bzr seems to break the bzr plugin.
+
+I traced this to the bzr_log method in the plugin, and patching that seems to fix it. The plugin just needs to parse the input a little bit differently.
+--liw
+
+> Patch applied, [[done]] (but, it would be good if it could be tested with
+> an older bzr, and it's a pity bzr's human-targeted log has to be parsed,
+> I assume there is no machine-targeted version?) --[[Joey]]
+
+ From fb897114124e627fd3acf5af8e784c9a77419a81 Mon Sep 17 00:00:00 2001
+ From: Lars Wirzenius <liw@liw.fi>
+ Date: Sun, 4 Apr 2010 21:05:07 +1200
+ Subject: [PATCH] Fix bzr plugin to work with bzr 2.0.
+
+ The output of "bzr log" seems to have changed a bit, so we change the
+ parsing accordingly. This has not been tested with earlier versions of
+ bzr.
+
+ Several problems seemed to occur, all in the bzr_log subroutine:
+
+ 1. The @infos list would contain an empty hash, which would confuse the
+ rest of the program.
+ 2. This was because bzr_log would push an empty anonymous hash to the
+ list whenever it thought a new record would start.
+ 3. However, a new record marker (now?) also happens at the end of bzr log
+ output.
+ 4. Now we collect the record to a hash that gets pushed to the list only
+ if it is not empty.
+ 5. Also, sometimes bzr log outputs "revno: 1234 [merge]", so we catch only
+ the revision number.
+ 6. Finally, there may be non-headers at the end of the output, so we ignore
+ those.
+ ---
+ IkiWiki/Plugin/bzr.pm | 23 ++++++++++++++++-------
+ 1 files changed, 16 insertions(+), 7 deletions(-)
+
+ diff --git a/IkiWiki/Plugin/bzr.pm b/IkiWiki/Plugin/bzr.pm
+ index 1ffdc23..e813331 100644
+ --- a/IkiWiki/Plugin/bzr.pm
+ +++ b/IkiWiki/Plugin/bzr.pm
+ @@ -73,28 +73,37 @@ sub bzr_log ($) {
+ my @infos = ();
+ my $key = undef;
+
+ + my $hash = {};
+ while (<$out>) {
+ my $line = $_;
+ my ($value);
+ if ($line =~ /^message:/) {
+ $key = "message";
+ - $infos[$#infos]{$key} = "";
+ + $$hash{$key} = "";
+ }
+ elsif ($line =~ /^(modified|added|renamed|renamed and modified|removed):/) {
+ $key = "files";
+ - unless (defined($infos[$#infos]{$key})) { $infos[$#infos]{$key} = ""; }
+ + unless (defined($$hash{$key})) { $$hash{$key} = ""; }
+ }
+ elsif (defined($key) and $line =~ /^ (.*)/) {
+ - $infos[$#infos]{$key} .= "$1\n";
+ + $$hash{$key} .= "$1\n";
+ }
+ elsif ($line eq "------------------------------------------------------------\n") {
+ + if (keys %$hash) {
+ + push (@infos, $hash);
+ + }
+ + $hash = {};
+ $key = undef;
+ - push (@infos, {});
+ }
+ - else {
+ + elsif ($line =~ /: /) {
+ chomp $line;
+ - ($key, $value) = split /: +/, $line, 2;
+ - $infos[$#infos]{$key} = $value;
+ + if ($line =~ /^revno: (\d+)/) {
+ + $key = "revno";
+ + $value = $1;
+ + } else {
+ + ($key, $value) = split /: +/, $line, 2;
+ + }
+ + $$hash{$key} = $value;
+ }
+ }
+ close $out;
+ --
+ 1.7.0
diff --git a/doc/bugs/bzr_RecentChanges_dates_start_from_1969.mdwn b/doc/bugs/bzr_RecentChanges_dates_start_from_1969.mdwn
new file mode 100644
index 000000000..fa6e45b47
--- /dev/null
+++ b/doc/bugs/bzr_RecentChanges_dates_start_from_1969.mdwn
@@ -0,0 +1,16 @@
+Using bzr, the dates for changes on the RecentChanges page all start
+slightly before the Unix epoch.
+
+Changing line 249 of bzr.pm from
+
+` when => time - str2time($info->{"timestamp"}),`
+
+to
+
+` when => str2time($info->{"timestamp"}),`
+
+fixed this for me.
+
+> Weird, I wonder why it was written to return an absolute time like that
+> in the first place? Can't have ever been right. Fixed, thanks. --[[Joey]]
+> [[done]]
diff --git a/doc/bugs/bzr_plugin_does_not_define_rcs__95__diff.mdwn b/doc/bugs/bzr_plugin_does_not_define_rcs__95__diff.mdwn
new file mode 100644
index 000000000..0294ec62e
--- /dev/null
+++ b/doc/bugs/bzr_plugin_does_not_define_rcs__95__diff.mdwn
@@ -0,0 +1,27 @@
+The bzr plugin does not seem to define the rcs_diff subroutine.
+I got the follow error after enabling recentchangesdiff:
+
+"Undefined subroutine &IkiWiki::Plugin::bzr::rcs_diff called at /usr/share/perl5/IkiWiki.pm line 1590."
+
+Grepping to verify absence of rcs_diff:
+
+ $ grep rcs_diff /usr/share/perl5/IkiWiki/Plugin/{git,bzr}.pm
+ /usr/share/perl5/IkiWiki/Plugin/git.pm: hook(type => "rcs", id => "rcs_diff", call => \&rcs_diff);
+ /usr/share/perl5/IkiWiki/Plugin/git.pm:sub rcs_diff ($) {
+ /usr/share/perl5/IkiWiki/Plugin/bzr.pm: hook(type => "rcs", id => "rcs_diff", call => \&rcs_diff);
+
+> I've added the minimal stub needed to avoid the crash, but for
+> recentchangesdiff to work, someone needs to implement `rcs_diff` for bzr.
+> This should be trivial if you know and use bzr. The function
+> is passed as a parameter the revno of interest and just needs
+> to ask bzr for the diff between that and the previous version. --[[Joey]]
+
+>> I'll see if I can make a patch. The bzr command to get the revision would
+>> look like this: bzr diff -r revno:$PREV:/path/to/src..revno:$REVNO:/path/to/src
+>> (where $PREV would be $REVNO minus one). --liw
+
+>> Sorry, that was not entirely correct, for some reason. I'll add a patch below that
+>> seems to work. I am unfortunately not ready to set up a git repository that you
+>> can pull from. --liw
+
+[[done]] --[[Joey]]
diff --git a/doc/bugs/can__39__t_mix_template_vars_inside_directives.mdwn b/doc/bugs/can__39__t_mix_template_vars_inside_directives.mdwn
new file mode 100644
index 000000000..e91a8923d
--- /dev/null
+++ b/doc/bugs/can__39__t_mix_template_vars_inside_directives.mdwn
@@ -0,0 +1,61 @@
+I often find myself wrapping the same boiler plate around [[ikiwiki/directives/img]] img directives, so I tried to encapsulate it using the following [[ikiwiki/directives/template]]:
+
+
+ <div class="image">
+ [\[!img <TMPL_VAR raw_href>
+ size="<TMPL_VAR raw_size>"
+
+ <TMPL_IF alt>
+ alt="<TMPL_VAR raw_alt>"
+ <TMPL_ELSE>
+ <TMPL_IF caption>
+ alt="<TMPL_VAR raw_alt>"
+ <TMPL_ELSE>
+ alt="[pic]"
+ </TMPL_IF>
+ </TMPL_IF>
+
+ ]]
+ <TMPL_IF caption>
+ <p><TMPL_VAR raw_caption></p>
+ </TMPL_IF>
+ </div>
+
+The result, even with htmlscrubber disabled, is mangled, something like
+
+ <div class="image">
+ <span class="createlink"><a href="http://jmtd.net/cgi?
+ page=size&amp;from=log0.000000old_new_test&amp;do=create"
+ rel="nofollow">?</a>size</span>
+
+ </div>
+
+Any suggestions gladly received. -- [[Jon]]
+
+> Well, you *should* be able to do things like this, and in my testing, I
+> *can*. I used your exact example above (removing the backslash escape)
+> and invoked it as:
+> \[[!template id=test href=himom.png size=100x]]
+>
+> And got just what you would expect.
+>
+> I don't know what went wrong for you, but I don't see a bug here.
+> My guess, at the moment, is that you didn't specify the required href
+> and size parameters when using the template. If I leave those off,
+> I of course reproduce what you reported, since the img directive gets
+> called with no filename, and so assumes the size parameter is the image
+> to display.. [[done]]? --[[Joey]]
+
+>> Hmm, eek. Just double-checked, and done a full rebuild. No dice! Version 3.20100831. Feel free to leave this marked done; it probably *is* PEBKAC. I shall look again in the daytime. -- [[Jon]]
+
+>>> As always, if you'd like to mail me a larger test case that reproduces a
+>>> problem for you, I can take a look at it. --[[Joey]]
+
+>>>> <s>Thank you for the offer. I might still take you up on it. I've just proven that this
+>>>> does work for a clean repo / bare bones test case. -- [[Jon]]</s> Figured it out. The
+>>>> problem was I'd copied a page (old_new) which had two images embedded in it to test.
+>>>> I'd stored the images under a subdir "old_new". The new page was called "old_new_test"
+>>>> and the images thus could not be found by a pagespec "some-image.jpg". Adjusting the
+>>>> href argument to the template (consequently the src argument to img) to
+>>>> "old_new/some-image.jpg" fixed it all. [[done]], PEBKAC. Thank you for your time :)
+>>>> -- [[Jon]]
diff --git a/doc/bugs/cannot_clone_documented_git_repo.mdwn b/doc/bugs/cannot_clone_documented_git_repo.mdwn
new file mode 100644
index 000000000..4f2ec66f3
--- /dev/null
+++ b/doc/bugs/cannot_clone_documented_git_repo.mdwn
@@ -0,0 +1,16 @@
+ smcv@vasks:~$ git clone git://git.ikiwiki.info/
+ Cloning into git.ikiwiki.info...
+ fatal: read error: Connection reset by peer
+
+I tried this from a UK consumer ISP, my virtual server in the
+UK, and vasks (aka alioth.debian.org) in the Netherlands,
+with the same results. I can't update my clone from `origin`
+either; for the moment I'm using the github mirror instead.
+--[[smcv]]
+
+> Strange.. The git-daemon was not running, but one child was running
+> waiting on an upload-pack, but not accepting new connections. Nothing
+> in the logs about what happened to the parent. The monitor that checks
+> services are running was satisfied with the child.. I've made it
+> restart if the parent pid is no longer running, which should avoid
+> this problem in the future. --[[Joey]] [[done]]
diff --git a/doc/bugs/cannot_preview_shortcuts.mdwn b/doc/bugs/cannot_preview_shortcuts.mdwn
new file mode 100644
index 000000000..d7045b2dc
--- /dev/null
+++ b/doc/bugs/cannot_preview_shortcuts.mdwn
@@ -0,0 +1,17 @@
+Shortcuts such as \[[!google foo]] do not work when previewing pages.
+--[[JasonBlevins]]
+
+> Broken during the setup dumping changes, now fixed. --[[Joey]] [[done]]
+
+>> Just a quick note that this fix interacts with the way the `listdirectives`
+>> directive gets its list of non-shortcut directives. At the moment it
+>> still works, but it relies on the fact that the `listdirectives` `checkconfig`
+>> hook is called before the `shortcut` `checkconfig` hook.
+>> -- [[Will]]
+
+>> The order plugins are loaded is effectively random. (`keys %hooks`).
+>> So I've made shortcuts pass a 'shortcut' parameter when registering
+>> them, which listdirectives can grep out of the full list of directives.
+>> That may not be the best name to give it, especially if other plugins
+>> generate directives too. Seemed better than forcing shortcut's
+>> checkconfig hook to run last tho. --[[Joey]]
diff --git a/doc/bugs/cannot_reliably_use_meta_in_template.mdwn b/doc/bugs/cannot_reliably_use_meta_in_template.mdwn
new file mode 100644
index 000000000..de6c227f6
--- /dev/null
+++ b/doc/bugs/cannot_reliably_use_meta_in_template.mdwn
@@ -0,0 +1,18 @@
+meta title cannot reliably be put inside a template used by the
+[[plugins/template]] plugin. Since meta title info is gathered in the scan
+pass, which does not look at the template a page includes, it will not be
+seen then, and so other pages that use the page title probably won't use
+it. (Barring luck with build order.)
+
+Update: This also affects using tags from templates.
+
+There is a simple fix for this, just add `scan => 1` when registering the
+preprocess hook for the template plugin.
+
+However, the overhead of this has to be considered. It means that, on every
+scan pass, every page containing a template will cause the template to be
+loaded and filled out. This can be some serious additional overhead.
+
+--[[Joey]]
+
+[[done]]
diff --git a/doc/bugs/cannot_revert_page_deletion.mdwn b/doc/bugs/cannot_revert_page_deletion.mdwn
new file mode 100644
index 000000000..651b4d0ec
--- /dev/null
+++ b/doc/bugs/cannot_revert_page_deletion.mdwn
@@ -0,0 +1,10 @@
+After deleting a page with the "remove" button, it seems that the page deletion cannot be reverted using the "revert" icon in [[RecentChanges]].
+It ironically says that "Error: ?$pagename does not exist". See [[http://ikiwiki.info/ikiwiki.cgi?rev=860c2c84d98ea0a38a4f91dacef6d4e09f6e6c2e&do=revert]]. [[JeanPrivat]]
+
+> And it only gets that far if the remove plugin is enabled. Otherwise it
+> complains that you cannot change $pagename.
+>
+> The root bug is that git's `rcs_preprevert` creates a structure that
+> shows the change that was made (which includes a file deletion),
+> not the change that would be made if it was reverted (which includes a
+> file addition). [[Fixed|done]]. --[[Joey]]
diff --git a/doc/bugs/capitalized_attachment_names.mdwn b/doc/bugs/capitalized_attachment_names.mdwn
new file mode 100644
index 000000000..b10781bf7
--- /dev/null
+++ b/doc/bugs/capitalized_attachment_names.mdwn
@@ -0,0 +1,14 @@
+Given an uploaded image via: \[\[!img NAME.svg alt="image"\]\]
+
+Viewing the generated page shows the following error:
+
+"\[\[!img Error: failed to read name.svg: Exception 420: no decode delegate for this image format `/home/user/path/name.svg' @ error/svg.c/ReadSVGImage/2815\]\]"
+
+The capital letters in the image name were somehow converted to lowercase, and then the image was saved as a directory. Very puzzling.
+I get the same error when image names are all lowercase.
+
+The error also occurs with png images.
+
+How do I fix this?
+
+Later investigation ... I got around the problem by creating the mark-up in a new directory. However, if I try to create a new directory with the same name as the directory containing the problem code, the problem re-emerges -- the old directory is apparently not overwritten. Perhaps this is an issue with the git storage.
diff --git a/doc/bugs/cgi_does_not_use_templatedir_overlay.mdwn b/doc/bugs/cgi_does_not_use_templatedir_overlay.mdwn
new file mode 100644
index 000000000..d86a3ac3e
--- /dev/null
+++ b/doc/bugs/cgi_does_not_use_templatedir_overlay.mdwn
@@ -0,0 +1,26 @@
+I have set
+
+ templatedir => "/srv/apache2/madduck.net/templates",
+
+in ikiwiki.setup and put a custom ``page.tmpl`` in there, then called ``ikiwiki --setup`` and verified that it works. It even works when I push to the Git repo and let the receive-hook update the wiki.
+
+However, when I make a change via the CGI (which has been created by the last setup run), it applies the default ``page.tmpl`` file to all pages it updates.
+
+> This issue can arise in at least two ways:
+>
+> 1. A permissions problem with the templatedir that prevents ikiwiki from
+> accessing it. If it can't access it, it silently falls back to using
+> templates from the default directory.
+> 2. A templatedir that doesn't have an absolute path. In this case ikiwiki
+> will look relative to *somewhere*, which will sometimes work and
+> sometimes not. Clearly not a good idea.
+>
+> So far that's the only ways that I can see that this could happen.
+> It would be possible to make ikiwiki try to detect these sorts of
+> problems; it could check if the templatedir exists, and check that it's
+> readable. This would add some extra system calls to every ikiwiki run,
+> and I'm not convinced it's worth it. --[[Joey]]
+
+>> Closing this bug since I never heard back that it was not one
+>> of the above two problems, and I consider both problems local
+>> configuration errors. --[[Joey]] [[done]]
diff --git a/doc/bugs/cgi_wrapper_always_regenerated.mdwn b/doc/bugs/cgi_wrapper_always_regenerated.mdwn
new file mode 100644
index 000000000..76819422b
--- /dev/null
+++ b/doc/bugs/cgi_wrapper_always_regenerated.mdwn
@@ -0,0 +1,16 @@
+I am using --setup with a configuration file that enables my CGI wrapper. This works well. This same configuration also defines the locations of my source and my destination website.
+
+Whenever I run ikiwiki with --setup using this same configuration, my cgi wrapper is regenerated each time. It says my cgi is generated and the file has a new timestamp, but md5 shows the contents never changed.
+
+Should I use a different config? This is confusing because if I use a config without my cgi wrapper defined, I still have my left-over ikiwiki.cgi in place (so CGI is still enabled).
+
+It seems wasteful to update it each time when my goal is just to create the HTML pages (since the CGI didn't generate them), as noted in my other bug report.
+
+> If ikiwiki is run in refresh mode, it won't regenerate the wrapper. You
+> want to run it in refresh mode, because it's also wasteful to rebuild all
+> the unchanged pages, which is done by default when setting up a wiki with
+> --setup.
+>
+> Example of refresh mode: `ikiwiki -setup ikiwiki.setup --refresh`
+>
+> Improved the docs slightly, so I'll call this [[bugs/done]] --[[Joey]]
diff --git a/doc/bugs/class_parameter_of_img_directive_behave_not_as_documented.mdwn b/doc/bugs/class_parameter_of_img_directive_behave_not_as_documented.mdwn
new file mode 100644
index 000000000..e7797765f
--- /dev/null
+++ b/doc/bugs/class_parameter_of_img_directive_behave_not_as_documented.mdwn
@@ -0,0 +1,31 @@
+On [[ikiwiki/directive/img/]] I read that
+
+> You can also pass alt, title, class, align, id, hspace, and vspace
+> parameters. These are passed through unchanged to the html img tag.
+
+but when I pass `class="myclass"` to an img directive, I obtain
+
+ <img class="myclass img" ...
+
+I found that this behaviour was added in commit f6db10d:
+
+> img: Add a margin around images displayed by this directive.
+>
+> Particularly important for floating images, which could before be placed
+> uncomfortably close to text.
+
+which adds to img.pm:
+
+ if (exists $params{class}) {
+ $params{class}.=" img";
+ }
+ else {
+ $params{class}="img";
+ }
+
+I would prefer if the `img` class were only added if no class attribute is
+passed.
+
+If you keep the current behaviour, please document it.
+
+> [[done]] --[[Joey]]
diff --git a/doc/bugs/clearenv_not_present_at_FreeBSD_.mdwn b/doc/bugs/clearenv_not_present_at_FreeBSD_.mdwn
new file mode 100644
index 000000000..f38c86e03
--- /dev/null
+++ b/doc/bugs/clearenv_not_present_at_FreeBSD_.mdwn
@@ -0,0 +1,5 @@
+When building the wrapper on a FreeBSD system, an error occurs with the clearenv reference. clearenv() does not exist on FreeBSD; use the workaround environ[0]=NULL;
+P.S. new git installation, FreeBSD 7.x
+
+> `#include <stupid-standards.h>` fixed with nasty ifdefs to handle tcc w/o
+> breaking everything else. [[done]] --[[Joey]]
diff --git a/doc/bugs/clearenv_not_present_at_FreeBSD_/discussion.mdwn b/doc/bugs/clearenv_not_present_at_FreeBSD_/discussion.mdwn
new file mode 100644
index 000000000..713198b61
--- /dev/null
+++ b/doc/bugs/clearenv_not_present_at_FreeBSD_/discussion.mdwn
@@ -0,0 +1 @@
+Mmmm... I see. But it does not set up under FreeBSD without magic manual passes.
diff --git a/doc/bugs/clearing_email_in_prefs.mdwn b/doc/bugs/clearing_email_in_prefs.mdwn
new file mode 100644
index 000000000..f03d9d902
--- /dev/null
+++ b/doc/bugs/clearing_email_in_prefs.mdwn
@@ -0,0 +1,5 @@
+If a user has an email address in their prefs, and they try to clear it,
+the prefs seem to save ok, and the email seems to be gone, but reloading
+the prefs page will reveal the old email address.
+
+Also affected subscriptions.. [[done]]
diff --git a/doc/bugs/colon:problem.mdwn b/doc/bugs/colon:problem.mdwn
new file mode 100644
index 000000000..41f1624b2
--- /dev/null
+++ b/doc/bugs/colon:problem.mdwn
@@ -0,0 +1,12 @@
+> Joey, please fix the colon in page name of my report. Ikiwiki sets
+> "attachment:\_failed\_to\_get\_filehandle/" URL on "Bugs" page and
+> the report is not clickable in my Epiphany browser:
+
+> Firefox doesn't know how to open this address, because the protocol
+> (attachment) isn't associated with any program.
+
+> I can only edit it :) Bad handling ':' character by Ikiwiki is probably
+> its another bug.
+> --[[Paweł|ptecza]]
+
+> [[fixed|done]] in git (but fix is not live on here as of this writing) --[[Joey]]
diff --git a/doc/bugs/colon:problem/discussion.mdwn b/doc/bugs/colon:problem/discussion.mdwn
new file mode 100644
index 000000000..e2856ed95
--- /dev/null
+++ b/doc/bugs/colon:problem/discussion.mdwn
@@ -0,0 +1 @@
+testing a link to [[colon:problem]]
diff --git a/doc/bugs/comments_appear_two_times.mdwn b/doc/bugs/comments_appear_two_times.mdwn
new file mode 100644
index 000000000..2ae081844
--- /dev/null
+++ b/doc/bugs/comments_appear_two_times.mdwn
@@ -0,0 +1,24 @@
+When a comment is added to page named "directory/page" it also appears in the page "directory".
+
+This seems to happen at least with versions 3.20100815.6 and 3.20110225. It didn't happen in a version from about a year ago. I created a testing ikiwiki installation demonstrating this bug. The same comment can be seen at <http://rtime.felk.cvut.cz/~sojka/blog/posts/directory/post/> and at <http://rtime.felk.cvut.cz/~sojka/blog/posts/directory/>. The corresponding git repo can be cloned by
+
+ git clone git://rtime.felk.cvut.cz/~sojka/blog.git
+
+> Unfortunately, that git repo seems to be empty.
+> Perhaps you forgot to push to it? Thank you for working
+> to provide a way to reproduce this!
+>
+> Myself, I cannot reproduce it. Eg, my blog has all posts
+> under <http://kitenet.net/~joey/blog/entry/>, but that page
+> shows none of the comments to my blog posts. And here on ikiwiki.info,
+> posts on the forum have comments, but they don't show up as comments
+> to the [[forum]] page.
+> --[[Joey]]
+
+>> The repo can be cloned now. There was a problem with permissions. --[[wentasah]]
+
+>>> I see the bug now. Probably most configs hide it by setting
+>>> `comments_pagespec` more tightly. It was introduced by
+>>> d9d910f6765de6ba07508ab56a5a0f93edb4c8ad, and/or later
+>>> changes to actually use the `comments()` PageSpec.
+>>> Fixed in git! [[done]] --[[Joey]]
diff --git a/doc/bugs/comments_not_searchable.mdwn b/doc/bugs/comments_not_searchable.mdwn
new file mode 100644
index 000000000..6fda89bd2
--- /dev/null
+++ b/doc/bugs/comments_not_searchable.mdwn
@@ -0,0 +1,19 @@
+The text of comments (and other internal pages) does not get indexed by the
+search plugin.
+
+Search indexes content passed to the postscan hook.
+Comments are inlined, but inline's speed hack avoids adding inlined
+content to the page until the format hook.
+
+And hmm, that's somewhat desirable, because we don't want searches
+to find content that is inlined onto another page.
+
+That suggests that the fix could be to call the postscan hook
+for internal pages.
+
+However, the search postscan hook tells xapian the page url,
+and uses `urlto($page)` to do it. And that won't work for
+an internal page. Guess it could be modified to tell xapian the
+permalink. --[[Joey]]
+
+> [[done]] --[[Joey]]
diff --git a/doc/bugs/comments_preview_unsafe_with_allowdirectives.mdwn b/doc/bugs/comments_preview_unsafe_with_allowdirectives.mdwn
new file mode 100644
index 000000000..7f9fb67e9
--- /dev/null
+++ b/doc/bugs/comments_preview_unsafe_with_allowdirectives.mdwn
@@ -0,0 +1,8 @@
+If `comments_allowdirectives` is set, previewing a comment can run
+directives that create files. (Eg, img.) Unlike editpage, it does not
+keep track of those files and expire them. So the files will linger in
+destdir forever.
+
+Probably when the user then tries to save the comment, ikiwiki will refuse
+to overwrite the unknown file, and will crash.
+--[[Joey]]
diff --git a/doc/bugs/comments_produce_broken_links_in_RecentChanges.mdwn b/doc/bugs/comments_produce_broken_links_in_RecentChanges.mdwn
new file mode 100644
index 000000000..dae00857b
--- /dev/null
+++ b/doc/bugs/comments_produce_broken_links_in_RecentChanges.mdwn
@@ -0,0 +1,9 @@
+Comments produce links like `sandbox/comment_1` in [[RecentChanges]], which,
+when clicked, redirect to a page that does not exist.
+
+The `recentchanges` branch in my repository contains one possible [[patch]],
+which causes the CGI to go to the [[ikiwiki/directive/meta]] `permalink`, if
+any, if the page is internal (comments do have a permalink).
+
+> [[done]].. I had thought about not showing internal page changes at
+> all, but I like your approach better --[[Joey]]
diff --git a/doc/bugs/complex_wiki-code___40__braces__41___in_wikilink-text_breaks_wikilinks.mdwn b/doc/bugs/complex_wiki-code___40__braces__41___in_wikilink-text_breaks_wikilinks.mdwn
new file mode 100644
index 000000000..780e695c2
--- /dev/null
+++ b/doc/bugs/complex_wiki-code___40__braces__41___in_wikilink-text_breaks_wikilinks.mdwn
@@ -0,0 +1,9 @@
+Example:
+<pre>
+[[`\[[!taglink TAG\]\]`|plugins/tag]]
+</pre>
+gives:
+
+[[`\[[!taglink TAG\]\]`|plugins/tag]]
+
+Expected: there is a [[ikiwiki/wikilink]] with the complex text as the displayed text. --Ivan Z.
diff --git a/doc/bugs/conditional_preprocess_during_scan.mdwn b/doc/bugs/conditional_preprocess_during_scan.mdwn
new file mode 100644
index 000000000..1ba142331
--- /dev/null
+++ b/doc/bugs/conditional_preprocess_during_scan.mdwn
@@ -0,0 +1,57 @@
+[[!template id=gitbranch branch=GiuseppeBilotta/scanif author="[[GiuseppeBilotta]]"]]
+
+When a directive that should be run during scan preprocessing is inside
+an if directive, it doesn't get called because the if preprocessing does
+not run during scan.
+
+I've written a simple [[patch]] to fix the issue, currently hosted on the
+scanif branch of my repository. The patch also passes the preview option
+back to the Ikiwiki::preprocess call, making sure that whatever is being
+reprocessed is done so in the same conditions as the original call.
+
+> One problem with this is that it has the same dependency-ordering problems
+> as inline-based or pagespec-based trails with my [[plugins/contrib/trail]]
+> plugin: `if` takes a pagespec, but pagespecs aren't guaranteed to match
+> correctly until everything has been scanned (for instance, `link()` might
+> give the wrong results because a page that added or deleted a link hasn't
+> been scanned yet). If you have a clever idea for how to fix this, I'd love
+> to hear it - being able to specify a [[plugins/contrib/trail]] in terms
+> of a sorted pagespec would be useful. --[[smcv]]
+
+>> I have a solution to the dependency-ordering problem in a different
+>> branch of my repository, with a post_scan hook mechanism which I use to
+>> be able to sort outer inline pages according to the last modification
+>> date of their nested inline pages. The way I implemented it currently,
+>> though, doesn't use the existing hooks mechanism of ikiwiki (because
+>> it's something which I believe to be more efficiently done the way I
+>> implemented it) so I don't know how likely it is to be included
+>> upstream.
+
+>> For what it's worth, I think that my post_scan hook mechanism would work
+>> rather fine with your trail plugin.
+
+>>> We discussed this on IRC, and I think it's actually more complicated
+>>> than that: the branch to sort by newest inlined entry wants a
+>>> "pagespecs now work" hook, whereas for trail I want a "sorting now
+>>> works" hook:
+>>>
+>>> * scan
+>>> * pagespecs now work (post-scan)
+>>> * Giuseppe's version of inline can decide what each inline
+>>> contains, and thus decide where they go in `inline(mtime)`
+>>> order
+>>> * pagespecs and sorting now work (pre-render)
+>>> * my trail plugin can decide what each trail contains, and
+>>> also sort them in the right order (which might be
+>>> `inline(mtime)`, so might be undefined until pagespecs work)
+>>> * render
+>>>
+>>> --[[smcv]]
+
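The phase ordering sketched above can be modeled with a tiny hook runner, purely as an illustration (this is not ikiwiki's actual hook API; the phase names follow the list above): hooks registered for a later phase run only once every earlier phase has finished, so a "pagespecs now work" hook sees a complete scan, and a "sorting now works" hook sees both.

```python
# Illustrative model of the proposed phase ordering: scan, then
# post-scan (pagespecs work), then pre-render (pagespecs and sorting
# work), then render. Not ikiwiki's real API.

PHASES = ["scan", "post-scan", "pre-render", "render"]

class HookRunner:
    def __init__(self):
        self.hooks = {phase: [] for phase in PHASES}

    def register(self, phase, fn):
        self.hooks[phase].append(fn)

    def run(self):
        # Run each phase to completion before starting the next, so a
        # post-scan hook can rely on all scan hooks having finished.
        for phase in PHASES:
            for fn in self.hooks[phase]:
                fn()
```

Under this model, the inline-sorting branch would register at "post-scan" and the trail plugin at "pre-render", regardless of registration order.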
+>> However, the case of the if
+>> directive is considerably more complicated, because the conditional
+>> can introduce a much stronger feedback effect in the pre/post scanning
+>> dependency. In fact, it's probably possible to build a couple of pages
+>> with vicious conditional dependency circles that would break/unbreak
+>> depending on which pass we are in. And I believe this is an intrinsic
+>> limitation of the system, which cannot be solved at all.
diff --git a/doc/bugs/conflicts.mdwn b/doc/bugs/conflicts.mdwn
new file mode 100644
index 000000000..bef0f54cd
--- /dev/null
+++ b/doc/bugs/conflicts.mdwn
@@ -0,0 +1,32 @@
+The `conflicts` testcase has 4 failing test cases. The underlying problem
+is that there are multiple possible source files that can create the same
+destination files.
+
+1. `foo.mdwn` is in srcdir, rendered to destdir. Then
+ it is removed, and `foo` is added, which will be rendered
+ raw to destdir. Since the `foo/` directory still exists,
+ it fails.
+1. `foo` is added to srcdir, rendered raw to destdir.
+ Then it is removed from srcdir, and `foo.mdwn` is added.
+ The `foo` file is still present in the destdir, and mkdir
+ of the directory `foo/` fails.
+1. `foo.mdwn` renders to `foo/index.html`. Then `foo/index.html`
+ is added to the srcdir, using rawhtml. It renders to the same
+ thing.
+1. `foo/index.html` in srcdir is rendered to same thing in destdir
+ using rawhtml. Then `foo.mdwn` is added; renders same thing.
+
+Note that another case, that of page `foo.mdwn` and page `foo.txt`, that
+both render to `foo/index.html`, used to cause problems, but no longer
+crashes ikiwiki. It now only complains in this situation, and which
+file "wins" is undefined. The fix for this relied on both pages being
+named `foo`; but in the above cases, the source files have different
+pagenames.
+
+One approach: Beef up checking in `will_render` to detect when the same
+destination file is rendered by multiple pages. Or when one page renders
+a file that is a parent directory of the rendered file of another page.
+It could warn, rather than erroring. The last page rendered would "win";
+generating the destdir file.
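The check proposed above can be sketched as a destination-to-source map, warning on collisions instead of erroring (a hypothetical model in Python, not ikiwiki's actual `will_render` code; all names here are illustrative):

```python
# Sketch of the proposed will_render check: track which source page
# claims each destination file, warn when two pages claim the same
# file (or when one output nests under another), and let the last
# page rendered "win". Hypothetical model, not ikiwiki's code.

def make_render_tracker():
    dest_owner = {}   # destination file -> source page rendering it
    warnings = []

    def will_render(page, dest):
        # Two pages rendering the same destination file: warn, don't error.
        if dest in dest_owner and dest_owner[dest] != page:
            warnings.append(f"{page} and {dest_owner[dest]} both render {dest}")
        # One page's output nested under another page's output file,
        # e.g. `foo` rendered raw vs `foo/index.html`.
        for other in dest_owner:
            if other != dest and (other.startswith(dest + "/")
                                  or dest.startswith(other + "/")):
                warnings.append(f"{dest} conflicts with {other}")
        dest_owner[dest] = page   # last page rendered wins

    return will_render, dest_owner, warnings
```

This covers both failure modes in the list above: the same-file case and the file-versus-directory case.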
+
+[[done]]
diff --git a/doc/bugs/correct_published_and_updated_time_information_for_the_feeds.mdwn b/doc/bugs/correct_published_and_updated_time_information_for_the_feeds.mdwn
new file mode 100644
index 000000000..565f3b16c
--- /dev/null
+++ b/doc/bugs/correct_published_and_updated_time_information_for_the_feeds.mdwn
@@ -0,0 +1,113 @@
+In [Atom](http://www.ietf.org/rfc/rfc4287.txt), we can provide `published` and `updated` information.
+In [RSS](http://cyber.law.harvard.edu/rss/rss.html), there is only `pubDate`, for the
+publication date, but an update can be mentioned with the [`dc:modified`](http://www.ietf.org/rfc/rfc2413.txt)
+element (whose datetime format is [iso 8601](http://www.w3.org/TR/NOTE-datetime)).
+This patch updates :) `inline.pm` and the two relevant templates.
+
+> I tested a slightly modified patch, which I've put below for now.
+> feedvalidator.org complains that dc:modified is not a known element. I'll
+> bet some header needs to be added to make the dublin core stuff available.
+> The atom feeds seem ok. --[[Joey]]
+
+<pre>
+Index: debian/changelog
+===================================================================
+--- debian/changelog (revision 4066)
++++ debian/changelog (working copy)
+@@ -15,8 +15,11 @@
+ * Updated French translation from Cyril Brulebois. Closes: #437181
+ * The toc directive doesn't work well or make sense inside an inlined page.
+ Disable it when the page with the toc is nested inside another page.
++ * Apply a patch from NicolasLimare adding modification date tags to rss and
++ atom feeds, and also changing the publication time for a feed to the
++    newest modification time (was newest creation time).
+
+- -- Joey Hess <joeyh@debian.org> Sat, 11 Aug 2007 17:40:45 -0400
++ -- Joey Hess <joeyh@debian.org> Sat, 11 Aug 2007 18:25:28 -0400
+
+ ikiwiki (2.5) unstable; urgency=low
+
+Index: templates/atomitem.tmpl
+===================================================================
+--- templates/atomitem.tmpl (revision 4066)
++++ templates/atomitem.tmpl (working copy)
+@@ -11,7 +11,8 @@
+ <category term="<TMPL_VAR CATEGORY>" />
+ </TMPL_LOOP>
+ </TMPL_IF>
+- <updated><TMPL_VAR DATE_3339></updated>
++ <updated><TMPL_VAR MDATE_3339></updated>
++ <published><TMPL_VAR CDATE_3339></published>
+ <TMPL_IF NAME="ENCLOSURE">
+ <link rel="enclosure" type="<TMPL_VAR TYPE>" href="<TMPL_VAR ENCLOSURE>" length="<TMPL_VAR LENGTH>" />
+ <TMPL_ELSE>
+Index: templates/rssitem.tmpl
+===================================================================
+--- templates/rssitem.tmpl (revision 4066)
++++ templates/rssitem.tmpl (working copy)
+@@ -12,7 +12,8 @@
+ <category><TMPL_VAR CATEGORY></category>
+ </TMPL_LOOP>
+ </TMPL_IF>
+- <pubDate><TMPL_VAR DATE_822></pubDate>
++ <pubDate><TMPL_VAR CDATE_822></pubDate>
++ <dc:modified><TMPL_VAR MDATE_3339></dc:modified>
+ <TMPL_IF NAME="ENCLOSURE">
+ <enclosure url="<TMPL_VAR ENCLOSURE>" type="<TMPL_VAR TYPE>" length="<TMPL_VAR LENGTH>" />
+ <TMPL_ELSE>
+Index: IkiWiki/Plugin/inline.pm
+===================================================================
+--- IkiWiki/Plugin/inline.pm (revision 4066)
++++ IkiWiki/Plugin/inline.pm (working copy)
+@@ -361,8 +361,10 @@
+ title => pagetitle(basename($p)),
+ url => $u,
+ permalink => $u,
+- date_822 => date_822($pagectime{$p}),
+- date_3339 => date_3339($pagectime{$p}),
++ cdate_822 => date_822($pagectime{$p}),
++ mdate_822 => date_822($pagemtime{$p}),
++ cdate_3339 => date_3339($pagectime{$p}),
++ mdate_3339 => date_3339($pagemtime{$p}),
+ );
+
+ if ($itemtemplate->query(name => "enclosure")) {
+@@ -397,7 +399,7 @@
+ $content.=$itemtemplate->output;
+ $itemtemplate->clear_params;
+
+- $lasttime = $pagectime{$p} if $pagectime{$p} > $lasttime;
++ $lasttime = $pagemtime{$p} if $pagemtime{$p} > $lasttime;
+ }
+
+ my $template=template($feedtype."page.tmpl", blind_cache => 1);
+</pre>
+
+
+
+>> Yes, I noticed the bug today; the correct (tested on feedvalidator) rssitem.tmpl template must start with the following content:
+
+ <item>
+ <TMPL_IF NAME="AUTHOR">
+ <title><TMPL_VAR AUTHOR ESCAPE=HTML>: <TMPL_VAR TITLE></title>
+ <dcterms:creator><TMPL_VAR AUTHOR ESCAPE=HTML></dcterms:creator>
+ <TMPL_ELSE>
+ <title><TMPL_VAR TITLE></title>
+ </TMPL_IF>
+ <dcterms:modified><TMPL_VAR MDATE_3339></dcterms:modified>
+ <dcterms:created><TMPL_VAR DATE_3339></dcterms:created>
+ ....
+
+>> and rsspage.tmpl must start with:
+
+ <?xml version="1.0"?>
+ <rss version="2.0"
+ xmlns:dc="http://purl.org/dc/elements/1.1/"
+ xmlns:dcterms="http://purl.org/dc/terms/" >
+ ....
+
+>> — [[NicolasLimare]]
+
+[[done]] --[[Joey]]
+
+[[!tag patch]]
diff --git a/doc/bugs/creating_page_from_comment_creates_a_comment.mdwn b/doc/bugs/creating_page_from_comment_creates_a_comment.mdwn
new file mode 100644
index 000000000..0eff756de
--- /dev/null
+++ b/doc/bugs/creating_page_from_comment_creates_a_comment.mdwn
@@ -0,0 +1,9 @@
+If a comment contains a WikiLink, for a page that doesn't exist, and the
+user clicks on the edit link, and creates the page, it will itself be saved
+as a comment, with "._comment" extension.
+
+This is very surprising and wrong behavior. The page editor tries to
+preserve the linking page's format type, but it shouldn't do so if the page
+is an internal page. --[[Joey]]
+
+[[done]] --[[Joey]]
diff --git a/doc/bugs/cutpaste.pm:_missing_filter_call.mdwn b/doc/bugs/cutpaste.pm:_missing_filter_call.mdwn
new file mode 100644
index 000000000..4b22fd06c
--- /dev/null
+++ b/doc/bugs/cutpaste.pm:_missing_filter_call.mdwn
@@ -0,0 +1,55 @@
+Consider this:
+
+ $ wget http://schwinge.homeip.net/~thomas/tmp/cutpaste_filter.tar.bz2
+ $ wget http://schwinge.homeip.net/~thomas/tmp/cutpaste_filter.patch
+
+ $ tar -xj < cutpaste_filter.tar.bz2
+ $ cd cutpaste_filter/
+ $ ./render_locally
+ $ find "$PWD".rendered/ -type f -print0 | xargs -0 grep -H -E 'FOO|BAR'
+ [notice one FOO in there]
+ $ rm -rf .ikiwiki "$PWD".rendered
+
+ $ cp /usr/share/perl5/IkiWiki/Plugin/cutpaste.pm .library/IkiWiki/Plugin/
+ $ patch -p0 < ../cutpaste_filter.patch
+ $ ./render_locally
+ $ find "$PWD".rendered/ -type f -print0 | xargs -0 grep -H -E 'FOO|BAR'
+ [correct; notice no more FOO]
+
+I guess this needs a general audit -- there are other places where `preprocess`
+is being done without `filter`ing first, for example the `copy` function
+in the same file.
+
+--[[tschwinge]]
+
+> So, in English, page text inside a cut directive will not be filtered.
+> Because the cut directive takes the text during the scan pass, before
+> filtering happens.
+>
+> Commit 192ce7a238af9021b0fd6dd571f22409af81ebaf and
+> [[bugs/po_vs_templates]] has to do with this.
+> There I decided that filter hooks should *only* act on the complete
+> text of a page.
+>
+> I also suggested that anything that wants to reliably
+> s/FOO/BAR/ should probably use a sanitize hook, not a filter hook.
+> I think that would make sense in this example.
+>
+> I don't see any way to make cut text be filtered while satisfying these
+> constraints, without removing cutpaste's ability to have forward pastes
+> of text cut later in the page. (That does seem like an increasingly
+> bad idea..) --[[Joey]]
+
+> > OK -- so the FOO/BAR thing was only a very stripped-down example, of
+> > course, and the real thing is being observed with the
+> > *[[plugins/contrib/getfield]]* plugin. This one needs to run *before*
+> > `preprocess`ing, for its `{{$page#field}}` syntax is (a) meant to be usable
+> > inside ikiwiki directives, and (b) the field values are meant to still be
+> > `preprocess`ed before being embedded. That's why it's using the `filter`
+> > hook instead of `sanitize`.
+
+> > Would adding another kind of hook be a way to fix this? My idea is that
+> > *cut* (and others) would then take their data not during `scan`ning, but
+> > *after* `filter`ing.
+
+> > --[[tschwinge]]
diff --git a/doc/bugs/ddate_plugin_causes_websetup_to_change_timeformat__44___even_when_disabled.mdwn b/doc/bugs/ddate_plugin_causes_websetup_to_change_timeformat__44___even_when_disabled.mdwn
new file mode 100644
index 000000000..a74f6fcc7
--- /dev/null
+++ b/doc/bugs/ddate_plugin_causes_websetup_to_change_timeformat__44___even_when_disabled.mdwn
@@ -0,0 +1,7 @@
+If the timeformat option is '%c', every time websetup rewrites the setup file, it changes to a ddate-style time format (but if ddate is not actually enabled, some of the format codes aren't understood by strftime, so they get passed through).
+
+Presumably this is because websetup loads all plugins, so IkiWiki::plugin::ddate::checkconfig is run...
+
+(This bug seems oddly appropriate. Hail Eris)
+
+[[done_fnord|done]]
diff --git a/doc/bugs/debbug_shortcut_should_expand_differently.mdwn b/doc/bugs/debbug_shortcut_should_expand_differently.mdwn
new file mode 100644
index 000000000..b93b20a32
--- /dev/null
+++ b/doc/bugs/debbug_shortcut_should_expand_differently.mdwn
@@ -0,0 +1,17 @@
+`\[[!debbug 123456]]` expands to "bug #123456", which is hyperlinked. Could you please drop the leading "bug", for two reasons?
+
+First, #123456 is not a bug, it's a bug report. And second, #123456 suffices, doesn't it? By hardcoding the "bug" in there, you make it impossible to start a sentence with a bug number, e.g.:
+
+ There are problems with code. #123456 is a good example of...
+
+instead of
+
+ There are problems with code. bug #123456 is a good example of...
+
+Thanks, --[[madduck]]
+
+> Tschwinge changed it to expand to "Debian bug #xxxx". Which happens to
+> sidestep the start of sentence problem. I think it makes sense to be
+> explicit about whose bug it is, in general -- but you can always edit the
+> shortcuts page for your own wiki to use something shorter and more
+> implicit. --[[Joey]] [[done]]
diff --git a/doc/bugs/debbug_shortcut_should_expand_differently/discussion.mdwn b/doc/bugs/debbug_shortcut_should_expand_differently/discussion.mdwn
new file mode 100644
index 000000000..8234f806e
--- /dev/null
+++ b/doc/bugs/debbug_shortcut_should_expand_differently/discussion.mdwn
@@ -0,0 +1,11 @@
+You could change this on your wiki by modifying the shortcut definition :
+
+Currently this is:
+
+ [shortcut name=debbug url="http://bugs.debian.org/%s" desc="bug #%s"]
+
+You just have to use:
+
+ [shortcut name=debbug url="http://bugs.debian.org/%s" desc="#%s"]
+
+(I use single brackets here because of [[bugs/wiki_links_still_processed_inside_code_blocks]])
diff --git a/doc/bugs/debian_package_doesn__39__t_pull_in_packages_required_for_openid.mdwn b/doc/bugs/debian_package_doesn__39__t_pull_in_packages_required_for_openid.mdwn
new file mode 100644
index 000000000..e1c297162
--- /dev/null
+++ b/doc/bugs/debian_package_doesn__39__t_pull_in_packages_required_for_openid.mdwn
@@ -0,0 +1,9 @@
+I installed version 1.48-1 of Debian package of ikiwiki. When I went to login with an OpenID URL it told me I was missing the Net::OpenID::Consumer Perl module which in turn required the Crypt::DH module.
+
+I assume that these should be pulled in by default since OpenID is enabled by default?
+
+-- Adam.
+
+> I'm going to promote it from a Suggests to a Recommends, that should get
+> it installed by default and still let it not be installed by users who
+> don't want it. [[done]] --[[Joey]]
diff --git a/doc/bugs/default__95__pageext_not_working.mdwn b/doc/bugs/default__95__pageext_not_working.mdwn
new file mode 100644
index 000000000..b7064206f
--- /dev/null
+++ b/doc/bugs/default__95__pageext_not_working.mdwn
@@ -0,0 +1,16 @@
+default_pageext in the setup file does not work for me.
+
+I tried to set it as 'txt' and as a custom plugin I am developing but when I edit a page it only ever loads with Markdown selected.
+
+Yes, I am only trying to set it to plugins that are loaded and working.
+
+ikiwiki version 3.20101129
+
+> I've tested `default_pageext` with 3.20110124, and it works fine.
+>
+> It seems to me from what you describe that you expect
+> it to have an effect when you go and edit an existing page.
+> That's not what it's for, it only chooses the default used
+> when creating a new page.
+>
+> Closing this bug as apparent user error. --[[Joey]] [[done]]
diff --git a/doc/bugs/definition_lists_should_be_bold.mdwn b/doc/bugs/definition_lists_should_be_bold.mdwn
new file mode 100644
index 000000000..a72206b8c
--- /dev/null
+++ b/doc/bugs/definition_lists_should_be_bold.mdwn
@@ -0,0 +1,27 @@
+Definition lists do not look great here...
+
+Here is an example.
+
+<dl>
+<dt>this is a term</dt>
+<dd>and this is its definition.</dd>
+</dl>
+
+(This wiki doesn't support Markdown's extended definition lists, but still, this is valid markup.)
+
+I believe `<dt>` should be made bold. I have added this to my `local.css`, and I would hate to add this all the time forever:
+
+ /* definition lists look better with the term in bold */
+ dt
+ {
+ font-weight: bold;
+ }
+
+:) How does that look? I can provide a patch for the base wiki if you guys really want... ;) -- [[anarcat]]
+
+> What you dislike seems to be the default rendering of definition lists by
+> browsers. I don't think it's ikiwiki's place to override browser defaults
+> for standard markup in the document body, at least not in the default
+> antitheme. --[[Joey]]
+
+> > How about in the actiontab theme then? :)
diff --git a/doc/bugs/defintion_lists_appear_to_be_disabled.mdwn b/doc/bugs/defintion_lists_appear_to_be_disabled.mdwn
new file mode 100644
index 000000000..6dac9c8b8
--- /dev/null
+++ b/doc/bugs/defintion_lists_appear_to_be_disabled.mdwn
@@ -0,0 +1,54 @@
+Adding text of the format
+
+ Apple
+ : Pomaceous fruit of plants of the genus Malus in
+ the family Rosaceae.
+ : An american computer company.
+
+ Orange
+ : The fruit of an evergreen tree of the genus Citrus.
+
+Does not result in expected HTML as described in the [MultiMarkdown Syntax Guide](http://fletcherpenney.net/multimarkdown/users_guide/multimarkdown_syntax_guide/):
+
+Should be
+
+ <dl xmlns="http://www.w3.org/1999/xhtml">
+ <dt>Apple</dt>
+ <dd>
+ <p>Pomaceous fruit of plants of the genus Malus in
+ the family Rosaceae.</p>
+ </dd>
+ <dd>
+ <p>An american computer company.</p>
+ </dd>
+ <dt>Orange</dt>
+ <dd>
+ <p>The fruit of an evergreen tree of the genus Citrus.</p>
+ </dd>
+ </dl>
+
+But instead it gives:
+
+ <p>Apple
+ : Pomaceous fruit of plants of the genus Malus in
+ the family Rosaceae.
+ : An american computer company.</p>
+
+ <p>Orange
+ : The fruit of an evergreen tree of the genus Citrus.</p>
+
+> ikiwiki's markdown support does not include support for multimarkdown by
+> default. If you want to enable that, you can turn on the `multimarkdown`
+> option in the setup file. --[[Joey]]
+
+>> Sorry, I should have indicated, I have multimarkdown enabled:
+
+ # mdwn plugin
+ # enable multimarkdown features?
+ multimarkdown => 1,
+
+>>Other features appear to be working, tables and footnotes for instance. See current install: <http://wiki.infosoph.org>
+
+>>> Ok, in that case it's a bug in the perl module. Forwarded to
+>>> <http://github.com/bobtfish/text-markdown/issues#issue/6> --[[Joey]]
+>>> [[!tag done]]
diff --git a/doc/bugs/deletion_warnings.mdwn b/doc/bugs/deletion_warnings.mdwn
new file mode 100644
index 000000000..668626b49
--- /dev/null
+++ b/doc/bugs/deletion_warnings.mdwn
@@ -0,0 +1,89 @@
+Seen while deleting a blog's calendar pages:
+
+--[[Joey]]
+
+[[done]] -- the new `page()` pagespec needed to check if there was a source
+file for the page, and was leaking undef.
+
+<pre>
+ 427250f..ff6c054 master -> origin/master
+Use of uninitialized value $file in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 688.
+Use of uninitialized value $file in substitution (s///) at /usr/share/perl5/IkiWiki.pm line 668.
+Use of uninitialized value $base in exists at /usr/share/perl5/IkiWiki.pm line 692.
+[the above three warnings repeated many more times]
+</pre>
+
diff --git a/doc/bugs/depends_simple_mixup.mdwn b/doc/bugs/depends_simple_mixup.mdwn
new file mode 100644
index 000000000..a5910d02e
--- /dev/null
+++ b/doc/bugs/depends_simple_mixup.mdwn
@@ -0,0 +1,88 @@
+The [[bugs]] page, at least before I commit this, has a bug at the top that
+has been modified to link to done, and ikiwiki's dependency calculations
+failed to notice and update the bugs page. Looking at the indexdb, I saw
+that the page was not included in the `depends_simple` of the bugs page.
+
+I was able to replicate the problem locally by starting off with the page
+marked done (when it did appear in the bugs page `depends_simple`
+(appropriatly as a link dependency, since a change to the page removing the
+link would make it match)), then removing the done link.
+
+At that point, it vanished from `depends_simple`. Presumably because
+the main (pagespec) depends for the bugs page now matched it, as a content
+dependency. But, it seems to me it should still be listed in
+`depends_simple` here. This, I think, is the cause of the bug.
+
+Then re-add the done link, and the dependency calc code breaks down,
+not noticing that the bugs page depended on the page and needed to be updated.
+
+Ok.. Turns out this was not a problem with the actual influences
+calculation or dependency calculation code. Whew! `match_link`
+just didn't set the influence correctly when failing. fixed
+
+--[[Joey]]
+
+---
+
+Update: Reopening this because the fix for it rather sucks.
+
+I made `match_link` return on failure an influence of
+type DEPEND_LINKS. So, a tag page that inlines `tagged(foo)`
+gets a `depends_simple` built up that contains link dependencies for
+*every* page in the wiki. A very bloaty way to represent the dependency!
+
+Per [[todo/dependency_types]], `link(done)` only needs to list in
+`depends_simple` the pages that currently match. If a page is modified
+to add the link, the regular dependency calculation code notices that
+a new page matches. If a page that had the link is modified to remove it,
+the `depends_simple` lets ikiwiki remember that the now non-matching page
+matched before.
+
+Where that fell down was `!link(done)`. A page matching that was not added
+to `depends_simple`, because the `link(done)` did not match it. If the page
+is modified to add the link, the regular dependency calculation code
+didn't notice, since the pagespec no longer matched.
+
+In this case, `depends_simple` needs to contain all pages
+that do *not* match `link(done)`, but before my change, it contained
+all pages that *do* match. After my change, it contained all pages.
+
+----
+
+So, seems what is needed is a way for influence info to be manipulated by
+the boolean operations that are applied. One way would be to have two
+sets of influences be returned, one for successful matches, and one for
+failed matches. Normally, these would be the same. For successful
+`match_link`, the successful influence would be the page.
+For failed `match_link`, the failed influence would be the page.
+
+Then, when NOTting a `*Reason`, swap the two sets of influences.
+When ANDing/ORing, combine the individual sets. Querying the object for
+influences should return only the successful influences.
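The two-set idea can be sketched as follows (an illustrative Python model, not ikiwiki's actual Perl `*Reason` objects; the names are hypothetical):

```python
# Model of the two-influence-set proposal: each match result carries
# the influences to record on success and on failure; NOT swaps the
# sets, AND/OR union them, and querying returns only the successful
# side. Hypothetical sketch, not ikiwiki's code.

class Match:
    def __init__(self, ok, succ=(), fail=()):
        self.ok = ok
        self.succ = frozenset(succ)   # influences if the match succeeded
        self.fail = frozenset(fail)   # influences if the match failed

    def __invert__(self):
        # NOT: the result's success is the operand's failure, so swap.
        return Match(not self.ok, self.fail, self.succ)

    def __and__(self, other):
        return Match(self.ok and other.ok,
                     self.succ | other.succ, self.fail | other.fail)

    def __or__(self, other):
        return Match(self.ok or other.ok,
                     self.succ | other.succ, self.fail | other.fail)

    def influences(self):
        # Querying the object returns only the successful influences.
        return self.succ

def match_link(page, links, target):
    # Successful match_link: the page is a successful influence.
    # Failed match_link: the page is a failed influence.
    if target in links:
        return Match(True, succ={page})
    return Match(False, fail={page})
```

With this, `~match_link("sandbox", links, "done")` succeeds and reports `sandbox` as an influence, which is what `!link(done)` needs recorded in `depends_simple`.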
+
+----
+
+Would it be possible to avoid the complication of maintaining two sets of
+influence info?
+
+Well, notice that the influence of `pagespec_match($page, "link(done)")`
+is $page. Iff the match succeeds.
+
+Also, the influence of `pagespec_match($page, "!link(done)")` is
+$page. Iff the (overall) match succeeds.
+
+Does that hold for all cases? If so, the code that populates
+`depends_simple` could just test if the pagespec was successful, and
+if not, avoid adding $page influences, while still adding any other,
+non-$page influences.
+
+----
+
+Hmm, commit f2b3d1341447cbf29189ab490daae418fbe5d02d seems
+thoroughly wrong. So, what about influence info for other matches
+like `!author(foo)` etc? Currently, none is returned, but it should
+be a content influence. (Backlink influence data seems ok.)
+
+----
+
+[[done]] again!
diff --git a/doc/bugs/diff_links_to_backtrace.mdwn b/doc/bugs/diff_links_to_backtrace.mdwn
new file mode 100644
index 000000000..0a361aa24
--- /dev/null
+++ b/doc/bugs/diff_links_to_backtrace.mdwn
@@ -0,0 +1,9 @@
+The diff links in RecentChanges go to a viewvc backtrace if the rev in
+question is when the page was added. Is this a viewvc bug, or a behavior
+ikiwiki needs to work around?
+ - As a special case, there should certainly be no history link for
+ pages generated from the underlaydir as it can never work for them.
+
+Hmm, gitweb deals with this ok, and can even handle the case where a page
+was renamed from some other filename. So I don't think it's appropriate
+for ikiwiki to worry about this. [[done]] --[[Joey]]
diff --git a/doc/bugs/disable_sub-discussion_pages.mdwn b/doc/bugs/disable_sub-discussion_pages.mdwn
new file mode 100644
index 000000000..39d9ba528
--- /dev/null
+++ b/doc/bugs/disable_sub-discussion_pages.mdwn
@@ -0,0 +1,63 @@
+Any setting to disable having a discussion of a discussion?
+The [[features]] page says every page, but I don't want every page.
+I do want a discussion subpage, but I don't want to have, for example: discussion/discussion/discussion.
+-- [[JeremyReed]]
+
+> Discussion pages should clearly be a special case that don't get Discussion
+> links put at the top... aaand.. [[bugs/done]]! --[[Joey]]
+
+>> This bug appears to have returned. For example,
+>> [[plugins/contrib/unixauth/discussion]] has a Discussion link. -- [[schmonz]]
+
+>>> Lots of case issues this time. Audited for and fixed them all. [[done]]
+>>> --[[Joey]]
+
+>>> Joey, I've just seen that you closed that bug in ikiwiki 1.37, but it seems
+>>> you fixed it only for English "discussion" page. The bug still occurs
+>>> for the international "discussion" pages. I have backported ikiwiki 1.40
+>>> and I can see "Dyskusja" link on my Polish "dyskusja" pages. --[[Paweł|ptecza]]
+
+>>> Yes, I missed that string when internationalizing ikiwiki, fixed now.
+
+>>>> Thanks a lot for the quick fix, Joey! It works for me :)
+
+>>>> BTW, I had to apply the patch manually, because I have a problem
+>>>> with building ikiwiki 1.41 Debian package:
+
+>>>> ptecza@devel2:~/svn/ikiwiki$ LANG=C dpkg-buildpackage -uc -us -rfakeroot
+>>>> [...]
+>>>> make[2]: Entering directory `/home/ptecza/svn/ikiwiki/po'
+>>>> msgfmt -o bg.mo bg.po
+>>>> msgfmt -o cs.mo cs.po
+>>>> msgfmt -o es.mo es.po
+>>>> msgfmt -o fr.mo fr.po
+>>>> msgfmt -o gu.mo gu.po
+>>>> msgfmt -o pl.mo pl.po
+>>>> Merging ikiwiki.pot and sv.po
+>>>> msgmerge: sv.po: warning: Charset missing in header.
+>>>> Message conversion to user's charset will not work.
+>>>> sv.po:10:2: parse error
+>>>> sv.po:10: keyword "mine" unknown
+>>>> sv.po:14: keyword "r2262" unknown
+>>>> sv.po:37:2: parse error
+>>>> sv.po:37: keyword "mine" unknown
+>>>> sv.po:43: keyword "r2262" unknown
+>>>> sv.po:52:2: parse error
+>>>> sv.po:52: keyword "mine" unknown
+>>>> sv.po:56: keyword "r2262" unknown
+>>>> msgmerge: found 9 fatal errors
+>>>> make[2]: *** [sv.po] Error 1
+>>>> make[2]: Leaving directory `/home/ptecza/svn/ikiwiki/po'
+>>>> make[1]: *** [extra_build] Error 2
+>>>> make[1]: Leaving directory `/home/ptecza/svn/ikiwiki'
+>>>> make: *** [build-stamp] Error 2
+
+>>>> I think you should be notified about it :) --[[Paweł|ptecza]]
+
+>>>>> You have an unresolved svn conflict in some files in po, looks like.
+
+>>>>>> Thanks for the hint! You're absolutely right again, Joey! It's because
+>>>>>> I always run `svn up` (for a CVS repo `cvs up` is enough :) )
+>>>>>> and I never use `svn status`. Yes, now I know I should do it :)
+
+>>>>>> First I removed conflicting `sv.po` file and next I updated it and
+>>>>>> it has resolved the problem. --[[Paweł|ptecza]]
diff --git a/doc/bugs/disabling_backlinks.mdwn b/doc/bugs/disabling_backlinks.mdwn
new file mode 100644
index 000000000..415708a50
--- /dev/null
+++ b/doc/bugs/disabling_backlinks.mdwn
@@ -0,0 +1,32 @@
+I have tried `--numbacklinks 0` on ikiwiki commandline, but I still get backlinks. The man page says:
+
+ --numbacklinks n
+ Controls how many backlinks should be displayed maximum.
+ Excess backlinks will be hidden in a popup.
+ Default is 10. Set to 0 to disable this feature.
+
+My first reading (and second, and third) of this was that backlinks would be disabled entirely if I set numbacklinks=0. Looking again, I see it may just be controlling how many are displayed before the excess moves to a popup. If that is what is meant, I'll just get rid of the backlinks via the page template. Is this the case, that numbacklinks controls the popup, rather than backlinks in general?
+
+--[[KarlMW]]
+
+> Yes, it only controls the number of backlinks that are shown at the
+> bottom of the page vs how many are moved to the popup. I've tried to
+> improve the documentation for this. [[done]] --[[Joey]]
+
+
+I notice that there is quite a bit of redundancy when both tags and
+backlinks are used extensively. On most pages, the set of links featured in
+both categories is almost identical because a tag's index page is shown
+both as a tag link and as a backlink. Is there a way to improve that
+situation somehow? I realise that backlinks aren't generated when the tag
+index page refers to its contents by \[\[!map ...]], etc., but sometimes an
+auto-generated index is insufficient.
+
+ --Peter
+
+> Um, if you're manually linking from the tag's page to each page so
+> tagged, I think you have larger problems than tags and backlinks being
+> the same. Like keeping that list of links up to date as tags are added
+> and changed. --[[Joey]]
+
+I see your point, Joey. I need to maintain that list manually, though, because the automatically generated list is too brief. \[[!map ...]] generates just a list of titles or descriptions. I need a list that contains both. See [[this_posting|ikiwiki/directive/map/discussion]] for more details. Until \[[!map]] can do that, I'm stuck with a manually maintained list. Which means that every link shows up in the backlinks.
diff --git a/doc/bugs/discussion.mdwn b/doc/bugs/discussion.mdwn
new file mode 100644
index 000000000..474e07564
--- /dev/null
+++ b/doc/bugs/discussion.mdwn
@@ -0,0 +1,18 @@
+Related to using --cgi on command-line. The man page should just be more
+clear. ikiwiki-1.35 has: "Enable CGI mode. ..." To me this implies that
+is configuring something that will enable CGI mode. If I understand it
+correctly, it should say "Run ikiwiki as a CGI. Normally this is used from
+the ikiwiki.cgi CGI wrapper. ..."
+
+> The man page says:
+
+ Enable [[CGI]] mode. In cgi mode ikiwiki __runs as a cgi script__, and
+ supports editing pages, signing in, registration, and displaying
+ [[RecentChanges]].
+
+ __To use ikiwiki as a [[CGI]] program you need to use --wrapper or --setup
+ to generate a wrapper.__ The wrapper will generally need to run suid 6755
+ to the user who owns the `source` and `destination` directories.
+
+> (emphasis mine). Anyway, if you have ideas to improve the man page, it's
+> over in [[usage]] --[[Joey]]
diff --git a/doc/bugs/discussion_of_what__63__.mdwn b/doc/bugs/discussion_of_what__63__.mdwn
new file mode 100644
index 000000000..763e599bf
--- /dev/null
+++ b/doc/bugs/discussion_of_what__63__.mdwn
@@ -0,0 +1,7 @@
+When searching in ikiwiki, sometimes discussion pages turn up. However, they are only titled "discussion".
+In order to know what topic they are discussing, you have to look at the URL. Shouldn't they be titled
+"foo/discussion" or "discussion of foo" or something? Thanks, --[[perolofsson]]
+
+> This bug was filed when ikiwiki still used hyperestraier.
+> Now that it uses xapian, the search results include the full
+> page name, which seems sufficient to call this [[done]] --[[Joey]]
diff --git a/doc/bugs/discussion_pages_with_uppercase_characters_break_the_detection_of_the_best_location.mdwn b/doc/bugs/discussion_pages_with_uppercase_characters_break_the_detection_of_the_best_location.mdwn
new file mode 100644
index 000000000..cb82766fa
--- /dev/null
+++ b/doc/bugs/discussion_pages_with_uppercase_characters_break_the_detection_of_the_best_location.mdwn
@@ -0,0 +1,6 @@
+As it says on the tin. I think $page might need to be lowercased in editpage.pm. Workaround until then is to change the setting to 'discussion'.
+
+> [[news/version_20111106]] says: "Fix handling of discussion page creation
+> links to make discussion pages in the right place and with the right case.
+> Broken by page case preservation feature added in 3.20110707." So I think
+> this is probably [[done]]? --[[smcv]]
diff --git a/doc/bugs/discussion_removal.mdwn b/doc/bugs/discussion_removal.mdwn
new file mode 100644
index 000000000..6da35f37b
--- /dev/null
+++ b/doc/bugs/discussion_removal.mdwn
@@ -0,0 +1,16 @@
+If a page has a discussion page, which is then removed, ikiwiki seems not
+to notice that the discussion page has gone away, and does not update the
+link to it in the action bar.
+
+> Reproduced with 2.5 --[[Joey]]
+
+Looks to me like loadindex is populating %destsources with information
+that the old discussion page exists, which isn't invalidated when ikiwiki
+discovers that the page is gone. This leaves dangling links whenever *any*
+page is deleted, not just a discussion page. --Ethan
+
+Here's a patch that trawls through %destsources deleting pages when they
+are found to be deleted. It's a little inelegant, but it's simple and it
+works. --Ethan
+
+Thankyou for the [[patch]]! [[Done]]. --[[Joey]]
diff --git a/doc/bugs/done.mdwn b/doc/bugs/done.mdwn
new file mode 100644
index 000000000..0a666ab11
--- /dev/null
+++ b/doc/bugs/done.mdwn
@@ -0,0 +1,3 @@
+recently fixed [[bugs]]
+
+[[!inline pages="link(bugs/done) and !bugs and !*/Discussion" sort=mtime show=10]]
diff --git a/doc/bugs/dumpsetup_does_not_save_destdir.mdwn b/doc/bugs/dumpsetup_does_not_save_destdir.mdwn
new file mode 100644
index 000000000..768c3fc5e
--- /dev/null
+++ b/doc/bugs/dumpsetup_does_not_save_destdir.mdwn
@@ -0,0 +1,3 @@
+Calling ikiwiki with a bunch of options, including --dumpsetup somefile.setup, creates somefile.setup for later reuse with the --setup option. The destination dir, however, is not saved in the setup file; it has `destdir => ''`.
+
+> that broke in version 2.64 .. fixed [[done]] --[[Joey]]
diff --git a/doc/bugs/edit_preview_resolves_links_differently_from_commit.mdwn b/doc/bugs/edit_preview_resolves_links_differently_from_commit.mdwn
new file mode 100644
index 000000000..320eca626
--- /dev/null
+++ b/doc/bugs/edit_preview_resolves_links_differently_from_commit.mdwn
@@ -0,0 +1,23 @@
+I'm editing /posts/foo. If I create a link to a subpage (in my case,
+"discussion"), and hit preview, it gets resolved to /discussion, not
+/posts/foo/discussion. If I hit commit, the latter happens. This seems like
+a bug. --liw
+
+> That would be a bug, but I cannot reproduce it. For example, I edited
+> <http://kitenet.net/~joey/blog/entry/wikis_out_of_disk/> and added a
+> discussion link and on preview it went to the page's discussion page. I
+> don't normally have a toplevel /discussion page, but I also tried adding
+> one, and the link still doesn't link to it. Testcase? --[[Joey]]
+
+>> I can reproduce this on <http://blog.liw.fi/posts/distributed-internet-witness-service/>:
+>> if I edit the page, then preview (no changes made), the "discussion" link at the bottom
+>> of the page points in the preview
+>> to <http://blog.liw.fi/discussion/>,
+>> whereas the saved page has it pointing to
+>> <http://blog.liw.fi/posts/distributed-internet-witness-service/discussion/>.
+>> I'll arrange so that you can edit the page to test this.
+>> --liw
+
+>> Joey suggested my wiki might be missing the FORCEBASEURL snippet from the misc.tmpl
+>> template, and he's right. Mea culpa: I had not diffed the various templates when upgrading
+>> and had missed that update. [[done]] --liw
diff --git a/doc/bugs/editmessage_turned_off_in_web_interface__63__.mdwn b/doc/bugs/editmessage_turned_off_in_web_interface__63__.mdwn
new file mode 100644
index 000000000..d8c6c3a08
--- /dev/null
+++ b/doc/bugs/editmessage_turned_off_in_web_interface__63__.mdwn
@@ -0,0 +1,10 @@
+the "Optional comment about this change:" text area is not showing up on my wiki when I edit pages. I just see the label "Optional comment about this change:" and no box in which to put the comment.
+
+Is it possible I turned this off by messing around with plugins? Even if so, then it's strange that I see the "optional comment" text without the corresponding text area.
+
+If the answer isn't immediately obvious you can see for yourself at <http://metameso.org/aa/ikiwiki.cgi?page=index&do=edit> (UN: guest PW: guest2011).
+
+> This happened to me. It was due to overriding either one of the ikiwiki templates based on an earlier version than current ikiwiki, or overriding style.css, instead of using local.css. It doesn't look like you are doing the former. Are you overriding the ikiwiki template dir with an out-of-date editpage template? -- [[Jon]]
+
+>> Yes, every time I've diagnosed this, it was an old page.tmpl. [[done]]
+>> --[[Joey]]
diff --git a/doc/bugs/edits_not_showing_up_in_compiled_pages.mdwn b/doc/bugs/edits_not_showing_up_in_compiled_pages.mdwn
new file mode 100644
index 000000000..9f1e89397
--- /dev/null
+++ b/doc/bugs/edits_not_showing_up_in_compiled_pages.mdwn
@@ -0,0 +1,20 @@
+I edited some pages on the ikiwiki ikiwiki ([[/shortcuts]] and
+[[/ikiwikiusers]]). The edits show up in RecentChanges and History, but not
+in the compiled pages. --[[JoshTriplett]]
+
+Well, I seem to have fixed this now (crossed fingers) --[[Joey]]
+
+Looks fixed. Out of curiosity, what caused the problem? --[[JoshTriplett]]
+
+Looks like a build died halfway through, so it was stumbling over rendered
+html pages that it didn't have record of. I don't know what build failed
+exactly. --[[Joey]]
+
+>> Has this just happened again? [[todo/datearchives-plugin]] is now exhibiting the same symptoms -- it's in the repository and RecentChanges, but the actual page is 404. --Ben
+
+>>> Yes, it seems to have happened again. Added debugging to track it
+>>> down next time it occurs. It seems to be happening when you add things
+>>> to patchqueue. --[[Joey]]
+
+>>> Got it, it seems that hyperestraier was dying and this was killing
+>>> ikiwiki before it could save its state file && [[bugs/done]], for real this time. --[[Joey]]
diff --git a/doc/bugs/edittemplate_seems_not_to_be_working.mdwn b/doc/bugs/edittemplate_seems_not_to_be_working.mdwn
new file mode 100644
index 000000000..a6c77b51a
--- /dev/null
+++ b/doc/bugs/edittemplate_seems_not_to_be_working.mdwn
@@ -0,0 +1,7 @@
+I tried to use [[the_edittemplate_plugin|plugins/edittemplate]] on <http://vcs-pkg.org/people/>, but if you create a new person's page, [the template](http://vcs-pkg.org/templates/person/) is not used, despite the note that the template has been registered, which replaces the `\[[!edittemplate ...]]` directive.
+
+I hope I am not doing something wrong...
+
+--[[madduck]]
+
+> [[fixed|done]] --[[Joey]]
diff --git a/doc/bugs/emails_should_not_be_considered_as_broken_links.mdwn b/doc/bugs/emails_should_not_be_considered_as_broken_links.mdwn
new file mode 100644
index 000000000..5aad1292a
--- /dev/null
+++ b/doc/bugs/emails_should_not_be_considered_as_broken_links.mdwn
@@ -0,0 +1,12 @@
+The [[ikiwiki/directive/brokenlinks]] directive lists emails when used inside [[ikiwiki/wikilink]]s: \[[john.doo@example.com\]] -> [[john.doo@example.com]]. Obviously it is a bug since 1) there is a link generated in the page; 2) "fixing" the broken link in the brokenlinks page may yield strange results [[http://ikiwiki.info/ikiwiki.cgi?page=john.doo__64__example.com&do=create]]. [[JeanPrivat]]
+
+[[!brokenlinks pages="*@* and !recentchanges"]]
+
+> Weird. The bug, imho, is that `\[[email-address]]` results in a marked-up email address. I think marking up email addresses into hyperlinks should be handled by a markup plugin (e.g. markdown), not by the wikilink parser. I feel the same way for external links, but it appears [this is all by design](http://source.ikiwiki.branchable.com/?p=source.git;a=commitdiff;h=07a08122d926ab6b7741c94bc6c0038ffe0113fb). — [[Jon]]
+
+>> I believe this was done for compatibility with the wikicreole plugin.
+>> Since in creole, a wikilink can contain an email or full html link,
+>> and it was easier to make ikiwiki's wikilinks do so too, rather
+>> than put entirely different link handling into creole.
+>>
+>> Anyway, I've fixed this. [[done]] --[[Joey]]
diff --git a/doc/bugs/encoding_issue_in_blogspam_plugin.mdwn b/doc/bugs/encoding_issue_in_blogspam_plugin.mdwn
new file mode 100644
index 000000000..92318d165
--- /dev/null
+++ b/doc/bugs/encoding_issue_in_blogspam_plugin.mdwn
@@ -0,0 +1,34 @@
+[[!tag patch]]
+
+<pre>
+From 5ad35b2805ca50478f07d810e57e7c9b8f4eddea Mon Sep 17 00:00:00 2001
+From: Changaco &lt;changaco@changaco.net>
+Date: Tue, 4 Jun 2013 02:54:35 +0200
+Subject: [PATCH] fix encoding issue in blogspam plugin
+
+RPC::XML uses ascii as default encoding, we have to tell it to use utf8.
+
+Without this, ikiwiki returns "failed to get response from blogspam server"
+every time a non-ascii character is used in a content that needs checking.
+
+---
+ IkiWiki/Plugin/blogspam.pm | 1 +
+ 1 file changed, 1 insertion(+)
+
+diff --git a/IkiWiki/Plugin/blogspam.pm b/IkiWiki/Plugin/blogspam.pm
+index d32c2f1..e48ed72 100644
+--- a/IkiWiki/Plugin/blogspam.pm
++++ b/IkiWiki/Plugin/blogspam.pm
+@@ -53,6 +53,7 @@ sub checkconfig () {
+ eval q{
+ use RPC::XML;
+ use RPC::XML::Client;
++ $RPC::XML::ENCODING = 'utf-8';
+ };
+ error $@ if $@;
+ }
+--
+1.8.3
+</pre>
+
+[[done]] --[[Joey]]
diff --git a/doc/bugs/entirely_negated_pagespec_matches_internal_pages.mdwn b/doc/bugs/entirely_negated_pagespec_matches_internal_pages.mdwn
new file mode 100644
index 000000000..a9b223a46
--- /dev/null
+++ b/doc/bugs/entirely_negated_pagespec_matches_internal_pages.mdwn
@@ -0,0 +1,30 @@
+A [[PageSpec]] that is entirely negated terminals, such as "!foo and !bar"
+matches all other pages, including all internal pages. This can lead to
+unexpected results, since it will match a bunch of recentchanges pages,
+etc.
+
+Recall that internal-use pages are not matched by a glob. So "\*" doesn't
+match them. So if the pagespec is "\* and !foo and !bar", it won't match
+them. This is the much more common style.
+
+There's an odd inconsistency with entirely negated pagespecs. If "!foo"
+matches page bar, shouldn't "" also match bar? But, the empty pagespec is
+actually special-cased to not match anything.
+
+Indeed, it seems what would be best would be for "!foo" to not match any
+pages, unless it's combined with a terminal that positively matches pages
+("* and !foo"). Although this would be a behavior change, with transition
+issues.
+
+Another approach would be to try to detect the case of an entirely negated
+pagespec, and implicitly add "and !internal()" to it.
+
+Either approach would require fully parsing the pagespec. And consider cases
+like "!(foo and !bar)". Doesn't seem at all easy to solve. --[[Joey]]
+
+> It occurs to me that at least one place in ikiwiki optimizes by assuming
+> that pagespecs not mentioning the word "internal" never match internal
+> pages. I wonder whether this bug could be solved by making that part of
+> the definition of a pagespec, rather than a risky optimization
+> like it is now? That seems strange, though - having this special case
+> would make pagespecs significantly harder to understand. --[[smcv]]
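The contrast described above can be made concrete with a toy matcher (simplified semantics for illustration, not ikiwiki's real pagespec code):

```python
# Globs skip internal pages, but a bare negation never consults a glob
# at all, so an entirely negated pagespec matches internal pages.
import fnmatch

def match(spec_terms, page, internal):
    # spec_terms: list of (negated, glob) pairs, ANDed together
    ok = True
    for negated, glob in spec_terms:
        # globs never match internal-use pages
        hit = fnmatch.fnmatch(page, glob) and not internal
        ok = ok and (not hit if negated else hit)
    return ok

# "!foo and !bar" matches an internal recentchanges page...
assert match([(True, "foo"), (True, "bar")],
             "recentchanges/change_123._change", internal=True)
# ...while the more common "* and !foo and !bar" does not:
assert not match([(False, "*"), (True, "foo"), (True, "bar")],
                 "recentchanges/change_123._change", internal=True)
```

Implicitly adding `and !internal()` would amount to forcing every term list to contain at least one positive, internal-excluding test like the second case.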
diff --git a/doc/bugs/enumerations_of_dates_not_formatted_correctly.mdwn b/doc/bugs/enumerations_of_dates_not_formatted_correctly.mdwn
new file mode 100644
index 000000000..263ddd78b
--- /dev/null
+++ b/doc/bugs/enumerations_of_dates_not_formatted_correctly.mdwn
@@ -0,0 +1,43 @@
+When an enumeration contains entries starting with ordinal numbers, e.g., for lists of meeting dates, ikiwiki turns them all into the 1st.
+
+Testcase:
+
+*The following lists should read: 1. January, 27. March, 99. November, 42. April*
+**But instead it reads:**
+
+* 1. January
+* 27. March
+* 99. November
+* 42. April
+
+> That's a consequence of Markdown syntax. The syntax for ordered lists
+> (HTML `<ol>`) in Markdown is to use arbitrary numeric prefixes in that style,
+> so your text gets parsed as:
+>
+> <ul>
+> <li>
+> <ol>
+> <li>January</li>
+> </ol>
+> </li>
+> ...
+>
+> You can avoid that interpretation by escaping the dot with a backslash
+> (`1\. January`) like so:
+>
+> * 1\. January
+> * 27\. March
+>
+> or by writing "1st January" and so on. --[[smcv]]
+
+>> I think that this is a bug in Text::Markdown (and probably other
+>> versions of markdown). The [markdown spec](http://daringfireball.net/projects/markdown/syntax.text),
+>> though unmaintained and bitrotted into near illegibility, seems to say
+>> that list items can only be preceded by whitespace:
+>>
+>>> "List markers typically start at the left margin, but may be indented by
+>>> up to three spaces."
+>>
+>> So "* * * 1. 2. 3." should not be parsed as a deeply nested list.
+>>
+>> Forwarded to [upstream RT](https://rt.cpan.org/Ticket/Display.html?id=65116). [[done]] --[[Joey]]
diff --git a/doc/bugs/errors_with_ampersand_in_filename.mdwn b/doc/bugs/errors_with_ampersand_in_filename.mdwn
new file mode 100644
index 000000000..6b459d4d1
--- /dev/null
+++ b/doc/bugs/errors_with_ampersand_in_filename.mdwn
@@ -0,0 +1,21 @@
+I created an image and a mdwn with an ampersand in the filename.
+This gave me some error messages.
+I renamed the files and committed.
+
+> Ikiwiki does not allow files containing ampersands or most other non-alphanumeric characters.
+> It will display a "skipping bad filename" warning if you have such files. --[[Joey]]
+
+Even now, I still get the following error messages:
+
+ Use of uninitialized value in substitution (s///) at /usr/lib64/perl5/vendor_perl/5.8.8/IkiWiki.pm line 764.
+ Use of uninitialized value in substitution (s///) at /usr/lib64/perl5/vendor_perl/5.8.8/IkiWiki.pm line 768.
+ Use of uninitialized value in concatenation (.) or string at /usr/lib64/perl5/vendor_perl/5.8.8/IkiWiki.pm line 773.
+ Use of uninitialized value in concatenation (.) or string at /usr/lib64/perl5/vendor_perl/5.8.8/IkiWiki.pm line 773.
+ Use of uninitialized value in concatenation (.) or string at /usr/lib64/perl5/vendor_perl/5.8.8/IkiWiki.pm line 773.
+ Use of uninitialized value in concatenation (.) or string at /usr/lib64/perl5/vendor_perl/5.8.8/IkiWiki.pm line 773.
+
+> To help with this I'd need to know, at a minimum, what version of ikiwiki you're using,
+> and I'd really also probably need a copy of the source of the wiki you are trying to
+> build with it. (If you'd like to email me a tarball, send it to joey@kitenet.net) --[[Joey]]
+
+> > Hmm... the most recent ikiwiki seems to fix the issue, so I will mark it as [[done]]. --[[JosephTurian]]
diff --git a/doc/bugs/example_Mercurial_historyurl_doesn__39__t_show_file_history.mdwn b/doc/bugs/example_Mercurial_historyurl_doesn__39__t_show_file_history.mdwn
new file mode 100644
index 000000000..390449dd3
--- /dev/null
+++ b/doc/bugs/example_Mercurial_historyurl_doesn__39__t_show_file_history.mdwn
@@ -0,0 +1,17 @@
+The Mercurial historyurl in the example ikiwiki.setup file creates a link to the repo's summary page. It should take you to the history page for the file, like the example Git historyurl does.
+
+The current historyurl is:
+
+ #historyurl => "http://localhost:8000/", # hg serve'd local repository
+
+A link to the history page for the file would be:
+
+ #historyurl => "http://localhost:8000/log/tip/\[[file]]", # hg serve'd local repository
+
+*The backslash in the code should be removed.*
+
+> To escape a link, use `\\[[link]]`. Example: \[[link]] --[[Joey]]
+
+> ([[done]], BTW)
+
+This creates links to the hgweb page which is equivalent to the Git file history page.
diff --git a/doc/bugs/external_links_inside_headings_don__39__t_work.mdwn b/doc/bugs/external_links_inside_headings_don__39__t_work.mdwn
new file mode 100644
index 000000000..51d6ad475
--- /dev/null
+++ b/doc/bugs/external_links_inside_headings_don__39__t_work.mdwn
@@ -0,0 +1,24 @@
+The standalone 'markdown' utility is perfectly happy with an external link inside a `<h1>`, e.g.:
+
+ # Review of [Dwarf Fortress][]
+ ...
+ [Dwarf Fortress]: http://www.bay12games.com/dwarves/
+
+produces
+
+ <h1>Review of <a href="http://www.bay12games.com/dwarves/">Dwarf Fortress</a></h1>
+
+but when I try to use this construct in an ikiwiki page, I get
+
+ <h1>Review of [Dwarf Fortress][]</h1>
+
+It works fine with h2 and deeper. The square brackets also appear in the output of an [[ikiwiki/directive/inline]] directive in archive mode; I haven't tried non-archive mode.
+
+> I think you were confused by markdown's slightly wacky mix of square brackets and parens.
+> The url in a markdown link goes in parens, not square brackets. For example:
+
+# [Google](http://google.com/)
+
+> [[done]] --[[Joey]]
+
+>> It works here but it definitely does *not* work on my wiki; but on further experimentation, I believe my problem is being caused by JasonBlevins' [h1title](http://jblevins.org/git/ikiwiki/plugins.git/plain/h1title.pm) plugin.
diff --git a/doc/bugs/external_plugins_cannot_access_ARGV_needed_for_getopt.mdwn b/doc/bugs/external_plugins_cannot_access_ARGV_needed_for_getopt.mdwn
new file mode 100644
index 000000000..be7f16a79
--- /dev/null
+++ b/doc/bugs/external_plugins_cannot_access_ARGV_needed_for_getopt.mdwn
@@ -0,0 +1,14 @@
+The `getopt` hook expects plugins to modify `@ARGV`. This is not exported via xml-rpc and thus external plugins cannot do anything. --[[madduck]]
+
+> I can think of two interfaces to handle this.
+>
+> 1. Pass @ARGV to the hook, and remove any values the hook returns from @ARGV.
+> 2. Provide an XML-RPC interface for setting and getting ikiwiki's @ARGV.
+>
+> The first is simpler, but requires keeping track of which options to
+> remove, which could be a pain, and probably precludes using regular
+> getopt libraries to process options. It also could theoretically cause
+> problems for existing perl getopt hooks.
+>
+> The second should allow using regular getopt libraries, but does bloat
+> the RPC interface. Oh well, guess that's ok. [[done]] --[[Joey]]
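A sketch of how the second interface might look from an external plugin's side. The accessor names (`getargv`/`setargv`), the plugin option, and the in-process fake proxy standing in for the XML-RPC connection are all assumptions for illustration:

```python
import getopt

class FakeRPC:
    """Stand-in for the XML-RPC proxy an external plugin would hold;
    getargv/setargv are assumed accessor names, not a documented API."""
    def __init__(self, argv):
        self._argv = list(argv)
    def getargv(self):
        return list(self._argv)
    def setargv(self, argv):
        self._argv = list(argv)

def getopt_hook(rpc):
    # Fetch ikiwiki's @ARGV over RPC, consume the options this plugin
    # owns using a stock getopt library, and hand the remainder back.
    # (A real plugin must list its own options here; getopt errors out
    # on long options it does not recognize.)
    opts, rest = getopt.gnu_getopt(rpc.getargv(), "", ["myplugin-option="])
    rpc.setargv(rest)
    return dict(opts)

rpc = FakeRPC(["--myplugin-option", "42", "srcdir"])
print(getopt_hook(rpc), rpc.getargv())
# → {'--myplugin-option': '42'} ['srcdir']
```

This is what makes the second interface attractive: the plugin gets a whole argv to run through a normal getopt library, instead of having to report back which individual values to remove.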
diff --git a/doc/bugs/feedfile_does_the_wrong_thing_from_index.mdwn2.mdwn b/doc/bugs/feedfile_does_the_wrong_thing_from_index.mdwn2.mdwn
new file mode 100644
index 000000000..6b8781a8c
--- /dev/null
+++ b/doc/bugs/feedfile_does_the_wrong_thing_from_index.mdwn2.mdwn
@@ -0,0 +1,7 @@
+[[!meta title="feedfile does the wrong thing from index"]]
+
+When I put the following !inline in my index.mdwn, it generates a file called index/graphics.rss. However, the link in the RSS button is to graphics.rss (i.e., not in the index/ directory).
+
+`\[[!inline pages="link(tags/graphics) and ./posts/* and !*/Discussion" show="10" feedfile=graphics feedonly=yes]]`
+
+[[done]] --[[Joey]]
diff --git a/doc/bugs/feedpages_does_not_prevent_tags_from_being_aggregated.mdwn b/doc/bugs/feedpages_does_not_prevent_tags_from_being_aggregated.mdwn
new file mode 100644
index 000000000..a004154df
--- /dev/null
+++ b/doc/bugs/feedpages_does_not_prevent_tags_from_being_aggregated.mdwn
@@ -0,0 +1,32 @@
+I added a feedpages directive to `blog/index.mdwn` to not pick up anything tagged `tags/random/hidden`, yet that still happened.
+
+ ~git/richardhartmann.de/blog % grep hidden index.mdwn
+ \[[!inline pages="./posts/*/*/* and !*/Discussion" feedpages="./posts/*/*/* and !*/Discussion and not tagged(tags/random/hidden)" show="10" actions=yes rootpage="blog"]]
+ ~git/richardhartmann.de/blog % grep hidden posts/2013/05/17-Debian_Release_Critical_Bug_report_for_Week_20.mdwn
+ \[[!tag tags/tech/floss/debian tags/tech/floss/debian/rc-stats/8.0-jessie tags/random/hidden]]
+ ~git/richardhartmann.de/blog %
+
+If you need more information, please let me know.
+
+Richard
+
+> I don't think this is a bug. You have a syntax error in your pagespec:
+> "not" is not a recognised keyword in [[pagespecs|ikiwiki/pagespec]],
+> so `and not tagged(...)` should be `and !tagged(...)`. Presumably inline
+> falls back to `pages` when `feedpages` doesn't work.
+>
+> By posting the pagespec here with insufficient escaping (which I've fixed)
+> you caused *this* ikiwiki instance's HTML to contain an error message
+> illustrating that syntax error :-)
+>
+> <span class="error">Error: syntax error in pagespec "(./posts/*/*/* and !*/Discussion) and (./posts/*/*/* and !*/Discussion and not tagged(tags/random/hidden))"</span>
+>
+> [[done]]. --[[smcv]]
+
+> > As per IRC: Thanks. As an aside, shouldn't this ikiwiki instance ignore directives in normal text? The problem may be non-trivial, but still... -- Richard
+
+>>> "Normal text" is exactly where directives go, so, not really.
+>>> If you mean verbatim text (e.g. indentation in Markdown): the fact that
+>>> directives still expand to HTML, which is then treated as verbatim, is an
+>>> unfortunate result of how ikiwiki interacts with pages' markup languages
+>>> (directives and wikilinks happen before markup is converted to HTML). --[[smcv]]
diff --git a/doc/bugs/feeds_get_removed_in_strange_conditions.mdwn b/doc/bugs/feeds_get_removed_in_strange_conditions.mdwn
new file mode 100644
index 000000000..deec208ba
--- /dev/null
+++ b/doc/bugs/feeds_get_removed_in_strange_conditions.mdwn
@@ -0,0 +1,57 @@
+For some time now, in circumstances that I've had enormous troubles
+trying to track, I've seen feeds getting removed by ikiwiki when
+apparently unrelated pages got changed, with the message:
+
+> removing somepath/somepage/somefeed, no longer built by some/unrelated/page
+
+I've finally been able to find how and why it happens. The situation is
+the following:
+
+* page A has an inline directive that (directly) generates a feed F
+* page B inlines A, thus (indirectly) generating F again
+* page B is rendered after page A
+
+The feed removal happens when changes are made to prevent B from
+inlining A; for example, because B is a tag page and A is untagged B, or
+because B includes A through a pagespec that no longer matches A. In
+this case, this happens:
+
+* page A is built, rendering F
+* page B is built, _not_ rendering F, which it used to render
+* F is removed because it is not built by B anymore
+
+Note that although this issue is triggered (for me) by the changes I
+proposed last year to allow feed generation from nested inlines,
+coalescing it to be page-based instead of destpage-based
+(bb8f76a4a04686def8cc6f21bcca80cb2cc3b2c9 and
+72c8f01b36c841b0e83a2ad7ad1365b9116075c5), there is potential for it
+popping up in other cases.
+
+Specifically, the logic for the removal of dependent pages currently
+relies on the assumption that each output has a single generator. My
+changes caused this assumption to be violated, hence the error, but
+other cases may pop up for other plugins in the future.
+
+I have a [[patch]] fixing this issue (for feeds specifically, i.e. only
+the problem I am actually having) on top of my `mystuff` branch, but
+since that also has heaps of other unrelated stuff, you may want to just
+[pick it from my gitweb][gw].
+
+[gw]: http://git.oblomov.eu/ikiwiki/patch/671cb26cf50643827f258270d9ac8ad0b1388a65
+
+The patch changes the `will_render()` for feeds to be based on the page
+rather than on the destpage, matching the fact that for nested inlines
+it's the inner page that is ultimately responsible for generating the
+feed.
+
+I've noticed that it requires at least _two_ full rebuilds before the
+index is again in a sensible state. (On the first rebuild, all feeds
+from nested inlines are actually _removed_.)
+
+While the patch is needed because there are legitimate cases in which
+nested feeds are needed (for example, I have an index page that inlines
+index pages for subsections of my site, and I want _those_ feeds to be
+visible), there are other cases when one may want to skip feed
+generation from nested inlines.
+
+--[[GiuseppeBilotta]]
diff --git a/doc/bugs/filecheck_failing_to_find_files.mdwn b/doc/bugs/filecheck_failing_to_find_files.mdwn
new file mode 100644
index 000000000..6501508e4
--- /dev/null
+++ b/doc/bugs/filecheck_failing_to_find_files.mdwn
@@ -0,0 +1,65 @@
+Using the attachment plugin, when filecheck checked the mime-type of an attachment before allowing it to be removed, it returned an error saying that the mime-type of the file was "unknown" (when the mime-type definitely was known!)
+
+It turns out that the filecheck plugin couldn't find the file, because it was merely using the $pagesources hash, rather than finding the absolute path of the file in question.
+
+> I don't understand why the file was not in `%pagesources`. Do you?
+> --[[Joey]]
+
+>> The file *was* in `%pagesources`, but what returns from that is the filename relative to the `srcdir` directory; for example, `foo/bar.gif`.
+>> When File::MimeInfo::Magic::magic is given that, it can't find the file.
+>> But if it is given `/path/to/srcdir/foo/bar.gif` instead, then it *can* find the file, and returns the mime-type correctly.
+>> --[[KathrynAndersen]]
+
+>>> Ok, so it's not removal specific, can in fact be triggered by using
+>>> testpagespec (or really anything besides attachment, which passes
+>>> the filename parameter). Nor is it limited to mimetype, all the tests in
+>>> filecheck have the problem. --[[Joey]]
+
+>>>> Alas, not fixed. It seems I was mistaken in some of my assumptions.
+>>>> It still happens when attempting to remove attachments.
+>>>> With your fix, the `IkiWiki::srcfile` function is only called when the filename is not passed in, but it appears that in the case of removing attachments, the filename IS passed in, but it is the relative filename as mentioned above. Thus, the file is still not found, and the mime-type comes back as unknown.
+>>>> The reason my patch worked is that, rather than checking whether a filename was passed in before applying IkiWiki::srcfile, it checks whether the file can be found, and only if it cannot be found does it apply IkiWiki::srcfile to the filename.
+>>>> --[[KathrynAndersen]]
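A minimal sketch of that fallback (in Python for illustration; `resolve_srcfile` is a hypothetical stand-in, not ikiwiki's actual API):

```python
import os
import tempfile

def resolve_srcfile(file, srcdir):
    """Only prepend srcdir when the path as given cannot be found."""
    if os.path.exists(file):
        return file
    return os.path.join(srcdir, file)

# Demo: a file that exists only relative to srcdir resolves to an
# absolute path under srcdir, so magic() can open it.
srcdir = tempfile.mkdtemp()
os.makedirs(os.path.join(srcdir, "foo"))
open(os.path.join(srcdir, "foo", "bar.gif"), "w").close()
print(resolve_srcfile("foo/bar.gif", srcdir))  # absolute path under srcdir
```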
+
+>>>>> Can you test if this patch fixes that? --[[Joey]]
+
+>>>>>> Yes, it works! --[[KathrynAndersen]]
+
+applied && [[done]]
+
+<pre>
+diff --git a/IkiWiki/Plugin/remove.pm b/IkiWiki/Plugin/remove.pm
+index f59d026..0fc180f 100644
+--- a/IkiWiki/Plugin/remove.pm
++++ b/IkiWiki/Plugin/remove.pm
+@@ -49,7 +49,7 @@ sub check_canremove ($$$) {
+ # This is sorta overkill, but better safe than sorry.
+ if (! defined pagetype($pagesources{$page})) {
+ if (IkiWiki::Plugin::attachment->can("check_canattach")) {
+- IkiWiki::Plugin::attachment::check_canattach($session, $page, $file);
++ IkiWiki::Plugin::attachment::check_canattach($session, $page, "$config{srcdir}/$file");
+ }
+ else {
+ error("removal of attachments is not allowed");
+diff --git a/IkiWiki/Plugin/rename.pm b/IkiWiki/Plugin/rename.pm
+index 3908443..1a9da63 100644
+--- a/IkiWiki/Plugin/rename.pm
++++ b/IkiWiki/Plugin/rename.pm
+@@ -50,7 +50,7 @@ sub check_canrename ($$$$$$) {
+ IkiWiki::check_canedit($src, $q, $session);
+ if ($attachment) {
+ if (IkiWiki::Plugin::attachment->can("check_canattach")) {
+- IkiWiki::Plugin::attachment::check_canattach($session, $src, $srcfile);
++ IkiWiki::Plugin::attachment::check_canattach($session, $src, "$config{srcdir}/$srcfile");
+ }
+ else {
+ error("renaming of attachments is not allowed");
+@@ -85,7 +85,7 @@ sub check_canrename ($$$$$$) {
+ if ($attachment) {
+ # Note that $srcfile is used here, not $destfile,
+ # because it wants the current file, to check it.
+- IkiWiki::Plugin::attachment::check_canattach($session, $dest, $srcfile);
++ IkiWiki::Plugin::attachment::check_canattach($session, $dest, "$config{srcdir}/$srcfile");
+ }
+ }
+</pre>
diff --git a/doc/bugs/find:_invalid_predicate___96__-L__39__.mdwn b/doc/bugs/find:_invalid_predicate___96__-L__39__.mdwn
new file mode 100644
index 000000000..993c678fa
--- /dev/null
+++ b/doc/bugs/find:_invalid_predicate___96__-L__39__.mdwn
@@ -0,0 +1,26 @@
+Hi,
+
+I have a problem with building ikiwiki 2.00 backport for Debian `sarge`,
+because it seems that my `find` doesn't support `-L` option. I had to patch
+`Makefile.PL` file to work around it:
+
+ --- Makefile.PL-orig 2007-05-10 15:18:04.000000000 +0200
+ +++ Makefile.PL 2007-05-10 15:18:41.000000000 +0200
+ @@ -47,9 +47,9 @@
+
+ extra_install:
+ install -d $(DESTDIR)$(PREFIX)/share/ikiwiki
+ - for dir in `find -L basewiki templates -type d ! -regex '.*\.svn.*'`; do \
+ + for dir in `find basewiki templates -follow -type d ! -regex '.*\.svn.*'`; do \
+ install -d $(DESTDIR)$(PREFIX)/share/ikiwiki/$$dir; \
+ - for file in `find -L $$dir -maxdepth 1 -type f`; do \
+ + for file in `find $$dir -follow -maxdepth 1 -type f`; do \
+ install -m 644 $$file $(DESTDIR)$(PREFIX)/share/ikiwiki/$$dir; \
+ done; \
+ done
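For reference, the two spellings behave identically where both are supported (`-L` is the POSIX spelling, `-follow` the traditional one); a quick throwaway check:

```shell
# Compare `find -L dir` with the older `dir -follow` primary
# (requires a find that supports both, e.g. GNU find).
d=$(mktemp -d)
mkdir "$d/real" && touch "$d/real/file"
ln -s real "$d/link"
find -L "$d" -type f | sort
find "$d" -follow -type f | sort
```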
+
+-- Pawel
+
+[[applied|done]], thanks --[[Joey]]
+
+> Thank you! :) --[[Paweł|ptecza]]
diff --git a/doc/bugs/find_gnuism.mdwn b/doc/bugs/find_gnuism.mdwn
new file mode 100644
index 000000000..65ee10657
--- /dev/null
+++ b/doc/bugs/find_gnuism.mdwn
@@ -0,0 +1,7 @@
+[[!template id=gitbranch branch=schmonz/portability author="[[schmonz]]"]]
+
+Whoops, somehow missed a spot on the last incarnation of this branch.
+`find -not` doesn't work on NetBSD and `find !` runs equivalently
+for me. Fixed in 9659272e25fac37f896991dab01a05b4f4c85ccb.
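For anyone hitting the same thing, `!` is the POSIX spelling of the negation that `-not` provides only as a GNU extension:

```shell
# POSIX `!` negation, which NetBSD find also understands:
d=$(mktemp -d)
touch "$d/page.mdwn" "$d/junk.tmp"
find "$d" -type f ! -name '*.tmp'   # prints only page.mdwn
```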
+
+> [[done]] --[[Joey]]
diff --git a/doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn b/doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn
new file mode 100644
index 000000000..558eb90c8
--- /dev/null
+++ b/doc/bugs/firefox_doesn__39__t_want_to_load_updated_pages_at_ikiwiki.info.mdwn
@@ -0,0 +1,14 @@
+I'm using firefox-3.0.8-alt0.M41.1 (Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.4pre) Gecko/2008100921 Firefox/3.0). I have noticed that quite often it shows an old state of a page at http://ikiwiki.info, e.g., [[recentchanges]] without my last edits, or the last page I edited (say, 50 min ago) in the state it was before I edited it.
+
+Only explicitly pressing "reload" helps.
+
+Is it a bug? I haven't been noticing such problems usually on other sites. --Ivan Z.
+
+This remains to be true now, with Epiphany 2.26.3 (Mozilla/5.0 (X11; U; Linux i686; en; rv:1.9.1.4pre) Gecko/20080528 Epiphany/2.22 Firefox/3.5). --Ivan Z.
+
+> In the most recent ikiwiki release, I added a Cache-Control hack
+> explicitly to work around firefox's broken over-caching.
+>
+> (When I tested epiphany and chromium, neither had firefox's problem.)
+>
+> [[!tag done]]
diff --git a/doc/bugs/format_bug.mdwn b/doc/bugs/format_bug.mdwn
new file mode 100644
index 000000000..dc9a09ebb
--- /dev/null
+++ b/doc/bugs/format_bug.mdwn
@@ -0,0 +1,25 @@
+* list item 1
+* list item 2
+* list item 3
+
+1. First item.
+1. Sub item.
+1. Another.
+1. And another..
+
+Not sure whether this is a bug or a case of RTFM.
+Anyway, unexpected. Trying my first install of ikiwiki,
+reading through the quick setup, trying out the given
+examples and finding half of them to behave oddly is not
+encouraging.
+
+> This is a bug in markdown, not in ikiwiki. Markdown often has issues with
+> one sort of list followed by a second sort. I've filed a bug report on
+> markdown about this ([[!debbug 432152]])
+
+> (BTW, this bug was filed by editing the bugs page directly. Please don't
+> do that, use the form to generate a new per-bug page..)
+
+> --[[Joey]]
+
+Marking this [[done]] since it's not a ikiwiki bug directly.
diff --git a/doc/bugs/formbuilder_3.0401_broken.mdwn b/doc/bugs/formbuilder_3.0401_broken.mdwn
new file mode 100644
index 000000000..7e1df24e0
--- /dev/null
+++ b/doc/bugs/formbuilder_3.0401_broken.mdwn
@@ -0,0 +1,73 @@
+After editing ikiwiki.setup, and running "ikiwiki --setup", the CGI script is successfully created. However, if I then click on "Edit Page" link, I see nothing in the browser and the following in the logs:
+
+<pre>
+==> /var/log/apache2/access_log <==
+192.168.0.125 - - [06/Oct/2006:15:12:05 -0500] "GET /cgi-bin/ikiwiki.cgi?page=index&do=edit HTTP/1.1" 500 666
+
+==> /var/log/apache2/error_log <==
+[Fri Oct 06 15:12:07 2006] [error] [client 192.168.0.125] HTML::Template::param() : attempt to set parameter 'form-submit' with an array ref - parameter is not a TMPL_LOOP! at /usr/lib/perl5/site_perl/5.8/CGI/FormBuilder.pm line 1415, referer: http://imrisws36/wiki/
+[Fri Oct 06 15:12:07 2006] [error] [client 192.168.0.125] Premature end of script headers: ikiwiki.cgi, referer: http://imrisws36/wiki/
+</pre>
+
+Can anyone decipher this for me? I spent some time with cpan earlier today downloading the latest version I could find of prerequisite modules such as HTML::Template and CGI::FormBuilder.
+
+> It would help to know what version of CGI::FormBuilder you have. Mine
+> (3.03.01) does not seem to contain this error message. --[[Joey]]
+
+I have version 3.0401 of CGI::FormBuilder -- the latest from CPAN. If you are wondering about any other modules, the answer
+is likely the same: the latest from CPAN. And you're right: the error string in question does not appear in CGI::FormBuilder. I found it in HTML::Template (version 2.8).
+
+-----
+
+OK, so downgrading CGI::FormBuilder to 3.0302 makes the problem go away. I'll leave it to you to figure out whether the bug is in CGI::FormBuilder or in IkiWiki. --Steve
+
+Maybe this bug should be renamed to "doesn't work with CGI::FormBuilder (3.04)". I get the same error on FreeBSD.
+
+ HTML::Template::param() : attempt to set parameter 'form-submit' with an array
+ ref - parameter is not a TMPL_LOOP!
+ at /usr/local/lib/perl5/site_perl/5.8.7/CGI/FormBuilder.pm line 1415
+
+version info:
+
+ root@freedom# pkg_info | grep p5-CGI
+ p5-CGI-FastTemplate-1.09 Perl module for manage templates and parses templates
+ p5-CGI-FormBuilder-3.0401 FormBuilder for CGI
+ p5-CGI-Session-4.14 Perl extension for persistent session management
+
+--Mark
+
+----
+
+For the curious, this new version of CGI::FormBuilder changes how it passes
+some values to the HTML::Template template. In particular, FORM-SUBMIT used
+to be just a string containing the buttons used to submit the form. With
+the new version, it's an array of strings, one per button, and the template
+needs to be written differently to deal with this. Oddly, the docs have not
+been updated about this. In fact, from all I can tell, it's a bug, since
+the array is not in the form that HTML::Template expects to receive it.
+Here's a simple test case:
+
+ #!/usr/bin/perl
+ my @fields=qw(editcontent);
+ my @buttons=("Save", "Preview", "Cancel");
+
+ use CGI::FormBuilder;
+ my $form = CGI::FormBuilder->new(
+ fields => \@fields,
+ template => "foo.tmpl",
+ );
+ print $form->render(submit => \@buttons);
+
+With this test case, it does not seem to be possible to write a foo.tmpl that
+outputs the buttons using the FORM-SUBMIT template variable.
+
+I was able to work around this bug by just not using FORM-SUBMIT in the
+template, and hardcoding the buttons (since they never change anyway).
+Nasty, but it should work. I haven't fully installed the new version of
+CGI::FormBuilder to test it, and it's quite possible that other changes
+in the new version cause other breakage. If you want to test the fix,
+it's in svn now. --[[Joey]]
+
+Now that the new version of formbuilder is in debian unstable, I'm using
+ikiwiki with it, and, after fixing a bug or two more, I think it's all
+working, so I'll call this bug [[bugs/done]]. --[[Joey]]
diff --git a/doc/bugs/git.pm_should_prune_remote_branches_when_fetching.mdwn b/doc/bugs/git.pm_should_prune_remote_branches_when_fetching.mdwn
new file mode 100644
index 000000000..5dc4250e3
--- /dev/null
+++ b/doc/bugs/git.pm_should_prune_remote_branches_when_fetching.mdwn
@@ -0,0 +1,14 @@
+The _git_ module does not appear ever to prune obsolete remote branches in the _srcdir_ repository, leading to spurious errors when fetching.
+
+Pruning remote branches can be done automatically with the --prune option to "git fetch" or in a separate command "git remote prune".
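Concretely, either command drops remote-tracking branches that no longer exist upstream; a quick demonstration in throwaway repositories:

```shell
# Upstream repo with a branch "foo" that later goes away:
up=$(mktemp -d) && git init -q "$up"
git -C "$up" -c user.email=w@w -c user.name=w commit -q --allow-empty -m init
git -C "$up" branch foo
clone=$(mktemp -d)/repo && git clone -q "$up" "$clone"
git -C "$up" branch -D foo               # branch removed/renamed upstream
git -C "$clone" fetch -q --prune origin  # or: git remote prune origin
git -C "$clone" branch -r                # origin/foo is gone
```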
+
+--[[blipvert]]
+
+> I'll need more information than that before I add extra processing
+> work to the current git commands it uses. I don't see any errors here
+> from obsolete remote branches. --[[Joey]]
+
+Suppose a remote repository contains a branch named "foo", and you fetch from it. Then, someone renames that branch to "foo/bar". The next time you fetch from that repository, you will get an error because the obsolete branch "foo" is blocking the branch "foo/bar" from being created (due to the way git stores refs for branches). Pruning gets around the problem. It doesn't really add much overhead to the fetch, and in fact it can *save* overhead since obsolete branches do consume resources (any commits they point to cannot be garbage collected). --[[blipvert]]
+
+> Ok, so git pull --prune can be used to do everything in one command.
+> [[done]] --[[Joey]]
diff --git a/doc/bugs/git_commit_adds_files_that_were_not_tracked.mdwn b/doc/bugs/git_commit_adds_files_that_were_not_tracked.mdwn
new file mode 100644
index 000000000..587650c61
--- /dev/null
+++ b/doc/bugs/git_commit_adds_files_that_were_not_tracked.mdwn
@@ -0,0 +1,19 @@
+Commit 3650d0265bc501219bc6d5cd4fa91a6b6ecd793a seems to have been caused by
+a bug in ikiwiki. recentchanges/* was added to the git repo incorrectly.
+
+Part of the problem seems to be that git's `rcs_commit` does a git add followed
+by a `rcs_commit_staged`, and so calling `rcs_commit` on files that were
+not checked in before adds them, incorrectly.
+
+I'm unsure yet why the recentchanges files were being committed. Possibly
+because of the link fixup code run when renaming a page. --[[Joey]]
+
+> See also [[bugs/rename fixup not attributed to author]]. --[[smcv]]
+
+> Ok, there was a call to `rcs_commit` that was still using non-named
+> parameters, which confused it thoroughly, and I think caused
+> 'git add .' to be called. I've fixed that.
+>
+> I think there is still potential for the problem I described above to
+> occur during a rename or possibly otherwise. Ok.. fixed `rcs_commit`
+> to not add too. [[done]] --[[Joey]]
diff --git a/doc/bugs/git_fails_to_compile.mdwn b/doc/bugs/git_fails_to_compile.mdwn
new file mode 100644
index 000000000..25aa417d0
--- /dev/null
+++ b/doc/bugs/git_fails_to_compile.mdwn
@@ -0,0 +1,32 @@
+Background: I'm running ikiwiki on OS X Leopard (a laptop), and I have the wiki running locally, as it's mostly for note-taking and personal stuff. Anyway.
+
+I'd been using svn, but I'm making the leap to git (finally) and moving the wiki over as well...
+
+The git works great, I am in fact, quite pleased. Here's the problem. When I try and run `ikiwiki --setup [setupfile]` I get the following message:
+
+ Can't exec "git": No such file or directory at /opt/local/lib/perl5/vendor_perl/5.8.8/IkiWiki/Rcs/git.pm line 29.
+ /Users/tychoish/.ikiwiki/ikiwiki.setup: Cannot exec 'git log --max-count=50 --pretty=raw --raw --abbrev=40 --always -c -r HEAD -- .': No such file or directory
+ BEGIN failed--compilation aborted at (eval 6) line 139.
+
+ /Users/$HOME/.ikiwiki/ikiwiki.setup: 'git log --max-count=50 --pretty=raw --raw --abbrev=40 --always -c -r HEAD -- .' failed:
+ BEGIN failed--compilation aborted at (eval 6) line 139.
+
+---
+
+I can get the wiki to compile if: I take the git stuff out of the setup file, if I put `rcs => ""` **or** if I set the git_master_branch to "".
+
+I think the problem is that ikiwiki can't deal with the onslaught of such a large quantity of history/log information at once, somehow, because the repository came to this moment with a lot of history that the compiler has to crunch through. How to remedy this is beyond my skill and insight...
+
+Thanks.
+
+-- [[tychoish]]
+
+> It looks like it can't find git; what is $PATH set to when ikiwiki is run, and is git in one of those directories? --[[bma]]
+>> Yeah, ikiwiki and git are both installed underneath macports, which is in the path and works just fine most of the time, and I use macports stuff a lot.
+
+>>> The PATH is set at the top of the ikiwiki program; so the system's PATH
+>>> setting, or one in the environment will be ignored. (This is done for
+>>> security since ikiwiki can be run setuid.) If you need to use
+>>> a nonstandard path, you'll need to edit that. --[[Joey]]
+
+[[!tag done]]
diff --git a/doc/bugs/git_mail_notification_race.mdwn b/doc/bugs/git_mail_notification_race.mdwn
new file mode 100644
index 000000000..58bd82325
--- /dev/null
+++ b/doc/bugs/git_mail_notification_race.mdwn
@@ -0,0 +1,57 @@
+[[done]] (in this branch); fixed by removing email notification support!
+
+I was surprised to receive two mails from ikiwiki about one web edit:
+
+ 1 F Oct 30 To joey+ikiwiki update of ikiwiki's plugins/contrib/gallery.mdwn by http://arpitjain11.myopenid.com/
+ 1 F Oct 30 To joey+ikiwiki update of ikiwiki's plugins/contrib/gallery.mdwn by http://arpitjain11.myopenid.com/
+
+The first of these had the correct diff for the changes made by the web
+edit (00259020061577316895370ee04cf00b634db98a).
+
+But the second had a diff for modifications I made to ikiwiki code
+around the same time (2a6e353c205a6c2c8b8e2eaf85fe9c585c1af0cd).
+
+I'm now fairly sure a race is involved. Note that the name of the modified
+page is correct, while the diff was not. The git rcs_commit code first
+gets the list of modified page(s) and then gets the diff, and apparently my
+other commit happened in the meantime, so HEAD changed.
+
+Seems that the rcs_notify for git assumes that HEAD is the commit
+that triggered the ikiwiki run, and that it won't change while the function is
+running. Both assumptions are bad. Git does no locking and another commit
+can come in at any time.
+
+Notice that the equivalent code for subversion is careful to look at $REV.
+git's post-update hook doesn't have an equivalent of $REV. Its post-receive hook
+does, but I'm not sure if that's an appropriate hook for ikiwiki to be using
+for this. Switching existing wikis to use a different hook would be tricky..
+
+I've avoided part of the race; now the code does:
+
+1. Get commit info for HEAD commit.
+2. Extract file list from that.
+3. Extract sha1 of commit from it too.
+4. git diff sha1^ sha1
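The steps above, sketched as shell commands in a throwaway repository (illustrative, not ikiwiki's exact invocations):

```shell
# Pin a sha1 once, then derive both the file list and the diff from
# that sha1 rather than from HEAD, which may move under us.
d=$(mktemp -d) && cd "$d" && git init -q
git -c user.email=w@w -c user.name=w commit -q --allow-empty -m init
echo edit > page.mdwn && git add page.mdwn
git -c user.email=w@w -c user.name=w commit -q -m "web edit"
sha1=$(git rev-parse HEAD)                     # steps 1+3: fix the commit id once
git show --name-only --pretty=format: "$sha1"  # step 2: file list of that commit
git diff "$sha1^" "$sha1"                      # step 4: diff of exactly that commit
```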
+
+Still seems like there can be a race if another commit comes in before step #1.
+In this case, two post-receive hooks could run at the same time, and both
+see the same sha1 as HEAD, so two diffs might be sent for the second commit and no
+diff for the first commit.
+
+Ikiwiki's own locking prevents this from happening if both commits are web
+edits. At least one of the two commits has to be a non-web commit.
+
+----
+
+A related problem is that if two commits are made separately but then
+pushed in together, the commit code only looks at the HEAD commit, which
+is the second one. No notification is sent for the first.
+
+----
+
+Based on all of these problems with using the post-update hook, ikiwiki
+should be changed to use the post-receive hook, which provides enough
+information to avoid the assumptions that led to these problems.
+Transitioning existing wikis to using a new hook will be interesting. Also,
+this hook is only present in git >= 1.5.0.7.
+--[[Joey]]
diff --git a/doc/bugs/git_stderr_output_causes_problems.mdwn b/doc/bugs/git_stderr_output_causes_problems.mdwn
new file mode 100644
index 000000000..d8e14db42
--- /dev/null
+++ b/doc/bugs/git_stderr_output_causes_problems.mdwn
@@ -0,0 +1,45 @@
+I've just been getting ikiwiki running on a hosted server. The server is wrapping all cgi scripts to 'harden' them. Unfortunately, that script is sensitive to what a cgi script outputs on stderr.
+
+Ikiwiki's git handling is sending a bunch of output to stderr. The following patch closes stderr in the child process that ikiwiki forks to run git. This allows me to use ikiwiki on this hosted server. (patch inline - check source to get it)
+
+ diff --git a/IkiWiki/Rcs/git.pm b/IkiWiki/Rcs/git.pm
+ index 425536f..5734bb2 100644
+ --- a/IkiWiki/Rcs/git.pm
+ +++ b/IkiWiki/Rcs/git.pm
+ @@ -24,6 +24,7 @@ sub _safe_git (&@) {
+ if (!$pid) {
+ # In child.
+ # Git commands want to be in wc.
+ + open STDERR, '>/dev/null';
+ chdir $config{srcdir}
+ or error("Cannot chdir to $config{srcdir}: $!");
+ exec @cmdline or error("Cannot exec '@cmdline': $!");
+
+> This sounds like rather counter-productive "hardening" (making life harder
+> for real users without any security improvement that I can think of),
+> but if you have to suppress standard error of the CGI,
+> can't you just replace ikiwiki.cgi with this...
+>
+> #!/bin/sh
+> exec /some/where/else/ikiwiki.cgi "$@" 2>/dev/null
+>
+> or (if you're constrained to Perl) this?
+>
+> #!/usr/bin/perl
+> open STDERR, '>/dev/null';
+> exec ("/some/where/else/ikiwiki.cgi", @ARGV);
+>
+> (Also indented all the lines of your patch so markdown won't eat it :-) )
+> --[[smcv]]
+
+> Right, I don't like throwing stderr away because stderr is supposed to be
+> logged to error.log for a reason: To allow debugging problems.
+> It's unfortunate that git [abuses stderr](http://bugs.debian.org/447395),
+> outputting non-errors to it. That doesn't mean that git might not also
+> output actual error messages there. --[[Joey]]
+
+>> I'm happy with the wrapper script solution, so this is [[done]].
+>> And this report is now here to point others to that solution.
+
+This is also useful when running ikiwiki behind an nginx proxy, because nginx
+considers this stderr output to be invalid headers and reports a server error. -- [[nil]]
diff --git a/doc/bugs/git_utf8.mdwn b/doc/bugs/git_utf8.mdwn
new file mode 100644
index 000000000..39903d51c
--- /dev/null
+++ b/doc/bugs/git_utf8.mdwn
@@ -0,0 +1,12 @@
+Just saw a bug with the git backend and utf8. I was committing to ikiwiki and
+the post-commit hook emitted a warning message about some text from git
+not being encoded as proper utf-8. I've lost the message, but it was from
+line 36 of git.pm. After a couple other commits, the message stopped
+appearing.
+
+Probably git's output needs to be force encoded to utf-8.
+--[[Joey]]
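A sketch of the idea in Python, for illustration (ikiwiki's actual code is Perl): decode as UTF-8 but replace malformed bytes instead of erroring out.

```python
# Decode possibly-mixed-encoding command output as UTF-8, substituting
# U+FFFD for invalid byte sequences rather than raising an error.
def force_utf8(raw: bytes) -> str:
    return raw.decode("utf-8", errors="replace")

print(force_utf8(b"caf\xe9 commit"))  # stray latin-1 byte becomes '\ufffd'
```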
+
+> I did that in 4ac0b2953131d7a53562ab8918c8e5a49952d8ac , [[done]]
+> --[[Joey]]
+
diff --git a/doc/bugs/gitremotes_script_picks_up_tags_from_anywhere.mdwn b/doc/bugs/gitremotes_script_picks_up_tags_from_anywhere.mdwn
new file mode 100644
index 000000000..9bd8938c5
--- /dev/null
+++ b/doc/bugs/gitremotes_script_picks_up_tags_from_anywhere.mdwn
@@ -0,0 +1,22 @@
+[[!tag patch]]
+[[!template id=gitbranch branch=smcv/ready/no-tags author="[[smcv]]"]]
+
+The `gitremotes` script picks up tags from any repository, including those
+used for local .debs that were never actually present in Debian:
+
+ smcv@reptile% git tag | grep -c nmu
+ 52
+
+This can be avoided with the `tagopt = --no-tags` option in .git/config;
+see <http://git.pseudorandom.co.uk/smcv/ikiwiki.git?a=shortlog;h=refs/heads/ready/no-tags>
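For reference, the option can also be set from the command line instead of editing `.git/config` by hand (the remote name and URL here are just examples):

```shell
# Equivalent to adding `tagopt = --no-tags` under [remote "smcv"]:
d=$(mktemp -d) && git -C "$d" init -q
git -C "$d" remote add smcv http://git.pseudorandom.co.uk/smcv/ikiwiki.git
git -C "$d" config remote.smcv.tagopt --no-tags
git -C "$d" config remote.smcv.tagopt            # prints --no-tags
```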
+
+> *done* thanks. Also cleared propagated tags out of origin.
+>
+> Hmm, in testing I still see tags get pulled the first time a remote
+> is added. If those are then locally deleted, it doesn't pull them again
+> with the `--no-tags`.
+> --[[Joey]]
+
+>> Oh, I see why. Try the same branch again... --[[smcv]]
+
+>>> [[done]] --[[Joey]]
diff --git a/doc/bugs/gitweb_deficiency_w.r.t._log_messages.mdwn b/doc/bugs/gitweb_deficiency_w.r.t._log_messages.mdwn
new file mode 100644
index 000000000..564982ff3
--- /dev/null
+++ b/doc/bugs/gitweb_deficiency_w.r.t._log_messages.mdwn
@@ -0,0 +1,14 @@
+Following a diff link from *RecentChanges*, the *log message* shown will not be
+the one of the actual commit, but that of some **previous** commit, in most
+cases the one installed directly before the current commit.
+
+--[[tschwinge]]
+
+> Is there some way to make gitweb show a diff with the right message?
+> I don't see one, except for diffs that show all changes in the commit,
+> rather than only changes to a single file. This feels like a bug in
+> gitweb. --[[Joey]]
+
+This is fixed by using the new gitweb style urls. Which new gitweb
+requires, but is a manual change you have to make in your setup. So,
+[[done]] --[[Joey]]
diff --git a/doc/bugs/gitweb_deficiency_w.r.t._newly_created_pages.mdwn b/doc/bugs/gitweb_deficiency_w.r.t._newly_created_pages.mdwn
new file mode 100644
index 000000000..0b4d70596
--- /dev/null
+++ b/doc/bugs/gitweb_deficiency_w.r.t._newly_created_pages.mdwn
@@ -0,0 +1,13 @@
+Going from *RecentChanges*, when viewing the diff of a newly created page
+(like <http://tinyurl.com/35f5q3> for the
+*Allow web edit form comment field to be mandatory* page), the diff for
+the newly added page should -- in my opinion -- be shown.
+
+--[[tschwinge]]
+
+> I don't see any way to make gitweb do that. You can click on the filename
+> after the "diff -cc" to see the whole file output, but gitweb won't show
+> a diff for a newly added file. --[[Joey]]
+
+>> happily this, too, is fixed by using the new style gitweb urls. [[done]]
+>> --[[Joey]]
diff --git a/doc/bugs/goto_with_bad_page_name.mdwn b/doc/bugs/goto_with_bad_page_name.mdwn
new file mode 100644
index 000000000..bc462c840
--- /dev/null
+++ b/doc/bugs/goto_with_bad_page_name.mdwn
@@ -0,0 +1,25 @@
+If goto is passed a page name that
+contains spaces or is otherwise not a valid page name,
+it will display a "page does not exist", with a create link. But,
+clicking on the link will result in "bad page name".
+
+I have found at least two ways it can happen:
+
+* If 404 is enabled, and the user goes to "http://wiki/some page with spaces"
+* If mercurial is used, it pulls the user's full name, with spaces,
+ out for `rcs_recentchanges` and that ends up on RecentChanges.
+
+When fixing, need to keep in mind that we can't just run the input through
+titlepage, since in all other circumstances, the page name is already valid
+and we don't want to doubly-encode it.
+
+Seems like the goto plugin needs to check if the page name is valid and
+pass it through titlepage if not.
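A sketch of the suggested check (hypothetical Python stand-ins; ikiwiki's real `titlepage` lives in Perl and escapes disallowed characters numerically rather than with underscores):

```python
import re

def ok_page_name(name: str) -> bool:
    """True if the name contains only characters a page name may use."""
    return re.fullmatch(r"[-a-zA-Z0-9_./]+", name) is not None

def goto_target(page: str) -> str:
    # Only escape when needed, so already-valid names are not doubly encoded.
    if ok_page_name(page):
        return page
    return re.sub(r"[^-a-zA-Z0-9_./]", "_", page)  # crude titlepage stand-in

print(goto_target("some page with spaces"))  # escaped to a valid page name
print(goto_target("RecentChanges"))          # already valid, left alone
```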
+
+(As a side effect of this, 404 will start redirecting "http://wiki/some page
+with spaces" to "http://wiki/some_page_with_spaces", if the latter exists.
+That seems like a fairly good thing.)
+
+[[done]]
+
+--[[Joey]]
diff --git a/doc/bugs/graphviz_demo_generates_empty_graph.mdwn b/doc/bugs/graphviz_demo_generates_empty_graph.mdwn
new file mode 100644
index 000000000..5b96f148e
--- /dev/null
+++ b/doc/bugs/graphviz_demo_generates_empty_graph.mdwn
@@ -0,0 +1,15 @@
+The following code in our sandbox generates an empty graph:
+
+ [[!graph src=""""
+ google [ href="http://google.com/" ]
+ sandbox [ href=\[[SandBox]] ]
+ help [ href=\[[ikiwiki/formatting]] ]
+ newpage [ href=\[[NewPage]] ]
+
+ google -> sandbox -> help -> newpage -> help -> google;
+ """"]]
+
+It is the exact same code as in the [[ikiwiki/directive/graph/]] directive documentation, from the [[plugins/graphviz]] plugin. This is ikiwiki 3.20120203 on Debian wheezy and graphviz is installed (2.26.3-10). Note that the first demo actually works. See <http://mesh.openisp.ca/sandbox> --[[anarcat]]
+
+> Looking at the example shows too many double quotes. [[fixed|done]]
+> --[[Joey]]
diff --git a/doc/bugs/hardcoded___34__Discussion__34___link.mdwn b/doc/bugs/hardcoded___34__Discussion__34___link.mdwn
new file mode 100644
index 000000000..d5e5a1a68
--- /dev/null
+++ b/doc/bugs/hardcoded___34__Discussion__34___link.mdwn
@@ -0,0 +1,44 @@
+I can't translate "Discussion" link in `templates/inlinepage.tmpl`
+and `templates/page.tmpl` files, because it's hardcoded in
+ikiwiki sources:
+
+ ptecza@horus:~/svn/ikiwiki$ rgrep -i DISCUSSIONLINK . |grep -v '.svn'
+ ./templates/inlinepage.tmpl:<TMPL_IF NAME="DISCUSSIONLINK">
+ ./templates/inlinepage.tmpl:<li><TMPL_VAR DISCUSSIONLINK></li>
+ ./templates/page.tmpl:<TMPL_IF NAME="DISCUSSIONLINK">
+ ./templates/page.tmpl:<li><TMPL_VAR DISCUSSIONLINK><br /></li>
+ ./IkiWiki/Plugin/inline.pm: $template->param(discussionlink => htmllink($page, $params{page}, "Discussion", 1, 1));
+ ./IkiWiki/Render.pm: $template->param(discussionlink => htmllink($page, $page, "Discussion", 1, 1));
+
+I hope it's a bug, not a feature, because I don't have the same
+problem with other links, for example "Edit", "RecentChanges"
+or "History". --[[Paweł|ptecza]]
+
+> There are good reasons for feeding a full html link into the template,
+> rather than the urls used for the other links. For one, the Discussion
+> link needs to be different if the Discussion page doesn't yet exist.
+
+>> You can always use `<tmpl_if>` and `<tmpl_else>` construct in that place ;) --[[Paweł|ptecza]]
+
+>>> Not without duplicating the logic that constructs a link to an
+>>> existing/nonexisting page in two places, one in code in ikiwiki and one
+>>> in the template. Not good design. --[[Joey]]
+
+> As noted in [[todo/l10n]], there are some other places in ikiwiki
+> that hard code English strings, and I feel that using standard gettext
+> and po files is the best approach for these, although Recai suggested an
+> approach of translating the strings using a template file. --[[Joey]]
+
+>> You know that I rather prefer static templates, but it's your choice,
+>> of course.
+>>
+>> BTW, is there a chance for configurable name of Discussion page?
+>> In my wiki I use only Polish name of pages, so I would like to have
+>> dyskusja.html page, instead of discussion.html page. --[[Paweł|ptecza]]
+
+>>> Ikiwiki is now fully internationalised, so you can change the name of
+>>> the Discussion page and quite a lot more (but hardly everything) by
+>>> translating it. [[bugs/done]]! There's a `po/debconf.pot` in the source
+>>> now for translating. See [[translation]]. --[[Joey]]
+
+>>>> Joey, you're great! ;) Thanks a lot! I'll try ikiwiki l10n stuff soon. --[[Paweł|ptecza]]
diff --git a/doc/bugs/helponformatting_link_disappears.mdwn b/doc/bugs/helponformatting_link_disappears.mdwn
new file mode 100644
index 000000000..11ce5488d
--- /dev/null
+++ b/doc/bugs/helponformatting_link_disappears.mdwn
@@ -0,0 +1,5 @@
+If you are editing a page using your www browser and hit the "Preview"
+button, the link to "HelpOnFormatting" on the bottom of the page
+disappears. This may be expected, or not.
+
+[[bugs/done]]
diff --git a/doc/bugs/html5_support.mdwn b/doc/bugs/html5_support.mdwn
new file mode 100644
index 000000000..ba67d532b
--- /dev/null
+++ b/doc/bugs/html5_support.mdwn
@@ -0,0 +1,117 @@
+Some elements of
+[HTML5](http://www.whatwg.org/specs/web-apps/current-work/multipage/) can be
+safely supported by ikiwiki. There are [several differences between HTML4 and
+HTML5](http://www.w3.org/TR/html5-diff/).
+
+[[!template id=gitbranch branch=hendry/html5 author="[[Kai_Hendry|hendry]]"]]
+
+* [HTML5 branch](http://git.webconverger.org/?p=ikiwiki;h=refs/heads/html5)
+* [ikiwiki instance with HTML5 templates](http://natalian.org)
+* [HTML5 outliner tool](http://gsnedders.html5.org/outliner/) -- to check you have the structure of your markup correct
+
+> Kai, thanks enormously for working on this. I switched a page to
+> the html5 doctype today, and was rather pleasently suprised that it
+> validated, except for the new Cache-Control meta tag. Now I see you're
+> well ahead of me. --[[Joey]]
+>
+> So, how should ikiwiki support html5? There are basically 3 approaches:
+>
+> 1. Allow users to add html5 tags to their existing xhtml pages.
+> What has been done so far, can be extended. Basically works
+> in browsers, if you don't care about standards. A good prerequisite
+> for anything else, anyway.
+> 2. Have both a html5 and a xhtml mode, allow user to select.
+> 3. Switch to html5 in eg, ikiwiki 4; users have to deal with
+> any custom markup on their pages/templates that breaks then.
+>
+> The second option seems fairly tractable from what I see here and in
+> your branch. You made only relatively minor changes to 10 templates.
+> It would probably not be too dreadful to put them in ifdefs. I've made a
+> small start at doing that.
+>
+> I've made ikiwiki use the time element and all the new semantic elements
+> in html5 mode.
+>
+> Other ideas:
+>
+> * Use details tag instead of the javascript in the toggle plugin.
+> (Need to wait on browser support probably.)
+> * Use figure and figcaption for captions in img. However, I have not
+> managed to style it to look as good as the current table+caption
+> approach.
+>
+> --[[Joey]]
+
+# htmlscrubber.pm needs to not scrub new HTML5 elements
+
+* [new elements](http://www.w3.org/TR/html5-diff/#new-elements)
+
+> Many added now.
+>
+> Things I left out, too hard to understand today:
+> Attributes contenteditable,
+> data-\*, draggable, role, aria-\*.
+> Tags command, keygen, output.
+>
+> Clearly unsafe: embed.
+>
+> Apparently cannot be used w/o javascript: menu.
+>
+> I have not added the new `ping` attribute, because parsing a
+> space-separated list of urls to avoid javascript injection is annoying,
+> and the attribute seems generally dubious.
+> --[[Joey]]
+
+# HTML5 Validation and t/html.t
+
+[validator.nu](http://validator.nu/) is the authoritative HTML5 validator,
+however it is almost impossible to sanely introduce as a build dependency
+because of its insane Java requirements. :( I test locally via
+[cURL](http://wiki.whatwg.org/wiki/IDE), though Debian packages cannot be built
+with a network dependency.
+
+In the future, hopefully ikiwiki can test for valid HTML5 using [Relax NG
+schema](http://syntax.whattf.org/) using a Debian package tool
+[rnv](http://packages.qa.debian.org/r/rnv.html).
+
+> Validation in the test suite is nice, but I am willing to lose those
+> tests for a while. --[[Joey]]
+
+# HTML5 migration issues
+
+# [article](http://www.whatwg.org/specs/web-apps/current-work/multipage/semantics.html#the-article-element) element
+
+This element is poorly supported by browsers. As a workaround, `style.css` needs:
+
+ article {
+ display: block;
+ }
+
+Internet Explorer will display it as a block, though you don't seem to be able to further control the style.
+
+> done (needed for header too) --[[Joey]]
+
+## Time element
+
+The [time element](http://www.whatwg.org/specs/web-apps/current-work/multipage/text-level-semantics.html#the-time-element) ideally needs the datetime= attribute set by a template variable with what [HTML5 defines as a valid datetime string](http://www.whatwg.org/specs/web-apps/current-work/multipage/infrastructure.html#valid-global-date-and-time-string).
+
+As a workaround:
+
+ au:~% grep timeformat natalian.setup
+ timeformat => '%Y-%m-%d',
+
+> Also, the [[plugins/relativedate]] plugin needs to be updated to
+> support relativizing the contents of time elements. --[[Joey]]
+
+> Done and done; in html5 mode it uses the time tag, and even
+> adds pubdate when displaying ctimes. --[[Joey]]
+
+## tidy plugin
+
+Will reformat html5 to html4.
+
+----
+
+
+Ok, I consider this [[done]], at least as a first pass. Html5 mode
+is experimental, but complete enough. --[[Joey]]
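As an illustration of the time-element markup discussed above, here is a minimal sketch (in Python, purely for illustration; ikiwiki itself is Perl) of emitting an HTML5 `<time>` element with an RFC 3339 `datetime` attribute, roughly mirroring what `displaytime` does in html5 mode:

```python
from datetime import datetime, timezone

def time_element(epoch, pubdate=False):
    # Machine-readable RFC 3339 timestamp in the datetime attribute,
    # human-readable date in the element body (%Y-%m-%d, as in the
    # timeformat workaround mentioned above).
    t = datetime.fromtimestamp(epoch, tz=timezone.utc)
    attr = ' pubdate' if pubdate else ''
    return '<time datetime="{}"{}>{}</time>'.format(
        t.strftime('%Y-%m-%dT%H:%M:%SZ'), attr, t.strftime('%Y-%m-%d'))

print(time_element(1237917734, pubdate=True))
# -> <time datetime="2009-03-24T18:02:14Z" pubdate>2009-03-24</time>
```

In (x)html serializations the boolean attribute would be spelled `pubdate="pubdate"` instead, as the pubdate bug report below notes.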
diff --git a/doc/bugs/html5_time_element__39__s_pubdate_wrong_when_using_xhtml5___34__mode__34__.mdwn b/doc/bugs/html5_time_element__39__s_pubdate_wrong_when_using_xhtml5___34__mode__34__.mdwn
new file mode 100644
index 000000000..def5bcc2a
--- /dev/null
+++ b/doc/bugs/html5_time_element__39__s_pubdate_wrong_when_using_xhtml5___34__mode__34__.mdwn
@@ -0,0 +1,46 @@
+Hi,
+
+XML error:
+
+ Created <time datetime="2009-03-24T18:02:14Z" pubdate class="relativedate" title="Tue, 24 Mar 2009 14:02:14 -0400">2009-03-24</time>
+
+The pubdate REQUIRES a date, so e.g. `pubdate="2009-03-24T18:02:14Z"`
+
+> No, `pubdate="pubdate"`. It's a boolean attribute. applied && [[done]]
+> --[[Joey]]
+>> awesome, thanks for fixing my fix ;) --[[simonraven]]
+
+Otherwise the XML parser chokes.
+
+<http://www.whatwg.org/specs/web-apps/current-work/multipage/text-level-semantics.html#attr-time-pubdate>
+
+(indented exactly 4 spaces)
+
+<pre>
+ diff --git a/IkiWiki.pm b/IkiWiki.pm
+ index 1f2ab07..6ab5b56 100644
+ --- a/IkiWiki.pm
+ +++ b/IkiWiki.pm
+ @@ -1004,7 +1004,7 @@ sub displaytime ($;$$) {
+ my $time=formattime($_[0], $_[1]);
+ if ($config{html5}) {
+ return '&lt;time datetime="'.date_3339($_[0]).'"'.
+ - ($_[2] ? ' pubdate' : '').
+ + ($_[2] ? ' pubdate="'.date_3339($_[0]).'"' : '').
+ '>'.$time.'&lt;/time&gt;';
+ }
+ else {
+ diff --git a/IkiWiki/Plugin/relativedate.pm b/IkiWiki/Plugin/relativedate.pm
+ index fe8ef09..8c4a1b4 100644
+ --- a/IkiWiki/Plugin/relativedate.pm
+ +++ b/IkiWiki/Plugin/relativedate.pm
+ @@ -59,7 +59,7 @@ sub mydisplaytime ($;$$) {
+
+ if ($config{html5}) {
+ return '&lt;time datetime="'.IkiWiki::date_3339($time).'"'.
+ - ($pubdate ? ' pubdate' : '').$mid.'&lt;/time&gt;';
+ + ($pubdate ? ' pubdate="'.IkiWiki::date_3339($time).'"' : '').$mid.'&lt;/time&gt;';
+ }
+ else {
+ return '&lt;span'.$mid.'&lt;/span&gt;';
+</pre>
diff --git a/doc/bugs/html_errors.mdwn b/doc/bugs/html_errors.mdwn
new file mode 100644
index 000000000..aef2f7f71
--- /dev/null
+++ b/doc/bugs/html_errors.mdwn
@@ -0,0 +1,5 @@
+ikiwiki will generate HTML-formatted error messages on the command
+line if --cgi is set, even if it's not yet running as a CGI.
+
+> [[done]], at last. Oldest open bug.. just thought of an elegant fix!
+> --[[Joey]]
diff --git a/doc/bugs/htmlbalance_fails_with_HTML-Tree_v4.mdwn b/doc/bugs/htmlbalance_fails_with_HTML-Tree_v4.mdwn
new file mode 100644
index 000000000..92427065d
--- /dev/null
+++ b/doc/bugs/htmlbalance_fails_with_HTML-Tree_v4.mdwn
@@ -0,0 +1,18 @@
+[[!template id=gitbranch branch=smcv/ready/htmlbalance author="[[smcv]]"]]
+[[!tag patch]]
+
+My one-patch htmlbalance branch fixes incompatibility with HTML::Tree 4.0.
+From the git commit:
+
+ The HTML::Tree changelog says:
+
+ [THINGS THAT MAY BREAK YOUR CODE OR TESTS]
+ ...
+ * Attribute names are now validated in as_XML and invalid names will
+ cause an error.
+
+ and indeed the regression tests do get an error.
+
+--[[smcv]]
+
+[[done]] --[[Joey]]
diff --git a/doc/bugs/htmlscrubber_breaks_multimarkdown_footnotes.mdwn b/doc/bugs/htmlscrubber_breaks_multimarkdown_footnotes.mdwn
new file mode 100644
index 000000000..343037b45
--- /dev/null
+++ b/doc/bugs/htmlscrubber_breaks_multimarkdown_footnotes.mdwn
@@ -0,0 +1,18 @@
+I enabled multimarkdown to make use of footnotes in my file. I have the multimarkdown plugin,
+as well as the command-line program. If I write a document with footnotes:
+
+ This line has a footnote[^1]
+
+ [^1]: this is the footnote
+
+and compile it from the cli, the reference becomes a link to the footnote and the footnote
+gets a backreferencing link appended. When compiled in ikiwiki with the goodstuff plugin
+enabled, the links are created but their hrefs are empty (so they do not actually act as links).
+Disabling the htmlscrubber plugin fixes this issue.
+
+[[!tag multimarkdown htmlscrubber]]
+
+> href was of the form: #fnref:1 , scrubbed by overzealous protocol
+> scrubbing.
+
+[[done]] --[[Joey]]
diff --git a/doc/bugs/htmlscrubber_still_scrubbing_HTML_from_mdwn_pages.mdwn b/doc/bugs/htmlscrubber_still_scrubbing_HTML_from_mdwn_pages.mdwn
new file mode 100644
index 000000000..8e79cdd49
--- /dev/null
+++ b/doc/bugs/htmlscrubber_still_scrubbing_HTML_from_mdwn_pages.mdwn
@@ -0,0 +1,21 @@
+Though [[htmlscrubber|plugins/htmlscrubber]] is disabled on the Branchable site <http://geekout.org.uk/> , [HTML embedded](http://source.geekout-org-uk.branchable.com/?p=source.git;a=blob_plain;f=actual_route.mdwn;hb=HEAD) (Flickr badge thing) in a markdown page is scrubbed.
+
+<http://geekout.org.uk/actual_route/>
+
+I assumed one can mix and match HTML in markdown?
+
+> I checked, htmlscrubber is really disabled. So it's not scrubbing
+> anything. Really.
+>
+> What is going on is you have found a bug in markdown. If you feed the
+> file into markdown directly you will see a bunch of weird hashes
+> like 67255d9b2a988139c95269498399f10a in place of your html,
+> so this is like [[!debbug 380212]] but that was with the other version of
+> markdown; I have not seen the current version behave this way before.
+>
+> One workaround is to wrap the html in a span tag. Another is to switch
+> the page from .mdwn to .html and enable the html plugin. I guess I'd pick
+> the first if I were you.
+>
+> I've filed [[!debbug 647128]] about this. Since it's not really an ikiwiki
+> bug, I'll close this one. [[done]] --[[Joey]]
diff --git a/doc/bugs/htmlscrubber_undoes_email_obfuscation_by_Text::Markdown.mdwn b/doc/bugs/htmlscrubber_undoes_email_obfuscation_by_Text::Markdown.mdwn
new file mode 100644
index 000000000..12018d517
--- /dev/null
+++ b/doc/bugs/htmlscrubber_undoes_email_obfuscation_by_Text::Markdown.mdwn
@@ -0,0 +1,37 @@
+From the source of [[usage]]:
+
+ <a href="mailto:joey@ikiwiki.info">&#x6A;&#111;&#101;&#x79;&#64;i&#107;&#105;w&#105;&#107;&#x69;&#46;&#105;n&#x66;&#x6F;</a>
+
+Text::Markdown obfuscates email addresses in the href= attribute and in the text.
+Apparently this can't be configured.
+
+HTML::Scrubber doesn't set `attr_encoded` for its HTML::Parser, so the href= attribute is decoded.
+Currently it seems it doesn't set `attr_encoded` for good reason: so attributes can be sanitized easily,
+e.g. as in htmlscrubber with `$safe_url_regexp`.
+This apparently can't be configured either.
+
+So I can't see an obvious solution to this.
+Perhaps improvements to Text::Markdown or HTML::Scrubber can allow a fix.
+
+One question is: how useful is email obfuscation?
+Don't spammers use HTML parsers?
+
+> I now see this was noted in the formatting [[/ikiwiki/formatting/discussion]], and won't/can't be fixed.
+> So I guess this is [[done]]. --Gabriel
+
+I've [[patch]]ed mdwn.pm to prevent Text::Markdown from obfuscating the emails.
+The relevant commits are on the master branch of [my "fork" of ikiwiki on Github] [github]:
+
+- 7d0970adbcf0b63e7e5532c239156f6967d10158
+- 52c241e723ced4d7c6a702dd08cda37feee75531
+
+--Gabriel.
+
+[github]: http://github.com/gmcmanus/ikiwiki/
+
+> Thanks for coming up with a patch, but overriding
+> `Text::Markdown::_EncodeEmailAddress` gets into its internals more than
+> I'm comfortable with.
+>
+> It would probably be best to add an option to [[!cpan Text::Markdown]] to
+> let the email address munging be disabled. --[[Joey]]
diff --git a/doc/bugs/htmltidy_has_no_possibilty_to_use_an_alternative_config_file_which_may_break_other_usages.mdwn b/doc/bugs/htmltidy_has_no_possibilty_to_use_an_alternative_config_file_which_may_break_other_usages.mdwn
new file mode 100644
index 000000000..2882eeb12
--- /dev/null
+++ b/doc/bugs/htmltidy_has_no_possibilty_to_use_an_alternative_config_file_which_may_break_other_usages.mdwn
@@ -0,0 +1,26 @@
+The htmltidy plugin as in the Backports.org version 2.32.3~bpo40+1 of ikiwiki does not play well with other usages of HTML Tidy, since it provides no way to use an alternative config file.
+
+E.g. since I usually use HTML Tidy manually only to check and not to fix HTML, I have "markup: no" in my $HOME/.tidyrc, which throws an awful lot of Perl warnings and renders all ikiwiki pages empty as soon as I enable htmltidy.
+
+I see two possibilities how to fix this:
+
+1) Replace "$pid=open2(*IN, *OUT, 'tidy -quiet -asxhtml -utf8 --show-body-only yes --show-warnings no --tidy-mark no');"
+with "$pid=open2(*IN, *OUT, 'tidy -quiet -asxhtml -utf8 --show-body-only yes --show-warnings no --tidy-mark no --markup yes');"
+-- This is the fastest fix, but not very elegant, since it doesn't solve the general problem.
+
+2) Make it configurable via ikiwiki.setup, as e.g. with the tags plugin. Haven't looked into this code yet.
+
+> I don't understand why you're talking about setting --write-back. The
+> htmltidy plugin communicates with tidy using stdio. No files are used, so
+> write-back settings should be irrelevant. --[[Joey]]
+
+>> Hmmm, ok. Well, it didn't work. Empty pages, Perl Warnings. I moved my $HOME/.tidyrc away and it worked again. Had a short look into it and the only obvious non-default setting I found was write-back. I'll check what exactly caused the breakage and let you know. --[[XTaran]]
+
+>>> Ok, found it. It indeed wasn't `write-back`, but `markup: no`. (I usually only want to see warnings and errors, not the fixed markup.) I now've corrected this in the bug report above. --[[XTaran]]
+
+> Ok, so should I pass --markup yes, or should I force it not to use
+> ~/.tidyrc? I can do that (by setting HOME to /dev/null), but there seems
+> to be no way to override it reading /etc/tidy.conf, so options there can
+> still screw things up. I guess I'll pass --markup yes and deal with
+> overriding other problem settings from config files if they're found
+> later. --[[Joey]] [[!tag done]]
diff --git a/doc/bugs/http_proxy_for_openid.mdwn b/doc/bugs/http_proxy_for_openid.mdwn
new file mode 100644
index 000000000..566896ec3
--- /dev/null
+++ b/doc/bugs/http_proxy_for_openid.mdwn
@@ -0,0 +1,86 @@
+If I try to authenticate using openid to my site, it tries to create a http or https connection to the openid server. This doesn't work, because the direct connection is blocked by the firewall.
+
+It would be good if ikiwiki supported setting up a proxy server to solve this.
+
+I have found if I add:
+
+ newenviron[i++]="HTTPS_PROXY=http://host.domain.com:3128";
+
+to IkiWiki/Wrapper.pm it solves the problem for https requests, however it obviously would be preferred if the proxy name is not hard coded.
+
+Also, the ability to set HTTPS\_CA\_FILE and HTTPS\_CA\_DIR might benefit some people. Then again, I can't see any evidence that the SSL certificate of the server is being checked. See the [[bug_report|ssl_certificates_not_checked_with_openid]] I filed on this separate issue.
+
+Unfortunately, HTTP\_PROXY doesn't work for http:// requests, it looks like that library is different.
+
+---
+
+Update 2008-10-26:
+
+Better solution, one that works for both http and https, and uses config options. It appears to work...
+
+Note that using $ua->proxy(['https'], ...); won't work, you get a "Not Implemented" error, see <http://community.activestate.com/forum-topic/lwp-https-requests-proxy>. Also see [[!debbug 129528]].
+
+Also note that the proxy won't work with liblwpx-paranoidagent-perl, I had to remove liblwpx-paranoidagent-perl first.
+
+<pre>
+louie:/usr/share/perl5/IkiWiki/Plugin# diff -u openid.pm.old openid.pm
+--- openid.pm.old 2008-10-26 12:18:58.094489360 +1100
++++ openid.pm 2008-10-26 12:40:05.763429880 +1100
+@@ -165,6 +165,14 @@
+ $ua=LWP::UserAgent->new;
+ }
+
++ if (defined($config{"http_proxy"})) {
++ $ua->proxy(['http'], $config{"http_proxy"});
++ }
++
++ if (defined($config{"https_proxy"})) {
++ $ENV{HTTPS_PROXY} = $config{"https_proxy"};
++ }
++
+ # Store the secret in the session.
+ my $secret=$session->param("openid_secret");
+ if (! defined $secret) {
+</pre>
+
+Brian May
+
+> Rather than adding config file settings for every useful environment
+> variable, there is a ENV config file setting that can be used to set
+> any environment variables you like. So, no change needed.
+> --[[Joey]]
+
+>> One thing I don't like about using ikiwiki for tracking bugs is I don't
+>> get notified when changes are made :-(.
+>>
+>> Anyway, if you look at the code I pasted above, the environment variables
+>> do not work for http:// - you have to use $ua->proxy(...) for them.
+>> This is significant, because all openid servers in my version appear to have been
+>> defined with http:// not https:// in /usr/share/ikiwiki/openid-selector/ikiwiki/openid/openid-jquery.js
+>>
+>> Use $ua->env_proxy() to get it to read the environment variables. Then http:// does work.
+>>
+>> Unfortunately this breaks https:// even more - but nothing I do seems to make https:// work anymore.
+
+
+>>> LWP::UserAgent defaults to not caring about proxy settings in
+>>> the environment. (To give control over the result, I guess?)
+>>> To get it to care, pass `env_proxy => 1` to the constructor. Affected
+>>> plugins: aggregate, openid, pinger. This probably wants to be on
+>>> by default, and might not need to be configurable. --[[schmonz]]
+
+>>>> Okay, in a real-world scenario it does need to be
+>>>> configurable. A working implementation (tested with aggregate,
+>>>> not tested with the other two plugins) is in my git, commit
+>>>> 91c46819dee237a281909b0c7e65718eb43f4119. --[[schmonz]]
+
+>>>>> Oh, and according to the LWPx::ParanoidAgent docs, "proxy support is
+>>>>> explicitly removed", so if ikiwiki can preferentially find that
+>>>>> installed, even with the above commit, `openid` won't be able to
+>>>>> traverse a proxy. --[[schmonz]]
+
+[[!template id=gitbranch branch=schmonz/proxies author="[[schmonz]]"]]
+
+>>>>> I bollixed up my git, recloned, and reapplied the diffs, so
+>>>>> that commit won't exist anymore. My proxy-related changes are
+>>>>> now on a branch. --[[schmonz]]
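For comparison, a sketch (Python, illustrative only; not ikiwiki code): Python's urllib honours the same lowercase `http_proxy`/`https_proxy` environment variables that LWP's `$ua->env_proxy()` reads, which makes it easy to check what a proxy-aware client in a given environment would see. The proxy URL below is the placeholder host from the report:

```python
import os
import urllib.request

# Placeholder proxy from the report above; purely illustrative.
os.environ['http_proxy'] = 'http://host.domain.com:3128'
os.environ['https_proxy'] = 'http://host.domain.com:3128'

# getproxies_environment() reads the same *_proxy variables that
# LWP::UserAgent->env_proxy() does.
proxies = urllib.request.getproxies_environment()
print(proxies['http'])   # -> http://host.domain.com:3128
print(proxies['https'])  # -> http://host.domain.com:3128
```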
diff --git a/doc/bugs/httpauth_conflicts_with_git_anon_push.mdwn b/doc/bugs/httpauth_conflicts_with_git_anon_push.mdwn
new file mode 100644
index 000000000..91507f57a
--- /dev/null
+++ b/doc/bugs/httpauth_conflicts_with_git_anon_push.mdwn
@@ -0,0 +1,25 @@
+Someone tried to report a bug using IRC while I was on vacation.
+--[[Joey]]
+
+<pre>
+julm: [11:58:35] han, it's me the problem; I was generating a post-update hook instead of a pre-receive hook
+julm: [12:03:59] why does the pre-receive hook return: "Status: 302 Found" and a "Location: <url>"? Is it normal?
+julm: [00:08:44] it's Plugin/httpauth.pm which is outputing those Status and Location
+julm: [00:09:12] problem is that it's an anonymous push via git://
+julm: [03:28:53] hacked my way to fix it somehow: http://git.internet.alpes.fr.eu.org/?p=web/ikiwiki.git;a=commitdiff;h=7211df4f7457c3afab53822a97cbd21825c473f4
+</pre>
+
+Analysis:
+
+* IkiWiki::Receive calls `check_canedit`.
+* httpauth's canedit hook returns an error handler function
+ which redirects the browser through the cgiauthurl.
+  (Similarly, signinedit's hook may call needsignin, which
+  can display a signin form.)
+* That doesn't work well when doing a git anon push. :)
+* Also, IkiWiki::Receive calls `check_canattach` and
+ `check_canremove`, which both also call `check_canedit`.
+
+So, all these calls need to avoid running the error handler
+functions returned by canedit hooks, and just return error
+messages. [[done]] --[[Joey]]
diff --git a/doc/bugs/ikiwiki-mass-rebuild_fails_to_drop_privileges_and_execute_ikiwiki.mdwn b/doc/bugs/ikiwiki-mass-rebuild_fails_to_drop_privileges_and_execute_ikiwiki.mdwn
new file mode 100644
index 000000000..2eaca7632
--- /dev/null
+++ b/doc/bugs/ikiwiki-mass-rebuild_fails_to_drop_privileges_and_execute_ikiwiki.mdwn
@@ -0,0 +1,30 @@
+The ikiwiki-mass-rebuild utility fails to drop privileges and fails to execute ikiwiki on FreeBSD.
+
+The solution is to set the effective UID after setting the real UID, and to set $PATH in the environment before calling exec().
+
+> Why does the PATH need to be reset? --[[Joey]]
+
+> > It doesn't - it needs to be set. The line with `%ENV=();` clears the environment, thus no `$PATH` is set. --[[HenrikBrixAndersen]]
+
+> > > I guess it shouldn't clear it then. Both [[done]] --[[Joey]]
+
+Proposed patch:
+
+ --- ikiwiki-mass-rebuild.orig 2007-08-15 22:21:59.000000000 +0200
+ +++ ikiwiki-mass-rebuild 2007-10-25 13:04:10.000000000 +0200
+ @@ -22,13 +22,14 @@ sub processline {
+ my ($uuid, $ugid) = (getpwnam($user))[2, 3];
+ $)="$ugid $ugid";
+ $(=$ugid;
+ - $>=$uuid;
+ $<=$uuid;
+ + $>=$uuid;
+ if ($< != $uuid || $> != $uuid || $( != $ugid || $) ne "$ugid $ugid") {
+ die "failed to drop permissions to $user";
+ }
+ %ENV=();
+ $ENV{HOME}=(getpwnam($user))[7];
+ + $ENV{PATH}="/usr/bin:/usr/local/bin";
+ exec("ikiwiki", "-setup", $setup, @ARGV);
+ die "failed to run ikiwiki: $!";
+ }
diff --git a/doc/bugs/ikiwiki-transition_does_not_set_perl_moduels_path_properly.mdwn b/doc/bugs/ikiwiki-transition_does_not_set_perl_moduels_path_properly.mdwn
new file mode 100644
index 000000000..b3e87b529
--- /dev/null
+++ b/doc/bugs/ikiwiki-transition_does_not_set_perl_moduels_path_properly.mdwn
@@ -0,0 +1,17 @@
+When installing ikiwiki, the perl module path is set up correctly:
+
+ use lib '/usr/local/ikiwiki-3.20100312/share/perl/5.10.0';
+
+This is not true for ikiwiki-transition:
+
+ $ PATH=/usr/local/ikiwiki-3.20100312/bin ikiwiki-transition prefix_directives ikiwiki.setup
+ Can't locate IkiWiki.pm in @INC (@INC contains: /etc/perl /usr/local/lib/perl/5.10.0
+ /usr/local/share/perl/5.10.0 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.10 /usr/share/perl/5.10 /usr/local/lib/site_perl .)
+ at /usr/local/ikiwiki-3.20100312/bin/ikiwiki-transition line 4.
+ BEGIN failed--compilation aborted at /usr/local/ikiwiki-3.20100312/bin/ikiwiki-transition line 4.
+
+The missing line should be added.
+
+Thanks!
+
+[[done]] --[[Joey]]
diff --git a/doc/bugs/ikiwiki.setup:_syntax_error_at___40__eval_5__41___line_120__44___at_EOF.mdwn b/doc/bugs/ikiwiki.setup:_syntax_error_at___40__eval_5__41___line_120__44___at_EOF.mdwn
new file mode 100644
index 000000000..0f38c4949
--- /dev/null
+++ b/doc/bugs/ikiwiki.setup:_syntax_error_at___40__eval_5__41___line_120__44___at_EOF.mdwn
@@ -0,0 +1,9 @@
+Extremely minor bug, but ikiwiki.setup has a line that reads:
+
+ #tagbase => "tag".
+
+At the end of this line is a period. I think it should read:
+
+ #tagbase => "tag",
+
+[[bugs/done]]
diff --git a/doc/bugs/ikiwiki.setup_require_blank_rcs_to_work_as_cgi_only.mdwn b/doc/bugs/ikiwiki.setup_require_blank_rcs_to_work_as_cgi_only.mdwn
new file mode 100644
index 000000000..8ac6eeb09
--- /dev/null
+++ b/doc/bugs/ikiwiki.setup_require_blank_rcs_to_work_as_cgi_only.mdwn
@@ -0,0 +1,46 @@
+I have a mobile setup of ikiwiki.
+
+* On my server, I set up an ikiwiki + svn
+* On my laptop, I did a check out of the svn and used it as the src of a local ikiwiki (local apache server)
+
+I wanted to be able to change my wiki off line with cgi only, and just commit as I come back online.
+
+During the laptop setup, I think that the setup was confused by the fact that my src directory is a versioned one.
+I had to set rcs => "" explicitly to force it to work the way I wanted.
+
+Should it be documented?
+
+>> `rcs => ""` is only needed with ikiwiki before version 1.37. As of 1.37,
+>> you have to manually enable a `rcs => "svn"` etc to get it to use a
+>> revision control system. So no, I don't need to document the `rcs => ""`
+>> thing. --[[Joey]]
+
+> after some tests, the main trouble with this setup is that it won't make the "svn add" for the new files.
+> I wonder what should be the setup if I want to use ikiwiki off line on the laptop and then commit back the changes,
+> without having to take care of the new files before svn committing...
+> --hb (hugues)
+
+>> Hi. It sounds like you want a distributed RCS, where you can branch and commit changes locally and periodically
+>> push changes back. What I do is use svk, which is a distributed RCS based on svn, edit using text editors on my
+>> laptop, and periodically `svk push` up to the server, which triggers a rebuild on the server. I think [[Joey]]
+>> works this way too, but I'm not sure. If you don't like editing pages "by hand" then maybe you should look at
+>> [[rcs/git]] or [[rcs/mercurial]] -- they should theoretically allow you to run apache on a working copy which is itself
+>> a branch of a working copy running on another machine, but I haven't used them so I don't know. --Ethan
+
+>>> Well, hand editing is just what I'm doing sometimes; it's just using subversion, in fact.
+>>> But, yes, someone told me about git, which seems to allow what you are describing. In fact, my need is typically
+>>> to have 2 ikiwiki web frontends on two (or more) different machines, with one machine sometimes off-line.
+>>> Imagine a team of auditors that want to report and collaborate on a wiki, but are not always connected.
+
+>>>> I don't use svk, I just let the ikwiki on my laptop update a source
+>>>> directory that's a subversion checkout, and I manually commit changes
+>>>> back to the central wiki using svn. Yes, this does include manually
+>>>> adding new files. It would be possible to hack the svn module so that it
+>>>> added new files, but did not ever try to commit. Maybe call that a halfsvn
+>>>> module or something.
+>>>>
+>>>> Of course git may be a better fit. It would also be possible to add
+>>>> svk support to ikiwiki, although I've not done it yet. --[[Joey]]
+
+Looks like this is not really a bug, so I'm considering it [[bugs/done]]..
+--[[Joey]]
diff --git a/doc/bugs/ikiwiki__39__s_ViewVC_down.mdwn b/doc/bugs/ikiwiki__39__s_ViewVC_down.mdwn
new file mode 100644
index 000000000..9e533813b
--- /dev/null
+++ b/doc/bugs/ikiwiki__39__s_ViewVC_down.mdwn
@@ -0,0 +1,3 @@
+ikiwiki's ViewVC, linked on the download page and used for RecentChanges and History, appears down. --[[JoshTriplett]]
+
+[[fixed|bugs/done]]
diff --git a/doc/bugs/ikiwiki_cgi_fails_to_build_on_Solaris_due_to_missing_LOCK__95__EX.mdwn b/doc/bugs/ikiwiki_cgi_fails_to_build_on_Solaris_due_to_missing_LOCK__95__EX.mdwn
new file mode 100644
index 000000000..aca1ef106
--- /dev/null
+++ b/doc/bugs/ikiwiki_cgi_fails_to_build_on_Solaris_due_to_missing_LOCK__95__EX.mdwn
@@ -0,0 +1,43 @@
+When enabling the cgi wrapper I get:
+
+    generating wrappers..
+    "/var/www/cgi-bin/ikiwiki.cgi.c", line 91: warning: implicit function declaration: flock
+    "/var/www/cgi-bin/ikiwiki.cgi.c", line 91: undefined symbol: LOCK_EX
+    cc: acomp failed for /var/www/cgi-bin/ikiwiki.cgi.c
+    failed to compile /var/www/cgi-bin/ikiwiki.cgi.c
+
+The compiler is:
+
+    $ cc -V
+    cc: Sun C 5.9 SunOS_i386 Patch 124868-01 2007/07/12
+
+
+I don't know enough C to provide a patch, but from googling it, people seem to be suggesting fcntl has an alternative.
+
+
+-----
+
+
+changing
+
+ if (lockfd != -1 && flock(lockfd, LOCK_EX) == 0) {
+
+
+to read
+
+ if (lockfd != -1 && lockf(lockfd, F_LOCK,0) == 0) {
+
+
+in IkiWiki/Wrapper.pm lets it compile. According to
+<http://man-wiki.net/index.php/3:lockf>, "On Linux, this call is just an
+interface for fcntl(2)". Does this seem like a sensible fix?
+
+> Don't see why not. flock was used only because it's being used
+> in the same file for testing some other locks.
+>
+> While lockf's fcntl locks are not inherited across a fork,
+> that doesn't matter for this lock, which is only used to
+> prevent more than one ikiwiki perl process being run at a time.
+> Nor is there any need to be compatible with some other user of this
+> lock; it's only checked in one place. [[applied|done]]
+> --[[Joey]]
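The flock-to-lockf swap can be exercised from Python too, whose `fcntl.lockf` wraps the same `fcntl(2)`-style locking that `lockf(3)` provides. A sketch for illustration only; the real fix is the one-line C edit above:

```python
import fcntl
import tempfile

def exclusive_lock_ok(path):
    # Take and release an exclusive fcntl-style lock, as the patched
    # wrapper does with lockf(lockfd, F_LOCK, 0) in place of
    # flock(lockfd, LOCK_EX).
    with open(path, 'w') as lockfd:
        fcntl.lockf(lockfd, fcntl.LOCK_EX)   # blocks until the lock is ours
        # ... critical section: only one ikiwiki process at a time ...
        fcntl.lockf(lockfd, fcntl.LOCK_UN)
    return True

with tempfile.NamedTemporaryFile() as f:
    print(exclusive_lock_ok(f.name))  # -> True
```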
diff --git a/doc/bugs/ikiwiki_ignores_PATH_environment.mdwn b/doc/bugs/ikiwiki_ignores_PATH_environment.mdwn
new file mode 100644
index 000000000..6781d4b4b
--- /dev/null
+++ b/doc/bugs/ikiwiki_ignores_PATH_environment.mdwn
@@ -0,0 +1,24 @@
+At the very top of the main ikiwiki executable script the `PATH` environment is set like this:
+
+ $ENV{PATH}="/usr/local/bin:/usr/bin:/bin:/opt/local/bin";
+
+This makes it a little hard to specify which specific binaries should be used, especially if there is more than one of them available (cf. <http://trac.macports.org/ticket/26333>, where the MacPorts-supplied, up-to-date subversion should be used and not an arcane one from the base distro / OS). Is there a specific reason why ikiwiki wipes out `$PATH` like this, or could that line be improved to
+
+ $ENV{PATH}="$ENV{PATH}:/usr/local/bin:/usr/bin:/bin:/opt/local/bin";
+
+? The alternative is of course to patch ikiwiki as suggested in the bug, but I wanted to ask here first :)
+
+> You can use the ENV setting in your setup file to set any environment
+> variables you like. Since ikiwiki.cgi is run by the web browser, that
+> is the best way to ensure ikiwiki always runs with a given variable set.
+>
+> As a suid program, the ikiwiki wrappers have to sanitize the environment.
+> The ikiwiki script's own sanitization of PATH was done to make perl taint
+> checking happy, but as taint checking is disabled anyway, I have removed
+> that. [[done]] --[[Joey]]
+
+Question: Do ikiwiki.cgi and the RCS post-commit script sanitize the $PATH separately from bin/ikiwiki? If not, then bin/ikiwiki is probably right to sanitize the $PATH; otherwise you've created a security hole with access to the account that ikiwiki is SUID to. It'd be nice if /opt/local/bin were earlier in the $PATH, but that can be changed (as noted) in the setup file. [[Glenn|geychaner@mac.com]] (Also the person who started this by filing an issue with MacPorts; I'm experimenting with ikiwiki for collaborative documentation.)
+
+> The suid wrappers remove all environment variables except for a few used
+> for CGI. PATH is not propagated by them, so when they run ikiwiki it will
+> get the system's default path now. --[[Joey]]
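The environment scrubbing a suid wrapper performs (keep a small whitelist of CGI variables, fix PATH, drop everything else) can be sketched like this. The whitelist and PATH below are illustrative assumptions, not ikiwiki's exact lists:

```python
# Illustrative whitelist; the real wrapper keeps a few CGI variables and
# drops everything else, including any attacker-controlled PATH.
KEEP = {'REMOTE_ADDR', 'QUERY_STRING', 'REQUEST_METHOD', 'HTTP_COOKIE'}

def sanitize(env):
    clean = {k: v for k, v in env.items() if k in KEEP}
    clean['PATH'] = '/usr/local/bin:/usr/bin:/bin'  # fixed, trusted PATH
    return clean

env = {'PATH': '/tmp/evil:/usr/bin', 'LD_PRELOAD': '/tmp/evil.so',
       'REQUEST_METHOD': 'GET'}
print(sanitize(env))
```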
diff --git a/doc/bugs/ikiwiki_lacks_a_--quiet.mdwn b/doc/bugs/ikiwiki_lacks_a_--quiet.mdwn
new file mode 100644
index 000000000..48fa3b068
--- /dev/null
+++ b/doc/bugs/ikiwiki_lacks_a_--quiet.mdwn
@@ -0,0 +1,29 @@
+When building ikiwiki in the background, having a --quiet which will only
+report errors would be nice. -- RichiH
+
+> Except for building wrappers, and possibly progress cruft output to
+> stderr by git (gag), ikiwiki is quiet by default. --[[Joey]]
+
+>> Correct, which means it's not quite quiet:
+
+ % ikiwiki --setup foo --rebuild
+ generating wrappers..
+ successfully generated foo
+ successfully generated foo
+ rebuilding wiki..
+ scanning [...]
+ [...]
+ building [...]
+ [...]
+ done
+
+Yes, I can simply redirect the output, but an option would be cleaner, imo. -- Richard
+
+> The output above looks like verbose mode output to me (the scanning/building lines, at least). Check you haven't enabled it in your setup file by accident. I get the following:
+
+ $ ikiwiki --setup setup
+ successfully generated [cgi]
+ successfully generated [post-update]
+ skipping bad filename [...]
+
+> I've written a patch ([[merged|done]], pull request sent) that fixes the 'generated...' lines. -- [[Jon]]
diff --git a/doc/bugs/ikiwiki_overzealously_honours_locks_when_asked_for_forms.mdwn b/doc/bugs/ikiwiki_overzealously_honours_locks_when_asked_for_forms.mdwn
new file mode 100644
index 000000000..1e74fe8db
--- /dev/null
+++ b/doc/bugs/ikiwiki_overzealously_honours_locks_when_asked_for_forms.mdwn
@@ -0,0 +1,34 @@
+When an `ikiwiki` instance is holding a lock, a web user clicking on "add comment" (for example) will have to wait for the lock to be released. However, all they are then presented with is a web form. Perhaps CGI requests that are read-only (such as generating a comment form, or perhaps certain types of edits) should ignore locks? Of course, I'd understand that the submission would need to wait for a lock. — [[Jon]]
+
+> Ikiwiki has what I think of as the Big Wiki Lock (remembering the "Big
+> Kernel Lock"). It takes the exclusive lock before loading any state,
+> to ensure that any changes to that state are made safely.
+>
+> A few CGI actions that don't need that info loaded do avoid taking the
+> lock.
+>
+> In the case of showing the comment form, the comments
+> plugin needs CGI session information to be loaded, so it can check if
+> the user is logged in, and so it can add XSRF prevention tokens based on
+> the session ID. (Actually, it might be possible to rely on
+> `CGI::Session`'s own locking of the sessions file, and have a hook that
+> runs with a session but before the indexdb is loaded.)
+>
+> But, the comment form also needs to load the indexdb, in order to call
+> `check_canedit`, which matches a pagespec, which can need to look things
+> up in the indexdb. (Though the pagespecs that can do that are unlikely
+> to be relevant when posting a comment.)
+>
+> I've thought about trying to get rid of the Big Wiki Lock from time to
+> time. It's difficult though; if two ikiwikis are both making changes
+> to the stored state, it's hard to see a way to reconcile them. (There
+> could be a daemon that all changes are fed thru using a protocol, but
+> that's really complicated, and it'd almost be better to have a single
+> daemon that just runs ikiwiki; a major architectural change.)
+>
+> One way that *almost* seems like it could work is to have an entry path
+> that loads everything read-only, without a lock. And then in read-only
+> mode, `saveindex` would be an error to run. However, both the commenting
+> code and the page edit code currently have the same entry path for
+> drawing the form as is used for handling the posted form, so they would
+> need to be adapted to separate that into two code paths. --[[Joey]]
diff --git a/doc/bugs/ikiwiki_renders___39__28__39___if_external_plugins_return_nothing.mdwn b/doc/bugs/ikiwiki_renders___39__28__39___if_external_plugins_return_nothing.mdwn
new file mode 100644
index 000000000..8d73dfa86
--- /dev/null
+++ b/doc/bugs/ikiwiki_renders___39__28__39___if_external_plugins_return_nothing.mdwn
@@ -0,0 +1,12 @@
+If the rst2html procedure of the rst external plugin returns None (e.g. when it throws an exception), then ikiwiki will render
+
+ <div id="content">
+ 2/8
+ </div>
+
+In addition to the broken plugin, this seems like a bug in ikiwiki, which should probably output an informational message about the plugin returning an invalid value.
+
+--[[madduck]]
+
+> [[done]], I made it print the thrown error message to stderr, and return
+> "", which seems better than dying of the thrown error entirely. --[[Joey]]
diff --git a/doc/bugs/images_in_inlined_pages_have_wrong_relative_URL.mdwn b/doc/bugs/images_in_inlined_pages_have_wrong_relative_URL.mdwn
new file mode 100644
index 000000000..8cda7a70f
--- /dev/null
+++ b/doc/bugs/images_in_inlined_pages_have_wrong_relative_URL.mdwn
@@ -0,0 +1,15 @@
+I can make an image link, such as:
+
+ ![image](image.jpg)
+
+That will render as ![image](image.jpg).
+
+If I then inline that page, the (relative) URL no longer points to the right place. The fix for this promises to be hairy.
+
+> Similarly, if you insert a relative link using the markdown link syntax,
+> it will tend to break when the page is inlined.
+>
+> However, there is a simple way to avoid both problems: Use WikiLinks
+> and/or the [[img_directive|ikiwiki/directive/img]]. --[[Joey]]
+
+[[!tag done]]
diff --git a/doc/bugs/img_plugin_and_class_attr.mdwn b/doc/bugs/img_plugin_and_class_attr.mdwn
new file mode 100644
index 000000000..7e880b4fc
--- /dev/null
+++ b/doc/bugs/img_plugin_and_class_attr.mdwn
@@ -0,0 +1,27 @@
+The [[plugins/img]] plugin is not generating the proper `class`
+attribute in its HTML output.
+
+The plugin receives something like the following:
+
+ \[[!img 129199047595759991.jpg class="centered"]]
+
+And it is supposed to generate HTML like the following:
+
+ <img src="129199047595759991.jpg" class="centered" />
+
+But it is generating the following:
+
+ <img src="129199047595759991.jpg" class="centered img" />
+
+This seems to happen with all images inserted using the plugin (that is,
+all that use the `class=yaddayadda` argument to the `img` directive). I
+remember it didn't happen before, and I suspect an ikiwiki upgrade is to
+blame. I tested with a blog created from scratch, containing a single
+post, and the problem appeared there too.
+
+This is happening with version 3.20100815 of ikiwiki.
+
+[[jerojasro]]
+
+> How is this a bug? It's perfectly legal html for a class attribute to
+> put an element into multiple classes. [[notabug|done]] --[[Joey]]
diff --git a/doc/bugs/img_plugin_and_missing_heigth_value.mdwn b/doc/bugs/img_plugin_and_missing_heigth_value.mdwn
new file mode 100644
index 000000000..4bc070c95
--- /dev/null
+++ b/doc/bugs/img_plugin_and_missing_heigth_value.mdwn
@@ -0,0 +1,7 @@
+When I set up my picture page with `\[[!img defaults size=300x]]`, the HTML validator complains that the value for height is missing, and the IE browsers won't show the pictures at all; no problems with Firefox, though. If I set up my picture page with `\[[!img defaults size=300x300]]`, then all the images are stretched oddly. What am I doing wrong?
+
+> This is a bug. --[[Joey]]
+
+> And .. [[fixed|done]] --[[Joey]]
+
+>> Not quite; for some reason it requires me to update wiki pages twice before the height value shows up.
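+
+For reference, the `size=300x` form is meant to fix the width and scale the height to preserve the aspect ratio, while `300x300` forces both dimensions, which is why those images look stretched. A rough sketch of the intended computation (illustrative Python, not the plugin's actual Perl):
+
```python
def scaled_dimensions(orig_w, orig_h, size):
    """Parse an ikiwiki-style size spec such as '300x', 'x200' or
    '300x300' and return (width, height), filling in the blank side
    from the original aspect ratio.  Illustrative sketch only."""
    w_str, h_str = size.split("x")
    if w_str and h_str:
        return int(w_str), int(h_str)          # both forced: may distort
    if w_str:
        w = int(w_str)
        return w, round(orig_h * w / orig_w)   # height follows width
    h = int(h_str)
    return round(orig_w * h / orig_h), h       # width follows height
```
+
+So `size=300x` applied to a 600x400 original should come out as `width=300 height=200` in the emitted tag.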
diff --git a/doc/bugs/img_plugin_causes_taint_failure.mdwn b/doc/bugs/img_plugin_causes_taint_failure.mdwn
new file mode 100644
index 000000000..14de17557
--- /dev/null
+++ b/doc/bugs/img_plugin_causes_taint_failure.mdwn
@@ -0,0 +1,20 @@
+The img plugin causes a taint failure if one tries to link a scaled image, e.g.
+
+ \[[!img foo.png size=64x64]]
+
+ .ikiwiki.setup: Insecure dependency in mkdir while running with -T switch at /usr/lib/perl5/vendor_perl/5.8.8/IkiWiki.pm line 360.
+ BEGIN failed--compilation aborted at (eval 5) line 109.
+
+If one omits the size argument, it works. And once it has worked, the taint failure will not happen again unless one `rm -r`'s the destdir.
+
+Seen with ikiwiki 2.30
+
+> And what version of perl? See [[Insecure_dependency_in_mkdir]] et al.
+> Also, the debian build of ikiwiki has taint checking disabled to avoid
+> this perl bug. Did you build your own? Set NOTAINT=1 when building.
+> --[[Joey]]
+
+>> perl-5.8.8, I've created a package for openSUSE. Will file perl bug there as well then.
+
+I've gone ahead and turned off taint checking by default now.
+[[done]]
diff --git a/doc/bugs/img_plugin_renders___60__img__62___tag_without_src_attribute_post-2.20.mdwn b/doc/bugs/img_plugin_renders___60__img__62___tag_without_src_attribute_post-2.20.mdwn
new file mode 100644
index 000000000..6b861c397
--- /dev/null
+++ b/doc/bugs/img_plugin_renders___60__img__62___tag_without_src_attribute_post-2.20.mdwn
@@ -0,0 +1,36 @@
+I upgraded from 2.18 to 2.32.3 and my \[[img]] links became `<img>` tags without `src` attributes. Reverting to 2.18 fixes the problem. I assume the culprit is 2.20, which is the only release between 2.19 and 2.32.3 to mention img in the changelog.
+
+> I can't reproduce this. For example, page using the img plugin with a
+> current version of ikiwiki: <http://kitenet.net/~joey/pics/debconf/7/>
+> Feel free to mail me a recipe to reproduce it.
+> --[[Joey]]
+
+I've done some research and found that `$imgtag` seems properly created in `img.pm`:`preprocess`. Thus, something must be changing it later. Disabling all plugins and leaving only `img` enabled does not fix the problem.
+
+I added `print STDERR "$imgtag\n";` to `img.pm` after the `my $imgtag='<img src="'.$imgurl.`... statement. This is what happens:
+
+ seamus:~/web/madduck.net/wc> ikiwiki --setup ../ikiwiki.setup --render blog/2008.02.17:the-penny-martin-adventure-phase-2.mdwn | grep img
+ <img src="../img/2008.02.17:chairs_on_the_beach.jpg" alt="Chilling
+ at the beach with chairs" width="301" height="200" class="center" />
+ <img src="../img/2008.02.17:gates_of_haast.jpg" alt="The Gates of Haast" width="487" height="150" class="center" />
+ <img src="../img/2008.02.17:camping_in_cascade_valley.jpg" alt="Camping in Cascade Valley" width="526" height="140" class="center" />
+ <img src="../img/2008.02.17:monteith_keg_urinal.jpg" alt="Keg
+ urinals at Monteith's brewery in Greymouth" width="87" height="150" class="float-right" />
+ <img src="../img/2008.02.17:pancake_rocks_and_nikau_loving.jpg" alt="Pancake rocks
+ at Punakaiki, and Penny loving a Nikau tree" width="582" height="150" class="center" />
+ <img src="../img/2008.02.17:phoenix_foundation_in_concert.jpg" alt="The Phoenix
+ Foundation in concert in Wellington" width="565" height="150" class="center" />
+ <p><a><img alt="Chilling
+ <p><a><img alt="The Gates of Haast" width="487" height="150" class="center" /></a></p>
+ <p><a><img alt="Camping in Cascade Valley" width="526" height="140" class="center" /></a></p>
+ <p><a><img alt="Keg
+ <p><a><img alt="Pancake rocks
+ <p><a><img alt="The Phoenix
+
+Something is eating the src= attribute.
+
+I have been unable to reproduce this outside of the `madduck.net` website...
+
+**Update**: this is the same bug as [[the_colon-in-links_bug|No_link_for_blog_items_when_filename_contains_a_colon]]
+
+[[!tag done]]
diff --git a/doc/bugs/img_plugin_should_pass_through_class_attribute.mdwn b/doc/bugs/img_plugin_should_pass_through_class_attribute.mdwn
new file mode 100644
index 000000000..f72ecade2
--- /dev/null
+++ b/doc/bugs/img_plugin_should_pass_through_class_attribute.mdwn
@@ -0,0 +1,49 @@
+I wanted to make images float left or right, so I thought it would be nice to be able to pass a class attribute through the img plugin to the final img tag.
+
+An example of the feature in use can be seen here (notice class="floatleft" and class="floatright"):
+
+ http://www.cworth.org/
+
+And here's a patch to implement it. Will this survive markdown munging? It seems quite unlikely... How does one protect a block like this? Oh well, we'll see what happens.
+
+> thanks, [[done]] --[[Joey]]
+
+-Carl
+
+ From 405c29ba2ef97a514bade33ef826e71fe825962b Mon Sep 17 00:00:00 2001
+ From: Carl Worth <cworth@cworth.org>
+ Date: Wed, 23 May 2007 15:27:51 +0200
+ Subject: [PATCH] img plugin: Pass a class attribute through to the final img tag.
+
+ This is particularly useful for allowing the image to float.
+ For example, in my usage I use class=floatleft and then
+ in the css do .floatleft { float: left; }.
+ ---
+ Plugin/img.pm | 12 +++++++++---
+ 1 files changed, 9 insertions(+), 3 deletions(-)
+
+ diff --git a/Plugin/img.pm b/Plugin/img.pm
+ index 7226231..3eb1ae7 100644
+ --- a/Plugin/img.pm
+ +++ b/Plugin/img.pm
+ @@ -93,9 +93,15 @@ sub preprocess (@) {
+ $imgurl="$config{url}/$imglink";
+ }
+
+ - return '<a href="'.$fileurl.'"><img src="'.$imgurl.
+ - '" alt="'.$alt.'" width="'.$im->Get("width").
+ - '" height="'.$im->Get("height").'" /></a>';
+ + my $result = '<a href="'.$fileurl.'"><img src="'.$imgurl.
+ + '" alt="'.$alt.'" width="'.$im->Get("width").
+ + '" height="'.$im->Get("height").'" ';
+ + if (exists $params{class}) {
+ + $result .= ' class="'.$params{class}.'"';
+ + }
+ + $result .= '/></a>';
+ +
+ + return $result;
+ }
+
+ 1
+ --
+ 1.5.1.gee969
diff --git a/doc/bugs/img_vs_align.mdwn b/doc/bugs/img_vs_align.mdwn
new file mode 100644
index 000000000..5eb4489b0
--- /dev/null
+++ b/doc/bugs/img_vs_align.mdwn
@@ -0,0 +1,38 @@
+The *[[ikiwiki/directive/img]]* directive allows for specifying an
+*align* parameter -- which is of limited usability as the image is
+embedded as `<p><img ...></p>`. That's at least what I see on
+<http://www.bddebian.com:8888/~hurd-web/hurd/status/>. On the other
+hand, CSS is supposed to be used instead, I guess. (But how... it seems
+I've forgotten almost all of my CSS foo again ;-).) --[[tschwinge]]
+
+> [[!img logo/ikiwiki.png align=right]]The img tag doesn't create P tags, but if you have surrounded the img directive with newlines, they will result in paragraph tags.
+>
+> I've edited the URL you provided to demonstrate this -- hope you don't mind! I've also added an inline, right-aligned image to this page.[[!tag done]]
+> -- [[Jon]]
+
+> Contrary to all of the above, html does not care about P tags when
+> floating an image to the left or right via align. Proof:
+> <http://kitenet.net/~joey/pics/toomanypicturesofjoey/>, where the image
+> is in its own paragraph but still floats. Also, I re-modified a local
+> copy of the hurd page to enclose the image in a P, and it still floats.
+>
+> Tested with Chromium and Firefox. --[[Joey]]
+
+>> Uh, sorry for not checking my assumption against the relevant
+>> standard. It just seemed too obvious to me that the closure of
+>> `<p>...</p>` would confine whatever is embedded inside it.
+>> (Meaning, I didn't expect the *img*'s alignment to propagate
+>> beyond the *p* and thus be visible from the outside.)
+>>
+>> I confirm (Firefox, Ubuntu jaunty) that your picture page is being
+>> shown correctly -- thus I suppose that there's a buglet in our CSS
+>> scripts again...
+>>
+>> --[[tschwinge]]
+
+>>> It seems the 'align=right' parameter gets filtered in my installation.
+>>> Are there other plugins that could throw the parameter away?
+>>> --[[jwalzer]]
+
+>>>> Can't think of anything. htmlscrubber doesn't; tidy doesn't.
+>>>> --[[Joey]]
diff --git a/doc/bugs/img_with_alt_has_extra_double_quote.mdwn b/doc/bugs/img_with_alt_has_extra_double_quote.mdwn
new file mode 100644
index 000000000..81bbe7fb5
--- /dev/null
+++ b/doc/bugs/img_with_alt_has_extra_double_quote.mdwn
@@ -0,0 +1,32 @@
+The [[ikiwiki/directive/img]] directive emits an extra double quote if alt=x is
+specified (as is necessary for valid HTML). This results in malformed HTML,
+like this:
+
+ <img src="U" width="W" height="H"" alt="A" />
+ ^
+
+This [[patch]] is available from the img-bugfix branch in my git repository:
+
+ commit a648c439f3467571374daf597e9b3a659ea2008f
+ Author: Simon McVittie <smcv@ http://smcv.pseudorandom.co.uk/>
+ Date: 2009-06-16 17:15:06 +0100
+
+ img plugin: do not emit a redundant double-quote before alt attribute
+
+ diff --git a/IkiWiki/Plugin/img.pm b/IkiWiki/Plugin/img.pm
+ index a697fea..a186abd 100644
+ --- a/IkiWiki/Plugin/img.pm
+ +++ b/IkiWiki/Plugin/img.pm
+ @@ -121,7 +121,7 @@ sub preprocess (@) {
+ my $imgtag='<img src="'.$imgurl.
+ '" width="'.$im->Get("width").
+ '" height="'.$im->Get("height").'"'.
+ - (exists $params{alt} ? '" alt="'.$params{alt}.'"' : '').
+ + (exists $params{alt} ? ' alt="'.$params{alt}.'"' : '').
+ (exists $params{title} ? ' title="'.$params{title}.'"' : '').
+ (exists $params{class} ? ' class="'.$params{class}.'"' : '').
+ (exists $params{id} ? ' id="'.$params{id}.'"' : '').
+
+--[[smcv]]
+
+[[done]] --[[Joey]]
diff --git a/doc/bugs/index.html__63__updated.mdwn b/doc/bugs/index.html__63__updated.mdwn
new file mode 100644
index 000000000..2f20d66bb
--- /dev/null
+++ b/doc/bugs/index.html__63__updated.mdwn
@@ -0,0 +1,15 @@
+After editing a page `pagename`, ikiwiki redirects to `pagename/index.html?updated`. Ignoring for the moment that ?updated seems like a bad idea to begin with, this should at least not introduce /index.html into the URL.
+
+> The "?updated" works around caching issues with certain broken browsers,
+> web proxys, and/or webservers. These assume that since the "?" is there,
+> the page is not static, or is a different page, thus forcing the page to
+> be reloaded and the edited version seen. So no, not a bad idea, really.
+>
+> Removing the index.html would probably break this workaround.
+> http://foo/bar/?updated will redirect to http://foo/bar/index.html, and
+> said broken software will then display its old out of date cached
+> version.
+>
+> So, not changing this. [[!tag done]]
+>
+> --[[Joey]]
diff --git a/doc/bugs/index.html_is_made_visible_by_various_actions.mdwn b/doc/bugs/index.html_is_made_visible_by_various_actions.mdwn
new file mode 100644
index 000000000..561801578
--- /dev/null
+++ b/doc/bugs/index.html_is_made_visible_by_various_actions.mdwn
@@ -0,0 +1,16 @@
+When you do various CGI actions, "index.html" is visible in the redirection URL. It's desirable that this is avoided, so there is only one visible URL for each page (search engines don't think that /foo/index.html is equivalent to /foo/, since this is not necessarily true for all servers and configurations).
+
+[The beautify branch in my repository](http://git.debian.org/?p=users/smcv/ikiwiki.git;a=shortlog;h=refs/heads/beautify) contains [[patches|patch]] for all the cases I found by grepping for "htmlpage", which are:
+
+* [[plugins/editpage]] redirects you to the page under various circumstances, most visibly after you finish editing it
+* [[plugins/poll]] redirects you to the poll after voting
+* [[plugins/recentchanges]] redirects you to the relevant page when you click a link
+* [[plugins/remove]] redirects you to the parent of the removed page
+
+I think the coding standard in future should be: use htmlpage when you want a local file, or urlto if you want a URL.
+
+> Agreed, and I've updated the docs accordingly. Merged your changes. BTW,
+> did you notice they included a fix in passing for a bug in the recentchanges
+> redirection? [[done]] --[[Joey]]
+
+>> No, I suppose I must have assumed htmlpage already did the pagetype check... --[[smcv]]
diff --git a/doc/bugs/iniline_breaks_toc_plugin.mdwn b/doc/bugs/iniline_breaks_toc_plugin.mdwn
new file mode 100644
index 000000000..a99f8d6da
--- /dev/null
+++ b/doc/bugs/iniline_breaks_toc_plugin.mdwn
@@ -0,0 +1,64 @@
+Hi, I'll try to make an example that reproduces a bug involving the inline and toc plugins.
+My friend uses
+
+ \[[!inline pages="users/joey" raw="yes"]]
+
+to include common snippets in various pages, and it works as advertised, but if the toc plugin is used, the page is messed up.
+
+I'll try here to reproduce one example...
+
+You can see that the wikilink is rendered on one line and 'Here's a paragraph' is not a separate paragraph.
+
+All this is displayed correctly if toc is removed.
+
+> Thanks for the clear explanation and example. This is a bug in markdown.
+> Version 1.0.1 gets confused by the toc div, which is followed by a few
+> lines of markdown, and then by a second div for the inlined page. It
+> doesn't convert the markdown in the middle to html.
+>
+> I've tested with markdown 1.0.2 and it seems to render this ok. 1.0.2 has
+> a much improved parser, which is known to fix several block-related
+> problems, including this one. It is currently packaged in experimental in
+> Debian, and I suggest you give it a try if you're experiencing this
+> problem.
+>
+> (Actually, this seems closely related to the problem described
+> [here](http://bugs.debian.org/421843).)
+>
+> I'm going to close this bug report since it's a markdown bug. --[[Joey]]
+[[!tag done]]
+
+>> thanks, that fixes it.
+
+
+TEST1
+=====
+
+TEST2
+-----
+
+bla bla bla
+
+ * bl bla
+ * aadsd
+
+[[!toc levels=2]]
+
+bla bla
+
+[[!table data="""
+Customer|Amount
+Fulanito|134,34
+Menganito|234,56
+Menganito|234,56
+"""]]
+
+
+[[wikilink|users]]
+
+Here's a paragraph.
+
+
+[[!inline pages="users/joey" raw="yes"]]
+
+bla bla
diff --git a/doc/bugs/inline_action_buttons_circumvent_exclude_criteria_from_edittemplate__39__s_match__61____34____34___pagespec.mdwn b/doc/bugs/inline_action_buttons_circumvent_exclude_criteria_from_edittemplate__39__s_match__61____34____34___pagespec.mdwn
new file mode 100644
index 000000000..2e2d35381
--- /dev/null
+++ b/doc/bugs/inline_action_buttons_circumvent_exclude_criteria_from_edittemplate__39__s_match__61____34____34___pagespec.mdwn
@@ -0,0 +1,15 @@
+ikiwiki version: 3.20100815.2
+
+If I instruct edittemplate to only match the top-level pages in a directory using
+
+ match="foo/* and !foo/*/* and !foo/*/*/*"
+
+everything works as expected for pages created via links on other wiki pages. So, if I open foo/bar (or any other page on the wiki) and create a link to foo/bar/bug, edittemplate appropriately does not insert any text into the new page.
+
+However, if I use an inline directive like the following
+
+ !inline pages="page(foo/bar/*)" rootpage="foo/bar" postform=yes actions=yes
+
+every page created via the action buttons incorrectly pulls in the text from the edittemplate registration. Changing the order of the conditions in the match="" pagespec has no impact.
+
+> [[fixed|done]] --[[Joey]]
diff --git a/doc/bugs/inline_archive_crash.mdwn b/doc/bugs/inline_archive_crash.mdwn
new file mode 100644
index 000000000..a1139a9bc
--- /dev/null
+++ b/doc/bugs/inline_archive_crash.mdwn
@@ -0,0 +1,6 @@
+ \[[!inline Error: Undefined subroutine &HTML::Entities::encode_numeric called at /usr/share/perl5/IkiWiki/Plugin/meta.pm line 292.]]
+
+This occurred when recentchanges was disabled and building a change
+to a page that inlined other pages with archive=yes. I have
+committed a fix; filing a bug since the fix won't be landing in Debian stable any
+time soon. [[done]] --[[Joey]]
diff --git a/doc/bugs/inline_breaks_PERMALINK_variable.mdwn b/doc/bugs/inline_breaks_PERMALINK_variable.mdwn
new file mode 100644
index 000000000..fc891bb25
--- /dev/null
+++ b/doc/bugs/inline_breaks_PERMALINK_variable.mdwn
@@ -0,0 +1,25 @@
+in 3.20091017 the following inline
+
+> `\[[!inline pages="internal(foo/bar/baz/*)" show=3 archive="yes" feeds="no" template="sometemplates"]]`
+
+with sometemplate being
+
+> `<p><a href="<TMPL_VAR PERMALINK>"><TMPL_VAR TITLE></a> (<TMPL_VAR CTIME>)</p>`
+
+produced output that links nowhere (`<a href="">`), while the other variables work fine. This problem does not occur in 3.1415926.
+
+> This must be caused by an optimisation that avoids reading the page
+> content when using a template that does not use CONTENT.
+>
+> I guess that it needs to instead check all the variables the template
+> uses, and read content if PERMALINK, or probably any other unknown
+> variable is used. Unfortunatly, that will lose the optimisation
+> for the archivepage template as well -- it also uses PERMALINK.
+>
+> So, revert the optimisation? Or, make meta gather the permalink
+> data on scan? That seems doable, but is not a general fix for
+> other stuff that might be a) used in a template and b) gathered
+> at preprocess time.
+>
+> For now, I am going with the special case fix of fixing meta. I may need
+> to go for a more general fix later. --[[Joey]] [[!tag done]]
diff --git a/doc/bugs/inline_from_field_empty_if_rootpage_doesn__39__t_exist.mdwn b/doc/bugs/inline_from_field_empty_if_rootpage_doesn__39__t_exist.mdwn
new file mode 100644
index 000000000..61aeff244
--- /dev/null
+++ b/doc/bugs/inline_from_field_empty_if_rootpage_doesn__39__t_exist.mdwn
@@ -0,0 +1,20 @@
+If I put something like the below in my index.mdwn
+
+ <<!inline pages="posts/* and !*/discussion" rootpage="posts" show="10">>
+
+But posts doesn't exist, I get the following in index.html
+
+ <input type="hidden" name="do" value="blog" />
+ <input type="hidden" name="from" value="" />
+ <input type="hidden" name="subpage" value="1" />
+
+When I create posts (touch posts.mdwn), I get the following in index.html
+
+ <input type="hidden" name="do" value="blog" />
+ <input type="hidden" name="from" value="posts" />
+ <input type="hidden" name="subpage" value="1" />
+
+Bug?
+
+> Yes, thanks for reminding me I need to do something about that... [[done]]
+> --[[Joey]]
diff --git a/doc/bugs/inline_page_not_updated_on_removal.mdwn b/doc/bugs/inline_page_not_updated_on_removal.mdwn
new file mode 100644
index 000000000..fc626cab1
--- /dev/null
+++ b/doc/bugs/inline_page_not_updated_on_removal.mdwn
@@ -0,0 +1,9 @@
+If a page inlines some other page (such as this page by the bugs page),
+and the page is removed (such as by this page being linked to bugs/done),
+the inlining page is not updated to remove it.
+
+This only happens if the page is removed from the inlined pagespec due to
+a tag changing; the problem is that once the tag is changed, ikiwiki does
+not know that the page used to match before.
+
+[[done]]
diff --git a/doc/bugs/inline_plugin_rootpage_option_is_not_case_insensitive.mdwn b/doc/bugs/inline_plugin_rootpage_option_is_not_case_insensitive.mdwn
new file mode 100644
index 000000000..8ef128117
--- /dev/null
+++ b/doc/bugs/inline_plugin_rootpage_option_is_not_case_insensitive.mdwn
@@ -0,0 +1,9 @@
+If rootpage is "foo/Bar" and a directory "foo/bar" exists already, the new pages go into the "foo/Bar" directory instead of the existing "foo/bar".
+
+But maybe this is intended behavior? --rdennis
+
+> It is. Ikiwiki does not try to subvert the case sensitivity of the
+> filesystem. It just saves you from having to get the case right when
+> referring to existing files by wikilinks or pagespecs. --[[Joey]]
+
+[[!tag done]]
diff --git a/doc/bugs/inline_plugin_sets_editurl_even_when_editpage_is_disabled.html b/doc/bugs/inline_plugin_sets_editurl_even_when_editpage_is_disabled.html
new file mode 100644
index 000000000..62c91a932
--- /dev/null
+++ b/doc/bugs/inline_plugin_sets_editurl_even_when_editpage_is_disabled.html
@@ -0,0 +1,16 @@
+see subject, simple patch below
+<pre>
+--- a/IkiWiki/Plugin/inline.pm
++++ b/IkiWiki/Plugin/inline.pm
+@@ -371,7 +371,8 @@ sub preprocess_inline (@) {
+ }
+ if (length $config{cgiurl} && defined $type) {
+ $template->param(have_actions => 1);
+- $template->param(editurl => cgiurl(do => "edit", page => $page));
++ $template->param(editurl => cgiurl(do => "edit", page => $page))
++ if IkiWiki->can("cgi_editpage");
+ }
+ }
+</pre>
+
+[[done]] --[[Joey]]
diff --git a/doc/bugs/inline_raw_broken_on_unknown_pagetype.mdwn b/doc/bugs/inline_raw_broken_on_unknown_pagetype.mdwn
new file mode 100644
index 000000000..19aa94e7e
--- /dev/null
+++ b/doc/bugs/inline_raw_broken_on_unknown_pagetype.mdwn
@@ -0,0 +1,29 @@
+When trying to insert the raw content of an attached shell script
+called `whatever` using:
+
+ \[[!inline pages="whatever" raw="yes"]]
+
+The generated HTML contains:
+
+ \[[!inline Erreur: Can't call method "param" on an undefined value
+ at /usr/local/share/perl/5.10.0/IkiWiki/Plugin/inline.pm
+ line 346.]]
+
+Looking at the inline plugin's code, it is clear that `$template` is
+undef in such a situation. Defining `$template` just before line 346,
+in case it's not defined, removes the error message, but nothing
+gets inlined as `get_inline_content` returns the empty string in
+this situation.
+
+If we explicitly don't want to allow raw inlining of unknown page
+types, ikiwiki should output a better error message.
+
+> I have made it just do a direct include if the page type is not known, in
+> raw mode. That seems useful if you want to include some other file right
+> into a page. You could probably even wrap it in a format directive.
+>
+> It does allow including binary files right into a page, but nothing is
+> stopping you pasting binary data right into the edit form either, so
+> while annoying I don't think that will be a security problem. --[[Joey]]
+
+[[done]]
diff --git a/doc/bugs/inline_skip_causes_empty_inline.mdwn b/doc/bugs/inline_skip_causes_empty_inline.mdwn
new file mode 100644
index 000000000..e1cbc5470
--- /dev/null
+++ b/doc/bugs/inline_skip_causes_empty_inline.mdwn
@@ -0,0 +1,10 @@
+When using the [[directive/inline]] directive with the skip parameter, I get
+an empty inline list (no output at all). The same inline used to work before,
+but not in 3.20091031.
+
+> I need more information to help. Skip is working as expected here.
+> Can I download/clone your wiki? --[[Joey]]
+
+>> The bug occurs only together with archive="yes" as I Just found out:
+
+>>> Thanks, [[fixed|done]] in git. --[[Joey]]
diff --git a/doc/bugs/inline_sort-by-title_issues.mdwn b/doc/bugs/inline_sort-by-title_issues.mdwn
new file mode 100644
index 000000000..ff4555067
--- /dev/null
+++ b/doc/bugs/inline_sort-by-title_issues.mdwn
@@ -0,0 +1,57 @@
+The [[plugins/inline]] plugin has a `sort="title"` option that causes the pages in the list to be sorted by title rather than creation time. The [[plugins]] list on this wiki was recently changed to use this option. If you look at the plugin page, you'll notice that it doesn't look correctly sorted. e.g. `attach (third party plugin)` falls between `conditional` and `default content for *copyright* and *license*`.
+
+I think the problem here is that the pages are being sorted by their path, whereas only the basename is displayed. This makes the example above:
+
+ * plugins/conditional
+ * plugins/contrib/attach
+ * plugins/contrib/default content for *copyright* and *license*
+
+and now you can see why it is ordered that way, and why later on we get:
+
+ * plugins/contrib/unixauth (third party plugin)
+ * plugins/creole
+
+which appears to list `unixauth` before `creole`.
+
+I'm not sure what the best fix is. One fix would be to add another sort option, `sort="path"`, that would use the current (broken) sort by title. Then add a true `sort="title"` that actually sorts on the title. It might also be interesting to modify the sort=path to actually list the full path in the links - that way it would be obvious how it is sorted. Or you could ignore the idea for `sort="path"`, and tell people to use [[plugins/map]] for that.
+
+--[[users/Will]]
+
+And here is a [[patch]] for this. It makes `sort=title` actually sort on the title, and adds `sort=path` if you really want to sort on the path. `sort=path` still only displays titles. Just use map if you want more.
+
+ diff --git a/IkiWiki/Plugin/inline.pm b/IkiWiki/Plugin/inline.pm
+ index 9c336e7..99f6de3 100644
+ --- a/IkiWiki/Plugin/inline.pm
+ +++ b/IkiWiki/Plugin/inline.pm
+ @@ -185,9 +185,12 @@ sub preprocess_inline (@) {
+ }
+ }
+
+ - if (exists $params{sort} && $params{sort} eq 'title') {
+ + if (exists $params{sort} && $params{sort} eq 'path') {
+ @list=sort @list;
+ }
+ + elsif (exists $params{sort} && $params{sort} eq 'title') {
+ + @list=sort { lc(pagetitle(basename($a))) cmp lc(pagetitle(basename($b))) } @list;
+ + }
+ elsif (exists $params{sort} && $params{sort} eq 'mtime') {
+ @list=sort { $pagemtime{$b} <=> $pagemtime{$a} } @list;
+ }
+ diff --git a/doc/ikiwiki/blog.mdwn b/doc/ikiwiki/blog.mdwn
+ index 19ec7ac..7608628 100644
+ --- a/doc/ikiwiki/blog.mdwn
+ +++ b/doc/ikiwiki/blog.mdwn
+ @@ -89,7 +89,8 @@ Here are some less often needed parameters:
+ inlining page.
+ * `sort` - Controls how inlined pages are sorted. The default, "age" is to
+ sort newest created pages first. Setting it to "title" will sort pages by
+ - title, and "mtime" sorts most recently modified pages first.
+ + title, "path" sorts by the path to the page, and "mtime" sorts most
+ + recently modified pages first.
+ * `reverse` - If set to "yes", causes the sort order to be reversed.
+ * `feedshow` - Specify the maximum number of matching pages to include in
+ the rss/atom feeds. The default is the same as the `show` value above.
+
+> Thanks for the patch. [[done]], but I left off the sort=path. Also left
+> off the lc (if you ask your locale to sort case-sensitively, it should, I
+> think). --[[Joey]]
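+
+To make the difference concrete, here is a small sketch (illustrative Python; the patch above is Perl) of the two orderings, using pages from the example:
+
```python
pages = [
    "plugins/conditional",
    "plugins/contrib/attach",
    "plugins/creole",
]

# sort=path: plain lexical sort on the full page path (the old behaviour)
by_path = sorted(pages)

# sort=title: sort on the displayed basename only, case-insensitively
by_title = sorted(pages, key=lambda p: p.rsplit("/", 1)[-1].lower())
```
+
+`by_path` leaves `plugins/contrib/attach` between `plugins/conditional` and `plugins/creole`, which is exactly the misordering reported above, while `by_title` orders them attach, conditional, creole.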
diff --git a/doc/bugs/inline_sort_order_and_meta_date_value.mdwn b/doc/bugs/inline_sort_order_and_meta_date_value.mdwn
new file mode 100644
index 000000000..d4ec8f345
--- /dev/null
+++ b/doc/bugs/inline_sort_order_and_meta_date_value.mdwn
@@ -0,0 +1,314 @@
+I have a directory containing two files. f1 (<http://alcopop.org/~jon/repro2/src/blog/debgtd.html>) has
+
+ meta date="2008-07-02 14:13:17"
+
+f2 (<http://alcopop.org/~jon/repro2/src/blog/moving.html>) has
+
+ meta date="2008-07-02 21:04:21"
+
+They have both been modified recently:
+
+ >>> stat(f1)
+ (33188, 459250L, 65027L, 1, 1000, 1000, 1686L, 1227967177, 1227966706, 1227966706)
+ >>> stat(f2)
+ (33188, 458868L, 65027L, 1, 1000, 1000, 938L, 1227967187, 1227966705, 1227966705)
+
+Note that f1 is fractionally newer than f2 in terms of ctime and mtime, but f2 is much newer in terms of the meta information.
+
+Another page includes them both via inline:
+
+ inline pages="blog/*" show=5
+
+The resulting page is rendered with f1 above f2, seemingly not using the meta directive information: <http://alcopop.org/~jon/repro2/dest/blog/>. The footer in the inline pages does use the correct time e.g. <em>Posted Wed 02 Jul 2008 14:13:17 BST</em>.
+
+If I instead include them using creation_year in the pagespec, they are ordered correctly.
+
+<http://alcopop.org/~jon/repro2/> contains the src used to reproduce this, the precise ikiwiki invocation (inside Makefile) and the results (dest).
+
+-- [[users/Jon]]
+
+
+> On Ikiwiki 2.53.3 (Debian Lenny), my inlines are also sorted using mtime
+> by default -- despite what the [[documentation|/ikiwiki/directive/inline]]
+> says -- but setting the supposed default sort order manually produces the
+> correct results. For example, the following inline sorts my blog
+> entries using their meta date or ctime:
+>
+> inline pages="blog/*" show="10" sort="age"
+>
+> I'll try to look at the code this weekend and see if age is really the
+> default sort order.
+>
+> -- [David A. Harding](http://dtrt.org), 2008-12-20
+
+Here is the code. As you can see, sort="age" is equivalent to leaving
+out the sort option. --[[Joey]]
+
+ if (exists $params{sort} && $params{sort} eq 'title') {
+ @list=sort { pagetitle(basename($a)) cmp pagetitle(basename($b)) } @list;
+ }
+ elsif (exists $params{sort} && $params{sort} eq 'mtime') {
+ @list=sort { $pagemtime{$b} <=> $pagemtime{$a} } @list;
+ }
+ elsif (! exists $params{sort} || $params{sort} eq 'age') {
+ @list=sort { $pagectime{$b} <=> $pagectime{$a} } @list;
+ }
+ else {
+ return sprintf(gettext("unknown sort type %s"), $params{sort});
+ }
+
+> On further testing, I find that the bug is limited to the first time
+> creation time should be used and has nothing to do with setting the sort
+> parameter. Revised steps to reproduce: --[David A. Harding](http://dtrt.org), 2008-12-20
+>
+> 1. Create pages that sort different by mtime and ctime
+>
+> 2. inline pages="somepages/*"
+>
+> 3. ikiwiki --setup setup_file
+>
+> 4. Pages are output incorrectly in mtime order
+>
+> 5. ikiwiki --setup setup_file
+>
+> 6. Pages are output correctly in ctime order
+>
+> 7. Create new page in somepages/, set its ctime to earlier than another
+> page in sompages/
+>
+> 8. ikiwiki --setup setup_file
+>
+> 9. All previously sorted pages output correctly in ctime order but new
+> page is output incorrectly at the top as if its mtime was its ctime
+>
+> 10. ikiwiki --setup setup_file
+>
+> 11. All pages, including new page, are output correctly in ctime order
+
+You're confusing ctime and creation time. This is perhaps not surprising, as
+ikiwiki uses the term 'ctime' to refer to creation time. However, the unix
+ctime value is not the same thing. Unix ctime can change if a file changes
+owner, or in some cases, permissions. Unix ctime also always changes
+when the file is modified. Ikiwiki wants a first creation date of the file,
+and it accomplishes this by recording the initial ctime of a file the first
+time it processes it, and *preserving* that creation time forever, ignoring
+later ctime changes.
+
+I suspect that this, coupled with the fact that ikiwiki sorts newest pages
+first, explains everything you describe. If not, please send me a shell script
+test case I can run, as instructions like "Create pages that sort different by
+mtime and ctime" are not ones that I know how to follow (given that touch sets
+*both*). --[[Joey]]
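
The record-and-preserve behaviour Joey describes can be sketched in a few lines. This is an illustrative model in Python, not ikiwiki's actual Perl; the `index` dict stands in for ikiwiki's persistent index:

```python
import os

def creation_time(path, index):
    """Return the recorded creation time for path.

    The current Unix ctime is stored only the first time the file
    is seen; later ctime changes (edits, chmod, chown) are ignored,
    which is how ikiwiki's notion of 'ctime' stays stable."""
    if path not in index:
        index[path] = os.stat(path).st_ctime
    return index[path]
```

A second run never overwrites the stored value, so touching or chmod-ing a file later cannot reorder previously sorted pages.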
+
+> Sorry. I conflated Unix ctime and ikiwiki's creation time because I
+> didn't think the difference was important to this case. I'm a writer,
+> and I should have known better -- I apologize. Revised steps to
+> reproduce are below; feel free to delete this whole misunderstanding
+> to make the bug report more concise.
+>
+> 1. Create pages in the srcdir that should sort in one order by
+> last-modification time and in a different order by original creation
+> time. For example:
+>
+> $ echo -e '\[[!meta date="2007-01-01"]]\nNew.' > test/new.mdwn
+> $ echo -e '\[[!meta date="2006-01-01"]]\nOld.' > test/old.mdwn
+>
+> 2. Create a page that includes the files. For example:
+>
+> $ echo '\[[!inline pages="test/*"]]' > sort-test.mdwn
+>
+> 3. Run ikiwiki. For example `ikiwiki --setup setup_file`
+>
+> 4. Pages are output incorrectly in modification time order. For example,
+> actual partial HTML of the sort-test/index.html from commands used
+> above (whitespace-only lines removed; one whitespace-only line
+> added):
+>
+> <div class="inlinepage">
+> <span class="header">
+> <a href="./../test/old/">old</a>
+> </span>
+> <p>Old.</p>
+> <span class="pagedate">
+> Posted Sun 01 Jan 2006 12:00:00 AM EST
+> </span>
+> </div>
+>
+> <div class="inlinepage">
+> <span class="header">
+> <a href="./../test/new/">new</a>
+> </span>
+> <p>New.</p>
+> <span class="pagedate">
+> Posted Mon 01 Jan 2007 12:00:00 AM EST
+> </span>
+> </div>
+>
+> 5. Run ikiwiki again with the same command line. For example: `ikiwiki --setup setup_file`
+>
+> 6. Pages are output correctly in creation time order. For example,
+> actual partial HTML of the sort-test/index.html from commands used
+> above (whitespace-only lines removed; one whitespace-only line
+> added):
+>
+> <div class="inlinepage">
+> <span class="header">
+> <a href="./../test/new/">new</a>
+> </span>
+> <p>New.</p>
+> <span class="pagedate">
+> Posted Mon 01 Jan 2007 12:00:00 AM EST
+> </span>
+> </div>
+>
+> <div class="inlinepage">
+> <span class="header">
+> <a href="./../test/old/">old</a>
+> </span>
+> <p>Old.</p>
+> <span class="pagedate">
+> Posted Sun 01 Jan 2006 12:00:00 AM EST
+> </span>
+> </div>
+>
+> 7. Create a new page with the current Unix mtime and Unix ctime, but a
+> !meta date before the most recent creation date of another page.
+> For example:
+>
+> $ echo -e '\[[!meta date="2005-01-01"]]\nOlder.' > test/older.mdwn
+>
+> 8. Run ikiwiki again with the same command line. For example: `ikiwiki --setup setup_file`
+>
+> 9. All previously sorted pages output correctly in order of their
+> creation time, but the new page is output incorrectly at the top as
+> if its modification time was its creation time. For example,
+> actual partial HTML of the sort-test/index.html from commands used
+> above (whitespace-only lines removed; two whitespace-only
+> lines added):
+>
+> <div class="inlinepage">
+> <span class="header">
+> <a href="./../test/older/">older</a>
+> </span>
+> <p>Older.</p>
+> <span class="pagedate">
+> Posted Sat 01 Jan 2005 12:00:00 AM EST
+> </span>
+> </div>
+>
+> <div class="inlinepage">
+> <span class="header">
+> <a href="./../test/new/">new</a>
+> </span>
+> <p>New.</p>
+> <span class="pagedate">
+> Posted Mon 01 Jan 2007 12:00:00 AM EST
+> </span>
+> </div>
+>
+> <div class="inlinepage">
+> <span class="header">
+> <a href="./../test/old/">old</a>
+> </span>
+> <p>Old.</p>
+> <span class="pagedate">
+> Posted Sun 01 Jan 2006 12:00:00 AM EST
+> </span>
+> </div>
+>
+> 10. Run ikiwiki again with the same command line. For example: `ikiwiki --setup setup_file`
+>
+> 11. All pages, including new page, are output correctly in creation time
+> order. For example, actual partial HTML of the sort-test/index.html
+> from commands used above (whitespace-only lines removed; two
+> whitespace-only lines added):
+>
+> <div class="inlinepage">
+> <span class="header">
+> <a href="./../test/new/">new</a>
+> </span>
+> <p>New.</p>
+> <span class="pagedate">
+> Posted Mon 01 Jan 2007 12:00:00 AM EST
+> </span>
+> </div>
+>
+> <div class="inlinepage">
+> <span class="header">
+> <a href="./../test/old/">old</a>
+> </span>
+> <p>Old.</p>
+> <span class="pagedate">
+> Posted Sun 01 Jan 2006 12:00:00 AM EST
+> </span>
+> </div>
+>
+> <div class="inlinepage">
+> <span class="header">
+> <a href="./../test/older/">older</a>
+> </span>
+> <p>Older.</p>
+> <span class="pagedate">
+> Posted Sat 01 Jan 2005 12:00:00 AM EST
+> </span>
+> </div>
+>
+> File status after all the above actions:
+>
+> $ stat test/*
+> File: `test/new.mdwn'
+> Size: 33 Blocks: 8 IO Block: 4096 regular file
+> Device: ca20h/51744d Inode: 684160 Links: 1
+> Access: (0644/-rw-r--r--) Uid: ( 1000/ harding) Gid: ( 1000/ harding)
+> Access: 2008-12-20 21:48:32.000000000 -0500
+> Modify: 2008-12-20 21:27:03.000000000 -0500
+> Change: 2008-12-20 21:27:03.000000000 -0500
+> File: `test/older.mdwn'
+> Size: 35 Blocks: 8 IO Block: 4096 regular file
+> Device: ca20h/51744d Inode: 684407 Links: 1
+> Access: (0644/-rw-r--r--) Uid: ( 1000/ harding) Gid: ( 1000/ harding)
+> Access: 2008-12-20 21:48:32.000000000 -0500
+> Modify: 2008-12-20 21:42:10.000000000 -0500
+> Change: 2008-12-20 21:42:10.000000000 -0500
+> File: `test/old.mdwn'
+> Size: 33 Blocks: 8 IO Block: 4096 regular file
+> Device: ca20h/51744d Inode: 684161 Links: 1
+> Access: (0644/-rw-r--r--) Uid: ( 1000/ harding) Gid: ( 1000/ harding)
+> Access: 2008-12-20 21:48:32.000000000 -0500
+> Modify: 2008-12-20 21:27:09.000000000 -0500
+> Change: 2008-12-20 21:27:09.000000000 -0500
+>
+> My ikiwiki configuration file (being used to port a blog from pyblosxom
+> to ikiwiki):
+>
+> harding@mail:~$ sed 's/#.*//; /^[ ]*$/d' .ikiwiki/gnuisance.setup
+> use IkiWiki::Setup::Standard {
+> wikiname => "HardingBlog",
+> adminemail => 'dave@dtrt.org',
+> srcdir => "/srv/backup/git/gnuisance.net",
+> destdir => "/srv/test.dtrt.org",
+> url => "http://srv.dtrt.org",
+> wrappers => [
+> ],
+> atom => 1,
+> syslog => 0,
+> prefix_directives => 1,
+> add_plugins => [qw{goodstuff tag}],
+> disable_plugins => [qw{passwordauth}],
+> tagbase => "tag",
+> }
+>
+> --[David A. Harding](http://dtrt.org/), 2008-12-20
+
+Thank you for a textbook excellent reproduction recipe.
+
+What appears to be going on here is that meta directives are not processed
+until the leaf pages are rendered, and thus the ctime setting is not
+available at the time that they are inlined, and the newer unix ctime is
+used. On the second build, the meta data has already been recorded.
+
+This can probably be avoided by processing meta date at scan time.
+
+Verified, fix works. [[done]]
+--[[Joey]]
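
The scan-time fix can be illustrated with a toy two-phase build. The page and index structures here are hypothetical, not ikiwiki's real ones: once the meta dates are recorded during the scan phase, the render-phase sort sees them on the very first build:

```python
def build(pages, meta_index):
    """One build pass over {name: (unix_ctime, meta_date_or_None)}.
    meta_index persists between runs, like ikiwiki's index file."""
    # scan phase: record every page's meta date before any rendering
    for name, (unix_ctime, meta_date) in pages.items():
        if meta_date is not None:
            meta_index[name] = meta_date
    # render phase: newest first, preferring recorded meta dates
    return sorted(pages,
                  key=lambda n: meta_index.get(n, pages[n][0]),
                  reverse=True)
```

With the buggy ordering, the first build sorted on the raw Unix ctimes because the meta dates had not been recorded yet; only the second build saw them.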
diff --git a/doc/bugs/install_into_home_dir_fails.mdwn b/doc/bugs/install_into_home_dir_fails.mdwn
new file mode 100644
index 000000000..e08b7484c
--- /dev/null
+++ b/doc/bugs/install_into_home_dir_fails.mdwn
@@ -0,0 +1,57 @@
+dunno if it's just me, but I had to add PREFIX in a few places to get 'perl INSTALL_BASE=$HOME' to work.
+
+> That will cause the files to be installed into a place that ikiwiki
+> doesn't look for them. It will also cause them to be installed into
+> /usr/etc by default, where ikiwiki also won't find them.
+>
+> Thomas Keller also ran into some sort of problem with the MacPort
+> involving the installation into /etc. From that discussion:
+>
+> Both ikiwiki-update-wikilist and ikiwiki-mass-rebuild hardcode /etc; so
+> do several pages in the doc wiki.
+>
+> The real problem though is that MakeMaker does not have a standard way
+> of specifying where /etc files go. In Debian we want everything to go
+> into /usr, rather than the default /usr/local, so set PREFIX=/usr -- but
+> we still want config files in /etc, not /usr/etc. The only way I can see
+> around this is to add a nonstandard variable to control the location of
+> /etc, that would override the PREFIX.
+>
+> Which implies that you can't just use "$installdir/etc/" in the /etc
+> hardcoding scripts, and would instead need to record the new variable
+> at build time, like PREFIX is recorded in $installdir.
+>
+> Instead, let's ignore failure of the lines. [[done]]
+>
+> --[[Joey]]
+
+<pre>
+From a1e02fbdaba3725730418a837b506e713904ada5 Mon Sep 17 00:00:00 2001
+From: David Bremner <bremner@pivot.cs.unb.ca>
+Date: Fri, 29 Aug 2008 15:18:24 -0300
+Subject: [PATCH] add missing $(PREFIX) to install path
+
+---
+ Makefile.PL | 6 +++---
+ 1 files changed, 3 insertions(+), 3 deletions(-)
+
+diff --git a/Makefile.PL b/Makefile.PL
+index 979483c..1f27394 100755
+--- a/Makefile.PL
++++ b/Makefile.PL
+@@ -50,9 +50,9 @@ extra_clean:
+ $(MAKE) -C po clean
+
+ extra_install:
+- install -d $(DESTDIR)/etc/ikiwiki
+- install -m 0644 wikilist $(DESTDIR)/etc/ikiwiki
+- install -m 0644 auto.setup $(DESTDIR)/etc/ikiwiki
++ install -d $(DESTDIR)$(PREFIX)/etc/ikiwiki
++ install -m 0644 wikilist $(DESTDIR)$(PREFIX)/etc/ikiwiki
++ install -m 0644 auto.setup $(DESTDIR)$(PREFIX)/etc/ikiwiki
+
+ install -d $(DESTDIR)$(PREFIX)/share/ikiwiki
+ for dir in `cd underlays && find . -follow -type d ! -regex '.*\.svn.*'`; do \
+--
+1.5.6.3
+</pre>
diff --git a/doc/bugs/installing_from_svn_copies_.svn_directories.mdwn b/doc/bugs/installing_from_svn_copies_.svn_directories.mdwn
new file mode 100644
index 000000000..dec916044
--- /dev/null
+++ b/doc/bugs/installing_from_svn_copies_.svn_directories.mdwn
@@ -0,0 +1,28 @@
+If you do an svn co, and then install from the generated WC, the makefile
+copies .svn directories to various locations:
+
+    $ find ~/ikidev-install/share/ikiwiki/ -name ".svn"
+    /home/glasserc/ikidev-install/share/ikiwiki/basewiki/smileys/.svn
+    /home/glasserc/ikidev-install/share/ikiwiki/basewiki/subpage/.svn
+    /home/glasserc/ikidev-install/share/ikiwiki/basewiki/wikiicons/.svn
+    /home/glasserc/ikidev-install/share/ikiwiki/basewiki/templates/.svn
+
+I think the guilty command is this one:
+
+    cp -aL basewiki/* /home/glasserc/ikidev-install//share/ikiwiki/basewiki
+
+(PREFIX is ~/ikidev-install here.)
+
+I don't know of a good fix. I wouldn't have discovered this except that I got
+permission denied errors when I tried to make a change and install again.
+--Ethan
+
+> I observed the same thing by running debuild from an SVN checkout and
+> getting lintian warnings about the .svn directories. --[[JoshTriplett]]
+
+> The .svn directories can be avoided in the deb by setting
+> DH_ALWAYS_EXCLUDE=.svn in the environment before building.
+> I prefer to use that kind of hack exterior to a package rather than
+> putting in RCS-specific exclude hacks.
+>
+> [[Done]] for the install from svn checkout case. --[[Joey]]
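
For the general problem of installing a tree without dragging VCS metadata along, an ignore filter at copy time does the job. This is an illustrative sketch using Python's `shutil`, not what ikiwiki's Makefile actually does:

```python
import shutil

def install_tree(src, dest):
    """Copy a directory tree, skipping VCS metadata such as .svn,
    unlike a plain 'cp -aL' which copies everything."""
    shutil.copytree(src, dest,
                    ignore=shutil.ignore_patterns(".svn", "CVS", ".git"))
```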
diff --git a/doc/bugs/internal_error:_smileys.mdwn_cannot_be_found.mdwn b/doc/bugs/internal_error:_smileys.mdwn_cannot_be_found.mdwn
new file mode 100644
index 000000000..9b2be8290
--- /dev/null
+++ b/doc/bugs/internal_error:_smileys.mdwn_cannot_be_found.mdwn
@@ -0,0 +1,41 @@
+I've just backported your ikiwiki 1.43 and installed it, but now
+I can't rebuild all my ikiwiki pages. When I run `ikiwiki --setup ikiwiki.setup`,
+then I can see the following error:
+
+ internal error: smileys.mdwn cannot be found
+ BEGIN failed--compilation aborted at (eval 5) line 111.
+
+I have smiley plugin enabled in my `ikiwiki.setup` file, but I've never
+had `smileys.mdwn` page.
+
+Probably the reason for the problem is that you've removed many pages
+from the `basewiki` directory and created symlinks for those pages, but they
+don't exist in the latest package:
+
+ $ LANG=C apt-cache policy ikiwiki
+ ikiwiki:
+ Installed: 1.43gpa1
+ Candidate: 1.43gpa1
+ Version Table:
+ *** 1.43gpa1 0
+ 500 http://gpa.net.icm.edu.pl sarge/main Packages
+ 100 /var/lib/dpkg/status
+
+ $ dpkg -L ikiwiki |grep smileys.mdwn
+
+--[[Paweł|ptecza]]
+
+This seems to be a bug in your 1.43gpa1 version, whatever that is.. In the package I built, I see:
+
+ joey@kodama:~>dpkg -L ikiwiki | grep smileys.mdwn
+ /usr/share/ikiwiki/basewiki/smileys.mdwn
+ joey@kodama:~>ls -l /usr/share/ikiwiki/basewiki/smileys.mdwn
+ -rw-r--r-- 1 root root 1643 Feb 13 18:03 /usr/share/ikiwiki/basewiki/smileys.mdwn
+
+--[[Joey]]
+
+> You're right. My backport was built without all symlinks, because I store
+> all rebuilt sources in a CVS repo, but it seems that CVS doesn't support symlinks.
+> Grrr... I need to switch to another repo now. --[[Paweł|ptecza]]
+
+>> Ok, [[bugs/done]] then --[[Joey]]
diff --git a/doc/bugs/ipv6_address_in_comments.mdwn b/doc/bugs/ipv6_address_in_comments.mdwn
new file mode 100644
index 000000000..90391650a
--- /dev/null
+++ b/doc/bugs/ipv6_address_in_comments.mdwn
@@ -0,0 +1,19 @@
+If I make a comment from an ipv4 address
+I see the commenter's ipv4 address logged in the comment file.
+
+If I make a comment from an ipv6 address
+I see nothing.
+
+There is a sanity check in /usr/share/perl5/IkiWiki/Plugin/comments.pm
+line 447 (according to today's version) there is an ipv4 specific regexp.
+
+I removed the regexp and used the value without this added check and it fixed
+the problem for me. Not sure if this is the best solution. --[[cstamas]]
+
+[[patch]]
+
+[[!tag ipv6]]
+
+> [[done]] --[[Joey]]
+
+> > Thank you! --[[cstamas]]
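
The family-agnostic check that replaces an IPv4-only regexp can be modelled like this. This is a sketch using Python's `ipaddress` module, not the actual Perl in comments.pm:

```python
import ipaddress

def commenter_ip(remote_addr):
    """Validate REMOTE_ADDR without an IPv4-only regexp: accept any
    well-formed IPv4 or IPv6 address, return None for garbage."""
    try:
        return str(ipaddress.ip_address(remote_addr))
    except ValueError:
        return None
```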
diff --git a/doc/bugs/jquery-ui.min.css_missing_some_image_files.mdwn b/doc/bugs/jquery-ui.min.css_missing_some_image_files.mdwn
new file mode 100644
index 000000000..dd026f4ec
--- /dev/null
+++ b/doc/bugs/jquery-ui.min.css_missing_some_image_files.mdwn
@@ -0,0 +1,14 @@
+This is very minor. I noticed in nginx's logs that jquery-ui.min.css (the attachment plugin uses this) keeps referencing some png files that are not available in public_html/mywiki/ikiwiki/images/. These should be included in underlays/attachment/ikiwiki/images/ in the source repo; they seem to be copied from /usr/local/share/ikiwiki/attachment/ikiwiki/images/ when I compile a new wiki. The complete list of images jquery-ui.min.css looks for can be found at <https://github.com/jquery/jquery-ui/tree/1.8.14/themes/base/images>.
+
+> Do you have a list of files that are *actually* used when ikiwiki is
+> running? I don't want to include a lot of files that jquery only
+> uses in other situations. The currently included files are exactly those
+> that I see it try to use. --[[Joey]]
+
+Fair enough. These 3 files are the only ones that appear consistently in nginx error logs:
+
+    ui-bg_glass_75_dadada_1x400.png
+    ui-icons_454545_256x240.png
+    ui-bg_glass_95_fef1ec_1x400.png
+
+> Hmm, that's most of the missing ones. I just added them all. [[done]]
+> --[[Joey]]
diff --git a/doc/bugs/librpc-xml-perl_0.69_breaks_XML-RPC_plugins.mdwn b/doc/bugs/librpc-xml-perl_0.69_breaks_XML-RPC_plugins.mdwn
new file mode 100644
index 000000000..d8def82aa
--- /dev/null
+++ b/doc/bugs/librpc-xml-perl_0.69_breaks_XML-RPC_plugins.mdwn
@@ -0,0 +1,13 @@
+Upgrading to `librpc-xml-perl` 0.69-1 on debian breaks external XML-RPC plugins (such as [[rst]]).
+Refreshing the wiki fails with an error message like this:
+
+ RPC::XML::Parser::new: This method should have
+ been overridden by the RPC::XML::Parser class at
+ /usr/share/perl5/RPC/XML/Parser.pm line 46, <GEN1> line 30.
+
+It appears that an incompatible change in the library causes this problem.
+
+`librpc-xml-perl` version 0.67-1 works. The easiest solution is to downgrade to this version,
+or disable XML-RPC plugins if they are not needed.
+
+[[fixed|done]] --[[Joey]]
diff --git a/doc/bugs/linkingrules_should_document_how_to_link_to_page_at_root_if_non-root_page_exists.mdwn b/doc/bugs/linkingrules_should_document_how_to_link_to_page_at_root_if_non-root_page_exists.mdwn
new file mode 100644
index 000000000..715f8cd4d
--- /dev/null
+++ b/doc/bugs/linkingrules_should_document_how_to_link_to_page_at_root_if_non-root_page_exists.mdwn
@@ -0,0 +1,6 @@
+The [[linking_rules|ikiwiki/subpage/linkingrules]] should document how to
+link to a page at the root of the wiki when a normal, unadorned link would
+use a page of the same name further down the hierarchy. For example, how
+should [[todo/latex]] link to [[logo]] rather than [[todo/logo|todo/logo]]?
+
+> [[bugs/done]].. the syntax to use is "/logo" --[[Joey]]
diff --git a/doc/bugs/linkmap_displays_underscore_escapes.mdwn b/doc/bugs/linkmap_displays_underscore_escapes.mdwn
new file mode 100644
index 000000000..f74ca5119
--- /dev/null
+++ b/doc/bugs/linkmap_displays_underscore_escapes.mdwn
@@ -0,0 +1,18 @@
+[[ikiwiki/directive/linkmap]]s display the file name instead of the pagetitle, showing unsightly underscore escapes and underscores instead of blanks to users.
+
+the attached [[!taglink patch]] fixes this; from its commit message:
+
+ display the pagetitle() in linkmaps
+
+ without this patch, linkmaps display underscores and underscore escape
+ sequences in the rendered output.
+
+    this introduces a pageescape function, which invokes pagetitle() to get
+ rid of underscore escapes and wraps the resulting utf8 string
+ appropriately for inclusion in a dot file (using dot's html encoding
+ because it can represent the '\"' dyad properly, and because it doesn't
+ need special-casing of newlines).
+
+the output will look much better (at least in my wikis) with the "[[bugs/pagetitle function does not respect meta titles]]" issue fixed.
+
+the patch is stored in [[the patch.pl]] as created by git-format-patch. (btw, what's the preferred way to send patches, apart from creating a git branch somewhere?)
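
the escaping the patch performs can be sketched outside Perl too. This is a hypothetical `pageescape` using Python's `html` module in place of HTML::Entities, wrapping the title in `<...>` for a dot HTML-like node ID:

```python
import html

def pageescape(title):
    """Wrap a page title for a Graphviz HTML-like node ID, escaping
    &, <, > and the double-quote, as the patch's encode_entities()
    call does."""
    return "<%s>" % html.escape(title, quote=True)
```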
diff --git a/doc/bugs/linkmap_displays_underscore_escapes/the_patch.pl b/doc/bugs/linkmap_displays_underscore_escapes/the_patch.pl
new file mode 100644
index 000000000..6b56c553e
--- /dev/null
+++ b/doc/bugs/linkmap_displays_underscore_escapes/the_patch.pl
@@ -0,0 +1,68 @@
+From efbb1121ffdc146f5c9a481a51f23ad151b9f240 Mon Sep 17 00:00:00 2001
+From: chrysn <chrysn@fsfe.org>
+Date: Thu, 15 Mar 2012 14:38:42 +0100
+Subject: [PATCH] display the pagetitle() in linkmaps
+
+without this patch, linkmaps display underscores and underscore escape
+sequences in the rendered output.
+
+this introduces a pageescape function, which invokes pagetitle() to get
+rid of underscore escapes and wraps the resulting utf8 string
+appropriately for inclusion in a dot file (using dot's html encoding
+because it can represent the '\"' dyad properly, and because it doesn't
+need special-casing of newlines).
+---
+ IkiWiki/Plugin/linkmap.pm | 17 +++++++++++++++--
+ 1 files changed, 15 insertions(+), 2 deletions(-)
+
+diff --git a/IkiWiki/Plugin/linkmap.pm b/IkiWiki/Plugin/linkmap.pm
+index ac26e07..b5ef1a1 100644
+--- a/IkiWiki/Plugin/linkmap.pm
++++ b/IkiWiki/Plugin/linkmap.pm
+@@ -5,6 +5,7 @@ use warnings;
+ use strict;
+ use IkiWiki 3.00;
+ use IPC::Open2;
++use HTML::Entities;
+
+ sub import {
+ hook(type => "getsetup", id => "linkmap", call => \&getsetup);
+@@ -22,6 +23,18 @@ sub getsetup () {
+
+ my $mapnum=0;
+
++sub pageescape {
++ my $item = shift;
++ # encoding explicitly in case ikiwiki is configured to accept <> or &
++ # in file names
++ my $title = pagetitle($item, 1);
++ # it would not be necessary to encode *all* the html entities (<> would
++ # be sufficient, &" probably a good idea), as dot accepts utf8, but it
++ # isn't bad either
++ $title = encode_entities($title);
++ return("<$title>");
++}
++
+ sub preprocess (@) {
+ my %params=@_;
+
+@@ -63,7 +76,7 @@ sub preprocess (@) {
+ my $show=sub {
+ my $item=shift;
+ if (! $shown{$item}) {
+- print OUT "\"$item\" [shape=box,href=\"$mapitems{$item}\"];\n";
++ print OUT pageescape($item)." [shape=box,href=\"$mapitems{$item}\"];\n";
+ $shown{$item}=1;
+ }
+ };
+@@ -74,7 +87,7 @@ sub preprocess (@) {
+ foreach my $endpoint ($item, $link) {
+ $show->($endpoint);
+ }
+- print OUT "\"$item\" -> \"$link\";\n";
++ print OUT pageescape($item)." -> ".pageescape($link).";\n";
+ }
+ }
+ print OUT "}\n";
+--
+1.7.9.1
diff --git a/doc/bugs/links_from_sidebars.mdwn b/doc/bugs/links_from_sidebars.mdwn
new file mode 100644
index 000000000..32bc5f04d
--- /dev/null
+++ b/doc/bugs/links_from_sidebars.mdwn
@@ -0,0 +1,14 @@
+If you use sidebars for navigation the Links section of a page ends up looking like:
+
+Links: blog/posts blog/sidebar sidebar
+
+instead of what I would expect:
+
+Links: blog/posts blog index
+
+I'm not sure what the bug actually is, or if there even is one; this just seems odd.
+
+> If you have a page named "sidebar" and a page named "blog/sidebar",
+> and they each link to a page, then of course they will show up in that
+> page's list of links. The index page does not link to the page, so is not
+> listed, which is appropriate. [[Notabug|done]] --[[Joey]]
diff --git a/doc/bugs/links_from_sidebars/discussion.mdwn b/doc/bugs/links_from_sidebars/discussion.mdwn
new file mode 100644
index 000000000..9cb84328a
--- /dev/null
+++ b/doc/bugs/links_from_sidebars/discussion.mdwn
@@ -0,0 +1,5 @@
+In the meantime I have configured nginx to redirect any requests for .../sidebar/ up to the parent page.
+
+ rewrite ^(.*/)sidebar/$ $1 redirect;
+
+This appears to work well.
diff --git a/doc/bugs/links_misparsed_in_CSV_files.mdwn b/doc/bugs/links_misparsed_in_CSV_files.mdwn
new file mode 100644
index 000000000..27d2b7b1e
--- /dev/null
+++ b/doc/bugs/links_misparsed_in_CSV_files.mdwn
@@ -0,0 +1,27 @@
+If a link inside a CSV file contains two or more underscores (\_), then it will get mis-parsed by the table plugin.
+
+e.g. \[[single\_track\_lines]] becomes "em>lines".
+
+Links with only one underscore are OK.
+
+Update 2008-11-24: The problem only occurs if the CSV data is in an external file. If I load it using data="""...""" then it works fine.
+
+The problem appears to be the call to htmlize inside genrow. If the data is inline, then wikilinks get expanded before they get here, and are OK. If the data is from an external file, the wikilinks aren't expanded, and htmlize will expand \[[single\_track\_lines]] into \[[single&lt;em&gt;track&lt;/em&gt;lines]].
+
+Oh, wait, I see the problem. IkiWiki::linkify is only called if the external file doesn't exist. If I remove this check and always call IkiWiki::linkify, then the problem is solved.
+
+(this is inside /usr/share/perl5/IkiWiki/Plugin/table.pm).
+
+> To reproduce this bug, I had to install the old, broken markdown 1.0,
+> instead of the now-default Text::Markdown.
+>
+> Why is linkify not called for external files? Well, I checked the
+> history, and it's probably best to say "for historical reasons that no
+> longer apply". So, changed as you suggest. [[done]] --[[Joey]]
+
+I am rather confused about what this check does, and the fact that the comments are very different for CSV and DSV when the code is the same doesn't seem to help.
+
+> The code is not the same; two operations are run in different orders for
+> CSV and DSV, as the comments note. --[[Joey]]
+
+-- Brian May
diff --git a/doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn b/doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn
new file mode 100644
index 000000000..26945ee07
--- /dev/null
+++ b/doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn
@@ -0,0 +1,34 @@
+The [[ikiwiki/directive/listdirectives]] directive doesn't register a link between the page and the subpages. This is a problem because the [[ikiwiki/directive/orphans]] directive then marks the directives as orphans... Maybe it is a bug with the orphans directive however... A simple workaround is to exclude those files from the orphans call... --[[anarcat]]
+
+> There's a distinction between wikilinks (matched by `link()`,
+> `backlink()` etc.) and other constructs that produce a
+> hyperlink. Some directives count as a wikilink (like `tag`)
+> but many don't (notably `inline`, `map`, `listdirectives`,
+> and `orphans` itself). As documented in
+> [[ikiwiki/directive/orphans]], orphans will tend to list
+> pages that are only matched by inlines/maps, too.
+>
+> The rule of thumb seems to be that a link to a particular
+> page counts as a wikilink, but a directive that lists
+> pages matching some pattern does not; so I think
+> `listdirectives` is working as intended here.
+> `orphans` itself obviously shouldn't count as a wikilink,
+> because that would defeat the point of it :-)
+>
+> Anything that uses a [[ikiwiki/pagespec]] to generate links,
+> like `inline` and `map`, can't generate wikilinks, because
+> wikilinks are gathered during the scan phase, and pagespecs
+> can't be matched until after the scan phase has finished
+> (otherwise, it'd be non-deterministic whether all wikilinks
+> had been seen yet, and `link()` in pagespecs wouldn't work
+> predictably).
+>
+> I suggest just using something like:
+>
+> \[[!orphans pages="* and !blog/* and !ikiwiki/directive/*"]]
+>
+> This wiki's example of listing [[plugins/orphans]] has a
+> more elaborate pagespec, which avoids bugs, todo items etc.
+> as well.
+>
+> --[[smcv]]
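
The two-phase point above can be made concrete with a toy model; the structures here are hypothetical, not ikiwiki's real scanner. A `link()` pagespec can only be answered once the scan over every page has finished:

```python
def scan_wiki(pages):
    """Scan phase: gather every page's explicit wikilinks first.
    Only after this finishes can a link(target) pagespec be
    evaluated deterministically for the whole wiki."""
    links = {name: set(targets) for name, targets in pages.items()}

    def match_link(target):
        # the link() pagespec test, run strictly after the scan
        return {name for name, seen in links.items() if target in seen}

    return match_link
```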
diff --git a/doc/bugs/lockedit_plugin_should_alert_user_about_an_invalid_pagespec_in_preferences.mdwn b/doc/bugs/lockedit_plugin_should_alert_user_about_an_invalid_pagespec_in_preferences.mdwn
new file mode 100644
index 000000000..b8023ce87
--- /dev/null
+++ b/doc/bugs/lockedit_plugin_should_alert_user_about_an_invalid_pagespec_in_preferences.mdwn
@@ -0,0 +1,21 @@
+[[plugins/lockedit]] adds the form fields for a [[pagespec]] to preferences. This pagespec should be supplied "raw"; i.e., without quotes around it. Inexperienced users (such as [[myself|users/jon]]) may provide an invalid pagespec, such as one with quotes around it. This will be merrily accepted by the form, but will cause no locking to take place.
+
+Perhaps some validation should be performed on the pagespec and the form-submission return include "warning: this pagespec is invalid" or "warning: this pagespec does not match any existing pages" or similar.
+
+> The pagespec is no longer in the preferences and instead in the setup
+> file now. That makes warning about a problem with it harder.
+>
+> Ikiwiki could try to detect this problem and warn at setup time to
+> stderr, I guess.
+>
+> Main problem is I see little way to actually detect the problem you
+> described. A pagespec with quotes around it is valid. For example, the
+> pagespec `"foo or bar"` matches a page named `"foo` or a page named `bar"`.
+>
+> There are small classes of invalid pagespecs. For example, `(foo or bar`
+> is invalid due to having unbalanced parens, while `foo or and bar`
+> has invalid syntax. It's possible to detect these, I guess ... --[[Joey]]
+
+>> Having moved it to the .setup file makes things more obvious I think.
+>> Anyway I consider this [[done]], please de-done this if you disagree.
+>> --[[Jon]]
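
A check for one of the "small classes of invalid pagespecs" Joey mentions could look like this. It is an illustrative sketch; ikiwiki does not ship this helper:

```python
def pagespec_balanced(spec):
    """Detect unbalanced parentheses in a pagespec.  Quoting
    mistakes cannot be caught this way, since a pagespec with
    quotes around it is still syntactically valid."""
    depth = 0
    for ch in spec:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:
                return False
    return depth == 0
```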
diff --git a/doc/bugs/locking_fun.mdwn b/doc/bugs/locking_fun.mdwn
new file mode 100644
index 000000000..46278b028
--- /dev/null
+++ b/doc/bugs/locking_fun.mdwn
@@ -0,0 +1,105 @@
+It's possible for concurrent web edits to race and the winner commits both
+changes at once with its commit message (see r2779). The loser gets a
+message that there were conflicts and gets to see his own edits as the
+conflicting edits he's supposed to resolve.
+
+This can happen because CGI.pm writes the change, then drops the main wiki
+lock before calling rcs_commit. It can't keep the lock because the commit
+hook needs to be able to lock.
+
+-------
+
+We batted this around for an hour or two on irc. The best solution seems to
+be adding a subsidiary second lock, which is only used to lock the working
+copy and is a blocking read/write lock.
+
+* As before, the CGI will take the main wiki lock when starting up.
+* Before writing to the WC, the CGI takes an exclusive lock on the WC.
+* After writing to the WC, the CGI can downgrade it to a shared lock.
+ (If this downgrade does not happen atomically, other CGIs can
+ steal the exclusive lock.)
+* Then the CGI, as before, drops the main wiki lock to prevent deadlock. It
+ keeps its shared WC lock.
+* The commit hook takes first the main wiki lock and then the shared WC lock
+ when starting up, and holds them until it's done.
+* Once the commit is done, the CGI, as before, does not attempt to regain
+ the main wiki lock (that could deadlock). It does its final stuff and
+ exits, dropping the shared WC lock.
+
+Locking:
+
+Using fcntl locking from perl is very hard. flock locking has the problem
+that one some OSes (linux?) converting an exclusive to a shared lock is not
+atomic and can be raced. What happens if this race occurs is that,
+since ikiwiki always uses LOCK_NB, the flock fails. Then we're back to the
+original race. It should be possible though to use a separate exclusive lock,
+wrapped around these flock calls, to force them to be "atomic" and avoid that
+race.
+
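The non-atomic downgrade can be demonstrated directly with flock. This is a Python sketch of the locking pattern under discussion, not ikiwiki code:

```python
import fcntl

def take_wc_lock(fh):
    """CGI: take the working-copy lock exclusively before writing."""
    fcntl.flock(fh, fcntl.LOCK_EX | fcntl.LOCK_NB)

def downgrade_wc_lock(fh):
    """CGI: convert the exclusive lock to a shared one after writing.
    On Linux this conversion is not atomic: flock drops the old lock
    and takes the new one, so another process can grab the exclusive
    lock in between -- the race described above."""
    fcntl.flock(fh, fcntl.LOCK_SH | fcntl.LOCK_NB)
```
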
+------
+
+My alternative idea, which seems simpler than all this tricky locking
+stuff, is to introduce a new lock file (really a flag file implemented
+using a lock), which tells the commit hook that the CGI is running, and
+makes the commit hook a NOOP.
+
+* CGI takes the wikilock
+* CGI writes changes to WC
+* CGI sets wclock to disable the commit hook
+* CGI does *not* drop the main wikilock
+* CGI commit
+* The commit hook tries to set the wclock, fails, and becomes a noop
+ (it may still need to send commit mails)
+* CGI removes wclock, thus re-enabling the commit hook
+* CGI updates the WC (since the commit hook didn't)
+* CGI renders the wiki (always; commits may have come in and not been
+ rendered)
+* CGI checks for conflicts, and if any are found does its normal dance
+
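The wclock trick in the list above can be sketched as a non-blocking try-lock. The helper below is hypothetical, not ikiwiki's implementation:

```python
import fcntl

def hook_should_run(wclock_path):
    """Post-commit hook preamble: try to take the wclock.  If the
    CGI already holds it, return False so the rest of the hook is a
    no-op; otherwise release immediately and proceed (the hook never
    holds the wclock while waiting for the main wikilock)."""
    fh = open(wclock_path, "w")
    try:
        fcntl.flock(fh, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except OSError:
        fh.close()
        return False          # CGI is running: become a NOOP
    fcntl.flock(fh, fcntl.LOCK_UN)
    fh.close()
    return True
```
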
+> It seems like there are two things to be concerned with: RCS commit between
+> disable of hook and CGI commit, or RCS commit between CGI commit and re-enable
+> of hook. The second case isn't a big deal if the CGI is gonna rerender
+> everything anyhow. --[[Ethan]]
+
+I agree, and I think that the second case points to the hooks still being
+responsible for sending out commit mails. Everything else the CGI can do.
+
+I don't believe that the first case is actually a problem: If the RCS
+commit does not introduce a conflict then the CGI commit's changes will be
+merged into the repo cleanly. OTOH, if the RCS commit does introduce a
+conflict then the CGI commit will fail gracefully. This is exactly what
+happens now if an RCS commit happens while a CGI commit is in progress! Ie:
+
+* cgi takes the wikilock
+* cgi writes change to wc
+* svn commit -m "conflict" (this makes a change to repo immediately, then
+ runs the post-commit hook, which waits on the wikilock)
+* cgi drops wikilock
+* the post-commit hook from the above manual commit can now run.
+* cgi calls rcs_commit, which fails due to the conflict just introduced
+
+The only differences from this scenario will be that the CGI will not drop
+the wiki lock before its commit, and that the post-commit hook will turn
+into a NOOP:
+
+* cgi takes the wikilock
+* cgi writes change to wc
+* cgi takes the wclock
+* svn commit -m "conflict" (this makes a change to repo immediately, then
+ runs the post-commit hook, which becomes a NOOP)
+* cgi calls rcs_commit, which fails due to the conflict just introduced
+* cgi renders the wiki
+
+Actually, the only thing that scares me a little about this approach is that
+we have two locks. The CGI takes them in the order (wikilock, wclock).
+The commit hook takes them in the order (wclock, wikilock). This is a
+classic potential deadlock scenario. _However_, the commit hook should
+close the wclock as soon as it successfully opens it, before taking the
+wikilock, so I think that's ok.
+
+-----
+
+I've committed an implementation of my idea just above, and it seems to
+work, although testing for races etc is tricky. Calling this [[bugs/done]]
+unless someone finds a new bug or finds a problem in my thinking above.
+--[[Joey]]
diff --git a/doc/bugs/login_page_non-obvious_with_openid.mdwn b/doc/bugs/login_page_non-obvious_with_openid.mdwn
new file mode 100644
index 000000000..9aa702037
--- /dev/null
+++ b/doc/bugs/login_page_non-obvious_with_openid.mdwn
@@ -0,0 +1,47 @@
+I just set up my first OpenID account and tried to log in to ikiwiki.info. It all works, but being relatively unfamiliar with OpenID, when I was presented with the login page it wasn't at all clear which bits needed to be filled in.
+
+At the moment it looks like this:
+
+ Name:
+ Password:
+ OpenID:
+
+ [Login] [Register] [Mail Password]
+
+Really this form is presenting two entirely separate ways to log in: the "normal" user/pass *OR* OpenID. Also (I assume) the [Register] and [Mail Password] actions are only relevant to the user/pass form.
+
+I would suggest that the form be split into two parts, something like this:
+
+ Login (or register) with a username and password:
+
+ Name:
+ Password:
+
+ [Login] [Register] [Mail Password]
+
+ **OR**
+
+ Login with OpenID:
+
+ OpenID URL:
+
+ [Login]
+
+As an example, the first time I went to log in I filled in all three fields (user, pass, openid) and then clicked [Register], because from the layout I assumed I still had to instantiate an account with ikiwiki ... and to make it even more confusing, it worked! Of course it worked by creating an account for me based on the username/password and ignoring the OpenID URL.
+
+If you want to keep it as one form, then perhaps using some javascript to disable the other pieces of the form as soon as you fill in one part would help? E.g. if you put in an OpenID URL, then Name/Password/Register/Mail Password get greyed out. If you enter a username, then the OpenID URL gets greyed out.
+ -- Adam.
+
+> It's one form for architectural reasons -- the OpenID plugin uses a hook
+> that allows modifying that form, but does not allow creating a separate
+> form. The best way to make it obvious how to use it currently is to just
+> disable password auth, then it's nice and simple. :-) Javascript is an
+> interesting idea. It's also possible to write a custom [[templates]] file
+> that is displayed instead of the regular signin form, and it should be
+> possible to use that to manually lay it out better than FormBuilder
+> manages with its automatic layout. --[[Joey]]
+
+> I've improved the form, I think it's more obvious now that the openid
+> stuff is separate. Good enough to call this [[done]]. I think. --[[Joey]]
+
+>> Looks good, thanks! :-) -- [[AdamShand]]
diff --git a/doc/bugs/login_page_should_note_cookie_requirement.mdwn b/doc/bugs/login_page_should_note_cookie_requirement.mdwn
new file mode 100644
index 000000000..17ac12b34
--- /dev/null
+++ b/doc/bugs/login_page_should_note_cookie_requirement.mdwn
@@ -0,0 +1,39 @@
+At the moment, you go through the login shuffle and then are told that cookies are needed, so you lose all your data and have to log in again. It would be much slicker to note by the edit link, or at least on the login page, that cookies are required.
+
+> Hmm, it seems to me to be fairly obvious, since the vast majority of
+> websites that have a login require cookies. Such warnings used to be
+> common, but few sites bother with them anymore. --[[Joey]]
+
+>> Very few websites break without cookies. Even fewer lose data.
+>> Can ikiwiki avoid being below average by default? --[MJR](http://mjr.towers.org.uk)
+
+>>> Can we avoid engaging in hyperbole? (Hint: Your browser probably has a
+>>> back button. Hint 2: A username/password does not count as "lost data".
+>>> Hint 3: Now we're arguing, which is pointless.) --[[Joey]]
+
+Even better would be to only display the cookie note as a warning if the login page doesn't receive a session cookie.
+
+> I considered doing this before, but it would require running the cgi once
+> to attempt to set the cookie and then redirecting to the cgi a second
+> time to check if it took, which is both complicated and probably would
+> look bad.
+
+>> Might this be possible client-side with javascript? A quick google suggests it is possible:
+>> <http://www.javascriptkit.com/javatutors/cookiedetect.shtml>. MJR, want to try adding
+>> that? -- [[Will]]
+
+Best of all would be to use URL-based or hidden-field-based session tokens if cookies are not permitted.
+
+> This is not very doable since most of the pages the user browses are
+> static pages in a static location.
+
+>> The pages that lose data without cookies (the edit pages, primarily)
+>> don't look static. Are they really? --[MJR](http://mjr.towers.org.uk)
+
+>>> As soon as you post an edit page, you are back to a static website.
+
+>>> It is impossible to get to an edit page w/o a cookie, unless
+>>> anonymous edits are allowed, in which case it will save. No data loss.
+>>> Since no one is working on this, and the nonsense above has pissed me
+>>> off to the point that I will certainly never work on it, I'm going to
+>>> close it. --[[Joey]] [[done]]
diff --git a/doc/bugs/logout_in_ikiwiki.mdwn b/doc/bugs/logout_in_ikiwiki.mdwn
new file mode 100644
index 000000000..6696cc689
--- /dev/null
+++ b/doc/bugs/logout_in_ikiwiki.mdwn
@@ -0,0 +1,44 @@
+It looks like there is no way to log out of ikiwiki at present, meaning that if you edit the ikiwiki in, say, a cybercafe, the cookie remains... is there some other security mechanism in place that can check for authorization, or should I hack a logout routine into ikiwiki.cgi?
+
+> Click on "Preferences". There is a logout button there. --liw
+
+> It would be nice if it were not buried there, but putting it on the
+> action bar statically would be confusing. The best approach might be to
+> use javascript. --[[Joey]]
+
+
+>> I agree that javascript seems to be a solution, but my brain falls
+>> off the end of the world while looking at ways to manipulate the DOM.
+>> (I'd argue also in favor of the openid_provider cookie expiring
+>> in less time than it does now, and being session based)
+
+>>> (The `openid_provider` cookie is purely a convenience cookie to
+>>> auto-select the user's openid provider the next time they log
+>>> in. As such, it cannot be a session cookie. It does not provide any
+>>> personally-identifying information so it should not really matter
+>>> when it expires.) --[[Joey]]
+
+>> It would be nice to move navigational elements to the upper right corner
+>> of the page...
+
+>> I have two kinds of pages (wiki and blog), and three classes of users:
+
+>> * anonymous users - display things like login, help, and recentchanges
+>> * non-admin users - on a per subdir basis (blog and !blog) display
+>>   logout, help, recentchanges, edit, comment
+>> * admin users - logout, help, recentchanges, edit, comment, etc
+
+
+I was referred to this page from posting to the forum. I am also interested in being able to use user class and status to modify the page. I will try to put together a plugin. From what I can see, there need to be a few items in it.
+
+* It should expose a link to a dedicated login page that, once logged in, returns the user to the calling page, or at least the home page. I have started a plugin to do this: [[/plugins/contrib/justlogin]]
+
+* It needs to expose a link to a little JSON explaining the type of user and login status.
+
+* It should expose a link that logs the person out and returns to the calling page, or at least the home page.
+
+Then there would need to be a little javascript to use these links appropriately. I have little javascript experience, but I know that can be done. I am less sure whether it is possible to add this functionality in a plugin, so I'll start with that. If no one objects I will continue to post here if I make progress. If anyone has any suggestions on how to modify my approach to code it in an easier way, I'd appreciate the input. [[justint]]
+
+
diff --git a/doc/bugs/mailto:_links_not_properly_generated_in_rssatom_feeds.mdwn b/doc/bugs/mailto:_links_not_properly_generated_in_rssatom_feeds.mdwn
new file mode 100644
index 000000000..8e694ff6c
--- /dev/null
+++ b/doc/bugs/mailto:_links_not_properly_generated_in_rssatom_feeds.mdwn
@@ -0,0 +1,29 @@
+[[!meta title="mailto: links not properly generated in rss/atom feeds"]]
+
+A link like \[this](mailto:foo@bar.org) will not be converted correctly to a mailto link in the rss/atom feeds, but to an absolute link instead. See e.g. the logitech post on http://madduck.net/blog/feeds/planet-lca2008/index.rss
+
+> fixed --[[Joey]] [[!tag done]]
+
+This still happens for auto-generated mailto: links that are [garbled](http://daringfireball.net/projects/markdown/syntax#autolink) by Markdown, so that
+
+ <matthias@rampke.de>
+
+is turned into
+
+ <a href="m&#97;&#105;&#x6C;t&#111;:&#109;&#x61;&#116;&#x74;&#x68;&#105;a&#x73;&#64;&#x72;&#x61;&#109;&#x70;&#x6B;&#101;&#46;&#100;&#x65;">&#109;&#x61;&#116;&#x74;&#x68;&#105;a&#x73;&#64;&#x72;&#x61;&#109;&#x70;&#x6B;&#101;&#46;&#100;&#x65;</a>
+
+for HTML, but
+
+ &lt;a href=&quot;http://rampke.de/m&amp;#97;&amp;#105;&amp;#x6C;t&amp;#111;:&amp;#109;&amp;#x61;&amp;#116;&amp;#x74;&amp;#x68;&amp;#105;a&amp;#x73;&amp;#64;&amp;#x72;&amp;#x61;&amp;#109;&amp;#x70;&amp;#x6B;&amp;#101;&amp;#46;&amp;#100;&amp;#x65;&quot;&gt;&amp;#109;&amp;#x61;&amp;#116;&amp;#x74;&amp;#x68;&amp;#105;a&amp;#x73;&amp;#64;&amp;#x72;&amp;#x61;&amp;#109;&amp;#x70;&amp;#x6B;&amp;#101;&amp;#46;&amp;#100;&amp;#x65;&lt;/a&gt;&lt;/p&gt;
+
+for Atom and RSS.
+
+> This garbling is provably pointless. Proof: For $1000 I will take off my
+> white hat, put on my black hat, and implement support for it in any
+> spammer's email address extraction tool. Money will be donated to a
+> spam-fighting organisation of my choice.
+>
+> So, in lieu of money, it seems best to find a way to disable it in
+> markdown.
+>
+> Anyway, I've fixed this, at the expense of additional total worldwide
+> power usage, etc. --[[Joey]]
diff --git a/doc/bugs/map_does_not_link_directory_for_which_a_file_also_exists.mdwn b/doc/bugs/map_does_not_link_directory_for_which_a_file_also_exists.mdwn
new file mode 100644
index 000000000..d197cdb6c
--- /dev/null
+++ b/doc/bugs/map_does_not_link_directory_for_which_a_file_also_exists.mdwn
@@ -0,0 +1,26 @@
+[[!meta title="map does not link entries which are equal to basename(current_page)"]]
+
+On <http://phd.martin-krafft.net/wiki/tag/factors/>, the top-level `factors` entry is not linked to the corresponding page. Looking at <http://phd.martin-krafft.net/wiki/tag/factors/language/>, this must be because the page name is the same as the entry name, and ikiwiki probably doesn't take the complete path of subpages into account.
+
+[[done]] --[[Joey]]
+
+I can confirm that most of the issues are fixed, but map still includes and links to pages that do not match the pagespec. The list includes entries like `tag/factors/contribute`, but that page does *not* link to/is not tagged with any `factors*` tag. I have put a snapshot of the site as it was when I saw this bug at <http://scratch.madduck.net/web__phd.martin-krafft.net__map-bug-1.tgz> and can return to the state at any time, but I needed to move on now...
+
+--[[madduck]]
+
+That's a different issue. :-)
+
+This is really subtle and tricky. It's doing this because it
+thinks that tag/factors/contribute _does_ link to a page
+matching "tag/factors/*". That page? tag/factors/contribute/discussion!
+
+Now, tag/factors/contribute/discussion doesn't exist yet in your wiki,
+but there is a "?Discussion" pseudo-link, and that's good enough for
+ikiwiki.
+
+So, you could work around this annoying behavior with
+!link(tag/factors/*/Discussion)
+
+BTW, the testpagespec plugin is useful in debugging these kind of things.
+
+--[[Joey]]
diff --git a/doc/bugs/map_does_not_link_directory_for_which_a_file_also_exists/discussion.mdwn b/doc/bugs/map_does_not_link_directory_for_which_a_file_also_exists/discussion.mdwn
new file mode 100644
index 000000000..44ff4143f
--- /dev/null
+++ b/doc/bugs/map_does_not_link_directory_for_which_a_file_also_exists/discussion.mdwn
@@ -0,0 +1,3 @@
+On <http://phd.martin-krafft.net/wiki/tag/factors/ideology/>, another symptom of this problem can be seen: the page collects pages linking to `tag/factors/ideology`, and `/factors/language` is one of those. However, the link points to `tag/factors/language`, which does *not* link to `tag/factors/ideology`.
+
+I think this is an artefact of the intersection of ikiwiki [[ikiwiki/SubPage/LinkingRules]] with the use of a page's basename when its path relative to the wikiroot should be used.
diff --git a/doc/bugs/map_doesn__39__t_calculate___34__common__95__prefix__34___correctly.mdwn b/doc/bugs/map_doesn__39__t_calculate___34__common__95__prefix__34___correctly.mdwn
new file mode 100644
index 000000000..df00621d8
--- /dev/null
+++ b/doc/bugs/map_doesn__39__t_calculate___34__common__95__prefix__34___correctly.mdwn
@@ -0,0 +1,70 @@
+Problem with [[plugins/map]]:
+
+# Observed behavior:
+
+## given map:
+
+\[[!map pages="blog/tags/*"]]
+
+## received map:
+
+<div class="map">
+<ul>
+<li><a href="../" class="mapparent">blog</a>
+<ul>
+<li><a href="../tags/" class="mapparent">tags</a>
+<ul>
+<li>life
+</li>
+</ul>
+<ul>
+<li>tech
+</li>
+</ul>
+</li>
+</ul>
+</li>
+</ul>
+</div>
+
+Note that you get "blog" and "tags", and they're both links, but "life" and "tech" are not links.
+
+# desired output:
+
+<div class="map">
+<ul>
+<li><a href="../tags/life/" class="mapitem">life</a>
+</li>
+<li><a href="../tags/tech/" class="mapitem">tech</a>
+</li>
+</ul>
+</div>
+
+Note that you don't get "blog" or "tags", and "life" and "tech" are links now.
+
+# patch which appears to achieve this:
+
+<pre>
+--- map.pm.orig 2007-11-23 16:04:02.000000000 -0500
++++ map.pm 2007-12-21 00:12:15.000000000 -0500
+@@ -37,6 +37,9 @@
+ my @b=split(/\//, $common_prefix);
+ $common_prefix="";
+ while (@a && @b && $a[0] eq $b[0]) {
++ if ($common_prefix) {
++ $common_prefix .= "/";
++ }
+ $common_prefix.=shift(@a);
+ shift @b;
+ }
+</pre>
+
+# Discussion
+
+(Disclaimer: I don't know ikiwiki internals.)
+
+Map tries to calculate a "common prefix" between the pagespec and the page being rendered, and then later does some substitutions using the prefix. But the path has /'s in it and the common prefix doesn't, so it never matches correctly. So, add the /'s.
+
+-- [[users/Larry_Clapp]]
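+
+The patched loop, run standalone on two example paths (a sketch to show
+the effect, not ikiwiki code):
+
+    my @a = split(/\//, "blog/tags/life");
+    my @b = split(/\//, "blog/tags/tech");
+    my $common_prefix = "";
+    while (@a && @b && $a[0] eq $b[0]) {
+        if ($common_prefix) {
+            $common_prefix .= "/";
+        }
+        $common_prefix .= shift(@a);
+        shift @b;
+    }
+    # With the patch, $common_prefix is "blog/tags", which matches the
+    # real path. Without it, it would be "blogtags", which never
+    # matches, so the later substitutions silently fail.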
+
+> Excellent problem description and analysis. Patch [[applied|done]] --[[Joey]]
diff --git a/doc/bugs/map_fails_to_close_ul_element_for_empty_list.mdwn b/doc/bugs/map_fails_to_close_ul_element_for_empty_list.mdwn
new file mode 100644
index 000000000..ad0f506f2
--- /dev/null
+++ b/doc/bugs/map_fails_to_close_ul_element_for_empty_list.mdwn
@@ -0,0 +1,93 @@
+[[!tag plugins/map patch]]
+
+input:
+
+ before.
+ \[[!map pages="sdfsdfsdfsd/*"]]
+ after.
+
+Presuming that the pagespec does not match, output:
+
+ <p>before.
+ <div class="map">
+ <ul>
+ </div></p>
+
+The UL element is not closed.
+
+Patch:
+
+ --- /usr/share/perl5/IkiWiki/Plugin/map.pm 2009-05-06 00:56:55.000000000 +0100
+ +++ IkiWiki/Plugin/map.pm 2009-06-15 12:23:54.000000000 +0100
+ @@ -137,11 +137,11 @@
+ $openli=1;
+ $parent=$item;
+ }
+ - while ($indent > 0) {
+ + while ($indent > 1) {
+ $indent--;
+ $map .= "</li>\n</ul>\n";
+ }
+ - $map .= "</div>\n";
+ + $map .= "</ul>\n</div>\n";
+ return $map;
+ }
+
+
+-- [[Jon]]
+
+> Strictly speaking, a `<ul>` with no `<li>`s isn't valid HTML either...
+> could `map` instead delay emitting the first `<ul>` until it determines that
+> it will have at least one item? Perhaps refactoring that function into
+> something easier to regression-test would be useful. --[[smcv]]
+
+>> You are right (just checked 4.01 DTD to confirm). I suspect refactoring
+>> the function would be wise. From my brief look at it to formulate the
+>> above I thought it was a bit icky. I'm not a good judge of what would
+>> be regression-test friendly but I might have a go at reworking it. With
+>> this variety of problem I have a strong inclination to use HOFs like map,
+>> grep. - [[Jon]]
+
+>>> The patch in [[map/discussion|plugins/map/discussion]] has the same
+>>> problem, but does suggest a simpler approach to solving it (bail out
+>>> early if the map has no items at all). --[[smcv]]
+
+>>>> Thanks for pointing out the problem. I guess this patch should solve it.
+>>>> --[[harishcm]]
+
+>>>>> Well, I suppose that's certainly a minimal patch for this bug :-)
+>>>>> I'm not the IkiWiki maintainer, but if I was, I'd apply it, so I've put
+>>>>> it in a git branch for Joey's convenience. Joey, Jon: any opinion?
+>>>>>
+>>>>> If you want to be credited for this patch under a name other than
+>>>>> "harishcm" (e.g. your real name), let me know and I'll amend the branch.
+>>>>> (Or, make a git branch of your own and replace the reference just below,
+>>>>> if you prefer.) --[[smcv]]
+
+>>>>>> The current arrangement looks fine to me. Thanks. --[[harishcm]]
+
+> [[merged|done]] --[[Joey]]
+
+Patch:
+
+ --- /usr/local/share/perl/5.8.8/IkiWiki/Plugin/map.pm
+ +++ map.pm
+ @@ -80,7 +80,17 @@
+ my $indent=0;
+ my $openli=0;
+ my $addparent="";
+ - my $map = "<div class='map'>\n<ul>\n";
+ + my $map = "<div class='map'>\n";
+ +
+ + # Return empty div if %mapitems is empty
+ + if (!scalar(keys %mapitems)) {
+ + $map .= "</div>\n";
+ + return $map;
+ + }
+ + else { # continue populating $map
+ + $map .= "<ul>\n";
+ + }
+ +
+ foreach my $item (sort keys %mapitems) {
+ my @linktext = (length $mapitems{$item} ? (linktext => $mapitems{$item}) : ());
+ $item=~s/^\Q$common_prefix\E\///
diff --git a/doc/bugs/map_generates_malformed_HTML.mdwn b/doc/bugs/map_generates_malformed_HTML.mdwn
new file mode 100644
index 000000000..890a6ef7f
--- /dev/null
+++ b/doc/bugs/map_generates_malformed_HTML.mdwn
@@ -0,0 +1,36 @@
+[[!template id=gitbranch branch=smcv/ready/map author="[[Simon McVittie|smcv]]"]]
+[[!tag patch]]
+
+`\[[!map]]` can generate bad HTML with unbalanced open/close tags
+(in XML terms: "not well-formed") in certain situations. This
+appears to be a regression caused by fixing
+[[maps with nested directories sometimes make ugly lists]], which
+suppressed some redundant `</ul><ul>` pairs, but appears not to
+have the ideal logic for this, leading to malformed HTML.
+
+In particular, on a site with these pages:
+
+* alpha
+ * 1
+ * i
+ * ii
+ * iii
+ * iv
+ * 2
+ * a
+ * b
+ * 3
+* beta
+
+the maps "`alpha/1 or beta`", "`alpha/1/i* or alpha/2/a or beta`" and
+"`alpha/1/i* or alpha/2/a`" have malformed HTML.
+
+My `ready/map` branch adds a regression test and makes it pass.
+
+The fix is not particularly elegant - it generates the previous
+HTML with redundant `</ul><ul>` pairs, marks the redundant
+pairs, and edits them out afterwards - but it works. If anyone can come
+up with a cleaner algorithm that avoids generating the redundant tags
+in the first place, that would be even better. --[[smcv]]
+
+> [[merged|done]] (not thrilled at this solution, but it works) --[[Joey]]
diff --git a/doc/bugs/map_is_inconsistent_about_bare_directories.mdwn b/doc/bugs/map_is_inconsistent_about_bare_directories.mdwn
new file mode 100644
index 000000000..a53296dfc
--- /dev/null
+++ b/doc/bugs/map_is_inconsistent_about_bare_directories.mdwn
@@ -0,0 +1,86 @@
+The [[plugins/map]] plugin has inconsistent behaviour. In particular, I have in my wiki some directory structures holding files without wikitext pointers (I point directly to the files from elsewhere). For example, imagine the following file structure in the source dir:
+
+ ; ls -R dirA dirB
+ dirA:
+ subA subB
+
+ dirA/subA:
+ filea.mdwn fileb.mdwn
+
+ dirA/subB:
+ filec.mdwn filed.mdwn
+
+ dirB:
+ subA subC
+
+ dirB/subA:
+ filea.mdwn
+
+ dirB/subC:
+ fileb.mdwn filec.mdwn
+
+When I use map to make a map of this, the result looks more like this:
+
+<ul>
+<li><span class="createlink">? dirA</span>
+<ul>
+<li><span class="createlink">? subA</span>
+<ul>
+<li>filea
+</li>
+</ul>
+<ul>
+<li>fileb
+</li>
+</ul>
+<ul>
+<li>filec
+</li>
+<li>filed
+</li>
+</ul>
+</li>
+</ul>
+</li>
+<li><span class="createlink">? dirB</span>
+<ul>
+<li><span class="createlink">? subA</span>
+<ul>
+<li>filea
+</li>
+</ul>
+</li>
+</ul>
+<ul>
+<li><span class="createlink">? subC</span>
+<ul>
+<li>fileb
+</li>
+</ul>
+<ul>
+<li>filec
+</li>
+</ul>
+</li>
+</ul>
+</li>
+</ul>
+
+Note that while the dirA/subA directory exists with a create link, the dirA/subB directory is missing from the map. Interestingly, dirB/subC is shown in the map. If you add a second file to dirB/subA then dirB/subC disappears as well.
+
+I could imagine including all 'bare' directories in the map, and I could imagine including no 'bare' directories in the map. Just including the first bare directory seems a strange intermediate point.
+
+Attached is a [[patch]] that fixes the issue. The current map code makes one pass over the sorted list of pages. This adds an initial pass that goes through and makes sure that all parent directories are included. With this initial pass added, the following pass could probably be simplified.
+
+One solution could also use the [[plugins/autoindex]] plugin to make sure that parent pages actually exist. This is really only a stop-gap solution until the patch is applied - map still needs to be made bug-free.
+
+Note: This patch adds items to a map while it is in a foreach loop over a sorted list of keys from that same map. Changing a map while iterating through it is normally problematic. I'm assuming the sort insulates the code from this - I do not need to iterate over any of the newly added elements.
+
+-- [[users/Will]]
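+
+The note's assumption holds in Perl: `sort keys %map` builds its list
+once, so the `foreach` iterates over a snapshot and never visits keys
+added inside the loop (unlike an `each`-based loop, which this would
+break). A minimal demonstration:
+
+    my %mapitems = (c => 1, a => 1);
+    my @seen;
+    foreach my $item (sort keys %mapitems) {
+        $mapitems{"b"} = 1; # added mid-loop, but never iterated
+        push @seen, $item;
+    }
+    # @seen is ("a", "c"); "b" is in %mapitems but was not visited.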
+
+> The patch is subtly buggy and just papers over the actual bug with a
+> lot of extra code. Thanks for trying to come up with a patch for this
+> annoyingly complicated bug.. I think I've fixed the underlying bug now.
+> --[[Joey]]
+>
+> [[!tag done]]
diff --git a/doc/bugs/map_sorts_by_pagename_and_not_title_when_show__61__title_is_used.mdwn b/doc/bugs/map_sorts_by_pagename_and_not_title_when_show__61__title_is_used.mdwn
new file mode 100644
index 000000000..d12414d55
--- /dev/null
+++ b/doc/bugs/map_sorts_by_pagename_and_not_title_when_show__61__title_is_used.mdwn
@@ -0,0 +1,20 @@
+The [[ikiwiki/directive/map]] directive sorts by pagename. That looks kind of odd when used together with show=title. I would expect it to sort by title then.
+
+> This would be quite hard to fix. Map sorts the pages it displays by page
+> name, which has the happy effect of making "foo/bar" come after "foo";
+> which it *has* to do, so that it can be displayed as a child of the page
+> it's located in. If sorting by title, that wouldn't hold. So, map
+> would have to be effectively totally rewritten, to build up each group
+> of child pages, and then re-sort those. --[[Joey]]
+
+>> Ok, you are right, that would break the tree. This made me think that I do not
+>> need to generate a tree for my particular use case, just a list, so I thought I could use [[ikiwiki/directive/inline]] instead.
+>> This created two new issues:
+>>
+>> 1. inline also sorts by pagename even when explicitly told to sort by title.
+>>
+>> 2. I cannot get inline to create a list when the htmltidy plugin is switched on. I have a template which is enclosed in an li tag, and I put the ul tag around the inline manually, but htmltidy breaks this. --martin
+
+>>>> You might want to check if the [[plugins/contrib/report]] plugin solves your problem. It can sort by title, among other things. --[[KathrynAndersen]]
+
+>> See also: [[todo/sort_parameter_for_map_plugin_and_directive]] --[[smcv]]
diff --git a/doc/bugs/maps_with_nested_directories_sometimes_make_ugly_lists.mdwn b/doc/bugs/maps_with_nested_directories_sometimes_make_ugly_lists.mdwn
new file mode 100644
index 000000000..a6546faad
--- /dev/null
+++ b/doc/bugs/maps_with_nested_directories_sometimes_make_ugly_lists.mdwn
@@ -0,0 +1,62 @@
+I'm using the [[map_directive|ikiwiki/directive/map]] to build dynamic navigation menus, and it's working really nicely!
+
+However, on some pages each nested item gets wrapped in a full set of `<ul>` tags. This doesn't actually hurt anything, but it happens inconsistently, which seems like a bug. I don't like it because it puts extra vertical spacing into my menu bar.
+
+Here's what I expect it to look like:
+
+ <div class="map">
+ <ul>
+ <li><span class="selflink">Archives</span>
+ <ul>
+ <li><a href="./2010/" class="mapitem">2010</a></li>
+ <li><a href="./2011/" class="mapitem">2011</a></li>
+ </ul>
+ </li>
+ </ul>
+ </div>
+
+And here's what it's actually doing:
+
+ <div class="map">
+ <ul>
+ <li><span class="selflink">Archives</span>
+ <ul>
+ <li><a href="./2010/" class="mapitem">2010</a></li>
+ </ul>
+ <ul>
+ <li><a href="./2011/" class="mapitem">2011</a></li>
+ </ul>
+ </li>
+ </ul>
+ </div>
+
+I've tried to replicate the problem on this site and cannot; I'm not sure if that's because of exactly how I'm using map or if there's something different about my site. I just upgraded ikiwiki to the latest Debian unstable as well as most of the required Perl modules, and nothing changed.
+
+If you look at [this page on my site](http://adam.shand.net/ikidev/archives/) (getsource is enabled) you can see it working as expected in the main page and not working in the side bar.
+
+But it also doesn't work on the sitemap page: <http://adam.shand.net/ikidev/site/map/>
+
+This might be really simple, but I've been staring at it too long and it only looks like a bug to me. :-( Any suggestions would be gratefully accepted. -- [[AdamShand]]
+
+> Okay, I think I've figured this out: it looks like ikiwiki behaves differently depending on the level of hierarchy. I'll post the details once I'm sure. -- [[AdamShand]]
+
+>> I managed to replicate the issue on my ikiwiki, and I believe it is a
+>> bug. The current upstream logic for going up/down by a level opens
+>> (and closes) the unnecessary lists that you are seeing. Although the
+>> resulting markup is semantically correct, it has superfluous stuff
+>> that introduces whitespace issues at the very least.
+
+>> I have a [[patch]] up [on my git repo](http://git.oblomov.eu/ikiwiki/patch/55fa11e8a5fb351f9371533c758d8bd3eb9de245)
+>> that ought to fix the issue.
+
+>>> Wonderful, thank you very much for the help! I've installed the patch and it's working great, but it looks like there's a minor bug. Sometimes it doesn't print the top/first map item. Cheers, -- [[AdamShand]]
+>>>
+>>> <http://adam.shand.net/tmp/map-orig.jpg>
+>>> <http://adam.shand.net/tmp/map-patched.jpg>
+
+>>>> Thanks for testing. I managed to reproduce it and I adjusted the logic.
+>>>> An updated [[patch]] can be found [here](http://git.oblomov.eu/ikiwiki/patch/dcfb18b7989a9912ed9489f5ff15f871b6d8c24a)
+
+>>>>> Seems to work perfectly to me, thanks! -- [[AdamShand]]
+
+[[applied|done]] --[[Joey]]
diff --git a/doc/bugs/markdown_bug:_email_escaping_and_plus_addresses.mdwn b/doc/bugs/markdown_bug:_email_escaping_and_plus_addresses.mdwn
new file mode 100644
index 000000000..6fccc5c86
--- /dev/null
+++ b/doc/bugs/markdown_bug:_email_escaping_and_plus_addresses.mdwn
@@ -0,0 +1,37 @@
+compare:
+
+ * <jon+markdownbug@example.org>
+ * <jon.markdownbug@example.org>
+
+* <jon+markdownbug@example.org>
+* <jon.markdownbug@example.org>
+
+It seems putting a '+' in there throws it. Maybe it's a markdown bug, or maybe the obfuscation markdown applies to email-links is being caught by the HTML sanitizer.
+
+ -- [[users/Jon]]
+
+> It's a markdown bug. For some reason, markdown doesn't recognize the email with a '+' as an email:
+>
+> $ echo '<a+b@c.org>' | markdown
+> <p><a+b@c.org></p>
+>
+> htmlscrubber then (rightly) removes this unknown tag.
+>
+
+>> Filed [in CPAN](http://rt.cpan.org/Ticket/Display.html?id=37909)
+>> --[[Joey]] [[!tag done]]
+
+> But I've noticed some other Text::Markdown bugs that, even with htmlscrubber, produce
+> [ill-formed (X)HTML](http://validator.w3.org/check?uri=http%3A%2F%2Fikiwiki.info%2Fbugs%2Fmarkdown_bug%3A_email_escaping_and_plus_addresses%2F).
+> (View the markdown source of this page.)
+>
+> --Gabriel
+
+>> The htmlscrubber does not attempt to produce valid html from invalid. It
+>> attempts to prevent exploits in html. The tidy plugin can force html to
+>> valid. --[[Joey]]
+
+<tt>
+
+-
+>
diff --git a/doc/bugs/markdown_module_location.mdwn b/doc/bugs/markdown_module_location.mdwn
new file mode 100644
index 000000000..3e2e4daa5
--- /dev/null
+++ b/doc/bugs/markdown_module_location.mdwn
@@ -0,0 +1,49 @@
+If the Markdown module is installed via CPAN rather than apt then
+the module is actually Text::Markdown.
+
+I had to edit the source to change this on my old server. I have filed
+a [bug](http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=387687) against
+the Debian markdown, which I guess you can consider a blocking bug of this.
+
+I tried to come up with the magical invocation to allow either location
+to be used by ikiwiki, but I couldn't do it.
+
+-- [[JamesWestby]]
+
+Fixed, I think --[[Joey]]
+
+> Fraid not. The import works ok, but I get
+> `Undefined subroutine &Markdown::Markdown called at IkiWiki/Plugin/mdwn.pm line 41.`
+> This is what stumped me, I was trying to import as an alias, but couldn't work
+> out how to do it. A flag if you use the second import would be an ugly solution.
+> -- [[JamesWestby]]
+
+Ok, the markdown in CPAN must be an entirely different version then, if it
+doesn't have a Markdown::Markdown. Interesting, I'll have a look at it.
+--[[Joey]]
+
+> It works if you use Text::Markdown::Markdown, sorry, I forgot to mention that.
+> --JamesWestby
+
+I think what I've committed now will work in all cases. Well, unless there
+are even more forks of markdown out there (the CPAN module is a fork
+apparently...)
+
+ --[[Joey]]
+
+> It now compiles here, thanks. --JamesWestby
+
+> It's back open in the latest incarnation of Text::Markdown ... the fix is to use the
+> lowercase function name (Text::Markdown::markdown); however, w/ this setup
+> it causes a segfault on my system... it goes down while compiling
+
+ todo/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn
+ *** glibc detected *** double free or corruption (!prev): 0x08bced80 ***
+
+ -- [[harningt]]
+
+> What version of Text::Markdown are you referring to? If it crashes perl
+> then perhaps you need to find a less evil version... --[[Joey]]
+
+>> The patch in [[todo/Add_support_for_latest_Text::Markdown_as_found_on_CPAN]] adds support for Text::Markdown::markdown(). -- [[HenrikBrixAndersen]]
+>> Doesn't fix the above double free though. Nevertheless, I think I'm going to call this [[done]] since I already added support for Text::Markdown::markdown in git earlier this week. --[[Joey]]
diff --git a/doc/bugs/mercurial_fail_to_add.mdwn b/doc/bugs/mercurial_fail_to_add.mdwn
new file mode 100644
index 000000000..3bbf4e5fd
--- /dev/null
+++ b/doc/bugs/mercurial_fail_to_add.mdwn
@@ -0,0 +1,34 @@
+I don't know what's wrong, but I can't add a file with the mercurial backend (the file is created but not added).
+
+Here is a patch that seems to work, although I'm not quite sure what's wrong with the current code:
+
+ hbernard@tactic:/usr/share/perl5/IkiWiki/Rcs$ diff mercurial.pm /home/hbernard/mercurial.pm -Nau
+ --- mercurial.pm 2007-03-24 16:14:35.000000000 +0100
+ +++ /home/hbernard/mercurial.pm 2007-04-19 19:05:47.000000000 +0200
+ @@ -95,7 +95,7 @@
+ sub rcs_add ($) {
+ my ($file) = @_;
+
+ - my @cmdline = ("hg", "-q", "-R", "$config{srcdir}", "add", "$file");
+ + my @cmdline = ("hg", "-q", "-R", "$config{srcdir}", "add", "$config{srcdir}/$file");
+ if (system(@cmdline) != 0) {
+ warn "'@cmdline' failed: $!";
+ }
+
+My srcdir path has some symbolic links and hidden directories... maybe that's it?
+
+> Interesting, the mercurial test suite shows the add without the path
+> working ok. OTOH, it also continues to work if I apply your patch, so I
+> guess it's safe enough. It would be good to know why it's failing w/o the
+> path in your case. --[[Joey]]
+
+>> I didn't have time to investigate, but here is the path:
+>>
+>> /home/hbernard/.hiddendatas/hg/ikigtd/
+>>
+>> but ~/.hiddendatas is itself a symbolic link to .dotfiles/.hiddendatas (this seems weird, I know ;))
+>> As I was trying to resolve this, the interesting thing is that typing the same command directly in a terminal worked...
+>> I suspect something with the symbolic link... but didn't investigate further. I will set up a test wiki with a symlink in the path to check this.
+
+> As I applied the patch, I'm moving this out of the patchqueue to bugs and
+> marking it [[done]]. --[[Joey]]
diff --git a/doc/bugs/merging_to_basewiki_causes_odd_inconsistencies.mdwn b/doc/bugs/merging_to_basewiki_causes_odd_inconsistencies.mdwn
new file mode 100644
index 000000000..2a668fd59
--- /dev/null
+++ b/doc/bugs/merging_to_basewiki_causes_odd_inconsistencies.mdwn
@@ -0,0 +1,6 @@
+Merging pages to the basewiki causes some odd inconsistencies. In particular,
+if I edit a page, you merge it to the basewiki, and I go to edit it again, the
+edit box displays the old text, without my edit. --[[JoshTriplett]]
+
+Funny, I was getting annoyed at this independently, and committed a fix
+before I saw this.. [[bugs/done]] --[[Joey]]
diff --git a/doc/bugs/messed_up_repository.mdwn b/doc/bugs/messed_up_repository.mdwn
new file mode 100644
index 000000000..e245b84a8
--- /dev/null
+++ b/doc/bugs/messed_up_repository.mdwn
@@ -0,0 +1,21 @@
+I messed up my local clone of my repository.
+
+It appears that there is a special clone, which contains .ikiwiki, local.css, recentchanges, and the like.
+
+How can I create a new version of this clone?
+
+> No, there's the srcdir, which ikiwiki populates with some of those files
+> when run. Notably the .ikiwiki directory and all its contents, and the
+> recentchanges directory and its contents. But not local.css.
+>
+> If you've lost .ikiwiki and it contained user registration info
+> (passwords etc), you've lost that info. Everything else can be
+> regenerated by running `ikiwiki -setup your.setup`
+>
+> If you still have .ikiwiki, but the git clone is messed up somehow, you
+> can just make a new clone and move .ikiwiki into it before running
+> ikiwiki. --[[Joey]]
+
+> > Great, that worked. Thanks Joey!
+
+[[!tag done]]
diff --git a/doc/bugs/meta_inline.mdwn b/doc/bugs/meta_inline.mdwn
new file mode 100644
index 000000000..9ca32a0e8
--- /dev/null
+++ b/doc/bugs/meta_inline.mdwn
@@ -0,0 +1,4 @@
+The meta plugin doesn't affect a page if it's being inlined. Probably
+setting the title with it should override the title of the blog post.
+
+[[bugs/done]]
diff --git a/doc/bugs/methodResponse_in_add__95__plugins.mdwn b/doc/bugs/methodResponse_in_add__95__plugins.mdwn
new file mode 100644
index 000000000..c82b532db
--- /dev/null
+++ b/doc/bugs/methodResponse_in_add__95__plugins.mdwn
@@ -0,0 +1,41 @@
+**problem description:** when using an external plugin like rst, the cgi script (but not the build process) fails with the following error:
+
+ Unsuccessful stat on filename containing newline at /usr/share/perl5/IkiWiki.pm line 501.
+ Unsuccessful stat on filename containing newline at /usr/share/perl5/IkiWiki.pm line 501.
+ Failed to load plugin IkiWiki::Plugin::</methodResponse>
+ : Can't locate IkiWiki/Plugin/.pm in @INC (@INC contains: /home/ikiwiki/.ikiwiki /etc/perl \
+ /usr/local/lib/perl/5.10.0 /usr/local/share/perl/5.10.0 /usr/lib/perl5 /usr/share/perl5 \
+ /usr/lib/perl/5.10 /usr/share/perl/5.10 /usr/local/lib/site_perl .) at (eval 44) line 3.
+ BEGIN failed--compilation aborted at (eval 44) line 3.
+
+**setup used:** blank debian sid with ikiwiki 2.61 (but as the patch can be cleanly merged to git HEAD, i suppose this would happen on HEAD as well). perl version is 5.10.0-13.
+
+**problem analysis:** `strings ikiwiki.cgi` tells that the stored WRAPPED\_OPTIONS contain the string "&lt;/methodResponse&gt;\n" where i'd expect "rst" in `config{add_plugins}`. this seems to originate in the use of `$_` in the plugin loading function.
+
+**patch comment:** solves the problem on 2.61. as these are the first lines of perl i've knowingly written, i can not explain what exactly was happening there.
+
+> Perl's `$_` handling is the worst wart on it, or possibly any language.
+> Here it's an alias to the actual value in the array, and when deep
+> in the external plugin load code something resets `$_` to a different
+> value, the alias remains and it changes the value at a distance.
+>
+> Thanks for the excellent problem report, [[fixed|done]]. --[[Joey]]
+
+------------------------------------------------------------------------------
+ diff --git a/IkiWiki.pm b/IkiWiki.pm
+ index e476521..d43abd4 100644
+ --- a/IkiWiki.pm
+ +++ b/IkiWiki.pm
+ @@ -471,7 +471,11 @@ sub loadplugins () {
+ unshift @INC, possibly_foolish_untaint($config{libdir});
+ }
+
+ - loadplugin($_) foreach @{$config{default_plugins}}, @{$config{add_plugins}};
+ + my $pluginname;
+ + foreach $pluginname (@{$config{default_plugins}}, @{$config{add_plugins}})
+ + {
+ + loadplugin($pluginname);
+ + }
+
+ if ($config{rcs}) {
+ if (exists $IkiWiki::hooks{rcs}) {
diff --git a/doc/bugs/minor:_tiny_rendering_error.mdwn b/doc/bugs/minor:_tiny_rendering_error.mdwn
new file mode 100644
index 000000000..b2e07eef9
--- /dev/null
+++ b/doc/bugs/minor:_tiny_rendering_error.mdwn
@@ -0,0 +1,5 @@
+`\[[!inline]]` is rendered with a space in front of the first closing bracket. --[[tschwinge]]
+
+> I don't think that complicating the directive parser
+> is warranted by the minorness of this bug. The result that it outputs is
+> still valid. --[[Joey]]
diff --git a/doc/bugs/misctemplate_does_not_respect_the_current_page___40__if_any__41__.mdwn b/doc/bugs/misctemplate_does_not_respect_the_current_page___40__if_any__41__.mdwn
new file mode 100644
index 000000000..3b0347f5f
--- /dev/null
+++ b/doc/bugs/misctemplate_does_not_respect_the_current_page___40__if_any__41__.mdwn
@@ -0,0 +1,101 @@
+I really like the new approach to having only one main template "page.tmpl". For instance, it improves previews during edits.
+But it causes some nasty bugs for plugins that use the pagetemplate hook. It is especially visible with the [[plugins/sidebar]] plugin.
+
+## Some examples
+
+### A first example
+
+* activate sidebars globally and cgi
+* create "/sidebar.mdwn" with "[<span></span>[foo]]" inside
+* create "/foo.mdwn" with "hello!" inside
+* create "/bar.mdwn"
+* run ikiwiki
+* with the web browser, go to the page "bar"
+* notice the sidebar, click on "foo"
+* notice the page "foo" is now displayed (hello!)
+* return to the page "bar" and click "edit"
+* notice the sidebar is still here, click on the "foo"
+* -> Problem: 404, the browser goes to "/bar/foo"
+* -> Was expected: the browser goes to "/foo"
+
+> You must have a locally modified `page.tmpl` that omits the "TMPL_IF DYNAMIC"
+> that adds a `<base>` tag. That is needed to make all links displayed by
+> cgis work reliably. Not just in this page editing case.
+> The [[version_3.20100515]] announcement mentions that you need to
+> update old `page.tmpl` files to include that on upgrade. --[[Joey]]
+
+>> I followed the announcement. I also disabled my custom page.tmpl to confirm the bug. I even produced a step-by-step example to reproduce the bug.
+>> In fact, the base tag works for the page links (the content part) but does not work for the sidebar links (the sidebar part), since the sidebar links are generated in the context of the root page.
+>> In the example above:
+>>
+>> * base="http://www.example.com/bar" relative_link_in_bar="something" -> absolute_link_in_bar = "http://www.example.com/bar/something" (that is fine)
+>> * base="http://www.example.com/bar" relative_link_in_sidebar="foo" (because generated in the context of the root page) -> absolute_link_in_sidebar = "http://www.example.com/bar/foo" (that is not fine)
+>>
+>> The fix committed works for previewing, but not in other cases: links are still broken. Please just follow the example step-by-step to reproduce it (I just retried it with a "fixed" version: Debian 3.20100610). If you cannot reproduce, please say so explicitly instead of guessing about my inability to read changelogs. -- [[JeanPrivat]]
+
+>>> Sorry if my not seeing the bug offended you. [[Fixed|done]] --[[Joey]]
+
+>>>> Thanks! --[[JeanPrivat]] (I'm not offended)
+
+### A second example
+
+* create "/bar/sidebar.mdwn" with "world"
+* run ikiwiki
+* with the web browser, go to the page "bar"
+* notice the sidebar displays "world"
+* click "edit"
+* -> Problem: the sidebar now shows the foo link (it is the root sidebar!)
+* -> Was expected: the sidebar displays "world"
+
+> I think it's a misconception to think that the page editing page is the same
+> as the page it's editing. If you were deleting that page, would you expect
+> the "are you sure" confirmation page to display the page's sidebar?
+> --[[Joey]]
+
+>> It is a very good point and could be argued:
+>>
+>> * for dynamic page, is the root context more legitimate than the current page context?
+>> * when clicking the Edit link, does the user expect to remain in the "same page"?
+>>
+>> But as long as something sensible is displayed and the links work, I'm OK with any choice. -- [[JeanPrivat]]
+
+### A last example
+
+* with the web browser edit the page "bar"
+* type <code>[<span></span>[!sidebar content="goodby"]]</code>
+* click preview
+* -> Problem: the sidebar still displays the foo link
+* -> Was expected: the sidebar displays "goodby"
+
+> In the specific case of previewing, it is indeed a bug that the
+> right sidebar is not displayed. And replacing the regular sidebar
+> with the one from the previewed page is probably the best we can do..
+> displaying 2 sidebars would be confusing, and the `page.tmpl` can
+> put the sidebar anywhere so we can't just display the preview sidebar
+> next to the rest of the page preview. --[[Joey]]
+
+>> The behavior is fine for me. However, some nitpicking (feel free to ignore):
+>>
+>> * If the sidebar is replaced (making the preview in-place), then for consistency, should the previewed content not also be shown in-place? i.e. above the form part
+>> * there is no way to come back (without saving or canceling) to the root context (e.g. displaying the root sidebar), i.e. some sort of unpreviewing.
+>>
+>> -- [[JeanPrivat]]
+
+## Some superficial hacking
+
+With the following workaround hacks, I managed to solve the 3 examples shown above:
+
+1- edit IkiWiki/Plugin/editpage.pm and call showform with additional page and destpage parameters:
+<pre>showform($form, \@buttons, $session, $q, forcebaseurl => $baseurl, page => $page, destpage => $page);</pre>
+
+2- edit /usr/share/perl5/IkiWiki.pm and modify the misctemplate function to use the given page and destpage:
+<pre>my %params=@_;
+shift->(page => $params{page}, destpage => $params{destpage}, template => $template);</pre>
+
+I do not guarantee (I do not even expect) that it is the proper way to solve
+this bug but it may help developers to find and solve the real problem.
+
+> Oh, it's pretty reasonable. I don't think it breaks anything. :)
+> I modified it a bit, and explicitly made it *not* "fix" the second example.
+> --[[Joey]]
+>> I removed the done tag (I suspect it is the way to reopen bugs) -- [[JeanPrivat]]
diff --git a/doc/bugs/monotone_backend_does_not_support_srcdir_in_subdir.mdwn b/doc/bugs/monotone_backend_does_not_support_srcdir_in_subdir.mdwn
new file mode 100644
index 000000000..35f624f78
--- /dev/null
+++ b/doc/bugs/monotone_backend_does_not_support_srcdir_in_subdir.mdwn
@@ -0,0 +1,5 @@
+The [[rcs/monotone]] backend does not currently support putting the ikiwiki srcdir
+in a subdirectory of the repository. It must be at the top. Git has
+special code to handle this case. --[[Joey]]
+
+[[done]]
diff --git a/doc/bugs/more_and_RSS_generation.mdwn b/doc/bugs/more_and_RSS_generation.mdwn
new file mode 100644
index 000000000..00ab43fa2
--- /dev/null
+++ b/doc/bugs/more_and_RSS_generation.mdwn
@@ -0,0 +1,20 @@
+I'd like the more plugin and RSS to play better together. In the case of the html generation of the main page of a blog, I'd like to get the first paragraph out, but keep RSS as a full feed.
+
+Maybe there is a different plugin (I also tried toggle)?
+
+> I am not a fan of the more directive (thus the rant about it sucking
+> embedded in its [[example|ikiwiki/directive/more]]). But I don't think
+> that weakening it to not work in rss feeds is a good idea, if someone
+> wants to force users to go somewhere to view their full content,
+> they should be able to do it, even though it does suck.
+>
+> The toggle directive will degrade fairly well in an rss feed to
+> display the full text. (There is an annoying toggle link that does
+> nothing when embedded in an rss feed). --[[Joey]]
+
+I also note that, at least currently, more seems to break on a few pages, not being parsed at all when aggregated into the front page.
+
+> It's just a simple directive, it should work anywhere any directive will,
+> and does as far as I can see. Details? --[[Joey]]
+
+see also: [[/bugs/rss_feeds_do_not_use_recommended_encoding_of_entities_for_some_fields/]]
diff --git a/doc/bugs/multiple_encoding_issues_in_atom.mdwn b/doc/bugs/multiple_encoding_issues_in_atom.mdwn
new file mode 100644
index 000000000..b5ec034ab
--- /dev/null
+++ b/doc/bugs/multiple_encoding_issues_in_atom.mdwn
@@ -0,0 +1,8 @@
+Two examples of encoding breakage observed in the wild. In both cases
+the ampersand needs to be escaped.
+--[[Joey]]
+
+ <link href="http://www.youtube.com/watch?v=Z9hP9lhBDsI&feature=youtube_gdata"/>
+
+ <category term="vicky&alice" />
+
diff --git a/doc/bugs/multiple_pages_with_same_name.mdwn b/doc/bugs/multiple_pages_with_same_name.mdwn
new file mode 100644
index 000000000..20c38c062
--- /dev/null
+++ b/doc/bugs/multiple_pages_with_same_name.mdwn
@@ -0,0 +1,76 @@
+I'm just working on an updated solution to [[todo/automatic_use_of_syntax_plugin_on_source_code_files]] (see also [[plugins/contrib/highlightcode]] or [[plugins/contrib/sourcehighlight]]).
+
+I realised that this is going to have problems when you ask it to process `.c` and `.h` files with the same base name. e.g. `hello.c` and `hello.h`.
+
+I tested it briefly with `test.java` and `test.mdwn` just to see what would happen. Things got quite strange. The source-highlighting plugin was called (probably for the java file), but then when it calls `pagetype($pagesources{$page})` to figure out the file type, that function returns `mdwn`, which confuses things somewhat.
+
+> This is a known possible point of confusion. If there are multiple source
+> files, it will render them both, in an arbitrary sequence, so one "wins".
+> --[[Joey]]
+
+Anyway, I'm thinking about possible solutions. The best option I've come up with so far is: when registering an htmlize hook, add a new optional parameter 'keep_extension'. This would make a source file of `hello.c` generate a page with name `hello.c` rather than the current `hello`. This would keep the pages unique (until someone makes `hello.c.mdwn`...).
+
+Suggestions welcome.
+
+-- [[Will]]
+
+> Ok, this turned out not to be a hard change. [[patch]] is below. With this patch you can tell IkiWiki not to drop the suffix when you register a hook: `hook(type => "htmlize", id => $lang, call => \&htmlize, leavesuffix => 1);`
+
+>> I think that's a good solution to the problem that most syntax plugins
+>> have struggled with. It makes sense. It doesn't solve the case where
+>> you have source files without any extension (eg `Makefile`), but at
+>> least it covers the common cases.
+>>
+>> I'm going to be annoying and call it "keepextension", otherwise, applied
+>> as-is. --[[Joey]] [[done]]
+
+ diff --git a/IkiWiki.pm b/IkiWiki.pm
+ index 4e4da11..853f905 100644
+ --- a/IkiWiki.pm
+ +++ b/IkiWiki.pm
+ @@ -618,7 +618,7 @@ sub pagename ($) {
+
+ my $type=pagetype($file);
+ my $page=$file;
+ - $page=~s/\Q.$type\E*$// if defined $type;
+ + $page=~s/\Q.$type\E*$// if defined $type && !$hooks{htmlize}{$type}{leavesuffix};
+ return $page;
+ }
+
+ diff --git a/t/pagename.t b/t/pagename.t
+ index 96e6a87..58811b9 100755
+ --- a/t/pagename.t
+ +++ b/t/pagename.t
+ @@ -6,7 +6,7 @@ use Test::More tests => 5;
+ BEGIN { use_ok("IkiWiki"); }
+
+ # Used internally.
+ -$IkiWiki::hooks{htmlize}{mdwn}=1;
+ +$IkiWiki::hooks{htmlize}{mdwn}{call}=1;
+
+ is(pagename("foo.mdwn"), "foo");
+ is(pagename("foo/bar.mdwn"), "foo/bar");
+
+----
+
+I wonder if this patch will also be useful:
+
+> Reasonable, applied.
+
+ diff --git a/IkiWiki/Render.pm b/IkiWiki/Render.pm
+ index 752d176..3f1b67b 100644
+ --- a/IkiWiki/Render.pm
+ +++ b/IkiWiki/Render.pm
+ @@ -279,7 +279,11 @@ sub refresh () {
+ else {
+ $f=~s/^\Q$config{srcdir}\E\/?//;
+ push @files, $f;
+ - $exists{pagename($f)}=1;
+ + my $pagename = pagename($f);
+ + if ($exists{$pagename}) {
+ + warn(sprintf(gettext("Page %s has multiple possible source pages"), $pagename)."\n");
+ + }
+ + $exists{$pagename}=1;
+ }
+ }
+ },
diff --git a/doc/bugs/multiple_rss_feeds_per_page.mdwn b/doc/bugs/multiple_rss_feeds_per_page.mdwn
new file mode 100644
index 000000000..f65d1884e
--- /dev/null
+++ b/doc/bugs/multiple_rss_feeds_per_page.mdwn
@@ -0,0 +1,31 @@
+Pages with multiple inline macros try to use the same URL for the RSS feed for each inline. As a result, the last inline "wins" and overwrites the other feeds on the same page.
+
+Josh Triplett suggests that the inline macro should take a parameter for the feed basename, and refuse to generate feeds after the first one if that parameter is not specified. That sounds like a good solution to me.
+
+> That's a reasonable fix to this longstanding bug. Autoincrementing a
+> basename value would also work.
+>
+> I've known about this bug since well, the day I wrote rss support, but
+> I haven't seen a use case that really motivated me to take the time to
+> fix it. Fixes or good motivation both accepted. :-) --[[Joey]]
+
+> A good reason to support autoincrementing might be that it's possible
+> to have a blog feed that inlines another blog feed. On purpose, or
+> semi-on-accident, it happened to me:
+>
+> <http://kitenet.net/~joey/code/whatsnew/>
+>
+> The result was that my whatsnew feed actually contains my Words2Nums
+> feed, or something. --[[joey]]
+
+> I've implemented autoincrementing unique feeds, the first one on a page
+> is a .rss, next is .rss2, etc.
+>
+> There may be room for manual specification of feed basenames, but it is tricky to do that
+> well. One problem is that if page foo adds a feed with basename bar,
+> the resulting "foo_bar.rss" would have the same name as a feed for page
+> foo_bar. (Assuming usedirs is not set.) This is also why I stuck the
+> number on the end of the filename extension -- it's slightly ugly, but
+> it avoids all such naming ambiguities.
+>
+> Anyway, I think this is [[done]] --[[Joey]]
diff --git a/doc/bugs/must_save_before_uploading_more_than_one_attachment.mdwn b/doc/bugs/must_save_before_uploading_more_than_one_attachment.mdwn
new file mode 100644
index 000000000..bd5ddc6d5
--- /dev/null
+++ b/doc/bugs/must_save_before_uploading_more_than_one_attachment.mdwn
@@ -0,0 +1,44 @@
+When I create a new page and upload an attachment all is fine.
+
+If I try to upload a second attachment (or remove the previously uploaded attachment), no upload happens. Instead the page gets created. No matter what I typed in, I just get a map to show the attachment. Now I can edit this page and everything is fine again.
+
+Another workaround is to first save the text and then edit and upload the rest.
+
+Is this a problem on my site or does anyone else see this?
+
+(If it's my fault feel free to move this to [[forum]].)
+
+> I don't see a behavior like that.
+> I don't know what you mean when you say "I just get a map to show the
+> attachment" A map?
+>
+> What version of ikiwiki? What browser? Is javascript enabled? --[[Joey]]
+
+>> I mean the [[ikiwiki/directive/map]] directive.
+>> It was ikiwiki 3.20110430.
+>> Tried Firefox and uzbl (webkit) with or without javascript.
+>>
+>> Just updated to 3.20110905. Now the problem has changed. Instead of saving the page with the second upload and leading me to it, it leaves me in the edit form but creates the page anyway.
+>> When saving I get informed, that someone else created the page. Obviously it was ikiwiki itself with the mentioned map:
+>> \[[!map pages="path/to/page/* and ! ...
+>>
+>> This told me that [[plugins/autoindex]] is the bad guy. Deactivating this plugin helps out. Don't know if this is worth fixing... I can live without that plugin. --bacuh
+
+>>> The right fix would probably be for `do=create` to allow replacing a page
+>>> in the transient underlay without complaining (like the behaviour that
+>>> `do=edit` normally has).
+
+>>>> ... which it turns out it already does. --[[smcv]]
+
+>>> That wouldn't help you unless [[plugins/autoindex]]
+>>> defaulted to making transient pages (`autoindex_commit => 0`), but if we
+>>> can fix [[removal_of_transient_pages]] then maybe that default can change?
+>>> --[[smcv]]
+
+>>>> It turns out that with `autoindex_commit => 0`, the failure mode is
+>>>> different. The transient map is created when you attach the
+>>>> attachment. When you save the page, it's written into the srcdir,
+>>>> the map is deleted from the transientdir, and the ctime/mtime
+>>>> in the indexdb are those of the file in the srcdir, but for some
+>>>> reason the HTML output isn't re-generated (despite a refresh
+>>>> happening). --[[smcv]]
diff --git a/doc/bugs/nested_inlines_produce_no_output.mdwn b/doc/bugs/nested_inlines_produce_no_output.mdwn
new file mode 100644
index 000000000..3f2fccdfb
--- /dev/null
+++ b/doc/bugs/nested_inlines_produce_no_output.mdwn
@@ -0,0 +1,12 @@
+If an inlined page itself contains an inline directive, the nested directive will produce no output. In [this example wiki](http://www.willthompson.co.uk/tmp/ikiwiki-nested-inline/), the following pages exist:
+
+ * _pets_: contains some content, and the directive `inline pages="pets/* and !pets/*/*"` to inline its immediate children.
+ * _pets/dogs_: some content, and `inline pages="pets/dogs/*"`.
+ * _pets/dogs/fifi_, _pets/dogs/rover_: content.
+ * _pets/cats_, _pets/cats/mumu_, _pets/cats/ceefer_: similar.
+
+When rendered, _pets_ [contains](http://www.willthompson.co.uk/tmp/ikiwiki-nested-inline/output/pets/) the content from _pets/dogs_ and _pets/cats_, but not the pages inlined into them. However, the subpages [correctly](http://www.willthompson.co.uk/tmp/ikiwiki-nested-inline/output/pets/dogs/) [include](http://www.willthompson.co.uk/tmp/ikiwiki-nested-inline/output/pets/cats/) their own children.
+
+This used to work in at least ikiwiki 1.45. I stepped through `preprocess_inline`, but couldn't see why this wasn't working.
+
+> Broke due to overoptimisation, fixed now. [[done]] --[[Joey]]
diff --git a/doc/bugs/nested_raw_included_inlines.mdwn b/doc/bugs/nested_raw_included_inlines.mdwn
new file mode 100644
index 000000000..92ea4c4ef
--- /dev/null
+++ b/doc/bugs/nested_raw_included_inlines.mdwn
@@ -0,0 +1,51 @@
+I have the following structure:
+
+## page0
+ # Page 0
+ \[[!inline raw="yes" pages="page1"]]
+
+## page1
+ # Page 1
+ \[[!inline pages="page2"]]
+
+## page2
+ # Page 2
+ test
+
+In this situation, a change in page 2 will trigger a rebuild of page1 but not of page0.
+
+ refreshing wiki..
+ scanning page2.mdwn
+ rendering page2.mdwn
+ rendering page1.mdwn, which depends on page2
+ done
+
+In my real world situation, page1 is actually listing all pages that match a certain tag and page0 is the home page.
+Whenever a page got tagged, it will appear on page1 but not on page0.
+
+Am I missing something? Is this a bug, or is Ikiwiki not supposed to support this use case?
+
+> Perhaps the inline plugin isn't being clever enough about dependencies -
+> strictly speaking, when a page is inlined with full content, the inlining
+> page should probably inherit all the inlined page's dependencies.
+> That might be prohibitively slow in practice due to the way IkiWiki
+> currently merges pagespecs, though - maybe the patches I suggested for
+> [[separating_and_uniquifying_pagespecs|todo/should_optimise_pagespecs]]
+> would help? --[[smcv]]
+
+>> That, or something seems to have helped in the meantime...
+>> Actually, I think it was the [[transitive_dependencies]] support
+>> that did it, though smcv's pagespec stuff was also a crucial improvement.
+>>
+>> Anyhoo:
+
+ joey@gnu:~/tmp>touch testcase/page2.mdwn
+ joey@gnu:~/tmp>ikiwiki -v testcase html
+ refreshing wiki..
+ scanning page2.mdwn
+ building page2.mdwn
+ building page1.mdwn, which depends on page2
+ building page0.mdwn, which depends on page1
+ done
+
+>> I happily think this is [[done]] --[[Joey]]
diff --git a/doc/bugs/newfile-test.mdwn b/doc/bugs/newfile-test.mdwn
new file mode 100644
index 000000000..34e3ac6c8
--- /dev/null
+++ b/doc/bugs/newfile-test.mdwn
@@ -0,0 +1,11 @@
+The CGI tries to decide whether a user is trying to edit a new file or not with the following test:
+
+ $form->field(name => "newfile",
+ value => ! -e "$config{srcdir}/$file",
+ force => 1);
+
+Assume the script is called like this `http://example.com/ikiwiki.cgi?page=discussion&from=some-page&do=create`. The `if (exists $pagesources{$page}) {` test determines whether there's a file called `$config{srcdir}/discussion`. Most installs won't have a `$config{srcdir}/discussion` page, so this test will fail causing the else clause to be executed. In this case, the else clause results in `$file` being set to `discussion.mdwn`. Thus, on typical installs `value => ! -e "$config{srcdir}/$file",` always succeeds, which results in the expected behaviour, albeit for the wrong reasons. Similarly, the ` $form->field(name => "rcsinfo", value => rcs_prepedit($file)` line is also meaningless because `$file` isn't what we think it is.
+
+(To confirm that this wasn't just a result of my imagination, I created [[/discussion]] on this site; feel free to delete it now.)
+
+> I've fixed it to only look for an existing page if it's not creating a new page, so [[bugs/done]] --[[Joey]]
diff --git a/doc/bugs/no_commit_mails_for_new_pages.mdwn b/doc/bugs/no_commit_mails_for_new_pages.mdwn
new file mode 100644
index 000000000..3773a9455
--- /dev/null
+++ b/doc/bugs/no_commit_mails_for_new_pages.mdwn
@@ -0,0 +1,10 @@
+At least with svn, ikiwiki seems to currently not send commit mails for
+newly created pages that match a pagespec such as "*". Subsequent edits to
+the same page do result in commit mails.
+
+(Granted, this could be almost considered a feature, if the new page is in
+an rss feed..)
+
+Turned out to occur only for web commits that added a new file, since now
+the wiki does not rebuild during the commit hook in a web commit, it could
+not rely on `%pagesources` having the file. [[done]] --[[Joey]]
diff --git a/doc/bugs/no_easy_way_to_wrap_HTML_container_around_a_set_of_inlined_pages.mdwn b/doc/bugs/no_easy_way_to_wrap_HTML_container_around_a_set_of_inlined_pages.mdwn
new file mode 100644
index 000000000..1c1cbbb73
--- /dev/null
+++ b/doc/bugs/no_easy_way_to_wrap_HTML_container_around_a_set_of_inlined_pages.mdwn
@@ -0,0 +1,23 @@
+The [[ikiwiki/directive/inline]] directive applies a template to each page-to-be-inlined, but the loop over the pages is in the Perl, not the template itself. This means if I want to wrap a container `<div>` or a `<table>` or whatever around the entire set of inlined pages, I can't do it by just editing the template. In fact, I think the only way to do it without hacking any Perl is with a wrapper template directive, e.g.
+
+ \[[!template id="wrapinline" pages="..."]]
+
+with a template definition like
+
+ <div id="foo">\[[!inline ... pages="<TMPL_VAR raw_pages>"]]</div>
+
+It would be much more convenient if the loop over pages happened in the template, allowing me to just stick whatever markup I want around the loop.
+
+> Unfortunately, I don't think this can be changed at this point,
+> it would probably break a lot of stuff that relies on the current
+> template arrangement, both in ikiwiki's internals and in
+> people's own, customised inline templates. (Also, I have some plans
+> to allow a single inline to use different templates for different
+> sorts of pages, which would rely on the current one template per
+> page approach to work.)
+>
+> But there is a simple workaround.. the first template in
+> an inline has FIRST set, and the last one has LAST set.
+> So you can use that to emit your div or table top and bottom.
+>
+> [[done]] --[[Joey]]
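For instance, a customised inline template could use those variables to open the container before the first page and close it after the last (a sketch; the class names are invented and the middle only roughly resembles the real `inlinepage.tmpl`):

```html
<TMPL_IF FIRST><div class="mywrapper"></TMPL_IF>
<div class="inlinepage">
<TMPL_VAR CONTENT>
</div>
<TMPL_IF LAST></div></TMPL_IF>
```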
diff --git a/doc/bugs/no_search_button__44___hence_can__39__t_do_search_in_w3m_at_ikiwiki.info.mdwn b/doc/bugs/no_search_button__44___hence_can__39__t_do_search_in_w3m_at_ikiwiki.info.mdwn
new file mode 100644
index 000000000..2d600fdbb
--- /dev/null
+++ b/doc/bugs/no_search_button__44___hence_can__39__t_do_search_in_w3m_at_ikiwiki.info.mdwn
@@ -0,0 +1,32 @@
+If I browse <http://ikiwiki.info> in [emacs-w3m](http://www.emacswiki.org/emacs/emacs-w3m) (without Javascript), I
+can't do a [[search|plugins/search]]: the text field is there (so I can
+enter my search request), but there seems to be no way to make
+actually a search request (i.e., no button).
+
+(A remark on how it works now in the other browsers:
+In the more "complete"
+browsers (Chromium etc.), the request is done by pressing Enter in the
+text field.)
+--Ivan Z.
+
+I see, no Javascript is probably involved in using the search form;
+the code is simply:
+
+ <form method="get" action="/ikiwiki.cgi" id="searchform">
+ <div>
+ <input type="text" id="searchbox" name="P" value="" size="16"
+ />
+ </div>
+ </form>
+
+So, if the semantics suggested by HTML is such that such a form is to
+be submitted by some default form submitting action in the UI and it
+doesn't really require a button to be functional, then I'd say it's
+not an ikiwiki problem, but a missing feature in the UI of emacs-w3m
+or the underlying w3m... Perhaps I'll report this issue to them. --Ivan Z.
+
+[[!tag done]]
+There is no problem at all!
+I'm sorry for this hassle!
+In emacs-w3m, there is the <code>w3m-submit-form</code> command
+(<kbd>C-c C-c</kbd>) to submit the form at point; it works. --Ivan Z.
diff --git a/doc/bugs/nonexistent_pages_in_inline_pagenames_do_not_add_a_dependency.mdwn b/doc/bugs/nonexistent_pages_in_inline_pagenames_do_not_add_a_dependency.mdwn
new file mode 100644
index 000000000..486be0363
--- /dev/null
+++ b/doc/bugs/nonexistent_pages_in_inline_pagenames_do_not_add_a_dependency.mdwn
@@ -0,0 +1,44 @@
+In commit aaa72a3a8, Joey noted:
+
+> bestlink returns '' if no existing page matches a link. This propagated
+> through inline and other plugins, causing uninitialized value warnings, and
+> in some cases (when filecheck was enabled) making the whole directive fail.
+>
+> Skipping the empty results fixes that, but this is papering over another
+> problem: If the missing page is later added, there is no dependency
+> information to know that the inline needs to be updated. Perhaps smcv will
+> fix that later.
+
+Potential ways this could be addressed:
+
+* Add a presence dependency on everything the reference could match:
+ so if the `inline` is on `a/b/c` and the missing page is `m`,
+ add a `$depends_simple` `$DEPEND_PRESENCE` dependency on `a/b/c/m`,
+ `a/b/m`, `a/m`, `m` and (if configured) `$config{userdir}/m`
+
+* Make the page names in `\[[!inline pagenames=...]]` count as wikilinks,
+ changing the behaviour of `link()` and backlinks, but causing appropriate
+ rebuilds via the special cases in `IkiWiki::Render`
+
+* Extend the special cases in `IkiWiki::Render` to consider a superset of
+ wikilinks, to which `pagenames` would add its named pages, without
+ affecting `link()` and backlinks
+
+(Note that `\[[!inline pages=...]]` cannot count as wikilinks, because
+pagespecs can contain `link()`, so can't be evaluated until we know what
+wikilinks exist, at which point it's too late to add more wikilinks.)
+
+I think the presence dependency is probably the cleanest approach?
+--[[smcv]]
+
+> I think it was possibly a mistake to use wikilink style lookup for
+> `pagenames`. --[[Joey]]
+
+[[!tag patch]] [[!template id=gitbranch branch=smcv/literal-pagenames author="[[smcv]]"]]
+>> I used the linking rules to make references to
+>> "nearby" pages convenient, but if you'd prefer "absolute"
+>> semantics, my `ready/literal-pagenames` branch does that. For
+>> my main use-case for `pagenames` ([[plugins/contrib/album]])
+>> it's fine either way. --[[smcv]]
+
+>>> Ok, [[merged|done]]. I think it's more consistent this way. --[[Joey]]
diff --git a/doc/bugs/octal_umask_setting_is_unintuitive.mdwn b/doc/bugs/octal_umask_setting_is_unintuitive.mdwn
new file mode 100644
index 000000000..5cdefcf09
--- /dev/null
+++ b/doc/bugs/octal_umask_setting_is_unintuitive.mdwn
@@ -0,0 +1,55 @@
+To make ikiwiki publish world-readable files (usually what you want)
+regardless of your umask, you override the `umask` setting to 022
+octal (which is 18 in decimal). So far so good.
+
+However, because it's interpreted as a plain number in Perl, the
+way you set it varies between formats. In `IkiWiki::Setup::Standard`
+you can use either
+
+ umask => 022
+
+or (less obviously) one of
+
+ umask => 18
+ umask => "18"
+
+but if you use
+
+ umask => "022"
+
+you get the less than helpful umask of 026 octal (22 decimal).
+
+Similarly, in `IkiWiki::Setup::Yaml` (the default for
+[ikiwiki-hosting](http://ikiwiki-hosting.branchable.com/))
+you have to use one of
+
+ umask: 18
+ umask: "18"
+
+and if you try to say 022 you'll get 22 decimal = 026 octal.
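+The misinterpretation is easy to reproduce in any language with C-style
+octal literals; a quick Python illustration (not ikiwiki code):

```python
# A quoted "022" is parsed as decimal 22, which is octal 026 -- not the
# intended 022. Only the unquoted octal literal gives the right mask.
assumed = int("022")        # what a string value in the setup file becomes
print(assumed)              # 22
print(oct(assumed))         # 0o26 -- the surprising umask you actually get
print(0o22)                 # 18 -- the decimal number you would have to write
```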
+
+[[!tag patch]]
+[[!template id=gitbranch branch=smcv/umask-keywords author="[[smcv]]"]]
+
+Perhaps the best way to solve this would be to have keywords
+for the few values of `umask` that are actually useful?
+
+* `private` (= 077 octal = 63 decimal)
+* `group` (= 027 octal = 23 decimal)
+* `public` (= 022 octal = 18 decimal)
+
+I don't think g+w is a good idea in any case, because as
+documented on [[security]], if ikiwiki makes its `srcdir`
+group-writeable then any member of the group can "cause
+trouble" (escalate privileges to those of the wiki user?)
+via a symlink attack. So I don't think we need keywords
+for those.
+
+--[[smcv]]
+
+> I support this change, but your git repository does not seem to have
+> that branch (or anything) in it today. --[[Joey]]
+
+>> git pushes have a restrictive umask, ironically... fixed. --[[smcv]]
+
+>>> [[done]] --[[Joey]]
diff --git a/doc/bugs/opendiscussion_should_respect_the_discussion_option.mdwn b/doc/bugs/opendiscussion_should_respect_the_discussion_option.mdwn
new file mode 100644
index 000000000..cacd2b73b
--- /dev/null
+++ b/doc/bugs/opendiscussion_should_respect_the_discussion_option.mdwn
@@ -0,0 +1,11 @@
+[[!template id=gitbranch branch=smcv/ready/less-open author="[[smcv]]"]]
+[[!tag patch]]
+
+The [[plugins/opendiscussion]] plugin allows pages named according to
+the `discussionpage` setting to be edited anonymously, even if
+`discussion => 0` is set.
+
+(If it respected the `discussion` option, the combination of
+`opendiscussion` and `moderatedcomments` might be good for blogs.)
+
+[[done]] --[[smcv]]
diff --git a/doc/bugs/opendiscussion_should_respect_the_discussion_option/discussion.mdwn b/doc/bugs/opendiscussion_should_respect_the_discussion_option/discussion.mdwn
new file mode 100644
index 000000000..a5c951671
--- /dev/null
+++ b/doc/bugs/opendiscussion_should_respect_the_discussion_option/discussion.mdwn
@@ -0,0 +1,26 @@
+This would be great to see fixed. It's perplexing to have discussion => 0 in my configuration, not have any discussion links on my site, but still be able to add a discussion page by URL hacking something like this: /cgi-bin/ikiwiki/ikiwiki.cgi?page=posts%2Fdiscussion&do=edit.
+
+Spammers have figured that little trick out, so I am consistently getting spam checked into my git repository.
+
+I'm not really sure if this patch introduced other problems, but it seems to have fixed my site:
+
+ 0 mcclelland@chavez:~/.ikiwiki/IkiWiki/Plugin$ diff -u /usr/share/perl5/IkiWiki/Plugin/opendiscussion.pm opendiscussion.pm
+ --- /usr/share/perl5/IkiWiki/Plugin/opendiscussion.pm 2012-05-07 11:31:24.000000000 -0400
+ +++ opendiscussion.pm 2012-07-29 17:49:28.000000000 -0400
+ @@ -25,7 +25,7 @@
+ my $cgi=shift;
+ my $session=shift;
+
+ - return "" if $page=~/(\/|^)\Q$config{discussionpage}\E$/i;
+ + return "" if $page=~/(\/|^)\Q$config{discussionpage}\E$/i && $config{discussion};
+ return "" if pagespec_match($page, "postcomment(*)");
+ return undef;
+ }
+ 1 mcclelland@chavez:~/.ikiwiki/IkiWiki/Plugin$
+
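+For clarity, here is a Python paraphrase of the patched check (function and
+config names are illustrative, not ikiwiki's actual API; the real code is
+the Perl shown above):

```python
import re

def editable_without_login(page, config):
    # Allow anonymous edits of Discussion pages only when discussion pages
    # are enabled at all -- the one-line change the patch above makes.
    if config.get("discussion") and re.search(
            r"(/|^)" + re.escape(config["discussionpage"]) + r"$", page, re.I):
        return True
    return False

print(editable_without_login("posts/discussion",
                             {"discussion": False, "discussionpage": "discussion"}))  # False
```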
+If libdir is configured to be ~/.ikiwiki in your ikiwiki.settings file, and you are running Debian, you can do the following:
+
+ mkdir -p ~/.ikiwiki/IkiWiki/Plugin
+ cp /usr/share/perl5/IkiWiki/Plugin/opendiscussion.pm ~/.ikiwiki/IkiWiki/Plugin/
+
+And then apply the patch above to ~/.ikiwiki/IkiWiki/Plugin/opendiscussion.pm.
diff --git a/doc/bugs/openid_incompatability_with_pyblosxom_openid_server_plugin_when_used_with_simple_registration_extension.mdwn b/doc/bugs/openid_incompatability_with_pyblosxom_openid_server_plugin_when_used_with_simple_registration_extension.mdwn
new file mode 100644
index 000000000..9aa00b639
--- /dev/null
+++ b/doc/bugs/openid_incompatability_with_pyblosxom_openid_server_plugin_when_used_with_simple_registration_extension.mdwn
@@ -0,0 +1,3 @@
+This bug is described here:
+
+<http://kitenet.net/~joey/blog/entry/OpenID/discussion/>
diff --git a/doc/bugs/openid_no_longer_pretty-prints_OpenIDs.mdwn b/doc/bugs/openid_no_longer_pretty-prints_OpenIDs.mdwn
new file mode 100644
index 000000000..85a206bc0
--- /dev/null
+++ b/doc/bugs/openid_no_longer_pretty-prints_OpenIDs.mdwn
@@ -0,0 +1,17 @@
+The git commit (in my `openid` branch) says it all:
+
+ Update IkiWiki::openiduser to work with Net::OpenID 2.x
+
+ openiduser previously used a constructor that no longer works in 2.x.
+ However, all we actually want is the (undocumented) DisplayOfURL function
+ that is invoked by the display method, so try to use that.
+
+This bug affects ikiwiki.info (my commits show up in [[RecentChanges]] as http://smcv.pseudorandom.co.uk/ rather than smcv [pseudorandom.co.uk]).
+
+> Cherry picked, thanks. --[[Joey]]
+
+Relatedly, the other commit on the same branch would be nice to have
+(edited to add: I've now moved it, and its discussion, to
+[[todo/pretty-print_OpenIDs_even_if_not_enabled]]). --[[smcv]]
+
+[[!tag done]]
diff --git a/doc/bugs/openid_postsignin_failure.mdwn b/doc/bugs/openid_postsignin_failure.mdwn
new file mode 100644
index 000000000..01c3e5a6f
--- /dev/null
+++ b/doc/bugs/openid_postsignin_failure.mdwn
@@ -0,0 +1,52 @@
+I tried enabling the openid plugin on my site. I tried to log in but got an
+error when coming back to ikiwiki.cgi: "Error: unknown do parameter". I think
+this means that do=postsignin isn't handled by CGI.pm.
+
+The URI in question is fairly long, but if you want me to add it here, I can do that.
+
+I didn't really know how to debug this so I grepped for "postsignin" in both
+openid.pm and passwordauth.pm and found:
+
+ IkiWiki/Plugin/openid.pm: return_to => IkiWiki::cgiurl(do => "postsignin"),
+ IkiWiki/Plugin/passwordauth.pm: IkiWiki::cgi_postsignin($cgi, $session);
+
+Am I barking up the wrong tree? Maybe I'm missing something obvious?
+
+I'm running 1.38 of ikiwiki and the newest CGI::Session, Net::OpenID::Consumer,
+Crypt::DH, URI-Fetch. --Ethan
+
+> The url must not have a setting for openid.mode or openid_identifier in
+> it. So the OpenId plugin didn't know that it was trying to log in. I
+> think this points to an issue with the OpenID server. --[[Joey]]
+
+>> I put debugging output in openid.pm and it suggests that the
+>> verification is taking place successfully. I see "openid.mode=id_res"
+>> in the URI. On top of that, it's the same Openid server I use
+>> to sign in here on ikiwiki.info. --Ethan
+
+>>> Yikes, I don't really have the newest CGI::Session after all..
+>>> let me try updating that. --Ethan
+>>>> Sorry, I'm an idiot -- cookies disabled on my browser. Sorry to
+>>>> waste your time.. --Ethan
+
+>>>>> No problem, the error message could certainly use improvement.
+>>>>> Although if I disable cookies, myopenid lets me know. Maybe you
+>>>>> should paste the url. --[[Joey]]
+
+I have cookies disabled on my computer, with a bunch of manual
+exceptions. This includes myopenid, ikiwiki.info, livejournal,
+and some others. Unfortunately it didn't include my own domain.
+So the URI that myopenid redirected me to was fine, but because
+I didn't have cookies set, I didn't have a session, and so
+session->param('postsignin') was undefined, so instead of being
+redirected my query fell through CGI.pm to the bottom of cgi(),
+where I got the message above. In a perfect world I'd say that
+it would be nice to let the user know that they can't sign in
+w/o cookies, but I don't see any easy way of detecting that
+from CGI::Session. Maybe you know a way -- I have never used
+CGI.pm before, this isn't my forte (in case that wasn't obvious).
+--Ethan
+
+> It's not easily possible to test for cookies, but it is possible to
+> display a better error message in this failure mode. [[bugs/done]]
+> --[[Joey]]
diff --git a/doc/bugs/osm_KML_maps_do_not_display_properly_on_google_maps.mdwn b/doc/bugs/osm_KML_maps_do_not_display_properly_on_google_maps.mdwn
new file mode 100644
index 000000000..2b20240c4
--- /dev/null
+++ b/doc/bugs/osm_KML_maps_do_not_display_properly_on_google_maps.mdwn
@@ -0,0 +1,14 @@
+[[!template id=gitbranch branch=anarcat/master author="[[anarcat]]"]]
+
+I know this sounds backwards, but it seems to me that the KML-generated map should be displayable on google maps. KML is the standard Google uses for google maps, and since we use it, we should interoperate with them. God knows why this is failing, but it is and should probably be fixed for the sake of interoperability: <https://maps.google.ca/maps?q=http:%2F%2Fwiki.reseaulibre.ca%2Fmap%2Fpois.kml> -- [[users/anarcat]]
+
+> The KML only needs a Document tag because it uses "shared styles" -- don't ask me what this is. Here is a [[patch]]: [[https://reseaulibre.deuxpi.ca/0001-Add-Document-tag-to-OSM-plugin-KML-output.patch]] --[[deuxpi]]
+
+> > I applied the patch to my master branch and tested it on the above URL: it works... mostly. The icons for the elements on the actual map seem incorrect (some are the proper icons, some others are the ugly default blue pin of google maps, weird) but I think this is a step in the right direction. Thus, this should be merged. -- [[anarcat]]
+
+>>> I've cherry-picked this patch, but from the description it does not
+>>> sound "fixed" enough to close this bug. (OTOH, perhaps only google can
+>>> fix it, so if people are happy with the state of affairs I won't insist
+>>> this bug be left open.) --[[Joey]]
+
+> > > > I am happy with this right now, so let's mark this as [[done]]. I do agree this seems like a google bug, so let's move on. --[[anarcat]]
diff --git a/doc/bugs/osm_KML_maps_icon_path_have_a_trailing_slash.mdwn b/doc/bugs/osm_KML_maps_icon_path_have_a_trailing_slash.mdwn
new file mode 100644
index 000000000..a3a88d138
--- /dev/null
+++ b/doc/bugs/osm_KML_maps_icon_path_have_a_trailing_slash.mdwn
@@ -0,0 +1,34 @@
+This is not a problem on Apache webservers because they, oddly enough, ignore trailing slashes on paths (maybe some `PATH_INFO` magic, no idea). But basically, in our wiki, the paths to the icon tags are generated with a trailing slash. An excerpt of our [KML file](http://wiki.reseaulibre.ca/map/pois.kml):
+
+ <Style id="/tag/up">
+ <IconStyle>
+ <Icon>
+ <href>http://wiki.reseaulibre.ca//tag/up/icon.png/</href>
+ </Icon>
+ </IconStyle>
+ </Style>
+
+Notice the trailing `/` after the `icon.png`. This breaks display on nginx - the file that gets served isn't the icon, but the frontpage for some reason. I followed the [[setup instructions|tips/dot cgi]] for Nginx that I just had to write because there weren't any, so maybe I screwed up some part, but it does seem to me that the trailing slash is wrong regardless.
+
+(Also notice how the style tag is being turned over backwards by the HTML sanitizer here, cute. :P)
+
+I wrote a crude hack for this, but this strikes me as a similar problem to the one we found in [[bugs/osm linkto() usage breaks map rendering]]. However, I am at a loss how to fix this cleanly because we cannot `will_render()` the tag icons, as they are already generated out there! Weird. Anyways, here's the stupid [[patch]]:
+
+[[!format diff """
+diff --git a/IkiWiki/Plugin/osm.pm b/IkiWiki/Plugin/osm.pm
+index a7baa5f..c9650d0 100644
+--- a/IkiWiki/Plugin/osm.pm
++++ b/IkiWiki/Plugin/osm.pm
+@@ -192,6 +192,7 @@ sub process_waypoint {
+ }
+ }
+ $icon = urlto($icon, $dest, 1);
++ $icon =~ s!/*$!!; # hack - urlto shouldn't be appending a slash in the first place
+ $tag = '' unless $tag;
+ register_rendered_files($map, $page, $dest);
+ $pagestate{$page}{'osm'}{$map}{'waypoints'}{$name} = {
+"""]]
+
+I'm not writing this to a branch out of sheer shame of my misunderstanding. ;) There also may be a workaround that could be done in Nginx too. --[[anarcat]]
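+For illustration, the Perl substitution in the patch corresponds to this
+Python one-liner (a sketch only, not the actual plugin code):

```python
import re

url = "http://wiki.reseaulibre.ca//tag/up/icon.png/"
# Equivalent of the Perl hack  $icon =~ s!/*$!!;  -- strip trailing slashes
print(re.sub(r"/*$", "", url))  # http://wiki.reseaulibre.ca//tag/up/icon.png
```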
+
+> [[applied|done]], but I'm not happy with this either --[[Joey]]
diff --git a/doc/bugs/osm_linkto__40____41___usage_breaks_map_rendering.mdwn b/doc/bugs/osm_linkto__40____41___usage_breaks_map_rendering.mdwn
new file mode 100644
index 000000000..89c08b73c
--- /dev/null
+++ b/doc/bugs/osm_linkto__40____41___usage_breaks_map_rendering.mdwn
@@ -0,0 +1,23 @@
+[[!template id=gitbranch branch=anarcat/master author="[[anarcat]]"]]
+
+Under some circumstances that remain unclear to me, the usage of `urlto()` in the revised version of the [[plugins/osm]] plugin breaks the map entirely. The JavaScript console in Chromium tells me the following:
+
+ GET http://mesh.openisp.ca/map/pois.kml/ 404 (Not Found)
+
+Indeed, that URL yields a 404. The proper URL is <http://mesh.openisp.ca/map/pois.kml>. --[[anarcat]]
+
+## Proposed solution
+
+The problem seems to be caused by `urlto()` being called for the `osm`
+directive before the generated files are registered with `will_render()`
+from the `waypoint` directive. Proposed patch adds a function that is
+called from the `preprocess` hook for both directives that registers the
+files.
+
+Here is a [[patch]] to IkiWiki/Plugin/osm.pm: <https://reseaulibre.deuxpi.ca/0000-Fix-incorrect-URL-pointing-to-the-generated-waypoint.patch>
+
+--[[deuxpi]]
+
+I confirm the patch works, and I added it to my master branch. --[[anarcat]]
+
+> [[applied|done]]. Thanks guys. --[[Joey]]
diff --git a/doc/bugs/osm_sometimes_looses_some_nodes.mdwn b/doc/bugs/osm_sometimes_looses_some_nodes.mdwn
new file mode 100644
index 000000000..9de1b4e23
--- /dev/null
+++ b/doc/bugs/osm_sometimes_looses_some_nodes.mdwn
@@ -0,0 +1,5 @@
+I have heard repeated reports on <http://mesh.openisp.ca/> that editing a page that has a waypoint in it will sometimes make that waypoint disappear from the main map. I have yet to understand why that happens or how, but multiple users have reported that.
+
+A workaround is to rebuild the whole wiki, although sometimes re-editing the same page will bring the waypoint back on the map.
+
+I have been able to reproduce this by simply creating a new node. It will not show up on the map until the wiki is rebuilt or the node is resaved. -- [[anarcat]]
diff --git a/doc/bugs/output_of_successful_rename_should_list_the_full_path_to_affected_pages.mdwn b/doc/bugs/output_of_successful_rename_should_list_the_full_path_to_affected_pages.mdwn
new file mode 100644
index 000000000..132d23463
--- /dev/null
+++ b/doc/bugs/output_of_successful_rename_should_list_the_full_path_to_affected_pages.mdwn
@@ -0,0 +1,14 @@
+I've just renamed a page and received the following as a result:
+
+<p>
+<b>Successfully renamed users/jondowland.mdwn to users/jon.mdwn.</b>
+</p>
+<p>
+
+The following pages have been automatically modified to update their links to users/jon.mdwn:
+<ul>
+<li><a href="./../../tips/convert_mediawiki_to_ikiwiki/discussion/">discussion</a></li><li><a href="./../../tips/untrusted_git_push/discussion/">discussion</a></li></ul>...
+
+In this situation I think the links to pages should be expanded to show the entire path, since there are quite likely to be many pages called "discussion". -- [[users/Jon]]
+
+[[done]]
diff --git a/doc/bugs/package_build_fails_in_non-English_environment.mdwn b/doc/bugs/package_build_fails_in_non-English_environment.mdwn
new file mode 100644
index 000000000..521ba62f8
--- /dev/null
+++ b/doc/bugs/package_build_fails_in_non-English_environment.mdwn
@@ -0,0 +1,11 @@
+basewiki_brokenlinks.t fails when running dpkg-buildpackage in a non-English environment: it greps for a (non-)error message that is i18n'd. This of course does not happen when building in a proper chroot environment... which happens to fail as well, for other reasons, but that will be for another bug.
+
+The `LANG=` on line 9 does not seem to do what it's supposed to, go figure.
+
+I've never had to understand the Unix locales, so I randomly tried to replace `LANG=` in basewiki_brokenlinks.t with:
+
+- `LANG=C` : fails
+- `LANGUAGE=` : fails
+- `LANGUAGE=C` : works!
+
+> For maximum precedence it should have been LC_ALL=C. [[done]], I think... --[[smcv]]
diff --git a/doc/bugs/page_is_not_rebuilt_if_it_changes_extension.mdwn b/doc/bugs/page_is_not_rebuilt_if_it_changes_extension.mdwn
new file mode 100644
index 000000000..e47be8d28
--- /dev/null
+++ b/doc/bugs/page_is_not_rebuilt_if_it_changes_extension.mdwn
@@ -0,0 +1,27 @@
+Suppose a wiki has a source page a.mdwn, which is then moved to a.wiki.
+(Suppose both the mdwn and wikitext plugins are enabled, so this changes how "a" is rendered.)
+Currently, when the wiki is refreshed, ikiwiki doesn't notice the change
+and the page is not rebuilt.
+
+I have a [[patch]] that fixes this.
+The relevant commit on [my Github fork of ikiwiki](http://github.com/gmcmanus/ikiwiki/) is:
+
+ b6a3b8a683fed7a7f6d77a5b3f2dfbd14c849843
+
+The patch (ab)uses `%forcerebuild`, which is meant for use by plugins.
+If, for some reason, a plugin deletes the page's entry in `%forcerebuild`, it won't be rebuilt.
+
+This patch uncovers another problem.
+Suppose a wiki has a source page "a" (no extension)
+which is then moved to "a.mdwn" (or vice versa).
+ikiwiki fails when trying to create a directory "a" where there is a file "a"
+(or vice versa).
+
+The same problem occurs if both "a" and "a.mdwn" exist in the wiki.
+
+> Thank you for looking into it!
+>
+> On the use of forcerebuild, I think it's acceptable; plugins that unset
+> it would break other plugins that set it, too.
+>
+> [[cherry-picked|done]] --[[Joey]]
diff --git a/doc/bugs/page_preview_does_not_work_on_new_page_with_a_table.mdwn b/doc/bugs/page_preview_does_not_work_on_new_page_with_a_table.mdwn
new file mode 100644
index 000000000..65dffd671
--- /dev/null
+++ b/doc/bugs/page_preview_does_not_work_on_new_page_with_a_table.mdwn
@@ -0,0 +1,3 @@
+If the table plugin is enabled, then creating a page, inserting a `\[[!table ...]]` and clicking preview yields "htmlization of not supported" (sic). --[[madduck]]
+
+[[fix0red|done]] --[[Joey]]
diff --git a/doc/bugs/pagecount_is_broken.mdwn b/doc/bugs/pagecount_is_broken.mdwn
new file mode 100644
index 000000000..57df6b75d
--- /dev/null
+++ b/doc/bugs/pagecount_is_broken.mdwn
@@ -0,0 +1,4 @@
+The [[plugins/pagecount]] plugin seems to be broken, as it claims there are
+\[[!pagecount ]] pages in this wiki. (if it's not 0, the bug is fixed)
+
+[[fixed|done]] --[[Joey]]
diff --git a/doc/bugs/pagemtime_in_refresh_mode.mdwn b/doc/bugs/pagemtime_in_refresh_mode.mdwn
new file mode 100644
index 000000000..f926ec86c
--- /dev/null
+++ b/doc/bugs/pagemtime_in_refresh_mode.mdwn
@@ -0,0 +1,28 @@
+I'd like a way to always ask the RCS (Git) to update a file's mtime in
+refresh mode. This is currently only done on the first build, and later
+for `--gettime --rebuild`. But always rebuilding is too heavy-weight for
+this use-case. My options are to either manually set the mtime before
+refreshing, or to have ikiwiki do it on command. I used to do the
+former, but would now like the latter, as ikiwiki now generally does this
+timestamp handling.
+
+From a quick look, the code in `IkiWiki/Render.pm:find_new_files` is
+relevant: `if (! $pagemtime{$page}) { [...]`.
+
+How would you like to tackle this?
+
+--[[tschwinge]]
+
+> This could be done via a `needsbuild` hook. The hook is passed
+> the list of changed files, and it should be safe to call `rcs_getmtime`
+> and update the `pagemtime` for each.
+>
+> That lets the feature be done by a plugin, which seems good, since
+> `rcs_getmtime` varies between very slow and not very fast, depending on
+> VCS.
+>
+> AFAICS, the only use case for doing this is if you commit changes and
+> then delay pushing them to a DVCS repo. Since then the file mtime will
+> be when the change was pushed, not when it was committed. But I've
+> generally felt that recording when a change was published to the repo
+> of a wiki as its mtime is good enough. --[[Joey]]
diff --git a/doc/bugs/pages_missing_top-level_directory.mdwn b/doc/bugs/pages_missing_top-level_directory.mdwn
new file mode 100644
index 000000000..77c31cd27
--- /dev/null
+++ b/doc/bugs/pages_missing_top-level_directory.mdwn
@@ -0,0 +1,78 @@
+Hi,
+
+I've rebuilt two sites now, and anything that requires a working directory structure isn't working properly. I have no idea how it's doing this. I don't see anything in my templates, and I haven't messed around with the back-end code much.
+
+An example would show this best I think.
+
+<pre>
+/ <- root of site
+/About/ <- sub-directory
+ /Policy/ <- sub-sub-
+</pre>
+
+When you're on /About/, any generated links get mapped to /Policy/ and NOT /About/Policy/ - of course this results in a 404 error.
+
+I used to be able to use relative links or absolute ones to get the links I want, and now I can't do either. The generated link results in a 404 due to the stripping of a directory.
+
+I don't know if it's related to the fact that I have one ikiwiki install under another (/blog/ under / is also ikiwiki), but both are FUBAR.
+
+> what do you mean by generated links: do you mean the output of
+> [[ikiwiki/wikilink]]s? Or are you generating links some other way?
+> When you say "on /About/, any generated links get mapped to
+> /Policy/ and NOT /About/Policy" can you provide an example of what
+> source generates the link? -- [[Jon]]
+
+>> No, a \[[map]] call, such as:
+>>
+>> (actual code)<br />
+>> = = = = =<br />
+>> \[[!map pages="About/*" show="title"]]<br />
+>> = = = = =<br />
+>>
+>> The end result is:<br />
+>> (actual code)
+>>
+<pre>
+&lt;div class="map">
+&lt;ul>
+&lt;li>&lt;a class="mapitem" href="./Policy/">Policy&lt;/a>
+&lt;ul>
+&lt;li>&lt;a class="mapitem" href="./Policy/Microblog/">Microblogging subscription policy&lt;/a>
+&lt;/li>
+&lt;/ul>
+&lt;/li>
+&lt;/ul>
+&lt;/div>
+</pre>
+
+> I'm also confused about what is generating the links. The map directive?
+> You? --[[Joey]]
+
+>> see above :)
+
+>> I suspect this is due to git scanning everything under the pwd of the .git/ directory, but not totally so.
+
+>>> Ikiwiki never, ever, looks in directories with names starting with a
+>>> dot. --[[Joey]]
+
+>> Other ikiwiki sites I have don't do this, and work OK, on the same server, but different docroots.
+
+>>> Well, I've moved my blog to under my site's docroot - in terms of git
+>>> and ikiwiki - and it's still cutting out a whole directory level. I
+>>> have no idea what's going on. I need to check the code. The site is at
+>>> http://simonraven.kisikew.org/ - if you follow the "About" link, you'll
+>>> understand exactly what's going on, if you look at the URL in your
+>>> status bar (or under your cursor if you're using a text browser).
+
+>>>> Your page contains the following in its html:
+>>>> `<base href="../" />`
+>>>>
+>>>> Given a link like "./Policy/", which is *correct*, and when on the
+>>>> About page will normally link to the About/Policy page, this causes
+>>>> the link to really link to ".././Policy/" which is of course broken.
+>>>>
+>>>> Ikiwiki's standard page templates do not contain this base tag, so
+>>>> I guess your customised templates are broken. --[[Joey]] [[done]]
+
+>>>>> I totally forgot about that tag... good catch. I was thinking it was my template that was broken, since yesterday, but I couldn't see what. Thank you very much for your eyes.
+
diff --git a/doc/bugs/pages_under_templates_are_invalid.mdwn b/doc/bugs/pages_under_templates_are_invalid.mdwn
new file mode 100644
index 000000000..f7e115d48
--- /dev/null
+++ b/doc/bugs/pages_under_templates_are_invalid.mdwn
@@ -0,0 +1,16 @@
+Pages under templates/ are invalid (in fact, not merely invalid, but not
+even well-formed) XHTML pages.
+
+This problem is especially serious when you change the extension from .html
+to .xhtml in ikiwiki.setup and use Firefox, since Firefox displays an error
+message for application/xhtml+xml pages that are not well-formed.
+
+It seems that HTML::Template also supports `<!--Variable-->` syntax instead
+of `<Variable>`. Changing to this syntax will solve this problem, I guess.
+
+
+Even if changed to `<!-- TMPL_VAR -->` style, the problem may still exist if the template contains if else block.
+
+Maybe just encoding all &lt; and &gt; when compiling pages within the templates folder would solve this problem.
+
+> I never noticed this bug, since it only happens if the htmlscrubber is
+> disabled. --[[Joey]]
diff --git a/doc/bugs/pagespec:_tagged__40____41____44___globbing.mdwn b/doc/bugs/pagespec:_tagged__40____41____44___globbing.mdwn
new file mode 100644
index 000000000..f9cb37487
--- /dev/null
+++ b/doc/bugs/pagespec:_tagged__40____41____44___globbing.mdwn
@@ -0,0 +1,36 @@
+With the current HEAD (b10d353490197b576ef7bf2e8bf8016039efbd2d),
+globbing in `tagged()` pagespecs doesn't work for me. For example,
+`tagged(*)` doesn't match any pages. (It does in this wiki installation
+here, though.)
+
+I did not yet do any testing to figure out when this broke.
+
+--[[tschwinge]]
+
+[[!map pages="*/a* and tagged(*ose)"]]
+
+> Are you sure that `tagged()` ever matches pages there? Take globbing
+> out of the equation.
+>
+> This could be as simple as you having not rebuilt the wiki
+> on upgrade to the version that tracks tagged links. --[[Joey]]
+
+>> Yes, it is a globbing issue:
+
+>> \[[!map pages="tagged(open_i*ue_gdb)" show=title]]
+
+>> ... doesn't show anything.
+
+>> \[[!map pages="tagged(open_issue_gdb)" show=title]]
+
+>> ... does show a map of eight pages. Also, it's working fine on the
+>> autotags pages.
+
+>> --[[tschwinge]]
+
+>>> Only way I can reproduce something like this is if tagbase is not set.
+>>> I have fixed a bug there, see if it works for you?
+>>> --[[Joey]]
+
+>>>> This is now indeed [[fixed|done]] (thanks!) -- even though I already
+>>>> did have tagbase set.
diff --git a/doc/bugs/pagespec_can__39__t_match___123__curly__125___braces.mdwn b/doc/bugs/pagespec_can__39__t_match___123__curly__125___braces.mdwn
new file mode 100644
index 000000000..dee1e9891
--- /dev/null
+++ b/doc/bugs/pagespec_can__39__t_match___123__curly__125___braces.mdwn
@@ -0,0 +1,44 @@
+I want to match pages which actually have curly braces in their names (like this one), but this matches a lot of pages without braces in their names :( :
+
+[[!inline show="3" feeds="no" archive="yes" pages="*_{*}_*"]]
+
+(note: the inline above has been restricted to 3 matches to keep this page
+concise. Hopefully it is still clear that this page is not in the output set,
+and the 3 pages in the output set do not contain curly braces in their
+titles).
+
+When escaped, it doesn't work at all:
+
+[[!inline show="3" feeds="no" archive="yes" pages="*_\{*}_*"]]
+
+[[!inline show="3" feeds="no" archive="yes" pages="*_{*\}_*"]]
+
+More tests:
+
+"\*{\*":
+
+[[!inline show="3" feeds="no" archive="yes" pages="*{*"]]
+
+"\*\\{\*":
+
+[[!inline show="3" feeds="no" archive="yes" pages="*\{*"]]
+
+> This is due to the current handling of quoting and escaping issues
+> when converting a pagespec to perl code. `safequote` is used to
+> safely quote an input string as a `q{}` quote, and it strips
+> curlies when doing so to avoid one being used to break out of the `q{}`.
+>
+> Alternative ways to handle it would be:
+>
+> * Escape curlies. But then you have to deal with backslashes
+> in the user's input as they could try to defeat your escaping.
+> Gets tricky.
+>
+> * Avoid exposing user input to interpolation as a string. One
+> way that comes to mind is to have a local string lookup hash,
+> and insert each user specified string into it, then use the hash
+> to lookup the specified strings at runtime. [[done]]
+>
+> --[[Joey]]
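+The lookup-hash idea can be sketched in Python (names are hypothetical;
+ikiwiki's actual implementation is in Perl): user strings never appear in
+the generated code, only indexes into a table, so no escaping is needed.

```python
# Collect user-specified strings in a table; the generated matcher code
# refers to them by index and looks them up at run time.
params = []

def add_param(s):
    params.append(s)
    return f"match_glob(page, params[{len(params) - 1}])"

code = add_param("*_{*}_*")   # input kept verbatim, curly braces and all
print(code)                   # match_glob(page, params[0])
print(params[0])              # *_{*}_*
```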
+
+Thank you! I'll try it. --Ivan Z.
diff --git a/doc/bugs/pagespec_error_on_refresh_but_not_rebuild.mdwn b/doc/bugs/pagespec_error_on_refresh_but_not_rebuild.mdwn
new file mode 100644
index 000000000..df941af37
--- /dev/null
+++ b/doc/bugs/pagespec_error_on_refresh_but_not_rebuild.mdwn
@@ -0,0 +1,32 @@
+I'm getting this error message when I refresh my wiki:
+
+ $ hg commit -u me -m "Minor corrections"
+ refreshing wiki..
+ scanning htmletc/moco-conf-rooms.mdwn
+ building htmletc/moco-conf-rooms.mdwn
+ Use of uninitialized value in concatenation (.) or string at /usr/local/lib/perl5/site_perl/5.8.9/Text/Typography.pm line 542.
+ building sidebar.mdwn, which depends on htmletc/moco-conf-rooms
+ building contact.mdwn, which depends on sidebar
+ building 500.mdwn, which depends on sidebar
+ Use of uninitialized value in concatenation (.) or string at /usr/local/lib/perl5/site_perl/5.8.9/Text/Typography.pm line 542.
+ building ceramics.mdwn, which depends on sidebar
+ building glossary.mdwn, which depends on sidebar
+ syntax error in pagespec "internal(glossary/comment_*)"
+ warning: post-commit hook exited with status 2
+
+But there is no error if I use `ikiwiki --rebuild` to regenerate the whole thing.
+
+> You neglect to say what version of ikiwiki this is,
+> or give any information to reproduce the bug.
+>
+> My guess: A version older than 3.20100403, which included
+> 799b93d258bad917262ac160df74136f05d4a451,
+> which could lead to incorrect "syntax error in pagespec"
+> that only happened some of the time.
+>
+> (The Text::Typography warning seems probably unrelated.)
+> --[[Joey]]
+
+>> I'm sorry, I don't know what I was thinking there. It's ikiwiki 3.20100212, and manually applying the patch you linked to made the bug go away. (Upgrading ikiwiki is a pain on nearlyfreespeech, especially if you don't want to keep the build directory around -- please consider making ikiwiki runnable directly from a git clone.)
+
+[[!meta link="done"]]
diff --git a/doc/bugs/pagespec_parsing_chokes_on_function__40____41__.mdwn b/doc/bugs/pagespec_parsing_chokes_on_function__40____41__.mdwn
new file mode 100644
index 000000000..78fed0e5d
--- /dev/null
+++ b/doc/bugs/pagespec_parsing_chokes_on_function__40____41__.mdwn
@@ -0,0 +1,64 @@
+The pagespec regexes don't allow functions with no arguments.
+
+IkiWiki.pm, around line 1035:
+
+<pre>
+$spec=~m{
+ \s* # ignore whitespace
+ ( # 1: match a single word
+ \! # !
+ |
+ \( # (
+ |
+ \) # )
+ |
+ \w+\([^\)]+\) # command(params)
+ |
+ [^\s()]+ # any other text
+ )
+ \s* # ignore whitespace
+ }igx
+</pre>
+
+command(params) of course might be just command(). (See
+conditional.pm: match_included.) Trying to feed
+ikiwiki a pagespec without params will get you instead:
+
+    IkiWiki::PageSpec::match_glob($page, q{function}, @params) ( )

+
+Which is completely undesired. The second `+` on that line should be a `*`.
+
+None of the builtin pagespecs "work" with no parameters, so it's hard to
+write a unit test for this. But can we at least write a helpful note in
+case the user is given to rebuilding the wiki by hand? --Ethan
+
+<pre>
+--- ikiwiki/IkiWiki.pm 2007-07-26 15:15:22.716860000 -0700
++++ ikidev/IkiWiki.pm 2007-07-26 21:34:45.542248000 -0700
+@@ -1032,7 +1032,7 @@
+ |
+ \) # )
+ |
+- \w+\([^\)]+\) # command(params)
++ \w+\([^\)]*\) # command(params)
+ |
+ [^\s()]+ # any other text
+ )
+@@ -1075,6 +1075,10 @@
+ }
+
+ my $ret=eval pagespec_translate($spec);
++ if ($@){
++ my $t = pagespec_translate($spec);
++ print "evaluating pagespec failed: $t $@\n";
++ }
+ return IkiWiki::FailReason->new("syntax error") if $@;
+ return $ret;
+ }
+</pre>
+
+> Thanks, [[done]] --[[Joey]]
+>
+> Note that the printing of the error isn't needed though. pagespec_match()
+> returns an IkiWiki::FailReason object if parsing fails, and its caller
+> can use that as desired to print the error.
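The difference between the `+` and `*` quantifiers here is easy to demonstrate; below is a quick sketch using Python's regex engine (the original pattern is Perl, but the behavior is the same):

```python
import re

# Original: requires at least one character between the parens, so a
# zero-argument call such as conditional.pm's included() falls
# through to the "any other text" branch and gets mis-tokenized.
broken = re.compile(r'\w+\([^)]+\)')
fixed = re.compile(r'\w+\([^)]*\)')  # '*' also accepts empty parens

assert broken.fullmatch("included()") is None
assert fixed.fullmatch("included()") is not None
assert fixed.fullmatch("glob(blog/*)") is not None
```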
diff --git a/doc/bugs/pagestats_plugin_broken.mdwn b/doc/bugs/pagestats_plugin_broken.mdwn
new file mode 100644
index 000000000..0ae74b4ee
--- /dev/null
+++ b/doc/bugs/pagestats_plugin_broken.mdwn
@@ -0,0 +1,29 @@
+Since at least version 2.0 (and certainly a few versions before), it seems that the pagestats plugin is broken: each matched page has a count of 2.
+This also (of course) produces a flat tag cloud.
+
+My perl knowledge is very limited, but the call :
+
+ my @bl = IkiWiki::backlinks($page);
+ $counts{$page} = scalar(@bl);
+
+always returns 2, which seems to me "obvious", because the backlinks() function returns two arrays of links...
+
+The patch is:
+
+ --- /usr/share/perl5/IkiWiki/Plugin/pagestats.pm 2007-04-27 04:33:43.000000000 +0200
+ +++ ./pagestats.pm 2007-05-12 16:47:14.000000000 +0200
+ @@ -36,7 +36,7 @@
+ if (pagespec_match($page, $params{pages}, location => $params{page})) {
+ use IkiWiki::Render;
+ my @bl = IkiWiki::backlinks($page);
+ - $counts{$page} = scalar(@bl);
+ + $counts{$page} = scalar(@{$bl[0]})+scalar(@{$bl[1]});
+ $max = $counts{$page} if $counts{$page} > $max;
+ }
+ }
+
+
+
+--[[users/hb]]
+
+thanks, [[done]] --[[Joey]]
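The underlying mistake is counting the container of two lists rather than the lists themselves; here is a minimal Python model of it (the real code is Perl, and this `backlinks` is a toy stand-in, not ikiwiki's):

```python
def backlinks(page, links):
    # Toy stand-in: returns two lists, like IkiWiki::backlinks --
    # the links shown inline, and the overflow shown elsewhere.
    linking = sorted(p for p, targets in links.items() if page in targets)
    numshown = 2
    return linking[:numshown], linking[numshown:]

links = {"a": ["idx"], "b": ["idx"], "c": ["idx"], "d": ["other"]}
bl = backlinks("idx", links)

wrong = len(bl)                   # always 2: counts the two lists
right = len(bl[0]) + len(bl[1])   # counts the actual backlinks
```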
diff --git a/doc/bugs/pagetitle_function_does_not_respect_meta_titles.mdwn b/doc/bugs/pagetitle_function_does_not_respect_meta_titles.mdwn
new file mode 100644
index 000000000..15d28f989
--- /dev/null
+++ b/doc/bugs/pagetitle_function_does_not_respect_meta_titles.mdwn
@@ -0,0 +1,289 @@
+[[!tag patch plugins/inline patch/core]]
+
+The `IkiWiki::pagetitle` function does not respect title changes via `meta.title`. It really should, so that links rendered with `htmllink` get the proper title in the link text.
+
+--[[madduck]]
+
+----
+
+It is possible to set a page title with the meta plugin, but that title isn't
+reused in parentlinks. This patch may fix it.
+
+<ul>
+<li>I give pagetitle the full path to a page.</li>
+<li>I redefine the 'pagetitle' sub to deal with it.</li>
+<li>To maintain compatibility for IkiWikis without the meta plugin, I added a 'basename' to the original pagetitle.</li>
+</ul>
+
+<pre>
+diff -c /usr/share/perl5/IkiWiki/Render.pm.distrib /usr/share/perl5/IkiWiki/Render.pm
+*** /usr/share/perl5/IkiWiki/Render.pm.distrib Wed Aug 6 07:34:55 2008
+--- /usr/share/perl5/IkiWiki/Render.pm Tue Aug 26 23:29:32 2008
+***************
+*** 102,108 ****
+ $template->param(
+ title => $page eq 'index'
+ ? $config{wikiname}
+! : pagetitle(basename($page)),
+ wikiname => $config{wikiname},
+ content => $content,
+ backlinks => $backlinks,
+--- 102,108 ----
+ $template->param(
+ title => $page eq 'index'
+ ? $config{wikiname}
+! : pagetitle($page),
+ wikiname => $config{wikiname},
+ content => $content,
+ backlinks => $backlinks,
+
+diff -c /usr/share/perl5/IkiWiki/Plugin/parentlinks.pm.distrib /usr/share/perl5/IkiWiki/Plugin/parentlinks.pm
+*** /usr/share/perl5/IkiWiki/Plugin/parentlinks.pm.distrib Wed Aug 6 07:34:55 2008
+--- /usr/share/perl5/IkiWiki/Plugin/parentlinks.pm Tue Aug 26 23:19:43 2008
+***************
+*** 44,50 ****
+ "height_$height" => 1,
+ };
+ $path.="/".$dir;
+! $title=IkiWiki::pagetitle($dir);
+ $i++;
+ }
+ return @ret;
+--- 44,50 ----
+ "height_$height" => 1,
+ };
+ $path.="/".$dir;
+! $title=IkiWiki::pagetitle($path);
+ $i++;
+ }
+ return @ret;
+
+diff -c /usr/share/perl5/IkiWiki.pm.distrib /usr/share/perl5/IkiWiki.pm
+*** /usr/share/perl5/IkiWiki.pm.distrib Wed Aug 6 07:48:34 2008
+--- /usr/share/perl5/IkiWiki.pm Tue Aug 26 23:47:30 2008
+***************
+*** 792,797 ****
+--- 792,799 ----
+ my $page=shift;
+ my $unescaped=shift;
+
++ $page=basename($page);
++
+ if ($unescaped) {
+ $page=~s/(__(\d+)__|_)/$1 eq '_' ? ' ' : chr($2)/eg;
+ }
+
+diff -c /usr/share/perl5/IkiWiki/Plugin/meta.pm.distrib /usr/share/perl5/IkiWiki/Plugin/meta.pm
+*** /usr/share/perl5/IkiWiki/Plugin/meta.pm.distrib Wed Aug 6 07:34:55 2008
+--- /usr/share/perl5/IkiWiki/Plugin/meta.pm Tue Aug 26 23:30:58 2008
+***************
+*** 3,8 ****
+--- 3,9 ----
+ package IkiWiki::Plugin::meta;
+
+ use warnings;
++ no warnings 'redefine';
+ use strict;
+ use IkiWiki 2.00;
+
+***************
+*** 289,294 ****
+--- 290,319 ----
+ }
+ }
+
++ sub IkiWiki::pagetitle ($;$) {
++ my $page=shift;
++ my $unescaped=shift;
++
++ if ($page =~ m#/#) {
++ $page =~ s#^/##;
++ $page =~ s#/index$##;
++ if ($pagestate{"$page/index"}{meta}{title}) {
++ $page = $pagestate{"$page/index"}{meta}{title};
++ } else {
++ $page = IkiWiki::basename($page);
++ }
++ }
++
++ if ($unescaped) {
++ $page=~s/(__(\d+)__|_)/$1 eq '_' ? ' ' : chr($2)/eg;
++ }
++ else {
++ $page=~s/(__(\d+)__|_)/$1 eq '_' ? ' ' : "&#$2;"/eg;
++ }
++
++ return $page;
++ }
++
+ package IkiWiki::PageSpec;
+
+ sub match_title ($$;@) {
+
+</pre>
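The `__NN__`/`_` escaping that this substitution reverses can be modeled in Python (a sketch of the Perl substitution above, not ikiwiki code):

```python
import re

def pagetitle(page, unescaped=False):
    # Page names escape troublesome characters as __NN__ (decimal
    # codepoint) and spaces as '_'; this reverses that, either to the
    # raw character or to an HTML entity, mirroring the Perl above.
    def repl(m):
        if m.group(1) == "_":
            return " "
        return chr(int(m.group(2))) if unescaped else f"&#{m.group(2)};"
    return re.sub(r"(__(\d+)__|_)", repl, page)

assert pagetitle("foo_bar", unescaped=True) == "foo bar"
assert pagetitle("a__47__b", unescaped=True) == "a/b"
assert pagetitle("a__47__b") == "a&#47;b"
```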
+
+----
+
+> A few quick notes about it:
+
+> - Using <code>inline</code> would avoid the redefinition + code duplication.
+> - A few plugins would need to be upgraded.
+> - It may be necessary to adapt the testsuite in `t/pagetitle.t`, as well.
+>
+> --[[intrigeri]]
+>
+>> It was actually more complicated than expected. A working prototype is
+>> now in my `meta` branch, see my userpage for the up-to-date url.
+>> Thus tagging patch. --[[intrigeri]]
+>>
+>>> Joey, please consider merging my `meta` branch. --[[intrigeri]]
+
+So, looking at your meta branch: --[[Joey]]
+
+* Inter-page dependencies. If page A links to page B, and page B currently
+ has no title, then A will display the link as "B". Now page B is modified
+ and a title is added. Nothing updates "A".
+ The added overhead of rebuilding every page that links to B when B is
+ changed (as the `indexhtml` hook of the po plugin does) is IMHO a killer.
+ That could be hundreds or thousands of pages, making interactive editing
+ way slow. This is probably the main reason I had not attempted this whole
+  thing myself. IMHO this calls for some kind of intelligent dependency
+ handler that can detect when B's title has changed and only rebuild pages
+ that link to B in that case.
+* Looks like some plugins that use `pagetitle` to format it for display
+ were not changed to use `nicepagetitle` (for example, rename).
+ But most of those callers intend to display the page name
+ as a title, but including the parent directories in the path. (Ie,
+ "renaming foo/page title to bar/page title" --
+ you want to know it's moved from foo to bar.) `nicepagetitle` does not
+ allow doing that since it always takes the `basename`.
+* I don't like the name `nicepagetitle`. It's not very descriptive, is it?
+ And it seems very confusing to choose whether to use the "nice" or original
+ version. My hope is that adding a second function is unnecessary.
+ As I understand it, you added a new function for two reasons:
+ 1) It needs the full page name, not basename.
+  2) `titlepage(pagetitle($page))` reversibility.
+
+  1) If you look at all the callers
+     of `pagetitle`, most of them pass a complete page name, not just the
+     basename. In most cases `pagetitle` is used to display the full name
+     of the page, including any subdirectory it's in. So why not just make
+     it consistently be given the full name of the page, with another argument
+     specifying if we want to get back just the base name.
+
+  2) I can't find any code that actually uses the reversibility like that.
+ The value passed to `titlepage` always comes from some external
+ source. Unless I missed one.
+* The use of `File::Spec->rel2abs` is a bit scary.
+* Does it really make sense to call `pagetitle` on the meta title
+ in meta's `nicepagetitle`? What if the meta title is something like
+ "foo_bar" -- that would be changed to "foo bar".
+* parentlinks is changed to use `nicepagetitle(bestlink($page, $path))`.
+ Won't `bestlink` return "" if the parent page in question does not exist?
+* `backlinks()` is changed to add an additional `title` field
+ to the hash returned, but AFAICS this is not used in the template.
+* Shouldn't `Render.pm` use nicepagetitle when getting the title for the
+ page template? Then meta would not need to override the title in the
+ `pagetemplate` hook. (Although this would eliminate handling of
+ `title_overridden` -- but that is little used and would not catch
+ all the other ways titles can be overridden with this patch anyway.)
+
+> I'm not a reviewer or anything, but can I chime in on changes to pagetitle?
+> I don't think having meta-titles in wikilinks and the parentlinks path by
+> default is necessarily a good thing. I don't consider the meta-title of a page
+> as used in `<title>` to be the same thing as the short title you
+> want in those contexts - IMO, the meta-title is the "formal" title of the page,
+> enough to identify it with no other context, and frequently too long to be used
+> as a link title or a parentlink, whereas the parentlinks title in particular
+> should be some abbreviated form that's enough to identify it in context.
+> [tbm](http://www.cyrius.com/) expressed a similar opinion when I was discussing
+> ikiwiki with him at the weekend.
+>
+> It's a matter of taste whether wikilinks are "like a parentlink" or "like a
+> `<title>`"; I could be persuaded either way on that one.
+>
+> An example from my site: [this page](http://www.pseudorandom.co.uk/2004/debian/ipsec/)
+> is the parent of [this page](http://www.pseudorandom.co.uk/2004/debian/ipsec/wifi/)
+> with a title too long to use in the latter's parentlinks; I think the titles of
+> both those pages are too long to use as wikilink text too. Similarly, tbm's page
+> about [Debian on Orion devices from Buffalo](http://www.cyrius.com/debian/orion/buffalo/)
+> can simply be called "Buffalo" in context.
+>
+> Having a `\[[!meta abbrev="..."]]` that took precedence over title
+> in parentlinks and possibly wikilinks might be a good way to fix this? Or if your
+> preference goes the other way, perhaps a `\[[!meta longtitle=""]]` could take
+> precedence when generating the `<title>` and the title that comes after the
+> parentlinks. --[[smcv]]
+
+>> I think you've convinced me. (I had always had some doubt in my mind as
+>> to whether using titles in all these other places would make sense.)
+>>
+>> Instead of meta abbrev, you could have a meta pagename that
+>> overrides the page name displayed everywhere (in turn overridden by
+>> meta title iff the page's title is being displayed). But is this complexity
+>> needed? We have meta redir, so if you want to change the name of a page,
+>> you can just rename it, and put in a stub redirection page so links
+>> still work.
+>>
+>> This leaves the [[plugins/contrib/po]] plugin, which really does need
+>> a way to change the displayed page name everywhere, and at least a
+>> subset of the changes in the meta branch are needed to support that.
+>>
+>> (This would also get around my concern about inter-page dependency
+>> handling, since po contains a workaround for that, and it's probably
+>> acceptable to use potentially slow methods to handle this case.)
+>> --[[Joey]]
+
+>>> I'm glad to implement whatever decision we'll make, but I don't
+>>> clearly understand what this discussion's conclusion is. It seems
+>>> like we agree at least on one point: meta page titles shall not be
+>>> displayed all over the place by default; I have therefore disabled
+>>> `meta_overrides_page_title` by default in my `meta` branch.
+>>>
+>>> My next question is then: do we only want to satisfy the `po`
+>>> plugin needs? Or do we want to allow people who want this, such as
+>>> [[madduck]], to turn on a config switch so that meta page titles
+>>> are displayed as wikilinks titles? In the latter case, what level
+>>> of configurability do we want? I can think of a quite inelegant
+>>> way to implement full configurability, and provide a configuration
+>>> switch for every place where links are displayed, such as
+>>> wikilinks, parentlinks, etc., but I don't think the added bonus is
+>>> worth the complexity of it.
+>>>
+>>> I think we can roughly split the needs into three categories:
+>>>
+>>> 1. never display any modified page title in links; this is the
+>>> current behaviour, and we should keep it as the default one
+>>> 2. display modified page titles only at well chosen places; that
+>>> could be "manual" wikilinks, I mean those generated by the
+>>> `link`, `camelcase` & al. plugins, the recentchanges page, and
+>>> maybe a few other places; keep the usual pagename-based title
+>>> for every other link, such as the parentlinks ones.
+>>> The inter-page dependency problem remains, though. As a first
+>>> step, I'm in favour of the "slow, but correct" implementation,
+>>> with a big warning stating that enabling this option can make
+>>> a wiki really sluggish; if someone really wants this to work
+>>> fast, he/she'll implement a clever dependency handler :)
+>>> 3. display modified page titles all over the place; IMHO, we
+>>> should implement only the bits needed so that the `po` plugin
+>>> can set this up, rather than provide this as
+>>> a user-configurable option.
+>>>
+>>> So my question is: do we want to implement the #2 case, or not?
+>>> I propose myself to only implement #1 and #3 to start with, but do
+>>> it in a way that leaves room for #2.
+>>>
+>>> --[[intrigeri]]
+>>>
+>>>> I agree, we should concentrate on getting just enough functionality
+>>>> for the po plugin, because I want to merge the po plugin soon.
+>>>> If #2 gets tackled later, we will certainly have all kinds of fun,
+>>>> no matter what is done for the po plugin. --[[Joey]]
+
+>>>>> For the record: I've gotten used to the lack of this feature,
+>>>>> and it now seems much less important to me than it was when
+>>>>> initially developing the po plugin. So, I'm hereby officially
+>>>>> removing this from my plate. If anyone else wants to start from
+>>>>> scratch, or from my initial work, I'm happy to review the
+>>>>> po-related part of things -- just drop me an email in this
+>>>>> case. --[[intrigeri]]
diff --git a/doc/bugs/parsing_for_WikiWords_should_only_be_done_outside_html_tags.mdwn b/doc/bugs/parsing_for_WikiWords_should_only_be_done_outside_html_tags.mdwn
new file mode 100644
index 000000000..44938c754
--- /dev/null
+++ b/doc/bugs/parsing_for_WikiWords_should_only_be_done_outside_html_tags.mdwn
@@ -0,0 +1,17 @@
+When a link such as http://www.chumba.com/media/Chumbawamba-EnoughIsEnough.mp3 appears inside an anchor tag, ikiwiki seems to parse EnoughIsEnough as a WikiWord and breaks the link. As a general rule I would suggest that there should never be any WikiWord parsing inside tags; this is just asking for problems.
+
+You can see an example of the breakage on <http://wiki.debian-community.org/planets/de/> - scroll down to the Chumbawamba entry.
+
+>> There's a great workaround for this bug: Disable the
+>> [[plugins/camelcase]] plugin. :-) I really don't recommend using that
+>> plugin. _Especially_ not when aggregating third-party content as you do
+>> in the example.
+>>
+>> Fixing this at the html parsing level would involve making ikiwiki 2
+>> times slower, not even counting the html parsing overhead, since it
+>> would have to fully render pages in the "scan" pass.
+>>
+>> All I can do is improve the regexp it uses to try to avoid false
+>> positives. Which I've now [[done]].
+>>
+>> --[[Joey]]
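The kind of regexp refinement described can be sketched as follows (a simplified Python model; the actual camelcase regexp in ikiwiki is different):

```python
import re

# Naive WikiWord matcher: happily fires inside URLs.
naive = re.compile(r'\b(?:[A-Z][a-z]+){2,}\b')

# Refined: refuse matches immediately preceded by characters that
# suggest we are inside a URL or a tag attribute.
refined = re.compile(r'(?<![/\-."=\w])(?:[A-Z][a-z]+){2,}\b')

text = ('see http://www.chumba.com/media/Chumbawamba-EnoughIsEnough.mp3 '
        'and a real WikiWord')
assert naive.findall(text) == ['EnoughIsEnough', 'WikiWord']
assert refined.findall(text) == ['WikiWord']
```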
diff --git a/doc/bugs/password_deletion.mdwn b/doc/bugs/password_deletion.mdwn
new file mode 100644
index 000000000..ff2cd2c61
--- /dev/null
+++ b/doc/bugs/password_deletion.mdwn
@@ -0,0 +1,7 @@
+I have just deleted my password, accidentally (which is not a crisis, but it shouldn't really happen).
+
+I logged in to tweak my page subscriptions, did so, and clicked 'save preferences' - unfortunately, the password boxes are cleared when you arrive at the preferences page and if you don't fill them in again then the new password (which is blank) gets saved. I'm sure I'm not the first one to notice this - I'm just writing here because I've not yet found anywhere where this inconvenience is documented.
+
+-- [[KarlMW]]
+
+[[fixed|done]] --[[Joey]]
diff --git a/doc/bugs/perl:_double_free_or_corruption.mdwn b/doc/bugs/perl:_double_free_or_corruption.mdwn
new file mode 100644
index 000000000..8499b6388
--- /dev/null
+++ b/doc/bugs/perl:_double_free_or_corruption.mdwn
@@ -0,0 +1,14 @@
+If your perl is dumping core, that's a perl bug (or a libc bug or the like) by
+definition, not an ikiwiki bug. Ikiwiki is pure perl code; pure perl code
+can't cause perl to dump core unless it tickles a perl bug.
+
+Calling this [[done]] since this is not the right forum. You'll need to
+figure out what's wrong with your perl, I'm afraid. --[[Joey]]
+
+<pre>
+rendering todo/more_class__61____34____34___for_css.mdwn
+rendering todo/Support_subdirectory_of_a_git_repo.mdwn
+rendering todo/link_map.mdwn
+rendering todo/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn
+*** glibc detected *** perl: double free or corruption (!prev): 0x00000000018a7bd0 ***
+</pre>
diff --git a/doc/bugs/pipe-symbol_in_taglink_target.mdwn b/doc/bugs/pipe-symbol_in_taglink_target.mdwn
new file mode 100644
index 000000000..e467959be
--- /dev/null
+++ b/doc/bugs/pipe-symbol_in_taglink_target.mdwn
@@ -0,0 +1,25 @@
+[[!tag bugs wishlist]]
+
+Escaping the pipe symbol in [[taglink|ikiwiki/directive/taglink]] targets doesn't work as I wanted:
+
+[[!taglink smth_with_a_pipe|about_the_\|-symbol]]
+[[!taglink smth_with_a_pipe|about_the_|-symbol]]
+
+as opposed to simple wikilinks:
+
+[[a link to smth with a pipe|about the \|-symbol]]
+[[a link to smth with a pipe|about the |-symbol]]
+
+And it seems to work in pagespecs:
+
+tagged:
+
+[[!map pages="tagged(about the |-symbol)"]]
+
+[[!map pages="tagged(about the \|-symbol)"]]
+
+link:
+
+[[!map pages="link(about the |-symbol)"]]
+
+[[!map pages="link(about the \|-symbol)"]]
diff --git a/doc/bugs/pipe_in_tables_as_characters.mdwn b/doc/bugs/pipe_in_tables_as_characters.mdwn
new file mode 100644
index 000000000..12d5e1597
--- /dev/null
+++ b/doc/bugs/pipe_in_tables_as_characters.mdwn
@@ -0,0 +1,16 @@
+How do I put a '|' character in a field? I tried escaping it but it does not work.
+It seems tables are disabled here?
+
+> Explicitly specify format=csv, then you can use pipes as values and even
+> use quotes to unambiguously include commas in values. --[[Joey]]
+
+>> Great! thanks.
+
+>>> Guess I can mark this [[done]] --[[Joey]]
+
+See this example:
+
+[[!table class=table1 data="""
+aaaaaaaaaaaaaaa|b|c
+--\|\|--|e|f
+"""]]
diff --git a/doc/bugs/plugin___96__rename__96___fails_if___96__attachment__96___is_not_enabled.mdwn b/doc/bugs/plugin___96__rename__96___fails_if___96__attachment__96___is_not_enabled.mdwn
new file mode 100644
index 000000000..6bc8bb815
--- /dev/null
+++ b/doc/bugs/plugin___96__rename__96___fails_if___96__attachment__96___is_not_enabled.mdwn
@@ -0,0 +1,7 @@
+ikiwiki 3.20110712: trying to rename a page through the web interface without the `attachment` plugin enabled renders:
+
+ Error: Undefined subroutine &IkiWiki::Plugin::attachment::attachment_holding_location called at /usr/share/perl5/IkiWiki/Plugin/rename.pm line 326.
+
+Enabling `attachment` makes it work. Checking whether `attachment` is enabled before running that code path would solve it. Not sure of the best way to check it. --[[Daniel Andersson]]
+
+> [[fixed|done]] --[[Joey]]
diff --git a/doc/bugs/plugins__47__relativedate_depends_on_locale_at_setup_file.mdwn b/doc/bugs/plugins__47__relativedate_depends_on_locale_at_setup_file.mdwn
new file mode 100644
index 000000000..a9a39ac47
--- /dev/null
+++ b/doc/bugs/plugins__47__relativedate_depends_on_locale_at_setup_file.mdwn
@@ -0,0 +1,16 @@
+[[plugins/relativedate]] does not work when the Russian locale is defined in the setup file (locale => 'ru_RU.UTF-8'). This happens because the javascript for this plugin takes either the element's title or the content itself. If the Russian locale is turned on, the title is generated in Russian and JS can't convert it into a Date object. The innerHTML is always language-independent (YYYY-MM-DD HH:mm).
+
+If I switch the locale to en_US.UTF-8 then this plugin correctly parses the text date and prints a relative date. But when I mouse over a date I see unusual formatting (it uses the AM/PM format while Russians use 24-h notation).
+
+P.S. All pages but RecentChanges show a well-formatted date. RecentChanges shows the date formatted using the locale. Anyway, the plugin does not work without the en_US locale.
+
+> [[Fixed|done]]. Now it uses C locale for the date put in the title,
+> that is used by relativedate. The mouseover will display the date in your
+> native locale.
+>
+> Only exception is that when javascript is disabled... then
+> relativedate can't work, so instead you will see your localized date
+> displayed; but on mouseover you will get shown the C locale date.
+> --[[Joey]]
+
+>> Thanks.
diff --git a/doc/bugs/po:__apache_config_serves_index_directory_for_index.mdwn b/doc/bugs/po:__apache_config_serves_index_directory_for_index.mdwn
new file mode 100644
index 000000000..fd7cd518c
--- /dev/null
+++ b/doc/bugs/po:__apache_config_serves_index_directory_for_index.mdwn
@@ -0,0 +1,85 @@
+Similarly to [[po:_apache_config_serves_index.rss_for_index]],
+the [[plugins/po]] apache config has another bug.
+
+The use of "DirectoryIndex index", when combined with multiviews, is intended
+to serve up a localized version of the index.??.html file.
+
+But, if the site's toplevel index page has a discussion page, that
+is "/index/discussion/index.html". Or, if the img plugin is used to scale
+an image on the index page, that will be "/index/foo.jpg". In either case,
+the "index" directory exists, and so apache happily displays that
+directory, rather than the site's index page!
+
+--[[Joey]]
+
+> Ack, we do have a problem. Seems like ikiwiki's use of `index/` as
+> the directory for homepage's sub-pages and attachments makes it
+> conflict deeply with Apache's `MultiViews`: as the [MultiViews
+> documentation](http://httpd.apache.org/docs/2.2/mod/mod_negotiation.html#multiviews)
+> says, `index.*` are considered as possible matches only if the
+> `index/` directory *does not exist*. Neither type maps nor
+> `mod_mime` config parameters seem to allow overriding this behavior.
+> Worse even, I guess any page called `index` would have the same
+> issues, not only the wiki homepage.
+
+> I can think of two workarounds, both kinda stink:
+>
+> 1. Have the homepage's `targetpage` be something else than
+> `index.html`.
+> 2. Have the directory for the homepage's sub-pages and attachments
+> be something else than `index`.
+>
+> I doubt either of those can be implemented without ugly special
+> casing. Any other idea? --[[intrigeri]]
+
+>> As I understand it, this is how you'd do it with type maps:
+>>
+>> * turn off MultiViews
+>> * `AddHandler type-map .var`
+>> * `DirectoryIndex index.var`
+>> * make `index.var` a typemap (text file) pointing to `index.en.html`,
+>> `index.fr.html`, etc.
+>>
+>> I'm not sure how well that fits into IkiWiki's structure, though;
+>> perhaps the master language could be responsible for generating the
+>> type-map on behalf of all slave languages, or something?
+>>
+>> Another possibility would be to use filenames like `index.html.en`
+>> and `index.html.fr`, and set `DirectoryIndex index.html`? This could
+>> get problematic for languages whose ISO codes conventionally mean
+>> something else as extensions (Polish, `.pl`, is the usual example,
+>> since many sites interpret `.pl` as "this is a (Perl) CGI").
+>> --[[smcv]]
+
+>>> There is something to be said about "index/foo" being really ugly
+>>> and perhaps it would be nice to use something else. There does not
+>>> appear to even be one function that could be changed; "$page/foo" is
+>>> hardwired into ikiwiki in many places as a place to dump subsidiary
+>>> content -- and it's not even consistent, since there is also eg,
+>>> "$page.rss". I agree, approaching it from this direction would be a
+>>> mess or a lot of work.
+>>>
+>>> Type maps seem like a valid option, but also a lot of clutter.
+>>>
+>>> `index.html.pl` does seem to be asking for trouble, even if apache
+>>> can be configured to DTRT. It would make serving actual up perl scripts
+>>> hard, at least. But that is some good out of the box thinking..
+>>> perhaps "index.foo.pl.html"?
+>>>
+>>> However, that would mean that
+>>> web servers need to be configured differently to serve translated
+>>> and non-translated sites. The current apache configuration for po
+>>> can be used with non-po sites and they still work. --[[Joey]]
+
+>>>> I am vulnerable to the same problem because I use MultiViews, though I don't use the `po` module;
+>>>> I have to serve both Australian English and American English for my company's website
+>>>> (for SEO purposes; certain words that relate to our products are spelt differently in US and Australian English, and we need to be able to be googled with both spellings).
+>>>> I'm just fortunate that nobody has thought to add attachments to the front page yet.
+>>>> I raise this to point out that this is going to be a recurring problem that won't necessarily be fixed by changing the `po` module in isolation.
+>>>>
+>>>> One could argue that "index" is already a special case, since it is the top page of the site.
+>>>> Things like parentlinks already use a special case for the top page (checking the variable HAS_PARENTLINKS).
+>>>> Likewise, when --usedirs is true, index is treated as a special case, since it generates "index.html" and not "index/index.html".
+>>>>
+>>>> Unfortunately, I'm not sure what the best approach to solving this would be.
+>>>> --[[KathrynAndersen]]
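+
+----
+
+For reference, the type-map approach sketched above would produce an
+`index.var` something like this (illustrative only; not actual ikiwiki
+output):
+
+    # index.var -- with "AddHandler type-map .var" and
+    # "DirectoryIndex index.var", Apache picks the variant whose
+    # Content-language best matches the client's Accept-Language.
+    URI: index
+
+    URI: index.en.html
+    Content-type: text/html
+    Content-language: en
+
+    URI: index.fr.html
+    Content-type: text/html
+    Content-language: fr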
diff --git a/doc/bugs/po:_apache_config_serves_index.rss_for_index.mdwn b/doc/bugs/po:_apache_config_serves_index.rss_for_index.mdwn
new file mode 100644
index 000000000..a2b68c4b1
--- /dev/null
+++ b/doc/bugs/po:_apache_config_serves_index.rss_for_index.mdwn
@@ -0,0 +1,36 @@
+The apache config documented in [[plugins/po]] has a subtle bug. It works
+until a site gets an index.atom or index.rss file. (Actually, with po
+enabled, they're called index.en.atom or index.en.rss etc, but the result
+is the same).
+
+Then, when wget, curl, or w3m is pointed at http://site/, apache serves
+up the rss/atom file rather than the index page.
+
+Analysis:
+
+* /etc/mime.types gives mime types to .rss and .atom files
+* `mod_negotiation`'s MultiViews allows any file with a mime type to be
+ served up via content negotiation, if the client requests that type.
+* wget etc send `Accept: */*` to accept all content types. Compare
+ with firefox, which sends `Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*`
+* So apache has a tie between an HTML-encoded English file and an RSS-encoded
+  English file, and the client has no preference. In a tie, apache will serve up the
+ *smallest* file, which tends to be the rss file. (Apache's docs say it uses that
+ strange criteria to break ties; see <http://httpd.apache.org/docs/2.0/mod/mod_mime.html#multiviewsmatch>)
+
+The only way I have found to work around this problem is to remove
+atom and rss from /etc/mime.types. Of course, that has other undesirable
+results.
+
+I wonder if it would be worth making the po plugin generate apache
+[type map files](http://httpd.apache.org/docs/2.0/mod/mod_negotiation.html#typemaps).
+That should avoid this problem.
+--[[Joey]]
+
+Update: A non-intrusive fix is to add this to apache configuration.
+This tunes the "quality" of the rss and atom files, in an apparently currently
+undocumented way (though someone on #httpd suggested it should get documented).
+Result is that apache will prefer serving index.html. --[[Joey]] [[done]]
+
+ AddType application/rss+xml;qs=0.8 .rss
+ AddType application/atom+xml;qs=0.8 .atom
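A toy model of how the `qs` source quality breaks the tie (this is a sketch of the idea, not Apache's actual negotiation algorithm):

```python
# Each variant has a server-side source quality (qs). A client sending
# "Accept: */*" rates every type 1.0, so the product of the two
# qualities decides, and lowering qs on the feeds makes the html page
# win the former tie.
variants = [
    ("index.en.html", "text/html", 1.0),
    ("index.en.rss", "application/rss+xml", 0.8),
    ("index.en.atom", "application/atom+xml", 0.8),
]

def negotiate(variants, accept_q=lambda mime: 1.0):
    # Pick the variant with the highest combined quality.
    return max(variants, key=lambda v: v[2] * accept_q(v[1]))[0]

best = negotiate(variants)  # with Accept: */* the html page wins
```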
diff --git a/doc/bugs/po:_double_commits_of_po_files.mdwn b/doc/bugs/po:_double_commits_of_po_files.mdwn
new file mode 100644
index 000000000..2f3015e2b
--- /dev/null
+++ b/doc/bugs/po:_double_commits_of_po_files.mdwn
@@ -0,0 +1,22 @@
+When adding a new english page, the po files are created, committed,
+and then committed again. The second commit makes this change:
+
+ -"Content-Type: text/plain; charset=utf-8\n"
+ -"Content-Transfer-Encoding: ENCODING"
+ +"Content-Type: text/plain; charset=UTF-8\n"
+ +"Content-Transfer-Encoding: ENCODING\n"
+
+Same thing happens when a change to an existing page triggers a po file
+update. --[[Joey]]
+
+> * The s/utf-8/UTF-8 part has been fixed.
+> * The ENCODING\n part is due to an inconsistency in po4a, which
+> I've just send a patch for. --[[intrigeri]]
+
+>> I resubmitted the patch to po4a upstream, sending it this time to
+>> their mailing-list:
+>> [post archive](http://lists.alioth.debian.org/pipermail/po4a-devel/2010-July/001897.html).
+>> --[[intrigeri]]
+
+>>> Seems to me Debian Squeeze's po4a does not expose this bug anymore
+>>> => [[done]]. --[[intrigeri]]
diff --git a/doc/bugs/po:_markdown_link_parse_bug.mdwn b/doc/bugs/po:_markdown_link_parse_bug.mdwn
new file mode 100644
index 000000000..1aa4eb803
--- /dev/null
+++ b/doc/bugs/po:_markdown_link_parse_bug.mdwn
@@ -0,0 +1,21 @@
+Apparently this is legal markdown, though unusual syntax for a link:
+
+ [Branchable](http://www.branchable.com/ "Ikiwiki hosting")
+
+If that is put on a translatable page, the translations display it not as a
+link, but as plain text.
+
+Probably a po4a bug, but I don't see the bug clearly in the generated po
+file:
+
+ "This was posted automatically by [Branchable](http://www.branchable.com/ "
+ "\"Ikiwiki hosting\") when I signed up."
+
+--[[Joey]]
+
+> I cannot reproduce this on my Squeeze system with ikiwiki Git code;
+> both the page in the master language and translation pages perfectly
+> display the link (and tooltip) in my testing environment. Were you
+> using an older po4a, such as Lenny's? --[[intrigeri]]
+
+>> Quite likely. Not seeing the problem now, [[done]] --[[Joey]]
diff --git a/doc/bugs/po:_might_not_add_translated_versions_of_all_underlays.mdwn b/doc/bugs/po:_might_not_add_translated_versions_of_all_underlays.mdwn
new file mode 100644
index 000000000..82aed400d
--- /dev/null
+++ b/doc/bugs/po:_might_not_add_translated_versions_of_all_underlays.mdwn
@@ -0,0 +1,16 @@
+[[plugins/po]]'s `checkconfig` looks in the `underlaydirs`, but plugins that
+add underlays typically do so in their own `checkconfig`.
+
+As far as I can see, this will result in it not adding translated versions
+of underlays added by a plugin that comes after it in `$config{add_plugins}`;
+for instance, if you have `add_plugins => qw(po smiley)`, you'll probably
+not get the translated versions of `smileys.mdwn`. (I haven't tested this.)
+
+> It doesn't happen because smiley adds the underlay unconditionally on
+> import. Which is really more usual.
+
+To see them all, `po` should use `last => 1` when registering the hook.
+--[[smcv]]
+
+> At least for plugins that don't `last` their hooks too! But, added, since
+> it will make the problem much less likely to occur. --[[Joey]] [[done]]
diff --git a/doc/bugs/po:_new_pages_not_translatable.mdwn b/doc/bugs/po:_new_pages_not_translatable.mdwn
new file mode 100644
index 000000000..c19f66594
--- /dev/null
+++ b/doc/bugs/po:_new_pages_not_translatable.mdwn
@@ -0,0 +1,12 @@
+Today I added a new English page to l10n.ikiwiki.info. When I saved,
+the page did not have the translation links at the top. I waited until
+the po plugin had, in the background, created the po files, and refreshed;
+still did not see the translation links. Only when I touched the page
+source and refreshed did it finally add the translation links.
+I can reproduce this bug in a test site. --[[Joey]]
+
+> I could reproduce this bug at some point during the merge of a buggy
+> version of my ordered slave languages patch, but I cannot anymore.
+> Could you please try again? --[[intrigeri]]
+
+>> Cannot reproduce with 3.20100722, [[done]] I guess. --[[Joey]]
diff --git a/doc/bugs/po:_plugin_should_not_override_the_title_on_the_homepage.mdwn b/doc/bugs/po:_plugin_should_not_override_the_title_on_the_homepage.mdwn
new file mode 100644
index 000000000..8f9374707
--- /dev/null
+++ b/doc/bugs/po:_plugin_should_not_override_the_title_on_the_homepage.mdwn
@@ -0,0 +1,58 @@
+The po plugin systematically overrides the title of the homepage with the wikiname. This prevents explicitly changing it with a meta directive. It should rather check whether it was overridden before setting it back.
+
+Here is a simple patch for that:
+
+ diff --git a/Plugin/po.pm b/Plugin/po.pm
+ index 6395ebd..a048c6a 100644
+ --- a/Plugin/po.pm
+ +++ b/Plugin/po.pm
+ @@ -333,7 +333,7 @@ sub pagetemplate (@) {
+ && $masterpage eq "index") {
+ $template->param('parentlinks' => []);
+ }
+ - if (ishomepage($page) && $template->query(name => "title")) {
+ + if (ishomepage($page) && $template->query(name => "title") && !$template->query(name => "title_overridden")) {
+ $template->param(title => $config{wikiname});
+ }
+ }
+
+Thanks.
+
+> I fixed this patch a bit and applied it to my po branch, thanks
+> (commit 406485917).
+>
+> But... a bug (probably in HTML::Template) prevents this
+> theoretically correct solution from actually working.
+> Setting a parameter that does not appear in the template, such as
+> `title_overridden`, is not working on my install: the value does not
+> seem to be stored anywhere, and when accessing it later using
+> `$template->param('title_overridden')` it is always undef.
+> Adding `<TMPL_IF TMPL_VAR TITLE_OVERRIDDEN></TMPL_IF>` in
+> `page.tmpl` is a working, but ugly workaround.
+>
+> I am nevertheless in favour of merging the fix into ikiwiki.
+> We'll then need to track down the remaining (smaller) bug so
+> that this code can actually work.
+>
+> I'd like others to test my po branch and see if they can reproduce
+> the bug I am talking of.
+>
+> --[[intrigeri]]
+
+>> Commit 406485917 looks fine to me, FWIW --[[smcv]]
+
+>>> I tracked the HTML::Template bug (or missing documentation?) a bit
+>>> more. This lead to commit b2a2246ba in my po branch, that enables
+>>> HTML::Template's parent_global_vars option which makes
+>>> title_overridden work.
+>>>
+>>> OTOH I feel this workaround is a bit ugly as this option is not
+>>> documented. IMHO being forced to use it reveals a bug in
+>>> HTML::Template. I reported this:
+>>> https://rt.cpan.org/Public/Bug/Display.html?id=64158.
+>>>
+>>> But still, I think we need to apply the workaround as
+>>> HTML::Template's author has not updated any dist on CPAN for more
+>>> than one year. --[[intrigeri]]
+
+>>>> All merged, [[done]]. --[[Joey]]
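The workaround can be sketched as follows (a minimal standalone example, not ikiwiki's actual code; `parent_global_vars` is the undocumented HTML::Template option discussed above, and the inline template here is hypothetical):

```perl
use HTML::Template;

# Sketch: with die_on_bad_params disabled, a param such as
# title_overridden can be set even if a given template does not use it;
# parent_global_vars additionally makes params set on the top-level
# template visible inside included sub-templates.
my $template = HTML::Template->new(
    scalarref => \"<TMPL_IF TITLE_OVERRIDDEN>custom<TMPL_ELSE>default</TMPL_IF>\n",
    parent_global_vars => 1,
    die_on_bad_params  => 0,
);
$template->param(title_overridden => 1);
print $template->output;    # prints "custom"
```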
diff --git a/doc/bugs/po:_po4a_too_strict_on_html_pages.mdwn b/doc/bugs/po:_po4a_too_strict_on_html_pages.mdwn
new file mode 100644
index 000000000..eba59a682
--- /dev/null
+++ b/doc/bugs/po:_po4a_too_strict_on_html_pages.mdwn
@@ -0,0 +1,22 @@
+On some source .html pages, po4a wrongly detects a malformed document,
+which makes the po plugin error out and the wiki build is aborted.
+
+I've pushed a [[patch]] to my `po` branch to fix this: it makes po4a
+warn, instead of error'ing out, when it detects a malformed input
+document.
+
+This is really a po4a bug which I will report, but since most ikiwiki
+users are going to live with Squeeze's po4a for a while, I think we
+should work around it in ikiwiki.
+
+Also, the current state of things makes it a bit too easy to break a
+given ikiwiki site's build (DoS) when both the html and po plugins are
+enabled: inserting an HTML tag without closing it is enough.
+
+--[[intrigeri]]
+
+> Hmm, so this happened while I was away at the beach and I have a big
+> backlog of stuff, only saw it now. I've merged the patch for master and
+> will be releasing that soon. I will cherry-pick the fix into at least
+> my debian-stable branch too. I don't know if this is worth doing a whole
+> security advisory for. --[[Joey]]
diff --git a/doc/bugs/po:_po_files_instead_of_html_files.mdwn b/doc/bugs/po:_po_files_instead_of_html_files.mdwn
new file mode 100644
index 000000000..f84dc8ff4
--- /dev/null
+++ b/doc/bugs/po:_po_files_instead_of_html_files.mdwn
@@ -0,0 +1,30 @@
+On the home page of my wiki, when I click on the link "ikiwiki", I get the English file instead of the French file.
+At the bottom of this page, there is the "Links" line:
+Links: index index.fr templates templates.fr
+When I click on "templates.fr", I get the po file instead of html.
+
+ Sorry for the noise! I set "po_master_language" to fr and all was ok.
+
+> Any chance you could be a bit more verbose about what the
+> misconfiguration was? I don't think the po plugin should behave like that
+> in any configuration. Unless, perhaps, it was just not configured to
+> support any languages at all, and so the po file was treated as a raw
+> file. --[[Joey]]
+
+>> I can reproduce the bug with:
+ # po plugin
+ # master language (non-PO files)
+ po_master_language => {
+ code => 'en',
+ name => 'English'
+ },
+ # slave languages (PO files)
+ po_slave_languages => [qw{fr|Français}],
+
+>>> I've never found any `.po` file in the destination directory on
+>>> any of my PO-enabled ikiwiki instances. Without more information,
+>>> there's nothing I can do: the config snippet pasted above is more
+>>> or less the example one and does not allow me to reproduce the
+>>> bug. --[[intrigeri]]
+
+>>>> I think it's best to close this as unreproducible. [[done]] --[[Joey]]
diff --git a/doc/bugs/po:_ugly_messages_with_empty_files.mdwn b/doc/bugs/po:_ugly_messages_with_empty_files.mdwn
new file mode 100644
index 000000000..d3992b6bc
--- /dev/null
+++ b/doc/bugs/po:_ugly_messages_with_empty_files.mdwn
@@ -0,0 +1,6 @@
+If there are empty .mdwn files, the po plugin displays some ugly messages.
+
+> This is due to a bug in po4a (not checking definedness of a
+> variable). One-liner patch sent. --[[intrigeri]]
+
+>> This seems to be fixed in po4a 0.40 => [[done]]. --[[intrigeri]]
diff --git a/doc/bugs/po:broken_links_to_translatable_basewiki_pages_that_lack_po_fies.mdwn b/doc/bugs/po:broken_links_to_translatable_basewiki_pages_that_lack_po_fies.mdwn
new file mode 100644
index 000000000..121d33807
--- /dev/null
+++ b/doc/bugs/po:broken_links_to_translatable_basewiki_pages_that_lack_po_fies.mdwn
@@ -0,0 +1,73 @@
+broken links to translatable basewiki pages that lack po files
+--------------------------------------------------------------
+
+If a page is not translated yet, the "translated" version of it
+displays wikilinks to other, existing (but not yet translated?)
+pages as edit links, as if those pages do not exist.
+
+That's really confusing, especially as clicking such a link
+brings up an edit form to create a new, english page.
+
+This is with po_link_to=current or negotiated. With default, it doesn't
+happen.
+
+Also, this may only happen if the page being linked to is coming from an
+underlay, and the underlays lack translation to a given language.
+--[[Joey]]
+
+> Any simple testcase to reproduce it, please? I've never seen this
+> happen yet. --[[intrigeri]]
+
+>> Sure, go here <http://l10n.ikiwiki.info/smiley/smileys/index.sv.html>
+>> (currently 0% translated) and see the 'WikiLink' link at the bottom,
+>> which goes to <http://l10n.ikiwiki.info/ikiwiki.cgi?page=ikiwiki/wikilink&from=smiley/smileys&do=create>
+>> Compare with, e.g., the 100% translated Dansk version, where
+>> the WikiLink link links to the English WikiLink page. --[[Joey]]
+
+>>> Seems not related to the page/string translation status: the 0%
+>>> translated Spanish version has the correct link, just like the
+>>> Dansk version => I'm changing the bug title accordingly.
+>>>
+>>> I tested forcing the sv html page to be rebuilt by translating a
+>>> string in it, it did not fix the bug. I did the same for the
+>>> Spanish page, it did not introduce the bug. So this is really
+>>> weird.
+>>>
+>>> The smiley underlay seems to be the only place where the wrong
+>>> thing happens: the basewiki underlay has similar examples
+>>> that do not exhibit this bug. An underlay linking to another might
+>>> be necessary to reproduce it. Going to dig deeper. --[[intrigeri]]
+
+>>>> After a few hours lost in the Perl debugger, I think I have found
+>>>> the root cause of the problem: in l10n wiki's configured
+>>>> `underlaydir`, the basewiki is present in every slave language
+>>>> that is enabled for this wiki *but* Swedish. With such a
+>>>> configuration, the `ikiwiki/wikilink` page indeed does not exist
+>>>> in Swedish language: no `ikiwiki/wikilink.sv.po` can be found
+>>>> where ikiwiki is looking. Have a look at
+>>>> <http://l10n.ikiwiki.info/ikiwiki/>: the basewiki is not
+>>>> available in Swedish on this wiki. So this is not a po
+>>>> bug, but a configuration or directories layout issue. This is
+>>>> solved by adding the Swedish basewiki to the underlay dir, which
+>>>> is I guess not a possibility in the l10n wiki context. I guess
+>>>> this could be solved by adding `SRCDIR/basewiki` as an underlay
+>>>> to your l10n wiki configuration, possibly using the
+>>>> `add_underlays` configuration directive. --[[intrigeri]]
+
+>>>>> There is no complete Swedish underlay translation yet, so it is not
+>>>>> shipped in ikiwiki. I don't think it's a misconfiguration to use
+>>>>> a language that doesn't have translated underlays. --[[Joey]]
+
+>>>>>> Ok. The problem is triggered when using a language that doesn't
+>>>>>> have translated underlays, *and* defining
+>>>>>> `po_translatable_pages` in a way that renders the base wiki
+>>>>>> pages translatable in po's view of things, which in turn makes
+>>>>>> the po plugin act as if the translation pages did exist,
+>>>>>> although they do not in this case. I still need to have a deep
+>>>>>> look at the underlays-related code you added to `po.pm` a while
+>>>>>> ago. Stay tuned. --[[intrigeri]]
+
+>>>>>>> Fixed in my po branch, along with other related small bugs that
+>>>>>>> happen in the very same situation only. --[[intrigeri]]
+
+>>>>>>>> Merged. Not tested yet, but I trust you; [[done]] --[[Joey]]
diff --git a/doc/bugs/po_plugin_adds_new_dependency.mdwn b/doc/bugs/po_plugin_adds_new_dependency.mdwn
new file mode 100644
index 000000000..3ddcc30f2
--- /dev/null
+++ b/doc/bugs/po_plugin_adds_new_dependency.mdwn
@@ -0,0 +1,38 @@
+Was it intended that the po plugin add a new dependency?
+
+> Yes; see debian/control Build-Depends. However, I have made it disable
+> building that if po4a is not available. [[done]] --[[Joey]]
+
+ PERL5LIB=.. ./po2wiki underlay.setup
+ Failed to load plugin IkiWiki::Plugin::po: Can't locate Locale/Po4a/Common.pm in @INC (@INC contains: .. /Library/Perl/Updates/5.8.8 /System/Library/Perl/5.8.8/darwin-thread-multi-2level /System/Library/Perl/5.8.8 /Library/Perl/5.8.8/darwin-thread-multi-2level /Library/Perl/5.8.8 /Library/Perl /Network/Library/Perl/5.8.8/darwin-thread-multi-2level /Network/Library/Perl/5.8.8 /Network/Library/Perl /System/Library/Perl/Extras/5.8.8/darwin-thread-multi-2level /System/Library/Perl/Extras/5.8.8 /Library/Perl/5.8.6 /Library/Perl/5.8.1 /sw/lib/perl5/5.8.8/darwin-thread-multi-2level /sw/lib/perl5/5.8.8 /sw/lib/perl5/darwin-thread-multi-2level /sw/lib/perl5 /sw/lib/perl5/darwin /usr/local/lib/perl5/site_perl/5.8.8/darwin-thread-multi-2level /usr/local/lib/perl5/site_perl/5.8.8 /usr/local/lib/perl5/site_perl .) at ../IkiWiki/Plugin/po.pm line 13.
+ BEGIN failed--compilation aborted at ../IkiWiki/Plugin/po.pm line 13.
+ Compilation failed in require at (eval 27) line 2.
+ BEGIN failed--compilation aborted at (eval 27) line 2.
+
+ make[1]: *** [po2wiki_stamp] Error 2
+ make: *** [extra_build] Error 2
+
+And it looks like this dependency is not easy to work around. The issue is that the newly translated base wiki means that the po plugin is being used by the build system. It is no longer optional. I've turned it off in my workspace like this: (heavy handed, but it lets me keep going until a proper fix is available)
+
+ diff --git a/Makefile.PL b/Makefile.PL
+ index 602d8fb..68728b7 100755
+ --- a/Makefile.PL
+ +++ b/Makefile.PL
+ @@ -42,7 +42,7 @@ extra_build: ikiwiki.out ikiwiki.setup docwiki
+ ./mdwn2man ikiwiki-makerepo 1 doc/ikiwiki-makerepo.mdwn > ikiwiki-makerepo.man
+ ./mdwn2man ikiwiki-transition 1 doc/ikiwiki-transition.mdwn > ikiwiki-transition.man
+ ./mdwn2man ikiwiki-update-wikilist 1 doc/ikiwiki-update-wikilist.mdwn > ikiwiki-update-wikilist.man
+ - $(MAKE) -C po
+ + # $(MAKE) -C po
+
+ docwiki: ikiwiki.out
+ $(PERL) -Iblib/lib $(extramodules) $(tflag) ikiwiki.out -libdir . -setup docwiki.setup -refresh
+ @@ -114,7 +114,7 @@ extra_install: underlay_install
+ install ikiwiki.out $(DESTDIR)$(PREFIX)/bin/ikiwiki
+ install ikiwiki-makerepo ikiwiki-transition ikiwiki-update-wikilist $(DESTDIR)$(PREFIX)/bin/
+
+ - $(MAKE) -C po install DESTDIR=$(DESTDIR) PREFIX=$(PREFIX)
+ + # $(MAKE) -C po install DESTDIR=$(DESTDIR) PREFIX=$(PREFIX)
+
+ # These might fail if a regular user is installing into a home
+ # directory.
diff --git a/doc/bugs/po_plugin_cannot_add_po_files_into_git.mdwn b/doc/bugs/po_plugin_cannot_add_po_files_into_git.mdwn
new file mode 100644
index 000000000..8e3399611
--- /dev/null
+++ b/doc/bugs/po_plugin_cannot_add_po_files_into_git.mdwn
@@ -0,0 +1,34 @@
+po files are not added to git (error: /path/to/po/file not in repository tree) in my setup.
+
+I have set an absolute path for srcdir = '/path/to/repo/doc/'. The root of my git repository is '/path/to/repo/'. When I enable the po plugin, it creates all po files and produces an error when it tries to add the files, saying that /path/to/repo/doc/index.fr.po is not in the repository tree.
+
+I have no problem when I use a relative path like srcdir = '.'.
+
+I have another issue with the po plugin when I set the srcdir to './doc/' (provided that my config file is in /path/to/repo). In this case the po plugin tries to add 'doc/doc/index.fr.po', which does not exist (seems like the srcdir path is prepended twice).
+
+> You should never use a relative srcdir path with ikiwiki.
+>
+> I wonder what version of git you have there, since it works ok with the
+> version I have here. But, the po plugin is definitely doing the wrong
+> thing; it's telling git to add the po file with the full srcdir path
+> rather than relative to its root. Fixed that. [[done]] --[[Joey]]
+
+>> Yeah, I figured as much about the relative path.
+>> Git version 1.6.3.3 (on both my dev and server machines)
+>>
+>> Here is an example of what I get when I update the po file on my laptop and I push to the master repository:
+
+ From /srv/git/sb
+ 5eb4619..ecac4d7 master -> origin/master
+ scanning doc.fr.po
+ building doc.fr.po
+ building doc.mdwn, which depends on doc.fr
+ building recentchanges.mdwn, which depends on recentchanges/change_ecac4d7311b15a3a3ed03102b9250487315740bc
+ fatal: '/srv/www/sb.l.n/new/doc/doc.fr.po' is outside repository
+ 'git add /srv/www/sb.l.n/new/doc/doc.fr.po' failed: at /usr/share/perl5/IkiWiki/Plugin/git.pm line 161.
+ done
+ To ssh://git.lohrun.net/var/cache/git/songbook.git
+ 5eb4619..ecac4d7 master -> master
+
+>> The root repository used to run ikiwiki is `/srv/www/sb.l.n/new/`
+>> -- [[AlexandreDupas]]
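The fix Joey describes amounts to handing git a path relative to the repository root rather than the full srcdir path. A sketch, using the hypothetical paths from the error log above:

```shell
# Sketch: git add refuses a path outside (or resolved oddly against)
# its work tree, so strip the repository-root prefix first and add the
# file by its repo-relative name.
repo=/srv/www/sb.l.n/new
file=/srv/www/sb.l.n/new/doc/doc.fr.po
rel=${file#"$repo"/}
echo "$rel"          # doc/doc.fr.po
# then, from within the repository: git add "$rel"
```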
diff --git a/doc/bugs/po_vs_templates.mdwn b/doc/bugs/po_vs_templates.mdwn
new file mode 100644
index 000000000..d826546e6
--- /dev/null
+++ b/doc/bugs/po_vs_templates.mdwn
@@ -0,0 +1,48 @@
+The po plugin's protection against processing loops (i.e. the
+alreadyfiltered stuff) is playing against us: the template plugin
+triggers a filter hooks run with the very same ($page, $destpage)
+arguments pair that is used to identify an already filtered page.
+
+Processing an included template can then mark the whole translation
+page as already filtered, which prevented `po_to_markup` from being called on
+the PO content.
+
+Symptoms: the unprocessed gettext file goes unfiltered to the
+generated HTML.
+
+This has been fixed in my po branch.
+
+> My commit dcd57dd5c9f3265bb7a78a5696b90976698c43aa updates the
+> bugfix in a much more elegant manner. Its main disadvantage is to
+> add an (optional) argument to IkiWiki::filter. Please review.
+
+-- [[intrigeri]]
+
+>> Hmm. Don't like adding a fourth positional parameter to that (or
+>> any really) function.
+>>
+>> I think it's quite possible that some of the directives that are
+>> calling filter do so unnecessarily. For example, conditional,
+>> cutpaste, more, and toggle each re-filter text that comes from the
+>> page and so has already been filtered. They could probably drop
+>> the filtering. template likewise does not need to filter the
+>> parameters passed into it. Does it need to filter the template output?
+>> Well, it allows the (deprecated) embed plugin to work on template
+>> content, but that's about it.
+>>
+>> Note also that the only other plugin to provide a filter, txt,
+>> could also run into similar problems as po has, in theory (it looks at
+>> the page parameter and assumes the content is for the whole page).
+>>
+>> [[!template id=gitbranch branch=origin/filter-full author="[[joey]]"]]
+>> So, I've made a filter-full branch, where I attempt to fix this
+>> by avoiding unnecessary filtering. Can you check it and merge it into
+>> your po branch and remove your other workarounds so I can merge?
+>> --[[Joey]]
+
+>>> I merged your filter-full branch into my po branch and reverted my
+>>> other workarounds. According to my tests this works ok. I'm glad
+>>> you found this solution, as I didn't like changing the filter
+>>> prototype. I believe you can now merge this code. --[[intrigeri]]
+
+[[!tag patch done]]
diff --git a/doc/bugs/poll_plugin:_can__39__t_vote_for_non-ascii_options.mdwn b/doc/bugs/poll_plugin:_can__39__t_vote_for_non-ascii_options.mdwn
new file mode 100644
index 000000000..0f045c254
--- /dev/null
+++ b/doc/bugs/poll_plugin:_can__39__t_vote_for_non-ascii_options.mdwn
@@ -0,0 +1,7 @@
+I don't seem to be able to vote for options that have non-ascii names, using the poll plugin.
+
+As an example, see http://test.liw.fi/testpoll/index.html: the "red", "green", and "blue" options work fine, but the "ehkä" one does not.
+--[liw](http://liw.fi/)
+
+> Ok, 4.5 hours of beating my head against a brick wall, and I've fixed this.
+> [[done]] --[[Joey]]
diff --git a/doc/bugs/poll_plugin_uses_GET.mdwn b/doc/bugs/poll_plugin_uses_GET.mdwn
new file mode 100644
index 000000000..0538aaa93
--- /dev/null
+++ b/doc/bugs/poll_plugin_uses_GET.mdwn
@@ -0,0 +1,8 @@
+The [[plugins/poll]] plugin uses GET for the vote links. As a result, the
+[[news/openid]] poll has a number of votes from Google. :)
+
+done -- [[Joey]]
+
+Not quite; [the `<form>` `method` attribute defaults to GET](http://www.w3.org/TR/html401/interact/forms.html#adef-method). The forms each need the attribute `method="POST"`.
+
+[[bugs/done]] -- [[Joey]]
diff --git a/doc/bugs/possibly_po_related_error.mdwn b/doc/bugs/possibly_po_related_error.mdwn
new file mode 100644
index 000000000..2a65ae606
--- /dev/null
+++ b/doc/bugs/possibly_po_related_error.mdwn
@@ -0,0 +1,20 @@
+A site got stuck like this:
+
+<pre>
+/home/b-fusioninventory/public_html/documentation/index.es.html independently created, not overwriting with version from documentation.es
+</pre>
+
+I tried rebuilding it, and the rebuild failed like this:
+
+<pre>
+building recentchanges/change_ef4b9f92821335d96732c4b2c93ed96bc84c2f0d._change, which depends on templates/page.tmpl
+removing recentchanges/change_9ca1de878ea654566ce4a8a031d1ad8ed135ea1c/index.html, no longer built by recentchanges/change_9ca1de878ea654566ce4a8a031d1ad8ed135ea1c
+internal error: recentchanges/change_9ca1de878ea654566ce4a8a031d1ad8ed135ea1c._change cannot be found in /home/b-fusioninventory/source or underlay
+</pre>
+
+This internal error seems like the root cause of the original failure.
+ikiwiki crashed and did not record that it wrote the index.es.html file.
+
+Deleting the indexdb and rebuilding cleaned up the problem.
+
+This needs more investigation. --[[Joey]]
diff --git a/doc/bugs/post-commit_hangs.mdwn b/doc/bugs/post-commit_hangs.mdwn
new file mode 100644
index 000000000..32820d886
--- /dev/null
+++ b/doc/bugs/post-commit_hangs.mdwn
@@ -0,0 +1,47 @@
+# post-commit hangs
+
+I installed ikiwiki v3.14159 in /usr/local from tarball (/usr contains an older version). Having done so, and used ikiwiki-transition to update the setup file, the post commit hook is now blocking in flock (as seen by ps). I should also mention that I added the goodstuff, attachment and remove plugins (which was the purpose of upgrading to v3). Any clues as how to debug/fix gratefully received. The wiki is publicly viewable at wiki.sgcm.org.uk if that helps.
+
+> It's blocking when you do what? Save a page from the web? Make a commit
+> to the underlaying VCS? Which VCS? These are all different code paths..
+> --[[Joey]]
+
+>> It's blocking when I run "ikiwiki --setup ikiwiki.setup" (which calls hg update, which calls ikiwiki --post-commit).
+>> Hmm, maybe it's the recursive call to ikiwiki which is the problem.
+>> The underlying VCS is mercurial. --Ali
+
+>>> You're not supposed to run ikiwiki -setup manually in your post commit hook.
+>>> Doing so will certainly lead to a locking problem; it also forces ikiwiki to rebuild
+>>> the entire wiki anytime a single page changes, which is very inefficient!
+>>>
+>>> Instead, you should use the `mercurial_wrapper` setting
+>>> in the setup file, which will make ikiwiki generate a small
+>>> executable expressly designed to be run at post commit time.
+>>> Or, you can use the `--post-commit` option, as documented
+>>> in [[rcs/mercurial]] --[[Joey]]
+
+>>>> I don't run ikiwiki --setup in the commit hook; I run ikiwiki --post-commit (as mentioned above).
+>>>> I'm trying to run ikiwiki --setup from the command line after modifying the setup file.
+>>>> ikiwiki --setup is calling hg update, which is calling ikiwiki --post-commit. Am I not supposed to do that? --Ali
+
+>>>>> No, I don't think that hg update should call ikiwiki anything. The
+>>>>> [[hgrc_example|rcs/mercurial]] doesn't seem to configure it to do that? --[[Joey]]
+
+>>>>>> Ok, I'm not sure I understand what's going on, but my problem is solved.
+>>>>>>
+>>>>>> My hgrc used to say:
+>>>>>>
+>>>>>> [hooks]
+>>>>>>
+>>>>>> incoming.update = hg up
+>>>>>>
+>>>>>> update.ikiwiki = ikiwiki --setup /home/ikiwiki/ikiwiki.setup --post-commit
+>>>>>>
+>>>>>> I've now changed it to match the example page and it works. Thanks --Ali.
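The shape of the working configuration (hypothetical paths; see [[rcs/mercurial]] for the canonical example) is to have the hook run only ikiwiki's small compiled post-commit wrapper, never a full `--setup` rebuild:

```ini
; Sketch of an hgrc that avoids the recursive lock: the update hook
; runs the wrapper generated by the mercurial_wrapper setting instead
; of invoking ikiwiki --setup, which rebuilds the whole wiki and takes
; the lock again.
[hooks]
incoming.update = hg up
update.ikiwiki = /home/ikiwiki/wiki/post-commit
```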
+
+>>>>>>> [[done]]
+
+> Also, how have you arranged to keep it from seeing the installation in /usr? Perl could well be loading
+> modules from the old installation, and if it's one with a different locking strategy that would explain your problem. --[[Joey]]
+
+>> Good point. Not knowing perl, I just assumed /usr/local would take precedence. I've now used "dpkg -r ikiwiki" to remove the problem. --Ali
diff --git a/doc/bugs/post-update_hook_can__39__t_be_compiled_with_tcc.mdwn b/doc/bugs/post-update_hook_can__39__t_be_compiled_with_tcc.mdwn
new file mode 100644
index 000000000..a8fb19888
--- /dev/null
+++ b/doc/bugs/post-update_hook_can__39__t_be_compiled_with_tcc.mdwn
@@ -0,0 +1,19 @@
+Thinking that any C compiler would do the job, I tried to use tcc with ikiwiki, as explicitly allowed by the Debian package dependencies.
+
+I installed `tcc` and `libc6-dev` (for `libcrt1`). The wrapper compilation was OK, but the wrapper fails to run correctly and dies with
+
+ usage: ikiwiki [options] source dest
+ ikiwiki --setup configfile
+
+Everything works fine with gcc.
+
+versions: Debian lenny + backports
+
+> Seems that tcc does not respect changing where `environ` points as a way
+> to change the environment seen after `exec`.
+>
+> Given that the man page for `clearenv` suggests using `environ=NULL`
+> if `clearenv` is not available, I would be leery of using tcc to compile
+> stuff, since that could easily lead to a security compromise of code that
+> expects that to work. However, I have fixed ikiwiki to use `clearenv`.
+> --[[Joey]] [[done]]
diff --git a/doc/bugs/prettydate_with_weekday-date_inconsistency.mdwn b/doc/bugs/prettydate_with_weekday-date_inconsistency.mdwn
new file mode 100644
index 000000000..430d65a3f
--- /dev/null
+++ b/doc/bugs/prettydate_with_weekday-date_inconsistency.mdwn
@@ -0,0 +1,32 @@
+Prettydate creates strings like this: _Last edited in the wee hours of Tuesday night, July 1st, 2009_. However, July 1st is a Wednesday, so either the date or the weekday should be modified. In that spirit it is probably _Tuesday night, June 30th_. --ulrik
+
+> The default prettydate times are fairly idiosyncratic to
+> how [[Joey]] thinks about time. Specifically, it's still
+> Tuesday night until he wakes up Wednesday morning -- which
+> could be in the afternoon. :-P But, Joey also realizes
+> that dates change despite his weird time sense, and so
+> July 1st starts at midnight on Tuesday and continues
+> through Tuesday night and part of Wednesday.
+>
+> (This might not be as idiosyncratic as I make it out to be..
+> I think that many people would agree that in the wee hours
+> of New Years Eve, when they're staggering home ahead of
+> the burning daylight, the date is already January 1st.)
+>
+> I think the bug here is that prettydate can't represent
+> all views of time. While the times
+> of day can be configured, and it's possible to configure it
+> to call times after midnight "Wednesday morning, July 1st",
+> it is not possible to configure the date or weekday based
+> on the time of day.
+>
+> In order to do so, prettydate's timetable would need to be
+> extended to include the "%B %o, %Y" part, and that extended
+> to include "%B-", "%o-", and "%Y-" to refer to the day
+> before.
+>
+> --[[Joey]]
+
+>> fair enough, I think I can get converted to a warped time perspective. --ulrik
+
+>>> Perhaps we can consider this [[done]], then? --[[smcv]]
diff --git a/doc/bugs/preview_base_url_should_be_absolute.mdwn b/doc/bugs/preview_base_url_should_be_absolute.mdwn
new file mode 100644
index 000000000..f160a84c4
--- /dev/null
+++ b/doc/bugs/preview_base_url_should_be_absolute.mdwn
@@ -0,0 +1,53 @@
+The edit page CGI defines a `base` tag with a URL which is not
+absolute, which can break the preview function in some circumstances
+(with e.g. images not showing). The trivial [[patch]] that fixes
+it can be found [[here|http://sprunge.us/EPHT]] as well as on [[my
+git|http://git.oblomov.eu/ikiwiki]].
+
+> That patch does mean that if you're accessing the CGI via HTTPS but your
+> $config{url} and $config{cgiurl} are HTTP, you'll get preview images loaded
+> via HTTP, causing the browser to complain. See
+> [[todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both]]
+> for background.
+>
+> Perhaps the CGI could form its `<base>` URL by using
+> `URI->new_abs(urlto(...), $cgi->url)` instead?
+>
+> You'd also need to change `IkiWiki/Wrapper.pm` to pass at least the
+> SERVER_NAME and SERVER_PORT through the environment, probably.
+>
+> Joey's last comment on
+> [[todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both]]
+> suggests that this might already work, but I'm not quite sure how - I'd
+> expect it to need more environment variables? --[[smcv]]
+>
+>> `CGI::url` uses `REQUEST_URI`. So it could be used, but I don't see
+>> how to get from the `CGI::url` to an url to the page that is being
+>> edited. --[[Joey]]
+>>> (The right rune seems to be: `URI->new_abs(urlto($params{page}), $cgi->url)`) --[[Joey]]
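A standalone sketch of that rune (hypothetical values standing in for what `urlto()` and `$cgi->url` would return in ikiwiki):

```perl
use URI;

# Sketch: resolve the possibly-relative URL of the page being edited
# against the absolute URL the CGI was actually requested at, yielding
# an absolute <base> URL that matches the scheme/host the browser used.
my $page_url = "../ikiwiki/wikilink/";                    # stand-in for urlto($params{page})
my $cgi_url  = "https://example.com/cgi-bin/ikiwiki.cgi"; # stand-in for $cgi->url
my $base = URI->new_abs($page_url, $cgi_url);
print "$base\n";    # https://example.com/ikiwiki/wikilink/
```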
+
+---
+
+Update: This bug is worse than it first appeared, and does not only affect
+previewing. The cgi always has a `<base>` url, and it's always relative,
+and that can break various links etc. For example, when the 404 plugin
+displays a missing page, it has a Recentchanges link, which would be broken
+if the cgi was in an unusual place.
+
+`misctemplate` needs to *always* set an absolute baseurl. Which is a problem,
+since `misctemplate` is not currently passed a cgi object from which to
+construct one. --[[Joey]]
+
+Update: Worse and worse. `baseurl(undef)` can be a relative url, but
+nearly every use of it I can find actually needs to be absolute.
+The numerous `redirect($q, baseurl(undef))` calls all need to be absolute
+according to the `CGI` documentation.
+
+So, I'm seriously thinking about reverting the part of
+[[todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both]]
+that made `baseurl(undef)` relative.
+And I suppose, re-opening that todo. :( --[[Joey]]
+
+----
+
+This was fixed in version 3.20110105 [[done]] --[[Joey]]
diff --git a/doc/bugs/preview_pagestate.mdwn b/doc/bugs/preview_pagestate.mdwn
new file mode 100644
index 000000000..7f7ec0976
--- /dev/null
+++ b/doc/bugs/preview_pagestate.mdwn
@@ -0,0 +1,13 @@
+If a change to a page is previewed, but not saved, `%pagestate` and
+`%wikistate` can be changed, and saved. Actually, it's not limited to
+those. Seems that spurious dependencies can be added, though existing
+dependencies will at least not be removed.
+
+It calls saveindex to record state about files created on disk for the
+preview. Those files will expire later. However, saveindex also
+saves other state changes.
+
+Seems like it needs to isolate all state changes when previewing... ugh.
+--[[Joey]]
+
+[[done]]
diff --git a/doc/bugs/previewing_new_page_can_leave_files_dangling.mdwn b/doc/bugs/previewing_new_page_can_leave_files_dangling.mdwn
new file mode 100644
index 000000000..22df485ad
--- /dev/null
+++ b/doc/bugs/previewing_new_page_can_leave_files_dangling.mdwn
@@ -0,0 +1,53 @@
+Steps to reproduce:
+
+1. Make a new post via web interface.
+2. Use a directive that generates extra files (say, teximg).
+3. Click cancel.
+
+What I expect:
+
+The files that teximg created should (eventually) be removed, along with the whole directory of the non-existent new post.
+
+What I got:
+
+I refresh and rebuild a few times, and the files are still dangling there. If I then try to create a post with the same name and same content, I get a "file independently created, not overwriting" error.
+
+> This is specific to previewing when creating a new page. If the page
+> previously existed, the next update to the page will remove the stale
+> preview files.
+>
+> Problem is that ikiwiki doesn't store state about files rendered by a
+> page if the page doesn't exist yet.
+>
+> However, just storing that state wouldn't entirely solve the problem,
+> since it would still not delete the leftovers until the page is updated,
+> which it never is if it's previewed and then canceled. And requiring the
+> cancel button be hit doesn't solve this, because people won't.
+>
+> Also, it's not really ideal that an existing page has to be updated to
+> remove stale files, because if the edit is aborted, the page might not be
+> updated for a long time.
+>
+> One fix would be to stash a copy of `%renderedfiles` before generating
+> the preview, then compare it afterwards to see how it changed and
+> determine what files were added, and record those someplace, and delete
+> them on a future refresh (after some reasonable time period). [[done]]
+>
+> Another approach would be to make previewing always render files with
+> some well-known temporary name. Then all such temp files could be removed
+> periodically. This would need changes to all plugins that write files
+> during preview though. For example, `will_render` might be changed to
+> return the actual filename to write to. --[[Joey]]
+
+For teximg, I think this can be fixed by using data url like graphviz, but
+I think plugins in general should be allowed to create files during preview
+and have them be cleaned up when the user presses cancel. This segues into
+what my actual problem is: I wrote a htmlize plugin to format .tex files as
+page images (following hnb and teximg, since I was completely unfamiliar
+with perl until yesterday (and ikiwiki until a few days ago)), and there is
+no way to tell if I'm in preview mode (so I can use data url and not leave
+files dangling) or commit mode (so I can use real images and not have
+bloated html).
+
+> It seems too ugly to thread an indicator to preview mode through to
+> htmlize, so I'd prefer to not deal with the problem that way.
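The stash-and-compare approach adopted in the fix above can be sketched as follows. Names here are illustrative, not ikiwiki's internals: `renderedfiles` plays the role of `%renderedfiles`, mapping pages to the files they rendered.

```python
def preview_leftovers(renderedfiles, do_preview, page):
    """Snapshot the rendered-file map before a preview, then return the
    files the preview newly created, so a later refresh can delete them."""
    before = {f for files in renderedfiles.values() for f in files}
    do_preview(page)
    after = {f for files in renderedfiles.values() for f in files}
    return sorted(after - before)
```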
diff --git a/doc/bugs/previewing_with_an_edittemplate_reverts_edit_box.mdwn b/doc/bugs/previewing_with_an_edittemplate_reverts_edit_box.mdwn
new file mode 100644
index 000000000..4405a7ab8
--- /dev/null
+++ b/doc/bugs/previewing_with_an_edittemplate_reverts_edit_box.mdwn
@@ -0,0 +1,5 @@
+The 'editcontent' textarea that should be saved across previews is being overridden whenever an edittemplate is in use, 'losing' edits on preview unless the browser maintains them in history.
+
+ --[[JoeRayhawk]]
+
+> ugly one... [[done]] --[[Joey]]
diff --git a/doc/bugs/problem_adding_tag_from_template.mdwn b/doc/bugs/problem_adding_tag_from_template.mdwn
new file mode 100644
index 000000000..0d1cf45a8
--- /dev/null
+++ b/doc/bugs/problem_adding_tag_from_template.mdwn
@@ -0,0 +1,10 @@
+I tried to make the [[plugin_template|templates/plugin]] automatically add the
+`type/core` tag if passed the `core` parameter. However, this did not appear
+to have the desired effect: if I removed `type/core` from the tags on a plugin
+page that used `core=1` in the template (such as [[plugins/mdwn]]), the
+`type/core` tag disappeared, and the template did not supply
+it. --[[JoshTriplett]]
+
+Problem was that setting a tag cleared all earlier tags. [[bugs/done]], and
+I like the idea of the autotagging..
+--[[Joey]]
diff --git a/doc/bugs/proxy.py_utf8_troubles.mdwn b/doc/bugs/proxy.py_utf8_troubles.mdwn
new file mode 100644
index 000000000..7e8f70e59
--- /dev/null
+++ b/doc/bugs/proxy.py_utf8_troubles.mdwn
@@ -0,0 +1,35 @@
+When writing an external plugin using `proxy.py`, the getstate and setstate
+functions don't accept unicode data:
+
+ uncaught exception: 'ascii' codec can't encode character u'\xe4' in position 25: ordinal not in range(128)
+ Traceback (most recent call last):
+ File "proxy.py", line 309, in run
+ self._in_fd, self._out_fd)
+ File "proxy.py", line 192, in handle_rpc
+ ret = self._dispatcher.dispatch(method, params)
+ File "proxy.py", line 84, in dispatch
+ return self._dispatch(method, params)
+ File "/usr/lib/python2.7/SimpleXMLRPCServer.py", line 420, in _dispatch
+ return func(*params)
+ File "proxy.py", line 251, in hook_proxy
+ ret = function(self, *args)
+ File "/home/chrysn/git/ikiwiki-plugins//plugins/my_plugin", line 49, in data2html
+ proxy.setstate(kwargs['page'], 'meta', 'title', unicode_containing_umlauts)
+ File "proxy.py", line 291, in setstate
+ return self.rpc('setstate', page, id, key, value)
+ File "proxy.py", line 233, in rpc
+ *args, **kwargs)
+ File "proxy.py", line 178, in send_rpc
+ cmd, data))
+ UnicodeEncodeError: 'ascii' codec can't encode character u'\xe4' in position 25: ordinal not in range(128)
+
+The culprit is the last `_debug_fn` invocation in `send_rpc` (line 178), where
+unicode data is format-fed into a byte string. While this could be circumvented
+by making the format string a unicode string, that would cause trouble with
+Python 3 and would just move the problem to the stderr writing later on.
+Instead, "`cmd, data))`" should become "`cmd, repr(data)))`" and everything is
+fine. The debug output doesn't look as pretty any more, but it is safe.
+
+--[[chrysn]]
+
+> ok, [[done]] --[[Joey]]
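The failure can be reproduced in a few lines. This is a sketch; on Python 3, `ascii()` gives the escaped, ASCII-only representation that Python 2's `repr()` provided in this situation:

```python
name = "uml\xe4ute"  # a string containing the umlaut from the traceback

# Formatting it into a message is fine, but forcing it to ASCII is not:
try:
    name.encode("ascii")
    raised = False
except UnicodeEncodeError:
    raised = True

# Escaping first (repr() on Python 2, ascii() on Python 3) is always safe:
safe = ascii(name)
```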
diff --git a/doc/bugs/prune_causing_taint_mode_failures.mdwn b/doc/bugs/prune_causing_taint_mode_failures.mdwn
new file mode 100644
index 000000000..5fc1d8b75
--- /dev/null
+++ b/doc/bugs/prune_causing_taint_mode_failures.mdwn
@@ -0,0 +1,35 @@
+Using ikiwiki version 2.5gpa1 (the backport to Debian 3.1), I suddenly started getting the following error when rebuilding the wiki:
+
+<pre>
+successfully generated /home/ikiwiki/cgi-bin/ikiwiki.cgi
+Insecure dependency in rmdir while running with -T switch at /usr/share/perl5/IkiWiki/Render.pm line 242.
+BEGIN failed--compilation aborted at (eval 5) line 130.
+</pre>
+
+I've no idea what's happening (hey, I'm a C programmer), but I've hacked prune() to work around this as follows:
+
+<pre>
+use Scalar::Util qw(tainted);
+
+sub prune ($) {
+ my $file=shift;
+
+ unlink($file);
+ my $dir=dirname($file);
+ if (!tainted($file) && $dir =~ /^(.*)$/) {
+ $dir = $1;
+ }
+ while (rmdir($dir)) {
+ $dir=dirname($dir);
+ if (!tainted($file) && $dir =~ /^(.*)$/) {
+ $dir = $1;
+ }
+ }
+}
+</pre>
+
+> Old versions of perl are known to have bugs with taint checking.
+> I don't really support using ikiwiki with the perl 5.8.4 in debian
+> oldstable, and would recommend upgrading. --[[Joey]]
+
+[[!tag patch done]]
diff --git a/doc/bugs/pruning_is_too_strict.mdwn b/doc/bugs/pruning_is_too_strict.mdwn
new file mode 100644
index 000000000..ee954e4bc
--- /dev/null
+++ b/doc/bugs/pruning_is_too_strict.mdwn
@@ -0,0 +1,12 @@
+ikiwiki compiles my wiki successfully. But the svn post-commit hook it installs doesn't work at all. Instead of rendering the files, it deletes their rendered versions. The reason is that the src directory, /home/.kelli/glasserc/wikiwc, matches the prune regexp, so no files in the wiki get added to @files.
+
+I think the prune regexp would be more useful if it was only used to check the relative path from the src root to a file in the wiki.
+
+> I agree with this feature wish. Here is a _first cut_
+> implementation for this feature.
+>
+> --[[roktas]]
+
+[[bugs/Done]], and sorry it took so long to apply --[[Joey]]
+
+> Thank you! -- Ethan \ No newline at end of file
diff --git a/doc/bugs/quieten_mercurial.mdwn b/doc/bugs/quieten_mercurial.mdwn
new file mode 100644
index 000000000..3fd75ea1b
--- /dev/null
+++ b/doc/bugs/quieten_mercurial.mdwn
@@ -0,0 +1,34 @@
+The mercurial backend does not pass the --quiet option to hg, and it sometimes prints
+messages which are then taken for CGI output, causing errors and general trouble. --Michał
+
+ --- iki/usr/share/perl5/IkiWiki/Rcs/mercurial.pm 2006-12-29 02:48:30.000000000 +0100
+ +++ /usr/share/perl5/IkiWiki/Rcs/mercurial.pm 2007-03-18 22:45:24.000000000 +0100
+ @@ -55,7 +55,7 @@
+ }
+
+ sub rcs_update () {
+ - my @cmdline = ("hg", "-R", "$config{srcdir}", "update");
+ + my @cmdline = ("hg", "-q", "-R", "$config{srcdir}", "update");
+ if (system(@cmdline) != 0) {
+ warn "'@cmdline' failed: $!";
+ }
+ @@ -80,7 +80,7 @@
+
+ $message = possibly_foolish_untaint($message);
+
+ - my @cmdline = ("hg", "-R", "$config{srcdir}", "commit",
+ + my @cmdline = ("hg", "-q", "-R", "$config{srcdir}", "commit",
+ "-m", "$message", "-u", "$user");
+ if (system(@cmdline) != 0) {
+ warn "'@cmdline' failed: $!";
+ @@ -92,7 +92,7 @@
+ sub rcs_add ($) {
+ my ($file) = @_;
+
+ - my @cmdline = ("hg", "-R", "$config{srcdir}", "add", "$file");
+ + my @cmdline = ("hg", "-q", "-R", "$config{srcdir}", "add", "$file");
+ if (system(@cmdline) != 0) {
+ warn "'@cmdline' failed: $!";
+ }
+
+Thanks much for the patch. [[bugs/done]] --[[Joey]]
diff --git a/doc/bugs/raw_html_in-page_and___91____91____33__included__93____93__.mdwn b/doc/bugs/raw_html_in-page_and___91____91____33__included__93____93__.mdwn
new file mode 100644
index 000000000..5860d330a
--- /dev/null
+++ b/doc/bugs/raw_html_in-page_and___91____91____33__included__93____93__.mdwn
@@ -0,0 +1,100 @@
+I'm trying to add a flickr stream thing to my (static) ikiwiki. I've disabled htmlscrubber and enabled rawhtml, and I get many strange errors.
+
+[[!toc ]]
+
+## putting the html right into the markdown index.mdwn
+
+This should work, but the HTML code (listing 1) shows up with a hash in place of the actual content (listing 2)
+
+I have to suspect that replacing html with some hash is a bug.
+
+> Congrats, you're another person to fall afoul of markdown [[!debbug 380212]].
+> The fix is to use Text::Markdown, or markdown 1.0.2 instead of buggy
+> old markdown 1.0.1. --[[Joey]] [[!tag done]]
+
+## inlining raw html
+
+This would be my preferred solution. In index.mdwn:
+
+ \[[!inline pages="flickr.html" rss="no"]]
+
+but this refuses to show any content. Trying to RTFM and adding raw="yes" results in this error:
+
+ uppdaterar wiki..
+ söker av index.mdwn
+ ritar upp index.mdwn
+ private//ikiwiki.setup: Can't call method "param" on an undefined value at /usr/share/perl5/IkiWiki/Plugin/inline.pm line 253.
+ BEGIN failed--compilation aborted at (eval 10) line 63.
+
+
+## current workaround: iframe
+
+I'm no html guru so I put the stuff in an iframe, but that doesn't work, since the links are script-generated and need a target="" attribute in them to load in the right place (replacing ikiwiki page).
+
+Ikiwiki version: 2.44
+
+plugin configuration:
+
+ disable_plugins => [qw{htmlscrubber}],
+ add_plugins => [qw{img map rawhtml toggle template prettydate haiku meta}],
+
+best regards
+ulrik
+
+
+## listing 1
+ <!-- Start of Flickr Badge -->
+ <style type="text/css">
+ #flickr_badge_source_txt {padding:0; font: 11px Arial, Helvetica, Sans serif; color:#666666;}
+ #flickr_badge_icon {display:block !important; margin:0 !important; border: 1px solid rgb(0, 0, 0) !important;}
+ #flickr_icon_td {padding:0 5px 0 0 !important;}
+ .flickr_badge_image {text-align:center !important;}
+ .flickr_badge_image img {border: 1px solid black !important;}
+ #flickr_www {display:none; text-align:left; padding:0 10px 0 10px !important; font: 11px Arial, Helvetica, Sans serif !important; color:#3993ff !important;}
+ #flickr_badge_uber_wrapper a:hover,
+ #flickr_badge_uber_wrapper a:link,
+ #flickr_badge_uber_wrapper a:active,
+ #flickr_badge_uber_wrapper a:visited {text-decoration:none !important; background:inherit !important;color:#6600CC;}
+ #flickr_badge_wrapper {}
+ #flickr_badge_source {padding:0 !important; font: 11px Arial, Helvetica, Sans serif !important; color:#666666 !important;}
+ </style>
+ <table id="flickr_badge_uber_wrapper" cellpadding="0" cellspacing="10" border="0">
+ <tr>
+ <td><a href="http://www.flickr.com" id="flickr_www">www.<strong style="color:#3993ff">flick<span style="color:#ff1c92">r</span></strong>.com</a><table cellpadding="0" cellspacing="2" border="0" id="flickr_badge_wrapper">
+ <tr>
+ <script type="text/javascript" src="http://www.flickr.com/badge_code_v2.gne?show_name=1&count=5&display=latest&size=s&layout=h&source=user&user=23579158%40N05"></script>
+ <td id="flickr_badge_source" valign="center" align="center">
+ <table cellpadding="0" cellspacing="0" border="0"><tr>
+ <td width="10" id="flickr_icon_td"><a href="http://www.flickr.com/photos/23579158@N05/"><img id="flickr_badge_icon" alt="englabenny's items" src="http://farm3.static.flickr.com/2338/buddyicons/23579158@N05.jpg?1211285412#23579158@N05" align="left" width="48" height="48"></a></td>
+ <td id="flickr_badge_source_txt"><nobr>Go to</nobr> <a href="http://www.flickr.com/photos/23579158@N05/">englabenny's photostream</a></td>
+ </tr></table>
+ </td>
+ </tr>
+ </table>
+ </td></tr></table>
+ <!-- End of Flickr Badge -->
+
+
+## listing 2
+ <!-- Start of Flickr Badge -->
+ <style type="text/css">
+ #flickr_badge_source_txt {padding:0; font: 11px Arial, Helvetica, Sans serif; color:#666666;}
+ #flickr_badge_icon {display:block !important; margin:0 !important; border: 1px solid rgb(0, 0, 0) !important;}
+ #flickr_icon_td {padding:0 5px 0 0 !important;}
+ .flickr_badge_image {text-align:center !important;}
+ .flickr_badge_image img {border: 1px solid black !important;}
+ #flickr_www {display:none; text-align:left; padding:0 10px 0 10px !important; font: 11px Arial, Helvetica, Sans serif !important; color:#3993ff !important;}
+ #flickr_badge_uber_wrapper a:hover,
+ #flickr_badge_uber_wrapper a:link,
+ #flickr_badge_uber_wrapper a:active,
+ #flickr_badge_uber_wrapper a:visited {text-decoration:none !important; background:inherit !important;color:#6600CC;}
+ #flickr_badge_wrapper {}
+ #flickr_badge_source {padding:0 !important; font: 11px Arial, Helvetica, Sans serif !important; color:#666666 !important;}
+ </style>
+
+
+ 7383eb73071488c9ef46d617acf3e402
+
+
+ </td></tr></table>
+ <!-- End of Flickr Badge -->
diff --git a/doc/bugs/rebuild_after_changing_the_underlaydir_config_option.mdwn b/doc/bugs/rebuild_after_changing_the_underlaydir_config_option.mdwn
new file mode 100644
index 000000000..8613ef03c
--- /dev/null
+++ b/doc/bugs/rebuild_after_changing_the_underlaydir_config_option.mdwn
@@ -0,0 +1,12 @@
+It seems that rebuild a wiki (`ikiwiki --rebuild`) after changing the `underlaydir` config option doesn't remove the pages coming from the previous underlaydir.
+
+I've noticed this with the debian package version 3.20100102.3~bpo50+1.
+
+Perhaps it is possible to improve this or mention it in the manual page?
+
+--prosper
+
+> --rebuild causes ikiwiki to throw away all its info about what it built
+> before, so it will never clean up pages that have been removed, by any
+> means. Suggest you do a --refresh, possibly followed by a --rebuild
+> if that is really necessary. --[[Joey]]
diff --git a/doc/bugs/recentchanges_escaping.mdwn b/doc/bugs/recentchanges_escaping.mdwn
new file mode 100644
index 000000000..1ad16d198
--- /dev/null
+++ b/doc/bugs/recentchanges_escaping.mdwn
@@ -0,0 +1,5 @@
+When committing a page like this one, with an escaped toc directive in it:
+
+ \[[!toc ]]
+
+The recentchangesdiff comes back with it unescaped. Which can be confusing.
diff --git a/doc/bugs/recentchanges_feed_links.mdwn b/doc/bugs/recentchanges_feed_links.mdwn
new file mode 100644
index 000000000..ef0f9d1c4
--- /dev/null
+++ b/doc/bugs/recentchanges_feed_links.mdwn
@@ -0,0 +1,107 @@
+(Moved from [[plugins/recentchanges/discussion]])
+
+I've just upgraded to ikiwiki 2.50 with the `recentchanges` plugin enabled, and
+figured out that I have to turn on `rss` in `ikiwiki.setup` in order to get the
+recentchanges feed. Now the feed shows up, but the links in the feed go to the
+change pages, e.g. `recentchanges/change_1700.html`. I can see a `recentchanges`
+directory created in the working copy, containing files like `change_1700._change`
+but for some reason they are not getting htmlized and carried over. I can see
+in `recentchanges.pm` that it explicitly registers an `htmlize` hook for the
+`_change` type, but something isn't happening. I also see `return if $type=~/^_/;` in
+`render()` in `Render.pm` so I guess the upshot is I'm not sure how this is
+supposed to work; is there a bug here or just something I overlooked that I need
+to turn on? --Chapman Flack
+
+> It's a (minor) bug that recentchanges optimises away generating the
+> change pages, but that the rss/atom feed still links to them. --[[Joey]]
+
+>> Hmm, ok, what's the intended correct behavior? To really generate the
+>> change pages, or to change the links in the feed to point somewhere else that's
+>> not missing? If you can easily point me to the right neighborhood in the code
+>> I might work on a patch for this. It may be a (minor) bug in the grand scheme
+>> of things, but it does seem pretty goofy if you've just clicked an RSS link. :)
+>> --Chap (p.s. should this be moved to bugs?)
+
+>>> The latter -- I think making the permalink point to
+>>> "recentchanges#someid" will probably work. Probably first by addressing the
+>>> todo about [[todo/ability_to_force_particular_UUIDs_on_blog_posts]],
+>>> and then by just using that new ability in the page. --[[Joey]]
+
+>>>> <del title="Prerequisite done now?">Ah. The prerequisite todo looks like more than I'd like to take on.
+>>>> In the meantime, would it be very involved to change whatever bug now
+>>>> optimizes away the change pages, or to simply have all the links in the
+>>>> feed point to the recentchanges page itself, with no fragment id?
+>>>> Either would be a bit nicer than having broken links in the feed. --Chap</del>
+
+>>>> Does the completion of that todo mean it would be straightforward to get
+>>>> recentchanges working now? Is it just that the recentchanges plugin
+>>>> needs to generate `\[[!meta guid=something]]` into the internal files,
+>>>> and the inline plugin would then generate working links in feeds? How should
+>>>> the guid be constructed? Something based on the rcs revision number? I guess
+>>>> I'm still not completely clear on your vision for how it ought to work. --Chap
+
+>>>> My idea is to use `\[[!meta guid="http://url/recentchanges#rev"]]`, with the
+>>>> `#rev` anchor also included in the change file, and being the rcs's
+>>>> internal revision id. Then the guid is globally unique, and actually
+>>>> links to the change in the recentchanges page. And, when the change
+>>>> has fallen off the page, the link will still go to the recentchanges page.
+>>>>
+>>>> First, need to check that guids in rss and atom feeds can have anchors in
+>>>> them, and that the anchor is treated as part of the guid. (If the guid
+>>>> is interpreted as just "http://url/recentchanges", then it's
+>>>> not a very good guid.) If using an anchor for a guid is a problem,
+>>>> it could instead generate a random uuid, and use `\[[!meta
+>>>> guid="urn:uuid:<foo>" permalink="http://url/recentchanges"]]`
+
+>>>>> I had a quick look into this after fixing the "prerequisite", but got
+>>>>> bogged down in minor details. Anyway, I'd be happy to help.
+>>>>> I think the guid stuff is actually fairly irrelevant, you just need
+>>>>> `\[[!meta permalink]]` (and in fact you're using guid incorrectly, by
+>>>>> expecting it to be treated as a link).
+>>>>>
+>>>>> My advice would be: first, fix the bug as reported, by
+>>>>> using `\[[!meta permalink="http://blah/blah/blah#change-$rev"]]` (starting
+>>>>> anchor names with a number isn't syntactically valid, if I remember
+>>>>> correctly, so do have a prefix like "change-" or "rev-" or something).
+>>>>>
+>>>>> Then, optionally, force the guid too (although it defaults to the permalink
+>>>>> anyway, so this shouldn't actually be necessary).
+>>>>>
+>>>>> Some more explanation of how guids work: it's actually easier to think
+>>>>> about them in Atom terms than in RSS terms, since Atom has a clearer
+>>>>> conceptual model.
+>>>>>
+>>>>> The `\[[!meta permalink]]` becomes the `<link>`
+>>>>> element in Atom, which contains a link that users can follow; if it's not
+>>>>> explicitly given, ikiwiki uses its idea of the page's URL.
+>>>>>
+>>>>> The `\[[!meta guid]]` becomes the `<id>` element in Atom, which contains an
+>>>>> opaque, not-necessarily-resolvable identifier; if it's
+>>>>> not explicitly given, ikiwiki uses the same URL as the `<link>`.
+>>>>>
+>>>>> In RSS the semantics aren't so clear-cut (which is part of why Atom exists!),
+>>>>> but the way ikiwiki interprets them is:
+>>>>>
+>>>>> * `<link>` is the same as in Atom
+>>>>> * if `\[[!meta guid]]` is explicitly given, put it in `<guid permalink="no">`
+>>>>> (the assumption in this case is that it's a UUID or something)
+>>>>> * if `\[[!meta guid]]` is not explicitly given, copy the `<link>` into the `<guid>`
+>>>>>
+>>>>> I believe RSS aggregators (are meant to) compare `<guid>`s as opaque
+>>>>> strings, so using an anchor there should be fine. Atom aggregators are certainly
+>>>>> required to compare `<id>`s as opaque strings.
+>>>>>
+>>>>> --[[smcv]]
+
+>>>>>> Here's my attempt at a [[patch]] for anchor-based change permalinks:
+>>>>>> <http://pastie.org/295016>.
+>>>>>> --[[JasonBlevins]], 2008-10-17
+
+[[JasonBlevins]] nailed it, [[done]] --[[Joey]]
+
+> Thanks for applying the patch (and improving it). There's still one small issue:
+> the old opening div tag still needs to be removed (it's hard to see the removed line
+> with the pastie color scheme).
+> --[[JasonBlevins]], 2008-10-18
+
+>> Thanks, missed that when I had to hand-apply the patch. --[[Joey]]
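The `<link>`/`<id>`/`<guid>` mapping smcv describes above can be sketched as a small helper. This is illustrative only, not ikiwiki's actual code:

```python
def feed_identifiers(page_url, permalink=None, guid=None):
    """Map [[!meta permalink]] / [[!meta guid]] to Atom and RSS fields.

    Atom: <link> is the permalink (or the page URL); <id> defaults to <link>.
    RSS: an explicit guid is emitted with isPermaLink="no"; otherwise the
    <link> is copied into <guid>.
    """
    link = permalink if permalink is not None else page_url
    atom = {"link": link, "id": guid if guid is not None else link}
    if guid is not None:
        rss = {"link": link, "guid": guid, "isPermaLink": False}
    else:
        rss = {"link": link, "guid": link, "isPermaLink": True}
    return atom, rss
```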
diff --git a/doc/bugs/recentchanges_sets_has__95__diffurl__61__1_when_diffurl_is_empty.mdwn b/doc/bugs/recentchanges_sets_has__95__diffurl__61__1_when_diffurl_is_empty.mdwn
new file mode 100644
index 000000000..6c6e24b02
--- /dev/null
+++ b/doc/bugs/recentchanges_sets_has__95__diffurl__61__1_when_diffurl_is_empty.mdwn
@@ -0,0 +1,18 @@
+recentchanges.pm sets the template variable HAS_DIFFURL to 1 based solely on whether or not diffurl is defined. I found that diffurl was defined, but empty. The recentchanges template depends on this for recentchangesdiff to properly function -- diff toggling is dependent on HAS_DIFFURL evaluating to false. Adding a check for a non-zero length diffurl fixed the issue for me. A patch against ikiwiki-3.20121212 is as follows:
+
+ --- a/IkiWiki/Plugin/recentchanges.pm 2013-01-27 20:08:59.000000000 -0800
+ +++ b/IkiWiki/Plugin/recentchanges.pm 2013-01-27 20:08:30.000000000 -0800
+ @@ -181,7 +181,8 @@ sub store ($$$) {
+ else {
+ $_->{link} = pagetitle($_->{page});
+ }
+ - if (defined $_->{diffurl}) {
+ + if (defined $_->{diffurl} &&
+ + length($_->{diffurl}) > 0) {
+ $has_diffurl=1;
+ }
+
+
+(There should be one more line at the bottom with a single space on it...)
+
+> [[applied|done]] --[[Joey]]
diff --git a/doc/bugs/recentchangesdiff_crashes_on_commits_which_remove_a_lot_of_files.mdwn b/doc/bugs/recentchangesdiff_crashes_on_commits_which_remove_a_lot_of_files.mdwn
new file mode 100644
index 000000000..b3578f26a
--- /dev/null
+++ b/doc/bugs/recentchangesdiff_crashes_on_commits_which_remove_a_lot_of_files.mdwn
@@ -0,0 +1,46 @@
+[[plugins/recentchangesdiff]] causes rendering to segfault if a commit removes a lot of contents. I removed close to 400 files, total size of about 950Kb in a single commit and now `ikiwiki` segfaults on refresh and rebuild:
+
+ [...]
+ rendering recentchanges.mdwn
+ [1] 5541 segmentation fault ikiwiki --verbose --setup ikiwiki.setup --refresh
+
+If I disable the plugin, the segfault does not happen, but I have to remove `wc/recentchanges/*` or else it still crashes.
+
+This is reproducible, but I cannot provide the source code.
+
+> Can you provide a sanitised version of the source code? I've tried
+> ikiwiki on some files that are just large, and cannot reproduce any
+> problems, so it must be something in the specific file. (A perl bug is
+> also clearly involved here.) --[[Joey]]
+
+The tarball is at http://scratch.madduck.net/__tmp__recentchanges-segfault.tgz - unpack it in `/tmp` and `chdir()` to /tmp/cdt.taF18912, then run
+
+ ikiwiki --setup ikiwiki.setup
+ # segfaults
+ git checkout HEAD^
+ ikiwiki --setup ikiwiki.setup
+ # segfaults
+ rm -rf wc/recentchanges
+ ikiwiki --setup ikiwiki.setup
+ # works
+
+> I can reproduce it fine with that, thanks, and it's really looking like a
+> pure perl bug, that is triggered by markdown. Here's a simpler test case:
+
+ joey@kodama:/tmp>markdown < f
+ zsh: segmentation fault markdown < f
+
+> Where f is a 6.3 MB file that I
+> extracted from ikiwiki's rendering pipeline.
+
+> It seems to be crashing at markdown line 345, which is a big nasty
+> `s///` statement.
+
+> The good news: markdown version 1.0.2~b8-2 does not trigger this perl bug.
+> I only see it with 1.0.1. (Bad news: Newer versions of markdown are
+> slooooooow, especially on such large files.)
+
+> I'm calling this [[done]] since I've filed [[!debbug 470676]] on perl, and
+> also have modified recentchangesdiff to only show the first 200 lines of
+> diff, which should be enough without bloating the recentchanges into
+> perl-crashing territory. --[[Joey]]
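The mitigation mentioned in the fix (showing only the first 200 lines of diff) amounts to something like the following sketch, not the actual Perl change:

```python
def truncate_diff(diff_text, max_lines=200):
    """Keep only the first max_lines lines of a diff so huge commits
    can't balloon the recentchanges page into markdown-crashing size."""
    lines = diff_text.splitlines()
    if len(lines) <= max_lines:
        return diff_text
    elided = len(lines) - max_lines
    return "\n".join(lines[:max_lines] + ["({} more lines elided)".format(elided)])
```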
diff --git a/doc/bugs/relative_date_weird_results.mdwn b/doc/bugs/relative_date_weird_results.mdwn
new file mode 100644
index 000000000..9f35e47f7
--- /dev/null
+++ b/doc/bugs/relative_date_weird_results.mdwn
@@ -0,0 +1,4 @@
+I just submitted a new bug, and... after clicking "save", my brand new bug page displays, at the bottom: "Last edited 6 hours and 3 minutes ago". Timezone issue, I guess? (Hint: I'm in France) -- [[intrigeri]]
+
+> Yep, it wasn't including a timezone in the machine parseable time.
+> [[done]] --[[Joey]]
diff --git a/doc/bugs/removal_of_transient_pages.mdwn b/doc/bugs/removal_of_transient_pages.mdwn
new file mode 100644
index 000000000..6d0caf42e
--- /dev/null
+++ b/doc/bugs/removal_of_transient_pages.mdwn
@@ -0,0 +1,78 @@
+The remove plugin cannot remove [[todo/transient_pages]].
+
+> this turns out to be harder than
+> I'd hoped, because I don't want to introduce a vulnerability in the
+> non-regular-file detection, so I'd rather defer that. --[[smcv]]
+
+This is particularly a problem for tag pages, and autoindex
+created pages. So both plugins default to not creating transient
+pages, until this is fixed. --[[Joey]]
+
+> I'll try to work out which of the checks are required for security
+> and which are just nice-to-have, but I'd appreciate any pointers
+> you could give. --[[smcv]]
+
+>> I assume by "non-regular file", you are referring to the check
+>> in remove that the file "Must exist on disk, and be a regular file" ?
+>> --[[Joey]]
+
+>>> Yes. It's not entirely clear to me why that's there... --s
+
+>>>> Yeah, 2461ce0de6231bfeea4d98c86806cdbb85683297 doesn't really
+>>>> say, and I tend to assume that when I've written paranoid code
+>>>> it's there for a reason. I think that here the concern was that
+>>>> the file might be in some underlay that the user should not be able
+>>>> to affect by web edits. The `-f` check seems rather redundant,
+>>>> surely if it's in `%pagesources` ikiwiki has already verified it's
+>>>> safe. --[[Joey]]
+
+----
+
+[[!template id=gitbranch branch=smcv/ready/transient-rm author="[[Simon McVittie|smcv]]"]]
+
+Here's a branch. It special-cases the `$transientdir`, but in such a way
+that the special case could easily be extended to other locations where
+deletion should be allowed.
+
+It also changes `IkiWiki::prune()` to optionally stop pruning empty
+parent directories at the point where you'd expect it to (for instance,
+previously it would remove the `$transientdir` itself, if it turns out
+to be empty), and updates callers.
+
+The new `prune` API looks like this:
+
+ IkiWiki::prune("$config{srcdir}/$file", $config{srcdir});
+
+with the second argument optional. I wonder whether it ought to look
+more like `writefile`:
+
+ IkiWiki::prune($config{srcdir}, $file);
+
+although that would be either an incompatible change to internal API
+(forcing all callers to update to 2-argument), or being a bit
+inconsistent between the one- and two-argument forms. Thoughts?
+
+--[[smcv]]
+
+> I've applied the branch as-is, so this bug is [[done]].
+> `prune` is not an exported API so changing it would be ok..
+> I think required 2-argument would be better, but have not checked
+> all the call sites to see if the `$file` is available split out
+> as that would need. --[[Joey]]
+
+[[!template id=gitbranch branch=smcv/ready/prune author="[[Simon McVittie|smcv]]"]]
+
+>> Try this, then? I had to make some changes to `attachment`
+>> to make the split versions available. I suggest reviewing
+>> patch-by-patch.
+
+>>> Branch updated; I'd missed a use of prune in ikiwiki.in itself.
+>>> Unfortunately, this means it does still need to support the
+>>> "undefined top directory" case: there isn't an obvious top
+>>> directory for wrappers. --[[smcv]]
+
+>> I also tried to fix a related bug which I found while testing it:
+>> the special case for renaming held attachments didn't seem to work.
+>> (`smcv/wip/rename-held`.) Unfortunately, it seems that with that
+>> change, the held attachment is committed to the `srcdir` when you
+>> rename it, which doesn't seem to be the intention either? --[[smcv]]
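The two-argument `prune` semantics discussed above (remove a file, then remove now-empty parent directories, but never the given top directory itself) can be sketched as:

```python
import os

def prune(path, top=None):
    """Remove path, then remove now-empty parent directories,
    stopping before deleting the optional top directory itself."""
    os.unlink(path)
    d = os.path.dirname(path)
    while d and d != top:
        try:
            os.rmdir(d)  # raises OSError if non-empty: stop pruning
        except OSError:
            break
        d = os.path.dirname(d)
```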
diff --git a/doc/bugs/remove_orphaned_sparkline-php_from_Suggests.mdwn b/doc/bugs/remove_orphaned_sparkline-php_from_Suggests.mdwn
new file mode 100644
index 000000000..ab08c0b26
--- /dev/null
+++ b/doc/bugs/remove_orphaned_sparkline-php_from_Suggests.mdwn
@@ -0,0 +1,22 @@
+Hi!
+
+How about replacing sparkline-php in Suggests with a better alternative?
+
+I would like to file an RM bug to get it out of the archive. Do you have a better alternative for it? PHP has a lot of them..
+
+Thanks
+
+> sparkline-php is orphaned *in Debian*. Upstream, it has seen activity as
+> recently as 11 months ago.
+>
+> I don't know of a better alternative. I looked at the perl sparkline
+> stuff in CPAN and it was bad enough that the pain of using php from this
+> perl program was a better alternative.
+>
+> Anyway, it works great; maintaining the sparkline-php package in Debian
+> would certainly be much less work than finding some alternative and
+> rewriting the ikiwiki code to use it, *and* packaging that alternative
+> and maintaining it in Debian. So your suggestion doesn't make a lot of
+> sense; Debian should just find a maintainer for sparkline-php. --[[Joey]]
+
+[[done]]
diff --git a/doc/bugs/remove_plugin_and_untracked_files.mdwn b/doc/bugs/remove_plugin_and_untracked_files.mdwn
new file mode 100644
index 000000000..07408c3bc
--- /dev/null
+++ b/doc/bugs/remove_plugin_and_untracked_files.mdwn
@@ -0,0 +1,6 @@
+The [[plugins/remove]] plugin does not report an error if `git rm` fails. (It
+probably doesn't report failures from other VCS backends either.) This can happen for example
+if a page in your source directory is not a tracked file for whatever reason
+(in my case, due to renaming the files and forgetting to commit that change).
+
+ -- [[Jon]]
diff --git a/doc/bugs/removing_pages_with_utf8_characters.mdwn b/doc/bugs/removing_pages_with_utf8_characters.mdwn
new file mode 100644
index 000000000..0d96aa75f
--- /dev/null
+++ b/doc/bugs/removing_pages_with_utf8_characters.mdwn
@@ -0,0 +1,51 @@
+I have a page with the name "umläute". When I try to remove it, ikiwiki says:
+
+Error: ?umläute does not exist
+
+> I'm curious about the '?' in the "?umläute" message. Suggests that the
+> filename starts with another strange character. Can I get a copy of a
+> git repository or tarball containing this file? --[[Joey]]
+
+I wrote the following patch, which seems to work on my machine. I'm running on FreeBSD 6.3-RELEASE with ikiwiki-3.20100102.3 and perl-5.8.9_3.
+
+ --- remove.pm.orig 2009-12-14 23:26:20.000000000 +0100
+ +++ remove.pm 2010-01-18 17:49:39.000000000 +0100
+ @@ -193,6 +193,7 @@
+ # and that the user is allowed to edit(/remove) it.
+ my @files;
+ foreach my $page (@pages) {
+ + $page = Encode::decode_utf8($page);
+ check_canremove($page, $q, $session);
+
+ # This untaint is safe because of the
+
+
+> The problem with this patch is that, in a recent fix to the same
+> plugin, I made `@pages` come from `$form->field("page")`, and
+> that, in turn, is already run through `decode_form_utf8` just above the
+> code you patched. So I need to understand why that is apparently not
+> working for you. (It works fine for me, even when deleting a file named
+> "umläute".) --[[Joey]]
+
+----
+
+> Update, having looked at the file in the src of the wiki that
+> is causing trouble for remove, it is: `uml\303\203\302\244ute.mdwn`
+> And that is not utf-8 encoded, which, represented the same
+> would be: `uml\303\244ute.mdwn`
+>
+> I think it's doubly-utf-8 encoded, which perhaps explains why the above
+> patch works around the problem (since the page name gets doubly-decoded
+> with it). The patch doesn't fix related problems when using remove, etc.
+>
+> Apparently, on apoca's system, perl encodes filenames differently
+> depending on locale settings. On mine, it does not. Ie, this perl
+> program always creates a file named `uml\303\244ute`, no matter
+> whether I run it with LANG="" or LANG="en_US.UTF-8":
+>
+> perl -e 'use IkiWiki; writefile("umläute", "./", "baz")'
+>
+> Remains to be seen if this is due to the older version of perl used
+> there, or perhaps FreeBSD itself. --[[Joey]]
+>
+> Update: Perl 5.10 fixed the problem. --[[Joey]]
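The double encoding Joey describes can be reproduced directly (a sketch in Python; the buggy system did the equivalent in Perl):

```python
# "umläute" encoded once is uml\303\244ute; re-encoding those bytes as if
# they were latin-1 text yields the broken name seen above, uml\303\203\302\244ute.
once = "umläute".encode("utf-8")
twice = once.decode("latin-1").encode("utf-8")
print(once)   # b'uml\xc3\xa4ute'
print(twice)  # b'uml\xc3\x83\xc2\xa4ute'
```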
diff --git a/doc/bugs/rename_fixup_not_attributed_to_author.mdwn b/doc/bugs/rename_fixup_not_attributed_to_author.mdwn
new file mode 100644
index 000000000..bcfafac22
--- /dev/null
+++ b/doc/bugs/rename_fixup_not_attributed_to_author.mdwn
@@ -0,0 +1,12 @@
+When I renamed `todo/transient_in-memory_pages` to [[todo/transient pages]],
+`rename::fixlinks` was meant to blame me for the link-fixing commit, and title it
+`update for rename of %s to %s`. Instead, it blamed Joey for the commit,
+and didn't set a commit message.
+
+(It also committed a pile of recentchanges pages which shouldn't have
+been committed, for which see [[bugs/git_commit_adds_files_that_were_not_tracked]].)
+
+--[[smcv]]
+
+> It was calling `rcs_commit` old-style, and so not passing the session
+> object that is used to get the user's name. [[fixed|done]] --[[Joey]]
diff --git a/doc/bugs/renaming_a_page_destroyed_some_links.mdwn b/doc/bugs/renaming_a_page_destroyed_some_links.mdwn
new file mode 100644
index 000000000..fd7a80bd4
--- /dev/null
+++ b/doc/bugs/renaming_a_page_destroyed_some_links.mdwn
@@ -0,0 +1,12 @@
+When renaming a page here, ikiwiki destroyed unrelated links from unrelated pages. You can see the effect [here](http://mesh.openisp.ca/recentchanges/#diff-dc8dfa96efd3a4d649f571c3aa776f20b3ce0131), or by checking out the git tree (`git://mesh.openisp.ca/
+`) and looking at commit `dc8dfa96efd3a4d649f571c3aa776f20b3ce0131`.
+
+The renamed page was `configuration/bat-hosts` to `configuration/batman/bat-hosts` and the deleted links were `\[[AUR | https://aur.archlinux.org/]]` and `\[[CHANGELOG|http://svn.dd-wrt.com:8000/browser/src/router/batman-adv/CHANGELOG]]`. --[[anarcat]]
+
+> <del>Nevermind that, that commit was unrelated to the rename and probably an operator error.</del> - No, actually, I just reproduced this again - see [another example](http://mesh.openisp.ca/recentchanges/#diff-d67dc2f0fdc149b13122fd6cba887a01c693e949).
+
+>> Looks like these all involve the wacky wikilink form that includes an
+>> external url in the link. Fixed rename code to know about those.
+>> [[done]] --[[Joey]]
+
+>>> Phew!!! Thanks a *lot* for that one, it was really annoying! :) --[[anarcat]]
diff --git a/doc/bugs/resized_img_with_only_width_or_height_breaks_ie.mdwn b/doc/bugs/resized_img_with_only_width_or_height_breaks_ie.mdwn
new file mode 100644
index 000000000..a5a1c6768
--- /dev/null
+++ b/doc/bugs/resized_img_with_only_width_or_height_breaks_ie.mdwn
@@ -0,0 +1,9 @@
+When using the img directive while reducing the size of the image by only specifying either the width ("100x") or height ("x100"), the resulting HTML breaks/confuses IE (at least 8 and 9).
+
+In those cases the img plugin generates HTML with the missing attribute left empty. For example, if the new size is specified as "100x", the resulting HTML will be &lt;img&nbsp;...&nbsp;width="100"&nbsp;height=""/&gt;. When IE encounters such empty attributes, the image is sort of compressed into a one (1!) pixel high (or wide) image, which is **not** what you expected.
+
+If we instead always take the width and height from the resized image and use those values in the img attributes, we make IE happy (and all other renderers as well).
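The fix amounts to always computing both dimensions from the resized image. A sketch of the arithmetic (a hypothetical helper, not the plugin's actual code):

```python
def scaled_size(orig_w, orig_h, spec):
    # spec is "100x" (width only), "x100" (height only), or "100x50".
    # Always return both numbers so the template never emits an empty attribute.
    w_s, _, h_s = spec.partition("x")
    if w_s and h_s:
        return int(w_s), int(h_s)
    if w_s:
        w = int(w_s)
        return w, max(1, round(orig_h * w / orig_w))
    h = int(h_s)
    return max(1, round(orig_w * h / orig_h)), h
```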
+
+A patch (tested and deployed) is sitting waiting in my git repository.
+
+> I've applied your patch. Thanks! [[done]] --[[Joey]]
diff --git a/doc/bugs/rss_feed_cleanup_on_delete.mdwn b/doc/bugs/rss_feed_cleanup_on_delete.mdwn
new file mode 100644
index 000000000..fe0400ff8
--- /dev/null
+++ b/doc/bugs/rss_feed_cleanup_on_delete.mdwn
@@ -0,0 +1,6 @@
+If a page stops inlining anything, its rss feed file will linger around and
+not be deleted.
+
+(The linkmap plugin has the same problem with the png files it creates.)
+
+[[bugs/done]]
diff --git a/doc/bugs/rss_feeds_do_not_use_recommended_encoding_of_entities_for_some_fields.mdwn b/doc/bugs/rss_feeds_do_not_use_recommended_encoding_of_entities_for_some_fields.mdwn
new file mode 100644
index 000000000..0a435cea3
--- /dev/null
+++ b/doc/bugs/rss_feeds_do_not_use_recommended_encoding_of_entities_for_some_fields.mdwn
@@ -0,0 +1,52 @@
+The Atom and RSS templates use `ESCAPE=HTML` in the title elements. However, HTML-escaped characters aren't valid according to <http://feedvalidator.org/>.
+
+Removing `ESCAPE=HTML` works fine, but I haven't checked to see if there are any characters it won't work for.
+
+For Atom, at least, I believe adding `type="xhtml"` to the title element will work. I don't think there's an equivalent for RSS.
+
+> Removing the ESCAPE=HTML will not work, feed validator hates that just as
+> much. It wants rss feeds to use a specific style of escaping that happens
+> to work in some large percentage of all rss consumers. (Most of which are
+> broken).
+> <http://www.rssboard.org/rss-profile#data-types-characterdata>
+> There's also no actual spec about how this should work.
+>
+> This will be a total beast to fix. The current design is very clean in
+> that all (well, nearly all) xml/html escaping is pushed back to the
+> templates. This allows plugins to substitute fields in the templates
+> without worrying about getting escaping right in the plugins -- and a
+> plugin doesn't even know what kind of template is being filled out when
+> it changes a field's value, so it can't do different types of escaping
+> for different templates.
+>
+> The only reasonable approach seems to be extending HTML::Template with an
+> ESCAPE=RSS and using that. Unfortunately its design does not allow doing
+> so without hacking its code in several places. I've contacted its author
+> to see if he'd accept such a patch.
+>
+> (A secondary bug is that using meta title currently results in unnecessary
+> escaping of the title value before it reaches the template. This makes
+> the escaping issues show up much more than they need to, since lots more
+> characters are currently being double-escaped in the rss.)
+>
+> --[[Joey]]
+
+> Update: Ok, I've fixed this for titles, as a special case, but the
+> underlying problem remains for other fields in rss feeds (such as
+> author), so I'm leaving this bug report open. --[[Joey]]
+
+>> I'm curious if there has been any progress on better RSS output?
+>> I've been prototyping a new blog and getting good RSS out of it
+>> seems important as the bulk of my current readers use RSS.
+>> I note, in passing that the "more" plugin doesn't quite do what
+>> I want either - I'd like to pass a full RSS feed of a post and only
+>> have "more" apply to the front page of the blog. Is there a way to do that?
+>> -- [[dtaht]]
+>>
+>>> To be clear, the RSS spec sucks to such an extent that, as far as
+>>> I know, there is no sort of title escaping that will work in all
+>>> RSS consumers. Titles are currently escaped in the way
+>>> that tends to break the fewest according to what I've read.
+>>> If you're unlucky enough to
+>>> have a "&" or "<" in your **name**, then you may still run into
+>>> problems with how that is escaped in rss feeds. --[[Joey]]
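For reference, the escaping style the linked RSS profile recommends for character data uses hexadecimal character references; a sketch (ikiwiki's actual escaping lives in its templates):

```python
def rss_escape(text):
    # Replace "&" first so already-produced references are not re-escaped,
    # then the other characters that are special in XML.
    return (text.replace("&", "&#x26;")
                .replace("<", "&#x3C;")
                .replace(">", "&#x3E;"))

print(rss_escape("a<b & c"))  # a&#x3C;b &#x26; c
```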
diff --git a/doc/bugs/rss_output_relative_links.mdwn b/doc/bugs/rss_output_relative_links.mdwn
new file mode 100644
index 000000000..ff607cbb3
--- /dev/null
+++ b/doc/bugs/rss_output_relative_links.mdwn
@@ -0,0 +1,3 @@
+RSS output contains relative links. For example,
+http://kitenet.net/~joey/blog/index.rss contains a link to
+http://kitenet.net/~joey/blog/../blog.html
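Feed readers fetch entries out of context, so links in a feed should be absolute. A sketch of resolving them against the feed's own URL with Python's `urljoin`:

```python
from urllib.parse import urljoin

feed_url = "http://kitenet.net/~joey/blog/index.rss"
# "../blog.html" is the kind of relative link that leaked into the feed;
# resolving it against the feed URL yields the absolute form.
absolute = urljoin(feed_url, "../blog.html")
print(absolute)  # http://kitenet.net/~joey/blog.html
```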
diff --git a/doc/bugs/rst_fails_on_file_containing_only_a_number.mdwn b/doc/bugs/rst_fails_on_file_containing_only_a_number.mdwn
new file mode 100644
index 000000000..99e46aac9
--- /dev/null
+++ b/doc/bugs/rst_fails_on_file_containing_only_a_number.mdwn
@@ -0,0 +1,29 @@
+If you create a foo.rst containing only a number, such as "11", rendering
+results in the following error being thrown. (Now that I've fixed the error
+throwing code..):
+
+ exceptions.TypeError:coercing to Unicode: need string or buffer, int found
+
+--[[Joey]]
+
+> Does this patch against proxy.py help?
+
+ index 5136b3c..545e226 100755
+ --- a/plugins/proxy.py
+ +++ b/plugins/proxy.py
+ @@ -88,7 +101,7 @@ class _IkiWikiExtPluginXMLRPCHandler(object):
+
+ @staticmethod
+ def _write(out_fd, data):
+ - out_fd.write(data)
+ + out_fd.write(str(data))
+ out_fd.flush()
+
+ @staticmethod
+
+> No, still the same failure. I think it's failing to parse the input data
+> (which perl probably transmitted as an int due to perl internals),
+> not failing to write out its response. --[[Joey]]
+
+> On second thought, this was a bug in ikiwiki, it should be transmitting
+> that as a string. Fixed in external.pm --[[Joey]]
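The mismatch is visible in how XML-RPC marshals the two types; a page whose entire content is "11" has to go over the wire as a string, not an int:

```python
from xmlrpc.client import dumps

# An int and a string marshal to different XML-RPC wire types; the rst
# page body "11" must be transmitted as the latter.
as_int = dumps((11,))
as_str = dumps(("11",))
print("<int>11</int>" in as_int)        # True
print("<string>11</string>" in as_str)  # True
```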
diff --git a/doc/bugs/rst_plugin_hangs_on_utf-8.mdwn b/doc/bugs/rst_plugin_hangs_on_utf-8.mdwn
new file mode 100644
index 000000000..b0f417209
--- /dev/null
+++ b/doc/bugs/rst_plugin_hangs_on_utf-8.mdwn
@@ -0,0 +1,20 @@
+When rendering an rst page with utf-8 characters in (specifically but not
+solely "£"), ikiwiki seems to hang.
+
+Killing with Control-C gives the following traceback:
+
+ Traceback (most recent call last):
+ File "/usr/lib/ikiwiki/plugins/rst", line 97, in ?
+ handler.handle_request()
+ File "/usr/lib/ikiwiki/plugins/rst", line 74, in handle_request
+ ret = rpc_read(processor)
+ File "/usr/lib/ikiwiki/plugins/rst", line 42, in rpc_read
+ line = sys.stdin.readline()
+ KeyboardInterrupt
+
+rst2html on the same file has no problem rendering the file as expected. The
+markdown plugin also has no problems rendering the same file, so I'm guessing
+it's a problem with the XML-RPC interface.
+
+Sorry for the delay, this is now fixed! --[[Joey]]
+[[!tag done]]
diff --git a/doc/bugs/rst_plugin_has_python_hardcode_in_shebang_line.mdwn b/doc/bugs/rst_plugin_has_python_hardcode_in_shebang_line.mdwn
new file mode 100644
index 000000000..a594adc09
--- /dev/null
+++ b/doc/bugs/rst_plugin_has_python_hardcode_in_shebang_line.mdwn
@@ -0,0 +1,15 @@
+Currently the rst plugin uses this shebang line:
+
+ #!/usr/bin/python
+
+The problem is that the rst plugin uses some features (for example, iterator comprehensions) which are unavailable in old versions of Python.
+
+So the rst plugin will not work on a machine which has an old version of python in the system path, even though
+the user has installed a newer version of python elsewhere. For example, I am using ikiwiki with the rst plugin on Mac OS X 10.4, which ships python 2.3, but I have python 2.6 installed at /opt/local/bin/python (via macports).
+
+Thus I suggest changing the shebang line to:
+
+ #!/usr/bin/env python
+
+> [[done]], although the irony of all the perl hashbangs in ikiwiki
+> being hardcoded doesn't escape me. --[[Joey]]
diff --git a/doc/bugs/rst_plugin_traceback_with_SimpleXMLRPCDispatcher_from_pyhton_2.5.mdwn b/doc/bugs/rst_plugin_traceback_with_SimpleXMLRPCDispatcher_from_pyhton_2.5.mdwn
new file mode 100644
index 000000000..9997d383b
--- /dev/null
+++ b/doc/bugs/rst_plugin_traceback_with_SimpleXMLRPCDispatcher_from_pyhton_2.5.mdwn
@@ -0,0 +1,13 @@
+After adding rst to plugins, ikiwiki --setup fails:
+
+ Traceback (most recent call last):
+ File "/usr/lib/ikiwiki/plugins/rst", line 93, in <module>
+ handler = SimpleStdinOutXMLRPCHandler()
+ File "/usr/lib/ikiwiki/plugins/rst", line 65, in __init__
+ SimpleXMLRPCDispatcher.__init__(self)
+ TypeError: __init__() takes exactly 3 arguments (1 given)
+
+This is ikiwiki version 2.40 and
+[SimpleXMLRPCServer.py](http://svn.python.org/view/python/tags/r25/Lib/SimpleXMLRPCServer.py?rev=51918&view=markup) from python-2.5
+
+[[done]]
diff --git a/doc/bugs/rst_tweak.mdwn b/doc/bugs/rst_tweak.mdwn
new file mode 100644
index 000000000..8667a459b
--- /dev/null
+++ b/doc/bugs/rst_tweak.mdwn
@@ -0,0 +1,52 @@
+rst.pm disallows raw HTML input. (It's meant as a security feature.)
+IkiWiki generates HTML in rst files pretty much all the time. As
+such, we should enable raw HTML support. --Ethan
+
+> [[done]], although I did add a news item about it, since it could break
+> the security of certain setups that don't use the htmlscrubber. --[[Joey]]
+
+<pre>
+Index: IkiWiki/Plugin/rst.pm
+===================================================================
+--- IkiWiki/Plugin/rst.pm (revision 3926)
++++ IkiWiki/Plugin/rst.pm (working copy)
+@@ -30,7 +30,7 @@
+ html = publish_string(stdin.read(), writer_name='html',
+ settings_overrides = { 'halt_level': 6,
+ 'file_insertion_enabled': 0,
+- 'raw_enabled': 0 }
++ 'raw_enabled': 1 }
+ );
+ print html[html.find('<body>')+6:html.find('</body>')].strip();
+ ";
+</pre>
+
+----
+
+Does the Perl version of this plugin still exist? There appears to be no "rst.pm" in the current distribution; only the python version remains. --Peter
+
+> No, only the python version exists. It does have `raw_enabled` set.
+> --[[Joey]]
+
+I am sorry, but I am confused. Does this mean that I can use Ikiwiki
+features that translate to HTML in rst files? For example, when I use a
+\[[pagename]]-style link in a rst file, the page generated by Ikiwiki's rst
+plugin says &lt;a href="./../pagename/">pagename&lt;/a> as text. The link
+is expanded correctly, but the result isn't interpreted as HTML. Is that
+what is supposed to happen? --Peter
+
+> `raw_enabled` allows you to use the
+> [raw directive](http://docutils.sourceforge.net/docs/ref/rst/directives.html),
+> but this is not used by ikiwiki for wikilinks or anything else.
+> That's why the [[plugin_page|plugins/rst]] has its note about
+> issues with wikilinks and directives. You'd have to put those inside
+> raw directives yourself to avoid rst escaping their result. --[[Joey]]
+
+You can also create a raw "role" which is at least easier than raw directives.
+
+ .. role:: ikiwiki(raw)
+ :format: html
+
+ :ikiwiki:`\[[WikiLink]]`
+
+A role assigns meaning to interpreted text (for example :acronym:`ABC` or :PEP:`8`). --ulrik [kaizer.se]
diff --git a/doc/bugs/search:___34__link__34___and___34__title__34___fields_are_incorrectly_specified.mdwn b/doc/bugs/search:___34__link__34___and___34__title__34___fields_are_incorrectly_specified.mdwn
new file mode 100644
index 000000000..c088d1eae
--- /dev/null
+++ b/doc/bugs/search:___34__link__34___and___34__title__34___fields_are_incorrectly_specified.mdwn
@@ -0,0 +1,29 @@
+Currently, ikiwiki indexes the "title" and "link" fields of a page
+using the prefix "Z".
+This is incorrect.
+"Z" is for stemmed terms,
+which xapian inserts itself.
+Furthermore, the custom field "LINK" should use the "X" prefix.
+(This is according to the [xapian-omega documentation] [xapian].)
+
+I have a [patch][] that fixes this.
+Once it is applied,
+the wiki should be rebuilt to fix the search index.
+
+What problems does the current behaviour cause?
+Consider the [[tags]] page.
+ikiwiki indexes the term "ZStags" for its title.
+xapian stems this and also indexes "ZZStag".
+(Notice the additional "Z".)
+Now when [searching for "title:tags"] [search],
+xapian stems this and searches for "ZStag",
+and so only finds pages which were indexed by _ikiwiki_ with "ZStag"
+(i.e. those pages with the singular "tag" in the title).
+
+--Gabriel.
+
+[xapian]: http://xapian.org/docs/omega/termprefixes.html
+[patch]: http://www.gmcmanus.org/0001-Use-correct-term-prefixes-when-searching.patch
+[search]: http://ikiwiki.info/ikiwiki.cgi?P=title%3Atags
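The mismatch Gabriel describes can be illustrated with a toy stemmer (real xapian uses a Porter stemmer; `ZS` here stands for the stemmed-title prefix):

```python
def toy_stem(word):
    # crude stand-in for xapian's stemmer: strip a plural "s"
    return word[:-1] if word.endswith("s") else word

indexed = "ZS" + "tags"            # what ikiwiki wrongly put in the index
queried = "ZS" + toy_stem("tags")  # what a "title:tags" search looks up
print(indexed, queried, indexed == queried)  # ZStags ZStag False
```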
+
+[[!tag done]]
diff --git a/doc/bugs/search_creates_configuration_files_many_times_on_rebuild.mdwn b/doc/bugs/search_creates_configuration_files_many_times_on_rebuild.mdwn
new file mode 100644
index 000000000..e933feeca
--- /dev/null
+++ b/doc/bugs/search_creates_configuration_files_many_times_on_rebuild.mdwn
@@ -0,0 +1,9 @@
+Currently, if ikiwiki is rebuilding, search.pm will (wastefully)
+create its configuration files every time it indexes a file.
+
+[This patch](http://www.gmcmanus.org/0001-search-generate-configuration-files-once-only-when.patch)
+ensures the configuration files are created only once.
+
+--Gabriel
+
+> [[Done]] (and fixed your url) --[[Joey]]
diff --git a/doc/bugs/search_for_locale_data_in_the_installed_location.mdwn b/doc/bugs/search_for_locale_data_in_the_installed_location.mdwn
new file mode 100644
index 000000000..08af5fe2c
--- /dev/null
+++ b/doc/bugs/search_for_locale_data_in_the_installed_location.mdwn
@@ -0,0 +1,25 @@
+It seems like gettext only searches for locale information in /usr/share/locale, by default. I installed ikiwiki into /usr/local, therefore the locale information wasn't found. This patch fixes the issue:
+
+ --- a/IkiWiki.pm
+ +++ b/IkiWiki.pm
+ @@ -1057,6 +1057,7 @@ sub gettext {
+ $gettext_obj=undef;
+ return shift;
+ }
+ + $gettext_obj->dir("$installdir/share/locale/");
+ }
+ return $gettext_obj->get(shift);
+ }
+
+[[!tag patch patch/core]]
+-- [[ThomasBleher]]
+
+> According to my testing, this patch makes ikiwiki's localisation fail for
+> `LANG=fr_FR` when everything is installed to the default locations,
+> though `LANG=es_ES` works. I don't understand this behavior, especially
+> since strace shows it successfully opening the file
+> `/usr/share/locale/fr/LC_MESSAGES/ikiwiki.mo`.
+>
+> (Also, it should check that $installdir is set before using it.)
+>
+> --[[Joey]]
diff --git a/doc/bugs/search_plugin_and_CGI_preview.mdwn b/doc/bugs/search_plugin_and_CGI_preview.mdwn
new file mode 100644
index 000000000..2a3d270b7
--- /dev/null
+++ b/doc/bugs/search_plugin_and_CGI_preview.mdwn
@@ -0,0 +1,19 @@
+Text on a page gets indexed when you preview.
+
+I discovered this by using (perhaps this is weird) the Sandbox to
+preview my markup in a file that I was preparing to check in through svn.
+I just deleted the original Sandbox text in the edit form, pasted in my
+file, hit Preview, then cancelled the edit, leaving the Sandbox unchanged.
+
+After that, the Sandbox was one of the search hits for terms in the new
+page, and the Sandbox excerpt in the search results showed text taken
+from the new page, that was never really in the Sandbox page at all.
+
+Clicking Edit and then Preview on the original Sandbox page corrected
+the problem, of course.
+
+Making the indexing only happen on a real commit might also speed the
+Preview up a small amount.
+--Chapman Flack
+
+[[!tag done]]
diff --git a/doc/bugs/search_plugin_finds_no_results_with_xapian_1.2.7.mdwn b/doc/bugs/search_plugin_finds_no_results_with_xapian_1.2.7.mdwn
new file mode 100644
index 000000000..3bc430f68
--- /dev/null
+++ b/doc/bugs/search_plugin_finds_no_results_with_xapian_1.2.7.mdwn
@@ -0,0 +1,14 @@
+I'm using the most recent release of ikiwiki (3.20110905), the Perl shipped with SuSE 11.4 (v5.12.3), and built and installed xapian 1.2.7 from source, as it seems the current stable version that's encouraged for use by xapian.
+
+After enabling the search plugin and pointing ikiwiki to the omega program, rerunning ikiwiki --setup, and attempting a search, all searches return 0 results. No errors are reported by omindex or ikiwiki while producing the indexes in .ikiwiki/xapian/*, and the files appear to contain the indexed data. I don't think it's a problem in indexing.
+
+Running omega by hand in the .ikiwiki/xapian directory, providing queries on the command line, works correctly but again provides no results.
+
+I found that Debian stable is currently shipping 1.2.3, and on a hunch, I built that version, and searching now works fine. This looks like the usage of xapian's query template has changed somewhere between 1.2.3 and 1.2.7. Someone more familiar with xapian's query template language should be able to figure out what needs to be changed more specifically.
+
+> Debian has 1.2.7 now, and I have it installed and searching is working
+> fine with it. --[[Joey]]
+
+> I have this same issue. I tried xapian versions 1.2.5, 1.2.8, and 1.2.13. I will try 1.2.3 and see if installing it fixes this issue. --[[Ramsey]]
+
+> 1.2.3 didn't fix the issue either --[[Ramsey]]
diff --git a/doc/bugs/search_plugin_uses_wrong_css_path.mdwn b/doc/bugs/search_plugin_uses_wrong_css_path.mdwn
new file mode 100644
index 000000000..688d51ee6
--- /dev/null
+++ b/doc/bugs/search_plugin_uses_wrong_css_path.mdwn
@@ -0,0 +1,14 @@
+The search result page uses a wrong path for the css files.
+
+This is because the search plugin provides $config{cgiurl} as
+the basepath to the misc template, so the path to the css
+files ends up being "$config{cgiurl}/local.css". This is
+wrong if $config{cgiurl} is not the same as $config{url}.
+
+Maybe misctemplate() and misc.tmpl should use an additional
+variable which always points to the base of the wiki.
+
+e.g. use "wikibase" for css and favicon and "baseurl" for the &lt;base&gt; tag.
+
+> thanks for pointing this bug out, I've fixed it --[[Joey]].
+[[!tag done]]
diff --git a/doc/bugs/search_template_missing_dep.mdwn b/doc/bugs/search_template_missing_dep.mdwn
new file mode 100644
index 000000000..eebc5926e
--- /dev/null
+++ b/doc/bugs/search_template_missing_dep.mdwn
@@ -0,0 +1,4 @@
+The [[plugins/search]] plugin caches a filled in version of `page.tmpl` for
+omega. This is updated only if missing or on rebuild, so if the template is
+modified otherwise and normal refresh allowed to update the rest of the
+site, this gets missed and a stale template is used. --[[Joey]] [[done]]
diff --git a/doc/bugs/several_entries_in_docs__47__bugs_contain_colons_in_the_filename.mdwn b/doc/bugs/several_entries_in_docs__47__bugs_contain_colons_in_the_filename.mdwn
new file mode 100644
index 000000000..9b65aa2fd
--- /dev/null
+++ b/doc/bugs/several_entries_in_docs__47__bugs_contain_colons_in_the_filename.mdwn
@@ -0,0 +1,15 @@
+I just tried to clone the git repo onto a windows machine to test things out a bit, and it turns out I cannot even successfully check out the code because of those colons. Would a patch changing those to underscores be accepted? --Mithaldu
+
+> Well, this is a difficult thing. Ikiwiki has a configuration setting to
+> prevent it writing filenames with colons, but for backwards compatibility
+> that is not enabled by default. Also nothing would stop people from
+> making commits that added filenames with colons even if it were disabled
+> in ikiwiki. I don't know that trying to work around obscure limitations
+> in OSs that I've never heard of ikiwiki being used on is worth the bother
+> TBH, but have not really made up my mind. --[[Joey]]
+
+>> I'm not trying to run it there. Ikiwiki is way too friggin' weird to try that. I just want to be able to check out the main repo so I can work in a native editor. Right now your core repository is downright hostile to cross-platform development in any way, shape or form. (Just plain splitting the docs from the code would work too.) --Mithaldu
+
+>>> Does(n't) cygwin handle the filename limitation/translations? If so, can you check out via git inside a cygwin environment? — [[Jon]]
+
+>>>> That actually allows me to check things out, but the resulting repo isn't compatible with most of the rest of my system, so it's extremely painful. --Mithaldu
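One portable escape scheme is visible in this very repo's doc filenames, where `/` becomes `__47__` and `"` becomes `__34__`: each problematic character is replaced by its ASCII code, so a colon would become `__58__`. A sketch:

```python
def escape_filename_chars(name, bad=':/"'):
    # Replace each character the target OS can't store in a filename
    # with __<ASCII code>__, in the style of ikiwiki's escaped doc names.
    return "".join(f"__{ord(c)}__" if c in bad else c for c in name)

print(escape_filename_chars('search: "link"'))
# search__58__ __34__link__34__
```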
diff --git a/doc/bugs/shortcut_encoding.mdwn b/doc/bugs/shortcut_encoding.mdwn
new file mode 100644
index 000000000..66fd81023
--- /dev/null
+++ b/doc/bugs/shortcut_encoding.mdwn
@@ -0,0 +1,28 @@
+* The URL is rewritten to
+ <http://cvs.savannah.gnu.org/viewvc/gnumach/ddb%2Fdb%5Fexpr%2Eh?view=log&root=hurd&pathrev=gnumach-1-branch>,
+  which the remote server doesn't like. Mind the escaping of [^A-Za-z0-9].
+  Might this be a problem with the web server?
+
+Also, I'd like to put the shortcut usages into backticks
+-- `[[!iki shortcuts]]` --
+to have them displayed in the usual backtick-formatting.
+That also doesn't work, but this is an already-reported issue, as far as I know.
+
+--[[tschwinge]]
+
+> The encoding of the shortcut text is done so that a shortcut can have
+> spaces in it etc and they're converted into a valid url. As in the
+> example of a shortcut to the wikipedia page for "War of 1812" (although
+> the example puts underscores in, it should also work without them).
+>
+> I suspect that if I dropped the encoding of characters other than space
+> and maybe plus, it would break some shortcuts though. Consider a shortcut
+> used to do a google search for "foo&bar". You want to encode the "&"
+> in that search, otherwise google will search for just foo!
+>
+> It does seem to be partly a web server problem, since savannah's viewvc
+> doesn't decode the escaped characters in the path string.
+>
+> I could add a %S that is not escaped, and leave %s escaped.. --[[Joey]]
+>
+> [[done]]
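The proposed behaviour, sketched: `%s` gets its text URL-escaped (so a search for "foo&bar" stays one search term), while a `%S` placeholder would substitute the text verbatim:

```python
from urllib.parse import quote

def expand_shortcut(template, text):
    # %S: verbatim substitution, for shortcuts that take pre-built paths;
    # %s: URL-escaped substitution, protecting characters like "&".
    if "%S" in template:
        return template.replace("%S", text)
    return template.replace("%s", quote(text, safe=""))

print(expand_shortcut("http://google.com/search?q=%s", "foo&bar"))
# http://google.com/search?q=foo%26bar
```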
diff --git a/doc/bugs/shortcut_plugin_will_not_work_without_shortcuts.mdwn.mdwn b/doc/bugs/shortcut_plugin_will_not_work_without_shortcuts.mdwn.mdwn
new file mode 100644
index 000000000..5cc669106
--- /dev/null
+++ b/doc/bugs/shortcut_plugin_will_not_work_without_shortcuts.mdwn.mdwn
@@ -0,0 +1,33 @@
+On my initial ikiwiki -setup auto.setup, I get the following error:
+
+ shortcut plugin will not work without shortcuts.mdwn
+ /home/turian/utils/etc/ikiwiki/auto.setup: ikiwiki --refresh --setup /home/turian/iki.setup failed at IkiWiki/Setup/Automator.pm line 105.
+
+
+This is using the latest git pull of ikiwiki.
+I am not sure why it is not finding shortcuts.mdwn. -- [[JosephTurian]]
+
+> The error, and the weird paths suggest to me that you
+> have installed ikiwiki in a strange way, and it is failing
+> to find its basewiki underlay. The `$installdir` is
+> hardcoded into IkiWiki.pm at build time, based on the PREFIX
+> setting (see README).
+>
+> If that's not set right, you'll have other problems than just this one,
+> so I suggest you check how you installed ikiwiki.
+>
+> Anyway, I've made the shortcut plugin only warn about this..
+> --[[Joey]]
+
+> > I have `$installdir="/home/turian/utils/"` and the underlay dir is set to
+> > `"$installdir/share/ikiwiki/basewiki"`, which does contain shortcuts.mdwn.
+> > So I am not sure why it is not finding it.
+> > I am grappling with installing ikiwiki in a user account, and would like to get the directories set up correctly.
+> > How can I debug this issue further?
+
+>>>> Why don't you strace it and look at where it's looking for
+>>>> shortcuts.mdwn. --[[Joey]]
+
+>>>>>> Hmm, so changing the PERL5LIB seemed to fix this. [[Done]].
diff --git a/doc/bugs/shortcuts_don__39__t_escape_from_Markdown.mdwn b/doc/bugs/shortcuts_don__39__t_escape_from_Markdown.mdwn
new file mode 100644
index 000000000..022987efb
--- /dev/null
+++ b/doc/bugs/shortcuts_don__39__t_escape_from_Markdown.mdwn
@@ -0,0 +1,7 @@
+Writing [[!wikipedia Low_frequency_oscillation]] causes the word "frequency"
+to show up in italics, since underscores are Markdown for italics. Using
+[[!wikipedia low frequency oscillation]] works in this case, because Wikipedia
+will redirect, but it's hardly clean. Maybe the shortcuts plugin should
+run pagetitle() on the text of its link? --Ethan
+
+> [[bugs/done]] --[[Joey]]
diff --git a/doc/bugs/sidebar_is_obscured_by_recentchanges.mdwn b/doc/bugs/sidebar_is_obscured_by_recentchanges.mdwn
new file mode 100644
index 000000000..6acc13b84
--- /dev/null
+++ b/doc/bugs/sidebar_is_obscured_by_recentchanges.mdwn
@@ -0,0 +1,59 @@
+I've set up a simple sidebar on an otherwise fairly default wiki. The sidebar uses css float:right and sits above most pages quite nicely.
+
+For example, my wiki's [front](http://www.cse.unsw.edu.au/~cs3431/wiki/) and [news](http://www.cse.unsw.edu.au/~cs3431/wiki/news/) pages show the sidebar nicely floating on top of the background. (As a side note, I had to add:
+
+ #sidebar {
+ border: 1px solid;
+ background: white;
+ }
+
+to <code>local.css</code> to get the border and make sure that the RSS feed's grey title didn't show through on the news page.)
+
+> Hmm the background color setting seems like a change it makes sense to make to
+> style.css .. done.
+> --[[Joey]]
+
+Unfortunately, the [recentchanges](http://www.cse.unsw.edu.au/~cs3431/wiki/recentchanges/) page doesn't look so nice - the sidebar appears below the recentchanges list.
+
+I don't understand why the sidebar is appearing below the recentchanges inline, but above the news inline.
+
+> I don't see the problem here in firefox 3. The sidebar is at the top of
+> both pages. However, it might have to do with the recentchanges page
+> itself using floating elements to build up the table-like display. --[[Joey]]
+
+>> I didn't test in firefox. I now have screenshots for both firefox and safari. It is still interesting to compare the layout. The first is quite broken. The second is only a little broken. The third is what I was expecting.
+
+Here is a screenshot of the broken behaviour in Safari:
+
+<img src="http://www.cse.unsw.edu.au/~willu/screenshots/safari-1.png" alt="screenshot of broken behaviour in Safari" width="50%" />
+
+Here is a screenshot of the same thing in FireFox. Notice that while there are no overlaps, there is still a large gap in the layout.
+
+<img src="http://www.cse.unsw.edu.au/~willu/screenshots/firefox-1.png" alt="screenshot of semi-working behaviour in Firefox" width="50%" />
+
+Here is an inline news page (in Safari, but it looks similar in firefox). I was expecting both of the previous layouts to look like this.
+
+<img src="http://www.cse.unsw.edu.au/~willu/screenshots/safari-2.png" alt="screenshot of working behaviour in Safari" width="50%" />
+
+What really surprises me is WHY this looks any different. And when you look at style.css you see that recentchanges and sidebar both use float, whereas normal inline pages do not.
+Note that in the third (working) screenshot, the top bullet point is wrapped. This is because the sidebar is floated.
+
+I think there is:
+
+ * A display bug in safari, and
+ * It would be nice to clean up the way recentchanges are displayed so that there isn't a vertical gap for the sidebar. I'll play with this and see what I can do.
+
+Looked at this a little more. I've found the following. Here is my current local.css:
+
+ div.recentchanges {
+ clear: both;
+ overflow: visible;
+ }
+
+Adding "clear: both;" makes the recentchanges div start below (as in further down the page) the sidebar. This makes safari behave like firefox above (changes the 1st screenshot to look more like the 2nd screenshot).
+
+Adding "overflow: visible;" (or removing "overflow: auto" from style.css) makes the sidebar appear above (as in printed over the top of, not higher up the page) the recentchanges (similar to the third screen shot above). Unfortunately because ".recentchanges .pagelinks" uses "float: right;" it looks strange in other ways. For this reason I use the "clear:both;" as well.
+
+-- [[users/Will]]
+
+>> Looks like [[Joey]] has added `clear:both;` to style.css, so this is [[bugs/done]]. -- [[Will]]
diff --git a/doc/bugs/sidebar_not_updated_in_unedited_subpages.mdwn b/doc/bugs/sidebar_not_updated_in_unedited_subpages.mdwn
new file mode 100644
index 000000000..c3e0ee18c
--- /dev/null
+++ b/doc/bugs/sidebar_not_updated_in_unedited_subpages.mdwn
@@ -0,0 +1,9 @@
+I turned on the sidebar plugin, with global_sidebars on (in the web setup page), created a sidebar page in the root, and edited the sidebar a few times.
+
+I then noticed that all pages in the root had been updated with a sidebar, but no subpages (e.g. a/b). Only after editing a subpage did it get a sidebar. Editing the sidebar itself only updated subpages that already had sidebars; the other subpages had not been refreshed (proven by their unchanged filesystem dates).
+
+After calling ikiwiki --setup on the command line all pages were updated. So this seems to be a difference between web-started --setup and command-line --setup. Or it just doesn't work the first time --setup is called after sidebars are enabled.
+
+
+
+
diff --git a/doc/bugs/sitemap_includes_images_directly.mdwn b/doc/bugs/sitemap_includes_images_directly.mdwn
new file mode 100644
index 000000000..d9d07c65f
--- /dev/null
+++ b/doc/bugs/sitemap_includes_images_directly.mdwn
@@ -0,0 +1,8 @@
+A bug in the plugin [[/plugins/map]]: it displays images inline.
+
+When I tried, it displayed the one image I have in my small wiki inline in the map. Ideally it should link to it, just like it links to pages. [example at my site][uw]. Note that I normally keep images outside, but this time I thought, why not have it all in the same place? (Images are also contextual content that fits its subpage.) --ulrik
+
+[uw]: http://www.student.lu.se/~cif04usv/wiki/sitemap.html
+
+> [[done]] (hope no one was relying on the map inlining their images...)
+> --[[Joey]]
diff --git a/doc/bugs/some_but_not_all_meta_fields_are_stored_escaped.mdwn b/doc/bugs/some_but_not_all_meta_fields_are_stored_escaped.mdwn
new file mode 100644
index 000000000..587771ba4
--- /dev/null
+++ b/doc/bugs/some_but_not_all_meta_fields_are_stored_escaped.mdwn
@@ -0,0 +1,44 @@
+[[!template id=gitbranch branch=smcv/unescaped-meta author="[[Simon_McVittie|smcv]]"]]
+[[!tag patch]]
+(Warning: this branch has not been tested thoroughly.)
+
+While discussing the [[plugins/meta]] plugin on IRC, Joey pointed out that
+it stores most meta fields unescaped, but 'title', 'guid' and 'description'
+are special-cased and stored escaped (with numeric XML/HTML entities). This
+is to avoid emitting markup in the `<title>` of an HTML page, or in an RSS/Atom
+feed, neither of which are subject to the [[plugins/htmlscrubber]].
+
+However, having the meta fields "partially escaped" like this is somewhat
+error-prone. Joey suggested that perhaps everything should be stored
+unescaped, and the escaping should be done on output; this branch
+implements that.
+
+Points of extra subtlety:
+
+* The title given to the [[plugins/search]] plugin was previously HTML;
+ now it's plain text, potentially containing markup characters. I suspect
+ that that's what Xapian wants anyway (which is why I didn't change it),
+ but I could be wrong...
+
+ > AFAICS this, if anything, fixes a bug; xapian definitely expects
+ > unescaped text here. --[[Joey]]
+
+* Page descriptions in the HTML `<head>` were previously double-escaped:
+ the description was stored escaped with numeric entities, then that was
+ output with a second layer of escaping! In this branch, I just emit
+ the page description escaped once, as was presumably the intention.
+
+* It's safe to apply this change to a wiki and neglect to rebuild it
+ (assuming I implemented it correctly!), but until the wiki is rebuilt,
+ titles, descriptions and GUIDs for unchanged pages will appear
+ double-escaped on any page that inlines them in `quick=yes` mode, and
+ is rebuilt for some other reason. The failure mode is too much escaping
+ rather than too little, so it shouldn't be a security problem.
+
+* Reverting this change, if applied, is more dangerous; until the wiki is
+ rebuilt, any titles, descriptions and GUIDs on unchanged pages that
+ contained markup could appear unescaped on any page that inlines them
+ in `quick=yes` mode, and is rebuilt for some other reason. The failure
+ mode here would be too little escaping, i.e. cross-site scripting.
+
+[[!tag done]]
diff --git a/doc/bugs/some_strings_are_not_internationalized.mdwn b/doc/bugs/some_strings_are_not_internationalized.mdwn
new file mode 100644
index 000000000..a1b38257a
--- /dev/null
+++ b/doc/bugs/some_strings_are_not_internationalized.mdwn
@@ -0,0 +1,47 @@
+A lot of strings in ikiwiki are hardcoded rather than looked up in locale resources through gettext. This is bad because it makes ikiwiki difficult to adopt for non-English users.
+
+I mean that, for instance in CGI.pm, a line like:
+
+`my @buttons=("Save Page", "Preview", "Cancel");`
+
+should be written as
+
+`my @buttons=(gettext("Save Page"), gettext("Preview"), gettext("Cancel"));`
+
+> Yes, these need to be fixed. But note that the localised texts come back
+> into ikiwiki and are used in various places, including plugins.
+> Including, possibly, third-party plugins. So localising the buttons would
+> seem to require converting from the translations back into the C locale
+> when the form is posted. --[[Joey]]
+
+>> Wouldn't it be easier to change all the calls to the correct ones (including in plugins)?
+>> For instance in the same file (CGI.pm): `elsif ($form->submitted eq gettext("Save Page")) {`.
+>> That way no conversion to the C locale is needed.
+>> gettext use would just need to be documented (at least in [[plugins/write]]). --[[bbb]]
+
+>>> It would be easy, but it could break third-party plugins that hardcode
+>>> the english strings. It's also probably less efficient to run gettext
+>>> over and over. --[[Joey]]
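+To illustrate the round-trip Joey describes, here is a minimal standalone
+sketch (not ikiwiki code; the translation catalogue is a made-up example
+standing in for Locale::gettext):
+
+    use strict;
+    use warnings;
+
+    # C-locale button labels, as plugins hardcode them.
+    my @buttons = ("Save Page", "Preview", "Cancel");
+
+    # Fake catalogue for the demo; real code would call Locale::gettext.
+    my %catalogue = (
+        "Save Page" => "Enregistrer",
+        "Preview"   => "Apercu",
+        "Cancel"    => "Annuler",
+    );
+    sub gettext { $catalogue{$_[0]} // $_[0] }
+
+    # Reverse map, so a posted translated label can be converted back
+    # to the C-locale string that plugins compare against.
+    my %untranslate = map { gettext($_) => $_ } @buttons;
+
+    my $submitted = "Apercu";                  # value posted by the form
+    my $action    = $untranslate{$submitted};  # back to "Preview"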
+
+In the standard templates some things seem wrong too. For instance, in page.tmpl a line like:
+
+`<li><a href="<TMPL_VAR EDITURL>" rel="nofollow">Edit</a></li>`
+
+should be written as
+
+`<li><a href="<TMPL_VAR EDITURL>" rel="nofollow"><TMPL_VAR EDITURL_TEXT></a></li>`
+
+with EDITURL_TEXT variable initialized in Render.pm through a gettext call.
+
+Am I wrong?
+
+> No, that's not a sane way to localise the templates. The templates can be
+> translated by making a copy and modifying it, or by using a tool to
+> generate .mo files from the templates, and generate translated templates
+> from .po files. (See [[todo/l10n]] for one attempt.) But pushing the
+> localisation of random strings in the templates through the ikiwiki
+> program defeats the purpose of having templates at all. --[[Joey]]
+
+If not, I can spend some time preparing patches for such corrections if it helps.
+
+-- [[/users/bbb]]
diff --git a/doc/bugs/space_in_a___91____91__page_link__93____93___doesn__39__t_make_link.mdwn b/doc/bugs/space_in_a___91____91__page_link__93____93___doesn__39__t_make_link.mdwn
new file mode 100644
index 000000000..39f5d891e
--- /dev/null
+++ b/doc/bugs/space_in_a___91____91__page_link__93____93___doesn__39__t_make_link.mdwn
@@ -0,0 +1,32 @@
+I attempted to make a new webpage by having wiki code with
+ [[!new page]]
+ [newpage]
+
+This was converted to literal:
+ [[!new page]]
+and the correct hyperlink:
+ ?newpage
+
+So when the name has a space, it doesn't let you create a new page. I am using 1.35. Let's see what happens here:
+
+[[!new page]]
+
+A moment later ... same thing ... it is not a link (no question mark to create).
+
+Is this documented? How do I create a page with a space in the filename?
+
+> You use underscores in place of spaces. I've improved the docs a bit.
+> Spaces are not allowed because preprocessor directives look like
+> wikilinks, except that they contain one or more spaces. --[[Joey]]
+
+Examples in various files show spaces within the double brackets.
+
+> I don't know of any that do that, can you either point me to them or fix
+> them in the wiki? Note that examples of preprocessor directives _will_
+> contain spaces. --[[Joey]]
+
+(By the way, the Page Location dropdown above has underscores for spaces, and `__91__`, `__93__` and `__39__` instead of left bracket, right bracket and single quote. When rendered on the final page it will be correct, but in the select option box it looks strange.)
+
+> This is fixed now. --Ethan
+
+>> Calling this [[bugs/done]], all issues seem addressed. --[[Joey]]
diff --git a/doc/bugs/special_characters_in_tag_names_need_manual_escaping.mdwn b/doc/bugs/special_characters_in_tag_names_need_manual_escaping.mdwn
new file mode 100644
index 000000000..4ff6763a3
--- /dev/null
+++ b/doc/bugs/special_characters_in_tag_names_need_manual_escaping.mdwn
@@ -0,0 +1,3 @@
+Having read i18n_characters_in_post_title, I have a page named `St John's` in a file named `St_John__39__s.mdwn`. Regular wikilinks like `\[[St_John's]]` successfully point to that page. However, if I tag a page with `\[[!tag St_John's]]`, that link is shown as pointing to a non-existent page. Modifying the tag to read `\[[!tag St_John__39__s]]` works around the problem.
+
+[[done]] in 1.49 --[[Joey]]
diff --git a/doc/bugs/ssl_certificates_not_checked_with_openid.mdwn b/doc/bugs/ssl_certificates_not_checked_with_openid.mdwn
new file mode 100644
index 000000000..04ece0ae8
--- /dev/null
+++ b/doc/bugs/ssl_certificates_not_checked_with_openid.mdwn
@@ -0,0 +1,85 @@
+As far as I can tell, ikiwiki is not checking the SSL certificate of the remote host when using openid authentication. If so, this would allow for man-in-the-middle type attacks. Alternatively, maybe I am getting myself confused.
+
+Test #1: Enter a URL as openid server that cannot be verified (either because the certificate is self-signed or signed by an unknown CA). I get no SSL errors.
+
+Test #2: Download net\_ssl\_test from a dodgy source (it uses the same Perl SSL library), and test again. It seems to complain (on the same site ikiwiki worked with) when it can't verify the signature. Although there is other breakage with the version I managed to download (e.g. argument parsing is broken; also, if I try to connect to a proxy server, it instructs the proxy server to connect to itself for some weird reason).
+
+For now, I want to try and resolve the issues with net\_ssl\_test, and run more tests. However, in the meantime, I thought I would document the issue here.
+
+-- Brian May
+
+> Openid's security model does not rely on the openid consumer (ie,
+> ikiwiki) performing any sanity checking of the openid server. All the
+> security authentication goes on between your web browser and the openid
+> server. This may involve ssl, or not.
+>
+>> Note that I'm not an openid expert, and the above may need to be taken
+>> with a grain of salt. I also can make no general statements about openid
+>> being secure. ;-) --[[Joey]]
+>
+> For example, my openid is "http://joey.kitenet.net/". If I log in with
+> this openid, ikiwiki connects to that http url to determine what openid
+> server it uses, and then redirects my browser to the server
+> (https://www.myopenid.com/server), which validates the user and redirects
+> the browser back to ikiwiki with a flag set indicating that the openid
+> was validated. At no point does ikiwiki need to verify that the https url
+> is good.
+> --[[Joey]]
+
+>> Ok, so I guess the worst that could happen when ikiwiki talks to the http
+>> address is that it gets intercepted, and ikiwiki gets the wrong address.
+>> ikiwiki will then redirect the browser to the wrong address. An attacker could
+>> trick ikiwiki to redirect to their site which always validates the user
+>> and then redirects back to ikiwiki. The legitimate user may not even notice.
+>> That doesn't seem so secure to me...
+
+>> All the attacker needs is access to the network somewhere between ikiwiki
+>> and http://joey.kitenet.net/ or the ability to inject false DNS host names
+>> for use by ikiwiki and the rest is simple.
+
+>> -- Brian May
+
+>>> I guess that the place to add SSL cert checking would be in either
+>>> [[!cpan LWPx::ParanoidAgent]] or [[!cpan Net::OpenID::Consumer]]. Adding
+>>> it to ikiwiki itself, which is just a user of those libraries, doesn't
+>>> seem right.
+>>>
+>>> It's not particularly clear to me how a SSL cert can usefully be
+>>> checked at this level, where there is no way to do anything but
+>>> succeed, or fail; and where the extent of the check that can be done is
+>>> that the SSL cert is issued by a trusted party and matches the domain name
+>>> of the site being connected to. I also don't personally think that SSL
+>>> certs are the right fix for DNS poisoning issues. --[[Joey]]
+
+I was a bit vague myself on the details on openid. So I looked up the standard.
+I was surprised to note that they have already considered these issues, in
+section 15.1.2, <http://openid.net/specs/openid-authentication-2_0.html#anchor41>.
+
+It says:
+
+"Using SSL with certificates signed by a trusted authority prevents these kinds of
+attacks by verifying the results of the DNS look-up against the certificate. Once
+the validity of the certificate has been established, tampering is not possible.
+Impersonating an SSL server requires forging or stealing a certificate, which is
+significantly harder than the network based attacks."
+
+With regards to implementation, I am surprised that the libraries don't seem to
+do this checking already, and by default. Unfortunately, I am not sure how to test
+this adequately; see [[!debbug 466055]]. -- Brian May
+
+---
+
+I think [[!cpan Crypt::SSLeay]] already supports checking the certificate. The trick
+is to get [[!cpan LWP::UserAgent]], which is used by [[!cpan LWPx::ParanoidAgent]] to
+enable this checking.
+
+I think the trick is to set one of the following environment variables before
+retrieving the data:
+
+    $ENV{HTTPS_CA_DIR} = "/etc/ssl/certs/";
+    $ENV{HTTPS_CA_FILE} = "/etc/ssl/certs/file.pem";
+
+Unfortunately I get weird results if the certificate verification fails, see [[!debbug 503440]].
+It still seems to work though, regardless.
+
+-- Brian May
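+For reference, the environment-variable approach sketched above would look
+something like this in Perl (paths are Debian-style examples; Crypt::SSLeay
+consults these variables when LWP makes an https request):
+
+    use strict;
+    use warnings;
+
+    # Tell Crypt::SSLeay where the trusted CA certificates live; set
+    # exactly one of these before LWP makes its first https request.
+    $ENV{HTTPS_CA_DIR} = "/etc/ssl/certs/";
+    # or, to trust a single bundle file instead:
+    # $ENV{HTTPS_CA_FILE} = "/etc/ssl/certs/ca-certificates.crt";
+
+    # Then fetch as usual, e.g. with LWP::UserAgent; verification
+    # failures should now surface as request errors.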
diff --git a/doc/bugs/strange_hook_id_in_skeleton.pm.mdwn b/doc/bugs/strange_hook_id_in_skeleton.pm.mdwn
new file mode 100644
index 000000000..5e96acf60
--- /dev/null
+++ b/doc/bugs/strange_hook_id_in_skeleton.pm.mdwn
@@ -0,0 +1,5 @@
+ hook(type => "savestate", id => "savestate", call => \&savestate);
+
+Shouldn't that id be "skeleton", like all the other ids? --Ethan
+
+[[done]]
diff --git a/doc/bugs/stray___60____47__p__62___tags.mdwn b/doc/bugs/stray___60____47__p__62___tags.mdwn
new file mode 100644
index 000000000..99d6fe09f
--- /dev/null
+++ b/doc/bugs/stray___60____47__p__62___tags.mdwn
@@ -0,0 +1,17 @@
+When using the [[plugins/htmltidy]] plugin (and possibly in other circumstances), ikiwiki sometimes creates more `</p>` tags than `<p>` tags, causing unbalanced markup. I've previously noticed unbalanced tags when a `\[[!map]]` matches no pages. This is part of the reason I developed [[plugins/htmlbalance]].
+
+This is particularly noticeable if htmltidy is enabled when building the docwiki: on the 'contrib' plugin pages, the title becomes `foo </p> (third-party plugin)` (with the angle-brackets escaped - it seems the text gets sanitized but is then escaped anyway).
+
+I believe that this snippet in `IkiWiki.pm` might be the reason for the imbalance:
+
+ if ($oneline) {
+ # hack to get rid of enclosing junk added by markdown
+ # and other htmlizers
+ $content=~s/^<p>//i;
+ $content=~s/<\/p>$//i;
+ chomp $content;
+ }
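+For example, here is a quick standalone demonstration (not ikiwiki code) of
+how that hack leaves a stray `</p>` when the htmlized content doesn't end
+with a paragraph:
+
+    use strict;
+    use warnings;
+
+    # What an htmlizer might produce for a "one line" value that
+    # contains more than a single paragraph:
+    my $content = "<p>foo</p>\n<ul><li>bar</li></ul>\n";
+
+    $content =~ s/^<p>//i;      # strips the opening <p>
+    $content =~ s/<\/p>$//i;    # no match: the content ends with </ul>
+    chomp $content;
+
+    # $content is now "foo</p>\n<ul><li>bar</li></ul>": one more </p>
+    # than <p>, exactly the imbalance described above.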
+
+The fact that HTML in a `\[[!meta title]]` is added but then escaped might indicate that some other bug is involved.
+
+> [[done]] --[[Joey]]
diff --git a/doc/bugs/support_for_openid2_logins.mdwn b/doc/bugs/support_for_openid2_logins.mdwn
new file mode 100644
index 000000000..a71ed7ba9
--- /dev/null
+++ b/doc/bugs/support_for_openid2_logins.mdwn
@@ -0,0 +1,24 @@
+I have received several complaints from users who cannot contribute to my ikiwiki instances because they only have OpenID logins that support OpenID 2. E.g. Yahoo!'s OpenID only supports 2.0+.
+
+This is not the fault of ikiwiki; the problem lies within the [perl openid consumer](http://packages.qa.debian.org/libn/libnet-openid-consumer-perl.html) in Debian, which is a 1.x implementation AFAIK.
+
+I've contacted JanRain who have pointed me to:
+
+* [OpenID4Perl](http://code.sxip.com/openid4perl/)
+* Some [work](http://code.sixapart.com/svn/openid/trunk/perl/) by David Recordon
+
+However, neither of these Perl OpenID 2.x implementations has been released, and both are incomplete. :(
+
+> Both of the projects referenced above have since been released.
+> Net::OpenID::Consumer 0.x in Debian is indeed only an OpenID 1
+> implementation. However, Net::OpenID::Consumer 1.x claims to be
+> an OpenID 2 implementation (it's the second of the projects
+> above). I've filed a bug in Debian asking for the package to be
+> updated. --[[smcv]]
+
+> Net::OpenID::Consumer 1.x is now in Debian unstable --[[dom]]
+
+> I've tested with yahoo, and it works with the updated module. Sweet and
+> [[done]] --[[Joey]]
+
+A quick fix for the impatient running stable is simply `sudo apt-get install libnet-openid-consumer-perl -t unstable`.
diff --git a/doc/bugs/svn+ssh_commit_fail.mdwn b/doc/bugs/svn+ssh_commit_fail.mdwn
new file mode 100644
index 000000000..b58f43721
--- /dev/null
+++ b/doc/bugs/svn+ssh_commit_fail.mdwn
@@ -0,0 +1,5 @@
+If I try to do a web commit, to a svn+ssh repo, it fails with
+"Host key verification failed."
+I think that the setuid isn't fully taking; it should be running as me,
+but commit log shows www-data. So maybe it has the wrong username? Or
+EUID/Real UID screwage. [[bugs/done]]
diff --git a/doc/bugs/svn-commit-hanging.mdwn b/doc/bugs/svn-commit-hanging.mdwn
new file mode 100644
index 000000000..e5c5dde14
--- /dev/null
+++ b/doc/bugs/svn-commit-hanging.mdwn
@@ -0,0 +1,7 @@
+When I'm committing a page via the Web, Ikiwiki hangs at the `svn commit` stage. I'm sure that this is a result of all my local modifications, so there's no need to consider this a bug per se, but I'd be interested in hearing whether anybody else has seen this. -- [[Ben]]
+
+What kinds of local modifications are those? --[[Joey]]
+
+Many. ;-) I have the usedirs patch applied, about 10 minor patches to customise things that weren't otherwise customisable, and a custom SVN hook. As you would expect, the problem was the hook (which I'd forgotten about)... Pretend I was never here. ;-) -- [[Ben]]
+
+[[bugs/done]] ;-) --[[Joey]]
diff --git a/doc/bugs/svn_commit_failures_interpreted_as_merge_conflicts.mdwn b/doc/bugs/svn_commit_failures_interpreted_as_merge_conflicts.mdwn
new file mode 100644
index 000000000..0c9bce4b9
--- /dev/null
+++ b/doc/bugs/svn_commit_failures_interpreted_as_merge_conflicts.mdwn
@@ -0,0 +1,21 @@
+I'm attempting a merge with the SVN plugin via the web interface
+with ikiwiki-3.20100403 and subversion 1.6.11.
+
+The web interface says
+
+ Your changes conflict with other changes made to the page.
+
+ Conflict markers have been inserted into the page content. Reconcile the conflict and commit again to save your changes.
+
+However there are no merge conflict markers in the page. My apache error log says:
+
+ [Fri Apr 30 16:43:57 2010] [error] [client 10.64.64.42] svn: Commit failed (details follow):, referer: https://unixwiki.ncl.ac.uk/ikiwiki.cgi
+ [Fri Apr 30 16:43:57 2010] [error] [client 10.64.64.42] svn: Authorization failed, referer: https://unixwiki.ncl.ac.uk/ikiwiki.cgi
+
+-- [[Jon]]
+
+> Only way for this to be improved would be for the svn plugin to
+> explicitly check the file for conflict markers. I guess it could
+> change the error message then, but the actual behavior of putting the
+> changed file back in the editor so the user can recommit is about right
+> as far as error recovery goes. --[[Joey]]
diff --git a/doc/bugs/svn_fails_to_update.mdwn b/doc/bugs/svn_fails_to_update.mdwn
new file mode 100644
index 000000000..6ed839cf6
--- /dev/null
+++ b/doc/bugs/svn_fails_to_update.mdwn
@@ -0,0 +1,89 @@
+In poking around in the svn backend I found that the svn post-commit
+hook's calls to svn update fail regularly with an error code of 256.
+Apparently svn update can't run during the post-commit hook because the
+working copy is locked by the commit. Since the post-commit hook doesn't send
+errors anywhere and svn update runs with --quiet anyhow, this error
+isn't usually visible, but on my system:
+
+ ethan@sundance:~/tests/webtemplates/ikiwiki3/wc$ svn commit -m "Blah.."
+ Sending index.mdwn
+ Transmitting file data .
+ Committed revision 3.
+
+ #verifying output was created
+ ethan@sundance:~/tests/webtemplates/ikiwiki3/wc$ less ../dest/index.html
+
+ ethan@sundance:~/tests/webtemplates/ikiwiki3/wc$ svn info
+ Path: .
+ URL: file:///home/ethan/tests/webtemplates/ikiwiki3/svn/trunk
+ Repository Root: file:///home/ethan/tests/webtemplates/ikiwiki3/svn
+ Repository UUID: f42bb0d6-3c1e-0410-b2d4-aeaad48dd6c4
+ Revision: 2
+ Node Kind: directory
+ Schedule: normal
+ Last Changed Author: ethan
+ Last Changed Rev: 2
+ Last Changed Date: 2006-09-24 21:15:55 -0400 (Sun, 24 Sep 2006)
+
+A sample error message (obtained through file redirection) is:
+
+ svn: Working copy '.' locked
+ svn: run 'svn cleanup' to remove locks (type 'svn help cleanup' for details)
+
+Did I do something stupid again or is this the case on your system too?
+--Ethan
+
+Additional note: this doesn't happen when performing svn commits from another wc,
+but *does* happen when committing from the web.
+--Ethan
+
+> Yeah, this makes sense now that you bring it up. Perhaps I should make
+> ikiwiki skip the update when called from the post-commit hook if the repo
+> is locked, although this could mask other problems.. --[[Joey]]
+
+>> I don't think it's (yet) a serious problem, because any commit to the repo
+>> either comes from another WC, in which case, no problem, or it is committed by
+>> ikiwiki through its own WC, in which case that WC is "the newest". The only problem
+>> is that ikiwiki's rcs information for web commits gets screwed up. I think the
+>> correct fix is to call rcs_update from rcs_commit in svn.pm, if
+>> the commit succeeds. I'm not sure whether this ought to happen for all RCSes
+>> or just svn. --Ethan
+
+>>> You say that the rcs information for web commits is screwed up .. how?
+>>> Does this affect something that I'm not seeing? --[[Joey]]
+
+I just meant that when you call ikiwiki.cgi?do=edit, it gets the
+"current" RCS revision, and uses that in the merges later if there
+are other edits in the meantime. So I guess if you have a file a.mdwn,
+and at revision X it contains the list:
+
+ a
+ b
+ c
+ d
+
+And then one user edits it by removing "c" from web, and
+then starts editing it again, ikiwiki.cgi will think the edit "started"
+at revision X (although it's really X+1). So if another user edits via
+web in the meantime, the subsequent merge will try to remove "c" again.
+To be honest I don't know what will happen in this case (svn merge fails?
+conflict markers?), but I'm pretty sure it's a problem. Anyhow, I think we
+should call update manually after commit, I just don't know if this should
+be RCS-specific, or whether it's safe to update after commit on all RCSes.
+--Ethan
+
+Hmm, turns out that isn't the case! svn's prepedit function calls svn info
+which gets the "right" information even when the WC isn't current. I am
+having problems merging but that probably has nothing to do with this bug.
+[This patch](http://ikidev.betacantrips.com/patches/update.patch) calls
+rcs_update after commit in CGI.pm, it might be a good idea anyhow. --Ethan
+
+> Ok, I follow you. I am unsure whether this problem affects other rcses
+> besides svn. Depends on how they handle locking, etc. But calling
+> rcs_update will always be safe, so I'll do that. [[bugs/done]]
+>
+> That still leaves the issue that it calls svn update in the post-commit
+> hook when it's locked and fails with that error message. Granted svn does
+> throw that away by default, but it's still ugly and wasteful. But
+> checking for a lock first is even uglier (and racy) and more wasteful,
+> so I don't see a fix.. --[[Joey]]
diff --git a/doc/bugs/svn_post-commit_wrapper_can__39__t_find_IkiWiki.pm_if_not_installed.mdwn b/doc/bugs/svn_post-commit_wrapper_can__39__t_find_IkiWiki.pm_if_not_installed.mdwn
new file mode 100644
index 000000000..49e956a23
--- /dev/null
+++ b/doc/bugs/svn_post-commit_wrapper_can__39__t_find_IkiWiki.pm_if_not_installed.mdwn
@@ -0,0 +1,22 @@
+If you're using ikiwiki without installing it, the svn post-commit wrapper will die (in a difficult-to-debug way) when it tries to execute ikiwiki.pl because it can't find IkiWiki.pm.
+
+I'm not sure how to fix this in a secure way. For now I'm just changing `use lib '.'` in `ikiwiki.pl` to point to the hard-coded directory where ikiwiki was unpacked.
+
+> This workaround doesn't work here. "`./ikiwiki.pl --setup ikiwiki.setup`" is ok, but the
+> wrappers fail in action. Using "`FindBin`" seems a solution. Here is a (kinda ugly)
+> [patch](http://git.kirkambar.net/?p=ikiwiki.git;a=commitdiff;h=44511c00b98b3efedd4d31f15ea928fcf221401e)
+> which also allows you to use `basewiki` + `templates` in the source directory. The patched
+> version works fine in my [homepage](http://kirkambar.net). --[[Roktas]]
+
+New versions of ikiwiki support installation to nonstandard paths, just set
+PREFIX to the path when running Makefile.PL, and it will set up ikiwiki to
+look in the place it installed the libraries for its perl libraries, etc.
+
+I don't understand why the wrappers would fail if it were configured like
+that. --[[Joey]]
+
+> I didn't install it, which was the problem. I'm running ikiwiki from a
+> hosting account somewhere so I didn't even try. You're right, it works
+> fine if you actually follow the directions. :) --Ethan
+
+Ok, well, I'll mark this [[bugs/done]] then. --[[Joey]]
diff --git a/doc/bugs/syntax_error_in_aggregate.mdwn b/doc/bugs/syntax_error_in_aggregate.mdwn
new file mode 100644
index 000000000..1e69e7fab
--- /dev/null
+++ b/doc/bugs/syntax_error_in_aggregate.mdwn
@@ -0,0 +1,11 @@
+Current git :
+
+ $ perl -c IkiWiki/Plugin/aggregate.pm
+ syntax error at IkiWiki/Plugin/aggregate.pm line 427, near "24;"
+ IkiWiki/Plugin/aggregate.pm had compilation errors.
+
+This prevents a Debian package build (due to `t/syntax.t`).
+
+Not knowing the units being used, I don't know where to add the missing parenthesis.
+
+[[done]]
diff --git a/doc/bugs/table_external_file_links.mdwn b/doc/bugs/table_external_file_links.mdwn
new file mode 100644
index 000000000..7b35383c5
--- /dev/null
+++ b/doc/bugs/table_external_file_links.mdwn
@@ -0,0 +1,9 @@
+If wikilinks are put in an external table file, those links are not seen at
+scan time, and so ikiwiki does not know to update the page containing the
+table when the pages the links point to change (are added, removed, etc).
+
+There seem only two solutions to that bug -- either really make wikilinks
+in an external table file not work (probably by escaping them),
+or run the preprocess code also in scan (expensive!). --[[Joey]]
+
+[[done]]
diff --git a/doc/bugs/table_plugin_does_not_handle___92__r__92__n_lines_in_CSV_files.mdwn b/doc/bugs/table_plugin_does_not_handle___92__r__92__n_lines_in_CSV_files.mdwn
new file mode 100644
index 000000000..001407ab8
--- /dev/null
+++ b/doc/bugs/table_plugin_does_not_handle___92__r__92__n_lines_in_CSV_files.mdwn
@@ -0,0 +1,5 @@
+The table plugin seems to be unable to read a CSV file that uses \r\n for line delimiters.
+The same file with \r works fine. The error message is "Empty data".
+--liw
+
+I was seeing this as well on an Ubuntu 11.04 system with Perl 5.10.1, IkiWiki 3.20110124ubuntu1, and libtext-csv-perl 1.21-1, all installed from APT. However, when I removed libtext-csv-perl from APT and installed it from CPAN, the problem went away. FWIW, what CPAN grabbed was MAKAMAKA/Text-CSV-1.21.tar.gz. --micahrl
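+In the meantime, one possible workaround (my own sketch, not part of ikiwiki)
+is to normalize the line endings before the data reaches the parser. Note that
+the naive `split` below doesn't handle quoted fields; it only illustrates the
+normalization step:
+
+    use strict;
+    use warnings;
+
+    my $csv_data = "a,b,c\r\nd,e,f\r\n";   # CRLF-delimited input
+
+    # Convert CRLF (and bare CR) line endings to LF.
+    $csv_data =~ s/\r\n?/\n/g;
+
+    my @rows = map { [ split /,/ ] } split /\n/, $csv_data;
+    # @rows now holds two rows: [a,b,c] and [d,e,f]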
diff --git a/doc/bugs/tag_behavior_changes_introduced_by_typed_link_feature.mdwn b/doc/bugs/tag_behavior_changes_introduced_by_typed_link_feature.mdwn
new file mode 100644
index 000000000..ed93a2eb7
--- /dev/null
+++ b/doc/bugs/tag_behavior_changes_introduced_by_typed_link_feature.mdwn
@@ -0,0 +1,16 @@
+The use of typed links for tags and some of the consequent changes
+introduced some unwanted functionality variations in the tag system. Two
+problems in particular could be observed, when compared to the use of
+tags in older versions of IkiWiki:
+
+* tags in feeds (both rss and atom) would use the file path as their
+ name (e.g. you would have `<category term="tags/sometag" />` in an atom
+ item for a page tagged sometag with a tagbase of tags), whereas they
+ appeared pure before
+* tags containing a slash character would appear without the slash
+ character but be used with the slash character in other circumstances
+ (effect visible by tagging a page with a name such as "with/slash")
+
+I've written a [[patch]] to fix these issues by introducing a `tagname()` function that is the inverse of `taglink()`; it's available [[here|http://sprunge.us/SHRj]] as well as on my [[git|http://git.oblomov.eu/ikiwiki]].
+
+> [[Applied|done]], with some regexp improvements. --[[Joey]]
diff --git a/doc/bugs/tag_plugin:_autotag_vs._staged_changes.mdwn b/doc/bugs/tag_plugin:_autotag_vs._staged_changes.mdwn
new file mode 100644
index 000000000..e5526bedf
--- /dev/null
+++ b/doc/bugs/tag_plugin:_autotag_vs._staged_changes.mdwn
@@ -0,0 +1,17 @@
+The autotag functionality of the tag plugin committed (when doing its
+first commit) all changes that have been staged (in Git). I suggest it
+should be restricted to the specific file only. --[[tschwinge]]
+
+> This is not specific to the tag plugin. Same can happen
+> if you rename a file, or post a comment, or remove a file
+> via web interface. All of these use `rcs_commit_staged`.
+>
+> This is why ikiwiki is supposed to have a checkout of
+> the repository that it uses for its own purposes, and nobody else
+> should mess with. There are various notes about that being needed here
+> and there; you're free to not give ikiwiki its own repo, but you have to
+> be aware that it can fight with you if you're making changes to the same
+> repo. [[done]] --[[Joey]]
+
+>> Ack, that is reasonable. (And it's only been a very minor problem
+>> during manual testing.) --[[tschwinge]]
diff --git a/doc/bugs/tagged__40____41___matching_wikilinks.mdwn b/doc/bugs/tagged__40____41___matching_wikilinks.mdwn
new file mode 100644
index 000000000..a211654f1
--- /dev/null
+++ b/doc/bugs/tagged__40____41___matching_wikilinks.mdwn
@@ -0,0 +1,35 @@
+It may be that I'm simply misunderstanding something, but what is the rationale
+for having `tagged()` also match normal wikilinks?
+
+> It simply hasn't been implemented yet -- see the answer in
+> [[todo/tag_pagespec_function]]. Tags and wikilinks share the same
+> underlying implementation, although a reasonable expectation is that
+> they are kept separate. --Ivan Z.
+
+Consider the following situation. I have `tagbase => 'tag'`. On some pages, scattered
+over the whole wiki, I use `\[[!tag open_issue_gdb]]` to declare that this page
+contains information about an open issue with GDB. Then, I have a page
+`/tag/open_issues_gdb.mdwn` that essentially contains `\[[!map
+pages="tagged(open_issue_gdb)"]]`. So far, so good: this page indeed does list
+all pages that are tagged like this. But now, when I add in `/gdb.mdwn` a link
+to this page, like `\[[Open Issues|tag/open_issue_gdb]]`, then `/gdb.mdwn`
+itself shows up in the map on `tag/open_issues_gdb.mdwn`. In my understanding
+this is due to the wikilink being equal to a `\[[!tag ...]]`. What's the
+rationale on this, or what am I doing wrong, and how to achieve what I want?
+
+--[[tschwinge]]
+
+> What you are doing "wrong" is putting non-tag pages (i.e.
+> `/tag/open_issues_gdb.mdwn`) under your tagbase. The rationale for
+> implementing tag as it has been, I think, is one of simplicity and
+> conciseness. -- [[Jon]]
+
+>> No, he has no pages under tagbase that aren't tags. This bug
+>> is valid. [[todo/matching_different_kinds_of_links]] is probably
+>> how it will eventually be solved. --[[Joey]]
+
+>>> [[Done]]: `tagged` no longer matches other wikilinks. --[[smcv]]
+
+> And this is an illustration of why a clean workaround (without changing the software) is not possible: while thinking about [[todo/matching_different_kinds_of_links]], I thought one could work around the problem by simply including the kind of the relation explicitly in the link target (like the tagbase in tags), and by having a separate page without the "tagbase" to link to when one simply wants to refer to the tag without tagging. But this won't work: one has to refer to the real tag page at least once if one wants to talk about it, and this reference will count as tagging (unwanted). --Ivan Z.
+
+> But well, perhaps there is a workaround without introducing different kinds of links. One could modify the [[tag plugin|plugins/tag]] so that it adds two links to a page: one for tagging -- `tagbase/TAG` -- and one for navigation -- `tagdescription/TAG` (displayed at the bottom). Then the `tagdescription/TAG` page would hold whatever list one wishes (with `tagged(TAG)` in the pagespec), and whenever one wants to merely refer to the tag, one should link to `tagdescription/TAG`; this link won't count as tagging. So, the `tagbase/TAG` pages would become completely auxiliary (internal) link targets for ikiwiki; the users would edit or link to only `tagdescription/TAG`. --Ivan Z.
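+The eventual fix ([[todo/matching_different_kinds_of_links]]) amounts to recording what *kind* of link each link is, so that `tagged()` can ignore plain wikilinks. A minimal sketch in Python (the page names and data layout here are illustrative assumptions, not ikiwiki's actual Perl internals):

```python
# each page maps to a list of (target, kind) pairs: a [[!tag ]] records
# kind "tag", while a plain wikilink records kind "link"
links = {
    "gdb": [("tag/open_issue_gdb", "link")],            # merely refers to the tag
    "gdb/some_issue": [("tag/open_issue_gdb", "tag")],  # actually tagged
}

def tagged(tag):
    # match only pages whose link to the tag page has kind "tag"
    return sorted(page for page, page_links in links.items()
                  if ("tag/" + tag, "tag") in page_links)

assert tagged("open_issue_gdb") == ["gdb/some_issue"]
```

+With typed links, merely linking to `tag/open_issue_gdb` from `/gdb.mdwn` no longer makes that page show up in the map.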
diff --git a/doc/bugs/tags__44___backlinks_and_3.x.mdwn b/doc/bugs/tags__44___backlinks_and_3.x.mdwn
new file mode 100644
index 000000000..4fe9a4723
--- /dev/null
+++ b/doc/bugs/tags__44___backlinks_and_3.x.mdwn
@@ -0,0 +1,34 @@
+I think there might be an issue in the backlinks calculation in ikiwiki 3.04.
+
+I've just migrated to 3.04. In doing so, the following pagespec
+
+> "log/* and !link(tag/aggregation) and !link(tag/draft) and !*/Discussion"
+
+...started matching pages which contained
+
+> \[\[!template draft\]\]
+
+The page templates/draft.mdwn contains (amongst some markup)
+
+> \[\[!tag draft \]\]
+
+Prior to migration, the pagespec definitely took effect post-transclusion.
+
+An example: <http://jmtd.net/log/too_much_debconf_a_bad_thing/> contains the
+template inclusion, which can be seen to have worked due to markup at the
+bottom of the page. It even includes a "Tags: draft" link at the bottom.
+
+Strangely, <http://jmtd.net/tag/draft/> does not contain backlinks to pages
+which are tagged using the procedure above.
+
+After the first rebuild it's broken; after a subsequent refresh, it is fixed.
+I've reproduced this twice (but assumed I'd done something wrong the first
+time, so went ahead and migrated live, spamming planet debian in the process
+:(). I will try and put together a testcase. -- [[users/Jon]], 2009/02/17
+
+> Looks like the same problem as
+> [[cannot_reliably_use_meta_in_template]]. AFAIK, this has never worked
+> reliably, although the linked page has a simple, though potentially
+> expensive fix. --[[Joey]]
+
+> fix made, [[done]] --[[Joey]]
diff --git a/doc/bugs/tags_base_dir_not_used_when_creating_new_tags.mdwn b/doc/bugs/tags_base_dir_not_used_when_creating_new_tags.mdwn
new file mode 100644
index 000000000..f2243ab2d
--- /dev/null
+++ b/doc/bugs/tags_base_dir_not_used_when_creating_new_tags.mdwn
@@ -0,0 +1,43 @@
+I'm using the tags plugin with tagbase="tags".
+
+Already existing tags, corresponding to pages like tags/foo.html work just
+fine.
+
+If I add to a page a tag which does not yet exist (e.g. with [[!tag newtag]]),
+the just-modified page will have a link which points to tags/newtag. This is
+in theory correct, but in practice leads to creating a tags/newtag subpage
+of the page I'm editing, while my tagbase is supposed to be relative to the
+wiki root.
+
+When used in a wiki which already has some tags, this leads to mixing up
+tags located in tags/ and tags located in whatever/tags/.
+
+> When a new page is being edited, ikiwiki lets you choose where the page
+> will be created, so you'll get a dropdown list of possible locations for
+> the tag page. You should be able to choose between either tags/foo or
+> page/tags/foo.
+>
+> The way that ikiwiki decides which location to default to in this box is
+> fairly involved, but in general it will default to creating the page at
+> the same level as the tagged page. So if the tag is on any toplevel page
+> in the wiki, it will default to creating `tags/foo`; if the tag is on a
+> page in a subdirectory, it will default to creating `subdir/tags/foo`.
+>
+> I personally like this behavior; it allows me to create a subdirectory
+> for a particular thing and use tags that are specific to that thing,
+> which are kept confined to that subdirectory by default. For example,
+> this is used for ikiwiki's own plugins tags, which are all located
+> under plugins/type/* and which apply to pages under plugins/*.
+>
+> It's clearly not the right default for every situation though. Explicitly
+> setting a tagbase probably lessens the likelihood that it's the right
+> default for things under that tag base. I'd not be opposed to adding a
+> special case to change the default in this case, or adding a
+> configuration option to change the default globally. On the other hand,
+> it is pretty simple to just check the location and select the right one
+> from the list when creating a new page..
+>
+> --[[Joey]]
+
+> And, this is [[done]], creating tags with tagbase will put them under the
+> tagbase, unless the tag name starts with "/". --[[Joey]]
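+A sketch of that final rule in Python (a hypothetical helper, not ikiwiki's actual code):

```python
# where a new tag page defaults to, per the behavior described above:
# with a tagbase, tags go under it unless the tag name starts with "/"
def tag_page_location(tag, tagbase=None):
    if tag.startswith("/"):
        return tag.lstrip("/")          # explicit absolute tag name
    if tagbase:
        return tagbase + "/" + tag      # e.g. tags/foo
    return tag                          # no tagbase: old defaulting applies

assert tag_page_location("foo", tagbase="tags") == "tags/foo"
assert tag_page_location("/foo", tagbase="tags") == "foo"
assert tag_page_location("foo") == "foo"
```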
diff --git a/doc/bugs/taint_and_-T.mdwn b/doc/bugs/taint_and_-T.mdwn
new file mode 100644
index 000000000..21ef17673
--- /dev/null
+++ b/doc/bugs/taint_and_-T.mdwn
@@ -0,0 +1,30 @@
+By default, the taint flag is not defined. But ikiwiki.in has -T, causing a build failure:
+
    perl -Iblib/lib ikiwiki.out -libdir . -setup docwiki.setup -refresh
    "-T" is on the #! line, it must also be used on the command line at ikiwiki.out
    line 1.
+
+> pm_filter removes the -T from ikiwiki.in when generating ikiwiki.out
+> unless NOTAINT=0 is set. I cannot reproduce your problem. --[[Joey]]
+
+>> Thanks. Now I see. NetBSD, DragonFly, and several other systems don't have /usr/bin/perl, so that path is replaced in the shebang lines of various scripts, and then it no longer matches the pm_filter expression. Can you please consider providing a variable, or not matching on that assumed path to perl?
+
+ --- pm_filter.orig 2008-04-28 07:59:58 -0700
+ +++ pm_filter 2008-04-28 08:01:21 -0700
+ @@ -20,6 +20,6 @@
+ $_="use lib '$libdir';\n";
+ }
+ }
+ -elsif ($. == 1 && ($ENV{NOTAINT} || ! exists $ENV{NOTAINT}) && m{^(#!/usr/bin/perl) -T$}) {
+ +elsif ($. == 1 && ($ENV{NOTAINT} || ! exists $ENV{NOTAINT}) && m{^(#!.*) -T$}) {
+ $_=qq{$1\n};
+ }
+
+>> --[[JeremyReed]]
+
+>> I could look for "#!.*perl -T", if that would work. #!.*-T is perhaps
+>> over-broad. --[[Joey]]
+
>>> Yes, being more precise should be fine. Note that some systems may have bin/perl5 or bin/perl5.8.8, for example, so please allow for an optional version number, like `^(#!/.*/perl[0-9]*.*) -T$` or something like that.
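+Such a pattern is easy to sanity-check; a quick sketch in Python (the regex here is an illustrative variant, not the exact pm_filter expression):

```python
import re

# accept "#!<any absolute path>/perl<optional version> -T", nothing else
shebang = re.compile(r"^(#!/\S*/perl[0-9.]*) -T$")

assert shebang.match("#!/usr/bin/perl -T")
assert shebang.match("#!/usr/pkg/bin/perl5.8.8 -T")  # NetBSD-style path
assert not shebang.match("#!/bin/sh -T")             # not perl: leave alone
```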
+
+[[done]]
diff --git a/doc/bugs/taint_issue_with_regular_expressions.mdwn b/doc/bugs/taint_issue_with_regular_expressions.mdwn
new file mode 100644
index 000000000..172b49fd1
--- /dev/null
+++ b/doc/bugs/taint_issue_with_regular_expressions.mdwn
@@ -0,0 +1,35 @@
+Built from 2.1.17 source; it works fine on the command line, but not from the CGI wrapper. I traced the problem to regular expressions failing to match, specifically in contexts like the following in Render.pm:
+
+ my ($f)=/$config{wiki_file_regexp}/; # untaint
+
+It works if I replace it with:
+
+ my ($f)=/(^[-[:alnum:]_.:\/+]+$)/; # untaint
+
+which is exactly the same regular expression drawn out as a constant. It appears that %config gets some tainted data and is itself being marked entirely tainted, which may prevent using regular expressions contained in it for untainting other data. I'm using Perl 5.8.8.
+
+> How could `%config` possible get tainted? That would be a major security
+> hole. It seems more likely that perl continues to have taint flag bugs
+> even in 5.8. See also: [[prune_causing_taint_mode_failures]],
+> [[Insecure_dependency_in_mkdir]],
+> [[Insecure_dependency_in_eval_while_running_with_-T_switch]],
+> and especially [[!debbug 411786]]
+>
+> The last of those was the last straw for me, and I disabled taint
+> checking in the debian package. You can do the same by building ikiwiki
+> with NOTAINT=1. :-( --[[Joey]]
+
+----------------
+Continuing to dig into the problem I reported, it may not be taint after all. Running strings on the ikiwiki.cgi wrapper, I see stuff like:
+
+ 'wiki_file_regexp' => bless( do{\(my $o = undef)}, 'Regexp' )
+
+without any payload of the actual regexp, and that would certainly also have the observed effect of the regexps being completely broken while running in CGI mode. This seems to implicate Data::Dumper (2.101). After upgrading Data::Dumper to 2.121 I get:
+
+ 'wiki_file_regexp' => qr/(?-xism:(^[-[:alnum:]_.:\/+]+$))/
+
+This would call for, at most, an installation prerequisite of Data::Dumper >= 2.121. A look at the module's changelog shows that no intervening versions were actually released, so 2.121 would be the minimal good one.
+
+> You must have a very old version of perl there. This seems to be a bug in
+> Data::Dumper before 2.11, which didn't properly dump qr// objects. Prereq
+> added, [[done]] --[[Joey]]
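+The underlying pitfall generalizes: a serializer that does not understand compiled-regexp objects emits an opaque placeholder, and the reloaded config silently matches nothing. A Python analogue of the failure and the safe approach (JSON standing in for Data::Dumper; the key name just mirrors ikiwiki's config):

```python
import json
import re

config = {"wiki_file_regexp": re.compile(r"^[-\w.:/+]+$")}

# a naive dump chokes on the compiled pattern object...
try:
    json.dumps(config)
    raise AssertionError("expected a TypeError")
except TypeError:
    pass

# ...so dump the pattern source instead and recompile on load
dumped = json.dumps({k: v.pattern for k, v in config.items()})
loaded = {k: re.compile(v) for k, v in json.loads(dumped).items()}
assert loaded["wiki_file_regexp"].match("doc/bugs/taint.mdwn")
```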
diff --git a/doc/bugs/tbasewiki__95__brokenlinks.t_broken.mdwn b/doc/bugs/tbasewiki__95__brokenlinks.t_broken.mdwn
new file mode 100644
index 000000000..db3917d21
--- /dev/null
+++ b/doc/bugs/tbasewiki__95__brokenlinks.t_broken.mdwn
@@ -0,0 +1,60 @@
+t/basewiki_brokenlinks.t was failing with the following error:
+
+ t/basewiki_brokenlinks.....Can't locate IkiWiki.pm in @INC (@INC contains: /etc/perl /usr/lib/perl5/vendor_perl/5.8.8/i686-linux
+ /usr/lib/perl5/vendor_perl/5.8.8 /usr/lib/perl5/vendor_perl /usr/lib/perl5/site_perl/5.8.8/i686-linux /usr/lib/perl5/site_perl/5.8.8
+ /usr/lib/perl5/site_perl /usr/lib/perl5/5.8.8/i686-linux /usr/lib/perl5/5.8.8 /usr/local/lib/site_perl) at ./ikiwiki.out line 9.
+
+When ikiwiki.out is executed the 'blib/lib' directory isn't inherited by the child process. I can add "use lib 'blib/lib'" to ikiwiki.out easily enough, but I can't figure out how to add it to ikiwiki.in so that pm_filter doesn't strip it out.
+
+Anyway, once the 'use lib' is added I get the following error:
+
+ t/basewiki_brokenlinks.....ok 1/3Can't locate object method "new" via package "HTML::Template" at blib/lib/IkiWiki.pm line 858.
+
+After some digging I found that HTML::Template is being required after the new statement, again, easily fixed:
+
+ Index: IkiWiki.pm
+ ===================================================================
+ --- IkiWiki.pm (revision 3724)
+ +++ IkiWiki.pm (working copy)
+ @@ -842,7 +842,6 @@
+ return "";
+ }
+
+ - require HTML::Template;
+ my @ret=(
+ filter => sub {
+ my $text_ref = shift;
+ @@ -857,6 +856,7 @@
+ }
+
+ sub template ($;@) {
+ + require HTML::Template;
+ HTML::Template->new(template_params(@_));
+ }
+
+**That** gave me:
+
+ t/basewiki_brokenlinks.....ok 1/3HTML::Template->new called with multiple (or no) template sources specified! A valid call to new() has exactly one filename => 'file' OR exactly one scalarref => \$scalar OR exactly one arrayref => \@array OR exactly one filehandle => *FH at blib/lib/IkiWiki.pm line 858
+
+After some stepping through, I figured out that the template directory was invalid; again, easily fixed:
+
+ Index: t/basewiki_brokenlinks.t
+ ===================================================================
+ --- t/basewiki_brokenlinks.t (revision 3724)
+ +++ t/basewiki_brokenlinks.t (working copy)
+ @@ -4,6 +4,6 @@
+ use Test::More tests => 3;
+
+ ok(! system("make ikiwiki.out"));
+ -ok(! system("PERL5LIB=. ./ikiwiki.out -plugin brokenlinks -rebuild -underlaydir=basewiki t/basewiki_brokenlinks t/basewiki_brokenlinks/out"));
+ +ok(! system("PERL5LIB=. ./ikiwiki.out -plugin brokenlinks -rebuild -underlaydir=basewiki -templatedir=templates t/basewiki_brokenlinks t/basewiki_brokenlinks/out"));
+ ok(`grep 'no broken links' t/basewiki_brokenlinks/out/index.html`);
+ system("rm -rf t/basewiki_brokenlinks/out t/basewiki_brokenlinks/.ikiwiki");
+
+Other than ikiwiki.in, am I missing something here?
+
+>> I think this is [[!debbug 425891]]. I have sent there a patch that incorporates the original
+>> author's two diffs but has a more correct solution to the first problem described
+>> above. -- Thomas, 2007-06-26
+
+[[done]]
diff --git a/doc/bugs/tbasewiki__95__brokenlinks.t_broken/discussion.mdwn b/doc/bugs/tbasewiki__95__brokenlinks.t_broken/discussion.mdwn
new file mode 100644
index 000000000..7c20eb3c3
--- /dev/null
+++ b/doc/bugs/tbasewiki__95__brokenlinks.t_broken/discussion.mdwn
@@ -0,0 +1,2 @@
+For what it's worth: I encounter exactly the same issues when building and
+testing ikiwiki 2.1 on NetBSD/i386 3.1. \ No newline at end of file
diff --git a/doc/bugs/templateForRecentChangesMissingCloseSpan.mdwn b/doc/bugs/templateForRecentChangesMissingCloseSpan.mdwn
new file mode 100644
index 000000000..5c322991a
--- /dev/null
+++ b/doc/bugs/templateForRecentChangesMissingCloseSpan.mdwn
@@ -0,0 +1,26 @@
+In the template for ikiwiki's recent changes page
+
+ /usr/share/ikiwiki/templates/change.tmpl
+
+there is a missing </span> tag after the
+
+ <span class="changedate"><TMPL_VAR COMMITDATE>
+
+This results in the recentchanges/ page being invalid and rendering quite horrifyingly in Internet Exploder.
+
+[I'm running](http://wiki.shlrm.org) (linked so you can see the one I'm running if you need to) the latest version of ikiwiki, and I note that it's broken on [ikiwiki.info](http://validator.w3.org/check?uri=http%3A%2F%2Fikiwiki.info%2Frecentchanges%2F&charset=%28detect+automatically%29&doctype=Inline&group=0&user-agent=W3C_Validator%2F1.767) too :)
+
+[This one on debian](https://www.icanttype.org/recentchanges/) is somehow [valid](http://validator.w3.org/check?uri=https%3A%2F%2Fwww.icanttype.org%2F%2Frecentchanges%2F&charset=%28detect+automatically%29&doctype=Inline&group=0&user-agent=W3C_Validator%2F1.767), although it's using the same template. Perhaps there's additional scrubbing going on at his end.
+
+Thanks,
+David
+
+PS: I have fixed the template by hand on my server, so it will validate, however ikiwiki.info will not.
+
+> [[!template id="gitbranch" branch=smcv/trivia author="[[smcv]]"]] [[!tag patch]]
+> Enabling either [[plugins/htmltidy]] or [[plugins/htmlbalance]] will automatically fix unbalanced
+> markup like this; using [[plugins/comments]] without having one or other of those is a bad idea
+> from the point of view of avoiding comment forgery, which is probably why icanttype.org works
+> correctly. Anyway, I've fixed this in a branch: Joey, care to review smcv/trivia? --[[smcv]]
+
+[[done]], thanks guys --[[Joey]]
diff --git a/doc/bugs/template_creation_error.mdwn b/doc/bugs/template_creation_error.mdwn
new file mode 100644
index 000000000..79dccc136
--- /dev/null
+++ b/doc/bugs/template_creation_error.mdwn
@@ -0,0 +1,111 @@
+Hi,
+I am trying to build a template. The compilation of this template results in a weird exception. I have isolated the cause of the exception to the following point:
+
+If I have this in the template code:
+
    \[[!inline
    pages="<TMPL_VAR SEL_PAGES>"
    template=extract-entry
    ]]
+
+there is no problem at all. I can use the template with the desired result. But if I try to use this (just adding the "show" parameter):
+
    \[[!inline
    pages="<TMPL_VAR SEL_PAGES>"
    template=extract-entry
    show=<TMPL_VAR CNTPG>
    ]]
+
+I get this exception on the Git bash console:
+
+<pre>
+$ git push
+Counting objects: 7, done.
+Delta compression using up to 8 threads.
+Compressing objects: 100% (4/4), done.
+Writing objects: 100% (4/4), 410 bytes, done.
+Total 4 (delta 3), reused 0 (delta 0)
+remote: From /home/b-odelama-com/source
+remote: eb1421e..5e1bac5 master -> origin/master
+remote: Argument "\x{3c}\x{54}..." isn't numeric in numeric lt (<) at /usr/share/perl5/IkiWiki/Plugin/inline.pm line 231.
+remote: Argument "\x{3c}\x{54}..." isn't numeric in numeric lt (<) at /usr/share/perl5/IkiWiki/Plugin/inline.pm line 231.
+To ssh://b-odelama-com@odelama-com.branchable.com/
+ eb1421e..5e1bac5 master -> master
+</pre>
+
+Please, let me know what to do to avoid this kind of error.
+
+> When you add a template page `templates/foo.mdwn` for use with
+> the [[ikiwiki/directive/template]] directive, two things happen:
+>
+> 1. `\[[!template id=foo ...]]` becomes available;
+> 2. a wiki page `templates/foo` is built, resulting in an HTML file,
+> typically `templates/foo/index.html`
+>
+> The warnings you're seeing are the second of these: when ikiwiki
+> tries to process `templates/foo.mdwn` as an ordinary page, without
+> interpreting the `<TMPL_VAR>` directives, `inline` receives invalid
+> input.
+>
+> This is a bit of a design flaw in [[plugins/template]] and
+> [[plugins/edittemplate]], I think - ideally it would be possible to
+> avoid parts of the page being interpreted when the page is being
+> rendered normally rather than being used as a template.
+>
+> There *is* a trick to avoid parts of the page being interpreted when
+> the page is being used as a template, while having them appear
+> when it's rendered as a page:
+>
+> <TMPL_IF FALSE>
+> <!-- This part only appears when being used as a page.
+> It assumes that you never set FALSE to a true value :-) -->
+> \[[!meta robots="noindex,nofollow"]]
+> This template is used to describe a thing. Parameters:
+> * name: the name of the thing
+> * size: the size of the thing
+> </TMPL_IF>
+>
+> The thing is called <TMPL_VAR name> and its size is <TMPL_VAR size>
+>
+> I suppose you could maybe extend that to something like this:
+>
+> <TMPL_IF FALSE>
+> <!-- This part only appears when being used as a page.
+> It assumes that you never set FALSE to a true value :-) -->
+> \[[!meta robots="noindex,nofollow"]]
+> This template is used to describe a thing. Parameters:
+> * name: the name of the thing
+> * size: the size of the thing
+> </TMPL_IF>
+>
+> <TMPL_IF FALSE>
+> \[[!if test="included() and !included()" then="""
+> </TMPL_IF>
+> <!-- This part only appears when being used as a template. It also
+> assumes that you never set FALSE to a true value, and it
+> relies on the [[ikiwiki/pagespec]] "included() and !included()"
+> never being true. -->
+> The thing is called <TMPL_VAR name> and its size is <TMPL_VAR size>
+> <TMPL_IF FALSE>
+> """]]
+> </TMPL_IF>
+>
+> but that's far harder than it ought to be!
+>
+> Perhaps the right solution would be to change how the template plugin
+> works, so that templates are expected to contain a new `definetemplate`
+> directive:
+>
+> This template is used to describe a thing. Parameters:
+> * name: the name of the thing
+> * size: the size of the thing
+>
+> \[[!definetemplate """
+> The thing is called <TMPL_VAR name> and its size is <TMPL_VAR size>
+> """]]
+>
+> with templates not containing a `\[[!definetemplate]]` being treated
+> as if the whole text of the page was copied into a `\[[!definetemplate]]`,
+> for backwards compatibility?
+>
+> --[[smcv]]
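+That proposed extraction rule can be sketched in Python (simplified parsing; it assumes triple-quoted directive bodies as in the example above):

```python
import re

# return the text used when the page acts as a template: the body of a
# [[!definetemplate ]] directive if present, else the whole page text
# (the proposed backwards-compatible fallback)
def template_text(page):
    m = re.search(r'\[\[!definetemplate\s+"""(.*?)"""\s*\]\]', page, re.S)
    return m.group(1).strip() if m else page

page = ('This template is used to describe a thing.\n'
        '[[!definetemplate """\n'
        'The thing is called <TMPL_VAR name>.\n'
        '"""]]\n')
assert template_text(page) == "The thing is called <TMPL_VAR name>."
assert template_text("no directive here") == "no directive here"
```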
diff --git a/doc/bugs/teximg_does_not_work_Preview.mdwn b/doc/bugs/teximg_does_not_work_Preview.mdwn
new file mode 100644
index 000000000..1900ac299
--- /dev/null
+++ b/doc/bugs/teximg_does_not_work_Preview.mdwn
@@ -0,0 +1,12 @@
+Using ikiwiki 2.6.1 package with git 1.5.3.1 backend in Debian.
+
+The teximg plugin creates .png and .log files in $basedir/teximg/ even in Preview mode. This causes "File foo independently created, not overwriting with version from page bar" errors from the will_render() function when repeatedly clicking "Preview" or when trying to save the page after a preview.
+
+In my opinion there are two ways to fix this cleanly:
+
+1. change the plugin: do not create any files when rendering a preview. Instead, inline the images base64-encoded, like the graphviz plugin does. The disadvantage is a very slow preview if the page contains a lot of LaTeX.
+
+2. provide a clean way for plugins to create additional files even for previews. These files could be removed when the "Save Page" button is clicked, or on the next page view, for example. In that case one might also reconsider putting all TeX images into one folder and naming them after their MD5 checksum. The hash space may be large, but it is not infinite, and the technically ingenuous user might not be able to handle cross-page hash collisions.
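+The content-addressed naming mentioned in option 2 might look like this sketch (Python, with a hypothetical helper name):

```python
import hashlib

# name each rendered image after a digest of its TeX source, so the same
# snippet always maps to the same file name
def teximg_name(tex_source):
    return hashlib.md5(tex_source.encode("utf-8")).hexdigest() + ".png"

a = teximg_name(r"\frac{1}{2}")
b = teximg_name(r"\frac{1}{2}")
assert a == b                        # identical source, identical file
assert a != teximg_name(r"\frac{1}{3}")
assert a.endswith(".png")
```

+The flip side is that identical TeX on two different pages maps to one file, which only one page can "own" in ikiwiki's model.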
+
+Preview issue [[fixed|done]] (but see
+[[teximg_fails_if_same_tex_is_used_on_multiple_pages]]) --[[Joey]]
diff --git a/doc/bugs/teximg_fails_if_same_tex_is_used_on_multiple_pages.mdwn b/doc/bugs/teximg_fails_if_same_tex_is_used_on_multiple_pages.mdwn
new file mode 100644
index 000000000..700492345
--- /dev/null
+++ b/doc/bugs/teximg_fails_if_same_tex_is_used_on_multiple_pages.mdwn
@@ -0,0 +1,24 @@
+If the same teximg is put on multiple pages, the second one will fail:
+
    Error: /home/joey/html//teximg/3eb2b61be1e909df9008a499863eec90.png independently created, not overwriting with version from pics
+
+Ikiwiki doesn't support the concept of two different pages owning the same
+file; only one page can own a file. I don't see a good way around that,
+w/o large changes to ikiwiki.
+
+This could be fixed by making the teximg directory be a subdirectory
+of the page containing the image. I seem to remember suggesting this to
+winnie, and I forget why it wasn't done.. maybe because the same teximg
+on multiple pages was expected to work, and as an optimisation for that case?
+--[[Joey]]
+
+----
+At first the plugin didn't use the writefile etc. functions of ikiwiki, but its own ones (so the created image wasn't owned by any page).
+With that setup this of course works.
+
+After switching to the ikiwiki functions it seems that I didn't test this again... sorry, an error of mine. I'll work out a fix for this.
+I think this will be a separate directory of images for every page.
+
+--[[PatrickWinnertz]]
+
+[[!tag done]]
diff --git a/doc/bugs/textile_plugin_dies_if_input_has_a_non-utf8_character.mdwn b/doc/bugs/textile_plugin_dies_if_input_has_a_non-utf8_character.mdwn
new file mode 100644
index 000000000..bdd07210e
--- /dev/null
+++ b/doc/bugs/textile_plugin_dies_if_input_has_a_non-utf8_character.mdwn
@@ -0,0 +1,14 @@
+ 20:03:56$ ikiwiki --setup *setup --rebuild
+ successfully generated /home/jon/git/ikiwiki/hooks/post-update
+ utf8 "\x92" does not map to Unicode at /usr/share/perl5/IkiWiki.pm line 320, <$in> chunk 1.
+ utf8 "\x92" does not map to Unicode at /usr/share/perl5/IkiWiki.pm line 320, <$in> chunk 1.
+ ikiwiki.setup: Malformed UTF-8 character (fatal) at /usr/share/perl5/Text/Textile.pm line 775.
+ BEGIN failed--compilation aborted at (eval 6) line 166.
+
+The first two complaints happen if textile is not loaded; the third, fatal one happens if it is.
+
+0x92 is "single quote" in the evil windows default codepage. It would be nice to handle this gracefully and not abort ikiwiki at this point, or alternatively, die fatally but mention which input page caused the error.
+
+Interestingly enough, in my case, the input file has several other bad windows characters (0xFC, u-umlaut) which have not caused ikiwiki to abort. ikiwiki version 2.50. -- [[users/Jon]]
+
+> Fixed in git. [[done]] --[[Joey]]
diff --git a/doc/bugs/the_login_page_is_unclear_when_multiple_methods_exist.mdwn b/doc/bugs/the_login_page_is_unclear_when_multiple_methods_exist.mdwn
new file mode 100644
index 000000000..70266c49c
--- /dev/null
+++ b/doc/bugs/the_login_page_is_unclear_when_multiple_methods_exist.mdwn
@@ -0,0 +1,16 @@
+When multiple login methods are enabled, the ikiwiki login page lists one form per method, e.g.
+
+ * one for openid
+ * one for local user/password store
+
+followed by the "login" button underneath. It's not obvious to anyone unfamiliar with the software that these are distinct forms, or that there are multiple ways of logging in, etc. -- [[Jon]]
+
+> As discussed in [[login_page_non-obvious_with_openid]],
+> architectural reasons disallow multiple forms, with multiple
+> submit buttons. But the default style sheet includes
+> a styling for the openid portion of the form that makes
+> it visually distinct from the rest of the form. I'm sure the styling
+> could be improved, but the current form does not seem too non-obvious
+> to me, or to naive users in the field. --[[Joey]]
+
+>> [[done]], better fixed by new fancy openid login form. --[[Joey]]
diff --git a/doc/bugs/title__40____41___in_a_PageSpec__44___with_meta_enabled__44___causes_a_crash.mdwn b/doc/bugs/title__40____41___in_a_PageSpec__44___with_meta_enabled__44___causes_a_crash.mdwn
new file mode 100644
index 000000000..8dc78a4a9
--- /dev/null
+++ b/doc/bugs/title__40____41___in_a_PageSpec__44___with_meta_enabled__44___causes_a_crash.mdwn
@@ -0,0 +1,3 @@
+When the meta plugin is enabled, use of the title() predicate in a [[PageSpec]] fails with "Undefined subroutine &IkiWiki::Plugin::meta::pagetitle called". The [[patch]] is to replace "pagetitle" with "IkiWiki::pagetitle" in the meta plugin, as in [this git commit](http://git.debian.org/?p=users/smcv/ikiwiki.git;a=commit;h=1f26a1bf1655b1d0223b24ba1db70579a3774eb1) (git://git.debian.org/git/users/smcv/ikiwiki.git, branch=master, commit=1f26a).
+
+[[done]] thanks!
diff --git a/doc/bugs/toc_displays_headings_from_sidebar.mdwn b/doc/bugs/toc_displays_headings_from_sidebar.mdwn
new file mode 100644
index 000000000..469ca8a33
--- /dev/null
+++ b/doc/bugs/toc_displays_headings_from_sidebar.mdwn
@@ -0,0 +1,3 @@
+The [[/ikiwiki/directive/toc]] directive scrapes all headings from the page, including those in the sidebar. So, if the sidebar includes navigational headers, every page with a table of contents will display those navigational headers before the headers in that page's content.
+
+I'd like some way to exclude the sidebar from the table of contents. As discussed via Jabber, perhaps toc could have a config option to ignore headers inside a nav tag or a tag with id="sidebar".
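+The suggested option could work roughly like this sketch (a Python stand-in for the plugin's HTML pass; the tag and id conventions are the ones proposed above):

```python
from html.parser import HTMLParser

# collect headings, skipping any subtree rooted at <nav> or id="sidebar"
class TocScan(HTMLParser):
    def __init__(self):
        super().__init__()
        self.skip_tag, self.skip_depth = None, 0
        self.in_heading, self.headings = False, []

    def handle_starttag(self, tag, attrs):
        if self.skip_depth:
            self.skip_depth += tag == self.skip_tag  # nested same tag
        elif tag == "nav" or dict(attrs).get("id") == "sidebar":
            self.skip_tag, self.skip_depth = tag, 1
        elif tag in ("h1", "h2", "h3"):
            self.in_heading = True
            self.headings.append("")

    def handle_endtag(self, tag):
        if self.skip_depth and tag == self.skip_tag:
            self.skip_depth -= 1
        elif tag in ("h1", "h2", "h3"):
            self.in_heading = False

    def handle_data(self, data):
        if self.in_heading and not self.skip_depth:
            self.headings[-1] += data

p = TocScan()
p.feed('<div id="sidebar"><h2>Links</h2></div><h1>Page title</h1>')
assert p.headings == ["Page title"]
```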
diff --git a/doc/bugs/toc_in_sidebar.mdwn b/doc/bugs/toc_in_sidebar.mdwn
new file mode 100644
index 000000000..447a0e51b
--- /dev/null
+++ b/doc/bugs/toc_in_sidebar.mdwn
@@ -0,0 +1,21 @@
+Putting a toc in the sidebar used to work, but was broken by
+commit 9652cdfe2eb16150518e34af33c8858118fe0a09, which, in turn, fixed a bug
+with the toc not appearing during page preview.
+
+So, if toc is a sanitize hook, it can't be used in the sidebar, because the
+sidebar is only added to the page later. If the toc is a format hook, it
+shows up in the sidebar, but not at page preview time (because format hooks
+are not called during preview). Also, calling the toc as a format hook
+makes any headers that are hardcoded into the page template show up in the
+toc, which is rarely desirable.
+
+I can't think of a way between these that works in all cases. Maybe call
+the format hooks when generating a page preview? Maybe add an option to toc
+to make it embeddable in the sidebar?
+
+Hmm, I think I need to call format during preview. Another case is that
+inline uses a format hook to insert the inlined content..
+
+--[[Joey]]
+
+[[done]]
diff --git a/doc/bugs/toggle_expects_body_element_without_attributes.mdwn b/doc/bugs/toggle_expects_body_element_without_attributes.mdwn
new file mode 100644
index 000000000..0b39346f4
--- /dev/null
+++ b/doc/bugs/toggle_expects_body_element_without_attributes.mdwn
@@ -0,0 +1,3 @@
+The toggle plugin checks for a `<body>` tag in the page; if it is not found, the javascript tags are inserted at the top of the document. Since my page uses `<body onload="javascript:fixLinks()">`, a plain `<body>` is not found and I get script links before the doctype declaration. Please see the source of the following toggle-using page: http://kaizer.se/wiki/kupfer/ -- ulrik [kaizer.se]
+
+[[fixed|done]] --[[Joey]]
diff --git a/doc/bugs/toggle_fails_on_Safari.mdwn b/doc/bugs/toggle_fails_on_Safari.mdwn
new file mode 100644
index 000000000..25f62e088
--- /dev/null
+++ b/doc/bugs/toggle_fails_on_Safari.mdwn
@@ -0,0 +1,58 @@
+The [[plugins/toggle]] plugin has no effect when viewed on the Safari web browser.
+
+All toggles appear open all the time.
+
+I don't know if this is true for other webkit browsers (the new Konqueror, the iPhone, etc).
+I'm currently testing in the Safari nightly builds, but I've seen the bug in the current release
+of Safari too.
+
+Looking at the Safari Web Inspector, it believes there is a parse error on line 47 of the
+[[news]] page. This is the definition of the getElementsByClass(class) function.
+
+ 45 }
+ 46
+ 47 function getElementsByClass(class) {
+ SyntaxError: Parse error
+ 48 var ret = new Array();
+
+> Reproduced in epiphany-webkit on debian.
+>
+> Also noticed something interesting when I opened the page in vim. It
+> highlighted the "class" like a type definition, not a variable. Sure
+> enough, replacing with "c" fixed it.
+>
+> I wonder if webkit is actually in the right here, and using a reserved
+> word like, presumably, "class" as a variable name is not legal. As I try
+> to ignore javascript as much as possible, I can't say. [[done]] --[[Joey]]
+
+>> I also started having a look at this. I found the same issue with
+>> the variable 'class'. I'm not a javascript guru so I looked on the web
+>> at other implementations of getElementsByClass() and noticed some
+>> things that we might use. I took a bunch of different ideas and came
+>> up with this:
+
+ function getElementsByClass(cls, node, tag) {
+ if (document.getElementsByClass)
+ return document.getElementsByClass(cls, node, tag);
+ if (! node) node = document;
+ if (! tag) tag = '*';
+ var ret = new Array();
+ var pattern = new RegExp("(^|\\s)"+cls+"(\\s|$)");
+ var els = node.getElementsByTagName(tag);
        for (var i = 0; i < els.length; i++) {
+ if ( pattern.test(els[i].className) ) {
+ ret.push(els[i]);
+ }
+ }
+ return ret;
+ }
+
+>> Most of the changes are minor, except that this one will use the
+>> built in function if it is available. That is likely to be significantly
+>> faster. Adding the extra parameters doesn't cause a problem --
+>> they're filled in with useful defaults.
+
+>> I don't know if it is worth making this change, but it is there if you want it.
+
+>>> Well, it seems to work. Although god only knows about IE. Suppose I
+>>> might as well.. --[[Joey]]
diff --git a/doc/bugs/trail_excess_dependencies.mdwn b/doc/bugs/trail_excess_dependencies.mdwn
new file mode 100644
index 000000000..f806a62eb
--- /dev/null
+++ b/doc/bugs/trail_excess_dependencies.mdwn
@@ -0,0 +1,95 @@
+I've just modified the trail plugin to use only presence, and not
+content dependencies. Using content dependencies, particularly to the page
+that defines the trail, meant that every time that page changed, *every*
+page in the trail gets rebuilt. This leads to users setting up sites that
+have horrible performance, if the trail is defined in, for example, the top
+page of a blog.
+
+Unfortunately, this change to presence dependencies has
+introduced a bug. Now when an existing trail is removed, the pages in the
+trail don't get rebuilt to remove the trail (both html display and state).
+
+> Actually, this particular case is usually OK. Suppose a trail `untrail`
+> contains `untrail/a` (as is the case in the regression
+> test I'm writing), and you build the wiki, then edit `untrail` to no
+> longer be a trail, and refresh. `untrail` has changed, so it is
+> rendered. Assuming that the template of either `untrail` or another
+> changed page happens to contain the `TRAILS` variable (which is not
+> guaranteed, but is highly likely), `I::P::t::prerender`
+> is invoked. It notices that `untrail/a` was previously a trail
+> member and is no longer, and rebuilds it with the diagnostic
+> "building untrail/a, its previous or next page has changed".
+>
+> Strictly speaking, I should change `I::P::t::build_affected`
+> so it calls `prerender`, so we're guaranteed to have done the
+> recalculation. Fixed in my branch. --[[smcv]]
+
+I think that to fix this bug, the plugin should use a hook to
+force rebuilding of all the pages that were in the trail, when
+the trail is removed (or changed).
+
+> The case of "the trail is changed" is still broken:
+> if the order of items changes, or the trail is removed,
+> then the logic above means it's OK, but if you
+> change the `\[[!meta title]]` of the trail, or anything else
+> used in the prev/up/next bar, the items won't show that
+> change. Fixed in my branch. --[[smcv]]
+
+There's a difficulty in doing that: The needsbuild hook runs before the scan
+hook, so before it has a chance to see if the trail directive is still there.
+It'd need some changes to ikiwiki's hooks.
+
+> That's what `build_affected` is for, and trail already used it. --s
+
+(An improvement in this area would probably simplify other plugins, which
+currently abuse the needsbuild hook to unset state, to handle the case
+where the directive that resulted in that state is removed.)
+
+I apologise for introducing a known bug, but the dependency mess was too
+bad to leave as-is. And I have very little time (and regrettably, even less
+power) to deal with it right now. :( --[[Joey]]
+
+[[!template id=gitbranch branch=smcv/ready/trail author="[[Simon_McVittie|smcv]]"]]
+[[!tag patch]]
+
+> I believe my `ready/trail` branch fixes this. There are regression tests.
+>
+> Here is an analysis of how the trail pages interdepend.
+>
+> * If *trail* contains a page *member* which does exist, *member* depends
+> on *trail*. This is so that if the trail directive is deleted from
+> *trail*, or if *trail*'s "friendly" title or trail settings are changed,
+> the trail navigation bar in *member* will pick up that change. This is
+> now only a presence dependency, which isn't enough to make those happen
+> correctly. [Edited to add: actually, the title is the only thing that
+> can affect *member* without affecting the order of members.]
+>
+> * If *trail* contains consecutive pages *m1* and *m2* in that order,
+> *m1* and *m2* depend on each other. This is so that if one's
+> "friendly" title changes, the other is rebuilt. This is now only
+> a presence dependency, which isn't enough to make those happen
+> correctly. In my branch, I explicitly track the "friendly" title
+> for every page that's edited and is involved in a trail somehow.
+>
+> * If *trail* has *member* in its `pagenames` but there is no page called
+> *member*, then *trail* must be rebuilt if *member* is created. This
+> was always a presence dependency, and is fine.
+>
+> In addition, the `trail` plugin remembers the maps
+> { trail => next item in that trail } and { trail => previous item in
+> that trail } for each page. If either changes, the page gets rebuilt
+> by `build_affected`, with almost the same logic as is used to update
+> pages that link to a changed page. My branch extends this to track the
+> "friendly title" of each page involved in a trail, either by being
+> the trail itself or a member (or both).
+>
+> I think it's true to say that the trail always depends on every member,
+> even if it doesn't display them. This might mean that we can use
+> "render the trail page" as an opportunity to work out whether any of
+> its members are also going to need re-rendering?
+> [Edited to add: actually, I didn't need this to be true, but I made the
+> regression test check it anyway.]
+>
+> --[[smcv]]
+
+>>> Thanks **very** much! [[done]] --[[Joey]]
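The prev/next maps described above can be sketched as a toy model (not ikiwiki's actual data structures; `trail_state` is a hypothetical helper):

```python
# Toy model of the per-page trail state: previous member, next
# member, and friendly title. A page whose state differs between
# runs gets rebuilt (in ikiwiki, via build_affected).
def trail_state(members, titles={}):
    state = {}
    for i, page in enumerate(members):
        state[page] = {
            "prev": members[i - 1] if i > 0 else None,
            "next": members[i + 1] if i + 1 < len(members) else None,
            "title": titles.get(page, page),
        }
    return state

old = trail_state(["a", "b", "c"])
new = trail_state(["a", "c"])  # "b" was removed from the trail
rebuild = [p for p in new if old.get(p) != new[p]]
print(rebuild)  # ['a', 'c']: their neighbours changed
```

Removing or reordering a member changes the recorded state of its neighbours, which is what triggers the "its previous or next page has changed" rebuild.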
diff --git a/doc/bugs/trail_shows_on_cgi_pages.mdwn b/doc/bugs/trail_shows_on_cgi_pages.mdwn
new file mode 100644
index 000000000..af1de3028
--- /dev/null
+++ b/doc/bugs/trail_shows_on_cgi_pages.mdwn
@@ -0,0 +1,12 @@
+When commenting on, or I think editing, a page that uses the trail
+plugin, the trail is displayed across the top of the page. This should not
+happen, probably. --[[Joey]]
+
+> [[!template id=gitbranch branch=smcv/ready/no-trails-if-dynamic author="[[smcv]]"]]
+> [[!tag patch]]
+> Fixed in my branch. --[[smcv]]
+
+>> [[merged|done]], although I am ambivalent about hiding the search box,
+>> and unsure about hiding the sidebar. At least the latter fixes an
+>> annoying layout problem with the comment page, where the textarea
+>> appears below the sidebar due to its width. --[[Joey]]
diff --git a/doc/bugs/trail_test_suite_failures.mdwn b/doc/bugs/trail_test_suite_failures.mdwn
new file mode 100644
index 000000000..a3b7159ec
--- /dev/null
+++ b/doc/bugs/trail_test_suite_failures.mdwn
@@ -0,0 +1,97 @@
+[[!template id=gitbranch branch=smcv/trail author=smcv]] [[!tag patch]]
+
+`t/trail.t` has some test suite failures. This is after applying
+[[smcv]]'s patch that fixed some races that caused it to fail
+sometimes. These remaining failures may also be intermittant,
+although I can get them reliably on my laptop. I've added some debugging
+output, which seems to point to an actual bug in the plugin AFAICS. --[[Joey]]
+
+> I can reproduce this reliably at 0a23666ddd but not 3.20120203. Bisecting
+> indicates that it regressed in aaa72a3a80f, "inline: When the pagenames list
+> includes pages that do not exist, skip them".
+>
+> I don't think this is the bug noted in the commit message - the inline
+> containing `sorting/new` uses `pages`, not `pagenames`. --[[smcv]]
+
+>> It seems you removed `trail` support from `inline` in that commit.
+>> Assuming that wasn't intentional, this is fixed in `smcv/trail`.
+>> --[[smcv]]
+
+>>> Looks like a bad merge of some kind. pulled, [[done]] --[[Joey]]
+
+<pre>
+ok 71 - expected n=sorting/end p=sorting/beginning in sorting/middle.html
+not ok 72 - expected n=sorting/new p=sorting/middle in sorting/end.html
+# Failed test 'expected n=sorting/new p=sorting/middle in sorting/end.html'
+# at t/trail.t line 13.
+# got: 'n=sorting/linked2 p=sorting/middle'
+# expected: 'n=sorting/new p=sorting/middle'
+not ok 73 - expected n=sorting/old p=sorting/end in sorting/new.html
+# Failed test 'expected n=sorting/old p=sorting/end in sorting/new.html'
+# at t/trail.t line 13.
+# got: undef
+# expected: 'n=sorting/old p=sorting/end'
+not ok 74 - expected n=sorting/ancient p=sorting/new in sorting/old.html
+# Failed test 'expected n=sorting/ancient p=sorting/new in sorting/old.html'
+# at t/trail.t line 13.
+# got: undef
+# expected: 'n=sorting/ancient p=sorting/new'
+not ok 75 - expected n=sorting/linked2 p=sorting/old in sorting/ancient.html
+# Failed test 'expected n=sorting/linked2 p=sorting/old in sorting/ancient.html'
+# at t/trail.t line 13.
+# got: undef
+# expected: 'n=sorting/linked2 p=sorting/old'
+not ok 76 - expected n= p=sorting/ancient in sorting/linked2.html
+# Failed test 'expected n= p=sorting/ancient in sorting/linked2.html'
+# at t/trail.t line 13.
+# got: 'n= p=sorting/end'
+# expected: 'n= p=sorting/ancient'
+ok 77
+</pre>
+
+Here, the "new" page does not seem to be included in the trail as expected.
+Looking at the rendered page, there is no trail directive output on it either.
+--[[Joey]]
+
+<pre>
+ok 90
+not ok 91 - expected n=sorting/new p= in sorting/old.html
+# Failed test 'expected n=sorting/new p= in sorting/old.html'
+# at t/trail.t line 13.
+# got: undef
+# expected: 'n=sorting/new p='
+not ok 92 - expected n=sorting/middle p=sorting/old in sorting/new.html
+# Failed test 'expected n=sorting/middle p=sorting/old in sorting/new.html'
+# at t/trail.t line 13.
+# got: undef
+# expected: 'n=sorting/middle p=sorting/old'
+not ok 93 - expected n=sorting/linked2 p=sorting/new in sorting/middle.html
+# Failed test 'expected n=sorting/linked2 p=sorting/new in sorting/middle.html'
+# at t/trail.t line 13.
+# got: 'n=sorting/linked2 p='
+# expected: 'n=sorting/linked2 p=sorting/new'
+ok 94 - expected n=sorting/linked p=sorting/middle in sorting/linked2.html
+ok 95 - expected n=sorting/end p=sorting/linked2 in sorting/linked.html
+ok 96 - expected n=sorting/a/c p=sorting/linked in sorting/end.html
+ok 97 - expected n=sorting/beginning p=sorting/end in sorting/a/c.html
+ok 98 - expected n=sorting/a/b p=sorting/a/c in sorting/beginning.html
+not ok 99 - expected n=sorting/ancient p=sorting/beginning in sorting/a/b.html
+# Failed test 'expected n=sorting/ancient p=sorting/beginning in sorting/a/b.html'
+# at t/trail.t line 13.
+# got: 'n=sorting/z/a p=sorting/beginning'
+# expected: 'n=sorting/ancient p=sorting/beginning'
+not ok 100 - expected n=sorting/z/a p=sorting/a/b in sorting/ancient.html
+# Failed test 'expected n=sorting/z/a p=sorting/a/b in sorting/ancient.html'
+# at t/trail.t line 13.
+# got: undef
+# expected: 'n=sorting/z/a p=sorting/a/b'
+not ok 101 - expected n= p=sorting/ancient in sorting/z/a.html
+# Failed test 'expected n= p=sorting/ancient in sorting/z/a.html'
+# at t/trail.t line 13.
+# got: 'n= p=sorting/a/b'
+# expected: 'n= p=sorting/ancient'
+ok 102
+</pre>
+
+Haven't investigated, but looks like the same sort of problem, a
+page expected to be in the trail isn't. --[[Joey]]
diff --git a/doc/bugs/transient_autocreated_tagbase_is_not_transient_autoindexed.mdwn b/doc/bugs/transient_autocreated_tagbase_is_not_transient_autoindexed.mdwn
new file mode 100644
index 000000000..702608831
--- /dev/null
+++ b/doc/bugs/transient_autocreated_tagbase_is_not_transient_autoindexed.mdwn
@@ -0,0 +1,31 @@
+ mkdir -p ikiwiki-tag-test/raw/a_dir/ ikiwiki-tag-test/rendered/
+ echo '\[[!taglink a_tag]]' > ikiwiki-tag-test/raw/a_dir/a_page.mdwn
+ ikiwiki --verbose --plugin tag --plugin autoindex --plugin mdwn --set autoindex_commit=0 --set tagbase=tag --set tag_autocreate=1 --set tag_autocreate_commit=0 ikiwiki-tag-test/raw/ ikiwiki-tag-test/rendered/
+ ls -al ikiwiki-tag-test/raw/.ikiwiki/transient/
+ ls -al ikiwiki-tag-test/rendered/tag/
+
+Shouldn't `ikiwiki-tag-test/raw/.ikiwiki/transient/tag.mdwn` and `ikiwiki-tag-test/rendered/tag/index.html` exist?
+
+[[!tag patch]]
+[[!template id=gitbranch branch=smcv/ready/autoindex author=smcv]]
+[[!template id=gitbranch branch=smcv/ready/autoindex-more-often author=smcv]]
+
+> To have a starting point to (maybe) change this, my `ready/autoindex`
+> branch adds a regression test for the current behaviour, both with
+> and without `autoindex_commit` enabled. It also fixes an unnecessary
+> and potentially harmful special case for the transient directory.
+>
+> The fact that files in underlays (including transient files) don't
+> trigger autoindexing is deliberate. However, this is the second
+> request to change this behaviour: the first was
+> [[!debbug 611068]], which has a patch from Tuomas Jormola.
+> On that bug report, Joey explains why it's undesirable
+> for the original behaviour of autoindex (when the
+> index isn't transient).
+>
+> I'm not sure whether the same reasoning still applies when the
+> index is transient, though (`autoindex_commit => 0`),
+> because the index pages won't be cluttering up people's
+> git repositories any more? My `autoindex-more` branch changes
+> the logic so it will do what you want in the `autoindex_commit => 0`
+> case, and amends the appropriate regression test. --[[smcv]]
diff --git a/doc/bugs/transitive_dependencies.mdwn b/doc/bugs/transitive_dependencies.mdwn
new file mode 100644
index 000000000..c44fe7962
--- /dev/null
+++ b/doc/bugs/transitive_dependencies.mdwn
@@ -0,0 +1,94 @@
+If a sidebar contains a map, or inline (etc), one would expect an
+add/remove of any of the mapped/inlined pages to cause a full wiki
+rebuild. But this does not happen.
+
+If page A inlines page B, which inlines page C, a change to C will cause B
+to be updated, but A will not "notice" that this means A needs to be
+updated.
+
+One way to look at this bug is that it's a bug in where dependencies are
+recorded when preprocessing the rendered or sidebar page. The current code
+does:
+
+ add_depends($params{page}, $somepage);
+
+Where `$params{page}` is page B. If this is changed to `$params{destpage}`,
+then the dependency is added to page A, and updates to C cause it to
+change. This does result in the A pages getting lots more dependency info
+recorded than before (essentially a copy of all the B pages' dependency info).
+
+It's also fragile, since all plugins that handle dependencies have to be
+changed, and do this going forward. And it seems non-obvious that this should
+be done. Or really, whether to use `page` or `destpage` there. Currently,
+making the "wrong" choice and using `destpage` instead of `page` (which nearly
+everything uses) will just result in semi-redundant dependency info being
+recorded. If we make destpage mandatory to fix this, goofing up will lead to
+this bug coming back. Ugh.
+
+----
+
+## rebuild = change approach
+
+[[!template id=gitbranch branch=origin/transitive-dependencies author="[[joey]]"]]
+
+Another approach to fix it is to say that anything that causes a
+rebuild of B is treated as a change of B. Then when C is changed, B is
+rebuilt due to dependencies, and in turn this means A is rebuilt because B
+"changed".
+
+This is essentially what is done with wikilinks now, and why, if a sidebar
+links to page C, add/remove of C causes all pages to be rebuilt, as seen
+here:
+
+ removing old page meep
+ building sidebar.mdwn, which links to meep
+ building TourBusStop.mdwn, which depends on sidebar
+ building contact.mdwn, which depends on sidebar
+ ...
+
+Downsides here:
+
+* Means a minimum of 2x as much time spent resolving dependencies,
+ at least in my simple implementation, which re-runs the dependency
+ resolution loop until no new pages are rebuilt.
+ (I added an optimisation that gets it down to 1.5X as much work on
+ average, still 2x as much worst case. I suppose building a directed
+ graph and traversing it would be theoretically more efficient.)
+* Causes extra work for some transitive dependencies that we don't
+ actually care about. This is ameliorated, but not solved by
+ the current work on [[todo/dependency_types]].
+ For example, changing index causes
+ plugins/brokenlinks to update in the first pass; if there's a second
+ pass, plugins/map is no longer updated (contentless dependencies FTW),
+ but plugins is, because it depends on plugins/brokenlinks.
+ (Of course, this is just a special case of the issue that a real
+ modification to plugins/brokenlinks causes an unnecessary update of
+ plugins, and could be solved by adding more dependency types.)
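The simple implementation mentioned above, which re-runs dependency resolution until no new pages are rebuilt, can be sketched as a fixed-point loop (a toy model, not ikiwiki's actual code):

```python
# Toy fixed-point sketch of "rebuild = change": every rebuilt page
# is treated as changed, and resolution repeats until nothing new
# needs building.
def resolve(changed, depends):
    """depends maps page -> set of pages it depends on."""
    built = set(changed)
    frontier = set(changed)
    while frontier:
        triggered = {page for page, deps in depends.items()
                     if deps & frontier and page not in built}
        built |= triggered
        frontier = triggered
    return built

# A inlines B, B inlines C: a change to C now reaches A via B.
print(sorted(resolve({"C"}, {"A": {"B"}, "B": {"C"}})))  # ['A', 'B', 'C']
```

Each pass only follows one level of the dependency chain, which is why a second (or third) pass is needed for transitive effects, and why the worst case costs roughly 2x the resolution work.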
+
+[[done]] --[[Joey]]
+
+> Some questions/comments... I've thought about this a lot for [[todo/tracking_bugs_with_dependencies]].
+>
+> * When you say that anything that causes a rebuild of B is treated as a change of B, are you: i) Treating
+> any rebuild as a change, or ii) Treating any rebuild that gives a new result as a change? Option ii) would
+> lead to fewer rebuilds. Implementation is easy: when you're about to rebuild a page, load the old rendered html in. Do the rebuild. Compare
+> the new and old html. If there is a difference, then mark that page as having changed. If there is no difference
+> then you don't need to mark that page as changed, even though it has been rebuilt. (This would ignore changes in meta-data that don't
+> cause changes in html, but I don't think that is a huge issue.)
+
+>> That is a good idea. I will have to look at it to see if the overhead of
+>> reading back in the html of every page before building actually is a
+>> win though. So far, I've focused on avoiding unnecessary rebuilds, and
+>> there is still some room for more dependency types doing so.
+>> (Particularly for metadata dependencies..) --[[Joey]]
+
+> * The second comment I have relates to cycles in transitive dependencies. At the moment I don't think this is
+> possible, but with some additions it may well become so. This could be problematic as it could lead to a)
+> updates that never complete, or b) it being theoretically unclear what the final result should be (i.e. you
+> can construct logical paradoxes in the system). I think the point above about marking things as changed only when
+> the output actually changes fixes any cases that are well defined. For logical paradoxes and infinite loops (e.g.
+> two pages that include each other), you might want to put a limit on the number of times you'll rebuild a page in any
+> given run of ikiwiki. Say, only allow a page to rebuild twice on any run, regardless of whether a page it depends on changes.
+> This is not a perfect solution, but would be a good approximation. -- [[Will]]
+
+>> Ikiwiki only builds any given output file once per run, already. --[[Joey]]
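Will's option (ii) above — only propagating a rebuild as a "change" when the rendered output actually differs — can be sketched as follows (`render` is a hypothetical stand-in for ikiwiki's page rendering, not a real API):

```python
# Sketch of option (ii): a rebuild only counts as a change for
# dependents if the rendered output differs from the previous run.
def rebuild(page, old_html, render):
    new_html = render(page)
    return new_html, new_html != old_html

html, changed = rebuild("B", "<p>x</p>", lambda p: "<p>x</p>")
print(changed)  # False: B was rebuilt, but its dependents are untouched
```

The cost trade-off Joey mentions is the read-back of the old HTML for every rebuilt page, paid even when the output did change.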
diff --git a/doc/bugs/trouble_with_base_in_search.mdwn b/doc/bugs/trouble_with_base_in_search.mdwn
new file mode 100644
index 000000000..ca6a6c5cc
--- /dev/null
+++ b/doc/bugs/trouble_with_base_in_search.mdwn
@@ -0,0 +1,60 @@
+For security reasons, one of the sites I'm in charge of uses a Reverse Proxy to grab the content from another machine behind our firewall.
+Let's call the out-facing machine Alfred and the one behind the firewall Betty.
+
+For the static pages, everything is fine. However, when trying to use the search, all the links break.
+This is because, when Alfred passes the search query on to Betty, the search result has a "base" tag which points to Betty, and all the links to the "found" pages are relative.
+So we have
+
+ <base href="Betty.example.com"/>
+ ...
+ <a href="./path/to/found/page/">path/to/found/page</a>
+
+This breaks things for anyone on Alfred, because Betty is behind a firewall and they can't get there.
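To illustrate why the base tag breaks things: browsers resolve the relative result links against it, following the same RFC 3986 rules as Python's `urljoin` (shown here for illustration only):

```python
# Relative search-result links resolve against the <base> href, so a
# base pointing at the internal host sends every click to Betty.
from urllib.parse import urljoin

print(urljoin("http://betty.example.com/", "./path/to/found/page/"))
# http://betty.example.com/path/to/found/page/
```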
+
+What would be better is if it were possible to have a "base" which didn't reference the hostname, and for the "found" links not to be relative.
+Something like this:
+
+ <base href="/"/>
+ ...
+ <a href="/path/to/found/page/">path/to/found/page</a>
+
+The workaround I've come up with is this.
+
+1. Set the "url" in the config to '&nbsp;' (a single space). It can't be empty because too many things complain if it is.
+2. Patch the search plugin so that it saves an absolute URL rather than a relative one.
+
+Here's a patch:
+
+ diff --git a/IkiWiki/Plugin/search.pm b/IkiWiki/Plugin/search.pm
+ index 3f0b7c9..26c4d46 100644
+ --- a/IkiWiki/Plugin/search.pm
+ +++ b/IkiWiki/Plugin/search.pm
+ @@ -113,7 +113,7 @@ sub indexhtml (@) {
+ }
+ $sample=~s/\n/ /g;
+
+ - my $url=urlto($params{destpage}, "");
+ + my $url=urlto($params{destpage}, undef);
+ if (defined $pagestate{$params{page}}{meta}{permalink}) {
+ $url=$pagestate{$params{page}}{meta}{permalink}
+ }
+
+It works for me, but it has the odd side-effect of prefixing links with a space. Fortunately that doesn't seem to break browsers.
+And I'm sure someone else could come up with something better and more general.
+
+--[[KathrynAndersen]]
+
+> The `<base href>` is required to be genuinely absolute (HTML 4.01 §12.4).
+> Have you tried setting `url` to the public-facing URL, i.e. with `alfred`
+> as the hostname? That seems like the cleanest solution to me; if you're
+> one of the few behind the firewall and you access the site via `betty`
+> directly, my HTTP vs. HTTPS cleanup in recent versions should mean that
+> you rarely get redirected to `alfred`, because most URLs are either
+> relative or "local" (start with '/'). --[[smcv]]
+
+>> I did try setting `url` to the "Alfred" machine, but that doesn't seem clean to me at all, since it forces someone to go to Alfred when they started off on Betty.
+>> Even worse, it prevents me from setting up a test environment on, say, Cassandra, because as soon as one tries to search, one goes to Alfred, then Betty, and not back to Cassandra at all.
+>> Hardcoded solutions make me nervous.
+
+>> I suppose what I would like would be to not need to use a `<base href>` in searching at all.
+>> --[[KathrynAndersen]]
diff --git a/doc/bugs/txt_plugin_having_problems_with_meta_directives.mdwn b/doc/bugs/txt_plugin_having_problems_with_meta_directives.mdwn
new file mode 100644
index 000000000..22224483e
--- /dev/null
+++ b/doc/bugs/txt_plugin_having_problems_with_meta_directives.mdwn
@@ -0,0 +1,19 @@
+When applying my usual copyright and licensing header to a [[plugins/txt]]
+page, garbled output is created.
+
+Here is the header:
+
+ \[[meta copyright="Copyright © 2001, 2002, 2003, 2004, 2005, 2008 Free
+ Software Foundation, Inc."]]
+
+ \[[meta license="""[[toggle id="license" text="GFDL 1.2+"]][[toggleable
+ id="license" text="Permission is granted to copy, distribute and/or modify
+ this document under the terms of the GNU Free Documentation License,
+ Version 1.2 or any later version published by the Free Software Foundation;
+ with no Invariant Sections, no Front-Cover Texts, and no Back-Cover Texts.
+ A copy of the license is included in the section entitled
+ [[GNU_Free_Documentation_License|/fdl]]."]]"""]]
+
+--[[tschwinge]]
+
+> [[done]], made it less zealous about encoding html entities. --[[Joey]]
diff --git a/doc/bugs/typo_in_ikiwiki.setup.mdwn b/doc/bugs/typo_in_ikiwiki.setup.mdwn
new file mode 100644
index 000000000..a7b10ec8c
--- /dev/null
+++ b/doc/bugs/typo_in_ikiwiki.setup.mdwn
@@ -0,0 +1,9 @@
+Dot is automatically inserted. The htmlext option should be
+
+ #htmlext => 'htm',
+
+instead of
+
+ #htmlext => '.htm',
+
+> [[done]] --[[Joey]]
diff --git a/doc/bugs/typo_in_skeleton.pm:_sessionncgi.mdwn b/doc/bugs/typo_in_skeleton.pm:_sessionncgi.mdwn
new file mode 100644
index 000000000..4772aceee
--- /dev/null
+++ b/doc/bugs/typo_in_skeleton.pm:_sessionncgi.mdwn
@@ -0,0 +1,5 @@
+skeleton.pm.example contains the typo "sessionncgi" when defining a sub, which means the skeleton plugin won't work as a session CGI action as-is. [My repository has a patch on the 'trivia' branch](http://git.debian.org/?p=users/smcv/ikiwiki.git;a=commitdiff;h=72ffc85d). --[[smcv]]
+
+[[!tag patch]]
+
+[[done]]
diff --git a/doc/bugs/undefined_tags_or_mismatched_tags_won__39__t_get_converted.mdwn b/doc/bugs/undefined_tags_or_mismatched_tags_won__39__t_get_converted.mdwn
new file mode 100644
index 000000000..b01fc44f2
--- /dev/null
+++ b/doc/bugs/undefined_tags_or_mismatched_tags_won__39__t_get_converted.mdwn
@@ -0,0 +1,46 @@
+If you put in something such as undefined tags or mismatched tags in .mdwn file, ikiwiki will put &lt;p>&lt;/p> around them. But ikiwiki will NOT convert < and > to &amp;lt; and &amp;gt;!
+
+ <section>
+
+ some text
+
+ </section>
+
+
+the output html
+
+ <p><section></p> <p>some text</p> <p></section></p>
+
+And another example of mismatched tags:
+
+
+
+ <div>
+
+ some text
+
+ </div>
+ </div>
+
+
+The output is:
+
+ <div>
+
+ some text
+
+ </div>
+
+ <p></div></p>
+
+> This is a bug in markdown. Actually, not converting `<` and `>` in tags is a
+> markdown feature -- markdown allows inserting arbitrary html, even if it's
+> made-up tags. And putting paragraph tags around your `<section>` tag is
+> understandable, since markdown can't know if `<section>` is intended to
+> be a block-level tag or not. The bug is that it puts the `<p>` around the
+> trailing `<div>` -- it does know what a div is, and it should know that's
+> illegal and not do it. I've filed a [bug report](http://bugs.debian.org/459269) about that issue
+> alone. If you feel the other things you brought up are bugs, please talk
+> to the markdown maintainer. --[[Joey]]
+
+[[!tag done]]
diff --git a/doc/bugs/undefined_value_as_a_HASH_reference.mdwn b/doc/bugs/undefined_value_as_a_HASH_reference.mdwn
new file mode 100644
index 000000000..228c3baac
--- /dev/null
+++ b/doc/bugs/undefined_value_as_a_HASH_reference.mdwn
@@ -0,0 +1,68 @@
+Hello,
+
+does anyone have an idea why I see the following error when I run websetup (Setup button in Preferences)?
+
+ Error: Can't use an undefined value as a HASH reference at /usr/share/perl5/IkiWiki/Plugin/websetup.pm line 82, line 97.
+
+Maybe, related to this is also
+
+ $ ikiwiki --setup /etc/ikiwiki/auto-blog.setup
+ What will the blog be named? tmpblog
+ What revision control system to use? git
+ What wiki user (or openid) will be admin? wsh
+
+
+ Setting up tmpblog ...
+ Importing /home/wsh/tmpblog into git
+ Initialized empty shared Git repository in /home/wsh/tmpblog.git/
+ Initialized empty Git repository in /home/wsh/tmpblog/.git/
+ [master (root-commit) d6847e1] initial commit
+ 8 files changed, 48 insertions(+)
+ create mode 100644 .gitignore
+ create mode 100644 archives.mdwn
+ create mode 100644 comments.mdwn
+ create mode 100644 index.mdwn
+ create mode 100644 posts.mdwn
+ create mode 100644 posts/first_post.mdwn
+ create mode 100644 sidebar.mdwn
+ create mode 100644 tags.mdwn
+ Counting objects: 11, done.
+ Delta compression using up to 4 threads.
+ Compressing objects: 100% (9/9), done.
+ Writing objects: 100% (11/11), 1.53 KiB, done.
+ Total 11 (delta 0), reused 0 (delta 0)
+ Unpacking objects: 100% (11/11), done.
+ To /home/wsh/tmpblog.git
+ * [new branch] master -> master
+ Directory /home/wsh/tmpblog is now a clone of git repository /home/wsh/tmpblog.git
+ Reference found where even-sized list expected at /usr/share/perl5/IkiWiki/Setup.pm line 177, <GEN4> line 97.
+ Reference found where even-sized list expected at /usr/share/perl5/IkiWiki/Setup.pm line 224, <GEN4> line 97.
+ Use of uninitialized value $section in hash element at /usr/share/perl5/IkiWiki/Setup.pm line 226, <GEN4> line 97.
+ Use of uninitialized value $section in hash element at /usr/share/perl5/IkiWiki/Setup.pm line 227, <GEN4> line 97.
+ Use of uninitialized value $section in concatenation (.) or string at /usr/share/perl5/IkiWiki/Setup.pm line 233, <GEN4> line 97.
+ /etc/ikiwiki/auto-blog.setup: Can't use an undefined value as a HASH reference at /usr/share/perl5/IkiWiki/Setup.pm line 252, <GEN4> line 97.
+
+ usage: ikiwiki [options] source dest
+ ikiwiki --setup configfile
+
+I'm on Debian unstable.
+
+Thanks,
+-Michal
+
+> Some plugin has a broken getsetup hook, and is feeding a corrupted setup list in. Both the websetup and the auto.setup files cause all plugins to be loaded and all their setup to be available.
+>
+> This command will help you find the plugin. Here it prints some noise around the rst plugin, for unrelated reasons,
+> but what you're looking for is the plugin printed before the "even sized list" message.
+
+<pre>
+perl -le 'use warnings; use strict; use Data::Dumper; use IkiWiki; %config=IkiWiki::defaultconfig(); use IkiWiki::Setup; my @s=IkiWiki::Setup::getsetup(); foreach my $pair (@s) { print "plugin ".$pair->[0]; my $setup=$pair->[1]; if ($pair->[0] eq "rst") { print Dumper($setup)} my %s=@{$setup} }'
+</pre>
+
+> I was able to replicate this by making a plugin's getsetup hook return a list reference, rather than a list,
+> and have put in a guard against that sort of thing.
+> --[[Joey]]
+
+>> Thanks. Your command didn't help me, but with a trial and error approach I found that the culprit was an old version of the asciidoc plugin. For some reason, asciidoc was never listed in the output of the command. --[[wentasah]]
+
+>>> Ok. My fix should prevent the problem, so [[done]] --[[Joey]]
diff --git a/doc/bugs/underlaydir_file_expose.mdwn b/doc/bugs/underlaydir_file_expose.mdwn
new file mode 100644
index 000000000..4ee30e39d
--- /dev/null
+++ b/doc/bugs/underlaydir_file_expose.mdwn
@@ -0,0 +1,13 @@
+If a file in the srcdir is removed, exposing a file in the underlaydir,
+ikiwiki will not notice the removal, and the
+page from the underlay will not be built. (However, it will be if the wiki
+gets rebuilt.)
+
+> This problem is caused by ikiwiki storing only filenames relative to
+> the srcdir or underlay, and mtime comparison not handling this case.
+
+> A related problem occurs if changing a site's theme with the
+> [[plugins/theme]] plugin. The style.css of the old and new theme
+> often has the same mtime, so ikiwiki does not update it w/o a rebuild.
+> This is worked around in theme.pm with a special-purpose needsbuild hook.
+> --[[Joey]]
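The mtime trap described above can be sketched as a toy model (not ikiwiki's code): after the srcdir copy is removed, the exposed underlay file's mtime predates the last build, so a naive freshness check skips it.

```python
# Toy model: a file is considered "dirty" only if its mtime is newer
# than the last build. An underlay file exposed by deleting the
# srcdir copy is old, so it never looks dirty and is never rebuilt.
def needs_build(mtime, last_build):
    return mtime > last_build

last_build = 100
underlay_file_mtime = 50  # shipped long ago, older than last build
print(needs_build(underlay_file_mtime, last_build))  # False: missed
```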
diff --git a/doc/bugs/unicode_chars_in_wikiname_break_auth.mdwn b/doc/bugs/unicode_chars_in_wikiname_break_auth.mdwn
new file mode 100644
index 000000000..472597c46
--- /dev/null
+++ b/doc/bugs/unicode_chars_in_wikiname_break_auth.mdwn
@@ -0,0 +1,20 @@
+I spent hours trying to understand why my wiki suddenly refused me to log in (using passwordauth).
+The failure message was always: `login failed, perhaps you need to turn on cookies?`
+
+Inspecting the cookie information (thanks to Iceweasel's webdeveloper add-on), I realized there were some weird-looking encoded chars in the cookie name.
+
+Replacing "·" with "-" in `wikiname` fixed this login issue.
+
+> Hmm, Recai sent me a patch a long time ago to handle utf-8 here by encoding
+> the wikiname. But it doesn't seem to work, somehow the encoded utf-8
+> value still doesn't make it through. (CGI::Session seems to have undetermined utf-8
+> issues too.) Seems like I will have to possibly break some sessions and
+> entity-encode the wikiname in the cookie.. [[done]]. --[[Joey]]
+
+>> I confirm it fixes the bug for me. --[[intrigeri]]
+
+(BTW, such a char was replaced by -I don't remember what encoding thingie- in my setup file, when running `ikiwiki-transition setupformat`.)
+
+> Thanks for the heads up, fixed that too. --[[Joey]]
+
+>> I confirm it fixes the bug for me. --[[intrigeri]]
diff --git a/doc/bugs/unicode_encoded_urls_and_recentchanges.mdwn b/doc/bugs/unicode_encoded_urls_and_recentchanges.mdwn
new file mode 100644
index 000000000..262aa24fc
--- /dev/null
+++ b/doc/bugs/unicode_encoded_urls_and_recentchanges.mdwn
@@ -0,0 +1,38 @@
+it appears that unicode characters in the title that are unicode letters are spared the __ filename encoding but instead saved in their utf8 encoding. (correct me if i'm wrong; didn't find the code that does this.) -- see below for examples.
+
+> Filenames can have any alphanumerics in them without the __ escaping.
+> Your locale determines whether various unicode characters are considered
+> alphanumeric. In other words, it just looks at the \[[:alpha:]] character
+> class, whatever your locale defines it to be. --[[Joey]]
+
+this is not a problem per se, but (at least with the git backend) the recent changes page misinterprets the file name character set (it seems to read the filenames as latin1) and both displays wrong titles and creates broken links.
+
+the problem can be shown with an auto-setup'd ikiwiki without cgi when manually creating utf8 encoded filenames and running ikiwiki with LANG=en_GB.UTF-8 .
+
+> Encoding issue, I figured out a fix. [[done]] --[[Joey]]
+
+>> the link text works now, but the link goes to
+>> `ikiwiki.cgi?page=uml%C3%A4ute&do=recentchanges_link`, which fails with
+>> "missing page". it seems that bestlink can't handle utf8 encoded texts. (the
+>> same happens, by the way, when using meta-redir to a page with high bytes in
+>> the name.)
+>>
+>>> The problem is that all cgi inputs have to be explicitly decoded to
+>>> utf-8, which I've now done for `recentchange_link`.
+>>>> thanks a lot, i think that closed the bug.
+>>>
+>>> I cannot, however, reproduce a problem with meta redir. Here it
+>>> generated the following html, which redirected the browser ok:
+>>> <meta http-equiv="refresh" content="0; URL=./../â/" />
+>>>> sorry, my fault -- it was the blank which needed to be replaced by an
+>>>> underscore, not the high byte character
+>>
+>> update: i've had a look at the git options; you could run git with '-z' (NUL
+>> termination) in the `git_commit_info` function; this would require some
+>> changes in `parse_diff_tree`, but otherwise completely eliminate the
+>> problems with git escaping.
+>>
+>>> If you would like to develop a patch to that effect, I'd be glad to
+>>> drop the current nasty code.
+>>>> i'll have a look, but i'm afraid that's above my current perl skills.
+>>>> --[[chrysn]]
diff --git a/doc/bugs/unrecognized___34__do__61__blog__34___CGI_parameter_when_creating_todo_item.mdwn b/doc/bugs/unrecognized___34__do__61__blog__34___CGI_parameter_when_creating_todo_item.mdwn
new file mode 100644
index 000000000..934f1480e
--- /dev/null
+++ b/doc/bugs/unrecognized___34__do__61__blog__34___CGI_parameter_when_creating_todo_item.mdwn
@@ -0,0 +1,18 @@
+While I am creating this entry, the following appears
+as the page title, above the form:
+
+ ikiwiki/ creating unrecognized "do=blog" CGI parameter when creating todo item
+
+And the following appears below:
+
+ Content-type: text/html
+ ikiwiki/ Error
+
+ Error: unknown do parameter
+
+This is harmless.
+I have to go, but will have a look at what could be going on when I'm back.
+
+--[[JeremieKoenig]]
+
+[[fixed|done]] --[[Joey]]
diff --git a/doc/bugs/unwanted_discussion_links_on_discussion_pages.mdwn b/doc/bugs/unwanted_discussion_links_on_discussion_pages.mdwn
new file mode 100644
index 000000000..c74a094ce
--- /dev/null
+++ b/doc/bugs/unwanted_discussion_links_on_discussion_pages.mdwn
@@ -0,0 +1,36 @@
+Background: some po translations (among them `fr.po`) translate "discussion" to an upper-cased word (in French: "Discussion").
+Incidentally, this is wanted e.g. in German, where such a noun has to be written with an upper-cased "D", but I cannot see
+the logic behind the added "D" in French.
+
+Anyway, this gettext-translated word is used to name the discussion pages, as `$discussionlink` in `Render.pm` is
+built from `gettext("discussion")`. In the same piece of code, a case-sensitive regexp tests whether the page
+being rendered is a discussion page.
+
+On the other hand, new discussion pages are created with a name built from `gettext("Discussion")` (please note the upper-cased
+"D"). Such a new page name seems to be automagically downcased.
+
+This leads to newly created discussion pages not being recognized as discussion pages by the
+`$page !~ /.*\/\Q$discussionlink\E$/` regexp, so that they end up with an unwanted discussion link.
+
+A simple fix that seems to work is to make this regexp case-insensitive:
+
+ git diff IkiWiki/Render.pm
+ diff --git a/IkiWiki/Render.pm b/IkiWiki/Render.pm
+ index adae9f0..093c25b 100644
+ --- a/IkiWiki/Render.pm
+ +++ b/IkiWiki/Render.pm
+ @@ -77,7 +77,7 @@ sub genpage ($$) {
+ }
+ if ($config{discussion}) {
+ my $discussionlink=gettext("discussion");
+ - if ($page !~ /.*\/\Q$discussionlink\E$/ &&
+ + if ($page !~ /.*\/\Q$discussionlink\E$/i &&
+ (length $config{cgiurl} ||
+ exists $links{$page."/".$discussionlink})) {
+ $template->param(discussionlink => htmllink($page, $page, gettext("Discussion"), noimageinline => 1, forcesubpage => 1));
+
+But the best way would be to avoid implicitly assuming that translators will translate "discussion" and "Discussion" the same way.
+
+> [[done]] --[[Joey]]
+
+[[!tag patch]]
diff --git a/doc/bugs/urlto_API_change_breaks_wikis_with_po_plugin.mdwn b/doc/bugs/urlto_API_change_breaks_wikis_with_po_plugin.mdwn
new file mode 100644
index 000000000..4268a1390
--- /dev/null
+++ b/doc/bugs/urlto_API_change_breaks_wikis_with_po_plugin.mdwn
@@ -0,0 +1,98 @@
+The po plugin needs to be updated to match the urlto sub API and
+signature changes. Else a wiki with the po plugin enabled cannot be
+refreshed / rebuilt because of (correct) Perl errors.
+
+My po branch contains a fix.
+--[[intrigeri]]
+
+> The commit looks sane to me, for what it's worth. Joey, please
+> consider merging? --[[smcv]]
+
+>> Merged. --[[Joey]]
+
+Also, I fear the lack of any useful `$from` parameter might break some
+l10n'd link niceness when using `po_link_to = current` but I have not
+investigated this yet.
+--[[intrigeri]]
+
+> If `urlto` is called without a second parameter, it means we need
+> a URL valid from either the CGI URL or any page in the wiki,
+> (so we'd previously have set the third parameter true), but we
+> don't *necessarily* need an absolute URL - so return what you'd
+> have returned if asked for an absolute URL, but looking like
+> `/bugs/` rather than `http://ikiwiki.info/bugs/` if possible.
+>
+> It looks as though `beautify_urlpath` under `po_link_to = current`,
+> and 3-argument `urlto`, aren't tested by `t/po.t` - perhaps you
+> could add some test cases there? To test 3-argument `urlto` you'd
+> need to add `$config{baseurl} = "http://example.com"` or
+> something. --[[smcv]]
+
+>> I'm leaving this bug report open until this can be checked. --[[Joey]]
+
+>>> My `ready/urlto` branch improves the test coverage. The bugfix from
+>>> that branch fixes most of `po` too, but leaves behind some perhaps
+>>> less-than-ideal behaviour: links where the current language is unknown,
+>>> with `po_link_to = current`, always go to the master language,
+>>> whereas perhaps it'd be better to go to the negotiated language in
+>>> this case? --[[smcv]]
+
+>>>> Thanks for taking care, thanks for these improvements!
+>>>>
+>>>> OTOH I consider any of these behaviours (either the brand new one
+>>>> = link to master language, or the alternative one = link to
+>>>> negotiated) as a regression. Any of these is contrary to what
+>>>> `po_link_to = current` is supposed to do according to the
+>>>> documentation.
+>>>>
+>>>> Let's be less technical, let me display my practical usecase
+>>>> (making this possible was one of the main reasons I initially
+>>>> implemented `po_link_to = current`).
+>>>>
+>>>> Summary: the current state of things is an annoying regression
+>>>> and it needs to be fixed.
+>>>>
+>>>> Context: I participate in building a Live system based on Debian
+>>>> Live; the project's multilingual website
+>>>> ([T(A)ILS](https://amnesia.boum.org/)) is built using ikiwiki. A
+>>>> static / offline copy is shipped on ISO images; this is the way
+>>>> end-user documentation lands on the CDs. Note that no webserver
+>>>> runs on the Live system to serve this wiki, so `po_link_to =
+>>>> current` is compulsory. A user can choose her preferred language
+>>>> at boot time. Depending on her decision, the desktop shortcut
+>>>> that points to the embedded documentation (i.e. the static wiki)
+>>>> links to a different entry point.
+>>>>
+>>>> The previous (documented) behaviour was deadly simple: if I am
+>>>> presented a page in English (master language) it means it does
+>>>> not exist in my preferred language; the computer always displays
+>>>> me the best available version according to my needs. The new
+>>>> behaviour brings a troubling seemingly random factor into the
+>>>> user navigation experience and IMHO is a mess from a web
+>>>> ergonomics point of view (no content negotiation available,
+>>>> remember): I am sometimes shown an English page although it is
+>>>> fully translated into my language one click away, and at other
+>>>> times I am shown the optimal page. This is, well,
+>>>> interesting. This practically forces the non-English speaking
+>>>> website visitor to check the otherlanguages list on every single
+>>>> page to make sure *herself* there is nothing better available,
+>>>> and sometimes click on her preferred language link to get a page
+>>>> she actually can read.
+>>>>
+>>>> I unfortunately might not be able to dedicate the needed time to
+>>>> help fix this in a timely manner, so I don't want to urge anyone.
+>>>> Take care! --[[intrigeri]]
+
+>>>>> I can see why this is bad, but to the best of my knowledge it's
+>>>>> not a regression: each of the calls to 1-argument `urlto` was
+>>>>> previously a call to 3-argument `urlto`, which always produces
+>>>>> a fully absolute URL, so in either case there isn't enough
+>>>>> context to know the current language. Links that were previously
+>>>>> 2-argument `urlto` still have a defined second argument;
+>>>>> I've just edited `plugins/write` to clarify why the second
+>>>>> argument should be provided whenever possible. --[[smcv]]
+
+>>>>>> Ok. I am sorry for the burden that arose from my
+>>>>>> misunderstanding. No need to keep this bug open then =>
+>>>>>> [[done]] --[[intrigeri]]
diff --git a/doc/bugs/urlto__40____34____34____44___...__44___1__41___not_absolute.mdwn b/doc/bugs/urlto__40____34____34____44___...__44___1__41___not_absolute.mdwn
new file mode 100644
index 000000000..8a93848b3
--- /dev/null
+++ b/doc/bugs/urlto__40____34____34____44___...__44___1__41___not_absolute.mdwn
@@ -0,0 +1,9 @@
+[[!template id=gitbranch branch=smcv/ready/urlto author="[[Simon_McVittie|smcv]]"]]
+[[!tag patch]]
+
+urlto() has a special-case for a zero-length first argument, but it
+produces a relative path even if the third argument is given and true.
+
+My `ready/urlto` branch simplifies this special case so it works. --[[smcv]]
+
+[[merged|done]] --[[Joey]]
diff --git a/doc/bugs/user_links_on_recentchanges_pages_problem.mdwn b/doc/bugs/user_links_on_recentchanges_pages_problem.mdwn
new file mode 100644
index 000000000..d00f6815b
--- /dev/null
+++ b/doc/bugs/user_links_on_recentchanges_pages_problem.mdwn
@@ -0,0 +1,12 @@
+When I click on a linked username for commits on the recentchanges page (on
+the live ikiwiki.info) I get a link such as
+<http://ikiwiki.info/ikiwiki.cgi?page=users%2Fjoey&do=recentchanges_link>
+which returns something like
+
+ <a href="http://ikiwiki.info">ikiwiki</a>/ Error
+ <p class="error">Error: unknown do parameter
+
+-- [[Jon]]
+
+> That was fixed in 3.04 but ikiwiki.info had not been upgraded to it yet.
+> [[done]] --[[Joey]]
diff --git a/doc/bugs/utf-8_bug_in_websetup.pm.mdwn b/doc/bugs/utf-8_bug_in_websetup.pm.mdwn
new file mode 100644
index 000000000..debedb01c
--- /dev/null
+++ b/doc/bugs/utf-8_bug_in_websetup.pm.mdwn
@@ -0,0 +1,22 @@
+[[!tag patch bugs]]
+
+I type Chinese characters into the fields. After pressing the "save setup" button, the characters turn into gibberish.
+
+I submit a patch that solves the problem for me. --Lingo
+
+> Fully fixing it is slightly more complex, but now [[done]] --[[Joey]]
+
+----
+
+ --- websetup.pm 2009-12-02 05:07:46.000000000 +0800
+ +++ /usr/share/perl5/IkiWiki/Plugin/websetup.pm 2010-01-08 22:05:16.000000000 +0800
+ @@ -308,7 +308,8 @@
+ $fields{$_}=$shown{$_} foreach keys %shown;
+ }
+ }
+ -
+ +
+ + IkiWiki::decode_form_utf8($form);
+ if ($form->submitted eq "Cancel") {
+ IkiWiki::redirect($cgi, $config{url});
+ return;
diff --git a/doc/bugs/utf8_html_templates.mdwn b/doc/bugs/utf8_html_templates.mdwn
new file mode 100644
index 000000000..a750b23f6
--- /dev/null
+++ b/doc/bugs/utf8_html_templates.mdwn
@@ -0,0 +1,22 @@
+HTML::Template does not read files as utf-8, so modifying ikiwiki's
+template files to contain utf-8 won't currently work.
+
+It seems that the best way to fix this would be to make HTML::Template
+support utf-8.
+
+A workaround is to change all the template reading code like this:
+
+ - my $template=HTML::Template->new(blind_cache => 1,
+ - filename => "$config{templatedir}/page.tmpl");
+ + open(TMPL, "<:utf8", "$config{templatedir}/page.tmpl");
+ + my $template=HTML::Template->new(filehandle => *TMPL);
+ + close(TMPL);
+
+However, this will make ikiwiki slower when rebuilding a wiki, since it
+won't cache templates.
+
+This could be approached by using HTML::Template's support for filters: just make it use a filter that turns on utf-8.
+
+Or by subclassing it and overriding the \_init\_template method, though that's a bit uglier.
+
+[[bugs/done]]
diff --git a/doc/bugs/utf8_svn_log.mdwn b/doc/bugs/utf8_svn_log.mdwn
new file mode 100644
index 000000000..abd957719
--- /dev/null
+++ b/doc/bugs/utf8_svn_log.mdwn
@@ -0,0 +1,11 @@
+svn log messages containing utf-8 (such as r773) don't get displayed
+right in RecentChanges. The problem is that ikiwiki runs svn log in locale C,
+which makes it spit out escaped characters for utf-8 chars. If it were run in
+locale en_US.UTF-8, it would be ok, but that would require the system to
+have that locale.
+
+Seems that the right fix for this is to use `svn log --xml`, which is
+always utf-8, and to come up with a parser for that. This also fixes the spoofing
+issue in [[security]].
+
+[[bugs/done]]
diff --git a/doc/bugs/web_reversion_on_ikiwiki.info.mdwn b/doc/bugs/web_reversion_on_ikiwiki.info.mdwn
new file mode 100644
index 000000000..6f18cfcba
--- /dev/null
+++ b/doc/bugs/web_reversion_on_ikiwiki.info.mdwn
@@ -0,0 +1,14 @@
+I created [[sandbox/revert me]] and then tried the revert button on
+[[recentchanges]], but I was not allowed to revert it. The specific error
+was
+
+ Error: you are not allowed to change sandbox/revert_me.mdwn
+
+I've just tried reading through the revert code, and I haven't figured out
+what permission I am lacking. Perhaps the error message could be a little
+clearer on that. The error might have been thrown by git_parse_changes in
+git.pm or check_canchange in IkiWiki.pm, via IkiWiki::Receive. -- Jon
+
+[[fixed|done]] --[[Joey]]
+
+: Brilliant, many thanks. -- [[Jon]]
diff --git a/doc/bugs/websetup_eats_setupconf_and_allow__95__symlinks__95__before__95__srcdir.mdwn b/doc/bugs/websetup_eats_setupconf_and_allow__95__symlinks__95__before__95__srcdir.mdwn
new file mode 100644
index 000000000..47063d5cf
--- /dev/null
+++ b/doc/bugs/websetup_eats_setupconf_and_allow__95__symlinks__95__before__95__srcdir.mdwn
@@ -0,0 +1,21 @@
+My web server runs in a chroot jail. This makes things interesting because the paths are slightly different depending on whether you are inside or outside the chroot.
+
+To override an incorrectly guessed path, I set setupconf in the .setup file. I also set allow_symlinks_before_srcdir=>1. However, when I tried websetup, the setup file was correctly changed but these important settings disappeared. This seems like a bug.
+
+> I don't know what "setupconf" is. This is the first mention of it in the
+> ikiwiki source tree.
+>
+> I've fixed the `allow_symlinks_before_srcdir` issue. --[[Joey]]
+
+I meant `setupfile`, as in `IkiWiki::Setup::dump($config{setupfile})` from `IkiWiki/Plugin/websetup.pm`.
+
+Sorry for the confusion.
+
+> Ok, that's an internal setting that I never envisioned someone digging
+> out and setting in their setup file. It could be made an exported config
+> option, but then every generated setup file will have this setting in it,
+> which will be at best redundant.
+>
+> Can you find another solution, such as a symlink, for your special case?
+
+I see your point. [[done]]
diff --git a/doc/bugs/weird_signature_in_match__95__included__40____41__.mdwn b/doc/bugs/weird_signature_in_match__95__included__40____41__.mdwn
new file mode 100644
index 000000000..cd9f27735
--- /dev/null
+++ b/doc/bugs/weird_signature_in_match__95__included__40____41__.mdwn
@@ -0,0 +1,7 @@
+Is this a bug? IkiWiki/Plugins/conditional.pm:
+
+`sub match_included ($$;$) { #{{{`
+
+The other match_XXX functions seem to take ($$;@). --Ethan
+
+> indeed, [[done]] --[[Joey]]
diff --git a/doc/bugs/weird_syntax_in_aggregate.pm.mdwn b/doc/bugs/weird_syntax_in_aggregate.pm.mdwn
new file mode 100644
index 000000000..632ff2c35
--- /dev/null
+++ b/doc/bugs/weird_syntax_in_aggregate.pm.mdwn
@@ -0,0 +1,9 @@
+ open (IN, "$config{wikistatedir}/aggregate" ||
+ die "$config{wikistatedir}/aggregate: $!");
+
+It looks like the intent was "open this file, and die if you can't",
+but I'm pretty sure it actually means "open this file and ignore errors
+silently". Shouldn't this be `open(IN, $file) || die "$file: $!";`
+(i.e. with the parens before the call to `die`)? --Ethan
+
+> Thanks, [[done]] --[[Joey]]
diff --git a/doc/bugs/wiki_formatting_does_not_work_between_toc_and_an_inline.mdwn b/doc/bugs/wiki_formatting_does_not_work_between_toc_and_an_inline.mdwn
new file mode 100644
index 000000000..29e03f2d6
--- /dev/null
+++ b/doc/bugs/wiki_formatting_does_not_work_between_toc_and_an_inline.mdwn
@@ -0,0 +1,30 @@
+Wiki formatting between `\[[!toc ]]` and an inline fails to render. The
+problem does not seem to trigger if the inline uses the titlepage template,
+or if it doesn't match any pages. See example below; also reproducible
+with a single-file wiki containing the text below, rendered via `ikiwiki
+--plugin toc`.
+
+> This is [[!debbug 421843]], and I suspect it affects certain other plugins
+> that also use empty divs as placeholders. It's fixed in markdown 1.0.2 b7
+> (available in debian experimental). So I'll [[close|done]] this as it's
+> not really an ikiwiki bug. --[[Joey]]
+
+[[!toc ]]
+
+**not bold**
+
+`not fixed-pitch`
+
+# heading not rendered
+
+[not a link](http://ikiwiki.info)
+
+[[!inline pages="news/*" description="Sparse News" show=1 feeds=no]]
+
+**bold**
+
+`fixed-pitch`
+
+# heading rendered
+
+[a link](http://ikiwiki.info)
diff --git a/doc/bugs/wiki_links_still_processed_inside_code_blocks.mdwn b/doc/bugs/wiki_links_still_processed_inside_code_blocks.mdwn
new file mode 100644
index 000000000..9f0a1d102
--- /dev/null
+++ b/doc/bugs/wiki_links_still_processed_inside_code_blocks.mdwn
@@ -0,0 +1,67 @@
+In [[ikiwiki/markdown]] syntax, none of the other special characters get processed
+inside a code block. However, in ikiwiki, [[wiki_links|ikiwiki/wikilink]] and
+[[preprocessor_directives|ikiwiki/directive]] still get processed
+inside a code block, requiring additional escaping. For example, `[links
+don't work](#here)`, but `a [[ikiwiki/wikilink]] becomes HTML`. --[[JoshTriplett]]
+
+Indented lines provide a good way to escape a block of text containing
+markdown syntax, but ikiwiki links like \[[this]] are still
+interpreted within such a block. I think that interpretation should not
+be happening. That is, I should be able to write:
+
+ [[this]]
+
+and have it render like:
+
+ \[[this]]
+
+--[[cworth]]
+
+----
+
+> Has there been any progress or ideas on this bug recently? I use an
+> expanded CamelCase regexp, and without heavy escaping in freelink text,
+> url links, or codeblocks, I get IkiWiki's attempt at creating a "link
+> within a link".
+>
+> I have no ideas other than perhaps once IkiWiki encounters \[\[ or the
+> position is reset with a backreference from a CamelCased word, further
+> processing of wikilinks is disabled until the position is reset and a "do
+> not makelinks" flag or variable is cleared.
+>
+> I've come up with some _really_ ugly workarounds to handle case-specific
+> stuff like codeblocks, but the problem creeps up again and again in
+> unexpected places. I'd be happy to come up with a patch if anyone has a
+> bright idea on a nice clean way (_in theory_) to fix this. I'm out of ideas.
+>
+> --CharlesMauch
+
+> I've moved the above comment here because it seems to be talking about
+> this bug, not the similar Smileys bug.
+>
+> In the case of either bug, no, I don't have an idea of a solution yet.
+> --[[Joey]]
+
+> I've now solved a similar bug involving the smiley plugin. The code used
+> there should give some strong hints how to fix this bug, though I haven't
+> tried to apply the method yet. --[[Joey]]
+
+>> As far as I can see, the smileys bug is solved by checking for code/pre. In
+>> this case, however, that is not applicable. WikiLinks/directives *should* be
+>> expanded before passing the text to the formatter, as their expansion may contain
+>> markup. Directives should be processed first, as they may provide *partial*
+>> markup (e.g. `template` ones) that makes no sense except in the page
+>> context. Links should be processed first because multimarkdown, at least, may
+>> try to expand them as anchor-links.
+>>
+>> For now, my partial solution is to restrict links to not have a space at the
+>> start; this way, in many cases, escaping in code may be done in a natural way
+>> without breaking copy-pastability. For example, the shell snippet 'if \[[ condition ]];'
+>> will work fine with this.
+>>
+>> Maybe directives can also be restricted to only be allowed on a line by
+>> themselves (not separated by blank lines, however) or something similar.
+>>
+>> --[[isbear]]
+
+[[!debbug 487397]]
diff --git a/doc/bugs/wiki_rebuild_should_throw_errors_if_the_configured_underlaydir_or_templatedir_don__39__t_exist.mdwn b/doc/bugs/wiki_rebuild_should_throw_errors_if_the_configured_underlaydir_or_templatedir_don__39__t_exist.mdwn
new file mode 100644
index 000000000..c8f6ed05f
--- /dev/null
+++ b/doc/bugs/wiki_rebuild_should_throw_errors_if_the_configured_underlaydir_or_templatedir_don__39__t_exist.mdwn
@@ -0,0 +1,15 @@
+I originally set up ikiwiki by using the Debian package, but had some odd issues, so I figured I'd try installing from git. To do that I uninstalled the Debian package and then did the Makefile dance from the git dir. In that process the original dirs configured as templatedir and underlaydir in my wiki were deleted; however, when rebuilding, the script just went ahead and did not even note the lack of those dirs. It would be nice if it threw errors if the dirs were configured but non-existent.
+
+> Hmm. This behavior was explicitly coded into ikiwiki for underlay dirs:
+> [commit](http://source.ikiwiki.branchable.com/?p=source.git;a=commitdiff;h=cb4b99929757f970d5ae697f0d09514ad624ed46).
+> Pity I didn't say why, but presumably there are cases
+> where one of the underlaydirs is expected to be missing, or where
+> this robustness of not crashing is needed.
+>
+> The situation with missing templatedirs is more clear: When
+> it's looking for a given template file it just tries to open it in each
+> directory in turn, and uses the first file found; checking that a
+> directory exists would be extra work and there's a nice error message if
+> a template cannot be found. --[[Joey]]
+
+>> I'd agree with the thought behind that ... if it actually had thrown an error. However it did not. How about just checking the config variables when the template and/or config is set up? --Mithaldu
diff --git a/doc/bugs/wikilink_in_table.mdwn b/doc/bugs/wikilink_in_table.mdwn
new file mode 100644
index 000000000..cca01718c
--- /dev/null
+++ b/doc/bugs/wikilink_in_table.mdwn
@@ -0,0 +1,36 @@
+I tried to create a wikilink in a table, but it does not work. Here is an example:
+
+ \[[!table class=table1 data="""
+ \[[wikilink_test|index]]
+ \[[wikilink_test\|index]]
+ [wikilink test](/servers/webmail1)
+ """]]
+
+The first two wikilink entries do not work.
+
+The last one is a URL link and it works, but it is not a wikilink. Or maybe it does not matter if I use URL links instead of wikilinks for local wiki content?
+
+> [[fixed|done]] --[[Joey]]
+
+>> works !! Great!
+
+What exactly is the difference between a wikilink and a URL reference to the same page?
+
+> ikiwiki will not be able to track pages linked using urls as having a
+> link.
+
+Trying to report this I found something weird. In the example I changed [[ to || because the wiki renders something wrongly. You can see what I tried originally here:
+
+ \[[!table class=table1 data="""
+ \[[wikilink_test|servers/webmail1]]
+ \[[wikilink_test|servers/webmail1]]
+ [wikilink test](/servers/webmail1)
+ """]]
+
+Please click edit to see the unrendered text. First, it is not monospaced (I have 4 spaces) and second, some weird html is shown...
+Am I doing something wrong?
+
+> See above for the right way to do it. Note that I also fixed a minor bug
+> in ikiwiki to allow this. --[[Joey]]
+
+>> Just curious ... if I wanted to have that block in monospace (four spaces in front of each line), how can I do that?
diff --git a/doc/bugs/word_wrap.mdwn b/doc/bugs/word_wrap.mdwn
new file mode 100644
index 000000000..95cfc1538
--- /dev/null
+++ b/doc/bugs/word_wrap.mdwn
@@ -0,0 +1,16 @@
+Web browsers don't word-wrap lines in submitted text, which makes editing a
+page that someone wrote in a web browser annoying (`gqip` is the vim user's
+friend here). Is there any way to improve this?
+
+> See "using the web interface with a real text editor" on the [[tips]]
+> page. --[[JoshTriplett]]
+
+>> Would it be useful to allow a "max width" plugin, which on commit would force long lines to be split?
+
+>>> Please, no. That would wreak havoc on code blocks and arguments to
+>>> preprocessor directives, and it would make bulleted lists and quoted
+>>> blocks look bogus (because the subsequent lines would not match), among
+>>> other problems. On the other hand, if you want to propose a piece of
+>>> client-side JavaScript that looks at the active selection in a text area
+>>> and word-wraps it, and have a plugin that adds a "Word-Wrap Selection"
+>>> button to the editor, that seems fine. --[[JoshTriplett]]
diff --git a/doc/bugs/wrapper_can__39__t_find_the_perl_modules.mdwn b/doc/bugs/wrapper_can__39__t_find_the_perl_modules.mdwn
new file mode 100644
index 000000000..9804d86c5
--- /dev/null
+++ b/doc/bugs/wrapper_can__39__t_find_the_perl_modules.mdwn
@@ -0,0 +1,16 @@
+If I install Perl modules in a custom directory, the CGI wrapper can't find them. I found that the wrapper's code clears the environment variables, but the information about custom directories is passed to Perl via the PERL5LIB variable.
+
+Workaround: add PERL5LIB to the wrapper's `newenviron` variables.
+
+An additional question: what does the wrapper actually do? I am a Russian hosting provider, and I am interested in ikiwiki.
+
+> The wrapper allows ikiwiki to run as the user who owns the wiki, which
+> is generally not the same as the user that runs the web server.
+> (It also handles some other things, like some locking.)
+>
+> As a suid program, the wrapper cannot safely let environment variables
+> pass through.
+>
+> If you want to install ikiwiki's perl modules in a nonstandard location,
+> you can set `INSTALL_BASE` when running `Makefile.PL`. ikiwiki will then
+> be built to look in that location. --[[Joey]] [[!tag done]]
diff --git a/doc/bugs/wrong_attachment_size.mdwn b/doc/bugs/wrong_attachment_size.mdwn
new file mode 100644
index 000000000..958576d5d
--- /dev/null
+++ b/doc/bugs/wrong_attachment_size.mdwn
@@ -0,0 +1,8 @@
+Using ikiwiki 3.20100815.7 on Debian Squeeze or ikiwiki 3.20100815~bpo50+1 on Debian Lenny, the [[attachments|plugins/attachment]] list on an edit page shows the same (wrong) file size for every file, namely the size of one of the files.
+You can see the phenomenon on this wiki page:
+<http://tools.ipol.im/ikiwiki.cgi?page=sandbox&do=edit>.
+
+-- [[nil|users/nil]]
+
+> This bug was fixed in version 3.20100926: "attachment: Fix attachment file size
+> display." [[done]] --[[Joey]]
diff --git a/doc/bugs/wrong_discussion_page_created.mdwn b/doc/bugs/wrong_discussion_page_created.mdwn
new file mode 100644
index 000000000..f9399da72
--- /dev/null
+++ b/doc/bugs/wrong_discussion_page_created.mdwn
@@ -0,0 +1,12 @@
+I seem to have broken Discussion page creation links. --[[Joey]]
+
+For the toplevel, it needs to save to index/discussion. index/Discussion
+or Discussion won't work. It defaults to Discussion on the toplevel.
+And this is particularly bad as it breaks all other discussion links on
+other pages.
+
+For other pages, it should save to page/discussion. page/Discussion does
+not work. And it defaults to Discussion which is just *wrong*.
+
+> This was broken by [[Titles_are_lower-cased_when_creating_a_page]].
+> Added a fix. [[done]] --[[Joey]]
diff --git a/doc/bugs/wrong_link_in_recentchanges_when_reverting_an_ikiwiki_outside_git_root.mdwn b/doc/bugs/wrong_link_in_recentchanges_when_reverting_an_ikiwiki_outside_git_root.mdwn
new file mode 100644
index 000000000..5f7450b79
--- /dev/null
+++ b/doc/bugs/wrong_link_in_recentchanges_when_reverting_an_ikiwiki_outside_git_root.mdwn
@@ -0,0 +1,8 @@
+in ikiwiki instances that don't reside in the git root directory (the only one i know of is ikiwiki itself), reverts show the wrong link in recentchanges (for example, in the ikiwiki main repository's 4530430 and its revert, the main index page was edited, but the revert shows doc/index as a link).
+
+the expected behavior is to compensate for the modified root directory (i.e., show index instead of doc/index).
+
+> This seems to work OK now - commit 84c4ca33 and its reversion both
+> appear correctly in [[recentchanges]]. Looking at git history,
+> Joey [[fixed this|done]] in commit 1b6c1895 before 3.20120203.
+> --[[smcv]]
diff --git a/doc/bugs/wrong_permissions_on_some_files_in_source.mdwn b/doc/bugs/wrong_permissions_on_some_files_in_source.mdwn
new file mode 100644
index 000000000..2cecd8b1b
--- /dev/null
+++ b/doc/bugs/wrong_permissions_on_some_files_in_source.mdwn
@@ -0,0 +1,11 @@
+A few files in the source are only readable by root:
+
+* ikiwiki/basewiki/favicon.ico
+* ikiwiki/doc/logo/ikiwiki_large.png
+* ikiwiki/doc/logo/ikiwiki.svgz
+* ikiwiki/templates/atomitem.tmpl
+* ikiwiki/templates/atompage.tmpl
+
+This means that (depending on the installation) the permissions may be wrong for the final install,
+so when ikiwiki is run (as non-root) it fails.
+
+> [[bugs/done]] (in my tree) --[[Joey]]
diff --git a/doc/bugs/wrong_rss_url_when_inside_another_blog-like_page.mdwn b/doc/bugs/wrong_rss_url_when_inside_another_blog-like_page.mdwn
new file mode 100644
index 000000000..f1dde7e9f
--- /dev/null
+++ b/doc/bugs/wrong_rss_url_when_inside_another_blog-like_page.mdwn
@@ -0,0 +1,36 @@
+I have a blog-like page collecting items; that's ok, the display is good and the rss
+url is correct. In some of the items, I have another rss. When viewing the
+item itself, the url of the rss is correct. But when viewing the item inside
+the main page, the basepath of the rss is wrong, so the url doesn't go
+anywhere.
+
+For example :
+
+(1) a blog-like page collecting tag entries:
+
+* address of the page : /tag.html
+* address of the rss : /tag.rss
+
+(2) a tag item "foo",
+which contains an rss listing all the foo-tagged pages.
+
+* address of the item : /tag/foo.html
+* address of the rss : /tag/foo.rss
+
+(3) when viewing /tag.html
+
+* the rss url inside tag/foo.html is /foo.rss, not /tag/foo.rss
+
+Is it a bug or did I miss something?
+
+> I've fixed at least
+> two bugs in this area in the unreleased ikiwiki in subversion, one just
+> now. If you still see the problem using the ikiwiki version in
+> subversion, it would be helpful if you could post a tarball of your wiki,
+> or a test case derived from it that I can use to reproduce the problem
+> --[[Joey]]
+
+>> Joey, thank you, it's ok now (since version 1.40, AFAIK, maybe earlier).
+>> --Hugues (hb)
+
+>>> Thanks for the followup, tagging this [[bugs/done]].
diff --git a/doc/bugs/xgettext_issue.mdwn b/doc/bugs/xgettext_issue.mdwn
new file mode 100644
index 000000000..dc49c69a1
--- /dev/null
+++ b/doc/bugs/xgettext_issue.mdwn
@@ -0,0 +1,73 @@
+I ran into a problem when installing from svn. I got "invalid variable
+interpolation" errors for Wrappers.pm. I added the flag '--extract-all' to
+the xgettext line in 'po/Makefile' and in 'po/t'. Once I did that I was able
+to make and make test just fine. --HarleyPig
+
+> It would be helpful if you could post the actual error message you saw.
+> Also would be nice to know what versions of perl and gettext you have.
+> Perhaps your xgettext is an older version from before it natively
+> supported perl.
+> Adding --extract-all doesn't seem like a good idea, since this causes it
+> to treat every string in the entire wiki as translatable. I don't know
+> what you're talking about regarding 'po/t'. --[[Joey]]
+
+>> make[1]: Entering directory `/home/www/ikiwiki/po'
+>> Rebuilding the pot file
+>> xgettext ../IkiWiki/CGI.pm ../IkiWiki/Plugin/aggregate.pm ../IkiWiki/Plugin/brokenlinks.pm ../IkiWiki/Plugin/camelcase.pm ../IkiWiki/Plugin/ddate.pm ../IkiWiki/Plugin/favicon.pm ../IkiWiki/Plugin/fortune.pm ../IkiWiki/Plugin/goodstuff.pm ../IkiWiki/Plugin/googlecalendar.pm ../IkiWiki/Plugin/haiku.pm ../IkiWiki/Plugin/html.pm ../IkiWiki/Plugin/htmlscrubber.pm ../IkiWiki/Plugin/htmltidy.pm ../IkiWiki/Plugin/httpauth.pm ../IkiWiki/Plugin/img.pm ../IkiWiki/Plugin/inline.pm ../IkiWiki/Plugin/linkmap.pm ../IkiWiki/Plugin/map.pm ../IkiWiki/Plugin/mdwn.pm ../IkiWiki/Plugin/meta.pm ../IkiWiki/Plugin/mirrorlist.pm ../IkiWiki/Plugin/openid.pm ../IkiWiki/Plugin/orphans.pm ../IkiWiki/Plugin/otl.pm ../IkiWiki/Plugin/pagecount.pm ../IkiWiki/Plugin/pagestats.pm ../IkiWiki/Plugin/passwordauth.pm ../IkiWiki/Plugin/poll.pm ../IkiWiki/Plugin/polygen.pm ../IkiWiki/Plugin/rawhtml.pm ../IkiWiki/Plugin/rst.pm ../IkiWiki/Plugin/search.pm ../IkiWiki/Plugin/shortcut.pm ../IkiWiki/Plugin/sidebar.pm ../IkiWiki/Plugin/skeleton.pm ../IkiWiki/Plugin/smiley.pm ../IkiWiki/Plugin/tag.pm ../IkiWiki/Plugin/template.pm ../IkiWiki/Plugin/textile.pm ../IkiWiki/Plugin/toc.pm ../IkiWiki/Plugin/toggle.pm ../IkiWiki/Plugin/typography.pm ../IkiWiki/Plugin/wikitext.pm ../IkiWiki/Rcs/Stub.pm ../IkiWiki/Rcs/git.pm ../IkiWiki/Rcs/mercurial.pm ../IkiWiki/Rcs/svn.pm ../IkiWiki/Rcs/tla.pm ../IkiWiki/Render.pm ../IkiWiki/Setup.pm ../IkiWiki/Setup/Standard.pm ../IkiWiki/UserInfo.pm ../IkiWiki/Wrapper.pm ../ikiwiki.in ../IkiWiki.pm -o ikiwiki.pot -Lperl --add-comments=translators ../IkiWiki/Wrapper.pm:64: invalid variable interpolation at "$"
+>> make[1]: *** [ikiwiki.pot] Error 1
+>> make[1]: Leaving directory `/home/www/ikiwiki/po'
+>> make: *** [extra_build] Error 2
+>>
+>> harleypig ikiwiki # xgettext --version
+>>
+>> xgettext (GNU gettext-tools) 0.15
+>>
+>> harleypig ikiwiki # perl -v
+>>
+>> This is perl, v5.8.8 built for i686-linux
+>>
+>> Sorry about the po/t report ... it was the test file I used to figure out what was wrong and I forgot to remove it. This is against the subversion repository, version 2338.
+>> The referenced line has a $! variable, which the documentation for gettext indicates is the problem.
+
+>>> Ok, I think that you need to upgrade xgettext to 0.16. However, there's
+>>> no reason why you should need to rebuild the pot file anyway, so I've
+>>> checked it into svn, and that's one problem [[bugs/done]].
+
+>>>> FWIW, I get the same error when building manually from SVN trunk on Ubuntu Edgy. I'm also using xgettext 0.15, because it's the latest version that's in the repos. However,
+>>>> I don't think that's the sole problem because I can build fine on another (Debian) box which is running _0.14.4_... I know very little about `gettext`, but could this be
+>>>> related to my language settings? On the Edgy box I have `LANGUAGE=en_GB:en` and `LANG=en_GB.UTF-8`. The `ikiwiki` package installs on Edgy with no problems. --Ben
+
+>>>>> You'll only see the problem if it needs to rebuild po/ikiwiki.pot,
+>>>>> which it generally doesn't if you're just building the package. If
+>>>>> you edit files and build, it will rebuild the pot and then fail with
+>>>>> older gettexts. --[[Joey]]
+
+>>>>>> I guess I'm confused then, because I do get that error when I just build. :-( To reproduce:
+
+ svn co svn://ikiwiki.kitenet.net/ikiwiki/trunk ikiwiki
+ cd ikiwiki
+ perl Makefile.PL
+ make
+
+>>>>>> Then I see:
+
+<pre>
+./mdwn2man ikiwiki 1 doc/usage.mdwn > ikiwiki.man
+./mdwn2man ikiwiki-mass-rebuild 8 doc/ikiwiki-mass-rebuild.mdwn > ikiwiki-mass-rebuild.man
+./pm_filter /usr/local 1.44 /usr/local/share/perl/5.8.8 < ikiwiki.in > ikiwiki.out
+make -C po
+make[1]: Entering directory `/home/ben/tmp/ikiwiki/po'
+Rebuilding the pot file
+xgettext ../IkiWiki/CGI.pm ../IkiWiki/Plugin/aggregate.pm ../IkiWiki/Plugin/anonok.pm ../IkiWiki/Plugin/brokenlinks.pm ../IkiWiki/Plugin/camelcase.pm ../IkiWiki/Plugin/conditional.pm ../IkiWiki/Plugin/ddate.pm ../IkiWiki/Plugin/favicon.pm ../IkiWiki/Plugin/fortune.pm ../IkiWiki/Plugin/goodstuff.pm ../IkiWiki/Plugin/googlecalendar.pm ../IkiWiki/Plugin/haiku.pm ../IkiWiki/Plugin/html.pm ../IkiWiki/Plugin/htmlscrubber.pm ../IkiWiki/Plugin/htmltidy.pm ../IkiWiki/Plugin/httpauth.pm ../IkiWiki/Plugin/img.pm ../IkiWiki/Plugin/inline.pm ../IkiWiki/Plugin/linkmap.pm ../IkiWiki/Plugin/lockedit.pm ../IkiWiki/Plugin/map.pm ../IkiWiki/Plugin/mdwn.pm ../IkiWiki/Plugin/meta.pm ../IkiWiki/Plugin/mirrorlist.pm ../IkiWiki/Plugin/more.pm ../IkiWiki/Plugin/opendiscussion.pm ../IkiWiki/Plugin/openid.pm ../IkiWiki/Plugin/orphans.pm ../IkiWiki/Plugin/otl.pm ../IkiWiki/Plugin/pagecount.pm ../IkiWiki/Plugin/pagestats.pm ../IkiWiki/Plugin/passwordauth.pm ../IkiWiki/Plugin/poll.pm ../IkiWiki/Plugin/polygen.pm ../IkiWiki/Plugin/prettydate.pm ../IkiWiki/Plugin/rawhtml.pm ../IkiWiki/Plugin/rst.pm ../IkiWiki/Plugin/search.pm ../IkiWiki/Plugin/shortcut.pm ../IkiWiki/Plugin/sidebar.pm ../IkiWiki/Plugin/signinedit.pm ../IkiWiki/Plugin/skeleton.pm ../IkiWiki/Plugin/smiley.pm ../IkiWiki/Plugin/tag.pm ../IkiWiki/Plugin/template.pm ../IkiWiki/Plugin/textile.pm ../IkiWiki/Plugin/toc.pm ../IkiWiki/Plugin/toggle.pm ../IkiWiki/Plugin/typography.pm ../IkiWiki/Plugin/wikitext.pm ../IkiWiki/Rcs/Stub.pm ../IkiWiki/Rcs/git.pm ../IkiWiki/Rcs/mercurial.pm ../IkiWiki/Rcs/svn.pm ../IkiWiki/Rcs/tla.pm ../IkiWiki/Render.pm ../IkiWiki/Setup.pm ../IkiWiki/Setup/Standard.pm ../IkiWiki/UserInfo.pm ../IkiWiki/Wrapper.pm ../ikiwiki.in ../IkiWiki.pm -o ikiwiki.pot -Lperl --add-comments=translators
+../IkiWiki/Wrapper.pm:64: invalid variable interpolation at "$"
+make[1]: *** [ikiwiki.pot] Error 1
+make[1]: Leaving directory `/home/ben/tmp/ikiwiki/po'
+make: *** [extra_build] Error 2
+</pre>
+
+>>>>>> Other than installing a newer version of `gettext` from outside of the repos, is there any workaround?
+
+>>>>>>> It's probably because you're pulling it from svn, and I don't
+>>>>>>> always update the pot file every time I commit to svn. So this will
+>>>>>>> affect svn checkouts, but not released tarballs. Anyway, I put in a
+>>>>>>> workaround.. [[bugs/done]] --[[Joey]]
diff --git a/doc/bugs/yaml:xs_codependency_not_listed.mdwn b/doc/bugs/yaml:xs_codependency_not_listed.mdwn
new file mode 100644
index 000000000..f136d8b12
--- /dev/null
+++ b/doc/bugs/yaml:xs_codependency_not_listed.mdwn
@@ -0,0 +1,13 @@
+YAML:XS is not listed as a dep in the spec file which results in
+
+```
+HOME=/home/me /usr/bin/perl -Iblib/lib ikiwiki.in -dumpsetup ikiwiki.setup
+Can't locate YAML/XS.pm in @INC (@INC contains: . blib/lib /usr/local/lib64/perl5 /usr/local/share/perl5 /usr/lib64/perl5/vendor_perl /usr/share/perl5/vendor_perl /usr/lib64/perl5 /usr/share/perl5) at (eval 39) line 2.
+BEGIN failed--compilation aborted at (eval 39) line 2.
+make: *** [ikiwiki.setup] Error 2
+error: Bad exit status from /var/tmp/rpm-tmp.Sgq2QK (%build)
+```
+
+when trying to build
+
+> Ok, added. [[done]] --[[Joey]]
diff --git a/doc/bugs/yaml_setup_file_does_not_support_UTF-8_if_XS_is_installed.mdwn b/doc/bugs/yaml_setup_file_does_not_support_UTF-8_if_XS_is_installed.mdwn
new file mode 100644
index 000000000..73da32d0c
--- /dev/null
+++ b/doc/bugs/yaml_setup_file_does_not_support_UTF-8_if_XS_is_installed.mdwn
@@ -0,0 +1,104 @@
+I converted an ikiwiki setup file to YAML as
+[[documented|tips/yaml_setup_files]].
+
+On my Debian Squeeze system, attempting to build the wiki using the
+YAML setup file triggers the following error message:
+
+ YAML::XS::Load Error: The problem:
+
+ Invalid trailing UTF-8 octet
+
+ was found at document: 0
+ usage: ikiwiki [options] source dest
+ ikiwiki --setup configfile
+
+Indeed, my setup file contains UTF-8 characters.
+
+Deinstalling YAML::XS ([[!debpkg libyaml-libyaml-perl]]) resolves this
+issue. According to YAML::Any's POD, YAML::Syck is used instead of
+YAML::XS in this case since it's the best YAML implementation available
+on my system.
+
+No encoding-related setting is mentioned in YAML::XS' POD. This may
+be a bug in that module. I'll see if it's known / fixed
+somewhere as soon as I get online.
+
+Joey, as a (hopefully) temporary workaround, what do you think of
+explicitly using YAML::Syck (or whatever other YAML implementation
+that does not expose this bug) rather than letting YAML::Any pick its
+preferred one?
+
+--[[intrigeri]]
+
+> Upgrading YAML::XS ([[!debpkg libyaml-libyaml-perl]]) to current sid
+> version (0.34-1) fixes this bug for me. --[[intrigeri]]
+
+>> libyaml-syck-perl's description mentions that the module is now
+>> deprecated. (I had to do some ugly workaround to make unicode work with
+>> Syck earlier.) So it appears the new YAML::Xs is the
+>> way to go longterm, and presumably YAML::Any will start depending on it
+>> in due course? --[[Joey]]
+
+>>> Right. Since this bug is fixed in current testing/sid, only
+>>> Squeeze needs to be taken care of. As far as Debian Squeeze is
+>>> concerned, I see two ways out of the current buggy situation:
+>>>
+>>> 1. Add `Conflicts: libyaml-libyaml-perl (< 0.34-1~)` to the
+>>> ikiwiki packages uploaded to stable and squeeze-backports.
+>>> Additionally uploading the newer, fixed `libyaml-libyaml-perl`
+>>> to squeeze-backports would make the resulting situation a bit
+>>> easier to deal with from the Debian stable user point of view.
+>>> 2. Patch the ikiwiki packages uploaded to stable and
+>>> squeeze-backports:
+>>>     - either to work around the bug by explicitly using YAML::Syck
+>>>       (yeah, it's deprecated, but it's Debian stable)
+>>>     - or to make the bug easier to work around by the user, e.g. by
+>>> warning her of possible problems in case YAML::Any has chosen
+>>> YAML::XS as its preferred implementation (the
+>>> `YAML::Any->implementation` module method can come in handy
+>>> in this case).
+>>>
+>>> I tend to prefer the first aforementioned solution, but any of
+>>> these will anyway be kinda ugly, so...
+
+>>>> I was wrong: I just experienced that bug with YAML::XS 0.34-1
+>>>> too. Seems like [[!cpanrt 54683]]. --[[intrigeri]]
+
+>>>>> Yes, [[!debbug 625713]] reports this also affects debian unstable.
+>>>>> So, I will add a conflict I guess. [[done]] --[[Joey]]
+
+>>>>>> With the additional info and test cases I provided on the
+>>>>>> Debian bug (Message #22), I now doubt this is a YAML::XS bug
+>>>>>> very much. Also, the RT bug I linked to happens with `use
+>>>>>> utf8`, which is not the case in ikiwiki AFAIK => I think you
+>>>>>> should reconsider whether this bug really is YAML::XS' fault, or
+>>>>>> YAML::Any's fault, or Perl's fault, or... the way ikiwiki
+>>>>>> slurps and untaints UTF-8 YAML setup files. Sorry for providing
+>>>>>> information that may have been misguided. --[[intrigeri]]
+
+>>>>>>> `use utf8` is completely irrelevant; that only tells
+>>>>>>> perl to support utf8 in its source code.
+>>>>>>>
+>>>>>>> I don't know what `Path::Class::File` is, but if it
+>>>>>>> provides non-decoded bytes to the module then it would likely
+>>>>>>> avoid this failure, while resulting in parsed yaml where every
+>>>>>>> string was likewise not decoded unicode, which is not very useful.
+>>>>>>> --[[Joey]]
+
+>>>>>>>> You guessed right about the non-decoded bytes being passed to
+>>>>>>>> YAML::XS, except this is the way it should be done. YAML::XS
+>>>>>>>> POD reads: "YAML::XS only deals with streams of utf8 octets".
+>>>>>>>> Feed it with non-decoded UTF-8 bytes and it gives you
+>>>>>>>> properly encoded UTF-8 Perl strings in exchange.
+>>>>>>>>
+>>>>>>>> Once this has been made clear, since 1. this module indeed
+>>>>>>>> seems to be the future of YAML in Perl, and 2. is depended on
+>>>>>>>> by other popular software such as dh-make-perl (on the 2nd
+>>>>>>>> degree), I suggest using it explicitly instead of the current
+>>>>>>>> "try to support every single YAML Perl module and end up
+>>>>>>>> conflicting with the now recommended one" nightmare.
+>>>>>>>> --[[intrigeri]]
+
+>>>>>>>>> Ok, [[done]] (although YAML::Syck does also still work.) --[[Joey]]
+
+>>>>>>>>>> Thanks a lot. --[[intrigeri]]
diff --git a/doc/cgi.mdwn b/doc/cgi.mdwn
new file mode 100644
index 000000000..1448fa4d5
--- /dev/null
+++ b/doc/cgi.mdwn
@@ -0,0 +1,5 @@
+While ikiwiki is primarily a wiki compiler, which generates static html
+pages, it does use CGI for online page editing.
+
+To enable CGI, you need to create and install an ikiwiki.cgi wrapper.
+[[Setup]] explains how to do this.
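+
+As an illustration, the wrapper is declared in the setup file; a minimal
+sketch with hypothetical paths (see [[Setup]] for the full procedure):
+
+    cgi_wrapper: /var/www/wiki/ikiwiki.cgi
+    cgi_wrappermode: 06755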
diff --git a/doc/cgi/discussion.mdwn b/doc/cgi/discussion.mdwn
new file mode 100644
index 000000000..4cd7c5a9c
--- /dev/null
+++ b/doc/cgi/discussion.mdwn
@@ -0,0 +1,22 @@
+## Markdown or CGI error prevents web-based editing
+
+I have a working ikiwiki configuration with an SVN backend, running on Ubuntu 7.10 with Apache 2.2.4. I'm using Perl 5.8.8; however, I had to use Text::Markdown 1.0.5 from CPAN instead of the latest, because I had the same issue as someone [here](http://ikiwiki.info/index/discussion/) (namely, I was getting this error until I used the old Markdown version: "*** glibc detected *** double free or corruption (!prev): 0x0922e478 ***
+").
+
+> aside: that might have been related to [Text::Markdown bug #37297](http://rt.cpan.org/Public/Bug/Display.html?id=37297).
+> --ChapmanFlack 9Jul2008
+
+CGI seems to be working at least partly - the History and Recent Changes pages both work. However, if I attempt to edit or create a page, I get this error:
+ Error: Failed to load plugin IkiWiki::Plugin::mdwn: Can't locate IkiWiki/Plugin/mdwn.pm in @INC (@INC contains: /etc/perl /usr/local/lib/perl/5.8.8 /usr/local/share/perl/5.8.8 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.8 /usr/share/perl/5.8 /usr/local/lib/site_perl .) at (eval 6) line 2. BEGIN failed--compilation aborted at (eval 6) line 2.
+
+Since ikiwiki builds, it has to be finding Markdown at build time, right? What am I doing wrong here? I would appreciate a point in the right direction. Thanks. --mrled
+
+> Ikiwiki is failing to find `IkiWiki/Plugin/mdwn.pm` which is a plugin included in ikiwiki itself.
+> So, that file must not have been installed in any of the directories in the @INC search path listed.
+> Either fix the installation so ikiwiki's perl modules are installed in one of the standard locations,
+> or you could use the `libdir` setting in the setup file to point ikiwiki
+> at the directory where the perl modules are installed. --[[Joey]]
+
+>> Ah hah! That helped... once I knew that it was ikiwiki's internal thing, I was able to figure out that it was a permissions issue. For the record, I didn't change permissions or the default install prefix of /usr/local (well, at least not on purpose). Thanks for your help. --mrled
+
+>>> Interesting.. so just a permissions problem of some sort that prevented the cgi from seeing the modules that were there in-path? --[[Joey]]
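+
+The `libdir` setting mentioned above goes in a Perl-style setup file
+like this (the path is hypothetical; point it at wherever
+`IkiWiki/Plugin/*.pm` actually ended up):
+
+    libdir => "/usr/local/lib/ikiwiki",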
diff --git a/doc/commit-internals.mdwn b/doc/commit-internals.mdwn
new file mode 100644
index 000000000..3a464ffbf
--- /dev/null
+++ b/doc/commit-internals.mdwn
@@ -0,0 +1,20 @@
+Saving this irc transcript here, since it's a fairly in-depth discussion of
+how ikiwiki handles commits, locking, etc, and avoids some races while
+doing so.
+
+ <tschwinge> What happens if I edit a page and in the underground a new version is installed into the svn repository?
+ <tschwinge> The revision when I started editing was saved, right?
+ <joeyh> what happens, exactly is:
+ <joeyh> 1. the new version that was committed first get into svn, and ikiwiki updates its WC to have the new version
+ <joeyh> 2. When you save your edit, ikiwiki detects a conflict.
+ <joeyh> 3. It uses svn merge to try to resolve it; if it's resolved it adds your changes transparently
+ <joeyh> 4. If the conflict needs manual resolution, it displays the page with conflict markers in the editor for you to resolve
+ <tschwinge> Ok.
+ <joeyh> Note that in step 2, it detects the conflict by using svn info to get the current Revision of the page in the WC, and compares that to a revision that is stored when you start to edit the page
+ <joeyh> that's why rcs_prepedit exists, to get that revision info
+ <tschwinge> But isn't there a race condition?
+ <joeyh> well, there is locking going on too
+ <joeyh> ikiwiki won't update the WC in step 1. if another instance of itself is getting the Revision info
+ <tschwinge> Is that lockwiki()?
+ <joeyh> yeah
+ <joeyh> note that when it gets the current Revision info of a page during its conflict detection, svn could have changed the page in the repo, and the WC not been updated yet due to the lock, but this isn't a race since the commit will then fail due to a regular svn conflict and the conflict detection will still work.
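+
+The flow described above can be sketched as pseudocode (only
+`rcs_prepedit` and `lockwiki` are real ikiwiki names; the other names
+are illustrative):
+
+    # when the user starts editing:
+    token = rcs_prepedit(page)        # remember the WC revision
+
+    # when the user saves:
+    lockwiki()                        # blocks concurrent WC updates
+    if current_revision(page) != token:
+        merged = merge(page, token, user_text)   # e.g. via svn merge
+        if has_conflict_markers(merged):
+            show_editor(merged)       # user resolves conflicts by hand
+        else:
+            commit(merged)
+    else:
+        commit(user_text)
+    unlockwiki()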
diff --git a/doc/competition.mdwn b/doc/competition.mdwn
new file mode 100644
index 000000000..2c782ea92
--- /dev/null
+++ b/doc/competition.mdwn
@@ -0,0 +1,19 @@
+When I started ikiwiki in 2006, there were no other existing systems that
+filled quite the niche of generating a static html wiki out of markdown
+files stored in a [[VCS|rcs]]. My
+[first blog about ikiwiki](http://kitenet.net/~joey/blog/entry/seeking_wiki/)
+looked at some projects that were semi-close, and found them wanting.
+
+My hope was that besides being useful to all its [[users|ikiwikiusers]],
+ikiwiki would help spread its underlying concepts. Let a thousand flowers
+bloom! These are some that have sprung up since. --[[Joey]]
+
+* [Gitit](http://gitit.johnmacfarlane.net/) is a wiki backed by a git (or
+ darcs) filestore. No static rendering here; pages are generated on the fly.
+  It's written in Haskell and uses the amazing Pandoc to generate html
+ from markdown or many other formats.
+
+* [Markdoc](http://blog.zacharyvoase.com/post/246800035) statically builds
+ a wiki from markdown source (which can be in a VCS, if you check it in).
+ It includes a built-in webserver to ease serving the generated static
+ html.
diff --git a/doc/consultants.mdwn b/doc/consultants.mdwn
new file mode 100644
index 000000000..ee0915600
--- /dev/null
+++ b/doc/consultants.mdwn
@@ -0,0 +1,9 @@
+Ikiwiki is free software, worked on by various people to "scratch their
+itch". Some people either don't have the time or have specialized needs and
+are willing to hire someone to maintain ikiwiki or to add
+functionality to it. The following is a list of people who are
+available to do consulting or other work on ikiwiki.
+
+* [[Joey]] wrote ikiwiki. He is available for consulting on a part-time basis.
+
+Feel free to add yourself to this list.
diff --git a/doc/contact.mdwn b/doc/contact.mdwn
new file mode 100644
index 000000000..7d31ddf10
--- /dev/null
+++ b/doc/contact.mdwn
@@ -0,0 +1,11 @@
+The ikiwiki project strongly encourages collaboration through ikiwiki itself,
+and thus does not have a mailing list. Anyone can create an account on
+ikiwiki's own wiki. ikiwiki provides a [[bug_tracker|bugs]], a
+[[TODO_list|TODO]], and "discussion" sub-pages for every page, as well as a
+[[forum]] for general questions and discussion. ikiwiki
+developers monitor [[RecentChanges]] closely, via the webpage, email,
+and IRC, and respond in a timely fashion.
+
+You could also drop by the IRC channel `#ikiwiki` on
+[OFTC](http://www.oftc.net/) (`irc.oftc.net`), or use the
+[identi.ca ikiwiki group](http://identi.ca/group/ikiwiki).
diff --git a/doc/contact/discussion.mdwn b/doc/contact/discussion.mdwn
new file mode 100644
index 000000000..7bc479463
--- /dev/null
+++ b/doc/contact/discussion.mdwn
@@ -0,0 +1,14 @@
+... does not having a mailing list really encourage people to "collaborate
+through ikiwiki itself"? Personally, I strongly prefer to participate in
+mailing lists via my mail/nntp client. I follow over 50 mailing lists, so
+if I can't participate in a mailing list via mail or nntp, I'll probably just
+not participate at all. I suspect there are others in my situation. I
+think that by not having an SMTP-driven mailing list, ikiwiki is passing up
+a big opportunity to build community. -- AdamMegacz
+
+> If you want to gate ikiwiki's various rss feeds to email, that's
+> [trivial](http://rss2email.infogami.com/) --[[Joey]]
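+
+The rss2email gating mentioned above boils down to something like this
+(the feed URL is hypothetical, and exact subcommands vary between
+rss2email versions):
+
+    r2e new you@example.com
+    r2e add http://ikiwiki.info/recentchanges/index.rss
+    r2e run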
+
+Also see [[this_wishlist_request|todo/provide_a_mailing_list]]. Sure, you can
+collaborate with RSS and a wiki, but the ikiwiki community is not only made up
+of developers.
diff --git a/doc/convert.mdwn b/doc/convert.mdwn
new file mode 100644
index 000000000..fd4fbeac3
--- /dev/null
+++ b/doc/convert.mdwn
@@ -0,0 +1,9 @@
+Do you have an existing wiki or blog using other software, and would like
+to convert it to ikiwiki? Various tools and techniques have been developed
+to handle such conversions.
+
+* [[tips/convert_mediawiki_to_ikiwiki]]
+* [[tips/convert_moinmoin_to_ikiwiki]]
+* [[tips/convert_blogger_blogs_to_ikiwiki]]
+
+In addition, [[JoshTriplett]] has written scripts to convert Twiki sites, see [his page](/users/JoshTriplett) for more information.
diff --git a/doc/css.mdwn b/doc/css.mdwn
new file mode 100644
index 000000000..bc070cb99
--- /dev/null
+++ b/doc/css.mdwn
@@ -0,0 +1,24 @@
+[[!meta title="CSS"]]
+
+## Using CSS with ikiwiki
+
+Ikiwiki comes with two CSS stylesheets: [[style.css]] and [[local.css]].
+The idea is to customize the second one, overriding the first one and
+defining brand new rendering rules.
+
+While ikiwiki's default use of stylesheets is intentionally quite plain and
+minimalistic, CSS allows creating any kind of look you can dream up.
+
+The [[theme_plugin|plugins/theme]] provides some prepackaged [[themes]] in an
+easy-to-use way.
+
+The [[css_market]] page is an attempt to collect user-contributed local.css
+files.
+
+## Per-page CSS
+
+The [[plugins/meta]] plugin can be used to add additional style sheets to a
+page.
+
+The [[plugins/localstyle]] plugin can be used to override the toplevel
+[[local.css]] for a whole section of the wiki.
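+
+For example, a page can pull in one extra sheet with the
+[[plugins/meta]] directive (the stylesheet name here is hypothetical;
+a matching `fancy.css` must exist in the wiki source):
+
+    \[[!meta stylesheet="fancy"]]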
diff --git a/doc/css/discussion.mdwn b/doc/css/discussion.mdwn
new file mode 100644
index 000000000..dc9663e56
--- /dev/null
+++ b/doc/css/discussion.mdwn
@@ -0,0 +1,18 @@
+I must be doing something wrong. Running setup:
+
+    $ ikiwiki --setup MyIkiwiki.setup
+
+overwrites my local.css with the default (empty) version.
+
+I am using ikiwiki_3.1415926.tar.gz installed into /usr/local.
+
+---
+Sorry. Never mind. RTFM. --refresh duh.
+
+> Hmm, well. Using --refresh is a good thing because it allows ikiwiki to
+> update a site more quickly. But it must only be hiding the real problem.
+> Sounds like you are trying to edit local.css directly inside
+> the destdir. But ikiwiki has a local.css located in its basewiki,
+> so when you rebuild, your local mods are lost. The fix is to put your
+> locally modified local.css inside the srcdir, along with the other pages
+> of your wiki. --[[Joey]]
diff --git a/doc/css_market.mdwn b/doc/css_market.mdwn
new file mode 100644
index 000000000..c9c6694e7
--- /dev/null
+++ b/doc/css_market.mdwn
@@ -0,0 +1,68 @@
+[[!meta title="CSS Market"]]
+
+User-contributed stylesheet files for ikiwiki. Unless otherwise noted,
+these style sheets can be installed by copying them into your wiki's source
+dir with a filename of `local.css`.
+
+Some of these stylesheets have developed into full-fledged [[themes]] that are
+included in ikiwiki for easy use.
+
+Feel free to add your own stylesheets here. (Upload as wiki pages; wiki
+gnomes will convert them to css files.)
+
+* **[lessish.css](https://raw.github.com/spiffin/ikiwiki_lessish/master/lessish.css)**, contributed by [[Spiffin]]:
+  a responsive stylesheet based on the [Less CSS Framework](http://lessframework.com).
+ Links: [PNG preview](https://github.com/spiffin/ikiwiki_lessish/blob/master/lessish_preview.png) and [GitHub repo](https://github.com/spiffin/ikiwiki_lessish).
+
+* **[[css_market/zack.css]]**, contributed by [[StefanoZacchiroli]],
+ customized mostly for *blogging purposes*, can be seen in action on
+ [zack's blog](http://upsilon.cc/~zack/blog/)
+ [[!meta stylesheet="zack"]]
+
+* **[[css_market/kirkambar.css]]**, contributed by [[Roktas]]. This far-from-perfect
+  stylesheet follows a [Gitweb](http://www.kernel.org/git/?p=git/git.git;a=tree;f=gitweb)-like
+  theme, so it may provide a consistent look'n'feel along with the [[rcs/git]] backend. ;-)
+ [[!meta stylesheet="kirkambar"]]
+
+* **[[css_market/embeddedmoose.css]]**, contributed by [[JoshTriplett]].
+ Designed for [Embedded Moose](http://embeddedmoose.com). Some ideas from the
+ Debian lighttpd index.html page.
+ [[!meta stylesheet="embeddedmoose"]]
+
+* **[[02_Template.css]]**, contributed and adapted by [maxx](http://martin.wuertele.net/), [original](http://www.openwebdesign.org/viewdesign.phtml?id=3057)
+ designed by [jarico](http://www.openwebdesign.org/userinfo.phtml?user=jcarico)
+ (License: public domain). You'll need a modified page.tmpl
+ **[[css_market/02_Template.tmpl]]**. If you prefer
+ [my header image](http://martin.wuertele.net/images/header.png) you can
+ use it under the terms of the MIT License (see png comment).
+ [[!meta stylesheet="02_Template"]]
+
+* **[[css_market/cstamas.css]]**, contributed by [[cstamas]].
+  This one is based on embeddedmoose; however, it is slightly different now.
+  [My webpage](http://users.itk.ppke.hu/~cstamas/tag/english) is not the same.
+ You can grab some pictures used as background patterns from there.
+ [[!meta stylesheet="cstamas"]]
+
+* **[[css_market/bma.css]]**, contributed by [bma](http://subvert.org.uk/~bma/).
+ Not quite the same as I use on my site, since that has slightly modified
+ templates.
+ [[!meta stylesheet="bma"]]
+
+* **[blankoblues.css][1]**, contributed by [[Blanko]]. Can be seen on [Blankoblues Demo][2]. Local.css and templates available [here][3].
+
+* **[contraste.css][4]**, contributed by [[Blanko]]. Can be seen on [Contraste Demo][5]. Local.css and templates available [here][6].
+
+* **[wiki.css](http://cyborginstitute.net/includes/wiki.css)** by [[tychoish]].
+ I typically throw this in as `local.css` in new wikis as a slightly more clear and readable
+ layout for wikis that need to be functional and elegant, but not necessarily uniquely designed.
+  Currently in use by [the outeralliance wiki](http://oa.criticalfutures.com/).
+* **[ikiwiked gray-green](https://github.com/AntPortal/ikiwiked/raw/master/theme/gray-green/local.css)**, contributed by [Danny Castonguay](https://antportal.com/).
+* **[ikiwiked gray-orange](https://github.com/AntPortal/ikiwiked/raw/master/theme/gray-orange/local.css)**, contributed by [Danny Castonguay](https://antportal.com/). Can be seen in action at [antportal.com/wiki](https://antportal.com/wiki/). Feel free to modify and contribute on [Github](https://github.com/AntPortal/ikiwiked)
+<!-- Page links -->
+
+ [1]: http://blankoworld.homelinux.com/demo/ikiwiki/blankoblues/src/local.css (Download Blankoblues CSS)
+ [2]: http://blankoworld.homelinux.com/demo/ikiwiki/blankoblues/htdocs/ (Take a tour on Blankoblues Demo)
+ [3]: http://blankoworld.homelinux.com/demo/ikiwiki/blankoblues/blankoblues.tar.gz (Download local.css and templates for Blankoblues theme)
+ [4]: http://blankoworld.homelinux.com/demo/ikiwiki/contraste/src/local.css (Download Contraste CSS)
+ [5]: http://blankoworld.homelinux.com/demo/ikiwiki/contraste/htdocs/ (Take a tour on Contraste Demo)
+ [6]: http://blankoworld.homelinux.com/demo/ikiwiki/contraste/contraste.tar.gz (Download local.css and templates for Contraste theme)
diff --git a/doc/css_market/02_template.css b/doc/css_market/02_template.css
new file mode 100644
index 000000000..522d9a452
--- /dev/null
+++ b/doc/css_market/02_template.css
@@ -0,0 +1,307 @@
+/* ikiwiki local style sheet */
+
+/* Add local styling here, instead of modifying style.css. */
+
+/* This stylesheet is based on 02 Template
+ (http://www.openwebdesign.org/viewdesign.phtml?id=3057)
+ by jarico (http://www.openwebdesign.org/userinfo.phtml?user=jcarico)
+
+ License: public domain
+
+ modifications for ikiwiki by Martin Wuertele <web@wuertele.net>
+*/
+
+/******** General tags ********/
+
+body{
+ padding:0;
+ margin:0;
+ border:none;
+ font-family:georgia, arial;
+ font-size:12px;
+ background:url(images/bg.png) repeat-x top #4B546B;
+ /*background:#4B546B;*/
+ color:#000000;
+}
+
+#pagestyle{
+ width:80%;
+ padding-top:5px;
+ padding-bottom:5px;
+ padding-left:5px;
+ padding-right:5px;
+ margin:0 auto;
+ border:none;
+ background-color:#FFFFFF;
+
+}
+
+#pagehead{
+ background:url(images/header.png) no-repeat center #808080;
+ height:203px;
+}
+
+a{
+ margin:0;
+ padding:0;
+ border-bottom:1px #8994AF dotted;
+ border-top:none;
+ border-left:none;
+ border-right:none;
+ background-color:transparent;
+ color:#000000;
+ display:inline;
+ color:#8994AF;
+ text-decoration:none;
+}
+
+a:hover{
+ margin:0;
+ padding:0;
+ border-bottom:1px #4B556A dotted;
+ border-top:none;
+ border-left:none;
+ border-right:none;
+ background-color:transparent;
+ color:#000000;
+ display:inline;
+ color:#4B556A;
+ text-decoration:none;
+}
+
+a img{
+ border:0;
+}
+
+p{
+ margin:0 0 18px 10px;
+}
+
+pre{
+ margin:0 0 18px 10px;
+}
+
+ul,ol,dl{
+}
+
+ul ul,ol ol{
+ margin:4px 0 4px 35px;
+}
+
+h1{
+ font-family:georgia, arial;
+ font-size:16px;
+ color:#4A5368;
+ text-transform:uppercase;
+ font-weight:bold;
+ padding:0;
+ margin-top:0;
+ margin-bottom:5px;
+ margin-left:auto;
+ margin-right:auto;
+}
+
+h2{
+ font-family:georgia, arial;
+ font-size:14px;
+ color:#4A5368;
+ text-transform:uppercase;
+ font-weight:bold;
+ padding:0;
+ margin-top:0;
+ margin-bottom:5px;
+ margin-left:auto;
+ margin-right:auto;
+}
+
+h3{
+ font-family:georgia, arial;
+ font-size:12px;
+ color:#4A5368;
+ text-transform:uppercase;
+ font-weight:bold;
+ padding:0;
+ margin-top:0;
+ margin-bottom:5px;
+ margin-left:auto;
+ margin-right:auto;
+}
+
+blockquote{
+}
+
+#pageheader{
+}
+
+#afront {
+}
+
+.header{
+ width:auto;
+ border:none;
+ padding-top:1px;
+ padding-bottom:10px;
+ padding-left:5px;
+ padding-right:5px;
+ text-align:left;
+ margin:0;
+ height:20px;
+ line-height:20px;
+ font-family:georgia, arial;
+ font-size:12px;
+ color:#808080;
+}
+
+.actions {
+ border-bottom: none;
+}
+
+#backlinks{
+ width:auto;
+ border:none;
+ padding-top:0;
+ padding-bottom:1px;
+ padding-left:5px;
+ padding-right:5px;
+ text-align:right;
+ margin:0;
+ height:20px;
+ line-height:20px;
+ font-family:georgia, arial;
+ font-size:12px;
+ color:#808080;
+}
+
+div.tags {
+ border: none;
+}
+
+#content{
+}
+
+#contentalt{
+}
+
+#pageinfo {
+ border: none;
+}
+
+.inlinepage {
+ border:none;
+ padding:5px 10px;
+ margin:0
+}
+
+.inlinepage .header {
+ font-family:georgia, arial;
+ font-size:16px;
+ color:#4A5368;
+ text-transform:uppercase;
+ font-weight:bold;
+ padding:0;
+ margin-top:0;
+ margin-bottom:5px;
+ margin-left:auto;
+ margin-right:auto;
+}
+
+#tags {
+ border: none;
+}
+
+/******** Content variations ********/
+
+.feedbutton {
+ color:#ffffff;
+ font-size:0.9em;
+ background-color:#4088b8;
+ border:1px solid #c8c8c8;
+ line-height:1.3em;
+ padding: 0px 0.5em 0px 0.5em;
+}
+
+/*.feedbutton:hover {
+ color: #4088b8 !important;
+ background: #ffffff;
+}*/
+
+
+/******** sidebar ********/
+
+#sidebar{
+ float:right;
+ width:175px;
+ background-color:#FFFFFF;
+ border:1px #808080 solid;
+ padding:5px 10px;
+ margin:0;
+ height:100%
+}
+
+#sidebar ul{
+ list-style:none;
+}
+
+#sidebar li{
+ list-style:none;
+}
+
+#sidebar li a{
+}
+
+#sidebar ul ul{
+}
+
+#sidebar ul ul li a{
+}
+
+#sidebar h1{
+ font-family:georgia, arial;
+ font-size:16px;
+ color:#4A5368;
+ text-transform:uppercase;
+ font-weight:bold;
+ padding:0;
+ margin-top:0;
+ margin-bottom:5px;
+ margin-left:auto;
+ margin-right:auto;
+}
+
+#sidebar h2{
+ margin:3px 0px 8px 0px;
+}
+
+
+/* CSS fixes */
+.popup:focus .balloon {
+ position: absolute;
+ display: inline;
+ margin: 1em 0 0 -2em;
+ padding: 0.625em;
+ border: 2px solid;
+ background-color: #dee;
+ color: black;
+}
+
+
+
+/******** Footer ********/
+#footer{
+ clear:both;
+ text-align:right;
+ color:#808080;
+ border-top:1px #808080 solid;
+ margin:0 auto;
+ padding:8px 0;
+}
+
+.pagedate{
+ font-size:small;
+ text-align:right;
+ margin:0;
+ font-family:georgia, arial;
+ font-size:10px;
+ color:#808080;
+ padding:8px 0;
+}
diff --git a/doc/css_market/02_template.tmpl b/doc/css_market/02_template.tmpl
new file mode 100644
index 000000000..e35c5eb05
--- /dev/null
+++ b/doc/css_market/02_template.tmpl
@@ -0,0 +1,20 @@
+--- /usr/share/ikiwiki/templates/page.tmpl 2007-12-03 20:09:46.000000000 +0100
++++ templates/page.tmpl 2007-12-05 21:15:38.000000000 +0100
+@@ -14,6 +14,9 @@
+ </head>
+ <body>
+
++<div id="pagestyle">
++<div id="pagehead">
++</div>
+ <div class="header">
+ <span>
+ <TMPL_LOOP NAME="PARENTLINKS">
+@@ -111,5 +100,7 @@
+ <!-- from <TMPL_VAR NAME=WIKINAME> -->
+ </div>
+
++</div>
++
+ </body>
+ </html>
diff --git a/doc/css_market/bma.css b/doc/css_market/bma.css
new file mode 100644
index 000000000..347f4430d
--- /dev/null
+++ b/doc/css_market/bma.css
@@ -0,0 +1,108 @@
+/*
+ * local.css: stylesheet for subvert.org.uk.
+ * Copyright © 2008 Benjamin M. A'Lee <bma@subvert.org.uk>
+ *
+ * This work is free software: you can redistribute it and/or modify
+ * it under the terms of the GNU General Public License, version 3, as
+ * published by the Free Software Foundation.
+ *
+ * This work is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+ * GNU General Public License for more details.
+ *
+ * You should have received a copy of the GNU General Public License
+ * along with this work. If not, see <http://www.gnu.org/licenses/>.
+ */
+
+/* Positioning first. Colours at the end. */
+
+body {
+ font-family: sans-serif;
+ margin: 0;
+ padding: 0;
+}
+
+#content {
+ margin: 1em 15em 0 1em;
+ padding: 2em 3em 4em 1.5em;
+ border: 1px solid;
+}
+
+#sidebar {
+ width: 10em;
+ min-height: 50%;
+
+ font-family: sans-serif;
+ text-align: left;
+ margin: 1em 1em 2em 1em;
+ line-height: 1em;
+ border: 1px solid;
+
+ position: absolute;
+ right: 0;
+ top: 5px;
+}
+
+#sidebar h2 {
+ font-size: 1em;
+}
+
+
+.header {
+ padding: 1ex;
+ border-bottom: solid 1px;
+ margin: 0;
+ font-size: 1.5em;
+}
+
+#footer {
+ border: 1px solid;
+ padding: .5em;
+ padding-left: 1em;
+ margin: 1em 15em 1em 1em;
+}
+
+.pagedate, .pagelicense, .pagecopyright {
+ margin-top: 0;
+ margin-bottom: 0;
+}
+
+#pageinfo {
+ border: none;
+}
+
+.inlinepage {
+ border: none;
+}
+
+.pagelicense p, .pagecopyright p {
+ display: inline;
+}
+
+pre {
+ overflow: auto;
+ border: solid;
+ border-width: thin;
+ padding: 5px 10px;
+}
+
+/* Set colours here. */
+
+body {
+ background: royalblue;
+}
+
+#content, #footer, .header {
+ background-color: silver;
+ border-color: black;
+}
+
+#sidebar {
+ background-color: silver;
+ border-color: black;
+}
+
+pre, code {
+ background-color: #EEEEEE;
+}
diff --git a/doc/css_market/cstamas.css b/doc/css_market/cstamas.css
new file mode 100644
index 000000000..0855f3b86
--- /dev/null
+++ b/doc/css_market/cstamas.css
@@ -0,0 +1,69 @@
+/* This template is based on:
+ * "Embedded Moose local.css for use with ikiwiki
+ * Written by Josh Triplett <josh@freedesktop.org>
+ * Some ideas from the Debian lighttpd index.html page."
+ *
+ * The improved version was made by
+ * Csillag Tamas <cstamas@digitus.itk.ppke.hu>
+ * 2008-01-??
+ * */
+
+body {
+ background: #474747;
+}
+
+#content {
+ background: #333333;
+ margin: 10px 0px;
+ border: 1px dotted #c0c0c0;
+ padding: 10px;
+ font-family: sans-serif;
+ color: #acacac;
+}
+
+h1 {
+ font-size: 150%;
+ color: #e6deee;
+}
+
+h2 {
+ font-size: 130%;
+ color: #e6deee;
+}
+
+a {
+ color: #efefef;
+ border-bottom: 1px dashed;
+ text-decoration: none;
+}
+
+a:hover { background: #300000; }
+
+pre {
+ color: #d0d0d0;
+ border: 1px dotted #c0c0c0;
+ background: black;
+ padding: 2px;
+ font-size: 110%;
+}
+
+.feedbutton { background: #ff0000; }
+
+.header {
+ background: #800000;
+ border: 2px solid #500000;
+ padding: 10px;
+ color: #efefef;
+ font-family: sans-serif;
+}
+
+.header a { margin-right: 1ex; color: #efefef; font-family: sans-serif;}
+.pagedate, .tags, #backlinks { background: #640000; border: 2px solid #500000; padding: 4px; color: #efefef; font-family: sans-serif;}
+.tags { color: yellow }
+.selflink { background: yellow; color: black }
+
+.actions ul { background: #640000; border: none; padding-bottom: 0px; font-family: sans-serif;}
+.actions a { margin-right: 1ex; color: #dfdfdf; font-family: sans-serif;}
+#footer { border: none; font-family: sans-serif;}
+
+
diff --git a/doc/css_market/discussion.mdwn b/doc/css_market/discussion.mdwn
new file mode 100644
index 000000000..3dc47b55a
--- /dev/null
+++ b/doc/css_market/discussion.mdwn
@@ -0,0 +1,37 @@
+What is the correct way to install the .tmpl files? -- [[JosephTurian]]
+
+> For themes that need them, you can set `templatedir` to some directory in
+> your setup file, and put the templates there. Or install directly overtop
+> ikiwiki's standard templates (in, eg `/usr/share/ikiwiki/templates`)
+> --[[Joey]]
+
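For reference, a minimal sketch of what such a `templatedir` override might look like in an ikiwiki setup file (the path shown is illustrative; files placed there shadow the templates shipped in `/usr/share/ikiwiki/templates`):

```perl
# In your wiki's setup file, point ikiwiki at a directory of local
# template overrides (only the templates you copy there are overridden):
templatedir => "/home/user/wikitemplates",
```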
+----
+
+How can I update a .css file uploaded to the CSS market?
+My CSS has been updated to address comments (for the comments plugin), but I cannot find a way to update zack.css.
+The most recent version is always available at: <http://git.upsilon.cc/cgi-bin/gitweb.cgi?p=zack-homepage.git;a=blob_plain;f=local.css;hb=HEAD>
+
+-- [[StefanoZacchiroli]]
+
+> Just letting me know about the change works -- updated. --[[Joey]]
+
+----
+
+Added small changes to embeddedmoose.css to work with ikiwiki 3.x. Figured here is as good as any until Josh can review and update if he so chooses: <http://bosboot.org/tb-embeddedmoose.css>. It removes annoying borders around the header and footer. -- [[TimBosse]]
+
+-----
+
+I removed this from the list since both places to download it are broken.
+--[[Joey]]
+
+* **Refresh**, contributed by [[FredericLespez]]. Adapted from a free template
+ designed by [styleshout](http://www.styleshout.com).
+ You can see it [here](http://fred.ccheznous.org). You can download the local.css file and
+ the modified templates [here](http://fred.ccheznous.org/refresh_20060602.tgz).
+
+ * This link (above) seems to deliver an empty tarball.
+
+ * You'll find an updated version of these templates [here](http://www.der-winnie.de/~winnie/configs/ikiwiki-templates.tar.gz).
+ These templates are known to work with ikiwiki 2.31, and since I always install the newest one on my server, I will update them on a regular basis.
+ * (This link appears to be broken?)
+
diff --git a/doc/css_market/embeddedmoose.css b/doc/css_market/embeddedmoose.css
new file mode 100644
index 000000000..3d19fca3f
--- /dev/null
+++ b/doc/css_market/embeddedmoose.css
@@ -0,0 +1,13 @@
+/* Embedded Moose local.css for use with ikiwiki
+ * Written by Josh Triplett <josh@freedesktop.org>
+ * Some ideas from the Debian lighttpd index.html page. */
+
+body { background: #e7e7e7; }
+
+#content { background: #ffffff; margin: 10px 0px; border: 2px solid #c0c0c0; padding: 10px; font-family: sans-serif;}
+
+.header { background: #4b6983; border: 2px solid #7590ae; padding: 10px; color: #ffffff; font-family: sans-serif;}
+.header a { margin-right: 1ex; color: #ffffff; font-family: sans-serif;}
+
+.actions ul { border: none; padding-bottom: 0px; font-family: sans-serif;}
+#footer { border: none; font-family: sans-serif;}
diff --git a/doc/css_market/kirkambar.css b/doc/css_market/kirkambar.css
new file mode 100644
index 000000000..e756a1260
--- /dev/null
+++ b/doc/css_market/kirkambar.css
@@ -0,0 +1,142 @@
+/*
+ * IkiWiki `local.css` stylesheet following the Gitweb theme.
+ *
+ * Copyright © 2006 Recai Oktaş <roktasATdebian.org>
+ *
+ * Licensed under the GNU General Public License, version 2.
+ * See the file `http://www.gnu.org/copyleft/gpl.txt`.
+ *
+ */
+
+
+/*
+ * -----------------------------------------------------------------------------
+ * Generic style elements.
+ * -----------------------------------------------------------------------------
+ */
+
+body {
+ font-family: "Trebuchet MS",
+ "Luxi Sans",
+ "Bitstream Vera Sans",
+ "Tahoma",
+ "Verdana",
+ "Arial",
+ "Helvetica",
+ sans-serif;
+ padding: 1em;
+ margin: 0;
+ font-size: 100.01%;
+ line-height: 1.5em;
+ color: black;
+ background-color: white;
+}
+
+pre, tt, code {
+ font-family: "Bitstream Vera Sans Mono",
+ "Luxi Mono",
+ "Courier New",
+ "Courier",
+ monospace;
+}
+
+pre, tt, code, tr.changeinfo, .blogform {
+ color: inherit;
+ background-color: #f6f6f0;
+}
+
+pre {
+ margin: 0px 96px 0px 48px;
+ padding: 12px 0px 12px 0px;
+}
+
+h1, h2, h3, h4, h5, h6, dl, dt {
+ font-weight: bold;
+ background-color: inherit;
+ color: #c00040 !important;
+}
+
+h1, h2, h3, h4, h5, h6 {
+ letter-spacing: .04em;
+}
+
+
+/*
+ * -----------------------------------------------------------------------------
+ * Headers, footers.
+ * -----------------------------------------------------------------------------
+ */
+
+.header, #footer, .changeheader {
+ color: black !important;
+ background-color: #d9d8d1;
+}
+
+.header, #footer {
+ height: 1.8em;
+ padding: 6px 6px;
+ border: 1px solid #aaa;
+ margin-bottom: 4px;
+ display: block;
+}
+
+.header {
+ font-size: 120.01%;
+ font-weight: normal;
+ letter-spacing: .11em;
+}
+
+span.header {
+ background-image: none !important;
+ text-align: right;
+}
+
+.header { /* Optional header logo (right aligned). */
+ background-image: url(/* ENTER HEADER LOGO PATH */);
+ background-repeat: no-repeat;
+ background-position: 99%;
+}
+
+#footer { /* Optional footer logo (right aligned). */
+ background-image: url(/* ENTER FOOTER LOGO PATH */);
+ background-repeat: no-repeat;
+ background-position: 99%;
+}
+
+
+/*
+ * -----------------------------------------------------------------------------
+ * Specials.
+ * -----------------------------------------------------------------------------
+ */
+
+#searchform {
+ position: absolute;
+ top: 25px;
+ right: 90px;
+}
+
+
+td.changetime {
+ font-style: italic;
+}
+
+td.changelog {
+ font-style: normal;
+ font-size: x-small;
+ font-weight: bold;
+}
+
+/*
+ * Attribution `div` for IkiWiki. Use something like as follows:
+ * <div id="attribution">
+ * This site is maintained using Joey Hess's
+ * <a href="http://ikiwiki.info/">
+ * <img src="ikiwiki.png" title="IkiWiki" alt="IkiWiki" />
+ * </a>.
+ * </div>
+ */
+#attribution img {
+ border: 1px solid black;
+ padding: 2px;
+}
diff --git a/doc/css_market/zack.css b/doc/css_market/zack.css
new file mode 100644
index 000000000..5a0521d54
--- /dev/null
+++ b/doc/css_market/zack.css
@@ -0,0 +1,193 @@
+/* local.css stylesheet to be used with ikiwiki
+ *
+ * Copyright: (C) 2006 Stefano Zacchiroli <zack@debian.org>
+ * License: GNU General Public License version 2 or above.
+ *
+ * TODO
+ * - plone-like actions in the toplevel bar, but remember: resist the
+ * temptation of making them floating to the right: the breadcrumb trail can
+ * grow indefinitely
+ * - blog form aligned to the right, keeping the RSS logo to the left
+ * - some rendering for the tags (a la 'xhtml' logo of plone? ...)
+ * - some rendering for backlinks
+ * - some rendering for posting dates
+ */
+
+body {
+ font-family: sans-serif;
+ font-size: medium;
+}
+h1, h2, h3, h4 {
+ font-weight: normal;
+}
+h1 { font-size: 140%; }
+h2 { font-size: 120%; }
+h3 { font-size: 110%; }
+h4 { font-size: 105% }
+
+a { text-decoration: none; }
+a:hover { text-decoration: underline; }
+
+.flow {
+ float: right;
+ margin-left: 10px;
+ margin-bottom: 10px;
+ text-align: center;
+}
+
+.attrib-caption {
+ font-size: xx-small;
+ font-style: italic;
+}
+
+input {
+ border: solid 1px;
+ border-color: #aaa;
+}
+
+.header { font-weight: normal; }
+
+.selflink { text-decoration: underline; }
+
+.pageheader .actions ul,
+#sitemeta {
+ border-top: solid 1px;
+ border-bottom: solid 1px;
+ font-size: small;
+ border-color: #aaa;
+ background: #eee;
+}
+.actions ul {
+ padding: 1px;
+ margin-top: 5px;
+}
+#sitemeta {
+ padding: 0;
+ margin-bottom: 5px;
+}
+#backlinks,
+.tags {
+ margin-top: 0;
+ margin-bottom: 0;
+}
+
+#pageinfo {
+ border: none;
+}
+
+#searchform div:before {
+ font-size: small;
+ content: "search:";
+}
+#searchform input {
+ border-top: none;
+ border-bottom: none;
+ vertical-align: bottom;
+ margin-right: 7px;
+}
+
+#sidebar {
+ border: solid;
+ border-width: 1px;
+ padding: 0;
+ margin-top: 15px;
+ border: 1px solid;
+ border-color: #aaa;
+ background: #eee;
+ width: 16ex;
+}
+#sidebar ul {
+ margin: 0;
+ padding-left: 1em;
+ list-style-type: none;
+}
+#sidebar ul ul {
+ padding-left: 1.5em;
+ font-size: 90%;
+}
+
+#pageinfo,
+#footer {
+ margin: 0;
+}
+#pageinfo {
+ font-size: small;
+}
+.pagecopyright,
+.pagelicense,
+.pagedate {
+ margin: 0;
+ display: inline;
+}
+#backlinks {
+ margin-top: 5px;
+ margin-bottom: 10px;
+ font-size: larger;
+}
+.validation {
+ display: inline;
+ float: right;
+}
+
+.pagecloud {
+ margin-left: 5px;
+}
+
+table.identikit tr th {
+ text-align: right;
+ vertical-align: top;
+}
+table.identikit tr th:after {
+ content: ":";
+}
+table.identikit tr td {
+ text-align: left;
+ vertical-align: top;
+}
+
+.doi_logo , .doi_logo a {
+ background: #3965bd;
+ color: white !important;
+ font-size: 80%;
+ text-decoration: none;
+ font-family: times;
+ font-weight: bold;
+ padding: 0px 1px 0px 2px;
+}
+
+#comments {
+ margin-top: 5ex;
+ border-top: solid 1px;
+ border-color: #aaa;
+ font-size: small;
+}
+
+#comments #feedlink {
+ text-align: right;
+}
+#comments #feedlink:before {
+ content: "comment feeds: ";
+}
+
+.addcomment {
+ padding: 5px;
+ font-style: italic;
+}
+
+.comment {
+ border: none;
+ background-color: #eee;
+ margin: 5px;
+ margin-top: 10px;
+}
+
+.comment-subject {
+ font-style: normal;
+}
+
+.comment-header {
+ border-top: solid 1px;
+ border-color: #aaa;
+ text-align: right;
+ font-style: normal;
+}
diff --git a/doc/download.mdwn b/doc/download.mdwn
new file mode 100644
index 000000000..f1ae5ad31
--- /dev/null
+++ b/doc/download.mdwn
@@ -0,0 +1,52 @@
+Here's how to get ikiwiki in source or prepackaged form. See [[setup]] for
+how to use it, and be sure to add your wiki to [[IkiwikiUsers]] if you use
+ikiwiki.
+
+## source
+
+Ikiwiki is developed in a [[git_repository|git]].
+
+The best place to download a tarball of the latest release is from
+<http://packages.debian.org/unstable/source/ikiwiki>.
+
+Manual installation steps and requirements are listed on the [[install]] page.
+
+## Debian / Ubuntu packages
+
+To install with [apt](http://www.debian.org/doc/manuals/debian-reference/ch02.en.html#_basic_package_management_operations), if using Debian or Ubuntu:
+
+ apt-get install ikiwiki
+
+Or download the deb from <http://packages.debian.org/unstable/web/ikiwiki>.
+
+There is a backport of a recent version of ikiwiki for Debian 5.0 at
+<http://packages.debian.org/lenny-backports/ikiwiki>.
+
+There is also an unofficial backport of ikiwiki for Ubuntu Jaunty, provided by
+[[Paweł_Tęcza|users/ptecza]],
+at [http://gpa.net.icm.edu.pl/ubuntu/](http://gpa.net.icm.edu.pl/ubuntu/index-en.html).
+
+## RPM packages
+
+Fedora versions 8 and newer have RPMs of ikiwiki available.
+
+Ikiwiki's source includes an RPM spec file, which you can use to build your
+own RPM.
+
+## BSD ports
+
+Ikiwiki can be installed [from macports](http://www.macports.org/ports.php?by=name&substr=ikiwiki)
+by running `sudo port install ikiwiki`.
+
+NetBSD and many other platforms: pkgsrc has an [ikiwiki package](ftp://ftp.netbsd.org/pub/pkgsrc/current/pkgsrc/www/ikiwiki/README.html).
+
+FreeBSD has ikiwiki in its
+[ports collection](http://www.freshports.org/www/ikiwiki/).
+
+## Other packages
+
+Gentoo has an [ebuild](http://bugs.gentoo.org/show_bug.cgi?id=144453) in its bug database.
+
+The [openSUSE Build Service](http://software.opensuse.org/search?baseproject=ALL&p=1&q=ikiwiki) has packages for openSUSE.
+
+A [PKGBUILD for Arch Linux](http://aur.archlinux.org/packages.php?ID=12284) is in the AUR.
diff --git a/doc/examples.mdwn b/doc/examples.mdwn
new file mode 100644
index 000000000..19631ad47
--- /dev/null
+++ b/doc/examples.mdwn
@@ -0,0 +1,12 @@
+To make it easier to get started using ikiwiki for some common tasks, this
+page gives some examples of ways to use ikiwiki.
+
+* [[blog]] - a weblog with tags, a tag cloud, archives, and an optional sidebar
+* [[softwaresite]] - a website for some software package, the package
+ can also build static html docs from its wiki
+
+Each example is contained in its own subdirectory; just copy the source
+files into your wiki to start using one of the examples.
+
+The [[tips]] page has some other ideas for ways to use ikiwiki, and the
+[[css_market]] and [[theme market|themes]] have some example stylesheets to change ikiwiki's look.
diff --git a/doc/examples/blog.mdwn b/doc/examples/blog.mdwn
new file mode 100644
index 000000000..5f8f6c3ce
--- /dev/null
+++ b/doc/examples/blog.mdwn
@@ -0,0 +1,26 @@
+This is an [[example_blog|index]]. Just copy the blog subdirectory into
+your wiki to quickly get started blogging with ikiwiki.
+
+Or, run this command to set up a blog with ikiwiki.
+
+ % ikiwiki -setup /etc/ikiwiki/auto-blog.setup
+
+Some additional configuration you might want to do, if not using
+`auto-blog.setup`:
+
+* Make sure to configure ikiwiki to generate RSS or Atom feeds.
+
+* Make sure you have the [[tag|plugins/tag]] plugin enabled, and the
+ `tagbase` set to "tags". Tag pages will then automatically be created.
+ An example of how to tag a post is:
+ \[[!tag life]]
+
+* Enable the [[pagestats|plugins/pagestats]] plugin to get a tag cloud
+ to display on the [[index]].
+
+* Enable the [[comments|plugins/comments]] plugin to
+ enable comments to posts to the blog.
+
+* Enable the [[calendar|plugins/calendar]] plugin and run the
+ [[ikiwiki-calendar]] command from cron daily to get an interlinked
+ set of calendar archives.
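As a rough sketch, the daily cron job for the calendar archives could look like the following crontab entry (the setup-file path, schedule, and pagespec are assumptions; see the `ikiwiki-calendar` man page for the exact arguments):

```
# m h dom mon dow  command
0 4 * * *  ikiwiki-calendar ~/ikiwiki.setup "posts/* and !*/Discussion"
```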
diff --git a/doc/examples/blog/archives.mdwn b/doc/examples/blog/archives.mdwn
new file mode 100644
index 000000000..d07b73b74
--- /dev/null
+++ b/doc/examples/blog/archives.mdwn
@@ -0,0 +1,8 @@
+[[!if test="archives/*" then="""
+Browse through blog archives by year:
+[[!map pages="./archives/* and !./archives/*/* and !*/Discussion"]]
+"""
+else="""
+You need to use the `ikiwiki-calendar` program to generate calendar-based
+archive pages.
+"""]]
diff --git a/doc/examples/blog/comments.mdwn b/doc/examples/blog/comments.mdwn
new file mode 100644
index 000000000..e22b50a34
--- /dev/null
+++ b/doc/examples/blog/comments.mdwn
@@ -0,0 +1,10 @@
+[[!sidebar content="""
+[[!inline pages="comment_pending(./posts/*)" feedfile=pendingmoderation
+description="comments pending moderation" show=-1]]
+Comments in the [[!commentmoderation desc="moderation queue"]]:
+[[!pagecount pages="comment_pending(./posts/*)"]]
+"""]]
+
+Recent comments on posts in the [[blog|index]]:
+[[!inline pages="./posts/*/Discussion or comment(./posts/*)"
+template="comment"]]
diff --git a/doc/examples/blog/discussion.mdwn b/doc/examples/blog/discussion.mdwn
new file mode 100644
index 000000000..d9c716658
--- /dev/null
+++ b/doc/examples/blog/discussion.mdwn
@@ -0,0 +1,13 @@
+## How to remove the postform for my blog from the front page?
+
+I have an inline to create a blog on the front page of my site. I don't want any visitors to see that form nor attempt to click "Edit" on it. I tried setting postform="no" but I did not notice any change.
+
+Any suggestions on how I can have a private webpage that offers the blog post form ("Add a new post titled:") and also turn off that form from my front page but still keep the blog articles displayed from the front page?
+
+I looked at the "inline" docs but may have overlooked this.
+
+I do see I can disable the editpage plugin to remove the form from the front page. But that also prevents me from adding new blog posts at all (I only want the form gone from the front page).
+
+-- [[JeremyReed]]
+
+> You need two separate inlines, one on your front page which can be as simple as `\[[!inline pages="blog/*"]]`, and another on a hidden/unadvertised page, which has `postform=yes` added, that you will use to add posts. Removing the 'Edit' link from the front page (and all other pages — presumably you don't want it on blog post pages either) can be achieved in a number of ways. I do it by removing it from my `page.tmpl` file (point `templatedir` in your setup file to a directory under your control; copy `/usr/share/ikiwiki/templates/page.tmpl` into it, and remember that every time ikiwiki is upgraded, potentially the file has changed, and you might need to merge in the changes). A better way might be to hide the link via CSS (`.actions { display: none; }`). You can't add pages via the web interface if you remove [[plugins/editpage]] from your setup. You should look at [[plugins/lockedit]] to make sure that only you can edit pages/submit blog posts, should anyone else stumble across your unadvertised "submit blog post" page. — [[Jon]]
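A minimal sketch of the two-inline arrangement described above (page names and pagespecs are illustrative):

```
# On index.mdwn -- just display the posts, no posting form:
[[!inline pages="blog/* and !*/Discussion" show="10"]]

# On some unadvertised page -- the same inline, plus the posting form:
[[!inline pages="blog/* and !*/Discussion" postform=yes rootpage="blog"]]
```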
diff --git a/doc/examples/blog/index.mdwn b/doc/examples/blog/index.mdwn
new file mode 100644
index 000000000..7914cd203
--- /dev/null
+++ b/doc/examples/blog/index.mdwn
@@ -0,0 +1,11 @@
+[[!if test="enabled(sidebar)" then="""
+[[!sidebar]]
+""" else="""
+[[!inline pages=sidebar raw=yes]]
+"""]]
+
+[[!inline pages="page(./posts/*) and !*/Discussion" show="10"
+actions=yes rootpage="posts"]]
+
+
+This blog is powered by [ikiwiki](http://ikiwiki.info).
diff --git a/doc/examples/blog/posts.mdwn b/doc/examples/blog/posts.mdwn
new file mode 100644
index 000000000..2bd0f1d6f
--- /dev/null
+++ b/doc/examples/blog/posts.mdwn
@@ -0,0 +1,3 @@
+Here is a full list of posts to the [[blog|index]].
+
+[[!inline pages="page(./posts/*) and !*/Discussion" archive=yes feedshow=10 quick=yes trail=yes]]
diff --git a/doc/examples/blog/posts/first_post.mdwn b/doc/examples/blog/posts/first_post.mdwn
new file mode 100644
index 000000000..343497d18
--- /dev/null
+++ b/doc/examples/blog/posts/first_post.mdwn
@@ -0,0 +1,2 @@
+This is the first post to this example blog. To add new posts, just add
+files to the posts/ subdirectory, or use the web form.
diff --git a/doc/examples/blog/sidebar.mdwn b/doc/examples/blog/sidebar.mdwn
new file mode 100644
index 000000000..e0895f63f
--- /dev/null
+++ b/doc/examples/blog/sidebar.mdwn
@@ -0,0 +1,10 @@
+[[!if test="enabled(calendar)" then="""
+[[!calendar pages="page(./posts/*) and !*/Discussion"]]
+"""]]
+
+[[Recent Comments|comments]]
+
+[[Archives]]
+
+[[Tags]]:
+[[!pagestats style="list" pages="./tags/*" among="./posts/*"]]
diff --git a/doc/examples/blog/tags.mdwn b/doc/examples/blog/tags.mdwn
new file mode 100644
index 000000000..b5eca5b71
--- /dev/null
+++ b/doc/examples/blog/tags.mdwn
@@ -0,0 +1,3 @@
+[[!pagestats pages="./tags/*" among="./posts/*"]]
+
+On the right you can see the tag cloud for this blog.
diff --git a/doc/examples/softwaresite.mdwn b/doc/examples/softwaresite.mdwn
new file mode 100644
index 000000000..99f791177
--- /dev/null
+++ b/doc/examples/softwaresite.mdwn
@@ -0,0 +1,19 @@
+This is an [[example_software_package_website|index]].
+Just copy the softwaresite subdirectory into your wiki to quickly produce
+a website for a piece of software.
+
+Some additional configuration you might want to do:
+
+* Make sure to configure ikiwiki to generate RSS or Atom feeds.
+
+* The softwaresite/doc subdirectory is intended to hold docs about your
+ software package. These docs can be included in the package itself;
+ there is a [[softwaresite/Makefile]] that will use ikiwiki to build
+ static html documentation. ikiwiki itself uses a similar system to build
+ its documentation.
+
+* Read the [[tips/integrated_issue_tracking_with_ikiwiki]] article for tips
+ about how to use ikiwiki as a BTS.
+
+* Read [[tips/spam_and_softwaresites]] for information on how to keep spam
+ and spam-fighting commits out of your main version control history.
diff --git a/doc/examples/softwaresite/Makefile b/doc/examples/softwaresite/Makefile
new file mode 100644
index 000000000..f2c4d8e54
--- /dev/null
+++ b/doc/examples/softwaresite/Makefile
@@ -0,0 +1,15 @@
+# Build static html docs suitable for being shipped in the software
+# package. This depends on ikiwiki being installed to build the docs.
+
+ifeq ($(shell which ikiwiki),)
+IKIWIKI=echo "** ikiwiki not found" >&2 ; echo ikiwiki
+else
+IKIWIKI=ikiwiki
+endif
+
+all:
+ $(IKIWIKI) `pwd` html -v --wikiname FooBar --plugin=goodstuff \
+ --exclude=html --exclude=Makefile
+
+clean:
+ rm -rf .ikiwiki html
diff --git a/doc/examples/softwaresite/bugs.mdwn b/doc/examples/softwaresite/bugs.mdwn
new file mode 100644
index 000000000..46ead2b62
--- /dev/null
+++ b/doc/examples/softwaresite/bugs.mdwn
@@ -0,0 +1,4 @@
+This is FooBar's bug list. Link bugs to [[bugs/done]] when done.
+
+[[!inline pages="./bugs/* and !./bugs/done and !link(done)
+and !*/Discussion" actions=yes postform=yes show=0]]
diff --git a/doc/examples/softwaresite/bugs/done.mdwn b/doc/examples/softwaresite/bugs/done.mdwn
new file mode 100644
index 000000000..ad332e2a2
--- /dev/null
+++ b/doc/examples/softwaresite/bugs/done.mdwn
@@ -0,0 +1,3 @@
+recently fixed [[bugs]]
+
+[[!inline pages="./* and link(./done) and !*/Discussion" sort=mtime show=10]]
diff --git a/doc/examples/softwaresite/bugs/fails_to_frobnicate.mdwn b/doc/examples/softwaresite/bugs/fails_to_frobnicate.mdwn
new file mode 100644
index 000000000..0f47e810a
--- /dev/null
+++ b/doc/examples/softwaresite/bugs/fails_to_frobnicate.mdwn
@@ -0,0 +1,4 @@
+FooBar, when used with the `--frob` option, fails to properly frobnicate
+output.
+
+> This is fixed in [[news/version_1.0]]; marking this bug [[done]].
diff --git a/doc/examples/softwaresite/bugs/hghg.mdwn b/doc/examples/softwaresite/bugs/hghg.mdwn
new file mode 100644
index 000000000..cece64126
--- /dev/null
+++ b/doc/examples/softwaresite/bugs/hghg.mdwn
@@ -0,0 +1 @@
+hghg
diff --git a/doc/examples/softwaresite/bugs/needs_more_bugs.mdwn b/doc/examples/softwaresite/bugs/needs_more_bugs.mdwn
new file mode 100644
index 000000000..a150570a4
--- /dev/null
+++ b/doc/examples/softwaresite/bugs/needs_more_bugs.mdwn
@@ -0,0 +1,3 @@
+FooBar does not have enough bugs, which suggests that it's not a real Free
+Software project. Please help create more bugs by adding code to FooBar!
+:-)
diff --git a/doc/examples/softwaresite/contact.mdwn b/doc/examples/softwaresite/contact.mdwn
new file mode 100644
index 000000000..facfa900f
--- /dev/null
+++ b/doc/examples/softwaresite/contact.mdwn
@@ -0,0 +1,7 @@
+To reach the authors of FooBar, join channel `#foobar` on the `examplenet`
+irc network.
+
+There's also a mailing list,
+[foobar-l](http://example.com/mailman/listinfo/foobar-l).
+
+Be sure to read the [[doc/FAQ]] first.
diff --git a/doc/examples/softwaresite/doc.mdwn b/doc/examples/softwaresite/doc.mdwn
new file mode 100644
index 000000000..f134febb6
--- /dev/null
+++ b/doc/examples/softwaresite/doc.mdwn
@@ -0,0 +1,5 @@
+Documentation for FooBar.
+
+* First, you'll want to [[install]] it.
+* Then you'll want to [[setup]] the config files.
+* There's also a [[FAQ]].
diff --git a/doc/examples/softwaresite/doc/faq.mdwn b/doc/examples/softwaresite/doc/faq.mdwn
new file mode 100644
index 000000000..fe0c3eff0
--- /dev/null
+++ b/doc/examples/softwaresite/doc/faq.mdwn
@@ -0,0 +1,11 @@
+FooBar frequently asked questions.
+
+[[!toc ]]
+
+## Is this a real program?
+
+No, it's just an example.
+
+## Really?
+
+Yes, really.
diff --git a/doc/examples/softwaresite/doc/install.mdwn b/doc/examples/softwaresite/doc/install.mdwn
new file mode 100644
index 000000000..1e877a45a
--- /dev/null
+++ b/doc/examples/softwaresite/doc/install.mdwn
@@ -0,0 +1,10 @@
+Installing FooBar is pretty straightforward:
+
+ tar xzvf foobar.tar.gz
+ cd foobar
+ ./configure
+ make
+ make install
+
+Note that you'll need `libfrobnicate` installed first. You might also want to
+edit `config.h`.
diff --git a/doc/examples/softwaresite/doc/setup.mdwn b/doc/examples/softwaresite/doc/setup.mdwn
new file mode 100644
index 000000000..aa2b26345
--- /dev/null
+++ b/doc/examples/softwaresite/doc/setup.mdwn
@@ -0,0 +1,4 @@
+FooBar is configured via the config file `/etc/foobarrc`, and the per-user
+`~/.foobarrc`.
+
+The file format should be self-explanatory.
diff --git a/doc/examples/softwaresite/download.mdwn b/doc/examples/softwaresite/download.mdwn
new file mode 100644
index 000000000..1b4e599eb
--- /dev/null
+++ b/doc/examples/softwaresite/download.mdwn
@@ -0,0 +1,5 @@
+FooBar tarballs can be downloaded from
+[here](http://foobar.example.com/download/).
+
+There's also a Subversion repository, at
+`svn://foobar.example.com/foobar/trunk`.
diff --git a/doc/examples/softwaresite/index.mdwn b/doc/examples/softwaresite/index.mdwn
new file mode 100644
index 000000000..e03a969a0
--- /dev/null
+++ b/doc/examples/softwaresite/index.mdwn
@@ -0,0 +1,13 @@
+FooBar is an amazing example program that does not exist. Use it for all
+your example program needs. This is its wiki.
+
+* **[[download]]**
+* [[news]]
+* [[documentation|doc]]
+* [[bugs]]
+* [[contact]]
+
+----
+
+This wiki is powered by [ikiwiki](http://ikiwiki.info).
+
diff --git a/doc/examples/softwaresite/news.mdwn b/doc/examples/softwaresite/news.mdwn
new file mode 100644
index 000000000..20efba6e0
--- /dev/null
+++ b/doc/examples/softwaresite/news.mdwn
@@ -0,0 +1,5 @@
+This is where announcements of new releases, features, and other news are
+posted. FooBar users are encouraged to subscribe to this page's RSS
+feed.
+
+[[!inline pages="./news/* and !*/Discussion" rootpage="news" show="30"]]
diff --git a/doc/examples/softwaresite/news/version_1.0.mdwn b/doc/examples/softwaresite/news/version_1.0.mdwn
new file mode 100644
index 000000000..83c805e6e
--- /dev/null
+++ b/doc/examples/softwaresite/news/version_1.0.mdwn
@@ -0,0 +1 @@
+Version 1.0 of foobar is released. [[Download]] it today!
diff --git a/doc/examples/softwaresite/templates/release.mdwn b/doc/examples/softwaresite/templates/release.mdwn
new file mode 100644
index 000000000..ac7ff93c7
--- /dev/null
+++ b/doc/examples/softwaresite/templates/release.mdwn
@@ -0,0 +1,7 @@
+<TMPL_IF news>News for FooBar <TMPL_VAR version>:
+
+<TMPL_VAR news>
+
+</TMPL_IF>
+FooBar <TMPL_VAR version> released with [[!toggle text="these changes" id="changelog"]]
+[[!toggleable id="changelog" text="""<TMPL_VAR changelog>"""]]
diff --git a/doc/favicon.ico b/doc/favicon.ico
new file mode 100644
index 000000000..b55eba280
--- /dev/null
+++ b/doc/favicon.ico
Binary files differ
diff --git a/doc/features.mdwn b/doc/features.mdwn
new file mode 100644
index 000000000..66f7ecb73
--- /dev/null
+++ b/doc/features.mdwn
@@ -0,0 +1,181 @@
+An overview of some of ikiwiki's features:
+[[!toc ]]
+
+## Uses a real RCS
+
+Rather than implement its own system for storing page histories etc,
+ikiwiki uses a real [[Revision_Control_System|rcs]]. This isn't (just)
+because we're lazy, it's because a real RCS is a good thing to have, and
+there are advantages to using one that are not possible with a standard
+wiki.
+
+Instead of editing pages in a stupid web form, you can use vim and commit
+changes via [[Subversion|rcs/svn]], [[rcs/git]], or any of a number of other
+[[Revision_Control_Systems|rcs]].
+
+Ikiwiki can be run from a [[post-commit]] hook to update your wiki
+immediately whenever you commit a change using the RCS.
+
+It's even possible to securely let
+[[anonymous_users_git_push_changes|tips/untrusted_git_push]]
+to the wiki.
+
+Note that ikiwiki does not require an RCS to function. If you want to
+run a simple wiki without page history, it can do that too.
+
+## A wiki compiler
+
+Ikiwiki is a wiki compiler; it builds a static website for your wiki, and
+updates it as pages are edited. It is fast and smart about updating a wiki:
+it only builds pages that have changed (and tracks things like creation of
+new pages and links that can indirectly cause a page to need a rebuild).
+
+## Supports many markup languages
+
+By default, pages in the wiki are written using the [[ikiwiki/MarkDown]] format.
+Any page with a filename ending in ".mdwn" is converted from markdown to html
+by ikiwiki. Markdown understands text formatted as it would be in an email,
+and is quite smart about converting it to html. The only additional markup
+provided by ikiwiki on top of regular markdown is the [[ikiwiki/WikiLink]] and
+the [[ikiwiki/directive]].
+
+If you prefer to use some other markup language, ikiwiki allows others to
+be added easily via [[plugins]]. For example, it also supports traditional
+[[plugins/WikiText]] formatted pages, pages written as pure
+[[plugins/HTML]], or pages written in [[reStructuredText|plugins/rst]]
+or [[Textile|plugins/textile]].
+
+Ikiwiki also supports files of any other type, including plain text,
+images, etc. These are not converted to wiki pages, they are just copied
+unchanged by ikiwiki as it builds your wiki. So you can check in an image,
+program, or other special file and link to it from your wiki pages.
+
+## Blogging
+
+You can turn any page in the wiki into a [[blog]]. Pages matching a
+specified [[ikiwiki/PageSpec]] will be displayed as a weblog within the blog
+page. And RSS or Atom feeds can be generated to follow the blog.
+
+Ikiwiki's own [[TODO]], [[news]], and [[plugins]] pages are good examples
+of some of the flexible ways that this can be used. There is also an
+[[example_blog|examples/blog]] set up that you can copy into your own wiki.
+
+Ikiwiki can also [[plugins/aggregate]] external blogs, feeding them into
+the wiki. This can be used to create a Planet type site that aggregates
+interesting feeds.
+
+You can also mix blogging with podcasting by dropping audio files where
+they will be picked up like blog posts. This will work for any files that
+you would care to syndicate.
+
+## Valid html and [[css]]
+
+Ikiwiki aims to produce
+[valid XHTML 1.0](http://validator.w3.org/check?url=referer).
+(Experimental [[tips/HTML5]] support is also available.)
+
+Ikiwiki generates html using [[templates]], and uses [[css]], so you
+can change the look and layout of all pages in any way you would like.
+
+Ikiwiki ships with several ready-to-use [[themes]].
+
+## [[Plugins]]
+
+Plugins can be used to add additional features to ikiwiki. The interface is
+quite flexible, allowing plugins to implement additional markup languages,
+register [[directives|ikiwiki/directive]], provide a [[RCS]] backend, hook
+into [[CGI]] mode, and much more. Most of ikiwiki's features are actually
+provided by plugins.
+
+The standard language for ikiwiki plugins is perl, but ikiwiki also supports
+[[plugins/write/external]] plugins: Standalone programs that can be written in
+any language and communicate with ikiwiki using XML RPC.
+
+## [[todo/utf8]]
+
+After rather a lot of fiddling, we think that ikiwiki correctly and fully
+supports utf8 everywhere.
+
+## Other features
+
+The above are the core design goals and features of ikiwiki, but on that
+foundation a lot of other important features are added. Here is an
+incomplete list of some of them.
+
+### [[Tags]]
+
+You can tag pages and use these tags in various ways. Tags will show
+up in the ways you'd expect, like at the bottom of pages, in blogs, and
+in RSS and Atom feeds.
+
+### [[SubPages|ikiwiki/SubPage]]
+
+Arbitrarily deep hierarchies of pages, with fairly simple and useful
+[[ikiwiki/SubPage/LinkingRules]].
+
+### [[BackLinks]]
+
+Automatically included on pages. Rather faster than e.g. MoinMoin, and
+always there to help with navigation.
+
+### Smart merging and conflict resolution in your web browser
+
+Since it uses a real RCS, ikiwiki takes advantage of its smart merging to
+avoid any conflicts when two people edit different parts of the same page
+at the same time. No annoying warnings about other editors, locking,
+etc.; instead, the other person's changes will be automatically merged with
+yours when you commit.
+
+In the rare cases where automatic merging fails due to the same part of a
+page being concurrently edited, regular commit conflict markers are
+shown in the file to resolve the conflict, so if you're already familiar
+with that there's no new commit marker syntax to learn.
+
+### [[RecentChanges]], editing pages in a web browser
+
+Nearly the definition of a wiki, although perhaps ikiwiki challenges how
+much of that web gunk a wiki really needs. These features are optional
+and can be enabled by enabling [[CGI]] and a [[Revision_Control_System|rcs]].
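+
+As a rough sketch, enabling them involves settings along these lines in the
+setup file (the paths and URLs are placeholders; a YAML-format setup file is
+assumed):
+
+    url: http://example.com/wiki/
+    cgiurl: http://example.com/ikiwiki.cgi
+    cgi_wrapper: /var/www/ikiwiki.cgi
+    rcs: git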
+
+### User registration
+
+Can optionally be configured to allow only registered users to edit
+pages.
+
+User registration can be done using a web form, or ikiwiki can be
+configured to accept users authenticated with OpenID, or HTTP basic
+authentication, or other methods implemented via plugins.
+
+### Discussion pages
+
+Thanks to subpages, every page can easily and automatically have a
+/Discussion subpage. By default, these links are included in the
+[[templates]] for each page. If you prefer blog-style
+[[plugins/comments]], that is available too.
+
+### Edit controls
+
+Wiki admins can lock pages so that only other admins can edit them. Or a
+wiki can be set up to allow anyone to edit Discussion pages, but only
+registered users to edit other pages. These are just two possibilities,
+since page edit controls can be changed via plugins.
+
+### [[PageHistory]]
+
+Well, sorta. Rather than implementing yet another history browser, ikiwiki
+can link to [[ViewVC]] or the like to browse the history of a wiki page.
+
+### Full text search
+
+Ikiwiki can use the xapian search engine to add powerful
+full text [[plugins/search]] capabilities to your wiki.
+
+### Translation via po files
+
+The [[plugins/po]] plugin allows translating individual wiki pages using
+standard `po` files.
+
+### [[w3mmode]]
+
+Can be set up so that w3m can be used to browse a wiki and edit pages
+without using a web server.
diff --git a/doc/forum.mdwn b/doc/forum.mdwn
new file mode 100644
index 000000000..62b62a401
--- /dev/null
+++ b/doc/forum.mdwn
@@ -0,0 +1,11 @@
+This is a place for questions and discussions that don't fit well on any
+particular Discussion page. Users of ikiwiki can ask questions here.
+
+Note that for more formal bug reports or todo items, you can also edit the
+[[bugs]] and [[todo]] pages.
+
+
+## Current topics ##
+
+[[!inline pages="forum/* and !forum/discussion and !forum/*/*"
+archive=yes rootpage="forum" postformtext="Add a new thread titled:" show=0]]
diff --git a/doc/forum/404_-_not_found.mdwn b/doc/forum/404_-_not_found.mdwn
new file mode 100644
index 000000000..dc3318901
--- /dev/null
+++ b/doc/forum/404_-_not_found.mdwn
@@ -0,0 +1,22 @@
+Hi,
+
+I've followed the tutorial to install ikiwiki. Once installed (on a Ubuntu
+10.04 distro running under VirtualBox on a Windows XP, SP3 host), I can
+access the **http://ubuntu1004/index.lighttpd.html** page without any
+issues.
+
+But when I try to access the page **http://ubuntu1004/~geertvc/gwiki** (as
+is mentioned at the end of the ikiwiki setup), I get the error message
+"**404 - not found**".
+
+I've also followed the "dot-cgi" trick, but with the same negative result.
+The web server I've installed is lighttpd.
+
+What did I miss?
+
+Best rgds,
+
+--Geert
+
+> Perhaps your webserver is not exporting your `public_html` directory
+> in `~geertvc`? Check its configuration. --[[Joey]]
diff --git a/doc/forum/404_-_not_found/comment_1_3dea2600474f77fb986767da4d507d62._comment b/doc/forum/404_-_not_found/comment_1_3dea2600474f77fb986767da4d507d62._comment
new file mode 100644
index 000000000..453419cf3
--- /dev/null
+++ b/doc/forum/404_-_not_found/comment_1_3dea2600474f77fb986767da4d507d62._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://jmtd.livejournal.com/"
+ ip="188.222.50.68"
+ subject="comment 1"
+ date="2010-09-09T21:41:07Z"
+ content="""
+You probably need to run \"lighttpd-enable-mod userdir\"
+"""]]
diff --git a/doc/forum/404_-_not_found/comment_2_948e4678be6f82d9b541132405351a2c._comment b/doc/forum/404_-_not_found/comment_2_948e4678be6f82d9b541132405351a2c._comment
new file mode 100644
index 000000000..c3fb72db5
--- /dev/null
+++ b/doc/forum/404_-_not_found/comment_2_948e4678be6f82d9b541132405351a2c._comment
@@ -0,0 +1,31 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawllEHb4oGNaUrl7vyziQGrxAlQFri_BfaE"
+ nickname="Geert"
+ subject="comment 2"
+ date="2010-09-12T06:45:27Z"
+ content="""
+After a re-installation of ikiwiki (first removed all old files), I get the following feedback:
+
+ Successfully set up gwiki:
+ url: http://ubuntu1004/~geertvc/gwiki
+ srcdir: ~/gwiki
+ destdir: ~/public_html/gwiki
+ repository: ~/gwiki.git
+ To modify settings, edit ~/gwiki.setup and then run:
+ ikiwiki -setup ~/gwiki.setup
+
+
+In the lighttpd config file (/etc/lighttpd/lighttpd.conf), I've now changed the item \"server.document-root\" from the default \"/var/www\" to (in my case) \"/home/geertvc/public_html/gwiki\". I've taken the destdir location (see above) as document root for lighttpd.
+
+When doing this, I can see the \"index.html\" page of ikiwiki (by typing the following URL in the address box of the browser: \"ubuntu1004/index.html\"). So, that seems to be the right modification, right? Or isn't it?
+
+Note: when I take the directory \"/home/geertvc/gwiki\" (= the URL given above), then things do not work. I can't see the content of \"index.html\", I get the error message I mentioned in my initial post (404 - not found).
+
+But when clicking, for instance, the \"Edit\" button, the link brings me to the following location:
+
+ http://ubuntu1004/~geertvc/gwiki/ikiwiki.cgi?page=index&do=edit
+
+However, there is no file called \"ikiwiki.cgi\" at that location at all. The actual location of \"ikiwiki.cgi\" is \"/home/geertvc/public_html/gwiki\", so why is the \"Edit\" link leading me to that (wrong?) location?
+
+Apparently, something is still wrong with my settings. Hope, with the above information, someone can put me on the right track...
+"""]]
diff --git a/doc/forum/404_-_not_found/comment_3_4c7b1fa88776815bbc6aa286606214c2._comment b/doc/forum/404_-_not_found/comment_3_4c7b1fa88776815bbc6aa286606214c2._comment
new file mode 100644
index 000000000..9f606f04e
--- /dev/null
+++ b/doc/forum/404_-_not_found/comment_3_4c7b1fa88776815bbc6aa286606214c2._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://jmtd.livejournal.com/"
+ ip="78.105.191.131"
+ subject="Follow instructions"
+ date="2010-09-12T12:26:49Z"
+ content="""
+Please re-read my comment. If you enable the userdir module, then /~user corresponds to ~/public_html. The change you have made has / corresponding instead, which is why the links don't work.
+"""]]
diff --git a/doc/forum/Accessing_meta_values_in_pages__63__.mdwn b/doc/forum/Accessing_meta_values_in_pages__63__.mdwn
new file mode 100644
index 000000000..78594f912
--- /dev/null
+++ b/doc/forum/Accessing_meta_values_in_pages__63__.mdwn
@@ -0,0 +1,8 @@
+If I set a meta value on a page (let's say \[[!meta author="Adam Shand"]]), is there some way to retrieve the value of author and put it somewhere visible on the page? E.g. can I write:
+
+author: $author
+
+I know I can update the raw templates but it'd be nice to be able to do this in the pages themselves.
+
+Cheers,
+Adam.
diff --git a/doc/forum/Adding_new_markup_to_markdown.mdwn b/doc/forum/Adding_new_markup_to_markdown.mdwn
new file mode 100644
index 000000000..39d233add
--- /dev/null
+++ b/doc/forum/Adding_new_markup_to_markdown.mdwn
@@ -0,0 +1,11 @@
+I'm using ikiwiki to manage my personal wiki. One of the things I'm toying with is storing my grocery list in a wiki. The way I typically grocery-shop is to make one huge master list containing all the items I typically buy in a single cycle. Then, on any given trip, I make a subset list containing only the items I need. I'd like to streamline this process by making the master list a series of checkboxes. Before each trip, I load the list page on my phone, check off all the items I already have, then check off individual items as I get them.
+
+I'm not sure if there's a convenient way of adding checkboxes to wiki pages, and after a bit of thought I decided that "( )" would be a good markup for this. Ideally I'd like to still have access to other markdown conventions so I could, say, organize the list with headings and such when it grows large, so I don't want to create an entirely separate format, or a separate copy of the markdown plugin.
+
+Is there an existing means of, say, adding supersets to wiki markup? I suppose I could use an inline directive that inserts a multiselect HTML element, but I really like ( ). :)
+
+Ideal would be some sort of filter infrastructure. Plugins could register with a larger filter plugin that adds an inline directive. I could then invoke the checkbox filter at the top of my grocery list, and all instances of ( ) would be replaced with HTML. Might also make sense for the individual filters to specify whether or not they're invoked before or after the page template, or perhaps just always invoke them after. *shrug*
+
+Does something like this exist? I'd really like to avoid messing around with raw HTML or an inline for each of 40-50 list items. :)
+
+-- [[Nolan]]
diff --git a/doc/forum/Allow_only_specific_OpenIDs_to_login.mdwn b/doc/forum/Allow_only_specific_OpenIDs_to_login.mdwn
new file mode 100644
index 000000000..27eb69647
--- /dev/null
+++ b/doc/forum/Allow_only_specific_OpenIDs_to_login.mdwn
@@ -0,0 +1,7 @@
+How do I allow only specific OpenIDs to log in to ikiwiki? I found a way to only allow edits from specific OpenIDs, but I would like to restrict the logins and not the edits. Currently I've disabled the passwordauth plugin, locked all pages, and set the allowed OpenIDs as adminuser:
+
+ adminuser => [qw{MY_OPENIDS}],
+ disable_plugins => [qw{passwordauth}],
+ locked_pages => '*',
+
+The problem with this solution is that every OpenID that logs in is saved in ikiwiki's `userdb` file, which I'd like to avoid. Pointers to the documentation or a sample setup are very welcome. Thanks!
diff --git a/doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__.mdwn b/doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__.mdwn
new file mode 100644
index 000000000..c0b896515
--- /dev/null
+++ b/doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__.mdwn
@@ -0,0 +1,3 @@
+Is anyone successfully mirroring feeds from ikiwiki to identi.ca (or another status.net instance)? How did you set up your feed?
+
+When I try to, identi.ca presents me with an error about no "author ID URI" being found in the feed. Indeed the ikiwiki-generated atom feed only has got a global "author" - I presume identi.ca requires author information in each entry. Is it possible to set up ikiwiki's feed that way?
diff --git a/doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__/comment_1_8a5acbb6234104b607c8c4cf16124ae4._comment b/doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__/comment_1_8a5acbb6234104b607c8c4cf16124ae4._comment
new file mode 100644
index 000000000..1d710d153
--- /dev/null
+++ b/doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__/comment_1_8a5acbb6234104b607c8c4cf16124ae4._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="Franek"
+ ip="188.99.178.40"
+ subject="[[!meta author=&quot;..."
+ date="2012-05-19T14:51:42Z"
+ content="""
+Adding [[!meta author=\"me\"]] to the entries and/or the feedpage does not help.
+"""]]
diff --git a/doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__/comment_2_155e5823860a91989647ede8b5c9224a._comment b/doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__/comment_2_155e5823860a91989647ede8b5c9224a._comment
new file mode 100644
index 000000000..6c709b3f0
--- /dev/null
+++ b/doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__/comment_2_155e5823860a91989647ede8b5c9224a._comment
@@ -0,0 +1,16 @@
+[[!comment format=mdwn
+ username="Franek"
+ ip="188.99.178.40"
+ subject="Further enquiries"
+ date="2012-05-20T10:46:07Z"
+ content="""
+I did some more experiments setting not only \"[[!meta author=...\", but also \"authorurl\" globally and per-entry in various combinations, with no success. As far as I could see, \"authorurl\" had no effect on the atom feed whatsoever.
+
+It seems that identi.ca wants a feed to have an <author> field with a <uri> subfield, as described here: [[http://www.atomenabled.org/developers/syndication/#person]] . Is there a way to achieve this with ikiwiki inline-feeds?
+
+I also found two old and unresolved status.net bugreports on the matter:
+
+[[http://status.net/open-source/issues/2840]]
+
+[[http://status.net/open-source/issues/2839]]
+"""]]
diff --git a/doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__/comment_3_317f1202a3da1bfc845d4becbac4bba8._comment b/doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__/comment_3_317f1202a3da1bfc845d4becbac4bba8._comment
new file mode 100644
index 000000000..6bda93433
--- /dev/null
+++ b/doc/forum/Anyone_mirroring_ikiwiki_inline_feed_to_identi.ca__63__/comment_3_317f1202a3da1bfc845d4becbac4bba8._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="Franek"
+ ip="92.74.26.119"
+ subject="kind of solved, but another problem comes up"
+ date="2012-05-26T19:31:19Z"
+ content="""
+The templates atompage.tmpl and/or atomitem.tmpl appear to be what would have to be altered to satisfy identi.ca. I did that on my system, just hard-coding a <uri> element into <author> for testing. In one respect, it worked: identi.ca does not complain about the missing author uri any more. In another, it did not: a different error comes up now, \"Internal server error\" and something like \"could not add feed\".
+
+I do not know where to go from this very unspecific error message. I guess I am going to try something like twitterfeed.com, for now.
+"""]]
diff --git a/doc/forum/Apache_XBitHack.mdwn b/doc/forum/Apache_XBitHack.mdwn
new file mode 100644
index 000000000..d5da0825e
--- /dev/null
+++ b/doc/forum/Apache_XBitHack.mdwn
@@ -0,0 +1,28 @@
+I'd like to be able to use the Apache XBitHack to enable Server Side Includes on my site. Yes, it is possible to enable SSI by setting the page extension to .shtml, and that is what I am doing at the moment.
+However, the disadvantage of this approach is that the server does not send a Last-Modified header, which means that the content can't be cached. The way I am using SSI, though, the main content of the page really is "last modified" when the page was last modified, so I'd like to be able to indicate that. Using the XBitHack - that is, setting the executable bit on the generated page - would enable me to do that.
+
+I gather from the [[security]] page that having the executable bit set on files is considered a security hole, but how big a hole would it be if I'm the only one editing the site? Is there a way, a somewhat safe way, of implementing XBitHack for IkiWiki?
+
+-- [[KathrynAndersen]]
+
+> The risk with execute bits on files in the generated site is that someone
+> commits an executable, ikiwiki copies it as-is, and now the web browser
+> can be used to run it. Obviously if you're the only committer, that is
+> not much of a risk. Or you can lock down apache to not allow running
+> arbitrary files. It's also pretty unlikely that a rendered mdwn file
+> would result in a html page that can be run as an executable. So an
+> option that makes all files rendered from mdwn or other markups
+> get the x bit set would be pretty safe even with untrusted editors. --[[Joey]]
+
+>> So how about this: if something has a page-type (i.e. mdwn or whatever authorized page types there are)
+>> then add something at the end of the process (would that be the "changes" hook?)
+>> which sets the x bit on the generated page file. Would that work?
+
+>> Or is there a way to say "tell me all the generated files that end in .html" and use that as a list to start from?
+
+>> --[[KathrynAndersen]]
+
+>>> Yes, the `change` hook is passed the names of source files that got
+>>> built. Use `pagetype` to check which got htmlized (and filter out ones
+>>> that got copied), and then use `htmlpage` to get the name of the html
+>>> file that was generated, and chmod it. --[[Joey]]
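+
+A plugin along the lines sketched above might look roughly like this (an
+untested sketch, assuming the standard plugin API and hook names):
+
+    package IkiWiki::Plugin::xbit;
+    # Untested sketch: set the execute bit on html files rendered
+    # from source pages, skipping files that were merely copied.
+    use warnings;
+    use strict;
+    use IkiWiki 3.00;
+
+    sub import {
+        hook(type => "change", id => "xbit", call => \&change);
+    }
+
+    sub change {
+        my @files=@_;
+        foreach my $file (@files) {
+            # pagetype is only defined for files that got htmlized
+            next unless defined pagetype($file);
+            my $rendered=$config{destdir}."/".htmlpage(pagename($file));
+            chmod(0755, $rendered) if -e $rendered;
+        }
+    }
+
+    1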
diff --git a/doc/forum/Are_these_revisions_legit_or_accidentally_committed_jiberish__63__.mdwn b/doc/forum/Are_these_revisions_legit_or_accidentally_committed_jiberish__63__.mdwn
new file mode 100644
index 000000000..ec9980c30
--- /dev/null
+++ b/doc/forum/Are_these_revisions_legit_or_accidentally_committed_jiberish__63__.mdwn
@@ -0,0 +1,12 @@
+When rebuilding, I see
+
+ building recentchanges/change_63476fba3a519d42ee56c7611250ada41111356d._change, which depends on templates/page.tmpl
+ building recentchanges/change_bd07c4308d5eea2cd27ad067a502318dc0e9c8bb._change, which depends on templates/page.tmpl
+ building recentchanges/change_6c2a66b022276951d79f29c1221d22fe1592f67f._change, which depends on templates/page.tmpl
+ building recentchanges/change_f08669f128d618d0da460234b2cee555b0818584._change, which depends on templates/page.tmpl
+ building recentchanges/change_b0347df66da5c515dc0d1d612ecdfbe203a0a674._change, which depends on templates/page.tmpl
+ building recentchanges/change_0bb246c481e9ede8686f6caa4de40b9e94642e40._change, which depends on templates/page.tmpl
+ building recentchanges/change_511846ca75fb2e87fb90582ead282d104a5e13fc._change, which depends on templates/page.tmpl
+ ...
+
+Are these files accidentally committed gibberish? If so, how do I remove them properly?
diff --git a/doc/forum/Are_these_revisions_legit_or_accidentally_committed_jiberish__63__/comment_1_b425823f800fba82ad2aaaa0dbe6686a._comment b/doc/forum/Are_these_revisions_legit_or_accidentally_committed_jiberish__63__/comment_1_b425823f800fba82ad2aaaa0dbe6686a._comment
new file mode 100644
index 000000000..9326f73d5
--- /dev/null
+++ b/doc/forum/Are_these_revisions_legit_or_accidentally_committed_jiberish__63__/comment_1_b425823f800fba82ad2aaaa0dbe6686a._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-12-30T18:13:18Z"
+ content="""
+They are little per-change pages that are inlined together to produce your RecentChanges page.
+
+(They're not committed to git, only stored internally.)
+"""]]
diff --git a/doc/forum/Asciidoc_plugin.mdwn b/doc/forum/Asciidoc_plugin.mdwn
new file mode 100644
index 000000000..57d6fd91e
--- /dev/null
+++ b/doc/forum/Asciidoc_plugin.mdwn
@@ -0,0 +1,14 @@
+I have completely overhauled the Asciidoc plugin for ikiwiki that was created by [[Karl Mowson|http://www.mowson.org/karl/colophon/]]. The source can be downloaded from my [[Dropbox|http://dl.dropbox.com/u/11256359/asciidoc.pm]].
+
+### Features
+
+* Uses a filter hook to escape WikiLinks and Directives using Asciidoc `+++` passthrough macros, to avoid them being processed by Asciidoc. This behavior is configurable in the wiki setup file.
+* Adds a preprocessor directive 'asciidoc' which allows extra Asciidoc command-line options to be passed on a per-page basis. Each parameter name is the option name (the leading `--` will be inserted automatically), and the parameter value is the option value. Currently, only 'conf-file' and 'doctype' are allowed (or even useful).
+* Sets the page title from the first line in the Asciidoc file using a meta directive. This behavior is configurable in the wiki setup file.
+* Searches for an Asciidoc configuration file named the same as the wiki if none is specified in the setup file.
+* Asciidoc configuration files are stored in the wiki. They should be named `._conf` to avoid publishing them.
+
+### Problems
+
+* Escaping Directives is not optimal. It prevents markup from being used in Directives, and the passthrough macros have to include extra spaces to avoid having directives that return an empty string collapse to `++++++`. In addition, I had to borrow the regexps from the Ikiwiki source code; it would be nice if this were available as part of the API.
+* Handling of Asciidoc errors is suboptimal; they are simply inserted into the returned page. This could be fixed in Perl 5.12 by using run_forked() from IPC::Cmd.
diff --git a/doc/forum/Attachment_and_sub-directory.mdwn b/doc/forum/Attachment_and_sub-directory.mdwn
new file mode 100644
index 000000000..91d7aee27
--- /dev/null
+++ b/doc/forum/Attachment_and_sub-directory.mdwn
@@ -0,0 +1,5 @@
+Hi.
+
+If I create a page and attach a file to the page, ikiwiki creates a sub-directory with the page name and places the attachment in the sub-directory regardless of the usedirs setting. Is there any setting that avoids creating the sub-directory and places the attachment in the same directory as the page, so that I can edit and properly *preview* at a local machine using third-party markdown editors?
+
+Thanks in advance.
diff --git a/doc/forum/Background_picture_and_css.mdwn b/doc/forum/Background_picture_and_css.mdwn
new file mode 100644
index 000000000..827100984
--- /dev/null
+++ b/doc/forum/Background_picture_and_css.mdwn
@@ -0,0 +1,8 @@
+Is it possible to put two different background pictures into the right and left sides of the following ikiwiki css?
+
+[lessish css theme](https://raw.github.com/spiffin/ikiwiki_lessish/master/lessish.css)
+
+Is it also possible to have a background like this: [http://ysharifi.wordpress.com/](http://ysharifi.wordpress.com/)
+or this [tex.stackexchange.com](http://tex.stackexchange.com)
+
+I am not a css expert so, it would be nice if you could provide some details.
diff --git a/doc/forum/Blog_posting_times_and_ikiwiki_state.mdwn b/doc/forum/Blog_posting_times_and_ikiwiki_state.mdwn
new file mode 100644
index 000000000..0c1da5b97
--- /dev/null
+++ b/doc/forum/Blog_posting_times_and_ikiwiki_state.mdwn
@@ -0,0 +1,28 @@
+What I wanted
+-------------
+
+I thought to myself it would be nice to see from the console the dates that my ikiwiki blog posts were published. Especially as I would like to know the order of my todo list without having to view the webpage.
+
+What I discovered
+-----------------
+
+I looked at the code and saw the functions for grabbing the ctime from git, but couldn't reconcile them with the "Posted" date in the RSS feed. Some more reading and I figured out that the Posted time is taken from the UNIX ctime when a page is first uploaded into the repository, and this information is stored in the page state via a Perl Storable database, indexdb. (I'm sure most know this, but to be clear: in UNIX, ctime is *not* the actual creation time of a file. UNIX has no facility for recording the actual creation time; however, on first upload to the wiki it's good enough.)
+
+Wrote a Perl script to query and sort indexdb. Now I can list my todos or blog posts in the order they appear on the web. Handy.
+
+However the ikiwiki state is specifically excluded via '.gitignore'. I work a lot on trains and not having this file in my cloned wiki means I can't list published posts or my todos in the proper order. I can get an approximation from git logs but, damn it, I want it the same!
+
+What can I do?
+--------------
+
+Is it a spectacularly bad idea to include the ikiwiki state file in my cloned repo? (I suspect it is.) What else could be done? Can I disable pagestate somehow, or force ikiwiki to always use git commit times for Posted times?
+
+> Have you tried running ikiwiki with the `--gettime` option on your laptop,
+> to force it to retrieve initial commit times from git? You should only
+> need to do that once, and it'll be cached in the pagestate thereafter.
+>
+> Because that functionality is slow on every supported VCS except git,
+> ikiwiki tries to avoid using it unless it's really needed. [[rcs]]
+> lists it as "fast" for git, though, so depending how fast it really is
+> and how large your wiki is, you might be able to get away with running
+> ikiwiki with `--gettime` all the time? --[[smcv]]
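+
+For example, such a one-time rebuild might look like this (the setup file
+name is a placeholder):
+
+    ikiwiki --setup ~/wiki.setup --gettime --rebuild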
diff --git a/doc/forum/Blog_posting_times_and_ikiwiki_state/comment_1_87304dfa2caea7e526cdb04917524e8c._comment b/doc/forum/Blog_posting_times_and_ikiwiki_state/comment_1_87304dfa2caea7e526cdb04917524e8c._comment
new file mode 100644
index 000000000..62bae02b0
--- /dev/null
+++ b/doc/forum/Blog_posting_times_and_ikiwiki_state/comment_1_87304dfa2caea7e526cdb04917524e8c._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawmwYptyV5ptNt8LCbMYsmpcNkk9_DRt-EY"
+ nickname="Matt"
+ subject="comment 1"
+ date="2010-11-04T11:52:53Z"
+ content="""
+Perhaps I have a different setup from you but on my laptop I don't have ikiwiki installed - only a clone of the git repo. You mean to run --gettime on the post-update git hook?
+"""]]
diff --git a/doc/forum/Broken_after_upgrading_Ikiwiki.mdwn b/doc/forum/Broken_after_upgrading_Ikiwiki.mdwn
new file mode 100644
index 000000000..aea5fdbd9
--- /dev/null
+++ b/doc/forum/Broken_after_upgrading_Ikiwiki.mdwn
@@ -0,0 +1,10 @@
+I upgraded my Debian server from 5 to 6, then I updated ikiwiki. Now I can't rebuild my Ikiwiki instance:
+
+ $ ikiwiki --setup ./pages.setup
+ Failed to load plugin IkiWiki::Plugin::scrubber: Can't locate IkiWiki/Plugin/scrubber.pm in @INC (@INC contains: /home/xyzfoobar/devel/ikiwiki/lib /etc/perl /usr/local/lib/perl/5.10.1 /usr/local/share/perl/5.10.1 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.10 /usr/share/perl/5.10 /usr/local/lib/site_perl .) at (eval 77) line 2.
+ BEGIN failed--compilation aborted at (eval 77) line 2.
+
+ $ ikiwiki --version
+ ikiwiki version 3.20100815.7
+
+What's the proper way to fix this?
diff --git a/doc/forum/Broken_after_upgrading_Ikiwiki/comment_1_3d0588a845c58b3aedc35970e8dcc811._comment b/doc/forum/Broken_after_upgrading_Ikiwiki/comment_1_3d0588a845c58b3aedc35970e8dcc811._comment
new file mode 100644
index 000000000..93360d167
--- /dev/null
+++ b/doc/forum/Broken_after_upgrading_Ikiwiki/comment_1_3d0588a845c58b3aedc35970e8dcc811._comment
@@ -0,0 +1,14 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnZ0g2UAijV7RGrKtWPljCCAYHBJ3pwPvM"
+ nickname="Meng"
+ subject="comment 1"
+ date="2011-12-20T07:21:19Z"
+ content="""
+If I try to create new setup file
+
+ $ ikiwiki --dumpsetup newpages.setup
+ Traceback (most recent call last):
+ File \"/usr/lib/ikiwiki/plugins/rst\", line 18, in <module>
+ from docutils.core import publish_parts;
+ ImportError: No module named docutils.core
+"""]]
diff --git a/doc/forum/Broken_after_upgrading_Ikiwiki/comment_2_fd65d4b87a735b67543bb0cf4053b652._comment b/doc/forum/Broken_after_upgrading_Ikiwiki/comment_2_fd65d4b87a735b67543bb0cf4053b652._comment
new file mode 100644
index 000000000..47fa4ae7b
--- /dev/null
+++ b/doc/forum/Broken_after_upgrading_Ikiwiki/comment_2_fd65d4b87a735b67543bb0cf4053b652._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 2"
+ date="2011-12-20T15:16:23Z"
+ content="""
+Your setup file seems to refer to a \"scrubber\" plugin, which has never existed in ikiwiki. Perhaps a typo of \"htmlscrubber\"?
+
+3.20100815.7 is an old version of ikiwiki. The current version avoids the rst docutils breakage. Installing python-docutils on Debian can work around that problem as well.
+"""]]
diff --git a/doc/forum/Broken_after_upgrading_Ikiwiki/comment_3_7c8b46eabdb25cbc01c56c7b53ed3b91._comment b/doc/forum/Broken_after_upgrading_Ikiwiki/comment_3_7c8b46eabdb25cbc01c56c7b53ed3b91._comment
new file mode 100644
index 000000000..72d38e76c
--- /dev/null
+++ b/doc/forum/Broken_after_upgrading_Ikiwiki/comment_3_7c8b46eabdb25cbc01c56c7b53ed3b91._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnZ0g2UAijV7RGrKtWPljCCAYHBJ3pwPvM"
+ nickname="Meng"
+ subject="comment 3"
+ date="2011-12-20T16:58:23Z"
+ content="""
+Thanks. That was exactly the issue. When editing my setup file, I accidentally split `htmlscrubber` into `html` and `scrubber`.
+"""]]
diff --git a/doc/forum/CGI_script_and_HTTPS.mdwn b/doc/forum/CGI_script_and_HTTPS.mdwn
new file mode 100644
index 000000000..2f255002d
--- /dev/null
+++ b/doc/forum/CGI_script_and_HTTPS.mdwn
@@ -0,0 +1,29 @@
+Dear ikiwiki folks,
+
+using Debian Wheezy and ikiwiki 3.20120629, for some reason, when accessing the site using HTTP (and not HTTPS) and going to Edit (so executing the CGI script), all URLs are prepended with HTTPS, which I do not want.
+
+ <base href="https://www.example.org/" />
+
+Trying to look at the source, I guess it is originating from `IkiWiki/CGI.pm`.
+
+ sub printheader ($) {
+ my $session=shift;
+
+ if (($ENV{HTTPS} && lc $ENV{HTTPS} ne "off") || $config{sslcookie}) {
+ print $session->header(-charset => 'utf-8',
+ -cookie => $session->cookie(-httponly => 1, -secure => 1));
+ }
+ else {
+ print $session->header(-charset => 'utf-8',
+ -cookie => $session->cookie(-httponly => 1));
+ }
+ }
+
+Does it check if HTTPS is enabled in the environment? During `ikiwiki --setup example.setup`, or when the CGI script is run as the site is accessed (for example in an Apache environment)?
+
+Can this somehow be disabled in ikiwiki? Reading the code, I guess I could somehow set `HTTPS = off` somewhere in the `VirtualHost` section of the Apache configuration.
+
+
+Thanks,
+
+--[[PaulePanter]]
diff --git a/doc/forum/CGI_script_and_HTTPS/comment_1_3f8ef438ca7de11635d4e40080e7baa9._comment b/doc/forum/CGI_script_and_HTTPS/comment_1_3f8ef438ca7de11635d4e40080e7baa9._comment
new file mode 100644
index 000000000..03f1032e9
--- /dev/null
+++ b/doc/forum/CGI_script_and_HTTPS/comment_1_3f8ef438ca7de11635d4e40080e7baa9._comment
@@ -0,0 +1,43 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2012-11-05T11:27:02Z"
+ content="""
+IkiWiki generates self-referential URLs using the `url` and `cgiurl`
+configuration parameters, and the `urlto()` and `cgiurl()` functions;
+the code you quoted isn't involved (it's choosing whether to set
+HTTPS-only cookies or not, rather than choosing how to generate
+self-referential URLs).
+
+If you want your wiki to be accessible via both HTTP and HTTPS, and use
+whichever the user first requested, you should set both `url` and
+`cgiurl` to the same URI scheme and hostname with no port specified,
+either both `http` or both `https`, for instance:
+
+ url: http://www.example.com/
+ cgiurl: http://www.example.com/ikiwiki.cgi
+
+or
+
+ url: https://example.org/wiki/
+ cgiurl: https://example.org/cgi-bin/ikiwiki
+
+(or the Perl-syntax equivalents if you're not using a YAML
+setup file).
+
+If you use one of those, IkiWiki will attempt to generate
+path-only links, like \"/wiki/\" and \"/cgi-bin/ikiwiki?...\",
+whenever it's valid to do so. A visitor using HTTP will stay
+on HTTP and a visitor using HTTPS will stay on HTTPS.
+
+The choice of `http` or `https` for the `url` and `cgiurl`
+still matters when a URL *must* be absolute, such as in an
+RSS feed.
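As a rough illustration of the behaviour described above (a Python sketch, not ikiwiki's actual Perl code; the real `urlto()`/`cgiurl()` logic differs in detail, and `self_referential_link` is a made-up name): when `url` and `cgiurl` share a scheme and hostname, links can be emitted path-only, so a visitor stays on whichever of HTTP or HTTPS they arrived with.

```python
from urllib.parse import urlsplit

def self_referential_link(url, cgiurl, absolute=False):
    """Return a path-only link when url/cgiurl share scheme and host,
    falling back to the absolute form (e.g. for RSS feeds)."""
    u, c = urlsplit(url), urlsplit(cgiurl)
    if absolute or (u.scheme, u.netloc) != (c.scheme, c.netloc):
        return cgiurl        # must stay absolute
    return c.path or "/"     # scheme- and host-agnostic

# A visitor using HTTP stays on HTTP, HTTPS stays on HTTPS:
print(self_referential_link("http://example.org/wiki/",
                            "http://example.org/cgi-bin/ikiwiki"))
```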
+
+I improved this code in late 2010 for this todo item:
+[[todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both]].
+It's possible that it has regressed (that's happened
+a couple of times). If it has, please quote your exact
+`url` and `cgiurl` configuration.
+"""]]
diff --git a/doc/forum/Calendar:_listing_multiple_entries_per_day.mdwn b/doc/forum/Calendar:_listing_multiple_entries_per_day.mdwn
new file mode 100644
index 000000000..c3ecf36be
--- /dev/null
+++ b/doc/forum/Calendar:_listing_multiple_entries_per_day.mdwn
@@ -0,0 +1,21 @@
+Hi,
+
+I'd very much like to be able to list my blog posts on a daily basis (I use them for logging work), rather than have the Calendar plugin link only the first post of each day.
+
+There was some effort to do this, as detailed here:
+
+[[todo/Set_arbitrary_date_to_be_used_by_calendar_plugin]]
+
+I had a quick go at doing something similar on Debian Stable (Ikiwiki 3.0) but alas my Ikiwiki fu is not strong enough.
+
+I'm not sure how to go about swapping the link on the day number for a link to, I guess, a dynamically generated page of links. Something like
+
+    $linkcache{"$year/$mtag/$mday"}{$p} = ${p}
+
+and a suitable while loop looks to be all that's needed...
+
+Any pointers appreciated.
+
+A [[!taglink patch]] has been proposed in [comment](#comment-d6f94e2b779d1c038b6359aad79ed14b)
+
+> This has been applied. --[[Joey]]
diff --git a/doc/forum/Calendar:_listing_multiple_entries_per_day/comment_1_d3dd0b97c63d615e3dee22ceacaa5a30._comment b/doc/forum/Calendar:_listing_multiple_entries_per_day/comment_1_d3dd0b97c63d615e3dee22ceacaa5a30._comment
new file mode 100644
index 000000000..ca287581a
--- /dev/null
+++ b/doc/forum/Calendar:_listing_multiple_entries_per_day/comment_1_d3dd0b97c63d615e3dee22ceacaa5a30._comment
@@ -0,0 +1,83 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnXybLxkPMYpP3yw4b_I6IdC3cKTD-xEdU"
+ nickname="Matt"
+ subject="comment 1"
+ date="2011-11-29T00:52:49Z"
+ content="""
+So I ported the original patch mentioned above to the latest Debian version. Well, at least the bits that matter to me. See below.
+
+In doing so I realized that it does not quite work how I imagined it would; instead of generating a dynamic page for a particular day with a list of links, it actually fills in the calendar with the details of the posts (making for some ugly formatting).
+
+I'm hoping the tag generation pages will give me a clue how to alter this into what I want.
+
+<pre>
+diff --git a/IkiWiki/Plugin/calendar.pm b/IkiWiki/Plugin/calendar.pm
+index c7d2b7c..c931fe6 100644
+--- a/IkiWiki/Plugin/calendar.pm
++++ b/IkiWiki/Plugin/calendar.pm
+@@ -75,6 +75,8 @@ sub format_month (@) {
+ my %params=@_;
+
+ my %linkcache;
++ my @list;
++ my $detail = 1;
+ foreach my $p (pagespec_match_list($params{page},
+ \"creation_year($params{year}) and creation_month($params{month}) and ($params{pages})\",
+ # add presence dependencies to update
+@@ -88,7 +90,7 @@ sub format_month (@) {
+ my $mtag = sprintf(\"%02d\", $month);
+
+ # Only one posting per day is being linked to.
+- $linkcache{\"$year/$mtag/$mday\"} = $p;
++ $linkcache{\"$year/$mtag/$mday\"}{$p} = $IkiWiki::pagesources{$p};
+ }
+
+ my $pmonth = $params{month} - 1;
+@@ -219,14 +221,38 @@ EOF
+ $tag='month-calendar-day-this-day';
+ }
+ else {
+- $tag='month-calendar-day-link';
++ if ( $detail == 0 ) {
++ $tag='month-calendar-day-link';
++ }
++ else{
++ $tag='month-calendar-day';
++ }
+ }
+ $calendar.=qq{\t\t<td class=\"$tag $downame{$wday}\">};
+- $calendar.=htmllink($params{page}, $params{destpage},
+- $linkcache{$key},
+- noimageinline => 1,
+- linktext => $day,
+- title => pagetitle(IkiWiki::basename($linkcache{$key})));
++ if ( $detail == 0 ) {
++ $calendar.=htmllink($params{page}, $params{destpage},
++ $linkcache{$key},
++ noimageinline => 1,
++ linktext => $day,
++ title => pagetitle(IkiWiki::basename($linkcache{$key})));
++ }
++ else {
++ my $day_label = qq{<span class=\"month-calendar-day-label\">$day</span>};
++ $calendar.=qq{$day_label\n};
++ my $srcpage; my $destpage;
++ while(($srcpage,$destpage) = each(%{$linkcache{$key}})) {
++ my $title = IkiWiki::basename(pagename($srcpage));
++ if (exists $pagestate{$srcpage}{meta}{title} ) {
++ $title = $pagestate{$srcpage}{meta}{title};
++ }
++ $calendar.=qq{\t\t<div class=\"$tag $downame{$wday}\">};
++ $calendar.=htmllink($params{page}, $params{destpage},
++ pagename($destpage),
++ linktext => $title);
++ push @list, pagename($linkcache{$key}{$srcpage});
++ $calendar.=qq{\t\t</div>};
++ }
++ }
+ $calendar.=qq{</td>\n};
+ }
+ else {
+
+</pre>
+"""]]
diff --git a/doc/forum/Calendar:_listing_multiple_entries_per_day/comment_2_2311b96483bb91dc25d5e3695bbca513._comment b/doc/forum/Calendar:_listing_multiple_entries_per_day/comment_2_2311b96483bb91dc25d5e3695bbca513._comment
new file mode 100644
index 000000000..ef100b555
--- /dev/null
+++ b/doc/forum/Calendar:_listing_multiple_entries_per_day/comment_2_2311b96483bb91dc25d5e3695bbca513._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnXybLxkPMYpP3yw4b_I6IdC3cKTD-xEdU"
+ nickname="Matt"
+ subject="comment 2"
+ date="2011-11-29T01:30:09Z"
+ content="""
+To revert the changes made above, it is necessary not only to switch the detail level back down but also to regenerate the sidebar `index.html` file. Running
+
+ ikiwiki --setup blog.setup
+
+should do it.
+"""]]
diff --git a/doc/forum/Calendar:_listing_multiple_entries_per_day/comment_3_d23f0cedd0b9e937eaf200eef55ac457._comment b/doc/forum/Calendar:_listing_multiple_entries_per_day/comment_3_d23f0cedd0b9e937eaf200eef55ac457._comment
new file mode 100644
index 000000000..2433967e5
--- /dev/null
+++ b/doc/forum/Calendar:_listing_multiple_entries_per_day/comment_3_d23f0cedd0b9e937eaf200eef55ac457._comment
@@ -0,0 +1,166 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnXybLxkPMYpP3yw4b_I6IdC3cKTD-xEdU"
+ nickname="Matt"
+ subject="comment 3"
+ date="2011-11-30T20:42:55Z"
+ content="""
+I got to grips with things and made a patch to do this. :-)
+
+Here it is, in case it is of use to anyone.
+
+<pre>
+From f0554c5b61e1915086d5cf071f095ff233c2590d Mon Sep 17 00:00:00 2001
+From: Matt Ford <matt@dancingfrog.co.uk>
+Date: Wed, 30 Nov 2011 19:40:10 +0000
+Subject: [PATCH] Patch to allow daily archival generation and link to them in
+ the calendar
+
+---
+ IkiWiki/Plugin/calendar.pm | 17 ++++++++++++++++-
+ ikiwiki-calendar.in | 34 +++++++++++++++++++++++++++++++---
+ templates/calendarday.tmpl | 5 +++++
+ templates/calendarmonth.tmpl | 2 +-
+ templates/calendaryear.tmpl | 2 +-
+ 5 files changed, 54 insertions(+), 6 deletions(-)
+ create mode 100644 templates/calendarday.tmpl
+
+diff --git a/IkiWiki/Plugin/calendar.pm b/IkiWiki/Plugin/calendar.pm
+index c7d2b7c..9999c03 100644
+--- a/IkiWiki/Plugin/calendar.pm
++++ b/IkiWiki/Plugin/calendar.pm
+@@ -47,6 +47,13 @@ sub getsetup () {
+ safe => 1,
+ rebuild => 1,
+ },
++ archiveday => {
++ type => \"boolean\",
++ example => 1,
++ description => \"enable archiving on a daily basis (otherwise monthly)\",
++ safe => 1,
++ rebuild => 1,
++ },
+ archive_pagespec => {
+ type => \"pagespec\",
+ example => \"page(posts/*) and !*/Discussion\",
+@@ -222,11 +229,19 @@ EOF
+ $tag='month-calendar-day-link';
+ }
+ $calendar.=qq{\t\t<td class=\"$tag $downame{$wday}\">};
+- $calendar.=htmllink($params{page}, $params{destpage},
++ if (exists $pagesources{\"$archivebase/$params{year}/$params{month}/\".sprintf(\"%02d\",$day)}) {
++ $calendar.=htmllink($params{page}, $params{destpage},
++ \"$archivebase/$params{year}/$params{month}/\".sprintf(\"%02d\",$day),
++ noimageinline => 1,
++ linktext => \"$day\",
++ title => \"$key\");
++ }else{
++ $calendar.=htmllink($params{page}, $params{destpage},
+ $linkcache{$key},
+ noimageinline => 1,
+ linktext => $day,
+ title => pagetitle(IkiWiki::basename($linkcache{$key})));
++ }
+ $calendar.=qq{</td>\n};
+ }
+ else {
+diff --git a/ikiwiki-calendar.in b/ikiwiki-calendar.in
+index 037ef7d..af22bc5 100755
+--- a/ikiwiki-calendar.in
++++ b/ikiwiki-calendar.in
+@@ -30,21 +30,44 @@ IkiWiki::checkconfig();
+ my $archivebase = 'archives';
+ $archivebase = $config{archivebase} if defined $config{archivebase};
+
++my $archiveday = 0;
++$archiveday = $config{archiveday} if defined $config{archiveday};
++
+ if (! defined $pagespec) {
+ $pagespec=$config{archive_pagespec} || \"*\";
+ }
+
+-sub writearchive ($$;$) {
++sub is_leap_year {
++ my $year=shift;
++ return ($year % 4 == 0 && (($year % 100 != 0) || $year % 400 == 0));
++}
++
++sub month_days {
++ my $month=shift;
++ my $year=shift;
++ my $days_in_month = (31,28,31,30,31,30,31,31,30,31,30,31)[$month-1];
++ if ($month == 2 && is_leap_year($year)) {
++ $days_in_month++;
++ }
++ return $days_in_month;
++}
++
++sub writearchive ($$;$;$) {
+ my $template=template(shift);
+ my $year=shift;
+ my $month=shift;
++ my $day=shift;
+
+- my $page=defined $month ? \"$year/$month\" : $year;
++ my $page;
++ if (defined $year) {$page = \"$year\"};
++ if (defined $month) {$page = \"$year/$month\"};
++ if (defined $day) {$page = \"$year/$month/$day\"};
+
+ my $pagefile=newpagefile(\"$archivebase/$page\", $config{default_pageext});
+ $template->param(pagespec => $pagespec);
+ $template->param(year => $year);
+ $template->param(month => $month) if defined $month;
++ $template->param(day => $day) if defined $day;
+
+ if ($force || ! -e \"$config{srcdir}/$pagefile\") {
+ writefile($pagefile, $config{srcdir}, $template->output);
+@@ -54,8 +77,13 @@ sub writearchive ($$;$) {
+
+ foreach my $y ($startyear..$endyear) {
+ writearchive(\"calendaryear.tmpl\", $y);
+- foreach my $m (qw{01 02 03 04 05 06 07 08 09 10 11 12}) {
++ foreach my $m (map {sprintf(\"%02d\",$_)} (1..12)) {
+ writearchive(\"calendarmonth.tmpl\", $y, $m);
++ if ($archiveday ) {
++ foreach my $d (map {sprintf(\"%02d\",$_)} (1..month_days($m,$y))) {
++ writearchive(\"calendarday.tmpl\", $y, $m, $d);
++ }
++ }
+ }
+ }
+
+diff --git a/templates/calendarday.tmpl b/templates/calendarday.tmpl
+new file mode 100644
+index 0000000..ac963b9
+--- /dev/null
++++ b/templates/calendarday.tmpl
+@@ -0,0 +1,5 @@
++[[!sidebar content=\"\"\"
++[[!calendar type=month month=<TMPL_VAR MONTH> year=<TMPL_VAR YEAR> pages=\"<TMPL_VAR PAGESPEC>\"]]
++\"\"\"]]
++
++[[!inline pages=\"creation_day(<TMPL_VAR DAY>) and creation_month(<TMPL_VAR MONTH>) and creation_year(<TMPL_VAR YEAR>) and <TMPL_VAR PAGESPEC>\" archive=\"yes\" show=0 feeds=no reverse=yes]]
+diff --git a/templates/calendarmonth.tmpl b/templates/calendarmonth.tmpl
+index 23cd954..c998c16 100644
+--- a/templates/calendarmonth.tmpl
++++ b/templates/calendarmonth.tmpl
+@@ -2,4 +2,4 @@
+ [[!calendar type=month month=<TMPL_VAR MONTH> year=<TMPL_VAR YEAR> pages=\"<TMPL_VAR PAGESPEC>\"]]
+ \"\"\"]]
+
+-[[!inline pages=\"creation_month(<TMPL_VAR MONTH>) and creation_year(<TMPL_VAR YEAR>) and <TMPL_VAR PAGESPEC>\" show=0 feeds=no reverse=yes]]
++[[!inline pages=\"creation_month(<TMPL_VAR MONTH>) and creation_year(<TMPL_VAR YEAR>) and <TMPL_VAR PAGESPEC>\" archive=\"yes\" show=0 feeds=no reverse=yes]]
+diff --git a/templates/calendaryear.tmpl b/templates/calendaryear.tmpl
+index 714bd6d..b6e33c5 100644
+--- a/templates/calendaryear.tmpl
++++ b/templates/calendaryear.tmpl
+@@ -1 +1 @@
+-[[!calendar type=year year=<TMPL_VAR YEAR> pages=\"<TMPL_VAR PAGESPEC>\"]]
++[[!calendar type=year year=<TMPL_VAR YEAR> pages=\"<TMPL_VAR PAGESPEC>\" archive=\"yes\"]]
+--
+1.7.7.3
+
+
+</pre>
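For reference, the day-count helpers added by the patch boil down to standard Gregorian leap-year arithmetic; the same logic as a Python sketch (illustrative only; ikiwiki and the patch itself are Perl):

```python
def is_leap_year(year):
    # Gregorian rule: divisible by 4, except centuries not divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def month_days(month, year):
    # Days per month, with February corrected in leap years.
    days = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31][month - 1]
    return days + 1 if month == 2 and is_leap_year(year) else days

print(month_days(2, 2012))  # 29
```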
+"""]]
diff --git a/doc/forum/Calendar:_listing_multiple_entries_per_day/comment_4_4be39c2043821848d4b25d0bf946a718._comment b/doc/forum/Calendar:_listing_multiple_entries_per_day/comment_4_4be39c2043821848d4b25d0bf946a718._comment
new file mode 100644
index 000000000..a71276d6b
--- /dev/null
+++ b/doc/forum/Calendar:_listing_multiple_entries_per_day/comment_4_4be39c2043821848d4b25d0bf946a718._comment
@@ -0,0 +1,15 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 4"
+ date="2012-02-21T17:23:00Z"
+ content="""
+To be clear, this patch creates a `yyyy/mm/dd` file for each day, listing the posts for that day, so the calendar can link to it rather than a random single post.
+
+While certainly a valid solution, that's a lot of added pages, and a high overhead for such a minor UI point as this.
+
+Surely something interesting could be done with javascript or some other form of UI to make clicking on a day in a calendar that has multiple posts present a list of them? That would have essentially no overhead, since the calendar plugin already has a list of the posts made on a given day.
+
+Ikiwiki already does something similar to deal with the case where a page has a great many backlinks: it makes a UI element that, when hovered over, pops up a display of all the rest. This is done quite simply in `page.tmpl`
+using the popup and balloon CSS classes. Calendar could also use this.
+"""]]
diff --git a/doc/forum/Calendar:_listing_multiple_entries_per_day/comment_5_de545ebb6376066674ef2aaae4757b9c._comment b/doc/forum/Calendar:_listing_multiple_entries_per_day/comment_5_de545ebb6376066674ef2aaae4757b9c._comment
new file mode 100644
index 000000000..fef852066
--- /dev/null
+++ b/doc/forum/Calendar:_listing_multiple_entries_per_day/comment_5_de545ebb6376066674ef2aaae4757b9c._comment
@@ -0,0 +1,97 @@
+[[!comment format=mdwn
+ username="spalax"
+ subject="Popup listing multiple entries per day"
+ date="2012-06-08T00:56:06Z"
+ content="""
+[[!tag patch]]
+
+Hello,
+here is a patch that:
+
+- if there is a single entry in one day, does not change anything (compared to the previous version of the calendar plugin);
+- if there are several entries, when mouse passes over the day, displays a popup listing all the entries of that day.
+
+That's all. No new pages for each day, takes as little space as it took before, and only a few lines more in the source.
+
+The only thing I am not totally happy with is the CSS. We have to say that the text is aligned on the left (otherwise it is aligned on the right, as is each day of the calendar), but I do not know the most sensible place to put that line of CSS.
+
+Regards,
+-- Louis
+
+
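The core change in the patch below is turning `%linkcache` from one page per day into a list of pages per day; in Python terms the grouping amounts to this (an illustrative sketch only; `build_linkcache` is a made-up name, and the plugin itself is Perl):

```python
from collections import defaultdict

def build_linkcache(posts):
    """Group post names by their yyyy/mm/dd creation date."""
    linkcache = defaultdict(list)
    for page, (year, month, day) in posts:
        linkcache[f"{year}/{month:02d}/{day:02d}"].append(page)
    return linkcache

# Two posts on the same day end up in one list, ready for a popup:
cache = build_linkcache([("posts/a", (2012, 6, 8)),
                         ("posts/b", (2012, 6, 8))])
print(cache["2012/06/08"])  # ['posts/a', 'posts/b']
```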
+ diff --git a/IkiWiki/Plugin/calendar.pm b/IkiWiki/Plugin/calendar.pm
+ index d443198..2c9ed79 100644
+ --- a/IkiWiki/Plugin/calendar.pm
+ +++ b/IkiWiki/Plugin/calendar.pm
+ @@ -86,8 +86,11 @@ sub format_month (@) {
+ my $year = $date[5] + 1900;
+ my $mtag = sprintf(\"%02d\", $month);
+
+ - # Only one posting per day is being linked to.
+ - $linkcache{\"$year/$mtag/$mday\"} = $p;
+ + # Several postings per day
+ + if (! $linkcache{\"$year/$mtag/$mday\"}) {
+ + $linkcache{\"$year/$mtag/$mday\"} = [];
+ + }
+ + push(@{$linkcache{\"$year/$mtag/$mday\"}}, $p);
+ }
+
+ my $pmonth = $params{month} - 1;
+ @@ -221,11 +224,36 @@ EOF
+ $tag='month-calendar-day-link';
+ }
+ $calendar.=qq{\t\t<td class=\"$tag $downame{$wday}\">};
+ - $calendar.=htmllink($params{page}, $params{destpage},
+ - $linkcache{$key},
+ - noimageinline => 1,
+ - linktext => $day,
+ - title => pagetitle(IkiWiki::basename($linkcache{$key})));
+ + if ( scalar(@{$linkcache{$key}}) == 1) {
+ + # Only one posting on this page
+ + my $page = $linkcache{$key}[0];
+ + $calendar.=htmllink($params{page}, $params{destpage},
+ + $page,
+ + noimageinline => 1,
+ + linktext => $day,
+ + title => pagetitle(IkiWiki::basename($page)));
+ + } else {
+ + $calendar.=qq{<div class='popup'>$day<div class='balloon'>};
+ + # Several postings on this page
+ + $calendar.=qq{<ul>};
+ + foreach my $page (@{$linkcache{$key}}) {
+ + $calendar.= qq{\n\t\t\t<li>};
+ + my $title;
+ + if (exists $pagestate{$page}{meta}{title}) {
+ + $title = \"$pagestate{$page}{meta}{title}\";
+ + } else {
+ + $title = pagetitle(IkiWiki::basename($page));
+ + }
+ + $calendar.=htmllink($params{page}, $params{destpage},
+ + $page,
+ + noimageinline => 1,
+ + linktext => $title,
+ + title => $title);
+ + $calendar.= '</li>';
+ + }
+ + $calendar.=qq{\n\t\t</ul>};
+ + $calendar.=qq{</div></div>};
+ + }
+ $calendar.=qq{</td>\n};
+ }
+ else {
+ diff --git a/doc/style.css b/doc/style.css
+ old mode 100644
+ new mode 100755
+ index 6e2afce..4149229
+ --- a/doc/style.css
+ +++ b/doc/style.css
+ @@ -316,6 +316,7 @@ div.progress-done {
+ .popup .paren,
+ .popup .expand {
+ display: none;
+ + text-align: left;
+ }
+ .popup:hover .balloon,
+ .popup:focus .balloon {
+
+"""]]
diff --git a/doc/forum/Can_I_change_the_default_menu_items__63__.mdwn b/doc/forum/Can_I_change_the_default_menu_items__63__.mdwn
new file mode 100644
index 000000000..58134ab6d
--- /dev/null
+++ b/doc/forum/Can_I_change_the_default_menu_items__63__.mdwn
@@ -0,0 +1,6 @@
+I'm looking for a way to change the RecentChanges, Preferences, Branchable, and Comment menu items from my wiki page. I can see that the values for these items are set in template variables. Is there a way I can change these variables? If so, can you tell me how?
+
+Thanks,
+
+Maria
+
diff --git a/doc/forum/Can_I_change_the_default_menu_items__63__/comment_2_eb56fed3b5fc19c8dd49af4444a049c5._comment b/doc/forum/Can_I_change_the_default_menu_items__63__/comment_2_eb56fed3b5fc19c8dd49af4444a049c5._comment
new file mode 100644
index 000000000..22b1c0558
--- /dev/null
+++ b/doc/forum/Can_I_change_the_default_menu_items__63__/comment_2_eb56fed3b5fc19c8dd49af4444a049c5._comment
@@ -0,0 +1,31 @@
+[[!comment format=mdwn
+ username="http://jmtd.livejournal.com/"
+ ip="188.222.50.68"
+ subject="comment 2"
+ date="2011-10-30T21:23:03Z"
+ content="""
+You need to define a `templatedir` and put a copy of your current version of ikiwiki's `page.tmpl` file into that directory. Then edit around line 62, e.g.
+
+ <li><a href=\"<TMPL_VAR EDITURL>\" rel=\"nofollow\">Edit</a></li>
+ </TMPL_IF>
+ <TMPL_IF RECENTCHANGESURL>
+ -<li><a href=\"<TMPL_VAR RECENTCHANGESURL>\">RecentChanges</a></li>
+ +<li><a href=\"<TMPL_VAR RECENTCHANGESURL>\">Recent Changes</a></li>
+ </TMPL_IF>
+ <TMPL_IF HISTORYURL>
+ -<li><a href=\"<TMPL_VAR HISTORYURL>\">History</a></li>
+ +<li><a href=\"<TMPL_VAR HISTORYURL>\">Site history</a></li>
+ </TMPL_IF>
+ <TMPL_IF GETSOURCEURL>
+ -<li><a href=\"<TMPL_VAR GETSOURCEURL>\">Source</a></li>
+ +<li><a href=\"<TMPL_VAR GETSOURCEURL>\">View Source</a></li>
+ </TMPL_IF>
+ <TMPL_IF PREFSURL>
+ -<li><a href=\"<TMPL_VAR PREFSURL>\">Preferences</a></li>
+ +<li><a href=\"<TMPL_VAR PREFSURL>\">Your Preferences</a></li>
+ </TMPL_IF>
+ <TMPL_IF ACTIONS>
+ <TMPL_LOOP ACTIONS>
+
+— [[Jon]]
+"""]]
diff --git a/doc/forum/Can_I_have_different_favicons_for_each_folder__63__.mdwn b/doc/forum/Can_I_have_different_favicons_for_each_folder__63__.mdwn
new file mode 100644
index 000000000..0fefb3560
--- /dev/null
+++ b/doc/forum/Can_I_have_different_favicons_for_each_folder__63__.mdwn
@@ -0,0 +1 @@
+I would like to have different favicons for different parts (folders) of my Ikiwiki site, like you can have different CSS files via the localstyle plugin. Is this possible? If not, could it be made possible?
diff --git a/doc/forum/Can_I_have_different_favicons_for_each_folder__63__/comment_1_a01112ba235e2f44a7655c36ef680e7e._comment b/doc/forum/Can_I_have_different_favicons_for_each_folder__63__/comment_1_a01112ba235e2f44a7655c36ef680e7e._comment
new file mode 100644
index 000000000..dee6e610e
--- /dev/null
+++ b/doc/forum/Can_I_have_different_favicons_for_each_folder__63__/comment_1_a01112ba235e2f44a7655c36ef680e7e._comment
@@ -0,0 +1,19 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2011-11-21T11:37:02Z"
+ content="""
+You could use [[plugins/pagetemplate]] to override all of `page.tmpl`, but
+that's using a sledgehammer to crack a nut.
+
+Another way to do it would be to modify `IkiWiki/Plugins/favicon.pm`
+to use `bestlink` to find the favicon, like [[plugins/localstyle]] does.
+If you made it a config option (`localfavicon => 1` or something)
+it could probably be included in ikiwiki as part of the official
+[[plugins/favicon]] plugin?
+
+Another way would be to have a new `localfavicon` plugin which overrides
+the template variable from [[plugins/favicon]], using `last => 1` to
+make sure it \"wins\".
+"""]]
diff --git a/doc/forum/Can_I_have_different_favicons_for_each_folder__63__/comment_2_b8ccd3c29249eca73766f567bce12569._comment b/doc/forum/Can_I_have_different_favicons_for_each_folder__63__/comment_2_b8ccd3c29249eca73766f567bce12569._comment
new file mode 100644
index 000000000..0c8ca3bce
--- /dev/null
+++ b/doc/forum/Can_I_have_different_favicons_for_each_folder__63__/comment_2_b8ccd3c29249eca73766f567bce12569._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="Franek"
+ ip="178.7.43.64"
+ subject="comment 2"
+ date="2012-06-25T09:58:03Z"
+ content="""
+I did as you suggested (finally) and created a simple modification of the [[plugins/favicon]] plugin: [[plugins/contrib/localfavicon]]. It checks for the \"localfavicon\" option, and if it is set, it uses bestlink() to determine which favicon to use for each page; if not, it behaves just like the original favicon plugin.
+"""]]
diff --git a/doc/forum/Can_Ikiwiki_recognize_multimarkdown_meta_tags__63__.mdwn b/doc/forum/Can_Ikiwiki_recognize_multimarkdown_meta_tags__63__.mdwn
new file mode 100644
index 000000000..3e5fc8bb5
--- /dev/null
+++ b/doc/forum/Can_Ikiwiki_recognize_multimarkdown_meta_tags__63__.mdwn
@@ -0,0 +1,4 @@
+[[!meta author=tjgolubi]]
+
+It would be nice, if configured for multimarkdown, for ikiwiki to recognize and use/remove meta information from multimarkdown documents.
+Title, Author, and Date would be especially nice. -- [[tjgolubi]]
diff --git a/doc/forum/Can_OpenID_users_be_adminusers__63__.mdwn b/doc/forum/Can_OpenID_users_be_adminusers__63__.mdwn
new file mode 100644
index 000000000..17c60c423
--- /dev/null
+++ b/doc/forum/Can_OpenID_users_be_adminusers__63__.mdwn
@@ -0,0 +1,69 @@
+I've just finished an upgrade to 3.141 and am trying to give myself admin rights to play with the new webadmin features. My login is via OpenID, but from reading the wiki I believe that OpenID users can be granted admin rights. However, I'm obviously doing something wrong, as when I click on the "Preferences" link at the top of the page I don't see any admin features.
+
+My login is: http://adam.shand.net/
+
+In .ikiwiki/userdb I see:
+
+> adam@shand.net
+> email <br>
+> password <br>
+> locked_pages <br>
+> banned <br>
+> 1229722296 <br>
+> regdate <br>
+> http://adam.shand.net/ <br>
+
+And in my config file I have:
+
+> adminuser => [qw{http://adam.shand.net/}],
+
+Any pointers to what I'm doing wrong would be much appreciated.
+
+Thanks,
+Adam.
+
+> This is certainly supposed to work. For example, the admin
+> user on my ikiwikis is `http://joey.kitenet.net/`
+>
+> The only caveat I know of to make it work is that the
+> adminuser openid url has to exactly match the openid url that
+> ikiwiki sees when you log in. Including any trailing slash,
+> and the `http://`. --[[Joey]]
+
+>> Hrm, it's not working. I'm sure I've made a silly mistake somewhere but
+>> I've looked and looked and just can't find it. Any suggestions on where
+>> to look for debugging information would be much appreciated. -- [[Adam]]
+
+>>> Well, you could use this patch to add debugging info about admin
+>>> username comparisons:
+
+<pre>
+diff --git a/IkiWiki/UserInfo.pm b/IkiWiki/UserInfo.pm
+index 0bf100a..77b467a 100644
+--- a/IkiWiki/UserInfo.pm
++++ b/IkiWiki/UserInfo.pm
+@@ -71,6 +71,8 @@ sub userinfo_setall ($$) {
+ sub is_admin ($) {
+ my $user_name=shift;
+
++ print STDERR "is_admin test @{$config{adminuser}} vs $user_name: ".(grep { $_ eq $user_name } @{$config{adminuser}})."\n";
++
+ return grep { $_ eq $user_name } @{$config{adminuser}};
+ }
+
+</pre>
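The comparison being debugged above is an exact string match, so for example a missing trailing slash makes it fail; a minimal Python model of the same check (illustrative only; the real `is_admin` is the Perl shown in the patch):

```python
def is_admin(user_name, adminusers):
    # Exact match, including the scheme and any trailing slash.
    return any(u == user_name for u in adminusers)

admins = ["http://adam.shand.net/"]
print(is_admin("http://adam.shand.net/", admins))  # True
print(is_admin("http://adam.shand.net", admins))   # False: trailing slash matters
```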
+
+>>>> After applying that change to what is probably
+>>>> `/usr/share/perl5/IkiWiki/UserInfo.pm` on your system,
+>>>> when you go to the preferences page it should log in your web server's
+>>>> error.log, something like this:
+
+ [Wed Jul 08 12:54:35 2009] [error] [client 127.0.1.1] is_admin test http://joey.kitenet.net/ vs http://joey.kitenet.net/: 1
+
+>>>> So you can see if the two usernames/openids match. If the end is "0",
+>>>> they don't match. If nothing is logged, you have not enabled the websetup plugin.
+>>>> If the end is "1", you should see the "Setup" button; if not, the
+>>>> problem is not in determining if you're an admin, but elsewhere.
+>>>> --[[Joey]]
+
+I was being incredibly stupid and missed that websetup is a **plugin** and thus needed to be enabled. Many thanks for your patient assistance; by helping me eliminate the unlikely, you eventually led me to the obvious. Cheers. -- [[Adam]]
diff --git a/doc/forum/Can__39__t_get_comments_plugin_working.mdwn b/doc/forum/Can__39__t_get_comments_plugin_working.mdwn
new file mode 100644
index 000000000..f189d9b64
--- /dev/null
+++ b/doc/forum/Can__39__t_get_comments_plugin_working.mdwn
@@ -0,0 +1,16 @@
+I feel like I must be missing something.
+
+My blog is based on Ikiwiki, and uses /yyyy/mm/dd/title/ for blog posts.
+Because I use the plugin that generates index pages for subdirectories, I
+have to use /????/??/??/* to identify posts and avoid missing the index
+pages for years, months and days.
+
+I've enabled the comments plugin, but no matter what I do, I can't seem to make the comment form appear on my posts. I've removed the entire site and have rebuilt. I've set the pagespec to /????/??/??/* and ./????/??/??/*, but neither seems to work. I don't see any output, or anything else to indicate that pages aren't working.
+
+Are there any other plugins that need to be enabled for this to work? I think I've locked things down such that anonymous users can't edit by enabling signinedit and setting a lock, but this may be blocking the ability to comment (though I don't recall seeing anything in the docs about needing additional plugins.)
+
+> Just use '????/??/??/*' , and it will work.
+> [[Pagespecs|ikiwiki/pagespec]] are automatically matched absolute to the
+> top of the site, and adding a leading "/" is not necessary and will
+> make the PageSpec not match. (And the relative PageSpec with "./" is
+> not right in this case either). --[[Joey]]
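The point about absolute matching can be checked with ordinary glob patterns (Python's `fnmatch` shown here only as an approximation of ikiwiki's PageSpec globbing): `????/??/??/*` matches date-shaped paths measured from the top of the site, and a leading slash breaks the match.

```python
from fnmatch import fnmatch

spec = "????/??/??/*"
print(fnmatch("2011/01/02/my-post", spec))   # True: matched from the site top
print(fnmatch("/2011/01/02/my-post", spec))  # False: leading slash breaks it
```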
diff --git a/doc/forum/Can__39__t_get_ikiwiki_working_again_after_reinstall.mdwn b/doc/forum/Can__39__t_get_ikiwiki_working_again_after_reinstall.mdwn
new file mode 100644
index 000000000..08187e6f2
--- /dev/null
+++ b/doc/forum/Can__39__t_get_ikiwiki_working_again_after_reinstall.mdwn
@@ -0,0 +1,16 @@
+My server got hacked via an Exim vulnerability, so I reimaged the system. After installing ikiwiki I can't get it to accept my old setup file, and I'm not sure what to do.
+
+I'm running Debian stable with security updates. Running setup I get:
+
+    Can't use an undefined value as an ARRAY reference at /usr/share/perl5/IkiWiki/Setup/Standard.pm line 33.
+
+That line in the source file has something to do with wrappers. Also, since the reinstall there is no `/etc/ikiwiki/auto.setup`.
+
+After futzing with it for over an hour I tried installing the Debian backports version, and got a different error:
+
+    Can't exec "git": No such file or directory at /usr/share/perl5/IkiWiki/Plugin/git.pm line 169.
+    Cannot exec 'git pull origin': No such file or directory
+    'git pull origin' failed: at /usr/share/perl5/IkiWiki/Plugin/git.pm line 195.
+    Can't exec "git": No such file or directory at /usr/share/perl5/IkiWiki/Plugin/git.pm line 169.
+    Cannot exec 'git log --max-count=100 --pretty=raw --raw --abbrev=40 --always -c -r HEAD -- .': No such file or directory
+    'git log --max-count=100 --pretty=raw --raw --abbrev=40 --always -c -r HEAD -- .' failed:
+
+Any ideas how I can get ikiwiki working again?
diff --git a/doc/forum/Can__39__t_get_ikiwiki_working_again_after_reinstall/comment_1_87a360155ff0502fe08274911cc6a53f._comment b/doc/forum/Can__39__t_get_ikiwiki_working_again_after_reinstall/comment_1_87a360155ff0502fe08274911cc6a53f._comment
new file mode 100644
index 000000000..fa974765f
--- /dev/null
+++ b/doc/forum/Can__39__t_get_ikiwiki_working_again_after_reinstall/comment_1_87a360155ff0502fe08274911cc6a53f._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawkpwzlIQkUFJvJ8dF2-Y-sQklGpVB1fTzk"
+ nickname="Daniel"
+ subject="Fixed."
+ date="2011-01-19T10:18:16Z"
+ content="""
+Oops, I forgot to install git. A more helpful error message would have been nice.
+"""]]
diff --git a/doc/forum/Can__39__t_login_using_Google__44___or_openID__44___but_can_use_Ikiwiki_login.mdwn b/doc/forum/Can__39__t_login_using_Google__44___or_openID__44___but_can_use_Ikiwiki_login.mdwn
new file mode 100644
index 000000000..11d8c23a8
--- /dev/null
+++ b/doc/forum/Can__39__t_login_using_Google__44___or_openID__44___but_can_use_Ikiwiki_login.mdwn
@@ -0,0 +1,5 @@
+I find that users can't log in to my Ikiwiki instance using Google or OpenID, but can log in with an Ikiwiki user name and password.
+
+Using the two former methods gets the error
+
+ Error: ... is locked and cannot be edited
diff --git a/doc/forum/Can__39__t_login_using_Google__44___or_openID__44___but_can_use_Ikiwiki_login/comment_1_79127e3c09a1d798146088dee5a67708._comment b/doc/forum/Can__39__t_login_using_Google__44___or_openID__44___but_can_use_Ikiwiki_login/comment_1_79127e3c09a1d798146088dee5a67708._comment
new file mode 100644
index 000000000..41ff2fc36
--- /dev/null
+++ b/doc/forum/Can__39__t_login_using_Google__44___or_openID__44___but_can_use_Ikiwiki_login/comment_1_79127e3c09a1d798146088dee5a67708._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-12-22T16:15:34Z"
+ content="""
+That error message means that the user has logged in, but the lockedit plugin has been configured, via `locked_pages` to not let some pages be edited by them.
+
+Inability to log in with OpenID is a separate problem. I can only guess, as you didn't say how it fails when you try to log in by OpenID, but perhaps you need to install the Net::OpenID::Consumer Perl module from CPAN.
+"""]]
diff --git a/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__.mdwn b/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__.mdwn
new file mode 100644
index 000000000..a07c31c00
--- /dev/null
+++ b/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__.mdwn
@@ -0,0 +1,8 @@
+Do custom [[themes]] have to live outside of the wiki (eg. `/usr/share/ikiwiki/themes/`) or is there a way for them to live inside of the wiki srcdir?
+
+I haven't been able to find a way so for now I'm just using a symlink, but that's a bit ugly.
+
+I ask because I do most of my ikiwiki work on my laptop and then push changes to my server. It's not a big deal but it's annoying to have to sync the themes separately and it seems like something which should be able to live inside the wiki like templates.
+
+Cheers,
+[[AdamShand]]
diff --git a/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__/comment_1_d1e79825dfb5213d2d1cba2ace1707b1._comment b/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__/comment_1_d1e79825dfb5213d2d1cba2ace1707b1._comment
new file mode 100644
index 000000000..027127b41
--- /dev/null
+++ b/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__/comment_1_d1e79825dfb5213d2d1cba2ace1707b1._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2011-01-29T18:17:40Z"
+ content="""
+The theme plugin is just a shortcut for adding an underlay with a style.css and maybe some images. If you want to base your design on a modified theme, copy the theme's style.css (or part of it) to the local.css in your wiki's repository; you can also copy in the images and disable the theme plugin entirely.
+"""]]
diff --git a/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__/comment_2_8177ede5a586b1a573a13fd26f8d3cc0._comment b/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__/comment_2_8177ede5a586b1a573a13fd26f8d3cc0._comment
new file mode 100644
index 000000000..2b312731e
--- /dev/null
+++ b/doc/forum/Can_custom_themes_live_somewhere_inside_srcdir__63__/comment_2_8177ede5a586b1a573a13fd26f8d3cc0._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://adam.shand.net/"
+ nickname="Adam"
+ subject="Doh."
+ date="2011-01-29T19:32:18Z"
+ content="""
+Ah that makes sense, thanks!
+"""]]
diff --git a/doc/forum/Can_ikiwiki_be_configured_as_multi_user_blog__63__.mdwn b/doc/forum/Can_ikiwiki_be_configured_as_multi_user_blog__63__.mdwn
new file mode 100644
index 000000000..118b534ed
--- /dev/null
+++ b/doc/forum/Can_ikiwiki_be_configured_as_multi_user_blog__63__.mdwn
@@ -0,0 +1,7 @@
+All the ikiwiki blogs I have seen are single-user blogs. Is it possible to give every user a blog, where they can create their own pages in their own directory, based on their user name?
+
+I feel that a wiki might give more options in the way users share and collaborate when compared to a blog engine (like WordPress in its multi-user form).
+
+Is this the best place to post a question like this? There doesn't seem to be much traffic in this forum.
+Thanks for your help,
+Richard
diff --git a/doc/forum/Can_not_advance_past_first_page_of_results_using_search_plugin.mdwn b/doc/forum/Can_not_advance_past_first_page_of_results_using_search_plugin.mdwn
new file mode 100644
index 000000000..1a9391e48
--- /dev/null
+++ b/doc/forum/Can_not_advance_past_first_page_of_results_using_search_plugin.mdwn
@@ -0,0 +1,26 @@
+I'm using the [[/plugins/search/]] plugin and it correctly displays the first page of results, but the "Next" button doesn't work.
+
+If I search for "linux", for example, I see "1-10 of exactly 65 matches" and this in my browser's address bar: https://example.com/ikiwiki.cgi?P=linux
+
+Then, I scroll down and click "Next" and I see. . .
+
+> Although this page is encrypted, the information you have entered is to be sent over an unencrypted connection and could easily be read by a third party.
+>
+> Are you sure you want to continue sending this information?
+
+. . . then I click "Continue" but I'm stuck on the first page of search results (it still says "1-10 of exactly 65 matches") and I have the following in my browser's address bar:
+
+https://example.com/ikiwiki.cgi?P=linux&DEFAULTOP=or&%253E=Next&DB=default&FMT=query&xP=Zlinux&xDB=default&xFILTERS=--O
+
+I noticed that if I change what's in the address bar to the following, I **can** advance to page 2 (it shows "11-20 of exactly 65 matches"). That is to say, I'm removing "25" from "%253E" as a workaround:
+
+https://example.com/ikiwiki.cgi?P=linux&DEFAULTOP=or&%3E=Next&DB=default&FMT=query&xP=Zlinux&xDB=default&xFILTERS=--O
+
+Based on this output, I might need to make a change to "searchquery.tmpl", which is under [[/templates]]. . .
+
+ [wikiuser@ikiwiki1 ~]$ grep -r DEFAULTOP /usr/share/ikiwiki
+ /usr/share/ikiwiki/templates/searchquery.tmpl:<SELECT NAME=DEFAULTOP>
+ [wikiuser@ikiwiki1 ~]$ rpm -qf /usr/share/ikiwiki/templates/searchquery.tmpl
+ ikiwiki-3.20120202-1.el6.noarch
+
+. . . but I would appreciate any guidance on what the fix might be.
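The broken "Next" button looks like a double-encoding problem: `%253E` is the percent-encoded form of `%3E`, which is itself the encoded `>`. A quick sanity check of that reading (Python here purely to illustrate the encoding; ikiwiki itself is Perl and the template fix is separate):

```python
from urllib.parse import unquote

# "%253E" is ">" percent-encoded twice: ">" -> "%3E" -> "%253E".
# Decoding once peels off one layer; decoding twice recovers ">".
once = unquote("%253E")
twice = unquote(once)
print(once, twice)
```

So the working URL (with `%3E`) carries the parameter once-encoded, while the broken one has been encoded a second time somewhere between the template and the browser.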
diff --git a/doc/forum/Can_one_tell_if_a_page_is_added_rather_than_changed__63__.mdwn b/doc/forum/Can_one_tell_if_a_page_is_added_rather_than_changed__63__.mdwn
new file mode 100644
index 000000000..4c06cbabb
--- /dev/null
+++ b/doc/forum/Can_one_tell_if_a_page_is_added_rather_than_changed__63__.mdwn
@@ -0,0 +1,5 @@
+In the plugin interface, there are hooks for "deleted" and "changed", and the "changed" hook includes files which are *either* changed or added. Is there any way of telling that a file has been added rather than changed? With some plugins (for example, [[plugins/sidebar]]), if a new page (of a certain sort) is added, the only way one can fix up the dependencies is to rebuild the whole site from scratch. This is Not Good. Now, one could do something in the "changed" hook, but since one can't tell if a file has been changed or added, if one did something for every changed file, one would be doing a lot of needless work (so one might as well rebuild the site when one *knows* that a new (relevant) page has been added).
+
+But I really would like to be able to do things just to the *new* files, so is there any way that one can distinguish the changed files from the added files?
+
+-- [[KathrynAndersen]]
diff --git a/doc/forum/Can_one_tell_if_a_page_is_added_rather_than_changed__63__/comment_1_1397feebfb0fb7cc57af2f8b74ce047e._comment b/doc/forum/Can_one_tell_if_a_page_is_added_rather_than_changed__63__/comment_1_1397feebfb0fb7cc57af2f8b74ce047e._comment
new file mode 100644
index 000000000..7ddbb40fd
--- /dev/null
+++ b/doc/forum/Can_one_tell_if_a_page_is_added_rather_than_changed__63__/comment_1_1397feebfb0fb7cc57af2f8b74ce047e._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-06-20T00:22:54Z"
+ content="""
+I think that presence dependencies mostly cover this case. But no, there is not currently a hook that gets information about which files changed vs were added. The information is available at the time the hooks are called so some new ones could be added.
+"""]]
diff --git a/doc/forum/Can_one_tell_if_a_page_is_added_rather_than_changed__63__/comment_2_ad36c945f59fe525428fc30246911ff5._comment b/doc/forum/Can_one_tell_if_a_page_is_added_rather_than_changed__63__/comment_2_ad36c945f59fe525428fc30246911ff5._comment
new file mode 100644
index 000000000..4cffde3fc
--- /dev/null
+++ b/doc/forum/Can_one_tell_if_a_page_is_added_rather_than_changed__63__/comment_2_ad36c945f59fe525428fc30246911ff5._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="202.173.183.92"
+ subject="comment 2"
+ date="2011-06-20T03:24:49Z"
+ content="""
+If new hooks could be added, that would be greatly appreciated. Perhaps two new hooks: `added` and `updated` (the \"updated\" hook would be for files which were already-existing files which were changed).
+
+--[[KathrynAndersen]]
+"""]]
diff --git a/doc/forum/Cannot_write_to_commitlock.mdwn b/doc/forum/Cannot_write_to_commitlock.mdwn
new file mode 100644
index 000000000..05490a799
--- /dev/null
+++ b/doc/forum/Cannot_write_to_commitlock.mdwn
@@ -0,0 +1,28 @@
+I am following the "laptop wiki with git" tip page. I have set up my local and remote wikis as suggested. However, when I try to push my local changes back to the server I get the following error:
+
+    Writing objects: 100% (4/4), 359 bytes, done.
+    Total 4 (delta 2), reused 0 (delta 0)
+    cannot write to /home/ian/ianbarton/.ikiwiki/commitlock: No such file or directory
+    To ian@wilkesley.org:~/ikiwiki/ianbarton.git
+       5cf9054..16a871d master -> master
+
+The relevant bit of my setup file is:
+
+    git_wrapper => '/home/ian/ianbarton.git/hooks/post-commit',
+
+Now, `~/ianbarton/.ikiwiki` exists and is owned and writable by me. I have tried touching `commitlock`, and also removing `lock` and `commitlock` before pushing. Any suggestions for further troubleshooting?
+
+Ian.
+
+> I'm guessing that this is some kind of permissions problem,
+> and that the error message is just being misleading.
+>
+> When you push the changes to the server, what user is
+> git logging into the server as? If that user is different
+> than `ian` (possibly due to using git-daemon?), the post-commit
+> wrapper needs to be setuid to `ian`. This ensures that ikiwiki
+> runs as you and can see and write to the files. --[[Joey]]
+
+The user is logging in as ian, the same user as on the laptop. I can push and pull git repos on the same server owned by the same user via ssh with no problem. I have deleted everything and re-started from scratch several times. However, for my use case I think it's simpler to keep the repo on my local computer and just rsync the web pages to the server.
+
+Ian.
diff --git a/doc/forum/Chinese_file_name_corruption.mdwn b/doc/forum/Chinese_file_name_corruption.mdwn
new file mode 100644
index 000000000..49d7d5b3f
--- /dev/null
+++ b/doc/forum/Chinese_file_name_corruption.mdwn
@@ -0,0 +1,5 @@
+I added a file with a Chinese file name, and committed and pushed it to my Ikiwiki instance on the web. It displays correctly in the browser, but the file name shows up like this:
+
+http://i53.tinypic.com/2cikb9y.png
+
+Is there a way to fix it?
diff --git a/doc/forum/Chinese_file_name_corruption/comment_1_765ac8b6f70083bb5aaaaac5beab461f._comment b/doc/forum/Chinese_file_name_corruption/comment_1_765ac8b6f70083bb5aaaaac5beab461f._comment
new file mode 100644
index 000000000..c3e17c205
--- /dev/null
+++ b/doc/forum/Chinese_file_name_corruption/comment_1_765ac8b6f70083bb5aaaaac5beab461f._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-08-05T21:03:11Z"
+ content="""
+It's hard to tell from this small screenshot what I'm supposed to be looking at, but the \"Parent directory\" and date suggests this is an apache dirindex listing, which has nothing to do with ikiwiki.
+
+Ikiwiki is (or attempts to be) UTF-8 clean on filenames and content, so as long as you use that encoding and have the requisite fonts, your Chinese filenames will display OK in ikiwiki. Apache probably doesn't try to get this right at all, or needs some more configuration.
+"""]]
diff --git a/doc/forum/Clarification_on_--cgi_option.mdwn b/doc/forum/Clarification_on_--cgi_option.mdwn
new file mode 100644
index 000000000..182b09b4b
--- /dev/null
+++ b/doc/forum/Clarification_on_--cgi_option.mdwn
@@ -0,0 +1,4 @@
+
+Can anyone explain the intended use of the `--cgi` option?
+
+I understand you may run the CGI wrapper from the command line using `--cgi`, but what is this option useful for? Perhaps for use with SSI pages? Any examples?
diff --git a/doc/forum/Clarification_on_--cgi_option/comment_1_deda457e4bff7dfe630dbc0192dfddea._comment b/doc/forum/Clarification_on_--cgi_option/comment_1_deda457e4bff7dfe630dbc0192dfddea._comment
new file mode 100644
index 000000000..9ead810f6
--- /dev/null
+++ b/doc/forum/Clarification_on_--cgi_option/comment_1_deda457e4bff7dfe630dbc0192dfddea._comment
@@ -0,0 +1,11 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2011-10-05T11:10:26Z"
+ content="""
+The `--cgi` option is for internal use (the compiled CGI wrapper
+basically cleans the environment, and runs `ikiwiki --cgi ...`
+to do the real work). It isn't useful from a command line unless
+you're doing something extremely strange.
+"""]]
diff --git a/doc/forum/Commiting_all_moderated_comments_into_special_branch__63__.mdwn b/doc/forum/Commiting_all_moderated_comments_into_special_branch__63__.mdwn
new file mode 100644
index 000000000..e45069c63
--- /dev/null
+++ b/doc/forum/Commiting_all_moderated_comments_into_special_branch__63__.mdwn
@@ -0,0 +1,8 @@
+I only interact with ikiwiki via the CLI and git; thus I would love to be able to moderate comments via git from all my remote checkouts, without being forced to ssh to my server and do it locally.
+
+Is anyone aware of a way to check all comments into a special branch, possibly called "moderation", with a normal suffix instead of "_comment_pending"? That would allow me to cherry-pick from that branch without having to remember to rename, and to simply delete all spam etc. from the branch.
+
+Every now and then, I could delete the whole branch, thus cleaning out crud. As the approved comments live in master, that would not be a problem.
+
+
+RichiH
diff --git a/doc/forum/Commiting_all_moderated_comments_into_special_branch__63__/comment_1_8403e8ff9c5c8dddb6d744632322f7bc._comment b/doc/forum/Commiting_all_moderated_comments_into_special_branch__63__/comment_1_8403e8ff9c5c8dddb6d744632322f7bc._comment
new file mode 100644
index 000000000..5aa5af039
--- /dev/null
+++ b/doc/forum/Commiting_all_moderated_comments_into_special_branch__63__/comment_1_8403e8ff9c5c8dddb6d744632322f7bc._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2012-03-22T16:56:39Z"
+ content="""
+A branch makes sense if it's used in a way that never ties its history with mainline, so all the spam can eventually be dropped out of git, by deleting the branch. Though this is clearly an expert user level option.
+
+Probably the easiest way to get there would be to keep the branch, checked out, in `.ikiwiki/comments_pending/`. The old code that used that directory is still in ikiwiki for backwards compatibility, so it should be easy to get comments written into that location. Then a minimum of VCS-specific code would be needed to set up the branch and commit pending comments to it.
+
+But there's a wrinkle -- if you just cherry-pick from that branch, the comments_pending directory will retain all the old spam, growing without bound. And the web moderation interface will show them all. I suppose you could check out the branch and revert or delete spammy comments too, but this is feeling like a lot of work the user needs to do in order to use git to moderate spammy comments. I have to think most users would prefer, as I do, occasionally flailing at a web form in this case.
+"""]]
diff --git a/doc/forum/Darcs_as_the_RCS___63__.mdwn b/doc/forum/Darcs_as_the_RCS___63__.mdwn
new file mode 100644
index 000000000..9664240ee
--- /dev/null
+++ b/doc/forum/Darcs_as_the_RCS___63__.mdwn
@@ -0,0 +1,13 @@
+Hi,
+
+I have successfully installed and set up my first instance of [[ikiwiki]] on my dedicated server. Following [[joey]]'s screencasts made this easy (thank you).
+
+Currently, I have set up the RCS to be git but I do not like this very much. I'd rather want darcs but if I replace rcs settings, it fails.
+
+What should I put in the configuration file to use darcs ?
+
+> Darcs is not yet supported. It's being [[worked_on|todo/darcs]].
+
+> > That's good news for me then ! Thank you.
+
+>>> Better news: It will be in version 2.10. --[[Joey]]
diff --git a/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm.mdwn b/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm.mdwn
new file mode 100644
index 000000000..b501a11c8
--- /dev/null
+++ b/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm.mdwn
@@ -0,0 +1,19 @@
+Hi People,
+
+first, thanks for this nice and usable piece of software.
+
+Recently I moved to a new server, reinstalled ikiwiki via aptitude, and now I'm getting this error:
+
+ ikiwiki -setup /etc/ikiwiki/auto.setup
+ /etc/ikiwiki/auto.setup: Can't locate IkiWiki/Setup/Automator.pm in @INC (@INC contains: /etc/perl /usr/local/lib/perl/5.10.0 /usr/local/share/perl/5.10.0 /usr/lib/perl5 /usr/share/perl5 /usr/lib/perl/5.10 /usr/share/perl/5.10 /usr/local/lib/site_perl .) at (eval 10) line 13.
+
+Or with an existing wiki:
+
+ ikiwiki -setup younameit.setup
+ younameit.setup: Can't use an undefined value as an ARRAY reference at /usr/share/perl5/IkiWiki/Setup/Standard.pm line 33.
+ BEGIN failed--compilation aborted at (eval 10) line 293.
+
+Can you help?
+
+Best wishes,
+Tobias.
diff --git a/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm/comment_1_aec4bf4ca7d04d580d2fa83fd3f7166f._comment b/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm/comment_1_aec4bf4ca7d04d580d2fa83fd3f7166f._comment
new file mode 100644
index 000000000..2c884e261
--- /dev/null
+++ b/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm/comment_1_aec4bf4ca7d04d580d2fa83fd3f7166f._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-01-25T19:18:21Z"
+ content="""
+You're using an old version of ikiwiki with setup files from a newer version. That won't work for various reasons, and the simplest fix is to upgrade your server to the version you had before.
+"""]]
diff --git a/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm/comment_4_c682ebb0e8e72088a8f92356dc31ef37._comment b/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm/comment_4_c682ebb0e8e72088a8f92356dc31ef37._comment
new file mode 100644
index 000000000..35a258885
--- /dev/null
+++ b/doc/forum/Debian_5.0.7:_Can__39__t_locate_IkiWiki__47__Setup__47__Automator.pm/comment_4_c682ebb0e8e72088a8f92356dc31ef37._comment
@@ -0,0 +1,13 @@
+[[!comment format=mdwn
+ username="tk"
+ ip="79.222.20.29"
+ subject="comment 4"
+ date="2011-01-26T12:34:29Z"
+ content="""
+Thank you for the fast reply, Joey!
+I tried it with ikiwiki from debian backports and it works as usual :)
+
+Bye,
+Tobias.
+
+"""]]
diff --git a/doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__.mdwn b/doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__.mdwn
new file mode 100644
index 000000000..50b5ea900
--- /dev/null
+++ b/doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__.mdwn
@@ -0,0 +1,7 @@
+Unfortunately debian stable / squeeze repos are still on version 3.20100815.7
+
+Do you think squeeze will update to a newer version soon? I am missing support for the gallery plugin, and one more that I forget right now ;)
+
+Is everyone else upgrading by directly installing via dpkg (no updates, but yeah)?
+
+Regards
diff --git a/doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_1_5e916c8fa90470909064ea73531f79d4._comment b/doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_1_5e916c8fa90470909064ea73531f79d4._comment
new file mode 100644
index 000000000..e6d09b5ab
--- /dev/null
+++ b/doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_1_5e916c8fa90470909064ea73531f79d4._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2012-01-16T14:52:22Z"
+ content="""
+Nothing wrong with 3.20100815.7 unless you need newer features.
+
+At Branchable we run Debian stable and backport the current ikiwiki to run on it. This is quite easy to do; it just builds and works. (Edit debian/control and s/markdown-discount/markdown/)
+
+It's probably time someone released a backport. I have historically not done that myself though.
+"""]]
diff --git a/doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_2_2fa15f0eaf8c860b82e366130c8563c7._comment b/doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_2_2fa15f0eaf8c860b82e366130c8563c7._comment
new file mode 100644
index 000000000..07955ac38
--- /dev/null
+++ b/doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_2_2fa15f0eaf8c860b82e366130c8563c7._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawndsaC4GaIBw49WNdbk2Faqfm_mrtQgul8"
+ nickname="Christian"
+ subject="thats cool"
+ date="2012-01-16T15:31:22Z"
+ content="""
+thanks
+"""]]
diff --git a/doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_3_c5af589dcdfe4f91dba50243762065e5._comment b/doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_3_c5af589dcdfe4f91dba50243762065e5._comment
new file mode 100644
index 000000000..3ab401eb8
--- /dev/null
+++ b/doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_3_c5af589dcdfe4f91dba50243762065e5._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawndsaC4GaIBw49WNdbk2Faqfm_mrtQgul8"
+ nickname="Christian"
+ subject="Great! It worked!"
+ date="2012-01-17T01:18:06Z"
+ content="""
+Thanks Joey, also for the replacement hint; I had to install one or two dependencies and it worked like a charm. Running the version from the day before yesterday now. (Awesome!)
+
+I have not upgraded the mypage.setup file yet. I included \"headinganchors\", which does not seem to work right now. The page compiles without errors but contains no anchors when viewed in a browser. Hm..
+
+Great job thanks again!
+"""]]
diff --git a/doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_4_3090da7bafbf92a825edec8ffc45af20._comment b/doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_4_3090da7bafbf92a825edec8ffc45af20._comment
new file mode 100644
index 000000000..569a73546
--- /dev/null
+++ b/doc/forum/Debian_squeeze_still_on_3.20100815.7_-_update_recommended__63__/comment_4_3090da7bafbf92a825edec8ffc45af20._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawndsaC4GaIBw49WNdbk2Faqfm_mrtQgul8"
+ nickname="Christian"
+ subject="updating setup file"
+ date="2012-01-17T01:30:32Z"
+ content="""
+just for the record - I created a new wiki and the setup file is then automatically in YAML format. I think I am going to just transfer the settings from the \"old\" setup file to the new one.
+
+If anyone has an idea why the headinganchors don't work, pls let me know.
+
+As for the rest - insanely cool piece of software - great
+"""]]
diff --git a/doc/forum/Define_custom_commands.mdwn b/doc/forum/Define_custom_commands.mdwn
new file mode 100644
index 000000000..c8ac00eed
--- /dev/null
+++ b/doc/forum/Define_custom_commands.mdwn
@@ -0,0 +1 @@
+Is it possible to define custom "commands" in ikiwiki? For example, if I write &%test&% in the source of a wiki page in markdown, the word "test" should be displayed red, bold and italic.
diff --git a/doc/forum/Define_custom_commands/comment_1_7d82637bc8c706b69e4a55585677f6bf._comment b/doc/forum/Define_custom_commands/comment_1_7d82637bc8c706b69e4a55585677f6bf._comment
new file mode 100644
index 000000000..da57072a9
--- /dev/null
+++ b/doc/forum/Define_custom_commands/comment_1_7d82637bc8c706b69e4a55585677f6bf._comment
@@ -0,0 +1,11 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-09-05T17:33:51Z"
+ content="""
+Plugins can be written that do this kind of arbitrary transformation. For
+example, the smiley plugin does.
+
+However, this is not usual in ikiwiki; instead there is a specific syntax for a [[ikiwiki/directive]] that is used consistently for a great many things of this sort.
+"""]]
diff --git a/doc/forum/Different_behaviour_when_editing_with_editor_than_with_web-edit.mdwn b/doc/forum/Different_behaviour_when_editing_with_editor_than_with_web-edit.mdwn
new file mode 100644
index 000000000..5440ddf73
--- /dev/null
+++ b/doc/forum/Different_behaviour_when_editing_with_editor_than_with_web-edit.mdwn
@@ -0,0 +1,24 @@
+Hi,
+
+today I encountered a strange behaviour of ikiwiki that I'm trying to understand...
+
+I linked to a subpage using
+
+\[[somename|/path/from/root/dir/filename]]
+
+and saved the file. Then I added it to the repo, committed it, and ran
+
+ ikiwiki --setup setupfile.setup
+
+Afterwards some older links of the same style were displayed correctly, but one new link remained black, i.e. with no question mark next to it. I figure that means ikiwiki recognized the syntax but did not create a link in the HTML file.
+
+I messed around a bit; recompiling many times did not help. After a while I opened the file via the web editing form and saved it from there ("Save Page").
+
+After that the link was there.
+
+I don't know why. My question is: what does ikiwiki do differently when a page is saved from the web form, compared to saving it in an editor and then running `ikiwiki --setup setupfile.setup`?
+
+Btw. my version is still 3.20100815.7 since I'm on squeeze and like to stick with the repo versions.
+
+Thanks in advance,
+Chris
diff --git a/doc/forum/Different_behaviour_when_editing_with_editor_than_with_web-edit/comment_1_ac6bda46ad00bfe980bc76c4a39aa796._comment b/doc/forum/Different_behaviour_when_editing_with_editor_than_with_web-edit/comment_1_ac6bda46ad00bfe980bc76c4a39aa796._comment
new file mode 100644
index 000000000..91e7d9e7a
--- /dev/null
+++ b/doc/forum/Different_behaviour_when_editing_with_editor_than_with_web-edit/comment_1_ac6bda46ad00bfe980bc76c4a39aa796._comment
@@ -0,0 +1,9 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnam4a8RtFwaWkKOX2BkA5I7cpGHcFw8E0"
+ nickname="Christian"
+ subject="comment 1"
+ date="2011-12-03T17:17:57Z"
+ content="""
+By \"no question mark to it\" I mean \"being no link (even though the target exists) AND having no question mark next to it\".
+thx
+"""]]
diff --git a/doc/forum/Different_behaviour_when_editing_with_editor_than_with_web-edit/comment_3_10a46f8ee23c8935e20c70842671cee4._comment b/doc/forum/Different_behaviour_when_editing_with_editor_than_with_web-edit/comment_3_10a46f8ee23c8935e20c70842671cee4._comment
new file mode 100644
index 000000000..a14752ffe
--- /dev/null
+++ b/doc/forum/Different_behaviour_when_editing_with_editor_than_with_web-edit/comment_3_10a46f8ee23c8935e20c70842671cee4._comment
@@ -0,0 +1,13 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnam4a8RtFwaWkKOX2BkA5I7cpGHcFw8E0"
+ nickname="Christian"
+ subject="comment 3"
+ date="2011-12-03T17:25:58Z"
+ content="""
+One more thing: my syntax in the above post is not correct - the linking syntax was/is
+
+ \[[somename|path/from/website/root/dir/filename]]
+
+i.e. I had one slash too many above.
+
+"""]]
diff --git a/doc/forum/Different_templates_for_subdirectories__63_____40__Blogging_and_Wiki_pages__41__.mdwn b/doc/forum/Different_templates_for_subdirectories__63_____40__Blogging_and_Wiki_pages__41__.mdwn
new file mode 100644
index 000000000..8d6700651
--- /dev/null
+++ b/doc/forum/Different_templates_for_subdirectories__63_____40__Blogging_and_Wiki_pages__41__.mdwn
@@ -0,0 +1,7 @@
+I have been mucking about with ikiwiki for two whole days now.
+
+I like many things about it. Even though I've been spending most of my time wrestling with CSS, I did manage to write a whole lot of blog posts, and I love what ikiwiki is doing for the "revise" part of my writing cycle. And I like the idea of integrating the wiki and the blog into one unifying architecture....
+
+But... I would very much like to have different page templates for blogging and wiki-ing: some way of specifying that for stuff in the "/posts" directory I'd rather use blogpost.tmpl than page.tmpl. I just spent a few minutes looking at the Perl for this (I assume Render.pm) and my mind dumped core...
+
+(generically, some way to specify output formatting on a subdirectory basis would be good)
diff --git a/doc/forum/Different_templates_for_subdirectories__63_____40__Blogging_and_Wiki_pages__41__/comment_1_15651796492a6f04a19f4a481947c97c._comment b/doc/forum/Different_templates_for_subdirectories__63_____40__Blogging_and_Wiki_pages__41__/comment_1_15651796492a6f04a19f4a481947c97c._comment
new file mode 100644
index 000000000..e92f4107d
--- /dev/null
+++ b/doc/forum/Different_templates_for_subdirectories__63_____40__Blogging_and_Wiki_pages__41__/comment_1_15651796492a6f04a19f4a481947c97c._comment
@@ -0,0 +1,16 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawlY5yDefnXSHvWGbJ9kvhnAyQZiAAttENk"
+ nickname="Javier"
+ subject="comment 1"
+ date="2010-10-21T15:00:50Z"
+ content="""
+You can do what you want with the [[ikiwiki/directive/pagetemplate]] directive, but in a slightly cumbersome way, because you have to say what template you want in every page that differs from the default.
+
+See also: [[templates]]
+
+And a perhaps more proper solution to your problem, although I don't fully understand its approach, is in [[todo/multiple_template_directories]].
+
+If you could create a proper page in this wiki, centralizing all the knowledge dispersed in those pages, it would be nice ;)
+
+--[[jerojasro]]
+"""]]
diff --git a/doc/forum/Disable_account_creation_for_new_users.mdwn b/doc/forum/Disable_account_creation_for_new_users.mdwn
new file mode 100644
index 000000000..4a24323b3
--- /dev/null
+++ b/doc/forum/Disable_account_creation_for_new_users.mdwn
@@ -0,0 +1,9 @@
+Hi,
+I'm planning on abusing ikiwiki as a mini-CMS for a static website and think it's exactly the right system for it. Great stuff btw.
+
+But I still have a problem that I can't solve at the moment. For the static website (with commenting enabled, though) I want no one else to edit the pages, i.e. I want to turn off the editing function for everyone except me.
+
+That works with the lockedit plugin so far, and I can also hide the edit and preferences links on the pages. But if someone manually enters the link to the edit page, they can still create an account (I have disabled openid and plan to use passwordauth only). The account can then not be used to edit pages, but still, anyone can create accounts in the userdb over and over. Is there a way to solve that? I just don't see it right now.
+
+Best regards,
+Chris
diff --git a/doc/forum/Disable_account_creation_for_new_users/comment_1_adafddb0aff7c2c1f4574101c4cf9073._comment b/doc/forum/Disable_account_creation_for_new_users/comment_1_adafddb0aff7c2c1f4574101c4cf9073._comment
new file mode 100644
index 000000000..116b2a592
--- /dev/null
+++ b/doc/forum/Disable_account_creation_for_new_users/comment_1_adafddb0aff7c2c1f4574101c4cf9073._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnam4a8RtFwaWkKOX2BkA5I7cpGHcFw8E0"
+ nickname="Christian"
+ subject="comment on my own question"
+ date="2011-11-30T15:15:21Z"
+ content="""
+REALLY great was what I wanted to say - all the essentials are there, including commenting and search function. What else can you want <3
+"""]]
diff --git a/doc/forum/Disable_account_creation_for_new_users/comment_2_865591f77966f1657a9a4b2426318c51._comment b/doc/forum/Disable_account_creation_for_new_users/comment_2_865591f77966f1657a9a4b2426318c51._comment
new file mode 100644
index 000000000..d04cfb6db
--- /dev/null
+++ b/doc/forum/Disable_account_creation_for_new_users/comment_2_865591f77966f1657a9a4b2426318c51._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 2"
+ date="2011-11-30T16:09:16Z"
+ content="""
+Not currently possible, but you can configure [[plugins/passwordauth]] to
+require an `account_creation_password`, and not tell anyone else what
+it is; or you could use [[plugins/httpauth]] for authenticated editing.
+(My mostly-static ikiwiki sites only allow [[plugins/httpauth]] over
+HTTPS.)
+"""]]
diff --git a/doc/forum/Disable_account_creation_for_new_users/comment_3_05193e563682f634f13691ee0a8359db._comment b/doc/forum/Disable_account_creation_for_new_users/comment_3_05193e563682f634f13691ee0a8359db._comment
new file mode 100644
index 000000000..66b521407
--- /dev/null
+++ b/doc/forum/Disable_account_creation_for_new_users/comment_3_05193e563682f634f13691ee0a8359db._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnam4a8RtFwaWkKOX2BkA5I7cpGHcFw8E0"
+ nickname="Christian"
+ subject="thx"
+ date="2011-11-30T18:50:50Z"
+ content="""
+thank you for your quick reply, smcv!
+"""]]
diff --git a/doc/forum/Discussion_PageSpec__63__.mdwn b/doc/forum/Discussion_PageSpec__63__.mdwn
new file mode 100644
index 000000000..2860d0d17
--- /dev/null
+++ b/doc/forum/Discussion_PageSpec__63__.mdwn
@@ -0,0 +1,3 @@
+I've looked around but haven't found it. Can you set a Discussion PageSpec so only certain pages allow discussion?
+
+> Not currently, sorry. --[[Joey]]
diff --git a/doc/forum/Doing_related_links_based_on_tags.mdwn b/doc/forum/Doing_related_links_based_on_tags.mdwn
new file mode 100644
index 000000000..9f6a1b937
--- /dev/null
+++ b/doc/forum/Doing_related_links_based_on_tags.mdwn
@@ -0,0 +1,31 @@
+I've recently been using a template like this:
+
+ ----
+ Related posts:
+
+ \[[!inline pages="blog/posts/*
+ and !blog/posts/*/*
+ and !Discussion
+ and !tagged(draft)
+ and <TMPL_VAR raw_tagged>"
+ archive="yes"
+ quick="yes"
+ show="5"]]
+
+which I then call at the end of my blog posts on my ikiwiki
+install like this:
+
+ \[[!tag software linux]]
+ \[[!template id=related tagged="tagged(software) or tagged(linux)"]]
+
+It somewhat works. I was wondering if anyone else has tried to do
+something like the above to get "related posts" based on tags. The way
+I have done it isn't very clever, as it only links to the 5 most
+recently posted items matching my parameters. Is it possible to
+"randomly" select a bunch of links from a set of user-defined
+pagespecs?
+
+I know that the [[backlinks]] plugin exists for this sort of stuff
+(related links), it just lacks some user configuration options.
+
+> I guess what you need is an extension to [[ikiwiki/pagespec/sorting]] to support "random" as a sort method. Remember though, that the chosen few would only change when the page was regenerated, not on every page view. -- [[Jon]]
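+Such a "random" sort method could be sketched roughly like this (an
+untested illustration of ikiwiki's `SortSpec` mechanism, not a shipped
+plugin; the `randomsort` plugin name is made up):
+
+    package IkiWiki::Plugin::randomsort;
+
+    use warnings;
+    use strict;
+    use IkiWiki 3.00;
+    use Digest::MD5 qw(md5_hex);
+
+    # Salt fixed for the lifetime of one rebuild, so the "random"
+    # order is stable within a refresh and only changes when pages
+    # are regenerated, as Jon notes above.
+    my $salt = rand();
+
+    # SortSpec comparators compare the package variables $a and $b,
+    # which hold the two page names being ordered.
+    sub IkiWiki::SortSpec::cmp_random {
+        md5_hex($salt.$a) cmp md5_hex($salt.$b);
+    }
+
+    1
+
+With something like that enabled, the inline in the template could add
+`sort="random"` to pick a different handful of related posts per rebuild.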
diff --git a/doc/forum/Dump_plugin.mdwn b/doc/forum/Dump_plugin.mdwn
new file mode 100644
index 000000000..ff3bfea90
--- /dev/null
+++ b/doc/forum/Dump_plugin.mdwn
@@ -0,0 +1,4 @@
+I have a second plugin that adds a 'dump' directive, which dumps all sorts of information (env variables and template variables) about a page at the end of that page. It's cheesy, but it's available in my [[Dropbox|http://dl.dropbox.com/u/11256359/dump.pm]], as is the Asciidoc plugin.
+
+### Issues
+* It really ought to use some kind of template instead of HTML. In fact, it ought to embed its information in template variables of some kind rather than stuffing it into the end of the page.
diff --git a/doc/forum/Dump_plugin/comment_1_bfce80b3f5be78ec28692330843d4ae1._comment b/doc/forum/Dump_plugin/comment_1_bfce80b3f5be78ec28692330843d4ae1._comment
new file mode 100644
index 000000000..855b72bbb
--- /dev/null
+++ b/doc/forum/Dump_plugin/comment_1_bfce80b3f5be78ec28692330843d4ae1._comment
@@ -0,0 +1,14 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawngqGADV9fidHK5qabIzKN0bx1ZIfvaTqs"
+ nickname="Glenn"
+ subject="New dump plugin"
+ date="2010-10-03T00:45:47Z"
+ content="""
+I took my own advice and rewrote the dump plugin so that it uses a template. A sample template has been added to my [[Dropbox|http://dl.dropbox.com/u/11256359/dump.tmpl]].
+
+### Issues:
+
+* Dumps appear at the end of the page rather than where the directive occurs.
+* For some reason I haven't yet figured out, dumps don't appear in page previews.
+* I haven't tested inlined content and the dump plugin.
+"""]]
diff --git a/doc/forum/Email_notifications_for_comment_moderation.mdwn b/doc/forum/Email_notifications_for_comment_moderation.mdwn
new file mode 100644
index 000000000..6a840ce53
--- /dev/null
+++ b/doc/forum/Email_notifications_for_comment_moderation.mdwn
@@ -0,0 +1,3 @@
+I use both [[plugins/moderatedcomments/]] and [[plugins/anonok]] on my [blog](http://feeding.cloud.geek.nz) but having to remember to visit the comment moderation page manually is not ideal.
+
+Is there a way to get an email notification (or maybe even just an RSS feed) whenever a new comment enters the moderation queue?
diff --git a/doc/forum/Email_notifications_for_comment_moderation/comment_1_668bf6a21310dcc8b882bc60a130ba06._comment b/doc/forum/Email_notifications_for_comment_moderation/comment_1_668bf6a21310dcc8b882bc60a130ba06._comment
new file mode 100644
index 000000000..06140728e
--- /dev/null
+++ b/doc/forum/Email_notifications_for_comment_moderation/comment_1_668bf6a21310dcc8b882bc60a130ba06._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawm6VdLM7IKcNgkDiJ2YKHpcWli9bRLsZDk"
+ nickname="Oscar"
+ subject="How to get a comment moderation feed"
+ date="2013-02-25T18:04:21Z"
+ content="""
+I was looking into the same subject and I found the following:
+
+http://ikiwiki.info/tips/comments_feed/
+
+
+"""]]
diff --git a/doc/forum/Empty_sha1sum_messages.mdwn b/doc/forum/Empty_sha1sum_messages.mdwn
new file mode 100644
index 000000000..f464232c4
--- /dev/null
+++ b/doc/forum/Empty_sha1sum_messages.mdwn
@@ -0,0 +1,11 @@
+Hi,
+
+running ikiwiki from squeeze-backports, I frequently see messages like this in the logs:
+
+> Empty sha1sum for 'ikiwiki/pagespec.mdwn'.
+
+The page varies. What are these messages about? The code that emits this
+comes from the git plugin, and this page is indeed not in git. So is
+this just noise? Or rather, why does ikiwiki need to look at these?
+
+thanks in advance!
diff --git a/doc/forum/Empty_sha1sum_messages/comment_1_b260b5e6b4c4f4c203b01183fee9fd69._comment b/doc/forum/Empty_sha1sum_messages/comment_1_b260b5e6b4c4f4c203b01183fee9fd69._comment
new file mode 100644
index 000000000..7ad6f2815
--- /dev/null
+++ b/doc/forum/Empty_sha1sum_messages/comment_1_b260b5e6b4c4f4c203b01183fee9fd69._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2012-03-22T17:06:05Z"
+ content="""
+Hmm, this is a debug message, so only printed or logged if verbose mode is enabled. As far as I can see, in normal operation this could only happen if you use the web interface to edit a page that's coming originally from ikiwiki's underlay -- such as `ikiwiki/pagespec.mdwn`, or a page in the srcdir whose file is indeed not checked into git for some reason.
+
+Doesn't seem useful, so I've nuked the message.
+"""]]
diff --git a/doc/forum/Empty_sha1sum_messages/comment_2_d6a47838a3c81d0a75e6fc22e786c976._comment b/doc/forum/Empty_sha1sum_messages/comment_2_d6a47838a3c81d0a75e6fc22e786c976._comment
new file mode 100644
index 000000000..452fc946a
--- /dev/null
+++ b/doc/forum/Empty_sha1sum_messages/comment_2_d6a47838a3c81d0a75e6fc22e786c976._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://k1024.org/~iusty/"
+ nickname="Iustin Pop"
+ subject="Re: comment 1"
+ date="2012-03-22T19:34:55Z"
+ content="""
+Hmm, I see this 2-3 times a day on an internet-exposed ikiwiki. There shouldn't be any editing of such files, especially not of files from the underlay; the only editing permissions are for blog comments.
+
+I'll have to take a closer look at what's going on. Thanks for the reply!
+"""]]
diff --git a/doc/forum/Encoding_problem_in_french_with_ikiwiki-calendar.mdwn b/doc/forum/Encoding_problem_in_french_with_ikiwiki-calendar.mdwn
new file mode 100644
index 000000000..472412de1
--- /dev/null
+++ b/doc/forum/Encoding_problem_in_french_with_ikiwiki-calendar.mdwn
@@ -0,0 +1,20 @@
+Hi!
+
+I'm using the ikiwiki calendar plugin.
+
+My website is in french (locale fr_FR.UTF-8), and calendars that are generated by the plugin makes some encodi$
+
+I don't know how the plugin generate translation for dates, but I've seen that there is no ikiwiki translation$
+
+That's why I suppose (but I'm not sure) that it use date unix command to insert date into the html page, witho$
+
+Could I have forgotten some option that would make it work, or not?
+
+Could someone test it and verify whether it works?
+
+Thanks.
+
+Zut
+
+> This was discussed in [[bugs/Encoding_problem_in_calendar_plugin]]
+> and is now fixed. --[[Joey]]
diff --git a/doc/forum/Error:_CGI::tmpFileName_failed_to_return_the_uploaded_file_name.mdwn b/doc/forum/Error:_CGI::tmpFileName_failed_to_return_the_uploaded_file_name.mdwn
new file mode 100644
index 000000000..ad52c0091
--- /dev/null
+++ b/doc/forum/Error:_CGI::tmpFileName_failed_to_return_the_uploaded_file_name.mdwn
@@ -0,0 +1,11 @@
+I tried to upload a file to my wiki last week, and got this error:
+
+ Error: CGI::tmpFileName failed to return the uploaded file name
+
+This used to work, and I honestly don't know what could have changed to screw it up.
+
+I see that there are a lot of frustrated comments around this particular block of code in `attachment.pm`; so, for the record, I'm on Debian 6, using perl `5.10.1-17squeeze4`.
+
+Any thoughts?
+
+~jonathon
diff --git a/doc/forum/Error:_CGI::tmpFileName_failed_to_return_the_uploaded_file_name/comment_1_66c321b9eb618d20872cee7d6ca9e44c._comment b/doc/forum/Error:_CGI::tmpFileName_failed_to_return_the_uploaded_file_name/comment_1_66c321b9eb618d20872cee7d6ca9e44c._comment
new file mode 100644
index 000000000..fb499004e
--- /dev/null
+++ b/doc/forum/Error:_CGI::tmpFileName_failed_to_return_the_uploaded_file_name/comment_1_66c321b9eb618d20872cee7d6ca9e44c._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawlRjjrKEyPmXnh2qBEGx9PgH5DP32wCMAQ"
+ nickname="Jonathon"
+ subject="ikiwiki version"
+ date="2013-01-02T14:59:19Z"
+ content="""
+I should also identify that I'm using ikiwiki `3.20100815.9`.
+"""]]
diff --git a/doc/forum/Error:_CGI::tmpFileName_failed_to_return_the_uploaded_file_name/comment_2_80296d67c7f1dd75b56b85c14f5efa3b._comment b/doc/forum/Error:_CGI::tmpFileName_failed_to_return_the_uploaded_file_name/comment_2_80296d67c7f1dd75b56b85c14f5efa3b._comment
new file mode 100644
index 000000000..e7659413e
--- /dev/null
+++ b/doc/forum/Error:_CGI::tmpFileName_failed_to_return_the_uploaded_file_name/comment_2_80296d67c7f1dd75b56b85c14f5efa3b._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawlRjjrKEyPmXnh2qBEGx9PgH5DP32wCMAQ"
+ nickname="Jonathon"
+ subject="figured it out"
+ date="2013-01-19T15:59:09Z"
+ content="""
+It looks like this was just another expression of [the header size limit issue] [1] that has already been reported and addressed.
+
+I got `3.20120629` from `squeeze-backports`, and my issue has been resolved.
+
+[1]: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=638009
+"""]]
diff --git a/doc/forum/Error:___34__do__34___parameter_missing.mdwn b/doc/forum/Error:___34__do__34___parameter_missing.mdwn
new file mode 100644
index 000000000..402217e82
--- /dev/null
+++ b/doc/forum/Error:___34__do__34___parameter_missing.mdwn
@@ -0,0 +1,13 @@
+Hi
+
+I'm stuck with a «Error: "do" parameter missing» message I can't fix.
+
+I'm using ikiwiki 3.20100815.7 on a debian 6.0.4 system.
+
+Error redirection is obviously configured, as is the dot-cgi thing.
+
+You can test it at http://wikimix.cc/thisuridoesntexist
+
+The procedure of creating a reference to a new page gives the same error.
+
+Any clue?
diff --git a/doc/forum/Error:___34__do__34___parameter_missing/comment_1_3a51c303ba1670f1567f323349b53837._comment b/doc/forum/Error:___34__do__34___parameter_missing/comment_1_3a51c303ba1670f1567f323349b53837._comment
new file mode 100644
index 000000000..a01a33468
--- /dev/null
+++ b/doc/forum/Error:___34__do__34___parameter_missing/comment_1_3a51c303ba1670f1567f323349b53837._comment
@@ -0,0 +1,16 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2012-05-22T13:46:20Z"
+ content="""
+Did you enable the [[plugins/404]] plugin?
+
+Which web server? That plugin is meant to work with Apache 2; in
+principle it should be possible to make it work with other web servers,
+but it'll need some setup.
+
+The 404 plugin relies on your web server giving IkiWiki some extra
+information about 404s; lighttpd doesn't currently provide enough
+information for IkiWiki to detect 404s reliably, for instance.
+"""]]
diff --git a/doc/forum/Error:___34__do__34___parameter_missing/comment_2_c5f24a8c4d2de0267cf0de1908480e82._comment b/doc/forum/Error:___34__do__34___parameter_missing/comment_2_c5f24a8c4d2de0267cf0de1908480e82._comment
new file mode 100644
index 000000000..dd2fdb248
--- /dev/null
+++ b/doc/forum/Error:___34__do__34___parameter_missing/comment_2_c5f24a8c4d2de0267cf0de1908480e82._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="http://ismael.olea.org/"
+ ip="150.214.94.198"
+ subject="comment 2"
+ date="2012-05-22T21:24:37Z"
+ content="""
+Gosh, that was it.
+
+Grrr. Sorry for being such a rookie O:-)
+
+Thanks!
+"""]]
diff --git a/doc/forum/Error:_bad_page_name.mdwn b/doc/forum/Error:_bad_page_name.mdwn
new file mode 100644
index 000000000..70277a1e4
--- /dev/null
+++ b/doc/forum/Error:_bad_page_name.mdwn
@@ -0,0 +1,46 @@
+I'm trying to use ikiwiki for the first time. In the start, I had problems
+with installing the package, because I don't have a root account on my
+server.
+
+When I solved this, I finally set up my wiki, but whenever I try to edit a
+page, I get an error: “Error: bad page name”.
+
+What am I doing wrong? The wiki is at
+<http://atrey.karlin.mff.cuni.cz/~onderka/wiki/>, the setupfile I used at
+<http://atrey.karlin.mff.cuni.cz/~onderka/wiki/ikiwiki.setup>.
+
+> This means that one of the checks that ikiwiki uses to prevent
+> editing files with strange or insecure names has fired incorrectly.
+> Your setup file seems fine.
+> We can figure out what is going wrong through a series of tests:
+>
+> * Test if your perl has a problem with matching alphanumerics:
+> `perl -le 'print int "index"=~/^([-[:alnum:]+\/.:_]+)$/'`
+> * Check if something is breaking pruning of disallowed files:
+> `perl -le 'use IkiWiki; %config=IkiWiki::defaultconfig(); print ! IkiWiki::file_pruned("index")'`
+> --[[Joey]]
+
+>>Both seem to run fine:
+
+ onderka@atrey:~$ perl -le 'print int "index"=~/^([-[:alnum:]+\/.:_]+)$/'
+ 1
+ onderka@atrey:~$ perl -le 'use IkiWiki; %config=IkiWiki::defaultconfig(); print ! IkiWiki::file_pruned("index")'
+ 1
+
+>>> Try installing this [instrumented
+>>> version](http://kitenet.net/~joey/tmp/editpage.pm) of
+>>> `IkiWiki/Plugin/editpage.pm`, which will add some debugging info
+>>> to the error message. --[[Joey]]
+
+>>>>When I tried to `make` ikiwiki with this file, I got the error
+
+ ../IkiWiki/Plugin/editpage.pm:101: invalid variable interpolation at "$"
+
+>>>>> Sorry about that, I've corrected the above file. --[[Joey]]
+
+>>>>>> Hmm, funny. Now that I reinstalled it with your changed file, it started working. I don't remember exactly how I installed it the last time, so this time, it seems, I did it correctly. Thank you very much for your help.
+
+>>>>>>> Well, this makes me suspect you installed an older version of
+>>>>>>> ikiwiki and my file, which is from the latest version, included a
+>>>>>>> fix for whatever bug you were seeing. If I were you, I'd ensure
+>>>>>>> that I have a current version of ikiwiki installed. --[[Joey]]
diff --git a/doc/forum/Error:_cannot_write_to___47__home__47__user__47__myiki2__47__.ikiwiki__47__lockfile:_Permission_denied_.mdwn b/doc/forum/Error:_cannot_write_to___47__home__47__user__47__myiki2__47__.ikiwiki__47__lockfile:_Permission_denied_.mdwn
new file mode 100644
index 000000000..c6af80ab1
--- /dev/null
+++ b/doc/forum/Error:_cannot_write_to___47__home__47__user__47__myiki2__47__.ikiwiki__47__lockfile:_Permission_denied_.mdwn
@@ -0,0 +1,5 @@
+I installed a new wiki on my local machine. When I click on edit, I get the following error:
+
+`Error: cannot write to /home/user/myiki2/.ikiwiki/lockfile: Permission denied`
+
+I checked the permissions of that file and assured that they are the same as in my other working ikiwiki, but it doesn't help. Any idea?
diff --git a/doc/forum/Error:_cannot_write_to___47__home__47__user__47__myiki2__47__.ikiwiki__47__lockfile:_Permission_denied_/comment_1_64146f306ec8c10614521359b6de4f82._comment b/doc/forum/Error:_cannot_write_to___47__home__47__user__47__myiki2__47__.ikiwiki__47__lockfile:_Permission_denied_/comment_1_64146f306ec8c10614521359b6de4f82._comment
new file mode 100644
index 000000000..02f70860d
--- /dev/null
+++ b/doc/forum/Error:_cannot_write_to___47__home__47__user__47__myiki2__47__.ikiwiki__47__lockfile:_Permission_denied_/comment_1_64146f306ec8c10614521359b6de4f82._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="https://launchpad.net/~tale"
+ nickname="tale"
+ subject="What are the permissions?"
+ date="2011-12-28T19:35:25Z"
+ content="""
+What exactly are the permissions for the lockfile and the directory it is in?
+
+What user is the ikiwiki running as?
+"""]]
diff --git a/doc/forum/Error:_cannot_write_to___47__home__47__user__47__myiki2__47__.ikiwiki__47__lockfile:_Permission_denied_/comment_2_ed2b4b8f7122b42bbde1189fbd2969dd._comment b/doc/forum/Error:_cannot_write_to___47__home__47__user__47__myiki2__47__.ikiwiki__47__lockfile:_Permission_denied_/comment_2_ed2b4b8f7122b42bbde1189fbd2969dd._comment
new file mode 100644
index 000000000..cc1a30adb
--- /dev/null
+++ b/doc/forum/Error:_cannot_write_to___47__home__47__user__47__myiki2__47__.ikiwiki__47__lockfile:_Permission_denied_/comment_2_ed2b4b8f7122b42bbde1189fbd2969dd._comment
@@ -0,0 +1,23 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk_MMtLPS7osC5MjX00q2ATjvvXPWqm0ik"
+ nickname="micheal"
+ subject="comment 2"
+ date="2011-12-28T23:06:06Z"
+ content="""
+<pre>
+stat -c '%A %a %U %G %n' myiki2 myiki1
+drwxr-xr-x 755 user user myiki2
+drwxr-xr-x 755 user user myiki1
+
+stat -c '%A %a %U %G %n' myiki2/.ikiwiki myiki1/.ikiwiki
+drwxr-xr-x 755 user user myiki2/.ikiwiki
+drwxr-xr-x 755 user user myiki1/.ikiwiki
+
+stat -c '%A %a %U %G %n' myiki2/.ikiwiki/lockfile myiki1/.ikiwiki/lockfile
+-rw-r--r-- 644 user user myiki2/.ikiwiki/lockfile
+-rw-r--r-- 644 user user myiki1/.ikiwiki/lockfile
+</pre>
+
+As you can see, myiki2 has the same permissions as myiki1, yet only myiki1 works.
+
+"""]]
diff --git a/doc/forum/Error_Code_1.mdwn b/doc/forum/Error_Code_1.mdwn
new file mode 100644
index 000000000..3e6878bb7
--- /dev/null
+++ b/doc/forum/Error_Code_1.mdwn
@@ -0,0 +1,7 @@
+Hi. I'm new to ikiwiki. I typed
+
+"make install"
+
+and got "Error Code 1".
+
+Any help is appreciated.
diff --git a/doc/forum/Error_Code_1/comment_1_0459afcc383aad382df67a19eaf2e731._comment b/doc/forum/Error_Code_1/comment_1_0459afcc383aad382df67a19eaf2e731._comment
new file mode 100644
index 000000000..f4bb410eb
--- /dev/null
+++ b/doc/forum/Error_Code_1/comment_1_0459afcc383aad382df67a19eaf2e731._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-02-25T18:57:22Z"
+ content="""
+Read the README file. Ikiwiki's source does not include a Makefile; you have to run ./Makefile.PL to create one.
+"""]]
diff --git a/doc/forum/Everyone_can_remove_comments.mdwn b/doc/forum/Everyone_can_remove_comments.mdwn
new file mode 100644
index 000000000..5e30b08ed
--- /dev/null
+++ b/doc/forum/Everyone_can_remove_comments.mdwn
@@ -0,0 +1 @@
+Having enabled anonok leads to the undesirable situation that everybody is able to remove every comment. Is there an easy workaround, or am I missing something? --[[wiebel]]
diff --git a/doc/forum/Flowplayer.mdwn b/doc/forum/Flowplayer.mdwn
new file mode 100644
index 000000000..9bf3ab3af
--- /dev/null
+++ b/doc/forum/Flowplayer.mdwn
@@ -0,0 +1 @@
+[Flowplayer](http://flowplayer.org) is an open source Flash video player. [My site](http://mcfrisk.kapsi.fi) has raw html enabled to work with old content, so I was able to use the raw html and javascript [examples](http://flowplayer.org/documentation/installation/index.html) in blog posts, but some of them fail when combined on the [aggregate page](http://mcfrisk.kapsi.fi/skiing/). Any hints on how to properly use Flowplayer with ikiwiki?
diff --git a/doc/forum/Flowplayer/comment_1_75d13cd915a736422db47e00dbe46671._comment b/doc/forum/Flowplayer/comment_1_75d13cd915a736422db47e00dbe46671._comment
new file mode 100644
index 000000000..159e9dce1
--- /dev/null
+++ b/doc/forum/Flowplayer/comment_1_75d13cd915a736422db47e00dbe46671._comment
@@ -0,0 +1,9 @@
+[[!comment format=mdwn
+ username="Sphynkx"
+ ip="85.238.106.185"
+ subject="Videoplugin"
+ date="2011-06-27T02:27:19Z"
+ content="""
+Hello!!
+Test my [videoplugin](http://ikiwiki.info/plugins/contrib/video/)!!
+"""]]
diff --git a/doc/forum/Flowplayer/comment_2_1b2d3891006a87a4773bd126baacddfc._comment b/doc/forum/Flowplayer/comment_2_1b2d3891006a87a4773bd126baacddfc._comment
new file mode 100644
index 000000000..597638397
--- /dev/null
+++ b/doc/forum/Flowplayer/comment_2_1b2d3891006a87a4773bd126baacddfc._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://mcfrisk.myopenid.com/"
+ nickname="mikko.rapeli"
+ subject="Bad html"
+ date="2011-07-02T19:09:22Z"
+ content="""
+My problems on the aggregate pages were resolved by checking and fixing html validation problems in the embedded video stuff. I will have a look at the video plugin later, and maybe at html5 and the `<video>` tag as well.
+"""]]
diff --git a/doc/forum/Forward_slashes_being_escaped_as_252F.mdwn b/doc/forum/Forward_slashes_being_escaped_as_252F.mdwn
new file mode 100644
index 000000000..5df81e561
--- /dev/null
+++ b/doc/forum/Forward_slashes_being_escaped_as_252F.mdwn
@@ -0,0 +1,33 @@
+When I try to edit a page that has a forward slash in the URL, I get "Error:
+bad page name". I think the problem is because the forward slash is escaped as
+`%252F` instead of `%2F`.
+
+For example, if I go to `http://ciffer.net/~svend/tech/hosts/` and click Edit,
+I am sent to a page with the URL
+`http://ciffer.net/~svend/ikiwiki.cgi?page=tech%252Fhosts&do=edit`.
+
+I am running ikiwiki 3.20100504~bpo50+1 on Debian Lenny.
+
+
+> But on your page, the Edit link is escaped normally and correctly (using %2F).
+> Look at the page source!
+>
+> The problem is that your web server is forcing a hard (302) redirect
+> to the doubly-escaped url. In wireshark I see your web server send back:
+
+ HTTP/1.1 302 Found\r\n
+ Apache/2.2.9 (Debian) PHP/5.2.6-1+lenny9 with Suhosin-Patch
+ Location: http://ciffer.net/~svend/ikiwiki.cgi?page=tech%252Fhosts&do=edit
+
+> You'll need to investigate why your web server is doing that... --[[Joey]]
+
+>> Thanks for pointing me in the right direction. I have the following redirect
+>> in my Apache config.
+
+ RewriteEngine on
+ RewriteCond %{HTTP_HOST} ^www\.ciffer\.net$
+ RewriteRule /(.*) http://ciffer.net/$1 [L,R]
+
+>> and my ikiwiki url setting contained `www.ciffer.net`, which was causing the
+>> redirect. Correcting the url fixed the problem. I'm still not sure why
+>> Apache was mangling the URL. --[[Svend]]
diff --git a/doc/forum/Forward_slashes_being_escaped_as_252F/comment_1_7702cf6d354ab600d6643b075b9f09da._comment b/doc/forum/Forward_slashes_being_escaped_as_252F/comment_1_7702cf6d354ab600d6643b075b9f09da._comment
new file mode 100644
index 000000000..7c9ccbca1
--- /dev/null
+++ b/doc/forum/Forward_slashes_being_escaped_as_252F/comment_1_7702cf6d354ab600d6643b075b9f09da._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawm6gCEo_Z36OJ7x5kyZn52lEVvyjn3zSUc"
+ nickname="Ángel"
+ subject="comment 1"
+ date="2013-05-31T16:03:03Z"
+ content="""
+Simply add the NE (no escape) flag:
+
+    RewriteEngine on
+    RewriteCond %{HTTP_HOST} ^www\.ciffer\.net$
+    RewriteRule /(.*) http://ciffer.net/$1 [L,R,NE]
+"""]]
diff --git a/doc/forum/Google_searches_of_ikiwiki.info_are_broken._:__40__.mdwn b/doc/forum/Google_searches_of_ikiwiki.info_are_broken._:__40__.mdwn
new file mode 100644
index 000000000..3716a1475
--- /dev/null
+++ b/doc/forum/Google_searches_of_ikiwiki.info_are_broken._:__40__.mdwn
@@ -0,0 +1,14 @@
+I know that these pages exist on ikiwiki.info:
+
+* http://ikiwiki.info/ikiwiki/formatting/
+* http://ikiwiki.info/ikiwiki/subpage/
+
+But I can't get either to show up in Google search results. I have even tried:
+> site:ikiwiki.info inurl:formatting
+
+and
+
+> site:ikiwiki.info inurl:formatting -inurl:discussion
+
+...Is this some robots.txt problem?
+
diff --git a/doc/forum/Have_creation_time_for_new_pages_fetched_from_the_VCS.mdwn b/doc/forum/Have_creation_time_for_new_pages_fetched_from_the_VCS.mdwn
new file mode 100644
index 000000000..1ce63dbcf
--- /dev/null
+++ b/doc/forum/Have_creation_time_for_new_pages_fetched_from_the_VCS.mdwn
@@ -0,0 +1,7 @@
+I use ikiwiki for my blog, and I'd like the creation time of each page to be the one registered in the VCS. However, the only way to get that is really with the --gettime flag, which gets ALL the timestamps from the VCS... which takes quite some time after a few years of writing ;-)
+
+So I'd like to suggest that ikiwiki could fetch ctime through rcs_getctime() by default when it finds a new page...
+
+mtime is a different matter, not so important to me...
+
+-- [Richard Levitte](http://journal.richard.levitte.org/)
diff --git a/doc/forum/Have_creation_time_for_new_pages_fetched_from_the_VCS/comment_1_9572dd6f7a2f6f630b12f24bb5c4a8ce._comment b/doc/forum/Have_creation_time_for_new_pages_fetched_from_the_VCS/comment_1_9572dd6f7a2f6f630b12f24bb5c4a8ce._comment
new file mode 100644
index 000000000..d9e974c8a
--- /dev/null
+++ b/doc/forum/Have_creation_time_for_new_pages_fetched_from_the_VCS/comment_1_9572dd6f7a2f6f630b12f24bb5c4a8ce._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-07-19T15:50:41Z"
+ content="""
+This is not done for reasons of speed. `rcs_getctime` is for many VCSs still slow, requiring a new svn or whatever to be forked off (and, in the case of svn, possibly making a network connection). In the case where a commit comes in that adds multiple pages, that would be a lot of work. And for the VCSs where it's not slow, it manages to not be slow by optimising for the case where *all* the times for all pages are looked up in one go... which is still rather a lot to do when only a single new page has been added. So we lose either way, and it's not suitable for being run all the time, unfortunately.
+"""]]
diff --git a/doc/forum/Help_with_tag__95__autocreate.mdwn b/doc/forum/Help_with_tag__95__autocreate.mdwn
new file mode 100644
index 000000000..f470c57cf
--- /dev/null
+++ b/doc/forum/Help_with_tag__95__autocreate.mdwn
@@ -0,0 +1,9 @@
+I have the tag plugin enabled, and these additional lines in my setup file:
+
+ tagbase => "tags",
+ tag_autocreate => 1,
+ tag_autocreate_commit => 1,
+
+However, when I use a !tag or !taglink directive, nothing gets autocreated in the tags/ directory. What am I doing wrong?
+
+Edit: I'm using ikiwiki version 3.20100122ubuntu1 on Ubuntu 10.04.3 LTS... upgraded to ikiwiki_3.20110905_all from the debian repository and that solved my problem. Oops. :)
diff --git a/doc/forum/Hide_text.mdwn b/doc/forum/Hide_text.mdwn
new file mode 100644
index 000000000..cb4568342
--- /dev/null
+++ b/doc/forum/Hide_text.mdwn
@@ -0,0 +1,3 @@
+I want to make a kind of puzzle page with ikiwiki. For that, I want the solution text hidden directly below the puzzle, appearing only when I click a button.
+
+Is it possible to do this in ikiwiki?
diff --git a/doc/forum/Hide_text/comment_1_f21d21c130f97a7b21d8a317178e2e0c._comment b/doc/forum/Hide_text/comment_1_f21d21c130f97a7b21d8a317178e2e0c._comment
new file mode 100644
index 000000000..7b2031d48
--- /dev/null
+++ b/doc/forum/Hide_text/comment_1_f21d21c130f97a7b21d8a317178e2e0c._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-09-07T20:58:18Z"
+ content="""
+Sounds like you want [[ikiwiki/directive/toggle]]
+"""]]
diff --git a/doc/forum/Hide_text/comment_2_5a878865f34f78a89c4ec91a9425a085._comment b/doc/forum/Hide_text/comment_2_5a878865f34f78a89c4ec91a9425a085._comment
new file mode 100644
index 000000000..80a8d4281
--- /dev/null
+++ b/doc/forum/Hide_text/comment_2_5a878865f34f78a89c4ec91a9425a085._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk_MMtLPS7osC5MjX00q2ATjvvXPWqm0ik"
+ nickname="micheal"
+ subject="comment 2"
+ date="2011-09-11T11:59:35Z"
+ content="""
+Thanks, that's exactly what I wanted!
+"""]]
diff --git a/doc/forum/How_best_to_manage_a_bilingual_ikiwiki_site__63__.mdwn b/doc/forum/How_best_to_manage_a_bilingual_ikiwiki_site__63__.mdwn
new file mode 100644
index 000000000..8528c3b68
--- /dev/null
+++ b/doc/forum/How_best_to_manage_a_bilingual_ikiwiki_site__63__.mdwn
@@ -0,0 +1,15 @@
+For example, I need to deploy a wiki site in English and a non-English language. It can be assumed all pages are available in English. Some of the pages are/should be available in the other language. When a user creates/edits one version of a page, they may or may not create/edit the version in the other language.
+
+Is there some recommended settings for this?
+
+For example, I'd like a link pointing to the opposite language version displayed on a page:
+
+ This is an English page <LINK TO LOCAL LANGUAGE VERSION>
+
+and
+
+ &*^&*^*(*()*)(^%^$%^%^&*^ <LINK TO ENGLISH VERSION>
+
+Any ideas how best to achieve something like this?
+
+Also, can I have non-ASCII chars in source file names (`.mdwn`)?
diff --git a/doc/forum/How_best_to_manage_a_bilingual_ikiwiki_site__63__/comment_1_4389d65b14fa1b7134098e0ffe3bf055._comment b/doc/forum/How_best_to_manage_a_bilingual_ikiwiki_site__63__/comment_1_4389d65b14fa1b7134098e0ffe3bf055._comment
new file mode 100644
index 000000000..019c585e8
--- /dev/null
+++ b/doc/forum/How_best_to_manage_a_bilingual_ikiwiki_site__63__/comment_1_4389d65b14fa1b7134098e0ffe3bf055._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-05-22T22:24:12Z"
+ content="""
+Use the [[po_plugin|plugins/po]] for this.
+
+(Yes, you may use any utf-8 in filenames of pages, as well as page contents.)
+"""]]
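+A sketch of the corresponding `ikiwiki.setup` fragment (the option
+names come from the po plugin; the exact value syntax may vary between
+ikiwiki versions, and the language choices here are just examples):
+
+    add_plugins => [qw{po}],
+    po_master_language => { code => 'en', name => 'English' },
+    po_slave_languages => { fr => 'Français' },
+    po_translatable_pages => '* and !*/Discussion',
+
+The po plugin then generates per-language pages with links between the
+translations, which gives the "link to the opposite language version"
+behaviour asked for above.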
diff --git a/doc/forum/How_can_I_prevent_spam__63__.mdwn b/doc/forum/How_can_I_prevent_spam__63__.mdwn
new file mode 100644
index 000000000..44e31927e
--- /dev/null
+++ b/doc/forum/How_can_I_prevent_spam__63__.mdwn
@@ -0,0 +1,17 @@
+I am getting continuous spam like this:
+
+ discussion 85.25.146.11 web 11:02:19 05/17/13 2rand[0,1,1]
+ discussion 85.25.146.11 web 11:02:13 05/17/13 2rand[0,1,1]
+
+The bot uses an IP address as the username and puts '2rand[0,1,1]' as comment text.
+
+I do not have a page 'discussion' in use, so I have redirected this page with an apache2
+Alias to a static page, just in case anyone stumbles on it. This means it cannot really
+be edited via the web. However the bots that post
+this spam are evidently not opening the page to edit it, but merely sending a cgi request
+as if they had edited the page. The result is that no damage is done on the site and no
+benefit is achieved for the spammer since google cannot see the result. However, the
+logs are stuffed with spurious entries and a page is constantly recompiled, which wastes
+resources.
+
+Is there some way to reject edits that do not arise from an established session?
diff --git a/doc/forum/How_can_I_prevent_spam__63__/comment_1_fd26fb7f1569e8c44ba8262794f938db._comment b/doc/forum/How_can_I_prevent_spam__63__/comment_1_fd26fb7f1569e8c44ba8262794f938db._comment
new file mode 100644
index 000000000..a7293288c
--- /dev/null
+++ b/doc/forum/How_can_I_prevent_spam__63__/comment_1_fd26fb7f1569e8c44ba8262794f938db._comment
@@ -0,0 +1,19 @@
+[[!comment format=mdwn
+ username="http://joeyh.name/"
+ nickname="joey"
+ subject="comment 1"
+ date="2013-05-17T17:55:46Z"
+ content="""
+Normally ikiwiki requires a valid session cookie of a logged in user to edit pages. It sounds like you may have the opendiscussion or anonok plugins enabled, which allows anyone to edit without logging in. Recommend disabling them.
+
+Since you know the spammer's IP, put it into ikiwiki.setup:
+
+<pre>
+banned_users:
+ - ip(85.25.146.11)
+</pre>
+
+If the user was logging in, you could also put their username in the ban list.
+
+You can also try enabling the blogspam plugin.
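+
+To enable it, a setup fragment along these lines should work (`blogspam_pagespec` is optional and shown here with its documented default):
+
+<pre>
+add_plugins:
+- blogspam
+blogspam_pagespec: postcomment(*)
+</pre>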
+"""]]
diff --git a/doc/forum/How_can_I_prevent_spam__63__/comment_2_d098124f005976ee815d25c883bc9106._comment b/doc/forum/How_can_I_prevent_spam__63__/comment_2_d098124f005976ee815d25c883bc9106._comment
new file mode 100644
index 000000000..53e743361
--- /dev/null
+++ b/doc/forum/How_can_I_prevent_spam__63__/comment_2_d098124f005976ee815d25c883bc9106._comment
@@ -0,0 +1,16 @@
+[[!comment format=mdwn
+ username="http://claimid.com/richard-lyons"
+ nickname="richard-lyons"
+ subject="comment 2"
+ date="2013-05-17T20:56:23Z"
+ content="""
+I did indeed have opendiscussion active. I shall wait to see what happens after disabling it.
+
+The bots seem to make 5 consecutive edits at short intervals (around 2 minutes) using an IP
+address as a username. I do not know if the IP is the one from which they work. There are
+usually two or three sets of five edits using different IP addresses as username in each hour.
+
+I did try blocking specific IPs but they constantly change.
+
+It would be good if blocking could match a regexp, but as far as I can see this is not an option.
+"""]]
diff --git a/doc/forum/How_can_I_prevent_spam__63__/comment_3_deb434d01aaefa18d2791e48d6c824ae._comment b/doc/forum/How_can_I_prevent_spam__63__/comment_3_deb434d01aaefa18d2791e48d6c824ae._comment
new file mode 100644
index 000000000..64783befc
--- /dev/null
+++ b/doc/forum/How_can_I_prevent_spam__63__/comment_3_deb434d01aaefa18d2791e48d6c824ae._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://claimid.com/richard-lyons"
+ nickname="richard-lyons"
+ subject="SOLVED -- How can I prevent spam?"
+ date="2013-05-18T08:13:19Z"
+ content="""
+I can now confirm that this particular attack has stopped after removing the opendiscussion plugin.
+"""]]
diff --git a/doc/forum/How_do_I_enable_OpenID__63__.mdwn b/doc/forum/How_do_I_enable_OpenID__63__.mdwn
new file mode 100644
index 000000000..a4e1a4531
--- /dev/null
+++ b/doc/forum/How_do_I_enable_OpenID__63__.mdwn
@@ -0,0 +1 @@
+I'm trying to set up a new ikiwiki-based blog and I want commenters to be able to log in with OpenID. To enable OpenID I installed Net::OpenID::Consumer, enabled the openid plugin in my blog.setup file, and ran the ikiwiki --setup command. Despite doing that I still only see the username and password fields on the login screen. What am I missing?
diff --git a/doc/forum/How_does_ikiwiki_remember_times__63__.mdwn b/doc/forum/How_does_ikiwiki_remember_times__63__.mdwn
new file mode 100644
index 000000000..6b7739fd0
--- /dev/null
+++ b/doc/forum/How_does_ikiwiki_remember_times__63__.mdwn
@@ -0,0 +1,98 @@
+This is similar to the last post in this forum. I want to know exactly how ikiwiki remembers the times associated with pages, especially when using it for blogging, so I know whether I can trust it or not. From that last thread, I think what ikiwiki does is this:
+
+* The created time of a file is when that file was first committed into the versioning repository (in my case git)
+
+    > If `--getctime` is used, yes. In normal operation, when new files
+ > are added, ikiwiki sets the creation time to the ctime of the file
+ > on disk, rather than bothering to ask the VCS. --[[Joey]]
+
+* The modified time of a file is what that file was last updated in the repository
+
+ > Almost right, the modified time is actually taken from the
+    > modification time of the file on disk. --[[Joey]]
+
+And with a blog, by default, the posts are ordered by creation time, although an option can order them by modified time.
+
+Okay. So this should mean that the times are safe if, for example, I delete my working copy and then clone another one from the bare git repository, or otherwise mess up the creation times and mtimes stored as file metadata on the filesystem.
+
+Do I have it right?
+
+> Some VCS, like git, set the file mtimes to the current time
+> when making a new checkout, so they will be lost if you do that.
+> The creation times can be retrieved using the `--getctime` option.
+> --[[Joey]]
+>
+> > Thanks for the clarification. I ran some tests of my own to make sure I understand it right, and I'm satisfied
+> > that the order of posts in my blog can be retrieved from the VCS using the `--getctime` option, at least if I
+> > choose to order my posts by creation time rather than modification time. But I now know that I can't rely on
+> > page modification times in ikiwiki as these can be lost permanently.
+>
+> > > Update: It's now renamed to `--gettime`, and pulls both the creation
+> > > and modification times. Also, per [[todo/auto_getctime_on_fresh_build]],
+> > > this is now done automatically the first time ikiwiki builds a
+> > > srcdir. So, no need to worry about this any more! --[[Joey]]
+> >
+> > I would suggest that there should at least be a `--getmtime` option like you describe, and perhaps that
+> > `--getctime` and `--getmtime` be _on by default_. In my opinion the creation times and modification times of
+> > pages in ikiwiki are part of the user's content and are important to protect, because the user may be relying
+> > on them, especially if they use blogging or lists of recently modified pages, etc. Right now the modification
+> > times can be lost permanently.
+> >
+> > Is there a typo in the description of `--getctime` in the man page?
+> >
+> > > --getctime
+> > > Pull **last changed time** for each new page out of the revision
+> > > control system. This rarely used option provides a way to get
+> > > the real creation times of items in weblogs, such as when build‐
+> > > ing a wiki from a new Subversion checkout. It is unoptimised and
+> > > quite slow. It is best used with --rebuild, to force ikiwiki to
+> > > get the ctime for all pages.
+> >
+> > Surely it is not the _last changed time_ but the _first seen time_ of each page that is pulled out of the VCS?
+> > If the aim is to get the real creation times of items in weblogs, then the last times that the items were
+> > changed in the VCS is not going to help. -- [[seanh]]
+>>> Typo, fixed. --[[Joey]]
+
+> > > If you want to preserve the date of a page, the best way to do it is to
+> > > use [[ikiwiki/directive/meta]] date="foo". This will survive checkouts,
+> > > VCS migrations, etc. -- [[Jon]]
+> > >
+> > > > That's a good tip Jon. That would also survive renaming a page by renaming its mdwn file, which would
+> > > > normally lose the times also. (And in that case I think both times are irretrievable, even by
+> > > > `--getctime`). I might start using a simple script to make blog posts that creates a file for
+> > > > me, puts today's date in the file as a meta, and opens the file in my editor. -- [[seanh]]
+
+>>>>> I use a script that does that and also sets up templates and tags
+>>>>> for a new item:
+
+ #!/bin/sh
+ set -u
+ set -e
+
+ if [ $# -ne 1 ]; then
+ echo usage: $0 pagename >&2
+ exit 1
+ fi
+
+ pagename="$1"
+
+ if [ -e "$pagename" ]; then
+ echo error: "$pagename" exists >&2
+ exit 1
+ fi
+
+ date=$(date)
+ echo '\[[!template id=draft]]' >> "$pagename"
+ echo "\[[!meta date=\"$date\"]]" >> "$pagename"
+ echo "\[[!tag draft]]" >> "$pagename"
+ git add "$pagename"
+ $EDITOR "$pagename"
+
+>>>>> -- [[Jon]]
+
+> A quick workaround for me to get modification times right is the following
+> little zsh script, which unfortunately only works for git:
+
+>> Elided; no longer needed since --gettime does that, and much faster! --[[Joey]]
+
+> --[[David_Riebenbauer]]
diff --git a/doc/forum/How_is_TITLE_evaluated_in_inline_archive_templates__63__.mdwn b/doc/forum/How_is_TITLE_evaluated_in_inline_archive_templates__63__.mdwn
new file mode 100644
index 000000000..fc84fabbd
--- /dev/null
+++ b/doc/forum/How_is_TITLE_evaluated_in_inline_archive_templates__63__.mdwn
@@ -0,0 +1,11 @@
+Hi,
+
+I'm wondering how TITLE is evaluated in inline archive templates?
+
+Needless to say, I don't know much perl except the code that looks similar to other languages like bash.
+
+I found this line:
+
+$template->param(title => pagetitle(basename($page)));
+
+It seems to return a page name (pagetitle having no effect). Or maybe I'm not testing this the right way!
diff --git a/doc/forum/How_long_does_server_delay_newly_pushed_revisions__63__.mdwn b/doc/forum/How_long_does_server_delay_newly_pushed_revisions__63__.mdwn
new file mode 100644
index 000000000..dea95c285
--- /dev/null
+++ b/doc/forum/How_long_does_server_delay_newly_pushed_revisions__63__.mdwn
@@ -0,0 +1,10 @@
+So I
+
+1. checked out my Ikiwiki instance's repository, added a new page, and edited existing pages in the working copy
+2. then added and committed the changes: `git add .` and `git commit -m "update"`
+3. then pushed them to the server: `git push`
+
+I then went to the RecentChanges URL of the Ikiwiki site in my browser. I didn't find the new revision, nor can I open the new pages.
+Is there a way to verify my revision did go in? How long does the server delay processing it?
+
+
diff --git a/doc/forum/How_to_add_a_mouse-over_pop-up_label_for_a_text__63__.mdwn b/doc/forum/How_to_add_a_mouse-over_pop-up_label_for_a_text__63__.mdwn
new file mode 100644
index 000000000..cf9245404
--- /dev/null
+++ b/doc/forum/How_to_add_a_mouse-over_pop-up_label_for_a_text__63__.mdwn
@@ -0,0 +1,8 @@
+How to add a mouse-over pop-up label for a text?
+
+I'd like to have the following effect:
+
+when a user moves the mouse pointer over the text, a small window shows up at the upper right of the text containing some additional information about it.
+
+Any idea how to achieve this?
+
diff --git a/doc/forum/How_to_add_additional_links_to_the_gray_horizontable_bar_under_page_title__63__.mdwn b/doc/forum/How_to_add_additional_links_to_the_gray_horizontable_bar_under_page_title__63__.mdwn
new file mode 100644
index 000000000..537ba6161
--- /dev/null
+++ b/doc/forum/How_to_add_additional_links_to_the_gray_horizontable_bar_under_page_title__63__.mdwn
@@ -0,0 +1,3 @@
+How to add additional links to the gray horizontal bar under the page title?
+
+By the gray horizontal bar I mean the one that shows "Edit RecentChanges Preferences". I want to disable those three but add some other links to other pages on my website.
diff --git a/doc/forum/How_to_add_additional_links_to_the_gray_horizontable_bar_under_page_title__63__/comment_1_f2e52d38f60888c7d5142de853123540._comment b/doc/forum/How_to_add_additional_links_to_the_gray_horizontable_bar_under_page_title__63__/comment_1_f2e52d38f60888c7d5142de853123540._comment
new file mode 100644
index 000000000..1bb3fa7cb
--- /dev/null
+++ b/doc/forum/How_to_add_additional_links_to_the_gray_horizontable_bar_under_page_title__63__/comment_1_f2e52d38f60888c7d5142de853123540._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawmb2CAGUNU_Kx6YSImD2ox6MtjuaM6Jp1E"
+ nickname="Nicolas"
+ subject="comment 1"
+ date="2012-11-09T08:34:42Z"
+ content="""
+You could use custom templates and hardcode those links there.
+"""]]
diff --git a/doc/forum/How_to_add_link_to_previous_and_next_blog_on_blog_pages__63__.mdwn b/doc/forum/How_to_add_link_to_previous_and_next_blog_on_blog_pages__63__.mdwn
new file mode 100644
index 000000000..3f771e777
--- /dev/null
+++ b/doc/forum/How_to_add_link_to_previous_and_next_blog_on_blog_pages__63__.mdwn
@@ -0,0 +1,14 @@
+In a Ikiwiki instance, I made a subdirectory to hold blog pages
+
+ /website
+ blog.mdwn
+ ...
+
+ /website/blog
+ /website/blog/blog1.mdwn
+ /website/blog/blog2.mdwn
+ ...
+
+On blog.mdwn, readers by default see the last 10 posts, but it seems there is no link to blog pages older than that. What's a good way to make the titles of older blog pages also visible to readers on the blog front page `blog.mdwn`?
+
+On any individual blog page such as `blog1.mdwn`, there is no link to the blog pages immediately before and after it. How can I make such links?
diff --git a/doc/forum/How_to_add_link_to_previous_and_next_blog_on_blog_pages__63__/comment_1_aad510f45be505efaabcb6fb860665a4._comment b/doc/forum/How_to_add_link_to_previous_and_next_blog_on_blog_pages__63__/comment_1_aad510f45be505efaabcb6fb860665a4._comment
new file mode 100644
index 000000000..ce99b8475
--- /dev/null
+++ b/doc/forum/How_to_add_link_to_previous_and_next_blog_on_blog_pages__63__/comment_1_aad510f45be505efaabcb6fb860665a4._comment
@@ -0,0 +1,23 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="use trail=yes, and an extra inline with archive=yes"
+ date="2012-08-04T17:22:18Z"
+ content="""
+To get just the titles of older posts, you want an inline with
+`archive=\"yes\"`, probably one that skips the same number of posts
+displayed in full:
+
+ [[!inline pages=\"blog/* and !*/Discussion\"
+ skip=\"10\" feeds=\"no\" archive=\"yes\"]]
+
+To get 'next' and 'previous' links on each post, use a recent
+IkiWiki version, enable the [[plugins/trail]] plugin and add
+`trail=\"yes\"` to your main inline:
+
+ [[!inline pages=\"blog/* and !*/Discussion\"
+ show=\"10\" trail=\"yes\"]]
+
+For instance see
+[my blog](http://git.pseudorandom.co.uk/pseudorandom.co.uk/smcv.git/blob/83e9a713d77778b58460ed04f6c48665d817f3cd:/index.mdwn).
+"""]]
diff --git a/doc/forum/How_to_add_link_to_previous_and_next_blog_on_blog_pages__63__/comment_2_ee65792a5b796caa216f4e7a653fc668._comment b/doc/forum/How_to_add_link_to_previous_and_next_blog_on_blog_pages__63__/comment_2_ee65792a5b796caa216f4e7a653fc668._comment
new file mode 100644
index 000000000..5546c4789
--- /dev/null
+++ b/doc/forum/How_to_add_link_to_previous_and_next_blog_on_blog_pages__63__/comment_2_ee65792a5b796caa216f4e7a653fc668._comment
@@ -0,0 +1,23 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawmKyeW2G4jjSdnL1m6kPPtAiGFUBsnYCfY"
+ nickname="FName"
+ subject="How to install the trail package?"
+ date="2012-08-09T00:11:26Z"
+ content="""
+I tried putting
+
+    trail
+
+in
+
+ add_plugins => [qw{
+ ...
+ trail
+ ...
+ }]
+
+in site.setup and rebuilt. It gives the error
+
+    Failed to load plugin IkiWiki::Plugin::trail: Can't locate IkiWiki/Plugin/trail.pm in @INC ...
+
+"""]]
diff --git a/doc/forum/How_to_allow_.markdown_and_.md_at_the_same_time_as_valid_extensions_for_source_files__63__.mdwn b/doc/forum/How_to_allow_.markdown_and_.md_at_the_same_time_as_valid_extensions_for_source_files__63__.mdwn
new file mode 100644
index 000000000..d5f144957
--- /dev/null
+++ b/doc/forum/How_to_allow_.markdown_and_.md_at_the_same_time_as_valid_extensions_for_source_files__63__.mdwn
@@ -0,0 +1 @@
+How to allow .markdown and .md (at the same time) as valid extensions for source files? The default is .mdwn.
diff --git a/doc/forum/How_to_apply_a_background_color_to_a_page__63__.mdwn b/doc/forum/How_to_apply_a_background_color_to_a_page__63__.mdwn
new file mode 100644
index 000000000..5beba1258
--- /dev/null
+++ b/doc/forum/How_to_apply_a_background_color_to_a_page__63__.mdwn
@@ -0,0 +1 @@
+I want one page to use gray background color (full page, not just background of text). How?
diff --git a/doc/forum/How_to_change_registration_page.mdwn b/doc/forum/How_to_change_registration_page.mdwn
new file mode 100644
index 000000000..f339f71c4
--- /dev/null
+++ b/doc/forum/How_to_change_registration_page.mdwn
@@ -0,0 +1,9 @@
+Well, I simply don't see it.
+I would like to change the "account registration" page, where it says user, password, repeat password, Account Creation Password, E-Mail.
+
+I simply want it to ask a question like "Who's your daddy" or "What are we all working on" instead of "Account creation password".
+
+I already grepped through the source files which I compiled ikiwiki from - I just can't find it. I'm a noob at CGI; it seems to be somewhere in there, but that could also be totally wrong.
+
+Can you tell me where to look?
+
diff --git a/doc/forum/How_to_change_registration_page/comment_1_43758a232e4360561bc84f710862ff40._comment b/doc/forum/How_to_change_registration_page/comment_1_43758a232e4360561bc84f710862ff40._comment
new file mode 100644
index 000000000..5edd993d7
--- /dev/null
+++ b/doc/forum/How_to_change_registration_page/comment_1_43758a232e4360561bc84f710862ff40._comment
@@ -0,0 +1,14 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2012-01-30T19:30:20Z"
+ content="""
+Sure.. You're looking for the file `IkiWiki/Plugin/passwordauth.pm`
+
+This line in particular is the text that gets modified and displayed to the user.
+
+<pre>
+ name => \"account_creation_password\",
+</pre>
+"""]]
diff --git a/doc/forum/How_to_change_registration_page/comment_2_8176ef231cf901802fc60b6d414018e6._comment b/doc/forum/How_to_change_registration_page/comment_2_8176ef231cf901802fc60b6d414018e6._comment
new file mode 100644
index 000000000..8e67162e1
--- /dev/null
+++ b/doc/forum/How_to_change_registration_page/comment_2_8176ef231cf901802fc60b6d414018e6._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawndsaC4GaIBw49WNdbk2Faqfm_mrtQgul8"
+ nickname="Christian"
+ subject="comment 2"
+ date="2012-02-29T06:58:26Z"
+ content="""
+thanks!
+"""]]
diff --git a/doc/forum/How_to_configure_po_plugin__63__.mdwn b/doc/forum/How_to_configure_po_plugin__63__.mdwn
new file mode 100644
index 000000000..a03358ddd
--- /dev/null
+++ b/doc/forum/How_to_configure_po_plugin__63__.mdwn
@@ -0,0 +1,21 @@
+I put
+
+ # po plugin
+ po_master_language => 'en|English',
+
+ po_slave_languages => [ 'zh|Chinese']
+
+
+in page.setup. And did
+
+ $ikiwiki --setup ./page.setup
+
+but get errors:
+
+ Can't use string ("en|English") as a HASH ref while "strict refs" in use at /usr/share/perl5/IkiWiki/Plugin/po.pm line 162.
+
+Line 162 of the file reads
+
+ delete $config{po_slave_languages}{$config{po_master_language}{code}};;
+
+What's wrong? How to fix it?
diff --git a/doc/forum/How_to_configure_po_plugin__63__/comment_1_5e0cc4cdfd126f2f4af64104f02102d6._comment b/doc/forum/How_to_configure_po_plugin__63__/comment_1_5e0cc4cdfd126f2f4af64104f02102d6._comment
new file mode 100644
index 000000000..fc194b3e6
--- /dev/null
+++ b/doc/forum/How_to_configure_po_plugin__63__/comment_1_5e0cc4cdfd126f2f4af64104f02102d6._comment
@@ -0,0 +1,9 @@
+[[!comment format=mdwn
+ username="intrigeri"
+ ip="88.80.28.70"
+ subject="Please upgrade"
+ date="2011-12-20T09:34:56Z"
+ content="""
+You seem to be using an older version of ikiwiki. Either use a hash ref (see [old documentation](http://source.ikiwiki.branchable.com/?p=source.git;a=blobdiff;f=doc/plugins/po.mdwn;h=c36414c8e85f5bb11e2c3a7c3bd41e829abe15a6;hp=53327c1dae94ef5896119cc874133a9a3c1a9b4e;hb=862fc7c1ab1f7d709561bcb02fc8ede57b90a51b;hpb=e50df5ea7601b6a1fc03994650ddff728aba7b57)) or upgrade.
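+
+With the older hash-based syntax, the same configuration would look something like this (adapted from the old documentation linked above):
+
+    po_master_language => { 'code' => 'en', 'name' => 'English' },
+    po_slave_languages => { 'zh' => 'Chinese' },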
+
+"""]]
diff --git a/doc/forum/How_to_create_a_WikiLink_to_a_page_in_a_subdirectory__63__.mdwn b/doc/forum/How_to_create_a_WikiLink_to_a_page_in_a_subdirectory__63__.mdwn
new file mode 100644
index 000000000..75d98fea1
--- /dev/null
+++ b/doc/forum/How_to_create_a_WikiLink_to_a_page_in_a_subdirectory__63__.mdwn
@@ -0,0 +1,26 @@
+How to create a WikiLink to a page in a subdirectory?
+
+I have a page I want to create a WikiLink to, at
+
+
+ website/subdir/page.mdwn
+
+
+And the wikilink I want to create should be in a page in the website root directory:
+
+
+ website/index.mdwn
+
+
+If I just write
+
+ \[[page]]
+
+it seems it will assume the page should be found at
+
+
+ website/page.mdwn
+
+and adds a question mark ? in front of the link in the generated HTML file.
+
+How to make such a Wikilink?
diff --git a/doc/forum/How_to_create_a_WikiLink_to_a_page_in_a_subdirectory__63__/comment_1_d20ee1d8d7a3e77a445f8b887e807119._comment b/doc/forum/How_to_create_a_WikiLink_to_a_page_in_a_subdirectory__63__/comment_1_d20ee1d8d7a3e77a445f8b887e807119._comment
new file mode 100644
index 000000000..524852fad
--- /dev/null
+++ b/doc/forum/How_to_create_a_WikiLink_to_a_page_in_a_subdirectory__63__/comment_1_d20ee1d8d7a3e77a445f8b887e807119._comment
@@ -0,0 +1,11 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2012-06-27T07:31:30Z"
+ content="""
+`\[[subdir/page]]`.
+
+IkiWiki terminology is that a page in a subdirectory is a \"subpage\".
+See [[ikiwiki/SubPage]] and [[ikiwiki/subpage/LinkingRules]] for more details.
+"""]]
diff --git a/doc/forum/How_to_create_first_translation_page_using_po_plugin__63__.mdwn b/doc/forum/How_to_create_first_translation_page_using_po_plugin__63__.mdwn
new file mode 100644
index 000000000..e9dd6654b
--- /dev/null
+++ b/doc/forum/How_to_create_first_translation_page_using_po_plugin__63__.mdwn
@@ -0,0 +1,24 @@
+I followed instructions at
+
+ http://ikiwiki.info/plugins/po/
+
+and added to `configfile`
+
+ po_master_language => 'en|English',
+ po_slave_languages => [ 'zh|Chinese' ],
+ po_translatable_pages => '(* and !*/Discussion and !blog/*/comment_*)',
+ po_link_to => 'current'
+
+and did
+
+ ikiwiki --setup configfile
+
+But I don't seem to see any change in the newly built site.
+
+How do I actually use po to create translation pages?
+
+1) I have existing pages that are in English. How do I add translated versions of some of those pages in the slave language?
+
+2) How do I add new pages with the primary language version and alternative versions in slave languages?
+
+The po documentation is not explicit about the concrete steps.
diff --git a/doc/forum/How_to_customize_page_title__63__.mdwn b/doc/forum/How_to_customize_page_title__63__.mdwn
new file mode 100644
index 000000000..1e5be83b8
--- /dev/null
+++ b/doc/forum/How_to_customize_page_title__63__.mdwn
@@ -0,0 +1,6 @@
+The title of a page typically looks like
+
+ `home/ dir1`
+    `home/ dir1/ dir2`
+
+I want to customize the `/` to something else. What's the proper way of doing it?
diff --git a/doc/forum/How_to_customize_page_title__63__/comment_1_403e1f866b5e04e5899021f54bbdd1ed._comment b/doc/forum/How_to_customize_page_title__63__/comment_1_403e1f866b5e04e5899021f54bbdd1ed._comment
new file mode 100644
index 000000000..e8d79038b
--- /dev/null
+++ b/doc/forum/How_to_customize_page_title__63__/comment_1_403e1f866b5e04e5899021f54bbdd1ed._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-05-22T22:23:15Z"
+ content="""
+The only way to do that, other than some css3 hacks, is to edit `page.tmpl`. Which is not really recommended as then you get to maintain the customized version.
+
+More usual is to use a directive to set the actual page title displayed at the end: \[[!meta title=\"foo\"]]
+"""]]
diff --git a/doc/forum/How_to_fix___34__does_not_map_to_Unicode__34___errors__63__.mdwn b/doc/forum/How_to_fix___34__does_not_map_to_Unicode__34___errors__63__.mdwn
new file mode 100644
index 000000000..20a4ad022
--- /dev/null
+++ b/doc/forum/How_to_fix___34__does_not_map_to_Unicode__34___errors__63__.mdwn
@@ -0,0 +1,20 @@
+I'm getting a number of errors like this when running ikiwiki:
+
+ utf8 "\xA2" does not map to Unicode at /usr/local/share/perl/5.10.0/IkiWiki.pm line 739, <$in> chunk 1.
+
+I think it's because some of my files contain non-utf8, non-unicode, or otherwise bad characters, probably fancy quotes and the like copy-and-pasted from my web browser. The problem is that I have hundreds of files (I transferred them all over from pyblosxom to ikiwiki at once), and the error message doesn't tell me which file the error comes from. How can I fix this?
+
+Thanks
+-- seanh
+
+> Unfortunately, these messages are logged by perl so there's no way to add
+> a filename to them.
+>
+> If you run the build in --verbose mode, you should see which page ikiwiki
+> is working on, and unless it inlines some other page, you can be pretty
+> sure that page contains invalid utf-8 if the message is then printed.
+>
+> Another option is to use the `isutf8` program from
+> [moreutils](http://kitenet.net/~joey/code/moreutils/),
+> and run it on each file; it will tell you the line number
+> and character position that is invalid. --[[Joey]]
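+>
+> For example, to check every source page in one pass (assuming moreutils is installed):
+>
+>     find . -name '*.mdwn' -exec isutf8 {} +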
diff --git a/doc/forum/How_to_format___91____91__foobar__93____93___in_code_blocks__63__.mdwn b/doc/forum/How_to_format___91____91__foobar__93____93___in_code_blocks__63__.mdwn
new file mode 100644
index 000000000..e26ef8d12
--- /dev/null
+++ b/doc/forum/How_to_format___91____91__foobar__93____93___in_code_blocks__63__.mdwn
@@ -0,0 +1,9 @@
+Normally if I indent by 4 spaces, the text will format as computer code, for example,
+
+ int main () {
+ ...
+ }
+
+but it doesn't work for [ [ foobar ] ] (remove the spaces between the square brackets). I want it to show the literal text:
+
+ [[foobar]]
diff --git a/doc/forum/How_to_format___91____91__foobar__93____93___in_code_blocks__63__/comment_1_ad000d39fd1dc05aa8ef6eb19d8d999b._comment b/doc/forum/How_to_format___91____91__foobar__93____93___in_code_blocks__63__/comment_1_ad000d39fd1dc05aa8ef6eb19d8d999b._comment
new file mode 100644
index 000000000..aa26b086b
--- /dev/null
+++ b/doc/forum/How_to_format___91____91__foobar__93____93___in_code_blocks__63__/comment_1_ad000d39fd1dc05aa8ef6eb19d8d999b._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://acathur.myopenid.com/"
+ ip="31.56.6.65"
+ subject="comment 1"
+ date="2012-02-26T08:08:15Z"
+ content="""
+doing `\\[[foobar]]` works for me
+"""]]
diff --git a/doc/forum/How_to_generate_blog_archive_pages_in___47__blog_subdir_but_not_ikiwiki_root_path__63__.mdwn b/doc/forum/How_to_generate_blog_archive_pages_in___47__blog_subdir_but_not_ikiwiki_root_path__63__.mdwn
new file mode 100644
index 000000000..e4901d78b
--- /dev/null
+++ b/doc/forum/How_to_generate_blog_archive_pages_in___47__blog_subdir_but_not_ikiwiki_root_path__63__.mdwn
@@ -0,0 +1,26 @@
+My Ikiwiki instance and blog have the structure
+
+ /website
+ blog.mdwn
+ ...
+
+ /website/blog
+ /website/blog/blog1.mdwn
+ /website/blog/blog2.mdwn
+ ...
+
+After running
+
+ ikiwiki-calendar ./foo.setup "/blog/*"
+
+I got a
+
+ /website/archives/
+
+and archive pages in it.
+
+I'd like to have it in
+
+ /website/blog/archives/
+
+as those are archive pages for blog pages only. How to do it?
diff --git a/doc/forum/How_to_inline_a_page_from_another_git_repository.mdwn b/doc/forum/How_to_inline_a_page_from_another_git_repository.mdwn
new file mode 100644
index 000000000..528a48b6e
--- /dev/null
+++ b/doc/forum/How_to_inline_a_page_from_another_git_repository.mdwn
@@ -0,0 +1,5 @@
+I am migrating a dev site which was previously using Trac.
+
+Some of the wiki pages include RST documents from the code repository. What would be the best way to do this using ikiwiki? I would prefer not to include the full code repository in ikiwiki src as there are many source files I do not want to be processed by ikiwiki.
+
+A possible solution would be something like underlay, but for which only explicitly named files would be processed by ikiwiki.
diff --git a/doc/forum/How_to_list_all_pages_in_a_wiki_instance__63__.mdwn b/doc/forum/How_to_list_all_pages_in_a_wiki_instance__63__.mdwn
new file mode 100644
index 000000000..dcc0b15c6
--- /dev/null
+++ b/doc/forum/How_to_list_all_pages_in_a_wiki_instance__63__.mdwn
@@ -0,0 +1,5 @@
+How to list all pages in a wiki instance?
+
+I basically need an index of all existing pages.
diff --git a/doc/forum/How_to_list_all_pages_in_a_wiki_instance__63__/comment_1_920bcc70fe6d081cf27aa2cc7c6136f4._comment b/doc/forum/How_to_list_all_pages_in_a_wiki_instance__63__/comment_1_920bcc70fe6d081cf27aa2cc7c6136f4._comment
new file mode 100644
index 000000000..748f9f04c
--- /dev/null
+++ b/doc/forum/How_to_list_all_pages_in_a_wiki_instance__63__/comment_1_920bcc70fe6d081cf27aa2cc7c6136f4._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="202.173.183.92"
+ subject="comment 1"
+ date="2011-08-17T23:36:56Z"
+ content="""
+Use the [[plugins/map]] plugin.
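+
+For example, putting something like this on a page should produce a tree of all pages:
+
+    \[[!map pages=\"* and !*/Discussion\"]]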
+"""]]
diff --git a/doc/forum/How_to_list_new_pages__44___inline__63__.mdwn b/doc/forum/How_to_list_new_pages__44___inline__63__.mdwn
new file mode 100644
index 000000000..f28e8b99b
--- /dev/null
+++ b/doc/forum/How_to_list_new_pages__44___inline__63__.mdwn
@@ -0,0 +1,5 @@
+Hi, I'd love to include a "New posts" list into my front page, like at <http://danhixon.github.com/> for example.
+
+It should be different from recent changes in that it shouldn't show modifications of existing pages, and in that it would be inside a page with other content.
+
+Thanks, Thomas
diff --git a/doc/forum/How_to_list_new_pages__44___inline__63__/comment_1_e989b18bade34a92a9c8fe7099036e15._comment b/doc/forum/How_to_list_new_pages__44___inline__63__/comment_1_e989b18bade34a92a9c8fe7099036e15._comment
new file mode 100644
index 000000000..cf6f642d4
--- /dev/null
+++ b/doc/forum/How_to_list_new_pages__44___inline__63__/comment_1_e989b18bade34a92a9c8fe7099036e15._comment
@@ -0,0 +1,13 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="use an inline directive"
+ date="2010-11-29T20:39:37Z"
+ content="""
+This is what the [[ikiwiki/directive/inline]] directive is for. It's often used, for example, to show new posts to a blog. If you want to show new posts anywhere in your site, or whatever, you can configure the [[ikiwiki/PageSpec]] in it to do that, too.
+
+For example, you could use this:
+
+ The most recent 3 pages added to this site:
+    \[[!inline pages=\"*\" archive=yes show=3]]
+"""]]
diff --git a/doc/forum/How_to_lock_all_pages_except_discussions_and_comments_to_blog_posts__63__.mdwn b/doc/forum/How_to_lock_all_pages_except_discussions_and_comments_to_blog_posts__63__.mdwn
new file mode 100644
index 000000000..18ff8f1f5
--- /dev/null
+++ b/doc/forum/How_to_lock_all_pages_except_discussions_and_comments_to_blog_posts__63__.mdwn
@@ -0,0 +1,11 @@
+I enabled the `comments` and `lockedit` plugins. I put blog posts in `~/foobar.com/static/blog/` and other pages in `~/foobar.com/static/`. I want to
+
+* allow Wiki-style `Discussion` pages for my pages outside of `blog`
+* allow blog-style comments on my blog posts
+* lock all pages that are not comments or discussions
+
+How to achieve this?
+
+Is `locked_pages => '!blog/comment_* and !*/Discussion'` for the `lockedit` plugin right?
+
+
diff --git a/doc/forum/How_to_lock_all_pages_except_discussions_and_comments_to_blog_posts__63__/comment_1_e153beb17b6ada69c6ab09d1f491d112._comment b/doc/forum/How_to_lock_all_pages_except_discussions_and_comments_to_blog_posts__63__/comment_1_e153beb17b6ada69c6ab09d1f491d112._comment
new file mode 100644
index 000000000..21abe8d83
--- /dev/null
+++ b/doc/forum/How_to_lock_all_pages_except_discussions_and_comments_to_blog_posts__63__/comment_1_e153beb17b6ada69c6ab09d1f491d112._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-05-22T17:23:55Z"
+ content="""
+Almost.. `'* and !*/Discussion and !postcomment(blog/*)'`
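+
+That is, in the setup file, something like:
+
+    locked_pages => '* and !*/Discussion and !postcomment(blog/*)',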
+"""]]
diff --git a/doc/forum/How_to_make_a_table_of_content_at_the_top_of_page__63__.mdwn b/doc/forum/How_to_make_a_table_of_content_at_the_top_of_page__63__.mdwn
new file mode 100644
index 000000000..b185e3b61
--- /dev/null
+++ b/doc/forum/How_to_make_a_table_of_content_at_the_top_of_page__63__.mdwn
@@ -0,0 +1,3 @@
+How do I make a table of contents at the top of a page?
+
+Ideally, this should be programmable; for example, the table of contents could be generated automatically whenever the page length exceeds a certain configurable threshold.
diff --git a/doc/forum/How_to_make_a_table_of_content_at_the_top_of_page__63__/comment_1_6dedc31dd1145490bb5fa4ad14cc4c63._comment b/doc/forum/How_to_make_a_table_of_content_at_the_top_of_page__63__/comment_1_6dedc31dd1145490bb5fa4ad14cc4c63._comment
new file mode 100644
index 000000000..49d25ed02
--- /dev/null
+++ b/doc/forum/How_to_make_a_table_of_content_at_the_top_of_page__63__/comment_1_6dedc31dd1145490bb5fa4ad14cc4c63._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnxp2XU8gIribhhGhGuYtU6eMMwHv5gUGI"
+ nickname="Amitai"
+ subject="comment 1"
+ date="2011-06-05T17:11:40Z"
+ content="""
+To insert one where you want it, use the [[ikiwiki/directive/toc]] directive.
+"""]]
diff --git a/doc/forum/How_to_migrate_multiple_pages_in_root_directory_into_a_new_subdirectory__63__.mdwn b/doc/forum/How_to_migrate_multiple_pages_in_root_directory_into_a_new_subdirectory__63__.mdwn
new file mode 100644
index 000000000..c7c2b72a7
--- /dev/null
+++ b/doc/forum/How_to_migrate_multiple_pages_in_root_directory_into_a_new_subdirectory__63__.mdwn
@@ -0,0 +1,10 @@
+How do I migrate multiple pages in the root directory into a new subdirectory?
+
+I have multiple pages in the root directory of my website:
+
+ $ pwd
+ /home/doss/public_html
+ $ ls
+    a.mdwn b.mdwn section.mdwn subdir
+
+What is the procedure for migrating a.mdwn and b.mdwn into subdir while keeping links to them from other pages, such as section.mdwn, working?
diff --git a/doc/forum/How_to_migrate_multiple_pages_in_root_directory_into_a_new_subdirectory__63__/comment_1_a83a1a33afbf245971733b4128809365._comment b/doc/forum/How_to_migrate_multiple_pages_in_root_directory_into_a_new_subdirectory__63__/comment_1_a83a1a33afbf245971733b4128809365._comment
new file mode 100644
index 000000000..1935c7081
--- /dev/null
+++ b/doc/forum/How_to_migrate_multiple_pages_in_root_directory_into_a_new_subdirectory__63__/comment_1_a83a1a33afbf245971733b4128809365._comment
@@ -0,0 +1,15 @@
+[[!comment format=mdwn
+ username="http://jmtd.livejournal.com/"
+ ip="94.65.111.123"
+ subject="comment 1"
+ date="2011-08-22T12:53:03Z"
+ content="""
+If you use the [[plugins/rename]] plugin and go page-by-page, it will update backlinks for you.
+
+If you want to move a lot of files, such that it is impractical to do them one at a time, you could write a script that:
+
+ * moves each page to the new location
+ * replaces the old page with one containing a [[plugins/meta]] redirect to the new location.
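+
+A rough sketch of such a script (untested; the page names and `subdir` here are just this example's, and it assumes a git-backed wiki):
+
+    #!/bin/sh
+    # move each page into subdir, leaving behind a \[[!meta redir]] stub
+    for page in a b; do
+        git mv $page.mdwn subdir/$page.mdwn
+        echo '\[[!meta redir=subdir/'$page']]' > $page.mdwn
+        git add $page.mdwn
+    done
+    git commit -m 'move pages into subdir'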
+
+-- [[Jon]]
+"""]]
diff --git a/doc/forum/How_to_set_the_meta_author_field_from_user_name__63__.mdwn b/doc/forum/How_to_set_the_meta_author_field_from_user_name__63__.mdwn
new file mode 100644
index 000000000..e0d6829fc
--- /dev/null
+++ b/doc/forum/How_to_set_the_meta_author_field_from_user_name__63__.mdwn
@@ -0,0 +1,3 @@
+Several users will post to the same blog. I would like the meta "author" field to be set to their ikiwiki username automatically and attached to their posts such that they cannot alter it. I imagine one could use the \<TMPL_VAR USER> variable in the "inlinepage" template, but this variable does not seem to be set. How can I accomplish this?
+
+Related question: is there a way to see all the variables which are set and their value?
diff --git a/doc/forum/How_to_set_the_meta_author_field_from_user_name__63__/comment_1_0906e1f3eb8b826a7730233b95cb5ddd._comment b/doc/forum/How_to_set_the_meta_author_field_from_user_name__63__/comment_1_0906e1f3eb8b826a7730233b95cb5ddd._comment
new file mode 100644
index 000000000..2d7c02a8e
--- /dev/null
+++ b/doc/forum/How_to_set_the_meta_author_field_from_user_name__63__/comment_1_0906e1f3eb8b826a7730233b95cb5ddd._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnNqLKszWk9EoD4CDCqNXJRIklKFBCN1Ao"
+ nickname="maurizio"
+ subject="comment 1"
+ date="2012-06-22T20:33:59Z"
+ content="""
+This does not seem to be possible directly according to the discussion here: [[Possible to use meta variables in templates?]].
+
+The solution I chose in the end was to set up a template which prepares the meta and suggests that the author fill it in with their user name. Maybe the best way would actually be to create one blog per author, define a template for each author (based on the pagespec and on locked_pages to constrain authors to write only on their own blog), and then an inline page which includes all the individual blogs.
+"""]]
diff --git a/doc/forum/How_to_set_up_a_page_as_internal__63__.mdwn b/doc/forum/How_to_set_up_a_page_as_internal__63__.mdwn
new file mode 100644
index 000000000..ed771292d
--- /dev/null
+++ b/doc/forum/How_to_set_up_a_page_as_internal__63__.mdwn
@@ -0,0 +1,5 @@
+I have a folder with a few administrative pages, like a page showing the comments waiting for moderation.
+
+There is no link from the "normal/public" pages in my site to these administrative pages, but that doesn't prevent anyone from typing the address into the browser address bar or finding it in their browser history.
+
+Please, is there any way or best practice for restricting access to this kind of page? I know that these pages will eventually ask the user to log in as admin, but I don't want others to see this stuff.
diff --git a/doc/forum/How_to_set_up_git_repository_hook___63__.mdwn b/doc/forum/How_to_set_up_git_repository_hook___63__.mdwn
new file mode 100644
index 000000000..34bc4ace2
--- /dev/null
+++ b/doc/forum/How_to_set_up_git_repository_hook___63__.mdwn
@@ -0,0 +1,19 @@
+Hi,
+
+I want to set up hooks for Git, and I don't know how to. Is there any documentation somewhere? Basically, I'd like to do what [[/ikiwiki-makerepo]] does, but manually.
+
+Why? Because I want to have a special layout to my repository. Especially, I want to include at the root level some special files:
+
+- the nginx configuration
+- the script that installs the nginx configuration to the system
+- the script that starts the fast-cgi wrapper
+- the `ikiwiki.setup` file
+- ...
+
+And I want the ikiwiki sources to be in a subdirectory `src/` and the generated files in `out/` (where the nginx configuration points).
+
+So, what is the special `post-update` hook generated by [[/ikiwiki-makerepo]]? I noticed it was an ELF file, why not a script? What does it do?
+
+Thanks,
+
+Mildred
diff --git a/doc/forum/How_to_show_recent_changes_for_individual_pages__63__.mdwn b/doc/forum/How_to_show_recent_changes_for_individual_pages__63__.mdwn
new file mode 100644
index 000000000..890f24a33
--- /dev/null
+++ b/doc/forum/How_to_show_recent_changes_for_individual_pages__63__.mdwn
@@ -0,0 +1 @@
+The "RecentChanges" link shown under page titles on an individual page goes to the revision history for the whole site, where changes to every file in the wiki are listed. Is there a way to get that for an individual page?
diff --git a/doc/forum/How_to_show_recent_changes_for_individual_pages__63__/comment_1_cd34affc6883f4e4bc5e7e7b711cc8ba._comment b/doc/forum/How_to_show_recent_changes_for_individual_pages__63__/comment_1_cd34affc6883f4e4bc5e7e7b711cc8ba._comment
new file mode 100644
index 000000000..8ed341c09
--- /dev/null
+++ b/doc/forum/How_to_show_recent_changes_for_individual_pages__63__/comment_1_cd34affc6883f4e4bc5e7e7b711cc8ba._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2012-03-03T15:01:34Z"
+ content="""
+The RecentChanges page is a wiki page like any other, containing a special inline directive. You can copy that, and modify the inline's [[ikiwiki/PageSpec]] to match pages changed by a specific author, or with a specific title.
+
+If you want separate change logs for *every* page, install gitweb and configure historyurl. There will then be a \"History\" link going to the gitweb for each page.
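+
+For example, a page of recent changes by a particular author might use something like this (adapted from the stock recentchanges page; the author name is illustrative, and `author()` needs the meta plugin):
+
+    \[[!inline pages=\"internal(recentchanges/change_*) and author(joey)\"
+    template=recentchanges show=0]]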
+"""]]
diff --git a/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__.mdwn b/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__.mdwn
new file mode 100644
index 000000000..ad8f27252
--- /dev/null
+++ b/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__.mdwn
@@ -0,0 +1,3 @@
+I'm new to ikiwiki and I'm trying to install it and set it up. I've read the documentation but I still don't understand how access to the repository works. We want ikiwiki to run on one machine but we want the repository to be on a separate machine running svn. How can I configure ikiwiki to access the repository on the remote machine? And how is authentication on the remote host handled in ikiwiki? Does there have to be a one-to-one correspondence between account names (and passwords) on the ikiwiki machine and the accounts on the svn machine? Thanks,
+
+Eric
diff --git a/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__/comment_1_0c71e17ae552cbab1056ac96fbd36c59._comment b/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__/comment_1_0c71e17ae552cbab1056ac96fbd36c59._comment
new file mode 100644
index 000000000..954ef0810
--- /dev/null
+++ b/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__/comment_1_0c71e17ae552cbab1056ac96fbd36c59._comment
@@ -0,0 +1,9 @@
+[[!comment format=mdwn
+ username="http://adam.shand.net/"
+ nickname="Adam"
+ subject="Depending ..."
+ date="2011-01-25T03:00:12Z"
+ content="""
+... on exactly what you are trying to do, you may find some answers [[here|forum/how_to_setup_ikiwiki_on_a_remote_host/]].
+
+"""]]
diff --git a/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__/comment_2_b309302a084fbd8bcd4cd9bd2509cf5a._comment b/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__/comment_2_b309302a084fbd8bcd4cd9bd2509cf5a._comment
new file mode 100644
index 000000000..4ceb69474
--- /dev/null
+++ b/doc/forum/How_to_specify_repository_is_on_a_remote_host__63__/comment_2_b309302a084fbd8bcd4cd9bd2509cf5a._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="here's the scoop .. but, don't do it"
+ date="2011-01-25T18:30:01Z"
+ content="""
+To do what you describe, you would set up the svn repository on your server, and then do a regular svn checkout of it to the machine running ikiwiki, and configure ikiwiki to use that directory as its srcdir. The only unix user ikiwiki does anything as is the one you use to set it up, so it's up to you to allow that user to commit to svn without needing to enter a password.
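+
+A rough sketch of that setup (the hostnames and paths here are made up):
+
+    # on the machine running ikiwiki, as the user ikiwiki runs as:
+    svn checkout svn+ssh://svnserver/path/to/repo/trunk ~/wikiwc
+    # then in the setup file:
+    #   rcs => 'svn',
+    #   srcdir => '/home/you/wikiwc',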
+
+However, I don't recommend this configuration at all. You're adding ssh (or webdav) connection overhead to every edit of the wiki, since ikiwiki's commit to svn will have to be pushed across the network to the server. And ikiwiki's svn support is missing many of the [[newer_ikiwiki_features|rcs]], such as support for easily reverting edits.
+"""]]
diff --git a/doc/forum/How_to_style_main_sidebar_and_SubPage_sidebar_differently_using_CSS__63__.mdwn b/doc/forum/How_to_style_main_sidebar_and_SubPage_sidebar_differently_using_CSS__63__.mdwn
new file mode 100644
index 000000000..09cf01435
--- /dev/null
+++ b/doc/forum/How_to_style_main_sidebar_and_SubPage_sidebar_differently_using_CSS__63__.mdwn
@@ -0,0 +1,13 @@
+How do I style the main sidebar and a SubPage sidebar differently using CSS?
+
+I have a main sidebar
+
+ /sidebar.mdwn
+
+and a SubPage sidebar
+
+ /blog/sidebar.mdwn
+
+How can I style the two differently using CSS?
+
+For example, I'd like the sidebar shown on any page inside /blog to have a blue border, and any page outside of /blog to have a sidebar with a red border.
diff --git a/doc/forum/How_to_use___126____47__bin__47__multimarkdown_instead_of_Text::MultiMarkdown.mdwn b/doc/forum/How_to_use___126____47__bin__47__multimarkdown_instead_of_Text::MultiMarkdown.mdwn
new file mode 100644
index 000000000..5dadb600d
--- /dev/null
+++ b/doc/forum/How_to_use___126____47__bin__47__multimarkdown_instead_of_Text::MultiMarkdown.mdwn
@@ -0,0 +1,5 @@
+[[!meta author=tjgolubi]]
+
+I want to configure IkiWiki to use Fletcher Penney's recent version of [[MultiMarkdown|http://fletcherpenney.net/multimarkdown]] instead of the default perl implementation.
+I don't know perl, but I think I need to replace the mdwn.pm plugin with something that uses "open2".
+Please help me get started. -- [[tjgolubi]]
diff --git a/doc/forum/How_to_use_number_as_bullet_labels_but_not_letter_in_toc_plugin.mdwn b/doc/forum/How_to_use_number_as_bullet_labels_but_not_letter_in_toc_plugin.mdwn
new file mode 100644
index 000000000..dd77e9441
--- /dev/null
+++ b/doc/forum/How_to_use_number_as_bullet_labels_but_not_letter_in_toc_plugin.mdwn
@@ -0,0 +1,8 @@
+I am using the toc plugin, and it gives me
+
+ 1. Section1
+ 2. Section2
+ a. subsection21
+ b. subsection22
+
+How do I make a. and b. display as 2.1 and 2.2, respectively?
diff --git a/doc/forum/Howto_add_tag_from_plugin_code.mdwn b/doc/forum/Howto_add_tag_from_plugin_code.mdwn
new file mode 100644
index 000000000..a17faf727
--- /dev/null
+++ b/doc/forum/Howto_add_tag_from_plugin_code.mdwn
@@ -0,0 +1,12 @@
+Hi, I want to add tags to some pages automatically (I am generating album images
+and want to tag all the generated pages). I managed to do so in the following way:
+
+ IkiWiki::Plugin::tag::preprocess_tag(
+ page => $viewer,
+ destpage => $params{destpage},
+ map { ($_ => 1) } @tags,
+ );
+
+This works; however, if some tag does not exist, it is not created. I tracked it down far enough to find that Render.pm's gen_autofile() method is not called, so most likely I need to somehow trigger Render.pm's refresh()... but how can I do that?
+
+BTW: the code is a modified album plugin, available in [my git](https://github.com/llipavsky/ikiwiki)
diff --git a/doc/forum/Howto_add_tag_from_plugin_code/comment_1_c61454825874a6fe1905cb549386deb0._comment b/doc/forum/Howto_add_tag_from_plugin_code/comment_1_c61454825874a6fe1905cb549386deb0._comment
new file mode 100644
index 000000000..2122083ec
--- /dev/null
+++ b/doc/forum/Howto_add_tag_from_plugin_code/comment_1_c61454825874a6fe1905cb549386deb0._comment
@@ -0,0 +1,77 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2013-07-01T09:13:51Z"
+ content="""
+(If you want to branch from my version of album, please add my git repo
+as a remote and merge or cherry-pick the album4 branch: pasting from my
+gitweb seems to have given you some incorrect UTF-8.)
+
+The problem you have here is that for this plugin, the correct order
+for IkiWiki to do things is quite subtle. Am I right in thinking that
+the feature you want goes something like this?
+
+> To add a set of tags to every \"viewer\" page in the album, you can add
+> the tags parameter to the album:
+>
+> \[[!album tags=\"holiday hawaii\"]]
+>
+> The individual viewers will all be tagged \"holiday\" and \"hawaii\",
+> for instance. These tags cannot be removed by editing the viewers.
+
+`preprocess_albumimage` runs twice: once in the scan stage, and once in
+the render stage. In the render stage, it's too late to add tags, because
+tags are a special form of [[ikiwiki/wikilinks]], and wikilinks have to
+be added during the scan stage to work correctly.
+
+The part of `preprocess_albumimage` after the line
+`return unless defined wantarray;` only runs in the render stage, which
+is too late. You'd need to set up the tags further up: just after the
+calls to `IkiWiki::Plugin::meta::preprocess` would be a good place.
+
+I would also suggest checking for
+`IkiWiki::Plugin::tag->can('preprocess_tag')`,
+like I do for meta - if you do that, you won't need to force the tag plugin
+to be loaded.
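+
+For instance, the guarded call might look something like this (a sketch based on the code quoted above, not tested):
+
+    if (IkiWiki::Plugin::tag->can('preprocess_tag')) {
+        # only add the tags if the tag plugin is enabled
+        IkiWiki::Plugin::tag::preprocess_tag(
+            page => $viewer,
+            destpage => $params{destpage},
+            map { ($_ => 1) } @tags,
+        );
+    }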
+
+Unfortunately, I'm still not sure that this is fully correct. Pages
+are scanned in a random order. If the `\[[!album]]` is scanned before
+a \"viewer\" page, then everything is fine: the tags are present when
+the \"viewer\" is scanned. However, if the \"viewer\" is scanned first,
+then it will get the tags that the `\[[!album]]` had in the *previous*
+IkiWiki run (if any), which are still in the index, because the
+`\[[!album]]` hasn't been re-scanned yet...
+
+Are you sure this form of the feature is what you want? You'll end up with
+a *lot* of pages with those tags. If it's what you want, it might be
+clearer how it works if you changed the syntax to something like this,
+perhaps?
+
+>
+> \[[!album tag_all=\"holiday hawaii\"]]
+
+Another possible syntax would be to have the feature be more like this:
+
+> If you use the `tag_default` parameter to the `\[[!album]]` directive,
+> each \"viewer\" page created for images will have those tags by
+> default. Changing the `\[[!album]]` will not affect any \"viewer\"
+> pages that have already been created, and editing the \"viewer\"
+> can add or remove those default tags.
+>
+> \[[!album tag_default=\"holiday hawaii\"]]
+
+which I think removes the ordering problems? If you go this route,
+you'd want to either add e.g. `[[!tag holiday hawaii]]`
+to the generated viewer page in `create_viewer`, or add a `tag`
+parameter to `\[[!albumimage]]` that's a shortcut for the
+tag directive, in the same way that author is a shortcut for
+`[[!meta author]]`.
+
+The purpose of the \"shortcut\" parameters in `\[[!albumimage]]`,
+like title, author and date, is that I eventually want to add
+a specialized CGI interface to this plugin so you can edit
+all the images of an album in one go; when I add that,
+it'll probably only be able to process something as
+machine-readable as `\[[!albumimage]]`.
+"""]]
diff --git a/doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__.mdwn b/doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__.mdwn
new file mode 100644
index 000000000..e23d3fddf
--- /dev/null
+++ b/doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__.mdwn
@@ -0,0 +1,14 @@
+Since ikiwiki doesn't have much of a chance of working on Windows, how about we compromise by making an offline ikiwiki editor for Windows? In fact, it might be advantageous to use it on Linux, too...
+
+It should be very simple: it would open the source wiki and show the Markdown source by default, but would have an option to preview your page in another tab.
+
+Basic features:
+
+* wikilinks, maps, images, inlinepages, and other basic functions should all work in the preview
+* perhaps use local.css to format preview
+* See the DVCS history with diffs and all
+* have a discussion tab to easily see what other people have said about the page
+
+If we want to add some more bells and whistles, maybe we could throw in some buttons to insert Markdown formatting (like in forums, MediaWiki, or RES).
+
+Any thoughts on this?
diff --git a/doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_1_a66fd9d7ab4359784a5420cd899a1057._comment b/doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_1_a66fd9d7ab4359784a5420cd899a1057._comment
new file mode 100644
index 000000000..20fd763e2
--- /dev/null
+++ b/doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_1_a66fd9d7ab4359784a5420cd899a1057._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="202.173.183.92"
+ subject="comment 1"
+ date="2012-01-13T22:32:47Z"
+ content="""
+It would probably be quite complex to write, and difficult to maintain. I don't think much of your chances of getting someone to write it. If you want to write it yourself, have fun doing so!
+"""]]
diff --git a/doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_2_3351ff773fea3f640f4036bb8c7c7efd._comment b/doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_2_3351ff773fea3f640f4036bb8c7c7efd._comment
new file mode 100644
index 000000000..b83042c36
--- /dev/null
+++ b/doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_2_3351ff773fea3f640f4036bb8c7c7efd._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawkr8GVPw30JBR34Btg-SKcS8gxEf7zpSJQ"
+ nickname="Lawrence"
+ subject="comment 2"
+ date="2012-01-14T03:14:38Z"
+ content="""
+Eh, ok, lol. I know that implementing most of the wiki features over again could be difficult, and so would a Git diff reader, but it shouldn't be that hard to get wikilinking or a Markdown previewer working.
+
+Could you point out some specific problems with this approach, to help me get started?
+"""]]
diff --git a/doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_3_273b2b63a9af2bc4eeb030e026436687._comment b/doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_3_273b2b63a9af2bc4eeb030e026436687._comment
new file mode 100644
index 000000000..e5eaf2c4c
--- /dev/null
+++ b/doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_3_273b2b63a9af2bc4eeb030e026436687._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawkr8GVPw30JBR34Btg-SKcS8gxEf7zpSJQ"
+ nickname="Lawrence"
+ subject="comment 3"
+ date="2012-01-14T17:41:52Z"
+ content="""
+Like, there's already a whole host of Markdown previewer apps that are pretty good. [Here's a popular one](http://www.macworld.com/article/164744/2012/01/marked_excels_at_previewing_markdown_and_html_documents.html) on Mac, and there are many more...
+
+There's also a plugin for Emacs that does so, and even resolves wikilinks (in some way..).
+
+But I have to say that I probably chose a misleading title; WYSIWYG would be low on the list of needed features. I'm just dumping an idea here in case anyone has suggestions or comments; I'll probably do it myself in my free time.
+"""]]
diff --git a/doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_4_546771c13ea1b550301586e187d82cb5._comment b/doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_4_546771c13ea1b550301586e187d82cb5._comment
new file mode 100644
index 000000000..472417457
--- /dev/null
+++ b/doc/forum/If_there__39__s_no_Windows_ikiwiki__44___how_about_a_WYSIWYG_ikiwiki_editor_for_Windows__63__/comment_4_546771c13ea1b550301586e187d82cb5._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawndsaC4GaIBw49WNdbk2Faqfm_mrtQgul8"
+ nickname="Christian"
+ subject="just my 2 cents"
+ date="2012-01-17T11:10:09Z"
+ content="""
+why?
+"""]]
diff --git a/doc/forum/Ikiwiki_CGI_not_working_on_my_server__44___and_it__39__s_a_binary_file__63__.mdwn b/doc/forum/Ikiwiki_CGI_not_working_on_my_server__44___and_it__39__s_a_binary_file__63__.mdwn
new file mode 100644
index 000000000..35db20dc8
--- /dev/null
+++ b/doc/forum/Ikiwiki_CGI_not_working_on_my_server__44___and_it__39__s_a_binary_file__63__.mdwn
@@ -0,0 +1,33 @@
+Hey, I'm trying to get ikiwiki working on my account on a shared webserver. Actually installing ikiwiki on the server is phase 2. For now I'm running the latest ikiwiki (from source) locally, compiling the output with the ikiwiki command, then rsyncing the output dir up to the server. This works for the static HTML files, but the CGI file doesn't work; the server redirects to an error page. The error log on the server says "Premature end of script headers: /path/to/ikiwiki.cgi".
+
+My first thought was that this is a Perl CGI and I would need to change the shebang to point to the unusual location of Perl on this server, it's at /usr/pkg/bin/perl. But when I looked at ikiwiki.cgi I found it was a binary file.
+
+Why is it a binary? And what can I do about this error?
+
+> It's a binary because it's SUID, so that it has permission to write to the ikiwiki repository. See [[security]], under 'suid wrappers', for more on that.
+>
+> As to why you get 'premature end of script headers', that suggests there is a problem running
+> the script (and there is output occurring before the HTTP headers are printed). Do you have access
+> to the webserver logs for your host? They might contain some clues. Are you sure that the webserver
+> is setup for CGI properly? -- [[Jon]]
+
+> Quite likely your laptop and your server do not run the same
+> OS, so the wrapper binary cannot just be copied from one
+> to the other and run. Also, the wrapper is just that, a
+> thin wrapper which then runs ikiwiki. As ikiwiki is not
+> yet installed on your server, that's another reason why
+> what you're trying can't work.
+>
+> If installing ikiwiki on the server is not possible or
+> too much work right now, you could try building your wiki
+> on your laptop with cgi disabled in the setup file.
+> The result would be a static website that you could deploy to
+> the server this way. Of course, it wouldn't be editable
+> on the server, and other features that need the CGI would
+> also be disabled. --[[Joey]]
+
+> > Ah, ok thanks. Yes the server runs a different OS and ikiwiki
+> > is not installed on it. I've got it working as a static site,
+> > so if I want the CGI I'll have to install ikiwiki on the server.
+> > Ok. It might not work as I don't have root access, but I might
+> > give it a try. Thanks
diff --git a/doc/forum/Ikiwiki_themes_for_mobile_devices__63__.mdwn b/doc/forum/Ikiwiki_themes_for_mobile_devices__63__.mdwn
new file mode 100644
index 000000000..dc1b31cd8
--- /dev/null
+++ b/doc/forum/Ikiwiki_themes_for_mobile_devices__63__.mdwn
@@ -0,0 +1,7 @@
+Has anyone else created ikiwiki themes for mobile devices like phones and tablets?
+
+I've been using the Blueview theme for a few years and finally tried to adapt it for my phone.
+My local.css is [here](http://mcfrisk.kapsi.fi/local.css), and my hobby web page full of images and videos [here](http://mcfrisk.kapsi.fi/skiing/).
+
+Previously I also had problems like wasted screen space, a big minimum width, and images not being scaled down to fit the CSS element. Those got fixed as well.
+It would be nice if others could test this and maybe share their setups.
diff --git a/doc/forum/Include_attachment_in_a_page.mdwn b/doc/forum/Include_attachment_in_a_page.mdwn
new file mode 100644
index 000000000..e4a5a53ec
--- /dev/null
+++ b/doc/forum/Include_attachment_in_a_page.mdwn
@@ -0,0 +1,9 @@
+Is there any way of embedding an attachment in a page? For example, when I upload a picture, I would like to have it shown on the page. I tried the Markdown image syntax like this:
+
+ ![Alt text](/path/to/img.jpg "Optional title")
+
+But if you upload a PDF, for example, there will be a "broken URL/no image" thumbnail, although the link to the uploaded file works correctly if entered correctly.
+
+So if it's just an image I want to embed, then I guess I would want to use HTML syntax directly. (As stated [[here|http://daringfireball.net/projects/markdown/syntax#img]].) But in case it's some other type of attachment, it would be nice to have some specific include syntax or the like.
+
+Probably a feature for the "attachment" plugin's wishlist!?
diff --git a/doc/forum/Include_attachment_in_a_page/comment_1_275aad6ca3b2972749b7f6636b130035._comment b/doc/forum/Include_attachment_in_a_page/comment_1_275aad6ca3b2972749b7f6636b130035._comment
new file mode 100644
index 000000000..a6f995626
--- /dev/null
+++ b/doc/forum/Include_attachment_in_a_page/comment_1_275aad6ca3b2972749b7f6636b130035._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="http://jmtd.net/"
+ nickname="Jon Dowland"
+ subject="comment 1"
+ date="2012-04-02T12:51:33Z"
+ content="""
+Forgive me if I don't fully understand the question, but:
+
+ * the attachment functionality includes a button \"Insert Links\" which, on the edit form for a page, inserts the correct markup to link to the attachment, which addresses the general case
+
+ * For images which you want inline, you could convert the basic wikilink e.g. `\[[foo.png]]` into a call to the [[plugins/img]] plugin e.g. `\[[!img foo.png]]`
+"""]]
diff --git a/doc/forum/Is_it_possible_to_change_default_mdwn_suffix__63__.mdwn b/doc/forum/Is_it_possible_to_change_default_mdwn_suffix__63__.mdwn
new file mode 100644
index 000000000..6a1059d48
--- /dev/null
+++ b/doc/forum/Is_it_possible_to_change_default_mdwn_suffix__63__.mdwn
@@ -0,0 +1 @@
+I usually use .md as a Markdown file extension. I know other people use .txt. Is there any way to modify the ikiwiki setup to accept this as the suffix for Markdown pages (instead of .mdwn)?
diff --git a/doc/forum/Is_it_possible_to_change_default_mdwn_suffix__63__/comment_1_2a449c6017ecdb4f557963266fb4ec41._comment b/doc/forum/Is_it_possible_to_change_default_mdwn_suffix__63__/comment_1_2a449c6017ecdb4f557963266fb4ec41._comment
new file mode 100644
index 000000000..da6377607
--- /dev/null
+++ b/doc/forum/Is_it_possible_to_change_default_mdwn_suffix__63__/comment_1_2a449c6017ecdb4f557963266fb4ec41._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-05-20T14:12:38Z"
+ content="""
+It's fairly easy to make a copy of the mdwn plugin and s/mdwn/foo/ in it and get what you want. But I don't see value in providing this option in ikiwiki as it just reduces interoperability. Not all options are good options, and this would be a bad one.
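+
+The renamed copy would mostly just register its hooks under the new extension; roughly like this (a sketch based on the stock mdwn plugin, not tested):
+
+    package IkiWiki::Plugin::md;
+    # otherwise a verbatim copy of mdwn.pm, with mdwn renamed to md
+    sub import {
+        hook(type => \"getsetup\", id => \"md\", call => \&getsetup);
+        hook(type => \"htmlize\", id => \"md\", call => \&htmlize);
+    }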
+"""]]
diff --git a/doc/forum/Is_there_a_pagespec_for_creation_dates_relative_to_today__63__.mdwn b/doc/forum/Is_there_a_pagespec_for_creation_dates_relative_to_today__63__.mdwn
new file mode 100644
index 000000000..53c70e50a
--- /dev/null
+++ b/doc/forum/Is_there_a_pagespec_for_creation_dates_relative_to_today__63__.mdwn
@@ -0,0 +1,26 @@
+Dear users,
+
+
+using the inline directive, I want to show all pages (for example, named 2008.10.2:foo.mdwn or 2009.12.3:bar.mdwn) whose date in the title is in the future; in this example, only the second one.
+
+I did not find a directive doing this in [[/ikiwiki/PageSpec]].
+
+Does somebody have an idea? I came up with using a tag “recent” or using a separate folder, but either would be quite some work to maintain.
+
+
+Thanks,
+
+Paul
+
+> There's no such pagespec, and doing one is difficult, because such a
+> pagespec will change what it matches over time. So ikiwiki would have to
+> somehow figure out that pages matched by it yesterday no longer match,
+> and that pages containing the pagespec need to be rebuilt. Which means
+> you'd also need a cron job.
+
+>> Thank you for the explanation.
+
+> I suspect what you're trying to accomplish is
+> [[todo/tagging_with_a_publication_date]]? --[[Joey]]
+
+>> Yeah, something like that. Thanks! --[[PaulePanter]]
diff --git a/doc/forum/LaTeX_Error.mdwn b/doc/forum/LaTeX_Error.mdwn
new file mode 100644
index 000000000..587baec6c
--- /dev/null
+++ b/doc/forum/LaTeX_Error.mdwn
@@ -0,0 +1,66 @@
+Greetings.
+
+I put this code on one page:
+[[!teximg code="\frac{1}{5}" alt="1/5"]]
+
+This is the configuration file, ikiwiki.info:
+
+add_plugins => [qw{sidebar goodstuff textile html htmlscrubber table pagetemplate teximg map meta anonok img version textile txt}]
+
+*Here is the log:*
+
+    This is pdfTeXk, Version 3.141592-1.40.3 (Web2C 7.5.6) (format=latex 2008.8.4) 5 AUG 2008 10:01
+    entering extended mode
+     %&-line parsing enabled.
+    **/tmp/fb7742f8dd0c66473643ba40592e2be2.SBQfJo94ii/fb7742f8dd0c66473643ba40592e
+    2be2.tex
+
+    (/tmp/fb7742f8dd0c66473643ba40592e2be2.SBQfJo94ii/fb7742f8dd0c66473643ba40592e2
+    be2.tex
+
+    [...]
+
+    Package scrkbase Info: You've used the obsolete option `12pt'.
+    (scrkbase) \KOMAoptions{fontsize=12pt} will be used instead.
+    (scrkbase) You should do this change too on input line 594.
+
+    [...]
+
+    ! LaTeX Error: File `mhchem.sty' not found.
+
+    Type X to quit or <RETURN> to proceed,
+    or enter new name. (Default extension: sty)
+
+    Enter file name:
+    ! Emergency stop.
+    <read *>
+
+    l.1 ...l}\usepackage[version=3]{mhchem}\usepackage
+
+     {amsmath}\usepackage{amsfo...
+
+    (cannot \read from terminal in nonstop modes)
+
+
+    Here is how much of TeX's memory you used:
+     761 strings out of 94074
+     10268 string characters out of 1167096
+     66007 words of memory out of 1500000
+     4120 multiletter control sequences out of 10000+50000
+     3938 words of font info for 15 fonts, out of 1200000 for 2000
+     645 hyphenation exceptions out of 8191
+     30i,1n,28p,410b,45s stack positions out of 5000i,500n,6000p,200000b,10000s
+    No pages of output.
+
+
+Any ideas?
+
+>> It looks like teximg uses some less standard LaTeX packages (see line 100 of IkiWiki/Plugin/teximg.pm in the ikiwiki source).
+>> A quick workaround for an end user would be to install the 'mhchem' LaTeX package (look on [CTAN](http://www.ctan.org/)).
+>> A medium-term workaround would be to replace 'scrartcl' on line 100 with 'article', and delete line 101 in the teximg source.
+>> Longer term, it would be nice to give teximg a configurable preamble.
+>> Hrm - maybe that configurable preamble should be a [[todo]]? -- [[users/Will]]
+
+>>> Yes, it works. Thanks. I am writing some example code.
+
+
diff --git a/doc/forum/Last_visited_pages.mdwn b/doc/forum/Last_visited_pages.mdwn
new file mode 100644
index 000000000..ae87cf177
--- /dev/null
+++ b/doc/forum/Last_visited_pages.mdwn
@@ -0,0 +1 @@
+Is it possible to add a list of the n last visited pages at the bottom of each ikiwiki page (or maybe in a sidebar)?
diff --git a/doc/forum/Last_visited_pages/comment_1_e34650064dd645b35da98e80c0311df9._comment b/doc/forum/Last_visited_pages/comment_1_e34650064dd645b35da98e80c0311df9._comment
new file mode 100644
index 000000000..c5e2cc8ad
--- /dev/null
+++ b/doc/forum/Last_visited_pages/comment_1_e34650064dd645b35da98e80c0311df9._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="202.173.183.92"
+ subject="comment 1"
+ date="2011-11-27T11:09:55Z"
+ content="""
+Only if you do it with JavaScript or SSI; remember, IkiWiki is a wiki *compiler* - all the pages are generated beforehand, and their content stays the same no matter what your visitors do.
+"""]]
diff --git a/doc/forum/Last_visited_pages/comment_2_2a0c4e844da1deaa2c286e87c8eab84d._comment b/doc/forum/Last_visited_pages/comment_2_2a0c4e844da1deaa2c286e87c8eab84d._comment
new file mode 100644
index 000000000..c5e90752d
--- /dev/null
+++ b/doc/forum/Last_visited_pages/comment_2_2a0c4e844da1deaa2c286e87c8eab84d._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk_MMtLPS7osC5MjX00q2ATjvvXPWqm0ik"
+ nickname="micheal"
+ subject="comment 2"
+ date="2011-11-27T12:23:31Z"
+ content="""
+How would I do it with JavaScript or SSI?
+"""]]
diff --git a/doc/forum/Link_to_a_local_pdf_file.mdwn b/doc/forum/Link_to_a_local_pdf_file.mdwn
new file mode 100644
index 000000000..61d6829a0
--- /dev/null
+++ b/doc/forum/Link_to_a_local_pdf_file.mdwn
@@ -0,0 +1 @@
+How can I make a link to a local file (for example a pdf file) in ikiwiki?
diff --git a/doc/forum/Link_to_a_local_pdf_file/comment_1_b6c57588042373f8e1f187041c1a8530._comment b/doc/forum/Link_to_a_local_pdf_file/comment_1_b6c57588042373f8e1f187041c1a8530._comment
new file mode 100644
index 000000000..b8dc275f5
--- /dev/null
+++ b/doc/forum/Link_to_a_local_pdf_file/comment_1_b6c57588042373f8e1f187041c1a8530._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-08-05T21:09:21Z"
+ content="""
+Typically this is done using a standard markdown or HTML link.
+"""]]
diff --git a/doc/forum/Log_in_error.mdwn b/doc/forum/Log_in_error.mdwn
new file mode 100644
index 000000000..b9281f90f
--- /dev/null
+++ b/doc/forum/Log_in_error.mdwn
@@ -0,0 +1,5 @@
+When I log in to my ikiwiki instance using Google, it issues the error
+
+    url_fetch_error: Error fetching URL: Internal Server Error
+
+and then fails. How can I fix this?
diff --git a/doc/forum/Log_in_error/comment_1_0ef13ea01a413160d81951636c15c3e6._comment b/doc/forum/Log_in_error/comment_1_0ef13ea01a413160d81951636c15c3e6._comment
new file mode 100644
index 000000000..df36f9e72
--- /dev/null
+++ b/doc/forum/Log_in_error/comment_1_0ef13ea01a413160d81951636c15c3e6._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-08-05T20:54:02Z"
+ content="""
+This error suggests that ikiwiki's attempt to contact google to check your openid is failing.
+
+What version of [[!cpan Openid::Consumer]] is that? Version 1.03 has a slightly different error message; perhaps you have an old version that is somehow broken.
+"""]]
diff --git a/doc/forum/Map_Plugin__44___would_like_to_add___63__updated_to_all_links.mdwn b/doc/forum/Map_Plugin__44___would_like_to_add___63__updated_to_all_links.mdwn
new file mode 100644
index 000000000..fcffe690f
--- /dev/null
+++ b/doc/forum/Map_Plugin__44___would_like_to_add___63__updated_to_all_links.mdwn
@@ -0,0 +1,3 @@
+Using the map plugin, I would like to add ?updated to all links it creates.
+
+When I edit a page and then click that page in a map in a sidebar, Safari always shows me a cached page.
diff --git a/doc/forum/Map_Plugin__44___would_like_to_add___63__updated_to_all_links/comment_1_3fe4c5967e704355f9b594aed46baf67._comment b/doc/forum/Map_Plugin__44___would_like_to_add___63__updated_to_all_links/comment_1_3fe4c5967e704355f9b594aed46baf67._comment
new file mode 100644
index 000000000..ce1a78584
--- /dev/null
+++ b/doc/forum/Map_Plugin__44___would_like_to_add___63__updated_to_all_links/comment_1_3fe4c5967e704355f9b594aed46baf67._comment
@@ -0,0 +1,13 @@
+[[!comment format=mdwn
+ username="justint"
+ ip="24.182.207.250"
+ subject="skip it"
+ date="2010-10-13T05:30:50Z"
+ content="""
+Skip it; I added
+
+    <meta http-equiv=\"expires\" content=\"Thu, 16 Mar 2000 11:00:00 GMT\" />
+    <meta http-equiv=\"pragma\" content=\"no-cache\" />
+
+to my page.tmpl and the problem went away.
+"""]]
diff --git a/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated.mdwn b/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated.mdwn
new file mode 100644
index 000000000..b659212b6
--- /dev/null
+++ b/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated.mdwn
@@ -0,0 +1,27 @@
+I copied ikiwiki from the old host to the new one. The old host is Debian GNU/Linux 5.0.7 with ikiwiki 3.1415926~bpo50+1;
+the new host is 5.0.8 with ikiwiki 3.20100815~bpo50+1.
+
+I have ikiwiki.setup and both git repos from the old host, src and src.git; the latter is a bare repo.
+
+I suspect I have messed things up on the old host, since the src directory tree is much larger than src.git:
+
+    tale@tugelbend:~/wiki$ du -sh src src.git/
+    8,3M    src
+    2,6M    src.git/
+
+If I clone src.git to the new host, after running ikiwiki --setup I get web pages from the year 2009. So I did the migration like this:
+
+    # copy the src directory to a temp dir on the new host, then:
+    git clone --bare /tmp/Foo/src ~/wiki/wiki.git
+    cd ~/wiki
+    git clone wiki.git wiki.src
+    cd ..
+    ikiwiki --setup ikiwiki.setup
+
+I believe I have modified ikiwiki.setup correctly: I get no error messages, and it builds the web pages with the
+same content as on the old host. But when I git clone wiki.git to a working copy for myself, edit it, and run git commit -a; git push,
+I am sad to see the web page is not updated.
+
+How can I see what is wrong? The hook seems OK:
+
+    taleman@porixi:~/wiki$ ls -lh wiki.git/hooks/post-update
+    -rwsr-sr-x 1 taleman taleman 14K 23.12. 17:42 wiki.git/hooks/post-update
+
+ikiwiki --setup created that and did not report any errors.
diff --git a/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_2_b44a492c7f10395a31f3c0830ef33f0c._comment b/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_2_b44a492c7f10395a31f3c0830ef33f0c._comment
new file mode 100644
index 000000000..576d11f0b
--- /dev/null
+++ b/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_2_b44a492c7f10395a31f3c0830ef33f0c._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 2"
+ date="2011-12-25T00:07:28Z"
+ content="""
+Try running the post-update hook by hand, and see whether it pulls the changes into ~/wiki/wiki.src and updates the site.
+
+Make sure `gitorigin_branch` is set in the setup file.
+"""]]
diff --git a/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_2_f9240b217b2d1ee8d51dada9cb1186b3._comment b/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_2_f9240b217b2d1ee8d51dada9cb1186b3._comment
new file mode 100644
index 000000000..635aa9340
--- /dev/null
+++ b/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_2_f9240b217b2d1ee8d51dada9cb1186b3._comment
@@ -0,0 +1,28 @@
+[[!comment format=mdwn
+ username="https://launchpad.net/~tale"
+ nickname="tale"
+ subject="comment 2"
+ date="2011-12-27T16:18:31Z"
+ content="""
+On both the old and the new host, gitorigin_branch is an empty string in ikiwiki.setup.
+
+ tale@tugelbend:~$ grep -i gitorigin ikiwiki.setup
+ gitorigin_branch => '',
+
+The branches subdir is empty on both hosts:
+
+ tale@tugelbend:~$ LANG=C ls -lha wiki/src.git/branches/
+ total 8.0K
+ drwxr-xr-x 2 tale tale 4.0K Dec 24 2009 .
+ drwxr-xr-x 7 tale tale 4.0K Dec 24 2009 ..
+ tale@tugelbend:~$
+
+I do not know what value I should assign to gitorigin_branch.
+
+I tried
+
+ wiki.git/hooks/post-update
+
+but nothing seems to happen: no output, and the web page stays the same. File timestamps
+in the dest directory are not updated.
+"""]]
diff --git a/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_3_c3c5c41a4c220793c6d16f3fd6132272._comment b/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_3_c3c5c41a4c220793c6d16f3fd6132272._comment
new file mode 100644
index 000000000..f845f130f
--- /dev/null
+++ b/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_3_c3c5c41a4c220793c6d16f3fd6132272._comment
@@ -0,0 +1,15 @@
+[[!comment format=mdwn
+ username="https://launchpad.net/~tale"
+ nickname="tale"
+ subject="Editing via browser works"
+ date="2011-12-27T18:47:16Z"
+ content="""
+I have now set up a virtual host in apache2 and fudged the IP number to correspond to the hostnames ikiwiki uses. I do not want to update DNS before I have checked that the site works.
+
+Now I can log in using OpenID and edit the wiki via browser. This time the web pages are updated and the changes I made appear in the wiki.
+
+I checked that the gitorigin_branch setting was an empty string even in 2009 when I got this wiki. I begin to suspect that editing the wiki via a git checkout would not work even on the host ikiwiki is currently running on, and maybe did not work on the host it ran on in 2009.
+
+It looks to me like ikiwiki is working on this new host, except that the git configuration is somehow messed up. It is possible I never used a git checkout on the old host.
+
+"""]]
diff --git a/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_4_1f6f9e3939a454c1eb8d2fb29bd519de._comment b/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_4_1f6f9e3939a454c1eb8d2fb29bd519de._comment
new file mode 100644
index 000000000..1a5a91466
--- /dev/null
+++ b/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_4_1f6f9e3939a454c1eb8d2fb29bd519de._comment
@@ -0,0 +1,16 @@
+[[!comment format=mdwn
+ username="https://launchpad.net/~tale"
+ nickname="tale"
+ subject="Checked old host, git messed up there, too"
+ date="2011-12-27T18:59:38Z"
+ content="""
+I did
+
+    git clone src.git
+
+on the host ikiwiki is currently running on, and got an old version of the wiki, not the one that is displayed on the web page.
+
+So even there, editing via browser works, but git checks out an old version, and the changes I make are not shown on the web page after git push.
+
+This new host I set up is thus no worse, and actually slightly better, because git clone at least gets me the sources the web pages are generated from.
+
+How can I figure out what is wrong with the git part of the ikiwiki setup?
+"""]]
diff --git a/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_5_8611fc62797e70a0d2a61d94fcb03170._comment b/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_5_8611fc62797e70a0d2a61d94fcb03170._comment
new file mode 100644
index 000000000..137c198cc
--- /dev/null
+++ b/doc/forum/Migrated_ikiwiki_to_new_host._Web_page_is_not_updated/comment_5_8611fc62797e70a0d2a61d94fcb03170._comment
@@ -0,0 +1,22 @@
+[[!comment format=mdwn
+ username="https://launchpad.net/~tale"
+ nickname="tale"
+ subject="branch = master, still no luck"
+ date="2011-12-28T03:22:57Z"
+ content="""
+I learned from the docs that setting gitorigin_branch to an empty string disables git pushing and pulling. So guessing
+
+    gitorigin_branch => 'master',
+
+seemed a good idea.
+
+However, now ikiwiki -setup gives:
+
+ taleman@porixi:~$ ikiwiki -setup ikiwiki.setup
+ successfully generated /var/www/ikiwiki/debian.fi/ikiwiki.cgi
+ successfully generated /home/taleman/wiki/wiki.git/hooks/post-update
+ fatal: 'master': unable to chdir or not a git archive
+ fatal: The remote end hung up unexpectedly
+ 'git pull master' failed: at /usr/share/perl5/IkiWiki/Plugin/git.pm line 195.
+ taleman@porixi:~$
+"""]]
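Note: despite its name, `gitorigin_branch` is the git *remote* that ikiwiki pulls from and pushes to (default `origin`), while the branch is configured separately as `gitmaster_branch` (default `master`); that is why the failing command above is `git pull master`. Assuming a standard single-branch layout, the relevant setup fragment would be:

```perl
# ikiwiki setup fragment (sketch, assuming the default remote/branch names)
gitorigin_branch => 'origin',   # git remote to pull from and push to
gitmaster_branch => 'master',   # branch the wiki sources live on
```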
diff --git a/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn b/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn
new file mode 100644
index 000000000..d7a33b526
--- /dev/null
+++ b/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__.mdwn
@@ -0,0 +1,54 @@
+How do I set up an old ikiwiki repository on a new system?
+
+I have a git repository from an old ikiwiki system.
+I reformatted that hard drive, but saved the repository.
+
+I copied the repository to my new system, which is now the "master" host.
+I installed ikiwiki on the new system.
+
+How do I set up an ikiwiki system using a pre-existing repository (instead of creating a new one)? --[[JosephTurian]]
+
+> Well, if you have:
+> * A git repository of the wiki
+> * A setup file for the wiki
+>
+> Then you should:
+>
+> 1. Manually set up a bare git repository, and push
+> your backed up repository to it.
+> 2. `git clone` from the bare git repository to
+> recreate the ikiwiki srcdir
+> 3. `git clone` from the bare git repository a second time,
+> to create a checkout you can manually edit (optional)
+>
+> If you preserved your repository, but not the setup file,
+> the easiest way to make one is probably to run
+> `ikiwiki -dumpsetup` and edit the setup file. --[[Joey]]
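The steps above can be sketched as shell commands; the paths here are disposable stand-ins (under a mktemp directory) for the real backup, bare repository, and srcdir locations:

```shell
# Sketch of the recovery steps, using throw-away example paths.
TOP=$(mktemp -d)

# Stand-in for the backed-up wiki repository (one placeholder commit).
git init -q "$TOP/backup"
git -C "$TOP/backup" -c user.name=me -c user.email=me@example.com \
    commit -q --allow-empty -m 'placeholder history'

# 1. Manually set up a bare repository and push the backup into it.
git init -q --bare "$TOP/wiki.git"
git -C "$TOP/backup" push -q "$TOP/wiki.git" HEAD:master

# 2. Clone the bare repository to recreate the ikiwiki srcdir.
git clone -q -b master "$TOP/wiki.git" "$TOP/srcdir"

# 3. Optional second clone for manual editing.
git clone -q -b master "$TOP/wiki.git" "$TOP/checkout"
```

With the clones in place, point `srcdir` in the setup file at the new srcdir clone and run `ikiwiki -setup`.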
+
+> > I get the following errors after running ikiwiki -setup:
+
+ shortcut plugin will not work without shortcuts.mdwn
+ shortcut plugin will not work without shortcuts.mdwn
+ successfully generated /home/turian/public_html/iki/ikiwiki.cgi
+ shortcut plugin will not work without shortcuts.mdwn
+ successfully generated /home/turian/repos/iki.git/hooks/post-update
+ Can't stat /usr/share/ikiwiki/basewiki/../javascript: No such file or directory
+ at /home/turian/utils//lib/perl5/site_perl/5.8.8//IkiWiki/Plugin/autoindex.pm line 60
+ Can't stat /usr/share/ikiwiki/basewiki/../smiley: No such file or directory
+ at /home/turian/utils//lib/perl5/site_perl/5.8.8//IkiWiki/Plugin/autoindex.pm line 60
+ Can't stat /usr/share/ikiwiki/basewiki: No such file or directory
+ at /home/turian/utils//lib/perl5/site_perl/5.8.8//IkiWiki/Plugin/autoindex.pm line 60
+ Can't stat /usr/share/ikiwiki/basewiki/../javascript: No such file or directory
+ at /home/turian/utils//lib/perl5/site_perl/5.8.8//IkiWiki/Render.pm line 320
+ Can't stat /usr/share/ikiwiki/basewiki/../smiley: No such file or directory
+ at /home/turian/utils//lib/perl5/site_perl/5.8.8//IkiWiki/Render.pm line 320
+ Can't stat /usr/share/ikiwiki/basewiki: No such file or directory
+ at /home/turian/utils//lib/perl5/site_perl/5.8.8//IkiWiki/Render.pm line 320
+ internal error: smileys.mdwn cannot be found in /home/turian/iki or underlay
+
+> > How do I resolve these errors? I have my PERL5LIB location set correctly.
+
+>>> Well, that's unrelated to the original question, but
+>>> I guess you should set `underlaydir` in your setup file to
+>>> point to wherever you have installed the basewiki directory.
+>>> --[[Joey]]
diff --git a/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__/comment_1_e5ce524c5d34b1d4218172296bd99100._comment b/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__/comment_1_e5ce524c5d34b1d4218172296bd99100._comment
new file mode 100644
index 000000000..78703bc27
--- /dev/null
+++ b/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__/comment_1_e5ce524c5d34b1d4218172296bd99100._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://launchpad.net/~tale"
+ nickname="tale"
+ subject="When is ikiwiki --setup run?"
+ date="2011-12-23T13:49:54Z"
+ content="""
+Am I supposed to run ikiwiki --setup after all three steps?
+"""]]
diff --git a/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__/comment_3_65c4a4895f6541ff0ff2d094ff447bba._comment b/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__/comment_3_65c4a4895f6541ff0ff2d094ff447bba._comment
new file mode 100644
index 000000000..a9bb2791a
--- /dev/null
+++ b/doc/forum/Migrating_old_repository_to_new_ikiwiki_system__63__/comment_3_65c4a4895f6541ff0ff2d094ff447bba._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 3"
+ date="2011-12-25T00:08:22Z"
+ content="""
+You run ikiwiki -setup when you have all the files in place for ikiwiki and would like it to rebuild the site and generate the wrappers.
+"""]]
diff --git a/doc/forum/Moving_wiki.git_folder__63__.mdwn b/doc/forum/Moving_wiki.git_folder__63__.mdwn
new file mode 100644
index 000000000..77d1da1ee
--- /dev/null
+++ b/doc/forum/Moving_wiki.git_folder__63__.mdwn
@@ -0,0 +1,17 @@
+Hi folks, I created a simple wiki to keep notes and references for projects; it has worked quite nicely so far. I decided to use git as it's what I use daily to manage code, and it's available on all my machines.
+
+Anyway, I wanted to move all the wiki source stuff into a subfolder so that it stops cluttering up my ~ directory. However, there seems to be a problem with moving wiki.git (I moved wiki, wiki.git and wiki.setup) and I'm not sure where to tell ikiwiki that the git directory has been moved. I changed
+
+ srcdir => '/home/pixel/.notebook/wiki',
+ git_wrapper => '/home/pixel/.notebook/wiki.git/hooks/post-update',
+
+and that seems to be fine. However when I go to run ikiwiki --setup things go wrong:
+
+ pixel@tosh: [~ (ruby-1.9.2-p0)] ➔ ikiwiki -setup .notebook/wiki.setup
+ successfully generated /home/pixel/public_html/wiki/ikiwiki.cgi
+ successfully generated /home/pixel/.notebook/wiki.git/hooks/post-update
+ fatal: '/home/pixel/wiki.git' does not appear to be a git repository
+ fatal: The remote end hung up unexpectedly
+ 'git pull origin' failed: at /usr/share/perl5/IkiWiki/Plugin/git.pm line 193.
+
+I've gone through wiki.setup and nothing has jumped out as the place to set this; have I missed something?
diff --git a/doc/forum/Moving_wiki.git_folder__63__/comment_1_05238461520613f4ed1b0d02ece663bd._comment b/doc/forum/Moving_wiki.git_folder__63__/comment_1_05238461520613f4ed1b0d02ece663bd._comment
new file mode 100644
index 000000000..d654591c0
--- /dev/null
+++ b/doc/forum/Moving_wiki.git_folder__63__/comment_1_05238461520613f4ed1b0d02ece663bd._comment
@@ -0,0 +1,11 @@
+[[!comment format=mdwn
+ username="http://users.itk.ppke.hu/~cstamas/openid/"
+ ip="212.183.140.47"
+ subject="comment 1"
+ date="2010-10-27T22:45:28Z"
+ content="""
+I think you want to edit
+
+    .git/config
+
+"""]]
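The same change can be made with git itself instead of editing .git/config by hand; a sketch with throw-away paths standing in for the moved wiki.git and the srcdir:

```shell
# Repoint a clone's origin after its bare repository has been moved.
TOP=$(mktemp -d)
git init -q --bare "$TOP/wiki.git"
git clone -q "$TOP/wiki.git" "$TOP/wiki"       # the srcdir clone
mv "$TOP/wiki.git" "$TOP/notebook.git"         # the bare repository moves
git -C "$TOP/wiki" remote set-url origin "$TOP/notebook.git"
git -C "$TOP/wiki" remote get-url origin       # now reports the new path
```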
diff --git a/doc/forum/Moving_wiki.git_folder__63__/comment_2_72b2b842dfa0cfaf899fe7af12977519._comment b/doc/forum/Moving_wiki.git_folder__63__/comment_2_72b2b842dfa0cfaf899fe7af12977519._comment
new file mode 100644
index 000000000..f2e7ece18
--- /dev/null
+++ b/doc/forum/Moving_wiki.git_folder__63__/comment_2_72b2b842dfa0cfaf899fe7af12977519._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://pixel.dreamwidth.org/"
+ ip="65.29.14.21"
+ subject="comment 2"
+ date="2010-10-28T02:54:15Z"
+ content="""
+That did it thanks!
+
+Should I make some sort of edit to the setup page? I've used git for a while, and for whatever reason it never occurred to me that this message was from git, not from ikiwiki itself.
+"""]]
diff --git a/doc/forum/Multiple_urls.mdwn b/doc/forum/Multiple_urls.mdwn
new file mode 100644
index 000000000..03125d27c
--- /dev/null
+++ b/doc/forum/Multiple_urls.mdwn
@@ -0,0 +1,8 @@
+Hi,
+Is there a way of making a given ikiwiki instance accessible both from the LAN where its server is and from the WAN?
+
+Say I have ikiwiki installed on a server connected to a router. That router has port forwarding and dyndns configured, so I can open ikiwiki from outside the LAN. Opening normal ikiwiki pages from outside the LAN, or via a proxy, works. However, the Edit and Preferences pages, for example, redirect to http://192.168.x.x/~username/ikiwiki/ikiwiki.cgi?page=posts%2Fhello_world&do=edit (in the case of the edit page), which of course only exists inside the LAN, and fails to load.
+
+Editing the "url" and "cgiurl" settings in the .setup file to point to the dyndns address makes it work from outside, but with this configuration I can no longer edit pages from inside the LAN. The normal pages, once again, are accessible; Edit and Preferences, on the other hand, redirect to the public address, which I can't open from inside the same LAN it points to.
+
+For this reason I ask: is there a way to have multiple URLs point to the same ikiwiki page, namely a LAN URL and a public one? Thanks in advance.
diff --git a/doc/forum/Multiple_urls/comment_1_e4c1256346d5a421161c20e344d8bada._comment b/doc/forum/Multiple_urls/comment_1_e4c1256346d5a421161c20e344d8bada._comment
new file mode 100644
index 000000000..9806f5376
--- /dev/null
+++ b/doc/forum/Multiple_urls/comment_1_e4c1256346d5a421161c20e344d8bada._comment
@@ -0,0 +1,22 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="60.241.8.244"
+ subject="A Few Ways To Do This"
+ date="2012-10-09T02:02:09Z"
+ content="""
+I don't think one can alter IkiWiki to have multiple URLs, because the URL is built into the CGI when the CGI is generated. But there are a few ways to work around this:
+
+1. Use the external hostname (say, foo.com) for the URL, and tell your local machine that foo.com has an IP of 192.168.x.x, thus making it accessible from within the LAN.
+2. Give the URL as a relative-absolute URL; that is, rather than \"http://foo.com/ikiwiki.cgi\" give it as \"/ikiwiki.cgi\". This doesn't always work, though.
+3. Build two versions of the site from the same git repo. One for access from inside, and one for access from outside. Both setup files would need to be identical, apart from
+
+ * the destination directory
+ * the URLs
+ * the git-update file name; one would need to call it something other than post-update.
+
+ Then one would make a new \"post-update\" file which calls *both* of the ikiwiki post-update scripts, so that both versions of the site are updated when you make a change.
+ Then set up your web-server to point to the \"external\" directory for the external site, and the \"internal\" directory for the internal site; easy enough to do if you use virtual hosts.
+
+Yes, I know the third one is somewhat complex... I use the idea myself in order to make two versions of a site where one is editable and the other is not, but that's not what you're aiming for, I know.
+
+"""]]
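The combined post-update hook in option 3 above can be sketched as a small dispatcher that runs every renamed ikiwiki wrapper it finds. The hook names and directory below are made up for illustration, with a throw-away directory standing in for wiki.git/hooks:

```shell
# Demonstrate a combined post-update hook that runs every executable
# hook named post-update-* in the hooks directory.
HOOKS=$(mktemp -d)    # stands in for wiki.git/hooks

# The two renamed ikiwiki wrappers would live here; fake them with echoes.
printf '#!/bin/sh\necho rebuilt external site\n' > "$HOOKS/post-update-external"
printf '#!/bin/sh\necho rebuilt internal site\n' > "$HOOKS/post-update-internal"
chmod +x "$HOOKS"/post-update-*

# The real post-update script would contain just this loop:
for hook in "$HOOKS"/post-update-*; do
    [ -x "$hook" ] && "$hook"
done
```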
diff --git a/doc/forum/Need_help_installing_h1title_plugin.mdwn b/doc/forum/Need_help_installing_h1title_plugin.mdwn
new file mode 100644
index 000000000..f6de2fe6f
--- /dev/null
+++ b/doc/forum/Need_help_installing_h1title_plugin.mdwn
@@ -0,0 +1,5 @@
+I am trying to install plugins that are not included in ikiwiki, following the instructions at `http://ikiwiki.info/plugins/install/`. So far I have tried `http://jblevins.org/git/ikiwiki/plugins.git/plain/h1title.pm` and `http://ikiwiki.info/plugins/contrib/default_content_for___42__copyright__42___and___42__license__42__/`. After putting these `.pm` files in `/home/foo/website/libi/IkiWiki/Plugin` and making them executable, I rebuilt the wiki instance, but the plugins aren't working.
+
+Any ideas what might be wrong?
+
+Is there some way to debug?
diff --git a/doc/forum/Need_help_setting_up_ikiwiki_CGI.mdwn b/doc/forum/Need_help_setting_up_ikiwiki_CGI.mdwn
new file mode 100644
index 000000000..66c820c34
--- /dev/null
+++ b/doc/forum/Need_help_setting_up_ikiwiki_CGI.mdwn
@@ -0,0 +1,16 @@
+After installing and setting up ikiwiki on my Debian server, I got the following error when trying to edit the index page in the browser:
+
+ Not Found
+
+ The requested URL /foobar.com/static/ikiwiki.cgi was not found on this server.
+
+ Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
+
+
+My `foobar.setup` has the following for CGI configuration
+
+ cgiurl => 'http:/foobar.com/static/ikiwiki.cgi',
+ cgi_wrapper => '/home/foobaruser/foobar.com/static/ikiwiki.cgi',
+
+
+What could be wrong?
diff --git a/doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_1_0fc4573568711c56a0df4af620110c2f._comment b/doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_1_0fc4573568711c56a0df4af620110c2f._comment
new file mode 100644
index 000000000..3c5be5c6e
--- /dev/null
+++ b/doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_1_0fc4573568711c56a0df4af620110c2f._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-05-20T13:56:51Z"
+ content="""
+Well, your cgiurl is an invalid URL (valid URLs start with \"http://\", not \"http:/\").
+
+Perhaps you also have a misconfigured web server. You don't say if `/home/foobaruser/foobar.com/` is configured to be served up for foobar.com. Or perhaps you need to follow the instructions in [[tips/dot_cgi]].
+
+Or, perhaps `/home/foobaruser/foobar.com/static/ikiwiki.cgi` does not exist; did you run ikiwiki -setup?
+"""]]
diff --git a/doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_3_89f2cd7d874a6257786478e4cae1e2bc._comment b/doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_3_89f2cd7d874a6257786478e4cae1e2bc._comment
new file mode 100644
index 000000000..ed771bd00
--- /dev/null
+++ b/doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_3_89f2cd7d874a6257786478e4cae1e2bc._comment
@@ -0,0 +1,16 @@
+[[!comment format=mdwn
+ username="https://profiles.google.com/lumeng.dev"
+ nickname="lumeng.dev"
+ subject="comment 3"
+ date="2011-05-21T19:11:27Z"
+ content="""
+After testing with a simple CGI script (thanks to smcv), I confirmed that the problem is not that the server is unable to serve CGI. I then tried changing the permissions of `ikiwiki.cgi` and found:
+
+* on the remote server, if I have `-rwsr-sr-x` for `ikiwiki.cgi`, the CGI features such as `Preference` and `Edit` buttons don't work, but if I set it `-rwsr-xr-x`, it works. The server is a Debian VPS.
+
+* in contrast, on my local LAMP server `localhost`, both permissions work
+
+Any ideas?
+
+Thanks!
+"""]]
diff --git a/doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_3_cbc20267fe5f0531f63db881d50596d1._comment b/doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_3_cbc20267fe5f0531f63db881d50596d1._comment
new file mode 100644
index 000000000..4e645ef0b
--- /dev/null
+++ b/doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_3_cbc20267fe5f0531f63db881d50596d1._comment
@@ -0,0 +1,20 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 3"
+ date="2011-05-21T18:27:15Z"
+ content="""
+Here is a simple CGI script:
+
+ #!/bin/sh
+ printf \"Content-type: text/plain\r\n\"
+ printf \"\r\n\"
+ printf \"Hello, world!\r\n\"
+
+Here is a simple Perl CGI script:
+
+ #!/usr/bin/perl
+ print \"Content-type: text/plain\r\n\";
+ print \"\r\n\";
+ print \"Hello, world!\r\n\";
+"""]]
diff --git a/doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_4_2eaf53935eecd0a918755d728450a642._comment b/doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_4_2eaf53935eecd0a918755d728450a642._comment
new file mode 100644
index 000000000..057a09a51
--- /dev/null
+++ b/doc/forum/Need_help_setting_up_ikiwiki_CGI/comment_4_2eaf53935eecd0a918755d728450a642._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://profiles.google.com/lumeng.dev"
+ nickname="lumeng.dev"
+ subject="comment 4"
+ date="2011-05-21T19:36:19Z"
+ content="""
+BTW, I use `cgi_wrappermode => '06755'` in my `wiki.setup`.
+"""]]
diff --git a/doc/forum/Need_some_help_on_starting_to_use_po_plugin_for_creating_pages_in_multiple_languages.mdwn b/doc/forum/Need_some_help_on_starting_to_use_po_plugin_for_creating_pages_in_multiple_languages.mdwn
new file mode 100644
index 000000000..5e4e56fde
--- /dev/null
+++ b/doc/forum/Need_some_help_on_starting_to_use_po_plugin_for_creating_pages_in_multiple_languages.mdwn
@@ -0,0 +1,6 @@
+I've installed the po plugin. I'm still trying to figure out how to use it.
+
+I have a Ikiwiki instance with multiple existing pages in English.
+
+How do I use po to create alternative pages in slave languages for these existing pages?
+
diff --git a/doc/forum/Need_something_more_powerful_than_Exclude.mdwn b/doc/forum/Need_something_more_powerful_than_Exclude.mdwn
new file mode 100644
index 000000000..5e8043258
--- /dev/null
+++ b/doc/forum/Need_something_more_powerful_than_Exclude.mdwn
@@ -0,0 +1,5 @@
+When I originally looked at the "exclude" option, I thought it meant that it excluded pages completely, but it apparently doesn't. What I've found in practice is that a file which matches the "exclude" regex is excluded from *processing*, but it is still copied over to the destination directory. Thus, for example, if I have "^Makefile$" as the exclude pattern, and I have a file `src/foo/Makefile`, then that file is copied unaltered into `dest/foo/Makefile`. However, what I want is for `src/foo/Makefile` to be completely ignored: that it is not only not processed, but not even *copied* into the destination directory.
+
+I'm not sure if the current behaviour is a bug or a feature, but I would like a "totally ignore this file" feature if it's possible to have one.
+
+-- [[KathrynAndersen]]
diff --git a/doc/forum/Need_something_more_powerful_than_Exclude/comment_2_0019cd6b34c8d8678b2532de57a92d15._comment b/doc/forum/Need_something_more_powerful_than_Exclude/comment_2_0019cd6b34c8d8678b2532de57a92d15._comment
new file mode 100644
index 000000000..7842caeac
--- /dev/null
+++ b/doc/forum/Need_something_more_powerful_than_Exclude/comment_2_0019cd6b34c8d8678b2532de57a92d15._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="expression anchored too closely?"
+ date="2010-11-23T10:43:21Z"
+ content="""
+It looks as though you might only be excluding a top-level Makefile, and not a Makefile in subdirectories. Try excluding `(^|/)Makefile$` instead, for instance? (See `wiki_file_prune_regexps` in `IkiWiki.pm` for hints.)
+
+The match operation in `&file_pruned` ends up a bit like this:
+
+ \"foo/Makefile\" =~ m{…|…|…|(^|/)Makefile$}
+"""]]
diff --git a/doc/forum/Need_something_more_powerful_than_Exclude/comment_2_f577ab6beb9912471949d8d18c790267._comment b/doc/forum/Need_something_more_powerful_than_Exclude/comment_2_f577ab6beb9912471949d8d18c790267._comment
new file mode 100644
index 000000000..bd964d540
--- /dev/null
+++ b/doc/forum/Need_something_more_powerful_than_Exclude/comment_2_f577ab6beb9912471949d8d18c790267._comment
@@ -0,0 +1,11 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="60.241.8.244"
+ subject="Missed It By That Much"
+ date="2010-11-25T02:55:20Z"
+ content="""
+I discovered that I not only needed to change the regexp, I also needed to delete .ikiwiki/indexdb: `file_pruned` only gets called for files that aren't in the `%pagesources` hash, and since the file in question was already there (it had been added before the exclude regex was changed), it wasn't even being checked!
+
+[[KathrynAndersen]]
+
+"""]]
diff --git a/doc/forum/Need_something_more_powerful_than_Exclude/comment_3_1ed260b0083a290688425a006a83f603._comment b/doc/forum/Need_something_more_powerful_than_Exclude/comment_3_1ed260b0083a290688425a006a83f603._comment
new file mode 100644
index 000000000..8b93acd79
--- /dev/null
+++ b/doc/forum/Need_something_more_powerful_than_Exclude/comment_3_1ed260b0083a290688425a006a83f603._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 3"
+ date="2010-11-29T20:41:49Z"
+ content="""
+`%pagesources` gets nuked when you rebuild the whole wiki with e.g. `ikiwiki --setup` or `ikiwiki --rebuild`. So you shouldn't normally need to remove the indexdb; just rebuild when making this sort of change that affects the whole site.
+"""]]
diff --git a/doc/forum/Need_something_more_powerful_than_Exclude/comment_4_c39bdaf38e1e20db74eb26f0560bd673._comment b/doc/forum/Need_something_more_powerful_than_Exclude/comment_4_c39bdaf38e1e20db74eb26f0560bd673._comment
new file mode 100644
index 000000000..15f1fecb8
--- /dev/null
+++ b/doc/forum/Need_something_more_powerful_than_Exclude/comment_4_c39bdaf38e1e20db74eb26f0560bd673._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="60.241.8.244"
+ subject="comment 4"
+ date="2010-11-30T02:35:43Z"
+ content="""
+One would think that would be the case, yes, but for some reason it didn't work for me. 8-(
+
+[[KathrynAndersen]]
+"""]]
diff --git a/doc/forum/Need_something_more_powerful_than_Exclude/comment_5_39b01857f7e0b388a6e7a3c1cf5388d5._comment b/doc/forum/Need_something_more_powerful_than_Exclude/comment_5_39b01857f7e0b388a6e7a3c1cf5388d5._comment
new file mode 100644
index 000000000..17228b891
--- /dev/null
+++ b/doc/forum/Need_something_more_powerful_than_Exclude/comment_5_39b01857f7e0b388a6e7a3c1cf5388d5._comment
@@ -0,0 +1,9 @@
+[[!comment format=mdwn
+ username="http://tbm.myopenid.com/"
+ ip="188.222.45.200"
+ subject="Still there?"
+ date="2012-03-20T18:35:41Z"
+ content="""
+Joey, I believe I see the same problem with 3.20120202. I add foo.mdwn, run `ikiwiki --setup ikiwiki.setup`, add \"exclude: foo\", run `--setup` again, and it still says \"building foo.mdwn\".
+
+"""]]
diff --git a/doc/forum/Need_something_more_powerful_than_Exclude/comment_6_1dccdfebad31446200213a2cae25f0e2._comment b/doc/forum/Need_something_more_powerful_than_Exclude/comment_6_1dccdfebad31446200213a2cae25f0e2._comment
new file mode 100644
index 000000000..d93684d10
--- /dev/null
+++ b/doc/forum/Need_something_more_powerful_than_Exclude/comment_6_1dccdfebad31446200213a2cae25f0e2._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawkiulxucQx_YZQZUVJdNF6oMaZwWb8JF2M"
+ nickname="Martin"
+ subject="Reproduced"
+ date="2012-07-25T02:23:13Z"
+ content="""
+I also encountered this bug (having to delete indexdb) in 3.20120629.
+
+-- Martin
+"""]]
diff --git a/doc/forum/News_site_where_articles_are_submitted_and_then_reviewed_before_posting.mdwn b/doc/forum/News_site_where_articles_are_submitted_and_then_reviewed_before_posting.mdwn
new file mode 100644
index 000000000..8dd755274
--- /dev/null
+++ b/doc/forum/News_site_where_articles_are_submitted_and_then_reviewed_before_posting.mdwn
@@ -0,0 +1,23 @@
+[[!meta date="2008-04-28 14:57:25 -0400"]]
+
+I am considering moving a news site to Ikiwiki. I am hoping that Ikiwiki has a feature where anonymous posters can submit an article via a form, which moderators can then review and approve for posting on a news webpage (like the front page of the website).
+
+> Well, you can have one blog that contains unreviewed articles, and
+> moderators can then add a tag that makes the article show up in the main
+> news feed. There's nothing stopping someone submitting an article
+> pre-tagged though. If you absolutely need to lock that down, you could
+> have one blog with unreviewed articles in one subdirectory, and reviewers
+> then move the file over to another subdirectory when they're ready to
+> publish it. (This second subdirectory would be locked to prevent others
+> from writing to it.) --[[Joey]]
+
+Also it would be good if the news page kept maybe just the latest 10 entries, with links to an archive that makes it easy to browse to old entries by date. (The site could have over a thousand news articles.)
+
+> The inline plugin allows setting up things like this.
+
+Plus, users should be able to post feedback to news items. If anonymous, the feedback must be approved first. I'd prefer not to use the normal "wiki" editor for feedback.
+
+Any thoughts or examples on this? Any links to examples of news sites or blogs with outside feedback using ikiwiki?
+
+Thanks --[[JeremyReed]]
+
diff --git a/doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__.mdwn b/doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__.mdwn
new file mode 100644
index 000000000..e58844bac
--- /dev/null
+++ b/doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__.mdwn
@@ -0,0 +1,12 @@
+Hi,
+
+unfortunately, OpenID is not working on my wiki. I get the error
+
+no_identity_server: The provided URL doesn't declare its OpenID identity server.
+
+I think this is related to the ID of my wiki not being defined right. Where and how do I have to define it? I have used the `\[[!meta openid]]` directive on index.html as described on ikiwiki.info (are there two quotes at the end of joeyh's example where only one should be?).
+
+Somehow I think it's not being transferred right to the user's OpenID provider upon login.
+
+thanks in advance
+chris
diff --git a/doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_1_bf1bec748d6ab419276a73a7001024cf._comment b/doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_1_bf1bec748d6ab419276a73a7001024cf._comment
new file mode 100644
index 000000000..06d2a332b
--- /dev/null
+++ b/doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_1_bf1bec748d6ab419276a73a7001024cf._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawndsaC4GaIBw49WNdbk2Faqfm_mrtQgul8"
+ nickname="Christian"
+ subject="apache module?"
+ date="2012-01-18T15:40:57Z"
+ content="""
+Do I have to install the OpenID apache module, load it, and configure apache to use my OpenID? There is a description <a href=\"http://findingscience.com/mod_auth_openid/\">here</a> of what I mean, except that in my case I can get it from the package system. I've got a feeling that's it.
+"""]]
diff --git a/doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_2_14a1b269be6dbcc9b2068d3e18b55711._comment b/doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_2_14a1b269be6dbcc9b2068d3e18b55711._comment
new file mode 100644
index 000000000..3078a1473
--- /dev/null
+++ b/doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_2_14a1b269be6dbcc9b2068d3e18b55711._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 2"
+ date="2012-01-30T19:34:00Z"
+ content="""
+Yes, good spotting, [[ikiwiki/directive/meta]] had a doubled quote in the openid example.
+
+Otherwise, that example will work. You don't need anything installed on your server to add openid delegation to a page.
+"""]]
diff --git a/doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_3_f581afcdb4481ea5d65bcc33bdbab99a._comment b/doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_3_f581afcdb4481ea5d65bcc33bdbab99a._comment
new file mode 100644
index 000000000..1ac15d74c
--- /dev/null
+++ b/doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_3_f581afcdb4481ea5d65bcc33bdbab99a._comment
@@ -0,0 +1,25 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnp4lzWSX1pvSpwAoboehP3SSbmbQESe80"
+ nickname="Felipe Augusto"
+ subject="I'm trying to use OpenID without success"
+ date="2012-02-08T04:54:22Z"
+ content="""
+I'm using the ikiwiki package from Debian squeeze and I can't seem
+to make OpenID work. It's a blog, and when I try to
+add a comment and click on SignIn, I'm redirected to
+
+>http://my.site/ikiwiki.cgi?do=commentsignin
+
+
+Once I click on Google logo/icon, it takes a while before showing
+
+>no_identity_server: The provided URL doesn't declare its OpenID identity server.
+
+
+It's not clear to me what's wrong, or whether I should add meta openid to some page.
+This is the version of libnet-openid-consumer-perl: 1.03-1. It also fails for
+Yahoo! and other providers; we never get redirected to the Google, Yahoo!, or other
+verification page.
+
+Thank you in advance!
+"""]]
diff --git a/doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_4_b0d39d30852bca1525ab9612a7532670._comment b/doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_4_b0d39d30852bca1525ab9612a7532670._comment
new file mode 100644
index 000000000..ce3cf2156
--- /dev/null
+++ b/doc/forum/OpenID_not_working___47___where_to_define_wiki__39__s_ID__63__/comment_4_b0d39d30852bca1525ab9612a7532670._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawndsaC4GaIBw49WNdbk2Faqfm_mrtQgul8"
+ nickname="Christian"
+ subject="comment 4"
+ date="2012-02-29T06:59:21Z"
+ content="""
+I had the error with squeeze, too. Have now moved to passwordauth, at least for now...
+"""]]
diff --git a/doc/forum/PERL5LIB__44___wrappers_and_homedir_install.mdwn b/doc/forum/PERL5LIB__44___wrappers_and_homedir_install.mdwn
new file mode 100644
index 000000000..fba941efc
--- /dev/null
+++ b/doc/forum/PERL5LIB__44___wrappers_and_homedir_install.mdwn
@@ -0,0 +1,38 @@
+How do I tell the wrappers that PERL5LIB should include my ~/bin directories?
+
+Having this in the wiki.setup doesn't help anymore:
+
+ # environment variables
+ ENV => {
+ PATH => '/home/user/bin/bin:/usr/local/bin:/usr/bin:/bin:/usr/games:/home/user/ikiwiki/usr/bin/:/home/user/ikiwiki/usr/sbin/:/home/user/bin/bin/:~/bin/bin/',
+ PERL5LIB => '/home/user/bin/share/perl/5.10.0:/home/user/bin/lib/perl/5.10.0'
+ },
+
+At any rate, I get CGI errors, and running ikiwiki.cgi manually fails too:
+
+ Use of uninitialized value $tainted in pattern match (m//) at /usr/share/perl5/IkiWiki.pm line 233.
+ Argument "" isn't numeric in umask at /usr/share/perl5/IkiWiki.pm line 139.
+ Undefined subroutine &IkiWiki::cgierror called at /home/user/bin/bin/ikiwiki line 199.
+
+Server has an older ikiwiki installed but I'd like to use a newer version from git, and I don't have root access.
+
+> You can't set `PERL5LIB` in `ENV` in a setup file, because ikiwiki is already
+> running before it reads that, and so it has little effect. Your error
+> messages do look like a new bin/ikiwiki is using an old version of
+> `IkiWiki.pm`.
+>
+> The thing to do is set `INSTALL_BASE` when you're installing ikiwiki from
+> source. Like so:
+
+ cd ikiwiki
+ perl Makefile.PL INSTALL_BASE=$HOME PREFIX=
+ make install
+
+> Then `$HOME/bin/ikiwiki` will be hardcoded to look
+> for ikiwiki's perl modules in `$HOME/lib/perl5/`.
+> (This is documented in the README file by the way.) --[[Joey]]
+
+>> Ok, *perl Makefile.PL INSTALL_BASE=$HOME/bin PREFIX=* finally did it for me. I tried too many things with
+>> these paths so I wasn't sure which actually worked. After that I did
+>> *$ ikiwiki --setup www.setup --wrappers --rebuild*. Somehow in this update mess I seem to have lost the user
+>> accounts, maybe the --rebuild was too much.
diff --git a/doc/forum/PageSpec_results_from_independent_checkout.mdwn b/doc/forum/PageSpec_results_from_independent_checkout.mdwn
new file mode 100644
index 000000000..693287d2b
--- /dev/null
+++ b/doc/forum/PageSpec_results_from_independent_checkout.mdwn
@@ -0,0 +1,8 @@
+I'd like to be able to do PageSpec matches independent of the Ikiwiki checkout, but at best I'm currently restricted to copying over and using whatever is in the indexdb with this approach:
+
+ perl -MIkiWiki -le '$config{wikistatedir}=".ikiwiki"; IkiWiki::loadindex(); print foreach pagespec_match_list("", shift)' "bugs/*"
+
+I get the impression there's a way to build up enough state to run pagespec matches without doing any rendering, but I don't know how. Any ideas? -- JoeRayhawk
+
+> It's not possible to build up enough state without at a minimum
+> performing the scan pass of rendering on every page. --[[Joey]]
diff --git a/doc/forum/Parent_Links_all_link_to_root.mdwn b/doc/forum/Parent_Links_all_link_to_root.mdwn
new file mode 100644
index 000000000..b9c4c8e1a
--- /dev/null
+++ b/doc/forum/Parent_Links_all_link_to_root.mdwn
@@ -0,0 +1,18 @@
+My parent links all link to the root instead of to the appropriate index.mdwn. Is this a sign of a broken link pre-compile? Is there some setting that controls this?
+
+<code>
+\<span class="parentlinks">
+
+\<a href="../../../">root\</a>/
+
+\<a href="../../../">level1\</a>/
+
+\<a href="../../../">level2\</a>/
+
+\</span>
+</code>
+
+Thanks,
+Sean
+
+ikiwiki version 3.20100722 - Pretty plain Jane install
diff --git a/doc/forum/Parent_Links_all_link_to_root/comment_1_4b5ed25cceb7740f64ee08aba00a1d91._comment b/doc/forum/Parent_Links_all_link_to_root/comment_1_4b5ed25cceb7740f64ee08aba00a1d91._comment
new file mode 100644
index 000000000..656cc0f56
--- /dev/null
+++ b/doc/forum/Parent_Links_all_link_to_root/comment_1_4b5ed25cceb7740f64ee08aba00a1d91._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joeyh.name/"
+ ip="4.154.4.117"
+ subject="comment 1"
+ date="2012-06-03T17:11:11Z"
+ content="""
+All I can think is that you must have modified the `page.tmpl` template and somehow broken the url inside the parentlinks loop.
+"""]]
diff --git a/doc/forum/Perhaps_I__39__m_doing_it_wrong_-_tracking_non-post_files_in_a_blog.mdwn b/doc/forum/Perhaps_I__39__m_doing_it_wrong_-_tracking_non-post_files_in_a_blog.mdwn
new file mode 100644
index 000000000..383ae17cc
--- /dev/null
+++ b/doc/forum/Perhaps_I__39__m_doing_it_wrong_-_tracking_non-post_files_in_a_blog.mdwn
@@ -0,0 +1,7 @@
+I've been searching on this topic for a while and haven't found a solution, so I'd like to ask here.
+
+I have a blog which I mostly use as a tech-note reminder system for myself (how did I setup my server, etc). Occasionally I find it useful to include files which are not posts, and links to those files.
+
+Right now, I scp the files to the server to get them in a place accessible by the web server, then use a relative link within the post. This works, but it strikes me that the files are as much a part of the post as the post itself, and therefore should be tracked. The problem with tracking the files is the inline directive gives those files their own entries as posts in the blog. I do not want them to have their own entries, but I *do* want them co-located with the file containing the post from which they are referenced.
+
+So, is there a way to have *only* `*.mdwn` files be picked up as posts by the inline directive (I tried using a PageSpec of `*.mdwn`, but that didn't work)? Or, conversely, to exclude other files from being picked up as posts? Or am I not seeing another way to go about this task?
diff --git a/doc/forum/Perhaps_I__39__m_doing_it_wrong_-_tracking_non-post_files_in_a_blog/comment_1_45ecaf6efa2065837fa54a42737f0a66._comment b/doc/forum/Perhaps_I__39__m_doing_it_wrong_-_tracking_non-post_files_in_a_blog/comment_1_45ecaf6efa2065837fa54a42737f0a66._comment
new file mode 100644
index 000000000..4d2c93238
--- /dev/null
+++ b/doc/forum/Perhaps_I__39__m_doing_it_wrong_-_tracking_non-post_files_in_a_blog/comment_1_45ecaf6efa2065837fa54a42737f0a66._comment
@@ -0,0 +1,18 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2012-01-03T11:29:59Z"
+ content="""
+Change the [[ikiwiki/pagespec]] in the `inline`, for instance from
+`posts/*` to `page(posts/*)`.
+
+`page(*)` only matches \"pages\" (things that get rendered to HTML, which is just
+`.mdwn` files in a default ikiwiki, but can include other things with the right
+plugins).
+
+On my blog I use \"`2* and copyright(*)`\", which is a bit of a hack: it matches
+files in the directories I use for posts (which are year-based), but only if
+they have an explicit copyright statement - which my blog posts do, but
+\"structural\" pages (like a list of all posts from 2011) don't.
+"""]]
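Concretely, the suggestion above is a one-word change to the blog's inline directive (the `posts/*` glob and `show` count are just examples):

```
\[[!inline pages="page(posts/*)" show="10"]]
```

Attachments living next to the posts then stay out of the feed while remaining tracked in the repository.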
diff --git a/doc/forum/Perhaps_I__39__m_doing_it_wrong_-_tracking_non-post_files_in_a_blog/comment_2_45ca7ef4190c281d703c8c7ca6979298._comment b/doc/forum/Perhaps_I__39__m_doing_it_wrong_-_tracking_non-post_files_in_a_blog/comment_2_45ca7ef4190c281d703c8c7ca6979298._comment
new file mode 100644
index 000000000..ecb6f5df9
--- /dev/null
+++ b/doc/forum/Perhaps_I__39__m_doing_it_wrong_-_tracking_non-post_files_in_a_blog/comment_2_45ca7ef4190c281d703c8c7ca6979298._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="dave"
+ ip="24.209.97.191"
+ subject="comment 2"
+ date="2012-01-04T04:01:54Z"
+ content="""
+Thank you sir, that was exactly what I needed!
+
+Don't know why I didn't think to try page - apparently I have a special blindness which applies to looking at the pagespec help page. ;)
+
+Anyway, thanks again, that fixed it.
+"""]]
diff --git a/doc/forum/Possible_to_use_meta_variables_in_templates__63__.mdwn b/doc/forum/Possible_to_use_meta_variables_in_templates__63__.mdwn
new file mode 100644
index 000000000..3c214d457
--- /dev/null
+++ b/doc/forum/Possible_to_use_meta_variables_in_templates__63__.mdwn
@@ -0,0 +1,11 @@
+I'm trying to create a [[!iki plugins/template desc=template]] which references variables from the [[!iki plugins/meta desc=meta]] plugin, but either it's not supported or I'm doing something wrong. This is what my template looks like:
+
+ <div class="attributionbox">
+ <p><b>Written by:</b> <a href="<TMPL_VAR AUTHORURL>"><TMPL_VAR AUTHOR></a></p>
+    <p><TMPL_VAR text></p>
+ </div>
+
+The template is working because I get the content, but all the places where I reference meta variables are blank. Is this supposed to work or am I trying to do something unsupported? Many thanks for any pointers.
+
+Cheers,
+[[AdamShand]]
diff --git a/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_1_556078a24041289d8f0b7ee756664690._comment b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_1_556078a24041289d8f0b7ee756664690._comment
new file mode 100644
index 000000000..3aeeec793
--- /dev/null
+++ b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_1_556078a24041289d8f0b7ee756664690._comment
@@ -0,0 +1,20 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="not supported at the moment"
+ date="2011-01-24T15:17:59Z"
+ content="""
+This isn't supported, because [[ikiwiki/directive/template]] templates
+don't run `pagetemplate` hooks (which is how information gets from
+[[ikiwiki/directive/meta]] into, for instance, `page.tmpl`). The only
+inputs to the `HTML::Template` are the parameters passed to the
+directive, plus the `raw_`-prefixed versions of those, plus the extra
+parameters passed to every `preprocess` hook (currently `page`, `destpage`
+and `preview`).
+
+I think having `pagetemplate` hooks run for this sort of template
+by default would be rather astonishing, but perhaps some sort of
+opt-in while defining the template would be reasonable? One problem
+with that is that the templates used by [[ikiwiki/directive/template]]
+are just wiki pages, and don't really have any special syntax support.
+"""]]
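Until such an opt-in exists, the only way to get a meta-style value into this kind of template is to pass it explicitly as a directive parameter (the template id and values here are made up for illustration):

```
\[[!template id=attributionbox author="Some Author" authorurl="http://example.com/"]]
```

Inside the template these become `<TMPL_VAR AUTHOR>` and `<TMPL_VAR AUTHORURL>`, but they are independent of whatever `\[[!meta]]` recorded for the page.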
diff --git a/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_2_e7e954218d39bc310015b95aa1a5212c._comment b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_2_e7e954218d39bc310015b95aa1a5212c._comment
new file mode 100644
index 000000000..b53188128
--- /dev/null
+++ b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_2_e7e954218d39bc310015b95aa1a5212c._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://adam.shand.net/"
+ nickname="Adam"
+ subject="Bummer."
+ date="2011-01-24T15:26:33Z"
+ content="""
+Thanks for the quick response! I'm trying to figure out some way to reference meta variables inside of a page. Specifically, I'm trying to create an attribution box which lists all of the information I have about who wrote the page, where the original can be found, etc. I can just pass the values to the template, but it would be really nice not to have to enter this information twice, once for the meta plugin and once for my attribution box!
+
+The changes you suggest sound wonderful but are beyond my abilities right now. Any ideas how I might accomplish this in the meantime?
+"""]]
diff --git a/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_3_8b16c563c89eb6980ad6a5539d934d7a._comment b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_3_8b16c563c89eb6980ad6a5539d934d7a._comment
new file mode 100644
index 000000000..a20f8f5c6
--- /dev/null
+++ b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_3_8b16c563c89eb6980ad6a5539d934d7a._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 3"
+ date="2011-01-24T20:58:52Z"
+ content="""
+I usually just have a template that contains a suitable `\[[!meta]]` directive.
+"""]]
diff --git a/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_4_76eadf93cce4e2168960131d4677c5fc._comment b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_4_76eadf93cce4e2168960131d4677c5fc._comment
new file mode 100644
index 000000000..b5c626130
--- /dev/null
+++ b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_4_76eadf93cce4e2168960131d4677c5fc._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="202.173.183.92"
+ subject="contrib plugins can do this"
+ date="2011-01-24T23:11:40Z"
+ content="""
+You can do this by using the [[plugins/contrib/field]] plugin with the [[plugins/contrib/ftemplate]] plugin.
+"""]]
diff --git a/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_5_ddabe4a005042d19c7669038b49275c1._comment b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_5_ddabe4a005042d19c7669038b49275c1._comment
new file mode 100644
index 000000000..6279b20ba
--- /dev/null
+++ b/doc/forum/Possible_to_use_meta_variables_in_templates__63__/comment_5_ddabe4a005042d19c7669038b49275c1._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="http://adam.shand.net/"
+ nickname="Adam"
+ subject="Thanks!"
+ date="2011-01-25T02:51:35Z"
+ content="""
+smcv, sorry, I don't understand. How are you getting the \[[!meta]] directive to work on a template page? I thought that's what you said didn't work? Do you mean a pagetemplate?
+
+kerravonsen, thanks for the pointer I'll check those out.
+
+I realised last night that I could also do this with a pagetemplate, since I should be able to access meta variables there. A little clumsy for what I want to do, but it should hopefully work fine. It would be really neat with the [section template](http://ikiwiki.info/todo/Set_templates_for_whole_sections_of_the_site/) plugin; I'll have to look at that.
+"""]]
diff --git a/doc/forum/Problem_with_gitweb.mdwn b/doc/forum/Problem_with_gitweb.mdwn
new file mode 100644
index 000000000..98a7f39a3
--- /dev/null
+++ b/doc/forum/Problem_with_gitweb.mdwn
@@ -0,0 +1,3 @@
+I use gitweb to display the page histories of my local ikiwiki. However, for a few weeks now it hasn't worked; it just displays: 404 - No such project. I don't remember changing anything in my wiki. Any ideas how to fix this? I guess it could be a permission problem, but I don't really know which permissions matter in this case.
+
+
diff --git a/doc/forum/Problem_with_gitweb/comment_2_23cc0d87448d3cbdac20a005e9191589._comment b/doc/forum/Problem_with_gitweb/comment_2_23cc0d87448d3cbdac20a005e9191589._comment
new file mode 100644
index 000000000..f80bd38d8
--- /dev/null
+++ b/doc/forum/Problem_with_gitweb/comment_2_23cc0d87448d3cbdac20a005e9191589._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 2"
+ date="2012-03-05T21:08:45Z"
+ content="""
+This seems entirely a gitweb configuration problem, so look at `/etc/gitweb.conf`
+
+Or, if you are able to navigate to a gitweb url that does show your wiki's source, fix up ikiwiki's `historyurl` to use the url that works.
+"""]]
diff --git a/doc/forum/Problem_with_gitweb/comment_3_697c6038009249e6a49d9e458a5ba271._comment b/doc/forum/Problem_with_gitweb/comment_3_697c6038009249e6a49d9e458a5ba271._comment
new file mode 100644
index 000000000..72eeda124
--- /dev/null
+++ b/doc/forum/Problem_with_gitweb/comment_3_697c6038009249e6a49d9e458a5ba271._comment
@@ -0,0 +1,47 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk_MMtLPS7osC5MjX00q2ATjvvXPWqm0ik"
+ nickname="micheal"
+ subject="comment 3"
+ date="2012-03-06T09:50:53Z"
+ content="""
+I don't know how to navigate to a gitweb url that does show my wiki's source.
+
+My gitweb.conf looks like this:
+
+ cat /etc/gitweb.conf
+ # path to git projects (<project>.git)
+ #$projectroot = \"/var/cache/git\";
+ $projectroot = \"/home/myuser/myiki\";
+
+ # directory to use for temp files
+ $git_temp = \"/tmp\";
+
+ # Change This
+ $site_name = \"myiki\";
+
+ # target of the home link on top of all pages
+ #$home_link = $my_uri || \"/\";
+
+ # html text to include at home page
+ #$home_text = \"indextext.html\";
+
+ # file with project list; by default, simply scan the projectroot dir.
+ #$projects_list = $projectroot;
+
+ # stylesheet to use
+ #@stylesheets = (\"static/gitweb.css\");
+
+ # javascript code for gitweb
+ #$javascript = \"static/gitweb.js\";
+
+ # logo to use
+ #$logo = \"static/git-logo.png\";
+
+ # the 'favicon'
+ #$favicon = \"static/git-favicon.png\";
+
+ # git-diff-tree(1) options to use for generated patches
+ #@diff_opts = (\"-M\");
+ @diff_opts = ();
+
+"""]]
diff --git a/doc/forum/Problem_with_gitweb/comment_3_6a5b96f7e0d6b169c090e3df7281d938._comment b/doc/forum/Problem_with_gitweb/comment_3_6a5b96f7e0d6b169c090e3df7281d938._comment
new file mode 100644
index 000000000..c8bbe9ba1
--- /dev/null
+++ b/doc/forum/Problem_with_gitweb/comment_3_6a5b96f7e0d6b169c090e3df7281d938._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk_MMtLPS7osC5MjX00q2ATjvvXPWqm0ik"
+ nickname="micheal"
+ subject="comment 3"
+ date="2012-03-27T17:35:49Z"
+ content="""
+Any ideas???
+"""]]
diff --git a/doc/forum/Problem_with_gitweb/comment_5_8a79b879205bd265d54e30f0eee2ac63._comment b/doc/forum/Problem_with_gitweb/comment_5_8a79b879205bd265d54e30f0eee2ac63._comment
new file mode 100644
index 000000000..242ce8928
--- /dev/null
+++ b/doc/forum/Problem_with_gitweb/comment_5_8a79b879205bd265d54e30f0eee2ac63._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk_MMtLPS7osC5MjX00q2ATjvvXPWqm0ik"
+ nickname="micheal"
+ subject="comment 5"
+ date="2012-03-29T18:55:37Z"
+ content="""
+Solved by `sudo chmod a+x /home/myuser`
+"""]]
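For the record, this works because gitweb runs as the web server's user, which needs the execute (search) bit on every directory leading down to the repository. A scratch-directory sketch of the before/after permissions, assuming GNU `stat` (the paths are illustrative, not the poster's real ones):

```shell
# Recreate a home directory that other users cannot traverse.
mkdir -p /tmp/home-demo/myiki
chmod 700 /tmp/home-demo
stat -c '%A' /tmp/home-demo   # drwx------ : gitweb's user is locked out -> 404
chmod a+x /tmp/home-demo
stat -c '%A' /tmp/home-demo   # drwx--x--x : traversal allowed, listing still denied
rm -r /tmp/home-demo
```

`a+x` grants traversal only; the directory's contents still cannot be listed or read by other users.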
diff --git a/doc/forum/Problem_with_local_git_commit.mdwn b/doc/forum/Problem_with_local_git_commit.mdwn
new file mode 100644
index 000000000..e9dfdb417
--- /dev/null
+++ b/doc/forum/Problem_with_local_git_commit.mdwn
@@ -0,0 +1,42 @@
+I have a problem when I edit my wiki with a text editor and use just git to commit.
+
+Suppose `iki` is my srcdir and `iki.git` my repository. Then I did `git clone iki.git myiki` to get a copy. Then I do
+
+ cd myiki
+    echo "test" >> somepage.mdwn
+    git add somepage.mdwn
+ git pull
+ git commit -m "test"
+ git push
+
+Then I get the following error message
+
+ Counting objects: 5, done.
+ Delta compression using up to 2 threads.
+ Compressing objects: 100% (2/2), done.
+ Writing objects: 100% (3/3), 287 bytes, done.
+ Total 3 (delta 1), reused 0 (delta 0)
+ Unpacking objects: 100% (3/3), done.
+ remote: From /home/myuser/iki
+ remote: 32bb6be..1f3a647 master -> origin/master
+ remote: There are no candidates for merging among the refs that you just fetched.
+ remote: Generally this means that you provided a wildcard refspec which had no
+ remote: matches on the remote end.
+ remote: 'git pull --prune origin' failed: at /usr/share/perl5/IkiWiki/Plugin/git.pm line 207.
+ remote: skipping bad filename local.css~
+ remote: skipping bad filename #tex_sandbox.mdwn#
+ To /home/myuser/iki.git
+ 32bb6be..1f3a647 master -> master
+
+When I check the repository via gitk, everything seems to be ok. If I check the srcdir the same way, origin/master is one step away from master, and the change doesn't appear on the iki web page. I tried `sudo git pull --prune origin master` in my srcdir, which sets master to the head, but the change still doesn't show up. It only appears when I make a second change as above, or when I run `sudo ikiwiki --setup iki.setup`.
+
+By the way, the setup run gives me a similar error message:
+
+ successfully generated /var/www/iki/ikiwiki.cgi
+ successfully generated /home/myuser/iki.git/hooks/post-update
+ There are no candidates for merging among the refs that you just fetched.
+ Generally this means that you provided a wildcard refspec which had no
+ matches on the remote end.
+ 'git pull --prune origin' failed: at /usr/share/perl5/IkiWiki/Plugin/git.pm line 207.
+
+Any ideas what may be wrong here and how to fix this?
diff --git a/doc/forum/Processing_non-pages.mdwn b/doc/forum/Processing_non-pages.mdwn
new file mode 100644
index 000000000..23af417a4
--- /dev/null
+++ b/doc/forum/Processing_non-pages.mdwn
@@ -0,0 +1,7 @@
+I'd like to be able to write a plugin that minifies CSS pages, but the whole plugin mechanism appears to be oriented towards generating HTML pages. That is, all files appear to be split into "pages with page types" and "pages without page types". Pages without page types are copied from the source to the destination directory and that's all. Pages *with* page-types go through the whole gamut: scan, filter, preprocess, linkify, htmlize, sanitize, format, and then they're written as "foo.html".
+
+I could be mistaken, but I don't think registering "css" as a page-type would work. Sure, I could then process the content to my heart's content, but at the end, my foo.css file would be saved as foo.html, which is NOT what I want.
+
+What I would like would be something in-between, where one could take `foo.css`, process it (in this case, run a minify over it) and output it as `foo.css`.
+
+How?
diff --git a/doc/forum/Recent_changes_on_main_site_or_on_a_sidebar.mdwn b/doc/forum/Recent_changes_on_main_site_or_on_a_sidebar.mdwn
new file mode 100644
index 000000000..0c328a9f1
--- /dev/null
+++ b/doc/forum/Recent_changes_on_main_site_or_on_a_sidebar.mdwn
@@ -0,0 +1 @@
+Is it possible to display the recent changes on the main site of the wiki or on a sidebar?
diff --git a/doc/forum/Recent_changes_on_main_site_or_on_a_sidebar/comment_1_018b977ff7ee59fc53838e0c20c3a9a7._comment b/doc/forum/Recent_changes_on_main_site_or_on_a_sidebar/comment_1_018b977ff7ee59fc53838e0c20c3a9a7._comment
new file mode 100644
index 000000000..1bc0cc509
--- /dev/null
+++ b/doc/forum/Recent_changes_on_main_site_or_on_a_sidebar/comment_1_018b977ff7ee59fc53838e0c20c3a9a7._comment
@@ -0,0 +1,11 @@
+[[!comment format=mdwn
+ username="jean_magnan"
+ ip="81.56.145.104"
+ subject="comment 1"
+ date="2011-12-19T10:20:59Z"
+ content="""
+Hi,
+I have this line in the sidebar file, it says to show the titles and dates of the last 5 pages:
+
+[[!inline pages=\"*\" archive=\"yes\" show=\"5\"]]
+"""]]
diff --git a/doc/forum/Recent_changes_on_main_site_or_on_a_sidebar/comment_2_927c11f18315baa39f08ca4982ed2ab1._comment b/doc/forum/Recent_changes_on_main_site_or_on_a_sidebar/comment_2_927c11f18315baa39f08ca4982ed2ab1._comment
new file mode 100644
index 000000000..2b6237bc4
--- /dev/null
+++ b/doc/forum/Recent_changes_on_main_site_or_on_a_sidebar/comment_2_927c11f18315baa39f08ca4982ed2ab1._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 2"
+ date="2011-12-20T15:13:48Z"
+ content="""
+The [[RecentChanges]] page is a regular wiki page that inlines a few special pages with a special template. That content can be copied anywhere else in the wiki to get the same effect.
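+
+For example, the inline used on that page looks something like this (template name and pagespec may vary between versions):
+
+    \[[!inline pages=\"internal(recentchanges/change_*)\" template=recentchanges show=0]]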
+"""]]
diff --git a/doc/forum/Refresh_or_recreate_style.css__63__.mdwn b/doc/forum/Refresh_or_recreate_style.css__63__.mdwn
new file mode 100644
index 000000000..262b0e3c6
--- /dev/null
+++ b/doc/forum/Refresh_or_recreate_style.css__63__.mdwn
@@ -0,0 +1,40 @@
+I was trying to use plain blueview theme but that's not what I see in the installed style.css:
+
+ ~/src/ikiwiki/themes/blueview$ grep bzed style.css
+ /* bzed theme for ikiwiki
+ ~/src/ikiwiki/themes/blueview$ wc -l style.css
+ 281 style.css
+ $ grep bzed ~/www/style.css
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ $ wc -l ~/www/style.css
+ 7913
+
+I have installed ikiwiki to my home directory on the shared server and it seems the big css file is there too:
+
+ $ grep bzed ~/bin/share/ikiwiki/themes/blueview/style.css
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ /* bzed theme for ikiwiki
+ $ wc -l ~/bin/share/ikiwiki/themes/blueview/style.css
+ 7913
+
+Is the style.css really supposed to be that big?
+If not, how can I create it from scratch?
+
+The reason I'm debugging the CSS is that I'd like to make it better on small handset screens: drop all margins, inline or hide the sidebar, and so on. Chromium shows that the processed CSS is quite a mess.
diff --git a/doc/forum/Refresh_or_recreate_style.css__63__/comment_1_3274be931d0b543c7f7cf641810817aa._comment b/doc/forum/Refresh_or_recreate_style.css__63__/comment_1_3274be931d0b543c7f7cf641810817aa._comment
new file mode 100644
index 000000000..608dca0ce
--- /dev/null
+++ b/doc/forum/Refresh_or_recreate_style.css__63__/comment_1_3274be931d0b543c7f7cf641810817aa._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://mcfrisk.myopenid.com/"
+ nickname="mikko.rapeli"
+ subject="bug/feature in Makefile.PL"
+ date="2013-03-30T11:53:41Z"
+ content="""
+Theme style.css files were being appended to when installing via Makefile.PL. IMO, overwriting the destination files is more correct. I sent a patch to Joey.
+"""]]
diff --git a/doc/forum/Regex_for_Valid_Characters_in_Filenames.mdwn b/doc/forum/Regex_for_Valid_Characters_in_Filenames.mdwn
new file mode 100644
index 000000000..618576f81
--- /dev/null
+++ b/doc/forum/Regex_for_Valid_Characters_in_Filenames.mdwn
@@ -0,0 +1,19 @@
+I'm sure that this is documented somewhere but I've ransacked the wiki and I can't find it. :-( What are the allowed characters in an ikiwiki page name? I'm writing a simple script to make updating my blog easier and need to filter invalid characters (so far I've found that # and , aren't allowed ;-)). Thanks for any pointers. -- [[AdamShand]]
+
+> The default `wiki_file_regexp` matches filenames containing only
+> `[-[:alnum:]_.:/+]`
+>
+> The titlepage() function will convert freeform text to a valid
+> page name. See [[todo/should_use_a_standard_encoding_for_utf_chars_in_filenames]]
+> for an example. --[[Joey]]
+
+>> Perfect, thanks!
+>>
+>> In the end I decided that I didn't need any special characters in filenames and replaced everything but alphanumeric characters with underscores. In addition to replacing bad characters I also collapse multiple underscores into a single one, and strip off trailing and leading underscores to make tidy filenames. If it's useful to anybody else here's a sed example:
+>>
+>> # echo "++ Bad: ~@#$%^&*()_=}{[];,? Iki: +_-:./ Num: 65.5 ++" | sed -e 's/[^A-Za-z0-9_]/_/g' -e 's/__*/_/g' -e 's/^_//g' -e 's/_$//g'
+>> Bad_Iki_Num_65_5
+>>
+>>--[[AdamShand]]
+
+[[!meta date="2008-01-18 23:40:02 -0500"]]
diff --git a/doc/forum/Render_more_than_one_dest_page_from_same_source_page.mdwn b/doc/forum/Render_more_than_one_dest_page_from_same_source_page.mdwn
new file mode 100644
index 000000000..e7362c903
--- /dev/null
+++ b/doc/forum/Render_more_than_one_dest_page_from_same_source_page.mdwn
@@ -0,0 +1,51 @@
+Is it possible to render more than one destination page from the same source page?
+That is, same source, slightly different presentation at the other end, needing a different output file.
+
+> It's possible to render more than one output _file_ from a given source
+> page. See, for example, the inline plugin's generation of rss files.
+> This is done by calling `will_render()` and using `writefile()` to
+> generate the additional files. Probably in a format hook if you want
+> to generate html files.
+
+>> Thanks for the tip, I'll take a look at that. -- [[KathrynAndersen]]
+
+> It's not possible for one source file to represent multiple wiki pages.
+> There is a 1:1 mapping between source filenames and page names. The
+> difference between wiki pages and output files is that you can use
+> wikilinks to link to wiki pages, etc. --[[Joey]]
+
+I have two problems that would be solved by being able to do this.
+
+[[!toc startlevel=2]]
+
+##"full" and "print" versions of a page.
+
+One has a page "foo", which is rendered into foo.html.
+One also wants a foo-print.html page, which uses "page-print.tmpl" rather than "page.tmpl" as its template.
+
+I want to do this for every page on the site, automatically, so it isn't feasible to do it by hand.
+
+> Did you know that ikiwiki's `style.css` arranges for pages to display
+> differently when printed out? Things like the Action bar are hidden in
+> printouts (search for `@media print`). So I don't see a reason to need
+> whole files for printing when you can use these style sheet tricks.
+> --[[Joey]]
+
+>>Fair enough. --[[KathrynAndersen]]
+
+##"en" and "en-us" versions of a page.
+
+My site is in non-US English. However, I want US-English people to find my site when they search for it when they use US spelling on certain search terms (such as "optimise" versus "optimize"). This requires a (crude) US-English version of the site where the spellings are changed automatically, and the LANG is "en-us" rather than "en". (No, don't tell me to use keywords; Google ignores keywords and has for a number of years).
+
+So I want the page "foo" to render to "foo.en.html" and "foo.en-us.html" where the content is the same, just some automated word-substitution applied before foo.en-us.html is written. And do this for every page on the site.
+
+I can't do this with the "po" plugin, as it considers "en-us" not to be a valid language. And the "po" plugin is probably overkill for what I want anyway.
+
+But I'm not sure how to achieve the result I need.
+
+-- [[KathrynAndersen]]
+
+> Sounds like this could be considered a single page that generates two
+> html files, so could be handled per above. --[[Joey]]
+
+>>Thanks! --[[KathrynAndersen]]
diff --git a/doc/forum/Revision_history_for_single_pages.mdwn b/doc/forum/Revision_history_for_single_pages.mdwn
new file mode 100644
index 000000000..76b4f7608
--- /dev/null
+++ b/doc/forum/Revision_history_for_single_pages.mdwn
@@ -0,0 +1,3 @@
+In other wikis [[for example|http://en.wikipedia.org/w/index.php?title=Main_Page&action=history]] in mediawiki there is a revision history of every single page (and not only of the whole wiki).
+
+Is it possible to have this in ikiwiki, too?
diff --git a/doc/forum/Revision_history_for_single_pages/comment_1_d509d5d726cd7eab9472d723013f5ec4._comment b/doc/forum/Revision_history_for_single_pages/comment_1_d509d5d726cd7eab9472d723013f5ec4._comment
new file mode 100644
index 000000000..242702e91
--- /dev/null
+++ b/doc/forum/Revision_history_for_single_pages/comment_1_d509d5d726cd7eab9472d723013f5ec4._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="202.173.183.92"
+ subject="comment 1"
+ date="2011-09-26T22:52:07Z"
+ content="""
+Er, but IkiWiki does have this. Click on \"History\" and you get the revision history of that page.
+"""]]
diff --git a/doc/forum/Revision_history_for_single_pages/comment_2_d39a6177fc4c1e3c3c2c4e2592be9e3d._comment b/doc/forum/Revision_history_for_single_pages/comment_2_d39a6177fc4c1e3c3c2c4e2592be9e3d._comment
new file mode 100644
index 000000000..4ca754fd7
--- /dev/null
+++ b/doc/forum/Revision_history_for_single_pages/comment_2_d39a6177fc4c1e3c3c2c4e2592be9e3d._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk_MMtLPS7osC5MjX00q2ATjvvXPWqm0ik"
+ nickname="micheal"
+ subject="comment 2"
+ date="2011-09-27T17:49:15Z"
+ content="""
+Hm. I overlooked that ikiwiki.info has this. However my own wiki doesn't. Do I have to adjust any settings to get this?
+"""]]
diff --git a/doc/forum/Revision_history_for_single_pages/comment_3_aecf2b031ace001afaa2a0f2b5f50c82._comment b/doc/forum/Revision_history_for_single_pages/comment_3_aecf2b031ace001afaa2a0f2b5f50c82._comment
new file mode 100644
index 000000000..f3ddff19d
--- /dev/null
+++ b/doc/forum/Revision_history_for_single_pages/comment_3_aecf2b031ace001afaa2a0f2b5f50c82._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 3"
+ date="2011-09-29T16:10:22Z"
+ content="""
+`historyurl` is the setting in the `.setup` file: it just links to an external history viewer, which is likely to be more advanced and better-suited to your [[RCS]] than anything built into ikiwiki would be. Use your favourite viewer for the RCS you're using - ikiwiki.info uses gitweb, you could also use something like cgit - and write `\[[file]]` where you want the filename to appear.
+"""]]
diff --git a/doc/forum/Run_script_on_markdown_source.mdwn b/doc/forum/Run_script_on_markdown_source.mdwn
new file mode 100644
index 000000000..614815c9b
--- /dev/null
+++ b/doc/forum/Run_script_on_markdown_source.mdwn
@@ -0,0 +1 @@
+How can I add a button to each wiki page which launches an external application or script with the markdown code of the current page as input?
diff --git a/doc/forum/See_rendered_old_revisions_via_pagehistory.mdwn b/doc/forum/See_rendered_old_revisions_via_pagehistory.mdwn
new file mode 100644
index 000000000..465746ef9
--- /dev/null
+++ b/doc/forum/See_rendered_old_revisions_via_pagehistory.mdwn
@@ -0,0 +1 @@
+Via `historyurl` and `gitweb` I can view the markdown source of old revisions of a page (by clicking on `blob` in `gitweb`). Is it also possible to see the rendered versions of these old revisions directly (i.e. as they would be rendered by ikiwiki, not just the markdown source)?
diff --git a/doc/forum/Setting_http__95__proxy.mdwn b/doc/forum/Setting_http__95__proxy.mdwn
new file mode 100644
index 000000000..3bf8a76bc
--- /dev/null
+++ b/doc/forum/Setting_http__95__proxy.mdwn
@@ -0,0 +1,22 @@
+Hi! My wiki is behind a proxy and, as I understood from looking around the web, I need to set the environment variables using `ENV` inside the wiki's config.
+
+So far I tried:
+
+    ENV: {
+        http_proxy => 'http://proxy.uns.edu.ar:1280/',
+        https_proxy => 'http://proxy.uns.edu.ar:1280/'
+    }
+
+without luck, as I get:
+
+    YAML::XS::Load Error: The problem:
+
+        found unexpected ':'
+
+    was found at document: 1, line: 85, column: 22
+    while scanning a plain scalar at line: 85, column: 3
+usage: ikiwiki [options] source dest
+ ikiwiki --setup configfile
+
+What am I missing? (maybe learning perl?)
diff --git a/doc/forum/Setting_http__95__proxy/comment_1_350a7c4834c9f422e107b646cdbae3b0._comment b/doc/forum/Setting_http__95__proxy/comment_1_350a7c4834c9f422e107b646cdbae3b0._comment
new file mode 100644
index 000000000..3623652ab
--- /dev/null
+++ b/doc/forum/Setting_http__95__proxy/comment_1_350a7c4834c9f422e107b646cdbae3b0._comment
@@ -0,0 +1,20 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2012-10-10T13:45:10Z"
+ content="""
+If your wiki configuration is written in YAML (it says IkiWiki::Setup::Yaml near the top), the correct syntax is something like
+
+ ENV:
+ http_proxy: http://proxy.uns.edu.ar:1280/
+ https_proxy: http://proxy.uns.edu.ar:1280/
+
+or
+
+ ENV: { http_proxy: 'http://proxy.uns.edu.ar:1280/', https_proxy: 'http://proxy.uns.edu.ar:1280/' }
+
+(many variations are possible, see <http://www.yaml.org/>).
+
+The syntax you quoted is correct for Perl-syntax setup files (which will mention IkiWiki::Setup::Standard near the top), but not YAML ones.
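+
+One way to check the syntax without doing a full rebuild is to round-trip the setup file (this loads it, so it fails early if the YAML doesn't parse):
+
+    ikiwiki --setup your.setup --dumpsetup /tmp/check.setup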
+"""]]
diff --git a/doc/forum/Setting_template_variable_from_config_file__63__.mdwn b/doc/forum/Setting_template_variable_from_config_file__63__.mdwn
new file mode 100644
index 000000000..ac7631e60
--- /dev/null
+++ b/doc/forum/Setting_template_variable_from_config_file__63__.mdwn
@@ -0,0 +1 @@
+Is it possible to set a template variable from the config file?
diff --git a/doc/forum/Setting_template_variable_from_config_file__63__/comment_1_bb4b5a7a49f33d660b5116fc0ce3c92d._comment b/doc/forum/Setting_template_variable_from_config_file__63__/comment_1_bb4b5a7a49f33d660b5116fc0ce3c92d._comment
new file mode 100644
index 000000000..6dddb1f21
--- /dev/null
+++ b/doc/forum/Setting_template_variable_from_config_file__63__/comment_1_bb4b5a7a49f33d660b5116fc0ce3c92d._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="202.173.183.92"
+ subject="comment 1"
+ date="2011-08-29T23:07:21Z"
+ content="""
+With the [[plugins/contrib/field]] plugin one can: set the `field_allow_config` config value to 1, and the config variables are accessible with a \"CONFIG-\" prefix. That is, if you set a value \"foo\" in the config file, then you would access it in the template as `<TMPL_VAR CONFIG-FOO>`.
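+
+For example (the value name is made up): with
+
+    field_allow_config: 1
+    foo: bar
+
+in a YAML setup file, `<TMPL_VAR CONFIG-FOO>` in the template expands to `bar`.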
+"""]]
diff --git a/doc/forum/Setting_up_a_development_environment.mdwn b/doc/forum/Setting_up_a_development_environment.mdwn
new file mode 100644
index 000000000..0b4e555c1
--- /dev/null
+++ b/doc/forum/Setting_up_a_development_environment.mdwn
@@ -0,0 +1,32 @@
+Hi,
+
+I'm trying to setup a development environment to hack on the comments plugin and I'm having problems getting my Ikiwiki CGI to use my git checkout as the libdir and templatedir instead of the system one.
+
+My <tt>.setup</tt> contains:
+
+ srcdir => '/home/francois/wiki/testblog',
+ destdir => '/var/www/testblog',
+ url => 'http://localhost/testblog',
+ cgiurl => 'http://localhost/testblog/ikiwiki.cgi',
+ cgi_wrapper => '/var/www/testblog/ikiwiki.cgi',
+ templatedir => '/home/francois/devel/remote/ikiwiki/templates',
+ underlaydir => '/home/francois/devel/remote/ikiwiki/doc',
+ libdir => '/home/francois/devel/remote/ikiwiki',
+ ENV => {},
+ git_wrapper => '/home/francois/wiki/testblog.git/hooks/post-update',
+
+Now, if I modify <tt>~/devel/remote/ikiwiki/templates/comment.tmpl</tt>, my changes don't appear when I add a comment to a blog post. On the other hand, if I hack <tt>/usr/share/ikiwiki/templates/comment.tmpl</tt> and cause the page to be rebuilt by adding a new comment then that does have an effect.
+
+The same is true for <tt>~/devel/remote/ikiwiki/Ikiwiki/Plugin/comments.pm</tt> (doesn't appear to be used) and <tt>/usr/share/perl5/Ikiwiki/Plugin/comments.pm</tt> (my hacks affect pages as they are recompiled).
+
+I must be missing something obvious, but the [[ikiwiki development environment tips]] didn't help me...
+
+Cheers,
+
+[[Francois|fmarier]]
+
+> I updated the [[ikiwiki development environment tips]] page with my
+> approach to running ikiwiki from the git checkout (with changes). For
+> the templates, also make sure that you do not have custom templates in
+> your src dir as they will be used instead of those from the template
+> dir if found. --GB
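+>
+> For reference, the rough shape of what I do (paths are examples, and
+> details may vary with your checkout):
+>
+>     cd ~/devel/remote/ikiwiki
+>     perl Makefile.PL
+>     make
+>     ./ikiwiki.out --libdir . --setup ~/wiki/testblog.setup --rebuild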
diff --git a/doc/forum/Should_files_in_.ikiwiki_be_committed_and_pushed__63__.mdwn b/doc/forum/Should_files_in_.ikiwiki_be_committed_and_pushed__63__.mdwn
new file mode 100644
index 000000000..f5b1d58d1
--- /dev/null
+++ b/doc/forum/Should_files_in_.ikiwiki_be_committed_and_pushed__63__.mdwn
@@ -0,0 +1,14 @@
+In my local working copy, I discovered a .ikiwiki directory with files and subdirectories. What are they for in a working copy? Should they be committed to the central repository (origin) on the remote server?
+
+ $ git status
+ # On branch master
+ # Changes to be committed:
+ # (use "git reset HEAD <file>..." to unstage)
+ #
+ # new file: .ikiwiki/commitlock
+ # new file: .ikiwiki/indexdb
+ # new file: .ikiwiki/lockfile
+ # new file: .ikiwiki/transient/recentchanges/change_0326ad7c7aa2c40b8db5d59033ecda7ed4d61295._change
+ <snip>
+
+Should these be committed and pushed?
diff --git a/doc/forum/Should_files_in_.ikiwiki_be_committed_and_pushed__63__/comment_1_8e65d7d8298e3c31d2a16446a71c8049._comment b/doc/forum/Should_files_in_.ikiwiki_be_committed_and_pushed__63__/comment_1_8e65d7d8298e3c31d2a16446a71c8049._comment
new file mode 100644
index 000000000..43e7a0068
--- /dev/null
+++ b/doc/forum/Should_files_in_.ikiwiki_be_committed_and_pushed__63__/comment_1_8e65d7d8298e3c31d2a16446a71c8049._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-12-22T16:12:42Z"
+ content="""
+Absolutely not. If you use ikiwiki-makerepo to set up ikiwiki, it makes a `.gitignore` containing `/.ikiwiki`.
+
+The `.ikiwiki` directory is where ikiwiki stores all the state it needs to track about a given build of a wiki.
+"""]]
diff --git a/doc/forum/Should_not_create_an_existing_page.mdwn b/doc/forum/Should_not_create_an_existing_page.mdwn
new file mode 100644
index 000000000..b9500757f
--- /dev/null
+++ b/doc/forum/Should_not_create_an_existing_page.mdwn
@@ -0,0 +1,15 @@
+[[!meta date="2007-01-08 14:55:31 +0000"]]
+
+This might be a bug, but will discuss it here first.
+Clicking on an old "?" link, or following a create link when Markdown content for the page now exists, should not go into "create" mode, but should do a regular "edit".
+
+> I believe that currently it does a redirect to the new static web page.
+> At least that's the intent of the code. --[[Joey]]
+
+>> Try at your site: `?page=discussion&from=index&do=create`
+>> It brings up an empty textarea to start a new webpage -- even though it already exists here. --reed
+
+>>> Ah, right. Notice that the resulting form allows saving the page as
+>>> discussion, or users/discussion, but not index/discussion, since this
+>>> page already exists. If all the pages existed, it would do the redirect
+>>> thing. --[[Joey]]
diff --git a/doc/forum/Sidebar_with_links__63__.mdwn b/doc/forum/Sidebar_with_links__63__.mdwn
new file mode 100644
index 000000000..790ee85a2
--- /dev/null
+++ b/doc/forum/Sidebar_with_links__63__.mdwn
@@ -0,0 +1,58 @@
+I'm trying to create a template to use as a sidebar with links. The template will be static
+(no variables are used). I first created a page with this directive: \[[!template id=sidebar]],
+and then created the template with the web interface.
+
+This is the code I put in the template:
+
+ <div class="infobox">
+ <ul>
+ <li>\[[Existing internal link|exists]]</li>
+ <li>\[[Non-existing internal link|doesnotexist]]</li>
+ <li>[External link](http://google.com/)</li>
+ </ul>
+ <http://google.com/>
+ </div>
+
+This is the relevant part of the resulting html file `template/sidebar.html`:
+
+ <div class="infobox">
+ <ul>
+ <li><a href="../exists.html">Existing internal link</a></li>
+ <li><span class="createlink"><a href="http://localhost/cgi-bin/itesohome.cgi?page=doesnotexist&amp;from=templates%2Fsidebar&amp;do=create" rel="nofollow">?</a>Non-existing internal link</span></li>
+ <li>[External link](http://google.com/)</li>
+ </ul>
+ </div>
+
+Note that the `<http://google.com/>` link has disappeared, and that `[External link](http://google.com/)`
+has been copied literally instead of being converted to a link, as I expected.
+
+> Templates aren't Markdown pages; only [[ikiwiki/WikiLink]]s are expanded. --[[Jogo]]
+
+>> Thanks for the help Jogo. Looking at the [[templates]] page, it says that
+"...you can include WikiLinks and all other forms of wiki markup in the template." I read this
to mean that a template may indeed include Markdown. Am I wrong in my interpretation? --[[buo]]
+
+>> I discovered that if I eliminate all html from my sidebar.mdwn template, the links are
+rendered properly. It seems that the mix of Markdown and html is confusing some part of
+Ikiwiki. --[[buo]]
+
+Worse, this is the relevant part of the html file of the page that includes the template:
+
+ <div class="infobox">
+ <ul>
+ <li><span class="selflink">Existing internal link</span></li>
+ <li><span class="createlink"><a href="http://localhost/cgi-bin/itesohome.cgi?page=doesnotexist&amp;from=research&amp;do=create" rel="nofollow">?</a>Non-existing internal link</span></li>
+ <li>[External link](http://google.com/)</li>
+ </ul>
+ </div>
+
+Note that the `Existing internal link` is no longer a link. It is only text.
+
+What am I doing wrong? Any help or pointers will be appreciated. --[[buo]]
+
+-----
+
+I think I have figured this out. I thought the template was filled and then
+processed to convert Markdown to html. Instead, the text in each variable is
+processed and then the template is filled. I somehow misunderstood the
+[[templates]] page. -- [[buo]]
diff --git a/doc/forum/Slow_ikiwiki_after_first_run.mdwn b/doc/forum/Slow_ikiwiki_after_first_run.mdwn
new file mode 100644
index 000000000..db07f6dc3
--- /dev/null
+++ b/doc/forum/Slow_ikiwiki_after_first_run.mdwn
@@ -0,0 +1 @@
+I have local ikiwiki on my notebook. When I save an edit the first time after booting and logging in, saving is very slow. Any idea how to fix this?
diff --git a/doc/forum/Spaces_in_wikilinks.mdwn b/doc/forum/Spaces_in_wikilinks.mdwn
new file mode 100644
index 000000000..9326ac448
--- /dev/null
+++ b/doc/forum/Spaces_in_wikilinks.mdwn
@@ -0,0 +1,104 @@
+[[!meta date="2007-07-02 13:21:29 +0000"]]
+
+# Spaces in WikiLinks?
+
+Hello Joey,
+
+I've just switched from ikiwiki 2.0 to ikiwiki 2.2 and I'm really surprised
+that I can't use the spaces in WikiLinks. Could you please tell me why the spaces
+aren't allowed in WikiLinks now?
+
+My best regards,
+
+--[[PaweB|ptecza]]
+
+> See [[bugs/Spaces_in_link_text_for_ikiwiki_links]]
+
+----
+
+# Build in OpenSolaris?
+
+Moved to [[bugs/build_in_opensolaris]] --[[Joey]]
+
+----
+
+# Various ways to use Subversion with ikiwiki
+
+I'm playing around with various ways that I can use subversion with ikiwiki.
+
+* Is it possible to have ikiwiki point to a subversion repository which is on a different server? The basic checkin/checkout functionality seems to work, but there doesn't seem to be any way to make the post-commit hook work for a non-local server.
+
+> This is difficult to do since ikiwiki's post-commit wrapper expects to
+> run on a machine that contains both the svn repository and the .ikiwiki
+> state directory. However, with recent versions of ikiwiki, you can get
+> away without running the post-commit wrapper on commit, and all you lose
+> is the ability to send commit notification emails.
+
+> (And now that [[recentchanges]] includes rss, you can just subscribe to
+> that, no need to worry about commit notification emails anymore.)
+
+* Is it possible / sensible to have ikiwiki share a subversion repository with other data (either completely unrelated files or another ikiwiki instance)? This works in part but again the post-commit hook seems problematic.
+
+--[[AdamShand]]
+
+> Sure, see ikiwiki's subversion repository for example of non-wiki files
+> in the same repo. If you have two wikis in one repository, you will need
+> to write a post-commit script that calls the post-commit wrappers for each
+> wiki.
+
+----
+
+# Regex for Valid Characters in Filenames
+
+I'm sure that this is documented somewhere but I've ransacked the wiki and I can't find it. :-( What are the allowed characters in an ikiwiki page name? I'm writing a simple script to make updating my blog easier and need to filter invalid characters (so far I've found that # and , aren't allowed ;-)). Thanks for any pointers. -- [[AdamShand]]
+
+> The default `wiki_file_regexp` matches filenames containing only
+> `[-[:alnum:]_.:/+]`
+>
+> The titlepage() function will convert freeform text to a valid
+> page name. See [[todo/should_use_a_standard_encoding_for_utf_chars_in_filenames]]
+> for an example. --[[Joey]]
+
+>> Perfect, thanks!
+>>
+>> In the end I decided that I didn't need any special characters in filenames and replaced everything but alphanumeric characters with underscores. In addition to replacing bad characters I also collapse multiple underscores into a single one, and strip off trailing and leading underscores to make tidy filenames. If it's useful to anybody else here's a sed example:
+>>
+>> # echo "++ Bad: ~@#$%^&*()_=}{[];,? Iki: +_-:./ Num: 65.5 ++" | sed -e 's/[^A-Za-z0-9_]/_/g' -e 's/__*/_/g' -e 's/^_//g' -e 's/_$//g'
+>> Bad_Iki_Num_65_5
+>>
+>>--[[AdamShand]]
+
+# Upgrade steps from RecentChanges CGI to static page?
+
+Where are the upgrade steps for RecentChanges change from CGI to static feed?
+I run multiple ikiwiki-powered sites on multiple servers, but today I just upgraded one to 2.32.3.
+Please have a look at
+<http://bsdwiki.reedmedia.net/wiki/recentchanges.html>
+Any suggestions?
+
+> There are no upgrade steps required. It does look like you need to enable
+> the meta plugin to get a good recentchanges page though.. --[[Joey]]
+
+# News site where articles are submitted and then reviewed before posting?
+
+I am considering moving a news site to Ikiwiki. I am hoping that Ikiwiki has a feature where anonymous posters can submit a form that moderators can review and then accept for it to be posted on a news webpage (like front page of the website).
+
+> Well, you can have one blog that contains unreviewed articles, and
+> moderators can then add a tag that makes the article show up in the main
+> news feed. There's nothing stopping someone submitting an article
+> pre-tagged though. If you absolutely need to lock that down, you could
+> have one blog with unreviewed articles in one subdirectory, and reviewers
+> then move the file over to another subdirectory when they're ready to
+> publish it. (This second subdirectory would be locked to prevent others
+> from writing to it.) --[[Joey]]
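+>
+> For example, with the lockedit plugin enabled, something like this in
+> the setup file would lock down the published directory (the path is an
+> example):
+>
+>     locked_pages => 'news/published/*',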
+
+Also it would be good if the news page would keep maybe just the latest 10 entries with links to an archive that make it easy to browse to old entries by date. (Could have over a thousand news articles.)
+
+> The inline plugin allows setting up things like this.
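+>
+> For example, something like (pagespec is an example):
+>
+>     \[[!inline pages="news/*" show=10]]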
+
+Plus, users should be able to post feedback on news items. If anonymous, it must be approved first. I'd prefer not to use the normal "wiki" editor for feedback.
+
+Any thoughts or examples on this? Any links to examples of news sites or blogs with outside feedback using ikiwiki?
+
+Thanks --[[JeremyReed]]
+
diff --git a/doc/forum/Split_a_wiki.mdwn b/doc/forum/Split_a_wiki.mdwn
new file mode 100644
index 000000000..f1c7e50da
--- /dev/null
+++ b/doc/forum/Split_a_wiki.mdwn
@@ -0,0 +1,21 @@
+Is it possible to split an ikiwiki (with git backend) in to two?
+
+Suppose I have an ikiwiki called myiki
+
+which contains the pages
+
+pageA1,pageA2,...,pageB1,pageB2,...
+
+now I want to have two wikis called myikiA and myikiB
+
+such that
+
+myikiA contains pageA1,pageA2,...
+
+The history of myikiA should contain the whole history of those pages but no history of pageB1,pageB2,...
+
+and
+
+myikiB contains pageB1,pageB2,...
+
+The history of myikiB should contain the whole history of those pages but no history of pageA1,pageA2,...
diff --git a/doc/forum/Split_a_wiki/comment_1_1599c26891b2071a2f1ca3fd90627fc4._comment b/doc/forum/Split_a_wiki/comment_1_1599c26891b2071a2f1ca3fd90627fc4._comment
new file mode 100644
index 000000000..66401914a
--- /dev/null
+++ b/doc/forum/Split_a_wiki/comment_1_1599c26891b2071a2f1ca3fd90627fc4._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="202.173.183.92"
+ subject="comment 1"
+ date="2011-12-03T19:45:06Z"
+ content="""
+The only thing I can think of is to clone the wiki, and for WikiA, delete PageB, and for WikiB, delete PageA. This won't remove the histories of those pages, but it will at least remove those pages.
+"""]]
diff --git a/doc/forum/Split_a_wiki/comment_2_1c54d3594f0350340f8dfb3e95c29ffd._comment b/doc/forum/Split_a_wiki/comment_2_1c54d3594f0350340f8dfb3e95c29ffd._comment
new file mode 100644
index 000000000..8040ad5e2
--- /dev/null
+++ b/doc/forum/Split_a_wiki/comment_2_1c54d3594f0350340f8dfb3e95c29ffd._comment
@@ -0,0 +1,20 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 2"
+ date="2011-12-04T13:37:27Z"
+ content="""
+If just deleting the unwanted pages is insufficient (e.g. perhaps they
+contain information that must not be made public), you can split a git
+repository (including ikiwiki repositories) with `git filter-branch` (see
+[this stackoverflow question](http://stackoverflow.com/questions/359424/detach-subdirectory-into-separate-git-repository),
+for instance).
+
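+A sketch of the subdirectory case from that answer (the directory name
+is an example, and exact flags may vary by git version):
+
+    git clone myiki myikiA
+    cd myikiA
+    git filter-branch --prune-empty --index-filter \
+      'git rm -r --cached --ignore-unmatch pageB-dir' -- --all
+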
+This preserves the history of each individual page, but rewrites the
+entire history of the repository (it re-does every commit, pretending
+that the addition of the omitted pages and every subsequent edit to
+them had never happened); it's like `git rebase` but more so.
+
+As a result, existing branches will no longer be able to push to the
+rewritten repository.
+"""]]
diff --git a/doc/forum/Split_a_wiki/comment_3_9eac1d1b93df27d849acc574b1f0f26d._comment b/doc/forum/Split_a_wiki/comment_3_9eac1d1b93df27d849acc574b1f0f26d._comment
new file mode 100644
index 000000000..e2dbd2546
--- /dev/null
+++ b/doc/forum/Split_a_wiki/comment_3_9eac1d1b93df27d849acc574b1f0f26d._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk_MMtLPS7osC5MjX00q2ATjvvXPWqm0ik"
+ nickname="micheal"
+ subject="comment 3"
+ date="2011-12-04T14:03:55Z"
+ content="""
+@smcv: Thanks, that looks promising. The example from stackoverflow is with subdirectories. What do I have to change to match a list of single files (pages) instead?
+"""]]
diff --git a/doc/forum/Split_a_wiki/comment_4_e193ba447c0188f72ba589180b5d529e._comment b/doc/forum/Split_a_wiki/comment_4_e193ba447c0188f72ba589180b5d529e._comment
new file mode 100644
index 000000000..450529697
--- /dev/null
+++ b/doc/forum/Split_a_wiki/comment_4_e193ba447c0188f72ba589180b5d529e._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk_MMtLPS7osC5MjX00q2ATjvvXPWqm0ik"
+ nickname="micheal"
+ subject="comment 4"
+ date="2011-12-09T09:21:05Z"
+ content="""
+I have put the question on stackoverflow: http://stackoverflow.com/questions/8443372/split-an-ikiwiki
+"""]]
diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN.mdwn b/doc/forum/TMPL__95__VAR_IS__95__ADMIN.mdwn
new file mode 100644
index 000000000..c8eec0bc9
--- /dev/null
+++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN.mdwn
@@ -0,0 +1 @@
+Could someone point out where I could implement a template variable? I would like an IS_ADMIN. I'm pretty sure I could do it but I'm sure someone has an opinion on where this might belong. I want to hide the edit links on my blog for non-admin users.
diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_1_3172568473e9b79ad7ab623afd19411a._comment b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_1_3172568473e9b79ad7ab623afd19411a._comment
new file mode 100644
index 000000000..7967ea6f0
--- /dev/null
+++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_1_3172568473e9b79ad7ab623afd19411a._comment
@@ -0,0 +1,13 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2011-02-23T11:14:14Z"
+ content="""
+This isn't possible without out-of-band mechanisms (Javascript or something).
+ikiwiki produces static HTML; template variables are evaluated when the HTML
+is compiled, and admins and non-admins see the exact same file.
+
+(More precisely, URLs containing `/ikiwiki.cgi/` are dynamically-generated
+pages; everything else is static.)
+"""]]
diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_2_4302d56a6fe68d17cc42d26e6f3566c2._comment b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_2_4302d56a6fe68d17cc42d26e6f3566c2._comment
new file mode 100644
index 000000000..5e34ab4f8
--- /dev/null
+++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_2_4302d56a6fe68d17cc42d26e6f3566c2._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 2"
+ date="2011-02-23T11:16:56Z"
+ content="""
+See [[bugs/logout in ikiwiki]] for discussion of a similar issue.
+"""]]
diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_3_4cc44e61b9c28a2d524fa874f115041a._comment b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_3_4cc44e61b9c28a2d524fa874f115041a._comment
new file mode 100644
index 000000000..e152091bc
--- /dev/null
+++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_3_4cc44e61b9c28a2d524fa874f115041a._comment
@@ -0,0 +1,14 @@
+[[!comment format=mdwn
+ username="justint"
+ ip="24.182.207.250"
+ subject="IS_ADMIN template wouldn't work"
+ date="2011-02-23T16:03:31Z"
+ content="""
+Ok, I think I get it. The template is used when IkiWiki is compiling the static pages, then users access the static pages. So at the time the templates are compiled there isn't a concept of who is accessing the page or what their session may be like (be it admin or anon or whatever).
+
+Is there a simple way to serve a different static page based on session information? Off the top of my head I would say no, but I thought I would ask. I suppose I could try to compile two static sites, one for me and one for the world. That would solve my IS_ADMIN problem, but I don't think it's a solution for similar types of things.
+
+I like ikiwiki for what it is, I get the feeling I may be asking it to do something it wasn't meant to do. If so I'd appreciate it if someone told me to stop trying. [[users/justint]]
+
+
+"""]]
diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_4_33143bad68f3f6beae963a3d0ec5d0bd._comment b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_4_33143bad68f3f6beae963a3d0ec5d0bd._comment
new file mode 100644
index 000000000..ab7370c5a
--- /dev/null
+++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_4_33143bad68f3f6beae963a3d0ec5d0bd._comment
@@ -0,0 +1,53 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 4"
+ date="2011-02-23T16:19:28Z"
+ content="""
+> ... at the time the templates are compiled there isn't a concept of who is accessing the page
+
+Yes, this is the problem with what you're asking for.
+
+> Is there a simple way to serve a different static page based on session information?
+
+No, the thing serving the static pages is your web server; IkiWiki isn't involved
+at all.
+
+> I suppose I could try to compile two static sites, one for me and one for the world
+
+I've done similar in the past with two setup files, under the same user ID, running
+different checkouts of the same git repository - one for me, on https with
+[[plugins/httpauth]], and one for the world, with only [[plugins/openid]]. You have
+to make them write their git wrappers to different filenames, and make the real
+git hook be a shell script that runs one wiki's wrapper, then the other, to refresh
+both wikis when something gets committed.
+
+It's a bit fiddly to admin (you have to duplicate most setup changes in the two
+setup files), but can be made to work. I've given up on that in favour of having
+a single wiki reachable from both http and https, with [[plugins/httpauth]]
+only working over https.
+
+> I get the feeling I may be asking it to do something it wasn't meant to do.
+
+Pretty much, yes.
+
+> If so I'd appreciate it if someone told me to stop trying.
+
+I can help! \"Stop trying.\" :-)
+
+But, if you want this functionality badly enough, one way you could get
+it would be to have all the links on all the pages (for the benefit of
+`NoScript` users), use Javascript to make an XMLHTTPRequest (or something)
+to a CGI action provided by a [[plugin|plugins/write]]
+(`ikiwiki.cgi?do=amiadminornot` or something), and if that says the user
+isn't an admin, hide some of the links to not confuse them.
+
+That would break the normal way that people log in to ikiwiki (by trying
+to do something that needs them logged-in, like editing), so you'd also
+want to add a \"Log In\" button or link (or just remember that editing your
+Preferences has the side-effect of logging you in).
+
+Note that hiding the links isn't useful for security, only for
+usability - the actual edit obviously needs to check whether the
+user is a logged-in admin, and it already does.
+"""]]
diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_5_ef790766456d723670f52cc9e3955e90._comment b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_5_ef790766456d723670f52cc9e3955e90._comment
new file mode 100644
index 000000000..5cbc0d206
--- /dev/null
+++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_5_ef790766456d723670f52cc9e3955e90._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="justint"
+ ip="24.182.207.250"
+ subject="easy money"
+ date="2011-02-23T17:47:35Z"
+ content="""
+I've done a plugin before, but not a CGI one yet; I can probably handle it though. I know a little Javascript, so I could probably manage that too.
+
+I'm fuzzy on the log-in bit; I don't know how to bring up a log-in page in IkiWiki (one that would just return to the calling page and not an edit page).
+
+If I were going to do this I'd want to have a log out button appear when the user is logged in. Is it possible to add a log out function to the same plugin?
+"""]]
diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_6_3db50264e01c8fad2e5567b5a9c7b6dc._comment b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_6_3db50264e01c8fad2e5567b5a9c7b6dc._comment
new file mode 100644
index 000000000..4d1c224b7
--- /dev/null
+++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_6_3db50264e01c8fad2e5567b5a9c7b6dc._comment
@@ -0,0 +1,23 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 6"
+ date="2011-02-23T18:08:37Z"
+ content="""
+> I'm fuzzy on the log in bit, I don't know how to bring up a log in page in IkiWiki
+
+Cheap hack: make a link to `cgiurl(do => prefs)` and the user will have
+to press Back a couple of times when they've logged in :-)
+
+Less-cheap hack: have a CGI plugin that responds to `do=login` by doing
+basically the same thing as `IkiWiki::needsignin`, but instead of
+returning to the `QUERY_STRING`, return to the HTTP referer, or
+a page whose name is passed in the query string, or some such.
+
+> If I were going to do this I'd want to have a log out button appear
+> when the user is logged in. Is it possible to add a log out function to the same plugin?
+
+I don't see why not; you could create it from Javascript for logged-in
+users only. That'd close the bug [[bugs/logout in ikiwiki]] (see that
+bug for related ideas).
+"""]]
diff --git a/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_7_bdc5c96022fdb8826b57d68a41ef6ca0._comment b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_7_bdc5c96022fdb8826b57d68a41ef6ca0._comment
new file mode 100644
index 000000000..79fd8516f
--- /dev/null
+++ b/doc/forum/TMPL__95__VAR_IS__95__ADMIN/comment_7_bdc5c96022fdb8826b57d68a41ef6ca0._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="justint"
+ ip="24.182.207.250"
+ subject="Continuing discussion..."
+ date="2011-02-24T02:59:04Z"
+ content="""
+Ok, I'll go over to the [[bugs/logout_in_ikiwiki]] page. Thank you for your help.
+"""]]
diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server..mdwn b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server..mdwn
new file mode 100644
index 000000000..4061a7348
--- /dev/null
+++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server..mdwn
@@ -0,0 +1,11 @@
+Hi,
+Installed ikiwiki on my Ubuntu (04.10) box; after creating a blog according to your setup instructions I cannot edit files on the web interface, and I get this error «The requested URL /~jean/blog/ikiwiki.cgi was not found on this server.»
+I have no idea what to do (sorry for my ignorance)
+
+tia,
+
+> Make sure you have a `~/public_html/ikiwiki.cgi`. Your setup
+> file should generate that via the `cgi_wrapper` option.
+>
+> Maybe you need to follow the [[tips/dot_cgi]] tip to make apache see it.
+> --[[Joey]]
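For reference, a sketch of the `ikiwiki.setup` fragment that generates the wrapper (the path is a hypothetical per-user layout, not necessarily Jean's actual one):

```perl
# Hypothetical path -- adjust to your own account.
# cgi_wrapper is where ikiwiki writes the compiled CGI wrapper;
# cgi_wrappermode is the mode the wrapper is installed with.
cgi_wrapper => '/home/jean/public_html/ikiwiki.cgi',
cgi_wrappermode => '06755',
```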
diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_1_d36ce6fab90e0a086ac84369af38d205._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_1_d36ce6fab90e0a086ac84369af38d205._comment
new file mode 100644
index 000000000..f95972c4f
--- /dev/null
+++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_1_d36ce6fab90e0a086ac84369af38d205._comment
@@ -0,0 +1,16 @@
+[[!comment format=mdwn
+ username="jeanm"
+ subject="comment 1"
+ date="2010-06-19T13:35:37Z"
+ content="""
+OK, I followed the dot cgi tip and this error disappeared, thanks a lot! So Ubuntu doesn't provide a \"working out of the box\" ikiwiki.
+
+But I get a new error message now when trying to edit a page:
+
+Error: \"do\" parameter missing
+
+My plugins now:
+
+    add_plugins => [qw{goodstuff websetup comments blogspam 404 muse}],
+
+
+"""]]
diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_2_5836bba08172d2ddf6a43da87ebb0332._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_2_5836bba08172d2ddf6a43da87ebb0332._comment
new file mode 100644
index 000000000..0a544eeb1
--- /dev/null
+++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_2_5836bba08172d2ddf6a43da87ebb0332._comment
@@ -0,0 +1,7 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ subject="do parameter missing"
+ date="2010-06-23T17:03:12Z"
+ content="""
+That's an unusual problem. Normally the url or form that calls ikiwiki.cgi includes a \"do\" parameter, like \"do=edit\". I'd have to see the site to debug why it is missing for you.
+"""]]
diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_3_4eec15c8c383275db5401c8e3c2d9242._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_3_4eec15c8c383275db5401c8e3c2d9242._comment
new file mode 100644
index 000000000..faf3ad31b
--- /dev/null
+++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_3_4eec15c8c383275db5401c8e3c2d9242._comment
@@ -0,0 +1,9 @@
+[[!comment format=mdwn
+ username="jeanm"
+ ip="81.56.145.104"
+ subject="do parameter missing"
+ date="2010-06-30T07:30:08Z"
+ content="""
+The site address is piaffer.org, with a link to the blog just above the picture.
+tia,
+"""]]
diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_4_43ac867621efb68affa6ae2b92740cad._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_4_43ac867621efb68affa6ae2b92740cad._comment
new file mode 100644
index 000000000..d8b516f5f
--- /dev/null
+++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_4_43ac867621efb68affa6ae2b92740cad._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 4"
+ date="2010-07-04T18:16:26Z"
+ content="""
+What is the muse plugin that you have enabled? I am not familiar with it.
+
+Apparently your ikiwiki is not seeing cgi parameters that should be passed to it. This appears to be some kind of web server misconfiguration, or possibly a broken ikiwiki wrapper or broken CGI.pm.
+"""]]
diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_5_e098723bb12adfb91ab561cae21b492b._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_5_e098723bb12adfb91ab561cae21b492b._comment
new file mode 100644
index 000000000..b832d64f4
--- /dev/null
+++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_5_e098723bb12adfb91ab561cae21b492b._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="do parameter missing"
+ date="2010-07-08T06:04:44Z"
+ content="""
+I just debugged this problem with someone else who was using nginx-fcgi. There was a problem with it not passing CGI environment variables properly. If you're using that, it might explain your problem.
+"""]]
diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_6_101183817ca4394890bd56a7694bedd9._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_6_101183817ca4394890bd56a7694bedd9._comment
new file mode 100644
index 000000000..25a4e8bae
--- /dev/null
+++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_6_101183817ca4394890bd56a7694bedd9._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="jeanm"
+ ip="81.56.145.104"
+ subject="comment 6"
+ date="2010-07-12T17:43:31Z"
+ content="""
+I'm using Apache and mostly Firefox.
+I've tried some changes in my config but still have the same problem; then I fell ill and was unable to try anything more. Now I seem to be better and I will get back to the problem soon.
+Thx
+"""]]
diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_7_2f514e6ba78d43d90e7ff4ae387e65e0._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_7_2f514e6ba78d43d90e7ff4ae387e65e0._comment
new file mode 100644
index 000000000..e35436e9d
--- /dev/null
+++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_7_2f514e6ba78d43d90e7ff4ae387e65e0._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawm_oOJLk8xnq4K5yRKzEoPPr9dMAFjiSi4"
+ nickname="Bayle"
+ subject="how to fix '&quot;do&quot; parameter missing' on nginx-fcgi?"
+ date="2011-10-31T05:57:43Z"
+ content="""
+I also get \"Error: \"do\" parameter missing\" and am using nginx with fcgi. What was the problem with the environment variables, and how should I fix it? Thanks,
+ bayle
+
+"""]]
diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_8_098bb7a3112751a7e6167483dde626bb._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_8_098bb7a3112751a7e6167483dde626bb._comment
new file mode 100644
index 000000000..ff0d79dbd
--- /dev/null
+++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_8_098bb7a3112751a7e6167483dde626bb._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://andrewspiers.pip.verisignlabs.com/"
+ ip="118.209.91.42"
+ subject="passing environment variables."
+ date="2012-08-24T03:47:07Z"
+ content="""
+I am getting this 'Error: \"do\" parameter missing' when trying to log in as well. I am using Apache and Firefox. The Apache error log says \"Died at /usr/share/perl5/IkiWiki/CGI.pm line 428.\" when it dies.
+
+I do have ssl set up, not sure if this is part of the problem?
+"""]]
diff --git a/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_9_fbf403255c38da93caa5b98589fbb285._comment b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_9_fbf403255c38da93caa5b98589fbb285._comment
new file mode 100644
index 000000000..53c44d14f
--- /dev/null
+++ b/doc/forum/The_requested_URL___47____126__jean__47__blog__47__ikiwiki.cgi_was_not_found_on_this_server./comment_9_fbf403255c38da93caa5b98589fbb285._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://andrewspiers.pip.verisignlabs.com/"
+ ip="118.209.91.42"
+ subject="comment 9"
+ date="2012-08-24T04:29:06Z"
+ content="""
+SSL was the problem: it was necessary to specify https:// in the url=> and cgiurl=> parameters in ikiwiki.setup; the redirect wasn't working otherwise.
+"""]]
diff --git a/doc/forum/Translating_ikiwiki_interface.mdwn b/doc/forum/Translating_ikiwiki_interface.mdwn
new file mode 100644
index 000000000..747af15b5
--- /dev/null
+++ b/doc/forum/Translating_ikiwiki_interface.mdwn
@@ -0,0 +1,8 @@
+I am using ikiwiki for a Spanish-language wiki. I've read the [[translation]] page and the [[plugins/po]] plugin page, but it is not completely clear to me. As I understand it, the po plugin is the recommended way to create translated versions of existing pages in your wiki based on a master language. But I actually don't need that, as I and the other users already edit the wiki in Spanish. What I would actually like is to have the ikiwiki interface itself translated into Spanish.
+Is it possible to have my wiki always appear in Spanish? I can see that the Debian package already includes po files for Spanish. How do I activate the Spanish translation permanently? Did I miss something obvious?
+
+> Ikiwiki has a Spanish translation of much of the program's output.
+> However, there is currently no translation of the page.tmpl and other
+> templates that are used to build your wiki. You can of course modify
+these and translate them yourself, but we have no way of maintaining
+> those translations in po files. --[[Joey]]
diff --git a/doc/forum/Upgrade_steps_from_RecentChanges_CGI_to_static_page.mdwn b/doc/forum/Upgrade_steps_from_RecentChanges_CGI_to_static_page.mdwn
new file mode 100644
index 000000000..298ff49f1
--- /dev/null
+++ b/doc/forum/Upgrade_steps_from_RecentChanges_CGI_to_static_page.mdwn
@@ -0,0 +1,10 @@
+Where are the upgrade steps for the RecentChanges change from CGI to a static page?
+I run multiple ikiwiki-powered sites on multiple servers, but today I just upgraded one to 2.32.3.
+Please have a look at
+<http://bsdwiki.reedmedia.net/wiki/recentchanges.html>
+Any suggestions?
+
+> There are no upgrade steps required. It does look like you need to enable
+> the meta plugin to get a good recentchanges page though.. --[[Joey]]
+
+[[!meta date="2008-02-23 21:10:42 -0500"]]
diff --git a/doc/forum/Various_ways_to_use_Subversion_with_ikiwiki.mdwn b/doc/forum/Various_ways_to_use_Subversion_with_ikiwiki.mdwn
new file mode 100644
index 000000000..8eed30cd8
--- /dev/null
+++ b/doc/forum/Various_ways_to_use_Subversion_with_ikiwiki.mdwn
@@ -0,0 +1,23 @@
+[[!meta date="2007-08-17 03:54:10 +0000"]]
+
+I'm playing around with various ways that I can use subversion with ikiwiki.
+
+* Is it possible to have ikiwiki point to a subversion repository which is on a different server? The basic checkin/checkout functionality seems to work but there doesn't seem to be any way to make the post-commit hook work for a non-local server?
+
+> This is difficult to do since ikiwiki's post-commit wrapper expects to
+> run on a machine that contains both the svn repository and the .ikiwiki
+> state directory. However, with recent versions of ikiwiki, you can get
+> away without running the post-commit wrapper on commit, and all you lose
+> is the ability to send commit notification emails.
+
+> (And now that [[recentchanges]] includes rss, you can just subscribe to
+> that, no need to worry about commit notification emails anymore.)
+
+* Is it possible / sensible to have ikiwiki share a subversion repository with other data (either completely unrelated files or another ikiwiki instance)? This works in part but again the post-commit hook seems problematic.
+
+--[[AdamShand]]
+
+> Sure, see ikiwiki's subversion repository for example of non-wiki files
+> in the same repo. If you have two wikis in one repository, you will need
+> to write a post-commit script that calls the post-commit wrappers for each
+> wiki. --[[Joey]]
diff --git a/doc/forum/What__39__s_the_difference_between_tag_and_taglink__63__.mdwn b/doc/forum/What__39__s_the_difference_between_tag_and_taglink__63__.mdwn
new file mode 100644
index 000000000..4fee07db4
--- /dev/null
+++ b/doc/forum/What__39__s_the_difference_between_tag_and_taglink__63__.mdwn
@@ -0,0 +1,9 @@
+It occurred to me that the difference between tag and taglink, as described in <http://ikiwiki.info/ikiwiki/directive/tag/>, is just that the latter enables the option to have a displayed form of the tag different from the tag itself, e.g. a tag `foo` can be displayed as `bar` using
+
+ \[[!taglink foo|bar]]
+
+while with tag you can only display the tag `foo` as itself
+
+ \[[!tag foo]]
+
+Is that it?
diff --git a/doc/forum/What__39__s_the_difference_between_tag_and_taglink__63__/comment_1_b3553d65d12af4c4a87f1f66f961c8d9._comment b/doc/forum/What__39__s_the_difference_between_tag_and_taglink__63__/comment_1_b3553d65d12af4c4a87f1f66f961c8d9._comment
new file mode 100644
index 000000000..239444516
--- /dev/null
+++ b/doc/forum/What__39__s_the_difference_between_tag_and_taglink__63__/comment_1_b3553d65d12af4c4a87f1f66f961c8d9._comment
@@ -0,0 +1,49 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2012-08-19T15:05:35Z"
+ content="""
+`\[[!tag]]` does not produce any output in the body of the page, but
+stores an invisible tag (which, in the default templates, gets displayed
+in the footer of the page).
+
+For instance, this
+
+ Here is some text about badgers
+ \[[!tag badger]]
+
+or this
+
+ \[[!tag badger]]
+ Here is some text about badgers
+
+or even this
+
+ Here is some text about \[[!tag badger]]badgers
+
+will all come out like this:
+
+ Edit | RecentChanges | etc.
+ ----
+ Here is some text about badgers
+ ----
+ tags: badger
+
+`\[[!taglink]]` produces a [[ikiwiki/WikiLink]] in the body of the
+page, *and* stores an invisible tag like `\[[!tag]]`.
+
+So this:
+
+ Some text about \[[!tag mushrooms]] and toadstools
+
+renders like this
+
+ Edit | RecentChanges | etc.
+ ----
+ Some text about _mushrooms_ and toadstools
+ ----
+ tags: mushrooms
+
+where `_mushrooms_` represents a hyperlink.
+"""]]
diff --git a/doc/forum/What_is_wrong_with_my_recentchange_page___63__.mdwn b/doc/forum/What_is_wrong_with_my_recentchange_page___63__.mdwn
new file mode 100644
index 000000000..4914cba59
--- /dev/null
+++ b/doc/forum/What_is_wrong_with_my_recentchange_page___63__.mdwn
@@ -0,0 +1,13 @@
+Hi again,
+
+I have finally finished my setup *but* I still have a problem with RecentChanges page.
+
+Can somebody check it for me at <http://maillard.mobi/~xma/wiki/recentchanges/> and tell me what is wrong?
+
+Thank you.
+
+--[[xma]]
+
+> Looks to me like you don't have the meta plugin enabled. --[[Joey]]
+
+> > You are right. Now all is ok. --[[xma]]
diff --git a/doc/forum/When_do_tags_like_a__47__b_get_listed_as_a__47__b_and_not_only_b__63__.mdwn b/doc/forum/When_do_tags_like_a__47__b_get_listed_as_a__47__b_and_not_only_b__63__.mdwn
new file mode 100644
index 000000000..42f470a8b
--- /dev/null
+++ b/doc/forum/When_do_tags_like_a__47__b_get_listed_as_a__47__b_and_not_only_b__63__.mdwn
@@ -0,0 +1,6 @@
+Hello,
+
+For example, the page [[plugins/tag|plugins/tag]] here is tagged type/link and type/tags, and the tags are listed exactly like that below the page's content. However, when I use tags like concept/getopt or lang/Perl on my private wiki, they only get listed as getopt and Perl. Is this behavior configurable, or was it first implemented in a version later than 3.20100815~bpo50+1 (which I'm stuck on ATM)?
+
+Greetings,
+ Mike Dornberger
diff --git a/doc/forum/When_do_tags_like_a__47__b_get_listed_as_a__47__b_and_not_only_b__63__/comment_1_cd5ea3aac8a59793ece5bf01a6190b53._comment b/doc/forum/When_do_tags_like_a__47__b_get_listed_as_a__47__b_and_not_only_b__63__/comment_1_cd5ea3aac8a59793ece5bf01a6190b53._comment
new file mode 100644
index 000000000..953a7141b
--- /dev/null
+++ b/doc/forum/When_do_tags_like_a__47__b_get_listed_as_a__47__b_and_not_only_b__63__/comment_1_cd5ea3aac8a59793ece5bf01a6190b53._comment
@@ -0,0 +1,9 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2012-04-19T17:32:18Z"
+ content="""
+I think this change was made in 2011, in commit a17469e3882f55bee93863c6e265b96b80ec9fef.
+
+"""]]
diff --git a/doc/forum/Wikilink_to_a_symbolic_link.mdwn b/doc/forum/Wikilink_to_a_symbolic_link.mdwn
new file mode 100644
index 000000000..69c39725c
--- /dev/null
+++ b/doc/forum/Wikilink_to_a_symbolic_link.mdwn
@@ -0,0 +1 @@
+If I want to make a link to a local PDF file I put the file into my srcdir and use the usual wikilink syntax. However, if I put only a symbolic link into the srcdir instead of the actual file, it doesn't work. Is there a way to make ikiwiki handle symbolic links in this situation?
diff --git a/doc/forum/Wikilink_to_a_symbolic_link/comment_1_e3ad5099491e0c84cd7729eba82ce552._comment b/doc/forum/Wikilink_to_a_symbolic_link/comment_1_e3ad5099491e0c84cd7729eba82ce552._comment
new file mode 100644
index 000000000..c47897d3c
--- /dev/null
+++ b/doc/forum/Wikilink_to_a_symbolic_link/comment_1_e3ad5099491e0c84cd7729eba82ce552._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="202.173.183.92"
+ subject="comment 1"
+ date="2011-09-27T21:00:16Z"
+ content="""
+No, because disallowing symbolic links is a security feature which can't be disabled.
+"""]]
diff --git a/doc/forum/Wikilink_to_a_symbolic_link/comment_2_46848020b1e3d0cd55bc1ec0ba382aad._comment b/doc/forum/Wikilink_to_a_symbolic_link/comment_2_46848020b1e3d0cd55bc1ec0ba382aad._comment
new file mode 100644
index 000000000..f8058f4fd
--- /dev/null
+++ b/doc/forum/Wikilink_to_a_symbolic_link/comment_2_46848020b1e3d0cd55bc1ec0ba382aad._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk_MMtLPS7osC5MjX00q2ATjvvXPWqm0ik"
+ nickname="micheal"
+ subject="comment 2"
+ date="2011-09-28T17:14:11Z"
+ content="""
+I use ikiwiki only locally for my personal use. So it wouldn't be a security issue for me. Perhaps there is some \"hack\" to achieve what I want?
+"""]]
diff --git a/doc/forum/Wikilink_to_section_of_a_wikipage.mdwn b/doc/forum/Wikilink_to_section_of_a_wikipage.mdwn
new file mode 100644
index 000000000..1a6c04e21
--- /dev/null
+++ b/doc/forum/Wikilink_to_section_of_a_wikipage.mdwn
@@ -0,0 +1 @@
+Is it possible to link directly to a specific section of another ikiwiki-page?
diff --git a/doc/forum/Wikilink_to_section_of_a_wikipage/comment_1_c1409a3c07dfc4ed7274560c962aba75._comment b/doc/forum/Wikilink_to_section_of_a_wikipage/comment_1_c1409a3c07dfc4ed7274560c962aba75._comment
new file mode 100644
index 000000000..ace6b66a8
--- /dev/null
+++ b/doc/forum/Wikilink_to_section_of_a_wikipage/comment_1_c1409a3c07dfc4ed7274560c962aba75._comment
@@ -0,0 +1,11 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-09-07T20:58:46Z"
+ content="""
+Quoting [[ikiwiki/wikilink]]:
+
+> To link to an anchor inside a page, you can use something like
+> `\[[WikiLink#foo]]` .
+"""]]
diff --git a/doc/forum/Wikilink_to_section_of_a_wikipage/comment_2_8a04eb7b0d7f17b9e5bb4cd04ba45871._comment b/doc/forum/Wikilink_to_section_of_a_wikipage/comment_2_8a04eb7b0d7f17b9e5bb4cd04ba45871._comment
new file mode 100644
index 000000000..1583dae2f
--- /dev/null
+++ b/doc/forum/Wikilink_to_section_of_a_wikipage/comment_2_8a04eb7b0d7f17b9e5bb4cd04ba45871._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk_MMtLPS7osC5MjX00q2ATjvvXPWqm0ik"
+ nickname="micheal"
+ subject="comment 2"
+ date="2011-09-11T12:00:08Z"
+ content="""
+Thanks!
+"""]]
diff --git a/doc/forum/Xapian_search:_empty_postlist_table.mdwn b/doc/forum/Xapian_search:_empty_postlist_table.mdwn
new file mode 100644
index 000000000..704f017ca
--- /dev/null
+++ b/doc/forum/Xapian_search:_empty_postlist_table.mdwn
@@ -0,0 +1,34 @@
+Hi,
+
+I'm trying to set up a personal wiki and I'm having trouble getting
+search to work. All my searches give zero results. I eventually
+figured out that the Xapian database that's being created has an empty
+postlist table. The position and termlist tables are all fine, and
+when I add new content to the wiki I can see the database is updated
+and the search terms are in the position table in plaintext. But I
+can't query for them, even using Xapian's command line tools.
+
+<pre>
+mexon:~/Test/.ikiwiki/xapian/default$ ls -l
+total 76
+-rw-rw-r-- 1 mexon mexon 0 Dec 16 15:56 flintlock
+-rw-rw-r-- 1 mexon mexon 28 Dec 16 15:55 iamchert
+-rw-rw-r-- 1 mexon mexon 13 Dec 16 15:55 position.baseA
+-rw-rw-r-- 1 mexon mexon 49152 Dec 16 15:55 position.DB
+-rw-rw-r-- 1 mexon mexon 13 Dec 16 15:55 postlist.baseA
+-rw-rw-r-- 1 mexon mexon 0 Dec 16 15:55 postlist.DB
+-rw-rw-r-- 1 mexon mexon 13 Dec 16 15:55 record.baseA
+-rw-rw-r-- 1 mexon mexon 0 Dec 16 15:55 record.DB
+-rw-rw-r-- 1 mexon mexon 13 Dec 16 15:55 termlist.baseA
+-rw-rw-r-- 1 mexon mexon 16384 Dec 16 15:55 termlist.DB
+mexon:~/Test/.ikiwiki/xapian/default$ delve -a .
+All terms in database:
+mexon:~/Test/.ikiwiki/xapian/default$
+</pre>
+
+I don't know how to debug from here. Clearly ikiwiki is doing
+something right when it's building the database, but one of the tables
+is missing. Can anyone guess what's wrong, or tell me where to start
+troubleshooting?
+
+I'm using CentOS 5. Xapian is version 1.2.5, ikiwiki version 3.20111107.
diff --git a/doc/forum/Xapian_search:_empty_postlist_table/comment_1_de9a7c94beec2707eda0924ca58be9df._comment b/doc/forum/Xapian_search:_empty_postlist_table/comment_1_de9a7c94beec2707eda0924ca58be9df._comment
new file mode 100644
index 000000000..23e539f06
--- /dev/null
+++ b/doc/forum/Xapian_search:_empty_postlist_table/comment_1_de9a7c94beec2707eda0924ca58be9df._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-12-16T18:51:16Z"
+ content="""
+Perhaps you should try a current version of Xapian; 1.2.7 works here. You don't say what version of the Xapian Perl module you have; 1.2.7.0 is working here. The \"postlist\" is an internal part of Xapian AFAICS, not something that has to be explicitly set up, and it gets populated here.
+"""]]
diff --git a/doc/forum/Xapian_search:_empty_postlist_table/comment_2_55f191e4b1306a318a30319f01802229._comment b/doc/forum/Xapian_search:_empty_postlist_table/comment_2_55f191e4b1306a318a30319f01802229._comment
new file mode 100644
index 000000000..41cdf3d4a
--- /dev/null
+++ b/doc/forum/Xapian_search:_empty_postlist_table/comment_2_55f191e4b1306a318a30319f01802229._comment
@@ -0,0 +1,15 @@
+[[!comment format=mdwn
+ username="https://me.yahoo.com/a/2d7oNP9wlop3PaHlGlGS1J2ppVqXf4zQAw--#17b9b"
+ nickname="Matthew"
+ subject="comment 2"
+ date="2011-12-17T09:10:38Z"
+ content="""
+I'm using RPMs to install Xapian packages, xapian-omega and xapian-bindings-perl, and they're all 1.2.5. I originally tried building and installing Xapian 1.2.7 from source, but found that ikiwiki failed like this:
+
+<pre>
+Use of inherited AUTOLOAD for non-method Search::Xapian::DB_CREATE_OR_OPEN() is deprecated at /home/mexon/system/Linux//lib/perl5/site_perl/5.8.8/IkiWiki/Plugin/search.pm line 220.
+Can't locate auto/Search/Xapian/DB_CREATE_O.al in @INC (@INC contains: /home/mat/.ikiwiki /home/mexon/system/Linux//lib/perl5/site_perl/5.8.8/i386-linux-thread-multi /home/mexon/system/Linux//lib/perl5/site_perl/5.8.8 /home/mexon/system/perl5lib /usr/lib/perl5/site_perl/5.8.8/i386-linux-thread-multi /usr/lib/perl5/site_perl/5.8.8 /usr/lib/perl5/site_perl /usr/lib/perl5/vendor_perl/5.8.8/i386-linux-thread-multi /usr/lib/perl5/vendor_perl/5.8.8 /usr/lib/perl5/vendor_perl /usr/lib/perl5/5.8.8/i386-linux-thread-multi /usr/lib/perl5/5.8.8 .) at /home/mexon/system/Linux//lib/perl5/site_perl/5.8.8/IkiWiki/Plugin/search.pm line 220
+</pre>
+
+Much fruitless googling later I found that there were these 1.2.5 RPMs lying around so I switched to those. If you know a solution to the DB_CREATE_O problem I could give 1.2.7 another go.
+"""]]
diff --git a/doc/forum/Xapian_search:_empty_postlist_table/comment_3_0bd424a89c3a52ff393a1e7e00c806be._comment b/doc/forum/Xapian_search:_empty_postlist_table/comment_3_0bd424a89c3a52ff393a1e7e00c806be._comment
new file mode 100644
index 000000000..05f9c874e
--- /dev/null
+++ b/doc/forum/Xapian_search:_empty_postlist_table/comment_3_0bd424a89c3a52ff393a1e7e00c806be._comment
@@ -0,0 +1,24 @@
+[[!comment format=mdwn
+ username="https://me.yahoo.com/a/2d7oNP9wlop3PaHlGlGS1J2ppVqXf4zQAw--#17b9b"
+ nickname="Matthew"
+ subject="comment 3"
+ date="2011-12-19T06:18:56Z"
+ content="""
+I had another go, this time with Xapian 1.2.8.0, and I finally got it working. The errors I was seeing earlier were because Xapian installs itself in /usr/local, but the default CentOS environment doesn't have /usr/local/lib in LD_LIBRARY_PATH. As usual, the problem and solution are obvious in hindsight; it's the error messages that make everything hard. It's much clearer if you run \"make test\" while building Search::Xapian:
+
+<pre>
+mexon:~/ikiwiki-temp/Search-Xapian-1.2.8.0$ make test
+PERL_DL_NONLAZY=1 /usr/bin/perl \"-MExtUtils::Command::MM\" \"-e\" \"test_harness(0, 'blib/lib', 'blib/arch')\" t/*.t
+t/01use...............
+# Failed test 'use Search::Xapian;'
+# in t/01use.t at line 3.
+# Tried to use 'Search::Xapian'.
+# Error: Can't load '/home/mexon/ikiwiki-temp/Search-Xapian-1.2.8.0/blib/arch/auto/Search/Xapian/Xapian.so' for module Search::Xapian: libxapian.so.22: cannot open shared object file: No such file or directory at /usr/lib/perl5/5.8.8/i386-linux-thread-multi/DynaLoader.pm line 230.
+# at (eval 3) line 2
+# Compilation failed in require at (eval 3) line 2.
+# BEGIN failed--compilation aborted at t/01use.t line 3.
+# Looks like you failed 1 test of 3.
+</pre>
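+
+For anyone hitting the same wall: the fix is just environment configuration, so the dynamic linker can find the freshly built library (the paths below are where a from-source Xapian typically installs; adjust to your system):
+
+<pre>
+# system-wide: tell the linker about /usr/local/lib, then refresh its cache
+echo '/usr/local/lib' > /etc/ld.so.conf.d/usr-local.conf
+ldconfig
+# or, just for the current shell session:
+export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
+</pre>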
+
+So yeah. Worth noting that Xapian 1.2.5 is apparently broken with ikiwiki. Maybe add some kind of warning?
+"""]]
diff --git a/doc/forum/Xapian_search:_empty_postlist_table/comment_4_40479ac2cfbca609f5f423e539a20ee0._comment b/doc/forum/Xapian_search:_empty_postlist_table/comment_4_40479ac2cfbca609f5f423e539a20ee0._comment
new file mode 100644
index 000000000..a120f976d
--- /dev/null
+++ b/doc/forum/Xapian_search:_empty_postlist_table/comment_4_40479ac2cfbca609f5f423e539a20ee0._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawlXywgEUJjKArnORJR-5hmNFv8bTraXO1Y"
+ nickname="Ramsey"
+ subject="The same issue is happening with me"
+ date="2013-03-06T13:10:15Z"
+ content="""
+I use ikiwiki version 3.20130212 and tried xapian versions 1.2.5, 1.2.8, 1.2.13. All to no avail. The postlist.DB file is empty for me. I think that is the crux of the problem. Does anybody know why this could be?
+"""]]
diff --git a/doc/forum/Xapian_search:_empty_postlist_table/comment_5_397443138da276e11c2e9b9fa7b51406._comment b/doc/forum/Xapian_search:_empty_postlist_table/comment_5_397443138da276e11c2e9b9fa7b51406._comment
new file mode 100644
index 000000000..56dd7c61d
--- /dev/null
+++ b/doc/forum/Xapian_search:_empty_postlist_table/comment_5_397443138da276e11c2e9b9fa7b51406._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawljJoWAYhI55qm4hRLdzIOQBNMVB6fgrs8"
+ nickname="Ramsey"
+ subject="comment 5"
+ date="2013-03-06T22:42:54Z"
+ content="""
+Tried Xapian version 1.2.3 and it did not work either. Can someone please help me debug this?
+"""]]
diff --git a/doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__.mdwn b/doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__.mdwn
new file mode 100644
index 000000000..61d612a69
--- /dev/null
+++ b/doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__.mdwn
@@ -0,0 +1,13 @@
+Is a *.html source file containing <script>...</script> supposed to work?
+
+I added a `foo.html` containing
+
+ <body><script type="text/javascript" src="http://friendfeed.com/embed/widget/..."></script></body>
+
+After a normal build, when I visit `http://foobar.com/foo/` it gives me a normal page with the header and footer text but an empty body:
+ <div id="pagebody">
+ <div id="content">
+ </div>
+ </div>
+
+Any ideas how this could/should work?
diff --git a/doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__/comment_1_953bd716373dcf51fa444ac098b7f971._comment b/doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__/comment_1_953bd716373dcf51fa444ac098b7f971._comment
new file mode 100644
index 000000000..e985b595c
--- /dev/null
+++ b/doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__/comment_1_953bd716373dcf51fa444ac098b7f971._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-05-21T23:37:23Z"
+ content="""
+Ikiwiki has a [[plugins/htmlscrubber]] that removes possibly insecure javascript (ie, all javascript) by default. It can be configured. Or you can use the [[plugins/rawhtml]] plugin if you want to include raw html in a site without ikiwiki touching it at all.
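+
+For example, in a setup file the scrubber can be told to skip certain pages via the `htmlscrubber_skip` [[ikiwiki/PageSpec]] (the pagespec below is only an illustration; pick one matching your own pages):
+
+<pre>
+# in your .setup file: don't scrub pages under widgets/,
+# but keep scrubbing comments on them
+htmlscrubber_skip => 'widgets/* and !comment(*)',
+</pre>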
+"""]]
diff --git a/doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__/comment_2_c7360852f9bf069f28c193373333c9a8._comment b/doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__/comment_2_c7360852f9bf069f28c193373333c9a8._comment
new file mode 100644
index 000000000..c5ac9a92f
--- /dev/null
+++ b/doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__/comment_2_c7360852f9bf069f28c193373333c9a8._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://profiles.google.com/lumeng.dev"
+ nickname="lumeng.dev"
+ subject="comment 2"
+ date="2011-05-22T01:22:42Z"
+ content="""
+Is it possible to have a section of `rawhtml` in a `foo.mdwn` file so I can mix `markdown` with `HTML` with `<script>...</script>`?
+"""]]
diff --git a/doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__/comment_3_6ffc30e27387366b48112198b66c01fa._comment b/doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__/comment_3_6ffc30e27387366b48112198b66c01fa._comment
new file mode 100644
index 000000000..44f82ed2c
--- /dev/null
+++ b/doc/forum/__42__.html_source_file_containing___60__script__62__...__60____47__script__62___not_working__63__/comment_3_6ffc30e27387366b48112198b66c01fa._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 3"
+ date="2011-05-22T17:27:32Z"
+ content="""
+No, if the htmlscrubber is enabled for that page it will still scrub it.
+"""]]
diff --git a/doc/forum/access_restrictions:_for_extranet.mdwn b/doc/forum/access_restrictions:_for_extranet.mdwn
new file mode 100644
index 000000000..66f0f7fea
--- /dev/null
+++ b/doc/forum/access_restrictions:_for_extranet.mdwn
@@ -0,0 +1,8 @@
+Hi folks,
+
+are there any plugins or best practices for creating a kind of extranet, i.e. just a few pages or namespaces with access restrictions?
+
+There is a [htaccess solution](http://www.branchable.com/forum/Read_access_restrictions/). That would be fine, but I wonder whether there are other solutions.
+
+greetz
+klml
diff --git a/doc/forum/access_restrictions:_for_extranet/comment_1_a0666c3c15661fb0fff70f313cd0d47d._comment b/doc/forum/access_restrictions:_for_extranet/comment_1_a0666c3c15661fb0fff70f313cd0d47d._comment
new file mode 100644
index 000000000..767fb7c03
--- /dev/null
+++ b/doc/forum/access_restrictions:_for_extranet/comment_1_a0666c3c15661fb0fff70f313cd0d47d._comment
@@ -0,0 +1,29 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2012-04-15T20:53:44Z"
+ content="""
+Read/view access and write/edit access are rather different.
+You can limit write access via wiki configuration, and even
+configure it over the web with [[plugins/websetup]].
+
+The only way to limit read access is to restrict access to the
+entire wiki via `.htaccess` or other web server configuration,
+preferably combined with use of `https`.
+IkiWiki can't limit read access to pages on its own[*],
+because it's a wiki compiler: when a page is viewed, the web
+server serves the compiled HTML without IkiWiki being involved.
+
+The best way to integrate access control into IkiWiki would
+probably be to have a CGI user interface for `.htaccess` or
+equivalent - but you'd still have to be careful, because,
+for instance, if a user can edit public pages, then they
+can insert a `\[[!include]]` directive to make the content
+of a private page public. As a result, the safest way to
+use it is to keep public and private information in
+separate wikis.
+
+[\*] strictly speaking, it *could* via a new plugin, but
+that would defeat many of its advantages
+"""]]
diff --git a/doc/forum/access_restrictions:_for_extranet/comment_2_563040aa099c9366dc5701eb4bc9c10d._comment b/doc/forum/access_restrictions:_for_extranet/comment_2_563040aa099c9366dc5701eb4bc9c10d._comment
new file mode 100644
index 000000000..75b9d49bc
--- /dev/null
+++ b/doc/forum/access_restrictions:_for_extranet/comment_2_563040aa099c9366dc5701eb4bc9c10d._comment
@@ -0,0 +1,20 @@
+[[!comment format=mdwn
+ username="klml"
+ ip="188.174.93.195"
+ subject="comment 2"
+ date="2012-04-16T19:57:20Z"
+ content="""
+hi smcv,
+
+> when a page is viewed, the web server serves the compiled HTML without IkiWiki being involved.
+
+yes you are right, but I still think it's a feature ;)
+
+> The best way to integrate access control into IkiWiki would probably be to have a CGI user interface for .htaccess or equivalent - but you'd still have to be careful, because, for instance, if a user can edit public pages, then they can insert a \[[!include]] directive to make the content of a private page public.
+
+My use case is a website with a small internal area; it's just for not-\"so public\" files, no private files. And I only have some trusted users.
+
+thx
+klml
+
+"""]]
diff --git a/doc/forum/an_alternative_approach_to_structured_data.mdwn b/doc/forum/an_alternative_approach_to_structured_data.mdwn
new file mode 100644
index 000000000..6e6af8adb
--- /dev/null
+++ b/doc/forum/an_alternative_approach_to_structured_data.mdwn
@@ -0,0 +1,63 @@
+## First Pass
+
+Looking at the discussion about [[todo/structured_page_data]], it looks a bit like folks are bogged down in figuring out what *markup* to use for structured page data, something I doubt that people will really agree on. And thus, little progress is made.
+
+I propose that, rather than worrying about what the data looks like, we take a similar approach
+to the way Revision Control Systems are used in ikiwiki: a front-end + back-end approach.
+The front-end would be a common interface, where queries are made about the structured data,
+and there would be any number of back-ends, which could use whatever markup or format that they desired.
+
+To that purpose, I've written the [[plugins/contrib/field]] plugin for a possible front-end.
+I called it "field" because each page could be considered a "record" where one could request the values of "fields" of that record.
+The idea is that back-end plugins would register functions which can be called when the value of a field is desired.
+
+This is gone into in more depth on the plugin page itself, but I would appreciate feedback and improvements on the approach.
+I think it could be really powerful and useful, especially if it becomes part of ikiwiki proper.
+
+--[[KathrynAndersen]]
+
+> It looks like an interesting idea. I don't have time right now to look at it in depth, but it looks interesting. -- [[Will]]
+
+> I agree such a separation makes some sense. But note that the discussion on [[todo/structured_page_data]]
+> talks about associating data types with fields for a good reason: It's hard to later develop a good UI for
+> querying or modifying a page's data if all the data has an implicit type "string". --[[Joey]]
+
+>> I'm not sure that having an implicit type of "string" is really such a bad thing. After all, Perl itself manages with just string and number, and easily converts from one to the other. Strong typing is generally used to (a) restrict what can be done with the data and/or (b) restrict how the data is input. The latter could be done with some sort of validated form, but that, too, could be decoupled from looking up and returning the value of a field. --[[KathrynAndersen]]
+
+## Second Pass
+
+I have written additional plugins which integrate with the [[plugins/contrib/field]] plugin to both set and get structured page data.
+
+* [[plugins/contrib/getfield]] - query field values inside a page using {{$*fieldname*}} markup
+* [[plugins/contrib/ftemplate]] - like [[plugins/template]] but uses "field" data as well as passed-in data
+* [[plugins/contrib/ymlfront]] - looks for YAML-format data at the front of a page; this is just one possible back-end for the structured data
+
+--[[KathrynAndersen]]
+
+> I'm not an IkiWiki committer ([[Joey]] is the only one, I think)
+> but I really like the look of this scheme. In particular,
+> having `getfield` interop with `field` without being *part of*
+> `field` makes me happy, since I'm not very keen on `getfield`'s
+> syntax (i.e. "ugh, yet another mini-markup-language without a
+> proper escaping mechanism"), but this way people can experiment
+> with different syntaxes while keeping `field` for the
+> behind-the-scenes bits.
+>
+>> I've started using `field` on a private site and it's working
+>> well for me; I'll try to do some code review on its
+>> [[plugins/contrib/field/discussion]] page. --s
+>
+> My [[plugins/contrib/album]] plugin could benefit from
+> integration with `field` for photos' captions and so on,
+> probably... I'll try to work on that at some point.
+>
+> [[plugins/contrib/report]] may be doing too much, though:
+> it seems to be a variation on `\[[inline archive="yes"]]`,
+> with an enhanced version of sorting, a mini version of
+> [[todo/wikitrails]], and some other misc. I suspect it could
+> usefully be divided up into discrete features? One good way
+> to do that might be to shuffle bits of its functionality into
+> the IkiWiki distribution and/or separate plugins, until there's
+> nothing left in `report` itself and it can just go away.
+>
+> --[[smcv]]
diff --git a/doc/forum/appear_if_you_are_login_or_not_in_a_page.mdwn b/doc/forum/appear_if_you_are_login_or_not_in_a_page.mdwn
new file mode 100644
index 000000000..be9854a08
--- /dev/null
+++ b/doc/forum/appear_if_you_are_login_or_not_in_a_page.mdwn
@@ -0,0 +1,36 @@
+Hi,
+
+Can you give me a hint for showing whether a user is logged in or not? If a user is logged in, I want to display the user name, as Wikipedia or DokuWiki do, for example.
+Regards,
+Xan.
+
+> ikiwiki doesn't serve pages, so this can't be done inside ikiwiki.
+> For certain kinds of authentication it might be possible anyway.
+> For instance, if you're using [[plugins/httpauth]] exclusively and
+> your server has PHP, you could put `<?php print("$REMOTE_USER");
+> ?>` in all the relevant ikiwiki [[templates]] and arrange for the
+> generated HTML pages to get run through the PHP interpreter. The trick
+> would work differently with other [[plugins/type/auth]] plugins,
+> if at all. --[[Schmonz]]
+
+>> Thanks a lot, Xan.
+
+>>> Another possible trick would be to use some Javascript to make a
+>>> "who am I?" AJAX request to the CGI (the CGI would receive the
+>>> session cookie, if any, and be able to answer). Obviously, this
+>>> wouldn't work for users who've disabled Javascript, but since it's
+>>> non-essential, that's not so bad. You'd need to
+>>> [[write_a_plugin|plugins/write]] to add a suitable CGI action,
+>>> perhaps ?do=whoami, and insert the Javascript. --[[smcv]]
+
+>>>> It's an idea, but you're trading off a serious speed hit for a very
+>>>> minor thing. --[[Joey]]
+
+>>>> Cool idea. A similar trick (I first saw it
+>>>> [here](http://www.peej.co.uk/articles/http-auth-with-html-forms.html))
+>>>> could be used to provide a [[plugins/passwordauth]]-like login form
+>>>> for [[plugins/httpauth]]. --[[Schmonz]]
+
+>>>>> I always assumed the entire reason someone might want to use the
+>>>>> httpauth plugin is to avoid nasty site-specific login forms..
+>>>>> --[[Joey]]
diff --git a/doc/forum/attachments_fail_to_upload.mdwn b/doc/forum/attachments_fail_to_upload.mdwn
new file mode 100644
index 000000000..62e363a16
--- /dev/null
+++ b/doc/forum/attachments_fail_to_upload.mdwn
@@ -0,0 +1,8 @@
+I am having a problem with ikiwiki on an armel processor based machine running 32 bit debian squeeze.
+I first installed the ikiwiki deb from the repos and realized there was a problem uploading images.
+I downloaded the latest version of ikiwiki from the git repo and made sure I had all of the necessary dependencies and libraries.
+Make doesn't seem to complain about anything being missing and make test passes fine. I can create a new wiki and edit pages but anytime I try to upload an image it fails.
+I have the attachment plugin activated. And I added mimetype(image/*) and maxsize(5000kb) to the PageSpec field, but that made no difference.
+I am able to successfully add images to the appropriate folders manually via the command line and then commit them to git, but I'd like to make it work through the web interface. Is there anything that I may have missed?
+
+Edit: I just noticed that if I save the page anyway after the javascript ui reports that the upload has failed, the file has in fact uploaded.
diff --git a/doc/forum/attachments_fail_to_upload/comment_1_577adde1dfa49463dfa8e169c462fc42._comment b/doc/forum/attachments_fail_to_upload/comment_1_577adde1dfa49463dfa8e169c462fc42._comment
new file mode 100644
index 000000000..7d2d66c14
--- /dev/null
+++ b/doc/forum/attachments_fail_to_upload/comment_1_577adde1dfa49463dfa8e169c462fc42._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2012-03-01T16:11:09Z"
+ content="""
+Saying \"it fails\" is not going to get the best help. If you can look in your web server's error.log file, or get an error message from somewhere else, you might get somewhere.
+
+You might also check if the machine is running out of memory. It's quite likely that a POSTed attachment is all buffered in the web server's memory before ikiwiki gets ahold of it.
+"""]]
diff --git a/doc/forum/attachments_fail_to_upload/comment_2_473f38c6d523496fac8dad13ac6d20c3._comment b/doc/forum/attachments_fail_to_upload/comment_2_473f38c6d523496fac8dad13ac6d20c3._comment
new file mode 100644
index 000000000..f491a9b71
--- /dev/null
+++ b/doc/forum/attachments_fail_to_upload/comment_2_473f38c6d523496fac8dad13ac6d20c3._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="jaime"
+ ip="201.141.41.68"
+ subject="comment 2"
+ date="2012-03-01T20:08:47Z"
+ content="""
+Sorry, \"failed\" is just the message ikiwiki's web interface returns. Nginx's error logs don't seem to register anything when the \"failure\" occurs. I am not sure how to properly monitor what is happening with the web server's memory at the time of uploading, but just watching htop I can see that ikiwiki begins to use 100% of the cpu until the process stops. There doesn't seem to be much impact on the overall memory usage; it seems to remain at about half of available memory.
+I'm sorry if that is not helpful. If you can give me some pointers on where to look for more detailed information I can follow instructions.
+
+
+
+"""]]
diff --git a/doc/forum/attachments_fail_to_upload/comment_3_799a2f1b7b259157e97fd31ec76fb845._comment b/doc/forum/attachments_fail_to_upload/comment_3_799a2f1b7b259157e97fd31ec76fb845._comment
new file mode 100644
index 000000000..ebf2756a4
--- /dev/null
+++ b/doc/forum/attachments_fail_to_upload/comment_3_799a2f1b7b259157e97fd31ec76fb845._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 3"
+ date="2012-03-03T14:54:10Z"
+ content="""
+What version are you running on squeeze? The version shipped with Debian stable does not have the javascript uploader.
+
+There was a recent problem involving filenames with unicode characters that broke the javascript uploader as you describe, which was fixed in a recent release.
+"""]]
diff --git a/doc/forum/attachments_fail_to_upload/comment_4_e37d1497acafd3fda547462f000636e3._comment b/doc/forum/attachments_fail_to_upload/comment_4_e37d1497acafd3fda547462f000636e3._comment
new file mode 100644
index 000000000..148c7b799
--- /dev/null
+++ b/doc/forum/attachments_fail_to_upload/comment_4_e37d1497acafd3fda547462f000636e3._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="jaime"
+ ip="201.141.91.196"
+ subject="ikiwiki version 3.20120203"
+ date="2012-03-05T19:52:02Z"
+ content="""
+I installed ikiwiki version 3.20120203 from source. Should I pull more recent changes from the repo?
+"""]]
diff --git a/doc/forum/attachments_fail_to_upload/comment_5_da03f9c4917cb1ef52de984b8ba86b68._comment b/doc/forum/attachments_fail_to_upload/comment_5_da03f9c4917cb1ef52de984b8ba86b68._comment
new file mode 100644
index 000000000..dbe0d6574
--- /dev/null
+++ b/doc/forum/attachments_fail_to_upload/comment_5_da03f9c4917cb1ef52de984b8ba86b68._comment
@@ -0,0 +1,11 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 5"
+ date="2012-03-05T20:48:50Z"
+ content="""
+Your version already contains the unicode fix, which was commit 1572c3c376df36ea09e27a1ea437e3a75cdf0f84.
+
+I think it's possible that the javascript file upload widget is timing out waiting for a response from ikiwiki when uploading the file. Since this is a slow CPU, it might exceed some limit in that code. At this point all I know is that the javascript file upload widget is setting an error flag, which is displayed as \"failed!\" in red. The next step is probably to get an http protocol analyzer like firebug and see what, if anything, is being returned by the ikiwiki.cgi when the attachment is uploaded to it -- it should return some JSON with a `stored_msg` field.
+
+"""]]
diff --git a/doc/forum/attachments_fail_to_upload/comment_6_04498946a300ddb652dec73c2950f48f._comment b/doc/forum/attachments_fail_to_upload/comment_6_04498946a300ddb652dec73c2950f48f._comment
new file mode 100644
index 000000000..877050eb5
--- /dev/null
+++ b/doc/forum/attachments_fail_to_upload/comment_6_04498946a300ddb652dec73c2950f48f._comment
@@ -0,0 +1,19 @@
+[[!comment format=mdwn
+ username="jaime"
+ ip="201.141.54.196"
+ subject="comment 6"
+ date="2012-03-08T01:34:57Z"
+ content="""
+Ok... figured out how to use firebug, started the profile, and tried uploading an image. POST http://myserver/ikiwiki.cgi immediately turns red with a little X as I get the javascript \"failed\" message in the ui. In the post tab of firebug, halfway through the binary content of the png I can see the message \"... Firebug request size limit has been reached by Firebug. ... \"
+
+So next I try uploading a tiny 3k image. This time the post completes and I can see \"Error: Can't locate JSON.pm in @INC\" in the output. A bit of googling tells me I need to install the libjson-perl package. Done.
+
+I try and upload the tiny 3k image again. This time it works. :)
+I try and upload a 9k image and the POST just dies just like before with the \"... Firebug request size limit has been reached by Firebug. ... \" in the post tab.
+
+So I tried changing the extensions.firebug.netDisplayedPostBodyLimit variable in firefox to see if that would allow me to get more info. Now I don't get the request size limit message, but the post still doesn't get anything back.
+
+I decided to try some other http protocol analyzers. Firefox 10's internal web developer tools don't give me any more info.
+Next I tried HttpFox and the only thing I get back is this...
+Error loading content (NS_ERROR_DOCUMENT_NOT_CACHED)
+"""]]
diff --git a/doc/forum/bashman.mdwn b/doc/forum/bashman.mdwn
new file mode 100644
index 000000000..32c006bfb
--- /dev/null
+++ b/doc/forum/bashman.mdwn
@@ -0,0 +1,7 @@
+ [[!teximg code="\{}_pF_q(a_1,...,a_p;c_1,...,c_q;z) = \sum_{n=0}^\infty \frac{(a_1)_n\cdot\cdot\cdot(a_p)_n}{(c_1)_n\cdot\cdot\cdot(c_q)_n} \frac{z^n}{n!}"]]
+
+Haha, the teximg is not loaded... :-(
+
+Bye.
+
+> This wiki does not have teximg enabled. --[[Joey]]
diff --git a/doc/forum/build_error:_Cannot_decode_string_with_wide_characters.mdwn b/doc/forum/build_error:_Cannot_decode_string_with_wide_characters.mdwn
new file mode 100644
index 000000000..2e5ac7e6e
--- /dev/null
+++ b/doc/forum/build_error:_Cannot_decode_string_with_wide_characters.mdwn
@@ -0,0 +1,12 @@
+ $ ikiwiki -setup mywiki.setup
+ generating wrappers..
+ rebuilding wiki..
+ Cannot decode string with wide characters at /opt/local/lib/perl5/5.12.3/darwin-multi-2level/Encode.pm line 175.
+
+I am running Mac OS X 10.6.8
+
+ $ ikiwiki --version
+ ikiwiki version 3.20110608
+ $ perl --version
+
+ This is perl 5, version 12, subversion 3 (v5.12.3) built for darwin-multi-2level
diff --git a/doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_1_83fbb415dd3ae6a19ed5ea5f82065c28._comment b/doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_1_83fbb415dd3ae6a19ed5ea5f82065c28._comment
new file mode 100644
index 000000000..d1b555b2a
--- /dev/null
+++ b/doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_1_83fbb415dd3ae6a19ed5ea5f82065c28._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2012-03-13T03:57:42Z"
+ content="""
+The problem could be your system's locale setting. Perhaps LANG is not set to a utf-8 capable locale.
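+
+For instance, setting a UTF-8 locale before rebuilding may help (en_US.UTF-8 is just an example; any UTF-8 locale installed on the system should do):
+
+<pre>
+export LANG=en_US.UTF-8
+export LC_ALL=en_US.UTF-8
+ikiwiki -setup mywiki.setup
+</pre>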
+"""]]
diff --git a/doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_2_d258536c98538d4744f66eb3132439a9._comment b/doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_2_d258536c98538d4744f66eb3132439a9._comment
new file mode 100644
index 000000000..28222618d
--- /dev/null
+++ b/doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_2_d258536c98538d4744f66eb3132439a9._comment
@@ -0,0 +1,20 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawmKyeW2G4jjSdnL1m6kPPtAiGFUBsnYCfY"
+ nickname="FName"
+ subject="comment 2"
+ date="2012-03-13T04:43:25Z"
+ content="""
+ $ locale
+ LANG=\"en_US.UTF-8\"
+ LC_COLLATE=\"en_US.UTF-8\"
+ LC_CTYPE=\"en_US.UTF-8\"
+ LC_MESSAGES=\"en_US.UTF-8\"
+ LC_MONETARY=\"en_US.UTF-8\"
+ LC_NUMERIC=\"en_US.UTF-8\"
+ LC_TIME=\"en_US.UTF-8\"
+ LC_ALL=
+ $ uname -a
+ Darwin x4430 10.8.0 Darwin Kernel Version 10.8.0: Tue Jun 7 16:33:36 PDT 2011; root:xnu-1504.15.3~1/RELEASE_I386 i386
+
+Does it look OK?
+"""]]
diff --git a/doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_3_d62173d0ae220ab7b063631952856587._comment b/doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_3_d62173d0ae220ab7b063631952856587._comment
new file mode 100644
index 000000000..8dc2f9851
--- /dev/null
+++ b/doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_3_d62173d0ae220ab7b063631952856587._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 3"
+ date="2012-03-16T20:33:20Z"
+ content="""
+The locale settings look ok.
+
+I'd try upgrading your perl. 5.12.3 is rather old, and the code that is failing is part of perl.
+"""]]
diff --git a/doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_4_d5d0174e09a94359c23fd9c006a22bbc._comment b/doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_4_d5d0174e09a94359c23fd9c006a22bbc._comment
new file mode 100644
index 000000000..57c99bee9
--- /dev/null
+++ b/doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_4_d5d0174e09a94359c23fd9c006a22bbc._comment
@@ -0,0 +1,50 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawmKyeW2G4jjSdnL1m6kPPtAiGFUBsnYCfY"
+ nickname="FName"
+ subject="Still can't use ikiwiki on Mac OS X"
+ date="2012-10-21T17:12:15Z"
+ content="""
+I'm still not able to use Ikiwiki on Mac:
+
+ $ ikiwiki --setup ./web.setup
+ generating wrappers..
+ rebuilding wiki..
+ Cannot decode string with wide characters at /opt/local/lib/perl5/5.12.4/darwin-thread-multi-2level/Encode.pm line 174.
+
+
+ $ ls -la /opt/local/bin/perl*
+ lrwxr-xr-x 1 root admin 20 Oct 21 12:06 /opt/local/bin/perl -> /opt/local/bin/perl5
+ lrwxr-xr-x 1 root admin 23 Oct 21 12:06 /opt/local/bin/perl5 -> /opt/local/bin/perl5.12
+ -rwxr-xr-x 1 root admin 9896 Jun 26 01:39 /opt/local/bin/perl5.12
+ lrwxr-xr-x 1 root admin 8 Jun 26 01:39 /opt/local/bin/perl5.12.4 -> perl5.12
+ -rwxr-xr-x 1 root admin 10000 Jun 26 01:55 /opt/local/bin/perl5.14
+ lrwxr-xr-x 1 root admin 8 Jun 26 01:56 /opt/local/bin/perl5.14.2 -> perl5.14
+ -rwxr-xr-x 1 root admin 10000 Aug 23 13:41 /opt/local/bin/perl5.16
+ lrwxr-xr-x 1 root admin 8 Aug 23 13:42 /opt/local/bin/perl5.16.1 -> perl5.16
+ lrwxr-xr-x 1 root admin 12 Oct 21 11:44 /opt/local/bin/perlbug -> perlbug-5.16
+ -rwxr-xr-x 2 root admin 45815 Jun 26 01:39 /opt/local/bin/perlbug-5.12
+ -rwxr-xr-x 2 root admin 45203 Jun 26 01:55 /opt/local/bin/perlbug-5.14
+ -rwxr-xr-x 2 root admin 41712 Aug 23 13:41 /opt/local/bin/perlbug-5.16
+ lrwxr-xr-x 1 root admin 12 Oct 21 11:44 /opt/local/bin/perldoc -> perldoc-5.16
+ -rwxr-xr-x 1 root admin 244 Jun 26 01:39 /opt/local/bin/perldoc-5.12
+ -rwxr-xr-x 1 root admin 244 Jun 26 01:55 /opt/local/bin/perldoc-5.14
+ -rwxr-xr-x 1 root admin 244 Aug 23 13:41 /opt/local/bin/perldoc-5.16
+ lrwxr-xr-x 1 root admin 12 Oct 21 11:44 /opt/local/bin/perlivp -> perlivp-5.16
+ -rwxr-xr-x 1 root admin 12484 Jun 26 01:39 /opt/local/bin/perlivp-5.12
+ -rwxr-xr-x 1 root admin 12297 Jun 26 01:55 /opt/local/bin/perlivp-5.14
+ -rwxr-xr-x 1 root admin 10802 Aug 23 13:41 /opt/local/bin/perlivp-5.16
+ lrwxr-xr-x 1 root admin 15 Oct 21 11:44 /opt/local/bin/perlthanks -> perlthanks-5.16
+ -rwxr-xr-x 2 root admin 45815 Jun 26 01:39 /opt/local/bin/perlthanks-5.12
+ -rwxr-xr-x 2 root admin 45203 Jun 26 01:55 /opt/local/bin/perlthanks-5.14
+ -rwxr-xr-x 2 root admin 41712 Aug 23 13:41 /opt/local/bin/perlthanks-5.16
+
+
+If I simply relink `/opt/local/bin/perl` to a newer version of perl, such as `/opt/local/bin/perl5.16`, it still doesn't work; it seems that
+
+    $ ikiwiki -version
+    ikiwiki version 3.20110608
+
+simply forces the use of perl5.12.
+
+
+"""]]
diff --git a/doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_5_e652027a8f90ebef6f21613b5784ded2._comment b/doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_5_e652027a8f90ebef6f21613b5784ded2._comment
new file mode 100644
index 000000000..08bde8c85
--- /dev/null
+++ b/doc/forum/build_error:_Cannot_decode_string_with_wide_characters/comment_5_e652027a8f90ebef6f21613b5784ded2._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnxp2XU8gIribhhGhGuYtU6eMMwHv5gUGI"
+ nickname="Amitai"
+ subject="may I recommend pkgsrc?"
+ date="2012-10-22T03:50:56Z"
+ content="""
+Looks like the MacPorts ikiwiki package is old. I use ikiwiki from pkgsrc as mentioned in [[tips/ikiwiki_on_mac_os_x]]. I also maintain the package, so it's updated regularly.
+"""]]
diff --git a/doc/forum/chinese_character_problem.mdwn b/doc/forum/chinese_character_problem.mdwn
new file mode 100644
index 000000000..aea55703f
--- /dev/null
+++ b/doc/forum/chinese_character_problem.mdwn
@@ -0,0 +1,21 @@
+I just finished setting up ikiwiki.
+
+I can type Chinese, save, and display it correctly in ikiwiki the first time. However, when I try to edit the page again, the Chinese characters in the form are unrecognizable. You can see it here: <http://ikiwiki.perlchina.org/>
+
+I am using the latest ikiwiki (manually installed as a non-root user) and CGI::FormBuilder (3.0501) on Debian 4.0.
+
+这个没问题 ("this is no problem"): it is not a problem on the ikiwiki website, though.
+
+Thanks.
+
+
+> Is your system perhaps not configured with a utf-8 default locale? Or is ikiwiki not configured to use it?
+> Make sure that some utf-8 locale is enabled (in /etc/locale.gen on Debian, for example) and try setting `locale` in your ikiwiki setup file. --[[Joey]]
+
+I have installed locales-all, and `locale -a` shows that zh_CN.UTF-8 is installed (there is no /etc/locale.gen file, though). Then I enabled the line "locale => 'zh_CN.UTF-8'" in my wiki setup and ran --setup again, but that generated lots of error messages: "Missing constant domain at (eval 30) line 3".
+
+Sorry for being a n00b at this; what else can I do?
+
+> See [[bugs/Missing_constant_domain_at_IkiWiki.pm_line_842]].
+> Looks like you need to upgrade to a newer version of
+> [[!cpan Locale::gettext]] --[[Joey]]
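For reference, the fix discussed above boils down to two configuration steps (a sketch using Debian paths; the locale name must match what `locale -a` reports on your system):

```perl
# 1. Enable the locale system-wide (Debian): add this line to
#    /etc/locale.gen and run `locale-gen` as root:
#        zh_CN.UTF-8 UTF-8
# 2. Point ikiwiki at it in the .setup file:
locale => 'zh_CN.UTF-8',
```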
diff --git a/doc/forum/cleaning_up_discussion_pages_and_the_like.mdwn b/doc/forum/cleaning_up_discussion_pages_and_the_like.mdwn
new file mode 100644
index 000000000..35ceae59b
--- /dev/null
+++ b/doc/forum/cleaning_up_discussion_pages_and_the_like.mdwn
@@ -0,0 +1,11 @@
+For example in [[forum/ikiwiki__39__s_notion_of_time]], should one remove the
+text about the implementation bug that has been fixed, or should it stay there,
+for reference? --[[tschwinge]]
+
+> I have no problem with cleaning up obsolete stuff in the forum, tips, etc.
+> --[[Joey]]
+
+That's also what I think: such discussions or comments on [[forum]] discussion
+pages, or generally on all pages' [[Discussion]] subpages, can be removed if
+either they're simply not valid / interesting / ... anymore, or if they've been
+used to improve the *real* documentation. --[[tschwinge]]
diff --git a/doc/forum/copyright_and_license_template_variables___40__where_are_they_set__63____41__.mdwn b/doc/forum/copyright_and_license_template_variables___40__where_are_they_set__63____41__.mdwn
new file mode 100644
index 000000000..afca582fd
--- /dev/null
+++ b/doc/forum/copyright_and_license_template_variables___40__where_are_they_set__63____41__.mdwn
@@ -0,0 +1,13 @@
+The default template includes TMPL_IF LICENSE and TMPL_IF COPYRIGHT, but I can't figure out where these are set.
+
+This page seems to indicate they are created by the [[plugins/meta]] plugin:
+
+[[Default Content for Copyright and License|plugins/contrib/default_content_for___42__copyright__42___and___42__license__42__/]]
+
+Is this true? It just seems a little odd that the default template contains variables that are set by a non-default plugin, so I just wanted to confirm that.
+
+Thanks!
+
+--[[users/acodispo]]
+
+> It is true. --[[Joey]]
diff --git a/doc/forum/create_download_link.mdwn b/doc/forum/create_download_link.mdwn
new file mode 100644
index 000000000..44899b118
--- /dev/null
+++ b/doc/forum/create_download_link.mdwn
@@ -0,0 +1,4 @@
+Thank you very much for ikiwiki. I successfully started using it.
+Now I have a simple question that I am not able to answer myself at the moment:
+
+In the directory structure of ikiwiki I placed a file (in this case a video). In an ikiwiki page I would like to create a link to this file, so that the user can download it. Of course I can do this using a link to the absolute URL, including the hostname and the full path. But when I move the wiki to another host, that will no longer be valid. Hence my question: is there a way to automatically create a link to a file by specifying the destination relatively? Say I have a directory foo with a page foo/bar.mdwn and a folder foo/downloads with a file foo/downloads/video.mp4; I would like to refer to this file using "downloads/video.mp4" or something similar. Is there such a possibility?
diff --git a/doc/forum/create_download_link/comment_1_4797493157c569f8893b53b5e5a58e73._comment b/doc/forum/create_download_link/comment_1_4797493157c569f8893b53b5e5a58e73._comment
new file mode 100644
index 000000000..cec5fd6bb
--- /dev/null
+++ b/doc/forum/create_download_link/comment_1_4797493157c569f8893b53b5e5a58e73._comment
@@ -0,0 +1,14 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2011-09-11T10:20:14Z"
+ content="""
+You could use a relative URL in a standard [[ikiwiki/MarkDown]] link:
+
+ [download the video](../downloads/video.mp4)
+
+or an ikiwiki [[ikiwiki/WikiLink]] to the file (just like you would for a page):
+
+ \[[download the video|foo/bar/downloads/video.mp4]]
+"""]]
diff --git a/doc/forum/cutpaste.pm_not_only_file-local.mdwn b/doc/forum/cutpaste.pm_not_only_file-local.mdwn
new file mode 100644
index 000000000..0c5221cc9
--- /dev/null
+++ b/doc/forum/cutpaste.pm_not_only_file-local.mdwn
@@ -0,0 +1,14 @@
+I'd like to use the cutpaste plugin, but not only on a file-local basis: fileA
+has \[[!cut id=foo text="foo"]], and fileB does \[[!absorb pagenames=fileA]],
+and can then use \[[!paste id=foo]].
+
+Therefore, I've written an [*absorb* directive /
+plugin](http://schwinge.homeip.net/~thomas/tmp/absorb.pm), which is meant to
+absorb pages in order to get hold of their *cut* and *copy* directives'
+contents. This does work as expected. But it also absorbs page fileA's *meta*
+values, like a *meta title*, etc. How to avoid / solve this?
+
+Alternatively, do you have a better suggestion about how to achieve what I
+described in the first paragraph?
+
+--[[tschwinge]]
diff --git a/doc/forum/cutpaste.pm_not_only_file-local/comment_1_497c62f21fd1b87625b806407c72dbad._comment b/doc/forum/cutpaste.pm_not_only_file-local/comment_1_497c62f21fd1b87625b806407c72dbad._comment
new file mode 100644
index 000000000..8cc724a72
--- /dev/null
+++ b/doc/forum/cutpaste.pm_not_only_file-local/comment_1_497c62f21fd1b87625b806407c72dbad._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="60.241.8.244"
+ subject="field and getfield and ymlfront"
+ date="2010-08-12T02:33:54Z"
+ content="""
+Have you considered trying the [[plugins/contrib/field]] plugin, and its associated plugins? [[plugins/contrib/ymlfront]] can give you the source (\"cut\") and [[plugins/contrib/getfield]] and/or [[plugins/contrib/report]] can get you the value (\"paste\") including the values from other pages.
+"""]]
diff --git a/doc/forum/debian_backports_update_someone_please.mdwn b/doc/forum/debian_backports_update_someone_please.mdwn
new file mode 100644
index 000000000..7102d12a1
--- /dev/null
+++ b/doc/forum/debian_backports_update_someone_please.mdwn
@@ -0,0 +1,18 @@
+I'm just in the process of deploying ikiwiki and I'd love to use it in html5 mode instead of XHTML. Any chance that ikiwiki's .deb in Debian backports will be updated any time soon?
+
+> Formerer does a good job keeping the backport up-to-date with whatever is in Debian testing.
+> Which is the policy of what Backports should contain. So, I just need to stop releasing ikiwiki
+> for 2 weeks. :) --[[Joey]]
+
+>> And are there any chances of you doing it... or rather not doing it?
+
+>>> Sure, I'm busily not doing it right now. Should reach testing in 3
+>>> days. I generally schedule things so a new ikiwiki reaches testing
+>>> every 2 weeks to month. Getting important new features and bugfixes out
+>>> can take priority though. --[[Joey]]
+
+>>>> Great! Thanks.
+
+>>>> Still not available in the backports; did you break the silence on the wire and get back to work, [[Joey]]?
+
+>>>> I was blinded by my stupidity... thanks!
diff --git a/doc/forum/discussion.mdwn b/doc/forum/discussion.mdwn
new file mode 100644
index 000000000..93cf4656e
--- /dev/null
+++ b/doc/forum/discussion.mdwn
@@ -0,0 +1,7 @@
+I like the idea of this forum hierarchy -- but I think a map would be clearer than inlining the sub-pages. -- [[users/Jon]]
+
+> The easier way to accomplish this is to set archive=yes in the inline.
+> Switching to archive view can be useful when there are a lot of long
+> posts and people tend to want to scan by title to find interesting ones
+> and not necessarily read them all, which probably fits this forum pretty
+> well. --[[Joey]]
diff --git a/doc/forum/double_forward_slash___39____47____47____39___in_the_address_bar.mdwn b/doc/forum/double_forward_slash___39____47____47____39___in_the_address_bar.mdwn
new file mode 100644
index 000000000..b5fb2aa18
--- /dev/null
+++ b/doc/forum/double_forward_slash___39____47____47____39___in_the_address_bar.mdwn
@@ -0,0 +1,3 @@
+Why are all the pages preceded with a double forward slash in the address bar, i.e. http://example.org//ikiwiki/pagespec/ ... maybe someone knows?
+
+> Sorted; the base url in the .setup file had an unnecessary '/' suffix.
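In other words, ikiwiki appends page paths directly to the configured base url, so the setup file should give it without a trailing slash (a sketch of the relevant line; the hostname is an example):

```perl
# ikiwiki .setup file: no trailing '/' on the base url
url => 'http://example.org',     # good
# url => 'http://example.org/',  # yields links like //ikiwiki/pagespec/
```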
diff --git a/doc/forum/editing_a_comment.mdwn b/doc/forum/editing_a_comment.mdwn
new file mode 100644
index 000000000..eb534365e
--- /dev/null
+++ b/doc/forum/editing_a_comment.mdwn
@@ -0,0 +1,11 @@
+Is it possible to edit a comment? I did not find any button for it.
+
+> It was a design decision to not allow editing comments via the web
+> interface. The thinking being that comments on blogs tend to not allow
+> editing, and allowing wiki-style editing by anyone would sort of defeat
+> the purpose of comments.
+>
+> I do think there is room to support more forum-style comments in ikiwiki.
+> As long as the comment is not posted by an anonymous user, it would be
+> possible to open up editing to the original commenter. One day, perhaps..
+> --[[Joey]]
diff --git a/doc/forum/editing_the_style_sheet.mdwn b/doc/forum/editing_the_style_sheet.mdwn
new file mode 100644
index 000000000..b4aa8c89b
--- /dev/null
+++ b/doc/forum/editing_the_style_sheet.mdwn
@@ -0,0 +1,18 @@
+[[!meta date="2006-12-29 04:19:51 +0000"]]
+
+It would be nice to be able to edit the stylesheet by means of the cgi. Or is this possible? I wasn't able to achieve it.
+Ok, that's my last 2 cents for a while. --[Mazirian](http://mazirian.com)
+
+> I don't support editing it, but if/when ikiwiki gets [[todo/fileupload]] support,
+> it'll be possible to upload a style sheet. (If .css is in the allowed
+> extensions list.. no idea how safe that would be, a style sheet is
+> probably a great place to put XSS attacks and evil javascript that would
+> be filtered out of any regular page in ikiwiki). --[[Joey]]
+
+>> I hadn't thought of that at all. It's a common feature and one I've
+>> relied on safely, because the wikis I am maintaining at the moment
+>> are all private and restricted to trusted users. Given that the whole
+>> point of ikiwiki is to be able to access and edit via the shell as
+>> well as the web, I suppose the feature doesn't add a lot. By the
+>> way, the w3m mode is brilliant. I haven't tried it yet, but the idea
+>> is great.
diff --git a/doc/forum/error_302___40__Found__41___when_editing_page.mdwn b/doc/forum/error_302___40__Found__41___when_editing_page.mdwn
new file mode 100644
index 000000000..aa2db2f8a
--- /dev/null
+++ b/doc/forum/error_302___40__Found__41___when_editing_page.mdwn
@@ -0,0 +1,59 @@
+I have an [IkiWiki site](http://ocikbapps.uzh.ch/gc3wiki), which works
+fine, except for one page which I cannot edit with the CGI. Only this
+single page is failing, editing every other works as expected.
+
+When clicking every button (well, except "Cancel") on the edit form, I
+get a "302 Found" page in the browser; the Apache logs show:
+
+ [client XXX] malformed header from script. Bad header=according%20to%20the%20availab: ikiwiki.cgi
+
+Capturing the output from `ikiwiki.cgi`, I see that just these two
+lines are sent:
+
+ Status: 302 Found
+ Location: https://ocikbapps.uzh.ch/gc3wiki/ikiwiki.auth.cgi?_submitted=1;do=edit;..;_submit=Preview;attachment=
+
+The total size in bytes of the reply is 16189; I thought this might be
+an issue with Apache imposing some limit on the header size; indeed,
+`tcpflow` shows that the "302 Found" message is encapsulated into an
+HTTP 500 "internal server error" response.
+
+So I added this to Apache's config (std Debian 6.0):
+
+ # cat /etc/apache2/conf.d/limits.conf
+ LimitRequestFieldSize 65534
+ LimitRequestLine 65534
+
+But I'm still getting the same error.
+
+Any suggestions?
+
+
+**Update 2011-08-16:**
+[This bug report](https://bugzilla.mozilla.org/show_bug.cgi?id=513989)
+shows the exact same symptoms; the solution they adopted is to not
+perform the redirect when the URL length exceeds the default Apache
+value of 8190.
+
+Regarding Apache limits: apparently, Apache (as of version 2.2.17)
+only applies `LimitRequestLine` and `LimitRequestFieldSize` to
+client HTTP transactions; when dealing with the HTTP responses
+generated by CGI scripts, the code from `server/util_script.c`
+applies: (function `ap_scan_script_header_err_core`, lines 403--433)
+
+ char x[MAX_STRING_LEN];
+ char *w, *l;
+ [...]
+ if (buffer) {
+ *buffer = '\0';
+ }
+ w = buffer ? buffer : x;
+ [...]
+ while (1) {
+ int rv = (*getsfunc) (w, MAX_STRING_LEN - 1, getsfunc_data);
+
+where `MAX_STRING_LEN` is defined in `httpd.h` to be equal to
+`HUGE_STRING_LEN`, that is, 8192.
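A quick way to check whether a captured CGI response runs into that 8192-byte line buffer is to measure its longest header line (a sketch; `cgi-output.txt` stands in for a hypothetical capture of the `ikiwiki.cgi` output, generated here synthetically):

```shell
# Build a synthetic capture with an oversized Location: header, then
# report any header line longer than Apache's buffer allows (8190 bytes
# of usable space, given MAX_STRING_LEN = 8192 and the trailing NUL).
printf 'Status: 302 Found\nLocation: https://example.org/ikiwiki.cgi?%s\n' \
    "$(head -c 9000 /dev/zero | tr '\0' 'x')" > cgi-output.txt
awk 'length($0) > 8190 { print "line " NR ": " length($0) " bytes, too long" }' cgi-output.txt
```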
+
+> This has been filed as [[!debbug 638009]], so let's only
+> discuss it in one place (ie, there) --[[Joey]]
diff --git a/doc/forum/ever-growing_list_of_pages.mdwn b/doc/forum/ever-growing_list_of_pages.mdwn
new file mode 100644
index 000000000..9920e34bb
--- /dev/null
+++ b/doc/forum/ever-growing_list_of_pages.mdwn
@@ -0,0 +1,29 @@
+What is everyone's idea about the ever-growing list of pages in bugs/ etc.?
+Once linked to `done`, they're removed from the rendered [[bugs]] page -- but
+they're still present in the repository.
+
+Shouldn't there be some clean-up at some point for those that have been
+resolved? Or should all of them be kept online forever?
+
+--[[tschwinge]]
+
+> To answer a question with a question, what harm does having the done bugs
+> around cause? At some point in the future perhaps the number of done pages
+> will be large enough to be a time or space concern. Do you think we've
+> reached that point now? One advantage of having them around is that people
+> running older versions of the Ikiwiki software may find the page explaining
+> that the bug is fixed if they perform a search. -- [[Jon]]
+
+> I like to keep old bugs around. --[[Joey]]
+
+So, I guess it depends on whether you want to represent the development of the
+software (meaning: which bugs are open, which are fixed) *(a)* in a snapshot of
+the repository (a checkout; that is, what you see rendered on
+<http://ikiwiki.info/>), or *(b)* if that information is to be contained in the
+backing repository's revision history only. Both approaches are valid. For
+people used to using Git for accessing a project's history, *(b)* is what
+they're used to, but for those poor souls ;-) that only use a web browser to
+access this database, *(a)* is the more useful approach indeed. For me, using
+Git, it is a bit of a hindrance, as, when doing a full-text search for a
+keyword on a checkout, I'd frequently hit pages that reported a bug, but are
+tagged `done` by now. --[[tschwinge]]
diff --git a/doc/forum/field__95__tags_not_linking.mdwn b/doc/forum/field__95__tags_not_linking.mdwn
new file mode 100644
index 000000000..0b91e58f3
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking.mdwn
@@ -0,0 +1,66 @@
+Hey There.
+
+I'm using the [[plugins/contrib/field]] plugin together with [[plugins/contrib/ftemplate/ikiwiki/directive/ftemplate]] and [[plugins/contrib/ymlfront]]. Everything looks good, but no links are created as defined in field_tags. I hope it's just a mistake on my part, and that someone can help me.
+
+All three plugins are activated and ikiwiki's setup file reads
+
+ # field plugin
+ # simple registration
+ field_register => [qw{meta}],
+
+ # allow the config to be queried as a field
+ field_allow_config => 1,
+
+ # flag certain fields as "tags"
+ field_tags => {
+ Autor => '/users',
+ Rubrik => '/rubriken',
+ Themen => '/themen',
+ BuchTitel => 'rezensionen/titel',
+ BuchAutor => '/rezensionen/autoren',
+ Verlag => 'rezensionen/verlage',
+ },
+
+I use this template to ask the users for the fields:
+
+ ---
+ Autor:
+ BuchTitel:
+ BuchUntertitel:
+ BuchAutor:
+ Verlag:
+ ISBN:
+ Seiten:
+ Preis:
+ Rubrik: Rezensionen
+ Themen:
+ - (Anti-)Repression
+ - Aktion
+ - ...
+ ---
+ [[!ftemplate id="rezi"]]
+
+And this one tells what to do with them:
+
+ \[[!meta author="<TMPL_VAR AUTOR>"]]
+ \[[!meta title="<TMPL_VAR BUCHAUTOR>: <TMPL_VAR BUCHTITEL>"]]
+
+ <span class="infobox">
+    <TMPL_VAR BUCHAUTOR>:<br />
+    **<TMPL_VAR BUCHTITEL>**<br />
+    -<TMPL_IF BUCHUNTERTITEL><TMPL_VAR BUCHUNTERTITEL></TMPL_IF><br />
+    *rezensiert von <TMPL_VAR AUTOR>*<br /><br />
+    * Verlag: <TMPL_VAR VERLAG><br />
+    * ISBN: <TMPL_VAR ISBN><br />
+    * Seiten: <TMPL_VAR SEITEN><br />
+    * Preis: <TMPL_VAR PREIS><br /><br />
+    Rubrik: <TMPL_VAR RUBRIK><br />
+ Themen:
+ <TMPL_LOOP THEMEN_LOOP><TMPL_VAR THEMEN>
+ <TMPL_UNLESS __last__>, </TMPL_UNLESS>
+ </TMPL_LOOP>
+ </span>
+
+ <TMPL_VAR RUBRIK> # just for testing if infobox is the problem
+
+Do I have to register another plugin with field or what is wrong here?
diff --git a/doc/forum/field__95__tags_not_linking/comment_10_7c1540e6eb6aafd2e1c9c7016e6e6249._comment b/doc/forum/field__95__tags_not_linking/comment_10_7c1540e6eb6aafd2e1c9c7016e6e6249._comment
new file mode 100644
index 000000000..f6bf1eff4
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_10_7c1540e6eb6aafd2e1c9c7016e6e6249._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="60.241.8.244"
+ subject="comment 10"
+ date="2011-09-06T02:31:13Z"
+ content="""
+Hmmm. They are the latest versions.
+
+So what do you get when you use `<TMPL_VAR AUTOR>`? Is that blank?
+"""]]
diff --git a/doc/forum/field__95__tags_not_linking/comment_11_0c03cbaa4f748d2fb932fda08fe6e966._comment b/doc/forum/field__95__tags_not_linking/comment_11_0c03cbaa4f748d2fb932fda08fe6e966._comment
new file mode 100644
index 000000000..ddbb39197
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_11_0c03cbaa4f748d2fb932fda08fe6e966._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://xlogon.net/bacuh"
+ ip="209.234.253.228"
+ subject="comment 11"
+ date="2011-09-06T02:41:24Z"
+ content="""
+No. This gives the expected result.
+"""]]
diff --git a/doc/forum/field__95__tags_not_linking/comment_12_9f3a402173f9584d8a36bc61e5755f6d._comment b/doc/forum/field__95__tags_not_linking/comment_12_9f3a402173f9584d8a36bc61e5755f6d._comment
new file mode 100644
index 000000000..6ec12a52f
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_12_9f3a402173f9584d8a36bc61e5755f6d._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="60.241.8.244"
+ subject="comment 12"
+ date="2011-09-06T02:53:39Z"
+ content="""
+Okay, I'm going to release the latest version from another branch. Give me an hour.
+"""]]
diff --git a/doc/forum/field__95__tags_not_linking/comment_13_455a2f921059f9ecca810bb8afed0fda._comment b/doc/forum/field__95__tags_not_linking/comment_13_455a2f921059f9ecca810bb8afed0fda._comment
new file mode 100644
index 000000000..0d0d78783
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_13_455a2f921059f9ecca810bb8afed0fda._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://xlogon.net/bacuh"
+ ip="209.234.253.228"
+ subject="comment 13"
+ date="2011-09-06T03:07:10Z"
+ content="""
+There's no rush. But please don't forget to document this -TAGPAGE thing. ;) I can't find any reference to it. Should be mentioned in the manpage.
+
+Thanks for your help.
+"""]]
diff --git a/doc/forum/field__95__tags_not_linking/comment_14_b82294c290a215d9aa6774ee20b5a552._comment b/doc/forum/field__95__tags_not_linking/comment_14_b82294c290a215d9aa6774ee20b5a552._comment
new file mode 100644
index 000000000..0b29fafea
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_14_b82294c290a215d9aa6774ee20b5a552._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="60.241.8.244"
+ subject="comment 14"
+ date="2011-09-06T03:42:22Z"
+ content="""
+Releases done. And, yes, updating the docs was one of the things I did.
+
+So, if you can try again with the latest version...
+"""]]
diff --git a/doc/forum/field__95__tags_not_linking/comment_15_57fb279ad50f8460341dc0f217acef06._comment b/doc/forum/field__95__tags_not_linking/comment_15_57fb279ad50f8460341dc0f217acef06._comment
new file mode 100644
index 000000000..c7f527efd
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_15_57fb279ad50f8460341dc0f217acef06._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://xlogon.net/bacuh"
+ ip="209.234.253.228"
+ subject="comment 15"
+ date="2011-09-06T04:07:12Z"
+ content="""
+Great. This works.
+
+Thanks again.
+"""]]
diff --git a/doc/forum/field__95__tags_not_linking/comment_16_8dae1024e80cf6ea765dee0318324d71._comment b/doc/forum/field__95__tags_not_linking/comment_16_8dae1024e80cf6ea765dee0318324d71._comment
new file mode 100644
index 000000000..d8288f2d6
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_16_8dae1024e80cf6ea765dee0318324d71._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="60.241.8.244"
+ subject="comment 16"
+ date="2011-09-06T06:55:43Z"
+ content="""
+Yay!
+"""]]
diff --git a/doc/forum/field__95__tags_not_linking/comment_1_76a4fb4def8f13b906c848814de91660._comment b/doc/forum/field__95__tags_not_linking/comment_1_76a4fb4def8f13b906c848814de91660._comment
new file mode 100644
index 000000000..23e1ebae1
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_1_76a4fb4def8f13b906c848814de91660._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="202.173.183.92"
+ subject="comment 1"
+ date="2011-09-05T22:23:17Z"
+ content="""
+The `field_tags` option behaves like \[[!tag ]] rather than \[[!taglink ]] - that is, it flags the page as being linked to the tag page. In order to have an actual link, you have to put a link in the template, and use the \"tagpage\" variable suffix.
+
+For example:
+
+ *rezensiert von \[[<TMPL_VAR AUTOR-TAGPAGE>]]*
+"""]]
diff --git a/doc/forum/field__95__tags_not_linking/comment_2_64d51cc9ba953e7fed609c380e30bb7d._comment b/doc/forum/field__95__tags_not_linking/comment_2_64d51cc9ba953e7fed609c380e30bb7d._comment
new file mode 100644
index 000000000..ea9ece867
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_2_64d51cc9ba953e7fed609c380e30bb7d._comment
@@ -0,0 +1,13 @@
+[[!comment format=mdwn
+ username="http://xlogon.net/bacuh"
+ ip="209.234.253.228"
+ subject="comment 2"
+ date="2011-09-05T23:18:34Z"
+ content="""
+This doesn't work. Even with an exact copy of your code I just get
+
+ [[]]
+
+where the \"taglinks\" should be.
+
+"""]]
diff --git a/doc/forum/field__95__tags_not_linking/comment_3_7a6eac4e216133f1cf6fc12336fc2496._comment b/doc/forum/field__95__tags_not_linking/comment_3_7a6eac4e216133f1cf6fc12336fc2496._comment
new file mode 100644
index 000000000..57ea22fde
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_3_7a6eac4e216133f1cf6fc12336fc2496._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://xlogon.net/bacuh"
+ ip="209.234.253.228"
+ subject="comment 3"
+ date="2011-09-05T23:46:40Z"
+ content="""
+BTW: The PageSpecs provided by field are working.
+"""]]
diff --git a/doc/forum/field__95__tags_not_linking/comment_4_e6941a0df00fb9f45563c30e01efa622._comment b/doc/forum/field__95__tags_not_linking/comment_4_e6941a0df00fb9f45563c30e01efa622._comment
new file mode 100644
index 000000000..7bce25c8e
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_4_e6941a0df00fb9f45563c30e01efa622._comment
@@ -0,0 +1,9 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="60.241.8.244"
+ subject="comment 4"
+ date="2011-09-06T01:31:25Z"
+ content="""
+Hmmm.
+What happens when you just have `<TMPL_VAR AUTOR-tagpage>`? Is that blank too?
+"""]]
diff --git a/doc/forum/field__95__tags_not_linking/comment_5_f08ded5a946458aeba59a2c4cec29b2f._comment b/doc/forum/field__95__tags_not_linking/comment_5_f08ded5a946458aeba59a2c4cec29b2f._comment
new file mode 100644
index 000000000..cbbb45b6e
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_5_f08ded5a946458aeba59a2c4cec29b2f._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://xlogon.net/bacuh"
+ ip="209.234.253.228"
+ subject="comment 5"
+ date="2011-09-06T01:40:46Z"
+ content="""
+It makes no difference. Neither with a capitalized nor with an uncapitalized \"-tagpage\".
+"""]]
diff --git a/doc/forum/field__95__tags_not_linking/comment_6_6ea7de20c3db96589c05adbe97d57cfd._comment b/doc/forum/field__95__tags_not_linking/comment_6_6ea7de20c3db96589c05adbe97d57cfd._comment
new file mode 100644
index 000000000..cfd5e7981
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_6_6ea7de20c3db96589c05adbe97d57cfd._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="60.241.8.244"
+ subject="comment 6"
+ date="2011-09-06T01:56:22Z"
+ content="""
+I meant, if you just have `<TMPL_VAR AUTOR-tagpage>` without the square brackets `[[` around it, is that blank too?
+"""]]
diff --git a/doc/forum/field__95__tags_not_linking/comment_7_8ad385b61c46389d87c88b17430ab1f2._comment b/doc/forum/field__95__tags_not_linking/comment_7_8ad385b61c46389d87c88b17430ab1f2._comment
new file mode 100644
index 000000000..e077f6e27
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_7_8ad385b61c46389d87c88b17430ab1f2._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://xlogon.net/bacuh"
+ ip="209.234.253.228"
+ subject="comment 7"
+ date="2011-09-06T02:02:39Z"
+ content="""
+Sorry for being unclear; I tried exactly that. It's still blank.
+"""]]
diff --git a/doc/forum/field__95__tags_not_linking/comment_8_c3c5eced158babd8c3acb493a86b6ecb._comment b/doc/forum/field__95__tags_not_linking/comment_8_c3c5eced158babd8c3acb493a86b6ecb._comment
new file mode 100644
index 000000000..e7e4d1dba
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_8_c3c5eced158babd8c3acb493a86b6ecb._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="60.241.8.244"
+ subject="comment 8"
+ date="2011-09-06T02:13:06Z"
+ content="""
+Which versions of field and ftemplate are you using?
+"""]]
diff --git a/doc/forum/field__95__tags_not_linking/comment_9_9bd4b3df18a28a7ab3bbef5013856987._comment b/doc/forum/field__95__tags_not_linking/comment_9_9bd4b3df18a28a7ab3bbef5013856987._comment
new file mode 100644
index 000000000..d82c691c8
--- /dev/null
+++ b/doc/forum/field__95__tags_not_linking/comment_9_9bd4b3df18a28a7ab3bbef5013856987._comment
@@ -0,0 +1,11 @@
+[[!comment format=mdwn
+ username="http://xlogon.net/bacuh"
+ ip="209.234.253.228"
+ subject="comment 9"
+ date="2011-09-06T02:27:47Z"
+ content="""
+* field: 1.20110610
+* ftemplate: 1.20100519
+
+Both were downloaded two days ago from their git repositories' master branches.
+"""]]
diff --git a/doc/forum/field_and_forms.mdwn b/doc/forum/field_and_forms.mdwn
new file mode 100644
index 000000000..97fda1856
--- /dev/null
+++ b/doc/forum/field_and_forms.mdwn
@@ -0,0 +1,13 @@
+Dear ikiwiki users, and specially [[users/KathrynAndersen]] ([[users/rubykat]]):
+have you considered some way of extending ikiwiki to allow some kind of
+on-the-fly generation of web forms to create new pages? These web forms should
+offer as many fields as one has defined in some [[page
+template|plugins/contrib/ftemplate]], and, once POSTed, should create a page
+using that template, with those fields already filled with the values the user
+provided.
+
+I see this as a generalization of the `postform` option of the
+[[ikiwiki/directive/inline]] directive. That option tells ikiwiki to create a
+form with one field already filled (title).
+
+What are your ideas about this?
diff --git a/doc/forum/field_and_forms/comment_1_a0e976cb79f03dcff5e9a4511b90d160._comment b/doc/forum/field_and_forms/comment_1_a0e976cb79f03dcff5e9a4511b90d160._comment
new file mode 100644
index 000000000..3e10dbbd9
--- /dev/null
+++ b/doc/forum/field_and_forms/comment_1_a0e976cb79f03dcff5e9a4511b90d160._comment
@@ -0,0 +1,19 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="60.241.8.244"
+ subject="Limitations"
+ date="2010-11-23T02:18:52Z"
+ content="""
+I'd already had a look at this idea before you posted this suggestion, and I ran into difficulties.
+So far as I can see, it makes most sense to use the mechanisms already in place for editing pages, and enhance them.
+Unfortunately, the whole edit-page setup expects a template file (by default, when editing pages, editpage.tmpl)
+and anything apart from \"submit\" buttons must have a placeholder in the template file, or it doesn't get displayed at all in the form.
+At least, that's what I've found - I could be mistaken.
+
+But if it's true, that rather puts the kybosh on dynamically generated forms, so far as I can see.
+I mean, if you knew beforehand what all your fields were going to be, you could make a copy of editpage.tmpl, add in your fields where you want, and then make a plugin that uses that template instead of editpage.tmpl, but that's very limited.
+
+If someone could come up with a way of making dynamic forms, that would solve the problem, but I've come up against a brick wall myself. Joey? Anyone?
+
+-- [[KathrynAndersen]]
+"""]]
diff --git a/doc/forum/formating:_how_to_align_text_to_the_right.mdwn b/doc/forum/formating:_how_to_align_text_to_the_right.mdwn
new file mode 100644
index 000000000..2b56bd70b
--- /dev/null
+++ b/doc/forum/formating:_how_to_align_text_to_the_right.mdwn
@@ -0,0 +1,15 @@
+As the title says: how do I align text to the right?
+
+> Add to your local.css a class that aligns text to the right:
+
+ .alignright { text-align: right; }
+
+> And then you can just use `<span class="alignright">` around
+> other html.
+>
+> You can refine that, and allow right-aligning markdowned text
+> by using the [[ikiwiki/directive/template]]
+> directive, with a template that contains the html. The
+> [[templates/note]] template does something similar. --[[Joey]]
+
+>> Thanks!
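
To illustrate Joey's template suggestion above, a hypothetical `templates/alignright.mdwn` could wrap its `text` parameter in the right-aligning div (the template name and parameter are made up for this sketch, following the pattern of the [[templates/note]] template):

```html
<div class="alignright">
<TMPL_VAR text>
</div>
```

A page would then use it as `\[[!template id=alignright text="""some *markdowned* text"""]]`, which relies on the `.alignright` class being defined in local.css as shown above.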
diff --git a/doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__.mdwn b/doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__.mdwn
new file mode 100644
index 000000000..0219329c8
--- /dev/null
+++ b/doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__.mdwn
@@ -0,0 +1,6 @@
+Hi,
+
+I'm thinking about running my wiki with cgi disabled. I already did the gitweb setup.
+
+Is it possible to use gitweb to give me what getsource gives me? You know, like the History item, but with:
+http://127.0.0.1/gitweb/gitweb.cgi?p=wiki.git;a=blob_plain;f=\[[file]];hb=HEAD
diff --git a/doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__/comment_1_747cc477584028ce2c7bc198070b1221._comment b/doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__/comment_1_747cc477584028ce2c7bc198070b1221._comment
new file mode 100644
index 000000000..c506a363c
--- /dev/null
+++ b/doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__/comment_1_747cc477584028ce2c7bc198070b1221._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnQs9icnfI79gWOQY_Yxv2XmYI3z703PrQ"
+ nickname="misc"
+ subject="Solved - patched source"
+ date="2011-07-21T15:36:47Z"
+ content="""
+Ok, I implemented sourceurl and it worked. I didn't want to touch the source code at first but it turned out to be very easy to work with.
+
+Thank you for your excellent and creative work on ikiwiki.
+"""]]
diff --git a/doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__/comment_2_a230861b26dba6d61461862bfedbc09c._comment b/doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__/comment_2_a230861b26dba6d61461862bfedbc09c._comment
new file mode 100644
index 000000000..1cf1d3282
--- /dev/null
+++ b/doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__/comment_2_a230861b26dba6d61461862bfedbc09c._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 2"
+ date="2011-07-23T11:07:27Z"
+ content="""
+Could you attach the patch here, please? That sounds like a useful feature to have.
+"""]]
diff --git a/doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__/comment_3_848b4801fc7887906a21a676e802023c._comment b/doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__/comment_3_848b4801fc7887906a21a676e802023c._comment
new file mode 100644
index 000000000..ea28449c7
--- /dev/null
+++ b/doc/forum/getsource_from_gitweb_with_cgi_disabled___44___Is_it_possible__63__/comment_3_848b4801fc7887906a21a676e802023c._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnQs9icnfI79gWOQY_Yxv2XmYI3z703PrQ"
+ nickname="misc"
+ subject="patch - getsource from gitweb"
+ date="2011-07-24T23:50:07Z"
+ content="""
+> Could you attach the patch here, please? That sounds like a useful feature to have.
+
+<https://gitorious.org/ikiwiki-nezmer/ikiwiki-nezmer/commit/71fd3bda5cd7e8b0d3e403b2e9ba51889329b60d?format=patch>
+"""]]
diff --git a/doc/forum/google_openid_broken__63__.mdwn b/doc/forum/google_openid_broken__63__.mdwn
new file mode 100644
index 000000000..d25d8fe4c
--- /dev/null
+++ b/doc/forum/google_openid_broken__63__.mdwn
@@ -0,0 +1,82 @@
+Now that google supports using their profiles as OpenIDs, that can be used
+directly to sign into ikiwiki. Just use, for example,
+<http://www.google.com/profiles/joeyhess> . Tested and it works. --[[Joey]]
+
+> This seems to work fine if you use the profile directly as an OpenID. It doesn't seem to work with delegation. From that I can see, this is a deliberate decision by Google for security reasons. See the response [here](http://groups.google.com/group/google-federated-login-api/browse_thread/thread/825067789537568c/23451a68c8b8b057?show_docid=23451a68c8b8b057). -- [[Will]]
+
+### adding the GMail OpenID as an admin is unhandy
+Adding the non-human-friendly OpenID from Gmail as an admin for ikiwiki (if you haven't set up a profile with a readable URL) is unhandy; first, you need to discover the URL, for example, by making a web edit with it (like me [here](http://source.ikiwiki.branchable.com/?p=source.git;a=search;s=https://www.google.com/accounts/o8/id%3Fid%3DAItOawl3JW_Ow4xMqj98Ig1vwGx_AnjUSsgwE8E;st=author)), and then copy the URL to ikiwiki.setup. --Ivan Z.
+
+## historical discussion
+
+When I log in to this wiki (or ours) via Google's OpenID, I get this error:
+
+    Error: OpenID failure: no_identity_server: The provided URL doesn't declare its OpenID identity server.
+
+Any idea how to fix this??
+
+> Google is [doing things with openid that are not in the spec](http://googledataapis.blogspot.com/2008/10/federated-login-for-google-account.html)
+> and it's not clear to me that they intend regular openid to work at all.
+> What is your google openid URL so I can take a look at the data they are
+> providing? --[[Joey]]
+
+
+http://openid-provider.appspot.com/larrylud
+
+> I've debugged this some and filed
+> <https://rt.cpan.org/Ticket/Display.html?id=48728> on the Openid perl
+> module. It's a pretty easy fix, so I hope upstream will fix it quickly.
+> --[[Joey]]
+
+>> A little more information here: I'm using that same openid provider at the moment. Note that
+>> that provider isn't google - it is someone using the google API to authenticate. I normally have it
+>> set up as a redirect from my home page (which means I can change providers easily).
+
+ <link rel="openid.server" href="http://openid-provider.appspot.com/will.uther">
+ <link rel="openid.delegate" href="http://openid-provider.appspot.com/will.uther">
+
+>> In that mode it works (I used it to log in to make this edit). However, when I try the openid
+>> URL directly, it doesn't work. I think there is something weird with re-direction. I hope this
+>> isn't a more general security hole.
+>> -- [[Will]]
+
+----
+
+So, while the above bug will probably get fixed sooner or later,
+the best approach for those of you needing a google openid now is
+to use gmail.
+
+
+Just a note that someone has apparently figured out how to use a google
+openid, and not a third-party provider either, to edit this site.
+The openid is
+<https://www.google.com/accounts/o8/id?id=AItOawltlTwUCL_Fr1siQn94GV65-XwQH5XSku4>
+(what a mouthful!), and I don't know who that is or how to use it since it
+points to a fairly useless xml document, rather than a web page. --[[Joey]]
+
+> That string is what's received via the discovery protocol. The user logging in with a Google account is not supposed to write that when logging in, but rather <https://www.google.com/accounts/o8/id>. The OpenID client library will accept that and redirect the user to a sign in page, which will return that string as the OpenID. It's not really usable as an identifier for edits and whatnots, but an alternative would be to use the attribute exchange extension to get the email address and display that. See <http://code.google.com/apis/accounts/docs/OpenID.html#Parameters>.
+
+> Yahoo's OpenID implementation works alike, but I haven't looked at it as much. It uses <https://me.yahoo.com/> to receive the endpoint.
+
+> I've added buttons that submit the two above URLs for logging in with a Google and Yahoo OpenID, respectively, to my locally changed OpenID login plugin.
+
+> Using the Google profile page as the OpenID is really orthogonal to the above. --[[kaol]]
+
+>> First, I don't accept that the openid google returns from their
+>> generic signin url *has* to be so freaking ugly. For contrast,
+>> look at the openid you log in as if you use the yahoo url.
+>> <https://me.yahoo.com/joeyhess#35f22>. Nice and clean, now
+>> munged by ikiwiki to "joeyhess [me.yahoo.com]".
+>>
+>> Displaying email addresses is not really an option, because ikiwiki
+>> can't leak user email addresses like that. Displaying nicknames or
+>> usernames is, see [[todo/Separate_OpenIDs_and_usernames]].
+>>
+>> It would probably be good if the openid plugin could be configured with
+>> a list of generic openid urls, so it can add quick login buttons using
+>> those urls.
+>>
+>> The ugly google url will still be exposed here and there where
+>> a unique user id is needed. That can be avoided by not using the generic
+>> <https://www.google.com/accounts/o8/id>, but instead your own profile
+>> like <http://www.google.com/profiles/joeyhess>. --[[Joey]]
diff --git a/doc/forum/how_can_I_use___39____47____39___as_tagbase__63__.mdwn b/doc/forum/how_can_I_use___39____47____39___as_tagbase__63__.mdwn
new file mode 100644
index 000000000..8a24152dc
--- /dev/null
+++ b/doc/forum/how_can_I_use___39____47____39___as_tagbase__63__.mdwn
@@ -0,0 +1,13 @@
+I'd like tags to be top-level pages, like /some-tag.
+
+I achieve this most of the time by *not* defining `tagbase`.
+
+However, this goes wrong if the name of a tag matches the name of a page further down a tree.
+
+Example:
+
+ * tag scm, corresponding page /scm
+ * a page /log/scm tagged 'scm' does not link to /scm
+ * a page /log/puppet tagged 'scm' links to /log/scm in the Tags: section
+
+Is this possible, or am I pushing tags too far (again)? -- [[Jon]]
diff --git a/doc/forum/how_can_I_use___39____47____39___as_tagbase__63__/comment_1_e7897651ba8d9156526d36d6b7744eae._comment b/doc/forum/how_can_I_use___39____47____39___as_tagbase__63__/comment_1_e7897651ba8d9156526d36d6b7744eae._comment
new file mode 100644
index 000000000..361c51b09
--- /dev/null
+++ b/doc/forum/how_can_I_use___39____47____39___as_tagbase__63__/comment_1_e7897651ba8d9156526d36d6b7744eae._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2010-12-05T20:15:28Z"
+ content="""
+From the code, it seems to me like setting tagbase to \"/\" would actually do what you want. Does it not work?
+"""]]
diff --git a/doc/forum/how_could_i_generate_a_flat_textfile_from_metadata_in_multiple_pages.mdwn b/doc/forum/how_could_i_generate_a_flat_textfile_from_metadata_in_multiple_pages.mdwn
new file mode 100644
index 000000000..d82a419a3
--- /dev/null
+++ b/doc/forum/how_could_i_generate_a_flat_textfile_from_metadata_in_multiple_pages.mdwn
@@ -0,0 +1,3 @@
+I am already using the [[plugins/contrib/report]] plugin to generate reports aggregated from multiple pages, and it's great! However, I am now looking at generating non-HTML reports. Basically, I want to generate a BIND zonefile from the data aggregated from similar reports. I have gone as far as using the [[plugins/pagetemplate]] plugin to have an empty page as a template - but even that bit doesn't work as I still get pesky `<script>` tags in the output. Besides, the data actually gets parsed on display, and I'd like to do some validation and processing.
+
+How should I go forward? Should I write a separate plugin from [[plugins/contrib/report]]? Should I make a plugin that, like [[plugins/graphviz]], generates data in a separate page? Any suggestions? --[[anarcat]]
diff --git a/doc/forum/how_do_I_revert_edits_in_the_web_mode__63__.mdwn b/doc/forum/how_do_I_revert_edits_in_the_web_mode__63__.mdwn
new file mode 100644
index 000000000..d69b3801b
--- /dev/null
+++ b/doc/forum/how_do_I_revert_edits_in_the_web_mode__63__.mdwn
@@ -0,0 +1,46 @@
+Puzzled a bit :-/
+
+> There is no explicit interface for reverting edits. Most of us use `git revert`. --[[Joey]]
+
+>> That's a blow; I was planning on appointing non-techies to keep law and order on our pages :-/ Is there a plugin, or at least a plan, to add such an 'in-demand' feature?
+
+>>> A lot of things complicate adding that feature to the web interface.
+>>>
+>>> First, ikiwiki happily uses whatever the VCS's best of breed web
+>>> history interface is. (ie, viewvcs, gitweb). To allow reverting
+>>> past the bottom of the RecentChanges page, it would need to have its
+>>> own history browser. Not sure I want to go there.
+>>>
+>>> And the mechanics of handling reverting can quickly get complex.
+>>> Web reverting should only allow users to revert things they can edit,
+>>> but reverting a whole commit in git might touch multiple files.
+>>> Some files may not be editable over the web at all. (The
+>>> [[tips/untrusted_git_push]] also has to deal with those issues.)
+>>> Finally, a revert can fail with a conflict. The revert could touch
+>>> multiple files, and multiple ones could conflict. The conflict may
+>>> involve non-page files that can't be diffed. So an interface for
+>>> resolving such a conflict could be hard.
+>>>
+>>> Probably web-based reverting would need to be limited to reverting
+>>> single file changes, not whole commits, and not having very good
+>>> conflict handling. And maybe only being accessible for changes
+>>> still visible on RecentChanges. With those limitations, it's certainly
+>>> doable (as a plugin even), but given how excellent `git revert` is in
+>>> comparison, I have not had a real desire to do so. --[[Joey]]
+
+>>>> Web edits are single-file anyway, so I wouldn't expect web reverts
+>>>> to handle the multi-file case. OTOH, I've sometimes wished ikiwiki
+>>>> had its own history browser (somewhere down my todo list). --[[schmonz]]
+
+>>>> Yup, having a possibility to revert a single file would suffice.
+
+---
+
+Peter Gammie and I are working on reversion over at [[todo/web_reversion]].
+--[[Joey]]
+
+Update: Web reversion is now supported by ikiwiki. Only changes committed
+to your wiki after you upgrade to the version of ikiwiki that supports it
+will get revert buttons on the RecentChanges page. If you want to force
+adding buttons for older changes, you can delete `recentchanges/*._change`
+from your srcdir, and rebuild the wiki. --[[Joey]]
diff --git a/doc/forum/how_do_I_revert_edits_in_the_web_mode__63__/comment_1_e4720e8e4fe74bd6cba746e8259832e6._comment b/doc/forum/how_do_I_revert_edits_in_the_web_mode__63__/comment_1_e4720e8e4fe74bd6cba746e8259832e6._comment
new file mode 100644
index 000000000..597cab2e4
--- /dev/null
+++ b/doc/forum/how_do_I_revert_edits_in_the_web_mode__63__/comment_1_e4720e8e4fe74bd6cba746e8259832e6._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawmlZJCPogIE74m6GSCmkbJoMZiWNOlXcjI"
+ nickname="Ian"
+ subject="comment 1"
+ date="2010-09-24T19:01:08Z"
+ content="""
++1 for a \"revert\" web plugin which at least handles the simple cases. -- Ian Osgood, The TOVA Company
+"""]]
diff --git a/doc/forum/how_do_I_translate_a_TWiki_site.mdwn b/doc/forum/how_do_I_translate_a_TWiki_site.mdwn
new file mode 100644
index 000000000..5bfdbb86c
--- /dev/null
+++ b/doc/forum/how_do_I_translate_a_TWiki_site.mdwn
@@ -0,0 +1,44 @@
+[[!meta date="2006-12-19 09:56:21 +0000"]]
+
+# Excellent - how do I translate a TWiki site?
+
+I just discovered ikiwiki quite by chance, I was looking for a console/terminal
+menu system and found pdmenu. So pdmenu brought me to here and I've found ikiwiki!
+It looks as if it's just what I've been wanting for a long time. I wanted something
+to create mostly text web pages which, as far as possible, have source which is human
+readable or at least in a standard format. ikiwiki does this twice over by using
+markdown for the source and producing static HTML from it.
+
+I'm currently using TWiki and have a fair number of pages in that format, does
+anyone have any bright ideas for translating? I can knock up awk scripts fairly
+easily, perl is possible (but I'm not strong in perl).
+
+> Let us know if you come up with something to transition from the other
+> format. Another option would be writing an ikiwiki plugin to support the
+> TWiki format. --[[Joey]]
+
+> Jamey Sharp and I have a set of scripts in progress to convert other wikis to ikiwiki, including history, so that we can migrate a few of our wikis. We already have support for migrating MoinMoin wikis to ikiwiki, including conversion of the entire history to Git. We used this to convert the [XCB wiki](http://xcb.freedesktop.org/wiki/) to ikiwiki; until we finalize the conversion and put the new wiki in place of the old one, you can browse the converted result at <http://xcb.freedesktop.org/ikiwiki>. We already plan to add support for TWiki (including history, since you can just run parsecvs on the TWiki RCS files to get Git), so that we can convert the [Portland State Aerospace Society wiki](http://psas.pdx.edu) (currently in Moin, but with much of its history in TWiki, and with many of its pages still in TWiki format using Jamey's TWiki format for MoinMoin).
+>
+> Our scripts convert by way of HTML, using portions of the source wiki's code to render as HTML (with some additional code to do things like translate MoinMoin's `\[[TableOfContents]]` to ikiwiki's `\[[!toc ]]`), and then using a modified [[!cpan HTML::WikiConverter]] to turn this into markdown and ikiwiki. This produces quite satisfactory results, apart from things that don't have any markdown equivalent and thus remain HTML, such as tables and definition lists. Conversion of the history occurs by first using another script we wrote to translate MoinMoin history to Git, then using our git-map script to map a transformation over the Git history.
+>
+> We will post the scripts as soon as we have them complete enough to convert our wikis.
+>
+> -- [[JoshTriplett]]
+
+>> Thanks for an excellent Xmas present, I will appreciate the additional
+>> users this will help switch to ikiwiki! --[[Joey]]
+
+
+>> Sounds great indeed. Learning from [here](http://www.bddebian.com/~wiki/AboutTheTWikiToIkiwikiConversion/) that the HTML::WikiConverter needed for your conversion was not up-to-date in Debian, I have now made an unofficial package, including your proposed Markdown patches, apt-get'able at <pre>deb http://debian.jones.dk/ sid wikitools</pre>
+>> -- [[JonasSmedegaard]]
+
+
+>> I see the "We will post the scripts ..." was committed about a year ago. A current site search for "Moin" does not turn them up. Any chance of an appearance in the near (end of year) future?
+>>
+>> -- [[MichaelRasmussen]]
+
+>>> It appears the scripts were never posted? I recently imported my Mediawiki site into Iki. If it helps, my notes are here: <http://iki.u32.net/Mediawiki_Conversion> --[[sabr]]
+
+>>>>> The scripts have been posted now, see [[joshtriplett]]'s user page,
+>>>>> and I've pulled together all ways I can find to [[convert]] other
+>>>>> systems into ikiwiki. --[[Joey]]
diff --git a/doc/forum/how_to_add_post_titles_in_ikiwiki_blog__63__.mdwn b/doc/forum/how_to_add_post_titles_in_ikiwiki_blog__63__.mdwn
new file mode 100644
index 000000000..68eb06c4c
--- /dev/null
+++ b/doc/forum/how_to_add_post_titles_in_ikiwiki_blog__63__.mdwn
@@ -0,0 +1,28 @@
+Look at these two blogs:
+
+1) http://ciffer.net/~svend/blog/
+
+2) http://upsilon.cc/~zack/blog/
+
+Well, I successfully set up my blog (I am using the inline function in a wiki page), but I have to insert blog post titles manually, and the result is that of blog #2.
+Instead I would like to have blog post titles automatically inserted like blog #1 (and they are links too! I want them that way).
+I looked in the git repos of the two blogs but I couldn't find the answer.
+Any help would be really appreciated.
+
+Thanks!
+
+Raf
+
+> Either name the blog post files with the full title you want them to
+> have, or use [[ikiwiki/directive/meta]] title to set the title of a blog post.
+>
+> \[[!meta title="this is my blog post"]]
+>
+> Either way, the title will automatically be displayed, clickable, at the top.
+> (zack has hacked his templates not to do that). --[[Joey]]
+
+>> Thanks for your answer.<br/>
+>> I looked in the [templates](http://git.upsilon.cc/cgi-bin/gitweb.cgi?p=zack-homepage.git;a=tree;f=templates;h=824100e62a06cee41b582ba84fcb9cdd982fe4be;hb=HEAD) folder of zack but couldn't see any hack of that kind.<br/>
+>> Anyway, I didn't hack my template...<br/>
+>> I will follow your suggestion of using \[[ikiwiki/directive/meta]] title to set titles.<br/>
+>> Thanks a lot. --Raf
diff --git a/doc/forum/how_to_enable_multimarkdown__63__.mdwn b/doc/forum/how_to_enable_multimarkdown__63__.mdwn
new file mode 100644
index 000000000..208aadcb0
--- /dev/null
+++ b/doc/forum/how_to_enable_multimarkdown__63__.mdwn
@@ -0,0 +1,9 @@
+I enabled multimarkdown in my setup file but I get this message 'remote: multimarkdown is enabled, but Text::MultiMarkdown is not installed'.
+I also installed multimarkdown-git for my distro (archlinux), which should take care of installing all required perl modules, I believe.
+What am I missing?
+
+Thanks.
+
+> You are apparently still missing the [[!cpan Text::MultiMarkdown]]
+> perl module. Not being familiar with arch linux, I don't know what
+> multimarkdown-git is, so I can't say more than that.. --[[Joey]]
diff --git a/doc/forum/how_to_enable_multimarkdown__63__/comment_1_037f858c4d0bcbb708c3efd264379500._comment b/doc/forum/how_to_enable_multimarkdown__63__/comment_1_037f858c4d0bcbb708c3efd264379500._comment
new file mode 100644
index 000000000..6045a4a3f
--- /dev/null
+++ b/doc/forum/how_to_enable_multimarkdown__63__/comment_1_037f858c4d0bcbb708c3efd264379500._comment
@@ -0,0 +1,14 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawlzADDUvepOXauF4Aq1VZ4rJaW_Dwrl6xE"
+ nickname="Dário"
+ subject="comment 1"
+ date="2010-07-15T15:37:31Z"
+ content="""
+multimarkdown-git is a package build that fetches the git version of multimarkdown.
+It should install Text::Markdown I believe.
+I tried to install it by hand on the cpan command line but it didn't work either:
+    perl -MCPAN -e shell
+    install Text::MultiMarkdown
+
+It says it couldn't run the make file, or something.
+"""]]
diff --git a/doc/forum/how_to_enable_multimarkdown__63__/comment_2_b7d512a535490dabf8d6ce55439741c7._comment b/doc/forum/how_to_enable_multimarkdown__63__/comment_2_b7d512a535490dabf8d6ce55439741c7._comment
new file mode 100644
index 000000000..804b71c67
--- /dev/null
+++ b/doc/forum/how_to_enable_multimarkdown__63__/comment_2_b7d512a535490dabf8d6ce55439741c7._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 2"
+ date="2010-07-16T19:44:55Z"
+ content="""
+All I can tell you is that, if multimarkdown is correctly installed (ie, if `perl -e 'use Text::MultiMarkdown'` runs successfully), ikiwiki can use it.
+"""]]
diff --git a/doc/forum/how_to_get_nice_pdf_from_ikiwiki_pages__63__.mdwn b/doc/forum/how_to_get_nice_pdf_from_ikiwiki_pages__63__.mdwn
new file mode 100644
index 000000000..993dd8c1d
--- /dev/null
+++ b/doc/forum/how_to_get_nice_pdf_from_ikiwiki_pages__63__.mdwn
@@ -0,0 +1,3 @@
+Is there a way to make nice printable pages out of ikiwiki? Preferably pdfs.
+
+marius
diff --git a/doc/forum/how_to_get_nice_pdf_from_ikiwiki_pages__63__/comment_1_332d32850c3dc0d45f5cc50434205f39._comment b/doc/forum/how_to_get_nice_pdf_from_ikiwiki_pages__63__/comment_1_332d32850c3dc0d45f5cc50434205f39._comment
new file mode 100644
index 000000000..54f9f396b
--- /dev/null
+++ b/doc/forum/how_to_get_nice_pdf_from_ikiwiki_pages__63__/comment_1_332d32850c3dc0d45f5cc50434205f39._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk_MMtLPS7osC5MjX00q2ATjvvXPWqm0ik"
+ nickname="micheal"
+ subject="comment 1"
+ date="2011-11-05T09:35:40Z"
+ content="""
+I have the same question. Any ideas?
+"""]]
diff --git a/doc/forum/how_to_load_an_external_page_and_still_have_it_under_ikiwiki_template.mdwn b/doc/forum/how_to_load_an_external_page_and_still_have_it_under_ikiwiki_template.mdwn
new file mode 100644
index 000000000..a747911a5
--- /dev/null
+++ b/doc/forum/how_to_load_an_external_page_and_still_have_it_under_ikiwiki_template.mdwn
@@ -0,0 +1,3 @@
+OK, the title is probably a bit confusing. Basically I'd like to be able to keep my left-hand-side menu, which is part of the template, and at the same time load, let's say, a forum on the right-hand side, which sits on a separate domain. Is it possible, then, to construct a template so that some special links run, let's say, in *frameset* mode?
+
+> I think I'll have to use [[ikiwiki/directive/pagetemplate]] and this <http://stackoverflow.com/questions/153152/resizing-an-iframe-based-on-content> solution
diff --git a/doc/forum/how_to_login_as_admin.mdwn b/doc/forum/how_to_login_as_admin.mdwn
new file mode 100644
index 000000000..807f82501
--- /dev/null
+++ b/doc/forum/how_to_login_as_admin.mdwn
@@ -0,0 +1,18 @@
+I even managed to set up ikiwiki so it works fine with git; but how on earth do
+I log in as an administrator? In the .setup file the admin user is set to
+'zimek' but when I go and register 'zimek' on the web it appears as normal user
+not the administrator. What am I missing?
+
+> That's really all there is to it. The [[automatic_setup|setup]] script
+> registers the admin user for you before the wiki goes live. If you didn't
+> use it, registering the right account name will get you the admin account.
+>
+> The name is case sensitive; perhaps you really spelled one of them `Zimek`?
+>
+> Or maybe you're the admin, and don't know it? Everything looks the same for the admin,
+> except they can edit even locked pages, and can access the websetup interface from their
+> Preferences page, if you have that plugin enabled. --[[Joey]]
+
+>> Maybe I am, indeed. I know that I've disabled all the plugins while installing ikiwiki. Checking it now ;-)
+
+>> Yup, I'm the God of my ikiwiki. (Thanks)
diff --git a/doc/forum/how_to_login_as_admin/comment_1_295e130c6400a2d7336758e82bcd5647._comment b/doc/forum/how_to_login_as_admin/comment_1_295e130c6400a2d7336758e82bcd5647._comment
new file mode 100644
index 000000000..bceecf8e8
--- /dev/null
+++ b/doc/forum/how_to_login_as_admin/comment_1_295e130c6400a2d7336758e82bcd5647._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://ismael.olea.org/"
+ ip="150.214.94.198"
+ subject="comment 1"
+ date="2012-05-22T23:31:09Z"
+ content="""
+Can adminuser be an OpenID address?
+
+Because I can't get the websetup link on my preferences page (the plugin is enabled). I deactivated the passwordauth plugin.
+"""]]
diff --git a/doc/forum/how_to_setup_ikiwiki_on_a_remote_host.mdwn b/doc/forum/how_to_setup_ikiwiki_on_a_remote_host.mdwn
new file mode 100644
index 000000000..1c0f8f561
--- /dev/null
+++ b/doc/forum/how_to_setup_ikiwiki_on_a_remote_host.mdwn
@@ -0,0 +1,35 @@
+Hi all!
+I really like ikiwiki and I tested it on my local machine, but I have one question that I can't answer by reading the documentation (my fault, of course)...
+I have an account and some space on a free hosting service.
+Now, I want to put my ikiwiki on this remote web space so that I can browse it from wherever I want.
+I have my source dir and my git dir on my local machine.
+How can I upload my ikiwiki to the remote host and manage it via git as I do when I test it locally?
+Where is this specified? Where can I find documentation about it?
+
+Thanks in advance!
+
+Pab
+
+> There are several ways to accomplish this, depending on what you really
+> want to do.
+>
+> If your goal is to continue generating the site locally, but then
+> transfer it to the remote host for serving, you could use the
+> [[plugins/rsync]] plugin.
+>
+> If your goal is to install and run the ikiwiki software on the remote host,
+> then you would follow a similar path to the ones described in these tips:
+> [[tips/nearlyfreespeech]] [[tips/DreamHost]]. Or even [[install]] ikiwiki
+> from a regular package if you have that kind of access. Then you could
+> push changes from your local git to git on the remote host to update the
+> wiki. [[tips/Laptop_wiki_with_git]] explains one way to do that.
+> --[[Joey]]
+
+Thanks a lot for your answer.
+The rsync plugin would be perfect, but... how would I manage blog posts?
+I mean... is it possible to manage an ikiwiki blog too with the rsync plugin, in the way you told me? --Pab
+
+> If you want to allow people to make comments on your blog, no, the rsync plugin will not help, since it will upload a completely static site where nobody can make comments. Comments require a full IkiWiki setup with CGI enabled, so that people add content (comments) from the web. --[[KathrynAndersen]]
+
+OK, I understand, thanks.
+Is there any hosting service that permits a full installation of ikiwiki, or am I forced to get a VPS or maintain a personal server for that? --Pab
diff --git a/doc/forum/howto_install_the_pagedown_plugin.mdwn b/doc/forum/howto_install_the_pagedown_plugin.mdwn
new file mode 100644
index 000000000..51b0a554b
--- /dev/null
+++ b/doc/forum/howto_install_the_pagedown_plugin.mdwn
@@ -0,0 +1 @@
+How can I install the [[todo/pagedown_plugin]] on an existing ikiwiki? What are the detailed steps to do so?
diff --git a/doc/forum/howto_install_the_pagedown_plugin/comment_1_158fbcef24d20920c40968da8f10442a._comment b/doc/forum/howto_install_the_pagedown_plugin/comment_1_158fbcef24d20920c40968da8f10442a._comment
new file mode 100644
index 000000000..3985a797d
--- /dev/null
+++ b/doc/forum/howto_install_the_pagedown_plugin/comment_1_158fbcef24d20920c40968da8f10442a._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2012-04-05T18:35:25Z"
+ content="""
+See [[plugins/install]]
+"""]]
diff --git a/doc/forum/html_source_pages_in_version_3.20100704.mdwn b/doc/forum/html_source_pages_in_version_3.20100704.mdwn
new file mode 100644
index 000000000..7a620fd57
--- /dev/null
+++ b/doc/forum/html_source_pages_in_version_3.20100704.mdwn
@@ -0,0 +1,8 @@
+Is this different from using the html/rawhtml plugins?
+
+> I suppose you're talking about this:
+
+ * po: Added support for .html source pages. (intrigeri)
+
+> That means the [[plugins/po]] plugin is able to translate html pages
+> used by the [[plugins/html]] plugin. --[[Joey]]
diff --git a/doc/forum/ikiwiki_+_mathjax.mdwn b/doc/forum/ikiwiki_+_mathjax.mdwn
new file mode 100644
index 000000000..1279a2c80
--- /dev/null
+++ b/doc/forum/ikiwiki_+_mathjax.mdwn
@@ -0,0 +1 @@
+Is it possible to use [mathjax](http://www.mathjax.org/) in ikiwiki to typeset math formulas? If so, is this compatible with the [wmd](http://ikiwiki.info/plugins/wmd/) plugin?
diff --git a/doc/forum/ikiwiki_+_mathjax/comment_1_8426a985ecfbb02d364116503ef3a0d4._comment b/doc/forum/ikiwiki_+_mathjax/comment_1_8426a985ecfbb02d364116503ef3a0d4._comment
new file mode 100644
index 000000000..f5849e7bf
--- /dev/null
+++ b/doc/forum/ikiwiki_+_mathjax/comment_1_8426a985ecfbb02d364116503ef3a0d4._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawkIfDOOUJ0h_niLRZZL5HsHHOuQfUrVcQo"
+ nickname="Carl"
+ subject="Works with mathjax, with a little help"
+ date="2011-01-02T22:59:27Z"
+ content="""
+Yes, mathjax works with ikiwiki. The main trouble I ran into was markdown trying to parse the math; for example, markdown and TeX both use underscores. I wrote a quick plugin that replaces all the TeX with markers in the 'filter' phase and puts it back in the 'sanitize' phase (temporarily replacing the TeX content with its Base64 representation), and that seems to have fixed the problem well enough.
+"""]]
diff --git a/doc/forum/ikiwiki_+_mathjax/comment_2_ddb7a4d59bbe7145167d122a146e8f65._comment b/doc/forum/ikiwiki_+_mathjax/comment_2_ddb7a4d59bbe7145167d122a146e8f65._comment
new file mode 100644
index 000000000..af15e0875
--- /dev/null
+++ b/doc/forum/ikiwiki_+_mathjax/comment_2_ddb7a4d59bbe7145167d122a146e8f65._comment
@@ -0,0 +1,11 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnl3JHr3pFPOZMsKgx11_mLCbic1Rb3y8s"
+ nickname="patrick"
+ subject="comment 2"
+ date="2011-01-09T09:48:23Z"
+ content="""
+Hi Carl,
+That's great news; I've been looking for a solution like this for some time.
+Would you mind sharing your patch or writing up a small howto?
+Thanks
+"""]]
diff --git a/doc/forum/ikiwiki_+_mathjax/comment_3_5a118654bc008bbb118285ff141eb6f1._comment b/doc/forum/ikiwiki_+_mathjax/comment_3_5a118654bc008bbb118285ff141eb6f1._comment
new file mode 100644
index 000000000..79551a671
--- /dev/null
+++ b/doc/forum/ikiwiki_+_mathjax/comment_3_5a118654bc008bbb118285ff141eb6f1._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk_MMtLPS7osC5MjX00q2ATjvvXPWqm0ik"
+ nickname="micheal"
+ subject="Tutorial?"
+ date="2011-07-30T20:14:34Z"
+ content="""
+It would be nice to have a little tutorial. I don't know how to set up mathjax.
+"""]]
diff --git a/doc/forum/ikiwiki_--setup_creates_tmp__47___directory_in_destdir.mdwn b/doc/forum/ikiwiki_--setup_creates_tmp__47___directory_in_destdir.mdwn
new file mode 100644
index 000000000..dab883632
--- /dev/null
+++ b/doc/forum/ikiwiki_--setup_creates_tmp__47___directory_in_destdir.mdwn
@@ -0,0 +1,10 @@
+Hi,
+
+I just recently migrated my pyblosxom installation to ikiwiki (see <http://blog.well-adjusted.de/posts/>). Should have done that years ago!
+
+Anyway, so far my only problem is that `ikiwiki --setup mysite.setup` creates a directory named `tmp` in the directory containing my postings, together with an empty index.html. It happens every time if I do a complete rebuild. It does not happen every time with `--refresh`. For example, editing only `local.css` does not trigger the behaviour. Editing any posting will do, but that triggers a complete rebuild anyway due to my sidebar with the tag cloud.
+
+Do you have any idea what might cause this or how I should proceed to find it out? I am a programmer but know next to nothing about Perl.
+
+Thanks,
+Jochen.
diff --git a/doc/forum/ikiwiki__39__s_notion_of_time.mdwn b/doc/forum/ikiwiki__39__s_notion_of_time.mdwn
new file mode 100644
index 000000000..ee564fcc9
--- /dev/null
+++ b/doc/forum/ikiwiki__39__s_notion_of_time.mdwn
@@ -0,0 +1,35 @@
+I'm having some difficulties with ikiwiki's notion of time.
+
+For (regular) pages, the *last edited* date is the one where the file
+was indeed last modified according to the file system information.
+The *created* date (commented out in the HTML) is, at least for
+`--getctime` operation, the date, where the file was last registered
+as changed with the VCS.
+
+Now, at least with git, the thing is that when you're checking out files,
+they'll get the checkout-time's current time stamp.
+
+What I strive for is the following: *created* be the date when the file
+(under its current name) was *first* registered with the VCS (which is
+more logical in my opinion), and *last edited* be the date the file was
+last registered as changed with the VCS, which is the current
+`--getctime` *created* date.
+
+This means that I can build the HTML files from different checkouts of the
+VCS and they won't differ in the time stamps they contain in the HTML.
+
+What is the rationale for ikiwiki's current behavior with respect to these
+time stamps?
+
+--[[tschwinge]]
+
+> Presumably it's the authors of the git and mercurial backends
+> not understanding the documentation for `rcs_getctime`,
+> which states:
+>
+>>This is used to get the page creation time for a file from the RCS, by
+>>looking it up in the history.
+>
+> I've fixed both broken implementations to correctly look
+> up the first, not the last, commit. Other VCS do not seem
+> to have the problem. --[[Joey]]
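The desired semantics boil down to taking the earliest commit for *created* and the latest for *last edited*. A minimal sketch of that rule; the `git log` command in the comment is one way to obtain the timestamps, not necessarily what ikiwiki's git backend actually runs:

```python
def page_times(commit_timestamps):
    # ikiwiki's intended semantics after Joey's fix: "created" is the
    # file's first commit, "last edited" its most recent one. With git,
    # the full list can be obtained with: git log --format=%at -- <file>
    return min(commit_timestamps), max(commit_timestamps)

# Three commits touching one page, in whatever order a backend reports them:
assert page_times([1300000000, 1200000000, 1250000000]) == (1200000000, 1300000000)
```

Because both values come from VCS history rather than filesystem mtimes, rebuilding from a fresh checkout yields identical timestamps in the HTML, which is exactly what tschwinge asked for.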
diff --git a/doc/forum/ikiwiki_and_big_files.mdwn b/doc/forum/ikiwiki_and_big_files.mdwn
new file mode 100644
index 000000000..cd41d9fce
--- /dev/null
+++ b/doc/forum/ikiwiki_and_big_files.mdwn
@@ -0,0 +1,102 @@
+My website has 214 hand-written html files, 1500 pictures and a few, err sorry, 114
+video files. All this takes around 1.5 GB of disk space at the moment.
+Plain html files take 1.7 MB and fit naturally into git.
+
+But what about the picture and video files?
+
+Pictures are mostly static and rarely need to be edited after the first upload,
+so wasting a megabyte or two in git after an edit doesn't really matter.
+Videos on the other hand are quite large, from a few megabytes to hundreds. Sometimes
+I re-encode them from the original source with better codec parameters and just
+replace the files under the html root so they are accessible from the same URL.
+So what I need is a way to delete a 200 MB file and upload a new one with the same name
+and access URL. And it appears git has trouble erasing commits from history, or requires
+some serious git-fu and good backups of the original repository.
+
+So which ikiwiki backend could handle piles of large binary files? Or should I go for a separate
+data/binary blob directory next to ikiwiki content?
+
+Further complication is my intention to keep URL compatibility with old handwritten and ikiwiki
+based site. Sigh, tough job but luckily just a hobby.
+
+[-Mikko](http://mcfrisk.kapsi.fi)
+
+ps. here's how to calculate space taken by html, picture and video files:
+
+ ~/www$ unset sum; for size in $( for ext in htm html txt xml log; \
+ do find . -iname "*$ext" -exec stat -c "%s" \{\} \; ; done | xargs ); \
+ do sum=$(( $sum + $size )); done ; echo $sum
+ 1720696
+ ~/www$ unset sum; for size in $( for ext in jpg gif jpeg png; \
+ do find . -iname "*$ext" -exec stat -c "%s" \{\} \; ; done | xargs ); \
+ do sum=$(( $sum + $size )); done ; echo $sum
+ 46032184
+ ~/www$ unset sum; for size in $( for ext in avi dv mpeg mp4; \
+ do find . -iname "*$ext" -exec stat -c "%s" \{\} \; ; done | xargs ); \
+ do sum=$(( $sum + $size )); done ; echo $sum
+ 1351890888
+
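The same accounting can be done without the nested shell loops; an illustrative Python sketch (not part of ikiwiki) that sums file sizes by extension:

```python
import os
import tempfile

def total_size(root, extensions):
    # Walk `root` and sum the sizes of files whose lowercased extension
    # is in `extensions` -- the same accounting as the shell loops above.
    total = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.lower().rsplit(".", 1)[-1] in extensions:
                total += os.path.getsize(os.path.join(dirpath, name))
    return total

# Tiny demo on a throwaway directory:
d = tempfile.mkdtemp()
with open(os.path.join(d, "a.jpg"), "wb") as f:
    f.write(b"x" * 10)
with open(os.path.join(d, "notes.txt"), "wb") as f:
    f.write(b"y" * 5)
assert total_size(d, {"jpg", "jpeg", "gif", "png"}) == 10
```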
+> One approach is to use the [[plugins/underlay]] plugin to
+> configure a separate underlay directory, and put the large
+> files in there. Those files will then be copied to the generated
+> wiki, but need not be kept in revision control. (Or could be
+> revision controlled in a separate repository -- perhaps one using
+> a version control system that handles large files better than git;
+> or perhaps one that you periodically blow away the old history to
+> in order to save space.)
+>
+> BTW, the `hardlink` setting is a good thing to enable if you
+> have large files, as it saves both disk space and copying time.
+> --[[Joey]]
+
+Can the underlay plugin handle the case where the source and destination directories
+are the same? I'd rather have just one copy of these underlay files on the server.
+
+> No, but enabling hardlinks accomplishes the same effect. --[[Joey]]
+
+And did I goof in the setup file since I got this:
+
+ $ ikiwiki -setup blog.setup -rebuild --verbose
+ Can't use string ("/home/users/mcfrisk/www/blog/med") as an ARRAY ref while
+ "strict refs" in use at
+ /home/users/mcfrisk/bin/share/perl/5.10.0/IkiWiki/Plugin/underlay.pm line 41.
+ $ grep underlay blog.setup
+ add_plugins => [qw{goodstuff websetup comments blogspam html sidebar underlay}],
+ underlaydir => '/home/users/mcfrisk/bin/share/ikiwiki/basewiki',
+ # underlay plugin
+ # extra underlay directories to add
+ add_underlays => '/home/users/mcfrisk/www/blog/media',
+ $ egrep "(srcdir|destdir)" blog.setup
+ srcdir => '/home/users/mcfrisk/blog',
+ destdir => '/home/users/mcfrisk/www/blog',
+ # allow symlinks in the path leading to the srcdir (potentially insecure)
+ allow_symlinks_before_srcdir => 1,
+ # directory in srcdir that contains directive descriptions
+
+-Mikko
+
+> The plugin seems to present a bad default value in the setup file.
+> (Fixed in git.) A correct configuration would be:
+
+ add_underlays => ['/home/users/mcfrisk/www/blog/media'],
+
+Umm, doesn't quite fix this yet:
+
+ $ ikiwiki -setup blog.setup -v
+ Can't use an undefined value as an ARRAY reference at /home/users/mcfrisk/bin/share/perl/5.10.0/IkiWiki
+ /Plugin/underlay.pm line 44.
+ $ grep underlay blog.setup
+ add_plugins => [qw{goodstuff websetup comments blogspam html sidebar underlay}],
+ underlaydir => '/home/users/mcfrisk/bin/share/ikiwiki/basewiki',
+ # underlay plugin
+ # extra underlay directories to add
+ add_underlays => ['/home/users/mcfrisk/www/blog/media'],
+ $ ikiwiki --version
+ ikiwiki version 3.20091032
+
+-Mikko
+
+> Yeah, I've fixed that in git, but you can work around it with this:
+> --[[Joey]]
+
+ templatedirs => [],
diff --git a/doc/forum/ikiwiki_and_big_files/comment_1_df8a9f4249af435cc335f77768a3278d._comment b/doc/forum/ikiwiki_and_big_files/comment_1_df8a9f4249af435cc335f77768a3278d._comment
new file mode 100644
index 000000000..4ab5b52ee
--- /dev/null
+++ b/doc/forum/ikiwiki_and_big_files/comment_1_df8a9f4249af435cc335f77768a3278d._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://mildred.fr/"
+ ip="2a01:e35:2f7b:8350:8d29:c70d:c3e:d110"
+ subject="git-annex"
+ date="2012-12-18T14:12:31Z"
+ content="""
+I suppose we could use git-annex to do that. The question is: does the Git plugin in ikiwiki support git-annex? I'd hope so.
+"""]]
diff --git a/doc/forum/ikiwiki_and_big_files/comment_2_2d996f1124aedc10f345139c3d8b11df._comment b/doc/forum/ikiwiki_and_big_files/comment_2_2d996f1124aedc10f345139c3d8b11df._comment
new file mode 100644
index 000000000..6a11d9ae2
--- /dev/null
+++ b/doc/forum/ikiwiki_and_big_files/comment_2_2d996f1124aedc10f345139c3d8b11df._comment
@@ -0,0 +1,19 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 2"
+ date="2012-12-21T11:02:19Z"
+ content="""
+Unfortunately, ikiwiki [[doesn't follow symlinks for security
+reasons|security]] - if it did, anyone who can commit to the wiki
+repository could publish any file readable by the user who runs ikiwiki,
+including secrets like `~/.gnupg/secring.gpg` or
+`~/.ssh/identity`.
+
+git-annex relies on symlinks, so that restriction breaks it.
+It would be great to be able to use some restricted, safe subset
+of symlinks (\"relative symlinks that point into `.git/annex`\" would
+be enough to support git-annex), and I've looked into it in the past.
+My [[plugins/contrib/album]] plugin would benefit from being able
+to annex the actual photos, for instance.
+"""]]
diff --git a/doc/forum/ikiwiki_and_big_files/comment_3_dfbd38e2b457ea3c4f70266dbf8fbeab._comment b/doc/forum/ikiwiki_and_big_files/comment_3_dfbd38e2b457ea3c4f70266dbf8fbeab._comment
new file mode 100644
index 000000000..6aae6dbd7
--- /dev/null
+++ b/doc/forum/ikiwiki_and_big_files/comment_3_dfbd38e2b457ea3c4f70266dbf8fbeab._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joeyh.name/"
+ ip="2001:4978:f:21a::2"
+ subject="comment 3"
+ date="2012-12-21T14:49:13Z"
+ content="""
+git-annex is gaining a new \"direct\" mode where it does not use symlinks. It remains to be seen if enough git operations will be supported in that mode to make it attractive to use.
+"""]]
diff --git a/doc/forum/ikiwiki_development_environment_tips.mdwn b/doc/forum/ikiwiki_development_environment_tips.mdwn
new file mode 100644
index 000000000..f9c584159
--- /dev/null
+++ b/doc/forum/ikiwiki_development_environment_tips.mdwn
@@ -0,0 +1,68 @@
+I haven't settled on a comfortable/flexible/quick development environment for hacking on ikiwiki. The VM I host my web pages on is not fast enough to use for RAD and ikiwiki. For developing plugins, it seems a bit heavy-weight to clone the entire ikiwiki repository. I haven't managed to get into a habit of running a cloned version of ikiwiki from its own dir, rather than installing it (if that's even possible). The ikiwiki site source (source ./doc) is quite large and not a great testbed for hacking (e.g. if you are working on a plugin you need a tailored test suite for that plugin).
+
+Does anyone have a comfortable setup or tips they would like to share? -- [[Jon]]
+
+> I've just been setting `libdir` in an existing wiki's setup file. When the plugin's in a decent state, I copy it over to a git checkout and commit. For the plugins I've been working on (auth and VCS), this has been just fine. Are you looking for something more? --[[schmonz]]
+
+>> I think this suffers from two problems. Firstly, unless you are tracking git
+>> master in your existing wiki, there's the possibility that your plugin will
+>> not work with a more modern version of ikiwiki (or that it would benefit
+>> from using a newly added utility subroutine or similar).
+
+>>> Unlikely. I don't make changes to the plugin interface that break
+>>> existing plugins. (Might change non-exported `IkiWiki::` things
+>>> from time to time.) --[[Joey]]
+
+>> Second, sometimes I
+>> find that even writing a plugin can involve making minor changes outside of
+>> the plugin code (bug fixes, or moving functionality about). So, I think
+>> having some kind of environment built around a git checkout is best.
+>>
+>> However, this does not address the tedium of writing/maintaining a
+>> setup file for testing things.
+>>
+>> I think I might personally benefit from a more consistent environment (I
+>> move from machine-to-machine frequently). -- [[Jon]]
+
+> If you set `libdir` to point to a checkout of ikiwiki's git repository,
+> it will override use of the installed version of ikiwiki, so ikiwiki will
+> immediately use any changed or new `.pm` files (with the exception of
+> IkiWiki.pm), and you can use git to manage it all without an installation
+> step. If I am modifying IkiWiki.pm, I generally symlink it from
+> `/usr/share/perl5/IkiWiki.pm` to my git repository. Granted, not ideal.
+>
+> I often use my laptop's local version of my personal wiki for testing.
+> It has enough stuff that I can easily test most things, and if I need
+> a test page I just dump test cases on the sandbox. I can make
+> any changes necessary during testing and then `git reset --hard
+> origin/master` to avoid publishing them.
+>
+> If the thing I'm testing involves templates, or underlays,
+> I will instead use ikiwiki's `docwiki.setup` for testing, modifying it as
+> needed, since it is preconfigured to use the templates and underlays
+> from ikiwiki's source repository.
+> --[[Joey]]
+
+> I work with Ikiwiki from the git checkout directory the following way.
+>
+> * instead of running ikiwiki, I wrote the following `mykiwiki` shell script,
+> that also allows me to use my custom lib-ified multimarkdown:
+
+ #!/bin/sh
+
+ MMDSRC="$HOME/src/multimarkdown/lib"
+ IKIWIKISRC="$HOME/src/ikiwiki"
+ PLUGINS="$HOME/src/ikiplugins"
+
+ /usr/bin/perl -I"$MMDSRC" -I"$IKIWIKISRC/blib/lib" -I"$PLUGINS" "$IKIWIKISRC/ikiwiki.out" -libdir "$IKIWIKISRC" "$@"
+
+> * I also have an installed ikiwiki from Debian unstable, from which I only use the base wiki, so my `.setup` has the following configs:
+
+ # additional directory to search for template files
+ templatedir => '/home/oblomov/src/ikiwiki/templates',
+ # base wiki source location
+ underlaydir => '/usr/share/ikiwiki/basewiki',
+ # extra library and plugin directory
+ libdir => '/home/oblomov/src/ikiwiki',
+
+> Hope that helps --GB
diff --git a/doc/forum/ikiwiki_generates_html_files_with_600_permission..mdwn b/doc/forum/ikiwiki_generates_html_files_with_600_permission..mdwn
new file mode 100644
index 000000000..70383372b
--- /dev/null
+++ b/doc/forum/ikiwiki_generates_html_files_with_600_permission..mdwn
@@ -0,0 +1,8 @@
+I installed ikiwiki the usual way; my RCS is git. I configured the post-update hook in the bare repo and use the cgi script in the non-bare one.
+
+I update my wiki through git (clone the bare repo on my laptop (WORKING CLONE), make a change and push it back to origin ($REPOSITORY)). Then the post-update hook (configured in my ikiwiki.config) kicks in and updates the checked out wiki ($DESTDIR) and the cgi script there generates html. See [[rcs/git]] if something is not clear.
+
+My problem is: every generated html/css/favicon file is only write and readable by the user (600) and no one else.
+
+ - Edit: If I edit the wiki through the webinterface everything is fine.
+ - Edit2: I set _everything_ to chmod 0755, but when I run --setup or push to the bare repo, **pages generated through the post-update hook still have the wrong permissions.**
diff --git a/doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_1_6d73d412a9cc6f6ae426b62885c1f157._comment b/doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_1_6d73d412a9cc6f6ae426b62885c1f157._comment
new file mode 100644
index 000000000..7008a05ac
--- /dev/null
+++ b/doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_1_6d73d412a9cc6f6ae426b62885c1f157._comment
@@ -0,0 +1,19 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="your shell has a restrictive umask"
+ date="2011-11-27T13:24:57Z"
+ content="""
+Your shell login to the server (presumably SSH?) is running under a
+restrictive `umask`, and by default ikiwiki doesn't overrule that.
+For instance, perhaps you're using the `pam_umask` module, or
+perhaps your `.bashrc` sets a restrictive mask. This is generally
+a good thing for privacy from other users of a shared server, but
+counterproductive when you're publishing things!
+
+You can configure ikiwiki to set a less restrictive `umask` with
+the `umask` option in your setup file. 18 is probably a good value
+(18 decimal = 022 octal, and a `umask` of 022 octal corresponds
+to `chmod 0755`, because the `umask` is subtracted from 0777 octal
+to get the default permissions).
+"""]]
diff --git a/doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_2_1392fcde369d11a264f31f6b8993ccec._comment b/doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_2_1392fcde369d11a264f31f6b8993ccec._comment
new file mode 100644
index 000000000..7e0818ce6
--- /dev/null
+++ b/doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_2_1392fcde369d11a264f31f6b8993ccec._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 2"
+ date="2011-11-27T13:46:16Z"
+ content="""
+See also [[bugs/octal umask setting is unintuitive]] for more about 18 vs. 022.
+"""]]
diff --git a/doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_3_962306f22ceb17afb4150e766e9a05b3._comment b/doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_3_962306f22ceb17afb4150e766e9a05b3._comment
new file mode 100644
index 000000000..6ed955061
--- /dev/null
+++ b/doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_3_962306f22ceb17afb4150e766e9a05b3._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="ikiwikert"
+ ip="134.99.22.236"
+ subject="comment 3"
+ date="2011-11-27T16:54:05Z"
+ content="""
+Thank you for your enlightening post! I set the umask option to 022 (octal) and the wrapper to 0755 and it worked. However, I guess it is not a good thing to mix modes, and I would appreciate it if you implemented the \"keyword approach\" you mentioned.
+
+Or at least one way of defining modes would be okay for average joes like me.
+"""]]
diff --git a/doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_4_8b988d85cfde123798238d0348764c79._comment b/doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_4_8b988d85cfde123798238d0348764c79._comment
new file mode 100644
index 000000000..838b13756
--- /dev/null
+++ b/doc/forum/ikiwiki_generates_html_files_with_600_permission./comment_4_8b988d85cfde123798238d0348764c79._comment
@@ -0,0 +1,22 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 4"
+ date="2011-11-28T09:52:38Z"
+ content="""
+Joey merged my branch, so from the next release of ikiwiki you'll be able
+to say `umask => 'public'` (or `private` or `group` for the other two
+potentially-useful umasks).
+
+I'm not sure what you mean about mixing modes? The wrapper modes are
+something else - the wrapper modes are about who can run the CGI (or the
+git commit hook) and whether they're setuid (run as the user who owns
+the wiki) or not (run as the web server user or the git push user),
+whereas `umask` is about the permissions that ikiwiki will assign to
+new files it creates (like the HTML).
+
+A typical public wiki like this one will have `umask => 'public'`;
+the wrapper modes will either be `0755` or `04755` (both octal),
+depending on the details of how the web server runs the CGI
+and how git pushes are done.
+"""]]
diff --git a/doc/forum/ikiwiki_over_database__63__.wiki b/doc/forum/ikiwiki_over_database__63__.wiki
new file mode 100644
index 000000000..a70f9c989
--- /dev/null
+++ b/doc/forum/ikiwiki_over_database__63__.wiki
@@ -0,0 +1,11 @@
+Is there any possibility of modifying ikiwiki (via a plugin) to store pages in a database? I'm thinking about storing pages in sqlite or mysql to serve them much faster. The idea is from sputnik.org [http://sputnik.freewisdom.org/], but with perl ;-). Could we integrate the sputnik code into ikiwiki as a solution?
+
+-----
+
+ikiwiki generates static pages in a filesystem. It's responsible for editing and regenerating them, but they're served by any old web server. If you go to the trouble of stuffing the generated pages into a database, you'll need to go to further trouble to serve them back out somehow: write your own web server, perhaps, or a module for a particular web server. Either way you'll have sacrificed ikiwiki's interoperability, and it's not at all clear (since you're adding, in the best case, one layer of indirection reading the generated files) you'll have gained any improved page-serving performance. If it's source pages you want to store in a database, then you lose the ability to do random Unixy things to source pages, including managing them in a revision control system.
+
+Static HTML pages in a filesystem and the ability to do random Unixy things are two of the uniquely awesome features of ikiwiki. It's probably possible to do what you want, but it's unlikely that you really want it. I'd suggest you either get to know ikiwiki better, or choose one of the many wiki implementations that already works as you describe. --[[Schmonz]]
+
+---
+
+Thanks, [[Schmonz]]. You cleared up many things for me. ... Xan.
diff --git a/doc/forum/ikiwiki_vim_integration.mdwn b/doc/forum/ikiwiki_vim_integration.mdwn
new file mode 100644
index 000000000..4724807e8
--- /dev/null
+++ b/doc/forum/ikiwiki_vim_integration.mdwn
@@ -0,0 +1,17 @@
+Hi all. I upgraded the [ikiwiki-nav plugin](http://www.vim.org/scripts/script.php?script_id=2968)
+so that now it supports:
+
+ * Jumping to the file corresponding to the wikilink under the cursor.
+ * Creating the file corresponding to the wikilink under the cursor (including
+ directories if necessary.)
+ * Jumping to the previous/next wikilink in the current file.
+ * Autocompleting link names.
+
+Download it from [here](http://www.vim.org/scripts/script.php?script_id=2968)
+
+I've also created a new page unifying all the hints available here to use vim
+with ikiwiki files, in [[tips/vim_and_ikiwiki]]
+
+
+--[[jerojasro]]
+
diff --git a/doc/forum/ikiwiki_vim_syntaxfile.mdwn b/doc/forum/ikiwiki_vim_syntaxfile.mdwn
new file mode 100644
index 000000000..e6942cd2d
--- /dev/null
+++ b/doc/forum/ikiwiki_vim_syntaxfile.mdwn
@@ -0,0 +1,26 @@
+See the new syntax file [[here|tips/vim_syntax_highlighting]]. It fixes both of
+the problems reported below.
+
+----
+
+Hi all,
+
+I'm teaching myself how to write syntax files for vim by fixing several issues
+(and, to a certain extent, taking over the maintenance) of the vim syntax
+(highlighting) file for ikiwiki.
+
+I'd like you to document here which problems you have found, so I can hunt them
+and see if I can fix them.
+
+## Problems Found
+
+ * Arguments of directives with a value of length 1 cause the following text to
+ be highlighted incorrectly. Example:
+
+ [[!directive param1="val1" param2="1"]] more text ...
+
+ * A named wikilink in a line, followed by text, and then another wikilink,
+   causes the text between the links to be highlighted incorrectly. Example:
+
+ \[[a link|alink]] text that appears incorrectly .... \[[link]]
+
diff --git a/doc/forum/index_attachments.mdwn b/doc/forum/index_attachments.mdwn
new file mode 100644
index 000000000..8167a60f8
--- /dev/null
+++ b/doc/forum/index_attachments.mdwn
@@ -0,0 +1,9 @@
+Why doesn't the [[plugins/search]] plugin index attachments? Are there any
+technical reasons for not including this feature/option? (Besides increased
+processing time, and depending on external programs.)
+
+One could check for all non-mdwn files, convert them to text, if such a thing is
+possible, and add them as documents; I guess `needsbuild` would be a good place
+for that.
+
+--[[jerojasro]]
diff --git a/doc/forum/index_attachments/comment_1_18b9531d273292b45051eef6a306ca26._comment b/doc/forum/index_attachments/comment_1_18b9531d273292b45051eef6a306ca26._comment
new file mode 100644
index 000000000..056c4139a
--- /dev/null
+++ b/doc/forum/index_attachments/comment_1_18b9531d273292b45051eef6a306ca26._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2012-01-13T17:46:49Z"
+ content="""
+I don't think there are really any reasons, other than no one having done it.
+
+Although it is worth noting that using additional libraries/programs to, e.g., pull exif data and comments out of image files and make it searchable does potentially increase ikiwiki's attack surface.
+"""]]
diff --git a/doc/forum/index_attachments/comment_2._comment b/doc/forum/index_attachments/comment_2._comment
new file mode 100644
index 000000000..a7eec29cb
--- /dev/null
+++ b/doc/forum/index_attachments/comment_2._comment
@@ -0,0 +1,31 @@
+[[!comment format=mdwn
+ username="jerojasro"
+ nickname="jerojasro"
+ subject="RE: comment 1"
+ date="2012-01-15T23:49:49Z"
+ content="""
+I've modified the plugin, adding the possibility of indexing attachments. Only
+PDF attachments for now, but support for other filetypes should be really easy to add.
+
+The changes to `IkiWiki/Plugin/search.pm` are available at
+<http://git.devnull.li/ikiwiki.git>, in the `srchatt` branch.
+
+I have a small question about filenames and security: I'm using `qx` to execute
+the program that extracts the text from the PDF files, but `qx` executes a
+whole string, and passes it not to the program I want to run, but to a shell,
+so it is possible (I think) to craft a filename that, in a shell, expands to
+something nasty.
+
+How do the Perl/IkiWiki experts suggest to handle these potentially unsafe
+filenames? I've thought of the following options:
+
+ * Running the text extractor program using `Proc::Safe`. I could not find a
+ Debian package for it, and I'd rather avoid adding another dependency to
+ IkiWiki.
+ * Running the text extractor program as suggested in the `perlipc` document,
+ using `fork` + `exec`.
+
+I haven't done any of those because I'd like to check if there are any helpers
+in IkiWiki to do this. Perhaps the `IkiWiki::possibly_foolish_untaint` function
+does it? (I didn't really understand what it does...)
+"""]]
diff --git a/doc/forum/index_attachments/comment_3_050e5847641a27e0c14232632f3e700a._comment b/doc/forum/index_attachments/comment_3_050e5847641a27e0c14232632f3e700a._comment
new file mode 100644
index 000000000..b589ae823
--- /dev/null
+++ b/doc/forum/index_attachments/comment_3_050e5847641a27e0c14232632f3e700a._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawljSQThLsc4vHz0jw1aSR74Dj9K5J_NKqk"
+ nickname="Michal"
+ subject="comment 3"
+ date="2012-01-17T16:45:37Z"
+ content="""
+Maybe it could be sufficient to run a command similar to
+
+ omindex --db /path/to/.ikiwiki/xapian/default --url http://webserver/ikiwiki /path/to/public_html
+"""]]
diff --git a/doc/forum/index_attachments/comment_4._comment b/doc/forum/index_attachments/comment_4._comment
new file mode 100644
index 000000000..1a2726290
--- /dev/null
+++ b/doc/forum/index_attachments/comment_4._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="jerojasro"
+ nickname="jerojasro"
+ subject="RE: comment 1"
+ date="2012-01-21T21:44:00"
+ content="""
+[[Michal]], that's not a bad idea IMO, but we would lose some [[searching
+keywords|ikiwiki/searching]] and would also index structural elements
+(navigation text, and so on).
+"""]]
diff --git a/doc/forum/installation_and_setup_questions.mdwn b/doc/forum/installation_and_setup_questions.mdwn
new file mode 100644
index 000000000..50633ec3f
--- /dev/null
+++ b/doc/forum/installation_and_setup_questions.mdwn
@@ -0,0 +1,52 @@
+[[!meta date="2007-03-02 00:57:08 +0000"]]
+
+Ikiwiki creates a .ikiwiki directory in my wikiwc working directory. Should I
+"svn add .ikiwiki" or add it to svn:ignore?
+
+> `.ikiwiki` is used by ikiwiki to store internal state. You can add it to
+> svn:ignore. --[[Joey]]
+> > Thanks a lot.
+
+Is there an easy way to log via e-mail to some webmaster address, instead
+of via syslog?
+
+> Not sure why you'd want to do that, but couldn't you use a tool like
+> logwatch to mail selected lines from the syslog? --[[Joey]]
+
+> > The reason is that I'm not logged in on the web server regularly to
+> > check the log files. I'll see whether I can install a logwatch instance.
+
+I'm trying to install from scratch on a CentOS 4.6 system. I installed perl 5.8.8 from source and then added all the required modules via CPAN. When I build ikiwiki from the tarball, I get this message:
+
+ rendering todo/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn
+ *** glibc detected *** double free or corruption (!prev): 0x0922e478 ***
+ make: *** [extra_build] Aborted
+
+I'm kind of at a loss how to track this down or work around it. Any suggestions? --Monty
+
+> All I can tell you is that it looks like a problem with your C library or
+> perl. Little perl programs like ikiwiki should only be able to trigger
+> such bugs, not contain them. :-) Sorry I can't be of more help.
+> --[[Joey]]
+
+> I had a similar problem after upgrading to the latest version of
+> Text::Markdown from CPAN. You might try either looking for a Markdown
+> package for CentOS or using the latest version of John Gruber's
+> Markdown.pl:
+> <http://daringfireball.net/projects/downloads/Markdown_1.0.2b8.tbz>
+> --[[JasonBlevins]], April 1, 2008 18:22 EDT
+
+>> Unfortunately I couldn't find a CentOS package for markdown, and I
+>> couldn't quite figure out how to use John Gruber's version instead.
+>> I tried copying it to site_perl, etc., but the build doesn't pick
+>> it up. For now I can just play with it on my Ubuntu laptop for which
+>> the debian package installed flawlessly. I'll probably wait for an
+>> updated version of Markdown to see if this is fixed in the future.
+>> --Monty
+
+> I suggest that you pull an older version of Text::Markdown from CPAN. I am using <http://backpan.perl.org/authors/id/B/BO/BOBTFISH/Text-Markdown-1.0.5.tar.gz> and that works just fine.
+> There is a step change in version and size between this version (dated 11 Jan 2008) and the next version (1.0.12, dated 18 Feb 2008). I shall have a little look to see why, in due course.
+> Ubuntu Hardy Heron has a Debian package now, but that does not work either.
+> --Dirk 22Apr2008
+
+> This might be related to [Text::Markdown bug #37297](http://rt.cpan.org/Public/Bug/Display.html?id=37297).--ChapmanFlack 9Jul2008
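+
+> A minimal sketch of pinning that older Text::Markdown from the BackPan
+> URL mentioned above (untested; adjust paths to taste):
+>
+>     wget http://backpan.perl.org/authors/id/B/BO/BOBTFISH/Text-Markdown-1.0.5.tar.gz
+>     tar xzf Text-Markdown-1.0.5.tar.gz
+>     cd Text-Markdown-1.0.5
+>     perl Makefile.PL && make && make test && sudo make install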
diff --git a/doc/forum/installation_as_non-root_user.mdwn b/doc/forum/installation_as_non-root_user.mdwn
new file mode 100644
index 000000000..4997af2ac
--- /dev/null
+++ b/doc/forum/installation_as_non-root_user.mdwn
@@ -0,0 +1,7 @@
+I'd like to install ikiwiki as a non-root user. I can plow through getting all the
+perl dependencies installed because that's well documented in the perl world,
+but I don't know how to tell ikiwiki to install somewhere other than / --BrianWilson
+
+> Checkout the tips section for [[tips/DreamHost]]. It should do the trick. --MattReynolds
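+>
+> More generally, ikiwiki's Makefile.PL is ExtUtils::MakeMaker based, so a
+> non-root install sketch might look like this (paths here are assumptions):
+>
+>     perl Makefile.PL INSTALL_BASE=$HOME/local
+>     make
+>     make install
+>     export PATH=$HOME/local/bin:$PATH
+>     export PERL5LIB=$HOME/local/lib/perl5:$PERL5LIB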
+
+[[!meta date="2008-01-13 16:02:52 -0500"]]
diff --git a/doc/forum/installation_of_selected_docs.mdwn b/doc/forum/installation_of_selected_docs.mdwn
new file mode 100644
index 000000000..81dd1ee00
--- /dev/null
+++ b/doc/forum/installation_of_selected_docs.mdwn
@@ -0,0 +1,29 @@
+[[!meta date="2007-09-06 19:47:23 +0000"]]
+
+# Installation of selected docs (html)
+
+The latest release has around 560 files (over 2MB) in html.
+
+Any suggestions or ideas on limiting what html is installed?
+
+For example, I don't see the value in every ikiwiki install out there also installing the personal "users" ikiwiki pages.
+
+For now I copy ikiwiki.setup, and then use pax with the -L switch to copy the targets of the symlinks of the basewiki.
+
+I was thinking of making a list of desired documents from the html directory to install.
+
+--JeremyReed
+
+> You don't need any of them, unless you want to read ikiwiki's docs locally.
+>
+> I don't understand why you're installing the basewiki files manually;
+> ikiwiki has a Makefile that will do this for you. --[[Joey]]
+
+>> The Makefile's install doesn't do what I want, so I use a different installer for it.
+>> It assumes the wrong location for man pages for me. (It should also consider using INSTALLVENDORMAN1DIR and
+>> MAN1EXT, but I don't know about section 8 since I don't know of a perl variable for that.)
+>> I don't want w3m cgi installed; it is optional for my package.
+>> I will just patch for that instead of using my own installer.
+>> Note: I am working on the pkgsrc package build specification for this. This is for creating
+>> packages for NetBSD, DragonFly, and other systems that use the pkgsrc package system.
+>> --JeremyReed
diff --git a/doc/forum/is_it_possible_to_NOT_add_openid2_meta_tags.mdwn b/doc/forum/is_it_possible_to_NOT_add_openid2_meta_tags.mdwn
new file mode 100644
index 000000000..e952263a3
--- /dev/null
+++ b/doc/forum/is_it_possible_to_NOT_add_openid2_meta_tags.mdwn
@@ -0,0 +1,67 @@
+### "meta openid" problems
+
+I have added the following to _index.mdwn_ on my site.
+
+ \[[!meta openid="http://certifi.ca/lunix"
+ server="http://certifi.ca/_serve"]]
+
+This resulted in the following being added to my site:
+
+ <link href="http://certifi.ca/_serve" rel="openid.server" />
+ <link href="http://certifi.ca/_serve" rel="openid2.provider" />
+ <link href="http://certifi.ca/lunix" rel="openid.delegate" />
+ <link href="http://certifi.ca/lunix" rel="openid2.local_id" /> -->
+
+Perhaps I have done something wrong, but this fails to work when I try to log in to several sites using my site's url as my login.
+If I edit index.html and remove the two openid2 lines, all works fine.
+**Is there a way to only add openid version 1 tags to my index.html?
+Or a way to make it work the way it is?** --[Mick](http://www.lunix.com.au)
+
+> Before I think about adding a way to not add the openid 2 tags,
+> I'd like to know what the problem is. Is there something
+> wrong with the tags? Does your openid provider not support
+> openid 2, and the site you are logging into sees the openid 2 tags
+> and uses it, not falling back to openid 1?
+>
+> Since certifi.ca is a public openid provider (run by a
+> guy I know even!), I should be
+> able to reproduce your problem if you can tell me what
+> site(s) you are trying to log into. --[[Joey]]
+
+----------
+
+I was using _phpMyID_, which is not _openid2_ compliant, so I switched to certifi.ca to work around that, but I really
+want to go back to running my own provider.
+I can't log in to identi.ca unless I comment out the openid2 lines. (This may be their problem; I get sent to certifi.ca's site and redirected back to identi.ca.)
+I will test all the different openid-enabled sites I log into today and see what happens.
+It seems that since I moved my site to its final location and made it live overnight, I am able to log in to most places now.
+I do not have a proper understanding of the inner workings of openid, so I'm not exactly sure what part is failing, but I think the problem
+lies with the consumers not falling back to the openid1 tags when they are openid1-only consumers. --[Mick](http://www.lunix.com.au)
+
+> So, just to clarify, certifi.ca works ok (I verified this, logging into identi.ca using it).
+> You had the problem running your own openid provider which did not support 2.0, in which case,
+> consumers seem justified in not falling back (guess; I don't know the 2.0 spec).
+> The only way this seems fixable is to add an option to meta to allow disabling openid 2. Which
+> should be easy enough to do. --[[Joey]]
+
+I can't log into identi.ca with the openid2 tags. Strange. I will look at that again today.
+Having the option to disable openid2 tags would be perfect.
+Thanks Joey. --[Mick](http://www.lunix.com.au)
+
+>> Actually, it seems that identi.ca / certifi.ca do
+>> not interoperate when using openid2. It actually
+>> fails half the time, and succeeds half the time;
+>> seems to be picking openid1 and openid2 randomly and failing
+>> on the latter. I have emailed Evan Prodromou about this weird behavior.
+>> Not clear to me if identi.ca or certifi.ca is at fault,
+>> but luckily he runs both..
+>> --[[Joey]]
+
+Ahh so it's not just me.
+It's handy having contacts in the _right_ places. --[Mick](http://www.lunix.com.au)
+
+>> ikiwiki's next release will allow adding 'delegate=1' to the
+>> meta directive to only delegate to openid1. --[[Joey]]
+
+## awesome.
+--[Mick](http://www.lunix.com.au)
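+
+> With that release, limiting the page to openid1 delegation would
+> presumably look like this (untested sketch, parameter name from the
+> note above):
+>
+>     \[[!meta openid="http://certifi.ca/lunix"
+>     server="http://certifi.ca/_serve" delegate=1]]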
diff --git a/doc/forum/java_script_slideshow.mdwn b/doc/forum/java_script_slideshow.mdwn
new file mode 100644
index 000000000..7289e3dec
--- /dev/null
+++ b/doc/forum/java_script_slideshow.mdwn
@@ -0,0 +1,11 @@
+Hi there,
+
+I tried yesterday to make a slide-show for my frontpage. I enabled the [[!iki plugins/meta desc=meta]] and the [[!iki plugins/html desc=html]] plugins, then, on my frontpage, I pointed to the .js files that do the trick and added the html code to actually run the slide-show. I also set [[!iki plugins/htmlscrubber desc=htmlscrubber]] to skip everything except the Discussion pages. I used the code from <http://www.dynamicdrive.com/dynamicindex14/fadeinslideshow.htm>.
+
+In theory this should have worked, but unfortunately it didn't work at all. I know I can use the [[!iki plugins/rawhtml desc=rawhtml]] plugin, but with that I get a page that is treated as data; I need the page to be treated as source so I can use directives. I'm now wondering if this is even possible in ikiwiki...
+
+Anyway, I thought I'd check with you guys. The main idea is to have a simple slide-show with 4 or 5 slides that change every 3 seconds, and each slide should link to a different page. Any ideas are extremely welcome!
+
+Thanks
+
+Marius
diff --git a/doc/forum/java_script_slideshow/comment_1_3eba0b2f3c12acc991dc3069d2b83d49._comment b/doc/forum/java_script_slideshow/comment_1_3eba0b2f3c12acc991dc3069d2b83d49._comment
new file mode 100644
index 000000000..745da0281
--- /dev/null
+++ b/doc/forum/java_script_slideshow/comment_1_3eba0b2f3c12acc991dc3069d2b83d49._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2011-06-23T07:52:04Z"
+ content="""
+See [[tips/embedding_content]], which applies equally to any scripting or other \"unsafe\" markup.
+"""]]
diff --git a/doc/forum/java_script_slideshow/comment_2_59d90f42b2ca2a5cc71a4d9ba9b9ee9f._comment b/doc/forum/java_script_slideshow/comment_2_59d90f42b2ca2a5cc71a4d9ba9b9ee9f._comment
new file mode 100644
index 000000000..6df07cb6a
--- /dev/null
+++ b/doc/forum/java_script_slideshow/comment_2_59d90f42b2ca2a5cc71a4d9ba9b9ee9f._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 2"
+ date="2011-06-23T07:54:06Z"
+ content="""
+Oh, sorry, you already said you disabled the htmlscrubber. In that case,
+please define \"doesn't work\" - it might be helpful to compare the IkiWiki
+output with similar code that you've added to a static copy of the HTML?
+"""]]
diff --git a/doc/forum/java_script_slideshow/comment_3_820a86db38231cff7239f0a88b1925fd._comment b/doc/forum/java_script_slideshow/comment_3_820a86db38231cff7239f0a88b1925fd._comment
new file mode 100644
index 000000000..e14d6d892
--- /dev/null
+++ b/doc/forum/java_script_slideshow/comment_3_820a86db38231cff7239f0a88b1925fd._comment
@@ -0,0 +1,21 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk3MUhMRflE8-Vg4fevsT1sadSetiAxVKg"
+ nickname="Marius"
+ subject="IT WORKS!!! Ikiwiki rules!"
+ date="2011-06-23T09:42:00Z"
+ content="""
+Well, surprise, I tried a different approach and this time it worked perfectly. So, how did I do it?...
+
+I created an html file externally with the slide-show I wanted and then put it (along with the javascript files and the images) in the source directory. I rebuilt the wiki and then inlined that html page in the wiki page where I wanted the slide-show.
+
+So, from now on my wiki will look like any wordpress or drupal eye-candy website but without all that bloat.
+
+Short outline:
+
+- put the slide-show files in the source directory
+- rebuild the wiki
+- on the wiki page where you want some eye-candy, inline the html page that contains the slide-show
+- that's it ;)
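+
+The inlining step would presumably look something like this (the page name
+is a placeholder; untested):
+
+    \[[!inline pages=\"slideshow\" raw=yes]]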
+
+Marius
+"""]]
diff --git a/doc/forum/java_script_slideshow/comment_4_a68972e3dd20b65119211d4ab120b294._comment b/doc/forum/java_script_slideshow/comment_4_a68972e3dd20b65119211d4ab120b294._comment
new file mode 100644
index 000000000..1cbd2040a
--- /dev/null
+++ b/doc/forum/java_script_slideshow/comment_4_a68972e3dd20b65119211d4ab120b294._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk3MUhMRflE8-Vg4fevsT1sadSetiAxVKg"
+ nickname="Marius"
+ subject="comment 4"
+ date="2011-06-23T09:49:29Z"
+ content="""
+Oh, I forgot to mention. It seems that you have to have both [[!iki plugins/html desc=html]] and [[!iki plugins/rawhtml desc=rawhtml]] plugins enabled for this to work.
+
+Marius
+"""]]
diff --git a/doc/forum/link_autocompletion_in_vim.mdwn b/doc/forum/link_autocompletion_in_vim.mdwn
new file mode 100644
index 000000000..a46c7e4c1
--- /dev/null
+++ b/doc/forum/link_autocompletion_in_vim.mdwn
@@ -0,0 +1,22 @@
+This page is deprecated. See [[tips/vim_and_ikiwiki]] for the most up to date
+content.
+
+------
+
+I extended the functionality of the [ikiwiki-nav plugin](http://www.vim.org/scripts/script.php?script_id=2968)
+(see [[here|tips/vim_ikiwiki_ftplugin]]) to allow completion of
+wikilinks from inside vim, through the omnicompletion mechanism.
+
+It still has some bugs, but is usable, and will not destroy your data. It can
+only complete links whose definition (text) is on a single line, and still can't
+handle "named links" (`\[\[text|link\]\]`).
+
+I'd love to hear suggestions for improving it, and bug reports ;) For
+example, regarding how the available completions are sorted and presented
+(by date, alphabetically, etc.).
+
+You can find a tarball for it
+[here](http://devnull.li/~jerojasro/ikiwiki-nav-dev.tar.gz). To install it,
+extract the tarball contents in your `.vim` directory.
+
+--[[jerojasro]]
diff --git a/doc/forum/link_to_an_image_inside_the_wiki_without_inlining_it.mdwn b/doc/forum/link_to_an_image_inside_the_wiki_without_inlining_it.mdwn
new file mode 100644
index 000000000..3f2713678
--- /dev/null
+++ b/doc/forum/link_to_an_image_inside_the_wiki_without_inlining_it.mdwn
@@ -0,0 +1,72 @@
+[[!template id=gitbranch branch=wtk/linktoimageonly author="[[wtk]]"]]
+
+How can I create a link to an image which is part of the wiki, without having it inserted in my page?
+
+I thought this:
+
+ \[[look at this|img/lolcat.png]]
+
+would work, but it doesn't.
+
+Any hints? --[[jerojasro]]
+
+> Well, currently the syntax above will display the image
+> inline with the specified link text used as an alt attribute. Although
+> that does not seem to be documented anywhere.
+>
+> A few places that use that (found with `git grep '\[\[' | egrep 'png|gif|jpeg|jpg' |grep \|`):
+>
+> * [[logo]] uses it to provide useful alt texts for the logos. (This
+> could easily be changed to use [[ikiwiki/directive/img]] though.)
+> * The `change.tmpl` template uses it to display
+> the [[diff|wikiicons/diff.png]] with a very useful "diff" alt text.
+> Using [[ikiwiki/directive/img]] here would mean that the
+> [[plugins/recentchanges]] plugin would depend upon the img
+> plugin.
+>
+> I do like your suggestion, it makes more sense than the current behavior.
+> I'm not sure the transition pain to get from here to there is worth it,
+> though.
+>
+> More broadly, if I were writing ikiwiki now, I might choose to leave out the
+> auto-inlining of images altogether. In practice, it has added a certain level
+> of complexity to ikiwiki, with numerous plugins needing to specify
+> `noimageinline` to avoid accidentally inlining an image. And there has not
+> been a lot of payoff from having the auto-inlining feature implicitly
+> available most places. And the img directive allows much needed control over
+> display, so it would be better for users to not have to worry about its
+> lesser cousin. But the transition from here to *there* would be another order
+> of pain.
+>
+> Anyway, the cheap and simple answer to your question is to use html
+> or markdown instead of a [[ikiwiki/wikilink]]. Ie,
+> `[look at this](img/lolcat.jpg)`. --[[Joey]]
+
+> > thanks a lot, that's a quite straightforward solution. I actually wrote a
+> > broken plugin to do that, and now I can ditch it --[[jerojasro]]
+
+>>> The plugin approach is not a bad idea if you want either the ability
+>>> to:
+>>>
+>>> * Have things that are wikilink-aware (like [[plugins/brokenlinks]])
+>>> treat your link to the image as a wikilink.
+>>> * Use standard wikilink path stuff (and not have to worry about
+>>> a relative html link breaking if the page it's on is inlined, for
+>>> example).
+>>>
+>>> I can help you bang that plugin into shape if need be. --[[Joey]]
+
+>>>> both my plugin and your suggestion yield broken html links when inlining the page (although that's probably expected with your suggestion (`[]()`))
+>>>>
+>>>> I thought using the `bestlink` function would take care of that, but alas, it doesn't.
+>>>> Get the "plugin" [here](http://devnull.li/~jerojasro/files/linktoimgonly.pm), see the broken
+>>>> links generated [here](http://devnull.li/~jerojasro/blog/posts/job_offers/) and the source
+>>>> file for that page [here](http://git.devnull.li/cgi-bin/gitweb.cgi?p=blog-jerojasro.git;a=blob;f=posts/job_offers.mdwn;hb=HEAD) --[[jerojasro]]
+
+>>>>> Use this --[[Joey]]
+
+ return htmllink($params{page}, $params{destpage}, $params{"img"},
+ linktext => $params{text},
+ noimageinline => 1);
+
+> [[patch]]: I've updated this plugin for the current ikiwiki. --[[wtk]]
diff --git a/doc/forum/links_to_diff_on_recentchanges__63__.mdwn b/doc/forum/links_to_diff_on_recentchanges__63__.mdwn
new file mode 100644
index 000000000..9a8db62b9
--- /dev/null
+++ b/doc/forum/links_to_diff_on_recentchanges__63__.mdwn
@@ -0,0 +1 @@
+How can I get the little glasses-icon with a link to the diff on Recentchanges, please? I have git integration working nicely, and recentchangesdiff is producing diffs in the RSS feed, but I'd really like a link on the RecentChanges like you have here.
diff --git a/doc/forum/links_to_diff_on_recentchanges__63__/comment_1_1dbc723cc2794f6d45de9cbd2fc2e0fd._comment b/doc/forum/links_to_diff_on_recentchanges__63__/comment_1_1dbc723cc2794f6d45de9cbd2fc2e0fd._comment
new file mode 100644
index 000000000..a09b410b3
--- /dev/null
+++ b/doc/forum/links_to_diff_on_recentchanges__63__/comment_1_1dbc723cc2794f6d45de9cbd2fc2e0fd._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2012-02-23T20:20:17Z"
+ content="""
+Set up gitweb and configure the `diffurl` in ikiwiki and the glasses will appear.
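+
+A gitweb `diffurl` looks something like this (host and repository name are
+placeholders; the `\[[...]]` fields are expanded by ikiwiki):
+
+    diffurl => 'http://git.example.com/gitweb.cgi?p=wiki.git;a=blobdiff;f=\[[file]];h=\[[sha1_to]];hp=\[[sha1_from]];hb=\[[sha1_commit]]',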
+"""]]
diff --git a/doc/forum/links_to_diff_on_recentchanges__63__/comment_2_4349c85d92cf9c1acf2e7678371ab12a._comment b/doc/forum/links_to_diff_on_recentchanges__63__/comment_2_4349c85d92cf9c1acf2e7678371ab12a._comment
new file mode 100644
index 000000000..b98172bfd
--- /dev/null
+++ b/doc/forum/links_to_diff_on_recentchanges__63__/comment_2_4349c85d92cf9c1acf2e7678371ab12a._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="bob"
+ ip="137.205.152.60"
+ subject="comment 2"
+ date="2012-02-24T09:30:50Z"
+ content="""
+Thanks, that was just the ticket :). My ikiwiki.setup file didn't even have a commented-out diffurl in it - is there a handy list of configuration variables somewhere? I thought the theory was that ikiwiki.setup was generated with nearly all the config variables already mentioned (commented out)?
+
+[FWIW, I'm using the version in Debian stable]
+"""]]
diff --git a/doc/forum/lockedit:_pages_don__39__t_get_locked.mdwn b/doc/forum/lockedit:_pages_don__39__t_get_locked.mdwn
new file mode 100644
index 000000000..109a6a173
--- /dev/null
+++ b/doc/forum/lockedit:_pages_don__39__t_get_locked.mdwn
@@ -0,0 +1,12 @@
+I just tried to lock some pages via the [[plugins/lockedit]] plugin in my wiki, but it does not work. None of the pages I've tried get locked. Here's a snippet of my setup:
+
+ add_plugins => [qw{goodstuff edittemplate filecheck getsource htmltidy recentchanges relativedate rename remove search sidebar po httpauth attachment img 404 inline localstyle pagestats progress orphans map toc brokenlinks autoindex anonok blogspam recentchangesdiff}],
+ disable_plugins => [qw{smiley openid theme}],
+ [...]
+ anonok_pagespec => '*',
+ [...]
+ locked_pages => 'todo and todo/done and index and ikiwiki/*',
+
+Is there interference between anonok and lockedit, or is there just a typo?
+
+I can't imagine another source of the problem. Hope you can help me. --[[bacuh]]
diff --git a/doc/forum/lockedit:_pages_don__39__t_get_locked/comment_1_bacffb831e5ce7ece7e670c55ad9f3af._comment b/doc/forum/lockedit:_pages_don__39__t_get_locked/comment_1_bacffb831e5ce7ece7e670c55ad9f3af._comment
new file mode 100644
index 000000000..1f351ee97
--- /dev/null
+++ b/doc/forum/lockedit:_pages_don__39__t_get_locked/comment_1_bacffb831e5ce7ece7e670c55ad9f3af._comment
@@ -0,0 +1,14 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2011-05-26T09:38:12Z"
+ content="""
+[[plugins/anonok]] might take precedence over [[plugins/lockedit]] (it depends
+which gets loaded first). You should change `anonok_pagespec` to not cover the
+`locked_pages`.
+
+(It might be better if [[plugins/lockedit]] always applied first, though;
+except then it'd interfere with [[plugins/opendiscussion]], so maybe
+[[plugins/opendiscussion]] should apply first of all...)
+"""]]
diff --git a/doc/forum/lockedit:_pages_don__39__t_get_locked/comment_2_ad268d3f2cd3d529cfff281e0ecb2f16._comment b/doc/forum/lockedit:_pages_don__39__t_get_locked/comment_2_ad268d3f2cd3d529cfff281e0ecb2f16._comment
new file mode 100644
index 000000000..d73d114d8
--- /dev/null
+++ b/doc/forum/lockedit:_pages_don__39__t_get_locked/comment_2_ad268d3f2cd3d529cfff281e0ecb2f16._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://xlogon.net/bacuh"
+ ip="93.182.181.39"
+ subject="comment 2"
+ date="2011-05-26T10:56:28Z"
+ content="""
+Thanks a lot. Negating locked_pages in anonok_pagespec works.
+"""]]
diff --git a/doc/forum/lockedit:_pages_don__39__t_get_locked/comment_3_da2fb41c5313763e4393cdd921a3f36e._comment b/doc/forum/lockedit:_pages_don__39__t_get_locked/comment_3_da2fb41c5313763e4393cdd921a3f36e._comment
new file mode 100644
index 000000000..a57b268b4
--- /dev/null
+++ b/doc/forum/lockedit:_pages_don__39__t_get_locked/comment_3_da2fb41c5313763e4393cdd921a3f36e._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://xlogon.net/bacuh"
+ ip="93.182.181.39"
+ subject="comment 3"
+ date="2011-05-26T11:07:08Z"
+ content="""
+Ooops, that was wrong.
+
+It is just not possible to edit those pages anonymously. Lockedit still doesn't work.
+"""]]
diff --git a/doc/forum/lockedit:_pages_don__39__t_get_locked/comment_4_d0de7964db26cb6f3e81d6e8c29d860d._comment b/doc/forum/lockedit:_pages_don__39__t_get_locked/comment_4_d0de7964db26cb6f3e81d6e8c29d860d._comment
new file mode 100644
index 000000000..b97f44e15
--- /dev/null
+++ b/doc/forum/lockedit:_pages_don__39__t_get_locked/comment_4_d0de7964db26cb6f3e81d6e8c29d860d._comment
@@ -0,0 +1,16 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 4"
+ date="2011-05-26T15:08:10Z"
+ content="""
+ locked_pages => 'todo and todo/done and index and ikiwiki/*',
+
+I didn't spot it before, but this is wrong: you want \"or\"
+instead of \"and\".
+
+It's a condition under which pages are to be locked: you're trying
+to lock all pages that are simultaneously todo, todo/done and some
+other names, which is an impossible condition to satisfy, so nothing
+is locked!
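+
+That is, presumably:
+
+    locked_pages => 'todo or todo/done or index or ikiwiki/*',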
+"""]]
diff --git a/doc/forum/lockedit:_pages_don__39__t_get_locked/comment_5_d60727c53197d1c667b59bc7250afd9f._comment b/doc/forum/lockedit:_pages_don__39__t_get_locked/comment_5_d60727c53197d1c667b59bc7250afd9f._comment
new file mode 100644
index 000000000..ae1b37c99
--- /dev/null
+++ b/doc/forum/lockedit:_pages_don__39__t_get_locked/comment_5_d60727c53197d1c667b59bc7250afd9f._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://xlogon.net/bacuh"
+ ip="93.182.181.39"
+ subject="comment 5"
+ date="2011-05-27T11:07:27Z"
+ content="""
+Obvious! Should have realized that.
+
+May this help other people.
+"""]]
diff --git a/doc/forum/managing_todo_lists.mdwn b/doc/forum/managing_todo_lists.mdwn
new file mode 100644
index 000000000..0a69af805
--- /dev/null
+++ b/doc/forum/managing_todo_lists.mdwn
@@ -0,0 +1,44 @@
+I keep some TODO lists on ikiwiki pages. I'm half-tempted to write a plugin
+to make ticking items off and adding items easier via the web interface. I'm
+aware though that this is not really what ikiwiki is designed for. Would
+anyone else find this useful? -- [[users/jon]]
+
+----
+
+My subsequent thoughts about how to approach this are three-fold.
+
+Firstly, a filetype for todo lists, probably OPML, but I haven't looked to see
+if there is something more suitable. A plugin that converts this source into a
+traditional page output, i.e. a DOM tree of ul or ol and li elements.
+
+Secondly, some magic javascript to make editing the list via the web page
+more interactive: add items, strike items out, reorder items etc., without
+round-tripping to the cgi for each operation.
+
+Finally, a mechanism whereby the changes made to the page live can be
+committed back to the repository:
+
+ * ...perhaps the input → output conversion is reversible, and the HTML DOM
+ representing the list can be transformed back into the source and submitted
+ to the cgi like a regular edit: issues include the result of other
+ postprocessing: templates, wikilinks, etc.
+ * perhaps an embedded copy of the source is included in the output and the
+ javascript operates on that in tandem with the static copy
+ * perhaps the "output" is generated live by the JS at view time (with maybe
+ a plugin-generated rendered output for non JS environments)
+
+I envisage a button called "commit changes" appearing once some changes are
+made that submits the changes to the CGI, perhaps via a back channel. I'm not
+sure how to handle embeds or challenges from the CGI such as a login challenge
+(maybe the back channel would not be necessary in the first cut).
+
+> You might look at the [[plugins/hnb]] plugin. HNB supports checklists.
+> There's not a fancy web interface, but the hnb command-line program can
+> be used to edit them. --[[Joey]]
+
+>> thanks - I'll give it a look. I spent a few hours writing some javascript to manipulate a ul/li DOM tree in an outliner-fashion the other day. I might be able to join the puzzle pieces together sometime. [[Jon]]
+
+A solution for this could be similar to a solution for [[todo/structured page data]], as todo lists are definitely a form of structured data. (In both cases, the page's current content is rendered into an html form, whose result is then saved as the page's new contents.) --[[chrysn]]
+
+> Thanks for the link: yup, there's definitely some common ground there.
+> -- [[Jon]]
diff --git a/doc/forum/missing_pages_redirected_to_search-SOLVED.mdwn b/doc/forum/missing_pages_redirected_to_search-SOLVED.mdwn
new file mode 100644
index 000000000..3af83396c
--- /dev/null
+++ b/doc/forum/missing_pages_redirected_to_search-SOLVED.mdwn
@@ -0,0 +1,36 @@
+Is it possible to have any missing pages (404s) redirected to the search (omega)?
+So if someone comes to my site with http://example.com/foo_was_here it would result in 'foo_was_here' being passed as a search parameter to omega? --[Mick](http://www.lunix.com.au)
+
+## DONE
+
+I use nginx instead of apache.
+Just add the following to the `server` block, outside of any location block, in nginx.conf.
+You must also make sure you have set up and enabled the search plugin (omega):
+
+ error_page 404 /ikiwiki.cgi?P=$uri;
+
+
+My full nginx.conf
+
+ server {
+ listen [::]:80; #IPv6 capable
+ server_name www.lunix.com.au;
+ access_log /var/log/nginx/www.lunix.com.au-access.log main;
+ error_log /var/log/nginx/www.lunix.com.au-error.log warn;
+ error_page 404 /ikiwiki.cgi?P=$uri;
+
+ location / {
+ root /home/lunix/public_html/lunix;
+ index index.html index.htm;
+ }
+
+ location ~ ikiwiki\.cgi$ {
+ root /home/lunix/public_html/lunix;
+ include /etc/nginx/fastcgi_params.cgi;
+
+ fastcgi_pass 127.0.0.1:9999;
+ fastcgi_param SCRIPT_FILENAME /home/lunix/public_html/lunix$fastcgi_script_name; # same path as above
+ }
+ }
+
+
diff --git a/doc/forum/missing_pages_redirected_to_search-SOLVED/comment_1_aa03c337b31d7acb95761eb51caab1ef._comment b/doc/forum/missing_pages_redirected_to_search-SOLVED/comment_1_aa03c337b31d7acb95761eb51caab1ef._comment
new file mode 100644
index 000000000..eac5dc165
--- /dev/null
+++ b/doc/forum/missing_pages_redirected_to_search-SOLVED/comment_1_aa03c337b31d7acb95761eb51caab1ef._comment
@@ -0,0 +1,44 @@
+[[!comment format=mdwn
+ username="mathdesc"
+ subject="For lighttpd with mod_magnet"
+ date="2012-08-18T18:27:32Z"
+ content="""
+The same can be done for lighttpd via a Lua script (say, rewrite.lua) using *mod_magnet*, which needs to be installed and
+called in your conf like this:
+
+<pre>
+# error-handler for status 404
+$HTTP[\"url\"] =~ \"^/mysite/\" {
+magnet.attract-physical-path-to = ( server.document-root + \"/rewrite.lua\" )
+}
+</pre>
+
+Ref:
+[[mod_magnet docs|http://redmine.lighttpd.net/projects/lighttpd/wiki/Docs_ModMagnet]]
+
+
+
+<pre>
+
+ function removePrefix(str, prefix)
+ return str:sub(1,#prefix+1) == prefix..\"/\" and str:sub(#prefix+2)
+ end
+
+
+
+ attr = lighty.stat(lighty.env[\"physical.path\"])
+ local prefix = '/mysite'
+ if (not attr) then
+ -- we couldn't stat() the file
+ -- let's generate a xapian query with it
+ new_uri =removePrefix(lighty.env[\"uri.path\"], prefix)
+ print (\"page not found : \" .. new_uri .. \" asking xapian\")
+ lighty.env[\"uri.path\"] = \"/mysite/ikiwiki.cgi\"
+ lighty.env[\"uri.query\"] = \"P=\" .. new_uri
+ lighty.env[\"physical.rel-path\"] = lighty.env[\"uri.path\"]
+ lighty.env[\"physical.path\"] = lighty.env[\"physical.doc-root\"] .. lighty.env[\"physical.rel-path\"]
+ end
+</pre>
+
+Hope this is useful to you :)
+"""]]
diff --git a/doc/forum/move_pages.mdwn b/doc/forum/move_pages.mdwn
new file mode 100644
index 000000000..44d36f6b8
--- /dev/null
+++ b/doc/forum/move_pages.mdwn
@@ -0,0 +1 @@
+How do I move pages in ikiwiki?
diff --git a/doc/forum/move_pages/comment_1_3f1b9563af1e729a7311e869cf7a7787._comment b/doc/forum/move_pages/comment_1_3f1b9563af1e729a7311e869cf7a7787._comment
new file mode 100644
index 000000000..7e2424362
--- /dev/null
+++ b/doc/forum/move_pages/comment_1_3f1b9563af1e729a7311e869cf7a7787._comment
@@ -0,0 +1,11 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2011-08-05T13:57:09Z"
+ content="""
+Enable the [[plugins/rename]] plugin and ensure that you have permission to edit the page.
+Edit the page and press the Rename button.
+
+\"Following Unix tradition, renaming also allows moving to a different directory.\" —from its documentation
+"""]]
diff --git a/doc/forum/move_pages/comment_2_22b1c238faacbf10df5f03f415223b49._comment b/doc/forum/move_pages/comment_2_22b1c238faacbf10df5f03f415223b49._comment
new file mode 100644
index 000000000..c8f7964a1
--- /dev/null
+++ b/doc/forum/move_pages/comment_2_22b1c238faacbf10df5f03f415223b49._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 2"
+ date="2011-08-05T20:55:07Z"
+ content="""
+Or just make a checkout of the site using version control and move files there. :)
+"""]]
diff --git a/doc/forum/multi-user_setup_of_ikiwiki__44___gitosis_and_apache2_in_Debian_Sid.mdwn b/doc/forum/multi-user_setup_of_ikiwiki__44___gitosis_and_apache2_in_Debian_Sid.mdwn
new file mode 100644
index 000000000..b8e28e0a3
--- /dev/null
+++ b/doc/forum/multi-user_setup_of_ikiwiki__44___gitosis_and_apache2_in_Debian_Sid.mdwn
@@ -0,0 +1,96 @@
+Dear developers and users,
+
+
+# Problem
+
+I am trying to set up ikiwiki for a website. Users should be able to edit pages using the web browser (ikiwiki.cgi), and a few should be able to edit it using version control, in this case Git.
+
+I have ikiwiki working for a single user (me), but I do not [[get_the_permissions_right|/rcs/Git]] for multiple users and committers. The wiki admin does not own the Git repository in this case. And I do not understand everything yet (especially concerning wrappers).
+
+# Programs
+
+I am running Debian Etch with gitosis (0.2+20080626-2) installed from etch-backports, Apache2 (apache2.2-common 2.2.3-4+etch5) and ikiwiki (2.63) from Sid.
+
+# Goal
+
+* The website (run by ikiwiki) should be accessible via http://www.example.org/
+* Users can edit pages using the web browser.
+* Git is used as the backend.
+* The Git repository should be publicly browsable via http://git.example.org/git/project.git (gitweb).
+* The Git repository can be accessed with git clone git://git.example.org/git/project.git (git-daemon).
+* Some manually set up users can push their changes over SSH to the repository, and the post-update hook updates the wiki.
+
+# Directory Layout and permissions.
+
+## Website
+
+The website is stored in /srv/www/www.example.org/htdocs/ (destdir in ikiwiki.setup) and is owned by www-data:root with permissions 755.
+
+## Git repository
+
+The [package gitosis](http://joey.kitenet.net/blog/entry/locking_down_ssh_authorized_keys/) creates a user gitosis with the home directory /srv/gitosis/, and the repositories are stored in /srv/gitosis/repository/project.git, owned by gitosis:gitosis with permissions 750. In the configuration file gitosis.conf I can set up who is allowed to access this repository, and whether it should be published using git-daemon or gitweb.
+
+# My efforts without results
+
+I could not come up with a working combination of users and groups that gives a good result with ikiwiki. The main problem is that under Debian the umask is set to 022, which means that the members of a group are not allowed to write. I did not want to change this.
+
+> You can set the umask for ikiwiki itself, without changing the system umask, via the umask setting in the setup file. --[[Joey]]
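+
+For reference, a minimal sketch of what that setting looks like in the setup file (untested; the value 002 is just an example that makes files group-writable):
+
+    umask => 002,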
+
+In the end, I did the following. I created a directory /srv/ikiwiki/ which is owned by gitosis. The [[setup_file|/usage]] is also located there (/srv/ikiwiki/project.setup). I put the srcdir there too (srcdir => '/srv/ikiwiki/project/'). So now sudo -u gitosis ikiwiki --setup project.setup is able to create the post-update hook (git_wrapper => '/srv/gitosis/repositories/project.git/hooks/post-update'). Since this hook is called every time something is checked in over SSH, it is run by gitosis, so I did not set it suid. Or do I have to, because ikiwiki.cgi will be run as www-data?
+
+> Generally, ikiwiki.cgi is run as the user who owns the wiki and repository, in this case, gitosis. The ikiwiki.cgi needs to be able to write to source files in the wiki; it needs to be able to commit changes,
+> and it needs to be able to generate and write the html files. If you don't want ikiwiki.cgi to run as gitosis, you will need to put gitosis and www-data in a group and give them both write access, with appropriate umask, etc. --[[Joey]]
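+
+A sketch of the shared-group approach Joey describes (untested; the user, group and path names are examples taken from this thread and may differ on your system):
+
+    # let the web server user write to the wiki's source files
+    adduser www-data gitosis
+    chgrp -R gitosis /srv/ikiwiki/project
+    chmod -R g+w /srv/ikiwiki/project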
+
+## cgi_wrapper
+
+I do not understand those wrappers completely. The cgi is a script which can be called by a webserver, e. g. [[Apache_2|/tips/dot_cgi]]. But www-data is normally not allowed to write to the source directory (which is owned by gitosis), or to push to the repository. Therefore the script should be run as the user gitosis. And because cgi scripts cannot be made suid, a wrapper (in this case a C program) is created (cgi\_wrapper), which can be made suid and therefore be run as the user gitosis. Is this correct?
+
+> It seems to me like you understand the wrapper pretty well. Its main reason to exist is to safely be suid, yes.
+
+So where is a good place to save this wrapper? cgi_wrapper => '/srv/ikiwiki/project-wrapper'? Then /srv/ikiwiki/project-wrapper is created from a temporary C file project-wrapper.c?
+
+
+Now sudo -u gitosis ikiwiki --setup project.setup is still not able to put the compilation result into /srv/www/www.example.org/htdocs because this is owned by www-data. I came up with two ideas.
+
+1. Set destdir => '/srv/ikiwiki/html-project', do ln -s /srv/ikiwiki/html-project /srv/www/www.example.org/htdocs and adduser www-data gitosis. But I am not sure about the security implications of using symbolic links.
+
+2. Since the webserver (Apache 2) only has to read the html files (is that true for both static and dynamic (PHP) pages?), sudo chown -R gitosis:www-data /srv/www/www.example.org/ should do it. But it is www-data:root by default under Debian, so I do not know if this should be changed.
+
+
+Could you please enlighten me? It should be possible, as for example this site shows.
+
+> www-data is not really intended to own files, so that if the web server is compromised, it cannot rewrite your web site. So make the site's destdir be owned by the same user that ikiwiki runs as.
+> /srv/www is not shipped by debian; it is a bug in debian for any package to make files owned by www-data; so it seems to me that your /srv/www www-data ownership is something you must have configured yourself. --[[Joey]]
+
+Thanks in advance,
+
+--[[PaulePanter]]
+
+---
+
+## Current Working Notes
+
+I've spent a little time getting this working and I wanted to share my notes on this, for those that come after me.
+
+### My Setup
+
+- Debian Lenny
+
+- Gitosis (hand compiled, for no good reason, but it's the same version as in the repository)
+
+- Ikiwiki 3.12 installed using packages from Sid
+
+### What Works
+
+- Everything needs to be chowned git:git (or gitosis:gitosis), i.e. to whatever user gitosis runs as. This includes:
+ - the bare repository (as always)
+ - the srcdir
+ - the destdir
+
+- Ikiwiki needs to run in gitosis' group (e.g. git in my case, but probably gitosis in yours)
+
+- ikiwiki.cgi needs to be set with the wrapper mode 6755.
+
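+A setup-file sketch matching these notes (untested; the paths are examples, not anything from my actual setup):
+
+    cgi_wrapper => '/srv/www/www.example.org/htdocs/ikiwiki.cgi',
+    cgi_wrappermode => '06755',
+    git_wrapper => '/srv/gitosis/repositories/project.git/hooks/post-update',
+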
+-- [[tychoish]]
+
+
diff --git a/doc/forum/multi_domain_setup_possible__63__.mdwn b/doc/forum/multi_domain_setup_possible__63__.mdwn
new file mode 100644
index 000000000..01b31aafb
--- /dev/null
+++ b/doc/forum/multi_domain_setup_possible__63__.mdwn
@@ -0,0 +1,15 @@
+Hi! I am searching for a replacement for my blog and web pages, which have been static HTML with just some custom PHP around it for years. ikiwiki seems to be one of the hot candidates, since it uses an RCS.
+
+I would like to have a multi domain setup like this:
+
+- myname.private.de => more of a personal page
+- professional.de => more of my professional work related page
+- and possibly others
+
+Now when I write a blog entry about some Linux, Debian or KDE stuff, I possibly would like to have it shown on my private and my professional domain.
+
+And I might like to use some kind of inter wiki links now and then.
+
+Is such a setup possible? I thought about having one big wiki with Apache serving subdirectories from it under different domains, but then wiki links would not work.
+
+Maybe having the same blog entry, same content on several domains is not such a hot idea, but as long as I do not see a problem with it, I'd like to do it.
diff --git a/doc/forum/multi_domain_setup_possible__63__/comment_1_43f5df30d09046ccc4f7c44703979a11._comment b/doc/forum/multi_domain_setup_possible__63__/comment_1_43f5df30d09046ccc4f7c44703979a11._comment
new file mode 100644
index 000000000..5b00272b3
--- /dev/null
+++ b/doc/forum/multi_domain_setup_possible__63__/comment_1_43f5df30d09046ccc4f7c44703979a11._comment
@@ -0,0 +1,17 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="branches"
+ date="2010-08-15T16:06:43Z"
+ content="""
+This is where the git backend (or bzr if you prefer) shines. Make a site, and then branch it to a second site, and put your personal type stuff on the branch. cherry-pick or merge changes from one branch to another.
+
+The possibility to do this kind of thing is why our recently launched Ikiwiki hosting service is called
+[Branchable.com](http://branchable.com). It makes it easy to create branches of an Ikiwiki site hosted
+there: <http://www.branchable.com/tips/branching_an_existing_site/>
+(Merging between branches needs manual git, for now.)
+
+BTW, for links between the branched wikis you can just use the [[plugins/shortcut]] plugin.
+
+--[[Joey]]
+"""]]
diff --git a/doc/forum/multi_domain_setup_possible__63__/comment_2_75d6581f81b71fb8acbe3561047ea759._comment b/doc/forum/multi_domain_setup_possible__63__/comment_2_75d6581f81b71fb8acbe3561047ea759._comment
new file mode 100644
index 000000000..473f52f5c
--- /dev/null
+++ b/doc/forum/multi_domain_setup_possible__63__/comment_2_75d6581f81b71fb8acbe3561047ea759._comment
@@ -0,0 +1,16 @@
+[[!comment format=mdwn
+ username="http://claimid.com/helios"
+ nickname="helios"
+ subject="branches"
+ date="2010-08-15T16:18:35Z"
+ content="""
+So I just put a blog entry, which is just a file, on both branches. Seems I have to learn cherry-picking and merging only some changes.
+
+Still, I am then duplicating files, and when I edit one file I have to remember to also edit the other one or merge the change into it. I thought of a way to tag a blog entry with the sites on which it should appear. Then I would just have to edit one file and the content would change on all sites that share it.
+
+But then I could possibly do some master blog / shared content branch, so that shared content is only stored once. Then I need to find a way to automatically replicate the changes there to all sites they belong to. But how do I store it?
+
+I also thought about just using symlinks for files. Can I have two sites in one repository and symlink shared files around? I know bzr can version-control symlinks.
+
+Hmmm, I think I had better read more about branching, cherry-picking and merging before I proceed. I have used bzr and git, but from the user-interface side of things I prefer bzr, which should be fast enough for this use case.
+"""]]
diff --git a/doc/forum/navigation_of_wiki_pages_on_local_filesystem_with_vim.mdwn b/doc/forum/navigation_of_wiki_pages_on_local_filesystem_with_vim.mdwn
new file mode 100644
index 000000000..7bfcf3088
--- /dev/null
+++ b/doc/forum/navigation_of_wiki_pages_on_local_filesystem_with_vim.mdwn
@@ -0,0 +1,130 @@
+**UPDATE** I have created a [[page|tips/follow_wikilinks_from_inside_vim]] in
+the tips section about the plugin, how to get it, install it and use it. Check
+that out. --[[jerojasro]]
+
+I wrote a vim function to help me navigate the wiki when I'm editing it. It extends the 'gf' (goto file) functionality. Once installed, you place the cursor on a wiki page name and press 'gf' (without the quotes); if the file exists, it gets loaded.
+
+This function takes into account the ikiwiki linking rules when deciding which file to go to.
+
+> 'gf' gets in the way when there are directories with the same name as a wiki page. The
+> function below doesn't implement the linking rules properly (test the link (ignoring case),
+> if there is no match ascend the dir. hierarchy and start over, until we reach the root of
+> the wiki). I'm rewriting it to follow these rules properly
+>
+> I think the page for [[LinkingRules|ikiwiki/subpage/linkingrules]] should say that ikiwiki **ascends**
+> the dir. hierarchy when looking for a wikilink, not that it **descends** it. Am I correct? --[[jerojasro]]
+
+>> Conventionally, the root directory is considered to be lower than other
+>> directories, so I think the current wording is correct. --[[Joey]]
+
+Let me know what you think.
+
+> " NOTE: the root of the wiki is considered the first directory that contains a
+> " .ikiwiki folder, except $HOME/.ikiwiki (the usual ikiwiki libdir)
+>
+> That's not going to work in all situations; for example, with an ikiwiki which uses git as the backend, the normal setup is that one has
+>
+> * a bare git repository
+> * a git repository which ikiwiki builds the wiki from (which has a .ikiwiki directory in it)
+> * an *additional* git repository cloned from the bare repository, which is used for making changes from the command-line rather than the web. It is this repository in which one would be editing files with vim, and *this* repository does not have a .ikiwiki directory in it. It does have a .git directory in the root, however, so I suppose you could use that as a method of detection of a root directory, but of course that would only work for git repositories.
+>
+> -- [[KathrynAndersen]]
+>
+>> You are completely right; all of my wikis are compiled both locally and
+>> remotely, and so the local repo also has a `.ikiwiki` folder. And that's not the
+>> "usual" setup.
+>>
+>> checking for a `.git` dir would not work when the wiki's source files aren't
+>> located at the root of the repo.
+>>
+>> So, besides of doing a `touch .ikiwiki` at the root of the wiki in your local
+>> repo, do you see any alternative?
+>>
+>> -- [[jerojasro]]
+
+well. I've rewritten the whole thing, to take into account:
+
+ * file matching ignoring case (MyPage matches mypage.mdwn)
+ * checking all the way down (up) to the root of the wiki (if there is a link `\[[foo]]` on `a/b/page`),
+ try `a/b/page/foo`, then `a/b/foo`, and so on, up to `foo`
+ * the alternate name for a page: when looking for the file for `\[[foo]]`, try both `foo.mdwn` and `foo/index.mdwn`
+
+you can find the file [here](http://git.devnull.li/cgi-bin/gitweb.cgi?p=vim-jerojasro.git;a=blob;f=.vim/ftplugin/ikiwiki_nav.vim;hb=HEAD). To use it, place it in `$HOME/.vim/ftplugin`. After that, hitting `<CR>` (Enter) in normal mode over a wikilink will take you to that page, if it exists.
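+
+In shell terms, installation amounts to something like this (a sketch; it assumes the downloaded file is named `ikiwiki_nav.vim` and sits in the current directory):
+
+    mkdir -p ~/.vim/ftplugin
+    cp ikiwiki_nav.vim ~/.vim/ftplugin/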
+
+the plugin has, as of now, two problems:
+
+ * doesn't work with wikilinks that take more than one line (though this isn't really that bad)
+ * it assumes that the root of the wiki is the first directory down the filesystem hierarchy that
+ has a `.ikiwiki` folder on it. If your copy of the wiki doesn't have it, you must create it for
+ the plugin to work
+
+-- [[jerojasro]]
+
+> Interesting. I was at one point looking at "potwiki.vim", which implements a local wiki and follows CamelCase links, creating new files where necessary etc., to see if it could be adapted for ikiwiki (See [[tips/vim syntax highlighting/discussion]]). I didn't get anywhere. -- [[Jon]]
+
+>> when I wrote the plugin I also considered the possibility of creating files (and their dirs, if necessary)
+>> from new wikilinks; the changes needed to get that working are fairly small -- [[jerojasro]]
+
+> Seems about ready for me to think about pulling it into ikiwiki
+> alongside [[tips/vim_syntax_highlighting/ikiwiki.vim]]. If you'll
+> please slap a license on it. :) --[[Joey]]
+>
+>> GPL version 2 or later (if that doesn't cause any problems here). I'll add it
+>> to the file --[[jerojasro]]
+>>
+>>> I see you've put the plugin on vim.org. Do you think it makes sense to
+>>> also include a copy in ikiwiki? --[[Joey]]
+>>>
+>>>> mmm, no. There would be two copies of it, and the git repo. I'd rather have
+>>>> a unique place for the "official" version (vim.org), and another for the dev
+>>>> version (its git repo).
+>>>>
+>>>> actually, I would also suggest to upload the [[`ikiwiki.vim`|tips/vim_syntax_highlighting]] file to vim.org --[[jerojasro]]
+>>>>>
+>>>>> If you have any interest in maintaining the syntax highlighting
+>>>>> plugin and putting it there, I'd be fine with that. I think it needs
+>>>>> some slight work to catch up with changes to ikiwiki's directives
+>>>>> (!-prefixed now), and wikilinks (able to have spaces now). --[[Joey]]
+
<a id='syn-maintenance'></a>
+
+>>>>>
+>>>>>> I don't really know too much about syntax definitions in vim. But I'll give it a stab. I know it fails when there are 2 \[[my text|link]] wikilinks in the same page.
+>>>>>> I'm not promising anything, though ;) --[[jerojasro]]
+>
+> Also, I have a possible other approach for finding ikiwiki's root. One
+> could consider that any subdirectory of an ikiwiki wiki is itself
+> a standalone wiki, though probably one missing a toplevel index page.
+> The relative wikilinks work such that this assumption makes sense;
+> you can build any subdirectory with ikiwiki and probably get something
+> reasonable with links that work, etc.
+>
+> So, if that's the case, then one could say that the directory that the
+> user considers to be the toplevel of their wiki is really also a subwiki,
+> enclosed in a succession of parents that go all the way down to the root
+> directory (or alternatively, to the user's home directory). I think that
+> logically makes some sense.
+>
+> And if that's the case, you can resolve an absolute link by looking for
+> the page closest to the root that matches the link.
+>
+>> I like your idea; it doesn't alter the matching of the relative links, and
+>> should work fine with absolute links too. I'll implement it, though I see
+>> some potential (but small) issues with it --[[jerojasro]]
+>
+> It may even make sense to change ikiwiki's own handling of "absolute"
+> links to work that way. But even without changing ikiwiki, I think it
+> would be a reasonable thing for vim to do. It would only fail in two
+> unusual circumstances:
+>
+> 1. There is a file further down, outside what the user considers
+> the wiki, that matches. Say a `$HOME/index.mdwn`
+> 2. An absolute link is broken in that the page linked to does
+> not exist in the root of the wiki. But it does exist in a subdir,
+> and vim would go to that file.
+>
+> --[[Joey]]
+>
+>> your approach will add more noise when the plugin grows the page-creation
+>> feature, since there will be no real root to limit the possible locations for
>> the new page. But it is far better than demanding a `.ikiwiki` dir --[[jerojasro]]
diff --git a/doc/forum/nginx:_404_plugin_not_working.mdwn b/doc/forum/nginx:_404_plugin_not_working.mdwn
new file mode 100644
index 000000000..dd23e3128
--- /dev/null
+++ b/doc/forum/nginx:_404_plugin_not_working.mdwn
@@ -0,0 +1,12 @@
+The [[plugins/404]] plugin is not working with nginx here:
+
+ Error: "do" parameter missing
+
+If I use the [[shell script for lighttpd|bugs/404_plugin_and_lighttpd]], ikiwiki answers
+
+ Error: missing page parameter
+
+Might this be a mistake on my part, or does anyone know a workaround to get the 404 plugin working with nginx? --[[bacuh]]
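+
+One avenue that might be worth trying (an untested sketch; the plugin, per the comments below, wants `REDIRECT_STATUS` and `REDIRECT_URL` in the environment, and nginx does not set them by itself; the fcgiwrap socket path is an assumption for your setup):
+
+    error_page 404 = /ikiwiki.cgi;
+    location = /ikiwiki.cgi {
+        include fastcgi_params;
+        fastcgi_pass unix:/var/run/fcgiwrap.socket;
+        fastcgi_param REDIRECT_STATUS 404;
+        fastcgi_param REDIRECT_URL $request_uri;
+    }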
+
+
+
diff --git a/doc/forum/nginx:_404_plugin_not_working/comment_1_02a82e468676ae64374cc91ec87e39d6._comment b/doc/forum/nginx:_404_plugin_not_working/comment_1_02a82e468676ae64374cc91ec87e39d6._comment
new file mode 100644
index 000000000..982011286
--- /dev/null
+++ b/doc/forum/nginx:_404_plugin_not_working/comment_1_02a82e468676ae64374cc91ec87e39d6._comment
@@ -0,0 +1,15 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2011-05-26T06:58:04Z"
+ content="""
+The 404 plugin relies on environment variables set by Apache for 404 handlers. Not all web servers set the same variables (404 handlers aren't a standard feature) so it might need adapting to support nginx. (lighttpd has a similar problem - it doesn't indicate that the request was a 404 in an obvious way.)
+
+If you temporarily set this (shell!) CGI script as your 404 handler, its output should indicate what variables nginx sets:
+
+ #!/bin/sh
+ printf \"Content-type: text/plain\r\n\r\n\"
+ env
+
+"""]]
diff --git a/doc/forum/nginx:_404_plugin_not_working/comment_2_ce6bd8e98e4be08316522182f5f85a11._comment b/doc/forum/nginx:_404_plugin_not_working/comment_2_ce6bd8e98e4be08316522182f5f85a11._comment
new file mode 100644
index 000000000..e35c393dd
--- /dev/null
+++ b/doc/forum/nginx:_404_plugin_not_working/comment_2_ce6bd8e98e4be08316522182f5f85a11._comment
@@ -0,0 +1,11 @@
+[[!comment format=mdwn
+ username="http://xlogon.net/bacuh"
+ ip="93.182.181.39"
+ subject="comment 2"
+ date="2011-05-26T10:48:12Z"
+ content="""
+I checked the script's output and some nginx documentation. The only variable I could not find is $REDIRECT_URL, and I could not discover any equivalent either. Defining the variable myself in nginx's config does not help.
+
+Any ideas?
+(Should I provide the full env output?)
+"""]]
diff --git a/doc/forum/nginx:_404_plugin_not_working/comment_3_52b05c3274455db7bee3c1765776fd52._comment b/doc/forum/nginx:_404_plugin_not_working/comment_3_52b05c3274455db7bee3c1765776fd52._comment
new file mode 100644
index 000000000..e404d3465
--- /dev/null
+++ b/doc/forum/nginx:_404_plugin_not_working/comment_3_52b05c3274455db7bee3c1765776fd52._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 3"
+ date="2011-06-03T16:48:04Z"
+ content="""
+The relevant environment variables used by the plugin are `REDIRECT_STATUS` (should be \"404\") and `REDIRECT_URL`. I doubt it can be made to work if nginx does not provide the URL or some other indication of the page that is 404ing.
+"""]]
diff --git a/doc/forum/nginx:_404_plugin_not_working/comment_4_5a8c2987f442106c68eb822c5bce3bf1._comment b/doc/forum/nginx:_404_plugin_not_working/comment_4_5a8c2987f442106c68eb822c5bce3bf1._comment
new file mode 100644
index 000000000..8ad6dadfc
--- /dev/null
+++ b/doc/forum/nginx:_404_plugin_not_working/comment_4_5a8c2987f442106c68eb822c5bce3bf1._comment
@@ -0,0 +1,23 @@
+[[!comment format=mdwn
+ username="http://tychoish.livejournal.com/"
+ ip="74.108.56.136"
+ subject="Alternative"
+ date="2011-06-04T12:57:14Z"
+ content="""
+I just have the following in my nginx config, which isn't as friendly, I think, as a 404 plugin, but it does the job:
+
+ location / {
+ index index.html index.htm;
+ if (!-d $request_filename) {
+ rewrite ^/(.*)/$ /ikiwiki.cgi?page=$1&do=create last;
+ rewrite ^(.*)/$ /$1.html last;
+ rewrite ^(.*)/$ /$1.htm last;
+ }
+ if (!-e $request_filename) {
+ rewrite ^/(.*)$ /ikiwiki.cgi?page=$1&do=create last;
+ rewrite ^(.*)$ $1.html last;
+ rewrite ^(.*)$ $1.htm last;
+ }
+ }
+
+"""]]
diff --git a/doc/forum/nginx:_404_plugin_not_working/comment_5_0720cd8842dc1cb338b74a0e6fdb2aac._comment b/doc/forum/nginx:_404_plugin_not_working/comment_5_0720cd8842dc1cb338b74a0e6fdb2aac._comment
new file mode 100644
index 000000000..60824bd8a
--- /dev/null
+++ b/doc/forum/nginx:_404_plugin_not_working/comment_5_0720cd8842dc1cb338b74a0e6fdb2aac._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawl981Fi5YVeEC_ncO9cJTfkPyyy2A_-tG8"
+ nickname="Mick"
+ subject="comment 5"
+ date="2011-06-25T12:00:24Z"
+ content="""
+Is [[this|missing_pages_redirected_to_search-SOLVED]] what you are after ? --[Mick](http://www.lunix.com.au)
+"""]]
diff --git a/doc/forum/pandoc-iki_plugin.mdwn b/doc/forum/pandoc-iki_plugin.mdwn
new file mode 100644
index 000000000..b9cc6b3d2
--- /dev/null
+++ b/doc/forum/pandoc-iki_plugin.mdwn
@@ -0,0 +1,5 @@
+I've updated [[Jason Blevin|users/jasonblevins]]'s pandoc plugin to permit tighter integration between Ikiwiki and [Pandoc](http://johnmacfarlane.net/pandoc/). Given the features Pandoc has added over the past 6-12 months, this makes for a very powerful combination, e.g. with code block syntax highlighting and lots of options for how to process and display inline TeX. See <https://github.com/dubiousjim/pandoc-iki> for details.
+
+How do I get this added to the contrib section of the plugin list? --Profjim
+
+
diff --git a/doc/forum/pandoc-iki_plugin/comment_1_11eef903493378fd704a6bd92e968508._comment b/doc/forum/pandoc-iki_plugin/comment_1_11eef903493378fd704a6bd92e968508._comment
new file mode 100644
index 000000000..f119035af
--- /dev/null
+++ b/doc/forum/pandoc-iki_plugin/comment_1_11eef903493378fd704a6bd92e968508._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-06-03T17:01:45Z"
+ content="""
+Be bold! This is a wiki, you can add it yourself. I've put a form at the bottom of [[plugins/contrib]] to make it easier.
+"""]]
diff --git a/doc/forum/pandoc-iki_plugin/comment_2_2c437577390cffe3401f5cc2f08a2ab1._comment b/doc/forum/pandoc-iki_plugin/comment_2_2c437577390cffe3401f5cc2f08a2ab1._comment
new file mode 100644
index 000000000..472bd38b5
--- /dev/null
+++ b/doc/forum/pandoc-iki_plugin/comment_2_2c437577390cffe3401f5cc2f08a2ab1._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://profjim.myopenid.com/"
+ nickname="profjim"
+ subject="comment 2"
+ date="2011-06-12T20:20:13Z"
+ content="""
+Done, thanks. --Profjim
+"""]]
diff --git a/doc/forum/paths_to_files_outside_the_wiki_root.mdwn b/doc/forum/paths_to_files_outside_the_wiki_root.mdwn
new file mode 100644
index 000000000..c1c6466cc
--- /dev/null
+++ b/doc/forum/paths_to_files_outside_the_wiki_root.mdwn
@@ -0,0 +1,34 @@
+Hello. My ikiwiki is at an unreliable and unprofessional host that still has many advantages: it's my university. One of the things that makes me tolerate this is that it is free and I have ssh access. Now I have two locations onto which I would like to put my website, both of the form http://users.university.top/user/wiki/, where "user" is different for each host. This is a problem for some of my wiki pages:
+
+I have some pages in the wiki (content), some files in a directory parallel to the wiki directory called publicfiles/, and some images in images/. Now I want to link to files in publicfiles/ and images/ from the wiki, and I can't use absolute paths, since the "user" part is different in each URL (and will be different for each host I choose). I could use relative paths; that has worked before, so I type ../../publicfiles/file.tar.gz. But I've run into a problem again: inline pages. For inline pages where you include pages from a different level, it doesn't work.
+
+I've tried a middle ground with pages relative to the wiki top, that is, the link index/../publicfiles/file.tar.gz. I think ikiwiki doesn't allow this; doing index/../ doesn't take you out of the wiki root.
+
+I found one solution, and that was to put a symlink called publicfiles into the wiki/ directory. This has to be put in place on the server, I think; I don't know how this will interact with ikiwiki or my remote sync.
+--ulrik Thu, 20 Dec 2007 22:07:05 +0100
+
+But not even the symlink solution works, since I have only two choices: specify paths relative to the ikiwiki page, or relative to the absolute webdomain root. I need to specify them from the wiki root :( For example, the link publicfiles/ will not, as I thought, always link to wikiroot/publicfiles, but rather to publicfiles at the level of the wiki you are at right now.
+
+--ulrik
+
+> If you put the publicfiles/ into the srcdir that ikiwiki builds the wiki
+> from, then it'll know about them and wikilinks to the files will work
+> same as wikilinks to any other files ikiwiki knows about. Perhaps
+> there's a reason you can't do that, such as the files being too large, or
+> not being available on the host you build on, I don't know. --[[Joey]]
+
+Yes, that would solve it. A part of me wished, though, that it were possible to put in a symlink called publicfiles that basically points to wikiroot/../publicfiles, and still have the ability to link to files with wiki syntax. But it doesn't work, since symlinks are dangerous, and... how would ikiwiki know how to interpret a relative link: relative to the source dir or the dest dir (etc.)? I'll have to put all my images and publicfiles into the wiki. It is wrong in principle, since publicfiles are tar.gz (and a few .deb) files for software, but it is practically OK, since the individual files are no more than 150K and I don't have anything against archiving them. --ulrik
+
+> You know, you don't need to check the files into revision control, they
+> can just be put in the srcdir of the wiki outside revision control.
+
+To try to formalize and clarify my first proposal: An administrator would
+be able for each wiki to create a list of off-wiki "places" that are
+accessed via certain items in the wiki root (or could be under a subpage
+too of course). The example is illustrated by publicfiles and the symlink,
+but a non-symlink solution would probably be better. A natural way to
+specify off-wiki places is absolute URLs, but paths relative to the wiki
+root should also work, since that would fit my case. Just as you can't go out of the
+wiki root, you should not be able to go up from such an external resource.
+This should all be done in some plugin of course. I'll have to learn Perl
+before I write the plugin though :) --ulrik
diff --git a/doc/forum/perl5lib_and_wrappers.mdwn b/doc/forum/perl5lib_and_wrappers.mdwn
new file mode 100644
index 000000000..83efc7cb5
--- /dev/null
+++ b/doc/forum/perl5lib_and_wrappers.mdwn
@@ -0,0 +1,13 @@
+I don't know if I'm doing this right... I'm using a server provider that doesn't allow me to install into the standard perl locations, so I used PREFIX to install things in my home dir. The problem is that when the wrapper is run by the CGI server, it can't find the perl modules I installed. There didn't seem to be a way to set PERL5LIB from the standard config, so I added one. Patch attached. Or did I miss something and this was already possible?
+
+> The standard way to do it is to set `INSTALL_BASE=$HOME` when running
+> `Makefile.PL`. If you do this, ikiwiki will be built with a special `use
+> lib $HOME` line inserted, that will make it look in the specified
+> directory for perl modules.
+>
+> The [[tips/nearlyfreespeech]] tip has an example of doing this.
+> --[[Joey]]
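+
+Concretely, that would look something like this when building ikiwiki from source (an untested sketch):
+
+    perl Makefile.PL INSTALL_BASE=$HOME
+    make
+    make install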
+
+>> Thanks! I found that page, but didn't recognise the importance of INSTALL_BASE.
+
+>> It looks like INSTALL_BASE only appeared in version 6.31 of Perl's MakeMaker. My provider is still running version 6.30. Looks like I'll be keeping my patches for the moment... sigh.
diff --git a/doc/forum/po_plugin_doesn__39__t_create_po_files___40__only_pot__41__..mdwn b/doc/forum/po_plugin_doesn__39__t_create_po_files___40__only_pot__41__..mdwn
new file mode 100644
index 000000000..95cb62d29
--- /dev/null
+++ b/doc/forum/po_plugin_doesn__39__t_create_po_files___40__only_pot__41__..mdwn
@@ -0,0 +1,11 @@
+On [[the po plugin's page|plugins/po]] it is clearly stated that "when the plugin has just been enabled, or when a page has just been declared as being translatable, the needed POT and PO files are created". Yet on all my attempts, only the pot file was created. Do I have to create the po files manually somehow?
+
+To be precise, these are the settings I put in my wiki's setup file to enable the po plugin:
+
+ add_plugins => [qw{... po ...}],
+ po_master_language => 'de|Deutsch',
+ po_slave_languages => 'en|English',
+ po_translatable_pages => "mytranslatedpage",
+ po_link_to => 'current',
+
+… followed by "ikiwiki --setup mysetupfile".
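+
+One thing that may be worth double-checking (an assumption on my part, not a confirmed fix): some ikiwiki versions expect `po_slave_languages` to be a list rather than a single string, e.g.:
+
+    po_slave_languages => [qw{en|English}],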
diff --git a/doc/forum/possible_utf-8_problem__63__.mdwn b/doc/forum/possible_utf-8_problem__63__.mdwn
new file mode 100644
index 000000000..fb87fadd1
--- /dev/null
+++ b/doc/forum/possible_utf-8_problem__63__.mdwn
@@ -0,0 +1,26 @@
+I have some problems with a blog wiki that I am trying to set up.
+
+Everything seemed to work correctly with utf-8 until
+I tried to have a page name with utf-8 characters that
+contained a blog. More testing showed that the 'from'-parameter
+in the form didn't like utf-8. Just that variable, everything
+else works fine.
+
+As soon as I try to add a new entry I get "bad page name"
+instead of the editpage.
+
+Here is an example:
+The page itself is named 'hönshuset.mdwn' and it contains
+this blog inline:
+
+\[\[!inline pages="honshuset/*" postform=yes ]]
+
+Looking at the form, it is the from-parameter that contains 'hönshuset'
+that triggers the problem. If I rename the file to honshuset.mdwn
+everything works fine.
+
+For some reason the from-parameter seems to depend on both the filename and
+the rootpage parameter. If I add *rootpage*, then I must not use
+utf-8 characters in *rootpage* or in the source filename.
+
+I use ikiwiki 3.20120629 in Debian sid.
diff --git a/doc/forum/postsignin_redirect_not_working.mdwn b/doc/forum/postsignin_redirect_not_working.mdwn
new file mode 100644
index 000000000..bc8855b7b
--- /dev/null
+++ b/doc/forum/postsignin_redirect_not_working.mdwn
@@ -0,0 +1,30 @@
+I'm confused. I got a plugin working that allows a button to call up a login screen but I can't seem to get it to return to the calling page. I end up on the prefs page.
+
+When the plugin first runs it puts the http_referer into a param:
+
+ $session->param("postsignin" => $ENV{HTTP_REFERER} );
+
+Then when it runs for postsignin it's supposed to pull it out and send the user to the original page:
+
+ my $page=$q->param("postsignin");
+ ...
+ IkiWiki::redirect($q, $page);
+ exit;
+
+Full code is available on the plugin page: [[plugins/contrib/justlogin]].
+
+I searched the site and there's very little info available for postsignin or redirect. Perhaps I'm using the wrong function?
+
+> I don't know why you end up on the prefs page. Have you tried
+> looking inside the session database to see what postsignin
+> parameter is stored?
+>
+> But, `cgi_postsignin()` assumes it can directly pass the postsignin cgi
+> parameter into `cgi()`. You're expecting it to redirect to an url, and it
+> just doesn't do that. Although I have considered adding a redirect
+> there, just so that openid login info doesn't appear in the url after
+> signin (which breaks eg, reload). That would likely still not make your
+> code work, since the value of postsignin is a url query string, not a
+> full url.
+>
+> I'd suggest you put a do=goto redirect into postsignin. --[[Joey]]
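+
+For what it's worth, a sketch of that suggestion (untested; it stores a do=goto query string as the postsignin value, so that `cgi_postsignin()` can pass it straight into `cgi()`; `$page` here is assumed to hold the name of the calling page, not its full URL):
+
+    # instead of storing the raw referer URL in the session...
+    $session->param("postsignin" => "do=goto&page=".$page);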
diff --git a/doc/forum/problem_with_git_after_a_commit_of_ikiwiki.mdwn b/doc/forum/problem_with_git_after_a_commit_of_ikiwiki.mdwn
new file mode 100644
index 000000000..3fb892933
--- /dev/null
+++ b/doc/forum/problem_with_git_after_a_commit_of_ikiwiki.mdwn
@@ -0,0 +1,4 @@
+After adding a comment on ikiwiki, I get this error when updating ikiwiki from gitolite with git push, or when running ikiwiki --setup alicewiki.setup:
+http://paste.debian.net/160953
+
+and I can't publish new posts or anything else
diff --git a/doc/forum/problem_with_git_after_a_commit_of_ikiwiki/comment_1_2b9986717769419a8ae0f730c36b7e65._comment b/doc/forum/problem_with_git_after_a_commit_of_ikiwiki/comment_1_2b9986717769419a8ae0f730c36b7e65._comment
new file mode 100644
index 000000000..ecdc20bc1
--- /dev/null
+++ b/doc/forum/problem_with_git_after_a_commit_of_ikiwiki/comment_1_2b9986717769419a8ae0f730c36b7e65._comment
@@ -0,0 +1,22 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2012-03-26T08:22:37Z"
+ content="""
+The problem is visible in what you pasted:
+
+    fatal: empty ident <git@r36457.ovh.net> not allowed
+
+and so is the solution:
+
+ *** Please tell me who you are.
+
+ Run
+
+ git config --global user.email \"you@example.com\"
+ git config --global user.name \"Your Name\"
+
+ to set your account's default identity.
+"""]]
diff --git a/doc/forum/recentchanges_dir_should_be_under_control_of_RCS__63__.mdwn b/doc/forum/recentchanges_dir_should_be_under_control_of_RCS__63__.mdwn
new file mode 100644
index 000000000..2fe97366b
--- /dev/null
+++ b/doc/forum/recentchanges_dir_should_be_under_control_of_RCS__63__.mdwn
@@ -0,0 +1,105 @@
+Hello Joey,
+
+I noticed that my Ikiwiki started to rebuild pages very slowly after my last changes
+when I upgraded Ikiwiki to version 3.20100623. Now I have the latest release 3.20100704,
+but it doesn't help me.
+
+I started to debug the problem and I found that I can see a lot of messages
+like below when I try to rebuild my wiki manually:
+
+ svn: '/path/to/ikiwiki/trunk/pages/ostatnie_zmiany' is not a working copy
+ svn: Can't open file '/path/to/ikiwiki/trunk/pages/ostatnie_zmiany/.svn/entries': No such file or directory
+ svn log exited 256
+
+"ostatnie_zmiany" is a value of `recentchangespage` parameter in my
+`ikiwiki.setup` file. It is not under the control of the Subversion repo I use for Ikiwiki:
+
+ $ svn status pages/ostatnie_zmiany
+ ? pages/ostatnie_zmiany
+
+ $ ls pages/ostatnie_zmiany/*._change |wc -l
+ 100
+
+My `recentchangesnum` parameter is set to 100, and I noticed that Ikiwiki
+takes a lot of time to parse all the `._change` files. In the end it doesn't
+refresh the /ostatnie_zmiany.html page.
+
+Do you think I should put the `ostatnie_zmiany` directory under the control of my
+Subversion repo? If that's not necessary, could you please give me a hint
+to help find the reason for the problem with my Ikiwiki?
+
+My best regards,
+Pawel
+
+> No, the recentchanges pages are automatically generated and should not
+> themselves be in revision control.
+>
+> Ikiwiki has recently started automatically enabling `--gettime`, but
+> it should do that only on the initial build
+> of a wiki. It will print "querying svn for file creation and modification
+> times.." when it does this. If it's doing it every time, something
+> is wrong. (Specifically, `.ikiwiki/indexdb` must be missing somehow.)
+>
+> The support for svn with --gettime is rather poor. (While with git it is
+> quite fast.) But as it's only supposed to happen on the first build,
+> I haven't tried to speed it up. It would be hard to do it fast with svn.
+> It would be possible to avoid the warning message above, or even skip
+> processing files in directories not checked into svn -- but I'd much
+> rather understand why you are seeing this happen on repeated builds.
+> --[[Joey]]
+
+>> Thanks a lot for your reply! I've just checked my `rebuild-pages.sh`
+>> script and discovered that it contains
+>> `/usr/bin/ikiwiki --setup ikiwiki.setup --gettime` command... :D
+>> The warnings disappeared when I removed `--gettime` parameter.
+>> Sorry for the confusion! :)
+>>
+>> I have `.ikiwiki/indexdb` file here, but I noticed that it has been
+>> modified about 1 minute **after** last Subversion commit:
+>>
+>> $ LANG=C svn up
+>> At revision 5951.
+>>
+>> $ LANG=C svn log -r 5951
+>> ------------------------------------------------------------------------
+>> r5951 | svn | 2010-07-06 09:02:30 +0200 (Tue, 06 Jul 2010) | 1 line
+>>
+>> web commit by xahil
+>> ------------------------------------------------------------------------
+>>
+>> $ LANG=C stat pages/.ikiwiki/indexdb
+>> File: `pages/.ikiwiki/indexdb'
+>> Size: 184520 Blocks: 368 IO Block: 131072 regular file
+>> Device: 2bh/43d Inode: 1931145 Links: 1
+>> Access: (0644/-rw-r--r--) Uid: ( 1005/ svn) Gid: ( 1005/ svn)
+>> Access: 2010-07-06 12:06:24.000000000 +0200
+>> Modify: 2010-07-06 09:03:38.000000000 +0200
+>> Change: 2010-07-06 09:03:38.000000000 +0200
+>>
+>> I believe it's the time I have to wait to see that my wiki page has been rebuilt.
+>> Do you have any idea how to find the reason for that delay? --[[Paweł|ptecza]]
+
+>>> Well, I hope that your svn post-commit hook is not running your
+>>> `rebuild-pages.sh`. That script rebuilds everything, rather than just
+>>> refreshing what's been changed.
+>>>
+>>> Using subversion is not a recipe for speed. Especially if your svn
+>>> repository is on a remote host. You might try disabling
+>>> recentchanges and see if that speeds up the refreshes (it will avoid
+>>> one `svn log`).
+>>>
+>>> Otherwise, take a look at [[tips/optimising_ikiwiki]]
+>>> for some advice on things that can make ikiwiki run slowly. --[[Joey]]
+
+>>>> Thanks for the hints! I don't understand it, but it seems that refreshing
+>>>> all pages has resolved the problem and now my wiki works well again :)
+>>>>
+>>>> No, I use the `rebuild-pages.sh` script only when I want to rebuild
+>>>> my wiki manually, for example when you release a new Ikiwiki version
+>>>> and I need to update my templates. I have translated some of them
+>>>> into Polish.
+>>>>
+>>>> Fortunately my wiki and its Subversion repo are located on the same host.
+>>>> We have a lot of Subversion repos for our projects and I don't want to
+>>>> change only wiki repo for better performance. I'm rather satisfied with
+>>>> its speed. --[[Paweł|ptecza]]
diff --git a/doc/forum/recovering_original_title_with_meta_directive.mdwn b/doc/forum/recovering_original_title_with_meta_directive.mdwn
new file mode 100644
index 000000000..ad0b02a9e
--- /dev/null
+++ b/doc/forum/recovering_original_title_with_meta_directive.mdwn
@@ -0,0 +1 @@
+When using the \[[!meta title=""]] directive, the documentation states that a template variable is set when the title is overridden. However, how does one recover the original page title? --[[Glenn|geychaner@mac.com]]
diff --git a/doc/forum/remove_css__63__.mdwn b/doc/forum/remove_css__63__.mdwn
new file mode 100644
index 000000000..80da621d6
--- /dev/null
+++ b/doc/forum/remove_css__63__.mdwn
@@ -0,0 +1,5 @@
+I removed a local.css file and pushed the changes to git but the 'compiled' wiki still shows the same css.
+Is this a bug, or are you supposed to remove the css by hand?
+ikiwiki version 3.20100705
+
+> It's a [[bug|bugs/underlaydir_file_expose]]. --[[Joey]]
diff --git a/doc/forum/report_pagination.mdwn b/doc/forum/report_pagination.mdwn
new file mode 100644
index 000000000..03a77b16d
--- /dev/null
+++ b/doc/forum/report_pagination.mdwn
@@ -0,0 +1,18 @@
+I am thinking of adding pagination to the [[plugins/contrib/report]] plugin, but I'm not sure which is the best approach to take. (By "pagination" I mean breaking up a report into multiple pages with N entries per page.)
+
+Approaches:
+
+1. generate additional HTML files on the fly which are placed in the sub-directory for the page the report is on. These are not "pages": they are not under revision control, they aren't in the %pagesources hash, etc. But using the `will_render` mechanism ensures that they will be removed when they are no longer needed.
+
+2. create new pages which each have a report directive which shows a subset of the full result; add them to revision control, treat them as full pages. Problems with this are: (a) trying to figure out when to create these new pages and when not to, (b) whether or not these pages can be deleted automatically.
+
+3. some other approach I haven't thought of.
+
+I'm afraid that whatever approach I take, it will end up being a kludge.
+
+> Well, it should be perfectly fine, and non-kludgy for a single page to
+> generate multiple html files. Just make sure that html files have names
+> that won't conflict with the html files generated by some other page.
+>
+> Of course the caveat is that since such files are not pages, you won't
+> be able to use wikilinks to link directly to them, etc. --[[Joey]]
diff --git a/doc/forum/screenplay_plugin.mdwn b/doc/forum/screenplay_plugin.mdwn
new file mode 100644
index 000000000..5891532f0
--- /dev/null
+++ b/doc/forum/screenplay_plugin.mdwn
@@ -0,0 +1 @@
+I have a wordstar-style (dot command) screenplay plugin working. How do I know if its good enough or interesting enough to submit to ikiwiki?
diff --git a/doc/forum/screenplay_plugin/comment_1_6c353acfc80b972ee3a34c8bb09dede3._comment b/doc/forum/screenplay_plugin/comment_1_6c353acfc80b972ee3a34c8bb09dede3._comment
new file mode 100644
index 000000000..fa0762e59
--- /dev/null
+++ b/doc/forum/screenplay_plugin/comment_1_6c353acfc80b972ee3a34c8bb09dede3._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-02-25T18:58:38Z"
+ content="""
+You can list it on [[plugins/contrib]] right away, and see if people seem interested in it.
+"""]]
diff --git a/doc/forum/screenplay_plugin/comment_2_1868aeebebefae80531f2031ffba35d3._comment b/doc/forum/screenplay_plugin/comment_2_1868aeebebefae80531f2031ffba35d3._comment
new file mode 100644
index 000000000..7168b890f
--- /dev/null
+++ b/doc/forum/screenplay_plugin/comment_2_1868aeebebefae80531f2031ffba35d3._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="justint"
+ ip="24.182.207.250"
+ subject="thank you"
+ date="2011-02-27T05:21:54Z"
+ content="""
+Thank you Joey, I put it up there. Hopefully someone will enjoy it.
+"""]]
diff --git a/doc/forum/section_editing.mdwn b/doc/forum/section_editing.mdwn
new file mode 100644
index 000000000..a20fa1def
--- /dev/null
+++ b/doc/forum/section_editing.mdwn
@@ -0,0 +1 @@
+Is [[Section editing|http://www.wikimatrix.org/wiki/feature:Section%20Editing]] possible in ikiwiki? Is there a plugin?
diff --git a/doc/forum/section_editing/comment_1_b193caa886a47c685ac7dafaf60c1761._comment b/doc/forum/section_editing/comment_1_b193caa886a47c685ac7dafaf60c1761._comment
new file mode 100644
index 000000000..c40cf8aa7
--- /dev/null
+++ b/doc/forum/section_editing/comment_1_b193caa886a47c685ac7dafaf60c1761._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2011-08-05T21:00:56Z"
+ content="""
+Not exactly the same, but a large document is sometimes stitched together by inlining smaller sections. The sections can then be clicked on to go to their page and edited. For example, see <http://git-annex.branchable.com/walkthrough/>. A side benefit of this is that users can post comments on each section of the page, which are visible when viewing the page for that section, but not at the top level.
+
+I suspect section editing is necessary on eg, Wikipedia because it reduces the chance of conflicts when multiple people are editing different parts of the page. Ikiwiki instead avoids conflicts by harnessing the merging power of a version control system such as git, so multiple edits to different sections of a page can be made and each saved without manual conflict resolution.
+
+I'm happy that modern web browsers allow searching inside text edit boxes, so when I want to jump to a given part of a large page I'm editing, I just search for the text.
+"""]]
diff --git a/doc/forum/speeding_up_ikiwiki.mdwn b/doc/forum/speeding_up_ikiwiki.mdwn
new file mode 100644
index 000000000..799186cf8
--- /dev/null
+++ b/doc/forum/speeding_up_ikiwiki.mdwn
@@ -0,0 +1,90 @@
+My website takes a fairly long time to render. It takes a long time to do
+things like add pages, too. I'd like to try and understand what takes the
+time and what I might be able to do to speed things up.
+
+I have 1,234 objects on my site (yikes!). 717 are items under "/log" which
+I think might be the main culprit because there are some complex pagespecs
+operating in that area (e.g. log/YYYY/MM/DD, YYYY/MM and YYYY for YYYY >=
+2003, YYYY <= 2008 which include every page under log/ which was modified
+in the corresponding YYYY or YYYY/MM or YYYY/MM/DD). There is very little
+linking between the world outside of /log and that within it.
+
+I was interested in generating a graphical representation of ikiwiki's idea of
+page inter-dependencies. I started by looking at the '%links' hash using the
+following plugin:
+
+    #!/usr/bin/perl
+    package IkiWiki::Plugin::deps;
+
+    use warnings;
+    use strict;
+    use IkiWiki 3.00;
+
+    sub import {
+        hook(type => "format", id => "deps", call => \&fooble);
+    }
+
+    my $hasrun = 0;
+
+    sub fooble (@) {
+        my %params = @_;
+        if (! $hasrun) {
+            $hasrun = 1;
+            open(my $fh, ">", "/home/jon/deps.dot");
+            foreach my $key (keys %links) {
+                foreach my $page (@{$links{$key}}) {
+                    print $fh "\"$key\" -> \"$page\";\n";
+                }
+            }
+            close $fh;
+        }
+        # a format hook must return the page content
+        return $params{content};
+    }
+
+    1
+
+The resulting file was enormous: 2,734 edges! This turns out to be because of the following code in scan() in Render.pm:
+
+    if ($config{discussion}) {
+ # Discussion links are a special case since they're
+ # not in the text of the page, but on its template.
+ $links{$page}=[ $page."/".gettext("discussion") ];
+
+Worst case (no existing discussion pages) this will double the number of link
+relationships. Filtering out all of those, the output drops to 1,657. This
+number is still too large to really visualize: the graphviz PNG and PDF output
+engines segfault for me; the PS one works, but I can't get any PS software to
+render it without exploding.
+
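The discussion-edge filtering mentioned above can be sketched in a few lines. This is only an illustration (the function name is made up, and it assumes the quoted `"a" -> "b";` edge format the plugin above writes out), dropping every edge whose target is an autogenerated `*/discussion` page:

```python
def filter_discussion_edges(dot_lines):
    """Drop dot edges pointing at autogenerated */discussion pages."""
    kept = []
    for line in dot_lines:
        if "->" in line:
            # extract the edge target, tolerating quoting and the trailing ';'
            target = line.split("->", 1)[1].strip().rstrip(";").strip('"')
            if target.endswith("/discussion"):
                continue
        kept.append(line)
    return kept

edges = [
    '"index" -> "log/2008";',
    '"index" -> "index/discussion";',
    '"log/2008" -> "log/2008/discussion";',
]
print(filter_discussion_edges(edges))
```
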
+Now, the relations in the links hash are not the same thing as Ikiwiki's notion of dependencies. Can anyone point me at that data structure / where I might be able to add some debugging foo to generate a graph of it?
+
+Once I've figured that out, I might be able to optimize some pagespecs. I
+understand pagespecs are essentially translated into sequential perl code. I
+might gain some speed if I structure my complex pagespecs so that the tests
+which have the best time complexity vs. "pages ruled out" ratio are performed
+first.
+
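The ordering idea can be illustrated with plain short-circuit evaluation. This is a generic sketch, not IkiWiki's actual pagespec compiler, and the predicates are invented for the example:

```python
# count how often each predicate runs
calls = {"cheap": 0, "expensive": 0}

def cheap(page):
    # fast, highly selective test (think: a glob match on the page name)
    calls["cheap"] += 1
    return page.startswith("log/")

def expensive(page):
    # stand-in for a costly test (think: a creation-date or backlink check)
    calls["expensive"] += 1
    return "2008" in page

pages = ["log/a", "other/b", "log/2008/c", "misc/d"]

# with the selective cheap test first, `and` short-circuits, so the
# expensive test only runs on pages that survive the first one
matched = [p for p in pages if cheap(p) and expensive(p)]
print(matched, calls)
```

Swapping the order would run the costly test on all four pages instead of two; the same reasoning applies to ordering the terms of a pagespec.
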
+I might also be able to find some dependencies which shouldn't be there and
+remove the dependency.
+
+In general any advice people could offer on profiling ikiwiki would be great.
+I did wonder about invoking the magic profiling arguments to perl via the CGI
+wrapper.
+
+
+-- [[Jon]]
+
+> Dependencies go in the `%IkiWiki::depends` hash, which is not exported. It
+> can also be dumped out as part of the wiki state - see [[tips/inside_dot_ikiwiki]].
+>
+> Nowadays, it's a hash of pagespecs, and there
+> is also an `%IkiWiki::depends_simple` hash of simple page names.
+>
+> I've been hoping to speed up IkiWiki too - making a lot of photo albums
+> with my [[plugins/contrib/album]] plugin makes it pretty slow.
+>
+> One thing that I found was a big improvement was to use `quick=yes` on all
+> my `archive=yes` [[ikiwiki/directive/inline]]s. --[[smcv]]
+
+> Take a look at [[tips/optimising_ikiwiki]] for lots of helpful advice.
+> --[[Joey]]
diff --git a/doc/forum/square_brackets_inside_backticks_generates_incorrect_html___40__interpreted_as_wikilinks__41____63__.html b/doc/forum/square_brackets_inside_backticks_generates_incorrect_html___40__interpreted_as_wikilinks__41____63__.html
new file mode 100644
index 000000000..fe20a05b1
--- /dev/null
+++ b/doc/forum/square_brackets_inside_backticks_generates_incorrect_html___40__interpreted_as_wikilinks__41____63__.html
@@ -0,0 +1,19 @@
+the following markdown code
+`foo \[["bar"]]` generates the following output:
+
+foo <span class="createlink"><a href="http://euler/~dabd/wiki/ikiwiki.cgi?page=__34__bar__34__&amp;from=foo&amp;do=create" rel="nofollow">?</a>&#34;bar&#34;</span>
+
+Perhaps this is a bug in the markdown processor?
+
+<blockquote>
+This has nothing to do with markdown; wikilinks and directives
+are not part of markdown, and just get expanded to html before
+markdown processing.
+
+There is a bug open about this:
+[[bugs/wiki_links_still_processed_inside_code_blocks]]
+
+Note that escaping the wikilink with a backslash will avoid it being expanded
+to html.
+--[[Joey]]
+</blockquote>
diff --git a/doc/forum/suppressing_output_of_pages_included_only_for_their_side_effects.mdwn b/doc/forum/suppressing_output_of_pages_included_only_for_their_side_effects.mdwn
new file mode 100644
index 000000000..99784cdd2
--- /dev/null
+++ b/doc/forum/suppressing_output_of_pages_included_only_for_their_side_effects.mdwn
@@ -0,0 +1,16 @@
+In particular, it's kind of annoying that using the sidebar plugin results in the creation of a free-standing sidebar.html (which in the simplest case of course includes a copy of *its own content* as a sidebar). It would be nice if there were a way to tell Ikiwiki to treat a file like sidebar.mdwn as "inline only": allow its content to be inlined but not to render it separately nor allow linking to it.
+
+In reading through the code and associated docs, it appears that the ideal method is for the file to be removed from the $changed array by plugin's "needsbuild" hook. Either the sidebar plugin could define such a hook, or perhaps a more general solution is the creation of a meta variable or config file regexp that would handle it according to the user's wishes.
+
+I'm about ready to code up such a change but want to find out if I'm thinking along the right lines. --[[blipvert]]
+
+> Internal pages should be able to be used for this, as they are used for
+> comments. So you'd have
+> `sidebar._mdwn`. However, mdwn would need to be changed to register an
+> htmlize hook for the `_mdwn` extension for that to really work.
+>
+> But, if there's no rendered sidebar page, how can users easily edit the page
+> in the web interface? In the specific case of the sidebar, it seems
+> better to have the page display something different when built standalone
+> than when built as the sidebar.
+> --[[Joey]]
diff --git a/doc/forum/tag_plugin:_rebuilding_autocreated_pages.mdwn b/doc/forum/tag_plugin:_rebuilding_autocreated_pages.mdwn
new file mode 100644
index 000000000..114837031
--- /dev/null
+++ b/doc/forum/tag_plugin:_rebuilding_autocreated_pages.mdwn
@@ -0,0 +1,11 @@
+I have a bunch of tag pages autogenerated by the tag plugin. As part of a redesign for my wiki, I have changed the `autotag.tmpl` template, but IkiWiki refuses to rebuild existing tag pages using the updated template. I understand that existing tag pages are not rebuilt because they have been marked as "created" in `.ikiwiki/indexdb`. Is there a way to purge all tag pages from `indexdb`? --dkobozev
+
+> Well, you can delete the indexdb and rebuild, and that will do it.
+> The tag plugin is careful not to replace existing pages or even recreate
+> tag pages if you delete them, which does cause a problem if you need to
+> update them. --[[Joey]]
+
+>> Thanks. I thought about deleting `indexdb`, but hesitated to do that. According to [[tips/inside dot ikiwiki]], `indexdb` stores "all persistent state about pages". I didn't know if it's harmless to lose all persistent state. --dkobozev
+
+>>> The persistent state is best thought of as a cache,
+>>> so it's safe to delete it. --[[Joey]]
diff --git a/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds.mdwn b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds.mdwn
new file mode 100644
index 000000000..f52486341
--- /dev/null
+++ b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds.mdwn
@@ -0,0 +1,28 @@
+Ok, I'm confused. See http://lovesgoodfood.com/jason/tags/napowrimo/ and
+http://lovesgoodfood.com/jason/tags/NaPoWriMo/ for two examples of not
+picking up pages quite right. I didn't realize that tags are randomly
+case-sensitive while still capitalized in the output title? See the list
+of backlinks on each. Also, the only pages actually being ''listed'' are
+from a year ago, but the backlinks include current pages. The posts
+''are'' being included on http://lovesgoodfood.com/jason/tags/poetry/ .
+The feeds are populated on my host, but not on my laptop (Debian
+unstable-ish, as opposed to a git pull on my host).
+
+Halp? I've blown away the old (including .ikiwiki) and rebuilt to no
+effect. The tag pages are meant to be transient (loaded by default,
+according to the docs?), but they're still being created. Nothing seems
+quite correct.
+
+-- JasonRiedy
+
+> What's going on with the case sensitivity is that ikiwiki is
+> case-insensitive, but in the edge case where there are somehow two pages
+> that vary only in case, it makes at least a token (partial, probably
+> incomplete and buggy) gesture at having the case of links to them
+> influence which one is linked to.
+>
+> Possibly this is interacting badly with tag page autocreation when
+> different cases are used for a tag?
+>
+> I don't know why new posts are not showing up in the tags. Can I download
+> the source from somewhere? --[[Joey]]
diff --git a/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_1_ec4ffab10e60510b53660b70908d1bd8._comment b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_1_ec4ffab10e60510b53660b70908d1bd8._comment
new file mode 100644
index 000000000..77db9c615
--- /dev/null
+++ b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_1_ec4ffab10e60510b53660b70908d1bd8._comment
@@ -0,0 +1,14 @@
+[[!comment format=mdwn
+ username="http://lovesgoodfood.com/jason/"
+ nickname="Jason Riedy"
+ subject="oh no..."
+ date="2011-04-20T17:56:08Z"
+ content="""
+I just realized I blew away my outward-facing git repos and setup when I blew away the site. augh. It'll take more time than I have to fix that right now.
+
+Current git repo is over dumb http at http://lovesgoodfood.com/jason.git until I can fix the rest. And two necessary extra plugins are at http://lovesgoodfood.com/htmlpage.pm and http://lovesgoodfood.com/imgcss.pm . Haven't cleaned / documented them enough to contribute. They shouldn't interfere with the tag plugin. I'll be up your way this weekend, except on a super-slow satellite link so won't be able to play much on-line. Might be able to debug locally.
+
+If you get a chance to poke, I'd be grateful, but there's plenty else to do. ;) Morels should be up...
+
+-- JasonRiedy
+"""]]
diff --git a/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_2_a47884ffd749df980cd62f4c1e3167ce._comment b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_2_a47884ffd749df980cd62f4c1e3167ce._comment
new file mode 100644
index 000000000..683c6a59e
--- /dev/null
+++ b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_2_a47884ffd749df980cd62f4c1e3167ce._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://lovesgoodfood.com/jason/"
+ nickname="Jason Riedy"
+ subject="Less painful to clone."
+ date="2011-04-21T02:34:15Z"
+ content="""
+http://lovesgoodfood.com/jason/git/JasonsChatter.git
+
+(and maybe this time I'll remember to save my setup)
+"""]]
diff --git a/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_3_6c4affdbc637946506d0c28a8648dc6e._comment b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_3_6c4affdbc637946506d0c28a8648dc6e._comment
new file mode 100644
index 000000000..3f02528b3
--- /dev/null
+++ b/doc/forum/tags_acting_strangely:_not_picking_up_all_pages__44___not_populating_feeds/comment_3_6c4affdbc637946506d0c28a8648dc6e._comment
@@ -0,0 +1,32 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 3"
+ date="2011-04-30T20:33:59Z"
+ content="""
+So, I don't see the issue of only one of the two capitalizations of a tag being updated when a page is added.
+
+<pre>
+joey@wren:~/tmp>ikiwiki -plugin html -plugin inline -tagbase tags -plugin goodstuff --refresh -v JasonsChatter JasonsChatter.html
+refreshing wiki..
+scanning posts/foo.html
+building posts/foo.html
+building tags.mdwn, which depends on posts/foo
+building sidebar.mdwn, which depends on posts/foo
+building posts.mdwn, which depends on posts/foo
+building index.mdwn, which depends on sidebar
+building archives/2010/04.mdwn, which depends on posts/foo
+building archives.mdwn, which depends on archives/2010/04
+building tags/NaPoWriMo.mdwn, which depends on posts/foo
+building tags/poetry.mdwn, which depends on posts/foo
+building tags/rwp.mdwn, which depends on posts/foo
+building tags/napowrimo.mdwn, which depends on posts/foo
+done
+</pre>
+
+Both caps of the tag were updated there. I do see some evidence of your site being updated by ikiwiki running with possibly different configuration. Compare date formats used on <http://lovesgoodfood.com/jason/tags/NaPoWriMo/> and <http://lovesgoodfood.com/jason/tags/napowrimo/>. Now, that could just be a different LANG setting, but if the configuration difference goes deeper, it could point toward an explanation of the inconsistency of only one case of a tag being updated to list a page... possibly.
+
+I can reproduce tag autocreation creating multiple tag pages that differ only in case. That's a bug, fixed in bad5072c02d506b5b5a0d74cd60639f7f62cc7d3.
+
+AFAICS, you don't have `tag_autocreate_commit` set to false, so transient tags are not being used.
+"""]]
diff --git a/doc/forum/teximg_not_working.mdwn b/doc/forum/teximg_not_working.mdwn
new file mode 100644
index 000000000..0c07c3b2b
--- /dev/null
+++ b/doc/forum/teximg_not_working.mdwn
@@ -0,0 +1,26 @@
+I followed the installation and configuration instructions at [[plugins/teximg/]]
+
+But I get
+
+ [[!teximg Error: missing tex code]]
+
+for code
+
+ [[!teximg $$\sin (x)$$]]
+
+On the server I have `texlive`, `dvips`, and `convert` installed already. My configuration looks like
+
+ # teximg_dvipng => 1, # use dvipng
+ teximg_dvipng => 0, # use dvips and convert
+ # LaTeX prefix for teximg plugin
+ teximg_prefix => '\\documentclass{article}
+ \\usepackage{amsmath}
+ \\usepackage{amsfonts}
+ \\usepackage{amssymb}
+ \\pagestyle{empty}
+ \\begin{document}
+ ',
+ # LaTeX postfix for teximg plugin
+ teximg_postfix => '\\end{document}',
+
+Any ideas why it's not working?
diff --git a/doc/forum/teximg_not_working/comment_2_35e2ebf3893fc0c7966490e1fef1e6cf._comment b/doc/forum/teximg_not_working/comment_2_35e2ebf3893fc0c7966490e1fef1e6cf._comment
new file mode 100644
index 000000000..5abab94cc
--- /dev/null
+++ b/doc/forum/teximg_not_working/comment_2_35e2ebf3893fc0c7966490e1fef1e6cf._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://jmtd.livejournal.com/"
+ ip="188.222.50.68"
+ subject="comment 2"
+ date="2011-05-23T18:51:41Z"
+ content="""
+You haven't followed the examples properly. The code fragment needs to be inside `code=\"\"`, i.e. `[[!teximg code=\"\frac{1}{2}\"]]`
+
+(taken directly from [[ikiwiki/directive/teximg]])
+"""]]
diff --git a/doc/forum/transition_from_handwritten_html_to_ikiwiki.mdwn b/doc/forum/transition_from_handwritten_html_to_ikiwiki.mdwn
new file mode 100644
index 000000000..a8d04a0ad
--- /dev/null
+++ b/doc/forum/transition_from_handwritten_html_to_ikiwiki.mdwn
@@ -0,0 +1,90 @@
+I'm trying to convert a hand-written html site to ikiwiki and maintain url compatibility. The html plugin with indexpages=1 converts all dir_name/index.html pages correctly to dir_name urls with wiki/css based content, but somedir/somefile.html files are only accessible as somedir/somefile/. Non-.html files seem to be accessible with their full paths; for example, somedir/pic.jpg from the hand-written html can be accessed by the same path under ikiwiki.
+
+How do I make somedir/somefile.html accessible as somedir/somefile.html under ikiwiki?
+
+Thanks,
+
+-Mikko
+
+> Hello! The options you need to investigate are `--usedirs` and
+> `--no-usedirs`. The default `--usedirs` takes any source page foo
+> (regardless of its format, be it markdown or html) and converts it into a
+> destination page foo/index.html (URL foo/). By comparison, `--no-usedirs`
+> maps the source file onto a destination file directly: src/foo.html becomes
+> dest/foo.html, src/bar.mdwn becomes dest/bar.html, etc.
+>
+> It sounds like you want `--no-usedirs`, or the corresponding `usedirs => 0,`
+> option in your setup file. See [[usage]] for more information. -- [[Jon]]
+
+Thanks, usedirs seems to be just the thing I need.
+
+-Mikko
+
+Actually usedirs didn't do exactly what I want. The old site contains both
+somedir/index.html and somedir/somename.html files. With html plugin and
+indexpages=1 the somedir/index.html pages are accessed correctly but the
+somedir/somefile.html files are not.
+
+With usedirs => 0, somedir/somename.html pages are accessed correctly but
+somedir/index.html pages are not. Actually the handwritten somedir/index.html
+files were removed on a rebuild:
+
+ $ ikiwiki -setup blog.setup -rebuild -v
+ ...
+ removing test2/index.html, no longer built by test2
+
+Is there a way for both index.html and somename.html raw html files to show up through ikiwiki?
+
+-Mikko
+
+> I think you want usedirs => 0 and indexpages => 0?
+>
+> What IkiWiki does is to map the source filename to an abstract page name
+> (indexpages alters how this is done), then map the abstract page name
+> to an output filename (usedirs alters how this is done).
+>
+> The three columns here are input, abstract page name, output:
+>
+> usedirs => 0, indexpages => 0:
+> a/index.html -> a/index -> a/index.html
+> a/b.html -> a/b -> a/b.html
+> usedirs => 1, indexpages => 0:
+> a/index.html -> a/index -> a/index/index.html
+> a/b.html -> a/b -> a/b/index.html
+> usedirs => 0, indexpages => 1:
+> a/index.html -> a -> a.html
+> a/b.html -> a/b -> a/b.html
+> usedirs => 1, indexpages => 1:
+> a/index.html -> a -> a/index.html
+> a/b.html -> a/b -> a/b/index.html
+>
+> The abstract page name is what you use in wikilinks and pagespecs.
+>
+> What I would suggest you do instead, though, is break your URLs once
+> (but put in Apache redirections), to get everything to be consistent;
+> I strongly recommend usedirs => 1 and indexpages => 0, then always
+> advertising URLs that look like <http://www.example.com/a/b/>. This is
+> what ikiwiki.info itself does, for instance. --[[smcv]]
+
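The mapping table above can be captured as a tiny model to experiment with. This is only a sketch of the behaviour described, not IkiWiki's real code, and it only handles `.html` sources:

```python
def page_name(src, indexpages):
    # source file -> abstract page name (the name used in wikilinks)
    page = src[: -len(".html")]
    if indexpages and page.endswith("/index"):
        page = page[: -len("/index")]
    return page

def output_file(src, usedirs, indexpages):
    # abstract page name -> output file
    page = page_name(src, indexpages)
    return page + "/index.html" if usedirs else page + ".html"
```

For instance, `output_file("a/index.html", usedirs=1, indexpages=0)` gives `"a/index/index.html"`, matching the second row of the table.
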
+Thanks for the explanation. usedirs => 0 and indexpages => 0 does the trick,
+but I'll try to setup mod_rewrite from foo/bar.html to foo/bar in the final
+conversion.
+
+-Mikko
+
+> That's roughly what I do, but you can do it with `Redirect` and `RedirectMatch` from `mod_alias`, rather than fire up rewrite. Mind you I don't write a generic rule, I have a finite set of pages to redirect which I know. -- [[Jon]]
+
+I'm getting closer. Now with usedirs => 1 and raw html pages, ikiwiki transforms foo/index.html into foo/index/index.html.
+Can ikiwiki be instructed to map foo/index.html to the page foo instead of foo/index?
+
+-Mikko
+
+> If you don't already have a foo.html in your source, why not just rename foo/index.html to foo.html? With usedirs, it will then map to foo/index.html. Before, you had 'foo/' and 'foo/index.html' as working URLs, and they will work after too.
+>
+> If you did have a foo.html and a foo/index.html, hmm, that's a tricky one. -- [[Jon]]
+
+> We may be going round in circles - that's what indexpages => 1 does :-)
+> See the table I constructed above, which explains the mapping from input
+> files to abstract page names, and then the mapping from abstract page
+> names to output files. (I personally think that moving your source pages
+> around as Jon suggested is a better solution, though.) --[[smcv]]
diff --git a/doc/forum/two_new_contrib_plugins:_newpage__44___jssearchfield.mdwn b/doc/forum/two_new_contrib_plugins:_newpage__44___jssearchfield.mdwn
new file mode 100644
index 000000000..8293b098f
--- /dev/null
+++ b/doc/forum/two_new_contrib_plugins:_newpage__44___jssearchfield.mdwn
@@ -0,0 +1,20 @@
+Just thought people might like to know I've added a couple more plugins to contrib.
+
+[[plugins/contrib/newpage]]: This plugin adds a new action to the "ACTIONS" section of a page; a button labelled "create" and an input field next to it.
+
+The common way of creating a new page is to edit a different page and add a link to the new page. However, there are some situations where that is a nuisance; for example, where pages are listed using a map directive. The newpage plugin enables one to simply type the name of the new page, click the "create" button, and one is then taken to the standard IkiWiki create-page form.
+
+[[plugins/contrib/jssearchfield]]: This plugin provides the [[plugins/contrib/ikiwiki/directive/jssearchfield]] directive. This
+enables one to search the structured data ("field" values) of multiple pages.
+This uses Javascript for the searching, which means that the entire thing
+is self-contained and does not require a server or CGI access, unlike
+the default IkiWiki search. This means that it can be used in places such
+as ebook readers. The disadvantage is that because Javascript runs
+in the browser, the searching is only as fast as the machine your browser
+is running on.
+
+Because this uses Javascript, the htmlscrubber must be turned off for any page where the directive is used.
+
+This plugin depends on the [[!iki plugins/contrib/field]] plugin.
+
+--[[KathrynAndersen]]
diff --git a/doc/forum/understanding_filter_hooks.mdwn b/doc/forum/understanding_filter_hooks.mdwn
new file mode 100644
index 000000000..e6ddc91cc
--- /dev/null
+++ b/doc/forum/understanding_filter_hooks.mdwn
@@ -0,0 +1,17 @@
+Hi All;
+
+I'm trying to use a filter hook as part of making [[wikilinks|ikiwiki/wikilink]] work in [[plugins/contrib/tex4ht]].
+It seems that filter is called for every page. For my application I just want it to be called for ".tex" files,
+but right now I have to have a look at the content, which I don't like so much.
+
+Is there a better hook to use for this? I need to transform the input before preprocessing.
+
+[[DavidBremner]]
+
+>You can check the type of the page without having to look at the content of the page:
+
+ my $page_file=$pagesources{$page};
+ my $page_type=pagetype($page_file);
+
+>Then you can check whether `$page_type` is "tex".
+>--[[KathrynAndersen]]
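+
+>A minimal filter hook using that check might look like this (a sketch only --
+>the `filter` hook, `pagetype` and `%pagesources` are the real plugin API, but
+>the plugin name and the tex transformation are placeholders):
+
+    package IkiWiki::Plugin::tex4ht;
+
+    use warnings;
+    use strict;
+    use IkiWiki 3.00;
+
+    sub import {
+        hook(type => "filter", id => "tex4ht", call => \&filter);
+    }
+
+    sub filter (@) {
+        my %params = @_;
+        my $page_file = $pagesources{$params{page}};
+        my $page_type = pagetype($page_file);
+        # leave every page that is not a .tex source untouched
+        return $params{content}
+            unless defined $page_type && $page_type eq "tex";
+        # ... transform wikilinks in the tex content here ...
+        return $params{content};
+    }
+
+    1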
diff --git a/doc/forum/upgrade_steps.mdwn b/doc/forum/upgrade_steps.mdwn
new file mode 100644
index 000000000..1c85e6402
--- /dev/null
+++ b/doc/forum/upgrade_steps.mdwn
@@ -0,0 +1,147 @@
+[[!meta date="2007-08-27 21:52:18 +0000"]]
+
+I upgraded from 1.40 to 2.6.1. I ran "ikiwiki --setup" using my existing ikiwiki.setup configuration.
+I had many errors like:
+
+ /home/bsdwiki/www/wiki/wikilink/index.html independently created, not overwriting with version from wikilink
+ BEGIN failed--compilation aborted at (eval 5) line 129.
+
+and:
+
+ failed renaming /home/bsdwiki/www/wiki/smileys.ikiwiki-new to /home/bsdwiki/www/wiki/smileys: Is a directory
+ BEGIN failed--compilation aborted at (eval 5) line 129.
+
+Probably about six errors like this. I worked around this by removing the files and directories it complained about.
+Finally it finished.
+
+> As of version 2.0, ikiwiki enables usedirs by default. See
+> [[tips/switching_to_usedirs]] for details. --[[Joey]]
+
+>> I read the config wrong. I was thinking that it showed the defaults even though commented out
+>> (like ssh configs do). I fixed that part. --JeremyReed
+
+My next problem was that ikiwiki started letting me edit without any password authentication. It used to prompt
+me for a password but now just goes right into the "editing" mode.
+The release notes for 2.0 say password auth is still on by default.
+
+> It sounds like you have the anonok plugin enabled?
+
+>> Where is the default documented? My config doesn't have it uncommented.
+
+The third problem is that when editing my textbox is empty -- no content.
+
+This is using my custom rcs.pm which has been used thousands of times.
+
+> Have you rebuilt the cgi wrapper since you upgraded ikiwiki? AFAIK I
+> fixed a bug that could result in the edit box always being empty back in
+> version 2.3. The only other way it could happen is if ikiwiki does not
+> have saved state about the page that it's editing (in .ikiwiki/index).
+
+>> Rebuilt it several times. Now that I think of it, my early problem of having
+>> no content in the textbox was before I rebuilt the cgi. And after I rebuilt, the whole webpage was empty.
+
+Now I regenerated my ikiwiki.cgi again (no change to my configuration),
+and I just get an empty HTML page when attempting editing or "create".
+
+> If the page is completely empty then ikiwiki is crashing before it can
+> output anything, though this seems unlikely. Check the webserver logs.
+
+Now I see it created directories for my data. I fixed that by setting
+usedirs (I see that is in the release notes for 2.0) and rerunning ikiwiki --setup
+but I still have empty pages for editing (no textbox, no html at all).
+
+> Is IkiWiki crashing? If so, it would probably leave error text in the apache logs. --[[TaylorKillian]]
+
+>> Not using apache. Nothing useful in the logs other than that the HTTP return codes are "0" and bytes is "-"
+>> on the empty ikiwiki.cgi output (it should say " 200 " followed by bytes).
+
+>>> You need to either figure out what your web server does with stderr
+>>> from cgi programs, or run ikiwiki.cgi at the command line with an
+>>> appropriate environment so it thinks it's being called from a web
+>>> server, so you can see how it's failing. --[[Joey]]
+
+(I am posting this now, but will do some research and post some more.)
+
+Is there any webpage with upgrade steps?
+
+> Users are expected to read [[news]], which points out any incompatible
+> changes or cases where manual action is needed.
+
+>> I read it but read the usedirs option wrong :(.
+>> Also it appears to be missing the news from between 1.40 and 2.0, unless they don't exist.
+>> If they do exist, maybe they have release notes I need?
+
+>>> All the old ones are in the NEWS file. --[[Joey]]
+
+--JeremyReed
+
+My followup: I used a new ikiwiki.setup based on the latest version. But no changes for me.
+
+Also I forgot to mention that do=recentchanges works well for me. It uses my
+rcs_recentchanges in my rcs perl module.
+
+The do=prefs does nothing though -- just a blank webpage.
+
+> You need to figure out why ikiwiki is crashing. The webserver logs should
+> tell you.
+
+I also set verbose => 1; running ikiwiki --setup was verbose, but there was no change in the running CGI.
+I was hoping for some output.
+
+I am guessing that my rcs perl module stopped working on the upgrade. I didn't notice any release notes
+on changes to revision control modules. Has something changed? I will also look.
+
+> No, the rcs interface has not needed to change in a long time. Also,
+> nothing is done with the rcs for do=prefs.
+
+>> Thanks. I also checked differences between 1.40 Rcs plugins and didn't notice anything significant.
+
+--JeremyReed
+
+Another Followup: I created a new ikiwiki configuration and did the --setup to
+create an entirely different website. I have the same problem there: no prompt for password
+and an empty webpage when using the cgi.
+I never upgraded any perl modules, so maybe a new perl module is required, but I don't see any errors, so I don't know.
+
+The only errors I see when building and installing ikiwiki are:
+
+ Can't exec "otl2html": No such file or directory at IkiWiki/Plugin/otl.pm line 66.
+
+ gettext 0.14 too old, not updating the pot file
+
+I don't use GNU gettext here.
+
+I may need to revert back to my old ikiwiki install which has been used thousands of times (with around
+1000 rcs commits via ikiwiki).
+
+--JeremyReed
+
+I downgraded to version 1.40 (that was what I had before; I wrote it wrong above).
+Now ikiwiki is working for me again (but using 1.40). I shouldn't have tested on production system :)
+
+--JeremyReed
+
+I am back. On a different system, I installed ikiwiki 2.6.1. Same problem -- blank CGI webpage.
+
+So I manually ran with:
+
+    REQUEST_METHOD=GET QUERY_STRING='do=create&page=jcr' ./ikiwiki.cgi
+
+And clearly saw the error:
+
+ [IkiWiki::main] Fatal: Bad template engine CGI::FormBuilder::Template::div: Can't locate CGI/FormBuilder/Template/div.pm
+
+So I found my version was too old; 3.05 is the first to provide "Div" support. I upgraded my p5-CGI-FormBuilder to 3.0501,
+and the ikiwiki CGI started working for me.
+
+The Ikiwiki docs about this requirement got removed in Revision 4367. There should be a page that lists the requirements.
+(I guess I could have used the debian/control file.)
+
+> There is a page, [[install]] documents that 3.05 is needed.
+
+>> Sorry, I missed that. With hundreds of wikipages it is hard to read all of them.
+>> I am updating the download page now to link to it.
+
+I am now using ikiwiki 2.6.1 on my testing system.
+
+--JeremyReed
diff --git a/doc/forum/use_php-markdown-extra_with_ikiwiki__63__.mdwn b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__.mdwn
new file mode 100644
index 000000000..86ed70fd2
--- /dev/null
+++ b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__.mdwn
@@ -0,0 +1,3 @@
+Is it possible to use php-markdown-extra with ikiwiki?
+
+Thanks.
diff --git a/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_1_66d48218361caa4c07bd714b82ed0021._comment b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_1_66d48218361caa4c07bd714b82ed0021._comment
new file mode 100644
index 000000000..af60ecbdb
--- /dev/null
+++ b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_1_66d48218361caa4c07bd714b82ed0021._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="60.241.8.244"
+ subject="PHP != Perl"
+ date="2010-07-10T12:44:15Z"
+ content="""
+Er, why? IkiWiki is written in Perl. Presumably php-markdown-extra is written in PHP, which is a completely different language.
+"""]]
diff --git a/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_2_f2ee0a4dce571d329f795e52139084c0._comment b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_2_f2ee0a4dce571d329f795e52139084c0._comment
new file mode 100644
index 000000000..ce60f4b3a
--- /dev/null
+++ b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_2_f2ee0a4dce571d329f795e52139084c0._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawlzADDUvepOXauF4Aq1VZ4rJaW_Dwrl6xE"
+ nickname="Dário"
+ subject="comment 2"
+ date="2010-07-10T21:48:13Z"
+ content="""
+Because php-markdown-extra extends the basic markdown language (footnotes, etc.)
+"""]]
diff --git a/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_3_e388714f457ccb6ef73630179914558c._comment b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_3_e388714f457ccb6ef73630179914558c._comment
new file mode 100644
index 000000000..72ce7bb6f
--- /dev/null
+++ b/doc/forum/use_php-markdown-extra_with_ikiwiki__63__/comment_3_e388714f457ccb6ef73630179914558c._comment
@@ -0,0 +1,9 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="202.173.183.92"
+ subject="I still don't get it"
+ date="2010-07-11T07:18:35Z"
+ content="""
+But if you need the \"extra\" features of Markdown, all you have to do is turn on the \"multimarkdown\" option in your configuration. It makes no sense to try to use PHP with Perl.
+
+"""]]
diff --git a/doc/forum/usedirs___38___indexpages_using_problem.mdwn b/doc/forum/usedirs___38___indexpages_using_problem.mdwn
new file mode 100644
index 000000000..05c85e281
--- /dev/null
+++ b/doc/forum/usedirs___38___indexpages_using_problem.mdwn
@@ -0,0 +1,17 @@
+My ikiwiki setup file is configured like this:
+
+ usedirs => 0,
+ indexpages => 1,
+
+I created a directory and some .mdwn source files, /Whatis/index.mdwn and /Whatis/OSS.mdwn. The html files ikiwiki generates are
+/Whatis/index.html and /Whatis/OSS.html.
+
+But on the page [OSS.html](http://atoz.org.cn/Whatis/OSS.html), the auto-generated link (at the top of the page)
+to “Whatis” points to the /Whatis.html file, not to /Whatis/index.html, so the link to “Whatis” fails.
+
+Is this a bug, and what can I do about it?
+
+> I suggest that you name your page `Whatis.mdwn`, and not
+> `Whatis/index.mdwn`. That will make ikiwiki's links work,
+> and allows you to link to the `Whatis` page by that name.
+> --[[Joey]]
diff --git a/doc/forum/users/acodispo.mdwn b/doc/forum/users/acodispo.mdwn
new file mode 100644
index 000000000..cf07b6386
--- /dev/null
+++ b/doc/forum/users/acodispo.mdwn
@@ -0,0 +1,2 @@
+# ACodispo
+
diff --git a/doc/forum/using_l10n__39__d_basewiki.mdwn b/doc/forum/using_l10n__39__d_basewiki.mdwn
new file mode 100644
index 000000000..a361d18c9
--- /dev/null
+++ b/doc/forum/using_l10n__39__d_basewiki.mdwn
@@ -0,0 +1,7 @@
+Hey there!
+
+I'm trying to get the translated version of the basewiki activated in my wiki. Setting "locale => 'de_DE.UTF-8'" gave me some German messages on the CLI and a few changes in the wiki itself, but the basewiki is still English. The files in /usr/share/ikiwiki/po/de/ are there.
+
+As I understand, [[plugins/po]] is just for translating.
+
+So, what am I doing wrong?
diff --git a/doc/forum/using_l10n__39__d_basewiki/comment_1_eaab671848ee6129f6fe9399474eeac0._comment b/doc/forum/using_l10n__39__d_basewiki/comment_1_eaab671848ee6129f6fe9399474eeac0._comment
new file mode 100644
index 000000000..1f21b485b
--- /dev/null
+++ b/doc/forum/using_l10n__39__d_basewiki/comment_1_eaab671848ee6129f6fe9399474eeac0._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 1"
+ date="2010-12-05T20:12:17Z"
+ content="""
+The translated basewiki depends on the po plugin being enabled and configured with the language(s) to use.
+"""]]
diff --git a/doc/forum/using_l10n__39__d_basewiki/comment_2_d907676a1db1210ca59506673c564359._comment b/doc/forum/using_l10n__39__d_basewiki/comment_2_d907676a1db1210ca59506673c564359._comment
new file mode 100644
index 000000000..c8d1e4e04
--- /dev/null
+++ b/doc/forum/using_l10n__39__d_basewiki/comment_2_d907676a1db1210ca59506673c564359._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://xlogon.net/bacuh"
+ ip="93.182.190.4"
+ subject="comment 2"
+ date="2010-12-05T22:48:53Z"
+ content="""
+This works, thanks.
+
+But is there also a way to get \"Edit\" etc. and the buttons behind it translated?
+"""]]
diff --git a/doc/forum/using_l10n__39__d_basewiki/comment_3_5e9d5bc5ecaf63f9bfe3315b09a279aa._comment b/doc/forum/using_l10n__39__d_basewiki/comment_3_5e9d5bc5ecaf63f9bfe3315b09a279aa._comment
new file mode 100644
index 000000000..f72bb37af
--- /dev/null
+++ b/doc/forum/using_l10n__39__d_basewiki/comment_3_5e9d5bc5ecaf63f9bfe3315b09a279aa._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://joey.kitenet.net/"
+ nickname="joey"
+ subject="comment 3"
+ date="2010-12-05T22:53:12Z"
+ content="""
+That requires translating the templates, which has never quite been finished. [[todo/l10n]] discusses that.
+
+(You can edit the templates yourself of course and manually translate.)
+"""]]
diff --git a/doc/forum/using_svn+ssh_with_ikiwiki.mdwn b/doc/forum/using_svn+ssh_with_ikiwiki.mdwn
new file mode 100644
index 000000000..8d9c27e46
--- /dev/null
+++ b/doc/forum/using_svn+ssh_with_ikiwiki.mdwn
@@ -0,0 +1,11 @@
+Just as an experiment, I tried running ikiwiki using a remote repository, i.e. via "svn+ssh". After setting up the repo and relocating the working copy, unfortunately, it doesn't work; editing a page gives the error:
+
+> Error: no element found at line 3, column 0, byte 28 at /opt/local/lib/perl5/vendor_perl/5.10.1/darwin-multi-2level/XML/Parser.pm line 187
+
+I think this is because, despite a SetEnv directive in the apache configuration, the CGI wrapper is expunging SVN_SSH from the environment (based on perusing the source of Wrapper.pm and looking at "envsave" there at the top). Is this the case? --[[Glenn|geychaner@mac.com]]
+
+> That seems likely. You can edit Wrapper.pm and add SVN_SSH to the @envsave list and rebuild your wrappers to test it. --Joey
+
+A better way(?) would be to add a plugin to set the SVN_SSH variable at the appropriate moment (or even to add this to the SVN plugin). What kind of hook should this be? It needs to run just *after* the CGI script cleans its environment. --[[Glenn|geychaner@mac.com]]
+
+Actually, this probably doesn't need to be a plugin; setting SVN_SSH in ENV can probably be done through the setup file. (Right?) --[[Glenn|geychaner@mac.com]]
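+
+> For what it's worth, a setup-file approach might look like this (a sketch --
+> check that your ikiwiki version supports an `ENV` setting, and that the
+> wrapper whitelists the variable; `@envsave` in `Wrapper.pm` remains the
+> authoritative list):
+>
+>     # in ikiwiki.setup: extra environment variables for ikiwiki to set
+>     ENV => {
+>         SVN_SSH => '/usr/bin/ssh',
+>     },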
diff --git a/doc/forum/utf8_warnings_for___34____92__xAB__34__.mdwn b/doc/forum/utf8_warnings_for___34____92__xAB__34__.mdwn
new file mode 100644
index 000000000..72f2d38e0
--- /dev/null
+++ b/doc/forum/utf8_warnings_for___34____92__xAB__34__.mdwn
@@ -0,0 +1,47 @@
+# Getting warnings about UTF-8 chars
+
+I'm getting multiple warnings:
+
+ utf8 "\xAB" does not map to Unicode at /usr/share/perl5/IkiWiki.pm line 774, <$in> chunk 1.
+
+
+I'm assuming this is once per file, but even in verbose mode, it doesn't tell me which file is the problem.
+It first reads all the files, and only afterwards, when parsing/compiling them, does it output the warning, so I can't
+deduce the offending files.
+
+Is there a way to have ikiwiki output the position where it encounters the character?
+
+Probably all this has to do with locale settings, and usage of mixed locales in a distributed setup ...
+I'd rather clean up some of the file(name)s with unexpected characters. --[[jwalzer]]
+
+--------
+
+**Update**: So I took the chance to insert a debug call into IkiWiki.pm:
+
+ root@novalis:/usr/share/perl5# diff -p /tmp/IkiWiki.orig.pm IkiWiki.pm
+ *** /tmp/IkiWiki.orig.pm Sun Feb 14 15:16:08 2010
+ --- IkiWiki.pm Sun Feb 14 15:16:28 2010
+ *************** sub readfile ($;$$) {
+ *** 768,773 ****
+ --- 768,774 ----
+ }
+
+ local $/=undef;
+ + debug("opening File: $file:");
+ open (my $in, "<", $file) || error("failed to read $file: $!");
+ binmode($in) if ($binary);
+ return \*$in if $wantfd;
+
+
+But what I see now is not quite helpful: it seems STDERR and DEBUG are asynchronous, so they mix up in a way that I can't really see what the problem is ... Maybe, for troubleshooting, I'm better off inserting a printf to stderr so it's in the same stream. --[[jwalzer]]
+
+
+----
+
+**Update:** The "print STDERR $file;" trick did it. I was able to find an mdwn file (generated by a script of mine) that had \0xAB in it.
+
+Nevertheless I still wonder if this should be a problem. This character happened to be in a *\[\[meta title='$CHAR'\]\]* tag and a *\[$CHAR\](http://foo)* link.
+
+Should this throw a warning? Maybe the warning could be caught and reported along with the containing filename? Maybe even with an override, if one knows that it is correct that way? --[[jwalzer]]
+
+[[!tag solved]]
diff --git a/doc/forum/w3mmode___91__Save_Page__93___results_in_403.mdwn b/doc/forum/w3mmode___91__Save_Page__93___results_in_403.mdwn
new file mode 100644
index 000000000..6c7a59b62
--- /dev/null
+++ b/doc/forum/w3mmode___91__Save_Page__93___results_in_403.mdwn
@@ -0,0 +1,9 @@
+My setup matches w3mmode [[w3mmode/ikiwiki.setup]] exactly.
+My doc/index.mdwn just has a line or two of plain text.
+When I try to edit that page in w3m, it works fine until I push [Save Page].
+Then I just get a page that only contains "403".
+
+ikiwiki version is 3.20110715ubuntu1.
+w3m is 0.5.3.
+
+-- [[terry|tjgolubi]]
diff --git a/doc/forum/web_service_API__44___fastcgi_support.mdwn b/doc/forum/web_service_API__44___fastcgi_support.mdwn
new file mode 100644
index 000000000..84b227eef
--- /dev/null
+++ b/doc/forum/web_service_API__44___fastcgi_support.mdwn
@@ -0,0 +1,18 @@
+This is a half-baked thought of mine so I thought I would post it in forum for discussion.
+
+There are some things that ikiwiki.cgi is asked to do which do not involve changing the repository: these include form generation, handling logins, the "goto" from [[recentchanges]], edit previews, etc.
+
+For one thing I am working on slowly ([[todo/interactive todo lists]]), I've hit a situation where I am likely to need to implement doing markup evaluation for a subset of a page. The problem I face is, if a user edits content in the browser, markup, ikiwiki directives etc. need to be expanded. I could possibly do this with a round-trip through edit preview, but that would be for the whole content of a page, and I hit the problem with editing a list item.
+
+> (slight addendum on this front. I'm planning to split the javascript code for interactive todo lists into two parts: one for handling round trips of content to and from ikiwiki.cgi, and the various failure modes that might occur (permission denied, edit conflicts, login required, etc.) ; then the list-specific stuff can build on top of this. The first chunk might be reusable by others for other AJAXY-edit fu.)
+
+Anyway - I've realised that a big part of the interactive todo lists stuff is trying to handle round trips to ikiwiki.cgi through javascript. A web services API would make handling the various conditions of this easier (e.g. need to login, login failed, etc.). I'm not sure what else might benefit from a web services API and I have no real experience of either using or writing them so I don't know what pros/cons there are for REST vs SOAP etc.
+
+Second, and in a way related, I've been mooting hacking fastcgi support into ikiwiki. Essentially one ikiwiki.cgi process would persist and serve CGI-ish requests on stdin/stdout. The initial content-scanning and dependency generation would happen once and not need to be repeated for future requests. Although, all state-changing operations would need to be careful to ensure the in-memory models were accurate. Also, I don't know how suited the data structures would be for persistence, since the current model is build em up, throw em away, they might not be space-efficient enough for persistence.
+
+If I did attempt this, I would want to avoid restructuring things in a way which would impair ikiwiki's core philosophy of being a static compiler. -- [[Jon]]
+
+> This is quite interesting! There is a related discussion about FastCGI
+> support (and therefore better support for Nginx, for example) in
+> [[todo/fastcgi_or_modperl_installation_instructions/]]... --
+> [[anarcat]]
diff --git a/doc/forum/what_is_the_easiest_way_to_implement_order:_disallow_all__44___allow_chosen__95__few_page_editing_policy__63__.mdwn b/doc/forum/what_is_the_easiest_way_to_implement_order:_disallow_all__44___allow_chosen__95__few_page_editing_policy__63__.mdwn
new file mode 100644
index 000000000..fbc5c58e2
--- /dev/null
+++ b/doc/forum/what_is_the_easiest_way_to_implement_order:_disallow_all__44___allow_chosen__95__few_page_editing_policy__63__.mdwn
@@ -0,0 +1,3 @@
+As in the title, I'd like to allow editing only some pages on my wiki. The rest is by default not editable by anyone except the admin. Thanks
+
+> See [[plugins/lockedit]]. --[[schmonz]]
diff --git a/doc/forum/where_are_the_tags.mdwn b/doc/forum/where_are_the_tags.mdwn
new file mode 100644
index 000000000..ecb49fe43
--- /dev/null
+++ b/doc/forum/where_are_the_tags.mdwn
@@ -0,0 +1,9 @@
+Where is the tag cloud/tag listing of all the tags used in this wiki? I know we
+have tags enabled. --[[jerojasro]]
+
+> This wiki does not use one global toplevel set of tags (`tagbase` is not
+> set).
+>
+> There are tags used for the [[plugins]], and a tag cloud of those
+> there. [[wishlist]] and [[patch]] are tags too, but I don't see the point
+> of a tag cloud for such tags. --[[Joey]]
diff --git a/doc/forum/where_are_the_tags/comment_1_6a559c3bfe72011c45b006d33176da3d._comment b/doc/forum/where_are_the_tags/comment_1_6a559c3bfe72011c45b006d33176da3d._comment
new file mode 100644
index 000000000..6878a7af8
--- /dev/null
+++ b/doc/forum/where_are_the_tags/comment_1_6a559c3bfe72011c45b006d33176da3d._comment
@@ -0,0 +1,14 @@
+[[!comment format=mdwn
+ username="http://intranetsdoneright.blogspot.com/"
+ ip="85.127.82.246"
+ subject="Tag cloud or list of tags possible?"
+ date="2011-04-30T06:10:14Z"
+ content="""
+I want to create a helpful welcome page http://pyjs.org/wiki/ mainly by using links or listings of what are called \"special pages\" in MediaWiki. (Currently, we have a list of all wiki articles; this frightens new readers.)
+
+Can I somehow list all tags of the wiki, or is there a tag cloud?
+How can I list the first level of subpages (not the pages, just the \"hierarchy\" level or so)?
+What other \"special\" lists or so can ikiwiki generate (e.g. users, tags, changes, ...)?
+
+Thanks for any help or directions!
+"""]]
diff --git a/doc/forum/which_file_ikiwiki_--setup_is_processing_right_now__63__.mdwn b/doc/forum/which_file_ikiwiki_--setup_is_processing_right_now__63__.mdwn
new file mode 100644
index 000000000..6c5ee4301
--- /dev/null
+++ b/doc/forum/which_file_ikiwiki_--setup_is_processing_right_now__63__.mdwn
@@ -0,0 +1,4 @@
+Is there a way to know which file ikiwiki is currently processing while I am running "ikiwiki --setup $FOO.setup"?
+
+I am migrating a large ikiwiki instance and the compiler dies in the middle of setup -- but I don't know which file is causing the problem.
+
diff --git a/doc/forum/which_file_ikiwiki_--setup_is_processing_right_now__63__/comment_1_4f52f8fc083982bd5a572742cf35c74f._comment b/doc/forum/which_file_ikiwiki_--setup_is_processing_right_now__63__/comment_1_4f52f8fc083982bd5a572742cf35c74f._comment
new file mode 100644
index 000000000..21b5e7d2f
--- /dev/null
+++ b/doc/forum/which_file_ikiwiki_--setup_is_processing_right_now__63__/comment_1_4f52f8fc083982bd5a572742cf35c74f._comment
@@ -0,0 +1,7 @@
+[[!comment format=mdwn
+ username="https://id.koumbit.net/anarcat"
+ subject="try --verbose"
+ date="2012-09-14T05:01:16Z"
+ content="""
+you can try `--verbose` when you use `--rebuild`, otherwise you could also try `strace`.
+"""]]
diff --git a/doc/forum/wiki_clones_on_dynamic_IPs.mdwn b/doc/forum/wiki_clones_on_dynamic_IPs.mdwn
new file mode 100644
index 000000000..f69f6501e
--- /dev/null
+++ b/doc/forum/wiki_clones_on_dynamic_IPs.mdwn
@@ -0,0 +1,10 @@
+OK, this is not really an ikiwiki problem... but ikiwiki makes wiki clones
+really easy to setup, and this is related to having a website cloned at
+different places and pulling from the servers which are online.
+
+My setup is like this: I have a server at home and another at my dorm
+which will serve as a wiki clone. Each of them has a dynamic IP with DynDNS
+set up. Now the problem lies in linking my domain to these two DynDNS addresses.
+Multiple CNAMEs are not supported, and I don't know if there's any utility
+which can update the A records in the DNS to point directly to the
+two separate IPs.
diff --git a/doc/forum/wiki_name_in_page_titles.mdwn b/doc/forum/wiki_name_in_page_titles.mdwn
new file mode 100644
index 000000000..4e9e51835
--- /dev/null
+++ b/doc/forum/wiki_name_in_page_titles.mdwn
@@ -0,0 +1,32 @@
+I'd like to have the wiki name appear in page titles as in "WikiName:
+Page Title." If I use `<TMPL_VAR WIKINAME>: <TMPL_VAR TITLE>` in the
+template this works for all pages except the index page itself which
+will have title "WikiName: WikiName" as its title. Does anyone know
+of a template-based solution to this or do I need to write a plugin
+that provides a `IS_HOMEPAGE` template variable? --[[JasonBlevins]]
+
+> Hmm, one way to work around this is to put a meta title directive on the
+> index page. Then TITLE will be that, rather than WIKINAME, and your
+> template should work. --[[Joey]]
+
+>> I ended up writing a [path][] plugin since I had some other
+>> path-specific conditional things to include in my templates.
+>>
+>> So now I can do things like this:
+>>
+>> <title>
+>> <TMPL_VAR WIKINAME><TMPL_UNLESS IS_HOMEPAGE>: <TMPL_VAR TITLE></TMPL_UNLESS>
+>> </title>
+>>
+>> But also more complicated path-specific conditionals like
+>> `IN_DIR_SUBDIR` to indicate subpages of `/dir/subdir/`. I've got a
+>> few other small plugins brewing so I'll try to put up some contrib
+>> pages for them soon. --[[JasonBlevins]]
+
+[path]: http://jblevins.org/git/ikiwiki/plugins.git/plain/path.pm
+
+> I used the following trick in some page.tmpl:
+>
+> <title><TMPL_VAR WIKINAME><TMPL_IF NAME="PARENTLINKS">: <TMPL_VAR TITLE></TMPL_IF></title>
+>
+> --[[JeanPrivat]]
diff --git a/doc/forum/wishlist-discussion:_Editformular_showing_existing_tags.mdwn b/doc/forum/wishlist-discussion:_Editformular_showing_existing_tags.mdwn
new file mode 100644
index 000000000..49c55e20e
--- /dev/null
+++ b/doc/forum/wishlist-discussion:_Editformular_showing_existing_tags.mdwn
@@ -0,0 +1,15 @@
+# How about:
+
+having a list of all existing tags in the edit form as a selection box?
+
+Assume I have tagbase=/tags/ and an existing page there for every tag I have given to articles.
+
+Would it be possible to list all these tags alongside the form, as a selection box?
+Maybe even with parsing of the content and preselecting the tags that are given in the article, and vice versa: when selecting the fields, also generating the \[\[\!tag\]\] source code?
+
+This would need a bit of JS work, and at compile time we would need to put the list of tags somewhere the CGI could read them from.
+That way, even a pagespec would suffice to determine the usable list of tags, and not only the tagbase variable.
+
+> I think this would be very hard to achieve with the current tag plugin, due to the nature of its implementation.
+>
+> I've had a "tag2" plugin on the go for a while which supports this. It's in a very rough stage but I'll try to find it and upload it somewhere. -- [[Jon]]
diff --git a/doc/forum/wishlist:_Allow_OpenID_users_to_set_a_display_name.mdwn b/doc/forum/wishlist:_Allow_OpenID_users_to_set_a_display_name.mdwn
new file mode 100644
index 000000000..2551ef75e
--- /dev/null
+++ b/doc/forum/wishlist:_Allow_OpenID_users_to_set_a_display_name.mdwn
@@ -0,0 +1,3 @@
+Preferences should offer me a way to define my display name. This name should, with proper markings, be used in git commits, etc. -- RichiH
+
+> I.e. care should be taken that I can't simply change it to joeyh@kitenet.net and start writing crap. For example, the display name could be displayed before the OpenID string, by default. That would be a worthwhile change anyway, whether or not you support display names. -- RichiH
diff --git a/doc/forum/wishlist:_support_staging_area.mdwn b/doc/forum/wishlist:_support_staging_area.mdwn
new file mode 100644
index 000000000..7628ad00a
--- /dev/null
+++ b/doc/forum/wishlist:_support_staging_area.mdwn
@@ -0,0 +1,12 @@
+It would be nice if ikiwiki had built-in support for a staging area.
+
+Default branch name would be *staging*, default result path would be *staging* as well.
+
+Thus, if I push into the staging branch, it should automagically create a full copy of the site in staging/.
+
+This would make playing with ikiwiki easier while not cluttering the main branch and site.
+
+A simple .htaccess would take care of keeping the staging area limited to people who should be allowed to access it.
+
+
+Richard
diff --git a/doc/forum/wmd_editor_double_preview.mdwn b/doc/forum/wmd_editor_double_preview.mdwn
new file mode 100644
index 000000000..7a105ef36
--- /dev/null
+++ b/doc/forum/wmd_editor_double_preview.mdwn
@@ -0,0 +1,3 @@
+I use the wmd editor in my ikiwiki. However, the live preview does not seem to be a fully correct preview, so I still have to hit the preview button to get a correct one. But then I have two previews, and I have to scroll down to see the correct one.
+
+Is it possible to disable the live preview or to *replace* the live preview with the correct one after pressing the preview button?
diff --git a/doc/forum/wmd_editor_double_preview/comment_1_0d3acf67f3c35f8c4156228f96dcd975._comment b/doc/forum/wmd_editor_double_preview/comment_1_0d3acf67f3c35f8c4156228f96dcd975._comment
new file mode 100644
index 000000000..cc8c9ac43
--- /dev/null
+++ b/doc/forum/wmd_editor_double_preview/comment_1_0d3acf67f3c35f8c4156228f96dcd975._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk_MMtLPS7osC5MjX00q2ATjvvXPWqm0ik"
+ nickname="micheal"
+ subject="comment 1"
+ date="2012-08-18T06:30:56Z"
+ content="""
+Any Ideas how to do this?
+"""]]
diff --git a/doc/freesoftware.mdwn b/doc/freesoftware.mdwn
new file mode 100644
index 000000000..2243d9b1f
--- /dev/null
+++ b/doc/freesoftware.mdwn
@@ -0,0 +1,11 @@
+[[!meta title="Free Software"]]
+
+ikiwiki, and this documentation wiki, are licensed under the terms of the
+GNU [[GPL]], version 2 or later.
+
+The parts of ikiwiki that become part of your own wiki (the [[basewiki]]
+pages (but not the smilies) and the [[templates]]) are licensed
+as follows:
+
+> Redistribution and use in source and compiled forms, with or without
+> modification, are permitted under any circumstances. No warranty.
diff --git a/doc/freesoftware/discussion.mdwn b/doc/freesoftware/discussion.mdwn
new file mode 100644
index 000000000..e71fd295d
--- /dev/null
+++ b/doc/freesoftware/discussion.mdwn
@@ -0,0 +1,3 @@
+And where is the code, please?
+
+> [[download]] --[[Joey]]
diff --git a/doc/git.mdwn b/doc/git.mdwn
new file mode 100644
index 000000000..b86132ed2
--- /dev/null
+++ b/doc/git.mdwn
@@ -0,0 +1,84 @@
+Ikiwiki, and this documentation wiki, are developed in a git repository and
+can be checked out like this:
+
+[[!template id=note text="""
+You can push changes back to ikiwiki's git repository over the `git://`
+transport, to update this wiki, if you'd like, instead of editing it on the
+web. Changes that could not be made via the web will be automatically
+rejected.
+"""]]
+
+ git clone git://git.ikiwiki.info/
+
+The gitweb is [here](http://source.ikiwiki.branchable.com/?p=source.git;a=summary).
+
+Commits to this git repository are fed into [KGB](http://kgb.alioth.debian.org/)
+for transmission to the #ikiwiki irc channel.
+
+## personal git repositories
+
+You are of course free to set up your own ikiwiki git repository with your
+own [[patches|patch]]. If you list it here, the `gitremotes` script will
+automatically add it to git remotes. Your repo will automatically be pulled
+into [[Joey]]'s working repository where he can see your branches and
+think about merging them. This is recommended. :-)
+
+<!-- Machine-parsed format: * wikilink <git:url> -->
+
+* github `git://github.com/joeyh/ikiwiki.git`
+ ([browse](http://github.com/joeyh/ikiwiki/tree/master))
+ A mirror of the main repo, automatically updated. Also provides http
+ cloning at `http://github.com/joeyh/ikiwiki.git`
+* l10n `git://l10n.ikiwiki.info/`
+ Open push localization branch used for <http://l10n.ikiwiki.info/>
+* [[smcv]] `git://git.pseudorandom.co.uk/git/smcv/ikiwiki.git`
+ ([browse](http://git.pseudorandom.co.uk/smcv/ikiwiki.git))
+* [[intrigeri]] `git://gaffer.ptitcanardnoir.org/ikiwiki.git`
+* [[gmcmanus]] `git://github.com/gmcmanus/ikiwiki.git`
+* [[jelmer]] `git://git.samba.org/jelmer/ikiwiki.git`
+* [[hendry]] `git://webconverger.org/git/ikiwiki`
+* [[jon]] `git://github.com/jmtd/ikiwiki.git`
+* [[ikipostal|DavidBremner]] `git://pivot.cs.unb.ca/git/ikipostal.git`
+* [[ikimailbox|DavidBremner]] `git://pivot.cs.unb.ca/git/ikimailbox.git`
+* [[ikiplugins|DavidBremner]] `git://pivot.cs.unb.ca/git/ikiplugins.git`
+* [[jonas|JonasSmedegaard]] `git://source.jones.dk/ikiwiki-upstream`
+* [[arpitjain]] `git://github.com/arpitjain11/ikiwiki.git`
+* [[chrysn]] `git://github.com/github076986099/ikiwiki.git`
+* [[simonraven]] `git://github.com/kjikaqawej/ikiwiki-simon.git`
+* [[schmonz]] `git://github.com/schmonz/ikiwiki.git`
+* [[will]] `http://www.cse.unsw.edu.au/~willu/ikiwiki.git`
+* [[kaizer]] `git://github.com/engla/ikiwiki.git`
+* [[bbb]] `http://git.boulgour.com/bbb/ikiwiki.git`
+* [[KathrynAndersen]] `git://github.com/rubykat/ikiplugins.git`
+* [[ktf]] `git://github.com/ktf/ikiwiki.git`
+* [[tove]] `git://github.com/tove/ikiwiki.git`
+* [[GiuseppeBilotta]] `git://git.oblomov.eu/ikiwiki`
+* [[roktas]] `git://github.com/roktas/ikiwiki.git`
+* [[davrieb|David_Riebenbauer]] `git://git.liegesta.at/git/ikiwiki`
+ ([browse](http://git.liegesta.at/?p=ikiwiki.git;a=summary))
+* [[GustafThorslund]] `http://gustaf.thorslund.org/src/ikiwiki.git`
+* [[users/peteg]] `git://git.hcoop.net/git/peteg/ikiwiki.git`
+* [[privat]] `git://github.com/privat/ikiwiki.git`
+* [[blipvert]] `git://github.com/blipvert/ikiwiki.git`
+* [[bzed|BerndZeimetz]] `git://git.recluse.de/users/bzed/ikiwiki.git`
+* [[wtk]] `git://github.com/wking/ikiwiki.git`
+* [[sunny256]] `git://github.com/sunny256/ikiwiki.git`
+* [[fmarier]] `git://gitorious.org/~fmarier/ikiwiki/fmarier-sandbox.git`
+* [[levitte]] `git://github.com/levitte/ikiwiki.git`
+* jo `git://git.debian.org/users/jo-guest/ikiwiki.git`
+ ([browse](http://git.debian.org/?p=users/jo-guest/ikiwiki.git;a=summary))
+* [[timonator]] `git://github.com/timo/ikiwiki.git`
+* [[sajolida]] `http://un.poivron.org/~sajolida/ikiwiki.git/`
+* nezmer `git://gitorious.org/ikiwiki-nezmer/ikiwiki-nezmer.git`
+* [[yds]] `git://github.com/yds/ikiwiki.git`
+* [[pelle]] `git://github.com/hemmop/ikiwiki.git`
+* [[chrismgray]] `git://github.com/chrismgray/ikiwiki.git`
+* [[ttw]] `git://github.com/ttw/ikiwiki.git`
+* [[anarcat]] `git://src.anarcat.ath.cx/ikiwiki`
+* anderbubble `git://civilfritz.net/ikiwiki.git`
+* frioux `git://github.com/frioux/ikiwiki`
+* llipavsky `git://github.com/llipavsky/ikiwiki`
+
+## branches
+
+Current branches of ikiwiki are listed on [[branches]].
diff --git a/doc/ikiwiki-calendar.mdwn b/doc/ikiwiki-calendar.mdwn
new file mode 100644
index 000000000..d311a1859
--- /dev/null
+++ b/doc/ikiwiki-calendar.mdwn
@@ -0,0 +1,53 @@
+# NAME
+
+ikiwiki-calendar - create calendar archive pages
+
+# SYNOPSIS
+
+ikiwiki-calendar [-f] your.setup [pagespec] [startyear [endyear]]
+
+# DESCRIPTION
+
+`ikiwiki-calendar` creates pages that use the [[ikiwiki/directive/calendar]]
+directive, allowing the archives to be browsed one month
+at a time, with calendar-based navigation.
+
+You must specify the setup file for your wiki. The pages will
+be created inside its `srcdir`, beneath the `archivebase`
+directory used by the calendar plugin (default "archives").
+
+To control which pages are included on the calendars,
+a [[ikiwiki/PageSpec]] can be specified. The default is
+all pages, or the pages specified by the `comments_pagespec`
+setting in the config file. A pagespec can also be specified
+on the command line. To limit it to only posts in a blog,
+use something like "posts/* and !*/Discussion".
+
+It defaults to creating calendar pages for the current
+year. If you specify a year, it will create pages for that year.
+Specify a second year to create pages for a span of years.
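+
+For example, to create archive pages spanning 2010 through 2013 (the
+setup file path here is illustrative):
+
+    ikiwiki-calendar ~/ikiwiki.setup 'posts/*' 2010 2013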
+
+Existing pages will not be overwritten by this command by default.
+Use the `-f` switch to force it to overwrite any existing pages.
+
+# CRONTAB
+
+While this command only needs to be run once a year to update
+the archive pages for each new year, it is recommended to set up
+a cron job to run it daily, at midnight. Then it will also update
+the calendars to highlight the current day.
+
+An example crontab:
+
+ 0 0 * * * ikiwiki-calendar ~/ikiwiki.setup 'posts/* and !*/Discussion'
+
+# TEMPLATES
+
+This command uses two [[templates]] to generate
+the pages, `calendarmonth.tmpl` and `calendaryear.tmpl`.
+
+# AUTHOR
+
+Joey Hess <joey@ikiwiki.info>
+
+Warning: this page is automatically made into ikiwiki-calendar's man page, edit with care
diff --git a/doc/ikiwiki-calendar/discussion.mdwn b/doc/ikiwiki-calendar/discussion.mdwn
new file mode 100644
index 000000000..b64321008
--- /dev/null
+++ b/doc/ikiwiki-calendar/discussion.mdwn
@@ -0,0 +1,36 @@
+Suggestion to change
+
+ 0 0 * * * ikiwiki-calendar ~/ikiwiki.setup "posts/* and !*/Discussion"
+
+to
+
+ 0 0 * * * ikiwiki-calendar ~/ikiwiki.setup 'posts/* and !*/Discussion'
+
+I ran into (for me) unexpected behaviour with double quotes, since when I tried it in the interactive shell, the "!" made it fail ([history expansion](http://mywiki.wooledge.org/BashPitfalls#echo_.22Hello_World.21.22)). I thought "aha, it should be escaped!", did so, and did not get any error messages, but it no longer functioned as intended (as per the earlier linked page).
+
+The latter line will work everywhere, not just in environments without history expansion. I think trying a command manually before putting it into crontab is common, and this would avoid the possible user issue I ran into.
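+
+To see the difference, try both quotings in an interactive bash session:
+
+    echo "posts/* and !*/Discussion"   # ! is subject to history expansion
+    echo 'posts/* and !*/Discussion'   # always printed literally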
+
+----
+
+I wonder what this program (ikiwiki-calendar) is useful for. Wouldn't it be simpler to have the [[calendar|ikiwiki/directive/calendar]] directive generate the `archive_base/year/month.mdwn` files on the fly, as the tag pages are generated by the [[tag|plugins/tag]] plugin? This solution would have the advantage of automatically generating the right calendar pages, instead of having to tell ikiwiki-calendar which years to take into account.
+
+Using this solution would mean storing the pagespec somewhere in the configuration. But this is already the case, as the pagespec used by [[ikiwiki-calendar]] is either set in the configuration file of the wiki, or in the crontab or whatever script is used to call ikiwiki-calendar.
+
+Having done this, the only purpose of ikiwiki-calendar would be to re-generate the wiki on a daily (or whatever frequency) basis, which can be done using ikiwiki instead of ikiwiki-calendar.
+
+Did I miss something? If I am right, I offer to write the necessary patch, copied and adapted from the tag plugin, to generate the pages `archive_base/year/month.mdwn` on the fly.
+
+-- Spalax
+
+> Good spotting, `ikiwiki-calendar` predates the `add_autofile` API used to
+> autocreate tag pages and was bolted in as an easy way to create calendar
+> pages.
+>
+> It would be possible to do that inside the calendar plugin now. Although
+> some command would still need to be run on a daily (or weekly, or
+> monthly, or yearly..) basis to have it wake up and make the new calendar
+> pages and update the displayed current day from calendar directives.
+>
+> That last is, arguably, the real point of running ikiwiki-calendar in
+> a cron job. Of course all it really does is run `ikiwiki -setup foo
+> -refresh`. --[[Joey]]
diff --git a/doc/ikiwiki-makerepo.mdwn b/doc/ikiwiki-makerepo.mdwn
new file mode 100644
index 000000000..928440f99
--- /dev/null
+++ b/doc/ikiwiki-makerepo.mdwn
@@ -0,0 +1,44 @@
+# NAME
+
+ikiwiki-makerepo - check an ikiwiki srcdir into revision control
+
+# SYNOPSIS
+
+ikiwiki-makerepo git|svn|monotone|darcs|cvs srcdir repo
+
+ikiwiki-makerepo bzr|mercurial srcdir
+
+# DESCRIPTION
+
+`ikiwiki-makerepo` injects an existing `srcdir` directory, containing
+sources for an ikiwiki wiki, into revision control. It is rarely
+run directly; consider using `ikiwiki --setup /etc/ikiwiki/wiki.setup` instead
+to set up a wiki.
+
+For git, the `repo` is created as a bare git repository, and the srcdir is
+made into a clone of it. (monotone and darcs are similar.)
+
+For svn and cvs, the `repo` is the centralized repository, and the `srcdir`
+is a checkout of it.
+
+For mercurial and bzr, the srcdir is the only repository set up.
+
+For darcs, the master repo's apply hook will be preconfigured to call an
+ikiwiki wrapper.
+
+Note that for monotone, you are assumed to already have run "mtn genkey"
+to generate a key.
+
+# EXAMPLE
+
+`ikiwiki-makerepo git /srv/web/wiki /srv/git/wiki.git/`
+
+This creates a bare repository `/srv/git/wiki.git/`,
+and sets up `/srv/web/wiki` to be a clone of it, committing
+any files that already exist in that directory.
+
+# AUTHOR
+
+Joey Hess <joey@ikiwiki.info>
+
+Warning: this page is automatically made into ikiwiki-makerepo's man page, edit with care
diff --git a/doc/ikiwiki-makerepo/discussion.mdwn b/doc/ikiwiki-makerepo/discussion.mdwn
new file mode 100644
index 000000000..660823fbb
--- /dev/null
+++ b/doc/ikiwiki-makerepo/discussion.mdwn
@@ -0,0 +1 @@
+Not sure how well I described the example. :-/ -- Jeremiah
diff --git a/doc/ikiwiki-mass-rebuild.mdwn b/doc/ikiwiki-mass-rebuild.mdwn
new file mode 100644
index 000000000..a2db0a858
--- /dev/null
+++ b/doc/ikiwiki-mass-rebuild.mdwn
@@ -0,0 +1,33 @@
+# NAME
+
+ikiwiki-mass-rebuild - rebuild all ikiwiki wikis on a system
+
+# SYNOPSIS
+
+ikiwiki-mass-rebuild
+
+# DESCRIPTION
+
+`ikiwiki-mass-rebuild` can be used to force a rebuild of all the wikis
+on a system (when run as root), or all of a user's wikis (when run as
+non-root).
+
+You will need to list the setup files for the wikis it should
+build in the file `/etc/ikiwiki/wikilist`, which has the format:
+
+    user /path/to/ikiwiki.setup
+
+It's also possible to let a user list setup files in `~user/.ikiwiki/wikilist`
+in their home directory. To do so, list only the user's name, without a
+setup file. The format of `~/.ikiwiki/wikilist` is the same as
+`/etc/ikiwiki/wikilist`.
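+
+For example (user names and paths here are illustrative):
+
+    joey /home/joey/ikiwiki.setup
+    bob
+
+With this wikilist, joey's wiki is rebuilt from the given setup file,
+while bob's setup files are read from `~bob/.ikiwiki/wikilist`.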
+
+# OPTIONS
+
+All options are passed on to ikiwiki.
+
+# AUTHOR
+
+Joey Hess <joey@ikiwiki.info>
+
+Warning: this page is automatically made into ikiwiki-mass-rebuild's man page, edit with care
diff --git a/doc/ikiwiki-mass-rebuild/discussion.mdwn b/doc/ikiwiki-mass-rebuild/discussion.mdwn
new file mode 100644
index 000000000..8b1378917
--- /dev/null
+++ b/doc/ikiwiki-mass-rebuild/discussion.mdwn
@@ -0,0 +1 @@
+
diff --git a/doc/ikiwiki-transition.mdwn b/doc/ikiwiki-transition.mdwn
new file mode 100644
index 000000000..3d81d659f
--- /dev/null
+++ b/doc/ikiwiki-transition.mdwn
@@ -0,0 +1,75 @@
+# NAME
+
+ikiwiki-transition - transition ikiwiki pages to new syntaxes, etc
+
+# SYNOPSIS
+
+ikiwiki-transition type ...
+
+# DESCRIPTION
+
+`ikiwiki-transition` aids in converting wiki pages when there's a major
+change in ikiwiki syntax. It also handles other transitions not involving
+wiki pages.
+
+# prefix_directives your.setup
+
+The `prefix_directives` mode converts all pages from the old preprocessor
+directive syntax, requiring a space, to the new syntax, prefixed by '!'.
+
+Preprocessor directives which already use the new syntax will remain
+unchanged.
+
+Note that if a page contains wiki links with spaces, which some
+older versions of ikiwiki accepted, the prefix_directives transition will
+treat these as preprocessor directives and convert them.
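+
+For example, a wiki link written in the old style as `\[[sandbox page]]`
+would be treated as a directive and rewritten to `\[[!sandbox page]]` by
+the transition.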
+
+# setupformat your.setup
+
+The `setupformat` mode converts a setup file from using a single `wrappers` block
+to using `cgi_wrapper`, `git_wrapper`, etc.
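+
+As a rough sketch (the path is illustrative and fields are abbreviated),
+a setup file containing
+
+    wrappers => [{ cgi => 1, wrapper => "/var/www/wiki.cgi" }],
+
+would be rewritten to use
+
+    cgi_wrapper => "/var/www/wiki.cgi",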
+
+Note that all comments and any unusual stuff like perl code in the setup
+file will be lost, as it is entirely rewritten by the transition.
+
+# aggregateinternal your.setup
+
+The `aggregateinternal` mode moves pages aggregated by the aggregate plugin
+so that the `aggregateinternal` option can be enabled.
+
+# moveprefs your.setup
+
+Moves values that used to be admin preferences into the setup file.
+
+Note that all comments and any unusual stuff like perl code in the setup
+file will be lost, as it is entirely rewritten by the move.
+
+# indexdb your.setup|srcdir
+
+The `indexdb` mode handles converting a plain text `.ikiwiki/index` file to
+a binary `.ikiwiki/indexdb`. You do not normally need to run
+`ikiwiki-transition indexdb`; ikiwiki will automatically run it as
+necessary.
+
+# hashpassword your.setup|srcdir
+
+The `hashpassword` mode forces any plaintext passwords stored in the
+`.ikiwiki/userdb` file to be replaced with password hashes. (The
+Authen::Passphrase perl module is needed to do this.)
+
+If this is not done explicitly, a user's plaintext password will be
+automatically converted to a hash when the user logs in for the first time
+after upgrading to ikiwiki 2.48.
+
+# deduplinks your.setup
+
+In the past, bugs in ikiwiki have allowed duplicate link information
+to be stored in its indexdb. This mode removes such duplicate information,
+which may speed up wikis afflicted by it. Note that rebuilding the wiki
+will have the same effect.
+
+# AUTHOR
+
+Josh Triplett <josh@freedesktop.org>, Joey Hess <joey@ikiwiki.info>
+
+Warning: this page is automatically made into ikiwiki-transition's man page, edit with care
diff --git a/doc/ikiwiki-update-wikilist.mdwn b/doc/ikiwiki-update-wikilist.mdwn
new file mode 100644
index 000000000..b6330c5e5
--- /dev/null
+++ b/doc/ikiwiki-update-wikilist.mdwn
@@ -0,0 +1,33 @@
+# NAME
+
+ikiwiki-update-wikilist - add or remove user from /etc/ikiwiki/wikilist
+
+# SYNOPSIS
+
+ikiwiki-update-wikilist [-r]
+
+# DESCRIPTION
+
+`ikiwiki-update-wikilist` is designed to be safely run as root by arbitrary
+users, either by being made suid and using the (now deprecated) suidperl, or
+by being configured in `/etc/sudoers` to allow arbitrary users to run it.
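+
+A hypothetical `/etc/sudoers` entry to that effect (the binary path may
+differ on your system) might be:
+
+    ALL ALL = NOPASSWD: /usr/bin/ikiwiki-update-wikilist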
+
+All it does is allow users to add or remove their names
+from the `/etc/ikiwiki/wikilist` file.
+
+By default, the user's name will be added.
+The `-r` switch causes the user's name to be removed.
+
+If your name is in `/etc/ikiwiki/wikilist`, the [[ikiwiki-mass-rebuild]](8)
+command will look for a ~/.ikiwiki/wikilist file, and rebuild the wikis listed
+in that file.
+
+# OPTIONS
+
+The only option is `-r`, described above.
+
+# AUTHOR
+
+Joey Hess <joey@ikiwiki.info>
+
+Warning: this page is automatically made into ikiwiki-update-wikilist's man page, edit with care
diff --git a/doc/ikiwiki.mdwn b/doc/ikiwiki.mdwn
new file mode 100644
index 000000000..4d840696c
--- /dev/null
+++ b/doc/ikiwiki.mdwn
@@ -0,0 +1,17 @@
+[[!meta robots="noindex, follow"]]
+This wiki is powered by [ikiwiki](http://ikiwiki.info/).
+[[!if test="enabled(version)"
+ then="(Currently running version [[!version ]].)"
+]]
+
+Some documentation on using ikiwiki:
+
+* [[ikiwiki/formatting]]
+* [[ikiwiki/wikilink]]
+* [[ikiwiki/subpage]]
+* [[ikiwiki/pagespec]]
+* [[ikiwiki/directive]]
+* [[ikiwiki/markdown]]
+* [[ikiwiki/openid]]
+* [[ikiwiki/searching]]
+* [[templates]]
diff --git a/doc/ikiwiki/directive.mdwn b/doc/ikiwiki/directive.mdwn
new file mode 100644
index 000000000..1dc1e517d
--- /dev/null
+++ b/doc/ikiwiki/directive.mdwn
@@ -0,0 +1,56 @@
+[[!meta robots="noindex, follow"]]
+Directives are similar to a [[ikiwiki/WikiLink]] in form, except they
+begin with `!` and may contain parameters. The general form is:
+
+ \[[!directive param="value" param="value"]]
+
+This gets expanded before the rest of the page is processed, and can be used
+to transform the page in various ways.
+
+The quotes around values can be omitted if the value is a simple word.
+Also, some directives may use parameters without values, for example:
+
+ \[[!tag foo]]
+
+A directive does not need to be all on one line; it can be
+wrapped onto multiple lines if you like:
+
+ \[[!directive foo="baldersnatch"
+ bar="supercalifragilisticexpialidocious" baz=11]]
+
+Also, multiple lines of *quoted* text can be used for a value.
+To allow quote marks inside the quoted text, delimit the block
+of text with triple-double-quotes or triple-single-quotes:
+
+ \[[!directive text="""
+ 1. "foo"
+ 2. "bar"
+ 3. "baz"
+ """ othertext='''
+ 1. 'quux'
+ 2. "foo"
+ ''']]
+
+If you want to put text with triple quotes into a parameter value, you can
+use perl-style here-doc syntax, even nesting it like this:
+
+ \[[!directive text=<<OUTER
+ [[!otherdirective <<INNER
+ inner text
+ INNER]]
+ outer text
+ OUTER]]
+
+ikiwiki also has an older syntax for directives, which requires a space in
+directives to distinguish them from [[wikilinks|ikiwiki/wikilink]]. This
+syntax has several disadvantages: it requires a space after directives with
+no parameters (such as `\[[pagecount ]]`), and it prohibits spaces in
+[[wikilinks|ikiwiki/wikilink]]. ikiwiki now provides the `!`-prefixed
+syntax shown above as the default. However, ikiwiki still supports wikis using
+the older syntax, if the `prefix_directives` option is disabled.
+
+[[!if test="enabled(listdirectives)" then="""
+Here is a list of currently available directives in this wiki:
+
+[[!listdirectives ]]
+"""]]
diff --git a/doc/ikiwiki/directive/aggregate.mdwn b/doc/ikiwiki/directive/aggregate.mdwn
new file mode 100644
index 000000000..ddfcd40b7
--- /dev/null
+++ b/doc/ikiwiki/directive/aggregate.mdwn
@@ -0,0 +1,57 @@
+The `aggregate` directive is supplied by the [[!iki plugins/aggregate desc=aggregate]] plugin.
+This plugin requires extra setup, specifically, a cron job. See the
+plugin's documentation for details.
+
+This directive allows content from other feeds to be aggregated into the wiki.
+Aggregate a feed as follows:
+
+ \[[!aggregate name="example blog" dir="example"
+ feedurl="http://example.com/index.rss"
+ url="http://example.com/" updateinterval="15"]]
+
+That example aggregates posts from the specified RSS feed, updating no
+more frequently than once every 15 minutes (though possibly less
+frequently, if the cron job runs less frequently than that), and puts a
+page per post under the example/ directory in the wiki.
+
+You can then use ikiwiki's [[inline]] directive to create a blog of one or
+more aggregated feeds. For example:
+
+ \[[!inline pages="internal(example/*)"]]
+
+Note the use of `internal()` in the [[ikiwiki/PageSpec]] to match
+aggregated pages. By default, aggregated pages are internal pages,
+which prevents them from showing up directly in the wiki, and so this
+special [[PageSpec]] is needed to match them.
+
+## usage
+
+Here are descriptions of all the supported parameters to the `aggregate`
+directive:
+
+* `name` - A name for the feed. Each feed must have a unique name.
+ Required.
+* `url` - The url to the web page for the feed that's being aggregated.
+ Required.
+* `dir` - The directory in the wiki where pages should be saved. Optional;
+  if not specified, the directory is based on the name of the feed.
+* `feedurl` - The url to the feed. Optional; if it's not specified, ikiwiki
+  will look for feeds on the `url`. RSS and atom feeds are supported.
+* `updateinterval` - How often to check for new posts, in minutes. Default
+ is 15 minutes.
+* `expireage` - Expire old items from this feed if they are older than
+ a specified number of days. Default is to never expire on age.
+* `expirecount` - Expire old items from this feed if there are more than
+ the specified number total. Oldest items will be expired first. Default
+ is to never expire on count.
+* `tag` - A tag to tag each post from the feed with. A good tag to use is
+ the name of the feed. Can be repeated multiple times. The [[tag]] plugin
+ must be enabled for this to work.
+* `template` - Template to use for creating the aggregated pages. Defaults to
+ aggregatepost.
+
+Note that even if you are using subversion or another revision control
+system, pages created by aggregation will *not* be checked into revision
+control.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/aggregate/discussion.mdwn b/doc/ikiwiki/directive/aggregate/discussion.mdwn
new file mode 100644
index 000000000..ddece9746
--- /dev/null
+++ b/doc/ikiwiki/directive/aggregate/discussion.mdwn
@@ -0,0 +1,10 @@
+It would be awesome if the table directive could aggregate remote CSVs too. I want something like:
+
+ !table file="http://cyclehireapp.com/cyclehirelive/cyclehire.csv"
+
+> Ok, but that has nothing to do with the aggregate plugin. File a
+> [[todo]]?
+>
+> Anyway, it seems difficult, how would it know when the remote content
+> had changed? Aggregate has its cron job support and has time stamps
+> in rss feeds to rely on. --[[Joey]]
diff --git a/doc/ikiwiki/directive/brokenlinks.mdwn b/doc/ikiwiki/directive/brokenlinks.mdwn
new file mode 100644
index 000000000..91bafe5a0
--- /dev/null
+++ b/doc/ikiwiki/directive/brokenlinks.mdwn
@@ -0,0 +1,14 @@
+The `brokenlinks` directive is supplied by the [[!iki plugins/brokenlinks desc=brokenlinks]] plugin.
+
+This directive generates a list of broken links on pages in the wiki. This is
+a useful way to find pages that still need to be written, or links that
+are written wrong.
+
+The optional parameter "pages" can be a [[ikiwiki/PageSpec]] specifying the
+pages to search for broken links; the default is to search them all.
+
+Example:
+
+ \[[!brokenlinks pages="* and !recentchanges"]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/brokenlinks/discussion.mdwn b/doc/ikiwiki/directive/brokenlinks/discussion.mdwn
new file mode 100644
index 000000000..34760584d
--- /dev/null
+++ b/doc/ikiwiki/directive/brokenlinks/discussion.mdwn
@@ -0,0 +1,3 @@
+Would it be possible to have such a thing also checking for external links? -- [[user/emptty]]
+
+> I guess this is not very practical. For internal wiki links, we only need to check whether the linked source file exists or not. But for external links, we would have to ping every link, which would slow down the build process a lot. --[[weakish]]
diff --git a/doc/ikiwiki/directive/calendar.mdwn b/doc/ikiwiki/directive/calendar.mdwn
new file mode 100644
index 000000000..cb40f884e
--- /dev/null
+++ b/doc/ikiwiki/directive/calendar.mdwn
@@ -0,0 +1,60 @@
+The `calendar` directive is supplied by the [[!iki plugins/calendar desc=calendar]] plugin.
+
+This directive displays a calendar, similar to the typical calendars shown on
+some blogs.
+
+# examples
+
+ \[[!calendar ]]
+
+ \[[!calendar type="month" pages="blog/* and !*/Discussion"]]
+
+ \[[!calendar type="year" year="2005" pages="blog/* and !*/Discussion"]]
+
+## setup
+
+The calendar is essentially a fancy front end to archives of previous
+pages, usually used for blogs. It can produce a calendar for a given month,
+or a list of months for a given year. The month format calendar simply
+links to any page posted on each day of the month. The year format calendar
+links to archive pages, with names like `archives/2007` (for all of 2007)
+and `archives/2007/01` (for January, 2007).
+
+While you can insert calendar directives anywhere on your wiki, including
+in the sidebar, you'll also need to create these archive pages. They
+typically use this directive to display a calendar, and also use [[inline]]
+to display or list pages created in the given time frame.
+
+The `ikiwiki-calendar` command can be used to automatically generate the
+archive pages. It also refreshes the wiki, updating the calendars to
+highlight the current day. This command is typically run at midnight from
+cron.
+
+An example crontab:
+
+ 0 0 * * * ikiwiki-calendar ~/ikiwiki.setup "posts/* and !*/Discussion"
+
+## usage
+
+* `type` - Used to specify the type of calendar wanted. Can be one of
+ "month" or "year". The default is a month view calendar.
+* `pages` - Specifies the [[ikiwiki/PageSpec]] of pages to link to from the
+ month calendar. Defaults to "*".
+* `archivebase` - Configures the base of the archives hierarchy.
+ The default is "archives". Note that this default can also be overridden
+ for the whole wiki by setting `archivebase` in ikiwiki's setup file.
+ Calendars link to pages under here, with names like "2010/04" and
+ "2010". These pages can be automatically created using the
+ `ikiwiki-calendar` program.
+* `year` - The year for which the calendar is requested. Defaults to the
+ current year. Can also use -1 to refer to last year, and so on.
+* `month` - The numeric month for which the calendar is requested, in the
+ range 1..12. Used only for the month view calendar, and defaults to the
+ current month. Can also use -1 to refer to last month, and so on.
+* `week_start_day` - A number, in the range 0..6, which represents the day
+ of the week that the month calendar starts with. 0 is Sunday, 1 is Monday,
+ and so on. Defaults to 0, which is Sunday.
+* `months_per_row` - In the year calendar, number of months to place in
+ each row. Defaults to 3.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/color.mdwn b/doc/ikiwiki/directive/color.mdwn
new file mode 100644
index 000000000..553767f00
--- /dev/null
+++ b/doc/ikiwiki/directive/color.mdwn
@@ -0,0 +1,25 @@
+The `color` directive is supplied by the [[!iki plugins/color desc=color]] plugin.
+
+This directive can be used to color a piece of text on a page.
+It can be used to set the foreground and/or background color of the text.
+
+You can use a color name (e.g. `white`) or HTML code (e.g. `#ffffff`)
+to define colors.
+
+## examples
+
+Here the foreground color is defined as a word, while the background color
+is defined as an HTML color code:
+
+ \[[!color foreground=white background=#ff0000 text="White text on red background"]]
+
+The background color is missing, so the text is displayed on default
+background:
+
+ \[[!color foreground=white text="White text on default color background"]]
+
+The foreground is missing, so the text has the default foreground color:
+
+ \[[!color background=#ff0000 text="Default color text on red background"]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/comment.mdwn b/doc/ikiwiki/directive/comment.mdwn
new file mode 100644
index 000000000..398130e2e
--- /dev/null
+++ b/doc/ikiwiki/directive/comment.mdwn
@@ -0,0 +1,40 @@
+The `comment` directive is supplied by the
+[[!iki plugins/comments desc=comments]] plugin, and is used to add a comment
+to a page. Typically, the directive is the only thing on a comment page,
+and is filled out by the comment plugin when a user posts a comment.
+
+Example:
+
+ \[[!comment format=mdwn
+ username="foo"
+ subject="Bar"
+ date="2009-06-02T19:05:01Z"
+ content="""
+ Blah blah.
+ """
+ ]]
+
+## usage
+
+The only required parameter is `content`, the others just add or override
+metadata of the comment.
+
+* `content` - Text to display for the comment.
+ Note that [[directives|ikiwiki/directive]]
+ may not be allowed, depending on the configuration
+ of the comment plugin.
+* `format` - Specifies the markup used for the content.
+* `subject` - Subject for the comment.
+* `date` - Date the comment was posted. Can be entered in
+ nearly any format, since it's parsed by [[!cpan TimeDate]]
+* `username` - Used to record the username (or OpenID)
+ of a logged in commenter.
+* `nickname` - Name to display for a logged in commenter.
+ (Optional; used for OpenIDs.)
+* `ip` - Can be used to record the IP address of a commenter,
+ if they posted anonymously.
+* `claimedauthor` - Records the name that the user entered,
+ if anonymous commenters are allowed to enter their (unverified)
+ name.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/commentmoderation.mdwn b/doc/ikiwiki/directive/commentmoderation.mdwn
new file mode 100644
index 000000000..8553b5b17
--- /dev/null
+++ b/doc/ikiwiki/directive/commentmoderation.mdwn
@@ -0,0 +1,9 @@
+The `commentmoderation` directive is supplied by the
+[[!iki plugins/comments desc=comments]] plugin, and is used to link
+to the comment moderation queue.
+
+Example:
+
+ \[[!commentmoderation desc="here is the comment moderation queue"]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/copy.mdwn b/doc/ikiwiki/directive/copy.mdwn
new file mode 100644
index 000000000..a0aa0ef7f
--- /dev/null
+++ b/doc/ikiwiki/directive/copy.mdwn
@@ -0,0 +1,3 @@
+[[!meta redir=/ikiwiki/directive/cutpaste]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/cut.mdwn b/doc/ikiwiki/directive/cut.mdwn
new file mode 100644
index 000000000..a0aa0ef7f
--- /dev/null
+++ b/doc/ikiwiki/directive/cut.mdwn
@@ -0,0 +1,3 @@
+[[!meta redir=/ikiwiki/directive/cutpaste]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/cutpaste.mdwn b/doc/ikiwiki/directive/cutpaste.mdwn
new file mode 100644
index 000000000..ca580e54f
--- /dev/null
+++ b/doc/ikiwiki/directive/cutpaste.mdwn
@@ -0,0 +1,50 @@
+The `copy`, `cut` and `paste` directives are supplied by the
+[[!iki plugins/cutpaste desc=cutpaste]] plugin.
+
+With these directives you can store and recall pieces of text in a page:
+
+ * `\[[!cut id=name text="text"]]` memorizes the text, allowing it to be
+   recalled using the given ID. The text being cut is not included in the output.
+ * `\[[!copy id=name text="text"]]` memorizes the text, allowing it to be
+   recalled using the given ID. The text being copied *is* included in the output.
+ * `\[[!paste id=name]]` is replaced by the previously memorized text.
+
+The text being cut, copied and pasted can freely include wiki markup, including
+more calls to cut, copy and paste.
+
+You do not need to memorize the text before using it: a cut directive can
+follow the paste directive that uses its text. In fact, this is quite useful
+for deferring big blocks of text, such as long annotations, keeping the
+main text flowing naturally. For example:
+
+ \[[!toggleable id="cut" text="[[!paste id=cutlongdesc]]"]]
+ \[[!toggleable id="copy" text="[[!paste id=copylongdesc]]"]]
+ \[[!toggleable id="paste" text="[[!paste id=pastelongdesc]]"]]
+
+ [...some time later...]
+
+ \[[!cut id=cutlongdesc text="""
+ blah blah blah
+ """]]
+ \[[!cut id=copylongdesc text="""
+ blah blah blah
+ """]]
+ \[[!cut id=pastelongdesc text="""
+ blah blah blah
+ """]]
+
+This can potentially be used to create loops, but ikiwiki is clever and breaks
+them.
+
+Since you can paste without using double quotes, copy and paste can be used to
+nest directives that require multiline parameters inside each other:
+
+ \[[!toggleable id=foo text="""
+ [[!toggleable id=bar text="[[!paste id=baz]]"]]
+ """]]
+
+ \[[!cut id=baz text="""
+ multiline parameter!
+ """]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/date.mdwn b/doc/ikiwiki/directive/date.mdwn
new file mode 100644
index 000000000..b89241e4c
--- /dev/null
+++ b/doc/ikiwiki/directive/date.mdwn
@@ -0,0 +1,16 @@
+The `date` directive is supplied by the [[!iki plugins/date desc=date]] plugin.
+
+This directive can be used to display a date on a page, using the same
+display method that is used to display the modification date in the page
+footer, and other dates in the wiki. This can be useful for consistency
+of display, or if you want to embed parseable dates into the page source.
+
+Like the dates used by the [[meta]] directive, the date can be entered in
+nearly any format, since it's parsed by [[!cpan TimeDate]].
+
+For example, an update to a page with an embedded date stamp could look
+like:
+
+ Updated \[[!date "Wed, 25 Nov 2009 01:11:55 -0500"]]: mumble mumble
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/edittemplate.mdwn b/doc/ikiwiki/directive/edittemplate.mdwn
new file mode 100644
index 000000000..569c2818f
--- /dev/null
+++ b/doc/ikiwiki/directive/edittemplate.mdwn
@@ -0,0 +1,34 @@
+The `edittemplate` directive is supplied by the [[!iki plugins/edittemplate desc=edittemplate]] plugin.
+
+This directive allows registering template pages that provide default
+content for new pages created using the web frontend. To register a
+template, insert an `edittemplate` directive on some other
+page.
+
+ \[[!edittemplate template="bugtemplate" match="bugs/*"]]
+
+A recommended place to put the directive is on the parent page
+of the pages that will be created using the template. So the above
+example would be put on the bugs page. (Do not put the directive on the
+template page itself.)
+
+In the above example, the page named "bugtemplate" is registered as a
+template to be used when any page named "bugs/*" is created. To avoid
+the directive displaying a note about the template being registered, add
+"silent=yes".
+
+Often the template page contains a simple skeleton for a particular type of
+page. For the bug report pages in the above example, it might look
+something like:
+
+ Package:
+ Version:
+ Reproducible: y/n
+ Details:
+
+The template page can also contain [[!cpan HTML::Template]] directives,
+like other ikiwiki [[templates]]. Currently only one variable is
+set: `<TMPL_VAR name>` is replaced with the name of the page being
+created.
+
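+For instance, a hypothetical "bugtemplate" page combining the skeleton
+above with that variable might contain (a sketch; adjust to taste):
+
+    Bug report: <TMPL_VAR name>
+
+    Package:
+    Version:
+    Reproducible: y/n
+    Details:
+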
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/flattr.mdwn b/doc/ikiwiki/directive/flattr.mdwn
new file mode 100644
index 000000000..5083005ce
--- /dev/null
+++ b/doc/ikiwiki/directive/flattr.mdwn
@@ -0,0 +1,45 @@
+The `flattr` directive is supplied by the [[!iki plugins/flattr desc=flattr]] plugin.
+
+This directive allows easily inserting Flattr buttons onto wiki pages.
+
+Flattr supports both static buttons and javascript buttons. This directive
+only creates dynamic javascript buttons. If you want to insert a static
+Flattr button, you can simply copy the html code for it from Flattr instead.
+Note that this directive inserts javascript code into the page, which in
+turn loads more javascript code from Flattr.com. So only use it if you feel
+comfortable with that.
+
+The directive can be used to display a button for a thing you have already
+manually submitted to Flattr. In this mode, the only parameter you need to
+include is the exact url to the thing that was submitted to Flattr.
+(If the button is for the current page, you can leave that out.) For
+example, this is the Flattr button for ikiwiki. Feel free to add it to all
+your pages. ;)
+
+ \[[!flattr url="http://ikiwiki.info/" button=compact]]
+
+The directive can also be used to create a button that automatically
+submits a page to Flattr when a user clicks on it. In this mode you
+need to include parameters to specify your uid, and a title, category, tags,
+and description for the page. For example, this is a Flattr button for
+a blog post:
+
+ \[[!flattr uid=25634 title="my new blog post" category=text
+ tags="blog,example" description="This is a post on my blog."]]
+
+Here are all possible parameters you can pass to the Flattr directive.
+
+* `button` - Set to "compact" for a small button.
+* `url` - The url to the thing to be Flattr'd. If omitted, defaults
+ to the url of the current page.
+* `uid` - Your numeric Flattr userid. Not needed if the flattr plugin
+ has been configured with a global `flattr_userid`.
+* `title` - A short title for the thing, to show on its Flattr page.
+* `description` - A description of the thing, to show on its Flattr
+ page.
+* `category` - One of: text, images, video, audio, software, rest.
+* `tags` - A list of tags separated by a comma.
+* `language` - A language code.
+* `hidden` - Set to 1 to hide the button from listings on Flattr.com.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/format.mdwn b/doc/ikiwiki/directive/format.mdwn
new file mode 100644
index 000000000..7d11d225f
--- /dev/null
+++ b/doc/ikiwiki/directive/format.mdwn
@@ -0,0 +1,29 @@
+The `format` directive is supplied by the [[!iki plugins/format desc=format]]
+plugin.
+
+The directive allows formatting a chunk of text using any available page
+format. It takes two parameters. The first is the type of format to use,
+i.e. the extension that would be used for a standalone file of this type.
+The second is the text to format.
+
+For example, this will embed an otl outline inside a page using mdwn or
+some other format:
+
+ \[[!format otl """
+ foo
+ 1
+ 2
+ bar
+ 3
+ 4
+ """]]
+
+Note that if the highlight plugin is enabled, this directive can also be
+used to display syntax highlighted code. Many languages and formats are
+supported. For example:
+
+ \[[!format perl """
+ print "hello, world\n";
+ """]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/fortune.mdwn b/doc/ikiwiki/directive/fortune.mdwn
new file mode 100644
index 000000000..45f533eb2
--- /dev/null
+++ b/doc/ikiwiki/directive/fortune.mdwn
@@ -0,0 +1,8 @@
+The `fortune` directive is supplied by the [[!iki plugins/fortune desc=fortune]] plugin.
+
+This just uses the `fortune` program to insert a fortune cookie into the page.
+Usage:
+
+ \[[!fortune ]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/graph.mdwn b/doc/ikiwiki/directive/graph.mdwn
new file mode 100644
index 000000000..7021e47fb
--- /dev/null
+++ b/doc/ikiwiki/directive/graph.mdwn
@@ -0,0 +1,32 @@
+The `graph` directive is supplied by the [[!iki plugins/graphviz desc=graphviz]] plugin.
+
+This directive allows embedding [graphviz](http://www.graphviz.org/)
+graphs in a page. Example usage:
+
+ \[[!graph src="a -> b -> c; a -> c;"]]
+
+Nodes on the graph can link to external urls using regular graphviz syntax,
+and a clickable imagemap will be created. As a special extension for
+ikiwiki, [[WikiLinks|ikiwiki/wikilink]] can also be used. For example:
+
+ \[[!graph src="""
+ google [ href="http://google.com/" ]
+ sandbox [ href=\[[SandBox]] ]
+ help [ href=\[[ikiwiki/formatting]] ]
+ newpage [ href=\[[NewPage]] ]
+
+ google -> sandbox -> help -> newpage -> help -> google;
+ """]]
+
+The `graph` directive supports the following parameters:
+
+- `src` - The graphviz source to render.
+- `type` - The type of graph to render: `graph` or `digraph`. Defaults to
+ `digraph`.
+- `prog` - The graphviz program to render with: `dot`, `neato`, `fdp`, `twopi`,
+ or `circo`. Defaults to `dot`.
+- `height`, `width` - Limit the size of the graph to a given height and width,
+ in inches. You must specify both to limit the size; otherwise, graphviz will
+ choose a size, without any limit.
+
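+These parameters can be combined; for instance, a hypothetical undirected
+graph rendered with `neato` and limited to 4x6 inches might be written as:
+
+    \[[!graph type=graph prog=neato height=4 width=6 src="a -- b -- c -- a"]]
+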
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/graph/discussion.mdwn b/doc/ikiwiki/directive/graph/discussion.mdwn
new file mode 100644
index 000000000..f88da7261
--- /dev/null
+++ b/doc/ikiwiki/directive/graph/discussion.mdwn
@@ -0,0 +1,27 @@
+How to align this?
+==================
+
+I have found this patch to be the only way I can float graphs to the right:
+
+[[!format diff """
+--- Plugin/graphviz.pm.orig 2012-04-25 10:26:59.531525247 -0400
++++ Plugin/graphviz.pm 2012-04-25 10:26:01.282922144 -0400
+@@ -87,8 +87,10 @@
+ error gettext("failed to run graphviz") if ($sigpipe || $?);
+ }
+
++ my $class = '';
++ $class = 'class="' . $params{class} if $params{class};
+ return "<img src=\"".urlto($dest, $params{destpage}).
+- "\" usemap=\"#graph$sha\" />\n".
++ "\" usemap=\"#graph$sha\" $class />\n".
+ $map;
+ }
+"""]]
+
+Then I can use `[[!graph class="align-right" ...]]`.. --[[anarcat]]
+
+> You can already use `<div class="align-right">[[!graph ...]]</div>`,
+> doesn't that have the same practical effect? --[[smcv]]
+
+> > It does! I didn't think of that, thanks! I am not used to plain HTML in wikis, and the [[plugins/contrib/osm]] plugin has "right" and "left" directives... --[[anarcat]]
diff --git a/doc/ikiwiki/directive/haiku.mdwn b/doc/ikiwiki/directive/haiku.mdwn
new file mode 100644
index 000000000..979f0891f
--- /dev/null
+++ b/doc/ikiwiki/directive/haiku.mdwn
@@ -0,0 +1,15 @@
+The `haiku` directive is supplied by the [[!iki plugins/haiku desc=haiku]] plugin.
+
+This directive allows inserting a randomly generated haiku into a wiki page.
+Just type:
+
+ \[[!haiku hint="argument"]]
+
+[[!haiku hint="argument test"]]
+
+The hint parameter can be omitted; it only gives the generator a hint of
+what to write the haiku about. If no hint is given, it might base the haiku
+on the page name. Since the vocabulary it knows is very small, many hints
+won't affect the result at all.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/if.mdwn b/doc/ikiwiki/directive/if.mdwn
new file mode 100644
index 000000000..492adf499
--- /dev/null
+++ b/doc/ikiwiki/directive/if.mdwn
@@ -0,0 +1,50 @@
+The `if` directive is supplied by the [[!iki plugins/conditional desc=conditional]] plugin.
+
+With this directive, you can make text be conditionally displayed on a page.
+For example:
+
+ \[[!if test="enabled(smiley)"
+ then="The smiley plugin is enabled :-)"
+ else="No smiley plugin here.."]]
+
+If the specified `test` succeeds, the `then` text will be displayed,
+otherwise the `else` text will be displayed. The `else` part is optional.
+
+The `then` and `else` values can include any markup that would be allowed
+in the wiki page outside the directive. Triple-quoting the values even allows
+quotes to be included.
+
+The `test` is a [[ikiwiki/PageSpec]]; if it matches any page in the wiki
+then it succeeds. So you can do things like testing for the existence of a
+page or pages, testing to see if any pages were created in a given month,
+and so on.
+
+If you want the [[ikiwiki/PageSpec]] to only match against the page that
+contains the conditional, rather than matching against all pages in the
+wiki, set the "all" parameter to "no".
+
+In an `if` directive, the regular [[ikiwiki/PageSpec]] syntax is expanded
+with the following additional tests:
+
+* enabled(plugin)
+
+ Tests whether the specified plugin is enabled.
+
+* sourcepage(glob)
+
+ Tests whether the glob matches the name of the page that contains the
+ conditional.
+
+* destpage(glob)
+
+ Tests whether the glob matches the name of the page that is being built.
+ That might be different than the name of the page that contains the
+ conditional, if it's being inlined into another page.
+
+* included()
+
+ Tests whether the page is being included onto another page, for example
+ via [[inline]] or [[map]]. Note that pages inserted into other pages
+ via [[template]] are not matched here.
+
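+For example, a hypothetical page could use `included()` to show a teaser
+when it is inlined elsewhere and its full text otherwise:
+
+    \[[!if test="included()"
+    then="(This is an excerpt; see the full page for more.)"
+    else="This is the full page."]]
+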
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/img.mdwn b/doc/ikiwiki/directive/img.mdwn
new file mode 100644
index 000000000..cda62b58f
--- /dev/null
+++ b/doc/ikiwiki/directive/img.mdwn
@@ -0,0 +1,39 @@
+The `img` directive is supplied by the [[!iki plugins/img desc=img]] plugin.
+
+This is an image handling directive. While ikiwiki supports inlining full-size
+images by making a [[ikiwiki/WikiLink]] that points to the image, using
+this directive you can easily scale down an image for inclusion onto a page,
+providing a link to a full-size version.
+
+## usage
+
+ \[[!img image1.jpg size="200x200" alt="clouds"]]
+
+The image file will be searched for using the same rules as used to find
+the file pointed to by a [[ikiwiki/WikiLink]].
+
+The `size` parameter is optional, defaulting to full size. Note that the
+original image's aspect ratio is always preserved, even if this means
+making the image smaller than the specified size. You can also specify only
+the width or the height, and the other value will be calculated based on
+it: "200x", "x200".
+
+You can also pass `alt`, `title`, `class`, `align`, `id`, `hspace`, and
+`vspace` parameters.
+These are passed through unchanged to the html img tag. If you include a
+`caption` parameter, the caption will be displayed centered beneath the image.
+
+The `link` parameter is used to control whether the scaled image links
+to the full size version. By default it does; set "link=somepage" to link
+to another page instead, or "link=no" to disable the link, or
+"link=http://url" to link to a given url.
+
+You can also set default values that will be applied to all later images on
+the page, unless overridden. Useful when including many images on a page.
+
+ \[[!img defaults size=200x200 alt="wedding photo"]]
+ \[[!img photo1.jpg]]
+ \[[!img photo2.jpg]]
+ \[[!img photo3.jpg size=200x600]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/img/discussion.mdwn b/doc/ikiwiki/directive/img/discussion.mdwn
new file mode 100644
index 000000000..6fc28e75e
--- /dev/null
+++ b/doc/ikiwiki/directive/img/discussion.mdwn
@@ -0,0 +1,34 @@
+## How to insert an image?
+
+I have goodstuff; I don't have img disabled.
+
+I used
+
+ \[[!img utah-2006-100-180.png]]
+
+and
+
+ \[[utah-2006-100-180.png]]
+
+and
+
+ \[[!img utah-2006-100-180.png link=no]]
+
+But it doesn't show the image file I already put in that directory. I can access the file by directly going to it in my browser. I want to see it on my wiki page.
+
+It does show a clickable question mark for `ikiwiki.cgi?page=utah-2006-100-180.png&from=roadtrips&do=create`.
+
+-- [[JeremyReed]]
+
+> The question mark means ikiwiki does not know about your image.
+> It sounds as if you may have copied it onto your web server's `public_html`
+> type directory manually. For ikiwiki to know about it, you need to put it in
+> ikiwiki's srcdir with the rest of the wiki content, or you could upload
+> it with the `Edit -> Attachment` web interface.
+>
+> To display an image that is really legitimately not part of the wiki,
+> you can't use a directive, but you can insert `<img>` html if you really want to. --[[Joey]]
+
+I have a local copy of the [[rcs/Git]] page. After installing the `imagemagick-perl` package, some of the elements display and others are missing, including the page outlines with turned corners and all of the yellow folders. Ideas?
+
+-- [[RonParker]]
diff --git a/doc/ikiwiki/directive/inline.mdwn b/doc/ikiwiki/directive/inline.mdwn
new file mode 100644
index 000000000..a9c241afc
--- /dev/null
+++ b/doc/ikiwiki/directive/inline.mdwn
@@ -0,0 +1,126 @@
+The `inline` directive is supplied by the [[!iki plugins/inline desc=inline]] plugin.
+
+This is a directive that allows including one wiki page inside another.
+The most common use of inlining is generating blogs and RSS or Atom feeds.
+
+Example:
+
+ \[[!inline pages="blog/* and !*/Discussion" show="10" rootpage="blog"]]
+
+Any pages that match the specified [[PageSpec]] (in the example, any
+[[SubPage]] of "blog") will be part of the blog, and the newest 10
+of them will appear in the page. Note that if files that are not pages
+match the [[PageSpec]], they will be included in the feed using RSS
+enclosures, which is useful for podcasting.
+
+The optional `rootpage` parameter tells the wiki that new posts to this
+blog should default to being [[SubPages|SubPage]] of "blog", and enables a
+form at the top of the blog that can be used to add new items.
+
+If you want your blog to have an archive page listing every post ever made
+to it, you can accomplish that like this:
+
+ \[[!inline pages="blog/* and !*/Discussion" archive="yes"]]
+
+You can even create an automatically generated list of all the pages on the
+wiki, with the most recently added at the top, like this:
+
+ \[[!inline pages="* and !*/Discussion" archive="yes"]]
+
+If you want to be able to add pages to a given blog feed by tagging them,
+you can do that too. To tag a page, just make it link to a page or pages
+that represent its tags. Then use the special `link()` [[PageSpec]] to match
+all pages that have a given tag:
+
+ \[[!inline pages="link(life)"]]
+
+Or include some tags and exclude others:
+
+ \[[!inline pages="link(debian) and !link(social)"]]
+
+## usage
+
+There are many parameters you can use with the `inline`
+directive. These are the commonly used ones:
+
+* `pages` - A [[PageSpec]] of the pages to inline.
+* `show` - Specify the maximum number of matching pages to inline.
+ Default is 10, unless archiving, when the default is to show all.
+ Set to 0 to show all matching pages.
+* `archive` - If set to "yes", only list page titles and some metadata, not
+ full contents.
+* `description` - Sets the description of the rss feed if one is generated.
+ Defaults to the name of the wiki.
+* `skip` - Specify a number of pages to skip displaying. Can be useful
+ to produce a feed that only shows archived pages.
+* `postform` - Set to "yes" to enable a form to post new pages to a
+ blog.
+* `postformtext` - Set to specify text that is displayed in a postform.
+* `rootpage` - Enables the postform, and allows controlling where
+  newly posted pages should go, by specifying the page that
+  they should be a [[SubPage]] of.
+
+Here are some less often needed parameters:
+
+* `actions` - If set to "yes" add links to the bottom of the inlined pages
+ for editing and discussion (if they would be shown at the top of the page
+ itself).
+* `rss` - controls generation of an rss feed. If the wiki is configured to
+ generate rss feeds by default, set to "no" to disable. If the wiki is
+ configured to `allowrss`, set to "yes" to enable.
+* `atom` - controls generation of an atom feed. If the wiki is configured to
+ generate atom feeds by default, set to "no" to disable. If the wiki is
+ configured to `allowatom`, set to "yes" to enable.
+* `feeds` - controls generation of all types of feeds. Set to "no" to
+ disable generating any feeds.
+* `emptyfeeds` - Set to "no" to disable generation of empty feeds.
+ Has no effect if `rootpage` or `postform` is set.
+* `id` - Set to specify the value of the HTML `id` attribute for the
+ feed links or the post form. Useful if you have multiple forms in the
+ same page.
+* `template` - Specifies the template to fill out to display each inlined
+ page. By default the `inlinepage` template is used, while
+ the `archivepage` template is used for archives. Set this parameter to
+ use some other, custom template, such as the `titlepage` template that
+ only shows post titles or the `microblog` template, optimised for
+ microblogging. Note that you should still set `archive=yes` if
+ your custom template does not include the page content.
+* `raw` - Rather than the default behavior of creating a blog,
+ if raw is set to "yes", the page will be included raw, without additional
+ markup around it, as if it were a literal part of the source of the
+ inlining page.
+* `sort` - Controls how inlined pages are [[sorted|pagespec/sorting]].
+ The default is to sort the newest created pages first.
+* `reverse` - If set to "yes", causes the sort order to be reversed.
+* `feedshow` - Specify the maximum number of matching pages to include in
+ the rss/atom feeds. The default is the same as the `show` value above.
+* `feedonly` - Only generate the feed, do not display the pages inline on
+ the page.
+* `quick` - Build archives in quick mode, without reading page contents for
+ metadata. This also turns off generation of any feeds.
+* `timeformat` - Use this to specify how to display the time or date for pages
+ in the blog. The format string is passed to the strftime(3) function.
+* `feedpages` - A [[PageSpec]] of inlined pages to include in the rss/atom
+ feeds. The default is the same as the `pages` value above, and only pages
+ matched by that value are included, but some of those can be excluded by
+ specifying a tighter [[PageSpec]] here.
+* `guid` - If a URI is given here (perhaps a UUID prefixed with `urn:uuid:`),
+ the Atom feed will have this as its `<id>`. The default is to use the URL
+ of the page containing the `inline` directive.
+* `feedfile` - Can be used to change the name of the file generated for the
+ feed. This is particularly useful if a page contains multiple feeds.
+ For example, set "feedfile=feed" to cause it to generate `page/feed.atom`
+ and/or `page/feed.rss`. This option is not supported if the wiki is
+ configured not to use `usedirs`.
+* `pagenames` - If given instead of `pages`, this is interpreted as a
+ space-separated list of absolute page names ([[SubPage/LinkingRules]] are
+ not taken into account), and they are inlined in exactly the order given:
+ the `sort` and `pages` parameters cannot be used in conjunction with
+ this one.
+* `trail` - If set to "yes" and the [[!iki plugins/trail desc=trail]] plugin
+ is enabled, turn the inlined pages into a trail with next/previous links,
+ by passing the same options to [[ikiwiki/directive/trailitems]]. The `skip`
+ and `show` options are ignored by the trail, so the next/previous links
+ traverse through all matching pages.
+
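+Putting several of these together: a hypothetical archive page listing only
+post titles, sorted alphabetically and with no feeds generated, might use:
+
+    \[[!inline pages="blog/* and !*/Discussion" archive=yes feeds=no
+    sort=title template=titlepage]]
+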
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/inline/discussion.mdwn b/doc/ikiwiki/directive/inline/discussion.mdwn
new file mode 100644
index 000000000..5489d5f16
--- /dev/null
+++ b/doc/ikiwiki/directive/inline/discussion.mdwn
@@ -0,0 +1,163 @@
+## Combine inline and toggle
+
+Is it possible to combine the behaviour of toggle and inline? I.e., have it present a list of 'headlines' which are created from separate subpages and which can be clicked to expand to the body of the inlined page. Thanks.
+
+-- Thiana
+
+---
+## How do you provide the per post discussion links in your own blog?
+
+> That's configured by the "actions" parameter to the inline directive. See
+> docs in [[plugins/inline]]. --[[Joey]]
+
+And do you have any ideas/hints about implementing a "comments" feature.
+What I'm after is something for users who don't quite understand the Wiki
+style for discussions. I would like to have a form for them to post a
+comment and have the comment appended to the discussion Wiki-style. Maybe
+take it as far as implementing "replies" to other comments.
+
+-- Marcelo
+
+> See [[plugins/comments]]
+> --[[Joey]]
+
+---
+
+## More dynamic `rootpage` parameter of inline plugin?
+
+(Moved to [[todo/dynamic_rootpage]])
+
+---
+
+## Excluding Images
+
+Is there a simple way to exclude images, stylesheets, and other
+"non-page" files other than a blacklist approach like
+`pages="* and !*.png and !*.css"`? --[[JasonBlevins]]
+
+> The [[plugins/filecheck]] plugin adds a 'ispage()' pagespec test that can do that.
+> --[[Joey]]
+
+---
+
+## Documentation for parameter `template`?
+
+I would be especially interested in a list of variables which can be used in such a template.
+
+> I try to keep ikiwiki's templates self-documenting, so if you take
+> a look at a template used by inline, such as the default `/usr/share/ikiwiki/template/inlinepage.tmpl`,
+> you can see all or nearly all the template variables in use in it.
+
+I have a page template with some structured information as parameters. For
+example `location="nowhere"` and `price="20"`. Is there a possibility to
+extract that information, i.e. access the parameters, to compose the item
+for the inline directive from it? For example, the line »Go
+to nowhere for 20 bucks.« would be shown inlined.
+
+--[[PaulePanter]]
+
+> Let's not confuse the template directive with the templates used by inline.
+> When a page is inlined, any template directives in it are first expanded,
+> using the user-defined templates for that. Then, the inline directive's
+> template is used to insert it into the inlining page.
+>
+> So no, you can't reference template directive parameters inside inline's
+> template, because it's already expanded at that point. --[[Joey]]
+
+>> Thank you for the explanation. Can you think of another way to accomplish
+>> my goals?
+>>
+>> Right now, I only see the option to edit the title with the
+>> `[[/ikiwiki/directive/meta]]` directive and the field `title`.
+>>
+>> What could a solution look like?
+>>
+>> 1. The possibility to add custom fields to the `meta` directive.
+>> 1. The possibility to specify in a page, how the page should be displayed
+>> when used by inlined. That could be done by a new directive `cinlined`
+>> (for »custom inlined«) which is chosen by the `inline` directive to
+>> display if told to do so.
+>>
+>> [[!cinlined text="""Text which can also use Parameter, bla blubb …"""]]
+>> --[[PaulePanter]]
+>>> You can make the body of a page change depending on whether it's being
+>>> inlined, with the [[ikiwiki/directive/if]] directive from the
+>>> [[plugins/conditional]] plugin:
+>>>
+>>> \[[!if test="inlined()"
+>>> then="""[[!template id=productsummary
+>>> location="Warehouse 23" price=20
+>>> ]]"""
+>>> else="""[[!template id=productdetail
+>>> location="Warehouse 23" price=20
+>>> description="Every home should have one"
+>>> ]]"""
+>>> ]]
+>>>
+>>> Perhaps that does some of what you want?
+>>>
+>>> If you want to go beyond that, my inclination would be to write
+>>> a simple plugin to deal with whatever it is you want to do (bug
+>>> metadata or product metadata or whatever) rather than prematurely
+>>> generalizing. --[[smcv]]
+
+## meta parameters are not enough
+
+I think I have the same problem as Paule, as I want extra arbitrary parameters in my template.
+
+This is what I am doing currently, which makes my skin crawl. In `wgts/foo.mdwn`
+I have resorted to using AUTHORURL as the location of this widgets icon:
+
+ [[!meta authorurl="/ico/aHR0cDovL2JvbmRpLm9tdHAub3JnL3dpZGdldHMvYmF0dGVyeQ==.png" ]]
+
+In templates I have a file called `wgtlist.tmpl`:
+
+ <div class="widget">
+ <TMPL_IF NAME="AUTHORURL">
+ <img src="<TMPL_VAR AUTHORURL>" />
+ </TMPL_IF>
+ <TMPL_IF NAME="PERMALINK">
+ <a href="<TMPL_VAR PERMALINK>"><TMPL_VAR TITLE></a><br />
+ <TMPL_ELSE>
+ <a href="<TMPL_VAR PAGEURL>"><TMPL_VAR TITLE></a><br />
+ </TMPL_IF>
+ Posted <TMPL_VAR CTIME>
+ </div>
+
+My index page has:
+
+ [[!inline pages="./wgts/*" show=5 feeds=no actions=no rootpage="wgts" archive="yes" template=wgtlist]]
+
+Otherwise, can you please suggest a smarter way of getting certain data out of pages for an inline index?
+
+--[[hendry]]
+
+## A different idea: smuggling hook routines in through %params.
+
+The part that fetches the inlined content is quite compact; it's just the
+`if ($needcontent) {}` chunk. Would a patch that accepts a perl sub smuggled
+through something like `$params{inliner_}` be accepted? If that param exists,
+call it instead of the current content of that chunk. Pass `$page`, `%params`,
+and `$template`. Receive `$content`, possibly seeing `$template` modified.
+The custom directives can add `inliner_` to `%params` and call
+`IkiWiki::preprocess_inline`. I suppose `IkiWiki::Plugin::inline` could be
+modified to strip any `*_` out of the directive's arguments to prevent any
+custom behavior from leaking into the inline directive.
+
+I'm about to try this for a CV/resume type of thing. I want only one element with a specific id out of the generated content (with a little post-processing). I don't need performance for my case.
+
+Update: Pretty much works. I need a way to skip sources, but inline shrinks the list of all pages *before* trying to form them. Next little bit...
+
+--[[JasonRiedy]]
+
+---
+
+## Interaction of `show` and `feedshow`
+
+Reading the documentation I would think that `feedshow` does not
+influence `show`.
+
+ \[[!inline pages="./blog/*" archive=yes quick=yes feedshow=10 sort=title reverse=yes]]
+
+Only ten pages are listed in this example although `archive` is set to
+yes. When `feedshow=10` is removed, all matching pages are shown.
+
+Is that behaviour intended?
+
+> Is something going wrong because `quick="yes"` [[»turns off generation of any feeds«|inline]]? --[[PaulePanter]]
+
+--[[PaulePanter]]
+
+>> Bug was that if feedshow was specified without show it limited to it incorrectly. Fixed. --[[Joey]]
diff --git a/doc/ikiwiki/directive/linkmap.mdwn b/doc/ikiwiki/directive/linkmap.mdwn
new file mode 100644
index 000000000..baa6fff61
--- /dev/null
+++ b/doc/ikiwiki/directive/linkmap.mdwn
@@ -0,0 +1,29 @@
+The `linkmap` directive is supplied by the [[!iki plugins/linkmap desc=linkmap]] plugin.
+
+This directive uses [graphviz](http://www.graphviz.org/) to generate a
+graph showing the links between a set of pages in the wiki. Example usage:
+
+ \[[!linkmap pages="* and !blog/* and !*/Discussion"]]
+
+Only links between mapped pages will be shown; links pointing to or from
+unmapped pages will be omitted. If the pages to include are not specified,
+the links between all pages (and other files) in the wiki are mapped.
+
+Here are descriptions of all the supported parameters to the `linkmap`
+directive:
+
+* `pages` - A [[ikiwiki/PageSpec]] of the pages to map.
+* `height`, `width` - Limit the size of the map to a given height and width,
+ in inches. Both must be specified for the limiting to take effect, otherwise
+ the map's size is not limited.
+* `connected` - Controls whether to include pages on the map that link to
+ no other pages (connected=no, the default), or to only show pages that
+ link to others (connected=yes).
+
+For best results, only a small set of pages should be mapped, since
+otherwise the map can become very large, unwieldy, and complicated.
+If too many pages are included, the map may get so large that graphviz
+cannot render it. Using the `connected` parameter is a good way to prune
+out pages that clutter the map.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/listdirectives.mdwn b/doc/ikiwiki/directive/listdirectives.mdwn
new file mode 100644
index 000000000..b41d27a80
--- /dev/null
+++ b/doc/ikiwiki/directive/listdirectives.mdwn
@@ -0,0 +1,20 @@
+The `listdirectives` directive is supplied by the [[!iki plugins/listdirectives desc=listdirectives]] plugin.
+
+This directive generates a list of available
+[[directives|ikiwiki/directive]].
+
+ \[[!listdirectives]]
+
+There is one optional keyword argument, `generated`. Normally the
+`listdirectives` directive will list all built-in directives and directives
+directly registered by plugins. With this keyword, `listdirectives` will
+also list directives generated later. For example, all [[shortcuts]] are
+directives generated in turn by the `shortcut` directive. They will only
+be listed if the `generated` argument is supplied.
+
+ \[[!listdirectives generated]]
+
+This extended list is often quite long and contains many
+undocumented directives.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/map.mdwn b/doc/ikiwiki/directive/map.mdwn
new file mode 100644
index 000000000..4b6499547
--- /dev/null
+++ b/doc/ikiwiki/directive/map.mdwn
@@ -0,0 +1,21 @@
+The `map` directive is supplied by the [[!iki plugins/map desc=map]] plugin.
+
+This directive generates a hierarchical page map for the wiki. Example usage:
+
+ \[[!map pages="* and !blog/* and !*/Discussion"]]
+
+If the pages to include are not specified, all pages (and other files) in
+the wiki are mapped.
+
+By default, the names of pages are shown in the map. The `show` parameter
+can be used to show the titles or descriptions of pages instead (as set by
+the [[meta]] directive). For example:
+
+ \[[!map pages="* and !blog/* and !*/Discussion" show=title]]
+
+ \[[!map pages="* and !blog/* and !*/Discussion" show=description]]
+
+Hint: To limit the map to displaying pages less than a certain level deep,
+use a [[ikiwiki/PageSpec]] like this: `pages="* and !*/*/*"`
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/map/discussion.mdwn b/doc/ikiwiki/directive/map/discussion.mdwn
new file mode 100644
index 000000000..6c2e6f1c2
--- /dev/null
+++ b/doc/ikiwiki/directive/map/discussion.mdwn
@@ -0,0 +1,99 @@
+### Sorting
+
+Is there a way to have the generated maps sorted by *title* instead of *filename* when show=title is used?
+Thanks
+
+-- Thiana
+
+> [[bugs/map_sorts_by_pagename_and_not_title_when_show__61__title_is_used]] --[[Joey]]
+
+----
+
+Question: Is there a way to generate a listing that shows *both* title and description meta information? Currently, a \[\[!map ...]] shows only one of the two, but I'd like to generate a navigation that looks like a description list. For example:
+
+ * This is the title meta information.
+
+ This is the description meta information
+
+ * This is another title.
+
+ And so on ...
+
+Is that possible?
+
+--Peter
+
+> the map directive could be made to use templates as the [[inline directive|ikiwiki/directive/inline]] does. (for the ambitious, the map functionality might even be special-cased into the inline plugin, i think) --[[chrysn]]
+
+----
+
+The site I'm trying to set up right now (not really a wiki - no public editing) is divided into topics. Topics are pages that have `\[[!meta link="/topic"]]`. Topic pages contain an index of their subpages (done with `\[[!inline]]`); the subpages are the real content. I want a map in the sidebar that lists:
+
+ * all of the topics;
+ * all of the first-level subpages of the *current topic only*.
+
+That is, if the current page is "Topic A" or "Topic A/Page 1", then the map should look like
+
+ Topic A
+ Page 1
+ Page 2
+ Page 3
+ Topic B
+ Topic C
+
+but if the current page is "Topic B" or one of its subpages, then the map should look like
+
+ Topic A
+ Topic B
+ Page 1
+ Page 2
+ Page 3
+ Topic C
+
+On the top-level index page, or on any other page that is neither a topic nor a subpage of a topic, the map should list only the topics.
+
+Is there any way to do that? I don't mind mucking around with `\[[!meta]]` on every page if that's what it takes.
+
+-- Zack
+
+> I think that you're looking for this:
+>
+> `pages="((Topic*/* or Topic*) and ./*) or (Topic* and ! Topic*/*)"`
+>
+> Let's pull that [[PageSpec]] apart.
+>
+> * `(Topic*/* or Topic*)` matches all pages that are underneath a Topic
+> page or are a topic page themselves.
+> * `and ./*` further adds the limitation that the pages have to be
+> in the same directory as the page that is displaying the map. So,
+> for `Topic_A/Page_1`, it will match `Topic_A/*`; for `Topic_A`,
+> it will match `Topic_*` but not subpages.
+> * Finally, `Topic* and ! Topic*/*` matches all the toplevel topic pages,
+> since we always want those to show up.
+>
+> I haven't tested that this works or displays, but I hope it gets you
+> on the right track. PS, be aware of
+> [[this_sidebar_issue|todo/Post-compilation_inclusion_of_the_sidebar]]!
+> --[[Joey]]
+
+>> Thanks, but this assumes that topic pages are named `Topic<something>`.
+>> They aren't. They are tagged with `\[[!meta link="/topic"]]`, and as
+>> far as I can tell there is no [[PageSpec]] notation for "subpages of a
+>> page that satisfies link(foo)"...
+>> -- Zack
+
+>>> I think that the ideas and code in
+>>> [[todo/tracking_bugs_with_dependencies]] might also handle this case.
+>>> --[[Joey]]
+
+----
+
+I feel like this should be obvious, but I can't figure out how to sort numerically.
+
+I have `map pages="./* and !*/Discussion and !*/sidebar"` and a bunch of pages with names like 1, 2, 3, 11, 12, 1/1.1, 12/12.3 etc. I want to sort them numerically. I see lots of conversation implying there's a simple way to do it, but not how.
+
+> No, you can't: map can't currently use a non-default sort order. If it
+> could, then you could use [[plugins/sortnaturally]]. There's a
+> [[feature_request|todo/sort_parameter_for_map_plugin_and_directive]];
+> [[a_bug_references_it|bugs/map_sorts_by_pagename_and_not_title_when_show=title_is_used]].
+> --[[smcv]]
diff --git a/doc/ikiwiki/directive/meta.mdwn b/doc/ikiwiki/directive/meta.mdwn
new file mode 100644
index 000000000..984f68540
--- /dev/null
+++ b/doc/ikiwiki/directive/meta.mdwn
@@ -0,0 +1,206 @@
+The `meta` directive is supplied by the [[!iki plugins/meta desc=meta]] plugin.
+
+This directive allows inserting arbitrary metadata into the source of a page.
+Enter the metadata as follows:
+
+ \[[!meta field="value"]]
+ \[[!meta field="value" param="value" param="value"]]
+
+The first form sets a given field to a given value, while the second form
+also specifies some additional sub-parameters. You can have only one field
+per `meta` directive; use more directives if you want to specify more fields.
+
+The field values are treated as HTML entity-escaped text, so you can include
+a quote in the text by writing `&quot;` and so on.
+
+Supported fields:
+
+* title
+
+ Overrides the title of the page, which is generally the same as the
+ page name.
+
+ Note that if the title is overridden, a "title_overridden" variable will
+ be set to a true value in the template; this can be used to format things
+ differently in this case.
+
+ An optional `sortas` parameter will be used preferentially when
+ [[ikiwiki/pagespec/sorting]] by `meta(title)`:
+
+ \[[!meta title="The Beatles" sortas="Beatles, The"]]
+
+ \[[!meta title="David Bowie" sortas="Bowie, David"]]
+
+* license
+
+ Specifies a license for the page, for example, "GPL". Can contain
+ WikiLinks and arbitrary markup.
+
+* copyright
+
+ Specifies the copyright of the page, for example, "Copyright 2007 by
+ Joey Hess". Can contain WikiLinks and arbitrary markup.
+
+* author
+
+ Specifies the author of a page.
+
+ An optional `sortas` parameter will be used preferentially when
+ [[ikiwiki/pagespec/sorting]] by `meta(author)`:
+
+ \[[!meta author="Joey Hess" sortas="Hess, Joey"]]
+
+* authorurl
+
+  Specifies a URL for the author of a page.
+
+* description
+
+ Specifies a short description for the page. This will be put in
+ the html header, and can also be displayed by eg, the [[map]] directive.
+
+* keywords
+
+ Specifies keywords summarizing the contents of the page. This
+ information will be put in the html header. Only letters,
+ numbers, spaces and commas are allowed in this string; other
+ characters are stripped. Note that the majority of search
+ engines, including Google, do not use information from the
+ keywords header.
+
+* permalink
+
+  Specifies a permanent link to the page, if different from the page's
+  url as generated by ikiwiki.
+
+* date
+
+ Specifies the creation date of the page. The date can be entered in
+ nearly any format, since it's parsed by [[!cpan TimeDate]].
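+
+  For example (the date shown here is only illustrative):
+
+	\[[!meta date="2013-07-11 12:00:00"]]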
+
+* stylesheet
+
+ Adds a stylesheet to a page. The stylesheet is treated as a wiki link to
+ a `.css` file in the wiki, so it cannot be used to add links to external
+ stylesheets. Example:
+
+ \[[!meta stylesheet=somestyle rel="alternate stylesheet"
+ title="somestyle"]]
+
+ However, this will be scrubbed away if the
+ [[!iki plugins/htmlscrubber desc=htmlscrubber]] plugin is enabled,
+ since it can be used to insert unsafe content.
+
+* script
+
+ Adds a script to a page. The script is treated as a wiki link to
+ a `.js` file in the wiki, so it cannot be used to add links to external
+ scripts. The optional `defer` and `async` keywords can be used to set
+ the corresponding HTML4 and HTML5 script options. Example:
+
+ \[[!meta script=somescript defer async]]
+
+ The tag is subject to scrubbing as with the stylesheet and link fields.
+
+* openid
+
+ Adds html &lt;link&gt; tags to perform OpenID delegation to an external
+ OpenID server. This lets you use an ikiwiki page as your OpenID.
+
+ By default this will delegate for both `openid` and `openid2`. To only
+ delegate for one, add a parameter such as `delegate=openid`.
+
+ An optional `xrds-location`
+ parameter lets you specify the location of any [eXtensible Resource
+ DescriptorS](http://www.windley.com/archives/2007/05/using_xrds.shtml).
+
+ Example:
+
+ \[[!meta openid="http://joeyh.myopenid.com/"
+ server="http://www.myopenid.com/server"
+ xrds-location="http://www.myopenid.com/xrds?username=joeyh.myopenid.com"]]
+
+* link
+
+ Specifies a link to another page. This can be used as a way to make the
+ wiki treat one page as linking to another without displaying a user-visible
+ [[ikiwiki/WikiLink]]:
+
+ \[[!meta link=otherpage]]
+
+ It can also be used to insert a html &lt;link&gt; tag. For example:
+
+ \[[!meta link="http://joeyh.myopenid.com/" rel="openid.delegate"]]
+
+ However, this latter syntax won't be allowed if the
+ [[!iki plugins/htmlscrubber desc=htmlscrubber]] plugin is enabled, since it can be used to
+ insert unsafe content.
+
+* redir
+
+ Causes the page to redirect to another page in the wiki.
+
+ \[[!meta redir=otherpage]]
+
+ The default is to redirect without delay.
+  Optionally, a delay (in seconds) can be specified: "delay=10".
+
+ It can also be used to redirect to an external url. For example:
+
+ \[[!meta redir="http://example.com/"]]
+
+ However, this latter syntax won't be allowed if the
+ [[!iki plugins/htmlscrubber desc=htmlscrubber]] plugin is enabled, since it can be used to
+ insert unsafe content.
+
+ For both cases, an anchor to jump to inside the destination page may also be
+ specified using the common `#ANCHOR` syntax.
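+
+  For example, to redirect to a (hypothetical) page after a ten second
+  delay, jumping to a particular anchor:
+
+	\[[!meta redir="otherpage#details" delay=10]]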
+
+* robots
+
+ Causes the robots meta tag to be written:
+
+ \[[!meta robots="index, nofollow"]]
+
+ Valid values for the attribute are: "index", "noindex", "follow", and
+ "nofollow". Multiple comma-separated values are allowed, but obviously only
+ some combinations make sense. If there is no robots meta tag, "index,
+ follow" is used as the default.
+
+ The value is escaped, but its contents are not otherwise checked.
+
+* guid
+
+ Specifies a globally unique ID for a page. This guid should be a URI,
+ and it will be used to identify the page's entry in RSS
+ and Atom feeds. If not given, the default is to use the page's URL as its
+ guid.
+
+ This is mostly useful when a page has moved, to keep the guids for
+ pages unchanged and avoid flooding aggregators
+ (see [[!iki tips/howto_avoid_flooding_aggregators]]).
+
+* updated
+
+ Specifies a fake modification time for a page, to be output into RSS and
+ Atom feeds. This is useful to avoid flooding aggregators that sort by
+ modification time, like Planet: for instance, when editing an old blog post
+ to add tags, you could set `updated` to be one second later than the original
+ value. The date/time can be given in any format that
+ [[!cpan TimeDate]] can understand, just like the `date` field.
+
+* foaf
+
+ Adds a Friend of a Friend ([FOAF](http://wiki.foaf-project.org/w/Autodiscovery))
+ reference to a page.
+
+ Example:
+
+ \[[!meta foaf=foaf.rdf]]
+
+If the field is not one of the above predefined fields, the metadata will be
+written to the generated html page as a &lt;meta&gt; header. However, this
+won't be allowed if the [[!iki plugins/htmlscrubber desc=htmlscrubber]] plugin is enabled,
+since it can be used to insert unsafe content.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/meta/discussion.mdwn b/doc/ikiwiki/directive/meta/discussion.mdwn
new file mode 100644
index 000000000..a0aefe081
--- /dev/null
+++ b/doc/ikiwiki/directive/meta/discussion.mdwn
@@ -0,0 +1,69 @@
+Is there any reason the [language attribute](https://en.wikipedia.org/wiki/Meta_element#The_language_attribute) is not supported?
+--[[LucaCapello]]
+
+> Attached a patch against the Git repository, working on Debian ikiwiki_3.20100815.9. --[[LucaCapello]]
+
+[[patch]]
+
+-----
+
+<pre>
+From 680e57fd384b65e289d92054835687f3d6f3a19d Mon Sep 17 00:00:00 2001
+From: Luca Capello <luca@pca.it>
+Date: Sat, 6 Oct 2012 14:11:19 +0200
+Subject: [PATCH] IkiWiki/Plugin/meta.pm: support the language attribute
+
+---
+ IkiWiki/Plugin/meta.pm | 9 +++++++++
+ doc/ikiwiki/directive/meta.mdwn | 4 ++++
+ 2 files changed, 13 insertions(+)
+
+diff --git a/IkiWiki/Plugin/meta.pm b/IkiWiki/Plugin/meta.pm
+index 421f1dc..1a49f0c 100644
+--- a/IkiWiki/Plugin/meta.pm
++++ b/IkiWiki/Plugin/meta.pm
+@@ -102,6 +102,10 @@ sub preprocess (@) {
+ $pagestate{$page}{meta}{description}=$value;
+ # fallthrough
+ }
++ elsif ($key eq 'language') {
++ $pagestate{$page}{meta}{language}=$value;
++ # fallthrough
++ }
+ elsif ($key eq 'guid') {
+ $pagestate{$page}{meta}{guid}=$value;
+ # fallthrough
+@@ -279,6 +283,11 @@ sub preprocess (@) {
+ push @{$metaheaders{$page}}, '<meta name="'.$key.
+ '" content="'.encode_entities($value).'" />';
+ }
++ elsif ($key eq 'language') {
++ push @{$metaheaders{$page}},
++ '<meta http-equiv="Content-Language" content="'.
++ encode_entities($value).'" />';
++ }
+ elsif ($key eq 'name') {
+ push @{$metaheaders{$page}}, scrub('<meta name="'.
+ encode_entities($value).
+diff --git a/doc/ikiwiki/directive/meta.mdwn b/doc/ikiwiki/directive/meta.mdwn
+index 984f685..b82fa58 100644
+--- a/doc/ikiwiki/directive/meta.mdwn
++++ b/doc/ikiwiki/directive/meta.mdwn
+@@ -59,6 +59,10 @@ Supported fields:
+ Specifies a short description for the page. This will be put in
+ the html header, and can also be displayed by eg, the [[map]] directive.
+
++* language
++
++ Specifies the natural language for the page, for example, "en".
++
+ * keywords
+
+ Specifies keywords summarizing the contents of the page. This
+--
+1.7.10.4
+</pre>
+
+----
+
+I guess patching [[/ikiwiki/directive/meta]] to document the fact that this attribute is supported would be good. — [[Jon]]
diff --git a/doc/ikiwiki/directive/more.mdwn b/doc/ikiwiki/directive/more.mdwn
new file mode 100644
index 000000000..bda1427f3
--- /dev/null
+++ b/doc/ikiwiki/directive/more.mdwn
@@ -0,0 +1,21 @@
+The `more` directive is supplied by the [[!iki plugins/more desc=more]] plugin.
+
+This directive provides a way to have a "more" link on a blog post that
+leads to the full version of the page. Use it like this:
+
+ \[[!more linktext="click for more" text="""
+ This is the rest of my post. Not intended for people catching up on
+ their blogs at 30,000 feet. Because I like to make things
+ difficult.
+ """]]
+
+If the `linktext` parameter is omitted it defaults to just "more".
+
+An optional `pages` parameter can be used to specify a
+[[ikiwiki/PageSpec]]; the "more" link will then only be displayed
+when the page is inlined into a page matching that PageSpec. Otherwise
+the full content is shown.
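+
+For example, to show the "more" link only when the post is inlined into a
+(hypothetical) front page:
+
+	\[[!more pages="index" linktext="read on" text="""
+	The rest of the post.
+	"""]]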
+
+Note that you can accomplish something similar using a [[toggle]] instead.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/orphans.mdwn b/doc/ikiwiki/directive/orphans.mdwn
new file mode 100644
index 000000000..b03f5ac30
--- /dev/null
+++ b/doc/ikiwiki/directive/orphans.mdwn
@@ -0,0 +1,15 @@
+The `orphans` directive is supplied by the [[!iki plugins/orphans desc=orphans]] plugin.
+
+This directive generates a list of possibly orphaned pages -- pages that no
+other page links to. Example:
+
+ \[[!orphans pages="* and !blog/*"]]
+
+The optional parameter "pages" can be a [[ikiwiki/PageSpec]] specifying the
+pages to check for orphans; the default is to check them all.
+
+Note that the directive takes backlinks into account, but does not count
+inlining a page as linking to it, so it will generally count many blog-type
+pages as orphans.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/osm.mdwn b/doc/ikiwiki/directive/osm.mdwn
new file mode 100644
index 000000000..6807a8198
--- /dev/null
+++ b/doc/ikiwiki/directive/osm.mdwn
@@ -0,0 +1,69 @@
+The `osm` directive is supplied by the [[!iki plugins/osm desc=osm]] plugin.
+
+This directive inserts an OpenStreetMap map onto a page.
+It is typically combined with the [[waypoint]] directive
+to add points to the map.
+
+## examples
+
+ \[[!osm]]
+ \[[!waypoint lat="45°30N" lon="73°40W" name="My city" tag="city"]]
+
+The osm directive will display the actual map, while the waypoint
+directive adds waypoints to the map.
+
+The above can also be shortened as:
+
+ \[[!waypoint lat="45°30N" lon="73°40W" name="My city" tag="city" embed]]
+
+The tag is also taken from the tags elsewhere in the page, so the
+above is equivalent to:
+
+ \[[!waypoint lat="45°30N" lon="73°40W" name="My city" embed]]
+ \[[!tag city]]
+
+The icon is also taken from the tag, if an icon is attached to the tag page
+as icon.png (the default name, which can be modified).
+
+## map display
+
+ * `map` - map to display, defaults to "map"
+ * `zoom` - the level to zoom to on the OSM map
+ * `loc` - latitude and longitude of the map center
+ * `lat` - latitude
+ * `lon` - longitude
+ * `editable` - add edit controls in a separate layer
+ * `right` - float the map right
+ * `left` - float the map left (default)
+ * `width` - width of the map
+ * `height` - height of the map
+
+## waypoints
+
+Waypoints can be added to any page. By default the waypoint takes the
+name of the page, which allows you to easily tag pages and make them
+appear on the central map.
+
+Waypoints, by default, show up as an image (the `icon` parameter) linking
+to the main map (or to the map given by the `map` parameter). That markup
+can be hidden with the `hidden` parameter.
+
+ * `name` - the name of this point; defaults to the page name. The name
+   must be unique, otherwise a later waypoint will overwrite an earlier
+   one.
+ * `map` - the map to add the point to (defaults to "map")
+ * `desc` - description to embed in the map
+ * `loc` - latitude and longitude
+ * `lat` - latitude
+ * `lon` - longitude
+ * `tag` - the type of points, maps to an icon in the osm_types array
+ * `hidden` - do not display the link to the map (will not affect `embed`)
+ * `icon` - URL to the icon to show in the link to the map and within
+ the map
+ * `embed` - embed the map display alongside the point, in which case
+ the regular arguments to the map display can be used
+
+## Links
+
+If two pages with waypoints have a link between them, that link will
+magically show up on the map. Now how awesome is that?
diff --git a/doc/ikiwiki/directive/osm/discussion.mdwn b/doc/ikiwiki/directive/osm/discussion.mdwn
new file mode 100644
index 000000000..d9eb56951
--- /dev/null
+++ b/doc/ikiwiki/directive/osm/discussion.mdwn
@@ -0,0 +1,13 @@
+For some reason this stopped working after the 20120203 upgrade:
+
+    Removing /home/a-mesh/public_html/map/pois.kml, which is no longer rendered by nodes/anarcat
+    Removing /home/a-mesh/public_html/map/pois.txt, which is no longer rendered by nodes/anarcat
+    Removing /home/a-mesh/public_html/map/pois.json, which is no longer rendered by nodes/anarcat
+
+The map ceased to be generated, basically. --[[anarcat]]
+
+> Weird. This went away after adding debugging. No clue what happened here. But note that the following debugging code was quite useful in the output of --rebuild:
+
+ debug("writing pois file pois.kml in " . $config{destdir} . "/$map");
+
+The `width` and `height` parameters of the `[[!osm]]` directive stopped working after that upgrade too. The map doesn't show at all when they are added to the directive. --[[anarcat]]
diff --git a/doc/ikiwiki/directive/pagecount.mdwn b/doc/ikiwiki/directive/pagecount.mdwn
new file mode 100644
index 000000000..0e6ca3c46
--- /dev/null
+++ b/doc/ikiwiki/directive/pagecount.mdwn
@@ -0,0 +1,10 @@
+The `pagecount` directive is supplied by the [[!iki plugins/pagecount desc=pagecount]] plugin.
+
+This directive counts pages currently in the wiki. Example:
+
+ \[[!pagecount pages="*"]]
+
+The optional parameter "pages" can be a [[ikiwiki/PageSpec]] specifying the
+pages to count; the default is to count them all.
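+
+For example, to count only blog posts (the pagespec here is illustrative):
+
+	\[[!pagecount pages="blog/* and !*/Discussion"]]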
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/pagestats.mdwn b/doc/ikiwiki/directive/pagestats.mdwn
new file mode 100644
index 000000000..8d904f5a3
--- /dev/null
+++ b/doc/ikiwiki/directive/pagestats.mdwn
@@ -0,0 +1,40 @@
+The `pagestats` directive is supplied by the [[!iki plugins/pagestats desc=pagestats]] plugin.
+
+This directive can generate stats about how pages link to each other. It can
+produce either a tag cloud, or a table counting the number of links to each
+page.
+
+Here's how to use it to create a [[tag]] cloud, with tags sized based
+on frequency of use:
+
+ \[[!pagestats pages="tags/*"]]
+
+Here's how to create a list of tags, sized by use as they would be in a
+cloud.
+
+ \[[!pagestats style="list" pages="tags/*"]]
+
+And here's how to create a table of all the pages on the wiki:
+
+ \[[!pagestats style="table"]]
+
+The optional `among` parameter limits the pages whose outgoing links are
+considered. For instance, to display a cloud of tags used on blog
+entries, while ignoring other pages that use those tags, you could use:
+
+ \[[!pagestats pages="tags/*" among="blog/posts/*"]]
+
+Or to display a cloud of tags related to Linux, you could use:
+
+ \[[!pagestats pages="tags/* and !tags/linux" among="tagged(linux)"]]
+
+The optional `show` parameter limits display to the specified number of
+pages. For instance, to show a table of the top ten pages with the most
+links:
+
+ \[[!pagestats style="table" show="10"]]
+
+The optional `class` parameter can be used to control the class
+of the generated tag cloud `div` or page stats `table`.
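+
+For example, to give the tag cloud a CSS class of your own (the class name
+here is hypothetical):
+
+	\[[!pagestats pages="tags/*" class="tagcloud"]]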
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/pagestats/discussion.mdwn b/doc/ikiwiki/directive/pagestats/discussion.mdwn
new file mode 100644
index 000000000..99029e88e
--- /dev/null
+++ b/doc/ikiwiki/directive/pagestats/discussion.mdwn
@@ -0,0 +1,18 @@
+I am trying to create a tag cloud using:
+
+ \[[!pagestats pages="tags/*"]]
+
+Nothing showed up when I first used this directive. I found that I had to create a page for each tag for it to show up in pagestats.
+I would rather not find and create a page for every tag I have created or will create. Is there an easier way to create a list of tags?
+
+Thanks
+
+> Hello unknown person.
+
+> I think it would require a different approach to what "tags" are, and/or what "pagestats" are. The pagestats plugin gives statistical information about *pages*, so it requires the pages in question to exist before it can get information about them. The tags plugin creates links to tag *pages*, with the expectation that a human being will create said pages and put whatever content they want on them (such as describing what the tag is about, and a map linking back to the tagged pages).
+
+> The approach that [PmWiki](http://www.pmwiki.org) takes is that it enables the optional auto-creation of (empty) pages which match a particular "group" (set of sub-pages); thus one could set all the "tags/*" pages to be auto-created, creating a new tags/foo page the first time the \[[!tag foo]] directive is used. See [[todo/auto-create_tag_pages_according_to_a_template]] for more discussion on this idea.
+> -- [[KathrynAndersen]]
+
+> Update: Ikiwiki can auto-create tags now, though it only defaults to
+> doing so when tagbase is set. --[[Joey]]
diff --git a/doc/ikiwiki/directive/pagetemplate.mdwn b/doc/ikiwiki/directive/pagetemplate.mdwn
new file mode 100644
index 000000000..401b38099
--- /dev/null
+++ b/doc/ikiwiki/directive/pagetemplate.mdwn
@@ -0,0 +1,13 @@
+The `pagetemplate` directive is supplied by the [[!iki plugins/pagetemplate desc=pagetemplate]] plugin.
+
+This directive allows a page to be displayed using a different
+[[template|templates]] than the default `page.tmpl` template.
+
+The page text is inserted into the template, so the template controls the
+overall look and feel of the wiki page. This is in contrast to the
+[[ikiwiki/directive/template]] directive, which allows inserting templates
+_into_ the body of a page.
+
+ \[[!pagetemplate template="my_fancy.tmpl"]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/paste.mdwn b/doc/ikiwiki/directive/paste.mdwn
new file mode 100644
index 000000000..a0aa0ef7f
--- /dev/null
+++ b/doc/ikiwiki/directive/paste.mdwn
@@ -0,0 +1,3 @@
+[[!meta redir=/ikiwiki/directive/cutpaste]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/ping.mdwn b/doc/ikiwiki/directive/ping.mdwn
new file mode 100644
index 000000000..691c75843
--- /dev/null
+++ b/doc/ikiwiki/directive/ping.mdwn
@@ -0,0 +1,18 @@
+The `ping` directive is supplied by the [[!iki plugins/pinger desc=pinger]] plugin.
+
+This directive allows ikiwiki to be configured to hit a URL each time it
+updates the wiki. One way to use this is in conjunction with the [[!iki plugins/pingee desc=pingee]]
+plugin to set up a loosely coupled mirror network, or a branched version of
+a wiki. By pinging the mirror or branch each time the main wiki changes, it
+can be kept up-to-date.
+
+ \[[!ping from="http://mywiki.com/"
+ to="http://otherwiki.com/ikiwiki.cgi?do=ping"]]
+
+The "from" parameter must be identical to the url of the wiki that is doing
+the pinging. This is used to prevent ping loops.
+
+The "to" parameter is the url to ping. The example shows how to ping
+another ikiwiki instance.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/poll.mdwn b/doc/ikiwiki/directive/poll.mdwn
new file mode 100644
index 000000000..0b47a2167
--- /dev/null
+++ b/doc/ikiwiki/directive/poll.mdwn
@@ -0,0 +1,27 @@
+The `poll` directive is supplied by the [[!iki plugins/poll desc=poll]] plugin.
+
+This directive allows you to create online polls in the wiki. Here's an
+example use:
+
+ \[[!poll 0 "red" 0 "green" 0 "blue"]]
+
+The numbers indicate how many users voted for that choice. When a user
+votes for a choice in the poll, the page is modified and the number
+incremented.
+
+While some basic precautions are taken to prevent users from accidentally
+voting twice, this sort of poll should not be counted on to be very
+accurate; all the usual concerns about web based polling apply. Unless the
+page that the poll is in is locked, users can even edit the page and change
+the numbers!
+
+Parameters:
+
+* `open` - Whether voting is still open. Set to "no" to close the poll to
+ voting.
+* `expandable` - Set to "yes" to make this poll have an interface to add
+ another choice to the poll.
+* `total` - Show total number of votes at bottom of poll. Default is "yes".
+* `percent` - Whether to display percents. Default is "yes".
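+
+For example, a closed poll that still shows totals but hides percents
+(the choices here are only illustrative):
+
+	\[[!poll open=no percent=no 5 "red" 3 "green" 1 "blue"]]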
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/polygen.mdwn b/doc/ikiwiki/directive/polygen.mdwn
new file mode 100644
index 000000000..e8726341a
--- /dev/null
+++ b/doc/ikiwiki/directive/polygen.mdwn
@@ -0,0 +1,11 @@
+The `polygen` directive is supplied by the [[!iki plugins/polygen desc=polygen]] plugin.
+
+This directive allows inserting text generated by polygen into a wiki page.
+For example:
+
+ \[[!polygen grammar="genius"]]
+
+It's also possible to specify a starting nonterminal for the grammar by
+including `symbol="text"` in the directive.
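+
+For example (assuming the grammar defines a nonterminal named "text"):
+
+	\[[!polygen grammar="genius" symbol="text"]]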
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/postsparkline.mdwn b/doc/ikiwiki/directive/postsparkline.mdwn
new file mode 100644
index 000000000..318512eef
--- /dev/null
+++ b/doc/ikiwiki/directive/postsparkline.mdwn
@@ -0,0 +1,45 @@
+The `postsparkline` directive is supplied by the [[!iki plugins/postsparkline desc=postsparkline]] plugin.
+
+This directive uses the [[!iki plugins/sparkline desc=sparkline]] plugin to create a
+[[sparkline]] of statistics about a set of pages, such as posts to a blog.
+
+# examples
+
+ Post interval:
+ \[[!postsparkline pages="blog/* and !*/Discussion" max=100
+ formula=interval style=bar barwidth=2 barspacing=1 height=13]]
+
+ Posts per month this year:
+ \[[!postsparkline pages="blog/* and !*/Discussion" max=12
+ formula=permonth style=bar barwidth=2 barspacing=1 height=13]]
+
+# usage
+
+All options aside from the `pages`, `max`, `formula`, `time`, and `color`
+options are the same as in the [[sparkline]] directive.
+
+You don't need to specify any data points (though you can if you want to).
+Instead, data points are automatically generated based on the creation
+times of pages matched by the specified `pages` [[ikiwiki/PageSpec]]. A
+maximum of `max` data points will be generated.
+
+The `formula` parameter controls the formula used to generate data points.
+Available formulae:
+
+* `interval` - The height of each point represents how long it has been
+ since the previous post.
+* `perday` - Each point represents a day; the height represents how
+ many posts were made that day.
+* `permonth` - Each point represents a month; the height represents how
+ many posts were made that month.
+* `peryear` - Each point represents a year; the height represents how
+ many posts were made that year.
+
+The `time` parameter has a default value of "ctime", since formulae use
+the creation times of pages by default. If you instead want
+them to use the modification times of pages, set it to "mtime".
+
+To change the color used to draw the sparkline, use the `color` parameter.
+For example, "color=red".
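+
+Putting these together, a red sparkline of posts per month based on
+modification times might look like this (the pagespec is illustrative):
+
+	\[[!postsparkline pages="blog/*" max=12 formula=permonth
+	time=mtime color=red style=bar barwidth=2 barspacing=1 height=13]]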
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/progress.mdwn b/doc/ikiwiki/directive/progress.mdwn
new file mode 100644
index 000000000..529f1c3c1
--- /dev/null
+++ b/doc/ikiwiki/directive/progress.mdwn
@@ -0,0 +1,18 @@
+The `progress` directive is supplied by the [[!iki plugins/progress desc=progress]] plugin.
+
+This directive generates a progress bar.
+
+There are two possible parameter sets. The first is a single parameter
+"percent" which holds a percentage figure of how complete the progress bar is.
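+
+For example, to show a bar that is 50% complete:
+
+	\[[!progress percent=50]]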
+
+The second possible set of parameters is a pair of [[ikiwiki/PageSpec]]s,
+`totalpages` and `donepages`. The directive counts the number of
+pages in each pagespec and shows the percentage of the total pages that are
+done.
+
+For example, to show what percentage of pages have
+discussion pages:
+
+ \[[!progress totalpages="* and !*/Discussion" donepages="*/Discussion"]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/shortcut.mdwn b/doc/ikiwiki/directive/shortcut.mdwn
new file mode 100644
index 000000000..29db86ca5
--- /dev/null
+++ b/doc/ikiwiki/directive/shortcut.mdwn
@@ -0,0 +1,9 @@
+The `shortcut` directive is supplied by the [[!iki plugins/shortcut desc=shortcut]] plugin.
+
+This directive allows external links to commonly linked-to sites to be
+made more easily using shortcuts.
+
+The available shortcuts are defined on the [[shortcuts]] page in
+the wiki. The `shortcut` directive can only be used on that page.
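+
+For example, if this wiki defines a `wikipedia` shortcut on the
+[[shortcuts]] page, it could be used elsewhere like this:
+
+    \[[!wikipedia War_of_1812]]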
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/sidebar.mdwn b/doc/ikiwiki/directive/sidebar.mdwn
new file mode 100644
index 000000000..599695d22
--- /dev/null
+++ b/doc/ikiwiki/directive/sidebar.mdwn
@@ -0,0 +1,20 @@
+The `sidebar` directive is supplied by the [[!iki plugins/sidebar desc=sidebar]] plugin.
+
+This directive can specify a custom sidebar to display on the page,
+overriding any sidebar that is displayed globally.
+
+If no custom sidebar content is specified, it forces the sidebar page to
+be used as the sidebar, even if the `global_sidebars` setting has been
+used to disable use of the sidebar page by default.
+
+## examples
+
+ \[[!sidebar content="""
+ This is my custom sidebar for this page.
+
+ \[[!calendar pages="posts/*"]]
+ """]]
+
+ \[[!sidebar]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/sidebar/discussion.mdwn b/doc/ikiwiki/directive/sidebar/discussion.mdwn
new file mode 100644
index 000000000..17c50ace6
--- /dev/null
+++ b/doc/ikiwiki/directive/sidebar/discussion.mdwn
@@ -0,0 +1,10 @@
+Things can get messy if you have enabled the global sidebar setting and you browse the `ikiwiki/directive/*` pages. You will get the content of `ikiwiki/directive/sidebar` as a sidebar on all the directive pages... I have emptied the sidebar.mdwn page on our wiki to work around that, but isn't this a bug? --[[anarcat]]
+
+> Another reason to dislike the global sidebar option and wish it didn't
+> exist, as if I didn't have a dozen already. However, renaming this page
+> does not seem like an appropriate fix; adding cruft to every directive/
+> page to force the sidebar off does not seem like an appropriate fix;
+> this leaves only special casing the plugin to not treat this page as a
+> sidebar, but that's disgusting. --[[Joey]]
+
+>> Yep, this all sounds wrong... Maybe we could add a global "sidebar exclusion" pattern? Or reverse, allow customizing what name the global sidebar functionality is looking for? For example, we could look for `globalsidebar.mdwn` page instead of just `sidebar.mdwn`? --[[anarcat]]
diff --git a/doc/ikiwiki/directive/sparkline.mdwn b/doc/ikiwiki/directive/sparkline.mdwn
new file mode 100644
index 000000000..e5a03f84e
--- /dev/null
+++ b/doc/ikiwiki/directive/sparkline.mdwn
@@ -0,0 +1,52 @@
+The `sparkline` directive is supplied by the [[!iki plugins/sparkline desc=sparkline]] plugin.
+
+This directive allows for embedding sparklines into wiki pages. A
+sparkline is a small, word-sized chart that is designed to be
+displayed alongside text.
+
+# examples
+
+ \[[!sparkline 1 3 5 -3 10 0 width=40 height=16
+ featurepoint="4,-3,red,3" featurepoint="5,10,green,3"]]
+
+This creates a simple line graph, graphing several points.
+It will be drawn 40 pixels wide and 16 pixels high. The high point in the
+line has a green marker, and the low point has a red marker.
+
+ \[[!sparkline 1 -1(red) 1 -1(red) 1 1 1 -1(red) -1(red) style=bar barwidth=2
+ barspacing=1 height=13]]
+
+This more complex example generates a bar graph.
+The bars are 2 pixels wide, and separated by one pixel, and the graph is 13
+pixels tall. Width is determined automatically for bar graphs. The points
+with negative values are colored red, instead of the default black.
+
+# usage
+
+The form for the data points is "x,y", or just "y" if the x values don't
+matter. Bar graphs can also add "(color)" to specify a color for that bar.
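+
+For example, a line graph using explicit "x,y" data points might look like
+this (values are illustrative):
+
+    \[[!sparkline 0,4 1,2 2,6 3,5 width=40 height=16]]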
+
+The following named parameters are recognised. Most of these are the same
+as those used by the underlying sparkline library, which is documented in
+more detail in [its wiki](http://sparkline.wikispaces.com/usage).
+
+* `style` - Either "line" (the default) or "bar".
+* `width` - Width of the graph in pixels. Only needed for line graphs.
+* `height` - Height of the graph in pixels. Defaults to 16.
+* `barwidth` - Width of bars in a bar graph. Default is 1 pixel.
+* `barspacing` - Spacing between bars in a bar graph, in pixels. Default is
+ 1 pixel.
+* `ymin`, `ymax` - Minimum and maximum values for the Y axis. This is
+ normally calculated automatically, but can be explicitly specified to get
+ the same values for multiple related graphs.
+* `featurepoint` - Adds a circular marker to a line graph, with optional
+ text. This can be used to label significant points.
+
+ The value is a comma-delimited list of parameters specifying the feature
+ point: X value, Y value, color name, circle diameter, text (optional),
+ and text location (optional). Example: `featurepoint="3,5,blue,3"`
+
+ Available values for the text location are: "top", "right", "bottom", and
+ "left".
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/table.mdwn b/doc/ikiwiki/directive/table.mdwn
new file mode 100644
index 000000000..3e7917789
--- /dev/null
+++ b/doc/ikiwiki/directive/table.mdwn
@@ -0,0 +1,53 @@
+The `table` directive is supplied by the [[!iki plugins/table desc=table]] plugin.
+
+This directive can build HTML tables from data in CSV (comma-separated values)
+or DSV (delimiter-separated values) format.
+
+## examples
+
+ \[[!table data="""
+ Customer |Amount
+ Fulanito |134,34
+ Menganito|234,56
+ Menganito|234,56
+ """]]
+
+ \[[!table class="book_record" format=csv file="data/books/record1"]]
+
+In this second example the `record1` page should be similar to:
+
+ "Title","Perl Best Practices"
+ "Author","Damian Conway"
+ "Publisher","O’Reilly"
+
+To make a cell span multiple columns, follow it with one or more empty
+cells. For example:
+
+ \[[!table data="""
+ left||right|
+ a|b|c|d
+ this cell spans **4** columns|||
+ """]]
+
+## usage
+
+* `data` - Values for the table.
+* `file` - A file in the wiki containing the data.
+* `format` - The format of the data, either "csv", "dsv", or "auto"
+ (the default).
+* `delimiter` - The character used to separate fields. By default,
+ DSV format uses a pipe (`|`), and CSV uses a comma (`,`).
+* `class` - A CSS class for the table html element.
+* `header` - By default, or if set to "row", the first data line is used
+ as the table header. Set it to "no" to make a table without a header, or
+ "column" to make the first column be the header.
+
+For tab-delimited tables (often obtained by copying and pasting from HTML
+or a spreadsheet), `delimiter` must be set to a literal tab character. These
+are difficult to type in most web browsers - copying and pasting one from
+the table data is likely to be the easiest way.
+
+Note that the contents of table cells can contain arbitrary ikiwiki and
+markdown markup.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/table/discussion.mdwn b/doc/ikiwiki/directive/table/discussion.mdwn
new file mode 100644
index 000000000..87d2e0cd1
--- /dev/null
+++ b/doc/ikiwiki/directive/table/discussion.mdwn
@@ -0,0 +1 @@
+The problem I have in my tables is that some fields contain example HTML that needs to be escaped.
diff --git a/doc/ikiwiki/directive/tag.mdwn b/doc/ikiwiki/directive/tag.mdwn
new file mode 100644
index 000000000..c8d9b9816
--- /dev/null
+++ b/doc/ikiwiki/directive/tag.mdwn
@@ -0,0 +1,35 @@
+The `tag` and `taglink` directives are supplied by the [[!iki plugins/tag desc=tag]] plugin.
+
+These directives allow tagging pages. List tags as follows:
+
+ \[[!tag tech life linux]]
+
+The tags work the same as if you had put a (hidden) [[ikiwiki/WikiLink]] on
+the page for each tag, so you can use a [[ikiwiki/PageSpec]] to match all
+pages that are tagged with a given tag, for example. The tags will also
+show up on blog entries and at the bottom of the tagged pages, as well as
+in RSS and Atom feeds.
+
+If you want a visible [[ikiwiki/WikiLink]] along with the tag, use taglink
+instead:
+
+ \[[!taglink foo]]
+ \[[!taglink tagged_as_foo|foo]]
+
+Note that if the wiki is configured to use a tagbase, then the tags will be
+located under a base directory, such as "tags/". This is a useful way to
+avoid having to write the full path to tags, if you want to keep them
+grouped together out of the way. Also, since ikiwiki then knows where to put
+tags, it will automatically create tag pages when new tags are used.
+
+Bear in mind that specifying a tagbase means you will need to incorporate it
+into the `link()` [[ikiwiki/PageSpec]] you use: e.g., if your tagbase is
+`tag`, you would match pages tagged "foo" with `link(tag/foo)`.
+
+If you want to override the tagbase for a particular tag, you can use
+something like this:
+
+ \[[!tag /foo]]
+ \[[!taglink /foo]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/tag/discussion.mdwn b/doc/ikiwiki/directive/tag/discussion.mdwn
new file mode 100644
index 000000000..23352ebe7
--- /dev/null
+++ b/doc/ikiwiki/directive/tag/discussion.mdwn
@@ -0,0 +1,13 @@
+# Tags in HTML pages
+
+http://rhombus-tech.net is an ikiwiki site where the hardware development is expanding: there are now four hardware projects each of which has its own news page. For convenience (putting in images for example), the pages have to use HTML not markdown or any other non-HTML format.
+
+However as there are quite a lot of them it would make sense to have an overview page saying "news reports", and that page to be auto-generated because every individual news page is tagged.
+
+... except the news pages are written in HTML, not any markup language into which a tag can be placed.
+
+question: what is the directive which allows an HTML page to have embedded within it a markup "tag"?
+
+> You can use the tag directive in `.html` pages, just like in `.mdwn` pages. This is if you're using
+> the default html plugin. If you instead use the rawhtml plugin, ikiwiki just copies your html files
+> and directives in them won't work. --[[Joey]]
diff --git a/doc/ikiwiki/directive/taglink.mdwn b/doc/ikiwiki/directive/taglink.mdwn
new file mode 100644
index 000000000..dbfabf8e1
--- /dev/null
+++ b/doc/ikiwiki/directive/taglink.mdwn
@@ -0,0 +1,3 @@
+[[!meta redir=/ikiwiki/directive/tag]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/template.mdwn b/doc/ikiwiki/directive/template.mdwn
new file mode 100644
index 000000000..9e3ae54df
--- /dev/null
+++ b/doc/ikiwiki/directive/template.mdwn
@@ -0,0 +1,91 @@
+The `template` directive is supplied by the [[!iki plugins/template desc=template]] plugin.
+
+The template directive allows wiki pages to be used as templates.
+These templates can be filled out and inserted into other pages in the
+wiki using the directive. The [[templates]] page lists templates
+that can be used with this directive.
+
+The directive has an `id` parameter
+that identifies the template to use. The remaining parameters are used to
+fill out the template.
+
+## Example
+
+ \[[!template id=note text="""Here is the text to insert into my note."""]]
+
+This fills out the `note` template, filling in the `text` field with
+the specified value, and inserts the result into the page.
+
+## Using a template
+
+Generally, a value can include any markup that would be allowed in the wiki
+page outside the template. Triple-quoting the value even allows quotes to
+be included in it. Combined with multi-line quoted values, this allows for
+large chunks of marked up text to be embedded into a template:
+
+ \[[!template id=foo name="Sally" color="green" age=8 notes="""
+ * \[[Charley]]'s sister.
+ * "I want to be an astronaut when I grow up."
+ * Really 8 and a half.
+ """]]
+
+## Creating a template
+
+The template is a regular wiki page, located in the `templates/`
+subdirectory inside the source directory of the wiki.
+
+Alternatively, templates can be stored in a directory outside the wiki,
+as files with the extension ".tmpl".
+By default, these are searched for in `/usr/share/ikiwiki/templates`;
+the `templatedir` setting can be used to make another directory be searched
+first. When referring to templates outside the wiki source directory, the "id"
+parameter is not interpreted as a pagespec, and you must include the full filename
+of the template page, including the ".tmpl" extension. E.g.:
+
+ \[[!template id=blogpost.tmpl]]
+
+The template uses the syntax used by the [[!cpan HTML::Template]] perl
+module, which allows for some fairly complex things to be done. Consult its
+documentation for the full syntax, but all you really need to know are a
+few things:
+
+* Each parameter you pass to the template directive will generate a
+ template variable. There are also some pre-defined variables like PAGE
+ and BASENAME.
+* To insert the value of a variable, use `<TMPL_VAR variable>`. Wiki markup
+ in the value will first be converted to html.
+* To insert the raw value of a variable, with wiki markup not yet converted
+ to html, use `<TMPL_VAR raw_variable>`.
+* To make a block of text conditional on a variable being set use
+ `<TMPL_IF variable>text</TMPL_IF>`.
+* To use one block of text if a variable is set and a second if it's not,
+ use `<TMPL_IF variable>text<TMPL_ELSE>other text</TMPL_IF>`
+
+Here's a sample template:
+
+ <span class="infobox">
+ Name: \[[<TMPL_VAR raw_name>]]<br />
+ Age: <TMPL_VAR age><br />
+ <TMPL_IF color>
+ Favorite color: <TMPL_VAR color><br />
+ <TMPL_ELSE>
+ No favorite color.<br />
+ </TMPL_IF>
+ <TMPL_IF notes>
+ <hr />
+ <TMPL_VAR notes>
+ </TMPL_IF>
+ </span>
+
+The filled out template will be formatted the same as the rest of the page
+that contains it, so you can include WikiLinks and all other forms of wiki
+markup in the template. Note though that such WikiLinks will not show up as
+backlinks to the page that uses the template.
+
+Note the use of "raw_name" inside the [[ikiwiki/WikiLink]] generator in the
+example above. This ensures that if the name contains something that might
+be mistaken for wiki markup, it's not converted to html before being
+processed as a [[ikiwiki/WikiLink]].
+
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/testpagespec.mdwn b/doc/ikiwiki/directive/testpagespec.mdwn
new file mode 100644
index 000000000..dde7d99f5
--- /dev/null
+++ b/doc/ikiwiki/directive/testpagespec.mdwn
@@ -0,0 +1,24 @@
+The `testpagespec` directive is supplied by the [[!iki plugins/testpagespec desc=testpagespec]] plugin.
+
+This directive allows testing a [[ikiwiki/PageSpec]] to see if it matches a
+page, and to see the part that matches, or causes the match to fail.
+
+Example uses:
+
+ \[[!testpagespec pagespec="foopage and barpage" match="foopage"]]
+
+This will print out something like "no match: barpage does not match
+foopage", highlighting which part of the [[ikiwiki/PageSpec]] is causing
+the match to fail.
+
+ \[[!testpagespec pagespec="foopage or !bar*" match="barpage"]]
+
+This will print out something like "no match: bar* matches barpage", since
+the part of the [[ikiwiki/PageSpec]] that fails is this negated match.
+
+ \[[!testpagespec pagespec="foopage or barpage" match="barpage"]]
+
+This will print out something like "match: barpage matches barpage",
+indicating the part of the [[ikiwiki/PageSpec]] that caused it to match.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/testpagespec/discussion.mdwn b/doc/ikiwiki/directive/testpagespec/discussion.mdwn
new file mode 100644
index 000000000..66c9a9ca9
--- /dev/null
+++ b/doc/ikiwiki/directive/testpagespec/discussion.mdwn
@@ -0,0 +1,6 @@
+How does one test a user identity? I tried "pagename and user(username)" for the match, and got a "no user specified" error.
+
+> You can't test them with this directive, because such pagespecs test to
+> see if logged in user, who is performing some action, matches. When the
+> page with the directive is built, the concept of a user being logged in
+> doesn't really apply. --[[Joey]]
diff --git a/doc/ikiwiki/directive/teximg.mdwn b/doc/ikiwiki/directive/teximg.mdwn
new file mode 100644
index 000000000..992a3f68d
--- /dev/null
+++ b/doc/ikiwiki/directive/teximg.mdwn
@@ -0,0 +1,23 @@
+The `teximg` directive is supplied by the [[!iki plugins/teximg desc=teximg]] plugin.
+
+This directive renders LaTeX formulas into images.
+
+## examples
+
+ \[[!teximg code="\frac{1}{2}"]]
+ \[[!teximg code="E = - \frac{Z^2 \cdot \mu \cdot e^4}{32\pi^2 \epsilon_0^2 \hbar^2 n^2}" ]]
+
+To scale the image, use height=x:
+
+ \[[!teximg code="\frac{1}{2}" height="17"]]
+ \[[!teximg code="\frac{1}{2}" height="8"]]
+
+If no height is chosen the default height 12 is used. Valid heights are: 8, 9,
+10, 11, 12, 14, 17, 20. If another height is entered, the closest available
+height is used.
+
+To add an alt text to the image, use alt="text":
+
+ \[[!teximg code="\frac{1}{2}" alt="1/2"]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/toc.mdwn b/doc/ikiwiki/directive/toc.mdwn
new file mode 100644
index 000000000..bb1afa1ac
--- /dev/null
+++ b/doc/ikiwiki/directive/toc.mdwn
@@ -0,0 +1,27 @@
+The `toc` directive is supplied by the [[!iki plugins/toc desc=toc]] plugin.
+
+Add a table of contents to a page:
+
+ \[[!toc ]]
+
+The table of contents will be automatically generated based on the
+headers of the page. By default only the largest headers present on the
+page will be shown; to control how many levels of headers are shown, use
+the `levels` parameter:
+
+ \[[!toc levels=2]]
+
+The toc directive will take the level of the first header as the topmost
+level, even if there are higher levels seen later in the file.
+
+To create a table of contents that only shows headers starting with a given
+level, use the `startlevel` parameter. For example, to show only h2 and
+smaller headers:
+
+ \[[!toc startlevel=2]]
+
+The table of contents will be created as an ordered list. If you want
+an unordered list instead, you can change the list-style in your local
+style sheet.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/toggle.mdwn b/doc/ikiwiki/directive/toggle.mdwn
new file mode 100644
index 000000000..db11f0335
--- /dev/null
+++ b/doc/ikiwiki/directive/toggle.mdwn
@@ -0,0 +1,34 @@
+The `toggle` and `toggleable` directives are supplied by the [[!iki plugins/toggle desc=toggle]] plugin.
+
+With these directives you can create links on pages that, when clicked, toggle
+display of other parts of the page.
+
+It uses javascript to accomplish this; browsers without javascript will
+always see the full page content.
+
+Example use:
+
+ \[[!toggle id="ipsum" text="show"]]
+
+ \[[!toggleable id="ipsum" text="""
+ Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do
+ eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim
+ ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut
+ aliquip ex ea commodo consequat.
+
+ [[!toggle id="ipsum" text="hide"]]
+ """]]
+
+Note that you can include wiki markup in the toggleable text,
+including even additional toggles, as shown in the above example.
+
+Also, the toggle and the togglable definitions do not need to be next to
+each other, but can be located anywhere on the page. There can also be
+multiple toggles that all toggle a single togglable.
+
+The id has a default value of "default", so can be omitted in simple cases.
+
+If you'd like a toggleable to be displayed by default, and toggle to
+hidden, then pass a parameter "open=yes" when setting up the toggleable.
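+
+For example (an illustrative sketch of the "open=yes" parameter):
+
+    \[[!toggle id="details" text="toggle details"]]
+
+    \[[!toggleable id="details" open=yes text="""
+    This text is shown by default, and can be toggled to hidden.
+    """]]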
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/toggleable.mdwn b/doc/ikiwiki/directive/toggleable.mdwn
new file mode 100644
index 000000000..5536f4489
--- /dev/null
+++ b/doc/ikiwiki/directive/toggleable.mdwn
@@ -0,0 +1,3 @@
+[[!meta redir=/ikiwiki/directive/toggle]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/trailitem.mdwn b/doc/ikiwiki/directive/trailitem.mdwn
new file mode 100644
index 000000000..59626b5a1
--- /dev/null
+++ b/doc/ikiwiki/directive/trailitem.mdwn
@@ -0,0 +1,9 @@
+The `trailitem` directive is supplied by the
+[[!iki plugins/trail desc=trail]] plugin. It is used like this:
+
+ \[[!trailitem some_other_page]]
+
+to add `some_other_page` to the trail represented by this page, without
+generating a visible hyperlink.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/trailitems.mdwn b/doc/ikiwiki/directive/trailitems.mdwn
new file mode 100644
index 000000000..387b403b9
--- /dev/null
+++ b/doc/ikiwiki/directive/trailitems.mdwn
@@ -0,0 +1,25 @@
+The `trailitems` directive is supplied by the
+[[!iki plugins/trail desc=trail]] plugin. It adds pages
+to the trail represented by the current page, without producing any output
+on that page.
+
+ \[[!trailitems pages="posts/*" sort="age"]]
+
+ \[[!trailitems pagenames="a b c"]]
+
+Options are similar to [[!iki ikiwiki/directive/inline desc=inline]]:
+
+* `pages`: adds pages that match a [[ikiwiki/PageSpec]] to the trail
+ (cannot be used with `pagenames`)
+
+* `pagenames`: if used instead of `pages`, this is interpreted as a
+ space-separated list of absolute page names
+ ([[SubPage/LinkingRules]] are not taken into account)
+ to add to the trail
+
+* `sort`: add the pages matched by `pages` to the trail in this
+ [[ikiwiki/pagespec/sorting]] order (cannot be used with `pagenames`)
+
+* `reverse`: reverse the order of `sort` (cannot be used with `pagenames`)
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/traillink.mdwn b/doc/ikiwiki/directive/traillink.mdwn
new file mode 100644
index 000000000..090e2538d
--- /dev/null
+++ b/doc/ikiwiki/directive/traillink.mdwn
@@ -0,0 +1,16 @@
+The `traillink` directive is supplied by the
+[[!iki plugins/trail desc=trail]]
+plugin. It generates a visible [[ikiwiki/WikiLink]], and also adds the
+linked page to the trail represented by the page containing the directive.
+
+In its simplest form, the first parameter is like the content of a WikiLink:
+
+ \[[!traillink some_other_page]]
+
+The displayed text can also be overridden, either with a `|` symbol or with
+a `text` parameter:
+
+ \[[!traillink Click_here_to_start_the_trail|some_other_page]]
+ \[[!traillink some_other_page text="Click here to start the trail"]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/trailoptions.mdwn b/doc/ikiwiki/directive/trailoptions.mdwn
new file mode 100644
index 000000000..d83f444c0
--- /dev/null
+++ b/doc/ikiwiki/directive/trailoptions.mdwn
@@ -0,0 +1,18 @@
+The `trailoptions` directive is supplied by the
+[[!iki plugins/trail desc=trail]] plugin. It sets options for the
+trail represented by this page.
+
+ \[[!trailoptions sort="meta(title)" circular="no"]]
+
+Options available:
+
+* `sort`: sets a [[ikiwiki/pagespec/sorting]] order for the entire trail,
+ overriding the order in which they were added
+
+* `reverse`: reverses the order of the trail
+
+* `circular`: if set to `yes` or `1`, the trail is made into a loop by
+ making the last page's "next" link point to the first page, and the first
+ page's "previous" link point to the last page
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/version.mdwn b/doc/ikiwiki/directive/version.mdwn
new file mode 100644
index 000000000..f1e2085a7
--- /dev/null
+++ b/doc/ikiwiki/directive/version.mdwn
@@ -0,0 +1,12 @@
+The `version` directive is supplied by the [[!iki plugins/version desc=version]] plugin.
+
+This directive allows inserting the version of ikiwiki onto a page.
+
+Whenever ikiwiki is upgraded to a new version, the page will be rebuilt,
+updating the version number.
+
+Use is simple:
+
+ \[[!version ]]
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/directive/waypoint.mdwn b/doc/ikiwiki/directive/waypoint.mdwn
new file mode 100644
index 000000000..e301f8573
--- /dev/null
+++ b/doc/ikiwiki/directive/waypoint.mdwn
@@ -0,0 +1,6 @@
+The `waypoint` directive is supplied by the [[!iki plugins/osm desc=osm]] plugin.
+
+This directive adds a waypoint to an OpenStreetMap map displayed
+by the [[osm]] directive. See the [[osm]] directive for examples
+and options.
+
diff --git a/doc/ikiwiki/formatting.mdwn b/doc/ikiwiki/formatting.mdwn
new file mode 100644
index 000000000..befbce9aa
--- /dev/null
+++ b/doc/ikiwiki/formatting.mdwn
@@ -0,0 +1,106 @@
+[[!meta title="Formatting wiki pages"]]
+[[!meta robots="noindex, follow"]]
+
+Text on this wiki is, by default, written in a form very close to how you
+might write text for an email message. This style of text formatting is
+called [[MarkDown]], and it works like this:
+
+Leave blank lines between paragraphs.
+
+You can *\*emphasise\** or **\*\*strongly emphasise\*\*** text by placing it
+in single or double asterisks.
+
+To create a list, start each line with an asterisk:
+
+* "* this is my list"
+* "* another item"
+
+To make a numbered list, start each line with a number (any number will
+do) followed by a period:
+
+1. "1. first line"
+2. "2. second line"
+2. "2. third line"
+
+To create a header, start a line with one or more `#` characters followed
+by a space and the header text. The number of `#` characters controls the
+size of the header:
+
+# # h1
+## ## h2
+### ### h3
+#### #### h4
+##### ##### h5
+###### ###### h6
+
+To create a horizontal rule, just write three or more dashes or stars on
+their own line:
+
+----
+
+To quote someone, prefix the quote with ">":
+
+> To be or not to be,
+> that is the question.
+
+To write a code block, indent each line with a tab or 4 spaces:
+
+ 10 PRINT "Hello, world!"
+ 20 GOTO 10
+
+To link to a url or email address, you can just put the
+url in angle brackets: <<http://ikiwiki.info>>, or you can use the
+form \[link text\]\(url\).
+
+----
+
+In addition to basic html formatting using [[MarkDown]], this wiki lets
+you use the following additional features:
+
+* To link to another page on the wiki, place the page's name inside double
+ square brackets. So you would use `\[[WikiLink]]` to link to [[WikiLink]].
+
+[[!if test="enabled(smiley) and smileys" then="""
+* Insert [[smileys]] and some other useful symbols. :-)
+"""]]
+
+[[!if test="enabled(shortcut) and shortcuts" then="""
+* Use [[shortcuts]] to link to common resources.
+
+ \[[!wikipedia War\_of\_1812]]
+"""]]
+
+[[!if test="enabled(template) and templates" then="""
+* Create and fill out [[templates]] for repeated chunks of
+ parameterized wiki text.
+"""]]
+
+* Insert various [[directives|directive]] onto a page to perform useful
+ actions.
+[[!if test="enabled(toc) or enabled(meta) or enabled(inline)" then="""
+ For example, you can:
+"""]]
+
+[[!if test="enabled(toc)" then="""
+ * Add a table of contents to a page:
+
+ \[[!toc]]
+"""]]
+
+
+[[!if test="enabled(meta)" then="""
+ * Change the title of a page:
+
+ \[[!meta title="full page title"]]
+"""]]
+
+[[!if test="enabled(inline)" then="""
+ * Create a blog by inlining a set of pages:
+
+ \[[!inline pages="blog/*"]]
+"""]]
+
+[[!if test="enabled(listdirectives)" then="""
+ Full list of [[directives|directive]] enabled for this wiki:
+ [[!listdirectives ]]
+"""]]
diff --git a/doc/ikiwiki/formatting/discussion.mdwn b/doc/ikiwiki/formatting/discussion.mdwn
new file mode 100644
index 000000000..0a8d6f567
--- /dev/null
+++ b/doc/ikiwiki/formatting/discussion.mdwn
@@ -0,0 +1,20 @@
+The markdown syntax states that emails are written with html entities, but in ikiwiki only one part is encoded as it. For reference see <http://daringfireball.net/projects/markdown/syntax#misc>.
+
+In the HTML page I get this:
+
+ <a href="mailto:XXXXXXXXXX@gmail.com">&#x6D;&#x6D;&#x61;&#x73;&#x73;&#111;n&#110;&#101;&#x74;&#64;&#103;&#109;&#97;i&#108;&#46;&#x63;&#111;&#109;</a>
+
+while the href="" attribute should also be encoded.
+
+--mike
+
+> The htmlscrubber removes entity encoding obfuscation from tag attributes.
+> This has to be done because such entity encoding can be used to hide
+> javascript and other nonsense in html tag attributes. As a consequence,
+> markdown's mail obfuscation is reverted.
+>
+> I don't really see this as a serious issue, because if I were working for
+> a spammer, I would include entity decoding in my web spider that searched
+> for emails. And I could do it easily, as evidenced by the code in the
+> htmlscrubber that does it. So I assume this technique is not very effective
+> at blocking spam. --[[Joey]]
diff --git a/doc/ikiwiki/markdown.mdwn b/doc/ikiwiki/markdown.mdwn
new file mode 100644
index 000000000..684191929
--- /dev/null
+++ b/doc/ikiwiki/markdown.mdwn
@@ -0,0 +1,11 @@
+[[!meta robots="noindex, follow"]]
+[Markdown](http://daringfireball.net/projects/markdown/)
+is a minimal markup language that resembles plain text as used in
+email messages. It is the markup language used by this wiki by default.
+
+For documentation about the markdown syntax, see [[formatting]] and
+[Markdown: syntax](http://daringfireball.net/projects/markdown/syntax).
+
+Note that [[WikiLinks|WikiLink]] and [[directives|directive]] are not part
+of the markdown syntax, and are the only bits of markup that this wiki
+handles internally.
diff --git a/doc/ikiwiki/openid.mdwn b/doc/ikiwiki/openid.mdwn
new file mode 100644
index 000000000..2fa972ede
--- /dev/null
+++ b/doc/ikiwiki/openid.mdwn
@@ -0,0 +1,28 @@
+[[!meta title="OpenID"]]
+[[!meta robots="noindex, follow"]]
+
+[[!if test="enabled(openid)"
+ then="This wiki has OpenID **enabled**."
+ else="This wiki has OpenID **disabled**."]]
+
+[OpenID](http://openid.net) is a decentralized authentication mechanism
+that allows you to have one login that you can use on a growing number of
+websites.
+
+If you have an account with some of the larger web service providers,
+you might already have an OpenID.
+[Directory of OpenID providers](http://openiddirectory.com/openid-providers-c-1.html)
+
+[[!if test="enabled(openid)" then="""
+To sign in to this wiki using OpenID, just enter it in the OpenID field in the
+signin form. You do not need to give this wiki a password or go through any
+registration process when using OpenID.
+"""]]
+
+---
+
+It's also possible to make a page in the wiki usable as an OpenID url
+by delegating it to an openid server. Here's an example of how to do that:
+
+ \[[!meta openid="http://yourid.myopenid.com/"
+ server="http://www.myopenid.com/server"]]
diff --git a/doc/ikiwiki/pagespec.mdwn b/doc/ikiwiki/pagespec.mdwn
new file mode 100644
index 000000000..0f298ad78
--- /dev/null
+++ b/doc/ikiwiki/pagespec.mdwn
@@ -0,0 +1,86 @@
+[[!meta robots="noindex, follow"]]
+To select a set of pages, such as pages that are locked, pages
+whose commit emails you want to subscribe to, or pages to combine into a
+blog, the wiki uses a PageSpec. This is an expression that matches
+a set of pages.
+
+The simplest PageSpec is a simple list of pages. For example, this matches
+any of the three listed pages:
+
+ foo or bar or baz
+
+More often you will want to match any pages that have a particular thing in
+their name. You can do this using a glob pattern. "`*`" stands for any part
+of a page name, and "`?`" for any single letter of a page name. So this
+matches all pages about music, and any [[SubPage]]s of the SandBox, but does
+not match the SandBox itself:
+
+ *music* or SandBox/*
+
+You can also prefix an item with "`!`" to skip pages that match it. So to
+match all pages except for Discussion pages and the SandBox:
+
+ * and !SandBox and !*/Discussion
+
+Some more elaborate limits can be added to what matches using these functions:
+
+* "`glob(someglob)`" - matches pages and other files that match the given glob.
+ Just writing the glob by itself is actually a shorthand for this function.
+* "`page(glob)`" - like `glob()`, but only matches pages, not other files
+* "`link(page)`" - matches only pages that link to a given page (or glob)
+* "`tagged(tag)`" - matches pages that are tagged or link to the given tag (or
+ tags matched by a glob)
+* "`backlink(page)`" - matches only pages that a given page links to
+* "`creation_month(month)`" - matches only files created in the given
+  month number
+* "`creation_day(mday)`" - or day of the month
+* "`creation_year(year)`" - or year
+* "`created_after(page)`" - matches only files created after the given page
+ was created
+* "`created_before(page)`" - matches only files created before the given page
+ was created
+* "`internal(glob)`" - like `glob()`, but matches even internal-use
+ pages that globs do not usually match.
+* "`title(glob)`", "`author(glob)`", "`authorurl(glob)`",
+ "`license(glob)`", "`copyright(glob)`", "`guid(glob)`"
+ - match pages that have the given metadata, matching the specified glob.
+* "`user(username)`" - tests whether a modification is being made by a
+ user with the specified username. If openid is enabled, an openid can also
+ be put here. Glob patterns can be used in the username. For example,
+ to match all openid users, use `user(*://*)`
+* "`admin()`" - tests whether a modification is being made by one of the
+ wiki admins.
+* "`ip(address)`" - tests whether a modification is being made from the
+ specified IP address. Glob patterns can be used in the address. For
+ example, `ip(127.0.0.*)`
+* "`comment(glob)`" - matches comments to a page matching the glob.
+* "`comment_pending(glob)`" - matches unmoderated, pending comments.
+* "`postcomment(glob)`" - matches only when comments are being
+ posted to a page matching the specified glob
+
+For example, to match all pages in a blog that link to the page about music
+and were written in 2005:
+
+ blog/* and link(music) and creation_year(2005)
+
+Note the use of "and" in the above example: it means that only pages
+matching each of the three expressions match the whole. Use "and" when you
+want to combine expressions like that; use "or" when it's enough for a page
+to match one expression. Note that it doesn't make sense to say "index and
+SandBox", since no page can match both expressions.
+
+More complex expressions can also be created, by using parentheses for
+grouping. For example, to match pages in a blog that are tagged with either
+of two tags, use:
+
+ blog/* and (tagged(foo) or tagged(bar))
+
+Note that page names in PageSpecs are matched against the absolute
+filenames of the pages in the wiki, so a pagespec "foo" used on page
+"a/b" will not match a page named "a/foo" or "a/b/foo". To match
+relative to the directory of the page containing the pagespec, you can
+use "./". For example, "./foo" on page "a/b" matches page "a/foo".
+
+To indicate the name of the page the PageSpec is used in, you can
+use a single dot. For example, `link(.)` matches all the pages
+linking to the page containing the PageSpec.
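
The boolean combinations described above can be sketched in Python (an illustration under stated assumptions, not ikiwiki's actual Perl code): plain globs, "and", "or", "!" and parentheses are handled, while the function-style tests such as `link()` or `tagged()` are deliberately not modeled. `eval` is used only because the generated expression contains nothing but booleans and operators.

```python
import fnmatch
import re

def pagespec_match(page, spec):
    """Evaluate a simplified PageSpec against one page name.
    Only plain globs, 'and', 'or', '!' and parentheses are supported."""
    def term(tok):
        # fnmatch.translate gives '*'/'?' glob semantics as an anchored regex
        return repr(bool(re.match(fnmatch.translate(tok), page)))
    out = []
    for tok in re.split(r"(\(|\)|\s+)", spec):
        tok = tok.strip()
        if not tok:
            continue
        if tok in ("and", "or", "(", ")"):
            out.append(tok)
        elif tok.startswith("!"):
            out.append("not " + term(tok[1:]))
        else:
            out.append(term(tok))
    # the joined expression is e.g. "True and not False"
    return eval(" ".join(out))

print(pagespec_match("blog/foo", "blog/* and !blog/*/Discussion"))  # True
```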
diff --git a/doc/ikiwiki/pagespec/attachment.mdwn b/doc/ikiwiki/pagespec/attachment.mdwn
new file mode 100644
index 000000000..fa2bc5867
--- /dev/null
+++ b/doc/ikiwiki/pagespec/attachment.mdwn
@@ -0,0 +1,38 @@
+[[!meta robots="noindex, follow"]]
+[[!if test="enabled(attachment)"
+ then="This wiki has attachments **enabled**."
+ else="This wiki has attachments **disabled**."]]
+
+If attachments are enabled, the wiki admin can control what types of
+attachments will be accepted, via the `allowed_attachments`
+configuration setting.
+
+For example, to limit most users to uploading small images, and nothing else,
+while allowing larger mp3 files to be uploaded by joey into a specific
+directory, and check all attachments for viruses, something like this could be
+used:
+
+ virusfree() and ((user(joey) and podcast/*.mp3 and mimetype(audio/mpeg) and maxsize(15mb)) or (mimetype(image/*) and maxsize(50kb)))
+
+The regular [[ikiwiki/PageSpec]] syntax is expanded with the following
+additional tests:
+
+* "`maxsize(size)`" - tests whether the attachment is no larger than the
+ specified size. The size defaults to being in bytes, but "kb", "mb", "gb"
+ etc can be used to specify the units.
+
+* "`minsize(size)`" - tests whether the attachment is no smaller than the
+ specified size.
+
+* "`ispage()`" - tests whether the attachment will be treated by ikiwiki as a
+  wiki page. (I.e., if it has an extension of ".mdwn", or of any other
+  enabled page format.)
+
+ So, if you don't want to allow wiki pages to be uploaded as attachments,
+  use `!ispage()`; if you only want to allow wiki pages to be uploaded
+ as attachments, use `ispage()`.
+
+* "`mimetype(foo/bar)`" - checks the MIME type of the attachment. You can
+ include a glob in the type, for example `mimetype(image/*)`.
+
+* "`virusfree()`" - checks the attachment with an antiviral program.
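
Parsing the size arguments used by `maxsize()` and `minsize()` could look like this sketch (`parse_size` is a hypothetical helper, and the 1024-based multipliers are an assumption; the actual plugin may define units differently):

```python
def parse_size(spec):
    """Parse a size argument like '15mb' or '50kb' into bytes.
    Bare numbers are bytes; units are assumed to be 1024-based."""
    units = {"b": 1, "kb": 1024, "mb": 1024**2, "gb": 1024**3, "tb": 1024**4}
    spec = spec.strip().lower()
    # check longer suffixes first so 'kb' is not mistaken for 'b'
    for suffix in sorted(units, key=len, reverse=True):
        if spec.endswith(suffix):
            return int(float(spec[:-len(suffix)]) * units[suffix])
    return int(spec)

print(parse_size("50kb"))  # 51200
print(parse_size("15mb"))  # 15728640
```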
diff --git a/doc/ikiwiki/pagespec/attachment/discussion.mdwn b/doc/ikiwiki/pagespec/attachment/discussion.mdwn
new file mode 100644
index 000000000..373242b3f
--- /dev/null
+++ b/doc/ikiwiki/pagespec/attachment/discussion.mdwn
@@ -0,0 +1,15 @@
+Would it be possible to factor out this pagespec
+code so that other plugins can use it without enabling attachments?
+I am interested for [[todo/mbox]] --[[DavidBremner]]
+
+> I can split out all of them except for `ip()` and `user()` easily. I
+> have just changed the code so the rest will test the existing source file
+> if no other filename is specified. Do you have any reason to want to
+> check ip addresses and user names? Not sure what to call the plugin, but
+> breaking it out is easy enough. --[[Joey]]
+
+>> I don't think `ip()` and `user()` necessarily make sense for a mail box
+>> that is already on the disk, so no, I don't think I'll miss
+>> them. --[[DavidBremner]]
+
+>>> Done, [[plugins/filecheck]] --[[Joey]]
diff --git a/doc/ikiwiki/pagespec/discussion.mdwn b/doc/ikiwiki/pagespec/discussion.mdwn
new file mode 100644
index 000000000..2857a98ba
--- /dev/null
+++ b/doc/ikiwiki/pagespec/discussion.mdwn
@@ -0,0 +1,170 @@
+I am using ikiwiki 2.6.1.
+
+I can't figure out the locked pages.
+
+As an admin in preferences, I put in my Locked Pages:
+
+index and downloads
+
+I don't want anyone to be able to edit the front page or my downloads page.
+
+That didn't work. I am using a different web browser as a different non-ikiwiki-admin user.
+
+So I changed it to
+
+/index and /downloads
+
+That stopped me from editing the front page. It didn't say it was locked just repeatedly gave me the ikiwiki login. (How can I get it to tell me it is locked instead?)
+
+I also tried
+
+/index and /downloads/index
+
+But I could still edit my downloads page.
+
+Can someone share some hints on how to lock these two pages?
+
+My source pages for the lock are:
+
+source/downloads.mdwn
+source/index.mdwn
+
+My webpages to lock are:
+
+public\_html/downloads/index.html
+public\_html/index.html
+
+> So I tried again with using "or" instead of "and":
+>
+> index or downloads
+>
+> And that worked. I now get a message saying it is locked and cannot be edited.
+> To me saying "lock both 'index and downloads'" made sense while now it reads like: "lock either 'index or downloads'". Maybe the [[PageSpec]] should define "and" and "or" (beyond the examples it has).
+>
+> Also why did my "/index and /downloads" prevent editing the index by repeatedly showing login webpage?
+>
+> -JeremyReed
+
+>> I've clarified and/or in [[PageSpec]].
+>>
+>> I can't reproduce "/index and /downloads" causing the login webpage to
+>> be shown repeatedly. Sure you weren't having some independent issue with
+>> logging in? --[[Joey]]
+
+----
+
+I have a page for a tag. On that page I want to list every page on my wiki that has been so tagged. Easy enough, right?
+
+> \[[!inline pages="link(Categories/Ikiwiki_Plugins)" feeds=no archive=yes sort=title template=titlepage]]
+
+> (I'm using tagbase => "Categories" because I'm converting from Mediawiki)
+
+This works beautifully in my sandbox: <http://iki.u32.net/sandbox> But it is totally blank on the page where I actually do want output! <http://iki.u32.net/Categories/Ikiwiki_Plugins>
+
+How can I fix this? --[[sabr]]
+
+> I don't see why that wouldn't work. Can I download the source to your
+> wiki from somewhere to investigate? --[[Joey]]
+
+----
+
+Should negation work with user(), with locked_pages in setup? I
+experimented with setting locked_pages => 'user(someuser)' and was able to
+edit as a different user. However, setting locked_pages =>
+'!user(someuser)' doesn't seem to allow edits for only 'someuser' - it
+locks out all users.
+
+> Negation works with anything in any PageSpec. I tested the case you
+> describe, and a negated pagespec worked for me; all users except the
+> listed user (and except wiki admins of course) were locked out.
+> --[[Joey]]
+
+>> It must be a local problem, then, cause I've tried it with two separate
+>> machines. Both are running the most recent release of ikiwiki in
+>> pkgsrc - 2.66. Perhaps an update to a newer version would solve the issue.
+
+----
+
+Is there a way to refer to all subpages of the current page, if the name of the
+current page is not known (i.e. the pagespec is used in a template)? The ./ syntax
+does not seem suitable for this, as
+
+> \[[!map pages="./*"]]
+
+also lists the current page and all its siblings.
+
+---
+
+I am a little lost. I want to match the start page `/index.mdwn`. So I use
+
+ \[[!inline pages="/index"]]
+
+which does not work though. I also tried it in this Wiki. Just take a look at the end of the [[SandBox|sandbox]]. --[[PaulePanter]]
+
+> Unlike wikilinks, pagespecs match relative to the top of the wiki by
+> default. So lose the "/" and it will work. --[[Joey]]
+
+
+----
+
+I'd like to create a collapsible tree with a map, e.g.
+
+* top level item
+
+* * second level item1
+
+* * * content 1
+
+* * * content 2
+
+* * * content 3
+
+* * second level item2
+
+* * second level item3
+
+
+but I can't work out how to specify "all items at this level, and all directories at ../, and all the other directories up to the root".
+
+any ideas?
+
+> I don't think pagespecs currently support that. How would such a made-up
+> pagespec look? I can imagine it supporting something like `glob(../*) and not
+> glob(../*/*)` to match all "directories" of the parent page, and so on up
+> to the root. --[[Joey]]
+
+>> I don't know, perhaps some way of nesting pagespecs
+>>> glob(../* unless $_ eq 'second level item'{ glob 'second level item'/*})
+
+>> but that could get messy, perhaps a new cmd 'pagetree' or something
+>> might be better? --Colin
+
+>>> You could probably do a lot worse than stealing terminology from
+>>> [XPath Axes](http://www.w3.org/TR/xpath/#axes),
+>>> passing the "argument" through `bestlink` if there is one, and
+>>> treating an empty argument as "this page", something like:
+>>>
+>>> * `ancestor(/plugins/contrib/album)` matches `plugins` or
+>>> `plugins/contrib`
+>>> but not `plugins/map` or `plugins/contrib/album`
+>>> (does it match `index`? answers on a postcard)
+>>> * `descendant(/plugins)` is basically `plugins/*`
+>>> * `child(/plugins)` is basically `plugins/* and !plugins/*/*`
+>>> * `self(/plugins)` is just `plugins` but without interpreting
+>>> globs
+>>> * `ancestor-or-self(/plugins)`, `descendant-or-self(/plugins)`
+>>> are syntactic sugar for e.g. `ancestor(/plugins) or self(/plugins)`
+>>> * `self()` always matches the current page (not destpage)
+>>> * `ancestor-or-self()` always matches the current pages and all
+>>> pages that would go in its [[plugins/parentlinks]]
+>>>
+>>> XPath has `following-sibling` and `preceding-sibling` axes for
+>>> siblings, but pagespecs are unordered, so we'd probably want
+>>> to invent `sibling()` - so `sibling(/plugins/map)` matches
+>>> `plugins/inline` but not `plugins/map` or `plugins/contrib/album`.
+>>>
+>>> Then, the requested functionality would be `sibling() or ancestor()`,
+>>> or possibly `sibling() or ancestor() or self()`?
+>>> --[[smcv]]
+
+>>>> I like that idea! --[[KathrynAndersen]]
diff --git a/doc/ikiwiki/pagespec/po.mdwn b/doc/ikiwiki/pagespec/po.mdwn
new file mode 100644
index 000000000..f9956404c
--- /dev/null
+++ b/doc/ikiwiki/pagespec/po.mdwn
@@ -0,0 +1,23 @@
+[[!if test="enabled(po)"
+ then="This wiki has po support **enabled**."
+ else="This wiki has po support **disabled**."]]
+
+If the [[!iki plugins/po desc=po]] plugin is enabled, the regular
+[[ikiwiki/PageSpec]] syntax is expanded with the following additional
+tests that can be used to improve user navigation in a multi-lingual
+wiki:
+
+* "`lang(LL)`" - tests whether a page is written in the language
+  specified as an ISO 639-1 (two-letter) language code.
+* "`currentlang()`" - tests whether a page is written in the same
+ language as the current page.
+* "`needstranslation()`" - tests whether a page needs translation
+ work. Only slave pages match this PageSpec. A minimum target
+ translation percentage can optionally be passed as an integer
+ parameter: "`needstranslation(50)`" matches only pages less than 50%
+ translated.
+
+Note that every non-po page is considered to be written in
+`po_master_language`, as specified in `ikiwiki.setup`.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/ikiwiki/pagespec/sorting.mdwn b/doc/ikiwiki/pagespec/sorting.mdwn
new file mode 100644
index 000000000..0c6cc74c7
--- /dev/null
+++ b/doc/ikiwiki/pagespec/sorting.mdwn
@@ -0,0 +1,30 @@
+Some [[directives|ikiwiki/directive]] that use
+[[PageSpecs|ikiwiki/pagespec]] allow
+specifying the order that matching pages are shown in. The following sort
+orders can be specified.
+
+* `age` - List pages from the most recently created to the oldest.
+
+* `mtime` - List pages with the most recently modified first.
+
+* `title` - Order by title (page name), e.g. "z/a a/b a/c"
+
+* `path` - Order by page name including parents, e.g. "a/b a/c z/a"
+[[!if test="enabled(sortnaturally)" then="""
+* `title_natural` - Orders by title, but numbers in the title are treated
+  as such ("1 2 9 10 20" instead of "1 10 2 20 9")
+
+* `path_natural` - Like `path`, but numbers in the title are treated as such
+"""]]
+[[!if test="enabled(meta)" then="""
+* `meta(title)` - Order according to the `\[[!meta title="foo" sortas="bar"]]`
+ or `\[[!meta title="foo"]]` [[ikiwiki/directive]], or the page name if no
+ full title was set. `meta(author)`, `meta(date)`, `meta(updated)`, etc.
+ also work.
+"""]]
+
+In addition, you can combine several sort orders and/or reverse the order of
+sorting, with a string like `age -title` (which would sort by age, then by
+title in reverse order if two pages have the same age).
+
+[[!meta robots="noindex, follow"]]
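
Combined sort orders like `age -title` behave like a multi-key comparison, as in this Python sketch (the page records and `sort_pages` helper are hypothetical; in ikiwiki, `age` sorts on creation time):

```python
from functools import cmp_to_key

# hypothetical page records; 'ctime' stands in for creation time
pages = [
    {"page": "b", "ctime": 200},
    {"page": "a", "ctime": 200},
    {"page": "c", "ctime": 100},
]

def sort_pages(pages, spec):
    """Sort by a spec like 'age -title': later keys only break ties
    left by earlier ones, and a '-' prefix reverses that key."""
    def compare(x, y):
        for term in spec.split():
            reverse = term.startswith("-")
            key = term.lstrip("-")
            if key == "age":  # most recently created first
                c = (y["ctime"] > x["ctime"]) - (y["ctime"] < x["ctime"])
            elif key == "title":
                c = (x["page"] > y["page"]) - (x["page"] < y["page"])
            else:
                raise ValueError("unknown sort order: " + term)
            if reverse:
                c = -c
            if c:
                return c
        return 0
    return sorted(pages, key=cmp_to_key(compare))

# 'b' and 'a' tie on age, so the reversed title comparison puts 'b' first
print([p["page"] for p in sort_pages(pages, "age -title")])  # ['b', 'a', 'c']
```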
diff --git a/doc/ikiwiki/searching.mdwn b/doc/ikiwiki/searching.mdwn
new file mode 100644
index 000000000..4c1287933
--- /dev/null
+++ b/doc/ikiwiki/searching.mdwn
@@ -0,0 +1,20 @@
+[[!meta robots="noindex, follow"]]
+
+[[!if test="enabled(search)"
+then="This wiki has searching **enabled**."
+else="This wiki has searching **disabled**."]]
+
+If searching is enabled, you can enter search terms in the search field,
+as you'd expect. There are a few special things you can do to construct
+more powerful searches.
+
+* To match a phrase, enclose it in double quotes.
+* `AND` can be used to search for documents containing two expressions.
+* `OR` can be used to search for documents containing either one of
+ two expressions.
+* Parentheses can be used to build up complicated search expressions. For
+ example, "(foo AND bar) OR (me AND you)"
+* Prefix a search term with "-" to prevent pages containing it from
+  appearing in the results. For example, "-discussion" will omit pages
+  containing "discussion".
+* To search for a page with a given title, use "title:foo".
+* To search for pages that contain a "bar" link, use "link:bar".
diff --git a/doc/ikiwiki/subpage.mdwn b/doc/ikiwiki/subpage.mdwn
new file mode 100644
index 000000000..862f45ec1
--- /dev/null
+++ b/doc/ikiwiki/subpage.mdwn
@@ -0,0 +1,12 @@
+[[!meta robots="noindex, follow"]]
+ikiwiki supports placing pages in a directory hierarchy. For example,
+this page, [[SubPage]] has some related pages placed under it, like
+[[SubPage/LinkingRules]]. This is a useful way to add some order to your
+wiki rather than just having a great big directory full of pages.
+
+To add a SubPage, just make a subdirectory and put pages in it. For
+example, this page is subpage.mdwn in this wiki's source, and there is also
+a subpage subdirectory, which contains subpage/linkingrules.mdwn. Subpages
+can be nested as deeply as you'd like.
+
+Linking to and from a SubPage is explained in [[LinkingRules]].
diff --git a/doc/ikiwiki/subpage/linkingrules.mdwn b/doc/ikiwiki/subpage/linkingrules.mdwn
new file mode 100644
index 000000000..e547f3090
--- /dev/null
+++ b/doc/ikiwiki/subpage/linkingrules.mdwn
@@ -0,0 +1,33 @@
+[[!meta robots="noindex, follow"]]
+To link to or from a [[SubPage]], you can normally use a regular
+[[WikiLink]] that does not contain the name of the parent directory of
+the [[SubPage]]. Ikiwiki descends the directory hierarchy looking for a
+page that matches your link.
+
+For example, if FooBar/SubPage links to "OtherPage", ikiwiki will first
+prefer pointing the link to FooBar/SubPage/OtherPage if it exists, next
+to FooBar/OtherPage and finally to OtherPage in the root of the wiki.
+
+Note that this means that if a link on FooBar/SomePage to "OtherPage"
+currently links to OtherPage, in the root of the wiki, and FooBar/OtherPage
+is created, the link will _change_ to point to FooBar/OtherPage. On the
+other hand, a link from BazBar to "OtherPage" would be unchanged by this
+creation of a [[SubPage]] of FooBar.
+
+You can also specify a link that contains a directory name, like
+"FooBar/OtherPage" to more exactly specify what page to link to. This is
+the only way to link to an unrelated [[SubPage]].
+
+You can use this, for example, to link from BazBar to "FooBar/SubPage",
+or from BazBar/SubPage to "FooBar/SubPage".
+
+You can also use "/" at the start of a link, to specify exactly which page
+to link to, when there are multiple pages with similar names and the link
+goes to the wrong page by default. For example, linking from
+"FooBar/SubPage" to "/OtherPage" will link to the "OtherPage" in the root
+of the wiki, even if there is a "FooBar/OtherPage".
+
+Also, if the wiki is configured with a userdir, you can link to pages
+within the userdir without specifying a path to them. This is to allow for
+easy linking to a user's page in the userdir, for example to sign a
+comment. These links are checked last of all.
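
The resolution order above can be sketched in Python (a simplified model, not ikiwiki's actual Perl `bestlink`; the function and its parameters are hypothetical):

```python
def bestlink(page, link, all_pages, userdir=None):
    """Resolve a WikiLink per the rules above: try the link as a subpage
    of the current page, then under each parent directory up to the wiki
    root, and check a userdir last of all. A leading '/' forces
    resolution from the root of the wiki."""
    if link.startswith("/"):
        target = link.lstrip("/")
        return target if target in all_pages else None
    cwd = page
    while True:
        candidate = cwd + "/" + link if cwd else link
        if candidate in all_pages:
            return candidate
        if not cwd:
            break
        cwd = cwd.rsplit("/", 1)[0] if "/" in cwd else ""
    if userdir and userdir + "/" + link in all_pages:
        return userdir + "/" + link
    return None  # no such page; ikiwiki would offer to create it

pages = {"FooBar/SubPage/OtherPage", "FooBar/OtherPage", "OtherPage"}
print(bestlink("FooBar/SubPage", "OtherPage", pages))
# FooBar/SubPage/OtherPage
```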
diff --git a/doc/ikiwiki/wikilink.mdwn b/doc/ikiwiki/wikilink.mdwn
new file mode 100644
index 000000000..cf3b89c76
--- /dev/null
+++ b/doc/ikiwiki/wikilink.mdwn
@@ -0,0 +1,29 @@
+[[!meta robots="noindex, follow"]]
+WikiLinks provide easy linking between pages of the wiki. To create a
+[[WikiLink]], just put the name of the page to link to in double brackets.
+For example `\[[WikiLink]]`.
+
+If you ever need to write something like `\[[WikiLink]]` without creating a
+wikilink, just prefix it with a `\`, like `\\[[WikiLink]]`.
+
+There are some special [[SubPage/LinkingRules]] that come into play when
+linking between [[SubPages|SubPage]].
+
+WikiLinks are matched with page names in a case-insensitive manner, so you
+don't need to worry about getting the case the same, and can capitalise
+links at the start of a sentence, and so on.
+
+It's also possible to write a WikiLink that uses something other than the page
+name as the link text. For example `\[[foo_bar|SandBox]]` links to the SandBox
+page, but the link will appear like this: [[foo_bar|SandBox]].
+
+To link to an anchor inside a page, you can use something like
+`\[[WikiLink#foo]]` .
+
+If the file linked to by a WikiLink looks like an image, it will
+be displayed inline on the page.
+
+---
+
+You can also put an url in a WikiLink, to link to an external page.
+Email addresses can also be used to generate a mailto link.
diff --git a/doc/ikiwiki/wikilink/discussion.mdwn b/doc/ikiwiki/wikilink/discussion.mdwn
new file mode 100644
index 000000000..58bb8ca06
--- /dev/null
+++ b/doc/ikiwiki/wikilink/discussion.mdwn
@@ -0,0 +1,93 @@
+# Creating an [[anchor]] in Markdown
+
+Is it a native Markdown "tag" for creating an [[anchor]]? Unfortunately,
+I haven't any information about it at
+[Markdown syntax](http://daringfireball.net/projects/markdown/syntax) page.
+
+Of course, I know that I can use HTML tag to do it,
+for example &lt;a name="foo" /&gt;, but I don't want to mix Markdown
+and HTML code if it's not necessary.
+
+BTW, ikiwiki doesn't display the #foo anchor in the example
+("To link to an anchor inside a page...") at [[WikiLink]] page...
+
+--[[Paweł|ptecza]]
+
+> No such syntax exists in markdown. ikiwiki could certainly have a
+> [[preprocessor_directive|directive]] for it, though.
+> --[[JoshTriplett]]
+
+>> [[!tag wishlist]]
+>> I'd like to implement such a thing. Joey, what is this supposed to look like?
+>> `\[[anchor WHATEVER]]`? --[[tschwinge]]
+
+>>> Why would you want to use a preprocessor directive for something that can
+>>> be more shortly and clearly done with plain HTML? Markdown is *designed*
+>>> to be intermixed with HTML. --[[Joey]]
+
+>>>> I tend to disagree.
+>>>> It just doesn't feel right for me to put HTML code straight into Markdown files.
+>>>>
+>>>> Quoting <http://daringfireball.net/projects/markdown/>:
+>>>>
+>>>>> The idea is that a Markdown-formatted document should be publishable as-is, as plain text, *without looking like it’s been marked up with tags or formatting instructions*.
+>>>>
+>>>> Also, in theory at least, Markdown might also support output formats other than HTML.
+>>>> Those wouldn't know about how to deal with the intermingled HTML code.
+>>>>
+>>>> --[[tschwinge]]
+>>>>> Not sure \[[anchor WHATEVER]] looks any better than &lt;a name="WHATEVER"&gt;...? --[[sabr]]
+
+> The lack of the `#foo` anchor in the anchor example on [[wikilink]]
+> definitely looks like a bug. --[[JoshTriplett]]
+
+>> Fixed that --[[Joey]]
+
+>>> Sorry to bring this up (1 year since the last change on this page), but what is the status of this? Can we use anchors like it's described in [[wikilink]]? This discussion is tagged as wishlist, but isn't listed in [[wishlist]]. What is "fixed that" then? Again, sorry if this is a dead issue, but I have a tendency to create big wiki pages, and anchors are very much needed for me, but I've spent all morning trying to make it work and it just doesn't. TY. --rbern
+
+The 'name' attribute of the 'a' element is a deprecated way to create a named anchor. The right way to do that is to use the 'id' attribute of any element. This is because an anchor may refer to a complete element rather than some point in the page.
+
+Standard purity aside, if you define an anchor (using either 'a name' or 'id') at a single point in the document but refer to a complete section, the browser may just show that specific point at the bottom of the page rather than trying to show the whole section.
+--[[tzafrir]]
+
+---
+
+Considering a hierarchy like `foo/bar/bar`, I had the need to link from the
+`foo/bar/bar` page to the `foo/bar` one. It would have been convenient to
+simply write [[wikilink]]s like `\[[../bar]]` (or even just `\[[..]]`?), but
+this doesn't work, so I had to resort to using `\[[foo/bar]]` instead.
+--[[tschwinge]]
+
+> I believe that doesn't entirely solve the problem. Just assume your hierarchy is `/foo/bar/foo/bar`.
+
+> How do you access from the page `/foo/bar/foo/bar` the `/foo/bar` and not `/foo/bar/foo/bar`?
+
+> Do we have a way to implement `\[[../..]]` or `\[[/foo/bar]]`?
+
+> Even worse, trying to link from `/foo/bar` to `/foo/bar/foo/bar` ... this will probably need `\[[./foo/bar]]` --[[Jan|jwalzer]]
+
+>> There is no ".." syntax in wikilinks, but if the link begins with "/" it
+>> is rooted at the top of the wiki, as documented in
+>> [[subpage/linkingrules]]. Therefore, every example page name you listed
+>> above will work unchanged as a wikilink to that page! --[[Joey]]
+
+----
+
+How do I make images clickable? The obvious guess, \[[foo.png|/index]], doesn't work. --[[sabr]]
+
+> You can do it using the img plugin. The syntax you suggested would be ambiguous,
+> as there's no way to tell if the text is meant to be an image or displayed as-is.
+> --[[Joey]]
+
+----
+
+Is it possible to refer to a page, say \[[foobar]], such that the link text is taken from foobar's title [[directive/meta]] tag? --Peter
+
+> Not yet. :-) Any suggestion for a syntax for it? Maybe something like \[[|foobar]] ? --[[Joey]]
+
+I like your suggestion because it's short and concise. However, it would be nice to be able to refer to more or less arbitrary meta tags in links, not just "title". To do that, the link needs two parameters: the page name and the tag name, i.e. \[[pagename!metatag]]. Any sufficiently weird separator can be used instead of '!', of course. I like \[[pagename->metatag]], too, because it reminds me of accessing a data member of a structure (which is what referencing a meta tag is, really). --Peter
+
+> I dislike \[[pagename->metatag]] because other wikis use that as their normal link/label syntax.
+> I'm not sure that it is a good idea to refer to arbitrary meta tags in links in the first place - what other meta tags would you really be interested in? Description? Author? It makes sense to me to refer to the title, because that is a "label" for a page.
+> As for syntax, I do like the \[[|foobar]] idea, or perhaps something like what <a href="http://www.pmwiki.org">PmWiki</a> does - they have their links the other way around, so they go \[[page|label]] and for link-text-as-title, they have \[[page|+]]. So for IkiWiki, that would be \[[+|page]] I guess.
+> --[[KathrynAndersen]]
diff --git a/doc/ikiwikiusers.mdwn b/doc/ikiwikiusers.mdwn
new file mode 100644
index 000000000..eda6e3ab0
--- /dev/null
+++ b/doc/ikiwikiusers.mdwn
@@ -0,0 +1,198 @@
+General information
+===================
+
+Feel free to add your own ikiwiki site! If you have created a custom theme, consider adding it to [the theme list](http://ikiwiki.info/themes/).
+
+See also: [Debian ikiwiki popcon graph](http://qa.debian.org/popcon.php?package=ikiwiki)
+and [google search for ikiwiki powered sites](http://www.google.com/search?q=%22powered%20by%20ikiwiki%22).
+
+While nothing makes me happier than knowing that ikiwiki has happy users,
+dropping some change in the [[TipJar]] is a nice way to show extra
+appreciation.
+
+Ikiwiki Hosting
+===============
+
+* [Branchable](http://branchable.com/)
+
+Projects & Organizations
+========================
+
+* [This wiki](http://ikiwiki.info) (of course!)
+* [NetBSD wiki](http://wiki.netbsd.org)
+* The [GNU Hurd](http://www.gnu.org/software/hurd/)
+* [DragonFly BSD](http://www.dragonflybsd.org/)
+* [Monotone](http://wiki.monotone.ca/)
+* The [Free Software Foundation](http://fsf.org) uses it for their internal wiki, with subversion.
+* The [cairo graphics library](http://cairographics.org/) website.
+* The [Portland State Aerospace Society](http://psas.pdx.edu) website. Converted from a combination of TWiki and MoinMoin to ikiwiki, including full history ([[rcs/Git]] backend).
+* [Planet Debian upstream](http://updo.debian.net/)
+* [Debian Mentors wiki](http://jameswestby.net/mentors/)
+* [The BSD Associate Admin Book Project](http://bsdwiki.reedmedia.net/)
+* The [maildirman wiki](http://svcs.cs.pdx.edu/maildirman)
+* The [Relativistic Programming research wiki](http://wiki.cs.pdx.edu/rp).
+* [Debian-IN](http://debian-in.alioth.debian.org/)
+* [Braawi Ltd](http://braawi.com/) and the community site [Braawi.org](http://braawi.org/)
+* [Webconverger](http://webconverger.org/) (a Web only linux distribution) with a [blog](http://webconverger.org/blog/)
+* [DebTorrent](http://debtorrent.alioth.debian.org)
+* The [Debian Packaging Handbook project](http://packaging-handbook.alioth.debian.org/wiki/)
+* The [libkdtree project](http://libkdtree.alioth.debian.org)
+* The [pcc](http://pcc.ludd.ltu.se/) (Portable C Compiler) project. (Simple rcs backend)
+* [The TOVA Company](http://www.tovatest.com) public site. We also use it for internal documentation and issue tracking, all with a [[rcs/Git]] backend.
+* Reusable technical support websites, developed for [Redpill](http://redpill.dk/) realms:
+ * [master demo site](http://support.redpill.dk/) ([source](http://source.redpill.dk/))
+ * [Homebase](http://support.homebase.dk/) ([source](http://source.homebase.dk/))
+ * [Bitbase](http://support.bitbase.dk/) ([source](http://source.bitbase.dk/))
+ * [Børneuniversitetet](http://support.borneuni.dk/) ([source](http://source.borneuni.dk/))
+* [CampusGrün Hamburg](http://www.campusgruen.org/)
+* The [awesome window manager homepage](http://awesome.naquadah.org/)
+* [vcs-pkg](http://vcs-pkg.org)
+* [vcs-home](http://vcs-home.madduck.net)
+* [Public Domain collection of Debian related tips & tricks](http://dabase.com/tips/) - please add any tips too
+* [Finnish Debian community](http://debian.fi)
+* [INCL intranuclear cascade and ABLA evaporation/fission](http://www.cs.helsinki.fi/u/kaitanie/incl/)
+* [dist-bugs](http://dist-bugs.kitenet.net/)
+* [Chaos Computer Club Düsseldorf](https://www.chaosdorf.de)
+* [monkeysphere](http://web.monkeysphere.info/)
+* [St Hugh of Lincoln Catholic Primary School in Surrey](http://www.sthugh-of-lincoln.surrey.sch.uk/)
+* [Cosin Homepage](http://cosin.ch) uses an Ikiwiki with a subversion repository.
+* [Bosco Free Orienteering Software](http://bosco.durcheinandertal.ch)
+* [MIT Student Information Processing Board](http://sipb.mit.edu/)
+* [Tinc VPN](http://tinc-vpn.org/)
+* [The XCB library](http://xcb.freedesktop.org/)
+* [The Philolexian Society of Columbia University](http://www.columbia.edu/cu/philo/)
+* [Fachschaft Informatik HU Berlin](http://fachschaft.informatik.hu-berlin.de/)
+* [Wetknee Books](http://www.wetknee.com/)
+* [IPOL Image Processing On Line](http://www.ipol.im)
+* [Debian Costa Rica](http://cr.debian.net/)
+* [Fvwm Wiki](http://fvwmwiki.xteddy.org)
+* [Serialist](http://serialist.net/)'s static pages (documentation, blog). We actually have ikiwiki generate its static content as HTML fragments using a modified page.tmpl template, and then the FastCGI powering our site grabs those fragments and embeds them in the standard dynamic site template.
+* [Apua IT](http://apua.se/)
+* [PDFpirate Community](http://community.pdfpirate.org/)
+* [Software in the Public Interest](http://spi-inc.org/)
+* [NXT Improved Firmware](http://nxt-firmware.ni.fr.eu.org/)
+* [The FreedomBox Foundation](http://www.freedomboxfoundation.org/)
+* [TenderWarehouse Community](http://community.tenderwarehouse.org/)
+* [AntPortal](http://antportal.com/wiki/) - also see our templates and themes on [github](https://github.com/AntPortal/ikiwiked)
+* [The Amnesic Incognito Live System](https://tails.boum.org/index.en.html)
+* [The Progress Linux OS wiki](http://wiki.progress-linux.org/)
+* [Oxford Computer Society](http://www.ox.compsoc.net/)
+* [Russian OpenBSD Community wiki](http://wiki.openbsd.ru/)
+* [Arcada Project](http://arcadaproject.org/)
+* [*BSD UNIX user group in Denmark](http://www.bsd-dk.dk/)
+* [Telecomix Broadcast System](http://broadcast.telecomix.org/)
+* [WikiMIX.cc](http://WikiMIX.cc/)
+* Paris Observatory [Information System website](http://dio.obspm.fr/), also used for internal documentation
+* [SolderPad Documentation](http://docs.solderpad.com)
+* [The Open TV White Space Project](http://opentvws.org)
+* [The RS-232 Club](http://rs232club.org)
+* [FusionInventory project](http://www.fusioninventory.org)
+* FabLab Deventer i.o.
+* [Börn og tónlist](http://bornogtonlist.net/) - an Icelandic open-content site, primarily for kindergarten teachers, about music and music-related activities with children. Migrated from MediaWiki to IkiWiki in June 2013. Heavily changed appearance with only minimal changes to page.tmpl. Also its sister site [Leikur að bókum](http://leikuradbokum.net), about children's books in a kindergarten/pre-school context.
+* [Réseaulibre.ca](http://wiki.reseaulibre.ca) - a mesh project in Montréal, most data is stored in the wiki, including IP address allocation and geographic data. Features map ([[plugins/osm]]) integration.
+* [Foufem](http://foufem.orangeseeds.org/) - Foufem, a feminist hackerspace
+
+Personal sites and blogs
+========================
+
+* [[Joey]]'s [homepage](http://kitenet.net/~joey/), including his weblog
+* [Kyle's MacLea Genealogy wiki](http://kitenet.net/~kyle/family/wiki) and [Livingstone and MacLea Emigration Registry](http://kitenet.net/~kyle/family/registry)
+* [Ulrik's personal web page](http://kaizer.se/wiki/)
+* [kite](http://kitenet.net)
+* [Paul Collins's as-yet purposeless wiki](http://wiki.ondioline.org/)
+* [Alessandro Dotti Contra's personal website](http://www.dotticontra.org/) and [weblog](http://www.dotticontra.org/blog)
+* [Kelly Clowers' personal website](http://www.clowersnet.net/)
+* [Anna's nature features](http://kitenet.net/~anna/nature-feature/)
+* [Roland Mas's blog](http://roland.entierement.nu/categories/geek-en.html)
+* [Sergio Talens-Oliag's personal wiki](http://mixinet.net/~sto/) and [blog](http://mixinet.net/~sto/blog.html)
+* [Christian Aichinger's homepage](http://greek0.net/)
+* Ben A'Lee's [homepage](http://subvert.org.uk/~bma/) and [wiki](http://wiki.subvert.org.uk/).
+* [Adam Shand's homepage](http://adam.shand.net/iki/)
+* [Hess family wiki](http://kitenet.net/~family/)
+* [Zack](http://upsilon.cc/~zack)'s homepage, including [his weblog](http://upsilon.cc/~zack/blog/)
+* [betacantrips, the personal website of Ethan Glasser-Camp](http://www.betacantrips.com/)
+* [Keith Packard's homepage and blog](http://keithp.com/).
+* [Christian Mock's homepage](http://www.tahina.priv.at/).
+* [Choffee](http://choffee.co.uk/).
+* [Tales from the Gryphon](http://www.golden-gryphon.com/blog/manoj/), Manoj Srivastava's free software blog.
+* [Proper Treatment 正當作法](http://conway.rutgers.edu/~ccshan/wiki/)
+* [lost scraps](http://web.mornfall.net), pages/blog of Petr Ročkai aka mornfall
+* [Schabis blaue Seite](http://schabi.de) - I abuse ikiwiki as blog/cms combo, and will migrate all existing content into ikiwiki eventually.
+* [blog of LukClaes](http://zomers.be/~luk/blog/).
+* [Embedded Moose](http://embeddedmoose.com), Andrew Greenberg's personal and consulting page.
+* [Cameron Dale](http://www.camrdale.org/)
+* [[KarlMW]]'s [homepage](http://mowson.org/karl/), generated with an ikiwiki
+ [asciidoc plugin](http://mowson.org/karl/colophon/).
+* [Carl Worth's Boring Web Pages](http://www.cworth.org)
+* [[NicolasLimare]] ([nil](http://poivron.org/~nil/)+[lab](http://www.ann.jussieu.fr/~limare/)+[id](http://nicolas.limare.net/)+[french translation of the basewiki](http://poivron.org/~nil/ikiwiki-fr/))
+* Andrew Sackville-West has setup a [family wiki](http://wiki.swclan.homelinux.org)
+* [Simon Ward's site](http://bleah.co.uk/) and [blog](http://bleah.co.uk/blog/).
+* [Paul Wise's homepage and blog](http://bonedaddy.net/pabs3/)
+* [Martin's PhD wiki](http://phd.martin-krafft.net/wiki)
+* [David Riebenbauer's page](http://liegesta.at/)
+* [Thomas Harning's 'eHarning' wiki](http://www.eharning.us/wiki/)
+* [madduck's (new) homepage](http://madduck.net)
+* [Olivier Berger's professional homepage](http://www-public.it-sudparis.eu/~berger_o/)
+* [Don Marti's blog](http://zgp.org/~dmarti/)
+* [[users/Jon]]'s [homepage](http://jmtd.net/)
+* [[JanWalzer|jwalzer]]'s [homepage](http://wa.lzer.net/) -- Work in Progress
+* [[Adam_Trickett|ajt]]'s home intranet/sandbox system ([Internet site & blog](http://www.iredale.net/) -- not ikiwiki yet)
+* [[Simon_McVittie|smcv]]'s [website](http://www.pseudorandom.co.uk/) and
+ [blog](http://smcv.pseudorandom.co.uk/)
+* Svend's [website](http://ciffer.net/~svend/) and [blog](http://ciffer.net/~svend/blog/)
+* [muammar's site](http://muammar.me)
+* [Per Bothner's blog](http://per.bothner.com/blog/)
+* [Bernd Zeimetz (bzed)](http://bzed.de/)
+* [Gaudenz Steinlin](http://gaudenz.durcheinandertal.ch)
+* [NeoCarz Wiki](http://www.neocarz.com/wiki/) Yes, it's actually Ikiwiki behind that! I'm using Nginx and XSL to transform the ikiwiki renderings, thanks to the valid XHTML output of ikiwiki. Great work Joey!!
+* [Natalian - Kai Hendry's personal blog](http://natalian.org/)
+* [Mick Pollard aka \_lunix_ - Personal sysadmin blog and wiki](http://www.lunix.com.au)
+* [Skirv's Wiki](http://wiki.killfile.org) - formerly Skirv's Homepage
+* [Jimmy Tang - personal blog and wiki](http://www.sgenomics.org/~jtang)
+* [Nico Schottelius' homepage](http://www.nico.schottelius.org)
+* [Andreas Zwinkau's homepage](http://beza1e1.tuxen.de)
+* [Walden Effect](http://waldeneffect.org)
+* [Avian Aqua Miser](http://www.avianaquamiser.com/)
+* [Cosmic Cookout](http://www.cosmiccookout.com/)
+* [Backyard Deer](http://www.backyarddeer.com/)
+* [Alex Ghitza homepage and blog](http://aghitza.org/)
+* [Andreas's homepage](http://0x7.ch/) - Ikiwiki, Subversion and CSS template
+* [Chris Dombroski's boring bliki](https://www.icanttype.org/)
+* [Josh Triplett's homepage](http://joshtriplett.org/) - Git backend with the CGI disabled, to publish a static site with the convenience of ikiwiki.
+* [Jonatan Walck](http://jonatan.walck.i2p/) a weblog + wiki over [I2P](http://i2p2.de/). Also [mirrored](http://jonatan.walck.se/) to the Internet a few times per day.
+* [Daniel Wayne Armstrong](http://circuidipity.com/)
+* [Mukund](https://mukund.org/)
+* [Nicolas Schodet](http://ni.fr.eu.org/)
+* [weakish](http://weakish.github.com)
+* [Thomas Kane](http://planetkane.org/)
+* [Marco Silva](http://marcot.eti.br/) a weblog + wiki using the [darcs](http://darcs.net) backend
+* [NeX-6](http://nex-6.taht.net/) ikiwiki blog and wiki running over ipv6
+* [Jason Riedy](http://lovesgoodfood.com/jason/), which may occasionally look funny if I'm playing with my branch...
+* [pmate](http://www.gnurant.org)'s homepage and [blog](http://www.gnurant.org/blog/)
+* [tychoish.com](http://tychoish.com/) - a blog/wiki mashup. blog posts are "rhizomes."
+* [Martin Burmester](http://www.martin-burmester.de/)
+* [Øyvind A. Holm (sunny256)](http://www.sunbase.org) — Read my Ikiwiki praise [here](http://www.sunbase.org/blog/why_ikiwiki/).
+* [Mirco Bauer (meebey)](http://www.meebey.net/)
+* [Richard "RichiH" Hartmann](http://richardhartmann.de/blog) - I thought I had added myself a year ago. Oops :)
+* [Jonas Smedegaard](http://dr.jones.dk/) multilingual "classic" website w/ blog
+* [Siri Reiter](http://sirireiter.dk/) portfolio website with a blog (in Danish)
+* [L'Altro Wiki](http://laltromondo.dynalias.net/~iki/) Tutorials, reviews, miscellaneous articles in English and Italian.
+* gregoa's [p.r. - political rants](http://info.comodo.priv.at/pr/)
+* [Michael Hammer](http://www.michael-hammer.at/)
+* [Richardson Family Wiki](http://the4richardsons.com) A wiki, blog or some such nonsense for the family home page or something or other... I will eventually move the rest of my sites to ikiwiki. The source of the site is in git.
+* [The personal website of Andrew Back](http://carrierdetect.com)
+* [Paul Elms](http://paul.elms.pro) Personal site and blog in Russian.
+* [James' Tech Notes](http://jamestechnotes.com) My technical notes, blog, wiki, personal site.
+* [Salient Dream](http://www.salientdream.com/) - All Things Strange.
+* [Kafe-in.net](https://www.kafe-in.net/) Ugly personal blog.
+* [Anton Berezin's blog](http://blog.tobez.org/)
+* [Waldgarten](http://waldgarten.greenonion.org/) News and documentation of a permaculture-inspired neighbourhood garden located in Hamburg, Germany.
+* [[OscarMorante]]'s [personal site](http://oscar.morante.eu).
+* [Puckspage](http://www.puckspage.org/) Political and personal blog in German. The name comes from the elf in A Midsummer Night's Dream.
+* [[LucaCapello]]'s [homepage](http://luca.pca.it)
+* [[Martín Ferrari's homepage|http://tincho.org/]] and [[blog|http://blog.tincho.org/]]
+* [WikiAtoBR](http://wiki.hi.ato.br) Open, free and anonymous wiki. No account registration or login needed. It is Brazilian, so it is in Portuguese.
+* [Manifesto](http://manifesto.hi.ato.br) Open, free and anonymous blog. No account registration or login needed. It is Brazilian, so it is in Portuguese.
+* [Z is for Zombies](http://blog.zouish.org/) — personal blog/site of Francesca Ciceri
+* Julien Lefrique's [homepage](http://julien.lefrique.name/), hosted on [GitHub pages](https://github.com/jlefrique/jlefrique.github.com) with CGI disabled
+* [Anarcat's homepage](http://anarcat.ath.cx/) - with a custom [[theme|theme_market]]
diff --git a/doc/ikiwikiusers/discussion.mdwn b/doc/ikiwikiusers/discussion.mdwn
new file mode 100644
index 000000000..2c211b097
--- /dev/null
+++ b/doc/ikiwikiusers/discussion.mdwn
@@ -0,0 +1,39 @@
+Do you think it might prove useful to categorize these sites into "personal
+sites and blogs" and "project sites"? To some extent this page exists to
+promote ikiwiki and show its popularity, and those two categories contribute
+to ikiwiki's popularity in different ways. Personal sites and blogs primarily
+contribute through sheer numbers ($BIGNUM personal sites and blogs use
+ikiwiki), while project sites primarily contribute based on their own
+notability (UK Software Patents Info uses ikiwiki; the ion site uses ikiwiki).
+Burying the project sites in with the rest tends to make people less likely to
+notice them and remark on their notability. --[[JoshTriplett]]
+
+> Great idea. --[[Joey]]
+
+Where can I get a webhost for ikiwiki? Could someone tell me?
+I am very much interested in ikiwiki, but my server does not support some Perl modules that ikiwiki requires.
+Have any of you tried installing ikiwiki on Dreamhost?
+Or any suggestions on where I can find a host which supports ikiwiki?
+Thanks, --[[Chao]]
+
+> I think that most users of ikiwiki are using it on servers they fully
+> control, or xen instances, or things like that. Some users with static
+> sites that don't need web editing might build them on their own machine,
+> and upload to a webhost service.
+>
+> For getting it working on Dreamhost, it seems to me that if you can
+> install ikiwiki, which is mostly a bunch of perl modules, it must be
+> possible to install other perl modules also. I'm not a Dreamhost user, so
+> I don't know how though. What perl modules are missing? --[[Joey]]
+
+Thanks for the reply Joey! ikiwiki is a fantastic wiki compiler. Though I do
+not have my own machine at the moment, I will pay close attention to its development.
+Hopefully I will be one of the ikiwiki users one day :) cheers --[[Chao]]
+
+----
+
+Are there automated hosting sites for ikiwiki yet? If you know one, can you add one in a new section on [[ikiwikiusers]] please? If you don't know any and you're willing to pay to set one up (shouldn't be much more expensive than a single ikiwiki IMO), [contact me](http://www.ttllp.co.uk/contact.html) and let's talk. -- MJR
+
+----
+
+People who have interests in getting a webhost for ikiwiki may have a look at [this site](http://www.pigro.net). -- weakish
diff --git a/doc/index.mdwn b/doc/index.mdwn
new file mode 100644
index 000000000..4c22ce0e0
--- /dev/null
+++ b/doc/index.mdwn
@@ -0,0 +1,29 @@
+Ikiwiki is a **wiki compiler**. It converts wiki pages into HTML pages
+suitable for publishing on a website. Ikiwiki stores pages and history in a
+[[revision_control_system|rcs]] such as [[Subversion|rcs/svn]] or [[rcs/Git]].
+There are many other [[features]], including support for
+[[blogging|blog]], as well as a large array of [[plugins]].
+
+[[!template id=links]]
+
+## using ikiwiki
+
+[[Setup]] has a tutorial for setting up ikiwiki, or you can read the
+[[man_page|usage]]. There are some [[examples]] of things you can do
+with ikiwiki, and some [[tips]]. Basic documentation for ikiwiki plugins
+and syntax is provided [[here|ikiwiki]]. The [[forum]] is open for
+discussions.
+
+All wikis are supposed to have a [[sandbox]], so this one does too.
+
+This site generally runs the latest release of ikiwiki; currently, it runs
+ikiwiki [[!version ]].
+
+## developer resources
+
+The [[RoadMap]] describes where the project is going.
+[[Bugs]], [[TODO]] items, [[wishlist]] items, and [[patches|patch]]
+can be submitted and tracked using this wiki.
+
+Ikiwiki is developed by [[Joey]] and many contributors,
+and is [[FreeSoftware]].
diff --git a/doc/index/discussion.mdwn b/doc/index/discussion.mdwn
new file mode 100644
index 000000000..749042910
--- /dev/null
+++ b/doc/index/discussion.mdwn
@@ -0,0 +1 @@
+All discussion that used to be here has moved to the [[forum]].
diff --git a/doc/index/openid/discussion.mdwn b/doc/index/openid/discussion.mdwn
new file mode 100644
index 000000000..4a50fd9dd
--- /dev/null
+++ b/doc/index/openid/discussion.mdwn
@@ -0,0 +1,62 @@
+# OpenID discussion
+
+## No return_to in OpenID server
+
+Hi, there's no return_to from a designated OpenID server page. The spec requires (I think a "should" or "must", can't recall the exact wording) that it redirect back to the RP, in order to complete the registration and authentication. Unless I'm missing something and the doc is incomplete, I'd consider this a bug. I don't expect to be of much use WRT coming up with a patch, but I'm willing to test ;-) .
+
+> If this is a bug, could you please explain:
+>
+> * What happens when the bug occurs?
+> * How can one reproduce the bug?
+>
+> PS, please file bugs under [[bugs]] in future. --[[Joey]]
+
+>> Oops, my bad, didn't know that existed at the time I wrote this.
+>>
+>> What happened is that the process wouldn't complete, therefore I couldn't login with my OpenID.
+>>
+>> reproducibility: every time
+>>
+>> Should probably move this page, eh? ;)
+>> I'd do that, but I dunno how, other than using the SCM backend in question....
+
+Here's some actual output (with my OpenID URL stripped out):
+
+    do=postsignin&oic.time=1238224497-1450566d93097caa707f&openid.assoc_handle=%7BHMAC-SHA1%7D%7B49cdce76%7D%7BBhuXXw%3D%3D%7D&openid.identity=|<==== MY OPENID URL GOES HERE ====>|&openid.mode=id_res&openid.op_endpoint=http%3A%2F%2Fwww.myopenid.com%2Fserver&openid.response_nonce=2009-03-28T07%3A15%3A02ZDUFmG3&openid.return_to=http%3A%2F%2Fsimonraven.kisikew.org%2Fbin%2Fikiwiki.cgi%3Fdo%3Dpostsignin%26oic.time%3D1238224497-1450566d93097caa707f&openid.sig=E51Xh6Gnjku%2B0se57qCyhHbT5QY%3D&openid.signed=assoc_handle%2Cidentity%2Cmode%2Cop_endpoint%2Cresponse_nonce%2Creturn_to%2Csigned
+
+The `return_to` arg should NOT be `signed`, it should be the originating URL where you initially logged in.
+
+>> Yes, exactly. That's also my understanding of the spec.
+
+> I think you're confusing 'openid.return_to' with 'return_to'. The
+> former is present above, and is, indeed, the originating url, the latter
+> is part of the *value* of the 'openid.signed' parameter generated by myopenid.com. --[[Joey]]
+
+Also, I dunno what the assoc_handle is doing spitting out an arg like `{HMAC-SHA1}{49cdce76}{BhuXXw%3D%3D}`; it should be processed further. I have the needed perl packages installed (latest for Lenny). Hrm, would endianness matter?
+
+> Again, this value is created by the openid server, not by ikiwiki.
+> I see the same HMAC-SHA1 when using myopenid, and completely different
+> things for other openid servers. (Ie, when using livejournal as an openid server,
+> `openid.assoc_handle=1239305290:STLS.QiU6zTZ6w2bM3ttRkdaa:e68f91b751`)
+
+>> OK, I wasn't too sure about that, seemed bogus or somehow wrong or in error, like it wasn't actually being `completed`.
+
+> I'm fairly sure that is all a red herring.
+>
+> So, when I was talking about reproducing the bug, I was thinking perhaps you could tell me what openid server you're using,
+> etc, so I can actually see the bug with my own eyes.
+
+>> myopenid.com, with the CNAME option turned on.
+
+> The sanitised url parameters you've provided are not generated by ikiwiki at all.
+> They don't even seem to be generated by the underlying [[!cpan Net::OpenID]] library.
+
+>> That was a server log entry with date/host/time stripped, and my URL also stripped. Everything else is as was in the log. I installed the Debian packages in Lenny, both server and consumer OpenID Perl packages.
+
+> I'm pretty sure that what you're showing me is the url myopenid redirects
+> the browser to after successfully signing in. At that point, ikiwiki
+> should complete the signin. What fails at this point? How can I reproduce this failure? --[[Joey]]
+
+I'll try it again myself. I had tried it oh probably 6 times before I finally gave up on it. Maybe I'm getting rusty and I'm just PEBKACing all over the place. :P
+
+Also, to address the point about this discussion being in the wrong area (not under bugs), should I move it, or will you? I don't mind doing it, if you can't.
diff --git a/doc/install.mdwn b/doc/install.mdwn
new file mode 100644
index 000000000..82fd299e3
--- /dev/null
+++ b/doc/install.mdwn
@@ -0,0 +1,48 @@
+This page documents how to install ikiwiki if a prepackaged version is not
+available for your distribution, and you are faced with [[downloading|download]]
+the source and installing by hand. Ikiwiki should work on most unix-like
+systems.
+
+## Dependencies
+
+Ikiwiki is a perl program, and needs a recent version of perl such as
+5.10 (5.8.0 has been reported not to work).
+
+It's recommended you have a C compiler, as ikiwiki uses one to build
+wrappers.
+
+Ikiwiki requires the [[!cpan Text::Markdown::Discount]] (or
+[[!cpan Text::Markdown]]), [[!cpan URI]],
+[[!cpan HTML::Parser]], [[!cpan HTML::Template]], [[!cpan YAML::XS]] and [[!cpan HTML::Scrubber]]
+perl modules to be installed.
+It can also make use of many other perl modules, if
+they are available.
+
+Various [[plugins]] use other perl modules and utilities; see their individual
+documentation for details.
+
+### Installing dependencies by hand
+
+If you want to install by hand from the tarball, you should make sure that
+all the perl modules are installed. This is one way to install them, using
+CPAN:
+
+ PERL5LIB=`pwd` PERL_MM_USE_DEFAULT=1 perl -MCPAN -e 'CPAN::Shell->install("Bundle::IkiWiki")'
+ PERL5LIB=`pwd` PERL_MM_USE_DEFAULT=1 perl -MCPAN -e 'CPAN::Shell->install("Bundle::IkiWiki::Extras")'
+
+## Installing ikiwiki by hand
+
+Then to build and install ikiwiki:
+
+ perl Makefile.PL # PREFIX=/dir to install elsewhere
+ make
+ make test # optional
+ make install
+
+If you're using a shared hosting provider, of the sort where you don't have
+root, you can still install ikiwiki. There are tutorials covering this for
+a few providers:
+
+* [[tips/NearlyFreeSpeech]]
+* [[tips/DreamHost]]
diff --git a/doc/install/discussion.mdwn b/doc/install/discussion.mdwn
new file mode 100644
index 000000000..b27cc4bac
--- /dev/null
+++ b/doc/install/discussion.mdwn
@@ -0,0 +1,358 @@
+No matter what I do, ikiwiki gives me a `Can't locate loadable object for module Locale::gettext in @INC` although I've installed (and reinstalled) the Locale module, and no luck. If I look at the directories in the INC path, I can see the file. The wiki won't compile in spite of this, and I've tried everything I can think of.. -- [[tychoish]]
+
+> Sounds like the `Locale::gettext` perl module is there, but your perl
+> installation is broken so that the accompnying so file is not there, or
+> doesn't work. On my system I have
+> `/usr/lib/perl5/Locale/gettext.pm` and
+> `/usr/lib/perl5/auto/Locale/gettext.so` -- suspect your problem is with
+> the second one.
+>
+> If you can't fix it, this problem could probably be worked around by
+> unsetting all environment variables when running ikiwiki (`LANG`,
+> `LC_ALL`, `LC_MESSAGES`). Then it won't try to load `Locale::gettext` at
+> all. --[[Joey]]
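Joey's workaround above can be tried for a single invocation without touching any shell profile. This is only a sketch, assuming a GNU or BSD `env` utility; the `sh -c 'echo ...'` stand-in is there just to show the variables really are gone, and you would substitute your actual ikiwiki command for it:

```shell
# Clear the locale variables for one child process only, so perl
# never tries to load Locale::gettext. Replace the echo stand-in
# with the real ikiwiki invocation, e.g. `ikiwiki --setup ...`.
env -u LANG -u LC_ALL -u LC_MESSAGES sh -c 'echo "LANG=${LANG-unset}"'
# → LANG=unset
```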
+
+---
+
+I am trying to install Ikiwiki version 2.1 from the source tarball.
+
+It has all gone fairly smoothly until I try and run 'make'.
+
+I.e. I have downloaded and unpacked ikiwiki_2.1.tar.gz and have run
+
+ perl Makefile.PL
+
+... which has run OK.
+
+
+However when I then run 'make' I get:-
+
+ LANG=C perl -I. -T ikiwiki.out doc html --templatedir=templates \
+ --underlaydir=basewiki --nousedirs\
+ --wikiname="ikiwiki" --verbose \
+ --exclude=/discussion --no-discussion --userdir=users \
+ --plugin=goodstuff \
+ --plugin=haiku --plugin=polygen --plugin=fortune
+ Failed to load plugin IkiWiki::Plugin::mdwn: IkiWiki version 2 required--this is only version 1.01 at IkiWiki/Plugin/mdwn.pm line 7.
+ BEGIN failed--compilation aborted at IkiWiki/Plugin/mdwn.pm line 7.
+ Compilation failed in require at (eval 4) line 2.
+ BEGIN failed--compilation aborted at (eval 4) line 2.
+
+ make: *** [extra_build] Error 1
+
+How do I fix this? There may be a bit of old ikiwiki left behind because
+I did once have an older version installed but I thought I had removed all
+traces of it.
+
+> I'm quite sure that you still have some of it floating around, since
+> ikiwiki seems to be loading an old IkiWiki.pm.
+>
+> I don't understand though why it's not finding ./IkiWiki.pm first. The
+> `-I` in the command line should make it look for files in the current
+> directory first. --[[Joey]]
+
+Well I have searched around and there really is nothing left that I can see.
+
+I have removed *everything* found by 'find' and 'locate' that contains 'ikiwiki' except the tar file
+and started from the beginning again and I see exactly the same error.
+
+Is it that I maybe have a too old version of some of the Perl dependencies? The only mdwn.pm files
+that I have are the two I have just extracted from the new tar file. There is *no* ./IkiWiki.pm file.
+
+> It's interesting that you say you have no ./IkiWiki.pm file, since one is
+> included in the tarball. What happened to it, I wonder?
+
+so what/where is it loading to satisfy the ....... aaaaaaaaaaahhhhhhhhhhhhhh!!!!!!
+
+I wasn't noticing the case of the filename, I'd missed the upper case W and guess what 'find' shows me:-
+
+ /usr/local/lib/perl5/site_perl/5.8.8/IkiWiki.pm
+
+Removing the above file has fixed my basic problem, now I'm getting lots of (non-fatal) errors
+about "Can't locate Locale/gettext.pm", presumably that's a missing Perl module, I can probably
+sort that out.
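For anyone hitting the same stale-copy problem, a generic way to hunt for leftover `IkiWiki.pm` files is to walk perl's own module search path; this is just a sketch using standard perl and shell, nothing ikiwiki-specific:

```shell
# List every directory perl searches for modules, and flag any
# IkiWiki.pm found there -- an old copy shadowing the one from the
# tarball is exactly the failure described above.
perl -e 'print "$_\n" for @INC' | while read -r dir; do
    if [ -f "$dir/IkiWiki.pm" ]; then
        echo "possible stale copy: $dir/IkiWiki.pm"
    fi
done
```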
+
+
+## Errors when running 'make test'
+
+OK, I've now got it to compile and install and presumably it's basically working. However there
+are a few errors when I run 'make test'. Several errors have disappeared by installing more Perl
+stuff (specifically XML::SAX)
+
+> XML::SAX is a requirement of XML::Simple, which is a documented build
+> requirement. (Only really needed if you'll be using subversion actually).
+
+and one error disappeared when I did a 'make install', i.e. the 'make
+test' has a test which requires IkiWiki to be installed first.
+
+> Yes, that bug is already fixed in subversion for the next release
+> --[[Joey]]
+
+However I'm still getting the following error in 'make test':-
+
+ t/pagename.................ok
+ t/pagespec_match...........ok 1/52Modification of a read-only value attempted at /home/chris/webdev/ikiwiki/blib/lib/IkiWiki.pm line 1023.
+ # Looks like you planned 52 tests but only ran 23.
+ # Looks like your test died just after 23.
+ t/pagespec_match...........dubious
+ Test returned status 255 (wstat 65280, 0xff00)
+ DIED. FAILED tests 24-52
+ Failed 29/52 tests, 44.23% okay
+ t/pagespec_merge...........ok
+
+> What version of perl are you using? It seems to have some problem with
+> operator overloading.
+> --[[Joey]]
+
+    home$ perl -v
+
+    This is perl, v5.8.8 built for i486-linux
+
+## Installation in a non-root environment
+I had a pretty hellacious time installing Ikiwiki (largely due to problems
+in Perl) and documented them in [[tips/Dreamhost]]. I'd like to get feedback on the doc and also know if I should file a few bugs to make the installation process a little friendlier to non-root folks. Thanks for the great app!
+
+
+## Typing error?
+
+[..] Mail::Sendmail, TimeDate, RPC::XML, [..]: should be DateTime? --[[vibrog]]
+
+> No, TimeDate and DateTime are two different CPAN modules. Ikiwiki uses
+> TimeDate. --[[Joey]]
+
+ah, i still don't fully get it, though (the following is slightly shortened):
+
+ $ perl -MCPAN -e shell
+ cpan> install DateTime
+ DateTime is up to date.
+ cpan> install TimeDate
+ Warning: Cannot install TimeDate, don't know what it is.
+ Try the command
+ i /TimeDate/
+ to find objects with matching identifiers.
+
+I'm trying to build IkiWiki on a fresh OpenSuse 10.3 box. I start out with
+
+ $ perl -MCPAN -e 'install Text::Markdown URI HTML::Parser HTML::Template HTML::Scrubber'
+ $ git clone git://git.ikiwiki.info/ ikiwiki && cd ikiwiki
+ $ perl Makefile.PL && make
+
+Are there other prerequisites?
+I also installed all optional Perl modules, except TimeDate.
+
+> TimeDate is also, confusingly, known as Date::Parse. Perhaps CPAN would
+> do better with that name. --[[Joey]]
+
+good. Date::Parse was already installed. --[[vibrog]]
+
+`make` exits with `make: *** [extra_build] Aborted`, `make test` complains `cannot stat 'ikiwiki.man'` --[[vibrog]]
+
+> If you show me the actual error message, and not just the last line make
+> outputs, I might be able to help. --[[Joey]]
+
+ ..
+ rendering todo/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn
+ make: *** [extra_build] Segmentation fault
+
+> So, perl on your system is segfaulting when running ikiwiki. What version
+> of perl is this, and what version of what distribution? --[[Joey]]
+
+ $ perl -V
+ Summary of my perl5 (revision 5 version 8 subversion 8) configuration:
+ osname=linux, osvers=2.6.22, archname=i586-linux-thread-multi
+ uname='linux ravel 2.6.22 #1 smp 20070921 22:29:00 utc i686 i686 i386 gnulinux '
+ config_args='-ds -e -Dprefix=/usr -Dvendorprefix=/usr -Dinstallusrbinperl -Dusethreads -Di_db -Di_dbm -Di_ndbm -Di_gdbm -Duseshrplib=true -Doptimize=-O2 -march=i586 -mtune=i686 -fmessage-length=0 -Wall -D_FORTIFY_SOURCE=2 -fstack-protector -g -Wall -pipe'
+ hint=recommended, useposix=true, d_sigaction=define
+ usethreads=define use5005threads=undef useithreads=define usemultiplicity=define
+ useperlio=define d_sfio=undef uselargefiles=define usesocks=undef
+ use64bitint=undef use64bitall=undef uselongdouble=undef
+ usemymalloc=n, bincompat5005=undef
+
+Not sure how to provide proper version information for you.--[[vibrog]]
+
+---
+
+I've tried a couple of times and my cpan has never recognised Bundle::IkiWiki. Is that section of the page still accurate? -- [[users/Jon]]
+
+> Are you running perl with the environment settings specified on the page?
+> Can you show how it fails to find the bundle? --[[Joey]]
+
+>> I was not. Next time I build I will have to try that (I'll need to tweak it as I already override PERL5LIB; also I need to specify http proxies). Thanks for your help! -- [[users/Jon]]
+
+---
+
+##Further problems with Bundle::IkiWiki
+I'm also having trouble with finding Bundle::IkiWiki. I've tried it with the environment settings and without them, and also using the interactive
+form of the cpan command. I've also gone to cpan.org and searched -- eg
+
+ http://search.cpan.org/search?query=ikiwiki&mode=all
+
+and no Bundle for IkiWiki comes up at all.
+
+The error I get from the various cpan attempts is basically always the same:
+
+ Warning: Cannot install Bundle::IkiWiki, don't know what it is.
+ Try the command
+
+ i /Bundle::IkiWiki/
+
+ to find objects with matching identifiers.
+
+When I try that command, BTW, it basically seems to find the same stuff I get when searching on the cpan web site.
+
+This happens both on Ubuntu 8.04 and CentOS 5.1
+
+Any help would be greatly appreciated... --kent
+
+> Bundle::IkiWiki is included in ikiwiki itself, so of course cpan.org
+> does not know about it.
+>
+> If you can show me exactly what command you ran (the tested, working
+> commands on the parent page?) and how it failed, I can try to debug
+> your problem.
+
+Just today I noticed the "Bundle" subdirectory. What a moron I am! :-) Also, I misunderstood the PERL5LIB=`pwd` part --
+I glibly thought it indicated the sink for the installation of the modules, rather than the source, and I was running
+the cpan command from another window in a different directory, and just spiraled down into error...
+
+> The real question in my mind is why you'd want to do this at all when
+> using Ubuntu, which includes packages of ikiwiki and all its
+> dependencies. --[[Joey]]
+
+For ubuntu 8.04:
+
+ $ ikiwiki --version
+ ikiwiki version 2.32.3ubuntu2.1
+ $
+
+I was just trying to get the latest version.
+
+In any case, thanks for the help, and thanks for the superb software. I really like it a lot.
+
+---
+
+## Prerequisite modules not found for non-root user
+Hi, I'm a non-root user trying to use IkiWiki on an academic webserver with Perl 5.8.8 but several missing modules, so I grab them from CPAN (edited):
+
+ cd ~; PERL5LIB=`pwd`/ikiwiki:`pwd`/ikiwiki/cpan:`pwd`/lib/perl5 PERL_MM_USE_DEFAULT=1 perl -MCPAN -e 'CPAN::Shell->install("Bundle::IkiWiki")'
+
+That puts a lot of files in ~/.cpan. Then when I go into the directory where I untarred IkiWiki and try to run the Perl makefile:
+
+ cd ~/ikiwiki; perl Makefile.PL PREFIX=$HOME/ikiwiki
+
+I get warnings that all the modules needed were not found:
+
+    Warning: prerequisite CGI::FormBuilder not found.
+    Warning: prerequisite CGI::Session 0 not found.
+    Warning: prerequisite Date::Parse 0 not found.
+    Warning: prerequisite HTML::Scrubber 0 not found.
+    Warning: prerequisite HTML::Template 0 not found.
+    Warning: prerequisite Mail::Sendmail 0 not found.
+    Warning: prerequisite Text::Markdown 0 not found.
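Those warnings can be checked directly by asking perl which of the prerequisites it can actually load; a sketch, with the module list copied from the warnings above:

```shell
# Try to load each prerequisite module; "missing" lines correspond
# to the Warning: lines that Makefile.PL prints.
for m in CGI::FormBuilder CGI::Session Date::Parse HTML::Scrubber \
         HTML::Template Mail::Sendmail Text::Markdown; do
    if perl -M"$m" -e 1 2>/dev/null; then
        echo "found:   $m"
    else
        echo "missing: $m"
    fi
done
```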
+
+CORRECTION 1: I played around with CPAN and got the installation to the point of succeeding with >99% of tests in "make test".
+
+> What was the magic CPAN rune that worked for you? --[[Joey]]
+
+An attempt of "make install" failed while trying to put files in /etc/IkiWiki but per the output's instructions, I reran "make install" and that seemed to work, until this error, which doesn't seem to be satisfiable:
+
+ Warning: You do not have permissions to install into /usr/lib/perl5/site_perl/5.8.8/i386-linux-thread-multi at /usr/lib/perl5/5.8.8/ExtUtils/Install.pm line 114.
+ Installing /usr/lib/perl5/site_perl/5.8.8/IkiWiki.pm
+ mkdir /usr/lib/perl5/site_perl/5.8.8/IkiWiki: Permission denied at /usr/lib/perl5/5.8.8/ExtUtils/Install.pm line 176
+
+Any suggestions? Whew!
+
+> When you build ikiwiki, try doing it like this to make it
+> install to your home directory. Then you can run `~/bin/ikiwiki`
+> --[[Joey]]
+
+ perl Makefile.PL INSTALL_BASE=$HOME PREFIX=
+ make
+ make install
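After a home-directory install like that, the shell still needs to be told where things landed. A sketch of the two variables to set; the paths follow from the `INSTALL_BASE=$HOME` above, so adjust them if you installed elsewhere:

```shell
# Put the home-installed script and its perl modules on the search
# paths; add these lines to ~/.bashrc to make them permanent.
export PATH="$HOME/bin:$PATH"
export PERL5LIB="$HOME/lib/perl5${PERL5LIB:+:$PERL5LIB}"
```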
+
+---
+
+03 September 2010, report on a successful manual install on Debian 5 (Lenny) AMD64:
+
+note: it may be much easier using backports, but with these tools you get a plain-user cpan :)
+
+These were my steps:
+
+As root (#):
+
+ aptitude install build-essential curl perl
+
+
+As a plain user ($), I usually install user perl modules using local::lib:
+
+ mkdir -p "$HOME/downloads"
+ cd "$HOME/downloads/"
+ wget http://search.cpan.org/CPAN/authors/id/G/GE/GETTY/local-lib-1.006007.tar.gz
+ wget http://ftp.de.debian.org/debian/pool/main/i/ikiwiki/ikiwiki_3.20100831.tar.gz
+ tar -zxf local-lib-1.006007.tar.gz
+ cd local-lib-1.006007/
+ perl Makefile.PL --bootstrap=~/.perl5
+ make test && make install
+ echo 'eval $(perl -I$HOME/.perl5/lib/perl5 -Mlocal::lib=$HOME/.perl5)' >>~/.bashrc
+ . ~/.bashrc
+ curl -L http://cpanmin.us | perl - App::cpanminus
+ cpanm CGI::FormBuilder
+ cpanm CGI::Session
+ cpanm HTML::Parser
+ cpanm HTML::Template
+ cpanm HTML::Scrubber
+ cpanm Text::Markdown
+ cpanm URI
+ cd ..
+ tar -zxf ikiwiki_3.20100831.tar.gz
+ cd ikiwiki/
+ perl Makefile.PL INSTALL_BASE= PREFIX=/home/$USER/.perl5
+ make test # All tests successful.
+ make install INSTALL_BASE=/home/$USER/.perl5
+ . ~/.bashrc
+
+Using cpan or cpanm with local::lib, you can install any other dependency as a plain user (in your home). XS modules may need -dev packages.
+
+After all that, here it is:
+
+ ikiwiki -version
+ ikiwiki version 3.20100831
+
+It seems this installation loses the /etc files (we're running as a plain user), but this can be used as a workaround:
+
+ ikiwiki -setup ~/downloads/ikiwiki/auto.setup
+
+I haven't investigated the /etc files usage further, but it does not seem like something a plain user can do...
+
+ /etc/ikiwiki/wikilist does not exist
+ ** Failed to add you to the system wikilist file.
+ ** (Probably ikiwiki-update-wikilist is not SUID root.)
+ ** Your wiki will not be automatically updated when ikiwiki is upgraded.
+
+
+Iñigo
+
+-----
+
+
+Portability fixes encountered while maintaining the pkgsrc package:
+
+* In `IkiWiki::Setup::Standard::gendump()`, generate a shebang
+ matching the current `perl`.
+* In `Makefile.PL`, provide overridable defaults for `SYSCONFDIR`
+ and `MANDIR`.
+* In `Makefile.PL`, use `perl` to bump `ikiwiki.spec` instead of
+ `sed -i`.
+* In `Makefile.PL`, specify more portable options to `cp`.
+
+I've attempted to mergeably patch these in my git, commit
+5c177c96ac98b24aaa0613ca241fb113f1b32c55.
+
+--[[schmonz]]
+
+-----
+
+[[!template id=gitbranch branch=schmonz/portability author="[[schmonz]]"]]
+
+My git was in a screwy state so I started over. These changes are
+now on a branch. --[[schmonz]]
diff --git a/doc/local.css b/doc/local.css
new file mode 100644
index 000000000..a0dec8cfd
--- /dev/null
+++ b/doc/local.css
@@ -0,0 +1,3 @@
+/* ikiwiki local style sheet */
+
+/* Add local styling here, instead of modifying style.css. */
diff --git a/doc/logo.mdwn b/doc/logo.mdwn
new file mode 100644
index 000000000..3608cb382
--- /dev/null
+++ b/doc/logo.mdwn
@@ -0,0 +1,69 @@
+The ikiwiki logo *reflects* ikiwiki turning the regular wiki concept
+on its head by being a wiki compiler. Or maybe just the fact that "ikiwiki"
+is a palindrome.
+
+Anyway, if you have this logo in other fonts or colors, feel free to send
+it to [[Joey]] for inclusion here. (Or upload it, once that feature is
+added).
+
+* [[ikiwiki_logo|ikiwiki.png]]
+ [[ikiwiki_logo_large|ikiwiki_large.png]]
+ [[ikiwiki_button|ikiwiki_button.png]]
+
+ [[SVG_source|ikiwiki.svgz]] can be used to generate a logo at any size
+ with a command like:
+
+ inkscape -w 90 -i logo -e ikiwiki.png ikiwiki.svgz
+ inkscape -w 150 -i logo -e ikiwiki_large.png ikiwiki.svgz
+
+ The [[favicon.ico]] can also be generated from this file, as follows:
+
+ inkscape -w 16 -i favicon -e favicon.ico ikiwiki.svgz
+
+ The button can also be generated as follows:
+
+ inkscape -w 80 -i button -e ikiwiki_button.png ikiwiki.svgz
+
+ Some other alternate icons and buttons are also included in the svg file
+ and can be extracted by specifying their names.
+
+ Contributed by Recai Oktaş
+
+* [[favicon.svgz]] is used to generate the favicon.ico:
+
+ inkscape -w 16 -i favicon -e favicon.ico favicon.svgz
+
+* [[ikiwiki_logo|ikiwiki_old2.png]]
+
+ [[SVG_source|ikiwiki_old2.svgz]] for this older version of the logo.
+
+ Contributed by Recai Oktaş
+
+* [[ikiwiki_logo|ikiwiki_old.png]]
+
+ LaTeX source to an image approximating the above:
+
+ \documentclass{article}
+ \usepackage{graphicx}
+ \begin{document}
+ \pagestyle{empty}
+ \huge\reflectbox{iki}wiki
+ \end{document}
+
+ Contributed by [[JeroenSchot]]
+
+* [iki.svg](http://jblevins.org/svg/iki.svg)
+
+ A simplified (442 byte) plain SVG version of the ikiwiki favicon.
+
+ Contributed by [[JasonBlevins]]
+
+* <https://archive.org/download/IkiwikiLogo-hi.ato.br/ikiwiki_logo-hiato.png>
+
+ Hosted on: <https://archive.org/details/IkiwikiLogo-hi.ato.br>
+
+ I had to change the black letters to #c8c8c8 to use it on my website.
+
+ This was created from the svg source.
+
+ Contributed by [[hiato]]
diff --git a/doc/logo/discussion.mdwn b/doc/logo/discussion.mdwn
new file mode 100644
index 000000000..7f712a5a2
--- /dev/null
+++ b/doc/logo/discussion.mdwn
@@ -0,0 +1 @@
+![ikiwiki logo](http://wiki.ondioline.org/ikiwiki.png) The logo I'm using on my ikiwiki installation. -- Paul Collins \ No newline at end of file
diff --git a/doc/logo/favicon.svgz b/doc/logo/favicon.svgz
new file mode 100644
index 000000000..9e65991e4
--- /dev/null
+++ b/doc/logo/favicon.svgz
Binary files differ
diff --git a/doc/logo/ikiwiki.png b/doc/logo/ikiwiki.png
new file mode 100644
index 000000000..e5f07f187
--- /dev/null
+++ b/doc/logo/ikiwiki.png
Binary files differ
diff --git a/doc/logo/ikiwiki.svgz b/doc/logo/ikiwiki.svgz
new file mode 100644
index 000000000..a67774a55
--- /dev/null
+++ b/doc/logo/ikiwiki.svgz
Binary files differ
diff --git a/doc/logo/ikiwiki_button.png b/doc/logo/ikiwiki_button.png
new file mode 100644
index 000000000..afddbf740
--- /dev/null
+++ b/doc/logo/ikiwiki_button.png
Binary files differ
diff --git a/doc/logo/ikiwiki_large.png b/doc/logo/ikiwiki_large.png
new file mode 100644
index 000000000..cd6aa70f3
--- /dev/null
+++ b/doc/logo/ikiwiki_large.png
Binary files differ
diff --git a/doc/logo/ikiwiki_old.png b/doc/logo/ikiwiki_old.png
new file mode 100644
index 000000000..63b7f885a
--- /dev/null
+++ b/doc/logo/ikiwiki_old.png
Binary files differ
diff --git a/doc/logo/ikiwiki_old2.png b/doc/logo/ikiwiki_old2.png
new file mode 100644
index 000000000..793046164
--- /dev/null
+++ b/doc/logo/ikiwiki_old2.png
Binary files differ
diff --git a/doc/logo/ikiwiki_old2.svgz b/doc/logo/ikiwiki_old2.svgz
new file mode 100644
index 000000000..a0c345021
--- /dev/null
+++ b/doc/logo/ikiwiki_old2.svgz
Binary files differ
diff --git a/doc/news.mdwn b/doc/news.mdwn
new file mode 100644
index 000000000..58413f94c
--- /dev/null
+++ b/doc/news.mdwn
@@ -0,0 +1,9 @@
+This is where announcements of new releases, features, and other news are
+posted. [[IkiWikiUsers]] are recommended to subscribe to this page's RSS
+feed.
+
+[[!inline pages="news/* and !news/*/* and !news/discussion"
+feedpages="created_after(news/Article_on_Ikiwiki_as_a_BTS)" rootpage="news" show="30"]]
+
+By the way, some other pages with RSS feeds about ikiwiki include
+[[plugins]], [[TODO]] and [[bugs]].
diff --git a/doc/news/Article_on_Ikiwiki_as_a_BTS.mdwn b/doc/news/Article_on_Ikiwiki_as_a_BTS.mdwn
new file mode 100644
index 000000000..a5029f109
--- /dev/null
+++ b/doc/news/Article_on_Ikiwiki_as_a_BTS.mdwn
@@ -0,0 +1 @@
+[Integrated issue tracking with Ikiwiki](http://www.linuxworld.com/news/2007/040607-integrated-issue-tracking-ikiwiki.html) by Joey Hess is now available on LinuxWorld.com. (LinuxWorld's author contract also allows this article to become part of the project's documentation.) Learn how to use Ikiwiki inlining and PageSpecs for lightweight workflow. Joey also explains how having the BTS and docs in the project's revision control system can help users of distributed revision control systems keep bug tracking info in sync with code changes. \ No newline at end of file
diff --git a/doc/news/code_swarm.mdwn b/doc/news/code_swarm.mdwn
new file mode 100644
index 000000000..09b68523e
--- /dev/null
+++ b/doc/news/code_swarm.mdwn
@@ -0,0 +1,34 @@
+I've produced a [code_swarm](http://vis.cs.ucdavis.edu/~ogawa/codeswarm/)
+visualization of the first 2+ years of ikiwiki's commit history.
+
+[[!img screenshot.png size="480x360" alt="screenshot"]]
+
+* [15 mb avi](http://kitenet.net/~joey/screencasts/ikiwiki_swarm.avi)
+* [stream on vimeo](http://vimeo.com/1324348)
+
+PS, while I'm posting links to videos, here's a
+[video of a lightning talk about ikiwiki](http://log.hugoschotman.com/hugo/2008/07/webtuesday-2008-07-08-lightning-talk-by-axel-beckert-about-ikiwiki.html).
+
+--[[Joey]]
+
+### notes
+
+Interesting things to watch for:
+
+* Initial development of ikiwiki to the point it was getting web edits.
+ (First 2 seconds of video!)
+* Introduction to plugin support, and later, plugin changes dominating code
+ changes.
+* Introduction of openid support and the resulting *swarm* of openid
+ commenters.
+* Switch to git, my name in the logs changes from "joey" to "Joey Hess",
+ and there are more code commits directly from others.
+
+Getting the commit log was tricky because every web commit is in there too,
+so it has to deal with things like IPs and openids. The [[code_swarm_log.pl]]
+script will munge the log to handle these, and it was configured with
+[[code_swarm.config]].
+
+Video editing by kino, ffmpeg, ffmpeg2theora, and too many hours of pain.
+
+Audio by the Punch Brothers.
diff --git a/doc/news/code_swarm/code_swarm.config b/doc/news/code_swarm/code_swarm.config
new file mode 100644
index 000000000..eea55debd
--- /dev/null
+++ b/doc/news/code_swarm/code_swarm.config
@@ -0,0 +1,51 @@
+# This is a sample configuration file for code_swarm for ikiwiki
+
+# Frame width
+Width=640
+
+# Frame height
+Height=480
+
+# Input file
+InputFile=data/sample-repevents.xml
+
+# Particle sprite file
+ParticleSpriteFile=particle.png
+
+# Project time per frame
+MillisecondsPerFrame=21600000
+#MillisecondsPerFrame=43200000
+
+# Background in R,G,B
+Background=0,0,0
+
+# Color assignment rules
+# Keep in order, do not skip numbers. Numbers start
+# at 1.
+#
+# Pattern: "Label", "regex", R,G,B R,G,B
+# Label is optional. If it is omitted, the regex
+# will be used.
+#
+
+ColorAssign1="Discussion (blue)",".*discussion.*", 0,0,255, 0,0,255
+ColorAssign2="Docs (green)",".*\.mdwn", 255,0,0, 255,0,0
+ColorAssign3="Plugins (orange)",".*Plugin/.*", 255,116,0, 255,116,0
+ColorAssign4="Code (red)",".*\.p[ml]", 0,255,0, 0,255,0
+
+# Save each frame to an image?
+TakeSnapshots=true
+
+# Where to save each frame
+SnapshotLocation=frames/code_swarm-#####.png
+
+# Create a glow around names? (Runs slower)
+NameHalos=false
+
+# Natural distance of files to people
+EdgeLength=40
+
+debug=false
+
+# OpenGL is experimental. Use at your own risk.
+UseOpenGL=false
diff --git a/doc/news/code_swarm/code_swarm_log.pl b/doc/news/code_swarm/code_swarm_log.pl
new file mode 100755
index 000000000..25e0a67b0
--- /dev/null
+++ b/doc/news/code_swarm/code_swarm_log.pl
@@ -0,0 +1,25 @@
+#!/usr/bin/perl
+# Munge a git log into log for code_swarm.
+# Deals with oddities of ikiwiki commits, like web commits, and openids.
+use IkiWiki;
+use IkiWiki::Plugin::openid;
+
+my $sep='-' x 72;
+$/=$sep."\n";
+
+my %config=IkiWiki::defaultconfig();
+
+foreach (`git-log --name-status --pretty=format:'%n$sep%nr%h | %an | %ai (%aD) | x lines%n%nsubject: %s%n'`) {
+ my ($subject)=m/subject: (.*)\n/m;
+ if ($subject=~m/$config{web_commit_regexp}/) {
+ my $user = defined $2 ? "$2" : "$3";
+ my $oiduser = IkiWiki::openiduser($user);
+ if (defined $oiduser) {
+ $oiduser=~s/ \[.*\]//; # too much clutter for code_swarm
+ $user=$oiduser;
+ }
+ s/ \| [^|]+ \| / | $user | /;
+ }
+ s/subject: (.*)\n\n//m;
+ print;
+}
diff --git a/doc/news/code_swarm/discussion.mdwn b/doc/news/code_swarm/discussion.mdwn
new file mode 100644
index 000000000..3ecc81b86
--- /dev/null
+++ b/doc/news/code_swarm/discussion.mdwn
@@ -0,0 +1,3 @@
+Looks like ImageMagick isn't installed on the new server! :-) -- AdamShand
+
+> Thanks for pointing out the problem, fixed now. --[[Joey]]
diff --git a/doc/news/code_swarm/screenshot.png b/doc/news/code_swarm/screenshot.png
new file mode 100644
index 000000000..1178e3f64
--- /dev/null
+++ b/doc/news/code_swarm/screenshot.png
Binary files differ
diff --git a/doc/news/consultant_list.mdwn b/doc/news/consultant_list.mdwn
new file mode 100644
index 000000000..1adf2ed67
--- /dev/null
+++ b/doc/news/consultant_list.mdwn
@@ -0,0 +1,17 @@
+I was asked a good question today: How can a company find someone to work
+on ikiwiki? To help answer this question, I've set up a [[consultants]] page.
+If you might be interested in being paid to work on ikiwiki, please add your
+information to the page. --[[Joey]]
+
+And here's the first company looking for an ikiwiki developer that I am
+aware of:
+
+> The TOVA Company, a small medical software and hardware company in
+> Portland, Oregon, is looking for developers to add functionality to
+> ikiwiki. We're looking for developers who are already familiar with ikiwiki
+> development, including plugins, and who would be willing to work on a
+> part-time, non-employee, project-based basis for each of the small features
+> that we want. The [[features_we're_interested_in|users/The_TOVA_Company]]
+> would obviously be GPL'd, and released to the community (if they'll have
+> them :) ). Please contact Andrew Greenberg (andrew@thetovacompany) if
+> you're interested. Thanks!
diff --git a/doc/news/discussion.mdwn b/doc/news/discussion.mdwn
new file mode 100644
index 000000000..d6a548f8b
--- /dev/null
+++ b/doc/news/discussion.mdwn
@@ -0,0 +1,35 @@
+## 3.20091017 news item removed?
+Hi! Why have you [removed](http://git.ikiwiki.info/?p=ikiwiki;a=blobdiff;f=doc/news/version_3.20091017.mdwn;h=0000000000000000000000000000000000000000;hp=aba830a82f881bd97d11fe644eb2c78b99c2258d;hb=9fdd9af2db2bd21e543fa0f5f4bfa85b56b8dd5c;hpb=b74dceb884a60f6f7be395378a009ee414726d0b) the item for
+3.20091017? Perhaps, it's an error, isn't it? The corresponding code AFAIU is still there. --Ivan Z.
+
+> I always remove old news items when making a new release. The info is still there in the changelog if needed. --[[Joey]]
+
+## Ikiwiki 3.12
+
+Joey, what about news for Ikiwiki 3.12? The changelog says it was released
+6 days ago... :) --[[Paweł|ptecza]]
+
+---
+
+## Ikiwiki 2.14
+
+Hi Joey! Where can I find the source package for ikiwiki 2.14? I rather prefer
+`wget` to `git` for downloading it :) I've just checked
+[Debian page of ikiwiki source package](http://packages.debian.org/unstable/source/ikiwiki)
+and it seems that it still contains the older version 2.13. I didn't find
+the latest version at <http://incoming.debian.org/> too. --[[Paweł|ptecza]]
+
+> 2.14 is at the url you cited now. --[[Joey]]
+
+>> Thanks! I can confirm it :) I'm curious what the reason for the "delay" is.
+>> The sources were accepted in unstable 2 days ago... Please forgive me
+>> if it's a stupid question, but I don't know how exactly it works in
+>> Debian project. --[[Paweł|ptecza]]
+
+>>> After I upload, ikiwiki is not pushed out to the mirrors until the
+>>> twice-daily mirror sync. After that, packages.debian.org has to run its
+>>> update. I'm not sure what the timing of that is, it seems to be
+>>> somewhat slower than the mirror sync, and also more subject to breaking
+>>> from time to time. --[[Joey]]
+
+>>>> Thank you for the explanation! :) --[[Paweł|ptecza]]
diff --git a/doc/news/donations.mdwn b/doc/news/donations.mdwn
new file mode 100644
index 000000000..5fea32c81
--- /dev/null
+++ b/doc/news/donations.mdwn
@@ -0,0 +1 @@
+After looking up and noticing that another 8 hours had passed, replying to people and hacking, I've added a [[TipJar]] page, in case anyone feels like tossing me a few bucks for ikiwiki. TIA! --[[Joey]] \ No newline at end of file
diff --git a/doc/news/git_push_to_this_wiki.mdwn b/doc/news/git_push_to_this_wiki.mdwn
new file mode 100644
index 000000000..4b3fcbe69
--- /dev/null
+++ b/doc/news/git_push_to_this_wiki.mdwn
@@ -0,0 +1,3 @@
+Now you can use [[git]] to clone this wiki, and push your changes back,
+thanks to ikiwiki's new support for [[tips/untrusted_git_push]]. Enjoy
+working on the wiki while offline! --[[Joey]]
diff --git a/doc/news/git_push_to_this_wiki/discussion.mdwn b/doc/news/git_push_to_this_wiki/discussion.mdwn
new file mode 100644
index 000000000..33230c7ef
--- /dev/null
+++ b/doc/news/git_push_to_this_wiki/discussion.mdwn
@@ -0,0 +1,37 @@
+Thanks, Joey! This is awesome...I had to try it out :)
+--[[JasonBlevins]]
+
+I am really happy to hear of this new feature, that I was (more or less)
+secretly dreaming of. But - and that's why I'm still insanely editing
+this wiki inside a web browser - I wonder how I'll use it for real: my
+own master branch contains a few dozens merge commits, and one is created
+every time I `git pull` ikiwiki repository (or another clone of it, living
+on one of my other boxes that by chance had Internet access more recently).
+I do not want to clutter Joey's repository with these commits, so I guess
+I have to learn some more of Git everything-is-possible world (a nice thing
+is: I am not limited anymore to "Emacs can do it", and I'm now in a position
+to say "Git can do it" or "ikiwiki already does it", depending on the
+situation). Well, let's focus. Git wizards amongst us (let's use this wiki
+as if it were users@ikiwiki.info, ok?), what would you suggest? I was thinking
+of having a new branch in my cloned repository, dedicated to editing this wiki;
+I could use `rebase` instead of `fetch+merge` to get the new upstream commits
+into this special-purpose branch. I guess it would work nicely if I had only
+one offline box with not-yet-pushed changes at the same time, but would break
+in awful and various ways when it is not the case. Any alternative idea?
+--[[intrigeri]]
+
+> Not that I'm very careful to avoid pushing merge commits (see git log ;-),
+> but I sometimes use `git pull --rebase` to pull changes from a repo. That
+> will rebase your local changes on top of the changes pulled, avoiding the
+> merge commits. I'm sure more involved solutions are possible. --[[Joey]]
+
+> I decided to use my local `master` branch as a copy of `origin/master`
+> (kitenet) and move my local modifications to a separate branch. I'm using
+> `master` to edit the wiki but there is still the problem of new upstream
+> commits since the last pull. I already had this problem as Joey had pushed
+> some changes while I was editing locally. Not knowing about
+> `pull --rebase`, I took the long way out: branch, roll back HEAD, rebase,
+> and merge. That was too much work...It looks like `pull --rebase` is the
+> way to go. --[[JasonBlevins]]
+
+Awesome ! --[[xma]]
diff --git a/doc/news/ikiwiki-hosting.mdwn b/doc/news/ikiwiki-hosting.mdwn
new file mode 100644
index 000000000..092530a14
--- /dev/null
+++ b/doc/news/ikiwiki-hosting.mdwn
@@ -0,0 +1,16 @@
+ikiwiki-hosting is an interface on top of Ikiwiki to allow easy management
+of lots of ikiwiki sites. I developed it for
+[Branchable](http://www.branchable.com/), an Ikiwiki hosting provider.
+It has a powerful, scriptable command-line interface, and also
+includes special-purpose ikiwiki plugins for things like a user control
+panel.
+
+To get a feel for it, here are some examples:
+
+ ikisite create foo.ikiwiki.net --admin http://joey.kitenet.net/
+ ikisite branch foo.ikiwiki.net bar.ikiwiki.net
+ ikisite backup bar.ikiwiki.net --stdout | ssh otherhost 'ikisite restore bar.ikiwiki.net --stdin'
+
+ikiwiki-hosting is free software, released under the AGPL. Its website:
+<http://ikiwiki-hosting.branchable.com/>
+--[[Joey]]
diff --git a/doc/news/ikiwiki_accepted_for_Summer_of_Code.mdwn b/doc/news/ikiwiki_accepted_for_Summer_of_Code.mdwn
new file mode 100644
index 000000000..0aa9d8821
--- /dev/null
+++ b/doc/news/ikiwiki_accepted_for_Summer_of_Code.mdwn
@@ -0,0 +1,5 @@
+Google has accepted ikiwiki as a mentoring organization for [Summer of Code 2007](http://code.google.com/soc).
+
+See our [[Summer_of_Code|soc]] page for projects.
+
+--[[JoshTriplett]] \ No newline at end of file
diff --git a/doc/news/ikiwiki_screencast.mdwn b/doc/news/ikiwiki_screencast.mdwn
new file mode 100644
index 000000000..429108246
--- /dev/null
+++ b/doc/news/ikiwiki_screencast.mdwn
@@ -0,0 +1,12 @@
+I've put together a short screencast that covers approximately the first
+half of the [[setup]] document, and includes a demo of setting up a blog
+using ikiwiki.
+
+<http://kitenet.net/~joey/screencasts/ikiwiki_blog/>
+
+... And now I've added a second screencast. Note that this uses a script
+that is only available in the as yet unreleased ikiwiki version 2.15.
+
+<http://kitenet.net/~joey/screencasts/ikiwiki_cgi_and_git/>
+
+--[[Joey]]
diff --git a/doc/news/ikiwiki_screencast/discussion.mdwn b/doc/news/ikiwiki_screencast/discussion.mdwn
new file mode 100644
index 000000000..d0950b336
--- /dev/null
+++ b/doc/news/ikiwiki_screencast/discussion.mdwn
@@ -0,0 +1,8 @@
+Hello! First of all, thanks for that screencast! Yes, please make another one.
+I'm a rather long-time and experienced user of ikiwiki, but I think I can
+still learn new things about it. It's also very interesting to me what your
+favourite tools (window manager, editor, browser) are :D --[[Paweł|ptecza]]
+
+> Done --[[Joey]]
+
+>> Thank you very much! :) --[[Paweł|ptecza]] \ No newline at end of file
diff --git a/doc/news/ikiwiki_version_2.0.mdwn b/doc/news/ikiwiki_version_2.0.mdwn
new file mode 100644
index 000000000..e6723c873
--- /dev/null
+++ b/doc/news/ikiwiki_version_2.0.mdwn
@@ -0,0 +1,32 @@
+Ikiwiki has reached version 2.0 and entered a new phase in its
+[[development_cycle|roadmap]].
+
+With the 2.0 release of ikiwiki, some major changes have been made to the
+default configuration:
+
+* The `usedirs` setting is enabled by default. This *will* break all URLs
+ to wikis that did not have `usedirs` turned on before, unless you follow
+ the procedure described at [[tips/switching_to_usedirs]]
+ or edit your setup file to turn `usedirs` off: `usedirs => 0,`
+* [[plugins/OpenID]] logins are now enabled by default, if the
+ [[!cpan Net::OpenID::Consumer]] perl module is available. Password logins
+ are also still enabled by default. If you like, you can turn either OpenID
+ or password logins off via the `disable_plugins` setting.
+
+An overview of changes in the year since the 1.0 release:
+
+* New improved URLs to pages via `usedirs`.
+* [[plugins/OpenID]] support, enabled by default.
+* Plugin [[interface|plugins/write]] added, with some 60 [[plugins]] available,
+ greatly expanding the capabilities of ikiwiki.
+* [[Tags]], atom feeds, and generally full-fledged blogging support.
+* Fully working [[todo/utf8]].
+* Optimisations, approximately 3.5 times as fast as version 1.0.
+* Improved scalability to large numbers of pages.
+* Improved scalable [[logo]].
+* Support for additional revision control systems besides svn: git,
+ tla, mercurial.
+* Some support for other markup languages than markdown: rst, textile.
+* Unit test suite, with more than 300 tests.
+
+[[!meta date="2007-04-30 00:51:57 -0400"]]
diff --git a/doc/news/ikiwiki_version_3.0.mdwn b/doc/news/ikiwiki_version_3.0.mdwn
new file mode 100644
index 000000000..7ca636cf2
--- /dev/null
+++ b/doc/news/ikiwiki_version_3.0.mdwn
@@ -0,0 +1,42 @@
+Ikiwiki has reached version 3.0 and entered a new phase in its
+[[development_cycle|roadmap]].
+
+The 3.0 release of ikiwiki changes several defaults and finishes
+some transitions. You will need to modify your wikis to work with
+ikiwiki 3.0. A document explaining the process is available
+in [[tips/upgrade_to_3.0]].
+
+The highlights of the changes in version 3.0 include:
+
+* Support for uploading [[attachments|plugins/attachment]].
+* Can [[plugins/rename]] and [[plugins/remove]] pages and files via the web.
+* [[Web_based_setup|plugins/websetup]].
+* Blog-style [[plugins/comments]] as an alternative to Discussion pages.
+* Many other new plugins including [[plugins/htmlbalance]], [[plugins/format]],
+ [[plugins/progress]], [[plugins/color]], [[plugins/autoindex]],
+ [[plugins/cutpaste]], [[plugins/hnb]], [[plugins/creole]], [[plugins/txt]],
+ [[plugins/amazon_s3]], [[plugins/pinger]], [[plugins/pingee]],
+ [[plugins/edittemplate]]
+* The RecentChanges page is compiled statically, not generated from the CGI.
+* Support for additional revision control systems: [[rcs/bzr]],
+ [[rcs/monotone]]
+* Support for [[tips/untrusted_git_push]].
+* A new version (3.00) of the plugin API, exporting additional
+ commonly used functions from `IkiWiki.pm`.
+* Nearly everything in ikiwiki is now a plugin, from WikiLinks to
+ page editing, to RecentChanges.
+* Far too many bug fixes, features, and enhancements to list here.
+
+Thanks to the many contributors to ikiwiki 3.0, including:
+
+ Jelmer Vernooij, Recai Oktaş, William Uther, Simon McVittie, Axel Beckert,
+ Bernd Zeimetz, Gabriel McManus, Paweł Tęcza, Peter Simons, Manoj
+ Srivastava, Patrick Winnertz, Jeremie Koenig, Josh Triplett, thm, Michael
+ Gold, Jason Blevins, Alexandre Dupas, Henrik Brix Andersen, Thomas Keller,
+ Enrico Zini, intrigeri, Scott Bronson, Brian May, Adeodato Simó, Brian
+ Downing, Nis Martensen. (And anyone I missed.)
+
+Also, thanks to the users, bug submitters, and documentation wiki editors.
+Without you, ikiwiki would just be a little thing I use for my home page.
+
+--[[Joey]]
diff --git a/doc/news/irc_channel.mdwn b/doc/news/irc_channel.mdwn
new file mode 100644
index 000000000..248fd3c67
--- /dev/null
+++ b/doc/news/irc_channel.mdwn
@@ -0,0 +1,6 @@
+Ikiwiki now has an IRC channel: `#ikiwiki` on irc.oftc.net
+
+The channel features live commit messages for CIA for changes to both
+ikiwiki's code and this wiki. Plus occasional talk about ikiwiki.
+
+Thanks to [[JoshTriplett]] for making this happen.
diff --git a/doc/news/moved_to_git.mdwn b/doc/news/moved_to_git.mdwn
new file mode 100644
index 000000000..8a1fa11f3
--- /dev/null
+++ b/doc/news/moved_to_git.mdwn
@@ -0,0 +1,10 @@
+I've started using git as ikiwiki's main repository. See [[download]] for
+repository locations.
+
+Note that all the sha1sums have changed from those in previously published
+git repositories. Blame [git-svnimport](http://bugs.debian.org/447965).
+
+I hope that this will make it easier to maintain and submit patches for
+ikiwiki.
+
+--[[Joey]]
diff --git a/doc/news/moved_to_git/discussion.mdwn b/doc/news/moved_to_git/discussion.mdwn
new file mode 100644
index 000000000..fcb7186b9
--- /dev/null
+++ b/doc/news/moved_to_git/discussion.mdwn
@@ -0,0 +1,43 @@
+# Why Git?
+
+I'm very curious about the main reasons for your leaving Subversion and moving
+ikiwiki to Git. Is it only the easier way to maintain and submit patches for
+ikiwiki? It's very interesting to me, because I know you are a long-time
+Subversion user and very experienced with it.
+
+I know that Git is a very "trendy" SCM these days, but I don't understand the
+hype about it. It's not the only distributed SCM in the free/open source world.
+Maybe that model of work is better for you, but then you could also use Darcs,
+Mercurial, Bazaar or SVK :)
+
+--[[Paweł|ptecza]]
+
+> You forgot monotone. :-)
+>
+> Of those, only mercurial, monotone and git have support in ikiwiki, and the
+> git support seems most mature and is definitely used by the most sites.
+>
+> I don't consider which rcs is used a permanent or particularly significant
+> decision. I switched to svn with the express idea that sometime (I figured
+> within 10 years, it turned out to be 3), there would be something better,
+> with excellent conversion tools from svn.
+>
+> At the moment, I'm happy with git, and it's definitely useful to not have
+> to worry about who deserves commit access to ikiwiki, or about next summer's
+> [[soc]] students (if we participate again) having to go through the ad-hoc
+> mess this year's did to contribute.
+>
+> Being able to git-am < doc/todo/patch.mdwn is also potentially pretty neat. ;-)
+>
+> --[[Joey]]
+
+>> Haha, I've also forgotten Arch and Superversion and probably a lot of
+>> other exotic SCMs ;)
+>>
+>> OK, Ikiwiki is your project, so you're the boss here ;)
+>>
+>> BTW, what do you think about migration of Debian projects from
+>> [svn.debian.org](http://svn.debian.org/) to [git.debian.org](http://git.debian.org/)?
+>> Is it a good idea for Debian to use a few SCM servers?
+>>
+>> --[[Paweł|ptecza]] \ No newline at end of file
diff --git a/doc/news/new_domain_name.mdwn b/doc/news/new_domain_name.mdwn
new file mode 100644
index 000000000..395c3b651
--- /dev/null
+++ b/doc/news/new_domain_name.mdwn
@@ -0,0 +1 @@
+Ikiwiki has its own domain now, ikiwiki.info. Update your links. \ No newline at end of file
diff --git a/doc/news/no_more_email_notifications.mdwn b/doc/news/no_more_email_notifications.mdwn
new file mode 100644
index 000000000..685a0d340
--- /dev/null
+++ b/doc/news/no_more_email_notifications.mdwn
@@ -0,0 +1,14 @@
+ikiwiki.info has upgraded to the not yet released ikiwiki 2.30. This
+version of ikiwiki drops support for subscribing to commit mail
+notifications for pages. The idea is that you can subscribe to the new
+[[RecentChanges]] feed instead. (Or create your own custom feed of only the
+changes you're interested in, and subscribe to that.)
+
+So if you were subscribed to mail notifications on here, you'll need to
+change how you keep track of changes. Please let me know if there are any
+missing features in the [[RecentChanges]] feeds.
+
+Statically building the RecentChanges page also has performance implications;
+I'll keep an eye on [[server_speed]].
+
+--[[Joey]]
diff --git a/doc/news/openid.mdwn b/doc/news/openid.mdwn
new file mode 100644
index 000000000..0f4b8b5bf
--- /dev/null
+++ b/doc/news/openid.mdwn
@@ -0,0 +1,13 @@
+Ikiwiki in svn now has support for using [OpenID](http://openid.net), a
+decentralized authentication mechanism that allows you to have one login
+that you can use on a growing number of websites.
+
+Traditional password-based logins are still supported, but I'm considering
+switching at least ikiwiki.info over to using only OpenID logins.
+That would mean blowing away all the currently registered users and
+their preferences. If you're active on this wiki, I suggest you log out and
+log back in, try out the OpenID signup process if you don't already have an
+OpenID, and see how OpenID works for you. And let me know your feelings about
+making such a switch. --[[Joey]]
+
+[[!poll 70 "Accept only OpenID for logins" 21 "Accept only password logins" 47 "Accept both"]]
diff --git a/doc/news/openid/discussion.mdwn b/doc/news/openid/discussion.mdwn
new file mode 100644
index 000000000..bc9856ad9
--- /dev/null
+++ b/doc/news/openid/discussion.mdwn
@@ -0,0 +1,96 @@
+I think that I have logged in using openid! But I think the login page
+could use some adjustments.
+
+Perhaps the openid stuff should be separate, unless I was supposed to log in
+as well. Also, have I just created an account on this wiki as well?
+
+> The idea is that you fill in one or the other but not both. If it's
+> switched to only openid, it's much clearer, since the
+> username/password/register stuff disappears from the form.
+>
+> If both login methods are enabled, it's limited to using one form for
+> both though...
+>
+> By signing in with openid, you have created an account on the wiki; you
+> can configure it to eg, subscribe your email address to changes to pages.
+> --[[Joey]]
+
+OK, my openid login works too. One question though, is there a setup parameter which controls whether new registrations are permitted at all? For instance, I'm thinking that I'd like to use the wiki format for content, but I don't want it editable by anyone who isn't already set up. Does this work? --[[Tim Lavoie]]
+
+----
+
+# How to ban an IP address?
+
+There is a way to ban ikiwiki users, but how to ban an IP address?
+For example if a bitchy anonymous is bombing our poll. I can use
+only Apache/iptables rules for this? Maybe it's related to
+[[ACL|todo/ACL]] request? --[[Paweł|ptecza]]
+
+> Well, the polls are not something I would worry about much. I do plan to
+> add_IP_range_banning, although I expect to wait until
+> there's a demonstrated need. --[[Joey]]
+
+>> Heh, do you really want a lot of spam of me? ;)
+
+>> It was only an example of banning reason. Recently I've read about
+>> problems of Wikipedia with the vandals from Qatar. They demolished
+>> Qatar Wikipedia pages and the admins of Wikipedia had to ban all
+>> IP addresses of that country (fortunately Qatar has only one ISP).
+>> --[[Paweł|ptecza]]
+
+----
+
+## Error voting
+
+> Error: /srv/web/ikiwiki.info/todo/Configurable_minimum_length_of_log_message_for_web_edits/index.html independently created, not overwriting with version from todo/Configurable_minimum_length_of_log_message_for_web_edits
+
+[[users/jon]]
+
+----
+
+### Logging Out
+
+If I've logged in by OpenID, how do I log out? I don't see any logout
+button anywhere on IkiWiki. (is it because I hit "forever" for my OpenID authorization duration?)
+> No, it's because it's on the preferences page! That's somewhat non-obvious...
+
+>> This is a problem with having a static wiki. If I just put "Logout" as
+>> an action on every page, that will look weird if you're not logged in.
+>> --[[Joey]]
+
+Even if IkiWiki does let me log out, how do I *stay* logged out? Let's say I'm using a kiosk. What's to prevent someone else from hitting my OpenID service right after I've walked away? My OpenID service will just auth the login again, won't it? --[[sabr]] (behavior seems to vary... does it depend on the OpenID service? guess I have some docs to read.)
+
+> If you're at a kiosk, you'll need to log out of your openid provider too.
+> Or use a provider that doesn't use cookies to keep you logged in. (Or
+> don't check the box that makes your provider set a cookie when you log in.)
+>
+> AFAIK openid doesn't have single signoff capabilities yet. --[[Joey]]
+
+I'm having a problem using my preferred openid. I have
+http://thewordnerd.info configured as a delegate to
+thewordnerd.myopenid.com. It works fine on Lighthouse, Slicehost and
+everywhere else I've used it. Here, though, if I use the delegate I'm sent
+to my openid identity URL on myopenid.com. If I use the identity URL
+directly, I get the verification page.
+
+Is my delegation broken in some way that works for all these other apps but
+which fails here? Or is something broken in Ikiwiki's implementation?
+
+> I guess this is the same issue filed by you at
+> [[bugs/OpenID_delegation_fails_on_my_server]] --[[Joey]]
+
+Yes. I'd only recently set up my server as a delegate under wordpress, so still thought that perhaps the issue was on my end. But I'd since used my delegate successfully elsewhere, so I filed it as a bug against ikiwiki.
+
+----
+### Pretty Painless
+I just tried logging in with OpenID and it Just Worked. Pretty painless. If you want to turn off password authentication on ikiwiki.info, I say go for it. --[[blipvert]]
+
+> I doubt I will. The new login interface basically makes password login
+> and openid coexist nicely. --[[Joey]]
+
+### LiveJournal openid
+One caveat to the above is that, of course, OpenID is a distributed trust system, which means you do have to think about the trust aspect. A case in point is livejournal.com, whose OpenID implementation is badly broken in one important respect: if a LiveJournal user deletes his or her journal, and a different user registers a journal with the same name (this is actually quite a common occurrence on LiveJournal), they in effect inherit the previous journal owner's identity. LiveJournal does not even have a mechanism in place for a remote site to detect that a journal has changed hands. It is an extremely dodgy situation which they seem to have *no* intention of fixing, and the bottom line is that the "identity" represented by a *username*.livejournal.com token should not be trusted as to its long-term uniqueness. Just FYI. --[[blipvert]]
+
+----
+
+Submitting bugs in the OpenID components will be difficult if OpenID must be working first...
diff --git a/doc/news/server_move.mdwn b/doc/news/server_move.mdwn
new file mode 100644
index 000000000..49d025788
--- /dev/null
+++ b/doc/news/server_move.mdwn
@@ -0,0 +1,9 @@
+I've gone ahead and moved ikiwiki.info to the faster box mentioned on
+[[server_speed]]. Most poll respondents felt the old box was fast enough,
+but it's getting a bit overloaded with other stuff.
+
+If you can see this, you're seeing the new server. If not, your DNS server
+hasn't caught up yet. I'll keep the old server up for a while too and merge
+any changes across since git makes that bog-easy.
+
+Please report any problems..
diff --git a/doc/news/server_move_2009.mdwn b/doc/news/server_move_2009.mdwn
new file mode 100644
index 000000000..8be5debe1
--- /dev/null
+++ b/doc/news/server_move_2009.mdwn
@@ -0,0 +1,6 @@
+[[!meta title="server move"]]
+
+The ikiwiki.info domain has been moved to a new server. If you can see
+this, your DNS has already caught up and you are using the new server.
+By the way, the new server should be somewhat faster.
+--[[Joey]]
diff --git a/doc/news/server_speed.mdwn b/doc/news/server_speed.mdwn
new file mode 100644
index 000000000..60ce59d42
--- /dev/null
+++ b/doc/news/server_speed.mdwn
@@ -0,0 +1,9 @@
+Quick poll: Do you feel that ikiwiki is fast enough on this server, or
+should I move it to my much beefier auxiliary server?
+
+[[!poll open=no 40 "It's fast enough" 6 "It's too slow!" 4 "No opinion"]]
+
+If you have specifics on performance issues, you might mention them on the
+[[discussion]] page.
+
+Ikiwiki is now hosted at [Branchable](http://branchable.com/).
diff --git a/doc/news/server_speed/discussion.mdwn b/doc/news/server_speed/discussion.mdwn
new file mode 100644
index 000000000..22b6e0856
--- /dev/null
+++ b/doc/news/server_speed/discussion.mdwn
@@ -0,0 +1 @@
+It is a little slow on page saving, but since the majority of our time is spent reading or writing rather than saving it's not too big a deal. -- [[AdamShand]] \ No newline at end of file
diff --git a/doc/news/stylesheets.mdwn b/doc/news/stylesheets.mdwn
new file mode 100644
index 000000000..a5f094606
--- /dev/null
+++ b/doc/news/stylesheets.mdwn
@@ -0,0 +1,16 @@
+Some people may consider ikiwiki's default look to be a bit plain. Someone
+on slashdot even suggested perhaps it uses html 1.0. (Yes, an ikiwiki site
+has survived its first slashdotting. With static html, that's not very
+hard..) While the default style is indeed plain, there's more fine-tuning
+going on than you might think, and it's actually all done with xhtml and
+style sheets.
+
+Stefano Zacchiroli came up with the idea of adding a [[css_market]] page
+where [[IkiWikiUsers]] can share the style sheets they've come up with for
+ikiwiki. This is a great idea and I encourage those of you who have
+customised stylesheets to post them.
+
+I'm also always looking for minimalistic yet refined additions to the default
+style sheet, and always appreciate suggestions for it.
+
+--[[Joey]]
diff --git a/doc/news/stylesheets/discussion.mdwn b/doc/news/stylesheets/discussion.mdwn
new file mode 100644
index 000000000..cd557ec8c
--- /dev/null
+++ b/doc/news/stylesheets/discussion.mdwn
@@ -0,0 +1,3 @@
+## A plain look is fine ##
+
+More of the web should look like this. Plain is easier to read, I like it that way. My (small) ikiwiki sure is plain (but has more style than this one: css to hide some things, and to make pages narrower). Ulrik \ No newline at end of file
diff --git a/doc/news/version_3.20121212.mdwn b/doc/news/version_3.20121212.mdwn
new file mode 100644
index 000000000..473b63190
--- /dev/null
+++ b/doc/news/version_3.20121212.mdwn
@@ -0,0 +1,6 @@
+ikiwiki 3.20121212 released with [[!toggle text="these changes"]]
+[[!toggleable text="""
+ * filecheck: Fix bug that prevented File::MimeInfo::Magic from ever
+ being used.
+ * openid: Display openid in Preferences page as a comment, so it can be
+ selected in all browsers."""]] \ No newline at end of file
diff --git a/doc/news/version_3.20130212.mdwn b/doc/news/version_3.20130212.mdwn
new file mode 100644
index 000000000..7ec4b0f0c
--- /dev/null
+++ b/doc/news/version_3.20130212.mdwn
@@ -0,0 +1,18 @@
+ikiwiki 3.20130212 released with [[!toggle text="these changes"]]
+[[!toggleable text="""
+ * htmlscrubber: Allow the bitcoin URI scheme.
+ * htmlscrubber: Allow the URI schemes of major VCS's.
+ * aggregate: When run with --aggregate, if an aggregation is already
+ running, don't go on and --refresh.
+ * trail: Avoid excess dependencies between pages in the trail
+ and the page defining the trail. Thanks, smcv.
+ * opendiscussion: Don't allow editing discussion pages if discussion pages
+ are disabled. (smcv)
+ * poll: Add expandable option to allow users to easily add new choices to
+ a poll.
+ * trail: Avoid massive slowdown caused by pagetemplate hook when displaying
+ dynamic cgi pages, which cannot use trail anyway.
+ * Deal with empty diffurl in configuration.
+ * cvs: Various fixes. (schmonz)
+ * highlight: Now adds a span with class highlight-&lt;extension&gt; around
+ highlighted content, allowing for language-specific css styling."""]] \ No newline at end of file
diff --git a/doc/news/version_3.20130504.mdwn b/doc/news/version_3.20130504.mdwn
new file mode 100644
index 000000000..18baf01c4
--- /dev/null
+++ b/doc/news/version_3.20130504.mdwn
@@ -0,0 +1,11 @@
+ikiwiki 3.20130504 released with [[!toggle text="these changes"]]
+[[!toggleable text="""
+ * Allow dots in directive parameter names. (tango)
+ * Add missing plugin section, and deal with missing sections with a warning.
+ * Detect plugins with a broken getsetup and warn.
+ * map: Correct reversion introduced in version 3.20110225 that could
+ generate invalid html. (smcv)
+ * Makefile.PL: overwrite theme style.css instead of appending
+ (Thanks, Mikko Rapeli)
+ * meta: Fix anchors used to link to the page's license and copyright.
+ Closes: #[706437](http://bugs.debian.org/706437)"""]] \ No newline at end of file
diff --git a/doc/news/version_3.20130518.mdwn b/doc/news/version_3.20130518.mdwn
new file mode 100644
index 000000000..635b86935
--- /dev/null
+++ b/doc/news/version_3.20130518.mdwn
@@ -0,0 +1,9 @@
+ikiwiki 3.20130518 released with [[!toggle text="these changes"]]
+[[!toggleable text="""
+ * Fix test suite to not fail when XML::Twig is not installed.
+ Closes: #[707436](http://bugs.debian.org/707436)
+ * theme: Now &lt;TMPL\_IF THEME\_$NAME&gt; can be used in all templates when
+ a theme is enabled.
+ * notifyemail: Fix bug that caused duplicate emails to be sent when
+ site was rebuilt.
+ * bzr: bzr rm no longer has a --force option, remove"""]] \ No newline at end of file
diff --git a/doc/news/version_3.20130710.mdwn b/doc/news/version_3.20130710.mdwn
new file mode 100644
index 000000000..f1b30a7ff
--- /dev/null
+++ b/doc/news/version_3.20130710.mdwn
@@ -0,0 +1,23 @@
+ikiwiki 3.20130710 released with [[!toggle text="these changes"]]
+[[!toggleable text="""
+ * blogspam: Fix encoding issue in RPC::XML call.
+ Thanks, Changaco
+ * comments: The formats allowed to be used in comments can be configured
+ using comments\_allowformats.
+ Thanks, Michal Sojka
+ * calendar: When there are multiple pages for a given day, they're
+ displayed in a popup on mouseover.
+ Thanks, Louis
+ * osm: Remove trailing slash from KML maps icon.
+ * page.tmpl: omit searchform, trails, sidebar and most metadata in CGI
+ (smcv)
+ * openid: Automatically upgrade openid\_realm to https when
+ accessed via https.
+ * The ip() pagespec can now contain glob characters to match eg, a subnet
+ full of spammers.
+ * Fix crash that could occur when a needsbuild hook returned a file
+ that does not exist.
+ * Fix python proxy to not crash when fed unicode data in getstate
+ and setstate.
+ Thanks, chrysn
+ * Fix committing attachments when using svn."""]] \ No newline at end of file
diff --git a/doc/pagehistory.mdwn b/doc/pagehistory.mdwn
new file mode 100644
index 000000000..5c3b4a8d0
--- /dev/null
+++ b/doc/pagehistory.mdwn
@@ -0,0 +1,8 @@
+ikiwiki supports adding "History" links to the top of pages to browse the
+revision history of a page. This is enabled by the `historyurl` setting,
+which is used to specify the URL to a web interface such as [[ViewVC]]
+(for Subversion) or [[Gitweb]]. In that url, "\[[file]]" is replaced with
+the name of the file to view.
+
+The [[plugins/repolist]] plugin can supplement this information with
+urls to the underlying repository of the wiki.
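+
+As a sketch, such a setting in the setup file might look like this (the
+gitweb location `git.example.com` and repository name `wiki.git` are
+hypothetical; substitute your own):
+
+    # hypothetical gitweb history URL; ikiwiki fills in \[[file]]
+    historyurl => "http://git.example.com/gitweb.cgi?p=wiki.git;a=history;f=\[[file]]",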
diff --git a/doc/patch.mdwn b/doc/patch.mdwn
new file mode 100644
index 000000000..7d0f9847c
--- /dev/null
+++ b/doc/patch.mdwn
@@ -0,0 +1,12 @@
+Since we have enough people working on ikiwiki to be dangerous, or at least
+to duplicate work without coordination, here's a queue of suggested patches.
+
+If you post a patch to the [[todo]] or [[bugs]] list, or elsewhere,
+once it's ready to be applied, add a 'patch' tag so it will show up here.
+
+If your patch is non-trivial and might need several iterations to get
+right, or you'd just like to make it easy for [[Joey]] to apply it,
+please consider publishing a [[git]] [[branch|branches]].
+
+[[!inline pages="(todo/* or bugs/*) and link(patch) and !link(bugs/done) and
+!link(todo/done) and !*/Discussion" rootpage="todo" archive="yes"]]
diff --git a/doc/patch/core.mdwn b/doc/patch/core.mdwn
new file mode 100644
index 000000000..fcf0bdb72
--- /dev/null
+++ b/doc/patch/core.mdwn
@@ -0,0 +1,7 @@
+Some [[patches|patch]] affect the ikiwiki core, rather than just a plugin.
+This tag collects those patches. Please tag such patches with 'patch/core'
+as well as 'patch'.
+
+[[!inline pages="(todo/* or bugs/*) and link(patch/core)
+ and !link(bugs/done) and !link(todo/done) and !*/Discussion"
+ rootpage="todo" archive="yes"]]
diff --git a/doc/plugins.mdwn b/doc/plugins.mdwn
new file mode 100644
index 000000000..0bea33592
--- /dev/null
+++ b/doc/plugins.mdwn
@@ -0,0 +1,17 @@
+Most of ikiwiki's [[features]] are implemented as plugins. Many of these
+plugins are included with ikiwiki.
+
+[[!pagestats pages="plugins/type/* and !plugins/type/slow" among="plugins/*"]]
+
+There's documentation if you want to [[write]] your own plugins, or you can
+[[install]] plugins [[contributed|contrib]] by others.
+
+To enable a plugin, use the `--plugin` switch described in
+[[usage]], or the equivalent `add_plugins` line in ikiwiki.setup.
+Enable the [[goodstuff]] plugin to get a nice selection of plugins that
+will fit most uses of ikiwiki.
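+
+As a sketch, the equivalent setup file lines might look like this (the
+extra plugin names here are only examples):
+
+    # enable goodstuff plus a couple of extra plugins
+    add_plugins => [qw{goodstuff attachment anonok}],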
+
+## Plugin directory
+
+[[!map pages="plugins/* and !plugins/type/* and !plugins/write and
+!plugins/write/* and !plugins/contrib and !plugins/contrib/*/* and !plugins/install and !*/Discussion"]]
diff --git a/doc/plugins/404.mdwn b/doc/plugins/404.mdwn
new file mode 100644
index 000000000..128b26e7b
--- /dev/null
+++ b/doc/plugins/404.mdwn
@@ -0,0 +1,24 @@
+[[!template id=plugin name=404 author="[[Simon_McVittie|smcv]]"]]
+[[!tag type/web]]
+
+This plugin lets you use the IkiWiki CGI script as an Apache 404 handler,
+to give the behaviour of various other wiki engines where visiting a
+nonexistent page provides you with a link to create it.
+
+To enable the 404 handler you need to:
+
+1. Edit your `.setup` file and add `404` to the `add_plugins` line.
+2. Add a 404 error document handler in your Apache configuration:
+
+ `ErrorDocument 404 /url/path/to/ikiwiki.cgi`
+
+ Where `/url/path/to` is the path portion of the URL to the `ikiwiki.cgi` binary.
+
+This plugin might also be useful on non-Apache web servers, if they provide the
+`REDIRECT_STATUS` and `REDIRECT_URL` environment variables to their 404 handlers.
+`REDIRECT_STATUS` should be `404` and `REDIRECT_URL` should be the path
+part of the URL (for instance it would be `/plugins/404/` if this page was missing).
+
+If you would like help with adapting this plugin for a different web server,
+you will need to provide the output of
+[[this 404 handler|forum/nginx:_404_plugin_not_working#comment-6b1607f7961d2873517d4780f56ac3ad]].
diff --git a/doc/plugins/404/discussion.mdwn b/doc/plugins/404/discussion.mdwn
new file mode 100644
index 000000000..5a8e8ed85
--- /dev/null
+++ b/doc/plugins/404/discussion.mdwn
@@ -0,0 +1,3 @@
+With Apache, if you have a page foo/bar/baz but no foo/bar, and if you've
+disabled `Indexes` option, you'll end up with a `403` response for foo/bar.
+The 404 plugin doesn't try to handle that. But it should. -- [[Jogo]]
diff --git a/doc/plugins/aggregate.mdwn b/doc/plugins/aggregate.mdwn
new file mode 100644
index 000000000..75123d923
--- /dev/null
+++ b/doc/plugins/aggregate.mdwn
@@ -0,0 +1,57 @@
+[[!template id=plugin name=aggregate author="[[Joey]]"]]
+[[!tag type/special-purpose]]
+
+This plugin allows content from other feeds to be aggregated into the
+wiki. To specify feeds to aggregate, use the
+[[ikiwiki/directive/aggregate]] [[ikiwiki/directive]].
+
+## requirements
+
+The [[meta]] and [[tag]] plugins are also recommended for use with this
+one. Either the [[htmltidy]] or [[htmlbalance]] plugin is suggested, since
+feeds can easily contain html problems, some of which these plugins can fix.
+
+## triggering aggregation
+
+You will need to run ikiwiki periodically from a cron job, passing it the
+--aggregate parameter, to make it check for new posts. Here's an example
+crontab entry:
+
+ */15 * * * * ikiwiki --setup my.wiki --aggregate --refresh
+
+The plugin updates a file `.ikiwiki/aggregatetime` with the unix time stamp
+when the next aggregation run could occur. (The file may be empty, if no
+aggregation is required.) This can be integrated into more complex cron
+jobs or systems to trigger aggregation only when needed.
+
+Alternatively, you can allow `ikiwiki.cgi` to trigger the aggregation. You
+should only need this if for some reason you cannot use cron, and instead
+want to use a service such as [WebCron](http://webcron.org). To enable
+this, turn on `aggregate_webtrigger` in your setup file. The url to
+visit is `http://whatever/ikiwiki.cgi?do=aggregate_webtrigger`. Anyone
+can visit the url to trigger an aggregation run, but it will only check
+each feed if its `updateinterval` has passed.
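+
+Putting these options together, a sketch of the relevant setup file
+lines (the values shown are examples, not necessarily the defaults):
+
+    add_plugins => [qw{aggregate meta tag}],
+    # allow ikiwiki.cgi?do=aggregate_webtrigger to trigger aggregation
+    aggregate_webtrigger => 1,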
+
+## aggregated pages
+
+This plugin creates a page for each aggregated item.
+
+If the `aggregateinternal` option is enabled in the setup file (which is
+the default), aggregated pages are stored in the source directory with a
+"._aggregated" extension. These pages cannot be edited by web users, and
+do not generate first-class wiki pages. They can still be inlined into a
+blog, but you have to use `internal` in [[PageSpecs|IkiWiki/PageSpec]],
+like `internal(blog/*)`.
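+
+For example, a sketch of inlining such pages into a blog (the `blog/*`
+path and the `show` count are arbitrary):
+
+    \[[!inline pages="internal(blog/*)" show=10]]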
+
+If `aggregateinternal` is disabled, you will need to enable the [[html]]
+plugin as well as aggregate itself, since feed entries will be stored as
+HTML, and as first-class wiki pages -- each one generates
+a separate HTML page in the output, and they can even be edited. This
+option is provided only for backwards compatibility.
+
+## cookies
+
+The `cookiejar` option can be used to configure how [[!cpan LWP::UserAgent]]
+handles cookies. The default is to read them from a file
+`~/.ikiwiki/cookies`, which can be populated using standard perl cookie
+tools like [[!cpan HTTP::Cookies]].
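+
+As a sketch, pointing `cookiejar` at a different cookie file in the
+setup file (the path is an example):
+
+    cookiejar => { file => "$ENV{HOME}/.cookies.txt" },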
diff --git a/doc/plugins/aggregate/discussion.mdwn b/doc/plugins/aggregate/discussion.mdwn
new file mode 100644
index 000000000..028775ec8
--- /dev/null
+++ b/doc/plugins/aggregate/discussion.mdwn
@@ -0,0 +1,137 @@
+I'm trying to set up a [planet of my users' blogs](http://help.schmonz.com/planet/). I've enabled the aggregate, meta, and tag plugins (but not htmltidy, that thing has a gajillion dependencies). `aggregateinternal` is 1. The cron job is running and I've also enabled the webtrigger. My usage is like so:
+
+ \[[!inline pages="internal(planet/*) show=0"]]
+
+ \[[!aggregate
+ name="Amitai's blog"
+ url="http://www.schmonz.com/"
+ dir="planet/schmonz-blog"
+ feedurl="http://www.schmonz.com/atom/"
+ expirecount="2"
+ tag="schmonz"
+ ]]
+
+ \[[!aggregate
+ name="Amitai's photos"
+ url="http://photos.schmonz.com/"
+ dir="planet/schmonz-photos"
+ feedurl="http://photos.schmonz.com/main.php?g2_view=rss.SimpleRender&g2_itemId=7"
+ expirecount="2"
+ tag="schmonz"
+ ]]
+
+
+(and a few more `aggregate` directives like these)
+
+Two things aren't working as I'd expect:
+
+1. `expirecount` doesn't take effect on the first run, but on the second. (This is minor, just a bit confusing at first.)
+2. Where are the article bodies for e.g. David's and Nathan's blogs? The bodies aren't showing up in the `._aggregated` files for those feeds, but the bodies for my own blog do, which explains the planet problem, but I don't understand the underlying aggregation problem. (Those feeds include article bodies, and show up normally in my usual feed reader rss2email.) How can I debug this further? --[[schmonz]]
+
+> I only looked at David's, but its rss feed is not escaping the html
+> inside the rss `description` tags, which is illegal for rss 2.0. These
+> unknown tags then get ignored, including their content, and all that's
+> left is whitespace. Escaping the html to `&lt;` and `&gt;` fixes the
+> problem. You can see the feed validator complain about it here:
+> <http://feedvalidator.org/check.cgi?url=http%3A%2F%2Fwww.davidj.org%2Frss.xml>
+>
+> It's sorta unfortunate that [[!cpan XML::Feed]] doesn't just assume the
+> un-escaped html is part of the description field. Probably other feed
+> parsers are more lenient. --[[Joey]]
+
+>> Thanks for the quick response (and the `expirecount` fix); I've forwarded it to David so he can fix his feed. Nathan's Atom feed validates -- it's generated by the same CMS as mine -- so I'm still at a loss on that one. --[[schmonz]]
+
+>>> Nathan's feed contains only summary elements, with no content elements.
+>>> This is legal according to the Atom spec, so I've fixed ikiwiki to use
+>>> the summary if no content is available. --[[Joey]]
+
+>>>> After applying your diffs, blowing away my cached aggregated stuff, and running the aggregate cron job by hand, the resulting planet still doesn't have Nathan's summaries... and the two posts from each feed that aren't being expired aren't the two newest ones (not sure what the pattern is there). Have I done something wrong? --[[schmonz]]
+
+>>>>> I think that both issues are now fixed. Thanks for testing.
+>>>>> --[[Joey]]
+
+>>>>>> I can confirm, they're fixed on my end. --[[schmonz]]
+
+New bug: new posts aren't getting displayed (or cached for aggregation). After fixing his feed, David posted a new item today, and the aggregator is convinced there's nothing to do, whether by cronjob or webtrigger. I verified that it wasn't another problem with his feed by adding another of my ikiwiki's feed to the planet, running the aggregator, posting a new item, and running the aggregator again: no new item. --[[schmonz]]
+
+> Even if you start it more frequently, aggregation will only occur every
+> `updateinterval` minutes (default 15), maximum. Does this explain what
+> you're seeing? --[[Joey]]
+
+>> Crap, right, and my test update has since made it into the planet. His post still hasn't. So it must be something with David's feed again? A quick test with XML::Feed looks like it's parsing just fine: --[[schmonz]]
+
+ $ perl
+ use XML::Feed;
+ my $feed = XML::Feed->parse(URI->new('http://www.davidj.org/rss.xml')) or die XML::Feed->errstr;
+ print $feed->title, "\n";
+ for my $entry ($feed->entries) {
+ print $entry->title, ": ", $entry->issued, "\n";
+ }
+ ^D
+ davidj.org
+ Amway Stories - Refrigerator Pictures: 2008-09-19T00:12:27
+ Amway Stories - Coffee: 2008-09-13T10:08:17
+ Google Alphabet Update: 2008-09-11T22:55:37
+ Writing for writing's sake: 2008-09-09T23:39:05
+ Google Chrome: 2008-09-02T23:12:26
+ Mister Casual: 2008-07-25T09:01:17
+ Parental Conversations: 2008-07-24T10:44:44
+ Place Of George Orwell: 2008-06-03T22:11:07
+ The Raw Beauty Of A National Duolian: 2008-05-31T12:41:06
+
+> I had no problem getting the "Refrigerator Pictures" post to aggregate
+> here, though without a copy of the old feed I can't be 100% sure I've
+> reproduced your ikiwiki's state. --[[Joey]]
+
+>> Okay, I blew away the cached entries and aggregator state files and reran the aggregator and all appears well again. If the problem recurs I'll be sure to post here. :-) --[[schmonz]]
+
+>>> On the off chance that you retained a copy of the old state, I'd not
+>>> mind having a copy to investigate. --[[Joey]]
+
+>>>> Didn't think of that, will keep a copy if there's a next time. -- [[schmonz]]
+
+-----
+
+In a corporate environment where feeds are generally behind
+authentication, I need to prime the aggregator's `LWP::UserAgent`
+with some cookies. What I've done is write a custom plugin to populate
+`$config{cookies}` with an `HTTP::Cookies` object, plus this diff:
+
+ --- /var/tmp/pkg/lib/perl5/vendor_perl/5.10.0/IkiWiki/Plugin/aggregate.pm 2010-06-24 13:03:33.000000000 -0400
+ +++ aggregate.pm 2010-06-24 13:04:09.000000000 -0400
+ @@ -488,7 +488,11 @@
+ }
+ $feed->{feedurl}=pop @urls;
+ }
+ - my $res=URI::Fetch->fetch($feed->{feedurl});
+ + my $res=URI::Fetch->fetch($feed->{feedurl},
+ + UserAgent => LWP::UserAgent->new(
+ + cookie_jar => $config{cookies},
+ + ),
+ + );
+ if (! $res) {
+ $feed->{message}=URI::Fetch->errstr;
+ $feed->{error}=1;
+
+It works, but I have to remember to apply the diff whenever I update
+ikiwiki. Can you provide a more elegant means of allowing cookies and/or
+the user agent to be programmatically manipulated? --[[schmonz]]
+
+> Ping -- is the above patch perhaps acceptable (or near-acceptable)? -- [[schmonz]]
+
+>> Pong.. I'd be happier with a more 100% solution that let cookies be used
+>> w/o needing to write a custom plugin to do it. --[[Joey]]
+
+>>> According to LWP::UserAgent, for the common case, a complete
+>>> and valid configuration for `$config{cookies}` would be `{ file =>
+>>> "$ENV{HOME}/.cookies.txt" }`. In the more common case of not needing
+>>> to prime one's cookies, `cookie_jar` can be `undef` (that's the
+>>> default). In my less common case, the cookies are generated by
+>>> visiting a couple magic URLs, which would be trivial to turn into
+>>> config options, except that these particular URLs rely on SPNEGO
+>>> and so LWP::Authen::Negotiate has to be loaded. So I think adding
+>>> `$config{cookies}` (and using it in the aggregate plugin) should
+>>> be safe, might help people in typical cases, and won't prevent
+>>> further enhancements for less typical cases. --[[schmonz]]
+
+>>>> Ok, done. Called it cookiejar. --[[Joey]]
diff --git a/doc/plugins/amazon_s3.mdwn b/doc/plugins/amazon_s3.mdwn
new file mode 100644
index 000000000..7fe60cb8d
--- /dev/null
+++ b/doc/plugins/amazon_s3.mdwn
@@ -0,0 +1,68 @@
+[[!template id=plugin name=amazon_s3 author="[[Joey]]"]]
+[[!tag type/special-purpose]]
+
+This plugin allows ikiwiki to publish a wiki in the [Amazon Simple Storage
+Service](http://aws.amazon.com/s3) (S3). As pages are rendered, ikiwiki
+will upload them to Amazon S3. The entire wiki contents, aside from the
+ikiwiki CGI, can then be served directly out of Amazon S3.
+
+You'll need the [[!cpan Net::Amazon::S3]] and [[!cpan File::MimeInfo]] perl
+modules and an Amazon S3 account to use this plugin.
+
+## configuration
+
+This plugin uses the following settings in the setup file:
+
+* `amazon_s3_key_id` - Set to your public access key id.
+* `amazon_s3_key_file` - Set to the name of a file where you have
+ stored your secret access key. The content of this file *must*
+ be kept secret.
+* `amazon_s3_bucket` - The globally unique name of the bucket to use to
+ store the wiki. This determines the URL to your wiki. For example, if you
+ set it to "foo", then the url will be
+ "http://foo.s3.amazonaws.com/wiki/".
+* `amazon_s3_prefix` - A prefix to prepend to each page name.
+ The default is "wiki/". Note: In order to host your site at the root,
+ it needs to be set to "", and you'll have to
+ [read this](http://aws.typepad.com/aws/2011/02/host-your-static-website-on-amazon-s3.html)
+ for details about configuring your S3 bucket as a website.
+* `amazon_s3_location` - Optionally, this can be set to control which
+  datacenter to use. For example, set it to "EU" for Europe.
+* `amazon_s3_dupindex` - Normally, when `usedirs` is enabled,
+ "foo/index.html" is stored in S3 as a key named "foo/", and all links
+  between pages use that name. If you also need links that include
+ "index.html" in their names to work, you can enable this option. Then
+ each index.html file will be stored in S3 *twice*, under both names. This
+ will use more disk and bandwidth, and is not recommended unless you really
+ need it for some reason. These days, it's probably better to configure
+ your S3 bucket as a website.
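+
+Putting the required options together, a minimal sketch of a setup file
+section for this plugin (the bucket name and key file path are
+hypothetical):
+
+    add_plugins => [qw{amazon_s3}],
+    amazon_s3_key_id => "yourpublickeyid",
+    amazon_s3_key_file => "/home/you/.s3_secret_key",
+    amazon_s3_bucket => "foo",
+    amazon_s3_prefix => "wiki/",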
+
+Note that you should still set `destdir` in the setup file. The files that
+are uploaded to Amazon S3 will still be written to the destdir, too.
+
+Likewise, you will probably want to set the `url` in the setup file.
+The url can use the `foo.s3.amazonaws.com` domain name, or another domain
+name that is a CNAME for it.
+
+The `hardlink` config file setting is not compatible with this plugin.
+
+## data transfer notes
+
+If you run 'ikiwiki -setup my.setup' to force a rebuild of your wiki, the
+entire thing will be re-uploaded to Amazon S3. This will take time, and
+cost you money, so it should be avoided as much as possible.
+
+If you run 'ikiwiki -setup my.setup -refresh', ikiwiki will only upload the
+modified pages that it refreshes. Faster and cheaper. Still, if you have
+very large pages (for example, a page that inlines hundreds of other pages
+.. or is just very large), the complete page contents will be re-uploaded
+each time it's changed. Amazon S3 does not currently support partial/rsync
+type uploads.
+
+Copy and rename detection is not done, so if you copy or rename a large file,
+it will be re-uploaded, rather than copied.
+
+## deleting a bucket
+
+You can use "ikiwiki -setup my.setup --delete-bucket" to delete anything
+that's in the configured bucket, and remove the bucket.
diff --git a/doc/plugins/amazon_s3/discussion.mdwn b/doc/plugins/amazon_s3/discussion.mdwn
new file mode 100644
index 000000000..e5dcf064a
--- /dev/null
+++ b/doc/plugins/amazon_s3/discussion.mdwn
@@ -0,0 +1,18 @@
+Awesome idea! With this + NearlyFreeSpeech.NET you can have a highly discounted Wiki hosted...
+
+Now... just wondering... how could this be done while keeping things such as Google pagerank/searches and such 'sane'...
+
+One 'could' host S3 under the 'www' domain and you'd be all set... but then you would probably have some ugly problems to work around if you wanted to add some other dynamic content... especially since
+I'm hard-pressed to figure out any dynamic hosting I would even use that I'd care was indexed... (perhaps using JS in an ikiwiki page for some dynamic update)
+
+Any thoughts/ideas on this? Any example ikiwiki up on Amazon?
+-- [[harningt]]
+
+> Well, I haven't needed to use S3, so I deleted my test wikis to save
+> money (pennies..).
+>
+> The main problem is that S3 can't serve http://hostname/. You have to use
+> http://hostname/<wiki>/. This would be a problem in many situations.
+>
+> Once google has a link to it though, it should be able to index it fine,
+> just like any other web site. --[[Joey]]
diff --git a/doc/plugins/anonok.mdwn b/doc/plugins/anonok.mdwn
new file mode 100644
index 000000000..407012b54
--- /dev/null
+++ b/doc/plugins/anonok.mdwn
@@ -0,0 +1,19 @@
+[[!template id=plugin name=anonok author="[[Joey]]"]]
+[[!tag type/auth type/comments]]
+
+By default, anonymous users cannot edit the wiki. This plugin allows
+anonymous web users, who have not signed in, to edit any page in the wiki
+by default.
+
+Please think twice before enabling this plugin. If your wiki is accessible
+to the internet, it *will* be subject to spamming if this plugin is
+enabled. Such spam is not only a pain to deal with, but it bloats the
+revision control history of your wiki.
+
+The plugin has a configuration setting, `anonok_pagespec`. This
+[[ikiwiki/PageSpec]] can be used to allow anonymous editing of matching pages.
+
+If you're using the [[comments]] plugin, you can allow anonymous comments
+to be posted by setting:
+
+ anonok_pagespec => "postcomment(*)"
diff --git a/doc/plugins/attachment.mdwn b/doc/plugins/attachment.mdwn
new file mode 100644
index 000000000..4fcd714f8
--- /dev/null
+++ b/doc/plugins/attachment.mdwn
@@ -0,0 +1,27 @@
+[[!template id=plugin name=attachment core=0 author="[[Joey]]"]]
+[[!tag type/web]]
+
+This plugin allows files to be uploaded to the wiki over the web.
+
+For each page `foo`, files in the subdirectory `foo/` are treated as
+attachments of that page. Attachments can be uploaded and managed as
+part of the interface for editing a page.
+
+Warning: Do not enable this plugin on publicly editable wikis, unless you
+take care to lock down the types and sizes of files that can be uploaded.
+Bear in mind that if you let anyone upload a particular kind of file
+("*.mp3" files, say), then someone can abuse your wiki in at least three ways:
+
+1. By uploading many mp3 files, wasting your disk space.
+2. By uploading mp3 files that attempt to exploit security holes
+ in web browsers or other players.
+3. By uploading files that claim to be mp3 files, but are really some
+   other kind of file. Some web browsers may display a `foo.mp3` that
+   contains html as a web page, including running any malicious javascript
+   embedded in that page.
+
+If you enable this plugin, be sure to lock it down, via the
+`allowed_attachments` setup file option. This is a special
+[[enhanced_PageSpec|ikiwiki/pagespec/attachment]] using tests provided by
+the [[filecheck]] plugin. That plugin will be automatically enabled when
+this plugin is enabled.
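+
+For example, to allow only smallish images to be attached, the setup file
+could contain something along these lines (a sketch; the size limit is
+illustrative):
+
+	allowed_attachments => "mimetype(image/*) and maxsize(500kb)",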
diff --git a/doc/plugins/autoindex.mdwn b/doc/plugins/autoindex.mdwn
new file mode 100644
index 000000000..e1cfe1157
--- /dev/null
+++ b/doc/plugins/autoindex.mdwn
@@ -0,0 +1,10 @@
+[[!template id=plugin name=autoindex core=0 author="[[Joey]]"]]
+[[!tag type/special-purpose]]
+
+This plugin searches for [[SubPages|ikiwiki/subpage]] with a missing parent
+page, and generates the parent pages. The generated page content is
+controlled by the `autoindex.tmpl` [[template|templates]], which by
+default uses a [[map]] to list the SubPages.
+
+The `autoindex_commit` setting is enabled by default, and causes
+pages generated by autoindex to be checked into version control.
diff --git a/doc/plugins/autoindex/discussion.mdwn b/doc/plugins/autoindex/discussion.mdwn
new file mode 100644
index 000000000..76d09cd3c
--- /dev/null
+++ b/doc/plugins/autoindex/discussion.mdwn
@@ -0,0 +1,84 @@
+Would it be possible to add an option to only generate the index files
+for the html output and not place the markdown files in the wiki source?
+
+> Or better still, add a mechanism for ikiwiki to hold transient source
+> pages in memory and render them as if they existed, without actually
+> writing them out, as [[JoeRayhawk]] suggests below? I think
+> add_autofile would be the way to do this.
+> I've added this to [[todo]] as [[todo/autoindex should use add__95__autofile]]
+> and [[todo/transient_pages]]. --[[smcv]]
+
+The reason being that I have a lot of directories which need to be autoindexed,
+but I would prefer if the index files didn't clutter up my git repository.
+
+even without that feature the plugin is a great help, thanks
+
+
+------
+
+If you just don't want to clutter your git repo, below is a patch that does the following:
+
+* If you set autoindex_commit to 0 in your ikiwiki.setup file, we *do* place auto-generated markdown files in the **wiki source** but *not* in the **repo**
+
+* If you set autoindex_commit to 1 (this is the default), auto-generated index files will be put in the repo, provided you have enabled an rcs backend.
+
+[[!toggle id="patch-for-autoindex_commit" text="patch for autoindex_commit"]]
+[[!toggleable id="patch-for-autoindex_commit" text="""
+<pre>
+--- autoindex.pm.orig 2009-10-01 17:13:51.000000000 +0800
++++ autoindex.pm 2009-10-01 17:21:09.000000000 +0800
+@@ -17,6 +17,13 @@
+ safe => 1,
+ rebuild => 0,
+ },
++ autoindex_commit => {
++ type => 'boolean',
++ default => 1,
++ description => 'commit generated autoindex pages into RCS',
++ safe => 0,
++ rebuild => 0,
++ },
+ }
+
+ sub genindex ($) {
+@@ -25,7 +32,7 @@
+ my $template=template("autoindex.tmpl");
+ $template->param(page => $page);
+ writefile($file, $config{srcdir}, $template->output);
+- if ($config{rcs}) {
++ if ($config{rcs} and $config{autoindex_commit}) {
+ IkiWiki::rcs_add($file);
+ }
+ }
+@@ -94,13 +101,13 @@
+ }
+
+ if (@needed) {
+- if ($config{rcs}) {
++ if ($config{rcs} and $config{autoindex_commit}) {
+ IkiWiki::disable_commit_hook();
+ }
+ foreach my $page (@needed) {
+ genindex($page);
+ }
+- if ($config{rcs}) {
++ if ($config{rcs} and $config{autoindex_commit}) {
+ IkiWiki::rcs_commit_staged(
+ gettext("automatic index generation"),
+ undef, undef);
+</pre>
+"""]]
+
+Warning: I guess this patch may work, but I *haven't tested it yet*. -- [[weakish]]
+
+------
+
+`autoindex_commit => 0` would be nice, but uncommitted files are definitely not.
+<pre>
+remote: From /srv/git/test3
+remote: 3047077..1df636c master -> origin/master
+remote: error: Untracked working tree file 'test.mdwn' would be overwritten by merge. Aborting
+remote: 'git pull --prune origin' failed: at /usr/share/perl5/IkiWiki/Plugin/git.pm line 201.
+</pre>
+
+It'd be nice if we were able to notice directories with no associated compilable markup files and compile a simple map directive straight to HTML without any intermediate markup file being involved at all. --[[JoeRayhawk]]
diff --git a/doc/plugins/blogspam.mdwn b/doc/plugins/blogspam.mdwn
new file mode 100644
index 000000000..3dd017f61
--- /dev/null
+++ b/doc/plugins/blogspam.mdwn
@@ -0,0 +1,32 @@
+[[!template id=plugin name=blogspam author="[[Joey]]"]]
+[[!tag type/auth type/comments]]
+
+This plugin adds antispam support to ikiwiki, using the
+[blogspam.net](http://blogspam.net/) API. Both page edits and
+[[comment|comments]] postings can be checked for spam. Page edits that
+appear to contain spam will be rejected; comments that look spammy will be
+stored in a queue for moderation by an admin.
+
+To check for and moderate comments, log in to the wiki as an admin,
+go to your Preferences page, and click the "Comment Moderation" button.
+
+The plugin requires the [[!cpan RPC::XML]] perl module.
+
+You can control how content is tested via the `blogspam_options` setting.
+The list of options is [here](http://blogspam.net/api/testComment.html#options).
+By default, the options are configured in a way that is appropriate for
+wiki content. This includes turning off some of the more problematic tests.
+An interesting option for testing is `fail`: by setting it (e.g.,
+`blogspam_options => 'fail'`), *all* comments will be marked as SPAM, so that
+you can check whether the interaction with blogspam.net works.
+
+The `blogspam_pagespec` setting is a [[ikiwiki/PageSpec]] that can be
+used to configure which pages are checked for spam. The default is to check
+all edits. If you only want to check [[comments]] (not wiki page edits),
+set it to "postcomment(*)". Posts by admins are never checked for spam.
+
+By default, the blogspam.net server is used to do the spam checking. To
+change this, the `blogspam_server` option can be set to the url for a
+different server implementing the same API. Note that content is sent
+unencrypted over the internet to the server, and the server sees
+the full text of the content.
diff --git a/doc/plugins/brokenlinks.mdwn b/doc/plugins/brokenlinks.mdwn
new file mode 100644
index 000000000..c039a9d17
--- /dev/null
+++ b/doc/plugins/brokenlinks.mdwn
@@ -0,0 +1,9 @@
+[[!template id=plugin name=brokenlinks author="[[Joey]]"]]
+[[!tag type/link type/meta]]
+
+This plugin provides a [[ikiwiki/directive/brokenlinks]] [[ikiwiki/directive]]
+that generates a list of broken links on pages in the wiki.
+
+If this plugin is turned on, here's a list of broken links on this wiki:
+
+[[!brokenlinks pages="* and !recentchanges"]]
diff --git a/doc/plugins/calendar.mdwn b/doc/plugins/calendar.mdwn
new file mode 100644
index 000000000..76e718a3b
--- /dev/null
+++ b/doc/plugins/calendar.mdwn
@@ -0,0 +1,36 @@
+[[!template id=plugin name=calendar author="[[ManojSrivastava]]"]]
+[[!tag type/widget]]
+
+This plugin provides a [[ikiwiki/directive/calendar]] [[ikiwiki/directive]].
+The directive displays a calendar, similar to the typical calendars shown on
+some blogs.
+
+The [[ikiwiki-calendar]] command is used to keep the calendar up-to-date.
+
+## CSS
+
+The output is liberally sprinkled with classes, for fine grained CSS
+customization.
+
+* `month-calendar` - The month calendar as a whole.
+* `month-calendar-head` - The head of the month calendar (ie, "March").
+* `month-calendar-arrow` - Arrow pointing to previous/next month.
+* `month-calendar-day-head` - A column head in the month calendar (ie, a
+ day-of-week abbreviation).
+* `month-calendar-day-noday`, `month-calendar-day-link`,
+ `month-calendar-day-nolink`, `month-calendar-day-future`,
+ `month-calendar-day-this-day` - The day squares on the month calendar,
+ for days that are not in the month (before or after the month itself), that
+  don't have links, that do have links, that are in the future, or that
+ are the current day, respectively.
+* `Sunday`, `Monday`, `Tuesday`, ... - Each day square is also given a class
+  matching its (localised) day of week; this can be used to highlight
+ weekends.
+* `year-calendar` - The year calendar as a whole.
+* `year-calendar-head` - The head of the year calendar (ie, "2007").
+* `year-calendar-arrow` - Arrow pointing to previous/next year.
+* `year-calendar-subhead` - For example, "Months".
+* `year-calendar-month-link`, `year-calendar-month-nolink`,
+ `year-calendar-month-future`, `year-calendar-this-month` - The month
+ squares on the year calendar, for months with stories,
+ without, in the future, and currently selected, respectively.
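+
+For example, the per-day-of-week classes could be combined with the other
+day classes to shade weekends, with CSS along these lines (a sketch):
+
+	.month-calendar-day-link.Saturday,
+	.month-calendar-day-link.Sunday { background-color: #eee; }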
diff --git a/doc/plugins/calendar/discussion.mdwn b/doc/plugins/calendar/discussion.mdwn
new file mode 100644
index 000000000..6fc21e8ee
--- /dev/null
+++ b/doc/plugins/calendar/discussion.mdwn
@@ -0,0 +1,23 @@
+It would be nice if the "month" type calendar could collect all of the
+matching pages on a given date in some inline type way. --[[DavidBremner]]
+
+> I agree, but I have not come up with good html to display them. Seems
+> it might need some sort of popup.
+
+Is it possible to get the calendar to link to pages based not on their timestamp (as I understand it does now, or have I misunderstood this?) but instead on, for example, their location in a directory hierarchy? That way the calendar could be used as a planning / timeline device, which I think would be great. --[[Alexander]]
+
+I would like the ability to specify relative previous months. This way I
+could have a sidebar with the last three months by specifying no month,
+then 'month="-1"' and 'month="-2"'. Negative numbers for the month would
+otherwise be invalid, so this shouldn't produce any conflicts with expected
+behavior. (Right?) -- [[StevenBlack]]
+
+> Great idea! Just implemented that and also relative years. --[[Joey]]
+
+Anyone know of a way to generate a link to the previous and next calendar pages for archive browsing? In the worst case, that requires regenerating pages on either side of the current one when something is inserted in the history, and I can't quite figure that much out. --[[JasonRiedy]]
+
+> Well, the calendar directive puts such links on the calendars. They're
+> the arrows to either side of the month or year at the top. --[[Joey]]
+
+>> Thanks. I either missed them or they appeared on an upgrade. I might make them a bit more obvious with
+>> "Previous Month" / "Next Month" links above and below the text. Someday.--[[JasonRiedy]]
diff --git a/doc/plugins/camelcase.mdwn b/doc/plugins/camelcase.mdwn
new file mode 100644
index 000000000..d9b7172d5
--- /dev/null
+++ b/doc/plugins/camelcase.mdwn
@@ -0,0 +1,13 @@
+[[!template id=plugin name=camelcase author="[[Joey]]"]]
+
+This plugin makes words in CamelCase be treated as a [[ikiwiki/WikiLink]].
+That is to say, any two or more words capitalised and mashed together are
+assumed to be the name of some other page on the wiki, and so become a
+link.
+
+If this plugin is enabled, this will be a link: SandBox
+
+Use of this plugin is not recommended, particularly on complex wikis with
+things like [[aggregate]] in use.
+
+[[!tag type/link]]
diff --git a/doc/plugins/color.mdwn b/doc/plugins/color.mdwn
new file mode 100644
index 000000000..d639bf563
--- /dev/null
+++ b/doc/plugins/color.mdwn
@@ -0,0 +1,5 @@
+[[!template id=plugin name=color core=0 author="[[ptecza]]"]]
+[[!tag type/widget]]
+
+This plugin provides a [[ikiwiki/directive/color]] [[ikiwiki/directive]].
+The directive can be used to color a piece of text on a page.
diff --git a/doc/plugins/comments.mdwn b/doc/plugins/comments.mdwn
new file mode 100644
index 000000000..50a99415f
--- /dev/null
+++ b/doc/plugins/comments.mdwn
@@ -0,0 +1,55 @@
+[[!template id=plugin name=comments author="[[Simon_McVittie|smcv]]"]]
+[[!tag type/web type/comments]]
+
+This plugin adds "blog-style" comments. Unlike the wiki-style freeform
+Discussion pages, these comments are posted via a simple form, cannot later
+be edited, and have rss/atom feeds provided for each page's comments.
+
+When using this plugin, you should also enable [[htmlscrubber]] and either
+[[htmltidy]] or [[htmlbalance]]. Directives are filtered out by default, to
+avoid commenters slowing down the wiki by causing time-consuming
+processing. As long as the recommended plugins are enabled, comment
+authorship should hopefully be unforgeable by CGI users.
+
+The intention is that on a non-wiki site (like a blog) you can lock all
+pages for admin-only access, then allow otherwise unprivileged (or perhaps
+even anonymous) users to comment on posts. See the documentation of the
+[[opendiscussion]], [[lockedit]] and [[anonok]] pages for details on locking
+down a wiki so readers can only post comments.
+
+Individual comments are stored as internal-use pages named something like
+`page/comment_1`, `page/comment_2`, etc. These pages internally use a
+[[comment_directive|ikiwiki/directive/comment]].
+
+There are some global options for the setup file:
+
+* `comments_pagespec`: [[ikiwiki/PageSpec]] of pages where comments are
+ allowed. The default is not to allow comments on any pages. To allow
+ comments to all posts to a blog, you could use
+ `blog/posts/* and !*/Discussion`.
+* `comments_closed_pagespec`: [[ikiwiki/PageSpec]] of pages where
+ posting of new comments is closed, but any existing comments will still
+ be displayed. Often you will list a set of individual pages here.
+ For example: `blog/controversial or blog/flamewar`
+* `comments_pagename`: if this is e.g. `comment_` (the default), then
+ comment pages will be named something like `page/comment_12`
+* `comments_allowdirectives`: if true (default false), comments may
+ contain IkiWiki [[directives|ikiwiki/directive]]
+* `comments_commit`: if true (default true), comments will be committed to
+ the version control system
+* `comments_allowauthor`: if true (default false), anonymous commenters may
+ specify a name for themselves, and the \[[!meta author]] and
+ \[[!meta authorurl]] directives will not be overridden by the comments
+ plugin
+
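+Putting these together, a typical blog setup might use something like the
+following (a sketch; the pagespecs are illustrative):
+
+	comments_pagespec => "blog/posts/* and !*/Discussion",
+	comments_closed_pagespec => "blog/posts/2008/*",
+	comments_pagename => "comment_",
+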
+## comment moderation
+
+If you enable the [[blogspam]] plugin, comments that appear spammy will be
+held for moderation. (Or with the [[moderatedcomments]] plugin, all
+comments will be held.) Wiki admins can access the comment moderation queue
+via a button on their Preferences page.
+
+Comments pending moderation are not checked into revision control.
+To find unmoderated comments, run `find /your/ikiwiki/srcdir -name '*._comment_pending'`.
+To manually moderate a comment, just rename the file, removing the
+"_pending" from the end, and check it into revision control.
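+
+For example (a sketch; adjust the path and comment name to your wiki):
+
+	cd /your/ikiwiki/srcdir
+	mv page/comment_3._comment_pending page/comment_3._comment
+	git add page/comment_3._comment
+	git commit -m "approve comment"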
diff --git a/doc/plugins/comments/discussion.mdwn b/doc/plugins/comments/discussion.mdwn
new file mode 100644
index 000000000..2b8add938
--- /dev/null
+++ b/doc/plugins/comments/discussion.mdwn
@@ -0,0 +1,232 @@
+## Syndication autodiscovery for comment feeds
+
+A standard `\[[!inline]]` directive adds links to the autogenerated syndication feeds using link tags in the header:
+
+    <link rel="alternate" type="application/rss+xml" title="$title" href="$page.rss" />
+    <link rel="alternate" type="application/atom+xml" title="$title" href="$page.atom" />
+
+These links aren't added to my pages that include comments even though comments generate syndication feeds. How can I configure the comments plugin to add these links to the header? (These links are required for user-agent autodiscovery of syndication feeds.) --[[anderbubble]]
+
+## Moderating comments from the CLI
+
+How do you do this, without using the UI in the Preferences?
+
+Please put this info on the page. Many thanks --[[Kai Hendry]]
+
+## Why internal pages? (unresolved)
+
+Comments are saved as internal pages, so they can never be edited through the CGI,
+only by direct committers.
+
+> So, why do it this way, instead of using regular wiki pages in a
+> namespace, such as `$page/comments/*`? Then you could use [[plugins/lockedit]] to
+> limit editing of comments in more powerful ways. --[[Joey]]
+
+>> Er... I suppose so. I'd assumed that these pages ought to only exist as inlines
+>> rather than as individual pages (same reasoning as aggregated posts), though.
+>>
+>> lockedit is actually somewhat insufficient, since `check_canedit()`
+>> doesn't distinguish between creation and editing; I'd have to continue to use
+>> some sort of odd hack to allow creation but not editing.
+>>
+>> I also can't think of any circumstance where you'd want a user other than
+>> admins (~= git committers) and possibly the commenter (who we can't check for
+>> at the moment anyway, I don't think?) to be able to edit comments - I think
+>> user expectations for something that looks like ordinary blog comments are
+>> likely to include "others can't put words into my mouth".
+>>
+>> My other objection to using a namespace is that I'm not particularly happy about
+>> plugins consuming arbitrary pieces of the wiki namespace - /discussion is bad
+>> enough already. Indeed, this very page would accidentally get matched by rules
+>> aiming to control comment-posting... :-) --[[smcv]]
+
+>>> Thinking about it, perhaps one way to address this would be to have the suffix
+>>> (e.g. whether commenting on Sandbox creates sandbox/comment1 or sandbox/c1 or
+>>> what) be configurable by the wiki admin, in the same way that recentchanges has
+>>> recentchangespage => 'recentchanges'? I'd like to see fewer hard-coded page
+>>> names in general, really - it seems odd to me that shortcuts and smileys
+>>> hard-code the name of the page to look at. Perhaps I could add
+>>> discussionpage => 'discussion' too? --[[smcv]]
+
+>>> (I've now implemented this in my branch. --[[smcv]])
+
+>> The best reason to keep the pages internal seems to me to be that you
+>> don't want the overhead of every comment spawning its own wiki page. --[[Joey]]
+
+## Formats (resolved)
+
+The plugin now allows multiple comment formats while still using internal
+pages; each comment is saved as a page containing one `\[[!comment]]` directive,
+which has a superset of the functionality of [[ikiwiki/directive/format]].
+
+## Access control (unresolved?)
+
+By the way, I think that who can post comments should be controllable by
+the existing plugins opendiscussion, anonok, signinedit, and lockedit. Allowing
+posting comments w/o any login, while a nice capability, can lead to
+spam problems. So, use `check_canedit` as at least a first-level check?
+--[[Joey]]
+
+> This plugin already uses `check_canedit`, but that function doesn't have a concept
+> of different actions. The hack I use is that when a user comments on, say, sandbox,
+> I call `check_canedit` for the pseudo-page "sandbox[postcomment]". The
+> special `postcomment(glob)` [[ikiwiki/pagespec]] returns true if the page ends with
+> "[postcomment]" and the part before (e.g. sandbox) matches the glob. So, you can
+> have postcomment(blog/*) or something. (Perhaps instead of taking a glob, postcomment
+> should take a pagespec, so you can have postcomment(link(tags/commentable))?)
+>
+> This is why `anonok_pagespec => 'postcomment(*)'` and `locked_pages => '!postcomment(*)'`
+> are necessary to allow anonymous and logged-in editing (respectively).
+>
+>> I changed that to move the flag out of the page name, and into a variable that the `match_postcomment`
+>> function checks for. Other ugliness still applies. :-) --[[Joey]]
+>
+> This is ugly - one alternative would be to add `check_permission()` that takes a
+> page and a verb (create, edit, rename, remove and maybe comment are the ones I
+> can think of so far), use that, and port the plugins you mentioned to use that
+> API too. This plugin could either call `check_can("$page/comment1", 'create')` or
+> call `check_can($page, 'comment')`.
+>
+> One odd effect of the code structure I've used is that we check for the ability to
+> create the page before we actually know what page name we're going to use - when
+> posting the comment I just increment a number until I reach an unused one - so
+> either the code needs restructuring, or the permission check for 'create' would
+> always be for 'comment1' and never 'comment123'. --[[smcv]]
+
+>> Now resolved, in fact --[[smcv]]
+
+> Another possibility is to just check for permission to edit (e.g.) `sandbox/comment1`.
+> However, this makes the "comments can only be created, not edited" feature completely
+> reliant on the fact that internal pages can't be edited. Perhaps there should be a
+> `editable_pages` pagespec, defaulting to `'*'`? --[[smcv]]
+
+## comments directive vs global setting (resolved?)
+
+When comments have been enabled generally, you still need to mark which pages
+can have comments, by including the `\[[!comments]]` directive in them. By default,
+this directive expands to a "post a comment" link plus an `\[[!inline]]` with
+the comments. [This requirement has now been removed --[[smcv]]]
+
+> I don't like this, because it's hard to explain to someone why they have
+> to insert this into every post to their blog. Seems that the model used
+> for discussion pages could work -- if comments are enabled, automatically
+> add the comment posting form and comments to the end of each page.
+> --[[Joey]]
+
+>> I don't think I'd want comments on *every* page (particularly, not the
+>> front page). Perhaps a pagespec in the setup file, where the default is "*"?
+>> Then control freaks like me could use "link(tags/comments)" and tag pages
+>> as allowing comments.
+>>
+>>> Yes, I think a pagespec is the way to go. --[[Joey]]
+
+>>>> Implemented --[[smcv]]
+
+>>
+>> The model used for discussion pages does require patching the existing
+>> page template, which I was trying to avoid - I'm not convinced that having
+>> every possible feature hard-coded there really scales (and obviously it's
+>> rather annoying while this plugin is on a branch). --[[smcv]]
+
+>>> Using the template would allow customising the html around the comments
+>>> which seems like a good thing? --[[Joey]]
+
+>>>> The \[[!comments]] directive is already template-friendly - it expands to
+>>>> the contents of the template `comments_embed.tmpl`, possibly with the
+>>>> result of an \[[!inline]] appended. I should change `comments_embed.tmpl`
+>>>> so it uses a template variable `INLINE` for the inline result rather than
+>>>> having the perl code concatenate it, which would allow a bit more
+>>>> customization (whether the "post" link was before or after the inline).
+>>>> Even if you want comments in page.tmpl, keeping the separate comments_embed.tmpl
+>>>> and having a `COMMENTS` variable in page.tmpl might be the way forward,
+>>>> since the smaller each templates is, the easier it will be for users
+>>>> to maintain a patched set of templates. (I think so, anyway, based on what happens
+>>>> with dpkg prompts in Debian packages with monolithic vs split
+>>>> conffiles.) --[[smcv]]
+
+>>>>> I've switched my branch to use page.tmpl instead; see what you think? --[[smcv]]
+
+## Raw HTML (resolved?)
+
+Raw HTML was not initially allowed by default (this was configurable).
+
+> I'm not sure that raw html should be a problem, as long as the
+> htmlsanitizer and htmlbalanced plugins are enabled. I can see filtering
+> out directives, as a special case. --[[Joey]]
+
+>> Right, if I sanitize each post individually, with htmlscrubber and either htmltidy
+>> or htmlbalance turned on, then there should be no way the user can forge a comment;
+>> I was initially wary of allowing meta directives, but I think those are OK, as long
+>> as the comment template puts the \[[!meta author]] at the *end*. Disallowing
+>> directives is more a way to avoid commenters causing expensive processing than
+>> anything else, at this point.
+>>
+>> I've rebased the plugin on master, made it sanitize individual posts' content
+>> and removed the option to disallow raw HTML. Sanitizing individual posts before
+>> they've been htmlized required me to preserve whitespace in the htmlbalance
+>> plugin, so I did that. Alternatively, we could htmlize immediately and always
+>> save out raw HTML? --[[smcv]]
+
+>>> There might be some use cases for other directives, such as img, in
+>>> comments.
+>>>
+>>> I don't know if meta is "safe" (ie, guaranteed to be inexpensive and not
+>>> allow users to do annoying things) or if it will continue to be in the
+>>> future. Hard to predict really, all that can be said with certainty is
+>>> all directives will continue to be inexpensive and safe enough that it's
+>>> sensible to allow users to (ab)use them on open wikis.
+>>> --[[Joey]]
+
+----
+
+I have a test ikiwiki setup somewhere to investigate adopting the comments
+plugin. It is setup with no auth enabled and I got hammered with a spam attack
+over the last weekend (predictably). What surprised me was the scale of the
+attack: ikiwiki eventually triggered OOM and brought the box down. When I got
+it back up, I checked out a copy of the underlying git repository, and it
+measured 280M in size after being packed. Of that, about 300K was data prior
+to the spam attack, so the rest was entirely spam text, compressed via git's
+efficient delta compression.
+
+I had two thoughts about possible improvements to the comments plugin in the
+wake of this:
+
+ * comment pagination - there is a hard-to-define upper limit on the number
+ of comments that can be appended to a wiki page whilst the page remains
+ legible. It would be useful if comments could be paginated into sub-pages.
+
+ * crude flood control - aside from spam attacks (and I am aware of
+ [[plugins/blogspam]]), people can crap flood or just aggressively flame
+ repeatedly. An interesting prevention measure might be to not let an IP
+ post more than 3 sequential comments to a page, or to the site, without
+ at least one other comment being interleaved. I say 3 rather than 2 since
+ correction follow-ups are common.
+
+-- [[Jon]]
+
+
+---
+
+## Comment threads
+
+Any thoughts about implementing some simple threading in the comments?
+
+Or at least a reply functionality that quotes the subject/contents?
+
+-- [[iustin]]
+
+---
+
+## Disabling certain formats for comments
+
+It seems that the comments plugin allows using all enabled formats and
+there is no way to disable some of them. For my blog, I want to use
+additional formats for writing posts, but I do not want commenters to
+use those formats because it would be a security problem.
+
+Any suggestions or hints how to implement this?
+
+-- [[wentasah]]
+
+> I've implemented this. See [[todo/Restrict_formats_allowed_for_comments]].
+> --[[wentasah]]
diff --git a/doc/plugins/conditional.mdwn b/doc/plugins/conditional.mdwn
new file mode 100644
index 000000000..27a99bb7c
--- /dev/null
+++ b/doc/plugins/conditional.mdwn
@@ -0,0 +1,5 @@
+[[!template id=plugin name=conditional core=1 author="[[Joey]]"]]
+[[!tag type/special-purpose]]
+
+This plugin provides the [[ikiwiki/directive/if]] [[ikiwiki/directive]].
+With this directive, you can make text be conditionally displayed on a page.
diff --git a/doc/plugins/conditional/discussion.mdwn b/doc/plugins/conditional/discussion.mdwn
new file mode 100644
index 000000000..6e84fdfc1
--- /dev/null
+++ b/doc/plugins/conditional/discussion.mdwn
@@ -0,0 +1,76 @@
+## Conditional broken?
+
+Using \[\[!if test="tagged(plugin)" then="= Tagged as plugin =" else="*No plugins found*"]] on this wiki *should* present the 'Tagged as plugin' heading; instead it emits 'no plugins found'. Is the conditional plugin currently broken for tags, or am I misusing it? Thanks.
+
+-- Thiana
+
+> This wiki has no page named "plugin", so nothing links to it; tags are a species of link
+> so tagging a large number of pages with a tag that doesn't exist (which change has
+> been reverted) doesn't make the pagespec match. It would if the tag's page existed. --[[Joey]]
+
+>> So if I understand this correctly... Assuming the tags Tag_A and Tag_B, the existence of
+>> @wiki-home@/tags/Tag_A.creole, and a number of files with a \[\[!tag Tag_A Tag_B]] the
+>> following is correct?
+>>
+>> * \[\[!if test="tagged(Tag_A)" then="OK" else="Fail"]] => OK
+>> * \[\[!if test="tagged(Tag_B)" then="OK" else="Fail"]] => Fail
+>> * \[\[!if test="tagged(Tag_A) and tagged(Tag_B)" then="OK" else="Fail"]] => Fail
+>>
+>> Is that the expected behaviour? If so, that's not what I'm seeing here since they all result
+>> in a Fail. If not, what exactly is wrong with those conditionals? Thanks.
+>>
+>> -- Thiana
+
+----
+
+Would there be a way for this plugin to emit fewer blank lines (i.e. *none at all*)?
+
+For example, having a look at [this page](http://www.bddebian.com/~wiki/Hurd/)'s sidebar.
+This [sidebar](http://www.bddebian.com/~wiki/sidebar/)
+([source code](http://www.bddebian.com/gitweb/?p=wiki;a=blob_plain;f=sidebar.mdwn))
+is supposed to have *no* blank lines between...
+
+* **Hurd** and *About*,
+* *Todo* and **Mach**,
+* **Mach** and **Mig**.
+
+--[[tschwinge]]
+
+> The blank lines in this example are coming from the newline after `then="`, and also from the newline before the close quote. If you remove those newlines, I think it should work. --[[Joey]]
+
+>> No, that's unfortunately not it, see here:
+>> [[!if test="enabled(trallala)" then="foot"]]
+>> Continued. But on the other
+>> [[!if test="enabled(trallala)" then="foot" else="hand:"]]
+>> Continued. --[[tschwinge]]
+
+>>> Seems ok, no? The only linebreaks I see in the source are the ones you
+>>> put at the end of the lines. --[[Joey]]
+
+>>>> Okay, that would explain the linebreak between 1 and 3. But then, why are all linebreaks removed between 3 and 5?
+
+>>>> 1 No, that's unfortunately not it, see here:
+>>>> [[!if test="enabled(trallala)" then="foot"]]
+>>>> 3 Continued. But on the other
+>>>> [[!if test="enabled(trallala)" then="foot" else="hand:"]]
+>>>> 5 Continued. --[[tschwinge]]
+
+>>>>> The conditional after 1 evaluates to "", so there's a blank line
+>>>>> there. The one after 3 evaluates to "hand:", so no blank line there.
+>>>>> --[[Joey]]
+
+I have a sidebar that contains
+<pre>
+ #### Archives
+
+ \[[!calendar type="year" months_per_row="6" pages="blog/* and !*/Discussion"]]
+ \[[!calendar type="month" pages="blog/* and !*/Discussion"]]
+ &lt;h4&gt;Indices&lt;/h4&gt;
+ \[[!map pages="archives/* and !*/Discussion"]]
+</pre>
+I am trying to make it so that the archives and index only show up if the destpage is either blog/* or / -- the top of the wiki. Unfortunately, I don't think I am getting the
+conditional right -- I have a "]] left over at the end (looking at the rendered html). Ideally, I would like to be able to do today's calendar on the top level page and
+the annual calendar on archives/200[4567].mdwn, and monthly calendars for the proper month on archives/200[4567]/[0..12].mdwn. Do I have to create separate sidebars?
+I do not use the usedir directive, so all my annual archive pages live in archives/, and all my monthly archive pages live in, say, archives/2007/ --ManojSrivastava
+
+> Are you using triple quoting for the text in the conditional? --[[Joey]]
diff --git a/doc/plugins/contrib.mdwn b/doc/plugins/contrib.mdwn
new file mode 100644
index 000000000..8ae457ee2
--- /dev/null
+++ b/doc/plugins/contrib.mdwn
@@ -0,0 +1,7 @@
+These plugins are provided by third parties and are not currently
+included in ikiwiki. See [[install]] for installation help.
+
+[[!map pages="plugins/contrib/* and !plugins/contrib/*/* and !*/Discussion"]]
+
+[[!inline pages="plugins/contrib/*" rootpage="plugins/contrib" show=-1
+postformtext="Add a new plugin:" feeds=no]]
diff --git a/doc/plugins/contrib/album.mdwn b/doc/plugins/contrib/album.mdwn
new file mode 100644
index 000000000..745a44e8b
--- /dev/null
+++ b/doc/plugins/contrib/album.mdwn
@@ -0,0 +1,140 @@
+[[!template id=plugin name=album author="[[Simon_McVittie|smcv]]"]]
+[[!tag type/chrome]]
+
+This plugin provides the [[ikiwiki/directive/album]] [[ikiwiki/directive]],
+which turns a page into a photo album or image gallery, containing all
+images attached to the album or its subpages. It also provides the
+[[ikiwiki/directive/albumsection]] and [[ikiwiki/directive/albumimage]]
+directives.
+
+This plugin automatically enables the [[filecheck]], [[img]], [[inline]],
+[[trail]] and [[transient]] plugins. The [[meta]] plugin is also
+recommended.
+
+## Changing the templates
+
+When a viewer page is generated or inlined into an album, the template can
+contain these extra variables:
+
+* `<TMPL_VAR ALBUM>` - page name of the album
+* `<TMPL_VAR ALBUMURL>` - relative URL to the album
+* `<TMPL_VAR ALBUMTITLE>` - title of the album, usually taken from
+ a [[ikiwiki/directive/meta]] directive
+* `<TMPL_VAR CAPTION>` - caption for the image
+* `<TMPL_VAR THUMBNAIL>` - a small [[ikiwiki/directive/img]] for the image
+* `<TMPL_VAR IMAGEWIDTH>` - width of the full-size image in pixels
+* `<TMPL_VAR IMAGEHEIGHT>` - height of the full-size image in pixels
+* `<TMPL_VAR IMAGEFILESIZE>` - size of the image, e.g. `1.2 MiB`
+* `<TMPL_VAR IMAGEFORMAT>` - format of the image, typically `JPEG`
+
+The template for the viewer page can also contain:
+
+* `<TMPL_VAR IMG>` - a large [[ikiwiki/directive/img]] to display the image
+* `<TMPL_VAR PREV>` - a link to the previous viewer, typically with a
+ thumbnail
+* `<TMPL_VAR NEXT>` - a link to the next viewer, typically with a
+ thumbnail
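+
+As an illustration (not the shipped template), a customized `albumviewer.tmpl`
+might combine these variables along these lines:
+
+    <a href="<TMPL_VAR ALBUMURL>"><TMPL_VAR ALBUMTITLE></a>
+    <TMPL_VAR PREV> <TMPL_VAR IMG> <TMPL_VAR NEXT>
+    <p><TMPL_VAR CAPTION></p>
+    <p><TMPL_VAR IMAGEWIDTH>x<TMPL_VAR IMAGEHEIGHT>,
+    <TMPL_VAR IMAGEFILESIZE> <TMPL_VAR IMAGEFORMAT></p>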
+
+## Including album entries elsewhere
+
+To display images from elsewhere in the wiki with the same appearance as
+an [[ikiwiki/directive/album]] or [[ikiwiki/directive/albumsection]],
+you can use an [[ikiwiki/directive/inline]] with the `albumitem`
+template:
+
+ \[[!inline pages="..." sort="-age" template="albumitem"]]
+
+----
+
+[[!template id=gitbranch branch=smcv/album4 author="[[Simon_McVittie|smcv]]"]]
+
+Available from [[smcv]]'s git repository, in the `album4` branch.
+I've called it `album` to distinguish it from
+[[contrib/gallery|plugins/contrib/gallery]], although `gallery` might well be
+a better name for this functionality.
+
+(The Summer of Code [[plugins/contrib/gallery]] plugin does the
+next/previous UI in Javascript using Lightbox, which means that
+individual photos can't be bookmarked in a meaningful way, and
+the best it can do as a fallback for non-Javascript browsers
+is to provide a direct link to the image.)
+
+Updated, April 2012: rebased onto the version of [[trail]] that got merged.
+
+## Manual installation
+
+First, you need a version of ikiwiki with the [[trail]] plugin merged in
+(version 3.20120203 or later).
+
+Manual installation requires these files (use the "raw" link in gitweb
+to download):
+
+* [album.pm](http://git.pseudorandom.co.uk/smcv/ikiwiki.git/blob/album4:/IkiWiki/Plugin/album.pm)
+ in an `IkiWiki/Plugin` subdirectory of your configured `plugindir`
+* [albumviewer.tmpl](http://git.pseudorandom.co.uk/smcv/ikiwiki.git/blob/album4:/templates/albumviewer.tmpl),
+ [albumitem.tmpl](http://git.pseudorandom.co.uk/smcv/ikiwiki.git/blob/album4:/templates/albumitem.tmpl),
+ [albumnext.tmpl](http://git.pseudorandom.co.uk/smcv/ikiwiki.git/blob/album4:/templates/albumnext.tmpl) and
+ [albumprev.tmpl](http://git.pseudorandom.co.uk/smcv/ikiwiki.git/blob/album4:/templates/albumprev.tmpl),
+ in your configured `templatedir`, or a `templates` subdirectory of your wiki repository
+* the album-related bits from the end of the
+ [stylesheet](http://git.pseudorandom.co.uk/smcv/ikiwiki.git/blob/album4:/doc/style.css)
+ (put them in your local.css)
+
+## Demo
+
+* [HTML page of thumbnails](http://ikialbum.hosted.pseudorandom.co.uk/album/)
+ as an entry point to the album
+* Each thumbnail links to
+ [a "viewer" HTML page](http://ikialbum.hosted.pseudorandom.co.uk/album/img_0120/)
+ with a full size image, optional next/previous thumbnail links, and
+ optional [[plugins/comments]]
+
+## Bugs
+
+* There's currently a hard-coded list of extensions that are treated as
+ images: `png`, `gif`, `jpg`, `jpeg` or `mov` files. More image and video
+ types could be added in future.
+
+* Videos aren't currently handled very well; ideally, something like
+ totem-video-thumbnailer would be used.
+
+* The plugin doesn't do anything special to handle albums that are subpages
+ of each other. If, say, `debconf` and `debconf/monday` are both albums,
+ then `debconf/monday/p100.jpg` will currently be assigned to one or the
+ other, arbitrarily. It should probably pick the closest (longest) album name.
+ (I'm not sure that it can do this reliably, though, since the scan stage
+ runs in an undefined order.)
+
+* The plugin doesn't do anything special to handle photos with similar names.
+ If you have `p100.jpg` and `p100.png`, one will get a viewer page called
+ `p100` and the other will be ignored. (I'm not sure what we could do better,
+ though.)
+
+* If there's no `albumimage` in a viewer page, one should probably be appended
+ automatically.
+
+## TODO
+
+* The generated viewer page should extract as much metadata as possible from
+ the photo's EXIF tags (creation/modification dates, author, title, caption,
+ copyright). [[smcv]] has a half-written implementation which runs
+ `scanimage` hooks, and has an `exiftool` plugin using [[!cpan Image::ExifTool]]
+ as a reference implementation of that hook.
+
+* There should be an option to reduce the size of photos and write them into
+ an underlay (perhaps just the transient underlay), for this workflow:
+
+ * your laptop's local ikiwiki has two underlays, `photos` and `webphotos`
+ * `photos` contains full resolution photos with EXIF tags
+ * for each photo that exists in `photos` but not in `webphotos`, the album
+ plugin automatically resamples it down to a web-compatible resolution
+ ([[smcv]] uses up to 640x640), optimizes it with `jpegoptim`, strips out
+  all EXIF tags, and writes it into the corresponding location
+ in `webphotos`
+ * `webphotos` is what you rsync to the web server
+ * the web server's ikiwiki only has `webphotos` as an underlay
+
+* Eventually, there could be a specialized CGI user interface to batch-edit
+ all the photos of an album (so for each photo, you get an edit box each for
+ title, author, copyright etc.) - this would work by making programmatic
+ edits to all the `albumimage` directives.
diff --git a/doc/plugins/contrib/album/discussion.mdwn b/doc/plugins/contrib/album/discussion.mdwn
new file mode 100644
index 000000000..de1180d10
--- /dev/null
+++ b/doc/plugins/contrib/album/discussion.mdwn
@@ -0,0 +1,458 @@
+thanks for this plugin. it might help me in my application, which is to provide albums/galleries that can be edited (i.e. new images added, taken away, etc.) through the web interface.
+
+> That's my goal eventually, too. Perhaps you can help to
+> design/write this plugin? At the moment I'm mostly
+> waiting for a design "sanity check" from [[Joey]],
+> but any feedback you can provide on the design would
+> also be helpful. --[[smcv]]
+
+i have two challenges: firstly, for installation, i'm not sure what all the files are that need to be downloaded (because of my setup i can't easily pull the repo). so far i have IkiWiki/Plugin/album.pm, ikiwiki-album, and 4 files in templates/ -- any others?
+
+> Those are all the added files; ikiwiki-album isn't strictly
+> needed (IkiWiki itself doesn't use that code, but you can
+> use it to turn a directory full of images into correct
+> input for the album plugin).
+>
+> You probably also want the album plugin's expanded version of
+> style.css (or put its extra rules in your local.css).
+> Without that, your albums will be quite ugly.
+>
+> There aren't currently any other files modified by my branch.
+> --[[smcv]]
+
+secondly: barring the CGI interface for editing the album, which would be great, is there at least a way to use attachment plugin or any other to manually add images and then create viewers for them?
+
+> Images are just attachments, and viewers are pages (any supported
+> format, but .html will be fastest to render). Attach each image,
+> then write a page for each image containing the
+> \[[!albumimage]] directive (usually it will *only* contain that
+> directive).
+>
+> The script ikiwiki-album can help you to do this in a git/svn/etc.
+> tree; doing it over the web will be a lot of work (until I get
+> the CGI interface written), but it should already be possible!
+>
+> The structure is something like this:
+>
+> * album.mdwn (contains the \[[!album]] directive, and perhaps also
+> some \[[!albumsection]] directives)
+> * album/a.jpg
+> * album/a.html (contains the \[[!albumimage]] directive for a.jpg)
+> * album/b.jpg
+> * album/b.html (contains the \[[!albumimage]] directive for b.jpg)
+>
+> Have a look at ikiwiki-album to see how the directives are meant to
+> work in practice.
+>
+> --[[smcv]]
+
+>> In the current version of the branch, the viewer pages are
+>> generated automatically if you didn't generate them yourself,
+>> so `ikiwiki-album` is no longer needed. --[[smcv]]
+
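+To make the structure above concrete: a minimal `album.mdwn` could contain just
+
+    \[[!album]]
+
+and each viewer page, e.g. `album/a.html` for `album/a.jpg`, just
+
+    \[[!meta title="some caption"]]
+    \[[!albumimage]]
+
+(the `meta` title here is a hypothetical example).
+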
+i'm new to ikiwiki, apologies if this is dealt with elsewhere. -brush
+
+> This plugin is pretty ambitious, and is unfinished, so I'd recommend
+> playing with a normal IkiWiki installation for a bit, then trying
+> out this plugin when you've mastered the basics of IkiWiki. --[[smcv]]
+
+----
+
+You had wanted my feedback on the design of this. I have not looked at the
+code or tried it yet, but here goes. --[[Joey]]
+
+* Needing to create the albumimage "viewer" pages for each photo
+ seems like it will become a pain. Everyone will need to come up
+ with their own automation for it, and then there's the question
+ of how to automate it when uploading attachments. -J
+
+> There's already a script (ikiwiki-album) to populate a git
+> checkout with skeleton "viewer" pages; I was planning to make a
+> specialized CGI interface for albums after getting feedback from
+> you (since the requirements for that CGI interface change depending
+> on the implementation). I agree that this is ugly, though. -s
+
+>> Would you accept a version where the albumimage "viewer" pages
+>> could be 0 bytes long, at least until metadata gets added?
+>>
+>> The more I think about the "binaries as first-class pages" approach,
+>> the more subtle interactions I notice with other plugins. I
+>> think I'm up to needing changes to editpage, comments, attachment
+>> and recentchanges, plus adjustments to img and Render (to reduce
+>> duplication when thumbnailing an image with a strange extension
+>> while simultaneously changing the extension, and to hardlink/copy
+>> an image with a strange extension to a differing target filename
+>> with the normal extension, respectively). -s
+
+>>> Now that we have `add_autofile` I can just create viewer pages
+>>> whenever there's an image to view. The current version of the
+>>> branch does that. -s
+
+* With each viewer page having next/prev links, I can see how you
+ were having the scalability issues with ikiwiki's data structures
+ earlier! -J
+
+> Yeah, I think they're a basic requirement from a UI point of view
+> though (although they don't necessarily have to be full wikilinks).
+> -s
+
+>> I think that with the new dependency types system, the dependencies for
+>> these can be presence dependencies, which will probably help with
+>> avoiding rebuilds of a page if the next/prev page is changed.
+>> (Unless you use img to make the thumbnails for those links, then it
+>> would rebuild the thumbnails anyway. Have not looked at the code.) --[[Joey]]
+
+>>> I do use img. -s
+
+* And doesn't each viewer page really depend on every other page in the
+ same albumsection? If a new page is added, the next/prev links
+ may need to be updated, for example. If so, there will be much
+ unnecessary rebuilding. -J
+
+> albumsections are just a way to insert headings into the flow of
+> photos, so they don't actually affect dependencies.
+>
+> One non-obvious constraint of ikiwiki's current design is that
+> everything "off-page" necessary to build any page has to happen
+> at scan time, which has caused a few strange design decisions,
+> like the fact that each viewer controls what album it's in.
+>
+> It's difficult for the contents of the album to just be a
+> pagespec, like for inline, because pagespecs can depend on
+> metadata, which is gathered in arbitrary order at scan time;
+> so the earliest you can safely apply a pagespec to the wiki
+> contents to get a concrete list of pages is at rebuild time.
+>
+> (This stalled my attempt at a trail plugin, too.) -s
+
+>> Not sure I understand why these need to look at pagespecs at scan time?
+>> Also, note that it is fairly doable to detect if a pagespec uses such
+>> metadata. Er, I mean, I have a cheezy hack in `add_depends` now that does
+>> it to deal with a similar case. --[[Joey]]
+
+>>> I think I was misunderstanding how early you have to call `add_depends`?
+>>> The critical thing I missed was that if you're scanning a page, you're
+>>> going to rebuild it in a moment anyway, so it doesn't matter if you
+>>> have no idea what it depends on until the rebuild phase. -s
+
+* One thing I do like about having individual pages per image is
+ that they can each have their own comments, etc. -J
+
+> Yes; also, they can be wikilinked. I consider those to be
+> UI requirements. -s
+
+* Seems possibly backwards that the albumimage controls what album
+ an image appears in. Two use cases -- 1: I may want to make a locked
+ album, but then anyone who can write to any other page on the wiki can
+ add an image to it. 2: I may want an image to appear in more than one
+ album. Think tags. So it seems it would be better to have the album
+ directive control what pages it includes (a la inline). -J
+
+> I'm inclined to fix this by constraining images to be subpages of exactly
+> one album: if they're subpages of 2+ nested albums then they're only
+> considered to be in the deepest-nested one (i.e. longest URL), and if
+> they're not in any album then that's a usage error. This would
+> also make prev/next links sane. -s
+
+>> The current version constrains images to be in at most one album,
+>> choosing one arbitrarily (dependent on scan order) if albums are
+>> nested. -s
+
+> If you want to reference images from elsewhere in the wiki and display
+> them as if in an album, then you can use an ordinary inline with
+> the same template that the album would use, and I'll make sure the
+> templates are set up so this works. -s
+
+>> Still needs documenting, I've put it on the TODO list on the main
+>> page. -s
+
+> (Implementation detail: this means that an image X/Y/Z/W/V, where X and
+> Y are albums, Z does not exist and W exists but is not an album,
+> would have a content dependency on Y, a presence dependency on Z
+> and a content dependency on W.)
+>
+> Perhaps I should just restrict to having the album images be direct
+> subpages of the album, although that would mean breaking some URLs
+> on the existing website I'm doing all this work for... -s
+
+>> The current version of the branch doesn't have this restriction;
+>> perhaps it's a worthwhile simplification, or perhaps it's too
+>> restrictive? I fairly often use directory hierarchies like
+>> `a_festival/saturday/foo.jpg` within an album, which makes
+>> it very easy to write `albumsection` filters. -s
+
+* Putting a few of the above thoughts together, my ideal album system
+ seems to be one where I can just drop the images into a directory and
+ have them appear in the album index, as well as each generate their own wiki
+ page. Plus some way I can, later, edit metadata for captions,
+ etc. (Real pity we can't just put arbitrary metadata into the images
+ themselves.) This is almost pointing toward making the images first-class
+ wiki page sources. Hey, it worked for po! :) But the metadata and editing
+ problems probably don't really allow that. -J
+
+> Putting a JPEG in the web form is not an option from my point of
+> view :-) but perhaps there could just be a "web-editable" flag supplied
+> by plugins, and things could be changed to respect it.
+
+>> Replying to myself: would you accept patches to support
+>> `hook(type => 'htmlize', editable => 0, ...)` in editpage? This would
+>> essentially mean "this is an opaque binary: you can delete it
+>> or rename it, and it might have its own special editing UI, but you
+>> can never get it in a web form".
+>>
+>> On the other hand, that essentially means we need to reimplement
+>> editpage in order to edit the sidecar files that contain the metadata.
+>> Having already done one partial reimplementation of editpage (for
+>> comments) I'm in no hurry to do another.
+>>
+>> I suppose another possibility would be to register hook
+>> functions to be called by editpage when it loads and saves the
+>> file. In this case, the loading hook would be to discard
+>> the binary and use filter() instead, and the saving conversion
+>> would be to write the edited content into the metadata sidecar
+>> (creating it if necessary).
+>>
+>> I'd also need to make editpage (and also comments!) not allow the
+>> creation of a file of type albumjpg, albumgif etc., which is something
+>> I previously missed; and I'd need to make attachment able to
+>> upload-and-rename.
+>> -s
+
+>>> I believe the current branch meets your requirements, by having
+>>> first-class wiki pages spring into existence using `add_autofile`
+>>> to be viewer pages for photos. -s
+
+> In a way, what you really want for metadata is to have it in the album
+> page, so you can batch-edit the whole lot by editing one file (this
+> does mean that editing the album necessarily causes each of its viewers
+> to be rebuilt, but in practice that happens anyway). -s
+
+>> Replying to myself: in practice that *doesn't* happen anyway. Having
+>> the metadata in the album page is somewhat harmful because it means
+>> that changing the title of one image causes every viewer in the album
+>> to be rebuilt, whereas if you have a metadata file per image, only
+>> the album itself, plus the next and previous viewers, need
+>> rebuilding. So, I think a file per image is the way to go.
+>>
+>> Ideally we'd have some way to "batch-edit" the metadata of all
+>> images in an album at once, except that would make conflict
+>> resolution much more complicated to deal with; maybe just
+>> give up and scream about mid-air collisions in that case?
+>> (That's apparently good enough for Bugzilla, but not really
+>> for ikiwiki). -s
+
+>>> This is now in the main page's TODO list; if/when I implement this,
+>>> I intend to make it a specialized CGI interface. -s
+
+>> Yes, [all metadata in one file] would make some sense.. It also allows putting one image in
+>> two albums, with different caption etc. (Maybe for different audiences.)
+>> --[[Joey]]
+
+>>> Eek. No, that's not what I had in mind at all; the metadata ends up
+>>> in the "viewer" page, so it's necessarily the same for all albums. -s
+
+>> It would probably be possible to add a new dependency type, and thus
+>> make ikiwiki smart about noticing whether the metadata has actually
+>> changed, and only update those viewers where it has. But the dependency
+>> type stuff is still very new, and not plugin friendly .. so only just
+>> possible. --[[Joey]]
+
+----
+
+'''I think the "special extension" design is a dead-end, but here's what
+happened when I tried to work out how it would work. --[[smcv]]'''
+
+Suppose that each viewer is a JPEG-or-GIF-or-something, with extension
+".albumimage". We have a gallery "memes" with three images, badger,
+mushroom and snake.
+
+> An alternative might be to use ".album.jpg", and ".album.gif"
+> etc as the htmlize extensions. May need some fixes to ikiwiki to support
+> that. --[[Joey]]
+
+>> foo.albumjpg (etc.) for images, and foo._albummeta (with
+>> `keepextension => 1`) for sidecar metadata files, seems viable. -s
+
+Files in git repo:
+
+* index.mdwn
+* memes.mdwn
+* memes/badger.albumjpg (a renamed JPEG)
+* memes/badger/comment_1._comment
+* memes/badger/comment_2._comment
+* memes/mushroom.albumgif (a renamed GIF)
+* memes/mushroom._albummeta (sidecar file with metadata)
+* memes/snake.albummov (a renamed video)
+
+Files in web content:
+
+* index.html
+* memes/index.html
+* memes/96x96-badger.jpg (from img)
+* memes/96x96-mushroom.gif (from img)
+* memes/96x96-snake.jpg (from img, hacked up to use totem-video-thumbnailer :-) )
+* memes/badger/index.html (including comments)
+* memes/badger.jpg
+* memes/mushroom/index.html
+* memes/mushroom.gif
+* memes/snake/index.html
+* memes/snake.mov
+
+ispage("memes/badger") (etc.) must be true, to make the above rendering
+happen, so albumimage needs to be a "page" extension.
+
+To not confuse other plugins, album should probably have a filter() hook
+that turns .albumimage files into HTML? That'd probably be a reasonable
+way to get them rendered anyway.
+
+> I guess that is needed to avoid preprocess, scan, etc trying to process
+> the image, as well as eg, smiley trying to munge it in sanitize.
+> --[[Joey]]
+
+>> As long as nothing has a filter() hook that assumes it's already
+>> text... filters are run in arbitrary order. We seem to be OK so far
+>> though.
+>>
+>> If this is the route I take, I propose to have the result of filter()
+>> be the contents of the sidecar metadata file (empty string if none),
+>> with the `\[[!albumimage]]` directive (which no longer requires
+>> arguments) prepended if not already present. This would mean that
+>> meta directives in the metadata file would work as normal, and it
+>> would be possible to insert text both before and after the viewer
+>> if desired. The result of filter() would also be a sensible starting
+>> point for editing, and the result of editing could be diverted into
+>> the metadata file. -s
+
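+A rough Perl sketch of that filter() hook (`%pagesources`, `srcfile` and
+`readfile` are real IkiWiki APIs; the extensions and sidecar naming follow
+the proposal above and are a design sketch, not implemented code):
+
+    sub filter (@) {
+        my %params = @_;
+        my $file = $pagesources{$params{page}};
+        # pass ordinary pages through untouched
+        return $params{content}
+            unless defined $file && $file =~ /\.album(?:jpg|gif|mov)$/;
+        # discard the binary; use the sidecar metadata instead (empty if none)
+        my $meta = srcfile($params{page}."._albummeta", 1);
+        my $content = defined $meta ? readfile($meta) : "";
+        # prepend the viewer directive if the metadata didn't already have one
+        $content = "\[\[!albumimage]]\n".$content
+            unless $content =~ /\[\[!albumimage/;
+        return $content;
+    }
+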
+do=edit&page=memes/badger needs to not put the JPG in a text box: somehow
+divert or override the normal edit CGI by telling it that .albumimage
+files are not editable in the usual way?
+
+> Something I missed here is that editpage also needs to be told that
+> creating new files of type albumjpg, albumgif etc. is not allowed
+> either! -s
+
+Every image needs to depend on, and link to, the next and previous images,
+which is a bit tricky. In previous thinking about this I'd been applying
+the overly strict constraint that the ordered sequence of pages in each
+album must be known at scan time. However, that's not *necessarily* needed:
+the album and each photo could collect an unordered superset of dependencies
+at scan time, and at rebuild time that could be refined to be the exact set,
+in order.
+
+> Why do you need to collect this info at scan time? You can determine it
+> at build time via `pagespec_match_list`, surely .. maybe with some
+> memoization to avoid each image in an album building the same list.
+> I sense that I may be missing a subtlety though. --[[Joey]]
+
+>> I think I was misunderstanding how early you have to call `add_depends`
+>> as mentioned above. -s
+
+Perhaps restricting to "the images in an album A must match A/*"
+would be useful; then the unordered superset could just be "A/*". Your
+"albums via tags" idea would be nice too though, particularly for feature
+parity with e.g. Facebook: "photos of Joey" -> "tags/joey and albumimage()"
+maybe?
+
+If images are allowed to be considered to be part of more than one album,
+then a pretty and usable UI becomes harder - "next/previous" expands into
+"next photo in holidays/2009/germany / next photo in tagged/smcv / ..."
+and it could get quite hard to navigate. Perhaps next/previous links could
+be displayed only for the closest ancestor (in URL space) that is an
+album, or something?
+
+> Ugh, yeah, that is a problem. Perhaps wanting to support that was just
+> too ambitious. --[[Joey]]
+
+>> I propose to restrict to having images be subpages of albums, as
+>> described above. -s
+
+Requiring renaming is awkward for non-technical Windows/Mac users, with both
+platforms' defaults being to hide extensions; however, this could be
+circumvented by adding some sort of hook in attachment to turn things into
+a .albumimage at upload time, and declaring that using git/svn/... without
+extensions visible is a "don't do that then" situation :-)
+
+> Or extend `pagetype` so it can do the necessary matching without
+> renaming. Maybe by allowing a subdirectory to be specified along
+> with an extension. (Or allow specifying a full pagespec,
+> but I hesitate to seriously suggest that.) --[[Joey]]
+
+>> I think that might be a terrifying idea for another day. If we can
+>> mutate the extension during the `attach` upload, that'd be enough;
+>> I don't think people who are skilled enough to use git/svn/...,
+>> but not skilled enough to tell Explorer to show file extensions,
+>> represent a major use case. -s
+
+Ideally attachment could also be configured to upload into a specified
+underlay, so that photos don't have to be in your source-code control
+(you might want that, but I don't!).
+
+> Replying to myself: perhaps best done as an orthogonal extension
+> to attach? -s
+
+> Yet another non-obvious thing this design would need to do is to find
+> some way to have each change to memes/badger._albummeta show up as a
+> change to memes/badger in `recentchanges`. -s
+
+Things that would be nice, and are probably possible:
+
+* make the "Edit page" link on viewers divert to album-specific CGI instead
+ of just failing or not appearing (probably possible via pagetemplate)
+
+* some way to deep-link to memes/badger.jpg with a wikilink, without knowing a
+ priori that it's secretly a JPEG (probably harder than it looks - you'd
+ have to make a directive for it and it's probably not worth it)
+
+----
+
+Hi smcv, great plugin. I am an ikiwiki newbie but so far I've had success using your plugin.
+I've integrated the jquery masonry plugin into the albumitem template and it works great.
+But is there a way to create thumbnails of different sizes? I've passed the thumbnailsize option
+and value to the album directive, and while it does create the new thumbnail sizes it doesn't use them;
+the 96x96 thumbnails still appear on the page no matter what I do. - jaime
+
+----
+
+Hi, the plugin looks great, but I am probably too dumb to use it ;( here is what I did:
+created a page gal.mdwn with just the \[\[!album\]\] directive (no arguments) and a subdirectory gal/ with images in the form img_1234.jpg
+
+when I run ikiwiki, I get something completely wrong though:
+
+The generated gal/index.html page contains the following code repeated for every image:
+
+ <div class="album-viewer">
+ <div id="album-img">
+ <div class="album-finish">
+ <a href="./"><span class="album-arrow">↑</span></a>
+ </div>
+ </div>
+ </div>
+
+So no links to any images, etc.
+
+The pages for individual images are generated too, but are also not correct. The trails section is perfect, but the main part is wrong:
+
+ <div class="album-prev">
+  <a><span class="album-arrow">&#8592;</span></a><br />
+ <div class="album-thumbnail">
+ <span class="selflink">
+ <img src="./96x96-img_2913.jpg" width="96" height="72" alt="img 2913" title="img 2913" class="img" /></span>
+ </div>
+ </div>
+
+This really seems like it should be in the album page and not the individual page. It is only the thumbnail, not the full image. Also, the full image is not in the generated html tree at all!
+
+I am using ikiwiki 3.20130518, and got the album sources from the links on [this page](http://ikiwiki.info/plugins/contrib/album/) (manual installation section).
+
+Any hint about what I am doing wrong?
+
+Thanks Lukas
+
+> This plugin is not really finished. I probably need to update it for
+> current ikiwiki. I'll try to update it (and also update my demo
+> and installation instructions) at some point. --[[smcv]]
+
+>> I have to apologize: I accidentally copied the template wrongly and that caused all the issues ;(
+>> So now, after two days of debugging and tracing, I just fixed that and it works. Well, at least I learnt
+>> a lot about ikiwiki internals ;-)
+>> Thanks for all the work you did on the plugin! --Lukas
diff --git a/doc/plugins/contrib/asymptote.mdwn b/doc/plugins/contrib/asymptote.mdwn
new file mode 100644
index 000000000..a85c60efc
--- /dev/null
+++ b/doc/plugins/contrib/asymptote.mdwn
@@ -0,0 +1,141 @@
+[[!template id=plugin name=asymptote author="[Peter Simons](http://cryp.to/)"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/asymptote]]
+[[ikiwiki/directive]] which allows embedding
+[asymptote](http://asymptote.sourceforge.net/) diagrams in a page.
+
+Security implications: asymptote has functions for reading files and
+other dangerous stuff, so enabling this plugin means that everyone who
+can edit your Wiki can also read any file from your hard drive that's
+accessible to the user running Ikiwiki.
+
+[[!if test="enabled(asymptote)" then="""
+An example diagram:
+
+[[!asymptote src="""
+import geometry;
+unitsize(1cm);
+triangle t = triangle((0,0), (4,0), (0.5,2));
+show(La="$D$", Lb="$E$", Lc="", t);
+dot(t.A^^t.B^^t.C);
+point pD = midpoint(t.BC); dot(pD);
+point pE = midpoint(t.AC); dot(pE);
+draw(pD--pE);
+
+point A_ = (pD-t.A)*2+t.A; dot("$A'$", A_, NE);
+draw(t.B--A_--t.C, dashed);
+draw(t.A--A_, dashed);
+
+point E_ = midpoint(line(t.B,A_)); dot(Label("$E'$", E_, E));
+draw(E_--pD, dashed);
+"""]]
+"""]]
+
+This plugin uses the [[!cpan Digest::MD5]] perl module.
+
+The full source code is:
+
+ #! /usr/bin/perl
+
+ package IkiWiki::Plugin::asymptote;
+ use warnings;
+ use strict;
+ use Digest::MD5 qw(md5_hex);
+ use File::Temp qw(tempdir);
+ use HTML::Entities;
+ use Encode;
+ use IkiWiki 3.00;
+
+ sub import {
+ hook(type => "getsetup", id => "asymptote", call => \&getsetup);
+ hook(type => "preprocess", id => "asymptote", call => \&preprocess);
+ }
+
+ sub getsetup () {
+ return
+ plugin => {
+ safe => 1,
+ rebuild => undef,
+ section => "widget",
+ },
+ }
+
+ sub preprocess (@) {
+ my %params = @_;
+
+ my $code = $params{src};
+        if (! defined $code || ! length $code) {
+ error gettext("missing src attribute");
+ }
+ return create($code, \%params);
+ }
+
+    sub create ($$) {
+ # This function calls the image generating function and returns
+ # the <img .. /> for the generated image.
+ my $code = shift;
+ my $params = shift;
+
+ my $digest = md5_hex(Encode::encode_utf8($code));
+
+ my $imglink= $params->{page} . "/$digest.png";
+ my $imglog = $params->{page} . "/$digest.log";
+ will_render($params->{page}, $imglink);
+ will_render($params->{page}, $imglog);
+
+ my $imgurl=urlto($imglink, $params->{destpage});
+ my $logurl=urlto($imglog, $params->{destpage});
+
+ if (-e "$config{destdir}/$imglink" ||
+ gen_image($code, $digest, $params->{page})) {
+ return qq{<img src="$imgurl}
+ .(exists $params->{alt} ? qq{" alt="} . $params->{alt} : qq{})
+ .qq{" class="asymptote" />};
+ }
+ else {
+ error qq{<a href="$logurl">}.gettext("failed to generate image from code")."</a>";
+ }
+ }
+
+    sub gen_image ($$$) {
+ # Actually creates the image.
+ my $code = shift;
+ my $digest = shift;
+ my $imagedir = shift;
+
+ my $tmp = eval { create_tmp_dir($digest) };
+ if (! $@ &&
+ writefile("$digest.asy", $tmp, $code) &&
+ writefile("$imagedir/$digest.png", $config{destdir}, "") &&
+        system("asy -render=2 -offscreen -f png -o $config{destdir}/$imagedir/$digest.png $tmp/$digest.asy >$tmp/$digest.log 2>&1") == 0
+ ) {
+ return 1;
+ }
+ else {
+ # store failure log
+ my $log="";
+ {
+ if (open(my $f, '<', "$tmp/$digest.log")) {
+ local $/=undef;
+ $log = <$f>;
+ close($f);
+ }
+ }
+ writefile("$digest.log", "$config{destdir}/$imagedir", $log);
+
+ return 0;
+ }
+ }
+
+ sub create_tmp_dir ($) {
+ # Create a temp directory, it will be removed when ikiwiki exits.
+ my $base = shift;
+
+ my $template = $base.".XXXXXXXXXX";
+ my $tmpdir = tempdir($template, TMPDIR => 1, CLEANUP => 1);
+ return $tmpdir;
+ }
+
+ 1;
+
diff --git a/doc/plugins/contrib/asymptote/ikiwiki/directive/asymptote.mdwn b/doc/plugins/contrib/asymptote/ikiwiki/directive/asymptote.mdwn
new file mode 100644
index 000000000..c6bdb1a99
--- /dev/null
+++ b/doc/plugins/contrib/asymptote/ikiwiki/directive/asymptote.mdwn
@@ -0,0 +1,27 @@
+The `asymptote` directive is supplied by the [[!iki plugins/contrib/asymptote
+desc=asymptote]] plugin.
+
+This directive allows embedding [asymptote](http://asymptote.sourceforge.net/)
+diagrams in a page. Example usage:
+
+ \[[!asymptote src="""
+ import geometry;
+ unitsize(1cm);
+ triangle t = triangle((0,0), (4,0), (0.5,2));
+ show(La="$D$", Lb="$E$", Lc="", t);
+ dot(t.A^^t.B^^t.C);
+ point pD = midpoint(t.BC); dot(pD);
+ point pE = midpoint(t.AC); dot(pE);
+ draw(pD--pE);
+ point A_ = (pD-t.A)*2+t.A; dot("$A'$", A_, NE);
+ draw(t.B--A_--t.C, dashed);
+ draw(t.A--A_, dashed);
+ point E_ = midpoint(line(t.B,A_)); dot(Label("$E'$", E_, E));
+ draw(E_--pD, dashed);
+ """]]
+
+The `asymptote` directive supports the following parameters:
+
+- `src` - The asymptote source code to render.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/plugins/contrib/attach.mdwn b/doc/plugins/contrib/attach.mdwn
new file mode 100644
index 000000000..f44125b10
--- /dev/null
+++ b/doc/plugins/contrib/attach.mdwn
@@ -0,0 +1,47 @@
+[[!template id=plugin name=attach author="[[Ben]]"]]
+
+**Note: This plugin is currently pending upload. It is also most assuredly beta.**
+
+Most of this plugin's functionality is configured in the IkiWiki setup file (`ikiwiki.setup`, by default), in an `attach` block. A minimum configuration looks something like this:
+
+ attach => {
+ enabled => 1, #If false, no new attachments are allowed via the web interface
+ every_page => 1, #Toggles whether attachments are allowed on every page of the IkiWiki
+ },
+
+This configuration allows any user of the IkiWiki to attach any file to any page of the IkiWiki. By default, each file must be no bigger than 1MB.
+
+## Configuration Options
+
+Each option is specified in the same format as above: the option name is a key inside the `attach` block, and its setting is that key's value.
+
+* **ban_ips** - A space separated list of regular expressions corresponding to IP addresses which are prohibited from attaching files. IP address filtering is described in further detail below.
+* **enabled** - Toggles whether attachments are allowed. If false, the attachment form will not appear on any pages, nor will the CGI accept any new uploads. Details of existing attachments will continue to be displayed on the appropriate pages, however.
+* **max\_kbs** - The maximum size in kilobytes an attachment can be. If an upload's size exceeds this value, it will be prohibited. By default, this value is _1024_. If set to _0_, attachments of any size are permitted.
+* **dir** - The name of the temporary directory, relative to the source directory, in which attachments are stored pending validation. The value is prefixed with a period so that it is hidden on \*nix systems. The default value is _attachments_, and there shouldn't be any need to change this.
+* **mime\_strategy** - The method of filtering attachments on their MIME type. Permissible values are _allow,deny_ and _deny,allow_. MIME filtering is described in further detail below.
+* **mime\_allow** - A space-separated list of MIME types, used in conjunction with _mime\_strategy_ and _mime\_deny_. MIME filtering is described in further detail below.
+* **mime\_deny** - A space-separated list of MIME types, used in conjunction with _mime\_strategy_ and _mime\_allow_. MIME filtering is described in further detail below.
+
+## MIME Filtering
+Attachments may be filtered on the basis of their MIME type. For instance, an administrator may wish to prohibit video files from being uploaded to his IkiWiki. This is achieved by a "MIME strategy", a list of MIME types to allow, and a list of MIME types to deny.
+
+With an _allow,deny_ strategy: "First, all Allow directives are evaluated; at least one must match, or the [attachment] is rejected. Next, all Deny directives are evaluated. If any matches, the [attachment] is rejected. Last, any requests which do not match an Allow or a Deny directive are denied by default." (Excerpt from [Apache Module: mod_access](http://httpd.apache.org/docs/2.0/mod/mod_access.html) on which this feature is based).
+
+With a _deny,allow_ strategy: "First, all Deny directives are evaluated; if any match, the request is denied unless it also matches an Allow directive. Any requests which do not match any Allow or Deny directives are permitted." (Excerpt from [Apache Module: mod_access](http://httpd.apache.org/docs/2.0/mod/mod_access.html) on which this feature is based).
+
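The two strategies mirror the Apache mod_access semantics quoted above. As a language-agnostic sketch (a hypothetical helper, not the plugin's actual code; `allow` and `deny` stand for the parsed _mime\_allow_ and _mime\_deny_ lists), the decision could be expressed like this:

```python
# Hypothetical sketch of the two MIME strategies; illustration only.
def mime_permitted(mime, strategy, allow, deny):
    in_allow = mime in allow
    in_deny = mime in deny
    if strategy == "allow,deny":
        # Must match an allow entry and no deny entry; anything
        # matching neither list is denied by default.
        return in_allow and not in_deny
    else:  # "deny,allow"
        # Denied only if it matches deny without also matching allow;
        # anything matching neither list is permitted.
        return not in_deny or in_allow
```

So under _allow,deny_ an empty allow list rejects everything, while under _deny,allow_ an empty deny list accepts everything.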
+## IP Address Filtering
+
+Attachments added via the web can be denied on the basis of their uploader's IP address matching a blacklist.
+
+The blacklist is defined as a space-separated list of regular expressions as a value of the _ban\_ips_ setting. For example, a value of '3 127\. ^2[45]' blocks all addresses containing the number 3, containing the octet 127, and starting with a 2 followed by a 4 or 5.
+
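A sketch of how such a blacklist could be evaluated (a hypothetical helper; the plugin's actual code may differ): each whitespace-separated pattern is tried as an unanchored regular expression against the address.

```python
import re

# Hypothetical sketch of ban_ips matching; illustration only.
def ip_banned(ip, ban_ips):
    # An uploader is banned if any pattern matches anywhere in the
    # address; patterns can anchor themselves with ^ if needed.
    return any(re.search(pattern, ip) for pattern in ban_ips.split())
```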
+## Allowing Attachments Only on Specific Pages
+
+An administrator may wish to only allow users to attach files to pages which he has chosen. To do so, he must set the _every\_page_ option to _0_, then add an _attach_ preprocessor directive (\[\[attach \]\]) to the pages on which attachments should be allowed.
+
+## Attaching Files from the Command Line
+
+An attachment is simply a non-source file located in the source directory of the IkiWiki. The directory in which the file is located determines which page it is attached to. For example, to attach _song.ogg_ to the _music_ page, an administrator would simply create a _music_ sub-directory of the source directory, if it doesn't already exist, and move _song.ogg_ inside of it.
+
+Files attached in this manner are not subject to any of the validation procedures. They can be of arbitrary size and type.
diff --git a/doc/plugins/contrib/attach/discussion.mdwn b/doc/plugins/contrib/attach/discussion.mdwn
new file mode 100644
index 000000000..803b7dcdb
--- /dev/null
+++ b/doc/plugins/contrib/attach/discussion.mdwn
@@ -0,0 +1,18 @@
+I found this posted to todo list, moved here: --[[Joey]]
+
+> First pass at an attachments plugin. See [[plugins/contrib/attach]] for
+> details/docs. Here's the [diff](http://pastebin.com/f4d889b65), and
+> here's some [technical notes](http://pastebin.com/f584b9d9d). There are
+> still various things I want to fix and tweak, but it works reasonably for
+> me as is.
+
+I guess I missed this when the plugin page was posted last September, and
+since the [[soc]] stuff wasn't updated, I didn't realize this was Ben's soc
+work. Which is more or less why I didn't look at it.
+
+This plugin would need quite a lot of work to finish up, I do think it was
+taking the right approach, sorry I never followed up on it.
+
+In the meantime, I've written an attachment plugin that does most of the
+same stuff, and behaves closer to how I originally sketched [[todo/fileupload]]
+as working.
diff --git a/doc/plugins/contrib/bibtex.mdwn b/doc/plugins/contrib/bibtex.mdwn
new file mode 100644
index 000000000..e600f3963
--- /dev/null
+++ b/doc/plugins/contrib/bibtex.mdwn
@@ -0,0 +1,59 @@
+[[!template id=plugin name=bibtex author="[[Matthias]]"]]
+
+# bibtex for ikiwiki #
+
+(get me at [github]!)
+
+[github]: https://github.com/ihrke/iki-bibtex
+
+This [ikiwiki]-plugin provides a
+
+ [[!bibtex ]]
+
+directive for [ikiwiki].
+
+So far, it can display a raw or formatted bibtex-entry from a
+bibtex-file (either checked into ikiwiki, or not) and display a
+list of all bibtex-keys used on a page.
+
+[ikiwiki]: http://ikiwiki.info/
+
+Features:
+
+* three different output formats for citations:
+ + cite - Author (year)
+ + citation - Author1, Author2 (year): **Title.** *Journal*
+ vol(num). pp.
+ + raw - raw bibtex-entry preformatted
+* supports websetup
+* bibliography
+
+
+## Requirements ##
+
+* [Text::BibTeX] - available from CPAN
+
+[Text::BibTeX]: http://search.cpan.org/~ambs/Text-BibTeX-0.61/lib/Text/BibTeX.pm
+
+## Examples ##
+
+Output from file mybib.bib, bibtex key 'key1' in a citation-like
+format (authors (year): journal. volume (number), pages.).
+
+ [[!bibtex file="mybib.bib" key="key1" format="citation"]]
+
+Combine with toggle-plugin to optionally display the raw bibtex
+
+ [[!bibtex key="Ihrke2011"]] \[[!toggle id="bibtexentry" text="(entry)"]]
+ \[[!toggleable id="bibtexentry" text="""
+ [[!bibtex key="Ihrke2011" format="raw"]]
+ \[[!toggle id="bibtexentry" text="(hide)"]]
+ """]]
+
+Add a bibliography that includes all bibtex-directives from that page
+
+
+ ## Bibliography ##
+ [[!bibtex_bibliography ]]
+
+ ----
diff --git a/doc/plugins/contrib/created_in_future.mdwn b/doc/plugins/contrib/created_in_future.mdwn
new file mode 100644
index 000000000..5768057aa
--- /dev/null
+++ b/doc/plugins/contrib/created_in_future.mdwn
@@ -0,0 +1,18 @@
+# Created_in_future
+
+This plugin provides a `created_in_future()` [[PageSpec|ikiwiki/pagespec/]]
+function. It matches pages which have a creation date in the future.
+
+It also sets the page's next modification time to its creation date, so that
+the page (and the pages referring to it) will be rebuilt by the relevant run
+of `ikiwiki` once that date arrives.
+
+## Usage
+
+It can be used to display a list of upcoming events.
+
+ \[[!inline pages="events/* and created_in_future()" reverse=yes sorted=meta(date)]]
+
+## Code
+
+Code and documentation are available here: [[https://atelier.gresille.org/projects/gresille-ikiwiki/wiki/Created_in_future]].
diff --git a/doc/plugins/contrib/default_content_for___42__copyright__42___and___42__license__42__.mdwn b/doc/plugins/contrib/default_content_for___42__copyright__42___and___42__license__42__.mdwn
new file mode 100644
index 000000000..38d40a35f
--- /dev/null
+++ b/doc/plugins/contrib/default_content_for___42__copyright__42___and___42__license__42__.mdwn
@@ -0,0 +1,62 @@
+[[!template id=plugin name=copyright author="[[tschwinge]]"]]
+[[!template id=plugin name=license author="[[tschwinge]]"]]
+
+[[!meta title="default content for *copyright* and *license*"]]
+
+Someone was just asking for it and I had written these two plugins already some months ago,
+so I'm now publishing them here.
+
+[`copyright.pm`](http://git.savannah.gnu.org/cgit/hurd/web.git/plain/.library/IkiWiki/Plugin/copyright.pm)
+and
+[`license.pm`](http://git.savannah.gnu.org/cgit/hurd/web.git/plain/.library/IkiWiki/Plugin/license.pm)
+
+Usage instructions are found inside the two plugin files.
+
+--[[tschwinge]]
+
+I was asking about this in IRC the other day, but someone pointed me at the
+[[Varioki|todo/varioki_--_add_template_variables___40__with_closures_for_values__41___in_ikiwiki.setup]]
+plugin. It seems to me that it would be a better idea to have a way of defining
+template variables in general, rather than having to add a new plugin for every
+template variable somebody wants to use.
+
+--[[bma]]
+
+Copyright and license values are not "template values", they are values
+tracked by the [[meta]] plugin, and that various code compares and uses to fill
+out the templates. Something like varioki cannot do that. --[[Joey]]
+
+Somewhat more detailed usage documentation would be appreciated. I tried to set up
+those plugins with a current ikiwiki release, i.e. 2.61, but they appeared to do
+nothing, really. Also, those example pages don't seem to use those plugins, even;
+they set "copyright" and "license" properties using ordinary [[meta]] tags. Maybe
+I'm missing something terribly obvious? --Peter
+
+> Only obvious if you read the source :-). You need to put a file named "copyright.html"
+> (respectively "license.html") in your wiki. Everything underneath that (in the wikilink sense) will use that
+> content for the license or copyright. Saves putting \[[meta license="foo"]] in every page [[DavidBremner]]
+
+By the way: these need not be *HTML* files; `copyright.mdwn`,
+respectively `license.mdwn`, or every other format supported
+by ikiwiki are likewise fine. --[[tschwinge]]
+
+> Jon has done something similar in [[todo/allow_site-wide_meta_definitions]];
+> his version has the advantages that it doesn't invent magical page names,
+> and can extend beyond just copyright and license, but has the disadvantage
+> that it doesn't support setting defaults for a given "subdirectory"
+> only. --[[smcv]]
+
+> I downloaded the two *.pm files and made them executable, put them in
+> `/usr/local/lib/site_perl/IkiWiki/Plugin/`, added `copyright.mdwn` and `license.mdwn`,
+> and rebuilt the wiki, but the copyright/license text doesn't show up. Do these plugins work with Ikiwiki `3.20100815`?
+> -- 9unmetal
+
+>> Solved by email long ago; the problem was that the user had not put them
+>> into the *add_plugins* set in the wiki's `ikiwiki.setup`. --[[tschwinge]]
+
+[[!template id=gitbranch branch=smcv/contrib/defcopyright author="[[tschwinge]]"]]
+
+> For `./gitremotes` convenience (taking the Linus approach to backups :-) )
+> I've added this to my git repository as a branch. No review, approval or
+> ownership is implied, feel free to replace this with a branch in any other
+> repository --[[smcv]]
diff --git a/doc/plugins/contrib/dynamiccookies.mdwn b/doc/plugins/contrib/dynamiccookies.mdwn
new file mode 100644
index 000000000..c43b18f6b
--- /dev/null
+++ b/doc/plugins/contrib/dynamiccookies.mdwn
@@ -0,0 +1,12 @@
+[[!template id=plugin name=dynamiccookies author="[[schmonz]]"]]
+[[!template id=gitbranch branch=schmonz/dynamiccookies author="[[schmonz]]"]]
+[[!tag type/web]]
+
+This plugin populates ikiwiki's cookiejar by calling an external
+program. The program is expected to print the serialized cookies
+on `stdout` in a form which can be `eval`'d (e.g., `Data::Dumper`).
+
+The plugin author's use case for this seemingly hacky interface:
+aggregating authenticated feeds at work, where for various reasons
+the needed cookies must be acquired using a separate `perl` from
+the one used by ikiwiki.
diff --git a/doc/plugins/contrib/field.mdwn b/doc/plugins/contrib/field.mdwn
new file mode 100644
index 000000000..363d3a7eb
--- /dev/null
+++ b/doc/plugins/contrib/field.mdwn
@@ -0,0 +1,219 @@
+[[!template id=plugin name=field author="[[rubykat]]"]]
+[[!tag type/meta]]
+[[!toc]]
+## NAME
+
+IkiWiki::Plugin::field - front-end for per-page record fields.
+
+## SYNOPSIS
+
+ # activate the plugin
+ add_plugins => [qw{goodstuff field ....}],
+
+ # simple registration
+ field_register => [qw{meta}],
+
+ # simple registration with priority
+ field_register => {
+ meta => 'last',
+ foo => 'DD',
+ },
+
+ # allow the config to be queried as a field
+ field_allow_config => 1,
+
+ # flag certain fields as "tags"
+ field_tags => {
+ BookAuthor => '/books/authors',
+ BookGenre => '/books/genres',
+ MovieGenre => '/movies/genres',
+ }
+
+## DESCRIPTION
+
+This plugin is meant to be used in conjunction with other plugins
+in order to provide a uniform interface to access per-page structured
+data, where each page is treated like a record, and the structured data
+are fields in that record. This can include the meta-data for that page,
+such as the page title.
+
+Plugins can register a function which will return the value of a "field" for
+a given page. This can be used in a few ways:
+
+* In page templates; all registered fields will be passed to the page template in the "pagetemplate" processing.
+* In PageSpecs; the "field" function can be used to match the value of a field in a page.
+* In SortSpecs; the "field" function can be used for sorting pages by the value of a field in a page.
+* By other plugins, using the field_get_value function, to get the value of a field for a page, and do with it what they will.
+
+## CONFIGURATION OPTIONS
+
+The following options can be set in the ikiwiki setup file.
+
+**field_allow_config**
+
+ field_allow_config => 1,
+
+Allow the $config hash to be queried like any other field; the
+keys of the config hash are the field names with a prefix of "CONFIG-".
+
+**field_register**
+
+ field_register => [qw{meta}],
+
+ field_register => {
+ meta => 'last',
+ foo => 'DD',
+ },
+
+A hash of plugin-IDs to register. The keys of the hash are the names of the
+plugins, and the values of the hash give the order of lookup of the field
+values. The order can be 'first', 'last', 'middle', or an explicit order
+sequence between 'AA' and 'ZZ'. If the simpler type of registration is used,
+then the order will be 'middle'.
+
+This assumes that the plugins in question store data in the %pagestatus hash
+using the ID of that plugin, and thus the field values are looked for there.
+
+This is the simplest form of registration, but the advantage is that it
+doesn't require the plugin to be modified in order for it to be
+registered with the "field" plugin.
+
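One plausible way to resolve the lookup order is to map the symbolic orders onto the 'AA'..'ZZ' scale and sort the registered plugins; this is a hypothetical sketch, not the plugin's actual key scheme:

```python
# Hypothetical sketch: 'first' sorts before 'AA', 'middle' behaves
# like 'MM', and 'last' sorts after 'ZZ'. Illustration only.
def order_key(order):
    return {"first": "0", "middle": "MM", "last": "zz"}.get(order, order)

def lookup_order(registered):
    # registered: plugin-ID => 'first'/'middle'/'last' or 'AA'..'ZZ'.
    # Since the first found value wins, plugins earlier in this list
    # take precedence for a given field.
    return sorted(registered, key=lambda plugin: order_key(registered[plugin]))
```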
+**field_tags**
+
+ field_tags => {
+ BookAuthor => '/books/authors',
+ BookGenre => '/books/genres',
+ MovieGenre => '/movies/genres',
+ }
+
+A hash of fields and their associated pages. This provides a faceted
+tagging system.
+
+The way this works is that a given field-name will be associated with a given
+page, and the values of that field will be linked to sub-pages of that page,
+the same way that the \[[!tag ]] directive does.
+
+This also provides a field with the suffix of `-tagpage` which gives
+the name of the page to which that field-value is linked.
+
+For example:
+
+ BookGenre: SF
+
+will link to "/books/genres/SF", with a link-type of "bookgenre".
+
+If one was using a template, then the following template:
+
+ Genre: <TMPL_VAR BOOKGENRE>
+ GenrePage: <TMPL_VAR BOOKGENRE-TAGPAGE>
+ GenreLink: \[[<TMPL_VAR BOOKGENRE-TAGPAGE>]]
+
+would give:
+
+ Genre: SF
+ GenrePage: /books/genres/SF
+ GenreLink: <a href="/books/genres/SF/">SF</a>
+
+## PageSpec
+
+The `field` plugin provides a few PageSpec functions to match values
+of fields for pages.
+
+* field
+ * **field(*name* *glob*)**
+ * field(bar Foo\*) will match if the "bar" field starts with "Foo".
+* destfield
+ * **destfield(*name* *glob*)**
+ * as for "field" but matches against the destination page (i.e. when the source page is being included in another page).
+* field_item
+ * **field_item(*name* *glob*)**
+ * field_item(bar Foo) will match if one of the values of the "bar" field is "Foo".
+* destfield_item
+ * **destfield_item(*name* *glob*)**
+ * as for "field_item" but matches against the destination page.
+* field_null
+ * **field_null(*name*)**
+ * matches if the field is null, that is, if there is no value for that field, or the value is empty.
+* field_tagged
+ * **field_tagged(*name* *glob*)**
+ * like `tagged`, but this uses the tag-bases and link-types defined in the `field_tags` configuration option.
+* destfield_tagged
+ * **destfield_tagged(*name* *glob*)**
+ * as for "field_tagged" but matches against the destination page.
+
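Ignoring ikiwiki's case-folding and the rest of the PageSpec machinery, the name/glob matching these functions perform can be sketched like this (hypothetical helper, illustration only; `fields` stands in for a page's field values):

```python
import fnmatch

# Hypothetical sketch of field(name glob) matching; illustration only.
def match_field(spec, fields):
    # The parameter is a field name and a glob, separated by whitespace.
    name, glob = spec.split(None, 1)
    value = fields.get(name)
    if value is None:
        return False
    # ikiwiki globs use * and ?, which fnmatch translates the same way.
    return fnmatch.fnmatchcase(value, glob)
```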
+## SortSpec
+
+The "field" SortSpec function can be used to sort pages by the value of a given field. This is used for directives that take sort parameters, such as **inline** or **report**.
+
+field(*name*)
+
+For example:
+
+sort="field(bar)" will sort by the value of the "bar" field.
+
+Additionally, the "field_natural" SortSpec function will use the
+Sort::Naturally module to do its comparison (though it will fail if that
+module is not installed).
+
+## FUNCTIONS
+
+### field_register
+
+field_register(id=>$id);
+
+Register a plugin as having field data. The above form is the simplest, where
+the field value is looked up in the %pagestatus hash under the plugin-id.
+
+Additional Options:
+
+**call=>\&myfunc**
+
+A reference to a function to call rather than just looking up the value in the
+%pagestatus hash. It takes two arguments: the name of the field, and the name
+of the page. It is expected to return (a) an array of the values of that field
+if "wantarray" is true, or (b) a concatenation of the values of that field
+if "wantarray" is not true, or (c) undef if there is no field by that name.
+
+ sub myfunc ($$) {
+ my $field = shift;
+ my $page = shift;
+
+ ...
+
+ return (wantarray ? @values : $value);
+ }
+
+**first=>1**
+
+Set this to be called first in the sequence of calls looking for values. Since
+the first found value is the one which is returned, ordering is significant.
+This is equivalent to "order=>'first'".
+
+**last=>1**
+
+Set this to be called last in the sequence of calls looking for values. Since
+the first found value is the one which is returned, ordering is significant.
+This is equivalent to "order=>'last'".
+
+**order=>$order**
+
+Set the explicit ordering in the sequence of calls looking for values. Since
+the first found value is the one which is returned, ordering is significant.
+
+The values allowed for this are "first", "last", "middle", or a two-character
+ordering-sequence between 'AA' and 'ZZ'.
+
+### field_get_value($field, $page)
+
+ my $value = field_get_value($field, $page);
+
+ my $value = field_get_value($field, $page, foo=>'bar');
+
+Returns the value of the field for that page, or undef if none is found.
+It is also possible to override the value returned by passing in
+a value of your own.
+
+## DOWNLOAD
+
+* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/field.pm>
+* git repo at git://github.com/rubykat/ikiplugins.git
diff --git a/doc/plugins/contrib/field/discussion.mdwn b/doc/plugins/contrib/field/discussion.mdwn
new file mode 100644
index 000000000..6161f80df
--- /dev/null
+++ b/doc/plugins/contrib/field/discussion.mdwn
@@ -0,0 +1,407 @@
+Having tried out `field`, some comments (from [[smcv]]):
+
+The general concept looks great.
+
+The `pagetemplate` hook seems quite namespace-polluting: on a site containing
+a list of books, I'd like to have an `author` field, but that would collide
+with IkiWiki's use of `<TMPL_VAR AUTHOR>` for the author of the *page*
+(i.e. me). Perhaps it'd be better if the pagetemplate hook was only active for
+`<TMPL_VAR FIELD_AUTHOR>` or something? (For those who want the current
+behaviour, an auxiliary plugin would be easy.)
+
+> No, please. The idea is to be *able* to override field names if one wishes to, and to choose non-colliding field names if one does not. I don't wish to lose the power of being able to, say, define a page title in YAML format if I want to, or to write a site-specific plugin which calculates a page title, or other nifty things.
+> It's not like one is going to lose the fields defined by the meta plugin; if "author" is defined by \[[!meta author=...]] then that's what will be found by "field" (provided the "meta" plugin is registered; that's what the "field_register" option is for).
+> --[[KathrynAndersen]]
+
+>> Hmm. I suppose if you put the title (or whatever) in the YAML, then
+>> "almost" all the places in IkiWiki that respect titles will do the
+>> right thing due to the pagetemplate hook, with the exception being
+>> anything that has special side-effects inside `meta` (like `date`),
+>> or anything that looks in `$pagestate{foo}{meta}` directly
+>> (like `map`). Is your plan that `meta` should register itself by
+>> default, and `map` and friends should be adapted to
+>> work based on `getfield()` instead of `$pagestate{foo}{meta}`, then?
+
+>>> Based on `field_get_value()`, yes. That would be my ideal. Do you think I should implement that as an ikiwiki branch? --[[KathrynAndersen]]
+
+>>>> This doesn't solve cases where certain fields are treated specially; for
+>>>> instance, putting a `\[[!meta permalink]]` on a page is not the same as
+>>>> putting it in `ymlfront` (in the latter case you won't get your
+>>>> `<link>` header), and putting `\[[!meta date]]` is not the same as putting
+>>>> `date` in `ymlfront` (in the latter case, `%pagectime` won't be changed).
+>>>>
+>>>> One way to resolve that would be to have `ymlfront`, or similar, be a
+>>>> front-end for `meta` rather than for `field`, and call
+>>>> `IkiWiki::Plugin::meta::preprocess` (or a refactored-out function that's
+>>>> similar).
+>>>>
+>>>> There are also some cross-site scripting issues (see below)... --[[smcv]]
+
+>> (On the site I mentioned, I'm using an unmodified version of `field`,
+>> and currently working around the collision by tagging books' pages
+>> with `bookauthor` instead of `author` in the YAML.) --s
+
+>> Revisiting this after more thought, the problem here is similar to the
+>> possibility that a wiki user adds a `meta` shortcut
+>> to [[shortcuts]], or conversely, that a plugin adds a `cpan` directive
+>> that conflicts with the `cpan` shortcut that pages already use. (In the
+>> case of shortcuts, this is resolved by having plugin-defined directives
+>> always win.) For plugin-defined meta keywords this is the plugin
+>> author's/wiki admin's problem - just don't enable conflicting plugins! -
+>> but it gets scary when you start introducing things like `ymlfront`, which
+>> allow arbitrary, wiki-user-defined fields, even ones that subvert
+>> other plugins' assumptions.
+>>
+>> The `pagetemplate` hook is particularly alarming because page templates are
+>> evaluated in many contexts, not all of which are subject to the
+>> htmlscrubber or escaping; because the output from `field` isn't filtered,
+>> prefixed or delimited, when combined with an arbitrary-key-setting plugin
+>> like `ymlfront` it can interfere with other plugins' expectations
+>> and potentially cause cross-site scripting exploits. For instance, `inline`
+>> has a `pagetemplate` hook which defines the `FEEDLINKS` template variable
+>> to be a blob of HTML to put in the `<head>` of the page. As a result, this
+>> YAML would be bad:
+>>
+>> ---
+>> FEEDLINKS: <script>alert('code injection detected')</script>
+>> ---
+>>
+>> (It might require a different case combination due to implementation
+>> details, I'm not sure.)
+>>
+>> It's difficult for `field` to do anything about this, because it doesn't
+>> know whether a field is meant to be plain text, HTML, a URL, or something
+>> else.
+>>
+>> If `field`'s `pagetemplate` hook did something more limiting - like
+>> only emitting template variables starting with `field_`, or from some
+>> finite set, or something - then this would cease to be a problem, I think?
+>>
+>> `ftemplate` and `getfield` don't have this problem, as far as I can see,
+>> because their output is in contexts where the user could equally well have
+>> written raw HTML directly; the user can cause themselves confusion, but
+>> can't cause harmful output. --[[smcv]]
+
+From a coding style point of view, the `$CamelCase` variable names aren't
+IkiWiki style, and the `match_foo` functions look as though they could benefit
+from being thin wrappers around a common `&IkiWiki::Plugin::field::match`
+function (see `meta` for a similar approach).
+
+I think the documentation would probably be clearer in a less manpage-like
+and more ikiwiki-like style?
+
+> I don't think ikiwiki *has* a "style" for docs, does it? So I followed the Perl Module style. And I'm rather baffled as to why having the docs laid out in clear sections... makes them less clear. --[[KathrynAndersen]]
+
+>> I keep getting distracted by the big shouty headings :-)
+>> I suppose what I was really getting at was that when this plugin
+>> is merged, its docs will end up split between its plugin
+>> page, [[plugins/write]] and [[ikiwiki/PageSpec]]; on some of the
+>> contrib plugins I've added I've tried to separate the docs
+>> according to how they'll hopefully be laid out after merge. --s
+
+If one of my branches from [[todo/allow_plugins_to_add_sorting_methods]] is
+accepted, a `field()` cmp type would mean that [[plugins/contrib/report]] can
+stop reimplementing sorting. Here's the implementation I'm using, with
+your "sortspec" concept (a sort-hook would be very similar): if merged,
+I think it should just be part of `field` rather than a separate plugin.
+
+ # Copyright © 2010 Simon McVittie, released under GNU GPL >= 2
+ package IkiWiki::Plugin::fieldsort;
+ use warnings;
+ use strict;
+ use IkiWiki 3.00;
+ use IkiWiki::Plugin::field;
+
+ sub import {
+ hook(type => "getsetup", id => "fieldsort", call => \&getsetup);
+ }
+
+ sub getsetup () {
+ return
+ plugin => {
+ safe => 1,
+ rebuild => undef,
+ },
+ }
+
+ package IkiWiki::SortSpec;
+
+ sub cmp_field {
+ if (!length $_[0]) {
+ error("sort=field requires a parameter");
+ }
+
+ my $left = IkiWiki::Plugin::field::field_get_value($_[0], $a);
+ my $right = IkiWiki::Plugin::field::field_get_value($_[0], $b);
+
+ $left = "" unless defined $left;
+ $right = "" unless defined $right;
+ return $left cmp $right;
+ }
+
+ 1;
+
+----
+
+Disclaimer: I've only looked at this plugin and ymlfront, not other related
+stuff yet. (I quite like ymlfront, so I looked at this as its dependency. :)
+I also don't want to annoy you with a lot of design discussion
+if your main goal was to write a plugin that did exactly what you wanted.
+
+My first question is: Why we need another plugin storing metadata
+about the page, when we already have the meta plugin? Much of the
+complication around the field plugin has to do with it accessing info
+belonging to the meta plugin, and generalizing that to be able to access
+info stored by other plugins too. (But I don't see any other plugins that
+currently store such info). Then too, it raises points of confusion like
+smcv's discussion of field author vs meta author above. --[[Joey]]
+
+> The point is exactly in the generalization, to provide a uniform interface for accessing structured data, no matter what the source of it, whether that be the meta plugin or some other plugin.
+
+> There were a few reasons for this:
+
+>1. In converting my site over from PmWiki, I needed something that was equivalent to PmWiki's Page-Text-Variables (which is how PmWiki implements structured data).
+>2. I also wanted an equivalent of PmWiki's Page-Variables, which, rather than being simple variables, are the return-value of a function. This gives one a lot of power, because one can do calculations, derive one thing from another. Heck, just being able to have a "basename" variable is useful.
+>3. I noticed that in the discussion about structured data, it was mired down in disagreements about what form the structured data should take; I wanted to overcome that hurdle by decoupling the form from the content.
+>4. I actually use this to solve (1), because, while I do use ymlfront, initially my pages were in PmWiki format (I wrote (another) unreleased plugin which parses PmWiki format) including PmWiki's Page-Text-Variables for structured data. So I needed something that could deal with multiple formats.
+
+> So, yes, it does cater to mostly my personal needs, but I think it is more generally useful, also.
+> --[[KathrynAndersen]]
+
+>> Is it fair to say, then, that `field`'s purpose is to take other
+>> plugins' arbitrary per-page data, and present it as a single
+>> merged/flattened string => string map per page? From the plugins
+>> here, things you then use that merged map for include:
+>>
+>> * sorting - stolen by [[todo/allow_plugins_to_add_sorting_methods]]
+>> * substitution into pages with Perl-like syntax - `getfield`
+>> * substitution into wiki-defined templates - the `pagetemplate`
+>> hook
+>> * substitution into user-defined templates - `ftemplate`
+>>
+>> As I mentioned above, the flattening can cause collisions (and in the
+>> `pagetemplate` case, even security problems).
+>>
+>> I wonder whether conflating Page Text Variables with Page Variables
+>> causes `field` to be more general than it needs to be?
+>> To define a Page Variable (function-like field), you need to write
+>> a plugin containing that Perl function; if we assume that `field`
+>> or something resembling it gets merged into ikiwiki, then it's
+>> reasonable to expect third-party plugins to integrate with whatever
+>> scaffolding there is for these (either in an enabled-by-default
+>> plugin that most people are expected to leave enabled, like `meta`
+>> now, or in the core), and it doesn't seem onerous to expect each
+>> plugin that wants to participate in this mechanism to have code to
+>> do so. While it's still contrib, `field` could just have a special case
+>> for the meta plugin, rather than the converse?
+>>
+>> If Page Text Variables are limited to being simple strings as you
+>> suggest over in [[forum/an_alternative_approach_to_structured_data]],
+>> then they're functionally similar to `meta` fields, so one way to
+>> get their functionality would be to extend `meta` so that
+>>
+>> \[[!meta badger="mushroom"]]
+>>
+>> (for an unrecognised keyword `badger`) would store
+>> `$pagestate{$page}{meta}{badger} = "mushroom"`? Getting this to
+>> appear in templates might be problematic, because a naive
+>> `pagetemplate` hook would have the same problem that `field` combined
+>> with `ymlfront` currently does.
+>>
+>> One disadvantage that would appear if the function-like and
+>> meta-like fields weren't in the same namespace would be that it
+>> wouldn't be possible to switch a field from being meta-like to being
+>> function-like without changing any wiki content that referenced it.
+>>
+>> Perhaps meta-like fields should just *be* `meta` (with the above
+>> enhancement), as a trivial case of function-like fields? That would
+>> turn `ymlfront` into an alternative syntax for `meta`, I think?
+>> That, in turn, would hopefully solve the special-fields problem,
+>> by just delegating it to meta. I've been glad of the ability to define
+>> new ad-hoc fields with this plugin without having to write an extra plugin
+>> to do so (listing books with a `bookauthor` and sorting them by
+>> `"field(bookauthor) title"`), but that'd be just as easy if `meta`
+>> accepted ad-hoc fields?
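+>>
+>> For instance (an illustrative sketch of that equivalence), this
+>> `ymlfront` header:
+>>
+>>     ---
+>>     bookauthor: Georgette Heyer
+>>     ---
+>>
+>> would then just be another way of writing
+>>
+>>     \[[!meta bookauthor="Georgette Heyer"]]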
+>>
+>> --[[smcv]]
+
+>>> Your point above about cross-site scripting is a valid one, and something I
+>>> hadn't thought of (oops).
+
+>>> I still want to be able to populate pagetemplate templates with field, because I
+>>> use it for a number of things, such as setting which CSS files to use for a
+>>> given page, and, as I said, for titles. But apart from the titles, I
+>>> realize I've been setting them in places other than the page data itself.
+>>> (Another unreleased plugin, `concon`, uses Config::Context to be able to
+>>> set variables on a per-site, per-directory and a per-page basis).
+
+>>> The first possible solution is what you suggested above: for field to only
+>>> set values in pagetemplate which are prefixed with *field_*. I don't think
+>>> this is quite satisfactory, since that would still mean that people could
+>>> put un-scrubbed values into a pagetemplate, albeit only values
+>>> named field_foo, etc. --[[KathrynAndersen]]
+
+>>>> They can already do similar; `PERMALINK` is pre-sanitized to
+>>>> ensure that it's a "safe" URL, but if an extremely confused wiki admin were
+>>>> to put `COPYRIGHT` in their RSS/Atom feed's `<link>`, a malicious user
+>>>> could put an unsafe (e.g. Javascript) URL in there (`COPYRIGHT` *is*
+>>>> HTML-scrubbed, but "javascript:alert('pwned!')" is just text as far as an
+>>>> HTML sanitizer is concerned, so it passes straight through). The solution
+>>>> is to not use variables in situations where that variable would be
+>>>> inappropriate. Because `field` is so generic, the definition of what's
+>>>> appropriate is difficult. --[[smcv]]
+
+>>> An alternative solution would be to classify field registration as "secure"
+>>> and "insecure". Sources such as ymlfront would be insecure, sources such
+>>> as concon (or the $config hash) would be secure, since they can't be edited
+>>> as pages. Then, when doing pagetemplate substitution (but not ftemplate
+>>> substitution) the insecure sources could be HTML-escaped.
+>>> --[[KathrynAndersen]]
+
+>>>> Whether you trust the supplier of data seems orthogonal to whether its value
+>>>> is (meant to be) interpreted as plain text, HTML, a URL or what?
+>>>>
+>>>> Even in cases where you trust the supplier, you need to escape things
+>>>> suitably for the context, not for security but for correctness. The
+>>>> definition of the value, and the context it's being used in, changes the
+>>>> processing you need to do. An incomplete list:
+>>>>
+>>>> * HTML used as HTML needs to be html-scrubbed if and only if untrusted
+>>>> * URLs used as URLs need to be put through `safeurl()` if and only if
+>>>> untrusted
+>>>> * HTML used as plain text needs tags removed regardless
+>>>> * URLs used as plain text are safe
+>>>> * URLs or plain text used in HTML need HTML-escaping (and URLs also need
+>>>> `safeurl()` if untrusted)
+>>>> * HTML or plain text used in URLs need URL-escaping (and the resulting
+>>>> URL might need sanitizing too?)
+>>>>
+>>>> I can't immediately think of other data types we'd be interested in beyond
+>>>> text, HTML and URL, but I'm sure there are plenty.
+
+>>>>> But isn't this a problem with anything that uses pagetemplates? Or is
+>>>>> the point that, with plugins other than `field`, they all know,
+>>>>> beforehand, the names of all the fields that they are dealing with, and
+>>>>> thus the writer of the plugin knows which treatment each particular field
+>>>>> needs? For example, that `meta` knows that `title` needs to be
+>>>>> HTML-escaped, and that `baseurl` doesn't. In that case, yes, I see the
+>>>>> problem. It's a tricky one. It isn't as if there's only ever going to be a
+>>>>> fixed set of fields that need different treatment, either, because the site
+>>>>> admin is free to add whatever fields they like to the page template (if they
+>>>>> aren't using the default one, that is; I'm not using the default one myself).
+>>>>> Mind you, for trusted sources, since the person writing the page template and
+>>>>> the person providing the variable are the same, they would know whether the
+>>>>> value will be treated as HTML, plain text, or a URL, and thus could do the
+>>>>> needed escaping themselves when writing down the value.
+
+>>>>> Looking at the content of the default `page.tmpl`, let's see which variables fall into which categories:
+
+>>>>> * **Used as URL:** BASEURL, EDITURL, PARENTLINKS->URL, RECENTCHANGESURL, HISTORYURL, GETSOURCEURL, PREFSURL, OTHERLANGUAGES->URL, ADDCOMMENTURL, BACKLINKS->URL, MORE_BACKLINKS->URL
+>>>>> * **Used as part of a URL:** FAVICON, LOCAL_CSS
+>>>>> * **Needs to be HTML-escaped:** TITLE
+>>>>> * **Used as-is (as HTML):** FEEDLINKS, RELVCS, META, PERCENTTRANSLATED, SEARCHFORM, COMMENTSLINK, DISCUSSIONLINK, OTHERLANGUAGES->PERCENT, SIDEBAR, CONTENT, COMMENTS, TAGS->LINK, COPYRIGHT, LICENSE, MTIME, EXTRAFOOTER
+
+>>>>> It looks as if only TITLE needs HTML-escaping all the time, and the URLs all
+>>>>> end with "URL" in their names. Unfortunately FAVICON and LOCAL_CSS, which are
+>>>>> used as parts of URLs, don't have "URL" in their names; though that's fair
+>>>>> enough, since they aren't full URLs.
+
+>>>>> --K.A.
+
+>>>> One reasonable option would be to declare that `field` takes text-valued
+>>>> fields, in which case either consumers need to escape
+>>>> it with `<TMPL_VAR FIELD_FOO ESCAPE=HTML>` (and not interpret it as a URL
+>>>> without first checking `safeurl`), or the pagetemplate hook needs to
+>>>> pre-escape.
+
+>>>>> Since HTML::Template does have the ability to do ESCAPE=HTML/URL/JS, why not
+>>>>> take advantage of that? Some things, like TITLE, probably should have
+>>>>> ESCAPE=HTML all the time; that would solve the "to escape or not to escape"
+>>>>> problem that `meta` has with titles. After all, when one *sorts* by title,
+>>>>> one doesn't really want HTML-escaping in it; only when one uses it in a
+>>>>> template. -- K.A.
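+>>>>> For instance (an illustrative sketch; `FIELD_BOOKAUTHOR` is a
+>>>>> hypothetical field name and the search URL is made up), a template
+>>>>> author could then opt in per use:
+>>>>>
+>>>>>     <TMPL_VAR FIELD_BOOKAUTHOR ESCAPE=HTML>
+>>>>>     <a href="http://example.com/search?q=<TMPL_VAR FIELD_BOOKAUTHOR ESCAPE=URL>">search</a>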
+
+>>>> Another reasonable option would be to declare that `field` takes raw HTML,
+>>>> in which case consumers need to only use it in contexts that will be
+>>>> HTML-scrubbed (but it becomes unsuitable for using as text - problematic
+>>>> for text-based things like sorting or URLs, and not ideal for searching).
+>>>>
+>>>> You could even let each consumer choose how it's going to use the field,
+>>>> by having the `foo` field generate `TEXT_FOO` and `HTML_FOO` variables?
+>>>> --[[smcv]]
+
+>>>>> Something similar is already done in `template` and `ftemplate` with the
+>>>>> `raw_` prefix, which determines whether the variable should have `htmlize`
+>>>>> run over it before the value is applied to the template. Of course, that
+>>>>> isn't scrubbing or escaping, because with those templates, the scrubbing is
+>>>>> done afterwards as part of the normal processing.
+
+>>> Another problem, as you point out, is special-case fields, such as a number of
+>>> those defined by `meta`, which have side-effects associated with them, more
+>>> than just providing a value to pagetemplate. Perhaps `meta` should deal with
+>>> the side-effects, but use `field` as an interface to get the values of those special fields.
+
+>>> --[[KathrynAndersen]]
+
+-----
+
+I think the main point is: what is (or should be) the main point of the
+field plugin? If it's essentially a way to present a consistent
+interface to access page-related structured information, then it makes
+sense to have it very general. Plugins registering with fields would
+then present ways for recovering the structure information from the page
+(`ymlfront`, `meta`, etc), ways to manipulate it (like `meta` does),
+etc.
+
+In this sense, security should be entirely up to the plugins, although
+the field plugin could provide some auxiliary infrastructure (such as
+determining where the data comes from and raising or lowering the
+security level accordingly).
+
+Namespacing is important, and it should be considered at the field
+plugin interface level. A plugin should be able to register as
+responsible for the processing of all data belonging to a given
+namespace, but plugins should be able to set data in any namespace. So
+for example, `meta` would register as the processor for `meta` fields, and
+whatever method is used to set the data (`meta` directive, `ymlfront`, etc.),
+it gets a say on what to do with data in its namespace.
+
+What I'm thinking of is something you could call fieldsets. The nice
+thing about them is that, aside from the ones defined by plugins (like
+`meta`), it would be possible to define custom ones (with a generic,
+default processor) in an appropriate file (like smileys and shortcuts)
+with a syntax like:
+
+ [[!fieldset book namespace=book
+ fields="author title isbn"
+ fieldtype="text text text"]]
+
+after which you could use
+
+ [[!book author="A. U. Thor"
+ title="Fields of Iki"]]
+
+and the data would be available under the book namespace, and thus
+as BOOK_AUTHOR, BOOK_TITLE etc in templates.
+
+Security, in this sense, would be up to the plugin responsible for the
+namespace processing (the default handler would HTML-escape text fields,
+scrub HTML fields, safeurl()-ify URL fields, etc.).
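+The escaping rules such a default handler would apply can be sketched in a
+language-neutral way (Python here, purely illustrative; `is_safe_url` is a
+crude stand-in for ikiwiki's `safeurl()`):

```python
import html
import re
from urllib.parse import quote

def is_safe_url(url):
    # crude stand-in for safeurl(): reject script-ish URL schemes
    return re.match(r"\s*(javascript|data|vbscript):", url, re.I) is None

def html_as_text(value):
    # HTML used as plain text: tags removed regardless of trust
    return re.sub(r"<[^>]*>", "", value)

def text_in_html(value):
    # plain text (or a URL) used in HTML needs HTML-escaping
    return html.escape(value)

def text_in_url(value):
    # text used inside a URL needs URL-escaping
    return quote(value)
```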
+
+> So, are you saying that getting a field value is sort of a two-stage process? Get the value from anywhere, and then call the "security processor" for that namespace to "secure" the value? I think "namespaces" are really orthogonal to this issue. What the issue seems to be is:
+
+> * what form do we expect the raw field to be in? (text, URL, HTML)
+> * what form do we expect the "secured" output to be in? (raw HTML, scrubbed HTML, escaped HTML, URL)
+
+> Only if we know both these things will we know what sort of security processing needs to be done.
+
+>> Fieldsets are orthogonal to the security issue in the sense that you can use
+>> them without worrying about the field security issue, but they happen to be
+>> a rather clean way of answering those two questions, by allowing you to
+>> attach preprocessing attributes to a field in a way that the user
+>> (supposedly) cannot tamper with.
+
+> There is also a difference between field values that are used inside pagetemplate, and field values which are used as part of a page's content (e.g. with ftemplate). If you have a TITLE, you want it to be HTML-escaped if you're using it inside pagetemplate, but you don't want it to be HTML-escaped if you're using it inside a page's content. On the other hand, if you have, say, FEEDLINKS used inside pagetemplate, you don't wish it to be HTML-escaped at all, or your page content will be completely stuffed.
+
+>> Not to mention the many different ways date-like fields might need
+>> processing. It has already been proposed to solve this problem by exposing
+>> the field values under different names depending on the kind or amount of
+>> postprocessing they had (e.g. RAW_SOMEFIELD, SOMEFIELD, to which we could add
+>> HTML_SOMEFIELD, URL_SOMEFIELD or whatever). Again, fieldsets offer a simple way
+>> of letting Ikiwiki know what kind of postprocessing should be offered for
+>> that particular field.
+
+> So, somehow, we have to know the meaning of a field before we can use it properly, which kind of goes against the idea of having something generic.
+
+>> We could have a default field type (text, for example), and a way to set a
+>> different field type (which is what my fieldset proposal was about).
+
+> --[[KathrynAndersen]]
+
+-----
+
+I was just looking at HTML5 and wondered if the field plugin should generate the new Microdata tags (as well as the internal structures)? <http://slides.html5rocks.com/#slide19> -- [[Will]]
+
+> This could just as easily be done as a separate plugin. Feel free to do so. --[[KathrynAndersen]]
diff --git a/doc/plugins/contrib/flattr.mdwn b/doc/plugins/contrib/flattr.mdwn
new file mode 100644
index 000000000..e9b4bf857
--- /dev/null
+++ b/doc/plugins/contrib/flattr.mdwn
@@ -0,0 +1,48 @@
+[[!template id=plugin name=flattr author="[[jaywalk]]"]]
+
+[flattr.com](http://flattr.com/) is a flat-rate micropayment service, built around the idea of having flattr buttons everywhere, which visitors to your site can use to "flattr" you.
+
+This plugin makes it easier to put flattr buttons in ikiwiki. It supports both the static kind and the dynamic JavaScript version with a counter. The dynamic version does not work if [[htmlscrubber|/plugins/htmlscrubber]] is active on the page.
+
+The dynamic button does not require the thing to have been created on flattr before being added to a page; the static one does.
+
+I wrote some notes on [jonatan.walck.se](http://jonatan.walck.se/software/ikiwiki/plugin/flattr/) and put the source here: [flattr.pm](http://jonatan.walck.se/software/ikiwiki/flattr.pm)
+
+This plugin is licensed under [CC0](http://creativecommons.org/publicdomain/zero/1.0/) (public domain).
+
+Note that there is now a [[plugins/flattr]] plugin bundled with ikiwiki. It
+is less configurable, not supporting static buttons, but simpler to use.
+
+# Usage #
+
+ # [[!flattr args]] where args are in the form of arg=value.
+ # Possible args:
+ # type - static or dynamic. Defaults to static.
+
+ # vars in static mode:
+ # --------------------
+ # Required:
+ # url - URL to flattr page,
+ # e.g. http://flattr.com/thing/1994/jaywalks-weblog
+ # Optional:
+ # style - Set to compact for compact button.
+
+ # vars in dynamic mode:
+ # ---------------------
+ # Required:
+ # None.
+ # Optional:
+ # uid - Set the default in the plugin, override if needed.
+ # title - The title defaults to $wikiname/some/path (like on the top of
+ # the wiki).
+ # desc - A description of the content. Defaults to " ".
+ # cat - Category, this can be text, images, video, audio, software or
+ # rest. Defaults to text.
+    #   lang    - Language, list of available choices is on
+ # https://flattr.com/support/integrate/languages. Defaults to en_GB.
+ # tag - A list of comma separated tags. Empty per default.
+    #   url     - URL of the thing to be flattred,
+ # e.g. http://jonatan.walck.se/weblog
+    #   style   - Set to compact to get the small button; any other
+    #             value (including empty) gives the big button.
+
diff --git a/doc/plugins/contrib/flattr/discussion.mdwn b/doc/plugins/contrib/flattr/discussion.mdwn
new file mode 100644
index 000000000..586139e9c
--- /dev/null
+++ b/doc/plugins/contrib/flattr/discussion.mdwn
@@ -0,0 +1,9 @@
+FWIW, it is possible for a plugin like this to add javascript to pages that
+are protected by htmlscrubber. Just return a token in your preprocess hook,
+and then have a format hook that replaces the token with the javascript.
+--[[Joey]]
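+
+A minimal sketch of that pattern (illustrative only; `preprocess` and
+`format` are real ikiwiki hook types, but the token and the hypothetical
+`flattr_js()` helper are made up here):
+
+    # preprocess: emit an inert placeholder instead of raw javascript,
+    # so htmlscrubber has nothing to strip
+    sub preprocess (@) {
+        my %params=@_;
+        return '<div class="flattr">FLATTR-JS-TOKEN</div>';
+    }
+
+    # format: runs after htmlscrubber, so the token can safely be
+    # replaced with the real <script> snippet
+    sub format (@) {
+        my %params=@_;
+        $params{content} =~ s/FLATTR-JS-TOKEN/flattr_js()/eg;
+        return $params{content};
+    }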
+
+> Thanks, that's good to know. I'll try to continue the development of this
+> plugin later, for now I just needed it to work. :) It will most likely
+> evolve as my page does too.
+> --[[jaywalk]]
diff --git a/doc/plugins/contrib/ftemplate.mdwn b/doc/plugins/contrib/ftemplate.mdwn
new file mode 100644
index 000000000..d82867f94
--- /dev/null
+++ b/doc/plugins/contrib/ftemplate.mdwn
@@ -0,0 +1,25 @@
+[[!template id=plugin name=ftemplate author="[[rubykat]]"]]
+[[!tag type/meta type/format]]
+
+This plugin provides the [[ikiwiki/directive/ftemplate]] directive.
+
+This is like the [[ikiwiki/directive/template]] directive, with the addition
+that one does not have to provide all the values in the call to the template,
+because ftemplate can query structured data ("fields") using the [[field]]
+plugin.
+
+## Activate the plugin
+
+ add_plugins => [qw{goodstuff ftemplate ....}],
+
+## PREREQUISITES
+
+ IkiWiki
+ IkiWiki::Plugin::field
+ HTML::Template
+ Encode
+
+## DOWNLOAD
+
+* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/ftemplate.pm>
+* git repo at git://github.com/rubykat/ikiplugins.git
diff --git a/doc/plugins/contrib/ftemplate/discussion.mdwn b/doc/plugins/contrib/ftemplate/discussion.mdwn
new file mode 100644
index 000000000..1e0bca5d8
--- /dev/null
+++ b/doc/plugins/contrib/ftemplate/discussion.mdwn
@@ -0,0 +1,33 @@
+I initially thought this wasn't actually necessary - the combination
+of [[plugins/template]] with [[plugins/contrib/field]]'s `pagetemplate`
+hook ought to provide the same functionality. However, `template`
+doesn't run `pagetemplate` hooks; a more general version of this
+plugin would be to have a variant of `template` that runs `pagetemplate`
+hooks (probably easiest to just patch `template` to implement a
+second directive, or have a special parameter `run_hooks="yes"`,
+or something).
+
+> I got the impression that `pagetemplate` hooks are intended to be completely independent of `template` variables; `pagetemplate` is for the actual `page.tmpl` template, while `template` is for other templates used inside the page content. So I don't understand why one would need a run_hooks option. --[[KathrynAndersen]]
+
+>> `Render`, `inline`, `comments` and `recentchanges` run `pagetemplate`
+>> hooks, as does anything that uses `IkiWiki::misctemplate`. From that
+>> quick survey, it seems as though `template` is the only thing that
+>> uses `HTML::Template` but *doesn't* run `pagetemplate` hooks?
+>>
+>> It just seems strange to me that `field` needs to have its own
+>> variant of `template` (this), its own variant of `inline` (`report`),
+>> and so on - I'd tend to lean more towards having `field`
+>> enhance the existing plugins. I'm not an ikiwiki committer,
+>> mind... Joey, your opinion would be appreciated! --[[smcv]]
+
+>>> I did it that way basically because I needed the functionality ASAP, and I didn't want to step on anyone's toes, so I made them as separate plugins. If Joey wants to integrate the functionality into IkiWiki proper, I would be very happy, but I don't want to put pressure on him. --[[KathrynAndersen]]
+
+Another missing thing is that `ftemplate` looks in
+the "system" templates directories, not just in the wiki, but that
+seems orthogonal (and might be a good enhancement to `template` anyway).
+--[[smcv]]
+
+> Yes, I added that because I wanted the option of not having to make all my templates work as wiki pages also. --[[KathrynAndersen]]
+
+>> Joey has added support for
+>> [[todo/user-defined_templates_outside_the_wiki]] now. --s
diff --git a/doc/plugins/contrib/ftemplate/ikiwiki/directive/ftemplate.mdwn b/doc/plugins/contrib/ftemplate/ikiwiki/directive/ftemplate.mdwn
new file mode 100644
index 000000000..3009fc830
--- /dev/null
+++ b/doc/plugins/contrib/ftemplate/ikiwiki/directive/ftemplate.mdwn
@@ -0,0 +1,106 @@
+The `ftemplate` directive is supplied by the [[!iki plugins/contrib/ftemplate desc=ftemplate]] plugin.
+
+This is like the [[ikiwiki/directive/template]] directive, with the addition
+that one does not have to provide all the values in the call to the template,
+because ftemplate can query structured data ("fields") using the
+[[plugins/contrib/field]] plugin.
+
+Templates are files that can be filled out and inserted into pages in
+the wiki, by using the ftemplate directive. The directive has an id
+parameter that identifies the template to use.
+
+Additional parameters can be used to fill out the template, in
+addition to the "field" values. Passed-in values override the
+"field" values.
+
+There are two places where template files can live. One is in the /templates
+directory on the wiki. These templates are wiki pages, and can be edited from
+the web like other wiki pages.
+
+The second place where template files can live is in the global
+templates directory (the same place where the page.tmpl template lives).
+This is a useful place to put template files if you want to prevent
+them being edited from the web, and you don't want to have to make
+them work as wiki pages.
+
+### EXAMPLES
+
+#### Example 1
+
+PageA:
+
+ \[[!meta title="I Am Page A"]]
+ \[[!meta description="A is for Apple."]]
+ \[[!meta author="Fred Nurk"]]
+ \[[!ftemplate id="mytemplate"]]
+
+Template "mytemplate":
+
+ # <TMPL_VAR NAME="TITLE">
+ by <TMPL_VAR NAME="AUTHOR">
+
+ **Summary:** <TMPL_VAR NAME="DESCRIPTION">
+
+This will give:
+
+ <h1>I Am Page A</h1>
+ <p>by Fred Nurk</p>
+    <p><strong>Summary:</strong> A is for Apple.</p>
+
+#### Example 2: Overriding values
+
+PageB:
+
+ \[[!meta title="I Am Page B"]]
+ \[[!meta description="B is for Banana."]]
+ \[[!meta author="Fred Nurk"]]
+ \[[!ftemplate id="mytemplate" title="Bananananananas"]]
+
+This will give:
+
+ <h1>Bananananananas</h1>
+ <p>by Fred Nurk</p>
+    <p><strong>Summary:</strong> B is for Banana.</p>
+
+#### Example 3: Loops
+
+(this example uses the [[plugins/contrib/ymlfront]] plugin)
+
+Page C:
+
+ ---
+ BookAuthor: Georgette Heyer
+ BookTitle: Black Sheep
+ BookGenre:
+ - Historical
+ - Romance
+ ---
+    \[[!ftemplate id="footemplate"]]
+
+ I like this book.
+
+Template "footemplate":
+
+ # <TMPL_VAR BOOKTITLE>
+ by <TMPL_VAR BOOKAUTHOR>
+
+ <TMPL_IF BOOKGENRE>(
+ <TMPL_LOOP GENRE_LOOP><TMPL_VAR BOOKGENRE>
+ <TMPL_UNLESS __last__>, </TMPL_UNLESS>
+ </TMPL_LOOP>
+ )</TMPL_IF>
+
+This will give:
+
+ <h1>Black Sheep</h1>
+ <p>by Georgette Heyer</p>
+
+ <p>(Historical, Romance)</p>
+
+ <p>I like this book.</p>
+
+### LIMITATIONS
+
+One cannot query the values of fields on pages other than the current
+page. If you want to do that, check out the [[plugins/contrib/report]]
+plugin.
diff --git a/doc/plugins/contrib/gallery.mdwn b/doc/plugins/contrib/gallery.mdwn
new file mode 100644
index 000000000..72df13bd0
--- /dev/null
+++ b/doc/plugins/contrib/gallery.mdwn
@@ -0,0 +1,39 @@
+[[!template id=plugin name=gallery author="[[arpitjain]]"]]
+
+This plugin creates a nice-looking gallery of images. It is built on top of the img plugin in IkiWiki.
+
+GIT repo of the plugin is located at <http://github.com/joeyh/ikiwiki/tree/gallery>
+
+
+USAGE:
+
+    \[[!gallery imagedir="images" option="value"]]
+
+Available options:
+
+* imagedir (required): directory containing the images. All files with a jpg, png or gif extension in the directory will be scanned and put in the gallery.
+* thumbnailsize (optional, default 200x200): size of the thumbnails generated for the gallery.
+* resize (optional, default 800x600): width and height to resize images to. Use resize="0" to turn resizing off.
+* alt (optional): text to display if the image cannot be displayed.
+* cols (optional, default 3): number of columns of thumbnails to generate.
+* rows (optional, default 3): number of rows on a gallery page.
+* title (optional): title of the gallery.
+* sort (optional): "asc" or "desc", to sort by image name in ascending or descending order.
+* vcs (optional, default 1): whether the images are inside IkiWiki's tree. With vcs=0 you can specify a directory outside the IkiWiki tree, including an absolute path to the image directory.
+* exif (optional, default 0): whether to display EXIF information.
+
+Additionally, you can put a comment file filename.comm in the image directory, where filename is the name of the image. Comments will then be displayed in the gallery.
+
+Features of the Gallery Plugin:
+
+* You can go to the next image by clicking on the right side of the image or by pressing 'n'.
+* Similarly, you can go to the previous image by clicking on the left side of the image or by pressing 'p'.
+* Press Esc to close the gallery.
+* While viewing an image, nearby images are preloaded in the background to make browsing fast.
+
+It uses a template named [Lightbox](http://www.hudddletogether.com).
+For any feedback or queries, feel free to mail me at arpitjain11 [AT] gmail.com.
+
+Additional details are available [here](http://myweb.unomaha.edu/~ajain/ikiwikigallery.html).
+> That link is broken. --[[JosephTurian]]
+
+-- [[arpitjain]]
+
+[[!tag plugins]] [[!tag patch]] [[!tag soc]] [[!tag wishlist]]
diff --git a/doc/plugins/contrib/gallery/discussion.mdwn b/doc/plugins/contrib/gallery/discussion.mdwn
new file mode 100644
index 000000000..08fc2456f
--- /dev/null
+++ b/doc/plugins/contrib/gallery/discussion.mdwn
@@ -0,0 +1,45 @@
+# Adaptation to Newer Versions of ikiwiki
+
+I tried using this plugin, but with no success so far. I used the *gallery* git branch,
+rebased it onto the *origin/master*, built a Debian package and installed that one.
+
+However, I can't even get simply things like this to work:
+
+ $ cat web/index.mdwn
+ [[!gallery imagedir="b" vcs="0"]]
+ $ ls web/b/
+ 1.jpg 2.jpg 3.jpg 4.jpg
+ $ ikiwiki [...] --plugin gallery web web.rendered
+ [...]
+ $ grep gallery web.rendered/index.html
+ <p>[[!gallery Failed to Read Directory b.]]</p>
+
+When using `vcs="1"` it's no better:
+
+ $ ikiwiki [...] --plugin gallery web web.rendered
+ scanning index.mdwn
+ internal error: b/800x600-1.jpg cannot be found in web or underlay
+
+--[[tschwinge]]
+
+It's probably because of the restriction of permissions by plugins in newer versions of IkiWiki.
+For the time being, you can turn resizing off until I look into the conditional underlay directory feature.
+
+USAGE: \[[!gallery imagedir="directory" resize="0"]]
+
+New version updated at SVN REPO : http://ned.snow-crash.org:8080/svn/ikiwiki-gallery/
+
+--[[arpitjain]]
+
+
+# Bug With Referring to *js* and *css* Files
+
+In the generated files `gallery/index/index.html` and `gallery/index/*/gallery_*`,
+the references should be not to `/js` but to `../../js` and `../../../js`,
+respectively. Likewise for `/css`.
+
+
+# All Thumbnails on One Page
+
+Could `rows="0"` be used to state that all thumbnails should be put on the same page,
+no matter how many there are?
diff --git a/doc/plugins/contrib/getfield.mdwn b/doc/plugins/contrib/getfield.mdwn
new file mode 100644
index 000000000..61e80c58a
--- /dev/null
+++ b/doc/plugins/contrib/getfield.mdwn
@@ -0,0 +1,137 @@
+[[!template id=plugin name=getfield author="[[rubykat]]"]]
+[[!tag type/meta type/format]]
+[[!toc]]
+## NAME
+
+IkiWiki::Plugin::getfield - query the values of fields
+
+## SYNOPSIS
+
+ # activate the plugin
+ add_plugins => [qw{goodstuff getfield ....}],
+
+## DESCRIPTION
+
+This plugin provides a way of querying the meta-data (data fields) of a page
+inside the page content (rather than inside a template). This provides a way to
+use per-page structured data, where each page is treated like a record, and the
+structured data are fields in that record. This can include the meta-data for
+that page, such as the page title.
+
+This plugin is meant to be used in conjunction with the [[field]] plugin.
+
+### USAGE
+
+One can get the value of a field by using special markup in the page.
+This does not use directive markup, in order to make it easier to
+use the markup inside other directives. There are four forms:
+
+* \{{$*fieldname*}}
+
+ This queries the value of *fieldname* for the source page.
+
+ For example:
+
+ \[[!meta title="My Long and Complicated Title With Potential For Spelling Mistakes"]]
+ # {{$title}}
+
+ When the page is processed, this will give you:
+
+ <h1>My Long and Complicated Title With Potential For Spelling Mistakes</h1>
+
+* \{{$*pagename*#*fieldname*}}
+
+ This queries the value of *fieldname* for the page *pagename*.
+
+ For example:
+
+ On PageFoo:
+
+ \[[!meta title="I Am Page Foo"]]
+
+ Stuff about Foo.
+
+ On PageBar:
+
+ For more info, see \[[\{{$PageFoo#title}}|PageFoo]].
+
+ When PageBar is displayed:
+
+    <p>For more info, see <a href="PageFoo">I Am Page Foo</a>.</p>
+
+* \{{+$*fieldname*+}}
+
+ This queries the value of *fieldname* for the destination page; that is,
+ the value when this page is included inside another page.
+
+ For example:
+
+ On PageA:
+
+ \[[!meta title="I Am Page A"]]
+ # {{+$title+}}
+
+ Stuff about A.
+
+ On PageB:
+
+ \[[!meta title="I Am Page B"]]
+ \[[!inline pagespec="PageA"]]
+
+ When PageA is displayed:
+
+ <h1>I Am Page A</h1>
+ <p>Stuff about A.</p>
+
+ When PageB is displayed:
+
+ <h1>I Am Page B</h1>
+ <p>Stuff about A.</p>
+
+* \{{+$*pagename*#*fieldname*+}}
+
+ This queries the value of *fieldname* for the page *pagename*; the
+ only difference between this and \{{$*pagename*#*fieldname*}} is
+ that the full name of *pagename* is calculated relative to the
+ destination page rather than the source page.
+
+ I can't really think of a reason why this should be needed, but
+ this format has been added for completeness.
+
+### Escaping
+
+Getfield markup can be escaped by putting a backslash `\`
+in front of the markup.
+If that is done, then the markup is displayed as-is.
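+
+For example (illustrative), writing
+
+    \{{$title}}
+
+in a page outputs the literal text `{{$title}}` rather than the page's title.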
+
+### No Value Found
+
+If no value is found for the given field, then the field name is returned.
+
+For example:
+
+On PageFoo:
+
+ \[[!meta title="Foo"]]
+ My title is {{$title}}.
+
+ My description is {{$description}}.
+
+When PageFoo is displayed:
+
+ <p>My title is Foo.</p>
+
+ <p>My description is description.</p>
+
+This is because "description" hasn't been defined for that page.
+
+### More Examples
+
+Listing all the sub-pages of the current page:
+
+ \[[!map pages="{{$page}}/*"]]
+
+## DOWNLOAD
+
+* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/getfield.pm>
+* git repo at git://github.com/rubykat/ikiplugins.git
diff --git a/doc/plugins/contrib/getfield/discussion.mdwn b/doc/plugins/contrib/getfield/discussion.mdwn
new file mode 100644
index 000000000..13ea8b1b3
--- /dev/null
+++ b/doc/plugins/contrib/getfield/discussion.mdwn
@@ -0,0 +1,79 @@
+## Multiple values arrays
+
+This breaks if there are multiple values for a single key. It works fine in the report plugin, but inline display shows the ARRAY reference, e.g.
+
+ IPv6:
+ - fd64:2c08:9fa7:4::1
+ - 2001:470:1d:4a6::1
+
+and:
+
+ {{$IPv6}}
+
+yields:
+
+ ARRAY(0x266db10)
+
+Seems to me this could be checked and `join(" ")`'d. :) -- [[anarcat]]
+
+> I wrote a stupid fix for this, which works for getfield, but isn't as good for report. It simply does that `join()`. Here's the patch:
+>
+> [[!format diff """
+--- a/IkiWiki/Plugin/field.pm
++++ b/IkiWiki/Plugin/field.pm
+@@ -322,6 +322,9 @@ sub field_get_value ($$;@) {
+ {
+ $basevalue = calculated_values($lc_field_name, $page);
+ }
++ if (ref($basevalue) eq "ARRAY") {
++ $basevalue = join(" ", @{$basevalue}); # hack
++ }
+ if (defined $basevalue)
+ {
+ $Cache{$page}{$basename} = $basevalue;
+@@ -360,6 +363,9 @@ sub field_get_value ($$;@) {
+ {
+ $value = $basevalue;
+ }
++ if (ref($value) eq "ARRAY") {
++ $value = join(" ", @{$value}); # hack
++ }
+ if (defined $value)
+ {
+ $Cache{$page}{$lc_field_name} = $value;
+"""]]
+>
+> Seems to me this should be the default, at the very least in getfield. But at least, with the above patch we don't see expanded Perl ref's. ;) --[[anarcat]]
+
+## Templating, and other uses
+
+Like you mentioned in [[ftemplate]] IIRC, it'll only work on the same page. If it can be made to work anywhere, or from a specific place in the wiki - configurable, possibly - you'll have something very similar to mediawiki's templates. I can already think of a few uses for this combined with [[template]] ;) . --[[SR|users/simonraven]]
+
+> Yes, I mentioned "only current page" in the "LIMITATIONS" section.
+
+> What do you think would be a good syntax for querying other pages?
+> It needs to resolve to a single page, though I guess using "bestlink" to find the closest page would mean that one didn't have to spell out the whole page.
+
+>> I don't know the internals very well, I think that's how other plugins do it. *goes to check* Usually it's a `foreach` loop, and use a `pagestate{foo}` to check the page's status/state. There's also some stuff like 'pagespec_match_list($params{page}` ... they do slightly different thing depending on need. --[[SR|users/simonraven]]
+
+>>> No, I meant what markup I should use; the actual implementation probably wouldn't be too difficult.
+
+>>> The current markup is {{$*fieldname*}}; what you're wanting, perhaps it should be represented like {{$*pagename*:*fieldname*}}, or {{$*pagename*::*fieldname*}} or something else...
+>>> -- [[KathrynAndersen]]
+
+>>>> Oh. Hmm. I like your idea actually, or alternately, in keeping more with other plugins, doing it like {{pagename/fieldname}}. The meaning of the separator is less clear with /, but avoids potential issues with filename clashes that have a colon in them. It also keeps a certain logic - at least to me. Either way, I think both are good choices. [[SR|users/simonraven]]
+
+>>>>> What about using {{pagename#fieldname}}? The meaning of the hash in URLs sort of fits with what is needed here (reference to a 'named' thing within the page) and it won't conflict with actual hash usages (unless we expect different named parts of pages to define different values for the same field ...)
+>>>>> -- [[Oblomov]]
+>>>>>> That's a good one too. --[[simonraven]]
+>>>>>>> Done! I used {{$*pagename*#*fieldname*}} for the format. -- [[users/KathrynAndersen]]
+
+
+> I'm also working on a "report" plugin, which will basically apply a template like [[ftemplate]] does, but to a list of pages given from a pagespec, rather than the current page.
+
+> -- [[users/KathrynAndersen]]
+
+>> Ooh, sounds nice :) . --[[SR|users/simonraven]]
+
+>>> I've now released the [[plugins/contrib/report]] plugin. I've been using it on my site; the holdup on releasing was because I hadn't yet written the docs for it. I hope you find it useful.
+>>> -- [[users/KathrynAndersen]]
diff --git a/doc/plugins/contrib/googlemaps.mdwn b/doc/plugins/contrib/googlemaps.mdwn
new file mode 100644
index 000000000..c43490b13
--- /dev/null
+++ b/doc/plugins/contrib/googlemaps.mdwn
@@ -0,0 +1,21 @@
+[[!template id=plugin name=googlemaps author="Christian Mock"]]
+[[!tag type/special-purpose todo/geotagging]]
+
+`googlemaps` is a plugin that allows using the [Google Maps API][2]
+from ikiwiki.
+
+[2]: http://www.google.com/apis/maps/
+
+The plugin supports importing GPS tracks from [gpx][1] files, reducing
+the number of points in those tracks via [gpsbabel][4], and drawing
+them as an overlay on the map. The center of the map and zoom level
+can be calculated automatically.
+
+[1]: http://www.topografix.com/gpx.asp
+[4]: http://www.gpsbabel.org/
+
+It can be [found here][3].
+
+[3]: http://www.tahina.priv.at/hacks/googlemaps.html
+
+See also [[plugins/osm]].
diff --git a/doc/plugins/contrib/googlemaps/discussion.mdwn b/doc/plugins/contrib/googlemaps/discussion.mdwn
new file mode 100644
index 000000000..b148c0eee
--- /dev/null
+++ b/doc/plugins/contrib/googlemaps/discussion.mdwn
@@ -0,0 +1,16 @@
+Very neat. The warnings about it not being very secure are of course spot on.
+Seems that the need to embed the api key on the page is a security problem of one sort.
+
+The call to gpsbabel _seems_ secure thanks to the sanitisation done to the filename passed to gpsbabel.
+
+I probably won't ever add this to ikiwiki, but it's a great third-party plugin!
+
+--[[Joey]]
+
+-----
+
+Can this plugin be extended to allow the user to switch the displayed map to
+one of many available maps, similar to the functionality provided by
+[OSMify](http://blog.johnmckerrell.com/2007/12/31/new-version-of-osmify-bookmarklet/),
+which allows replacing maps with one of the [OpenStreetMap](http://openstreetmap.org/) renderings.
+-- [[users/vibrog]]
diff --git a/doc/plugins/contrib/groupfile.mdwn b/doc/plugins/contrib/groupfile.mdwn
new file mode 100644
index 000000000..e5c0ded42
--- /dev/null
+++ b/doc/plugins/contrib/groupfile.mdwn
@@ -0,0 +1,105 @@
+[[!template id=plugin name=groupfile core=0 author="[[Jogo]]"]]
+
+This plugin adds a `group(groupname)` function to [[ikiwiki/PageSpec]], which is true
+only if the current user is a member of the group named `groupname`.
+
+Group membership is read from a file. The syntax of this file is very close to
+that of the usual Unix `/etc/passwd` file: the group's name, followed by a colon, followed by
+a comma-separated list of user names. For example:
+
+ dev:toto,foo
+ i18n:zorba
+
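+
+With the file above, a setup fragment along these lines (a hypothetical
+sketch, not taken from the plugin's docs) could restrict editing to members
+of the `dev` group:
+
+    locked_pages => '* and !group(dev)',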
+-----
+
+ #!/usr/bin/perl
+ # GroupFile plugin.
+ # by Joseph Boudou <jogo at matabio dot net>
+
+ package IkiWiki::Plugin::groupfile;
+
+ use warnings;
+ use strict;
+ use IkiWiki 3.00;
+
+ sub import {
+ hook(type => 'getsetup', id => 'groups', call => \&get_setup);
+ }
+
+ sub get_setup () {
+ return (
+ plugin => {
+ safe => 0,
+ rebuild => 0,
+ },
+ group_file => {
+ type => 'string',
+ example => '/etc/ikiwiki/group',
+ description => 'group file location',
+ safe => 0,
+ rebuild => 0,
+ },
+ );
+ }
+
+ my $users_of = 0;
+
+ sub get_groups () {
+ if (not $users_of) {
+
+ if (not defined $config{group_file}) {
+ return 'group_file option not set';
+ }
+
+ open my $file, '<', $config{group_file}
+ or return 'Unable to open group_file';
+
+ $users_of = {};
+ READ:
+ while (<$file>) {
+ next READ if (/^\s*$/);
+
+ if (/^(\w+):([\w,]+)/) {
+ %{ $users_of->{$1} } = map { $_ => 1 } split /,/, $2;
+ }
+ else {
+ $users_of = "Error at group_file:$.";
+ last READ;
+ }
+ }
+
+ close $file;
+ }
+
+ return $users_of;
+ }
+
+ package IkiWiki::PageSpec;
+
+ sub match_group ($$;@) {
+ shift;
+ my $group = shift;
+ my %params = @_;
+
+ if (not exists $params{user}) {
+ return IkiWiki::ErrorReason->new('no user specified');
+ }
+ if (not defined $params{user}) {
+ return IkiWiki::FailReason->new('not logged in');
+ }
+
+ my $users_of = IkiWiki::Plugin::groupfile::get_groups();
+ if (not ref $users_of) {
+ return IkiWiki::ErrorReason->new($users_of);
+ }
+
+ if (exists $users_of->{$group}{ $params{user} }) {
+ return IkiWiki::SuccessReason->new("user is member of $group");
+ }
+ else {
+ return IkiWiki::FailReason->new(
+ "user $params{user} isn't member of $group");
+ }
+ }
+
+ 1
diff --git a/doc/plugins/contrib/highlightcode.mdwn b/doc/plugins/contrib/highlightcode.mdwn
new file mode 100644
index 000000000..f1df204bb
--- /dev/null
+++ b/doc/plugins/contrib/highlightcode.mdwn
@@ -0,0 +1,10 @@
+[[!template id=plugin name=highlightcode author="[[sabr]]"]]
+[[!tag type/format]]
+
+(An alternative to this plugin, [[plugins/highlight]], is now provided with IkiWiki. --[[smcv]])
+
+A small plugin to allow Ikiwiki to display source files complete with syntax highlighting. Files with recognized extensions (e.g. my-file.cpp) are rendered just like any other Ikiwiki page. You can even edit your source files with Ikiwiki's editor.
+
+It uses the Syntax::Highlight::Engine::Kate Perl module to do the highlighting.
+
+It can be found at: <http://iki.u32.net/Highlight_Code_Plugin> --[[sabr]]
diff --git a/doc/plugins/contrib/ikiwiki/directive/album.mdwn b/doc/plugins/contrib/ikiwiki/directive/album.mdwn
new file mode 100644
index 000000000..433eec8bd
--- /dev/null
+++ b/doc/plugins/contrib/ikiwiki/directive/album.mdwn
@@ -0,0 +1,56 @@
+The `album` directive is supplied by the [[!iki plugins/contrib/album desc=album]] plugin.
+
+Each page containing an `album` directive is treated as a photo album
+or image gallery. Example usage is as simple as:
+
+ \[[!album]]
+
+Every image attached to an album or its [[subpages|ikiwiki/subpage]] is
+considered to be part of the album. A "viewer" page, with the wiki's default
+page extension, will be generated in the
+[[transient underlay|plugins/transient]] to display the
+image, if there isn't already a page of the same name as the image: for
+instance, if `debconf` is an album and `debconf/tuesday/p100.jpg` exists,
+then `debconf/tuesday/p100.mdwn` might be created.
+
+The album is treated as a [[!iki plugins/contrib/trail desc=trail]], which
+gives each viewer page a link back to the album, and a link to the previous
+and next viewer in the album.
+
+The `album` directive also produces an [[ikiwiki/directive/inline]] which
+automatically includes all the viewers for this album, except those that
+will appear in an [[albumsection]]. If every image in the album is in a
+section, then the `album` directive is still required, but won't produce
+any output in the page.
+
+The `inline` can include some extra information about the images, including
+file size and a thumbnail made using [[ikiwiki/directive/img]]. The
+default template is `albumitem.tmpl`, which takes advantage of these things.
+
+## Options
+
+The directive can have some options for the entire album. The defaults are:
+
+ \[[!album
+ sort="-age"
+ size="full"
+ thumbnailsize="96x96"
+ viewertemplate="albumviewer"
+ prevtemplate="albumprev"
+ nexttemplate="albumnext"
+    ]]
+
+* `sort` - sets the order in which images appear, defaulting to earliest
+ creation date first
+* `size` - if not `full`, the [[ikiwiki/directive/img]] in the viewer page
+ will be resized to be no larger than this
+* `thumbnailsize` - the [[ikiwiki/directive/img]] in the album page,
+ which can also be used in the previous/next links, will be no larger than
+ this
+* `viewertemplate` - the template used for the [[albumimage]] in each
+ viewer page
+* `prevtemplate` - the template used to replace `<TMPL_VAR PREV>` if used in
+ the viewer page
+* `nexttemplate` - the template used to replace `<TMPL_VAR NEXT>` if used in
+ the viewer page
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/plugins/contrib/ikiwiki/directive/albumimage.mdwn b/doc/plugins/contrib/ikiwiki/directive/albumimage.mdwn
new file mode 100644
index 000000000..2385bb535
--- /dev/null
+++ b/doc/plugins/contrib/ikiwiki/directive/albumimage.mdwn
@@ -0,0 +1,26 @@
+The `albumimage` directive is supplied by the [[!iki plugins/contrib/album desc=album]] plugin.
+
+Each viewer page produced by the [[album]] directive
+contains an `albumimage` directive, which is replaced by an
+[[ikiwiki/directive/img]], wrapped in some formatting using a
+template (by default it's `albumviewer.tmpl`). That template can also include
+links to the next and previous photos, in addition to those provided by the
+[[!iki plugins/contrib/trail desc=trail]] plugin.
+
+The next/previous links are themselves implemented by evaluating a template,
+either `albumnext.tmpl` or `albumprev.tmpl` by default.
+
+The directive can also have parameters:
+
+* `title`, `date`, `updated`, `author`, `authorurl`, `copyright`, `license`
+ and `description` are short-cuts for the corresponding
+ [[ikiwiki/directive/meta]] directives
+
+* `caption` sets a caption which is displayed near this image in the album
+ and viewer pages
+
+The viewer page can also contain any text and markup before or after the
+`albumimage` directive, which will appear before or after the image in the
+viewer page.
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/plugins/contrib/ikiwiki/directive/albumsection.mdwn b/doc/plugins/contrib/ikiwiki/directive/albumsection.mdwn
new file mode 100644
index 000000000..7e5749e05
--- /dev/null
+++ b/doc/plugins/contrib/ikiwiki/directive/albumsection.mdwn
@@ -0,0 +1,29 @@
+The `albumsection` directive is supplied by the [[!iki plugins/contrib/album desc=album]] plugin.
+
+The `albumsection` directive is used to split an album into sections. It can
+only appear on a page that also has the [[album]] directive.
+
+The `filter` parameter is a [[ikiwiki/PageSpec]] against which viewer pages
+are matched. The `albumsection` directive displays all the images that match
+the filter, and the `album` directive displays any leftover images, like
+this:
+
+ # Holiday photos
+
+ \[[!album]]
+ <!-- replaced with a list of any uncategorized photos; it will be
+ empty if they're all tagged as 'people' and/or 'landscapes' -->
+
+ ## People
+
+ \[[!albumsection filter="tagged(people)"]]
+ <!-- replaced with a list of photos tagged 'people', including
+ any that are also tagged 'landscapes' -->
+
+ ## Landscapes
+
+ \[[!albumsection filter="tagged(landscapes)"]]
+ <!-- replaced with a list of photos tagged 'landscapes', including
+ any that are also tagged 'people' -->
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/plugins/contrib/ikiwiki/directive/jssearchfield.mdwn b/doc/plugins/contrib/ikiwiki/directive/jssearchfield.mdwn
new file mode 100644
index 000000000..5d338901d
--- /dev/null
+++ b/doc/plugins/contrib/ikiwiki/directive/jssearchfield.mdwn
@@ -0,0 +1,42 @@
+The `jssearchfield` directive is supplied by the [[!iki plugins/contrib/jssearchfield desc=jssearchfield]] plugin.
+
+This enables one to search the structured data ("field" values) of
+multiple pages. A search form is constructed, and the searching is
+done with Javascript, which means that the entire thing is self-contained.
+This depends on the [[!iki plugins/contrib/field]] plugin.
+
+The pages to search are selected by a PageSpec given by the "pages"
+parameter.
+The fields to search are given by the "fields" parameter. By default,
+the field name is given, and the user can type the search parameter for
+that field into a text input field.
+
+## OPTIONS
+
+**pages**: A PageSpec to determine the pages to search through.
+
+**fields**: The fields to put into the search form, and to display
+in the results.
+
+**tagfields**: Display the given fields as a list of tags that can
+be selected from, rather than having a text input field. Every distinct
+value of that field will be listed, so it is best used for things with
+short values, like "Author" rather than long ones like "Description".
+Note that "tagfields" must be a subset of "fields".
+
+**sort**: A SortSpec to determine how the matching pages should be sorted; this is the "default" sort order that the results will be displayed in.
+The search form also gives the option of "random" sort, which will
+display the search results in random order.
+
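+For example, a directive along these lines (the page and field names here are
+made up for illustration) would build a search form over the pages under `fiction/`:
+
+    \[[!jssearchfield pages="fiction/*" fields="title author tags" tagfields="tags" sort="title"]]
+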
+## SEARCHING
+
+The search form that is created by this directive contains the following:
+
+* for each search field, a label, plus either a text input field, or a list of checkboxes with values next to them if the field is also a tagfield. Note that the lists of checkboxes are initially hidden; one must click on the triangle next to the label to display them.
+* a "sort" toggle. One can select either "default" or "random".
+* a "Search!" button, to trigger the search if needed (see below).
+* a "Reset" button, which will clear all the values.
+
+The searching is dynamic. As soon as a value is changed, either by tabbing out of the text field, or by selecting or de-selecting a checkbox, the search
+results are updated. Furthermore, for tagfields, the tagfield lists
+themselves are updated to reflect the current search results.
diff --git a/doc/plugins/contrib/ikiwiki/directive/ymlfront.mdwn b/doc/plugins/contrib/ikiwiki/directive/ymlfront.mdwn
new file mode 100644
index 000000000..1a01834f8
--- /dev/null
+++ b/doc/plugins/contrib/ikiwiki/directive/ymlfront.mdwn
@@ -0,0 +1,17 @@
+The `ymlfront` directive is supplied by the [[!iki plugins/contrib/ymlfront desc=ymlfront]] plugin.
+
+This directive allows the user to define arbitrary meta-data in YAML format.
+
+ \[[!ymlfront data="""
+ foo: fooness
+ bar: The Royal Pigeon
+ baz: 2
+ """]]
+
+There is one argument to this directive.
+
+* **data:**
+ The YAML-format data. This should be enclosed inside triple-quotes to preserve the data correctly.
+
+If more than one ymlfront directive is given per page, the result is undefined.
+Likewise, it is inadvisable to try to mix the non-directive ymlfront format with the directive form of the data.
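+
+For example, with the data above and the [[!iki plugins/contrib/getfield desc=getfield]] plugin also enabled, writing `{{$bar}}` elsewhere on the page should be replaced with "The Royal Pigeon" when the page is built.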
diff --git a/doc/plugins/contrib/ikiwiki/directive/ymlfront/discussion.mdwn b/doc/plugins/contrib/ikiwiki/directive/ymlfront/discussion.mdwn
new file mode 100644
index 000000000..f49c85079
--- /dev/null
+++ b/doc/plugins/contrib/ikiwiki/directive/ymlfront/discussion.mdwn
@@ -0,0 +1,37 @@
+I can't seem to make this work. I have tried, in this [sandbox](http://mesh.openisp.ca/sandbox), to set values for fields and then display them with the getfield meta syntax, but it doesn't seem to be working.
+
+The getfield, field and ymlfront plugins are enabled. I have tried with and without the following field registration:
+
+ # field plugin
+ # define the fields for the meshmtl project
+ field_register:
+ - meta
+ - hostname
+ - MAC
+ - IP
+
+I have tried both the ymlfront directive and the YAML markup (with the
+`---` delimiter), no luck. Any idea what I am doing wrong? --
+[[anarcat]]
+
+> I'm afraid I can't tell from here what the problem could be. It's clear that ymlfront is turned on, or the ymlfront directive in your sandbox page wouldn't be processed. The only thing I can suggest, in order to get more information about what could be going wrong, would be to do a dump of your indexdb file (see [[tips/inside dot ikiwiki]]) and see what the data for your sandbox page is. If there is field data there, that would indicate a problem with getfield; if there isn't field data there, that would indicate a problem with field or ymlfront.
+
+> Oh, and you only need to register "meta" with field_register; that will enable the data defined by the "meta" plugin to be read by field. Unless "hostname", "MAC" and "IP" are plugins, you don't need to add them to field_register. They can be taken care of by the ymlfront plugin. Perhaps that is the problem?
+
+> --[[KathrynAndersen]]
+
+> > I have tried removing the other fields from the declaration, no luck. I did, however, notice the following error in the `--rebuild` output:
+> >
+> > ymlfront parse: Load of sandbox data failed: YAML Error: Stream does not end with newline character
+> > Code: YAML_PARSE_ERR_NO_FINAL_NEWLINE
+> > Line: 0
+> > Document: 0
+> > at /usr/share/perl5/YAML/Loader.pm line 38
+> >
+> > Now *that* has to be related... ;) In the index.db, there is no ymlfront metadata for the sandbox page... Note that the `---` delimiter approach doesn't trigger the warning but doesn't populate the DB either...
+> >
+> > Finally note that after adding debugging code, I was able to figure out that this seems to be using the `YAML::XS` library. I have also traced the data and confirmed that `$yml_str` does get properly initialized in `parse_yml`, and it is where the error is generated. So maybe there's something wrong with the YAML library?
+> >
+> > Update: well, look here: using `YAML::Syck` doesn't yield the same error *and* the metadata actually works! So this is a problem specific to `YAML::Any`. Hardcoding `use YAML::XS` or *even* `use YAML::Any` fixed the problem for me.
+> >
+> > Now delimiters also work, but the output is kind of ugly: it gets parsed as regular markdown markup, so the `---` makes horizontal lines in the beginning and headings in the end... --[[anarcat]]
diff --git a/doc/plugins/contrib/imailhide.mdwn b/doc/plugins/contrib/imailhide.mdwn
new file mode 100644
index 000000000..6009aa012
--- /dev/null
+++ b/doc/plugins/contrib/imailhide.mdwn
@@ -0,0 +1,65 @@
+[[!template id=plugin name=imailhide author="Peter_Vizi"]]
+[[!tag type/widget type/html]]
+
+# Mailhide Plugin for Ikiwiki
+
+This plugin provides the `mailhide` directive, which uses the [Mailhide
+API][1] to protect email addresses from spammers.
+
+## Dependencies
+
+The [Captcha::reCAPTCHA::Mailhide][2] perl module is required for this
+plugin.
+
+## Download
+
+You can get the source code from [github][3].
+
+## Installation
+
+Copy `imailhide.pm` to `/usr/share/perl/5.10.0/IkiWiki/Plugin` or
+`~/.ikiwiki/IkiWiki/Plugin`, and enable it in your `.setup` file:
+
+ add_plugins => [qw{goodstuff imailhide ....}],
+ mailhide_public_key => "8s99vSA99fF11mao193LWdpa==",
+ mailhide_private_key => "6b5e4545326b5e4545326b5e45453223",
+ mailhide_default_style => "short",
+
+## Configuration
+
+### `mailhide_public_key`
+
+This is your personal public key that you can get at [Google][4].
+
+### `mailhide_private_key`
+
+This is your personal private key that you can get at [Google][4].
+
+### `mailhide_default_style`
+
+As per the recommendation of the [Mailhide API documentation][5], you
+can define this as `short` or `long`. The `short` parameter will
+result in `<a href="...">john</a>` links, while the `long` parameter
+will result in `joh<a href="...">...</a>@example.com`.
+
+## Parameters
+
+### `email`
+
+*Required.* This is the email address that you want to hide.
+
+### `style`
+
+*Optional.* You can set the style parameter individually for each
+ `mailhide` call. See `mailhide_default_style` for details.
+
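+## Example
+
+A usage sketch (the address is, of course, made up):
+
+    \[[!mailhide email="john@example.com" style="long"]]
+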
+## Known Issues
+
+1. [opening new window when displaying email address][6]
+
+[1]: http://www.google.com/recaptcha/mailhide/
+[2]: http://search.cpan.org/perldoc?Captcha::reCAPTCHA::Mailhide
+[3]: http://github.com/petervizi/imailhide
+[4]: http://www.google.com/recaptcha/mailhide/apikey
+[5]: http://code.google.com/apis/recaptcha/docs/mailhideapi.html
+[6]: http://github.com/petervizi/imailhide/issues#issue/1
diff --git a/doc/plugins/contrib/img.mdwn b/doc/plugins/contrib/img.mdwn
new file mode 100644
index 000000000..6c25966e0
--- /dev/null
+++ b/doc/plugins/contrib/img.mdwn
@@ -0,0 +1,14 @@
+[[!template id=plugin name=img author="Christian Mock"]]
+[[!tag type/chrome]]
+
+`img` is an enhanced image handling plugin.
+
+The intention is to make image handling as easy as possible, so the
+plugin can scale down images for direct inclusion into the page,
+providing a link to one or more larger or full-size versions on their
+own page. `width` and `height` attributes are included in the `<img>`
+tag, and `alt` text can be specified.
+
+The plugin uses the ImageMagick tools via PerlMagick.
+
+It can be found [here](http://www.tahina.priv.at/hacks/img.html).
diff --git a/doc/plugins/contrib/img/discussion.mdwn b/doc/plugins/contrib/img/discussion.mdwn
new file mode 100644
index 000000000..ea4ccb042
--- /dev/null
+++ b/doc/plugins/contrib/img/discussion.mdwn
@@ -0,0 +1,51 @@
+This is a good idea, I was just the other day stuck writing ugly html to
+properly size an image for a blog post. Putting the image sizing support
+into a plugin instead of trying to shoehorn it into a wikilink seems like
+the way to go.
+
+I have two issues with this plugin as it's implemented now. The first is
+that the generation of whole pages containing a scaled version of the image
+seems gratuitous, as well as buggy. If you want three pages with
+differently scaled versions of the image, why not just create three pages
+and use the plugin once per page? Something like this on the first one if
+it's got multiple clickable thumbnails:
+
+ \[[!img foo.jpg width=256 link=page2]]
+
+This on the second:
+
+ \[[!img foo.jpg width=1024 link=page3]]
+ \[[small|page1]]
+ \[[medium|page2]]
+ \[[large|page3]]
+
+This on the third:
+
+ \[[!img foo.jpg link=page3]]
+ \[[small|page1]]
+ \[[medium|page2]]
+    \[[large|page3]]
+
+Granted, this is more work, but it's more flexible too, and it doesn't have
+it creating new pages on the fly, which I don't personally like..
+
+----
+
+The second issue is whether it should use imagemagick to scale the image
+and generate a new scaled one, or just emit html that sets the sizes of the
+image. Benefits to scaling:
+
+1. Saves download time and bandwidth, especially if generating a page with a
+ lot of thumbnails of large images.
+
+Benefits of not scaling:
+
+1. Avoids any security issues with imagemagick.
+2. Avoids issue of how to clean up old scaled images that are no longer being
+ used. (Granted, this is a general ikiwiki problem that will eventually
+ be fixed in a general way. (Update: now fixed in a general way, use the
+ will_render function.))
+3. Makes clicking on thumbnails display the full version really fast, since
+ it's cached. :-)
+
+--[[Joey]]
diff --git a/doc/plugins/contrib/jscalendar.mdwn b/doc/plugins/contrib/jscalendar.mdwn
new file mode 100644
index 000000000..a320a0542
--- /dev/null
+++ b/doc/plugins/contrib/jscalendar.mdwn
@@ -0,0 +1,45 @@
+[[!meta title="Javascript equivalent of plugin 'calendar'"]]
+
+# Jscalendar
+
+Jscalendar is a javascript equivalent to the [[calendar|plugins/calendar]] plugin.
+
+## Description
+
+Here are some differences compared to that plugin.
+
+* Pros
+    * No need to rebuild the page containing the calendar each time the day changes, or
+      when a page (indexed by the calendar) is added, changed or deleted. This is
+      particularly useful if you want to have this calendar in the sidebar.
+    * Handles the case where several pages appear on the same day: a popup appears to let the user choose the desired day.
+    * Smooth navigation among months.
+* Neutral
+    * Most options are defined in Ikiwiki's setup file instead of in the options
+      of the directive.
+* Cons
+    * As a consequence, every calendar in the wiki must index the same set of pages.
+    * Javascript :( .
+
+## Usage
+
+### Directive
+
+ \[[!jscalendar type="month" ]]
+
+### Setup file
+
+Since this is Javascript rather than markdown, most of the configuration must be done in the IkiWiki setup file rather than in the directive:
+
+ 'archivebase' => "evenements/calendrier",
+ 'archive_pagespec' => "evenements/liste/* and ! evenements/liste/*/*",
+ 'week_start_day' => 1,
+ 'month_link' => 1,
+
+## Example
+
+You can see this plugin in action on [[our website|http://www.gresille.org]]. To see what happens when several pages appear on the same day, check the 15th of March 2012.
+
+Code and documentation can be found here: [[https://atelier.gresille.org/projects/gresille-ikiwiki/wiki/Jscalendar]]
+
+-- Louis
diff --git a/doc/plugins/contrib/jssearchfield.mdwn b/doc/plugins/contrib/jssearchfield.mdwn
new file mode 100644
index 000000000..2d41ee24f
--- /dev/null
+++ b/doc/plugins/contrib/jssearchfield.mdwn
@@ -0,0 +1,35 @@
+[[!template id=plugin name=jssearchfield author="[[rubykat]]"]]
+[[!tag type/search]]
+IkiWiki::Plugin::jssearchfield - Create a search form to search page field data.
+
+This plugin provides the [[ikiwiki/directive/jssearchfield]] directive. This
+enables one to search the structured data ("field" values) of multiple pages.
+This uses Javascript for the searching, which means that the entire thing
+is self-contained and does not require a server or CGI access, unlike
+the default IkiWiki search. This means that it can be used in places such
+as ebook readers. The disadvantage is that because Javascript runs
+in the browser, the searching is only as fast as the machine your browser
+is running on.
+
+Because this uses Javascript, the htmlscrubber must be turned off for any page where the directive is used.
+
+This plugin depends on the [[!iki plugins/contrib/field]] plugin.
+
+## Activate the plugin
+
+ # activate the plugin
+ add_plugins => [qw{goodstuff field jssearchfield ....}],
+
+ # disable scrubbing for search page
+ htmlscrubber_skip => 'mysearchpage',
+
+## PREREQUISITES
+
+ IkiWiki
+ IkiWiki::Plugin::field
+ HTML::Template
+
+## DOWNLOAD
+
+* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/jssearchfield.pm>
+* git repo at git://github.com/rubykat/ikiplugins.git
diff --git a/doc/plugins/contrib/justlogin.mdwn b/doc/plugins/contrib/justlogin.mdwn
new file mode 100644
index 000000000..90645b9ef
--- /dev/null
+++ b/doc/plugins/contrib/justlogin.mdwn
@@ -0,0 +1,52 @@
+This plugin has been abandoned while still in development. Currently it does bring up the login page, and the login page does, with proper credentials, log in the user, but the returning page goes to prefs instead of the calling page. I have no idea why. I decided to go in another direction, so if someone wants to take over then please do so. Otherwise I have no problem if this page needs to be deleted. [[users/justint/]]
+
+Place this code into a page:
+
+    <form action="http://portable.local/cgi-bin/ikiwiki.cgi" method="get">
+    <input type="hidden" name="do" value="justlogin" />
+    <input type="submit" value="Login" /></form>
+
+This is the plugin so far:
+
+    #!/usr/bin/perl
+ # Bring up a login page that returns to the calling page
+ package IkiWiki::Plugin::justlogin;
+
+ use warnings;
+ use strict;
+ use IkiWiki 3.00;
+
+ sub import {
+ hook(type => "sessioncgi", id => "justlogin", call => \&sessioncgi);
+ }
+
+ sub sessioncgi ($$) {
+ my $q=shift;
+ my $session=shift;
+
+ debug("jl sessioncgi1 running.");
+
+ if ($q->param("do") eq "justlogin") {
+ debug("jl do=justlogin running.");
+ if (! defined $session->param("name") ) {
+ debug("jl param!defined running.");
+ $session->param("postsignin" => $ENV{HTTP_REFERER} );
+ $session->param("do" => "justgoback" );
+ IkiWiki::cgi_signin($q, $session);
+ IkiWiki::cgi_savesession($session);
+ }
+ exit;
+ } elsif ($session->param("do") eq "justgoback") {
+ debug("jl justgoback running.");
+ my $page=$q->param("postsignin");
+ $session->clear("postsignin");
+ $session->clear("do");
+ IkiWiki::cgi_savesession($session);
+ IkiWiki::redirect($q, $page);
+ exit;
+ }
+ }
+
+ 1
+
diff --git a/doc/plugins/contrib/linguas.mdwn b/doc/plugins/contrib/linguas.mdwn
new file mode 100644
index 000000000..84ece042e
--- /dev/null
+++ b/doc/plugins/contrib/linguas.mdwn
@@ -0,0 +1,107 @@
+[[!template id=plugin name=linguas author="Jordà Polo"]]
+
+Linguas
+=======
+
+Linguas is a plugin for [ikiwiki](http://ikiwiki.info/) that
+allows translations of wiki pages.
+
+Download: [linguas.pm](http://ettin.org/pub/ikiwiki/linguas.pm) (2006-08-21).
+
+Note that even though it is still available for download, this plugin is no
+longer actively maintained. If you are interested in multilingual wiki pages, you
+can also take a look at other approaches such as [[todo/l10n]], [[plugins/po]],
+or Lars Wirzenius's
+[Static website, with translations, using IkiWiki](http://liw.iki.fi/liw/log/2007-05.html#20070528b).
+
+Usage
+-----
+
+Translatable pages and translations must have the following format:
+`pagename.$LANG`, where `$LANG` is an ISO 639-1 (two-letter) language code.
+To enable linguas, add the following line in the source code of the page:
+
+ \[[!linguas ]]
+
+Note that linguas is only required in one of the pages (the original,
+for instance); the rest of the translations will be automatically
+updated. Additionally, it is also possible to specify the title of
+the translation:
+
+ \[[!linguas title="Translated title"]]
+
+
+Template
+--------
+
+This is the template code that should be added to `templates/page.tmpl`:
+
+ <TMPL_IF NAME="LINGUAS">
+ <div id="linguas">
+ <p class="otherlinguas"><TMPL_VAR NAME="OTHERLINGUAS"></p>
+ <ul>
+ <TMPL_LOOP NAME="LINGUAS">
+ <li><TMPL_VAR NAME=LINK></li>
+ </TMPL_LOOP>
+ </ul>
+ </div>
+ </TMPL_IF>
+
+
+TODO/Known Problems
+-------------------
+
+* The current language list only contains 4 languages (ca, de, en,
+es), and is "hardcoded" in linguas.pm. It would be interesting to define
+it in ikiwiki.setup, though some problems were found while trying to do
+so. (Actually, defining hash-like arguments from the command line works
+fine, but it fails from ikiwiki.setup.)
+
+ > My guess about this is that it's because of the way Setup/Standard.pm
+ > untaints the config items from the file. It has code to handle arrays,
+ > but not hashes or more complex data structures. --[[Joey]]
+
+ > > Right. With this simple
+ > > [patch](http://ettin.org/pub/ikiwiki/hash_setup.patch) it seems to
+ > > work. However, note that 1) it only allows simple hashes, hashes of
+ > > hashes will not work (I don't think getops can handle complex hashes
+ > > anyway); 2) I don't really know when/why you call
+ > > `possibly_foolish_untaint()`; and 3) I'm no perl guru ;). --Jordà
+
+ > > > It's good. Applied.
+
+* Wiki links to other translated pages require the full page name
+including the `.$LANG`. It should be possible to link automatically
+to pages with the same `.$LANG`, but that would probably require some
+changes in IkiWiki. (I'm not sure though, I still haven't looked at
+it... any hints?)
+
+ > Have you considered using the form ll/page? This would let more usual
+ > linking rules apply among pages without needing to specify the
+ > language. I'm not sure if you're supporting browser content
+ > negotiation, or whether that other layout would be harder to support it
+ > though. --[[Joey]]
+
+ > > Actually, I'm happy with the way it works now (and yeah, it is very
+ > > easy to take advantage of content negotiation). I just wanted
+ > > something simple to translate a single page (or a few pages), not
+ > > the entire wiki. I'm not even sure it is a good idea to have fully
+ > > multilingual wikis, in most cases I would go for a different wiki
+ > > for each language. That said, I think it is an interesting idea, so
+ > > I'll take a look when I have the time. Thanks for your comments.
+ > > --Jordà
+
+* The changes to htmllink in ikiwiki 1.44 broke this plugin.
+The following fixes it:
+
+ --- linguas.pm.orig 2006-08-23 19:07:04.000000000 +0200
+ +++ linguas.pm 2007-03-24 01:53:18.000000000 +0100
+ @@ -100,7 +100,7 @@
+ if (exists $linguas{$2} && defined $linguas{$2}) {
+ $link = $linguas{$2}{'name'};
+ }
+ - push @links, IkiWiki::htmllink($page, $destpage, $trans, 0, 0, $link);
+ + push @links, IkiWiki::htmllink($page, $destpage, $trans, noimageinline => 0, forcesubpage => 0, linktext => $link);
+ }
+
+ my $otherlinguas = 'Translations:';
diff --git a/doc/plugins/contrib/livefyre.mdwn b/doc/plugins/contrib/livefyre.mdwn
new file mode 100644
index 000000000..d4a62c0cc
--- /dev/null
+++ b/doc/plugins/contrib/livefyre.mdwn
@@ -0,0 +1,14 @@
+[[!template id=plugin name=livefyre core=0 author="[[cmauch]]"]]
+[[!tag type/special-purpose]]
+
+[LiveFyre](http://www.livefyre.com) is a third party comment and discussion system similar in some ways to Disqus or IntenseDebate. All three services use javascript to attach comments to your site without the need to use a native commenting system.
+
+This plugin is designed to replace the commenting system in IkiWiki entirely. It embeds LiveFyre comments on your ikiwiki blog or posts. It was originally based on the [Disqus Plugin](https://code.google.com/p/ikiwiki-plugin-disqus/). After a few days of noticing odd page title names in the LiveFyre moderation interface, I updated the script to make use of JSON. I made extensive use of the [integration guide](https://github.com/Livefyre/livefyre-docs/wiki/StreamHub-Integration-Guide) to get it all running.
+
+It's loud and messy and slow, but kind of neat too.
+
+Requires the [[!cpan JSON]], [[!cpan JSON::WebToken]], and [[!cpan Digest::MD5]] perl modules to be available.
+
+You can grab the source [here](https://bitbucket.org/cmauch/ikiwiki/src/master/IkiWiki/Plugin/livefyre.pm).
+
+See the POD documentation in the module for installation and configuration instructions.
diff --git a/doc/plugins/contrib/localfavicon.mdwn b/doc/plugins/contrib/localfavicon.mdwn
new file mode 100644
index 000000000..66c9fdf5c
--- /dev/null
+++ b/doc/plugins/contrib/localfavicon.mdwn
@@ -0,0 +1,7 @@
+[[!template id=plugin name=localfavicon author="Franek"]]
+
+This is a trivial modification of the [[plugins/favicon]] plugin to allow different favicons for different parts of the site. For this, the option "localfavicon" has to be set to 1 in the setup file, otherwise the plugin behaves just like the favicon plugin.
+
+For now, it can be downloaded here: [[http://perm.lemtank.de/localfavicon.pm]]
+
+See [[this forum thread|forum/Can_I_have_different_favicons_for_each_folder__63__]] for discussion.
diff --git a/doc/plugins/contrib/mailbox.mdwn b/doc/plugins/contrib/mailbox.mdwn
new file mode 100644
index 000000000..b7a9f81c7
--- /dev/null
+++ b/doc/plugins/contrib/mailbox.mdwn
@@ -0,0 +1,18 @@
+[[!template id=plugin name=mailbox author="[[DavidBremner]]"]]
+[[!tag type/format]]
+
+The `mailbox` plugin adds support to ikiwiki for
+rendering a mailbox file into a page displaying the mails
+in the mailbox. It supports mbox, maildir, and MH folders,
+does threading, and deals with MIME.
+
+One hitch I noticed was that it is not currently possible to treat a
+maildir or an MH directory as a page (i.e. just call it foo.mh and have it
+transformed to page foo). I'm not sure if this is possible and worthwhile
+to fix. It is certainly workable to use a [[!mailbox ]] directive.
+-- [[DavidBremner]]
+
+This plugin is not in ikiwiki yet, but can be downloaded
+from <http://pivot.cs.unb.ca/git/ikimailbox.git>
+
+
diff --git a/doc/plugins/contrib/mailbox/discussion.mdwn b/doc/plugins/contrib/mailbox/discussion.mdwn
new file mode 100644
index 000000000..9520fdd70
--- /dev/null
+++ b/doc/plugins/contrib/mailbox/discussion.mdwn
@@ -0,0 +1,8 @@
+# The remote repo
+
+For some reason, `git fetch` from http://pivot.cs.unb.ca/git/ikimailbox.git/ didn't work very smoothly for me: it hung, and I had to restart it 3 times before the download was complete.
+
+I'm writing this just to let you know that there might be some problems with such connections to your http-server. --Ivan Z.
+> I can't replicate this (two months later!)
+> I can suggest trying the git:// url for download if you can.
+> Also, if you really want to get my attention, send me email [[DavidBremner]]
diff --git a/doc/plugins/contrib/mandoc.mdwn b/doc/plugins/contrib/mandoc.mdwn
new file mode 100644
index 000000000..672a268cc
--- /dev/null
+++ b/doc/plugins/contrib/mandoc.mdwn
@@ -0,0 +1,12 @@
+[[!template id=plugin name=mandoc author="[[schmonz]]"]]
+[[!template id=gitbranch branch=schmonz/mandoc author="[[schmonz]]"]]
+[[!tag type/format]]
+
+This plugin lets ikiwiki convert Unix man pages to HTML. It uses
+[mdocml](http://mdocml.bsd.lv/) for the conversion, and postprocesses
+xrefs into hyperlinks.
+
+Possible enhancements:
+
+* configurable path and args to `mandoc` (and it could be `groff`)
+* configurable location for rendered manpages (such as subdirs per section)
diff --git a/doc/plugins/contrib/mathjax.mdwn b/doc/plugins/contrib/mathjax.mdwn
new file mode 100644
index 000000000..a784b95d9
--- /dev/null
+++ b/doc/plugins/contrib/mathjax.mdwn
@@ -0,0 +1,13 @@
+[[!template id="plugin" name="mathjax" author="Baldur Kristinsson"]]
+
+The [mathjax plugin](https://github.com/bk/ikiwiki-plugin-mathjax), available on GitHub, provides easy MathJax support for ikiwiki.
+
+Features:
+
+- No change needed to page.tmpl
+- Javascript is only loaded on pages which need it.
+- Both inline and display math are supported.
+- Unlike [[the pandoc plugin|plugins/contrib/pandoc]] or a solution based on editing page.tmpl, there are no irritating conflicts with the smiley plugin.
+- Unlike the pandoc plugin, it is easy to use on shared hosting or in other environments where it is difficult to install extra software (beyond Perl modules, obviously).
+
+However, if you need the power of Pandoc, such as bibliography support or pdf generation, then that is probably the better option for you.
diff --git a/doc/plugins/contrib/mediawiki.mdwn b/doc/plugins/contrib/mediawiki.mdwn
new file mode 100644
index 000000000..13c2d04b2
--- /dev/null
+++ b/doc/plugins/contrib/mediawiki.mdwn
@@ -0,0 +1,10 @@
+[[!template id=plugin name=mediawiki author="[[sabr]]"]]
+[[!tag type/format]]
+
+The Mediawiki plugin allows ikiwiki to process pages written using MediaWiki
+markup.
+
+Available at <http://github.com/jmtd/mediawiki.pm>.
+
+This plugin originally lived at <http://u32.net/Mediawiki_Plugin/>, but that
+website has disappeared.
diff --git a/doc/plugins/contrib/mediawiki/discussion.mdwn b/doc/plugins/contrib/mediawiki/discussion.mdwn
new file mode 100644
index 000000000..c288d9bd1
--- /dev/null
+++ b/doc/plugins/contrib/mediawiki/discussion.mdwn
@@ -0,0 +1,9 @@
+Anyone know a safe place where this plugin can be found? -- mjr at phonecoop.coop
+
+> I ended up doing a backassward way of doing it, as described at the [convert discussion page](http://ikiwiki.info/tips/convert_mediawiki_to_ikiwiki/discussion/). -[[simonraven]]
+
+>> I've mirrored it at <http://alcopop.org/~jon/mediawiki.pm>. -- [[Jon]]
+
+---
+
+Something that gives me better results is to edit the source of the [[wikitext]] plugin, and change all occurrences of Text::WikiFormat to Text::MediawikiFormat. (This of course depends on ''libtext-mediawikiformat-perl'' instead of ''libtext-wikiformat-perl''.) -- [[gi1242]]
diff --git a/doc/plugins/contrib/monthcalendar.mdwn b/doc/plugins/contrib/monthcalendar.mdwn
new file mode 100644
index 000000000..d48e4d6b7
--- /dev/null
+++ b/doc/plugins/contrib/monthcalendar.mdwn
@@ -0,0 +1,23 @@
+# Monthcalendar
+
+This plugin displays a calendar containing, for each day, the list of links to pages published on that day. It can be used, for example, to display archives of blog posts, or to announce events.
+
+## Usage
+
+### Directive
+
+ \[[!monthcalendar type="month" year="2012" month="06" pages="events/*"]]
+
+### Automation
+
+By using the following line in the template `calendarmonth.tmpl`, you can have `ikiwiki-calendar` use this plugin to display monthly archives.
+
+ \[[!monthcalendar type="month" year="<TMPL_VAR YEAR>" month="<TMPL_VAR MONTH>" pages="<TMPL_VAR PAGESPEC>"]]
+
+## Code
+
+Code and documentation can be found here: [[https://atelier.gresille.org/projects/gresille-ikiwiki/wiki/Monthcalendar]].
+
+## Example
+
+This plugin is used on [our website](http://www.gresille.org/evenements/calendrier/2012/03).
diff --git a/doc/plugins/contrib/mscgen.mdwn b/doc/plugins/contrib/mscgen.mdwn
new file mode 100644
index 000000000..792aaa4e3
--- /dev/null
+++ b/doc/plugins/contrib/mscgen.mdwn
@@ -0,0 +1,52 @@
+[[!template id=plugin name=mscgen author="[[users/Tjgolubi]]"]]
+[[!tag type/widget]]
+
+## NAME
+
+IkiWiki::Plugin::mscgen - embed message sequence chart
+
+## SYNOPSIS
+
+In the ikiwiki setup file, enable this plugin by adding it to the list of active plugins.
+
+ add_plugins:
+ - mscgen
+
+## DESCRIPTION
+
+This plugin provides the msc [[ikiwiki/directive]].
+This directive allows embedding [mscgen](http://www.mcternan.me.uk/mscgen/)
+message sequence chart graphs in an ikiwiki page.
+
+Here's an example that shows how an mscgen message sequence chart is embedded into an ikiwiki page.
+
+ \[[!msc src="""
+ arcgradient = 8;
+
+ a [label="Client"],b [label="Server"];
+
+ a=>b [label="data1"];
+ a-xb [label="data2"];
+ a=>b [label="data3"];
+ a<=b [label="ack1, nack2"];
+ a=>b [label="data2", arcskip="1"];
+ |||;
+ a<=b [label="ack3"];
+ |||;
+ """]]
+
+Security implications: to be determined.
+
+This plugin borrows heavily from the [[graphviz|plugins/graphviz]] plugin written by [[JoshTriplett]].
+
+## PREREQUISITES
+ IkiWiki
+ mscgen
+ Digest::SHA
+
+## DOWNLOAD
+
+* browse at GitHub: <http://github.com/tjgolubi/ikiwiki.mscgen>
+* repo at git://github.com/tjgolubi/ikiwiki.mscgen.git
+
+
diff --git a/doc/plugins/contrib/navbar.mdwn b/doc/plugins/contrib/navbar.mdwn
new file mode 100644
index 000000000..f1c15c6e0
--- /dev/null
+++ b/doc/plugins/contrib/navbar.mdwn
@@ -0,0 +1,40 @@
+[[!template id=plugin name=navbar author="[[TobiOetiker]]"]]
+
+The Navbar Plugin renders a Navigation Bar into your page. It is based on code
+from the [[sidebar_plugin|plugins/sidebar]].
+
+The plugin looks for a page called "navbar".
+
+This page must contain an itemized list of the form:
+
+
+ * \[[Welcome|index]]
+ * \[[Management|mgmt]]
+ * \[[Leadership|mgmt/lead]]
+ * \[[Kidnapping|mgmt/kidnapping]]
+ * \[[Information_Technology|it]]
+ * \[[Windows|it/windows]]
+ * \[[Mobile_Communication|it/mobile]]
+
+This list will be turned into a folding menu structure.
+
+Include this in your templates:
+
+ <TMPL_IF NAVBAR>
+ <div id="navbar">
+ <TMPL_VAR NAVBAR>
+ </div>
+ </TMPL_IF>
+
+
+To make a nice menu, some CSS magic is required, but since that is needed to
+make ikiwiki look good anyway, I won't go into details here...
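+
+As a minimal, untested starting point for such styling, assuming the
+`#navbar` id from the template above and that the list is rendered as
+nested `<ul>` elements:
+
+    #navbar ul { list-style: none; margin: 0; padding: 0; }
+    #navbar > ul > li { display: inline-block; position: relative; }
+    #navbar li ul { display: none; position: absolute; }
+    #navbar li:hover > ul { display: block; }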
+
+See the navbar in action on <http://insights.oetiker.ch>
+
+Tobi Oetiker 2006.12.30
+
+If you are interested in this, drop me a line tobi at oetiker dot ch
+
+
+In the meantime, I ([[MartinQuinson]]) have hacked this a bit to make it fit my needs. The result is [[here|http://www.loria.fr/~quinson/Hacking/ikiwiki/]].
diff --git a/doc/plugins/contrib/navbar/discussion.mdwn b/doc/plugins/contrib/navbar/discussion.mdwn
new file mode 100644
index 000000000..0bbec743c
--- /dev/null
+++ b/doc/plugins/contrib/navbar/discussion.mdwn
@@ -0,0 +1,2 @@
+Where can I download this plugin ?
+-- [[jogo]]
diff --git a/doc/plugins/contrib/newpage.mdwn b/doc/plugins/contrib/newpage.mdwn
new file mode 100644
index 000000000..54c2f53d0
--- /dev/null
+++ b/doc/plugins/contrib/newpage.mdwn
@@ -0,0 +1,29 @@
+[[!template id=plugin name=newpage author="[[rubykat]]"]]
+[[!tag type/web]]
+[[!toc]]
+## NAME
+
+IkiWiki::Plugin::newpage - add a "create new page" form to actions
+
+## SYNOPSIS
+
+ # activate the plugin
+ add_plugins => [qw{goodstuff newpage ....}],
+
+## DESCRIPTION
+
+This plugin adds a new action to the "ACTIONS" section of a page:
+a button labelled "create" with an input field next to it.
+
+The common way of creating a new page is to edit a different page
+and add a link to the new page. However, there are some situations
+where that is a nuisance; for example, where pages are listed using
+a [[plugins/map]] directive. The newpage plugin enables
+one to simply type the name of the new page, click the "create" button,
+and one is then taken to the standard IkiWiki create-page form.
+
+## DOWNLOAD
+
+* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/newpage.pm>
+* git repo at git://github.com/rubykat/ikiplugins.git
+
diff --git a/doc/plugins/contrib/newpage/discussion.mdwn b/doc/plugins/contrib/newpage/discussion.mdwn
new file mode 100644
index 000000000..fb186463d
--- /dev/null
+++ b/doc/plugins/contrib/newpage/discussion.mdwn
@@ -0,0 +1,10 @@
+How is this better than creating an inline with `rootpage` set,
+which creates a similar new page form? I sometimes make the inline match
+nothing, while still creating pages, in the odd cases where I have a map
+or such displaying the pages. --[[Joey]]
+
+> I wanted something that would automatically be available on every page, but only when editing was enabled.
+> One of the sites I maintain as webmaster (<http://www.constrainttec.com/>) has a two-stage publication process. The "working" site is on an internal server, where it is set up as a wiki that authorized users in the company can edit. When they're satisfied with the changes they've made, the "working" site gets pushed (with git) to the "production" site, which is on a different server. The ikiwiki setup for the production site has editing completely disabled, because it is the site which is exposed to the outside world.
+> For that site, I want all signs that it's a wiki to be hidden. Therefore using an inline directive would be unsuitable.
+
+> --[[KathrynAndersen]]
diff --git a/doc/plugins/contrib/opml.mdwn b/doc/plugins/contrib/opml.mdwn
new file mode 100644
index 000000000..3f98e8065
--- /dev/null
+++ b/doc/plugins/contrib/opml.mdwn
@@ -0,0 +1,11 @@
+[[!template id=plugin name=opml author="[[JanWalzer|jwalzer]]"]]
+[[!tag type/format]]
+
+The idea of this plugin is to parse an OPML file and output a link list, perhaps with some customization.
+OPML files are XML files that most RSS readers write out to summarize their subscribed feed lists.
+
+I have a "dumb" Perl script running on my website that tries to do an opml2mdwn transformation, but it's quite bad at it.
+
+This Plugin is **NOT Ready** in any way. I'm just putting this page up as a hook, to discuss it.
+
+I intend to work on this, but I'd appreciate any help on this.
diff --git a/doc/plugins/contrib/opml/discussion.mdwn b/doc/plugins/contrib/opml/discussion.mdwn
new file mode 100644
index 000000000..3a145c79a
--- /dev/null
+++ b/doc/plugins/contrib/opml/discussion.mdwn
@@ -0,0 +1,4 @@
+If this is the wrong place for the development of the plugin, please move it to a more appropriate one.
+
+Currently I'm quite stuck with the Perl side itself. I'm trying to become comfortable with the language, but it seems the language doesn't like me. I'm lost in complex data structures when trying to iterate through the output of XML::Simple. --[[Jan|jwalzer]]
+
diff --git a/doc/plugins/contrib/pagespec_alias.mdwn b/doc/plugins/contrib/pagespec_alias.mdwn
new file mode 100644
index 000000000..cb642ad33
--- /dev/null
+++ b/doc/plugins/contrib/pagespec_alias.mdwn
@@ -0,0 +1,28 @@
+[[!template id=plugin name=pagespec_alias author="[[Jon]]"]]
+[[!tag type/meta]]
+
+The pagespec_alias plugin allows the administrator(s) of a wiki to define
+[[PageSpec]] aliases: short names for PageSpecs to ease re-use.
+
+Within the setup file, the `pagespec_aliases` value is treated as a list
+of key/value pairs. The keys define alias names, the values the pagespecs
+to which they refer.
+
+For example:
+
+ pagespec_aliases:
+ image: "*.png or *.jpg or *.jpeg or *.gif or *.ico"
+ helper: "*.css or *.js"
+ boring: "image() or helper() or internal(*)"
+
+With the above, you could use a pagespec alias such as
+
+ \[[!map pages="!boring()"]]
+
+to define a site map which excludes various pages that might be
+uninteresting to include.
+
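+Internally, each alias presumably has to become an ordinary pagespec match
+function. A minimal sketch of how that could be done at `checkconfig` time,
+following ikiwiki's `IkiWiki::PageSpec::match_*` naming convention (an
+illustration, not the plugin's actual code):
+
+    sub checkconfig () {
+        foreach my $name (keys %{$config{pagespec_aliases}}) {
+            my $spec = $config{pagespec_aliases}{$name};
+            no strict 'refs';
+            # define e.g. IkiWiki::PageSpec::match_boring, delegating
+            # to the pagespec the alias stands for
+            *{"IkiWiki::PageSpec::match_$name"} = sub {
+                my ($page, $glob, @params) = @_;
+                return IkiWiki::pagespec_match($page, $spec, @params);
+            };
+        }
+    }
+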
+
+## Download
+
+ * <https://github.com/jmtd/ikiwiki/blob/pagespec-alias/IkiWiki/Plugin/pagespec_alias.pm>
diff --git a/doc/plugins/contrib/pandoc.mdwn b/doc/plugins/contrib/pandoc.mdwn
new file mode 100644
index 000000000..264aafd95
--- /dev/null
+++ b/doc/plugins/contrib/pandoc.mdwn
@@ -0,0 +1,6 @@
+[[!template id=plugin name=pandoc author="profjim"]]
+
+This plugin enables Markdown processing using [Pandoc](http://johnmacfarlane.net/pandoc/). You can configure it so that Pandoc takes over processing of all .mkdn files, or only files with a different extension. Given the features Pandoc has added over the past 6-12 months, this makes for a very powerful combination, e.g. with code block syntax highlighting and lots of options for how to process and display inline TeX.
+
+This is an expanded and updated version of [[Jason Blevin|users/jasonblevins]]'s pandoc plugin. Get it and see further details at <https://github.com/dubiousjim/pandoc-iki>.
+
diff --git a/doc/plugins/contrib/plusone.mdwn b/doc/plugins/contrib/plusone.mdwn
new file mode 100644
index 000000000..a8d4c67fd
--- /dev/null
+++ b/doc/plugins/contrib/plusone.mdwn
@@ -0,0 +1,35 @@
+[[!template id=plugin name=plusone author="[[BerndZeimetz]]"]]
+[[!toc]]
+[[!tag plugins]] [[!tag patch]] [[!tag wishlist]]
+
+## NAME
+
+IkiWiki::Plugin::plusone - Adding the +1 button to your posts
+
+## SYNOPSIS
+
+ # activate the plugin
+ add_plugins => [qw{goodstuff plusone ....}],
+
+ # set some options:
+ plusone_count => 1,
+ plusone_size => 'standard',
+ plusone_lang => 'en-US',
+
+
+## DESCRIPTION
+
+This plugin allows you to add a Google +1 button using the plusone directive
+
+    \[[!plusone ]]
+
+wherever you want the button to show up.
+
+## plusone directive
+
+The plusone directive allows you to override the automatically generated URL by specifying the desired URL as an option:
+
+    \[[!plusone url="http://bzed.de/"]]
+
+
+## DOWNLOAD
+
+* single file: [plusone.pm](http://git.recluse.de/?p=users/bzed/ikiwiki.git;a=blob_plain;f=IkiWiki/Plugin/plusone.pm;hb=refs/heads/plusone)
+* browse repository: <http://git.recluse.de/?p=users/bzed/ikiwiki.git;a=shortlog;h=refs/heads/plusone>
+* git repo: `git://git.recluse.de/users/bzed/ikiwiki.git` or <http://git.recluse.de/repos/users/bzed/ikiwiki.git> (Use the plusone branch)
diff --git a/doc/plugins/contrib/pod.mdwn b/doc/plugins/contrib/pod.mdwn
new file mode 100644
index 000000000..97a9c648a
--- /dev/null
+++ b/doc/plugins/contrib/pod.mdwn
@@ -0,0 +1,38 @@
+[[!template id=plugin name=pod author="[[rubykat]]"]]
+[[!tag type/format]]
+## NAME
+
+IkiWiki::Plugin::pod - process pages written in POD format.
+
+## SYNOPSIS
+
+In the ikiwiki setup file, enable this plugin by adding it to the
+list of active plugins.
+
+ add_plugins => [qw{goodstuff pod ....}],
+
+## DESCRIPTION
+
+IkiWiki::Plugin::pod is an IkiWiki plugin enabling ikiwiki to
+process pages written in POD ([Plain Old Documentation](http://en.wikipedia.org/wiki/Plain_Old_Documentation)) format.
+This will treat files with a `.pod` or `.pm` extension as files
+which contain POD markup.
+
+## OPTIONS
+
+The following options can be set in the ikiwiki setup file.
+
+* **pod_index:** If true, this will generate an index (table of contents) for the page.
+* **pod_toplink:** The label to be used for links back to the top of the page. If this is empty, then no top-links will be generated.
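+
+For example, to enable the page index and label the top-links "Top", the
+setup file could contain:
+
+    pod_index => 1,
+    pod_toplink => 'Top',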
+
+## PREREQUISITES
+
+ IkiWiki
+ Pod::Xhtml
+ IO::String
+
+## DOWNLOAD
+
+* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/pod.pm>
+* git repo at git://github.com/rubykat/ikiplugins.git
+
diff --git a/doc/plugins/contrib/pod/discussion.mdwn b/doc/plugins/contrib/pod/discussion.mdwn
new file mode 100644
index 000000000..9187b1350
--- /dev/null
+++ b/doc/plugins/contrib/pod/discussion.mdwn
@@ -0,0 +1,14 @@
+My one concern about this plugin is the `=for` markup in POD.
+
+> Some format names that formatters currently are known to
+> accept include "roff", "man", "latex", "tex", "text", and "html".
+
+I don't know which of these [[!cpan Pod::Xhtml]] supports. If it currently
+supports, or later supports, latex, that could be problematic, since latex
+could maybe be used to include files or run code. --[[Joey]]
+
+> I don't know, either; the documentation for [[!cpan Pod::Xhtml]] is silent on this subject. --[[KathrynAndersen]]
+
+>> I'm afraid the only approach is to audit the existing code in the perl
+>> module(s), and then hope nothing is added to them later that opens a
+>> security hole. --[[Joey]]
diff --git a/doc/plugins/contrib/postal.mdwn b/doc/plugins/contrib/postal.mdwn
new file mode 100644
index 000000000..c522f8bcb
--- /dev/null
+++ b/doc/plugins/contrib/postal.mdwn
@@ -0,0 +1,35 @@
+[[!template id=plugin name=postal author="[[DavidBremner]]"]]
+[[!tag type/special-purpose]]
+
+The `postal` plugin allows users to send mail to
+a special address to comment on a page. It uses the [[mailbox]]
+plugin to display their comments in the wiki.
+
+This plugin is not in ikiwiki yet, but can be downloaded
+from <http://pivot.cs.unb.ca/git/ikipostal.git>
+
+Details:
+
+ * Adds a mailto: url to each page matching some pagespec
+ (currently every page gets a comment footer)
+
+ * This mailto url goes to an address identifying the page (something like
+ user-iki-blog~I\_hate\_markdown@host.fqdn.tld).
+ [more details](http://www.cs.unb.ca/~bremner/blog/posts/encoding)
+
+ * on the mail receiving end, these messages are either deleted, or run through
+ a filter to be turned into blog posts. I have
+[written](http://pivot.cs.unb.ca/git/?p=ikipostal.git;a=blob_plain;f=filters/postal-accept.pl;hb=HEAD)
+ a filter that decodes the address and writes the message into an appropriate
+mailbox. The changes are then checked into version control; typically a hook then updates the html version of the wiki.
+ * work in progress can be
+
+ - [cloned](http://pivot.cs.unb.ca/git/ikipostal.git), or
+ - [browsed](http://pivot.cs.unb.ca/git/?p=ikipostal.git;a=summary)
+
+ * I would be interested in any ideas people have about security.
+
+The current version of this plugin is now running on my home page. See for example
+[a recent post in my blog](http://www.cs.unb.ca/~bremner/blog/posts/can-i-haz-a-distributed-rss/).
+Unfortunately although the [[mailbox|todo/mbox]] renderer supports threading, I haven't had
+a chance to implement comments on comments yet. --[[DavidBremner]]
diff --git a/doc/plugins/contrib/postal/discussion.mdwn b/doc/plugins/contrib/postal/discussion.mdwn
new file mode 100644
index 000000000..4eaacc044
--- /dev/null
+++ b/doc/plugins/contrib/postal/discussion.mdwn
@@ -0,0 +1,24 @@
+It seems like the filter 'postal-accept.pl' I wrote doesn't refresh thoroughly enough. When a comment is added it calls
+
+ IkiWiki::add_depends($page,$comments_page);
+
+And then after adding the actual comment, it ends with
+
+ IkiWiki::refresh();
+ IkiWiki::saveindex();
+
+Sure enough, the page being commented on is refreshed, but not any inlining pages (e.g. tag pages, blog top level) that contain it.
+Is there a way to refresh recursively? Or should it work that way by default? I guess it is some part of the API that I don't understand,
+since I think not many people grub about in the internals of ikiwiki this way.
+It would be nice to figure this out, doing a full rebuild every time I get a blog comment is not that fun.
+
+[[DavidBremner]]
+
+> Ikiwiki currently doesn't have support for transitive dependencies.
+> This is discussed deep inside [[todo/tracking_bugs_with_dependencies]]
+> and in [[todo/inlines_inheriting_links]].
+>
+> FYI, the [[plugins/comments]] plugin avoids this problem by only showing the
+> comments on the page, and not on pages that inline it. --[[Joey]]
+>> Ok, thanks for the speedy response. I guess I should do the same thing.
+>> [[DavidBremner]]
diff --git a/doc/plugins/contrib/proxies.mdwn b/doc/plugins/contrib/proxies.mdwn
new file mode 100644
index 000000000..7f8f5faaf
--- /dev/null
+++ b/doc/plugins/contrib/proxies.mdwn
@@ -0,0 +1,13 @@
+[[!template id=plugin name=proxies author="[[schmonz]]"]]
+[[!template id=gitbranch branch=schmonz/proxies author="[[schmonz]]"]]
+[[!tag type/web]]
+
+This plugin enables ikiwiki to open outbound connections (such as
+found in [[plugins/aggregate]], [[plugins/openid]], and [[plugins/pinger]])
+via a proxy. The proxy can be configurably avoided for connections
+to certain domains.
+
+### To do
+
+* Move duplicated user-agent setup out of other plugins into this one.
+* While I'm at it, fix [[bugs/http_proxy_for_openid]].
diff --git a/doc/plugins/contrib/report.mdwn b/doc/plugins/contrib/report.mdwn
new file mode 100644
index 000000000..0bd5392c6
--- /dev/null
+++ b/doc/plugins/contrib/report.mdwn
@@ -0,0 +1,26 @@
+[[!template id=plugin name=report author="[[rubykat]]"]]
+[[!tag type/meta type/format]]
+IkiWiki::Plugin::report - Produce templated reports from page field data.
+
+This plugin provides the [[ikiwiki/directive/report]] directive. This enables
+one to report on the structured data ("field" values) of multiple pages; the
+output is formatted via a template. This depends on the
+[[plugins/contrib/field]] plugin.
+
+
+## Activate the plugin
+
+ # activate the plugin
+ add_plugins => [qw{goodstuff report ....}],
+
+## PREREQUISITES
+
+ IkiWiki
+ IkiWiki::Plugin::field
+ HTML::Template
+ Encode
+
+## DOWNLOAD
+
+* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/report.pm>
+* git repo at git://github.com/rubykat/ikiplugins.git
diff --git a/doc/plugins/contrib/report/discussion.mdwn b/doc/plugins/contrib/report/discussion.mdwn
new file mode 100644
index 000000000..419c4bca6
--- /dev/null
+++ b/doc/plugins/contrib/report/discussion.mdwn
@@ -0,0 +1,80 @@
+Wow, this plugin does a lot... it seems to be `inline` (but without the feeds
+or the ability to not have `archive="yes"`), plus part of
+[[plugins/contrib/trail]], plus some sorting, plus an ingenious workaround
+for template evaluation being relatively stateless.
+
+A large part of this plugin would just fall off if one of the versions of
+"[[todo/allow_plugins_to_add_sorting_methods]]" was merged, which was a
+large part of the idea of that feature request :-) To make use of that
+you'd have to use `pagespec_match_list` in the trail case too, but that's
+easy enough - just add `list => [@the_trail_pages]` to the arguments.
+
+Another large part would fall off if this plugin required, and internally
+invoked, `inline` (like my `comments` plugin does) - `inline` runs
+`pagetemplate` hooks, and in particular, it'll run the `field` hook.
+Alternatively, this plugin could invoke `pagetemplate` hooks itself,
+removing the special case for `field`.
+
+Perhaps the `headers` thing could migrate into inline somehow? That might
+lead to making inline too big, though.
+
+> I think inline is *already* too big, honestly. --[[KathrynAndersen]]
+
+>> A fair point; perhaps my complaint should be that *inline* does
+>> too many orthogonal things. I suppose the headers feature wouldn't
+>> really make sense in an inline that didn't have `archive="yes"`,
+>> so it'd make sense to recommend this plugin as a replacement
+>> for inlining with archive=yes (for which I now realise "inline"
+>> is the wrong verb anyway :-) ) --s
+
+>>> I think *inline* would be a bit less unwieldy if there was some way of factoring out the feed stuff into a separate plugin, but I don't know if that's possible. --K.A.
+
+Is the intention that the `trail` part is a performance hack, or a way
+to select pages? How does it relate to [[todo/wikitrails]] or
+[[plugins/contrib/trail]]? --[[smcv]]
+
+> The `trail` part is *both* a performance hack, and a way to select pages. I have over 5000 pages on my site, I need all the performance hacks I can get.
+> For the performance hack, it is a way of reducing the need to iterate through every single page in the wiki in order to find matching pages.
+> For the way-to-select-pages, yes, it is intended to be similar to [[todo/wikitrails]] and [[plugins/contrib/trail]] (and will be more similar with the new release which will be happening soon; it will add prev_* and next_* variables).
+> The idea is that, rather than having to add special "trail" links on PageA to indicate that a page is part of the trail,
+> it takes advantage of the `%links` hash, which already contains, for each page, an array of the links from that page to other pages. No need for special markup, just use what's there; a trail is defined as "all the pages linked to from page X", and since it's an array, it has an order already.
+> But to avoid that being too limiting, one can use a `pages=...` pagespec to filter that list to a subset; only the pages one is interested in.
+> And one can also sort it, if one so desires.
+> --[[KathrynAndersen]]
+
+>> That's an interesting approach to trails; I'd missed the fact that
+>> links are already ordered.
+>>
+>> This does have the same problems as tags, though: see
+>> [[bugs/tagged()_matching_wikilinks]] and
+>> [[todo/matching_different_kinds_of_links]]. I suppose the question
+>> now is whether new code should be consistent with `tag` (and
+>> potentially be fixed at the same time as tag itself), or try to
+>> avoid those problems?
+>>
+>> The combination of `trail` with another pagespec in this plugin
+>> does provide a neat way for it to work around having unwanted
+>> pages in the report, by limiting by a suitable tag or subdirectory
+>> or something. --s
+
+>>> Either that, or somehow combine tagging with fields, such that one could declare a tag, and it would create both a link and a field with a given value. (I've been working on something like that, but it still has bugs).
+>>> That way, the test for whether something is tagged would be something like "link(tag/foo) and field(tag foo)".
+>>> --K.A.
+
+>>>> I can see that this'd work well for 1:1 relationships like next
+>>>> and previous, but I don't think that'd work for pages with more than
+>>>> one tag - as far as I can see, `field`'s data model is that each
+>>>> page has no more than one value for each field?
+>>>> [[todo/Matching_different_kinds_of_links]] has some thoughts about
+>>>> how it could be implemented, though. --s
+
+>>>>> You have a point there. I'm not sure what would be better: to add the concept of arrays/sets to `field`, or to think of tags as a special case. Problem is, I find tags as they currently exist to be too limiting. I prefer something that can be used for Faceted Tagging <http://en.wikipedia.org/wiki/Faceted_classification>; that is, things like Author:Fred Nurk, Genre:Historical, Rating:Good, and so on. Of course, that doesn't mean that each tag is limited to only one value, either; just to take the above examples, something might have more than one author, or have multiple genres (such as Historical + Romance).
+
+>>>>> It might be that adding arrays to the `field` plugin is a good way to go: after all, even though field=value is the most common, with the flexibility of things like YAML, one could define all sorts of things. What I'm not so sure about is how to return the values when queried, since some things would be expecting scalars all the time. Ah, perhaps I could use wantarray?
+>>>>> Is there a way of checking an HTML::Template template to see if it is expecting an array for a particular value?
+>>>>> --[[KathrynAndersen]]
+
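+(As an illustrative aside, not code from the plugin: a field-returning
+function could use Perl's `wantarray` to return the raw array in list
+context and a joined string in scalar context, so callers expecting a
+scalar keep working.)
+
+    sub field_value {
+        my @values = @_;
+        # list context: raw values; scalar context: space-joined string
+        return wantarray ? @values : join(' ', @values);
+    }
+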
+How about arrays?
+-----------------
+
+In [[plugins/contrib/getfield/discussion]], I outline how there's a problem in getfield displaying array refs when the data is a YAML array. I also propose a patch there so that arrays are join'd with a space separator, which is less than ideal, but at least works for getfield. However, for report, I am not sure it's as good. Should it make two rows for those? How should we parse this? Thanks. -- [[anarcat]]
diff --git a/doc/plugins/contrib/report/ikiwiki/directive/report.mdwn b/doc/plugins/contrib/report/ikiwiki/directive/report.mdwn
new file mode 100644
index 000000000..4a740f97f
--- /dev/null
+++ b/doc/plugins/contrib/report/ikiwiki/directive/report.mdwn
@@ -0,0 +1,175 @@
+[[!toc]]
+The `report` directive is supplied by the [[!iki plugins/contrib/report desc=report]] plugin.
+
+This enables one to report on the structured data ("field" values) of
+multiple pages; the output is formatted via a template. This depends
+on the [[plugins/contrib/field]] plugin.
+
+The pages to report on are selected by a PageSpec given by the "pages"
+parameter. The template is given by the "template" parameter.
+The template expects the data from a single page; it is applied
+to each matching page separately, one after the other.
+
+Additional parameters can be used to fill out the template, in
+addition to the "field" values. Passed-in values override the
+"field" values.
+
+There are two places where template files can live. One is in the
+/templates directory on the wiki. These templates are wiki pages, and
+can be edited from the web like other wiki pages.
+
+The second place where template files can live is in the global
+templates directory (the same place where the page.tmpl template lives).
+This is a useful place to put template files if you want to prevent
+them being edited from the web, and you don't want to have to make
+them work as wiki pages.
+
+## OPTIONS
+
+**template**: The template to use for the report.
+
+**pages**: A PageSpec to determine the pages to report on.
+
+**pagenames**: If given instead of pages, this is interpreted as a
+space-separated list of links to pages, which are shown in exactly the
+order given. The sort and pages parameters cannot be used in
+conjunction with this one; if they are given, they are ignored.
+
+**trail**: A page or pages to use as a "trail" page.
+
+When a trail page is used, the matching pages are limited to (a subset
+of) the pages which that page links to; the "pages" pagespec in this
+case, rather than selecting pages from the entire wiki, will select
+pages from within the set of pages given by the trail page.
+
+Additional space-separated trail pages can be given in this option.
+For example:
+
+ trail="animals/cats animals/dogs"
+
+This will take the links from both the "animals/cats" page and the
+"animals/dogs" page as the set of pages to apply the PageSpec to.
+
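+For example, a report over both trails might look like this (the
+template name here is illustrative):
+
+    \[[!report template="animal_summary"
+        trail="animals/cats animals/dogs"
+        sort="title"]]
+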
+**start**: Start the report at the given page-index; the index starts
+from zero.
+
+**count**: Report on at most N pages, where count=N.
+
+**sort**: A SortSpec to determine how the matching pages should be sorted.
+
+**here_only**: Report on the current page only.
+
+This is useful in combination with "prev_" and "next_" variables to
+make a navigation trail.
+If the current page doesn't match the pagespec, then no pages will
+be reported on.
+
+### Headers
+
+An additional option is the "headers" option. This is a space-separated
+list of field names which are to be used as headers in the report. This
+is a way of getting around one of the limitations of HTML::Template, that
+is, not being able to do tests such as
+"if this-header is not equal to previous-header".
+
+Instead, that logic is performed inside the plugin. The template is
+given parameters "HEADER1", "HEADER2" and so on, for each header.
+If the value of a header field is the same as the previous value,
+then HEADER**N** is set to be empty, but if the value of the header
+field is new, then HEADER**N** is given that value.
+
+#### Example
+
+Suppose you're writing a blog in which you record "moods", and you
+want to display your blog posts by mood.
+
+ \[[!report template="mood_summary"
+ pages="blog/*"
+ sort="Mood Date title"
+ headers="Mood"]]
+
+The "mood_summary" template might be like this:
+
+ <TMPL_IF NAME="HEADER1">
+ ## <TMPL_VAR NAME="HEADER1">
+ </TMPL_IF>
+ ### <TMPL_VAR NAME="TITLE">
+ (<TMPL_VAR NAME="DATE">) \[[<TMPL_VAR NAME="PAGE">]]
+ <TMPL_VAR NAME="DESCRIPTION">
+
+### Multi-page Reports
+
+Reports can now be split over multiple pages, so that there aren't
+too many items per report-page.
+
+**per_page**: how many items to show per report-page.
+
+**first_page_is_index**: If true, the first page of the report is just
+an index which contains links to the other report pages.
+If false, the first page will contain report-content as well as links
+to the other pages.
+
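+For example, reusing the "mood_summary" template from above (the
+numbers are arbitrary):
+
+    \[[!report template="mood_summary"
+        pages="blog/*"
+        per_page=10
+        first_page_is_index=1]]
+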
+### Advanced Options
+
+The following options are used to improve efficiency when dealing
+with large numbers of pages; most people probably won't need them.
+
+**maketrail**:
+
+Make a trail; if true, then this report is called in "scan" mode and the
+pages which match the pagespec are added to the list of links from this
+page. This can be used by *another* report by setting this page to be a
+"trail" page in *that* report.
+
+It is not possible to use "trail" and "maketrail" at the same time.
+By default, "maketrail" is false.
+
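+For example, one report could build the trail and a second report could
+use it (the page and template names here are illustrative):
+
+    \[[!report template="booklist" pages="books/*" maketrail=1]]
+
+and then, on another page:
+
+    \[[!report template="booksummary" trail="index" sort="title"]]
+
+where "index" is the page containing the first report.
+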
+## TEMPLATE PARAMETERS
+
+The templates are in HTML::Template format, just as [[plugins/template]] and
+[[ftemplate]] are. The parameters passed in to the template are as follows:
+
+### Fields
+
+The structured data from the current matching page. This includes
+"title" and "description" if they are defined.
+
+### Common values
+
+Values known for all pages:
+
+* page (the current page)
+* destpage (the destination page)
+* basename (the base name of the page)
+* recno (N if the page is the Nth page in the report)
+
+### Prev_Page And Next_Page
+
+The "prev_page" and "next_page" variables give the previous and the
+next page among the matching pages.
+This is mainly useful for a "here_only" report.
+
+### Passed-in values
+
+Any additional parameters to the report directive are passed to the
+template; a parameter will override the matching "field" value.
+For example, if you have a "Mood" field, and you pass Mood="bad" to
+the report, then that will be the Mood which is given for the whole
+report.
+
+Generally this is useful if one wishes to make a more generic
+template and hide or show portions of it depending on what
+values are passed in the report directive call.
+
+For example, one could have a "hide_mood" parameter which would hide
+the "Mood" section of your template when it is true, which one could
+use when the Mood is one of the headers.
+
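+For example, a template fragment honouring such a parameter might look
+like this (a sketch; "hide_mood" is just a convention between the
+directive call and the template):
+
+    <TMPL_UNLESS NAME="hide_mood">
+    Mood: <TMPL_VAR NAME="MOOD">
+    </TMPL_UNLESS>
+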
+### Headers
+
+See the section on Headers.
+
+### First and Last
+
+If this is the first page-record in the report, then "first" is true.
+If this is the last page-record in the report, then "last" is true.
diff --git a/doc/plugins/contrib/sar.mdwn b/doc/plugins/contrib/sar.mdwn
new file mode 100644
index 000000000..77c41a955
--- /dev/null
+++ b/doc/plugins/contrib/sar.mdwn
@@ -0,0 +1,109 @@
+[[!template id=plugin name=sar author="[[VictorMoral]]"]]
+[[!tag type/chrome type/slow ]]
+
+The `sar` plugin performs global or local search-and-replace operations
+using common or specific terms.
+
+The characteristics are:
+
+- Support for a global dictionary page (optional but recommended).
+- The first occurrence of a term can be replaced with one text and the
+rest with another.
+
+The global dictionary page is like this:
+
+ ## Sites and projects
+
+ - [[!sar search="ikiwiki" first="[IkiWiki](http://ikiwiki.info)" next="_IkiWiki_"]]
+ - [[!sar search="debian" first="[Debian](http://debian.org)" next="_Debian_"]]
+ - [[!sar search="perl" first="[Perl](http://perl.org)" next="_Perl_"]]
+ - [[!sar search="linux" replace="GNU/Linux"]]
+
+ ## Persons
+ - [[!sar search="joey" first="[Joey Hess](http://ikiwiki.info/users/joey)" next="_Joey_" ]]
+ - [[!sar search="angel" first="[Angel](http://triptico.com)" next="Angel"]]
+
+ ## Technical terms
+
+ - [[!sar search="smtp" first="\[[!wp SMTP]]" next="‘SMTP‘"]]
+ - [[!sar search="pop3" first="\[[!wp POP3]]" next="’POP3’"]]
+
+The search expressions must be surrounded by double dashes in a source ikiwiki
+page, like this:
+
+ Mis programas están escritos en lenguaje --perl--, funcionando con el
+ sistema --debian--, y mis páginas web funcionan con --ikiwiki-- cuyo autor
+ es --joey--.
+
+ --ikiwiki-- es un buen software.
+
+After a filter operation the content is:
+
+ Mis programas están escritos en lenguaje [Perl](http://perl.org),
+ funcionando con el sistema [Debian](http://debian.org), y mis páginas web
+ funcionan con [IkiWiki](http://ikiwiki.info) cuyo autor es [Joey
+ Hess](http://ikiwiki.info/users/joey).
+
+ _IkiWiki_ es un buen software.
+
+_Note_: I chose this syntax because it doesn't clash with markdown and it is easy to write.
+
+A _search and replace_ directive has the following parameters:
+
+- `search`: the text to search for.
+- `first`: the replacement text for the first match.
+- `next`: the replacement text for all matches except the first.
+- `replace`: the replacement text for all matches.
+
+The code is now used on my site without problems, and the author would
+appreciate any help with its development or its English.
+
+## Configuration
+
+The plugin needs the following global values:
+
+- `sar_mainpage`: defines the global dictionary page. The default value is `sar`.
+- `sar_pagespec`: enables the plugin for a selection of pages. The default
+value is `*`, but a recommended value is `link(tag/sar)`.
+
+## Synopsis
+
+In an ikiwiki source page we can write this
+
+ \[[!sar search=debian replace="__Debian__"]]
+
+to define a global replacement for the term `--debian--`, or
+
+ \[[!sar search=ibm first="[IBM](http://www.ibm.com)"
+ next="_IBM_"]]
+
+to define a replacement for the first match of the string `--ibm--` and a
+different replacement for the rest.
+
+## Changelog
+
+### version 0.8
+
+- First functional version with the new sar expressions.
+
+### version 0.7
+
+- New design for the search expressions.
+
+### version 0.6
+
+- Minor bugfixes in the pages selection.
+- Call to add_depends() for every page filtered
+
+### version 0.5
+
+- This is the first functional version.
+
+## Download
+
+The module can be downloaded from:
+
+- [My personal site](http://taquiones.net/files/misc)
+- [My personal Debian repository](http://taquiones.net/files/debian)
+
+
diff --git a/doc/plugins/contrib/screenplay.pm.mdwn b/doc/plugins/contrib/screenplay.pm.mdwn
new file mode 100644
index 000000000..5ff082da5
--- /dev/null
+++ b/doc/plugins/contrib/screenplay.pm.mdwn
@@ -0,0 +1,320 @@
+This plugin works for me. It follows the standard for a movie screenplay pretty closely; I am not aware of any errors in format. Please let me know if you find any.
+
+Right now all it does is display your pages properly in a web browser. What I would like to add is the ability to output a file that could easily be printed once the screenplay is finished. We keep all the scenes we work on in one folder, and eventually we will want to print a script out of that folder. It would be great if an up-to-date PDF or TXT script could be put in the folder when a scene is saved. I will do it; it just isn't a priority yet.
+
+I am not a published writer and not an authority on script formatting. I got what I know out of a book.
+
+Briefly, you type a command on a line, like ".d", then on the next line (for the dialog command) you type a person's name. Then you hit return again and write the words he is supposed to speak out all on one line. When you save your document this simple text will become a properly formatted script.
+
+Thank you Joey for having me here.
+
+### Headings:
+ Most headings should begin with a transition. The list of valid commands is:
+ .fi => FADE IN: a gradual transition from a solid color to an image
+ .fo => FADE OUT.
+ .ftb => FADE TO BLACK.
+ .ftw => FADE TO WHITE.
+ .ct => CUT TO: indicates an instantaneous shift from one shot to the next
+ .shot => lack of an explicit transition assumes a cut
+ .hct => HARD CUT TO: describes a jarring transition
+ .qct => QUICK CUT TO: describes a cut sooner than expected
+ .tct => TIME CUT TO: emphasizes time passing
+ .mct => MATCH CUT TO: image in first shot visually or thematically matches image in second
+ .dt => DISSOLVE TO: gradual transition from image to another implies passage of time.
+ .rdt => RIPPLE DISSOLVE TO: indicates transition into daydream or imagination
+ .wt => WIPE TO: new image slides over top of last one
+
+ Example transition:
+
+ .fi (or any transition command) <= Writes a transition line, except .shot which omits it.
+ type shot heading here <= this line will be capitalized
+ First direction. <= these lines are not capitalized.
+ Second direction.
+ Third direction, etc...
+
+ Direction without a shot heading:
+ .dir
+ First direction.
+ Second direction.
+ Third direction, etc...
+
+ Some items aren't implemented in dialogue yet:
+ 1) you must watch that you don't leave a " -- " dangling on a line by itself,
+ instead, carry the last word onto the line with a dash
+ 2) observe lyrical line endings in dialogue by indenting wrapped lines by two spaces
+ 3) you must watch that the four line limit for parenthetical direction is not exceeded
+
+ Example dialogue:
+
+ .d
+ char name <= this line will be capitalized
+ this is what he's saying <= Dialogue
+ raises hand to wave <= Parenthetical direction
+ this is more of what he's saying <= Dialogue
+ this is going to be in parenthesis <= Parenthetical direction
+ this is more of what he's saying, etc... <= Dialogue
+
+ .note
+ Allows you to add a temporary note to a script without getting an error.
+ All notes need to be removed eventually because they are a format violation.
+
+
+
+### Installation
+
+Name this file `screenplay.pm` and put it in your plugin folder. Then you need to add the plugin to your ikiwiki setup file.
+
+ #!/usr/bin/perl
+ # Screenplay markup language
+ package IkiWiki::Plugin::screenplay;
+
+ use warnings;
+ use strict;
+ use IkiWiki 3.00;
+ use Text::Format;
+ use Log::Log4perl qw(:easy);
+ Log::Log4perl->easy_init($INFO);
+ #Log::Log4perl->easy_init($ERROR);
+
+ sub import {
+ hook(type => "getsetup", id => "screenplay", call => \&getsetup);
+ hook(type => "htmlize", id => "screenplay", call => \&htmlize, longname => "Screenplay");
+ }
+
+ sub getsetup () {
+ return
+ plugin => {
+ safe => 1,
+ rebuild => 1, # format plugin
+ section => "format",
+ },
+ }
+
+ sub htmlize (@) {
+ #set up variables and fill with defaults
+ my %params=@_;
+ my $content = $params{content};
+ my @lines = split(/\r\n|\r|\n/, $content);
+ my @chunk;
+ my @formatted;
+ my $current_line = shift(@lines);
+ my $current_command = "";
+ my $current_chunk = "";
+
+ while (scalar(@lines) > 0) {
+ until ( &dot_command($current_line) || scalar(@lines) == 0 ) {
+ #skip spaces; mark bad lines
+ unless ( &blank_line($current_line) ) {
+ push(@formatted, "<br />");
+ push(@formatted, &no_command($current_line));
+ }
+ $current_line = shift(@lines);
+ }
+
+ #Exit while loop if we're out of lines
+ last if (scalar(@lines) == 0);
+
+ #set command for chunk
+ $current_command = $current_line;
+ $current_line = shift(@lines);
+
+ #get chunk, i.e. all text up to next blank line or a dot command.
+ until (substr($current_line,0,1) eq '.' || $current_line =~ m/^\s*$/) {
+ push(@chunk,$current_line);
+ $current_line = shift(@lines);
+ last unless defined $current_line;
+ }
+
+ #Start with a blank line unless unneeded.
+ if (scalar(@formatted) > 0 ) {
+ push(@formatted, "<br />");
+ }
+
+ #remaining lines are not commands.
+ if (scalar(@chunk)) {
+ $current_chunk = shift(@chunk);
+ if ($current_command eq ".shot") {
+ push(@formatted, &indent(&chunk(uc($current_chunk),57),17));
+ while (scalar(@chunk)) {
+ $current_chunk = shift(@chunk);
+ push(@formatted, "<br />");
+ push(@formatted, &indent(&chunk($current_chunk,57),17));
+ }
+
+ } elsif ($current_command eq ".note") {
+ push(@formatted, "NOTE:<br />");
+ push(@formatted, &chunk($current_chunk,75));
+ while (scalar(@chunk)) {
+ $current_chunk = shift(@chunk);
+ push(@formatted, "<br />");
+ push(@formatted, &chunk($current_chunk,75));
+ }
+
+ } elsif ($current_command eq ".dir") {
+ push(@formatted, &indent(&chunk($current_chunk,57),17));
+ while (scalar(@chunk)) {
+ $current_chunk = shift(@chunk);
+ push(@formatted, "<br />");
+ push(@formatted, &indent(&chunk($current_chunk,57),17));
+ }
+
+ } elsif ($current_command eq ".d") {
+ push(@formatted, &indent(&chunk(uc($current_chunk),32),41));
+ $current_chunk = shift(@chunk);
+ push(@formatted, &indent(&chunk($current_chunk,34),27));
+ while (scalar(@chunk) / 2 >= 1 ) {
+ $current_chunk = shift(@chunk);
+ push(@formatted, &indent(&chunk(&pd($current_chunk),19),34));
+ $current_chunk = shift(@chunk);
+ push(@formatted, &indent(&chunk($current_chunk,34),27));
+ }
+
+ } elsif ($current_command eq ".pd") {
+ push(@formatted, &indent(&chunk(uc($current_chunk),32),41));
+ $current_chunk = shift(@chunk);
+ push(@formatted, &indent(&chunk(&pd($current_chunk),19),34));
+ $current_chunk = shift(@chunk);
+ push(@formatted, &indent(&chunk($current_chunk,34),27));
+ while (scalar(@chunk) / 2 >= 1 ) {
+ $current_chunk = shift(@chunk);
+ push(@formatted, &indent(&chunk(&pd($current_chunk),19),34));
+ $current_chunk = shift(@chunk);
+ push(@formatted, &indent(&chunk($current_chunk,34),27));
+ }
+
+ } elsif ($current_command =~ m/^\.(fi|fo|ct|hct|qct|tct|mct|dt|rdt|wt)$/) {
+ if ($current_command eq ".fi") {
+ push(@formatted, &indent(&chunk(uc("FADE IN:"),20),17));
+ } elsif ($current_command eq ".fo") {
+ push(@formatted, &indent(&chunk(uc("FADE OUT:"),20),60));
+ } elsif ($current_command eq ".ct") {
+ push(@formatted, &indent(&chunk(uc("CUT TO:"),20),60));
+ } elsif ($current_command eq ".hct") {
+ push(@formatted, &indent(&chunk(uc("HARD CUT TO:"),20),60));
+ } elsif ($current_command eq ".qct") {
+ push(@formatted, &indent(&chunk(uc("QUICK CUT TO:"),20),60));
+ } elsif ($current_command eq ".tct") {
+ push(@formatted, &indent(&chunk(uc("TIME CUT TO:"),20),60));
+ } elsif ($current_command eq ".mct") {
+ push(@formatted, &indent(&chunk(uc("MATCH CUT TO:"),20),60));
+ } elsif ($current_command eq ".dt") {
+ push(@formatted, &indent(&chunk(uc("DISSOLVE TO:"),20),60));
+ } elsif ($current_command eq ".rdt") {
+ push(@formatted, &indent(&chunk(uc("RIPPLE DISSOLVE TO:"),20),60));
+ } elsif ($current_command eq ".wt") {
+ push(@formatted, &indent(&chunk(uc("WIPE TO:"),20),60));
+ }
+ push(@formatted, &indent(&chunk(uc($current_chunk),57),17));
+ while (scalar(@chunk)) {
+ $current_chunk = shift(@chunk);
+ push(@formatted, "<br />");
+ push(@formatted, &indent(&chunk($current_chunk,57),17));
+ }
+
+ }
+ #mark the rest of the chunk as 'no command'
+ if (scalar(@chunk)) {
+ $current_chunk = shift(@chunk);
+ push(@formatted, &no_command($current_chunk));
+ }
+
+ }
+ }
+ my @content;
+ my $i = 0;
+ $current_line = "";
+ while (scalar(@formatted)) {
+ $i++;
+ $current_line = shift(@formatted);
+ if ( $i % 60 == 0 ) {
+ push(@content, &indent($i/60 . ".<br />",72) );
+ }
+ push(@content, $current_line);
+ }
+ $content = join("\r\n",@content);
+ return $content;
+ }
+
+ sub blank_line {
+ my $line = shift(@_);
+ my $ret = 0;
+
+ if ($line =~ m/^\s*$/) {
+ $ret = 1;
+ } else {
+ $ret = 0;
+ }
+
+ return $ret;
+ }
+
+ sub chunk () {
+ my $unchunked = shift(@_);
+ my $columns = shift(@_);
+ my $text = new Text::Format;
+ $text->rightFill(1);
+ $text->columns($columns);
+ $text->firstIndent(0);
+ $text->tabstop(0);
+ $text->extraSpace(1);
+ my @chunked = split /\n/, $text->format($unchunked);
+ my @formatted;
+ foreach (@chunked) {
+ push(@formatted, $_ . "<br />");
+ }
+ return @formatted;
+ }
+
+ sub dot_command {
+ my $line = shift(@_);
+ my $ret = 0;
+
+ if ($line =~ m/^\.(ct|dir|dt|d|fi|fo|hct|mct|note|pd|qct|rdt|shot|tct|wt)$/) {
+ $ret = 1;
+ } else {
+ $ret = 0;
+ }
+
+ return $ret;
+ }
+
+ sub indent () {
+ my @unindented = @_;
+ my $spaces = pop @unindented;
+ my @indented;
+ foreach (@unindented) {
+ push(@indented, "&nbsp;" x $spaces . $_);
+ }
+ return @indented;
+ }
+
+ sub no_command () {
+ my $line = shift(@_);
+ my $text = new Text::Format;
+ $text->rightFill(1);
+ $text->columns(68);
+ $text->firstIndent(0);
+ $text->tabstop(0);
+ $text->extraSpace(1);
+ my @chunked = split /\n/, $text->format($line);
+ my @formatted;
+ push(@formatted, ("NO COMMAND: "));
+ foreach (@chunked) {
+ push(@formatted, ( $_ . "<br />" ));
+ }
+ return @formatted;
+ }
+
+ sub pd () {
+ my @chunk = @_;
+ # add '(' to top item
+ my $line = "(" . shift(@chunk);
+ unshift(@chunk, $line);
+
+ # add ')' to bottom item
+ $line = pop(@chunk) . ")";
+ push(@chunk, $line);
+
+ return @chunk;
+ }
+
+ 1
+
diff --git a/doc/plugins/contrib/siterel2pagerel.mdwn b/doc/plugins/contrib/siterel2pagerel.mdwn
new file mode 100644
index 000000000..9b09657bf
--- /dev/null
+++ b/doc/plugins/contrib/siterel2pagerel.mdwn
@@ -0,0 +1,30 @@
+[[!template id=plugin name=siterel2pagerel author="[[PaulWise]]"]]
+
+This is a simple plugin to convert all site-relative links to page-relative
+links (converts /foo into ../../../foo or similar). It works as a
+postprocessing filter, allowing it to handle mdwn, wiki, html, rst and any
+other format that produces html. The code is available here:
+
+ #!/usr/bin/perl
+ # quick HTML siterel2pagerel link hack by Paul Wise
+ package IkiWiki::Plugin::siterel2pagerel;
+
+ use warnings;
+ use strict;
+ use IkiWiki 2.00;
+
+ sub import {
+ hook(type => "sanitize", id => "siterel2pagerel", call => \&siterel2pagerel);
+ }
+
+ sub siterel2pagerel (@) {
+ my %params=@_;
+ my $baseurl=IkiWiki::baseurl($params{page});
+ my $content=$params{content};
+ $content=~s/(<a(?:\s+(?:class|id)\s*="?\w+"?)?)\s+href=\s*"\/([^"]*)"/$1 href="$baseurl$2"/mig;
+ $content=~s/(<img(?:\s+(?:class|id|width|height)\s*="?\w+"?)*)\s+src=\s*"\/([^"]*)"/$1 src="$baseurl$2"/mig;
+ # FIXME: do <script and everything else that can have URLs in it
+ return $content;
+ }
+
+ 1
diff --git a/doc/plugins/contrib/sourcehighlight.mdwn b/doc/plugins/contrib/sourcehighlight.mdwn
new file mode 100644
index 000000000..07ac2086f
--- /dev/null
+++ b/doc/plugins/contrib/sourcehighlight.mdwn
@@ -0,0 +1,30 @@
+[[!template id=plugin name=sourcehighlight core=0 author="[[DavidBremner]]"]]
+
+I noticed several places in the wiki talking about similar ideas, so I decided to put a page here to point to what I am working on.
+
+I have implemented a simple wrapper around
+ [source-highlight](http://www.gnu.org/software/src-highlite/). You can find the latest version in
+[git](http://pivot.cs.unb.ca/git?p=ikiplugins.git;a=blob_plain;f=IkiWiki/Plugin/sourcehighlight.pm;hb=HEAD).
+You must specify `highlight_lang=>"foo,bar"` in your setup file,
+where foo and bar are the (source-highlight supported) languages you
+want to highlight.
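+
+For example, a setup file fragment might look like this (the language
+names here are illustrative; use whichever languages your
+source-highlight installation supports):
+
+    add_plugins => [qw{sourcehighlight}],
+    highlight_lang => "c,perl",
+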
+### Issues
+
+- I would like to have a link to the raw source; using will_render() and then copying the file should work.
+
+> You might also like to look at the [[todo/source_link]] todo. -- [[Will]]
+
+- Is there a way to configure the colors used by source-highlight (other than editing the globally installed "default.style" file)? It would help if I could pass the command arbitrary command-line arguments; then I could configure which config file it's supposed to use. For instance, I'm not a fan of hard-coding the colors into the HTML output. IMHO, css-style formatting should be preferred. All that can be set via the command line ... --Peter
+
+> I don't really have time right now, but it should be easy to add, if you look at how src-lang is handled. Patches are welcome :-) --[[DavidBremner]]
+
+Note that [[Will]] wrote a plugin that uses source-highlight also. It's
+available
+[[here|todo/automatic_use_of_syntax_plugin_on_source_code_files/discussion]].
+--[[Joey]]
+
+To be honest, [[Will]]'s version of this looks more polished. I will try his
+plugin and see if it can just replace mine. --[[DavidBremner]]
+
+
+*Updated*: now uses `keepextension`, so multiple extensions should be OK.
diff --git a/doc/plugins/contrib/syntax.mdwn b/doc/plugins/contrib/syntax.mdwn
new file mode 100644
index 000000000..da4213000
--- /dev/null
+++ b/doc/plugins/contrib/syntax.mdwn
@@ -0,0 +1,65 @@
+[[!template id=plugin name=syntax author="[[VictorMoral]]"]]
+[[!tag type/chrome type/slow]]
+
+The `syntax` plugin adds support to ikiwiki for syntax highlighting through the *vim* editor and its perl interface [[!cpan Text::VimColor]]. It depends on a functional vim installation.
+
+The plugin inserts a fragment of HTML with special marks from a file or a string text. It accepts the following parameters:
+
+* **type** (optional): the file type for vim syntax highlighting. It can be omitted if the *file* parameter is given.
+* **file**: path to the source file. It must exist on every rebuild of the wiki.
+* **text**: a text string with the source.
+* **description** (optional): a short description of the content.
+* **linenumbers** (optional): enables line numbering of the source; a value greater than zero sets the first line number.
+
+The params *file* and *text* are mutually exclusive.
+
+When the *file* parameter is used, `syntax` builds an HTML link for direct download.
+
+Example:
+
+ \[[!syntax type="perl" text="""
+ #!/usr/bin/perl
+
+ my $a = "World";
+ print "Hello, ${a}\n";
+ """]]
+
+or
+
+ \[[!syntax file="/examples/hello.pl" description="My first perl program"]]
+
+This plugin creates the following CSS styles:
+
+* syntax
+* synComment
+* synConstant
+* synIdentifier
+* synPreProc
+* synType
+* synSpecial
+* synUnderlined
+* synError
+* synTodo
+* synTitle
+
+It can be downloaded from [here](http://taquiones.net/files/misc/) or through my personal debian repository at <http://taquiones.net/files/debian/>. There is a page with examples: <http://taquiones.net/software/syntax-examples.html>
+
+_**NOTE:** all the above links are broken_
+
+Any help, comments or critics are welcome at <victor@taquiones.net>.
+
+## version 0.9
+
+- Add a force_subpage parameter for link build
+- Fix a bug in syntax page link
+- Documented a bug with markdown indented text
+- Documented the syntax directive
+
+## version 0.7
+
+- Version change to GPL
+- Add *linenumbers* parameter
+- The *file* parameter should point to an ikiwiki source page.
+- The *description* parameter will be turned into a link if the *file* parameter exists.
+
+I need help for debugging this module. Thanks in advance.
diff --git a/doc/plugins/contrib/syntax/discussion.mdwn b/doc/plugins/contrib/syntax/discussion.mdwn
new file mode 100644
index 000000000..af6c07aa5
--- /dev/null
+++ b/doc/plugins/contrib/syntax/discussion.mdwn
@@ -0,0 +1,23 @@
+I'd like to include this in ikiwiki. Using vim for syntax highlighting is
+surprising to me, but it seems to work great. Would it be possible to
+license it the same as the rest of ikiwiki (GPL) instead of dragging in the
+perl license?
+
+> Yes, no problem. I'm writing the next version. --[[VictorMoral]]
+
+Text::VimColor will need to be added to Debian.
+
+It looks to me like the file parameter is a security hole, since it allows
+inclusion of arbitrary files into the wiki, including ones outside of the
+wiki source tree. I think this option should either be removed, or be
+limited to reading files inside the wiki source tree. If it's retained it
+should also add an appropriate dependency on the included file.
+
+> You are right, Joey. I didn't think of it because I don't use the CGI mode. :-) I'm working on it. --[[VictorMoral]]
+
+--[[Joey]]
+
+> It looks like the author of Text::VimColor has already made a Debian package. I've
+> contacted him, but no answer back yet. --[[Roktas]]
+
+>>Meanwhile i've got a debian package for Text::VimColor [in my repository](http://taquiones.net/files/debian/). --[[VictorMoral]]
diff --git a/doc/plugins/contrib/tex4ht.mdwn b/doc/plugins/contrib/tex4ht.mdwn
new file mode 100644
index 000000000..bee18d96f
--- /dev/null
+++ b/doc/plugins/contrib/tex4ht.mdwn
@@ -0,0 +1,15 @@
+[[!template id=plugin name=tex4ht core=0 author="[[DavidBremner]]"]]
+
+I have written a simple wrapper around tex4ht to convert tex files to html. This is slow, and currently noisy. I do not recommend it for running from cgi. But for interactive conversion of
+my old tex4ht based home page, it seems to work OK.
+
+The current version is available from
+[git](http://pivot.cs.unb.ca/git?p=ikiplugins.git;a=blob_plain;f=IkiWiki/Plugin/tex4ht.pm;hb=HEAD)
+
+### Other related ideas/plugins:
+
+- [[todo/latex]] There is work in progress at converting snippets of latex. No idea how the hybrid approach of tex4ht (part font, part bitmaps) compares to the [[todo/latex]] approach.
+
+- pandoc can also convert latex to html or markdown. It is much faster than tex4ht; on the other hand, the rendering quality is not quite as good, and pandoc does not understand user defined TeX macros.
+
+[[!tag type/slow]]
diff --git a/doc/plugins/contrib/texinfo.mdwn b/doc/plugins/contrib/texinfo.mdwn
new file mode 100644
index 000000000..b6a6c4bf3
--- /dev/null
+++ b/doc/plugins/contrib/texinfo.mdwn
@@ -0,0 +1,122 @@
+[[!template id=plugin name=texinfo author="[[tschwinge]]"]]
+
+[[I|tschwinge]] started writing a plugin to render
+[GNU Texinfo](http://www.gnu.org/software/texinfo/)
+inside the ikiwiki environment.
+
+This plugin is not necessarily meant to enable people to write arbitrary
+wiki pages in the Texinfo format (even though that is possible, of course),
+but rather to ease collaboration on existing Texinfo documents.
+
+The plugin is available at
+<http://git.savannah.gnu.org/cgit/hurd/web.git/plain/.library/IkiWiki/Plugin/texinfo.pm>.
+
+It's very basic at the moment, but will be improved over time.
+
+It also has not really been audited for any security issues.
+
+
+# Issues
+
+## How can I use verbatiminclude?
+
+I can only post a file ...
+
+## N-to-M Mapping of Input and Output Files
+
+Conventional ikiwiki [[*htmlize*ing|plugins/write#index6h3]] plugins
+have a one-to-one mapping of input file and output file:
+`some/where/page.mdwn` is rendered to `some/where/page.html`.
+This can also be achieved for Texinfo files, but is somewhat
+unusual there, when rendering them to HTML. In general, there
+is an N-to-M mapping:
+
+* N Texinfo input files: a main `.texi` file,
+ several helper files (`fdl.texi`, `version.texi`, ...), and
+ additional text files which are included from the main `.texi`
+ file, e.g. `history.texi`, `libfoo.texi`, `libbar.texi`. --[[tschwinge]]
+
+> As far as multiple input files, you'd need to use add_depends()
+> to let ikiwiki know that a change to any of those files should cause a
+> rebuild of the "main" file. --[[Joey]]
+
+>> (?) I'll see about a frob to get `makeinfo` to provide me with a list of additional files
+>> it used for rendering a given `.texi` file. --[[tschwinge]]
+
+> I guess you'd also have to somehow deal with
+> it wanting to render pages for each of the helper files. Not quite sure
+> what the best way would be to avoid that. --[[Joey]]
+
+>> Might it be an option to simply not render the pages that are already
+>> being used as an `include` file for another `.texi` file?
+>> But how to assemble that list before actually having rendered all `.texi` files?
+>> One possibility might be to already render them at ikiwiki's *scanning* stage and
+>> store the rendered HTML files into temporary directories, and then at ikiwiki's
+>> *rendering* stage simply install the desired ones into the main tree and discard
+>> the others. --[[tschwinge]]
+
+* M Texinfo output files: the main `.texi` file (which `include`s
+ the other input files) is usually rendered into a (flat) hierarchy
+ of HTML files, one file per node, see the table on
+ <http://www.gnu.org/software/texinfo/manual/texinfo/html_node/#Top>
+ for an example. --[[tschwinge]]
+
+> Ikiwiki is perfectly happy with a page creating other files (see eg, the
+> img and teximg plugins, as well as the inline plugin's rss generation).
+> The will_render() function supports that.
+>
+> What hasn't been done though is a page creating more than one other _page_.
+> Perhaps you could call IkiWiki::genpage by hand for each additional page.
+> You might also want to manipulate each data structure that tracks info about
+> pages, adding the additional pages to them, so that they're first class
+> pages that work as pages everywhere in ikiwiki (ie, can be inlined,
+> appear in a site map, be linked to, etc). Not sure how to do that,
+> and perhaps you could get away without doing it actually. --[[Joey]]
+
+>> Currently I use `makeinfo --no-split` and render to stdout, so that I can
+>> easily capture the output and stuff it into the appropriate ikiwiki data structure.
+>> If we want to have multiple output files (which we'll eventually want to have,
+>> to avoid having such large single-file outputs), we won't be able to
+>> do this anymore.
+>> (?) Then we'll need a way to find the main output file, which
+>> will be the one to be copied into what ikiwiki expects to be the main output
+>> of the rendered `.texi` file.
+>> Perhaps (again) parse the `.texi` file for a `@setfilename` statement?
+>> The other generated files will also have to be
+>> copied somewhere (preferably into a subdirectory named like the main file,
+>> to avoid namespace collisions; but links between the files need to be taken care of then)
+>> and need to be registered within the ikiwiki system.
+>> --[[tschwinge]]
+
+There needs to be some logic to establish a mapping between the *N* input files
+and the *M* output files.
+(At least for web-editing via CGI this is needed: ikiwiki (currently) needs to be able
+to deduce *one* input file from a given output file.)
+The easiest would be either to have *N = 1*
+(plus perhaps some input files that are not meant to be editable, like `gpl.texi`)
+or to have
+*M = N* and have a (?) one-to-one mapping between *input file n* and *output file m*
+(which is not possible in Texinfo's `makeinfo` at the moment).
+--[[tschwinge]]
+
+
+## `makeinfo` Output
+
+`makeinfo --html` is being used for rendering. It creates stand-alone
+HTML files, while ikiwiki only needs the files' `<body>`s.
+
+(?) One possibility (which is what I'm doing at the moment) is to simply cut away
+everything before `<body>` and everything after `</body>`. --[[tschwinge]]
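
A minimal sketch of that cut-away step (the function name and the single-`<body>`-element assumption are mine, not part of the actual plugin):

```perl
# Hypothetical sketch: keep only the contents of <body>...</body>
# from makeinfo --html output. Assumes one well-formed body element.
use strict;
use warnings;

sub extract_body {
    my ($html) = @_;
    # Capture everything between the opening <body ...> tag and </body>.
    return $1 if $html =~ m{<body[^>]*>(.*)</body>}s;
    return $html;    # no body found: fall back to the whole input
}
```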
+
+
+# Bugs
+
+## Non-functional Texinfo Commands
+
+These commands are known not to work currently:
+
+* `@printindex`
+* `@shortcontents`
+* `@contents`
+
+This is due to `makeinfo` not providing this functionality when rendering to stdout.
diff --git a/doc/plugins/contrib/tracking.mdwn b/doc/plugins/contrib/tracking.mdwn
new file mode 100644
index 000000000..06d4120cd
--- /dev/null
+++ b/doc/plugins/contrib/tracking.mdwn
@@ -0,0 +1,30 @@
+[[!template id=plugin name=tracking author="[[BerndZeimetz]]"]]
+[[!toc]]
+[[!tag plugins]] [[!tag patch]] [[!tag wishlist]]
+
+## NAME
+
+IkiWiki::Plugin::tracking - enable google/piwik visitor tracking
+
+## SYNOPSIS
+
+ # activate the plugin
+ add_plugins => [qw{goodstuff tracking ....}],
+
+ # to use Piwik:
+ piwik_id => '1',
+ piwik_https_url => "https://ssl.example.com/piwik/",
+ piwik_http_url => "http://www.example.com/piwik/",
+
+ # to use Google Analytics:
+ google_analytics_id => "UA-xxxxxx-x"
+
+## DESCRIPTION
+
+This plugin includes the necessary tracking codes for Piwik and/or Google Analytics on all pages. Tracking codes will only be included if the necessary config options are set. The plugin could be enhanced to support goals/profiles and similar things, but I do not plan to do so.
+
+## DOWNLOAD
+
+* single files: [tracking.pm](http://git.recluse.de/?p=users/bzed/ikiwiki.git;a=blob;f=IkiWiki/Plugin/tracking.pm;hb=refs/heads/tracking) [piwik.tmpl](http://git.recluse.de/?p=users/bzed/ikiwiki.git;a=blob;f=templates/piwik.tmpl;hb=refs/heads/tracking) [google_analytics.tmpl](http://git.recluse.de/?p=users/bzed/ikiwiki.git;a=blob;f=templates/google_analytics.tmpl;hb=refs/heads/tracking)
+* browse repository: <http://git.recluse.de/?p=users/bzed/ikiwiki.git;a=shortlog;h=refs/heads/tracking>
+* git repo: `git://git.recluse.de/users/bzed/ikiwiki.git` or <http://git.recluse.de/repos/users/bzed/ikiwiki.git> (Use the tracking branch)
diff --git a/doc/plugins/contrib/unixauth.mdwn b/doc/plugins/contrib/unixauth.mdwn
new file mode 100644
index 000000000..c97312b59
--- /dev/null
+++ b/doc/plugins/contrib/unixauth.mdwn
@@ -0,0 +1,21 @@
+[[!template id=plugin name=unixauth core=0 author="[[schmonz]]"]]
+[[!tag type/auth]]
+
+[[!template id=gitbranch branch=unixauth author="[[schmonz]]"]]
+
+This plugin authenticates users against the Unix user database. It presents a similar UI to [[plugins/passwordauth]], but simpler, as there's no need to be able to register or change one's password.
+
+To authenticate, either [checkpassword](http://cr.yp.to/checkpwd.html) or [pwauth](http://www.unixpapa.com/pwauth/) must be installed and configured. `checkpassword` is strongly preferred. If your web server runs as an unprivileged user -- as it darn well should! -- then `checkpassword` needs to be setuid root. (Or your ikiwiki CGI wrapper, I guess, but don't do that.) Other checkpassword implementations are available, notably [checkpassword-pam](http://checkpasswd-pam.sourceforge.net/).
+
+Config variables that affect the behavior of `unixauth`:
+
+* `unixauth_type`: defaults to unset, can be "checkpassword" or "pwauth"
+* `unixauth_command`: defaults to unset, should contain the full path and any arguments
+* `unixauth_requiressl`: defaults to 1, can be 0
+* `sslcookie`: needs to be 1 if `unixauth_requiressl` is 1 (perhaps this should be done automatically?)
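
Putting these together, a hypothetical setup-file fragment might look like the following (the `checkpassword` path is an illustrative assumption, not a recommendation):

```perl
# illustrative ikiwiki setup fragment -- the command path is an assumption
unixauth_type => 'checkpassword',
unixauth_command => '/usr/local/bin/checkpassword',
unixauth_requiressl => 1,
sslcookie => 1,
```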
+
+__Security__: [As with passwordauth](/security/#index14h2), be wary of sending usernames and passwords in cleartext. Unlike passwordauth, sniffing `unixauth` credentials can get an attacker much further than mere wiki access. Therefore, this plugin defaults to not even _displaying_ the login form fields unless we're running under SSL. Nobody should be able to do anything remotely dumb until the admin has done at least a little thinking. After that, dumb things are always possible. ;-)
+
+`unixauth` needs the `HTTPS` environment variable, available in ikiwiki 2.67 or later (fixed in #[502047](http://bugs.debian.org/502047)), without which it fails closed.
+
+The plugin has not been tested with newer versions of ikiwiki. [[schmonz]] hopes to have time to polish this plugin soon.
diff --git a/doc/plugins/contrib/unixauth/discussion.mdwn b/doc/plugins/contrib/unixauth/discussion.mdwn
new file mode 100644
index 000000000..232649863
--- /dev/null
+++ b/doc/plugins/contrib/unixauth/discussion.mdwn
@@ -0,0 +1,38 @@
+The security of this plugin scares me. As noted in the plugin
+documentation, you basically have to use it with SSL, since snooping on the
+login password doesn't give you an essentially useless account -- it gives
+you an actual account on the machine!
+
+Also, apparently pwauth defers *all* auth attempts if one fails, and it
+does this by using a lock file, and sleeping after a failed auth attempt.
+Which is needed to avoid brute-forcing, since this is a significant
+password.. but how will that interact with ikiwiki? Well, ikiwiki _also_
+uses a lock file. So, at a minimum, someone can not only try to brute-force
+the pwauth password, but the ikiwiki processes that stack up due to that
+will also keep ikiwiki's lock held. Which basically DOSes the wiki for
+everyone else; no one else can try to log in, or log out, or edit a page,
+all of which require taking the lock.
+
+So I don't think I'll be accepting this plugin into ikiwiki itself...
+--[[Joey]]
+
+Thanks for the comments. That's definitely an undesirable interaction between pwauth and ikiwiki; in my current application it wouldn't be a serious problem, but I'd like this plugin to be general-purpose and safe enough for inclusion in ikiwiki. It's the system-users-are-wiki-users idea I'm married to here, not pwauth itself; can you suggest another approach I might take?
+-- [[schmonz]]
+
+> Have you considered using [[plugins/httpauth]] and then the appropriate apache module? There are apache modules like [mod_authnz_external](http://unixpapa.com/mod_auth_external.html) that might help. The advantage of these solutions is that they usually make the security implications explicit. -- Will
+
+Actually, yes. That's how I made sure I had pwauth working to begin with. I'm partial to the form-based approach because I'm not aware of any way to reliably "log out" browsers from HTTP authentication. If that *is* reliably possible, then I worked way too hard for no reason. ;-)
+-- [[schmonz]]
+
+I've added support for [checkpassword](http://cr.yp.to/checkpwd/interface.html), since those generally don't have any rate-limiting cleverness to interfere with ikiwiki's, and made a few other changes. Please check out the plugin docs again and let me know if this is closer to being acceptable.
+-- [[schmonz]]
+
+> I actually think that the rate limiting is a good thing. After all,
+> ikiwiki doesn't do its own login rate limiting. Just need to find a way
+> to disentangle the two locks. --[[Joey]]
+
+>> Ah, ok, I misunderstood your comment. I'll see what I can figure out. --[[schmonz]]
+
+>>> My time's been limited for this, but I just saw [[todo/avoid_thrashing]]. How does that interact with pwauth or checkpassword? --[[schmonz]]
+
+>>>> The DOS still happens, it just uses less memory. --[[Joey]]
diff --git a/doc/plugins/contrib/unixrelpagespec.mdwn b/doc/plugins/contrib/unixrelpagespec.mdwn
new file mode 100644
index 000000000..a35f76c30
--- /dev/null
+++ b/doc/plugins/contrib/unixrelpagespec.mdwn
@@ -0,0 +1,42 @@
+[[!template id=plugin name=unixrelpagespec core=0 author="[[Jogo]]"]]
+
+I don't understand why `./*` corresponds to siblings and not subpages.
+This is probably only meaningful with [[plugins/autoindex]] turned on.
+
+Here is a small plugin which follows the usual Unix convention:
+
+- `./*` expands to subpages
+- `../*` expands to siblings
+
+---
+ #!/usr/bin/perl
+ # UnixRelPageSpec plugin.
+ # by Joseph Boudou <jogo at matabio dot net>
+
+ package IkiWiki::Plugin::unixrelpagespec;
+
+ use warnings;
+ use strict;
+ use IkiWiki 3.00;
+
+ sub import {
+ inject(
+ name => 'IkiWiki::PageSpec::derel',
+ call => \&unix_derel
+ );
+ }
+
+ sub unix_derel ($$) {
+ my $path = shift;
+ my $from = shift;
+
+ if ($path =~ m!^\.{1,2}/!) {
+ $from =~ s#/?[^/]+$## if (defined $from and $path =~ m/^\.{2}/);
+ $path =~ s#^\.{1,2}/##;
+ $path = "$from/$path" if length $from;
+ }
+
+ return $path;
+ }
+
+ 1;
diff --git a/doc/plugins/contrib/video.mdwn b/doc/plugins/contrib/video.mdwn
new file mode 100644
index 000000000..baa0c6500
--- /dev/null
+++ b/doc/plugins/contrib/video.mdwn
@@ -0,0 +1,25 @@
+[[!template id=plugin name=video author="[[Yury Chumak|sphynkx]]"]]
+
+## Video
+
+This plugin provides embedding of video on wiki pages. It uses the simplest embedding method: just the *embed* tag, without any JavaScript.
+
+### Usage
+
+>\[\[\!video width=100 height=100 type="application/x-shockwave-flash" src="/\_jwplayer/player.swf" allowscriptaccess="always" allowfullscreen="true" autostart="false" file="path\_to\_video"\]\]
+
+All parameters are optional except *file*; omitted parameters fall back to the default settings shown in the above example.
+
+*file* is a relative path in the webdir, or a web address (e.g. of a YouTube page).
+
+### Install
+
+Download and unpack the [archive](http://sphynkx.org.ua/progr/videoplug/jw_videoplugin.tar.bz2) in your ikiwiki webdir.
+Or download [JW Player](http://www.longtailvideo.com/players/jw-flv-player/) and the [perl module](http://sphynkx.org.ua/progr/videoplug/video.pm) separately: make a *\_jwplayer* directory and put *player.swf* in it, and put *video.pm* in the *Plugin* directory. In the ikiwiki configuration, enable the plugin:
+
+ add_plugins => [qw{.......... video}]
+
+### Note
+
+[Htmlscrubber](http://ikiwiki.info/plugins/htmlscrubber/) may block the *embed* tag.
+
+If the *embed* tag is present but the video is not playing, check the permissions of the unpacked *player.swf*.
diff --git a/doc/plugins/contrib/video/discussion.mdwn b/doc/plugins/contrib/video/discussion.mdwn
new file mode 100644
index 000000000..577790988
--- /dev/null
+++ b/doc/plugins/contrib/video/discussion.mdwn
@@ -0,0 +1,3 @@
+I'm sure this is useful to its author in his situation, but I have to point
+out that ikiwiki supports the html5 `<video>` tag, and so this is not
+necessary to support any reasonably modern browser. --[[Joey]]
diff --git a/doc/plugins/contrib/wc.mdwn b/doc/plugins/contrib/wc.mdwn
new file mode 100644
index 000000000..fb5a7320e
--- /dev/null
+++ b/doc/plugins/contrib/wc.mdwn
@@ -0,0 +1,22 @@
+[[!template id=plugin name=wc author="[[schmonz]]"]]
+[[!template id=gitbranch branch=schmonz/wc author="[[schmonz]]"]]
+[[!tag type/meta]]
+[[!tag patch]]
+
+This plugin counts words in a page. For a single page, write a
+`\[[!wc]]` directive and the word count will be interpolated there.
+For a site, add `<TMPL_VAR WORDCOUNT>` to your [[templates]].
+
+If [[!cpan HTML::Strip]] is installed, the wordcount will be slightly
+more accurate.
+
+Possible enhancements:
+
+* Optimize: count words iff the result will be displayed. `sanitize()`
+ seems like the right place to count. Since it's called well after
+ `preprocess()`, I can tell whether a directive needs the result,
+ but since it appears to be called before `pagetemplate()`, I can't
+ tell whether a template wants to know and possibly skip the
+ computation. (In other words, if I add `$needed_for_template`
+ like `$needed_for_directive`, it gets set too late for `sanitize()`
+ to see.)
diff --git a/doc/plugins/contrib/xslt.mdwn b/doc/plugins/contrib/xslt.mdwn
new file mode 100644
index 000000000..80c956c58
--- /dev/null
+++ b/doc/plugins/contrib/xslt.mdwn
@@ -0,0 +1,39 @@
+[[!template id=plugin name=xslt author="[[rubykat]]"]]
+[[!tag type/chrome]]
+## NAME
+
+IkiWiki::Plugin::xslt - ikiwiki directive to process an XML file with XSLT
+
+## SYNOPSIS
+
+\[[!xslt file="data1.xml" stylesheet="style1.xsl"]]
+
+## DESCRIPTION
+
+IkiWiki::Plugin::xslt is an IkiWiki plugin implementing a directive
+to process an input XML data file with XSLT, and output the result in
+the page where the directive was called.
+
+It is expected that the XSLT stylesheet will output valid HTML markup.
+
+## OPTIONS
+
+There are two arguments to this directive.
+
+* **file:**
+  The file which contains the XML data to be processed. This file *must* have a `.xml` extension (`filename.xml`). This file is searched for using the usual IkiWiki mechanism, thus finding the file first in the same directory as the page, then in the directory above, and so on.
+
+* **stylesheet:**
+  The file which contains the XSLT stylesheet to apply to the XML data. This file *must* have a `.xsl` extension (`filename.xsl`). This file is searched for using the usual IkiWiki mechanism, thus finding the file first in the same directory as the page, then in the directory above, and so on.
+
+## PREREQUISITES
+
+ IkiWiki
+ XML::LibXML
+ XML::LibXSLT
+
+## DOWNLOAD
+
+* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/xslt.pm>
+* git repo at git://github.com/rubykat/ikiplugins.git
+
diff --git a/doc/plugins/contrib/xslt/discussion.mdwn b/doc/plugins/contrib/xslt/discussion.mdwn
new file mode 100644
index 000000000..72cce083c
--- /dev/null
+++ b/doc/plugins/contrib/xslt/discussion.mdwn
@@ -0,0 +1,49 @@
+## security
+
+I'm curious what the security implications of having this plugin on a
+publically writable wiki are.
+
+First, it looks like the way it looks up the stylesheet file will happily
+use a regular .mdwn wiki page as the stylesheet. Which means any user can
+create a stylesheet and have it be used, without needing permission to
+upload arbitrary files. That probably needs to be fixed; one way would be
+to mandate that the `srcfile` has a `.xsl` extension.
+
+Secondly, if an attacker is able to upload a stylesheet file somehow, could
+this be used to attack the server where it is built? I know that xslt is
+really a full programming language, so I assume at least DOS attacks are
+possible. Can it also read other arbitrary files, run other programs, etc?
+--[[Joey]]
+
+> For the first point, agreed. It should probably check that the data file has a `.xml` extension also. Have now fixed.
+
+> For the second point, I think the main concern would be resource usage. XSLT is a pretty limited language; it can read other XML files, but it can't run other programs so far as I know.
+
+> -- [[KathrynAndersen]]
+
+>> XSLT is, indeed, a Turing-complete programming language.
+ However, [XML::LibXSLT][] provides a set of functions to help
+ to minimize the damage that may be caused by running a random
+ program.
+
+>> In particular, `max_depth ()` allows for the maximum
+ recursion depth to be set, while
+ `read_file ()`, `write_file ()`, `create_dir ()`,
+ `read_net ()` and `write_net ()`
+ are the callbacks that allow any of the possible file
+ operations to be denied.
+
+>> To be honest, I'd prefer for the `read_file ()` callback to
+ only grant access to the files below the Ikiwiki source
+ directory, and for all the `write_`&hellip; and
+ &hellip;`_net` callbacks to deny the access unconditionally.
+
+>> One more wishlist item: allow the set of locations to take
+ `.xsl` files from to be preconfigured, so that, e.&nbsp;g.,
+  one could allow (presumably trusted) system stylesheets,
+ while disallowing any stylesheets that are placed on the Wiki
+ itself.
+
+>> &mdash;&nbsp;Ivan Shmakov, 2010-03-28Z.
+
+[XML::LibXSLT]: http://search.cpan.org/~PAJAS/XML-LibXSLT/LibXSLT.pm
diff --git a/doc/plugins/contrib/ymlfront.mdwn b/doc/plugins/contrib/ymlfront.mdwn
new file mode 100644
index 000000000..2805be04f
--- /dev/null
+++ b/doc/plugins/contrib/ymlfront.mdwn
@@ -0,0 +1,143 @@
+[[!template id=plugin name=ymlfront author="[[rubykat]]"]]
+[[!tag type/meta]]
+[[!toc]]
+## NAME
+
+IkiWiki::Plugin::ymlfront - add YAML-format data to a page
+
+## SYNOPSIS
+
+ # activate the plugin
+ add_plugins => [qw{goodstuff ymlfront ....}],
+
+ # configure the plugin
+ ymlfront_delim => [qw(--YAML-- --YAML--)],
+
+## DESCRIPTION
+
+This plugin provides a way of adding arbitrary meta-data (data fields) to any
+page by prefixing the page with a YAML-format document. This also provides
+the [[ikiwiki/directive/ymlfront]] directive, which enables one to put
+YAML-formatted data inside a standard IkiWiki [[ikiwiki/directive]].
+
+This is a way to create per-page structured data, where each page is
+treated like a record, and the structured data are fields in that record. This
+can include the meta-data for that page, such as the page title.
+
+This plugin is meant to be used in conjunction with the [[field]] plugin.
+
+## DETAILS
+
+There are three formats for adding YAML data to a page. These formats
+should not be mixed - the result is undefined.
+
+1. ymlfront directive
+
+ See [[ikiwiki/directive/ymlfront]] for more information.
+
+2. default YAML-compatible delimiter
+
+ By default, the YAML-format data in a page is placed at the start of
+ the page and delimited by lines containing precisely three dashes.
+ This is what YAML itself uses to delimit multiple documents.
+ The "normal" content of the page then follows.
+
+ For example:
+
+ ---
+ title: Foo does not work
+ Urgency: High
+ Status: Assigned
+ AssignedTo: Fred Nurk
+ Version: 1.2.3
+ ---
+ When running on the Sprongle system, the Foo function returns incorrect data.
+
+ What will normally be displayed is everything following the second line of dashes. That will be htmlized using the page-type of the page-file.
+
+3. user-defined delimiter
+
+ Instead of using the default "---" delimiter, the user can define,
+ in the configuration file, the **ymlfront_delim** value, which is an
+ array containing two strings. The first string defines the markup for
+    the start of the YAML data, and the second string defines the markup
+ for the end of the YAML data. These two strings can be the same, or
+ they can be different. In this case, the YAML data section is not
+ required to be at the start of the page, but as with the default, it
+ is expected that only one data section will be on the page.
+
+ For example:
+
+ --YAML--
+ title: Foo does not work
+ Urgency: High
+ Status: Assigned
+ AssignedTo: Fred Nurk
+ Version: 1.2.3
+ --YAML--
+ When running on the Sprongle system, the Foo function returns incorrect data.
+
+ What will normally be displayed is everything outside the delimiters,
+ both before and after. That will be htmlized using the page-type of the page-file.
+
+### Accessing the Data
+
+There are a few ways to access the given YAML data.
+
+* [[getfield]] plugin
+
+ The **getfield** plugin can display the data as individual variable values.
+
+ For example:
+
+ ---
+ title: Foo does not work
+ Urgency: High
+ Status: Assigned
+ AssignedTo: Fred Nurk
+ Version: 1.2.3
+ ---
+ # {{$title}}
+
+ **Urgency:** {{$Urgency}}\\
+ **Status:** {{$Status}}\\
+ **Assigned To:** {{$AssignedTo}}\\
+ **Version:** {{$Version}}
+
+ When running on the Sprongle system, the Foo function returns incorrect data.
+
+* [[ftemplate]] plugin
+
+ The **ftemplate** plugin is like the [[plugins/template]] plugin, but it is also aware of [[field]] values.
+
+ For example:
+
+ ---
+ title: Foo does not work
+ Urgency: High
+ Status: Assigned
+ AssignedTo: Fred Nurk
+ Version: 1.2.3
+ ---
+ \[[!ftemplate id="bug_display_template"]]
+
+ When running on the Sprongle system, the Foo function returns incorrect data.
+
+* [[report]] plugin
+
+ The **report** plugin is like the [[ftemplate]] plugin, but it reports on multiple pages, rather than just the current page.
+
+* write your own plugin
+
+ In conjunction with the [[field]] plugin, you can write your own plugin to access the data.
+
+## PREREQUISITES
+
+ IkiWiki
+ IkiWiki::Plugin::field
+ YAML::Any
+
+## DOWNLOAD
+
+* browse at GitHub: <http://github.com/rubykat/ikiplugins/blob/master/IkiWiki/Plugin/ymlfront.pm>
+* git repo at git://github.com/rubykat/ikiplugins.git
diff --git a/doc/plugins/contrib/ymlfront/discussion.mdwn b/doc/plugins/contrib/ymlfront/discussion.mdwn
new file mode 100644
index 000000000..b122294bb
--- /dev/null
+++ b/doc/plugins/contrib/ymlfront/discussion.mdwn
@@ -0,0 +1,31 @@
+Now that I have implemented a \[[!ymlfront ...]] directive, I would like to remove support for the old "---" delimited format, because
+
+* it is fragile (easily breakable)
+* it is non-standard
+
+Any objections?
+
+> Well, I don't have much standing since I have been too lame to integrate
+> ymlfront into ikiwiki yet. But, my opinion is, I liked the old
+> format of putting the YAML literally at the front of the file. It
+> seemed to allow parsing the file as YAML, using any arbitrary YAML
+> processor. And it was nice how it avoided boilerplate. --[[Joey]]
+
+>> The old delimited format also has the advantage of being remarkably similar to the
+>> [MultiMarkDown](http://fletcherpenney.net/multimarkdown/users_guide/multimarkdown_syntax_guide/)
+>> way of including metadata in documents. The only difference is that MMD doesn't expect the
+>> triple-dash separators, but I'm thinking about submitting a patch to MMD to actually support
+>> that syntax. --GB
+
+>>> Yes, the idea was to allow the file to be parsed as YAML, you're right. I just found that I tended to have problems when people used "---" for horizontal rules. However, I have also found that trying to keep it solely as an IkiWiki directive doesn't work either, since sometimes the meta-data I need also contained "]]" which broke the parsing of the directive.
+>>> So I have decided to go for a compromise, and make the delimiter configurable, rather than hardcoded as "---"; the triple-dash is the default, but it can be configured to be something else instead. I haven't pushed the change yet, but I have written it, and it seems to work. -- [[KathrynAndersen]]
+
+>>>> I'm not sure about what kind of problems you're meeting with "---" being used
+>>>> for horizontal rules: isn't it sufficient to just check that (1) the triple-dash
+>>>> is the first thing in the page and (2) there are only YAML-style assignments
+>>>> (and no blank lines) between the two markers? Check #2 would also be enough to
+>>>> support MMD-style metadata, which means (a) no start marker and (b) empty line
+>>>> to mark the end of the metadata block. Would this be supported by the plugin?
+>>>> --GB
+
+>>>>> Since I allow all legal YAML, the only way to check if it is legal YAML is to use the YAML parser, by which time one is already parsing the YAML, so it seems a bit pointless to check before one does so. -- KA
diff --git a/doc/plugins/creole.mdwn b/doc/plugins/creole.mdwn
new file mode 100644
index 000000000..6961e8d1d
--- /dev/null
+++ b/doc/plugins/creole.mdwn
@@ -0,0 +1,22 @@
+[[!template id=plugin name=creole author="BerndZeimetz"]]
+[[!tag type/format]]
+
+This plugin allows ikiwiki to process pages written in
+[WikiCreole](http://www.wikicreole.org/) format.
+To use it, you need to have the [[!cpan Text::WikiCreole]] perl
+module installed, enable the plugin, and then files with the extension `.creole`
+will be processed as creole.
+
+The creole format is based on common elements across many different
+wiki markup formats, so should be fairly easy to guess at. There is also a
+[CheatSheet](http://www.wikicreole.org/wiki/CheatSheet).
+
+Links are standard [[WikiLinks|ikiwiki/WikiLink]]. Links and
+[[directives|ikiwiki/directive]] inside `{{{ }}}` blocks are still expanded,
+since this happens before the creole format is processed. (You need to escape
+them manually, via \\\[[directives]]; the ~ escaping of creole doesn't work for
+this.)
+
+Standard ikiwiki [[WikiLinks|ikiwiki/WikiLink]] are almost the same as creole links, except that creole uses \[[pagename|description]] while ikiwiki uses \[[description|pagename]].
+
+
diff --git a/doc/plugins/creole/discussion.mdwn b/doc/plugins/creole/discussion.mdwn
new file mode 100644
index 000000000..7f47c2c97
--- /dev/null
+++ b/doc/plugins/creole/discussion.mdwn
@@ -0,0 +1,22 @@
+I've installed Text::WikiCreole 0.05 and enabled the plugin, but I get an error when rebuilding the wiki: `Undefined subroutine &IkiWiki::Plugin::creole::creole_custombarelinks called at /usr/pkg-20080723/lib/perl5/vendor_perl/5.8.0/IkiWiki/Plugin/creole.pm line 23`. Is there a newer Text::WikiCreole I'm not finding online?
+-- [[schmonz]]
+
+> There's a patch in the debian package of libtext-wikicreole-perl that
+> adds that option. I'm not sure what the status of it being released
+> upstream is, though IIRC I was assured it would not be a problem.
+> --[[Joey]]
+
+>> I've added the patch to pkgsrc as well. Thanks. --[[schmonz]]
+
+>> Currently the creole plugin is included in ikiwiki but the ikiwiki deb (3.0.3) doesn't suggests libtext-wikicreole-perl. Why? --[[weakish]]
+
+>>> forgot, done now --[[Joey]]
+
+---
+## External Links
+
+I'm moving over a really stinkingly old UseMod and creole seems the nearest match. I've worked out that Bare /Subpage links need to become \[\[Subpage\]\], and Top/Sub links need to be \[\[Top/Sub\]\] (or \[\[Top/Sub|Top/Sub\]\], to display in exactly the same way), but I'm stuck on generic hyperlinks. The creole cheat sheet says I should be able to do \[\[http://url.path/foo|LinkText\]\], but that comes out as a link to create the "linktext" page, and Markdown-style \[Link Text\](http://url.path/foo) just gets rendered as is. Any suggestions? --[[schmonz]]
+
+> Was this problem ever solved? -- Thiana
+
+>> Not by me. If I were looking at the problem now, with fresh eyes, I'd probably bite the bullet and just convert everything to Markdown. --[[schmonz]]
diff --git a/doc/plugins/cutpaste.mdwn b/doc/plugins/cutpaste.mdwn
new file mode 100644
index 000000000..ea3665c44
--- /dev/null
+++ b/doc/plugins/cutpaste.mdwn
@@ -0,0 +1,7 @@
+[[!template id=plugin name=cutpaste author="[[Enrico]]"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/cut]],
+[[ikiwiki/directive/copy]] and [[ikiwiki/directive/paste]]
+[[directives|ikiwiki/directive]].
+With these directives you can store and recall pieces of text in a page.
diff --git a/doc/plugins/date.mdwn b/doc/plugins/date.mdwn
new file mode 100644
index 000000000..2a33f014c
--- /dev/null
+++ b/doc/plugins/date.mdwn
@@ -0,0 +1,6 @@
+[[!template id=plugin name=date author="[[Joey]]"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/date]]
+[[ikiwiki/directive]], which provides a way to display an arbitrary date
+in a page.
diff --git a/doc/plugins/ddate.mdwn b/doc/plugins/ddate.mdwn
new file mode 100644
index 000000000..17bb16cff
--- /dev/null
+++ b/doc/plugins/ddate.mdwn
@@ -0,0 +1,10 @@
+[[!template id=plugin name=ddate author="[[Joey]]"]]
+[[!tag type/fun]]
+[[!tag type/date]]
+[[!tag type/chrome]]
+
+Enables use of Discordian dates. `--timeformat` can be used to change
+the date format; see `ddate(1)`.
+
+This plugin requires the [[!cpan DateTime]] and
+[[!cpan DateTime::Calendar::Discordian]] perl modules.
diff --git a/doc/plugins/discussion.mdwn b/doc/plugins/discussion.mdwn
new file mode 100644
index 000000000..d47fa4718
--- /dev/null
+++ b/doc/plugins/discussion.mdwn
@@ -0,0 +1,42 @@
+Maybe we can have a page for requesting a plugin? (Not from the head developer but other ikiwiki end-users.)
+
+I have seen a few requests on discussion pages, but not any place specific I think.
+
+--JeremyReed
+
+> [[todo/plugin]] has a list of stuff, moved yours there. --[[Joey]]
+
+It would be nice if the page gave a brief synopsis for each plugin. For example:
+
+>headinganchors
+>Adds IDs to all headings based on their text.
+>Posted Thu, 09 Aug 2007 02:34:19 -0400
+
+>pagetemplate
+>Inserts page text into chosen template with the template controlling the look and feel.
+>Posted Thu, 26 Jul 2007 16:50:58 -0400
+
+>graphviz
+>Allows embedding of graphviz graphs.
+>Posted Mon, 09 Apr 2007 05:09:04 -0400
+
+--MichaelRasmussen
+
+Any objections to listing plugins alphabetically rather than by creation date? (i.e. change the inline to have sort="title".)
+
+-- Will
+
+> Well, it's been done by Josh, but I do wonder if there wasn't value to
+> being able to look at the top of the page for new plugins? --[[Joey]]
+
+>> I agree, which is why I brought it up here rather than just changing it.
+>> On balance I think the alphabetical list is better. You could have a
+>> "recently changed" list with the 10 most recently changed plugins
+>> at the top. That would allow what you suggested, but still allow
+>> the main list to be alphabetical. -- [[Will]]
+
+### `themes.pm` instead of `themes.mdwn`
+
+Could someone please change the filename? I cannot fix this using the web interface. Somebody step in please. --[[PaulePanter]]
+
+> Oops, not the first time I've made that mistake! --[[Joey]]
diff --git a/doc/plugins/editdiff.mdwn b/doc/plugins/editdiff.mdwn
new file mode 100644
index 000000000..8d9daa0ff
--- /dev/null
+++ b/doc/plugins/editdiff.mdwn
@@ -0,0 +1,13 @@
+[[!template id=plugin name=editdiff author="[[JeremieKoenig]]"]]
+[[!tag type/web]]
+
+This plugin adds a "Diff" button when a page is being edited.
+When clicked, a diff between the stored page and provided content
+is shown in the "Page Preview" area.
+
+## Problems
+
+No special handling is done of concurrent edits: changes introduced
+independently will show up in the requested diff, although they will
+be merged when the page is saved. I suspect even detecting this case
+would require changes in the RCS backends.
diff --git a/doc/plugins/editdiff/discussion.mdwn b/doc/plugins/editdiff/discussion.mdwn
new file mode 100644
index 000000000..dbb02aefe
--- /dev/null
+++ b/doc/plugins/editdiff/discussion.mdwn
@@ -0,0 +1,5 @@
+I've enabled the plugin on an SVN-backed wiki, but am not seeing the Diff button when editing. (I do see the Rename and Remove buttons from having enabled those plugins.) Any ideas why it wouldn't be showing up? --[[schmonz]]
+
+> It was broken, I've fixed it. --[[Joey]]
+
+>> Awesome, thanks! --[[schmonz]]
diff --git a/doc/plugins/editpage.mdwn b/doc/plugins/editpage.mdwn
new file mode 100644
index 000000000..346ee7c78
--- /dev/null
+++ b/doc/plugins/editpage.mdwn
@@ -0,0 +1,6 @@
+[[!template id=plugin name=editpage core=1 author="[[Joey]]"]]
+[[!tag type/web]]
+
+This plugin allows editing wiki pages in the web interface. It's enabled by
+default if [[cgi]] is enabled; disable it if you want cgi for other things
+while not allowing page edits.
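+
+For example, a setup file might keep [[cgi]] enabled while explicitly
+disabling this plugin (a sketch; adjust to your setup file's style):
+
+    disable_plugins => ['editpage'],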
diff --git a/doc/plugins/editpage/discussion.mdwn b/doc/plugins/editpage/discussion.mdwn
new file mode 100644
index 000000000..2ca586ade
--- /dev/null
+++ b/doc/plugins/editpage/discussion.mdwn
@@ -0,0 +1,24 @@
+## How to only have editpage link when admin is logged in? Or on secret page?
+
+I saw [[todo/Allow_disabling_edit_and_preferences_links]] but not sure if that is what I want.
+
+I want to have edit links to maintain the site. (I am currently manually using vi and cvs on the server or pasting ikiwiki.cgi into the browser location bar.) But I do not want regular (non-admin) visitors to see the edit link.
+
+Any suggestions?
+
+Can two different websites be regenerated at the same time? Or is there any CSS or other template magic I can use to selectively show the EDITURL?
+
+I do have a secret page that I use to add new blog entries (postform).
+
+-- [[JeremyReed]]
+
+> You can have two different sites if you want, and perhaps push from
+> the "edit" site to the main site. See [[tips/laptop_wiki_with_git]] for example.
+>
+> There has been talk about finding a way to hide the edit links from
+> users who are not logged in sufficiently to be able to edit, using
+> javascript, but nothing yet.
+>
+> Ikiwiki puts x-wiki headers on pages; these can be used by the browser
+> plugins listed at <http://universaleditbutton.org/Universal_Edit_Button>,
+> and then you can disable the edit links. --[[Joey]]
diff --git a/doc/plugins/edittemplate.mdwn b/doc/plugins/edittemplate.mdwn
new file mode 100644
index 000000000..c19ecd858
--- /dev/null
+++ b/doc/plugins/edittemplate.mdwn
@@ -0,0 +1,6 @@
+[[!template id=plugin name=edittemplate author="[[Joey]]"]]
+[[!tag type/web]]
+
+This plugin provides the [[ikiwiki/directive/edittemplate]] [[ikiwiki/directive]].
+This directive allows registering [[template|templates]] pages, that
+provide default content for new pages created using the web frontend.
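+
+For example, a directive like this (a sketch; the template name and
+pagespec are illustrative) could make new bug reports start from a
+template:
+
+    \[[!edittemplate template="bugtemplate" match="bugs/*"]]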
diff --git a/doc/plugins/embed.mdwn b/doc/plugins/embed.mdwn
new file mode 100644
index 000000000..85592cb72
--- /dev/null
+++ b/doc/plugins/embed.mdwn
@@ -0,0 +1,53 @@
+[[!template id=plugin name=embed author="[[Joey]]"]]
+[[!tag type/html]]
+
+This plugin allows embedding content from external sites on
+wiki pages.
+
+Normally, the [[htmlscrubber]] does not allow the tags that are used for
+embedding content from external sites, since `<iframe>`, `<embed>`, and
+`<object>` tags can be used for various sorts of attacks. This plugin
+allows such tags to be put on a page, if they look like they are safe.
+
+In the examples below, the parts of the html that you can change are denoted
+with "XXX"; everything else must appear exactly as shown to be accepted by the
+plugin.
+
+**This plugin is deprecated.** Rather than relying on these complex lists
+of safe content, which constantly fall out of date, you're recommended to
+configure the [[htmlscrubber]] to not scrub some pages, which only trusted
+users can edit. Then you can embed anything from anywhere on those pages.
+See [[tips/embedding_content]] for details and examples.
+This plugin's lists of safe embedded content will not be maintained, and
+the plugin will be removed in a future release.
+
+## google maps
+
+Use html like this to embed a map:
+
+ <iframe width="XXX" height="XXX" frameborder="XXX" scrolling="XXXX" marginheight="XXXX" marginwidth="XXXX" src="http://maps.google.com/?XXX"></iframe>
+
+(This method only allows embedding a simple map. To use the full
+[Google Maps API](http://www.google.com/apis/maps/) from ikiwiki, including
+drawing points and GPS tracks on the map, try the [[contrib/googlemaps]]
+plugin.)
+
+## youtube
+
+Use html like this to embed a video:
+
+ <object width="XXX" height="XXX"><param name="movie" value="http://www.youtube.com/v/XXX"></param><param name="wmode" value="transparent"></param>
+ <embed src="http://www.youtube.com/v/XXX" type="application/x-shockwave-flash" wmode="transparent" width="XXX" height="XXX"></embed></object>
+
+## google video
+
+Use html like this to embed a video:
+
+ <embed style="width:XXXpx; height:XXXpx;" id="XXX" type="application/x-shockwave-flash" src="http://video.google.com/googleplayer.swf?XXX" flashvars=""></embed>
+
+## google calendar
+
+Use html like this to embed a calendar:
+
+ <iframe src="http://www.google.com/calendar/embed?XXX" style="border-width:XXX" width="XXX" frameborder="XXX" height="XXX"></iframe>
+
diff --git a/doc/plugins/favicon.mdwn b/doc/plugins/favicon.mdwn
new file mode 100644
index 000000000..7941f8a6c
--- /dev/null
+++ b/doc/plugins/favicon.mdwn
@@ -0,0 +1,7 @@
+[[!template id=plugin name=favicon author="[[Joey]]"]]
+[[!tag type/chrome]]
+
+If this plugin is enabled, then an icon link is added to pages, for web
+browsers to display. The icon is currently hardcoded to be a favicon.ico,
+which must be in the root of the wiki. The [[logo]] page explains how this
+icon was generated.
diff --git a/doc/plugins/favicon/discussion.mdwn b/doc/plugins/favicon/discussion.mdwn
new file mode 100644
index 000000000..e8b5fd60a
--- /dev/null
+++ b/doc/plugins/favicon/discussion.mdwn
@@ -0,0 +1,19 @@
+To change the favicon you need to edit lib/perl5/site_perl/5.8.8/IkiWiki/Plugin/favicon.pm and change the line:
+> $template->param(favicon => "favicon.ico");
+
+at the end of the file, and rebuild the wiki:
+> ikiwiki -setup your_wiki_config
+
+After reloading the page you'll see your new favicon.
+
+That method even allows configuring an animated PNG:
+> ikiwiki.sphynkx.org.ua
+
+----
+Sphynkx
+
+> Typically sites that use animated pngs (ugh!) as favicons just call it
+> `favicon.ico`. It's not as if (most) web servers and browsers trust
+> the filename extension to mean anything anyway. And using favicon.ico
+> makes it more likely to work with old browsers that just always look for
+> that. --[[Joey]]
diff --git a/doc/plugins/filecheck.mdwn b/doc/plugins/filecheck.mdwn
new file mode 100644
index 000000000..b038bc433
--- /dev/null
+++ b/doc/plugins/filecheck.mdwn
@@ -0,0 +1,17 @@
+[[!template id=plugin name=filecheck core=0 author="[[Joey]]"]]
+[[!tag type/special-purpose]]
+
+This plugin enhances the regular [[ikiwiki/PageSpec]] syntax with
+some additional tests, for things like file size, mime type, and virus
+status. These tests are mostly useful for the [[attachment]] plugin, and
+are documented [[here|ikiwiki/pagespec/attachment]].
+
+This plugin will use the [[!cpan File::MimeInfo::Magic]] perl module, if
+available, for mimetype checking. It falls back to using the `file` command
+if necessary for hard-to-detect files.
+
+The `virusfree` [[PageSpec|ikiwiki/pagespec/attachment]] requires that
+ikiwiki be configured with a virus scanner program via the `virus_checker`
+option in the setup file. If using `clamav`, with `clamd`, set it to
+"clamdscan -". Or to use clamav without the `clamd` daemon, you
+could set it to "clamscan -".
diff --git a/doc/plugins/filecheck/discussion.mdwn b/doc/plugins/filecheck/discussion.mdwn
new file mode 100644
index 000000000..f3f3c4ffd
--- /dev/null
+++ b/doc/plugins/filecheck/discussion.mdwn
@@ -0,0 +1,85 @@
+First, thanks again for making this plugin.
+
+I don't know if this is a problem for [[plugins/attachment]], but there seems to
+be no way to detect text/plain using File::MimeInfo::Magic::magic().
+There is a heuristic ::default that decides between text/plain and application/octet-stream.
+
+Would you be receptive to a patch that e.g. called File::MimeInfo::Magic::default()
+if ::magic() returns undef? --[[DavidBremner]]
+
+> Yes, that looks to be ok. --[[Joey]]
+
+>> OK, here is such a patch. One modification of previous behaviour is
+>> that if default returns undef, this is returned. As far as I understand
+>> the code/doc for File::MimeInfo, undef is used only as an error return
+>> for ::default.
+
+>>> Applied
+
+---
+
+At first I need to thank you for ikiwiki - it is what I was always looking
+for - coming from a whole bunch of wiki engines, this is the most
+intelligent and least bloated one.
+
+My question is about the [[plugins/attachment]] plugin in conjunction with
+[[plugins/filecheck]]: I am using soundmanger2 js-library for having
+attached media files of all sorts played inline a page.
+
+To achieve this soundmanager2 asks for an id inside a ul-tag surrounding
+the a-tag. I was wondering if the Insert Link button could be provided with
+a more elegant solution than to have this code snippet to be filled in by
+hand every time you use it to insert links for attached media files. And in
+fact there apparently is a way in attachment.pm.
+
+While I can see that not everyone inserting links to attached media
+files needs ul- and li-tags surrounding the link itself, with an id
+filled in, for me it would be the most straightforward solution. Pity is
+I don't have the time to wrap my head around perl to write a patch
+myself. Is there any way to have this made an option which can be called
+via templates?
+
+For sure I would like to donate for such a patch as well as I will do it
+for ikiwiki anyway, because it is such a fine application.
+
+If you are not familiar with soundmanager2: It is a very straight forward
+solution to inline mediafiles, using the usual flash as well as html5
+solutions (used by soundcloud.com, freesound.org and the like). Worth a
+look anyway [schillmania.com](http://www.schillmania.com/)
+
+Boris
+
+> The behavior of "Insert Links" is currently hardcoded to support images
+> and has a fallback for other files. What you want is a
+> [[todo/generic_insert_links]] that can insert a template directive.
+> Then you could make a template that generates the html needed for
+> soundmanager2. I've written down a design at
+> [[todo/generic_insert_links]]; I am currently very busy and not sure
+> when I will get around to writing it, but with it on the todo list
+> I shouldn't forget. --[[Joey]]
+>
+> You could make a [[ikiwiki/directive/template]] for soundmanager2
+> now, and manually insert the template directive for now
+> when you want to embed a sound file. Something like this:
+
+ \[[!template id=embed_mp3 file=your.mp3]]
+
+> Then in templates/embed_mp3.mdwn, something vaguely like this:
+
+ <ul id="foo">
+ <a href="<TMPL_VAR FILE>">mp3</a>
+ </ul>
+
+>> Thanks a lot - looking forward to [[todo/generic_insert_links]] - I am using the [[ikiwiki/directive/template]] variant, also adding a name variable; it looks like this and is working fine:
+
+ <ul class="playlist">
+ <li>
+ <a href="<TMPL_VAR FILE>"><TMPL_VAR NAME></a>
+ </li>
+ </ul>
+
+>> Calling it:
+
+ \[[!template id=embedmedia.tmpl file=../Tinas_Gonna_Have_A_Baby.mp3 name="Tina's Gonna Have A Baby" ]]
+
+>> BTW your Flattr button doesn't seem to work properly - or it is Flattr itself that doesn't- clicking it won't let ikiwiki show up on my Dashboard.
diff --git a/doc/plugins/flattr.mdwn b/doc/plugins/flattr.mdwn
new file mode 100644
index 000000000..5da279518
--- /dev/null
+++ b/doc/plugins/flattr.mdwn
@@ -0,0 +1,9 @@
+[[!template id=plugin name=flattr author="[[Joey]]"]]
+[[!tag type/web]]
+
+[Flattr](http://flattr.com/) is a social micropayment platform.
+This plugin allows easily adding Flattr buttons to pages,
+using the [[ikiwiki/directive/flattr]] directive.
+
+This plugin has a configuration setting. `flattr_userid` can be set
+to either your numeric flattr userid, or your flattr username.
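+
+For example (a sketch; the userid shown is illustrative):
+
+    flattr_userid => 'joeyh',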
diff --git a/doc/plugins/format.mdwn b/doc/plugins/format.mdwn
new file mode 100644
index 000000000..b41d365aa
--- /dev/null
+++ b/doc/plugins/format.mdwn
@@ -0,0 +1,9 @@
+[[!template id=plugin name=format core=0 author="[[Joey]]"]]
+[[!tag type/widget]]
+
+This plugin allows mixing different page formats together, by embedding
+text formatted one way inside a page formatted another way. This is done
+using the [[ikiwiki/directive/format]] [[ikiwiki/directive]].
+
+For example, it could be used to embed an [[otl]] outline inside a page
+that is formatted as [[mdwn]].
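+
+Such an embedding might look like this (a sketch; any enabled format
+could be substituted for otl):
+
+    \[[!format otl """
+    outline item
+    	sub-item
+    """]]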
diff --git a/doc/plugins/format/discussion.mdwn b/doc/plugins/format/discussion.mdwn
new file mode 100644
index 000000000..df8448ed6
--- /dev/null
+++ b/doc/plugins/format/discussion.mdwn
@@ -0,0 +1,15 @@
+Is there any way to tell if an htmlize hook has been called from a format directive?
+
+I am currently modifying the [[contrib/highlightcode]] plugin by [[sabr]] and I wanted to have a different behavior depending on whether the htmlize hook is called from a format directive or not. For instance, this could disable the raw copy of the highlighted code. Since I have enabled the keepextension option, I tried to rely on the page extension to decide whether I have to create the raw file or not, but this does not seem to be a reliable approach.
+
+One possible solution is to add an optional parameter to the htmlize hook (and thus to the htmlize function in IkiWiki.pm) which could tell whether it was the format directive that called the function, but I am not sure that is a good way to do this.
+
+> It's (probably) not just the format directive that has a potential problem here.
+> Imagine a syntax highlighted source code file that contains some other
+> directive, such as table or meta. Such a directive calls `htmlize` on the
+> parameters passed to it.
+>
+> There is one way to detect this ATM. If `%IkiWiki::preprocessing` has
+> anything in it, then ikiwiki is in the middle of handling a preprocessing
+> directive. So you could check that. It's getting into internals, so not
+> ideal.. --[[Joey]]
diff --git a/doc/plugins/fortune.mdwn b/doc/plugins/fortune.mdwn
new file mode 100644
index 000000000..3cb125ac1
--- /dev/null
+++ b/doc/plugins/fortune.mdwn
@@ -0,0 +1,14 @@
+[[!template id=plugin name=fortune author="[[Joey]]"]]
+[[!tag type/fun]]
+[[!tag type/widget]]
+
+This plugin implements the [[ikiwiki/directive/fortune]] [[ikiwiki/directive]].
+This directive uses the `fortune` program to insert a fortune into the page.
+
+[[!if test="enabled(fortune)" then="""
+Here's a fortune for you:
+
+----
+
+[[!fortune ]]
+"""]]
diff --git a/doc/plugins/getsource.mdwn b/doc/plugins/getsource.mdwn
new file mode 100644
index 000000000..d5404a628
--- /dev/null
+++ b/doc/plugins/getsource.mdwn
@@ -0,0 +1,14 @@
+[[!template id=plugin name=getsource author="[[Will_Uther|Will]]"]]
+[[!tag type/web]]
+
+This plugin adds a "Source" link to the top of each page that uses
+the CGI to display the page's source.
+
+Configuration for this plugin in the setup file:
+
+* `getsource_mimetype => "text/plain; charset=utf-8"`
+
+ Sets the MIME type used when page source is requested. The default is
+ usually appropriate, but you could set this to `application/octet-stream`
+ to encourage browsers to download the source to a file rather than showing
+ it in the browser.
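+
+For example, to encourage downloading rather than inline display
+(a sketch of the setup file snippet):
+
+    getsource_mimetype => "application/octet-stream",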
diff --git a/doc/plugins/getsource/discussion.mdwn b/doc/plugins/getsource/discussion.mdwn
new file mode 100644
index 000000000..3e985948b
--- /dev/null
+++ b/doc/plugins/getsource/discussion.mdwn
@@ -0,0 +1,3 @@
+It would be very cool if this plugin was enabled by default. One of the best ways to learn how to do various advanced things is to be able to "view source" on other wikis that do things you like. -- [[AdamShand]]
+
+This plugin requires the cgi plugin. If you run a static site, you may want to check the [[repolist]] plugin. -- [[weakish]]
diff --git a/doc/plugins/goodstuff.mdwn b/doc/plugins/goodstuff.mdwn
new file mode 100644
index 000000000..ee1bffcfa
--- /dev/null
+++ b/doc/plugins/goodstuff.mdwn
@@ -0,0 +1,29 @@
+[[!template id=plugin name=goodstuff author="[[Joey]]"]]
+[[!tag type/bundle]]
+
+This plugin enables a bunch of other plugins -- basically all the ones that
+are not too intrusive, work well with little configuration, and are nice to
+have on any capable wiki. The plugins in this bundle are not enabled by
+default in ikiwiki, so that by default ikiwiki is limited to a few core
+wiki features. If you want a more capable wiki, enable this plugin bundle.
+
+Currently included:
+
+* [[brokenlinks]]
+* [[img]]
+* [[map]]
+* [[more]]
+* [[orphans]]
+* [[pagecount]]
+* [[pagestats]]
+* [[progress]]
+* [[shortcut]]
+* [[smiley]]
+* [[tag]]
+* [[table]]
+* [[template]]
+* [[toc]]
+* [[toggle]]
+* [[repolist]]
+
+New plugins will be added to this bundle from time to time.
diff --git a/doc/plugins/goodstuff/discussion.mdwn b/doc/plugins/goodstuff/discussion.mdwn
new file mode 100644
index 000000000..4ccea4ad4
--- /dev/null
+++ b/doc/plugins/goodstuff/discussion.mdwn
@@ -0,0 +1,8 @@
+### What is the syntax for enabling plugins in the setup file?
+
+Here is an example snippet from a working setup file:
+
+ <pre>
+ # plugins to add to the default configuration
+ add_plugins => ['goodstuff'],
+    </pre>
diff --git a/doc/plugins/google.mdwn b/doc/plugins/google.mdwn
new file mode 100644
index 000000000..5346b8f40
--- /dev/null
+++ b/doc/plugins/google.mdwn
@@ -0,0 +1,11 @@
+[[!template id=plugin name=google author="[Peter Simons](http://cryp.to/)"]]
+[[!tag type/web]]
+
+This plugin adds a search form to the wiki, using google's site search.
+
+Google is asked to search for pages in the domain specified in the wiki's
+`url` configuration parameter. Results will depend on whether google has
+indexed the site, and how recently.
+
+The [[search]] plugin offers full text search of only the wiki, but
+requires that a search engine be installed on your site.
diff --git a/doc/plugins/google/discussion.mdwn b/doc/plugins/google/discussion.mdwn
new file mode 100644
index 000000000..e664f5723
--- /dev/null
+++ b/doc/plugins/google/discussion.mdwn
@@ -0,0 +1,25 @@
+This plugin uses the googleform.tmpl template,
+which produces valid HTML but invalid XHTML.
+This is not very good since the default ikiwiki
+templates produce XHTML instead of HTML.
+
+> Fixed, thanks for the patch! --[[Joey]]
+
+It works to pass the whole wiki baseurl to Google, not just the
+domain, and appears to be legal. I've got a wiki that'd benefit
+(it's a few directories down from the root). Can the plugin be
+tweaked to do this? --[[schmonz]]
+
+> Done. --[[Joey]]
+
+The main page said:
+
+> Also, if the same domain has other content, outside the wiki's
+> content, it will be searched as well.
+
+Is it still true now? (Or is this statement out of date?) --[weakish]
+
+[weakish]: http://weakish.pigro.net
+
+> I checked, and it's never been true; google is given the url to the top
+> of the wiki and only searches things in there. --[[Joey]]
diff --git a/doc/plugins/goto.mdwn b/doc/plugins/goto.mdwn
new file mode 100644
index 000000000..8e1de7a10
--- /dev/null
+++ b/doc/plugins/goto.mdwn
@@ -0,0 +1,10 @@
+[[!template id=plugin name=goto author="[[Simon_McVittie|smcv]]"]]
+[[!tag type/web]]
+
+This plugin adds a `do=goto` mode for the IkiWiki CGI script. It's mainly
+for internal use by the [[404]], [[comments]] and [[recentchanges]]
+plugins, which enable it automatically.
+
+With this plugin enabled you can link to `ikiwiki.cgi?do=goto&page=some/where`
+to make a link that will redirect to the page `/some/where` if it exists, or
+offer a link to create it if it doesn't.
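+
+For example, a link like this (a sketch; the hostname is illustrative)
+redirects to the sandbox page, or offers to create it:
+
+    <http://example.com/ikiwiki.cgi?do=goto&page=sandbox>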
diff --git a/doc/plugins/graphviz.mdwn b/doc/plugins/graphviz.mdwn
new file mode 100644
index 000000000..d57d7dc94
--- /dev/null
+++ b/doc/plugins/graphviz.mdwn
@@ -0,0 +1,25 @@
+[[!template id=plugin name=graphviz author="[[JoshTriplett]]"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/graph]] [[ikiwiki/directive]].
+This directive allows embedding [graphviz](http://www.graphviz.org/) graphs in a
+page.
+
+Security implications: graphviz does not seem to have any syntax exploitable to
+perform file access or shell commands on the server. However, the graphviz
+plugin does make denial of service attacks somewhat easier: any user with edit
+privileges can use this plugin to create large files without the need to send
+large amounts of data, allowing them to more quickly fill the disk, run the
+server out of memory, or use up large amounts of bandwidth. Any user can
+already do these things with just the core of ikiwiki, but the graphviz plugin
+allows for an amplification attack, since users can send less data to use large
+amounts of processing time and disk usage.
+
+[[!if test="enabled(graphviz)" then="""
+Some example graphs:
+
+[[!graph src="a -> b -> c; a -> b;"]]
+[[!graph src="a -- b -- c -- a;" prog="circo" type="graph"]]
+"""]]
+
+This plugin uses the [[!cpan Digest::SHA]] perl module.
diff --git a/doc/plugins/haiku.mdwn b/doc/plugins/haiku.mdwn
new file mode 100644
index 000000000..448733d95
--- /dev/null
+++ b/doc/plugins/haiku.mdwn
@@ -0,0 +1,12 @@
+[[!template id=plugin name=haiku author="[[Joey]]"]]
+[[!tag type/fun]]
+[[!tag type/widget]]
+
+This plugin provides a [[ikiwiki/directive/haiku]] [[ikiwiki/directive]].
+The directive allows inserting a randomly generated haiku into a wiki page.
+
+As a special bonus, enabling this plugin makes any error messages ikiwiki
+should display be written in haiku.
+
+You need to have the [[!cpan Coy]] module installed for this plugin to do
+anything interesting. That does all the heavy lifting.
diff --git a/doc/plugins/haiku/discussion.mdwn b/doc/plugins/haiku/discussion.mdwn
new file mode 100644
index 000000000..a5d0939ce
--- /dev/null
+++ b/doc/plugins/haiku/discussion.mdwn
@@ -0,0 +1,5 @@
+The output of this plugin does not validate as XHTML: [validator.w3.org][w3] --ulrik
+
+[w3]: http://validator.w3.org/check?uri=http%3A%2F%2Fikiwiki.info%2Fplugins%2Fhaiku%2Findex.html&charset=%28detect+automatically%29&doctype=Inline&group=0
+
+> Fixed --[[Joey]]
diff --git a/doc/plugins/headinganchors.mdwn b/doc/plugins/headinganchors.mdwn
new file mode 100644
index 000000000..f087abdf9
--- /dev/null
+++ b/doc/plugins/headinganchors.mdwn
@@ -0,0 +1,7 @@
+[[!template id=plugin name=headinganchors author="[[PaulWise]]"]]
+[[!tag type/html]]
+
+This is a simple plugin to add ids (which will serve as [[anchor]]s) to all
+headings (h1, h2, etc), based on their text. It works as a postprocessing
+filter, allowing it to work on mdwn, wiki, html, rst and any other format that
+produces html.
diff --git a/doc/plugins/headinganchors/discussion.mdwn b/doc/plugins/headinganchors/discussion.mdwn
new file mode 100644
index 000000000..eaf111f4e
--- /dev/null
+++ b/doc/plugins/headinganchors/discussion.mdwn
@@ -0,0 +1,49 @@
+Isn't this functionality a part of what [[plugins/toc]] needs and does? Then probably the [[plugins/toc]] plugin's code could be split into the part that implements the [[plugins/contrib/headinganchors]]'s functionality and the TOC generation itself. That will bring more order into the code and the set of available plugins. --Ivan Z.
+
+---
+
+A patch to make it more like MediaWiki:
+
+<pre>--- headinganchors.pm
++++ headinganchors.pm
+@@ -5,6 +5,7 @@
+ use warnings;
+ use strict;
+ use IkiWiki 2.00;
++use URI::Escape;
+
+ sub import {
+ hook(type => "sanitize", id => "headinganchors", call => \&headinganchors);
+@@ -14,9 +15,11 @@
+ my $str = shift;
+ $str =~ s/^\s+//;
+ $str =~ s/\s+$//;
+- $str = lc($str);
+- $str =~ s/[&\?"\'\.,\(\)!]//mig;
+- $str =~ s/[^a-z]/_/mig;
++ $str =~ s/\s/_/g;
++ $str =~ s/"//g;
++ $str =~ s/^[^a-zA-Z]/z-/; # must start with an alphabetical character
++ $str = uri_escape_utf8($str);
++ $str =~ s/%/./g;
+ return $str;
+ }
+ </pre>
+
+--Changaco
+
+----
+
+I think using this below would keep the source html clean for the browser
+without changing the rendering:
+
+ #use URI::Escape
+ .
+ .
+
+ #$str = uri_escape_utf8($str);
+ $str = Encode::decode_utf8($str);
+ #$str =~ s/%/./g;
+
+Don't you think?
+[[mathdesc]]
diff --git a/doc/plugins/highlight.mdwn b/doc/plugins/highlight.mdwn
new file mode 100644
index 000000000..5f04fda52
--- /dev/null
+++ b/doc/plugins/highlight.mdwn
@@ -0,0 +1,77 @@
+[[!template id=plugin name=highlight author="[[Joey]]"]]
+[[!tag type/format]]
+
+This plugin allows ikiwiki to syntax highlight source code, using
+a fast syntax highlighter that supports over a hundred programming
+languages and file formats.
+
+## prerequisites
+
+You will need to install the perl bindings to the
+[highlight library](http://www.andre-simon.de/). In Debian
+they are in the [[!debpkg libhighlight-perl]] package. If
+your distribution does not have them, look in `examples/swig`
+in highlight's source.
+
+## embedding highlighted code
+
+To embed highlighted code on a page, you can use the
+[[format]] plugin.
+
+For example:
+
+ \[[!format c """
+ void main () {
+ printf("hello, world!");
+ }
+ """]]
+
+ \[[!format diff """
+ -bar
+ +foo
+ """]]
+
+You can do this for any extension or language name supported by
+the [highlight library](http://www.andre-simon.de/) -- basically anything
+you can think of should work.
+
+## highlighting entire source files
+
+To enable syntax highlighting of entire standalone source files, use the
+`tohighlight` setting in your setup file to control which files should be
+syntax highlighted. Here is a typical setting for it, enabling highlighting
+for files with the extensions .c, etc, and also for any files named
+"Makefile".
+
+ tohighlight => ".c .h .cpp .pl .py Makefile:make",
+
+It knows what language to use for most filename extensions (see
+`/etc/highlight/filetypes.conf` for a partial list), but if you want to
+bind an unusual filename extension, or any file without an extension
+(such as a Makefile), to a language, you can do so by appending a colon
+and the name of the language, as illustrated for Makefiles above.
+
+With the plugin configured this way, source files become full-fledged
+wiki pages, which means they can include [[WikiLinks|ikiwiki/wikilink]]
+and [[directives|ikiwiki/directive]] like any other page can, and are also
+affected by the [[smiley]] plugin, if it is enabled. This can be annoying
+if your code accidentally contains things that look like those.
+
+On the other hand, this also allows your syntax highlighted
+source code to contain markdown formatted comments and hyperlinks
+to other code files, like this:
+
+ /* \[[!format mdwn """
+ This comment will be formatted as *markdown*!
+
+ See \[[bar.h]].
+ ""]] */
+
+Finally, bear in mind that this lets anyone who can edit a page in your
+wiki also edit source code files that are in your wiki. Use appropriate
+caution.
+
+## colors
+
+The colors etc used for the syntax highlighting are entirely configurable
+by CSS. See ikiwiki's [[style.css]] for the defaults.
diff --git a/doc/plugins/highlight/discussion.mdwn b/doc/plugins/highlight/discussion.mdwn
new file mode 100644
index 000000000..a258f21fd
--- /dev/null
+++ b/doc/plugins/highlight/discussion.mdwn
@@ -0,0 +1,23 @@
+It would be nice to be able to set a few options for the highlighter
+object. In particular, today I noticed my tabs were not being expanded
+correctly, which could be fixed the command line with --replace-tabs but
+programmatically needs a call to setPreformatting. I could probably play
+with this, but what is your preferred way to support options? something
+like 'highlight_options=>{replace_tabs=>8,line_numbers=>0}' ? Of course,
+if you want to implement it I won't complain :-). [[DavidBremner]]
+
+> I don't know about tab replacement, which I can't really see the point
+> of, but if there are multiple options, giving each its own name would
+> work better for websetup than would putting all the options in a
+> sub-hash. --[[Joey]]
+
+
+Has anyone got this running with CentOS/RHEL ?
+Having trouble working out where to get the perl bindings for highlight. --[Mick](http://www.lunix.com.au)
+
+> The perl bindings are hidden in `examples/swig` in highlight's source.
+> --[[Joey]]
+
+Thanks for the prompt reply. All working. I will post on my site tonight and link here what I did on CentOS to make this work. --[Mick](http://www.lunix.com.au)
+
+Any hint on how to highlight actual mdwn or any other supported markup code? -- [wiebel](http://wiebels.info)
diff --git a/doc/plugins/hnb.mdwn b/doc/plugins/hnb.mdwn
new file mode 100644
index 000000000..afe04c943
--- /dev/null
+++ b/doc/plugins/hnb.mdwn
@@ -0,0 +1,6 @@
+[[!template id=plugin name=hnb author="[[XTaran]]"]]
+[[!tag type/format type/slow]]
+
+This plugin allows ikiwiki to process `.hnb` XML files, as created by the
+Hierarchical Notebook [hnb](http://hnb.sourceforge.net/). To use it, you need
+to have hnb installed, since it uses the command-line interface of the `hnb`
+program.
diff --git a/doc/plugins/hnb/discussion.mdwn b/doc/plugins/hnb/discussion.mdwn
new file mode 100644
index 000000000..45bd703c4
--- /dev/null
+++ b/doc/plugins/hnb/discussion.mdwn
@@ -0,0 +1,28 @@
+I've reviewed this plugin's code, and there is one major issue with it,
+namely this line:
+
+ system("hnb '$params{page}.hnb' 'go root' 'export_html $tmp' > /dev/null");
+
+This could potentially allow execution of arbitrary shell code, if the filename
+contains a single quote.
+
+* Fixed with version 0.02 by usage of `$params{content}` -- XTaran
+
+Ikiwiki doesn't allow that by default, but I prefer to never involve a shell where one is not needed. The otl plugin is a good example of how to safely fork a child process without involving the shell.
+
+* Had a look at that one as example before writing the hnb plugin, but hnb has different input/output characteristics. I would prefer another solution, too, but as long as it works and is secure, I'm fine with the current (fixed :-) ) solution -- [[XTaran]].
+
+Other problems:
+
+* Use of shell mktemp from perl is suboptimal. File::Temp would be better.
+ * Fixed with version 0.02 -- [[XTaran]]
+* The htmlize hook should not operate on the contents of `$params{page}.hnb`.
+ The content that needs to be htmlized is passed in to the hook in
+ `$params{content}`.
+ * Fixed with version 0.02 -- [[XTaran]]
+
+If these problems are resolved and a copyright statement is added to the file,
+
+* Copyright statement has been in there for about a month. -- [[XTaran]]
+
+I'd be willing to include this plugin in ikiwiki. --[[Joey]]
diff --git a/doc/plugins/html.mdwn b/doc/plugins/html.mdwn
new file mode 100644
index 000000000..55e11bff0
--- /dev/null
+++ b/doc/plugins/html.mdwn
@@ -0,0 +1,11 @@
+[[!template id=plugin name=html author="[[Joey]]"]]
+[[!tag type/html type/format]]
+
+This plugin lets html pages be used as source pages for the wiki. The
+html pages will still be wrapped in the same html template as any other
+page, so for best results you should include only the page body in the html
+file. Also, if the [[htmlscrubber]] plugin is enabled, the html pages will be
+sanitised like any other page. You can also use standard
+[[WikiLinks|ikiwiki/WikiLink]] etc in the html pages.
+
+This plugin is included in ikiwiki, but is not enabled by default.
diff --git a/doc/plugins/htmlbalance.mdwn b/doc/plugins/htmlbalance.mdwn
new file mode 100644
index 000000000..f4e2298ee
--- /dev/null
+++ b/doc/plugins/htmlbalance.mdwn
@@ -0,0 +1,9 @@
+[[!template id=plugin name=htmlbalance author="[[Simon_McVittie|smcv]]"]]
+[[!tag type/html]]
+
+This plugin ensures that the HTML emitted by ikiwiki contains well-balanced
+HTML tags, by parsing it with [[!cpan HTML::TreeBuilder]] and re-serializing it. This
+acts as a lighter-weight alternative to [[plugins/htmltidy]]; it doesn't
+ensure validity, but it does at least ensure that formatting from a
+blog post pulled in by the [[ikiwiki/directive/inline]] directive doesn't
+leak into the rest of the page.
diff --git a/doc/plugins/htmlbalance/discussion.mdwn b/doc/plugins/htmlbalance/discussion.mdwn
new file mode 100644
index 000000000..c66528a4f
--- /dev/null
+++ b/doc/plugins/htmlbalance/discussion.mdwn
@@ -0,0 +1,10 @@
+Would it be possible to use [[!cpan HTML::Entities]] rather than
+`XML::Atom::Util` for encoding entities? The former is already an ikiwiki
+dependency (via [[!cpan HTML::Parser]]).
+
+> Now switched to HTML::Entities --[[Joey]]
+
+I also wonder if there's any benefit to using this plugin aside from with
+aggregate. Perhaps a small one, but aggregate seems like the main case...
+wondering if it would be better to just have aggregate balance the html
+automatically and do away with the separate plugin. --[[Joey]]
diff --git a/doc/plugins/htmlscrubber.mdwn b/doc/plugins/htmlscrubber.mdwn
new file mode 100644
index 000000000..08c81212b
--- /dev/null
+++ b/doc/plugins/htmlscrubber.mdwn
@@ -0,0 +1,51 @@
+[[!template id=plugin name=htmlscrubber core=1 author="[[Joey]]"]]
+[[!tag type/html]]
+
+This plugin is enabled by default. It sanitizes the html on pages it renders
+to avoid XSS attacks and the like.
+
+It excludes all html tags and attributes except for those that are
+whitelisted using the same lists as used by Mark Pilgrim's Universal Feed
+Parser, documented at
+<http://web.archive.org/web/20110726052341/http://feedparser.org/docs/html-sanitization.html>.
+Notably it strips `style` and `link` tags, and the `style` attribute.
+
+Any attributes that could be used to specify a URL are checked to ensure
+that they are known, safe schemes. It will also block embedded javascript
+in such URLs.
+
+It uses the [[!cpan HTML::Scrubber]] perl module to perform its html
+sanitisation, and this perl module also deals with various entity encoding
+tricks.
+
+While I believe that this makes ikiwiki as resistant to malicious html
+content as anything else on the web, I cannot guarantee that it will
+actually protect every user of every browser from every browser security
+hole, badly designed feature, etc. I can provide NO WARRANTY, like it says
+in ikiwiki's [[GPL]] license.
+
+The web's security model is *fundamentally broken*; ikiwiki's html
+sanitisation is only a patch on the underlying gaping hole that is your web
+browser.
+
+Note that enabling or disabling the htmlscrubber plugin also affects some
+other HTML-related functionality, such as whether [[meta]] allows
+potentially unsafe HTML tags.
+
+The `htmlscrubber_skip` configuration setting can be used to skip scrubbing
+of some pages. Set it to a [[ikiwiki/PageSpec]], such as
+`posts/* and !comment(*) and !*/Discussion`, and pages matching that can have
+all the evil CSS, JavaScript, and unsafe html elements you like. One safe
+way to use this is to use [[lockedit]] to lock those pages, so only admins
+can edit them.
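+
+For example, such a setup might look like this in the setup file (the
+PageSpec here is just an illustration):
+
+	htmlscrubber_skip => 'posts/* and !comment(*) and !*/Discussion',
+	locked_pages => 'posts/* and !comment(*) and !*/Discussion',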
+
+----
+
+Some examples of embedded javascript that won't be let through when this
+plugin is active:
+
+* script tag test <script>window.location='http://example.org';</script>
+* <span style="background: url(javascript:window.location='http://example.org/')">CSS script test</span>
+* <span style="&#x61;&#x6e;&#x79;&#x3a;&#x20;&#x65;&#x78;&#x70;&#x72;&#x65;&#x73;&#x73;&#x69;&#x6f;&#x6e;&#x28;&#x77;&#x69;&#x6e;&#x64;&#x6f;&#x77;&#x2e;&#x6c;&#x6f;&#x63;&#x61;&#x74;&#x69;&#x6f;&#x6e;&#x3d;&#x27;&#x68;&#x74;&#x74;&#x70;&#x3a;&#x2f;&#x2f;&#x65;&#x78;&#x61;&#x6d;&#x70;&#x6c;&#x65;&#x2e;&#x6f;&#x72;&#x67;&#x2f;&#x27;&#x29;">entity-encoded CSS script test</span>
+* <span style="&#97;&#110;&#121;&#58;&#32;&#101;&#120;&#112;&#114;&#101;&#115;&#115;&#105;&#111;&#110;&#40;&#119;&#105;&#110;&#100;&#111;&#119;&#46;&#108;&#111;&#99;&#97;&#116;&#105;&#111;&#110;&#61;&#39;&#104;&#116;&#116;&#112;&#58;&#47;&#47;&#101;&#120;&#97;&#109;&#112;&#108;&#101;&#46;&#111;&#114;&#103;&#47;&#39;&#41;">entity-encoded CSS script test</span>
+* <a href="javascript&#x3A;alert('foo')">click me</a>
diff --git a/doc/plugins/htmlscrubber/discussion.mdwn b/doc/plugins/htmlscrubber/discussion.mdwn
new file mode 100644
index 000000000..5e8b637b7
--- /dev/null
+++ b/doc/plugins/htmlscrubber/discussion.mdwn
@@ -0,0 +1,18 @@
+**Ok, I have yet to post a big dummy wiki-noobie question around here, so here goes:**
+
+Yes, I want to play around with *gulp* Google Ads on an ikiwiki blog, namely, in the *sidebar*.
+
+No, I do not want to turn htmlscrubber off, but apart from that I have not been able to allow &lt;script&gt; elements as required by Google.
+
+Thoughts?
+
+---
+
+***Fixed!***
+
+Did some more reading, did some searching on the wiki, and found, under *embed*, these
+
+ htmlscrubber_skip => '!*/Discussion',
+ locked_pages => '!*/Discussion',
+
+Thanks!
diff --git a/doc/plugins/htmltidy.mdwn b/doc/plugins/htmltidy.mdwn
new file mode 100644
index 000000000..580e56f59
--- /dev/null
+++ b/doc/plugins/htmltidy.mdwn
@@ -0,0 +1,11 @@
+[[!template id=plugin name=htmltidy author="Faidon Liambotis"]]
+[[!tag type/html]]
+[[!tag type/slow]]
+
+This plugin uses [tidy](http://tidy.sourceforge.net/) to tidy up the html
+emitted by ikiwiki. Besides being nicely formatted, this helps ensure that
+even if users enter suboptimal html, your wiki generates valid html.
+
+Note that since tidy is an external program, that is run each time a page
+is built, this plugin will slow ikiwiki down somewhat. [[plugins/htmlbalance]]
+might provide a faster alternative.
diff --git a/doc/plugins/httpauth.mdwn b/doc/plugins/httpauth.mdwn
new file mode 100644
index 000000000..2fae07739
--- /dev/null
+++ b/doc/plugins/httpauth.mdwn
@@ -0,0 +1,37 @@
+[[!template id=plugin name=httpauth author="Alec Berryman"]]
+[[!tag type/auth]]
+
+This plugin allows HTTP basic authentication to be used to log into the
+wiki. In this mode, the web browser authenticates the user by some means,
+and sets the `REMOTE_USER` CGI environment variable. This plugin trusts
+that if that variable is set, the user is authenticated.
+
+## fully authenticated wiki
+
+One way to use the plugin is to configure your web server to require
+HTTP basic authentication for any access to the directory containing the
+wiki (and `ikiwiki.cgi`). The authenticated user will be automatically
+signed into the wiki. This method is suitable only for private wikis.
+
+## separate cgiauthurl
+
+To use httpauth for a wiki where the content is public, and where
+the `ikiwiki.cgi` needs to be usable without authentication (for searching,
+or logging in using other methods, and so on), you can configure a separate
+url that is used for authentication, via the `cgiauthurl` option in the setup
+file. This url will then be redirected to when a user chooses to log in using
+httpauth.
+
+A typical setup is to make an `auth` subdirectory, and symlink `ikiwiki.cgi`
+into it. Then configure the web server to require authentication only for
+access to the `auth` subdirectory. Then `cgiauthurl` is pointed at this
+symlink.
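+
+Concretely, the symlink setup might look like this (the paths are
+illustrative; adjust them to your web server's layout):
+
+	cd /var/www/wiki
+	mkdir auth
+	ln -s ../ikiwiki.cgi auth/ikiwiki.cgi
+
+with the web server configured to require authentication for the `auth`
+subdirectory, and this in the setup file:
+
+	cgiauthurl => 'http://example.com/wiki/auth/ikiwiki.cgi',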
+
+## using only httpauth for some pages
+
+If you want to only use httpauth for editing some pages, while allowing
+other authentication methods to be used for other pages, you can
+configure `httpauth_pagespec` in the setup file. This makes Edit
+links on pages that match the [[ikiwiki/PageSpec]] automatically use
+the `cgiauthurl`, and prevents matching pages from being edited by
+users authenticated via other methods.
diff --git a/doc/plugins/img.mdwn b/doc/plugins/img.mdwn
new file mode 100644
index 000000000..a6cd90f28
--- /dev/null
+++ b/doc/plugins/img.mdwn
@@ -0,0 +1,14 @@
+[[!template id=plugin name=img author="Christian Mock"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/img]] [[ikiwiki/directive]].
+While ikiwiki supports inlining full-size images by making a
+[[ikiwiki/WikiLink]] that points to the image, using this directive you can
+easily scale down an image for inclusion onto a page, providing a link to a
+full-size version.
+
+This plugin uses the [ImageMagick](http://www.imagemagick.org/) tools via
+[PerlMagick](http://www.imagemagick.org/www/perl-magick.html).
+
+Note that this is a stripped down version of Christian Mock's
+[[original_img_plugin|contrib/img]].
diff --git a/doc/plugins/img/discussion.mdwn b/doc/plugins/img/discussion.mdwn
new file mode 100644
index 000000000..e1bb2d15b
--- /dev/null
+++ b/doc/plugins/img/discussion.mdwn
@@ -0,0 +1,12 @@
+It would be useful (at least for me) if one could specify
+(using a [[ikiwiki/WikiLink]]) where the image links to. For example,
+on <http://www.bddebian.com/~wiki/sidebar/> I'd like to have the
+logo link to \[[hurd/logo]] / <http://www.bddebian.com/~wiki/hurd/logo/>
+instead of linking to the PNG image file. --[[tschwinge]]
+
+> Done, use link=somepage --[[Joey]]
+
+It would be handy if the `class` and `id` tags were passed through to the surrounding `table` in the case of `caption` being present. Would this break anything? --[[neale]]
+
+> Seems unlikely to break *too* much. I can imagine css that styles the img
+> unexpectedly applying to the table. --[[Joey]]
diff --git a/doc/plugins/inline.mdwn b/doc/plugins/inline.mdwn
new file mode 100644
index 000000000..3eb849fdb
--- /dev/null
+++ b/doc/plugins/inline.mdwn
@@ -0,0 +1,6 @@
+[[!template id=plugin name=inline core=1 author="[[Joey]]"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/inline]]
+[[ikiwiki/directive]], which allows including one wiki page
+inside another.
diff --git a/doc/plugins/install.mdwn b/doc/plugins/install.mdwn
new file mode 100644
index 000000000..900662eec
--- /dev/null
+++ b/doc/plugins/install.mdwn
@@ -0,0 +1,19 @@
+[[!meta title="Installing third party plugins"]]
+
+Most ikiwiki plugins are perl modules and should be installed somewhere in
+the perl module search path. See the @INC list at the end of the output of
+`perl -V` for a list of the directories in that path. All plugins are in
+the `IkiWiki::Plugin` namespace, so they go in an `IkiWiki/Plugin` subdirectory
+inside the perl search path. For example, if your perl looks in
+`/usr/local/lib/site_perl` for modules, you can locally install ikiwiki
+plugins to `/usr/local/lib/site_perl/IkiWiki/Plugin`.
+
+You can use the `libdir` configuration option to add a directory to the
+search path. For example, if you set `libdir` to `/home/you/.ikiwiki/`,
+then ikiwiki will look for plugins in `/home/you/.ikiwiki/IkiWiki/Plugin`.
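+
+For example, with this in the setup file (the directory name is just an
+illustration):
+
+	libdir => '/home/you/.ikiwiki/',
+
+a plugin named `foo` would be loaded from
+`/home/you/.ikiwiki/IkiWiki/Plugin/foo.pm`.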
+
+Ikiwiki also supports plugins that are external programs. These are
+typically written in some other language than perl. Ikiwiki searches for
+these in `/usr/lib/ikiwiki/plugins` by default. If `libdir` is set, it will
+also look under that directory, for example in `/home/you/.ikiwiki/plugins`.
+Note that this type of plugin has to be executable for ikiwiki to use it.
diff --git a/doc/plugins/link.mdwn b/doc/plugins/link.mdwn
new file mode 100644
index 000000000..7dfa50de4
--- /dev/null
+++ b/doc/plugins/link.mdwn
@@ -0,0 +1,5 @@
+[[!template id=plugin name=link core=1 author="[[Joey]]"]]
+[[!tag type/link]]
+
+This plugin implements standard [[WikiLinks|ikiwiki/wikilink]] and links to
+external pages.
diff --git a/doc/plugins/linkmap.mdwn b/doc/plugins/linkmap.mdwn
new file mode 100644
index 000000000..7e51cd935
--- /dev/null
+++ b/doc/plugins/linkmap.mdwn
@@ -0,0 +1,14 @@
+[[!template id=plugin name=linkmap author="[[Joey]]"]]
+[[!tag type/meta]]
+[[!tag type/widget]]
+[[!tag type/slow]]
+
+This plugin provides the [[ikiwiki/directive/linkmap]] [[ikiwiki/directive]].
+It uses [graphviz](http://www.graphviz.org/) to generate a graph showing the
+links between a set of pages in the wiki.
+
+[[!if test="enabled(linkmap)" then="""
+Here is an example link map, of the index page and all pages it links to:
+
+[[!linkmap pages="index or (backlink(index) and !*.png)"]]
+"""]]
diff --git a/doc/plugins/listdirectives.mdwn b/doc/plugins/listdirectives.mdwn
new file mode 100644
index 000000000..df854de52
--- /dev/null
+++ b/doc/plugins/listdirectives.mdwn
@@ -0,0 +1,16 @@
+[[!template id=plugin name=listdirectives author="Will"]]
+[[!tag type/meta]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/listdirectives]]
+[[ikiwiki/directive]], which inserts a list of currently available
+directives into the page.
+
+Each directive links to a wiki page with the same name, which should
+document that directive. The location of these pages can be controlled via
+the `directive_description_dir` setting in the setup file; the default is
+"ikiwiki/directive/foo".
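+
+For example, to keep directive documentation under `help/directives`
+instead (an illustrative location):
+
+	directive_description_dir => 'help/directives',
+
+A directive named `foo` would then link to `help/directives/foo`.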
+
+When this plugin is enabled, it enables the directives underlay, which
+contains documentation about all the directives included in plugins shipped
+with ikiwiki. This adds about 200 kb to the size of your wiki.
diff --git a/doc/plugins/localstyle.mdwn b/doc/plugins/localstyle.mdwn
new file mode 100644
index 000000000..70a909d68
--- /dev/null
+++ b/doc/plugins/localstyle.mdwn
@@ -0,0 +1,12 @@
+[[!template id=plugin name=localstyle author="[[Joey]]"]]
+[[!tag type/chrome]]
+
+This plugin allows styling different sections of a wiki using different
+versions of the local.css [[CSS]] file. Normally this file is read from the
+top level of the wiki, but with this plugin enabled, standard
+[[ikiwiki/subpage/LinkingRules]] are used to find the closest local.css
+file to each page.
+
+So, for example, to use different styling for page `foo`, as well as all
+of its [[SubPages|ikiwiki/subpage]], such as `foo/bar`, create a
+`foo/local.css`.
diff --git a/doc/plugins/lockedit.mdwn b/doc/plugins/lockedit.mdwn
new file mode 100644
index 000000000..8569238b1
--- /dev/null
+++ b/doc/plugins/lockedit.mdwn
@@ -0,0 +1,23 @@
+[[!template id=plugin name=lockedit core=1 author="[[Joey]]"]]
+[[!tag type/auth type/comments]]
+
+This plugin allows the administrator of a wiki to lock some pages, limiting
+who can edit them using the online interface. This doesn't prevent anyone
+who can commit to the underlying revision control system from editing the
+pages, however. (Unless you set up [[tips/untrusted_git_push]].)
+
+The `locked_pages` setting configures what pages are locked. It is a
+[[ikiwiki/PageSpec]], so you have lots of control over what kind of pages
+to lock. For example, you could choose to lock all pages created before
+2006, or all pages that are linked to from the page named "locked". More
+usually though, you'll just list some names of pages to lock.
+
+If you want to lock down a blog so only you can post to it, you can just
+lock "*", and enable the [[opendiscussion]] plugin, so readers can still post
+[[comments]].
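+
+In the setup file, that might look like this (illustrative):
+
+	locked_pages => '*',
+	add_plugins => [qw{opendiscussion}],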
+
+Wiki administrators can always edit locked pages. The [[ikiwiki/PageSpec]]
+can specify that some pages are not locked for some users. For example,
+"important_page and !user(joey)" locks `important_page` while still
+allowing joey to edit it, while "!*/Discussion and user(bob)" prevents bob
+from editing pages except for Discussion pages.
diff --git a/doc/plugins/lockedit/discussion.mdwn b/doc/plugins/lockedit/discussion.mdwn
new file mode 100644
index 000000000..867fc6a51
--- /dev/null
+++ b/doc/plugins/lockedit/discussion.mdwn
@@ -0,0 +1,18 @@
+This plugin not only locks pages but also ensures that a user is logged in.
+This seems redundant with signedit to me. I propose removing the if block
+that calls needsignin.
+
+> That was added because the most typical reason for being unable to edit a
+> page is that you are not logged in. And without the jump to logging the
+> user in, there is no way for the user to log in, without navigating away
+> from the page they were trying to edit. --[[Joey]]
+
+>> Ok, but the problem is that when you don't want any signin form you end up
+>> with a lone login button. That might happen if you lock pages based only on
+>> IP addresses, or if you use another cookie from another webapp...
+
+>> That happens to me and I had to reimplement lockedit in my private auth
+>> plugin.
+
+>> Perhaps you could return undef in that case and let another plugin do the
+>> needsignin call? -- [[Jogo]]
diff --git a/doc/plugins/map.mdwn b/doc/plugins/map.mdwn
new file mode 100644
index 000000000..b164d5ca8
--- /dev/null
+++ b/doc/plugins/map.mdwn
@@ -0,0 +1,11 @@
+[[!template id=plugin name=map author="Alessandro Dotti Contra"]]
+[[!tag type/meta type/widget]]
+
+This plugin provides the [[ikiwiki/directive/map]] [[ikiwiki/directive]],
+which generates a hierarchical page map for the wiki.
+
+[[!if test="enabled(map)" then="""
+Here's an example map, for the plugins section of this wiki:
+
+[[!map pages="(plugins or plugins/*) and !*/*/*"]]
+"""]]
diff --git a/doc/plugins/map/discussion.mdwn b/doc/plugins/map/discussion.mdwn
new file mode 100644
index 000000000..54c921b0f
--- /dev/null
+++ b/doc/plugins/map/discussion.mdwn
@@ -0,0 +1,49 @@
+I'm wanting a [[map]] (with indentation levels) showing page _titles_
+instead of page 'names'. As far as I can see, this is not an option with
+existing plugins - I can get a list of pages using [[inline]] and
+appropriate [[templates]], but that has no indentation and therefore
+doesn't show structure well.
+
+The quick way is to modify the map plugin to have a 'titles' option. The
+hard way is to modify inline to have an indentation option, in which case
+inline will be a superset of map functionality. The second option seems a
+little wrong from the point of view of perversely extending what 'inline'
+means, but it seems right from the point of view of combining common
+features. Maybe adding template support to map is a more useful approach
+than just adding a title option.
+
+Thoughts, anyone? --[[KarlMW]]
+
+We'd also very much like to have an option to display the title of the page instead of the filename in the map plugin. --Andrew
+
+There's a patch implementing this in [[!debbug 484510]]. It needs a few fixes
+before I merge it. Now applied. --[[Joey]]
+
+----
+
+I noticed that when the pagespec returns no map items, the map plugin does not close off the ul and div tags. Below is a simple patch
+that seems to work on the examples I tried. I am a beginner so please help me out here. Thanks. --[[harishcm]]
+
+ --- a/map.pm
+ +++ b/map.pm
+ @@ -81,6 +81,13 @@
+ my $openli=0;
+ my $addparent="";
+ my $map = "<div class='map'>\n<ul>\n";
+ +
+ + # Return properly closed $map if %mapitems is empty
+ + if (!scalar(keys %mapitems)) {
+ + $map .= "</ul>\n</div>\n";
+ + return $map;
+ + }
+ +
+ foreach my $item (sort keys %mapitems) {
+ my @linktext = (length $mapitems{$item} ? (linktext => $mapitems{$item}) : ());
+ $item=~s/^\Q$common_prefix\E\///
+
+> This was also reported as [[bugs/map_fails_to_close_ul_element_for_empty_list]];
+> this patch is simpler than the one there, but has the same problem (it emits
+> `<ul></ul>`, which technically isn't valid HTML either). --[[smcv]]
+
+>> Thanks for the tip, I added another patch addressing the issue at
+>> [[bugs/map_fails_to_close_ul_element_for_empty_list]]. --[[harishcm]]
diff --git a/doc/plugins/mdwn.mdwn b/doc/plugins/mdwn.mdwn
new file mode 100644
index 000000000..8a7308305
--- /dev/null
+++ b/doc/plugins/mdwn.mdwn
@@ -0,0 +1,23 @@
+[[!template id=plugin name=mdwn core=1 author="[[Joey]]"]]
+[[!tag type/format]]
+
+This plugin lets ikiwiki convert files with names ending in ".mdwn" to html.
+It uses the [[ikiwiki/markdown]] minimal markup language.
+
+This is the standard markup language used by ikiwiki, although some others
+are also available in other plugins.
+
+There are several implementations of markdown support that can be used by
+this plugin. In order of preference:
+
+* [Discount](http://www.pell.portland.or.us/~orc/Code/discount/),
+ via the [[!cpan Text::Markdown::Discount]] perl module.
+* The [[!cpan Text::Markdown]] perl module.
+* The [original version of markdown](http://daringfireball.net/projects/markdown/).
+
+[[!cpan Text::MultiMarkdown]] can be used in order to use tables, footnotes,
+and other new features from the markdown variant called
+[multimarkdown](http://fletcherpenney.net/MultiMarkdown/). Multimarkdown is
+not enabled by default, but can be turned on via the `multimarkdown` option
+in the setup file. Note that multimarkdown's metadata and wikilinks
+features are disabled when it's used with ikiwiki.
diff --git a/doc/plugins/mdwn/discussion.mdwn b/doc/plugins/mdwn/discussion.mdwn
new file mode 100644
index 000000000..4b05e7f4e
--- /dev/null
+++ b/doc/plugins/mdwn/discussion.mdwn
@@ -0,0 +1,7 @@
+Unlike the other formats, ikiwiki somewhat depends on mdwn, since the
+underlay dir is written in mdwn. If you want to disable mdwn, you need to
+override the underlay dir (set `underlaydir` in ikiwiki.setup to your own
+underlay dir, or replace the underlay pages in your $SRC).
diff --git a/doc/plugins/meta.mdwn b/doc/plugins/meta.mdwn
new file mode 100644
index 000000000..e49bdcc50
--- /dev/null
+++ b/doc/plugins/meta.mdwn
@@ -0,0 +1,5 @@
+[[!template id=plugin name=meta core=1 author="[[Joey]]"]]
+[[!tag type/meta]]
+
+This plugin provides the [[ikiwiki/directive/meta]] [[ikiwiki/directive]],
+which allows inserting various metadata into the source of a page.
diff --git a/doc/plugins/meta/discussion.mdwn b/doc/plugins/meta/discussion.mdwn
new file mode 100644
index 000000000..814b93a41
--- /dev/null
+++ b/doc/plugins/meta/discussion.mdwn
@@ -0,0 +1,18 @@
+# Quotation marks inside the title parameter?
+
+How can I use quotation marks inside the title parameter? Is it possible?
+
+I was trying to escape them using backslashes (title="Foo \"Bar\" Baz")
+or "cheat" ikiwiki using apostrophes (title='Foo "Bar" Baz'), but unfortunately
+no success.
+
+Now I've worked around it using apostrophes in another way
+(title="Foo ''Bar'' Baz") :)
+
+--[[Paweł|ptecza]]
+
+> As with any other parameter in a [[ikiwiki/directive]], you can
+> triple-quote, and then include quotation marks inside. --[[Joey]]
+
+>> Thanks for the hint! Toggle plugin is probably my favourite ikiwiki
+>> plugin, but I forget about that :D --[[Paweł|ptecza]]
diff --git a/doc/plugins/mirrorlist.mdwn b/doc/plugins/mirrorlist.mdwn
new file mode 100644
index 000000000..b63685813
--- /dev/null
+++ b/doc/plugins/mirrorlist.mdwn
@@ -0,0 +1,22 @@
+[[!template id=plugin name=mirrorlist author="[[Joey]]"]]
+[[!tag type/web]]
+
+This plugin allows adding a list of links to mirrors to each page in the
+wiki. For each mirror, a name and a url should be specified. Pages are
+assumed to exist in the same location under the specified url on each
+mirror.
+
+If the `usedirs` setting is not the same on all your mirrors, or if it
+differs between your local wiki and a mirror, you can instead let each
+mirror's ikiwiki CGI find the correct target page url itself; in that
+case, the mirror urls must be set to their ikiwiki CGI urls instead of
+their base urls. Example:
+
+ mirrorlist_use_cgi => 1,
+ mirrorlist => {
+ 'mirror1' => 'https://mirror.example.org/ikiwiki.cgi',
+ 'mirror2' => 'https://mirror2.example.org/ikiwiki.cgi',
+ },
+
+The mirrors must have the ikiwiki CGI and the [[goto]] plugin enabled
+for this to work.
diff --git a/doc/plugins/moderatedcomments.mdwn b/doc/plugins/moderatedcomments.mdwn
new file mode 100644
index 000000000..85bcf652b
--- /dev/null
+++ b/doc/plugins/moderatedcomments.mdwn
@@ -0,0 +1,12 @@
+[[!template id=plugin name=moderatedcomments author="[[Joey]]"]]
+[[!tag type/auth type/comments]]
+
+This plugin causes [[comments]] to be held for manual moderation.
+Admins can access the comment moderation queue via their preferences page.
+
+By default, all comments made by anyone who is not an admin will be held
+for moderation. The `moderate_pagespec` setting can be used to specify a
+[[ikiwiki/PageSpec]] to match comments and users who should be moderated.
+For example, to avoid moderating comments from logged-in users, set
+`moderate_pagespec` to "`!user(*)`". Or to moderate everyone except for
+admins, set it to "`!admin(*)`".
diff --git a/doc/plugins/more.mdwn b/doc/plugins/more.mdwn
new file mode 100644
index 000000000..a0664e843
--- /dev/null
+++ b/doc/plugins/more.mdwn
@@ -0,0 +1,6 @@
+[[!template id=plugin name=more author="Ben"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/more]] [[ikiwiki/directive]],
+which is a way to have a "more" link on a post in a blog, that leads to the
+full version of the page.
diff --git a/doc/plugins/more/discussion.mdwn b/doc/plugins/more/discussion.mdwn
new file mode 100644
index 000000000..f369d1e12
--- /dev/null
+++ b/doc/plugins/more/discussion.mdwn
@@ -0,0 +1,7 @@
+# Test:
+
+[[!more linktext="click for more" text="""
+This is the rest of my post. Not intended for people catching up on
+their blogs at 30,000 feet. Because I like to make things
+difficult.
+"""]]
diff --git a/doc/plugins/notifyemail.mdwn b/doc/plugins/notifyemail.mdwn
new file mode 100644
index 000000000..302979e6e
--- /dev/null
+++ b/doc/plugins/notifyemail.mdwn
@@ -0,0 +1,14 @@
+This plugin allows users to subscribe to pages, and emails them when
+they are created or changed.
+
+It needs the [[!cpan Mail::Sendmail]] perl module, and sends mail
+using the local MTA.
+
+Each user can configure which pages they are interested in, using an
+[[ikiwiki/PageSpec]] on their Preferences page. Any change to a page
+matching the PageSpec will send an email that includes the new content of
+the page, and a link to the page on the web.
+
+To make it easy to subscribe to comment threads when posting a comment,
+or a page, there is a check box that can be used to subscribe, without
+needing to manually edit the [[ikiwiki/PageSpec]].
diff --git a/doc/plugins/notifyemail/discussion.mdwn b/doc/plugins/notifyemail/discussion.mdwn
new file mode 100644
index 000000000..631c680fc
--- /dev/null
+++ b/doc/plugins/notifyemail/discussion.mdwn
@@ -0,0 +1,5 @@
+When I try to add this plugin to the setup file and run "ikiwiki --setup" I get an error: Can't locate IkiWiki/Plugin/notifyemail.pm
+
+All the other plugins I have installed have worked, so my setup should be ok - just this one is missing!?!
+
+> It's new in version 3.20120419, perhaps you have an older version? --[[smcv]]
diff --git a/doc/plugins/opendiscussion.mdwn b/doc/plugins/opendiscussion.mdwn
new file mode 100644
index 000000000..8c12199e1
--- /dev/null
+++ b/doc/plugins/opendiscussion.mdwn
@@ -0,0 +1,11 @@
+[[!template id=plugin name=opendiscussion author="[[Joey]]"]]
+[[!tag type/auth]]
+
+This plugin allows editing of Discussion pages, and posting of comments,
+even when the [[lockedit]] plugin has been configured to otherwise prevent
+editing.
+
+Like the [[plugins/anonok]] plugin, this plugin allows anonymous edits
+and comments, so be prepared to deal with comment spam;
+the [[plugins/moderatedcomments]] and [[plugins/blogspam]] plugins might
+be useful to reduce this.
diff --git a/doc/plugins/openid.mdwn b/doc/plugins/openid.mdwn
new file mode 100644
index 000000000..d56d1a396
--- /dev/null
+++ b/doc/plugins/openid.mdwn
@@ -0,0 +1,32 @@
+[[!template id=plugin name=openid core=1 author="[[Joey]]"]]
+[[!tag type/auth]]
+
+This plugin allows users to use their [OpenID](http://openid.net/) to log
+into the wiki.
+
+The plugin needs the [[!cpan Net::OpenID::Consumer]] perl module.
+Version 1.x is needed in order for OpenID v2 to work.
+
+The [[!cpan LWPx::ParanoidAgent]] perl module is used if available, for
+added security. Finally, the [[!cpan Crypt::SSLeay]] perl module is needed
+to support users entering "https" OpenID urls.
+
+This plugin is enabled by default, but can be turned off if you want to
+only use some other form of authentication, such as [[passwordauth]].
+
+## options
+
+These options do not normally need to be set, but can be useful in
+certain setups.
+
+* `openid_realm` can be used to control the scope of the openid request.
+ It defaults to the `cgiurl` (or `openid_cgiurl` if set); only allowing
+ ikiwiki's [[CGI]] to authenticate. If you have multiple ikiwiki instances,
+ or other things using openid on the same site, you may choose to put them
+ all in the same realm to improve the user's openid experience. It is an
+ url pattern, so can be set to eg "http://*.example.com/"
+
+* `openid_cgiurl` can be used to cause a different than usual `cgiurl`
+ to be used when doing openid authentication. The `openid_cgiurl` must
+ point to an ikiwiki [[CGI]], and it will need to match the `openid_realm`
+ to work.
diff --git a/doc/plugins/openid/discussion.mdwn b/doc/plugins/openid/discussion.mdwn
new file mode 100644
index 000000000..a88da8b9d
--- /dev/null
+++ b/doc/plugins/openid/discussion.mdwn
@@ -0,0 +1,26 @@
+There will be a *talk* by Stephane Bortzmeyer about *OpenID* within the program of the
+[*umeet 2007* online conference](http://umeet.uninet.edu/umeet2007/english/pres.html).
+
+It is scheduled for 2007-12-20 18:00 UTC.
+
+See <http://umeet.uninet.edu/umeet2007/english/prog.html> for the complete program
+and for information about how to join.
+
+--[[tschwinge]]
+
+----
+<a id="Yahoo_unsupported" />
+[[!tag bugs]]
+
+It looks like OpenID 2.0 (the only version supported by Yahoo) is not supported in ikiwiki. :( I signed up at http://openid.yahoo.com/ , and tried to login to my ikiwiki with the new ID (of the form: https://me.yahoo.com/a-username), but Yahoo told me:
+
+> Sorry! You will not be able to login to this website as it is using an older version of the OpenID technology. Yahoo! only supports OpenID 2.0 because it is more secure. For more information, check out the OpenID documentation at [Yahoo! Developer Network](http://developer.yahoo.com/openid/).
+
+-- Ivan Z.
+
+They have more on OpenID 2.0 in [their FAQ](http://developer.yahoo.com/openid/faq.html). --Ivan Z.
+
+----
+I'm trying to add a way to query the data saved by the OpenID plugin from outside of ikiwiki, to see what identity the user has been authenticated as, if any. I'm thinking of designating some directories as internal pages and check the identity against a list in a mod_perl access hook. I would also write a CGI script that would return a JSON formatted reply to tell if the user is authenticated for those pages and query it with AJAX and only render links to the internal pages if the user would have access to them. That's just a couple of ideas I'm working on first, but I can imagine that there's any number of other tricks that people could implement with that sort of a thing.
+
+Also, this isn't really specific to OpenID but to all auth plugins, but I'm going to use only OpenID for authentication so that's what I'm targeting right now. I suppose that would be worth its own TODO item. --[[kaol]]
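As a loose sketch of the JSON-reply idea above (every name here is hypothetical; nothing comes from ikiwiki's actual code or data formats):

```python
# Hypothetical shape of the JSON body such an auth-check CGI could return.
# "identity" stands in for whatever OpenID the wiki authenticated, if any.
import json

def auth_reply(identity, allowed_identities):
    """Build a JSON reply saying whether the user is authenticated/authorized."""
    authenticated = identity is not None
    authorized = authenticated and identity in allowed_identities
    return json.dumps({"authenticated": authenticated,
                       "authorized": authorized,
                       "user": identity})

print(auth_reply("https://me.example.com/id", {"https://me.example.com/id"}))
```

An AJAX caller would then only render the links to internal pages when `authorized` is true.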
diff --git a/doc/plugins/orphans.mdwn b/doc/plugins/orphans.mdwn
new file mode 100644
index 000000000..09ad0a51d
--- /dev/null
+++ b/doc/plugins/orphans.mdwn
@@ -0,0 +1,16 @@
+[[!template id=plugin name=orphans author="[[Joey]]"]]
+[[!tag type/meta]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/orphans]]
+[[ikiwiki/directive]], which generates a list of possibly orphaned pages --
+pages that no other page links to.
+
+[[!if test="enabled(orphans)" then="""
+Here's a list of orphaned pages on this wiki:
+
+[[!orphans pages="* and !news/* and !todo/* and !bugs/* and !users/* and
+!recentchanges and !examples/* and !tips/* and !sandbox/* and !templates/* and
+!forum/* and !*.js and
+!wikiicons/* and !plugins/*"]]
+"""]]
diff --git a/doc/plugins/orphans/discussion.mdwn b/doc/plugins/orphans/discussion.mdwn
new file mode 100644
index 000000000..3165e5968
--- /dev/null
+++ b/doc/plugins/orphans/discussion.mdwn
@@ -0,0 +1,22 @@
+It seems that the orphans plugin doesn't recognize markdown-style links of the kind:
+
+ [Pretty link name](realname)
+
+In my wiki, the page "realname" shows up as an orphan although it's being linked to.
+
+> Like anything in ikiwiki that deals with links, this only takes
+> [[WikiLinks|ikiwiki/wikilink]] into account. There should be no real
+> reason to use other link mechanisms provided by eg, markdown for internal
+> links in the wiki (indeed, using them is likely to cause broken links
+> when doing things like inlining or renaming pages). --[[Joey]]
+
+
+The orphans plugin fails with an error when it has to deal with a page that contains '+' characters as part of the filename. Apparently the code uses regular expressions and forgets to quote that string at some crucial point. The error message I see is:
+
+ \[[!orphans Error: Nested quantifiers in regex;
+ marked by <-- HERE in m/^(c++ <-- HERE |)$/ at
+ /usr/lib/perl5/vendor_perl/5.8.8/IkiWiki/Plugin/orphans.pm line 43.]]
+
+--Peter
+
+> Fixed. BTW, for an important bug like this, use [[bugs]]. --[[Joey]]
diff --git a/doc/plugins/osm.mdwn b/doc/plugins/osm.mdwn
new file mode 100644
index 000000000..040d175ca
--- /dev/null
+++ b/doc/plugins/osm.mdwn
@@ -0,0 +1,31 @@
+[[!template id=plugin name=osm author="Blars Blarson, Antoine Beaupré"]]
+[[!tag type/special-purpose todo/geotagging]]
+
+## OpenStreetMap/OpenLayers support for ikiwiki
+
+This plugin provides simple OpenStreetMap/OpenLayers support for ikiwiki.
+It can embed OpenStreetMap viewports within a page or link to a bigger map
+that will have multiple markers, generated with a KML (or CSV, or GeoJSON)
+datafile of markers based on the different calling pages. Multiple distinct
+maps on a single wiki are supported.
+
+You will need the [[!cpan XML::Writer]] perl module to write KML files,
+which is the default mode of operation. GeoJSON files can also be generated
+if the [[!cpan JSON]] perl module is installed.
+
+This provides the [[ikiwiki/directive/waypoint]] and [[ikiwiki/directive/osm]] directives.
+
+---
+
+The plugin was originally written by
+[[the techno-viking|http://techno-viking.com/posts/ikiwiki-maps/]] and fixed up
+by [[anarcat]].
+
+See [[the Mtl-mesh
+wiki|http://mesh.openisp.ca/nodes/anarcat]] for a sample of what this
+plugin can do.
+
+See also [[plugins/contrib/googlemaps]].
+
+This plugin would be greatly improved by
+[[todo/internal_definition_list_support]].
diff --git a/doc/plugins/otl.mdwn b/doc/plugins/otl.mdwn
new file mode 100644
index 000000000..d890b0126
--- /dev/null
+++ b/doc/plugins/otl.mdwn
@@ -0,0 +1,6 @@
+[[!template id=plugin name=otl author="[[Joey]]"]]
+[[!tag type/format]]
+
+This plugin allows ikiwiki to process `.otl` outline files, as created by
+[vimoutliner](http://www.vimoutliner.org/). To use it, you need to have
+vimoutliner installed, since it uses the `otl2html` program.
diff --git a/doc/plugins/pagecount.mdwn b/doc/plugins/pagecount.mdwn
new file mode 100644
index 000000000..71872fae8
--- /dev/null
+++ b/doc/plugins/pagecount.mdwn
@@ -0,0 +1,11 @@
+[[!template id=plugin name=pagecount author="[[Joey]]"]]
+[[!tag type/meta]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/pagecount]]
+[[ikiwiki/directive]], which displays the number of pages
+currently in the wiki.
+
+If it is turned on, it can tell us that this wiki includes
+[[!pagecount ]] pages, of which
+[[!pagecount pages="*/Discussion"]] are discussion pages.
diff --git a/doc/plugins/pagestats.mdwn b/doc/plugins/pagestats.mdwn
new file mode 100644
index 000000000..347e39a89
--- /dev/null
+++ b/doc/plugins/pagestats.mdwn
@@ -0,0 +1,6 @@
+[[!template id=plugin name=pagestats author="Enrico Zini"]]
+[[!tag type/meta type/tags type/widget]]
+
+This plugin provides the [[ikiwiki/directive/pagestats]]
+[[ikiwiki/directive]], which can generate stats about how pages link to
+each other, or display a tag cloud.
diff --git a/doc/plugins/pagetemplate.mdwn b/doc/plugins/pagetemplate.mdwn
new file mode 100644
index 000000000..8254e14c5
--- /dev/null
+++ b/doc/plugins/pagetemplate.mdwn
@@ -0,0 +1,6 @@
+[[!template id=plugin name=pagetemplate author="[[Joey]]"]]
+[[!tag type/chrome]]
+
+This plugin provides the [[ikiwiki/directive/pagetemplate]]
+[[ikiwiki/directive]], which allows a page to be displayed
+using a different [[template|templates]] than the default.
diff --git a/doc/plugins/parentlinks.mdwn b/doc/plugins/parentlinks.mdwn
new file mode 100644
index 000000000..c2d364bef
--- /dev/null
+++ b/doc/plugins/parentlinks.mdwn
@@ -0,0 +1,5 @@
+[[!template id=plugin name=parentlinks core=1 author="[[intrigeri]]"]]
+[[!tag type/link type/chrome]]
+
+This plugin generates the links to a page's parents that typically appear
+at the top of a wiki page.
diff --git a/doc/plugins/passwordauth.mdwn b/doc/plugins/passwordauth.mdwn
new file mode 100644
index 000000000..fe680a0f8
--- /dev/null
+++ b/doc/plugins/passwordauth.mdwn
@@ -0,0 +1,33 @@
+[[!template id=plugin name=passwordauth core=1 author="[[Joey]]"]]
+[[!tag type/auth]]
+
+This plugin lets ikiwiki prompt for a user name and password when logging
+into the wiki. It also handles registering users, resetting passwords, and
+changing passwords in the prefs page.
+
+It is enabled by default, but can be turned off if you want to only use
+some other form of authentication, such as [[httpauth]] or [[openid]].
+
+When the `account_creation_password` configuration option is set to
+a password, this plugin prompts for that password when creating an
+account, as a simplistic anti-spam measure.
+(Some wikis edited by a particular group use an account creation password
+as an "ask an existing member to get an account" system.)
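For example, in a YAML setup file (the password value is of course just a placeholder):

```yaml
# anyone registering an account must type this password
account_creation_password: "s3kr1t-example"
```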
+
+## password storage
+
+Users' passwords are stored in the `.ikiwiki/userdb` file, which needs to
+be kept safe to prevent exposure of passwords. If the
+[[!cpan Authen::Passphrase]] perl module is installed, only hashes of the
+passwords will be stored. This is strongly recommended.
+
+The `password_cost` configuration option can be used to make the stored
+password hashes more difficult to brute force, at the expense of also
+taking more time to check a password when a user logs into the wiki. The
+default value is 8, the maximum value is (currently) 31, and each step
+*doubles* the time required.
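To illustrate the doubling, here is a small back-of-the-envelope helper (not part of ikiwiki) computing the work factor relative to the default cost of 8:

```python
# Relative work to check (or brute-force) one password at a given
# password_cost, compared to the default cost of 8: each step doubles it.
def relative_work(cost, default=8):
    return 2 ** (cost - default)

for cost in (8, 10, 16):
    print(cost, relative_work(cost))
```

So raising `password_cost` from 8 to 16 makes each guess 256 times slower, for the attacker and for your login page alike.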
+
+So if you're worried about your password files leaking and being cracked,
+you can increase the `password_cost` and make that harder. But a better
+choice might be to not deal with user passwords at all, and instead use
+[[openid]]!
diff --git a/doc/plugins/passwordauth/discussion.mdwn b/doc/plugins/passwordauth/discussion.mdwn
new file mode 100644
index 000000000..50e21062e
--- /dev/null
+++ b/doc/plugins/passwordauth/discussion.mdwn
@@ -0,0 +1,151 @@
+It's a bit inconvenient that one also has to type in the
+*Login - Confirm Password* if one only wants to change
+the *Preferences -- Subscriptions*. --[[tschwinge]]
+
+> You don't. The password fields on the preferences page are only needed
+> if you want to change your password and should otherwise be left blank.
+> --[[Joey]]
+
+>> Aha, then the problem is Firefox, which is automatically filling the
+>> *Password* field with its previous value, but not filling the
+>> *Confirm Password* one. --[[tschwinge]]
+
+## easy access to the userdb for apache auth?
+
+My use case is:
+
+* restricted ikiwiki
+* read/edit only allowed from the local network (done with apache restrictions)
+* edit only for people authenticated (done with vanilla ikiwiki passwordauth)
+
+I would like to allow people to read/edit the wiki from outside of the
+local network, if and only if they already have an ikiwiki account.
+
+[[httpauth]] doesn't fit since it doesn't allow anonymous local users
+to create their own account. I want a single, local, simple auth
+database.
+
+My (naïve?) idea would be:
+
+* keep the [[passwordauth]] system
+* provide a way for Apache to use the userdb for authentication if
+people want to connect from outside
+
+I looked at the various auth modules for apache2. It seems that none
+can use a "perl Storable data" file. So, I think some solutions could
+be:
+
+* use a sqlite database instead of a perl Storable file
+ * can be used with
+ [mod_auth_dbd](http://httpd.apache.org/docs/2.2/mod/mod_authn_dbd.html)
+ * requires a change in ikiwiki module [[passwordauth]]
+* use an external program to read the userdb and talk with
+ [mod_auth_external](http://unixpapa.com/mod_auth_external.html)
+  * requires the maintenance of this external auth proxy over ikiwiki
+ userdb format changes
+ * (I don't know perl)
+* include this wrapper in ikiwiki
+  * something like `ikiwiki --auth user:pass:userdb` checks the
+    `user:pass` pair in `userdb` and returns an Accept/Reject flag to
+    Apache
+ * requires a change in ikiwiki core
+ * still requires
+ [mod_auth_external](http://unixpapa.com/mod_auth_external.html)
+* do it with Apache perl sections
+ * (I don't know perl)
+
+Any opinion/suggestion/solution to this is welcome and appreciated.
+
+--
+[[NicolasLimare]]
+
+For a similar use case, I've been intending to implement
+[[todo/httpauth_feature_parity_with_passwordauth]], but your idea may
+actually be the way to go. IMHO, the Perl sections idea is the
+easiest to set up, but in the long run, I'd prefer ikiwiki to optionally
+use a userdb storage backend supported at least by Apache and lighttpd.
+--[[intrigeri]]
+
+Tons of CPAN modules may help, but most of them are specific to `mod_perl`,
+and AFAIK, ikiwiki is generally not run with `mod_perl`. It's not clear to me
+whether these modules depend on the webapp being run with `mod_perl` set
+as the script handler, or only on `mod_perl` being installed and loaded.
+
+* CPAN's `Apache::AuthenHook` allows plugging arbitrary Perl handlers
+  in as Apache authentication providers.
+* CPAN's `Apache::Authen::Program` (`mod_perl`)
+* [mod_auth_tkt](http://www.openfusion.com.au/labs/mod_auth_tkt/) along with CPAN's
+ `Apache::AuthTkt`
+--[[intrigeri]]
+
+I've more or less managed to implement something based on `mod_perl` and
+`Apache::AuthenHook`, respectively in Debian packages `libapache2-mod-perl2`
+and `libapache-authenhook-perl`.
+
+In the Apache VirtualHost configuration, I have added the following:
+
+ PerlLoadModule Apache::AuthenHook
+ PerlModule My::IkiWikiBasicProvider
+
+ <Location /test/>
+ AuthType Basic
+ AuthName "wiki"
+ AuthBasicProvider My::IkiWikiBasicProvider
+ Require valid-user
+ ErrorDocument 401 /test/ikiwiki.cgi?do=signin
+ </Location>
+ <LocationMatch "^/test/(ikiwiki\.cgi$|.*\.css$|wikiicons/)">
+ Satisfy any
+ </LocationMatch>
+
+The perl module lies in `/etc/apache2/My/IkiWikiBasicProvider.pm`:
+
+ package My::IkiWikiBasicProvider;
+
+ use warnings;
+ use strict;
+ use Apache2::Const -compile => qw(OK DECLINED HTTP_UNAUTHORIZED);
+ use Storable;
+ use Authen::Passphrase;
+
+ sub userinfo_retrieve () {
+ my $userinfo=eval{ Storable::lock_retrieve("/var/lib/ikiwiki/test/.ikiwiki/userdb") };
+ return $userinfo;
+ }
+
+ sub handler {
+ my ($r, $user, $password) = @_;
+ my $field = "password";
+
+ if (! defined $password || ! length $password) {
+ return Apache2::Const::DECLINED;
+ }
+ my $userinfo = userinfo_retrieve();
+ if (! length $user || ! defined $userinfo ||
+ ! exists $userinfo->{$user} || ! ref $userinfo->{$user}) {
+ return Apache2::Const::DECLINED;
+ }
+ my $ret=0;
+ if (exists $userinfo->{$user}->{"crypt".$field}) {
+ my $p = Authen::Passphrase->from_crypt($userinfo->{$user}->{"crypt".$field});
+ $ret=$p->match($password);
+ }
+ elsif (exists $userinfo->{$user}->{$field}) {
+ $ret=$password eq $userinfo->{$user}->{$field};
+ }
+ if ($ret) {
+ return Apache2::Const::OK;
+ }
+ return Apache2::Const::DECLINED;
+ }
+
+ 1;
+
+This setup also allows people with the master password to create their own
+account.
+
+I'm not really fluent in Perl, and all this can probably be improved (*or
+might destroy your computer as it is* and YMMV).
+
+-- [[Lunar]]
diff --git a/doc/plugins/pingee.mdwn b/doc/plugins/pingee.mdwn
new file mode 100644
index 000000000..6156c235f
--- /dev/null
+++ b/doc/plugins/pingee.mdwn
@@ -0,0 +1,11 @@
+[[!template id=plugin name=pingee author="[[Joey]]"]]
+[[!tag type/special-purpose]]
+
+This plugin causes ikiwiki to listen for pings, typically delivered from
+another ikiwiki instance using the [[pinger]] plugin. When a ping is
+received, ikiwiki will update the wiki, the same as if `ikiwiki --refresh`
+were run at the command line.
+
+A URL such as the following is used to trigger a ping:
+
+ http://mywiki.com/ikiwiki.cgi?do=ping
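A mirror can be pinged from a cron job or commit hook; here is a small sketch (the wiki URL is the example above, and the actual network request is left commented out so the sketch stays self-contained):

```python
# Build the ping URL for an ikiwiki CGI; adjust the base URL for your site.
from urllib.parse import urlencode

def ping_url(cgi_url):
    return cgi_url + "?" + urlencode({"do": "ping"})

print(ping_url("http://mywiki.com/ikiwiki.cgi"))
# To actually send the ping (untested here):
# import urllib.request
# urllib.request.urlopen(ping_url("http://mywiki.com/ikiwiki.cgi"), timeout=60)
```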
diff --git a/doc/plugins/pingee/discussion.mdwn b/doc/plugins/pingee/discussion.mdwn
new file mode 100644
index 000000000..a54caf235
--- /dev/null
+++ b/doc/plugins/pingee/discussion.mdwn
@@ -0,0 +1,9 @@
+I see that this code performs an RCS update to collect updates. Is there anything in this that would help propagate modifications to the other machine... or would this require a reciprocal pingee setup on the other end?
+
+-- [[harningt]]
+
+> The pingee won't push any pending updates it has back to the pinger, no.
+> But often the pinger is the origin of the rcs checkout on the pingee, and
+> if so, ikiwiki defaults to automatically pushing updates to the origin
+> when they are made. Or you can set up a reciprocal pingee/pinger setup as
+> described. --[[Joey]]
diff --git a/doc/plugins/pinger.mdwn b/doc/plugins/pinger.mdwn
new file mode 100644
index 000000000..00d83e1bb
--- /dev/null
+++ b/doc/plugins/pinger.mdwn
@@ -0,0 +1,20 @@
+[[!template id=plugin name=pinger author="[[Joey]]"]]
+[[!tag type/special-purpose]]
+
+This plugin allows ikiwiki to be configured to hit a URL each time it
+updates the wiki. One way to use this is in conjunction with the [[pingee]]
+plugin to set up a loosely coupled mirror network, or a branched version of
+a wiki. By pinging the mirror or branch each time the main wiki changes, it
+can be kept up-to-date.
+
+To configure what URLs to ping, use the [[ikiwiki/directive/ping]]
+[[ikiwiki/directive]].
+
+The [[!cpan LWP]] perl module is used for pinging, or the [[!cpan
+LWPx::ParanoidAgent]] perl module if it is available, for added security.
+The [[!cpan Crypt::SSLeay]] perl module is needed to support pinging
+"https" URLs.
+
+By default the pinger will try to ping a site for 15 seconds before timing
+out. This timeout can be changed by setting the `pinger_timeout`
+configuration setting in the setup file.
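For example, to wait up to 30 seconds per ping, in a YAML setup file:

```yaml
pinger_timeout: 30
```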
diff --git a/doc/plugins/po.mdwn b/doc/plugins/po.mdwn
new file mode 100644
index 000000000..b7c1582ca
--- /dev/null
+++ b/doc/plugins/po.mdwn
@@ -0,0 +1,260 @@
+[[!template id=plugin name=po core=0 author="[[intrigeri]]"]]
+[[!tag type/format]]
+
+This plugin adds support for multi-lingual wikis, translated with
+gettext, using [po4a](http://po4a.alioth.debian.org/).
+
+It depends on the Perl `Locale::Po4a::Po` library (`apt-get install po4a`).
+As detailed below in the security section, `po4a` is subject to
+denial-of-service attacks before version 0.35.
+
+[[!toc levels=2]]
+
+Introduction
+============
+
+A language is chosen as the "master" one, and any other supported
+language is a "slave" one.
+
+A page written in the "master" language is a "master" page. It can be
+of any page type supported by ikiwiki, except `po`. It does not have to
+be named in any special way: migration to this plugin does not imply any
+page renaming work.
+
+Example: `bla/page.mdwn` is a "master" Markdown page written in
+English; if `usedirs` is enabled, it is rendered as
+`bla/page/index.en.html`, else as `bla/page.en.html`.
+
+Any translation of a "master" page into a "slave" language is called
+a "slave" page; it is written in the gettext PO format. `po` is now
+a page type supported by ikiwiki.
+
+Example: `bla/page.fr.po` is the PO "message catalog" used to
+translate `bla/page.mdwn` into French; if `usedirs` is enabled, it is
+rendered as `bla/page/index.fr.html`, else as `bla/page.fr.html`.
+
+(In)Compatibility
+=================
+
+This plugin does not support the `indexpages` mode. If you don't know
+what it is, you probably don't care.
+
+
+Configuration
+=============
+
+Supported languages
+-------------------
+
+`po_master_language` is used to set the "master" language in
+`ikiwiki.setup`, such as:
+
+ po_master_language: en|English
+
+`po_slave_languages` is used to set the list of supported "slave"
+languages, such as:
+
+ po_slave_languages:
+ - fr|Français
+ - es|Español
+ - de|Deutsch
+
+Decide which pages are translatable
+-----------------------------------
+
+The `po_translatable_pages` setting configures what pages are
+translatable. It is a [[ikiwiki/PageSpec]], so you have lots of
+control over what kind of pages are translatable.
+
+The `.po` files are not considered translatable, so you don't need to
+worry about excluding them explicitly from this [[ikiwiki/PageSpec]].
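As an illustration, a setup making every page translatable except discussion pages could look like this (just a sketch of one possible [[ikiwiki/PageSpec]]):

```yaml
po_translatable_pages: "* and !*/Discussion"
```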
+
+Internal links
+--------------
+
+### Links targets
+
+The `po_link_to` option in `ikiwiki.setup` is used to decide how
+internal links should be generated, depending on web server features
+and site-specific preferences.
+
+#### Default linking behavior
+
+If `po_link_to` is unset, or set to `default`, ikiwiki's default
+linking behavior is preserved: `\[[destpage]]` links to the master
+language's page.
+
+#### Link to current language
+
+If `po_link_to` is set to `current`, `\[[destpage]]` links to the
+`destpage`'s version written in the current page's language, if
+available, *i.e.*:
+
+* `foo/destpage/index.LL.html` if `usedirs` is enabled
+* `foo/destpage.LL.html` if `usedirs` is disabled
+
+#### Link to negotiated language
+
+If `po_link_to` is set to `negotiated`, `\[[page]]` links to the
+negotiated preferred language, *i.e.* `foo/page/`.
+
+(In)compatibility notes:
+
+* if `usedirs` is disabled, it does not make sense to set `po_link_to`
+ to `negotiated`; this option combination is neither implemented
+ nor allowed.
+* if the web server does not support Content Negotiation, setting
+  `po_link_to` to `negotiated` will produce an unusable website.
+
+Server support
+==============
+
+Apache
+------
+
+Using Apache `mod_negotiation` makes it really easy to have Apache
+serve any page in the client's preferred language, if available.
+
+Add 'Options MultiViews' to the wiki directory's configuration in Apache.
+
+When `usedirs` is enabled, you should also set `DirectoryIndex index`.
+
+These settings are also recommended, in order to avoid serving up rss files
+as index pages:
+
+ AddType application/rss+xml;qs=0.8 .rss
+ AddType application/atom+xml;qs=0.8 .atom
+
+For details, see [Apache's documentation](http://httpd.apache.org/docs/2.2/content-negotiation.html).
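Putting the pieces above together, a minimal sketch of the relevant Apache configuration might look like this (`/var/www/wiki` is a placeholder for your destination directory):

```apache
<Directory /var/www/wiki>
    # let mod_negotiation pick the language variant
    Options MultiViews
    # needed when usedirs is enabled
    DirectoryIndex index
</Directory>
# avoid serving feeds as index pages
AddType application/rss+xml;qs=0.8 .rss
AddType application/atom+xml;qs=0.8 .atom
```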
+
+lighttpd
+--------
+
+Recent versions of lighttpd should be able to use
+`$HTTP["language"]` to configure the translated pages to be served.
+
+See [Lighttpd Issue](http://redmine.lighttpd.net/issues/show/1119)
+
+TODO: Example
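A possible starting point, untested, assuming French as the only slave language; the exact syntax and availability of the `$HTTP["language"]` conditional depend on the lighttpd version:

```
$HTTP["language"] =~ "fr" {
    index-file.names = ( "index.fr.html", "index.en.html" )
}
```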
+
+Usage
+=====
+
+Templates
+---------
+
+When `po_link_to` is not set to `negotiated`, one should replace some
+occurrences of `BASEURL` with `HOMEPAGEURL` to get correct links to
+the wiki homepage.
+
+The `ISTRANSLATION` and `ISTRANSLATABLE` variables can be used to
+display things only on translatable or translation pages.
+
+The `LANG_CODE` and `LANG_NAME` variables can respectively be used to
+display the current page's language code and pretty name.
+
+### Display page's versions in other languages
+
+The `OTHERLANGUAGES` loop provides ways to display other languages'
+versions of the same page, and the translations' status.
+
+An example of its use can be found in the default
+`templates/page.tmpl`. In case you want to customize it, the following
+variables are available inside the loop (for each of those pages):
+
+* `URL` - URL of the page
+* `CODE` - two-letter language code
+* `LANGUAGE` - language name (as defined in `po_slave_languages`)
+* `MASTER` - is true (1) if, and only if, the page is a "master" page
+* `PERCENT` - for "slave" pages, is set to the translation completeness, as a percentage
+
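For instance, a list of links to the other versions could be sketched like this in a template (modelled on the default `page.tmpl`, but not copied from it):

```html
<TMPL_LOOP OTHERLANGUAGES>
  <a href="<TMPL_VAR URL>"><TMPL_VAR LANGUAGE></a>
  (<TMPL_IF MASTER>master<TMPL_ELSE><TMPL_VAR PERCENT>%</TMPL_IF>)
</TMPL_LOOP>
```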
+### Display the current translation status
+
+The `PERCENTTRANSLATED` variable is set to the translation
+completeness, expressed in percent, on "slave" pages. It is used by
+the default `templates/page.tmpl`.
+
+Additional PageSpec tests
+-------------------------
+
+This plugin enhances the regular [[ikiwiki/PageSpec]] syntax with some
+additional tests that are documented [[here|ikiwiki/pagespec/po]].
+
+Automatic PO file update
+------------------------
+
+Committing changes to a "master" page:
+
+1. updates the POT file and the PO files for the "slave" languages;
+ the updated PO files are then put under version control;
+2. triggers a refresh of the corresponding HTML slave pages.
+
+Also, when the plugin has just been enabled, or when a page has just
+been declared as being translatable, the needed POT and PO files are
+created, and the PO files are checked into version control.
+
+Discussion pages and other sub-pages
+------------------------------------
+
+Discussion should happen in the language in which the pages are
+actually written, *i.e.* the "master" one. If discussion pages are
+enabled, "slave" pages therefore link to the "master" page's
+discussion page.
+
+Likewise, "slave" pages are not supposed to have sub-pages;
+[[WikiLinks|ikiwiki/wikilink]] that appear on a "slave" page therefore link to
+the master page's sub-pages.
+
+Translating
+-----------
+
+One can edit the PO files using ikiwiki's CGI (a message-by-message
+interface could also be implemented at some point).
+
+If [[tips/untrusted_git_push]] is set up, one can edit the PO files in one's
+preferred `$EDITOR`, without needing to be online.
+
+Markup languages support
+------------------------
+
+[[Markdown|mdwn]] and [[html]] are well supported. Some other markup
+languages supported by ikiwiki mostly work, but some pieces of syntax
+are not rendered correctly on the slave pages:
+
+* [[reStructuredText|rst]]: anonymous hyperlinks and internal
+ cross-references
+* [[wikitext]]: conversion of newlines to paragraphs
+* [[creole]]: verbatim text is wrapped, tables are broken
+* LaTeX: not supported yet; the dedicated po4a module
+ could be used to support it, but it would need a security audit
+* other markup languages have not been tested.
+
+Renaming a page
+---------------
+
+A translatable page may be renamed using the web interface and the
+[[rename plugin|plugins/rename]], or using the VCS directly; in
+the latter case, *both* the "master" page and every corresponding
+`.po` file must be renamed in the same commit.
+
+Security
+========
+
+[[po/discussion]] contains a detailed security analysis of this plugin
+and its dependencies.
+
+When using po4a older than 0.35, it is recommended to uninstall
+`Text::WrapI18N` (Debian package `libtext-wrapi18n-perl`), in order to
+avoid a potential denial of service.
+
+BUGS
+====
+
+[[!inline pages="bugs/po:* and !bugs/done and !link(bugs/done) and !bugs/*/*"
+feeds=no actions=no archive=yes show=0]]
+
+TODO
+====
+
+[[!inline pages="todo/po:* and !todo/done and !link(todo/done) and !todo/*/*"
+feeds=no actions=no archive=yes show=0]]
diff --git a/doc/plugins/po/discussion.mdwn b/doc/plugins/po/discussion.mdwn
new file mode 100644
index 000000000..50998e822
--- /dev/null
+++ b/doc/plugins/po/discussion.mdwn
@@ -0,0 +1,721 @@
+[[!toc ]]
+
+----
+
+# Security review
+
+## Probable holes
+
+_(The list of things to fix.)_
+
+### po4a-gettextize
+
+* po4a CVS 2009-01-16
+* Perl 5.10.0
+
+`po4a-gettextize` uses more or less the same po4a features as our
+`refreshpot` function.
+
+Without specifying an input charset, zzuf'ed `po4a-gettextize` quickly
+errors out, complaining it was not able to detect the input charset;
+it leaves no incomplete file on disk. I therefore had to pretend the
+input was in UTF-8, as does the po plugin.
+
+ zzuf -c -s 13 -r 0.1 \
+ po4a-gettextize -f text -o markdown -M utf-8 -L utf-8 \
+ -m GPL-3 -p GPL-3.pot
+
+Crashes with:
+
+ Malformed UTF-8 character (UTF-16 surrogate 0xdfa4) in substitution
+ iterator at /usr/share/perl5/Locale/Po4a/Po.pm line 1449.
+ Malformed UTF-8 character (fatal) at /usr/share/perl5/Locale/Po4a/Po.pm
+ line 1449.
+
+An incomplete pot file is left on disk. Unfortunately Po.pm tells us
+nothing about the place where the crash happens.
+
+> It's fairly standard perl behavior when fed malformed utf-8. As long
+> as it doesn't crash ikiwiki, it's probably acceptable. Ikiwiki can
+> do some similar things itself when fed malformed utf-8 (doesn't
+> crash tho) --[[Joey]]
+
+----
+
+## Potential gotchas
+
+_(Things not to do.)_
+
+
+### Blindly activating more po4a format modules
+
+The format modules we want to use have to be checked, as not all are
+safe (e.g. the LaTeX module's behaviour is changed by commands
+included in the content); they may use regexps generated from
+the content.
+
+----
+
+## Hopefully non-holes
+
+_(AKA, the assumptions that will be the root of most security holes...)_
+
+### PO file features
+
+No [documented](http://www.gnu.org/software/gettext/manual/gettext.html#PO-Files)
+directive that can be put in po files is supposed to cause mischief
+(ie, include other files, run commands, crash gettext, whatever).
+
+### gettext
+
+#### Security history
+
+The only past security issue I could find in GNU gettext is
+[CVE-2004-0966](http://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2004-0966),
+*i.e.* [Debian bug #278283](http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=278283):
+the autopoint and gettextize scripts in the GNU gettext package (1.14
+and later versions) may allow local users to overwrite files via
+a symlink attack on temporary files.
+
+This plugin would not have made this bug exploitable, as it does not
+use, either directly or indirectly, the faulty scripts.
+
+Note: the lack of found security issues can either indicate that there
+are none, or reveal that no-one ever bothered to find or publish them.
+
+#### msgmerge
+
+`refreshpofiles()` runs this external program.
+
+* I was not able to crash it with `zzuf`.
+* I could not find any past security hole.
+
+#### msgfmt
+
+`isvalidpo()` runs this external program.
+
+* I was not able to make it behave badly using zzuf: it exits cleanly
+ when too many errors are detected.
+* I could not find any past security hole.
+
+### po4a
+
+#### Security history
+
+The only past security issue I could find in po4a is
+[CVE-2007-4462](http://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2007-4462):
+`lib/Locale/Po4a/Po.pm` in po4a before 0.32 allowed local users to
+overwrite arbitrary files via a symlink attack on the
+gettextization.failed.po temporary file.
+
+This plugin would not have made this bug exploitable, as it does not
+use, either directly or indirectly, the faulty `gettextize` function.
+
+Note: the lack of found security issues can either indicate that there
+are none, or reveal that no-one ever bothered to find or publish them.
+
+#### General feeling
+
+Are there any security issues on running po4a on untrusted content?
+
+To say the least, this issue is not well covered, at least publicly:
+
+* the documentation does not talk about it;
+* grep'ing the source code for `security` or `trust` gives no answer.
+
+On the other hand, a po4a developer answered my questions in
+a convincing manner, stating that processing untrusted content was not
+an initial goal, and analysing in detail the possible issues.
+The following analysis was done with his help.
+
+#### Details
+
+* the core (`Po.pm`, `Transtractor.pm`) should be safe
+* po4a source code was fully checked for other potential symlink
+ attacks, after discovery of one such issue
+* the only external program run by the core is `diff`, in `Po.pm` (in
+ parts of its code we don't use)
+* `Locale::gettext` is only used to display translated error messages
+* Nicolas François "hopes" `DynaLoader` is safe, and has "no reason to
+ think that `Encode` is not safe"
+* Nicolas François has "no reason to think that `Encode::Guess` is not
+ safe". The po plugin nevertheless avoids using it by defining the
+ input charset (`file_in_charset`) before asking `TransTractor` to
+ read any file. NB: this hack depends on po4a internals.
+
+##### Locale::Po4a::Text
+
+* does not run any external program
+* only `do_paragraph()` builds regexp's that expand untrusted
+ variables; according to [[Joey]], this is "Freaky code, but seems ok
+  due to use of `quotemeta`".
+
+##### Locale::Po4a::Xhtml
+
+* does not run any external program
+* does not build regexp's from untrusted variables
+
+=> Seems safe as long as the `includessi` option is disabled; the po
+plugin explicitly disables it.
+
+Relies on `Locale::Po4a::Xml` to do most of the work.
+
+##### Locale::Po4a::Xml
+
+* does not run any external program
+* the `includeexternal` option makes it able to read external files;
+ the po plugin explicitly disables it
+* untrusted variables are escaped when used to build regexp's
+
+##### Text::WrapI18N
+
+`Text::WrapI18N` can cause DoS
+([Debian bug #470250](http://bugs.debian.org/470250)).
+It is optional, and we do not need the features it provides.
+
+If a recent enough po4a (>=0.35) is installed, this module's use is
+fully disabled. Else, the wiki administrator is warned about this
+at runtime.
+
+##### Term::ReadKey
+
+`Term::ReadKey` is not a hard dependency in our case, *i.e.* po4a
+works nicely without it. But the po4a Debian package recommends
+`libterm-readkey-perl`, so it will probably be installed on most
+systems using the po plugin.
+
+`Term::ReadKey` has too far-reaching implications for us to
+be able to guarantee anything wrt. security.
+
+If a recent enough po4a (>=2009-01-15 CVS, which will probably be
+released as 0.35) is installed, this module's use is fully disabled.
+
+##### Fuzzing input
+
+###### po4a-translate
+
+* po4a CVS 2009-01-16
+* Perl 5.10.0
+
+`po4a-translate` uses more or less the same po4a features as our
+`filter` function.
+
+Without specifying an input charset, same behaviour as
+`po4a-gettextize`, so let's specify UTF-8 as input charset as of now.
+
+`LICENSES` is a 21M file containing 100 concatenated copies of all the
+files in `/usr/share/common-licenses/`; I had no existing PO file or
+translated versions at hand, which renders these tests
+quite incomplete.
+
+ zzuf -cv -s 0:10 -r 0.001:0.3 \
+ po4a-translate -d -f text -o markdown -M utf-8 -L utf-8 \
+ -k 0 -m LICENSES -p LICENSES.fr.po -l test.fr
+
+... seems to lose the fight, at the `readpo(LICENSES.fr.po)` step,
+against some kind of infinite loop, deadlock, or any similar beast.
+
+The root of this bug lies in `Text::WrapI18N`, see the corresponding
+section.
+
+
+----
+
+## Fixed holes
+
+
+----
+
+# original contrib/po page, with old commentary
+
+I've been working on a plugin called "po", that adds support for multi-lingual wikis,
+translated with gettext, using [po4a](http://po4a.alioth.debian.org/).
+
+More information:
+
+* It can be found in my "po" branch:
+ `git clone git://gaffer.ptitcanardnoir.org/ikiwiki.git`
+* It is self-contained, *i.e.* it does not modify ikiwiki core at all.
+* It is documented (including TODO and plans for next work steps) in
+ `doc/plugins/po.mdwn`, which can be found in the same branch.
+* No public demo site is available so far, I'm working on this.
+
+My plan is to get this plugin clean enough to be included in ikiwiki.
+
+The current version is a proof-of-concept, mature enough for me to dare submitting it here,
+but I'm prepared to hear various helpful remarks, and to rewrite parts of it as needed.
+
+Any thoughts on this?
+
+> Well, I think it's pretty stunning what you've done here. Seems very
+> complete and well thought out. I have not read the code in great detail
+> yet.
+>
+> Just using po files is an approach I've never seen tried with a wiki. I
+> suspect it will work better for some wikis than others. For wikis that
+> just want translations that match the master language as closely as
+> possible and don't wander off and diverge, it seems perfect. (But what happens
+> if someone edits the Discussion page of a translated page?)
+>
+> Please keep me posted, when you get closer to having all issues solved
+> and ready for merging I can do a review and hopefully help with the
+> security items you listed. --[[Joey]]
+
+>> Thanks a lot for your quick review, it's reassuring to hear such nice words
+>> from you. I did not want to design and write a full translation system, when
+>> tools such as gettext/po4a already have all the needed functionality, for cases
+>> where the master/slave languages paradigm fits.
+>> Integrating these tools into ikiwiki plugin system was a pleasure.
+>>
+>> I'll tell you when I'm ready for merging, but in the meantime,
+>> I'd like you to review the changes I did to the core (3 added hooks).
+>> Can you please do this? If not, I'll go on and hope I'm not going too far in
+>> the wrong direction.
+>>
+>>> Sure.. I'm not completely happy with any of the hooks since they're very
+>>> special purpose, and also since `run_hooks` is not the best interface
+>>> for a hook that modifies a variable, where only the last hook run will
+>>> actually do anything. It might be better to just wrap
+>>> `targetpage`, `bestlink`, and `beautify_urlpath`. But, I noticed
+>>> the other day that such wrappers around exported functions are only visible to
+>>> plugins loaded after the plugin that defines them.
+>>>
+>>> Update: Take a look at the new "Function overriding" section of
+>>> [[plugins/write]]. I think you can just inject wrappers about a few ikiwiki
+>>> functions, rather than adding hooks. The `inject` function is pretty
+>>> insane^Wlow level, but seems to work great. --[[Joey]]
+>>>
+>>>> Thanks a lot, it seems to be a nice interface for what I was trying to achieve.
+>>>> I may be forced to wait two long weeks before I have a chance to confirm
+>>>> this. Stay tuned. --[[intrigeri]]
+>>>>
+>>>>> I've updated the plugin to use `inject`. It is now fully self-contained,
+>>>>> and does not modify the core anymore. --[[intrigeri]]
+>>
+>> The Discussion pages issue is something I am not sure about yet. But I will
+>> probably decide that "slave" pages, being only translations, don't deserve
+>> a discussion page: the discussion should happen in the language in which the
+>> pages are written for real, which is the "master" one. --[[intrigeri]]
+>>
+>> I think that's a good decision: you don't want to translate discussion,
+>> and if the discussion page turns out multilingual, well, c'est la vie. ;-)
+>>
+>> Relatedly, what happens if a translated page has a broken link, and you
+>> click on it to edit it? Seems you'd first have to create a master page
+>> and could only then translate it, right? I wonder if this will be clear
+>> though to the user.
+>>
+>>> Right: a broken link points to the URL that allows creating
+>>> a page that can either be a new master page or a non-translatable
+>>> page, depending on `po_translatable_pages` value. The best
+>>> solution I can think of is to use [[plugins/edittemplate]] to
+>>> insert something like "Warning: this is a master page, that must
+>>> be written in $MASTER_LANGUAGE" into newly created master pages,
+>>> and maybe another warning message on newly created
+>>> non-translatable pages. It seems quite doable to me, but in order
+>>> to avoid breaking existing functionality, it requires hacking
+>>> [[plugins/edittemplate]] a bit so that multiple templates can be
+>>> inserted at page creation time. [[--intrigeri]]
+>>>
+>>>> I implemented such a warning using the formbuilder_setup hook.
+>>>> --[[intrigeri]]
+>>
+>> And also, is there any way to start a translation of a page into a new
+>> language using the web interface?
+>>
+>>> When a new language is added to `po_slave_languages`, a rebuild is
+>>> triggered, and all missing PO files are created and checked into
+>>> VCS. An unprivileged wiki user cannot add a new language to
+>>> `po_slave_languages`, though. One could think of adding the needed
+>>> interface to translate a page into a yet-unsupported slave
+>>> language, and this would automagically add this new language to
+>>> `po_slave_languages`. It would probably be useful in some
+>>> usecases, but I'm not comfortable with letting unprivileged wiki
+>>> users change the wiki configuration as a side effect of their
+>>> actions; if this were to be implemented, special care would be
+>>> needed. [[--intrigeri]]
+>>>
+>>>> Actually I meant into any of the currently supported languages.
+>>>> I guess that if the template modification is made, it will list those
+>>>> languages on the page, and if a translation to a language is missing,
+>>>> the link will allow creating it?
+>>>>
+>>>>> A translation page always exists for every supported slave
+>>>>> language, even if no strings at all have been translated yet.
+>>>>> This implies the po plugin is especially friendly to people who
+>>>>> prefer reading in their native language if available, but don't
+>>>>> mind reading in English otherwise.
+>>>>>
+>>>>> While I'm at it, there is a remaining issue that needs to be
+>>>>> sorted out: how painful it could be for non-English speakers
+>>>>> (assuming the master language is English) to navigate between
+>>>>> translation pages supposedly written in their own language,
+>>>>> when the translation level is most often low.
+>>>>>
+>>>>> (It is currently easy to display this status on the translation
+>>>>> page itself, but then it's too late, and how frustrating to load
+>>>>> a page just to realize it's actually not translated enough for
+>>>>> you. The "other languages" loop also allows displaying this
+>>>>> information, but it is generally not the primary
+>>>>> navigation tool.)
+>>>>>
+>>>>> IMHO, this is actually a social problem (i.e. it's no use adding
+>>>>> a language to the supported slave ones if you don't have the
+>>>>> manpower to actually do the translations), that can't be fully
+>>>>> solved by technical solutions, but I can think of some hacks
+>>>>> that would limit the negative impact: a given translation's
+>>>>> status (currently = percent translated) could be displayed next
+>>>>> to the link that leads to it; a color code could as well be used
+>>>>> ("just" a matter of adding a CSS id or class to the links,
+>>>>> depending on this variable). As there is already work to be done
+>>>>> to have the links text generation more customizable through
+>>>>> plugins, I could do both at the same time if we consider this
+>>>>> matter to be important enough. --[[intrigeri]]
+>>>>>
+>>>>>> The translation status in links is now implemented in my
+>>>>>> `po` branch. It requires my `meta` branch changes to
+>>>>>> work, though. I consider the latter to be mature enough to
+>>>>>> be merged. --[[intrigeri]]
+
+>> FWIW, I'm tracking your po branch in ikiwiki master git in the po
+>> branch. One thing I'd like to try in there is setting up a translated
+>> basewiki, which seems like it should be pretty easy to do, and would be
+>> a great demo! --[[Joey]]
+>>
+>>> I have a complete translation of basewiki into danish, available merged into
+>>> ikiwiki at git://source.jones.dk/ikiwiki-upstream (branch underlay-da), and am working with
+>>> others on preparing one in german. For a complete translated user
+>>> experience, however, you will also need templates translated (there are a few
+>>> translatable strings there too). My most recent po4a Markdown improvements
+>>> adopted upstream but not yet in Debian (see
+>>> [bug#530574](http://bugs.debian.org/530574)) correctly handles multiple
+>>> files in a single PO which might be relevant for template translation handling.
+>>> --[[JonasSmedegaard]]
+>>
+>>> I've merged your changes into my own branch, and made great
+>>> progress on the various todo items. Please note my repository
+>>> location has changed a few days ago, my user page was updated
+>>> accordingly, but I forgot to update this page at the same time.
+>>> Hoping it's not too complicated to relocate an existing remote...
+>>> (never done that, I'm a Git beginner as well as a Perl
+>>> newbie) --[[intrigeri]]
+>>>>
+>>>> Just a matter of editing .git/config, thanks for the heads up.
+>>>>>
+>>>>> Joey, please have a look at my branch, your help would be really
+>>>>> welcome for the security research, as I'm almost done with what
+>>>>> I am able to do myself in this area. --[[intrigeri]]
+>>>>>>
+>>>>>> I came up with a patch for the WrapI18N issue --[[Joey]]
+
+I've set this plugin development aside for a while. I will be back and
+finish it at some point in the first quarter of 2009. --[[intrigeri]]
+
+> Abstract: Joey, please have a look at my po and meta branches.
+>
+> Detailed progress report:
+>
+> * it seems the po branch in your repository has not been tracking my
+> own po branch for two months. any config issue?
+> * all the plugin's todo items have been completed, robustness tests
+> done
+> * I've finished the detailed security audit, and the fix for po4a
+> bugs has entered upstream CVS last week
+> * I've merged your new `checkcontent` hook with the `cansave` hook
+> I previously introduced in my own branch; blogspam plugin updated
+> accordingly
+> * the rename hook changes we discussed elsewhere are also part of my
+> branch
+> * I've introduced two new hooks (`canremove` and `canrename`), not
+> a big deal; IMHO, they extend quite logically the plugin interface
+> * as highlighted on [[bugs/pagetitle_function_does_not_respect_meta_titles]],
+> my `meta` branch contains a new feature that is really useful in a
+> translatable wiki
+>
+> As a conclusion, I'm feeling that my branches are ready to be
+> merged; only thing missing, I guess, is a bit of discussion and
+> subsequent adjustments.
+>
+> --[[intrigeri]]
+
+> I've looked it over and updated my branch with some (untested)
+> changes.
+>
+>> I've merged your changes into my branch. Only one was buggy.
+>
+> Sorry, I'd forgotten about your cansave hook.. sorry for the duplicate
+> work there.
+>
+> Reviewing the changes, mostly outside of `po.pm`, I have
+> the following issues.
+>
+> * renamepage to renamelink change would break the ikiwiki
+> 3.x API, which I've promised not to do, so needs to be avoided
+> somehow. (Sorry, I guess I dropped the ball on not getting this
+> API change in before cutting 3.0..)
+>>
+>> Fixed, see [[todo/need_global_renamepage_hook]].
+>>
+> * I don't understand the parentlinks code change and need to figure it
+> out. Can you explain what is going on there?
+>>
+>> I'm calling `bestlink` there so that po's injected `bestlink` is
+>> run. This way, the parent links of a page link to the parent page
+>> version in the proper language, depending on the
+>> `po_link_to=current` and `po_link_to=negotiated` settings.
+>> Moreover, when using my meta branch enhancements plus meta title to
+>> make page titles translatable, this small patch is needed to get
+>> the translated titles into parentlinks.
+>>
+> * canrename's mix of positional and named parameters is way too
+> ugly to get into an ikiwiki API. Use named parameters
+> entirely. Also probably should just use named parameters
+> for canremove.
+> * `skeleton.pm.example`'s canrename needs fixing to use either
+> the current or my suggested parameters.
+>>
+>> Done.
+>>
+> * I don't like the exporting of `%backlinks` and `$backlinks_calculated`
+> (the latter is exported but not used).
+>>
+>> The commit message for 85f865b5d98e0122934d11e3f3eb6703e4f4c620
+>> contains the rationale for this change. I guess I don't understand
+>> the subtleties of `our` use, and perldoc does not help me a lot.
+>> IIRC, I actually did not use `our` to "export" these variables, but
+>> rather to have them shared between `Render.pm` uses.
+>>
+>>> My wording was unclear, I meant exposing. --[[Joey]]
+>>>
+>>>> I guess I still don't know Perl's `our` enough to understand clearly.
+>>>> No matter whether these variables are declared with `my` or `our`,
+>>>> any plugin can `use IkiWiki::Render` and then access
+>>>> `$IkiWiki::backlinks`, as already does e.g. the pagestat plugin.
+>>>> So I guess your problem is not with letting plugins use these
+>>>> variables, but with them being visible for every piece of
+>>>> (possibly external) code called from `Render.pm`. Am I right?
+>>>> If I understand clearly, using a brace block to lexically enclose
+>>>> these two `our` declarations, alongside with the `calculate_backlinks`
+>>>> and `backlinks` subs definitions, would be a proper solution, wouldn't
+>>>> it? --[[intrigeri]]
+>>>>
+>>>>> No, %backlinks and the backlinks() function are not the same thing.
+>>>>> The variable is lexically scoped; only accessible from inside
+>>>>> `Render.pm` --[[Joey]]
+>>>>
+> * What is this `IkiWiki::nicepagetitle` and why are you
+> injecting it into that namespace when only your module uses it?
+> Actually, I can't even find a caller of it in your module.
+>>
+>> I guess you should have a look at my `meta` branch and at
+>> [[bugs/pagetitle_function_does_not_respect_meta_titles]] in order
+>> to understand this :)
+>>
+>>> It would probably be good if I could merge this branch without
+>>> having to worry about also immediately merging that one. --[[Joey]]
+>>>
+>>>> I removed all dependencies on my `meta` branch from the `po` one.
+>>>> This implied removing the `po_translation_status_in_links` and
+>>>> `po_strictly_refresh_backlinks` features, and every link text is now
+>>>> displayed in the master language. I believe the removed features really
+>>>> enhance user experience of a translatable wiki, that's why I was
+>>>> initially supposing the `meta` branch would be merged first.
+>>>> IMHO, we'll need to come back to this quite soon after `po` is merged.
+>>>> --[[intrigeri]]
+>>>>
+>>>> Maybe you should keep those features in a meta-po branch?
+>>>> I did a cursory review of your meta last night, have some issues with it,
+>>>> but this page isn't the place for a detailed review. --[[Joey]]
+>>>>
+>>>>> Done. --[[intrigeri]]
+>>>
+> * I'm very fearful of the `add_depends` in `indexhtml`.
+> Does this make every page depend on every page that links
+> to it? Won't this absurdly bloat the dependency pagespecs
+> and slow everything down? And since nicepagetitle is given
+> as the reason for doing it, and nicepagetitle isn't used,
+> why do it?
+>>
+>> As explained in the 85f865b5d98e0122934d11e3f3eb6703e4f4c620 log:
+>> this feature hits performance a bit. Its cost was quite small in my
+>> real-world use-cases (a few percent bigger refresh time), but
+>> could be bigger in worst cases. When using the po plugin with my
+>> meta branch changes (i.e. the `nicepagetitle` thing), and having
+>> enabled the option to display translation status in links, this
+>> maintains the translation status up-to-date in backlinks. Same when
+>> using meta title to make the page titles translatable. It does
+>> help having a nice and consistent translated wiki, but as it can
+>> also involve problems, I just turned it into an option.
+>>
+>>> This has been completely removed for now due to the removal of
+>>> the dependency on my `meta` branch. --[[intrigeri]]
+>>
+> * The po4a Suggests should be versioned to the first version
+> that can be used safely, and that version documented in
+> `plugins/po.mdwn`.
+>>
+>> Done.
+>>
+>> --[[intrigeri]]
+>
+> --[[Joey]]
+
+I reverted the `%backlinks` and `$backlinks_calculated` exposing.
+The issue they were solving will probably arise again when I work
+on my meta branch again (i.e. when the simplified po one is merged),
+but the po thing is supposed to work without these ugly `our`.
+Seems like it was the last unaddressed item from Joey's review, so I'm
+daring a timid "please pull"... or rather, please review again :)
+--[[intrigeri]]
+
+> Ok, I've reviewed and merged into my own po branch. It's looking very
+> mergeable.
+>
+> * Is it worth trying to fix compatibility with `indexpages`?
+>>
+>> Supporting `usedirs` being enabled or disabled was already quite
+>> hard IIRC, so supporting all four combinations of `usedirs` and
+>> `indexpages` settings will probably be painful. I propose we forget
+>> about it until someone reports he/she badly needs it, and then
+>> we'll see what can be done.
+>>
+> * Would it make sense to go ahead and modify `page.tmpl` to use
+> OTHERLANGUAGES and PERCENTTRANSLATED, instead of documenting how to modify it?
+>>
+>> Done in my branch.
+>>
+> * Would it be better to disable po support for pages that use unsupported
+> or poorly-supported markup languages?
+>
+>> I prefer keeping it enabled, as:
+>>
+>> * most wiki markups "almost work"
+>> * when someone needs one of these to be fully supported, it's not
+>> that hard to add dedicated support for it to po4a; if it were
+>> disabled, I fear the ones who could do this would maybe think
+>> it's simply impossible and give up.
+>>
+
+> * What's the reasoning behind checking that the link plugin
+> is enabled? AFAICS, the same code in the scan hook should
+> also work when other link plugins like camelcase are used.
+>>
+>> That's right, fixed.
+>>
+> * In `pagetemplate` there is a comment that claims the code
+> relies on `genpage`, but I don't see how it does; it seems
+> to always add a discussion link?
+>>
+>> It relies on IkiWiki::Render's `genpage` as this function sets the
+>> `discussionlink` template param iff it considers a discussion link
+>> should appear on the current page. That's why I'm testing
+>> `$template->param('discussionlink')`.
+>>
+>>> Maybe I was really wondering why it says it could lead to a broken
+>>> link if the cgiurl is disabled. I think I see why now: Discussionlink
+>>> will be set to a link to an existing discussion page, even if cgi is
+>>> disabled -- but there's no guarantee of a translated discussion page
+>>> existing in that case. *However*, htmllink actually checks
+>>> for this case, and will avoid generating a broken link so AFAICS, the
+>>> comment is actually inaccurate.. what will really happen in this case
+>>> is discussionlink will be set to a non-link translation of
+>>> "discussion". Also, I consider `$config{cgi}` and `%links` (etc)
+>>> documented parts of the plugin interface, which won't change; po could
+>>> rely on them to avoid this minor problem. --[[Joey]]
+>>>>
+>>>> Done in my branch. --[[intrigeri]]
+>>>>
+>
+> * Is there any real reason not to allow removing a translation?
+> I'm imagining a spammy translation, which an admin might not
+> be able to fix, but could remove.
+>>
+>> On the other hand, allowing one to "remove" a translation would
+>> probably lead to misunderstandings, as such a "removed" translation
+>> page would reappear as soon as it is "removed" (with no strings
+>> translated, though). I think an admin would be in a position to
+>> delete the spammy `.po` file by hand using whatever VCS is in use.
+>> Not that I'd really care, but I am slightly in favour of the way
+>> it currently works.
+>>
+>>> That would definitely be confusing. It sounds to me like if we end up
+>>> needing to allow web-based deletion of spammy translations, it will
+>>> need improvements to the deletion UI to de-confuse that. It's fine to
+>>> put that off until needed --[[Joey]]
+>>
+
+ * As discussed at [[todo/l10n]] the templates need to be translatable too. They
+ should be treated properly by po4a using the markdown option - at least with my
+ later patches in [bug#530574](http://bugs.debian.org/530574) applied.
+
+ * It seems to me that the po plugin (and possibly other parts of ikiwiki) wrongly
+ uses gettext. As I understand it, gettext (as currently used in ikiwiki) always
+ looks up a single language. That might make sense for a single-language site, but
+ multilingual sites should emit all strings targeted at the web output each in its
+ own language.
+
+ So generally the system language (used for e.g. compile warnings) should be separated
+ from both master language and slave languages.
+
+ Preferably, the gettext subroutine could be extended to accept a locale as an
+ optional secondary parameter overriding the default locale (for messages like
+ "N/A" as a percentage in the po plugin). Alternatively (with the above-mentioned
+ template support) all such strings could be externalized as templates that can
+ then be localized.
+
+# Robustness tests
+
+### Enabling/disabling the plugin
+
+* enabling the plugin with `po_translatable_pages` set to blacklist: **OK**
+* enabling the plugin with `po_translatable_pages` set to whitelist: **OK**
+* enabling the plugin without `po_translatable_pages` set: **OK**
+* disabling the plugin: **OK**
+
+### Changing the plugin config
+
+* adding existing pages to `po_translatable_pages`: **OK**
+* removing existing pages from `po_translatable_pages`: **OK**
+* adding a language to `po_slave_languages`: **OK**
+* removing a language from `po_slave_languages`: **OK**
+* changing `po_master_language`: **OK**
+* replacing `po_master_language` with a language previously part of
+ `po_slave_languages`: needs two rebuilds, but **OK** (this is quite
+ a perverse test actually)
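+
+For reference, a setup-file fragment exercising these options might look
+like this (example values; the actual page spec and language lists depend
+on the wiki):
+
+    po_master_language => { 'code' => 'en', 'name' => 'English' },
+    po_slave_languages => { 'fr' => 'Français', 'da' => 'Dansk' },
+    po_translatable_pages => '* and !*/Discussion',
+    po_link_to => 'current',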
+
+### Creating/deleting/renaming pages
+
+All cases of master/slave page creation/deletion/rename, both via RCS
+and via CGI, have been tested.
+
+### Misc
+
+* general test with `usedirs` disabled: **OK**
+* general test with `indexpages` enabled: **not OK**
+* general test with `po_link_to=default` with `usedirs` enabled: **OK**
+* general test with `po_link_to=default` with `usedirs` disabled: **OK**
+
+Duplicate %links ?
+------------------
+
+I notice code in the scan hook that seems to assume
+that %links will accumulate duplicate links for a page.
+That used to be so, but the bug was fixed. Does this mean
+that po might be replacing the only link on a page, in error?
+--[[Joey]]
+
+> It would replace it. The only problematic case is when another
+> plugin has its own reasons, in its `scan` hook, to add a page
+> that is already there to `$links{$page}`. This other plugin's
+> effect might then be changed by po's `scan` hook... which could
+> be either good (better overall l10n) or bad (break the other
+> plugin's goal). --[[intrigeri]]
+
+>> Right.. well, the cases where links are added is very small.
+>> Grepping for `add_link`, it's just done by link, camelcase, meta, and
+>> tag. All of these are supposed to work just like regular links
+>> so I'd think that is ok. We could probably remove the currently scary
+>> comment about only wanting to change the first link. --[[Joey]]
+
+>>> Commit 3c2bffe21b91684 in my po branch does this. --[[intrigeri]]
+>>>> Cherry-picked --[[Joey]]
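+
+For reference, a `scan` hook of the kind discussed above boils down to a
+sketch like this (the plugin and link target are hypothetical):
+
+    sub scan (@) {
+        my %params=@_;
+        # add_link no longer accumulates duplicates in %links, so
+        # replacing "the first link" replaces the only occurrence.
+        add_link($params{page}, "index");
+    }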
diff --git a/doc/plugins/poll.mdwn b/doc/plugins/poll.mdwn
new file mode 100644
index 000000000..099cb399c
--- /dev/null
+++ b/doc/plugins/poll.mdwn
@@ -0,0 +1,5 @@
+[[!template id=plugin name=poll author="[[Joey]]"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/poll]] [[ikiwiki/directive]],
+which allows inserting an online poll into a page.
diff --git a/doc/plugins/poll/discussion.mdwn b/doc/plugins/poll/discussion.mdwn
new file mode 100644
index 000000000..eed3f6ef9
--- /dev/null
+++ b/doc/plugins/poll/discussion.mdwn
@@ -0,0 +1 @@
+Has anyone given any thought to approval voting (ie. marking more than one option), ranking or more complex decision-making protocols here? --[[anarcat]]
diff --git a/doc/plugins/polygen.mdwn b/doc/plugins/polygen.mdwn
new file mode 100644
index 000000000..f9cea1f4d
--- /dev/null
+++ b/doc/plugins/polygen.mdwn
@@ -0,0 +1,25 @@
+[[!template id=plugin name=polygen author="Enrico Zini"]]
+[[!tag type/fun]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/polygen]] [[ikiwiki/directive]],
+which allows inserting text generated by polygen into a wiki page.
+
+[[!if test="enabled(polygen)" then="""
+----
+
+Here are a few notes about ikiwiki, courtesy of polygen:
+
+Ikiwiki is internally based on a [[!polygen grammar="designpatterns"]]
+coupled to a [[!polygen grammar="designpatterns"]], as described in
+"[[!polygen grammar="paper"]]" by [[!polygen grammar="nipponame"]] of
+[[!polygen grammar="boyband"]].
+
+Ikiwiki reviews:
+<ul>
+<li>[[!polygen grammar="reviews"]]</li>
+<li>[[!polygen grammar="reviews"]]</li>
+<li>[[!polygen grammar="reviews"]]</li>
+</ul>
+
+"""]]
diff --git a/doc/plugins/postsparkline.mdwn b/doc/plugins/postsparkline.mdwn
new file mode 100644
index 000000000..b0733e343
--- /dev/null
+++ b/doc/plugins/postsparkline.mdwn
@@ -0,0 +1,14 @@
+[[!template id=plugin name=postsparkline author="[[Joey]]"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/postsparkline]] [[ikiwiki/directive]].
+It uses the [[sparkline]] plugin to create a sparkline of
+statistics about a set of pages, such as posts to a blog.
+
+# adding formulae
+
+Additional formulae can be added without modifying this plugin by writing
+plugins that register functions in the
+`IkiWiki::Plugin::postsparkline::formula` namespace. These functions will
+receive on input a reference to a hash of parameters, and a sorted list of
+pages, and should return a list of data points for the sparkline plugin.
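+
+A minimal formula might be sketched as follows (the name `evens` and the
+exact parameters passed are illustrative only):
+
+    sub IkiWiki::Plugin::postsparkline::formula::evens {
+        my $params=shift;
+        my @pages=@_;
+        # One data point per page: 1 for every second page, 0 otherwise.
+        my $i=0;
+        return map { ++$i % 2 } @pages;
+    }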
diff --git a/doc/plugins/prettydate.mdwn b/doc/plugins/prettydate.mdwn
new file mode 100644
index 000000000..149b7c29c
--- /dev/null
+++ b/doc/plugins/prettydate.mdwn
@@ -0,0 +1,20 @@
+[[!template id=plugin name=prettydate author="[[Joey]]"]]
+[[!tag type/date]]
+[[!tag type/chrome]]
+
+Enabling this plugin changes the dates displayed on pages in the wiki to
+a format that is nice and easy to read. Examples: "late Wednesday evening,
+February 14th, 2007", "at midnight, March 15th, 2007"
+
+The names given to each of the hours in the day can be customised by
+setting the `timetable` configuration variable in ikiwiki's setup file.
+The default value of this configuration value can be seen near the top of
+`prettydate.pm`. Note that an hour can be left blank, to make it display the
+same as the hour before. Midnight, noon, and teatime are all hardcoded,
+since they do not occupy the whole hour.
+
+The format used for the date can be customised using the `prettydateformat`
+configuration variable in the setup file. `%X` will be expanded to the
+prettified time value. The default prettydateformat is `"%X, %B %o, %Y"`.
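+
+For example, to put the word "posted" in front of the prettified time, the
+setup file could contain:
+
+    prettydateformat => 'posted %X, %B %o, %Y',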
+
+This plugin uses the [[!cpan TimeDate]] perl module.
diff --git a/doc/plugins/progress.mdwn b/doc/plugins/progress.mdwn
new file mode 100644
index 000000000..20736d18c
--- /dev/null
+++ b/doc/plugins/progress.mdwn
@@ -0,0 +1,5 @@
+[[!template id=plugin name=progress author="[[Will]]"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/progress]]
+[[ikiwiki/directive]], which generates a progress bar.
diff --git a/doc/plugins/rawhtml.mdwn b/doc/plugins/rawhtml.mdwn
new file mode 100644
index 000000000..3b2d3d06c
--- /dev/null
+++ b/doc/plugins/rawhtml.mdwn
@@ -0,0 +1,13 @@
+[[!template id=plugin name=rawhtml author="[[Joey]]"]]
+[[!tag type/html type/format]]
+
+This plugin changes how ikiwiki handles html files, making it treat html
+or xhtml files not as source files but as data files that are copied
+unchanged when the wiki is built. Compared with the [[html]] plugin, which
+treats html files as source files for the wiki, this plugin allows complete
+control over html pages. Note that this means that the html will not be
+sanitised of problematic content such as javascript, so it can be insecure
+to enable this plugin if untrusted users have write access to your wiki's
+repository.
+
+This plugin is included in ikiwiki, but is not enabled by default.
diff --git a/doc/plugins/rawhtml/discussion.mdwn b/doc/plugins/rawhtml/discussion.mdwn
new file mode 100644
index 000000000..9ed8230ba
--- /dev/null
+++ b/doc/plugins/rawhtml/discussion.mdwn
@@ -0,0 +1,7 @@
+Is there any way to allow this only on locked pages? I'd like to be able to do raw HTML on certain pages (eg. on the sidebar page) but don't want to allow it on the entire wiki. Thanks -- Adam.
+
+> Not at the moment. Long-term, ikiwiki needs some general permission mechanisms that encompass this sort of issue. --[[JoshTriplett]]
+
+>> Thanks. Bummer though, looking forward to when this is possible. :-) -- Adam.
+
+> Well, this plugin is different from the [[html]] plugin. It **copies** html files. So users cannot do raw HTML via cgi. Thus it is safe in most cases. -- weakish
diff --git a/doc/plugins/recentchanges.mdwn b/doc/plugins/recentchanges.mdwn
new file mode 100644
index 000000000..6fff18e8a
--- /dev/null
+++ b/doc/plugins/recentchanges.mdwn
@@ -0,0 +1,32 @@
+[[!template id=plugin name=recentchanges core=1 author="[[Joey]]"]]
+[[!tag type/meta]]
+
+This plugin examines the [[revision_control_system|rcs]] history and
+generates a page describing each recent change made to the wiki. These
+pages can be joined together with [[inline]] to generate the
+[[RecentChanges]] page.
+
+This plugin also currently handles web-based reversion of changes.
+
+Typically only the RecentChanges page will use the pages generated by this
+plugin, but you can use it elsewhere too if you like. It's used like this:
+
+ \[[!inline pages="internal(recentchanges/change_*)"
+ template=recentchanges show=0]]
+
+Here's an example of how to show only changes to "bugs/*".
+This matches against the title of the change, which includes a list of
+modified pages.
+
+ \[[!inline pages="internal(recentchanges/change_*) and title(*bugs/*)"
+ template=recentchanges show=0]]
+
+Here's an example of how to show only changes that Joey didn't make.
+(Joey commits sometimes as user `joey`, and sometimes via openid.)
+
+ \[[!inline pages="internal(recentchanges/change_*) and
+ !author(joey) and !author(http://joey.kitenet.net*)"
+ template=recentchanges show=0]]
+
+If you want to generate feeds for the RecentChanges page, you have to use
+[[`rss`_or_`atom`_in_the_setup_file|/todo/minor adjustment to setup documentation for recentchanges feeds]].
diff --git a/doc/plugins/recentchanges/discussion.mdwn b/doc/plugins/recentchanges/discussion.mdwn
new file mode 100644
index 000000000..a16cb5217
--- /dev/null
+++ b/doc/plugins/recentchanges/discussion.mdwn
@@ -0,0 +1,17 @@
+Thanks for that one, again, it's great!
+
+One minor thing I noticed, seen on <http://www.bddebian.com/~wiki/recent_changes/>:
+The links to user pages of e.g. *MichaelBanck* or *GianlucaGuida* don't work, as they're
+being linked to <http://www.bddebian.com/~wiki/user/MichaelBanck>, whereas it should be
+<http://www.bddebian.com/~wiki/user/michaelbanck>.
+
+> I've fixed this.. --[[Joey]]
+
+Another one. If you change the *recentchangespage* configuration option, it seems to me
+that the pages from the old hierarchy will not be removed from the disk. But then, changing
+this should be a rather uncommon thing.
+
+--[[tschwinge]]
+
+> And fixed this, by making it look at all *._change pages, not just
+> those in a specific directory, when deciding which to expire. --[[Joey]]
diff --git a/doc/plugins/recentchangesdiff.mdwn b/doc/plugins/recentchangesdiff.mdwn
new file mode 100644
index 000000000..660a430b9
--- /dev/null
+++ b/doc/plugins/recentchangesdiff.mdwn
@@ -0,0 +1,9 @@
+[[!template id=plugin name=recentchangesdiff core=0 author="[[Joey]]"]]
+[[!tag type/meta]]
+
+This plugin extends the [[recentchanges]] plugin, adding a diff for each
+change. The diffs can be toggled on the recentchanges page (requires
+javascript), and are also included in its feeds.
+
+The [[rcs]] must have implemented support for the `rcs_diff()` function for
+any diffs to be generated.
diff --git a/doc/plugins/relativedate.mdwn b/doc/plugins/relativedate.mdwn
new file mode 100644
index 000000000..d6e8eb08b
--- /dev/null
+++ b/doc/plugins/relativedate.mdwn
@@ -0,0 +1,11 @@
+[[!template id=plugin name=relativedate author="[[Joey]]"]]
+[[!tag type/date]]
+[[!tag type/chrome]]
+
+This plugin lets dates be displayed in relative form. Examples: "2 days ago",
+"1 month and 3 days ago", "30 minutes ago". Hovering over the date will
+cause a tooltip to pop up with the absolute date.
+
+This only works in browsers with javascript enabled; other browsers will
+show the absolute date instead. Also, this plugin can be used with other
+plugins like [[prettydate]] that change how the absolute date is displayed.
diff --git a/doc/plugins/remove.mdwn b/doc/plugins/remove.mdwn
new file mode 100644
index 000000000..47993f44b
--- /dev/null
+++ b/doc/plugins/remove.mdwn
@@ -0,0 +1,7 @@
+[[!template id=plugin name=remove core=0 author="[[Joey]]"]]
+[[!tag type/web]]
+
+This plugin allows pages or other files to be removed using the web
+interface.
+
+Users can only remove things that they are allowed to edit or upload.
diff --git a/doc/plugins/rename.mdwn b/doc/plugins/rename.mdwn
new file mode 100644
index 000000000..abb361329
--- /dev/null
+++ b/doc/plugins/rename.mdwn
@@ -0,0 +1,12 @@
+[[!template id=plugin name=rename core=0 author="[[Joey]]"]]
+[[!tag type/web]]
+
+This plugin allows pages or other files to be renamed using the web
+interface. Following Unix tradition, renaming also allows moving to a
+different directory.
+
+Users can only rename things that they are allowed to edit or upload.
+
+It attempts to fix up links to renamed pages, and if some links cannot be
+fixed up, the pages that contain the broken links will be displayed after
+the rename.
diff --git a/doc/plugins/repolist.mdwn b/doc/plugins/repolist.mdwn
new file mode 100644
index 000000000..efd9c9352
--- /dev/null
+++ b/doc/plugins/repolist.mdwn
@@ -0,0 +1,17 @@
+[[!template id=plugin name=repolist author="[[Joey]]"]]
+[[!tag type/web]]
+
+This plugin allows you to configure ikiwiki with the location of
+[[rcs]] repositories for your wiki's source. This is done via the
+"repositories" setting in the setup file. Once you tell it where the source
+to your wiki can be downloaded from, this information can be published on
+your wiki in various ways.
+
+This plugin supports the [rel-vcs-*](http://kitenet.net/~joey/rfc/rel-vcs/)
+microformat, and uses it to embed the repository location information in
+every wiki page.
+
+By using this plugin, you will make [[Joey]] very happy, as he will be able
+to easily check out the source of your wiki, for purposes of debugging and
+general curiosity. More generally, making it easy for others to find the
+repository for your wiki is just a Plain Good Idea(TM).
diff --git a/doc/plugins/rst.mdwn b/doc/plugins/rst.mdwn
new file mode 100644
index 000000000..5e97e2d80
--- /dev/null
+++ b/doc/plugins/rst.mdwn
@@ -0,0 +1,18 @@
+[[!template id=plugin name=rst author="martin f. krafft"]]
+[[!tag type/format]]
+
+This plugin lets ikiwiki convert files with names ending in ".rst" to html.
+It uses the [reStructuredText](http://docutils.sourceforge.net/rst.html)
+markup syntax. You need to have [[!cpan RPC::XML]], python and the
+python-docutils module installed to use it.
+
+Note that this plugin does not interoperate very well with the rest of
+ikiwiki. Limitations include:
+
+* There are issues with inserting raw html into documents, as ikiwiki
+ does with [[WikiLinks|ikiwiki/WikiLink]] and many
+ [[directives|ikiwiki/directive]].
+
+So while you may find this useful for importing old files into your wiki,
+using this as your main markup language in ikiwiki isn't recommended at
+this time.
diff --git a/doc/plugins/rst/discussion.mdwn b/doc/plugins/rst/discussion.mdwn
new file mode 100644
index 000000000..c84a6218e
--- /dev/null
+++ b/doc/plugins/rst/discussion.mdwn
@@ -0,0 +1,81 @@
+The main problem with more sophisticated RST support is that ikiwiki turns
+preprocessor directives into raw HTML and reST hates inline HTML.
+inline relies on Markdown's handling of raw HTML, specifically
+that it doesn't wrap paragraph-level `<div>`s in `<p>` tags -- see
+[[todo/htmlvalidation]]. Other plugins might expect their output to be
+interpreted in certain ways too -- [[Joey]] mentions toggleable and fortune.
+
+Is [prest][1] the perl version of the reST processor referred to in the text?
+It seems to be reasonably well-maintained to me, and differences between it and
+"standard" reST are pretty minor. A fairly exhaustive list, taken from the
+prest docs, follows:
+
+[1]: http://search.cpan.org/~nodine/Text-Restructured-0.003024/
+
+* fewer alternatives for bullet lists (only "+", "*" and "-")
+* escaped colons are not allowed in field names
+* RCS keyword processing is only activated on "recognized bibliographic
+ field names"
+* multiple consecutive blockquotes separated by attributions may not be allowed
+ (not sure; text could be interpreted either way)
+* a warning about auto-symbol footnotes is missing (maybe it's not relevant?)
+* colons are allowed within hyperlink reference names
+* inline markup can be nested
+* some directives are missing (epigraph, highlights, pull quote, date) and
+ some have been added (MathML, code execution (disabled by default), enscript)
+* container directive now uses "class" instead of "classes"
+* csv directive doesn't require csv.py
+* references directive doesn't allow options
+
+There may be a few others; my eyes glazed over. --Ethan
+
+rst support for ikiwiki seems to be on hold. rst is much more elegant
+than markdown in my opinion, so I tried it out in ikiwiki. I found out
+in other places that some directives work just fine, like [[meta]] and
+[[tag]], others work fine if you wrap them in `.. raw::`, like [[inline]].
+
+But to make a wiki we need [[WikiLinks]]; they can't be escape-inserted or
+such since they are inline elements in the text.. But images work fine in
+rst's syntax.. what about using rst syntax for wikilinks as well?
+Is it possible to inject something into the parser to turn unmatched links
+like `` `WikiLink`_ `` into ikiwiki links? --ulrik
+
+------
+
+Resolving WikiLinks in rst
+==========================
+
+I wanted to look into if we can hook into rst and influence how links are resolved.
+It turns out it is possible, and I have a working WIP for the rst plugin that does this.
+
+My work in progress for `/usr/lib/ikiwiki/plugins/rst` is here:
+[[todo/Resolve native reStructuredText links to ikiwiki pages]]
+
+It basically matches normal rst links just like ikiwiki would match a wikilink
+if it existed.
+I can't read perl, so I haven't been able to find out much. The plugin successfully registers backlinks using
+`proxy.rpc('add_link', on_page, bestlink)` (since the destination page will be rebuilt to update),
+but the backlinks don't show up.
+
+I converted one of my pages to rst:
+
+Before: <http://kaizer.se/wiki/kupfer-mdwn>
+After: <http://kaizer.se/wiki/kupfer-rst>
+
+I need help on a couple of points
+
+* How to fix the backlinks with `add_link`?
+* How to generate NonExistingLinks using the plugin API?
+* Can we include this in ikiwiki's rst if it is not too hairy?
+
+--ulrik
+
+
+----
+
+> The main problem with more sophisticated RST support is that ikiwiki turns
+preprocessor directives into raw HTML and reST hates inline HTML.
+
+Is it possible for ikiwiki to store the preprocessor directives in memory and replace them with placeholders, then do the rst processing? Afterwards, it could process the preprocessor directives and substitute the results back in for the placeholders. --[[weakish]]
diff --git a/doc/plugins/rsync.mdwn b/doc/plugins/rsync.mdwn
new file mode 100644
index 000000000..e48886168
--- /dev/null
+++ b/doc/plugins/rsync.mdwn
@@ -0,0 +1,19 @@
+[[!template id=plugin name=rsync author="[[schmonz]]"]]
+[[!tag type/special-purpose]]
+
+This plugin allows ikiwiki to push generated pages to another host
+by running a command such as `rsync`.
+
+The command to run is specified by setting `rsync_command` in your setup
+file. The command will be run in your destdir, so something like this
+is a typical command:
+
+ rsync_command => 'rsync -qa --delete . user@host:/path/to/docroot/',
+
+If using rsync over ssh, you will need to enable noninteractive ssh login
+to the remote host. It's also a good idea to specify the exact command line
+to be permitted in the remote host's `$HOME/.ssh/authorized_keys`.
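+
+As an illustrative sketch (the key, paths, and exact `rsync --server`
+options are hypothetical; the server-side options depend on the client's
+flags), such an `authorized_keys` entry might look like:
+
+    command="rsync --server -logDtpr --delete . /path/to/docroot/",no-pty,no-port-forwarding,no-X11-forwarding ssh-rsa AAAA... user@wikihost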
+
+A typical ikiwiki configuration when using this plugin is to disable cgi
+support, so ikiwiki builds a completely static site that can be served from
+the remote host.
diff --git a/doc/plugins/rsync/discussion.mdwn b/doc/plugins/rsync/discussion.mdwn
new file mode 100644
index 000000000..ef0fa9967
--- /dev/null
+++ b/doc/plugins/rsync/discussion.mdwn
@@ -0,0 +1,79 @@
+## A use case
+
+Why I needed this plugin: I have two web servers available to me
+for a project. Neither does everything I need, but together they
+do. (This is a bit like the [Amazon S3
+scenario](http://kitenet.net/~joey/blog/entry/running_a_wiki_on_Amazon_S3/).)
+
+Server (1) is a university web server. It provides plentiful space
+and bandwidth, easy authentication for people editing the wiki, and
+a well-known stable URL. The wiki really wants to live here and
+very easily could except that the server doesn't allow arbitrary
+CGIs.
+
+Server (2) is provided by a generous alumnus's paid [[tips/DreamHost]]
+account. Disk and particularly network usage need to be minimized
+because over some threshold it costs him. CGI, etc. are available.
+
+My plan was to host the wiki on server (1) by taking advantage of
+server (2) to store the repository, source checkout, and generated
+pages, to host the repository browser, and to handle ikiwiki's CGI
+operations. In order for this to work, web edits on (2) would need
+to automatically push any changed pages to (1).
+
+As a proof of concept, I added an rsync post-commit hook after
+ikiwiki's usual. It worked, just not for web edits, which is how
+the wiki will be used. So I wrote this plugin to finish the job.
+The wiki now lives on (1), and clicking "edit" just works. --[[schmonz]]
+
+> Just out of interest, why use `rsync` and not `git push`? I.e., a
+> different setup to solve the same problem would be to run a
+> normal ikiwiki setup on the university's server with its git
+> repository available over ssh (the same security setup you're using
+> for rsync should work for git over ssh). On the cgi-capable server,
+> when it would rsync, make it git push. It would seem that git
+> has enough information that it should be able to be more
+> network efficient. It also means that corruption at one end
+> wouldn't be propagated to the other end. -- [[Will]]
+
+>> Hey, that's a nice solution. (The site was in svn to begin with,
+>> but it's in git now.) One advantage of my approach in this particular
+>> case: server (1) doesn't have `git` installed, but does have `rsync`,
+>> so (1)'s environment can remain completely untweaked other than the
+>> SSH arrangement. I kind of like that all the sysadmin effort is
+>> contained on one host.
+>>
+>> This plugin is definitely still useful for projects not able to use
+>> a DVCS (of which I've got at least one other), and possibly for
+>> other uses not yet imagined. ;-) --[[schmonz]]
+
+>>> I'm now using this plugin for an additional purpose. One of the aforementioned wikis (there are actually two) can only be read by trusted users, the list of which is kept in an `.htaccess` file. I added it to git as `htaccess.txt`, enabled the [[plugins/txt]] plugin, and in my `rsync_command` script, have it copied to the destdir as `.htaccess` before calling `rsync`. Now my users (who aren't tech-savvy, but are trustworthy) can edit the access list directly in the wiki. This idea might also be useful for wikis not using `rsync` at all. --[[schmonz]]
+
+----
+
+Review: --[[Joey]]
+
+* I think it should not throw an error if no command is set. Just don't do anything.
+* If the rsync fails, it currently errors out, which will probably also leave
+ the wiki in a broken state, since ikiwiki will not get a chance to save
+ its state. This seems fragile; what if the laptop is offline, or the
+ server is down, etc. Maybe it should just warn if the rsync fails?
+* Is a new hook really needed? The savestate hook runs at a similar time;
+ only issue with it is that it is run even when ikiwiki has not
+ rendered any updated pages. Bah, I think you do need the new hook, how
+ annoying..
+
+> * Depends whether the plugin would be on by default. If yes, then yes.
+> If the admin has to enable it, I'd think they'd want the error.
+> * Changed the other errors to warnings.
+> * The name might be wrong: there isn't anything rsync-specific about the
+> plugin, that's just the command I personally need to run. --[[schmonz]]
+
+>> One problem with the error is that it prevents dumping a new setup file with
+>> the plugin enabled, and then editing it to configure. ie:
+
+ joey@gnu:~>ikiwiki -setup .ikiwiki/joeywiki.setup -plugin rsync -dumpsetup new.setup
+ Must specify rsync_command
+
+> rsync seems by far the most likely command, though someone might use something
+> to push via ftp instead. I think calling it rsync is ok. --[[Joey]]
diff --git a/doc/plugins/search.mdwn b/doc/plugins/search.mdwn
new file mode 100644
index 000000000..f116649c1
--- /dev/null
+++ b/doc/plugins/search.mdwn
@@ -0,0 +1,18 @@
+[[!template id=plugin name=search author="[[Joey]]"]]
+[[!tag type/web]]
+
+This plugin adds full text search to ikiwiki, using the
+[xapian](http://xapian.org/) engine, its
+[omega](http://xapian.org/docs/omega/overview.html) frontend, and the
+[[!cpan Search::Xapian]], [[!cpan Digest::SHA]], and [[!cpan HTML::Scrubber]]
+perl modules (on debian, check that you have packages `libsearch-xapian-perl`, `libdigest-sha-perl` and `libhtml-scrubber-perl` installed).
+
+The [[ikiwiki/searching]] page describes how to write search queries.
+
+Ikiwiki will handle indexing new and changed page contents. Note that since
+it only indexes page contents, files copied by the [[rawhtml]] plugin will
+not be indexed, nor will other types of data files.
+
+There is one setting you may need to use in the config file. `omega_cgi`
+should point to the location of the omega cgi program. The default location
+is `/usr/lib/cgi-bin/omega/omega`.
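+
+For instance, a setup file fragment enabling the plugin might look like
+this (the path shown is just the default mentioned above):
+
+    add_plugins => [qw{search}],
+    omega_cgi => "/usr/lib/cgi-bin/omega/omega",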
diff --git a/doc/plugins/search/discussion.mdwn b/doc/plugins/search/discussion.mdwn
new file mode 100644
index 000000000..8b1378917
--- /dev/null
+++ b/doc/plugins/search/discussion.mdwn
@@ -0,0 +1 @@
+
diff --git a/doc/plugins/shortcut.mdwn b/doc/plugins/shortcut.mdwn
new file mode 100644
index 000000000..1e8e85ed8
--- /dev/null
+++ b/doc/plugins/shortcut.mdwn
@@ -0,0 +1,9 @@
+[[!template id=plugin name=shortcut author="[[Joey]]"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/shortcut]] [[ikiwiki/directive]].
+It allows external links to commonly linked to sites to be made
+more easily using shortcuts.
+
+The available shortcuts are defined on the [[shortcuts]] page in
+the wiki.
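+
+As a sketch (assuming a shortcut named `wikipedia` is defined), an entry
+on the [[shortcuts]] page might look like:
+
+    \[[!shortcut name=wikipedia url="https://en.wikipedia.org/wiki/%s"]]
+
+and then any page can use it like this:
+
+    \[[!wikipedia Unix]]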
diff --git a/doc/plugins/shortcut/discussion.mdwn b/doc/plugins/shortcut/discussion.mdwn
new file mode 100644
index 000000000..2e2b1b281
--- /dev/null
+++ b/doc/plugins/shortcut/discussion.mdwn
@@ -0,0 +1,18 @@
+The plugin depends on [[mdwn]]. If you have
+disabled [[mdwn]], then to get [[shortcut]] to work, you need to
+commit a shortcuts.ext (where ext is `rcs|creole|html|txt|etc`)
+and edit/patch [[shortcut]].
+
+Maybe using `default_pageext` would be better than hardcoding .mdwn?
+
+--[[weakish]]
+
+> done, it will use `default_pageext` now --[[Joey]]
+
+---
+
+Instead of modifying the [[basewiki]]'s [[shortcuts]] file for local needs --
+thus copying it at some point and losing continuity with upstream enhancements --
+what about handling a `shortcuts-local.mdwn` or `shortcuts/local.mdwn` (if such
+a file exists in the wiki), and additionally process that one. Possibly a
+conditional `\[[!inline]]` could be used. --[[tschwinge]]
diff --git a/doc/plugins/sidebar.mdwn b/doc/plugins/sidebar.mdwn
new file mode 100644
index 000000000..012733456
--- /dev/null
+++ b/doc/plugins/sidebar.mdwn
@@ -0,0 +1,28 @@
+[[!template id=plugin name=sidebar author="Tuomo Valkonen"]]
+[[!tag type/chrome]]
+
+This plugin allows adding a sidebar to pages in the wiki.
+
+By default, and unless the `global_sidebars` setting is turned off,
+a sidebar is added to all pages in the wiki. The content of the sidebar
+is simply the content of a page named "sidebar" (ie, create a "sidebar.mdwn").
+
+Typically this will be a page in the root of the wiki, but it can also be a
+[[ikiwiki/SubPage]]. In fact, this page,
+[[plugins/sidebar|plugins/sidebar]], will be treated as a sidebar for the
+[[plugins]] page, and of all of its SubPages, if the plugin is enabled.
+
+There is also a [[ikiwiki/directive/sidebar]] directive that can be used
+to provide a custom sidebar content for a page.
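+
+For example, a page wanting its own sidebar might use the directive like
+this (a sketch of its `content` parameter):
+
+    \[[!sidebar content="""
+    Custom sidebar text for this page.
+    """]]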
+
+----
+
+Warning: Any change to the sidebar page will cause a rebuild of the whole
+wiki, since every page includes a copy that has to be updated. This can
+especially be a problem if the sidebar includes an
+[[ikiwiki/directive/inline]] directive, since any changes to pages inlined
+into the sidebar will change the sidebar and cause a full wiki rebuild.
+
+Instead, if you include a [[ikiwiki/directive/map]] directive on the sidebar,
+and it does not use the `show` parameter, only adding or removing pages
+included in the map will cause a full rebuild. Modifying pages will not.
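+
+For instance, a sidebar page consisting only of something like the
+following (the pagespec is illustrative) would only trigger full rebuilds
+when matching pages are added or removed:
+
+    \[[!map pages="blog/* and !*/Discussion"]]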
diff --git a/doc/plugins/sidebar/discussion.mdwn b/doc/plugins/sidebar/discussion.mdwn
new file mode 100644
index 000000000..245fb1544
--- /dev/null
+++ b/doc/plugins/sidebar/discussion.mdwn
@@ -0,0 +1,12 @@
+> Warning: Any change to the sidebar will cause a rebuild of the whole wiki, since every page includes a copy that has to be updated. This can especially be a problem if the sidebar includes inline or map directives, since any changes to pages inlined or mapped onto the sidebar will change the sidebar and cause a full wiki rebuild.
+
+I tried exactly that, namely having an inline in my sidebar to include an rss feed from some other site. I think the complete wiki rebuild should be doable every few days when a new article appears in that feed. But contrary to that warning, there is no complete wiki rebuild; only the sidebar page is rebuilt by the "ikiwiki --aggregate" from cron. Is that a bug or a feature?
+
+> It's a bug, discussed in [[bugs/transitive_dependencies]]. --[[Joey]]
+
+I needed to include inline directives into sidebars at different site sections to generate a dynamically updated navigation - very nice when combined with toggles - and I ran into the very same problem. I tried the map directive instead, but found I wouldn't like to re-style everything and also was missing the ability to make use of the show=title variable giving me meta titles where needed without taking the cost of rebuild with every page edit.
+
+Then I came across the tip to include the quick=yes parameter with the inline directive, which is described as not showing page titles set with the meta directive, and I thought, well, if that's the only way to have it, maybe I can refrain from using meta titles.
+But to my surprise, even with quick=yes included in the inline directive in the sidebars, meta titles are still shown, and there is no more forced rebuild when editing via cgi. That's amazing, but maybe it should be noted somewhere. Once more ikiwiki showed its bright face, thank you. --Boris
+
+How can I use a different sidebar, with its own CSS, for SubPages under a certain directory? -- Joe
diff --git a/doc/plugins/signinedit.mdwn b/doc/plugins/signinedit.mdwn
new file mode 100644
index 000000000..814ab5508
--- /dev/null
+++ b/doc/plugins/signinedit.mdwn
@@ -0,0 +1,5 @@
+[[!template id=plugin name=signinedit core=1 author="[[Joey]]"]]
+[[!tag type/auth]]
+
+This plugin, which is enabled by default, requires users be logged in
+before editing pages in the wiki.
diff --git a/doc/plugins/smiley.mdwn b/doc/plugins/smiley.mdwn
new file mode 100644
index 000000000..e4153c612
--- /dev/null
+++ b/doc/plugins/smiley.mdwn
@@ -0,0 +1,9 @@
+[[!template id=plugin name=smiley author="[[Joey]]"]]
+[[!tag type/chrome]]
+
+This plugin makes it easy to insert smileys and other special symbols into
+pages in the wiki. The symbols are all listed on the [[smileys]] page,
+which serves as both configuration for the plugin and a list of available
+smileys.
+
+This plugin is included in ikiwiki, but is not enabled by default. :-)
diff --git a/doc/plugins/sortnaturally.mdwn b/doc/plugins/sortnaturally.mdwn
new file mode 100644
index 000000000..a16381946
--- /dev/null
+++ b/doc/plugins/sortnaturally.mdwn
@@ -0,0 +1,6 @@
+[[!template id=plugin name=sortnaturally core=1 author="[[chrysn]], [[smcv]]"]]
+[[!tag type/meta]]
+
+This plugin provides the `title_natural` [[ikiwiki/pagespec/sorting]]
+order, which uses [[!cpan Sort::Naturally]] to sort numbered pages in a
+more natural order.
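+
+For example (assuming pages named `chapter1` through `chapter10`, which a
+plain title sort would order as 1, 10, 2, ...):
+
+    \[[!inline pages="chapter*" sort="title_natural" show=0]]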
diff --git a/doc/plugins/sparkline.mdwn b/doc/plugins/sparkline.mdwn
new file mode 100644
index 000000000..83e24a27d
--- /dev/null
+++ b/doc/plugins/sparkline.mdwn
@@ -0,0 +1,22 @@
+[[!template id=plugin name=sparkline author="[[Joey]]"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/sparkline]]
+[[ikiwiki/directive]], which allows for easily embedding sparklines into
+wiki pages. A sparkline is a small word-size graphic chart, that is designed
+to be displayed alongside text.
+
+# requirements
+
+The plugin uses the [Sparkline PHP Graphing Library](http://sparkline.org/)
+as it has better output than the native perl sparkline library. Therefore,
+to use the plugin, you will need:
+
+* The Sparkline PHP Graphing Library, installed in php's path so that
+ php can find it when `sparkline/Sparkline.php` is required.
+* The GD PHP module used by the Sparkline library.
+* A "php" program in the path, that can run standalone php programs.
+* [[!cpan Digest::SHA]]
+
+On a Debian system, this can be accomplished by installing these packages:
+`libsparkline-php` `php5-gd` `php5-cli` `libdigest-sha1-perl`
diff --git a/doc/plugins/table.mdwn b/doc/plugins/table.mdwn
new file mode 100644
index 000000000..d16bcc726
--- /dev/null
+++ b/doc/plugins/table.mdwn
@@ -0,0 +1,12 @@
+[[!template id=plugin name=table author="[[VictorMoral]]"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/table]] [[ikiwiki/directive]].
+It can build HTML tables from data in CSV (comma-separated values)
+or DSV ([delimiter-separated values](http://en.wikipedia.org/wiki/Delimiter-separated_values)) format.
+
+It needs the perl module [[!cpan Text::CSV]] for the CSV data.
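+
+A small sketch of the directive with inline CSV data (the field names and
+values are made up):
+
+    \[[!table format=csv data="""
+    Customer,Amount
+    Fulanito,134
+    Menganito,234
+    """]]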
+
+Note that you can also build tables in [[ikiwiki/Markdown]] pages
+without using this plugin, by enabling the `multimarkdown` option
+and installing the [[!cpan Text::MultiMarkdown]] Perl module, or simply by using the `<table>` HTML tag.
diff --git a/doc/plugins/table/discussion.mdwn b/doc/plugins/table/discussion.mdwn
new file mode 100644
index 000000000..86572c935
--- /dev/null
+++ b/doc/plugins/table/discussion.mdwn
@@ -0,0 +1,73 @@
+Well, that's one workaround for (some versions of) markdown's
+lack of tables..
+
+Interesting that you chose to use CSV format. Seems there are advantages
+(standardisation) and disadvantages (limited to simple tables).
+
+--[[Joey]]
+
+# Patch for new header options
+
+I have written a small patch for this plugin to enable the first column as a header instead of just the first row or no header.
+
+In my version, there are three options for the header field:
+
++ **no**: no header;
++ **col**: the first column as header;
++ **row**: the first row as header (for compatibility reasons, **yes** is an alternate value for this option).
+
+Here are the links to the patch and to a patched version of the plugin:
+
++ [table.pm.patch](http://alexandre.dupas.free.fr/code/ikiwiki/table.pm.patch)
++ [table.pm](http://alexandre.dupas.free.fr/code/ikiwiki/table.pm)
+
+I hope this might be interesting for some ikiwiki users.
+
+--[[AlexandreDupas]]
+
+> Thanks for the patch, I've merged it in.
+> (Just FYI, in future, I recommend using a unified diff. Also, not
+> renaming variables that don't really need to be renamed makes your patch
+> easier to apply.) --[[Joey]]
+
+---
+
+# Horizontal cell alignment
+
+Do you know any easy method of horizontal cell alignment? I know I can set `class`
+attribute for the table, but how to set different `class` for different cells?
+
+[DokuWiki](http://www.dokuwiki.org/) has a nice horizontal alignment solution.
+Suppose that we have `|foo|` cell. If I want to align the cell to left,
+then I should type `|foo |`. If I want to do right alignment, then I type `| foo|`.
+For centering cell content I need to type `| foo |`. Please note that I used
+only one space for all examples, but in DokuWiki I can use many spaces.
+
+Do you like it? Can you implement the same in Ikiwiki? :) --[[Paweł|ptecza]]
+
+> Multimarkdown has [table support](http://fletcherpenney.net/multimarkdown/users_guide/multimarkdown_syntax_guide/#tables)
+> that includes alignment. (Using colons to control it.) So you can turn on
+> `multimarkdown` in setup to use that.
+>
+> I'd not mind if someone adds alignment to this plugin. Although the
+> universe of possible table formatting stuff is nearly endless, and at
+> some point it becomes clearer and simpler to just write the table in
+> html.. --[[Joey]]
+
+>> Thanks a lot for the info about Multimarkdown! It seems very interesting.
+
+>> I'll look at that plugin and try to add that feature, if it's not
+>> too hard.
+
+>> I know that people have many ideas how to format their tables
+>> and it's not easy to create universal tool. Of course `table` plugin
+>> was written rather for simple usage. However cell alignment is very
+>> helpful feature, so I think the plugin should be able to do it.
+>> --[[Paweł|ptecza]]
+
+-----
+
+If you see `[[!table\ Error: ]]` you probably need to `sudo apt-get install libtext-csv-perl`.
+
+> Perhaps more helpfully, ikiwiki 3.1415926 fixes display of such errors to
+> actually include the error message. --[[Joey]]
diff --git a/doc/plugins/tag.mdwn b/doc/plugins/tag.mdwn
new file mode 100644
index 000000000..1d6bfbdd9
--- /dev/null
+++ b/doc/plugins/tag.mdwn
@@ -0,0 +1,24 @@
+[[!template id=plugin name=tag author="[[Joey]]"]]
+[[!tag type/tags type/link]]
+
+This plugin provides the [[ikiwiki/directive/tag]] and
+[[ikiwiki/directive/taglink]] [[directives|ikiwiki/directive]].
+These directives allow tagging pages.
+
+It also provides the `tagged()` [[ikiwiki/PageSpec]], which can be used to
+match pages that are tagged with a specific tag.
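+
+For example, to list all pages tagged with `patch` (an illustrative tag
+name):
+
+    \[[!inline pages="tagged(patch)" archive=yes]]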
+
+The `tagbase` setting can be used to make tags default to being put in a
+particular subdirectory.
+
+The `tag_autocreate` setting can be used to control whether new tag pages
+are created as needed. It defaults to being done only if a `tagbase` is
+set.
+
+The `tag_autocreate_commit` setting is enabled by default, and causes
+new tag pages to be checked into version control.
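+
+In a setup file, these settings might look like this (the values are
+illustrative):
+
+    tagbase => "tag",
+    tag_autocreate => 1,
+    tag_autocreate_commit => 1,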
+
+[[!if test="enabled(tag)" then="""
+This wiki has the tag plugin enabled, so you'll see a note below that this
+page is tagged with the "tags" tag.
+"""]]
diff --git a/doc/plugins/tag/discussion.mdwn b/doc/plugins/tag/discussion.mdwn
new file mode 100644
index 000000000..dfd749252
--- /dev/null
+++ b/doc/plugins/tag/discussion.mdwn
@@ -0,0 +1,31 @@
+I'd like to modify this plugin such that the tag pages are automatically created and populated with a list of relevant posts. The content of the tag page is simply `\[[!inline pages="link(tag/$tag)"]]`. The tag plugin will have to determine whether a page for the given tag already exists, and if not, use that Markdown fragment to generate it.
+
+There are clearly many ways to do this, but any opinions on which is the cleanest?
+
+--Ben
+
+It might work to use the 'change' hook, since that's called at the very end
+of `refresh()`. The hook could add the tag pages and re-run `refresh()`,
+taking appropriate care to avoid looping forever.
+
+--[[Joey]]
+
+Thanks. That works fine.
+
+--Ben
+
+@Ben: could you publish the code for that?
+
+--[[David_Riebenbauer]]
+
+AOLMODE=true echo "I too would really like this feature, which would make cgi free life much
+better" --[[DavidBremner]]
+
+Please make the actual text used a template some way or another. I may want `map` instead of `inline`. --[[madduck]]
+
+
+See [[todo/auto-create tag pages according to a template]]
+
+-- Jeremy Schultz <jeremy.schultz@uleth.ca>
+
+`tag_autocreate` can now enable this. --[[Joey]]
diff --git a/doc/plugins/template.mdwn b/doc/plugins/template.mdwn
new file mode 100644
index 000000000..8d17e2825
--- /dev/null
+++ b/doc/plugins/template.mdwn
@@ -0,0 +1,7 @@
+[[!template id=plugin name=template author="[[Joey]]"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/template]] [[ikiwiki/directive]].
+With this plugin, you can set up templates, and cause them to be filled out
+and inserted into pages in the wiki. Existing templates are listed in the
+[[templates]] page.
diff --git a/doc/plugins/testpagespec.mdwn b/doc/plugins/testpagespec.mdwn
new file mode 100644
index 000000000..8180d5d4b
--- /dev/null
+++ b/doc/plugins/testpagespec.mdwn
@@ -0,0 +1,6 @@
+[[!template id=plugin name=testpagespec author="[[Joey]]"]]
+[[!tag type/special-purpose]]
+
+This plugin provides a [[ikiwiki/directive/testpagespec]] [[ikiwiki/directive]].
+The directive allows testing a [[ikiwiki/PageSpec]] to see if it matches a
+page, and to see the part that matches, or causes the match to fail.
diff --git a/doc/plugins/teximg.mdwn b/doc/plugins/teximg.mdwn
new file mode 100644
index 000000000..866b1ee05
--- /dev/null
+++ b/doc/plugins/teximg.mdwn
@@ -0,0 +1,15 @@
+[[!template id=plugin name=teximg author="[[PatrickWinnertz]]"]]
+[[!tag type/widget type/slow]]
+
+This plugin provides a [[ikiwiki/directive/teximg]] [[ikiwiki/directive]],
+that renders LaTeX formulas into images.
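+
+For example (a small formula passed via the directive's `code` parameter):
+
+    \[[!teximg code="\frac{1}{2}"]]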
+
+Of course you will need LaTeX installed for this to work.
+
+## configuration
+
+There are several configuration directives that can be used in the setup
+file. `teximg_prefix` can be set to change the LaTeX preamble, and
+`teximg_postfix` to change the LaTeX postfix. The `teximg_dvipng` setting
+can be set to 0 to disable use of `dvipng`, and instead force use of `dvips`
+and `convert`.
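+
+A sketch of these settings in the setup file (the values shown are only
+examples):
+
+    teximg_dvipng => 0,
+    teximg_prefix => '\documentclass{article}\begin{document}',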
diff --git a/doc/plugins/teximg/discussion.mdwn b/doc/plugins/teximg/discussion.mdwn
new file mode 100644
index 000000000..4d8769b73
--- /dev/null
+++ b/doc/plugins/teximg/discussion.mdwn
@@ -0,0 +1,5 @@
+A minor nitpick: if, while editing, you preview your page two times without changing anything, the second time produces an error. --[[buo]]
+
+> Fixed --[[Joey]]
+
+>> Thanks! --[[buo]]
diff --git a/doc/plugins/textile.mdwn b/doc/plugins/textile.mdwn
new file mode 100644
index 000000000..4ed7d4e81
--- /dev/null
+++ b/doc/plugins/textile.mdwn
@@ -0,0 +1,6 @@
+[[!template id=plugin name=textile author="mazirian"]]
+[[!tag type/format]]
+
+Textile is a versatile markup language. So here's a plugin that will use the
+Textile markup language to render .txtl files in your data directory.
+You must have [[!cpan Text::Textile]] installed for it to work.
diff --git a/doc/plugins/theme.mdwn b/doc/plugins/theme.mdwn
new file mode 100644
index 000000000..5261df111
--- /dev/null
+++ b/doc/plugins/theme.mdwn
@@ -0,0 +1,18 @@
+[[!template id=plugin name=theme author="[[Joey]]"]]
+[[!tag type/web]]
+
+The theme plugin allows easily applying a theme to your wiki, by
+configuring the `theme` setting in the setup file with the name of a theme
+to use. The themes you can choose from are all subdirectories, typically
+inside `/usr/share/ikiwiki/themes/`. See [[themes]] for an overview
+of the themes included in ikiwiki and the [[theme market]] for third party themes.
+
+You can set the theme via the **theme** option in your config file (after
+enabling the plugin). After changing it, refresh the wiki with `ikiwiki
+-setup <file>` to see the changes; note that `--setup` won't work here, as
+the two options are not interchangeable.
+
+Hints for theme builders
+------------------------
+
+ * Start from an existing [[CSS file|css]], see also the [[css market]] for examples
+ * You can override the [[templates]] files by dropping them in a `templates` subdirectory
+ * Try to stick with modifying the CSS however, maintaining custom templates is harder
diff --git a/doc/plugins/theme/discussion.mdwn b/doc/plugins/theme/discussion.mdwn
new file mode 100644
index 000000000..67a2bf46a
--- /dev/null
+++ b/doc/plugins/theme/discussion.mdwn
@@ -0,0 +1,26 @@
+### What license do themes need to have for distribution?
+
+Could someone specify what license the themes need to have to get
+distributed in ikiwiki or Debian? The currently included themes seem to be
+under the GPLv2. Does the [Creative Commons Attribution 3.0 Unported
+License](http://creativecommons.org/licenses/by/3.0/) also work? That way a
+lot of free CSS templates could be included, e.g. from
+[freecsstemplates.org](http://www.freecsstemplates.org/). --PaulePanter
+
+> Paule, I'd love it if you did that! The only hard requirement on themes
+> included in ikiwiki is that they need to be licensed with a [DFSG
+> compatible license](https://wiki.debian.org/DFSGLicenses). CC-BY-SA 3.0
+> is DFSG; CC-BY is apparently being accepted by Debian too.
+>
+> As a soft requirement, I may exercise some discretion about themes that
+> require obtrusive attribution links to be included on every page of a
+> site using the theme. While probably DFSG, that adds a requirement
+> that ikiwiki itself does not require. --[[Joey]]
+
+### Once one has enabled the 'theme' plugin in the setup file, how does one use themes?
+
+Choose one of the [[themes]] which are bundled with ikiwiki and configure ikiwiki to use it by setting this in your setup file, eg.
+
+ theme => 'blueview',
+
+-- [[AdamShand]]
diff --git a/doc/plugins/toc.mdwn b/doc/plugins/toc.mdwn
new file mode 100644
index 000000000..a0ad3a5d0
--- /dev/null
+++ b/doc/plugins/toc.mdwn
@@ -0,0 +1,5 @@
+[[!template id=plugin name=toc author="[[Joey]]"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/toc]] [[ikiwiki/directive]],
+which adds a table of contents to a page.
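+
+For example, to limit the table of contents to the two highest levels of
+headings on a page:
+
+    \[[!toc levels=2]]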
diff --git a/doc/plugins/toc/discussion.mdwn b/doc/plugins/toc/discussion.mdwn
new file mode 100644
index 000000000..a09ae5703
--- /dev/null
+++ b/doc/plugins/toc/discussion.mdwn
@@ -0,0 +1,10 @@
+If you are using the sidebar plugin and have a header in the sidebar, it shows up in the table of contents. I can see why this happens, but it surprised me and wasn't the desired effect in my specific situation. -- [[AdamShand]]
+
+A related side effect: If you use any sort of headers in the page
+template (such as placing the page title in an `<h1>`), the toc plugin
+picks it up. I suppose it parses the entire page rather than just the
+rendered content. --[[JasonBlevins]]
+
+Why doesn't the TOC appear in the edit page preview? It only appears when the page is finally rendered. This makes it somewhat difficult to organize headings, saving & re-editing all the time. My user page currently has a toc to play with: --[[sabr]]
+
+> Fixed. --[[Joey]]
diff --git a/doc/plugins/toggle.mdwn b/doc/plugins/toggle.mdwn
new file mode 100644
index 000000000..d1500eba0
--- /dev/null
+++ b/doc/plugins/toggle.mdwn
@@ -0,0 +1,7 @@
+[[!template id=plugin name=toggle author="[[Joey]]"]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/toggle]] and
+[[ikiwiki/directive/toggleable]] [[directives|ikiwiki/directive]].
+With these directives you can create links on pages that, when clicked, toggle
+display of other parts of the page.
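+
+A minimal example (the `id` value is arbitrary, but must match between the
+two directives):
+
+    \[[!toggle id="ipsum" text="show more"]]
+    \[[!toggleable id="ipsum" text="This text can be toggled."]]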
diff --git a/doc/plugins/toggle/discussion.mdwn b/doc/plugins/toggle/discussion.mdwn
new file mode 100644
index 000000000..e48eef5ba
--- /dev/null
+++ b/doc/plugins/toggle/discussion.mdwn
@@ -0,0 +1,43 @@
+## Nested plugins
+
+Is it possible to use another plugin inside your toggle plugin? For example,
+I want to have a toggleable table and tried to use [[Victor Moral|users/victormoral]]'s [[table plugin|plugins/table]],
+but with no success. How can I do it?
+--PTecza
+
+> Yes, you can nest preprocessor directives. However, due to the issues
+> discussed [[here|todo/nested_preprocessor_directives]], it's not
+> currently supported to nest multiple levels of the same quotes.
+> --[[Joey]]
+
+>> Thanks a lot for the fast reply, Joey! It's good to know it.
+>> --PTecza
+
+
+## [[bugs/Bug_when_toggling_in_a_preview_page]]
+
+----
+
+## Using toggle directives in a list item
+Take this code snippet.
+
+ * [[!toggle id="test" text="test"]]
+ [[!toggleable id="test text="""toggle"""]]
+
+In the HTML-output the `ul` and `div` overlap.
+
+ <div id="content">
+ <ul>
+ <li><a class="toggle" href="#test.test">test</a>
+ <div class="toggleable" id="test.-test"></li>
+ </ul>
+
+ <p>toggle</p>
+
+ </div>
+
+ </div>
+
+Even after fixing this manually, the JavaScript does not seem to work, and `toggle` is shown unconditionally.
+
+I do not know if this is due to the [[shortcoming with nested preprocessor directives|todo/nested_preprocessor_directives]] mentioned at the beginning of this page. Maybe a note could be added to the main page of the plugin. --Paul
diff --git a/doc/plugins/trail.mdwn b/doc/plugins/trail.mdwn
new file mode 100644
index 000000000..14b97e35a
--- /dev/null
+++ b/doc/plugins/trail.mdwn
@@ -0,0 +1,76 @@
+[[!template id=plugin name=trail author="[[Simon_McVittie|smcv]]"]]
+[[!tag type/chrome]]
+
+This plugin provides the [[ikiwiki/directive/trailoptions]],
+[[ikiwiki/directive/traillink]], [[ikiwiki/directive/trailitem]],
+and [[ikiwiki/directive/trailitems]] [[directives|ikiwiki/directive]].
+
+It's sometimes useful to have "trails" of pages in a wiki where each
+page links to the next and/or previous page. For instance, you could use
+this for a guided tour, sequence of chapters, or sequence of blog posts.
+
+In this plugin, a trail is represented by a page, and the pages in the
+trail are indicated by specially marked links within that page, or by
+including groups of pages with a [[ikiwiki/directive]].
+
+If using the default `page.tmpl`, each page automatically displays the
+trails that it's a member of (if any), with links to the trail and to
+the next and previous members. HTML `<link>` tags with the `prev`,
+`next` and `up` relations are also generated.
+
+The [[ikiwiki/directive/trailoptions]] directive sets options for the
+entire trail.
+
+Pages can be included in a trail in various ways:
+
+* The [[ikiwiki/directive/inline]] directive with `trail="yes"` sets up an
+ [[inline]], and at the same time adds the matching pages (from `pages` or
+ `pagenames`) to the trail. One use is to navigate through all posts in
+ a blog:
+
+ \[[!inline pages="page(./posts/*) and !*/Discussion" archive=yes
+ feedshow=10 quick=yes trail=yes]]
+
+ This only works if the trail and [[!iki plugins/inline desc=inline]]
+ plugins are both enabled.
+
+* The [[ikiwiki/directive/trailitems]] directive has optional `pages` and
+ `pagenames` options which behave the same as in [[inline]], but don't
+ produce any output in the page, so you can have trails that don't list
+ all their pages.
+
+* The [[ikiwiki/directive/traillink]] directive makes a visible link
+ and also adds the linked page to the trail. This will typically be
+ used in a bullet list, but could also be in paragraph text:
+
+ * [[!traillink Introduction]]
+ * [[!traillink "Chapter 1"]]
+ * [[!traillink Chapter_2]]
+ * [[!traillink Appendix_A]]
+
+ or
+
+ To use this software you must \[[!traillink install]] it,
+ \[[!traillink configuration text="configure it"]],
+ and finally \[[!traillink running|run_it]].
+
+ This also counts as a [[ikiwiki/WikiLink]] for things like the `link()`
+ [[ikiwiki/PageSpec]] item.
+
+* The [[ikiwiki/directive/trailitem]] directive adds a page to the trail
+ like `traillink`, but produces an invisible link, rather like `\[[!tag]]`:
+
+ To use this software you must \[[!traillink install]] it,
+ \[[!trailitem installing_from_packages]]
+ \[[!trailitem installing_from_source]]
+ \[[!traillink configuration text="configure it"]],
+ and finally \[[!traillink running|run_it]].
+ \[[!trailitem troubleshooting]]
+
+ Like `\[[!tag]]`, this still counts as a [[ikiwiki/WikiLink]] even though
+ there's no visible link.
+
+You can mix several of these directives in one page. The resulting
+trail will contain all of the pages matched by any of the directives,
+in the same order that the directives appear (unless you use the `sort` or
+`reverse` options on `\[[!trailoptions]]`).
diff --git a/doc/plugins/trail/discussion.mdwn b/doc/plugins/trail/discussion.mdwn
new file mode 100644
index 000000000..6c0b790b9
--- /dev/null
+++ b/doc/plugins/trail/discussion.mdwn
@@ -0,0 +1,105 @@
+I believe the `trail3-integrated` and `trail3-prebuild` branches address
+Joey's review comments from IRC:
+
+ 06-12-2011 19:01:07 <joeyh>: ok, light review finished. so, if you want
+ to make a branch with inline trail=yes, and perhaps also adding a hook
+ so you don't need to inject, I think I can merge it right away
+
+I haven't published instructions for using this version as a
+standalone plugin, because it needs core and inline changes.
+
+Commits up to 63bb8b42 make the trail plugin better-integrated,
+including `\[[!inline trail=yes]]`. 63bb8b42 is the commit to
+merge if you don't like the design of my hooks.
+
+Commit 24168b99 adds a `build_affected` hook, run at about the
+same time as `render_backlinks`, and uses it to render the
+extra pages. This removes the need for `trail` to inject
+anything. In principle, backlinks etc. could use this hook
+too, if they weren't core.
+
+Commit d0dea308 on the `trail3-prebuild` branch adds a
+`prebuild` hook, which runs after everything has been scanned
+but before anything is rendered. This removes the need
+for `trail` to run its old `prerender` function in its
+render hooks (preprocess, pagetemplate etc.) to collate
+metadata before it renders anything. However, I'm not sure
+that this is really the right thing to do, which is why it's
+in its own branch: the `prebuild` hook is a lot like
+`needsbuild` (but later), so it's called even if no trail
+or trail member has actually been edited.
+
+For it to be useful for `trail`, the `prebuild` hook has to run
+after both pagespecs and sorting work. The other use case
+I've seen for a similar hook was for Giuseppe Bilotta to
+sort an inline-of-inlines by mtime of newest post, but that
+can't be the same hook, because it has to run after pagespecs
+work, but before sorting.
+
+--[[smcv]]
+
+> I've merged trail3-integrated, but not prebuild. I don't exactly dislike
+> prebuild, but dunno that the hook proliferation is worth the minor cleanup
+> it allows in trail. --[[Joey]]
+
+>> Hmm, t/trail.t is failing several tests here. To reproduce, I build the
+>> debian package from a clean state, or `rm -rf .t` between test runs. --[[Joey]]
+
+<pre>
+t/trail.t .................... 1/?
+# Failed test at t/trail.t line 211.
+# Failed test at t/trail.t line 213.
+# Failed test at t/trail.t line 215.
+# Failed test at t/trail.t line 217.
+# Failed test at t/trail.t line 219.
+# Failed test at t/trail.t line 221.
+# Failed test at t/trail.t line 223.
+# Failed test at t/trail.t line 225.
+# Failed test at t/trail.t line 227.
+# Failed test at t/trail.t line 229.
+# Failed test at t/trail.t line 231.
+</pre>
+
+> Looking at the first of these, it expected "trail=sorting n=sorting/new p="
+> but gets: "trail=sorting n=sorting/ancient p=sorting/new"
+>
+> Looking at the second failure, it expected "trail=sorting n=sorting/middle p=sorting/old$"
+> but got: "trail=sorting n=sorting/old p=sorting/end"
+>
+> Perhaps a legitimate bug? --[[Joey]]
+
+>> I saw this while developing, but couldn't reproduce it, and assumed
+>> I'd failed to update `blib` before `make test`, or some such.
+>> In fact it's a race condition, I think.
+>>
+>> The change and failure here is that `sorting.mdwn` is modified
+>> to sort its trail in reverse order of title. Previously, it
+>> was sorted by order of directives in the page, and secondarily
+>> by whatever sort order each directive specified (e.g.
+>> new, old and ancient were sorted by increasing age).
+>> `old` appearing between `new` and `ancient`, and `new` appearing
+>> between `end` and `old`, indicates that this re-sorting has not
+>> actually taken effect, and the old sort order is still used.
+>>
+>> I believe this is because the system time (as an integer) remained
+>> the same for the entire test, and mtimes as used in ikiwiki
+>> only have a 1-second resolution. We can either fix this with
+>> utime or sleep; I chose utime, since sleeping for 1 second would
+>> slow down the test significantly. Please merge or cherry-pick
+>> `smcv/trail-test` (there's only one commit). --[[smcv]]
+
+----
+
+[[!template id=gitbranch branch=smcv/ready/trail author=smcv]]
+
+Some later changes to trail:
+
+* Display the trail links at beginning/end of default `page.tmpl`
+ as suggested on IRC
+* Improve CSS, particularly in blueview and goldtype themes
+ ([example](http://blueview.hosted.pseudorandom.co.uk/posts/second_post/))
+* Fix a possible bug regarding state deletion
+
+--[[smcv]]
+
+> Applied --[[Joey]]
diff --git a/doc/plugins/transient.mdwn b/doc/plugins/transient.mdwn
new file mode 100644
index 000000000..b7dd11906
--- /dev/null
+++ b/doc/plugins/transient.mdwn
@@ -0,0 +1,24 @@
+[[!template id=plugin name=transient author="[[Simon_McVittie|smcv]]" core=yes]]
+[[!tag type/special-purpose]]
+
+The `transient` plugin adds an underlay in `.ikiwiki/transient`, which is
+intended for pages that are automatically created and should not be committed
+to the [[RCS]]. It works in the same way as the [[basewiki]] and the underlays
+set up by the [[plugins/underlay]] plugin, so if a page in the transient
+underlay is edited via the web, the edited version is committed to the RCS
+as usual. Unlike other underlays, if a page in the transient underlay is
+superseded by an edited version in the RCS, the old transient version
+is deleted automatically.
+
+This plugin is mostly useful as something that other plugins can depend on:
+
+* [[plugins/aggregate]] writes aggregated posts into the transient underlay
+* [[plugins/autoindex]] can be configured to auto-create missing
+ pages that have a [[ikiwiki/subpage]] or an [[plugins/attachment]], but not
+ commit them, in which case they go in the transient underlay
+* [[plugins/comments]] can be configured to not commit comments: if so, it
+ puts them in the transient underlay
+* [[plugins/recentchanges]] writes new changes into the transient underlay
+* [[plugins/tag]] can be configured to auto-create missing
+ tag pages but not commit them, in which case they go in the transient
+ underlay
diff --git a/doc/plugins/txt.mdwn b/doc/plugins/txt.mdwn
new file mode 100644
index 000000000..a51aabf48
--- /dev/null
+++ b/doc/plugins/txt.mdwn
@@ -0,0 +1,19 @@
+[[!template id=plugin name=txt author="[[Gabriel]]"]]
+[[!tag type/format]]
+
+This plugin makes ikiwiki treat files with names ending in ".txt"
+as wiki pages.
+
+Unlike other [[type/format]] plugins, no formatting of markup in
+txt files is done; the file contents are displayed to the user as
+pre-formatted text, with HTML markup characters such as ">" escaped.
+
+The only exceptions are that [[WikiLinks|ikiwiki/WikiLink]] and
+[[directives|ikiwiki/directive]] are still expanded by
+ikiwiki, and that, if the [[!cpan URI::Find]] perl module is installed, URLs
+in the txt file are converted to hyperlinks.
+
+----
+
+As a special case, a file `robots.txt` will be copied intact into the
+`destdir`, as well as creating a wiki page named "robots".
diff --git a/doc/plugins/txt/discussion.mdwn b/doc/plugins/txt/discussion.mdwn
new file mode 100644
index 000000000..6b907e65c
--- /dev/null
+++ b/doc/plugins/txt/discussion.mdwn
@@ -0,0 +1,33 @@
+I guess the reason I never thought to write this is that when I put a .txt file
+in ikiwiki, I'm happy enough to see it copied through unchanged.
+
+I guess the advantage of using this plugin is that you get the page wrapper
+around the preformatted text, and could even inline such a page.
+
+There is not currently a good way to turn off some processing steps for
+some page types. It's either all or nothing. The patch in
+[[todo/format_escape]] might allow a formatter to register its own special
+version of htmllink that didn't do anything, but would that be enough?
+
+--[[Joey]]
+
+[Here](http://www.gmcmanus.org/plaintext.pm) is an alternate approach.
+It encodes entities using a filter hook, before wikilinks are linkified.
+So wikilinks turn up as links.
+It also uses URI::Find to turn URIs into links.
+
+I'm not very familiar with Perl, so this code could be improved.
+
+--Gabriel
+
+I like this approach! It sidesteps the annoying problem, and it actually
+makes the .txt format genuinely wiki-like, by allowing wikilinks and
+preprocessor directives.
+
+The only thing I am not sure about is the conversion of external urls to
+hyperlinks.
+
+Can you please add a copyright/license statement to the top of the plugin?
+If you do, I'll add it to ikiwiki. Thanks! --[[Joey]]
+
+> I've added copyright and license (GPLv2 or later). --Gabriel
diff --git a/doc/plugins/type/auth.mdwn b/doc/plugins/type/auth.mdwn
new file mode 100644
index 000000000..6a1f2d12c
--- /dev/null
+++ b/doc/plugins/type/auth.mdwn
@@ -0,0 +1,4 @@
+These plugins add different authentication methods for logging in to the
+wiki and control what pages users can edit.
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/type/bundle.mdwn b/doc/plugins/type/bundle.mdwn
new file mode 100644
index 000000000..980dbbe57
--- /dev/null
+++ b/doc/plugins/type/bundle.mdwn
@@ -0,0 +1,3 @@
+These plugins enable whole bundles of other plugins.
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/type/chrome.mdwn b/doc/plugins/type/chrome.mdwn
new file mode 100644
index 000000000..73a6e5898
--- /dev/null
+++ b/doc/plugins/type/chrome.mdwn
@@ -0,0 +1,3 @@
+These plugins affect the look and feel of the overall wiki.
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/type/comments.mdwn b/doc/plugins/type/comments.mdwn
new file mode 100644
index 000000000..1e4dd7278
--- /dev/null
+++ b/doc/plugins/type/comments.mdwn
@@ -0,0 +1,3 @@
+These plugins relate to [[plugins/comments]].
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/type/core.mdwn b/doc/plugins/type/core.mdwn
new file mode 100644
index 000000000..2646996ba
--- /dev/null
+++ b/doc/plugins/type/core.mdwn
@@ -0,0 +1,3 @@
+These plugins provide core functionality and are enabled by default.
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/type/date.mdwn b/doc/plugins/type/date.mdwn
new file mode 100644
index 000000000..b95a7ecd3
--- /dev/null
+++ b/doc/plugins/type/date.mdwn
@@ -0,0 +1,3 @@
+These plugins control how ikiwiki displays dates.
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/type/format.mdwn b/doc/plugins/type/format.mdwn
new file mode 100644
index 000000000..8a100f963
--- /dev/null
+++ b/doc/plugins/type/format.mdwn
@@ -0,0 +1,3 @@
+These plugins provide ways to format text on wiki pages.
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/type/fun.mdwn b/doc/plugins/type/fun.mdwn
new file mode 100644
index 000000000..ad9e9c2e6
--- /dev/null
+++ b/doc/plugins/type/fun.mdwn
@@ -0,0 +1,3 @@
+These plugins are just for fun.
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/type/html.mdwn b/doc/plugins/type/html.mdwn
new file mode 100644
index 000000000..94e13b621
--- /dev/null
+++ b/doc/plugins/type/html.mdwn
@@ -0,0 +1,3 @@
+These plugins generate or process html.
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/type/link.mdwn b/doc/plugins/type/link.mdwn
new file mode 100644
index 000000000..fce27ae9c
--- /dev/null
+++ b/doc/plugins/type/link.mdwn
@@ -0,0 +1,3 @@
+These plugins deal with [[WikiLinks|ikiwiki/WikiLink]].
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/type/meta.mdwn b/doc/plugins/type/meta.mdwn
new file mode 100644
index 000000000..7a339747e
--- /dev/null
+++ b/doc/plugins/type/meta.mdwn
@@ -0,0 +1,3 @@
+These plugins deal in meta-information about the wiki.
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/type/slow.mdwn b/doc/plugins/type/slow.mdwn
new file mode 100644
index 000000000..5907d26f9
--- /dev/null
+++ b/doc/plugins/type/slow.mdwn
@@ -0,0 +1,5 @@
+These plugins can cause wiki rendering to be significantly slowed down,
+due to things like needing to run an external program for every page
+rendered.
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/type/special-purpose.mdwn b/doc/plugins/type/special-purpose.mdwn
new file mode 100644
index 000000000..7aeb8be9c
--- /dev/null
+++ b/doc/plugins/type/special-purpose.mdwn
@@ -0,0 +1,3 @@
+Special-purpose plugins.
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/type/tags.mdwn b/doc/plugins/type/tags.mdwn
new file mode 100644
index 000000000..78daebd53
--- /dev/null
+++ b/doc/plugins/type/tags.mdwn
@@ -0,0 +1,3 @@
+These plugins support tagging.
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/type/web.mdwn b/doc/plugins/type/web.mdwn
new file mode 100644
index 000000000..6ebd6cd37
--- /dev/null
+++ b/doc/plugins/type/web.mdwn
@@ -0,0 +1,3 @@
+These plugins enhance the web interface.
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/type/widget.mdwn b/doc/plugins/type/widget.mdwn
new file mode 100644
index 000000000..496b7da24
--- /dev/null
+++ b/doc/plugins/type/widget.mdwn
@@ -0,0 +1,4 @@
+These plugins allow inserting various things into pages via a
+[[ikiwiki/directive]].
+
+[[!map pages="plugins/* and tagged(.)"]]
diff --git a/doc/plugins/typography.mdwn b/doc/plugins/typography.mdwn
new file mode 100644
index 000000000..9ff6c4ffd
--- /dev/null
+++ b/doc/plugins/typography.mdwn
@@ -0,0 +1,12 @@
+[[!template id=plugin name=typography author="[[Roktas]]"]]
+[[!tag type/chrome]]
+
+This plugin, also known as
+[SmartyPants](http://daringfireball.net/projects/smartypants/), translates
+plain ASCII punctuation characters into "smart" typographic punctuation HTML
+entities. To use it, you need to have the [[!cpan Text::Typography]] module
+installed.
+
+This plugin has a configuration option. To change the attributes,
+set `--typographyattributes=whatever`. See the documentation for
+[[!cpan Text::Typography]] for available attributes.
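+
+For example, in the setup file (the value `3` is illustrative; see the
+module documentation for what each attribute enables):
+
+    typographyattributes => '3',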
diff --git a/doc/plugins/underlay.mdwn b/doc/plugins/underlay.mdwn
new file mode 100644
index 000000000..0cf819472
--- /dev/null
+++ b/doc/plugins/underlay.mdwn
@@ -0,0 +1,14 @@
+[[!template id=plugin name=underlay author="[[Simon_McVittie|smcv]]"]]
+[[!tag type/special-purpose]]
+
+This plugin adds an `add_underlays` option to the setup file. Its value is
+a list of underlay directories whose content is added to the wiki.
+
+Multiple underlays are normally set up automatically by other plugins (for
+instance, the images used by the [[plugins/smiley]] plugin), but they can
+also be used as a way to pull in external files that you don't want in
+revision control, like photos or software releases.
+
+Directories in `add_underlays` should usually be absolute. If relative,
+they're interpreted as relative to the parent directory of the basewiki
+underlay, which is probably not particularly useful in this context.
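+
+For example (the paths shown are hypothetical):
+
+    add_underlays => ['/srv/wiki/photos', '/srv/wiki/releases'],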
diff --git a/doc/plugins/userlist.mdwn b/doc/plugins/userlist.mdwn
new file mode 100644
index 000000000..1d3d38303
--- /dev/null
+++ b/doc/plugins/userlist.mdwn
@@ -0,0 +1,6 @@
+[[!template id=plugin name=userlist core=0 author="[[Joey]]"]]
+[[!tag type/web]]
+
+This plugin allows wiki admins to see a list of users who have recently
+used the wiki. This can be helpful to find openids of users to grant
+permissions, or in dealing with spam.
diff --git a/doc/plugins/version.mdwn b/doc/plugins/version.mdwn
new file mode 100644
index 000000000..326a2e7ce
--- /dev/null
+++ b/doc/plugins/version.mdwn
@@ -0,0 +1,7 @@
+[[!template id=plugin name=version author="[[Joey]]"]]
+[[!tag type/meta]]
+[[!tag type/widget]]
+
+This plugin provides the [[ikiwiki/directive/version]]
+[[ikiwiki/directive]], which inserts the current version
+of ikiwiki into a page.
diff --git a/doc/plugins/websetup.mdwn b/doc/plugins/websetup.mdwn
new file mode 100644
index 000000000..387c90b75
--- /dev/null
+++ b/doc/plugins/websetup.mdwn
@@ -0,0 +1,27 @@
+[[!template id=plugin name=websetup core=0 author="[[Joey]]"]]
+[[!tag type/web]]
+
+This plugin allows wiki admins to configure the wiki using a web interface,
+rather than editing the setup file directly. A "Setup" button is added
+to the admins' preferences page.
+
+Warning: This plugin rewrites your setup file. Any comments or unusual
+things (such as perl code) in the setup file will not be preserved. Also,
+it will only work correctly with new format setup files, as introduced in
+ikiwiki 2.60. Older setup files have a "wrappers" section, which will not
+be properly preserved if this plugin is used.
+
+Most settings can be modified using the web interface. Plugins can be
+enabled and disabled using it too. Some settings are not considered safe
+enough to be manipulated over the web; these are still shown, by default,
+but cannot be modified. To hide them, set `websetup_show_unsafe` to false
+in the setup file. A few settings have too complex a data type to be
+configured via the web. To mark additional settings as unsafe, you can
+list them in `websetup_unsafe`.
+
+Plugins that should not be enabled/disabled via the web interface can be
+listed in `websetup_force_plugins` in the setup file.
+
+When the setup is saved, the setup file will be rewritten with the new
+settings, and the wiki will be refreshed, or rebuilt, to make the setup
+changes take effect.
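+
+For example, a setup file might combine these options as follows (the
+plugin and setting names are illustrative):
+
+    websetup_show_unsafe => 0,
+    websetup_unsafe => ['mycomplexsetting'],
+    websetup_force_plugins => ['httpauth'],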
diff --git a/doc/plugins/wikitext.mdwn b/doc/plugins/wikitext.mdwn
new file mode 100644
index 000000000..483712130
--- /dev/null
+++ b/doc/plugins/wikitext.mdwn
@@ -0,0 +1,23 @@
+[[!template id=plugin name=wikitext author="[[Joey]]"]]
+[[!tag type/format]]
+
+This plugin allows ikiwiki to process pages written in the original wiki
+text format. To use it, you need to have the [[!cpan Text::WikiFormat]] perl
+module installed and enable the plugin; files with the extension `.wiki`
+will then be processed as wiki text.
+
+Wiki formatting is very simple. An item wrapped in three single quotes is
+strong. An item wrapped in two single quotes is emphasized. Four or more
+hyphen characters at the start of a line create a horizontal line. Newlines
+turn into the appropriate tags. Headers are matching equals signs around
+the header text -- the more signs, the lesser the header.
+
+Links are standard [[WikiLinks|ikiwiki/WikiLink]], although you can also enable
+[[CamelCase]] links.
+
+Lists are indented text, by one tab or four spaces. In unordered lists,
+where each item has its own bullet point, each item needs a leading
+asterisk and space. Ordered lists consist of items marked with combination
+of one or more alphanumeric characters followed by a period and an optional
+space. Any indented text without either marking is code, handled literally.
+You can nest lists.
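+
+A short sample of the format described above:
+
+    == A header ==
+
+    '''strong''' text, ''emphasized'' text, and a WikiLink.
+
+        * first item
+        * second item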
diff --git a/doc/plugins/wmd.mdwn b/doc/plugins/wmd.mdwn
new file mode 100644
index 000000000..7202aece6
--- /dev/null
+++ b/doc/plugins/wmd.mdwn
@@ -0,0 +1,16 @@
+[[!template id=plugin name=wmd author="[[Will]]"]]
+[[!tag type/web]]
+
+[WMD](http://wmd-editor.com/) is a What You See Is What You Mean editor for
+[[mdwn]]. This plugin makes WMD be used for editing pages in the wiki.
+
+To use the plugin, you will need to install WMD. Download the [WMD
+source](https://code.google.com/p/pagedown/). In that zip file
+you'll find a few example html files, a readme, and a `wmd` directory. Create
+a 'wmd' subdirectory in the ikiwiki `underlaydir` directory (i.e. `sudo mkdir
+/usr/share/ikiwiki/wmd`). Move the `wmd` directory into the directory you
+made. You should now have a `wmd/wmd/wmd.js` file as well as some other
+javascript files and an images directory in the same place.
+
+Note that the WMD plugin does **not** handle ikiwiki directives. For this
+reason the normal `preview` button remains.
diff --git a/doc/plugins/wmd/discussion.mdwn b/doc/plugins/wmd/discussion.mdwn
new file mode 100644
index 000000000..b57ef4057
--- /dev/null
+++ b/doc/plugins/wmd/discussion.mdwn
@@ -0,0 +1,73 @@
+I've tried to retrieve the wmd-editor source tarball lately, but the site seems offline.
+
+From what I've read on the Internet, wmd-editor is not (yet?) free software by itself, and its author has gone MIA.
+But it looks like somebody recently took the step to rewrite a wmd-clone under a saner license; see [pagedown](http://code.google.com/p/pagedown/source/browse/).
+
+Given all the above, what about upgrading this plugin to use pagedown instead of wmd? It seems a clear win to me...
+
+> AFAICS, pagedown is a modified version of WMD. Let's
+> look at its license file: --[[Joey]]
+
+<pre>
+A javascript port of Markdown, as used on Stack Overflow
+and the rest of Stack Exchange network.
+
+Largely based on showdown.js by John Fraser (Attacklab).
+
+Original Markdown Copyright (c) 2004-2005 John Gruber
+ <http://daringfireball.net/projects/markdown/>
+
+
+Original Showdown code copyright (c) 2007 John Fraser
+
+Modifications and bugfixes (c) 2009 Dana Robinson
+Modifications and bugfixes (c) 2009-2011 Stack Exchange Inc.
+
+Permission is hereby granted, free of charge, to any person obtaining a
+copy [...]
+</pre>
+
+> Ok, so it says it's based on showdown. John Fraser wrote showdown and also
+> WMD, which IIRC was built on top of showdown. (Showdown converts the
+> markdown to html, and WMD adds the editor UI.)
+>
+> I can find no actual statement of the copyright of showdown or
+> WMD. <http://code.google.com/p/wmd/> has a "MIT License" notice on it,
+> but this is clearly just the license chosen when signing up at google
+> code for the repo that would be used for a rewrite of the code, and the only thing
+> said about the previous 1.0 release of WMD is "use it freely", which is not
+> specific enough to be a grant of license, and is moreover not a free
+> software license, as it does not cover distribution or modification.
+>
+> Which was all covered in the thread here,
+> when StackOverflow decided to start working on pagedown.
+> <http://blog.stackoverflow.com/2008/12/reverse-engineering-the-wmd-editor/>
+> This thread does not give any indication that they ever managed to get
+> a license grant for WMD/showdown. Frankly, it does not inspire confidence
+> that the people working on this care about the license.
+>
+> It would probably be pretty easy to adapt the ikiwiki wmd plugin
+> to use pagedown. But without a clear and credible license, why?
+>
+> (Note that I have a wmd-new branch in my ikiwiki git repo that
+> uses <https://github.com/derobins/wmd>, which was an earlier
+> version of pagedown (probably, not entirely clear).)
+>
+> Another alternative is markitup: <http://markitup.jaysalvat.com/>
+> It has a clear history and a credible license (MIT or GPL dual license).
+> It's also easily extensible to other formats so could handle rst etc.
+> It does not, however, have a markdown to html converter -- for
+> previewing it has to talk to the server with AJAX.
+> --[[Joey]]
+
+>> I've got pagedown working on my personal site (simon.kisikew.org) but I'm not sure how
+>> I can inject the relevant &lt;div&gt;'s in the right place. They need to go **above**
+>> the editing &lt;textarea&gt; . (Too bad about the licensing, it's rather nice.)
+>> I had to do one minor change to it to have it inject itself into the page properly,
+>> and that was to make this change in `Markdown.Editor.js`:
+>>
+>> `this.input = doc.getElementById("editcontent" + postfix);`
+>>
+>> on line 247. --[[simonraven]]
+
+>>> Well, I re-figured out that I needed a TMPL_VAR FOO in the template(s). --[[simonraven]]
diff --git a/doc/plugins/write.mdwn b/doc/plugins/write.mdwn
new file mode 100644
index 000000000..d6e6d8d1e
--- /dev/null
+++ b/doc/plugins/write.mdwn
@@ -0,0 +1,1396 @@
+Ikiwiki's plugin interface allows all kinds of useful [[plugins]] to be
+written to extend ikiwiki in many ways. Despite the length of this page,
+it's not really hard. This page is a complete reference to everything a
+plugin might want to do. There is also a quick [[tutorial]].
+
+[[!template id="note" text="""
+Ikiwiki is a compiler
+
+One thing to keep in mind when writing a plugin is that ikiwiki is a wiki
+*compiler*. So plugins influence pages when they are built, not when they
+are loaded. A plugin that inserts the current time into a page, for
+example, will insert the build time.
+
+Also, as a compiler, ikiwiki avoids rebuilding pages unless they have
+changed, so a plugin that prints some random or changing thing on a page
+will generate a static page that won't change until ikiwiki rebuilds the
+page for some other reason, like the page being edited.
+
+The [[tutorial]] has some other examples of ways that ikiwiki being a
+compiler may trip up the unwary.
+"""]]
+
+[[!toc levels=2]]
+
+## Highlevel view of ikiwiki
+
+Ikiwiki mostly has two modes of operation. It can either be running
+as a compiler, building or updating a wiki; or as a cgi program, providing
+user interface for editing pages, etc. Almost everything ikiwiki does
+is accomplished by calling various hooks provided by plugins.
+
+### compiler
+
+As a compiler, ikiwiki starts by calling the
+[[`refresh`|plugins/write#refresh]] hook. Then it checks the wiki's source to
+find new or changed pages. The [[`needsbuild`|plugins/write#needsbuild]] hook
+is then called to allow manipulation of the list of pages that need to be
+built.
+
+Now that it knows what pages it needs to build, ikiwiki runs two compile
+passes. First, it runs [[`scan`|plugins/write#scan]] hooks, which collect
+metadata about the pages. Then it runs a page rendering pipeline, by calling
+in turn these hooks: [[`filter`|plugins/write#filter]],
+[[`preprocess`|plugins/write#preprocess]],
+[[`linkify`|plugins/write#linkify]], [[`htmlize`|plugins/write#htmlize]],
+[[`indexhtml`|plugins/write#indexhtml]],
+[[`pagetemplate`|plugins/write#pagetemplate]],
+[[`sanitize`|plugins/write#sanitize]], [[`format`|plugins/write#format]].
+
+After all necessary pages are built, it calls the
+[[`changes`|plugins/write#changes]] hook. Finally, if a page was deleted, the
+[[`delete`|plugins/write#delete]] hook is called, and the files that page had
+previously produced are removed.
+
+### cgi
+
+The flow between hooks when ikiwiki is run as a cgi is best illustrated by
+an example.
+
+Alice browses to a page and clicks Edit.
+
+* Ikiwiki is run as a cgi. It assigns Alice a session cookie, and, by calling
+ the [[`auth`|plugins/write#auth]] hooks, sees that she is not yet logged in.
+* The [[`sessioncgi`|plugins/write#sessioncgi]] hooks are then called, and one
+ of them, from the [[editpage]] plugin, notices that the cgi has been told
+ "do=edit".
+* The [[editpage]] plugin calls the [[`canedit`|plugins/write#canedit]] hook
+ to check if this page edit is allowed. The [[signinedit]] plugin has a hook
+ that says not: Alice is not signed in.
+* The [[signinedit]] plugin then launches the signin process. A signin page is
+ built by calling the [[`formbuilder_setup`|plugins/write#formbuilder]]
+ hook.
+
+Alice signs in with her openid.
+
+* The [[openid]] plugin's [[`formbuilder`|plugins/write#formbuilder]] hook
+ sees that an openid was entered in the signin form, and redirects to Alice's
+ openid provider.
+* Alice's openid provider calls back to ikiwiki. The [[openid]] plugin has an
+ [[`auth`|plugins/write#auth]] hook that finishes the openid signin process.
+* Signin complete, ikiwiki returns to what Alice was doing before; editing
+ a page.
+* Now all the [[`canedit`|plugins/write#canedit]] hooks are happy. The
+ [[editpage]] plugin calls
+ [[`formbuilder_setup`|plugins/write#formbuilder]] to display the page
+ editing form.
+
+Alice saves her change to the page.
+
+* The [[editpage]] plugin's [[`formbuilder`|plugins/write#formbuilder]] hook
+ sees that the Save button was pressed, and calls the
+ [[`checkcontent`|plugins/write#checkcontent]] and
+ [[`editcontent`|plugins/write#editcontent]] hooks. Then it saves the page
+ to disk, and branches into the compiler part of ikiwiki to refresh the wiki.
+
+## Types of plugins
+
+Most ikiwiki [[plugins]] are written in perl, like ikiwiki. This gives the
+plugin full access to ikiwiki's internals, and is the most efficient.
+However, plugins can actually be written in any language that supports XML
+RPC. These are called [[external]] plugins.
+
+A plugin written in perl is a perl module, in the `IkiWiki::Plugin`
+namespace. The name of the plugin is typically in lowercase, such as
+`IkiWiki::Plugin::inline`. Ikiwiki includes an `IkiWiki::Plugin::skeleton`
+that can be fleshed out to make a useful plugin.
+`IkiWiki::Plugin::pagecount` is another simple example. All perl plugins
+should `use IkiWiki` to import the ikiwiki plugin interface. It's a good
+idea to include the version number of the plugin interface that your plugin
+expects: `use IkiWiki 3.00`.
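+
+As a sketch, here is roughly the smallest useful perl plugin: a hypothetical
+`IkiWiki::Plugin::hello` (not shipped with ikiwiki) that adds a `\[[!hello]]`
+directive:
+
+    #!/usr/bin/perl
+    # Hypothetical example plugin, for illustration only.
+    package IkiWiki::Plugin::hello;
+
+    use warnings;
+    use strict;
+    use IkiWiki 3.00;
+
+    sub import {
+        hook(type => "preprocess", id => "hello", call => \&preprocess);
+    }
+
+    sub preprocess {
+        my %params=@_;
+        # Replaces the directive with a greeting naming the embedding page.
+        return "Hello from ".$params{page}."!";
+    }
+
+    1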
+
+An external plugin is an executable program. It can be written in any
+language. Its interface to ikiwiki is via XML RPC, which it reads from
+ikiwiki on its standard input, and writes to ikiwiki on its standard
+output. For more details on writing external plugins, see [[external]].
+
+Despite these two types of plugins having such different interfaces,
+they're the same as far as how they hook into ikiwiki. This document will
+explain how to write both sorts of plugins, albeit with an emphasis on perl
+plugins.
+
+## Plugin interface
+
+To import the ikiwiki plugin interface:
+
+    use IkiWiki 3.00;
+
+This will import several variables and functions into your plugin's
+namespace. These variables and functions are the ones most plugins need,
+and a special effort will be made to avoid changing them in incompatible
+ways, and to document any changes that have to be made in the future.
+
+Note that IkiWiki also provides other variables and functions that are not
+exported by default. No guarantee is made about these in the future, so if
+it's not exported, the wise choice is to not use it.
+
+## Registering plugins
+
+Plugins should, when imported, call `hook()` to hook into ikiwiki's
+processing. The function takes named parameters, and their use varies
+depending on the type of hook being registered -- see below. A plugin can
+call the function more than once to register multiple hooks.
+
+All calls to `hook()` should be passed a "type" parameter, which gives the
+type of hook, an "id" parameter, which should be a unique string for this
+plugin, and a "call" parameter, which tells what function to call for the
+hook.
+
+An optional "last" parameter, if set to a true value, makes the hook run
+after all other hooks of its type, and an optional "first" parameter makes
+it run first. Useful if the hook depends on some other hook being run first.
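+
+For example, a hypothetical sanitize hook that needs to see the html after
+every other sanitize hook has had its turn could be registered like this:
+
+    hook(type => "sanitize", id => "foo", call => \&sanitize, last => 1);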
+
+## Types of hooks
+
+In roughly the order they are called.
+
+### getopt
+
+ hook(type => "getopt", id => "foo", call => \&getopt);
+
+This allows for plugins to perform their own processing of command-line
+options and so add options to the ikiwiki command line. It's called during
+command line processing, with `@ARGV` full of any options that ikiwiki was
+not able to process on its own. The function should process any options it
+can, removing them from `@ARGV`, and probably recording the configuration
+settings in `%config`. It should take care not to abort if it sees
+an option it cannot process, and should just skip over those options and
+leave them in `@ARGV`.
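+
+For instance, a plugin handling a hypothetical `--foo` option might use
+[[!cpan Getopt::Long]] in pass-through mode, so that options it does not
+recognize remain in `@ARGV` for other plugins:
+
+    sub getopt {
+        eval q{use Getopt::Long};
+        error($@) if $@;
+        Getopt::Long::Configure('pass_through');
+        # Record the option's value in %config for later use.
+        GetOptions("foo=s" => \$config{foo});
+    }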
+
+### checkconfig
+
+ hook(type => "checkconfig", id => "foo", call => \&checkconfig);
+
+This is useful if the plugin needs to check for or modify ikiwiki's
+configuration. It's called early in the startup process. `%config`
+is populated at this point, but other state has not yet been loaded.
+The function is passed no values. It's ok for the function to call
+`error()` if something isn't configured right.
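+
+A sketch, using a hypothetical `foo_url` setting:
+
+    sub checkconfig {
+        # Refuse to run until the setup file provides foo_url.
+        if (! defined $config{foo_url}) {
+            error("must specify foo_url in the setup file");
+        }
+    }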
+
+### <a name="refresh">refresh</a>
+
+ hook(type => "refresh", id => "foo", call => \&refresh);
+
+This hook is called just before ikiwiki scans the wiki for changed files.
+It's useful for plugins that need to create or modify a source page. The
+function is passed no values.
+
+### <a name="needsbuild">needsbuild</a>
+
+ hook(type => "needsbuild", id => "foo", call => \&needsbuild);
+
+This allows a plugin to observe or even manipulate the list of files that
+need to be built when the wiki is refreshed.
+
+As its first parameter, the function is passed a reference to an array of
+files that will be built. It should return an array reference that is a
+modified version of its input. It can add or remove files from it.
+
+The second parameter passed to the function is a reference to an array of
+files that have been deleted.
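+
+A sketch that forces a hypothetical `map.mdwn` page to be rebuilt whenever
+anything else in the wiki changes:
+
+    sub needsbuild {
+        my $needsbuild=shift; # array ref of files to be built
+        my $deleted=shift;    # array ref of files that were deleted
+        if ((@$needsbuild || @$deleted) &&
+            ! grep { $_ eq "map.mdwn" } @$needsbuild) {
+            push @$needsbuild, "map.mdwn";
+        }
+        return $needsbuild;
+    }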
+
+### <a name="scan">scan</a>
+
+ hook(type => "scan", id => "foo", call => \&scan);
+
+This hook is called early in the process of building the wiki, and is used
+as a first pass scan of the page, to collect metadata about the page. It's
+mostly used to scan the page for [[WikiLinks|ikiwiki/WikiLink]], and add
+them to `%links`. Present in IkiWiki 2.40 and later.
+
+The function is passed named parameters "page" and "content". Its return
+value is ignored.
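+
+A sketch that records a link for each line using a hypothetical `SeeAlso:`
+convention in the page source:
+
+    sub scan {
+        my %params=@_;
+        while ($params{content} =~ /^SeeAlso: (\S+)$/mg) {
+            # linkpage converts the link text to a page name.
+            add_link($params{page}, linkpage($1));
+        }
+    }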
+
+### <a name="filter">filter</a>
+
+ hook(type => "filter", id => "foo", call => \&filter);
+
+Runs on the full raw source of a page, before anything else touches it, and
+can make arbitrary changes. The function is passed named parameters "page",
+"destpage", and "content". It should return the filtered content.
+
+### <a name="preprocess">preprocess</a>
+
+Adding a preprocessor [[ikiwiki/directive]] is probably the most common use
+of a plugin.
+
+ hook(type => "preprocess", id => "foo", call => \&preprocess);
+
+Replace "foo" with the command name that will be used for the preprocessor
+directive.
+
+Each time the directive is processed, the referenced function (`preprocess`
+in the example above) is called. Whatever the function returns goes onto
+the page in place of the directive. Or, if the function aborts using
+`error()`, the directive will be replaced with the error message.
+
+The function is passed named parameters. First come the parameters set
+in the preprocessor directive. These are passed in the same order as
+they're in the directive, and if the preprocessor directive contains a bare
+parameter (example: `\[[!foo param]]`), that parameter will be passed with
+an empty value.
+
+After the parameters from the preprocessor directive some additional ones
+are passed: A "page" parameter gives the name of the page that embedded the
+preprocessor directive, while a "destpage" parameter gives the name of the
+page the content is going to (different for inlined pages), and a "preview"
+parameter is set to a true value if the page is being previewed.
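+
+A sketch of a directive handler that checks for a hypothetical bare `quick`
+parameter:
+
+    sub preprocess {
+        my %params=@_;
+        # \[[!foo quick]] passes quick with an empty value.
+        my $quick=exists $params{quick};
+        return "foo was used on ".$params{page}.
+            ($quick ? " (quick mode)" : "");
+    }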
+
+If `hook` is passed an optional "scan" parameter, set to a true value, this
+makes the hook be called during the preliminary scan that ikiwiki makes of
+updated pages, before beginning to render pages. This should be done if the
+hook modifies data in `%links` (typically by calling `add_link`). Note that
+doing so will make the hook be run twice per page build, so avoid doing it
+for expensive hooks. (As an optimisation, if your preprocessor hook is
+called in a void context, you can assume it's being run in scan mode, and
+avoid doing expensive things at that point.)
+
+Note that if the [[htmlscrubber]] is enabled, html in
+preprocessor [[ikiwiki/directive]] output is sanitised, which may limit what
+your plugin can do. Also, the rest of the page content is not in html
+format at preprocessor time. Text output by a preprocessor directive will
+be linkified and passed through markdown (or whatever engine is used to
+htmlize the page) along with the rest of the page.
+
+### <a name="linkify">linkify</a>
+
+ hook(type => "linkify", id => "foo", call => \&linkify);
+
+This hook is called to convert [[WikiLinks|ikiwiki/WikiLink]] on the page into html
+links. The function is passed named parameters "page", "destpage", and
+"content". It should return the linkified content. Present in IkiWiki 2.40
+and later.
+
+Plugins that implement linkify must also implement a scan hook, that scans
+for the links on the page and adds them to `%links` (typically by calling
+`add_link`).
+
+### <a name="htmlize">htmlize</a>
+
+ hook(type => "htmlize", id => "ext", call => \&htmlize);
+
+Runs on the source of a page and turns it into html. The id parameter
+specifies the filename extension that a file must have to be htmlized using
+this plugin. This is how you can add support for new and exciting markup
+languages to ikiwiki.
+
+The function is passed named parameters: "page" and "content" and should
+return the htmlized content.
+
+If `hook` is passed an optional "keepextension" parameter, set to a true
+value, then the extension will not be stripped from the source filename when
+generating the page.
+
+If `hook` is passed an optional "noextension" parameter, set to a true
+value, then the id parameter specifies not a filename extension, but
+a whole filename that can be htmlized. This is useful for files
+like `Makefile` that have no extension.
+
+If `hook` is passed an optional "longname" parameter, this value is used
+when prompting a user to choose a page type on the edit page form.
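+
+As a sketch, a plugin adding support for files with a hypothetical `.up`
+extension, rendered as uppercased preformatted text:
+
+    sub import {
+        hook(type => "htmlize", id => "up", call => \&htmlize,
+            longname => "Uppercase text");
+    }
+
+    sub htmlize {
+        my %params=@_;
+        my $content=uc $params{content};
+        # Escape html metacharacters, since the input is plain text.
+        $content =~ s/&/&amp;/g;
+        $content =~ s/</&lt;/g;
+        return "<pre>$content</pre>";
+    }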
+
+### <a name="indexhtml">indexhtml</a>
+
+ hook(type => "indexhtml", id => "foo", call => \&indexhtml);
+
+This hook is called once the page has been converted to html (but before
+the generated html is put in a template). The most common use is to
+update search indexes. Added in ikiwiki 2.54.
+
+The function is passed named parameters "page", "destpage", and "content".
+Its return value is ignored.
+
+### <a name="pagetemplate">pagetemplate</a>
+
+ hook(type => "pagetemplate", id => "foo", call => \&pagetemplate);
+
+[[Templates]] are filled out for many different things in
+ikiwiki, like generating a page, or part of a blog page, or an rss feed, or
+a cgi. This hook allows modifying the variables available on those
+templates. The function is passed named parameters. The "page" and
+"destpage" parameters are the same as for a preprocess hook. The "template"
+parameter is a [[!cpan HTML::Template]] object that is the template that
+will be used to generate the page. The function can manipulate that
+template object.
+
+The most common thing to do is probably to call `$template->param()` to add
+a new custom parameter to the template.
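+
+For example, a hook filling in a hypothetical `MOTTO` template variable
+(from an equally hypothetical `motto` config setting), only when the
+template actually uses it:
+
+    sub pagetemplate {
+        my %params=@_;
+        my $template=$params{template};
+        if ($template->query(name => "motto")) {
+            $template->param(motto => $config{motto});
+        }
+    }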
+
+### templatefile
+
+ hook(type => "templatefile", id => "foo", call => \&templatefile);
+
+This hook allows plugins to change the [[template|templates]] that is
+used for a page in the wiki. The hook is passed a "page" parameter, and
+should return the name of the template file to use (relative to the
+template directory), or undef if it doesn't want to change the default
+("page.tmpl").
+
+### pageactions
+
+ hook(type => "pageactions", id => "foo", call => \&pageactions);
+
+This hook allows plugins to add arbitrary actions to the action bar on a
+page (next to Edit, RecentChanges, etc). The hook is passed a "page"
+parameter, and can return a list of html fragments to add to the action
+bar.
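+
+A sketch that adds a static link to a hypothetical help page to every
+page's action bar:
+
+    sub pageactions {
+        my %params=@_;
+        # Each returned html fragment becomes one action.
+        return ('<a href="/help/">Help</a>');
+    }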
+
+### <a name="sanitize">sanitize</a>
+
+ hook(type => "sanitize", id => "foo", call => \&sanitize);
+
+Use this to implement html sanitization or anything else that needs to
+modify the body of a page after it has been fully converted to html.
+
+The function is passed named parameters: "page", "destpage", and "content",
+and should return the sanitized content.
+
+### <a name="format">format</a>
+
+ hook(type => "format", id => "foo", call => \&format);
+
+The difference between format and sanitize is that sanitize only acts on
+the page body, while format can modify the entire html page including the
+header and footer inserted by ikiwiki, the html document type, etc. (It
+should not rely on always being passed the entire page, as it won't be
+when the page is being previewed.)
+
+The function is passed named parameters: "page" and "content", and
+should return the formatted content.
+
+### build_affected
+
+ hook(type => "build_affected", id => "foo", call => \&build_affected);
+
+This hook is called after the directly changed pages have been built,
+and can cause extra pages to be built. If links and backlinks were provided
+by a plugin, this would be where that plugin would rebuild pages whose
+backlinks have changed, for instance. The [[trail]] plugin uses this hook
+to rebuild pages whose next or previous page has changed.
+
+The function should currently ignore its parameters. It returns a list with
+an even number of items (a hash in list context), where the first item of
+each pair is a page name to be rebuilt (if it was not already rebuilt), and
+the second is a log message resembling
+`building plugins/write because the phase of the moon has changed`.
+
+### <a name="delete">delete</a>
+
+ hook(type => "delete", id => "foo", call => \&delete);
+
+After a page or pages are removed from the wiki, the referenced function
+is called, and passed the names of the source files that were removed.
+
+### rendered
+
+ hook(type => "rendered", id => "foo", call => \&rendered);
+
+After ikiwiki renders a change or addition (but not deletion) to the
+wiki, the referenced function is called, and passed the names of the
+source files that were rendered.
+
+(This hook used to be called "change", but that was not accurate.
+For now, plugins using the old hook name will still work.)
+
+### <a name="changes">changes</a>
+
+ hook(type => "changes", id => "foo", call => \&changes);
+
+After ikiwiki renders changes to the wiki, the referenced function is
+called, and passed the names of the source files that were added, modified,
+or deleted.
+
+### cgi
+
+ hook(type => "cgi", id => "foo", call => \&cgi);
+
+Use this to hook into ikiwiki's cgi script. Each registered cgi hook is
+called in turn, and passed a CGI object. The hook should examine the
+parameters, and if it will handle this CGI request, output a page
+(including the http headers) and terminate the program.
+
+Note that cgi hooks are called as early as possible, before any ikiwiki
+state is loaded, and with no session information.
+
+### <a name="auth">auth</a>
+
+ hook(type => "auth", id => "foo", call => \&auth);
+
+This hook can be used to implement an authentication method. When a user
+needs to be authenticated, each registered auth hook is called in turn, and
+passed a CGI object and a session object.
+
+If the hook is able to authenticate the user, it should set the session
+object's "name" parameter to the authenticated user's name. Note that
+if the name is set to the name of a user who is not registered,
+a basic registration of the user will be automatically performed.
+
+### <a name="sessioncgi">sessioncgi</a>
+
+ hook(type => "sessioncgi", id => "foo", call => \&sessioncgi);
+
+Unlike the cgi hook, which is run as soon as possible, the sessioncgi hook
+is only run once a session object is available. It is passed both a CGI
+object and a session object. To check if the user is in fact signed in, you
+can check if the session object has a "name" parameter set.
+
+### <a name="canedit">canedit</a>
+
+ hook(type => "canedit", id => "foo", call => \&canedit);
+
+This hook can be used to implement arbitrary access methods to control when
+a page can be edited using the web interface (commits from revision control
+bypass it). When a page is edited, each registered canedit hook is called
+in turn, and passed the page name, a CGI object, and a session object.
+
+If the hook has no opinion about whether the edit can proceed, return
+`undef`, and the next plugin will be asked to decide. If edit can proceed,
+the hook should return "". If the edit is not allowed by this hook, the
+hook should return an error message for the user to see, or a function
+that can be run to log the user in or perform other actions necessary for
+them to be able to edit the page.
+
+This hook should avoid directly redirecting the user to a signin page,
+since it's sometimes used to test to see which pages in a set of pages a
+user can edit.
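+
+A sketch that restricts editing of pages under a hypothetical `locked/`
+directory to users who are signed in:
+
+    sub canedit {
+        my ($page, $cgi, $session)=@_;
+        if ($page =~ m{^locked/} &&
+            ! defined $session->param("name")) {
+            return "you must be signed in to edit $page";
+        }
+        return undef; # no opinion about other pages
+    }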
+
+### canremove
+
+ hook(type => "canremove", id => "foo", call => \&canremove);
+
+This hook can be used to implement arbitrary access methods to control
+when a page can be removed using the web interface (commits from
+revision control bypass it). It works exactly like the `canedit` hook,
+but is passed the named parameters `cgi` (a CGI object), `session`
+(a session object) and `page` (the page subject to deletion).
+
+### canrename
+
+ hook(type => "canrename", id => "foo", call => \&canrename);
+
+This hook can be used to implement arbitrary access methods to control when
+a page can be renamed using the web interface (commits from revision control
+bypass it). It works exactly like the `canedit` hook,
+but is passed the named parameters `cgi` (a CGI object), `session` (a
+session object), `src`, `srcfile`, `dest` and `destfile`.
+
+### <a name="checkcontent">checkcontent</a>
+
+ hook(type => "checkcontent", id => "foo", call => \&checkcontent);
+
+This hook is called to check the content a user has entered on a page,
+before it is saved, and decide if it should be allowed.
+
+It is passed named parameters: `content`, `page`, `cgi`, and `session`. If
+the content the user has entered is a comment, it may also be passed some
+additional parameters: `author`, `url`, and `subject`. The `subject`
+parameter may also be filled with the user's comment about the change.
+
+Note: When the user edits an existing wiki page, this hook is also
+passed a `diff` named parameter, which will include only the lines
+that they added to the page, or modified.
+
+The hook should return `undef` on success. If the content is disallowed, it
+should return a message stating what the problem is, or a function
+that can be run to perform whatever action is necessary to allow the user
+to post the content.
+
+### <a name="editcontent">editcontent</a>
+
+ hook(type => "editcontent", id => "foo", call => \&editcontent);
+
+This hook is called when a page is saved (or previewed) using the web
+interface. It is passed named parameters: `content`, `page`, `cgi`, and
+`session`. These are, respectively, the new page content as entered by the
+user, the page name, a `CGI` object, and the user's `CGI::Session`.
+
+It can modify the content as desired, and should return the content.
+
+### <a name="formbuilder">formbuilder</a>
+
+ hook(type => "formbuilder_setup", id => "foo", call => \&formbuilder_setup);
+ hook(type => "formbuilder", id => "foo", call => \&formbuilder);
+
+These hooks allow tapping into the parts of ikiwiki that use [[!cpan
+CGI::FormBuilder]] to generate web forms. These hooks are passed named
+parameters: `cgi`, `session`, `form`, and `buttons`. These are, respectively,
+the `CGI` object, the user's `CGI::Session`, a `CGI::FormBuilder`, and a
+reference to an array of names of buttons to go on the form.
+
+Each time a form is set up, the `formbuilder_setup` hook is called.
+Typically the `formbuilder_setup` hook will check the form's title, and if
+it's a form that it needs to modify, will call various methods to
+add/remove/change fields, tweak the validation code for the fields, etc. It
+will not validate or display the form.
+
+Just before a form is displayed to the user, the `formbuilder` hook is
+called. It can be used to validate the form, but should not display it.
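+
+A sketch of a `formbuilder_setup` hook adding a hypothetical checkbox to
+the page editing form:
+
+    sub formbuilder_setup {
+        my %params=@_;
+        my $form=$params{form};
+        # Only touch the page editing form, identified by its title.
+        if ($form->title eq "editpage") {
+            $form->field(name => "minoredit", type => "checkbox",
+                label => "this is a minor edit");
+        }
+    }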
+
+### savestate
+
+ hook(type => "savestate", id => "foo", call => \&savestate);
+
+This hook is called whenever ikiwiki normally saves its state, just before
+the state is saved. The function can save other state, modify values before
+they're saved, etc.
+
+### renamepage
+
+ hook(type => "renamepage", id => "foo", call => \&renamepage);
+
+This hook is called by the [[plugins/rename]] plugin when it renames
+something, once per page linking to the renamed page's old location.
+The hook is passed named parameters: `page`, `oldpage`, `newpage`, and
+`content`, and should try to modify the content of `page` to reflect
+the name change. For example, by converting links to point to the
+new page.
+
+### rename
+
+ hook(type => "rename", id => "foo", call => \&rename);
+
+When a page or set of pages is renamed, the referenced function is
+called for every page, and is passed named parameters:
+
+* `torename`: a reference to a hash with keys: `src`, `srcfile`,
+ `dest`, `destfile`, `required`.
+* `cgi`: a CGI object
+* `session`: a session object.
+
+Such a hook function returns any additional rename hashes it wants to
+add. This hook is applied recursively to returned additional rename
+hashes, so that it handles the case where two plugins use the hook:
+plugin A would see when plugin B adds a new file to be renamed.
+
+### getsetup
+
+ hook(type => "getsetup", id => "foo", call => \&getsetup);
+
+This hook is not called during normal operation, but only when setting up
+the wiki, or generating a setup file. Plugins can use this hook to add
+configuration options.
+
+The hook is passed no parameters. It returns data about the configuration
+options added by the plugin. It can also check if the plugin is usable, and
+die if not, which will cause the plugin to not be offered in the configuration
+interface.
+
+The data returned is a list of `%config` options, followed by a hash
+describing the option. There can also be an item named "plugin", which
+describes the plugin as a whole. For example:
+
+ return
+ plugin => {
+ description => "description of this plugin",
+ safe => 1,
+ rebuild => 1,
+ section => "misc",
+ },
+ option_foo => {
+ type => "boolean",
+ description => "enable foo?",
+ advanced => 1,
+ safe => 1,
+ rebuild => 1,
+ },
+ option_bar => {
+ type => "string",
+ example => "hello",
+ description => "option bar",
+ safe => 1,
+ rebuild => 0,
+ },
+
+* `type` can be "boolean", "string", "integer", "pagespec",
+ or "internal" (used for values that are not user-visible). The type is
+ the type of the leaf values; the `%config` option may be an array or
+ hash of these.
+* `example` can be set to an example value.
+* `description` is a short description of the option.
+* `link` is a link to further information about the option. This can either
+  be a [[ikiwiki/WikiLink]], or a URL.
+* `htmldescription` is displayed instead of the description by websetup.
+* `advanced` can be set to true if the option is more suitable for advanced
+ users.
+* `safe` should be false if the option should not be displayed in unsafe
+ configuration methods, such as the web interface. Anything that specifies
+ a command to run, a path on disk, or a regexp should be marked as unsafe.
+ If a plugin is marked as unsafe, that prevents it from being
+ enabled/disabled.
+* `rebuild` should be true if changing the option (or enabling/disabling
+ the plugin) will require a wiki rebuild, false if no rebuild is needed,
+ and undef if a rebuild could be needed in some circumstances, but is not
+ strictly required.
+* `section` can optionally specify which section in the config file
+ the plugin fits in. The convention is to name the sections the
+ same as the tags used for [[plugins]] on this wiki.
+
+### genwrapper
+
+ hook(type => "genwrapper", id => "foo", call => \&genwrapper);
+
+This hook is used to inject C code (which it returns) into the `main`
+function of the ikiwiki wrapper when it is being generated.
+
+The code runs before anything else -- in particular it runs before
+the suid wrapper has sanitized its environment.
+
+### disable
+
+ hook(type => "disable", id => "foo", call => \&disable);
+
+This hook is only run when a previously enabled plugin gets disabled
+during ikiwiki setup. Plugins can use this to perform cleanups.
+
+## Exported variables
+
+Several variables are exported to your plugin when you `use IkiWiki;`
+
+### `%config`
+
+A plugin can access the wiki's configuration via the `%config`
+hash. The best way to understand the contents of the hash is to look at
+your ikiwiki setup file, which sets the hash content to configure the wiki.
+
+### `%pagestate`
+
+The `%pagestate` hash can be used by plugins to save state that they will need
+next time ikiwiki is run. The hash holds per-page state, so to set a value,
+use `$pagestate{$page}{$id}{$key}=$value`, and to retrieve the value,
+use `$pagestate{$page}{$id}{$key}`.
+
+The `$value` can be anything that perl's Storable module is capable of
+serializing. `$key` can be any string you like, but `$id` must be the same
+as the "id" parameter passed to `hook()` when registering the plugin. This
+is so ikiwiki can know when to delete pagestate for plugins that are no
+longer used.
+
+When pages are deleted, ikiwiki automatically deletes their pagestate too.
+
+Note that page state does not persist across wiki rebuilds, only across
+wiki updates.
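+
+For example, a hypothetical plugin registered with `id => "foo"` could
+cache a per-page word count like this:
+
+    # In a scan hook, while the page is being built:
+    sub scan {
+        my %params=@_;
+        my @words=split ' ', $params{content};
+        $pagestate{$params{page}}{foo}{wordcount}=scalar @words;
+    }
+
+    # In a later run of ikiwiki, retrieve the saved value:
+    my $count=$pagestate{$page}{foo}{wordcount};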
+
+### `%wikistate`
+
+The `%wikistate` hash can be used by a plugin to store persistent state
+that is not bound to any one page. To set a value, use
+`$wikistate{$id}{$key}=$value`, where `$value` is anything Storable can
+serialize, `$key` is any string you like, and `$id` must be the same as the
+"id" parameter passed to `hook()` when registering the plugin, so that the
+state can be dropped if the plugin is no longer used.
+
+### `%links`
+
+The `%links` hash can be used to look up the names of each page that
+a page links to. The name of the page is the key; the value is an array
+reference. Do not modify this hash directly; call `add_link()`.
+
+ $links{"foo"} = ["bar", "baz"];
+
+### `%typedlinks`
+
+The `%typedlinks` hash records links of specific types. Do not modify this
+hash directly; call `add_link()`. The keys are page names, and the values
+are hash references. In each page's hash reference, the keys are link types
+defined by plugins, and the values are hash references with link targets
+as keys, and 1 as a dummy value, something like this:
+
+ $typedlinks{"foo"} = {
+ tag => { short_word => 1, metasyntactic_variable => 1 },
+ next_page => { bar => 1 },
+ };
+
+Ordinary [[WikiLinks|ikiwiki/WikiLink]] appear in `%links`, but not in
+`%typedlinks`.
+
+### `%pagesources`
+
+The `%pagesources` hash can be used to look up the source filename
+of a page. So the key is the page name, and the value is the source
+filename. Do not modify this hash.
+
+ $pagesources{"foo"} = "foo.mdwn";
+
+### `%destsources`
+
+The `%destsources` hash records the name of the source file used to
+create each destination file. The key is the output filename (ie,
+"foo/index.html"), and the value is the source filename that it was built
+from (eg, "foo.mdwn"). Note that a single source file may create multiple
+destination files. Do not modify this hash directly; call `will_render()`.
+
+ $destsources{"foo/index.html"} = "foo.mdwn";
+
+## Library functions
+
+Several functions are exported to your plugin when you `use IkiWiki;`
+
+### `hook(@)`
+
+Hook into ikiwiki's processing. See the discussion of hooks above.
+
+Note that in addition to the named parameters described above, a parameter
+named `no_override` is supported. If it's set to a true value, then this hook
+will not override any existing hook with the same id. This is useful if
+the id can be controlled by the user.
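+
+For example, a plugin could register a preprocess hook without clobbering
+one that the user has already arranged to handle (the id "foo" here is
+hypothetical):
+
+    hook(type => "preprocess", id => "foo", call => \&preprocess,
+        no_override => 1);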
+
+### `debug($)`
+
+Logs a debugging message. These are suppressed unless verbose mode is turned
+on.
+
+### `error($;$)`
+
+Aborts with an error message. If the second parameter is passed, it is a
+function that is called after the error message is printed, to do any final
+cleanup.
+
+If called inside a preprocess hook, error() does not abort the entire
+wiki build, but instead replaces the preprocessor [[ikiwiki/directive]] with
+a version containing the error message.
+
+In other hooks, error() is a fatal error, so use with care. Try to avoid
+dying on bad input when building a page, as that will halt
+the entire wiki build and make the wiki unusable.
+
+### `template($;@)`
+
+Creates and returns a [[!cpan HTML::Template]] object. (In a list context,
+returns the parameters needed to construct the object.)
+
+The first parameter is the name of the template file. The optional remaining
+parameters are passed to `HTML::Template->new`.
+
+Normally, the template file is first looked for in the templates/ subdirectory
+of the srcdir. Failing that, it is looked for in the templatedir.
+
+Wiki pages can be used as templates. This should be done only for templates
+which it is safe to let wiki users edit. Enable it by passing a filename
+with no ".tmpl" extension. Template pages are normally looked for in
+the templates/ directory. If the page name starts with "/", a page
+elsewhere in the wiki can be used.
+
+If the template is not found, or contains a syntax error, an error is thrown.
+
+### `template_depends($$;@)`
+
+Use this instead of `template()` if the content of a template is being
+included into a page. This causes the page to depend on the template,
+so it will be updated if the template is modified.
+
+Like `template()`, except the second parameter is the page.
+
+### `htmlpage($)`
+
+Passed a page name, returns the base name that will be used for the html
+page created from it. (Ie, it appends ".html".)
+
+Use this when constructing the filename of a html file. Use `urlto` when
+generating a link to a page.
+
+### `pagespec_match_list($$;@)`
+
+Passed a page name, and [[ikiwiki/PageSpec]], returns a list of pages
+in the wiki that match the [[ikiwiki/PageSpec]].
+
+The page will automatically be made to depend on the specified
+[[ikiwiki/PageSpec]], so `add_depends` does not need to be called. This
+is often significantly more efficient than calling `add_depends` and
+`pagespec_match` in a loop. You should use this anytime a plugin
+needs to match a set of pages and do something based on that list.
+
+Unlike `pagespec_match`, this may throw an error if there is an error in
+the pagespec.
+
+Additional named parameters can be specified:
+
+* `deptype` optionally specifies the type of dependency to add. Use the
+ `deptype` function to generate a dependency type.
+* `filter` is a reference to a function that is called and passed a page,
+  and returns true if the page should be filtered out of the list.
+* `sort` specifies a sort order for the list. See
+  [[ikiwiki/PageSpec/sorting]] for the available sort methods. Note that
+ if a sort method is specified that depends on the
+ page content (such as 'meta(foo)'), the deptype needs to be set to
+ a content dependency.
+* `reverse` if true, sorts in reverse.
+* `num` if nonzero, specifies the maximum number of matching pages that
+ will be returned.
+* `list` makes it only match among the specified list of pages.
+  Default is to match among all pages in the wiki.
+
+Any other named parameters are passed on to `pagespec_match`, to further
+limit the match.
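+
+As a sketch, here is how a plugin might collect the ten newest pages
+matching a pagespec, depending only on page presence (`$page` and
+`$pagespec` are assumed to be in scope):
+
+    my @matches=pagespec_match_list($page, $pagespec,
+        sort => "age", num => 10,
+        deptype => deptype("presence"));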
+
+### `add_depends($$;$)`
+
+Makes the specified page depend on the specified [[ikiwiki/PageSpec]].
+
+By default, dependencies are full content dependencies, meaning that the
+page will be updated whenever anything matching the PageSpec is modified.
+This can be overridden by passing a `deptype` value as the third parameter.
+
+### `pagespec_match($$;@)`
+
+Passed a page name, and [[ikiwiki/PageSpec]], returns a true value if the
+[[ikiwiki/PageSpec]] matches the page.
+
+Note that the return value is overloaded. If stringified, it will be a
+message indicating why the PageSpec succeeded, or failed, to match the
+page.
+
+Additional named parameters can be passed, to further limit the match.
+The most often used is "location", which specifies the location the
+PageSpec should match against. If not passed, relative PageSpecs will match
+relative to the top of the wiki.
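+
+For example, the overloaded return value can be used both as a boolean and
+as a diagnostic message; a minimal sketch:
+
+    my $match=pagespec_match("bugs/foo", "link(done)");
+    if (! $match) {
+        debug("bugs/foo did not match: $match");
+    }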
+
+### `deptype(@)`
+
+Use this function to generate ikiwiki's internal representation of a
+dependency type from one or more of these keywords:
+
+* `content` is the default. Any change to the content
+ of a page triggers the dependency.
+* `presence` is only triggered by a change to the presence
+ of a page.
+* `links` is only triggered by a change to the links of a page.
+ This includes when a link is added, removed, or changes what
+ it points to due to other changes. It does not include the
+ addition or removal of a duplicate link.
+
+If multiple types are specified, they are combined.
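+
+For example, a plugin that only cares whether pages exist and what they
+link to could combine the two weaker types (a sketch; the pagespec is
+hypothetical):
+
+    add_depends($page, "posts/*", deptype("presence", "links"));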
+
+### `bestlink($$)`
+
+Given a page and the text of a link on the page, determine which
+existing page that link best points to. Prefers pages under a
+subdirectory with the same name as the source page, failing that
+goes down the directory tree to the base looking for matching
+pages, as described in [[ikiwiki/SubPage/LinkingRules]].
+
+### `htmllink($$$;@)`
+
+Many plugins need to generate html links and add them to a page. This is
+done by using the `htmllink` function. The usual way to call
+`htmllink` is:
+
+ htmllink($page, $page, $link)
+
+Why is `$page` repeated? Because if a page is inlined inside another, and a
+link is placed on it, the right way to make that link is actually:
+
+ htmllink($page, $destpage, $link)
+
+Here `$destpage` is the inlining page. A `destpage` parameter is passed to
+some of the hook functions above; the ones that are not passed it are not used
+during inlining and don't need to worry about this issue.
+
+After the three required parameters, named parameters can be used to
+control some options. These are:
+
+* noimageinline - set to true to avoid turning links into inline html images
+* forcesubpage - set to force a link to a subpage
+* linktext - set to force the link text to something
+* anchor - set to make the link include an anchor
+* rel - set to add a rel attribute to the link
+* class - set to add a css class to the link
+* title - set to add a title attribute to the link
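+
+For example, to generate a link to the toplevel index with custom link text
+and a css class (the values shown are hypothetical):
+
+    htmllink($page, $destpage, "/index",
+        linktext => "home", class => "navlink", noimageinline => 1);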
+
+### `readfile($;$)`
+
+Given a filename, reads and returns the entire file.
+
+The optional second parameter, if set to a true value, makes the file be read
+in binary mode.
+
+A failure to read the file will result in it dying with an error.
+
+### `writefile($$$;$$)`
+
+Given a filename, a directory to put it in, and the file's content,
+writes a file.
+
+The optional fourth parameter, if set to a true value, makes the file be
+written in binary mode.
+
+The optional fifth parameter can be used to pass a function reference that
+will be called to handle writing to the file. The function will be called
+and passed a file descriptor it should write to, and an error recovery
+function it should call if the writing fails. (You will not normally need to
+use this interface.)
+
+A failure to write the file will result in it dying with an error.
+
+If the destination directory doesn't exist, it will first be created.
+
+The filename and directory are separate parameters because of
+some security checks done to avoid symlink attacks. Before writing a file,
+it checks to make sure there's not a symlink with its name, to avoid
+following the symlink. If the filename parameter includes a subdirectory
+to put the file in, it also checks if that subdirectory is a symlink, etc.
+The directory parameter, however, is not checked for symlinks. So,
+generally the directory parameter is a trusted toplevel directory like
+the srcdir or destdir, and any subdirectories of this are included in the
+filename parameter.
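+
+For example, to write a file into a subdirectory of the destdir (a sketch;
+`$content` is assumed to hold the text to write):
+
+    writefile("foo/data.txt", $config{destdir}, $content);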
+
+### `will_render($$)`
+
+Given a page name and a destination file name (not including the base
+destination directory), register that the page will result in that file
+being rendered.
+
+It's important to call this before writing to any file in the destination
+directory, and it's important to call it consistently every time, even if
+the file isn't really written this time -- unless you delete any old
+version of the file. In particular, in preview mode, this should still be
+called even if the file isn't going to be written to during the preview.
+
+Ikiwiki uses this information to automatically clean up rendered files when
+the page that rendered them goes away or is changed to no longer render
+them. will_render also does a few important security checks.
+
+### `pagetype($)`
+
+Given the name of a source file, returns the type of page it is, if it's
+a type that ikiwiki knows how to htmlize. Otherwise, returns undef.
+
+### `pagename($)`
+
+Given the name of a source file, returns the name of the wiki page
+that corresponds to that file.
+
+### `pagetitle($)`
+
+Given the name of a wiki page, returns a version suitable to be displayed as
+the page's title. This is accomplished by de-escaping escaped characters in
+the page name. "_" is replaced with a space, and '__NN__' is replaced by
+the character with code NN.
+
+### `titlepage($)`
+
+This performs the inverse of `pagetitle`, ie, it converts a page title into
+a wiki page name.
+
+### `linkpage($)`
+
+This converts text that could have been entered by the user as a
+[[ikiwiki/WikiLink]] into a wiki page name.
+
+### `srcfile($;$)`
+
+Given the name of a source file in the wiki, searches for the file in
+the source directory and the underlay directories (most recently added
+underlays first), and returns the full path to the first file found.
+
+Normally srcfile will fail with an error message if the source file cannot
+be found. The second parameter can be set to a true value to make it return
+undef instead.
+
+### `add_underlay($)`
+
+Adds a directory to the set of underlay directories that ikiwiki will
+search for files.
+
+If the directory name is not absolute, ikiwiki will assume it is in
+the parent directory of the configured underlaydir.
+
+### `displaytime($;$$)`
+
+Given a time, formats it for display.
+
+The optional second parameter is a strftime format to use to format the
+time.
+
+If the third parameter is true, this is the publication time of a page.
+(Ie, set the html5 pubdate attribute.)
+
+### `gettext`
+
+This is the standard gettext function, although slightly optimised.
+
+### `ngettext`
+
+This is the standard ngettext function, although slightly optimised.
+
+### `urlto($;$$)`
+
+Construct a relative url to the first parameter from the page named by the
+second. The first parameter can be either a page name, or some other
+destination file, as registered by `will_render`.
+
+Provide a second parameter whenever possible, since this leads to better
+behaviour for the [[plugins/po]] plugin and `file:///` URLs.
+
+If the second parameter is not specified (or `undef`), the URL will be
+valid from any page on the wiki, or from the CGI; if possible it'll
+be a path starting with `/`, but an absolute URL will be used if
+the wiki and the CGI are on different domains.
+
+If the third parameter is passed and is true, the url will be a fully
+absolute url. This is useful when generating an url to publish elsewhere.
+
+### `newpagefile($$)`
+
+This can be called when creating a new page, to determine what filename
+to save the page to. It's passed a page name, and its type, and returns
+the name of the file to create, relative to the srcdir.
+
+### `targetpage($$;$)`
+
+Passed a page and an extension, returns the filename that page will be
+rendered to.
+
+Optionally, a third parameter can be passed, to specify the preferred
+filename of the page. For example, `targetpage("foo", "rss", "feed")`
+will yield something like `foo/feed.rss`.
+
+### `add_link($$;$)`
+
+This adds a link to `%links`, ensuring that duplicate links are not
+added. Pass it the page that contains the link, and the link text.
+
+An optional third parameter sets the link type. If not specified,
+it is an ordinary [[ikiwiki/WikiLink]].
+
+### `add_autofile($$$)`
+
+Sometimes you may want to add a file to the `srcdir` as a result of content
+of other pages. For example, [[plugins/tag]] pages can be automatically
+created as needed. This function can be used to do that.
+
+The three parameters are the filename to create (relative to the `srcdir`),
+the name of the plugin, and a callback function. The callback will be
+called if it is appropriate to automatically add the file, and should then
+take care of creating it, and doing anything else it needs to (such as
+checking it into revision control). Note that the callback may not always
+be called. For example, if an automatically added file is deleted by the
+user, ikiwiki will avoid re-adding it again.
+
+This function needs to be called during the scan hook, or earlier in the
+build process, in order to add the file early enough for it to be built.
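+
+As a sketch, a plugin with the hypothetical id "myplugin" could arrange for
+a tag page to be created when it's needed (a real plugin would also want to
+check the new file into revision control):
+
+    add_autofile("tags/foo.mdwn", "myplugin", sub {
+        writefile("tags/foo.mdwn", $config{srcdir},
+            "\[[!meta title=\"foo\"]]");
+    });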
+
+## Miscellaneous
+
+### Internal use pages
+
+Sometimes it's useful to put pages in the wiki without the overhead of
+having them be rendered to individual html files. Such internal use pages
+are collected together to form the RecentChanges page, for example.
+
+To make an internal use page, register a filename extension that starts
+with "_". Internal use pages cannot be edited with the web interface,
+generally shouldn't contain [[WikiLinks|ikiwiki/WikiLink]] or preprocessor directives (use
+either on them with extreme caution), and are not matched by regular
+PageSpec glob patterns, but instead only by a special `internal()`
+[[ikiwiki/PageSpec]].
+
+### RCS plugins
+
+ikiwiki's support for [[revision_control_systems|rcs]] is also done via
+plugins. See [[RCS_details|rcs/details]] for some more info.
+
+RCS plugins must register a number of hooks. Each hook has type 'rcs',
+and the 'id' field is set to the name of the hook. For example:
+
+ hook(type => "rcs", id => "rcs_update", call => \&rcs_update);
+ hook(type => "rcs", id => "rcs_prepedit", call => \&rcs_prepedit);
+
+#### `rcs_update()`
+
+Updates the working directory with any remote changes.
+
+#### `rcs_prepedit($)`
+
+Is passed a file to prepare to edit. It can generate and return an arbitrary
+token that will be passed into `rcs_commit` when committing. For example,
+it might return the current revision ID of the file, and use that
+information later when merging changes.
+
+#### `rcs_commit(@)`
+
+Passed named parameters: `file`, `message`, `token` (from `rcs_prepedit`),
+and `session` (optional).
+
+Should try to commit the file. Returns `undef` on *success* and a version
+of the page with the rcs's conflict markers on failure.
+
+#### `rcs_commit_staged(@)`
+
+Passed named parameters: `message`, and `session` (optional).
+
+Should commit all staged changes. Returns undef on success, and an
+error message on failure.
+
+Changes can be staged by calls to `rcs_add`, `rcs_remove`, and
+`rcs_rename`.
+
+#### `rcs_add($)`
+
+Adds the passed file to the archive. The filename is relative to the root
+of the srcdir.
+
+Note that this should not commit the new file, it should only
+prepare for it to be committed when `rcs_commit` (or `rcs_commit_staged`) is
+called. Note that the file may be in a new subdir that is not yet added
+to version control; the subdir can be added if so.
+
+#### `rcs_remove($)`
+
+Remove a file. The filename is relative to the root of the srcdir.
+
+Note that this should not commit the removal, it should only prepare for it
+to be committed when `rcs_commit` (or `rcs_commit_staged`) is called.
+
+#### `rcs_rename($$)`
+
+Rename a file. The filenames are relative to the root of the srcdir.
+
+Note that this should not commit the rename, it should only
+prepare it for when `rcs_commit` (or `rcs_commit_staged`) is called.
+The new filename may be in a new subdir, that is not yet added to
+version control. If so, the subdir will exist already, and should
+be added to revision control.
+
+#### `rcs_recentchanges($)`
+
+Examine the RCS history and generate a list of recent changes.
+The parameter is how many changes to return.
+
+The data structure returned for each change is:
+
+ {
+ rev => # the RCSs id for this commit
+ user => # user who made the change (may be an openid),
+ nickname => # short name for user (optional; not an openid),
+
+ committype => # either "web" or the name of the rcs,
+ when => # time when the change was made,
+ message => [
+ { line => "commit message line 1" },
+ { line => "commit message line 2" },
+ # etc,
+ ],
+ pages => [
+ {
+ page => # name of page changed,
+ diffurl => # optional url to a diff of changes
+ },
+ # repeat for each page changed in this commit,
+ ],
+ }
+
+#### `rcs_diff($;$)`
+
+The first parameter is the rev from `rcs_recentchanges`.
+The optional second parameter is how many lines to return (default: all).
+
+Should return a list of lines of the diff (including \n) in list
+context, and a string containing the whole diff in scalar context.
+
+#### `rcs_getctime($)`
+
+This is used to get the page creation time for a file from the RCS, by looking
+it up in the history.
+
+If the RCS cannot determine a ctime for the file, return 0.
+
+#### `rcs_getmtime($)`
+
+This is used to get the page modification time for a file from the RCS, by
+looking it up in the history.
+
+It's ok if this is not implemented; in that case it can simply throw an
+error.
+
+If the RCS cannot determine a mtime for the file, return 0.
+
+#### `rcs_receive()`
+
+This is called when ikiwiki is running as a pre-receive hook (or
+equivalent), and is testing if changes pushed into the RCS from an
+untrusted user should be accepted. This is optional, and doesn't make
+sense to implement for all RCSs.
+
+It should examine the incoming changes, and do any sanity
+checks that are appropriate for the RCS to limit changes to safe file adds,
+removes, and changes. If something bad is found, it should die, to abort
+the push. Otherwise, it should return a list of files that were changed,
+in the form:
+
+ {
+ file => # name of file that was changed
+ action => # either "add", "change", or "remove"
+ path => # temp file containing the new file content, only
+ # needed for "add"/"change", and only if the file
+ # is an attachment, not a page
+ }
+
+The list will then be checked to make sure that each change is one that
+is allowed to be made via the web interface.
+
+#### `rcs_preprevert($)`
+
+This is called by the revert web interface. It is passed a RCS-specific
+change ID, and should determine what the effects would be of reverting
+that change, and return the same data structure as `rcs_receive`.
+
+Like `rcs_receive`, it should do whatever sanity checks are appropriate
+for the RCS to limit changes to safe changes, and die if a change would
+be unsafe to revert.
+
+#### `rcs_revert($)`
+
+This is called by the revert web interface. It is passed a named
+parameter rev that is the RCS-specific change ID to revert.
+
+It should try to revert the specified rev, and leave the reversion staged
+so `rcs_commit_staged` will complete it. It should return undef on _success_
+and an error message on failure.
+
+This hook and `rcs_preprevert` are optional; if not implemented, no revert
+web interface will be available.
+
+### PageSpec plugins
+
+It's also possible to write plugins that add new functions to
+[[PageSpecs|ikiwiki/PageSpec]]. Such a plugin should add a function named
+`match_foo` to the IkiWiki::PageSpec package, where "foo()" is
+how it will be accessed in a [[ikiwiki/PageSpec]]. The function will be passed
+two parameters: The name of the page being matched, and the thing to match
+against. It may also be passed additional, named parameters.
+
+It should return an IkiWiki::SuccessReason object if the match succeeds, or
+an IkiWiki::FailReason object if the match fails. If the match cannot be
+attempted at all, for any page, it can instead return an
+IkiWiki::ErrorReason object explaining why.
+
+When constructing these objects, you should also include information about
+any pages whose contents or other metadata influenced the result of the
+match. Do this by passing a list of pages, followed by `deptype` values.
+
+For example, "backlink(foo)" is influenced by the contents of page foo;
+"link(foo)" and "title(bar)" are influenced by the contents of any page
+they match; "created_before(foo)" is influenced by the metadata of foo;
+while "glob(*)" is not influenced by the contents of any page.
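+
+As a sketch, here is a hypothetical `namelength(n)` PageSpec function that
+matches pages whose names are at most n characters long. Like "glob(*)",
+it is not influenced by the contents of any page, so no extra dependency
+information needs to be passed:
+
+    package IkiWiki::PageSpec;
+
+    sub match_namelength ($$;@) {
+        my $page=shift;
+        my $maxlength=shift;
+        if (length $page <= $maxlength) {
+            return IkiWiki::SuccessReason->new("$page is short enough");
+        }
+        else {
+            return IkiWiki::FailReason->new("$page is too long");
+        }
+    }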
+
+### Sorting plugins
+
+Similarly, it's possible to write plugins that add new functions as
+[[ikiwiki/pagespec/sorting]] methods. To achieve this, add a function to
+the IkiWiki::SortSpec package named `cmp_foo`, which will be used when sorting
+by `foo` or `foo(...)` is requested.
+
+The names of pages to be compared are in the global variables `$a` and `$b`
+in the IkiWiki::SortSpec package. The function should return the same thing
+as Perl's `cmp` and `<=>` operators: negative if `$a` is less than `$b`,
+positive if `$a` is greater, or zero if they are considered equal. It may
+also raise an error using `error`, for instance if it needs a parameter but
+one isn't provided.
+
+The function will also be passed one or more parameters. The first is
+`undef` if invoked as `foo`, or the parameter `"bar"` if invoked as `foo(bar)`;
+it may also be passed additional, named parameters.
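+
+As a sketch, a hypothetical `namelength` sort method, ordering pages by the
+length of their names (it ignores its parameters):
+
+    package IkiWiki::SortSpec;
+
+    sub cmp_namelength {
+        return length $a <=> length $b;
+    }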
+
+### Setup plugins
+
+The ikiwiki setup file is loaded using a pluggable mechanism. If you look
+at the top of a setup file, it starts with 'use IkiWiki::Setup::Standard',
+and the rest of the file is passed to that module's import method.
+
+It's possible to write other modules in the `IkiWiki::Setup::` namespace that
+can be used to configure ikiwiki in different ways. These modules should,
+when imported, populate `$IkiWiki::Setup::raw_setup` with a reference
+to a hash containing all the config items. They should also implement a
+`gendump` function.
+
+By the way, to parse an ikiwiki setup file and populate `%config`, a
+program just needs to do something like:
+`use IkiWiki::Setup; IkiWiki::Setup::load($filename)`
+
+### Function overriding
+
+Sometimes using ikiwiki's pre-defined hooks is not enough. Your plugin
+may need to replace one of ikiwiki's own functions with a modified version,
+or wrap one of the functions.
+
+For example, your plugin might want to override `displaytime`, to change
+the html markup used when displaying a date. Or it might want to override
+`IkiWiki::formattime`, to change how a date is formatted. Or perhaps you
+want to override `bestlink` and change how ikiwiki deals with [[WikiLinks|ikiwiki/WikiLink]].
+
+By venturing into this territory, your plugin is becoming tightly tied to
+ikiwiki's internals. And it might break if those internals change. But
+don't let that stop you, if you're brave.
+
+Ikiwiki provides an `inject()` function, that is a powerful way to replace
+any function with one of your own. This even allows you to inject a
+replacement for an exported function, like `bestlink`. Everything that
+imports that function will get your version instead. Pass it the name of
+the function to replace, and a new function to call.
+
+For example, here's how to replace `displaytime` with a version using HTML 5
+markup:
+
+ inject(name => 'IkiWiki::displaytime', call => sub {
+ return "<time>".formattime(@_)."</time>";
+ });
+
+Here's how to wrap `bestlink` with a version that tries to handle
+plural words:
+
+ my $origbestlink=\&bestlink;
+ inject(name => 'IkiWiki::bestlink', call => \&mybestlink);
+
+ sub deplural ($) {
+ my $word=shift;
+ $word =~ s/e?s$//; # just an example :-)
+ return $word;
+ }
+
+ sub mybestlink ($$) {
+ my $page=shift;
+ my $link=shift;
+ my $ret=$origbestlink->($page, $link);
+ if (! length $ret) {
+ $ret=$origbestlink->($page, deplural($link));
+ }
+ return $ret;
+ }
+
+### Javascript
+
+Some plugins use javascript to make ikiwiki look a bit more web-2.0-ish.
+
+All javascript code should be put in `.js` files in the `javascript`
+underlay, and plugins using those files can enable use of the underlay by
+calling `add_underlay("javascript");` in their `import` function.
+
+You'll have to arrange for `<script>` tags to be added to the pages that
+use your javascript. This can be done using a `format` hook.
+
+Ikiwiki provides some utility functions in `ikiwiki.js`, for use by other
+javascript code. These include:
+
+#### `getElementsByClass(cls, node, tag)`
+
+Returns an array of elements with the given class. The node and tag are
+optional and define what document node and element names to search.
+
+#### `hook(name, call)`
+
+The function `call` will be run as part of the hook named `name`.
+
+Note that to hook into `window.onload`, you can use the `onload` hook.
+
+#### `run_hooks(name)`
+
+Runs the hooks with the specified name.
diff --git a/doc/plugins/write/discussion.mdwn b/doc/plugins/write/discussion.mdwn
new file mode 100644
index 000000000..24a556ffe
--- /dev/null
+++ b/doc/plugins/write/discussion.mdwn
@@ -0,0 +1,46 @@
+Maybe this is obvious, but the config variable lives in the IkiWiki module, and one probably
+wants to call defaultconfig for most applications.
+<pre>
+%IkiWiki::config=IkiWiki::defaultconfig();
+IkiWiki::Setup::load($config_file);
+print join(",",keys %IkiWiki::config);
+</pre>
+
+[[DavidBremner]]
+
+I'm a little concerned about one aspect of the `%wikistate` variable that was just introduced.
+I think global state for each plugin is a fine idea, but I worry about making it persist across
+rebuilds. (And by rebuild, I assume we're talking about the `--rebuild` option.)
+
+My reasoning is that a 'rebuild' should be similar to checking out a new copy of the wiki
+and building. Another way of saying this is that all permanent state should be in the RCS.
+It is great that there is temporary state stored in other places - I think of it as indexing
+and caching. I'm worried that with the persistence, plugin writers will start putting data
+there that isn't backed by the RCS and that will break IkiWiki's great abilities as a
+distributed wiki.
+
+[[Will]]
+
+> Well, if you look at state that already persists across rebuilds, we have
+> pagectime, which can be extracted from RCS only very slowly in many
+> cases. There's also the separate state stored by the aggregate plugin,
+> which is indeed independent of the RCS, and can in some cases not be
+> replicated by rebuilding a different checkout (if the data is gone from
+> the feeds). Then there's the session cookie database, and the user
+> database, which started out with a lot of local state, has been
+> whittled down by removing admin prefs and subscriptions, but still has
+> important state including password hashes.
+>
+> So while I take your point about the potential for abuse,
+> there's certainly legitimate reasons to need to store data across
+> rebuilds. And plugins have always been able to drop their own files in
+> wikistatedir as aggregate does and have it persist, so the abuse
+> potential has always been there, the barrier has been lowered only
+> slightly.
+>
+> OTOH, if something can be added to the documentation that encourages
+> good behavior, that'd be a good thing ... --[[Joey]]
+
+---
+
+I would find this page clearer split up into sub-pages. Does anyone agree/disagree? -- [[users/Jon]]
diff --git a/doc/plugins/write/external.mdwn b/doc/plugins/write/external.mdwn
new file mode 100644
index 000000000..a3fbe8a2c
--- /dev/null
+++ b/doc/plugins/write/external.mdwn
@@ -0,0 +1,146 @@
+External plugins are standalone, executable programs that can be written
+in any language. When ikiwiki starts up, it runs the program, and
+communicates with it using [XML RPC][xmlrpc]. If you want to [[write]] an
+external plugin, read on.
+
+[xmlrpc]: http://www.xmlrpc.com/
+
+ikiwiki contains one sample external plugin, named `externaldemo`. This is
+written in perl, but is intended to be an example of how to write an
+external plugin in your favorite programming language. Wow us at how much
+easier you can do the same thing in your favorite language. ;-)
+
+There's now a second external plugin, the [[rst]] plugin, written in
+python. It uses a `proxy.py`, a helper library for ikiwiki python plugins.
+
+[[!toc ]]
+
+## How external plugins use XML RPC
+
+While XML RPC is typically used over http, ikiwiki doesn't do that.
+Instead, the external plugin reads XML RPC data from stdin, and writes it
+to stdout. To ease parsing, each separate XML RPC request or response must
+start at the beginning of a line, and end with a newline. When outputting
+XML RPC to stdout, be _sure_ to flush stdout. Failure to do so will result
+in deadlock!
+
+An external plugin should operate in a loop. First, read a command from
+stdin, using XML RPC. Dispatch the command, and return its result to
+stdout, also using XML RPC. After reading a command, and before returning
+the result, the plugin can output XML RPC requests of its own, calling
+functions in ikiwiki. Note: *Never* make an XML RPC request at any other
+time. IkiWiki won't be listening for it, and you will deadlock.
+
+When ikiwiki starts up an external plugin, the first RPC it will make
+is to call the plugin's `import()` function. That function typically makes
+an RPC to ikiwiki's `hook()` function, registering a callback.
+
+An external plugin can use XML RPC to call any of the exported functions
+documented in the [[plugin_interface_documentation|write]]. It can also
+actually call any non-exported IkiWiki function, but doing so is a good way
+to break your plugin when ikiwiki changes. There is currently no versioned
+interface like there is for perl plugins, but external plugins were first
+supported in ikiwiki version 2.6.
+
+## Accessing data structures
+
+IkiWiki has a few global data structures such as `%config`, which holds
+its configuration. External plugins can use the `getvar` and `setvar` RPCs
+to access any such global hash. To get the "url" configuration value,
+call `getvar("config", "url")`. To set it, call
+`setvar("config", "url", "http://example.com/")`.
+
+`%pagestate` is a special hash with a more complex format. To access
+it, external plugins can use the `getstate` and `setstate` RPCs. To access
+stored state, call `getstate("page", "id", "key")`, and to store state,
+call `setstate("page", "id", "key", "value")`.
+
+To access ikiwiki's ARGV array, call `getargv()`. To change its ARGV, call
+`setargv(array)`.
+
+## Notes on function parameters
+
+The [[plugin_interface_documentation|write]] talks about functions that take
+"named parameters". When such a function is called over XML RPC, such named
+parameters look like a list of keys and values:
+
+ page, foo, destpage, bar, magnify, 1
+
+If a name is repeated in the list, the later value overrides the earlier
+one:
+
+ name, Bob, age, 20, name, Sally, gender, female
+
+In perl, boiling this down to an associative array of named parameters is
+very easy:
+
+ sub foo {
+        my %params=@_;
+
+Other languages might not find it so easy. If not, it might be a good idea
+to convert these named parameters into something more natural for the
+language as part of their XML RPC interface.
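For example, a Python external plugin might boil the flat list down to a dictionary with a small helper like this (a sketch; `named_params` is a hypothetical name, not part of any ikiwiki API). Building a dict naturally gives repeated keys the later-value-wins behavior described above:

```python
def named_params(args):
    # Pair up the flat key, value, key, value list. Because dict()
    # keeps the last value seen for a repeated key, this matches
    # Perl's hash-from-list semantics.
    it = iter(args)
    return dict(zip(it, it))

# The repeated "name" resolves to the later value, "Sally".
params = named_params(["name", "Bob", "age", 20,
                       "name", "Sally", "gender", "female"])
```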
+
+## undef
+
+XML RPC has the limitation that it provides no way to pass
+undef/NULL/None. There is an extension to the protocol that supports this,
+but it is not yet available in all versions of the [[!cpan XML::RPC]] library
+used by ikiwiki.
+
+Until the extension is available, ikiwiki allows undef to be communicated
+over XML RPC by passing a sentinel value: a hash with a single key "null"
+whose value is an empty string. External plugins that need to communicate
+null values to or from ikiwiki will have to translate between undef and
+the sentinel.
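In Python terms, that translation might look like the following sketch (the helper names are made up for illustration):

```python
NULL_SENTINEL = {"null": ""}

def from_ikiwiki(value):
    # Map ikiwiki's undef sentinel to Python's None.
    if value == NULL_SENTINEL:
        return None
    return value

def to_ikiwiki(value):
    # Map None back to the sentinel before sending it over XML RPC.
    if value is None:
        return NULL_SENTINEL
    return value
```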
+
+## Function injection
+
+Some parts of ikiwiki are extensible by adding or overriding functions.
+It's actually possible to do this from an external plugin too.
+
+To make your external plugin override the `IkiWiki::formattime` function, for
+example, make an RPC call to `inject`. Pass it named parameters "name" and
+"call", where "name" is the name of the function to inject into perl (here
+"IkiWiki::formattime") and "call" is the RPC call ikiwiki will make whenever
+that function is run.
+
+If the RPC call is memoizable, you can also pass a "memoize" parameter, set
+to 1.
+
+## Limitations of XML RPC
+
+Since XML RPC can't pass around references to objects, it can't be used
+with functions that take or return such references. That means you can't
+100% use XML RPC for `cgi` or `formbuilder` hooks (which are passed CGI and
+FormBuilder perl objects), or use it to call `template()` (which returns a
+perl HTML::Template object).
+
+## Performance issues
+
+Since each external plugin is a separate process, when ikiwiki is
+configured to use lots of external plugins, it will start up slower, and
+use more resources. One or two should not be a problem though.
+
+There is some overhead in using XML RPC for function calls. Most plugins
+should find it to be pretty minimal though. In one benchmark, ikiwiki was
+able to perform 10000 simple XML RPC calls in 11 seconds -- 900 per second.
+
+Using external plugins for hooks such as `sanitize` and `format`, which
+pass around entire pages, and are run for each page rendered, will cause
+more XML RPC overhead than usual, due to the larger number of calls, and the
+large quantity of data conversion going on. In contrast, `preprocess` hooks
+are generally called less often, and pass around minimal data.
+
+External plugins should avoid making RPC calls unnecessarily (ie, in a loop).
+Memoizing the results of appropriate RPC calls is one good way to minimize the
+number of calls.
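For instance, a Python plugin could wrap its RPC helper in a cache so that each distinct query crosses the wire only once (a sketch; `rpc_getvar` below is a stand-in for however your plugin actually issues the RPC):

```python
import functools

calls = 0

def rpc_getvar(hash_name, key):
    # Stand-in for a real getvar RPC round trip; the counter just
    # demonstrates how often the wire is actually hit.
    global calls
    calls += 1
    return "http://example.com/"

@functools.lru_cache(maxsize=None)
def cached_getvar(hash_name, key):
    # Each distinct (hash, key) pair reaches rpc_getvar only once.
    return rpc_getvar(hash_name, key)

for _ in range(100):
    cached_getvar("config", "url")
```

This is only appropriate for values that won't change during the run, of course.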
+
+Injecting a replacement for a commonly called ikiwiki function
+could result in a lot more RPC calls than expected and slow
+everything down. `pagetitle`, for instance, is called about 100 times
+per page build. Whenever possible, you should tell ikiwiki to memoize
+injected functions.
+
+In general, use common sense, and your external plugin will probably
+perform ok.
diff --git a/doc/plugins/write/tutorial.mdwn b/doc/plugins/write/tutorial.mdwn
new file mode 100644
index 000000000..1912c8a2f
--- /dev/null
+++ b/doc/plugins/write/tutorial.mdwn
@@ -0,0 +1,189 @@
+This tutorial will walk you through [[writing|write]] your first ikiwiki
+plugin.
+
+What should the plugin do? Let's make it calculate and output the Fibonacci
+sequence. To output the next number in the sequence, all a user has to do
+is write this on a wiki page:
+
+ \[[!fib]]
+
+When the page is built, the [[ikiwiki/directive]] will be
+replaced by the next number in the sequence.
+
+Most of ikiwiki's plugins are written in Perl, and it's currently easiest
+to write them in Perl. So, open your favorite text editor and start
+editing a file named "fib.pm".
+
+ #!/usr/bin/perl
+
+This isn't really necessary, since fib.pm will be a Perl module, but it's
+nice to have. Since it's a module, the next bit is this. Notice the "fib"
+at the end, matching the "fib" in the filename.
+
+ package IkiWiki::Plugin::fib;
+
+Now let's import a few modules. Warnings and strict are good ideas, but the
+important one is the IkiWiki module.
+
+ use warnings;
+ use strict;
+ use IkiWiki 3.00;
+
+Ok, boilerplate is out of the way. Now to add the one function that ikiwiki
+expects to find in any module: `import`. The import function is called when
+the module is first loaded; what modules typically do with it is
+register hooks that ikiwiki will call later.
+
+ sub import {
+ hook(type => "preprocess", id => "fib", call => \&preprocess);
+ }
+
+This has hooked our plugin into the preprocess hook, which ikiwiki uses to
+expand preprocessor [[directives|ikiwiki/directive]]. Notice
+that "fib" has shown up again. It doesn't actually have to match the module
+name this time, but it generally will. This "fib" is telling ikiwiki what
+kind of preprocessor directive to handle, namely one that looks like this:
+
+ [[!fib ]]
+
+Notice the `\&preprocess`? This is how you pass a reference to a function,
+and the `preprocess` function is the one that ikiwiki will call to expand
+the preprocessor directive. So, time to write that function:
+
+ sub preprocess {
+ my %params=@_;
+ return 1;
+ }
+
+Whatever this function returns is what will show up on the wiki page.
+Since this is the Fibonacci sequence, returning 1 will be right for the
+first two calls anyway, so our plugin isn't _too_ buggy. ;-) Before we fix
+the bug, let's finish up the plugin.
+
+ 1
+
+Always put this as the last line in your Perl modules. Perl likes it.
+
+Ok, done! If you save the plugin, you can copy it to a place your ikiwiki
+looks for plugins (`/usr/share/perl5/IkiWiki/Plugin/` is a good bet; see
+[[install]] for the details of how to figure out where to
+install it). Then configure ikiwiki to use the plugin, and you're ready to
+insert at least the first two numbers of the Fibonacci sequence on web
+pages. Behold, the power of ikiwiki! ...
+
+----
+
+You could stop here, if you like, and go write your own plugin that does
+something more useful. Rather than leave you with a broken fib plugin
+though, this tutorial will go ahead and complete it. Let's add a simple
+Fibonacci generating function to the plugin. This is right out of a
+textbook.
+
+ sub fib {
+ my $num=shift;
+ return 0 if $num == 0;
+ return 1 if $num == 1;
+ return fib($num - 1) + fib($num - 2);
+ }
+
+And let's change the `preprocess` sub to use it:
+
+ my $last=0;
+
+ sub preprocess {
+ my %params=@_;
+ my $num=++$last;
+ return fib($num);
+ }
+
+Feel free to try it out with a simple page like this:
+
+ [[!fib ]], [[!fib ]], [[!fib ]], [[!fib ]], [[!fib ]]
+
+Looks like it works ok, doesn't it? That creates a page that lists:
+
+ 1, 1, 2, 3, 5
+
+But what happens if there are two pages that both use fib? Try it out.
+If ikiwiki builds both pages in one pass, the sequence will continue
+counting up from one page to the next. But if that second page is modified
+later and needs to be rebuilt, the sequence will start over from 1. This is
+because `$last` only remembers what was output during the current
+ikiwiki run.
+
+But that's not the end of the strange behavior. Create a page that inlines
+a page that uses fib. Now the inlined page will have one set of numbers,
+and the standalone page another. The numbers might even skip over part of
+the sequence in some cases.
+
+Obviously, using a global `$last` variable was a bad idea. It would
+work ok in a more regular cgi-based wiki, which only outputs one page per
+run. But since ikiwiki is a wiki *compiler*, things are a bit more
+complicated. It's not very hard to fix, though, if we do want the sequence
+to start from 1 in every page that uses it.
+
+ my %last;
+
+ sub preprocess {
+ my %params=@_;
+ my $page=$params{destpage};
+ my $num=++$last{$page};
+ return fib($num);
+ }
+
+All this is doing is using a hash to store the last number on a per-page
+basis. To get the name of the page that's being built, it looks in the
+`%params` hash.
+
+Ok, one more enhancement. Just incrementing the numbers is pretty boring.
+It would be nice to be able to jump directly to a given point in the
+sequence:
+
+ \[[!fib seed=20]], [[!fib ]], [[!fib ]]
+
+Just insert these lines of code inside `preprocess`, in the appropriate
+spot:
+
+ if (exists $params{seed}) {
+ $last{$page}=$params{seed};
+ }
+
+But this highlights another issue with the plugin. The `fib()` function is
+highly recursive and gets quite slow for large numbers. If a user enters
+seed=1000, it will run for a very long time, blocking ikiwiki from
+finishing. This denial of service attack also uses an ever-increasing
+amount of memory due to all the recursion.
+
+Now, we could try to fix `fib()` to run in constant time for any number,
+but that's not the focus of this tutorial. Instead, let's concentrate on
+making the plugin use the existing function safely. A good first step would
+be a guard on how high it will go.
+
+ my %last;
+
+ sub preprocess {
+ my %params=@_;
+ my $page=$params{destpage};
+ if (exists $params{seed}) {
+ $last{$page}=$params{seed}-1;
+ }
+ my $num=++$last{$page};
+ if ($num > 25) {
+ error "can only calculate the first 25 numbers in the sequence";
+ }
+ return fib($num);
+ }
+
+Returning an error message like this is standard for preprocessor plugins,
+so that the user can look at the built page and see what went wrong.
+
+Are we done? Nope, there's still a security hole. Consider what `fib()`
+does for numbers less than 0. Or for any number that's not an integer. In
+either case, it will run forever. Here's one way to fix that:
+
+ if (int($num) != $num || $num < 0) {
+ error "positive integers only, please";
+ }
+
+As these security problems have demonstrated, even a simple input from the
+user needs to be checked thoroughly before being used by an ikiwiki plugin.
diff --git a/doc/plugins/write/tutorial/discussion.mdwn b/doc/plugins/write/tutorial/discussion.mdwn
new file mode 100644
index 000000000..19f7e4084
--- /dev/null
+++ b/doc/plugins/write/tutorial/discussion.mdwn
@@ -0,0 +1,20 @@
+Thanks for the tutorial!
+
+But I think you have an error in the fib function! If you really start with
+
+ my $last = 0;
+
+and your fib function, you'll get this error, as you've produced a never ending recursion:
+
+ Deep recursion on subroutine "IkiWiki::Plugin::fib::fib" at ./fib.pm line 29.
+
+So the fib function should better look like this, which is its true definition (see [[Wikipedia|http://de.wikipedia.org/wiki/Fibonacci-Folge]], for example):
+
+ sub fib {
+ my $num=shift;
+ return 0 if $num == 0;
+ return 1 if $num == 1;
+ return fib($num - 1) + fib($num - 2);
+ }
+
+Just as a hint for people who run into this error while doing this tutorial.
diff --git a/doc/post-commit.mdwn b/doc/post-commit.mdwn
new file mode 100644
index 000000000..1c5176d42
--- /dev/null
+++ b/doc/post-commit.mdwn
@@ -0,0 +1,19 @@
+If your wiki is kept in [[revision_control|rcs]], a post-commit hook is run
+every time you commit a change to your repository.
+
+ikiwiki generates the post-commit hook wrapper once you've uncommented the
+relevant section (under wrappers) in your ikiwiki.setup file.
+
+The generated wrapper is a C program that is designed to safely be made
+suid if necessary. It's hardcoded to run ikiwiki with the settings
+specified when you ran --wrapper, and can only be used to update and
+compile that one checkout into the specified html directory.
+
+Depending on your setup, the post-commit hook might end up
+getting called by users who have write access to the repository, but not to
+your wiki checkout and html directory. If so, you can safely make
+the wrapper suid to a user who can write there (*not* to root!). You might
+want to read [[Security]] first.
+
+[[Setup]] explains setting this up from the start, and [[rcs/details]] has
+more information.
diff --git a/doc/post-commit/discussion.mdwn b/doc/post-commit/discussion.mdwn
new file mode 100644
index 000000000..fc0a27ee4
--- /dev/null
+++ b/doc/post-commit/discussion.mdwn
@@ -0,0 +1,123 @@
+Hi Joey, and many thanks for your work on ikiwiki; as usual you've given us
+very good software...
+
+I want to be able to edit my website from a browser (with the CGI) and
+from my favorite editor on my laptop. I have managed to use the subversion
+wrapper, so I have written a post-commit hook with:
+
+ cd /~/wikisrc/
+ svn up
+ /usr/bin/ikiwiki --setup ../ikiwiki.setup
+
+at the end.
+
+This configuration works for me, but the svn wrapper doesn't seem to
+do the svn up step, so I wonder if I've missed something...
+
+Regards.
+
+> Well, you've created a post-commit script that runs ikiwiki in setup mode.
+> That's not how it's generally done, instead you generally configure
+> ikiwiki to generate a post-commit _binary_ that runs ikiwiki in update
+> mode. That binary can be installed directly as the post-commit hook, or
+> called from an existing post-commit hook script, and it will handle the
+> necessary svn up, and will update the wiki much quicker than your --setup
+> command above (which rebuilds the entire wiki and all wrappers each
+> commit)!
+>
+> In this wiki's setup file, I configure ikiwiki to generate a post-commit
+> wrapper binary like so:
+>
+> wrappers => [
+> {
+> wrapper => "/srv/svn/ikiwiki/hooks/post-commit",
+> wrappermode => "04755",
+> notify => 1,
+> }
+> ],
+
+
+Hello, I've set up ikiwiki with subversion. I can edit pages from a web browser using the CGI and, when I go to recentchanges, the modification shows up marked with the word "web". But if I modify any .mdwn file directly, the website gets updated yet the change doesn't show up in recentchanges marked with "svn". If I run "svn ci -m changes", it shows in recentchanges correctly.
+
+So, I think I've missed something, because I don't think I should have to run "svn add" or "svn commit" every time I modify or create a wiki file.
+
+Thanks
+
+> Yes, ikiwiki does expect you to use your revision control system to check
+> in changes. Otherwise, recentchanges cannot work right, since it uses the
+> commit history from your revision control system. --[[Joey]]
+
+-----
+
+I'm working on an [[rcs]] plugin for CVS, adapted from `svn.pm`, in order
+to integrate ikiwiki at sites where that's all they've got. What's working
+so far: web commit (post-commit hook and all), diff, add (under certain
+conditions), and remove. What's not working: with rcs_add(), iff any of the
+new page's parent dirs aren't already under CVS control and the post-commit
+hook is enabled, the browser and ikiwiki stall for several seconds trying
+to add it, then time out. (If I kill ikiwiki when this is happening, it cvs
+adds the topmost parent that needed adding; if I wait for timeout, it
+doesn't. I think.) If I disable the post-commit hook and do the same kind
+of thing, the page is created and saved.
+
+In case you're lucky enough not to know, cvs adds on directories are weird
+-- they operate immediately against the repository, unlike file adds:
+
+ $ cvs add randomdir
+ Directory /Users/schmonz/Documents/cvswiki/repository/ikiwiki/randomdir added to the repository
+
+I was able to work out that when I'm seeing this page save misbehavior, my
+plugin is somewhere inside `system("cvs", "-Q", "add", "$file")`, which was
+never returning. If I changed it to anything other than cvs it iterated
+correctly over all the parent dirs which needed to be added to CVS, in the
+proper order. (cvs add isn't recursive, sadly.)
+
+Can you offer an educated guess what's going wrong here? --[[Schmonz]]
+
+> Got `rcs_recentchanges` working, believe it or not, thanks to [cvsps](http://www.cobite.com/cvsps/). If I can figure out this interaction between the post-commit hook and `cvs add` on directories, the CVS plugin is mostly done. Could it be a locking issue? Where should I be looking? Any suggestions appreciated. --[[Schmonz]]
+
+>> Okay, it is definitely a locking issue. First, on the conjecture that
+>> `cvs add <directory>` was triggering the post-commit hook and confusing
+>> ikiwiki, I wrapped the ikiwiki post-commit binary with a shell script
+>> that exited 0 if the triggering file was a directory. The first half of
+>> the conjecture was correct -- my wrapper got triggered -- but the web
+>> add of `one/two/three.mdwn` (where `one` and `two` weren't existing
+>> CVS-controlled dirs) remained hung as before. There were two ikiwiki
+>> processes running. On a whim, I killed the one with the higher PID; `cvs
+>> add one` immediately completed successfully, then back to a hang and two
+>> ikiwiki processes. I killed the newer one again and then `cvs add
+>> one/two` and `cvs add one/two/three.mdwn` completed and the web add was
+>> successful. --[[Schmonz]]
+
+>>> Aaaaaand I was wrong about the second half of the conjecture being
+>>> wrong. The wrapper script wasn't correctly identifying directories;
+>>> with that fixed, everything works. I've created a
+>>> [[rcs/cvs]] page. Thanks for listening. :-)
+>>> --[[Schmonz]]
+
+>> Here is a comment I committed to my laptop from Madrid Airport before
+>> your most recent updates, in case it's still useful:
+>>
+>> Locking certainly seems likely to be a problem. ikiwiki calls `rcs_add`
+>> *before* disabling the post-commit plugin, since all other VCSes allow
+>> adding something in a staged manner. You can see this in, for example,
+>> `editpage.pm` lines 391+.
+>>
+>> So I guess what happens is that ikiwiki has taken the wiki lock, calls
+>> `rcs_add`, which does a `cvs add`, which runs the post commit hook,
+>> since it is not disabled -- which blocks waiting for the wiki lock.
+>>
+>> I guess you can fix this in any of three ways: Modify lots of places
+>> in ikiwiki to disable the post commit hook before calling `rcs_add`,
+>> or make cvs's `rcs_add` temporarily disable the commit hook and
+>> re-enable it (but only if it was not already disabled, somehow),
+>> or make cvs's `rcs_add` only make note that it needs to call `cvs add`
+>> later, and do so at `rcs_commit`. The last of these seems easiest,
+>> especially since ikiwiki always commits after an add, in the same
+>> process, so you could just use a temporary list of things to add.
+>> --[[Joey]]
+
+>>> Thanks for the comments. Attempting to set up a wiki on a different system with a different version of `cvs`, I've encountered a new locking problem within CVS: `cvs commit` takes a write lock, post-commit ikiwiki calls `rcs_update()`, `cvs update` wants a read lock and blocks. The easiest fix I can think of is to make `cvs commit` return and relinquish its lock -- so instead of my wrapper script `exec`ing ikiwiki's post-commit hook, I amp it off and exit 0. Seems to do the trick and, if I grok ikiwiki's behavior here, is not dangerous. (Beats me why my development `cvs` doesn't behave the same WRT locking.)
+
+>>> I was all set to take your third suggestion, but now that there's more than one CVS oddity fixed trivially in a wrapper script, I think I prefer doing it that way.
+
+>>> I'd be glad for the CVS plugin to be included in ikiwiki, if and when you deem it ready. Please let me know what needs to be done for that to happen. --[[Schmonz]]
diff --git a/doc/quotes.mdwn b/doc/quotes.mdwn
new file mode 100644
index 000000000..22f3a28d8
--- /dev/null
+++ b/doc/quotes.mdwn
@@ -0,0 +1,3 @@
+Collecting some happy quotes about ikiwiki here.
+
+[[!inline pages="quotes/* and !*/Discussion"]]
diff --git a/doc/quotes/pizza.mdwn b/doc/quotes/pizza.mdwn
new file mode 100644
index 000000000..34899edfb
--- /dev/null
+++ b/doc/quotes/pizza.mdwn
@@ -0,0 +1,4 @@
+> Best. Wiki. Ever. Now a wiki that I can't "git clone" and "git push" to is
+> like a pizza that I have to eat with a knife and fork.
+
+-- [Don Marti](http://lwn.net/Articles/360888/)
diff --git a/doc/quotes/pizza/discussion.mdwn b/doc/quotes/pizza/discussion.mdwn
new file mode 100644
index 000000000..ecf8c44a6
--- /dev/null
+++ b/doc/quotes/pizza/discussion.mdwn
@@ -0,0 +1 @@
+It would be cool to know where this was written (assuming it was somewhere public :-)) Googling around, I've found a few places Don has recommended ikiwiki, but not in such glowing terms. -- [[Jon]]
diff --git a/doc/quotes/sold.mdwn b/doc/quotes/sold.mdwn
new file mode 100644
index 000000000..4bd021f74
--- /dev/null
+++ b/doc/quotes/sold.mdwn
@@ -0,0 +1,3 @@
+I'm totally sold on ikiwiki now.
+
+-- Anna Hess
diff --git a/doc/rcs.mdwn b/doc/rcs.mdwn
new file mode 100644
index 000000000..1f6b3c24e
--- /dev/null
+++ b/doc/rcs.mdwn
@@ -0,0 +1,44 @@
+[[!meta title="Revision Control Systems"]]
+
+Ikiwiki supports using several revision control systems for storing page
+histories.
+
+Ikiwiki started out supporting only [[Subversion|svn]], but the interface
+ikiwiki uses to a revision control system is sufficiently simple and
+generic that it can be adapted to work with many systems by writing a
+[[plugin|plugins/write]]. These days, most people use [[git]].
+
+While all supported revision control systems work well enough for basic
+use, some advanced or special features are not supported in all of them.
+The table below summarises this for each revision control system and
+links to more information about each.
+
+[[!table data="""
+feature |[[git]]|[[svn]]|[[bzr]] |[[monotone]]|[[mercurial]]|[[darcs]]|[[tla]] |[[cvs]]
+[[ikiwiki-makerepo]]|yes |yes |yes |yes |yes |yes |no |yes
+auto.setup |yes |yes |incomplete|yes |incomplete |yes |incomplete|yes
+`rcs_commit_staged` |yes |yes |yes |yes |yes |yes |no |yes
+`rcs_rename` |yes |yes |yes |yes |yes |yes |no |yes
+`rcs_remove` |yes |yes |yes |yes |yes |yes |no |yes
+`rcs_diff` |yes |yes |yes |yes |yes |yes |yes |yes
+`rcs_getctime` |fast |slow |slow |slow |fast |slow |slow |slow
+`rcs_getmtime` |fast |slow |slow |slow |fast |no |no |no
+`rcs_preprevert` |yes |no |no |no |no |no |no |no
+`rcs_revert` |yes |no |no |no |no |no |no |no
+anonymous push |yes |no |no |no |no |no |no |no
+conflict handling |yes |yes |yes |buggy |yes |yes |yes |yes
+openid username |yes |no |no |no |yes |yes |no |no
+"""]]
+
+Notes:
+
+* Lack of support in [[ikiwiki-makerepo]] or auto.setup can make it harder to
+ set up a wiki using that revision control system.
+* The `rcs_commit_staged` hook is needed to use [[attachments|plugins/attachment]]
+ or [[plugins/comments]].
+* `rcs_getctime` and `rcs_getmtime` may be implemented in a fast way (ie, one log
+ lookup for all files), or very slowly (one lookup per file).
+* Openid username support allows avoiding display of Google's ugly openids.
+
+There is a page with [[details]] about how the different systems work with
+ikiwiki, for the curious.
diff --git a/doc/rcs/bzr.mdwn b/doc/rcs/bzr.mdwn
new file mode 100644
index 000000000..19a7ae395
--- /dev/null
+++ b/doc/rcs/bzr.mdwn
@@ -0,0 +1,8 @@
+[Bazaar](http://bazaar-vcs.org/) is a distributed revision control
+system developed by Canonical Ltd. Ikiwiki supports storing a wiki in a
+bzr repository.
+
+Ikiwiki can run as a post-update hook to update a wiki whenever commits
+come in. When running as a [[cgi]] with bzr, ikiwiki automatically
+commits edited pages, and uses the bzr history to generate the
+[[RecentChanges]] page.
diff --git a/doc/rcs/cvs.mdwn b/doc/rcs/cvs.mdwn
new file mode 100644
index 000000000..2b191f257
--- /dev/null
+++ b/doc/rcs/cvs.mdwn
@@ -0,0 +1,46 @@
+[[!template id=gitbranch branch=schmonz/cvs author="[[schmonz]]"]]
+
+If you really need to, you can use [[!wikipedia desc="CVS" Concurrent
+Versions System]] with ikiwiki.
+
+### Usage
+7. Install [[!cpan File::chdir]], [[!cpan File::ReadBackwards]],
+ [cvsps](http://www.cobite.com/cvsps/), and
+ [cvsweb](http://www.freebsd.org/projects/cvsweb.html) or the like.
+7. Adjust CVS-related parameters in your setup file.
+
+Consider creating `$HOME/.cvsrc` if you don't have one already; the
+plugin doesn't need it, but you yourself might. Here's a good
+general-purpose one:
+
+ cvs -q
+ checkout -P
+ update -dP
+ diff -u
+ rdiff -u
+
+### Implementation details
+* [[ikiwiki-makerepo]]:
+ * creates a repository,
+ * imports `$SRCDIR` into top-level module `ikiwiki` (vendor tag
+ IKIWIKI, release tag PRE_CVS),
+ * configures the post-commit hook in `CVSROOT/loginfo`.
+
+### To do
+* Expand test coverage and fix bugs.
+* Have `ikiwiki-makerepo` set up NetBSD-like `log_accum` and
+ `commit_prep` scripts that coalesce commits into changesets. Reasons:
+ 7. Obviates the need to scrape the repo's complete history to
+ determine the last N changesets. (Repositories without such
+ records can fall back on the `cvsps` and `File::ReadBackwards`
+ code.)
+ 7. Arranges for ikiwiki to be run once per changeset, rather
+ than CVS's once per committed file (!), which is a waste at
+ best and bug-inducing at worst. (Currently, on multi-directory
+ commits, only the first directory's changes get mentioned
+ in [[recentchanges|plugins/recentchanges]].)
+* Perhaps prevent web edits from attempting to create `.../CVS/foo.mdwn`
+ (and `.../cvs/foo.mdwn` on case-insensitive filesystems); thanks
+ to the CVS metadata directory, the attempt will fail anyway (and
+ much more confusingly) if we don't.
+* Do a writeup for [[rcs/details]].
diff --git a/doc/rcs/cvs/discussion.mdwn b/doc/rcs/cvs/discussion.mdwn
new file mode 100644
index 000000000..57b0d044b
--- /dev/null
+++ b/doc/rcs/cvs/discussion.mdwn
@@ -0,0 +1,191 @@
+I've started reviewing this, and the main thing I don't like is the
+post-commit wrapper wrapper that ikiwiki-makerepo is patched to set up.
+That just seems unnecessarily complicated. Why can't ikiwiki itself detect
+the "cvs add <directory>" call and avoid doing anything in that case?
+--[[Joey]]
+
+> The wrapper wrapper does three things:
+>
+> 7. It ignores `cvs add <directory>`, since this is a weird CVS
+> behavior that ikiwiki gets confused by and doesn't need to act on.
+> 7. It prevents `cvs` locking against itself: `cvs commit` takes a
+> write lock and runs the post-commit hook, which runs `cvs update`,
+> which wants a read lock and sleeps forever -- unless the post-commit
+> hook runs in the background so the commit can "finish".
+> 7. It fails silently if the ikiwiki post-commit hook is missing.
+> CVS doesn't have any magic post-commit filenames, so hooks have to
+> be configured explicitly. I don't think a commit will actually fail
+> if a configured post-commit hook is missing (though I can't test
+> this at the moment).
+>
+> Thing 1 can probably be handled within ikiwiki, if that seems less
+> gross to you.
+
+>> It seems like it might be. You can use a `getopt` hook to check
+>> `@ARGV` to see how it was called. --[[Joey]]
+
+>>> This does the trick iff the post-commit wrapper passes its args
+>>> along. Committed on my branch. This seems potentially dangerous,
+>>> since the args passed to ikiwiki are influenced by web commits.
+>>> I don't see an exploit, but for paranoia's sake, maybe the wrapper
+>>> should only be built with execv() if the cvs plugin is loaded?
+>>> --[[schmonz]]
+
+>>>> Hadn't considered that. While in wrapper mode the normal getopt is not
+>>>> done, plugin getopt still runs, and so any unsafe options that
+>>>> other plugins support could be a problem if another user runs
+>>>> the setuid wrapper and passes those options through. --[[Joey]]
+
+>>>>> I've tried compiling the argument check into the wrapper as
+>>>>> the first thing main() does, and was surprised to find that
+>>>>> this doesn't prevent the `cvs add <dir>` deadlock in a web
+>>>>> commit. I was convinced this'd be a reasonable solution,
+>>>>> especially if conditionalized on the cvs plugin being loaded,
+>>>>> but it doesn't work. And I stuck debug printfs at the beginning
+>>>>> of all the rcs_foo() subs, and whatever `cvs add <dir>` is
+>>>>> doing to ikiwiki isn't visible to my plugin, because none of
+>>>>> those subs are getting called. Nuts. Can you think of anything
+>>>>> else that might solve the problem, or should I go back to
+>>>>> generating a minimal wrapper wrapper that checks for just
+>>>>> this one thing? --[[schmonz]]
+
+>>>>>> I don't see how there could possibly be a difference between
+>>>>>> ikiwiki's C wrapper and your shell wrapper wrapper here. --[[Joey]]
+
+>>>>>>> I was comparing strings overly precisely. Fixed on my branch.
+>>>>>>> I've also knocked off the two most pressing to-do items. I
+>>>>>>> think the plugin's ready for prime time. --[[schmonz]]
+
+> Thing 2 I'm less sure of. (I'd like to see the web UI return
+> immediately on save anyway, to a temporary "rebuilding, please wait
+> if you feel like knowing when it's done" page, but this problem
+> with CVS happens with any kind of commit, and could conceivably
+> happen with some other VCS.)
+
+>> None of the other VCSes let a write lock block a read lock, apparently.
+>>
+>> Anyway, re the backgrounding, when committing via the web, the
+>> post-commit hook doesn't run anyway; the rendering is done via the
+>> ikiwiki CGI. It would certainly be nice if it popped up a quick "working"
+>> page and replaced it with the updated page when done, but that's
+>> unrelated; the post-commit
+>> hook only does rendering when committing using the VCS directly. The
+>> backgrounding you do actually seems safe enough -- but tacking
+>> on a " &" to the ikiwiki wrapper call doesn't need a wrapper script,
+>> does it? --[[Joey]]
+
+>>> Nope, it works fine to append it to the `CVSROOT/loginfo` line.
+>>> Fixed on my branch. --[[schmonz]]
+
+> Thing 3 I think I did in order to squelch the error messages that
+> were bollixing up the CGI. It was easy to do this in the wrapper
+> wrapper, but if that's going away, it can be done just as easily
+> with output redirection in `CVSROOT/loginfo`.
+>
+> --[[schmonz]]
+
+>> If the error messages screw up the CGI they must go to stdout.
+>> I thought we had stderr even in the CVS dark ages. ;-) --[[Joey]]
+
+>>> Some messages go to stderr, but definitely not all. That's why
+>>> I wound up reaching for IPC::Cmd, to execute the command line
+>>> safely while shutting CVS up. Anyway, I've tested what happens
+>>> if a configured post-commit hook is missing, and it seems fine,
+>>> probably also thanks to IPC::Cmd.
+>>> --[[schmonz]]
+
+----
+
+
+Further review.. --[[Joey]]
+
+I don't understand what `cvs_shquote_commit` is
+trying to do with the test message, but it seems
+highly likely to be insecure; I never trust anything
+that relies on safely quoting user input passed to the shell.
+
+(As an aside, `shell_quote` can die on certain inputs.)
+
+Seems to me that, if `IPC::Cmd` exposes input to the shell
+(which I have not verified but its docs don't specify; a bad sign)
+you chose the wrong tool and ended up going down the wrong
+route, dragging in shell quoting problems and fixes. Since you
+chose to use `IPC::Cmd` just because you wanted to shut
+up CVS stderr, my suggestion would be to use plain `system`
+to run the command, with stderr temporarily sent to /dev/null:
+
+ open(my $savederr, ">&STDERR");
+ open(STDERR, ">", "/dev/null");
+ my $ret=system("cvs", "-Q", @_);
+    open(STDERR, ">&", $savederr);
+
+`cvs_runcvs` should not take an array reference. It's
+usual for this type of function to take a list of parameters
+to pass to the command.
+
+> Thanks for reading carefully. I've tested your suggestions and
+> applied them on my branch. --[[schmonz]]
+
+----
+
+I've abstracted out CVS's involvement in the wrapper, adding a new
+"wrapperargcheck" hook to examine `argc/argv` and return success or
+failure (failure causes the wrapper to terminate) and implementing
+this hook in the plugin. In the non-CVS case, the check immediately
+returns success, so the added overhead is just a function call.
+
+Given how rarely anything should need to reach in and modify the
+wrapper -- I'd go so far as to say we shouldn't make it too easy
+-- I don't think it's worth the effort to try and design a more
+general-purpose way to do so. If and when some other problem thinks
+it wants to be solved by a new wrapper hook, it's easy enough to add
+one. Until then, I'd say it's more important to keep the wrapper as
+short and clear as possible. --[[schmonz]]
+
+> I've committed a slightly different hook, which should be general enough
+> that `IkiWiki::Receive` can also use it, so please adapt your code to
+> that. --[[Joey]]
+
+>> Done. --[[schmonz]].
+
+----
+
+I'm attempting to bring some polish to this plugin, starting with
+fuller test coverage. In preparation, I've refactored the tests a
+bunch (and shuffled the code a bit) in my branch. I'm worried,
+however, that my misunderstanding of `git rebase` may have made my
+branch harder for you to pull.
+
+Before I go writing a whole swack of test cases, could you merge
+my latest? Through at least ad0e56cdcaaf76bc68d1b5c56e6845307b51c44a
+there should be no functional change. --[[schmonz]]
+
+Never mind, I was able to convince myself (by cloning `origin`
+afresh and merging from `schmonz/cvs`). The history is a little
+gross but the before-and-after diff looks right.
+
+Bugs found and fixed so far:
+
+* Stop treating text files as binary (`-kb`) on `rcs_add()`
+ (ac8eab29e8394aca4c0b23a6687ec947ea1ac869)
+
+> Merged to current head. --[[Joey]]
+
+----
+
+Hi! Bugfixes in `schmonz/cvs` I'd like to see merged:
+
+* `6753235d`: Return bounded output from `rcs_diff()` when asked, as
+ the API states.
+* `e45175d5`: Always explicitly set CVS keyword substitution behavior.
+ Fixes behavior when a text file is added under a name formerly
+ used for a binary file.
+* `b30cacdf`: If the previous working directory no longer exists after
+ a CVS operation, don't try to `chdir()` back to it afterward.
+* `91b477c0`: Fix diffurl links (cvsweb expects unescaped '/').
+
+These are all the diffs that exist on the branch, so if the changes
+are acceptable you should be able to simply merge the branch.
+--[[schmonz]]
+
+> All applied. --[[Joey]]
diff --git a/doc/rcs/darcs.mdwn b/doc/rcs/darcs.mdwn
new file mode 100644
index 000000000..fbb9bcede
--- /dev/null
+++ b/doc/rcs/darcs.mdwn
@@ -0,0 +1,15 @@
+[Darcs](http://darcs.net) is a distributed revision control
+system. Ikiwiki supports storing a wiki in a
+Darcs repository.
+
+An Ikiwiki wrapper is run by the `posthook` to update a wiki whenever commits
+or remote pushes come in. When running as a [[cgi]] with Darcs, ikiwiki
+automatically commits edited pages, and uses the Darcs history to generate the
+[[RecentChanges]] page.
+
+Example for a `_darcs/prefs/defaults` file in `$SRCDIR`:
+
+ apply posthook /path/to/repository/_darcs/ikiwiki-wrapper
+ apply run-posthook
+
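The wrapper referenced by the posthook is generated from the ikiwiki setup file. A minimal sketch of the corresponding stanza, assuming the darcs backend names its wrapper option like the other backends do (the path is hypothetical):

    rcs => "darcs",
    darcs_wrapper => "/path/to/repository/_darcs/ikiwiki-wrapper",
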
+See also [[todo/darcs]].
diff --git a/doc/rcs/details.mdwn b/doc/rcs/details.mdwn
new file mode 100644
index 000000000..013ddb745
--- /dev/null
+++ b/doc/rcs/details.mdwn
@@ -0,0 +1,292 @@
+A few bits about the RCS backends
+
+[[!toc ]]
+
+## Terminology
+
+"web-edit" means that a page is edited by using the web (CGI) interface
+as opposed to using an editor and the RCS interface.
+
+
+## [[svn]]
+
+Subversion was the first RCS to be supported by ikiwiki.
+
+### How does it work internally?
+
+Master repository M.
+
+RCS commits from the outside are installed into M.
+
+There is a working copy of M (a checkout of M): W.
+
+HTML is generated from W. rcs_update() will update from M to W.
+
+CGI operates on W. rcs_commit() will commit from W to M.
+
+For all the gory details of how ikiwiki handles this behind the scenes,
+see [[commit-internals]].
+
+You browse and web-edit the wiki on W.
+
+W "belongs" to ikiwiki and should not be edited directly.
+
+
+## [[darcs]]
+
+Regarding the repository layout: There are two darcs repositories. One is the `srcdir`, the other we'll call `master`.
+
+* HTML is generated from `srcdir`.
+* CGI edits happen in `srcdir`.
+* The backend pulls updates from `master` into `srcdir`, i.e. darcs commits should happen to `master`.
+* `master` calls ikiwiki (through a wrapper) in its apply posthook, i.e. `master/_darcs/prefs/defaults` should look like this:
+
+ apply posthook ikiwrap
+ apply run-posthook
+
+* The backend pushes CGI edits from `srcdir` back into `master` (triggering the apply hook).
+* The working copies in `srcdir` and `master` should *not* be touched by the user, only by the CGI or darcs, respectively.
+
+## [[Git]]
+
+Regarding the Git support, Recai says:
+
+I have been testing it for the past few days and it seems satisfactory. I
+haven't observed any race condition regarding the concurrent blog commits
+and it handles merge conflicts gracefully as far as I can see.
+
+(After about a year, git support is nearly as solid as subversion support --[[Joey]])
+
+As you may notice from the patch size, GIT support is not so trivial to
+implement (for me, at least). It has some drawbacks (especially wrt merge,
+which was the hard part). Git doesn't have functionality similar to
+'svn merge -rOLD:NEW FILE' (please see the relevant comment in `_merge_past`
+for more details), so I had to invent an ugly hack just for the purpose.
+
+> I was looking at this, and WRT the problem of uncommitted local changes,
+> it seems to me you could just git-stash them now that git-stash exists.
+> I think it didn't when you first added the git support.. --[[Joey]]
+
+
+>> Yes, git-stash had not existed before. What about sth like below? It
+>> seems to work (I haven't given much thought to the specific implementation
+>> details). --[[roktas]]
+
+>> # create test files
+>> cd /tmp
+>> seq 6 >page
+>> cat page
+>> 1
+>> 2
+>> 3
+>> 4
+>> 5
+>> 6
+>> sed -e 's/2/2ME/' page >page.me # my changes
+>> cat page.me
+>> 1
+>> 2ME
+>> 3
+>> 4
+>> 5
+>> 6
+>> sed -e 's/5/5SOMEONE/' page >page.someone # someone's changes
+>> cat page.someone
+>> 1
+>> 2
+>> 3
+>> 4
+>> 5SOMEONE
+>> 6
+>>
+>> # create a test repository
+>> mkdir t
+>> cd t
+>> cp ../page .
+>> git init
+>> git add .
+>> git commit -m init
+>>
+>> # save the current HEAD
+>> ME=$(git rev-list HEAD -- page)
+>> $EDITOR page # assume that I'm starting to edit page via web
+>>
+>> # simulates someone's concurrent commit
+>> cp ../page.someone page
+>> git commit -m someone -- page
+>>
+>> # My editing session ended, the resulting content is in page.me
+>> cp ../page.me page
+>> cat page
+>> 1
+>> 2ME
+>> 3
+>> 4
+>> 5
+>> 6
+>>
+>> # let's start to save my uncommitted changes
+>> git stash clear
+>> git stash save "changes by me"
+>> # we've reached a clean state
+>> cat page
+>> 1
+>> 2
+>> 3
+>> 4
+>> 5SOMEONE
+>> 6
+>>
+>> # roll-back to the $ME state
+>> git reset --soft $ME
+>> # now, the file is marked as modified
+>> git stash save "changes by someone"
+>>
+>> # now, we're at the $ME state
+>> cat page
+>> 1
+>> 2
+>> 3
+>> 4
+>> 5
+>> 6
+>> git stash list
+>> stash@{0}: On master: changes by someone
+>> stash@{1}: On master: changes by me
+>>
+>> # first apply my changes
+>> git stash apply stash@{1}
+>> cat page
+>> 1
+>> 2ME
+>> 3
+>> 4
+>> 5
+>> 6
+>> # ... and commit
+>> git commit -m me -- page
+>>
+>> # apply someone's changes
+>> git stash apply stash@{0}
+>> cat page
+>> 1
+>> 2ME
+>> 3
+>> 4
+>> 5SOMEONE
+>> 6
+>> # ... and commit
+>> git commit -m me+someone -- page
+
+By design, Git backend uses a "master-clone" repository pair approach in contrast
+to the single repository approach (here, _clone_ may be considered as the working
+copy of a fictitious web user). Even though a single repository implementation is
+possible, it somewhat increases the code complexity of the backend (I couldn't figure
+out a uniform method which doesn't depend on the preferred repository model, yet).
+By exploiting the fact that the master repo and the _web user_'s repo (`srcdir`) are
+both on the same local machine, I suggest creating the latter with the "`git clone -l -s`"
+command to save disk space.
+
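The effect of `-l -s` can be seen with a throwaway pair of repositories; this is only a sketch (all paths are temporary, and ikiwiki itself is not involved):

```shell
# create a small master repo, then clone it with -l -s so the srcdir
# shares the master's object store instead of copying it
tmp=$(mktemp -d)
git init -q "$tmp/master"
git -C "$tmp/master" -c user.name=demo -c user.email=demo@example.org \
    commit -q --allow-empty -m init
git clone -q -l -s "$tmp/master" "$tmp/srcdir"
# a shared clone records the master's object directory here:
cat "$tmp/srcdir/.git/objects/info/alternates"
```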
+Note that, as a rule of thumb, you should always put the rcs wrapper (`post-update`)
+into the master repository (`.git/hooks/`).
+
+Here is how a web edit works with ikiwiki and git:
+
+* ikiwiki cgi modifies the page source in the clone
+* git-commit in the clone
+* git push origin master, pushes the commit from the clone to the master repo
+* the master repo's post-update hook notices this update, and runs ikiwiki
+* ikiwiki notices the modified page source, and compiles it
+
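That flow can be sketched with plain git, substituting an `echo` for the ikiwiki wrapper (every path below is a throwaway; this is an illustration, not the real hook):

```shell
# bare master repo whose post-update hook stands in for the ikiwiki wrapper
tmp=$(mktemp -d)
git init -q --bare -b master "$tmp/master.git"
cat > "$tmp/master.git/hooks/post-update" <<'EOF'
#!/bin/sh
echo "ikiwiki refresh would run here" > refreshed
EOF
chmod +x "$tmp/master.git/hooks/post-update"

# the srcdir clone: edit a page, commit, push back to the master repo
git clone -q "$tmp/master.git" "$tmp/srcdir" 2>/dev/null
cd "$tmp/srcdir"
echo "web edit" > page.mdwn
git add page.mdwn
git -c user.name=web -c user.email=web@example.org commit -q -m "web commit"
git push -q origin master 2>/dev/null
# the push fired the hook in the bare repo:
cat "$tmp/master.git/refreshed"
```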
+Here is how a commit from a remote repository works:
+
+* git-commit in the remote repository
+* git-push, pushes the commit to the master repo on the server
+* (Optionally, the master repo's pre-receive hook runs, and checks that the
+ update only modifies files that the pushing user is allowed to update.
+ If not, it aborts the receive.)
+* the master repo's post-update hook notices this update, and runs ikiwiki
+* ikiwiki notices the modified page source, and compiles it
+
+## [[Mercurial]]
+
+The Mercurial backend is still in an early phase, so it may not be mature
+enough, but it should be simple to understand and use.
+
+As Mercurial is a distributed RCS, it lacks the distinction between
+repository and working copy (every wc is a repo).
+
+This means that the Mercurial backend uses the repository directly as the
+working copy (the master M and the working copy W described in the svn
+example are the same thing).
+
+You only need to specify 'srcdir' (the repository M) and 'destdir' (where
+the HTML will be generated).
+
+Master repository M.
+
+RCS commits from the outside are installed into M.
+
+M is directly used as working copy (M is also W).
+
+HTML is generated from the working copy in M. rcs_update() will update
+to the last committed revision in M (the same as 'hg update').
+If you use an 'update' hook you can automatically generate the HTML
+in the destination directory each time 'hg update' is called.
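Such an update hook can go in the repository's `.hg/hgrc`; a sketch (the setup file path is hypothetical):

    [hooks]
    # regenerate the HTML each time the working copy is updated
    update = ikiwiki --setup /path/to/wiki.setup --refresh
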
+
+CGI operates on M. rcs_commit() will commit directly in M.
+
+If you have any questions or suggestions about the Mercurial backend
+please refer to [Emanuele](http://nerd.ocracy.org/em/)
+
+## [[tla]]
+
+Nobody really understands how tla works. ;-)
+
+## rcs
+
+There is a patch that needs a bit of work linked to from [[todo/rcs]].
+
+## [[Monotone]]
+
+In normal use, monotone has a local database as well as a workspace/working copy.
+In ikiwiki terms, the local database takes the role of the master repository, and
+the srcdir is the workspace. As all monotone workspaces point to a default
+database, there is no need to tell ikiwiki explicitly about the "master" database. It
+will know.
+
+The backend currently supports normal committing and getting the history of the page.
+To understand the parallel commit approach, you need to understand monotone's
+approach to conflicts:
+
+Monotone allows multiple micro-branches in the database. There is a command,
+`mtn merge`, that takes the heads of all these branches and merges them back together
+(turning the tree of branches into a dag). Conflicts in monotone (at time of writing)
+need to be resolved interactively during this merge process.
+It is important to note that having multiple heads is not an error condition in a
+monotone database. This condition will occur in normal use. In this case
+'update' will choose a head if it can, or complain and tell the user to merge.
+
+The monotone ikiwiki plugin borrows some ideas from the svn ikiwiki plugin.
+On prepedit() we record the revision that this change is based on (I'll refer
+to this as the prepedit revision). When the web user saves the page, we check
+if that is still the current revision. If it is, then we commit. If it isn't,
+then we check to see if anyone else changed the file we're editing while we've
+been editing (a diff between the prepedit revision and the current rev).
+If there were no changes to the file we're editing then we commit as normal.
+
+It is only if there have been parallel changes to the file we're trying to commit that
+things get hairy. In this case the current approach is to
+commit the web changes as a branch from the prepedit revision. This
+will leave the repository with multiple heads. At this point, all data is saved.
+The system then tries to merge the heads with a merger that will fail if it cannot
+resolve the conflict. If the merge succeeds then everything is ok.
+
+If that merge failed then there are conflicts. In this case, the current code calls
+merge again with a merger that inserts conflict markers. It commits this new
+revision with conflict markers to the repository. It then returns the text to the
+user for cleanup. This is less neat than it could be, in that a conflict marked
+revision gets committed to the repository.
+
+## [[bzr]]
+
+## [[cvs]]
diff --git a/doc/rcs/details/discussion.mdwn b/doc/rcs/details/discussion.mdwn
new file mode 100644
index 000000000..c2a0a5977
--- /dev/null
+++ b/doc/rcs/details/discussion.mdwn
@@ -0,0 +1,15 @@
+## [[git]]
+
+I'm currently spending some thoughts on how to extend the
+ikiwiki git infrastructure to allow for the two repositories
+to be on different machines. Has someone else already made
+such thoughts? --[[tschwinge]]
+
+> Okay, I got this working. I'll test and experiment some
+> more and then document it in here. --[[tschwinge]]
+
+> I think the method documented in [[setup]] will work fine for this. Just
+> have the other machine clone from the bare repository on the server. When
+> it pushes changes to the server, the post-update hook fires, and updates the
+> wiki. Of course, you actually have *three* repos in this setup, two on
+> the server, and one on the other machine. --[[Joey]]
diff --git a/doc/rcs/git.mdwn b/doc/rcs/git.mdwn
new file mode 100644
index 000000000..c82adbd04
--- /dev/null
+++ b/doc/rcs/git.mdwn
@@ -0,0 +1,153 @@
+[[!meta title="Git"]]
+
+[Git][git] is a distributed revision control system originally developed for
+the Linux kernel. Ikiwiki supports storing a wiki in git.
+
+[git]: http://git.or.cz/
+
+Ikiwiki can run as a git `post-update` hook to update a wiki
+whenever commits come in. When running as a [[cgi]],
+ikiwiki automatically commits edited pages, and uses the
+git history to generate the [[RecentChanges]] page.
+
+Normally you can just follow the instructions in [[setup]] to create
+the git repositories and get started. To understand the details, read on.
+
+## git repository setup
+
+[[!img wiki_edit_flow.svg size=490x align=right]]
+
+The suggested setup for git has a bare repository, and various
+working clones (with working directories). The bare
+repository is pushed to and pulled from the various working clones.
+
+One of the clones is special; it is the srcdir
+which is used to compile the wiki, and is also used by the
+[[cgi]] to commit changes made via the web interface. It is special
+since the `post-update` hook for the bare root repository is used to
+trigger an update of this repository, and then an ikiwiki refresh
+updates the published wiki itself.
+
+The other (optional) clones are meant for you to work
+on and commit to; changes should then be pushed to the bare root
+repository.
+
+Using three or more repositories isn't the most obvious setup, but
+it works the best for typical ikiwiki use. [[ikiwiki-makerepo]] can
+automate setting this up for the common case where there is no
+pre-existing wiki. [[tips/Laptop_wiki_with_git]] describes a different
+way to set up ikiwiki and git.
+
+## git repository with multiple committers
+
+It can be tricky to get the permissions right to allow multiple people to
+commit to an ikiwiki git repository. As the [[security]] page mentions,
+for a secure ikiwiki installation, only one person should be able to write
+to ikiwiki's srcdir. When other committers make commits, their commits
+should be pushed to the bare repository, which has a `post-update` hook
+that uses ikiwiki to pull the changes to the srcdir.
+
+One setup that will work is to put all committers in a group (say,
+"ikiwiki"), and use permissions to allow that group to commit to the bare git
+repository. Make both the post-update hook and ikiwiki.cgi be setgid
+to the group, as well as suid to the user who admins the wiki. The
+`wrappergroup` [[setup_file_option|usage]] can be used to make the wrappers
+be setgid to the right group. Then the srcdir, including its git
+repository, should only be writable by the wiki's admin, and *not* by the
+group. Take care that ikiwiki uses a umask that does not cause files in
+the srcdir to become group writable. (umask 022 will work.)
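The effect of such a umask can be checked directly; a minimal sketch with a throwaway file:

```shell
# with umask 022, newly created files are readable but not writable by group
umask 022
tmp=$(mktemp -d)
touch "$tmp/page.mdwn"
ls -l "$tmp/page.mdwn" | cut -c1-10
```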
+
+## git repository with untrusted committers
+
+By default, anyone who can commit to the git repository can modify any file
+on the wiki however they like. A `pre-receive` hook can be set up to limit
+incoming commits from untrusted users. Then the same limits that are placed
+on edits via the web will be in effect for commits to git for the users.
+They will not be allowed to edit locked pages, they will only be able to
+delete pages that the [[plugins/remove]] configuration allows them to
+remove, and they will only be allowed to add non-page attachments that the
+[[plugins/attachment]] configuration allows.
+
+To enable this, you need to set up the git repository to have multiple
+committers. Trusted committers, including the user that ikiwiki runs as,
+will not have their commits checked by the `pre-receive` hook. Untrusted
+committers will have their commits checked. The configuration settings to
+enable are `git_test_receive_wrapper`, which enables generation of a
+`pre-receive` hook, and `untrusted_committers`, which is a list of
+usernames of the untrusted committers.
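In the setup file that combination looks something like this (the wrapper path and username are hypothetical):

    # generate a pre-receive hook at this path in the bare repository
    git_test_receive_wrapper => "/path/to/repo.git/hooks/pre-receive",
    # unix users whose pushes are checked against the web edit limits
    untrusted_committers => ["anon"],
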
+
+Note that when the `pre-receive` hook is checking incoming changes, it
+ignores the git authorship information, and uses the username of the unix
+user who made the commit. Then tests including the `locked_pages`
+[[ikiwiki/PageSpec]]
+are checked to see if that user can edit the pages in the commit.
+
+You can even set up an [[anonymous_user|tips/untrusted_git_push]], to allow
+anyone to push changes in via git rather than using the web interface.
+
+## Optionally using a local wiki to preview changes
+
+When working on your wiki,
+it is common (but optional) practice to preview your changes using a
+private wiki on the local host before publishing the updates by
+sending it to the root repository. If you do want to set up a private
+wiki, you will have to have another setup file and an ikiwiki
+installation on your local machine. You will need all the packages
+this implies -- a web server, git, ikiwiki, etc. However, there is a
+_caveat_: by default, ikiwiki pulls and pushes from `origin`. This is
+not ideal for the working clones on the local machine, since you might
+go through several iterations of a page before pushing to the bare
+root of the repository tree (and thus publishing it on your public wiki).
+You do not want the action of refreshing the local wiki in order to
+review your work to accidentally publish the
+contents before you are ready. In order to prevent the git push that
+is the normal behaviour of ikiwiki, set the configuration of the local wiki:
+
+ gitorigin_branch => "",
+ ## git post-commit wrapper
+ git_wrapper => "/working/dir/.git/hooks/post-commit",
+
+Then just committing should refresh the private ikiwiki on the local
+host. Now just run `ikiwiki -setup localwiki.setup -gettime` and
+you should be good to go. (You only need the slow `-gettime` option
+the first time you run setup.) Use standard git commands to handle
+pulling from and pushing to the server. **Note**: After
+pulling changes from the bare root repository, you will need to
+manually update the local wiki, with a command such as `ikiwiki
+-setup localwiki.setup -refresh`. You could use git's `post-merge` hook
+to automate that command.
+
+## Using ikiwiki with Gerrit
+
+[Gerrit Code Review](https://code.google.com/p/gerrit/) manages a set of Git
+repositories and provides a web interface to review and merge commits. You can
+configure ikiwiki to work with a Gerrit-managed repository, allowing you to
+review and merge commits to your wiki.
+
+First, create your initial wiki repository with Gerrit. On the server, as the
+user that will own the wiki, clone that repository to create a working
+directory for ikiwiki, such as /srv/wiki/ikiwiki-checkout. Create a setup file
+and target directory as usual, referencing that working directory path, and
+creating a post-update hook in Gerrit's repository. You'll need to set
+appropriate permissions on the hook directory for the repository so that the
+user running ikiwiki can compile and install the post-update hook. Also note
+that you must disable web editing by disabling the editpage plugin, and you
+must not enable any other plugin that commits to the repository, since ikiwiki
+will not have permission to push to the repository. (Allowing web edits to
+have such permission would bypass Gerrit's code review, defeating the purpose.)
+
+Gerrit does not run per-repository hooks, such as the post-update hook ikiwiki
+installs to update the wiki after pushes. However, Gerrit has site-wide hooks,
+including a ref-updated hook that runs whenever a ref changes. You can use
+that hook to trigger ikiwiki's post-update hook. The following script,
+installed as Gerrit's ref-updated hook, will run the post-update hook on any
+repository that has a "gerrit-run-post-update-hook" file in it:
+
+ #!/bin/sh
+ if [ -e "$GIT_DIR/gerrit-run-post-update-hook" ] ; then
+ exec "$GIT_DIR/hooks/post-update"
+ fi
+
+Then just create gerrit-run-post-update-hook in the wiki repository, run
+ikiwiki --setup on the setup file, add your wiki to /etc/ikiwiki/wikilist, and
+start reviewing and committing wiki changes via Gerrit.
diff --git a/doc/rcs/git/discussion.mdwn b/doc/rcs/git/discussion.mdwn
new file mode 100644
index 000000000..7d39c6107
--- /dev/null
+++ b/doc/rcs/git/discussion.mdwn
@@ -0,0 +1,129 @@
+## ikiwiki + git
+
+<http://fob.po8.org/node/346>
+
+Here's an early page documenting setting up ikiwiki with git. It shouldn't be
+this hard anymore. :-) See [[setup]] --[[Joey]]
+
+## Migrating from svn to git ##
+
+I'd like to migrate from svn to git, because git is better in general but also has some nice properties that go well with my use of ikiwiki: I only change the wiki myself. I want a single git repo so that my website directory is self-contained, so that I don't need to drag around a separate svn repository on my computer. Is it possible to use ikiwiki so that it only uses a git repository in the same dir as all files are stored and edited?
+
+Otherwise, I hope migrating is just importing the svn repo to git and then setting up ikiwiki to use git. I don't plan to go back to svn after that so git-svn should only do the import.
+
+### Solution ###
+**Basis:** I only use ikiwiki as a wiki compiler. No cgi or anything.
+
+I imported my svn repo into git with `git-svnimport`. I reconfigured ikiwiki to _not use any rcs_. In `ikiwiki.setup`, I have the git repository as srcdir, and a suitable dstdir.
+
+Then, in my git repository, I added this `post-commit` hook to refresh the wiki:
+
+ #!/bin/sh
+
+ # to refresh when changes happen
+
+ BASE="/path/to/base/dir"
+ SETUPFILE="$BASE/ikiwiki.setup"
+ UNDERLAYDIR="$BASE/underlay"
+
+ ikiwiki --refresh --setup "$SETUPFILE" --underlaydir="$UNDERLAYDIR" --verbose
+
+Positives:
+
+* Containment: I only have the above `$BASE` directory to backup: it contains the srcdir and setup files. No external svn repository. This means that suddenly `git` and `ikiwiki` pair into a stand-alone self-contained wiki compiler kit.
+
+UlrikSverdrup (This is now crossposted to the above mentioned [website][ulrikweb])
+
+[ulrikweb]: http://www.student.lu.se/~cif04usv/wiki/stuff/git.html
+
+> Note that while the post-commit hook above may work in some situations, it *will* fail (or at least be suboptimal) for web commits. If you're setting up ikiwiki and git for a wiki that allows web commits, you should use
+> the repository and hook setup documented in [[setup]] instead. With that method, you do end up with two separate git repos; but it's fine to only back one of them up. :-) --[[Joey]]
+
+## gitmanual
+
+Main use case I am trying to accomplish: Edit wiki pages offline.
+
+1. Imagine you're the administrator of the site and you want to checkout the wiki sources to give them some love while on a train journey.
+2. Or you are writing a complex document and you want to simply use your favourite $EDITOR
+3. Learn a little more about [git](http://git.or.cz/)
+
+# Workflow
+
+## on webconverger.org aka si.dabase.com aka hendry machine
+
+Wiki page created with [ikiwiki](http://ikiwiki.info). Example usb.mdwn [usb](http://webconverger.org/usb/)
+
+## on monty (my laptop)
+
+ git-clone ssh://si.dabase.com/home/hendry/wikiwc/.git/
+
+You might want to set some config variables like your email as this [tutorial](http://www.kernel.org/pub/software/scm/git/docs/tutorial.html) describes.
+
+ echo "blah" >> usb.mdwn
+
+Then to commit:
+
+ git-commit -a -m "added test"
+
+Send back:
+
+ git push origin
+
+## on webconverger.org aka si.dabase.com aka hendry machine
+
+You should set up the git `post-update` wrapper in the **ikiwiki.setup** file.
+
+Then the wiki should be up-to-date! :)
+
+# Ack
+
+Thanks to gitte on #git on Freenode and of course joeyh. Have a look at [[rcs/details]].
+
+## Too many pages about git?
+I think it would be a good thing if the various git pages were somehow unified. It seems to me that [[tips/laptop_wiki_with_git]] is currently not so different from [[git]]. Let us see what joeyh thinks about the new git pages, but if this level of detail is to go there, I think it could pretty much include (maybe as sub pages) the info in [[tips/laptop_wiki_with_git]] and [[tips/laptop_wiki_with_git_extended]] --[[DavidBremner]]
+
+# Does 'push' from the shallow clones work for you? git-clone and git-fetch explicitly state it doesn't...
+
+-------
+
+## Permissions for web users and local users editing and creating pages
+What is the right permissions setup for a situation where both web and local users will be editing and creating pages?
+My usage is this: I have a repository /srv/git/wiki.git chowned to me:apache with 775/664 permissions recursively (where 'me' is my account and the ikiwiki administrator), a /srv/www/ikisrc chowned to apache:apache, and a /srv/www/html/wiki chowned to apache:apache. As is, I can commit to the wiki.git repo (because it is owned by me) and web users can commit to it as well (because the group also has write access). But what happens when I create a new page from either of those sources? For example, the apache user running ikiwiki.cgi would create /srv/www/ikisrc/something.mdwn, commit and push it to /srv/git/wiki.git, but that new object is owned by apache:apache. If I then try to commit a change to something.mdwn from a cloned repo sitting on my laptop, for example, will the commit not fail because apache created the files?
+
+Does that mean that apache:apache should just own everything, and I should only commit through that user (via git:// protocol only, maybe, or ssh as apache instead of myself)? For some reason, my head can't quite wrap itself around the whole permissions issue. Thanks. --mrled
+
+> Ikiwiki is designed so that you don't have to worry about this kind of permissions issue.
+> Instead you can just configure the ikiwiki.cgi, in the setup file, to be suid to your
+> user. Then there's no need to let the web server's user modify files at all. --[[Joey]]
+
+
+## using a local wiki to preview changes: an srcdir needed?
+I have read the hints about using a local wiki to preview changes, but I haven't understood: is it assumed that I should also have a separate "srcdir" for this local preview-wiki (as it is done for the main wiki site), or I could point the local ikiwiki's "srcdir" to the working dir? Can something bad happen if I do this? I guess no, because--as I see it--the reason to have 2 repos for the main site was only enabling pushing to it, so it's a peculiarity of git, and not a requirement for a clean functioning of ikiwiki.
+
+Ok, probably, I have answered my question myself, but I'll let this comment stay here, in case someone else is thinking about the same issue. --Ivan Z.
+
+## Fix for error on git pull origin
+
+Error message when running git pull origin:
+
+ You asked me to pull without telling me which branch you
+ want to merge with, and 'branch.master.merge' in
+ your configuration file does not tell me either. Please
+ name which branch you want to merge on the command line and
+ try again (e.g. 'git pull <repository> <refspec>').
+ See git-pull(1) for details on the refspec.
+
+ If you often merge with the same branch, you may want to
+ configure the following variables in your configuration
+ file:
+
+ branch.master.remote = <nickname>
+ branch.master.merge = <remote-ref>
+ remote.<nickname>.url = <url>
+ remote.<nickname>.fetch = <refspec>
+
+ See git-config(1) for details.
+
+The solution is to run this command in your srcdir:
+
+ git config branch.master.remote origin
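A sketch of the fix in a scratch repository, also setting `branch.master.merge` as the error message suggests (both settings can be checked back with `git config --get`):

```shell
# reproduce the fix in a throwaway repo; in practice run this in your srcdir
tmp=$(mktemp -d)
git init -q -b master "$tmp/srcdir"
cd "$tmp/srcdir"
git config branch.master.remote origin
git config branch.master.merge refs/heads/master
git config --get branch.master.remote
```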
diff --git a/doc/rcs/git/wiki_edit_flow.svg b/doc/rcs/git/wiki_edit_flow.svg
new file mode 100644
index 000000000..200a3439d
--- /dev/null
+++ b/doc/rcs/git/wiki_edit_flow.svg
@@ -0,0 +1,705 @@
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<!-- Created with Inkscape (http://www.inkscape.org/) -->
+
+<svg
+ xmlns:dc="http://purl.org/dc/elements/1.1/"
+ xmlns:cc="http://creativecommons.org/ns#"
+ xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
+ xmlns:svg="http://www.w3.org/2000/svg"
+ xmlns="http://www.w3.org/2000/svg"
+ xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
+ xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
+ width="493.90625"
+ height="548.64734"
+ id="svg2"
+ version="1.1"
+ inkscape:version="0.48.1 r9760"
+ sodipodi:docname="wiki_edit_flow.svg">
+ <defs
+ id="defs4">
+ <marker
+ inkscape:stockid="Arrow2Lend"
+ orient="auto"
+ refY="0"
+ refX="0"
+ id="Arrow2Lend"
+ style="overflow:visible">
+ <path
+ id="path3914"
+ style="font-size:12px;fill-rule:evenodd;stroke-width:0.625;stroke-linejoin:round"
+ d="M 8.7185878,4.0337352 -2.2072895,0.01601326 8.7185884,-4.0017078 c -1.7454984,2.3720609 -1.7354408,5.6174519 -6e-7,8.035443 z"
+ transform="matrix(-1.1,0,0,-1.1,-1.1,0)"
+ inkscape:connector-curvature="0" />
+ </marker>
+ <marker
+ inkscape:stockid="Arrow1Lend"
+ orient="auto"
+ refY="0"
+ refX="0"
+ id="Arrow1Lend"
+ style="overflow:visible">
+ <path
+ id="path3896"
+ d="M 0,0 5,-5 -12.5,0 5,5 0,0 z"
+ style="fill-rule:evenodd;stroke:#000000;stroke-width:1pt;marker-start:none"
+ transform="matrix(-0.8,0,0,-0.8,-10,0)"
+ inkscape:connector-curvature="0" />
+ </marker>
+ <marker
+ inkscape:stockid="Arrow1Lstart"
+ orient="auto"
+ refY="0"
+ refX="0"
+ id="Arrow1Lstart"
+ style="overflow:visible">
+ <path
+ id="path3893"
+ d="M 0,0 5,-5 -12.5,0 5,5 0,0 z"
+ style="fill-rule:evenodd;stroke:#000000;stroke-width:1pt;marker-start:none"
+ transform="matrix(0.8,0,0,0.8,10,0)"
+ inkscape:connector-curvature="0" />
+ </marker>
+ <linearGradient
+ id="linearGradient3767">
+ <stop
+ style="stop-color:#efbc00;stop-opacity:1;"
+ offset="0"
+ id="stop3769" />
+ <stop
+ id="stop3775"
+ offset="0.93150687"
+ style="stop-color:#ffcb10;stop-opacity:1;" />
+ <stop
+ style="stop-color:#ffffff;stop-opacity:1;"
+ offset="1"
+ id="stop3771" />
+ </linearGradient>
+ <marker
+ inkscape:stockid="Arrow2Lend"
+ orient="auto"
+ refY="0"
+ refX="0"
+ id="Arrow2Lend-4"
+ style="overflow:visible">
+ <path
+ inkscape:connector-curvature="0"
+ id="path3914-9"
+ style="font-size:12px;fill-rule:evenodd;stroke-width:0.625;stroke-linejoin:round"
+ d="M 8.7185878,4.0337352 -2.2072895,0.01601326 8.7185884,-4.0017078 c -1.7454984,2.3720609 -1.7354408,5.6174519 -6e-7,8.035443 z"
+ transform="matrix(-1.1,0,0,-1.1,-1.1,0)" />
+ </marker>
+ <marker
+ inkscape:stockid="Arrow2Lend"
+ orient="auto"
+ refY="0"
+ refX="0"
+ id="marker5456"
+ style="overflow:visible">
+ <path
+ inkscape:connector-curvature="0"
+ id="path5458"
+ style="font-size:12px;fill-rule:evenodd;stroke-width:0.625;stroke-linejoin:round"
+ d="M 8.7185878,4.0337352 -2.2072895,0.01601326 8.7185884,-4.0017078 c -1.7454984,2.3720609 -1.7354408,5.6174519 -6e-7,8.035443 z"
+ transform="matrix(-1.1,0,0,-1.1,-1.1,0)" />
+ </marker>
+ <marker
+ inkscape:stockid="Arrow2Lend"
+ orient="auto"
+ refY="0"
+ refX="0"
+ id="Arrow2Lend-3"
+ style="overflow:visible">
+ <path
+ inkscape:connector-curvature="0"
+ id="path3914-6"
+ style="font-size:12px;fill-rule:evenodd;stroke-width:0.625;stroke-linejoin:round"
+ d="M 8.7185878,4.0337352 -2.2072895,0.01601326 8.7185884,-4.0017078 c -1.7454984,2.3720609 -1.7354408,5.6174519 -6e-7,8.035443 z"
+ transform="matrix(-1.1,0,0,-1.1,-1.1,0)" />
+ </marker>
+ <marker
+ inkscape:stockid="Arrow2Lend"
+ orient="auto"
+ refY="0"
+ refX="0"
+ id="marker5456-4"
+ style="overflow:visible">
+ <path
+ inkscape:connector-curvature="0"
+ id="path5458-7"
+ style="font-size:12px;fill-rule:evenodd;stroke-width:0.625;stroke-linejoin:round"
+ d="M 8.7185878,4.0337352 -2.2072895,0.01601326 8.7185884,-4.0017078 c -1.7454984,2.3720609 -1.7354408,5.6174519 -6e-7,8.035443 z"
+ transform="matrix(-1.1,0,0,-1.1,-1.1,0)" />
+ </marker>
+ <marker
+ inkscape:stockid="Arrow2Lend"
+ orient="auto"
+ refY="0"
+ refX="0"
+ id="Arrow2Lend-5"
+ style="overflow:visible">
+ <path
+ inkscape:connector-curvature="0"
+ id="path3914-92"
+ style="font-size:12px;fill-rule:evenodd;stroke-width:0.625;stroke-linejoin:round"
+ d="M 8.7185878,4.0337352 -2.2072895,0.01601326 8.7185884,-4.0017078 c -1.7454984,2.3720609 -1.7354408,5.6174519 -6e-7,8.035443 z"
+ transform="matrix(-1.1,0,0,-1.1,-1.1,0)" />
+ </marker>
+ <marker
+ inkscape:stockid="Arrow2Lend"
+ orient="auto"
+ refY="0"
+ refX="0"
+ id="marker5456-3"
+ style="overflow:visible">
+ <path
+ inkscape:connector-curvature="0"
+ id="path5458-78"
+ style="font-size:12px;fill-rule:evenodd;stroke-width:0.625;stroke-linejoin:round"
+ d="M 8.7185878,4.0337352 -2.2072895,0.01601326 8.7185884,-4.0017078 c -1.7454984,2.3720609 -1.7354408,5.6174519 -6e-7,8.035443 z"
+ transform="matrix(-1.1,0,0,-1.1,-1.1,0)" />
+ </marker>
+ <marker
+ inkscape:stockid="Arrow2Lend"
+ orient="auto"
+ refY="0"
+ refX="0"
+ id="Arrow2Lend-36"
+ style="overflow:visible">
+ <path
+ inkscape:connector-curvature="0"
+ id="path3914-5"
+ style="font-size:12px;fill-rule:evenodd;stroke-width:0.625;stroke-linejoin:round"
+ d="M 8.7185878,4.0337352 -2.2072895,0.01601326 8.7185884,-4.0017078 c -1.7454984,2.3720609 -1.7354408,5.6174519 -6e-7,8.035443 z"
+ transform="matrix(-1.1,0,0,-1.1,-1.1,0)" />
+ </marker>
+ <marker
+ inkscape:stockid="Arrow2Lend"
+ orient="auto"
+ refY="0"
+ refX="0"
+ id="marker5532"
+ style="overflow:visible">
+ <path
+ inkscape:connector-curvature="0"
+ id="path5534"
+ style="font-size:12px;fill-rule:evenodd;stroke-width:0.625;stroke-linejoin:round"
+ d="M 8.7185878,4.0337352 -2.2072895,0.01601326 8.7185884,-4.0017078 c -1.7454984,2.3720609 -1.7354408,5.6174519 -6e-7,8.035443 z"
+ transform="matrix(-1.1,0,0,-1.1,-1.1,0)" />
+ </marker>
+ </defs>
+ <sodipodi:namedview
+ id="base"
+ pagecolor="#ffffff"
+ bordercolor="#666666"
+ borderopacity="1.0"
+ inkscape:pageopacity="0.0"
+ inkscape:pageshadow="2"
+ inkscape:zoom="1.0885159"
+ inkscape:cx="281.26331"
+ inkscape:cy="219.65103"
+ inkscape:document-units="px"
+ inkscape:current-layer="layer1"
+ showgrid="false"
+ showguides="true"
+ inkscape:guide-bbox="true"
+ inkscape:snap-global="true"
+ inkscape:window-width="1280"
+ inkscape:window-height="995"
+ inkscape:window-x="1280"
+ inkscape:window-y="0"
+ inkscape:window-maximized="1"
+ fit-margin-top="25"
+ fit-margin-left="25"
+ fit-margin-right="25"
+ fit-margin-bottom="25">
+ <inkscape:grid
+ type="xygrid"
+ id="grid2985"
+ empspacing="5"
+ visible="true"
+ enabled="true"
+ snapvisiblegridlinesonly="true" />
+ </sodipodi:namedview>
+ <metadata
+ id="metadata7">
+ <rdf:RDF>
+ <cc:Work
+ rdf:about="">
+ <dc:format>image/svg+xml</dc:format>
+ <dc:type
+ rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
+ <dc:title></dc:title>
+ </cc:Work>
+ </rdf:RDF>
+ </metadata>
+ <g
+ inkscape:label="Layer 1"
+ inkscape:groupmode="layer"
+ id="layer1"
+ transform="translate(-159.65625,-106.875)">
+ <rect
+ style="fill:none;stroke:#000000;stroke-width:0.70866144;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4;stroke-opacity:1;stroke-dasharray:none;stroke-dashoffset:0"
+ id="rect3866"
+ width="220.00006"
+ height="79.999939"
+ x="184.99994"
+ y="142.36218"
+ ry="10"
+ rx="10" />
+ <path
+ style="fill:#ffcb14;fill-opacity:1;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1"
+ d="m 355,182.36218 55,0 0,-45 -25,0 -5,-5 -20,0 -5,5 z"
+ id="path2989"
+ inkscape:connector-curvature="0" />
+ <text
+ xml:space="preserve"
+ style="font-size:40px;font-style:normal;font-weight:normal;line-height:125%;letter-spacing:0px;word-spacing:0px;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="368.57144"
+ y="225.21931"
+ id="text2995"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan2997"
+ x="368.57144"
+ y="225.21931" /></text>
+ <path
+ style="fill:#ffcb14;fill-opacity:1;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1"
+ d="m 355,322.36218 55,0 0,-45 -25,0 -5,-5 -20,0 -5,5 z"
+ id="path2989-4"
+ inkscape:connector-curvature="0" />
+ <path
+ style="fill:#ffcb14;fill-opacity:1;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1"
+ d="m 355,457.36218 55,0 0,-45 -25,0 -5,-5 -20,0 -5,5 z"
+ id="path2989-1"
+ inkscape:connector-curvature="0" />
+ <path
+ style="fill:#ffcb14;fill-opacity:1;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1"
+ d="m 355,597.36218 55,0 0,-45 -25,0 -5,-5 -20,0 -5,5 z"
+ id="path2989-5"
+ inkscape:connector-curvature="0" />
+ <path
+ style="opacity:0.48800001;fill:#ffcb14;fill-opacity:1;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1"
+ d="m 475,597.36218 55,0 0,-45 -25,0 -5,-5 -20,0 -5,5 z"
+ id="path2989-2"
+ inkscape:connector-curvature="0" />
+ <path
+ style="opacity:0.5;fill:#ffcb14;fill-opacity:1;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1"
+ d="m 235,597.36218 55,0 0,-45 -25,0 -5,-5 -20,0 -5,5 z"
+ id="path2989-8"
+ inkscape:connector-curvature="0" />
+ <text
+ xml:space="preserve"
+ style="font-size:40px;font-style:normal;font-weight:normal;text-align:center;line-height:125%;letter-spacing:0px;word-spacing:0px;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="386.06738"
+ y="626.36218"
+ id="text3868"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan3870"
+ x="386.06738"
+ y="626.36218"
+ style="font-size:20px;text-align:center;text-anchor:middle">working clones</tspan></text>
+ <text
+ xml:space="preserve"
+ style="font-size:18px;font-style:normal;font-weight:normal;text-align:center;line-height:125%;letter-spacing:0px;word-spacing:0px;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="524.37988"
+ y="437.36218"
+ id="text3874"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan3876"
+ x="524.37988"
+ y="437.36218"
+ style="font-size:20px;text-align:center;text-anchor:middle">repository</tspan></text>
+ <text
+ xml:space="preserve"
+ style="font-size:16px;font-style:normal;font-weight:normal;text-align:center;line-height:125%;letter-spacing:0px;word-spacing:0px;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="525.35156"
+ y="304.36218"
+ id="text3878"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan3880"
+ x="525.35156"
+ y="304.36218"
+ style="font-size:20px;text-align:center;text-anchor:middle">srcdir</tspan></text>
+ <text
+ xml:space="preserve"
+ style="font-size:16px;font-style:normal;font-weight:normal;text-align:center;line-height:125%;letter-spacing:0px;word-spacing:0px;text-anchor:middle;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="525.1543"
+ y="165.36218"
+ id="text3882"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan3884"
+ x="525.1543"
+ y="165.36218"
+ style="font-size:20px;text-align:center;text-anchor:middle">destdir</tspan></text>
+ <g
+ id="g5440"
+ transform="translate(5,25.000003)">
+ <path
+ sodipodi:nodetypes="cc"
+ inkscape:connector-curvature="0"
+ id="path3886"
+ d="m 370,512.36218 c -5,-24.99999 -5,-44.99999 0,-70"
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-start:none;marker-mid:none;marker-end:url(#Arrow2Lend)" />
+ <path
+ sodipodi:nodetypes="cc"
+ inkscape:connector-curvature="0"
+ id="path3888"
+ d="m 390,442.36218 c 5,25 5,45 0,70"
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-end:url(#Arrow2Lend)" />
+ </g>
+ <g
+ transform="translate(5,-110)"
+ id="g5440-4">
+ <path
+ sodipodi:nodetypes="cc"
+ inkscape:connector-curvature="0"
+ id="path3886-8"
+ d="m 370,512.36218 c -5,-24.99999 -5,-44.99999 0,-70"
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-start:none;marker-mid:none;marker-end:url(#Arrow2Lend)" />
+ <path
+ sodipodi:nodetypes="cc"
+ inkscape:connector-curvature="0"
+ id="path3888-9"
+ d="m 390,442.36218 c 5,25 5,45 0,70"
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-end:url(#Arrow2Lend)" />
+ </g>
+ <g
+ transform="matrix(0.71872744,0.69529193,-0.69529193,0.71872744,353.78964,-104.94206)"
+ id="g5440-47">
+ <path
+ sodipodi:nodetypes="cc"
+ inkscape:connector-curvature="0"
+ id="path3886-3"
+ d="m 370,512.36218 c -5,-24.99999 -0.0778,-66.9912 7.34379,-88.08431"
+ style="opacity:0.5;fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-start:none;marker-mid:none;marker-end:url(#Arrow2Lend)" />
+ <path
+ sodipodi:nodetypes="cc"
+ inkscape:connector-curvature="0"
+ id="path3888-6"
+ d="m 391.48399,424.51223 c 5,25 6.0155,63.74804 -1.48399,87.84995"
+ style="opacity:0.5;fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-end:url(#Arrow2Lend)" />
+ </g>
+ <g
+ transform="matrix(-0.71872744,0.69529193,0.69529193,0.71872744,421.21036,-104.94206)"
+ id="g5440-47-9"
+ style="opacity:0.5">
+ <path
+ sodipodi:nodetypes="cc"
+ inkscape:connector-curvature="0"
+ id="path3886-3-3"
+ d="m 370,512.36218 c -5,-24.99999 -0.0778,-66.9912 7.34379,-88.08431"
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-start:none;marker-mid:none;marker-end:url(#Arrow2Lend)" />
+ <path
+ sodipodi:nodetypes="cc"
+ inkscape:connector-curvature="0"
+ id="path3888-6-3"
+ d="m 391.48399,424.51223 c 5,25 6.0155,63.74804 -1.48399,87.84995"
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-end:url(#Arrow2Lend)" />
+ </g>
+ <path
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-end:url(#marker5532)"
+ d="m 380,262.36218 0,-60"
+ id="path5558"
+ inkscape:connector-curvature="0" />
+ <g
+ id="g5810"
+ transform="translate(0,-9)">
+ <g
+ transform="translate(-230,-4.9999974)"
+ id="g3784-7">
+ <g
+ id="g3779-37">
+ <path
+ inkscape:connector-curvature="0"
+ id="path2993-5"
+ d="m 440,177.36218 10,0 0,-10"
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" />
+ <path
+ sodipodi:nodetypes="cccccc"
+ inkscape:connector-curvature="0"
+ id="path2991-3"
+ d="m 440,177.36218 0,40 35,0 0,-50 -25,0 z"
+ style="fill:#ffffff;fill-opacity:1;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" />
+ <path
+ inkscape:connector-curvature="0"
+ id="path3777-8"
+ d="m 440,177.36218 10,0 0,-10"
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" />
+ </g>
+ </g>
+ <g
+ transform="translate(-235,-9.9999974)"
+ id="g3784">
+ <g
+ id="g3779">
+ <path
+ inkscape:connector-curvature="0"
+ id="path2993"
+ d="m 440,177.36218 10,0 0,-10"
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" />
+ <path
+ sodipodi:nodetypes="cccccc"
+ inkscape:connector-curvature="0"
+ id="path2991"
+ d="m 440,177.36218 0,40 35,0 0,-50 -25,0 z"
+ style="fill:#ffffff;fill-opacity:1;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" />
+ <path
+ inkscape:connector-curvature="0"
+ id="path3777"
+ d="m 440,177.36218 10,0 0,-10"
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" />
+ </g>
+ </g>
+ <text
+ sodipodi:linespacing="125%"
+ id="text5762"
+ y="176.55017"
+ x="206.62401"
+ style="font-size:16px;font-style:normal;font-weight:normal;line-height:125%;letter-spacing:0px;word-spacing:0px;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ xml:space="preserve"><tspan
+ style="font-size:8px"
+ y="176.55017"
+ x="206.62401"
+ id="tspan5764"
+ sodipodi:role="line">&lt;html&gt;</tspan></text>
+ </g>
+ <g
+ id="g5824"
+ transform="translate(0,-9)">
+ <g
+ transform="translate(-165,-9.9999974)"
+ id="g3784-0">
+ <g
+ id="g3779-3">
+ <path
+ inkscape:connector-curvature="0"
+ id="path2993-2"
+ d="m 440,177.36218 10,0 0,-10"
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" />
+ <path
+ sodipodi:nodetypes="cccccc"
+ inkscape:connector-curvature="0"
+ id="path2991-8"
+ d="m 440,177.36218 0,40 35,0 0,-50 -25,0 z"
+ style="fill:#ffffff;fill-opacity:1;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" />
+ <path
+ inkscape:connector-curvature="0"
+ id="path3777-7"
+ d="m 440,177.36218 10,0 0,-10"
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" />
+ </g>
+ </g>
+ <g
+ transform="matrix(0.74161576,0,0,0.74161576,75.250882,53.354937)"
+ id="g5772">
+ <path
+ sodipodi:type="star"
+ style="fill:#939393;fill-opacity:1;stroke:#939393;stroke-width:0.70866144;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4;stroke-opacity:1;stroke-dasharray:none;stroke-dashoffset:0"
+ id="path5768"
+ sodipodi:sides="13"
+ sodipodi:cx="295"
+ sodipodi:cy="187.36218"
+ sodipodi:r1="10.889445"
+ sodipodi:r2="14.142136"
+ sodipodi:arg1="-2.3561945"
+ sodipodi:arg2="-2.1145335"
+ inkscape:flatsided="false"
+ inkscape:rounded="0.36"
+ inkscape:randomized="0"
+ d="m 287.3,179.66218 c 1.12496,-1.12496 -0.97775,-3.57952 0.38374,-4.40257 1.36149,-0.82305 2.55772,2.1795 4.07662,1.70619 1.5189,-0.47331 0.79773,-3.62389 2.38576,-3.71995 1.58803,-0.0961 1.25188,3.11848 2.81676,3.40526 1.56487,0.28677 2.39046,-2.83808 3.84123,-2.18514 1.45078,0.65294 -0.34074,3.34306 0.91162,4.32422 1.25235,0.98116 3.43557,-1.40209 4.41673,-0.14973 0.98116,1.25236 -1.85532,2.80178 -1.20238,4.25255 0.65294,1.45078 3.69363,0.35511 3.98041,1.91998 0.28677,1.56488 -2.94485,1.61865 -3.04091,3.20668 -0.0961,1.58803 3.10552,2.03094 2.63221,3.54984 -0.47331,1.5189 -3.35976,0.0647 -4.18281,1.42619 -0.82305,1.3615 1.80598,3.24152 0.68102,4.36648 -1.12496,1.12496 -3.00498,-1.50407 -4.36648,-0.68101 -1.36149,0.82305 0.0927,3.7095 -1.42619,4.1828 -1.5189,0.47331 -1.96181,-2.72827 -3.54984,-2.63221 -1.58803,0.0961 -1.64181,3.32768 -3.20668,3.04091 -1.56488,-0.28678 -0.4692,-3.32746 -1.91998,-3.9804 -1.45077,-0.65294 -3.00019,2.18353 -4.25255,1.20237 -1.25236,-0.98116 1.13089,-3.16437 0.14973,-4.41673 -0.98116,-1.25236 -3.67128,0.53916 -4.32422,-0.91161 -0.65294,-1.45078 2.47191,-2.27636 2.18513,-3.84124 -0.28677,-1.56488 -3.50131,-1.22873 -3.40525,-2.81676 0.096,-1.58803 3.24664,-0.86686 3.71995,-2.38576 0.47331,-1.5189 -2.52925,-2.71513 -1.70619,-4.07662 0.82305,-1.36149 3.27761,0.74122 4.40257,-0.38374 z"
+ inkscape:transform-center-x="-0.68364368"
+ inkscape:transform-center-y="0.68364368"
+ transform="translate(-2,0)" />
+ <path
+ sodipodi:type="arc"
+ style="fill:#ffffff;fill-opacity:1;stroke:#939393;stroke-width:0.70866144;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4;stroke-opacity:1;stroke-dasharray:none;stroke-dashoffset:0"
+ id="path5770"
+ sodipodi:cx="295"
+ sodipodi:cy="187.36218"
+ sodipodi:rx="5"
+ sodipodi:ry="5"
+ d="m 300,187.36218 a 5,5 0 1 1 -10,0 5,5 0 1 1 10,0 z"
+ transform="matrix(1.4,0,0,1.4,-120,-74.944873)" />
+ </g>
+ </g>
+ <text
+ xml:space="preserve"
+ style="font-size:16px;font-style:normal;font-weight:normal;line-height:125%;letter-spacing:0px;word-spacing:0px;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="265"
+ y="211.36218"
+ id="text5806"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan5808"
+ x="265"
+ y="211.36218"
+ style="font-size:12px">ikiwiki.cgi</tspan></text>
+ <path
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-end:url(#marker5532)"
+ d="m 295,217.36218 c 10,40 25,65 55,85"
+ id="path5834"
+ inkscape:connector-curvature="0"
+ sodipodi:nodetypes="cc" />
+ <text
+ xml:space="preserve"
+ style="font-size:16px;font-style:normal;font-weight:normal;text-align:end;line-height:125%;letter-spacing:0px;word-spacing:0px;text-anchor:end;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="351.31982"
+ y="362.36218"
+ id="text6240"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan6242"
+ x="351.31982"
+ y="362.36218"
+ style="font-size:14px;text-align:end;writing-mode:lr-tb;text-anchor:end">post-update</tspan><tspan
+ sodipodi:role="line"
+ x="351.31982"
+ y="379.86218"
+ style="font-size:14px;text-align:end;writing-mode:lr-tb;text-anchor:end"
+ id="tspan6244">hook</tspan></text>
+ <text
+ xml:space="preserve"
+ style="font-size:16px;font-style:normal;font-weight:normal;line-height:125%;letter-spacing:0px;word-spacing:0px;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="420"
+ y="362.36218"
+ id="text6246"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan6248"
+ x="420"
+ y="362.36218"
+ style="font-size:14px">ikiwiki.cgi</tspan><tspan
+ sodipodi:role="line"
+ x="420"
+ y="379.86218"
+ id="tspan6250"
+ style="font-size:14px">push</tspan></text>
+ <text
+ xml:space="preserve"
+ style="font-size:16px;font-style:normal;font-weight:normal;line-height:125%;letter-spacing:0px;word-spacing:0px;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="382"
+ y="316.36218"
+ id="text6252"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan6254"
+ x="382"
+ y="316.36218">.git</tspan></text>
+ <text
+ xml:space="preserve"
+ style="font-size:16px;font-style:normal;font-weight:normal;line-height:125%;letter-spacing:0px;word-spacing:0px;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="383"
+ y="592.36218"
+ id="text6252-3"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan6254-1"
+ x="383"
+ y="592.36218">.git</tspan></text>
+ <text
+ xml:space="preserve"
+ style="font-size:16px;font-style:normal;font-weight:normal;line-height:125%;letter-spacing:0px;word-spacing:0px;opacity:0.3;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="502"
+ y="591.36218"
+ id="text6252-3-1"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan6254-1-0"
+ x="502"
+ y="591.36218">.git</tspan></text>
+ <text
+ xml:space="preserve"
+ style="font-size:16px;font-style:normal;font-weight:normal;line-height:125%;letter-spacing:0px;word-spacing:0px;opacity:0.3;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="263"
+ y="592.36218"
+ id="text6252-3-6"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan6254-1-1"
+ x="263"
+ y="592.36218">.git</tspan></text>
+ <text
+ xml:space="preserve"
+ style="font-size:16px;font-style:normal;font-weight:normal;line-height:125%;letter-spacing:0px;word-spacing:0px;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="411"
+ y="456.36218"
+ id="text6252-3-0"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan6254-1-2"
+ x="411"
+ y="456.36218">.git</tspan></text>
+ <text
+ xml:space="preserve"
+ style="font-size:16px;font-style:normal;font-weight:normal;text-align:end;line-height:125%;letter-spacing:0px;word-spacing:0px;text-anchor:end;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="300"
+ y="262.36218"
+ id="text6372"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan6374"
+ x="300"
+ y="262.36218"
+ style="font-size:14px">web-side</tspan><tspan
+ sodipodi:role="line"
+ x="300"
+ y="279.86218"
+ id="tspan6376"
+ style="font-size:14px">edit</tspan></text>
+ <text
+ xml:space="preserve"
+ style="font-size:16px;font-style:normal;font-weight:normal;line-height:125%;letter-spacing:0px;word-spacing:0px;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="410"
+ y="232.36218"
+ id="text6378"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan6380"
+ x="410"
+ y="232.36218"
+ style="font-size:14px">automatic</tspan><tspan
+ sodipodi:role="line"
+ x="410"
+ y="249.86218"
+ id="tspan6382"
+ style="font-size:14px">rebuild</tspan></text>
+ <text
+ xml:space="preserve"
+ style="font-size:16px;font-style:normal;font-weight:normal;line-height:125%;letter-spacing:0px;word-spacing:0px;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="406.75635"
+ y="501.15298"
+ id="text6384"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan6386"
+ x="406.75635"
+ y="501.15298"
+ style="font-size:14px">git</tspan><tspan
+ sodipodi:role="line"
+ x="406.75635"
+ y="518.65295"
+ id="tspan6388"
+ style="font-size:14px">pull</tspan></text>
+ <text
+ xml:space="preserve"
+ style="font-size:16px;font-style:normal;font-weight:normal;text-align:end;line-height:125%;letter-spacing:0px;word-spacing:0px;text-anchor:end;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ x="363.62955"
+ y="504.39691"
+ id="text6390"
+ sodipodi:linespacing="125%"><tspan
+ sodipodi:role="line"
+ id="tspan6392"
+ x="363.62955"
+ y="504.39691"
+ style="font-size:14px;text-align:end;text-anchor:end">git</tspan><tspan
+ sodipodi:role="line"
+ x="363.62955"
+ y="521.89691"
+ id="tspan6394"
+ style="font-size:14px;text-align:end;text-anchor:end">push</tspan></text>
+ </g>
+</svg>
diff --git a/doc/rcs/mercurial.mdwn b/doc/rcs/mercurial.mdwn
new file mode 100644
index 000000000..ebfc35202
--- /dev/null
+++ b/doc/rcs/mercurial.mdwn
@@ -0,0 +1,18 @@
+[Mercurial](http://selenic.com/mercurial) is a distributed revision control
+system developed by Matt Mackall. Ikiwiki supports storing a wiki in a
+Mercurial repository.
+
+Ikiwiki can run as a `post-commit` and/or `incoming` hook to update a wiki whenever commits or remote pushes
+come in. When running as a [[cgi]] with Mercurial, ikiwiki automatically
+commits edited pages, and uses the Mercurial history to generate the
+[[RecentChanges]] page.
+
+Example for a `.hg/hgrc` file in `$SRCDIR`:
+
+ [hooks]
+ post-commit = ikiwiki --setup /path/to/ikiwiki.setup --post-commit
+ incoming = ikiwiki --setup /path/to/ikiwiki.setup --post-commit
+
+Do not use `commit` or `precommit` hooks, or ikiwiki will deadlock when committing in `$SRCDIR`. Also note that `--post-commit`, not `--refresh`, must be used to avoid deadlocking when pages are edited via the CGI interface.
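+
+For context, a minimal sketch of the matching setup fragment for the
+Perl-format `ikiwiki.setup` (the wikiname and paths here are hypothetical,
+not part of any real configuration):
+
+    use IkiWiki::Setup::Standard {
+        wikiname => "mywiki",
+        # srcdir is the Mercurial working copy containing .hg/hgrc
+        srcdir => "/path/to/srcdir",
+        destdir => "/path/to/destdir",
+        rcs => "mercurial",
+    };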
+
+See also [[todo/mercurial|todo/mercurial]].
diff --git a/doc/rcs/monotone.mdwn b/doc/rcs/monotone.mdwn
new file mode 100644
index 000000000..2cfcdfbf5
--- /dev/null
+++ b/doc/rcs/monotone.mdwn
@@ -0,0 +1,24 @@
+[Monotone](http://monotone.ca/) is a distributed revision control system.
+Ikiwiki supports storing a wiki in a Monotone repository and editing it
+using the [[cgi]] interface. It will use the Monotone logs to generate the
+[[RecentChanges]] page.
+
+The monotone support in ikiwiki requires the Monotone Perl module to be
+installed. (It's available from the contrib/ directory in the monotone
+source.) In particular, it needs version 0.03 or higher of that module.
+The module is available from the monotone source repository at:
+<http://viewmtn.angrygoats.net/branch/changes/net.venge.monotone>
+
+Monotone support works, but there are still a few minor missing bits (listed here so they are not forgotten):
+
+* Documentation (this page) could be improved.
+
+There is also a mismatch between the way Ikiwiki handles conflicts and the
+way Monotone handles conflicts. At present, if there is a conflict, then
+Ikiwiki will commit a revision with conflict markers before presenting it
+to the user. This is ugly, but there is no clean way to fix it at present.
+
+Also note that not all recent ikiwiki features have been implemented in the
+monotone plugin. At the moment we're missing:
+
+ * [[todo/Untrusted_push_in_Monotone]]
diff --git a/doc/rcs/svn.mdwn b/doc/rcs/svn.mdwn
new file mode 100644
index 000000000..7aa682978
--- /dev/null
+++ b/doc/rcs/svn.mdwn
@@ -0,0 +1,9 @@
+[Subversion](http://subversion.tigris.org/) is a [[revision control system|rcs]]. While ikiwiki is relatively
+independent of the underlying revision control system, and can easily be
+used without one, using it with Subversion or another revision control
+system is recommended.
+
+Ikiwiki can run as a [[post-commit]] hook to update a wiki whenever commits
+come in. When running as a [[cgi]] with Subversion, ikiwiki automatically
+commits edited pages to the Subversion repository, and uses the Subversion
+log to generate the [[RecentChanges]] page.
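+
+The hook itself can be a tiny shell script; here is a sketch (the setup
+path is hypothetical, and in practice ikiwiki generates a compiled wrapper
+that plays this role):
+
+    #!/bin/sh
+    # Subversion passes the repository path and revision number;
+    # ikiwiki only needs to know that something changed, so they
+    # are unused here.
+    REPOS="$1"
+    REV="$2"
+    ikiwiki --setup /path/to/ikiwiki.setup --post-commit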
diff --git a/doc/rcs/svn/discussion.mdwn b/doc/rcs/svn/discussion.mdwn
new file mode 100644
index 000000000..426735182
--- /dev/null
+++ b/doc/rcs/svn/discussion.mdwn
@@ -0,0 +1,13 @@
+If the user interrupts the page loading during the running of `svn commit`,
+the repository will be left in an inconsistent state. The probability of
+this happening increases with the size of the repository and the number of
+plugins installed, because these both affect how long the post-commit hook
+takes to run. (The core issue, I guess, is that we're abusing the concept
+of a "working copy" by giving everybody the same one). Here are the main
+solutions that I can see: (1) CGI queues commits so that a single process
+can act upon them sequentially, or (2) optionally divorce the `ikiwiki
+--refresh` from the `svn commit` so that commits happen faster. -- [[Ben]]
+
+I'm not aware of web servers, at least Apache, killing CGI processes when
+the user stops a page load. If this is happening, ikiwiki should be able to
+avoid it by blocking whatever signal is causing it to terminate. --[[Joey]]
diff --git a/doc/rcs/tla.mdwn b/doc/rcs/tla.mdwn
new file mode 100644
index 000000000..79eecd627
--- /dev/null
+++ b/doc/rcs/tla.mdwn
@@ -0,0 +1,13 @@
+[tla](http://wiki.gnuarch.org/) is an implementation of the
+[GNU](http://www.gnu.org/) [Arch](http://www.gnuarch.org/) revision control
+system. Ikiwiki supports storing a wiki in tla.
+
+Warning: Since tla is not being maintained, neither is this plugin, and
+using ikiwiki with tla is not recommended.
+
+Ikiwiki can run as a [[post-commit]] hook to update a wiki whenever commits
+come in. When running as a [[cgi]] with tla, ikiwiki automatically
+commits edited pages to the Arch repository, and uses the Arch
+log to generate the [[RecentChanges]] page.
+
+Note that the tla support needs the [[!cpan MailTools]] Perl module.
diff --git a/doc/recentchanges.mdwn b/doc/recentchanges.mdwn
new file mode 100644
index 000000000..3383fc703
--- /dev/null
+++ b/doc/recentchanges.mdwn
@@ -0,0 +1,7 @@
+[[!if test="enabled(meta)" then="""
+[[!meta title="RecentChanges"]]
+"""]]
+Recent changes to this wiki:
+
+[[!inline pages="internal(recentchanges/change_*) and !*/Discussion"
+template=recentchanges show=0]]
diff --git a/doc/reviewed.mdwn b/doc/reviewed.mdwn
new file mode 100644
index 000000000..14772a369
--- /dev/null
+++ b/doc/reviewed.mdwn
@@ -0,0 +1,7 @@
+This page lists [[branches]] that have been reviewed. If your branch
+shows up here, the ball is back in your court, to respond to the review and
+deal with whatever is preventing it from being merged into ikiwiki. Once
+you do, remove the "reviewed" tag.
+
+[[!inline pages="(todo/* or bugs/*) and link(/branches) and !link(bugs/done)
+and !link(todo/done) and !*/*/* and link(.)" show=0 archive=yes]]
diff --git a/doc/roadmap.mdwn b/doc/roadmap.mdwn
new file mode 100644
index 000000000..f2ff5802e
--- /dev/null
+++ b/doc/roadmap.mdwn
@@ -0,0 +1,92 @@
+This is the roadmap for ikiwiki development.
+
+# 1.0
+
+* No severe [[security]] bugs.
+* All the basic [[features]] people would expect in a wiki.
+
+Released 29 April 2006.
+
+The 1.X series is no longer supported.
+
+----
+
+# 2.0
+
+* New improved URLs to pages via `usedirs`.
+* [[plugins/OpenID]] support, enabled by default.
+* Plugin [[interface|plugins/write]] added, with some 60 [[plugins]] available,
+ greatly expanding the capabilities of ikiwiki.
+* [[Tags]], atom feeds, and generally full-fledged blogging support.
+* Fully working [[todo/utf8]].
+* Optimisations, approximately 3.5 times as fast as version 1.0.
+* Improved scalability to large numbers of pages.
+* Improved scalable [[logo]].
+* Support for additional revision control systems besides svn: git,
+ tla, mercurial.
+* Some support for other markup languages than markdown: rst, textile.
+* Unit test suite, with more than 300 tests.
+
+Released 30 April 2007.
+
+The 2.x series is now in maintenance mode. Only security fixes and fixes for
+really bad bugs will be applied going forward.
+
+----
+
+# 3.0
+
+Version 3.0 is an opportunity to make significant transitions.
+Read [[tips/upgrade_to_3.0]] for the steps you will need to
+follow when upgrading your wiki to this version.
+
+The highlights of the changes in version 3.0 include:
+
+* Support for uploading [[attachments|plugins/attachment]].
+* Can [[plugins/rename]] and [[plugins/remove]] pages and files via the web.
+* [[Web_based_setup|plugins/websetup]].
+* Blog-style [[plugins/comments]] as an alternative to Discussion pages.
+* Many other new plugins including [[plugins/htmlbalance]], [[plugins/format]],
+ [[plugins/progress]], [[plugins/color]], [[plugins/autoindex]],
+ [[plugins/cutpaste]], [[plugins/hnb]], [[plugins/creole]], [[plugins/txt]],
+ [[plugins/amazon_s3]], [[plugins/pinger]], [[plugins/pingee]],
+ [[plugins/edittemplate]]
+* The RecentChanges page is compiled statically, not generated from the CGI.
+* Support for additional revision control systems: [[rcs/bzr]],
+ [[rcs/monotone]]
+* Support for [[tips/untrusted_git_push]].
+* A new version (3.00) of the plugin API, exporting additional
+ commonly used functions from `IkiWiki.pm`.
+* Nearly everything in ikiwiki is now a plugin, from WikiLinks to page
+ editing, to RecentChanges.
+* Far too many bug fixes, features, and enhancements to list here.
+
+Released 31 December, 2008.
+
+The 3.x series is expected to undergo continuing development for some time,
+adding improvements and new features, but avoiding changes that break
+backwards compatibility.
+
+----
+
+# compatibility-breaking changes
+
+Probably incomplete list:
+
+* Drop old `--getctime` option.
+* Remove compatibility code in `loadindex` to handle old index data layouts.
+* Make pagespecs match relative by default? (see [[discussion]])
+* Flip wikilinks? (see [[todo/link_plugin_perhaps_too_general?]] and [[todo/do_not_make_links_backwards]])
+* Enable tagbase by default (so that tag autocreation will work by default).
+ Note that this is already done for wikis created by `auto-blog.setup`.
+* [[tips/html5]] on by default (some day..)
+* Remove support for old `.ikiwiki/comments_pending` from comment plugin.
+
+In general, we try to use [[ikiwiki-transition]] or forced rebuilds on
+upgrade to deal with changes that break compatibility. Some things even
+those cannot help with.
+
+# future goals
+
+* Conversion support for existing other wikis. See [[convert]].
+* [[TODO]], [[bugs]], ...
diff --git a/doc/roadmap/discussion.mdwn b/doc/roadmap/discussion.mdwn
new file mode 100644
index 000000000..8233b1990
--- /dev/null
+++ b/doc/roadmap/discussion.mdwn
@@ -0,0 +1,32 @@
+Changing pagespecs to be relative by default is quite feasible now, but it will cause
+backwards compatibility problems. Should this be marked as a future plan, perhaps at a
+major version number like 2.0? --Ethan
+
+Yes, I'm looking at making this kind of change at 2.0, added to the list.
+(Update: Didn't make it in 2.0 or 3.0...)
+However, I have doubts that it makes good sense to go relative by default.
+While it's not consistent with links, it seems to work better overall to
+have pagespecs be absolute by default, IMHO. --[[Joey]]
+
+I think after you work with ikiwiki for a while, it "makes more sense" for
+them to be absolute, but I definitely remember tripping over absolute
+pagespecs a few times when I was just starting out. Thus I think we've
+learned to accept it as natural, where a new user wouldn't.
+
+* bugs, todo, news, blog, users, and sandbox
+ are all at "toplevel", so they are equivalent whether
+ pagespecs are absolute or relative.
+* soc doesn't refer to any pages explicitly so it doesn't matter
+* various plugins have pagespecs at plugins/foo.mdwn: map, linkmap, orphans,
+ pagecount, pagestats
+ * I'd say most of these make more sense as having absolute pagespecs
+ * I note that your sitemap is at toplevel, but there's no reason
+ not to allow putting it in a special meta/ directory.
+* examples/blog and examples/software site need to have relative pagespecs,
+ but they're pretty special cases -- for a real site those things
+ will probably be toplevel
+* plugins/contrib makes more sense to inline relative (though it doesn't
+ right now)
+
+Maybe inline should use relative pagespecs by default, and other plugins
+don't? --Ethan
diff --git a/doc/robots.txt b/doc/robots.txt
new file mode 100644
index 000000000..7be87f9bd
--- /dev/null
+++ b/doc/robots.txt
@@ -0,0 +1,2 @@
+User-Agent: *
+Disallow: /ikiwiki.cgi
diff --git a/doc/sandbox.mdwn b/doc/sandbox.mdwn
new file mode 100644
index 000000000..3536e327d
--- /dev/null
+++ b/doc/sandbox.mdwn
@@ -0,0 +1,84 @@
+This is the [[SandBox]], a page anyone can edit to try out ikiwiki
+(version [[!version ]]).
+
+hello world
+
+> This is a blockquote.
+>
+> This is the first level of quoting.
+>
+> > This is a nested blockquote.
+>
+>> Without a space works too.
+>>> to three levels
+>
+> Back to the first level.
+>
+> added a line in level 1
+> and another
+
+
+Numbered list
+
+1. First item.
+ 1. Sub item.
+ 1. Number 2
+1. Another.
+1. And another..
+ 1. foo
+ 2. bar
+ 3. quz
+
+Bulleted list
+
+* item
+* *italic item*
+* item
+* one
+ * footballs; runner; unices
+ * Cool !
+
+test _this_ out.
+
+`test this code block`
+
+----
+
+[[!template id=note text="this is generated by the [[plugins/haiku]] plugin"]]
+[[!haiku hint="sandbox play"]]
+
+----
+
+## Different sorts of links:
+
+* [[Features]]
+* <http://ikiwiki.info/ikiwiki/formatting/>
+* [[different_name_for_a_WikiLink|ikiwiki/WikiLink]]
+* <http://www.gnu.org/>
+* [GNU](http://www.gnu.org/)
+* <a href="http://kitenet.net/~joey/">Joey's blog</a>
+
+----
+
+# header1
+
+## header2
+
+### header3
+
+#### header4
+
+##### header 5
+
+**bold**
+
+_italic_
+
+test ms
+
+opopopo
+----
+
+This **SandBox** is also a [[blog]]!
+
+[[!inline pages="sandbox/* and !*/Discussion" rootpage="sandbox" show="4" archive="yes"]]
diff --git a/doc/sandbox/NewPage.mdwn b/doc/sandbox/NewPage.mdwn
new file mode 100644
index 000000000..18e27fc89
--- /dev/null
+++ b/doc/sandbox/NewPage.mdwn
@@ -0,0 +1 @@
+This page uses directory hierarchies. Good.
diff --git a/doc/sandbox/hmm__44___what_kind_of_a_blog_is_this__63____41__.mdwn b/doc/sandbox/hmm__44___what_kind_of_a_blog_is_this__63____41__.mdwn
new file mode 100644
index 000000000..6b54183ae
--- /dev/null
+++ b/doc/sandbox/hmm__44___what_kind_of_a_blog_is_this__63____41__.mdwn
@@ -0,0 +1,3 @@
+test:)
+
+echo and bounce
diff --git a/doc/security.mdwn b/doc/security.mdwn
new file mode 100644
index 000000000..afefd1bc3
--- /dev/null
+++ b/doc/security.mdwn
@@ -0,0 +1,499 @@
+Let's do an ikiwiki security analysis.
+
+If you are using ikiwiki to render pages that only you can edit, do not
+generate any wrappers, and do not use the cgi, then there are no more
+security issues with this program than with cat(1). If, however, you let
+others edit pages in your wiki, then some possible security issues do need
+to be kept in mind.
+
+[[!toc levels=2]]
+
+----
+
+# Probable holes
+
+_(The list of things to fix.)_
+
+## commit spoofing
+
+Anyone with direct commit access can forge "web commit from foo" and
+make it appear on [[RecentChanges]] like foo committed. One way to avoid
+this would be to limit web commits to those done by a certain user.
+
+## other stuff to look at
+
+I have been meaning to see if any CRLF injection type things can be
+done in the CGI code.
+
+----
+
+# Potential gotchas
+
+_(Things not to do.)_
+
+## image file etc attacks
+
+If it encounters a file type it does not understand, ikiwiki just copies it
+into place. So if you let users add any kind of file they like, they can
+upload images, movies, windows executables, css files, etc (though not html
+files). If these files exploit security holes in the browser of someone
+who's viewing the wiki, that can be a security problem.
+
+Of course nobody else seems to worry about this in other wikis, so should we?
+
+People with direct commit access can upload such files
+(and if you wanted to you could block that with a pre-commit hook).
+
+The attachments plugin is not enabled by default. If you choose to
+enable it, you should make use of its powerful abilities to filter allowed
+types of attachments, and only let trusted users upload.
+
+It is possible to embed an image in a page edited over the web, by using
+`img src="data:image/png;"`. Ikiwiki's htmlscrubber only allows `data:`
+urls to be used for `image/*` mime types. It's possible that some broken
+browser might ignore the mime type and if the data provided is not an
+image, instead run it as javascript, or something evil like that. Hopefully
+not many browsers are that broken.
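+
+The whitelist idea above can be illustrated with a minimal sketch (a
+hypothetical shell illustration; ikiwiki's actual htmlscrubber is perl):

```shell
# Accept a data: url only when it declares an image/* mime type;
# anything else (eg, text/html) is rejected.
is_safe_data_url() {
    case "$1" in
        data:image/*) return 0 ;;
        *) return 1 ;;
    esac
}

is_safe_data_url "data:image/png;base64,iVBORw0KGgo=" && echo accepted
is_safe_data_url "data:text/html,<script>alert(1)</script>" || echo rejected
```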
+
+## multiple accessors of wiki directory
+
+If multiple people can directly write to the source directory ikiwiki is
+using, or to the destination directory it writes files to, then one can
+cause trouble for the other when they run ikiwiki through symlink attacks.
+
+So it's best if only one person can ever directly write to those directories.
+
+## setup files
+
+Setup files are not safe to keep in the same revision control repository
+with the rest of the wiki. Just don't do it.
+
+## page locking can be bypassed via direct commits
+
+A locked page can only be edited on the web by an admin, but anyone who is
+allowed to commit directly to the repository can bypass this. This is by
+design, although a pre-commit hook could be used to prevent editing of
+locked pages, if you really need to.
+
+## web server attacks
+
+If your web server does any parsing of special sorts of files (for example,
+server-parsed html files), then anyone you let add files to the wiki
+can try to use this to exploit your web server.
+
+----
+
+# Hopefully non-holes
+
+_(AKA, the assumptions that will be the root of most security holes...)_
+
+## exploiting ikiwiki with bad content
+
+Someone could add bad content to the wiki and hope to exploit ikiwiki.
+Note that ikiwiki runs with perl taint checks on, so this is unlikely.
+
+One fun thing in ikiwiki is its handling of a PageSpec, which involves
+translating it into perl and running the perl. Of course, this is done
+*very* carefully to guard against injecting arbitrary perl code.
+
+## publishing cgi scripts
+
+ikiwiki does not allow cgi scripts to be published as part of the wiki. Or
+rather, the script is published, but it's not marked executable (except in
+the case of "destination directory file replacement" below), so hopefully
+your web server will not run it.
+
+## suid wrappers
+
+`ikiwiki --wrapper` is intended to generate a wrapper program that
+runs ikiwiki to update a given wiki. The wrapper can in turn be made suid,
+for example to be used in a [[post-commit]] hook by people who cannot write
+to the html pages, etc.
+
+If the wrapper program is made suid, then any bugs in this wrapper would be
+security holes. The wrapper is written as securely as I know how, is based
+on code that has a history of security use long before ikiwiki, and there's
+been no problem yet.
+
+## shell exploits
+
+ikiwiki does not expose untrusted data to the shell. In fact it doesn't use
+`system(3)` at all, and the only use of backticks is on data supplied by the
+wiki admin and untainted filenames.
+
+Ikiwiki was developed and used for a long time with perl's taint checking
+turned on as a second layer of defense against shell and other exploits. Due
+to a strange [bug](http://bugs.debian.org/411786) in perl, taint checking
+is currently disabled for production builds of ikiwiki.
+
+## cgi data security
+
+When ikiwiki runs as a cgi to edit a page, it is passed the name of the
+page to edit. It has to make sure to sanitise this page, to prevent eg,
+editing of ../../../foo, or editing of files that are not part of the wiki,
+such as subversion dotfiles. This is done by sanitising the filename,
+removing disallowed characters, then making sure it doesn't start with "/"
+or contain ".." or "/.svn/", etc. Annoyingly ad-hoc, this kind of code is
+where security holes breed. It needs a test suite at the very least.
+
+## CGI::Session security
+
+I've audited this module and it is massively insecure by default. ikiwiki
+uses it in one of the few secure ways; by forcing it to write to a
+directory it controls (and not /tmp) and by setting a umask that makes the
+file not be world readable.
+
+## cgi password security
+
+Login to the wiki using [[plugins/passwordauth]] involves sending a password
+in cleartext over the net. Cracking the password only allows editing the wiki
+as that user though. If you care, you can use https, I suppose. If you do use
+https either for all of the wiki, or just the cgi access, then consider using
+the sslcookie option. Using [[plugins/openid]] is a potentially better option.
+
+## XSS holes in CGI output
+
+ikiwiki has been audited to ensure that all cgi script input/output
+is sanitised to prevent XSS attacks. For example, a user can't register
+with a username containing html code (anymore).
+
+It's difficult to know for sure if all such avenues have really been
+closed though.
+
+## HTML::Template security
+
+If the [[plugins/template]] plugin is enabled, all users can modify templates
+like any other part of the wiki. Some trusted users can modify templates
+without it too. This assumes that HTML::Template is secure
+when used with untrusted/malicious templates. (Note that includes are not
+allowed.)
+
+----
+
+# Plugins
+
+The security of [[plugins]] depends on how well they're written and what
+external tools they use. The plugins included in ikiwiki are all held to
+the same standards as the rest of ikiwiki, but with that said, here are
+some security notes for them.
+
+* The [[plugins/img]] plugin assumes that imagemagick/perlmagick are secure
+ from malformed image attacks. Imagemagick has had security holes in the
+ past. To be able to exploit such a hole, a user would need to be able to
+ upload images to the wiki.
+
+----
+
+# Fixed holes
+
+_(Unless otherwise noted, these were discovered and immediately fixed by the
+ikiwiki developers.)_
+
+## destination directory file replacement
+
+Any file in the destination directory that is a valid page filename can be
+replaced, even if it was not originally rendered from a page. For example,
+ikiwiki.cgi could be edited in the wiki, and it would write out a
+replacement. File permissions are preserved. Yipes!
+
+This was fixed by making ikiwiki check if the file it's writing to exists;
+if it does then it has to be a file that it's aware of creating before, or
+it will refuse to create it.
+
+Still, this sort of attack is something to keep in mind.
+
+## symlink attacks
+
+Could a committer trick ikiwiki into following a symlink and operating on
+some other tree that it shouldn't? svn supports symlinks, so one can get
+into the repo. ikiwiki uses File::Find to traverse the repo, and does not
+tell it to follow symlinks, but it might be possible to race replacing a
+directory with a symlink and trick it into following the link.
+
+Also, if someone checks in a symlink to /etc/passwd, ikiwiki would read and
+publish that, which could be used to expose files a committer otherwise
+wouldn't see.
+
+To avoid this, ikiwiki will skip over symlinks when scanning for pages, and
+uses locking to prevent more than one instance running at a time. The lock
+prevents one ikiwiki from running a svn up/git pull/etc at the wrong time
+to race another ikiwiki. So only attackers who can write to the working
+copy on their own can race it.
+
+## symlink + cgi attacks
+
+Similarly, a commit of a symlink could be made; ikiwiki ignores it
+because of the above, but the symlink is still there. Then editing the
+page from the web follows the symlink when reading the page
+(exposing the content), and again when saving the changed page (changing
+the content).
+
+This was fixed for page saving by making ikiwiki refuse to write to files
+that are symlinks, or that are in subdirectories that are symlinks,
+combined with the above locking.
+
+For page editing, it's fixed by ikiwiki checking to make sure that it
+already has found a page by scanning the tree, before loading it for
+editing, which as described above, also is done in a way that avoids
+symlink attacks.
+
+## underlaydir override attacks
+
+ikiwiki also scans an underlaydir for pages; this is used to provide stock
+pages to all wikis without needing to copy them into each wiki. Since ikiwiki
+internally stores only the base filename from the underlaydir or srcdir,
+and searches for a file in either directory when reading a page source,
+there is the potential for ikiwiki's scanner to reject a file from the
+srcdir for some reason (such as it being contained in a directory that is
+symlinked in), find a valid copy of the file in the underlaydir, and then
+when loading the file, mistakenly load the bad file from the srcdir.
+
+This attack is avoided by making ikiwiki refuse to add any files from the
+underlaydir if a file also exists in the srcdir with the same name.
+
+## multiple page source issues
+
+Note that I previously worried that underlay override attacks could also be
+accomplished if ikiwiki were extended to support other page markup
+languages besides markdown. However, a closer look indicates that this is
+not a problem: ikiwiki does preserve the file extension when storing the
+source filename of a page, so a file with another extension that renders to
+the same page name can't bypass the check. Ie, ikiwiki won't skip foo.rst
+in the srcdir, find foo.mdwn in the underlay, decide to render page foo and
+then read the bad foo.mdwn. Instead it will remember the .rst extension and
+only render a file with that extension.
+
+## XSS attacks in page content
+
+ikiwiki supports protecting users from their own broken browsers via the
+[[plugins/htmlscrubber]] plugin, which is enabled by default.
+
+## svn commit logs
+
+It was possible to force a whole series of svn commits to appear to
+have come just before yours, by forging svn log output. This was
+guarded against by using svn log --xml.
+
+ikiwiki escapes any html in svn commit logs to prevent other mischief.
+
+## XML::Parser
+
+XML::Parser is used by the aggregation plugin, and has some security holes.
+Bug #[378411](http://bugs.debian.org/378411) does not
+seem to affect our use, since the data is not encoded as utf-8 at that
+point. #[378412](http://bugs.debian.org/378412) could affect us, although it
+doesn't seem very exploitable. It has a simple fix, and has been fixed in
+Debian unstable.
+
+## include loops
+
+Various directives that cause one page to be included into another could
+be exploited to DOS the wiki, by causing a loop. Ikiwiki has always guarded
+against this one way or another; the current solution should detect all
+types of loops involving preprocessor directives.
+
+## Online editing of existing css and images
+
+A bug in ikiwiki allowed the web-based editor to edit any file that was in
+the wiki, not just files that are page sources. So an attacker (or a
+genuinely helpful user, which is how the hole came to light) could edit
+files like style.css. It is also theoretically possible that an attacker
+could have used this hole to edit images or other files in the wiki, with
+some difficulty, since all editing would happen in a textarea.
+
+This hole was discovered on 10 Feb 2007 and fixed the same day with the
+release of ikiwiki 1.42. A fix was also backported to Debian etch, as
+version 1.33.1. I recommend upgrading to one of these versions if your wiki
+allows web editing.
+
+## html insertion via title
+
+Missing html escaping of the title contents allowed a web-based editor to
+insert arbitrary html inside the title tag of a page. Since that part of
+the page is not processed by the htmlscrubber, evil html could be injected.
+
+This hole was discovered on 21 March 2007 and fixed the same day (er, hour)
+with the release of ikiwiki 1.46. A fix was also backported to Debian etch,
+as version 1.33.2. I recommend upgrading to one of these versions if your
+wiki allows web editing or aggregates feeds.
+
+## javascript insertion via meta tags
+
+It was possible to use the meta plugin's meta tags to insert arbitrary
+url contents, which could be used to insert stylesheet information
+containing javascript. This was fixed by sanitising meta tags.
+
+This hole was discovered on 21 March 2007 and fixed the same day
+with the release of ikiwiki 1.47. A fix was also backported to Debian etch,
+as version 1.33.3. I recommend upgrading to one of these versions if your
+wiki can be edited by third parties.
+
+## insufficient checking for symlinks in srcdir path
+
+Ikiwiki did not check if the path to the srcdir contained a symlink. If an
+attacker had commit access to the directories in the path, they could
+change it to a symlink, causing ikiwiki to read and publish files that were
+not intended to be published. (But not write to them due to other checks.)
+
+In most configurations, this is not exploitable, because the srcdir is
+checked out of revision control, but the directories leading up to it are
+not. Or, the srcdir is a single subdirectory of a project in revision
+control (ie, `ikiwiki/doc`), and if the subdirectory were a symlink,
+ikiwiki would still typically not follow it.
+
+There are at least two configurations where this is exploitable:
+
+* If the srcdir is a deeper subdirectory of a project. For example if it is
+ `project/foo/doc`, then an attacker can replace `foo` with a symlink to a
+ directory containing a `doc` directory (not a symlink), and ikiwiki
+ would follow the symlink.
+* If the path to the srcdir in ikiwiki's configuration ended in "/",
+ and the srcdir is a single subdirectory of a project, (ie,
+ `ikiwiki/doc/`), the srcdir could be a symlink and ikiwiki would not
+ notice.
+
+This security hole was discovered on 26 November 2007 and fixed the same
+day with the release of ikiwiki 2.14. I recommend upgrading to this version
+if your wiki can be committed to by third parties. Alternatively, don't use
+a trailing slash in the srcdir, and avoid the (unusual) configurations that
+allow the security hole to be exploited.
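+
+The added check can be sketched as walking the path and refusing to
+proceed if any component is a symlink (a hypothetical illustration, not
+ikiwiki's actual perl code):

```shell
# Return success (0) if any component of the given path is a symlink.
path_has_symlink() {
    dir=$1
    while [ -n "$dir" ] && [ "$dir" != "/" ] && [ "$dir" != "." ]; do
        if [ -L "$dir" ]; then
            return 0
        fi
        dir=$(dirname "$dir")
    done
    return 1
}

tmp=$(mktemp -d)
mkdir -p "$tmp/project/real/doc"
ln -s real "$tmp/project/foo"
path_has_symlink "$tmp/project/foo/doc" && echo "refusing to use srcdir"
rm -rf "$tmp"
```

+A real implementation would also canonicalise the configured path first,
+so a trailing slash cannot hide the final component from the check.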
+
+## javascript insertion via uris
+
+The htmlscrubber did not block javascript in uris. This was fixed by adding
+a whitelist of valid uri types, which does not include javascript.
+([[!cve CVE-2008-0809]]) Some urls specifyable by the meta plugin could also
+theoretically have been used to inject javascript; this was also blocked
+([[!cve CVE-2008-0808]]).
+
+This hole was discovered on 10 February 2008 and fixed the same day
+with the release of ikiwiki 2.31.1. (And a few subsequent versions..)
+A fix was also backported to Debian etch, as version 1.33.4. I recommend
+upgrading to one of these versions if your wiki can be edited by third
+parties.
+
+## Cross Site Request Forgery
+
+Cross Site Request Forgery could be used to construct a link that would
+change a logged-in user's password or other preferences if they clicked on
+the link. It could also be used to construct a link that would cause a wiki
+page to be modified by a logged-in user. ([[!cve CVE-2008-0165]])
+
+These holes were discovered on 10 April 2008 and fixed the same day with
+the release of ikiwiki 2.42. A fix was also backported to Debian etch, as
+version 1.33.5. I recommend upgrading to one of these versions.
+
+## Cleartext passwords
+
+Until version 2.48, ikiwiki stored passwords in cleartext in the `userdb`.
+That risks exposing all users' passwords if the file is somehow exposed. To
+pre-emptively guard against that, current versions of ikiwiki store password
+hashes (using Eksblowfish).
+
+If you use the [[plugins/passwordauth]] plugin, I recommend upgrading to
+ikiwiki 2.48, installing the [[!cpan Authen::Passphrase]] perl module, and running
+`ikiwiki-transition hashpassword` to replace all existing cleartext passwords
+with strong blowfish hashes.
+
+You might also consider changing to [[plugins/openid]], which does not
+require ikiwiki to deal with passwords at all, and does not involve users sending
+passwords in cleartext over the net to log in, either.
+
+## Empty password security hole
+
+This hole allowed ikiwiki to accept logins using empty passwords, to openid
+accounts that didn't use a password. It was introduced in version 1.34, and
+fixed in version 2.48. The [bug](http://bugs.debian.org/483770) was
+discovered on 30 May 2008 and fixed the same day. ([[!cve CVE-2008-0169]])
+
+I recommend upgrading to 2.48 immediately if your wiki allows both password
+and openid logins.
+
+## Malformed UTF-8 DOS
+
+Feeding ikiwiki page sources containing certain forms of malformed UTF-8
+can cause it to crash. This can potentially be used for a denial of service
+attack.
+
+intrigeri discovered this problem on 12 Nov 2008 and a patch was put in place
+later that day, in version 2.70. The fix was backported to testing as version
+2.53.3, and to stable as version 1.33.7.
+
+## Insufficient blacklisting in teximg plugin
+
+Josh Triplett discovered on 28 Aug 2009 that the teximg plugin's
+blacklisting of insecure TeX commands was insufficient; it could be
+bypassed and used to read arbitrary files. This was fixed by
+enabling TeX configuration options that disallow unsafe TeX commands.
+The fix was released on 30 Aug 2009 in version 3.1415926, and was
+backported to stable in version 2.53.4. If you use the teximg plugin,
+I recommend upgrading. ([[!cve CVE-2009-2944]])
+
+## javascript insertion via svg uris
+
+Ivan Shmakov pointed out that the htmlscrubber allowed `data:image/*` urls,
+including `data:image/svg+xml`. But svg can contain javascript, so that is
+unsafe.
+
+This hole was discovered on 12 March 2010 and fixed the same day
+with the release of ikiwiki 3.20100312.
+A fix was also backported to Debian etch, as version 2.53.5. I recommend
+upgrading to one of these versions if your wiki can be edited by third
+parties.
+
+## javascript insertion via insufficient htmlscrubbing of comments
+
+Kevin Riggle noticed that it was not possible to configure
+`htmlscrubber_skip` to scrub comments while leaving unscrubbed the text
+of eg, blog posts. Confusingly, setting it to "* and !comment(*)" did not
+scrub comments.
+
+Additionally, it was discovered that comments' html was never scrubbed during
+preview or moderation of comments with such a configuration.
+
+These problems were discovered on 12 November 2010 and fixed the same
+hour with the release of ikiwiki 3.20101112. ([[!cve CVE-2010-1673]])
+
+## javascript insertion via insufficient checking in comments
+
+Dave B noticed that attempting to comment on an illegal page name could be
+used for an XSS attack.
+
+This hole was discovered on 22 Jan 2011 and fixed the same day with
+the release of ikiwiki 3.20110122. A fix was backported to Debian squeeze,
+as version 3.20100815.5. An upgrade is recommended for sites
+with the comments plugin enabled. ([[!cve CVE-2011-0428]])
+
+## possible javascript insertion via insufficient htmlscrubbing of alternate stylesheets
+
+Giuseppe Bilotta noticed that `meta stylesheet` directives allowed anyone
+who could upload a malicious stylesheet to a site to add it to a
+page as an alternate stylesheet, or to replace the default stylesheet.
+
+This hole was discovered on 28 Mar 2011 and fixed the same hour with
+the release of ikiwiki 3.20110328. A fix was backported to Debian squeeze,
+as version 3.20100815.6. An upgrade is recommended for sites that have
+untrusted committers, or have the attachments plugin enabled.
+([[!cve CVE-2011-1401]])
+
+## tty hijacking via ikiwiki-mass-rebuild
+
+Ludwig Nussel discovered a way for users to hijack root's tty when
+ikiwiki-mass-rebuild was run. Additionally, there was some potential
+for information disclosure via symlinks. ([[!cve CVE-2011-1408]])
+
+This hole was discovered on 8 June 2011 and fixed the same day with
+the release of ikiwiki 3.20110608. Note that the fix is dependent on
+a version of su that has a similar hole fixed. Version 4.1.5 of the shadow
+package contains the fixed su; [[!debbug 628843]] tracks fixing the hole in
+Debian. An upgrade is a must for any sites that have `ikiwiki-update-wikilist`
+installed suid (not the default), and whose admins run `ikiwiki-mass-rebuild`.
+
+## javascript insertion via meta tags
+
+Raúl Benencia discovered an additional XSS exposure in the meta plugin.
+([[!cve CVE-2012-0220]])
+
+This hole was discovered on 16 May 2012 and fixed the same day with
+the release of ikiwiki 3.20120516. A fix was backported to Debian squeeze,
+as version 3.20100815.9. An upgrade is recommended for all sites.
diff --git a/doc/security/discussion.mdwn b/doc/security/discussion.mdwn
new file mode 100644
index 000000000..ddf61c5f8
--- /dev/null
+++ b/doc/security/discussion.mdwn
@@ -0,0 +1,33 @@
+Copied from an email I sent --[[Joey]]
+
+> Apart from restricting escape characters and characters with special
+> meanings to the filesystem (such as '/') or the version control system
+> (which may not cope with \n), why limit filenames at all?
+
+Suppose that git-add and git-commit are shell scripts:
+
+ #!/bin/sh
+ /opt/git/git commit $1
+
+ #!/bin/sh
+ /opt/git/git add $1
+
+Ok, that's crappy code, but git add and commit are only run by a trusted
+user at the command line, so it's hardly a security hole. (And frankly,
+I'm not all too impressed with the real shell code I've seen in git-*
+..)
+
+But there's no security problem until ikiwiki calls it on a filename
+that a web user made up. Now, suppose that ikiwiki decided to allow
+spaces in filenames. Nothing else new, just spaces. Of course, the above
+bad code will fail to add and commit such files.
+
+But it won't just fail, it can even expose private data. Suppose that $1
+is "foo.mdwn .ikiwiki/userdb foo.mdwn". Then the userdb, with its
+passwords and emails, is committed along with foo.mdwn.
+
+Moral: ikiwiki interfaces with code that was not necessarily written for the
+security context that ikiwiki runs in. Even the most innocuous filenames can do
+very unexpected things if you let the shell get ahold of them. Ikiwiki needs to
+sanitize the hell out of user inputted data before letting it anywhere near the
+shell.
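+
+The usual fix on the calling side is quoting: with "$1" quoted, the whole
+filename travels as a single argument, spaces and all. A small sketch of
+the difference (hypothetical wrappers, not real git code):

```shell
# unsafe expands $1 unquoted, so the shell word-splits one hostile
# "filename" into three arguments; safe quotes it and sees just one.
unsafe() { set -- $1; echo "unsafe sees $# arguments"; }
safe()   { set -- "$1"; echo "safe sees $# arguments"; }

unsafe "foo.mdwn .ikiwiki/userdb foo.mdwn"
safe   "foo.mdwn .ikiwiki/userdb foo.mdwn"
```

+Of course, as argued above, ikiwiki cannot rely on every tool it calls
+being this careful, which is why it must sanitise filenames itself.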
diff --git a/doc/setup.mdwn b/doc/setup.mdwn
new file mode 100644
index 000000000..bdbe323fd
--- /dev/null
+++ b/doc/setup.mdwn
@@ -0,0 +1,152 @@
+This tutorial will walk you through setting up a wiki with ikiwiki.
+
+[[!toc ]]
+
+## Install ikiwiki
+
+If you're using Debian or Ubuntu, ikiwiki is an <code><a href="http://www.debian.org/doc/manuals/debian-reference/ch02.en.html#_basic_package_management_operations">apt-get</a> install ikiwiki</code> away.
+If you're not, see the [[download]] and [[install]] pages.
+
+## Create your wiki
+
+All it takes to create a fully functional wiki using ikiwiki is running
+one command.
+[[!template id=note text="""
+For more control, advanced users may prefer to set up a wiki
+[[by_hand|byhand]].
+"""]]
+
+ % ikiwiki --setup /etc/ikiwiki/auto.setup
+
+Or, to set up a blog with ikiwiki, run this command instead.
+
+ % ikiwiki --setup /etc/ikiwiki/auto-blog.setup
+
+The `librpc-xml-perl` and `python-docutils` dependencies are needed for this.
+
+Either way, it will ask you a couple of questions.
+
+ What will the wiki be named? foo
+ What revision control system to use? git
+ What wiki user (or openid) will be admin? joey
+ Choose a password:
+
+Then, wait for it to tell you the url for your new site.
+
+ Successfully set up foo:
+ url: http://example.com/~joey/foo
+ srcdir: ~/foo
+ destdir: ~/public_html/foo
+ repository: ~/foo.git
+ To modify settings, edit ~/foo.setup and then run:
+ ikiwiki --setup ~/foo.setup
+
+Done!
+
+## Using the web interface
+
+Now you can go to the url it told you, and edit pages in your new wiki
+using the web interface.
+
+(If the web interface doesn't seem to allow editing or login, you may
+need to [[configure_the_web_server|tips/dot_cgi]].)
+
+## Checkout and edit wiki source
+
+Part of the fun of using ikiwiki is not being limited to using the
+web for editing pages, and instead using your favorite text editor and
+[[Revision_Control_System|rcs]].
+
+To do this, you need to check out a copy of the source to your wiki.
+(You should avoid making changes directly to the `srcdir`, as that
+checkout is reserved for use by ikiwiki itself.)
+
+Depending on which [[Revision_Control_System|rcs]] you chose to use,
+you can run one of these commands to check out your own copy of your wiki's
+source. (Remember to replace "foo" with the real directory name.)
+
+ git clone foo.git foo.src
+ svn checkout file://`pwd`/foo.svn/trunk foo.src
+ cvs -d `pwd`/foo get -P ikiwiki
+ bzr clone foo foo.src
+ hg clone foo foo.src
+ darcs get foo.darcs foo.src
+ # TODO monotone, tla
+
+Now to edit pages by hand, go into the directory you checked out (ie,
+"foo.src"), and fire up your text editor to edit `index.mdwn` or whatever
+other page you want to edit. If you chose to set up a blog, there is even a
+sample first post in `posts/first_post.mdwn` that you can edit.
+
+Once you've edited a page, use your revision control system to commit
+the changes. For distributed revision control systems, don't forget to push
+your commit.
+
+Once the commit reaches the repository, ikiwiki will notice it, and
+automatically update the wiki with your changes.
+
+## Customizing the wiki
+
+There are lots of things you can configure to customize your wiki.
+These range from changing the wiki's name, to enabling [[plugins]],
+to banning users and locking pages.
+
+If you log in as the admin user you configured earlier, and go to
+your Preferences page, you can click on "Setup" to customize many
+wiki settings and plugins.
+
+Some settings cannot be configured on the web, for security reasons or
+because misconfiguring them could break the wiki. To change these settings,
+you can manually edit the setup file, which is named something like
+"foo.setup". The file lists all available configuration settings
+and gives a brief description of each.
+
+After making changes to this file, you need to tell ikiwiki to use it:
+
+ % ikiwiki --setup foo.setup
+
+Alternatively, you can ask ikiwiki to change settings in the file for you:
+
+ % ikiwiki --changesetup foo.setup --plugin goodstuff
+
+See [[usage]] for more options.
+
+## Customizing file locations
+
+As a wiki compiler, ikiwiki builds a wiki from files in a source directory,
+and outputs the files to a destination directory. The source directory is
+a working copy checked out from the version control system repository.
+
+When you used `auto.setup`, ikiwiki put the source directory, destination
+directory, and repository in your home directory, and told you the location
+of each. Those locations were chosen to work without customization, but you
+might want to move them to different directories.
+
+First, move the destination directory and repository around.
+
+ % mv public_html/foo /srv/web/foo.com
+ % mv foo.git /srv/git/foo.git
+
+If you moved the repository to a new location, checkouts pointing at the
+old location won't work, and the easiest way to deal with this is to delete
+them and re-checkout from the new repository location.
+
+ % rm -rf foo
+ % git clone /srv/git/foo.git
+
+Finally, edit the setup file. Modify the settings for `srcdir`, `destdir`,
+`url`, `cgiurl`, `cgi_wrapper`, `git_wrapper`, etc. to reflect where
+you moved things. Remember to run `ikiwiki --setup` after editing the
+setup file.
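
For instance, after the moves above, the relevant lines of a hypothetical foo.setup might read as follows (every path here is an example to adapt):

```perl
# Hypothetical excerpt of foo.setup after relocating things; adapt all paths.
srcdir => '/home/you/foo',
destdir => '/srv/web/foo.com',
url => 'http://foo.com/',
cgiurl => 'http://foo.com/ikiwiki.cgi',
cgi_wrapper => '/srv/web/foo.com/ikiwiki.cgi',
git_wrapper => '/srv/git/foo.git/hooks/post-update',
```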
+
+## Enjoy your new wiki!
+
+Add yourself to [[IkiWikiUsers]]. And check out
+the [[tips]] to find out how to get more out of ikiwiki.
+
+----
+
+_Notes_:
+
+- If you are looking for the file where users are stored: it is `your_repository/.ikiwiki/userdb` -- in your repository, not in your `~/.ikiwiki` directory.
+- To enable a plugin, you must add it to the `add_plugins` array in the `*.setup` file (or use the `--plugin` switch when calling `ikiwiki`). Merely uncommenting the plugin's configuration fields in the setup file is not always sufficient.
diff --git a/doc/setup/byhand.mdwn b/doc/setup/byhand.mdwn
new file mode 100644
index 000000000..6d0f37cd9
--- /dev/null
+++ b/doc/setup/byhand.mdwn
@@ -0,0 +1,202 @@
+This tutorial will walk you through setting up a wiki with ikiwiki,
+doing everything by hand. [[Setup]] has an easier method, but with less
+control.
+
+[[!toc ]]
+
+## Decide where your wiki's files will go.
+
+As a wiki compiler, ikiwiki builds a wiki from files in a source directory,
+and outputs the files to a destination directory. If you keep your wiki in
+a version control system, the source directory will contain a working copy
+checked out from the version control system.
+
+For the purposes of this tutorial, we'll set shell variables
+for these locations, and use those variables in the commands that follow.
+
+ SRCDIR=~/wikiwc
+ DESTDIR=~/public_html/wiki/
+
+Note that ikiwiki owns the working copy directory; do not perform your own
+edits in ikiwiki's working copy.
+
+## Create the beginnings of your wiki.
+
+This will create a simple main page for the wiki.
+
+ mkdir $SRCDIR
+ cd $SRCDIR
+ $EDITOR index.mdwn
+
+In the editor, you could start by entering a simple page like
+[[!toggle id=page text="this one"]].
+[[!toggleable id=page text="""
+ Welcome to your new wiki.
+
+ All wikis are supposed to have a \[[SandBox]],
+ so this one does too.
+
+ ----
+
+ This wiki is powered by [ikiwiki](http://ikiwiki.info).
+"""]]
+
+See [[ikiwiki/formatting]] for details about the markup language.
+
+Note that several [[standard_wiki_pages|basewiki]] will be added to your
+wiki, from files in `/usr/share/ikiwiki/basewiki/`, so your wiki will
+automatically get a [[SandBox]], and some other useful pages.
+
+## Build your wiki for the first time.
+
+ ikiwiki --verbose $SRCDIR $DESTDIR --url=http://example.org/~you/wiki/
+
+Replace the url with the real url to your wiki. You should now
+be able to visit the url and see your wiki.
+
+## Add content to your wiki.
+
+Continue editing or adding pages and rebuilding the wiki.
+
+To quickly get started on a common task like blogging with ikiwiki, you
+can copy in files from the [[examples]]. The examples are located in
+`doc/examples/` in the ikiwiki source package.
+
+You can experiment with other ikiwiki parameters such as `--wikiname`
+and `--rebuild` too. Get comfortable with its command line (see
+[[usage]]).
+
+## Add a setup file.
+
+By now you should be getting tired of typing in all the command line
+options each time you change something in your wiki's setup. Time to
+introduce setup files.
+
+To generate a setup file, use `ikiwiki --dumpsetup`. You can pass
+all the options you have been including at the command line, and they
+will be stored in the setup file.
+
+ ikiwiki $SRCDIR $DESTDIR --url=http://example.org/~you/wiki/ --dumpsetup ikiwiki.setup
+
+Note that this file should *not* be put in your wiki's directory with
+the rest of the files. A good place to put it is in a `~/.ikiwiki/`
+subdirectory.
+
+Most of the options in the setup file, like `wikiname`, are the same as
+ikiwiki's command line options (documented in [[usage]]). `srcdir` and
+`destdir` are the two directories you specify when running ikiwiki by
+hand. Make sure that these are pointing to the right directories, and
+read through and configure the rest of the file to your liking.
+
+When you're satisfied, run `ikiwiki --setup ikiwiki.setup`, and it
+will set everything up.
+
+## Turn on additional features.
+
+[[!template id="note" text="""
+CGI configuration is heavily dependent on the webserver. Figure out (or
+configure) the location and/or filename extension your webserver
+needs to execute a CGI, then set `cgi_wrapper` to a suitable path.
+"""]]
+
+Now you have a basic wiki with a setup file. Time to experiment
+with ikiwiki's many features.
+
+Let's first enable a key wiki feature and set up [[CGI]] to allow
+editing the wiki from the web. Just edit ikiwiki.setup, uncomment the
+settings for the `cgi_wrapper`, make sure the filename for the cgi wrapper
+is ok, run `ikiwiki --setup ikiwiki.setup`, and you're done!
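
As a sketch, the settings involved might look like this in ikiwiki.setup (the path and mode here are examples to adapt):

```perl
# Hypothetical excerpt of ikiwiki.setup enabling the CGI; adapt the path.
cgi_wrapper => '/home/you/public_html/wiki/ikiwiki.cgi',
cgi_wrappermode => '06755',
```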
+
+There are lots of other configuration options in ikiwiki.setup that you
+can uncomment, configure, and enable by re-running
+`ikiwiki --setup ikiwiki.setup`. Be sure to browse through all the
+[[plugins]].
+
+## Put your wiki in revision control.
+
+At this point you might want to check your wiki in to a revision control
+system so you can keep track of changes and revert edits. Depending
+on the revision control system you choose, the way this is done varies.
+
+Note that the .ikiwiki subdirectory is where ikiwiki keeps its state, and
+should be preserved, but not checked into revision control.
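
For git, one way to keep the state directory unversioned is an ignore rule in the working copy; this is a sketch run in a throwaway directory, and other version control systems have equivalent ignore mechanisms.

```shell
# Keep ikiwiki's .ikiwiki state directory out of git's version control.
set -e
cd "$(mktemp -d)"                   # stand-in for $SRCDIR in this sketch
git init -q
echo '/.ikiwiki' > .gitignore       # ignore the state directory
git add .gitignore
mkdir .ikiwiki                      # pretend ikiwiki created its state dir
git check-ignore -q .ikiwiki        # exits 0: git will not track it
```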
+
+The [[ikiwiki-makerepo]] command automates setting up a wiki in
+revision control.
+
+[[!toggle id=subversion text="Subversion"]]
+[[!toggleable id=subversion text="""
+ REPOSITORY=~/wikirepo
+ ikiwiki-makerepo svn $SRCDIR $REPOSITORY
+"""]]
+
+[[!toggle id=cvs text="CVS"]]
+[[!toggleable id=cvs text="""
+ REPOSITORY=~/wikirepo
+ ikiwiki-makerepo cvs $SRCDIR $REPOSITORY
+"""]]
+
+[[!toggle id=git text="Git"]]
+[[!toggleable id=git text="""
+ REPOSITORY=~/wiki.git
+ ikiwiki-makerepo git $SRCDIR $REPOSITORY
+
+Please see [[rcs/git]] for detailed documentation about how
+ikiwiki uses git repositories, and some important caveats
+about using the git repositories.
+"""]]
+
+[[!toggle id=mercurial text="Mercurial"]]
+[[!toggleable id=mercurial text="""
+ REPOSITORY=$SRCDIR
+ ikiwiki-makerepo mercurial $SRCDIR
+"""]]
+
+[[!toggle id=bazaar text="Bazaar"]]
+[[!toggleable id=bazaar text="""
+ REPOSITORY=$SRCDIR
+ ikiwiki-makerepo bzr $SRCDIR
+"""]]
+
+[[!toggle id=tla text="TLA"]]
+[[!toggleable id=tla text="""
+ REPOSITORY=~/wikirepo
+ tla make-archive me@localhost--wiki $REPOSITORY
+ tla my-id "<me@localhost>"
+ cd $SRCDIR
+ tla archive-setup me@localhost--wiki/wiki--0
+ tla init-tree me@localhost--wiki/wiki--0
+ # Edit {arch}/=tagging-method and change the precious
+ # line to add the .ikiwiki directory to the regexp.
+ tla add *
+ tla import
+"""]]
+
+[[!toggle id=monotone text="Monotone"]]
+[[!toggleable id=monotone text="""
+ # This assumes that you have already used "mtn genkey you@hostname".
+ REPOSITORY=~/wiki.monotone
+ ikiwiki-makerepo monotone $SRCDIR $REPOSITORY
+"""]]
+
+## Configure ikiwiki to use revision control.
+
+Once your wiki is checked in to the revision control system, you should
+configure ikiwiki to use revision control. Edit your ikiwiki.setup and set
+`rcs` to the revision control system you chose to use. Note that you may
+need to use the full name: for example, 'hg' does not work; use
+'mercurial' instead. Be sure to set `svnrepo` to the directory for your
+repository, if using subversion. Uncomment the configuration for the wrapper
+for your revision control system, and configure the wrapper path appropriately
+(for Git, it should be the path to `hooks/post-update` inside the bare git repository).
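
As a sketch, the relevant settings for a git-backed wiki might look like this in ikiwiki.setup (the paths are examples to adapt):

```perl
# Hypothetical ikiwiki.setup excerpt for a git-backed wiki; adapt the paths.
rcs => 'git',
git_wrapper => '/home/you/wiki.git/hooks/post-update',
# For subversion it would instead be along the lines of:
#   rcs => 'svn',
#   svnrepo => '/home/you/wikirepo',
```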
+
+Once it's all set up, run `ikiwiki --setup ikiwiki.setup` once more.
+Now you should be able to edit files in $SRCDIR, and use your revision
+control system to commit them, and the wiki will automatically update.
+And in the web interface, RecentChanges should work, and files changed
+by web users will also be committed using revision control.
+
+## Enjoy your new wiki!
+
+Add yourself to [[IkiWikiUsers]]. And check out
+the [[tips]] to find out how to get more out of ikiwiki.
diff --git a/doc/setup/byhand/discussion.mdwn b/doc/setup/byhand/discussion.mdwn
new file mode 100644
index 000000000..941976789
--- /dev/null
+++ b/doc/setup/byhand/discussion.mdwn
@@ -0,0 +1,7 @@
+What directory is the 'working copy'? There can be two interpretations: the current dir and the .git dir.
+
+> It is fairly common terminology amoung all version control systems to use
+> "working copy" to refer to a checkout from version control, including
+> copies of all the versioned files, and whatever VCS-specific cruft that
+> entails. So, a working copy is everything you get when you `git clone`
+> a repository. --[[Joey]]
diff --git a/doc/setup/discussion.mdwn b/doc/setup/discussion.mdwn
new file mode 100644
index 000000000..7ab935181
--- /dev/null
+++ b/doc/setup/discussion.mdwn
@@ -0,0 +1,271 @@
+I have copied over the ikiwiki.setup file from /usr/share/doc/ikiwiki/ to /etc/ikiwiki/ and run it after editing. My site gets built but when I click on the 'edit' button, firefox and google chrome download the cgi file instead of creating a way to edit it. The permissions on my ikiwiki.cgi script look like this: -rwsr-sr-x 1 root root 13359 2009-10-13 19:21 ikiwiki.cgi. Is there something I should do, i.e. change permissions, so I can get it to run correctly? (jeremiah)
+
+> Have a look [[here|tips/dot_cgi]]. --[[Jogo]]
+
+I just went through the standard procedure described for setup, copied the blog directory from examples into my source directory, ran ikiwiki, and everything seems to have worked, except that none of the
+&#91;&#91;!meta ... &#93;&#93; tags get converted. They simply show up in the html files unformatted, with no exclamation point, and with p tags around them. Any ideas? using ikiwiki version 2.40 on freebsd --mjg
+
+> The meta plugin is not enabled by default. It's pulled in by the
+> goodstuff plugin, so add one or the other to the add_plugins line in your
+> config file. --[[Joey]]
+
+Can the instructions for using `ikiwiki-makerepo` be clarified? This
+command wants to create folders in the directory it is run in. Which
+directory should that be - `$SRCDIR`? --Andy
+
+> No, `ikiwiki-makerepo` does not create directories in the current
+> directory. You specify the directory you want it to create and it creates
+> the directory and makes it into a repository. The setup instructions have
+> examples of doing this. I don't see anything unclear. --[[Joey]]
+
+Sorry, was not precise enough. It does if you are using the git option. I
+believe this is partially explained on the rcs/git/ page. However I'm still
+not totally clear where I should run the command when using git. If I
+should be in $SRCDIR then updating the instructions to something like
+
+ REPOSITORY=~/wiki.git
+ cd $SRCDIR
+    ikiwiki-makerepo git $SRCDIR $REPOSITORY
+
+might clear things up a little. Apologies if I'm being a bit dim, learning
+ikiwiki and git at the same time :)
+
+Have tried 3 options
+
+1. mkdir $REP; cd $REP; run command (says $REP already exists, so it won't run)
+2. rm -rf $REP; cd $SRC; run command (creates a repository in $SRC, does not create $REP; ends with "remote origin already exists")
+3. mkdir $TMP; cd $TMP; run command (creates a repository in $SRC/.git and a repository in $TMP, does nothing to $REP; ends with "remote origin already exists")
+
+Version of ikiwiki installed is package from Ubuntu/Hardy 2.19
+Git version is 1.5.2.5
+
+n.b. svn version of command worked fine :)
+
+--Andy
+
+> Initialized empty shared Git repository in /home/deploy/tmp/
+> Initialized empty Git repository in .git/
+> fatal: '/home/deploy/tmp/repo': unable to chdir or not a git archive
+
+Looks like your git does not support GIT_DIR being used with git-init. I
+see some mentions of changes in git's changelog for 1.5.3 that look
+relevant. I've changed ikiwiki-makerepo to use a method more portable to
+older versions of git. --[[Joey]]
+
+Many thanks Joey, upgraded my git, and now have working iki :) -- Andy
+
+----
+
+It isn't intended that .ikiwiki be versioned, is it? Do you have a svn:ignore set?
+Is there some magic way you can make the svn commands (and presumably commands for
+the other VCSs here) ignore the .ikiwiki directory during step 8, when they import it?
+If not, maybe a note should be made that the user should delete this file before
+they import. --Ethan
+
+> No, .ikiwiki should not be versioned, and a svn:ignore of it is reasonable,
+> although probably too much noise for the setup instructions.
+> I've switched to a different method that preserves .ikiwiki, w/o checking
+> it in. --[[Joey]]
+
+----
+
+These instructions should probably show how to use a bare Git repository
+(`GIT_DIR=somewhere.git git-init-db`) rather than a repository with a full
+working copy. You can always clone the repository if you want your own local
+working copy. Furthermore, this allows you to make multiple commits to your
+working copy before pushing them to the repository and causing the wiki to
+update. --[[JoshTriplett]]
+
+> I'm currently testing and running such a setup. --[[tschwinge]]
+
+Furthermore, shouldn't the git instructions be changed to move the *.ikiwiki*
+directory back into the wiki's working copy directory? --[[tschwinge]]
+
+> Yes, I think so. I will clean these instructions up unless somebody tells me we're missing something fundamental. --[[BartMassey]]
+
+>> Either you do it or I'll do it somewhen soon. --[[tschwinge]]
+
+----
+
+Curious as to why support for CVS is not built in. --[[Luther]]
+
+> See [[todo/CVS_backend|todo/CVS_backend]], but you might consider switching to a better version control system. --[[JoshTriplett]]
+
+----
+
+What is the syntax for specifying the adminuser as an openid user? I've tried a couple things but I'm missing something. Thanks for any pointers. -- [[AdamShand]]
+
+> Just put the openid url in there. It has to be the full url with
+> "http://". --[[Joey]]
+
+----
+
+I apologize if this is the incorrect forum for this question, but I am
+trying to get ikiwiki set up and running with git. I followed all the
+directions and all seems to work until I go back and try to make changes.
+The steps I am performing:
+
+ cd $SRCDIR (e.g. ~/ikisrc)
+ vim index.mdwn (add a couple lines)
+ git commit -a -m 'test'
+ git push
+
+I then get a long error message which reads in part "You asked me to pull
+without telling me which branch you want to merge with, and
+'branch.master.merge' in your configuration file does not tell me either."
+From that point on, I get:
+
+ sws@odin:~/dev/ikisrc$ git push
+ To /home/git/repos/myikiwiki.git
+ ! [rejected] master -> master (non-fast forward)
+ error: failed to push to '/home/git/repos/myikiwiki.git'
+
+If I do a git clone ssh://odin/path/to/$REPOSITORY from another machine and
+try to edit I get the same error sequence. What am I doing wrong?
+
+> I don't know. The only time I have seen this message is when
+> the master git repository was not bare. All current instructions and
+> `ikiwiki-makerepo` have a proper bare repo used for the master
+> repository, but perhaps you followed some old, broken instructions, or
+> forgot to make it bare? --[[Joey]]
+
+-----
+
+I follow every steps of the setup procedure, change some templates and
+tried to modify some page through the web but was not able to do so. Every
+page seems to be locked by the adminuser user. When I remove the adminuser
+in the setup file, every ran fine. Did I miss something ? What is exactly
+the adminuser supposed to be allowed to ? Is he the only user allowed to
+modify pages ?
+
+> This was a bug in ikwiki that was fixed in version 2.44. --[[Joey]]
+
+-----
+
+I hope you guys can put up with an absolute newbie. I am fairly new to linux and completely new to Perl. I have just installed MoinMoin locally on my PC, running ubuntu 8.4, and was about to use it until I ran into your ikiwiki. I thought ikiwiki is a better fit for what I want to do, so am seriously considering installing it as well on ubuntu. Except that the install seems way beyond my understanding.
+
+Do I need to install git first? Which git -- the git-core? Ubuntu's instruction for installing git-core is: "sudo apt-get install git-core". Is that it? Do I need to do a git-init as well, or will ikiwiki-makerepo handle that? If I have to do a git-init as well, what --share values should I specify?
+
+It seems I will have to install the ikiwiki from the tar.gz file. I have downloaded it, but do I need to install CPAN or CPAN++ first? That doesn't sound right. I am totally confused already. Does anyone have some install documents pitched to someone as ignorant as I am? -- [[WillDioneda]]
+
+> Ubuntu includes ikiwiki (in universe, I assume), so you should just be
+> able to use apt or synaptic to install the package, as documented on the
+> [[download]] page. Install git-core also to get git.
+>
+> You do not need to use git-init if you use ikiwiki-makerepo. --[[Joey]]
+
+
+Thanks for your response. You're right. Ubuntu does have ikiwiki, except that it is an older version. I tried installing it; saw some error messages from the install, and decided against it. Plus the documentation here in ikiwiki.info seems slightly different. I made an executive/beginner decision: to go for the latest tarball. And found myself in deep water, ...
+
+Anyway, I think I might be able to install it from the tarball I downloaded. I've been reading the discussions, had a look at your screencasts, etc. I will give it another bash. -- [[WillDioneda]]
+
+----
+
+How do I set up cgi editing? In setup I have:
+
+ * cgiurl => 'http://wiki.had.co.nz/edit.cgi'
+ * cgi_wrapper => 'edit.cgi'
+
+But I don't get an edit link on my pages? What am I doing wrong?
+
+> Assuming you don't have the editpage plugin disabled, all you should need
+> to do is re-run `ikiwiki -setup` with the above config and it should
+> rebuild your wiki and add the edit links to pages. --[[Joey]]
+
+----
+
+I setup ikiwiki on a fedora 10 machine and I am using apache as my http server. Faced a few difficulties while setting it up as the default setup program left some suid files and group writeable directories on the system. It took some time to get it working and documented what I did at http://flyingtux.blogspot.com/2009/03/installing-ikiwiki.html. Thought it might be useful to someone here. (The version installed is 2.72)
+
+> ikiwiki makes wrappers suid by default, because this ensures that when
+> the ikiwiki.cgi is run by your web server, it runs as the user who owns
+> your wiki, and can thus write to it. ikiwiki is designed to run securely
+> suid. If your webserver uses some
+> mechanism to run the ikiwiki.cgi as the user who owns it, without the
+> suid bit being set, you *could* modify `cgi_wrappermode` in your setup
+> file to drop the suid bit.
+>
+> ikiwiki respects the umask, so if your umask is one that causes things to
+> be group writable, they will be. If you want to override that, there is
+> also a `umask` setting in your setup file. --[[Joey]]
+
+----
+
+/etc/ikiwiki/auto.setup tries to get abs_path of a non-existent
+"repository" path (in ikiwiki-makerepo), and that doesn't work in my perl:
+
+<pre>
+[mort@localhost ~]$ perl -e 'use Cwd q{abs_path}; print abs_path("/var")'
+/var[mort@localhost ~]$ perl -e 'use Cwd q{abs_path}; print abs_path("/abcde")'
+[mort@localhost ~]$
+</pre>
+
+Because of this, /etc/ikiwiki/auto.setup fails:
+
+<pre>
+$ ikiwiki -setup /etc/ikiwiki/auto.setup
+What will the wiki be named? wiki
+What revision control system to use? git
+What wiki user (or openid) will be admin? mort
+
+
+Setting up wiki ...
+internal error finding repository abs_path
+/etc/ikiwiki/auto.setup: failed to set up the repository with ikiwiki-makerepo
+
+usage: ikiwiki [options] source dest
+ ikiwiki --setup configfile
+$ perl -v
+
+This is perl, v5.8.8 built for i386-linux-thread-multi
+(with 2 registered patches, see perl -V for more detail)
+
+Copyright 1987-2007, Larry Wall
+
+Perl may be copied only under the terms of either the Artistic License or the
+GNU General Public License, which may be found in the Perl 5 source kit.
+
+Complete documentation for Perl, including FAQ lists, should be found on
+this system using "man perl" or "perldoc perl". If you have access to the
+Internet, point your browser at http://www.perl.org/, the Perl Home Page.
+
+$
+</pre>
+
+Can't ikiwiki's "make test" perhaps test for this, so that one knows something will go wrong?
+-- Ivan Z.
+
+> FWIW, I tried the same thing with perl 5.8.8 from Debian etch, and its
+> Cwd does not have the problem. But I've modified `ikiwiki-makerepo` to
+> avoid using `abs_path` this way anyhow. --[[Joey]]
+
+Thank you! I'm not a Perl programmer, so what's your opinion: is this behavior a violation of the specification of abs_path and I should report it to [ALTLinux](http://bugs.altlinux.org) (the distro)? --Ivan Z.
+
+> That is not entirely clear to me from the documentation. It doesn't
+> say the path has to exist, but doesn't say it cannot either. --[[Joey]]
+
+I am experiencing the same problem "/etc/ikiwiki/custom: failed to set up the repository with ikiwiki-makerepo
+" on Debian squeeze with perl5.10.0. Upgrading to ikiwiki 3.10 fixes it. -- [Albert](http://www.docunext.com/)
+
+----
+
+Just a note, perl 5.10 isn't packaged as part of RHEL or thus CentOS nor EPEL,
+so it's not especially trivial to satisfy that requirement for ikiwiki on
+those platforms, without backporting it from Fedora or building from source.
+However, I have an ikiwiki 3.20100403 running on RHEL-4 supplied 5.8.8 without
+(seemingly too much) complaint. How strong is the 5.10 requirement? what
+precisely breaks without it? -- [[Jon]]
+
+> I don't remember what was the specific problem with perl 5.8.8. All I can
+> find is some taint checking bugs, which are currently worked around by
+> taint checking being disabled. --[[Joey]]
+
+---
+
+Has anyone tried to install ikiwiki under a vhost setup?
+ikiwiki is installed on a Debian lenny system, but without write access to /etc/ikiwiki (obviously) I am not getting far.
+Or am I missing something that is hidden deeper in the documentation?
+
+Well it should be similar to shared hosting [or a remote server in general](http://ikiwiki.info/forum/how_to_setup_ikiwiki_on_a_remote_host/)
+
+----
+Perhaps it's worth noting that when installing ikiwiki with apt on Debian stable, you need to use the backports version in order to follow the setup instructions.
diff --git a/doc/shortcuts.mdwn b/doc/shortcuts.mdwn
new file mode 100644
index 000000000..b4f6d8ef4
--- /dev/null
+++ b/doc/shortcuts.mdwn
@@ -0,0 +1,83 @@
+[[!if test="enabled(shortcut)"
+ then="This wiki has shortcuts **enabled**."
+ else="This wiki has shortcuts **disabled**."]]
+
+Some examples of using shortcuts include:
+
+ \[[!google foo]]
+ \[[!wikipedia War_of_1812]]
+ \[[!debbug 12345]]
+ Check the \[[!google ikiwiki desc="google search for %s"]].
+
+This page controls what shortcut links the wiki supports.
+
+* [[!shortcut name=google url="https://encrypted.google.com/search?q=%s"]]
+* [[!shortcut name=archive url="http://web.archive.org/*/%S"]]
+* [[!shortcut name=gmap url="https://maps.google.com/maps?q=%s"]]
+* [[!shortcut name=gmsg url="https://groups.google.com/groups?selm=%s"]]
+* [[!shortcut name=wikipedia url="https://en.wikipedia.org/wiki/%W"]]
+* [[!shortcut name=wikitravel url="https://wikitravel.org/en/%s"]]
+* [[!shortcut name=wiktionary url="https://en.wiktionary.org/wiki/%s"]]
+* [[!shortcut name=debbug url="http://bugs.debian.org/%S" desc="Debian bug #%s"]]
+* [[!shortcut name=deblist url="https://lists.debian.org/debian-%s" desc="debian-%s@lists.debian.org"]]
+* [[!shortcut name=debpkg url="http://packages.debian.org/%s"]]
+* [[!shortcut name=debpkgsid url="http://packages.debian.org/sid/%s"]]
+* [[!shortcut name=debpts url="http://packages.qa.debian.org/%s"]]
+* [[!shortcut name=debmsg url="https://lists.debian.org/msgid-search/%s"]]
+* [[!shortcut name=debrt url="https://rt.debian.org/Ticket/Display.html?id=%s"]]
+* [[!shortcut name=debss url="http://snapshot.debian.org/package/%s/"]]
+ * Usage: `\[[!debss package]]` or `\[[!debss package/version]]`. See <http://snapshot.debian.org/> for details.
+* [[!shortcut name=debwiki url="https://wiki.debian.org/%s"]]
+* [[!shortcut name=fdobug url="https://bugs.freedesktop.org/show_bug.cgi?id=%s" desc="freedesktop.org bug #%s"]]
+* [[!shortcut name=fdolist url="http://lists.freedesktop.org/mailman/listinfo/%s" desc="%s@lists.freedesktop.org"]]
+* [[!shortcut name=gnomebug url="https://bugzilla.gnome.org/show_bug.cgi?id=%s" desc="GNOME bug #%s"]]
+* [[!shortcut name=linuxbug url="https://bugzilla.kernel.org/show_bug.cgi?id=%s" desc="Linux bug #%s"]]
+* [[!shortcut name=mozbug url="https://bugzilla.mozilla.org/show_bug.cgi?id=%s" desc="Mozilla bug #%s"]]
+* [[!shortcut name=gnulist url="https://lists.gnu.org/mailman/listinfo/%s" desc="%s@gnu.org"]]
+* [[!shortcut name=marcmsg url="http://marc.info/?i=%s"]]
+* [[!shortcut name=marclist url="http://marc.info/?l=%s"]]
+* [[!shortcut name=gmane url="http://dir.gmane.org/gmane.%s" desc="gmane.%s"]]
+* [[!shortcut name=gmanemsg url="http://mid.gmane.org/%s"]]
+* [[!shortcut name=cpan url="http://search.cpan.org/search?mode=dist&query=%s"]]
+* [[!shortcut name=ctan url="http://tug.ctan.org/cgi-bin/ctanPackageInformation.py?id=%s"]]
+* [[!shortcut name=hoogle url="http://haskell.org/hoogle/?q=%s"]]
+* [[!shortcut name=iki url="http://ikiwiki.info/%S/"]]
+* [[!shortcut name=ljuser url="http://%s.livejournal.com/"]]
+* [[!shortcut name=rfc url="https://www.ietf.org/rfc/rfc%s.txt" desc="RFC %s"]]
+* [[!shortcut name=c2 url="http://c2.com/cgi/wiki?%s"]]
+* [[!shortcut name=meatballwiki url="http://www.usemod.com/cgi-bin/mb.pl?%s"]]
+* [[!shortcut name=emacswiki url="http://www.emacswiki.org/cgi-bin/wiki/%s"]]
+* [[!shortcut name=haskellwiki url="http://haskell.org/haskellwiki/%s"]]
+* [[!shortcut name=dict url="http://www.dict.org/bin/Dict?Form=Dict1&Strategy=*&Database=*&Query=%s"]]
+* [[!shortcut name=imdb url="http://imdb.com/find?q=%s"]]
+* [[!shortcut name=gpg url="http://pgpkeys.mit.edu:11371/pks/lookup?op=vindex&exact=on&search=0x%s"]]
+* [[!shortcut name=perldoc url="http://perldoc.perl.org/search.html?q=%s"]]
+* [[!shortcut name=whois url="http://reports.internic.net/cgi/whois?whois_nic=%s&type=domain"]]
+* [[!shortcut name=cve url="https://cve.mitre.org/cgi-bin/cvename.cgi?name=%s"]]
+* [[!shortcut name=flickr url="https://secure.flickr.com/photos/%s"]]
+* [[!shortcut name=man url="http://linux.die.net/man/%s"]]
+* [[!shortcut name=ohloh url="https://www.ohloh.net/p/%s"]]
+* [[!shortcut name=cpanrt url="https://rt.cpan.org/Ticket/Display.html?id=%s" desc="CPAN RT#%s"]]
+* [[!shortcut name=novellbug url="https://bugzilla.novell.com/show_bug.cgi?id=%s" desc="bug %s"]]
+* [[!shortcut name=ubupkg url="http://packages.ubuntu.com/%s"]]
+* [[!shortcut name=mozillazinekb url="http://kb.mozillazine.org/%s"]]
+* [[!shortcut name=freebsdwiki url="http://wiki.freebsd.org/%s"]]
+* [[!shortcut name=hackage url="http://hackage.haskell.org/package/%s"]]
+
+To add a new shortcut, use the `shortcut`
+[[ikiwiki/directive]]. In the url, "%s" is replaced with the
+text passed to the named shortcut, after [[!wikipedia url_encoding]]
+it, and '%S' is replaced with the raw, non-encoded text.
+Additionally, `%W` is replaced with the text encoded just right for
+Wikipedia. The optional `desc` parameter controls the description of
+the link.
+
+Remember that the `name` you give the shortcut will become a new
+[[ikiwiki/directive]]. Avoid using a `name` that conflicts
+with an existing directive. These directives also accept a `desc`
+parameter that will override the one provided at definition time.
+
+If you come up with a shortcut that you think others might find useful,
+consider contributing it to the [shortcuts page on the ikiwiki
+wiki](http://ikiwiki.info/shortcuts/), so that future versions of
+ikiwiki will include your shortcut in the standard underlay.
diff --git a/doc/shortcuts/discussion.mdwn b/doc/shortcuts/discussion.mdwn
new file mode 100644
index 000000000..11903a127
--- /dev/null
+++ b/doc/shortcuts/discussion.mdwn
@@ -0,0 +1,21 @@
+# Suggestions for multi-language links
+
+Sites like Wikipedia have different URLs for each language. The shortcut for Wikipedia `!wikipedia` points to `https://secure.wikimedia.org/wikipedia/en/wiki/%s` which is the English version.
+
+Do you have a suggestion on how to make that shortcut also point to a different language?
+
+1. The option to just adapt the shortcut (`s/en/de/`) is quite cumbersome for non-English speakers, and it also has the disadvantage that the shortcut has to be updated manually after each modification to the upstream ikiwiki shortcut list to stay in sync.
+1. Adding an extra shortcut for every language, e.g. `!wikipediade` with the country TLD in it, is an option, but it would make the shortcut list quite big.
+1. Adding a `lang` parameter also comes to my mind, but I do not know how feasible that is.
+
+Thanks. --[[PaulePanter]]
+
+> Does anyone have an opinion on the shortcuts for google/wikipedia pointing at the HTTPS services? Introduced by an edit by Paul Panter.
+> Personally, I think they should be separate shortcut keys. Most of my
+> google/WP usage is such that I would prefer it over HTTP (faster, less
+> resource usage at client side). However, I never use the shortcuts feature
+> in ikiwiki anyway... -- [[Jon]]
+
+>> I have been trying to decide what to do about that Jon. https links are
+>> painful for me (dialup). This really needs to be fixed at a lower level
+>> in the web than ikiwiki though, and I understand the push for https. --[[Joey]]
diff --git a/doc/sitemap.mdwn b/doc/sitemap.mdwn
new file mode 100644
index 000000000..095ef95df
--- /dev/null
+++ b/doc/sitemap.mdwn
@@ -0,0 +1,5 @@
+This map excludes discussion pages, as well as subpages that are in feeds.
+
+[[!map pages="page(*) and !*/discussion and !recentchanges
+and !bugs/* and !examples/*/* and !news/* and !tips/* and !plugins/*
+and !sandbox/* and !forum/* and !todo/* and !users/*"]]
diff --git a/doc/smileys.mdwn b/doc/smileys.mdwn
new file mode 100644
index 000000000..4c19b0585
--- /dev/null
+++ b/doc/smileys.mdwn
@@ -0,0 +1,56 @@
+This page is used to control what smileys are supported by the wiki.
+Just write the text of a smiley to display it.
+
+* \\:) [[smileys/smile.png]]
+* \\:-) [[smileys/smile.png]]
+* \\:D [[smileys/biggrin.png]]
+* \\:-D [[smileys/biggrin.png]]
+* \\B) [[smileys/smile2.png]]
+* \\B-) [[smileys/smile2.png]]
+* \\:)) [[smileys/smile3.png]]
+* \\:-)) [[smileys/smile3.png]]
+* \\;) [[smileys/smile4.png]]
+* \\;-) [[smileys/smile4.png]]
+* \\:\ [[smileys/ohwell.png]]
+* \\:-\ [[smileys/ohwell.png]]
+* \\:/ [[smileys/ohwell.png]]
+* \\:-/ [[smileys/ohwell.png]]
+* \\:| [[smileys/neutral.png]]
+* \\:-| [[smileys/neutral.png]]
+* \\>:> [[smileys/devil.png]]
+* \\X-( [[smileys/angry.png]]
+* \\<:( [[smileys/frown.png]]
+* \\:( [[smileys/sad.png]]
+* \\:-( [[smileys/sad.png]]
+* \\:-? [[smileys/tongue.png]]
+* \\:-P [[smileys/tongue.png]]
+* \\:o [[smileys/redface.png]]
+* \\|) [[smileys/tired.png]]
+* \\|-) [[smileys/tired.png]]
+* \\{OK} [[smileys/thumbs-up.png]]
+* \\{X} [[smileys/icon-error.png]]
+* \\{i} [[smileys/icon-info.png]]
+* \\(./) [[smileys/checkmark.png]]
+* \\(!) [[smileys/idea.png]]
+* \\[!] [[smileys/attention.png]]
+* \\/!\ [[smileys/alert.png]]
+* \\(?) [[smileys/question.png]]
+* \\{x} [[smileys/star_on.png]]
+* \\{*} [[smileys/star_on.png]]
+* \\{o} [[smileys/star_off.png]]
+* \\{1} [[smileys/prio1.png]]
+* \\{2} [[smileys/prio2.png]]
+* \\{3} [[smileys/prio3.png]]
+
+For example: {x} B) {x}
+
+----
+
+To change the supported smileys, just edit the lists on this page.
+Note that the format is important; each list item should start with the
+text that is turned into the smiley, escaped so that users can see what
+produces it, followed by a [[ikiwiki/WikiLink]] to the image to display.
+
+/!\ Bear in mind that the link to the image needs to be written in a way that
+will work if it's copied to other pages on the wiki. So be sure to include the
+smileys directory in the path to the file.
diff --git a/doc/smileys/alert.png b/doc/smileys/alert.png
new file mode 100644
index 000000000..5bb87e33f
--- /dev/null
+++ b/doc/smileys/alert.png
Binary files differ
diff --git a/doc/smileys/angry.png b/doc/smileys/angry.png
new file mode 100644
index 000000000..05bc69f74
--- /dev/null
+++ b/doc/smileys/angry.png
Binary files differ
diff --git a/doc/smileys/attention.png b/doc/smileys/attention.png
new file mode 100644
index 000000000..7e064566d
--- /dev/null
+++ b/doc/smileys/attention.png
Binary files differ
diff --git a/doc/smileys/biggrin.png b/doc/smileys/biggrin.png
new file mode 100644
index 000000000..f71b42c11
--- /dev/null
+++ b/doc/smileys/biggrin.png
Binary files differ
diff --git a/doc/smileys/checkmark.png b/doc/smileys/checkmark.png
new file mode 100644
index 000000000..8869caa19
--- /dev/null
+++ b/doc/smileys/checkmark.png
Binary files differ
diff --git a/doc/smileys/devil.png b/doc/smileys/devil.png
new file mode 100644
index 000000000..8684c3970
--- /dev/null
+++ b/doc/smileys/devil.png
Binary files differ
diff --git a/doc/smileys/frown.png b/doc/smileys/frown.png
new file mode 100644
index 000000000..2999b55ea
--- /dev/null
+++ b/doc/smileys/frown.png
Binary files differ
diff --git a/doc/smileys/icon-error.png b/doc/smileys/icon-error.png
new file mode 100644
index 000000000..c39e65c33
--- /dev/null
+++ b/doc/smileys/icon-error.png
Binary files differ
diff --git a/doc/smileys/icon-info.png b/doc/smileys/icon-info.png
new file mode 100644
index 000000000..c1b14f331
--- /dev/null
+++ b/doc/smileys/icon-info.png
Binary files differ
diff --git a/doc/smileys/idea.png b/doc/smileys/idea.png
new file mode 100644
index 000000000..37bc02ddc
--- /dev/null
+++ b/doc/smileys/idea.png
Binary files differ
diff --git a/doc/smileys/neutral.png b/doc/smileys/neutral.png
new file mode 100644
index 000000000..ddeb59231
--- /dev/null
+++ b/doc/smileys/neutral.png
Binary files differ
diff --git a/doc/smileys/ohwell.png b/doc/smileys/ohwell.png
new file mode 100644
index 000000000..a83adfbf5
--- /dev/null
+++ b/doc/smileys/ohwell.png
Binary files differ
diff --git a/doc/smileys/prio1.png b/doc/smileys/prio1.png
new file mode 100644
index 000000000..774d34d65
--- /dev/null
+++ b/doc/smileys/prio1.png
Binary files differ
diff --git a/doc/smileys/prio2.png b/doc/smileys/prio2.png
new file mode 100644
index 000000000..4528653c3
--- /dev/null
+++ b/doc/smileys/prio2.png
Binary files differ
diff --git a/doc/smileys/prio3.png b/doc/smileys/prio3.png
new file mode 100644
index 000000000..84332ce82
--- /dev/null
+++ b/doc/smileys/prio3.png
Binary files differ
diff --git a/doc/smileys/question.png b/doc/smileys/question.png
new file mode 100644
index 000000000..df22152e6
--- /dev/null
+++ b/doc/smileys/question.png
Binary files differ
diff --git a/doc/smileys/redface.png b/doc/smileys/redface.png
new file mode 100644
index 000000000..9a8739253
--- /dev/null
+++ b/doc/smileys/redface.png
Binary files differ
diff --git a/doc/smileys/sad.png b/doc/smileys/sad.png
new file mode 100644
index 000000000..86a059d97
--- /dev/null
+++ b/doc/smileys/sad.png
Binary files differ
diff --git a/doc/smileys/smile.png b/doc/smileys/smile.png
new file mode 100644
index 000000000..b51b3ff49
--- /dev/null
+++ b/doc/smileys/smile.png
Binary files differ
diff --git a/doc/smileys/smile2.png b/doc/smileys/smile2.png
new file mode 100644
index 000000000..43ea05f9d
--- /dev/null
+++ b/doc/smileys/smile2.png
Binary files differ
diff --git a/doc/smileys/smile3.png b/doc/smileys/smile3.png
new file mode 100644
index 000000000..c690ccc2f
--- /dev/null
+++ b/doc/smileys/smile3.png
Binary files differ
diff --git a/doc/smileys/smile4.png b/doc/smileys/smile4.png
new file mode 100644
index 000000000..f8f5b523d
--- /dev/null
+++ b/doc/smileys/smile4.png
Binary files differ
diff --git a/doc/smileys/star_off.png b/doc/smileys/star_off.png
new file mode 100644
index 000000000..c5535c3de
--- /dev/null
+++ b/doc/smileys/star_off.png
Binary files differ
diff --git a/doc/smileys/star_on.png b/doc/smileys/star_on.png
new file mode 100644
index 000000000..969908d39
--- /dev/null
+++ b/doc/smileys/star_on.png
Binary files differ
diff --git a/doc/smileys/thumbs-up.png b/doc/smileys/thumbs-up.png
new file mode 100644
index 000000000..1faabace7
--- /dev/null
+++ b/doc/smileys/thumbs-up.png
Binary files differ
diff --git a/doc/smileys/tired.png b/doc/smileys/tired.png
new file mode 100644
index 000000000..a3d5c56fb
--- /dev/null
+++ b/doc/smileys/tired.png
Binary files differ
diff --git a/doc/smileys/tongue.png b/doc/smileys/tongue.png
new file mode 100644
index 000000000..65105407c
--- /dev/null
+++ b/doc/smileys/tongue.png
Binary files differ
diff --git a/doc/soc.mdwn b/doc/soc.mdwn
new file mode 100644
index 000000000..e05543bd5
--- /dev/null
+++ b/doc/soc.mdwn
@@ -0,0 +1,20 @@
+[[!meta title="Summer of Code"]]
+
+This page includes information about ikiwiki's involvement in
+[Google Summer of Code](http://code.google.com/soc/).
+
+For SoC 2007, we started with a list of [[ideas]]. Our [[application]] was
+accepted, and the following projects were worked on:
+
+* latex plugin input/output for ikiwiki
+ by [[PatrickWinnertz]]
+ (See [[todo/latex]])
+* Implement File Upload Functionality and Image Gallery Creation
+ by Ben Coffey
+ (See [[todo/fileupload/soc-proposal]] and [[plugins/contrib/attach]])
+* Wiki WYSIWYG Editor
+ by [[TaylorKillian]]
+ (See [[todo/wikiwyg]])
+* Creating a gallery of a bunch of images
+ by [[ArpitJain]]
+ (See [[todo/Gallery]])
diff --git a/doc/soc/application.mdwn b/doc/soc/application.mdwn
new file mode 100644
index 000000000..5cdb10bb7
--- /dev/null
+++ b/doc/soc/application.mdwn
@@ -0,0 +1,96 @@
+Application submitted 03-06-2007.
+
+Based on ["What should a mentoring organization application look like?"](http://code.google.com/support/bin/answer.py?answer=60303).
+
+1. **Describe your organization.**
+
+ The ikiwiki project aims to develop a general-purpose wiki engine, with particular emphasis on personal wikis, project wikis, blogs, and collaborative software development. We provide several features unique or uncommon amongst wikis:
+
+ * Rather than inventing yet another simplistic, linear revision control system, ikiwiki uses a standard version control system such as Subversion or Git. You can edit a wiki by committing to your repository, as well as through a traditional web interface. This makes ikiwiki ideal for collaborative software development; just keep your wiki in version control next to your software. You can also take advantage of the features of these systems; for instance, you can keep a local branch of your wiki via Git.
+
+ * You can turn any set of pages into an inline news feed, complete with RSS and Atom support. You can run your weblog on ikiwiki (and many people do), run a Planet-like aggregator for external feeds, or keep a TODO and bug list with tags for completed items.
+
+ * ikiwiki provides a wiki compiler, designed to transform your wiki content into a set of static pages. You can then serve these pages as static content. ikiwiki will not fall over during a Slashdotting, because page views don't require the ikiwiki CGI; as long as Apache can keep up, your site will survive. Furthermore, you can choose whether you want to run the ikiwiki CGI for web edits or only handle commits to the underlying version control system; you can even run ikiwiki privately and just manually copy the content to another server. So if you want to put a wiki up on a server without installing any software on that server, try ikiwiki.
+
+2. **Why is your organization applying to participate in GSoC 2007? What do you hope to gain by participating?**
+
+ ikiwiki has had a strong positive response from several communities of software developers. We believe we have filled a genuine need, unaddressed by previous software. Many users have begun to take ikiwiki in unexpected directions, using it in ways we had not previously envisioned.
+
+ Thus, we believe ikiwiki has reached a stage where it would greatly benefit from more widespread exposure to the ingenuity of other developers.
+
+ Furthermore, we have a good list of existing projects on our TODO list. The nature of ikiwiki, with its highly capable plugin system and broader focus than most wikis, results in far more ideas than implementation time. We have a well-managed TODO list, ranging from minor items to major projects. We would greatly appreciate contributions toward some of our more substantial feature ideas. We believe we have a very low barrier to contribution.
+
+3. **Did your organization participate in GSoC 2005 or 2006? If so, please summarize your involvement and the successes and failures of your student projects.**
+
+ ikiwiki has not previously participated in Google Summer of Code.
+
+4. **If your organization has not previously participated in GSoC, have you applied in the past? If so, for what year(s)?**
+
+ ikiwiki has not previously applied for Google Summer of Code.
+
+5. **Who will your organization administrator be? Please include Google Account information.**
+
+ Josh Triplett <<josh@freedesktop.org>>
+
+6. **What license does your project use?**
+
+ ikiwiki uses the GNU General Public License. The basewiki,
+ incorporated into users' wikis, uses an all-permissive license. See
+ <http://ikiwiki.info/freesoftware/> for details.
+
+7. **What is the URL for your ideas page?**
+
+ <http://ikiwiki.info/soc/>
+
+8. **What is the main development mailing list for your organization?**
+
+ The ikiwiki project strongly encourages collaboration through ikiwiki itself, and thus does not have a mailing list.
+ Anyone can create an account on ikiwiki's own wiki. ikiwiki provides a bug tracker, a TODO list, and the ability
+ to create a weblog on any page. ikiwiki also includes "discussion" sub-pages on every page. The developers and mentors
+ monitor RecentChanges closely, via the webpage, email, and CIA, and will respond in a timely fashion.
+
+9. **What is the main IRC channel for your organization?**
+
+ `#ikiwiki` on OFTC (`irc.oftc.net`).
+
+10. **Does your organization have an application template you would like to see students use? If so, please provide it now.**
+
+ No application template needed.
+
+11. **Who will be your backup organization administrator? Please include Google Account information.**
+
+ Joey Hess <<joey@kitenet.net>>
+
+12. **Who will your mentors be? Please include Google Account Information.**
+
+ Joey Hess <<joey@kitenet.net>>
+
+ Josh Triplett <<josh@freedesktop.org>>
+
+ Jamey Sharp <<jamey.sharp@gmail.com>>
+
+13. **What criteria did you use to select these individuals as mentors? Please be as specific as possible.**
+
+ Joey Hess developed ikiwiki, and serves as its primary developer and maintainer.
+
+ Josh Triplett and Jamey Sharp maintain numerous ikiwikis, and have experience hacking on ikiwiki. They developed and currently maintain a set of scripts to convert other wikis to ikiwiki.
+
+14. **What is your plan for dealing with disappearing students?**
+
+ We will strongly encourage all students working on projects to create a user page with an activity blog, and update that blog regularly with the status of their project. We will use these blogs to closely monitor the progress of each student on their projects. If a student mentions problems they have encountered, we will work with them to resolve those problems. If we see no activity from a student for longer than their usual status-reporting interval, we will check with them directly to determine their status.
+
+15. **What is your plan for dealing with disappearing mentors?**
+
+ All three mentors understand the responsibility they will undertake, and they will not abdicate that responsibility lightly. In the unlikely event that one of the mentors cannot follow through with a project due to exigent circumstances, the other mentors will take up the slack and help the students working on projects with that mentor. Furthermore, we intend for all mentors to help with all projects to some degree, by collaborating with the students through ikiwiki.
+
+16. **What steps will you take to encourage students to interact with your project's community before, during and after the program?**
+
+ Before the program, we will encourage any students interested in working on ikiwiki to contact us. We will advise them to create an account on the ikiwiki ikiwiki, look over the TODO items (<http://ikiwiki.info/todo/>), add ideas of their own, and discuss existing ideas. We will also help them set up their own ikiwikis, both for experimentation and for actual use. We will suggest that students begin looking at the ikiwiki codebase and asking questions.
+
+ During the program, we would like all students working on projects to create a user page with an activity blog, and update that blog regularly with the status of their project. We also plan to aggregate these blogs into a single Summer of Code newsfeed, and suggest that students subscribe to this feed; this will allow them to observe the activity of their fellow students, to spur each other forward and help each other along the way. We plan to accept incremental patches towards a feature, or support students who wish to create their own branch.
+
+ After the program, we will continue to work towards integrating any projects that have not yet completed, and talk with students about their future plans regarding ikiwiki. If the students have started using ikiwiki for their own wikis, as we hope they will after we encourage them to experiment with it, then they will likely have a vested interest in ongoing ikiwiki development. Thus, we will encourage them to remain active by helping them become active and interested users.
+
+17. **What will you do to ensure that your accepted students stick with the project after GSoC concludes?**
+
+ Near the conclusion of GSoC, we will talk to each student about their future plans regarding ikiwiki. During the program, we will help the students set up their own ikiwikis if they have not already done so, and encourage them to use those ikiwikis to maintain a weblog, maintain their personal website, keep a TODO list, or do any of the other tasks ikiwiki proves useful for. We plan to encourage the students to remain active developers by helping them become active and interested users, and thus giving them a personal stake in the ongoing development of ikiwiki.
diff --git a/doc/soc/discussion.mdwn b/doc/soc/discussion.mdwn
new file mode 100644
index 000000000..8d0685ca8
--- /dev/null
+++ b/doc/soc/discussion.mdwn
@@ -0,0 +1,2 @@
+The SoC application period is open from now until the 15th, so if this is
+going to be done we'd better get on with it. --[[Joey]]
diff --git a/doc/soc/ideas.mdwn b/doc/soc/ideas.mdwn
new file mode 100644
index 000000000..88f06b0f1
--- /dev/null
+++ b/doc/soc/ideas.mdwn
@@ -0,0 +1,8 @@
+This list includes pages, such as [[/todo]] items, tagged `soc`. If you
+have a great idea for an ikiwiki project not on this list, please file it
+as a todo item, and ask us if it might work as a Summer of Code project,
+but please don't add the `soc` tag yourself.
+
+[[!inline pages="(todo/* or bugs/*) and link(soc) and !todo/done and
+!link(todo/done) and !bugs/done and !link(bugs/done) and
+!*/Discussion" actions=yes show=0]]
diff --git a/doc/style.css b/doc/style.css
new file mode 100644
index 000000000..b52c72b6b
--- /dev/null
+++ b/doc/style.css
@@ -0,0 +1,551 @@
+/* ikiwiki style sheet */
+
+/* Note that instead of modifying this style sheet, you can instead edit
+ * local.css and use it to override or change settings in this one.
+ */
+
+/* html5 compat */
+article,
+header,
+footer,
+nav {
+ display: block;
+}
+
+.header {
+ margin: 0;
+ font-size: 140%;
+ font-weight: bold;
+ line-height: 1em;
+ display: block;
+}
+
+.inlineheader .author {
+ margin: 0;
+ font-size: 112%;
+ font-weight: bold;
+ display: block;
+}
+
+.actions ul {
+ margin: 0;
+ padding: 6px .4em;
+ height: 1em;
+ list-style-type: none;
+}
+.actions li {
+ display: inline;
+ padding: .2em;
+}
+.pageheader .actions ul {
+ border-bottom: 1px solid #000;
+}
+
+.inlinepage .actions ul {
+ border-bottom: 0;
+}
+
+#otherlanguages ul {
+ margin: 0;
+ padding: 6px;
+ list-style-type: none;
+}
+#otherlanguages li {
+ display: inline;
+ padding: .2em .4em;
+}
+.pageheader #otherlanguages {
+ border-bottom: 1px solid #000;
+}
+
+.inlinecontent {
+ margin-top: .4em;
+}
+
+.pagefooter,
+.inlinefooter,
+.comments {
+ clear: both;
+}
+
+#pageinfo {
+ margin: 1em 0;
+ border-top: 1px solid #000;
+}
+
+.tags {
+ margin-top: 1em;
+}
+
+.inlinepage .tags {
+ display: inline;
+}
+
+.mapparent {
+ text-decoration: none;
+}
+
+.img caption {
+ font-size: 80%;
+ caption-side: bottom;
+ text-align: center;
+}
+
+img.img {
+ margin: 0.5ex;
+}
+
+.align-left {
+ float:left;
+}
+
+.align-right {
+ float:right;
+}
+
+#backlinks {
+ margin-top: 1em;
+}
+
+#searchform {
+ display: inline;
+ float: right;
+}
+
+#editcontent {
+ width: 98%;
+}
+
+.editcontentdiv {
+ width: auto;
+ overflow: auto;
+}
+
+img {
+ border-style: none;
+}
+
+pre {
+ overflow: auto;
+}
+
+div.recentchanges {
+ border-style: solid;
+ border-width: 1px;
+ overflow: auto;
+ width: auto;
+ clear: none;
+ background: #eee;
+ color: black !important;
+}
+.recentchanges .metadata {
+ padding: 0px 0.5em;
+}
+.recentchanges .changelog {
+ font-style: italic;
+ clear: both;
+ display: block;
+ padding: 1px 2px;
+ background: white !important;
+ color: black !important;
+}
+.recentchanges .desc {
+ display: none;
+}
+.recentchanges .diff {
+ display: none;
+}
+.recentchanges .committer {
+ float: left;
+ margin: 0;
+ width: 40%;
+}
+.recentchanges .committype {
+ float: left;
+ margin: 0;
+ width: 5%;
+ font-size: small;
+}
+.recentchanges .changedate {
+ float: left;
+ margin: 0;
+ width: 35%;
+ font-size: small;
+}
+.recentchanges .pagelinks,
+.recentchanges .revert {
+ float: right;
+ margin: 0;
+ width: 60%;
+}
+
+.blogform, #blogform {
+ padding: 10px 10px;
+ border: 1px solid #aaa;
+ background: #eee;
+ color: black !important;
+ width: auto;
+ overflow: auto;
+}
+
+.inlinepage {
+ padding: 10px 10px;
+ border: 1px solid #aaa;
+ overflow: auto;
+}
+
+.pagedate,
+.pagelicense,
+.pagecopyright {
+ font-style: italic;
+ display: block;
+ margin-top: 1em;
+}
+
+.archivepagedate {
+ font-style: italic;
+}
+.archivepage {
+ margin-bottom: 1em;
+}
+
+.error {
+ color: #C00;
+}
+
+.sidebar {
+ width: 20ex;
+ float: right;
+ margin-left: 4px;
+ margin-bottom: 4px;
+ margin-top: -1px;
+ padding: 0ex 2ex;
+ background: white;
+ border: 1px solid black;
+ color: black !important;
+}
+
+hr.poll {
+ height: 10pt;
+ color: white !important;
+ background: #eee;
+ border: 2px solid black;
+}
+div.poll {
+ margin-top: 1ex;
+ margin-bottom: 1ex;
+ padding: 1ex 1ex;
+ border: 1px solid #aaa;
+}
+
+span.color {
+ padding: 2px;
+}
+
+.comment-header,
+.microblog-header {
+ font-style: italic;
+ margin-top: .3em;
+}
+.comment .author,
+.microblog .author {
+ font-weight: bold;
+}
+.comment-subject {
+ font-weight: bold;
+}
+.comment-avatar {
+ float: right;
+}
+.comment {
+ border: 1px solid #aaa;
+ padding: 3px;
+}
+
+div.progress {
+ margin-top: 1ex;
+ margin-bottom: 1ex;
+ border: 1px solid #888;
+ width: 400px;
+ background: #eee;
+ color: black !important;
+ padding: 1px;
+}
+div.progress-done {
+ background: #ea6 !important;
+ color: black !important;
+ text-align: center;
+ padding: 1px;
+}
+
+/* things to hide in printouts */
+@media print {
+ .actions { display: none; }
+ .tags { display: none; }
+ .trails { display: none; }
+ .feedbutton { display: none; }
+ #searchform { display: none; }
+ .blogform, #blogform { display: none; }
+ #backlinks { display: none; }
+}
+
+/* infobox template */
+.infobox {
+ float: right;
+ margin-left: 2ex;
+ margin-top: 1ex;
+ margin-bottom: 1ex;
+ padding: 1ex 1ex;
+ border: 1px solid #aaa;
+ background: white;
+ color: black !important;
+}
+
+/* notebox template */
+.notebox {
+ float: right;
+ margin-left: 2ex;
+ margin-top: 1ex;
+ margin-bottom: 1ex;
+ padding: 1ex 1ex;
+ border: 1px solid #aaa;
+ width: 25%;
+ background: white;
+ color: black !important;
+}
+
+/* popup template and backlinks hiding */
+.popup {
+ border-bottom: 1px dotted #366;
+ color: #366;
+}
+.popup .balloon,
+.popup .paren,
+.popup .expand {
+ display: none;
+ text-align: left;
+}
+.popup:hover .balloon,
+.popup:focus .balloon {
+ position: absolute;
+ display: inline;
+ margin: 1em 0 0 -2em;
+ padding: 0.625em;
+ border: 2px solid;
+ background-color: #dee;
+ color: black;
+}
+
+/* form styling */
+fieldset {
+ margin: 1ex 0;
+ border: 1px solid black;
+}
+legend {
+ padding: 0 1ex;
+}
+.fb_submit {
+ float: left;
+ margin: 2px 0;
+}
+label.block {
+ display: block;
+}
+label.inline {
+ display: inline;
+}
+input#openid_identifier {
+ background: url(wikiicons/openidlogin-bg.gif) no-repeat;
+ background-color: #fff;
+ background-position: 0 50%;
+ color: #000;
+ padding-left: 18px;
+}
+input#searchbox {
+ background: url(wikiicons/search-bg.gif) no-repeat;
+ background-color: #fff;
+ background-position: 100% 50%;
+ color: #000;
+ padding-right: 16px;
+}
+/* invalid form fields */
+.fb_invalid {
+ color: red;
+ background: white !important;
+}
+/* required form fields */
+.fb_required {
+ font-weight: bold;
+}
+
+/* highlight plugin */
+pre.hl { color:#000000; background-color:#ffffff; }
+.hl.num { color:#2928ff; }
+.hl.esc { color:#ff00ff; }
+.hl.str { color:#ff0000; }
+.hl.dstr { color:#818100; }
+.hl.slc { color:#838183; font-style:italic; }
+.hl.com { color:#838183; font-style:italic; }
+.hl.dir { color:#008200; }
+.hl.sym { color:#000000; }
+.hl.line { color:#555555; }
+.hl.mark { background-color:#ffffbb; }
+.hl.kwa { color:#000000; font-weight:bold; }
+.hl.kwb { color:#830000; }
+.hl.kwc { color:#000000; font-weight:bold; }
+.hl.kwd { color:#010181; }
+
+/* calendar plugin */
+.month-calendar-day-this-day,
+.year-calendar-this-month {
+ background-color: #eee;
+}
+.month-calendar-day-head,
+.month-calendar-day-nolink,
+.month-calendar-day-link,
+.month-calendar-day-this-day,
+.month-calendar-day-future {
+ text-align: right;
+}
+.month-calendar-arrow A:link,
+.year-calendar-arrow A:link,
+.month-calendar-arrow A:visited,
+.year-calendar-arrow A:visited {
+ text-decoration: none;
+ font-weight: normal;
+ font-size: 150%;
+}
+
+/* outlines */
+li.L1 { list-style: upper-roman; }
+li.L2 { list-style: decimal; }
+li.L3 { list-style: lower-alpha; }
+li.L4 { list-style: disc; }
+li.L5 { list-style: square; }
+li.L6 { list-style: circle; }
+li.L7 { list-style: lower-roman; }
+li.L8 { list-style: upper-alpha; }
+
+/* tag cloud */
+.pagecloud {
+ float: right;
+ width: 30%;
+ text-align: center;
+ padding: 10px 10px;
+ border: 1px solid #aaa;
+ background: #eee;
+ color: black !important;
+}
+.smallestPC { font-size: 70%; }
+.smallPC { font-size: 85%; }
+.normalPC { font-size: 100%; }
+.bigPC { font-size: 115%; }
+.biggestPC { font-size: 130%; }
+
+/* orange feed button */
+.feedbutton {
+ background: #ff6600;
+ color: white !important;
+ border-left: 1px solid #cc9966;
+ border-top: 1px solid #ccaa99;
+ border-right: 1px solid #993300;
+ border-bottom: 1px solid #331100;
+ padding: 0px 0.5em 0px 0.5em;
+ font-family: sans-serif;
+ font-weight: bold;
+ font-size: small;
+ text-decoration: none;
+ margin-top: 1em;
+}
+.feedbutton:hover {
+ color: white !important;
+ background: #ff9900;
+}
+
+.FlattrButton {
+ display: none;
+}
+
+/* openid selector */
+#openid_choice {
+ display: none;
+}
+#openid_input_area {
+ clear: both;
+ padding: 10px;
+}
+#openid_btns, #openid_btns br {
+ clear: both;
+}
+#openid_highlight {
+ background-color: black;
+ float: left;
+}
+.openid_large_btn {
+ padding: 1em 1.5em;
+ border: 1px solid #DDD;
+ margin: 3px;
+ float: left;
+}
+.openid_small_btn {
+ padding: 4px 4px;
+ border: 1px solid #DDD;
+ margin: 3px;
+ float: left;
+}
+a.openid_large_btn:focus {
+ outline: none;
+}
+a.openid_large_btn:focus {
+ outline-style: none;
+}
+.openid_selected {
+ border: 4px solid #DDD;
+}
+
+.fileupload-content .ui-progressbar {
+ width: 200px;
+ height: 20px;
+}
+.fileupload-content .ui-progressbar-value {
+ background: url(ikiwiki/images/pbar-ani.gif);
+}
+
+.trails {
+ margin-top: 1em;
+ margin-bottom: 1em;
+}
+.trail {
+ display: block;
+ clear: both;
+ position: relative;
+}
+
+.trailprev {
+ display: block;
+ text-align: left;
+ position: absolute;
+ top: 0%;
+ left: 3%;
+ width: 30%;
+}
+
+.trailup {
+ display: block;
+ text-align: center;
+ margin-left: 35%;
+ margin-right: 35%;
+}
+
+.trailnext {
+ display: block;
+ text-align: right;
+ position: absolute;
+ top: 0%;
+ width: 30%;
+ right: 3%;
+}
+
+.trailsep {
+ display: none;
+}
diff --git a/doc/tags.mdwn b/doc/tags.mdwn
new file mode 100644
index 000000000..71d925b24
--- /dev/null
+++ b/doc/tags.mdwn
@@ -0,0 +1,26 @@
+While ikiwiki supports hierarchically categorising pages by creating
+[[SubPages|ikiwiki/SubPage]], that's often not flexible enough, and it can also be
+useful to tag pages in various non-hierarchical ways.
+
+Since this is a wiki, tagging is just a form of linking. The general rule
+is that all tags are links, but not all links are tags. So a tag is a
+special link that ikiwiki knows is intended to be used as a tag.
+
+Generally you will tag a page without putting a visible link on it.
+The [[tag_plugin|plugins/tag]] allows you to do so, like this:
+
+ \[[!tag mytag othertag thirdtag]]
+
+You can also tag a page with a visible link:
+
+ \[[!taglink mytag]]
+
+This tag will be displayed just like a regular [[ikiwiki/WikiLink]].
+
+One way to use these tags is to create a [[blog]] of pages that have a
+particular set of tags. Or just look at the [[BackLinks]] to a tag page to
+see all the pages that are tagged with it. [[Plugins]] can be written to do
+anything else with tags that you might desire.
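+
+For example, a feed of all pages tagged `mytag` (the tag name is just an
+example) could be created with something like:
+
+    \[[!inline pages="link(mytag)" show=10]]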
+
+Once you have tags, you can use the [[plugins/pagestats]] plugin to
+generate tag clouds.
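+
+For example, a cloud of all the tags under `tags/` could be generated with
+something like:
+
+    \[[!pagestats pages="tags/*"]]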
diff --git a/doc/tags/discussion.mdwn b/doc/tags/discussion.mdwn
new file mode 100644
index 000000000..d7a6297c0
--- /dev/null
+++ b/doc/tags/discussion.mdwn
@@ -0,0 +1,20 @@
+In another blog, I could tag a post with arbitrary words and not have to do
+anything else for the software to recognize it as a tag. In Ikiwiki if you
+want to tag something \[[!tag foo]] you also have to go to tags/ and create
+foo.mdwn (even if it's zero-length), because "tags are links", and links
+don't actually *link* if they have no destination. This allows for
+customization of how you present different tag feeds, but this (to me) is
+too much work and more like a category than a tag. It'd be nice if you
+could tell the tag plugin "if the tag target doesn't exist in tags/*,
+pretend it does exist and is zero-length". -- [[users/Larry_Clapp]]
+
+Never mind, I think I found the answer (or at least a pointer)
+[[here|plugins/tag/discussion/]]. Feel free to delete both these comments
+:). -- [[users/Larry_Clapp]]
+
+> Why do you have to go create the tag? A tag (or link) pointing at a page that
+> doesn't exist _does_ still exist. ikiwiki allows you to:
+>
+> * Create a pagespec to match pages linking to the "nonexistent" tag.
+> * Click on the tag to create the tag page, like any other incomplete link.
+> --[[Joey]]
diff --git a/doc/templates.mdwn b/doc/templates.mdwn
new file mode 100644
index 000000000..d0f891c21
--- /dev/null
+++ b/doc/templates.mdwn
@@ -0,0 +1,94 @@
+[[Ikiwiki]] uses many templates for many purposes. By editing its templates,
+you can fully customise its appearance, and avoid duplicate content.
+
+Ikiwiki uses the HTML::Template module as its template engine. This
+supports things like conditionals and loops in templates and is pretty
+easy to learn. All you really need to know to modify templates is this:
+
+* To insert the value of a template variable, use `<TMPL_VAR variable>`.
+* To make a block of text conditional on a variable being set use
+ `<TMPL_IF variable>text</TMPL_IF>`.
+* To use one block of text if a variable is set and a second if it's not,
+  use `<TMPL_IF variable>text<TMPL_ELSE>other text</TMPL_IF>`.
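+
+For example, a template fragment using these constructs might look something
+like this (the `title` variable name is just for illustration):
+
+    <TMPL_IF title>
+    <h1><TMPL_VAR title></h1>
+    <TMPL_ELSE>
+    <h1>(untitled)</h1>
+    </TMPL_IF>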
+
+[[!if test="enabled(template) or enabled(edittemplate)" then="""
+## template pages
+
+Template pages are regular wiki pages that are used as templates for other
+pages.
+"""]]
+
+[[!if test="enabled(template)" then="""
+The [[!iki ikiwiki/directive/template desc="template directive"]] allows
+template pages to be filled out and inserted into other pages in the wiki.
+"""]]
+
+[[!if test="enabled(edittemplate)" then="""
+The [[!iki ikiwiki/directive/edittemplate desc="edittemplate directive"]] can
+be used to make new pages default to containing text from a template
+page, which can be filled out as the page is edited.
+"""]]
+
+[[!if test="(enabled(template) or enabled(edittemplate))
+and enabled(inline)" then="""
+These template pages are currently available:
+
+[[!inline pages="templates/* and !*.tmpl and !templates/*/* and !*/discussion"
+feeds=no archive=yes sort=title template=titlepage
+rootpage=templates postformtext="Add a new template page named:"]]
+"""]]
+
+## template files
+
+Template files are unlike template pages in that they have the extension
+`.tmpl`. Template files are used extensively by Ikiwiki to generate HTML.
+They can contain HTML that would not normally be allowed on a wiki page.
+
+Template files are located in `/usr/share/ikiwiki/templates` by default;
+the `templatedir` setting can be used to make another directory be
+searched first. Customised template files can also be placed inside the
+"templates/" directory in your wiki's source -- files placed there override
+ones in the `templatedir`.
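+
+For example, an alternate directory could be configured in the wiki's setup
+file with something like (the path is just an example):
+
+    templatedir => "/home/wiki/templates",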
+
+Here is a full list of the template files used:
+
+* `page.tmpl` - Used for displaying all regular wiki pages. This is the
+ key template to customise to change the look and feel of Ikiwiki.
+ [[!if test="enabled(pagetemplate)" then="""
+ (The [[!iki ikiwiki/directive/pagetemplate desc="pagetemplate directive"]]
+ can be used to make a page use a different template than `page.tmpl`.)"""]]
+* `rsspage.tmpl` - Used for generating rss feeds for blogs.
+* `rssitem.tmpl` - Used for generating individual items on rss feeds.
+* `atompage.tmpl` - Used for generating atom feeds for blogs.
+* `atomitem.tmpl` - Used for generating individual items on atom feeds.
+* `inlinepage.tmpl` - Used for displaying a post in a blog.
+* `archivepage.tmpl` - Used for listing a page in a blog archive page.
+* `titlepage.tmpl` - Used for listing a page by title in a blog archive page.
+* `microblog.tmpl` - Used for showing a microblogging post inline.
+* `blogpost.tmpl` - Used for a form to add a post to a blog (and rss/atom links).
+* `feedlink.tmpl` - Used to add rss/atom links if `blogpost.tmpl` is not used.
+* `aggregatepost.tmpl` - Used by the aggregate plugin to create
+ a page for a post.
+* `searchform.tmpl`, `googleform.tmpl` - Used by the search plugin
+ and google plugin to add search forms to wiki pages.
+* `searchquery.tmpl` - This is an Omega template, used by the
+  search plugin.
+* `comment.tmpl` - Used by the comments plugin to display a comment.
+* `change.tmpl` - Used to create a page describing a change made to the wiki.
+* `recentchanges.tmpl` - Used for listing a change on the RecentChanges page.
+* `autoindex.tmpl` - Filled in by the autoindex plugin to make index pages.
+* `autotag.tmpl` - Filled in by the tag plugin to make tag pages.
+* `calendarmonth.tmpl`, `calendaryear.tmpl` - Used by ikiwiki-calendar to
+ make calendar archive pages.
+* `trails.tmpl` - Used by the trail plugin to generate links on each page
+ that is a member of a trail.
+* `notifyemail.tmpl` - Used by the notifymail plugin to generate mails about
+ changed pages.
+* `editpage.tmpl`, `editconflict.tmpl`, `editcreationconflict.tmpl`,
+ `editfailedsave.tmpl`, `editpagegone.tmpl`, `pocreatepage.tmpl`,
+  `editcomment.tmpl`, `commentmoderation.tmpl`, `renamesummary.tmpl`,
+ `passwordmail.tmpl`, `openid-selector.tmpl`, `revert.tmpl` - Parts of ikiwiki's user
+ interface; do not normally need to be customised.
+
+
+[[!meta robots="noindex, follow"]]
diff --git a/doc/templates/discussion.mdwn b/doc/templates/discussion.mdwn
new file mode 100644
index 000000000..c7115e4d6
--- /dev/null
+++ b/doc/templates/discussion.mdwn
@@ -0,0 +1,27 @@
+This confuses me enormously. Perhaps because I am new to ikiwiki, to perl, to Linux, etc.
+
+note and popups are templates? But they're not in the templates directory, and in my readings here, templates are supposed to be in the ../templates directory.
+
+> Ikiwiki has a basewiki underlay that provides wiki files not included in
+> your personal wiki sources. The note and popup template pages are
+> installed there, typically in `/usr/share/ikiwiki/basewiki/templates/`
+> --[[Joey]]
+
+> > And how am I able to use e.g. links? It's not listed in `/usr/share/ikiwiki/basewiki/templates`.
+> > I intend to (mis)use links for a horizontal navigation. Or would I be better off altering page.tmpl?
+> > --z3ttacht
+
+Is there a list of the TMPL_VAR-Variables that are defined by ikiwiki?
+
+What I'm trying to achieve is to print the URL of every page on the page itself and therefore I would need the corresponding value in the Template.
+
+Am I missing something? --[[jwalzer]]
+
+> If there isn't a suitable variable (I don't think there is a list at
+> the moment), a [[plugin|plugins/write]] to add one would be about 10
+> lines of perl - you'd just need to define a `pagetemplate` hook. --[[smcv]]
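+> A sketch of what such a plugin might look like (the plugin and variable
+> names here are purely illustrative, not a shipped plugin):
+>
+>     #!/usr/bin/perl
+>     package IkiWiki::Plugin::pagevar;
+>
+>     use warnings;
+>     use strict;
+>     use IkiWiki 3.00;
+>
+>     sub import {
+>         hook(type => "pagetemplate", id => "pagevar", call => \&pagetemplate);
+>     }
+>
+>     sub pagetemplate {
+>         my %params = @_;
+>         # makes <TMPL_VAR THE_PAGE> available in every template
+>         $params{template}->param(the_page => $params{page});
+>     }
+>
+>     1;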
+
+Is there a list of all the available variables somewhere, or do I just grep the source for TMPL_VAR? And is there a way to refer to a variable inside of a wiki page or does it have to be done from a template? Thanks. -- [[AdamShand]]
+
+I pulled a list of variables and posted it; it's in the history for [[templates]] under my name. [[justint]]
+
diff --git a/doc/templates/gitbranch.mdwn b/doc/templates/gitbranch.mdwn
new file mode 100644
index 000000000..4fdf937ff
--- /dev/null
+++ b/doc/templates/gitbranch.mdwn
@@ -0,0 +1,16 @@
+<span class="infobox">
+Available in a [[!taglink /git]] repository [[!taglink branch|/branches]].<br />
+Branch: <TMPL_VAR branch><br />
+Author: <TMPL_VAR author><br />
+</span>
+<TMPL_UNLESS branch>
+This template is used to create an infobox for a git branch. It uses
+these parameters:
+
+<ul>
+<li>branch - the name of the branch, prefixed with the name of one of the
+ remotes listed on the [[/git]] page and provided by the gitremotes script
+ (e.g. github/master)</li>
+<li>author - the author of the branch</li>
+</ul>
+</TMPL_UNLESS>
diff --git a/doc/templates/links.mdwn b/doc/templates/links.mdwn
new file mode 100644
index 000000000..4bd1a85bf
--- /dev/null
+++ b/doc/templates/links.mdwn
@@ -0,0 +1,16 @@
+<div class="infobox">
+[[ikiwiki_logo|logo/ikiwiki.png]]
+<ul>
+<li>[[News]]</li>
+<li>[[Download]]</li>
+<li>[[Setup]]</li>
+<li>[[Security]]</li>
+<li>[[Users|IkiWikiUsers]]</li>
+<li>[[SiteMap]]</li>
+<li>[[Contact]]</li>
+<li>[[TipJar]]</li>
+</ul>
+<a href="http://flattr.com/thing/39811/ikiwiki">
+<img src="https://api.flattr.com/button/flattr-badge-large.png"
+alt="Flattr this" title="Flattr this" /></a>
+</div>
diff --git a/doc/templates/note.mdwn b/doc/templates/note.mdwn
new file mode 100644
index 000000000..9ef5ad942
--- /dev/null
+++ b/doc/templates/note.mdwn
@@ -0,0 +1,11 @@
+<div class="notebox">
+<TMPL_VAR text>
+</div>
+<TMPL_UNLESS text>
+Use this template to insert a note into a page. The note will be styled to
+float to the right of other text on the page. This template has one
+parameter:
+<ul>
+<li>`text` - the text to display in the note
+</ul>
+</TMPL_UNLESS>
diff --git a/doc/templates/plugin.mdwn b/doc/templates/plugin.mdwn
new file mode 100644
index 000000000..322c49445
--- /dev/null
+++ b/doc/templates/plugin.mdwn
@@ -0,0 +1,19 @@
+<span class="infobox">
+Plugin: <TMPL_VAR name><br />
+Author: <TMPL_VAR author><br />
+Included in ikiwiki: [[!if test="sourcepage(plugins/contrib/*)" then="""no""" else="""yes"""]]<br />
+Enabled by default: <TMPL_IF core>yes<TMPL_ELSE>no</TMPL_IF><br />
+Included in [[/plugins/goodstuff]]: [[!if test="backlink(plugins/goodstuff)" all=no then="""yes""" else="""no"""]]<br />
+Currently enabled: [[!if test="enabled(<TMPL_VAR name>)" then="yes" else="no"]]<br />
+</span>
+[[!if test="sourcepage(plugins/contrib/*)" then="""[[!meta title="<TMPL_VAR name> (third party plugin)"]]"""]]
+<TMPL_IF core>[[!tag plugins/type/core]]</TMPL_IF>
+<TMPL_UNLESS name>
+This template is used to create an infobox for an ikiwiki plugin. It uses
+these parameters:
+<ul>
+<li>name - the name of the plugin
+<li>author - the author of the plugin
+<li>core - set to a true value if the plugin is enabled by default
+</ul>
+</TMPL_UNLESS>
diff --git a/doc/templates/popup.mdwn b/doc/templates/popup.mdwn
new file mode 100644
index 000000000..92455eb21
--- /dev/null
+++ b/doc/templates/popup.mdwn
@@ -0,0 +1,16 @@
+<TMPL_UNLESS mouseover>
+Use this template to create a popup window that is displayed when the mouse
+is over part of the page. This template has two parameters:
+<ul>
+<li>`mouseover` - This is the text or other content that triggers the
+popup.
+<li>`popup` - This should be the content of the popup window. It can be
+anything, even images or a whole little wiki page, but should not be too
+large for good usability.
+</ul>
+Note that browsers that do not support the CSS will display the popup
+inline in the page, inside square brackets.
+</TMPL_UNLESS>
+<span class="popup"><TMPL_VAR mouseover>
+<span class="paren">[</span><span class="balloon"><TMPL_VAR popup></span><span class="paren">]</span>
+</span>
diff --git a/doc/theme_market.mdwn b/doc/theme_market.mdwn
new file mode 100644
index 000000000..e9bdaa056
--- /dev/null
+++ b/doc/theme_market.mdwn
@@ -0,0 +1,13 @@
+[[!meta title="Theme Market" description="user-contributed themes for ikiwiki"]]
+
+This is a directory of user-contributed ikiwiki themes. For more information about themes, see the [[plugins/theme]] page.
+
+It is usually preferable (and more maintainable) to create a [[css]] file instead of a full theme, but sometimes the HTML produced by ikiwiki just isn't compatible with your template, so you need to modify the templates provided. This is when you need to make your own theme.
+
+Feel free to add your own [[theme|themes]] here, but first consider writing a simpler [[css]] file and adding it to the [[css market]] instead, or look at the [[themes]] shipped with ikiwiki.
+
+ * **[[AntPortal theme|https://github.com/AntPortal/ikiwiked]]**, contributed by Danny, see an example [[on the Antportal wiki|https://antportal.com/wiki/]]
+
+ * **[[Night city theme|http://anarcat.ath.cx/night_city/README/]]**, contributed by [[anarcat]], see an example [[on his homepage|http://anarcat.ath.cx/]]
+
+ * **[[Bootstrap theme|http://anonscm.debian.org/gitweb/?p=users/jak/website.git;a=summary]]**, contributed by [[JAK LINUX|http://jak-linux.org/about/]], based on [[Twitter Bootstrap|http://twitter.github.com/bootstrap/]]
diff --git a/doc/themes.mdwn b/doc/themes.mdwn
new file mode 100644
index 000000000..e15248360
--- /dev/null
+++ b/doc/themes.mdwn
@@ -0,0 +1,34 @@
+A theme provides a style.css file and any associated images to give
+ikiwiki a nice look and feel. The local.css [[CSS]] file is left
+free for you to further customize.
+
+Ikiwiki now comes with several themes contributed by users.
+You can enable the [[theme_plugin|plugins/theme]] to use any of
+these, but you can also deploy custom themes maintained by the
+community from the [[theme market]].
+
+[[!img actiontabs_small.png align=left]] The **actiontabs** theme, contributed by
+[[svend]]. This style sheet displays the action list
+(Edit, RecentChanges, etc.) as tabs.
+
+<br clear="both" />
+
+[[!img blueview_small.png align=left]] The **blueview** theme, contributed by
+[[BerndZeimetz]], featuring a tiling panoramic photo he took.
+
+<br clear="both" />
+
+[[!img goldtype_small.png align=left]] The **goldtype** theme, based on
+blueview and featuring the photography of Lars Wirzenius.
+
+<br clear="both" />
+
+[[!img monochrome_small.png align=left]] The **monochrome** theme,
+based on [[Jon]]'s homepage design.
+
+<br clear="both" />
+
+[[!img none_small.png align=left]] For completeness, ikiwiki's default
+anti-theme.
+
+<br clear="both" />
diff --git a/doc/themes/actiontabs_small.png b/doc/themes/actiontabs_small.png
new file mode 100644
index 000000000..4b05ad3dc
--- /dev/null
+++ b/doc/themes/actiontabs_small.png
Binary files differ
diff --git a/doc/themes/blueview_small.png b/doc/themes/blueview_small.png
new file mode 100644
index 000000000..74972c4c3
--- /dev/null
+++ b/doc/themes/blueview_small.png
Binary files differ
diff --git a/doc/themes/discussion.mdwn b/doc/themes/discussion.mdwn
new file mode 100644
index 000000000..5c0766a06
--- /dev/null
+++ b/doc/themes/discussion.mdwn
@@ -0,0 +1,20 @@
+I would like to contribute a theme I created and posted on github:
+
+[[https://github.com/AntPortal/ikiwiked]]
+
+For an example of the theme in action, see: [[https://antportal.com/wiki/]]
+
+> Shouldn't we just make people post their themes in the [[themes]] page? Or maybe we should make a [[theme market]]? --[[anarcat]]
+
+> I did just that. -- [[anarcat]]
+
+What is the process for merging a theme in Ikiwiki? It seems to me the
+[[Bootstrap theme|http://www2.tblein.eu/posts/How_to_have_a_nice_design_for_ikiwiki/]]
+could improve the options a lot... See the [[theme market]] for the
+links to the actual theme. -- [[anarcat]]
+
+> Step 1 is to not need two versions of page.tmpl to be maintained.
+> This is, unfortunately, the reason why I have not pulled in the bootstrap
+> theme yet. I recently made `<TMPL_IF THEME_$NAME>` be available,
+> so the page.tmpl could use that to do different things if the boostrap
+> theme was enabled. --[[Joey]]
diff --git a/doc/themes/goldtype_small.png b/doc/themes/goldtype_small.png
new file mode 100644
index 000000000..a011bb200
--- /dev/null
+++ b/doc/themes/goldtype_small.png
Binary files differ
diff --git a/doc/themes/monochrome_small.png b/doc/themes/monochrome_small.png
new file mode 100644
index 000000000..6c98100a1
--- /dev/null
+++ b/doc/themes/monochrome_small.png
Binary files differ
diff --git a/doc/themes/none_small.png b/doc/themes/none_small.png
new file mode 100644
index 000000000..8272ae606
--- /dev/null
+++ b/doc/themes/none_small.png
Binary files differ
diff --git a/doc/tipjar.mdwn b/doc/tipjar.mdwn
new file mode 100644
index 000000000..6d65a0a70
--- /dev/null
+++ b/doc/tipjar.mdwn
@@ -0,0 +1,25 @@
+Ikiwiki is [[FreeSoftware]], but you're also free to show your appreciation
+to [[Joey]] or help offset hosting costs with a donation in any amount you
+choose. If you'd like to fund development of a specific feature, see the
+[[consultants]] page.
+
+<a href="https://www.paypal.com/cgi-bin/webscr?cmd=_xclick&business=joey%40kitenet%2enet&item_name=ikiwiki&no_shipping=1&cn=Comments%3f&tax=0&currency_code=USD&lc=US&bn=PP%2dDonationsBF&charset=UTF%2d8"><img src="https://www.paypal.com/en_US/i/btn/x-click-but04.gif" alt="donate with PayPal" /></a>
+
+<script type="text/javascript">var flattr_url = 'http://ikiwiki.info';</script>
+<script src="http://api.flattr.com/button/load.js" type="text/javascript"></script>
+
+Thanks to the following people for their kind contributions:
+
+* James Westby
+* Kyle S MacLea
+* Adam Shand
+* Martin Krafft
+* Paweł Tęcza
+* Mick Pollard
+* Nico Schottelius
+* Jon Dowland
+* Amitai Schlair
+* Luca Capello
+
+(Note that this page is locked to prevent anyone from tampering with the PayPal button.
+If you prefer your donation *not* be listed here, let [[Joey]] know.)
diff --git a/doc/tips.mdwn b/doc/tips.mdwn
new file mode 100644
index 000000000..53f966001
--- /dev/null
+++ b/doc/tips.mdwn
@@ -0,0 +1,5 @@
+This page is a place to document tips and techniques for using ikiwiki.
+
+[[!inline pages="tips/* and !tips/*/*"
+feedpages="created_after(tips/howto_avoid_flooding_aggregators)" archive="yes"
+rootpage="tips" postformtext="Add a new tip about:" show=0]]
diff --git a/doc/tips/Adding_Disqus_to_your_wiki.mdwn b/doc/tips/Adding_Disqus_to_your_wiki.mdwn
new file mode 100644
index 000000000..3fd3a647d
--- /dev/null
+++ b/doc/tips/Adding_Disqus_to_your_wiki.mdwn
@@ -0,0 +1,30 @@
+<a href="http://disqus.com">Disqus</a> is a comment system that you can add to your blog to manage comments.
+
+To add it to ikiwiki, first create an account at Disqus and add your blog. Then click on the Admin link at the top of the main page.
+
+In the admin section there should be a tab called "Tools" for your site. Select the "Generic Code" option for your site, then tweak the settings so the comments box looks the way you want. This will generate a bit of javascript. Copy that code.
+
+In your ikiwiki templates, edit page.tmpl and paste in the code you copied somewhere near the bottom (I put mine just before the footer). This will add a Disqus comment box to every page on your site.
+
+If you want your blog to use the comments as well, then you need to edit the inlinepage.tmpl template too. This time, remove the lines
+
+ <TMPL_IF NAME="DISCUSSIONLINK">
+ <li><TMPL_VAR DISCUSSIONLINK></li>
+
+and replace with
+
+ <li>
+ <TMPL_IF NAME="PERMALINK">
+ <a href="<TMPL_VAR PERMALINK>">Comment</a>
+ <TMPL_ELSE>
+ <a href="<TMPL_VAR PAGEURL>">Comment</a>
+ </TMPL_IF>
+ </li>
+
+This changes the discussion link to a Comment link that takes you to the full page for that blog entry, which should contain the disqus comments form that you added before.
+
+Note: this does mean that people need javascript enabled to add a comment.
+
+---
+
+You can also try [IkiWiki::Plugin::disqus](http://code.google.com/p/ikiwiki-plugin-disqus/).
diff --git a/doc/tips/Adding_Disqus_to_your_wiki/discussion.mdwn b/doc/tips/Adding_Disqus_to_your_wiki/discussion.mdwn
new file mode 100644
index 000000000..4f2e13700
--- /dev/null
+++ b/doc/tips/Adding_Disqus_to_your_wiki/discussion.mdwn
@@ -0,0 +1 @@
+This appears to add the Disqus code to every page, including Archive pages and the blog index. Is there a way to only add it to the blog posts?
diff --git a/doc/tips/DreamHost.mdwn b/doc/tips/DreamHost.mdwn
new file mode 100644
index 000000000..338bca782
--- /dev/null
+++ b/doc/tips/DreamHost.mdwn
@@ -0,0 +1,192 @@
+# Introduction
+I had some trouble installing ikiwiki onto a shared hosting service (DreamHost) and figured I'd post the results, since it was pretty rough to get installed. These instructions should work for Perl generally (some of the docs are borrowed from Catalyst's docs), but are tailored for Ikiwiki. There are a few items I'll file as bugs as well, to aid future installation, but frankly the problems appear to be with installing perl as a non-root user, not anything specific to ikiwiki.
+
+**Note: CPAN seems to die without warning, or die after successfully installing modules. It appears to just like dying. If you encounter this, retry the last command after restarting CPAN.** Unfortunately, this doc can't cover how to fix any other problems with CPAN beyond what you find here.
+
+# Fixing CPAN
+[These instructions are paraphrased from Catalyst's documentation](http://dev.catalystframework.org/wiki/Dreamhost) :
+
+We're going to assume that you're installing CPAN and other Perl modules into ~/site/perl.
+
+In your .bashrc/.bash_profile/.profile, add:
+
+ export PERL5LIB="$HOME/site/perl/share/perl/5.8:$HOME/site/perl/share/perl/5.8.4:$HOME/site/perl/lib/perl5:$HOME/site/perl/lib/perl/5.8.4"
+
+These locations may be different on your computer. For example, I use:
+
+ export PERL5LIB="$HOME/site/perl/lib/perl5:$HOME/site/perl/lib/perl5/site_perl/5.8.8:$PERL5LIB"
+
+You probably want to add *~/site/perl/bin/* to your path, as well, since Ikiwiki's scripts are put in there.
+
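+For example, a profile line like this would do it (assuming the `~/site/perl`
+prefix used above):
+
+    export PATH="$HOME/site/perl/bin:$PATH"
+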
+Make sure to source your modified file (or logout/login). Next, run :
+
+ perl -MCPAN -e shell
+
+and say no to manual configuration. (Ed : I assume this sets up a basic CPAN with the existing site config.)
+
+Exit CPAN shell and restart, then run :
+
+ o conf makepl_arg PREFIX=~/site/perl
+ o conf commit
+ install CPAN
+
+Exit CPAN shell and restart, say no to manual configuration. Note that I used defaults except for the mbuildpl_arg parameter, which I set to *--install-base=~/site/perl/*. I believe this obviates the need for the first configuration parameter in the next section, but if you're paranoid, specify both (note added to next section). My output looked like this :
+
+ user@server:~$ perl -MCPAN -e shell
+ Sorry, we have to rerun the configuration dialog for CPAN.pm due to
+ some missing parameters...
+
+ Normally CPAN.pm keeps config variables in memory and changes need to
+ be saved in a separate 'o conf commit' command to make them permanent
+ between sessions. If you set the 'auto_commit' option to true, changes
+ to a config variable are always automatically committed to disk.
+
+ <auto_commit>
+ Always commit changes to config variables to disk? [no]
+
+ A Build.PL is run by perl in a separate process. Likewise we run
+ './Build' and './Build install' in separate processes. If you have any
+ parameters you want to pass to the calls, please specify them here.
+
+ Typical frequently used settings:
+
+ --install_base /home/xxx # different installation directory
+
+ <mbuildpl_arg>
+ Parameters for the 'perl Build.PL' command? [] --install-base=~/site/perl/
+
+ Parameters for the './Build' command? Setting might be:
+
+ --extra_linker_flags -L/usr/foo/lib # non-standard library location
+
+ <mbuild_arg>
+ Your choice: []
+
+ Do you want to use a different command for './Build install'? Sudo
+ users will probably prefer:
+
+ su root -c ./Build
+ or
+ sudo ./Build
+ or
+ /path1/to/sudo -u admin_account ./Build
+
+ <mbuild_install_build_command>
+ or some such. Your choice: [./Build]
+
+ Parameters for the './Build install' command? Typical frequently used
+ setting:
+
+ --uninst 1 # uninstall conflicting files
+
+ <mbuild_install_arg>
+ Your choice: []
+
+ Please remember to call 'o conf commit' to make the config permanent!
+
+ cpan shell -- CPAN exploration and modules installation (v1.9205)
+ ReadLine support enabled
+
+Next, run :
+
+ o conf mbuildpl_arg --install-base=~/site/perl (This may be optional, given the prior step to configure this)
+ o conf prefer_installer MB
+ o conf commit
+ install Module::Build
+
+After this step, you should have a working CPAN and Module::Build installed. This is the starting point for being able to successfully install modules via CPAN.
+
+# Update old modules
+I updated particular modules out of paranoia. Either installation errors (during previous installs) or notes on the web led me to install these. If you know what you're doing, you can skip this, but if your perl-fu is as weak as mine, you're better off installing them.
+
+ install File::BaseDir
+ install Module::Build
+ install File::Temp
+ install Digest::SHA
+ install YAML
+ install Test::Builder
+ install Test::Pod
+ install Test::Pod::Coverage
+
+# Install modules for Ikiwiki
+Install the modules required for Ikiwiki. I install all of the ones required *and* suggested because most of what I want to do requires most of these modules.
+
+ install Text::Markdown URI HTML::Parser HTML::Template
+ install CGI CGI::Session CGI::FormBuilder
+ install Mail::Sendmail HTML::Scrubber
+ install RPC::XML XML::Simple XML::Feed File::MimeInfo Locale::gettext
+
+# Changes to Ikiwiki's build/install process
+An explanation of why each of these changes was made follows these instructions. To tell the default install where your libraries are, we'll modify docwiki.setup (just another ikiwiki setup file) to add the "libdir" configuration, using ${HOME}/site/perl/lib/perl5 as the value (you'll see this again in your final ikiwiki config).
+
+Next, you'll need to pass the directory where you installed your perl modules (*~/site/perl/ in this example*) into the MakeMaker build script (verbose isn't required, but gives you more feedback since you're following along at home):
+
+ user@server:~/ikiwiki$ perl Makefile.PL PREFIX=${HOME}/site/perl/ NOTAINT=1
+ Using PERL=/usr/bin/perl
+ Writing Makefile for IkiWiki
+
+The README suggests NOTAINT for buggy Perl implementations, of which mine is one. So, add NOTAINT=1 after your calls to 'make'. The NOTAINT=1 doesn't seem to remove the problem below, though.
+
+Next, we'll need to [patch the bug described here](http://ikiwiki.info/bugs/Insecure_dependency_in_eval_while_running_with_-T_switch/) (incidentally, this bug isn't on the bugs/ or bugs/done/ page, for some reason. It's only findable via search). Edit the Ikiwiki.pm file to look like below (line numbers prefix each line) :
+
+ 1202 #my $ret=eval pagespec_translate($spec);
+ 1203 my $ret=eval possibly_foolish_untaint(pagespec_translate($spec));
+
+At this point, you can run *make* and then *make install* (*make test* fails for reasons explained below).
+
+# Ikiwiki setup
+You can follow the normal installation process, excepting a few changes in your ikiwiki.setup documents.
+
+In ikiwiki.setup, you have to make your source and destination folders use your full *unsymlinked* home directory. The home dir you see (/home/username) is actually a symlink to /home/.yourserver/username. You need to find what this is and use that directly. Run *ls -la* on ~ to find it; the output should look like:
+
+ [good]$ ls -la ~
+ lrwxrwxrwx 1 root staff 25 2007-08-03 16:44 /home/user -> /home/.server/user
+
+So far, it looks like only the source and destination parameters require this unsymlinked path, but for paranoia reasons, you may want to put them everywhere. The changelog for version 2.14 explains why this happens.
+
+Next, add your installed Perl module directory to the *libdir* parameter. It should look something like :
+
+ #libdir => "/home/me/.ikiwiki/",
+ libdir => "/home/.server/user/site/perl/lib/perl5/",
+
+# CGI Wrapper
+The CGI wrapper file will be created automatically by "ikiwiki --setup path/to/setup", as long as you have inserted a valid filename to be created into the setup file. On DreamHost, be careful not to put the ikiwiki.cgi file in a directory that has a different owner/group than the file itself (such as the main site.domain.tld/ directory): this will cause suexec to fail.
+
+The wrapper mode of "06755" doesn't seem to work. "755" appears to. However, this may be completely insecure and/or buggy, so if you know better than I, edit this doc and add it here.
+
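+A sketch of the corresponding setup-file lines (the wrapper path here is
+illustrative):
+
+    cgi_wrapper => "/home/.server/user/site.domain.tld/ikiwiki.cgi",
+    cgi_wrappermode => "755",
+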
+# Pre-created SVN repository
+DreamHost has a pretty installation and management GUI for SVN, but it means your SVN repository is pre-created. As such, you can't use the installation script they mention in the setup document, because it creates the repository for you. Instead, simply use the relevant portion of the script, skipping the repository creation. That part (from the version I installed from; *make sure you check your file as well*) is:
+
+ cd your/ikiwiki/source/dir/here
+ svn mkdir "file:///home/user/svn/yoursvnrepositoryhere/whereyouwanttoinstallto" -m "create trunk directory"
+ svn co "file:///home/user/svn/yoursvnrepositoryhere/whereyouwanttoinstallto" . # Note the dot, it's important
+ svn propset svn:ignore ".ikiwiki" . # Note the dot, it's important
+ svn add *
+ svn commit -m "initial import"
+
+# Make installing OpenID not suck
+If you try to install the Net::OpenID::Consumer module, it takes forever (and for me, fails 90% of the time). Following the [tip found here](http://www.windley.com/archives/2007/04/speeding_up_cryptdh.shtml), installing the GMP (big math) plugin greatly speeds up the process and makes it, well, work for me. However, getting this to be used by Perl requires a few more steps. First, follow the directions [to install GMP](http://gmplib.org/) (grab the package and read the INSTALL doc), but the quick steps are :
+
+ ./configure --prefix=YOUR_INSTALL_PATH_HERE # use something like ${HOME}/usr/local/
+ make
+ make check
+ make install
+
+Then you'll have to add a few variables to your environment [referenced by your compiler](http://www.psc.edu/general/software/packages/gcc/manual/gcc_36.html), namely:
+
+ export C_INCLUDE_PATH=YOUR_INSTALL_PATH_HERE/include/
+ export LIBRARY_PATH=YOUR_INSTALL_PATH_HERE/lib/
+ export LD_LIBRARY_PATH=YOUR_INSTALL_PATH_HERE/lib/
+
+Then you should be able to install the module, and it'll be faster.
+
+# Why do I have to do all of this?
+IANA Perl Expert. This is just what I found.
+
+[This appears to be a presentation outlining the problem](http://schwern.org/~schwern/talks/PREFIX/slides/slide001.html) -- Note, this site was up a few days ago, but appears to be down now. YMMV.
+
+Part of the problem appears to be where modules decide to install themselves. Markdown, for example, installs itself to .../lib/perl5, which somehow never makes its way into any of the INC paths used by Perl. Not sure why, really. The rest of the modules seem to work well with the PREFIX option, but Markdown doesn't install to the path used from the PREFIX (hence the manual modification of Makefile.PL).
+
+Basically, some stuff obeys the (apparently defunct and never really functional) PREFIX option, while the newer modules that use Module::Build, use the *install-base* option. It would be nice if Ikiwiki could handle being installed in a non-root situation, but I have no real suggestions on how to make that happen.
+
+I'm willing to setup an account for committers wanting to try installing this on a DH server. I know who Joey is, so if he can vouch for you, I'll set up the account ASAP. You can reach me at mreynolds+dhwikiproblem@loopysoft.com .
diff --git a/doc/tips/DreamHost/discussion.mdwn b/doc/tips/DreamHost/discussion.mdwn
new file mode 100644
index 000000000..258d385ae
--- /dev/null
+++ b/doc/tips/DreamHost/discussion.mdwn
@@ -0,0 +1,18 @@
+I managed to install ikiwiki on eggplant farms, with most basic features except markdown.
+
+I think ikiwiki is more suitable for VPS/dedicated server. Shared hosting doesn't fit.
+
+I just (2009/04/27) installed ikiwiki on DreamHost and the CPAN instructions here are unnecessarily complicated. I used "cpan" instead of "perl -MCPAN -e shell" and had no trouble with that portion of the install. --[[schmonz]]
+
+After tiring of managing things by hand, I've switched to using
+pkgsrc as an unprivileged user. This uses a bit more disk for my
+own copies of perl, python, etc., but in exchange I can `cd
+.../pkgsrc/www/ikiwiki && make install` and everything just works.
+Plus I get all the benefits of a package system, like easy uninstalling
+and being notified of outdated or insecure software.
+
+The only catch: sometimes the package dependency tree gets too deep
+for DreamHost's user process limit, resulting in build death. I
+work around this by resuming the build partway down the tree, then
+trying again from whatever I was actually trying to install.
+--[[schmonz]]
diff --git a/doc/tips/Emacs_and_markdown.html b/doc/tips/Emacs_and_markdown.html
new file mode 100644
index 000000000..fcff8f0a5
--- /dev/null
+++ b/doc/tips/Emacs_and_markdown.html
@@ -0,0 +1,16 @@
+I added the following to my <code>.emacs</code>.
+
+The hook is to convert tabs to spaces to avoid unpleasant
+surprises with code blocks. For source to ska-untabify see the
+<a href="http://www.emacswiki.org/cgi-bin/wiki/UntabifyUponSave">EmacsWiki</a>.
+<pre>
+(autoload 'markdown-mode "markdown-mode")
+(add-to-list 'auto-mode-alist '("\\.mdwn" . markdown-mode))
+
+(add-hook 'markdown-mode-hook
+ '(lambda ()
+ (make-local-hook 'write-contents-hooks)
+ (add-hook 'write-contents-hooks 'ska-untabify nil t)))
+</pre>
+
+;; [[DavidBremner]]
diff --git a/doc/tips/Git_repository_and_web_server_on_different_hosts.mdwn b/doc/tips/Git_repository_and_web_server_on_different_hosts.mdwn
new file mode 100644
index 000000000..58940b89f
--- /dev/null
+++ b/doc/tips/Git_repository_and_web_server_on_different_hosts.mdwn
@@ -0,0 +1,61 @@
+One may want to provide ikiwiki hosting with [[rcs/git]]+ssh access and web
+server located at different hosts. Here's a description for such
+a setup, using password-less SSH as a way of communication between
+these two hosts.
+
+Git server
+==========
+
+Let's create a user called `ikiwiki_example`. This user gets SSH
+access restricted to git pull/push, using `git-shell` as a shell.
+
+The root (bare) repository:
+
+- is stored in `~ikiwiki_example/ikiwiki_example.git`
+- is owned by `ikiwiki_example:ikiwiki_example`
+- has permissions 0700
+
+The master repository's post-update hook connects via SSH to
+`webserver` as user `ikiwiki_example`, in order to run
+`~/bin/ikiwiki.update` on `webserver`; this post-update hook, located
+in `~ikiwiki_example/ikiwiki_example.git/hooks/post-update`, is
+executable and contains:
+
+ #!/bin/sh
+ /usr/bin/ssh ikiwiki_example@webserver bin/ikiwiki.update
+
+Password-less SSH must be setup to make this possible; one can
+restrict `gitserver:ikiwiki_example` to be able to run only the needed
+command on the web server, using such a line in
+`webserver:~ikiwiki_example/.ssh/authorized_keys`:
+
+ command="bin/ikiwiki.update",from="gitserver.example.com",no-port-forwarding,no-X11-forwarding,no-agent-forwarding,no-pty ssh-rsa ...
+
+Web server
+==========
+
+Let's create a user called `ikiwiki_example` on `webserver`. She needs
+to have write permission to the destination directory.
+
+The working tree repository (`srcdir`):
+
+- is stored in `~ikiwiki_example/src`
+- is owned by `ikiwiki_example:ikiwiki_example`
+- has permissions 0700
+- has the following origin: `ikiwiki_example@gitserver:ikiwiki_example.git`
+
+The CGI wrapper is generated with ownership set to
+`ikiwiki_example:ikiwiki_example` and permissions `06755`.
+
+Password-less SSH must be setup so that `ikiwiki_example@webserver` is
+allowed to push to the master repository. As told earlier, SSH access
+to `ikiwiki_example@gitserver` is restricted to git pull/push, which
+is just what we need.
+
+The Git wrapper is generated in `~ikiwiki_example/bin/ikiwiki.update`:
+
+ git_wrapper => '/home/ikiwiki_example/bin/ikiwiki.update'
+
+As previously explained, this wrapper is run over SSH by the master
+repository's post-update hook; it pulls updates from the master
+repository and triggers a wiki refresh.
diff --git a/doc/tips/Google_custom_search.mdwn b/doc/tips/Google_custom_search.mdwn
new file mode 100644
index 000000000..1093029f5
--- /dev/null
+++ b/doc/tips/Google_custom_search.mdwn
@@ -0,0 +1,12 @@
+Instead of the [[plugins/search]] plugin, you could embed [Google Custom Search](http://www.google.com/cse) for site search functionality.
+
+**Unfortunately** you need [[plugins/rawhtml]] enabled.
+
+Once you've created your "custom search engine", just drop in the search box HTML like I've done in my [Debian tips and tricks](http://dabase.com/tips/) [source](http://git.webconverger.org/?p=faq.git;a=blob;f=tips.mdwn).
+
+If you see odd "save failed" error messages in the Google CSE control panel, try `/usr/lib/WebKit/GtkLauncher` from [webkit](http://packages.qa.debian.org/w/webkit.html).
+
+
+# Alternatively
+
+You could use the [[Google_plugin|plugins/google]], available from version 2.67 onwards.
diff --git a/doc/tips/Importing_posts_from_Wordpress.mdwn b/doc/tips/Importing_posts_from_Wordpress.mdwn
new file mode 100644
index 000000000..1ea82b862
--- /dev/null
+++ b/doc/tips/Importing_posts_from_Wordpress.mdwn
@@ -0,0 +1,102 @@
+Use case: You want to move away from Wordpress to Ikiwiki as your blogging/website platform, but you want to retain your old posts.
+
+[This](http://git.chris-lamb.co.uk/?p=ikiwiki-wordpress-import.git) is a simple tool that generates [git-fast-import](http://www.kernel.org/pub/software/scm/git/docs/git-fast-import.html)-compatible data from a WordPress export XML file.
+
+WordPress categories are mapped onto Ikiwiki tags. The ability to import comments is planned.
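+
+As a rough sketch of that mapping (the helper name below is made up
+for illustration; the import tool's actual code differs):

```python
def category_to_tag(category):
    """Turn a WordPress category name into an ikiwiki tag directive.

    The directive is assembled by string concatenation so that this
    wiki page does not expand it as a real directive.
    """
    name = category.strip().replace(' ', '_')
    return '[[' + ('!tag %s]]' % name)
```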
+
+The script uses the [BeautifulSoup][] module.
+
+[BeautifulSoup]: http://www.crummy.com/software/BeautifulSoup/
+
+-----
+
+I include a modified version of this script. It adds the ability to write \[[!tag foo]] directives, which the original intended, but didn't actually do.
+
+-- [[users/simonraven]]
+
+[[ikiwiki-wordpress-import]]
+
+-----
+
+Perhaps slightly insane, but here's an XSLT style sheet that handles my pages. It's basic, but sufficient to get started.
+Note that I had to break up the ikiwiki meta strings to post this.
+
+-- JasonRiedy
+
+ <?xml version="1.0" encoding="UTF-8"?>
+ <xsl:stylesheet version="2.0"
+ xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
+ xmlns:content="http://purl.org/rss/1.0/modules/content/"
+ xmlns:wp="http://wordpress.org/export/1.0/">
+
+ <xsl:output method="text"/>
+ <xsl:output method="text" name="txt"/>
+
+ <xsl:variable name='newline'><xsl:text>
+ </xsl:text></xsl:variable>
+
+ <xsl:template match="channel">
+ <xsl:apply-templates select="item[wp:post_type = 'post']"/>
+ </xsl:template>
+
+ <xsl:template match="item">
+ <xsl:variable name="idnum" select="format-number(wp:post_id,'0000')" />
+ <xsl:variable name="basename"
+ select="concat('wp-posts/post-',$idnum)" />
+ <xsl:variable name="filename"
+ select="concat($basename, '.html')" />
+ <xsl:text>Creating </xsl:text>
+ <xsl:value-of select="concat($filename, $newline)" />
+ <xsl:result-document href="{$filename}" format="txt">
+ <xsl:text>[[</xsl:text><xsl:text>meta title="</xsl:text>
+ <xsl:value-of select="replace(title, '&quot;', '&amp;ldquo;')"/>
+ <xsl:text>"]]</xsl:text><xsl:value-of select="$newline"/>
+ <xsl:text>[[</xsl:text><xsl:text>meta date="</xsl:text>
+ <xsl:value-of select="pubDate"/>
+ <xsl:text>"]]</xsl:text><xsl:value-of select="$newline"/>
+ <xsl:text>[[</xsl:text><xsl:text>meta updated="</xsl:text>
+ <xsl:value-of select="pubDate"/>
+ <xsl:text>"]]</xsl:text> <xsl:value-of select="$newline"/>
+ <xsl:value-of select="$newline"/>
+ <xsl:value-of select="content:encoded"/>
+ <xsl:text>
+
+ </xsl:text>
+ <xsl:apply-templates select="category[@domain='tag' and not(@nicename)]">
+ <xsl:sort select="name()"/>
+ </xsl:apply-templates>
+ </xsl:result-document>
+ <xsl:apply-templates select="wp:comment">
+ <xsl:sort select="date"/>
+      <xsl:with-param name="basename" select="$basename"/>
+ </xsl:apply-templates>
+ </xsl:template>
+
+ <xsl:template match="wp:comment">
+ <xsl:param name="basename"/>
+ <xsl:variable name="cnum" select="format-number(wp:comment_id, '000')" />
+ <xsl:variable name="filename" select="concat($basename, '/comment_', $cnum, '._comment')"/>
+ <xsl:variable name="nickname" select="concat(' nickname=&quot;', wp:comment_author, '&quot;')" />
+ <xsl:variable name="username" select="concat(' username=&quot;', wp:comment_author_url, '&quot;')" />
+ <xsl:variable name="ip" select="concat(' ip=&quot;', wp:comment_author_IP, '&quot;')" />
+ <xsl:variable name="date" select="concat(' date=&quot;', wp:comment_date_gmt, '&quot;')" />
+ <xsl:result-document href="{$filename}" format="txt">
+ <xsl:text>[[</xsl:text><xsl:text>comment format=html</xsl:text><xsl:value-of select="$newline"/>
+ <xsl:value-of select="$nickname"/>
+ <xsl:value-of select="$username"/>
+ <xsl:value-of select="$ip"/>
+ <xsl:value-of select="$date"/>
+ <xsl:text>subject=""</xsl:text><xsl:value-of select="$newline"/>
+ <xsl:text>content="""</xsl:text><xsl:value-of select="$newline"/>
+ <xsl:value-of select="wp:comment_content"/>
+ <xsl:value-of select="$newline"/>
+ <xsl:text>"""]]</xsl:text><xsl:value-of select="$newline"/>
+ </xsl:result-document>
+ </xsl:template>
+
+ <xsl:template match="category">
+ <xsl:text>[</xsl:text><xsl:text>[</xsl:text><xsl:text>!tag "</xsl:text><xsl:value-of select="."/><xsl:text>"]]</xsl:text>
+ <xsl:value-of select="$newline"/>
+ </xsl:template>
+
+ </xsl:stylesheet>
diff --git a/doc/tips/Importing_posts_from_Wordpress/discussion.mdwn b/doc/tips/Importing_posts_from_Wordpress/discussion.mdwn
new file mode 100644
index 000000000..f1028bc38
--- /dev/null
+++ b/doc/tips/Importing_posts_from_Wordpress/discussion.mdwn
@@ -0,0 +1,44 @@
+Using a new debian 6.0.5 system, I get the following error trying to run the script:
+
+ ~/bin/ikiwiki-wordpress-import.py "Name" email@domain log < ~/share/wordpress.2012-08-23.xml.edited | git-fast-import
+ Traceback (most recent call last):
+ File "/home/luke/bin/ikiwiki-wordpress-import.py", line 139, in <module>
+ main(*sys.argv[1:])
+ File "/home/luke/bin/ikiwiki-wordpress-import.py", line 65, in main
+ content += x.find('content:encoded').string.replace('\r\n', '\n')
+ AttributeError: 'NoneType' object has no attribute 'replace'
+ git-fast-import statistics:
+
+Any ideas on what I am doing wrong would be appreciated.
+
+-----
+
+When I attempt to use this script, I get the following error:
+
+    warning: Not updating refs/heads/master (new tip 26b1787fca04f2f9772b6854843fe99fe06e6088 does not contain fc0ad65d14d88fd27a6cee74c7cef3176f6900ec)
+
+I have git 1.5.6.5, any ideas?
+
+Thanks!!
+
+-----
+
+### KeyError: 146
+
+I also get this error, here's the output (it seems to stem from an error in the python script):
+
+<pre>
+Traceback (most recent call last):
+ File "../ikiwiki-wordpress-import.py", line 74, in <module>
+ main(*sys.argv[1:])
+ File "../ikiwiki-wordpress-import.py", line 54, in main
+ data = content.encode('ascii', 'html_replace')
+ File "../ikiwiki-wordpress-import.py", line 30, in <lambda>
+ % htmlentitydefs.codepoint2name[ord(c)] for c in x.object[x.start:x.end]]), x.end))
+KeyError: 146
+warning: Not updating refs/heads/master (new tip 6dca6ac939e12966bd64ce8a822ef14fe60622b2 does not contain 60b798dbf92ec5ae92f18acac3075c4304aca120)
+git-fast-import statistics:
+</pre>
+
+etc.
+
+(Removed now dead info and blah blah.)
+
+> It works fine.... The script is picky about having everything in proper UTF-8, **and** proper XML and HTML escaping; you need that for a successful import. I let Emacs remove DOS line endings, and it works OK (if on *nix of some sort, of course). The thing with `git fast-import` is that you have to `git reset` afterwards, then (say you put them in posts/) `git checkout posts`, `git add posts`, and commit. I don't know if this is a characteristic of `git fast-import`, but this is the way I get my posts to exist on the filesystem. If I don't do this, I lose the data. If you get that "Not updating..." error, then just --force the import in. --[[users/simonraven]]
diff --git a/doc/tips/JavaScript_to_add_index.html_to_file:_links.mdwn b/doc/tips/JavaScript_to_add_index.html_to_file:_links.mdwn
new file mode 100644
index 000000000..250bb26af
--- /dev/null
+++ b/doc/tips/JavaScript_to_add_index.html_to_file:_links.mdwn
@@ -0,0 +1,63 @@
+The source file `foo/bar.mdwn` or `foo/bar.html` generates the
+page `foo/bar/index.html`, but the links to the page appear
+as "`foo/bar/`". This is fine (and recommended) for pages
+served by an HTTP server, but it doesn't work when browsing
+the pages directly using a `file:` URL. The latter might be
+desirable when testing pages before upload, or if you want to
+read pages when off-line without access to a web server.
+
+Here is a JavaScript "`onload`" script which fixes the URLs
+if the `location.protocol` isn't `http` or `https`:
+
+ function fixLinks() {
+ var scheme = location.protocol;
+ if (scheme=="http:" || scheme=="https:") return;
+ var links = document.getElementsByTagName("a");
+ for (var i = links.length; --i >= 0; ) {
+ var link = links[i];
+ var href = link.href;
+ var hlen = href.length;
+ if (hlen > 0 && link.protocol==scheme && href.charAt(hlen-1) == "/")
+ links[i].href = href + "index.html";
+ }
+ }
+
+This can be placed in `page.tmpl`:
+
+ <html>
+ <head>
+ <script language="JavaScript">
+ function fixLinks() {
+ ...
+ }
+ </script>
+ </head>
+ <body onload="javascript:fixLinks();">
+ ...
+ </html>
+
+This script has not been extensively tested.
+
+---
+
+A version that handles anchors:
+
+
+ function fixLinks() {
+ var scheme = location.protocol;
+ if (scheme != "file:") return;
+ var links = document.getElementsByTagName("a");
+ for (var i = links.length; --i >= 0; ) {
+ var link = links[i];
+ var href = link.href;
+ var anchor = "";
+ var anchorIndex = href.indexOf("#");
+ if (anchorIndex != -1) {
+ anchor = href.substring(anchorIndex);
+ href = href.substring(0, anchorIndex);
+ };
+ var hlen = href.length;
+ if (hlen > 0 && link.protocol==scheme && href.charAt(hlen-1) == "/")
+ links[i].href = href + "index.html" + anchor;
+ }
+ }
diff --git a/doc/tips/JavaScript_to_add_index.html_to_file:_links/discusion.mdwn b/doc/tips/JavaScript_to_add_index.html_to_file:_links/discusion.mdwn
new file mode 100644
index 000000000..e65616fba
--- /dev/null
+++ b/doc/tips/JavaScript_to_add_index.html_to_file:_links/discusion.mdwn
@@ -0,0 +1,3 @@
+Or you can just rebuild the wiki with --no-usedirs, which seems like a much
+more sensible solution to me, if you're browsing a build with file:// a
+lot. Still, the javascript is probably useful in some cases.. --[[Joey]]
diff --git a/doc/tips/JavaScript_to_add_index.html_to_file:_links/discussion.mdwn b/doc/tips/JavaScript_to_add_index.html_to_file:_links/discussion.mdwn
new file mode 100644
index 000000000..3b9f29e77
--- /dev/null
+++ b/doc/tips/JavaScript_to_add_index.html_to_file:_links/discussion.mdwn
@@ -0,0 +1,2 @@
+Please make this an ikiwiki feature. By that I mean, "server side". Cheers, thanks, --Dave
+> After I left this comment, I found that --no-usedirs suits my purposes: I can navigate my local wiki with file:/// urls... Hope this helps someone!
diff --git a/doc/tips/Make_calendar_start_week_on_Monday.mdwn b/doc/tips/Make_calendar_start_week_on_Monday.mdwn
new file mode 100644
index 000000000..5bce4b649
--- /dev/null
+++ b/doc/tips/Make_calendar_start_week_on_Monday.mdwn
@@ -0,0 +1,9 @@
+To accomplish this on a blog setup, I ran:
+
+ mkdir ${SRCDIR}/templates
+ cp /usr/share/ikiwiki/templates/calendar* ${SRCDIR}/templates/
+ sed -i 's/^\(\[\[!calendar\)/\1 week_start_day="1"/' ${SRCDIR}/templates/calendar* ${SRCDIR}/sidebar.mdwn
+
+where `${SRCDIR}` was the source directory for the blog. This overrides the standard templates with ones that have the `week_start_day="1"` option added. If the upstream templates change, one has to manually update the locally overriding ones.
+
+I initially forgot that the auto-generated `sidebar.mdwn` page also needs the change, which cost me some head-scratching.
diff --git a/doc/tips/Make_calendar_start_week_on_Monday/discussion.mdwn b/doc/tips/Make_calendar_start_week_on_Monday/discussion.mdwn
new file mode 100644
index 000000000..fffd587d8
--- /dev/null
+++ b/doc/tips/Make_calendar_start_week_on_Monday/discussion.mdwn
@@ -0,0 +1 @@
+It should be pointed out that copying the templates is optional -- you only have to add **week_start_day="1"** to the calendar directive in the sidebar.
diff --git a/doc/tips/add_chatterbox_to_blog.mdwn b/doc/tips/add_chatterbox_to_blog.mdwn
new file mode 100644
index 000000000..e07e36b07
--- /dev/null
+++ b/doc/tips/add_chatterbox_to_blog.mdwn
@@ -0,0 +1,24 @@
+If you use twitter or identi.ca, here's how to make a box
+on the side of your blog that holds your recent status updates
+from there, like I have on [my blog](http://kitenet.net/~joey/blog/)
+--[[Joey]]
+
+* Enable the [[plugins/aggregate]] plugin, and set up a cron
+ job for it.
+* At the top of your blog's page, add something like the following.
+ You'll want to change the urls of course. Be sure to also change
+ the inline directive's [[PageSpec]] to link to the location the
+ feed is aggregated to, which will be a subpage of the page
+ you put this on (blog in this example):
+
+ \[[!template id=note text="""
+ \[[!aggregate expirecount=5 name="dents" url="http://identi.ca/joeyh"
+ feedurl="http://identi.ca/api/statuses/user_timeline/joeyh.atom"]]
+ \[[!inline pages="internal(./blog/dents/*)" template=microblog
+ show=5 feeds=no]]
+ """]]
+
+* To filter out `@-replies`, append "and !*@*" to the [[ikiwiki/PageSpec]].
+ The same technique can be used for other filtering.
+
+Note: Works best with ikiwiki 3.10 or better.
diff --git a/doc/tips/add_chatterbox_to_blog/discussion.mdwn b/doc/tips/add_chatterbox_to_blog/discussion.mdwn
new file mode 100644
index 000000000..a3d686409
--- /dev/null
+++ b/doc/tips/add_chatterbox_to_blog/discussion.mdwn
@@ -0,0 +1,43 @@
+The example you gave looks a bit odd.
+
+This is what I did from your example (still trying to learn the more complex things ;).
+
+<pre>
+\[[!template id=note text="""
+\[[!aggregate expirecount=5 name=kijkaqawej url=http://identi.ca/kjikaqawej
+feedurl=http://identi.ca/api/statuses/user_timeline/kjikaqawej.atom]]
+\[[!inline pages="internal(kijkaqawej/*)" template=microblog show=5 feeds=no]] """]]
+</pre>
+
+mine, live, here: <http://simonraven.kisikew.org/blog/meta/microblog-feed/>
+
+I expected something like: sidebar, with a number, and displaying them in the sidebar, but they don't display (similar to what you have on your blog).
+
+On the [[/ikiwiki/pagespec]] page, it says "internal" pages aren't "first-class" wiki pages and that it's best not to display them directly -- so how do you manage to display them? I'd like to display their name, and what they link to, in the sidebar or otherwise in the main body.
+
+> That's what the inline does, displays the internal pages.
+>
+> You need to fix your pagespec to refer to where the pages are aggregated
+> to, under the page that contains the aggregate directive. In your example,
+> it should be `internal(./blog/meta/microblog-feed/kijkaqawej/*)` --[[Joey]]
+
+>> Oooh, I see, it's referring to an absolute path (relative to the site), right?
+>> Thanks :).
+
+>>> Right, PageSpecs are always absolute paths unless prefixed with `./`
+>>> (somewhat confusingly, since WikiLinks are always relative unless
+>>> prefixed with `/` ...) --[[Joey]]
+
+>> This is not working for me at all, all I get is some SHA1 hash all the time. I've tried variants of the `internal()` arg, and nothing gets spit out. --[[simonraven]]
+
+>>> Sounds like [[!debbug 380212]]?
+>>> If so, the fix is to use Text::Markdown, or markdown 1.0.2 instead of buggy
+>>> old markdown 1.0.1. --[[Joey]]
+
+>> `ii libtext-markdown-perl 1.0.21-1 Markdown and MultiMarkdown markup languages library`
+>>
+>> I'm using `Text::Markdown` due to its "multi-markdown" support. Yes, it does seem exactly like [[!debbug 380212]] .
+>> Maybe update it from CPAN + dh-make-perl (if there's a new one, that is) --[[simonraven]]
+>> I've just built and installed `libtext-markdown-perl 1.0.21-1` from dh-make-perl & CPAN, and regenerated that page.. let's see what happens... no hashes, but nothing else either:
+>>
+>> "kijkaqawej: last checked 10 minutes ago (25 posts)" -- inside of a box, no display of posts.
diff --git a/doc/tips/blog_script.mdwn b/doc/tips/blog_script.mdwn
new file mode 100644
index 000000000..1dfd71538
--- /dev/null
+++ b/doc/tips/blog_script.mdwn
@@ -0,0 +1,6 @@
+I have a [blog](http://git.kitenet.net/?p=joey/home.git;a=blob_plain;f=bin/blog)
+program that I use to write blog posts in a text editor. The first line I
+enter is used as the title, and it automatically comes up with a unique page
+name based on the title and handles all the details of posting to my blog.
+--[[Joey]]
+
diff --git a/doc/tips/comments_feed.mdwn b/doc/tips/comments_feed.mdwn
new file mode 100644
index 000000000..3d6a8c449
--- /dev/null
+++ b/doc/tips/comments_feed.mdwn
@@ -0,0 +1,17 @@
+You've enabled the [[plugins/comments]] plugin, so a set of pages on your
+blog can have comments added to them. Pages with comments even have special
+feeds that can be used to subscribe to those comments. But you'd like to
+add a feed that contains all the comments posted to any page. Here's how:
+
+ \[[!inline pages="comment(*)" template=comment]]
+
+The special [[ikiwiki/PageSpec]] matches all comments. The
+[[template|templates]] causes the comments to be displayed formatted
+nicely.
+
+---
+
+It's also possible to make a feed of comments that are held pending
+moderation.
+
+ \[[!inline pages="comment_pending(*)" template=comment]]
diff --git a/doc/tips/convert_blogger_blogs_to_ikiwiki.mdwn b/doc/tips/convert_blogger_blogs_to_ikiwiki.mdwn
new file mode 100644
index 000000000..e71e2132d
--- /dev/null
+++ b/doc/tips/convert_blogger_blogs_to_ikiwiki.mdwn
@@ -0,0 +1,5 @@
+Daniel Burrows
+[explains](http://algebraicthunk.net/~dburrows/blog/entry/howto-convert-your-blogger-or-blogspot-blog-to-ikiwiki/)
+how to convert your Blogger/BlogSpot blog to ikiwiki.
+
+François Marier used a [different approach](http://feeding.cloud.geek.nz/posts/moving-from-blogger-to-ikiwiki-and-branchable/) on a more recent version of Blogger.
diff --git a/doc/tips/convert_mediawiki_to_ikiwiki.mdwn b/doc/tips/convert_mediawiki_to_ikiwiki.mdwn
new file mode 100644
index 000000000..e60b413dd
--- /dev/null
+++ b/doc/tips/convert_mediawiki_to_ikiwiki.mdwn
@@ -0,0 +1,286 @@
+[[!toc levels=2]]
+
+Mediawiki is a dynamically-generated wiki which stores its data in a
+relational database. Pages are marked up using a proprietary markup. It is
+possible to import the contents of a Mediawiki site into an ikiwiki,
+converting some of the Mediawiki conventions into Ikiwiki ones.
+
+The following instructions describe ways of obtaining the current version of
+the wiki. We do not yet cover importing the history of edits.
+
+Another set of instructions and conversion tools (which imports the full history)
+can be found at <http://github.com/mithro/media2iki>
+
+## Step 1: Getting a list of pages
+
+The first bit of information you require is a list of pages in the Mediawiki.
+There are several different ways of obtaining these.
+
+### Parsing the output of `Special:Allpages`
+
+Mediawikis have a special page called `Special:Allpages` which lists all the
+pages for a given namespace on the wiki.
+
+If you fetch the output of this page to a local file with something like
+
+ wget -q -O tmpfile 'http://your-mediawiki/wiki/Special:Allpages'
+
+you can extract the list of page names using the following Python script. Note
+that this script is sensitive to the specific markup used on the page, so if
+you have tweaked your mediawiki theme a lot from the original, you will need
+to adjust this script too:
+
+ import sys
+ from xml.dom.minidom import parse, parseString
+
+ dom = parse(sys.argv[1])
+ tables = dom.getElementsByTagName("table")
+ pagetable = tables[-1]
+ anchors = pagetable.getElementsByTagName("a")
+ for a in anchors:
+ print a.firstChild.toxml().\
+ replace('&amp;','&').\
+ replace('&lt;','<').\
+ replace('&gt;','>')
+
+Also, if you have pages with titles that need to be encoded to be represented
+in HTML, you may need to add further processing to the last line.
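+
+For example, a fuller decoding step could use the standard library's
+entity tables instead of hand-written `replace` calls (a sketch using
+Python 3's `html` module, unlike the Python 2 script above):

```python
import html

def decode_title(raw):
    """Decode all named and numeric HTML entities in a page name."""
    return html.unescape(raw)
```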
+
+Note that by default, `Special:Allpages` will only list pages in the main
+namespace. You need to add a `&namespace=XX` argument to get pages in a
+different namespace. (See below for the default list of namespaces)
+
+Note that the page names obtained this way will not include any namespace
+specific prefix: e.g. `Category:` will be stripped off.
+
+### Querying the database
+
+If you have access to the relational database in which your mediawiki data is
+stored, it is possible to derive a list of page names from this. With mediawiki's
+MySQL backend, the page table is, appropriately enough, called `page`:
+
+ SELECT page_namespace, page_title FROM page;
+
+As with the previous method, you will need to do some filtering based on the
+namespace.
+
+### namespaces
+
+The list of default namespaces in mediawiki is available from <http://www.mediawiki.org/wiki/Manual:Namespace#Built-in_namespaces>. Here are reproduced the ones you are most likely to encounter if you are running a small mediawiki install for your own purposes:
+
+[[!table data="""
+Index | Name | Example
+0 | Main | Foo
+1 | Talk | Talk:Foo
+2 | User | User:Jon
+3 | User talk | User_talk:Jon
+6 | File | File:Barack_Obama_signature.svg
+10 | Template | Template:Prettytable
+14 | Category | Category:Pages_needing_review
+"""]]
+
+## Step 2: fetching the page data
+
+Once you have a list of page names, you can fetch the data for each page.
+
+### Method 1: via HTTP and `action=raw`
+
+You need to create two derived strings from the page titles: the
+destination path for the page and the source URL. Assuming `$pagename`
+contains a pagename obtained above, and `$wiki` contains the URL to your
+mediawiki's `index.php` file:
+
+ src=`echo "$pagename" | tr ' ' _ | sed 's,&,&amp;,g'`
+    dest=`echo "$pagename" | tr ' ' _ | sed 's,&,__38__,g'`
+
+ mkdir -p `dirname "$dest"`
+ wget -q "$wiki?title=$src&action=raw" -O "$dest"
+
+You may need to add more conversions here depending on the precise page titles
+used in your wiki.
+
+If you are trying to fetch pages from a different namespace to the default,
+you will need to prefix the page title with the relevant prefix, e.g.
+`Category:` for category pages. You probably don't want to prefix it to the
+output page, but you may want to vary the destination path (i.e. insert an
+extra directory component corresponding to your ikiwiki's `tagbase`).
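+
+For instance, the destination path could be computed along these lines
+(a sketch; the `tagbase` default and the file layout are assumptions
+to adapt to your wiki):

```python
def dest_path(title, tagbase='tags'):
    """Map a fetched page title to an ikiwiki source file path.

    Category pages go under tagbase; everything else just gets
    spaces replaced by underscores.
    """
    prefix = 'Category:'
    if title.startswith(prefix):
        return '%s/%s.mdwn' % (tagbase, title[len(prefix):].replace(' ', '_'))
    return title.replace(' ', '_') + '.mdwn'
```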
+
+### Method 2: via HTTP and `Special:Export`
+
+Mediawiki also has a special page `Special:Export` which can be used to obtain
+the source of the page and other metadata such as the last contributor, or the
+full history, etc.
+
+You need to send a `POST` request to the `Special:Export` page. See the source
+of the page fetched via `GET` to determine the correct arguments.
+
+You will then need to write an XML parser to extract the data you need from
+the result.
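+
+A minimal sketch of such a parser (this assumes the `export-0.3`
+schema namespace; check the `xmlns` attribute of your wiki's actual
+output and adjust):

```python
import xml.etree.ElementTree as ET

# Namespace of the MediaWiki XML export format (version assumed here)
NS = {'mw': 'http://www.mediawiki.org/xml/export-0.3/'}

def extract_pages(xml_data):
    """Map each exported page title to the wikitext of the revision
    contained in the dump (the latest one, for a current-version export)."""
    root = ET.fromstring(xml_data)
    pages = {}
    for page in root.findall('mw:page', NS):
        title = page.findtext('mw:title', namespaces=NS)
        text = page.findtext('mw:revision/mw:text', namespaces=NS)
        pages[title] = text or ''
    return pages
```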
+
+### Method 3: via the database
+
+It is possible to extract the page data from the database with some
+well-crafted queries.
+
+## Step 3: format conversion
+
+The next step is to convert Mediawiki conventions into Ikiwiki ones.
+
+### categories
+
+Mediawiki uses a special page name prefix to define "Categories", which
+otherwise behave like ikiwiki tags. You can convert every Mediawiki category
+into an ikiwiki tag name using a script such as
+
+ import sys, re
+ pattern = r'\[\[Category:([^\]]+)\]\]'
+
+ def manglecat(mo):
+ return '\[[!tag %s]]' % mo.group(1).strip().replace(' ','_')
+
+ for line in sys.stdin.readlines():
+        res = re.search(pattern, line)
+ if res:
+ sys.stdout.write(re.sub(pattern, manglecat, line))
+ else: sys.stdout.write(line)
+
+## Step 4: Mediawiki plugin or Converting to Markdown
+
+You can use a plugin to make ikiwiki support Mediawiki syntax, or you can
+convert pages to a format ikiwiki understands.
+
+### Step 4a: Mediawiki plugin
+
+The [[plugins/contrib/mediawiki]] plugin can be used by ikiwiki to interpret
+most of the Mediawiki syntax.
+
+The following things are not working:
+
+* templates
+* tables
+* spaces and other funky characters ("?") in page names
+
+### Step 4b: Converting pages
+
+#### Converting to Markdown
+
+There is a Python script for converting from the Mediawiki format to Markdown in [[mithro]]'s conversion repository at <http://github.com/mithro/media2iki>. *WARNING:* While the script tries to preserve everything it can, Markdown syntax is not as flexible as Mediawiki's, so the conversion is lossy!
+
+ # The script needs the mwlib library to work
+ # If you don't have easy_install installed, apt-get install python-setuptools
+ sudo easy_install mwlib
+
+ # Get the repository
+ git clone git://github.com/mithro/media2iki.git
+ cd media2iki
+
+ # Do a conversion
+ python mediawiki2markdown.py --no-strict --no-debugger <my mediawiki file> > output.md
+
+
+[[mithro]] doesn't frequent this page, so please report issues on the [github issue tracker](https://github.com/mithro/media2iki/issues).
+
+## Scripts
+
+There is a repository of tools for converting MediaWiki to Git-based Markdown wiki formats (such as ikiwiki and github wikis) at <http://github.com/mithro/media2iki>. It also includes a standalone tool for converting from the Mediawiki format to Markdown.
+
+[[Albert]] wrote a ruby script to convert from mediawiki's database to ikiwiki at <https://github.com/docunext/mediawiki2gitikiwiki>
+
+[[scy]] wrote a python script to convert from mediawiki XML dumps to git repositories at <https://github.com/scy/levitation>.
+
+[[Anarcat]] wrote a python script to convert from a mediawiki website to ikiwiki at git://src.anarcat.ath.cx/mediawikigitdump.git/. The script doesn't need any special access or privileges and communicates with the documented API (so it's a bit slower, but allows you to mirror sites you are not managing, like parts of Wikipedia). The script can also incrementally import new changes from a running site, through RecentChanges inspection. It also supports mithro's new Mediawiki2markdown converter (which I have a copy here: git://src.anarcat.ath.cx/media2iki.git/).
+
+> Some assembly is required to get Mediawiki2markdown and its mwlib
+> gitmodule available in the right place for it to use.. perhaps you could
+> automate that? --[[Joey]]
+
+> > You mean a debian package? :) media2iki is actually a submodule, so you need to go through extra steps to install it. mwlib being the most annoying part... I have fixed my script so it looks for media2iki directly in the submodule and improved the install instructions in the README file, but I'm not sure I can do much more short of starting to package the whole thing... --[[anarcat]]
+
+>>> You may have forgotten to push that, I don't see those changes.
+>>> Packaging the python library might be a good 1st step.
+>>> --[[Joey]]
+
+> Also, when I try to run it with -t on www.amateur-radio-wiki.net, it
+> fails on some html in the page named "4_metres". On archiveteam.org,
+> it fails trying to write to a page filename starting with "/", --[[Joey]]
+
+> > can you show me exactly which commandline arguments you're using? also, I have made improvements to the converter too, also available here: git://src.anarcat.ath.cx/media2iki.git/ -- [[anarcat]]
+
+>>> Not using your new converter, just the installation I did earlier
+>>> today:
+>>> --[[Joey]]
+
+<pre>
+fetching page 4 metres from http://www.amateur-radio-wiki.net//index.php?action=raw&title=4+metres into 4_metres.mdwn
+Unknown tag TagNode tagname='div' vlist={'style': {u'float': u'left', u'border': u'2px solid #aaa', u'margin-left': u'20px'}}->'div' div
+Traceback (most recent call last):
+ File "./mediawikigitdump.py", line 298, in <module>
+ fetch_allpages(namespace)
+ File "./mediawikigitdump.py", line 82, in fetch_allpages
+ fetch_page(page.getAttribute('title'))
+ File "./mediawikigitdump.py", line 187, in fetch_page
+ c.parse(urllib.urlopen(url).read())
+ File "/home/joey/tmp/mediawikigitdump/mediawiki2markdown.py", line 285, in parse
+ self.parse_node(ast)
+ File "/home/joey/tmp/mediawikigitdump/mediawiki2markdown.py", line 76, in parse_node
+ f(node)
+ File "/home/joey/tmp/mediawikigitdump/mediawiki2markdown.py", line 88, in on_article
+ self.parse_children(node)
+ File "/home/joey/tmp/mediawikigitdump/mediawiki2markdown.py", line 83, in parse_children
+ self.parse_node(child)
+ File "/home/joey/tmp/mediawikigitdump/mediawiki2markdown.py", line 76, in parse_node
+ f(node)
+ File "/home/joey/tmp/mediawikigitdump/mediawiki2markdown.py", line 413, in on_section
+ self.parse_node(child)
+ File "/home/joey/tmp/mediawikigitdump/mediawiki2markdown.py", line 76, in parse_node
+ f(node)
+ File "/home/joey/tmp/mediawikigitdump/mediawiki2markdown.py", line 83, in parse_children
+ self.parse_node(child)
+ File "/home/joey/tmp/mediawikigitdump/mediawiki2markdown.py", line 76, in parse_node
+ f(node)
+ File "/home/joey/tmp/mediawikigitdump/mediawiki2markdown.py", line 474, in on_tagnode
+ assert not options.STRICT
+AssertionError
+zsh: exit 1 ./mediawikigitdump.py -v -t http://www.amateur-radio-wiki.net/
+</pre>
+
+<pre>
+joey@wren:~/tmp/mediawikigitdump>./mediawikigitdump.py -v -t http://archiveteam.org
+fetching page list from namespace 0 ()
+found 222 pages
+fetching page /Sites using MediaWiki (English) from http://archiveteam.org/index.php?action=raw&title=%2FSites+using+MediaWiki+%28English%29 into /Sites_using_MediaWiki_(English).mdwn
+Traceback (most recent call last):
+ File "./mediawikigitdump.py", line 298, in <module>
+ fetch_allpages(namespace)
+ File "./mediawikigitdump.py", line 82, in fetch_allpages
+ fetch_page(page.getAttribute('title'))
+ File "./mediawikigitdump.py", line 188, in fetch_page
+ f = open(filename, 'w')
+IOError: [Errno 13] Permission denied: u'/Sites_using_MediaWiki_(English).mdwn'
+zsh: exit 1 ./mediawikigitdump.py -v -t http://archiveteam.org
+</pre>
+
+> > > > > I have updated my script to call the parser without strict mode and to trim leading slashes (and /../, for that matter...) -- [[anarcat]]
+
+> > > > > > Getting this error with the new version on any site I try (when using -t only): `TypeError: argument 1 must be string or read-only character buffer, not None`
+> > > > > > bisecting, commit 55941a3bd89d43d09b0c126c9088eee0076b5ea2 broke it.
+> > > > > > --[[Joey]]
+
+> > > > > > > I can't reproduce here, can you try with -v or -d to try to trace down the problem? -- [[anarcat]]
+
+<pre>
+fetching page list from namespace 0 ()
+found 473 pages
+fetching page 0 - 9 from http://www.amateur-radio-wiki.net/index.php?action=raw&title=0+-+9 into 0_-_9.mdwn
+Traceback (most recent call last):
+ File "./mediawikigitdump.py", line 304, in <module>
+ main()
+ File "./mediawikigitdump.py", line 301, in main
+ fetch_allpages(options.namespace)
+ File "./mediawikigitdump.py", line 74, in fetch_allpages
+ fetch_page(page.getAttribute('title'))
+ File "./mediawikigitdump.py", line 180, in fetch_page
+ f.write(options.convert(urllib.urlopen(url).read()))
+TypeError: argument 1 must be string or read-only character buffer, not None
+zsh: exit 1 ./mediawikigitdump.py -v -d -t http://www.amateur-radio-wiki.net/
+</pre>
diff --git a/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn b/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn
new file mode 100644
index 000000000..f1b0598ee
--- /dev/null
+++ b/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn
@@ -0,0 +1,669 @@
+20100428 - I just wrote a simple ruby script which will connect to a mysql server and then recreate the pages and their revision histories with Grit. It also does one simple conversion: equals-sign (MediaWiki) headings become pound (markdown) headings. Enjoy!
+
+<http://github.com/docunext/mediawiki2gitikiwiki>
+
+-- [[users/Albert]]
+
+----
+
+I wrote a script that will download all the latest revisions of a mediawiki site. In short, it does a good part of the stuff required for the migration: it downloads the goods (ie. the latest version of every page, automatically) and commits the resulting structure. There's still a good few pieces missing for an actual complete conversion to ikiwiki, but it's a pretty good start. It only talks with mediawiki through HTTP, so no special access is necessary. The downside of that is that it will not attempt to download every revision for performance reasons. The code is here: git://anarcat.ath.cx/software/mediawikigitdump.git/ and git://anarcat.ath.cx/software/media2iki.git/ See header of the file for more details and todos. -- [[users/Anarcat]] 2010-10-15
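+
+A script like this ultimately fetches each page through MediaWiki's raw
+export URL. A minimal sketch in Python of how such a URL is built (the wiki
+base URL and page title below are illustrative assumptions, not taken from
+the script itself):
+
```python
from urllib.parse import urlencode

def raw_url(base, title):
    """Build a MediaWiki raw-export URL for one page title."""
    return base.rstrip('/') + '/index.php?' + urlencode(
        {'action': 'raw', 'title': title})

# Spaces in the title are encoded as '+', e.g. "0 - 9" becomes 0+-+9.
print(raw_url('http://www.amateur-radio-wiki.net', '0 - 9'))
```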
+
+----
+
+The u32 page is excellent, but I wonder if documenting the procedure here
+would be worthwhile. Who knows, the remote site might disappear. But also
+there are some variations on the approach that might be useful:
+
+ * using a python script and the dom library to extract the page names from
+ Special:Allpages (such as
+ <http://www.staff.ncl.ac.uk/jon.dowland/unix/docs/get_pagenames.py>)
+ * Or, querying the mysql back-end to get the names
+ * using WWW::MediaWiki for importing/exporting pages from the wiki, instead
+ of Special::Export
+
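+The first variation, scraping page names out of Special:Allpages with
+Python's dom library, might look roughly like this (the sample markup is a
+made-up fragment; a real Allpages listing is bigger and varies between
+MediaWiki versions):
+
```python
from xml.dom.minidom import parseString

# Made-up Allpages fragment; real output differs between MediaWiki versions.
SAMPLE = """<table>
<tr><td><a href="/wiki/Main_Page" title="Main Page">Main Page</a></td>
<td><a href="/wiki/Sandbox" title="Sandbox">Sandbox</a></td></tr>
</table>"""

def page_names(xhtml):
    """Return the title attribute of every link in the listing."""
    dom = parseString(xhtml)
    return [a.getAttribute('title') for a in dom.getElementsByTagName('a')]

print(page_names(SAMPLE))
```
+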
+Also, some detail on converting mediawiki transclusion to ikiwiki inlines...
+
+-- [[users/Jon]]
+
+----
+
+> "Who knows, the remote site might disappear.". Right now, it appears to
+> have done just that. -- [[users/Jon]]
+
+I have managed to recover most of the site using the Internet Archive. What
+I was unable to retrieve I have rewritten. You can find a copy of the code
+at <http://github.com/mithro/media2iki>
+
+> This is excellent news. However, I'm still keen on there being a
+> comprehensive and up-to-date set of instructions on *this* site. I wouldn't
+> suggest importing that material into ikiwiki like-for-like (not least for
+> [[licensing|freesoftware]] reasons), but it's excellent to have it available
+> for reference, especially since it (currently) is the only set of
+> instructions that gives you the whole history.
+>
+> The `mediawiki.pm` that was at u32.net is licensed GPL-2. I'd like to see it
+> cleaned up and added to IkiWiki proper (although I haven't requested this
+> yet, I suspect the way it (ab)uses linkify would disqualify it at present).
+>
+> I've imported Scott's initial `mediawiki.pm` into a repository at
+> <http://github.com/jmtd/mediawiki.pm> as a start.
+> -- [[Jon]]
+
+----
+
+The iki-fast-load ruby script from the u32 page is given below:
+
+ #!/usr/bin/env ruby
+
+ # This script is called on the final sorted, de-spammed revision
+ # XML file.
+ #
+ # It doesn't currently check for no-op revisions... I believe
+ # that git-fast-load will dutifully load them even though nothing
+ # happened. I don't care to solve this by adding a file cache
+ # to this script. You can run iki-diff-next.rb to highlight any
+ # empty revisions that need to be removed.
+ #
+ # This turns each node into an equivalent file.
+ # It does not convert spaces to underscores in file names.
+ # This would break wikilinks.
+ # I suppose you could fix this with mod_speling or mod_rewrite.
+ #
+ # It replaces nodes in the Image: namespace with the files themselves.
+
+
+ require 'rubygems'
+ require 'node-callback'
+ require 'time'
+ require 'ostruct'
+
+
+ # pipe is the stream to receive the git-fast-import commands
+ # putfrom is true if this branch has existing commits on it, false if not.
+ def format_git_commit(pipe, f)
+ # Need to escape backslashes and double-quotes for git?
+ # No, git breaks when I do this.
+ # For the filename "path with \\", git sez: bad default revision 'HEAD'
+ # filename = '"' + filename.gsub('\\', '\\\\\\\\').gsub('"', '\\"') + '"'
+
+ # In the calls below, length must be the size in bytes!!
+ # TODO: I haven't figured out how this works in the land of UTF8 and Ruby 1.9.
+ pipe.puts "commit #{f.branch}"
+ pipe.puts "committer #{f.username} <#{f.email}> #{f.timestamp.rfc2822}"
+ pipe.puts "data #{f.message.length}\n#{f.message}\n"
+ pipe.puts "from #{f.branch}^0" if f.putfrom
+ pipe.puts "M 644 inline #{f.filename}"
+ pipe.puts "data #{f.content.length}\n#{f.content}\n"
+ pipe.puts
+ end
+
+> Would be nice to know where you could get "node-callbacks"... this thing is useless without it. --[[users/simonraven]]
+
+
+Mediawiki.pm - A plugin which supports mediawiki format.
+
+ #!/usr/bin/perl
+ # By Scott Bronson. Licensed under the GPLv2+ License.
+ # Extends Ikiwiki to be able to handle Mediawiki markup.
+ #
+ # To use the Mediawiki Plugin:
+ # - Install Text::MediawikiFormat
+ # - Turn off prefix_directives in your setup file.
+ # (TODO: we probably don't need to do this anymore?)
+ # prefix_directives => 1,
+ # - Add this plugin on Ikiwiki's path (perl -V, look for @INC)
+ # cp mediawiki.pm something/IkiWiki/Plugin
+ # - And enable it in your setup file
+ # add_plugins => [qw{mediawiki}],
+ # - Finally, turn off the link plugin in setup (this is important)
+ # disable_plugins => [qw{link}],
+ # - Rebuild everything (actually, this should be automatic right?)
+ # - Now all files with a .mediawiki extension should be rendered properly.
+
+
+ package IkiWiki::Plugin::mediawiki;
+
+ use warnings;
+ use strict;
+ use IkiWiki 2.00;
+ use URI;
+
+
+ # This is a gross hack... We disable the link plugin so that our
+ # linkify routine is always called. Then we call the link plugin
+ # directly for all non-mediawiki pages. Ouch... Hopefully Ikiwiki
+ # will be updated soon to support multiple link plugins.
+ require IkiWiki::Plugin::link;
+
+ # Even if T:MwF is not installed, we can still handle all the linking.
+ # The user will just see Mediawiki markup rather than formatted markup.
+ eval q{use Text::MediawikiFormat ()};
+ my $markup_disabled = $@;
+
+ # Work around a UTF8 bug in Text::MediawikiFormat
+ # http://rt.cpan.org/Public/Bug/Display.html?id=26880
+ unless($markup_disabled) {
+ no strict 'refs';
+ no warnings;
+ *{'Text::MediawikiFormat::uri_escape'} = \&URI::Escape::uri_escape_utf8;
+ }
+
+ my %metaheaders; # keeps track of redirects for pagetemplate.
+ my %tags; # keeps track of tags for pagetemplate.
+
+
+ sub import { #{{{
+ hook(type => "checkconfig", id => "mediawiki", call => \&checkconfig);
+ hook(type => "scan", id => "mediawiki", call => \&scan);
+ hook(type => "linkify", id => "mediawiki", call => \&linkify);
+ hook(type => "htmlize", id => "mediawiki", call => \&htmlize);
+ hook(type => "pagetemplate", id => "mediawiki", call => \&pagetemplate);
+ } # }}}
+
+
+ sub checkconfig
+ {
+ return IkiWiki::Plugin::link::checkconfig(@_);
+ }
+
+
+ my $link_regexp = qr{
+ \[\[(?=[^!]) # beginning of link
+ ([^\n\r\]#|<>]+) # 1: page to link to
+ (?:
+ \# # '#', beginning of anchor
+ ([^|\]]+) # 2: anchor text
+ )? # optional
+
+ (?:
+ \| # followed by '|'
+ ([^\]\|]*) # 3: link text
+ )? # optional
+ \]\] # end of link
+ ([a-zA-Z]*) # optional trailing alphas
+ }x;
+
+
+ # Convert spaces in the passed-in string into underscores.
+ # If passed in undef, returns undef without throwing errors.
+ sub underscorize
+ {
+ my $var = shift;
+ $var =~ tr{ }{_} if $var;
+ return $var;
+ }
+
+
+ # Underscorize, strip leading and trailing space, and scrunch
+ # multiple runs of spaces into one underscore.
+ sub scrunch
+ {
+ my $var = shift;
+ if($var) {
+ $var =~ s/^\s+|\s+$//g; # strip leading and trailing space
+ $var =~ s/\s+/ /g; # squash multiple spaces to one
+ }
+ return $var;
+ }
+
+
+ # Translates Mediawiki paths into Ikiwiki paths.
+ # It needs to be pretty careful because Mediawiki and Ikiwiki handle
+ # relative vs. absolute exactly opposite from each other.
+ sub translate_path
+ {
+ my $page = shift;
+ my $path = scrunch(shift);
+
+ # always start from root unless we're doing relative shenanigans.
+ $page = "/" unless $path =~ /^(?:\/|\.\.)/;
+
+ my @result = ();
+ for(split(/\//, "$page/$path")) {
+ if($_ eq '..') {
+ pop @result;
+ } else {
+ push @result, $_ if $_ ne "";
+ }
+ }
+
+ # temporary hack working around http://ikiwiki.info/bugs/Can__39__t_create_root_page/index.html?updated
+ # put this back the way it was once this bug is fixed upstream.
+ # This is actually a major problem because now Mediawiki pages can't link from /Git/git-svn to /git-svn. And upstream appears to be uninterested in fixing this bug. :(
+ # return "/" . join("/", @result);
+ return join("/", @result);
+ }
+
+
+ # Figures out the human-readable text for a wikilink
+ sub linktext
+ {
+ my($page, $inlink, $anchor, $title, $trailing) = @_;
+ my $link = translate_path($page,$inlink);
+
+ # translate_path always produces an absolute link.
+ # get rid of the leading slash before we display this link.
+ $link =~ s#^/##;
+
+ my $out = "";
+ if($title) {
+ $out = IkiWiki::pagetitle($title);
+ } else {
+ $link = $inlink if $inlink =~ /^\s*\//;
+ $out = $anchor ? "$link#$anchor" : $link;
+ if(defined $title && $title eq "") {
+ # a bare pipe appeared in the link...
+ # user wants to strip namespace and trailing parens.
+ $out =~ s/^[A-Za-z0-9_-]*://;
+ $out =~ s/\s*\(.*\)\s*$//;
+ }
+ # A trailing slash suppresses the leading slash
+ $out =~ s#^/(.*)/$#$1#;
+ }
+ $out .= $trailing if defined $trailing;
+ return $out;
+ }
+
+
+ sub tagpage ($)
+ {
+ my $tag=shift;
+
+ if (exists $config{tagbase} && defined $config{tagbase}) {
+ $tag=$config{tagbase}."/".$tag;
+ }
+
+ return $tag;
+ }
+
+
+ # Pass a URL and optional text associated with it. This call turns
+ # it into fully-formatted HTML the same way Mediawiki would.
+ # Counter is used to number untitled links sequentially on the page.
+ # It should be set to 1 when you start parsing a new page. This call
+ # increments it automatically.
+ sub generate_external_link
+ {
+ my $url = shift;
+ my $text = shift;
+ my $counter = shift;
+
+ # Mediawiki trims off trailing commas.
+ # And apparently it does entity substitution first.
+ # Since we can't, we'll fake it.
+
+ # trim any leading and trailing whitespace
+ $url =~ s/^\s+|\s+$//g;
+
+ # url properly terminates on > but must special-case &gt;
+ my $trailer = "";
+ $url =~ s{(\&(?:gt|lt)\;.*)$}{ $trailer = $1, ''; }eg;
+
+ # Trim some potential trailing chars, put them outside the link.
+ my $tmptrail = "";
+ $url =~ s{([,)]+)$}{ $tmptrail .= $1, ''; }eg;
+ $trailer = $tmptrail . $trailer;
+
+ my $title = $url;
+ if(defined $text) {
+ if($text eq "") {
+ $text = "[$$counter]";
+ $$counter += 1;
+ }
+ $text =~ s/^\s+|\s+$//g;
+ $text =~ s/^\|//;
+ } else {
+ $text = $url;
+ }
+
+ return "<a href='$url' title='$title'>$text</a>$trailer";
+ }
+
+
+ # Called to handle bookmarks like \[[#heading]] or \[[ text ]]#a
+ sub generate_fragment_link
+ {
+ my $url = shift;
+ my $text = shift;
+
+ my $inurl = $url;
+ my $intext = $text;
+ $url = scrunch($url);
+
+ if(defined($text) && $text ne "") {
+ $text = scrunch($text);
+ } else {
+ $text = $url;
+ }
+
+ $url = underscorize($url);
+
+ # For some reason Mediawiki puts blank titles on all its fragment links.
+ # I don't see why we would duplicate that behavior here.
+ return "<a href='$url'>$text</a>";
+ }
+
+
+ sub generate_internal_link
+ {
+ my($page, $inlink, $anchor, $title, $trailing, $proc) = @_;
+
+ # Ikiwiki's link plugin wrecks this line when displaying on the site.
+ # Until the code highlighter plugin can turn off link finding,
+ # always escape double brackets in double quotes: \[[
+ if($inlink eq '..') {
+ # Mediawiki doesn't touch links like \[[..#hi|ho]].
+ return "\[[" . $inlink . ($anchor?"#$anchor":"") .
+ ($title?"|$title":"") . "]]" . $trailing;
+ }
+
+ my($linkpage, $linktext);
+ if($inlink =~ /^ (:?) \s* Category (\s* \: \s*) ([^\]]*) $/x) {
+ # Handle category links
+ my $sep = $2;
+ $inlink = $3;
+ $linkpage = IkiWiki::linkpage(translate_path($page, $inlink));
+ if($1) {
+ # Produce a link but don't add this page to the given category.
+ $linkpage = tagpage($linkpage);
+ $linktext = ($title ? '' : "Category$sep") .
+ linktext($page, $inlink, $anchor, $title, $trailing);
+ $tags{$page}{$linkpage} = 1;
+ } else {
+ # Add this page to the given category but don't produce a link.
+ $tags{$page}{$linkpage} = 1;
+ &$proc(tagpage($linkpage), $linktext, $anchor);
+ return "";
+ }
+ } else {
+ # It's just a regular link
+ $linkpage = IkiWiki::linkpage(translate_path($page, $inlink));
+ $linktext = linktext($page, $inlink, $anchor, $title, $trailing);
+ }
+
+ return &$proc($linkpage, $linktext, $anchor);
+ }
+
+
+ sub check_redirect
+ {
+ my %params=@_;
+
+ my $page=$params{page};
+ my $destpage=$params{destpage};
+ my $content=$params{content};
+
+ return "" if $page ne $destpage;
+
+ if($content !~ /^ \s* \#REDIRECT \s* \[\[ ( [^\]]+ ) \]\]/x) {
+ # this page isn't a redirect, render it normally.
+ return undef;
+ }
+
+ # The rest of this function is copied from the redir clause
+ # in meta::preprocess and actually handles the redirect.
+
+ my $value = $1;
+ $value =~ s/^\s+|\s+$//g;
+
+ my $safe=0;
+ if ($value !~ /^\w+:\/\//) {
+ # it's a local link
+ my ($redir_page, $redir_anchor) = split /\#/, $value;
+
+ add_depends($page, $redir_page);
+ my $link=bestlink($page, underscorize(translate_path($page,$redir_page)));
+ if (! length $link) {
+ return "<b>Redirect Error:</b> <nowiki>\[[$redir_page]] not found.</nowiki>";
+ }
+
+ $value=urlto($link, $page);
+ $value.='#'.$redir_anchor if defined $redir_anchor;
+ $safe=1;
+
+ # redir cycle detection
+ $pagestate{$page}{mediawiki}{redir}=$link;
+ my $at=$page;
+ my %seen;
+ while (exists $pagestate{$at}{mediawiki}{redir}) {
+ if ($seen{$at}) {
+ return "<b>Redirect Error:</b> cycle found on <nowiki>\[[$at]]</nowiki>";
+ }
+ $seen{$at}=1;
+ $at=$pagestate{$at}{mediawiki}{redir};
+ }
+ } else {
+ # it's an external link
+ $value = encode_entities($value);
+ }
+
+ my $redir="<meta http-equiv=\"refresh\" content=\"0; URL=$value\" />";
+ $redir=scrub($redir) if !$safe;
+ push @{$metaheaders{$page}}, $redir;
+
+ return "Redirecting to $value ...";
+ }
+
+
+ # Feed this routine a string containing <nowiki>...</nowiki> sections,
+ # this routine calls your callback for every section not within nowikis,
+ # collecting its return values and returning the rewritten string.
+ sub skip_nowiki
+ {
+ my $content = shift;
+ my $proc = shift;
+
+ my $result = "";
+ my $state = 0;
+
+ for(split(/(<nowiki[^>]*>.*?<\/nowiki\s*>)/s, $content)) {
+ $result .= ($state ? $_ : &$proc($_));
+ $state = !$state;
+ }
+
+ return $result;
+ }
+
+
+ # Converts all links in the page, wiki and otherwise.
+ sub linkify (@)
+ {
+ my %params=@_;
+
+ my $page=$params{page};
+ my $destpage=$params{destpage};
+ my $content=$params{content};
+
+ my $file=$pagesources{$page};
+ my $type=pagetype($file);
+ my $counter = 1;
+
+ if($type ne 'mediawiki') {
+ return IkiWiki::Plugin::link::linkify(@_);
+ }
+
+ my $redir = check_redirect(%params);
+ return $redir if defined $redir;
+
+ # this code was copied from MediawikiFormat.pm.
+ # Heavily changed because MF.pm screws up escaping when it does
+ # this awful hack: $uricCheat =~ tr/://d;
+ my $schemas = [qw(http https ftp mailto gopher)];
+ my $re = join "|", map {qr/\Q$_\E/} @$schemas;
+ my $schemes = qr/(?:$re)/;
+ # And this is copied from URI:
+ my $reserved = q(;/?@&=+$,); # NOTE: no colon or [] !
+ my $uric = quotemeta($reserved) . $URI::unreserved . "%#";
+
+ my $result = skip_nowiki($content, sub {
+ $_ = shift;
+
+ # Escape any anchors
+ #s/<(a[\s>\/])/&lt;$1/ig;
+ # Disabled because this appears to screw up the aggregate plugin.
+ # I guess we'll rely on Iki to post-sanitize this sort of stuff.
+
+ # Replace external links, http://blah or [http://blah]
+ s{\b($schemes:[$uric][:$uric]+)|\[($schemes:[$uric][:$uric]+)([^\]]*?)\]}{
+ generate_external_link($1||$2, $3, \$counter)
+ }eg;
+
+ # Handle links that only contain fragments.
+ s{ \[\[ \s* (\#[^|\]'"<>&;]+) (?:\| ([^\]'"<>&;]*))? \]\] }{
+ generate_fragment_link($1, $2)
+ }xeg;
+
+ # Match all internal links
+ s{$link_regexp}{
+ generate_internal_link($page, $1, $2, $3, $4, sub {
+ my($linkpage, $linktext, $anchor) = @_;
+ return htmllink($page, $destpage, $linkpage,
+ linktext => $linktext,
+ anchor => underscorize(scrunch($anchor)));
+ });
+ }eg;
+
+ return $_;
+ });
+
+ return $result;
+ }
+
+
+ # Find all WikiLinks in the page.
+ sub scan (@)
+ {
+ my %params = @_;
+ my $page=$params{page};
+ my $content=$params{content};
+
+ my $file=$pagesources{$page};
+ my $type=pagetype($file);
+
+ if($type ne 'mediawiki') {
+ return IkiWiki::Plugin::link::scan(@_);
+ }
+
+ skip_nowiki($content, sub {
+ $_ = shift;
+ while(/$link_regexp/g) {
+ generate_internal_link($page, $1, '', '', '', sub {
+ my($linkpage, $linktext, $anchor) = @_;
+ push @{$links{$page}}, $linkpage;
+ return undef;
+ });
+ }
+ return '';
+ });
+ }
+
+
+ # Convert the page to HTML.
+ sub htmlize (@)
+ {
+ my %params=@_;
+ my $page = $params{page};
+ my $content = $params{content};
+
+
+ return $content if $markup_disabled;
+
+ # Do a little preprocessing to babysit Text::MediawikiFormat
+ # If a line begins with tabs, T:MwF won't convert it into preformatted blocks.
+ $content =~ s/^\t/ /mg;
+
+ my $ret = Text::MediawikiFormat::format($content, {
+
+ allowed_tags => [#HTML
+ # MediawikiFormat default
+ qw(b big blockquote br caption center cite code dd
+ div dl dt em font h1 h2 h3 h4 h5 h6 hr i li ol p
+ pre rb rp rt ruby s samp small strike strong sub
+ sup table td th tr tt u ul var),
+ # Mediawiki Specific
+ qw(nowiki),
+ # Our additions
+ qw(del ins), # These should have been added all along.
+ qw(span), # Mediawiki allows span but that's rather scary...?
+ qw(a), # this is unfortunate; should handle links after rendering the page.
+ ],
+
+ allowed_attrs => [
+ qw(title align lang dir width height bgcolor),
+ qw(clear), # BR
+ qw(noshade), # HR
+ qw(cite), # BLOCKQUOTE, Q
+ qw(size face color), # FONT
+ # For various lists, mostly deprecated but safe
+ qw(type start value compact),
+ # Tables
+ qw(summary width border frame rules cellspacing
+ cellpadding valign char charoff colgroup col
+ span abbr axis headers scope rowspan colspan),
+ qw(id class name style), # For CSS
+ # Our additions
+ qw(href),
+ ],
+
+ }, {
+ extended => 0,
+ absolute_links => 0,
+ implicit_links => 0
+ });
+
+ return $ret;
+ }
+
+
+ # This is only needed to support the check_redirect call.
+ sub pagetemplate (@)
+ {
+ my %params = @_;
+ my $page = $params{page};
+ my $destpage = $params{destpage};
+ my $template = $params{template};
+
+ # handle metaheaders for redirects
+ if (exists $metaheaders{$page} && $template->query(name => "meta")) {
+ # avoid duplicate meta lines
+ my %seen;
+ $template->param(meta => join("\n", grep { (! $seen{$_}) && ($seen{$_}=1) } @{$metaheaders{$page}}));
+ }
+
+ $template->param(tags => [
+ map {
+ link => htmllink($page, $destpage, tagpage($_), rel => "tag")
+ }, sort keys %{$tags{$page}}
+ ]) if exists $tags{$page} && %{$tags{$page}} && $template->query(name => "tags");
+
+ # It's an rss/atom template. Add any categories.
+ if ($template->query(name => "categories")) {
+ if (exists $tags{$page} && %{$tags{$page}}) {
+ $template->param(categories => [map { category => $_ },
+ sort keys %{$tags{$page}}]);
+ }
+ }
+ }
+
+ 1;
+
+----
+
+Hello. Got ikiwiki running and I'm planning to convert my personal
+Mediawiki wiki to ikiwiki so I can take offline copies around. If anyone
+has an old copy of the instructions, or any advice on where to start I'd be
+glad to hear it. Otherwise I'm just going to chronicle my journey on the
+page.--[[users/Chadius]]
+
+> Today I saw that someone is working to import wikipedia into git.
+> <http://www.gossamer-threads.com/lists/wiki/foundation/181163>
+> Since wikipedia uses mediawiki, perhaps his importer will work
+> on mediawiki in general. It seems to produce output that could be
+> used by the [[plugins/contrib/mediawiki]] plugin, if the filenames
+> were fixed to use the right extension. --[[Joey]]
+
+>> Here's another I found while browsing around starting from the link you gave Joey<br />
+>> <http://github.com/scy/levitation><br />
+>> As I don't run mediawiki anymore, but I still have my xz/gzip-compressed XML dumps,
+>> it's certainly easier for me to do it this way; also a file or a set of files is easier to lug
+>> around on some medium than a full mysqld or postgres master and relevant databases.
diff --git a/doc/tips/convert_moinmoin_to_ikiwiki.mdwn b/doc/tips/convert_moinmoin_to_ikiwiki.mdwn
new file mode 100644
index 000000000..ec4574971
--- /dev/null
+++ b/doc/tips/convert_moinmoin_to_ikiwiki.mdwn
@@ -0,0 +1,109 @@
+This MoinMoin converter converts wikis to ikiwikis backed by a git repository, including full history. It simply parses the wiki pages into markdown using the MoinMoin engine.
+
+The converter was originally written by [[JoshTriplett]] and included support for Tikiwiki, for which it parses the wiki pages to HTML then back into markdown using the `libhtml-wikiconverter` Perl package. That original version from Josh is still available from [his wiki page](/users/JoshTriplett).
+
+The MoinMoin side of things was completely re-written by [[anarcat]] and is currently still in development. That version is available at:
+
+ git clone git://git.koumbit.net/moin2iki.git
+
+It no longer supports migrating from Tikiwiki, focusing instead on MoinMoin.
+
+Issues can be filed in the redmine bugtracker: <https://redmine.koumbit.net/projects/moin2iki>
+
+[[!toc levels=2]]
+
+The software is made of two pieces:
+
+ * the importer (`moin2git`) - which converts the wiki pages into a git repository with full history
+ * the converter (`moin2mdwn`) - which converts a set of moin-formatted text files into markdown + ikiwiki directives
+
+## MoinMoin importer features
+
+ * supports latest MoinMoin versions (tested with 1.9.x)
+ * uses `git fast-import` to improve performance (10 minutes and 200MB of RAM for a 7-year-old 2GB MoinMoin wiki)
+ * multistep process allows bulk edits through git before markdown conversion, or staying with a moin-formatted wiki backed by git
+ * imports attachments as subpages
+ * uses the per-page edit log
+ * consistent: multiple runs will generate the same repository
+ * re-entrant: can be run multiple times to import new changes
+
+## MoinMoin converter features
+
+ * most of the inline markup
+ * links
+ * attachment links
+ * smileys
+ * images (not well tested), into [[ikiwiki/directive/img]]
+ * preformatted and code areas, including [[ikiwiki/directive/format]]
+ * ordered, unordered and definition lists
+ * tables (although only with HTML and no styles)
+
+### Supported macros
+
+ * TableOfContents, through [[ikiwiki/directive/toc]]
+ * Navigation, through [[ikiwiki/directive/map]] (so as a nested
+ vertical list instead of a horizontal list)
+ * PageList, through [[ikiwiki/directive/map]]
+ * MonthCalendar, partially, through [[ikiwiki/directive/calendar]]
+ * FootNote, through markdown
+ * Anchor, through markdown and plain HTML
+ * `<<BR>>`, through the weird line ending thing
+ * AttachList, through a weird [[ikiwiki/directive/inline]]
+ * FullSearch, partially, only through [[ikiwiki/directive/inline]] (so no textual search)
+ * Include, partially through [[ikiwiki/directive/inline]] (so missing boundary extraction and heading level generation)
+ * PageCount, same name even :)
+ * OrphanedPages, through [[ikiwiki/directive/orphans]]
+ * Date and Datetime, should be through [[plugins/date]] instead of
+ current hack
+
+### Supported parsers
+
+ * the main "moin wiki" markup
+ * highlight parser, through the [[plugins/format]] plugin
+ * other parsers may be supported if an equivalent plugin exists in Ikiwiki (example: [[plugins/rst]])
+
+## Current blocker
+
+This script is being used to test the conversion of the venerable [Koumbit wiki](https://wiki.koumbit.net/) into Ikiwiki, and so far progress is steady but difficult. The current blocker is:
+
+ * figuring out exactly which pages should exist and which should not, as there is ambiguity in the internal data structures of MoinMoin; this becomes apparent when running the conversion script, as files are missing
+
+## Todos
+
+There are also significant pieces missing:
+
+ * inline parsers and hackish styled tables
+ * turn categories into tags
+ * name converted pages correctly depending on the `#format` parameter at the top of the page
+ * finish a full converter run on the Koumbit wiki
+ * improve the output of the converter (too much debugging)
+
+## MoinMoin features missing from ikiwiki
+
+The importer is pretty much complete, but the converter can only go so far as what features ikiwiki supports. Here are the MoinMoin features that are known to be missing from ikiwiki. Note that some of those features are available in MoinMoin only through third-party extensions.
+
+ * [[todo/do_not_make_links_backwards/]] - MoinMoin and Creole use `\[[link|text]]`, while ikiwiki uses `\[[text|link]]` - for now the converter generates [[markdown]] links so this is not so much an issue, but will freak out users
+ * [[todo/internal_definition_list_support/]] - includes tabling the results ([MoinMoin's DictColumns macro](http://moinmo.in/MacroMarket/DictColumns))
+ * [[todo/per page ACLs]] - ([MoinMoin's ACLs](http://moinmo.in/HelpOnAccessControlLists))
+ * [MailTo](http://moinmo.in/HelpOnMacros/MailTo) macro spam protection
+ * list pages based on full text page search
+ * extract part of other pages with the inline macro
+ * specifying a template when creating a page (as opposed to matching a pagespec)
+ * specifying a style for a sub-section (MoinMoin's inline parsers
+ allow the user to specify a CSS class - very useful see
+ [the documentation](http://moinmo.in/HelpOnMoinWikiSyntax#Using_the_wiki_parser_with_css_classes)
+ to get an idea)
+ * the above also keeps the SectionParser from being properly supported
+ * regex matching all over the place: pagespec, basically, but all
+ full text search (which is missing anyways, see above)
+
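+The backwards-link conversion in the first bullet above is mechanical; a
+rough regex sketch (an illustration, not the converter's actual code):
+
```python
import re

def swap_links(text):
    """Rewrite [[link|text]] (MoinMoin/Creole order) into [[text|link]] (ikiwiki order)."""
    return re.sub(r'\[\[([^|\]]+)\|([^\]]+)\]\]', r'[[\2|\1]]', text)

print(swap_links('See [[SomePage|the docs]] for details.'))
```
+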
+### Missing macros
+
+ * RandomPage(N) - lists N random pages, skipped
+ * Gallery() - skipped
+ * Gettext - translates the string according to the internal translation
+ system, ignored
+ * AdvancedSearch - an elaborate search form provided by MoinMoin
+ * Goto - a simple "jump to page" macro
+
+Comments and feedback always welcome! --[[anarcat]]
diff --git a/doc/tips/convert_moinmoin_to_ikiwiki/discussion.mdwn b/doc/tips/convert_moinmoin_to_ikiwiki/discussion.mdwn
new file mode 100644
index 000000000..2fe55d944
--- /dev/null
+++ b/doc/tips/convert_moinmoin_to_ikiwiki/discussion.mdwn
@@ -0,0 +1,5 @@
+I look forward to trying this. I have a large (~10 year old) MoinMoin installation that has been migrated up to a 1.8.x version so far, and which is partially ACL'd away behind logins. — [[Jon]]
+
+> I'll make that clearer in the docs, but we do not deal with ACL (yet?), as ikiwiki doesn't support Moinmoin's level of ACL flexibility. See [[todo/per_page_ACLs]] for more information. --[[anarcat]]
+
+>> I was actually thinking the ACLs would cause a problem just for the crawler, I hadn't considered their re-implementation (but yes, that would be good!) — [[Jon]]
diff --git a/doc/tips/distributed_wikis.mdwn b/doc/tips/distributed_wikis.mdwn
new file mode 100644
index 000000000..cf9c2e338
--- /dev/null
+++ b/doc/tips/distributed_wikis.mdwn
@@ -0,0 +1,46 @@
+[[rcs/git]] and other distributed version control systems are all about
+making it easy to create and maintain copies and branches of a project. And
+this can be used for all sorts of interesting stuff. Since ikiwiki can use
+git, let's explore some possibilities for distributed wikis.
+
+## a wiki mirror
+
+The simplest possibility is setting up a mirror. If a wiki exposes its git
+repository and has the [[plugins/pinger]] plugin enabled, then anyone can
+set up a mirror that will automatically be kept up-to-date with the origin
+wiki. Just clone the git repo, configure ikiwiki to use it, enable the
+[[plugins/pingee]] plugin in your configuration, and edit the origin wiki,
+adding a ping directive for your mirror:
+
+ \[[!ping from="http://thewiki.com/"
+ to="http://mymirror.com/ikiwiki.cgi?do=ping"]]
+
+The "from" parameter needs to be the url to the origin wiki. The "to" parameter
+is the url to ping on your mirror.
+
+Now whenever the main wiki is edited, it will ping your mirror, which will
+pull the changes from "origin" using git, and update itself. It could, in
+turn, ping another mirror, etc.
+
+And if someone edits a page on your mirror, it will "git push origin",
+committing the changes back to the origin git repository, and updating the
+origin mirror. Assuming you can push to that git repository. If you can't,
+and you want a mirror, and not a branch, you should disable web edits on
+your mirror. (You could also point the cgiurl for your mirror at the origin
+wiki.)
+
+## branching a wiki
+
+It follows that setting up a branch of a wiki is just like a mirror, except
+we don't want it to push changes back to the origin. The easy way to
+accomplish this is to clone the origin git repository using a readonly
+protocol (ie, "git://"). Then you can't push to it.
+
+If a page on your branch is modified and other modifications are made to
+the same page in the origin, a conflict might occur when that change is
+pulled in. How well will this be dealt with, and how can it be resolved? I
+think the conflict markers will just appear on the page as it's rendered in
+the wiki, and you might even be able to resolve the conflict using the web
+interface. I'm not 100% sure, as I've not gotten into this situation yet.
+
+--[[Joey]]
diff --git a/doc/tips/distributed_wikis/discussion.mdwn b/doc/tips/distributed_wikis/discussion.mdwn
new file mode 100644
index 000000000..994c493f9
--- /dev/null
+++ b/doc/tips/distributed_wikis/discussion.mdwn
@@ -0,0 +1,7 @@
+Would it work if the mirrored wiki was configured with cgiurl set to the cgiurl of the origin wiki - so that users were seamlessly redirected to the origin for edits? --[[Jamie]]
+
+> Yes, if the origin wiki is set up to ping the mirrored wiki when
+> updated, the mirror is free to use its cgi setup. (Note that the cgi will
+> leave the user on a page on the origin wiki when they save the edit.)
+> I've put a mention of this option in the page.
+> --[[Joey]]
diff --git a/doc/tips/dot_cgi.mdwn b/doc/tips/dot_cgi.mdwn
new file mode 100644
index 000000000..9067fbea5
--- /dev/null
+++ b/doc/tips/dot_cgi.mdwn
@@ -0,0 +1,111 @@
+It's common to name the [[cgi]] "ikiwiki.cgi", and put it somewhere
+like `~/public_html/ikiwiki.cgi`, or `/var/www/wiki/ikiwiki.cgi`.
+
+If you do that, you may find that when trying to edit a page in your wiki,
+you see the raw contents of the ikiwiki.cgi program. Or get a permission
+denied problem.
+
+This is because web servers are generally not configured to run cgi scripts
+unless they're in `/usr/lib/cgi-bin/`. While you can put ikiwiki.cgi in
+there if you like, it's better to configure your web server to
+run `.cgi` programs from anywhere.
+
+These instructions are for Debian systems, but the basic
+configuration changes should work anywhere.
+
+## apache 2
+
+* Make sure the cgi module is loaded. (Ie, `a2enmod cgi`)
+
+* Edit /etc/apache2/apache2.conf (or /etc/apache2/mods-available/mime.conf)
+ and add a line like this:
+
+ AddHandler cgi-script .cgi
+
+* Find the "Options" line for the directory where you've put the
+ ikiwiki.cgi, and add "ExecCGI" to the list of options. For example, if
+ ikiwiki.cgi is in /var/www/, edit `/etc/apache2/sites-enabled/000-default`
+ and add it to the "Options" line in the "Directory /var/www/" stanza.
+ Or, if you've put it in a `~/public_html`, edit
+ `/etc/apache2/mods-available/userdir.conf`.
+
+* If your wiki is in `~/public_html` and does not appear when you enter the URL given by the installer, check that you have
+ the userdir mod enabled (there should be symlinks to userdir.load and userdir.conf in /etc/apache2/mods-enabled). If not,
+ run `a2enmod userdir` and reload apache2.
+
+* You may also want to enable the [[plugins/404]] plugin.
+ To make apache use it, the apache config file will need a further
+ modification to make it use ikiwiki's CGI as the apache 404 handler.
+ Something like this, with the path adjusted to where you've put the CGI:
+
+ ErrorDocument 404 /cgi-bin/ikiwiki.cgi
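+
+Putting the pieces together, a minimal Apache stanza might look like the
+following sketch (the `/var/www/wiki` path and the 404 handler location are
+examples; adjust them to where you've actually put the wiki and the CGI):
+
+    <Directory /var/www/wiki>
+        Options +ExecCGI
+        AddHandler cgi-script .cgi
+    </Directory>
+    ErrorDocument 404 /wiki/ikiwiki.cgi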
+
+## lighttpd
+
+Here is how to enable cgi on [lighttpd](http://www.lighttpd.net/) and
+configure it in order to execute ikiwiki.cgi wherever it is located.
+
+* Activate cgi by linking `/etc/lighttpd/conf-available/10-cgi.conf` into `/etc/lighttpd/conf-enabled` ([doc](http://trac.lighttpd.net/trac/wiki/Docs%3AModCGI)).
+
+* Create `/etc/lighttpd/conf-available/90-ikiwiki-cgi.conf` and add a line like this:
+
+ cgi.assign = ( "ikiwiki.cgi" => "", )
+
+* Activate ikiwiki-cgi by linking `/etc/lighttpd/conf-available/90-ikiwiki-cgi.conf` into `/etc/lighttpd/conf-enabled`.
+
+* Restart lighttpd server with something like `/etc/init.d/lighttpd restart`.
+
+Note that the first part enables CGI server-wide but, depending on the default
+configuration, it may not be enough. The second part creates a specific
+rule that allows `ikiwiki.cgi` to be executed.
+
+**Warning:** I only use this lighttpd configuration on my development
+server (offline). I am not sure of how secure this approach is.
+If you have any thought about it, feel free to let me know.
+
+## nginx
+
+To run CGI under nginx, you need to use a FastCGI wrapper. The wrapper must
+be started somehow, just like any other FastCGI program. On OS X you can use
+launchd.
+
+On Linux, you will need the spawn-fcgi and fcgiwrap packages, and can start
+them with:
+
+ spawn-fcgi -s /tmp/fcgi.socket -n -- /usr/sbin/fcgiwrap
+
+This needs to be run as your user. It can be added to `inittab` or
+made into a startup script in `init.d`. You may also need to make this file writable by the webserver, if that's running as a different user, e.g.:
+
+ chmod a+w /tmp/fcgi.socket
+
+Then you need an nginx config plugged in that wrapper. Here's an
+example virtual host configuration:
+
+ server {
+ #listen 80; ## listen for ipv4; this line is default and implied
+ #listen [::]:80 default_server ipv6only=on; ## listen for ipv6
+
+ root /home/anarcat/public_html/wiki.reseaulibre.ca/;
+ index index.html index.htm;
+
+ # Make site accessible from http://localhost/
+ server_name wiki.reseaulibre.ca;
+
+ location / {
+ try_files $uri $uri/ /index.html;
+ }
+ location /ikiwiki.cgi {
+ fastcgi_pass unix:/tmp/fcgi.socket;
+ fastcgi_index ikiwiki.cgi;
+ fastcgi_param SCRIPT_FILENAME /home/anarcat/public_html/ikiwiki.cgi;
+ fastcgi_param DOCUMENT_ROOT /home/anarcat/public_html/wiki.reseaulibre.ca;
+ include /etc/nginx/fastcgi_params;
+ }
+ }
+
+Also, note that the `/tmp/fcgi.socket` file needs to be writable by the webserver. I am also unsure as to the security of this setup, as I am using this only on my dev server. Needless to say that [[real fastcgi support|todo/fastcgi_or_modperl_installation_instructions]] would be great. ;) --[[anarcat]]
+
+## boa
+
+Edit /etc/boa/boa.conf and make sure the following line is not commented:
+
+ AddType application/x-httpd-cgi cgi
diff --git a/doc/tips/dot_cgi/discussion.mdwn b/doc/tips/dot_cgi/discussion.mdwn
new file mode 100644
index 000000000..0e23e3a08
--- /dev/null
+++ b/doc/tips/dot_cgi/discussion.mdwn
@@ -0,0 +1,51 @@
+## Alt explanation/instructions
+For whatever reason, I found the info on the dot cgi page very confusing. The instructions on [[http://maketecheasier.com/install-and-configure-apache-in-ubuntu/2011/03/09]] were a lot easier to follow, and ultimately got me over the ubuntu-apache hump.
+
+Following this method, the wiki won't be at the same URL; it will be at localhost/*wiki_name*.
+
+## warning: lighttpd only or both?
+
+Is your warning at the bottom (you don't know how secure it is) only about
+lighttpd or it's about apache2 configuration as well?
+
+> The latter. (Although I don't know why using lighttpd would lead
+> to any additional security exposure anyway.) --[[Joey]]
+
+I'm asking this because right now I want to setup an httpd solely for the
+public use of ikiwiki on a general purpose computer (there are other things
+there), and so I need to choose the more secure solution. --Ivan Z.
+
+> AFAIU, my main simplest security measure should be running the public
+> ikiwiki's cgi under a special user, but then: how do I push to the repo
+> owned by that other user? I see, probably I should setup the public wiki
+> under the special user (so that it was able to create the cgi-script with
+> the desired permission), and then give my personal user the required
+> permissions to make a git-push by, say, creating a special Unix group for
+> this.
+
+> Shouldn't there be a page here which would document a secure public and
+> multi-user installation of ikiwiki (by "multi-user" I mean writable by a
+> group of local Unix users)? If there isn't such yet, I started writing it
+> with this discussion.--Ivan Z.
+
+> I see, perhaps a simpler setup would not make use of a Unix group, but
+> simply allow pushing to the public wiki (kept under a special user) through
+> git+ssh. --Ivan Z.
+
+>> Yes, it's certainly possible to configure git (and svn, etc) repositories so that
+>> two users can both push to them. There should be plenty of docs out there
+>> about doing that.
+>>
+>> The easiest way though is probably
+>> to add your ssh key to the special user's `.ssh/authorized_keys`
+>> and push that way. --[[Joey]]
+
+## apache2 - run from userdir
+Followed the instructions but couldn't get it to run from a user dir (running ubuntu jaunty).
+Finally got it working once I symlinked as follows (& restarted apache):
+
+    # ln -s ../mods-available/userdir.load .
+    # ln -s ../mods-available/userdir.conf .
+    # pwd
+    /etc/apache2/mods-enabled
+
+
diff --git a/doc/tips/emacs_syntax_highlighting.mdwn b/doc/tips/emacs_syntax_highlighting.mdwn
new file mode 100644
index 000000000..941cf5415
--- /dev/null
+++ b/doc/tips/emacs_syntax_highlighting.mdwn
@@ -0,0 +1,3 @@
+A [markdown mode](http://jblevins.org/projects/markdown-mode/) for
+emacs can help with editing ikiwiki
+[[ikiwiki/markdown]] files.
diff --git a/doc/tips/embedding_content.mdwn b/doc/tips/embedding_content.mdwn
new file mode 100644
index 000000000..bfe458a84
--- /dev/null
+++ b/doc/tips/embedding_content.mdwn
@@ -0,0 +1,35 @@
+Content from sites such as YouTube can be embedded into a web page. Maybe
+you want to do this. But you'll find that the [[plugins/htmlscrubber]]
+doesn't let you. It blocks the tags used to embed such content, because
+they can be abused in many evil ways.
+
+Some plugins have been written to try to work around this problem, by
+whitelisting the html needed to embed things from a few sites like Google
+maps, calendar, videos, and YouTube. The problem with these plugins is that
+they have to be kept up to date to add new sites, and follow changes to the
+html such sites use for embedding.
+
+(Digression: The real problem with the plugins is that they hide the
+underlying trust relationship. If you decide to embed html from a site,
+you'd better trust that site. And if ikiwiki lets you enter such html, it
+needs to trust you.)
+
+The [[plugins/htmlscrubber]] offers a different way around this problem.
+You can configure it to skip scrubbing certain pages, so that content from
+elsewhere can be embedded on those pages. Then use [[plugins/lockedit]]
+to limit who can edit those unscrubbed pages.
+
+For example, suppose your blog is all under `blog/*`, and you want
+only yourself to be able to post there, and you'd like to be able to embed
+youtube videos etc in your blog. Other users can edit some pages in the
+wiki (Discussion pages, say), but not your blog posts. Then you could configure
+ikiwiki as follows:
+
+ htmlscrubber_skip => 'blog/* and !*/Discussion',
+ locked_pages => '!*/Discussion',
+
+More simply, you might want to allow yourself to embed content anywhere
+on the wiki, but scrub content written on Discussion pages:
+
+ htmlscrubber_skip => '!*/Discussion',
+ locked_pages => '!*/Discussion',
diff --git a/doc/tips/follow_wikilinks_from_inside_vim.mdwn b/doc/tips/follow_wikilinks_from_inside_vim.mdwn
new file mode 100644
index 000000000..015a4ecee
--- /dev/null
+++ b/doc/tips/follow_wikilinks_from_inside_vim.mdwn
@@ -0,0 +1,47 @@
+The [ikiwiki-nav](http://www.vim.org/scripts/script.php?script_id=2968) plugin
+for vim eases the editing of IkiWiki wikis, by letting you "follow" the
+wikilinks on your file (page), by loading the file associated with a given
+wikilink in vim. The plugin takes care of following the ikiwiki linking rules
+to figure out which file a wikilink points to.
+
+The plugin also includes commands (and mappings) to make the cursor jump to the
+previous/next wikilink in the current file.
+
+## Jumping to pages
+
+To open the file associated to a wikilink, place the cursor over its text, and
+hit Enter (`<CR>`). This functionality is also available through the
+`:IkiJumpToPage` command.
+
+## Moving to next/previous wikilink in current file
+
+`Ctrl-j` will move the cursor to the next wikilink. `Ctrl-k` will move it to the
+previous wikilink. This functionality is also available through the
+`:IkiNextWikiLink` command. This command takes one argument, the direction to
+move in:
+
+ * `:IkiNextWikiLink 0` will look forward for the wikilink
+ * `:IkiNextWikiLink 1` will look backwards for the wikilink
+
+## Installation
+
+Copy the `ikiwiki_nav.vim` file to your `.vim/ftplugin` directory.
+
+## Current issues:
+
+ * The plugin only works for wikilinks contained in a single text line;
+ multiline wikilinks are not (yet) seen as such
+
+## Notes
+
+The official releases of the plugin are in the
+[vim.org script page](http://www.vim.org/scripts/script.php?script_id=2968)
+
+The latest version of this script can be found at the following location:
+
+<http://git.devnull.li/cgi-bin/gitweb.cgi?p=ikiwiki-nav.git;a=blob;f=ftplugin/ikiwiki_nav.vim;hb=HEAD>
+
+Any feedback you can provide is appreciated; the contact details can be found
+inside the plugin.
+
+[[!tag vim]]
diff --git a/doc/tips/github.mdwn b/doc/tips/github.mdwn
new file mode 100644
index 000000000..d745bfcc5
--- /dev/null
+++ b/doc/tips/github.mdwn
@@ -0,0 +1,64 @@
+Here's how to set up a static wiki or blog using ikiwiki with no hosting
+fees. Everything is hosted on github, both the git repository and the web
+site. Your laptop is used to generate and publish changes to it.
+
+This is possible because github now supports
+[github pages](http://github.com/blog/272-github-pages).
+
+Note that github limits free accounts to 100 MB of git storage. It's
+unlikely that a small wiki or blog will outgrow this, but we are keeping
+two copies of the website in git (source and the compiled site), and all
+historical versions too. So it could happen. If it does, you can pay github
+for more space, or you can migrate your site elsewhere.
+
+## Github Setup
+
+* Go to [github](http://github.com/) and sign up for an account, if you haven't already.
+* Be sure to add your laptop's ssh key to it so you can push to github.
+* Create a repository on github named `$YOU.github.com`, substituting your
+ *username*. This repository will be used to publish your compiled website.
+* Create a repository on github named `$YOU` (or anything else you like).
+ This repository will be used to publish the source of your website.
+ This is actually optional.
+
+## Local Setup
+
+* On your laptop, create two empty git repositories to correspond to the github repositories:
+
+        YOU=your github username here
+        mkdir ~/$YOU.github.com
+        cd ~/$YOU.github.com
+        git init
+        git remote add origin git@github.com:$YOU/$YOU.github.com.git
+        mkdir ~/$YOU
+        cd ~/$YOU
+        git init
+        git remote add origin git@github.com:$YOU/$YOU.git
+* Add some wiki pages, such as an `index.mdwn`, to `~/$YOU`, and check them
+ in and commit them to git. You need something to push to github. Run
+ `git push origin master` to push the source pages to github.
+
+## Publishing to Github
+
+* Now build your wiki with a command such as:
+
+        ikiwiki ~/$YOU ~/$YOU.github.com --refresh
+
+* Each time you build the wiki you will need to commit the changes
+  to git, and push the compiled pages to github:
+
+        cd ~/$YOU.github.com
+        git add .
+        git commit -a -m update
+        git push origin master
+
+Your wiki will show up at `http://$YOU.github.com/` within ten
+minutes after the first push, and changes you push to it from then on
+should show up immediately.
+
+## Enhancements
+
+You can follow the instructions in [[laptop_wiki_with_git]] to set up an
+editable version of your wiki on your laptop. Then you can use the web
+interface for editing. You'll still need to follow the instructions above
+to publish your changes to github.
+
+It would also be possible to teach ikiwiki to push compiled pages to github
+itself via a plugin, as was done with the [[plugins/amazon_s3]] plugin. Not
+done yet!
diff --git a/doc/tips/howto_avoid_flooding_aggregators.mdwn b/doc/tips/howto_avoid_flooding_aggregators.mdwn
new file mode 100644
index 000000000..e45b96689
--- /dev/null
+++ b/doc/tips/howto_avoid_flooding_aggregators.mdwn
@@ -0,0 +1,28 @@
+If you have a [[blog]] that is aggregated, either on a site like Planet
+Debian, or just through user subscriptions, one common problem is that
+changes to the guids of items in the blog can “flood” the aggregator,
+causing all recent blog entries to be posted to the top of it.
+
+This can happen in a lot of situations:
+
+* Perhaps you’ve just switched to ikiwiki from some other blog engine and
+ imported your data.
+* Perhaps you’ve turned on the `usedirs` setting, which changes all the
+ urls in your wiki. Even if you set up
+ [[redirections|redirections_for_usedirs]] for the old urls, you still face
+ the issue of flooding aggregators.
+* Perhaps you’ve just moved stuff around in your wiki.
+
+To avoid annoying readers in these situations, it’s a good idea to remove
+any existing items from your blog’s news feed. That way only new items will
+show up in the aggregator. The best way to do this is to add a `feedpages`
+parameter to the `inline` directive for your blog, with a condition such as:
+
+ feedpages=created_after(blog/posts/old_post)
+
+Where “old_post” is the name of the last post you made to the blog before
+making the change. This will limit the feed to only newer posts, while still
+displaying the old posts in the blog page.
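+
+For instance, the blog's `inline` directive might then look something like
+this sketch (the page names are just examples; use your own blog's layout):
+
+    \[[!inline pages="blog/posts/*" show="10"
+    feedpages="created_after(blog/posts/old_post)"]]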
+
+Alternatively, you can add the [[plugins/meta]] guid directives to pages,
+to force the old url to be used.
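+
+For example, adding something like this to an individual post (the url shown
+is of course hypothetical) makes the feed keep using the old identifier:
+
+    \[[!meta guid="http://example.com/blog/old_post/"]]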
diff --git a/doc/tips/howto_limit_to_admin_users.mdwn b/doc/tips/howto_limit_to_admin_users.mdwn
new file mode 100644
index 000000000..4d579327a
--- /dev/null
+++ b/doc/tips/howto_limit_to_admin_users.mdwn
@@ -0,0 +1,9 @@
+Enable [[plugins/lockedit]] in your setup file.
+
+For example:
+
+ add_plugins => [qw{goodstuff table rawhtml template embed typography sidebar img remove lockedit}],
+
+And to only allow admin users to edit the page, simply specify a pagespec for everything in the .setup:
+
+ locked_pages => '*',
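+
+Since [[plugins/lockedit]] always lets wiki admins edit locked pages, the net
+effect is an admin-only wiki. A sketch of the relevant setup file fragment
+(the username is an example):
+
+    adminuser => ['yourname'],
+    locked_pages => '*',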
diff --git a/doc/tips/htaccess_file.mdwn b/doc/tips/htaccess_file.mdwn
new file mode 100644
index 000000000..6964cf24e
--- /dev/null
+++ b/doc/tips/htaccess_file.mdwn
@@ -0,0 +1,27 @@
+If you try to include a `.htaccess` file in your wiki's source, in order to
+configure the web server, you'll find that ikiwiki excludes it from
+processing. In fact, ikiwiki excludes any file starting with a dot, as well
+as a lot of other files, for good security reasons.
+
+You can tell ikiwiki not to exclude the .htaccess file by adding this to
+your setup file:
+
+ include => '^\.htaccess$',
+
+Caution! Before you do that, please think for a minute about who can edit
+your wiki. Are attachment uploads enabled? Can users commit changes
+directly to the version control system? Do you trust everyone who can
+make a change to not do Bad Things with the htaccess file? Do you trust
+everyone who *might* be able to make a change in the future? Note that a
+determined attacker who can write to the htaccess file can probably get a
+shell on your web server.
+
+If any of these questions have given you pause, I suggest you find a
+different way to configure the web server. One way is to not put the
+`.htaccess` file under ikiwiki's control, and just manually install it
+in the destdir. --[[Joey]]
+
+[Apache's documentation](http://httpd.apache.org/docs/2.2/howto/htaccess.html)
+says:
+> In general, you should never use .htaccess files unless you don't have
+> access to the main server configuration file.
diff --git a/doc/tips/html5.mdwn b/doc/tips/html5.mdwn
new file mode 100644
index 000000000..b47c3fe39
--- /dev/null
+++ b/doc/tips/html5.mdwn
@@ -0,0 +1,27 @@
+First, if you just want to embed videos using the html5 `<video>` tag,
+you can do that without switching anything else to html5.
+However, if you want to fully enter the brave new world of html5, read on..
+
+Currently, ikiwiki does not use html5 by default. There is a `html5`
+setting that can be turned on, in your setup file. Rebuild with it set, and
+lots of fancy new semantic tags will be used all over the place.
+
+You may need to adapt your CSS for html5. While all the class and id names
+are the same, some of the `div` elements are changed to other things.
+Ikiwiki's default CSS will work in both modes.
+
+The html5 support is still experimental, and may break in some browsers.
+No care is taken to add backwards compatibility hacks for browsers that
+are not html5 aware (like MSIE). If you want to include the javascript with
+those hacks, you can edit `page.tmpl` to do so.
+[Dive Into HTML5](http://diveintohtml5.info/) is a good reference for
+current compatability issues and workarounds with html5. And a remotely-loadable
+JS shiv for enabling HTML5 elements in IE is available through [html5shiv at Google Code](http://code.google.com/p/html5shiv/).
+
+---
+
+Known ikiwiki-specific issues:
+
+* [[plugins/htmltidy]] uses `tidy`, which is not html5 aware, so if you
+ have that enabled, it will mangle it back to html4.
+* [[plugins/toc]] does not understand the html5 outline algorithm.
diff --git a/doc/tips/ikiwiki_as_a_requirements_management_tool.mdwn b/doc/tips/ikiwiki_as_a_requirements_management_tool.mdwn
new file mode 100644
index 000000000..6bef2619e
--- /dev/null
+++ b/doc/tips/ikiwiki_as_a_requirements_management_tool.mdwn
@@ -0,0 +1,95 @@
+[[!template id=note text="**Table of contents** [[!toc ]]"]]
+
+Introduction
+------------
+At work, textual requirements and traceability are everyday terms, often used in contracts with clients or among stakeholders, but at the moment the only way we specify requirements is via a word processor, and traceability is managed manually (ouch!) unless we use a commercial UML (Unified Modeling Language) tool that handles office files, and also allows traceability from design, code and test artifacts. But the functionality of that tool is less than basic for requirements.
+
+We are considering the use of a specific requirements management tool, but the problem, and something that gets me really frustrated, is the extremely expensive price of the licenses of the "de facto" commercial tools we should use. One floating license for both tools (requirements management and modeling) can go beyond $20,000. Of course we can purchase cheaper ones, but I'm tired of this licensing nightmare of worrying about how many licenses are being used, praying not to exceed the limit or restarting a dead license server. We pay companies to not trust us. Taking a look at the FLOSS world doesn't seem to offer any reasonable alternative.
+
+These are the raw high level features of the tool I'd like to use:
+
+ * Requirements editing
+ * Requirements attributes editing
+ * Traceability: editing, coverage analysis and navigation
+ * External traceability: from requirements in one document/module to requirements in another (eg: software requirements tracing to system requirements). Note: a set of requirements will be called a "module" hereafter in this page.
+ * Requirements identifiers management
+ * Requirements history and diff/blame
+ * Team work
+ * Easy integration with other software lifecycle tools: modeling (eg. BOUML), project management (eg. Trac)
+ * Support for other formats such as HTML, office...
+ * Filtering and searching
+ * Export facilities to create standards-compliant documentation.
+
+The initial idea was to develop a simple web solution using XHTML files. These files would be created in a web browser with existing WYSIWIM editors, storing all the stuff in Subversion. All requirements would be stored at the same level (no hierarchies among requirements of the same module) and be atomically accessible via a simple web browser. No server-side programming would be needed to read requirements. Also, special XHTML files (let's call them "views") would be necessary to group requirements hierarchically in a requirements document fashion, using xinclude.
+
+When I first played with ikiwiki I was so happy that many of the ideas I worked on were already in use in this marvelous piece of software, especially the decision to use well-known RCS software to manage history instead of reinventing the wheel, which also opens up one interesting feature: off-line editing. Another similarity was the absence of special processing for read-only navigation.
+
+So, let's now take all the features above and describe how to make them real using ikiwiki and some simple conventions. Some features would need new functionality and improvement, I'd really appreciate additional ideas on how to better get to the point.
+
+Requirements editing
+--------------------
+Suppose that all requirements reside under a specific folder. We will call it "reqs", and under "reqs" we add as many folders as requirements modules we want to use for a system called "foo" (eg. "foo_sss" for system requirements, "foo_srs" for software requirements...). The index file for each module shall be a page summarizing the module: number of requirements, basic coverage information... Other similar pages under the "views" folder could be used in order to have different sets of requirements including additional stuff: introduction, document identification, etc... The rest of the files - the actual requirements - shall be markdown files. So editing a requirement would be as simple as adding a page to the wiki.
+
+To create the summary and views, just the [[ikiwiki/directive/inline]] and [[ikiwiki/directive/pagecount]] directives could provide nice pages. The uncomfortable part is having to use many [[pagespecs|ikiwiki/PageSpec]] to create the whole views, but it actually should work. One possible workaround would be an external tool to handle this and create directives automatically or graphically.
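+
+As a sketch (the module and page names are just the examples used above), the
+index page of the "foo_srs" module could pull in its requirements and show a
+count with something like:
+
+    \[[!inline pages="reqs/foo_srs/* and !*/Discussion"
+    archive="yes" feeds="no"]]
+
+    Number of requirements: \[[!pagecount pages="reqs/foo_srs/*"]]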
+
+Requirements attributes
+-----------------------
+There are lots of useful data to associate to a requirement. Eg:
+
+ * If it is traceable or not
+ * Its criticality level
+ * Its priority
+ * If it is functional or not
+
+How to implement this? Using [[ikiwiki/directive/meta]] could be a solution; I haven't tried it yet, as I'd rather keep the requirements content alone. Storing this information in SVN is easy; although ikiwiki does not provide a way to do it, it would imply really little effort. The requirement in itself is the content of the file; attributes are stored as key-value pairs in the file's properties. AFAIK this is a feature available only in SVN, although git has something similar (gitattributes), albeit path based; but anyway, whichever RCS is used, a ".properties" file could always be created when a requirement file is created.
+
+Traceability: editing, coverage analysis and navigation
+--------------------------------------------------------
+This is the most important feature of a requirements engineering tool. How to do this with ikiwiki? There are some ways, from extremely simple ones to more sophisticated:
+
+ * One simple solution: Links. Just link from one requirement to another one to create a traceable directional connection
+ * One harder: file attributes (see section about requirements just above)
+
+For coverage analysis, using [[ikiwiki/directive/pagecount]] is the perfect solution to summarize and show covered and uncovered requirements. We could add several pages per module - probably using template pages - with ready-made coverage analysis reports... Wow!!! The [[ikiwiki/directive/linkmap]] directive can show traceability information graphically.
+
+Navigating among requirements needs... Nothing!!! Just follow the links of referring pages that ikiwiki adds by default.
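+
+As a sketch of such a coverage report (reusing the example module names from
+above), a page could count the software requirements that do not yet link to
+any system requirement, and draw the traceability graph:
+
+    Uncovered: \[[!pagecount pages="reqs/foo_srs/* and !link(reqs/foo_sss/*)"]]
+
+    \[[!linkmap pages="reqs/foo_srs/* or reqs/foo_sss/*"]]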
+
+External traceability
+---------------------
+Being just different folders under the wiki, external traceability is as easy as internal.
+
+Requirements identifiers management
+-----------------------------------
+Another useful convention: the requirement identifier shall be the name of the requirements file. In ikiwiki the page title is the same as the requirement id. No trouble, it works. I personally prefer to keep the title as page title and create short auto-incrementing numeric codes with prefixes and/or suffixes as file names (eg. SRS_FOO_0001, SSS_FOO_002); hope to have something running soon.
+
+Requirements history and diff/blame
+-----------------------------------
+Out of the box! And really much more useful than the average diffing components of requirements management tools. There are plenty of online front ends to use, and for offline work tools like meld are awesome.
+
+Team work
+---------
+Also no need to do anything, the RCS software does it all. Also, for experienced users merging and conflict solving can provide much more practical solutions (most requirements management tools work by locking).
+
+Easy integration with other software lifecycle tools
+----------------------------------------------------
+Modeling tools: as a general rule, store model elements in the most atomic parts: classes, enums, actors, use cases... and again use file attributes to store traceability information. Another way is transforming the files representing these atomic model parts into independent mdwn files under, for example, a "mdl" folder.
+
+Trac integration is so simple... As simple as with any PM tool that accesses the same RCS as ikiwiki does. Diffing, blaming, even navigating directly to ikiwiki-generated pages. Integration of a ticketing system will give awesome power to the whole team.
+
+Support for other formats
+-------------------------
+Out of the box, at least for wiki and mathematical formats, and creating additional ones shouldn't be so difficult.
+
+Filtering and searching
+-----------------------
+See that box in the top right corner?
+
+Export facilities
+-----------------
+Views with custom styles and html conversion tools would be enough for most purposes.
+
+That's all!
+
+One funny thing: our "de facto future" requirements management tool, after years of research, included some years ago a really nice feature: a Discussion tag per each requirement... See this in ikiwiki? Again, out of the box!!!
+
+Comments are really welcome!!!
diff --git a/doc/tips/ikiwiki_as_a_requirements_management_tool/discussion.mdwn b/doc/tips/ikiwiki_as_a_requirements_management_tool/discussion.mdwn
new file mode 100644
index 000000000..26eae28a5
--- /dev/null
+++ b/doc/tips/ikiwiki_as_a_requirements_management_tool/discussion.mdwn
@@ -0,0 +1,21 @@
+How about using tags/links to associate attributes with requirements?
+This could be as simple as adding a link, for example:
+
+ * If it is traceable or not
+ + \[[attributes/traceable]]
+ + \[[attributes/untraceable]]
+ * Its criticality level
+ + \[[attributes/level/critical]]
+ + \[[attributes/level/important]]
+ + etc.
+ * Its priority
+ + \[[attributes/priority/low]]
+ + \[[attributes/priority/high]]
+ * If it is functional or not
+ + \[[attributes/functional]]
+ + \[[attributes/non-functional]]
+
+You just have to create pages for each attribute you want, and then pagespecs could be used to filter requirements by attributes. I think something similar is used to track bugs with ikiwiki (linking to a \[[done]] page, etc.).
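+
+As a sketch, with attribute pages like those above in place, a `pagecount`
+or `inline` directive could then filter requirements on them (module names
+are the examples used on the parent page):
+
+    \[[!pagecount pages="reqs/foo_srs/* and link(attributes/priority/high)"]]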
+
+---
+Generally speaking, I think it's always a good idea to get back to the "basics" of something that huge and expensive tools were made for. But I'm doubtful whether such a text-oriented tool would really fit all the needs of a requirements engineering tool... so what is your real-world experience with your requirements engineering tool as described?
diff --git a/doc/tips/ikiwiki_on_mac_os_x.mdwn b/doc/tips/ikiwiki_on_mac_os_x.mdwn
new file mode 100644
index 000000000..b3d1de370
--- /dev/null
+++ b/doc/tips/ikiwiki_on_mac_os_x.mdwn
@@ -0,0 +1,218 @@
+[[!toc]]
+
+# MacPorts
+
+The easiest way of installing ikiwiki on Mac OS X [Snow] Leopard and Tiger is via MacPorts: <http://www.macports.org/>
+
+This project ports Open Source software to the Mac OS X platform.
+It's very easy to install ikiwiki via MacPorts:
+
+1. Download and install the MacPorts port manager from: <http://www.macports.org/install.php> .
+   The port manager installs via the Mac OS X installer. Prerequisite: XCode.
+   See the above URL for details.
+
+2. Run
+
+        $ sudo port install ikiwiki
+
+This installs ikiwiki and all of its dependencies.
+
+Enjoy!
+
+Enrique Castilla
+
+-----
+
+# pkgsrc
+
+The other easiest way of installing ikiwiki on Mac OS X is via
+[pkgsrc](http://www.pkgsrc.org/).
+
+7. Bootstrap pkgsrc
+7. Run `cd .../pkgsrc/www/ikiwiki && make install clean`
+
+-----
+
+# Manual install
+
+These are some notes on installing ikiwiki on Mac OS X Snow Leopard. I have a three year old machine with a lot of stuff on it so it took quite a while, YMMV.
+
+The best part of installing ikiwiki was learning how to use git. I never used source control before but it's pretty slick.
+
+
+## installing git:
+
+    cd /opt/ikiwiki/install
+
+    curl http://kernel.org/pub/software/scm/git/git-(latest version).tar.gz -O
+    tar xzvf git-(latest version).tar.gz
+    cd git-(latest version)
+
+    ./configure --prefix=/usr/local
+    make prefix=/usr/local all
+    sudo make install
+
+    git config --global user.name "firstname lastname"
+    git config --global user.email "email here"
+    git config --global color.ui "auto"
+
+    curl http://www.kernel.org/pub/software/scm/git/git-manpages-1.7.3.1.tar.gz | sudo tar -xzC /usr/local/share/man/
+
+
+## installing ikiwiki:
+I had terrible trouble installing ikiwiki. It turned out I had accidentally installed Perl through ports. Uninstalling that made everything install nicely.
+I got an error on msgfmt. Turns out this is a program in gettext. I installed that and it fixed the error.
+
+    cd ..
+    git clone git://git.ikiwiki.info/
+    cd git.ikiwiki.info/
+
+    perl Makefile.PL LIB=/Library/Perl/5.10.0
+    make
+    sudo make install
+
+When you make ikiwiki it gives you a .git folder with the ikiwiki files. Stay out of this folder. You want to learn how to create a clone and make all your changes in the clone. When you push the changes, ikiwiki will update. I moved a file in this folder by accident because I named my working file the same, and I couldn't get into the setup page. I had apparently messed up my ikiwiki git repository. I did a pull into my clone, deleted the repository and webserver/cgi folders, and ran a new setup. Then I did a git clone and dragged all my old files into the new clone. Did the git dance and did git push. Then the angels sang.
+
+
+## using git from inside a git folder:
+
+Start with `git clone`, then learn to do the git dance like this:
+
+    git pull
+    # make your changes to your clone
+    git commit -a -m "message here"
+    git push
+
+
+When you can't get into the setup page, or you get strange behavior after a setup update, the Utilities > Console app is your friend.
+
+## installing gitweb
+
+    cd ../git-1.7.3.1/gitweb
+    make GITWEB_PROJECTROOT="/opt/ikiwiki/" GITWEB_CSS="/gitweb.css" GITWEB_LOGO="/git-logo.png" GITWEB_FAVICON="/git-favicon.png" GITWEB_JS="/gitweb.js"
+
+    cp gitweb.cgi /Library/WebServer/CGI-Executables/
+    cp /usr/local/share/gitweb/static/git-favicon.png /Library/WebServer/Documents/
+    cp /usr/local/share/gitweb/static/git-logo.png /Library/WebServer/Documents/
+    cp /usr/local/share/gitweb/static/gitweb.css /Library/WebServer/Documents/
+    cp /usr/local/share/gitweb/static/gitweb.js /Library/WebServer/Documents/
+
+    sudo chmod 2755 /Library/WebServer/CGI-Executables/gitweb.cgi
+    sudo chmod 2755 /Library/WebServer/Documents/git-favicon.png
+    sudo chmod 2755 /Library/WebServer/Documents/git-logo.png
+    sudo chmod 2755 /Library/WebServer/Documents/gitweb.css
+    sudo chmod 2755 /Library/WebServer/Documents/gitweb.js
+
+
+## installing xapian:
+
+Download xapian and omega.
+
+I needed pcre first:
+
+    sudo port install pcre
+
+Then, in the xapian source directory:
+
+    ./configure
+    make
+    sudo make install
+
+
+## installing omega:
+
+I had a build error due to libiconv undefined symbols; `sudo port deactivate libiconv` took care of it. After the install I had trouble with ikiwiki, so I did a `sudo port install libiconv` and ikiwiki came back.
+
+    ./configure
+    make
+    sudo make install
+
+
+## installing Search::Xapian from CPAN
+
+For some reason this wouldn't install using the CPAN console, so I went to CPAN online and downloaded the source.
+
+    perl Makefile.PL
+    make
+    make test
+    sudo make install
+
+It installed without issue, so I'm baffled why it didn't install from the command line.
+
+
+## setup file
+
+    #!/usr/bin/perl
+    # Ikiwiki setup automator.
+    #
+    # This setup file causes ikiwiki to create a wiki, check it into revision
+    # control, generate a setup file for the new wiki, and set everything up.
+    #
+    # Just run: ikiwiki -setup /etc/ikiwiki/auto.setup
+    #
+    # By default, it asks a few questions, and confines itself to the user's home
+    # directory. You can edit it to change what it asks questions about, or to
+    # modify the values to use site-specific settings.
+    require IkiWiki::Setup::Automator;
+
+    our $wikiname="your wiki";
+    our $wikiname_short="yourwiki";
+    our $rcs="git";
+    our $admin="your name";
+    use Net::Domain q{hostfqdn};
+    our $domain="your.domain";
+
+    IkiWiki::Setup::Automator->import(
+        wikiname => $wikiname,
+        adminuser => [$admin],
+        rcs => $rcs,
+        srcdir => "/opt/ikiwiki/$wikiname_short",
+        destdir => "/Library/WebServer/Documents/$wikiname_short",
+        repository => "/opt/ikiwiki/$wikiname_short.".($rcs eq "monotone" ? "mtn" : $rcs),
+        dumpsetup => "/opt/ikiwiki/$wikiname_short.setup",
+        url => "http://$domain/$wikiname_short",
+        cgiurl => "http://$domain/cgi-bin/$wikiname_short/ikiwiki.cgi",
+        cgi_wrapper => "/Library/WebServer/CGI-Executables/$wikiname_short/ikiwiki.cgi",
+        adminemail => "your\@email.com",
+        add_plugins => [qw{goodstuff websetup}],
+        disable_plugins => [qw{}],
+        libdir => "/opt/ikiwiki/.ikiwiki",
+        rss => 1,
+        atom => 1,
+        syslog => 1,
+    )
+
+
+## turning on search plugin:
+
+I turned on the plugin from the setup page in ikiwiki, but it gave an error when I went to search: "Error: /usr/lib/cgi-bin/omega/omega failed: No such file or directory".
+I did a `find / -name omega -print` and found the omega program in `/usr/local/lib/xapian-omega/bin/omega`.
+
+Then I went into the 2wiki.setup file, replaced the bad path, updated, and badda-boom badda-bing.
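For reference, the path the search plugin runs is a setup-file option, so the same fix can be expressed directly in the setup file. A sketch (the value is the location found above; yours may differ):

```perl
# search plugin: point omega_cgi at where omega actually lives
omega_cgi => "/usr/local/lib/xapian-omega/bin/omega",
```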
diff --git a/doc/tips/ikiwiki_via_gopher.mdwn b/doc/tips/ikiwiki_via_gopher.mdwn
new file mode 100644
index 000000000..ffea70f73
--- /dev/null
+++ b/doc/tips/ikiwiki_via_gopher.mdwn
@@ -0,0 +1,22 @@
+Remember gopher? Ikiwiki can be served up by this venerable protocol.
+
+It's pretty simple to get it going. Just install pygopherd or another gopher
+server, and have ikiwiki put its pages where that server expects. For
+pygopherd, that's `/var/gopher`.
+
+When building the wiki, make sure to specify --no-usedirs (or the equivalent in
+the setup file). Gopher doesn't convert "foo/" links into
+"foo/index.html", so usedirs won't work well with it; with usedirs
+disabled, browsing through the wiki via gopher will work just fine.
+
+Since AFAIK gopher has no equivalent to CGI, you'll need to keep a
+web server around for editing pages. If you do set up a CGI, make sure to
+configure `url` to something like `gopher://hostname/h/`, so that it
+links back properly to gopherspace from the CGI.
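In setup-file terms, the advice above comes down to a few options; a sketch (hostname and paths are examples):

```perl
usedirs => 0,                    # same as --no-usedirs
destdir => "/var/gopher",        # where pygopherd serves from
url => "gopher://hostname/h/",   # so CGI-generated links point back into gopherspace
```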
+
+One unresolved problem: style sheets are not loaded. The URLs seem OK, but
+pygopherd seems to serve them in a form that doesn't work somehow. I have
+not investigated further, because a fully unstyled web page fits the retro
+gopher aesthetic better anyhow.
+
+--[[Joey]]
diff --git a/doc/tips/ikiwiki_via_gopher/discussion.mdwn b/doc/tips/ikiwiki_via_gopher/discussion.mdwn
new file mode 100644
index 000000000..196f203bc
--- /dev/null
+++ b/doc/tips/ikiwiki_via_gopher/discussion.mdwn
@@ -0,0 +1,8 @@
+Joey, do you have an ikiwiki served up on gopher, as an example, that I can take a look at?
+I find this prospect interesting.
+I have a gopherhole, but no wiki in it.
+
+I was wondering, myself, what it might be like if I symlinked a dir full of dokuwiki pages into my gopherhole... hmmmm.
+I might try that.
+
+tony baldwin | http://tonybaldwin.me | gopher://tonybaldwin.me
diff --git a/doc/tips/importing_posts_from_typo.mdwn b/doc/tips/importing_posts_from_typo.mdwn
new file mode 100644
index 000000000..1b87e7dae
--- /dev/null
+++ b/doc/tips/importing_posts_from_typo.mdwn
@@ -0,0 +1 @@
+[Here](http://blog.spang.cc/posts/migrating_from_typo_to_ikiwiki/) is a blog post that gives instructions and a script for importing posts from [Typo](http://typosphere.org/), a Ruby-on-Rails based blogging engine.
diff --git a/doc/tips/importing_posts_from_wordpress/ikiwiki-wordpress-import.mdwn b/doc/tips/importing_posts_from_wordpress/ikiwiki-wordpress-import.mdwn
new file mode 100644
index 000000000..a59d4b5ad
--- /dev/null
+++ b/doc/tips/importing_posts_from_wordpress/ikiwiki-wordpress-import.mdwn
@@ -0,0 +1,468 @@
+[[!meta title="ikiwiki-wordpress-import"]]
+
+I converted the script to Perl. The new version gets your name and email automatically from your git config, converts the body of your posts to markdown, and also imports comments. More importantly, it works with the latest WordPress, which the Python version does not. Note that it's still not 100% perfect and I still intend to make a few modifications, but they will require access to the MySQL database, and that may render the script useless to some users.
+
+-----
+[[!format perl '''
+#!/usr/bin/env perl
+
+use 5.16.1;
+use warnings;
+
+use XML::Simple;
+use DateTime::Format::Strptime;
+use HTML::WikiConverter;
+use LWP::UserAgent;
+use Try::Tiny;
+use Digest::MD5 'md5_hex';
+
+die "usage: $0 import_file subdir [branch] | git-fast-import"
+ unless @ARGV == 2 or @ARGV == 3;
+
+chomp(my $name = qx(git config --get user.name));
+chomp(my $email = qx(git config --get user.email));
+
+my ($file, $subdir, $branch) = @ARGV;
+
+my %events;
+
+POST:
+for my $x (grep $_->{'wp:status'} eq 'publish', @{XMLin($file)->{channel}{item}}) {
+ state $date_parser = DateTime::Format::Strptime->new(
+ pattern => '%F %T',
+ time_zone => 'UTC',
+ );
+
+    my $stub = $x->{link} =~ m<([^/]+)\/$>
+ ? $1
+ : lc($x->{title} =~ s/\W/-/gr =~ s/-$//r)
+ ;
+
+ my $guid = $x->{guid}{content} || $x->{link};
+ utf8::encode($x->{title});
+ my $msg = qq($x->{title}\n\nfrom WordPress [$guid]);
+ my $timestamp = $date_parser
+ ->parse_datetime($x->{'wp:post_date_gmt'})
+ ->epoch;
+
+ my $c = $x->{category};
+ $c = [$c] if ref $c && ref $c ne 'ARRAY';
+
+ my $content =
+ sprintf(qq([[!meta title="%s"]]\n), $x->{title} =~ s/"/\\"/gr) .
+ convert_content($x->{'content:encoded'}) . "\n\n" .
+ join("\n",
+ map '[[!tag ' . s/ /-/r . ']]',
+ keys %{
+ +{
+ map { $_ => 1 }
+ grep $_ ne 'uncategorized',
+ map $_->{nicename},
+ @$c
+ }
+ }
+ );
+
+ $events{$timestamp} = join "\n",
+ "commit refs/heads/$branch",
+ "committer $name <$email> $timestamp +0000",
+ 'data <<8675309',
+ $msg,
+ '8675309',
+ "M 644 inline $subdir/$stub.mdwn",
+ 'data <<8675309',
+ $content,
+ '8675309'
+ ;
+
+ get_comments($x->{link}, "$subdir/$stub")
+ if $x->{'wp:post_type'} eq 'post'
+}
+
+sub get_comments {
+ my ($url, $dir) = @_;
+
+ state $ua = LWP::UserAgent->new;
+
+ my $content = $ua->get("$url/feed")->decoded_content;
+ my $first;
+ my $bail;
+ my $decoded =
+ try { XMLin($content, ForceArray => ['item']) }
+ catch { $bail = 1 };
+
+ return if $bail;
+
+ COMMENT:
+ for my $x (@{$decoded->{channel}{item}}) {
+ my $date = $x->{pubDate};
+ $date =~ s/^\S+\s//;
+ $date =~ s/\s\S+$//;
+
+ #ghetto
+ $date =~ s/Jan/01/;
+ $date =~ s/Feb/02/;
+ $date =~ s/Mar/03/;
+ $date =~ s/Apr/04/;
+ $date =~ s/May/05/;
+ $date =~ s/Jun/06/;
+ $date =~ s/Jul/07/;
+ $date =~ s/Aug/08/;
+ $date =~ s/Sep/09/;
+ $date =~ s/Oct/10/;
+ $date =~ s/Nov/11/;
+ $date =~ s/Dec/12/;
+
+ state $date_parser = DateTime::Format::Strptime->new(
+ pattern => '%d %m %Y %T',
+ time_zone => 'UTC',
+ );
+
+ my $datetime = $date_parser
+ ->parse_datetime($date);
+
+ my $timestamp = $datetime->epoch;
+ my $formatted_date = "$timestamp";
+
+ my $msg = 'Added a comment';
+ my $content = convert_content($x->{'content:encoded'});
+ utf8::encode($x->{'dc:creator'});
+
+ $events{$timestamp} = join "\n",
+ "commit refs/heads/$branch",
+ # still need to get email address
+ "committer $x->{'dc:creator'} <$x->{'dc:creator'}> $timestamp +0000",
+ 'data <<8675309',
+ $msg,
+ '8675309',
+ "M 644 inline " . unique_comment_location($dir, $content),
+ 'data <<8675309',
+
+ <<"COMMENT",
+[[!comment format=mdwn
+ username="$x->{'dc:creator'}"
+ date="$formatted_date"
+ content="""
+$content
+"""]]
+COMMENT
+ '8675309'
+ ;
+ }
+}
+
+say $events{$_} for sort { $a <=> $b } keys %events;
+
+sub convert_content {
+ my $body = shift;
+
+ utf8::encode($body);
+
+ state $converter = HTML::WikiConverter->new(
+ dialect => 'Markdown',
+ link_style => 'inline',
+ unordered_list_style => 'dash',
+ image_style => 'inline',
+ image_tag_fallback => 0,
+ );
+
+ # I know I know you can't parse XML with regular expressions. Go find a real
+ # parser and send me a patch
+ my $in_code = 0;
+
+ my $start_code = qr(<pre[^>]*>);
+ # (?:) is a no op but keeps ikiwiki from breaking my script
+ my $end_code = qr(</p(?:)re>);
+
+ $body =~ s(&#(?:8217|039);)(')g;
+ $body =~ s(&(?:quot|#822[01]);)(")g;
+ $body =~ s(&lt;)(<)g;
+ $body =~ s(&gt;)(>)g;
+ $body =~ s(&amp;)(&)g;
+ $body =~ s(&#8230;)(...)g;
+ $body =~ s(&#821[12];)(-)g;
+ $body =~ s(&#8216;)(')g;
+ $body =~ s(&#8242;)(')g;
+ $body =~ s(&infin;)(∞)g;
+ $body =~ s(&nbsp;)()g;
+ $body =~ s(<code[^>]*>)(<p(?:)re>)g;
+ $body =~ s(</c(?:)ode>)(</p(?:)re>)g;
+
+ my @tokens =
+ map {; split qr[(?=<p(?:)re>)] }
+ map {; split qr[</p(?:)re>\K] }
+ split /\n\n/,
+ $body;
+
+ my @new_tokens;
+ for my $t (@tokens) {
+ if (
+ ($in_code && $t !~ $end_code) ||
+ ($t =~ $start_code && $t =~ $end_code)
+ ) {
+ # do nothing
+ } elsif ($t =~ $start_code) {
+ $in_code = 1;
+ } elsif ($t =~ $end_code) {
+ $in_code = 0;
+ } else {
+ die "$t !!! '$1'" if $t =~ m/&([^;\s]+);/ && $1 !~ /[lg]t/;
+
+ $t = "<p>$t</p>"
+ }
+ push @new_tokens, $t
+ }
+
+ $converter->html2wiki(join "\n\n", @new_tokens)
+}
+
+sub unique_comment_location {
+ my ($dir, $content) = @_;
+
+ utf8::encode($content);
+ my $md5 = md5_hex($content);
+
+ my $location;
+ my $i = 0;
+ do {
+ $i++;
+ $location = "$dir/comment_${i}_$md5._comment";
+ } while -e $location;
+
+ return $location
+}
+
+''']]
+-----
+
+I modified the script a bit so categories and tags would actually show up in the output file.
+
+-----
+[[!format '''
+#!/usr/bin/env python
+
+"""
+ Purpose:
+ Wordpress-to-Ikiwiki import tool
+
+ Copyright:
+ Copyright (C) 2007 Chris Lamb <chris@chris-lamb.co.uk>
+
+ This program is free software: you can redistribute it and/or modify
+ it under the terms of the GNU General Public License as published by
+ the Free Software Foundation, either version 3 of the License, or
+ (at your option) any later version.
+
+ This program is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+ GNU General Public License for more details.
+
+ You should have received a copy of the GNU General Public License
+ along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+ Usage: run --help as an argument with this script.
+
+ Notes:
+ I added some extra bits to include the \[[!tag foo]] stuff in the post,
+ as it wasn't before, at all. I'll diff the versions out so you can see
+ the mess I made :).
+
+"""
+
+import os, sys
+import time
+import re
+
+from BeautifulSoup import BeautifulSoup
+
+import codecs, htmlentitydefs
+
+codecs.register_error('html_replace', lambda x: (''.join([u'&%s;' \
+ % htmlentitydefs.codepoint2name[ord(c)] for c in x.object[x.start:x.end]]), x.end))
+
+def main(name, email, subdir, branch='master'):
+ soup = BeautifulSoup(sys.stdin.read())
+
+ # Regular expression to match stub in URL.
+ stub_pattern = re.compile(r'.*\/(.+)\/$')
+
+ for x in soup.findAll('item'):
+ # Ignore draft posts
+ if x.find('wp:status').string != 'publish': continue
+
+ match = stub_pattern.match(x.guid.string)
+ if match:
+ stub = match.groups()[0]
+ else:
+ # Fall back to our own stubs
+ stub = re.sub(r'[^a-zA-Z0-9_]', '-', x.title.string).lower()
+
+ commit_msg = """Importing WordPress post "%s" [%s]""" % (x.title.string, x.guid.string)
+ timestamp = time.mktime(time.strptime(x.find('wp:post_date_gmt').string, "%Y-%m-%d %H:%M:%S"))
+
+ content = '\[[!meta title="%s"]]\n\n' % (x.title.string.replace('"', r'\"'))
+ content += x.find('content:encoded').string.replace('\r\n', '\n')
+
+ # categories = x.findAll('category')
+ # categories = x.findAll({'category':True}, attrs={'domain':re.compile(('category|tag'))})
+ # categories = x.findAll({'category':True}, domain=["category", "tag"])
+ # categories = x.findAll({'category':True}, nicename=True)
+ """
+ We do it differently here because we have duplicates otherwise.
+ Take a look:
+    <category><![CDATA[Health]]></category>
+    <category domain="category" nicename="health"><![CDATA[Health]]></category>
+
+    If we do what the original did, we end up with all tags and cats doubled.
+ Therefore we only pick out nicename="foo". Our 'True' below is our 'foo'.
+ I'd much rather have the value of 'nicename', and tried, but my
+ python skillz are extremely limited....
+ """
+ categories = x.findAll('category', nicename=True)
+ if categories:
+ content += "\n"
+ for cat in categories:
+ # remove 'tags/' because we have a 'tagbase' set.
+ # your choice: 'tag', or 'taglink'
+ # content += "\n\[[!tag %s]]" % (cat.string.replace(' ', '-'))
+ content += "\n\[[!taglink %s]]" % (cat.string.replace(' ', '-'))
+ # print >>sys.stderr, cat.string.replace(' ', '-')
+
+ # moved this thing down
+ data = content.encode('ascii', 'html_replace')
+ print "commit refs/heads/%s" % branch
+        print "committer %s <%s> %d +0000" % (name, email, timestamp)
+ print "data %d" % len(commit_msg)
+ print commit_msg
+ print "M 644 inline %s" % os.path.join(subdir, "%s.mdwn" % stub)
+ print "data %d" % len(data)
+ print data
+
+if __name__ == "__main__":
+ if len(sys.argv) not in (4, 5):
+ print >>sys.stderr, "%s: usage: %s name email subdir [branch] < wordpress-export.xml | git-fast-import " % (sys.argv[0], sys.argv[0])
+ else:
+ main(*sys.argv[1:])
+
+''']]
+-----
+
+I have another version of the script, which uses the `timestamp` from the script and inserts it as a \[[!meta date="foodate"]] directive. I'm posting it here just in case I happen to do something to the httpd.
+
+(Hopefully I've escaped everything properly; if I missed something, check the source.)
+
+-----
+[[!format '''
+#!/usr/bin/env python
+
+"""
+ Purpose:
+ Wordpress-to-Ikiwiki import tool
+
+ Copyright:
+ Copyright (C) 2007 Chris Lamb <chris@chris-lamb.co.uk>
+
+ This program is free software: you can redistribute it and/or modify
+ it under the terms of the GNU General Public License as published by
+ the Free Software Foundation, either version 3 of the License, or
+ (at your option) any later version.
+
+ This program is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+ GNU General Public License for more details.
+
+ You should have received a copy of the GNU General Public License
+ along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+ Usage: run --help as an argument with this script.
+
+ Notes:
+ I added some extra bits to include the \[[!tag foo]] stuff in the post,
+ as it wasn't before, at all. I'll diff the versions out so you can see
+ the mess I made :).
+
+"""
+
+import os, sys
+import time
+import re
+
+from datetime import datetime
+from BeautifulSoup import BeautifulSoup
+
+import codecs, htmlentitydefs
+
+codecs.register_error('html_replace', lambda x: (''.join([u'&%s;' \
+ % htmlentitydefs.codepoint2name[ord(c)] for c in x.object[x.start:x.end]]), x.end))
+
+def main(name, email, subdir, branch='master'):
+ soup = BeautifulSoup(sys.stdin.read())
+
+ # Regular expression to match stub in URL.
+ stub_pattern = re.compile(r'.*\/(.+)\/$')
+
+ for x in soup.findAll('item'):
+ # Ignore draft posts
+ if x.find('wp:status').string != 'publish': continue
+
+ match = stub_pattern.match(x.guid.string)
+ if match:
+ stub = match.groups()[0]
+ else:
+ # Fall back to our own stubs
+ stub = re.sub(r'[^a-zA-Z0-9_]', '-', x.title.string).lower()
+
+ commit_msg = """Importing WordPress post "%s" [%s]""" % (x.title.string, x.guid.string)
+ timestamp = time.mktime(time.strptime(x.find('wp:post_date_gmt').string, "%Y-%m-%d %H:%M:%S"))
+ content = '\[[!meta title="%s"]]\n' % (x.title.string.replace('"', r'\"'))
+ content += "\[[!meta date=\"%s\"]]\n" % datetime.fromtimestamp(timestamp)
+ content += x.find('content:encoded').string.replace('\r\n', '\n')
+
+ """
+ We do it differently here because we have duplicates otherwise.
+ Take a look:
+    <category><![CDATA[Health]]></category>
+    <category domain="category" nicename="health"><![CDATA[Health]]></category>
+
+    If we do what the original did, we end up with all tags and cats doubled.
+ Therefore we only pick out nicename="foo". Our 'True' below is our 'foo'.
+ I'd much rather have the value of 'nicename', and tried, but my
+ python skillz are extremely limited....
+ """
+ categories = x.findAll('category', nicename=True)
+ if categories:
+ content += "\n"
+ for cat in categories:
+ # remove 'tags/' because we have a 'tagbase' set.
+ # your choice: 'tag', or 'taglink'
+ # content += "\n\[[!tag %s]]" % (cat.string.replace(' ', '-'))
+ content += "\n\[[!taglink %s]]" % (cat.string.replace(' ', '-'))
+ # this is just debugging, and for fun
+ # print >>sys.stderr, cat.string.replace(' ', '-')
+
+ # moved this thing down
+ data = content.encode('ascii', 'html_replace')
+ print "commit refs/heads/%s" % branch
+        print "committer %s <%s> %d +0000" % (name, email, timestamp)
+ print "data %d" % len(commit_msg)
+ print commit_msg
+ print "M 644 inline %s" % os.path.join(subdir, "%s.mdwn" % stub)
+ print "data %d" % len(data)
+ print data
+
+if __name__ == "__main__":
+ if len(sys.argv) not in (4, 5):
+ print >>sys.stderr, "%s: usage: %s name email subdir [branch] < wordpress-export.xml | git-fast-import " % (sys.argv[0], sys.argv[0])
+ else:
+ main(*sys.argv[1:])
+
+''']]
+-----
+
+
+[[!tag wordpress]]
+[[!tag python]]
+[[!tag conversion]]
+[[!tag ikiwiki]]
diff --git a/doc/tips/inside_dot_ikiwiki.mdwn b/doc/tips/inside_dot_ikiwiki.mdwn
new file mode 100644
index 000000000..a74d00f47
--- /dev/null
+++ b/doc/tips/inside_dot_ikiwiki.mdwn
@@ -0,0 +1,91 @@
+[[!meta title="inside .ikiwiki"]]
+
+The `.ikiwiki` directory contains ikiwiki's internal state. Normally,
+you don't need to look in it, but here are some tips for how to do so if
+you need/want to.
+
+## the index
+
+`.ikiwiki/indexdb` contains a cache of information about pages.
+This information can always be recalculated by rebuilding the wiki.
+(So the file is safe to delete and need not be backed up.)
+It used to be a (semi) human-readable text file, but is not anymore.
+
+To dump the contents of the file, enter a perl command like this.
+
+ joey@kodama:~/src/joeywiki/.ikiwiki> perl -le 'use Storable; my $index=Storable::retrieve("indexdb"); use Data::Dumper; print Dumper $index' | head
+ $VAR1 = {
+ 'index' => {
+ 'ctime' => 1199739528,
+ 'dest' => [
+ 'index.html'
+ ],
+ 'mtime' => 1199739528,
+ 'src' => 'index.mdwn',
+ 'links' => [
+ 'index/discussion',
+
+## the user database
+
+`.ikiwiki/userdb` is the user database, which records preferences of all
+web users.
+
+To list all users in the database, enter a perl command like this.
+Note that the output can include both registered users, and known
+openids.
+
+ joey@kodama:~/src/joeywiki/.ikiwiki> perl -le 'use Storable; my $userinfo=Storable::retrieve("userdb"); print $_ foreach keys %$userinfo'
+ http://joey.kitenet.net/
+ foo
+
+To list each user's email address:
+
+ joey@kodama:~/src/joeywiki/.ikiwiki> perl -le 'use Storable; my $userinfo=Storable::retrieve("userdb"); print $userinfo->{$_}->{email} foreach keys %$userinfo'
+
+ joey@kitenet.net
+
+To dump the entire database contents:
+
+ joey@kodama:~/src/joeywiki/.ikiwiki> perl -le 'use Storable; my $userinfo=Storable::retrieve("userdb"); use Data::Dumper; print Dumper $userinfo'
+ $VAR1 = {
+ 'http://joey.kitenet.net/' => {
+ 'email' => 'joey@kitenet.net',
+ [...]
+
+Editing values is simply a matter of changing values and calling `Storable::nstore()`.
+So to change a user's email address:
+
+ joey@kodama:~/src/joeywiki/.ikiwiki> perl -le 'use Storable; my $userinfo=Storable::retrieve("userdb"); $userinfo->{"foo"}->{email}=q{foo@bar}; Storable::lock_nstore($userinfo, "userdb")'
+
+To remove that user:
+
+ joey@kodama:~/src/joeywiki/.ikiwiki> perl -le 'use Storable; my $userinfo=Storable::retrieve("userdb"); delete $userinfo->{"foo"}; Storable::lock_nstore($userinfo, "userdb")'
+
+I've not written actual utilities to do this yet because I've only needed
+to do it rarely, and the data I've wanted has been different each time.
+--[[Joey]]
+
+## the session database
+
+`.ikiwiki/sessions.db` is the session database. See the [[!cpan CGI::Session]]
+documentation for more details.
+
+## lockfiles
+
+In case you're curious, here's what the various lock files do.
+
+* `.ikiwiki/lockfile` is the master ikiwiki lock file. Ikiwiki takes this
+ lock before reading/writing state.
+* `.ikiwiki/commitlock` is locked as a semaphore, to disable the commit hook
+ from doing anything.
+* `.ikiwiki/cgilock` is locked by the cgi wrapper, to ensure that only
+ one ikiwiki process is run at a time to handle cgi requests.
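For example, a maintenance script could cooperate with ikiwiki by taking the master lock itself before poking at state by hand. A minimal sketch, assuming `flock(1)` from util-linux and that you run it from the srcdir:

```shell
# Hold .ikiwiki/lockfile while working, releasing it when the command exits.
mkdir -p .ikiwiki
touch .ikiwiki/lockfile
flock .ikiwiki/lockfile -c 'echo "lock held; safe to read state"'
```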
+
+## plugin state files
+
+Some plugins create other files to store their state.
+
+* `.ikiwiki/aggregate` is a plain text database used by the aggregate plugin
+ to record feeds and known posts.
+* `.ikiwiki/xapian/` is created by the search plugin, and contains xapian-omega
+ configuration and the xapian database.
diff --git a/doc/tips/inside_dot_ikiwiki/discussion.mdwn b/doc/tips/inside_dot_ikiwiki/discussion.mdwn
new file mode 100644
index 000000000..69df369ec
--- /dev/null
+++ b/doc/tips/inside_dot_ikiwiki/discussion.mdwn
@@ -0,0 +1,66 @@
+My database appears corrupted:
+
+ $ perl -le 'use Storable; my $index=Storable::retrieve("indexdb"); use Data::Dumper; print Dumper $index'
+ Out of memory!
+
+No idea how this happened. I've blown it away and recreated it but, for future reference, is there any less violent way to recover from this situation? I miss having the correct created and last edited times. --[[sabr]]
+> update: fixed ctimes and mtimes using [these instructions](http://u32.net/Mediawiki_Conversion/Git_Import/#Correct%20Creation%20and%20Last%20Edited%20time) --[[sabr]]
+
+> That's overly complex. Just run `ikiwiki -setup your.setup -gettime`.
+> BTW, I'd be interested in examining such a corrupt storable file to try
+> to see what happened to it. --[[Joey]]
+
+>> --gettime appears to only set the last edited date. It's not supposed to set the creation date, is it? The only place that info is stored is in the git repo.
+
+>>> Pulling the page creation date out of the git history is exactly what
+>>> --gettime does. (It used to be called --getctime, and only do that; now
+>>> it also pulls out the last modified date). --[[Joey]]
+
+>> Alas, I seem to have lost the bad index file to periodic /tmp wiping; I'll send it to you if it happens again. --[[sabr]]
+
+<!-- Add by Blanko -->
+
+## Lost password for a user
+
+This morning a user lost their password, and I was able to set a new one for them. Here is how I did it:
+
+> You can certainly do that, but do note that ikiwiki will offer to mail a
+> user a password reset link if they lost their password. --[[Joey]]
+
+### Locate the user database
+
+As the tips above show, the user database is in the source directory, for example:
+
+ src/.ikiwiki/userdb
+
+### See which user to modify
+
+Because I didn't know the user's real login, I had to read the whole database:
+
+ perl -le 'use Storable; my $index=Storable::retrieve("userdb"); use Data::Dumper; print Dumper $index'
+
+Then I was able to find this:
+
+ 'Utilisateur' => {
+ 'email' => 'user@pl.fr',
+ 'cryptresettoken' => '$2a$10$cfVeOoVbFw9VzMlgEbPMsu34pwHIFP84mWlkrs2RCKknZYPZkPffm',
+ 'password' => '',
+ 'resettoken' => '',
+ 'cryptpassword' => '$2a$10$H8bYq.dlb68wpnfJgVZQhOdsF9JQ06cteRfhPQPB5eHKnD5Y3u7au',
+ 'regdate' => '1226574052'
+ },
+
+Now let's modify the relevant lines.
+
+### Modify the line
+
+When you have found the line to modify, take the user name and change that user's password to **sc** (for example):
+
+ perl -le 'use Storable; my $userinfo=Storable::retrieve("userdb"); $userinfo->{"Utilisateur"}->{cryptpassword}=q{$2a$10$7viOHCrUkdAVL135Kr6one1mpZQ/FWYC773G1yZ0EtQciI11sSDRS}; Storable::lock_nstore($userinfo, "userdb")'
+ perl -le 'use Storable; my $userinfo=Storable::retrieve("userdb"); $userinfo->{"Utilisateur"}->{cryptresettoken}=q{}; Storable::lock_nstore($userinfo, "userdb")'
+
+Because I didn't know how to remove the `cryptresettoken` and `resettoken` fields, I replaced their contents with empty strings.
+
+After all these modifications, the user *Utilisateur* could log into their account with the password **sc**, go to Preferences, and change their password.
+
+<!-- End of Blanko's modifications -->
diff --git a/doc/tips/integrated_issue_tracking_with_ikiwiki.mdwn b/doc/tips/integrated_issue_tracking_with_ikiwiki.mdwn
new file mode 100644
index 000000000..0c871d6c0
--- /dev/null
+++ b/doc/tips/integrated_issue_tracking_with_ikiwiki.mdwn
@@ -0,0 +1,277 @@
+[[!meta title="Integrated issue tracking with Ikiwiki"]]
+
+[[!meta author="Joey Hess, LinuxWorld.com"]]
+
+[[!meta copyright="""
+Copyright 2007 Joey Hess <joeyh@ikiwiki.info>, LinuxWorld.com
+[First published](http://www.linuxworld.com/news/2007/040607-integrated-issue-tracking-ikiwiki.html)
+on [LinuxWorld.com](http://www.linuxworld.com/), a publication of Network
+World Inc., 118 Turnpike Rd., Southboro, MA 01772.
+"""]]
+[[!meta license="[[GPL|freesoftware]]"]]
+
+Wikis are not just for encyclopedias and websites anymore. You can use
+Ikiwiki in combination with your revision control system to handle issue
+tracking, news feeds, and other needs of a software project. The wiki can
+make your bug reports as much a part of your software project as its code,
+with interesting results.
+
+Ikiwiki is a wiki engine with a twist. It's best
+described by the term "wiki compiler". Just as a
+typical software project consists of source code
+that is stored in revision control and compiled with
+`make` and `gcc`, an ikiwiki-based wiki is stored as
+human-editable source in a revision control system,
+and built into HTML using ikiwiki.
+
+Ikiwiki uses your revision control system to track
+changes and handle tasks such as rolling back changes and
+merging edits. Because it takes advantage of revision
+control, there are no annoying warnings about other
+people editing a file, or finding yourself locked
+out of a file because someone else started editing it
+and left. Instead, the other person's changes will
+be automatically merged with yours when you commit.
+
+In the rare cases where automatic merging fails
+because of concurrent edits to the same part of a
+page, regular commit conflict markers are shown in
+the file to let you resolve the conflict, as you
+would for conflicting edits in source code.
+
+Ikiwiki is a full-featured wiki that you can use
+for a variety of purposes, from traditional wikis
+to weblogs, podcasting, or even aggregating other
+sites' RSS feeds into a Planet page. While people
+are [[using|ikiwikiusers]]
+Ikiwiki for purposes ranging from genealogy research
+to shoe accessory sales, one thing it's especially
+well suited for is collaborative software development,
+including announcements, documentation, managing a
+software project's web site, and even acting as an
+issue tracking system.
+
+## Building a project wiki with ikiwiki
+
+The simplest way to use ikiwiki is to build static
+HTML files from source wiki files. This example builds
+a wiki for an imaginary software project. The wiki
+source files used in this example are available in the
+[[examples/softwaresite|examples/softwaresite]] section
+of ikiwiki's documentation.
+
+ wiki$ ls
+ Makefile bugs.mdwn doc/ download.mdwn news/
+ bugs/ contact.mdwn doc.mdwn index.mdwn news.mdwn
+ wiki$ make
+ ikiwiki `pwd` html --wikiname FooBar --plugin=goodstuff \
+ --exclude=html --exclude=Makefile
+ wiki$ w3m -dump html/doc/faq.html
+ FooBar/ doc/ faq
+
+ FooBar frequently asked questions.
+
+ 1. Is this a real program?
+ 2. Really?
+
+ _Is this a real program?_
+
+ No, it's just an example.
+
+ _Really?_
+
+ Yes, really.
+
+ Links: contact doc
+ Last edited Wed Nov 22 09:58:35 2006
+
+If all you need is a simple static set of pages
+that can be put up on a web site, or shipped with
+a software package, this is a good starting point.
+The examples included with ikiwiki include pages for
+a news feed for the project (with RSS), an issue
+tracker, and other pages users expect to see on a
+project's website. You can check the wiki-format text
+into revision control as part of the software project,
+and tie it into the build system using the Makefile.
+
+Ikiwiki can also be tied into the [[post-commit]] hook of your revision
+control system, so that whenever a developer commits a change to a wiki
+page in revision control, the project's web site is automatically updated.
+The [[ikiwiki_tutorial|setup]] explains in
+detail how to set this up using the Subversion, Git, TLA, and Mercurial
+revision control systems.
+
+The tutorial also explains how to configure ikiwiki so that users can edit
+pages using a web interface, with their changes committed back into revision
+control. After all, one of the benefits of keeping a project's docs in a wiki
+is to make it easy for users to improve them, so that busy software developers
+don't have to. And if the wiki is being used for issue tracking, this will
+let users post and follow up on bug reports.
+
+## Using a wiki for issue tracking?
+
+You might be wondering exactly how a wiki can be used as an issue tracking
+system. Three key parts of ikiwiki come together to create an issue tracker:
+pages, tags, and inlining.
+
+Each issue is described on a separate page in the
+wiki. There can also be an associated Discussion page,
+as well as other related subpages that can be used
+to hold files used to reproduce the bug, or patches,
+or other related files. Since each issue is a page,
+standard wiki links can be used to link related
+issues, or link issues with other pages in the wiki.
+Each issue has its own unique URL. Since ikiwiki
+supports subdirectories, it's usual to keep all the
+bugs in a `bugs/` subdirectory. You might prefer
+to separate bugs and todo items, with todo items in
+their own `todo/` subdirectory.
+
+While directories are useful for broad hierarchical
+grouping, tags are better for categorizing issues
+as bugs, wishlist items, security issues, patches,
+or whatever other categories are useful. Bugs can
+be tagged "moreinfo", "done", "unreproducible",
+etc, to document different stages of
+their lifecycle. A developer can take ownership of a
+bug by tagging it with something like "owner/Joey".
+
+To tag a wiki page, edit it and add text such as "\[[!tag done]]". Note that
+adding a wiki link to "\[[done]]" will have the same categorisation effect
+as a tag, but the link will show up in the body of the page, which is a
+nice effect if used in a sentence such as "This was \[[done]] in version
+1.1.". Another way to close a bug is to move it out of the `bugs/`
+subdirectory, though this would prevent it from showing up in a list of
+closed bugs.
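For example, in a git-backed wiki checkout, tagging a bug done might look like this (a sketch; the page name and commit message are hypothetical):

```shell
# Sketch: "bugs/some_bug.mdwn" is a hypothetical bug page; in a real
# wiki this happens inside your checkout of the wiki's repository.
d=$(mktemp -d)
mkdir -p "$d/bugs"
echo 'Description of the bug.' > "$d/bugs/some_bug.mdwn"
# Append the tag directive that marks the bug done:
echo '[[!tag done]]' >> "$d/bugs/some_bug.mdwn"
# Committing this together with the fix links the fix to the closure:
#   git commit -m "fix the bug; mark it done" bugs/some_bug.mdwn
```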
+
+Inlining is how ikiwiki pulls individual issue pages together into
+something larger, be it a page listing recently opened bugs (with a form to
+let a user easily post a new bug), or a page listing recently closed bugs,
+or an index of all bugs, or all wishlist items, or RSS feeds for any of
+these. A flexible syntax is used for specifying what kind of pages should
+be inlined into a given page. A few examples:
+
+* A typical list of all open bugs, with their full text, and a form to post new
+ bugs.
+
+ \[[!inline pages="bugs/* and !link(done) and !*/Discussion" actions=yes postform=yes show=0 rootpage="bugs"]]
+
+* Index of the 30 most recently fixed bugs.
+
+ \[[!inline pages="bugs/* and link(done) and !*/Discussion" sort=mtime show=30 archive=yes]]
+
+* Index of the 10 most recently active bugs.
+
+ \[[!inline pages="bugs/* and !link(done) and !*/Discussion" sort=mtime show=10]]
+
+* Open security issues.
+
+ \[[!inline pages="bugs/* and link(security) and !link(done) and !*/Discussion"]]
+
+* Full text of bugs assigned to Joey.
+
+ \[[!inline pages="bugs/* and link(owner/Joey) and !link(done) and !*/Discussion" show=0]]
+
+It may seem strange to consider using a wiki for issue tracking when there
+are several dedicated bug tracking systems, like Bugzilla, that handle all
+aspects of it already. The weakest part of using ikiwiki for issue
+tracking, and certainly the place where a dedicated bug tracker like
+Bugzilla shines in comparison, is storing and querying structured data
+about bugs. Ikiwiki has little structured data except for page filenames
+and tags, so if you need lots of queryable data such as what versions a bug
+affects and what version it was fixed in, ikiwiki may not be a good fit for
+your issue tracking.
+
+On the other hand, by using a wiki for issue
+tracking, there is one less system for users and
+developers to learn, and all the flexibility of a
+wiki to take advantage of. Ikiwiki even supports
+[OpenID](http://openid.net/), so it's easy for users
+to use it for filing bugs without going through an
+annoying registration process.
+
+Developers who work offline, or at the other end of a
+slow connection, might appreciate having a full copy
+of the project bug tracking system, too.
+
+
+## Benefits
+
+Realistically, there are plusses and minuses to letting users edit a
+software project's documentation in a wiki. Like any wiki, to be
+successful, some review is needed of the changes users make. In some cases
+it will be easiest to limit the pages that users are allowed to edit.
+Still, keeping the wiki open for user edits will probably turn up some
+passionate users who prove very useful at filling in holes in the
+documentation and cleaning up the site.
+
+Programmers are supposed to be bad at writing documentation, and putting a
+project's docs into a wiki might not solve that. But it can make it a
+little bit easier. Consider a programmer who's just coded up a new feature.
+He can commit that to a development branch in revision control, and then go
+update the docs on the web site to document it. But the feature isn't
+available in a released version yet, so it's probably easier to skip
+updating the website. Maybe once it's released, the web site will be
+updated to mention the feature, but maybe (probably) not.
+
+Now consider what happens if instead the web site is a wiki that has its
+source included in the project's revision control system. The programmer
+codes up the feature, and can easily update the docs in the wiki to match.
+When he commits his changes to a development branch, the docs are committed
+too. Later, when that change is merged to the release branch, the doc
+changes are also merged, and automatically go live on the web site.
+Updating the documentation to reflect each change made and publishing it on
+the website has become a standard part of the programmer's workflow.
+
+But this still requires programmers to write documentation, so maybe it
+still won't work. Let's go back a step. Before the programmer wrote that
+feature, he probably got some requests for it, and maybe he developed those
+into a specification for how the feature should work. Since ikiwiki can be
+used as an issue tracker, the requests were made using it, and were
+collaboratively edited on the wiki, to develop the specification. Once the
+feature is implemented, that issue can be closed. What better way to close
+it than to move it out of the issue tracking system, and into the project's
+documentation? In Subversion:
+
+ svn mv wiki/bugs/new_feature.mdwn wiki/doc/
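
In git, the same move might look like this (a sketch; the repository layout and file name are hypothetical):

```shell
# Set up a throwaway repository standing in for the project:
d=$(mktemp -d)
cd "$d"
git init -q .
mkdir -p wiki/bugs wiki/doc
echo 'Feature spec.' > wiki/bugs/new_feature.mdwn
git add wiki/bugs/new_feature.mdwn
git -c user.email=doc@example.com -c user.name=doc commit -qm 'file feature request'
# Close the issue by promoting it to documentation:
git mv wiki/bugs/new_feature.mdwn wiki/doc/
```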
+
+If the spec is written well enough to be useful for end user documentation,
+the programmer doesn't have to write a lot of docs after all; that was done
+when the feature was designed. By using ikiwiki for issue tracking, plus
+editing the spec, plus documentation, plus the website, each of these steps
+has built on the other and the programmer has had to do less busywork.
+
+A different example of how ikiwiki can tie
+things together is how a security hole might be
+handled. First it's discovered, and a bug filed about
+it. When it's fixed, the commit that fixes the bug
+can include a change to the bug's page, marking it
+as done. Since it's a security hole, the project
+needs to make an announcement right away so users
+will know they need to upgrade. This announcement
+can be added to the wiki's news feed, and committed
+along with the fix, and the announcement can use a
+regular wiki link to link to the bug that describes
+the security hole in detail. If the security hole
+also affects an older version of the software, the
+fix, along with the wiki documentation for that fix,
+can be merged into the branch for the older version.
+
+Another benefit of keeping the bug tracking system in revision control with
+the wiki is that it allows for disconnected development. So there's no need
+to be online to review the project's bug list, and there's no need to
+remember to close fixed bugs once you're back online.
+
+For fans of distributed revision control, ikiwiki opens even more
+possibilities. With a project's website and issue tracker kept in
+distributed revision control with the project, these become distributed as
+well, rather than centralized appendixes to the project. Developers can
+pass around changesets that not only fix bugs, but mark them as done. If
+large changes are being made in someone's branch, they can choose to put up
+their own version of the website, use it to track bugs for that branch, and
+when the branch is ready, all these changes can be merged back into the
+mainline of the project.
+
+Ikiwiki powers its own bug tracking system. To see how wiki bug tracking
+works in practice, visit the [[bugs]] or [[TODO]] pages.
diff --git a/doc/tips/integrated_issue_tracking_with_ikiwiki/discussion.mdwn b/doc/tips/integrated_issue_tracking_with_ikiwiki/discussion.mdwn
new file mode 100644
index 000000000..8c6a6ecc9
--- /dev/null
+++ b/doc/tips/integrated_issue_tracking_with_ikiwiki/discussion.mdwn
@@ -0,0 +1,32 @@
+From IRC messages... may later format into a nicer display (time is limited):
+
+Just wondering, who's using ikiwiki as their bug-tracking system? I'm trying to root out bug-tracking systems that work with GIT and so far like ikiwiki for docs, but haven't yet figured out the best way to make it work for bug-tracking.
+
+> I know of only a few:
+> * This wiki.
+> * The "awesome" window manager.
+
+I suppose having a separate branch for public web stuff w/ the following workflow makes sense:
+
+* Separate master-web and master branches
+* master-web is public
+* cherry-pick changes from master-web into master when they are sane
+* regularly merge master -> master-web
+
+> That's definitely one way to do it. For this wiki, I allow commits
+> directly to master via the web, and sanity check after the fact. Awesome
+> doesn't allow web commits at all.
+
+Bug origination point: ... anybody have ideas for this? Create branch at bug origination point and merge into current upstream branches? (I guess this would be where cherry-picking would work best, since the web UI can't do this)
+
+> Not sure what you mean.
+>> Documentation as to where the bug came from for related branches...
+>> Ex: The bug got located in r30, but really originated around r10. The desire is to propagate the bug to everything after r10.
+
+Bug naming: any conventions/ideas on how to standardize? Any suggestions on methods of linking commits to bugs without having to modify the bug in each commit?
+
+> I don't worry about naming, but then I don't refer to the bug urls
+> anywhere, so any names are ok. When I make a commit to fix a bug, I mark
+> the bug done in the same commit, which links things.
+
+-- [[harningt]]
diff --git a/doc/tips/laptop_wiki_with_git.mdwn b/doc/tips/laptop_wiki_with_git.mdwn
new file mode 100644
index 000000000..cfa565d1a
--- /dev/null
+++ b/doc/tips/laptop_wiki_with_git.mdwn
@@ -0,0 +1,71 @@
+[[!toc]]
+
+Using ikiwiki with the [[rcs/git]] backend, some interesting things can be done
+with creating mirrors (or, really, branches) of a wiki. In this tip, I'll
+assume your wiki is located on a server, and you want to take a copy with
+you on your laptop.
+
+With the configuration described here, you can commit local changes to the
+version on the laptop, perhaps while offline. You can browse and edit the
+wiki using a local web server. When you're ready, you can manually push the
+changes to the main wiki on the server.
+
+## simple clone approach
+
+First, set up the wiki on the server, if it isn't already. Nothing special
+needs to be done here, just follow the regular instructions in [[setup]]
+for setting up ikiwiki with git.
+
+Next, `git clone` the source (`$REPOSITORY`, not `$SRCDIR`)
+from the server to the laptop.
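For example (a sketch; a local bare repository stands in for the server here, since over ssh the command would be something like `git clone server:/path/to/wiki.git`):

```shell
# A bare repository standing in for $REPOSITORY on the server:
d=$(mktemp -d)
git init -q --bare "$d/wiki.git"
# On the laptop, clone it (over ssh: git clone server:/path/to/wiki.git):
git clone -q "$d/wiki.git" "$d/laptop-wiki"
```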
+
+Now, set up a [[web_server|dot_cgi]] on your laptop, if it doesn't
+already have one.
+
+Now you need to write a setup file for ikiwiki on the laptop. Mostly this
+is standard, but a few special settings are needed:
+
+* Configure a cgi wrapper as usual, but configure the git wrapper to
+ be written to the `post-commit` hook of the git clone, rather than the
+ usual `post-update` hook.
+
+* By default, ikiwiki pulls and pushes from `origin`. This shouldn't be
+ done on the laptop, because the origin may not be accessible (when the
+ laptop is offline). Also, commits made on the laptop should stay there,
+ until manually pushed, so that the laptop can serve as a staging area.
+
+ Make sure you have ikiwiki 2.11 or better installed, and set:
+
+ gitorigin_branch => "",
+
+* You can optionally enable the [[plugins/mirrorlist]] plugin,
+ and configure it so that each page links to the corresponding page on the
+ server.
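Putting those pieces together, the laptop's setup file might contain a fragment like the following (a sketch only; the paths, and the Perl-format setup file syntax, are assumptions to adapt to your own installation):

```perl
use IkiWiki::Setup::Standard {
    # ... usual settings ...
    cgi_wrapper => "/home/me/public_html/ikiwiki.cgi",
    # Wrapper in the clone's post-commit hook, not post-update:
    git_wrapper => "/home/me/wiki/.git/hooks/post-commit",
    # Don't pull/push origin automatically; the laptop is a staging area:
    gitorigin_branch => "",
}
```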
+
+Now just run `ikiwiki -setup wiki.setup -getctime` and you should be
+good to go. (You only need the slow `-getctime` option the first time you
+run setup.)
+
+Use standard git commands to handle pulling from and pushing to the server.
+
+Note that if changes are pulled from the server, you will need to manually
+update the wiki, with a command such as `ikiwiki -setup wiki.setup -refresh`.
+If you'd like it to automatically update when changes are merged in, you
+can simply make a symlink `post-merge` hook pointing at the `post-update`
+hook ikiwiki created.
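That symlink can be created like this (a sketch; the path is a placeholder, and an empty file stands in for the wrapper ikiwiki generated):

```shell
d=$(mktemp -d)                        # stands in for the laptop's wiki clone
mkdir -p "$d/.git/hooks"
touch "$d/.git/hooks/post-update"     # the ikiwiki-generated wrapper
# Run the same wrapper after merges (e.g. after "git pull"):
ln -s post-update "$d/.git/hooks/post-merge"
```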
+
+## bare mirror approach
+
+As above, set up a normal ikiwiki on the server, with the usual bare repository.
+
+Next, `git clone --mirror server:/path/to/bare/repository`
+
+This will be used as the $REPOSITORY on the laptop. Then you can follow
+the instructions in [[setup by hand|/setup/byhand]] as per a normal ikiwiki
+installation. This means that you can clone from the local bare repository
+as many times as you want (thus being able to have a repository which is
+used by the ikiwiki CGI, and another which you can use for updating via
+git).
+
+Use standard git commands, run in the laptop's bare git repository
+to handle pulling from and pushing to the server.
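The whole arrangement can be sketched with local paths standing in for the server (hostnames and paths are placeholders):

```shell
d=$(mktemp -d)
# The server's bare repository (normally server:/path/to/bare/repository):
git init -q --bare "$d/server-wiki.git"
# On the laptop, take a full mirror; this becomes the laptop's $REPOSITORY:
git clone --mirror -q "$d/server-wiki.git" "$d/laptop-wiki.git"
# Working clones (for the CGI, for hand editing, ...) come from the mirror:
git clone -q "$d/laptop-wiki.git" "$d/workdir"
```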
diff --git a/doc/tips/laptop_wiki_with_git/discussion.mdwn b/doc/tips/laptop_wiki_with_git/discussion.mdwn
new file mode 100644
index 000000000..297a2a6a7
--- /dev/null
+++ b/doc/tips/laptop_wiki_with_git/discussion.mdwn
@@ -0,0 +1,15 @@
+I have followed this idea along, and it seems to work pretty well.
+Now I have a question as a git newbie. Can I have the post-commit hook on the server use something like rsync to update the files on a third machine hosting the web server? The web server does not have git (cretins!). Of course I could just run a cron job.
+
+Or, was this last remark about rebuilding after pulling meant to apply to rebuilding after pushing as well?
+[[DavidBremner]]
+
+* *Updated* Now that I play with this a bit, this seems not so important. Having a separate sync operation that I run from the laptop is no big deal, and lets me update the parts of my site not yet managed by ikiwiki at the same time.
+
+* Ok, I have finally finished to set this up. I have a question for you :) Is it mandatory to have a locally running webserver on the laptop ? I mean, do I need to setup the CGI wrapper on the laptop ? Is it possible to just add/edit/delete/whatever, git commit all the stuff and git push back to the server ? Thank you. --[[xma]]
+
+> Of course you don't need a web server on the laptop. It is useful for
+> previewing pages before publishing them though. --[[Joey]]
+
+I have followed this idea too, however after pushing to the server, running gitk in the srcdir shows that the remotes/origin/master branch is newer than master. Is this normal? Do I have to reset the master branch to remotes/origin/master every time someone pushes something (and run ikiwiki -setup afterwards)?
+[[Micheal]]
diff --git a/doc/tips/laptop_wiki_with_git_extended.mdwn b/doc/tips/laptop_wiki_with_git_extended.mdwn
new file mode 100644
index 000000000..0666da450
--- /dev/null
+++ b/doc/tips/laptop_wiki_with_git_extended.mdwn
@@ -0,0 +1,43 @@
+[[!meta title="Laptop Ikiwiki extended"]]
+
+I have (at least) three different hosts, `laptop`, `gitserver`, and `webserver`.
+
+1. I started by following [[/tips/laptop_wiki_with_git]] to create
+a bare repo on `gitserver`, and clone that to a workingdir on gitserver.
+
+ On the laptop, clone the repository: `git clone gitserver:repo /working/dir`
+
+ Next create a setup file for the laptop with
+
+ gitorigin_branch=> "",
+ git_wrapper => "/working/dir/.git/hooks/post-commit",
+
+ At this point, assuming you followed the page above, and not my hasty summary,
+
+ git commit -a
+
+ should rebuild the output of your wiki.
+
+2. Now create a setup file for the server (I call it server.setup).
+
+ gitorigin_branch=> "origin",
+ git_wrapper => "/repo/wiki.git/hooks/post-update.ikiwiki"
+
+ Note the non-standard and bizarre name of the hook.
+
+ Edit /repo/wiki.git/hooks/post-update so that it looks something like
+
+ /repo/wiki.git/hooks/post-update.ikiwiki
+ rsync -cavz /home/me/public_html/* webserver:/destdir
+
+ Run
+
+ ikiwiki --setup server.setup
+
+Now in principle when you run git push on the laptop, the git server will
+first do its "regular" thing and update ~/public_html (in my case) and
+then rsync it onto the webserver. For this to work, you need passwordless
+ssh or something like it.
+
+[[DavidBremner]]
+
diff --git a/doc/tips/laptop_wiki_with_git_extended/discussion.mdwn b/doc/tips/laptop_wiki_with_git_extended/discussion.mdwn
new file mode 100644
index 000000000..8213e9649
--- /dev/null
+++ b/doc/tips/laptop_wiki_with_git_extended/discussion.mdwn
@@ -0,0 +1 @@
+It appears that this is no longer necessary with git_wrapper_background_command.
diff --git a/doc/tips/mailman_subscription_form.mdwn b/doc/tips/mailman_subscription_form.mdwn
new file mode 100644
index 000000000..3e9ed0786
--- /dev/null
+++ b/doc/tips/mailman_subscription_form.mdwn
@@ -0,0 +1,10 @@
+One can think about implementing "[[Mailman integration]]" or something, but I find that rather overdoing it. Mailman is simple enough that you can add a clean subscription form to your ikiwiki site in seconds, just add:
+
+~~~~
+<form action="https://listes.example.com/cgi-bin/mailman/subscribe/listname" method="POST">
+Email: <input name="email" />
+<input type="submit" value="Subscribe" />
+</form>
+~~~~
+
+Add that to your site and voilà, you are done! No more scary mailman subscription form! (Thanks to [[bgm]] for that trick!) --[[anarcat]]
diff --git a/doc/tips/markdown_and_eclipse.mdwn b/doc/tips/markdown_and_eclipse.mdwn
new file mode 100644
index 000000000..9e8e9bfa9
--- /dev/null
+++ b/doc/tips/markdown_and_eclipse.mdwn
@@ -0,0 +1,4 @@
+For people that were not born with GNU emacs fingers,
+there is a markdown editor (with preview and outline)
+for [eclipse](http://www.eclipse.org) available
+[here](http://www.winterwell.com/software/markdown-editor.php).
diff --git a/doc/tips/mathopd_permissions.mdwn b/doc/tips/mathopd_permissions.mdwn
new file mode 100644
index 000000000..c0425b9ca
--- /dev/null
+++ b/doc/tips/mathopd_permissions.mdwn
@@ -0,0 +1,15 @@
+When using [mathopd](http://www.mathopd.org) to serve ikiwiki, be careful of your Umask settings in the mathopd.conf.
+
+With `Umask 026` in mathopd.conf, editing pages resulted in the following errors and a 404 page when the wiki tried to take me to the updated page.
+
+ append_indexes: cannot open .../[destdir]/[outputfile].html
+ open: Permission denied
+
+With `Umask 022` in mathopd.conf, editing pages works.
+
+Hopefully this prevents someone else from spending ~2 hours figuring out why this wouldn't work. ;)
+
+> More generally, if your web server uses a nonstandard umask
+> or you're getting permissions related problems in the cgi log
+> when using ikiwiki, you can force ikiwiki to use a sane umask
+> via the `umask` setting in ikiwiki's own setup file. --[[Joey]]
diff --git a/doc/tips/nearlyfreespeech.mdwn b/doc/tips/nearlyfreespeech.mdwn
new file mode 100644
index 000000000..a3d1ec678
--- /dev/null
+++ b/doc/tips/nearlyfreespeech.mdwn
@@ -0,0 +1,108 @@
+[NearlyFreeSpeech.net](http://NearlyFreeSpeech.net) is a shared hosting
+provider with very cheap pay as you go pricing. Here's how to install ikiwiki
+there if you don't have a dedicated server.
+
+Note that you can also follow these instructions, get your wiki set up on
+NearlyFreeSpeech, and then use the [[plugins/Amazon_S3]] plugin to inject
+the static web pages into Amazon S3. Then NearlyFreeSpeech will handle the
+CGI, and S3 will handle the web serving. This might be a more cost effective,
+scalable, or robust solution than using NearlyFreeSpeech alone.
+
+## Register for an account and set up a site
+
+After you [get an account](https://www.nearlyfreespeech.net/about/start.php),
+create a site using their web interface.
+
+Mine is named `ikiwiki-test` and I used their DNS instead of getting my
+own, resulting in <http://ikiwiki-test.nfshost.com/>. (Not being kept up
+anymore.)
+
+They gave me 2 cents free funding for signing up, which is enough to pay
+for 10 megabytes of bandwidth, or about a thousand typical page views, at
+their current rates. Plenty to decide if this is right for you. If it is,
+$5 might be a good starting amount of money to put in your account.
+
+## ssh in and configure the environment
+
+ssh into their server using the ssh hostname and username displayed on
+the site's information page. For me this was:
+
+ ssh joeyh_ikiwiki-test@ssh.phx.nearlyfreespeech.net
+
+Now set up .profile to run programs from ~/bin.
+
+ cd $HOME
+ echo "PATH=$PATH:$HOME/bin" > .profile
+ . .profile
+
+## Download and unpack ikiwiki
+
+Use `wget` to [[download]] the ikiwiki tarball. Then unpack it:
+
+ tar zxvf ikiwiki*.tar.gz
+
+## Install perl modules
+
+As an optional step, you can use CPAN to install the perl modules ikiwiki
+uses into your home directory. This should mostly not be necessary,
+because the system has most modules installed already.
+
+So, you might want to skip this step and come back to it later if ikiwiki
+doesn't work.
+
+ PERL5LIB=`pwd`/ikiwiki:`pwd`/ikiwiki/cpan:`pwd`/lib/perl5 PERL_MM_USE_DEFAULT=1 perl -MCPAN -e 'CPAN::Shell->install("Bundle::IkiWiki")'
+
+ PERL5LIB=`pwd`/ikiwiki:`pwd`/ikiwiki/cpan:`pwd`/lib/perl5 PERL_MM_USE_DEFAULT=1 perl -MCPAN -e 'CPAN::Shell->force(install => "Bundle::IkiWiki::Extras")'
+
+This will take a while. As long as the first command succeeds, ikiwiki will be
+usable. The second command adds extra modules that some plugins use, so it's
+ok if installation of some of them fails.
+
+## Build and install ikiwiki
+
+ cd ikiwiki
+ export MAKE=gmake
+ perl Makefile.PL INSTALL_BASE=$HOME PREFIX=
+ $MAKE
+ $MAKE install
+
+## Set up a wiki in the usual way
+
+With ikiwiki installed, you can follow the regular [[setup]] tutorial for
+setting up your wiki. Make sure to set `destdir` to `/home/htdocs/` so that
+the wiki is published on the web site. I recommend using git for revision
+control; you can then clone your wiki's git repository as an offsite backup.
+
+Here is an example of how I set up a wiki:
+
+ mkdir ~/wiki
+ cd ~/wiki
+ cp -r ~/ikiwiki/doc/examples/blog/* .
+ ikiwiki -dumpsetup ikiwiki.setup
+ nano ikiwiki.setup
+ # Set destdir to /home/htdocs
+ # Set srcdir to /home/private/wiki
+ # Set url to http://yoursite.nfshost.com/
+ # Set cgiurl to http://yoursite.nfshost.com/ikiwiki.cgi
+ # Uncomment the `rcs => "git"` line.
+ # Set the cgi_wrapper path to /home/htdocs/ikiwiki.cgi
+ # Set the git_wrapper path to /home/private/wiki.git/hooks/post-update
+ # Configure the rest to your liking and save the file.
+ ikiwiki-makerepo git . ../wiki.git
+ ikiwiki -setup ikiwiki.setup
+
+## Clean up
+
+Finally, you can save a _lot_ of disk space by cleaning up the ikiwiki
+tarball, the .cpan directory, and a few other pieces of cruft. Since you'll be
+charged one cent per month per megabyte, this is a quick way to save several
+dollars.
+
+    rm -rf ~/ikiwiki*.tar.gz ~/.cpan ~/ikiwiki ~/man ~/lib/perl5/5.8.8
+
+That should cut things down to less than 2 megabytes. If you want to save
+even more space, delete unused perl modules from `~/lib/perl5`.
+
+## Enjoy!
+
+Have fun and do good things. --[[Joey]]
diff --git a/doc/tips/nearlyfreespeech/discussion.mdwn b/doc/tips/nearlyfreespeech/discussion.mdwn
new file mode 100644
index 000000000..b76432566
--- /dev/null
+++ b/doc/tips/nearlyfreespeech/discussion.mdwn
@@ -0,0 +1,22 @@
+with version 3.141592 I get
+<pre>
+HOME=/home/me /usr/bin/perl -Iblib/lib ikiwiki.out -libdir . -dumpsetup ikiwiki.setup
+Failed to load plugin IkiWiki::Plugin::inline: Can't use global $_ in "my" at IkiWiki/Plugin/inline.pm line 198, near "my $_"
+Compilation failed in require at (eval 19) line 2.
+BEGIN failed--compilation aborted at (eval 19) line 2.
+</pre>
+
+perl is 5.8.9
+
+> This is fixed in 3.1415926. --[[Joey]]
+
+
+Hi!<br />
+How can I upgrade my NearlyFreeSpeech installation of ikiwiki? To install it I followed your instructions here.<br />
+But what if I now want to upgrade it to a newer version?<br />
+Thanks for your incredible work!
+
+> You can move `~/ikiwiki` out of the way and then re-download and install
+> ikiwiki. --[[Joey]]
+
+Thanks a lot Joey. :-)
diff --git a/doc/tips/optimising_ikiwiki.mdwn b/doc/tips/optimising_ikiwiki.mdwn
new file mode 100644
index 000000000..d66ee9343
--- /dev/null
+++ b/doc/tips/optimising_ikiwiki.mdwn
@@ -0,0 +1,188 @@
+Ikiwiki is a wiki compiler, which means that, unlike a traditional wiki,
+all the work needed to display your wiki is done up front. Where you can
+see it and get annoyed at it. In some ways, this is better than a wiki
+where a page view means running a program to generate the page on the fly.
+
+But enough excuses. If ikiwiki is taking too long to build your wiki,
+let's fix that. Read on for some common problems that can be avoided to
+make ikiwiki run quick.
+
+[[!toc]]
+
+(And if none of that helps, file a [[bug|bugs]]. One other great thing about
+ikiwiki being a wiki compiler is that it's easy to provide a test case when
+it's slow, and get the problem fixed!)
+
+## rebuild vs refresh
+
+Are you building your wiki by running a command like this?
+
+ ikiwiki -setup my.setup
+
+If so, you're always telling ikiwiki to rebuild the entire site, from
+scratch. But, ikiwiki is smart, it can incrementally update a site,
+building only things affected by the changes you make. You just have to let
+it do so:
+
+ ikiwiki -setup my.setup -refresh
+
+Ikiwiki automatically uses an incremental refresh like this when handling
+a web edit, or when run from a [[rcs]] post-commit hook. (If you've
+configured the hook in the usual way.) Most people who have run into this
+problem got in the habit of running `ikiwiki -setup my.setup` by hand
+when their wiki was small, and found it got slower as they added pages.
+
+## use the latest version
+
+If your version of ikiwiki is not [[!version]], try upgrading. New
+optimisations are frequently added to ikiwiki, some of them yielding
+*enormous* speed increases.
+
+## run ikiwiki in verbose mode
+
+Try changing a page, and run ikiwiki with `-v` so it will tell you
+everything it does to deal with that changed page. Take note of
+which other pages are rebuilt, and which parts of the build take a long
+time. This can help you zero in on individual pages that contain some of
+the expensive things listed below.
+
+## expensive inlines
+
+Do you have an archive page for your blog that shows all posts,
+using an inline that looks like this?
+
+ \[[!inline pages="blog/*" show=0]]
+
+Or maybe you have some tag pages for your blog that show all tagged posts,
+something like this?
+
+ \[[!inline pages="blog/* and tagged(foo)" show=0]]
+
+These are expensive, because they have to be updated whenever you modify a
+matching page. And, if there are a lot of pages, it generates a large html
+file, which is a lot of work. And also large RSS/Atom files, which is even
+more work!
+
+To optimise the inline, consider enabling quick archive mode. Then the
+inline will only need to be updated when new pages are added; no RSS
+or Atom feeds will be built, and the generated html file will be much
+smaller.
+
+ \[[!inline pages="blog/*" show=0 archive=yes quick=yes]]
+
+ \[[!inline pages="blog/* and link(tag)" show=0 archive=yes quick=yes]]
+
+Only downsides: This won't show titles set by the [[ikiwiki/directive/meta]]
+directive. And there's no RSS feed for users to use -- but if this page
+is only for the archives or tag for your blog, users should be subscribing
+to the blog's main page's RSS feed instead.
+
+For the main blog page, the inline should only show the latest N posts,
+which won't be a performance problem:
+
+ \[[!inline pages="blog/*" show=30]]
+
+## expensive maps
+
+Do you have a sitemap type page, that uses a map directive like this?
+
+ \[[!map pages="*" show=title]]
+
+This is expensive because it has to be updated whenever a page is modified.
+The resulting html file might get big and expensive to generate as you
+keep adding pages.
+
+First, consider removing the "show=title". Then the map will not show page
+titles set by the [[ikiwiki/directive/meta]] directive -- but will also
+only need to be generated when pages are added or removed, not for every
+page change.
+
+Consider limiting the map to only show the toplevel pages of your site,
+like this:
+
+ \[[!map pages="* and !*/*" show=title]]
+
+Or, alternatively, to drop from the map parts of the site that accumulate
+lots of pages, like individual blog posts:
+
+ \[[!map pages="* and !blog/*" show=title]]
+
+## sidebar issues
+
+If you enable the [[plugins/sidebar]] plugin, be careful of what you put in
+your sidebar. Any change that affects what is displayed by the sidebar
+will require an update of *every* page in the wiki, since all pages include
+the sidebar.
+
+Putting an expensive map or inline in the sidebar is the most common cause
+of problems. At its worst, it can result in any change to any page in the
+wiki requiring every page to be rebuilt.
+
+## avoid htmltidy
+
+A few plugins do neat stuff, but slowly. Such plugins are tagged
+[[plugins/type/slow]].
+
+The worst offender is possibly [[plugins/htmltidy]]. This runs an external
+`tidy` program on each page that is built, which is necessarily slow. So don't
+use it unless you really need it; consider using the faster
+[[plugins/htmlbalance]] instead.
+
+## be careful of large linkmaps
+
+[[plugins/Linkmap]] generates a cool map of links between pages, but
+it does it using the `graphviz` program. And any changes to links between
+pages on the map require an update. So, avoid using this to map a large number
+of pages with frequently changing links. For example, using it to map
+all the pages on a traditional, highly WikiLinked wiki, is asking for things
+to be slow. But using it to map a few related pages is probably fine.
+
+This site's own [[plugins/linkmap]] rarely slows it down, because it
+only shows the index page, and the small set of pages that link to it.
+That is accomplished as follows:
+
+ \[[!linkmap pages="index or backlink(index)"]]
+
+## overhead of the search plugin
+
+Be aware that the [[plugins/search]] plugin has to update the search index
+whenever any page is changed. This can slow things down somewhat.
+
+## profiling
+
+If you have a repeatable change that ikiwiki takes a long time to build,
+and none of the above help, the next thing to consider is profiling
+ikiwiki.
+
+The best way to do it is:
+
+* Install [[!cpan Devel::NYTProf]]
+* `PERL5OPT=-d:NYTProf`
+* `export PERL5OPT`
+* Now run ikiwiki as usual, and it will generate a `nytprof.out` file.
+* Run `nytprofhtml` to generate html files.
+* Those can be examined to see what parts of ikiwiki are being slow.
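The steps above can be combined into a short session (a sketch; Devel::NYTProf must be installed, and `my.setup` is a placeholder for your own setup file):

```shell
# Make perl load the profiler for every invocation in this shell:
PERL5OPT=-d:NYTProf
export PERL5OPT
# Then rebuild as usual; the profiler writes nytprof.out:
#   ikiwiki -setup my.setup -refresh
# And convert the profile into browsable HTML reports:
#   nytprofhtml
echo "$PERL5OPT"
```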
+
+## scaling to large numbers of pages
+
+Finally, let's think about how huge number of pages can affect ikiwiki.
+
+* Every time it's run, ikiwiki has to scan your `srcdir` to find
+ new and changed pages. This is similar in speed to running the `find`
+ command. Obviously, more files will make it take longer.
+
+* Also, to see what pages match a [[ikiwiki/PageSpec]] like "blog/*", it has
+ to check if every page in the wiki matches. These checks are done quite
+ quickly, but still, lots more pages will make PageSpecs more expensive.
+
+* The backlinks calculation has to consider every link on every page
+ in the wiki. (In practice, most pages only link to at most a few dozen
+ other pages, so this is not `O(N^2)`, but closer to `O(N)`.)
+
+* Ikiwiki also reads and writes an `index` file, which contains information
+ about each page, and so if you have a lot of pages, this file gets large,
+ and more time is spent on it. For a wiki with 2000 pages, this file
+  will be about 500 KB.
+
+If your wiki will have 100 thousand files in it, you might start seeing
+the above contribute to ikiwiki running slowly.
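A rough numeric sketch of the points above (the page names and the per-page index size used here are invented for illustration; this is not ikiwiki code):

```python
# Sketch: one PageSpec check costs a pass over every page, so it is O(N),
# and the index file grows linearly with the page count.
from fnmatch import fnmatchcase

# a hypothetical 2000-page wiki
pages = [f"blog/post_{i}" for i in range(1500)] + [f"doc/page_{i}" for i in range(500)]

# matching "blog/*" means testing all 2000 pages
matches = [p for p in pages if fnmatchcase(p, "blog/*")]
print(len(matches))  # 1500

# assume ~250 bytes of index data per page (an invented figure)
approx_index_kb = len(pages) * 250 / 1024
print(round(approx_index_kb))  # 488, i.e. roughly the size mentioned above
```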
diff --git a/doc/tips/parentlinks_style.mdwn b/doc/tips/parentlinks_style.mdwn
new file mode 100644
index 000000000..f9dfa8f55
--- /dev/null
+++ b/doc/tips/parentlinks_style.mdwn
@@ -0,0 +1,143 @@
+Here are some tips for ways to style the links
+provided by the [[plugins/parentlinks]] plugin.
+
+This plugin offers a `HTML::Template` loop that iterates over all or
+a subset of a page's parents. It also provides a few bonus
+possibilities, such as styling the parent links depending on their
+place in the path.
+
+[[!toc levels=2]]
+
+Content
+=======
+
+The plugin provides one template loop, called `PARENTLINKS`, that
+returns the list of parent pages for the current page. Every returned
+path element has the following variables set:
+
+* `URL` (string): url to the current path element
+* `PAGE` (string): title of the current path element
+* `DEPTH` (non-negative integer): depth of the path leading to the
+ current path element, counting from the wiki's root, which has
+ `DEPTH=0`
+* `HEIGHT` (positive integer): distance, expressed in path elements,
+ from the current page to the current path element; e.g. this is
+ 1 for the current page's mother, 2 for its grand-mother, etc.
+* `DEPTH_n` (boolean): true if, and only if, `DEPTH==n`
+* `HEIGHT_n` (boolean): true if, and only if, `HEIGHT==n`
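To make the counters concrete, here is a sketch, in Python rather than the plugin's own Perl, of the values generated for a page `a/b/c` (the function name and data layout are invented for illustration):

```python
def parent_links(page):
    """Sketch of the PARENTLINKS loop variables for one page."""
    # wiki root first, the page itself excluded
    parents = ["index"] + page.split("/")[:-1]
    links = []
    for depth, name in enumerate(parents):
        links.append({
            "PAGE": name,
            "DEPTH": depth,                  # 0 at the wiki root
            "HEIGHT": len(parents) - depth,  # 1 for the page's mother
        })
    return links

print([(l["PAGE"], l["DEPTH"], l["HEIGHT"]) for l in parent_links("a/b/c")])
# [('index', 0, 3), ('a', 1, 2), ('b', 2, 1)]
```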
+
+Usage
+=====
+
+The `DEPTH_n` and `HEIGHT_n` variables allow the template writer to
+skip arbitrary elements in the parents list: they are arbitrary
+page-range selectors.
+
+The `DEPTH` and `HEIGHT` variables allow the template writer to apply
+general treatment, depending on one of these variables, to *every*
+parent: they are counters.
+
+Basic usage
+-----------
+
+As in the default `page.tmpl`, one can simply display the list of
+parent pages:
+
+ <TMPL_LOOP NAME="PARENTLINKS">
+ <a href="<TMPL_VAR NAME=URL>"><TMPL_VAR NAME=PAGE></a>/
+ </TMPL_LOOP>
+ <TMPL_VAR TITLE>
+
+
+Styling parents depending on their depth
+----------------------------------------
+
+Say you want the parent links to be styled depending on their depth in
+the path going from the wiki root to the current page; just add the
+following lines in `page.tmpl`:
+
+ <TMPL_LOOP NAME="PARENTLINKS">
+ <a href="<TMPL_VAR NAME="URL">" class="depth<TMPL_VAR NAME="DEPTH">">
+ <TMPL_VAR NAME="PAGE">
+ </a> /
+ </TMPL_LOOP>
+
+Then write the appropriate CSS bits for `a.depth1`, etc.
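For instance, a minimal sketch of such CSS (the class names come from the template above; the styling choices are arbitrary):

```css
/* emphasize parents near the wiki root, fade deeper ones */
a.depth0 { font-weight: bold; }
a.depth1 { font-weight: normal; }
a.depth2 { color: #666; }
```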
+
+Skip some parents, style the others depending on their distance to the current page
+-----------------------------------------------------------------------------------
+
+Say you want to display all the parent links except the wiki homepage,
+styled depending on their distance to the current page; just add the
+following lines in `page.tmpl`:
+
+ <TMPL_LOOP NAME="PARENTLINKS">
+ <TMPL_IF NAME="DEPTH_0">
+ <TMPL_ELSE>
+ <a href="<TMPL_VAR NAME="URL">" class="height<TMPL_VAR NAME="HEIGHT">">
+ <TMPL_VAR NAME="PAGE">
+ </a> /
+ </TMPL_IF>
+ </TMPL_LOOP>
+
+Then write the appropriate CSS bits for `a.height1`, etc.
+
+Avoid showing title of toplevel index page
+------------------------------------------
+
+If you don't like having "index" appear on the top page of the wiki,
+but you do want to see the name of the page otherwise, you can use a
+special `HAS_PARENTLINKS` template variable that the plugin provides.
+It is true for every page *except* the toplevel index.
+
+Here is an example of using it to hide the title of the toplevel index
+page:
+
+ <TMPL_LOOP NAME="PARENTLINKS">
+ <a href="<TMPL_VAR NAME=URL>"><TMPL_VAR NAME=PAGE></a>/
+ </TMPL_LOOP>
+ <TMPL_IF HAS_PARENTLINKS>
+ <TMPL_VAR TITLE>
+ </TMPL_IF>
+
+Full-blown example
+------------------
+
+Let's have a look at a more complicated example; combining the boolean
+loop variables provided by the plugin (`HEIGHT_n` and friends) with
+`HTML::Template` flow control structures, you can have custom HTML
+and/or CSS generated for some special path components; e.g.:
+
+ <!-- all parents, skipping mother and grand'ma, inside a common div+ul -->
+ <div id="oldestparents">
+ <ul>
+ <TMPL_LOOP NAME="PARENTLINKS">
+ <TMPL_IF NAME="HEIGHT_2">
+ <TMPL_ELSE>
+ <TMPL_IF NAME="HEIGHT_1">
+ <TMPL_ELSE>
+ <li><a href="<TMPL_VAR NAME="URL">"><TMPL_VAR NAME="PAGE"></a></li>
+ </TMPL_IF>
+ </TMPL_IF>
+ </TMPL_LOOP>
+ </ul>
+ </div>
+
+ <!-- dedicated div's for mother and grand'ma -->
+ <TMPL_LOOP NAME="PARENTLINKS">
+ <TMPL_IF NAME="HEIGHT_2">
+ <div id="grandma">
+ <a href="<TMPL_VAR NAME="URL">"><TMPL_VAR NAME="PAGE"></a>
+ </div>
+ <TMPL_ELSE>
+ <TMPL_IF NAME="HEIGHT_1">
+ <div id="mother">
+ <a href="<TMPL_VAR NAME="URL">"><TMPL_VAR NAME="PAGE"></a>
+ </div>
+ </TMPL_IF>
+ </TMPL_IF>
+ </TMPL_LOOP>
+
+	<!-- finally, the current page title -->
+	<TMPL_VAR NAME="TITLE">
diff --git a/doc/tips/psgi.mdwn b/doc/tips/psgi.mdwn
new file mode 100644
index 000000000..0d2eeefc8
--- /dev/null
+++ b/doc/tips/psgi.mdwn
@@ -0,0 +1,21 @@
+Here's an `app.psgi` file you can use to run ikiwiki with [PSGI](http://plackperl.org) instead of Apache or another web server:
+
+ use Plack::App::CGIBin;
+ use Plack::Builder;
+ use Plack::App::File;
+
+ builder {
+ mount '/ikiwiki.cgi' => Plack::App::CGIBin->new(file => './ikiwiki.cgi')->to_app;
+ enable "Plack::Middleware::Static",
+ path => sub { s!(^(?:/[^.]*)?/?$)!${1}/index.html! },
+ root => '.';
+ mount '/' => Plack::App::File->new(root => ".")->to_app;
+ };
+
+Put it in your destdir and now you can run `plackup -p <port>`.
+
+Note that you should configure your `url` and `cgiurl` to point to the listening address of plackup.
+
+Also, since `app.psgi` resides in the destdir, `/app.psgi` itself is accessible from the web server.
+
+Hopefully some day the ikiwiki web UI will speak PSGI natively.
diff --git a/doc/tips/redirections_for_usedirs.mdwn b/doc/tips/redirections_for_usedirs.mdwn
new file mode 100644
index 000000000..588b9f4b5
--- /dev/null
+++ b/doc/tips/redirections_for_usedirs.mdwn
@@ -0,0 +1,39 @@
+Want to turn on the `usedirs` setting on an existing wiki without breaking
+all the links into it?
+
+# Apache and RewriteEngine
+
+Here's a way to do it for Apache, using the
+RewriteEngine. This example is for a wiki at the top of a web site, but can
+be adapted to other situations.
+
+	# pages
+	# (the first four conditions exclude pages that are not part of the
+	# wiki; Apache does not allow trailing comments on directive lines)
+	RewriteCond $1 !^/~
+	RewriteCond $1 !^/doc/
+	RewriteCond $1 !^/ajaxterm
+	RewriteCond $1 !^/cgi-bin/
+	RewriteCond $1 !.*/index$
+	RewriteRule (.+).html$ $1/ [R]
+
+ # rss feeds
+ RewriteCond $1 !^/~
+ RewriteCond $1 !.*/index$
+ RewriteRule (.+).rss$ $1/index.rss
+
+ # atom feeds
+ RewriteCond $1 !^/~
+ RewriteCond $1 !.*/index$
+ RewriteRule (.+).atom$ $1/index.atom
+
+# lighttpd and mod_redirect
+
+The following example is exactly the same thing written for lighttpd by using mod_redirect:
+
+ $HTTP["url"] !~ "^/(~|doc/|ajaxterm|cgi-bin/)" {
+ $HTTP["url"] !~ "^/(.*/index\.(html|rss|atom))" {
+ url.redirect = (
+ "(.*)\.html$" => "$1/",
+ "(.*)\.(atom|rss)$" => "$1/index.$2"
+ )
+ }
+ } \ No newline at end of file
diff --git a/doc/tips/spam_and_softwaresites.mdwn b/doc/tips/spam_and_softwaresites.mdwn
new file mode 100644
index 000000000..a07889e6b
--- /dev/null
+++ b/doc/tips/spam_and_softwaresites.mdwn
@@ -0,0 +1,87 @@
+Any wiki with a form of web-editing enabled will have to deal with
+spam. (See the [[plugins/blogspam]] plugin for one defensive tool you
+can deploy).
+
+If:
+
+ * you are using ikiwiki to manage the website for a [[examples/softwaresite]]
+ * you allow web-based commits, to let people correct documentation, or report
+ bugs, etc.
+ * the documentation is stored in the same revision control repository as your
+ software
+
+then it is undesirable to have your software's VCS history tainted by spam and spam
+clean-up commits. Here is one approach you can use to prevent this. This
+example is for the [[git]] version control system, but the principles should
+apply to others.
+
+## Isolate web commits to a specific branch
+
+Create a separate branch to contain web-originated edits (named `doc` in this
+example):
+
+ $ git checkout -b doc
+
+Adjust your setup file accordingly:
+
+ gitmaster_branch => 'doc',
+
+## merging good web commits into the master branch
+
+You will want to periodically merge legitimate web-based commits back into
+your master branch. Ensure that there is no spam in the documentation
+branch. If there is, see 'erase spam from the commit history', below, first.
+
+Once you are confident it's clean:
+
+ # ensure you are on the master branch
+ $ git branch
+ doc
+ * master
+ $ git merge --ff doc
+
+## removing spam
+
+### short term
+
+In the short term, just revert the spammy commit.
+
+If the spammy commit was the top-most:
+
+ $ git revert HEAD
+
+This will clean the spam out of the files, but it will leave both the spam
+commit and the revert commit in the history.
+
+### erase spam from the commit history
+
+Git allows you to rewrite your commit history. We will take advantage of this
+to eradicate spam from the history of the doc branch.
+
+This is a useful tool, but it is considered bad practice to rewrite the
+history of public repositories. If your software's repository is public, you
+should make it clear that the history of the `doc` branch in your repository
+is unstable.
+
+Once you have been spammed, use `git rebase` to remove the spam commits from
+the history. Assuming that your `doc` branch was split off from a branch
+called `master`:
+
+ # ensure you are on the doc branch
+ $ git branch
+ * doc
+ master
+ $ git rebase --interactive master
+
+In your editor session, you will see a series of lines for each commit made to
+the `doc` branch since it was branched from `master` (or since the last merge
+back into `master`). Delete the lines corresponding to spammy commits, then
+save and exit your editor.
+
+Caveat: if there are no commits you want to keep (i.e. all the commits since
+the last merge into master are either spam or spam reverts) then `git rebase`
+will abort. Therefore, this approach only works if you have at least one
+non-spam commit to the documentation since the last merge into `master`. For
+this reason, it's best to wait until you have at least one
+commit you want merged back into the main history before doing a rebase,
+and until then, tackle spam with reverts.
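The same edit can be made non-interactively. Here is a sketch in a throwaway repository (file names and commit messages are invented) that drops a middle spam commit with `git rebase --onto`, which has the same effect as deleting that commit's line in the interactive editor:

```shell
# demo only: build a tiny repo with a doc branch containing a spam commit
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q
git config user.email demo@example.com
git config user.name demo
echo base > index.mdwn; git add .; git commit -qm 'initial import'
git checkout -qb doc
echo fix  > a.mdwn; git add .; git commit -qm 'good web edit'
echo SPAM > b.mdwn; git add .; git commit -qm 'spam'
echo more > c.mdwn; git add .; git commit -qm 'another good edit'

# replay everything after the spam commit onto the spam commit's parent
spam=$(git rev-parse doc~1)
git rebase -q --onto "$spam"^ "$spam" doc
git log --format=%s    # 'spam' no longer appears
```

Note that, as in the interactive case, the replayed commits can conflict if they touch the same lines as the spam.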
diff --git a/doc/tips/spam_and_softwaresites/discussion.mdwn b/doc/tips/spam_and_softwaresites/discussion.mdwn
new file mode 100644
index 000000000..21f0a5d7e
--- /dev/null
+++ b/doc/tips/spam_and_softwaresites/discussion.mdwn
@@ -0,0 +1,8 @@
+In the cleanup spam section:
+
+> Caveat: if there are no commits you want to keep (i.e. all the commits since the last merge into master are either spam or spam reverts) then git rebase will abort.
+
+Wouldn't it be enough then to use `git reset --hard` to the desired last good commit?
+
+regards,
+iustin
diff --git a/doc/tips/switching_to_usedirs.mdwn b/doc/tips/switching_to_usedirs.mdwn
new file mode 100644
index 000000000..92871439f
--- /dev/null
+++ b/doc/tips/switching_to_usedirs.mdwn
@@ -0,0 +1,28 @@
+As of version 2.0, ikiwiki will switch to enabling the 'usedirs' setting by
+default. This *will* break all URLs to wikis that did not have 'usedirs'
+turned on before. You can either follow this procedure to convert your wiki
+to usedirs, or edit your setup file and turn usedirs back off.
+
+* Upgrade ikiwiki to 2.0.
+* Force ikiwiki to rebuild your wiki, by using `ikiwiki-mass-rebuild`,
+ or manually.
+* Since usedirs is enabled, ikiwiki will have created a bunch of new
+ html files. Where before ikiwiki generated a `dest/foo.html`, now it will
+ generate `dest/foo/index.html`. The old html files will be removed.
+* If you have a blog that is aggregated on a Planet or similar, all the
+ items in the RSS or atom feed will seem like new posts, since their URLs
+ have changed. See [[howto_avoid_flooding_aggregators]] for a workaround.
+* Now all the URLs to pages in your wiki have changed. See
+ [[redirections_for_usedirs]] for instructions on setting up redirections
+ to keep the old URLs working.
+
+Why usedirs?
+------------
+There are several advantages to `usedirs`, including simpler URLs, URLs that
+aren't dependent on the underlying implementation (`.html`), and being able to
+use URLs as tags as described in the [rel-tag
+microformat](http://microformats.org/wiki/rel-tag).
+
+The main disadvantage is that it is harder to browse using `file://` URIs,
+since `file:///dir/` doesn't automatically translate to `dir/index.html`. This
+is something one could fix in the browser though.
diff --git a/doc/tips/switching_to_usedirs/discussion.mdwn b/doc/tips/switching_to_usedirs/discussion.mdwn
new file mode 100644
index 000000000..79ada00a1
--- /dev/null
+++ b/doc/tips/switching_to_usedirs/discussion.mdwn
@@ -0,0 +1,24 @@
+I'm working on an asciidoc plugin and having a problem figuring out how to deal with usedirs.
+
+One of the key reasons I use asciidoc is for the ability to do inline includes of scripts with syntax-highlighting. I also want the script to be linked from the page for easy downloading.
+
+Formerly, I would have had the asciidoc and the script in the same directory, and all was well (and I have my asciidoc plugin working just fine for this). With usedirs, the script needs to be in the same dir as the asciidoc source at render time, but in the newly created subdir for download. The problem is that the page effectively moves down by one directory level when it is rendered from the source to the html tree.
+
+As far as my needs are concerned, there is no problem - I can just run without usedirs - but it would be nice to find a solution that would work either way. Just a clean solution to usedirs would be fine (even if it doesn't work without usedirs), since usedirs is now the default.
+
+------
+OK, now that I've written this out, I see a solution. The include for the syntax-highlighting and the link to the script are not tightly coupled (in fact they're not coupled at all, except by the person writing the page). So, the solution under usedirs is to specify a current-dir link to the script for the syntax-highlighter, and a parent-dir link for the script. This could even be made conditional on usedirs being enabled, if one felt so inclined.
+
+I'll leave this ramble here in case anyone has anything to say about it. Thank you for listening :-)
+
+-- [[KarlMW]]
+
+------
+This may serve only to highlight my naivete, but what are the advantages of the usedirs approach? I'm not passionately against the option, and I'm confident that there must be benefits for it to have become the default in ikiwiki - I just don't understand why.
+
+It seems to me that the only advantage is slightly tidier URLs, but with the disadvantage that source files change dir level and relative links need to change too.
+
+-- [[KarlMW]]
+
+The cleaner urls seem worth it to me. The `urlto()` function makes it easy
+for ikiwiki code to deal with the path changes. --[[Joey]]
diff --git a/doc/tips/untrusted_git_push.mdwn b/doc/tips/untrusted_git_push.mdwn
new file mode 100644
index 000000000..948a55063
--- /dev/null
+++ b/doc/tips/untrusted_git_push.mdwn
@@ -0,0 +1,114 @@
+This tip will describe how to allow anyone on the planet to `git push`
+changes into your wiki, without needing a special account. All a user needs
+to know is:
+
+ git clone git://your.wiki/path
+ # now modify any of the files the wiki would let you modify on the web
+ git push
+
+This is a wonderful thing to set up for users, because then they can work
+on the wiki while offline, and they don't need to mess around with web
+browsers.
+
+## security
+
+But, you might be wondering, how can this possibly be secure? Won't users
+upload all sorts of garbage, change pages you don't want them to edit, and so
+on?
+
+The key to making it secure is configuring ikiwiki to run as your git
+repository's `pre-receive` hook. There it will examine every change that
+untrusted users push into the wiki, and reject pushes that contain changes
+that cannot be made using the web interface.
+
+So, unless you have the [[plugins/attachment]] plugin turned on,
+non-page files cannot be added. And if it's turned on, whatever
+`allowed_attachments` checks you have configured will also check files
+pushed into git.
+
+And, unless you have the [[plugins/remove]] plugin turned on, no
+files can be deleted.
+
+And if you have `locked_pages` configured, then it will also affect what's
+pushed into git.
+
+Untrusted committers will also not be able to upload files with strange
+modes, or push to any branch except for the configured `gitorigin_branch`,
+or manipulate tags.
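Mechanically, git feeds the `pre-receive` hook one "old-sha new-sha refname" line per updated ref on stdin, and aborts the whole push if the hook exits nonzero. A minimal sketch of just the branch restriction, in Python for brevity (ikiwiki's real hook is the compiled wrapper and checks far more; the branch name below is an assumption):

```python
# Illustrative sketch, NOT ikiwiki's actual pre-receive hook.
ALLOWED_REF = "refs/heads/master"  # the configured gitorigin_branch

def check_push(lines):
    """Return a list of complaints; an empty list means the push is allowed."""
    errors = []
    for line in lines:
        old, new, ref = line.split()
        if ref.startswith("refs/tags/"):
            errors.append("tag manipulation is not allowed")
        elif ref != ALLOWED_REF:
            errors.append(f"pushing to {ref} is not allowed")
    return errors

print(check_push(["0" * 40 + " " + "1" * 40 + " refs/heads/master"]))  # []
```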
+
+One thing to keep an eye on is uploading large files. It may be easier to
+do this via git push than using the web, and that could be abused.
+
+Also, no checking is done that the authors of commits are right, so people
+can make a commit that pretends to be done by someone else.
+
+## user setup
+
+Add a dedicated user who will push in untrusted commits. This user should have
+a locked password, and `git-shell` as its shell.
+
+ root@bluebird:/home/joey>adduser --shell=/usr/bin/git-shell --disabled-password anon
+ Adding user `anon' ...
+
+## ikiwiki setup
+
+You should set up ikiwiki before turning on anonymous push in git.
+
+Edit your wiki's setup file, and uncomment the lines for
+`git_test_receive_wrapper` and `untrusted_committers`.
+
+ # git pre-receive hook to generate
+ git_test_receive_wrapper => '/srv/git/ikiwiki.info/.git/hooks/pre-receive',
+ # unix users whose commits should be checked by the pre-receive hook
+ untrusted_committers => ['anon'],
+
+The `git_test_receive_wrapper` will become the git `pre-receive` hook. The
+`untrusted_committers` list is the list of unix users who will be pushing in
+untrusted changes. It should *not* include the user that ikiwiki normally runs
+as.
+
+Once you're done modifying the setup file, don't forget to run
+`ikiwiki --setup ikiwiki.setup --refresh --wrappers` on it.
+
+## git setup
+
+You'll need to arrange the permissions on your bare git repository so that
+user anon can write to it. One way to do it is to create a group, and put
+both anon and your regular user in that group. Then make the bare git
+repository owned and writable by the group. See [[rcs/git]] for some more
+tips on setting up a git repository with multiple committers.
+
+Note that anon should *not* be able to write to the `srcdir`, *only* to the bare git
+repository for your wiki.
+
+If you want to allow git over `ssh`, generate a ssh key for anon, and
+publish the *private* key for other people to use. This is optional; you
+can use `git-daemon` instead and not worry about keys.
+
+Now set up `git-daemon`. It will need to run as user `anon`, and be
+configured to export your wiki's bare git repository. I set it up as
+follows in `/etc/inetd.conf`, and ran `/etc/init.d/openbsd-inetd restart`.
+
+ git stream tcp nowait anon /usr/bin/git-daemon git-daemon --inetd --export-all --interpolated-path=/srv/git/%H%D /srv/git
+
+At this point you should be able to `git clone git://your.wiki/path` from
+anywhere, and check out the source to your wiki. But you won't be able to
+push to it yet, one more change is needed to turn that on. Edit the
+`config` file of your bare git repository, and allow `git-daemon` to
+receive pushes:
+
+ [daemon]
+ receivepack = true
+
+Now pushes should be accepted, and your wiki immediately updated. If it
+isn't, check your git repo's permissions, and make sure that the
+`post-update` and `pre-receive` hooks are suid so they run as the user who
+owns the `srcdir`.
+
+## infelicities
+
+If a user tries to push a changeset that ikiwiki doesn't like, it will
+abort the push before refs are updated. However, the changeset will still
+be present in your repository, wasting space. Since nothing refers to it,
+it will be expired eventually. You can speed up the expiry by running `git
+prune`.
diff --git a/doc/tips/untrusted_git_push/discussion.mdwn b/doc/tips/untrusted_git_push/discussion.mdwn
new file mode 100644
index 000000000..d95c01ecf
--- /dev/null
+++ b/doc/tips/untrusted_git_push/discussion.mdwn
@@ -0,0 +1,33 @@
+I've just tried this (commit c1fa07a). Recent changes shows:
+
+<div id="change-c1fa07ad4f165b42c962ba2a310681107f38c4f7" class="metadata">
+<span class="desc"><br />Changed pages:</span>
+<span class="pagelinks">
+
+<a href="http://git.ikiwiki.info/?p=ikiwiki;a=blobdiff;h=8bfa3dd7601a09b11ecbd20026849a777dc4b1b9;hp=c6302616f52ec058de5a8f5956fc512149a2f1a3;hb=1ea66c3d3f0a33bc3f04d073457b525a70380c37;f=doc/users/jondowland.mdwn"><img src="/wikiicons/diff.png" alt="diff" /></a><a href="http://ikiwiki.info/ikiwiki.cgi?page=users%2Fjondowland&amp;do=recentchanges_link">users/jondowland</a>
+
+
+</span>
+<span class="desc"><br />Changed by:</span>
+<span class="committer">
+
+<a href="http://ikiwiki.info/ikiwiki.cgi?page=users%2Fjon&amp;do=recentchanges_link">jon</a>
+
+</span>
+<span class="desc"><br />Commit type:</span>
+<span class="committype">git</span>
+<span class="desc"><br />Date:</span>
+<span class="changedate"><span class="relativedate" title="Mon, 10 Nov 2008 18:24:22 -0500">18:24:22 11/10/08</span>
+</div>
+
+Note that the user for the commit is 'jon', and the link points at cgi to
+create users/jon. I was wondering if that is configurable for users pushing
+via git. It would be nice perhaps to specify it in some way, perhaps via a
+git-config setting (user.name?). I'm not too familiar with exactly what the
+changeset contains. -- [[users/Jon]]
+
+> All ikiwiki can do is look at who git has recorded as the author of
+> the change (and it looks at the username part of the email address).
+> You can set `user.email` in `.git/config`. --[[Joey]]
+
+> > Ah, excellent. In which case this *should* DTRT... -- [[users/Jon]]
diff --git a/doc/tips/upgrade_to_3.0.mdwn b/doc/tips/upgrade_to_3.0.mdwn
new file mode 100644
index 000000000..05b6d6fbd
--- /dev/null
+++ b/doc/tips/upgrade_to_3.0.mdwn
@@ -0,0 +1,95 @@
+Version 3.0 of ikiwiki makes some significant changes, which
+you will need to deal with when upgrading from ikiwiki 2.x.
+
+[[!toc ]]
+
+## setup file format change
+
+The layout of the setup file changed in a significant way in version 2.60
+of ikiwiki. If you have not changed yours to the new format, now would be a
+good time to do so. Some new features, like the [[plugins/websetup]]
+interface, need the new format setup file.
+
+You can convert old setup files into the new format by running
+`ikiwiki-transition setupformat your.setup`
+
+## moving settings from Preferences page
+
+The admin preferences page used to have settings for allowed attachments,
+locked pages, and banned users. These three settings have moved to the
+setup file, and will no longer appear on the admin preferences page once
+your wiki is upgraded to 3.0.
+
+You can move these preferences into the setup file by running
+`ikiwiki-transition moveprefs your.setup; ikiwiki -setup your.setup -refresh -wrappers`
+
+(Make sure you have converted the setup file to the new format first.)
+
+## prefix directives
+
+In 3.0, the syntax ikiwiki uses for [[directives|ikiwiki/directive]] has
+changed, requiring that the directive start with a bang:
+
+ \[[!directive ...]]
+
+If you would like to keep the old syntax, it is still supported, add the
+following to your setup file:
+
+ prefix_directives => 0,
+
+To convert to the new syntax, make sure that your setup file does *not*
+contain the above, then run `ikiwiki-transition prefix_directives your.setup`
+
+(And then commit the changes it makes to pages in your srcdir.)
+
+## GlobLists
+
+In 3.0, the old "GlobList" syntax for [[PageSpecs|ikiwiki/PageSpec]] is no
+longer supported. A GlobList contains multiple terms, but does not separate
+them with "and" or "or":
+
+ sandbox !*/Discussion
+
+To convert this to a modern PageSpec, simply add "and" or "or" as
+appropriate between terms:
+
+ sandbox and !*/Discussion
+
+GlobLists have been deprecated for more than two years. If your wiki dates
+to the ikiwiki 1.0 era, you should check it for any that might have lurked
+unnoticed in it since back then. Ikiwiki version 2.72 will print warnings
+about any GlobLists it sees.
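The conversion rule is simple enough to sketch (an illustration, not the actual `ikiwiki-transition` code): each negated term is joined with "and", the others with "or":

```python
def globlist_to_pagespec(globlist):
    """Naive sketch of converting an old GlobList to a PageSpec."""
    out = []
    for term in globlist.split():
        if out:
            out.append("and" if term.startswith("!") else "or")
        out.append(term)
    return " ".join(out)

print(globlist_to_pagespec("sandbox !*/Discussion"))
# sandbox and !*/Discussion
```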
+
+## aggregateinternal
+
+If your wiki uses the [[aggregate|plugins/aggregate]] plugin, it will start
+to aggregate feeds to special "internal" pages.
+
+If you don't want this change, you can add the following to your setup
+file:
+
+ aggregateinternal => 0,
+
+Otherwise, follow this procedure to upgrade a wiki using the aggregate plugin:
+
+1. Update all [[PageSpecs|ikiwiki/PageSpec]] that refer to the aggregated
+ pages -- such as those in inlines. Put "internal()" around globs
+ in those PageSpecs. For example, if the PageSpec was `foo/*`, it should
+ be changed to `internal(foo/*)`. This has to be done because internal
+ pages are not matched by regular globs.
+2. Use [[ikiwiki-transition]] to rename all existing aggregated `.html`
+ files in the srcdir. The command to run is
+   `ikiwiki-transition aggregateinternal your.setup`.
+3. Refresh the wiki. (`ikiwiki -setup your.setup -refresh`)
+
+## embed / googlecalendar
+
+The googlecalendar plugin has been deprecated for a long time, and is
+removed in 3.0.
+
+The embed plugin is also now deprecated, though not yet removed.
+
+If you use either plugin to embed content from google, youtube, etc,
+into your wiki, you should instead configure the [[plugins/htmlscrubber]]
+to skip sanitising some pages, via the `htmlscrubber_skip` setting.
+See [[embedding_content]] for examples.
diff --git a/doc/tips/using_the_web_interface_with_a_real_text_editor.mdwn b/doc/tips/using_the_web_interface_with_a_real_text_editor.mdwn
new file mode 100644
index 000000000..f0fbbba9a
--- /dev/null
+++ b/doc/tips/using_the_web_interface_with_a_real_text_editor.mdwn
@@ -0,0 +1,17 @@
+If you use Firefox or Iceweasel, the [It's All
+Text](https://addons.mozilla.org/en-US/firefox/addon/4125) extension allows
+you to use a real text editor like Emacs or Vim to edit the contents of text
+areas. This allows you to edit ikiwiki pages with a real text editor through
+the ikiwiki web interface, rather than only with direct commit
+access. --[[JoshTriplett]]
+
+Chrome and chromium have [Edit with
+Emacs](https://chrome.google.com/webstore/detail/ljobjlafonikaiipfkggjbhkghgicgoh)
+for those who use Emacs -- Rémi Vanicat
+
+For Firefox or Iceweasel users, the vimperator extension is also a good
+idea. You can press Ctrl-I in the insert mode of vimperator and switch to
+an external editor, e.g. Vim. --[[WeakishJiang]]
+
+Finally, with wikis configured to allow, [[untrusted_git_push]], you can
+ditch the browser altogether. --[[Joey]]
diff --git a/doc/tips/using_the_web_interface_with_a_real_text_editor/discussion.mdwn b/doc/tips/using_the_web_interface_with_a_real_text_editor/discussion.mdwn
new file mode 100644
index 000000000..ad0b9e1cb
--- /dev/null
+++ b/doc/tips/using_the_web_interface_with_a_real_text_editor/discussion.mdwn
@@ -0,0 +1,2 @@
+Any ideas about how to do something similar with konqueror?
+[[DavidBremner]]
diff --git a/doc/tips/vim_and_ikiwiki.mdwn b/doc/tips/vim_and_ikiwiki.mdwn
new file mode 100644
index 000000000..e4136aa5d
--- /dev/null
+++ b/doc/tips/vim_and_ikiwiki.mdwn
@@ -0,0 +1,28 @@
+# Vim and ikiwiki
+
+## Syntax highlighting
+
+[ikiwiki-syntax](http://www.vim.org/scripts/script.php?script_id=3156) is a vim
+syntax highlighting file for ikiwiki [[ikiwiki/markdown]] files. It highlights
+directives and wikilinks. It only supports prefixed directives, i.e.,
+\[[!directive foo=bar baz]], not the old format with spaces.
+
+------
+
+The previous syntax definition for ikiwiki links is at [[vim_syntax_highlighting/ikiwiki.vim]]; however,
+it seems to not be [[maintained
+anymore|forum/navigation_of_wiki_pages_on_local_filesystem_with_vim#syn-maintenance]],
+and it has some [[issues|forum/ikiwiki_vim_syntaxfile]].
+
+## Page creation and navigation
+
+The [ikiwiki-nav](http://www.vim.org/scripts/script.php?script_id=2968) package
+is a vim plugin that enables you to do the following from inside vim:
+
+ * Jumping to the file corresponding to the wikilink under the cursor.
+ * Creating the file corresponding to the wikilink under the cursor (including
+ directories if necessary.)
+ * Jumping to the previous/next wikilink in the current file.
+ * Autocompleting link names.
+
+Download it from [here](http://www.vim.org/scripts/script.php?script_id=2968).
diff --git a/doc/tips/vim_syntax_highlighting.mdwn b/doc/tips/vim_syntax_highlighting.mdwn
new file mode 100644
index 000000000..8f2fdc1f0
--- /dev/null
+++ b/doc/tips/vim_syntax_highlighting.mdwn
@@ -0,0 +1,20 @@
+This page is deprecated. See [[tips/vim_and_ikiwiki]] for the most up-to-date
+content.
+
+--------
+
+[ikiwiki-syntax](http://www.vim.org/scripts/script.php?script_id=3156) is a vim
+syntax highlighting file for ikiwiki [[ikiwiki/markdown]] files. It highlights
+directives and wikilinks. It only supports prefixed directives, i.e.,
+\[[!directive foo=bar baz]], not the old format with spaces.
+
+See also: [[follow_wikilinks_from_inside_vim]]
+
+------
+
+The previous syntax definition for ikiwiki links is at [[ikiwiki.vim]]; however,
+it seems to not be [[maintained
+anymore|forum/navigation_of_wiki_pages_on_local_filesystem_with_vim#syn-maintenance]],
+and it has some [[issues|forum/ikiwiki_vim_syntaxfile]].
+
+[[!tag vim]]
diff --git a/doc/tips/vim_syntax_highlighting/discussion.mdwn b/doc/tips/vim_syntax_highlighting/discussion.mdwn
new file mode 100644
index 000000000..72cb52aab
--- /dev/null
+++ b/doc/tips/vim_syntax_highlighting/discussion.mdwn
@@ -0,0 +1,8 @@
+I'm going to look at merging this with potwiki.vim (a vim-based personal wiki) so that you can follow wiki-links and auto-create pages etc., direct from vim. (I'm writing this in case I don't get around to it) -- [[users/Jon]]
+
+----
+
+Another attempt at the same thing is here:
+<http://plasticboy.com/markdown-vim-mode/>
+
+In my tests, [[ikiwiki.vim]] works better than that one, YMMV. --[[Joey]]
diff --git a/doc/tips/vim_syntax_highlighting/ikiwiki.vim b/doc/tips/vim_syntax_highlighting/ikiwiki.vim
new file mode 100644
index 000000000..bbcad4239
--- /dev/null
+++ b/doc/tips/vim_syntax_highlighting/ikiwiki.vim
@@ -0,0 +1,71 @@
+" Vim syntax file
+" Language: Ikiwiki (links)
+" Maintainer: Recai Oktaş (roktasATdebian.org)
+" Last Change: 2007 May 29
+
+" Instructions:
+" - make sure to use the relevant syntax file which can be found
+" at vim.org; below are the syntax files for markdown and reST,
+" respectively:
+" http://www.vim.org/scripts/script.php?script_id=1242
+" http://www.vim.org/scripts/script.php?script_id=973
+" - put the file into your syntax directory (e.g. ~/.vim/syntax)
+" - if you use markdown (with .mdwn extension) add sth like below
+" in your VIM startup file:
+" au BufNewFile,BufRead *.mdwn set ft=ikiwiki
+" - if you use a different markup other than markdown (e.g. reST)
+" make sure to setup 'g:ikiwiki_render_filetype' properly in
+" your startup file (skip this step for mkd.vim, it should work
+" out of the box)
+" Todo:
+" - revamp the whole file so as to detect valid ikiwiki directives
+" and parameters (needs a serious work)
+
+let s:cpo_save = &cpo
+set cpo&vim
+
+" Load the base syntax (default to markdown) if nothing was loaded.
+if !exists("b:current_syntax")
+ let s:ikiwiki_render_filetype = "mkd"
+ if exists("g:ikiwiki_render_filetype")
+ let s:ikiwiki_render_filetype = g:ikiwiki_render_filetype
+ endif
+ exe 'runtime! syntax/' . s:ikiwiki_render_filetype . '.vim'
+endif
+
+unlet b:current_syntax
+
+syn case match
+
+syn region ikiwikiLinkContent matchgroup=ikiwikiLink start=+\[\[\(\w\+\s\+\)\{,1}+ end=+\]\]+ contains=ikiwikiLinkNested,ikiwikiParam,ikiwikiNoParam
+syn region ikiwikiLinkNested matchgroup=ikiwikiLinkNested start=+"""+ end=+"""+ contains=ikiwikiLinkContent contained
+
+" FIXME: Below is an ugly hack to prevent highlighting of simple links
+" as directives. Links with spaces are still problematic though.
+syn region ikiwikiNoParam start=+\[\[[^|=]\+|+ end=+[^|=]\+\]\]+ keepend contains=ikiwikiMagic,ikiwikiDelim
+
+syn match ikiwikiDelim "\(\[\[\|\]\]\)" contained
+syn match ikiwikiMagic "|" contained
+syn match ikiwikiParam "\<\i\+\ze=" nextgroup=ikiwikiParamAssign contained
+syn match ikiwikiParamAssign "=" nextgroup=ikiwikiValue contained
+syn region ikiwikiValue start=+"[^"]+hs=e-1 end=+[^"]"+ skip=+\\"+ keepend contains=ikiwikiValueMagic,ikiwikiDelim contained
+syn match ikiwikiValueMagic +\(!\<\|\*\|\<\(and\|or\)\>\|\<\i*(\|\>)\)+ contained
+
+syn sync minlines=50
+
+hi def link ikiwikiLink Statement
+hi def link ikiwikiLinkNested String
+hi def link ikiwikiLinkContent Underlined
+
+hi def link ikiwikiMagic Operator
+hi def link ikiwikiDelim Operator
+hi def link ikiwikiNoParam Underlined
+hi def link ikiwikiParam Identifier
+hi def link ikiwikiParamAssign Operator
+hi def link ikiwikiValue String
+hi def link ikiwikiValueMagic Type
+
+let b:current_syntax = "ikiwiki"
+unlet s:cpo_save
+
+" vim:ts=8:sts=8:noet
diff --git a/doc/tips/wikiannounce.mdwn b/doc/tips/wikiannounce.mdwn
new file mode 100644
index 000000000..6eb142cdf
--- /dev/null
+++ b/doc/tips/wikiannounce.mdwn
@@ -0,0 +1,8 @@
+One thing I use ikiwiki for is the web pages for software projects I
+maintain. Each of my projects has a news page with an announcements feed,
+and to automatically update this when I release a new version, generating
+an announcement from the debian/changelog and debian/NEWS files, I use a
+[wikiannounce](http://git.kitenet.net/?p=joey/home.git;a=blob_plain;f=bin/wikiannounce)
+program. It's somewhat specific to dealing with Debian packages, and uses a
+special `announcedir` target in debian/rules, but the general idea could be
+useful. --[[Joey]]
diff --git a/doc/tips/yaml_setup_files.mdwn b/doc/tips/yaml_setup_files.mdwn
new file mode 100644
index 000000000..56eeb61a1
--- /dev/null
+++ b/doc/tips/yaml_setup_files.mdwn
@@ -0,0 +1,12 @@
+Here's how to convert your existing standard format ikiwiki setup file into
+the new YAML format recently added to ikiwiki.
+
+1. First, make sure you have the [[!cpan YAML]] perl module installed.
+ (Run: `apt-get install libyaml-perl`)
+2. Run: `ikiwiki --setup my.setup --dumpsetup my.setup --set setuptype=Yaml`
+
+The format of the YAML setup file should be fairly self-explanatory.
+
+(To convert the other way, use "setuptype=Standard" instead.)
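+As a sketch of what the conversion produces, a minimal YAML setup file looks roughly like this (the wikiname, paths, and URL are placeholders, not your real values):
+
```yaml
# Illustrative fragment of an ikiwiki YAML setup file.
# All values below are placeholders.
wikiname: MyWiki
srcdir: /home/user/wiki
destdir: /home/user/public_html
url: http://example.com/wiki
plugin:
- goodstuff
```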
+
+--[[Joey]]
diff --git a/doc/todo.mdwn b/doc/todo.mdwn
new file mode 100644
index 000000000..75314c75b
--- /dev/null
+++ b/doc/todo.mdwn
@@ -0,0 +1,21 @@
+Feel free to post your ideas for todo and [[wishlist]] items here, as well
+as any [[patches|patch]]. If it seems more like a bug in the existing code,
+post it to [[bugs]] instead. Link items to [[todo/done]] when done.
+
+<!-- currently commented out because I lost all my mtimes :-)
+[[!if test="enabled(postsparkline)"
+then="""
+How long will it take your todo item to be fixed? Well...
+[[!postsparkline pages="todo/* and !todo/done and !link(todo/done) and !todo/*/*"
+max=12 ymin=10 formula=permonth style=bar barwidth=2 barspacing=1 height=13]]
+this many are being added per month
+[[!postsparkline pages="todo/* and !todo and link(todo/done)"
+max=12 ymin=10 formula=permonth time=mtime style=bar barwidth=2 barspacing=1 height=13]]
+while this many are being fixed.
+"""]]
+-->
+
+[[!inline pages="todo/* and !todo/done and !link(todo/done) and
+!link(patch) and !link(wishlist) and !todo/*/*"
+feedpages="created_after(todo/supporting_comments_via_disussion_pages)"
+actions=yes archive=yes rootpage="todo" postformtext="Add a new todo item titled:" show=0]]
diff --git a/doc/todo/ACL.mdwn b/doc/todo/ACL.mdwn
new file mode 100644
index 000000000..dd9793233
--- /dev/null
+++ b/doc/todo/ACL.mdwn
@@ -0,0 +1,98 @@
+How about adding ACL? So that you can control which users are allowed
+to read, write certain pages. The moinmoin wiki has that, and it is
+something, that I think is very valuable.
+
+> ikiwiki currently has only the most rudimentary access controls: pages
+> can be locked, or unlocked and only the admin can edit locked pages. That
+> could certainly be expanded on, although it's not an area that I have an
+> overwhelming desire to work on myself right now. Patches appreciated and
+> I'll be happy to point you in the right directions.. --[[Joey]]
+
+>> I'm really curious how you'd suggest implementing ACLs on reading a page.
+>> It seems to me the only way you could do it is .htaccess DenyAll or something,
+>> and then route all page views through ikiwiki.cgi. Am I missing something?
+>> --[[Ethan]]
+
+>>> Or you could just use apache or whatever and set up the access controls
+>>> there. Of course, that wouldn't integrate very well with the wiki,
+>>> unless perhaps you decided to use http basic authentication and the
+>>> httpauth plugin for ikiwiki that integrates with that.. --[[Joey]]
+
+>>>> Which would rule out openid, or other fun forms of auth. And routing all access
+>>>> through the CGI sort of defeats the purpose of ikiwiki. --[[Ethan]]
+
+>>>>> I think what Joey is suggesting is to use apache ACLs in conjunction
+>>>>> with basic HTTP auth to control read access, and ikiwiki can use the
+>>>>> information via the httpauth plugin for other ACLs (write, admin). But
+>>>>> yes, that would rule out non-httpauth mechanisms. -- [[Jon]]
+
+Also see [[!debbug 443346]].
+
+> Just a few quick thoughts about this:
+>
+>* I'm only thinking about write ACLs. As Joey noted, read ACLs need to be done in the web server.
+>* ACLs are going to be really hard for people with direct access to the revision control system.
+> Which means that we really only need to define ACLs for web access.
+>* ACLs for web access can then be defined by the web master. These might not need to be
+> defined in the wiki pages (although they could be).
+>* Given the previous two points, can't this be done with the `match_user()`
+> function defined by the [[plugins/attachment]] plugin (see the [[ikiwiki/pagespec/attachment]] pagespec info)
+> and the [[plugins/lockedit]] plugin?
+>
+> For example, add the following to your config file:
+>
+> locked_pages => '!(user(john) and */Discussion) and *',
+>
+> would lock all pages unless you're john and editing a Discussion page.
+> It's a thought anyway :-). -- [[Will]]
+
+>> Yes, writing per-user commit ACLs has become somewhat easier with recent
+>> features. Breaking `match_user` out of attachment, and making the
+>> lockedit plugin pass `user` and `ip` params when it calls `pagespec_match`
+>> would be sufficient. And [[done]], configurable via
+>> [[plugin/lockedit]]'s `locked_pages`. --[[Joey]]
+
+I am considering giving this a try, implementing it as a module.
+Here is how I see it:
+
+ * a new preprocessor directive allows defining ACL entries that provide permissions
+ for a given (user, page, operation), as in:
+
+ <pre>
+ \[[!acl user=joe page=*.png allow=upload]]
+ \[[!acl user=bob page=/blog/bob/* allow=*]]
+ \[[!acl user=* page=/blog/bob/* deny=*]]
+ \[[!acl user=http://jeremie.koenig.myopenid.com/ page=/todo/* deny=create
+ reason="spends his time writing todo items instead of source code"]]
+ </pre>
+
+ Each would expand to a description of the resulting rule.
+
+ * a configurable page of the wiki would be used as an ACL list.
+ Possibly could refer to other ACL pages, as in:
+
+ <pre>
+ \[[!acl user=* page=/subsite/* acl=/subsite/acl.mdwn]]
+ </pre>
+
+Any idea when this is going to be finished? If you want, I am happy to beta test.
+
+> It's already done, though that is sorta hidden in the above. :-)
+> Example of use to only allow two users to edit the tipjar page:
+> locked_pages => 'tipjar and !(user(joey) or user(bob))',
+> --[[Joey]]
+
+> > Thank you for the hint but I am being still confused (read: dense)... What I am trying to do is this:
+
+> > * No anonymous access.
+> > * Logged in users can edit and create pages.
+> > * Users can set who can edit their pages.
+> > * Some pages are only viewable by admins.
+
+> > Is it possible? If so how?...
+
+>>> I don't believe this is currently possible. What is missing is the concept
+>>> of page 'ownership'. -- [[Jon]]
+
+>>>> GAH! That is really a shame... Any chance of adding that? No, I do not really expect it to be added; after all, my requirements are pushing the boundary of what a wiki
+>>>> should be. Nonetheless, thanks for your help!
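+For reference, the lock Joey describes above lives in the setup file's `locked_pages` option; the user names here are examples only, not a recommendation:
+
```perl
# Illustrative Perl-format setup fragment.  Locks the tipjar page
# against everyone except the two listed (example) accounts.
locked_pages => 'tipjar and !(user(joey) or user(bob))',
```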
diff --git a/doc/todo/A_page_that_inlines_pages__61____34____42____34___results_in_unnecessary_feed_generation.mdwn b/doc/todo/A_page_that_inlines_pages__61____34____42____34___results_in_unnecessary_feed_generation.mdwn
new file mode 100644
index 000000000..543c346ac
--- /dev/null
+++ b/doc/todo/A_page_that_inlines_pages__61____34____42____34___results_in_unnecessary_feed_generation.mdwn
@@ -0,0 +1,80 @@
+I noticed when generating my wiki that all of my RSS feeds were
+getting regenerated even when I edited only a page that did not affect
+any feed.
+
+I found that the problem only occurs in the presence of a file that
+contains \[[!inline pages="*"]].
+
+> How is this unexpected? By inlining _every_ page in the wiki, you're
+> making that page depend on every other page; any change to any page in
+> the wiki will result in the inlining page and its rss feed needing to be
+> updated to include the changed page.
+>
+> At best, this is a [[wishlist]] optimisation item -- it would be nice if inline could
+> somehow know that since it's only displaying N pages, a change to the
+> N+1th page that its PageSpec matches is a no-op.
+> --[[Joey]]
+
+[[!tag done]]
+
+Here's a short script for replicating the bug. Just cut and paste this
+to a shell, (it will only muck in a new /tmp/ikiwiki-test directory
+that it will create):
+
+ cd /tmp
+ mkdir ikiwiki-test; cd ikiwiki-test; mkdir src
+ echo '\[[!inline pages="blog/*"]]' > src/myblog.mdwn
+ mkdir src/blog; echo "A blog entry" > src/blog/entry.mdwn
+ echo 'use IkiWiki::Setup::Standard {
+ srcdir => "src",
+ destdir => "output",
+ url => "http://example.com",
+ templatedir => "/dev/null",
+ underlaydir => "/dev/null",
+ rss => 1,
+ wrappers => [],
+ verbose => 1,
+ refresh => 1
+ }' > setup
+ ikiwiki --setup setup
+ ls -l --time-style=full-iso output/myblog/index.rss
+ echo "not a blog entry" > src/not-a-blog.mdwn
+ ikiwiki --setup setup
+ ls -l --time-style=full-iso output/myblog/index.rss
+ echo '\[[!inline pages="*"]]' > src/archives.mdwn
+ ikiwiki --setup setup
+ ls -l --time-style=full-iso output/myblog/index.rss
+ echo "still not blogging" >> src/not-a-blog.mdwn
+ ikiwiki --setup setup
+ ls -l --time-style=full-iso output/myblog/index.rss
+
+Here's the tail of the output that I see for this command:
+
+ $ echo "not a blog entry" > src/not-a-blog.mdwn
+ $ ikiwiki --setup setup
+ refreshing wiki..
+ scanning not-a-blog.mdwn
+ rendering not-a-blog.mdwn
+ done
+ $ ls -l --time-style=full-iso output/myblog/index.rss
+ -rw-r--r-- 1 cworth cworth 459 2007-06-01 06:34:36.000000000 -0700 output/myblog/index.rss
+ $ echo '\[[!inline pages="*"]]' > src/archives.mdwn
+ $ ikiwiki --setup setup
+ refreshing wiki..
+ scanning archives.mdwn
+ rendering archives.mdwn
+ done
+ $ ls -l --time-style=full-iso output/myblog/index.rss
+ -rw-r--r-- 1 cworth cworth 459 2007-06-01 06:34:37.000000000 -0700 output/myblog/index.rss
+ $ echo "still not blogging" >> src/not-a-blog.mdwn
+ $ ikiwiki --setup setup
+ refreshing wiki..
+ scanning not-a-blog.mdwn
+ rendering not-a-blog.mdwn
+ rendering archives.mdwn, which depends on not-a-blog
+ done
+ $ ls -l --time-style=full-iso output/myblog/index.rss
+ -rw-r--r-- 1 cworth cworth 459 2007-06-01 06:34:38.000000000 -0700 output/myblog/index.rss
+
+It looks like the rendering of archives.mdwn is also silently
+generating myblog/index.rss.
diff --git a/doc/todo/Account-creation_password.mdwn b/doc/todo/Account-creation_password.mdwn
new file mode 100644
index 000000000..d9646d42f
--- /dev/null
+++ b/doc/todo/Account-creation_password.mdwn
@@ -0,0 +1,6 @@
+[[plugins/passwordauth]] could support an "account creation password", as a
+simplistic anti-spam measure. (Some wikis edited by a particular group use an
+account creation password as an "ask an existing member to get an account"
+system.) --[[JoshTriplett]]
+
+[[todo/done]] --[[JoshTriplett]] \ No newline at end of file
diff --git a/doc/todo/Account_moderation.mdwn b/doc/todo/Account_moderation.mdwn
new file mode 100644
index 000000000..6ef7a65f1
--- /dev/null
+++ b/doc/todo/Account_moderation.mdwn
@@ -0,0 +1,12 @@
+The account creation process in the default [[plugins/passwordauth]] plugin
+could support account moderation by an administrator. New account signups
+would go into a queue for approval by an administrator.
+
+(Random, potentially infeasible idea: save their edits and apply them if
+the account gets approved.)
+
+--[[JoshTriplett]]
+
+[[!tag soc]]
+
+[[wishlist]]
diff --git a/doc/todo/Add_DATE_parameter_for_use_in_templates.mdwn b/doc/todo/Add_DATE_parameter_for_use_in_templates.mdwn
new file mode 100644
index 000000000..e5ac391c3
--- /dev/null
+++ b/doc/todo/Add_DATE_parameter_for_use_in_templates.mdwn
@@ -0,0 +1,86 @@
+I sometimes want to inline things with the complete date and time, and
+sometimes need only the date. I know about the prettydate plugin that
+already makes the time a bit "nicer" to read, but sometimes I just
+don't want it at all.
+
+Here's a patch to add a DATE parameter for use in templates as
+controlled by a `dateformat` setting in the setup.
+
+I explicitly did not edit any date-related plugins, (for fear of
+breaking them as I don't use them so I wouldn't be testing them). But
+it occurs to me that it might be correct to not touch them anyway,
+(since things like prettydate are really more concerned with changing
+the presentation of the time, not the date).
+
+I also didn't edit the sample setup file, (since I'm just using a git
+repository setup on my local /usr/share/perl5/IkiWiki directory
+here). But, ah, now that I look, I do see that there's a "real" git
+repository advertised with the ikiwiki source. I'll have to start
+using that for future patches, (so let me know if you want me to
+regenerate this one against that).
+
+-Carl
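+The two strftime formats involved (`%c` for the existing timeformat, `%x` for the new dateformat default) can be compared with date(1); the exact output depends on your locale:
+
```shell
# %c: the locale's full date and time; %x: the locale's date only.
# These mirror the timeformat/dateformat defaults in the patch below.
date +%c
date +%x
```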
+
+ From 325d208d8dc8016a377bb7c923a51af2bd3355b0 Mon Sep 17 00:00:00 2001
+ From: Carl Worth <cworth@cworth.org>
+ Date: Tue, 3 Jul 2007 11:39:03 -0700
+ Subject: [PATCH] Allow DATE as a template parameter (with format controlled by dateformat setting)
+
+ ---
+ IkiWiki.pm | 12 +++++++++++-
+ Plugin/inline.pm | 1 +
+ 2 files changed, 12 insertions(+), 1 deletions(-)
+
+ diff --git a/IkiWiki.pm b/IkiWiki.pm
+ index cd42e8d..ebf0474 100644
+ --- a/IkiWiki.pm
+ +++ b/IkiWiki.pm
+ @@ -16,7 +16,7 @@ use vars qw{%config %links %oldlinks %pagemtime %pagectime %pagecase
+ use Exporter q{import};
+ our @EXPORT = qw(hook debug error template htmlpage add_depends pagespec_match
+ bestlink htmllink readfile writefile pagetype srcfile pagename
+ - displaytime will_render gettext urlto targetpage
+ + displaytime displaydate will_render gettext urlto targetpage
+ %config %links %renderedfiles %pagesources %destsources);
+ our $VERSION = 2.00; # plugin interface version, next is ikiwiki version
+ our $version="2.1";my $installdir="/usr";
+ @@ -70,6 +70,7 @@ sub defaultconfig () {
+ plugin => [qw{mdwn inline htmlscrubber passwordauth openid signinedit
+ lockedit conditional}],
+ timeformat => '%c',
+ + dateformat => '%x',
+ locale => undef,
+ sslcookie => 0,
+ httpauth => 0,
+ @@ -447,6 +448,15 @@ sub displaytime ($) {
+ $config{timeformat}, localtime($time)));
+ }
+
+ +sub displaydate ($) {
+ + my $time=shift;
+ +
+ + # strftime doesn't know about encodings, so make sure
+ + # its output is properly treated as utf8
+ + return decode_utf8(POSIX::strftime(
+ + $config{dateformat}, localtime($time)));
+ +}
+ +
+ sub beautify_url ($) {
+ my $url=shift;
+
+ diff --git a/Plugin/inline.pm b/Plugin/inline.pm
+ index 8f6ab51..7bd6147 100644
+ --- a/Plugin/inline.pm
+ +++ b/Plugin/inline.pm
+ @@ -148,6 +148,7 @@ sub preprocess_inline (@) {
+ $template->param(pageurl => urlto(bestlink($params{page}, $page), $params{destpage}));
+ $template->param(title => pagetitle(basename($page)));
+ $template->param(ctime => displaytime($pagectime{$page}));
+ + $template->param(date => displaydate($pagectime{$page}));
+
+ if ($actions) {
+ my $file = $pagesources{$page};
+ --
+ 1.5.2.2
+
+[[!tag patch patch/core plugins/inline]]
diff --git a/doc/todo/Add_HTML_support_to_po_plugin.mdwn b/doc/todo/Add_HTML_support_to_po_plugin.mdwn
new file mode 100644
index 000000000..9c4b8c701
--- /dev/null
+++ b/doc/todo/Add_HTML_support_to_po_plugin.mdwn
@@ -0,0 +1,9 @@
+The HTML page type should be fully supported by the PO plugin: po4a's
+HTML support is able to extract translatable strings and to disregard
+the rest.
+
+This is implemented in my po branch, please review. --[[intrigeri]]
+
+[[!tag patch]]
+
+> This has been merged for a while => [[done]].
diff --git a/doc/todo/Add_a_plugin_to_list_available_pre-processor_commands.mdwn b/doc/todo/Add_a_plugin_to_list_available_pre-processor_commands.mdwn
new file mode 100644
index 000000000..9ac400cd2
--- /dev/null
+++ b/doc/todo/Add_a_plugin_to_list_available_pre-processor_commands.mdwn
@@ -0,0 +1,141 @@
+I've found myself wanting to know which [[plugins]] are switched on so I know which pre-processor commands I can use. The attached [[patch]] adds a new plugin that generates the list of available plugins. -- [[Will]]
+
+> Good idea, I do see a few problems:
+>
+> - preprocessor directives do not necessarily have the same name as the
+> plugin that contains them (for example, the graphviz plugin adds a graph
+> directive). Won't `keys %{$IkiWiki::hooks{preprocess}}` work?
+
+>>> Er, yeah - that's a much better solution. :) -- and done
+
+> - "listplugins" is a bit misnamed since it only does preprocessor directives.
+
+>>> Yes. Initially this was going to list all enabled plugins. Then when searching
+>>> for enabled plugins I changed my mind and decided that a list of pre-processor
+>>> directives was more useful. I'll fix that too. -- changed to `listpreprocessors`
+
+> - comment was copied from version plugin and still mentions version :-)
+
+>>> :-) -- fixed
+
+> - Seems like [[ikiwiki/formatting]] could benefit from including the
+> list.. however, just a list of preprocessor directive names is not
+> the most user-friendly thing that could be put on that page. It would
+> be nice if there were also a short description and maybe an example of
+> use. Seems like the place to include that info would be in the call
+> to `hook()`.
+> (Maybe adding that is more involved than you want to go though..)
+>
+> --[[Joey]]
+
+>> Adding a whole new hook for a usage example is more effort than I
+>> wanted to go to. I was thinking of either:
+
+>>> Just to clarify, I meant adding new parameters to the same hook call
+>>> that registers the plugin. --[[Joey]]
+
+>> - Adding a configuration for a wiki directory. If a matching page is in the
+>> specified wiki directory then the plugin name gets turned into a link to that
+>> page
+>> - Adding configuration for an external URL. Each plugin name is added as
+>> a link to the plugin name appended to the URL.
+
+>>The first option is easier to navigate and wouldn't produce broken links,
+>>but requires all the plugin documentation to be local. The second option
+>>can link back to the main IkiWiki site, but if you have any non-standard
+>>plugins then you'll get broken links.
+>>
+>>Hrm. After listing all of that, maybe your idea with the hooks is the better
+>>solution. I'll think about it some more. -- [[Will]]
+
+>>> I've also run into this problem with the websetup plugin, and
+>>> considered those ideas too. I don't like the external url, because
+>>> ikiwiki.info may be out of sync with the version of ikiwiki being used.
+>>> (Or maybe it's gone! :-) The first idea is fine, except for the bloat
+>>> issue. If turning on listpreprocessors and/or websetup means adding
+>>> hundreds of pages (and of kilobytes) to your wiki, that could be an
+>>> incentive to not turn them on..
+>>>
+>>> Hmm.. maybe the thing to do is to use _internal pages for the plugins;
+>>> then the individual pages would not be rendered, and your inlines would
+>>> still work. Although I don't know how websetup would use it then, and
+>>> also they would have to be non-internal for ikiwiki's own docwiki. Hmm.
+>>> Maybe these are two different things; one is a set of pages describing
+>>> preprocessor directives, and the second a set of pages describing
+>>> plugins. They're so closely related though it seems a shame to keep
+>>> them separate..
+>>> --[[Joey]]
+
+>>> I started implementing the hook based solution, and decided I didn't like
+>>> it because there was no nice way to rebuild pages when the preprocessor
+>>> descriptions changed. So instead I assumed that the [[plugins]] pages
+>>> would be moved into the underlay directory. This plugin then uses an
+>>> `inline` directive to include those pages. You can use the `inline`
+>>> parameter to decide if you want to include all the descriptions or
+>>> just the titles. There is also an option to auto-create default/blank
+>>> description pages if they are missing (from a template). As preprocessor
+>>> commands don't list unless they have a description page, auto-creation
+>>> is enabled by default.
+>>>
+>>> There are three new templates that are needed. These are for:
+>>>
+>>> - The auto-created description pages are generated from `preprocessor-description.tmpl`.
+>>> - When only pre-processor names are listed, the `listpreprocessors-listonly.tmpl` template is used.
+>>> - When pre-processor descriptions are included inline, the `listpreprocessors-inline.tmpl` template is used.
+>>>
+>>> -- [[Will]]
+
+>>>> Just a quick note: pages are only created for pre-processor commands
+>>>> that exist when the `refresh` hook is called. This is before the [[shortcuts]] are
+>>>> processed. However, the list of available pre-processor commands will include
+>>>> shortcuts if they have description pages (the list is generated later, after the
+>>>> shortcuts have been added). While this was unplanned, it seems a reasonable
+>>>> tradeoff between including all the large number of shortcuts and including none. -- [[Will]]
+
+>>>>>> I think that using an inline is elegant! However, I don't understand
+>>>>>> why it has to create stub description pages? I doubt that, if a
+>>>>>> directive is missing a page, the stub will be filled out in many
+>>>>>> wikis. And it adds a lot of complexity, particularly committing a
+>>>>>> bunch of generated pages to revision control when the user just
+>>>>>> wants a plugin list seems undesirable.
+>>>>>>
+>>>>>> Seems to me it could use the inline for pages that exist, and append
+>>>>>> to the bottom a generated text for anything that is currently missing.
+>>>>>> The generated text could even have a page creation link in it if
+>>>>>> you wanted.
+>>>>>> --[[Joey]]
+
+>>>>>>> I kinda agree about the page generation. I don't like mixing an
+>>>>>>> inlined and a list though. Besides which, that ends
+>>>>>>> up keeping much of complexity of the page generation because
+>>>>>>> the code still has to detect which pages are missing. I've added
+>>>>>>> a patch that uses a list of wikilinks instead. This way available
+>>>>>>> pages get linked correctly, and missing pages get normal creation
+>>>>>>> links. The old patch is still here if you decide you prefer that. -- [[Will]]
+
+>>>>>>>> Can you explain the full/early list (why track both?) and generated parameter?
+
+>>>>>>>>> If you add in all the shortcuts you get quite a long list. My original idea
+>>>>>>>>> was to just track the plugin commands. This is the early list. But then
+>>>>>>>>> I thought that it might be nice for someone looking at wiki source and
+>>>>>>>>> seeing a shortcut to know where it came from. So I decided to make
+>>>>>>>>> displaying the full list an option, with the original concept as the default.
+
+>>>>>>>>> Another option here might be to generate the full list every time, but give
+>>>>>>>>> generated pre-processor commands (e.g. shortcuts) a different css class.
+>>>>>>>>> I'm not sure that is better than what I have though.
+
+>>>>>>>>> I keep track of both in the page state because if a command moves from
+>>>>>>>>> a shortcut to the early list (or vice versa) it changes what should be
+>>>>>>>>> displayed in the default use of the plugin. I thought about tracking just what
+>>>>>>>>> was actually used on the page, but I don't know in the needsbuild hook whether the `generated`
+>>>>>>>>> parameter has been supplied (or maybe the plugin is used twice on the page -
+>>>>>>>>> once in each form). It was just easier to track both.
+
+>>>>>>>> Only code change I'd suggest is using `htmllink` rather than
+>>>>>>>> generating a wikilink.
+
+>>>>>>>>> Yeah - that would make sense. done. -- [[Will]]
+
+Patch is applied (along with some changes..). [[done]] (But see
+[[directive_docs]].)
diff --git a/doc/todo/Add_basename_in_edittemplate.mdwn b/doc/todo/Add_basename_in_edittemplate.mdwn
new file mode 100644
index 000000000..6f5b0569f
--- /dev/null
+++ b/doc/todo/Add_basename_in_edittemplate.mdwn
@@ -0,0 +1,8 @@
+I wanted to produce an external link from a ikiwiki Subpage based on
+the *basename* of the Subpage. So I added the following code to the
+edittemplate plugin:
+
+ my ($basename) = $page =~ m!.*/(.*)!;
+ $template->param(basename => $basename || $page);
+
+Is there any other way I could have achieved this?
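+A rough shell analogue of the added Perl (not the plugin itself) shows the intended result; the page path is made up:
+
```shell
page="blog/2007/subpage"   # hypothetical ikiwiki page path
basename="${page##*/}"     # strip everything through the last slash
echo "$basename"           # prints: subpage
```
+
+As in the Perl's `$basename || $page`, a top-level page with no slash simply falls through to the full name.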
diff --git a/doc/todo/Add_camelcase_exclusions.mdwn b/doc/todo/Add_camelcase_exclusions.mdwn
new file mode 100644
index 000000000..6b86132a0
--- /dev/null
+++ b/doc/todo/Add_camelcase_exclusions.mdwn
@@ -0,0 +1,23 @@
+Camelcase currently looks for any and all camelcase words and turns them into wiki links. This patch adds a config item called <code>camelcase_ignore</code> which is an array of camelcase words to ignore.
+
+<pre>
+--- /usr/share/perl5/IkiWiki/Plugin/camelcase.pm.orig 2008-12-24 11:49:14.000000000 +1300
++++ /usr/share/perl5/IkiWiki/Plugin/camelcase.pm 2008-12-24 12:02:21.000000000 +1300
+@@ -33,7 +33,11 @@
+ my $destpage=$params{destpage};
+
+ $params{content}=~s{$link_regexp}{
+- htmllink($page, $destpage, IkiWiki::linkpage($1))
++ if (grep {/$1/} @{ $config{'camelcase_ignore'} }) {
++ $1
++ } else {
++ htmllink($page, $destpage, IkiWiki::linkpage($1))
++ }
+ }eg;
+
+ return $params{content};
+</pre>
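+With a patch along these lines, the exclusion list would be configured in the setup file; the words shown are placeholders:
+
```perl
# Illustrative setup fragment: CamelCase words that should not
# be turned into wiki links.
camelcase_ignore => [qw{SourceForge PayPal}],
```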
+
+--[[puck]]
+
+[[done]]
diff --git a/doc/todo/Add_instructive_commit_messages_for_add__47__edit_pages.mdwn b/doc/todo/Add_instructive_commit_messages_for_add__47__edit_pages.mdwn
new file mode 100644
index 000000000..cfb5b98a3
--- /dev/null
+++ b/doc/todo/Add_instructive_commit_messages_for_add__47__edit_pages.mdwn
@@ -0,0 +1,43 @@
+When I added or edited a page, no commit message was written out (Mercurial backend, though I guess it shouldn't matter). This was done for e.g. the `rename` plugin. I made a naive but seemingly working change to `editpage.pm` to add a message.
+
+I modeled the message on `rename.pm`, which used a lowercase initial letter and imperative form of the verb. This is not the case for e.g. the `comment` plugin, which says "Added a comment: ", so I guess there is no strict rule on style in this case.
+
+Diff follows. --[[Daniel Andersson]]
+
+> This is somewhat intentional. It's pretty usual for changes to be made
+> to a wiki without bothering to say what changed; the change speaks for
+> itself and it would just be clutter to mention what file was changed,
+> since any reasonable interface will show the filename, or a link,
+> or some summary of what files were affected when showing a change.
+
+>> I use the Mercurial backend, and Mercurial doesn't allow empty commit messages, so if there were no message, it would default to "no message given" (hardcoded in `mercurial.pm`), which is also clutter, and non-descriptive at that. But I'm on board with your reasoning. It's a matter of taste (and somewhat backend), I guess. I might continue to locally use this patch (with the caveat below fixed when commit message is given), but I won't push for it to be included upstream. --[[Daniel Andersson]]
+
+>>> Hmm.. It would be possible to make the mercurial backend
+>>> include the filename (or just "added" or "edited") in the commit
+>>> message. It might take some work, especially to handle
+>>> `rcs_commit_staged`, since it would probably need to cache
+>>> what files have been staged for commit. --[[Joey]]
+
+> Also your patch stomps over any commit message that the user *does*
+> provide, so certainly cannot be applied as-is. --[[Joey]]
+
+>> Yes, "naive" was the word :-) . --[[Daniel Andersson]]
+
+[[!tag patch]]
+
+---
+
+ diff -r ee177ca9bf36 Plugin/editpage.pm
+ --- a/Plugin/editpage.pm Fri Jul 15 17:58:04 2011 +0200
+ +++ b/Plugin/editpage.pm Sat Jul 16 03:01:13 2011 +0200
+ @@ -405,6 +405,10 @@
+ if ($config{rcs}) {
+ if (! $exists) {
+ rcs_add($file);
+ + $message = "add $file";
+ + }
+ + else {
+ + $message = "edit $file";
+ }
+
+ # Prevent deadlock with post-commit hook by
diff --git a/doc/todo/Add_instructive_commit_messages_for_removing_pages.mdwn b/doc/todo/Add_instructive_commit_messages_for_removing_pages.mdwn
new file mode 100644
index 000000000..8b1dd74a7
--- /dev/null
+++ b/doc/todo/Add_instructive_commit_messages_for_removing_pages.mdwn
@@ -0,0 +1,32 @@
+As [[Add instructive commit messages for add _47_ edit pages]], but for `remove.pm`.
+
+I use a `join()` since it at least looks like the plugin is able to remove several pages at once (`foreach` looping over file parameters), thus holding multiple entries in `@files`. I haven't seen this happen, though.
+
+> I feel that anything that shows a change should show what files were
+> changed (at least as an easily accessible option), so mentioning
+> filenames in commits is almost always clutter.
+>
+> It could be argued that there should be no message at all here, unless
+> the user provides one (which they currently cannot), as is done when
+> adding files. But the entire removal of a page from a wiki is a fairly
+> unusual circumstance that is probably best highlighted as such in
+> recentchanges. --[[Joey]]
+
+Diff follows. --[[Daniel Andersson]]
+
+[[!tag patch]]
+
+---
+
+ diff -r 4f2ad3a5377e Plugin/remove.pm
+ --- a/Plugin/remove.pm Fri Jul 15 17:39:04 2011 +0200
+ +++ b/Plugin/remove.pm Sat Jul 16 03:20:35 2011 +0200
+ @@ -228,7 +228,7 @@
+ IkiWiki::rcs_remove($file);
+ }
+ IkiWiki::rcs_commit_staged(
+ - message => gettext("removed"),
+ + message => sprintf(gettext("remove %s"), join(', ', @files)),
+ session => $session,
+ );
+ IkiWiki::enable_commit_hook();
diff --git a/doc/todo/Add_label_to_search_form_input_field.mdwn b/doc/todo/Add_label_to_search_form_input_field.mdwn
new file mode 100644
index 000000000..514108fba
--- /dev/null
+++ b/doc/todo/Add_label_to_search_form_input_field.mdwn
@@ -0,0 +1,56 @@
+The default searchform.tmpl looks rather bare and unintuitive with just an input field.
+
+The patch below adds a label for the field to improve usability:
+
+ --- templates/searchform.tmpl.orig Fri Jun 15 15:02:34 2007
+ +++ templates/searchform.tmpl Fri Jun 15 15:02:41 2007
+ @@ -1,5 +1,6 @@
+ <form method="get" action="<TMPL_VAR SEARCHACTION>" id="searchform">
+ <div>
+ +<label for="phrase">Search:</label>
+ <input type="text" name="phrase" value="" size="16" />
+ <input type="hidden" name="enc" value="UTF-8" />
+ <input type="hidden" name="do" value="hyperestraier" />
+
+> I don't do this by default because putting in the label feels to me like
+> it makes the action bar too wide. YMMV. What I'd really like to do is make the
+> _content_ of the search field say "search". You see that on some other
+> sites, but so far the only way I've seen to do it is by inserting a
+> nasty lump of javascript. --[[Joey]]
+
+>> Please don't do that, it is a bad idea on so many levels :) See e.g.
+>> <http://universalusability.com/access_by_design/forms/auto.html> for
+>> an explanation why. --[[HenrikBrixAndersen]]
+
+>>> If you really want to do this, this is one way:
+
+ --- searchform.tmpl.orig Sat Aug 25 11:54:28 2007
+ +++ searchform.tmpl Sat Aug 25 11:56:19 2007
+ @@ -1,6 +1,6 @@
+ <form method="get" action="<TMPL_VAR SEARCHACTION>" id="searchform">
+ <div>
+ -<input type="text" name="phrase" value="" size="16" />
+ +<input type="text" name="phrase" value="Search" size="16" onfocus="this.value=''" />
+ <input type="hidden" name="enc" value="UTF-8" />
+ <input type="hidden" name="do" value="hyperestraier" />
+ </div>
+
+> That's both nasty javascript and fails if javascript is disabled. :-)
+> What I'd really like is a proper search label that appears above the
+> input box. There is free whitespace there, except for pages with very
+> long titles. Would someone like to figure out the CSS to make that
+> happen?
+>
+> The tricky thing is that the actual html for the form needs to
+> still come after the page title, not before it. Because the first thing
+> a non-css browser should show is the page title. But the only way I know
+> to get it to appear higher up is to put it first, or to use Evil absolute
+> positioning. (CSS sucks.) --[[Joey]]
+
+> Update: html5 allows just adding `placeholder="Search"` to the input
+> element. It already works in e.g. chromium. However, ikiwiki does not use
+> html5 yet. --[[Joey]]
+
+>> [[Done]], placeholder added, in html5 mode only.
+
+[[!tag wishlist bugs/html5_support]]
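For reference, the html5-mode form with the placeholder looks roughly like this (a sketch based on the template quoted above; the action URL is an assumption):

```html
<!-- sketch only: the action URL is an assumption -->
<form method="get" action="ikiwiki.cgi" id="searchform">
<div>
<input type="text" name="phrase" placeholder="Search" size="16" />
</div>
</form>
```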
diff --git a/doc/todo/Add_nicer_math_formatting.mdwn b/doc/todo/Add_nicer_math_formatting.mdwn
new file mode 100644
index 000000000..d83c97add
--- /dev/null
+++ b/doc/todo/Add_nicer_math_formatting.mdwn
@@ -0,0 +1,28 @@
+It would be nice to add nicer math formatting. I currently use the
+[[plugins/teximg]] plugin, but I wonder if
+[jsMath](http://www.math.union.edu/~dpvc/jsMath/) wouldn't be a better option.
+
+[[Will]]
+
+> I've looked at jsmath (which is nicely packaged in Debian), and
+> I agree that this is nicer than TeX images. That text-mode browsers
+> get to see LaTeX as a fallback is actually a nice feature (better
+> than nothing, right? :) That browsers w/o javascript will not be able to
+> see the math either is probably ok.
+>
+> A plugin would probably be a pretty trivial thing to write.
+> It just needs to include the javascript files,
+> and slap a `<div class="math">` around the user's code, then
+> call `jsMath.Process(document);` at the end of the page.
+>
+> My only concern is security: Has jsMath's parser been written
+> to be safe when processing untrusted input? Could a user abuse the
+> parser to cause it to emit/run arbitrary javascript code?
+> I've posted a question about this to its forum: --[[Joey]]
+> <https://sourceforge.net/projects/jsmath/forums/forum/592273/topic/3831574>
+
+I think [mathjax](http://www.mathjax.org/) would be the best option. This is the math rendering engine used in mathoverflow.
+
+> I've updated Jason Blevin's pandoc plugin to permit tighter integration between Ikiwiki and Pandoc. Given the features Pandoc has added over the past 6-12 months, this makes for a very powerful combination, e.g. with code block syntax highlighting and lots of options for how to process and display inline TeX. Both jsMath and MathJaX are supported, along with many other methods. See https://github.com/profjim/pandoc-iki for details. --Profjim
+
+[[!tag wishlist]]
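Joey's description of the would-be plugin output can be sketched as markup (the jsMath script path is an assumption; jsMath installs vary):

```html
<!-- sketch: load jsMath, wrap the user's TeX, process the page -->
<script src="/jsMath/easy/load.js"></script>
<div class="math">\int_0^1 x^2\,dx</div>
<script>jsMath.Process(document);</script>
```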
diff --git a/doc/todo/Add_showdown_GUI_input__47__edit.mdwn b/doc/todo/Add_showdown_GUI_input__47__edit.mdwn
new file mode 100644
index 000000000..7e7947fbc
--- /dev/null
+++ b/doc/todo/Add_showdown_GUI_input__47__edit.mdwn
@@ -0,0 +1,31 @@
+Add an option to use the Showdown GUI for editing or adding content.
+It is BSD-licensed javascript that shows the rendered Markdown (or HTML) while editing.
+
+A demo is at <http://www.attacklab.net/showdown-gui.html>
+
+(I read about this on the markdown mailing list.)
+
+> [[Wikiwyg]] also can provide a nice GUI for editing, although it would
+> need to be expanded to support markdown. The benefit compared to showdown
+> is that it provides a GUI for editing with widets for italics, etc,
+> compared to showdown which still leaves input in markdown and seems more
+> geared to a fast preview of the html. --[[Joey]]
+
+> Since we have a semi-working wikiwyg and it's better, I'm considering this
+> todo item as [[done]] or rather, will-not-be-done..
+
+>> Given the unfortunate state of affairs for the wikiwyg project, could it
+>> be worthwhile to consider this option again? It seems to have a companion
+>> product (wmd) with formatting widgets and a live preview pane, that is
+>> promised to be MIT licensed as of the next release.... --Chapman Flack
+
+>>> What sort of integration would be needed to put in WMD?
+>>> It looks like it would need to be aware of some plugin/wikiword behavior
+>>> ... perhaps taking a Wikiword and making it appear like a link in preview, but
+>>> with a different style (perhaps diff color/font). For plugin commands,
+>>> applying a 'real' preview would probably be difficult, so it'd probably
+>>> be necessary to insert some sort of placeholder, perhaps by outputting
+>>> the text in monospace form w/ a lighter font to denote that it won't
+>>> directly be shown in the page... -- [[harningt]]
+
+>>>>> We have a wmd plugin now. --[[Joey]]
diff --git a/doc/todo/Add_space_before_slash_in_parent_links.mdwn b/doc/todo/Add_space_before_slash_in_parent_links.mdwn
new file mode 100644
index 000000000..e07ad8ef9
--- /dev/null
+++ b/doc/todo/Add_space_before_slash_in_parent_links.mdwn
@@ -0,0 +1,156 @@
+This [[patch]] adds a space before the forward-slash in the parent links. There is already a space after the slash.
+
+> I intentionally put the space after the slash and not before, because I
+> like how it looks that way. So I don't plan to apply this patch unless a
+> lot of people disagree with me or whatever. --[[Joey]]
+
+>> Couldn't we export what's put between the links to a variable? For instance, I might actually want to set it to ' : ' or '→'. --[[madduck]]
+
+>>> Yes, please. This seems to be something a lot of people want to customize. (I certainly do -- a forward slash only looks natural to Unix users) --[[sabr]]
+
+>> Joey, would I be right to summarize your position on this as "people who
+>> want to change the text of the templates should maintain their own version
+>> of the `.tmpl` files"? It's not clear to me how this todo item could be
+>> closed in a way acceptable to you, except perhaps as WONTFIX. --[[smcv]]
+
+Before:
+
+ ikiwiki/ todo/ Add space before slash in parent links
+
+After:
+
+ ikiwiki / todo / Add space before slash in parent links
+
+Patch:
+
+ diff --git a/templates/misc.tmpl b/templates/misc.tmpl
+ index 184920e..80e6d0b 100644
+ --- a/templates/misc.tmpl
+ +++ b/templates/misc.tmpl
+ @@ -15,7 +15,7 @@
+
+ <div class="header">
+ <span>
+ -<TMPL_VAR INDEXLINK>/ <TMPL_VAR TITLE>
+ +<TMPL_VAR INDEXLINK> / <TMPL_VAR TITLE>
+ </span>
+ </div>
+
+ diff --git a/templates/page.tmpl b/templates/page.tmpl
+ index 3a1ac9e..1978e93 100644
+ --- a/templates/page.tmpl
+ +++ b/templates/page.tmpl
+ @@ -17,7 +17,7 @@
+ <div class="header">
+ <span>
+ <TMPL_LOOP NAME="PARENTLINKS">
+ -<a href="<TMPL_VAR NAME=URL>"><TMPL_VAR NAME=PAGE></a>/
+ +<a href="<TMPL_VAR NAME=URL>"><TMPL_VAR NAME=PAGE></a> /
+ </TMPL_LOOP>
+ <TMPL_VAR TITLE>
+ </span>
+ diff --git a/templates/recentchanges.tmpl b/templates/recentchanges.tmpl
+ index e03482f..4877395 100644
+ --- a/templates/recentchanges.tmpl
+ +++ b/templates/recentchanges.tmpl
+ @@ -15,7 +15,7 @@
+
+ <div class="header">
+ <span>
+ -<TMPL_VAR INDEXLINK>/ <TMPL_VAR TITLE>
+ +<TMPL_VAR INDEXLINK> / <TMPL_VAR TITLE>
+ </span>
+ </div>
+
+----
+
+It's almost implicit in some of the discussion above, but this can be achieved locally if you fork your templates directory from ikiwiki's, with an amendment such as
+
+ <h1><TMPL_LOOP NAME="PARENTLINKS"><a
+ href="<TMPL_VAR NAME=URL>"><TMPL_VAR NAME=PAGE></a>
+ &rarr;
+ </TMPL_LOOP><TMPL_VAR TITLE></h1>
+
+This is what I do on my site for example. -- [[Jon]]
+
+> You don't actually need to fork the whole directory, "only" `page.tmpl` -
+> put `templatedir => "/foo/templates"` in your setup file, copy `page.tmpl`
+> to that directory, and modify it there. IkiWiki will look in `templatedir`
+> first, then fall back to its default templates if any are missing from
+> `templatedir`.
+>
+> (Admittedly, `page.tmpl` is the hardest to maintain a fork of, because it
+> tends to change whenever a new plugin is added...) --[[smcv]]
+
+----
+
+Here is a solution which doesn't require people to create their own
+`page.tmpl`. The solution uses an HTML list together with CSS to draw the
+separator and can therefore be controlled by users. This change also
+allows people to control other aspects of how the parentlinks are
+displayed. The only drawback is that lynx/w3m don't seem to deal with this
+CSS feature, but I don't think it's too bad since the parentlinks will
+simply show up as a list.
+
+> I guess I could live with w3m having a second list at the top.
+>
+> Does this method look identical in the default theme? What about the
+> other themes? Several of them do things with parentlinks css.. --[[Joey]]
+
+(I see that the other patch changes templates/misc.tmpl and
+templates/recentchanges.tmpl for INDEXLINK. I haven't done that but can do
+so if [[Joey]] likes this approach.)
+
+> Those templates no longer have the redundant stuff. --[[Joey]]
+
+--[[tbm]]
+
+ diff --git a/doc/style.css b/doc/style.css
+ index 35a1331..b726365 100644
+ --- a/doc/style.css
+ +++ b/doc/style.css
+ @@ -129,6 +129,23 @@ pre {
+ overflow: auto;
+ }
+
+ +ul.parentlinks li:after {
+ +display: marker;
+ +content: "/ ";
+ +background: none;
+ +}
+ +
+ +ul.parentlinks li {
+ +display: inline;
+ +}
+ +
+ +ul.parentlinks
+ +{
+ +padding-left: 0;
+ +display:inline;
+ +list-style-type: none;
+ +}
+ +
+ div.recentchanges {
+ border-style: solid;
+ border-width: 1px;
+ diff --git a/templates/page.tmpl b/templates/page.tmpl
+ index 770ac23..f54493e 100644
+ --- a/templates/page.tmpl
+ +++ b/templates/page.tmpl
+ @@ -44,11 +44,15 @@
+ <TMPL_IF HTML5><section class="pageheader"><TMPL_ELSE><div class="pageheader"></TMPL_IF>
+ <TMPL_IF HTML5><header class="header"><TMPL_ELSE><div class="header"></TMPL_IF>
+ <span>
+ +<TMPL_IF PARENTLINKS>
+ <span class="parentlinks">
+ +<ul class="parentlinks">
+ <TMPL_LOOP PARENTLINKS>
+ -<a href="<TMPL_VAR URL>"><TMPL_VAR PAGE></a>/
+ +<li><a href="<TMPL_VAR URL>"><TMPL_VAR PAGE></a></li>
+ </TMPL_LOOP>
+ +</ul>
+ </span>
+ +</TMPL_IF>
+ <span class="title">
+ <TMPL_VAR TITLE>
+ <TMPL_IF ISTRANSLATION>
diff --git a/doc/todo/Add_support_for_latest_Text::Markdown_as_found_on_CPAN.mdwn b/doc/todo/Add_support_for_latest_Text::Markdown_as_found_on_CPAN.mdwn
new file mode 100644
index 000000000..6b9fa0535
--- /dev/null
+++ b/doc/todo/Add_support_for_latest_Text::Markdown_as_found_on_CPAN.mdwn
@@ -0,0 +1,45 @@
+Recent versions of Text::Markdown as found on CPAN (e.g. 1.0.16) no longer contain a Text::Markdown::Markdown() routine, but instead contain a Text::Markdown::markdown() routine (notice the difference in capitalization).
+
+It seems that the Text::Markdown module as found on CPAN is now identical to Text::MultiMarkdown - hence the subtle change.
+
+This patch allows IkiWiki to work with either of the two:
+
+> I already wrote such a patch a few days ago and applied it to git. Might
+> be a good idea to check out current git master before spending time on
+> patches in the future. Thanks for the work anyway.. --[[Joey]]
+
+[[!tag done]]
+
+ --- IkiWiki/Plugin/mdwn.pm.orig 2008-03-08 11:33:50.000000000 +0100
+ +++ IkiWiki/Plugin/mdwn.pm 2008-03-08 13:37:21.000000000 +0100
+ @@ -28,14 +28,20 @@ sub htmlize (@) {
+ $markdown_sub=\&Markdown::Markdown;
+ }
+ else {
+ - eval q{use Text::Markdown};
+ + eval q{use Text::Markdown 'Markdown'};
+ if (! $@) {
+ $markdown_sub=\&Text::Markdown::Markdown;
+ }
+ else {
+ - do "/usr/bin/markdown" ||
+ - error(sprintf(gettext("failed to load Markdown.pm perl module (%s) or /usr/bin/markdown (%s)"), $@, $!));
+ - $markdown_sub=\&Markdown::Markdown;
+ + eval q{use Text::Markdown 'markdown'};
+ + if (! $@) {
+ + $markdown_sub=\&Text::Markdown::markdown;
+ + }
+ + else {
+ + do "/usr/bin/markdown" ||
+ + error(sprintf(gettext("failed to load Markdown.pm perl module (%s) or /usr/bin/markdown (%s)"), $@, $!));
+ + $markdown_sub=\&Markdown::Markdown;
+ + }
+ }
+ }
+ require Encode;
+
+The above patch, which is against ikiwiki-2.40, should fix [[bugs/markdown_module_location]].
+
+-- [[HenrikBrixAndersen]]
+
+[[!tag patch]]
diff --git a/doc/todo/Adjust_goodstuff.mdwn b/doc/todo/Adjust_goodstuff.mdwn
new file mode 100644
index 000000000..8a9b5c3b1
--- /dev/null
+++ b/doc/todo/Adjust_goodstuff.mdwn
@@ -0,0 +1,12 @@
+Need to re-evaluate the contents of [[plugins/goodstuff]] based on external dependencies.
+
+* Possibly drop [[plugins/img]] due to PerlMagick dependency, and [[plugins/otl]] due to vimoutliner dependency.
+* Add [[plugins/favicon]] and [[plugins/more]] due to lack of external dependencies.
+
+Alternatively, if including items that have minor external dependencies:
+
+* Possibly add [[plugins/table]].
+
+--[[JoshTriplett]]
+
+[[done]]
diff --git a/doc/todo/Allow_TITLE_to_include_part_of_the_path_in_addition_to_the_basename.mdwn b/doc/todo/Allow_TITLE_to_include_part_of_the_path_in_addition_to_the_basename.mdwn
new file mode 100644
index 000000000..b97c81efa
--- /dev/null
+++ b/doc/todo/Allow_TITLE_to_include_part_of_the_path_in_addition_to_the_basename.mdwn
@@ -0,0 +1,79 @@
+I need to display part of my pages' path in the `<title>` meta HTML
+header instead of their basename; e.g. for /abs/path/to/basename, I'd
+like to set it to path/to/basename.
+
+Of course, one might consider it's my own problem, as I could
+work around this in my templates, and replace, in the `<title>` meta
+HTML header, `<TMPL_VAR TITLE>` with a `TMPL_LOOP` on `PARENTLINKS`,
+but...
+
+- it's ugly (call it a semantic hijacking if you want); a side-effect
+  of this ugliness is:
+- it defeats any further plugin's (e.g. [[plugins/meta]])
+  attempt to override the default title with a nicer one;
+- all parents appear: there is no way to specify how deep to go up in
+  the parents tree.
+
+So I really want to avoid this ugly workaround.
+
+Looking at `Render.pm`, the second solution I thought of was :
+
+- add a `parents_in_page_title` configuration option (default=0, i.e.
+  the current behaviour);
+- modify `Render.pm` to insert as many parents as possible (up to
+  `N=parents_in_page_title`), separated by '/', in the `title`
+  template parameter, before the actual page basename; I personally
+  would use N=2.
+
+The only problems I can see with this approach are:
+
+- it requires modification of the core, which may not be desirable
+- the resulting title would be unconditionally overridden by the meta
+  plugin, and I can think of no clean solution to make this
+  configurable without hacking [[plugins/meta]], which I'd rather not
+  do; I don't care, but once you add an ad-hoc feature to the core,
+  you can be sure someone will want a more generic version in less than
+  three months ;)
+
+I'm not too convinced writing a plugin for such a small feature isn't
+overdoing it, so I'm tempted to implement this solution in the
+simplest way: the generated title would be the default and could be
+overridden later by plugins.
+
+Joey, what do you think?
+
+(Before starting to write any single line of code, I need to know how
+much you are on the "if you can do it as a plugin, don't ever modify
+the core" side... :)
+
+> My general philosophy is that the core should be flexible enough to allow
+> plugins to do arbitrary useful stuff. And there are some things in-core
+> that I'd like to get out (such as backlinks processing), but that cannot
+> currently be moved out efficiently. KISS is also part of my philosophy.
+>
+> So no, I don't like adding new options to the core that few users will
+> want to use.
+
+In case you're on the hardcore side, I would probably write
+a dedicated plugin, called `genealogictitle` or whatever, and:
+
+- use the pagetemplate hook to modify the `title` template parameter,
+ and maybe set `title_overridden`, as does the meta plugin
+- add a `genealogictitle_depth` configuration option to tell how many
+ parents to display
+- maybe add a `genealogictitle_overrides_meta` or whatever to decide
+ whether a title overridden by [[plugins/meta]] should be overridden
+  by genealogictitle; but anyway, I've not found, in the plugins
+ documentation, any hint about the order in which the plugins are
+ called for a given hook, so the "choose the strongest between meta
+ and genealogictitle" thing might just be more complicated... (no,
+ I did not Read The Nice Source, yet).
+
+-- intrigeri
+
+> Plugin sounds reasonable. --[[Joey]]
+
+>> Well, it seems I once more designed a solution before clearly
+>> defining my problem... What I really need is more generic, can be
+>> done as a plugin, and deserves its own [[todo|pedigree_plugin]], so
+>> I'm tagging this one wontfix^W [[done]]. I'm sorry. -- intrigeri
diff --git a/doc/todo/Allow_change_of_wiki_file_types.mdwn b/doc/todo/Allow_change_of_wiki_file_types.mdwn
new file mode 100644
index 000000000..19574b175
--- /dev/null
+++ b/doc/todo/Allow_change_of_wiki_file_types.mdwn
@@ -0,0 +1,85 @@
+The new [[plugins/rename]] plugin allows files to be renamed, but doesn't seem to allow changing the page type. It would be nice if there was a way to change page type through the web interface.
+
+#### Background
+
+I'm currently moving a couple of projects from [Trac](http://trac.edgewall.org/) to Ikiwiki. I don't want to have to re-do all the wiki formatting at once. Initially I simply imported all the old wiki pages without suffixes. This made them appear on the web as raw un-editable text. I wanted other project members to be able to do the updating to the new markup language, so I then renamed the files to use '.txt' suffixes, and that allows them to be edited. Unfortunately, there is still no way to convert them to '.mdwn' files on the web.
+
+I was hoping that the [[plugins/rename]] plugin would allow web users to change the filename suffix, but it doesn't. This means that the page type can be set on page creation using the web interface, but cannot be changed thereafter. I was thinking the UI would be something like adding the 'Page type' drop-down menu that appears on the creation page to either the edit or rename pages.
+
+#### [[patch]]
+
+ diff --git a/IkiWiki/Plugin/rename.pm b/IkiWiki/Plugin/rename.pm
+ index 527ee88..123b772 100644
+ --- a/IkiWiki/Plugin/rename.pm
+ +++ b/IkiWiki/Plugin/rename.pm
+ @@ -43,7 +43,7 @@ sub check_canrename ($$$$$$$) {
+
+ # Dest checks can be omitted by passing undef.
+ if (defined $dest) {
+ - if ($src eq $dest || $srcfile eq $destfile) {
+ + if ($srcfile eq $destfile) {
+ error(gettext("no change to the file name was specified"));
+ }
+
+ @@ -54,7 +54,7 @@ sub check_canrename ($$$$$$$) {
+ }
+
+ # Must not be a known source file.
+ - if (exists $pagesources{$dest}) {
+ + if ($src ne $dest && exists $pagesources{$dest}) {
+ error(sprintf(gettext("%s already exists"),
+ htmllink("", "", $dest, noimageinline => 1)));
+ }
+ @@ -97,6 +97,24 @@ sub rename_form ($$$) {
+ $f->field(name => "do", type => "hidden", value => "rename", force => 1);
+ $f->field(name => "page", type => "hidden", value => $page, force => 1);
+ $f->field(name => "new_name", value => IkiWiki::pagetitle($page), size => 60);
+ + if (!$q->param("attachment")) {
+ + # insert the standard extensions
+ + my @page_types;
+ + if (exists $IkiWiki::hooks{htmlize}) {
+ + @page_types=grep { !/^_/ }
+ + keys %{$IkiWiki::hooks{htmlize}};
+ + }
+ +
+ + # make sure the current extension is in the list
+ + my ($ext) = $pagesources{$page}=~/\.([^.]+)$/;
+ + if (! $IkiWiki::hooks{htmlize}{$ext}) {
+ + unshift(@page_types, $ext);
+ + }
+ +
+ + $f->field(name => "type", type => 'select',
+ + options => \@page_types,
+ + value => $ext, force => 1);
+ + }
+ $f->field(name => "attachment", type => "hidden");
+
+ return $f, ["Rename", "Cancel"];
+ @@ -223,12 +241,19 @@ sub sessioncgi ($$) {
+ my $dest=IkiWiki::possibly_foolish_untaint(IkiWiki::titlepage($q->param("new_name")));
+
+ # The extension of dest is the same as src if it's
+ - # a page. If it's an extension, the extension is
+ + # a page. If it's an attachment, the extension is
+ # already included.
+ my $destfile=$dest;
+ if (! $q->param("attachment")) {
+ - my ($ext)=$srcfile=~/(\.[^.]+)$/;
+ - $destfile.=$ext;
+ + my $type=$q->param('type');
+ + if (defined $type && length $type && $IkiWiki::hooks{htmlize}{$type}) {
+ + $type=IkiWiki::possibly_foolish_untaint($type);
+ + } else {
+ + my ($ext)=$srcfile=~/\.([^.]+)$/;
+ + $type=$ext;
+ + }
+ +
+ + $destfile.=".".$type;
+ }
+
+ check_canrename($src, $srcfile, $dest, $destfile,
+
+-- [[users/Will]]
+
+Thanks, fixed a few bugs and applied. --[[Joey]]
+[[done]]
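The extension-choosing logic in the patch can be illustrated in Python (a hedged sketch; function and parameter names are invented for illustration):

```python
import re

# Sketch of the patch's logic: use the user-chosen page type if it is a
# registered htmlize type, otherwise keep the source file's extension.
def dest_filename(dest, srcfile, chosen_type, htmlize_types):
    if chosen_type and chosen_type in htmlize_types:
        ext = chosen_type
    else:
        m = re.search(r"\.([^.]+)$", srcfile)
        ext = m.group(1) if m else ""
    return dest + "." + ext
```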
diff --git a/doc/todo/Allow_disabling_edit_and_preferences_links.mdwn b/doc/todo/Allow_disabling_edit_and_preferences_links.mdwn
new file mode 100644
index 000000000..17f45dda6
--- /dev/null
+++ b/doc/todo/Allow_disabling_edit_and_preferences_links.mdwn
@@ -0,0 +1,81 @@
+This patch allows disabling the edit and preferences links in the config file. It is backwards compatible (so people's edit and preferences links won't suddenly vanish).
+
+To disable edit or prefs respectively, add the following to the config file:
+
+<pre>
+ 'edit' => 0,
+ 'prefs' => 0,
+</pre>
+
+Patch:
+<pre>
+--- /usr/share/perl5/IkiWiki/Render.pm.orig 2008-12-23 16:49:00.000000000 +1300
++++ /usr/share/perl5/IkiWiki/Render.pm 2008-12-23 16:55:40.000000000 +1300
+@@ -80,8 +80,10 @@
+ my $actions=0;
+
+ if (length $config{cgiurl}) {
+- $template->param(editurl => cgiurl(do => "edit", page => $page));
+- $template->param(prefsurl => cgiurl(do => "prefs"));
++ $template->param(editurl => cgiurl(do => "edit", page => $page))
++ if ! defined $config{edit} || (defined $config{edit} && $config{edit} == 1);
++ $template->param(prefsurl => cgiurl(do => "prefs"))
++ if ! defined $config{prefs} || (defined $config{prefs} && $config{prefs} == 1);
+ $actions++;
+ }
+
+</pre>
+
+> On irc, you said, "That was to allow the hack of using wikistatedir to
+> allow me to generate two websites, one with inline editing, the other a
+> static page for public consumption."
+>
+> The edit and preferences links can already be disabled by editing
+> `page.tmpl`. (Look for PREFSURL and EDITURL).
+>
+> More to the point though, disabling those links does not disable anyone
+> constructing the urls by hand and logging in and editing a page. So you'd
+> really want to disable the editpage plugin in the setup file for the
+> public, static wiki. Sounds like you might also want to turn off cgi
+> entirely for that build. --[[Joey]]
+
+>> I want to retain the same page.tmpl for both sites (different templates
+>> will just increase the maintenance hell), so disabling the links in the
+>> config for one public site works better in my case.
+>>
+>> I do have the editpage plugin disabled for the public static wiki, but
+>> the link still appears on the site. I want to keep the cgi on, so that
+>> the site is still searchable. --[[puck]]
+
+>>> For me, disabling the editpage plugin does make the "Edit" link
+>>> disappear (this is with 3.03) but as far as I can tell, "Preferences"
+>>> is not controlled by any plugin. It would be nice if it were; I am
+>>> trying to achieve a configuration where the only action supported
+>>> via CGI is blog-style comments. --[Zack](http://zwol.livejournal.com/)
+
+>>> Like [[puck]], I'd like to keep search available but I want to disable all
+>>> login facilities and thus disable the "Preferences" link.
+>>>
+>>> After digging a little bit in the source code, my first attempt was to make
+>>> the "Preferences" link appear only if there is `sessioncgi` hooks
+>>> registered. But this will not work as the [[plugins/inline]] plugin also
+>>> defines it.
+>>>
+>>> Looking for `auth` hooks currently would not work as at least
+>>> [[plugins/passwordauth]] does not register one.
+>>>
+>>> Adding a new `canlogin` hook looks like overkill to me. [[Joey]], how
+>>> about making registration of the `auth` hook mandatory for all plugins
+>>> that make the "Preferences" link meaningful? --[[Lunar]]
+
+>>>> Hmm, using the `auth` hook existence does seem like a nice solution.
+>>>> While splitting the preferences code out into its own plugin is
+>>>> easily enough done, it has the minor problem of being yet another
+>>>> file nearly all ikiwikis will have to load, and also, prefs would
+>>>> have to be disabled manually. So I like that using the hook would
+>>>> cause it to auto-disable if nothing uses it. It's a bit ugly that
+>>>> passwordauth doesn't need an auth hook (it could be reorged to
+>>>> use it instead of formbuilder, maybe) and would probably just have an
+>>>> empty one. Thanks for the idea. --[[Joey]] [[done]]
+
+>>>>> Thanks for implementing it! --[[Lunar]]
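The behaviour settled on above can be sketched as follows (hook names are from the discussion; the data layout is invented for illustration):

```python
# Show the Preferences link only when at least one plugin registered an
# "auth" hook; sessioncgi hooks alone (e.g. inline's) don't count.
def show_prefs_link(hooks):
    return bool(hooks.get("auth"))
```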
diff --git a/doc/todo/Allow_edittemplate_to_set_file_type.mdwn b/doc/todo/Allow_edittemplate_to_set_file_type.mdwn
new file mode 100644
index 000000000..1b99a4e05
--- /dev/null
+++ b/doc/todo/Allow_edittemplate_to_set_file_type.mdwn
@@ -0,0 +1,44 @@
+Below is a [[patch]] to [[plugins/edittemplate]] that does a few things:
+
+ * It defaults the type of the file to be created to the same type as the template.
+ * It adds a 'silent' parameter to the directive that stops it from printing out what was registered.
+ * It makes the description of what was registered link to the template page (which gives feedback for typos or allows template creation)
+ * It adds a colon to the standard string, correcting the syntax.
+
+[[done]] except for the colon change; it's referring to the template as an
+edittemplate there. --[[Joey]]
+
+----
+
+ diff --git a/IkiWiki/Plugin/edittemplate.pm b/IkiWiki/Plugin/edittemplate.pm
+ index 98308de..c381940 100644
+ --- a/IkiWiki/Plugin/edittemplate.pm
+ +++ b/IkiWiki/Plugin/edittemplate.pm
+ @@ -56,8 +56,14 @@ sub preprocess (@) {
+
+ $pagestate{$params{page}}{edittemplate}{$params{match}}=$params{template};
+
+ - return sprintf(gettext("edittemplate %s registered for %s"),
+ - $params{template}, $params{match});
+ + return "" if ($params{silent} && IkiWiki::yesno($params{silent}));
+ +
+ + my $link=IkiWiki::linkpage($params{template});
+ + add_depends($params{page}, $link);
+ + my $linkHTML = htmllink($params{page}, $params{destpage}, $link);
+ +
+ + return sprintf(gettext("edittemplate: %s registered for %s"),
+ + $linkHTML, $params{match});
+ }
+
+ sub formbuilder (@) {
+ @@ -89,6 +95,9 @@ sub formbuilder (@) {
+ if (pagespec_match($p, $pagespec, location => $registering_page)) {
+ $form->field(name => "editcontent",
+ value => filltemplate($pagestate{$registering_page}{edittemplate}{$pagespec}, $page));
+ + $form->field(name => "type",
+ + value => pagetype($pagesources{$pagestate{$registering_page}{edittemplate}{$pagespec}}))
+ + if $pagesources{$pagestate{$registering_page}{edittemplate}{$pagespec}};
+ return;
+ }
+ }
+
diff --git a/doc/todo/Allow_filenames_that_are_all_type.mdwn b/doc/todo/Allow_filenames_that_are_all_type.mdwn
new file mode 100644
index 000000000..bebbcafa8
--- /dev/null
+++ b/doc/todo/Allow_filenames_that_are_all_type.mdwn
@@ -0,0 +1,41 @@
+This is a [[patch]] to allow filenames that are just the type. The best example of this is wanting to
+pass a `Makefile` through one of the [[todo/syntax_highlighting/]] plugins. With this patch,
+if the plugin can process files of type `.Makefile` then it will also process `Makefile`.
+
+I put this patch on the [[todo/syntax_highlighting/]] page a while ago, but it seemed to get
+lost because it didn't have its own bug to track it. Now it does :). -- [[Will]]
+
+> This changes `pagename()`, but what about `pagetype()`?
+> Many things in ikiwiki check if `pagetype($file)` returns
+> true to see if it's a page, etc. --[[Joey]]
+
+>> I think this patch is complete. It does not change `pagename()`, it
+>> changes `pagetype()` (the diff is fairly old - line numbers may have
+>> changed).
+>>
+>> Before this patch, `pagetype()` required a `.` in the page name. With
+>> this patch it doesn't, as long as the extension is being kept. This allows
+>> the filename to be all extension. `pagename()` relies on `pagetype()`
+>> to detect the type. `pagename()` also removes the extension on some
+>> pages, but this patch only affects pages where the extension isn't
+>> removed.
+>>
+>> So, yeah, I think this patch is complete. :) -- [[Will]]
+
+>>> Thanks, [[applied|done]], but I added a noextension parameter,
+>>> since having keepextension allow files with no extension didn't make
+>>> sense. Also, made it work for pages in subdirs.. --[[Joey]]
+
+ diff --git a/IkiWiki.pm b/IkiWiki.pm
+ index 8d728c9..1bd46a9 100644
+ --- a/IkiWiki.pm
+ +++ b/IkiWiki.pm
+ @@ -618,6 +618,8 @@ sub pagetype ($) {
+
+ if ($page =~ /\.([^.]+)$/) {
+ return $1 if exists $hooks{htmlize}{$1};
+ + } elsif ($hooks{htmlize}{$page}{keepextension}) {
+ + return $page;
+ }
+ return;
+ }
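The patched behaviour can be sketched in Python (a hedged rendering of the patch above; Joey's applied version differs, using a noextension parameter and handling subdirectories, and the hook table here is a stand-in):

```python
import re

# Stand-in for IkiWiki's htmlize hooks: "Makefile" is registered with
# keepextension, as a syntax-highlighting plugin might do.
htmlize = {"mdwn": {}, "Makefile": {"keepextension": True}}

def pagetype(page):
    # A filename with an extension is typed by that extension; a
    # filename that *is* a registered keepextension type is its own type.
    m = re.search(r"\.([^.]+)$", page)
    if m:
        return m.group(1) if m.group(1) in htmlize else None
    if htmlize.get(page, {}).get("keepextension"):
        return page
    return None
```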
diff --git a/doc/todo/Allow_per-page_template_selection.mdwn b/doc/todo/Allow_per-page_template_selection.mdwn
new file mode 100644
index 000000000..231ccf502
--- /dev/null
+++ b/doc/todo/Allow_per-page_template_selection.mdwn
@@ -0,0 +1,43 @@
+It'd be nice to be able to specify an alternate template file to be
+used for some pages. For example, I'd like most of my pages to use
+page.tmpl but I want my front page to be formatted in some unique way,
+so I'd like it to use a separate front.tmpl template instead.
+
+I'm not sure what syntax to use for this, (\[[template]] seems to be
+taken for another purpose already). Perhaps something like
+\[[page-template front]] ?).
+
+Joey provided a nice suggestion for implementing this feature, ("I
+would probably add a hook that allowed overriding the default template
+construction and returning a template object"). I did start looking
+into that, but finally I wimped out and just put the following hack
+into the `genpage()` function in Render.pm:
+
+
+ if ($page eq 'index') {
+ $template->param(suppresstitle => 1);
+ }
+
+That lets me use a `<TMPL_UNLESS SUPPRESSTITLE>` in my template to get
+the effect I want. I don't think that's anything that upstream should
+pick-up as is, (maybe with an appropriate configuration option, but
+then again allowing for per-page template selection would be more
+powerful anyway). But I'm happy enough now that I probably won't
+pursue implementing this feature further myself.
+
+But I'd still happily switch to using this feature if someone were to
+implement it.
+
+UPDATE: My latest hack is as follows:
+
+ if ($page eq 'index') {
+ $template->param(toplevelindex => 1);
+ }
+
+And that's something that I'm not actually embarrassed to suggest
+could be accepted upstream. Joey, what do you think? And would a
+proper patch be helpful (it really just adds those lines to the right
+place)?
+
+> Fully implemented as a templatefile hook and a [[plugins/pagetemplate]]
+> plugin. --[[Joey]] [[done]]
diff --git a/doc/todo/Allow_web_edit_form_comment_field_to_be_mandatory.mdwn b/doc/todo/Allow_web_edit_form_comment_field_to_be_mandatory.mdwn
new file mode 100644
index 000000000..da68b04c2
--- /dev/null
+++ b/doc/todo/Allow_web_edit_form_comment_field_to_be_mandatory.mdwn
@@ -0,0 +1,22 @@
+[[!tag wishlist]]
+[[!tag patch]]
+
+In our team internal wiki, we wish to impose a policy that all edits must have a comment. Patch in [[!debbug 450620]].
+
+> Good idea! I also hate empty commit comments, but I know that it's also a matter
+> of human mentality. Of course, you can forbid users from committing empty comments,
+> but then they can commit worthless comments like "\*" or "\* blah". --[[Paweł|ptecza]]
+
+> I don't feel this belongs in ikiwiki core, but would accept a plugin that
+> does it. I think that can be done using a formbuilder_setup hook.
+> BTW, it would probably be better to validate against a `comment_regexp`,
+> so that when the evil admins notice that /.+/ is absurdly easy for users
+> to get around (by using " "), they can tighten it down. --[[Joey]]
+
+> Sorry for being dense, but I don't see a way to modify the template from within
+> a plugin, without providing a whole new template for editform, which obviously
+> isn't a workable solution. If the template was modified to allow overriding the
+> portion of the message in question, then I agree that a plugin could do the rest.
+> Thoughts appreciated :) --[[Dom]]
+
+> Yes, modifying the template is ok. --[[Joey]]
diff --git a/doc/todo/Attempt_to_extend_Mercurial_backend_support.mdwn b/doc/todo/Attempt_to_extend_Mercurial_backend_support.mdwn
new file mode 100644
index 000000000..8ded94393
--- /dev/null
+++ b/doc/todo/Attempt_to_extend_Mercurial_backend_support.mdwn
@@ -0,0 +1,258 @@
+Using the Mercurial backend, the lack of `rcs_commit_staged` is noticed
+frequently. I couldn't find any attempts to update `mercurial.pm`, so not
+letting lack of Mercurial AND Perl knowledge bring me down, I copy-pasted
+from `git.pm` to mimic its behaviour from a Mercurial perspective. I hope
+it can be a foundation for development by those more proficient in
+ikiwiki's inner workings. I have doubts that I personally will be able to
+revise it more, based on my Perl skills.
+
+I've tested it briefly. `ikiwiki-calendar` and posting of comments now
+works with automatic commits, i.e. the `rcs_commit_staged` function works
+in those cases. Under my current setup, I don't know where else to expect
+it to work. I would be flabbergasted if there weren't any problems with it,
+though.
+
+> Absolutely, the [[/rcs]] chart shows mercurial is lagging behind
+> nearly everything.
+>
+> I don't think this stuff is hard, or unlikely to work, familiarity with
+> the rcs's particular details is the main thing. --[[Joey]]
+
+Diff follows, for anyone to annotate. First code version is also available at [my hg-repo](http://510x.se/hg/program/ikiwiki/file/e741fcfd800f/Plugin/mercurial.pm). Latest version should be [here](http://46.239.104.5:81/hg/program/ikiwiki/file/tip/Plugin/mercurial.pm) ([raw format](http://46.239.104.5:81/hg/program/ikiwiki/raw-file/tip/Plugin/mercurial.pm)). I'll add "*Done*" remarks on this page when I've actually committed changes to my local repository. I don't know if I should replace the code and the comments below when I've changed something. I'll probably do this when the code feels more mature. --[[Daniel Andersson]]
+
+> I've looked over the current version and it looks ok to me. --[[Joey]]
+
+>> I changed the commit messages recorded by `mercurial.pm` and the `rcs_recentchanges` logic to include more information, to emulate the `git.pm` behaviour regarding name presentation on RecentChanges. I don't have anything more to add at the moment, so if the code passes review, I'm done, and I tag this page as "patch". [Final patch version as per this page at my hg repo](http://510x.se/hg/program/ikiwiki/file/bc0e2f838fe3/Plugin/mercurial.pm) ([raw format](http://46.239.104.5:81/hg/program/ikiwiki/raw-file/bc0e2f838fe3/Plugin/mercurial.pm)). I keep the conversation below for reference, but it's mostly outdated. --[[Daniel Andersson]]
+
+[[merged|done]] --[[Joey]]
+
+[[!tag patch]]
+
+***
+
+ diff -r 20c61288d7bd Plugin/mercurial.pm
+ --- a/Plugin/mercurial.pm Fri Jul 15 02:55:12 2011 +0200
+ +++ b/Plugin/mercurial.pm Fri Jul 15 03:29:10 2011 +0200
+ @@ -7,6 +7,8 @@
+ use Encode;
+ use open qw{:utf8 :std};
+
+ +my $hg_dir=undef;
+ +
+ sub import {
+ hook(type => "checkconfig", id => "mercurial", call => \&checkconfig);
+ hook(type => "getsetup", id => "mercurial", call => \&getsetup);
+
+A corresponding variable is declared for git. It is unused as of yet for
+Mercurial, but when more advanced merge features become available for
+`mercurial.pm`, I think it will come into play.
+
+> Maybe.. I'd rather avoid unused cruft though. --[[Joey]]
+
+>> OK, will be removed. *Done* --[[Daniel Andersson]]
+
+ @@ -69,6 +71,62 @@
+ },
+ }
+
+ +sub safe_hg (&@) {
+ + # Start a child process safely without resorting to /bin/sh.
+ + # Returns command output (in list context) or success state
+ + # (in scalar context), or runs the specified data handler.
+ +
+ + my ($error_handler, $data_handler, @cmdline) = @_;
+ +
+ + my $pid = open my $OUT, "-|";
+ +
+ + error("Cannot fork: $!") if !defined $pid;
+ +
+ + if (!$pid) {
+ + # In child.
+ + # hg commands want to be in wc.
+ + if (! defined $hg_dir) {
+ + chdir $config{srcdir}
+ + or error("cannot chdir to $config{srcdir}: $!");
+ + }
+ + else {
+ + chdir $hg_dir
+ + or error("cannot chdir to $hg_dir: $!");
+ + }
+
+> How can this possibly work, since `$hg_dir` is not set? The code
+> that is being replaced seems to use `-R` to make hg use the right
+> directory. If it worked for you without specifying the directory,
+> it's quite likely this is the place the patch fails in some
+> unusual circumstance.. but I think it could easily use `-R` here.
+
+>> It works since `if (! defined $hg_dir)` always hits, and `chdir $config{srcdir}` is well defined. The whole logic is just used in `git.pm` for merge functionality that is not present in `mercurial.pm`, so by the cruft argument above, this should be replaced with just `chdir $config{srcdir}` (which is equivalent to `hg -R` or `hg --cwd` from what I know). Will be removed. *Done* --[[Daniel Andersson]]
+
+ + exec @cmdline or error("Cannot exec '@cmdline': $!");
+ + }
+ + # In parent.
+ +
+ + # hg output is probably utf-8 encoded, but may contain
+ + # other encodings or invalidly encoded stuff. So do not rely
+ + # on the normal utf-8 IO layer, decode it by hand.
+ + binmode($OUT);
+
+> Is this actually true for hg?
+
+>> I don't know. ["hg stores everything internally as UTF-8, except for pathnames"](https://jira.atlassian.com/browse/FE-3198), but output is dependent on the system's locale. The environment variable `HGENCODING=utf-8` can be set to ensure that Mercurial's own output is always UTF-8, but when viewing a diff containing non-UTF-8 changes, the affected lines are nevertheless output in their original encoding. I personally think that this is the correct way to output it, though, unless there is a possibility that someone is running ikiwiki with a non-UTF-8 locale.
+
+>>> *Done*. I removed the `encode_utf8()` part and instead set `HGENCODING=utf-8` where the external `hg` command was called. It seems to have taken care of "all" character encoding issues (but it is an almost infinite error pool to draw from, so some problem might pop up). --[[Daniel Andersson]]
+
+ + my @lines;
+ + while (<$OUT>) {
+ + $_=decode_utf8($_, 0);
+ +
+ + chomp;
+ +
+ + if (! defined $data_handler) {
+ + push @lines, $_;
+ + }
+ + else {
+ + last unless $data_handler->($_);
+ + }
+ + }
+ +
+ + close $OUT;
+ +
+ + $error_handler->("'@cmdline' failed: $!") if $? && $error_handler;
+ +
+ + return wantarray ? @lines : ($? == 0);
+ +}
+ +# Convenient wrappers.
+ +sub run_or_die ($@) { safe_hg(\&error, undef, @_) }
+ +sub run_or_cry ($@) { safe_hg(sub { warn @_ }, undef, @_) }
+ +sub run_or_non ($@) { safe_hg(undef, undef, @_) }
+ +
+ sub mercurial_log ($) {
+ my $out = shift;
+ my @infos;
+ @@ -116,10 +174,7 @@
+ }
+
+ sub rcs_update () {
+ - my @cmdline = ("hg", "-q", "-R", "$config{srcdir}", "update");
+ - if (system(@cmdline) != 0) {
+ - warn "'@cmdline' failed: $!";
+ - }
+ + run_or_cry('hg', '-q', 'update');
+ }
+
+ sub rcs_prepedit ($) {
+
+With the `run_or_{die,cry,non}()` functions defined as in `git.pm`, some old Mercurial functions can be rewritten more compactly.
+
+ @@ -129,6 +184,14 @@
+ sub rcs_commit (@) {
+ my %params=@_;
+
+ + return rcs_commit_helper(@_);
+ +}
+ +
+ +sub rcs_commit_helper (@) {
+ + my %params=@_;
+ +
+ + my %env=%ENV;
+
+> This `%env` stash is unused; `%ENV` is never modified.
+
+>> Yes, the code doesn't set the username at all; the local `hgrc` file is always used. I'll add setting of `$ENV{HGUSER}`. *Done* --[[Daniel Andersson]]
+
+ +
+ my $user="Anonymous";
+ if (defined $params{session}) {
+ if (defined $params{session}->param("name")) {
+
+Here comes the `rcs_commit{,_staged}` part. It is modeled on a `rcs_commit_helper` function, as in `git.pm`.
+
+Some old `mercurial.pm` logic concerning the committer name is kept instead of transplanting the more elaborate logic from `git.pm`. Maybe it is better to "steal" that as well.
+
+> Exactly how to encode the nickname from openid in the commit metadata,
+> and get it back out in `rcs_recentchanges`, would probably vary from git.
+
+>> Yes, right now the long and ugly OpenID strings, e.g. `https://www.google.com/accounts/o8/id?id=AItOawmUIes3yDLfQME0uvZvJKDN0NsdKPx_PTw`, get recorded as the author and are shown as `id [www.google.com/accounts/o8]` in RecentChanges. I see that here on `ikiwiki.info`, my commits, identified by OpenID, are shown as authored by simply `Daniel`. I'll look into it. --[[Daniel Andersson]]
+
+>>> I adapted some logic from `git.pm`. `hg` only has a single committer name field, whereas `git` has both `GIT_AUTHOR_NAME` and `GIT_AUTHOR_EMAIL`. The behaviour can be emulated by encoding nick and commit medium into the committer name as "`https://www.google.com/accounts/o8/id?id=AItOawmUIes3yDLfQME0uvZvJKDN0NsdKPx_PTw <Daniel@web>`" and parsing this out as necessary when `rcs_recentchanges` is called. *Done* --[[Daniel Andersson]]
+
+ @@ -143,43 +206,45 @@
+ $params{message} = "no message given";
+ }
+
+ - my @cmdline = ("hg", "-q", "-R", $config{srcdir}, "commit",
+ - "-m", IkiWiki::possibly_foolish_untaint($params{message}),
+ - "-u", IkiWiki::possibly_foolish_untaint($user));
+ - if (system(@cmdline) != 0) {
+ - warn "'@cmdline' failed: $!";
+ + $params{message} = IkiWiki::possibly_foolish_untaint($params{message});
+ +
+ + my @opts;
+ +
+ + if (exists $params{file}) {
+ + push @opts, '--', $params{file};
+ }
+ -
+ + # hg commit returns non-zero if nothing really changed.
+ + # So we should ignore its exit status (hence run_or_non).
+ + run_or_non('hg', 'commit', '-m', $params{message}, '-q', @opts);
+ +
+ + %ENV=%env;
+ return undef; # success
+ }
+
+ sub rcs_commit_staged (@) {
+ # Commits all staged changes. Changes can be staged using rcs_add,
+ # rcs_remove, and rcs_rename.
+ - my %params=@_;
+ -
+ - error("rcs_commit_staged not implemented for mercurial"); # TODO
+ + return rcs_commit_helper(@_);
+ }
+
+ sub rcs_add ($) {
+ my ($file) = @_;
+
+ - my @cmdline = ("hg", "-q", "-R", "$config{srcdir}", "add", "$config{srcdir}/$file");
+ - if (system(@cmdline) != 0) {
+ - warn "'@cmdline' failed: $!";
+ - }
+ + run_or_cry('hg', 'add', $file);
+ }
+
+ sub rcs_remove ($) {
+ + # Remove file from archive.
+ +
+ my ($file) = @_;
+
+ - error("rcs_remove not implemented for mercurial"); # TODO
+ + run_or_cry('hg', 'remove', '-f', $file);
+ }
+
+ sub rcs_rename ($$) {
+ my ($src, $dest) = @_;
+
+ - error("rcs_rename not implemented for mercurial"); # TODO
+ + run_or_cry('hg', 'rename', '-f', $src, $dest);
+ }
+
+ sub rcs_recentchanges ($) {
+
+> Remainder seems ok to me. Should probably test that the remove plugin
+> works, since this implements `rcs_remove` too and you didn't mention
+> any tests that would run that code path. --[[Joey]]
+
+>> I tested `rename`. It fails if the page title includes e.g. åäö. Trying to rename a page from "title without special chars" to "title with åäö" produces the following in `/var/log/apache2/error.log`:
+
+ [Fri Jul 15 14:58:17 2011] [error] [client 46.239.104.5] transaction abort!, referer: http://46.239.104.5:81/blog/ikiwiki.cgi
+ [Fri Jul 15 14:58:17 2011] [error] [client 46.239.104.5] rollback completed, referer: http://46.239.104.5:81/blog/ikiwiki.cgi
+ [Fri Jul 15 14:58:17 2011] [error] [client 46.239.104.5] abort: decoding near 'itle_with_\xc3\xa5\xc3\xa4\xc3\xb6.mdw': 'ascii' codec can't decode byte 0xc3 in position 66: ordinal not in range(128)!, referer: http://46.239.104.5:81/blog/ikiwiki.cgi
+ [Fri Jul 15 14:58:17 2011] [error] [client 46.239.104.5] 'hg commit -m rename posts/title_without_special_chars.mdwn to posts/title_with_\xc3\xa5\xc3\xa4\xc3\xb6.mdwn -q' failed: at /usr/share/perl5/IkiWiki/Plugin/mercurial.pm line 123., referer: http://46.239.104.5:81/blog/ikiwiki.cgi
+
+>>> I added setting the environment variable `HGENCODING=utf-8` in `rcs_commit_helper`, which took care of these problems. *Done* --[[Daniel Andersson]]
+
+>> When this has happened, a directly following `rename` or `remove` doesn't work as it should, since the file is not committed. `hg remove -f` doesn't physically remove files that aren't tracked (no `hg` command does). Perhaps a regular `unlink $file` should be called to not clutter the source dir if `hg remove` failed because the file wasn't tracked. --[[Daniel Andersson]]
+
+>> I've also noted that when a post is added or removed, the commit message lacks the page title. It contains the title when the page is renamed though, so it should be an easy fix. I'll look into it. --[[Daniel Andersson]]
+
+>>> This is to do with `{rename,remove,editchanges}.pm`. The last two simply don't give a message to `rcs_commit_staged`. Separate issue. --[[Daniel Andersson]]
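+
+A sketch of that `unlink` fallback, reusing the `run_or_cry` wrapper from
+above (untested, illustration only):
+
+    sub rcs_remove ($) {
+        # Remove file from archive.
+        my ($file) = @_;
+
+        # In scalar context run_or_cry returns the success state, and
+        # hg remove -f won't delete untracked files, so clean up by
+        # hand when the removal fails.
+        if (! run_or_cry('hg', 'remove', '-f', $file)) {
+            unlink("$config{srcdir}/$file");
+        }
+    }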
diff --git a/doc/todo/Auto-setup_and_maintain_Mercurial_wrapper_hooks.mdwn b/doc/todo/Auto-setup_and_maintain_Mercurial_wrapper_hooks.mdwn
new file mode 100644
index 000000000..94212966a
--- /dev/null
+++ b/doc/todo/Auto-setup_and_maintain_Mercurial_wrapper_hooks.mdwn
@@ -0,0 +1,240 @@
+Attempt to fix a `TODO` in `Automator.pm` in combination with the Mercurial backend.
+
+1. To define hooks, Mercurial uses paths given in the config file `.hg/hgrc`. To enable Mercurial to call `ikiwiki-wrapper` automatically after blog/wiki setup, ikiwiki thus needs to create `hgrc`.
+2. To reflect changes in `$config{srcdir}` and/or `$config{mercurial_wrapper}`, relevant lines in `hgrc` need to be updated on wrapper creation.
+
+ikiwiki can keep track of lines in `hgrc` for which it is responsible by adding a `.ikiwiki` suffix to its hooks. This is correct and recommended markup, Mercurial-wise.
+
+Two ways follow below. I prefer the long one. --[[Daniel Andersson]]
+
+> Replying to myself: this can probably be solved without adding ad-hoc hooks and stuff (maybe not as "correct" as changing the config file only directly after wrappers have been generated, but good enough). I have a large rewrite of `mercurial.pm` ready, currently under local testing before I upload it for comments, trying to make it equal in function to `git.pm`. Comments below are of course welcome, but I will look into other ways of solving it later. Maybe `rcs_checkconfig` or `rcs_genwrapper` should host the `hgrc`-changing code. --[[Daniel Andersson]]
+
+>> Having a hook that runs after a wrapper is generated may well be a good
+>> thing anyway. In ikiwiki-hosting, there are some genwrapper hooks
+>> that don't add any code to the wrapper, but are there only to run at,
+>> essentially, that time.
+>>
+>> With that said, here it seems like unnecessary complexity.
+>> Why is `mercurial_wrapper` configurable at all? Why not just always
+>> write it to a specific place relative to the srcdir, and always make
+>> the hgrc look there?
+>>
+>> (Other rcs plugins have good reasons to make their wrappers
+>> configurable, because one might want the wrapper to run as a git
+>> post-update or post-commit hook.) --[[Joey]]
+
+Compact way (addresses only point 1)
+------------------------------------
+[This patch at pastebin](http://pastebin.com/by9f4dwX) ([raw version](http://pastebin.com/raw.php?i=by9f4dwX)).
+
+Set default `ikiwiki-wrapper` path.
+
+ diff -r 8faf136ca94f Setup/Automator.pm
+ --- a/Setup/Automator.pm Tue Jul 19 21:04:13 2011 +0200
+ +++ b/Setup/Automator.pm Wed Jul 20 15:33:21 2011 +0200
+ @@ -75,8 +75,7 @@
+ print STDERR "warning: do not know how to set up the bzr_wrapper hook!\n";
+ }
+ elsif ($config{rcs} eq 'mercurial') {
+ - # TODO
+ - print STDERR "warning: do not know how to set up the mercurial_wrapper hook!\n";
+ + $config{mercurial_wrapper}=$config{srcdir}."/.hg/ikiwiki-wrapper";
+ }
+ elsif ($config{rcs} eq 'tla') {
+ # TODO
+
+Create `$config{srcdir}/.hg/hgrc` with hook info during the auto-installation script. Use relative paths to avoid requiring manual `hgrc` intervention if `$config{srcdir}` is changed. If `$config{mercurial_wrapper}` is changed, a manual edit of `hgrc` is needed to catch the new wrapper path.
+
+(Is there a security risk with relative paths?)
+
+> The code seems to assume that hg will be run from within the srcdir,
+> specifically the top of the srcdir. If it's run from somewhere else,
+> even a subdirectory, this will fail to find the wrapper, or could
+> run some other program. Unless mercurial always interprets these paths
+> as relative to the top of the repository? --[[Joey]]
+
+ @@ -187,6 +186,22 @@
+ die "ikiwiki --wrappers --setup $config{dumpsetup} failed";
+ }
+
+ + # Setup initial config file for Mercurial to hook up the wrapper.
+ + if ($config{rcs} eq 'mercurial' && exists $config{mercurial_wrapper}
+ + && length $config{mercurial_wrapper}) {
+ + # Use a relative path to avoid having to manually change the
+ + # autogenerated hgrc if the user changes $config{srcdir}.
+ + use File::Spec;
+ + my $mercurial_wrapper_relpath=File::Spec->abs2rel($config{mercurial_wrapper}, $config{srcdir});
+ + open (HGRC, '>', $config{srcdir}.'/.hg/hgrc');
+ + print HGRC <<EOF;
+ +[hooks]
+ +post-commit.ikiwiki = $mercurial_wrapper_relpath
+ +incoming.ikiwiki = $mercurial_wrapper_relpath
+ +EOF
+ + close (HGRC);
+ + }
+ +
+ # Add it to the wikilist.
+ mkpath("$ENV{HOME}/.ikiwiki");
+ open (WIKILIST, ">>$ENV{HOME}/.ikiwiki/wikilist") || die "$ENV{HOME}/.ikiwiki/wikilist: $!";
+
+
+Less compact but more robust way (addresses point 1 and 2)
+----------------------------------------------------------
+[This complete patch at pastebin](http://pastebin.com/AcDHjbK6) ([raw version](http://pastebin.com/raw.php?i=AcDHjbK6)).
+
+This approach spills over into additional files and adds general functionality that may or may not be wanted. The main part of the extra code is contained within `mercurial.pm`, though.
+
+Set default `ikiwiki-wrapper` path.
+
+ diff -r b08179653c00 IkiWiki/Setup/Automator.pm
+ --- a/IkiWiki/Setup/Automator.pm Wed Jul 20 16:56:09 2011 +0200
+ +++ b/IkiWiki/Setup/Automator.pm Wed Jul 20 19:28:21 2011 +0200
+ @@ -75,8 +75,7 @@
+ print STDERR "warning: do not know how to set up the bzr_wrapper hook!\n";
+ }
+ elsif ($config{rcs} eq 'mercurial') {
+ - # TODO
+ - print STDERR "warning: do not know how to set up the mercurial_wrapper hook!\n";
+ + $config{mercurial_wrapper}=$config{srcdir}."/.hg/ikiwiki-wrapper";
+ }
+ elsif ($config{rcs} eq 'tla') {
+ # TODO
+
+Create `$config{srcdir}/.hg/hgrc` with hook info during the auto-installation.
+
+ @@ -182,6 +181,19 @@
+ }
+ }
+
+ + # Setup initial config file for Mercurial to hook up the wrapper. The
+ + # path to the wrapper will be automatically added when it is generated.
+ + if ($config{rcs} eq 'mercurial' && exists $config{mercurial_wrapper}
+ + && length $config{mercurial_wrapper}) {
+ + open (HGRC, '>', $config{srcdir}.'/.hg/hgrc');
+ + print HGRC <<EOF;
+ +[hooks]
+ +post-commit.ikiwiki =
+ +incoming.ikiwiki =
+ +EOF
+ + close (HGRC);
+ + }
+ +
+ # Add wrappers, make live.
+ if (system("ikiwiki", "--wrappers", "--setup", $config{dumpsetup}) != 0) {
+ die "ikiwiki --wrappers --setup $config{dumpsetup} failed";
+
+`hgrc` is set up initially. Below follows code to keep `hgrc` updated.
+
+Add the backend-specific function `rcs_wrapper_postcall()` for a later call in `Wrapper.pm`.
+
+ diff -r b08179653c00 IkiWiki/Plugin/mercurial.pm
+ --- a/IkiWiki/Plugin/mercurial.pm Wed Jul 20 16:56:09 2011 +0200
+ +++ b/IkiWiki/Plugin/mercurial.pm Wed Jul 20 19:28:21 2011 +0200
+ @@ -21,6 +21,7 @@
+ hook(type => "rcs", id => "rcs_diff", call => \&rcs_diff);
+ hook(type => "rcs", id => "rcs_getctime", call => \&rcs_getctime);
+ hook(type => "rcs", id => "rcs_getmtime", call => \&rcs_getmtime);
+ + hook(type => "rcs", id => "rcs_wrapper_postcall", call => \&rcs_wrapper_postcall);
+ }
+
+ sub checkconfig () {
+
+Pass a variable to `gen_wrapper()` to decide if `rcs_wrapper_postcall()` should run. Default is `1` to update `hgrc`, since it is done non-intrusively (it won't create `hgrc` if it doesn't exist, and won't overwrite anything unless it was set by ikiwiki itself).
+
+ @@ -28,6 +29,7 @@
+ push @{$config{wrappers}}, {
+ wrapper => $config{mercurial_wrapper},
+ wrappermode => (defined $config{mercurial_wrappermode} ? $config{mercurial_wrappermode} : "06755"),
+ + wrapper_postcall => (defined $config{mercurial_wrapper_hgrc_update} ? $config{mercurial_wrapper_hgrc_update} : "1"),
+ };
+ }
+ }
+
+Include default configuration value and comment.
+
+ @@ -53,6 +55,13 @@
+ safe => 0,
+ rebuild => 0,
+ },
+ + mercurial_wrapper_hgrc_update => {
+ + type => "string",
+ + example => "1",
+ + description => "updates existing hgrc to reflect path changes for mercurial_wrapper",
+ + safe => 0,
+ + rebuild => 0,
+ + },
+ historyurl => {
+ type => "string",
+ example => "http://example.com:8000/log/tip/\[[file]]",
+
+`hgrc` should be updated to point to the new wrapper path. The regexp transforms lines as e.g.
+
+ post-commit.ikiwiki = /home/daniel/blog/.hg/ikiwiki-wrapper-oldpath
+ incoming.ikiwiki = /home/daniel/blog/.hg/ikiwiki-wrapper-oldpath
+
+to
+
+ post-commit.ikiwiki = $config{mercurial_wrapper}
+ incoming.ikiwiki = $config{mercurial_wrapper}
+
+with absolute paths.
+
+ @@ -402,4 +411,23 @@
+ return findtimes($file, 0);
+ }
+
+ +sub rcs_wrapper_postcall($) {
+ + # Update hgrc if it exists. Change post-commit/incoming hooks with the
+ + # .ikiwiki suffix to point to the wrapper path given in the setup file.
+ + # Work with a tempfile to not delete hgrc if the loop is interrupted
+ + # midway.
+ + my $hgrc=$config{srcdir}.'/.hg/hgrc';
+ + my $backup_suffix='.ikiwiki.bak';
+ + if (-e $hgrc) {
+ + use File::Spec;
+ + my $mercurial_wrapper_abspath=File::Spec->rel2abs($config{mercurial_wrapper}, $config{srcdir});
+ + local ($^I, @ARGV)=($backup_suffix, $hgrc);
+ + while (<>) {
+ + s/^(post-commit|incoming)(\.ikiwiki[ \t]*=[ \t]*).*$/$1$2$mercurial_wrapper_abspath/;
+ + print;
+ + }
+ + unlink($hgrc.$backup_suffix);
+ + }
+ +}
+ +
+ 1
+
+`rcs_wrapper_postcall` is made available.
+
+ diff -r b08179653c00 IkiWiki.pm
+ --- a/IkiWiki.pm Wed Jul 20 16:56:09 2011 +0200
+ +++ b/IkiWiki.pm Wed Jul 20 19:28:21 2011 +0200
+ @@ -2059,6 +2059,10 @@
+ $hooks{rcs}{rcs_getmtime}{call}->(@_);
+ }
+
+ +sub rcs_wrapper_postcall (@) {
+ + $hooks{rcs}{rcs_wrapper_postcall}{call}->(@_);
+ +}
+ +
+ sub rcs_receive () {
+ $hooks{rcs}{rcs_receive}{call}->();
+ }
+
+
+`rcs_wrapper_postcall` is called if `$config{wrapper_postcall}` is true, which it should only be for Mercurial at the moment.
+
+ diff -r b08179653c00 IkiWiki/Wrapper.pm
+ --- a/IkiWiki/Wrapper.pm Wed Jul 20 16:56:09 2011 +0200
+ +++ b/IkiWiki/Wrapper.pm Wed Jul 20 19:28:21 2011 +0200
+ @@ -238,6 +238,10 @@
+ }
+ #translators: The parameter is a filename.
+ debug(sprintf(gettext("successfully generated %s"), $wrapper));
+ +
+ + if (defined $config{wrapper_postcall} && $config{wrapper_postcall} ) {
+ + IkiWiki::rcs_wrapper_postcall();
+ + }
+ }
+
+ 1
diff --git a/doc/todo/Auto-setup_should_default_to_YAML.mdwn b/doc/todo/Auto-setup_should_default_to_YAML.mdwn
new file mode 100644
index 000000000..05fba7eac
--- /dev/null
+++ b/doc/todo/Auto-setup_should_default_to_YAML.mdwn
@@ -0,0 +1,3 @@
+I think that the setup files auto-generated from /etc/ikiwiki/*.setup should be created in the new YAML format.
+
+> Ah, I missed that. [[done]] --[[Joey]]
diff --git a/doc/todo/Automatic_aggregate_setup_from_wikilist_in_Debian_package_.mdwn b/doc/todo/Automatic_aggregate_setup_from_wikilist_in_Debian_package_.mdwn
new file mode 100644
index 000000000..b409008c8
--- /dev/null
+++ b/doc/todo/Automatic_aggregate_setup_from_wikilist_in_Debian_package_.mdwn
@@ -0,0 +1,29 @@
+ikiwiki could have an option to process /etc/ikiwiki/wikilist and run ikiwiki
+in aggregate mode for all wikis that need it. The Debian package could then
+include an optional cron job to automatically handle aggregation.
+
+> You can actually use ikiwiki-mass-rebuild for this. Just pass --aggregate
+> --refresh to it. (The program could have a clearer name, perhaps I should
+> rename it to mass-ikiwiki? ikiwiki-map? ikiwiki-all? ...)
+>
+> A cron job like the one
+> you suggest could also handle cases when plugins call for a page
+> to be rebuilt. For example, a calendar plugin could use this to refresh a
+> calendar daily.
+>
+> I do worry that such a cron job would produce more load than might be optimal.
+> If you have one wiki that never needs to be updated,
+> another that might want to update daily, and a third that wants to update
+> every 15 minutes for aggregation, updating all three every 15 minutes wastes
+> a bit of CPU time. Two cron jobs seem like a better fit
+> in this situation, rather than a one size fits all master cron job. But it
+> would be fine adding a cron job as an example, at least.
+>
+> Another problem is that ikiwiki --aggregate will fail on any wikis that don't
+> have the aggregate plugin enabled. This is really a problem with the plugin's
+> special-casey approach of adding a new flag. This could be fixed by adding
+> a more general syntax like "--set aggregate=1". (done)
+>
+> Sorry for making this sound so complex, it's a good idea, but I'm on an
+> airplane and have nothing good to do except blather on here, and read
+> haskell tutorials. ;-) --[[Joey]]
diff --git a/doc/todo/BTS_integration.mdwn b/doc/todo/BTS_integration.mdwn
new file mode 100644
index 000000000..635586709
--- /dev/null
+++ b/doc/todo/BTS_integration.mdwn
@@ -0,0 +1,11 @@
+Given network access when building the wiki, ikiwiki could retrieve information from bug-tracking systems and apply that information to BTS links. For instance, consider how links from one bugzilla bug to another use strikeout formatting for links to fixed bugs, and use the status and summary of the bug as the link title.
+
+This seems somewhat difficult, as ikiwiki would need to maintain a cache of the remote BTS information, and know how to talk to various types of BTS. CPAN modules exist to solve this problem for some types of BTS.
+
+--[[JoshTriplett]]
+
+[scmbug](http://www.mkgnu.net/scmbug) might help here. --[[JoshTriplett]]
+
+[[!tag soc]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/Bestdir_along_with_bestlink_in_IkiWiki.pm.mdwn b/doc/todo/Bestdir_along_with_bestlink_in_IkiWiki.pm.mdwn
new file mode 100644
index 000000000..bf8de16cd
--- /dev/null
+++ b/doc/todo/Bestdir_along_with_bestlink_in_IkiWiki.pm.mdwn
@@ -0,0 +1,51 @@
+This patch adds a function bestdir(), which returns the best directory from the directory structure. This is in addition to the bestlink() function already present in IkiWiki.pm.
+
+> Um, what is this for? :-) It would probably be a lot easier to review if it
+> had documentation, and/or a plugin that used it. --[[smcv]]
+
+-------
+
+ Index: IkiWiki.pm
+ ===================================================================
+ --- IkiWiki.pm (revision 9)
+ +++ IkiWiki.pm (working copy)
+ @@ -391,6 +391,35 @@
+ return "";
+ }
+
+ +sub bestdir ($$) {
+ + my $page=shift;
+ + my $link=shift;
+ + my $cwd=$page;
+ +
+ + if ($link=~s/^\/+//) {
+ + $cwd="";
+ + }
+ +
+ + do {
+ + my $l=$cwd;
+ + $l.="/" if length $l;
+ + $l.=$link;
+ + if (-d "$config{srcdir}/$l") {
+ + return $l;
+ + }
+ + } while $cwd=~s!/?[^/]+$!!;
+ +
+ + if (length $config{userdir}) {
+ + my $l = "$config{userdir}/".lc($link);
+ +
+ + if (-d $l) {
+ + return $l;
+ + }
+ + }
+ +
+ + return "";
+ +}
+ +
+ sub isinlinableimage ($) {
+ my $file=shift;
+
+----
+-[[users/arpitjain]]
+
+[[!tag patch patch/core]]
diff --git a/doc/todo/Bestdir_along_with_bestlink_in_IkiWiki.pm/discussion.mdwn b/doc/todo/Bestdir_along_with_bestlink_in_IkiWiki.pm/discussion.mdwn
new file mode 100644
index 000000000..d473bc3ad
--- /dev/null
+++ b/doc/todo/Bestdir_along_with_bestlink_in_IkiWiki.pm/discussion.mdwn
@@ -0,0 +1,6 @@
+- Is there some implicit license for patches posted on the wiki?
+ I would like to maybe use this in [[todo/mbox]] --[[DavidBremner]]
+
+> If it's not clear to me that a patch is a derivative work of ikiwiki, I
+> always ask for a license clarification before adding it to ikiwiki.
+> --[[Joey]]
diff --git a/doc/todo/Better_bug_tracking_support.mdwn b/doc/todo/Better_bug_tracking_support.mdwn
new file mode 100644
index 000000000..1a810ad55
--- /dev/null
+++ b/doc/todo/Better_bug_tracking_support.mdwn
@@ -0,0 +1,71 @@
+I see that ikiwiki already has some [[bugs]] stored on the wiki, but adding
+better support for bug tracking would really make it a good project
+management system for small projects. Storing the source code, wiki, developer
+blog and the issue tracker information under a same revision control
+system really makes sense. At the moment the only part missing from ikiwiki
+is the bug tracker plugin.
+
+The support would not need to be anything fancy; assignment of bug numbers
+is perhaps the biggest thing missing when compared to a plain wiki page.
+Integration with the revision control system a la [scmbug](http://www.mkgnu.net/?q=scmbug)
+would be really neat though, so that bug tracker commands like (closes: #nnn) could
+be embedded in the source code repository commit messages.
+
+> A while back I posted some thoughts in my blog about
+> [using a wiki for issue tracking](http://kitenet.net/~joey/blog/entry/using_a_wiki_for_issue_tracking.html).
+> Google's BTS also has some interesting developments along the lines of
+> free-form search-based bug tracking, a style that seems a better fit to
+> wikis than the traditional rigid data of a BTS.
+>
+> I sorta take your point about bug numbers. It can be a pain to refer to
+> 'using_a_wiki_for_issue_tracking' as a bug name in a place like a
+> changelog.
+
+>> Would a modified [[plugins/inline]] plugin that allowed new pages, but without a title field, be ok?
+>> When you hit the edit button it just chooses a new number and makes a page with that
+>> name.
+
+>> The only issue I can see with this is if you're using a distributed wiki for
+>> distributed bug tracking. In that case you're going to have to make sure that you
+>> don't get conflicting bug ids.
+>> Maybe there should be two options - consecutive numbering, and uuid numbering
+>> which uses a random (128 bit, base64 encoded = 22 chars) name. -- [[Will]]
+
+> OTOH, I don't see a need for specially formatted commit messages to be
+> used to close bugs. Instead, if your BTS is kept in an ikiwiki wiki in
+> an RCS along with your project, you can do like I do here, and just edit a
+> bug's page, tag it `done`, and commit that along with the bug fix.
+>
+> --[[Joey]]
+
+>> I think a little bit more structure than in a normal wiki would be
+>> good to have for bug tracking. Bug numbers, automatic timestamps on comments
+>> and maybe an email interface would be nice. The resulting page may not
+>> look like a wikipage anymore, but rather something like the Debian
+>> BTS web-interface, but it would still benefit from wikilinks to the
+>> documentation in the wiki etc.
+
+>>> I think it is useful to look at these things separately:
+>>>
+>>> * Bug numbers: See above.
+>>> * Automatic timestamps on comments: this already exists with the inline directive.
+>>> * Email interface: You can certainly get an rss feed of what changes in the wiki.
+>>> * You didn't mention [[todo/structured_page_data]] but that is, I think, part
+>>> of what you seem to be saying.
+>>> * [[todo/tracking_bugs_with_dependencies]] is also important.
+>>>
+>>> -- [[Will]]
+
+>> About the commit message interface: I was thinking about a project
+>> architecture where the code is kept in a separate revision control
+>> branch from the metadata (blog, wiki, BTS). This would IMHO
+>> be a cleaner solution for distributing the source and making releases
+>> etc. For this kind of setup, having the BTS scan the messages of the
+>> source branch (by a commit-hook for example) would be useful.
+>>
+>> By Google BTS, do you mean the issue tracker in the Google code
+>> project hosting?
+>>
+>> --Teemu
+
+[[wishlist]]
diff --git a/doc/todo/Better_reporting_of_validation_errors.mdwn b/doc/todo/Better_reporting_of_validation_errors.mdwn
new file mode 100644
index 000000000..83420921f
--- /dev/null
+++ b/doc/todo/Better_reporting_of_validation_errors.mdwn
@@ -0,0 +1,2 @@
+It would be nice (especially if more CGI::FormBuilder validation is done in the future, such as [[todo/Allow_web_edit_form_comment_field_to_be_mandatory]])
+to tell the user what the error was, rather than silently redisplaying the form. I may have time to work on this. \ No newline at end of file
diff --git a/doc/todo/BibTeX.mdwn b/doc/todo/BibTeX.mdwn
new file mode 100644
index 000000000..16aa1a0f9
--- /dev/null
+++ b/doc/todo/BibTeX.mdwn
@@ -0,0 +1,74 @@
+I would *love* to see a plugin that lets you create one or more BibTeX-formatted bibliography pages and add citations to other pages. The plugin could also render the bibliographies themselves using a chosen BibTeX style and an HTML formatter for LaTeX (such as HeVeA).
+
+--[[JoshTriplett]]
+
+> There is such a plugin at [[plugins/contrib/bibtex|plugins/contrib/bibtex]]. --[[MatthiasIhrke]]
+
+I am working on a plugin to htmlize '.bib' files.
+
+A sample result is shown on my webpage: <http://www.adupas.org/research/publications/>.
+
+It features the htmlization of the BibTeX with 4 entry types supported (InProceedings, Article, MastersThesis and PhdThesis). I will add support for the Book entry soon. It creates an HTML version for each '.bib' file, and for each entry a specific page with the abstract as well as an individual bib file. It lacks some features, like the possibility of having a PDF or PS version of the article attached.
+
+This plugin uses two templates to render the html version of each file.
+
+I have a problem creating a new page that renders like any other page in the wiki. I have used ikiwiki's internal **genpage($$)** routine, but I suppose there is another way to do this. My method lacks backlink support for the individual entry files, as well as the modification date of these files.
+
+Is it possible to create several wiki pages from only one source file?
+
+The source of this plugin can be found on this page: <http://www.adupas.org/software/ikiwiki/>.
+
+Feel free to propose any modifications to enhance this plugin.
+
+--[[AlexandreDupas]]
+
+I have not found any other approach to building several wiki pages from only one source file. Does someone have an idea?
+
+I also tried to build a wiki-wide preprocessing of the source files to find references to my bib entries (citations), but apparently there is no wiki-wide preprocessing hook that would allow collecting data from each page before building the site. Am I missing something?
+
+--[[AlexandreDupas]]
+
+> The scan hook is run on new page content before building --[[Joey]]
+
+What notation did you have in mind for citations? A preprocessor
+directive? Something LaTeX-inspired might be
+
+ \[[!cite key="foo"]]
+
+which would output "(Foo, 2008)". With the appropriate options, this
+could allow for several variations like "Foo (2008)" and "(Foo, 2008,
+p. 28)". A `nocite` option could cause the reference to be printed in
+the bibliography but produce no output.
+
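The textual variations described above are mostly a formatting concern. A minimal sketch of that part (hypothetical, not an existing plugin; `format_citation` and its parameters are invented for illustration, and a real plugin would look the author and year up in the BibTeX database by key):

```perl
# Hypothetical citation formatter for a cite directive. Produces
# "(Foo, 2008)", "Foo (2008)" or "(Foo, 2008, p. 28)" depending on
# the options, matching the variations described above.
sub format_citation {
    my %params = @_;
    my ($author, $year) = @params{qw(author year)};
    if (($params{style} || '') eq 'textual') {
        return "$author ($year)";
    }
    my $cite = "$author, $year";
    $cite .= ", p. $params{page}" if defined $params{page};
    return "($cite)";
}
```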
+What about the references section? There are several ways to
+go about it, for example:
+
+1. It could be included at the bottom of the page automatically for
+ pages with references, with a configurable title and heading level
+ (e.g., `<h2>References</h2>`) followed by a list of references.
+
+2. Use another preprocessor directive like
+
+ ## References ##
+
+ \[[!bibliography ]]
+
+ or
+
+ \[[!bibliography title="References" headerlevel="2"]]
+
+ with configurable default values. Would it be tedious to do this on
+ every page?
+
+3. Use HTML::Template and allow users to add a bibliography section to
+ `page.tmpl` to include the bibliography if references are present and
+ loop over the references to emit a list. The downside here is having
+ to ask people to modify their templates (unless the plugin is
+ eventually included in the distribution).
+
+Any thoughts on the best way to proceed?
+
+--[[JasonBlevins]], March 23, 2008 21:41 EDT
+
+
+[[!tag soc]] [[!tag wishlist]]
diff --git a/doc/todo/BrowserID.mdwn b/doc/todo/BrowserID.mdwn
new file mode 100644
index 000000000..04a9166a8
--- /dev/null
+++ b/doc/todo/BrowserID.mdwn
@@ -0,0 +1,25 @@
+Please consider providing a plugin for [BrowserID](https://browserid.org/) authentication, preferably enabled by default.
+
+Some additional information on BrowserID:
+
+- <https://github.com/mozilla/browserid/wiki/How-to-Use-BrowserID-on-Your-Site>
+- <http://identity.mozilla.com/post/7616727542/introducing-browserid-a-better-way-to-sign-in>
+- <http://identity.mozilla.com/post/7669886219/how-browserid-differs-from-openid>
+- <http://identity.mozilla.com/post/7899984443/privacy-and-browserid>
+
+> I would like to see BrowserID offered as a signin option in ikiwiki
+> right next to the buttons for common openid providers.
+>
+> As far as implementing it goes, I don't want to rely on browserid.org.
+> This means that include.js needs to be shipped with ikiwiki (or in a
+> dependency in a sane world).
+>
+> And it means that relying on a https
+> connection to browserid.org to verify the user's identity assertion
+> token is out. (Well, it's probably out anyway, since it relies on https
+> CA security as the only security in that part of the protocol.)
+>
+> This seems to need an implementation, in perl or an externally callable
+> program (haskell would be fine ;),
+> of <https://wiki.mozilla.org/Identity/BrowserID#Assertion_Verification>.
+> --[[Joey]]
diff --git a/doc/todo/CGI_method_to_pullrefresh.mdwn b/doc/todo/CGI_method_to_pullrefresh.mdwn
new file mode 100644
index 000000000..f01c2f65a
--- /dev/null
+++ b/doc/todo/CGI_method_to_pullrefresh.mdwn
@@ -0,0 +1,7 @@
+[[!meta title="CGI method to pull/refresh"]]
+
+In some situations, it makes sense to have the repository in use by ikiwiki reside on a different machine. In that case, one could juggle SSH keys for the `post-update` hook. A better way may be to provide a different `do` parameter handler for the CGI, which would pull new commits to the working clone and refresh the wiki. Then, the remote `post-update` hook could just `wget` that URL. To prevent simple DoS attacks, one might assign a simple password.
+
+[[!tag wishlist]]
+
+> [[done]] via the pinger and pingee plugins --[[Joey]]
diff --git a/doc/todo/CSS_classes_for_links.mdwn b/doc/todo/CSS_classes_for_links.mdwn
new file mode 100644
index 000000000..29ed3770e
--- /dev/null
+++ b/doc/todo/CSS_classes_for_links.mdwn
@@ -0,0 +1,138 @@
+Hi Joey,
+
+What do you think about CSS classes for links to display link with icon?
+You probably know that there are wikis with that feature, for example
+Moin Moin.
+
+Here is a piece of `common.css` file grabbed from <http://wiki.openwrt.org>
+site which is powered by Moin Moin wiki:
+
+ a.www:before {content: url(../img/moin-www.png); margin: 0 0.2em;}
+ a.http:before {content: url(../img/moin-www.png); margin: 0 0.2em;}
+ a.https:before {content: url(../img/moin-www.png); margin: 0 0.2em;}
+ a.file:before {content: url(../img/moin-ftp.png); margin: 0 0.2em;}
+ a.ftp:before {content: url(../img/moin-ftp.png); margin: 0 0.2em;}
+ a.nntp:before {content: url(../img/moin-news.png); margin: 0 0.2em;}
+ a.news:before {content: url(../img/moin-news.png); margin: 0 0.2em;}
+ a.telnet:before {content: url(../img/moin-telnet.png); margin: 0 0.2em;}
+ a.irc:before {content: url(../img/moin-telnet.png); margin: 0 0.2em;}
+ a.mailto:before {content: url(../img/moin-email.png); margin: 0 0.2em;}
+ a.attachment:before {content: url(../img/moin-attach.png); margin: 0 0.2em;}
+ a.badinterwiki:before {content: url(../img/moin-inter.png); margin: 0 0.2em;}
+ a.interwiki:before {content: url(../img/moin-inter.png); margin: 0 0.2em;}
+
+You can see that they use a lot of CSS classes for links, but only one CSS class
+for external links is enough for me :) Please look at my example:
+
+ \[[Foo]] -> <a href="http://www.mywiki.org/foo.html">Foo</a>
+ \[[Bar|foo/bar]] -> <a href="http://www.mywiki.org/foo/bar.html">Bar</a>
+ <http://www.gnu.org/> -> <a class="external" href="http://www.gnu.org/">http://www.gnu.org/</a>
+ [GNU](http://www.gnu.org/) -> <a class="external" href="http://www.gnu.org/">GNU</a>
+ [RMS](mailto:rms@gnu.org) -> <a href="mailto:rms@gnu.org">RMS</a>
+
+My best regards,
+
+--[[Paweł|ptecza]]
+
+> If you did not already know, you can achieve similar results using CSS3
+> selectors. Eg: `a[href="http://www.foobar.com/"] { foobar: css }` or
+> `a[title~="Mail"] {text-decoration: none; }`. See
+> <http://www.w3.org/TR/2001/CR-css3-selectors-20011113/> for a complete list.
+
+>> Hi Charles,
+>>
+>> Thanks for the hint! I don't know CSS3 yet :) Which modern and popular
+>> web browsers support it now?
+>>
+>>> Safari supports it. Firefox & Co. support most of it. IE6 did not, but IE7
+>>> supports a fair part of CSS3, and is said to support selectors.
+>>>
+>>> An example of how to use selectors: <http://www.kryogenix.org/days/2002/08/30/external>
+>>>
+>>> I also think this should be in an external plugin, not in ikiwiki.
+>>>
+
+I find CSS3 support still spotty... Here are some notes on how to do this in IkiWiki with jQuery: <http://iki.u32.net/setup/External_Links> --[[sabr]]
+
+> If you need to achieve this in IkiWiki itself, I imagine you could create a
+> plugin which runs in the `format` phase of rendering and search/replaces
+> specific link patterns. This should be a fairly simple exercise in regular
+> expressions.
+>
+> --CharlesMauch
+
+>> I've never written plugin for ikiwiki, but I can try if it's simple job :)
+>>
+>> --[[Paweł|ptecza]]
+
+> I wouldn't mind adding a _single_ css class to ikiwiki links, but it
+> would have to be a class added to all internal, not all external, links.
+> Reason is that there are many ways for external links to get into an
+> ikiwiki page, including being entered as raw html. The only time ikiwiki
+> controls a link is when an internal link is added using a WikiLink.
+>
+> (Note that tags get their own special
+> [[rel_attribute|rel_attribute_for_links]] now that CSS can use.)
+>
+> --[[Joey]]
+
+>> I had a little look at this, last weekend. I added a class definition to
+>> the `htmllink` call in `linkify` in `link.pm`. It works pretty well, but
+>> I'd also need to adjust other `htmllink` calls (map, inline, etc.). I found
+>> other methods (CSS3 selectors, etc.) to be unreliable.
+>>
+>> Would you potentially accept a patch that added `class="internal"` to
+>> various `htmllink` calls in ikiwiki?
+>>
+>> How configurable do you think this behaviour should be? I'm considering a
+>> config switch to enable or disable this behaviour, or possibly a
+>> configurable list of class names to append for internal links (defaulting
+>> to an empty list for backwards compatibility).
+>>
+>> As an alternative to patching the uses of `htmllink`, what do you think
+>> about patching `htmllink` itself? Are there circumstances where it might be
+>> used to generate a non-internal link? -- [[Jon]]
+
+>>> I think that the minimum configurability to get something that
+>>> can be used by CSS to style the links however the end user wants
+>>> is the best thing to shoot for. Ideally, no configurability. And
+>>> a tip or something documenting how to use the classes in your CSS
+>>> to style links so that eg, external links have a warning icon.
+>>>
+>>> `htmllink` can never be used to generate an external link. So,
+>>> patching it seems the best approach. --[[Joey]]
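A minimal sketch of that approach (a simplified stand-in, not ikiwiki's real `htmllink`, which also handles create-links, anchors and other attributes):

```perl
# Simplified stand-in for htmllink: since htmllink only ever generates
# internal links, the class can be added unconditionally in one place.
sub internal_link {
    my ($url, $text) = @_;
    return qq{<a href="$url" class="internal">$text</a>};
}
```

With a class on every internal link, external links are then simply `a:not(.internal)` in CSS.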
+
+>>>> I had a quick look to this issue. Internal links are generated at
+>>>> 11 places in the Perl code and would need to be patched (this
+>>>> number could be lowered a bit if a htmllink-like function existed
+>>>> for CGI urls; such a function would use `cgiurl`, and be used in
+>>>> most places where `cgiurl` is currently called by plugins).
+>>>>
+>>>> Also, more than 30 `<a>` links appear in templates, most of those
+>>>> being internal links.
+>>>>
+>>>> Sure, patching those few dozen places is trivial. On the other
+>>>> hand, I'm wondering how doable it would be to make sure, on the
+>>>> long run, any generated internal link has the right CSS class
+>>>> applied. One would need to write tests running against the code
+>>>> with all plugins enabled, all templates put to work, in order to
+>>>> ensure consistency is maintained. --[[intrigeri]]
+
+-----
+If you're going to be patching htmllink anyway, might I suggest something more flexible, like being able to configure the link format?
+(Yes, PmWiki allows this; that's where I got the idea.)
+That is, rather than hard-coding `"<a href=" . blah . blah ...`,
+one could use a sprintf with a default format which could be configured in the setup file.
+
+For example:
+
+ $format = ($config{createlink_format}
+ ? $config{createlink_format}
+        : '<span class="createlink"><a href="%s" rel="nofollow">?</a>%s</span>');
+ return sprintf($format,
+ cgiurl(do => "create", page => lc($link), from => $lpage),
+ $linktext);
+
+I admit, I've been wanting something like this for a long time, because I dislike the existing createlink format...
+
+--[[KathrynAndersen]]
diff --git a/doc/todo/CVS_backend.mdwn b/doc/todo/CVS_backend.mdwn
new file mode 100644
index 000000000..c450542e2
--- /dev/null
+++ b/doc/todo/CVS_backend.mdwn
@@ -0,0 +1,16 @@
+(moved from an item on the main [[/index/discussion]] page.)
+
+ikiwiki could have a CVS backend.
+
+Original discussion:
+
+> Any examples of using ikiwiki with cvs?
+>
+>> No, although the existing svn backend could fairly easily be modified into
+>> a CVS backend, by someone who doesn't mind working with CVS. --[[Joey]]
+>>
+>>> Wouldn't say I don't mind, but I needed it. See [[rcs/cvs]]. --[[Schmonz]]
+
+[[done]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/Calendar:_listing_multiple_entries_per_day_.mdwn b/doc/todo/Calendar:_listing_multiple_entries_per_day_.mdwn
new file mode 100644
index 000000000..ba01790b3
--- /dev/null
+++ b/doc/todo/Calendar:_listing_multiple_entries_per_day_.mdwn
@@ -0,0 +1,94 @@
+[[!tag patch]]
+
+I am copying stuff discussed in the [[forum|/forum/Calendar:_listing_multiple_entries_per_day]], since the [[patch]] page only lists pages that are todo or bugs.
+
+If there are several pages created on the same date, the [[calendar directive|/ikiwiki/directive/calendar]] only displays the first one.
+Here is a patch that:
+
+- if there is a single entry on a given day, does not change anything (compared to the previous version of the calendar plugin);
+- if there are several entries, displays a popup listing all the entries of that day when the mouse passes over it.
+
+That's all. No new pages for each day, takes as little space as it took before, and only a few lines more in the source.
+
+The only thing I am not totally happy with is the CSS. We have to say that the text is aligned on the left (otherwise, it is aligned on the right, as is each day of the calendar), but I do not know the most sensible place to put that line of CSS.
+
+Regards,
+-- Louis
+
+
+ diff --git a/IkiWiki/Plugin/calendar.pm b/IkiWiki/Plugin/calendar.pm
+ index d443198..2c9ed79 100644
+ --- a/IkiWiki/Plugin/calendar.pm
+ +++ b/IkiWiki/Plugin/calendar.pm
+ @@ -86,8 +86,11 @@ sub format_month (@) {
+ my $year = $date[5] + 1900;
+ my $mtag = sprintf("%02d", $month);
+
+ - # Only one posting per day is being linked to.
+ - $linkcache{"$year/$mtag/$mday"} = $p;
+ + # Several postings per day
+ + if (! $linkcache{"$year/$mtag/$mday"}) {
+ + $linkcache{"$year/$mtag/$mday"} = [];
+ + }
+ + push(@{$linkcache{"$year/$mtag/$mday"}}, $p);
+ }
+
+ my $pmonth = $params{month} - 1;
+ @@ -221,11 +224,36 @@ EOF
+ $tag='month-calendar-day-link';
+ }
+ $calendar.=qq{\t\t<td class="$tag $downame{$wday}">};
+ - $calendar.=htmllink($params{page}, $params{destpage},
+ - $linkcache{$key},
+ - noimageinline => 1,
+ - linktext => $day,
+ - title => pagetitle(IkiWiki::basename($linkcache{$key})));
+ + if ( scalar(@{$linkcache{$key}}) == 1) {
+ + # Only one posting on this page
+ + my $page = $linkcache{$key}[0];
+ + $calendar.=htmllink($params{page}, $params{destpage},
+ + $page,
+ + noimageinline => 1,
+ + linktext => $day,
+ + title => pagetitle(IkiWiki::basename($page)));
+ + } else {
+ + $calendar.=qq{<div class='popup'>$day<div class='balloon'>};
+ + # Several postings on this page
+ + $calendar.=qq{<ul>};
+ + foreach my $page (@{$linkcache{$key}}) {
+ + $calendar.= qq{\n\t\t\t<li>};
+ + my $title;
+ + if (exists $pagestate{$page}{meta}{title}) {
+ + $title = "$pagestate{$page}{meta}{title}";
+ + } else {
+ + $title = pagetitle(IkiWiki::basename($page));
+ + }
+ + $calendar.=htmllink($params{page}, $params{destpage},
+ + $page,
+ + noimageinline => 1,
+ + linktext => $title,
+ + title => $title);
+ + $calendar.= '</li>';
+ + }
+ + $calendar.=qq{\n\t\t</ul>};
+ + $calendar.=qq{</div></div>};
+ + }
+ $calendar.=qq{</td>\n};
+ }
+ else {
+ diff --git a/doc/style.css b/doc/style.css
+ old mode 100644
+ new mode 100755
+ index 424d438..b52c72b
+ --- a/doc/style.css
+ +++ b/doc/style.css
+ @@ -323,6 +323,7 @@ div.progress-done {
+ .popup .paren,
+ .popup .expand {
+ display: none;
+ + text-align: left;
+ }
+ .popup:hover .balloon,
+ .popup:focus .balloon {
+
+> [[applied|done]] --[[Joey]]
diff --git a/doc/todo/Case.mdwn b/doc/todo/Case.mdwn
new file mode 100644
index 000000000..a19dbb2a6
--- /dev/null
+++ b/doc/todo/Case.mdwn
@@ -0,0 +1,4 @@
+ikiwiki should support pages that have uppercase in their filenames.
+However, links to such pages should not need to exactly preserve the case.
+
+[[todo/done]]
diff --git a/doc/todo/Commit_emails:_ones_own_changes.mdwn b/doc/todo/Commit_emails:_ones_own_changes.mdwn
new file mode 100644
index 000000000..862a85071
--- /dev/null
+++ b/doc/todo/Commit_emails:_ones_own_changes.mdwn
@@ -0,0 +1,9 @@
+What's the rationale behind excluding one's own changes from the commit emails sent out?
+--[[tschwinge]]
+
+> Well, commit mails are intended to keep you informed of changes in the
+> wiki, and I assumed you'd know about changes you made yourself.
+> --[[Joey]]
+
+> [[done]] -- commit mails removed; recentchanges feeds can be configured
+> for whatever you like.
diff --git a/doc/todo/Configurable_minimum_length_of_log_message_for_web_edits.mdwn b/doc/todo/Configurable_minimum_length_of_log_message_for_web_edits.mdwn
new file mode 100644
index 000000000..74a4fb1b1
--- /dev/null
+++ b/doc/todo/Configurable_minimum_length_of_log_message_for_web_edits.mdwn
@@ -0,0 +1,5 @@
+It would be nice to be able to specify a minimum length for the change log message for web edits, and whether it is required at all. I realise this is not going to solve the problem of crap log messages, but it helps guard against accidental submissions one would have wanted to log. MediaWiki/Wikipedia has that option, and I find it a useful reminder. --[[madduck]]
+
+> +1 --[[lnussel]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/Configureable_separator_of_page_name.mdwn b/doc/todo/Configureable_separator_of_page_name.mdwn
new file mode 100644
index 000000000..8f4500d89
--- /dev/null
+++ b/doc/todo/Configureable_separator_of_page_name.mdwn
@@ -0,0 +1,12 @@
+Currently ikiwiki replaces spaces in page names/titles with underscore characters.
+I would rather prefer an en dash character there. Is there a chance to configure the
+separator character in ikiwiki?
+
+--[[Paweł|ptecza]]
+
+> To do this a plugin could override the builtin IkiWiki::pagetitle,
+> IkiWiki::titlepage, and IkiWiki::linkpage functions. All the mangling of
+> page names should be centralised in those functions. --[[Joey]]
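As a stand-alone sketch of the override Joey describes (hypothetical; not ikiwiki's actual `titlepage`, which also escapes many other characters), the space mangling with a configurable separator might look like:

```perl
# Hypothetical titlepage-style mangling with a configurable separator
# instead of a hard-coded underscore.
sub title_to_pagename {
    my ($title, $separator) = @_;
    $separator = '_' unless defined $separator;
    my $page = $title;
    $page =~ s/ /$separator/g;
    return $page;
}
```

With the separator set to `"\x{2013}"` (en dash), "foo bar" would become "foo–bar".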
+
+>> Do I really need a plugin to do it? What about yet another parameter
+>> for `ikiwiki.setup` file? ;) --[[Paweł|ptecza]] \ No newline at end of file
diff --git a/doc/todo/Debian_package_could_Recommend_gcc_+_libc6-dev__44___not_Depend.mdwn b/doc/todo/Debian_package_could_Recommend_gcc_+_libc6-dev__44___not_Depend.mdwn
new file mode 100644
index 000000000..ee55de8b9
--- /dev/null
+++ b/doc/todo/Debian_package_could_Recommend_gcc_+_libc6-dev__44___not_Depend.mdwn
@@ -0,0 +1,22 @@
+IkiWiki doesn't actually need a C compiler etc. except for CGI (and other
+wrappers?). As it's perfectly possible to use IkiWiki without the CGI mode,
+would it be possible to make the Debian package Recommend
+gcc/libc6-dev/etc., rather than Depending on them?
+
+> My approach with the dependencies of the package is to depend on
+> everything that is needed to use ikiwiki for a fairly full-featured wiki (but
+> not things that are only needed to turn on special features like specific
+> plugins). That's why ikiwiki also depends on the FormBuilder, for
+> example.
+>
+> I feel that making it easier to get going with ikiwiki is preferable to
+> making everyone cherry-pick exactly the packages they need. I realize
+> that this is not optimal for everyone.
+>
+> I might move these things to recommends once recommends are automatically
+> installed by apt. That will be sorting itself out over the next month or
+> so. --[[Joey]]
+
+> [[done]]
+
+[[wishlist]]
diff --git a/doc/todo/Default_text_for_new_pages.mdwn b/doc/todo/Default_text_for_new_pages.mdwn
new file mode 100644
index 000000000..a904f8287
--- /dev/null
+++ b/doc/todo/Default_text_for_new_pages.mdwn
@@ -0,0 +1,104 @@
+The [[ikiwiki/directive/inline]] directive allows the creation of new pages.
+It would be nice if it was possible to specify default text for the new post.
+For example:
+
+ \[[!inline pages="blog/* and !*/Discussion" postform="yes" newposttemplate="blogtemplate.mdwn"]]
+
+This would allow you to create a new blog post. When you hit the `Edit` button, the system presents
+you with an edit form as normal, but rather than being empty, it has the text from `blogtemplate.mdwn`.
+
+Inline below is a [[patch]] that implements this:
+
+----
+
+ diff --git a/IkiWiki/Plugin/editpage.pm b/IkiWiki/Plugin/editpage.pm
+ index bb21ed2..10c985c 100644
+ --- a/IkiWiki/Plugin/editpage.pm
+ +++ b/IkiWiki/Plugin/editpage.pm
+ @@ -60,7 +60,7 @@ sub cgi_editpage ($$) {
+
+ decode_cgi_utf8($q);
+
+ - my @fields=qw(do rcsinfo subpage from page type editcontent comments);
+ + my @fields=qw(do rcsinfo subpage from page type editcontent comments templatepage);
+ my @buttons=("Save Page", "Preview", "Cancel");
+ eval q{use CGI::FormBuilder};
+ error($@) if $@;
+ @@ -117,9 +117,20 @@ sub cgi_editpage ($$) {
+ }
+ else {
+ $type=$form->param('type');
+ +
+ + my $defaultContent = "";
+ + my $templatepage = $form->param('templatepage');
+ + if ($templatepage && $pagesources{$templatepage}) {
+ + $defaultContent = readfile(IkiWiki::srcfile($pagesources{$templatepage}));
+ + }
+ +
+ if (defined $type && length $type && $hooks{htmlize}{$type}) {
+ $type=possibly_foolish_untaint($type);
+ }
+ + elsif ($templatepage && $pagesources{$templatepage}) {
+ + # favor the type of the template page
+ + $type=pagetype($pagesources{$templatepage});
+ + }
+ elsif (defined $from && exists $pagesources{$from}) {
+ # favor the type of linking page
+ $type=pagetype($pagesources{$from});
+ @@ -129,7 +140,7 @@ sub cgi_editpage ($$) {
+ if (! $form->submitted) {
+ $form->field(name => "rcsinfo", value => "", force => 1);
+ }
+ - $form->field(name => "editcontent", validate => '/.+/');
+ + $form->field(name => "editcontent", value => $defaultContent, force => 0, validate => '/.+/');
+ }
+
+ $form->field(name => "do", type => 'hidden');
+ diff --git a/IkiWiki/Plugin/inline.pm b/IkiWiki/Plugin/inline.pm
+ index 8efef3f..075d7d8 100644
+ --- a/IkiWiki/Plugin/inline.pm
+ +++ b/IkiWiki/Plugin/inline.pm
+ @@ -271,6 +271,7 @@ sub preprocess_inline (@) {
+ $rootpage=$params{page};
+ }
+ $formtemplate->param(rootpage => $rootpage);
+ + $formtemplate->param(templatepage => $params{newposttemplate}) if $params{newposttemplate};
+ $formtemplate->param(rssurl => $rssurl) if $feeds && $rss;
+ $formtemplate->param(atomurl => $atomurl) if $feeds && $atom;
+ if (exists $params{postformtext}) {
+ diff --git a/templates/blogpost.tmpl b/templates/blogpost.tmpl
+ index 7eeede6..5c8b34c 100644
+ --- a/templates/blogpost.tmpl
+ +++ b/templates/blogpost.tmpl
+ @@ -8,6 +8,9 @@
+ </TMPL_IF>
+ <input type="hidden" name="do" value="blog" />
+ <input type="hidden" name="from" value="<TMPL_VAR ROOTPAGE>" />
+ +<TMPL_IF NAME="TEMPLATEPAGE">
+ +<input type="hidden" name="templatepage" value="<TMPL_VAR TEMPLATEPAGE>" />
+ +</TMPL_IF>
+ <input type="hidden" name="subpage" value="1" />
+ <TMPL_VAR POSTFORMTEXT>
+ <input name="title" size="40" />
+
+---
+
+Perhaps I'm misunderstanding something, but can't you use already existing
+in-house means instead of this patch; use a procedure as I do in the Hurd wiki?
+<http://www.bddebian.com/~wiki/config_edittemplate/> with one template:
+<http://www.bddebian.com/~wiki/config_edittemplate/regular_page/>.
+-- [[tschwinge]]
+
+> You are entirely correct. I thought I'd seen it somewhere, but then couldn't
+> find it when I came to use it. If the patch isn't applied (and I can see arguments
+> on both sides of that debate), then at least a pointer to
+> [[ikiwiki/directive/edittemplate]] should be added to [[ikiwiki/directive/inline]]
+> (and I'd make that change myself, but the edit needs to happen in the underlay,
+> not in the online docs). -- [[Will]]
+
+>> Go ahead and make the edit, ikiwiki's source is arranged such that edits
+>> on this wiki to files that form the underlay will affect the underlay.
+>> (Clearly I won't be adding duplicate functionality.)
+>> --[[Joey]]
+
+>>> Edit made. [[done]] -- [[Will]]
diff --git a/doc/todo/Does_not_support_non-UTF8_files.mdwn b/doc/todo/Does_not_support_non-UTF8_files.mdwn
new file mode 100644
index 000000000..b78a5ebeb
--- /dev/null
+++ b/doc/todo/Does_not_support_non-UTF8_files.mdwn
@@ -0,0 +1,7 @@
+Ikiwiki does not seem to support non-UTF-8 file content, although there's no reason it should assume anything other than ASCII-compatibility from the encoding, at least if the Web interface is not used. It suffices that users use the same encoding as the templates specify. If I try to run it on `.mdwn` with content in ISO-8859-1 format, in an ISO-8859-1 locale, I get:
+
+ Malformed UTF-8 character (unexpected non-continuation byte 0x74, immediately after start byte 0xe4) in substitution iterator at /usr/local/share/perl/5.8.8/IkiWiki.pm line 640.
+
+I hope Ikiwiki is not part of the UTF-8 monoculturist movement...
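One conceivable approach (not current ikiwiki behaviour, just a sketch) is to attempt strict UTF-8 decoding first and fall back to a configured legacy encoding when that fails:

```perl
use Encode qw(decode);

# Sketch only: try strict UTF-8 decoding, falling back to a configured
# legacy encoding (ISO-8859-1 by default) on malformed input such as
# the 0xe4 byte in the error above.
sub decode_page_content {
    my ($bytes, $fallback) = @_;
    # decode with FB_CROAK may modify its input, so pass a copy
    my $text = eval { decode('UTF-8', my $copy = $bytes, Encode::FB_CROAK) };
    return $text if defined $text;
    return decode($fallback || 'ISO-8859-1', $bytes);
}
```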
+
+[[wishlist]]
diff --git a/doc/todo/Editing_po_files.mdwn b/doc/todo/Editing_po_files.mdwn
new file mode 100644
index 000000000..d4f142e17
--- /dev/null
+++ b/doc/todo/Editing_po_files.mdwn
@@ -0,0 +1,5 @@
+ikiwiki could support rendering and editing po files via the web. Run against
+a software repository, ikiwiki would make for an interesting
+translation-management tool. --[[JoshTriplett]]
+
+[[wishlist]]
diff --git a/doc/todo/Enable_filtering_of_files_indexed_for_search.mdwn b/doc/todo/Enable_filtering_of_files_indexed_for_search.mdwn
new file mode 100644
index 000000000..7e692dcd8
--- /dev/null
+++ b/doc/todo/Enable_filtering_of_files_indexed_for_search.mdwn
@@ -0,0 +1,7 @@
+[[!tag wishlist]]
+
+A useful item, I think, would be an option to prevent certain pages from being indexed for search.
+
+A simple method would be a regex option similar to the existing 'exclude' option.
+
+A more powerful method would be to permit PageSpecs here.
diff --git a/doc/todo/Extensible_inlining.mdwn b/doc/todo/Extensible_inlining.mdwn
new file mode 100644
index 000000000..994ed0759
--- /dev/null
+++ b/doc/todo/Extensible_inlining.mdwn
@@ -0,0 +1,263 @@
+Here's an idea with [[patch]] for extending inline in two directions:
+
+1. Permit the content-fetching function to return undef to skip a page. The limiting of @list to a set size is performed after that filtering.
+2. Permit other directive plugins to pass a function to generate content via an `inliner_` parameter. The current patch doesn't try to remove that key from the parameters, so hilarity might ensue if someone is too clever. I suppose I should fix that... My *intent* is that other, custom directives can add `inliner_`.
+
+The diff looks large because the first requires switching some loops.
+
+I'm using this along with a custom BibTeX formatter (one item per file) to generate larger pages and tiny listings. I still need to hammer the templates for that, but I think that's possible without further patches.
+
+(Setting up a git branch for a single plugin is a pain, but I can if necessary. I also could separate this into some sequence rather than all at once, but I won't have time for a week or two.)
+
+-- [[JasonRiedy]]
+
+<pre><code>
+--- /home/ejr/src/git.ikiwiki.info/IkiWiki/Plugin/inline.pm 2011-03-05 14:18:30.261293808 -0500
++++ inline.pm 2011-03-06 21:44:18.887903638 -0500
+@@ -185,6 +185,7 @@
+ }
+
+ my @list;
++ my $num = 0;
+
+ if (exists $params{pagenames}) {
+ foreach my $p (qw(sort pages)) {
+@@ -213,23 +214,121 @@
+ if ($params{feedshow} && $num < $params{feedshow} && $num > 0) {
+ $num=$params{feedshow};
+ }
+- if ($params{skip} && $num) {
+- $num+=$params{skip};
+- }
+
+ @list = pagespec_match_list($params{page}, $params{pages},
+ deptype => deptype($quick ? "presence" : "content"),
+ filter => sub { $_[0] eq $params{page} },
+ sort => exists $params{sort} ? $params{sort} : "age",
+ reverse => yesno($params{reverse}),
+- ($num ? (num => $num) : ()),
+ );
+ }
+
+ if (exists $params{skip}) {
+ @list=@list[$params{skip} .. $#list];
+ }
++
++ if ($params{show} && $params{show} > $num) {
++ $num = $params{show}
++ }
++
++ my $ret="";
++ my @displist;
++ if ($feedonly) {
++ @displist = @list;
++ } else {
++ my $template;
++ if (! $raw) {
++ # cannot use wiki pages as templates; template not sanitized due to
++ # format hook hack
++ eval {
++ $template=template_depends($params{template}.".tmpl", $params{page},
++ blind_cache => 1);
++ };
++ if ($@) {
++ error sprintf(gettext("failed to process template %s"), $params{template}.".tmpl").": $@";
++ }
++ }
++ my $needcontent=$raw || (!($archive && $quick) && $template->query(name => 'content'));
++
++ foreach my $page (@list) {
++ last if ($num && scalar @displist >= $num);
++ my $file = $pagesources{$page};
++ my $type = pagetype($file);
++ if (! $raw) {
++ # Get the content before populating the
++ # template, since getting the content uses
++ # the same template if inlines are nested.
++ if ($needcontent) {
++ my $content;
++ if (exists $params{inliner_} && defined $params{inliner_}) {
++ $content = &{$params{inliner_}}($page, $template, %params);
++ } else {
++ $content=get_inline_content($page, $params{destpage});
++ }
++ next if !defined $content;
++ $template->param(content => $content);
++ push @displist, $page;
++ }
++ $template->param(pageurl => urlto($page, $params{destpage}));
++ $template->param(inlinepage => $page);
++ $template->param(title => pagetitle(basename($page)));
++ $template->param(ctime => displaytime($pagectime{$page}, $params{timeformat}, 1));
++ $template->param(mtime => displaytime($pagemtime{$page}, $params{timeformat}));
++ $template->param(first => 1) if $page eq $list[0];
++ $template->param(last => 1) if ($num && scalar @displist == $num);
++ $template->param(html5 => $config{html5});
+
++ if ($actions) {
++ my $file = $pagesources{$page};
++ my $type = pagetype($file);
++ if ($config{discussion}) {
++ if ($page !~ /.*\/\Q$config{discussionpage}\E$/i &&
++ (length $config{cgiurl} ||
++ exists $pagesources{$page."/".lc($config{discussionpage})})) {
++ $template->param(have_actions => 1);
++ $template->param(discussionlink =>
++ htmllink($page,
++ $params{destpage},
++ $config{discussionpage},
++ noimageinline => 1,
++ forcesubpage => 1));
++ }
++ }
++ if (length $config{cgiurl} &&
++ defined $type &&
++ IkiWiki->can("cgi_editpage")) {
++ $template->param(have_actions => 1);
++ $template->param(editurl => cgiurl(do => "edit", page => $page));
++
++ }
++ }
++
++ run_hooks(pagetemplate => sub {
++ shift->(page => $page, destpage => $params{destpage},
++ template => $template,);
++ });
++
++ $ret.=$template->output;
++ $template->clear_params;
++ }
++ else {
++ if (defined $type) {
++ $ret.="\n".
++ linkify($page, $params{destpage},
++ preprocess($page, $params{destpage},
++ filter($page, $params{destpage},
++ readfile(srcfile($file)))));
++ }
++ else {
++ $ret.="\n".
++ readfile(srcfile($file));
++ }
++ push @displist, $page;
++ }
++ }
++ }
++ @list = @displist;
++
+ my @feedlist;
+ if ($feeds) {
+ if (exists $params{feedshow} &&
+@@ -241,10 +340,6 @@
+ }
+ }
+
+- if ($params{show} && @list > $params{show}) {
+- @list=@list[0..$params{show} - 1];
+- }
+-
+ if ($feeds && exists $params{feedpages}) {
+ @feedlist = pagespec_match_list(
+ $params{page}, "($params{pages}) and ($params{feedpages})",
+@@ -302,8 +397,6 @@
+ }
+ }
+
+- my $ret="";
+-
+ if (length $config{cgiurl} && ! $params{preview} && (exists $params{rootpage} ||
+ (exists $params{postform} && yesno($params{postform}))) &&
+ IkiWiki->can("cgi_editpage")) {
+@@ -355,91 +448,7 @@
+ }
+ $ret.=$linktemplate->output;
+ }
+-
+- if (! $feedonly) {
+- my $template;
+- if (! $raw) {
+- # cannot use wiki pages as templates; template not sanitized due to
+- # format hook hack
+- eval {
+- $template=template_depends($params{template}.".tmpl", $params{page},
+- blind_cache => 1);
+- };
+- if ($@) {
+- error sprintf(gettext("failed to process template %s"), $params{template}.".tmpl").": $@";
+- }
+- }
+- my $needcontent=$raw || (!($archive && $quick) && $template->query(name => 'content'));
+-
+- foreach my $page (@list) {
+- my $file = $pagesources{$page};
+- my $type = pagetype($file);
+- if (! $raw) {
+- if ($needcontent) {
+- # Get the content before populating the
+- # template, since getting the content uses
+- # the same template if inlines are nested.
+- my $content=get_inline_content($page, $params{destpage});
+- $template->param(content => $content);
+- }
+- $template->param(pageurl => urlto($page, $params{destpage}));
+- $template->param(inlinepage => $page);
+- $template->param(title => pagetitle(basename($page)));
+- $template->param(ctime => displaytime($pagectime{$page}, $params{timeformat}, 1));
+- $template->param(mtime => displaytime($pagemtime{$page}, $params{timeformat}));
+- $template->param(first => 1) if $page eq $list[0];
+- $template->param(last => 1) if $page eq $list[$#list];
+- $template->param(html5 => $config{html5});
+-
+- if ($actions) {
+- my $file = $pagesources{$page};
+- my $type = pagetype($file);
+- if ($config{discussion}) {
+- if ($page !~ /.*\/\Q$config{discussionpage}\E$/i &&
+- (length $config{cgiurl} ||
+- exists $pagesources{$page."/".lc($config{discussionpage})})) {
+- $template->param(have_actions => 1);
+- $template->param(discussionlink =>
+- htmllink($page,
+- $params{destpage},
+- $config{discussionpage},
+- noimageinline => 1,
+- forcesubpage => 1));
+- }
+- }
+- if (length $config{cgiurl} &&
+- defined $type &&
+- IkiWiki->can("cgi_editpage")) {
+- $template->param(have_actions => 1);
+- $template->param(editurl => cgiurl(do => "edit", page => $page));
+
+- }
+- }
+-
+- run_hooks(pagetemplate => sub {
+- shift->(page => $page, destpage => $params{destpage},
+- template => $template,);
+- });
+-
+- $ret.=$template->output;
+- $template->clear_params;
+- }
+- else {
+- if (defined $type) {
+- $ret.="\n".
+- linkify($page, $params{destpage},
+- preprocess($page, $params{destpage},
+- filter($page, $params{destpage},
+- readfile(srcfile($file)))));
+- }
+- else {
+- $ret.="\n".
+- readfile(srcfile($file));
+- }
+- }
+- }
+- }
+-
+ if ($feeds && ($emptyfeeds || @feedlist)) {
+ if ($rss) {
+ my $rssp=$feedbase."rss".$feednum;
+</code></pre>
diff --git a/doc/todo/Feature_parity_with_Trac.mdwn b/doc/todo/Feature_parity_with_Trac.mdwn
new file mode 100644
index 000000000..b2d9d43ed
--- /dev/null
+++ b/doc/todo/Feature_parity_with_Trac.mdwn
@@ -0,0 +1,22 @@
+It would be nice to have feature parity with [Trac](http://trac.edgewall.org/). Note that it is expected that the
+implementation will be quite different, but IkiWiki should support the same uses.
+
+Features needed:
+
+ * Wiki, duh.
+ * Source code viewing: This can be handled quite well with a [[shortcut|shortcuts]] to an external source viewer, or by putting
+ the source in the wiki itself (see the [[todo/automatic_use_of_syntax_plugin_on_source_code_files]] wishlist item and [[todo/syntax_highlighting]] todo item).
+ * This could be improved with [[todo/source_link]].
+ * Currently the source highlighting is a little problematic, as there can be two source files with the same wikiname.
+ e.g. a `hello-world.c` and `hello-world.h`. See [[bugs/multiple_pages_with_same_name]]
+ > That bug was fixed before you linked to the page. :-)
+ >> I was the one that fixed it... :) -- [[Will]]
+ * Trac 'Timeline' feature: view history of the RCS - the `recentchanges` button.
+ * Trac 'Roadmap' feature: Which TODOs/bugs are needed for which milestones. Use the [[plugins/progress]] directive to show percentage complete for each milestone.
+ * Bug tracking: see [[tips/integrated_issue_tracking_with_ikiwiki]] and [[todo/Updated_bug_tracking_example]].
+ * Queries on bug database: e.g. all open bugs that don't depend on an open bug. See [[todo/tracking_bugs_with_dependencies]] and [[todo/structured_page_data]].
+ * Build Status: Maybe this is just a link to an external (centralised) status board (e.g. [BuildBot](http://buildbot.net/)).
+
+-- [[Will]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/Fenced_code_blocks___40__from_GitHub_Flavored_Markdown__41__.mdwn b/doc/todo/Fenced_code_blocks___40__from_GitHub_Flavored_Markdown__41__.mdwn
new file mode 100644
index 000000000..24552b29f
--- /dev/null
+++ b/doc/todo/Fenced_code_blocks___40__from_GitHub_Flavored_Markdown__41__.mdwn
@@ -0,0 +1,44 @@
+GitHub's flavor of markdown adds fenced code blocks, delimited by triple-backquotes, like this:
+
+ ```
+ code
+ more code
+ ```
+
+That syntax proves quite a bit more convenient in many cases, because it doesn't require modifying every line of the code block to add indentation. Please consider adding optional support for this in ikiwiki. Please also consider turning it on by default for new wikis, though not for existing wikis since it could *potentially* break backward compatibility with existing content.
+
+> I don't think that's an official markdown feature, although it might be available
+> as an extension in some markdown library or other -- possibly one of the ones
+> supported by ikiwiki.
+>
+> However, aside from compatibility, ikiwiki already provides a way to do it that does not
+> require indenting the code: The [[ikiwiki/directive/format]] directive. Which has the benefit of
+> also telling it what kind of code it is, so it can syntax highlight it. Example:
+
+[[!format haskell """
+main :: IO ()
+main = forever $
+ putStrLn "hello, world!"
+"""]]
+
+> --[[Joey]]
+
+> > It is not a standard feature (as much as Markdown is [[standardized|Track_Markdown_Standardisation_Efforts]]...) But it does allow for [syntax highlighting](https://help.github.com/articles/github-flavored-markdown) too, just tag the language name after the backticks. It *seems* that Discount supports github-style backtick format (as well as Pandoc `~~~~` format) but doesn't allow the keyword argument.
+> >
+> > I strongly support this feature. --[[anarcat]]
+> >
+> > In fact, it turns out that it already works here!
+> >
+> > ~~~~
+> > this is a pandoc-style fenced in code block
+> > this is another line
+> > ~~~~
+> >
+> > github-style backticks, however, do not add a wrapping `<pre>` block for some reason:
+> >
+> > ```
+> > this is a github-style fenced in code block
+> > this is another line
+> > ```
+> >
+> > ... maybe a bug in Discount... --[[anarcat]]
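For reference, recognizing both fence styles takes very little code. Here is a sketch (in Python, purely illustrative, not ikiwiki or Discount internals) of a scanner that accepts backtick fences with an optional language tag as well as Pandoc-style tilde fences:

```python
import re

# Opening fence: three or more backticks or tildes, optional language tag.
FENCE_RE = re.compile(r'^(`{3,}|~{3,})[ \t]*([\w+-]*)[ \t]*$')

def extract_fenced_blocks(text):
    """Return a list of (language, code) tuples, one per fenced block."""
    blocks = []
    fence = None      # the fence string that opened the current block
    lang = ''
    body = []
    for line in text.splitlines():
        m = FENCE_RE.match(line)
        if fence is None:
            if m:
                fence, lang = m.group(1), m.group(2)
                body = []
        elif m and m.group(1)[0] == fence[0] and len(m.group(1)) >= len(fence):
            # Closing fence: same character, at least as long as the opener.
            blocks.append((lang, '\n'.join(body)))
            fence = None
        else:
            body.append(line)
    return blocks
```

A real implementation (like Discount's) also has to decide how fences interact with blockquotes and lists, which is likely where the missing `<pre>` wrapper above comes from.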
diff --git a/doc/todo/Fix_CSS_to_not_put_a_border_around_image_links.mdwn b/doc/todo/Fix_CSS_to_not_put_a_border_around_image_links.mdwn
new file mode 100644
index 000000000..f6cafe376
--- /dev/null
+++ b/doc/todo/Fix_CSS_to_not_put_a_border_around_image_links.mdwn
@@ -0,0 +1,7 @@
+Browsers, by default, put a link-colored border around images used as links:
+
+[![ikiwiki logo](http://ikiwiki.info/logo/ikiwiki.png)](http://ikiwiki.info)
+
+We should fix the CSS to not do that. --[[JoshTriplett]]
+
+[[todo/done]] --[[JoshTriplett]]
diff --git a/doc/todo/Fix_selflink_in_po_plugin.mdwn b/doc/todo/Fix_selflink_in_po_plugin.mdwn
new file mode 100644
index 000000000..b276c075d
--- /dev/null
+++ b/doc/todo/Fix_selflink_in_po_plugin.mdwn
@@ -0,0 +1,21 @@
+Using the po plugin, a link to /bla is present in the sidebar.
+When viewing /bla in the default language, this link is detected as
+a selflink. When viewing a translation of /bla, it
+isn't. --[[intrigeri]]
+
+Fixed in my po branch. --[[intrigeri]]
+
+[[!tag patch done]]
+
+> bump?
+
+>> I know I've looked at 88c6e2891593fd508701d728602515e47284180c
+>> before, and something about it just seemed wrong. Maybe it's
+>> the triviality of the sub, which it would seem to be easy to
+>> decide to refactor back into its one caller (which would reintroduce the
+>> bug). --[[Joey]]
+
+>>> Well, I can hear and understand this. Apart of adding a comment to
+>>> the sub, explaining the rationale (which is now done in my po
+>>> branch), I don't know what I can do to make it not seem wrong.
+>>> --[[intrigeri]]
diff --git a/doc/todo/FormBuilder__95__Template__95__patch.mdwn b/doc/todo/FormBuilder__95__Template__95__patch.mdwn
new file mode 100644
index 000000000..e754c9ea5
--- /dev/null
+++ b/doc/todo/FormBuilder__95__Template__95__patch.mdwn
@@ -0,0 +1,10 @@
+This patch allows you to use the pagetemplate hooks with FormBuilder.
+
+It can be downloaded from <http://ikiwiki.xbaud.com/FormBuilder_Templates_patch.tar.gz>
+
+--[[TaylorKillian]]
+
+I think I've fixed this in a different way. Please let me know if the fix
+somehow doesn't work well enough. --[[Joey]]
+
+[[done]](?)
diff --git a/doc/todo/FormattingHelp_should_open_new_window.mdwn b/doc/todo/FormattingHelp_should_open_new_window.mdwn
new file mode 100644
index 000000000..58b4b6856
--- /dev/null
+++ b/doc/todo/FormattingHelp_should_open_new_window.mdwn
@@ -0,0 +1 @@
+Currently the FormattingHelp link on the editing page takes you away from the page being edited. It should open in a new window, to allow the user to continue editing; Wikipedia's edit page behaves this way.
diff --git a/doc/todo/Gallery.mdwn b/doc/todo/Gallery.mdwn
new file mode 100644
index 000000000..f41980333
--- /dev/null
+++ b/doc/todo/Gallery.mdwn
@@ -0,0 +1,83 @@
+[[!template id=gitbranch branch=origin/gallery author="[[arpitjain]]"]]
+
+A new version of the gallery plugin is now available. A few more features have been added, such as support for multiple pages, and sorting and resizing of images.
+
+Gallery repo is now available at <http://github.com/joeyh/ikiwiki/tree/gallery>
+
+--[[arpitjain]]
+
+----
+
+creating a gallery of a bunch of images:
+
+* Display Exif information
+* Display image information (like size, date, resolution, compression...)
+* Create CSS data for customizing
+* Create Thumbnails (maybe in more than one size, eg: full,1024x768,800x600,640x480)
+* Descriptions for every image
+* Comments
+* Ratings
+* Watermarks
+* Some javascript for easy navigation (see [photon](http://www.saillard.org/programs_and_patches/photon/) for a good example)
+
+It should be possible to disable every feature for every directory.
+
+----
+
+This could be split into two distinct projects. One would be to modify the [[plugins/img]] plugin to support some of these ideas for extracting and using information such as exif out of images. The other project would be to design something that handles setting up a gallery (which could be just some regular wiki pages using the img plugin, and perhaps some other custom plugins for things like ratings and javascript), and adding new images to a gallery as they are added to the wiki.
+
+That's one way to do it, and it has some nice benefits, like being able to edit the gallery pages like any wiki page, to add comments about images, links, etc. An example of ikiwiki being used like that: <http://kitenet.net/~family/pics/guaimaca.html> (still room for improvement, clearly).
+
+--[[Joey]]
+
+[[!tag soc]]
+
+[[!tag wishlist]]
+
+----
+
+I have implemented the first version of the Gallery Plugin for Ikiwiki as part of [[soc]]. This plugin creates a nice-looking gallery of images once the directory containing them is specified, along with some additional parameters. It has been built on top of the img plugin.
+
+Plugin can be downloaded from [here](http://myweb.unomaha.edu/~ajain/gallery.tar).
+
+It can be used as : <br>
+\[[!gallery imagedir="images" thumbnailsize="200x200" cols="3" alt="Can not be displayed" title="My Pictures"]]
+
+where-<br>
+* imagedir => Directory containing images. It will scan all the files with a jpg|png|gif extension from the directory and will put them in the gallery.<br>
+* thumbnailsize(optional) => Size of the thumbnail that you want to generate for the gallery.<br>
+* alt(optional) => If an image cannot be displayed, the text contained in the alt argument is shown instead.<br>
+* cols(optional) => Number of columns of thumbnails that you want to generate.<br>
+* title(optional) => Title of the gallery.<br>
+
+Features of the Gallery Plugin:<br>
+* You can go to the next image by clicking on the right side of the image or by pressing 'n'.<br>
+* Similarly, you can go to the previous image by clicking on the left side of the image or by pressing 'p'.<br>
+* Press esc to close the gallery.<br>
+* While viewing an image, nearby images are preloaded in the background to make browsing fast.<br>
+
+Right now, it features only one template, namely [Lightbox](http://www.hudddletogether.com). Later on, I will add a few more templates.<br>
+For any feedback or query, feel free to mail me at arpitjain11 [AT] gmail.com
+
+Additional details are available [here](http://myweb.unomaha.edu/~ajain/ikiwikigallery.html).
+
+[[!tag patch]]
+
+> I'd love to merge this into ikiwiki.
+>
+> However, lightbox.js is licensed under a non-free (Creative Commons) license. :-(
+>
+> Since I don't much like the lightbox effects anyway (too much resizing
+> motion, too slow), I wonder if another template could be added, perhaps
+> a free one?
+>
+> Now that ikiwiki is in git, I've downloaded the most recent version of
+> the gallery and put it in a "gallery" branch of my git repository.
+>
+> --[[Joey]]
+
+----
+
+See also [[/users/smcv/gallery]] for another implementation of the same sort of
+thing. Unfortunately, none of the implementation ideas
+I have there seem quite right either... --[[smcv]]
diff --git a/doc/todo/Give_access_to_more_TMPL__95__VAR_variables_in_templates_inserted_by_the_template_plugin.mdwn b/doc/todo/Give_access_to_more_TMPL__95__VAR_variables_in_templates_inserted_by_the_template_plugin.mdwn
new file mode 100644
index 000000000..c71250b3a
--- /dev/null
+++ b/doc/todo/Give_access_to_more_TMPL__95__VAR_variables_in_templates_inserted_by_the_template_plugin.mdwn
@@ -0,0 +1,111 @@
+[[!tag wishlist patch]]
+
+# Context
+
+I may have missed a simple way to achieve what I need without
+modifying ikiwiki, so here is the context.
+
+I have a first-level directory (called `bricks`) containing a bunch of
+wiki pages :
+
+ /bricks
+ |
+ |- bla.mdwn
+ |
+ |- bli.mdwn
+ |
+ `- ...
+
+I have two groups of tags called `direction` and `usage`, stored in
+two sub-directories of `$tagbase` :
+
+ /tag
+ |
+ |- direction
+ | |- d1.mdwn
+ | |- d2.mdwn
+ | |- ...
+ |
+ |- usage
+ | |- u1.mdwn
+ | |- u2.mdwn
+ | |- ...
+
+Any page in `/bricks` can be tagged with one or more tags from any of
+these tags-groups.
+
+I need to present different views for these wiki pages, so a `/view`
+tree is dedicated to this mission :
+
+ /view
+ |
+ |- dev
+ | |- d1.mdwn
+ | |- d2.mdwn
+ | |-...
+ |
+ |- howto
+ |- u1.mdwn
+ |- u2.mdwn
+ |- ...
+
+... where e.g. `/view/dev/d1` lists a subset (depending on other tags)
+of the pages tagged d1.
+
+My current plan is :
+
+- thanks to the edittemplate plugin, `/view/dev/*` and `/view/howto/*` would contain respectively `\[[!template id=dev_direction]]` and `\[[!template id=howto_usage]]`
+- `/templates/dev_direction.mdwn` and `/templates/howto_usage.mdwn` would use `\[[!map ...]]` directives to build their views
+
+# My issue
+
+Such map directives would look something like the following (more
+complicated, actually, but let's focus on my current issue) :
+
+ \[[!map pages="bricks/* and link(tag/usage/<TMPL_VAR BASENAME>)"]]
+
+Where `BASENAME` value would be, e.g., `u1` or `d2`, depending on the
+page inserting the template. But `BASENAME` does not exist. I found
+that `<TMPL_VAR PAGE>` is replaced with the full path to the page, but
+I did not found how to get the page's basename in a template included
+with a `\[[!template id=...]]` directive.
+
+Any idea ?
+
+I guess it would be possible to provide the templates inserted by the
+template plugin with more `TMPL_VAR` variables, but I don't know ikiwiki
+codebase well, so I don't know how hard it would be.
+
+I sure could write an ad-hoc plugin that would use the pagetemplate
+hook to add a custom `<TMPL_LOOP>` selector I could use in my
+templates, or another one that would use the filter hook to add the
+needed `\[[!map ...]]` where it belongs to, but since ikiwiki's
+preprocessor directives already *almost* do what I need, I'd like to
+avoid the ad-hoc plugin solution.
+
+(Adding a parameter such as `name=d1` in every `/view/dev/d*.mdwn` and
+`/view/howto/u*.mdwn` page is not an option : I want to factorize these
+pages as much as possible.)
+
+> The following patch adds a `basename` `TMPL_VAR` variable that can be
+> used in the templates inserted by the template plugin :
+
+> diff --git a/IkiWiki/Plugin/template.pm b/IkiWiki/Plugin/template.pm
+> index a6e34fc..bb9dd8d 100644
+> --- a/IkiWiki/Plugin/template.pm
+> +++ b/IkiWiki/Plugin/template.pm
+> @@ -57,6 +57,8 @@ sub preprocess (@) {
+> }
+> }
+>
+> + $template->param("basename" => (split("/", $params{page}))[-1]);
+> +
+> return IkiWiki::preprocess($params{page}, $params{destpage},
+> IkiWiki::filter($params{page}, $params{destpage},
+> $template->output));
+>
+> -- intrigeri
+
+> Thanks for taking the trouble to develop a patch. [[done]] --[[Joey]]
+
+>> Thanks :) -- intrigeri
diff --git a/doc/todo/Google_Analytics_support.mdwn b/doc/todo/Google_Analytics_support.mdwn
new file mode 100644
index 000000000..8bbb1c69b
--- /dev/null
+++ b/doc/todo/Google_Analytics_support.mdwn
@@ -0,0 +1,31 @@
+[[!template id=gitbranch branch=GiuseppeBilotta/google-analytics
+author="[[GiuseppeBilotta]]"]]
+
+I've extended the google plugin to add support for Google Analytics.
+This is done in two steps:
+
+* a `google_sitesearch` config option is introduced, to allow disabling
+ sitesearch even when the `google` plugin is loaded
+* a `google_analytics_account` config option is introduced. When it's
+ defined, its value is assumed to be a Google Analytics account ID
+ and the corresponding JavaScript code is automatically inserted in all
+  documents. The way this is done is shamelessly stolen from the flattr
+ plugin
+
+> Putting this in the google plugin does not seem to be a good approach.
+> That this "functionality" is offered by the same company as google search
+> is really of no consequence.
+
+Well, my idea was to put all Google-related functionality (in the sense
+of support for any service provided by Google) into the google plugin.
+The alternative would have been to have one separate plugin per feature,
+but that doesn't sound particularly nice to me. I can split it in a
+separate plugin if you believe it's cleaner that way.
+
+> Also, can't this be easily accomplished by editing page.tmpl? --[[Joey]]
+
+Yes, and so would flattr. But precisely because this kind of code would require
+editing page.tmpl, doing it the manual way carries the burden of keeping it in
+sync across Ikiwiki updates (I'm sure I don't need to mention the number of
+help requests that essentially boil down to "oops, I was using custom templates
+and hadn't updated them").
diff --git a/doc/todo/Google_Sitemap_protocol.mdwn b/doc/todo/Google_Sitemap_protocol.mdwn
new file mode 100644
index 000000000..ea8ee7f03
--- /dev/null
+++ b/doc/todo/Google_Sitemap_protocol.mdwn
@@ -0,0 +1,60 @@
+It would be useful if ikiwiki was able to create [google sitemap][1] files to allow easy indexing.
+
+[1]: https://www.google.com/webmasters/tools/docs/en/protocol.html
+
+> Sitemaps are particularly beneficial when users can't reach all areas of a
+> website through a browseable interface. (Generally, this is when users are
+> unable to reach certain pages or regions of a site by following links). For
+> example, any site where certain pages are only accessible via a search form
+> would benefit from creating a Sitemap and submitting it to search engines.
+
+What I don't get is exactly how ikiwiki, as a static wiki that's quite
+deeply hyperlinked, benefits from a sitemap. The orphans plugin can
+produce a map of pages that other pages do not link to, if you're worried
+about having such pages not found by web spiders.
+
+--[[Joey]]
+
+While pages are very interlinked, most people use ikiwiki for blogging. Blogging produces pages at random intervals and google apparently optimizes their crawls to fit the frequency of changes. For me it's not so often that the contents of my blog changes, so google indexes it quite infrequently. Sitemaps are polled more often than other content (if one exists) so it's lighter for the site and for search engines (yes, google) to frequently poll it instead. So it's not that pages can't be found, but it's lighter for the site to keep an up to date index.
+
+-- Sami
+
+> I've written a sitemaps plugin for my own use. With a little tweaking it
+> should be usable for others. See [my git
+> repo](http://localhost/git/?p=website.git;a=blob;f=plugins/googlesitemap.pm)
+> for an example. You will probably need to strip out the metadata variables I
+> gather, and tweak to generate proper priorities. The code is pretty simple
+> though and self-explanatory.
+>
+> -- CharlesMauch
+
+>> presumably you really mean [xtermin.us rather than localhost](http://xtermin.us/git/?p=website.git;a=blob;f=plugins/googlesitemap.pm)
+>> -- [[KarlMW]]
+
+>>>[xtermin.us rather than localhost](http://xtermin.us/git/?p=website.git;a=blob;f=plugins/googlesitemap.pm) is 404 now.
+>>> -- weakish
+
+
+Although it is not able to read the meta-data from files, using google-sitemapgen [works well for me](http://bzed.de/posts/2010/06/creating_a_google_sitemap_for_ikiwiki/) to create a sitemap for my ikiwiki installation. -- [[bzed|BerndZeimetz]]
+
+There is a [sitemap XML standard](http://www.sitemaps.org/protocol.php) that ikiwiki would need to generate.
+
+# Google Webmaster tools and RSS
+
+On [Google Webmaster tools](https://www.google.com/webmasters/tools) you can substitute an RSS feed as a sitemap. Do not use Atom: if you have malformed XHTML it will fail to parse and you will get an ERROR message like so:
+
+ We were unable to read your Sitemap. It may contain an entry we are unable to recognize. Please validate your Sitemap before resubmitting.
+
+[Google should grok feeds as sitemaps.](http://www.google.com/support/webmasters/bin/answer.py?answer=34654) Or rather [[plugins/inline]] should be improved to support the [sitemap protocol](http://sitemaps.org/protocol.php) natively.
+
+-- [[Hendry]]
+
+
+Took me a minute to figure this out so I figured I'd share the steps I took:
+
+* Added rss=>1 and allowrss=>1 to my setup file
+* Created a new page where the RSS would be generated, with this content, replacing "first_page" with the page in my wiki with the earliest date:
+
+<pre>
+\[[!inline pages="* and !*/Discussion and created_after(first_page)" archive="yes" rss="yes" ]]
+</pre>
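For comparison, the sitemap protocol itself is simple enough to emit directly. A minimal sketch (in Python, with a hypothetical page list; not an ikiwiki plugin):

```python
from xml.sax.saxutils import escape

def make_sitemap(urls):
    """Render a minimal sitemaps.org <urlset> from (loc, lastmod, priority) tuples."""
    out = ['<?xml version="1.0" encoding="UTF-8"?>',
           '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc, lastmod, priority in urls:
        out.append('  <url>')
        out.append('    <loc>%s</loc>' % escape(loc))
        out.append('    <lastmod>%s</lastmod>' % lastmod)  # W3C date, e.g. 2013-07-10
        out.append('    <priority>%.1f</priority>' % priority)
        out.append('  </url>')
    out.append('</urlset>')
    return '\n'.join(out)
```

An ikiwiki plugin would get `loc` from `urlto()` and `lastmod` from `%pagemtime`; the priority heuristic is the part CharlesMauch's plugin says needs per-site tweaking.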
diff --git a/doc/todo/Have_xapian_index_pdf__44___openoffice__44___documents.mdwn b/doc/todo/Have_xapian_index_pdf__44___openoffice__44___documents.mdwn
new file mode 100644
index 000000000..7fbe728e2
--- /dev/null
+++ b/doc/todo/Have_xapian_index_pdf__44___openoffice__44___documents.mdwn
@@ -0,0 +1,5 @@
+Xapian supports indexing pdfs, openoffice docs, etc. It would be nice if
+the search plugin would index these documents and optionally allow their
+exclusion.
+
+[[!tag wishlist]]
diff --git a/doc/todo/IRC_topic.mdwn b/doc/todo/IRC_topic.mdwn
new file mode 100644
index 000000000..036787e73
--- /dev/null
+++ b/doc/todo/IRC_topic.mdwn
@@ -0,0 +1,10 @@
+ikiwiki could support grabbing the /topic from an IRC channel and putting the
+result in a page. See <http://wiki.debian.org/TopicDebianDevel> for an
+example. Like [[plugins/aggregate]], the page and its updates should not go
+in version control. --[[JoshTriplett]]
+
+A simple script should be able to poll for the irc topic and update a page,
+then run ikiwiki -refresh to update the wiki. No need to put that in
+ikiwiki or a plugin, though. --[[Joey]]
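The kind of out-of-band script Joey describes could look like this (a Python sketch; the page path, setup file, and however the topic string is obtained are all assumptions):

```python
import subprocess

def topic_page(topic):
    """Render wiki page source showing the current /topic."""
    return "Current #ikiwiki topic:\n\n> %s\n" % topic.strip()

def update(topic, page_path="/srv/wiki/irc_topic.mdwn", setup="/srv/wiki.setup"):
    # The page lives outside version control, per the aggregate-style model.
    with open(page_path, "w") as f:
        f.write(topic_page(topic))
    # Then just refresh the rendered wiki.
    subprocess.call(["ikiwiki", "--setup", setup, "--refresh"])
```

Run from cron, with the topic fetched by whatever IRC client or bot is already watching the channel.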
+
+[[wishlist]]
diff --git a/doc/todo/Improve_display_of_OpenIDs.mdwn b/doc/todo/Improve_display_of_OpenIDs.mdwn
new file mode 100644
index 000000000..e2ba1d90d
--- /dev/null
+++ b/doc/todo/Improve_display_of_OpenIDs.mdwn
@@ -0,0 +1,5 @@
+Some OpenIDs seen in the IkiWiki git history are displayed poorly in [[RecentChanges]], including mine :-) (`http://smcv.pseudorandom.co.uk/`, shown as `smcv.pseudorandom [co.uk]`)
+
+My `openid` branch on <http://git.pseudorandom.co.uk/> improves on a couple of cases and adds a regression test. --[[smcv]]
+
+[[!tag patch done]]
diff --git a/doc/todo/Improve_markdown_speed.mdwn b/doc/todo/Improve_markdown_speed.mdwn
new file mode 100644
index 000000000..de3230c9e
--- /dev/null
+++ b/doc/todo/Improve_markdown_speed.mdwn
@@ -0,0 +1,33 @@
+I'm not sure where the bottleneck is for running ikiwiki over a site like
+my blog
+[Natalian](http://source.natalian-org.branchable.com/?p=source.git;),
+though I like to think the markdown processing could be sped up by
+supporting the C implementation of Markdown called
+[Sundown](https://github.com/tanoku/sundown).
+
+>> Sundown doesn't appear to have Perl bindings, so the cost of calling a
+>> separate process could wipe out some or all of the speed gain. It might
+>> be worth looking into Text::Upskirt instead, which uses the Upskirt
+>> library which Sundown appears to be derived from. -- [[KathrynAndersen]]
+
+>>> It would be fairly easy to write a perl binding for sundown. For that
+>>> matter, Text::Upskirt could be adapted to it. I am waiting for any of
+>>> upskirt, sundown and perl bindings to get into Debian, then I will
+>>> see about making ikiwiki use them.
+>>>
+>>> For now, I have added discount support to ikiwiki. This does speed up
+>>> markdown rendering by up to 40x, although when building a site ikiwiki
+>>> in practice does other work, so the gains are less impressive. Building
+>>> the ikiwiki doc wiki went from 62 to 45 seconds. The lack of a Debian
+>>> package of Text::Markdown::Discount means this is not used by default
+>>> yet.
+>>>
+>>> (Upskirt, discount... Who comes up with these names? Discount also
+>>> features a "NOPANTS" option.) --[[Joey]]
+
+>>>> Thanks for doing this; it's given a well-needed speedup to my huge site.
+>>>>
+>>>> (At least "Discount" is related to "Mark Down" but I don't fathom "Upskirt" either.)
+>>>> --[[KathrynAndersen]]
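The binding-versus-subprocess tradeoff discussed above is easy to measure. A sketch (Python standing in for the Perl side; `true` stands in for an external markdown binary, and the in-process call for a C binding like Text::Markdown::Discount):

```python
import subprocess, time

def time_per_call(fn, n=50):
    """Average wall-clock seconds per call of fn."""
    t0 = time.time()
    for _ in range(n):
        fn()
    return (time.time() - t0) / n

# One process spawn per page, roughly what shelling out to sundown would cost:
spawn = lambda: subprocess.call(["true"])
# An in-process call, roughly what a C binding costs:
inproc = lambda: len("some markdown text".upper())

print("per-call overhead: spawn=%.2gs inproc=%.2gs"
      % (time_per_call(spawn), time_per_call(inproc)))
```

The spawn overhead is typically milliseconds per call, which is why per-page process spawning can wipe out the raw rendering speedup of a C implementation.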
+
+[[wishlist]]
diff --git a/doc/todo/Improve_signin_form_layout.mdwn b/doc/todo/Improve_signin_form_layout.mdwn
new file mode 100644
index 000000000..fe7a84f82
--- /dev/null
+++ b/doc/todo/Improve_signin_form_layout.mdwn
@@ -0,0 +1,44 @@
+In SVN commits 3478, 3480, 3482, and
+3485, I added a fieldset around the passwordauth fields, and
+some additional documentation. However, this needs some additional work to
+work correctly with the registration part of the form, as well as the
+buttons. It may also need some CSS love, and some means to style multiple
+formbuilder fieldsets differently. I reverted these four commits to avoid
+regressions before the 2.0 release; after the release, we should look at it
+again. --[[JoshTriplett]]
+
+FormBuilder forms can be made much more amenable to styling by passing
+these parameters:
+
+ name => "signin",
+ template => {type => 'div'},
+
+This results in a form that uses div instead of a table for layout, and adds
+separate id attributes to every form element, including the fieldsets, so that
+different forms can be styled separately. The only downside is that it doesn't
+allow creating a custom template for the form, but a) nobody has done that and
+b) stylesheets are much easier probably. So I think this is the way to go, we
+just have to get stylin'. :-)
+
+ .fb_submit {
+ float: left;
+ margin: 2px 0;
+ }
+ #signin_openid_url_label {
+ float: left;
+ margin-right: 1ex;
+ }
+ #signin_openid {
+ padding: 10px 10px;
+ border: 1px solid #aaa;
+ background: #eee;
+ color: black !important;
+ }
+
+That looks pretty good.. putting the passwordauth part in a box of its own with
+the submit buttons I don't know how to do.
+
+I'm happy enough with what we have now though, with the above improvements.
+Hope you are too, as I'm calling this [[done]]..
+
+--[[Joey]]
diff --git a/doc/todo/Improving_the_efficiency_of_match__95__glob.mdwn b/doc/todo/Improving_the_efficiency_of_match__95__glob.mdwn
new file mode 100644
index 000000000..4e1df3381
--- /dev/null
+++ b/doc/todo/Improving_the_efficiency_of_match__95__glob.mdwn
@@ -0,0 +1,228 @@
+[[!template id=gitbranch branch=smcv/ready/glob-cache
+ author="[[KathrynAndersen]], [[smcv]]"]]
+[[!tag patch]]
+
+I've been profiling my IkiWiki to try to improve speed (having many pages makes speed even more important) and I've written a patch to improve the speed of match_glob. This matcher is a good one to improve, because it gets called so many times.
+
+Here's my patch - please consider it! -- [[KathrynAndersen]]
+
+> It seems to me as though changing `glob2re` to return qr/$re/, and calling
+> `memoize(glob2re)` next to the other memoize calls, would be a less
+> verbose way to do this? --[[smcv]]
+
+>> I think so, yeah. Anyway, do you have any benchmark results handy,
+>> Kathryn? --[[Joey]]
+
+>>> See below.
+>>> Also, would it make more sense for glob2re to return qr/^$re$/i rather than qr/$re/? Everything that uses glob2re seems to use
+ $foo =~ /^$re$/i
+>>> rather than /$re/ so I think that would make sense.
+>>> -- [[KathrynAndersen]]
+
+>>>> Git branch `smcv/ka-glob-cache` has Kathryn's patch. Git
+>>>> branch `smcv/memoize-glob2re` does as I suggested, which
+>>>> is less verbose than Kathryn's patch but also not as
+>>>> fast; I'm not sure why, tbh. --[[smcv]]
+
+>>>>> I think it's because my patch focuses on match_glob while the memoize patch focuses on `glob2re`, and `glob2re` is called in `filecheck`, `meta` and `po` as well as in `match_glob` and `match_user`; thus the memoized `glob2re` is dealing with a bigger set of globs to look up, and thus could be just that little bit slower. -- [[KathrynAndersen]]
+
+>>>>>> What may be going on is that glob2re is already a fairly fast
+>>>>>> function, so the overhead of memoizing it with the very generic
+>>>>>> `_memoizer` (see its source) swamps the memoization gain. Note
+>>>>>> that the few functions memoized with the Memoizer before were much
+>>>>>> more expensive, so that little overhead was acceptable then.
+>>>>>>
+>>>>>> It also may be that Kathryn's patch is slightly faster due to using
+>>>>>> the construct `$foo =~ $regexp` rather than `$foo =~ /$regexp/`
+>>>>>> (probably avoids a copy or something like that internally) --
+>>>>>> this despite checking both `exists` and `defined` on the hash, which
+>>>>>> should be redundant AFAICS.
+>>>>>>
+>>>>>> My guess is that the best of both worlds would be to move
+>>>>>> the byhand memoization to glob2re and have it return a compiled
+>>>>>> `/^/i` regexp that can be used without further modification in most
+>>>>>> cases. --[[Joey]]
+
+>>>>>>> Done, see `smcv/ready/glob-cache` and `smcv/glob-cache-too-far`.
+>>>>>>>
+>>>>>>> Kathryn's patch is a significant improvement; my first patch on top of
+>>>>>>> that is a trivial cleanup that speeds it up a little, and the next two
+>>>>>>> patches (using precompiled regexes) have surprisingly little effect
+>>>>>>> (they don't slow it down either though, so either omit them or merge
+>>>>>>> them, whichever). Detailed benchmark results below.
+>>>>>>>
+>>>>>>> Moving the memoization to `glob2re` actually seems to slow things down
+>>>>>>> again - I suspect the docwiki has few enough mentions of `user()` etc.
+>>>>>>> that caching them is a waste of time, but perhaps it's not the most
+>>>>>>> representative.
+>>>>>>> --[[smcv]]
+
+[[done]] --[[Joey]]
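
For readers skimming the thread: the technique being benchmarked (caching a precompiled, anchored, case-insensitive regex per glob, instead of rebuilding the regex on every `match_glob` call) can be sketched quite compactly. A minimal Python illustration, not the actual Perl implementation; `glob2re` and `match_glob` names are borrowed from IkiWiki, and `lru_cache` stands in for the `%glob_cache` hash / `Memoize`:

```python
import re
from functools import lru_cache

@lru_cache(maxsize=None)
def glob2re(glob):
    """Translate an ikiwiki-style glob (* and ?) into a compiled,
    anchored, case-insensitive regex; the cache means repeated
    pagespec matches skip recompilation entirely."""
    parts = []
    for ch in glob:
        if ch == "*":
            parts.append(".*")
        elif ch == "?":
            parts.append(".")
        else:
            parts.append(re.escape(ch))
    return re.compile("^" + "".join(parts) + "$", re.IGNORECASE)

def match_glob(page, glob):
    # Repeated calls with the same glob reuse the compiled regex.
    return glob2re(glob).match(page) is not None
```

The anchoring and `/i` flag live inside the cached object, matching the suggestion above that `glob2re` return `qr/^$re$/i` directly rather than leaving every caller to wrap it.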
+
+--------------------------------------------------------------
+
+[[!toggle id="smcv-benchmark" text="current benchmarks"]]
+
+[[!toggleable id="smcv-benchmark" text="""
+master at time of branch:
+
+ time elapsed (wall): 29.6348
+ time running program: 24.9212 (84.09%)
+ time profiling (est.): 4.7136 (15.91%)
+ number of calls: 1360181
+ number of exceptions: 13
+
+ %Time Sec. #calls sec/call F name
+ 13.24 3.2986 3408 0.000968 Text::Balanced::_match_tagged
+ 10.94 2.7253 79514 0.000034 IkiWiki::PageSpec::match_glob
+ 3.19 0.7952 59454 0.000013 <anon>:IkiWiki/Plugin/inline.pm:223
+
+`Improve the speed of match_glob`:
+
+ time elapsed (wall): 27.9755
+ time running program: 23.5293 (84.11%)
+ time profiling (est.): 4.4461 (15.89%)
+ number of calls: 1280875
+ number of exceptions: 13
+
+ %Time Sec. #calls sec/call F name
+ 14.56 3.4257 3408 0.001005 Text::Balanced::_match_tagged
+ 7.82 1.8403 79514 0.000023 IkiWiki::PageSpec::match_glob
+ 3.27 0.7698 59454 0.000013 <anon>:IkiWiki/Plugin/inline.pm:223
+
+`match_glob: streamline glob cache slightly`:
+
+ time elapsed (wall): 27.5753
+ time running program: 23.1714 (84.03%)
+ time profiling (est.): 4.4039 (15.97%)
+ number of calls: 1280875
+ number of exceptions: 13
+
+ %Time Sec. #calls sec/call F name
+ 14.09 3.2637 3408 0.000958 Text::Balanced::_match_tagged
+ 7.74 1.7926 79514 0.000023 IkiWiki::PageSpec::match_glob
+ 3.30 0.7646 59454 0.000013 <anon>:IkiWiki/Plugin/inline.pm:223
+
+`glob2re: return a precompiled, anchored case-insensitiv...`:
+
+ time elapsed (wall): 27.5656
+ time running program: 23.1464 (83.97%)
+ time profiling (est.): 4.4192 (16.03%)
+ number of calls: 1282189
+ number of exceptions: 13
+
+ %Time Sec. #calls sec/call F name
+ 14.21 3.2891 3408 0.000965 Text::Balanced::_match_tagged
+ 7.72 1.7872 79514 0.000022 IkiWiki::PageSpec::match_glob
+ 3.32 0.7678 59454 0.000013 <anon>:IkiWiki/Plugin/inline.pm:223
+
+`make use of precompiled regex objects`:
+
+ time elapsed (wall): 27.5357
+ time running program: 23.1289 (84.00%)
+ time profiling (est.): 4.4068 (16.00%)
+ number of calls: 1281981
+ number of exceptions: 13
+
+ %Time Sec. #calls sec/call F name
+ 14.17 3.2776 3408 0.000962 Text::Balanced::_match_tagged
+ 7.70 1.7814 79514 0.000022 IkiWiki::PageSpec::match_glob
+ 3.35 0.7756 59454 0.000013 <anon>:IkiWiki/Plugin/inline.pm:223
+
+`move memoization from match_glob to glob2re`:
+
+ time elapsed (wall): 28.7677
+ time running program: 23.9473 (83.24%)
+ time profiling (est.): 4.8205 (16.76%)
+ number of calls: 1360181
+ number of exceptions: 13
+
+ %Time Sec. #calls sec/call F name
+ 13.98 3.3469 3408 0.000982 Text::Balanced::_match_tagged
+ 8.85 2.1194 79514 0.000027 IkiWiki::PageSpec::match_glob
+ 3.24 0.7750 59454 0.000013 <anon>:IkiWiki/Plugin/inline.pm:223
+
+--[[smcv]]
+"""]]
+
+--------------------------------------------------------------
+
+[[!toggle id="ka-benchmarks" text="Kathryn's benchmarks"]]
+
+[[!toggleable id="ka-benchmarks" text="""
+Benchmarks done with Devel::Profile on the same testbed IkiWiki setup. I'm just showing the start of the profile output, since that's what's relevant.
+
+Before:
+<pre>
+time elapsed (wall): 27.4173
+time running program: 22.5909 (82.40%)
+time profiling (est.): 4.8264 (17.60%)
+number of calls: 1314729
+number of exceptions: 65
+
+%Time Sec. #calls sec/call F name
+11.05 2.4969 62333 0.000040 IkiWiki::PageSpec::match_glob
+ 4.10 0.9261 679 0.001364 Text::Balanced::_match_tagged
+ 2.72 0.6139 59812 0.000010 IkiWiki::SuccessReason::merge_influences
+</pre>
+
+After:
+<pre>
+time elapsed (wall): 26.1843
+time running program: 21.5673 (82.37%)
+time profiling (est.): 4.6170 (17.63%)
+number of calls: 1252433
+number of exceptions: 65
+
+%Time Sec. #calls sec/call F name
+ 7.66 1.6521 62333 0.000027 IkiWiki::PageSpec::match_glob
+ 4.33 0.9336 679 0.001375 Text::Balanced::_match_tagged
+ 2.81 0.6057 59812 0.000010 IkiWiki::SuccessReason::merge_influences
+</pre>
+
+Note that the seconds per call for match_glob in the "after" case has gone down by about a third.
+
+K.A.
+"""]]
+
+--------------------------------------------------------------
+
+[[!toggle id="ka-patch" text="Kathryn's original patch"]]
+
+[[!toggleable id="ka-patch" text="""
+
+<pre>
+diff --git a/IkiWiki.pm b/IkiWiki.pm
+index 08a3d78..c187b98 100644
+--- a/IkiWiki.pm
++++ b/IkiWiki.pm
+@@ -2482,6 +2482,8 @@ sub derel ($$) {
+ return $path;
+ }
+
++my %glob_cache;
++
+ sub match_glob ($$;@) {
+ my $page=shift;
+ my $glob=shift;
+@@ -2489,8 +2491,15 @@ sub match_glob ($$;@) {
+
+ $glob=derel($glob, $params{location});
+
+- my $regexp=IkiWiki::glob2re($glob);
+- if ($page=~/^$regexp$/i) {
++ # Instead of converting the glob to a regex every time,
++ # cache the compiled regex to save time.
++ if (!exists $glob_cache{$glob}
++ or !defined $glob_cache{$glob})
++ {
++ my $re=IkiWiki::glob2re($glob);
++ $glob_cache{$glob} = qr/^$re$/i;
++ }
++ if ($page =~ $glob_cache{$glob}) {
+ if (! IkiWiki::isinternal($page) || $params{internal}) {
+ return IkiWiki::SuccessReason->new("$glob matches $page");
+ }
+</pre>
+"""]]
+--------------------------------------------------------------
diff --git a/doc/todo/Inline_plugin_option_to_show_full_page_path.mdwn b/doc/todo/Inline_plugin_option_to_show_full_page_path.mdwn
new file mode 100644
index 000000000..691694009
--- /dev/null
+++ b/doc/todo/Inline_plugin_option_to_show_full_page_path.mdwn
@@ -0,0 +1,30 @@
+I recently created a page which shows only discussion pages which have recently been updated (sort of a discussion blog), eg.
+
+<http://adam.shand.net/iki/comments/>
+
+My thought was that this would be a good way to keep track of recent comments other than by using the recent changes functionality. It works well, except that the titles of the discussion pages all show up as "discussion", so it's not visibly obvious which page they are discussing.
+
+One solution would be to change the inline plugin to support an argument which causes the full path of the page to be listed as the title. So rather than the title of a discussion page being listed as "discussion", it would show up as "2007/OpenDNS/discussion/".
+
+The only other way I can think of making this work would be to set the title of the discussion pages using the meta plugin ... but I don't like my chances of getting visitors to do that consistently. :-(
+
+Cheers,
+[[AdamShand]]
+
+> One way to approach it would be to add a field to the template
+> that contains the full page name. Then you just use a modified
+> `inlinepage.tmpl`, that uses that instead of the title. --[[Joey]]
+
+ diff --git a/IkiWiki/Plugin/inline.pm b/IkiWiki/Plugin/inline.pm
+ index 59eabb6..82913ba 100644
+ --- a/IkiWiki/Plugin/inline.pm
+ +++ b/IkiWiki/Plugin/inline.pm
+ @@ -229,6 +229,7 @@ sub preprocess_inline (@) {
+ $template->param(content => $content);
+ }
+ $template->param(pageurl => urlto(bestlink($params{page}, $page), $params{destpage}));
+ + $template->param(page => $page);
+ $template->param(title => pagetitle(basename($page)));
+ $template->param(ctime => displaytime($pagectime{$page}, $params{timeformat}));
+
+Cool, I'll give it a try, thanks! -- [[AdamShand]]
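+
+With the patch above applied, a modified `inlinepage.tmpl` could then use the
+new `page` variable in place of the title. A sketch (hypothetical markup; the
+surrounding template code varies by template):
+
+    <h1><a href="<TMPL_VAR PAGEURL>"><TMPL_VAR PAGE></a></h1>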
diff --git a/doc/todo/Location_of_pages_starting_with___36__tagbase_should_be_in__by_default.mdwn b/doc/todo/Location_of_pages_starting_with___36__tagbase_should_be_in__by_default.mdwn
new file mode 100644
index 000000000..98a92dddc
--- /dev/null
+++ b/doc/todo/Location_of_pages_starting_with___36__tagbase_should_be_in__by_default.mdwn
@@ -0,0 +1,15 @@
+If I click on a tag that does not yet have a name, but `$tagbase` is set,
+it would be nice if the location suggested is `$tagbase/<tagname>`, not a
+subpage: `<current>/$tagbase/<tagname>`.
+
+> This only seems doable if the tag link you click on is in the "Tags:"
+> footer, or is added with a taglink directive. Think that's good enough?
+> --[[Joey]]
+
+>> yes, that would be better! --[[madduck]]
+
+See also: [[bugs/tags_base_dir_not_used_when_creating_new_tags]]
+
+[[!tag wishlist]]
+
+[[done]]
diff --git a/doc/todo/Mailing_list.mdwn b/doc/todo/Mailing_list.mdwn
new file mode 100644
index 000000000..67cbbb00b
--- /dev/null
+++ b/doc/todo/Mailing_list.mdwn
@@ -0,0 +1,36 @@
+Please don't shoot me for asking:
+
+Could we have an ikiwiki mailing list?
+
+Here's an example use case:
+I want to discuss building a feature. Such discussion could happen on the wiki, but I would prefer to---at the least---be able to email ikiwiki developers and ask them to participate in this particular discussion.
+
+Does this sound okay?
+
+---[[JosephTurian]]
+
+[[!tag wishlist]]
+
+> People ask for this from time to time, but I personally prefer not to be
+> on an ikiwiki mailing list, because limiting public ikiwiki discussion to
+> this wiki helps make ikiwiki a better platform for discussion. So some
+> (most?) active ikiwiki people subscribe to recentchanges, or to the
+> todo/bugs/forum feeds, or to some other feed they create on their user page.
+> And there's work on making the discussion pages more structured, on
+> accepting comments sent via mail, etc. --[[Joey]]
+
+>>I was going to make the very same request, so I'm glad to know I'm not the only one who felt the need for it.
+
+>>I can see your reasoning, though I don't think ikiwiki has reached the level (yet) of facilitating discussion as well as a mailing list does.
+>>You've already pointed out the need for (a) more structured discussion pages, and (b) comments sent via mail, but I'm not sure whether that will be enough. The nature of a wiki means that discussions are scattered all over the site, as people discuss on the discussion pages for a given topic - and so they should. The consequence of this, however, is that one has a choice (in regard to RSS feeds) of having too much or too little: too little, if one only subscribes to the news/todo/bugs/forum feeds, since one misses out on discussions elsewhere; too much, because the only other option appears to be subscribing to recentchanges, which gives one *everything*, whether it is relevant or not.
+>>Unfortunately, I'm not really sure what the best solution is for this problem.
+
+>> For those who might be interested, I've added the following RSS feeds to <http://www.dreamwidth.org>:
+>> *ikiwiki_bugs_feed, ikiwiki_forum_feed, ikiwiki_news_feed, ikiwiki_recent_feed, ikiwiki_todo_feed, ikiwiki_wishlist_feed*
+
+>>--[[KathrynAndersen]]
diff --git a/doc/todo/Make_example_setup_file_consistent.mdwn b/doc/todo/Make_example_setup_file_consistent.mdwn
new file mode 100644
index 000000000..54cc34af6
--- /dev/null
+++ b/doc/todo/Make_example_setup_file_consistent.mdwn
@@ -0,0 +1,33 @@
+The current example ikiwiki.setup file has a number of options included, but commented out. This is standard. Unfortunately there are two standards for the settings of those commented out options:
+
+ - Have the commented out options showing the default setting, or
+ - Have the commented out options showing the most common alternate setting.
+
+Each of these has its advantages. The first makes it clear what the default setting is. The second makes it easy to switch the option on -- you just uncomment the option.
+
+My issue with ikiwiki's example setup file is that it doesn't appear to be consistent. Looking at the 'svn' entries (the first set of rcs entries), we see that
+
+ svnpath => "trunk",
+
+is an example of the first type, but
+
+ rcs => "svn",
+
+is an example of the second type.
+
+I think things could be improved if a clear decision was made here. Most of the settings seem to be of the second type. Perhaps all that is needed is for settings of the first type to grow a comment:
+
+ svnpath => "trunk", #default
+
+What do others think?
+
+> I agree, and I'll take a patch.
+>
+> I may not work on it myself, since I have some
+> [[interesting_ideas|online_configuration]] that would let ikiwiki
+> generate a setup file for you, rather than having to keep maintaining the
+> current example.
+>
+> And.. [[done]].. setup files are now generated with `--dumpsetup`, based on
+> the built-in defaults, and commented options show an example
+> setting, not a default. --[[Joey]]
diff --git a/doc/todo/Mercurial_backend_update.mdwn b/doc/todo/Mercurial_backend_update.mdwn
new file mode 100644
index 000000000..d98a4ea68
--- /dev/null
+++ b/doc/todo/Mercurial_backend_update.mdwn
@@ -0,0 +1,969 @@
+I submitted some changes that added 5 "Yes"es and 2 "Fast"s to Mercurial at [[/rcs]], but some functionality is still missing as compared to e.g. `git.pm`, to which it should be possible to make it equivalent.
+
+To do this, a more basic rewrite would simplify things. I inline the complete file below with comments. I don't expect anyone to take the time to read it all at once, but I'd be glad if those interested in the Mercurial backend could do some beta testing.
+
+* [This specific revision at my hg repo](http://46.239.104.5:81/hg/program/ikiwiki/file/4994ba5e36fa/Plugin/mercurial.pm) ([raw version](http://46.239.104.5:81/hg/program/ikiwiki/raw-file/4994ba5e36fa/Plugin/mercurial.pm)).
+
+* [My default branch](http://510x.se/hg/program/ikiwiki/file/default/Plugin/mercurial.pm) (where updates will be made, will mention here if anything happens) ([raw version](http://510x.se/hg/program/ikiwiki/raw-file/default/Plugin/mercurial.pm)).
+
+(I've stripped the `hgrc`-generation from the linked versions, so it should work to just drop them on top of the old `mercurial.pm`).
+
+I break out my comments from the code to make them more readable. I comment all the changes as compared to current upstream. --[[Daniel Andersson]]
+
+> So, sorry it took me so long (summer vacation), but I've finally
+> gotten around to looking at this. Based mostly just on the comments,
+> it does not seem mergeable as-is, yet. Red flags for me include:
+>
+> * This is a big rewrite, and the main idea seems to be to copy git.pm
+> and hack on it until it works, which I think is unlikely to be ideal
+> as git and mercurial are not really similar at the level used here.
+> * There have been no changes in your hg repo to the code since you
+> originally committed it. Either it's perfect, or it's not been tested..
+> * `hg_local_dirstate_shelve` writes to a temp file in the srcdir,
+> which is hardly clean or ideal.
+> * Relies on mercurial bookmarks extension that seems to need to be
+> turned on (how?)
+> * There are some places where code was taken from git.pm and the
+> comment asks if it even makes sense for mercurial, which obviously
+> would need to be cleaned up.
+> * The `rcs_receive` support especially is very ambitious to try to add to
+> the mercurial code. Does mercurial support anonymous pushes at all? How
+> would ikiwiki be run to handle such a push? How would it tell
+> mercurial not to accept a push if it made prohibited changes?
+>
+> I'm glad we already got so many standalone improvements into
+> mercurial.pm. That's a better approach than rewriting the world, unless
+> the world is badly broken.
+>
+> --[[Joey]]
+
+---
+
+ #!/usr/bin/perl
+ package IkiWiki::Plugin::mercurial;
+
+ use warnings;
+ use strict;
+ use IkiWiki;
+ use Encode;
+ use open qw{:utf8 :std};
+
+
+Pattern to validate hg sha1 sums. hg usually truncates the hash to 12
+characters and prepends a local revision number for output, but internally
+it keeps a 40-character hash. This code uses the long version.
+
+ my $sha1_pattern = qr/[0-9a-fA-F]{40}/;
+
+Message to skip in recent changes
+
+ my $dummy_commit_msg = 'dummy commit';
+
+*TODO:* `$hg_dir` not really implemented yet, until a srcdir/repository distinction is
+made as for e.g. Git. Used in `rcs_receive`, and for attachments in `hg_parse_changes`. See comments in those places, though.
+
+ my $hg_dir=undef;
+
+ sub import {
+ hook(type => "checkconfig", id => "mercurial", call => \&checkconfig);
+ hook(type => "getsetup", id => "mercurial", call => \&getsetup);
+ hook(type => "rcs", id => "rcs_update", call => \&rcs_update);
+ hook(type => "rcs", id => "rcs_prepedit", call => \&rcs_prepedit);
+ hook(type => "rcs", id => "rcs_commit", call => \&rcs_commit);
+ hook(type => "rcs", id => "rcs_commit_staged", call => \&rcs_commit_staged);
+ hook(type => "rcs", id => "rcs_add", call => \&rcs_add);
+ hook(type => "rcs", id => "rcs_remove", call => \&rcs_remove);
+ hook(type => "rcs", id => "rcs_rename", call => \&rcs_rename);
+ hook(type => "rcs", id => "rcs_recentchanges", call => \&rcs_recentchanges);
+ hook(type => "rcs", id => "rcs_diff", call => \&rcs_diff);
+ hook(type => "rcs", id => "rcs_getctime", call => \&rcs_getctime);
+ hook(type => "rcs", id => "rcs_getmtime", call => \&rcs_getmtime);
+ hook(type => "rcs", id => "rcs_preprevert", call => \&rcs_preprevert);
+ hook(type => "rcs", id => "rcs_revert", call => \&rcs_revert);
+
+This last hook is "unsanctioned" from [[Auto-setup and maintain Mercurial wrapper hooks]]. Will try to solve its function
+another way later.
+
+ hook(type => "rcs", id => "rcs_wrapper_postcall", call => \&rcs_wrapper_postcall);
+ }
+
+ sub checkconfig () {
+ if (exists $config{mercurial_wrapper} && length $config{mercurial_wrapper}) {
+ push @{$config{wrappers}}, {
+ wrapper => $config{mercurial_wrapper},
+ wrappermode => (defined $config{mercurial_wrappermode} ? $config{mercurial_wrappermode} : "06755"),
+
+Next line part of [[Auto-setup and maintain Mercurial wrapper hooks]].
+
+ wrapper_postcall => (defined $config{mercurial_wrapper_hgrc_update} ? $config{mercurial_wrapper_hgrc_update} : "1"),
+ };
+ }
+ }
+
+ sub getsetup () {
+ return
+ plugin => {
+ safe => 0, # rcs plugin
+ rebuild => undef,
+ section => "rcs",
+ },
+ mercurial_wrapper => {
+ type => "string",
+ #example => # FIXME add example
+ description => "mercurial post-commit hook to generate",
+ safe => 0, # file
+ rebuild => 0,
+ },
+ mercurial_wrappermode => {
+ type => "string",
+ example => '06755',
+ description => "mode for mercurial_wrapper (can safely be made suid)",
+ safe => 0,
+ rebuild => 0,
+ },
+ mercurial_wrapper_hgrc_update => {
+ type => "string",
+ example => "1",
+ description => "updates existing hgrc to reflect path changes for mercurial_wrapper",
+ safe => 0,
+ rebuild => 0,
+ },
+ historyurl => {
+ type => "string",
+ example => "http://example.com:8000/log/tip/\[[file]]",
+ description => "url to hg serve'd repository, to show file history (\[[file]] substituted)",
+ safe => 1,
+ rebuild => 1,
+ },
+ diffurl => {
+ type => "string",
+ example => "http://localhost:8000/?fd=\[[r2]];file=\[[file]]",
+ description => "url to hg serve'd repository, to show diff (\[[file]] and \[[r2]] substituted)",
+ safe => 1,
+ rebuild => 1,
+ },
+ }
+
+ sub safe_hg (&@) {
+ # Start a child process safely without resorting to /bin/sh.
+ # Returns command output (in list content) or success state
+ # (in scalar context), or runs the specified data handler.
+
+ my ($error_handler, $data_handler, @cmdline) = @_;
+
+ my $pid = open my $OUT, "-|";
+
+ error("Cannot fork: $!") if !defined $pid;
+
+ if (!$pid) {
+ # In child.
+ # hg commands want to be in wc.
+
+This `$hg_dir` logic means nothing and could be stripped until srcdir/repdir distinction is made (it's stripped in upstream `mercurial.pm` right now).
+
+ if (! defined $hg_dir) {
+ chdir $config{srcdir}
+ or error("cannot chdir to $config{srcdir}: $!");
+ }
+ else {
+ chdir $hg_dir or error("cannot chdir to $hg_dir: $!");
+ }
+
+ exec @cmdline or error("Cannot exec '@cmdline': $!");
+ }
+ # In parent.
+
+ my @lines;
+ while (<$OUT>) {
+ chomp;
+
+ if (! defined $data_handler) {
+ push @lines, $_;
+ }
+ else {
+ last unless $data_handler->($_);
+ }
+ }
+
+ close $OUT;
+
+ $error_handler->("'@cmdline' failed: $!") if $? && $error_handler;
+
+ return wantarray ? @lines : ($? == 0);
+ }
+ # Convenient wrappers.
+ sub run_or_die ($@) { safe_hg(\&error, undef, @_) }
+ sub run_or_cry ($@) { safe_hg(sub { warn @_ }, undef, @_) }
+ sub run_or_non ($@) { safe_hg(undef, undef, @_) }
+
+
+To handle uncommitted local changes ("ULC"s for short), I use logic similar to the (non-standard) "shelve" extension to Mercurial. By taking a diff before resetting to the last commit, making changes and then applying the diff again, one can do things Mercurial otherwise refuses, which is necessary later.
+
+This function creates this diff.
+
+ sub hg_local_dirstate_shelve ($) {
+	# Creates a diff snapshot of uncommitted changes existing in the srcdir.
+ # Takes a string (preferably revision) as input to create a unique and
+ # identifiable diff name.
+ my $tempdiffname = "diff_".shift;
+ my $tempdiffpath;
+ if (my @tempdiff = run_or_die('hg', 'diff', '-g')) {
+ $"="\n";
+ writefile($tempdiffname, $config{srcdir},
+ "@tempdiff");
+ $"=" ";
+ $tempdiffpath = $config{srcdir}.'/'.$tempdiffname;
+ }
+ return $tempdiffpath;
+ }
+
+This function restores the diff.
+
+ sub hg_local_dirstate_unshelve ($) {
+ # Applies diff snapshot to revert back to initial dir state. If diff
+ # revert succeeds, the diff is removed. Otherwise it stays to not
+ # eradicate the local changes if they were important. This clutters the
+ # directory though. Better ways to handle this are welcome. A true way
+ # around this dance is to have a separate repository for local changes
+ # and push ready commits to the srcdir instead.
+ if (my $tempdiffpath = shift) {
+ if (run_or_cry('hg', 'import', '--no-commit', $tempdiffpath)) {
+ unlink($tempdiffpath);
+ return undef;
+ }
+ }
+ }
+
+This makes online diffing possible. A similar approach to the one in `git.pm`, which is [discussed at some length in a comment there](http://source.ikiwiki.branchable.com/?p=source.git;a=blob;f=IkiWiki/Plugin/git.pm;h=cf7fbe9b7c43ee53180612d0411e6202074fb9e0;hb=refs/heads/master#l211), is taken.
+
+ sub merge_past ($$$) {
+ my ($sha1, $file, $message) = @_;
+
+ # Undo stack for cleanup in case of an error
+ my @undo;
+ # File content with conflict markers
+ my $conflict;
+ my $tempdiffpath;
+
+ eval {
+ # Hide local changes from Mercurial by renaming the modified
+ # file. Relative paths must be converted to absolute for
+ # renaming.
+ my ($target, $hidden) = (
+ "$config{srcdir}/${file}",
+ "$config{srcdir}/${file}.${sha1}"
+ );
+ rename($target, $hidden)
+ or error("rename '$target' to '$hidden' failed: $!");
+ # Ensure to restore the renamed file on error.
+ push @undo, sub {
+ return if ! -e "$hidden"; # already renamed
+ rename($hidden, $target)
+ or warn "rename '$hidden' to '$target' failed: $!";
+ };
+
+
+Take a snapshot of srcdir to be able to restore uncommitted local changes ("ULCs") afterwards.
+
+* This must happen _after_ the merging commit in Mercurial, there is no way around it. By design hg refuses to commit merges if there are other changes to tracked content present, no matter how much you beg.
+
+* ULCs to the file being edited are special: they can't be diffed here, since `editpage.pm` has already overwritten the file. When the web edit session started, though, the ULC version (not the committed
+version) was read into the form, so in a way the web user _has already merged_ with the ULC. It is not saved in commit history, but that is the exact consequence of "uncommitted" changes. If a ULC is made between the time the web edit started and the time it was submitted, it is lost, though. All in all, one shouldn't be editing the srcdir directly when web edits of the same file are allowed. Clone the repo and push changes instead.
+
+Most of these issues disappear, I believe, if one works with a master repo that is only pushed to.
+
+ my $tempdiffpath = hg_local_dirstate_shelve($sha1);
+
+ # Ensure uniqueness of bookmarks.
+ my $bookmark_upstream_head = "current_head_$sha1";
+ my $bookmark_edit_base = "edit_base_$sha1";
+
+ # Git and Mercurial differ in the branch concept. Mercurial's
+ # "bookmarks" are closer in function in this regard.
+
+Bookmarks aren't standard until Mercurial 1.8 ([2011-02-10](http://selenic.com/hg/rev/d4ab9486e514)), but they've been bundled with Mercurial since ~2008, so they can be enabled by writing an `hgrc`, which is also being worked on.
+
+ # Create a bookmark at current tip.
+ push @undo, sub { run_or_cry('hg', 'bookmark', '--delete',
+ $bookmark_upstream_head) };
+ run_or_die('hg', 'bookmark', $bookmark_upstream_head);
+
+ # Create a bookmark at the revision from which the edit was
+ # started and switch to it, discarding changes (they are stored
+ # in $tempdiff and the hidden file at the moment).
+ push @undo, sub { run_or_cry('hg', 'bookmark', '--delete',
+ $bookmark_edit_base) };
+ run_or_die('hg', 'bookmark', '-r', $sha1, $bookmark_edit_base);
+	        run_or_die('hg', 'update', '-C', $bookmark_edit_base);
+
+ # Reveal the modified file.
+ rename($hidden, $target)
+ or error("rename '$hidden' to '$target' failed: $!");
+
+ # Commit at the bookmarked revision, creating a new head.
+ run_or_cry('hg', 'commit', '-m', $message);
+
+ # Attempt to merge the newly created head with upstream head.
+ # '--tool internal:merge' to avoid spawning a GUI merger.
+
+(*Semi-TODO:* How do you make this command quiet? On failed merge, it
+always writes to STDERR and clutters the web server log.)
+
+ if (!run_or_non('hg', 'merge', '--tool', 'internal:merge',
+ $bookmark_upstream_head)) {
+ # ..., otherwise return file with conflict markers.
+ $conflict = readfile($target);
+
+ # The hardcore reset approach. Keep your hands inside
+ # the cart.
+ run_or_die('hg', 'rollback');
+ run_or_die('hg', 'update', '-C',
+ $bookmark_upstream_head);
+ if ($tempdiffpath) {
+ hg_local_dirstate_unshelve($tempdiffpath);
+ }
+
+Other approaches tried here:
+
+1. Clean up merge attempt,
+
+ run_or_die('hg', 'update', '-C', $bookmark_upstream_head);
+
+2. Redo "merge", using only upstream head versions,
+
+ run_or_die('hg', 'merge', '--tool', 'internal:local', $bookmark_edit_base);
+
+3. dummy commit to close head.
+
+ run_or_non('hg', 'commit', '-m', $message);
+
+This creates a cluttered and erroneous history. We
+tell Mercurial to merge, even though we in practice
+discard. This creates problems when trying to revert
+changes.
+
+Other attempt:
+
+1. Discard merge attempt and switch to temp head,
+
+ run_or_die('hg', 'update', '-C', $bookmark_edit_base);
+
+2. close the temp head (why do they call the command that in practice closes heads "--close-branch"?),
+
+ run_or_non('hg', 'commit', '--close-branch', '-m', $message);
+
+3. restore working directory to pre-fiddling status.
+
+ run_or_die('hg', 'update', $bookmark_upstream_head);
+
+...but this requires the same amount of forks as the
+above method, and confuses other parts of ikiwiki
+since the upstream head is now the third newest
+revision. Maybe that particular problem is solvable
+by setting a global default bookmark that follows the
+main tip. It will leave clutter in the revision
+history, though. Two extra commits that in practice
+don't hold relevant information will be recorded for
+each failed merge attempt.
+
+To only create one extra commit, one could imagine
+adding `--close-branch` to the commit that initially
+created the new head (since there is no problem
+merging with closed heads), but it's not possible to
+close and create a head at the same time, apparently.
+
+ }
+ };
+ my $failure = $@;
+
+ # Process undo stack (in reverse order). By policy, cleanup actions
+ # should normally print a warning on failure.
+ while (my $handle = pop @undo) {
+ $handle->();
+ }
+
+ error("Mercurial merge failed!\n$failure\n") if $failure;
+
+ return ($conflict, $tempdiffpath);
+ }
+
+ sub hg_commit_info ($;$;$) {
+ # Return an array of commit info hashes of num commits starting from
+ # the given sha1sum.
+ #
+This could be optimized by using a lookup cache similar to
+`findtimes()`. By adding `KeyAttr => ['node']` to `XMLin()` options, one
+could use the revision ID as key and do a single massive history
+lookup and later just check if the given revision already exists as a
+key. Right now I'm at the "don't optimize it yet" stage, though.
+
+This uses Mercurial's built-in `--style xml` and parses it with `XML::Simple`. Mercurial's log output is otherwise somewhat cumbersome to get good stuff out of, so this XML solution is quite good, I think. It adds a module dependency, but `XML::Simple` seems fairly standard (but what do I know, I've only used one Perl installation in my life).
+
+ use XML::Simple;
+ use Date::Parse;
+
+ my ($sha1, $num, $file) = @_;
+
+ my @opts;
+ if (defined $sha1) {
+ if ($sha1 =~ m/^($sha1_pattern)$/) {
+ push @opts, ('-r'. $1.':0');
+ }
+ elsif ($sha1 =~ m/^($sha1_pattern):($sha1_pattern)$/) {
+ push @opts, ('-r', $1.':'.$2);
+ }
+ }
+ push @opts, ('--limit', $num) if defined $num;
+ push @opts, ('--', $file) if defined $file;
+
+ my %env=%ENV;
+ $ENV{HGENCODING} = 'utf-8';
+ my @xml = run_or_cry('hg', 'log', '-v', '--style', 'xml', @opts);
+ %ENV=%env;
+
+ # hg returns empty string if file is not in repository.
+ return undef if !@xml;
+
+In some places it is clear that I'm coding ad-hoc Perl. I don't know if this is a reasonably efficient way to give input to `XMLin`, but it works.
+
+ # Want to preserve linebreaks in multiline comments.
+ $"="\n";
+ my $xmllog = XMLin("@xml",
+ ForceArray => ['logentry', 'parent', 'copy', 'path']);
+ $"=" ";
+
+ my @c_infos;
+ foreach my $rev (@{$xmllog->{logentry}}) {
+ my %c_info;
+ # In Mercurial, "rev" is technically the strictly local
+ # revision number. What ikiwiki wants is what is called
+ # "node": a globally defined SHA1 checksum.
+ $c_info{rev} = $rev->{node};
+ foreach my $parent (@{$rev->{parent}}) {
+ push @{$c_info{parents}}, {rev => $parent->{node}};
+ }
+ $c_info{user} = $rev->{author}{content};
+ # Mercurial itself parses out and stores an email address if
+ # present in author name. If not, hg sets email to author name.
+ if ( $rev->{author}{content} ne $rev->{author}{email} &&
+ $rev->{author}{email} =~ m/^([^\@]+)\@(.*)$/ ) {
+ if ($2 eq "web") {
+ $c_info{nickname} = $1;
+ $c_info{web_commit} = "1";
+ }
+ }
+ # Mercurial gives date in ISO 8601, well handled by str2time().
+ $c_info{when} = str2time($rev->{date});
+ # Mercurial doesn't allow empty commit messages, so there
+ # should always be a single defined message.
+ $c_info{message} = $rev->{msg}{content};
+ # Inside "paths" sits a single array "path" that contains
+ # multiple paths. Crystal clear :-)
+ foreach my $path (@{$rev->{paths}{path}}) {
+ push @{$c_info{files}}, {
+ # Mercurial doesn't track file permissions as
+ # Git do, so that's missing here.
+ 'file' => $path->{content},
+ 'status' => $path->{action},
+ };
+ }
+ # There also exists an XML branch "copies"->"copy", containing
+ # source and dest of files that have been copied with "hg cp".
+ # The copy action is also registered in "paths" as a removal of
+ # source and addition of dest, so it's not needed here.
+ push @c_infos, {%c_info};
+ }
+
+ return wantarray ? @c_infos : $c_infos[0];
+ }
+
+ sub hg_sha1 (;$) {
+ # Return head sha1sum (of given file).
+ my $file = shift || q{--};
+
+ # Non-existing file doesn't give error, just empty string.
+ my $f_info = hg_commit_info(undef, 1, $file);
+ my $sha1;
+ if ($f_info->{rev}) {
+ ($sha1) = $f_info->{rev} =~ m/($sha1_pattern)/;
+ }
+ else {
+ debug("Empty sha1sum for '$file'.");
+ }
+ return defined $sha1 ? $sha1 : q{};
+ }
+
+ sub rcs_update () {
+ run_or_cry('hg', '-q', 'update');
+ }
+
+ sub rcs_prepedit ($) {
+ # Return the commit sha1sum of the file when editing begins.
+ # This will be later used in rcs_commit if a merge is required.
+ my ($file) = @_;
+
+ return hg_sha1($file);
+ }
+
+ sub rcs_commit (@) {
+ # Try to commit the page; returns undef on _success_ and
+ # a version of the page with the rcs's conflict markers on
+ # failure.
+ my %params=@_;
+
+ # Check to see if the page has been changed by someone else since
+ # rcs_prepedit was called.
+ my $cur = hg_sha1($params{file});
+ my ($prev) = $params{token} =~ /^($sha1_pattern)$/; # untaint
+
+ if (defined $cur && defined $prev && $cur ne $prev) {
+
+If there was a conflict, the file with conflict markers is returned. Otherwise, the path to a temporary diff is returned; it will be applied to restore the previous local state after `rcs_commit_staged`.
+
+ my ($conflict, $tempdiffpath) =
+ merge_past($prev, $params{file}, $dummy_commit_msg);
+ return defined $conflict
+ ? $conflict
+ : rcs_commit_helper(
+ @_,
+ merge => 1,
+ tempdiffpath => $tempdiffpath);
+ }
+
+ return rcs_commit_helper(@_);
+ }
+
+ sub rcs_commit_helper (@) {
+ my %params=@_;
+
+ my %env=%ENV;
+ $ENV{HGENCODING} = 'utf-8';
+
+ my $user="Anonymous";
+ my $nickname;
+ if (defined $params{session}) {
+ if (defined $params{session}->param("name")) {
+ $user = $params{session}->param("name");
+ }
+ elsif (defined $params{session}->remote_addr()) {
+ $user = $params{session}->remote_addr();
+ }
+
+ if (defined $params{session}->param("nickname")) {
+ $nickname=encode_utf8($params{session}->param("nickname"));
+ $nickname=~s/\s+/_/g;
+ $nickname=~s/[^-_0-9[:alnum:]]+//g;
+ }
+ $ENV{HGUSER} = encode_utf8($user . ' <' . $nickname . '@web>');
+ }
+
+ if (! length $params{message}) {
+ $params{message} = "no message given";
+ }
+
+ $params{message} = IkiWiki::possibly_foolish_untaint($params{message});
+
+ my @opts;
+
+Mercurial rejects file arguments when performing a merge commit. By design
+it only does "all or nothing" commits when merging, so any given file arguments must be discarded. This should not pose a problem.
+
+ if (exists $params{file} && ! defined $params{merge}) {
+ push @opts, '--', $params{file};
+ }
+
+ # hg commit returns non-zero if nothing really changed.
+ # So we should ignore its exit status (hence run_or_non).
+ run_or_non('hg', 'commit', '-m', $params{message}, '-q', @opts);
+
+If there were uncommitted local changes in the srcdir before a merge was done, they are restored here.
+
+ if (defined $params{tempdiffpath}) {
+ hg_local_dirstate_unshelve($params{tempdiffpath});
+ }
+
+ %ENV=%env;
+ return undef; # success
+ }
+
+ sub rcs_commit_staged (@) {
+ # Commits all staged changes. Changes can be staged using rcs_add,
+ # rcs_remove, and rcs_rename.
+ return rcs_commit_helper(@_);
+ }
+
+ sub rcs_add ($) {
+ my ($file) = @_;
+
+ run_or_cry('hg', 'add', $file);
+ }
+
+ sub rcs_remove ($) {
+ # Remove file from archive.
+ my ($file) = @_;
+
+ run_or_cry('hg', 'remove', '-f', $file);
+ }
+
+ sub rcs_rename ($$) {
+ my ($src, $dest) = @_;
+
+ run_or_cry('hg', 'rename', '-f', $src, $dest);
+ }
+
+ sub rcs_recentchanges ($) {
+ my ($num) = @_;
+
+ my @c_infos;
+
+ foreach my $c_info (hg_commit_info(undef, $num, undef)) {
+ my @pagenames;
+ for my $page (@{$c_info->{files}}) {
+ my $diffurl=defined $config{diffurl} ?
+ $config{diffurl} : '';
+ # These substitutions enable defining keywords \[[file]]
+ # and \[[r2]] (backward compatibility) in the setup file
+ # that will be exchanged with filename and revision
+ # respectively.
+ $diffurl =~ s/\[\[file\]\]/$page->{file}/go;
+ $diffurl =~ s/\[\[r2\]\]/$c_info->{rev}/go;
+ push @pagenames, {
+ # pagename() strips suffixes and returns the
+ # path to the file as it is to be represented
+ # in the build dir.
+ page => pagename($page->{file}),
+ diffurl => $diffurl,
+ };
+ }
+
+		# ikiwiki expects to get each comment line as a
+		# separate entry.
+ my @messagelines;
+ open my $message, '<', \$c_info->{message};
+ while (<$message>) { push @messagelines, { line => $_ } };
+
+ push @c_infos, {
+ rev => $c_info->{rev},
+ user => $c_info->{user},
+ nickname => defined $c_info->{nickname} ?
+ $c_info->{nickname} : $c_info->{user},
+ committype => $c_info->{web_commit} ? "web" : "hg",
+ when => $c_info->{when},
+ message => [@messagelines],
+ pages => [@pagenames],
+ } if @pagenames;
+ }
+
+ return @c_infos;
+ }
+
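As an aside, a hypothetical `diffurl` using those keywords could look like this in the setup file (host and path are placeholders, not a real hgweb instance):

```perl
diffurl => 'http://hg.example.com/repo/diff/[[r2]]/[[file]]',
```
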
+ sub rcs_diff ($;$) {
+ my $rev=shift;
+ my $maxlines=shift;
+ my @lines;
+ my $addlines=sub {
+ my $line=shift;
+ return if defined $maxlines && @lines == $maxlines;
+ push @lines, $line."\n"
+ if (@lines || $line=~/^diff --git/);
+ return 1;
+ };
+ safe_hg(undef, $addlines, "hg", "diff", "-c", $rev, "-g");
+ if (wantarray) {
+ return @lines;
+ }
+ else {
+ return join("", @lines);
+ }
+ }
+
+ {
+ my %time_cache;
+
+This is an upstream change I made a week or so ago. Perhaps it can be merged in some clever way with the updated `hg_commit_info` to share one lookup cache. I don't know how much would be gained.
+
+ sub findtimes ($$) {
+ my $file=shift;
+ my $id=shift; # 0 = mtime ; 1 = ctime
+
+ if (! keys %time_cache) {
+ my $date;
+
+ # It doesn't seem possible to specify the format wanted for the
+ # changelog (same format as is generated in git.pm:findtimes(),
+ # though the date differs slightly) without using a style
+ # _file_. There is a "hg log" switch "--template" to directly
+ # control simple output formatting, but in this case, the
+ # {file} directive must be redefined, which can only be done
+ # with "--style".
+ #
+ # If {file} is not redefined, all files are output on a single
+ # line separated with a space. It is not possible to conclude
+ # if the space is part of a filename or just a separator, and
+ # thus impossible to use in this case.
+ #
+		# Some output filters are available in hg, but they are not fit
+		# for this purpose (and would slow down the process
+		# unnecessarily).
+
+ eval q{use File::Temp};
+ error $@ if $@;
+ my ($tmpl_fh, $tmpl_filename) = File::Temp::tempfile(UNLINK => 1);
+
+ print $tmpl_fh 'changeset = "{date}\\n{files}\\n"' . "\n";
+ print $tmpl_fh 'file = "{file}\\n"' . "\n";
+
+ foreach my $line (run_or_die('hg', 'log', '--style', $tmpl_filename)) {
+			# {date} gives output of the form
+ # 1310694511.0-7200
+ # where the first number is UTC Unix timestamp with one
+ # decimal (decimal always 0, at least on my system)
+ # followed by local timezone offset from UTC in
+ # seconds.
+ if (! defined $date && $line =~ /^\d+\.\d[+-]\d*$/) {
+ $line =~ s/^(\d+).*/$1/;
+ $date=$line;
+ }
+ elsif (! length $line) {
+ $date=undef;
+ }
+ else {
+ my $f=$line;
+
+ if (! $time_cache{$f}) {
+ $time_cache{$f}[0]=$date; # mtime
+ }
+ $time_cache{$f}[1]=$date; # ctime
+ }
+ }
+ }
+
+ return exists $time_cache{$file} ? $time_cache{$file}[$id] : 0;
+ }
+
+ }
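The parsing loop in `findtimes` can be sketched like this (a Python sketch of the same logic, assuming the two-line template output described above; `hg log` lists changesets newest-first, so the first date seen for a file is its mtime and the last one is its ctime):

```python
import re

def parse_hg_times(lines):
    """Parse the output of the two-line hg log style above:
    each changeset emits a '<unixtime>.<frac><tzoffset>' date line,
    then one filename per line, then a blank separator line."""
    time_cache = {}
    date = None
    for line in lines:
        if date is None and re.match(r'^\d+\.\d[+-]\d*$', line):
            # Strip the decimal and timezone offset, keep UTC seconds.
            date = int(line.split('.')[0])
        elif line == '':
            # Blank line ends this changeset's file list.
            date = None
        else:
            f = line
            if f not in time_cache:
                time_cache[f] = [date, None]  # [mtime, ctime]
            # Always overwrite ctime: the last (oldest) sighting wins.
            time_cache[f][1] = date
    return time_cache
```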
+
+ sub rcs_getctime ($) {
+ my $file = shift;
+
+ return findtimes($file, 1);
+ }
+
+ sub rcs_getmtime ($) {
+ my $file = shift;
+
+ return findtimes($file, 0);
+ }
+
+The comment just below the function declaration below is taken from `git.pm`. Is it true? Should ikiwiki support sharing its repo with other things? Mercurial-wise that sounds like a world of pain.
+
+> Yes, ikiwiki supports this for git and svn. It's useful when you want
+> a doc/ directory with the wiki for a project. I don't know why
+> it wouldn't be a useful thing to do with mercurial, but it's not
+> required. --[[Joey]]
+
+ {
+ my $ret;
+ sub hg_find_root {
+		# The wiki may not be the only thing in the hg repo.
+		# Determine if it is in a subdirectory by examining the srcdir,
+		# and its parents, looking for the .hg directory.
+
+ return @$ret if defined $ret;
+
+ my $subdir="";
+ my $dir=$config{srcdir};
+ while (! -d "$dir/.hg") {
+ $subdir=IkiWiki::basename($dir)."/".$subdir;
+ $dir=IkiWiki::dirname($dir);
+ if (! length $dir) {
+ error("cannot determine root of hg repo");
+ }
+ }
+
+ $ret=[$subdir, $dir];
+ return @$ret;
+ }
+
+ }
+
+ sub hg_parse_changes (@) {
+ # Only takes a single info hash as argument in rcs_preprevert, but
+ # should be able to take several in rcs_receive.
+	my @c_infos_raw = @_;
+
+ my ($subdir, $rootdir) = hg_find_root();
+ my @c_infos_ret;
+
+ foreach my $c_info_raw (@c_infos_raw) {
+ foreach my $path (@{$c_info_raw->{files}}) {
+ my ($file, $action, $temppath);
+
+ $file=$path->{file};
+
+ # check that all changed files are in the subdir
+ if (length $subdir && ! ($file =~ s/^$subdir//)) {
+ error sprintf(gettext("you are not allowed to change %s"), $file);
+ }
+
+ if ($path->{status} eq "M") { $action="change" }
+ elsif ($path->{status} eq "A") { $action="add" }
+ elsif ($path->{status} eq "R") { $action="remove" }
+ else { error "unknown status ".$path->{status} }
+
+I haven't tested the attachment code below. Is it run when there is a non-trusted file upload?
+
+> It's run when an anonymous git push is done. I don't know if there would
+> be any equivalent with mercurial; if not, it does not make sense
+> to implement this at all (this function is only used by `rcs_receive`). --[[Joey]]
+
+ # extract attachment to temp file
+ if (($action eq 'add' || $action eq 'change') &&
+ ! pagetype($file)) {
+
+ eval q{use File::Temp};
+ die $@ if $@;
+
+ my $fh;
+ ($fh, $temppath)=File::Temp::tempfile(undef, UNLINK => 1);
+ my $cmd = "cd $hg_dir && ".
+ "hg diff -g -c $c_info_raw->{rev} > '$temppath'";
+ if (system($cmd) != 0) {
+ error("failed writing temp file '$temppath'.");
+ }
+ }
+
+ push @c_infos_ret, {
+ file => $file,
+ action => $action,
+ path => $temppath,
+ };
+ }
+ }
+
+ return @c_infos_ret;
+ }
+
+*TODO:* I don't know what's happening here. I've changed the code to adhere to this file's variables and functions, but it refers to a srcdir _and_ a default repo, which currently isn't available in the Mercurial setup.
+
+`rcs_receive` is optional and only runs when running a pre-receive hook. Where `$_` comes from and its format are mysteries to me.
+
+Also, a comment in `git.pm` mentions that we don't want to chdir to a subdir "and only see changes in it" - but this isn't true for either Git or Mercurial to my knowledge. It only seems to happen in `git.pm` since the `git log` command in `git_commit_info` ends with "`-- .`" - if it didn't do that, one wouldn't have to chdir for this reason, I believe.
+
+In this case we need to stay in the default repo instead of the srcdir, though, so `hg_dir="."` _is_ needed, but not for the above-mentioned reason :-) (maybe there's more to it, though).
+
+> Implementing some sort of anonymous push handling for mercurial is not something
+> you can fumble your way through like this, if it can be done at all.
+>
+> Hint: `$_` is being populated by the specific format git sends to a
+> specific hook script.
+> --[[Joey]]
+
+ sub rcs_receive () {
+ my @c_infos_ret;
+ while (<>) {
+ chomp;
+ my ($oldrev, $newrev, $refname) = split(' ', $_, 3);
+
+ # only allow changes to hg_default_branch
+
+*TODO:* What happens here? Some Git voodoo. _If_ `$_` has the exact same format for Mercurial, then the code below should work just as well here, I think.
+
+ if ($refname !~ m|^refs/heads/$config{hg_default_branch}$|) {
+ error sprintf(gettext("you are not allowed to change %s"), $refname);
+ }
+
+Comment from `git.pm`:
+
+ # Avoid chdir when running git here, because the changes are in
+ # the default git repo, not the srcdir repo. (Also, if a subdir
+ # is involved, we don't want to chdir to it and only see
+ # changes in it.) The pre-receive hook already puts us in the
+ # right place.
+ $hg_dir=".";
+ push @c_infos_ret,
+ hg_parse_changes(hg_commit_info($newrev.":".$oldrev,
+ undef, undef));
+ $hg_dir=undef;
+ }
+
+ return @c_infos_ret;
+ }
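On the `$_` question above: git's pre-receive hook receives one line per updated ref on stdin, in the form `<old-sha1> <new-sha1> <refname>`, which is exactly what the `split` unpacks. A minimal sketch of that parse (the sha1s below are hypothetical):

```python
def parse_receive_line(line):
    """Split one git pre-receive stdin line into its three fields:
    old sha1, new sha1, and the full ref name."""
    oldrev, newrev, refname = line.rstrip("\n").split(" ", 2)
    return oldrev, newrev, refname
```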
+
+ sub rcs_preprevert ($) {
+ my $rev=shift;
+ my ($sha1) = $rev =~ /^($sha1_pattern)$/; # untaint
+
+The four lines of code below are from `git.pm`, but I can't see what they actually do there. Neither Git nor Mercurial restricts itself to changes in the working directory when given a command - they always traverse to the repository root by themselves. I keep them here, commented out, in case I'm missing something.
+
+*UPDATE:* See earlier note about `git log` ending in "`-- .`".
+
+ ## Examine changes from root of git repo, not from any subdir,
+ ## in order to see all changes.
+ #my ($subdir, $rootdir) = git_find_root();
+ #$git_dir=$rootdir;
+
+ my $c_info=hg_commit_info($sha1, 1, undef) or error "unknown commit";
+
+ # hg revert will fail on merge commits. Add a nice message.
+	if (exists $c_info->{parents} && @{$c_info->{parents}} > 1) {
+ error gettext("you are not allowed to revert a merge");
+ }
+
+ my @c_info_ret=hg_parse_changes($c_info);
+
+ ### Probably not needed, if earlier comment is correct.
+ #$hg_dir=undef;
+ return @c_info_ret;
+ }
+
+ sub rcs_revert ($) {
+ # Try to revert the given rev; returns undef on _success_.
+ my $rev = shift;
+ my ($sha1) = $rev =~ /^($sha1_pattern)$/; # untaint
+
+	# Save uncommitted local changes to a diff file. Attempt to restore them later.
+ my $tempdiffpath = hg_local_dirstate_shelve($sha1);
+
+ # Clean dir to latest commit.
+ run_or_die('hg', 'update', '-C');
+
+Some voodoo is needed here. `hg backout --tool internal:local -r $sha1` is *almost* right, but if the reversion targets the immediately previous revision, hg commits automatically, which is bad in this case. Instead I generate a reverse diff and pipe it to `import --no-commit`.
+
+ if (run_or_non("hg diff -c $sha1 --reverse | hg import --no-commit -")) {
+ if ($tempdiffpath) { hg_local_dirstate_unshelve($tempdiffpath) }
+ return undef;
+ }
+ else {
+ if ($tempdiffpath) { hg_local_dirstate_unshelve($tempdiffpath) }
+ return sprintf(gettext("Failed to revert commit %s"), $sha1);
+ }
+ }
+
+Below follows code regarding [[Auto-setup and maintain Mercurial wrapper hooks]]. I will try to solve this elsewhere later, but the code itself works.
+
+Perhaps initialization of the bookmarks extension should be added here, to support older Mercurial versions.
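For illustration, the `[hooks]` section that this function rewrites might look like the following in `.hg/hgrc` (the wrapper path is a hypothetical example):

```ini
[hooks]
post-commit.ikiwiki = /srv/wiki/ikiwiki-hg-wrapper
incoming.ikiwiki = /srv/wiki/ikiwiki-hg-wrapper
```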
+
+ sub rcs_wrapper_postcall($) {
+ # Update hgrc if it exists. Change post-commit/incoming hooks with the
+ # .ikiwiki suffix to point to the wrapper path given in the setup file.
+ # Work with a tempfile to not delete hgrc if the loop is interrupted
+ # midway.
+ # I believe there is a better way to solve this than creating new hooks
+ # and callbacks. Will await discussion on ikiwiki.info.
+ my $hgrc=$config{srcdir}.'/.hg/hgrc';
+ my $backup_suffix='.ikiwiki.bak';
+ if (-e $hgrc) {
+ use File::Spec;
+ my $mercurial_wrapper_abspath=File::Spec->rel2abs($config{mercurial_wrapper}, $config{srcdir});
+ local ($^I, @ARGV)=($backup_suffix, $hgrc);
+ while (<>) {
+ s/^(post-commit|incoming)(\.ikiwiki[ \t]*=[ \t]*).*$/$1$2$mercurial_wrapper_abspath/;
+ print;
+ }
+ unlink($hgrc.$backup_suffix);
+ }
+ }
+
+ 1
diff --git a/doc/todo/Modern_standard_layout.mdwn b/doc/todo/Modern_standard_layout.mdwn
new file mode 100644
index 000000000..37f1ee740
--- /dev/null
+++ b/doc/todo/Modern_standard_layout.mdwn
@@ -0,0 +1,39 @@
+I think it would be a good idea to rethink the standard layout style of ikiwiki. The current layout, used in a standard setup and on ikiwiki.info as well, looks a bit old-fashioned to me. I guess that a nice modern layout would attract more new users and boost the ikiwiki community...
+
+> FWIW, I agree. The actiontabs [[theme|themes]] would be a better default, but something which showed what ikiwiki was capable of (or more precisely: that ikiwiki is as capable as other popular wiki software) would be better still. — [[Jon]]
+
+>> As an author of plugins that interact with the UI, I think it's good that
+>> a *minimal* ikiwiki has a minimal anti-theme, and that plugins are
+>> developed against the anti-theme - it's a "blank slate" for themes.
+>> [[plugins/contrib/trail]] was much easier to get working in
+>> the default anti-theme than in actiontabs and blueview.
+>>
+>> Technical detail: all the standard themes are done by appending to the
+>> anti-theme's CSS (albeit in ikiwiki's build system rather than during
+>> the wiki build), rather than by replacing it - so themes that haven't
+>> been updated for a new UI element end up using the version of it from
+>> the anti-theme. [[plugins/Comments]] and [[plugins/contrib/trail]]
+>> both need some tweaks per-theme to make them integrate nicely,
+>> but most of the design comes from the anti-theme.
+>>
+>> That doesn't necessarily mean the anti-theme should be the one used
+>> on ikiwiki.info, or used by default in new wikis - from my
+>> point of view, it'd be fine for either of those to be actiontabs
+>> or something. The important thing is to *have* a "blank slate" anti-theme
+>> that looks simple but sufficient, as a basis for new styles (either
+>> [[themes]], or wikis that want their own unique stylesheet), and derive
+>> the other themes from it. --[[smcv]]
+
+> Ikiwiki's minimal theme is not modern. It's postmodern. I like it for the
+> reasons described here. <http://kitenet.net/~joey/blog/entry/web_minimalism/>
+> " The minimalism sucked you in, it made the web feel like one coherent,
+> unified thing, unlike the constellation of corporate edifices occupying
+> much of it today."
+>
+> I see an increasing trend back toward these principles, driven partly
+> by limits of eg, smartphone UI. So I certainly won't be changing the
+> look of any of my ikiwiki sites, including this one.
+>
+> `auto.setup` and `auto-blog.setup` could have different defaults,
+> or allow a theme to be picked as [Branchable](http://branchable.com/)
+> does. Perhaps actiontabs for auto-blog and default for wikis? --[[Joey]]
diff --git a/doc/todo/More_flexible_po-plugin_for_translation.mdwn b/doc/todo/More_flexible_po-plugin_for_translation.mdwn
new file mode 100644
index 000000000..3399f7834
--- /dev/null
+++ b/doc/todo/More_flexible_po-plugin_for_translation.mdwn
@@ -0,0 +1,5 @@
+I have a website with multi-language content, where some content is only in English, some in German, and some is available in both languages.
+
+The po module currently has only one master language, with slave languages, and the set of translatable pages is given by a PageSpec.
+
+It would be nice to flag the content which should have a translation on a file-by-file basis (with some inline directive?) which could contain the information of the master-language for that file and the desired target-languages.
diff --git a/doc/todo/Move_teximg_latex_preamble_to_config_file.mdwn b/doc/todo/Move_teximg_latex_preamble_to_config_file.mdwn
new file mode 100644
index 000000000..3cedd5ae3
--- /dev/null
+++ b/doc/todo/Move_teximg_latex_preamble_to_config_file.mdwn
@@ -0,0 +1,156 @@
+The [[plugins/teximg]] plugin currently has a TODO in the source code to make the preamble configurable. The included [[patch]] makes this change.
+
+The patch also makes some other changes:
+
+ - The default latex preamble is changed to the international standard `article` class from the European `scrartcl` class.
+ - Removed the non-standard `mhchem` package from the default preamble.
+ - Allow the use of `dvipng` rather than `dvips` and `convert` (`convert` is not a standard part of a latex install). This is configurable.
+
+-- [[Will]]
+
+> I like making this configurable. I do fear that changing what's included
+> by default could break some existing uses of teximg? That needs to be
+> considered, and either the breakage documented in NEWS, or avoided. Also,
+> if mchem is dropped, I think the suggests on texlive-science in
+> debian/control should probably go? --[[Joey]]
+
+>> Yes, changing the defaults could break some existing uses. I think in
+>> this case, documenting in NEWS and dropping texlive-science is the
+>> best option. In fact, NEWS should probably document the config
+>> setting to return things to how they were.
+>>
+>> The reason I prefer dropping `mchem` rather than keeping it is that `mchem`
+>> is non-standard. Now that things are configurable and it is easy to
+>> add in if you want it, having only standard packages by default is a
+>> good thing. Here is a proposed NEWS entry:
+
+File: TexImg standard preamble changed
+
+The [[plugins/teximg]] [[plugin]] now has a configurable LaTeX preamble.
+As part of this change the `mchem` LaTeX package has been removed from
+the default LaTeX preamble as it wasn't included in many TeX installations.
+
+The previous behaviour can be restored by adding the following to your ikiwiki setup:
+
+ teximg_prefix => '\documentclass{scrartcl}
+ \usepackage[version=3]{mhchem}
+ \usepackage{amsmath}
+ \usepackage{amsfonts}
+ \usepackage{amssymb}
+ \pagestyle{empty}
+ \begin{document}',
+
+In addition, the rendering mechanism has been changed to use `dvipng` by default.
+If you would like to return to the old rendering mechanism using `dvips` and `convert`
+then you should add the following line to your ikiwiki setup:
+
+ teximg_dvipng => 0,
+
+The LaTeX postfix is unchanged, but is also now configurable using `teximg_postfix`.
+Happy TeXing.
+
+>> I think that about covers it. -- [[Will]]
+
+ diff --git a/IkiWiki/Plugin/teximg.pm b/IkiWiki/Plugin/teximg.pm
+ index 369c108..8c3379f 100644
+ --- a/IkiWiki/Plugin/teximg.pm
+ +++ b/IkiWiki/Plugin/teximg.pm
+ @@ -10,6 +10,18 @@ use File::Temp qw(tempdir);
+ use HTML::Entities;
+ use IkiWiki 2.00;
+
+ +my $default_prefix = <<EOPREFIX
+ +\\documentclass{article}
+ +\\usepackage{amsmath}
+ +\\usepackage{amsfonts}
+ +\\usepackage{amssymb}
+ +\\pagestyle{empty}
+ +\\begin{document}
+ +EOPREFIX
+ +;
+ +
+ +my $default_postfix = '\\end{document}';
+ +
+ sub import {
+ hook(type => "getsetup", id => "teximg", call => \&getsetup);
+ hook(type => "preprocess", id => "teximg", call => \&preprocess);
+ @@ -21,6 +33,26 @@ sub getsetup () {
+ safe => 1,
+ rebuild => undef,
+ },
+ + teximg_dvipng => {
+ + type => "boolean",
+ + description => "Should teximg use dvipng to render, or dvips and convert?",
+ + safe => 0,
+ + rebuild => 0,
+ + },
+ + teximg_prefix => {
+ + type => "string",
+ + example => $default_prefix,
+ + description => "LaTeX prefix for teximg plugin",
+ + safe => 0, # Not sure how secure LaTeX is...
+ + rebuild => 1,
+ + },
+ + teximg_postfix => {
+ + type => "string",
+ + example => $default_postfix,
+ + description => "LaTeX postfix for teximg plugin",
+ + safe => 0, # Not sure how secure LaTeX is...
+ + rebuild => 1,
+ + },
+ }
+
+ sub preprocess (@) {
+ @@ -105,25 +137,35 @@ sub gen_image ($$$$) {
+ my $digest = shift;
+ my $imagedir = shift;
+
+ - #TODO This should move into the setup file.
+ - my $tex = '\documentclass['.$height.'pt]{scrartcl}';
+ - $tex .= '\usepackage[version=3]{mhchem}';
+ - $tex .= '\usepackage{amsmath}';
+ - $tex .= '\usepackage{amsfonts}';
+ - $tex .= '\usepackage{amssymb}';
+ - $tex .= '\pagestyle{empty}';
+ - $tex .= '\begin{document}';
+ + if (!defined $config{teximg_prefix}) {
+ + $config{teximg_prefix} = $default_prefix;
+ + }
+ + if (!defined $config{teximg_postfix}) {
+ + $config{teximg_postfix} = $default_postfix;
+ + }
+ + if (!defined $config{teximg_dvipng}) {
+ + # TODO: Can we detect whether dvipng or convert is in the path?
+ + $config{teximg_dvipng} = 1;
+ + }
+ +
+ + my $tex = $config{teximg_prefix};
+ $tex .= '$$'.$code.'$$';
+ - $tex .= '\end{document}';
+ + $tex .= $config{teximg_postfix};
+ + $tex =~ s!\\documentclass{article}!\\documentclass[${height}pt]{article}!g;
+ + $tex =~ s!\\documentclass{scrartcl}!\\documentclass[${height}pt]{scrartcl}!g;
+
+ my $tmp = eval { create_tmp_dir($digest) };
+ if (! $@ &&
+ - writefile("$digest.tex", $tmp, $tex) &&
+ - system("cd $tmp; latex --interaction=nonstopmode $tmp/$digest.tex > /dev/null") == 0 &&
+ - system("dvips -E $tmp/$digest.dvi -o $tmp/$digest.ps 2> $tmp/$digest.log") == 0 &&
+ - # ensure destination directory exists
+ - writefile("$imagedir/$digest.png", $config{destdir}, "") &&
+ - system("convert -density 120 -trim -transparent \"#FFFFFF\" $tmp/$digest.ps $config{destdir}/$imagedir/$digest.png > $tmp/$digest.log") == 0) {
+ + writefile("$digest.tex", $tmp, $tex) &&
+ + system("cd $tmp; latex --interaction=nonstopmode $tmp/$digest.tex > /dev/null") == 0 &&
+ + # ensure destination directory exists
+ + writefile("$imagedir/$digest.png", $config{destdir}, "") &&
+ + (($config{teximg_dvipng} &&
+ + system("dvipng -D 120 -bg Transparent -T tight -o $config{destdir}/$imagedir/$digest.png $tmp/$digest.dvi > $tmp/$digest.log") == 0
+ + ) ||
+ + (!$config{teximg_dvipng} &&
+ + system("dvips -E $tmp/$digest.dvi -o $tmp/$digest.ps 2> $tmp/$digest.log") == 0 &&
+ + system("convert -density 120 -trim -transparent \"#FFFFFF\" $tmp/$digest.ps $config{destdir}/$imagedir/$digest.png > $tmp/$digest.log") == 0))) {
+ return 1;
+ }
+ else {
+
+[[done]]
diff --git a/doc/todo/Moving_Pages.mdwn b/doc/todo/Moving_Pages.mdwn
new file mode 100644
index 000000000..387e4fb82
--- /dev/null
+++ b/doc/todo/Moving_Pages.mdwn
@@ -0,0 +1,222 @@
+I thought I'd draw attention to a desire of mine for **ikiwiki**. I'm no power-user, and mostly I do fairly simple stuff with my [wiki](http://kitenet.net/~kyle/family/wiki).
+
+However, I would like the ability (now) to **rename/move/delete** pages. As part of having a genealogy wiki, I've put names and dates of birth/death as part of the title of each article (to avoid cases where people have the same name, but are children/cousins/etc of others with that name). However, some of this information changes. For instance, I didn't know a date of death and now I do, or I had it wrong originally, or it turns out someone is still alive I didn't know about. All of these cases leave me with bad article titles.
+
+So, I can go ahead and move the file to a new page with the correct info, orphan that page, provide a link for the new page if desired, and otherwise ignore that page. But then, it clutters up the wiki and serves no useful purpose.
+
+Anyway to consider implementing **rename/move/delete** ? I certainly lack the skills to appreciate what this would entail, but feel free to comment if it appears impossible, and then I'll go back to the aforementioned workaround. I would prefer simple rename, however.
+
+Thanks again to [Joey](http://kitenet.net/~joey) for putting ikiwiki together. I love the program.
+
+*[Kyle](http://kitenet.net/~kyle/)=*
+
+> Took a bit too long, but [[done]] now. --[[Joey]]
+
+----
+
+The MediaWiki moving/renaming mechanism is pretty nice. It's easy to get a list of pages that point to the current page. When renaming a page it sticks a forwarding page in the original place. The larger the size of the wiki the more important organization tools become.
+
+I see the need for:
+
+* a new type of file to represent a forwarding page
+* a rename tool that can
+ * move the existing page to the new name
+ * optionally drop a forwarding page
+ * optionally rewrite incoming links to the new location
+
+Brad
+
+> This could be implemented through the use of an HTTP redirect to the
+> new page, but this has the downside that people may not know they're being
+> redirected.
+>
+> This could also be implemented using a combination of raw inline and meta
+> to change the title (add a "redirected from etc." page. This could be done
+> with a plugin. A redirect page would be [[!redirect page="newpage"]].
+> But then if you click "edit" on this redirect page, you won't be able
+> to edit the new page, only the call to redirect.
+> --Ethan
+
+-----
+
+I'm going to try to run through a full analysis and design for moving and
+deleting pages here. I want to make sure all cases are covered. --[[Joey]]
+
+## UI
+
+The UI I envision is to add "Rename" and "Delete" buttons to the file edit
+page. Both next to the Save button, and also at the bottom of the attachment
+management interface.
+
+The attachment(s) to rename or delete would be selected using the check boxes
+and then the button applies to all of them. Deleting multiple attachments
+in one go is fine; renaming multiple attachments in one go is ambiguous,
+and it can just error out if more than one is selected for rename.
+(Alternatively, it could allow moving them all to a different subdirectory.)
+
+The Delete buttons lead to a page to confirm the deletion(s).
+
+The Rename buttons lead to a page with a text edit box for editing the
+page name. The title of the page is edited, not the actual filename.
+
+There will also be a optional comment field, so a commit message can be
+written for the rename/delete.
+
+Note that there's an edge case concerning pages that have a "/" encoded
+as part of their title. There's no way for a title edit box to
+differentiate between that, and a "/" that is instended to refer to a
+subdirectory to move the page to. Consequence is that "/" will always be
+treated literally, as a subdir separator; it will not be possible to use
+this interface to put an encoded "/" in a page's name.
+
+Once a page is renamed, ikiwiki will return to the page edit interface,
+now for the renamed page. Any modifications that the user had made to the
+textarea will be preserved.
+
+Similarly, when an attachment is renamed, or deleted, return to the page
+edit interface (with the attachments displayed).
+
+When a page is deleted, redirect the user to the toplevel index.
+
+Note that this design, particularly the return to the edit interface after
+rename, means that the rename button can *only* be put on the page edit ui.
+It won't be possible to put it on the action bar or somewhere else. (It
+would be possible to code up a different rename button that doesn't do
+that, and use it elsewhere.)
+
+Hmm, unless it saves the edit state and reloads it later, while using a separate
+form. That seems to solve other problems too, so I think it is the way to go.
+
+## SubPages
+
+When renaming `foo`, it probably makes sense to also rename
+`foo/Discussion`. Should other SubPages in `foo/` also be renamed? I think
+it's probably simplest to rename all of its SubPages too.
+
+(For values of "simplest" that don't include the pain of dealing with all
+the changed links on subpages.. as well as issues like pagespecs that
+continue to match the old subpages, and cannot reasonably be auto-converted
+to use the new, etc, etc... So still undecided about this.)
+
+When deleting `foo`, I don't think SubPages should be deleted. The
+potential for mistakes and abuse is too large. Deleting Discussion page
+might be a useful exception.
+
+TODO: Currently, subpages are not addressed.
+
+## link fixups
+
+When renaming a page, it's desirable to keep links that point to it
+working. Rather than use redirection pages, I think that all pages that
+link to it should be modified to fix their links.
+
+The rename plugin can add a "rename" hook, which other plugins can use to
+update links &etc. The hook would be passed page content, the old and new
+link names, and would modify the content and return it. At least the link
+plugin should have such a hook.
+
+After calling the "rename" hook, and rendering the wiki, the rename plugin
+can check to see what links remain pointing to the old page. There could
+still be some, for example, CamelCase links probably won't be changed; img
+plugins and others contain logical links to the file, etc. The user can be
+presented with a list of all the pages that still have links to the old
+page, and can manually deal with them.
+
+In some cases, a redirection page will be wanted, to keep long-lived urls
+working. Since the meta plugin supports creating such pages, and since they
+won't always be needed, I think it will be simplest to just leave it up to
+the user to create such a redirection page after renaming a page.
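Such a redirection page is a one-liner, using the meta plugin's `redir` parameter:

	\[[!meta redir="newpagename"]]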
+
+## who can delete/rename what?
+
+The source page must be editable by the user to be deleted/renamed.
+When renaming, the dest page must not already exist, and must be creatable
+by the user, too.
+
+When deleting/renaming attachments, the `allowed_attachments` PageSpec
+is checked too.
+
+## RCS
+
+Three new functions are added to the RCS interface:
+
+* `rcs_remove(file)`
+* `rcs_rename(old, new)`
+* `rcs_commit_staged(message, user, ip)`
+
+See [[rcs_updates_needed_for_rename_and_remove]].
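For the git backend, these roughly reduce to staging operations plus a separate commit, sketched here with plain git commands (illustrative only; the real implementations also carry the user and ip through to the commit metadata):

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email ikiwiki@example.org
git config user.name ikiwiki
echo '[[!meta title="Old"]]' > oldpage.mdwn
git add oldpage.mdwn
git commit -q -m 'add oldpage'

git mv oldpage.mdwn newpage.mdwn   # rcs_rename(old, new): stage only
git commit -q -m 'rename oldpage'  # rcs_commit_staged(message, user, ip)

git rm -q newpage.mdwn             # rcs_remove(file): stage only
git commit -q -m 'remove newpage'  # rcs_commit_staged again
```

Splitting the staging step from `rcs_commit_staged` lets the CGI stage a rename together with any related link fixups, then commit them atomically.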
+
+## conflicts
+
+Cases to consider:
+
+* Alice clicks "delete" button for a page; Bob makes a modification;
+ Alice confirms deletion. Ideally in this case, Alice should get an error
+ message that there's a conflict.
+  Update: In my current code, Alice's deletion will fail if the file was
+  moved or deleted in the meantime; if the file was modified since Alice
+ clicked on the delete button, the modifications will be deleted too. I
+ think this is acceptable.
+* Alice opens edit UI for a page; Bob makes a modification; Alice
+ clicks delete button and confirms deletion. Again here, Alice should get
+ a conflict error. Note that this means that the rcstoken should be
+ recorded when the edit UI is first opened, not when the delete button is
+ hit.
+  Update: Again here, there's no conflict, but the delete succeeds. Again,
+  basically acceptable.
+* Alice and Bob both try to delete a page at the same time. It's fine for
+ the second one to get a message that it no longer exists. Or just to
+ silently fail to delete the deleted page..
+ Update: It will display an error to the second one that the page doesn't
+ exist.
+* Alice deletes a page; Bob had edit window open for it, and saves
+ it afterwards. I think that Bob should win in this case; Alice can always
+ notice the page has been added back, and delete it again.
+ Update: Bob wins.
+* Alice clicks "rename" button for a page; Bob makes a modification;
+ Alice confirms rename. This case seems easy, it should just rename the
+ modified page.
+ Update: it does
+* Alice opens edit UI for a page; Bob makes a modification; Alice
+ clicks rename button and confirms rename. Seems same as previous case.
+ Update: check
+* Alice and Bob both try to rename a page at the same time (to probably
+ different names). Or one tries to delete, and the other to rename.
+  I think it's acceptable for the second one to get an error message that
+ the page no longer exists.
+ Update: check, that happens
+* Alice renames a page; Bob had edit window open for it, and saves
+  it afterwards, under the old name. I think it's acceptable for Bob to succeed
+ in saving it under the old name in this case, though not ideal.
+ Update: Behavior is the same as if Alice renamed the page and Bob created
+ a new page with the old name. Seems acceptable, though could be mildly
+ confusing to Bob (or Alice).
+* Alice starts creating a new page. In the meantime, Bob renames a
+ different page to that name. Alice should get an error message when
+ committing; and it should have conflict markers. Ie, this should work the
+ same as if Bob had edited the new page at the same time as Alice did.
+ Update: That should happen. Haven't tested this case yet to make sure.
+* Bob starts renaming a page. In the meantime, Alice creates a new page
+  with the name he's renaming it to. Here Bob should get an error message
+ that he can't rename the page to an existing name. (A conflict resolution
+ edit would also be ok.)
+ Update: Bob gets an error message.
+* Alice renames (or deletes) a page. In the meantime, Bob is uploading an
+ attachment to it, and finishes after the rename finishes. Is it
+  acceptable for the attachment to be saved under the old name?
+  Update: Meh. It's certainly not ideal; if Bob tries to save the page he
+ uploaded the attachment to, he'll get a message about it having been
+ deleted/renamed, and he can try to figure out what to do... :-/
+* I don't know if this is a conflict, but it is an important case to consider;
+  you need to make sure that there are no security holes. You don't want
+ someone to be able to rename something to <code>/etc/passwd</code>.
+ I think it would be enough that you cannot rename to a location outside
+ of srcdir, you cannot rename to a location that you wouldn't be able
+ to edit because it is locked, and you cannot rename to an existing page.
+
+ > Well, there are a few more cases (like not renaming to a pruned
+ > filename, and not renaming _from_ a file that is not a known source
+ > file or is locked), but yes, that's essentially it.
+ >
+ > PS, the first thing I do to any
+ > web form is type /etc/passwd and ../../../../etc/passwd into it. ;-) --[[Joey]]
diff --git a/doc/todo/Multiple_categorization_namespaces.mdwn b/doc/todo/Multiple_categorization_namespaces.mdwn
new file mode 100644
index 000000000..3e9f8feaa
--- /dev/null
+++ b/doc/todo/Multiple_categorization_namespaces.mdwn
@@ -0,0 +1,103 @@
+I came across this when working on converting my old blog into an ikiwiki, but I think it could be of more general use.
+
+The background: I have a (currently suspended, waiting to be converted) blog on the [il Cannocchiale](http://www.ilcannocchiale.it) hosting platform. Aside from the usual metadata (title, author), il Cannocchiale also provides tags and two additional categorization namespaces: a blog-specific user-defined "column" (Rubrica) and a platform-wide "category" (Categoria). The latter is used to group and label a couple of platform-wide lists of latest posts, the former may be used in many different ways (e.g. multi-author blogs could have one column per author or so, or as a form of 'macro-tagging'). Columns are also a little more sophisticated than classical tags because you can assign them a subtitle too.
+
+When I started working on the conversion, my first idea was to convert Rubriche to subdirectories of an ikiwiki blog. However, this left me with a few annoying things: when rebuilding links from the import, I had to (programmatically) dive into each subdirectory to see where each post was; this would also be problematic for future posting. It also meant that moving a post from one Rubrica to another would break all links (unless ikiwiki has a way to fix this automagically). And I wasn't too keen on the fact that the Rubrica would come up in the URL of the post. And finally, of course, I couldn't use this to preserve the Categoria metadata.
+
+Another solution I thought about was to use special deeper tags for the Rubrica and Categoria (like: `\[[!tag "Rubrica/Some name"]]`), but this is horrible, clumsy, and makes special treatment of these tags a PITN (for example you wouldn't want the Rubrica to be displayed together with the other tags, and you would want it displayed somewhere else like next to the title of the post). This solution however looks to me like the proper path, as long as tags could support totally separate namespaces. I have a tentative implementation of this `tagtype` feature at [my git clone of ikiwiki](http://git.oblomov.eu/ikiwiki).
+
+The feature is currently implemented as follows: a `tagtypes` config option takes an array of strings: the tag types to be defined _aside from the usual tags_. Each tag type automatically provides a new directive which sets up tags that differ from standard tags by having a different tagbase (the same as the tagtype) and link type (again, the same as the tagtype) (a TODO item for this would be to make the directive, tagbase and link type customizable). For example, for my imported blog I would define
+
+ tagtypes => [qw{Categoria Rubrica}]
+
+and then in the blog posts I would have stuff like
+
+ \[[!Categoria "LAVORO/Vita da impiegato"]]
+ \[[!Rubrica "Il mio mondo"]]
+ \[[!meta title="Blah blah"]]
+ \[[!meta author="oblomov"]]
+
+ The body of the article
+
+ \[[!tag a bunch of tags]]
+
+and the tags would appear at the bottom of the post, the Rubrica next to the title, etc. All of this information would end up as categories in the feeds (although I would like to rework that code to make use of namespaces, terms and labels in a different way).
+
+> Note [[plugins/contrib/report/discussion]]. To quote myself from the latter page:
+> *I find tags as they currently exist to be too limiting. I prefer something that can be used for Faceted Tagging http://en.wikipedia.org/wiki/Faceted_classification; that is, things like Author:Fred Nurk, Genre:Historical, Rating:Good, and so on. Of course, that doesn't mean that each tag is limited to only one value, either; just to take the above examples, something might have more than one author, or have multiple genres (such as Historical + Romance).*
+
+> So you aren't the only one who wants to do more with tags, but I don't think that adding a new directive for each tag type is the way to go; I think it would be simpler to just have one directive, and take advantage of the new [[matching different kinds of links]] functionality, and enhance the tag directive.
+> Perhaps something like this:
+
+ \[[!tag categorica="LAVORO/Vita da impiegato" rubrica="Il mio mondo"]]
+
+> Part of my thinking in this is to also combine tags with [[plugins/contrib/field]], so that the tags for a page could be queried and displayed; that way, one could put them wherever you wanted on the page, using any of [[plugins/contrib/getfield]], [[plugins/contrib/ftemplate]], or [[plugins/contrib/report]].
+> --[[KathrynAndersen]]
+
+>> A very generic metadata framework could cover all possible usages of fields, tags, and related metadata, but keeping its _user interface_ generic would only make it hard to use. Note that this is not an objection to the idea of collapsing the fields and tags functionality (at quick glance, I cannot see a real difference between single-valued custom tagtypes and fields, but see below), but more about the syntax.
+
+>> I had thought about the `\[[!tag type1=value1 type2=value2]]` syntax myself, but ultimately decided against it for a number of reasons, most importantly the fact that (1) it's harder to type, (2) it's harder to spot errors in the tag types (so for example if one misspelled `categoria` as `categorica`, he might not notice it as quickly as seeing the un-parsed `\[[!categorica ]]` directive in the output html) and (3) it encourages collapsing possibly unrelated metadata together (for example, I would never consider putting the categoria information together with the rubrica one; of course with your syntax it's perfectly possible to keep them separate as well).
+
+>> Point (2) may be considered a downside as well as an upside, depending on perspective, of course. And it would be possible to have a set of predefined tag types to match against, like in my tagtype directive approach but with your syntax.
+
+>>> You seem to have answered your own objections already. -- K.A.
+
+>>Point (3) is of course entirely in the hands of the user, but that's exactly what syntax should be about. There is nothing functionally wrong with e.g. `\[[!meta tag=sometag author=someauthor title=sometitle rubrica=somecolumn]]`, but I honestly find it horrible.
+
+>>> So, really, point 3 comes down to differing aesthetics. -- K.A.
+
+>> A solution could be to allow both syntaxes, getting to have for example `\[[!sometagtype "blah"]]` as a shortcut for `\[[!tag sometagtype="blah"]]` (or, in the more general case, `\[[!somefieldname "blah"]]` as a shortcut for `\[[!meta fieldname="blah"]]`).
+
+>> I would like to point out however that there are some functional differences between categorization metadata and other metadata that might suggest keeping fields and (my extended) tags separate. For example, in feeds you'd want all categorization metadata to fall in one place, with some appropriate manipulation (which I still have to implement, by the way), while things like author or title would go to the corresponding feed item properties. Although it all would be possible with appropriate report or template juggling, having such default metadata handled natively looks like a bonus to me.
+
+>>> Whereas I prefer being able to control such things with templates, because it gives more flexibility AND control. - K.A.
+
+>>>> Flexibility and control are good for tuning and power-usage, but sensible defaults are a must for a platform to be usable out of the box without much intervention. Moreover, there's a possible problem with what kind of data must be passed over to templates.
+
+Aside from the name of the plugin (and thus of the main directive), which could be `tag`, `meta`, `field` or whatever (maybe extending `meta` would be the most sensible choice), the features we want are
+
+1. allow multiple values per type/attribute/field/whatever (fields currently only allows one)
+ * Agreed about multiple values; I've been considering whether I should add that to `field`. -- K.A.
+2. allow both hidden and visible references (a la tag vs taglink)
+ * Hidden and visible references; that's fair enough too. My approach with `ymlfront` and `getfield` is that the YAML code is hidden, and the display is done with `getfield`, but there's no reason not to use additional approaches. -- K.A.
+3. allow each type/attribute/field to be exposed under multiple queries (e.g. tags and categories; this is mostly important for backwards compatibility, not sure if it might have other uses too)
+ * I'm not sure what you mean here. -- K.A.
+    * A typical example is tags: they are accessible both as `tags` and as `categories`, although the way they are presented changes a little -- G.B.
+4. allow arbitrary types/attributes/fields/whatever (even 'undefined' ones)
+ * Are you saying that these must be typed, or are you saying that they can be user-defined? -- K.A.
+ * I am saying that the user should be able to define (e.g. in the config) some set of types/fields/attributes/whatever, following the specification illustrated below, but also be able to use something like `\[[!meta somefield="somevalue"]]` where `somefield` was never defined before. In this case `somefield` will have some default values for the properties described in the spec below. -- G.B.
+
+Each type/attribute/field/whatever (predefined, user-defined, arbitrary) would thus have the following parameters:
+
+* `directive` : the name of the directive that can be used to set the value as a hidden reference; we can discuss whether, for pre- or user-defined types, it being undef means no directive or a default directive matching the attribute name would be defined.
+    * I still want there to be enough flexibility in the concept to enable plugins such as `yamlfront`, which sets the data using YAML format, rather than using directives. -- K.A.
+ * The possibility to use a directive does not preclude other ways of defining the field values. IOW, even if the directive `somefield` is defined, the user would still be able to use the syntax `\[[!meta somefield="somevalue"]]`, or any other syntax (such as YAML). -- G.B.
+* `linkdirective` : the name of the directive that can be used for a visible reference; no such directive would be defined by default
+* `linktype` : link type for (hidden and visible) references
+ * Is this the equivalent to "field name"? -- K.A.
+ * This would be such by default, but it could be set to something different. [[Typed links|matching_different_kinds_of_links]] is a very recent ikiwiki feature. -- G.B.
+* `linkbase` : akin to the tagbase parameter
+ * Is this a field-name -> directory mapping? -- K.A.
+ * yes, with each directory having one page per value. It might not make sense for all fields, of course -- G.B.
+ * (nods) I've been working on something similar with my unreleased `tagger` module. In that, by default, the field-name maps to the closest wiki-page of the same name. Thus, if one had the field "genre=poetry" on the page fiction/stories/mary/lamb, then that would map to fiction/genre/poetry if fiction/genre existed. --K.A.
+ * that's the idea. In your case you could have the linkbase of genre be fiction/genre, and it would be created if it was missing. -- G.B.
+* `queries` : list of template queries this type/attribute/field/whatever is exposed to
+ * I'm not sure what you mean here. -- K.A.
+ * as mentioned before, some fields may be made accessible through different template queries, in different form. This is the case already for tags, that also come up in the `categories` query (used by Atom and RSS feeds). -- G.B.
+ * Ah, do you mean that the input value is the same, but the output format is different? Like the difference between TMPL_VAR NAME="FOO" and TMPL_VAR NAME="raw_FOO"; one is htmlized, and the other is not. -- K.A.
+ * Actually this is about the same information appearing in different queries (e.g. NAME="FOO" and NAME="BAR"). Example: say that I defined a "Rubrica" field. I would want both tags and categories to appear in `categories` template query, but only tags would appear in the `tags` query, and only Rubrica values to appear in `rubrica` queries. The issue of different output formats was presented in the next paragraph instead. -- G.B.
+
+Where this approach is limiting is on the kind of data that is passed to (template) queries. The value of the metadata fields might need some massaging (e.g. compare how tags are passed to tags queries vs categories queries, or also see what is done with the fields in the current `meta` plugin). I have trouble picturing an easy way to make this possible user-side (i.e. via templates and not in Perl modules). Suggestions welcome.
+
+One possibility could be to have the `queries` configuration allow a hash mapping query names to functions that would transform the data. Lacking that possibility, we might have to leave some predefined fields to have custom Perl-side treatment and leave custom fields to be untransformable.
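To make the suggestion concrete, such a `queries` mapping might look like this in the setup file (invented syntax, not implemented):

```perl
# hypothetical tagtypes configuration: each query name maps to a
# subroutine that massages the stored value for that template query
tagtypes => {
    Rubrica => {
        linkbase => 'rubriche',
        queries  => {
            rubrica    => sub { $_[0] },    # raw value for the rubrica query
            categories => sub { lc $_[0] }, # normalized for feed categories
        },
    },
},
```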
+
+-----
+
+I've now updated the [[plugins/contrib/field]] plugin to have:
+
+* arrays (multi-valued fields)
+* the "linkbase" option as mentioned above (called field_tags), where the linktype is the field name.
+
+I've also updated [[plugins/contrib/ftemplate]] and [[plugins/contrib/report]] to be able to use multi-valued fields, and [[plugins/contrib/ymlfront]] to correctly return multi-valued fields when they are requested.
+
+--[[KathrynAndersen]]
diff --git a/doc/todo/New_preprocessor_directive_syntax.mdwn b/doc/todo/New_preprocessor_directive_syntax.mdwn
new file mode 100644
index 000000000..2215cc4b4
--- /dev/null
+++ b/doc/todo/New_preprocessor_directive_syntax.mdwn
@@ -0,0 +1,21 @@
+As discussed on IRC, preprocessor directives should transition to a
+new syntax distinct from wikilinks. Possible syntaxes:
+
+* `[[!preprocessor directive]]`
+* `{{preprocessor directive}}`
+
+The transition would involve adding the new syntax, adding an option
+to turn off the old syntax with the default allowing it, giving people
+time to convert their wikis and turn the option on, and releasing a
+new ikiwiki (version 3 for instance) that turns off the old syntax by
+default.
+
+Making this transition would fix two major warts:
+
+* The inability to use spaces in wikilinks or link text
+* The requirement to use a trailing space on a preprocessor directive
+ with no arguments, such as `\[[!toc ]]`
+
+--[[JoshTriplett]]
+
+[[done]] in version 2.21, using the '!'-prefixed syntax. --[[JoshTriplett]]
diff --git a/doc/todo/New_preprocessor_directive_syntax/discussion.mdwn b/doc/todo/New_preprocessor_directive_syntax/discussion.mdwn
new file mode 100644
index 000000000..f6c0fc0ec
--- /dev/null
+++ b/doc/todo/New_preprocessor_directive_syntax/discussion.mdwn
@@ -0,0 +1,19 @@
+Err, is this really fixed in 2.21? I can't find it anywhere in 2.32.3
+(debian unstable)
+
+-----
+
+I just did a `--dumpsetup` with the current version from the Git repository
+and the default option is
+
+ # use '!'-prefixed preprocessor directives?
+ prefix_directives => 0,
+
+My impression was that this should be enabled by default now. --[[JasonBlevins]]
+
+> As stated in `debian/NEWS`:
+>> For backward compatibility with existing wikis,
+>> prefix_directives currently defaults to false. In ikiwiki 3.0,
+>> prefix_directives will default to true [...]
+> --[[intrigeri]]
+
diff --git a/doc/todo/OpenSearch.mdwn b/doc/todo/OpenSearch.mdwn
new file mode 100644
index 000000000..c35da54e1
--- /dev/null
+++ b/doc/todo/OpenSearch.mdwn
@@ -0,0 +1,38 @@
+[[plugins/search]] could provide [OpenSearch](http://www.opensearch.org/)
+metadata. Various software supports OpenSearch (see the Wikipedia article on
+[[!wikipedia OpenSearch]]); in particular, browsers like Firefox and Iceweasel
+will automatically discover an OpenSearch search and offer it in the search
+box.
+
+More specifically, we want to follow the [OpenSearch Description Document
+standard](http://www.opensearch.org/Specifications/OpenSearch/1.1#OpenSearch_description_document),
+by having a `link` with `rel="search"` and
+`type="application/opensearchdescription+xml"` in the headers of HTML, RSS,
+and Atom pages. The `href` of that `link` should point to an
+OpenSearchDescription XML file with contents generated based on the
+information in `ikiwiki.setup`, and the `title` attribute of the `link` should
+contain the wiki title from `ikiwiki.setup`.
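The generated file might look something like this (wiki name and URLs are placeholders, and the `P=` query parameter is an assumption based on the omega front end; it should be taken from the actual search CGI):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <ShortName>MyWiki</ShortName>
  <Description>Search MyWiki</Description>
  <Url type="text/html"
       template="http://example.org/ikiwiki.cgi?P={searchTerms}"/>
</OpenSearchDescription>
```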
+
+--[[JoshTriplett]]
+
+> I support adding this. I think all that is needed, beyond the simple task
+> of adding the link header, is to make the search plugin write out
+> the xml file, probably based on a template.
+>
+> One problem is that the
+> [specification](http://www.opensearch.org/Specifications/OpenSearch/1.1#OpenSearch_description_document)
+> for the XML file contains a number of silly limits to field lengths.
+> For example, it wants a "ShortName" that identifies the search engine,
+> to be 16 characters or less. The Description is limited to 1024,
+> the LongName to 48. This limits what existing config settings can be
+> reused for those.
+>
+> Another semi-problem is that the specification says:
+>
+>> OpenSearch description documents should include at least one Query element of role="example" that is expected to return search results. Search clients may use this example query to validate that the search engine is working properly.
+>
+> How should ikiwiki know what example query will return actual results?
+> (How would a client know if a HTML page contains results or not, anyway?)
+> Silliness. Ignore this? --[[Joey]]
+
+[[wishlist]]
diff --git a/doc/todo/Option_to_disable_date_footer_for_inlines.mdwn b/doc/todo/Option_to_disable_date_footer_for_inlines.mdwn
new file mode 100644
index 000000000..807f4a84c
--- /dev/null
+++ b/doc/todo/Option_to_disable_date_footer_for_inlines.mdwn
@@ -0,0 +1,31 @@
+[[/plugins/inline]], with the `archive` option, shows only page titles and
+post dates. I'd like an option to omit the post dates as well, leaving only
+the page titles. Such an option would streamline the [[/users]] page, for
+instance. --[[JoshTriplett]]
+> Yes, indeed, something like "compact" mode would be useful.
+> In fact, this would be better handled with a replacement of the "archive" on/off API with something like
+> mode = normal|archive|compact|.... defaulting to normal
+> --hb
+
+>> You also don't need to be restricted to a fixed set of modes: the `mode` parameter could simply specify
+>> the template to be used: `inlinepage-$mode.tmpl`. For producing e.g. bulleted lists of the entries, some extra
+>> container template would be useful in addition to that...
+>>
+>> In a related note, I'd like an option to include the creation date on some non-inlined pages too. I suppose
+>> that's doable with some template hook in a plugin, and a command-line parameter pagespec (suffices for me),
+>> but I haven't got around to that yet. --[[tuomov]]
+
+Customised templates can now be specified with the `templates` parameter,
+so done --[[Joey]]
+
+> That definitely solves this problem in general; thanks!
+>
+> For this specific case, I'd still like to see a `titleonly.tmpl` template
+> included by default. How about this simple template, based on
+> archivepage.tmpl?
+>
+> <p><a href="<TMPL_VAR PAGEURL>"><TMPL_VAR TITLE></a></p>
+>
+> --[[JoshTriplett]]
+
+[[todo/done]]
diff --git a/doc/todo/Option_to_make_title_an_h1__63__.mdwn b/doc/todo/Option_to_make_title_an_h1__63__.mdwn
new file mode 100644
index 000000000..8345cd010
--- /dev/null
+++ b/doc/todo/Option_to_make_title_an_h1__63__.mdwn
@@ -0,0 +1,14 @@
+Currently, the page title (either the name of the page or the title specified with `\[[!meta title="..."]]`) shows up in a `<div class="header">`. I tend to follow the [w3c guideline recommending the use of h1 for the title](http://www.w3.org/QA/Tips/Use_h1_for_Title); for this purpose, how about an option to make the page title an `<h1 class="header">`, and shift the markdown headings down by one (making # an h2, ## an h3, etc; or alternatively making # equivalent to `\[[!meta title="..."]]`)?
+
+> The reason I don't use a h1 for the navbar is that while it incorporates
+> the page title, it's not just a page title, it has the links to parent pages.
+> I also don't want to get in the business of munging up markdown's semantics. This
+> way, # is reserved for h1 if you choose to use headers in your page. --[[Joey]]
+
+[[done]]
+
+> For anyone interested, I've written a small plugin called [h1title][] that does the
+> latter, making `#` (only when on the first line) set the page title, removing it from
+> the page body. --[[JasonBlevins]], October 22, 2008
+
+ [h1title]: http://jblevins.org/git/ikiwiki/plugins.git/plain/h1title.pm
diff --git a/doc/todo/Overlay_directory_for_pagetemplates.mdwn b/doc/todo/Overlay_directory_for_pagetemplates.mdwn
new file mode 100644
index 000000000..c4f261d03
--- /dev/null
+++ b/doc/todo/Overlay_directory_for_pagetemplates.mdwn
@@ -0,0 +1,9 @@
+I am aware of the [[pagetemplate|/plugins/pagetemplate]] plugin, but I'd like to style my pages completely differently. Modifications in `/usr/share/ikiwiki` are not an option, and I'd rather not snapshot that directory and make my changes to the detached copy.
+
+Instead, I wonder whether it would be possible to create an overlay directory such that ikiwiki first looks for page templates there, and only goes to `/usr/share/ikiwiki` if the overlay does not satisfy the request.
+
+-----
+
+[[madduck]]: Update: I did try setting `templates` in `ikiwiki.setup` but could not get it to work. Then I found in the code that ikiwiki already checks that dir before the `/usr/share/ikiwiki` one, and tried it again, and now it works... sorry.
+
+Thus [[!taglink done]]. \ No newline at end of file
diff --git a/doc/todo/Pagination_next_prev_links.mdwn b/doc/todo/Pagination_next_prev_links.mdwn
new file mode 100644
index 000000000..8474c9c27
--- /dev/null
+++ b/doc/todo/Pagination_next_prev_links.mdwn
@@ -0,0 +1,68 @@
+I've observed that people just seem to hit a dead end whilst reading my ikiwiki instances.
+
+They don't want to back out of a post to an index. They want an easy button to click to the next or previous post, like what you find on Wordpress sites.
+
+<http://codex.wordpress.org/Next_and_Previous_Links>
+
+[Jekyll](http://jekyllrb.com/)'s implementation looks rather neat:
+
+* <https://github.com/mojombo/jekyll/wiki/template-data>
+* <https://github.com/mojombo/jekyll/blob/master/lib/jekyll/generators/pagination.rb>
+
+
+
+> This is a perfect use for [[todo/wikitrails]], of which my
+> [[plugins/contrib/trail]] plugin is an implementation. Code review on that
+> plugin would be welcome; it might even get merged one day.
+>
+>> The trail plugin is very likely to be merged soon, and is already
+>> available. So, closing this bug report [[done]] --[[Joey]]
+>
+> Unfortunately, IkiWiki blogs use a [[ikiwiki/PageSpec]] to define the set of
+> "posts" in the blog (through which the next/prev trail should range), and
+> the current implementation of [[plugins/contrib/trail]] in terms of typed
+> links would have a circular dependency if used with a PageSpec: typed links
+> have to be added before PageSpecs are evaluated, because "A links to B" is
+> something that can be in a PageSpec; but if you want to add typed links
+> ("A is part of trail B" in this case) based on a PageSpec, then the PageSpec
+> must be evaluated before the typed links can be added. Chicken/egg.
+>
+> One solution would be to make the trail plugin use its own data
+> structure, like [[plugins/contrib/album]] used to do, instead of typed
+> links: at scan time, the trail plugin would just record what the PageSpec
+> was, and delay actually *evaluating* the PageSpec until the beginning
+> of the `render` stage (after all pages have been scanned). This
+> reduces the generic usefulness of typed links, though - in particular
+> you can no longer use "is part of trail A" in a PageSpec. --[[smcv]]
+
+>> Version 3 of [[plugins/contrib/trail]] now does this. For `traillink`
+>> and `trailitem` it additionally adds a typed link, which it does not
+>> itself consume; for `trailinline` and `trail` it doesn't. --[[smcv]]
+
+>>> Indeed, I know the problem; I ran into the same kind of thing with my [[plugins/contrib/report]] plugin and its `trail` concept.
+>>> I simply had to declare that one couldn't use "trail" and "maketrail" options within the same report. That way, "maketrail" will add links in the "scan" pass, and "trail" will test for links in the "build" pass. That usually works. --[[KathrynAndersen]]
+
+>>>> I'm not sure that even that is *quite* right: if your `trail` takes
+>>>> pagespecs as arguments, then it's potentially evaluating those pagespecs
+>>>> before all pages have been scanned, which could mean it lists pages
+>>>> which matched the spec before a recent change, or doesn't list pages
+>>>> which didn't previously match the spec but do now.
+>>>>
+>>>> In version 3 of [[plugins/contrib/trail]] I ended up storing
+>>>> uninterpreted pagespecs and links at scan time, and evaluating them the
+>>>> first time a page is built. I *think* that's sufficiently lazy to give
+>>>> the right answer... --[[smcv]]
+
+>> Do you have an example? --[[hendry]]
+
+>>> Now linked on the plugin's page - it doesn't pretend to be a blog, but
+>>> [the second demo](http://demo.hosted.pseudorandom.co.uk/trail2/)
+>>> is a `trailinline`, so you could do that with blog posts just as well.
+>>> Making [[plugins/contrib/album]] require `trail` v3, and trying it out
+>>> on my blog, are next on the list.
+>>> --[[smcv]]
+
+>>>> Sorry, that link <http://demo.hosted.pseudorandom.co.uk/trail2/> doesn't work. I get a forbidden. --[[hendry]]
+
+
+[[wishlist]]
diff --git a/doc/todo/Plugins_to_provide___34__add_to__34___links_for_popular_feed_readers.mdwn b/doc/todo/Plugins_to_provide___34__add_to__34___links_for_popular_feed_readers.mdwn
new file mode 100644
index 000000000..b755ebdaa
--- /dev/null
+++ b/doc/todo/Plugins_to_provide___34__add_to__34___links_for_popular_feed_readers.mdwn
@@ -0,0 +1,6 @@
+ikiwiki could provide one or more plugins that provide "add to" links for popular feed readers, such as Google Reader, Bloglines, My Yahoo!, or Netvibes.
+
+Potentially less useful given an implementation of [[todo/integration_with_Firefox_and_Iceweasel_feed_subscription_mechanism]].
+--[[JoshTriplett]]
+
+[[wishlist]]
diff --git a/doc/todo/Post-compilation_inclusion_of_the_sidebar.mdwn b/doc/todo/Post-compilation_inclusion_of_the_sidebar.mdwn
new file mode 100644
index 000000000..36161e8b4
--- /dev/null
+++ b/doc/todo/Post-compilation_inclusion_of_the_sidebar.mdwn
@@ -0,0 +1,67 @@
+In some sites (mine, for example), the pages are quasi-static, while the sidebar must be updated at each commit
+(because it contains some lists, like "last posts" or "last updates", or a tagcloud). As this sidebar is included
+in every page of the site, many commits can potentially lead to a full re-compilation...
+
+I think a sidebar included after the compilation (via an SSI mechanism for example) would make sense and
+reduce the dependencies.
+
+Different things could be possible:
+
+* output as .shtml instead of .html
+* ignore the sidebar->page dependency links
+* consider the *real* dependencies; pageA may include the title (only) of pageB, but doesn't need to be recompiled
+after each typo correction on pageB.
+
+shtml output with open cgi web access is a potential security hole and can DoS the site, but it's not a problem for a
+single-editor site.
+
+NicolasLimare
+
+> This is a good idea, though sadly not portable enough to be the default.
+> Especially if the only way to do it is with .shtml.
+> But I really like the idea of not rebuilding the sidebar all the time.
+> Definitely a TODO, for me, if I can figure out how to do it. Patches
+> eagerly accepted.
+>
+> I have implemented a htmlext configuration item, that lets you control
+> what extension ikiwiki uses for output html pages. So in theory, a
+> sidebar could be done as you describe using .shtml. --[[Joey]]
+
+[[wishlist]]
+
+> I have a plan for a way to avoid unnecessary rebuilds caused by the
+> sidebar. The idea is to use wikistate to store what a sidebar renders to.
+> Then in the needsbuild hook, render sidebar(s) and compare with their
+> previous stored rendering. If a sidebar's rendered content has changed,
+> then all pages that display that sidebar need to be forced to be rebuilt.
+>
+> Also, if there is no previous stored rendering for a sidebar, or
+> if there is a stored rendering for a sidebar page that no longer exists, then
+> the pages need to be rebuilt. (This should deal with the [[bugs/Building_a_sidebar_does_not_regenerate_the_subpages]] bug.)
+>
+> This would also save significant time, since the stored sidebar rendering
+> could just be dumped into the page by the pagetemplate hook. Current code
+> re-loads and renders the same sidebar file for every page built!
+>
+> The sticky part is (relative) links on the sidebar. These would need to
+> be modified somehow depending on the page that the sidebar is placed on,
+> to not break the link.
+>
+> Another wrinkle is changing subpage links on a sidebar. Suppose a sidebar
+> links to page `foo`. If page `bar/foo` exists, the sidebar on page bar will,
+> currently, link to that page, in preference to a toplevel `foo`.
+> If `bar/foo` is removed, it will update to link to `foo`. With the new
+> scheme, the stored sidebar rendering is not for page `foo`, and so
+> the change of the `bar/foo` link will not be noticed or acted on.
+> Granted, it's unlikely that anyone relies on the current behavior. You
+> generally want links on a sidebar to link to the same place on every page
+> that displays it. So finding some way to force all links on a sidebar to
+> be handled absolutely and documenting that would avoid this problem.
+>
+> So, one way to handle both the above problems would be to use the
+> pre-rendered sidebar for each page, but use a html parser to look for
+> links in it, and munge them to work as relative links on the page the
+> sidebar is being added to. Or, if the wiki's url is known, just do this
+> once when rendering the sidebar, adding the full url to the links.
+> (Maybe require `url` be set when using sidebar?)
+> --[[Joey]]
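+
+A rough sketch of the needsbuild comparison described above (the helper
+functions and the wikistate layout here are hypothetical, not existing
+ikiwiki API):
+
+    sub needsbuild (@) {
+        my $needsbuild=shift;
+        foreach my $sidebar (sidebar_pages()) {
+            my $new=render_sidebar($sidebar);
+            my $old=$wikistate{$sidebar}{rendered};
+            if (! defined $old || $old ne $new) {
+                # sidebar changed (or is new); force a rebuild of
+                # every page that displays it
+                push @$needsbuild, pages_displaying($sidebar);
+                $wikistate{$sidebar}{rendered}=$new;
+            }
+        }
+        return $needsbuild;
+    }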
diff --git a/doc/todo/Print_link.mdwn b/doc/todo/Print_link.mdwn
new file mode 100644
index 000000000..c7af6c05a
--- /dev/null
+++ b/doc/todo/Print_link.mdwn
@@ -0,0 +1,73 @@
+I think that a Print link, opening a popup window with a printable
+HTML version of the page, is a very useful thing, so I would like
+to have it in my ikiwiki :)
+
+Probably it's better to generate such a page on the fly via CGI
+(just as is done for the RecentChanges page) when a user
+really needs it, instead of building a static printable version
+of every ikiwiki page. --[[Paweł|ptecza]]
+
+> I've always considered print links to be a sign of a badly designed web
+> site that looks ugly when printed because it's ugly everywhere, so I may
+> take some convincing. :-) Ikiwiki pages seem like they'd print out ok
+> as-is to me.
+
+>> ikiwiki home pages are plain and clean, but please note that some
+>> ikiwiki users may have wikis with banners, navbars and a lot
+>> of graphics.
+
+> (I also often click on print links, just to get a web page that I can
+> read, especially often hoping that it will have the whole article on it,
+> instead of the 99 tiny pagelets nasty websites like to split things into.
+> Have I ever mentioned how much I *hate* the web?)
+
+>> I always print all articles that interest me, because I hate reading
+>> them on a monitor. It's too painful for my eyes. And I want
+>> to print only the article *body*, without all the wrappers, because I don't
+>> need them.
+
+> One option, if your stylesheet contained something that was unpalatable
+> in printing, would be to define an alternate stylesheet optimised for
+> printing, and somehow switch the browser to use that stylesheet when
+> printing a page (it can be switched from a menu in the UI of some
+> browsers, but I'm not sure what a good way would be to switch the
+> stylesheet on the fly without re-rendering the page..)
+>
+> --[[Joey]]
+
+>> Maybe you could add `print.css` file for printable version? We just have
+>> `local.css` file for a local styling. --[[Paweł|ptecza]]
+
+>>> Sure, very doable, but the UI to switch to it when printing, I don't
+>>> know..
+
+>>>> Is the UI to switch really necessary? Why not use only the
+>>>> `style.css` and `print.css` files in the header of the printable version
+>>>> of the page? The second file can be the equivalent of the `local.css` file
+>>>> and it can override the default CSS styles.
+
+>>> BTW, I'm sure that the Print link as originally requested could be
+>>> written as a plugin fairly simply. --[[Joey]]
+
+>>>> I'm not a Perl expert, but I can take a look at the code of other
+>>>> ikiwiki plugins.
+
+>>>> BTW, I was also thinking about a plugin for CVS support,
+>>>> but unfortunately I don't have too much free time. --[[Paweł|ptecza]]
+
+>> You don't need a stylesheet-switching UI or a printer-friendly
+>> version; just link to a stylesheet with `media="print"`. --[[JoshTriplett]]
+
+>>> Example? --[[Joey]]
+
+>>>> I used `meta` to add a `media="print"` stylesheet to the [[sandbox]]. In print or
+>>>> print preview (on browsers supporting data URIs), you should no longer
+>>>> see the search form. --[[JoshTriplett]]
+
+>>>>> (And I broke it, since it was a security hole ;-). So it looks like
+>>>>> media=print can be
+>>>>> [used inside a style sheet](http://www.w3.org/TR/REC-CSS2/media.html),
+>>>>> so the thing to do would be to edit style.css to automatically disable parts
+>>>>> not wanted when printing. That would rock. --[[Joey]]
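+
+(For illustration, the sort of rule that amounts to; the selector names
+here are made up, not necessarily the ones style.css really uses:)
+
+    @media print {
+        .actions, #searchform, .sidebar { display: none; }
+    }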
+
+Yay! I've modified the stylesheet and this is [[done]]. --[[Joey]]
diff --git a/doc/todo/RSS_fields.mdwn b/doc/todo/RSS_fields.mdwn
new file mode 100644
index 000000000..54a2f98ad
--- /dev/null
+++ b/doc/todo/RSS_fields.mdwn
@@ -0,0 +1,25 @@
+### Add more fields to the RSS output
+
+I'd like to see more fields in the RSS output, specifically an author
+indication and a comments link. These would be useful for blogging,
+especially on aggregator sites like http://planet.debian.org. I think maybe
+the meta plugin can be used to set the author field, though I haven't tried
+it, but in my opinion a better way would be to have the author taken from
+the user name who created the page (either from the svn commit or from the
+user name in the web commit). Maybe there are issues with this though.
+
+> Yes, the meta plugin will add fields if author meta-info is specified.
+> To get author info from commits it would need to store it in the index file,
+> similarly to how pagectime is stored. Doable.
+
+I'd also like to see a comments field added with a link to the discussion
+page so that comments can be made there. One thing I'm not sure how to deal
+with is the way the discussion page link changes after it has been created.
+There would need to be some way of specifying to ikiwiki.cgi to create the
+page if it doesn't exist, or to just edit the page if it does.
+Alternatively, the discussion pages could be automatically created when a
+new blog post is created, and then the edit link would work fine.
+
+> I would really like for some additional TMPL variables to be present in the rss template as well. For the inline page template, the CTIME TMPL_VAR results in nice phrases like: <q>Posted late Tuesday morning, November 13th, 2007</q>, and it would be neat to let the planet Debian people see that as well :-) Manoj
+
+[[!tag wishlist]]
diff --git a/doc/todo/RSS_links.mdwn b/doc/todo/RSS_links.mdwn
new file mode 100644
index 000000000..bfbd495e0
--- /dev/null
+++ b/doc/todo/RSS_links.mdwn
@@ -0,0 +1,17 @@
+The RSS feeds on a page should be indicated with &lt;link rel&gt;, so that
+they can be found by aggregators.
+
+--tumov
+
+I've been wondering about this. Ikiwiki's rss buttons include a
+type="application/rss+xml" and link to the rss file, and this is enough for
+at least some auto-discovery tools to find the rss feed. But apparently not
+all of them.
+
+For example, firefox requires the following:
+
+ <link rel="alternate" type="application/rss+xml" title="RSS" href="index.rss" />
+
+[[todo/done]]
+
+--[[Joey]]
diff --git a/doc/todo/Raw_view_link.mdwn b/doc/todo/Raw_view_link.mdwn
new file mode 100644
index 000000000..b62a9022b
--- /dev/null
+++ b/doc/todo/Raw_view_link.mdwn
@@ -0,0 +1,19 @@
+I'd like to have a "raw view" link to view the source for the current page. It would go with the history link that each page has.
+
+The configuration setting for Mercurial could be something like this:
+
+ rawurl => "http://localhost:8000//raw-file/tip/\[[file]]",
+
+> What I do when I want to see the raw source is either
+> click on the edit link, or click on history and navigate to it in the
+> history browser (easier to do in viewvc than in gitweb, IIRC).
+> Not that I'm opposed to the idea of a plugin that adds a Raw link
+> --[[Joey]]
+
+>> In [[todo/source_link]], Will does this via the CGI instead of delegating
+>> to gitweb/etc. I think Will's patch is a good approach, and have improved
+>> on it a bit in a git branch.
+
+>>> Since that is merged in now, I'm marking this [[done]] --[[Joey]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/RecentChanges_page_links_without_cgi_wrapper.mdwn b/doc/todo/RecentChanges_page_links_without_cgi_wrapper.mdwn
new file mode 100644
index 000000000..b37109032
--- /dev/null
+++ b/doc/todo/RecentChanges_page_links_without_cgi_wrapper.mdwn
@@ -0,0 +1,26 @@
+Links to the changed page on RecentChanges only show up if the cgi wrapper is
+enabled. It would be nice if links were also generated on wikis that do not use
+the cgi. [[svend]]
+
+> It would be, but doing so would make updating the recentchanges page for
+> each commit a lot slower, or would result in there often being broken
+> links there.
+>
+> The broken links would happen if a page is removed.
+>
+> The speed issue is that currently each individual change in the
+> recentchanges page is built just once, when the change is made, and the
+> html for it is reused thereafter. To avoid broken links, it would need to
+> regenerate each change's html on each commit. That's 100x the overhead.
+> (Perhaps it's possible to be smarter about which need generation tho.)
+>
+> The best way to approach this that I can see ATM is to use the
+> [[plugins/404]] plugin to handle the broken links and then recentchanges
+> could avoid explicitly using the CGI. But this doesn't meet your use case
+> of having no CGI.
+>
+> If you're willing to live with broken links to removed pages, I suppose
+> that could be made an option..
+> --[[Joey]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/Render_multiple_destinations_from_one_source.mdwn b/doc/todo/Render_multiple_destinations_from_one_source.mdwn
new file mode 100644
index 000000000..b7723f919
--- /dev/null
+++ b/doc/todo/Render_multiple_destinations_from_one_source.mdwn
@@ -0,0 +1,85 @@
+I've set up a couple of sites where the users use ikiwiki in fairly standard mode as a CMS and I then set up another ikiwiki setup file that's got the edit options turned off, but is pointing at the same git repository in the background. I then make the post-update hook for each be <tt>post-update-hook.ikiwiki</tt> and <tt>post-update-hook.ikiwiki-public</tt> and have the <tt>post-update</tt> hook itself be a script like:
+
+ #!/bin/sh
+
+ $0.ikiwiki "$@"
+ $0.ikiwiki-public "$@"
+
+Obviously this results in duplication of most of the <tt>ikiwiki.setup</tt>, a spare working directory that (perhaps) isn't needed, and an extra post-update hook plus wrapper script that is really needless extra complication.
+
+If instead there was a way of specifying additional destdirs, or perhaps more generally a way of specifying that there should be multiple passes through the build process using alternative values for some of the variables, then one could have both the private wiki view and the public static view generated with minimal additional configuration.
+
+One idea that occurs to me is an <tt>additional_configs</tt> list where one would specify files containing just the settings you want to override compared with the main setup file.
+
+Alternatively, one might invent a new way of specifying alternative settings. i.e.:
+
+ additionalsites:
+ - public
+
+ destdir: /home/wiki/wiki-view
+ destdir[public]: /home/wiki/public_html
+
+ disable_plugins: []
+ disable_plugins[public]:
+ - recentchanges
+ - editpage
+
+ url: https://example.com/editors/
+ url[public]: http://www.example.com/
+
+ ...
+
+where the existence of the <tt>additionalsites</tt> list provokes additional passes, with the matching suffixed settings overriding the defaults found in the rest of the file.
+
+Just brainstorming a bit after [[liw]]'s comment about this being useful on IRC, and thought I'd write the idea up while I was thinking about it. -[[fil]]
+
+> I don't think you can avoid ikiwiki needing to store a different
+> `.ikiwiki` directory with state for each site. Differences in
+> configuration can affect the state it stores in arbitrary ways,
+> ranging from which pages are even built to what plugins are enabled and
+> store state. This also means that it doesn't make sense to try and
+> share state among rebuilds of the same site.
+>
+> There is a hidden and undocumented configuration setting, `wikistatedir`,
+> that can actually be pointed at a different directory than `.ikiwiki`.
+> Then you can rebuild multiple configurations from one working directory.
+>
+> Another handy trick is to use the old perl-format (not yaml) setup file,
+> and parameterize it using `$ENV{FOO}`, then you can build two different
+> setups from the same setup file.
+> --[[Joey]]
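+
+(For illustration, the `$ENV{FOO}` trick might look like this in a
+perl-format setup file; the `PUBLIC` variable name and the paths are
+made up:)
+
+    use IkiWiki::Setup::Standard {
+        srcdir => "/home/wiki/source",
+        destdir => ($ENV{PUBLIC} ? "/home/wiki/public_html" : "/home/wiki/wiki-view"),
+        url => ($ENV{PUBLIC} ? "http://www.example.com/" : "https://example.com/editors/"),
+    }
+
+Then run `ikiwiki --setup ikiwiki.setup` once as-is and once with
+`PUBLIC=1` in the environment.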
+
+> > My post-update script has grown a bit, as I'm using ikiwiki-hosting now, so want to let the users update stuff themselves:
+> >
+> > #!/bin/sh
+> >
+> > PUB_URL=http://truestedt.hands.com
+> > PUB_TMPL=$HOME/source-public/templates-public
+> >
+> > # make the public config, in case of updates via ikiwiki-hosting
+> > sed -e 's/^\(srcdir\|destdir\|git_wrapper\): .*/&-public/;s#^\(url:\).*#\1 '$PUB_URL'#;s/^\(cgi_wrapper:\).*/\1 '"''"'/;s#^\(templatedir:\).*#\1 '$PUB_TMPL'#;s/^\(cgiurl\|historyurl\):/#&/;/disable_plugins:/a \
+> > - recentchanges\
+> > - editpage' ~/ikiwiki.setup > ~/ikiwiki.setup-public
+> > #echo 'wikistatedir: source/.ikiwiki-public' >> ~/ikiwiki.setup-public
+> > [ -d ~/source-public ] || cp -a ~/source ~/source-public
+> > [ -d ~/public_html-public ] || mkdir ~/public_html-public
+> >
+> > # run normal post-update hook
+> > ./hooks/post-update-ikiwiki "$@"
+> >
+> > # run post-update hook for the public version of the site
+> > ./hooks/post-update-ikiwiki-public "$@"
+> >
+> > exec git update-server-info
+> >
+> > I tried using wikistatedir, as you suggested, but then wiki edits are not reflected on the second site (AFAICT), so I reverted to having a full checkout of the source.
+> > I'm guessing that that's because the second run through with the post-update hook sees no changes that it needs to worry about in the source directory, but it's just
+> > possible that I got confused while testing, as the sed is pretty fragile, so some of the time it was failing because of sed syntax errors.
+> >
+> > It strikes me that one ought to be able to have a plugin that takes the current config, applies a few minor tweaks to it (perhaps by loading an extra config file) and
+> > then does some or all of the tasks normally run by main() again, targeting a new directory -- that way there would be no need for the two post-updates, and whatever
+> > provoked a rebuild would always do both, whether on the command line or via CGI.
+> > I just don't know quite where the right place to plumb such a plugin in would be -- also, it would be good to separate out the bits of main() that we'd be calling,
+> > so that both the plugin and main call them in the same way, to ease future maintenance.
+> >
+> > Any hints on where to start with such a plugin, gratefully received :-) -[[fil]]
diff --git a/doc/todo/Resolve_native_reStructuredText_links_to_ikiwiki_pages.mdwn b/doc/todo/Resolve_native_reStructuredText_links_to_ikiwiki_pages.mdwn
new file mode 100644
index 000000000..6e0f32fd5
--- /dev/null
+++ b/doc/todo/Resolve_native_reStructuredText_links_to_ikiwiki_pages.mdwn
@@ -0,0 +1,333 @@
+_NB! this page has been refactored, hopefully it is clearer now_
+_I propose putting discussion posts somewhere in the vicinity of
+the section Individual reStructuredText Issues_
+
+## Design ##
+
+**Goal**
+
+To be able to use rst as a first-class markup language in ikiwiki. I think
+most believe this is almost impossible (ikiwiki is built around markdown).
+
+## Wikilinks ##
+
+**WikiLinks**, first and foremost, are needed for a wiki. rST already allows
+specifying absolute and relative URL links, and relative links can be used to
+tie together a wiki of rst documents.
+
+1. Below are links to a small, working implementation for resolving
+ undefined rST references using ikiwiki's mechanism. This is **Proposal 1**
+ for rst WikiLinks.
+
+2. Looking over at rST-using systems such as trac and MoinMoin; I think it
+ would be wiser to implement wikilinks by the `:role:` mechanism, together
+ with allowing a custom URL scheme to point to wiki links. This is
+ **Proposal 2**.
+
+ This is a simple wiki page, with :wiki:`WikiLinks` and other_ links
+
+ .. _other: wiki:wikilink
+
+ We can get rid of the role part as well for WikiLinks::
+
+ .. default-role:: wiki
+
+ Enables `WikiLinks` but does not impact references such as ``other``
+ This can be made the default for ikiwiki.
+
+Benefits of using a `:role:` and a `wiki: page/subpage` URL scheme are
+following:
+
+1. rST documents taken out of the context (the wiki) will not fail as badly as
+ if they have lots of Proposal-1 links: They look just the same as valid
+ references, and you have to edit them all.
+ In contrast, should the `:wiki:` role disappear, one line is enough
+   to redefine it and silence all the warnings for the document:
+
+ .. role:: wiki (title)
+
+### Implementation ###
+
+Implementation of Proposal-2 wikilinks is in the branch
+[rst-wikilinks][rst-wl]
+
+
+ This is a simple wiki page, with :wiki:`WikiLinks` and |named| links
+
+ .. |named| wiki:: Some Page
+
+ We can get rid of the role part as well for WikiLinks::
+
+ .. default-role:: wiki
+
+ Enables `WikiLinks` but does not impact references such as ``named``
+ This can be made the default for ikiwiki.
+
+[rst-wl]: http://github.com/engla/ikiwiki/commits/rst-wikilinks
+
+**rst-wikilinks** patch series includes changes at the end to use ikiwiki's
+'htmllink' for the links (which is the only sane thing to do to work in all configurations).
+This means a :wiki:`Link` should render just exactly like [[Link]] whether
+the target exists or not.
+
+On top of **rst-wikilinks** is [rst-customize][rst-custom] which adds two
+power user features: Global (python) file to read in custom directives
+(unsafe), and a wikifile as "header" file for all parsed .rst files (safe,
+but disruptive since all .rst depend on it). Well, the customizations have
+to be picked and chosen from this, but at least the global python file can
+be very convenient.
+
+> Did you consider just including the global rst header text into an item
+> in the setup file? --[[Joey]]
+>
+>> Then `rst_header` would not be much different from the python script
+>> `rst_customize`. rst_header is as safe as other files (though disruptive
+>> as noted), so it should/could be an editable file in the Wiki. A Python
+>> script of course can not be. There is nothing you can do in the
+>> rst_header (that you sensibly would do, I think) that couldn't be done in
+>> the Python script. `rst_header` has very limited use, but it is another
+>> possibility, mainly for the user-editable aspect. --[[ulrik]]
+>>
+>> (I foresaw only two things to be added to the rst_header: the default
+>> role could be configured there (as with rst_wikirole), and if you have a
+>> meta-role like :shortcut:, shortcuts could be defined there.)
+>
+> I have some discussion on the [docutils mailing list][dml], the developers
+> of docutils seem to favor "Proposal 1", while I defend my ideas. They
+> want all users of ReST to use only the basic featureset to remain
+> compatible, of course. -- [[ulrik]]
+
+[dml]: http://thread.gmane.org/gmane.text.docutils.user/5376
+
+Some rst-custom [examples are here](http://kaizer.se/wiki/rst_examples/)
+
+[rst-custom]: http://github.com/engla/ikiwiki/commits/rst-customize
+
+## Directives ##
+
+Now **Directives**: As it is now, ikiwiki goes through (roughly):
+filter, preprocess, htmlize, format as major stages of content
+transformation. rST has major problems to work with any HTML that enters the
+picture before it.
+
+1. Formatting rST in `htmlize` (as is done now): Raw html can be escaped by
+ raw blocks:
+
+ .. raw:: html
+
+ \[[!inline and do stuff]]
+
+ (This can be simplified to alias the above as `.. ikiwiki::`)
+   This escape method works, if ikiwiki can be persuaded to maintain the
+ indent when inserting html, so that it stays inside the raw block.
+
+2. Formatting rST in `filter` (idea)
+ 1. rST does not have to see any HTML (raw not needed)
+ 2. rST directives can alias ikiwiki syntax:
+
+ ..ikiwiki:: inline pages= ...
+
+ 3. Using rST directives as ikiwiki directives can be complicated;
+ but rST directives allow a direct line (after :: on first line),
+ an option list, and a content block.
+
+> You've done a lot of work already, but ...
+>
+> The filter approach seems much simpler than the other approaches
+> for users to understand, since they can just use identical ikiwiki
+> markup on rst pages as they would use anywhere else. This is very desirable
+> if the wiki allows rst in addition to mdwn, since then users don't have
+> to learn two completely different ways of doing wikilinks and directives.
+> I also wonder if even those familiar with rst would find entirely natural
+> the ways you've found to shoehorn in wikilinks, named wikilinks, and ikiwiki
+> directives?
+>
+> Htmlize in filter avoids these problems. It also leaves open the possibility
+> that ikiwiki could become smarter about the rendering chain later, and learn
+> to use a better order for rst (ie, htmlize first). If that later happened,
+> the htmlize in filter hack could go away. --[[Joey]]
+
+> (BTW, the [[plugins/txt]] plugin already does html formatting
+> in filter, for similar reasons.) --[[Joey]]
+
+>> Thank you for the comments! Forget the work, it's not so much.
+>> I'd rank the :wiki: link addition pretty high, and the other changes way
+>> behind that:
+>>
+>> The :wiki:`Wiki Link` syntax is *very* appropriate as rst syntax
+>> since it fits well with other uses of roles (notice that :RFC:`822`
+>> inserts a link to RFC822 etc, and that the default role is a *title* role
+>> (title of some work); thus very appropriate for medium-specific links like
+>> wiki links. So I'd rank :wiki: links a worthwhile addition regardless of
+>> outcome here, since it's a very rst-like alternative for those who wish to
+>> use more rst-like syntax (and documents degrade better outside the wiki as
+>> noted).
+>>
+>>> Unsure about the degradation argument. It will work some of
+>>> the time, but ikiwiki's [[ikiwiki/subpage/linkingrules]]
+>>> are sufficiently different from normal html relative link
+>>> rules that it often won't work. --[[Joey]]
+>>>
+>>>> With degradation I mean that if you take a file out of the wiki; the
+>>>> links degrade to stylized text. If using default role, they degrade to
+>>>> :title: which renders italicized text (which I find is exactly
+>>>> appropriate). There is no way for them to degrade into links, except of
+>>>> course if you reimplement the :wiki: role. You can also respecify
+>>>> either the default role (the `wikilink` syntax) or the :wiki: role (the
+>>>> :wiki:`wikilink` syntax) to any other markup, for example None.
+>>>> --[[ulrik]]
+>>
+>> The named link syntax (just like the :wiki: role) are inspired from
+>> [trac][tracrst] and a good fit, but only if the wiki is committed to
+>> using only rst, which I don't think is the case.
+>>
+>> The rst-customize changes are very useful for custom directive
+>> installations (like the sourcecode directive, or shortcut roles I show
+>> in the examples page), but there might be a way for the user to inject
+>> docutils addons that I'm missing (one very ugly way would be to stick
+>> them in sitecustomize.py which affects all Python programs).
+>>
+>> With the presented changes, I already have a working RestructuredText
+>> wiki, but I'm admitting that using .. raw:: html around all directives is
+>> very ugly (I use few directives: inline, toggle, meta, tag, map)
+>>
+>> On filter/htmlize: Well **rst** is clearly antisocial: It can't see HTML,
+>> and ikiwiki directives are wrapped in paragraph tags. (For wikilinks
+>> this is probably no problem). So the suggestion about `.. ikiwiki:` is
+>> partly because it looks good in rst syntax, but also since it would emit
+>> a div to wrap around the element instead of a paragraph.
+>>
+>> I don't know if you mean that rst could be reordered to do htmlize before
+>> other phases? rst must be before any preprocess hook to avoid seeing any
+>> HTML.
+>>
+>>> One of my long term goals is to refactor all the code in ikiwiki
+>>> that manually runs the various stages of the render pipeline,
+>>> into one centralized place. Once that's done, that place can get
+>>> smart about what order to run the stages, and use a different
+>>> order for rst. --[[Joey]]
+>>
+>> If I'm thinking right, processing to HTML already in filter means any
+>> processing in scan can be reused directly (or skipped if it's legal to
+>> emit 'add_link' in filter.)
+>>
+>> -- [[ulrik]]
+
+>>> Seems it could be, yes. --[[Joey]]
+>>>
+>>>> It is not clear how we can work around reST wrapping directives with
+>>>> paragraph tags. Also, some escaping of xml characters & <> might
+>>>> happen, but I can't imagine right now what breakage can come from that.
+>>>> -- [[ulrik]]
+
+[tracrst]: http://trac.edgewall.org/wiki/WikiRestructuredText
+
+### Implementation ###
+
+Preserving indents in the preprocessor is in branch [pproc-indent][ppi]
+
+(These simple patches come with a warning: _Those are the first lines of
+Perl I've ever written!_)
+
+> This seems like a good idea, since it solves issues for eg, indented
+> directives in mdwn as well. But, looking at the diff, I see a clear bug:
+>
+> - return "[[!$command <span class=\"error\">".
+> + $result = "[[!$command <span class=\"error\">".
+>
+> That makes it go on and parse an infinitely nested directive chain, instead
+> of immediately throwing an error.
+>
+> Also, it seems that the "indent" matching in the regexps may be too broad,
+> wouldn't it also match whitespace before a directive that was not at the beginning
+> of a line, and treat it as an indent? With some bad luck, that could cause mdwn
+> to put the indented output in a pre block. --[[Joey]]
+>
+>> You are probably right about the bug. I'm not quite sure what the nested
+>> directives examples look like, but I must have overlooked how the
+>> recursion counter works; I thought simply changing if to elif the next
+>> few lines would solve that. I'm sorry for that!
+>>
+>> We don't have to change the `$handle` function at all, if it is possible
+>> to do the indent substitution all in one line instead of passing it to
+>> handle, I don't know if it is possible to turn:
+>>
+>> $content =~ s{$regex}{$handle->($1, $2, $3, $4, $5)}eg;
+>>
+>> into
+>>
+>> $content =~ s{$regex}{s/^/$1/gm{$handle->($2, $3, $4, $5)}}eg;
+>>
+>> Well, no idea how that would be expressed, but I mean, replace the indent
+>> directly in $handle's return value.
+>>
+>>> Yes, in effect just `indent($1, $handle->($2, $3, $4, $5))` --[[Joey]]
+>>
+>> The indent-catching regex is wrong in the way you mention, it has been
+>> nagging my mind a bit as well; I think matching start of line + spaces
+>> and tabs is the only thing we want.
+>> -- [[ulrik]]
+>>
+>>> Well, seems you want to match the indent at the start of the line containing
+>>> the directive, even if the directive does not start the line. That would
+>>> be quite hard to make a regexp do, though. --[[Joey]]
+>>
+>> I wasted a long time getting the simpler `indent($1, $handle->($2, $3, $4, $5))` to
+>> work (remember, I don't know perl at all). Somehow `$1` does not arrive, I
+>> made a simple testcase that worked, and I conclude something inside $handle
+>> results in the value of $1 not arriving as it should!
+>>
+>> Anyway, instead a very simple incremental patch is in [pproc-indent][ppi]
+>> where the indentation regex is `(^[ \t]+|)` instead, which seems to work
+>> very well (and the regex is multiline now as well). I'm happy to rebase the
+>> changes if you want or you can just squash the four patches 1+3 => 1+1
+>> -- [[ulrik]]
+
+[ppi]: http://github.com/engla/ikiwiki/commits/pproc-indent
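+
+(For reference, the one-pass substitution discussed above, with `indent`
+as a hypothetical helper that prefixes each line of its second argument
+with the captured indent, might look like:)
+
+    sub indent ($$) {
+        my ($prefix, $text)=@_;
+        $text =~ s/^/$prefix/gm;
+        return $text;
+    }
+
+    $content =~ s{$regex}{indent($1, $handle->($2, $3, $4, $5))}eg;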
+
+## Discussion ##
+
+I guess you (or someone) has been through this before and knows why it
+simply won't work. But I hoped there was something original in the above;
+and I know there are wiki installations where rST works. --ulrik
+
+**Individual reStructuredText Issues**
+
+* We resolve rST links without a definition; we don't help resolve defined
+  relative links, so we don't support specifying link name and target
+  separately.
+
+ * Resolved by |replacement| links with the wiki:: directive.
+
+**A first implementation: Resolving unmatched links**
+
+I have a working minimal implementation letting the rst renderer resolve
+undefined native rST links to ikiwiki pages. I have posted it as one patch at:
+
+Preview commit: http://github.com/engla/ikiwiki/commit/486fd79e520da1d462f00f40e7a90ab07e9c6fdf
+Repository: git://github.com/engla/ikiwiki.git
+
+Design issues of the patch:
+
+The page is rST-parsed once in 'scan' and once in 'htmlize' (the first to generate backlinks). Can the parse output be safely reused?
+
+> The page content fed to htmlize may be different than that fed to scan,
+> as directives can change the content. If you cached the input and output
+> at scan time, you could reuse the cached data at htmlize time for inputs
+> that are the same -- but that could be a very big cache! --[[Joey]]
+
+>> I would propose using a simple heuristic: If you see \[[ anywhere on the
+>> page, don't cache it. It would be an effective cache for pure-rst wikis
+>> (without any ikiwiki directives or wikilinks).
+>> However, I think that if the cache does not work for a big load, it should
+>> not work at all; small loads are small so they don't matter. --ulrik
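+
+(A sketch of that heuristic in a scan hook; the helper names here are
+hypothetical, not ikiwiki API:)
+
+    my %cache;
+    sub scan (@) {
+        my %params=@_;
+        if ($params{content} !~ /\[\[/) {
+            # no directives or wikilinks: the rendered html can be
+            # cached here and reused at htmlize time
+            $cache{$params{page}}=rst_to_html($params{content});
+        }
+    }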
+
+-----
+
+Another possibility is using an empty url for wikilinks (gitit uses this approach), for example:
+
+ `SomePage <>`_
+
+Since it uses an *empty* url, I would like to call it *proposal 0* :-) --[weakish]
+
+[weakish]: http://weakish.pigro.net
diff --git a/doc/todo/Restrict_formats_allowed_for_comments.mdwn b/doc/todo/Restrict_formats_allowed_for_comments.mdwn
new file mode 100644
index 000000000..9aee29037
--- /dev/null
+++ b/doc/todo/Restrict_formats_allowed_for_comments.mdwn
@@ -0,0 +1,99 @@
+I want to write my blog posts in a convenient format (Emacs org mode)
+but do not want commenters to be able to use this format for security
+reasons. This patch makes it possible to configure which formats are
+allowed for writing comments.
+
+Effectively, it restricts the formats enabled with `add_plugin` to those
+mentioned in `comments_allowformats`. If this is empty, all formats are
+allowed, which is the behavior without this patch.
+
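The core of the patch is a word-boundary match of the comment's format against the space-separated `comments_allowformats` string. A standalone sketch of that check (same semantics as the patch's `isallowed`, with `\Q...\E` added here so a format name can't be misread as regex metacharacters):

```perl
use strict;
use warnings;

# Mirrors the patch's isallowed(): an empty setting means "allow
# everything", otherwise the format must appear as a whole word
# in the space-separated list.
sub isallowed {
    my ($allowed, $format) = @_;
    return 1 unless defined $allowed && length $allowed;
    return $allowed =~ /\b\Q$format\E\b/ ? 1 : 0;
}

print isallowed('',         'org'),  "\n";  # 1: no restriction
print isallowed('mdwn txt', 'mdwn'), "\n";  # 1: listed
print isallowed('mdwn txt', 'org'),  "\n";  # 0: not listed
```

One caveat of `\b` matching is that a configured `mdwn` would also match inside a hyphenated name such as `mdwn-foo`; exact, space-separated names as in the `example` value avoid surprises.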
+The patch can be pulled from my repo ([gitweb](https://rtime.felk.cvut.cz/gitweb/sojka/ikiwiki.git/commitdiff/c42fd7d7580d081f3e3f624fd74219b0435230f6?hp=bfc9dc93c9f64a9acfff4683b69995d5a0edb0ea))
+
+ git pull git://rtime.felk.cvut.cz/sojka/ikiwiki.git restrict-comment-formats
+
+---
+
+<pre>
+From c42fd7d7580d081f3e3f624fd74219b0435230f6 Mon Sep 17 00:00:00 2001
+From: Michal Sojka <sojkam1@fel.cvut.cz>
+Date: Tue, 5 Mar 2013 10:54:51 +0100
+Subject: [PATCH] Add configuration to restrict the formats allowed for
+ comments
+
+I want to write my blog posts in a convenient format (Emacs org mode)
+but do not want commenters to be able to use this format for security
+reasons. This patch allows to configure which formats are allowed for
+writing comments.
+
+Effectively, it restricts the formats enabled with add_plugin to those
+mentioned in comments_allowformats. If this is empty, all formats are
+allowed, which is the behavior without this patch.
+---
+ IkiWiki/Plugin/comments.pm | 21 +++++++++++++++++++--
+ 1 file changed, 19 insertions(+), 2 deletions(-)
+
+diff --git a/IkiWiki/Plugin/comments.pm b/IkiWiki/Plugin/comments.pm
+index 285013e..151e839 100644
+--- a/IkiWiki/Plugin/comments.pm
++++ b/IkiWiki/Plugin/comments.pm
+@@ -90,6 +90,15 @@ sub getsetup () {
+ safe => 0,
+ rebuild => 0,
+ },
++ comments_allowformats => {
++ type => 'string',
++ default => '',
++ example => 'mdwn txt',
++ description => 'Restrict formats for comments to (no restriction if empty)',
++ safe => 1,
++ rebuild => 0,
++ },
++
+ }
+
+ sub checkconfig () {
+@@ -101,6 +110,8 @@ sub checkconfig () {
+ unless defined $config{comments_closed_pagespec};
+ $config{comments_pagename} = 'comment_'
+ unless defined $config{comments_pagename};
++ $config{comments_allowformats} = ''
++ unless defined $config{comments_allowformats};
+ }
+
+ sub htmlize {
+@@ -128,12 +139,18 @@ sub safeurl ($) {
+ }
+ }
+
++sub isallowed ($) {
++ my $format = shift;
++ return ! $config{comments_allowformats} || $config{comments_allowformats} =~ /\b$format\b/;
++}
++
+ sub preprocess {
+ my %params = @_;
+ my $page = $params{page};
+
+ my $format = $params{format};
+- if (defined $format && ! exists $IkiWiki::hooks{htmlize}{$format}) {
++ if (defined $format && (! exists $IkiWiki::hooks{htmlize}{$format} ||
++ ! isallowed($format))) {
+ error(sprintf(gettext("unsupported page format %s"), $format));
+ }
+
+@@ -332,7 +349,7 @@ sub editcomment ($$) {
+
+ my @page_types;
+ if (exists $IkiWiki::hooks{htmlize}) {
+- foreach my $key (grep { !/^_/ } keys %{$IkiWiki::hooks{htmlize}}) {
++ foreach my $key (grep { !/^_/ && isallowed($_) } keys %{$IkiWiki::hooks{htmlize}}) {
+ push @page_types, [$key, $IkiWiki::hooks{htmlize}{$key}{longname} || $key];
+ }
+ }
+--
+1.7.10.4
+
+</pre>
+
+[[!tag patch]]
+
+> [[done]] --[[Joey]]
diff --git a/doc/todo/Restrict_page_viewing.mdwn b/doc/todo/Restrict_page_viewing.mdwn
new file mode 100644
index 000000000..20b59cb13
--- /dev/null
+++ b/doc/todo/Restrict_page_viewing.mdwn
@@ -0,0 +1,42 @@
+I'd like to have some pages of my wiki to be only viewable by some users.
+
+I could use htaccess for that, but it would force the users to have
+two authentication mechanisms, so I'd prefer to use OpenID for that too.
+
+* I'm thinking of adding a "show" parameter to the cgi script, thanks
+ to a plugin similar to goto.
+* When called, it would check the credentials using the session stuff
+ (that I don't understand yet).
+* If not enough, it would serve a 403 error of course.
+* If enough, it would read the file locally on the server side and
+ return it as the content.
+
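The steps above could look roughly like this (function and layout names are hypothetical; a real plugin would use ikiwiki's CGI and session objects rather than plain arguments):

```perl
use strict;
use warnings;

# Decide how to answer a hypothetical ?do=show request: 403 without
# acceptable credentials, 404 if the rendered page doesn't exist,
# otherwise read the already-compiled page from the destdir.
sub show_response {
    my ($session_user, $allowed_users, $destdir, $page) = @_;
    return { status => 403 }
        unless defined $session_user
            && grep { $_ eq $session_user } @$allowed_users;
    my $file = "$destdir/$page/index.html";
    return { status => 404 } unless -e $file;
    open my $fh, '<', $file or return { status => 500 };
    local $/;                       # slurp the whole file
    my $html = <$fh>;
    return { status => 200, body => $html };
}
```

Note the page is still compiled the regular way; this only gates who may fetch it, matching the "serve on demand, not render on demand" clarification below.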
+Then, I'd have to generate the private page the regular way with ikiwiki,
+and prevent apache from serving them with an appropriate and
+much more maintainable htaccess file.
+
+-- [[users/emptty]]
+
+> While I'm sure a plugin could do this, it adds so much scalability cost
+> and is so counter to ikiwiki's design.. Have you considered using the
+> [[plugins/httpauth]] plugin to unify around htaccess auth? --[[Joey]]
+
+>> I'm not speaking of rendering the pages on demand, but of serving them on demand.
+>> They would still be compiled the regular way;
+>> I'll have another look at [[plugins/httpauth]] but I really like the openID whole idea.
+>> --[[emptty]]
+
+>>> How about
+>>> [mod_auth_openid](http://trac.butterfat.net/public/mod_auth_openid), then?
+>>> A plugin for ikiwiki to serve its own pages is far afield from ikiwiki's roots,
+>>> as Joey pointed out, but might be a neat option to have anyway -- for unifying
+>>> authentication across views and edits, for systems not otherwise running
+>>> web servers, for systems with web servers you don't have access to, and
+>>> doubtless for other purposes. Such a plugin would add quite a bit of flexibility,
+>>> and in that sense (IMO, of course) it'd be in the spirit of ikiwiki. --[[schmonz]]
+
+>>>> Yes, I think this could probably be used in combination with ikiwiki's
+>>>> httpauth and openid plugins. --[[Joey]]
+
+>>>>> If you use the httpauth and the cgiauthurl method, you can restrict a path
+>>>>> like /private/* to be accessible only under the authenticated request uri.
diff --git a/doc/todo/Separate_OpenIDs_and_usernames.mdwn b/doc/todo/Separate_OpenIDs_and_usernames.mdwn
new file mode 100644
index 000000000..b7ff82282
--- /dev/null
+++ b/doc/todo/Separate_OpenIDs_and_usernames.mdwn
@@ -0,0 +1,55 @@
+I see OpenID more as an authentication technology than as a human-friendly identifier. It would be cool if, in addition to my identity URL, I could also associate a username with my account. I'd then use the URL to log in, but all changes would be associated with the username I provide. Additionally, I could sign changes with my username, possibly change my identity URL and set a password. It would be nice if I could use my identity URL for authentication convenience and actually be known as nolan or thewordnerd on my wikis, rather than the somewhat less human http://thewordnerd.info. :)
+
+Separating username from identity URL would also let me change the URL later. It would be nice, for instance, if I could assign a username to my account and change the identity to my preferred thewordnerd.info once delegation is supported, without the potential of losing access to my account and contributions. :)
+
+I see this being implemented in one of two possible ways. The easiest seems like it'd involve splitting the fields, doing a simple OpenID verification as is done today, then allow setting of username on the preferences page. When crediting a user for a change, call a function that returns either the username or, if it is null, the identity URL. Then, allow logging into the same account with the username, but only if the password is non-blank. That seems like the most minimal and least invasive way of making the change.
+
+A slightly more complex next step would be to request sreg from the provider and, if provided, automatically set the identity's username and email address from the provided persona. If username login to accounts with blank passwords is disabled, then you have the best of both worlds: passwordless signin, human-friendly attribution, automatic setting of preferences.
+
+> Given that openids are a global user identifier, that can look as pretty
+> as the user cares to make it look via delegation, I am not a fan of
+> having a site-local identifier layered on top of that. Perhaps
+> partly because every site that I have seen that does that has openid
+> implemented as a badly-done wart on the side of their regular login
+> system.
+>
+> > If there are user profiles on the site with non-empty information associated with them (including permissions, reputation), then it would make more sense to be able to access your user profile with alternative OpenIDs (in case one of the providers goes down), as on <http://stackoverflow.com>. In ikiwiki, there might be no such special information associated with users (or can you think of something like this?), except for the admin rights. But fortunately, several OpenIDs can be set up for admins in ikiwiki. (Only if it comes to [the OpenIDs provided by Gmail][forum/google openid broken?], then it turns out to be unhandy to write the ID into the configuration file as a second admin ID.)--Ivan Z.
+>
+> The openid plugin now attempts to get an email and a username, and stores
+> them in the session database for later use (ie, when the user edits a
+> page).
+>
+> I am considering displaying the userid or fullname, if available,
+> instead of the munged openid url in recentchanges and comments.
+> It would be nice for those nasty [[google_openids|forum/google_openid_broken?]].
+> But, I first have to find a way to encode the name in the VCS commit log,
+> while still keeping the openid of the committer in there too.
+> Perhaps something like this (for git): --[[Joey]]
+>
+> Author: Joey Hess &lt;http://joey.kitenet.net/@web&gt;
+>
+> Only problem with the above is that the openid will still be displayed
+> by CIA. Other option is this, which solves that, but at the expense of
+> having to munge the username to fit inside the email address,
+> and generally seems backwards: --[[Joey]]
+>
+> Author: http://joey.kitenet.net/ &lt;Joey_Hess@web&gt;
+>
+> So, what needs to be done:
+>
+> * Change `rcs_commit` and `rcs_commit_staged` to take a session object,
+> instead of just a userid. (For back-compat, if the parameter is
+> not an object, it's a userid.) Bump ikiwiki plugin interface version.
+> (done)
+> * Modify all RCS plugins to include the session username somewhere
+> in the commit, and parse it back out in `rcs_recentchanges`.
+> (done for git only so far)
+> * Modify recentchanges plugin to display the username instead of the
+> `openiduser`.
+> (done)
+> * Modify comment plugin to put the session username in the comment
+> template instead of the `openiduser`. (done)
+
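For the second encoding above, splitting the commit author back into openid and username is straightforward. A sketch (the `Name_With_Underscores@web` munging is assumed from the example, not taken from shipped code):

```perl
use strict;
use warnings;

# Split "openid <Munged_Username@web>" back into its parts, undoing
# the underscore munging shown in the example above.
sub parse_web_author {
    my $author = shift;
    if ($author =~ /^(\S+) <([^>@]+)\@web>$/) {
        (my $name = $2) =~ tr/_/ /;
        return { openid => $1, username => $name };
    }
    return undef;   # not in the expected encoding
}

my $a = parse_web_author('http://joey.kitenet.net/ <Joey_Hess@web>');
print "$a->{openid} $a->{username}\n"; # http://joey.kitenet.net/ Joey Hess
```

The "backwards" feel Joey mentions is visible here: the username must round-trip through an email-shaped token, which is why spaces have to be munged at all.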
+Unfortunately I don't speak Perl, so hopefully someone thinks these suggestions are good enough to code up. I've hacked on openid code in Ruby before, so hopefully these changes aren't all that difficult to implement. Even if you don't get any data via sreg, you're no worse off than where you are now, so I don't think there'd need to be much in the way of error/sanity-checking of returned data. If it's null or not available then no big deal, typing in a username is no sweat.
+
+[[!tag wishlist done]]
diff --git a/doc/todo/Set_arbitrary_date_to_be_used_by_calendar_plugin.mdwn b/doc/todo/Set_arbitrary_date_to_be_used_by_calendar_plugin.mdwn
new file mode 100644
index 000000000..e0074eef8
--- /dev/null
+++ b/doc/todo/Set_arbitrary_date_to_be_used_by_calendar_plugin.mdwn
@@ -0,0 +1,210 @@
+[[!tag patch plugins/calendar]]
+
+Here's my next version of the patch - still a work in progress.
+
+ Note: I partially updated part of this patch to work on Ikiwiki v3 - see [here](http://ikiwiki.info/forum/Calendar:_listing_multiple_entries_per_day/) -- Matt Ford
+
+It provides the following new features. The features are designed to preserve the behavior of the existing plugin by default.
+
+ * If you specify an event preprocessor in a post, such as:
+
+ [[!event time="2008-06-24"]]
+
+ That date will be used instead of the post creation time when displaying the calendar.
+
+ * When specifying a calendar preprocessor, you can now add the following new parameters:
+
+ * time_src: by default it is set to auto. auto means that when determining the date to use for a given post, take the date given in the event preprocessor directive if it exists and, if not, use the creation time of the post. The other option is event, which means: take the date given in the event preprocessor directive. If there is no event preprocessor directive in a post, don't include the post.
+ * detail: by default it is set to 0. If set to 0, display the monthly calendar as a series of days. Each day is a link to the post on that day. If there is more than one post on a day, only one is linked to. If set to 1, display the title of each post on the day as a link to the post. Are 0 and 1 the best values here? On/Off? Yes/no?
+
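A minimal sketch of the date handling those two options imply (the patch itself uses `Date::Parse::str2time`; this toy version parses only `YYYY-MM-DD` with core `Time::Local` so it stays self-contained, and the function name is hypothetical):

```perl
use strict;
use warnings;
use Time::Local qw(timelocal);

# Turn an event directive's YYYY-MM-DD date into an epoch timestamp,
# falling back to the page's creation time only when time_src is
# 'auto'; with 'event', dateless posts are skipped (undef).
sub event_epoch {
    my ($begin, $ctime, $time_src) = @_;
    if (defined $begin && $begin =~ /^(\d{4})-(\d{2})-(\d{2})$/) {
        return timelocal(0, 0, 0, $3, $2 - 1, $1);
    }
    return $ctime if $time_src eq 'auto';   # fall back on creation time
    return undef;                           # 'event': skip this post
}

my $t = event_epoch('2008-06-24', 1000, 'event');
my @d = localtime($t);
printf "%04d-%02d-%02d\n", $d[5] + 1900, $d[4] + 1, $d[3]; # 2008-06-24
print event_epoch(undef, 1000, 'auto'), "\n";              # 1000
```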
+The following changes are in the works:
+
+ * Switch to using HTML::CalendarMonth to create the html output
+
+ * Display the start time when detail is set to 1.
+
+Longer term plans:
+
+ * Support recurring events
+
+ * Support for end time on events (including end time that is on a different day than the start time)
+
+ * Convincing the world to switch to a base-10 calendar system.
+
+
+ --- calendar.pm.orig 2008-06-24 22:36:09.000000000 -0400
+ +++ calendar.pm 2008-06-28 22:02:15.000000000 -0400
+ @@ -23,6 +23,8 @@
+ use IkiWiki 2.00;
+ use Time::Local;
+ use POSIX;
+ +use Date::Parse;
+ +use Data::Dumper;
+
+ my %cache;
+ my %linkcache;
+ @@ -32,6 +34,7 @@
+ sub import {
+ hook(type => "needsbuild", id => "version", call => \&needsbuild);
+ hook(type => "preprocess", id => "calendar", call => \&preprocess);
+ + hook(type => "preprocess", id => "event", call => \&preprocess_event);
+ }
+
+ sub is_leap_year (@) {
+ @@ -58,6 +61,7 @@
+ my $nmonth = $params{nmonth};
+ my $pyear = $params{pyear};
+ my $nyear = $params{nyear};
+ + my $detail = $params{detail};
+
+ my @list;
+ my $calendar="\n";
+ @@ -153,33 +157,58 @@
+ }
+
+ my $tag;
+ + my $display_day;
+ my $mtag = sprintf("%02d", $month);
+ - if (defined $cache{$pagespec}{"$year/$mtag/$day"}) {
+ - if ($day == $today) {
+ + if ($day == $today) {
+ $tag='month-calendar-day-this-day';
+ - }
+ - else {
+ - $tag='month-calendar-day-link';
+ - }
+ - $calendar.=qq{\t\t<td class="$tag $downame{$wday}">};
+ - $calendar.=htmllink($params{page}, $params{destpage},
+ - pagename($linkcache{"$year/$mtag/$day"}),
+ - "linktext" => "$day");
+ - push @list, pagename($linkcache{"$year/$mtag/$day"});
+ - $calendar.=qq{</td>\n};
+ + }
+ + elsif ($day >= $future_dom) {
+ + $tag='month-calendar-day-future';
+ + }
+ + elsif($params{detail} == 0 &&
+ + !defined $cache{$pagespec}{"$year/$mtag/$day"}) {
+ + $tag='month-calendar-day-nolink';
+ + }
+ + elsif($params{detail} == 0 &&
+ + defined $cache{$pagespec}{"$year/$mtag/$day"}) {
+ + $tag='month-calendar-day-link';
+ }
+ else {
+ - if ($day == $today) {
+ - $tag='month-calendar-day-this-day';
+ - }
+ - elsif ($day == $future_dom) {
+ - $tag='month-calendar-day-future';
+ + $tag='month-calendar-day';
+ + }
+ +
+ + $calendar.=qq{\t\t<td class="$tag $downame{$wday}">};
+ + my $day_label = qq{<span class="month-calendar-day-label">$day</span>};
+ + if (defined $cache{$pagespec}{"$year/$mtag/$day"}) {
+ + my $srcpage; my $destpage;
+ + if($params{detail} == 0) {
+ + # pull off the first page
+ + (($srcpage,$destpage) = each(%{$linkcache{"$year/$mtag/$day"}}));
+ + $calendar.=htmllink($params{page}, $params{destpage},
+ + pagename($destpage),
+ + "linktext" => "$day_label");
+ + push @list, pagename($linkcache{"$year/$mtag/$day"});
+ }
+ else {
+ - $tag='month-calendar-day-nolink';
+ + $calendar.=qq{$day_label\n};
+ + while(($srcpage,$destpage) = each(%{$linkcache{"$year/$mtag/$day"}})) {
+ + my $title = IkiWiki::basename(pagename($srcpage));
+ + if (exists $pagestate{$srcpage}{meta}{title} ) {
+ + $title = $pagestate{$srcpage}{meta}{title};
+ + }
+ + $calendar.=qq{\t\t<div class="$tag $downame{$wday}">};
+ + $calendar.=htmllink($params{page}, $params{destpage},
+ + pagename($destpage),
+ + "linktext" => $title);
+ + push @list, pagename($linkcache{"$year/$mtag/$day"}{"$srcpage"});
+ + $calendar.=qq{\t\t</div>};
+ + }
+ }
+ - $calendar.=qq{\t\t<td class="$tag $downame{$wday}">$day</td>\n};
+ }
+ + else {
+ + $calendar.=qq{$day_label\n};
+ + }
+ + $calendar.=qq{</td>\n};
+ }
+
+ # finish off the week
+ @@ -304,6 +333,18 @@
+ return $calendar;
+ }
+
+ +sub preprocess_event (@) {
+ + my %params=@_;
+ + # if now time is given, use now
+ + $params{begin} = localtime($time) unless defined $params{begin};
+ +
+ + my $timestamp = str2time($params{begin});
+ + if ( defined $timestamp) {
+ + $pagestate{$params{page}}{event}{begin}=$timestamp;
+ + }
+ + return "<!-- $params{begin} -->";
+ +} #}}
+ +
+ sub preprocess (@) {
+ my %params=@_;
+ $params{pages} = "*" unless defined $params{pages};
+ @@ -311,6 +352,8 @@
+ $params{month} = sprintf("%02d", $params{month}) if defined $params{month};
+ $params{week_start_day} = 0 unless defined $params{week_start_day};
+ $params{months_per_row} = 3 unless defined $params{months_per_row};
+ + $params{time_src} = 'auto' unless defined $params{time_src};
+ + $params{detail} = 0 unless defined $params{detail};
+
+ if (! defined $params{year} || ! defined $params{month}) {
+ # Record that the calendar next changes at midnight.
+ @@ -355,19 +398,29 @@
+ if (! defined $cache{$pagespec}) {
+ foreach my $p (keys %pagesources) {
+ next unless pagespec_match($p, $pagespec);
+ - my $mtime = $IkiWiki::pagectime{$p};
+ - my $src = $pagesources{$p};
+ - my @date = localtime($mtime);
+ - my $mday = $date[3];
+ - my $month = $date[4] + 1;
+ - my $year = $date[5] + 1900;
+ - my $mtag = sprintf("%02d", $month);
+ -
+ - # Only one posting per day is being linked to.
+ - $linkcache{"$year/$mtag/$mday"} = "$src";
+ - $cache{$pagespec}{"$year"}++;
+ - $cache{$pagespec}{"$year/$mtag"}++;
+ - $cache{$pagespec}{"$year/$mtag/$mday"}++;
+ + my $begin = '';
+ + # use time defined by event preprocessor if it's available
+ + if (defined $pagestate{$p}{event}{begin}) {
+ + $begin = $pagestate{$p}{event}{begin};
+ + # fall back on ctime if time_src is set to auto
+ + # set time_src to 'event' to skip posts that don't
+ + # have the event preprocessor
+ + } elsif ($params{time_src} eq 'auto') {
+ + $begin = $IkiWiki::pagectime{$p};
+ + }
+ + if($begin ne '') {
+ + my $dest = $pagesources{$p};
+ + my @date = localtime($begin);
+ + my $mday = $date[3];
+ + my $month = $date[4] + 1;
+ + my $year = $date[5] + 1900;
+ + my $mtag = sprintf("%02d", $month);
+ +
+ + $linkcache{"$year/$mtag/$mday"}{$p} = "$dest";
+ + $cache{$pagespec}{"$year"}++;
+ + $cache{$pagespec}{"$year/$mtag"}++;
+ + $cache{$pagespec}{"$year/$mtag/$mday"}++;
+ + }
+ }
+ }
+
diff --git a/doc/todo/Set_arbitrary_date_to_be_used_by_calendar_plugin/discussion.mdwn b/doc/todo/Set_arbitrary_date_to_be_used_by_calendar_plugin/discussion.mdwn
new file mode 100644
index 000000000..b5334b73d
--- /dev/null
+++ b/doc/todo/Set_arbitrary_date_to_be_used_by_calendar_plugin/discussion.mdwn
@@ -0,0 +1,44 @@
+> Thanks for coming up with a patch.. Let me make sure I understand its
+> rationale.
+>
+> The meta plugin already allows modifying the page creation time,
+> which is what the calendar plugin uses.
+>
+> So, it seems to me that the use of this patch is for recording events in
+> the future. You'd not want a page for a future event to claim it was
+> created in the future. I suppose you could also use it for events in the
+> past, if you didn't want to change the creation time for some reason.
+> (Perhaps you're doing a calendar of historical events, for example.)
+>
+> Accurate? --[[Joey]]
+
+>> Thanks for the feedback. Thinking about what you said ... I suspect my patch
+>> doesn't belong in the calendar plugin, which does a very specific thing
+>> (create a calendar to show when blog posts were created). I'm really angling
+>> toward an event calendar (as mentioned on [[todo/plugin]]). I'd like to preserve
+>> the page creation time - which is useful and important information in its own right
+>> - and be able to generate a calendar with links to particular posts that will show
+>> up on the calendar based on an arbitrary date. Perhaps this should be re-considered
+>> as a separate plugin? --[[Jamie]]
+
+>>> I think it makes sense to have only one calendar, if possible.
+>>> I think your event stuff is fine, the only thing we might want to add
+>>> is a config option for the calendar, to control whether it looks at the
+>>> event date, or the creation date. --[[Joey]]
+
+>>>> Ok - I can work on that. One question - the existing calendar module has its own
+>>>> functions for building an html display of a calendar. HTML::CalendarMonth seems to
+>>>> provide that functionality. My instincts are to rip out the code in the calendar plugin
+>>>> and use the existing module. On the other hand, that creates added dependencies.
+>>>> Suggestions anyone? --[[Jamie]]
+
+>>>>> I'm all for ripping code out of ikiwiki where CPAN can be used, as
+>>>>> long as the resulting code and html are good. --[[Joey]]
+
+>>>>>> Sounds good. I'll work on HTML::CalendarMonth for the next version. In the current version I
+>>>>>> did the event date vs. creation date option as a parameter to the calendar
+>>>>>> preprocessor rather than as a config variable so you could do it differently on
+>>>>>> different calendars in the same wiki. I also opted for values of auto vs. event
+>>>>>> rather than creation time vs. event since if you want to use creation time you
+>>>>>> can simply not include the event preprocessor directive. auto seems to give you that
+>>>>>> plus more flexibility. Feedback welcome :). --[[Jamie]]
diff --git a/doc/todo/Set_templates_for_whole_sections_of_the_site.mdwn b/doc/todo/Set_templates_for_whole_sections_of_the_site.mdwn
new file mode 100644
index 000000000..0dbda8a3a
--- /dev/null
+++ b/doc/todo/Set_templates_for_whole_sections_of_the_site.mdwn
@@ -0,0 +1,41 @@
+At the moment, IkiWiki allows you to set the template for individual pages using the [[plugins/pagetemplate]] directive/plugin, but not for whole sections of the wiki.
+
+I've written a new plugin, sectiontemplate, available in the `page_tmpl` branch of my git repository that allows setting the template by pagespec in the config file.
+
+-- [[Will]]
+
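A toy illustration of per-section template selection as described (the option shape and glob matching here are assumptions for illustration; the real branch uses ikiwiki pagespecs from the setup file, which are richer than these simple globs):

```perl
use strict;
use warnings;

# First matching glob wins; fall back to the default page template.
# (Names and structure are hypothetical, not from the actual branch.)
my @section_templates = (
    [ 'blog/*' => 'blogpage.tmpl' ],
    [ 'docs/*' => 'docpage.tmpl'  ],
);

sub glob_to_re {
    my $g = quotemeta shift;
    $g =~ s/\\\*/.*/g;          # '*' matches any run of characters
    return qr/^$g$/;
}

sub template_for {
    my $page = shift;
    for my $pair (@section_templates) {
        return $pair->[1] if $page =~ glob_to_re($pair->[0]);
    }
    return 'page.tmpl';
}

print template_for('blog/2010/hello'), "\n"; # blogpage.tmpl
print template_for('index'), "\n";           # page.tmpl
```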
+> This is an excellent idea and looks ready for merging.
+> Before I do though, would it perhaps make sense to merge
+> this in with the pagetemplate plugin? They do such a similar thing,
+> and unless letting users control the template by editing a page is not
+> wanted, I can't see any reason not to combine them. --[[Joey]]
+
+> One other idea is moving the pagespec from setup file to a directive.
+> The edittemplate plugin does that, and it might be nice
+> to be consistent with that. To implement it, you'd have to
+> make the scan hook store the template pagespecs in `%pagestate`,
+> so a bit more complicated. --[[Joey]]
+
+>> I started with the pagetemplate plugin, which is why they're so similar. I guess they could be combined.
+>> I had a brief think about having the specs and templates defined in a directive rather than the config file, but it got tricky.
+>> How do you know what needs to be rebuilt when? There is probably a solution, maybe even an obvious one, but I thought that getting something done was more important than polishing it.
+>> In the worst case, admins can always use the web interface to change the setup :).
+
+>> I wanted this to put comments on my new blog, and was more interested in that goal than this subgoal. I've moved most of my web pages to IkiWiki and there is only one small part that is the blog.
+>> I wanted to use [[Disqus comments|tips/Adding_Disqus_to_your_wiki/]], but only on the blog pages. (I'm using Disqus rather than IkiWiki comments because I don't want to have to deal with spam, security, etc. I'll happily just let someone else host the comments.) -- [[Will]]
+
+>>> Yes, handling the rebuild is a good reason not to use directives for
+>>> this.
+>>>
+>>> I do still think combining this with pagetemplate would be good.
+>>> --[[Joey]]
+
+>>>> This is exactly what I was looking for and it took me a while to find it. I very much support the idea to provide this as a regular plugin, be it merged with pagetemplate or stand-alone. Thank you for your work and code! --BenTo
+
+>>>> Any update on this. This could be very helpful as I plan to run a section of a wiki with a different language (and language settings like RTL-ed CSS). --Nezmer
+
+>>>>> I've implemented this functionality as part of `pagetemplate` as on my "pagetemplate" branch of ikiwiki at https://github.com/rubykat/ikiwiki/tree/pagetemplate - do you want to pull this, Joey?
+>>>>> It isn't implemented quite the same way as Will did; I have the template name first and the pagespec last, but it does the same thing.
+>>>>> --[[KathrynAndersen]]
+
+Just a quick note that Kathryn's branch is ready.[[!template id=gitbranch branch=rubykat/pagetemplate author="[[KathrynAndersen]]"]][[!tag patch]] --[[Will]]
diff --git a/doc/todo/Short_wikilinks.mdwn b/doc/todo/Short_wikilinks.mdwn
new file mode 100644
index 000000000..b9aec9112
--- /dev/null
+++ b/doc/todo/Short_wikilinks.mdwn
@@ -0,0 +1,104 @@
+Markdown supports nice short links to external sites within body text by
+references defined elsewhere in the source:
+
+ foo [bar][ref]
+
+ [ref]: http://example.invalid/
+
+It would be nice to be able to do this or something like this for wikilinks
+as well, so that you can have long page names without the links cluttering
+the body text. I think the best way to do this would be to move wikilink
+resolving after HTML generation: parse the HTML with a proper HTML parser,
+and replace relative links with links to the proper files (plus something
+extra for missing pages).
+
+> That's difficult to do and have reasonable speed as well. Ikiwiki needs to
+> know all about all the links between pages before it can know what pages
+> it needs to build so it can update backlink lists, update links to point
+> to new/moved pages etc. Currently it accomplishes this by a first pass
+> that scans new and changed files, and quickly finds all the wikilinks
+> using a simple regexp. If it had to render the whole page before it was
+> able to scan for hrefs using a html parser, this would make it at least
+> twice as slow, or would require it to cache all the rendered pages in
+> memory to avoid re-rendering. I don't want ikiwiki to be slow or use
+> excessive amounts of memory. YMMV. --[[Joey]]
+
+>> Or you could disk cache the incomplete page containing only the body text,
+>> which should often not need re-rendering, as most alterations consist of
+>> changing the link targets exactly, and we can know pages that exist before
+>> rendering a single page. Then after backlinks have been resolved, it would
+>> suffice to feed this body text from the cache file to the template. However, e.g.
+>> the inline plugin would demand extra rendering after the depended-upon pages
+>> have been rendered, but these pages should usually not be that frequent, or
+>> contain that many other pages in full. (And for 'archive' pages we don't need
+>> to remember that much information from the semi-inlined pages.) It would help
+>> if you could get data structures instead of HTML text from the HTMLizer, and
+>> then simply cache these data structures in some quickly-loadeble form (that
+>> I suppose perl itself has support for). Regexp hacks are so ugly compared
+>> to actually parsing a properly-defined syntax...
+
+A related possibility would be to move a lot of "preprocessing" after HTML
+generation as well (thus avoiding some conflicts with the htmlifier), by
+using special tags for the preprocessor stuff. (The old preprocessor could
+simply replace links and directives with appropriate tags, that the
+htmlifier is supposed to let through as-is. Possibly the htmlifier plugin
+could configure the format.)
+
+> Or using postprocessing, though there are problems with that too and it
+> doesn't solve the link scanning issue.
+
+Other alternatives would be
+
+ * to understand the source format, but this seems too much work with all the supported formats; or
+
+ * something like the shortcut plugin for external links, with additional
+ support for specifying the link text, but the syntax would be much more
+ cumbersome then.
+
+> I agree that a plugin would probably be more cumbersome, but it is very
+> doable. It might look something like this:
+
+ \[[!link bar]]
+
+ \[[!link bar=VeryLongPageName]]
+
+>> This is, however, still missing specifying the link text, and adding that option would seem to me to complicate the plugin syntax a lot, unless support is added for the |-syntax for specifying a particular parameter to every plugin.
+
+>>> Well, the link text in my example above is "bar". It's true that if
+>>> you want to use the same link text for multiple distinct links, or
+>>> different link texts for the same link, this is missing a useful layer of
+>>> indirection; it's optimised for the (probably) more common case. It
+>>> could be done as a degenerate form of the syntax I propose below, BTW.
+>>> --[[Joey]]
+
+>> ... Returning to this, the syntax in fact wouldn't be so bad with the |-syntax, given a short name for the plugin:
+
+ [[whatever|ref 1]]
+ \[[!ref 1=page_with_long_name]]
+
+>>> A way to do this that doesn't need hacking at the preprocessor syntax
+>>> follows: --[[Joey]]
+
+ \[[!link bar=1]]
+ \[[!dest 1=page_with_long_name]]
+
+>>>> But this doesn't work so well for links that aren't valid keys. Such
+>>>> as stuff with spaces in it. I'd like to be able to write any kind of
+>>>> links conveniently, not just something that looks like a wikilink.
+
+>>>>> You're right, and to fix that it could be turned around: --[[Joey]]
+
+ \[[!link 1=bar]]
+ \[[!dest 1=page_with_long_name]]
+
+>> It also shouldn't be difficult to support non-wiki links in this same
+>> way, so that you could still link everywhere in a uniform manner, as
+>> the (still preferred by me) HTML processing approach would provide.
+>> Perhaps a plugin call wouldn't even be necessary for the links
+>> themselves: what about aliases for the normal link mechanism? Although
+>> the 'ref' call may in fact be cleaner, and adding that |-syntax for
+>> plugins could offer other possibilities for other plugins.
+
+>>> I agree, it should be easy to make it support non-wiki links too.
+>>> We seem to have converged at something we can both live with that's
+>>> reasonable to implement.. --[[Joey]]
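The scheme the thread converges on amounts to a two-pass indirection table: collect the `\[[!dest N=target]]` definitions, then rewrite the `\[[!link N=text]]` references against them. A toy resolver under those assumed semantics (directive names from the examples above; the `\[[text|target]]` output form is an assumption):

```perl
use strict;
use warnings;

# Two-pass resolution of the proposed [[!link N=text]] / [[!dest N=target]]
# scheme: gather destinations first, then expand the references.
sub resolve_short_links {
    my $src = shift;
    my %dest;
    $dest{$1} = $2 while $src =~ /\[\[!dest (\w+)=(\S+?)\]\]/g;
    $src =~ s/\[\[!dest \w+=\S+?\]\]\n?//g;          # drop the definitions
    $src =~ s{\[\[!link (\w+)=([^\]]+)\]\]}
             {exists $dest{$1} ? "[[$2|$dest{$1}]]" : $&}ge;
    return $src;
}

my $out = resolve_short_links(
    "foo [[!link 1=bar]] baz\n[[!dest 1=page_with_long_name]]\n");
print $out;   # foo [[bar|page_with_long_name]] baz
```

Because the definitions are gathered before any reference is expanded, the `\[[!dest ...]]` lines can sit anywhere on the page, which is the point of the indirection.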
diff --git a/doc/todo/Shorter_feeds.mdwn b/doc/todo/Shorter_feeds.mdwn
new file mode 100644
index 000000000..2e0b0fab9
--- /dev/null
+++ b/doc/todo/Shorter_feeds.mdwn
@@ -0,0 +1,11 @@
+It should be possible to control the number of items included in a feed
+independently of the number of items included on the page (the latter,
+however, possibly setting an upper limit). This would be particularly
+useful on archive pages providing a feed. Presently the feed grows huge, if
+the archive page has no limit on the entries listed on it (as in the list
+of [all entries][ionfaq] in the Ion FAQ). An alternative useful filter
+would be filtering by the age of the page.
+
+ [ionfaq]: http://iki.fi/tuomov/ion/faq/entries.html
+
+> [[todo/Done]], option name is `feedshow` --[[Joey]]
diff --git a/doc/todo/Silence_monotone_warning.mdwn b/doc/todo/Silence_monotone_warning.mdwn
new file mode 100644
index 000000000..d875900c5
--- /dev/null
+++ b/doc/todo/Silence_monotone_warning.mdwn
@@ -0,0 +1,17 @@
+A quick [[patch]] to silence a [[rcs/monotone]] warning I started seeing:
+
+ diff --git a/IkiWiki/Plugin/monotone.pm b/IkiWiki/Plugin/monotone.pm
+ index 4b9be31..9d4e280 100644
+ --- a/IkiWiki/Plugin/monotone.pm
+ +++ b/IkiWiki/Plugin/monotone.pm
+ @@ -55,7 +55,7 @@ sub checkconfig () {
+ error("Monotone version too old, is $version but required 0.38");
+ }
+
+ - if (length $config{mtn_wrapper}) {
+ + if (defined $config{mtn_wrapper} && length $config{mtn_wrapper}) {
+ push @{$config{wrappers}}, {
+ wrapper => $config{mtn_wrapper},
+ wrappermode => (defined $config{mtn_wrappermode} ? $config{mtn_wrappermode} : "06755"),
+
+> Thanks, [[done]]
diff --git a/doc/todo/Split_plugins_with_external_dependencies_into_separate_Debian_packages.mdwn b/doc/todo/Split_plugins_with_external_dependencies_into_separate_Debian_packages.mdwn
new file mode 100644
index 000000000..fdf0dea50
--- /dev/null
+++ b/doc/todo/Split_plugins_with_external_dependencies_into_separate_Debian_packages.mdwn
@@ -0,0 +1,40 @@
+The Debian ikiwiki package has a pile of recommends and suggests for packages needed by various plugins and other optional functionality. To make it easier for people to figure out what to install, and to make it easier for automatic dependency tracking to remove packages ikiwiki no longer needs, we could split the plugins with additional dependencies into their own packages.
+
+Notable plugin dependencies:
+
+- [[plugins/img]] depends on [[!debpkg perlmagick]]
+- [[plugins/graphviz]] depends on [[!debpkg graphviz]]
+ - [[plugins/linkmap]] depends on the graphviz plugin, so it should probably go in the same package.
+- [[plugins/polygen]] depends on [[!debpkg polygen]]
+- [[plugins/teximg]] depends on [[!debpkg dvipng]] and [[!debpkg texlive]]
+- [[plugins/htmltidy]] depends on [[!debpkg tidy]]
+- [[plugins/table]] depends on [[!debpkg libtext-csv-perl]]
+- [[plugins/textile]] depends on [[!debpkg libtext-textile-perl]]
+- [[plugins/txt]] should probably just depend on [[!debpkg liburi-find-perl]]
+- [[plugins/sparkline]] depends on [[!debpkg libsparkline-php]], which pulls in the whole PHP stack.
+ - [[plugins/postsparkline]] depends on the sparkline plugin, so it should probably go in the same package.
+- [[plugins/search]] depends on [[!debpkg xapian-omega]] and [[!debpkg libsearch-xapian-perl]]
+- [[plugins/po]] depends on [[!debpkg po4a]] (and possibly [[!debpkg gettext]] and [[!debpkg liblocale-gettext-perl]], or does something else use those?)
+- [[plugins/amazon_s3]] depends on [[!debpkg libnet-amazon-s3-perl]] and [[!debpkg libfile-mimeinfo-perl]]
+- [[plugins/highlight]] depends on [[!debpkg libhighlight-perl]]
+- [[plugins/htmlbalance]] depends on [[!debpkg libhtml-tree-perl]]
+- [[plugins/typography]] depends on [[!debpkg libtext-typography-perl]]
+- [[plugins/creole]] depends on [[!debpkg libtext-wikicreole-perl]]
+- [[plugins/wikitext]] depends on [[!debpkg libtext-wikiformat-perl]]
+- [[plugins/rst]] depends on [[!debpkg librpc-xml-perl]] and [[!debpkg python-docutils]], and pulls in Python
+- [[plugins/blogspam]] depends on [[!debpkg librpc-xml-perl]]
+- [[plugins/prettydate]] depends on [[!debpkg libtimedate-perl]]
+- [[plugins/hnb]] depends on [[!debpkg hnb]]
+- [[plugins/fortune]] depends on [[!debpkg fortune]]
+- [[plugins/filecheck]] depends on [[!debpkg libfile-mimeinfo-perl]] and file
+- [[plugins/ddate]] depends on [[!debpkg libdatetime-calendar-discordian-perl]] and [[!debpkg libdatetime-perl]]
+- [[plugins/otl]] depends on [[!debpkg vim-vimoutliner]]
+- [[plugins/haiku]] depends on [[!debpkg libcoy-perl]]
+- [[plugins/sortnaturally]] depends on [[!debpkg libsort-naturally-perl]]
+- [[plugins/pinger]] depends on [[!debpkg liblwpx-paranoidagent-perl]] (it works with plain LWP, but less securely) and should probably just depend on [[!debpkg libcrypt-ssleay-perl]]
+- [[plugins/openid]] depends on [[!debpkg libnet-openid-consumer-perl]], and should either recommend or just depend on [[!debpkg liblwpx-paranoidagent-perl]] and [[!debpkg libcrypt-ssleay-perl]]
+- Support for tla depends on [[!debpkg libmailtools-perl]] (could make this a package depending on [[!debpkg tla]] and [[!debpkg libmailtools-perl]])
+
+Also, ikiwiki should probably just depend on [[!debpkg libauthen-passphrase-perl]] and refuse to store insecure passwords.
+
+[[!tag wishlist]]
diff --git a/doc/todo/Suggested_location_should_be_subpage_if_siblings_exist.mdwn b/doc/todo/Suggested_location_should_be_subpage_if_siblings_exist.mdwn
new file mode 100644
index 000000000..4489dd5d2
--- /dev/null
+++ b/doc/todo/Suggested_location_should_be_subpage_if_siblings_exist.mdwn
@@ -0,0 +1,26 @@
+Given a page `/foo` and a subpage `/foo/bar`, if I add a link to `baz` to
+`/foo` and click on the link to create the page, the suggested location
+should be `/foo/baz` instead of `/baz`. The rationale is that presence of a
+sibling folder, or sibling pages (with `usefolders=0`) is a strong hint
+that we're at the root of a (sub-)hierarchy.
+
+> I think there's something to be said for consistency, even if it doesn't
+> make the best guess every time. It makes it easier to learn when
+> you do need to change the location, and learn when default works.
+>
+> In your example, to get the foo/bar page created, you'd need to remember
+> to change the default when creating that subpage. But only for the first
+> subpage, after that it would get the default right. But who can remember if
+> a particular page has a subpage already? You end up having to check every
+> time anyway. Plus, you'd have to check every time you wanted to create "bar"
+> from "foo" that it didn't guess you meant "foo/bar".
+>
+> With the current simple default, you at least
+> know you don't need to check in that fairly common case, which seems like
+> a win over your suggestion.
+>
+> IMHO, what you really want is [[Moving_pages]]. :-) --[[Joey]]
+
+>> This sounds like WONTFIX to me? --[[smcv]]
+
+[[!tag wishlist done]]
diff --git a/doc/todo/Support_MultiMarkdown_3.X.mdwn b/doc/todo/Support_MultiMarkdown_3.X.mdwn
new file mode 100644
index 000000000..ad724442b
--- /dev/null
+++ b/doc/todo/Support_MultiMarkdown_3.X.mdwn
@@ -0,0 +1,10 @@
+[[!tag wishlist]]
+Information about Multimarkdown 3.X can be found at <http://fletcherpenney.net/multimarkdown/>.
+Apparently this will run faster because it's not a Perl script.
+The markdown plug-in only uses the Perl script for MultiMarkdown.
+I see that I could just replace /usr/bin/markdown with a renamed multimarkdown,
+but I'd rather not change the system file or uninstall the Perl modules.
+Perhaps a custom Plugin/mdwn.pm or a clever way to set $markdown_sub would suffice,
+but I don't know Perl. If I wanted to replace Plugin/mdwn.pm with something simple
+that didn't bother to check for Text::*Markdown, calling /home/me/bin/mymdwn instead,
+what would that look like? -- [[tjgolubi]]
diff --git a/doc/todo/Support_XML-RPC-based_blogging.mdwn b/doc/todo/Support_XML-RPC-based_blogging.mdwn
new file mode 100644
index 000000000..6a0593b17
--- /dev/null
+++ b/doc/todo/Support_XML-RPC-based_blogging.mdwn
@@ -0,0 +1,17 @@
+Perhaps ikiwiki should support XML-RPC-based blogging, using the [standard
+MetaWeblog protocol](http://www.xmlrpc.com/metaWeblogApi). This would allow
+the use of applets like [[!debpkg gnome-blog]] to post to an ikiwiki blog. The
+protocol supports multiple blog names, so one standard URL with page names as
+blog names would work. --[[JoshTriplett]]
+
+> This would be a great thing to add a plugin for. (Probably using the cgi
+> hook to let ikiwiki act as an RPC server.) --[[Joey]]
+
+>> I'd love to see support for this and would be happy to contribute towards a bounty (say US$100) :-). [PmWiki](http://www.pmwiki.org/) has a plugin which [implements this](http://www.pmwiki.org/wiki/Cookbook/XMLRPC) in a way which seems fairly sensible as an end user. --[[AdamShand]]
+
+>>> Bump. This would be a nice feature, and with the talent on this project I'm sure it could be done safely, too.
+
+
+[[!tag soc]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/Support__47__Switch_to_MultiMarkdown.mdwn b/doc/todo/Support__47__Switch_to_MultiMarkdown.mdwn
new file mode 100644
index 000000000..2d22de2c0
--- /dev/null
+++ b/doc/todo/Support__47__Switch_to_MultiMarkdown.mdwn
@@ -0,0 +1,35 @@
+Supporting or switching to MultiMarkdown would take care of a few of the outstanding feature requests. Quoting from the MultiMarkdown site:
+
+>MultiMarkdown is a modification of John Gruber's original Markdown.pl file. It uses the same basic syntax, with several additions:
+
+> 1. I have added a basic metadata feature, to allow the inclusion of
+> metadata within a document that can be used in different ways based
+> on the output format.
+
+> 2. I have allowed the automatic use of cross-references within a Markdown
+> document. For instance, you can easily jump to
+> [the Introduction][Introduction].
+
+> 3. I have incorporated John's proposed syntax for footnotes. Since he
+> has not determined the output format, I created my own. Mainly, I
+> wanted to be able to add footnotes to the LaTeX output; I was less
+> concerned with the XHTML formatting.
+
+> 4. Most importantly, however, I have changed the way that the processed
+> output is created, so that it is quite simple to export Markdown syntax
+> to a variety of outputs. By setting the `Format` metadata to `complete`,
+> you generate a well-formed XHTML page. You can then use XSLT to convert
+> to virtually any format you like.
+
+MultiMarkdown would solve the BibTex request and the multiple output formats would make the print_link request an easy fix. MultiMarkdown is actively developed and can be found at:
+
+[MultiMarkdown Homepage](http://fletcher.freeshell.org/wiki/MultiMarkdown)
+
+> I don't think MultiMarkdown solves [[the_BibTeX_request|todo/BibTeX]], but it might solve the request for LaTeX output. --[[JoshTriplett]]
+
+> Unless there's a way to disable a zillion of the features, please **no**. Do _not_ switch to it. One thing that I like about markdown as opposed to most other ASCII markup languages, is that it has at least a bit of moderation on the syntax (although it could be even simpler). There's not a yet another reserved character lurking behind every corner. Not so in multimarkdown anymore. Footnotes, bibliography and internal references I could use, and they do not add any complex syntax: it's all inside the already reserved sequences of bracketed stuff. (If you can even say that ASCII markup languages have reserved sequences, as they randomly decide to interpret stuff, never actually failing on illegal input, like a proper language to write any serious documentation in, would do.) But tables, math, and so on, no thanks! Too much syntax! Syntax overload! Bzzzt! I don't want mischievous syntaxes lurking behind every corner, out to get me. --[[tuomov]]
+
+> ikiwiki already supports MultiMarkdown, since it has the same API as Markdown. So if you install it as Markdown.pm (or as /usr/bin/markdown), it should Just Work. It would also be easy to support some other extension such as mmdwn to use multimarkdown installed as MultiMarkdown.pm, if someone wanted to do that for some reason -- just copy the mdwn plugin and lightly modify. --[[Joey]]
+
+> There's now a multimarkdown setup file option that uses
+> Text::MultiMarkdown for .mdwn files. [[done]] --[[Joey]]
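+
+> For illustration, assuming a YAML-format setup file, enabling it is a
+> single line:
+>
+>     multimarkdown: 1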
diff --git a/doc/todo/Support_preprocessing_CSS.mdwn b/doc/todo/Support_preprocessing_CSS.mdwn
new file mode 100644
index 000000000..9aafc2449
--- /dev/null
+++ b/doc/todo/Support_preprocessing_CSS.mdwn
@@ -0,0 +1 @@
+Inspired by [this blog post](http://eikke.com/css-preprocessor/), I think ikiwiki could support preprocessing CSS files. Templates, conditionals, and raw inlines could all prove useful in CSS. --[[JoshTriplett]]
diff --git a/doc/todo/Support_subdirectory_of_a_git_repo.mdwn b/doc/todo/Support_subdirectory_of_a_git_repo.mdwn
new file mode 100644
index 000000000..4eec87d2b
--- /dev/null
+++ b/doc/todo/Support_subdirectory_of_a_git_repo.mdwn
@@ -0,0 +1,9 @@
+Git does not support checking out a subdirectory of a repository as a repository. In order to allow a software project managed with Git to keep its [[wiki,_bug-tracker,_TODO-list,_and_stuff|tips/integrated_issue_tracking_with_ikiwiki]] in a subdirectory of the same repository (rather than a parallel `foo-wiki.git` repository, which does not stay in sync with the code), ikiwiki should support checking out a repository but only using a subdirectory of that repository. --[[JoshTriplett]]
+
+> This seems to be a mandatory feature. I'll start working to implement it as soon as possible --[[Roktas]]
+
+>> Thanks! --[[JoshTriplett]]
+
+[[todo/done]]; patches from Jamey Sharp. --[[JoshTriplett]]
+
+> This is great! Thanks to Jamey Sharp for his work and please accept my apologies for not being able to find enough time for this feature. --[[Roktas]] \ No newline at end of file
diff --git a/doc/todo/Support_tab_insertion_in_textarea.mdwn b/doc/todo/Support_tab_insertion_in_textarea.mdwn
new file mode 100644
index 000000000..10d94343f
--- /dev/null
+++ b/doc/todo/Support_tab_insertion_in_textarea.mdwn
@@ -0,0 +1,15 @@
+[[!tag wishlist]]
+
+It'd be nice to be allowed to insert tabs into the textarea, as opposed to
+having to insert 4 spaces for lists/etc. This would require JavaScript to
+be practical, and would gracefully degrade on browsers w/o JS support.
+
+Some browsers already let you insert tabs (IIRC links or lynx does...)
+
+> So does w3m, since it launches an external editor.. Basically, this is a
+> bug in your web browser, not in ikiwiki. Trying to work around it with
+> javascript seems like a losing proposition. Good link, btw. --[[Joey]]
+
+Here's a link to a wiki discussing the topic: <http://c2.com/cgi/wiki?TipForTypingTab>
+
+-- [[harningt]]
diff --git a/doc/todo/Support_wildcard_inside_of_link__40____41___within_a_pagespec.mdwn b/doc/todo/Support_wildcard_inside_of_link__40____41___within_a_pagespec.mdwn
new file mode 100644
index 000000000..8320f72a6
--- /dev/null
+++ b/doc/todo/Support_wildcard_inside_of_link__40____41___within_a_pagespec.mdwn
@@ -0,0 +1,45 @@
+I don't segregate my blog entries into a directory, but instead want
+my blog to simply consist of all pages that have been tagged. That is,
+I'd like to have my blog page look like this:
+
+ \[[!inline pages="link(tag/*)"]]
+
+That doesn't work in ikiwiki 2.1, but I have it
+[working](http://www.cworth.org/blog) with the following patch:
+
+ From 6149386084417fb8375d08446438b20ed52d6882 Mon Sep 17 00:00:00 2001
+ From: Carl Worth <cworth@cworth.org>
+ Date: Tue, 29 May 2007 11:43:21 -0700
+ Subject: [PATCH] Allow for glob matching inside of link() within a pagespec
+
+ ---
+ IkiWiki.pm | 11 ++++++++---
+ 1 files changed, 8 insertions(+), 3 deletions(-)
+
+ diff --git a/IkiWiki.pm b/IkiWiki.pm
+ index 38aa46a..cd42e8d 100644
+ --- a/IkiWiki.pm
+ +++ b/IkiWiki.pm
+ @@ -1082,10 +1082,15 @@ sub match_link ($$;@) {
+ my $links = $IkiWiki::links{$page} or return undef;
+ return IkiWiki::FailReason->new("$page has no links") unless @$links;
+ my $bestlink = IkiWiki::bestlink($from, $link);
+ - return IkiWiki::FailReason->new("no such link") unless length $bestlink;
+ foreach my $p (@$links) {
+ - return IkiWiki::SuccessReason->new("$page links to $link")
+ - if $bestlink eq IkiWiki::bestlink($page, $p);
+ + if (length $bestlink) {
+ + return IkiWiki::SuccessReason->new("$page links to $link")
+ + if $bestlink eq IkiWiki::bestlink($page, $p);
+ + }
+ + else {
+ + return IkiWiki::SuccessReason->new("$page links to page matching $link")
+ + if match_glob ($p, $link, %params);
+ + }
+ }
+ return IkiWiki::FailReason->new("$page does not link to $link");
+ }
+ --
+ 1.5.1.1.g6aead
+
+Thanks! [[done]] --[[Joey]]
diff --git a/doc/todo/Tags_list_in_page_footer_uses_basename.mdwn b/doc/todo/Tags_list_in_page_footer_uses_basename.mdwn
new file mode 100644
index 000000000..603e82b20
--- /dev/null
+++ b/doc/todo/Tags_list_in_page_footer_uses_basename.mdwn
@@ -0,0 +1,11 @@
+Page footers contain a list of links to the page and a list of tags applied to the page. The link list uses the full path to pages. However, the tag list contains only the basename of the tag pages. For instance, if I tag a page with person1/foo and person2/bar, the tag list will just list foo and bar without the necessary disambiguating prefixes.
+
+I think the tag list should always contain the full path to the tag, with the tagbase value removed.
+
+--[[JoshTriplett]]
+
+> What if tagbase is not used? I know this would clutter up the display of
+> my tags on several wikis, including this one. --[[Joey]]
+
+>> Since Giuseppe's patches to fix [[bugs/tag_behavior_changes_introduced_by_typed_link_feature]],
+>> the tag list has what Josh requested, but only if a tagbase is used. [[done]] --[[smcv]]
diff --git a/doc/todo/Track_Markdown_Standardisation_Efforts.mdwn b/doc/todo/Track_Markdown_Standardisation_Efforts.mdwn
new file mode 100644
index 000000000..54a615920
--- /dev/null
+++ b/doc/todo/Track_Markdown_Standardisation_Efforts.mdwn
@@ -0,0 +1,7 @@
+Just a quick note that some people are making noise about Markdown standardisation. Specifically:
+
+ * <http://markdown.github.com/>
+ * <http://www.codinghorror.com/blog/2012/10/the-future-of-markdown.html>
+ * <http://johnmacfarlane.net/babelmark2/faq.html#what-are-some-big-questions-that-the-markdown-spec-does-not-answer>
+
+It might be worth following...
diff --git a/doc/todo/Unit_tests.mdwn b/doc/todo/Unit_tests.mdwn
new file mode 100644
index 000000000..9c579a62e
--- /dev/null
+++ b/doc/todo/Unit_tests.mdwn
@@ -0,0 +1,10 @@
+Not sure about this TODO, but here it is anyway...
+
+It would be nice to have unit tests for IkiWiki. This would make sure that we don't break things when adding more functionality.
+
+> mmm unit tests. --[[schmonz]]
+
+>> Ikiwiki has over 500 tests, most of them of the unit test variety. They
+>> are located in the 't' directory and are run by 'make test'. Most of the
+>> core ikiwiki functions are covered. Feel free to contribute more..
+>> --[[Joey]] [[done]]
diff --git a/doc/todo/Untrusted_push_in_Monotone.mdwn b/doc/todo/Untrusted_push_in_Monotone.mdwn
new file mode 100644
index 000000000..a8b1cd7c4
--- /dev/null
+++ b/doc/todo/Untrusted_push_in_Monotone.mdwn
@@ -0,0 +1,28 @@
+As noted in [[tips/untrusted_git_push]] an untrusted push capability was added recently, but only implemented in git.
+(See also [[todo/rcs_updates_needed]])
+
+This note describes (but does not implement) an approach for this with the [[rcs/monotone]] rcs backend.
+
+----
+
+Monotone behaves a little differently to git in its networking. Git allows anyone to try to push, and then
+check whether it is ok before finally accepting it. Monotone has no way to accept or reject revisions
+in this way. However, monotone does have the ability to mark revisions, and to ignore unmarked revisions.
+
+This marking capability can be used to achieve a somewhat similar effect to what happens with git. The
+problem with this is that anyone could put anything into the monotone database, and while this wouldn't
+affect ikiwiki, it seems bad to leave open, untrusted storage on the web.
+
+The Plan
+=====
+
+In the `note_netsync_revision_received` hook in the monotone server, have the server check to make sure
+that either a) the revision is signed by someone trusted, or b) the revision is checked using the same
+hook that git uses in `pre-receive`. If the revision passes the ikiwiki `pre-receive` check then the
+monotone hook signs the revision. This gives that revision the 'ikiwiki seal of approval'.
+
+You'll also want to update the monotone trust hooks to only trust revisions signed by trusted people, or
+ikiwiki.
+
+Now anyone can upload a revision, but only those signed by a trusted person, or which pass the ikiwiki
+check and so get signed by the ikiwiki key, will be seen by ikiwiki.
diff --git a/doc/todo/Updated_bug_tracking_example.mdwn b/doc/todo/Updated_bug_tracking_example.mdwn
new file mode 100644
index 000000000..4fbb06497
--- /dev/null
+++ b/doc/todo/Updated_bug_tracking_example.mdwn
@@ -0,0 +1,136 @@
+I've put together an updated bug tracking example. This example requires some recent
+patches of mine. It requires [[todo/tracking_bugs_with_dependencies]],
+[[todo/Allow_edittemplate_to_set_file_type]] and the second [[patch]] in
+[[todo/structured_page_data]] (the data plugin, not the form plugin).
+
+You'll then want to add/replace the following files in the software project example. The
+heading is the name of the file. I've commented all the directives. Oh, and I don't
+have nice CSS for this yet. I did make sure that if I borrowed
+[Trac's](http://trac.edgewall.org/) CSS then it would display as nicely as theirs - the
+HTML is there if not the CSS.
+
+It might be worth adding some justification of what is going on here. The datatable
+and data directives generate a nice tabular form for the structured data. The HTML
+generated is the same as for Trac, except I've got fewer fields. Adding more is trivial.
+
+I use data directives rather than links because the data directive allows separating
+dependencies from links. We can specify 'bugs with all dependencies closed' without
+being confused by other links on the page.
+
+-- [[Will]]
+
+### templates/bug.mdwn
+
+ \[[!datatable class="bugtable" datalist="""
+ [[!data key="Reported by" link=""]] [[!data key="Owned by" link=""]]
+ [[!data key="Depends on"]]
+ """]]
+
+ ### Description
+
+ This is a bug that needs solving.
+
+ #### Steps to reproduce:
+
+ #### What I expect to happen:
+
+ #### What actually happens:
+
+ #### What I have tried to narrow it down:
+
+### bugs.mdwn
+
+ This is FooBar's bug list. Link bugs to \[[bugs/done]] when done.
+
+ \[[!inline pages="bugs and ! bugs" feeds=no postform=yes
+ postformtext="Report a bug:" rootpage="bugs"]]
+
+ \[[!edittemplate template="templates/bug" match="bugs/* and !*/Discussion" silent=yes]]
+
+ \[[!toggle id="all bugs" text="Show all bugs"]]
+
+ \[[!toggle id="open bugs" text="Show open bugs"]]
+
+ \[[!toggle id="ready bugs" text="Show ready bugs (open bugs with all dependencies closed)"]]
+
+ \[[!toggleable id="ready bugs" text="""
+ #### Ready Bugs
+
+ Open bugs with all dependencies closed.
+
+ [[!inline pages="define(~open, ./bugs/* and !./bugs/done and !link(done) and !*/Discussion)
+ and ~open and !data_link(Depends on,~open)" actions=yes archive=yes show=0]]
+ """]]
+
+ \[[!toggleable id="open bugs" text="""
+ #### Open Bugs
+
+ [[!inline pages="./bugs/* and !./bugs/done and !link(done)
+ and !*/Discussion" actions=yes archive=yes show=0]]
+ """]]
+
+ \[[!toggleable id="all bugs" text="""
+ #### All Bugs
+
+ [[!inline pages="./bugs/* and !./bugs/done and !*/Discussion"
+ actions=yes archive=yes show=0]]
+ """]]
+
+### bugs/needs_more_bugs.mdwn
+
+ \[[!datatable class="bugtable" datalist="""
+ [[!data key="Reported by" link="John"]] [[!data key="Owned by" link="Frank"]]
+ [[!data key="Depends on" link="bugs/fails_to_frobnicate"]]
+ """]]
+
+ ### Description
+
+ FooBar does not have enough bugs, which suggests that it's not a real Free
+ Software project. Please help create more bugs by adding code to FooBar!
+ :-)
+
+ #### Steps to reproduce:
+
+ Test frobnicate.
+
+ #### What I expect to happen:
+
+ It should fail.
+
+ #### What actually happens:
+
+ It works.
+
+ #### What I have tried to narrow it down:
+
+ I've added some code, but I'm not sure it was the right code.
+
+### bugs/fails_to_frobnicate.mdwn
+
+ \[[!datatable class="bugtable" datalist="""
+ [[!data key="Reported by" link="John"]] [[!data key="Owned by" link="Frank"]]
+ [[!data key="Depends on"]]
+ """]]
+
+ ### Description
+
+ FooBar, when used with the `--frob` option, fails to properly frobnicate
+ output.
+
+ > This is fixed in \[[news/version_1.0]]; marking this bug \[[done]].
+
+ #### Steps to reproduce:
+
+ Use FooBar with the `--frob` option.
+
+ #### What I expect to happen:
+
+ Lots of frobnication.
+
+ #### What actually happens:
+
+ Complete lack of frobnication
+
+ #### What I have tried to narrow it down:
+
+ Tested on Linux, MacOS and NetBSD.
diff --git a/doc/todo/Using_page_titles_in_internal_links.mdwn b/doc/todo/Using_page_titles_in_internal_links.mdwn
new file mode 100644
index 000000000..6e1438bfd
--- /dev/null
+++ b/doc/todo/Using_page_titles_in_internal_links.mdwn
@@ -0,0 +1,3 @@
+It would be really nice if, should a page happen to have a title metavariable, links to that page which do not explicitly state a title would use it. -- Daniel Silverstone
+
+> i like the idea for some applications, but i'm afraid there would be lots of cases where it wouldn't be appropriate to happen automatically, first and foremost capitalization. a syntax like ``\[[|that page]]`` might still be available, as the current implementation assumes the ``|`` character to be part of the page name. --[[chrysn]]
diff --git a/doc/todo/Wikilink_to_a_symbolic_link.mdwn b/doc/todo/Wikilink_to_a_symbolic_link.mdwn
new file mode 100644
index 000000000..1f9a12d9c
--- /dev/null
+++ b/doc/todo/Wikilink_to_a_symbolic_link.mdwn
@@ -0,0 +1,5 @@
+Some time ago I asked in the [[ikiwiki forum|http://ikiwiki.info/forum/Wikilink_to_a_symbolic_link/]] how to create wikilinks to symbolic links. The answer was that this is not possible for security reasons.
+
+However, I use ikiwiki only locally for my personal use, so it wouldn't be a security issue for me. The point is that I want to link to PDF files that are somewhere on my hard drive, so that my ikiwiki automatically links to the newest versions of the files. So my idea would be to create symbolic links to those files in my srcdir.
+
+It would be great if something like that were possible soon.
diff --git a/doc/todo/Wrapper_config_with_multiline_regexp.mdwn b/doc/todo/Wrapper_config_with_multiline_regexp.mdwn
new file mode 100644
index 000000000..7b4323de1
--- /dev/null
+++ b/doc/todo/Wrapper_config_with_multiline_regexp.mdwn
@@ -0,0 +1,36 @@
+Turning the wikilink regexp into an extended regexp on the svn trunk seems
+to have broken the setuid wrapper on my system, because of two reasons:
+First, the wrapper generator should turn each newline in $configstring into
+`\n` in the C code rather than `\` followed by a newline in the C code.
+Second, the untainting of $configstring should allow newlines.
+
+> Both of these problems were already dealt with in commit r3714, on June
+> 3rd. Confused why you're posting patches for them now. [[done]] --[[Joey]]
+
+ Modified: wiki-meta/perl/IkiWiki.pm
+ ==============================================================================
+ --- wiki-meta/perl/IkiWiki.pm (original)
+ +++ wiki-meta/perl/IkiWiki.pm Mon Jun 11 10:52:07 2007
+ @@ -205,7 +205,7 @@
+
+ sub possibly_foolish_untaint ($) {
+ my $tainted=shift;
+ - my ($untainted)=$tainted=~/(.*)/;
+ + my ($untainted)=$tainted=~/(.*)/s;
+ return $untainted;
+ }
+
+
+ Modified: wiki-meta/perl/IkiWiki/Wrapper.pm
+ ==============================================================================
+ --- wiki-meta/perl/IkiWiki/Wrapper.pm (original)
+ +++ wiki-meta/perl/IkiWiki/Wrapper.pm Mon Jun 11 10:52:07 2007
+ @@ -62,7 +62,7 @@
+ }
+ $configstring=~s/\\/\\\\/g;
+ $configstring=~s/"/\\"/g;
+ - $configstring=~s/\n/\\\n/g;
+ + $configstring=~s/\n/\\n/g;
+
+ #translators: The first parameter is a filename, and the second is
+ #translators: a (probably not translated) error message.
diff --git a/doc/todo/Zoned_ikiwiki.mdwn b/doc/todo/Zoned_ikiwiki.mdwn
new file mode 100644
index 000000000..26260b256
--- /dev/null
+++ b/doc/todo/Zoned_ikiwiki.mdwn
@@ -0,0 +1,64 @@
+The idea behind this would be to have one ikiwiki behave as a dynamic private wiki in a specified area
+and as a more static public-zone wiki elsewhere. The private wiki pages can be addressed via a *pagespec*.
+
+What is ready / can be done:
+
+* We can already more or less do this, for example with [[httpauth|/plugins/httpauth/]], *.htaccess* files and a proper *httpauth_pagespec*,
+yet at the cost of maintaining two different user/password databases (the second being the native ikiwiki signin)
+* Furthermore we can [[lockedit|plugins/lockedit/]] some pagespecs, e.g. in the public zone.
+
+What is problematic is when you link to a public page from a private page:
+a backlink will be generated from the public page to the private page.
+
+As I noticed in [[per_page_ACLs]], in the end users will frequently hit HTTP 401 errors through backlink
+navigation, deterring browsing, and the admin will see false positives when watching the logs.
+
+One can radically [[disable the backlinks feature|todo/allow_disabling_backlinks]], but then we lose the neat backlink navigation that
+is really good to have in both areas.
+
+I think just preventing this backlink leak in that case would be sufficient, via e.g. a *privatebacklinks* config option and
+the patch below.
+
+Comments are welcome.
+
+[[mathdesc]]
+
+
+<pre>
+diff --git a/IkiWiki.pm b/IkiWiki.pm
+--- a/IkiWiki.pm
++++ b/IkiWiki.pm
+@@ -294,6 +294,14 @@ sub getsetup () {
+ safe => 1,
+ rebuild => 1,
+ },
++ privatebacklinks => {
++ type => "pagespec",
++ example => "",
++ description => "PageSpec controlling which backlinks are private (ie users/*)",
++ link => "ikiwiki/PageSpec",
++ safe => 1,
++ rebuild => 1,
++ },
+ hardlink => {
+ type => "boolean",
+ default => 0,
+diff --git a/IkiWiki/Render.pm b/IkiWiki/Render.pm
+--- a/IkiWiki/Render.pm
++++ b/IkiWiki/Render.pm
+@@ -52,7 +52,8 @@ sub backlinks ($) {
+ $p_trimmed=~s/^\Q$dir\E// &&
+ $page_trimmed=~s/^\Q$dir\E//;
+
+- push @links, { url => $href, page => pagetitle($p_trimmed) };
++ push @links, { url => $href, page => pagetitle($p_trimmed) }
++ unless defined $config{privatebacklinks} && length $config{privatebacklinks} && pagespec_match($p, $config{privatebacklinks}) && !pagespec_match($page, $config{privatebacklinks}) ;
+ }
+ return @links;
+ }
+
+</pre>
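+
+With the patch applied, usage would look something like this in a Perl-format
+setup file (the pagespec is only an illustration, echoing the `users/*`
+example in the patch's description):
+
+<pre>
+privatebacklinks => "users/*",
+</pre>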
+
+> Have you considered all the ways that anyone with edit access to the
+> public wiki could expose information from the private wiki? For example,
+> you could inline all the private pages into a public page. --[[Joey]]
diff --git a/doc/todo/__34__subscribe_to_this_page__34___checkbox_on_edit_form.mdwn b/doc/todo/__34__subscribe_to_this_page__34___checkbox_on_edit_form.mdwn
new file mode 100644
index 000000000..dc456bbbf
--- /dev/null
+++ b/doc/todo/__34__subscribe_to_this_page__34___checkbox_on_edit_form.mdwn
@@ -0,0 +1,10 @@
+The edit form could include a checkbox "subscribe to this page", allowing a
+user to add a page to their subscribed list while editing. This would prove
+particularly useful for [[todo]] and [bug](bugs) items, to allow users to receive
+notifications for activity on their reports.
+
+--[[JoshTriplett]]
+
+I went and removed commit notification mails entirely; the idea is that you
+subscribe using the [[RecentChanges]] rss feed, and filter it on your end.
+Good enough? --[[Joey]]
diff --git a/doc/todo/__42__forward__42__ing_functionality_for_the_meta_plugin.mdwn b/doc/todo/__42__forward__42__ing_functionality_for_the_meta_plugin.mdwn
new file mode 100644
index 000000000..b3804d652
--- /dev/null
+++ b/doc/todo/__42__forward__42__ing_functionality_for_the_meta_plugin.mdwn
@@ -0,0 +1,70 @@
+Here is a patch [[!tag patch]] to add *forward*ing functionality
+to the [[`meta`_plugin|plugins/meta]].
+
+> [[done]], with some changes --[[Joey]]
+
+Find the most recent version at
+<http://schwinge.homeip.net/~thomas/tmp/meta_forward.patch>.
+
+I can't use `scrub(...)`, as that will strip out the forwarding HTML command.
+How to deal with that?
+
+I can also submit a Git patch, if desired.
+
+
+# Syntax
+
+**URL** = http://some.nice/place/ (*etc.*)
+
+**WHITHER** = \[\[**[[ikiwiki/wikilink]]**]] | **URL**
+
+**D** = natural number (*meaning seconds*)
+
+**OPT_DELAY** = delay=**D** | empty (*immediately*)
+
+\[[!meta forward="**WHITHER**" **OPT_DELAY**]]
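+
+For example, combining the elements above (the URL is the placeholder from
+this page), forwarding to another location after a five-second delay:
+
+    \[[!meta forward="http://some.nice/place/" delay=5]]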
+
+
+# Extensions and Ideas
+
+It might be doable to also add, to the referred-to page, references to pages
+that refer to the page containing the forwarding statement.
+
+--[[tschwinge]]
+
+
+# Discussion
+
+> The html scrubber cannot scrub meta headers. So if you emit one
+> containing user-supplied data, it's up to you to scrub it to avoid all
+> possible XSS attacks. Two attacks I'd worry about are cyclic meta refresh
+> loops, which some, but not all web browsers detect and break, and any way
+> to insert javascript via the user-supplied parameters. (Ie, putting
+> something in the delay value that closes the tag can probably insert
+> javascript ATM; and are there ways to embed javascript in the url?)
+> --[[Joey]]
+
+>> OK. I can add code to make sure that `$delay` **D** indeed is a natural number
+>> and that the passed target address **WHITHER** is nothing but a valid target address.
+>> (How to qualify a valid target address?)
+>> What is a *cyclic meta refresh loop*? Two pages in turn forwarding to each other?
+>> I think it would be possible to implement such a guard when only in-wiki links
+>> ([[ikiwiki/wikilink]]s) are being used, but how to do so for external links? --[[tschwinge]]
+
+>>> This seems a lot more secure to do for in-wiki links, since we know
+>>> that a link generated by a wikilink is safe, and can avoid cycles.
+>>> Obviously there's no way to avoid cycles when using external links.
+>>>
+>>> An example of code that doesn't detect such cycles is LWP::UserAgent,
+>>> which will happily follow cycles forever. There's a LWPx::ParanoidAgent
+>>> that can deal with cycles. I suppose this could be considered a client
+>>> side issue, except that if I were going to turn this redirect feature
+>>> on in my wikis, I'd really prefer to not have to worry about my wiki
+>>> causing such problems for clients. I feel it makes sense to make
+>>> external redirects or other potentially unsafe things an option,
+>>> and have the default behavior be only things that are known to be
+>>> secure.
+>>>
+>>> I haven't checked if there's a way to embed javascript in meta refresh
+>>> links or not. Given all the other places I've seen it be embedded, I'll
+>>> assume it is possible until it's shown not to be though.. --[[Joey]]
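The checks discussed in this thread (delay restricted to a natural number, the target restricted to a plain http/https address, everything escaped before emission) could look roughly like this minimal sketch; `build_meta_refresh` is a standalone illustration, not the meta plugin's actual code:

```python
import re
from html import escape
from urllib.parse import urlparse

def build_meta_refresh(whither, delay="0"):
    """Return a meta refresh tag, or raise ValueError on unsafe input."""
    # D must be a natural number; this rejects anything that could
    # close the attribute and smuggle in javascript via the delay.
    if not re.fullmatch(r"[0-9]+", delay):
        raise ValueError("delay is not a natural number")
    # Only plain http/https targets with a host are allowed, which in
    # particular rejects javascript: URLs as the refresh target.
    parts = urlparse(whither)
    if parts.scheme not in ("http", "https") or not parts.netloc:
        raise ValueError("not a valid target address")
    return '<meta http-equiv="refresh" content="%s; URL=%s" />' % (
        delay, escape(whither, quote=True))
```

Cycle detection for in-wiki forwards would still have to happen at the wikilink level, as discussed above.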
diff --git a/doc/todo/__47___should_point_to_top-level_index.mdwn b/doc/todo/__47___should_point_to_top-level_index.mdwn
new file mode 100644
index 000000000..60cb763e1
--- /dev/null
+++ b/doc/todo/__47___should_point_to_top-level_index.mdwn
@@ -0,0 +1,3 @@
+Since `/` now works in [[ikiwiki/WikiLink]]s to anchor links to the root of the site
+(for instance, `\[[/index]]`), `/` alone should work as a reference to the
+top-level index page (for instance, `\[[Home|/]]`). --[[JoshTriplett]]
diff --git a/doc/todo/a_navbar_based_on_page_properties.mdwn b/doc/todo/a_navbar_based_on_page_properties.mdwn
new file mode 100644
index 000000000..091e0a39b
--- /dev/null
+++ b/doc/todo/a_navbar_based_on_page_properties.mdwn
@@ -0,0 +1,48 @@
+One thing I don't like about Tobi's `navbar.pm` is that the navigation bar is
+hardcoded instead of computed from what's available. Obviously, this allows
+for a very customised `navbar` (i.e. not making all pages show up, like
+a `map` would). However, I think this could also be achieved through page
+properties.
+
+So imagine four pages A, B, A/C, and A/D, and these pages would include the
+following directives, respectively
+
+ \[[!navbaritem navbar=main priority=3]]
+ \[[!navbaritem navbar=main priority=5]]
+ \[[!navbaritem navbar=main title="Something else"]]
+ \[[!navbaritem navbar=main]]
+
+then one could insert `\[[!navbar id=main maxlevels=0]]` somewhere and it
+would get replaced with (this being in the context of viewing page C):
+
+ <ol class="navbar" id="navbar_main">
+ <li><a href="../B">B</a></li>
+ <li><a href="../A">A</a>
+ <ol>
+ <li class="current">Something else</li>
+ <li><a href="D">D</a></li>
+ </ol>
+ </li>
+ </ol>
+
+B would sort before A because it has a higher priority, but C would sort
+before D because their priorities are equal. The overridden title is not used
+for sorting.
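Those sorting rules can be sketched as follows (standalone Python; the item tuples are hypothetical, not the plugin's actual data structures): priority sorts descending, and ties fall back to the page name, with the overridden title playing no part.

```python
# (page, priority, title_override) -- the override plays no part in sorting
items = [
    ("A", 3, None),
    ("B", 5, None),
    ("A/C", 0, "Something else"),
    ("A/D", 0, None),
]

def navbar_order(items):
    # Higher priority first; equal priorities sort by page name.
    return sorted(items, key=lambda i: (-i[1], i[0]))
```

So B comes out before A, and A/C before A/D, matching the example output above.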
+
+Also, the code automatically deduces that C and D are second-level under A.
+
+I don't think this is hard to code up and it's what I've been using with
+[rest2web](http://www.voidspace.org.uk/python/rest2web/) and it's served me
+well.
+
+There is a problem though if this navbar were included in a sidebar (the logical place): if a page is updated, the navbar needs to be rebuilt which causes the sidebar to be rebuilt, which causes the whole site to be rebuilt. Unless we can subscribe only to title changes, this will be pretty bad...
+
+--[[madduck]]
+
+> I've just written a plugin for an automatically created menu for use
+> [here](http://www.ff-egersdorf-wachendorf.de/). The source is at
+> [gitorious](http://gitorious.org/ikiwiki-plugins/automenu). The problem with
+> rebuilding remains unsolved but doesn't matter that much for me as I always
+> generate the web site myself, i.e. it's not really a wiki. --[[lnussel]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/abbreviation.mdwn b/doc/todo/abbreviation.mdwn
new file mode 100644
index 000000000..f2880091c
--- /dev/null
+++ b/doc/todo/abbreviation.mdwn
@@ -0,0 +1,7 @@
+We might want some kind of abbreviation and acronym plugin. --[[JoshTriplett]]
+
+ * Not sure if this is what you mean, but I'd love a way to make words which match existing page names automatically link (e.g. if there is a page called "MySQL" then any time the word MySQL is mentioned it should become a link to that page). -- [[AdamShand]]
+
+ * The python-markdown-extras package has support for [abbreviations](http://www.freewisdom.org/projects/python-markdown/Abbreviations), with the syntax that you just use the abbreviation in text (e.g. HTML) and then define the abbreviations at the end (like "footnote-style" links). For consistency, it might be good to use the same syntax, which apparently derives from [PHP-markdown-extra](http://michelf.com/projects/php-markdown/extra/#abbr).
+
+[[wishlist]]
diff --git a/doc/todo/ability_to_force_particular_UUIDs_on_blog_posts.mdwn b/doc/todo/ability_to_force_particular_UUIDs_on_blog_posts.mdwn
new file mode 100644
index 000000000..099683820
--- /dev/null
+++ b/doc/todo/ability_to_force_particular_UUIDs_on_blog_posts.mdwn
@@ -0,0 +1,24 @@
+When converting an existing blog to ikiwiki it would be useful to be able to preserve any existing UUIDs on posts, in order to [avoid flooding aggregators](/tips/howto_avoid_flooding_aggregators/).
+
+Also, it should be possible to change the permalink (the Atom `<link>`) of a post (e.g. moving the content and leaving a redirector behind), while keeping the same Atom `<id>` (so that aggregators don't get confused).
+
+Ideally UUIDs for [[blog]] posts should be chosen when the post is created, and frozen permanently by checking them in along with the content.
+
+Perhaps ikiwiki's blogging functionality could be hooked up to the [meta plugin](/plugins/meta/), with a new meta keyword (uuid? feed-uuid? atom-uuid?) to achieve this.
+
+> I've now knocked together a [[patch]], which is in the "force-uuids" branch of git://git.debian.org/git/users/smcv/ikiwiki.git (see also [gitweb](http://git.debian.org/?p=users/smcv/ikiwiki.git;a=shortlog;h=refs/heads/force-uuids)).
+>
+> I'm not convinced that "uuid" is the best name for this functionality - the `<id>` in an Atom feed can be any URI, and one of the use-cases I have for this functionality in my own blog needs to have its `<id>` set to a URI that is not, in fact, its own address (it was a typo). "id" is a bit arrogant (forcing Atom terminology into a flat namespace!), "atom-id" is slightly misleading because it's also used for RSS... any ideas?
+>
+> While I was there, I noticed that the quality of the generated Atom/RSS feeds could be improved by making more use of the meta plugin if it's also enabled - would anyone object to me hacking on this some more?
+>
+> -[smcv](http://smcv.pseudorandom.co.uk/)
+
+> [[merged|done]], thank you!
+>
+> I chose to use the term guid, since it's both a generic term that fits
+> very well and describes both using a uuid and an url, and also happens
+> to be the term rss uses. ;-)
+>
+> Of course I'm happy if you can improve the feeds. They do already
+> use some meta information (author, copyright). --[[Joey]]
diff --git a/doc/todo/absolute_urls_in_wikilinks.mdwn b/doc/todo/absolute_urls_in_wikilinks.mdwn
new file mode 100644
index 000000000..a0fe83e44
--- /dev/null
+++ b/doc/todo/absolute_urls_in_wikilinks.mdwn
@@ -0,0 +1,20 @@
+[[!tag wishlist]]
+
+An option to have absolute urls in wikilinks instead of relative ones would be useful,
+for pages included into other pages outside of the wiki rendering process (shtml for example)
+since these pages can be included from a subdir. Ditto, links from \[[!inline ..]] or \[[!map ..]].
+
+> You can make a wikilink absolute by prefixing it with a /, see
+> [[ikiwiki/subpage/linkingrules/]]. Pagespecs match absolute by default. But what do
+> you mean "included from a subdir"? If you inline a page, its links shouldn't
+> change. --Ethan
+
+>> I want the "last pages" in my sidebar, and some links to a few special pages.
+>> \[[!inline ]] or \[[!map ]] in the sidebar is a bad idea (because each update rebuilds
+>> the whole wiki), so I use server-side includes instead of the sidebar plugin;
+>> this reduces the dependencies.
+>> My sidebar is generated as http://foo.org/menu/index.html, so all the links generated by
+>> \[[!inline ]] or \[[!map ]] are relative to this position.
+>> Included from http://foo.org/section/sub/blah/index.shtml, the links are broken.
+>>
+>> — NicolasLimare
diff --git a/doc/todo/access_keys.mdwn b/doc/todo/access_keys.mdwn
new file mode 100644
index 000000000..fb23cf900
--- /dev/null
+++ b/doc/todo/access_keys.mdwn
@@ -0,0 +1,286 @@
+Access keys (i.e., keyboard shortcuts) can be defined for common
+features. Something like the following:
+
+* 1 - Homepage
+* 2 - Search box
+* E - Edit
+* R - RecentChanges
+* H - History
+* P - Preferences
+* D - Discussion
+* S - Save the current page (when editing)
+* C - Cancel the current edit
+* V - Preview the current page
+
+Then, for example, in Firefox one could press Alt+Shift+E to edit the
+page.
+
+For links, this is implemented as:
+
+ <a href="recentchanges/" accesskey="r">RecentChanges</a>
+
+and for forms buttons:
+
+ <input type="submit" value="Submit" accesskey="s"/>
+
+--[[JasonBlevins]], March 21, 2008 18:05 EDT
+
+- - -
+
+There were also a few thoughts about access keys on the
+[[main_discussion_page|index/discussion]]. Now moved to here:
+
+> Would anyone else find this a valuable addition? In oddmuse and instiki (the only other
+> wiki engines I am currently using), the edit, home, and submit link tags have an
+> accesskey attribute. I find it nice not to have to resort to the mouse for those
+> actions. However, it may not be something everyone appreciates. Any thoughts?
+> --[Mazirian](http://mazirian.com)
+>
+> > Maybe, although it would need to take the criticism at
+> > <http://www.cs.tut.fi/~jkorpela/forms/accesskey.html> into account.
+>
+> >> Thank you for that link. Given that the edit link is the first thing you tab to
+> >> in the current layout, I guess it isn't all that necessary. I have had
+> >> a user complain recently that Alt-e in oddmuse was overriding his access
+> >> to the browser menu.
+
+----
+
+The main criticism there, it seems, is that some browsers
+implement access keys in a way (via the Alt
+key) that allows them to override built-in keyboard shortcuts. I
+believe this is not a problem any longer in Firefox (which uses the
+Shift+Alt prefix) but I suppose it could still be a problem in other
+browsers.
+
+Another criticism is that most browsers do not display the access keys
+that are defined. The [article][] cited on the main discussion page
+suggests underlining the relevant mnemonic. I think it would be
+sufficient to just list them in the basewiki documentation somewhere.
+
+ [article]: http://www.cs.tut.fi/~jkorpela/forms/accesskey.html
+
+It's an unfortunate situation&mdash;I'd like an alternative to the
+rodent but there are quite a few downsides to using access keys.
+Tabbing isn't quite the same as a nice shortcut key. There's always
+Conkeror...
+
+--[[JasonBlevins]], March 22, 2008 10:35 EDT
+
+----
+
+I've written a plugin to implement access keys, configured using a wiki page similar to [[shortcuts]]. It works for links and most form submit buttons.
+
+As I am new to ikiwiki plugin writing, feedback is greatly appreciated.
+
+[[!toggle id="accesskeys" text="Toggle: accesskeys.pm"]]
+
+[[!toggleable id="accesskeys" text="""
+
+ #!/usr/bin/perl
+
+ package IkiWiki::Plugin::accesskeys;
+
+ use warnings;
+ use strict;
+ use IkiWiki 3.00;
+ use CGI::FormBuilder;
+
+ =head1 NAME
+
+ accesskeys.pm - IkiWiki module to implement access keys (keyboard shortcuts)
+
+ =head1 VERSION
+
+ v.5.0 - initial version
+
+ =head1 DESCRIPTION
+
+ Access keys are defined on a page called B<accesskeys>, using the C<accesskey> directive.
+ Example:
+
+ [[!accesskey command="Save Page" key="s"]]
+
+ B<command> may contain only alphanumeric characters (and spaces), and must be a complete
+ match to the target link or submit button's display name.
+
+ B<key> may only be a single alphanumeric character.
+
+ The access key is applied to the first matching link on a page (including header), or the
+ first matching submit button in the @buttons array.
+
+ The wiki must be completely rebuilt every time the B<accesskeys> page changes.
+
+ =head2 Sample accesskeys page
+
+ [[!if test="enabled(accesskeys)"
+ then="This wiki has accesskeys **enabled**."
+ else="This wiki has accesskeys **disabled**."]]
+
+ This page controls what access keys the wiki uses.
+
+ * [[!accesskey command="Save Page" key="s"]]
+ * [[!accesskey command="Cancel" key="c"]]
+ * [[!accesskey command="Preview" key="v"]]
+ * [[!accesskey command="Edit" key="e"]]
+ * [[!accesskey command="RecentChanges" key="c"]]
+ * [[!accesskey command="Preferences" key="p"]]
+ * [[!accesskey command="Discussion" key="d"]]
+
+ =head1 IMPLEMENTATION
+
+ This plugin uses the following flow:
+
+ =over 1
+
+ =item 1. Override default CGI::FormBuilder::submit function
+
+    FormBuilder does not support any arbitrary modification of its submit buttons, so
+ in order to add the necessary attributes you have to intercept the internal function
+ call which generates the formatted html for the submit buttons. Not pretty, but it
+ works.
+
+ =item 2. Get list of keys
+
+ During the B<checkconfig> stage the B<accesskeys> source file is read (default
+ F<accesskeys.mdwn>) to generate a list of defined keys.
+
+ =item 3. Insert keys (links)
+
+ Keys are inserted into links during the format stage. All defined commands are checked
+ against the page's links and if there is a match the key is inserted. Only the first
+ match for each command is processed.
+
+ =item 4. Insert keys (FormBuilder buttons)
+
+ FormBuilder pages are intercepted during formatting. Keys are inserted as above.
+
+ =back
+
+ =head1 TODO
+
+ =over 1
+
+    =item * non-existent page links, e.g. ?Discussion
+
+ =item * Support non-submit array buttons (like those added after the main group for attachments)
+
+ =item * Support form fields (search box)
+
+ =back
+
+ =cut
+
+ #=head1 HISTORY
+
+ =head1 AUTHOR
+
+ Written by Damian Small.
+
+ =cut
+
+ my %accesskeys = ();
+
+ # Initialize original function pointer to FormBuilder::submit
+ my $original_submit_function = \&{'CGI::FormBuilder::submit'};
+ # Override default submit function in FormBuilder
+ {
+ no strict 'refs';
+ no warnings;
+ *{'CGI::FormBuilder::submit'} = \&submit_override;
+ }
+
+ sub submit_override {
+ # Call the original function, and get the results
+ my $contents = $original_submit_function->(@_);
+
+ # Hack the results to add accesskeys
+ foreach my $buttonName (keys %accesskeys) {
+ $contents =~ s/(<input id="_submit[^>]+ value="$buttonName")( \/>)/$1 title="$buttonName [$accesskeys{$buttonName}]" accesskey="$accesskeys{$buttonName}"$2/;
+ }
+
+ return $contents;
+ }
+
+ sub import {
+ hook(type => "getsetup", id => "accesskeys", call => \&getsetup);
+ hook(type => "checkconfig", id => "accesskeys", call => \&checkconfig);
+ hook(type => "preprocess", id => "accesskey", call => \&preprocess_accesskey);
+ hook(type => "format", id => "accesskeys", call => \&format);
+ }
+
+ sub getsetup () {
+ return
+ plugin => {
+ safe => 1,
+ rebuild => 1,
+ section => "widget",
+ },
+ }
+
+ sub checkconfig () {
+ if (defined $config{srcdir} && length $config{srcdir}) {
+ # Preprocess the accesskeys page to get all the access keys
+ # defined before other pages are rendered.
+ my $srcfile=srcfile("accesskeys.".$config{default_pageext}, 1);
+ if (! defined $srcfile) {
+ $srcfile=srcfile("accesskeys.mdwn", 1);
+ }
+ if (! defined $srcfile) {
+ print STDERR sprintf(gettext("accesskeys plugin will not work without %s"),
+ "accesskeys.".$config{default_pageext})."\n";
+ }
+ else {
+ IkiWiki::preprocess("accesskeys", "accesskeys", readfile($srcfile));
+ }
+ }
+ }
+
+ sub preprocess_accesskey (@) {
+ my %params=@_;
+
+ if (! defined $params{command} || ! defined $params{key}) {
+ error gettext("missing command or key parameter");
+ }
+
+ # check the key
+ if ($params{key} !~ /^[a-zA-Z0-9]$/) {
+ error gettext("key parameter is not a single character");
+ }
+ # check the command
+ if ($params{command} !~ /^[a-zA-Z0-9 _]+$/) {
+ error gettext("command parameter is not an alphanumeric string");
+ }
+ # Add the access key:
+ $accesskeys{$params{command}} = $params{key};
+
+ return sprintf(gettext("[%s] is the access key for command '<i>%s</i>'"), $params{key}, $params{command});
+ }
+
+ sub format (@) {
+ my %params = @_;
+ my $contents = $params{content};
+
+ # If the accesskey page changes, all pages will need to be updated
+ #debug("Adding dependency: for " . $params{page} . " to AccessKeys");
+ add_depends($params{page}, "AccessKeys");
+
+ # insert access keys
+ foreach my $command (keys %accesskeys) {
+ $contents =~ s/(<a href=[^>]+)(>$command<\/a>)/$1 accesskey="$accesskeys{$command}"$2/;
+ }
+ # may need special handling for non-existant discussion links (and possibly other similar cases?)
+ #$contents =~ s/(<a href=[^>]+)(>\?<\/a>Discussion)/$1 accesskey="d"$2/;
+
+ return $contents;
+ }
+
+ 1
+
+
+[[!toggle id="accesskeys" text="hide accesskeys.pm"]]
+"""]]
+
+--[[DamianSmall]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/ad-hoc_plugins.mdwn b/doc/todo/ad-hoc_plugins.mdwn
new file mode 100644
index 000000000..da28c1b8e
--- /dev/null
+++ b/doc/todo/ad-hoc_plugins.mdwn
@@ -0,0 +1,66 @@
+with pypy's sandbox module, it is possible to run user supplied code safely; that can be used for ad-hoc [[!taglink plugins]].
+
+ad-hoc plugins are little code snippets that work similarly to plugins, are limited in what they can do, but their code resides inside the wiki.
+
+## use cases
+
+* calendar modules for non-standard calendars.
+
+ an article mentioning Maladay could note it as `\[[!template id=discdate date="The Aftermath 5, 579 YOLD"]]`, which could run a script parsing the date and showing the appropriate gregorian date in parentheses after the date in the [[!wikipedia Discordian calendar]]
+
+* url operations for services that don't use the widespread url calculation patterns
+
+ a template for geocoordinates that offers links to various geo-services could take various input formats and generate urls like http://geohash.org/u2edk850cxh31 from `\[[!template id=geoinfos n=48.2081743 e=16.3738189]]`.
+
+## implementation in ikiwiki
+
+### `\[[!pythontemplate id=foo arg=value]]`
+
+the easiest way to enable ad-hoc plugins that came to my mind was creating a plugin similar to template, pythontemplate, that uses the same calling convention as the template plugin. i have implemented the plugin in perl, in a way that doesn't need any additional python code apart from what is shipped in pypy (in the [[!debpkg python-pypy.translator.sandbox]] package). the implementation is far from mature, but works.
+
+### `\[[!foo argument option=value]]`
+
+an implementation in the style of the [[ikiwiki/directive/shortcut]] directive would be easier to use, and would allow positional arguments too. (the template way of calling, with id= identification, requires parsing all arguments to a hash).
+
+### `hook(type="preprocess", id="foo", call=preprocess)`
+
+if one were to allow more features to wiki editors, one could even export the ikiwiki rpc api to python pages. (pages would get their `import()` function called via rpc, and could `hook` into ikiwiki.) the security implications of such a feature would be much harder to assess, though, and the rpc would probably need filtering mechanisms.
+
+## implementation in python
+
+on the python side, i've prepared a `pythontemplate` module that can be imported from the template python programs. for example, it contains an argparse module that's adapted to ikiwiki specifics by formatting help output in a way suitable for wiki inclusion, and that silently handles the `page` and `destpage` options.
+
+The discordian calendar described above could look like this:
+
+ """Explain dates in discordian calendar"""
+ from pythontemplate.argparse import ArgumentParser
+ p = ArgumentParser(description=__doc__)
+ p.add_argument("--date", help="Date in discordian calendar")
+
+ args = p.parse_args()
+
+ def convert_date(...):
+ ...
+
+ print "%s <small>(%s)</small>"%(args.date, convert_date(args.date))
+
+Using argparse might be a bit of overkill here, but has the nice property of enabling `\[[!pythontemplate id=discdate help=true]]` for a documentation page.
+
+## security implications
+
+a simple implementation like my current one or a shortcut-style one is secure by design:
+
+* the perl module decides which python script inside the wiki is to be executed. it takes the arguments to preprocess and prepares them to be passed to the script in argv.
+* the perl module launches the secure pypy-sandbox. it tells it to allow read access to the script itself and the python library, and to run the script in an otherwise isolated environment. it passes the arguments by means of argv, receives the resulting html+directive text from stdout, and evaluates the return status and stderr in case of problems.
+
+time and memory limits can be passed to the sandbox process, so the worst thing a wiki editor could do would be to use up both resources to the defined limit whenever someone edits a page triggering the script.
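A rough sketch of the perl module's job, transliterated to standalone Python: launch the sandbox driver, pass the directive arguments via argv, enforce a time limit, and turn failures into visible wiki text. The driver name `pypy_interact.py` and the error format are illustrative assumptions, not the actual implementation:

```python
import subprocess

def run_adhoc_directive(script, args, timeout=5, interact="pypy_interact.py"):
    """Run a wiki-supplied snippet under the sandbox driver, passing the
    directive arguments via argv and returning the generated html."""
    cmd = [interact, script] + list(args)
    try:
        proc = subprocess.run(cmd, capture_output=True, text=True,
                              timeout=timeout)
    except subprocess.TimeoutExpired:
        return "[[directive timed out]]"
    if proc.returncode != 0:
        return "[[directive failed: %s]]" % proc.stderr.strip()
    return proc.stdout
```

A memory limit would be handed to the driver in much the same way as the timeout.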
+
+some details on pypy-sandbox internals:
+
+an interact script provides an "operating system" to the pypy sandbox binary itself, which it launches. the only syscalls the sandbox binary can do are stdio read/write, and every time the script being run inside wants to do something that would normally trigger a syscall, it talks to the interact script. for example, if the script tries to import the library, the binary asks the interact script for a file handle using an open() line, and the interact script will look in the virtual filesystem it keeps to see if such a file is present there. (it is, as the perl module instructed.)
+
+## performance / optimizations
+
+the current implementation amounts to an invocation of classical python and another invocation of pypy per directive. this also means that pypy will never get to play its big strength, just-in-time optimization.
+
+running a complete foreign language plugin using the xmlrpc interface in the sandbox would alleviate the problem, but the security implications would be difficult. a middle path (running a single pypy sandbox binary per ikiwiki run, but still calling into it only for directive evaluation) seems feasible. there is no direct support for such a thing in pypy yet, but it shouldn't be too hard to do, and even if the separations between the individual directive evaluations could be torn down from inside, the worst thing an attacker could do would be to have side effects between different directive evaluations.
diff --git a/doc/todo/add_forward_age_sorting_option_to_inline.mdwn b/doc/todo/add_forward_age_sorting_option_to_inline.mdwn
new file mode 100644
index 000000000..e91c5a42f
--- /dev/null
+++ b/doc/todo/add_forward_age_sorting_option_to_inline.mdwn
@@ -0,0 +1,34 @@
+The names are a bit backwards, but I guess inverting the sense of age would make people angry :-)
+
+I do not believe copy-and-paste of 3 lines is copyrightable, but in any case feel free to include in ikiwiki under the same terms.
+[[DavidBremner]]
+
+You can already do `sort=age reverse=yes`. [[done]] --[[Joey]]
+
+<pre>
+From 2fb2b561a678616bb0054a2d7a9d29df11998bc2 Mon Sep 17 00:00:00 2001
+From: David Bremner <bremner@pivot.cs.unb.ca>
+Date: Fri, 29 Aug 2008 15:05:41 -0300
+Subject: [PATCH] add sort='youth' to inline plugin
+
+---
+ IkiWiki/Plugin/inline.pm | 3 +++
+ 1 files changed, 3 insertions(+), 0 deletions(-)
+
+diff --git a/IkiWiki/Plugin/inline.pm b/IkiWiki/Plugin/inline.pm
+index d2e5832..9e52712 100644
+--- a/IkiWiki/Plugin/inline.pm
++++ b/IkiWiki/Plugin/inline.pm
+@@ -194,6 +194,9 @@ sub preprocess_inline (@) {
+ elsif (! exists $params{sort} || $params{sort} eq 'age') {
+ @list=sort { $pagectime{$b} <=> $pagectime{$a} } @list;
+ }
++	elsif ($params{sort} eq 'youth') {
++ @list=sort { $pagectime{$a} <=> $pagectime{$b} } @list;
++ }
+ else {
+ return sprintf(gettext("unknown sort type %s"), $params{sort});
+ }
+--
+1.5.6.3
+</pre>
diff --git a/doc/todo/adding_new_pages_by_using_the_web_interface.mdwn b/doc/todo/adding_new_pages_by_using_the_web_interface.mdwn
new file mode 100644
index 000000000..9a502a852
--- /dev/null
+++ b/doc/todo/adding_new_pages_by_using_the_web_interface.mdwn
@@ -0,0 +1,79 @@
+Perhaps I'm just too stupid to find the proper way to do this, but how
+would I add a new page to the wiki without selecting to edit an already
+installed one and frobbing the URL to direct to the to-be-created page?
+--[[ThomasSchwinge]]
+
+Good point. Of course one way is to start with creating a link to the page,
+which also helps prevent orphans. But other wikis based on CGI do have this
+a bit easier, since they can detect an attempt to access a nonexistent page
+and show an edit page. Ikiwiki can't do that (unless its web server is
+configured to do smart things on a 404, like maybe call ikiwiki.cgi which
+could be modified to work as a smart 404 -> edit handler).
+
+> Since this todo was opened, the [[plugins/404]] plugin has been added;
+> it does exactly that. Only if you have Apache, at the moment, though. --[[smcv]]
+
+Some wikis also provide a UI means for creating a new page. If we can find
+something good, that can be added to ikiwiki's UI. --[[Joey]]
+
+Hmm, maybe just a preprocessor directive that creates a form inside a page,
+like is used for blog posting already would suffice? Then the main page of
+a wiki could have a form for adding new pages, if that directive were
+included there. Won't work for subpages though, unless the directive were
+added to the parent page. However, unconnected subpages are surely an even
+rarer thing to want than unconnected top level pages. --[[Joey]]
+
+> Here is a simple plugin that does that. Perhaps options could be added to
+> it, but I couldn't really think of any.
+> <http://jameswestby.net/scratch/create.diff>
+> -- JamesWestby
+
+> For what it's worth, the following works:
+> `\[[!inline pages=!* rss=no atom=no postform=yes postformtext="Add a new page titled:"]]`
+> Add `rootpage=/` if you do this in `index.mdwn` to avoid creating subpages.
+> --[[JeremieKoenig]]
+
+
+Maybe a very simple PHP frontend for serving the
+statically generated pages, that would display a page editing form or
+something like that for non-existent pages, wouldn't be too bad a thing
+or too much of a resource hog? Just a thought... --[[Tuomov]]
+
+----
+
+A quick round-up of how other wikis address this problem:
+
+ * mediawiki *used* to offer a search box with two buttons: 'Go'
+ and 'Search'. 'Go' brought you to a page with the name you
+ typed if it exists, and searches otherwise. In the latter case,
+ you get a link like this at the top of the search results:
+
+> *There is no page titled "Testing". You can create this page.*
+
+ * wikia mediawikis have an "add a page" button that pops up a JS
+ pseudo-window asking for a page name. On submission, you end
+ up at an edit window for the page.
+ * wikipedia now makes it quite hard to create new pages. The old
+ 'go' button is gone, nearly all search terms end up at an actual
+ article, and a "no results" match does not have helpful create link
+ options.
+ * Moin Moin has a two-button search: "Titles" and "Text". Neither
+ offers a "create page" option for 0-match searches.
+ * the original c2.com wiki has no helpful link for this either.
+
+So - the direction of travel would appear to be *away* from having
+"new page" functionality.
+
+I would suggest the following for ikiwiki:
+
+ * Extend the search results page to include a "create this page" link,
+ perhaps toggleable, perhaps only if the search term matches some
+ criteria for what makes a sensible page name
+ * Some combination of JamesWestby's "create" plugin, extracting the
+ current stuff inside [[plugins/inline]] (see also:
+ [[more flexible inline postform]]) -- more generally, rationalising
+ where that code lives so it can be used in more contexts.
+ * documenting the `inline` hack above (which I use extensively on my
+ private wikis, by the way!) as a [[tip|tips]].
+
+-- [[Jon]]
diff --git a/doc/todo/adjust_commit_message_for_rename__44___remove.mdwn b/doc/todo/adjust_commit_message_for_rename__44___remove.mdwn
new file mode 100644
index 000000000..3d0d1aff4
--- /dev/null
+++ b/doc/todo/adjust_commit_message_for_rename__44___remove.mdwn
@@ -0,0 +1,5 @@
+When you rename or remove pages using the relevant plugins, a commit message is generated automatically by the plugin.
+
+It would be nice to provide a text field in the remove/rename form, pre-populated with the automatic message, so that a user may customize or append to the message (modulo VCS support).
+
+-- [[Jon]]
diff --git a/doc/todo/aggregate_401_handling.mdwn b/doc/todo/aggregate_401_handling.mdwn
new file mode 100644
index 000000000..125e74e76
--- /dev/null
+++ b/doc/todo/aggregate_401_handling.mdwn
@@ -0,0 +1,20 @@
+The aggregate plugin's handling of http 301 (moved permanently) could be
+improved. Per [[!rfc 1945]]:
+
+> The requested resource has been assigned a new permanent URL
+> and any future references to this resource should be done
+> using that URL.
+
+So ideally aggregate would notice the 301 and use the new url henceforth.
+
+It's a little tricky because the aggregate plugin can't just edit the page and
+change the url in the preprocessor directive. (Because committing such an
+edit would be .. hard.) Also, aggregate directives may also include a separate
+url for the site, which may also have changed. Perhaps the thing to do is
+record the new url in the aggregate plugin's state file, and change the message
+to "Processed ok (new url http://..)", and let the user deal with updating
+the page later.
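The suggested state-file behaviour could be sketched like this; `note_moved_feed` and the dict standing in for the plugin's state file are assumptions for illustration, not the aggregate plugin's actual code:

```python
def note_moved_feed(state, feed_url, final_url):
    """Record a permanent redirect in the aggregator's state and adjust
    the per-feed status message, so the user can update the page later."""
    entry = state.setdefault(feed_url, {})
    if final_url and final_url != feed_url:
        entry["newurl"] = final_url
        entry["message"] = "Processed ok (new url %s)" % final_url
    else:
        entry["message"] = "Processed ok"
    return state
```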
+
+----
+
+Assuming they really mean **301** (because 401 is "Unauthorized"), I've updated the body of the text but the page itself should really be renamed. -- adam. \ No newline at end of file
diff --git a/doc/todo/aggregate_locking.mdwn b/doc/todo/aggregate_locking.mdwn
new file mode 100644
index 000000000..062f34ea3
--- /dev/null
+++ b/doc/todo/aggregate_locking.mdwn
@@ -0,0 +1,64 @@
+The [[plugins/aggregate]] plugin's locking is suboptimal.
+
+There should be no need to lock the wiki while aggregating -- it's annoying
+that long aggregate runs can block edits from happening. However, not
+locking would present problems. One is, if an aggregate run is happening,
+and the feed is removed, it could continue adding pages for that feed.
+Those pages would then become orphaned, and stick around, since the feed
+that had created them is gone, and thus there's no indication that they
+should be removed.
+
+To fix that, garbage collect any pages that were created by
+aggregation once their feed is gone.
+
+Are there other things that could happen while it's aggregating that it
+should check for?
+
+Well, things like the feed url etc could change, and it
+would have to merge in such changes before saving the aggregation state.
+New feeds could also be added, and feeds could be moved from one source page to
+another.
+
+Merging that feed info seems doable, just re-load the aggregation state
+from disk, and set the `message`, `lastupdate`, `numposts`, and `error`
+fields to their new values if the feed still exists.
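+
+In sketch form (the state layout is illustrative; the field names are
+the ones mentioned above):
+
```perl
use strict;
use warnings;

# Merge this run's per-feed results into the state reloaded from disk,
# dropping results for feeds that were removed meanwhile.
sub mergestate {
    my ($ondisk, $ours) = @_;
    for my $url (keys %$ours) {
        next unless exists $ondisk->{$url};  # feed was removed; drop it
        $ondisk->{$url}{$_} = $ours->{$url}{$_}
            for qw(message lastupdate numposts error);
    }
    return $ondisk;
}
```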
+
+----
+
+Another part of the mess is that it needs to avoid stacking multiple
+aggregate processes up if aggregation is very slow. Currently this is done
+by taking the lock in nonblocking mode, and not aggregating if it's locked.
+This has various problems, for example a page edit at the right time can
+prevent aggregation from happening.
+
+Adding another lock just for aggregation could solve this. Check that lock
+(in checkconfig) and exit if another aggregator holds it.
+
+----
+
+The other part of the mess is that it currently does aggregation in
+checkconfig, locking the wiki for that, and loading state, and then
+dropping the lock, unloading state, and letting the render happen. Which
+reloads state. That state reloading is tricky to do just right.
+
+A simple fix: Move the aggregation to the new 'render' hook. Then state
+would already be loaded, and there would be no tricky state reloading to
+worry about.
+
+Or aggregation could be kept in checkconfig, like so:
+
+* load aggregation state
+* get list of feeds needing aggregation
+* exit if none
+* attempt to take aggregation lock, exit if another aggregation is happening
+* fork a child process to do the aggregation
+ * load wiki state (needed for aggregation to run)
+ * aggregate
+ * lock wiki
+ * reload aggregation state
+ * merge in aggregation state changes
+ * unlock wiki
+* drop aggregation lock
+* force rebuild of sourcepages of feeds that were aggregated
+* exit checkconfig and continue with usual refresh process
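+
+The "attempt to take aggregation lock" step might look something like
+this (lockfile path and function name are illustrative only):
+
```perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Take the aggregation-only lock nonblockingly. Returns a filehandle to
# hold (the lock lives as long as the handle stays open), or undef if
# another aggregator already holds it.
sub take_aggregate_lock {
    my ($lockfile) = @_;
    open(my $fh, '>', $lockfile) or return undef;
    if (! flock($fh, LOCK_EX | LOCK_NB)) {
        close $fh;
        return undef;
    }
    return $fh;
}
```
+
+Holding the returned filehandle open keeps the lock for the lifetime of
+the child process; closing it (or exiting) releases it.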
+
+[[done]]
diff --git a/doc/todo/aggregate_to_internal_pages.mdwn b/doc/todo/aggregate_to_internal_pages.mdwn
new file mode 100644
index 000000000..272e146f4
--- /dev/null
+++ b/doc/todo/aggregate_to_internal_pages.mdwn
@@ -0,0 +1,59 @@
+The new internal page feature is designed for something like
+[[plugins/aggregate]].
+
+How to transition to it though? inlines of aggregated content would need to
+change their pagespecs to use `internal()`.
+
+> [[patch]] in git://git.debian.org/git/users/smcv/ikiwiki.git, branch "aggregate".
+> Migration is a two-step process: first change all your pagespecs to use `internal()`, then add `internalize="yes"` to all your aggregate invocations. --[[smcv]]
+
+> Thanks for working on this.
+>
+> I see one problem, if internalize is flipped on and there are existing
+> aggregated pages, htmlfn will not return the right filename for those
+> pages when expiring them. Seems that `$was_internal` (or just the full
+> source filename) should be recorded on a per-guid basis. Could you do
+> that?
+>
+> I'm weighing the added complexity of having an internalize option
+> (which people would have to add, and would probably forget), with just
+> making aggregate create all new pages as internal, and having a flag day
+> where all inlines and other uses of aggregated pages have to change
+> pagespecs to use `isinternal()`.
+>
+> There are real bugs that are fixed by making
+> aggregated plugins internal, including:
+> - Avoids web edits to aggregated pages. (Arguably a security hole;
+> though they can be locked..)
+> - Significant speed improvements.
+> - Less disk use.
+>
+> If internal has to be manually enabled, people will forget to. I'd rather
+> not have to worry about these bugs in the future. So, I'm thinking flag
+> day. --[[Joey]]
+
+> OK, there's a simpler approach in the same repository, branch
+> "aggregateinternal". It just adds an aggregateinternal option
+> for the whole wiki.
+>
+> On a flag day, everyone has to change their inline directives
+> to use `internal()`, after which this option can be switched on.
+> When changing the option, you'll have to clean up the mess from
+> old aggregated pages by hand, and re-aggregate.
+>
+> If this is a direction you prefer, the next step would be to
+> add support for existing wikis setting this option - for instance
+> it could look for non-internal pages that were previously
+> aggregated, and convert them to internal, the first time aggregation
+> runs. --smcv
+
+> Sure, that seems reasonable. Perhaps `ikiwiki-transition` could be used
+> to move the pages, and even, possibly update the pagespecs (not sure how
+> it could figure out which ones tho). --[[Joey]]
+
+> I've patched ikiwiki-transition to have an aggregateinternal mode.
+> See my aggregateinternal branch, again.
+> "ikiwiki-transition aggregateinternal $setupfile" moves the pages around,
+> although it doesn't update the pagespecs (I wouldn't know how...) --[[smcv]]
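+
+For illustration, the page move itself is little more than a rename to
+the internal extension (the `._aggregated` suffix matches how aggregate
+names its internal pages, but treat the details as a sketch):
+
```perl
use strict;
use warnings;

# Map an aggregated page's source file to its internal equivalent;
# rename($srcfile, internal_name($srcfile)) would then move each page.
sub internal_name {
    my ($srcfile) = @_;
    (my $internal = $srcfile) =~ s/\.[^.\/]+$/._aggregated/;
    return $internal;
}
```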
+
+[[!tag patch done]]
diff --git a/doc/todo/aggregation.mdwn b/doc/todo/aggregation.mdwn
new file mode 100644
index 000000000..371d20c12
--- /dev/null
+++ b/doc/todo/aggregation.mdwn
@@ -0,0 +1,3 @@
+* Still need to support feed expiry.
+
+[[todo/done]]
diff --git a/doc/todo/alias_directive.mdwn b/doc/todo/alias_directive.mdwn
new file mode 100644
index 000000000..71a2efc76
--- /dev/null
+++ b/doc/todo/alias_directive.mdwn
@@ -0,0 +1,72 @@
+An alias directive could work like an inverse redirect, but in a more
+maintainable way. Currently, a page might have several redirects leading to it,
+without an easy way of enumerating them. Therefore, the following directive is
+suggested for addition (possibly by means of a plugin):
+
+> The `alias` and `aliastext` directives implicitly create
+> redirect pages to the page they are used on. If two or more pages claim a
+> non-existing page to be an alias, a disambiguation page will automatically
+> be generated. If an existing page is claimed as an alias, it will be prefixed
+> with a note that its topic is also an alias for other pages.
+>
+> All aliases to a page are automatically listed below the backlink and tag
+> lists at the bottom of a page by default. This can be configured globally by
+> setting the `alias_list` configuration option to `false`, or set explicitly
+> per alias by specifying `list=true` or `list=false`.
+>
+> Similar to the `taglink` directive, `aliastext` produces the alias name as
+> well as registering it.
+>
+> ## Usage example
+>
+> `Greece.mdwn`:
+>
+> > Greece, also known as \[[!aliastext Hellas]] and officially the
+> > \[[!aliastext "Hellenic Republic"]], is a …
+> >
+> > <!-- there are so many people who misspell this, let's create a redirect -->
+> > \[[!alias Grece list=false]]
+>
+> This page by itself will redirect from the "Hellas", "Hellenic Republic" and
+> "Grece" pages as if they each contained just:
+>
+> > \[[!meta redir="Greece"]]
+>
+> If, on the other hand, `Hellas Planitia` also claims `[[!alias Hellas]]`, the
+> Hellas page will look like this:
+>
+> > **Hellas** is an alias for the following pages:
+> >
+> > * \[[Greece]]
+> > * \[[Hellas Planitia]]
+
+The proposed plugin/directive could be extended, eg. by also including
+old-style redirects in the alias list, but that might introduce unwanted
+coupling with the meta directive.
+
+-----------------
+
+On second thought, implementing this might have similarities with
+[[todo/auto-create tag pages according to a template]] -- the auto-created
+pages would, if the way of the alias directive is followed, not create physical
+files, though, but be created just when someone edits them.
+
+If multiple plugins do such a trick, they would have to fight over who comes
+first. If, for example, we have a setup where not yet created tag pages are
+automatically generated as "\[[!inline pages="link(<TMPL_VAR TAG>)"
+archive="yes"]]" and aliases are enabled, and a non-tag page grabs a tag as an
+alias (as to redirect all taglinks of the tag to itself), there are two
+possibilities:
+
+* The autotag plugin comes first:
+ * autotag sees the missing tag and creates its "\[[!inline" stuff
+ * alias sees that there is already content and adds its prefix
+* The alias plugin comes first (this is the preferred way):
+ * alias sees the empty page, sees it is not contested by other alias
+ directives and creates its "\[[!meta" redirect
+ * autotag sees there is already content and doesn't do anything
+
+That issue could be handled with a "priority number" on the hook, with plugins
+with a lower number being called first.
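+
+A toy sketch of such a prioritised hook registry (the `priority`
+parameter is hypothetical; ikiwiki's `hook()` has no such field today):
+
```perl
use strict;
use warnings;

# Hooks register with a priority; lower runs first, and the first hook
# that returns content claims the not-yet-created page.
my @autocreate;

sub hook_autocreate {
    my %h = @_;
    push @autocreate, \%h;
    @autocreate = sort { $a->{priority} <=> $b->{priority} } @autocreate;
}

sub run_autocreate {
    my ($page) = @_;
    for my $h (@autocreate) {
        my $content = $h->{call}->($page);
        return $content if defined $content;  # first claimant wins
    }
    return undef;
}
```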
+
+[[!tag wishlist]]
diff --git a/doc/todo/allow_CGI_to_create_dynamic_pages.mdwn b/doc/todo/allow_CGI_to_create_dynamic_pages.mdwn
new file mode 100644
index 000000000..7f51f79d0
--- /dev/null
+++ b/doc/todo/allow_CGI_to_create_dynamic_pages.mdwn
@@ -0,0 +1,3 @@
+[[!tag wishlist]]
+
+It would be cool if the CGI could be used to render dynamic pages. For instance, I might want to create a page with a `\[[map]]` according to a [[pagespec]] to be passed in the query string, instead of creating/hardcoding all possible pagespecs I might want to call.
diff --git a/doc/todo/allow_TMPL__95__LOOP_in_template_directives.mdwn b/doc/todo/allow_TMPL__95__LOOP_in_template_directives.mdwn
new file mode 100644
index 000000000..890c4cf4b
--- /dev/null
+++ b/doc/todo/allow_TMPL__95__LOOP_in_template_directives.mdwn
@@ -0,0 +1,278 @@
+[[!tag patch todo]]
+
+[[!template id="note" text="""
+Simply copied this from my website
+[[http://www.camco.ie/code/ikiwiki,3.20120202,20120313a/]]
+feel free to reformat / delete"""]]
+
+The following re-write allows for multiple definitions of the
+same tag value in a [[plugins/template]] definition. This, in turn, allows
+us to use TMPL_LOOPS in our [[ikiwiki/directive/template]] directives; albeit in a
+rather limited way.
+
+> I'm willing to consider such a feature, but it needs to be presented in
+> the form of a patch that is reviewable, not a gratuitous rewrite.
+> --[[Joey]]
+
+>> Yes, my apologies for that. The two worker functions `mktmpl_hash`
+and `proc_tmpl_hash` are new. The `preprocess` function then starts
+by arranging the parameters into an array. This array is passed to the
+`mktmpl_hash` and it creates a hash, suitable for passing into the
+HTML::Template directly. The `proc_tmpl_hash` then walks the hash
+structure and processes the parameters.
+
+>> I know ... you weren't looking for an explanation, just a patch
+... totally understand. Point I'm trying to make, it's a 90% re-write
+anyway (and my `style(8)` will probably piss most people off).
+
+>> Anyway, would love to contribute so will try to get to doing this
+"correctly" and post as a patch.
+
+I would, personally, only use this feature for very basic loops
+and, although nested loops *might* be possible (with a little
+more tinkering), I think any attempt would be better served by
+[[Kathryn Andersen's|http://www.katspace.org/]] [[field et
+al.|http://ikiwiki.info/plugins/contrib/field/]] plugin.
+
+It *is* (primarily) intended to allow insertion of organised CSS
+blocks (i.e. `<div>`) through template directives (since I can't
+seem to get HTML and Markup to mix the way I want).
+
+[[!template id="note" text="""
+Apologies for the re-write. I struggle reading perl code that
+I didn't write and (probably too often) re-format to reduce my
+head-aches. Anyway it didn't make sense to post the patch since
+everything's changed now.
+"""]]
+
+NB: this *should* be 100% backwards compatible.
+
+# `lib/perl5/IkiWiki/Plugin/template.pm`
+
+[[!format perl """
+
+ #!/usr/bin/perl
+ # Structured template plugin.
+ package IkiWiki::Plugin::template ;
+
+ use warnings ;
+ use strict ;
+ use IkiWiki 3.00 ;
+ use Encode ;
+
+ sub mktmpl_hash( $ ; $ ; @ ) ;
+ # declare to suppress warning in recursive call
+ sub mktmpl_hash( $ ; $ ; @ )
+ # make hash for the template, filling
+ # values from the supplied params
+ {
+ my $template = shift( @_ )
+ || error( "mktmpl_hash: no template provided" ) ;
+ my $param_src = shift( @_ )
+ || error( "mktmpl_hash: no parameters" ) ;
+
+ my $path ;
+ if( $#_ > 0 )
+ {
+ $path = [ @_ ] ;
+ } else {
+ $path = shift(@_) || [] ;
+ } ;
+
+ my %params ;
+
+ my @path_vars ;
+ if( $#{$path} < 0 )
+ {
+ @path_vars = $template->query() ;
+ } else {
+ @path_vars = $template->query( loop => $path ) ;
+ } ;
+
+ foreach my $var ( @path_vars )
+ {
+ push( @{$path}, $var ) ;
+ my $param_type = $template->query( name => $path ) ;
+ if( $param_type eq 'VAR' )
+ {
+ my @var_path = split( /_/, $var ) ;
+ if( $var_path[0] ne '' )
+ {
+ $path->[-1] = join( '_', @var_path[1..$#var_path] )
+ if( $var_path[0] eq 'raw' ) ;
+ $params{$var} = shift( @{$param_src->{$path->[-1]}} )
+ || return(undef) ;
+ } ;
+ } elsif( $param_type eq 'LOOP' )
+ {
+ $params{$var} = [] ;
+ push( @{$params{$var}}, $_ )
+ while( $_ = mktmpl_hash($template,$param_src,$path) ) ;
+ } ;
+ pop( @{$path} ) ;
+ } ;
+ return( \%params ) ;
+ } ;
+
+ sub proc_tmpl_hash( $ ; $ ; $ ; $ ) ;
+ # declare to suppress warning in recursive call
+ sub proc_tmpl_hash( $ ; $ ; $ ; $ )
+ # walk the hash, preprocess and
+ # convert to html
+ {
+ my $tmpl_hash = shift( @_ ) ;
+ my $page = shift( @_ ) ;
+ my $destpage = shift( @_ ) ;
+ my $scan = shift( @_ ) ;
+ foreach my $key ( keys(%{$tmpl_hash}) )
+ {
+ unless( ref($tmpl_hash->{$key}) )
+ # here we assume that
+ # any reference is an
+ # array and allow it to
+ # fail if that's false
+ {
+ $tmpl_hash->{$key} =
+ IkiWiki::preprocess(
+ $page,
+ $destpage,
+ $tmpl_hash->{$key},
+ $scan ) ;
+ my @key_path = split( /_/, $key ) ;
+ $tmpl_hash->{$key} =
+ IkiWiki::htmlize(
+ $page,
+ $destpage,
+ pagetype($pagesources{$page}),
+ $tmpl_hash->{$key}, )
+ unless( $key_path[0] eq 'raw' ) ;
+ } else {
+ proc_tmpl_hash( $_, $page, $destpage, $scan )
+ foreach( @{$tmpl_hash->{$key}} ) ;
+ } ;
+ } ;
+ } ;
+
+ # "standard" ikiwiki definitions / hooks
+
+ sub import
+ {
+ hook( type => "getsetup",
+ id => "template",
+ call => \&getsetup ) ;
+ hook( type => "preprocess",
+ id => "template",
+ call => \&preprocess,
+ scan => 1 ) ;
+ } ;
+
+ sub getsetup()
+ {
+ return(
+ plugin => {
+ safe => 1,
+ rebuild => undef,
+ section => "widget",
+ }, ) ;
+ } ;
+
+ sub preprocess( @ )
+ {
+ # first process arguments into arrays of values
+ my %params ;
+
+ my( $key, $value ) ;
+ while( ($key,$value)=splice(@_,0,2) )
+ {
+ if( exists($params{$key}) )
+ {
+ push( @{$params{$key}}, $value ) ;
+ } else {
+ $params{$key} = [ $value ] ;
+ } ;
+ } ;
+
+ # set context
+ my $scan = ! defined( wantarray() ) ;
+ # This needs to run even in scan
+ # mode, in order to process links
+ # and other metadata included via
+ # the template.
+
+ # check for critical values
+ if( ! exists($params{id}) )
+ {
+ error( gettext("missing id parameter") ) ;
+ } ;
+
+ # set some convenience variables
+ my $id = $params{id}->[$#{$params{id}}] ;
+ my $page = $params{page}->[$#{$params{page}}] ;
+ my $destpage = $params{destpage}->[$#{$params{destpage}}] ;
+ # ... and an essential one for the production pass
+ $params{basename} = [ IkiWiki::basename($page) ] ;
+
+ # load the template
+ my $template ;
+ eval {
+ $template =
+ template_depends( $id, $page,
+ blind_cache=>1 ) ;
+ # The bare id is used, so
+ # a page templates/$id can
+ # be used as the template.
+ } ;
+ if( $@ )
+ {
+ error(
+ sprintf(
+ gettext("failed to process template %s"),
+ htmllink(
+ $page,
+ $destpage,
+ "/templates/$id")
+ )." $@"
+ ) ;
+ } ;
+
+ # create and process the parameters
+ my $tmpl_hash = mktmpl_hash( $template, \%params ) ;
+ proc_tmpl_hash( $tmpl_hash, $page, $destpage, $scan ) ;
+ # ... and load the template with the values
+ $template->param( $tmpl_hash ) ;
+
+ # return the processed page chunk
+ return( IkiWiki::preprocess($page,
+ $destpage,
+ $template->output(),$scan)
+ ) ;
+ } ;
+
+ 1 ;
+
+"""]]
+
+## sample template
+
+ # <TMPL_VAR HEADER0>
+
+ <table>
+ <TMPL_LOOP TEST0>
+ <tr>
+ <td><TMPL_VAR DATA0></td>
+ <td><TMPL_VAR DATA1></td>
+ </tr>
+ </TMPL_LOOP>
+ </table>
+
+## sample iki page
+
+ \[[!meta title="this is my loops page"]]
+
+ \[[!template id="loops"
+ header0="this is a table"
+ data0="cell0:0"
+ data1="cell0:1"
+ data0="cell1:0"
+ data1="cell1:1"
+ ]]
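+
+For the sample above, the repeated `data0`/`data1` parameters would be
+collected and nested roughly like this (key spellings are a sketch, not
+the rewrite's exact internal names):
+
```perl
use strict;
use warnings;

# Parameters as preprocess() collects them: each name maps to an array
# of every value it was given, in order.
my %params = (
    header0 => ['this is a table'],
    data0   => ['cell0:0', 'cell1:0'],
    data1   => ['cell0:1', 'cell1:1'],
);

# Roughly what mktmpl_hash builds from that: loop rows become an array
# of hashes, consumed pairwise from the value arrays, which is the
# shape HTML::Template expects for a TMPL_LOOP.
my $tmpl_hash = {
    header0 => 'this is a table',
    test0   => [
        { data0 => 'cell0:0', data1 => 'cell0:1' },
        { data0 => 'cell1:0', data1 => 'cell1:1' },
    ],
};
```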
diff --git a/doc/todo/allow_banning_a_user_when_moderating_a_comment.mdwn b/doc/todo/allow_banning_a_user_when_moderating_a_comment.mdwn
new file mode 100644
index 000000000..c0b85ec85
--- /dev/null
+++ b/doc/todo/allow_banning_a_user_when_moderating_a_comment.mdwn
@@ -0,0 +1 @@
+If a logged-in user is both a comment moderator and an admin, it would be nice to be able to tick a box in the comment moderation interface to ban the poster of a comment: their IP if not signed in, or both their IP and their login if signed in, I suppose. Presently, you must view the back-end files to establish who posted the comment (the IP is not exposed in the moderation interface yet). — [[Jon]]
diff --git a/doc/todo/allow_creation_of_non-existent_pages.mdwn b/doc/todo/allow_creation_of_non-existent_pages.mdwn
new file mode 100644
index 000000000..61f311b8c
--- /dev/null
+++ b/doc/todo/allow_creation_of_non-existent_pages.mdwn
@@ -0,0 +1,13 @@
+With a statement such as
+
+ ErrorDocument 404 /wiki/cgi-bin/ikiwiki?do=create
+
+in `apache`'s configuration, I think that it would be possible to let the user surf to non-existent pages and be prompted to create an entry, as is possible with other popular wiki engines.
+
+From the [apache documentation](http://httpd.apache.org/docs/2.2/custom-error.html), it seems that the environment variable `REDIRECT_URL` will carry the name of the page the user has accessed. Also see [ErrorDocument](http://httpd.apache.org/docs/2.2/mod/core.html#errordocument)'s documentation.
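+
+A sketch of how a CGI might map `REDIRECT_URL` to a page name to offer
+for creation (the helper name and url handling are illustrative;
+ikiwiki's real handling of this ended up in the 404/goto plugins):
+
```perl
use strict;
use warnings;

# Derive a wiki page name from Apache's REDIRECT_URL.
sub page_from_redirect_url {
    my ($redirect_url, $urlbase) = @_;
    my $page = $redirect_url;
    $page =~ s/^\Q$urlbase\E//;   # strip the site prefix, e.g. "/wiki"
    $page =~ s!^/+!!;             # and any leading slash
    $page =~ s/\.html?$//;        # file extension back to page name
    return length $page ? $page : 'index';
}
```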
+
+> Nice idea, I'll try to find time to add a plugin doing this. --[[Joey]]
+
+>> [[done]] some time ago, as the [[plugins/404]] plugin --[[smcv]]
+
+[[wishlist]]
diff --git a/doc/todo/allow_disabling_backlinks.mdwn b/doc/todo/allow_disabling_backlinks.mdwn
new file mode 100644
index 000000000..5dd4876e8
--- /dev/null
+++ b/doc/todo/allow_disabling_backlinks.mdwn
@@ -0,0 +1,18 @@
+This patch allows disabling the backlinks in the config file by setting nobacklinks to 0.
+
+It is backwards compatible, and by default enables backlinks in the generated pages.
+
+<pre>
+--- IkiWiki/Render.pm.orig2 2009-01-06 14:54:01.000000000 +1300
++++ IkiWiki/Render.pm 2009-01-06 14:55:08.000000000 +1300
+@@ -107,7 +107,8 @@
+ $template->param(have_actions => 1);
+ }
+
+- my @backlinks=sort { $a->{page} cmp $b->{page} } backlinks($page);
++ my @backlinks=sort { $a->{page} cmp $b->{page} } backlinks($page)
++ unless defined $config{nobacklinks} && $config{nobacklinks} == 0;
+ my ($backlinks, $more_backlinks);
+ if (@backlinks <= $config{numbacklinks} || ! $config{numbacklinks}) {
+ $backlinks=\@backlinks;
+</pre>
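+
+One hazard worth noting: perl documents the behaviour of a statement
+modifier on a `my` declaration as undefined, so when the condition is
+false `@backlinks` may be left in a surprising state. A safer shape for
+the same guard (stub data stands in for ikiwiki's `%config` and
+`backlinks()`):
+
```perl
use strict;
use warnings;

my %config = (nobacklinks => 1);
my @all = ({page => 'b'}, {page => 'a'});

# Always declare the array; only conditionally populate it.
my @backlinks;
unless (defined $config{nobacklinks} && $config{nobacklinks} == 0) {
    @backlinks = sort { $a->{page} cmp $b->{page} } @all;
}
```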
diff --git a/doc/todo/allow_displaying_number_of_comments.mdwn b/doc/todo/allow_displaying_number_of_comments.mdwn
new file mode 100644
index 000000000..02d55fc9b
--- /dev/null
+++ b/doc/todo/allow_displaying_number_of_comments.mdwn
@@ -0,0 +1,30 @@
+My `numcomments` Git branch adds a `NUMCOMMENTS` `TMPL_VAR`, which is
+useful to add to the `forumpage.tmpl` template to emulate (the nice
+bits of) a more usual webforum.
+
+Please review... and pull :)
+
+-- [[intrigeri]]
+
+> How is having this variable for showing a count of the comments
+> better (or more forum-ish) than the COMMENTSLINK variable which
+> includes a count and a link to the comments, and is already displayed
+> in inlinepage.tmpl?
+>
+> `num_comments` will never return undef.
+>
+> I see no need to add a second pagetemplate hook.
+> The existing one can be added to. Probably inside its `if ($shown)`
+> block.
+>
+> It may also be a good idea to either combine the calls to `num_comments`
+> used for this and for the commentslink,
+> or to memoize it. I'm thinking generally memoizing it may be a good idea
+> since the comments for a page will typically be counted twice when it's
+> inlined.
+> --[[Joey]]
+
+[[patch]]
+
+>> Well, the COMMENTSLINK variable fits my needs. Sorry for
+>> the disturbance. [[done]] --[[intrigeri]]
diff --git a/doc/todo/allow_full_post_from_the___34__add_a_new_post__34___form.mdwn b/doc/todo/allow_full_post_from_the___34__add_a_new_post__34___form.mdwn
new file mode 100644
index 000000000..a604182b1
--- /dev/null
+++ b/doc/todo/allow_full_post_from_the___34__add_a_new_post__34___form.mdwn
@@ -0,0 +1,12 @@
+To avoid the two-step posting process of typing a page name, hitting
+"Edit", entering content, and hitting "Save Page", how about optionally
+including a post content field, save button, and preview button directly on
+the page with the inline? This would particularly help when using an
+inline directive for a comment form at the bottom of a blog post; with
+these added fields, the post form becomes exactly like the typical blog
+comment form.
+
+> I agree that having this as an option is reasonable. Although it would
+> take a fair amount of work. --[[Joey]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/allow_plugins_to_add_sorting_methods.mdwn b/doc/todo/allow_plugins_to_add_sorting_methods.mdwn
new file mode 100644
index 000000000..b523cd19f
--- /dev/null
+++ b/doc/todo/allow_plugins_to_add_sorting_methods.mdwn
@@ -0,0 +1,304 @@
+[[!tag patch]]
+
+The available [[ikiwiki/pagespec/sorting]] methods are currently hard-coded in
+IkiWiki.pm, making it difficult to add any extra sorting mechanisms. I've
+prepared a branch which adds 'sort' as a hook type and uses it to implement a
+new `meta_title` sort type.
+
+Someone could use this hook to make `\[[!inline sort=title]]` prefer the meta
+title over the page name, but for compatibility, I'm not going to (I do wonder
+whether it would be worth making sort=name an alias for the current sort=title,
+and changing the meaning of sort=title in 4.0, though).
+
+> What compatability concerns, exactly, are there that prevent making that
+> change now? --[[Joey]]
+
+*[sort-hooks branch now withdrawn in favour of sort-package --s]*
+
+I briefly tried to turn *all* the current sort types into hook functions, and
+have some of them pre-registered, but decided that probably wasn't a good idea.
+That earlier version of the branch is also available for comparison:
+
+*[also withdrawn in favour of sort-package --s]*
+
+>> I wonder if IkiWiki would benefit from the concept of a "sortspec", like a [[ikiwiki/PageSpec]] but dedicated to sorting lists of pages rather than defining lists of pages? Rather than defining a sort-hook, define a SortSpec class, and enable people to add their own sort methods as functions defined inside that class, similarly to the way they can add their own pagespec definitions. --[[KathrynAndersen]]
+
+>>> [[!template id=gitbranch branch=smcv/ready/sort-package author="[[Simon_McVittie|smcv]]"]]
+>>> I'd be inclined to think that's overkill, but it wasn't very hard to
+>>> implement, and in a way is more elegant. I set it up so sort mechanisms
+>>> share the `IkiWiki::PageSpec` package, but with a `cmp_` prefix. Gitweb:
+>>> <http://git.pseudorandom.co.uk/smcv/ikiwiki.git?a=shortlog;h=refs/heads/sort-package>
+
+>>>> I agree it seems more elegant, so I have focused on it.
+>>>>
+>>>> I don't know about reusing `IkiWiki::PageSpec` for this.
+>>>> --[[Joey]]
+
+>>>>> Fair enough, `IkiWiki::SortSpec::cmp_foo` would be just
+>>>>> as easy, or `IkiWiki::Sorting::cmp_foo` if you don't like
+>>>>> introducing "sort spec" in the API. I took a cue from
+>>>>> [[ikiwiki/pagespec/sorting]] being a subpage of
+>>>>> [[ikiwiki/pagespec]], and decided that yes, sorting is
+>>>>> a bit like a pagespec :-) Which name would you prefer? --s
+
+>>>>>> `SortSpec` --[[Joey]]
+
+>>>>>>> [[Done]]. --s
+
+>>>> I would be inclined to drop the `check_` stuff. --[[Joey]]
+
+>>>>> It basically exists to support `title_natural`, to avoid
+>>>>> firing up the whole import mechanism on every `cmp`
+>>>>> (although I suppose that could just be a call to a
+>>>>> memoized helper function). It also lets sort specs that
+>>>>> *must* have a parameter, like
+>>>>> [[field|plugins/contrib/field/discussion]], fail early
+>>>>> (again, not so valuable).
+>>>>>
+>>>>>> AFAIK, `use foo` has very low overhead when the module is already
+>>>>>> loaded. There could be some evaluation overhead in `eval q{use foo}`,
+>>>>>> if so it would be worth addressing across the whole codebase.
+>>>>>> --[[Joey]]
+>>>>>>
+>>>>>>> check_cmp_foo now dropped. --s
+>>>>>
+>>>>> The former function could be achieved at a small
+>>>>> compatibility cost by putting `title_natural` in a new
+>>>>> `sortnatural` plugin (that fails to load if you don't
+>>>>> have `title_natural`), if you'd prefer - that's what would
+>>>>> have happened if `title_natural` was written after this
+>>>>> code had been merged, I suspect. Would you prefer this? --s
+
+>>>>>> Yes! (Assuming it does not make sense to support
+>>>>>> natural order sort of other keys than the title, at least..)
+>>>>>> --[[Joey]]
+
+>>>>>>> Done. I added some NEWS.Debian for it, too. --s
+
+>>>> Wouldn't it make sense to have `meta(title)` instead
+>>>> of `meta_title`? --[[Joey]]
+
+>>>>> Yes, you're right. I added parameters to support `field`,
+>>>>> and didn't think about making `meta` use them too.
+>>>>> However, `title` does need a special case to make it
+>>>>> default to the basename instead of the empty string.
+>>>>>
+>>>>> Another special case for `title` is to use `titlesort`
+>>>>> first (the name `titlesort` is derived from Ogg/FLAC
+>>>>> tags, which can have `titlesort` and `artistsort`).
+>>>>> I could easily extend that to other metas, though;
+>>>>> in fact, for e.g. book lists it would be nice for
+>>>>> `field(bookauthor)` to behave similarly, so you can
+>>>>> display "Douglas Adams" but sort by "Adams, Douglas".
+>>>>>
+>>>>> `meta_title` is also meant to be a prototype of how
+>>>>> `sort=title` could behave in 4.0 or something - sorting
+>>>>> by page name (which usually sorts in approximately the
+>>>>> same place as the meta-title, but occasionally not), while
+>>>>> displaying meta-titles, does look quite odd. --s
+
+>>>>>> Agreed. --[[Joey]]
+
+>>>>>>> I've implemented meta(title). meta(author) also has the
+>>>>>>> `sortas` special case; meta(updated) and meta(date)
+>>>>>>> should also work how you'd expect them to (but they're
+>>>>>>> earliest-first, unlike age). --s
+
+>>>> As I read the regexp in `cmpspec_translate`, the "command"
+>>>> is required to have params. They should be optional,
+>>>> to match the documentation and because most sort methods
+>>>> do not need parameters. --[[Joey]]
+
+>>>>> No, `$2` is either `\w+\([^\)]*\)` or `[^\s]+` (with the
+>>>>> latter causing an error later if it doesn't also match `\w+`).
+>>>>> This branch doesn't add any parameterized sort methods,
+>>>>> in fact, although I did provide one on
+>>>>> [[field's_discussion_page|plugins/contrib/report/discussion]]. --s
+
+>>>> I wonder if it would make sense to add some combining keywords, so
+>>>> a sortspec reads like `sort="age then ascending title"`
+>>>> In a way, this reduces the amount of syntax that needs to be learned.
+>>>> I like the "then" (and it could allow other operations than
+>>>> simple combination, if any others make sense). Not so sure about the
+>>>> "ascending", which could be "reverse" instead, but "descending age" and
+>>>> "ascending age" both seem useful to be able to explicitly specify.
+>>>> --[[Joey]]
+
+>>>>> Perhaps. I do like the simplicity of [[KathrynAndersen]]'s syntax
+>>>>> from [[plugins/contrib/report]] (which I copied verbatim, except for
+>>>>> turning sort-by-`field` into a parameterized spec).
+>>>>>
+>>>>> If we're getting into English-like (or at least SQL-like) queries,
+>>>>> it might make sense to change the signature of the hook function
+>>>>> so it's a function to return a key, e.g.
+>>>>> `sub key_age { return -%pagemtime{$_[0]) }`. Then we could sort like
+>>>>> this:
+>>>>>
+>>>>> field(artistsort) or field(artist) or constant(Various Artists) then meta(titlesort) or meta(title) or title
+>>>>>
+>>>>> with "or" binding more closely than "then". Does this seem valuable?
+>>>>> I think the implementation would be somewhat more difficult. and
+>>>>> it's probably getting too complicated to be worthwhile, though?
+>>>>> (The keys that actually benefit from this could just
+>>>>> have smarter cmp functions, I think.)
+>>>>>
+>>>>> If the hooks return keys rather than cmp results, then we could even
+>>>>> have "lowercase" as an adjective used like "ascending"... maybe.
+>>>>> However, there are two types of adjective here: "lowercase"
+>>>>> really applies to the keys, whereas "ascending" applies to the "cmp"
+>>>>> result. Again, I think this is getting too complex, and could just
+>>>>> be solved with smarter cmp functions.
+>>>>>
+>>>>>> I agree. (Also, I think returning keys may make it harder to write
+>>>>>> smarter cmp functions.) --[[Joey]]
+>>>>>
+>>>>> Unfortunately, `sort="ascending mtime"` actually sorts by *descending*
+>>>>> timestamp (but `sort=age` is fine, because `age` could be defined as
+>>>>> now minus `ctime`). `sort=freshness` isn't right either, because
+>>>>> "sort by freshness" seems as though it ought to mean freshest first,
+>>>>> but "sort by ascending freshness" means put the least fresh first. If
+>>>>> we have ascending and descending keywords which are optional, I don't
+>>>>> think we really want different sort types to have different default
+>>>>> directions - it seems clearer to have `ascending` always be a no-op,
+>>>>> and `descending` always negate.
+>>>>>
+>>>>>> I think you've convinced me that ascending/descending impose too
+>>>>>> much semantics on it, so "-" is better. --[[Joey]]
+
+>>>>>>> I've kept the semantics from `report` as-is, then:
+>>>>>>> e.g. `sort="age -title"`. --s
+
+>>>>> Perhaps we could borrow from `meta updated` and use `update_age`?
+>>>>> `updateage` would perhaps be a more normal IkiWiki style - but that
+>>>>> makes me think that updateage is a quantity analogous to tonnage or
+>>>>> voltage, with more or less recently updated pages being said to have
+>>>>> more or less updateage. I don't know whether that's good or bad :-)
+>>>>>
+>>>>> I'm sure there's a much better word, but I can't see it. Do you have
+>>>>> a better idea? --s
+
+[Regarding the `meta title=foo sort=bar` special case]
+
+> I feel it sould be clearer to call that "sortas", since "sort=" is used
+> to specify a sort method in other directives. --[[Joey]]
+>> Done. --[[smcv]]
+
+## speed
+
+I notice the implementation does not use the magic `$a` and `$b` globals.
+That nasty perl optimisation is still worthwhile:
+
+    perl -e 'use warnings; use strict; use Benchmark; sub a { $a <=> $b } sub b ($$) { $_[0] <=> $_[1] }; my @list=reverse(1..9999); timethese(10000, {a => sub {my @f=sort a @list}, b => sub {my @f=sort b @list}, c => sub {my @f=sort { b($a,$b) } @list}})'
+ Benchmark: timing 10000 iterations of a, b, c...
+ a: 80 wallclock secs (76.74 usr + 0.05 sys = 76.79 CPU) @ 130.23/s (n=10000)
+ b: 112 wallclock secs (106.14 usr + 0.20 sys = 106.34 CPU) @ 94.04/s (n=10000)
+ c: 330 wallclock secs (320.25 usr + 0.17 sys = 320.42 CPU) @ 31.21/s (n=10000)
+
+Unfortunately, I think that c is closest to the new implementation.
+--[[Joey]]
+
+> Unfortunately, `$a` isn't always `$main::a` - it's `$Package::a` where
+> `Package` is the call site of the sort call. This was a showstopper when
+> `sort` was a hook implemented in many packages, but now that it's a
+> `SortSpec`, I may be able to fix this by putting a `sort` wrapper in the
+> `SortSpec` namespace, so it's like this:
+>
+> sub sort ($@)
+> {
+> my $cmp = shift;
+> return sort $cmp @_;
+> }
+>
+> which would mean that the comparison used `$IkiWiki::SortSpec::a`.
+> --s
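+
+A runnable reduction of that idea (independent of ikiwiki itself):
+because the `sort` is invoked inside the `IkiWiki::SortSpec` package,
+every comparison sub defined there reads `$IkiWiki::SortSpec::a` and
+`$IkiWiki::SortSpec::b`, no matter which package the caller lives in.
+
```perl
use strict;
use warnings;

package IkiWiki::SortSpec;

# Comparison functions live here and read this package's $a/$b.
sub cmp_title { $a cmp $b }

# The wrapper: sort runs in this package, so the pair it aliases is
# $IkiWiki::SortSpec::a and $IkiWiki::SortSpec::b.
sub sort_pages {
    my $cmp = shift;
    return sort $cmp @_;
}

package main;

my @sorted = IkiWiki::SortSpec::sort_pages(\&IkiWiki::SortSpec::cmp_title, qw(c a b));
```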
+
+>> I've now done this. On a wiki with many [[plugins/contrib/album]]s
+>> (a full rebuild takes half an hour!), I tested a refresh after
+>> `touch tags/*.mdwn` (my tag pages contain inlines of the form
+>> `tagged(foo)` sorted by date, so they exercise sorting).
+>> I also tried removing sorting from `pagespec_match_list`
+>> altogether, as an upper bound for how fast we can possibly make it.
+>>
+>> * `master` at branch point: 63.72user 0.29system
+>> * `master` at branch point: 63.91user 0.37system
+>> * my branch, with `@_`: 65.28user 0.29system
+>> * my branch, with `@_`: 65.21user 0.28system
+>> * my branch, with `$a`: 64.09user 0.28system
+>> * my branch, with `$a`: 63.83user 0.36system
+>> * not sorted at all: 58.99user 0.29system
+>> * not sorted at all: 58.92user 0.29system
+>>
+>> --s
+
+> I do notice that `pagespec_match_list` performs the sort before the
+> filter by pagespec. Is this a deliberate design choice, or
+> coincidence? I can see that when `limit` is used, this could be
+> used to only run the pagespec match function until `limit` pages
+> have been selected, but the cost is that every page in the wiki
+> is sorted. Or, it might be useful to do the filtering first, then
+> sort the sub-list thus produced, then finally apply the limit? --s
+
+>> Yes, it was deliberate, pagespec matching can be expensive enough that
+>> needing to sort a lot of pages seems likely to be less work. (I don't
+>> remember what benchmarking was done though.) --[[Joey]]
+
+>>> We discussed this on IRC and Joey pointed out that this also affects
+>>> dependency calculation, so I'm not going to get into this now... --s
+
+Joey pointed out on IRC that the `titlesort` feature duplicates all the
+meta titles. I did that in order to sort by the unescaped version, but
+I've now changed the branch to only store that if it makes a difference.
+--s
+
+## Documentation from sort-package branch
+
+### advanced sort orders (conditionally added to [[ikiwiki/pagespec/sorting]])
+
+* `title_natural` - Orders by title, but numbers in the title are treated
+  as such ("1 2 9 10 20" instead of "1 10 2 20 9").
+* `meta(title)` - Order according to the `\[[!meta title="foo" sortas="bar"]]`
+ or `\[[!meta title="foo"]]` [[ikiwiki/directive]], or the page name if no
+ full title was set. `meta(author)`, `meta(date)`, `meta(updated)`, etc.
+ also work.
+
+### Multiple sort orders (added to [[ikiwiki/pagespec/sorting]])
+
+In addition, you can combine several sort orders and/or reverse the order of
+sorting, with a string like `age -title` (which would sort by age, then by
+title in reverse order if two pages have the same age).
+
+### meta sortas parameter (added to [[ikiwiki/directive/meta]])
+
+[in title]
+
+An optional `sortas` parameter will be used preferentially when
+[[ikiwiki/pagespec/sorting]] by `meta(title)`:
+
+    \[[!meta title="The Beatles" sortas="Beatles, The"]]
+
+    \[[!meta title="David Bowie" sortas="Bowie, David"]]
+
+[in author]
+
+An optional `sortas` parameter will be used preferentially when
+[[ikiwiki/pagespec/sorting]] by `meta(author)`:
+
+ \[[!meta author="Joey Hess" sortas="Hess, Joey"]]
+
+### Sorting plugins (added to [[plugins/write]])
+
+Similarly, it's possible to write plugins that add new functions as
+[[ikiwiki/pagespec/sorting]] methods. To achieve this, add a function to
+the IkiWiki::SortSpec package named `cmp_foo`, which will be used when sorting
+by `foo` or `foo(...)` is requested.
+
+The names of pages to be compared are in the global variables `$a` and `$b`
+in the IkiWiki::SortSpec package. The function should return the same thing
+as Perl's `cmp` and `<=>` operators: negative if `$a` is less than `$b`,
+positive if `$a` is greater, or zero if they are considered equal. It may
+also raise an error using `error`, for instance if it needs a parameter but
+one isn't provided.
+
+The function will also be passed one or more parameters. The first is
+`undef` if invoked as `foo`, or the parameter `"bar"` if invoked as `foo(bar)`;
+it may also be passed additional named parameters.
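+As an illustrative sketch (a hypothetical `namelength` sort method, not part
+of ikiwiki), such a plugin function could look like:
+
+    package IkiWiki::SortSpec;
+
+    # Usable as sort="namelength": compares the page names in $a and $b
+    # by length, returning a negative, zero or positive value like <=>.
+    sub cmp_namelength {
+        my $param = shift; # undef for namelength, "x" for namelength(x)
+        return length($a) <=> length($b);
+    }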
diff --git a/doc/todo/allow_site-wide_meta_definitions.mdwn b/doc/todo/allow_site-wide_meta_definitions.mdwn
new file mode 100644
index 000000000..f548f1a5b
--- /dev/null
+++ b/doc/todo/allow_site-wide_meta_definitions.mdwn
@@ -0,0 +1,169 @@
+[[!tag plugins/meta patch]]
+[[!template id=gitbranch branch=jon/defaultmeta author="[[Jon]]"]]
+
+I'd like to define [[plugins/meta]] values to apply across all pages
+site-wide unless the pages define their own: default values for meta
+definitions essentially.
+
+ <snip old patch, see below for latest>
+
+-- [[Jon]]
+
+> This doesn't support multiple-argument meta directives like
+> `link=x rel=y`, or meta directives with special side-effects like
+> `updated`.
+>
+> The first could be solved (if you care) by a syntax like this:
+>
+> meta_defaults => [
+> { copyright => "© me" },
+> { link => "about:blank", rel => "silly", },
+> ]
+>
+> The second could perhaps be solved by invoking `meta::preprocess` from within
+> `scan` (which might be a simplification anyway), although this is complicated
+> by the fact that some (but not all!) meta headers are idempotent.
+>
+> --[[smcv]]
+
+>> Thanks for your comment. I've revised the patch to use the config syntax
+>> you suggest. I need to perform some more testing to make sure I've
+>> addressed the issues you highlight.
+>>
+>> I had to patch part of IkiWiki core, the merge routine in Setup, because
+>> the use of `possibly_foolish_untaint` was causing the hashrefs at the deep
+>> end of the data structure to be converted into strings. The specific change
+>> I've made may not be acceptable, though -- I'd appreciate someone providing
+>> some feedback on that hunk!
+
+>>> Well, re that hunk, taint checking is currently disabled, but
+>>> if the perl bug that disallows it is fixed and it is turned back on,
+>>> the hash values will remain tainted, which will probably lead to
+>>> problems.
+>>>
+>>> I'm also leery of using such a complex data structure in config.
+>>> The websetup plugin would be hard pressed to provide a UI for such a
+>>> data structure. (It lacks even UI for a single hash ref yet, let alone
+>>> a list.)
+>>>
+>>> Also, it seems sorta wrong to have two so very different syntaxes to
+>>> represent the same meta data. A user without a lot of experience will
+>>> be hard pressed to map from a directive to this in the setup file.
+>>>
+>>> All of which leads me to think the setup file could just contain
+>>> a text that could hold meta directives. Which generalizes really to
+>>> a text that contains any directives, and is, perhaps appended to the
+>>> top of every page. Which nearly generalizes to the sidebar plugin,
+>>> or perhaps something more general than that...
+>>>
+>>> However, excessive generalization is the root of all evil, so
+>>> I'm not necessarily saying that's a good idea. Indeed, my memory
+>>> concerns below invalidate this idea pretty well. --[[Joey]]
+
+ <snip old patch>
+
+-- [[Jon]]
+
+>> Ok, I've had a bit of a think about this. There are currently 15 supported
+>> meta fields. Of these: title, licence, copyright, author, authorurl,
+>> and robots might make sense to define globally and override on a per-page
+>> basis.
+>>
+>> Less so, description (due to its impact on map); openid (why would
+>> someone want more than one URI to act as an openid endpoint to the same
+>> place?); updated. I can almost see why someone might want to set a global
+>> updated value. Almost.
+>>
+>> Not useful are permalink, date, stylesheet (you already have a global
+>> stylesheet), link, redir, and guid.
+>>
+>> In other words, the limitations of my first patch that [[smcv]] outlined
+>> are only relevant to defined fields that you wouldn't want to specify a
+>> global default for anyway.
+>>
+>>> I generally agree with this. It is *possible* that meta would have a new
+>>> field added, that takes parameters and make sense to use globally.
+>>> (Indeed, this later happened to some extent with the sortas parameters
+>>> being added to some metas.)
+>>> --[[Joey]]
+>>
+>> Due to this, and the added complexity of the second patch (having to adjust
+>> `IkiWiki/Setup.pm`), I think the first patch makes more sense. I've thus
+>> reverted to it here.
+>>
+>> Is this merge-worthy?
+
+ <snip old patch>
+
+-- [[Jon]]
+
+>>> Merry Christmas/festive season/happy new year folks. I've been away from
+>>> ikiwiki for the break, and now I've returned to watching recentchanges.
+>>> Hopefully I'll be back in the mix soon, too. In the meantime, Joey, have
+>>> you had a chance to look at this yet? -- [[Jon]]
+
+>>>> Ping :) Hi. [[Joey]], would you consider this patch for the next
+>>>> ikiwiki release? -- [[Jon]]
+
+>>> For this to work with websetup and --dumpsetup, it needs to define the
+>>> `meta_*` settings in the getsetup function.
+>>>>
+>>>> I think this will be problematic with the current implementation of this
+>>>> patch. The datatype here is an array of hash references, with each hash
+>>>> having a variable (and arbitrary) number of key/value pairs. I can't
+>>>> think of an intuitive way of implementing a way of editing such a
+>>>> datatype in the web interface, let alone registering the option in
+>>>> getsetup.
+>>>>
+>>>> Perhaps a limited set of defined meta values could be exposed via
+>>>> websetup (the obvious ones: author, copyright, license, etc.) -- [[Jon]]
+>>>
+>>> I also have some concerns about both these patches, since both throw
+>>> a lot of redundant data at meta, which then stores it in a very redundant
+>>> way. Specifically, meta populates a per-page `%metaheaders` hash
+>>> as well as storing per-page metadata in `%pagestate`. So, if you have
+>>> a wiki with 10 thousand pages, and you add a 1k site-wide license text,
+>>> that will bloat the memory usage of ikiwiki by in excess of 2
+>>> megabytes. It will also cause ikiwiki to write a similar amount more data
+>>> to its state file which has to be loaded back in each
+>>> run.
+>>>
+>>> Seems that this could be managed much more efficiently by having
+>>> meta special-case the site-wide settings, not store them in these
+>>> per-page data structures, and just make them be used if no per-page
+>>> metadata of the given type is present. --[[Joey]]
+>>>>
+>>>> That should be easy enough to do. I will work on a patch. -- [[Jon]]
+>>>>> Hi — I've written a new patch which I hope addresses the concerns raised
+>>>>> with the previous ones. The new approach is to hard-code in `scan()`
+>>>>> which of the meta types are supported in the setup file. If one is
+>>>>> defined, then `scan()` calls `preprocess()`, as [[smcv]] suggested,
+>>>>> rather than stuffing redundant data into ikiwiki's data structures.
+>>>>>
+>>>>> Two types supported in the setup file have optional arguments: `author`
+>>>>> and `title`. These are supported by having special-cased setup keys
+>>>>> `meta_author_sortas` and `meta_title_sortas`. Future expansion of the
+>>>>> number of supported types, or addition of arguments to existing ones,
+>>>>> can similarly be special-cased.
+>>>>>
+>>>>> The setup data structure is no longer complicated with an
+>>>>> array-of-hashes, which means this is suitable for exposing via websetup.
+>>>>> `getsetup()` has been adjusted accordingly.
+>>>>>
+>>>>> The patch can be found at the git branch described above.
+>>>>> — [[Jon]]
+
+>>>>>> I wish I could take pity on you and just merge this, but
+>>>>>> AFAICS it still suffers from the memory bloat described above.
+>>>>>> Specifically, when `scan` calls `preprocess`, it
+>>>>>> stores the metadata in `%pagestate` etc. --[[Joey]]
+
+>>>>>>> No pity required — but whoops, yes, that was a bit of a mistake
+>>>>>>> ☺ I guess I'll have to split the current `preprocess` in half.
+>>>>>>> — [[Jon]]
+
+>>>>>>>> I've been taking another look at this today, as I'm very keen to
+>>>>>>>> close various open loops of mine in IkiWiki to move on and do some
+>>>>>>>> other stuff. However, I'm not actually *using* this at the moment,
+>>>>>>>> so whilst I think it's a good idea, I can't really motivate myself
+>>>>>>>> to fix it anymore. I guess for now, this should just rot. — [[Jon]]
diff --git a/doc/todo/allow_wiki_syntax_in_commit_messages.mdwn b/doc/todo/allow_wiki_syntax_in_commit_messages.mdwn
new file mode 100644
index 000000000..97691bdc3
--- /dev/null
+++ b/doc/todo/allow_wiki_syntax_in_commit_messages.mdwn
@@ -0,0 +1,21 @@
+Commit messages should allow wiki syntax, and RecentChanges should format them
+accordingly.
+
+--[[JoshTriplett]]
+
+That's a neat idea! It would probably have to be only the simpler bits,
+without preprocessor directives -- wouldn't want a commit message inlining
+a whole page into RecentChanges. Of course, it could only use _one_ of the
+available markups, i.e. the default markdown. --[[Joey]]
+
+To go along with this, the preview should show the formatted commit message.
+--[[JoshTriplett]]
+
+This is really easy to do now, but it would have to be limited to applying
+markdown formatting (or whatever formatter is default I suppose) to the
+content, and *not* to expanding any WikiLinks or preprocessor directives.
+Especially with the new static RecentChanges, expanding even wikilinks
+would be pretty tricky to do. Applying markdown formatting seems like a
+reasonable thing; it would make commit messages that have the form of a
+bulleted list be marked up nicely, and would also handle _emphasised_
+words etc, and even http links. --[[Joey]]
diff --git a/doc/todo/anon_push_of_comments.mdwn b/doc/todo/anon_push_of_comments.mdwn
new file mode 100644
index 000000000..b472ea13f
--- /dev/null
+++ b/doc/todo/anon_push_of_comments.mdwn
@@ -0,0 +1,14 @@
+It should be possible to use anonymous git push to post comments
+(created, say, by an ikiwiki-comment program). Currently, that is not
+allowed, because users cannot edit or create internal page files.
+But, comments in allowed locations are an exception to that rule, and
+that exception should be communicated somehow to `IkiWiki::Receive`.
+--[[Joey]]
+
+> Complications include:
+>
+> * Hard to see a way to prevent users from committing a comment that
+> claims to be written by someone else.
+> * `checkcontent` hooks need to be run, but can't accept a comment
+> for later moderation, since it's coming in as part of a commit.
+> Best they could do is reject the commit.
diff --git a/doc/todo/anti-spam_protection.mdwn b/doc/todo/anti-spam_protection.mdwn
new file mode 100644
index 000000000..e39d4c19b
--- /dev/null
+++ b/doc/todo/anti-spam_protection.mdwn
@@ -0,0 +1,30 @@
+The spammers have just found my ikiwiki. I have my main pages locked, but allow open changes to my discussion pages, and in the last few days one page in particular has been overwritten by spam about nine times; each edit was from a different IP address.
+
+<http://adam.shand.net/iki/recentchanges/> (sorry for the funny formatting, I upgraded to the latest version and haven't tracked down what changed yet)
+
+I'll probably lock down my discussion pages to require a login of some sort and hopefully that will slow them down. Is anyone else seeing problems on their wiki?
+
+As far as techniques for reducing spam go, I've found the [MoinMoin technique](http://moinmo.in/HelpOnSpam) of refusing to allow page saves containing [known spam URLs](http://moinmo.in/BadContent), combined with a group-maintained list of URLs, to be fairly effective.
+
+Cheers,
+[[AdamShand]]
+
+> I have yet to hear of any spammer using openid.. --[[Joey]]
+
+
+>> Mh.. well. I know this problem, too. I leave the Discussion pages open with no registration required, so that visitors can easily write a comment on a specific blog entry without needing to register. (This is the same behaviour many blogging engines use.) Maybe it is possible to write a plugin that would scan the submitted text via SpamAssassin or something similar. (Using this combined with known spam URLs would maybe reduce the load on the server if many webpages are getting edited.) If you like this idea, Joey, I might be interested in writing such a plugin after my exams this month and the next. :) -- [[Winnie]]
+
+You might look at the Wikipedia page on "Spam\_in\_blogs" for more ideas. In particular, would it be possible to force a subset of the pages (by regex, chosen to match those pages which are publicly writable) to use rel="nofollow" in all links?
+
+> I just wanted to leave a link here to the [[todo/require_CAPTCHA_to_edit]] plugin patch. Unfortunately that plugin currently interacts badly with the openid plugin. -- [[Will]]
+
+
+---
+
+Ikiwiki now has a checkcontent hook that plugins can use to see content
+that is being entered and check it for spam/whatever.
+
+There is a [[plugins/blogspam]] plugin that uses the blogspam.org service
+to check for common spam signatures. --[[Joey]]
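+A minimal sketch of such a plugin (hypothetical name and spam pattern; see
+the real [[plugins/blogspam]] for a full implementation) might look like:
+
+    package IkiWiki::Plugin::spamfilter;
+
+    use warnings;
+    use strict;
+    use IkiWiki 3.00;
+
+    sub import {
+        hook(type => "checkcontent", id => "spamfilter",
+            call => \&checkcontent);
+    }
+
+    sub checkcontent (@) {
+        my %params = @_;
+        # Returning a message rejects the edit; returning undef allows it.
+        if (defined $params{content} &&
+            $params{content} =~ m{http://spam\.example\.com}) {
+            return gettext("Sorry, but that looks like spam.");
+        }
+        return undef;
+    }
+
+    1;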
+
+[[done]]
diff --git a/doc/todo/apache_404_ErrorDocument_handler.mdwn b/doc/todo/apache_404_ErrorDocument_handler.mdwn
new file mode 100644
index 000000000..4ae1d1a79
--- /dev/null
+++ b/doc/todo/apache_404_ErrorDocument_handler.mdwn
@@ -0,0 +1,25 @@
+Apache's ErrorDocument directive lets you write a CGI script that will be invoked for all 404s.
+IkiWiki could offer one as an optional wrapper; it would do much the same thing that the
+existing recentchanges_link (or [[generic___39__do__61__goto__39___for_CGI]]) does when
+encountering a nonexistent page.
+
+I think it'd probably have to be a separate CGI script because the environment with which
+404 handlers are invoked is somewhat odd, and because it needs to return a 404 status
+(having said that, it might make sense for `recentchanges_link` to return 404 rather than
+200 anyway if the page doesn't exist).
+
+> This turns out to be untrue, as long as the wrapper lets a couple of extra
+> environment variables through. --[[smcv]]
+
+This would give IkiWiki the behaviour of many other wikis, where visiting a page that
+does not yet exist prompts you to create it, without having to invoke the CGI for
+successful requests.
+
+Due to [a well-known MSIE misfeature](http://support.microsoft.com/default.aspx?scid=kb;en-us;Q294807),
+error output needs to be at least 512 bytes long, so some padding might also be required.
+
+Implemented in the 'goto' branch in my git repository. You can see this
+feature in action [on my blog](http://smcv.pseudorandom.co.uk/no/such/page/).
+--[[smcv]]
+
+[[done]]
diff --git a/doc/todo/applydiff_plugin.mdwn b/doc/todo/applydiff_plugin.mdwn
new file mode 100644
index 000000000..d26b0dfe9
--- /dev/null
+++ b/doc/todo/applydiff_plugin.mdwn
@@ -0,0 +1,110 @@
+[[!tag wishlist done]]
+
+[[!toc ]]
+
+Summary
+=======
+
+Allow a user to apply an arbitrary diff, in order to modify a given
+page (or, even better, a given set of pages).
+
+Rationale
+=========
+
+Intensively editing an ikiwiki-powered website can quickly get
+annoying for anybody meeting enough of the following conditions:
+
+* living mainly offline
+* having no commit access to the RCS backing up the site (BTW, please
+ note I can send my ssh public key to anyone who asks for, free of
+ charges)
+* hating web-browsers and despising textareas
+* loving his/her own preferred `$EDITOR`
+
+... and when one meets all of these conditions, she/he is on her/his
+way to going mad. Soon.
+
+Before it's too late, some daring ones dream of the following
+workflow:
+
+1. Go online.
+1. Update local working copy.
+1. Go offline.
+1. `$EDITOR` : write, report, answer, propose
+1. Go online.
+1. Update local working copy (and optionally fix conflicts between
+ local changes and remote ones).
+1. Generate a diff.
+1. Use a web-browser to paste the diffs (or better, upload them into
+ a form) somewhere on the wiki, and click "Apply".
+1. git pull (to reflect locally the fact that the diff has been
+ applied to the remote repo)
+1. Go offline.
+
+(This is for sure a bit theoretical: the ones who dream of this would
+actually insert various steps about branching, merging and rebasing
+random stuff.)
+
+Design
+======
+
+This has to be thought out very carefully, to prevent people from applying
+diffs to pages they are not allowed to edit. Restricting a given diff to
+modifying only *one* page may be easier.
+
+Implementation
+==============
+
+Also see [[joey]]'s idea on [[users/xma/discussion]], to allow (filtered) anonymous push to this wiki's repository.
+
+> Ideally the filtering should apply the same constraints on what's pushed
+> as are applied to web edits. So locked pages can't be changed, etc.
+>
+> That could be accomplished by making the git pre-receive hook be a
+> ikiwiki wrapper. A new `git_receive_wrapper` config setting could cause
+> the wrapper to be generated, with `$config{receive}` set to true.
+>
+> When run that way, ikiwiki would call `rcs_receive`. In the case of git,
+> that would look at the received changes as fed into the hook on stdin,
+> and use `parse_diff_tree` to get a list of the files changed. Then it
+> could determine if the changes were allowed.
+>
+> To do that, it should first look at what unix user received the
+> commit. That could be mapped directly to an ikiwiki user. This would
+> typically be an unprivileged user (that was set up just to allow
+> anonymous pushes), but you might also want to set up
+> separate users who have fewer limits on what they can push. And, of
+> course, pushes from the main user, who owns the wiki, would not be
+> checked at all. So, let's say `$config{usermap}` is a hash, something
+> like `{usera => "wikiusera", userb => "wikiuserb"}`, and pushes from
+> users not in the hash are not checked.
+>
+> Then it seems like it would want to call `check_canedit` to test if an
+> edit to each changed page is allowed. Might also want to call
+> `check_canattach` and `check_canremove` if the attach and remove plugins
+> are enabled. All three expect to be passed a CGI and a CGI::Session
+> object, which is a bit problematic here. So dummy the objects up? (To call
+> `check_canattach` the changed attachment would need to be extracted to a
+> temp file for it to check..)
+>
+> If a change is disallowed, it would print out what was disallowed, and
+> exit nonzero. I think that git then discards the pushed objects (or maybe
+> they remain in the database until `git-gc` .. if so, that could be used
+> to DOS by uploading junk, so need to check this point).
+>
+> Also, I've not verified that the objects have been received already when
+> the pre-receive hook is called. Although the docs seem to say that is the
+> case. --[[Joey]]
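+(For reference, a minimal sketch of the stdin parsing described above; the
+variable names are assumptions, and this is not the actual `rcs_receive`
+code:)
+
+    # A git pre-receive hook gets one line per updated ref on stdin:
+    # "<old-sha1> <new-sha1> <refname>"
+    while (<STDIN>) {
+        chomp;
+        my ($oldrev, $newrev, $refname) = split ' ', $_, 3;
+        # parse_diff_tree() would then list the files changed between
+        # $oldrev and $newrev, and check_canedit() would be called for
+        # the page corresponding to each changed file.
+    }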
+
+>> Update: The git pre-receive hook stuff is written, and seems to work.
+>> I think it makes more sense than using diffs, and so think this todo
+>> could probably be closed.
+>> --[[Joey]]
+
+>>> I agree, closing this. I really prefer this solution to the one I was
+>>> initially proposing.
+>>> Is this pre-receive hook already enabled on ikiwiki.info?
+>>> If not, do you plan to enable it at some point?
+>>> --[[intrigeri]]
+
+>>>> [[news/git_push_to_this_wiki]] gave me the answer. Well done! --[[intrigeri]]
diff --git a/doc/todo/assumes_system_perl.mdwn b/doc/todo/assumes_system_perl.mdwn
new file mode 100644
index 000000000..63ffccf0d
--- /dev/null
+++ b/doc/todo/assumes_system_perl.mdwn
@@ -0,0 +1,20 @@
+ikiwiki 1.45 doesn't work properly for perl installs not in the system path.
+
+ie:
+
+    ~/tools/perl-5.8.8/perl Makefile.PL
+    make
+
+fails, as the 'make' command attempts to use the perl install in PATH, rather than the one ikiwiki is being installed for.
+
+The installed bin/ikiwiki file also refers to /usr/bin/perl rather than the perl it is being installed for.
+
+> I will accept sufficiently nonintrusive patches to make ikiwiki work better on strange systems like
+> yours, but do not plan to work on it myself, since I do not use systems
+> where /usr/bin/perl is not a sane default. --[[Joey]]
+
+> > I've implemented a change that should fix this. For what it's worth this is a
+> > life saver on shared hosting where building your own perl is super effective.
+> > --frioux ([code here](https://github.com/frioux/ikiwiki/tree/use-env-perl))
+
+[[wishlist]]
diff --git a/doc/todo/attachments.mdwn b/doc/todo/attachments.mdwn
new file mode 100644
index 000000000..600c6cf7b
--- /dev/null
+++ b/doc/todo/attachments.mdwn
@@ -0,0 +1,22 @@
+Stuff the [[plugins/attachment]] plugin is currently missing, that might be
+nice to add:
+
+* Add a progress bar for attachment uploads (needs AJAX stuff..)
+* Maybe optimise the "Insert Links" button with javascript, so, if
+ javascript is available, the link is inserted at the current cursor
+ position in the page edit form, without actually reposting the form.
+ (Falling back to the current reposting of the form if javascript is not
+ available of course.)
+* An option to not `rcs_add` new attachments, but just write them to the
+ srcdir. This would allow the admin to review them, and manually
+ add/delete them before they bloat history.
+
+> I'd be inclined to implement that one by writing them to a nominated
+> underlay, I think, rather than having them in the srcdir but not in
+> the VCS. My [[plugins/contrib/album]] plugin could benefit from this
+> functionality, although in that case the photos should probably just
+> stay in the underlay forever (I already use an underlay on my own
+> websites for photos and software releases, which are too big to want
+> them in the VCS permanently.) --[[smcv]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/auto-create_tag_pages_according_to_a_template.mdwn b/doc/todo/auto-create_tag_pages_according_to_a_template.mdwn
new file mode 100644
index 000000000..7b65eba2e
--- /dev/null
+++ b/doc/todo/auto-create_tag_pages_according_to_a_template.mdwn
@@ -0,0 +1,270 @@
+It would be great if I could tell ikiwiki to automatically instantiate pages for each [[tag|/tags]], according to a template, especially when `$tagbase` is set.
+
+Tags are mainly specific to the object to which they’re stuck. However, I often use them the other way around, too: as concepts. And sometimes I’d like to see all pages related to a given concept (“tagged with a given tag”). The only way to do this with ikiwiki is to instantiate a page for each tag and slap a map on it. This is quite tedious and I’d really love to see Ikiwiki do so by default for all tags.
+
+Also see: <http://madduck.net/blog/2008.01.06:new-blog/> and <http://users.itk.ppke.hu/~cstamas/code/ikiwiki/autocreatetagpage/>
+
+[[!tag wishlist plugins/tag patch patch/core]]
+
+I would love to see this as well. -- dato
+
+---
+
+I have created a patch to [[tag.pm|plugins/tag]] to add an option for auto-creating tag pages.
+A new setting, `tag_autocreate`, is used to enable or disable auto-creation of tag pages.
+The new tag file is created during the preprocess phase.
+The new tag file is then compiled during the change phase.
+
+*see git history of this page if you want the patch --[[smcv]]*
+
+This uses a [[template|templates]] called `autotagpage.tmpl`, here is my template file:
+
+ \[[!inline pages="link(<TMPL_VAR TAG>)" archive="yes"]]
+
+
+A quirk I have not figured out is in `sub change`; see my comments in the code.
+I am not sure if that is the best way to handle it.
+
+[[!tag patch]]
+-- Jeremy Schultz <jeremy.schultz@uleth.ca>
+
+No, this doesn't help:
+
+ + # This refresh/saveindex is to fix the Tags link
+ + # With out this additional refresh/saveindex the tag link displays ?tag
+ + IkiWiki::refresh();
+ + IkiWiki::saveindex();
+
+On the second extra pass, it doesn't notice that it has to update the "?"-link. If I run ikiwiki once more, it is updated. I don't know yet how this should be fixed, because I don't know the internals of ikiwiki well enough. Something inhibits detecting the need to update in refresh() in Render.pm; perhaps, this condition:
+
+ if (! $pagemtime{$page}) {
+ ...
+ push @add, $file;
+ ...
+ }
+
+is not satisfied for the newly created tag page. I shall put debug msgs into Render.pm to find out better how it works. --Ivan Z.
+
+---
+
+I've made another attempt at fixing this
+
+The current progress can be found at my [git repository][gitweb] on branch
+`autotag`:
+
+ git://git.liegesta.at/git/ikiwiki
+
+[gitweb]: http://git.liegesta.at/?p=ikiwiki.git;a=shortlog;h=refs/heads/autotag (gitweb for branch autotag)
+
+It's not entirely finished yet, but already quite usable. Testing and comments
+on code quality, implementation details, as well as other patches would be
+appreciated.
+
+Here's what it does right now:
+
+* enabled by setting `tag_autocreate=1` in the configuration.
+* Tag pages will be created in `tagbase` from the template `autotag.tmpl`.
+* Will correctly render all links, and dependencies. Well, AFAIK.
+* When a tag page is deleted it will automatically be recreated from the template. (I
+consider this a feature, not a bug)
+* Requires a rebuild on first use.
+* Adds a function `add_autofile()` to the plugin API, to do all this.
+
+Todo/Bugs:
+
+* Will still create a page even if there's a page other than `$tag` under
+`tagbase` satisfying the tag link. (details? --[[Joey]])
+* Call from `IkiWiki.pm` to `Render.pm`, which adds a module dependency in the
+wrong direction. (fixed --[[Joey]] )
+* Add files to RCS.
+* Unit tests.
+* Proper documentation. (fixed (mostly) --[[Joey]])
+
+--[[David_Riebenbauer]]
+
+> Starting review of this. Some of your commits are to very delicate,
+> optimised, and security-sensitive ground, so I have to look at them very
+> carefully. --[[Joey]]
+
+>> First off, sorry that it took me so damn long to answer. I didn't lose
+>> interest, but it took a while for me to find the time and motivation
+>> to address your suggestions. --[[David_Riebenbauer]]
+
+> * In the refactoring in [f3abeac919c4736429bd3362af6edf51ede8e7fe][],
+> you introduced at least 2 bugs, one a possible security hole.
+> Now one part of the code tests `if ($file)` and the other
+> caller tests `if ($f)`. These two tests both tested `if (! defined $f)`
+> before. Notice that the variable needs to be the untainted variable
+> for both. Also notice that `if ($f)` fails if `$f` contains `0`,
+> which is a very common perl gotcha.
+> * Your refactored code changes `-l $_ || -d _` to `-l $file || -d $file`.
+> The latter makes one more stat system call; note the use of a
+> bare `_` in the first to make perl reuse the stat buffer.
+> * (As a matter of style, could you put a space after the commas in your
+> perl?)
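+(To illustrate the two gotchas above, with hypothetical names:)
+
+    my $f = "0";                  # a valid filename that is false in perl
+    do_something() if $f;         # wrong: never runs for the file "0"
+    do_something() if defined $f; # right: tests whether a value is present
+
+    # -l fills the stat buffer; the bare _ reuses it, avoiding a second
+    # stat system call:
+    my $special = (-l $f || -d _);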
+
+>> The first two points should be addressed in
+>> [da5d29f95f6e693e8c14be1b896cf25cf4fdb3c0][]. And sure, I can add the
+>> spaces. --[[David_Riebenbauer]]
+
+> I'd like to cherry-pick the above commit, once it's in shape, before
+> looking at the rest in detail. So just a few other things that stood out.
+>
+> * Commit [4af4d26582f0c2b915d7102fb4a604b176385748][] seems unnecessary.
+> `srcfile($file, 1)` already is documented to return undef if the
+> file does not exist. (But without the second parameter, it throws
+> an error.)
+
+>> You're right. I must have been confused by some other problem I
+>> introduced then. Reverted. --[[David_Riebenbauer]]
+
+> * Commit [f58f3e1bec41ccf9316f37b014ce0b373c8e49e1][] adds a line
+> that is indented by a space, not a tab.
+
+>> Sorry, that one was reverted anyway. --[[David_Riebenbauer]]
+
+> * Commit [f58f3e1bec41ccf9316f37b014ce0b373c8e49e1][] says that auto-added
+> files will be recreated if the user deletes them. That seems bad.
+> `autoindex` goes to some trouble to not recreate deleted files.
+
+>> I reverted the commit and addressed the issue in
+>> [a358d74bef51dae31332ff27e897fe04834571e6][] and
+>> [981400177d68a279f485727be3f013e68f0bf691][].
+>> --[[David_Riebenbauer]]
+
+>>> This doesn't seem to have all the refinements that autoindex has:
+>>>
+>>> * `autoindex` attaches the record of deletions to the `index` page, which
+>>> is (nearly) guaranteed to exist; this one attaches the record of
+>>> deletions to the deleted page's page state. Won't that tend to result
+>>> in losing the record along with the deleted page?
+
+>>>> This is probably one of the harder things to do, because there are (most of the
+>>>> time) several pages that are responsible for the creation of a single tag page.
+>>>> Of course I could attach the info to all of them.
+
+>>>> With current behaviour I think the information in `%pagestate` is kept around
+>>>> regardless of whether the corresponding page exists or not.
+>>>> --[[David_Riebenbauer]]
+
+>>>>> Sorry, I'll try to be clearer: `autoindex` hard-codes that the index page
+>>>>> of the entire wiki is the one responsible for storing the page state. That
+>>>>> page isn't responsible for the creation of the tag page, it's just an
+>>>>> arbitrary page that's (more or less) guaranteed to exist. --[[smcv]]
+
+>>>>> I don't like that [[plugins/autoindex]] has to do that,
+>>>>> but `%pagestate` values are only stored for pages that exist,
+>>>>> so it was necessary. (Another way to look at this is that
+>>>>> `%pagestate` is not the ideal data structure.) --[[Joey]]
+
+>>>>>> Aha! Having looked at [[plugins/write]] again, it turns out that what this
+>>>>>> feature should really use is `%wikistate`, I think? :-) --[[smcv]]
+
+>>>>>>> Ah, indeed, that came after I wrote autoindex. I've fixed autoindex to
+>>>>>>> use it. --[[Joey]]
+
+>>>>> Ok, now I know what you mean. --[[David_Riebenbauer]]
+
+>>> * `autoindex` forgets that a page was deleted when that page is
+>>> re-created
+
+>>>> Yes, I forgot about that and that is a bug. I'll fix that.
+>>>> --[[David_Riebenbauer]]
+
+>>>>> In my branch, it keeps a list of autofiles that were created,
+>>>>> not deleted. And I think that turns out to be necessary, really.
+>>>>> However, I see no way to clean out that list on deletion and
+>>>>> manual recreation -- it still needs to remember it was once an autofile,
+>>>>> in order to avoid recreating it if it's deleted yet again. --[[Joey]]
+
+>>>>>> Are these really the semantics we want? It seems strange to me
+>>>>>> that this:
+>>>>>>
+>>>>>> * tag a page as foo
+>>>>>> * tags/foo automatically appears
+>>>>>> * delete tags/foo
+>>>>>> * create tags/foo manually
+>>>>>> * delete tags/foo again
+>>>>>> * tags/foo isn't automatically created
+>>>>>>
+>>>>>> isn't the same as this:
+>>>>>>
+>>>>>> * create tags/foo
+>>>>>> * delete tags/foo
+>>>>>> * tag a page as foo
+>>>>>> * tags/foo automatically appears
+>>>>>>
+>>>>>> or even this:
+>>>>>>
+>>>>>> * create tags/foo
+>>>>>> * tag a page as foo
+>>>>>> * delete tags/foo
+>>>>>> * tags/foo automatically appears (?)
+>>>>>>
+>>>>>> --[[smcv]]
+
+>>>>>>> I agree that the last of these is not desired. It could be avoided
+>>>>>>> by extending the list of autofiles to include those that were not
+>>>>>>> created due to the file/page already existing.
+>>>>>>>
+>>>>>>> Hmm, that would fix the previous scenario too. --[[Joey]]
+
+>>> * `autoindex` forgets that a page was deleted when it's no longer needed
+>>> anyway (this may be harder for `autotag`?)
+
+>>>> I don't think so. AFAIK ikiwiki can detect whether there are taglinks to a page
+>>>> anyway, so it should be quite easy. I'll try to implement that too.
+>>>> --[[David_Riebenbauer]]
+
+>>> It'd probably be an interesting test of the core change to port
+>>> `autoindex` to use it? (Adding the file to the RCS would be
+>>> necessary to get parity with `autoindex`.) --[[smcv]]
+
+>>>> Good suggestion. Adding the files to RCS is on my todo list anyway.
+>>>> --[[David_Riebenbauer]]
+
+>>>>> I think it may be better to allow the `add_autofile` caller
+>>>>> to specify if it is added to RCS. In my branch, it can do
+>>>>> so by just making the callback it registers call `rcs_add`;
+>>>>> and I have tag do this. Other plugins might want autofiles
+>>>>> that do not get checked in, conceivably.
+>>>>> --[[Joey]]
+
+> Regarding the call from `IkiWiki.pm` to `Render.pm`, wouldn't this be
+> quite easy to solve by moving `verify_src_file` to IkiWiki.pm? --[[smcv]]
+
+>> True. I'll do that. --[[David_Riebenbauer]]
+>> Fixed in my branch --[[Joey]]
+
+[[!template id=gitbranch branch=origin/autotag author="[[Joey]]"]]
+I've pushed an autotag branch of my own, which refactors
+things a bit and fixes bugs around deletion/recreation.
+I've tested it fairly thoroughly. --[[Joey]]
+
+[f3abeac919c4736429bd3362af6edf51ede8e7fe]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=f3abeac919c4736429bd3362af6edf51ede8e7fe (commitdiff for f3abeac919c4736429bd3362af6edf51ede8e7fe)
+[4af4d26582f0c2b915d7102fb4a604b176385748]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=4af4d26582f0c2b915d7102fb4a604b176385748 (commitdiff for 4af4d26582f0c2b915d7102fb4a604b176385748)
+[f58f3e1bec41ccf9316f37b014ce0b373c8e49e1]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=f58f3e1bec41ccf9316f37b014ce0b373c8e49e1 (commitdiff for f58f3e1bec41ccf9316f37b014ce0b373c8e49e1)
+[da5d29f95f6e693e8c14be1b896cf25cf4fdb3c0]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=da5d29f95f6e693e8c14be1b896cf25cf4fdb3c0 (commitdiff for da5d29f95f6e693e8c14be1b896cf25cf4fdb3c0)
+[a358d74bef51dae31332ff27e897fe04834571e6]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=a358d74bef51dae31332ff27e897fe04834571e6 (commitdiff for a358d74bef51dae31332ff27e897fe04834571e6)
+[981400177d68a279f485727be3f013e68f0bf691]: http://git.liegesta.at/?p=ikiwiki.git;a=commitdiff;h=981400177d68a279f485727be3f013e68f0bf691 (commitdiff for 981400177d68a279f485727be3f013e68f0bf691)
+
+-------------------
+
+Even if this is already marked as done, I'd like to suggest an alternative
+solution:
+
+Instead of creating a file that gets checked in into the RCS, the source files
+could be left out and the output files be written as long as there is no
+physical source file (think of a virtual underlay). Something similar would be
+required to implement [[todo/alias directive]], which couldn't be easily done
+by writing to the RCS as the page's contents can change depending on which
+other pages claim it as an alias. --[[chrysn]]
+
+I agree with [[chrysn]]. In fact, is there any good reason that the core tag
+plugin doesn't do this? The current usability is horrible, to the point that
+I have gone 2.5 years with Ikiwiki and haven't yet started using tags.
+--[Eric](http://wiki.pdxhub.org/people/eric)
+
+> See [[todo/transient_pages]] for progress on this. --[[smcv]]
+
+[[!tag done]]
diff --git a/doc/todo/auto_getctime_on_fresh_build.mdwn b/doc/todo/auto_getctime_on_fresh_build.mdwn
new file mode 100644
index 000000000..760c56fa1
--- /dev/null
+++ b/doc/todo/auto_getctime_on_fresh_build.mdwn
@@ -0,0 +1,13 @@
+[[!tag wishlist]]
+
+It might be a good idea to enable --gettime when `.ikiwiki` does not
+exist. This way a new checkout of a `srcdir` would automatically get
+ctimes right. (Running --gettime whenever a rebuild is done would be too
+slow.) --[[Joey]]
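+
+As a sketch, the manual equivalent of what this would automate (option
+names have varied between releases; older ones spelled it `--getctime`):
+
+    ikiwiki --setup wiki.setup --gettime --rebuild
+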
+
+Could this be too annoying in some cases, eg, checking out a large wiki
+that needs to get set up right away? --[[Joey]]
+
+> Not for git with the new, optimised --getctime. For other VCSes... well,
+> a pity they're not as fast as git ;), but it is a one-time expense...
+> [[done]] --[[Joey]]
diff --git a/doc/todo/auto_publish_expire.mdwn b/doc/todo/auto_publish_expire.mdwn
new file mode 100644
index 000000000..7a5a17517
--- /dev/null
+++ b/doc/todo/auto_publish_expire.mdwn
@@ -0,0 +1,33 @@
+It could be nice to mark some page such that:
+
+* the page is automatically published on some date (i.e. build, linked, syndicated, inlined/mapped, etc.)
+* the page is automatically unpublished at some other date (i.e. removed)
+
+I know that ikiwiki is a wiki compiler, so something has to refresh the wiki periodically to enforce the rules (a cronjob, for instance). It seems to me that the calendar plugin relies on something similar.
+
+The date for publishing and expiring could be set by using some new directives; an alternative could be to expand the [[plugin/meta]] plugin with [<span/>[!meta date="auto publish date"]] and [<span/>[!meta expires="auto expire date"]].
+
+--[[JeanPrivat]]
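+
+A sketch of how that might look in a page (`date` is an existing
+[[plugin/meta]] field; `expires` is hypothetical, nothing implements it):
+
+    \[[!meta date="2010-06-01"]]     # built, linked, inlined only after this date
+    \[[!meta expires="2010-07-01"]]  # removed from the built wiki after this date
+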
+
+> This is a duplicate, and expansion, of
+> [[todo/tagging_with_a_publication_date]].
+> There, I suggest using a branch to develop
+> prepublication versions of a site, and merge from it
+> when the thing is published.
+>
+> Another approach I've seen used is to keep such pages in a pending/
+> directory, and move them via cron job when their publication time comes.
+> But that requires some familiarity with, and access to, cron.
+>
+> On [[todo/tagging_with_a_publication_date]], I also suggested using meta
+> date to set a page's date into the future,
+> and adding a pagespec that matches only pages with dates in the past,
+> which would allow filtering out the unpublished ones.
+> Sounds like you are thinking along these lines, but possibly using
+> something other than the page's creation or modification date to do it.
+>
+> I do think the general problem with that approach is that you have to be
+> careful to prevent the unpublished pages from leaking out in any
+> inlines, maps, etc. --[[Joey]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/auto_rebuild_on_template_change.mdwn b/doc/todo/auto_rebuild_on_template_change.mdwn
new file mode 100644
index 000000000..ea990b877
--- /dev/null
+++ b/doc/todo/auto_rebuild_on_template_change.mdwn
@@ -0,0 +1,78 @@
+If `page.tmpl` is changed, it would be nice if ikiwiki automatically
+noticed, and rebuilt all pages. If `inlinepage.tmpl` is changed, a rebuild
+of all pages using it in an inline would be stellar.
+
+This would allow setting:
+
+ templatedir => "$srcdir/templates",
+
+.. and then the [[templates]] are managed like other wiki files; and
+like other wiki files, a change to them automatically updates dependent
+pages.
+
+Originally, it made good sense not to have the templatedir inside the wiki.
+Those templates can be used to bypass the htmlscrubber, and you don't want
+just anyone to edit them. But the same can be said of `style.css` and
+`ikiwiki.js`, which *are* in the wiki. We rely on `allowed_attachments`
+being set to secure those to prevent users uploading replacements. And we
+assume that users who can directly (non-anon) commit *can* edit them, and
+that's ok.
+
+So, perhaps the easiest way to solve this [[wishlist]] would be to
+make templatedir *default* to "$srcdir/templates/", and make ikiwiki
+register dependencies on `page.tmpl`, `inlinepage.tmpl`, etc, as they're
+used. Although, having every page declare an explicit dep on `page.tmpl`
+is perhaps a bit much; might be better to implement a special case for that
+one. Also, having the templates be copied to `destdir` is not desirable.
+In a sense, these templates would be like internal pages, except raw
+files rather than wiki pages.
+
+The risk is that a site might have `allowed_attachments` set to
+`templates/*` or `*.tmpl`, or something like that. I think such a configuration
+is the *only* risk, and it's unlikely enough that a NEWS warning should
+suffice.
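+
+For example, the kind of setup that would become risky (hypothetical
+fragment; the pagespec is the dangerous part):
+
+    # DANGEROUS once templatedir is inside the wiki: uploads matching this
+    # pagespec could replace templates and bypass the htmlscrubber
+    allowed_attachments => 'templates/* or *.tmpl',
+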
+
+(This would also help to clear up the tricky distinction between
+wikitemplates and in-wiki templates.)
+
+Note also that when using templates from "$srcdir/templates/", `no_includes`
+needs to be set. Currently this is done by the two plugins that use
+such templates, while includes are allowed in `templatedir`.
+
+Have started working on this.
+[[!template id=gitbranch branch=origin/templatemove author="[[Joey]]"]]
+
+> But would this require that templates be parseable as wiki pages? Because that would be a nuisance. --[[KathrynAndersen]]
+
+>> It would be better for them not to be rendered separately at all.
+>> --[[Joey]]
+
+>>> I don't follow you. --[[KathrynAndersen]]
+
+>>>> If they don't render to output files, they clearly don't
+>>>> need to be treated as wiki pages. (They need to be treated
+>>>> as raw files anyway, because you don't want random users editing them
+>>>> in the online editor.) --[[Joey]]
+
+>>>>> Just to be clear, the raw files would not be copied across to the output
+>>>>> directory? -- [[Jon]]
+
+>>>>>> Without modifying ikiwiki, they'd be copied to the output directory as
+>>>>>> (e.g.) http://ikiwiki.info/templates/inlinepage.tmpl; to not copy them,
+>>>>>> it'd either be necessary to make them be internal pages
+>>>>>> (templates/inlinepage._tmpl) or special-case them in some other way.
+>>>>>> --[[smcv]]
+
+>>>>>>> In my branch, I left in support for the templatedir, and also
+>>>>>>> /usr/share/ikiwiki/templates. So, users do not have to put their
+>>>>>>> custom templates in templates/ in the wiki. If they do,
+>>>>>>> the templates are copied to the destdir like other non-wiki page files
+>>>>>>> are. The templates are not wiki pages, except those used by a few
+>>>>>>> things like the [[plugins/template]] plugin.
+>>>>>>>
+>>>>>>> That seems acceptable, since users probably don't need to modify
+>>>>>>> many templates, so the clutter is small. (Especially when
+>>>>>>> compared to the other clutter the basewiki always puts in destdir.)
+>>>>>>> This could be revisted later. --[[Joey]]
+
+[[done]]
diff --git a/doc/todo/autoindex_should_use_add__95__autofile.mdwn b/doc/todo/autoindex_should_use_add__95__autofile.mdwn
new file mode 100644
index 000000000..f3fb24c16
--- /dev/null
+++ b/doc/todo/autoindex_should_use_add__95__autofile.mdwn
@@ -0,0 +1,120 @@
+`add_autofile` is a generic version of [[plugins/autoindex]]'s code,
+so the latter should probably use the former. --[[smcv]]
+
+> [[merged|done]] --[[Joey]]
+
+----
+
+[[!template id=gitbranch branch=smcv/ready/autoindex-autofile author="[[smcv]]"]]
+
+I'm having trouble fixing this:
+
+ # FIXME: some of this is probably redundant with add_autofile now, and
+ # the rest should perhaps be added to the autofile machinery
+
+By "a generic version of" above, it seems I mean "almost, but not
+quite, entirely unlike".
+
+> As long as it's not Tea. ;) --[[Joey]]
+
+I tried digging through the git history for the
+reasoning behind the autofile and autoindex implementations, but now I'm
+mostly confused.
+
+## autofile
+
+The autofile machinery records a list of every file that has ever been proposed
+as an autofile: for instance, the tag plugin has a list of every tag that
+has ever been named in a \[[!tag]] or \[[!taglink]], even if no file was
+actually needed (e.g. because it already existed). Checks for files that
+already exist (or whatever) are deferred until after this list has been
+updated, and files in this list are never auto-created again unless the wiki
+is rebuilt.
+
+This avoids re-creating the tag `create-del` in this situation, which is
+the third one that I noted on
+[[todo/auto-create tag pages according to a template]]:
+
+* create tags/create-del manually
+* tag a page as create-del
+* delete tags/create-del
+
+and also avoids re-creating `auto-del` in this similar situation (which I
+think is probably the most important one to get right):
+
+* tag a page as auto-del, which is created automatically
+* delete tags/auto-del
+
+I think both of these are desirable.
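+
+As a sketch, the recording rule described above (a hypothetical
+simplification, not the real code; only `srcfile($file, 1)` returning
+undef for missing files is the documented API):
+
+    # every proposed autofile is remembered forever, even if nothing
+    # was created for it this time
+    if (! exists $autofiles_ever_proposed{$file}) {
+        $autofiles_ever_proposed{$file} = 1;
+        # only first-time proposals are candidates; the existence
+        # check happens after the list has been updated
+        $generator->() unless defined srcfile($file, 1);
+    }
+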
+
+However, this infrastructure also results in the tag page not being
+re-created in either of these situations (the first and second that I noted
+on the other page):
+
+* tag a page as auto-del-create-del, which is created automatically
+* delete tags/auto-del-create-del
+* create tags/auto-del-create-del manually
+* delete tags/auto-del-create-del again
+
+or
+
+* create tags/create-del-auto
+* delete tags/create-del-auto
+* tag a page as create-del-auto
+
+I'm less sure that these shouldn't create the tag page: we deleted the
+manually-created version, but that doesn't necessarily mean we don't want
+*something* to exist.
+
+> That could be argued, but it's a very DWIM thing. Probably best to keep
+> the behavior simple and predictable, so one only needs to remember that
+> when a page is deleted, nothing will ever re-create it behind ones back.
+> --[[Joey]]
+
+>> Fair enough, I'll make autoindex do that. --s
+
+## autoindex
+
+The autoindex machinery records a more complex set. Items are added to the
+set when they are deleted, but would otherwise have been added as an autoindex
+(don't exist, do have children (by which I mean subpages or attachments),
+and are a directory in the srcdir). They're removed if this particular run
+wouldn't have added them as an autoindex (they exist, or don't have children).
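+
+As a sketch of those two rules (a simplification; `has_children` stands
+in for the real subpage/attachment check):
+
+    if (! exists $pagesources{$dir} && has_children($dir)
+        && -d "$config{srcdir}/$dir") {
+        # would have been auto-created, so it must have been deleted
+        $deleted{$dir} = 1;
+    }
+    else {
+        # exists again, or no longer has children: forget the deletion
+        delete $deleted{$dir};
+    }
+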
+
+Here's what happens in situations mirroring those above.
+
+The "create-del" case still doesn't create the page:
+
+* create create-del manually
+* create create-del/child
+* delete create-del
+* it's added to `%deleted` and not re-created
+
+Neither does the "auto-del" case:
+
+* create auto-del/child, resulting in auto-del being created automatically
+* delete auto-del
+* it's added to `%deleted` and not re-created
+
+However, unlike the generic autofile infrastructure, `autoindex` forgets
+that it shouldn't re-create the deleted page in the latter two situations:
+
+* create auto-del-create-del/child, resulting in auto-del-create-del being
+ created automatically
+* delete auto-del-create-del; it's added to `%deleted` and not re-created
+* create auto-del-create-del manually; it's removed from `%deleted`
+* delete auto-del-create-del again (it's re-created)
+
+and
+
+* create create-del-auto
+* delete create-del-auto; it's not added to `%deleted` because there's no
+ child that would cause it to exist
+* create create-del-auto/child
+
+> I doubt there is any good reason for this behavior. These are probably
+> bugs. --[[Joey]]
+
+>> OK, I believe my updated branch gives `autoindex` the same behaviour
+>> as auto-creation of tags. The `auto-del-create-del` and
+>> `create-del-auto` use cases work the same as for tags on my demo wiki. --s
diff --git a/doc/todo/automatic_rebuilding_of_html_pages.mdwn b/doc/todo/automatic_rebuilding_of_html_pages.mdwn
new file mode 100644
index 000000000..10482d68a
--- /dev/null
+++ b/doc/todo/automatic_rebuilding_of_html_pages.mdwn
@@ -0,0 +1,5 @@
+It seems that pages like [[Todo]] aren't rebuilt automatically when a new item is added using the web interface.
+
+AFAIK this is working ok. For example, this page appears in [[TODO]]. Maybe you need to force-refresh the page in your web browser? --[[Joey]]
+
+[[todo/done]]
diff --git a/doc/todo/automatic_use_of_syntax_plugin_on_source_code_files.mdwn b/doc/todo/automatic_use_of_syntax_plugin_on_source_code_files.mdwn
new file mode 100644
index 000000000..71b4b88f0
--- /dev/null
+++ b/doc/todo/automatic_use_of_syntax_plugin_on_source_code_files.mdwn
@@ -0,0 +1,17 @@
+[[Wishlist]]: optionally use the [[plugins/contrib/syntax]] plugin
+automatically on source code files in the repository with recognized
+extensions or shebangs, and render them as though they consisted of an
+[[.mdwn|ikiwiki/markdown]] page containing nothing but a single call to the syntax
+plugin with the file contents as the text argument and the recognized type
+as the type argument.
+
+Together with the ability to have
+[[wiki-formatted_comments|wiki-formatted_comments_with_syntax_plugin]],
+this would allow the use of ikiwiki for [[!wikipedia literate programming]].
+
+* I have started something along these lines; see [[plugins/contrib/sourcehighlight]]. For some reason I started with source-highlight. --[[DavidBremner]]
+
+* I wonder if this is similar to what you want: <http://iki.u32.net/setup/Highlight_Code_Plugin/>
+
+> The new [[plugins/highlight]] plugin is in ikiwiki core and supports
+> source code files natively. [[done]] --[[Joey]]
diff --git a/doc/todo/automatic_use_of_syntax_plugin_on_source_code_files/discussion.mdwn b/doc/todo/automatic_use_of_syntax_plugin_on_source_code_files/discussion.mdwn
new file mode 100644
index 000000000..64bc21ee0
--- /dev/null
+++ b/doc/todo/automatic_use_of_syntax_plugin_on_source_code_files/discussion.mdwn
@@ -0,0 +1,215 @@
+Here is another [[patch]] for this. It is more up to date than either of the patches linked on the previous page. It is most similar to [[plugins/contrib/sourcehighlight]].
+
+Updated to use fix noted in [[bugs/multiple_pages_with_same_name]].
+
+-- [[Will]]
+
+----
+I was trying to replace sourcehighlight with sourcecode. I had to modify the
+htmlize call slightly so that it would work in a format directive.
+([modified version](http://pivot.cs.unb.ca/git/?p=ikiplugins.git;a=blob_plain;f=IkiWiki/Plugin/sourcecode.pm;hb=21fc57091edb9))
+
+> I haven't tested them, but those changes look sensible to me. -- [[Will]]
+
+I hit a wall with the following example (the last commit in the above repo).
+
+ \[[!meta title="Solutions to assignment 1"]]
+
+ - [[!format cc """
+ test
+ """]]
+
+
+> I haven't actually tested this to see what the problem is. How does this fail?
+> Does source-highlight barf on the non-c++ content? Is there a wiki URL that shows the failure? -- [[Will]]
+>> Here is the content div from the output page
+>> [[DavidBremner]]
+
+ <div id="content">
+ <p><ul>
+ <li><div id="sourcecode"></li>
+ </ul>
+ 2beb4fd7289998159f61976143f66bb6</p>
+
+ <p></div></p>
+
+ </div>
+
+>>> That is quite strange. I tested your version of the plugin. I had to revert one of your changes to get it to
+>>> work: the linenumber argument should not have a space at the end of it. Once I made that change,
+>>> everything worked as expected. The output I get for your example is below:
+
+ <div id="content">
+ <ul>
+ <li><div id="sourcecode"></li>
+ </ul>
+
+ <pre><tt><span class="linenum">00001:</span> <span class="normal">test</span></tt></pre>
+
+ <p></div></p>
+
+ </div>
+
+>>> I don't know what is going wrong for you... source-highlight, Markdown or something else.
+>>>> It's a well-known bug in old versions of markdown. --[[Joey]]
+>>> I do find it interesting the way the sourcecode `div` and the list get interleaved. That
+>>> just looks like a Markdown thing though.
+>>> In any case, I've updated the patch below to include most of your changes. -- [[Will]]
+
+----
+
+ #!/usr/bin/perl
+ # markup source files
+ # Originally by Will Uther
+ # With modifications by David Bremner
+ package IkiWiki::Plugin::sourcecode;
+
+ use warnings;
+ use strict;
+ use IkiWiki 2.00;
+ use open qw{:utf8 :std};
+
+ my %metaheaders;
+
+ sub import {
+ hook(type => "getsetup", id => "sourcecode", call => \&getsetup);
+ hook(type => "checkconfig", id => "sourcecode", call => \&checkconfig);
+ hook(type => "pagetemplate", id => "sourcecode", call => \&pagetemplate);
+ }
+
+ sub getsetup () {
+ return
+ plugin => {
+ safe => 1,
+ rebuild => 1, # format plugin
+ },
+ sourcecode_command => {
+ type => "string",
+ example => "/usr/bin/source-highlight",
+ description => "The command to execute to run source-highlight",
+ safe => 0,
+ rebuild => 1,
+ },
+ sourcecode_lang => {
+ type => "string",
+ example => "c,cpp,h,java",
+ description => "Comma separated list of suffixes to recognise as source code",
+ safe => 1,
+ rebuild => 1,
+ },
+ sourcecode_linenumbers => {
+ type => "boolean",
+ example => 1,
+ description => "Should we add line numbers to the source code",
+ safe => 1,
+ rebuild => 1,
+ },
+ sourcecode_css => {
+ type => "string",
+ example => "sourcecode_style",
+ description => "page to use as css file for source",
+ safe => 1,
+ rebuild => 1,
+ },
+ }
+
+ sub checkconfig () {
+ if (! $config{sourcecode_lang}) {
+ error("The sourcecode plugin requires a list of suffixes in the 'sourcecode_lang' config option");
+ }
+
+ if (! $config{sourcecode_command}) {
+ $config{sourcecode_command} = "source-highlight";
+ }
+
+ if (! length `which $config{sourcecode_command} 2>/dev/null`) {
+ error("The sourcecode plugin is unable to find the $config{sourcecode_command} command");
+ }
+
+ if (! $config{sourcecode_css}) {
+ $config{sourcecode_css} = "sourcecode_style";
+ }
+
+ if (! defined $config{sourcecode_linenumbers}) {
+ $config{sourcecode_linenumbers} = 1;
+ }
+
+ my %langs = ();
+
+ open(LANGS, "$config{sourcecode_command} --lang-list|");
+ while (<LANGS>) {
+ if ($_ =~ /(\w+) = .+\.lang/) {
+ $langs{$1} = 1;
+ }
+ }
+ close(LANGS);
+
+ foreach my $lang (split(/[, ]+/, $config{sourcecode_lang})) {
+ if ($langs{$lang}) {
+ hook(type => "htmlize", id => $lang, no_override=>1,
+ call => sub { htmlize(lang=>$lang, @_) },
+ keepextension => 1);
+ } else {
+ error("Your installation of source-highlight cannot handle sourcecode language $lang!");
+ }
+ }
+ }
+
+ sub htmlize (@) {
+ my %params=@_;
+
+ my $page = $params{page};
+
+ eval q{use FileHandle};
+ error($@) if $@;
+ eval q{use IPC::Open2};
+ error($@) if $@;
+
+ local(*SPS_IN, *SPS_OUT); # Create local handles
+
+ my @args;
+
+ if ($config{sourcecode_linenumbers}) {
+ push @args, '--line-number';
+ }
+
+ my $pid = open2(*SPS_IN, *SPS_OUT, $config{sourcecode_command},
+ '-s', $params{lang},
+ '-c', $config{sourcecode_css}, '--no-doc',
+ '-f', 'xhtml',
+ @args);
+
+ error("Unable to open $config{sourcecode_command}") unless $pid;
+
+ print SPS_OUT $params{content};
+ close SPS_OUT;
+
+ my @html = <SPS_IN>;
+ close SPS_IN;
+
+ waitpid $pid, 0;
+
+ my $stylesheet=bestlink($page, $config{sourcecode_css}.".css");
+ if (length $stylesheet) {
+ push @{$metaheaders{$page}}, '<link href="'.urlto($stylesheet, $page).'"'.
+ ' rel="stylesheet"'.
+ ' type="text/css" />';
+ }
+
+ return '<div id="sourcecode">'."\r\n".join("",@html)."\r\n</div>\r\n";
+ }
+
+ sub pagetemplate (@) {
+ my %params=@_;
+
+ my $page=$params{page};
+ my $template=$params{template};
+
+ if (exists $metaheaders{$page} && $template->query(name => "meta")) {
+ # avoid duplicate meta lines
+ my %seen;
+ $template->param(meta => join("\n", grep { (! $seen{$_}) && ($seen{$_}=1) } @{$metaheaders{$page}}));
+ }
+ }
+
+ 1
diff --git a/doc/todo/avatar.mdwn b/doc/todo/avatar.mdwn
new file mode 100644
index 000000000..7fa3762da
--- /dev/null
+++ b/doc/todo/avatar.mdwn
@@ -0,0 +1,31 @@
+[[!tag wishlist]]
+
+It would be nice if ikiwiki, particularly [[plugins/comments]]
+(but also, ideally, recentchanges) supported user avatar icons.
+
+> Update: Done for comments, but not for anything else, and the directive
+> below would be a nice addition. --[[Joey]]
+
+Idea is to add a directive that displays a small avatar image for a user.
+Pass it a user's email address, openid, username, or the md5 hash
+of their email address:
+
+ \[[!avatar user@example.com]]
+ \[[!avatar http://joey.kitenet.net/]]
+ \[[!avatar user]]
+ \[[!avatar hash]]
+
+These directives can then be hand-inserted onto pages, or more likely,
+included in eg, a comment post via a template.
+
+An optional second parameter can be included, containing additional
+options to pass in the
+[gravatar url](http://en.gravatar.com/site/implement/url).
+For example, this asks for a smaller gravatar, and if a user does
+not have a gravatar, uses a cute auto-generated "wavatar" avatar.
+
+ \[[!gravatar user@example.com "size=40&default=wavatar"]]
+
+The `gravatar_options` setting in the setup file can be used to
+specify additional options to pass. So for example if you want
+to use wavatars everywhere, set it to "default=wavatar".
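+
+As a sketch, the URL such a directive might generate (the standard
+gravatar scheme: an md5 hash of the trimmed, lowercased email address,
+with the extra options appended):
+
+    use Digest::MD5 qw(md5_hex);
+    my $hash = md5_hex(lc 'user@example.com');
+    my $url  = "http://www.gravatar.com/avatar/$hash?size=40&default=wavatar";
+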
diff --git a/doc/todo/avatar/discussion.mdwn b/doc/todo/avatar/discussion.mdwn
new file mode 100644
index 000000000..568866f93
--- /dev/null
+++ b/doc/todo/avatar/discussion.mdwn
@@ -0,0 +1 @@
+It seems that this thing is on by default. How to turn it off?
diff --git a/doc/todo/avoid_attachement_ui_if_upload_not_allowed.mdwn b/doc/todo/avoid_attachement_ui_if_upload_not_allowed.mdwn
new file mode 100644
index 000000000..487915850
--- /dev/null
+++ b/doc/todo/avoid_attachement_ui_if_upload_not_allowed.mdwn
@@ -0,0 +1,25 @@
+Any way to make it so an edit page doesn't offer the attachment capability
+unless the user is a specific user or an admin, and/or the page is an allowed page?
+(For now, I have it on all pages, and then it prohibits after I submit
+based on the allowed_attachments.)
+
+> To do that, ikiwiki would have to try to match the `allowed_attachments`
+> pagespec against a sort of dummy upload to the current page. Then if it
+> failed, assume all real uploads would fail. Now consider a pagespec like
+> "user(joey) and mimetype(audio/mpeg)" -- it'd be hard to make a dummy
+> upload to test this pagespec against.
+>
+> So, there would need to be some sort of test mode, where terms like
+> `mimetype()` always succeed. But then consider a pagespec like
+> "user(joey) and !mimetype(video/mpeg)" -- if mimetype succeeds, this
+> fails.
+>
+> So, maybe we can instead just filter out all the pagespec terms aside
+> from `user()`, `ip()`, and `admin()`. Transforming that into just
+> "user(joey)", which would succeed in the test.
+>
+> That'd work, I guess. Pulling a pagespec apart, filtering out terms, and
+> putting it back together is nontrivial, but doable.
+>
+> Other approach would be to have a separate pagespec that explicitly
+> controls which pages to show the attachment UI on. --[[Joey]]
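+
+A very rough sketch of that term-filtering idea (hypothetical; a real
+implementation would need proper pagespec parsing, and negated terms
+need more thought than this):
+
+    # replace every term except user()/ip()/admin() with "1" (true)
+    my $spec = $config{allowed_attachments};
+    $spec =~ s/\b(?!(?:user|ip|admin)\()\w+\([^)]*\)/1/g;
+    # then match $spec against the current user as usual
+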
diff --git a/doc/todo/avoid_thrashing.mdwn b/doc/todo/avoid_thrashing.mdwn
new file mode 100644
index 000000000..45b11d872
--- /dev/null
+++ b/doc/todo/avoid_thrashing.mdwn
@@ -0,0 +1,22 @@
+Problem: Suppose a server has 256 mb ram. Each ikiwiki process needs about
+15 mb, before it has loaded the index. (And maybe 25 after, but only one such
+process runs at any time). That allows for about 16 ikiwiki processes to
+run concurrently on a server, before it starts to swap. Of course, anything
+else that runs on the server and eats memory will affect this.
+
+One could just set `MaxClients 16` in the apache config, but then it's also
+limited to 16 clients serving static pages, which is silly. Also, 16 is
+optimistic -- 8 might be a saner choice. And then, what if something on the
+server decides to eat a lot of memory? Ikiwiki can again overflow memory
+and thrash.
+
+It occurred to me that the ikiwiki cgi wrapper could instead do locking of
+its own (say of `.ikiwiki/cgilock`). The wrapper only needs a few kb to
+run, and it starts *fast*. So hundreds could be running waiting for a lock
+with no ill effects. Crank `MaxClients` up to 256? No problem..
+
+And there's no real reason to allow more than one ikiwiki cgi to run at a
+time. Since almost all uses of the CGI lock the index, only one can really
+be doing anything at a time. --[[Joey]]
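+
+The locking described above can be sketched like this (the real wrapper
+is generated C code, so this is just the logic, not the implementation):
+
+    use Fcntl qw(:flock);
+    open my $lock, '>', '.ikiwiki/cgilock' or die $!;
+    flock $lock, LOCK_EX;   # hundreds of waiters cost almost nothing
+    # ... run the real ikiwiki CGI; the lock is released on exit ...
+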
+
+[[done]]
diff --git a/doc/todo/backlinks_result_is_lossy.mdwn b/doc/todo/backlinks_result_is_lossy.mdwn
new file mode 100644
index 000000000..2a9fc4a0a
--- /dev/null
+++ b/doc/todo/backlinks_result_is_lossy.mdwn
@@ -0,0 +1,12 @@
+[[!tag patch patch/core]]
+
+IkiWiki::backlinks returns a form of $backlinks{$page} that has undergone a
+lossy transformation (to get it in the form that page templates want), making
+it more difficult to use in other contexts (like pagestats).
+
+A commit on my `among` branch splits it into IkiWiki::backlink_pages
+(which returns the keys of $backlinks{$page}, and might be suitable for
+exporting) and IkiWiki::backlinks (which calls backlink_pages, then performs
+the same lossy transformation as before on the result).
+
+[[done]] --[[Joey]]
diff --git a/doc/todo/basewiki_should_be_self_documenting.mdwn b/doc/todo/basewiki_should_be_self_documenting.mdwn
new file mode 100644
index 000000000..cb8dee697
--- /dev/null
+++ b/doc/todo/basewiki_should_be_self_documenting.mdwn
@@ -0,0 +1,40 @@
+The pages in the basewiki should be fully self-documenting as far as what
+users need to know to edit pages in the wiki. [[ikiwiki/Formatting]]
+documents the basics, but doesn't include every preprocessor directive.
+
+> Thanks to Joey's work applying and fixing up my patches, this is mostly done.
+> The one thing I'd add above the way things currently work would be to add
+> the [[plugins/listdirectives]] plugin to [[plugins/goodstuff]].
+> Doing that requires making the decision about whether you really want the
+> documentation in every wiki - it is 200k. -- [[Will]]
+
+>> I don't think that it needs to be in goodstuff to close this, though I
+>> might decide to add it to goodstuff later. [[done]] --[[Joey]]
+
+Note that there's a distinction between being self-documenting for users,
+and being complete documentation for ikiwiki. The basewiki is _not_
+intended to be the latter, so it lacks the usage page, all the plugin
+pages, etc.
+
+I've made some progress toward making the basewiki self-documenting by moving
+the docs about using templates, shortcuts, and blogs from the plugin pages,
+onto the pages in the basewiki.
+
+Here are some of the things that are not documented in full in the
+basewiki:
+
+ joey@kodama:~/src/ikiwiki/doc/plugins>grep usage *
+ aggregate.mdwn:## usage
+ graphviz.mdwn:page. Example usage:
+ graphviz.mdwn:amounts of processing time and disk usage.
+ img.mdwn:## usage
+ linkmap.mdwn:set of pages in the wiki. Example usage:
+ map.mdwn:This plugin generates a hierarchical page map for the wiki. Example usage:
+ postsparkline.mdwn:# usage
+ sparkline.mdwn:# usage
+ sparkline.mdwn:more detail in [its wiki](http://sparkline.wikispaces.com/usage).
+ table.mdwn:## usage
+
+Meta is another one.
+
+See also: [[Conditional_Underlay_Files]]
diff --git a/doc/todo/be_more_selective_about_running_hooks.mdwn b/doc/todo/be_more_selective_about_running_hooks.mdwn
new file mode 100644
index 000000000..70a1cb7a2
--- /dev/null
+++ b/doc/todo/be_more_selective_about_running_hooks.mdwn
@@ -0,0 +1,68 @@
+[[!template id=gitbranch branch=chrismgray/exclusive-hooks author="[[chrismgray]]"]]
+
+Sometimes plugins register a function with `hook`, but they only want
+the function called with the content that they know how to deal with.
+Normally, this means that they call `pagetype` first thing in the
+function, determine if they know how to deal with the content, and
+only do anything if they do.
+
+> So, I can't find any plugins shipped with ikiwiki that actually do that.
+> Scan hooks are only ever passed the content of actual wiki pages, and
+> so unless a scan hook cares whether a page is written in markdown or
+> something else, it has no reason to care what the pagetype is. (Same for
+> linkify.) --[[Joey]]
+
+>> My [[org-mode|todo/org_mode]] external plugin (which will never
+>> really make sense to include with ikiwiki I think) does this. I
+>> think that most plugins defining alternate wiki syntaxes would as
+>> well. --[[chrismgray]]
+
+This is a bit wasteful in itself, but for external plugins, it's
+really bad. For functions like `scan` and `linkify`, where the entire
+page is sent back and forth over `stdout` and `stdin`, it really slows
+things down.
+
+Thus, I propose that there be a new optional parameter to `hook` that
+tells it that the function should only be called for files whose type
+is the same as the id of the plugin calling `hook`. I have called
+this parameter `exclusive` in my branch, but this might not be the
+best name.
+
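+The dispatch logic being proposed could look something like this standalone sketch (not real ikiwiki code; hook ids, page types, and the `exclusive` name are just the assumptions discussed above):

```perl
#!/usr/bin/perl
# Standalone sketch of the proposed "exclusive" parameter: the hook
# runner skips an exclusive hook unless the page's type matches the
# hook's id, so page content never has to be shipped to a plugin that
# would ignore it anyway.
use strict;
use warnings;

my %pagetype = ("notes.org" => "org", "index.mdwn" => "mdwn");

my @hooks = (
	{ id => "org", exclusive => 1, call => sub { "org handled $_[0]" } },
	{ id => "all", exclusive => 0, call => sub { "all handled $_[0]" } },
);

sub run_hooks {
	my $file = shift;
	my @ran;
	foreach my $h (@hooks) {
		next if $h->{exclusive}
			&& (($pagetype{$file} || "") ne $h->{id});
		push @ran, $h->{call}->($file);
	}
	return @ran;
}

print join(", ", run_hooks("index.mdwn")), "\n"; # all handled index.mdwn
print join(", ", run_hooks("notes.org")), "\n";  # org handled ..., all handled ...
```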
+[[!tag patch]]
+
+> It's an interesting idea, but it might be more useful if it was more
+> generalized, say, by making it a filter, where the parameter is a regexp.
+>
+> --[[KathrynAndersen]]
+
+>> Would it make more sense as a pagespec? That might be a bit more hard
+>> to implement, but would certainly fix the naming issue.
+>>
+>> --[[chrismgray]]
+
+>>> Considering where it would be called, a pagespec might be overkill. --[[KathrynAndersen]]
+
+>>>> Pagespecs have some overhead themselves. Probably less than shipping
+>>>> the page content over stdio.
+>>>>
+>>>> Rather than putting filtering in the core of ikiwiki, I can think
+>>>> of two options. One is to make the main plugin a perl plugin, and
+>>>> have it call functions that are provided by another, external plugin.
+>>>> This is assuming you're using the other language because something
+>>>> is easy to do in it, not to avoid writing perl.
+>>>>
+>>>> Or, the external plugin interface could provide a version of `hook()`
+>>>> that does not pass the content parameter, but saves a copy that
+>>>> the plugin could request with a later rpc call. Assuming that
+>>>> it's really the overhead of serializing the page content, that's
+>>>> the problem, and not just the general overhead of making rpc calls
+>>>> for every page.. --[[Joey]]
+
+>>>>> Since the argument to `hook` is optional, the pagespec is only
+>>>>> interpreted if it is given. So there is no extra overhead
+>>>>> (beyond an unused `if` branch) in 99% of the cases.
+>>>>>
+>>>>> Rewriting the external plugin's shim using Perl is a good idea,
+>>>>> and one that I wish I had thought of earlier. On the other
+>>>>> hand, it doesn't set a great precedent about the usability of
+>>>>> external plugins. --[[chrismgray]]
diff --git a/doc/todo/beef_up_sidebar_to_allow_for_multiple_sidebars.mdwn b/doc/todo/beef_up_sidebar_to_allow_for_multiple_sidebars.mdwn
new file mode 100644
index 000000000..fdaa09f26
--- /dev/null
+++ b/doc/todo/beef_up_sidebar_to_allow_for_multiple_sidebars.mdwn
@@ -0,0 +1,125 @@
+Maybe sidebar could be beefed up to take the name of a sidebar, such that I could use multiple sidebars in the same wiki. For instance, the default name would be 'sidebar', meaning the plugin looks for `sidebar.pm` and fills in the `sidebar` slot, but I might also want a footer in `footer.pm`, filling the template's `footer` slot.
+
+One good way (if possible) would be to provide a directive like `\[[!sidebar
+id=sidebar]]` which would cause the file in which it occurred to fill the
+slot `SIDEBAR` in the template: basically, a page `foo.mdwn` says
+`\[[!fillslot slot=myslot]]` and then its contents should go into `<TMPL_VAR
+SLOT_MYSLOT>` for all pages. Ideally, this can then be overridden, so if
+`/bar/foo.mdwn` also references `myslot` then pages under `/bar` should get
+those contents instead.
+
+
+--[[madduck]]
+
+> In mine I just copied sidebar out and made some extra "sidebars", but they go elsewhere. Ugly hack, but it works. --[[simonraven]]
+
+>> Here a simple [[patch]] for multiple sidebars. Not too fancy but better than having multiple copies of the sidebar plugin. --[[jeanprivat]]
+
+>>> I made a [[git]] branch for it [[!template id=gitbranch branch="privat/multiple_sidebars" author="[[jeanprivat]]"]] --[[jeanprivat]]
+
+>>>> Ping for [[Joey]]. Do you have any comment? I could improve it if there is things you do not like. I prefer to have such a feature integrated upstream. --[[JeanPrivat]]
+
+>>>>> The code is fine.
+>>>>>
+>>>>> I did think about having it examine
+>>>>> the `page.tmpl` for parameters with names like `FOO_SIDEBAR`
+>>>>> and automatically enable page `foo` as a sidebar in that case,
+>>>>> instead of using the setup file to enable. But I'm not sure about
+>>>>> that idea..
+>>>>>
+>>>>> The full complement of sidebars would be a header, a footer,
+>>>>> a left, and a right sidebar. It would make sense to go ahead
+>>>>> and add the parameters to `page.tmpl` so enabling each just works,
+>>>>> and add whatever basic CSS makes sense. Although I don't know
+>>>>> if I want to try to get a 3 column CSS going, so perhaps leave the
+>>>>> left sidebar out of that.
+
+-------------------
+
+<pre>
+--- /usr/share/perl5/IkiWiki/Plugin/sidebar.pm 2010-02-11 22:53:17.000000000 -0500
++++ plugins/IkiWiki/Plugin/sidebar.pm 2010-02-27 09:54:12.524412391 -0500
+@@ -19,12 +19,20 @@
+ safe => 1,
+ rebuild => 1,
+ },
++ active_sidebars => {
++ type => "string",
++ example => qw(sidebar banner footer),
++ description => "Which sidebars must be activated and processed.",
++ safe => 1,
++ rebuild => 1
++ },
+ }
+
+-sub sidebar_content ($) {
++sub sidebar_content ($$) {
+ my $page=shift;
++ my $sidebar=shift;
+
+- my $sidebar_page=bestlink($page, "sidebar") || return;
++ my $sidebar_page=bestlink($page, $sidebar) || return;
+ my $sidebar_file=$pagesources{$sidebar_page} || return;
+ my $sidebar_type=pagetype($sidebar_file);
+
+@@ -49,11 +57,17 @@
+
+ my $page=$params{page};
+ my $template=$params{template};
+-
+- if ($template->query(name => "sidebar")) {
+- my $content=sidebar_content($page);
+- if (defined $content && length $content) {
+- $template->param(sidebar => $content);
++
++ my @sidebars;
++ if (defined $config{active_sidebars} && length $config{active_sidebars}) { @sidebars = @{$config{active_sidebars}}; }
++ else { @sidebars = qw(sidebar); }
++
++ foreach my $sidebar (@sidebars) {
++ if ($template->query(name => $sidebar)) {
++ my $content=sidebar_content($page, $sidebar);
++ if (defined $content && length $content) {
++ $template->param($sidebar => $content);
++ }
+ }
+ }
+ }
+</pre>
+
+----------------------------------------
+## Further thoughts about this
+
+(since the indentation level was getting rather high.)
+
+What about using pagespecs in the config to map pages and sidebar pages together? Something like this:
+
+<pre>
+ sidebar_pagespec => {
+ "foo/*" => 'sidebars/foo_sidebar',
+ "bar/* and !bar/*/*' => 'bar/bar_top_sidebar',
+ "* and !foo/* and !bar/*" => 'sidebars/general_sidebar',
+ },
+</pre>
+
+One could do something similar for *pageheader*, *pagefooter* and *rightbar* if desired.
+
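+A first-match lookup over such a map could be sketched like this. It is only an illustration: real ikiwiki would use `pagespec_match()`, and would need an ordered list rather than a hash (Perl hash order is undefined), so plain regexps stand in for pagespecs here:

```perl
#!/usr/bin/perl
# Standalone sketch of resolving a sidebar_pagespec-style map:
# the first matching spec wins.
use strict;
use warnings;

my @sidebar_map = (
	[ qr{^foo/},          "sidebars/foo_sidebar"     ],
	[ qr{^bar/[^/]+$},    "bar/bar_top_sidebar"      ],
	[ qr{^(?!foo/|bar/)}, "sidebars/general_sidebar" ],
);

sub sidebar_for {
	my $page = shift;
	foreach my $entry (@sidebar_map) {
		return $entry->[1] if $page =~ $entry->[0];
	}
	return undef; # no sidebar applies
}

print sidebar_for("foo/subpage"), "\n"; # sidebars/foo_sidebar
print sidebar_for("index"), "\n";       # sidebars/general_sidebar
```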
+Another thing which I find compelling - but probably because I am using [[plugins/contrib/field]] - is to be able to treat the included page as if it were *part* of the page it was included into, rather than as an included page. I mean things like \[[!if ...]] would test against the page name of the page it's included into rather than the name of the sidebar/header/footer page. It's even more powerful if one combines this with field/getfield/ftemplate/report, since one could make "generic" headers and footers that could apply to a whole set of pages.
+
+Header example:
+<pre>
+#{{$title}}
+\[[!ftemplate id="nice_data_table"]]
+</pre>
+
+Footer example:
+<pre>
+------------
+\[[!report template="footer_trail" trail="trailpage" here_only=1]]
+</pre>
+
+(Yes, I am already doing something like this on my own site. It's like the PmWiki concept of GroupHeader/GroupFooter)
+
+-- [[KathrynAndersen]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/beef_up_signin_page.mdwn b/doc/todo/beef_up_signin_page.mdwn
new file mode 100644
index 000000000..ee322b663
--- /dev/null
+++ b/doc/todo/beef_up_signin_page.mdwn
@@ -0,0 +1,17 @@
+ikiwiki's signin page is too sparse for people who don't live in the Web 2.0.
+
+We occasionally have GNU Hurd web pages contributors wonder what they have to
+do on that page. They don't know / the page doesn't explain what an *account
+provider* is, and that cross-identification (using an existing OpenID account)
+is possible to begin with. And, if they don't have such an OpenID account,
+it's not easily understandable that the *other* option is for creating a local
+site-only account (like in the old days).
+
+--[[tschwinge]]
+
+> I agree that this would be good. It could be done by leaving
+> the compact widget at the top and adding some verbose explanations
+> and/or further forms below.
+>
+> All it takes is editing `templates/openid-selector.tmpl`,
+> so I welcome suggestions. --[[Joey]]
diff --git a/doc/todo/block_external_links.mdwn b/doc/todo/block_external_links.mdwn
new file mode 100644
index 000000000..56627653e
--- /dev/null
+++ b/doc/todo/block_external_links.mdwn
@@ -0,0 +1,16 @@
+I'd like the ability to block external links from anonymous users, or from
+untrusted users. This could work by generating the HTML for the new page and
+comparing it to the HTML for the old page, looking for any new `<a>` tags with
+href values that didn't exist in the old page and don't start with the URL of
+the wiki. Comparing the HTML, rather than the input, allows usage with
+various types of input formats, and ensures that a template, shortcut, or some
+new plugin will not bypass the filter.
+
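+The comparison step could be sketched roughly as follows. This is only an illustration of the idea: a real implementation would use an HTML parser rather than a regexp, and the URLs below are made-up examples:

```perl
#!/usr/bin/perl
# Hypothetical sketch: collect the href values from the old and the new
# HTML, and flag any href that is new and does not start with the
# wiki's own URL.
use strict;
use warnings;

sub hrefs {
	my $html = shift;
	return map { lc $_ } $html =~ /<a\s[^>]*href="([^"]+)"/gi;
}

sub new_external_links {
	my ($old_html, $new_html, $wikiurl) = @_;
	my %seen = map { $_ => 1 } hrefs($old_html);
	return grep { ! $seen{$_} && index($_, lc $wikiurl) != 0 }
		hrefs($new_html);
}

my $old = '<a href="http://wiki.example.org/sandbox">sandbox</a>';
my $new = $old . '<a href="http://spam.example.com/">buy now</a>';
print join(", ",
	new_external_links($old, $new, "http://wiki.example.org")), "\n";
# http://spam.example.com/
```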
+This would probably benefit from a whitelist of acceptable external URLs.
+
+This may actually form a subset of the general concept of content policies,
+described at [[todo/fileupload]].
+
+--[[JoshTriplett]]
+
+[[wishlist]]
diff --git a/doc/todo/blocking_ip_ranges.mdwn b/doc/todo/blocking_ip_ranges.mdwn
new file mode 100644
index 000000000..ac2344ece
--- /dev/null
+++ b/doc/todo/blocking_ip_ranges.mdwn
@@ -0,0 +1,7 @@
+Admins need the ability to block IP ranges. They can already ban users.
+
+See [[fileupload]] for a proposal that grew to encompass the potential to do
+this.
+
+[[done]] (well, there is no pagespec for IP ranges yet, but we can block
+individual IPs)
diff --git a/doc/todo/blogging.mdwn b/doc/todo/blogging.mdwn
new file mode 100644
index 000000000..a31674809
--- /dev/null
+++ b/doc/todo/blogging.mdwn
@@ -0,0 +1,137 @@
+- It would be possible to support rss enclosures for eg, podcasts, pretty easily.
+
+Here is the last of those items. Using the meta plugin you can give the appropriate
+info, and the enclosure will be added to the entry. It will also add a `<link />` tag
+at the top, but I don't know if this is necessary. It also includes a fix for
+when make is used without PREFIX.
+
+<http://jameswestby.net/scratch/podcast.diff>
+
+-- JamesWestby
+
+> Hmm. Not quite how I'd envisioned podcasts would work, my idea was
+> more that the sound files would be kept inside the wiki, and the
+> inline plugin could be told to eg, inline *.mp3, and would add
+> those to the rss feed as enclosures. Maybe you'd also inline some
+> regular blog pages to describe the files or the like.
+
+> Do you think that would work or that it's worth pursuing that
+> approach? I haven't looked at podcasts enough to know if that
+> method would be technically feasible; for one thing it would limit
+> the blog items for podcasts to just having an enclosure but no
+> description.
+
+> Even if that doesn't work and pages are needed to describe the items
+> like you did, it still seems better to keep the podcast items in
+> the wiki..
+
+> --[[Joey]]
+
+That's fair enough. I'm a little unsure of how it all works, so I just did the
+simplest thing I could.
+
+You don't need a description for podcasts it seems. So there's nothing stopping
+you there.
+
+I have another patch that I think does what you want. It only supports .mp3 files,
+.ogg or similar could be added easily.
+
+It has the disadvantage that the filename is all there is to go on, as I can't
+think of a way to associate any information with the mp3 file. I don't
+want to add a dependency on an ID3 tag library. You could add another file
+.mp3.info with the title/description in.
+
+It's obviously up to you which way you want to go.
+
+<http://jameswestby.net/scratch/podcast2.diff>
+
+-- JamesWestby
+
+> Hmm, this could be taken a step further, and assume that if
+> IkiWiki::pagetype doesn't return a defined page type for the page
+> in the blog, then no matter the extension it should be fed into the
+> rss feed in an enclosure. This would allow for not only podcasting,
+> but vidcasting and a form of photo blogging. Or even an rss feed
+> containing the source of ikiwiki. ;-)
+>
+> --[[Joey]]
+
+Yes, I agree that this would be great, but the RSS 2.0 spec says that an
+enclosure must have a MIME type. How about I use the File::MimeInfo trick from the
+first patch to do this? I don't know why I didn't do this before.
+This will probably clean the code up a little as well.
+
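+For reference, the RSS 2.0 requirement mentioned here is that an `enclosure` element carries `url`, `length`, and `type` attributes; a hand-written sketch (with a made-up URL) of a feed item with an mp3 enclosure:

```xml
<item>
  <title>episode-1.mp3</title>
  <link>http://example.org/blog/episode-1.mp3</link>
  <enclosure url="http://example.org/blog/episode-1.mp3"
             length="12345678" type="audio/mpeg"/>
</item>
```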
+What do you think of the change that when using raw, if the filetype is not
+known it adds an entry anyway? I did this so that the entries appear if
+this mode is used. It might be that this is not necessary, as we can assume
+that people won't use raw if they want to pod/vid/whatevercast?
+
+-- JamesWestby
+
+> Using File::MimeInfo makes sense to me.
+
+> I think it probably makes sense to make the (html) blog page
+> add an entry with a link to the file that's in the enclosure in the
+> rss feed. Whether or not raw is being used.
+
+> Note: I'm still unsure about whether podcasts should support
+> descriptions for the enclosures or not. Here's an early podcast
+> that did use descriptions:
+> <http://static.userland.com/gems/backend/gratefulDead.xml>
+> Here's a contemporary podcast, which also uses descriptions:
+> <http://www.lugradio.org/episodes.rss>
+
+> The podcast client I use certainly doesn't care about the
+> descriptions. But it's podracer, probably not the thing most
+> podcast users use. :-)
+
+> --[[Joey]]
+
+I tested with amarok, and that also ignored the description.
+I am thinking of those where you have a mixed feed, and people
+using clients that ignore enclosures get pretty much a blank post,
+with just the filename, and the html page, which also just displays
+the filename.
+
+I don't think this is a big issue though, so I guess it's just which
+you think is the cleaner interface.
+
+I have also added the first of your ideas as well (though you seem to have
+removed it). It adds a parameter to inline `archive_after` which limits
+showing full entries to that number.
+
+<http://jameswestby.net/scratch/limit.diff>
+
+-- JamesWestby
+
+> I removed it because I don't really see the need for it anymore.
+> The added complexity doesn't seem worth it, unless someone needs the
+> features. --[[Joey]]
+
+And here is the updated podcast patch supporting any file type.
+
+<http://jameswestby.net/scratch/podcast2.diff>
+
+-- JamesWestby
+
+And here is a patch for the remaining item. It adds links to the bottom of
+inlined entries for edit and discuss (if they are enabled). It doesn't add
+links for edit if the filetype is not known.
+
+The stylesheet should probably be done slightly better than I have. I just
+added a bit of spacing as the links were too close to the date. I have no
+skill in this area though. Perhaps you would like to use the list system
+that you have for the links at the top.
+
+<http://jameswestby.net/scratch/actions.diff>
+
+-- JamesWestby
+
+> Thanks! I did tweak the css a bit. Not totally happy with it, but pretty
+> good I think. (I'll try to get to the other patches soon.) --[[Joey]]
+
+
+---
+
+I'm very happy to report that this is [[todo/done]]. Podcasting patch
+applied (finally!) --[[Joey]]
diff --git a/doc/todo/blogpost_plugin.mdwn b/doc/todo/blogpost_plugin.mdwn
new file mode 100644
index 000000000..69df27271
--- /dev/null
+++ b/doc/todo/blogpost_plugin.mdwn
@@ -0,0 +1,156 @@
+This is a plugin that prompts the user for a title for a post, and then
+redirects the user to post in a hierarchically organized directory
+structure. It supports limiting blog posts only to certain users, and
+this applies to creating new as well as editing old ones. It's a little
+clumsy so far:
+
+* Although Joey said specifically not to, I export the printheader
+ function so that the plugin can create a form.
+* The form doesn't invoke the formbuilder hooks; I couldn't figure
+ out a way to do that easily from inside the plugin (unless I
+ exported run_hooks and that didn't seem right).
+* This invokes a new hook, authcgi, although Joey said that
+ "the best approach was adding a flag that makes a cgi hook get
+ postponed and called after a session is set up". I don't know
+ if this is a good idea; authenticated CGI plugins need sessions,
+ and unauthenticated CGI plugins don't, so it seems like a new hook
+ is a better idea.
+* This creates a lot of broken parents unless you use something
+ like [[missingparents.pm]].
+
+I don't expect that this will be applied; it's more like a first draft.
+I would like to hear good ideas on how to solve the first two problems.
+Maybe make the hook return a form, which is printed by CGI.pm? --Ethan
+
+<pre>
+Index: IkiWiki/CGI.pm
+===================================================================
+--- IkiWiki/CGI.pm (revision 3968)
++++ IkiWiki/CGI.pm (working copy)
+@@ -684,6 +684,9 @@
+ }
+ }
+
++ run_hooks(authcgi => sub { shift->($q, $session); });
++ my $do=$q->param('do'); # in case the hook changed it
++
+ if (defined $session->param("name") &&
+ userinfo_get($session->param("name"), "banned")) {
+ print $q->header(-status => "403 Forbidden");
+Index: IkiWiki/Plugin/blogpost.pm
+===================================================================
+--- IkiWiki/Plugin/blogpost.pm (revision 0)
++++ IkiWiki/Plugin/blogpost.pm (revision 0)
+@@ -0,0 +1,97 @@
++#!/usr/bin/perl
++# blogpost plugin: interprets cgi "blogpost" commands as create commands.
++package IkiWiki::Plugin::blogpost;
++
++use warnings;
++use strict;
++use POSIX;
++use IkiWiki 2.00;
++
++sub import {
++ hook(type => "checkconfig", id => "blogpost", call => \&checkconfig);
++ hook(type => "authcgi", id => "blogpost", call => \&authcgi);
++ hook(type => "canedit", id => "blogpost", call => \&canedit);
++}
++
++sub checkconfig () {
++ if (! defined $config{blogformat}){
++ $config{blogformat} = 'posts/%Y/%m/%d/$title';
++ }
++
++ if (! defined $config{blogpagespec}){
++ my $spec = $config{blogformat};
++ $spec =~ s/%./\*/g;
++ $spec =~ s/\$title/*/;
++ $config{blogpagespec} = "$spec and ! $spec/*";
++ }
++
++ if (! defined $config{blogusers}) {
++ $config{blogusers} = (); # disallow all posting by default
++ }
++}
++
++sub authcgi ($$) {
++ my $cgi=shift;
++ my $session=shift;
++
++ return unless (defined $cgi->param('do') && $cgi->param('do') eq "blogpost");
++ my $user=$session->param("name");
++ error ("not allowed to blog, $user") unless
++ $config{blogusers} eq "*" ||
++ grep {$_ eq $user} $config{blogusers};
++ eval q{use CGI::FormBuilder};
++ error($@) if $@;
++ my @fields=qw(title);
++ my $form = CGI::FormBuilder->new(
++ fields => \@fields,
++ title => "post title",
++ name => "post title",
++ header => 1,
++ charset => "utf-8",
++ method => 'POST',
++ required => 'NONE',
++ javascript => 0,
++ params => $cgi,
++ action => $config{cgiurl},
++ header => 0,
++ template => {type => 'div'},
++ stylesheet => $config{url}."/style.css",
++ );
++ my $buttons=["Blog!"];
++ $form->field(name => "do", type => "hidden", value => "blogpost",
++ force => 1);
++ if (! $form->submitted){
++ printheader($session);
++ print misctemplate($form->title, $form->render(submit => $buttons));
++ exit;
++ }
++ else {
++ my $page = blogpage($form->field("title"));
++ $cgi->param("do", "create");
++ $cgi->param("page", $page);
++ }
++
++}
++
++sub blogpage ($) {
++ my $title=shift;
++ my $page=POSIX::strftime $config{blogformat}, localtime;
++ $page =~ s/\$title/$title/;
++ return $page;
++}
++
++sub canedit ($$$) {
++ my $page=shift;
++ my $cgi=shift;
++ my $session=shift;
++
++ return undef unless pagespec_match($page, $config{blogpagespec});
++ my $user=$session->param("name");
++ IkiWiki::needsignin($cgi, $session) unless defined $user;
++
++ return "" if ($config{blogusers} eq "*" ||
++ grep {$_ eq $user} $config{blogusers});
++ return ("not allowed to blog, $user");
++}
++
++1
+Index: IkiWiki.pm
+===================================================================
+--- IkiWiki.pm (revision 3968)
++++ IkiWiki.pm (working copy)
+@@ -17,6 +17,7 @@
+ our @EXPORT = qw(hook debug error template htmlpage add_depends pagespec_match
+ bestlink htmllink readfile writefile pagetype srcfile pagename
+ displaytime will_render gettext urlto targetpage
++ misctemplate printheader
+ %config %links %renderedfiles %pagesources %destsources);
+ our $VERSION = 2.00; # plugin interface version, next is ikiwiki version
+ our $version='unknown'; # VERSION_AUTOREPLACE done by Makefile, DNE
+</pre>
+
+[[!tag patch patch/core]]
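+The `blogformat` expansion done by the patch's `blogpage()` can be illustrated standalone: `strftime` fills in the date fields, then `$title` is substituted. A fixed UTC timestamp keeps the example deterministic:

```perl
#!/usr/bin/perl
# Standalone illustration of the blogpage() expansion from the patch
# above, using gmtime with a fixed epoch instead of localtime so the
# output is reproducible.
use strict;
use warnings;
use POSIX qw(strftime);

sub blogpage {
	my ($format, $title, @when) = @_;
	my $page = strftime($format, @when);
	$page =~ s/\$title/$title/;
	return $page;
}

# 1183248000 is 2007-07-01 00:00:00 UTC.
my @when = gmtime(1183248000);
print blogpage('posts/%Y/%m/%d/$title', "hello-world", @when), "\n";
# posts/2007/07/01/hello-world
```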
diff --git a/doc/todo/blogs.mdwn b/doc/todo/blogs.mdwn
new file mode 100644
index 000000000..8c9cba593
--- /dev/null
+++ b/doc/todo/blogs.mdwn
@@ -0,0 +1,4 @@
+ikiwiki needs to support blogging. Make subpages of a page turn into a blog
+with a special post-processor rune.
+
+[[todo/done]]
diff --git a/doc/todo/blogspam_training.mdwn b/doc/todo/blogspam_training.mdwn
new file mode 100644
index 000000000..f15eba59d
--- /dev/null
+++ b/doc/todo/blogspam_training.mdwn
@@ -0,0 +1,31 @@
+The [[blogspam plugin|plugins/blogspam]] is just great.
+
+However, it lacks support in the web interface to [train comments as
+SPAM](http://blogspam.net/api/classifyComment.html), when they were
+erroneously identified as ham. It would be great to have such
+support, also in the spirit of helping
+[blogspam.net](http://blogspam.net) to get better and better.
+
+What the most appropriate user interface would be is not entirely
+clear to me in the general case (wiki page editing). The case of blog
+comments looks easier: when the admin user is logged in (and if the
+blogspam plugin is enabled), each comment can have an extra link "mark
+as SPAM" which would both delete/revert the comment and submit it to
+the configured blogspam server for training.
+
+> Comments can't have an extra link when the admin user is logged
+> in, because the admin user sees the same static pages as everyone
+> else (non-admins still see the "remove" link provided by the remove
+> plugin, too). Perhaps a better UI would be that the action of that
+> link was overridden by the blogspam plugin to go to a form with
+> a checkbox for "also submit as spam"? --[[smcv]]
+
+Similarly, ham training can be plugged directly into the current
+comment moderation interface. Each comment that gets approved by the
+admin, can be sent to blogspam.net as ham. If this is considered too
+"aggressive", this behaviour can need to be explicitly enabled by
+turning on a configuration option.
+
+-- [[Zack]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/break_up_page_template_into_subfiles.mdwn b/doc/todo/break_up_page_template_into_subfiles.mdwn
new file mode 100644
index 000000000..e9f2e310b
--- /dev/null
+++ b/doc/todo/break_up_page_template_into_subfiles.mdwn
@@ -0,0 +1,36 @@
+Wishlist items such as [[Add space before slash in parent links]] would be
+easier to deal with if the page.tmpl template was broken up into sections
+and each section had a separate template file which was included in the
+master page.tmpl file. This would make it easier to customize parts of a
+page without having to fork the whole page.tmpl and then have things break
+when there's an update of the master page.tmpl file.
+
+Suggested sections:
+
+* page_head.tmpl for the things in the `<head>` section
+* page_header.tmpl for things in the "header" div (which includes the PARENTLINKS loop)
+* page_actions.tmpl for the actions section
+* page_sidebar.tmpl for the sidebar
+* page_content.tmpl for the main content
+* page_footer.tmpl for the footer
+
+Would this work, or would HTML::Template have problems with this?
+
+-- [[KathrynAndersen]]
+
+> Well, breaking it up into 6 sections would let a user modify one of them
+> with only 1/6th the chance of it being broken by a new ikiwiki.
+> Which seems like a win from the user's POV. However, I know that there
+> are ikiwiki users who modify the `page.tmpl` but are very
+> unsophisticated; needing to find the right file among 6 to modify
+> would be a loss for these users. And some modifications would probably
+> need to be coordinated among multiple files.
+>
+> For ikiwiki developers, reducing by 5/6th the number of users affected by a
+> breaking change to page.tmpl is nice, but we still have to worry about
+> the 1 in 6 that would be affected despite the splitting. Ikiwiki has
+> enough users that any change to page.tmpl has to be carefully considered
+> to avoid breaking something they may depend on, and it's been two years
+> since that last needed to be done.
+>
+> So all in all, I don't think it's worth doing. --[[Joey]]
diff --git a/doc/todo/brokenlinks_should_group_links_to_a_page.mdwn b/doc/todo/brokenlinks_should_group_links_to_a_page.mdwn
new file mode 100644
index 000000000..8d7c9eb7a
--- /dev/null
+++ b/doc/todo/brokenlinks_should_group_links_to_a_page.mdwn
@@ -0,0 +1,21 @@
+I would find the [[plugins/brokenlinks]] listing much easier to use if it
+grouped all the links to a single missing page on one line, rather than showing
+one line per page that links to the missing page.
+
+I think this would work well as the default; however, if people prefer the
+current behavior, perhaps brokenlinks could have an option to group by link
+target.
+
+--[[JoshTriplett]]
+
+> The only downside I see to doing that is that currently it creates a
+> "?Link" that will create the missing page, with a default location that's
+> the same as clicking on the "?Link" in the page with the broken link. But
+> if multiple pages are listed on one line, there's only one link and so it
+> can only be from=somepage. This would probably not be a problem in most
+> cases though. It's likely that if a missing page is linked to from 2+ pages,
+> that the user both won't take much care which link is clicked on
+> to create it, and that both pages really meant to link to the same page
+> anyway. --[[Joey]]
+
+[[done]]
diff --git a/doc/todo/bzr.mdwn b/doc/todo/bzr.mdwn
new file mode 100644
index 000000000..a50c58d26
--- /dev/null
+++ b/doc/todo/bzr.mdwn
@@ -0,0 +1,194 @@
+This is mostly based on the Mercurial plugin (in fact, apart from the commands
+being run, only the name of the rcs was changed in rcs_recentchanges, and
+rcs_commit was only changed to work around bzr's lack of a switch to set the
+username). bzr_log could probably be written better by someone better at perl,
+and rcs_getctime and rcs_notify aren't written at all. --[[bma]]
+
+(rcs_notify is not needed in this branch --[[Joey]])
+
+ #!/usr/bin/perl
+
+ use warnings;
+ use strict;
+ use IkiWiki;
+ use Encode;
+ use open qw{:utf8 :std};
+
+ package IkiWiki;
+
+ sub bzr_log($) {
+ my $out = shift;
+
+ my @lines = <$out>;
+
+ my @entries = split(/\n-+\s/,join("", @lines));
+
+ my @ret = ();
+
+ foreach my $entry (@entries) {
+
+ my ($initial,$i) = split(/message:/,$entry,2);
+ my ($message, $j, $files) = split(/(added|modified|removed):/,$i,3);
+ $message =~ s/\n/\\n/g;
+ $files =~ s/\n//g;
+ $entry = $initial . "\ndescription: " . $message . "\nfiles: " . $files;
+
+ my @lines = split(/\n/,$entry);
+ shift(@lines);
+
+ my %entry;
+ foreach (@lines) {
+ my ($key,$value) = split(/: /);
+ $entry{$key} = $value;
+ }
+ $entry{description}=~s/\\n/\n/g;
+ $entry{files}=~s/\s\s+/\ /g;
+ $entry{files}=~s/^\s+//g;
+
+ $ret[@ret] = {
+ "description" => $entry{description},
+ "user" => $entry{committer},
+ "files" => $entry{files},
+ "date" => $entry{timestamp},
+ }
+ }
+
+ return @ret;
+ }
+
+ sub rcs_update () {
+ # Not needed.
+ }
+
+ sub rcs_prepedit ($) {
+ return "";
+ }
+
+ sub rcs_commit ($$$;$$) {
+ my ($file, $message, $rcstoken, $user, $ipaddr) = @_;
+
+ if (defined $user) {
+ $user = possibly_foolish_untaint($user);
+ }
+ elsif (defined $ipaddr) {
+ $user = "Anonymous from ".possibly_foolish_untaint($ipaddr);
+ }
+ else {
+ $user = "Anonymous";
+ }
+
+ $message = possibly_foolish_untaint($message);
+ if (! length $message) {
+ $message = "no message given";
+ }
+
+ my $olduser = `bzr whoami`;
+ chomp $olduser;
+ system("bzr","whoami",$user); # This will set the branch username; there doesn't seem to be a way to do it on a per-commit basis.
+ # Save the old one and restore after the commit.
+ my @cmdline = ("bzr", "commit", "-m", $message, $config{srcdir}."/".$file);
+ if (system(@cmdline) != 0) {
+ warn "'@cmdline' failed: $!";
+ }
+
+ $olduser=possibly_foolish_untaint($olduser);
+ system("bzr","whoami",$olduser);
+
+ return undef; # success
+ }
+
+ sub rcs_add ($) {
+ my ($file) = @_;
+
+ my @cmdline = ("bzr", "add", "--quiet", "$config{srcdir}/$file");
+ if (system(@cmdline) != 0) {
+ warn "'@cmdline' failed: $!";
+ }
+ }
+
+ sub rcs_recentchanges ($) {
+ my ($num) = @_;
+
+ eval q{use CGI 'escapeHTML'};
+ error($@) if $@;
+
+ my @cmdline = ("bzr", "log", "--long", "--verbose", "--limit", $num,$config{srcdir});
+ open (my $out, "@cmdline |");
+
+ eval q{use Date::Parse};
+ error($@) if $@;
+
+ my @ret;
+ foreach my $info (bzr_log($out)) {
+ my @pages = ();
+ my @message = ();
+
+ foreach my $msgline (split(/\n/, $info->{description})) {
+ push @message, { line => $msgline };
+ }
+
+ foreach my $file (split / /,$info->{files}) {
+ my $diffurl = $config{'diffurl'};
+ $diffurl =~ s/\[\[file\]\]/$file/go;
+ $diffurl =~ s/\[\[r2\]\]/$info->{changeset}/go;
+
+ push @pages, {
+ page => pagename($file),
+ diffurl => $diffurl,
+ };
+ }
+
+ my $user = $info->{"user"};
+ $user =~ s/\s*<.*>\s*$//;
+ $user =~ s/^\s*//;
+
+ push @ret, {
+ rev => $info->{"changeset"},
+ user => $user,
+ committype => "bzr",
+ when => time - str2time($info->{"date"}),
+ message => [@message],
+ pages => [@pages],
+ };
+ }
+
+ return @ret;
+ }
+
+ sub rcs_notify () {
+ # TODO
+ }
+
+ sub rcs_getctime ($) {
+ # TODO
+ }
+
+ 1
+
+
+[[patch]]
+
+
+> Thanks for doing this.
+> bzr 0.90 has support for `--author` on commit, to set the author one commit at a time;
+> you might like to use that instead of changing the global username (which is racy).
+>
+> Wouter van Heyst and I were also working on a plugin for bzr, but we were waiting for
+> the smart server to grow the ability to run server side hooks, so that you can edit locally
+> and then push to rebuild the wiki, but there is no need to stop this going in in the mean
+> time.
+> Thanks again --[[JamesWestby]]
+
+>> I didn't know about --author, it doesn't seem to be mentioned in the manual.
+>> I'd update the patch to reflect this, but it breaks with the version of bzr
+>> from Stable, and also the one I'm currently using from backports.org.
+
+>>> It's new (in fact I'm not even sure that it made it into 0.90; it might be in 0.91, due
+>>> in a couple of weeks).
+>>> I was just noting it for a future enhancement. --[[JamesWestby]]
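For future reference, the `--author` approach could replace the racy `bzr whoami` save/restore in the patch above. A minimal sketch (Python purely for illustration; the plugin itself is Perl, the function name is hypothetical, and `--author` needs bzr 0.90 or newer):

```python
def bzr_commit_cmdline(srcdir, file, message, user):
    # Build a bzr commit command that sets the author for this one
    # commit only, instead of changing the global branch username.
    if not message:
        message = "no message given"
    return ["bzr", "commit", "-m", message,
            "--author", user, "%s/%s" % (srcdir, file)]
```

The returned list would be run the same way as the `@cmdline` arrays in the patch above.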
+
+> I've just posted another patch with support for bzr, including support for
+> --author and a testsuite to git://git.samba.org/jelmer/ikiwiki.git. I hadn't
+> seen this page earlier. --[[jelmer]]
+
+> I used jelmer's patch --[[done]]! --[[Joey]]
diff --git a/doc/todo/cache_backlinks.mdwn b/doc/todo/cache_backlinks.mdwn
new file mode 100644
index 000000000..dc13d464e
--- /dev/null
+++ b/doc/todo/cache_backlinks.mdwn
@@ -0,0 +1,25 @@
+I'm thinking about caching the backlinks between runs. --[[Joey]]
+
+* It would save some time (spent resolving every single link
+ on every page, every run). The cached backlinks could be
+ updated by only updating backlinks from changed pages.
+ (Saved time is less than 1/10th of a second for docwiki.)
+
+* It may allow attacking [[bugs/bestlink_change_update_issue]],
+ since that seems to need a copy of the old backlinks.
+ Actually, just the next change will probably solve that:
+
+* It should allow removing the `%oldlink_targets`, `%backlinkchanged`,
+ and `%linkchangers` calculation code. Instead, just generate
+ a record of which pages' backlinks have changed when updating
+ the backlinks, and then rebuild those pages.
+
+Proposal:
+
+* Store a page's backlinks in the index, same as everything else.
+
+* Do *something* to generate or store the `%brokenlinks` data.
+ This is currently generated when calculating backlinks, and
+ is only used by the brokenlinks plugin. It's not the right
+ "shape" to be stored in the index, but could be changed around
+ to fit.
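The incremental update described above can be sketched as follows (Python, illustrative only; the function name and data shapes are hypothetical, not ikiwiki's actual API):

```python
def update_backlinks(backlinks, page, old_links, new_links):
    # Adjust only the backlink entries affected by one changed page,
    # and return the set of target pages whose backlinks changed
    # (those are the pages that need a rebuild).
    changed = set()
    for target in old_links - new_links:
        backlinks.get(target, set()).discard(page)
        changed.add(target)
    for target in new_links - old_links:
        backlinks.setdefault(target, set()).add(page)
        changed.add(target)
    return changed
```

Only the pages in the returned set need rebuilding, instead of resolving every link on every page on every run.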
diff --git a/doc/todo/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn b/doc/todo/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn
new file mode 100644
index 000000000..0a036d315
--- /dev/null
+++ b/doc/todo/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn
@@ -0,0 +1,124 @@
+I am serving notice that I am starting work on a calendar plugin inspired by Blosxom's calendar plugin. The current plan is to create a plugin that looks through all the source files matching a certain pagespec, and optionally spits out a month view for the specified month (defaulting to the current month), a year view for a given year (defaulting to the current year), or a list of years with posts in them. The output would be a table, with the same CSS directives that the Blosxom plugin used to use (so that I can just reuse my css file). The links would be created to a $config{archivedir}/$year or $config{archivedir}/$year-$month file, which can just have
+
+ \[[!inline pages="blog/* and !*/Discussion and creation_year($year) and creation_month($month)" rss="no" atom="no" show="0"]]
+
+or something to generate an archive of postings.
+
+Roland Mas suggested a separate cron job to generate these archive indices automatically, but that is another thread.
+
+ManojSrivastava
+
+This plugin is inspired by the calendar plugin for Blosxom, but derives no code from it. This plugin is essentially a fancy front end to archives of previous pages, usually used for blogs. It can produce a calendar for a given month, or a list of months for a given year. To invoke the calendar, just use the preprocessor directive:
+
+ \[[!calendar ]]
+
+or
+
+ \[[!calendar type="month" pages="blog/* and !*/Discussion"]]
+
+or
+
+ \[[!calendar type="year" year="2005" pages="blog/* and !*/Discussion"]]
+
+
+The year and month entities in the output have links to archive index pages, which are supposed to exist already. The idea is to create an archives hierarchy, rooted in the subdirectory specified in the site-wide customization variable archivebase, which defaults to "archives". Links are created to pages "$archivebase/$year" and "$archivebase/$year/$month". The idea is to create annual and monthly indices, for example by using something like this sample from my archives/2006/01.mdwn:
+
+ \[[!meta title="Archives for 2006/01"]]
+ \[[!inline rootpage="blog" atom="no" rss="no" show="0" pages="blog/* and !*/Discussion and creation_year(2006) and creation_month(01)" ]]
+
+I'll send in the patch via email.
+
+ManojSrivastava
+
+------
+
+Since this is a little bit er, stalled, I'll post here the stuff Manoj
+mailed me, and my response to it. --[[Joey]]
+
+> > I'm sending in an updated package, and have removed the older version you had here.--ManojSrivastava
+
+
+[[!tag patch]]
+
+----
+
+I've been looking over the calendar plugin. Some items:
+
+* Why did you need to use a two-stage generation with a format hook?
+ That approach should only be needed if adding something to a page that
+ would be removed by the htmlscrubber, and as far as I can tell, the
+ calendars don't involve anything that would be a problem. It seems
+ that emitting the whole calendar in the preprocess hook would simplify
+ things and you'd not need to save state about calendars.
+
+> I am scared of the html scrubber, and have never turned it on,
+> and did not look too deeply into what would be scrubbed out --ManojSrivastava
+>> Unless you're using javascript, a few annoyances like <blink>, or inline
+>> css, it's unlikely to object to any html you might write. The list of
+>> allowed tags and attributes is easy to find near the top of the plugin.
+
+> In the case where the option that gets the ctime of the pages from the
+> SCM itself is used, %IkiWiki::pagectime is not populated that early,
+> is it? So I waited until the last possible moment to look at
+> the time information.
+>
+>> Actually, since my big rewrite of the rendering path a few months ago,
+>> ikiwiki scans and populates almost all page information before starting
+>> to render any page. This includes %pagectime, and even %links. So you
+>> shouldn't need to worry about running it late.
+
+* The way that it defaults to the current year and current month
+ is a little bit tricky, because of course the wiki might not get
+ updated in a particular time period, and even if it is updated, only
+ if a page containing a calendar is rebuilt for some other reason will
+ the calendar get updated, and change what year or month it shows. This
+ is essentially the same problem described in
+ [[todo/tagging_with_a_publication_date]],
+ although I don't think it will affect the calendar plugin very badly.
+ Still, the docs probably need to be clear about this.
+
+> I use it on the sidebar; and the blog pages are almost always
+> rebuilt, which is where the calendar is looked at most often. Oh,
+> and I also cheat, I have ikiwiki --setup foo as a @daily cronjob, so
+> my wiki is always built daily from scratch.
+>
+> I think it should be mentioned, yes.
+
+* There seems to be something a bit wrong with the year-to-year
+ navigation in the calendar, based on the example in your blog. If I'm
+ on the page for 2006, there's an arrow pointing left which takes me to
+ 2005. If I'm on 2005, the arrow points left, but goes to 2006, not
+ 2004.
+
+> I need to look into this.
+
+* AIUI, the archivebase setting makes a directory rooted at the top of
+ the wiki, so you can have only one set of archives per wiki, in
+ /archives/. It would be good if it were possible to have multiple
+ archives for different blogs in the same wiki at multiple locations.
+ Though since the archives contain calendars, the archive location
+ can't just be relative to the page with the calendar. But perhaps
+ archivebase could be a configurable parameter that can be specified in
+ the directive for the calendar? (It would be fine to keep the global
+ location as a default.)
+
+> OK, this is simple enough to implement. I'll do that (well,
+> perhaps not before Xmas, I have a family dinner to cook) and send in
+> another patch.
+
+
+----
+And that's all I've heard so far. Hoping I didn't miss another patch?
+
+--[[Joey]]
+
+>No, you did not. But I am back to hacking on this, and I think I have discovered a major problem with my approach. One of the problems with the current plugin is that the goal of a calendar is to create a calendar, either a month or a year one, that provides links to blogs for all the days (in the month calendar) or all the months (in the year calendar) in which there have been blog postings. For the monthly calendar, it needs to know the previous and next months where there is a posting, and for the year calendar, it needs to know which of the previous (next) years had entries.
+
+>Now, this means that it needs to know about _all_ pages that meet the pagespec, and stash that information, before it begins generating the calendar in question, in order to calculate how to create the links. And, of course, all pages that have calendars on them might need to change anytime a page that meets the pagespec is added; and again at midnight, when the current day changes.
+
+>> I think I have solved the "need to look at all pages that match the spec" issue; but the nightly rebuild to handle the current day changing still remains. I use cron. It is now, however, richly documented :)
+
+
+--ManojSrivastava
+
+> Finally reviewed and applied this. [[done]]! --[[Joey]]
diff --git a/doc/todo/calendar_with___34__create__34___links.mdwn b/doc/todo/calendar_with___34__create__34___links.mdwn
new file mode 100644
index 000000000..9fe6c434a
--- /dev/null
+++ b/doc/todo/calendar_with___34__create__34___links.mdwn
@@ -0,0 +1,10 @@
+The [[ikiwiki/directive/calendar]] directive is quite usable without ikiwiki-calendar (eg for articles about meetings), but in such situations it might be useful to have page-creating links on the days.
+
+A [[!taglink patch]] to address this [[!taglink wishlist]] item is [[attached|incomplete_patch.pl]].
+
+From the new documentation (also in the patch):
+
+> * `newpageformat` - In month mode, if no articles match the query, the value of
+> `newpageformat` will be used to strformat the date in question. A good value
+> is `newpageformat="meetings/%Y-%m-%d"`. It might be a good idea to have
+> `\[[!meta date="<TMPL_VAR name>"]]` in the edittemplate of `meetings/*`
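As a quick illustration of how such a format string would expand (Python shown for convenience; the plugin would do the equivalent with POSIX strftime in Perl, and the date is an arbitrary example):

```python
from datetime import date

# Hypothetical example: expanding newpageformat for one calendar day.
fmt = "meetings/%Y-%m-%d"
target_page = date(2008, 1, 5).strftime(fmt)
print(target_page)  # meetings/2008-01-05
```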
diff --git a/doc/todo/calendar_with___34__create__34___links/incomplete_patch.pl b/doc/todo/calendar_with___34__create__34___links/incomplete_patch.pl
new file mode 100644
index 000000000..dc6798831
--- /dev/null
+++ b/doc/todo/calendar_with___34__create__34___links/incomplete_patch.pl
@@ -0,0 +1,36 @@
+diff --git a/IkiWiki/Plugin/calendar.pm b/IkiWiki/Plugin/calendar.pm
+index d443198..0436eda 100644
+--- a/IkiWiki/Plugin/calendar.pm
++++ b/IkiWiki/Plugin/calendar.pm
+@@ -238,7 +238,16 @@ EOF
+ else {
+ $tag='month-calendar-day-nolink';
+ }
+- $calendar.=qq{\t\t<td class="$tag $downame{$wday}">$day</td>\n};
++ if ($params{newpageformat}) {
++ $calendar.=qq{\t\t<td class="$tag $downame{$wday}">};
++ $calendar.=htmllink($params{page}, $params{destpage},
++ strftime_utf8($params{newpageformat}, 0, 0, 0, $day, $params{month} - 1, $params{year} - 1900),
++ noimageinline => 1,
++ linktext => $day);
++ $calendar.=qq{</td>\n};
++ } else {
++ $calendar.=qq{\t\t<td class="$tag $downame{$wday}">$day</td>\n};
++ }
+ }
+ }
+
+diff --git a/doc/ikiwiki/directive/calendar.mdwn b/doc/ikiwiki/directive/calendar.mdwn
+index cb40f88..7b7fa85 100644
+--- a/doc/ikiwiki/directive/calendar.mdwn
++++ b/doc/ikiwiki/directive/calendar.mdwn
+@@ -56,5 +56,9 @@ An example crontab:
+ and so on. Defaults to 0, which is Sunday.
+ * `months_per_row` - In the year calendar, number of months to place in
+ each row. Defaults to 3.
++* `newpageformat` - In month mode, if no articles match the query, the value of
++ `newpageformat` will be used to strformat the date in question. A good value
++ is `newpageformat="meetings/%Y-%m-%d"`. It might be a good idea to have
++ `\[[!meta date="<TMPL_VAR name>"]]` in the edittemplate of `meetings/*`.
+
+ [[!meta robots="noindex, follow"]]
diff --git a/doc/todo/call_git-update-server-info_from_post-udpate_hook.mdwn b/doc/todo/call_git-update-server-info_from_post-udpate_hook.mdwn
new file mode 100644
index 000000000..5c769211b
--- /dev/null
+++ b/doc/todo/call_git-update-server-info_from_post-udpate_hook.mdwn
@@ -0,0 +1,15 @@
+ikiwiki uses the `post-update` Git hook, which is traditionally used to
+call `git-update-server-info`. That is only needed for HTTP-served
+repositories.
+
+It would be nice if there were a configuration option with which I could
+instruct the hook to call `git-update-server-info` as well, in the
+wrappers['git'] configuration hash (?).
+
+--[[madduck]]
+
+> Of course you can have ikiwiki write the wrapper to post-update.ikiwiki
+> and then just call that from the real post-update script. That's what I do if I
+> need to also notify cia or what have you. --[[Joey]]
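Joey's suggestion amounts to a real `post-update` hook that updates the server info and then calls the ikiwiki-generated wrapper. A sketch (Python for illustration only; hooks are usually shell scripts, and the wrapper path is hypothetical):

```python
import subprocess

def post_update():
    # Keep dumb-HTTP clients working ...
    subprocess.call(["git", "update-server-info"])
    # ... then let the ikiwiki wrapper rebuild the wiki.
    subprocess.call(["./hooks/post-update.ikiwiki"])
```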
+
+[[Convinced|done]], --[[madduck]]
diff --git a/doc/todo/canonical_feed_location.mdwn b/doc/todo/canonical_feed_location.mdwn
new file mode 100644
index 000000000..694f67633
--- /dev/null
+++ b/doc/todo/canonical_feed_location.mdwn
@@ -0,0 +1,16 @@
+Any way to use `inline` but point the feed links to a different feed on the
+same site? I have news in news/*, a news archive in news.mdwn, and the
+first few news items on index.mdwn, but I don't really want two separate
+feeds, one with all news and one with the latest few articles; I'd rather
+point the RSS feed links of both to the same feed. (Which one, the one
+with all news or the one with the latest news only, I don't know yet.)
+
+> Not currently. It could be implemented, or you could just turn off the
+> rss feed for the index page, and manually put in a wikilink to the news
+> page and rss feed. --[[Joey]]
+
+>> That wouldn't use the same style for the RSS and Atom links, and it
+>> wouldn't embed the feed link into `<head>` so that browsers can automatically
+>> find it.
+
+[[!tag wishlist]]
diff --git a/doc/todo/capitalize_title.mdwn b/doc/todo/capitalize_title.mdwn
new file mode 100644
index 000000000..3e8366dd3
--- /dev/null
+++ b/doc/todo/capitalize_title.mdwn
@@ -0,0 +1,31 @@
+Here I propose an option (with a [[patch]]) to capitalize the first letter (ucfirst) of default titles: filenames and urls can be lowercase, but titles are displayed with a capital first character (filename = "foo.mdwn", pagetitle = "Foo"). Note that \[[!meta title]] is unaffected (no automatic capitalization). Comments please :) --[[JeanPrivat]]
+<pre><code>
+diff --git a/IkiWiki.pm b/IkiWiki.pm
+index 6da2819..fd36ec4 100644
+--- a/IkiWiki.pm
++++ b/IkiWiki.pm
+@@ -281,6 +281,13 @@ sub getsetup () {
+ safe => 0,
+ rebuild => 1,
+ },
++ capitalize => {
++ type => "boolean",
++ default => undef,
++ description => "capitalize the first letter of page titles",
++ safe => 1,
++ rebuild => 1,
++ },
+ userdir => {
+ type => "string",
+ default => "",
+@@ -989,6 +996,10 @@ sub pagetitle ($;$) {
+ $page=~s/(__(\d+)__|_)/$1 eq '_' ? ' ' : "&#$2;"/eg;
+ }
+
++ if ($config{capitalize}) {
++ $page = ucfirst $page;
++ }
++
+ return $page;
+ }
+</code></pre>
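For reference, Perl's `ucfirst` only uppercases the first character and leaves the rest of the title untouched; a one-line Python equivalent (illustration only):

```python
def ucfirst(s):
    # Uppercase only the first character, like Perl's ucfirst.
    return s[:1].upper() + s[1:]

print(ucfirst("foo bar"))  # Foo bar
```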
diff --git a/doc/todo/cas_authentication.mdwn b/doc/todo/cas_authentication.mdwn
new file mode 100644
index 000000000..ed8010518
--- /dev/null
+++ b/doc/todo/cas_authentication.mdwn
@@ -0,0 +1,184 @@
+[[!tag patch wishlist]]
+
+ikiwiki should support [Central Authentication
+Service](http://www.ja-sig.org/products/cas/) authentication in order to use
+this <acronym title='Single Sign On'>SSO</acronym> mechanism, which is very
+popular in university web services.
+
+I have already written a first draft plugin supporting that authentication
+mechanism. It works for me with my university's CAS service. I have not tried it
+with other CAS servers, but I do not see any reason why it should not work.
+
+What is the best way to submit it to you? (Just in case it can help, my patch
+follows.)
+
+--[[/users/bbb]]
+
+> Inline here is ok; git-am by mail is ok; a git repo I can pull from also
+> ok.
+>
+> This looks pretty acceptable as-is, but you need to put a copyright and
+> license statement at the top. I have a few questions that I'll insert
+> inline with the patch below. --[[Joey]]
+
+>> I have made some corrections to this patch (my cas plugin) in order to use
+>> IkiWiki 3.00 interface and take your comments into account. It should work
+>> fine now.
+>>
+>> You can pull it from my git repo at
+>> http://git.boulgour.com/bbb/ikiwiki.git/ and maybe add it to your main
+>> repo.
+>>
+>> I will add GNU GPL copyright license statement as soon as I get some free
+>> time.
+>>
+>> --[[/users/bbb]]
+
+------------------------------------------------------------------------------
+ diff --git a/IkiWiki/Plugin/cas.pm b/IkiWiki/Plugin/cas.pm
+ new file mode 100644
+ index 0000000..ea189df
+ --- /dev/null
+ +++ b/IkiWiki/Plugin/cas.pm
+ @@ -0,0 +1,94 @@
+ +#!/usr/bin/perl
+ +# JaSIG CAS support by Bruno Beaufils <bruno@boulgour.com>
+ +package IkiWiki::Plugin::cas;
+ +
+ +use warnings;
+ +use strict;
+ +use IkiWiki 2.00;
+ +use AuthCAS; # http://search.cpan.org/~osalaun/AuthCAS-1.3.1/
+
+> In ikiwiki we generally demand-load perl modules only when they're used.
+> This avoids loading expensive modules when the CGI isn't doing
+> authentication. Can you do that with AuthCAS? Something like this before
+> the use of it: `eval q{use AuthCAS}; error $@ if $@`
+
+ +
+ +sub import {
+ + hook(type => "getopt", id => "cas", call => \&getopt);
+ + hook(type => "auth", id => "cas", call => \&auth);
+ + hook(type => "formbuilder_setup", id => "cas", call => \&formbuilder_setup);
+ +}
+
+> Could you please use tabs for indentation of program flow?
+
+ +# FIXME: We should check_config to ensure that :
+ +# * cas_url and ca_file are present
+
+> Please fix that..
+
+ +# * no other auth plugin are present (at least passwordauth and openid)
+
+> Why would you want to make other auth plugins not work? Could a site not
+> legitimately choose to use this and another auth method?
+
+ +sub getopt () {
+ + eval q{use Getopt::Long};
+ + error($@) if $@;
+ + Getopt::Long::Configure('pass_through');
+ + GetOptions("cas_url=s" => \$config{cas_url});
+ + GetOptions("ca_file=s" => \$config{ca_file});
+ +}
+ +
+ +sub auth ($$) {
+ + my $q=shift;
+ + my $session=shift;
+ +
+ + my $cas = new AuthCAS(casUrl => $config{'cas'}{'cas_url'},
+ + CAFile => $config{'cas'}{'ca_file'});
+ +
+ + my $service = $config{'cgiurl'};
+ + my $ticket = $q->param('ticket');
+ +
+ + unless (defined($ticket)) {
+ + $service .= "?$ENV{QUERY_STRING}";
+ + my $login_url = $cas->getServerLoginURL($service);
+ + debug("CAS: asking a Service Ticket for service $service");
+ + IkiWiki::redirect($q, $login_url);
+ + exit 0;
+ + } else {
+ + $service = $service . "?$ENV{QUERY_STRING}";
+ + $service =~ s/\&ticket=$ticket//;
+ + my $user = $cas->validateST($service, $ticket);
+ + if (defined $user) {
+ + debug("CAS: validating a Service Ticket ($ticket) for service $service");
+ + $session->param(name=>$user);
+ + $session->param(CASservice=>$service);
+ + IkiWiki::cgi_savesession($session);
+ + } else {
+ + error("CAS failure: ".&AuthCAS::get_errors());
+ + }
+ + }
+ +}
+ +
+ +# I use formbuilder_setup and not formbuilder type in order to bypass the
+ +# Logout processing done in IkiWiki::CGI::cgi_prefs()
+ +sub formbuilder_setup (@) {
+ + my %params=@_;
+ +
+ + my $form=$params{form};
+ + my $session=$params{session};
+ + my $cgi=$params{cgi};
+ + my $buttons=$params{buttons};
+ +
+ + my $cas = new AuthCAS(casUrl => $config{'cas'}{'cas_url'},
+ + CAFile => $config{'cas'}{'ca_file'});
+ +
+ + if ($form->title eq "preferences") {
+ + # Show the login
+ + if (! defined $form->field(name => "name")) {
+ + $form->field(name => "CAS ID",
+ + disabled => 1,
+ + value => $session->param("name"),
+ + size => 50,
+ + force => 1,
+ + fieldset => "login");
+ + }
+ +
+ + # Force a logout if asked
+ + if ($form->submitted && $form->submitted eq 'Logout')
+ + {
+ + debug("CAS: asking to remove the Ticket Grant Cookie");
+ + IkiWiki::redirect($cgi, $cas->getServerLogoutURL($config{'url'}));
+ + $session->delete();
+ + exit 0;
+ + }
+ + }
+ +}
+ +
+ +1
+ diff --git a/doc/plugins/cas.mdwn b/doc/plugins/cas.mdwn
+ new file mode 100644
+ index 0000000..2f2f53e
+ --- /dev/null
+ +++ b/doc/plugins/cas.mdwn
+ @@ -0,0 +1,18 @@
+ +[[ template id=plugin name=cas core=0 author="[[bbb]]"]]
+ +[[ tag type/auth]]
+ +
+ +This plugin allows users to use authentication offered by a
+ +[JaSIG](http://www.ja-sig.org) [<acronym title='Central Authentication
+ +Service'>CAS</acronym>](http://www.ja-sig.org/products/cas/) server to log
+ +into the wiki.
+ +
+ +The plugin needs the [[!cpan AuthCAS-1.3.1]] perl module.
+
+> Does it really need that specific version? I think you should lose the
+> version part.
+
+ +
+ +This plugin has two mandatory configuration option. You **must** set `--cas_url`
+ +to the url of a server offering CAS 2.0 authentication. You must also set the
+ +`--ca_file` to an absolute path to the file containing CA certificates used by
+ +the server (generally, aka under Debian, fixing that value to
+ +`/etc/ssl/certs/ca-certificates.crt` is sufficient).
+
+> It would be good to add commented-out examples of these to
+> ikiwiki.setup as well.
+
+ +This plugin is not enabled by default. It can not be used with other
+ +authentication plugin, such as [[passwordauth]] or [[openid]].
+
+------------------------------------------------------------------------------
diff --git a/doc/todo/cdate_and_mdate_available_for_templates.mdwn b/doc/todo/cdate_and_mdate_available_for_templates.mdwn
new file mode 100644
index 000000000..70d8fc8c9
--- /dev/null
+++ b/doc/todo/cdate_and_mdate_available_for_templates.mdwn
@@ -0,0 +1,15 @@
+[[!tag wishlist]]
+
+`CDATE_3339`, `CDATE_822`, `MDATE_3339` and `MDATE_822` template variables would be useful for every page, at least for my templates with Dublin Core metadata.
+
+I tried to pick the relevant lines of the [[inline|plugins/inline]] plugin and hack it into a custom plugin, but it failed miserably because of my obvious lack of perl literacy...
+
+Anyway, I'm sure this is almost nothing...
+
+* `sub date_822 ($) {}`
+* `sub date_3339 ($) {}`
+* and something like `$template->param('cdate_822' => date_822($IkiWiki::pagectime{$page}));`
+
+Anyone can fill the missing lines?
+
+-- [[nil]]
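A sketch of the two missing helpers, in Python for illustration (the real ones would be Perl subs fed from `%IkiWiki::pagectime`/`%pagemtime`; only the sub names follow the request above):

```python
from datetime import datetime, timezone
from email.utils import format_datetime

def date_822(epoch):
    # RFC 822 / RFC 2822 date, the flavour RSS feeds use.
    return format_datetime(datetime.fromtimestamp(epoch, timezone.utc))

def date_3339(epoch):
    # RFC 3339 date, the flavour Atom feeds use.
    return datetime.fromtimestamp(epoch, timezone.utc).isoformat()

print(date_822(1199145600))   # Tue, 01 Jan 2008 00:00:00 +0000
print(date_3339(1199145600))  # 2008-01-01T00:00:00+00:00
```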
diff --git a/doc/todo/cgi_hooks_get_session_objects.mdwn b/doc/todo/cgi_hooks_get_session_objects.mdwn
new file mode 100644
index 000000000..edb9aba25
--- /dev/null
+++ b/doc/todo/cgi_hooks_get_session_objects.mdwn
@@ -0,0 +1,5 @@
+How about a hook to allow CGI objects to insist on authenticated users
+only? I think "authcgi" would be a good name. --Ethan
+
+> This is now [[done]], although I called it sessioncgi since the user may
+> or may not be authed. --[[Joey]]
diff --git a/doc/todo/clear_page_to_delete.mdwn b/doc/todo/clear_page_to_delete.mdwn
new file mode 100644
index 000000000..6bab6ef27
--- /dev/null
+++ b/doc/todo/clear_page_to_delete.mdwn
@@ -0,0 +1,33 @@
+Would it make sense to automatically delete a page if it's edited and
+cleared to be entirely empty (or only have whitespace)? Discuss. --[[Joey]]
+
+I'd say so; yes. A method of deleting pages via the web would be great; I
+can't think of a use for keeping blank pages around. What about vandalism --
+if someone blanks a page and deletes it, and someone else wishes to restore
+it? Or is undoing edits via the web a bigger issue? -- [[users/Jon]]
+
+Of course there's already a way to delete pages (remove plugin). So the
+question is really:
+
+* Does it make sense to have a second way to do it, by clearing the page?
+* Should it be enabled even if the full remove plugin isn't?
+
+Re vandalism in general, I am generally happy using git-revert to kill the
+offending change. --[[Joey]]
+
+I don't think we need a second way to delete pages; it would probably be
+used only by the few people who learn it's possible by random
+documentation reading, find it useful, *and* remember it. -- [[intrigeri]]
+
+On the other hand, clearing the page's whole content essentially means deleting
+the page. That's what the user intended to do in this case. The information
+content of an empty vs. a deleted page is essentially the same, I'd say. But
+having ikiwiki remove those stale pages would save some (minimal, admittedly)
+time needed for manual clean-up. --[[tschwinge]]
+
+On EmacsWiki, a page is marked for deletion when it contains just the DeletedPage
+keyword and there have been no page edits in the last XX days. Here, I use pages that
+can be empty every day and filled all day long. It does not make sense to me to
+delete these pages :). --[[xma]]
+
+I was not aware of [[plugins/remove]]. I don't think another method is necessary -- [[users/Jon]]
diff --git a/doc/todo/clickable-openid-urls-in-logs.mdwn b/doc/todo/clickable-openid-urls-in-logs.mdwn
new file mode 100644
index 000000000..ab2a6b51f
--- /dev/null
+++ b/doc/todo/clickable-openid-urls-in-logs.mdwn
@@ -0,0 +1,23 @@
+OpenID URLs aren't clickable in the ViewVC logs because they're directly
+followed by a colon. At the expense of, um, proper grammar, here's a patch
+for SVN. If this is OK, I'll patch the other RCS modules, too.
+
+> Reasonable, but probably needs to modify the wiki\_commit\_regexp to
+> recognise such commit messages when parsing the logs. Do that and extend
+> to the other modules and I'll accept it. --[[Joey]]
+
+[[!tag patch]]
+
+<pre>
+--- IkiWiki/Rcs/svn.pm (revision 2650)
++++ IkiWiki/Rcs/svn.pm (working copy)
+@@ -71,7 +71,7 @@
+ my $ipaddr=shift;
+
+ if (defined $user) {
+- $message="web commit by $user".(length $message ? ": $message" : "");
++ $message="web commit by $user ".(length $message ? ": $message" : "");
+ }
+ elsif (defined $ipaddr) {
+ $message="web commit from $ipaddr".(length $message ? ": $message" : "");
+</pre>
diff --git a/doc/todo/color_plugin.mdwn b/doc/todo/color_plugin.mdwn
new file mode 100644
index 000000000..19fba3b35
--- /dev/null
+++ b/doc/todo/color_plugin.mdwn
@@ -0,0 +1,231 @@
+Recently I've wanted to colour some piece of text on my Ikiwiki page.
+It seems that Markdown can do it only using HTML tags, so I used
+`<span class="color">foo bar baz</span>`.
+
+However, in my opinion mixing Markdown syntax and HTML tags is rather ugly,
+so maybe we should create a new color plugin to add more color to Ikiwiki ;)
+I know that other wikis have a similar plugin, for example
+[WikiDot](http://www.wikidot.com/).
+
+I've noticed that the htmlscrubber plugin strips the `style` attribute for
+security reasons, so probably we need to use the `class` attribute of HTML.
+But then we have to customize our `local.css` file to add all the colors we
+want to use. It's not as easy to use as a color name or definition passed as
+a plugin argument, but I don't have a better idea right now.
+
+What do you think about it? --[[Paweł|ptecza]]
+
+> Making a plugin preserve style attributes can be done, it just has to add
+> them after the sanitize step, which strips them. The general method is
+> adding placeholders first, and replacing them with the real html later.
+>
+> The hard thing to me seems to be finding a syntax that is better than a
+> `<span>`. A preprocessor directive is not really any less ugly than html
+> tags, though at least it could play nicely with nested markdown: --[[Joey]]
+>
+> \[[!color red,green """
+> Xmas-colored markdown here
+> """]]
+
+>> I'm glad you like that idea. In my opinion your syntax looks good.
+>> Out of curiosity, why did you use 2 colors in your example? What is the HTML
+>> result for it? ;)
+
+>>> I was thinking one would be foreground, the other background. Don't
+>>> know if setting the background makes sense or not.
+
+>> I can try to create that plugin, if you are too busy now. I'm not Perl
+>> hacker, but I wrote a lot of Perl scripts in my life and color plugin
+>> doesn't seem to be very hard task. --[[Paweł|ptecza]]
+
+>> Yes, it's a good intro plugin, have at it! --[[Joey]]
+
+---
+
+This is a RC1 of my `color` plugin. It works for me well, but all your
+comments are very welcome. --[[Paweł|ptecza]]
+
+> Sure, I have a couple.
+
+>> Great! Thank you very much! --[[Paweł|ptecza]]
+
+> The preprocess function is passed named parameters. The hack you have of
+> hardcoding use of `$_[0]` and `$_[2]` can fail at any time.
+
+>> But the problem is that the arguments of my plugin don't have names.
+>> How can I identify them in the `params` hash?
+
+>> I've found a similar hardcoded method in the `img` plugin :) But only one
+>> argument is not named there (the image path).
+
+>>> I think I hadn't realized what you were doing there. The order
+>>> for unnamed parameters can in fact be relied on.
+>>>
+>>> --[[Joey]]
+
+>> Maybe I shouldn't use so simple plugin syntax? For following syntax
+>> I wouldn't have that problem:
+
+>> \[[!color fg=white bg=red text="White text on red background"]]
+
+> `replace_preserved_style` is passed a single parameter, so its prototype
+> should be `($)`, not `(@)`. Ditto `preserve_style`; it should have
+> `($$)`.
+
+>> OK, it will be fixed.
+
+> The sanitize hook is always passed `$params{content}`, so there should be
+> no reason to check that it exists. Also, it shouldn't be done in a
+> sanitize hook, since html sanitization could run _after_ that santize
+> hook. It should use a format hook.
+
+>> Probably you're right. It was rather paranoid checking ;) Thanks for
+>> the hook hint!
+
+> The preprocess hook needs to call `IkiWiki::preprocess` on the content
+> passed into it if you want to support nesting other preprocessor
+> directives inside the color directive. See `preprocess_toggleable` in the
+> toggle plugin, for example.
+>
+> I'm not a big fan of the dummy text `COLORS { ... } SROLOC;TEXT { ... TXET }`
+> The method used by toggle of using two real `<div>`s seems slightly
+> better. --[[Joey]]
+
+>> I don't like that too, but I didn't have better idea :) Thank you for
+>> the hint! I'll take a look at `toggle` plugin.
+
+---
+
+And here is RC2 of that plugin. I've changed the plugin syntax, because the old
+one seemed too enigmatic and it was hard for me to handle unnamed parameters
+in a non-hardcoded way. I hope that my changes are acceptable to you.
+Of course, I'm open for discussion or exchange of ideas :) --[[Paweł|ptecza]]
+
+> One question, why the 2px padding for span.color? --[[Joey]]
+
+>> Sorry for a long silence, but I had Internet free summer holiday :)
+>> I did that, because backgrounded text without any padding looks
+>> strange for me ;) You can remove it if you don't like that padding.
+>> --[[Paweł|ptecza]]
+
+>>> Joey, will you add that plugin to Ikiwiki 2.61? :) --[[Paweł|ptecza]]
+
+>>>> I also had a long net-free summer holiday. :-) The [[patch]] is
+>>>> ready for integration (made a few minor changes). Is this GPL 2?
+>>>> --[[Joey]]
+
+>>>>> No problem. I guessed it, because I've not seen your commits
+>>>>> at [[RecentChanges]] page in last days and I subscribe your
+>>>>> [blog](http://kitenet.net/~joey/blog/entry/vacation/) :D
+>>>>> It's GPL-2+ like your Ikiwiki and the most external plugins.
+>>>>> --[[Paweł|ptecza]]
+
+ --- /dev/null 2008-06-21 02:02:15.000000000 +0200
+ +++ color.pm 2008-07-27 14:58:12.000000000 +0200
+ @@ -0,0 +1,69 @@
+ +#!/usr/bin/perl
+ +# Ikiwiki text colouring plugin
+ +# Paweł‚ Tęcza <ptecza@net.icm.edu.pl>
+ +package IkiWiki::Plugin::color;
+ +
+ +use warnings;
+ +use strict;
+ +use IkiWiki 2.00;
+ +
+ +sub import {
+ + hook(type => "preprocess", id => "color", call => \&preprocess);
+ + hook(type => "format", id => "color", call => \&format);
+ +}
+ +
+ +sub preserve_style ($$$) {
+ + my $foreground = shift;
+ + my $background = shift;
+ + my $text = shift;
+ +
+ + $foreground = defined $foreground ? lc($foreground) : '';
+ + $background = defined $background ? lc($background) : '';
+ + $text = '' unless (defined $text);
+ +
+ + # Validate colors. Only color name or color code are valid.
+ + $foreground = '' unless ($foreground &&
+ + ($foreground =~ /^[a-z]+$/ || $foreground =~ /^#[0-9a-f]{3,6}$/));
+ + $background = '' unless ($background &&
+ + ($background =~ /^[a-z]+$/ || $background =~ /^#[0-9a-f]{3,6}$/));
+ +
+ + my $preserved = '';
+ + $preserved .= '<span class="color">';
+ + $preserved .= 'color: '.$foreground if ($foreground);
+ + $preserved .= '; ' if ($foreground && $background);
+ + $preserved .= 'background-color: '.$background if ($background);
+ + $preserved .= '</span>';
+ + $preserved .= '<span class="colorend">'.$text.'</span>';
+ +
+ + return $preserved;
+ +
+ +}
+ +
+ +sub replace_preserved_style ($) {
+ + my $content = shift;
+ +
+ + $content =~ s!<span class="color">((color: ([a-z]+|\#[0-9a-f]{3,6})?)?((; )?(background-color: ([a-z]+|\#[0-9a-f]{3,6})?)?)?)</span>!<span class="color" style="$1">!g;
+ + $content =~ s!<span class="colorend">!!g;
+ +
+ + return $content;
+ +}
+ +
+ +sub preprocess (@) {
+ + my %params = @_;
+ +
+ + # Preprocess the text to expand any preprocessor directives
+ + # embedded inside it.
+ + $params{text} = IkiWiki::preprocess($params{page}, $params{destpage},
+ + IkiWiki::filter($params{page}, $params{destpage}, $params{text}));
+ +
+ + return preserve_style($params{foreground}, $params{background}, $params{text});
+ +}
+ +
+ +sub format (@) {
+ + my %params = @_;
+ +
+ + $params{content} = replace_preserved_style($params{content});
+ + return $params{content};
+ +}
+ +
+ +1
+ --- /dev/null 2008-06-21 02:02:15.000000000 +0200
+ +++ color.mdwn 2008-07-27 15:04:42.000000000 +0200
+ @@ -0,0 +1,25 @@
+ +\[[!template id=plugin name=color core=0 author="[[ptecza]]"]]
+ +
+ +This plugin can be used to color a piece of text on a page.
+ +It can be used to set the foreground and/or background color of the text.
+ +
+ +You can use a color name (e.g. `white`) or HTML code (e.g. `#ffffff`)
+ +to define colors.
+ +
+ +Below are a few examples:
+ +
+ + \[[!color foreground=white background=#ff0000 text="White text on red background"]]
+ +
+ +In the above example, the foreground color is defined as a word, while the background color is defined as an HTML
+ +color code.
+ +
+ + \[[!color foreground=white text="White text on default color background"]]
+ +
+ +The background color is missing, so the text is displayed on the default background.
+ +
+ + \[[!color background=#ff0000 text="Default color text on red background"]]
+ +
+ +The foreground is missing, so the text has the default foreground color.
+ --- style.css-orig 2008-07-27 15:12:39.000000000 +0200
+ +++ style.css 2008-07-27 15:15:06.000000000 +0200
+ @@ -333,3 +333,7 @@
+ background: #eee;
+ color: black !important;
+ }
+ +
+ +span.color {
+ + padding: 2px;
+ +}
+
+[[done]]
diff --git a/doc/todo/comment_by_mail.mdwn b/doc/todo/comment_by_mail.mdwn
new file mode 100644
index 000000000..87e57417e
--- /dev/null
+++ b/doc/todo/comment_by_mail.mdwn
@@ -0,0 +1,3 @@
+I would like to allow comments on ikiwiki pages without CGI.
+
+> [[done]], see [[plugins/contrib/postal]]
diff --git a/doc/todo/comment_by_mail/discussion.mdwn b/doc/todo/comment_by_mail/discussion.mdwn
new file mode 100644
index 000000000..2b32f9d02
--- /dev/null
+++ b/doc/todo/comment_by_mail/discussion.mdwn
@@ -0,0 +1,25 @@
+I am moving some of the "settled" discussion here, I hope that is
+appropriate. --[[DavidBremner]]
+
+ > I wonder if it would be more or less natural to put an encoded form
+ > of the page name in the email address? I'm thinking about something
+ > like `wikiname+index@host` or `wikiname+todo+comment_by_mail@host`.
+ > The basic transformation would be to call `titlepage($page)` (in the
+ > C locale), followed by replacing "/" with "+" (since "/" is not
+ > valid in mails). --[[Joey]]
+ >> I guess you are right, there is no point being more obscure
+ >> than necessary. I am leaning towards [something](http://www.cs.unb.ca/~bremner/blog/posts/encoding) not
+ >> calling titlepage but in the same spirit. --[[DavidBremner]]
+
+In response to the suggestion by Joey to process mailboxes into blogs:
+>> One thing it made me think about is
+>> how to encode reference (threading) information. One can of
+>> course encode this into local-part, but I wonder if it would be
+>> better to use header features of mailto (this could also be an
+>> alternative to tagged mail addresses for page references).
+>> Various client handling of mailto always seemed a bit fragile to
+>> me but maybe I am just behind the times. Most headers are ignored, but
+>> pseudo-headers in the body might work. For example:
+>>[test](mailto:bremner@somewhere.ca?body=X-Iki-Page:%20test%0AX-Iki-thread:%20foobar). I hesitate to use the subject because every mail admin in the
+>> world seems to want to add things to the front of it.
+>> -- [[DavidBremner]]
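For concreteness, the tagged-address transformation discussed above can be sketched in a few lines of Perl. This is an illustration only, not ikiwiki code: `encode_page_address` and `decode_page_address` are made-up names, and the scheme actually adopted (see the linked blog post) differs in its details.

```perl
#!/usr/bin/perl
# Hypothetical sketch of the tagged-address idea: encode a page path
# into a "+"-separated local part, and decode it back. Illustrative
# names only; not the real ikiwiki or postal plugin code.
use strict;
use warnings;

sub encode_page_address {
	my ($wikiname, $page, $host) = @_;
	$page =~ s{/}{+}g;	# "/" is not valid in a local part
	return "$wikiname+$page\@$host";
}

sub decode_page_address {
	my ($address) = @_;
	my ($local) = split /\@/, $address;
	my (undef, @parts) = split /\+/, $local;	# drop the wikiname tag
	return join '/', @parts;
}
```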
diff --git a/doc/todo/comment_moderation_feed.mdwn b/doc/todo/comment_moderation_feed.mdwn
new file mode 100644
index 000000000..996de152d
--- /dev/null
+++ b/doc/todo/comment_moderation_feed.mdwn
@@ -0,0 +1,16 @@
+There should be a way to generate a feed that is updated whenever a new
+comment needs moderation. Otherwise, it can be hard to remember to check
+sites, which may rarely get comments.
+
+The feed should not include the comment subject or body, but could mention
+the author. It would be especially handy if it was generated statically.
+One way would be to generate internal pages corresponding to each comment
+that needs moderation; then the feed could be constructed via a usual
+inline.
+
+----
+
+See [[tips/comments feed]] --liw
+
+> Indeed, and the demo blog comes with a comments page with such feeds
+> already set up. [[done]] --[[Joey]]
diff --git a/doc/todo/comments.mdwn b/doc/todo/comments.mdwn
new file mode 100644
index 000000000..7a113bee3
--- /dev/null
+++ b/doc/todo/comments.mdwn
@@ -0,0 +1,170 @@
+# Known issues with the [[plugins/comments]] plugin
+
+## Unimplemented
+
+* Instead of just a link to add a comment, it could have a form to enter
+ the title, similar to the form for adding a new blog post.
+
+ > I'm not sure this is so useful? On Livejournal titles are allowed on
+ > comments, but very rarely used (and indeed usually not very useful);
+ > it's hard enough to get some people to title their blog posts :-)
+ > --[[smcv]]
+
+## Won't fix
+
+* Because IkiWiki generates static HTML, we can't have a form inlined in
+ page.tmpl where the user fills in an entire comment and can submit it in
+ a single button-press, without being vulnerable to cross-site request forgery.
+ So I'll put this in as wontfix. --[[smcv]]
+
+ > Surely there's a way around that?
+ > A web 2.0 way comes to mind: The user clicks on a link
+ > to open the comment post form. While the nasty web 2.0 javascript :)
+ > is manipulating the page to add the form to it, it looks at the cookie
+ > and uses that to insert a sid field.
+ >
+ > Or, it could have a mandatory preview page and do the CSRF check then.
+ > --[[Joey]]
+
+* It would be useful to have a pagespec that always matches all comments on
+ pages matching a glob. Something like `comment(blog/*)`.
+ Perhaps postcomment could also be folded into this? Then the pagespec
+ would match both existing comments, as well as new comments that are
+ being posted.
+
+ > Please see [[plugins/comments/discussion]]. If I've convinced you that
+ > internal pages are the way forward, then sure, we can do that, because
+ > people who can comment still won't be able to edit others' comments
+ > (one of my goals is that commenters can't put words into each other's
+ > mouths :-) )
+ >
+ > On the other hand, if you still want me to switch this plugin to "real"
+ > pages, or if internal pages might become editable in future, then
+ > configuring lockedit/anonok so a user X can add comments to blog pages
+ > would also let X edit/delete comments on blog pages (including those
+ > written by others) in arbitrary ways, which doesn't seem good. --[[smcv]]
+
+ > I had a look at implementing comment() and fell afoul of
+ > some optimisations that assume only internal() will be used to match
+ > internal pages. So probably this isn't worth doing. --[[Joey]]
+
+## Done
+
+* There is some common code cargo-culted from other plugins (notably inline and editpage) which
+ should probably be shared
+
+ > Actually, there's less of this now than there used to be - a lot of simple
+ > things that were shared have become unshareable as they became more
+ > complex. --[[smcv]]
+
+ > There's still goto. You have a branch for that. --[[Joey]]
+
+ >> Now merged --[[smcv]]
+
+* The default template should have a (?) icon next to unauthenticated users (with the IP address
+ as title) and an OpenID icon next to OpenIDs
+
+ > Done in my comments git branch, at least as a mockup (using the (?),
+ > {x} and {*} smileys for anonymous, OpenID and login respectively).
+ > --[[smcv]]
+
+ >> I've improved this to use independent icons from the wikiicons
+ >> directory (untested!) --[[smcv]]
+
+ >>> The new code produces links like /wikiicons/openid.png, which
+ >>> fail if ikiwiki is not at the root of the web server. --[[Joey]]
+
+ >>>> Sorry, I should have spotted that (the assumption failed on my demo
+ >>>> site, but the push to that site was when I was on the way out, so I
+ >>>> didn't have time to investigate). As a note for other ikiwiki hackers,
+ >>>> I should have used
+ >>>> `<img src="<TMPL_VAR NAME=BASEURL>wikiicons/openid.png" />`. --[[smcv]]
+
+ >>> I got to wondering if the icons are needed. On my comments branch
+ >>> (not master), I've dropped the icons and info can be seen by hovering
+ >>> over the author's name. Idea being that you probably don't care how
+ >>> they authenticated unless something is weird, and in that case you
+ >>> can hover to check. Does that make sense, should I merge it?
+ >>> --[[Joey]]
+
+ >>>> Yeah, go ahead. I preferred my layout with the author before the
+ >>>> comment - perhaps that's Livejournal's influence :-) - but I can always
+ >>>> edit the templates for my own site. As long as the default is something
+ >>>> reasonable and both layouts are possible, I don't really mind.
+ >>>> Minimizing the number of "resource" files in the basewiki also seems
+ >>>> a good goal. --[[smcv]]
+
+* Previews always say "unknown IP address"
+
+ > Fixed in my comments branch by commits bc66a00b and 95b3bbbf --[[smcv]]
+
+* The Comments link in the "toolbar" is to `index.html#comments`, not the
+ desired `./#comments`
+
+ > Fixed in my comments branch by commit 0844bd0b; commits 5b1cf21a
+ > and c42f174e fix another `beautify_urlpath` bug and add a regression test
+ > --[[smcv]]
+
+
+* Now that inline has some comments-specific functionality anyway, it would
+ be good to output `<link rel="comments">` in Atom and the equivalent in RSS.
+
+ > Fixed in my comments branch by d0d598e4, 3feebe31, 9e5f504e --[[smcv]]
+
+
+* Add `COMMENTOPENID`: the authenticated/verified user name, if and only if it was an OpenID
+
+ > Done in my comments git branch --[[smcv]]
+
+ > Not seeing it there, which branch? --[[Joey]]
+
+ >> Bah, git push --all is not the default... 'comments' branch now (I've also rebased it).
+ >> Sorry, I'm on mobile Internet at the moment... --[[smcv]]
+
+ >>> merged by [[Joey]] in commit 0f03af38 --[[smcv]]
+
+* Should the comments be visually set off more from the page above?
+ Rather than just a horizontal rule, I'm thinking put the comments
+ in a box like is used for inlined pages.
+
+ > I did put them in a box in the CSS... I agree the default template
+ > could do with visual improvement though. --[[smcv]]
+
+ >> I'll consider this solved by [[Joey]]'s changes. --[[smcv]]
+
+* One can use inline to set up a feed of all comments posted to any page.
+ Using template=comment they are displayed right. Only problem
+ is there is no indication in that template of what page each comment in the
+ feed is a comment on. So, if a comment is inlined into a different page,
+ I think it should show a link back to the page commented on.
+ (BTW, the rss feed in this situation seems ok; there the link element
+ points back to the parent page.)
+
+ > done --[[Joey]]
+
+* One of Joey's commit messages says "Not ideal, it would be nicer to jump to
+ the actual comment posted, but no anchor is available". In fact there is
+ an anchor - the `\[[_comment]]` preprocessing wraps the comment in a `<div>`
+ with id="comment_123" or something. I'll fix this, unless Joey gets there
+ first. --[[smcv]]
+
+ > done --[[Joey]]
+
+* If a spammer posts a comment, it is either impossible or hard to clean
+ up via the web. Would be nice to have some kind of link on the comment
+ that allows trusted users to remove it (using the remove plugin of
+ course).
+
+ > Won't the remove plugin refuse to remove internal pages? This would be
+ > a good feature to have, though. --[[smcv]]
+
+ > Here, FWIW, is the first ikiwiki comment spam I've seen:
+ > <http://waldeneffect.org/blog/Snake_bite_information/#blog/Snake_bite_information/comment_1>
+ > So that took about 10 days...
+ > --[[Joey]]
+
+ >> Implemented in my 'comments' branch, please review. It turns out
+ >> [[plugins/remove]] is happy to remove internal pages, so it was quite
+ >> easy to do. --[[smcv]]
+
+ >>> done --[[Joey]]
diff --git a/doc/todo/conditional_text_based_on_ikiwiki_features.mdwn b/doc/todo/conditional_text_based_on_ikiwiki_features.mdwn
new file mode 100644
index 000000000..0d0f66da4
--- /dev/null
+++ b/doc/todo/conditional_text_based_on_ikiwiki_features.mdwn
@@ -0,0 +1,128 @@
+I'd like to see some way to conditionally include wiki text based on
+whether the wiki enables or disables certain features. For example,
+[[ikiwiki/formatting]], could use `\[[!if (enabled smiley) """Also, because
+this wiki has the smiley plugin enabled, you can insert \[[smileys]] and
+some other useful symbols."""]]`, and a standard template for [[plugins]]
+pages could check for the given plugin name to print "enabled" or
+"disabled".
+
+Some potentially useful conditionals:
+
+* `enabled pluginname`
+* `disabled pluginname`
+* `any pagespec`: true if any of the pages in the [[ikiwiki/PageSpec]] exist
+* `all pagespec`: true if all of the pages in the [[ikiwiki/PageSpec]] exist
+* `no pagespec` or `none pagespec`: true if none of the pages in the [[ikiwiki/PageSpec]] exist
+* `thispage pagespec`: true if pagespec includes the page getting rendered (possibly one including the page with this content on it).
+* `sourcepage pagespec`: true if pagespec includes the page corresponding to the file actually containing this content, rather than a page including it.
+* `included`: true if included on another page, via [[plugins/inline]], [[plugins/sidebar]], [[plugins/contrib/navbar]], etc.
+
+You may or may not want to include boolean operations (`and`, `or`, and
+`not`); if you do, you could replace `disabled` with `not enabled`, and `no
+pagespec` or `none pagespec` with `not any pagespec` (but you may want to
+keep the aliases for simplicity anyway). You also may or may not want to
+include an `else` clause; if so, you could label the text used if true as
+`then`.
+
+Syntax could vary greatly here, both for the
+[[ikiwiki/Directive]] and for the condition itself.
+
+> I think this is a good thing to consider, although conditionals tend to
+> make everything a lot more complicated, so I also want to KISS, and not
+> use too many of them.
+>
+> I'd probably implement this using the same method as pagespecs, so 'and',
+> 'or', '!', and paren groupings work.
+>
+> It could be thought of as simply testing to see if a pagespec matches
+> anything, using a slightly expanded syntax for the pagespec, which would
+> also allow testing for things like link(somepage),
+> created_before(somepage), etc.
+>
+> That also gives us your "any pagespec" for free: "page or page or page".
+> And for "all pagespec", you can do "page and page and page".
+>
+> For plugins testing, maybe just use "enabled(name)"?
+>
+> I'm not sure what the use cases are for thispage, sourcepage, and
+> included. I don't know if the included test is even doable. I'd be
+> inclined to not bother with these three unless there are use cases I'm
+> not seeing.
+>
+> As to the syntax, to fit it into standard preprocessor syntax, it would
+> need to look something like this:
+>
+> \[[!if test="enabled(smiley)" """foo"""]]
+>
+> --[[Joey]]
+
+>> [[ikiwiki/PageSpec]] syntax seems perfect, and your proposed syntax for the `if`
+>> [[ikiwiki/Directive]] looks fine to me.
+>>
+>> [[ikiwiki/PageSpec]]s don't give you `none` for free, since `!foo/*` as a boolean
+>> would mean "does any page not matching `foo/*` exist", not "does `foo/*`
+>> match nothing"; however, I don't really care much about `none`, since I
+>> just threw it in while brainstorming, and I don't know any compelling use
+>> cases for it.
+>>
+>> `enabled(pluginname)` will work perfectly, and `!enabled(pluginname)`
+>> makes `disabled` unnecessary.
+>>
+>> A few use cases for `included`, which I would really like to see:
+>>
+>> * On the sidebar page, you could say something like \[[!if test="!included"
+>> """This page, without this help message, appears as a sidebar on all
+>> pages."""]]. The help text would then only appear on the sidebar page
+>> itself, not the sidebar included on all pages.
+>>
+>> * On [[blog]] entries, you could use `included` to implement a cut.
+>> (Please don't take that as an argument against. :) ) For instance, you
+>> could use included rather than [[plugins/toggle]] for the detailed
+>> changelogs of ikiwiki, or to embed an image as a link in the feed rather
+>> than an embedded image.
+>>
+>> Some use cases for `thispage`:
+>>
+>> * You could use `thispage` to include or exclude parts of the sidebar based
+>> on the page you include it in. You can already use subpages/sidebar for
+>> subpages/*, but `thispage` seems more flexible, makes it trivial to have
+>> common portions rather than using [[plugins/inline]] with the `raw`
+>> option, and keeps the sidebar in one place.
+>>
+>> * You could use `thispage` to implement multiple different feeds for the
+>> same content with slightly different presentation. For instance, using
+>> templates for image inclusion, you could offer a feed with image links
+>> and a feed with embedded images. Similarly, using templates for cuts, you
+>> could offer a feed with cuts and a feed with full content in every post.
+>>
+>> I don't have any particular attachment to `sourcepage`. It only makes
+>> sense as part of a template, since otherwise you know the source page when
+>> typing in the if.
+>>
+>> --[[JoshTriplett]]
+
+This is now completely [[todo/done]]! See [[plugins/conditional]].
+
+--[[Joey]]
+
+> You rock mightily. --[[JoshTriplett]]
+
+Is there a way to test features other than plugins? For example,
+to add to [[ikiwiki/Markdown]] something like
+
+ \[[!if test="enabled(multimarkdown)" then="You can also use..."]]
+
+(I tried it like that just to see if it would work, but I wasn't that lucky.)
+--ChapmanFlack
+
+> No, not supported. I really think that trying to conditionalise text on a
+> page for multimarkdown is a path to madness or unreadability though.
+> Perhaps it would be better to have .mmdwn files that can only contain
+> multimarkdown? --[[Joey]]
+
+>> Really, there was only one (or maybe two) pages I had in mind as appropriate
+>> places for conditional text based on multimarkdown&mdash;the underlay pages
+>> for 'markdown' and maybe also 'formatting', because those are the pages you
+>> look at when you're trying to find out how to mark stuff up for the wiki, so
+>> if MM is enabled, they need to at least mention it and have a link to the
+>> MM syntax guide. --ChapmanFlack
diff --git a/doc/todo/conditional_underlay_files.mdwn b/doc/todo/conditional_underlay_files.mdwn
new file mode 100644
index 000000000..c578bceaf
--- /dev/null
+++ b/doc/todo/conditional_underlay_files.mdwn
@@ -0,0 +1,29 @@
+I'd like to see some way to include certain files from the underlay only when the wiki has certain plugins enabled. For example:
+
+* Only include smileys.mdwn and the smileys subdirectory if you enable the [[plugins/smiley]] plugin.
+* Exclude openid.mdwn if you disable the [[plugins/openid]] plugin.
+* Include shortcuts.mdwn only if you enable the [[plugins/shortcut]] plugin.
+* Include blog.mdwn only if you don't disable the [[plugins/inline]] plugin.
+* Include favicon.ico only if you enable the [[plugins/favicon]] plugin.
+* Include wikiicons/diff.png (and the wikiicons directory) only if you enable the CGI.
+* Include a hypothetical restructuredtexthelp.rst or similar for other formats only with those formats enabled.
+
+I can see two good ways to implement this. Ideally, with
+[[conditional_text_based_on_ikiwiki_features]] available, ikiwiki could
+parse a page like conditionalpages.mdwn, which could contain a set of
+conditional-wrapped page names; that seems like the most elegant and
+ikiwiki-like approach. Alternatively, ikiwiki.setup could contain a
+Perl-generated exclude option by default; that would work, but it seems
+hackish.
+
+> Another way might be to have a third directory of source files where
+> plugins could drop in pages, and only build the files from there if their
+> plugins were enabled.
+>
+> Using the conditionals in a page to control what other pages get built
+> feels complex to me.
+>
+> Instead, this has been implemented as the `add_underlay()` function.
+> [[done]]
+> --[[Joey]]
+
diff --git a/doc/todo/configurable_markdown_path.mdwn b/doc/todo/configurable_markdown_path.mdwn
new file mode 100644
index 000000000..63fa2dcbd
--- /dev/null
+++ b/doc/todo/configurable_markdown_path.mdwn
@@ -0,0 +1,64 @@
+[[!template id=gitbranch branch=wtk/mdwn author="[[wtk]]"]]
+
+summary
+=======
+
+Make it easy to configure the Markdown implementation used by the
+[[plugins/mdwn]] plugin. With this patch, you can set the path to an
+external Markdown executable in your ikiwiki config file. If you do
+not set a path, the plugin will use the usual config options to
+determine which Perl module to use.
+
+> This adds a configuration in which a new process has to be forked
+> for every single page rendered. Actually, it doesn't only add
+> such a configuration, it makes it be done by *default*.
+>
+> Markdown is ikiwiki's default, standard renderer. A configuration
+> that makes it slow will make ikiwiki look bad.
+>
+> I would not recommend using Gruber's perl markdown. It is old, terminally
+> buggy, and unmaintained. --[[Joey]] [[!tag reviewed]]
+
+----
+
+I wasn't trying to make an external markdown the default; I was trying
+to make the currently hardcoded `/usr/bin/markdown` configurable. It
+should only use an external process if `markdown_path` is set, which
+it is not by default. Consider the following tests from clean checkouts:
+
+Current ikiwiki trunk:
+
+ $ PERL5LIB="." time ikiwiki --setup docwiki.setup
+ ...
+ 38.73user 0.62system 1:20.90elapsed 48%CPU (0avgtext+0avgdata 103040maxresident)k
+ 0inputs+6472outputs (0major+19448minor)pagefaults 0swaps
+
+My mdwn branch:
+
+ $ PERL5LIB="." time ikiwiki --setup docwiki.setup
+ ...
+ Markdown: Text::Markdown::markdown()
+ ...
+ 39.17user 0.73system 1:21.77elapsed 48%CPU (0avgtext+0avgdata 103072maxresident)k
+ 0inputs+6472outputs (0major+19537minor)pagefaults 0swaps
+
+My mdwn branch with `markdown_path => "/usr/bin/markdown"` added in
+`docwiki.setup` (on my system, `/usr/bin/markdown` is a command-line
+wrapper for `Text::Markdown::markdown`):
+
+ $ PERL5LIB="." time ikiwiki --setup docwiki.setup
+ ...
+ Markdown: /usr/bin/markdown
+ ...
+ 175.35user 18.99system 6:38.19elapsed 48%CPU (0avgtext+0avgdata 92320maxresident)k
+ 0inputs+17608outputs (0major+2189080minor)pagefaults 0swaps
+
+So my patch doesn't make ikiwiki slow unless the user explicitly
+requests an external markdown, which they would presumably only do to
+work around bugs in their system's Perl implementation.
+ -- [[wtk]]
+
+> I was wrong about it being enabled by default, but I still don't like
+> the idea of a configuration that makes ikiwiki slow on mdwn files,
+> even if it is a nonstandard configuration. How hard can it be to install
+> the Text::Markdown library? --[[Joey]]
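The dispatch described above can be sketched roughly as follows. This is an illustration only, not the actual patch: the function name and calling convention are invented, and the in-process renderer is passed in as a code reference to keep the sketch self-contained.

```perl
#!/usr/bin/perl
# Rough sketch (not the actual wtk/mdwn patch) of a configurable
# renderer: pipe the page text through an external command when
# markdown_path is set, otherwise call an in-process renderer.
# The per-page fork is what makes the external path slow.
use strict;
use warnings;
use IPC::Open2;

sub render_markdown {
	my ($config, $text, $fallback) = @_;
	if (defined $config->{markdown_path}) {
		# Slow path: fork the external command for this one page.
		my ($out, $in);
		my $pid = open2($out, $in, $config->{markdown_path});
		print $in $text;
		close $in;
		my $html = do { local $/; <$out> };	# slurp all output
		waitpid $pid, 0;
		return $html;
	}
	# Fast path: an in-process renderer supplied by the caller,
	# e.g. \&Text::Markdown::markdown.
	return $fallback->($text);
}
```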
diff --git a/doc/todo/configurable_tidy_command_for_htmltidy.mdwn b/doc/todo/configurable_tidy_command_for_htmltidy.mdwn
new file mode 100644
index 000000000..2a7ebce0a
--- /dev/null
+++ b/doc/todo/configurable_tidy_command_for_htmltidy.mdwn
@@ -0,0 +1,8 @@
+[[!tag patch]]
+
+I was trying to get htmltidy to [play nicely with MathML][play]. Unfortunately, I couldn't construct a command line that I was happy with, but along the way I altered htmltidy to allow a configurable command line. This seemed like a generally useful thing, so I've published my [patch][] as a Git branch.
+
+[play]: http://lists.w3.org/Archives/Public/html-tidy/2006JanMar/0052.html
+[patch]: http://git.tremily.us/?p=ikiwiki.git&a=commitdiff&h=408ee89fd7c1dc70510385a7cf263a05862dda97&hb=e65ce4f0937eaf622846c02a9d39fa7aebe4af12
+
+> Thanks, [[done]] --[[Joey]]
diff --git a/doc/todo/configurable_timezones.mdwn b/doc/todo/configurable_timezones.mdwn
new file mode 100644
index 000000000..36f2e9dbb
--- /dev/null
+++ b/doc/todo/configurable_timezones.mdwn
@@ -0,0 +1,7 @@
+It would be nice if the user could set the timezone of the wiki, and have ikiwiki render the pages with that timezone.
+
+This is nice for shared hosting, and other situations where the user doesn't have control over the server timezone.
+
+> [[done]] via the ENV setting in the setup file. --[[Joey]]
+
+>> Now via a timezone setting that is web configurable. --[[Joey]]
diff --git a/doc/todo/conflict_free_comment_merges.mdwn b/doc/todo/conflict_free_comment_merges.mdwn
new file mode 100644
index 000000000..e84400c17
--- /dev/null
+++ b/doc/todo/conflict_free_comment_merges.mdwn
@@ -0,0 +1,23 @@
+Currently, new comments are named with an incrementing ID (comment_N). So
+if a wiki has multiple disconnected servers, and comments are made to the
+same page on both, merging is guaranteed to result in conflicts.
+
+I propose avoiding such merge problems by naming a comment with a sha1sum
+of its (full) content. Keep the incrementing ID too, so there is an
+ordering (and so duplicate comments are allowed).
+So, "comment_N_SHA1".
+
+Note: The comment body will need to use meta title in the case where no
+title is specified, to retain the current behavior of the default title
+being "comment N".
+
+What do you think [[smcv]]? --[[Joey]]
+
+> I had to use md5sums, as the SHA1 perl module may not be available and I
+> didn't want to drag it in. But I think that's ok; this doesn't need to be
+> cryptographically secure and even the chances of being able to
+> purposefully cause a md5 collision and thus an undesired merge conflict
+> are quite low since it modifies the input text and adds a date stamp to
+> it.
+>
+> Anyway, I think it's good, [[done]] --[[Joey]]
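The naming scheme settled on above can be sketched in a few lines. This is illustrative only: `comment_filename` is a made-up name, and the real code also folds a date stamp into the hashed input.

```perl
#!/usr/bin/perl
# Minimal sketch of the merged naming scheme: an incrementing ID plus
# an MD5 of the (full) comment content, giving "comment_N_MD5".
# Illustrative only; not the actual comments plugin code.
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

sub comment_filename {
	my ($n, $content) = @_;
	return 'comment_' . $n . '_' . md5_hex($content);
}
```

Two servers accepting different comments to the same page then produce different hashes, so a merge brings in both files instead of conflicting on one name.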
diff --git a/doc/todo/consistent_smileys.mdwn b/doc/todo/consistent_smileys.mdwn
new file mode 100644
index 000000000..58952f900
--- /dev/null
+++ b/doc/todo/consistent_smileys.mdwn
@@ -0,0 +1,22 @@
+ikiwiki should have a consistent set of smileys. We could fix the current smileys, or we could grab a new set of consistent smileys, such as the Tango emotes from gnome-icon-theme (GPL).
+
+> Tango doesn't have a smiley icon for :-/ I'd have to use the same icon as
+> for :-( . Also missing is :-P |-) and some of the less used ones. Some of
+> the non-face icons, like {*} and {o} also don't seem to be in there,
+> though we could keep the current ones.
+>
+> gnome-icon-theme's emotes are not the tango ones. Tango is CC-BY-SA 2.5
+> (non-free IIRC), while gnome-icon-themes is GPL. If you compare icons,
+> such as the sunglasses one, they're different drawings, too. (I had been
+> very confused by these different licenses before..)
+>
+> gnome-icon-theme does have an icon for :-P , though it's missing |-) and
+> some of the less used icons. Its :-( sucks, the mouth is barely visible
+> at all even at 32x32 size, and the frown is hard to make out. In general
+> the tango icons seem better drawn, though gnome-icon-themes has a better
+> B-).
+>
+> Now that ikiwiki has multiple underlays, it would be possible to ship
+> multiple icon themes with ikiwiki, and have different versions of
+> the [[smileys]] page to include only the smileys available for a given
+> theme. The underlay to use could be a configuration option. --[[Joey]]
diff --git a/doc/todo/copyright_based_on_pagespec.mdwn b/doc/todo/copyright_based_on_pagespec.mdwn
new file mode 100644
index 000000000..f15ad4b75
--- /dev/null
+++ b/doc/todo/copyright_based_on_pagespec.mdwn
@@ -0,0 +1,10 @@
+I think the following would be useful. Have a plugin (pagecopyright? dunno about the name), configured
+by an "association-list":
+<pre>
+copyright_alist=>[ "*/Discussion" => "Comments copyright individual authors",
+ "*"=> "Copyright 2008 Widget Co" ]
+</pre>
+
+And yes, I know about [[plugins/contrib/default_content_for___42__copyright__42___and___42__license__42__]]
+
+[[DavidBremner]]
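A sketch of how such an alist might be consulted (hypothetical code, not a real plugin: `copyright_for` and the toy glob matcher are made up, and a real plugin would use ikiwiki pagespec matching instead):

```perl
#!/usr/bin/perl
# Hypothetical sketch: walk the alist pairs in order and return the
# notice for the first pattern matching the page. First match wins,
# so more specific patterns must come before "*".
use strict;
use warnings;

sub glob_match {
	my ($page, $glob) = @_;
	my $re = quotemeta $glob;
	$re =~ s/\\\*/.*/g;	# let "*" span any characters
	return $page =~ /^$re$/;
}

sub copyright_for {
	my ($page, @alist) = @_;
	while (my ($spec, $notice) = splice @alist, 0, 2) {
		return $notice if glob_match($page, $spec);
	}
	return;
}
```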
diff --git a/doc/todo/correct_published_and_updated_time_information_for_the_feeds.mdwn b/doc/todo/correct_published_and_updated_time_information_for_the_feeds.mdwn
new file mode 100644
index 000000000..565f3b16c
--- /dev/null
+++ b/doc/todo/correct_published_and_updated_time_information_for_the_feeds.mdwn
@@ -0,0 +1,113 @@
+In [Atom](http://www.ietf.org/rfc/rfc4287.txt), we can provide `published` and `updated` information.
+In [RSS](http://cyber.law.harvard.edu/rss/rss.html), there is only `pubDate`, for the
+publication date, but an update can be mentioned with the [`dc:modified`](http://www.ietf.org/rfc/rfc2413.txt)
+element (whose datetime format is [iso 8601](http://www.w3.org/TR/NOTE-datetime)).
+This patch updates :) `inline.pm` and the two relevant templates.
+
+> I tested a slightly modified patch, which I've put below for now.
+> feedvalidator.org complains that dc:modified is not a known element. I'll
+> bet some header needs to be added to make the dublin core stuff available.
+> The atom feeds seem ok. --[[Joey]]
+
+<pre>
+Index: debian/changelog
+===================================================================
+--- debian/changelog (revision 4066)
++++ debian/changelog (working copy)
+@@ -15,8 +15,11 @@
+ * Updated French translation from Cyril Brulebois. Closes: #437181
+ * The toc directive doesn't work well or make sense inside an inlined page.
+ Disable it when the page with the toc is nested inside another page.
++ * Apply a patch from NicolasLimare adding modification date tags to rss and
++ atom feeds, and also changing the publication time for a feed to the
++   newest modification time (was newest creation time).
+
+- -- Joey Hess <joeyh@debian.org> Sat, 11 Aug 2007 17:40:45 -0400
++ -- Joey Hess <joeyh@debian.org> Sat, 11 Aug 2007 18:25:28 -0400
+
+ ikiwiki (2.5) unstable; urgency=low
+
+Index: templates/atomitem.tmpl
+===================================================================
+--- templates/atomitem.tmpl (revision 4066)
++++ templates/atomitem.tmpl (working copy)
+@@ -11,7 +11,8 @@
+ <category term="<TMPL_VAR CATEGORY>" />
+ </TMPL_LOOP>
+ </TMPL_IF>
+- <updated><TMPL_VAR DATE_3339></updated>
++ <updated><TMPL_VAR MDATE_3339></updated>
++ <published><TMPL_VAR CDATE_3339></published>
+ <TMPL_IF NAME="ENCLOSURE">
+ <link rel="enclosure" type="<TMPL_VAR TYPE>" href="<TMPL_VAR ENCLOSURE>" length="<TMPL_VAR LENGTH>" />
+ <TMPL_ELSE>
+Index: templates/rssitem.tmpl
+===================================================================
+--- templates/rssitem.tmpl (revision 4066)
++++ templates/rssitem.tmpl (working copy)
+@@ -12,7 +12,8 @@
+ <category><TMPL_VAR CATEGORY></category>
+ </TMPL_LOOP>
+ </TMPL_IF>
+- <pubDate><TMPL_VAR DATE_822></pubDate>
++ <pubDate><TMPL_VAR CDATE_822></pubDate>
++ <dc:modified><TMPL_VAR MDATE_3339></dc:modified>
+ <TMPL_IF NAME="ENCLOSURE">
+ <enclosure url="<TMPL_VAR ENCLOSURE>" type="<TMPL_VAR TYPE>" length="<TMPL_VAR LENGTH>" />
+ <TMPL_ELSE>
+Index: IkiWiki/Plugin/inline.pm
+===================================================================
+--- IkiWiki/Plugin/inline.pm (revision 4066)
++++ IkiWiki/Plugin/inline.pm (working copy)
+@@ -361,8 +361,10 @@
+ title => pagetitle(basename($p)),
+ url => $u,
+ permalink => $u,
+- date_822 => date_822($pagectime{$p}),
+- date_3339 => date_3339($pagectime{$p}),
++ cdate_822 => date_822($pagectime{$p}),
++ mdate_822 => date_822($pagemtime{$p}),
++ cdate_3339 => date_3339($pagectime{$p}),
++ mdate_3339 => date_3339($pagemtime{$p}),
+ );
+
+ if ($itemtemplate->query(name => "enclosure")) {
+@@ -397,7 +399,7 @@
+ $content.=$itemtemplate->output;
+ $itemtemplate->clear_params;
+
+- $lasttime = $pagectime{$p} if $pagectime{$p} > $lasttime;
++ $lasttime = $pagemtime{$p} if $pagemtime{$p} > $lasttime;
+ }
+
+ my $template=template($feedtype."page.tmpl", blind_cache => 1);
+</pre>
+
+
+
+>> Yes, I noticed the bug today; the correct (tested on feedvalidator) rssitem.tmpl template must start with the following content:
+
+ <item>
+ <TMPL_IF NAME="AUTHOR">
+ <title><TMPL_VAR AUTHOR ESCAPE=HTML>: <TMPL_VAR TITLE></title>
+ <dcterms:creator><TMPL_VAR AUTHOR ESCAPE=HTML></dcterms:creator>
+ <TMPL_ELSE>
+ <title><TMPL_VAR TITLE></title>
+ </TMPL_IF>
+ <dcterms:modified><TMPL_VAR MDATE_3339></dcterms:modified>
+ <dcterms:created><TMPL_VAR DATE_3339></dcterms:created>
+ ....
+
+>> and rsspage.tmpl must start with:
+
+ <?xml version="1.0"?>
+ <rss version="2.0"
+ xmlns:dc="http://purl.org/dc/elements/1.1/"
+ xmlns:dcterms="http://purl.org/dc/terms/" >
+ ....
+
+>> — [[NicolasLimare]]
+
+[[done]] --[[Joey]]
+
+[[!tag patch]]
diff --git a/doc/todo/countdown_directive.mdwn b/doc/todo/countdown_directive.mdwn
new file mode 100644
index 000000000..61c36204c
--- /dev/null
+++ b/doc/todo/countdown_directive.mdwn
@@ -0,0 +1,5 @@
+I'd love to have a countdown directive, which would take a timestamp to count down to and generate a JavaScript timer in the page.
+
+Ideally I'd also like to either have parameters providing content to show before and after the time passes, or integration with existing conditional directives to do the same thing.
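A plugin doing this could be quite small: a preprocess hook that emits a placeholder span plus the script that keeps it updated. Here is a rough sketch (the package name, the `html` helper, and the markup are all invented for illustration; and, as noted above, htmlscrubber would strip the `<script>` unless it is allowed through):

```perl
#!/usr/bin/perl
# Hypothetical sketch -- no such plugin ships with ikiwiki. A real
# plugin would register html() via an ikiwiki "preprocess" hook and
# take $target from the directive's timestamp parameter.
package Countdown;

use warnings;
use strict;

# Given a target time (seconds since the epoch), return a span plus
# the JavaScript that updates it once a second.
sub html {
	my $target = shift;
	my $id = "countdown-$target";
	return <<"EOF";
<span id="$id"></span>
<script>
setInterval(function () {
	var left = Math.max(0, $target - Math.floor(Date.now() / 1000));
	document.getElementById("$id").textContent = left + " seconds left";
}, 1000);
</script>
EOF
}

1;
```

The before/after content mentioned above could then be two extra directive parameters that the hook substitutes in place of the fixed "seconds left" text.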
+
+[[!tag wishlist]]
diff --git a/doc/todo/credentials_page.mdwn b/doc/todo/credentials_page.mdwn
new file mode 100644
index 000000000..6b90af144
--- /dev/null
+++ b/doc/todo/credentials_page.mdwn
@@ -0,0 +1,33 @@
+pushing [[this|todo/httpauth feature parity with passwordauth]] and [[this|todo/htpasswd mirror of the userdb]] further (although rather in the [[wishlist]] priority): would it make sense for users to have a `$USER/credentials` page that is by default locked to the user and admins, where the user can state one or more of the below?
+
+* OpenID
+* ssh public key (would require an additional mechanism for writing this to an `authorized_keys` file with appropriate environment variables or prefix that makes sure the commit is checked against the right user and that the user names agree)
+* gpg public key (once there is a mechanism that relies on gpg for authentication)
+* https certificate hash (don't know details; afair the creation of such certificates is typically initiated server-side)
+* password hash (this is generally considered a valuable secret; is this still true with good hashes and proper salting?)
+
+such a page could have a form as described in [[todo/structured page data]] and could even serve as a way of managing users. --[[chrysn]]
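for illustration, such a `$USER/credentials` page might look something like this (the field names and layout are invented here; no current plugin parses such a page):

    [[!meta title="credentials"]]

    * openid: `http://example.com/~chrysn`
    * ssh: `ssh-rsa AAAAB3Nza... chrysn@host`
    * gpg: `0x01234567`
    * password: a hash only, never the cleartext

a plugin would then file each field with the matching authentication mechanism during the rebuild, as described above.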
+
+> I was just thinking about something along these lines myself. The
+> idea, if I understand correctly, is to allow users to have multiple
+> login options all leading to the same identity. This would allow a
+> user to login for example via either their Google account or their
+> WordPress account, while still being identified as the same user.
+
+> However, I'm not sure this should be a static page (I guess you
+> mean `$USER/credentials`, I don't think ‘creditentials’ actually
+> exists). Something entirely managed at the CGI level is probably
+> better, as it also helps keeping the data in its place (such as ssh
+> public keys in `authorized_keys` etc).
+
+> -- GB
+
+>> having multiple login options leading to the same identity, and (more important to me) giving the user an easy way to review and edit them. i'm thinking a bit of foaf+ssl style "i am $USER and you can recognize me by my client certificate $CERTIFICATE" statements.
+>>
+>> the reason why i want this in a static place instead of cgi level is that it can be used, for example, for automatically creating htpasswd files for read-only (cgi-less) replicas of private wikis. furthermore, it all gets versioned and it can easily be seen where the data really is. the credentials have to be filed appropriately by plugins anyway, but that can happen as a part of the regular rebuild process.
+>>
+>> and yes, you're right about the word misusage; thanks for pointing it out and fixing it.
+>>
+>> --[[chrysn]]
+
+an issue to be considered: for ways of authentication that don't explicitly mention the user name (and that would be everything but password; especially OpenID), there has to be a way to prevent users from hijacking an admin's account. the user wouldn't get more privileges, but the admin could find himself logged in as a user instead of an admin when he logs in using his OpenID, for example. he could fix it by removing the openid from the user's ("his") page, but it has to be taken care of nevertheless. --[[chrysn]]
diff --git a/doc/todo/ctime_on_blog_post_pages_.mdwn b/doc/todo/ctime_on_blog_post_pages_.mdwn
new file mode 100644
index 000000000..76708e0da
--- /dev/null
+++ b/doc/todo/ctime_on_blog_post_pages_.mdwn
@@ -0,0 +1,11 @@
+[[Blog]] feeds and index pages show the posted time (ctime), the actual blog entry pages only show the modified time.
+
+The user has to look at the history link to find when a blog item was posted.
+
+It would be nice if blog entry post pages could include the ctime. -- [[Edward_Betts]]
+
+> I've committed a change that adds a CTIME variable to page.tmpl. I left
+> it commented out in the default template, since it seems like a bit of
+> clutter to me. Good enough? --[[Joey]]
+
+[[done]]
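The commented-out section in `page.tmpl` looks roughly like this (exact markup varies between ikiwiki versions); copying the template locally and removing the comment markers enables it:

    <!-- Uncomment to display page creation time
    <TMPL_IF NAME="CTIME">
    Created <TMPL_VAR CTIME>
    </TMPL_IF>
    -->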
diff --git a/doc/todo/custom_location_for_openlayers.mdwn b/doc/todo/custom_location_for_openlayers.mdwn
new file mode 100644
index 000000000..3ccfa2944
--- /dev/null
+++ b/doc/todo/custom_location_for_openlayers.mdwn
@@ -0,0 +1,17 @@
+In the [[plugins/osm]], we use an absolute URL to download the OpenLayers.js script file. This has two downsides:
+
+ 1. if the wiki is behind HTTPS, this will create a nasty SSL warning in the browser and we don't want that
+ 2. if we want the map to work offline, we need to load the js locally
+
+For those reasons, I think the location of that script should be customizable. --[[anarcat]]
+
+[[!template id=gitbranch branch=anarcat/master author="[[anarcat]]"]]
+
+There is now a [[patch]] for this, thanks to Genevieve, available in my master branch.
+
+Note that there's an update to the patch in my master branch, that allows changing the URL for tiles too.
+
+> There's a lot of stuff in your master branch. Which commit is it,
+> or if you want me to merge it, spin a branch I can merge. --[[Joey]]
+
+> > I believe this was already fixed, actually - it's commit 409c4e48f983d10aceb6321148d7f440d17eb28f, which you cherry-picked on August 5th in d926c4a. So this is [[done]], thanks! -- [[anarcat]]
diff --git a/doc/todo/darcs.mdwn b/doc/todo/darcs.mdwn
new file mode 100644
index 000000000..985ae5f8b
--- /dev/null
+++ b/doc/todo/darcs.mdwn
@@ -0,0 +1,53 @@
+<http://khjk.org/~pesco/ikiwiki-darcs/> (now a darcs repo)
+
+I've taken all the good stuff from the above (now deleted --[[Joey]]) and added the missing hooks. The code hasn't seen a lot of testing, so some bugs are likely yet to surface. Also, I'm not experienced with perl and don't know where I should have used the function `possibly_foolish_untaint`.
+
+> Review of this one:
+>
+> * Should use tab indentation. (fixed)
+> * `rcs_getctime` should not need to use a ctime cache (such a cache should
+> also not be named `.ikiwiki.ctimes`). `rcs_getctime` is run exactly
+> once per page, ever, and the data is cached in ikiwiki's index. (fixed)
+> * I doubt that ENV{DARCS} will be available, since the wrapper clobbers
+>   the entire environment. I'd say remove that. (fixed)
+> * I don't understand what `darcs_info` is doing, but it seems to be
+> parsing xml with a regexp?
+> * Looks like `rcs_commit` needs a few improvements, as marked TODO
+> * `rcs_remove` just calls unlink? Does darcs record notice the file was removed
+> and automatically commit the removal?
+> * Is the darcs info in [[rcs/details]] still up-to-date re this version? (fixed)
+> --[[Joey]]
+
+> Update:
+>
+> I think I've addressed all of the above except for the XML parsing in `darcs_info`.
+> The function determines the md5 hash of the last patch the given file appears in.
+> That's indeed being done with regexps but my Perl isn't good enough for a quick recode
+> right now.
+>
+> As for the darcs info in [[rcs/details]], it does not accurately describe the way
+> this version works. It's similar, but the details differ slightly.
+> You could copy my description above to replace it.
+>
+>> done --[[Joey]]
+>
+> There is still some ironing to do, for instance the current version doesn't allow for
+> modifying attachments by re-uploading them via CGI ("darcs add failed"). Am I assuming
+> correctly that "adding" a file that's already in the repo should just be a no-op?
+> --pesco
+
+>> It should result in the new file contents being committed by
+>> `rcs_commit_staged`. For some revision control systems, which
+>> automatically commit modifications, it would be a no-op. --[[Joey]]
+
+>>> Done. --pesco
+
+----
+
+I've finally merged this into ikiwiki master. The plugin looks quite
+complete, with only the new `rcs_receive` hook missing, and I
+hope it works as well as it looks. :) If anyone wants to work on improving
+it, there are some TODOs as mentioned above that could still be improved.
+--[[Joey]]
+
+[[!tag patch done]]
diff --git a/doc/todo/datearchives-plugin.mdwn b/doc/todo/datearchives-plugin.mdwn
new file mode 100644
index 000000000..5f33cde4c
--- /dev/null
+++ b/doc/todo/datearchives-plugin.mdwn
@@ -0,0 +1,77 @@
+I'll be using IkiWiki primarily as a blog, so I want a way to view entries
+by date. A URL of the form `/date/YYYY/MM/DD.html` (or `/date/YYYY/MM/DD/`
+when using the `use_dirs` patch) should show posts from that period. ATM, I
+have this:
+
+<pre>
+Index: IkiWiki/Plugin/datearchives.pm
+===================================================================
+--- IkiWiki/Plugin/datearchives.pm (revision 0)
++++ IkiWiki/Plugin/datearchives.pm (revision 0)
+@@ -0,0 +1,31 @@
++#!/usr/bin/perl
++
++package IkiWiki::Plugin::datearchives;
++
++use warnings;
++use strict;
++use IkiWiki;
++
++sub import {
++ hook(type => "pagetemplate", id => "datearchives", call => \&pagetemplate, scan => 1);
++}
++
++sub pagetemplate (@) {
++ my %args = @_;
++ my $dt;
++ eval {
++ require DateTime; # require, not use: use runs at compile time and would defeat this eval
++ $dt = DateTime->from_epoch(epoch => $IkiWiki::pagectime{ $args{page} });
++ };
++ return if $@;
++ my $base = $config{datearchives_base} || 'date';
++ my $link = $base.'/'.$dt->strftime('%Y/%m/%d');
++ push @{$links{$args{page}}}, $link;
++ my $template = $args{template};
++ if ($template->query(name => "ctime")) {
++ $template->param(ctime => htmllink( $args{page}, $args{destpage}, $link, 0, 0,
++ $template->param('ctime')));
++ }
++}
++
++1
+</pre>
+
+This works (although accessing `%IkiWiki::pagectime` is not too clever),
+but it would be far more useful if the date pages were automatically
+created and populated with the relevant posts. A [[ikiwiki/Pagespec]] works perfectly for displaying the relevant content, but we're still left with the issue of actually creating the page. What's the Right Way to do this? We could create them in the RCS working copy and check them in, or create them directly in the output directory... (I'd also like to create an option for the tags plugin to auto-create its targets in the same way). Any opinions? :-)
+
+> Ok, first, I don't understand what your plugin does. Maybe I need to get
+> some sleep, but a better explanation might help. :-) It seems to make
+> links from pages to the archive pages? But I don't understand why you
+> want such links .. wouldn't a sidebar with links to the available archive
+> pages work better? Or something else, depending on personal preference.
+>
+> Secondly, you're certainly not the first to want to do date-based archive
+> pages. So far I have successfully punted the issue of creating these
+> pages out of ikiwiki by pointing out that everyone wants them to be
+> _different_, and suggesting people set up cron jobs or other machinery to
+> generate the kinds of archives that they like. This makes me happy
+> because generalizing all the possible ways people might want to do date
+> based archives and somehow bolting support for creating them onto the
+> side of ikiwiki seems to be a recipe for a mess.
+>
+> A few examples of ikiwiki sites with date archives:
+> <http://www.golden-gryphon.com/blog/manoj/> and
+> <http://roland.entierement.nu/categories/geek-en.html> --[[Joey]]
+
+>> Yeah, it wasn't much of a description, was it? ;-) It's an attempt to emulate the style of WordPress and other popular blog platforms, which can link a post's creation date to YYYY/MM/DD archive pages, which then list all the relevant posts. My use-case is on a blog page which in-lines (via pagespecs) recent blog posts.
+
+>> I agree with not adding this kind of functionality to the core. :-) I simply didn't want to have break links when I convert to IkiWiki. I guess I'll just play around with the page-creation thing myself then. Feel free to delete this from the queue. :-) --Ben
+
+>>> Ah, I get it, I hadn't realized it was making the date into a link.
+>>> No reason to delete this from the queue, it's a reasonable plugin. I
+>>> might move it to the contributed plugins directory as it's a bit
+>>> specialised to be included in ikiwiki though. --[[Joey]]
+
+[[!tag patch]]
diff --git a/doc/todo/default_content_for_new_post.mdwn b/doc/todo/default_content_for_new_post.mdwn
new file mode 100644
index 000000000..48cb1cc9d
--- /dev/null
+++ b/doc/todo/default_content_for_new_post.mdwn
@@ -0,0 +1,66 @@
+# Use Case *[[plugins/inline]]*
+
+Along the same lines as having a [[default_name_for_new_post]]s, an option
+to include default content in a new [[plugins/inline]] post would help with
+tasks like using an inline for a comment form on each new blog post.
+--[[JoshTriplett]]
+
+No, it would only help if the new blog post were being made via the form.
+If you're editing it in vi, and committing, it doesn't help. :-) This is
+another reason why I prefer the approach in [[discussion_page_as_blog]].
+Although I don't mind getting this implemented too, for other reasons.
+
+I see three possible designs:
+
+1. Simply use the bestlink(new_page_content) as the default content. Thomas
+ Schwinge emailed me an implementation of this. It has the problem that
+ it doesn't make sense to use the same new page template for a Discussion
+ page as for the page being discussed. (That's a specific case of a more
+ general problem.)
+
+1. Modify inline so that "template=foo" uses page foo as the template for
+ new posts made to the blog. This doesn't cater to every case, but
+ perhaps it would be enough?
+
+1. Make pages able to embed in them a pagespec that says they are the
+ template when new pages are created that match that pagespec.
+ This is the most general solution, but depends on
+ [[plugin_data_storage]].
+
+--[[Joey]]
+
+This feature would also allow the automatic inclusion of a given template in
+every new post, which could help for [[/plugins]] (automatically use the
+plugin template), or for [[/bugs]] and [[todo_items|/todo]] (automatically use
+a template that appends "(done)" to the title if the page links to
+"done"). --[[JoshTriplett]]
+
+> This is a feature I miss a lot from MoinMoin, and is especially helpful when you maintain pages which have a regular format (eg. recipe pages). I understand that using svn would bypass this feature but I think it's worth considering anyway because:
+
+> * For any given site often it's only the admin user who makes changes via svn, everyone else uses the web
+> * It's remote and casual users who benefit most from having standard templates to use for new pages
+> * When using svn to make changes it's easier to manually provide template functionality (eg. cp recipe_template.mdwn newpage.mdwn)
+
+> All of course bearing in mind that I'm just commenting because I don't have the skills to actually make the required changes. ;-) -- [[AdamShand]]
+
+
+# Use Case *Copyright Notes*
+
+Leaving aside the [[plugins/inline]] stuff I have a completely different
+use case for this (which is also why I wrote the plugin Joey mentioned).
+
+For a GNU-affiliated wiki we want to track copyright stuff right from
+the beginning, as the wiki pages may eventually evolve into official
+GNU documentation.
+
+That's why I want to have such copyright notices
+be included in every freshly created page by default (and having them
+interpreted by another plugin I also emailed to Joey).
+
+Of course this
+will also only work when using web-editing, but the people using
+rcs-editing (coining new terms, eh ;-)?) usually know what they're doing.
+
+--[[tschwinge]]
+
+> [[done]] in the [[plugins/edittemplate]] plugin. --[[Joey]]
diff --git a/doc/todo/default_name_for_new_post.mdwn b/doc/todo/default_name_for_new_post.mdwn
new file mode 100644
index 000000000..2a7f1ba20
--- /dev/null
+++ b/doc/todo/default_name_for_new_post.mdwn
@@ -0,0 +1,3 @@
+How about an option in [[plugins/inline]] for providing a default post name in the form, consisting of the current date in ISO format?
+
+> It would render to static HTML. In order to get a dynamic "today's date", you'd have to do it in Javascript, which htmlscrubber might not let through. I thought a better approach would be a plugin, so I wrote [[blogpost_plugin]]. --Ethan
\ No newline at end of file
diff --git a/doc/todo/dependency_types.mdwn b/doc/todo/dependency_types.mdwn
new file mode 100644
index 000000000..4db633ead
--- /dev/null
+++ b/doc/todo/dependency_types.mdwn
@@ -0,0 +1,579 @@
+Ikiwiki currently only has one type of dependency between pages
+(plus wikilinks special cased in on the side). This has resulted in various
+problems, and it's seemed for a long time to me that ikiwiki needs to get
+smarter about what types of dependencies are supported.
+
+### unnecessary work
+
+The current single dependency type causes the depending page to be rebuilt
+whenever a matching dependency is added, removed, or *modified*. But a
+great many things don't care about the modification case, and often cause
+unnecessary page rebuilds:
+
+* map only cares if the pages are added or removed. Content change does
+ not matter (unless show=title is used).
+* brokenlinks, orphans, pagecount, ditto (generally)
+* inline in archive mode cares about page title, author changing, but
+ not content. (Ditto for meta with show=title.)
+* Causes extra work when solving the [[bugs/transitive_dependencies]]
+ problem.
+
+### two types of dependencies needed for [[tracking_bugs_with_dependencies]]
+
+>> it seems that there are two types of dependency, and ikiwiki
+>> currently only handles one of them. The first type is "Rebuild this
+>> page when any of these other pages changes" - ikiwiki handles this.
+>> The second type is "rebuild this page when set of pages referred to by
+>> this pagespec changes" - ikiwiki doesn't seem to handle this. I
+>> suspect that named pagespecs would make that second type of dependency
+>> more important. I'll try to come up with a good example. -- [[Will]]
+
+>>> Hrm, I was going to build an example of this with backlinks, but it
+>>> looks like that is handled as a special case at the moment (line 458 of
+>>> render.pm). I'll see if I can break
+>>> things another way. Fixing this properly would allow removal of that special case. -- [[Will]]
+
+>>>> I can't quite understand the distinction you're trying to draw
+>>>> between the two types of dependencies. Backlinks are a very special
+>>>> case though and I'll be surprised if they fit well into pagespecs.
+>>>> --[[Joey]]
+
+>>>>> The issue is that the existential pagespec matching allows you to build things that have similar
+>>>>> problems to backlinks.
+>>>>> e.g. the following inline:
+
+ \[[!inline pages="define(~done, link(done)) and link(~done)" archive=yes]]
+
+>>>>> includes any page that links to a page that links to done. Now imagine I add a new link to 'done' on
+>>>>> some random page somewhere - a page which some other page links to which didn't previously get included - the set of pages accepted by the pagespec, and hence the set of
+>>>>> pages inlined, will change. But, there is no dependency anywhere on the page that I altered, so
+>>>>> ikiwiki will not rebuild the page with the inline in it. What is happening is that the page that I altered affects
+>>>>> the set of pages matched by the pagespec without itself being matched by the pagespec, and hence included in the dependency list.
+
+>>>>> To make this work well, I think you need to recognise two types of dependencies for each page (and no
+>>>>> special cases for particular types of links, eg backlinks). The first type of dependency says, "The content of
+>>>>> this page depends upon the content of these other pages". The `add_depends()` in the shortcuts
+>>>>> plugin is of this form: any time the shortcuts page is edited, any page with a shortcut on it
+>>>>> is rebuilt. The inline plugin also needs to add dependencies of this form to detect when the inlined
+>>>>> content changes. By contrast, the map plugin does not need a dependency of this form, because it
+>>>>> doesn't actually care about the content of any pages, just which pages it needs to include (which we'll handle next).
+
+>>>>> The second type of dependency says, "The content of this page depends upon the exact set of pages matched
+>>>>> by this pagespec". The first type of dependency was about the content of some pages, the second type is about
+>>>>> which pages get matched by a pagespec. This is the type of dependency tracking that the map plugin needs.
+>>>>> If the set of pages matched by map pagespec changes, then the page with the map on it needs to be rebuilt to show a different list of pages.
+>>>>> Inline needs this type of dependency as well as the previous type - This type handles a change in which pages
+>>>>> are inlined, the previous type handles a change in the content of any of those pages. Shortcut does not need this type of
+>>>>> dependency. Most of the places that use `add_depends()` seem to need this type of dependency rather than the first type.
+
+>>>>>> Note that inline and map currently achieve the second type of dependency by
+>>>>>> explicitly calling `add_depends` for each page they displayed.
+>>>>>> If any of those pages are removed, the regular pagespec would not
+>>>>>> match them -- since they're gone. However, the explicit dependency
+>>>>>> on them does cause them to match. It's an ugly corner I'd like to
+>>>>>> get rid of. --[[Joey]]
+
+>>>>> Implementation Details: The first type of dependency can be handled very similarly to the current
+>>>>> dependency system. You just need to keep a list of pages that the content depends upon. You could
+>>>>> keep that list as a pagespec, but if you do this you might want to check that the pagespec doesn't change,
+>>>>> possibly by adding a dependency of the second type along with the dependency of the first type.
+
+>>>>>> An example of the current system not tracking enough data is
+>>>>>> described in [[bugs/transitive_dependencies]].
+>>>>>> --[[Joey]]
+
+>>>>> The second type of dependency is a little more tricky. For each page, we'd need a list of pagespecs that
+>>>>> the page depended on, and for each pagespec you'd want to store the list of pages that currently match it.
+>>>>> On refresh, you'd need to check each pagespec to see if the set of pages that match it has changed, and if
+>>>>> that set has changed, then rebuild the dependent page(s). Oh, and for this second type of dependency, I
+>>>>> don't think you can merge pagespecs. If I wanted to know if either "\*" or "link(done)" changes, then just checking
+>>>>> to see if the set of pages matched by "\* or link(done)" changes doesn't work.
+
+>>>>> The current system works because even though you usually want dependencies of the second type, the set of pages
+>>>>> referred to by a pagespec can only change if one of those pages itself changes. i.e. A dependency check of the
+>>>>> first type will catch a dependency change of the second type with current pagespecs.
+>>>>> This doesn't work with backlinks, and it doesn't work with existential matching. Backlinks are currently special-cased. I don't know
+>>>>> how to special-case existential matching - I suspect you're better off just getting the dependency tracking right.
+
+>>>>> I also tried to come up with other possible solutions: e.g. can we find the dependencies for a pagespec? That
+>>>>> would be the set of pages where a change on one of those pages could lead to a change in the set of pages matched by the pagespec.
+>>>>> For old-style pagespecs without backlinks, the dependency set for a pagespec is the same as the set of pages the pagespec matches.
+>>>>> Unfortunately, with existential matching, the set of pages that each
+>>>>> pagespec depends upon can quickly become "*", which is not very useful. -- [[Will]]
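The bookkeeping Will describes — remember, per pagespec, the set of pages it matched last time, and rebuild the depending page when that set changes — can be sketched in plain Perl (the data structures here are invented for illustration and are not ikiwiki's actual index format):

```perl
#!/usr/bin/perl
use warnings;
use strict;

# For each depending page: pagespec text => set (hash) of pages it
# matched at the last refresh.
our %depends_spec;

# True if two sets of page names differ.
sub set_changed {
	my ($old, $new) = @_;
	return 1 if keys %$old != keys %$new;
	foreach my $page (keys %$new) {
		return 1 unless exists $old->{$page};
	}
	return 0;
}

# Re-evaluate each remembered pagespec with $match (a function from
# pagespec text to the set of pages it now matches), list the pages
# whose specs match a different set, and update the remembered sets.
sub pages_needing_rebuild {
	my $match = shift;
	my @rebuild;
	foreach my $page (sort keys %depends_spec) {
		foreach my $spec (keys %{$depends_spec{$page}}) {
			my $now = $match->($spec);
			push @rebuild, $page
				if set_changed($depends_spec{$page}{$spec}, $now);
			$depends_spec{$page}{$spec} = $now;
		}
	}
	return @rebuild;
}

1;
```

As noted above, the specs cannot be merged: each one's matched set has to be compared separately.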
+
+### proposal
+
+I propose the following. --[[Joey]]
+
+* Add a second type of dependency, call it a "presence dependency".
+* `add_depends` defaults to adding a regular ("full") dependency, as
+ before. (So nothing breaks.)
+* `add_depends($page, $spec, presence => 1)` adds a presence dependency.
+* `refresh` only looks at added/removed pages when resolving presence
+ dependencies.
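Under this proposal, call sites would presumably look something like this (hypothetical, assuming `presence => 1` marks the new dependency type):

    # map: only which pages exist matters, not their content
    add_depends($params{page}, $params{pages}, presence => 1);

    # inline in default mode: still needs the full content dependency
    add_depends($params{page}, $params{pages});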
+
+This seems straightforwardly doable. I'd like [[Will]]'s feedback on it, if
+possible. The two types of dependencies I am proposing are not identical
+to the two types he talks about above, but I hope are close enough that
+they can be used.
+
+This doesn't deal with things that only depend on the metadata of a
+page, as collected in the scan pass, changing. But it does leave a window
+open for adding such a dependency type later.
+
+----
+
+I implemented the above in a branch.
+[[!template id=gitbranch branch=origin/dependency-types author="[[joey]]"]]
+
+Then I found some problems:
+
+* Something simple like pagecount, that seems like it could use a
+ presence dependency, can have a pagespec that uses metadata, like
+ `author()` or `copyright()`.
+* pagestats, orphans and brokenlinks cannot use presence dependencies
+ because they need to update when links change.
+
+Now I'm thinking about having a special dependency look at page
+metadata, and fire if the metadata changes. And it seems links should
+either be included in that, or there should be a way to make a dependency
+that fires when a page's links change. (And what about backlinks?)
+
+It's easy to see when a page's links change, since there is `%oldlinks`.
+To see when metadata is changed is harder, since it's stored in the
+pagestate by the meta plugin. Also, there are many different types of
+metadata, that would need to be matched with the pagespecs somehow.
+
+Quick alternative: Make add_depends look at the pagespec. Ie, if it
+is a simple page name, or a glob, we know a presence dependency
+can be valid. If it's more complex, convert the dependency from
+presence to full.
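The heuristic — treat a pagespec as presence-only if it is just names, globs, and boolean glue — might look like this (a simplification; the branch's actual parsing may differ):

```perl
#!/usr/bin/perl
use warnings;
use strict;

# True if a pagespec consists only of page names/globs and boolean
# glue, so a presence dependency suffices. Any function-style term
# (link(), author(), created_before(), ...) forces a full dependency.
sub presence_ok {
	my $spec = shift;
	foreach my $token (split ' ', $spec) {
		$token =~ s/^[!(]+//;	# strip negation and open parens
		$token =~ s/\)+$//;	# strip trailing close parens
		next if $token eq '' or lc $token eq 'and' or lc $token eq 'or';
		# anything still containing parens is a function-style term
		return 0 if $token =~ /[()]/;
	}
	return 1;
}

1;
```

This is where the false negatives mentioned below come from: a term the parser doesn't recognize is conservatively treated as needing a full dependency.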
+
+There is a lot to dislike about this method. Its parsing of the pagespec,
+as currently implemented, does not let plugins add new types of pagespecs
+that only care about presence. Its pagespec parsing is also subject to
+false negatives (though these should be somewhat rare, and no false
+positives). Still, it does work, and it makes things like simple maps and
+pagecounts much more efficient.
+
+----
+
+#### Will's first pass feedback.
+
+If the API is going to be updated, then it would be good to make it forward compatible.
+I'd like the API to be extensible to what is useful for complex pagespecs, even if
+that is a little redundant at the moment.
+
+My attempt to play with this is in my git repo. [[!template id=gitbranch branch=origin/depends-spec author="[[will]]"]]
+That branch is a little out of date, but if you just look at the changes in IkiWiki.pm you'll see the concept I was looking at.
+I added an "add_depends_spec()" function that adds a dependency on the pagespec passed to it. If the set of matched pages
+changes, then the dependent page is rebuilt. At the moment the implementation uses the same hack used by map and inline -
+just add all the pages that currently exist as traditional content dependencies.
+
+> As I note below, a problem with this approach is that it has to try
+> matching the pagespec against every page, redundantly with the work done
+> by the plugin. (But there are ways to avoid that redundant matching.)
+> --[[Joey]]
+
+Getting back to commenting on your proposal:
+
+Just talking about the definition of a "presence dependency" for the moment, and ignoring implementation. Is a
+"presence dependency" supposed to cause an update when a page disappears? I assume so. Is a presence dependency
+supposed to cause an update when a page's existence hasn't changed, but it no longer matches the pagespec?
+(e.g. you use `created_before(test_page)` in a pagespec, and there was a page, `new_page`, that was created
+after `test_page`. `new_page` will not match the spec. Now we'll delete and then re-create `test_page`. Now
+`new_page` will match the spec, and yet `new_page` itself hasn't changed. Nor has its 'presence' - it was present
+before and it is present now. Should this cause a re-build of any page that has a 'presence' dependency on the spec?)
+
+> Yes, a presence dep will trigger when a page is added, or removed.
+
+> Your example is valid.. but it's also not handled right by normal,
+> (content) dependencies, for the same reasons. Still, I think I've
+> addressed it with the pagespec influence stuff below. --[[Joey]]
+
+I think that is another version of the problem you encountered with meta-data.
+
+In the longer term I was thinking we'd have to introduce a concept of 'internal pagespec dependencies'. Note that I'm
+defining 'internal' pagespec dependencies differently to the pagespec dependencies I defined above. Perhaps an example:
+If you had a pagespec that was `created_before(test_page)`, then you could list all pages created before `test_page`
+with a `map` directive. The map directive would add a pagespec dependency on `created_before(test_page)`.
+Internally, there would be a second page-spec parsing function that discovers which pages a given pagespec
+depends on. As well as the function `match_created_before()`, we'd have to add a new function `depend_created_before()`.
+This new function would return a list of pages; when any of those pages change, the output of `match_created_before()`
+could change. In this example, it would just return `test_page`.
+
+These lists of dependent pages could just be concatenated for every `match_...()` function in a pagespec - you can ignore
+the boolean formula aspects of the pagespec for this. If a content dependency were added on these pages, then I think
+the correct rebuilds would occur.
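
The proposed split between `match_*()` and `depend_*()` functions can be sketched as follows (in Python for brevity, though ikiwiki itself is Perl; `depend_created_before` and `depend_glob` are hypothetical names from the proposal above, not real ikiwiki API):

```python
# Hypothetical sketch of the proposed depend_*() functions: each
# match_*() term gets a parallel depend_*() that returns the pages
# whose modification could alter that term's output. The per-term
# lists are simply concatenated, ignoring the boolean structure.

pagectime = {"test_page": 100, "old_page": 50, "new_page": 150}

def match_created_before(page, testpage):
    # True if `page` was created before `testpage`.
    return pagectime[page] < pagectime[testpage]

def depend_created_before(testpage):
    # Re-creating testpage changes every match_created_before() result.
    return [testpage]

def depend_glob(glob):
    # A glob's output only changes when a matched page itself changes,
    # which the base dependency mechanism already notices.
    return []

def pagespec_depends(terms):
    # Concatenate the per-term dependency lists.
    deps = []
    for depend_fn, arg in terms:
        deps.extend(depend_fn(arg))
    return deps

deps = pagespec_depends([(depend_glob, "bugs/*"),
                         (depend_created_before, "test_page")])
```

A content dependency on everything in `deps` (here just `test_page`) would then cause the rebuilds described above.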
+
+In all, this is a surprisingly difficult problem to solve perfectly. Consider the following case:
+
+PageA.mdwn:
+
+> [ShavesSelf]
+
+PageB.mdwn:
+
+> Doesn't shave self.
+
+ShavedByBob.mdwn:
+
+> [!include pages="!link(ShavesSelf)"]
+
+Does ShavedByBob.mdwn include itself?
+
+(Yeah - in IkiWiki currently links are *not* included by include, but the idea holds. I had a good example a while back, but I can't think of it right now.)
+
+sigh.
+
+-- [[Will]]
+
+> I have also been thinking about some sort of analysis pass over pagespecs
+> to determine what metadata, pages, etc they depend on. It is indeed
+> tricky to do. More thoughts on influence lists a bit below. --[[Joey]]
+
+>> The big part of what makes this tricky is that there may be cycles in the
+>> dependency graph. This can lead to situations where the result is just not
+>> well defined. This is what I was trying to get at above. -- [[Will]]
+
+>>> Hmm, I'm not seeing cycles be a problem, at least with the current
+>>> pagespec terms. --[[Joey]]
+
+>>>> Oh, they're not with current pagespec terms. But this is really close to extending to handle
+>>>> functional pagespecs, etc. And I think I'd like to think about that now.
+>>>>
+>>>> Having said that, I don't want to hold you up - you seem to be making progress. The best is
+>>>> the enemy of the good, etc. etc.
+>>>>
+>>>> For my part, I'm imagining we have two more constructs in IkiWiki:
+>>>>
+>>>> * A map directive that actually wikilinks to the pages it lists, and
+>>>> * A `match_sharedLink(pageX)` matching function that matches pageY if both pageX and pageY each have links to any same third page, pageZ.
+>>>>
+>>>> With those two constructs, one page changing might change the set of pages included in a map somewhere, which might then change the set of pages matched by some other pagespec, which might then...
+>>>>
+>>>> --[[Will]]
+
+>>>>> I think that should be supported by [[bugs/transitive_dependencies]].
+>>>>> At least in the current implementation, which considers each page
+>>>>> that is rendered to be changed, and rebuilds pages that are dependent
+>>>>> on it, in a loop. An alternate implementation, which could be faster,
+>>>>> is to construct a directed graph and traverse it just once. Sounds
+>>>>> like that would probably not support what you want to do.
+>>>>> --[[Joey]]
+
+>>>>>> Yes - that's what I'm talking about - I'll add some comments there. -- [[Will]]
+
+----
+
+### Link dependencies
+
+* `add_depends($page, $spec, links => 1, presence => 1)`
+ adds a links + presence dependency.
+* Use backlinks change code to detect changes to link dependencies too.
+* So, brokenlinks can fire whenever any links in any of the
+ pages it's tracking change, or when pages are added or
+ removed.
+* To determine if a pagespec is valid to be used with a links dependency,
+ use the same set that are valid for presence dependencies. But also
+ allow `backlinks()` to be used in it, since that matches pages
+ that the page links to, which is just what link dependencies are
+ triggered on.
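
As a rough illustration of how such typed dependencies can be stored, here is a Python sketch in the style of ikiwiki's `DEPEND_*` bitmask constants (the real code is Perl; treat the details as assumptions, not ikiwiki's exact API):

```python
# Dependency types as a bitmask; add_depends() ORs new types into any
# existing entry, so one page can hold a links + presence dependency
# on a single pagespec.

DEPEND_CONTENT = 1
DEPEND_PRESENCE = 2
DEPEND_LINKS = 4

depends = {}  # page -> {pagespec: type bitmask}

def add_depends(page, spec, deptype=DEPEND_CONTENT):
    specs = depends.setdefault(page, {})
    specs[spec] = specs.get(spec, 0) | deptype

# brokenlinks-style dependency: links + presence, but not full content.
add_depends("brokenlinks", "bugs/*", DEPEND_LINKS | DEPEND_PRESENCE)
```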
+
+[[done]]
+----
+
+### the removal problem
+
+So far I have not addressed fixing the removal problem (which Will
+discusses above).
+
+Summary of problem: A has a dependency on a pagespec such as
+"bugs/* and !link(done)". B currently matches. Then B is updated,
+in a way that makes A's dependency not match it (ie, it links to done).
+Now A is not updated, because ikiwiki does not realize that it
+depended on B before.
+
+This was worked around to fix [[bugs/inline_page_not_updated_on_removal]]
+by inline and map adding explicit dependencies on each page that appears
+on them. Then a change to B triggers the explicit dep. While this works,
+it's 1) ugly 2) probably not implemented by all plugins that could
+be affected by this problem (ie, linkmap) and 3) is most of the reason why
+we grew the complication of `depends_simple`.
+
+One way to fix this is to include with each dependency, a list of pages
+that currently match it. If the list changes, the dependency is triggered.
+
+Should be doable, but may involve more work than the current
+approach does. Consider that a dependency on `bugs/*` currently
+is triggered by just checking until *one* page is found to match it.
+But to store the list, *every* page would have to be tried against it.
+Unless the list can somehow be intelligently updated, looking at only the
+changed pages.
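
A minimal sketch of the list-storing idea (illustrative Python; the helper names are made up): store the set of matching pages alongside the dependency, and fire it whenever a re-evaluation differs from what was stored.

```python
import fnmatch

# Evaluate a spec shaped like "GLOB and !link(PAGE)" over the wiki state.
def matches(glob, exclude_link, pages, links):
    return sorted(p for p in pages
                  if fnmatch.fnmatch(p, glob)
                  and exclude_link not in links.get(p, []))

pages = ["bugs/one", "bugs/two"]
links = {"bugs/one": [], "bugs/two": []}

# A depends on "bugs/* and !link(done)"; remember what matched.
stored = matches("bugs/*", "done", pages, links)

# B (bugs/two) is edited to link to done. B itself changed, but A did
# not, so without the stored list A would not be rebuilt.
links["bugs/two"] = ["done"]

current = matches("bugs/*", "done", pages, links)
dependency_triggered = (current != stored)
```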
+
+----
+
+Found a further complication in presence dependencies. Map now uses
+presence dependencies when adding its explicit dependencies on pages. But
+this defeats the purpose of the explicit dependencies! Because, now,
+when B is changed to not match a pagespec, A's presence dep does
+not fire.
+
+I didn't think things through when switching it to use presence
+dependencies there. But, if I change it to use full dependencies, then all
+the work that was done to allow map to use presence dependencies for its
+main pagespec is for naught. The map will once again have to update
+whenever *any* content of the page changes.
+
+This points toward the conclusion that explicit dependencies, however they
+are added, are not the right solution at all. Some other approach, such as
+maintaining the list of pages that match a dependency, and noticing when it
+changes, is needed.
+
+----
+
+### pagespec influence lists
+
+I'm using this term for the concept of a list of pages whose modification
+can indirectly influence what pages a pagespec matches.
+
+> Trying to make a formal definition of this: (Note, I'm using the term sets rather than lists, but they're roughly equivalent)
+>
+> * Let the *matching set* for a pagespec be the set of existing pages that the pagespec matches.
+> * Let the *missing document matching set* be the set of pages that would match the spec if they didn't exist. These pages may or may not currently exist. Note that membership of this set depends upon how the `match_*()` functions react to non-existent pages.
+> * Let the *indirect influence set* for a pagespec be the set of all pages, *p*, whose alteration might:
+> * cause the pagespec to include or exclude a page other than *p*, or
+> * cause the pagespec to exclude *p*, unless the alteration is the removal of *p* and *p* is in the missing document matching set.
+>
+> Justification: The 'base dependency mechanism' is to compare changed pages against each pagespec. If the page matches, then rebuild the pages that depend on the spec. For this comparison, creation and removal
+> of pages are both considered changes. This base mechanism will catch:
+>
+> * The addition of any page to the matching set through its own modification/creation
+> * The removal of any page *that would still match while non-existent* from the matching set through its own removal. (Note: The base mechanism cannot remove a page from the matching set because of that page's own modification (not deletion). If the page should be removed from the matching set, then it obviously cannot match the spec after the change.)
+> * The modification (not deletion) of any page that still matches after the modification.
+>
+> The base mechanism may therefore not catch:
+>
+> * The addition or removal of any page from the matching set through the modification/addition/removal of any other page.
+> * The removal of any page from the matching set through its own modification/removal if it does not still match after the change.
+>
+> The indirect influence set then should handle anything that the base mechanism will not catch.
+>
+> --[[Will]]
+
+>> I appreciate the formalism!
+>>
+>> Only existing pages need to be in these sets, because if a page is added
+>> in the future, the existing dependency code will always test to see
+>> if it matches. So it will be in the matching set (or not) at that point.
+>>
+>>> Hrm, I agree with you in general, but I think I can come up with nasty counter-examples. What about a pagespec
+>>> of "!backlink(bogus)" where the page bogus doesn't exist? In this case, the page 'bogus' needs to be in the influence
+>>> set even though it doesn't exist.
+>>>
+>>>> I think you're right, this is a case that the current code is not
+>>>> handling. Actually, I made all the pagespecs return influences
+>>>> even if the influence was not present or did not match. But, it
+>>>> currently only records influences as dependencies when a pagespec
+>>>> successfully matches. Now I'm sure that is wrong, and I've removed
+>>>> that false optimisation. I've updated some of the below. --[[Joey]]
+>>>
+>>> Also, I would really like the formalism to include the whole dependency system, not just any additions to it. That will make
+>>> the whole thing much easier to reason about.
+>>
+>> The problem with your definition of the indirect influence set seems to be
+>> that it doesn't allow `link()` and `title()` to have the page that
+>> matches as an indirect influence. But I'm quite sure we need those.
+>> --[[Joey]]
+
+>>> I see what you mean. Does the revised definition capture this effectively?
+>>> The problem with this revised definition is that it still doesn't match your examples below.
+>>> My revised definition will include pretty much all currently matching pages to be in the influence list
+>>> because deletion of any of them would cause a change in which pages are matched - the removal problem.
+>>> -- [[Will]]
+
+#### Examples
+
+* The pagespec "created_before(foo)" has an indirect influence list that contains foo.
+ The removal or (re)creation of foo changes what pages match it. Note that
+ this is true even if the pagespec currently fails to match.
+
+>>> This is an annoying example (hence well worth having :) ). I think the
+>>> indirect influence list must contain 'foo' and all currently matching
+>>> pages. `created_before(foo)` will not match
+>>> a deleted page, and so the base mechanism would not cause a rebuild. The
+>>> removal problem strikes. -- [[Will]]
+
+>>>> But `created_before` can in fact match a deleted page. Because the mtime
+>>>> of a deleted page is temporarily set to 0 while the base mechanism runs to
+>>>> find changes in deleted pages. (I verified this works by experiment,
+>>>> also that `created_after` is triggered by a deleted page.) --[[Joey]]
+
+>>>>> Oh, okie. I looked at the source, saw the `if (exists $IkiWiki::pagectime{$testpage})` and assumed it would fail.
+>>>>> Of course, having it succeed doesn't cure all the issues -- just moves them. With `created_before()` succeeding
+>>>>> for deleted files, this pagespec will match any removal in the entire wiki with the base mechanism. Whether this is
+>>>>> better or worse than the longer indirect influence list is an empirical question. -- [[Will]]
+
+* The pagespec "foo" has an empty influence list. This is because a
+ modification/creation/removal of foo directly changes what the pagespec
+ matches.
+
+* The pagespec "*" has an empty influence list, for the same reason.
+ Avoiding including every page in the wiki into its influence list is
+ very important!
+
+>>> So, why don't the above influence lists contain the currently matched pages?
+>>> Don't you need this to handle the removal problem? -- [[Will]]
+
+>>>> The removal problem is slightly confusingly named, since it does not
+>>>> affect pages that were matched by a glob and have been removed. Such
+>>>> pages can be handled without being influences, because ikiwiki knows
+>>>> they have been removed, and so can still match them against the
+>>>> pagespec, and see they used to match; and thus knows that the
+>>>> dependency has triggered.
+>>>>
+>>>>> IkiWiki can only see that they used to match if they're in the glob matching set. -- [[Will]]
+>>>>
+>>>> Maybe the thing to do is consider this an optimisation, where such
+>>>> pages are influences, but ikiwiki is able to implicitly find them,
+>>>> so they do not need to be explicitly stored. --[[Joey]]
+
+* The pagespec "title(foo)" has an influence list that contains every page
+ that currently matches it. A change to any matching page can change its
+ title, making it not match any more, and so the list is needed due to the
+ removal problem. A page that does not have a matching title is not an
+ influence, because modifying the page to change its title directly
+ changes what the pagespec matches.
+
+* The pagespec "backlink(index)" has an influence list
+ that contains index (because a change to index changes the backlinks).
+ Note that this is true even if the backlink currently fails.
+
+>>> This is another interesting example. The missing document matching set contains all links on the page index, and so
+>>> the influence list only needs to contain 'index' itself. -- [[Will]]
+
+* The pagespec "link(done)" has an influence list that
+ contains every page that it matches. A change to any matching page can
+ remove a link and make it not match any more, and so the list is needed
+ due to the removal problem.
+
+>> Why doesn't this include every page? If I change a page that doesn't have a link to
+>> 'done' to include a link to 'done', then it will now match... or is that considered a
+>> 'direct match'? -- [[Will]]
+
+>>> The regular dependency calculation code will check if every changed
+>>> page matches every dependency. So it will notice the link was added.
+>>> --[[Joey]]
+
+#### Low-level Calculation
+
+One way to calculate a pagespec's influence would be to
+expand the SuccessReason and FailReason objects used and returned
+by `pagespec_match`. Make the objects be created with an
+influence list included, and when the objects are ANDed or ORed
+together, combine the influence lists.
+
+That would have the benefit of allowing just using the existing `match_*`
+functions, with minor changes to a few of them to gather influence info.
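
A sketch of that object design (Python stand-ins for the Perl `SuccessReason`/`FailReason` classes; the details are illustrative): each reason carries an influence set, the combinators merge influences from both sides, and negation flips the truth value while keeping the influences.

```python
class Reason:
    def __init__(self, value, influences=()):
        self.value = bool(value)
        self.influences = set(influences)

    def __and__(self, other):
        return Reason(self.value and other.value,
                      self.influences | other.influences)

    def __or__(self, other):  # non-short-circuiting, like Perl's |
        return Reason(self.value or other.value,
                      self.influences | other.influences)

    def __invert__(self):
        return Reason(not self.value, self.influences)

# "bugs/* and link(done) and backlink(index)" matched against bugs/foo:
glob = Reason(True)                 # bugs/* matches, no influences
link = Reason(True, {"bugs/foo"})   # the page's own links influence it
backlink = Reason(True, {"index"})  # index's links influence it

result = glob & link & backlink
negated = ~backlink  # negating a term preserves its influences
```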
+
+But does it work? Let's try some examples:
+
+Consider "bugs/* and link(done) and backlink(index)".
+
+Its influence list contains index, and it contains all pages that the whole
+pagespec matches. It should, ideally, not contain all pages that link
+to done. There are a lot of such pages, and only a subset influence this
+pagespec.
+
+When matching this pagespec against a page, the `link` will put the page
+on the list. The `backlink` will put index on the list, and they will be
+anded together and combined. If we combine the influences from each
+successful match, we get the right result.
+
+Now consider "bugs/* and link(done) and !backlink(index)".
+
+Its influence list is the same as the previous one, even though a term has
+been negated. Because a change to index still influences it, though in a
+different way.
+
+If negation of a SuccessReason preserves the influence list, the right
+influence list will be calculated.
+
+Consider "bugs/* and (link(done) or backlink(index))"
+and "bugs/* and (backlink(index) or link(done))"
+
+It's clear that the influence lists for these are identical. And they
+contain index, plus all matching pages.
+
+When matching the first against page P, the `link` will put P on the list.
+The OR needs to be a non-short-circuiting type. (In perl, `or`, not `||` --
+so, `pagespec_translate` will need to be changed to not use `||`.)
+Given that, the `backlink` will always be evaluated, and will put index
+onto the influence list. If we combine the influences from each
+successful match, we get the right result.
+
+> This is implemented, seems to work ok. --[[Joey]]
+
+> `or` short-circuits too, but the implementation correctly uses `|`,
+> which I assume is what you meant. --[[smcv]]
+
+>> Er, yeah. --[[Joey]]
+
+----
+
+What about: "!link(done)"
+
+Specifically, I want to make sure it works now that I've changed
+`match_link` to only return a page as an influence if it *does*
+link to done.
+
+So, when matching against page P, that does not link to done,
+there are no influences, and the pagespec matches. If P is later
+changed to add a link to done, then the dependency resolver will directly
+notice that.
+
+When matching against page P, that does link to done, P
+is an influence, and the pagespec does not match. If P is later changed
+to not link to done, the influence will do its job.
+
+Looks good!
+
+----
+
+Here is a case where this approach has some false positives.
+
+"bugs/* and link(patch)"
+
+This finds as influences all pages that link to patch, even
+if they are not under bugs/, and so can never match.
+
+To fix this, the influence calculation would need to consider boolean
+operators. Currently, this turns into roughly:
+
+`FailReason() & SuccessReason(patch)`
+
+Let's say that the glob instead returns a HardFailReason, which, when
+ANDed with another object, blocks that object's influences. (But when
+ORed, it combines them.)
+
+Question: Are all pagespec terms that return reason objects without any
+influence info suitable to block influence in this way?
+
+To be suitable to block, a term should never change from failing to match a
+page to successfully matching it, unless that page is directly changed in a
+way that influences are not needed for ikiwiki to notice. But, if a term
+did not meet these criteria, it would have an influence. QED.
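
A sketch of the `HardFailReason` idea (again illustrative Python, extending the reason-object sketch above): a hard failure blocks influences under AND, since the blocked terms can only start mattering via a direct change the base mechanism already sees, but passes influences through under OR.

```python
class Reason:
    hard = False

    def __init__(self, value, influences=()):
        self.value = bool(value)
        self.influences = set(influences)

    def __and__(self, other):
        if self.hard or other.hard:
            return HardFailReason()  # blocks both sides' influences
        return Reason(self.value and other.value,
                      self.influences | other.influences)

    def __or__(self, other):
        return Reason(self.value or other.value,
                      self.influences | other.influences)

class HardFailReason(Reason):
    hard = True

    def __init__(self):
        super().__init__(False)

# "bugs/* and link(patch)" against a page outside bugs/: the glob
# hard-fails, so the link() influence is discarded, not recorded.
anded = HardFailReason() & Reason(True, {"outside/page"})

# Under OR, a hard failure must not block the other side's influences.
ored = HardFailReason() | Reason(True, {"index"})
```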
+
+#### Influence types
+
+Note that influences can also have types, same as dependency types.
+For example, "backlink(foo)" has an influence of foo, of type links.
+"created_before(foo)" also is influenced by foo, but it's a presence
+type. Etc.
+
+> This is an interesting concept that I hadn't considered. It might
+> allow significant computational savings, but I suspect will be tricky
+> to implement. -- [[Will]]
+
+>> It was actually really easy to implement it, assuming I picked the right
+>> dependency types of course. --[[Joey]]
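
A small illustrative sketch of typed influences (Python; the bitmask values mirror the dependency-type constants but are assumptions here): each influence records how it can trigger, so a presence-type influence ignores mere content or link changes.

```python
DEPEND_CONTENT = 1
DEPEND_PRESENCE = 2
DEPEND_LINKS = 4

# backlink(foo) is influenced by foo's links; created_before(foo) only
# by foo's presence (creation/removal).
influences = {
    "backlink(foo)": {"foo": DEPEND_LINKS},
    "created_before(foo)": {"foo": DEPEND_PRESENCE},
}

def triggered(influence_type, change_type):
    # An influence fires only when the change kind overlaps its type.
    return bool(influence_type & change_type)
```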
diff --git a/doc/todo/description_meta_param_passed_to_templates.mdwn b/doc/todo/description_meta_param_passed_to_templates.mdwn
new file mode 100644
index 000000000..712471258
--- /dev/null
+++ b/doc/todo/description_meta_param_passed_to_templates.mdwn
@@ -0,0 +1,10 @@
+[[!tag wishlist patch]]
+
+I'd like to use the description parameter from [[meta|/ikiwiki/directive/meta]] directives in custom [[inline|/ikiwiki/directive/inline]] templates. I guess this could be useful to others too.
+
+The only change required is on [line 266](http://github.com/joeyh/ikiwiki/blob/master/IkiWiki/Plugin/meta.pm#L266) of `meta.pm`
+
+ - foreach my $field (qw{author authorurl permalink}) {
+ + foreach my $field (qw{author authorurl description permalink}) {
+
+> Good idea, [[done]]. --[[Joey]]
diff --git a/doc/todo/different_search_engine.mdwn b/doc/todo/different_search_engine.mdwn
new file mode 100644
index 000000000..9d0fc92c9
--- /dev/null
+++ b/doc/todo/different_search_engine.mdwn
@@ -0,0 +1,332 @@
+[[done]], using xapian-omega! --[[Joey]]
+
+After using it for a while, my feeling is that hyperestraier, as used in
+the [[plugins/search]] plugin, is not robust enough for ikiwiki. It doesn't
+upgrade well, and it has a habit of sig-11 on certain input from time to
+time.
+
+So some other engine should be found and used instead.
+
+Enrico had one that he was using for debtags stuff that looked pretty good.
+That was [Xapian](http://www.xapian.org/), which has perl bindings in
+libsearch-xapian-perl. The nice thing about xapian is that it does a ranked
+search so it understands what words are most important in a search. (So
+does Lucene..) Another nice thing is it supports "more documents like this
+one" kind of search. --[[Joey]]
+
+## xapian
+
+I've investigated xapian briefly. I think a custom xapian indexer and use
+of omega for cgi searches could work well for ikiwiki. --[[Joey]]
+
+### indexer
+
+A custom indexer is needed because omindex isn't good enough for ikiwiki's
+needs for incremental rendering. (And because, since ikiwiki has page info
+in memory, it's silly to write it to disk and have omindex read it back.)
+
+The indexer would run as an ikiwiki hook. It needs to be passed the page
+name, and the content. Which hook to use is an open question.
+Possibilities:
+
+* `filter` - Since this runs before preprocess, only the actual text
+ written on the page would be indexed. Not text generated by directives,
+ pulled in by inlining, etc. There's something to be said for that. And
+ something to be said against it. It would also get markdown formatted
+ content, mostly, though it would still need to strip html, and also
+ probably strip preprocessor directives too.
+* `sanitize` - Would get the htmlized content, so would need to strip html.
+ Preprocessor directive output would be indexed. Doesn't get a destpage
+ parameter, making optimisation hard.
+* `format` - Would get the entire html page, including the page template.
+ Probably not a good choice as indexing the same template for each page
+ is unnecessary.
+
+The hook would remove any html from the content, and index it.
+It would need to add the same document data that omindex would.
+
+The indexer (and deleter) will need a way to figure out the ids in xapian
+of the documents to delete. One way is storing the id of each page in the
+ikiwiki index.
+
+The other way would be adding a special term to the xapian db that can be
+used with replace_document_by_term/delete_document_by_term.
+Hmm, let's use a term named "P<pagename>".
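
The unique-term scheme can be sketched with a plain dictionary standing in for the xapian database (no real xapian API here; `replace_document_by_term` and `delete_document_by_term` just model xapian's replace/delete-by-unique-term operations):

```python
db = {}  # unique term -> document data (stand-in for the xapian db)

def unique_term(pagename):
    # One reserved prefix, so page names never collide with real terms.
    return "P" + pagename

def replace_document_by_term(term, doc):
    db[term] = doc  # overwrites any previous version of the page

def delete_document_by_term(term):
    db.pop(term, None)  # deleting an unindexed page is harmless

replace_document_by_term(unique_term("bugs/foo"), "first rendering")
replace_document_by_term(unique_term("bugs/foo"), "updated rendering")
delete_document_by_term(unique_term("old/page"))
```

Nothing needs to be stored in ikiwiki's own index to locate the document later; the page name alone is enough.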
+
+The hook should try to avoid re-indexing pages that have not changed since
+they were last indexed. One problem is that, if a page with an inline is
+built, every inlined item will get each hook run. And so a naive hook would
+index each of those items, even though none of them have necessarily
+changed. Date stamps are one possibility. Another would be to have the
+hook skip indexing when `%preprocessing` is set (IkiWiki.pm would
+need to expose that variable.) Another approach would be to use a
+needsbuild hook and only index the pages that are being built.
+
+#### cgi
+
+The cgi hook would exec omega to handle the searching, much as is done
+with estseek in the current search plugin.
+
+It would first set `OMEGA_CONFIG_FILE=.ikiwiki/omega.conf` ; that omega.conf
+would set `database_dir=.ikiwiki/xapian` and probably also set a custom
+`template_dir`, which would have modified templates branded for ikiwiki. So
+the actual xapian db would be in `.ikiwiki/xapian/default/`.
+
+## lucene
+
+>> I've done a bit of prototyping on this. The current hip search library is [Lucene](http://lucene.apache.org/java/docs/). There's a Perl port called [Plucene](http://search.cpan.org/~tmtm/Plucene-1.25/). Given that it's already packaged, as `libplucene-perl`, I assumed it would be a good starting point. I've written a **very rough** patch against `IkiWiki/Plugin/search.pm` to handle the indexing side (there's no facility to view the results yet, although I have a command-line interface working). That's below, and should apply to SVN trunk.
+
+>> Of course, there are problems. ;-)
+
+>> * Plucene throws up a warning when running under Taint mode. There's a patch on the mailing list, but I haven't tried applying it yet. So for now you'll have to build IkiWiki with `NOTAINT=1 make install`.
+>> * If I kill `ikiwiki` while it's indexing, I can screw up Plucene's locks. I suspect that this will be an easy fix.
+
+>> There is a [C++ port of Lucene](http://sourceforge.net/projects/clucene/) which is packaged as `libclucene0`. The Perl interface to this is called [Lucene](http://search.cpan.org/~tbusch/Lucene-0.09/lib/Lucene.pm). This is supposed to be significantly faster, and presumably won't have the taint bug. The API is virtually the same, so it will be easy to switch over. I'd use this now, were it not for the lack of a package. (I assume you won't want to make core functionality depend on installing a module from CPAN). I've never built a Debian package before, so I can either learn then try building this, or somebody else could do the honours. ;-)
+
+>> If this seems a sensible approach, I'll write the CGI interface, and clean up the plugin. -- Ben
+
+>>> The weird thing about lucene is that these are all reimplementations of
+>>> it. Thank you java.. The C++ version seems like a better choice to me
+>>> (packages are trivial). --[[Joey]]
+
+> Might I suggest renaming the "search" plugin to "hyperestraier", and then creating new search plugins for different engines? No reason to pick a single replacement. --[[JoshTriplett]]
+
+<pre>
+Index: IkiWiki/Plugin/search.pm
+===================================================================
+--- IkiWiki/Plugin/search.pm (revision 2755)
++++ IkiWiki/Plugin/search.pm (working copy)
+@@ -1,33 +1,55 @@
+ #!/usr/bin/perl
+-# hyperestraier search engine plugin
+ package IkiWiki::Plugin::search;
+
+ use warnings;
+ use strict;
+ use IkiWiki;
+
++use Plucene::Analysis::SimpleAnalyzer;
++use Plucene::Document;
++use Plucene::Document::Field;
++use Plucene::Index::Reader;
++use Plucene::Index::Writer;
++use Plucene::QueryParser;
++use Plucene::Search::HitCollector;
++use Plucene::Search::IndexSearcher;
++
++#TODO: Run the Plucene optimiser after a rebuild
++#TODO: CGI query interface
++
++my $PLUCENE_DIR;
++# $config{wikistatedir} may not be defined at this point, so we delay setting $PLUCENE_DIR
++# until a subroutine actually needs it.
++sub init () {
++ error("Plucene: Statedir <$config{wikistatedir}> does not exist!")
++ unless -e $config{wikistatedir};
++ $PLUCENE_DIR = $config{wikistatedir}.'/plucene';
++}
++
+ sub import {
+- hook(type => "getopt", id => "hyperestraier",
+- call => \&amp;getopt);
+- hook(type => "checkconfig", id => "hyperestraier",
++ hook(type => "checkconfig", id => "plucene",
+ call => \&amp;checkconfig);
+- hook(type => "pagetemplate", id => "hyperestraier",
+- call => \&amp;pagetemplate);
+- hook(type => "delete", id => "hyperestraier",
++ hook(type => "delete", id => "plucene",
+ call => \&amp;delete);
+- hook(type => "change", id => "hyperestraier",
++ hook(type => "change", id => "plucene",
+ call => \&amp;change);
+- hook(type => "cgi", id => "hyperestraier",
+- call => \&amp;cgi);
+ }
+
+-sub getopt () {
+- eval q{use Getopt::Long};
+- error($@) if $@;
+- Getopt::Long::Configure('pass_through');
+- GetOptions("estseek=s" => \$config{estseek});
+-}
+
++sub writer {
++ init();
++ return Plucene::Index::Writer->new(
++ $PLUCENE_DIR, Plucene::Analysis::SimpleAnalyzer->new(),
++ (-e "$PLUCENE_DIR/segments" ? 0 : 1));
++}
++
++#TODO: Better name for this function.
++sub src2rendered_abs (@) {
++ return map { Encode::encode_utf8($config{destdir}."/$_") }
++ map { @{$renderedfiles{pagename($_)}} }
++ grep { defined pagetype($_) } @_;
++}
++
+ sub checkconfig () {
+ foreach my $required (qw(url cgiurl)) {
+ if (! length $config{$required}) {
+@@ -36,112 +58,55 @@
+ }
+ }
+
+-my $form;
+-sub pagetemplate (@) {
+- my %params=@_;
+- my $page=$params{page};
+- my $template=$params{template};
++#my $form;
++#sub pagetemplate (@) {
++# my %params=@_;
++# my $page=$params{page};
++# my $template=$params{template};
++#
++# # Add search box to page header.
++# if ($template->query(name => "searchform")) {
++# if (! defined $form) {
++# my $searchform = template("searchform.tmpl", blind_cache => 1);
++# $searchform->param(searchaction => $config{cgiurl});
++# $form=$searchform->output;
++# }
++#
++# $template->param(searchform => $form);
++# }
++#}
+
+- # Add search box to page header.
+- if ($template->query(name => "searchform")) {
+- if (! defined $form) {
+- my $searchform = template("searchform.tmpl", blind_cache => 1);
+- $searchform->param(searchaction => $config{cgiurl});
+- $form=$searchform->output;
+- }
+-
+- $template->param(searchform => $form);
+- }
+-}
+-
+ sub delete (@) {
+- debug(gettext("cleaning hyperestraier search index"));
+- estcmd("purge -cl");
+- estcfg();
++ debug("Plucene: purging: ".join(',',@_));
++ init();
++ my $reader = Plucene::Index::Reader->open($PLUCENE_DIR);
++ my @files = src2rendered_abs(@_);
++ for (@files) {
++ $reader->delete_term( Plucene::Index::Term->new({ field => "id", text => $_ }));
++ }
++ $reader->close;
+ }
+
+ sub change (@) {
+- debug(gettext("updating hyperestraier search index"));
+- estcmd("gather -cm -bc -cl -sd",
+- map {
+- Encode::encode_utf8($config{destdir}."/".$_)
+- foreach @{$renderedfiles{pagename($_)}};
+- } @_
+- );
+- estcfg();
++ debug("Plucene: updating search index");
++ init();
++ #TODO: Do we want to index source or rendered files?
++ #TODO: Store author, tags, etc. in distinct fields; may need new API hook.
++ my @files = src2rendered_abs(@_);
++ my $writer = writer();
++
++ for my $file (@files) {
++ my $doc = Plucene::Document->new;
++ $doc->add(Plucene::Document::Field->Keyword(id => $file));
++ my $data;
++ eval { $data = readfile($file) };
++ if ($@) {
++ debug("Plucene: can't read <$file> - $@");
++ next;
++ }
++ debug("Plucene: indexing <$file> (".length($data).")");
++ $doc->add(Plucene::Document::Field->UnStored('text' => $data));
++ $writer->add_document($doc);
++ }
+ }
+-
+-sub cgi ($) {
+- my $cgi=shift;
+-
+- if (defined $cgi->param('phrase') || defined $cgi->param("navi")) {
+- # only works for GET requests
+- chdir("$config{wikistatedir}/hyperestraier") || error("chdir: $!");
+- exec("./".IkiWiki::basename($config{cgiurl})) || error("estseek.cgi failed");
+- }
+-}
+-
+-my $configured=0;
+-sub estcfg () {
+- return if $configured;
+- $configured=1;
+-
+- my $estdir="$config{wikistatedir}/hyperestraier";
+- my $cgi=IkiWiki::basename($config{cgiurl});
+- $cgi=~s/\..*$//;
+-
+- my $newfile="$estdir/$cgi.tmpl.new";
+- my $cleanup = sub { unlink($newfile) };
+- open(TEMPLATE, ">:utf8", $newfile) || error("open $newfile: $!", $cleanup);
+- print TEMPLATE IkiWiki::misctemplate("search",
+- "<!--ESTFORM-->\n\n<!--ESTRESULT-->\n\n<!--ESTINFO-->\n\n",
+- baseurl => IkiWiki::dirname($config{cgiurl})."/") ||
+- error("write $newfile: $!", $cleanup);
+- close TEMPLATE || error("save $newfile: $!", $cleanup);
+- rename($newfile, "$estdir/$cgi.tmpl") ||
+- error("rename $newfile: $!", $cleanup);
+-
+- $newfile="$estdir/$cgi.conf";
+- open(TEMPLATE, ">$newfile") || error("open $newfile: $!", $cleanup);
+- my $template=template("estseek.conf");
+- eval q{use Cwd 'abs_path'};
+- $template->param(
+- index => $estdir,
+- tmplfile => "$estdir/$cgi.tmpl",
+- destdir => abs_path($config{destdir}),
+- url => $config{url},
+- );
+- print TEMPLATE $template->output || error("write $newfile: $!", $cleanup);
+- close TEMPLATE || error("save $newfile: $!", $cleanup);
+- rename($newfile, "$estdir/$cgi.conf") ||
+- error("rename $newfile: $!", $cleanup);
+-
+- $cgi="$estdir/".IkiWiki::basename($config{cgiurl});
+- unlink($cgi);
+- my $estseek = defined $config{estseek} ? $config{estseek} : '/usr/lib/estraier/estseek.cgi';
+- symlink($estseek, $cgi) || error("symlink $estseek $cgi: $!");
+-}
+-
+-sub estcmd ($;@) {
+- my @params=split(' ', shift);
+- push @params, "-cl", "$config{wikistatedir}/hyperestraier";
+- if (@_) {
+- push @params, "-";
+- }
+-
+- my $pid=open(CHILD, "|-");
+- if ($pid) {
+- # parent
+- foreach (@_) {
+- print CHILD "$_\n";
+- }
+- close(CHILD) || print STDERR "estcmd @params exited nonzero: $?\n";
+- }
+- else {
+- # child
+- open(STDOUT, "/dev/null"); # shut it up (closing won't work)
+- exec("estcmd", @params) || error("can't run estcmd");
+- }
+-}
+-
+-1
++1;
+</pre>
+
+
diff --git a/doc/todo/directive_docs.mdwn b/doc/todo/directive_docs.mdwn
new file mode 100644
index 000000000..2baa61b40
--- /dev/null
+++ b/doc/todo/directive_docs.mdwn
@@ -0,0 +1,79 @@
+The current basewiki is not [[self-documenting|todo/basewiki_should_be_self_documenting]]. In particular, if
+[[plugins/listdirectives]] is used, it creates a list with a bunch of
+broken links to directives/*, pages that do not currently exist in the
+docwiki or basewiki.
+
+This could be fixed by adding a page for each directive under
+`ikiwiki/directives`, and putting those into a new underlay, which the plugin
+could enable. Rather a lot of work and maintenance to document all the
+directives like that.
+
+I also considered having it link to the plugin that defined the
+directive. Then all the plugins can be included in a new underlay, which
+both [[plugins/listdirectives]] and [[plugins/websetup]] could enable.
+(The latter could be improved by making the plugin names in the web setup
+be links to docs about each plugin.)
+
+The problem I ran into doing that is that the existing plugin pages have a
+lot of stuff on them that you probably don't want in an underlay. The biggest
+issues were wikilinks to other pages in the docwiki (which would end up
+broken if the plugins were used as an underlay), and plugin pages that
+include examples of the plugin in use, which are sometimes rather expensive
+(eg, brokenlinks).
+
+Either way requires a lot of reorganisation/doc work, and an ongoing
+maintenance load.
+
+> Which has now been [[done]]. -- [[Will]]
+
+BTW, this patch would be needed for the second approach, to allow
+listdirectives to map from preprocessor directives back to the plugin that
+defined them: --[[Joey]]
+
+ commit 0486b46a629cae19ce89492d5ac498bbf9b84f5f
+ Author: Joey Hess <joey@kodama.kitenet.net>
+ Date: Mon Aug 25 15:38:51 2008 -0400
+
+ record which plugins registered which hooks
+
+ diff --git a/IkiWiki.pm b/IkiWiki.pm
+ index e476521..afe982a 100644
+ --- a/IkiWiki.pm
+ +++ b/IkiWiki.pm
+ @@ -493,6 +493,7 @@ sub loadplugins () {
+ return 1;
+ }
+
+ +my $loading_plugin;
+ sub loadplugin ($) {
+ my $plugin=shift;
+
+ @@ -502,14 +503,18 @@ sub loadplugin ($) {
+ "$installdir/lib/ikiwiki") {
+ if (defined $dir && -x "$dir/plugins/$plugin") {
+ require IkiWiki::Plugin::external;
+ + $loading_plugin=$plugin;
+ import IkiWiki::Plugin::external "$dir/plugins/$plugin";
+ + $loading_plugin=undef;
+ $loaded_plugins{$plugin}=1;
+ return 1;
+ }
+ }
+
+ my $mod="IkiWiki::Plugin::".possibly_foolish_untaint($plugin);
+ + $loading_plugin=$plugin;
+ eval qq{use $mod};
+ + $loading_plugin=undef;
+ if ($@) {
+ error("Failed to load plugin $mod: $@");
+ }
+ @@ -1429,6 +1434,9 @@ sub hook (@) {
+
+ return if $param{no_override} && exists $hooks{$param{type}}{$param{id}};
+
+ + # Record which plugin was being loaded when the hook was defined.
+ + $param{plugin}=$loading_plugin if defined $loading_plugin;
+ +
+ $hooks{$param{type}}{$param{id}}=\%param;
+ return 1;
+ }
diff --git a/doc/todo/discuss_without_login.mdwn b/doc/todo/discuss_without_login.mdwn
new file mode 100644
index 000000000..82311459d
--- /dev/null
+++ b/doc/todo/discuss_without_login.mdwn
@@ -0,0 +1,19 @@
+# Discuss without login? Or feedback forum? Or fine-tuned per-page access control?
+
+Any plugin or option for allowing website visitors to edit the discuss page without logging in (without having ikiwiki accounts)?
+
+Or any plugin to add a feedback form (and maybe threads) to extend a Wiki webpage?
+
+Or is there per-page access control that can be fine-tuned to lock some users or groups for specific pages?
+(The [[ikiwiki/pagespec]] does show a way to lock all pages except for Discussion pages, but I want some users to also be able to edit other pages.)
+
+I want a way for website visitors to be able to give feedback on the wiki pages without having to sign up or log in.
+I don't want them to be able to edit the existing wiki pages, except maybe the Discussion page.
+
+(For some reason, it seems like I asked this before ...)
+
+--JeremyReed
+
+[[todo/Done]]; there's now a plugin interface for this and several nice
+plugins including one allowing [[plugins/opendiscussion]]. More special-purpose
+(and less wiki-like) plugins can be added based on this. --[[Joey]]
diff --git a/doc/todo/discussion_page_as_blog.mdwn b/doc/todo/discussion_page_as_blog.mdwn
new file mode 100644
index 000000000..990b7ddb3
--- /dev/null
+++ b/doc/todo/discussion_page_as_blog.mdwn
@@ -0,0 +1,33 @@
+Look at a discussion page here or eg on wikipedia. It tends to turn into a
+mess. One nice way to avoid the mess would be to set up a discussion page
+as a blog so each new comment is a separate post.
+
+One issue is, would there be a way to do this for all new discussion pages
+by default somehow? Setting up the blog means inserting a preprocessor
+directive; and that could somehow happen automatically when the discussion
+page is first created. (Creating a bunch of empty discussion pages with
+such directives ahead of time would be silly.) Maybe some kind of new page
+template system would do the trick, so pages matching */Discussion start
+off as a clone of DiscussionTemplate. Although the first person to try to
+create the discussion page would still end up in an edit page with that
+template, which is not ideal. Hmm.
+
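As a sketch of what such a template system might look like in directive form (hypothetical configuration, in the style of what later became the [[plugins/edittemplate]] plugin; the page name `DiscussionTemplate` is an assumption):

```
\[[!edittemplate template="DiscussionTemplate" match="*/Discussion"]]
```

where `DiscussionTemplate` itself might contain little more than the inlining directive, with the page name filled in:

```
\[[!inline pages="<TMPL_VAR name>/*" rootpage="<TMPL_VAR name>" postform=yes]]
```
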
+Thinking about this some more, discussion links for pages that don't yet
+have discussion could go directly to the ikiwiki cgi, which could provide a
+post form, and create a new discussion page with the necessary inlining.
+
+Another issue is that discussions really want to be threaded. Does that
+mean that a page like foo/discussion/question should have its own
+foo/discussion/question/(discussion?)/answer page? Of course, rss feeds
+don't handle threading, and of course doing this might be dependent on the
+issue above. Worrying about threading may be overkill.
+
+> Something like [[discussion/castle]] and [[discussion/castle/discussion]]? (Sorry about the noise, btw.) --Ethan
+
+>> this really didn't seem to work -- here's my attempted comment: http://ikiwiki.info/sandbox/castle/discussion/test_comment/index.html. -- JonDowland
+
+I don't think that the nesting is very clear; I found it confusing.
+
+Would each page be its own individual blog? Or its own blog post? To me it seems like an entire wiki can be viewed as a blog, with threaded or unthreaded comments underneath.
+
+[[!tag soc done]]
diff --git a/doc/todo/discussion_page_as_blog/discussion/castle.mdwn b/doc/todo/discussion_page_as_blog/discussion/castle.mdwn
new file mode 100644
index 000000000..4d462f74f
--- /dev/null
+++ b/doc/todo/discussion_page_as_blog/discussion/castle.mdwn
@@ -0,0 +1,3 @@
+I want to test some stuff regarding discussion pages, so I'm using this area to knock together some prototypes.
+
+I like foo. \ No newline at end of file
diff --git a/doc/todo/discussion_page_as_blog/discussion/castle/discussion.mdwn b/doc/todo/discussion_page_as_blog/discussion/castle/discussion.mdwn
new file mode 100644
index 000000000..c95fb0e20
--- /dev/null
+++ b/doc/todo/discussion_page_as_blog/discussion/castle/discussion.mdwn
@@ -0,0 +1 @@
+[[!inline pages="sandbox/castle/discussion/* and !sandbox/castle/discussion/*/*" rootpage="sandbox/castle/discussion"]] \ No newline at end of file
diff --git a/doc/todo/discussion_page_as_blog/discussion/castle/discussion/Don__39__t_like_foo.mdwn b/doc/todo/discussion_page_as_blog/discussion/castle/discussion/Don__39__t_like_foo.mdwn
new file mode 100644
index 000000000..48ca72a9c
--- /dev/null
+++ b/doc/todo/discussion_page_as_blog/discussion/castle/discussion/Don__39__t_like_foo.mdwn
@@ -0,0 +1,3 @@
+I don't like foo. Have you tried living without foo?
+
+[[!inline pages="sandbox/castle/discussion/Don__39__t_like_foo/*" rootpage="sandbox/castle/discussion/Don__39__t_like_foo"]] \ No newline at end of file
diff --git a/doc/todo/discussion_page_as_blog/discussion/castle/discussion/Don__39__t_like_foo/how_about_bar.mdwn b/doc/todo/discussion_page_as_blog/discussion/castle/discussion/Don__39__t_like_foo/how_about_bar.mdwn
new file mode 100644
index 000000000..219ce5066
--- /dev/null
+++ b/doc/todo/discussion_page_as_blog/discussion/castle/discussion/Don__39__t_like_foo/how_about_bar.mdwn
@@ -0,0 +1 @@
+I like bar in cases like this. \ No newline at end of file
diff --git a/doc/todo/discussion_page_as_blog/discussion/castle/discussion/Don__39__t_like_foo/sdf.mdwn b/doc/todo/discussion_page_as_blog/discussion/castle/discussion/Don__39__t_like_foo/sdf.mdwn
new file mode 100644
index 000000000..23e3ba13f
--- /dev/null
+++ b/doc/todo/discussion_page_as_blog/discussion/castle/discussion/Don__39__t_like_foo/sdf.mdwn
@@ -0,0 +1,5 @@
+# just checking
+
+## again :)
+
+ohh.. great I didn't know that smileys work in here.. Good for me :) \ No newline at end of file
diff --git a/doc/todo/discussion_page_as_blog/discussion/castle/discussion/foo_is_ok.mdwn b/doc/todo/discussion_page_as_blog/discussion/castle/discussion/foo_is_ok.mdwn
new file mode 100644
index 000000000..bf4900be4
--- /dev/null
+++ b/doc/todo/discussion_page_as_blog/discussion/castle/discussion/foo_is_ok.mdwn
@@ -0,0 +1 @@
+In my experience foo is only OK. \ No newline at end of file
diff --git a/doc/todo/discussion_page_as_blog/discussion/castle/discussion/test.mdwn b/doc/todo/discussion_page_as_blog/discussion/castle/discussion/test.mdwn
new file mode 100644
index 000000000..038d718da
--- /dev/null
+++ b/doc/todo/discussion_page_as_blog/discussion/castle/discussion/test.mdwn
@@ -0,0 +1 @@
+testing
diff --git a/doc/todo/do_not_make_links_backwards.mdwn b/doc/todo/do_not_make_links_backwards.mdwn
new file mode 100644
index 000000000..4059d8e2a
--- /dev/null
+++ b/doc/todo/do_not_make_links_backwards.mdwn
@@ -0,0 +1,95 @@
+[[!template id=gitbranch branch=anarcat/backwards_links author="[[anarcat]]"]]
+
+I understand this may be a bit provocative, but I strongly feel that ikiwiki linking rules are backwards. I come from the world of wikis like MoinMoin and [[plugins/contrib/mediawiki]], where you use `\[[link|description]]`. The de facto wiki markup "[[plugins/creole]]" also uses that convention, as does raw HTML (href comes first!). Ikiwiki doesn't: here we need to use `\[[description|link]]`.
+
+Every time I come back to ikiwiki, I need to bend my mind backwards to create *proper* links. I understand that `\[[description|link]]` is more in line with Markdown's `[description](link)` approach, but in my mind it is too much of a problem for third-party plugins to be a proper justification. For example, the [[plugins/creole]] plugin works pretty much as expected *except* for links, because it can't override ikiwiki's internal link parser. For me that's a huge inconsistency that should be fixed.
+
+If there is an agreement within the community that we can change that, I am ready to work on a migration script or even a configuration variable... -- [[anarcat]]
+
+Dev notes
+---------
+
+I started looking into this, after encouraging words from Joey ("very long term roadmap", AKA "if someone does it"). It turns out it is less deeply rooted than I thought in the core of ikiwiki; everything being a plugin and all, this is also a plugin ([[plugins/link]]).
+
+The following needs to be done:
+
+ 1. the `link_regexp` variable needs to be turned backwards (or frontwards, if you like :P) (./) added an option for this, working!
+ 2. a config setting needs to be added to the `link` plugin so that we can choose if we want backwards links or not (./) `links_direction`, how does that sound? I have changed that from `backwards_links` to be more neutral. 'rtl' means `\[[link|text]]` and 'ltr' means `\[[text|link]]`
+ 3. a (solid!) parser needs to be written for [[ikiwiki-transition]] to change the actual links (if necessary) (./) done!
+ 4. rewrite tests to take into account the two syntaxes (!) would be done when we migrate to the syntax
+ 5. deal with underlays (./) i wrote a script to convert it to markdown
+
+Discussion
+----------
+
+> It's not at all obvious to me that `rtl` should mean "link before description"
+> and not the other way round. Perhaps `wikilink_text_first` => `1` for the historical
+> IkiWiki syntax or `0` for the Creole/Mediawiki syntax? --[[smcv]]
+>
+> > A friend made the argument that it is more natural for a human to read the `text` then the `link`, as the link is less important. Since we (occidental languages) read left to right, I felt this was appropriate. I also blindly assumed that it would "feel" appropriate for right-to-left languages (Arabic, Hebrew, etc.) to have those links backwards, since those languages are generally named "right to left".
+> >
+> > Originally, I named that parameter `backwards_links`, but then it wouldn't make sense in the long term, and isn't exactly neutral: it assumes the current way is backwards! Your suggestion is interesting, but I don't think the rtl/ltr nomenclature is problematic, with proper documentation of course... --[[anarcat]]
+
+There's a caveat: we can't have a per-wiki backwards_links option, because of the underlay, common to all wikis, which needs to be converted. So the option doesn't make much sense. Not sure how to deal with this... Maybe this needs to be at the package level? --[[anarcat]]
+
+> I've thought about adding a direction-neutral `\[[!link]]` directive -
+> see [[link plugin perhaps too general?]] for details. The basewiki
+> could use `\[[!link to=b desc=a]]` whenever it needs `\[[a|b]]`-style
+> links, maybe? --[[smcv]]
+
+>> It could, but it would be a pain to remember to do that.
+>>
+>> I feel that this should probably be a flag day transition because
+>> otherwise there will be a lot of variation between how different
+>> ikiwikis handle links, which is even worse than the current variation
+>> between ikiwiki and other wikis!
+>>
+>> There are quite likely ikiwiki page generators that build wikilinks
+>> too. One that's part of ikiwiki itself is `change.tmpl`. There may be
+>> others... --[[Joey]]
+
+>>> Agreed that it would be cleaner to just change everything, even though the transition might be painful.
+
+>>> Another interim option might be to change the basewiki links to be just \[[link to whatever]] without having a description.
+>>> That style of link would work whether the link style was "backwards" or "forwards". Unfortunately it could make some links less readable; after all, there is a reason why one wants to be able to change the link text! But I don't know what proportion of the links are like that. It's a thought, anyway.
+>>> --[[KathrynAndersen]]
+
+>>>> I dislike placing such requirements on the underlay, which is after
+>>>> all, just a subset of pages in this wiki, which many of the people
+>>>> editing may not even realize are part of the underlay. --[[Joey]]
+
+>>> Another option for internal links is to just use regular markdown links instead of `\[[text|link]]` markup; that way it works regardless. Then the documentation for the link plugin just has to state both syntaxes in a safe manner.
+>>> I also agree that we should just switch in one shot, although I am worried this means this could be postponed indefinitely. --[[anarcat]]
+
+>>>> I have done just that in my branch: now the underlay only uses wikilinks in the wikilink page; elsewhere regular markdown links are used. I haven't converted the whole of the doc/ directory however; that would be left to the migration. I have written an ikiwiki-transition tool to migrate from wikilinks to markdown while I was there. --[[anarcat]]
+
+>>>>> No, that is *not* an option. Relative markdown links **break** when
+>>>>> page A, containing a link, is inlined into page B. --[[Joey]]
+
+----
+
+FWIW, I think this change may well be painful, but is a good idea. I can never remember which way around it should be.
+Rather like USB plugs, I invariably have to try both ways. — [[Jon]]
+
+The bikeshed color should be ...
+--------------------------------
+
+...[blue](http://blue.bikeshed.org/) of course. :) Just to make things clear here, the "bikeshedding" potential is absolutely huge. Right to left? Left to right? Who's right? How could we even decide this?
+
+I think we can approach this rationally:
+
+ 1. left to right (text then link) can be considered more natural, and should therefore be supported
+ 2. it is supported in markdown using regular markdown links. In the proposed branch, the underlay wikilinks are converted to use regular markdown links
+ 3. ikiwiki links break other markup plugins, like mediawiki and creole, as those work right to left.
+ 4. those are recognized "standards" used by other popular sites, like Wikipedia, or any wiki supporting the Creole markup, which is [most wikis](http://www.wikicreole.org/wiki/Engines)
+
+Therefore, to respect interoperability and [POLA](https://en.wikipedia.org/wiki/Principle_of_least_astonishment), ikiwiki should respect that convention and reverse the way links are parsed by the link plugin, or else move that functionality out of the main core and into the creole/mediawiki modules, which I do not think is really an option.
+
+So here's a roadmap to deploy this change:
+
+ 1. the code in the backwards_links branch I am working on is tested and proven, then merged in
+ 2. a release of the 3.x branch is published with the possibility for wikis to convert to the new markup, with the notion that the older markup is deprecated
+ 3. this wiki is converted to the new markup
+ 4. 4.0 is released with the new markup enabled by default and runs ikiwiki-transition on your wiki on upgrade
+
+Note that ikiwiki-transition can be run multiple times and will convert your markup to and from rtl/ltr without issues, so this is pretty sturdy. I think the configuration variable can be kept throughout 4.x, with the notion that it will be completely removed eventually. --[[anarcat]]
diff --git a/doc/todo/done.mdwn b/doc/todo/done.mdwn
new file mode 100644
index 000000000..7fcbe44b6
--- /dev/null
+++ b/doc/todo/done.mdwn
@@ -0,0 +1,3 @@
+recently fixed [[TODO]] items
+
+[[!inline pages="link(todo/done) and !todo and !*/Discussion" sort=mtime show=10 archive=yes]]
diff --git a/doc/todo/double-click_protection_for_form_buttons.mdwn b/doc/todo/double-click_protection_for_form_buttons.mdwn
new file mode 100644
index 000000000..501be4498
--- /dev/null
+++ b/doc/todo/double-click_protection_for_form_buttons.mdwn
@@ -0,0 +1,5 @@
+A small piece of JS to prevent double-submitting forms would be quite nice. I seem to have developed a habit of doing this and having to resolve a merge conflict for two initial commits. -- [[Jon]]
+
+> By the time you see that merge conflict, the first commit has
+> already successfully happened, so you can just hit cancel
+> and throw away the second submit. --[[Joey]]
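
Still, for anyone who does want the JS, a minimal sketch might look like this (hypothetical, not part of ikiwiki; the form id `editpage` is an assumption about the edit page's markup). The guard itself is a pure function, kept separate from the DOM wiring:

```javascript
// Hypothetical double-submit protection sketch, not existing ikiwiki code.
// The first call returns true (let the submit through); every later call
// returns false (swallow the repeat click).
function makeSubmitGuard() {
	var submitted = false;
	return function () {
		if (submitted) {
			return false;
		}
		submitted = true;
		return true;
	};
}

// Browser wiring, assuming the edit form has id "editpage":
// document.getElementById("editpage").onsubmit = makeSubmitGuard();
```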
diff --git a/doc/todo/doxygen_support.mdwn b/doc/todo/doxygen_support.mdwn
new file mode 100644
index 000000000..4625da904
--- /dev/null
+++ b/doc/todo/doxygen_support.mdwn
@@ -0,0 +1,7 @@
+[[!tag wishlist]]
+
+Given that ikiwiki has a suggested use as a tool for developers, I was thinking it might be cool if ikiwiki had [Doxygen](http://www.doxygen.org/) support. I'm not exactly sure how the integration would work. Something along the lines of a plugin to support .dox files would be my first thought. I'd leave generating the documentation from any source files for a separate run of Doxygen - it'd be easier and you probably don't want the source being edited over the web.
+
+#### Background ####
+
+I have been involved with one project that uses Doxygen to generate their web pages and user docs, as well as their 'in code' documentation: <http://orca-robotics.sourceforge.net/orca/index.html>. This makes the whole system somewhat like ikiwiki, but without the cgi for public editing. I was thinking of trying to convince that project to move to ikiwiki, but they're not going to want to re-write all their documentation.
diff --git a/doc/todo/dynamic_rootpage.mdwn b/doc/todo/dynamic_rootpage.mdwn
new file mode 100644
index 000000000..3c39484bc
--- /dev/null
+++ b/doc/todo/dynamic_rootpage.mdwn
@@ -0,0 +1,35 @@
+I prefer to use a current year, month and day to archive my blog posts, for example
+`post/2007/11/12/foo-bar-baz` path is better for me than `post/foo-bar-baz`.
+Unfortunately it seems that `rootpage` parameter of inline plugin is very static.
+Is there a chance to make it more dynamic? Now I have to use the `svn mkdir`
+command to create appropriate subdirectories by hand.
+
+I think that you could add builtin functions or variables, for example `current_year()`
+or `$CURRENT_YEAR`, for use inside the `rootpage` parameter. Something like Manoj's
+calendar plugin does. Then my `rootpage` parameter could be like
+`rootpage="post/current_year()/current_month()/current_day()"`. Other good hints
+are welcome ;)
+
+What's your opinion, Joey? I hope it's also useful for other ikiwiki lovers :)
+
+--[[Paweł|ptecza]]
+
+>> Hello Joey! Is it a taboo subject? ;) --[[Paweł|ptecza]]
+
+>> No, but I don't know of a way to do it that feels flexible and right..
+>> Using functions as in your example doesn't feel right somehow.
+>> --[[Joey]]
+
+>>> Seems like a job for good ol' string interpolation. rootpage="post/$current_year/$current_month/$current_day"
+>>> Ikiwiki could provide some vars, and it would be nice to write plugins to also provide vars. Sort of like templates.
+>>> Does that feel OK? --[[sabr]]
+
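The interpolation sabr describes could be sketched in Perl roughly like this (a hypothetical illustration, not an existing ikiwiki API; the variable names are the ones suggested above):

```perl
# Hypothetical sketch: expand date variables in an inline rootpage
# parameter at posting time, so 'post/$current_year/$current_month'
# becomes e.g. "post/2007/11".
use POSIX qw(strftime);

sub expand_rootpage {
	my $rootpage=shift;
	my @now=localtime;
	my %vars=(
		current_year  => strftime("%Y", @now),
		current_month => strftime("%m", @now),
		current_day   => strftime("%d", @now),
	);
	# Replace $name with its value; leave unknown variables alone.
	$rootpage=~s/\$(\w+)/exists $vars{$1} ? $vars{$1} : '$'.$1/eg;
	return $rootpage;
}

print expand_rootpage('post/$current_year/$current_month/$current_day'), "\n";
```

Plugins could register additional variables into the same hash, much as templates work.
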
+> I want the exact same thing. My compromise was to create a `datedblog` module which overrides `inline`'s `sessioncgi` hook
+> with something that sets the new page name to `%Y-%m-%d.$page` and sets up a meta directive at the beginning of
+> the content, with the title you wanted. Now if you use the `datedblog` module, you get dated blog entries. But I'd
+> like to have traditional `inline` functionality too. This would work great if there were a way to change the `do`
+> parameter in the `blogpost` template's form; if I could change it to `datedblog` instead of `blog` then I could hook
+> my datedblog module in nicely, without having to override anything. What would be the right way to do that? --[[neale]]
+
+> This is basically the same request as
+> [[todo/inline_postform_autotitles]]. --[[smcv]]
diff --git a/doc/todo/ease_archivepage_styling.mdwn b/doc/todo/ease_archivepage_styling.mdwn
new file mode 100644
index 000000000..67415c176
--- /dev/null
+++ b/doc/todo/ease_archivepage_styling.mdwn
@@ -0,0 +1,59 @@
+Hi! Please apply the following [[patch]] to make the
+`archivepage.tmpl` template more semantic and easier to style with
+a local CSS:
+
+ From 4e5cc0d9e5582f20df9f26dd5b1937ead0b46827 Mon Sep 17 00:00:00 2001
+ From: intrigeri <intrigeri@boum.org>
+ Date: Sat, 18 Aug 2012 10:34:36 +0200
+ Subject: [PATCH] Ease archivepage styling by using named classes, move
+ styling to the CSS.
+
+ ---
+ doc/style.css | 4 ++++
+ templates/archivepage.tmpl | 8 ++++----
+ 2 files changed, 8 insertions(+), 4 deletions(-)
+
+ diff --git a/doc/style.css b/doc/style.css
+ index 6e2afce..5fb4100 100644
+ --- a/doc/style.css
+ +++ b/doc/style.css
+ @@ -202,6 +202,10 @@ div.recentchanges {
+ margin-top: 1em;
+ }
+
+ +.archivepagedate {
+ + font-style: italic;
+ +}
+ +
+ .error {
+ color: #C00;
+ }
+ diff --git a/templates/archivepage.tmpl b/templates/archivepage.tmpl
+ index 93bdd9c..3e0bd9b 100644
+ --- a/templates/archivepage.tmpl
+ +++ b/templates/archivepage.tmpl
+ @@ -1,10 +1,10 @@
+ -<p>
+ +<div class="archivepage">
+ <TMPL_IF PERMALINK>
+ <a href="<TMPL_VAR PERMALINK>"><TMPL_VAR TITLE></a><br />
+ <TMPL_ELSE>
+ <a href="<TMPL_VAR PAGEURL>"><TMPL_VAR TITLE></a><br />
+ </TMPL_IF>
+ -<i>
+ +<span class="archivepagedate">
+ Posted <TMPL_VAR CTIME>
+ <TMPL_IF AUTHOR>
+ by <span class="author">
+ @@ -15,5 +15,5 @@ by <span class="author">
+ </TMPL_IF>
+ </span>
+ </TMPL_IF>
+ -</i>
+ -</p>
+ +</span>
+ +</div>
+ --
+ 1.7.10.4
+
+> [[done]] --[[Joey]]
diff --git a/doc/todo/edit_form:_no_fixed_size_for_textarea.mdwn b/doc/todo/edit_form:_no_fixed_size_for_textarea.mdwn
new file mode 100644
index 000000000..77e46049f
--- /dev/null
+++ b/doc/todo/edit_form:_no_fixed_size_for_textarea.mdwn
@@ -0,0 +1,52 @@
+At the moment the text area in the edit form has a fixed size of 20 rows.
+
+On longer pages it's not very comfortable to edit pages with such a small box. The whole screen size should be used instead ([example](http://img3.imagebanana.com/img/bl10u9mb/editingtodo_1241804460828.png)).
+
+> The whole screen width is used, via the following
+> from style.css:
+>
+>     #editcontent {
+>     width: 100%;
+>     }
+>
+> Perhaps you have replaced it with a modified style sheet that does not
+> include that? --[[Joey]]
+
+>> The screen shot was made with http://ikiwiki.info/ where i didn't change anything. The width is optimally used. The problem is the height.
+
+>>> You confused me by talking about rows...
+>>> I don't know how to allow CSS to resize a textarea
+>>> to the full browser height. The obvious `height: 75%;`
+>>> does not work, at least in firefox and epiphany.
+>>>
+>>> Ah, of course, if it did work, it'd make it be 75% of
+>>> the full *page* height, and not the browser window height.
+>>>
+>>> According to
+>>> [this page](http://stackoverflow.com/questions/632983/css-height-if-textarea-as-a-percentage-of-the-viewport-height):
+>>>>>50% of what? Parent says ‘auto’, which means base it on the height of the child content. Which depends on the height on the parent. Argh! etc.
+>>>>>
+>>>>>So you have to give its parent a percentage height. And the parent's parent, all the way up to the root.
+>>> So, other than a javascript-based resizer, some very tricky and invasive CSS
+>>> seems to be needed. Please someone let me know if you succeed in doing that.
+>>> --[[Joey]]
+
+>>>>>> the javascript approach would need to work something like this: you need to know about the "bottom-most" item on the edit page, and get a handle for that object in the DOM. You can then obtain the absolute position height-wise of this element and the absolute position of the bottom of the window to determine the pixel-difference. Then, you set the height of the textarea to (current height in px) + determined-value. This needs to be re-triggered on various resize events, at least for the window and probably for other elements too. I may have a stab at this at some point. -- [[Jon]]
+
+Google Chrome has a completely elegant fix for this problem: all textareas
+have a small resize handle in a corner, that can be dragged around. No
+nasty javascript needed. IMHO, this is the right solution, and I hope other
+browsers emulate it. [[done]]
+--[[Joey]]
+
+Wouldn't it be possible to just implement an integer-valued setting for this, accessible via the "Setup" wiki page? This would require a wiki regen, but such a setting would not be changed frequently, I suppose. Also, Mediawiki has this implemented as a per-user setting (two settings, actually -- number of rows and columns of the edit area); such a per-user setting would be the best possible implementation, but I'm not sure if ikiwiki already supports per-user settings. Please consider implementing this, as the current 20 rows is a great PITA for any non-trivial page.
+
+> I don't think it would need a wiki rebuild, as the textarea is generated dynamically by the CGI when you perform a CGI action, and (as far as I know) is not cooked into any static content. -- [[Jon]]
+
+>> There is no need for a configuration setting for this -- to change
+>> the default height from 20 rows to something else, you can just put
+>> something like this in your `local.css`: --[[Joey]]
+
+ #editcontent {
+ height: 50em;
+ }
diff --git a/doc/todo/edittemplate_should_look_in_templates_directory_by_default.mdwn b/doc/todo/edittemplate_should_look_in_templates_directory_by_default.mdwn
new file mode 100644
index 000000000..4bc10e432
--- /dev/null
+++ b/doc/todo/edittemplate_should_look_in_templates_directory_by_default.mdwn
@@ -0,0 +1,8 @@
+[[plugins/edittemplate]] looks for the specified template relative to the
+page the directive appears on. Which can be handy, eg, make a
+blog/mytemplate and put the directive on blog, and it will find
+"mytemplate". However, it can also be confusing, since other templates
+are always looked for in `templates/`.
+
+I think it should probably fall back to looking for `templates/$foo`.
+--[[Joey]]
diff --git a/doc/todo/edittemplate_should_support_uuid__44___date_variables.mdwn b/doc/todo/edittemplate_should_support_uuid__44___date_variables.mdwn
new file mode 100644
index 000000000..7ec95b536
--- /dev/null
+++ b/doc/todo/edittemplate_should_support_uuid__44___date_variables.mdwn
@@ -0,0 +1,19 @@
+[[!tag wishlist patch]]
+
+I use a default template for all new pages:
+
+ \[[!meta title="<TMPL_VAR name>"]]
+ \[[!meta author=]]
+ \[[!meta date="<TMPL_VAR time>"]]
+ \[[!meta guid="urn:uuid:<TMPL_VAR uuid>"]]
+ \[[!tag ]]
+
+This encourages me to include useful metadata on the page. In particular, though, I've modified the `edittemplate` plugin to generate a uuid for the guid used by `inline`. Importantly, this keeps `inline` from flooding aggregators when I rename these pages.
+
+I've also noticed that IkiWiki seems to use the creation time for the generated page for the page date. This means that when I do a rebuild, `inline`d pages get shuffled. The inclusion of a `time` variable in `edittemplate` (and in a `meta` declaration for all such pages) prevents the date from changing unexpectedly.
+
+I've already made these changes in my installation, and have made my patches available in the `edittemplate` branch of git://civilfritz.net/ikiwiki.git.
+
+Changes to the structure of `$pagestate{$registering_page}{edittemplate}{$pagespec}` mean that a `cgi` rebuild is necessary (for reasons I don't entirely understand); but I think that's preferable to creating an entirely separate `$pagestate` namespace for storing parameters. That said, I'm not really a perl programmer, so corrections are welcome.
+
+> I like this patch. I hate seeing things I've already read get marked as unread in my rss feed. -- [[JoshBBall]]
diff --git a/doc/todo/else_parameter_for_map_plugin.mdwn b/doc/todo/else_parameter_for_map_plugin.mdwn
new file mode 100644
index 000000000..981e50d43
--- /dev/null
+++ b/doc/todo/else_parameter_for_map_plugin.mdwn
@@ -0,0 +1,56 @@
+[[!tag patch done]]
+
+[[plugins/map]] (and I) could benefit from a bonus parameter:
+
+ else="Display this if no page matches the PageSpec"
+
+This was quite simple, so I implemented this (branch "map" in my
+ikiwiki repo, see my user page for the up-to-date URL). Not patched the
+documentation yet, I'm waiting for feedback first, but I'll do it for sure. -- [[intrigeri]]
+
+> Can't a [[plugins/conditional]] be used for this?
+> --[[Joey]]
+
+>> Hmmm, what do you mean? Adding a syntax such as the one below?
+>> Or something else?
+
+ \[[!if test="map(" then="..." else="..."]]
+
+>> What would you write in the `then` clause?
+>> I'm not opposed at all to rewriting my two-liner, but I don't understand.
+>> --[[intrigeri]]
+
+ \[[!if test="foo/*" then="""
+ [[!map pages="foo/*"]]
+ """ else="no pages"]]
+
+--[[Joey]]
+
+>>> I'm not convinced: the syntax you're proposing means duplicating
+>>> the pagespec (once in the test clause, and once in the map query), which I find
+>>> not only inelegant, which I can live with, but also tiring and impractical:
+>>> my `else` suggestion
+>>> finds its roots in map queries with rather long pagespecs. On the other
+>>> hand, if I'm the only one using map in such a way, I can live with this
+>>> heavy duplicated syntax without bloating the map plugin with features
+>>> no-one but me needs. On the other other hand, the patch is a 3-liner.
+>>> I'm not fixed yet, I'll think about it. --[[intrigeri]]
+
+>>>> Write a [[plugins/template]] which accepts a pagespec and an
+>>>> "else" clause, and then you won't have to duplicate the
+>>>> pagespec. --[[JoshTriplett]]
+
+>>>> Yeah, the patch is obviously very simple. My problem with it really is
+>>>> that there would seem to be several other places in ikiwiki where
+>>>> someone might want to be able to handle an "else" case where a
+>>>> pagespec expands to nothing. And adding else cases for all of them
+>>>> could be a bit much. --[[Joey]]
+
+>>>>> Agreed, and tagging as done. For the record, here is the [[plugins/template]] I use:
+
+ \[[!if test="<TMPL_VAR raw_pages>"
+ then="""<TMPL_VAR intro>
+ [[!map pages="<TMPL_VAR raw_pages>"]]"""
+ else="<TMPL_VAR else>"]]
+
+>>>>> --[[intrigeri]]
diff --git a/doc/todo/enable-htaccess-files.mdwn b/doc/todo/enable-htaccess-files.mdwn
new file mode 100644
index 000000000..3b9721d50
--- /dev/null
+++ b/doc/todo/enable-htaccess-files.mdwn
@@ -0,0 +1,80 @@
+ Index: IkiWiki.pm
+ ===================================================================
+ --- IkiWiki.pm (revision 2981)
+ +++ IkiWiki.pm (working copy)
+ @@ -26,7 +26,7 @@
+ memoize("file_pruned");
+
+ sub defaultconfig () {
+ - wiki_file_prune_regexps => [qr/\.\./, qr/^\./, qr/\/\./,
+ + wiki_file_prune_regexps => [qr/\.\./, qr/^\.(?!htaccess)/, qr/\/\.(?!htaccess)/,
+ qr/\.x?html?$/, qr/\.ikiwiki-new$/,
+ qr/(^|\/).svn\//, qr/.arch-ids\//, qr/{arch}\//],
+ wiki_link_regexp => qr/\[\[(?:([^\]\|]+)\|)?([^\s\]#]+)(?:#([^\s\]]+))?\]\]/,
+
+> Note that the above patch is **completely broken**.
+> It removes the crucial excludes of all files starting with a dot.
+> The negative regexps for htaccess have no effect, so the whole
+> thing only "works" because it allows *any* file starting with a dot.
+> If you applied this patch to your ikiwiki, you opened a huge security
+> hole. --[[Joey]]
+
+[[!tag patch patch/core]]
+
+This lets the site administrator have a `.htaccess` file in their underlay
+directory, say, then get it copied over when the wiki is built. Without
+this, installations that are located at the root of a domain don't get the
+benefit of `.htaccess` such as improved directory listings, IP blocking,
+URL rewriting, authorisation, etc.
+
+> I'm concerned about security ramifications of this patch. While ikiwiki
+> won't allow editing such a .htaccess file in the web interface, it would
+> be possible for a user who has svn commit access to the wiki to use it to
+> add a .htaccess file that does $EVIL.
+>
+> Perhaps this should be something that is configurable via the setup file
+> instead. --[[Joey]]
+
+> See
+
+---
+
+Hi, I would like to have my `.htaccess` files in the svn repository so that ikiwiki exports them to my webspace with every commit.
+
+That way I have revision control on those files too. That may be a security concern, but I trust everybody who has svn commit
+access, and such `.htaccess` files should not be accessible through the wiki CGI. Of course, it could default to 'off'.
+
+> See [[!debbug 447267]] for a patch for this.
+
+>> It looks to me as though this functionality won't be included in ikiwiki
+>> unless someone who wants it takes responsibility for updating the patch
+>> from that Debian bug to be applicable to current versions, so that there's a
+>> setup file parameter for extra filenames to allow, defaulting to none
+>> (i.e. a less simplistic patch than the one at the top of this page).
+>> Joey, is this an accurate summary? --[[smcv]]
+
+---
+
+bump! I would like to see some form of this functionality included in ikiwiki. I use a patched version, but
+it's a bit of a PITA to constantly apply the patch (and sometimes to forget!). I know the security concern is important
+to consider, but I use ikiwiki with a very small group of collaborators, so svn/web access is under control,
+and `.htaccess` is for limiting access to some areas of the wiki.
+It should be off by default, of course. --Max
+
+---
++1 I want `.htaccess` so I can rewrite some old Wordpress URLs to make feeds work again. --[[hendry]]
+
+> Unless you cannot modify apache's configuration, you do not need htaccess
+> to do that. Apache's documentation recommends against using htaccess
+> unless you're a user who cannot modify the main server configuration.
+> --[[Joey]]
+
+---
++1 for various purposes (but sometimes the filename isn't `.htaccess`, so please make it configurable) --[[schmonz]]
+
+> I've described a workaround for one use case at the [[plugins/rsync]] [[plugins/rsync/discussion]] page. --[[schmonz]]
+
+---
+
+[[done]], you can use the `include` setting to override the default
+excludes now. Please use extreme caution when doing so. --[[Joey]]
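
For example, a perl-format setup file might allow just `.htaccess` past the default excludes like this (a sketch; check your version's documentation for the exact option name, and use it with the extreme caution noted above):

    include => '^\.htaccess$',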
diff --git a/doc/todo/enable_arbitrary_markup_for_directives.mdwn b/doc/todo/enable_arbitrary_markup_for_directives.mdwn
new file mode 100644
index 000000000..c1f0f86ed
--- /dev/null
+++ b/doc/todo/enable_arbitrary_markup_for_directives.mdwn
@@ -0,0 +1,47 @@
+One of the good things about [PmWiki](http://www.pmwiki.org) is the ability to treat arbitrary markup as directives.
+In ikiwiki, all directives have the same format:
+
+\[[!name arguments]]
+
+But with PmWiki, directives can be added to the engine (via its "Markup" hook) with the usual name and function, and also with a regexp containing capturing parentheses; the results of the match are passed to the given function.
+Would it be possible to give the "preprocess" hook an optional regexp argument that acted in a similar fashion?
+
+For example, one could then write a plugin which would treat
+
+Category: Foo, Bar
+
+as a tag, by using a regex such as /^Category:\s*([\w\s,]+)$/; the result "Foo, Bar" could then be further processed by the hook function.
+
+This could also make it easier to support more styles of markup, rather than having to do all the processing in "htmlize" and/or "filter".
+
+-- [[KathrynAndersen]]
+
+[[!taglink wishlist]]
+
+> Arbitrary text transformations can already be done via the filter and
+> sanitize hooks. That's how the smiley and typography plugins do their
+> thing.
+>
+> AFAICS, the only benefit to having a regexp-based-hook interface is less
+> overhead in passing page content into the hooks. But that overhead is a
+> small amount of the total render time.
+>
+> Also, I notice that smiley does such complicated things in its sanitize
+> hook (ie, it looks at html context around the smilies) that a simple
+> matching regexp would not be sufficient. Furthermore, typography needs to
+> pass the page content into the library it uses, which does not expose
+> regexps to match on. So ikiwiki's more general filtering interface seems
+> to allow both of these to do things that could not be done with the
+> PmWiki interface. --[[Joey]]
+
+>>You have some good points. I was aware of using filter, but it didn't occur to me that one could use sanitize to do processing as well, probably because "sanitize" brings to mind removing harmful content rather than making other alterations.
+>>It has also occurred to me, on further thought, that if one wants one's chosen markup to be processed during the "preprocess" stage, one could convert the chosen markup to directive-style markup during the "filter" stage, and then the directive would be processed during the "preprocess" stage as usual. Is there a tag for "no longer on the wishlist"? --[[KathrynAndersen]]
+
+>>> Yeah, sanitize is a misleading name for the relatively few things that
+>>> use it this way.
+>>>
+>>> While you could do a filter to preprocess step, it is a bit
+>>> of a long way round, since filter always runs just before
+>>> preprocess.
+>>>
+>>> Anyway, guess this is [[done]] --[[Joey]]
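
The filter-stage conversion described above could be sketched like this (illustrative Python, not ikiwiki's actual Perl plugin API; the regexp is restricted to single lines to avoid matching across them):

```python
import re

# Rewrite "Category: Foo, Bar" lines into a tag directive, which the
# preprocess stage would then handle as usual.
CATEGORY_RE = re.compile(r'^Category:[ \t]*([\w ,]+)$', re.MULTILINE)

def categories_to_tag_directive(content):
    def repl(match):
        tags = [t.strip() for t in match.group(1).split(',') if t.strip()]
        return '[[!tag %s]]' % ' '.join(tags)
    return CATEGORY_RE.sub(repl, content)
```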
diff --git a/doc/todo/etherpad_support.mdwn b/doc/todo/etherpad_support.mdwn
new file mode 100644
index 000000000..c11243ffc
--- /dev/null
+++ b/doc/todo/etherpad_support.mdwn
@@ -0,0 +1,22 @@
+[[Other wikis are doing it|https://www.mediawiki.org/wiki/Extension:EtherEditor]], so why not join the fray? The idea here would be to hook the main editor into etherpad.
+
+Trivial implementation
+----------------------
+
+There are a lot of funky things that could be done here, but the basic functionality would be to throw the document into etherpad and have everyone who edits the same page join the same etherpad. Only one person would need to save the document, and the last person to save it would determine the final version. Documents would be left on the etherpad server. That's what I would call the trivial way to go about this.
+
+This would translate into a simple javascript hook for the editor page. The pad name could simply be the page name, which would be insecure for private wikis.
+
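One way to avoid guessable pad names on a private wiki would be to derive the pad id from the page name and a per-wiki secret, e.g. (a sketch; how the secret is stored and shared is hand-waved here):

```python
import hashlib
import hmac

def pad_id(page, secret):
    # Derive a hard-to-guess etherpad pad id from a page name and a wiki secret.
    return hmac.new(secret.encode(), page.encode(), hashlib.sha256).hexdigest()[:16]
```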
+Garbage-collecting implementation
+---------------------------------
+
+This would require a bit more work. With this implementation, a "counter" would be kept of the users editing the page simultaneously. Each time a user saves the page, the counter goes down; when it reaches zero, the pad is deleted.
+
+Resources
+---------
+
+ * [etherpad jquery plugin](https://github.com/ether/etherpad-lite-jquery-plugin) - for embedding in any page
+ * [embed parameters](https://github.com/ether/etherpad-lite/wiki/Embed-Parameters) - for embedding using an iframe, probably not what we want
+ * [other integrations](https://github.com/ether/etherpad-lite/wiki/Third-party-web-services-that-have-support-for-Etherpad-Lite) - document us here when done
+ * [no Perl API implementation](https://github.com/ether/etherpad-lite/wiki/HTTP-API-client-libraries) - we'll have to write our own?
+ * [API documentation](http://etherpad.org/doc/v1.2.0/)
diff --git a/doc/todo/excluding_commit_mails.mdwn b/doc/todo/excluding_commit_mails.mdwn
new file mode 100644
index 000000000..9ae838fe0
--- /dev/null
+++ b/doc/todo/excluding_commit_mails.mdwn
@@ -0,0 +1,19 @@
+It would be good to be able to exclude commits made by a given user from
+generating commit mails.
+
+My immediate need for this is because I subscribed to commit mails using my
+openid. So I don't get commit mails for changes I make over the web, using
+that id. But, if I do a svn commit, that's from a "different" user, so a
+commit mail is sent to me. This particular case could be treated as ikiwiki
+needing some way to link together openids and other accounts, which could
+also be good, but I think the general case of not wanting to see changes
+some other user makes is reasonable.
+
+Extending pagespecs for commit mails would be a nice approach. Then I could
+subscribe to:
+
+ * and !SandBox and !user(joey)
+
+Insert standard argument about how wonderfully flexible this is. :-)
+
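The subscription pagespec above, `* and !SandBox and !user(joey)`, amounts to a predicate like this (an illustration of its meaning in Python, not ikiwiki's actual pagespec matcher; `user()` is the proposed extension):

```python
def wants_mail(page, committer):
    # "* and !SandBox and !user(joey)": any page, except SandBox,
    # except commits made by the user joey.
    return page != "SandBox" and committer != "joey"
```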
+[[done]]
diff --git a/doc/todo/fancypodcast.mdwn b/doc/todo/fancypodcast.mdwn
new file mode 100644
index 000000000..64af7e8a9
--- /dev/null
+++ b/doc/todo/fancypodcast.mdwn
@@ -0,0 +1,330 @@
+ikiwiki's simple podcasting, while elegant and minimal, doesn't (as
+mentioned in [[todo/blogging]]) produce full-featured feeds. In
+fancy podcasts, episodes are accompanied by text content. The feeds
+also have lots more metadata.
+
+[[!toc]]
+
+## Status
+
+[[!template id=gitbranch branch=schmonz/fancypodcast author="[[schmonz]]"]]
+[[!tag patch]]
+
+In summary, the branch preserves ikiwiki's existing podcast behavior,
+adds more featureful behavior, and has been tested to work well in
+some common podcatchers. I believe it is ready for review and
+possible integration, and I'd like to get feedback to that effect
+(or to the contrary) before making further enhancements. I know
+[[joey]]'s the final arbiter here, but I'd appreciate any qualified,
+critical eyes ([[smcv]]?) raking over my diffs. --[[schmonz]]
+
+## Features
+
+[[!table data="""
+Feature |iTunes RSS|iTunes Atom|Downcast RSS|Downcast Atom
+Feed image | | | |
+Feed title |(./) |(./) |(./) |(./)
+Feed publisher | | | |
+Feed "category" | | | |
+Feed date |(./) |(./) |(./) |(./)
+Feed description |(./) |(./) |(./) |
+Episode image | | | |
+Episode title |(./) |(./) |(./) |(./)
+Episode date |(./) |(./) |(./) |(./)
+Episode duration | | | |
+Episode author | | | |
+Episode description|(./) |(./) |(./) |
+Episode enclosure |(./) |(./) |(./) |(./)
+"""]]
+
+## Design
+
+7. For each fancy podcast episode, write a blog post containing
+ `\[[!meta enclosure="WikiLink/to/media.mp3"]]`. (Don't specify
+   more than one enclosure -- but if you do, the last one wins.)
+7. When rendering to HTML (single-page or inlined), append a link
+ to the media file.
+7. When rendering to RSS/Atom, the text is the entry's content and
+ the media file is its enclosure.
+7. Don't break simple podcasts in pursuit of fancy podcasts.
+
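In RSS terms, a post with an enclosure would then render to an item roughly like this (the url, length, and guid are invented for illustration):

    <item>
      <title>Episode 1</title>
      <description>Show notes for the episode.</description>
      <enclosure url="http://example.com/media/episode1.mp3"
                 length="14295820" type="audio/mpeg" />
      <guid>http://example.com/2013/episode1/</guid>
    </item>
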
+## Implementation
+
+### Completed
+
+* Cover the existing simple podcast behavior with tests.
+* Add an `enclosure` field to [[plugins/meta]] that expands the
+ given [[ikiwiki/WikiLink]] to an absolute URL (feed enclosures
+ pretty much need to be, and the reference feeds I've looked at
+ all do this).
+* Write failing tests for the desired single-page and inlined
+ HTML behavior, then make them pass by adding enclosure stanzas
+ to `{,inline}page.tmpl`.
+* Write failing tests for the desired RSS/Atom behavior, then make
+ them pass via changes to `{atom,rss}item.tmpl` and [[plugins/inline]].
+* Match feature-for-feature with
+ [tru_podcast](http://www.rainskit.com/blog/542/tru_podcast-a-podcasting-plugin-for-textpattern)
+ (what [[schmonz]] will be migrating from).
+* Enrich [feed metadata](http://cyber.law.harvard.edu/rss/rss.html)
+ by catching up `rsspage.tmpl` to `atompage.tmpl`.
+* Verify that [[plugins/more]] plays well with fancy podcasts.
+* Verify that the feeds validate.
+* Subscribe to a fancy feed in some common podcatchers and verify
+ display details against a reference podcast.
+* Verify smooth transitions for two common use cases (see testing
+ details below).
+
+### Must-have (for [[schmonz]], anyway)
+
+* Think carefully about UTF-8.
+* Verify that _all_ the tests pass (not just my new ones).
+
+## Migration
+
+### Upgrading within ikiwiki: from simple to fancy
+
+#### My test podcast
+
+For this test, I chose a podcast that tries to work around ikiwiki's
+current limitations by issuing two separate `inline`s:
+
+* One with `feedonly=yes` that includes `.mdwn`, `.pdf`, and `.mp3`
+* One with `feeds=no` that includes only `.mdwn` (and makes a trail)
+
+This has the following effects:
+
+* Browser: sees just the articles (each of which has a manually
+ created link to its corresponding media file)
+* Feedreader: sees all the articles and media in one flat stream
+* Podcatcher: sees just the media (sans articles)
+
+I want instead to write one `inline` with these effects:
+
+* Browser: sees just the articles (each of which automatically links
+ to its enclosure)
+* Feedreader: sees just the articles (each of which specifies its
+ enclosure)
+* Podcatcher: sees just the enclosures (each of which has an enclosing
+ article, rendered as the media's "description")
+
+#### Upgrade steps
+
+7. Set up a non-production copy of the podcast.
+ 7. Visually diff RSS and Atom feeds against production.
+ 7. Subscribe to the copy (both feeds) in `r2e`, iTunes, Downcast.
+7. Apply fancypodcast patch to the installed ikiwiki:
+ 7. `cd ~/Documents/trees/ikiwiki && git checkout fancypodcast`
+ 7. `git diff --no-prefix master > ~/Documents/trees/localpatches/www/ikiwiki/fancypodcast.diff`
+ 7. `cd ~/Documents/trees/pkgsrc-current/www/ikiwiki && make deinstall && make install clean`
+7. Verify that simple podcasts are unaffected:
+ 7. Rerun `ikiwiki --setup`.
+ 7. `diff -uB simple-before.rss simple-after.rss`
+ * A few new elements and attributes, as expected.
+ 7. `diff -uB simple-before.atom simple-after.atom`
+ * No change.
+7. Remove the feed-only `inline` and enable feeds on the remaining one.
+7. Convert articles' manual download links to `\[[!meta enclosure=""]]`.
+7. I want existing and future podcatchers to get my new fancy
+ episodes, and I know my podcast isn't in any planets, so I'm
+ going to skip [[tips/howto avoid flooding aggregators]].
+7. Rerun `ikiwiki --setup`.
+7. Verify browser shows the same stuff.
+7. `diff -uB simple-after.rss fancy-after.rss # and atom`
+ * MP3s and PDFs are no longer naked enclosures, but belong to
+ articles as they should.
+ * Articles have updated modification times, as they should.
+7. `r2e run` (both RSS and Atom)
+ * Nothing new with the default `trust-guid = True` (otherwise
+ would expect updated articles).
+7. iTunes "Update Podcast" (both RSS and Atom)
+ * Added one episode per article, with article text as the episode
+ description.
+ * Kept old naked-enclosure episodes around.
+7. Downcast refresh (RSS):
+ * Added one episode per article, with article text as the episode
+ description.
+ * Kept old naked-enclosure episodes around.
+7. Downcast refresh (Atom):
+ * Added one episode per article, with no episode description
+ (expected, see feature table).
+ * Kept old naked-enclosure episodes around.
+
+Different tradeoffs are possible. These seem okay to me.
+
+### Importing into ikiwiki: fancy (from another CMS)
+
+#### My test podcast
+
+For this test, I chose a podcast currently being published with
+Textpattern and tru_podcast, because I'd strongly prefer to publish
+it with ikiwiki instead.
+
+#### Upgrade steps
+
+7. Set up a non-production copy of the podcast.
+ 7. Visually diff RSS and Atom feeds against production.
+ 7. Subscribe to the copy (both feeds) in `r2e`, iTunes, Downcast.
+7. With a fancypodcast-enabled ikiwiki installed:
+ 7. Copy content from Textpattern to ikiwiki:
+ 7. Match article paths to preserve `/YYYY/MM/DD/post-title` permalinks.
+ 7. Match enclosure paths (or redirect) to preserve Textpattern's URLs.
+ 7. Match titles, post dates, and guids with `\[[!meta]]`.
+ 7. Match feed paths with permanent redirects from `/atom/` to
+ `/index.atom` (and same for RSS).
+ 7. `\[[!inline]]` the articles.
+ 7. Rerun `ikiwiki --setup`.
+7. Stop Textpattern, start ikiwiki.
+7. Verify that podcatchers see the feeds and don't redownload anything.
+7. Naively add two new blog posts, one with an enclosure.
+7. Verify that podcatchers download the new enclosures.
+
+-----
+
+## Future improvements
+
+### iTunes fancy podcasting
+
+* [iTunes-specific tags](https://www.apple.com/itunes/podcasts/specs.html)
+ appear to be RSS-only
+ * If they work in Atom, teach `inline` to optionally iTunesify RSS/Atom.
+ * Else, add `itunes` as a third kind of feed (RSS plus more stuff).
+* Notable tags for feeds:
+ * `itunes:subtitle`
+ * `itunes:author`
+ * `itunes:summary` (same as `description`)
+ * `itunes:owner` (includes `itunes:name` and `itunes:email`)
+ * `itunes:image href=''`
+ * `itunes:publisher`
+ * `itunes:category text=''` (can contain subcategories)
+ * `itunes:keywords`
+* Notable tags for entries:
+ * `itunes:duration`
+ * [[!cpan Audio::TagLib]] might be fastest, if present and applicable
+ * [ffprobe](http://ffmpeg.org/ffprobe.html) is reasonably fast
+ * [mediainfo](http://mediainfo.sourceforge.net/) is way slower
+ * Cache computed durations as pagestate
+
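`itunes:duration` is conventionally written as `H:MM:SS`; once a tool like ffprobe has produced a duration in seconds, formatting it is simple (a sketch, with the probing step left out):

```python
def itunes_duration(seconds):
    # Format a duration in seconds as the H:MM:SS form used by itunes:duration.
    seconds = int(round(seconds))
    hours, rest = divmod(seconds, 3600)
    minutes, secs = divmod(rest, 60)
    return "%d:%02d:%02d" % (hours, minutes, secs)
```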
+### Fancy podcast aggregating
+
+* Write tests comparing a fancy podcast (HTML and feeds) against
+ the same podcast aggregated and republished, then make them pass
+ via changes to `aggregatepost.impl` and [[plugins/aggregate]].
+
+### Other ideas
+
+* Don't render template text (e.g., "Use this template to insert a
+ note into a page") in feeds.
+* Optionally specify the enclosure's:
+ * MIME type, in case `File::MimeInfo` guesses wrong.
+ * Duration, in case `ffprobe` guesses wrong.
+* Optionally specify enclosures outside the wiki:
+ * Some people don't want to store big unchanging files in the VCS.
+ * Other people like [podcasting found media](http://huffduffer.com/about).
+ * We'd have to download the file just to compute some metadata
+ about it, and then somehow not frequently re-download it.
+* Configurably generate additional subscription links (such as
+ iTunes) alongside the RSS/Atom ones in [[plugins/inline]].
+* Support Apple's "enhanced podcasts" (if they're still relevant).
+
+### code review
+
+ + # XXX better way to compute relative to srcdir?
+ + my $file = $absurl;
+ + $file =~ s|^$config{url}/||;
+
+I don't think ikiwiki offers a better way to do that, because there is
+normally no reason to do that. Why does it need an url of this form here?
+--[[Joey]]
+
+> In all the popular, production-quality podcast feeds I've looked
+> at, enclosure URLs are always absolute (even when they could be
+> expressed concisely as relative). [Apple's
+> example](http://www.apple.com/itunes/podcasts/specs.html#example)
+> does too. So I told \[[!meta]] to call `urlto()` with the third
+> parameter true, which means the \[[!inline]] code here gets an
+> absolute URL in `$pagestate{$p}{meta}{enclosure}`. To compute the
+> enclosure's metadata, though, we of course need it as a local path.
+> I didn't see a less
+> [ongepotchket](http://www.jewish-languages.org/jewish-english-lexicon/words/1402)
+> way at the time. If you have a better idea, I'm happy to hear it;
+> if not, I'll add an explanatory comment. --[[schmonz]]
+
+>> I would be more comfortable with this if the two different forms of url
+>> you need were both generated by calling urlto. It'd be fine to call
+>> it more than once. --[[Joey]]
+
+>>> Heh, it was even easier than that! (Hooray for tests.) Done.
+>>> --[[schmonz]]
+
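For reference, the quoted substitution amounts to something like this (illustrative Python; the real code is Perl inside the meta/inline plugins):

```python
def srcdir_relative(absurl, base_url):
    # Strip the wiki's configured base url (e.g. $config{url}) from an
    # absolute enclosure url to recover the srcdir-relative file path.
    prefix = base_url.rstrip("/") + "/"
    if not absurl.startswith(prefix):
        raise ValueError("enclosure url %r is not under %r" % (absurl, base_url))
    return absurl[len(prefix):]
```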
+ +<TMPL_IF HTML5><section id="inlineenclosure"><TMPL_ELSE><div id="inlineenclosure"></TMPL_IF>
+ +<TMPL_IF ENCLOSURE>
+
+Can't we avoid adding this div when there's no enclosure? --[[Joey]]
+
+> Sure, I've moved the `<TMPL_IF ENCLOSURE>` check to outside the
+> section-and-div block for `{,inline}page.tmpl`. --[[schmonz]]
+
+ +<a href="<TMPL_VAR ENCLOSURE>">Download this episode</a>
+
+"Download this episode" is pretty specific to particular use cases.
+Can this be made more generic, perhaps just "Download"? --[[Joey]]
+
+> Yep, I got a little carried away. Done. --[[schmonz]]
+
+ -<TMPL_IF AUTHOR>
+ - <title><TMPL_VAR AUTHOR ESCAPE=HTML>: <TMPL_VAR TITLE></title>
+ - <dcterms:creator><TMPL_VAR AUTHOR ESCAPE=HTML></dcterms:creator>
+
+This change removes the author name from the title of the rss feed, which
+does not seem necessary for fancy podcasts. And it is a change that
+could negatively impact e.g. Planet-style aggregators using ikiwiki. --[[Joey]]
+
+> While comparing how feeds render in podcatchers, I noticed that
+> RSS and Atom were inconsistent in a couple ways, of which this was
+> one. The way I noticed it: with RSS, valuable title space was being
+> spent to display the author. I figured Atom's display was the one
+> worth matching. You're right, of course, that planets using the
+> default template and somehow relying on the current author-in-the-title
+> rendering for RSS feeds (but not Atom feeds!) would be broken by
+> this change. I'm having trouble imagining exactly what would break,
+> though, since guids and timestamps are unaffected. Would it suffice
+> to provide a note in the changelog warning people to be careful
+> upgrading their planets, and to customize `rssitem.tmpl` if they
+> really prefer the old behavior (or don't want to take any chances)?
+> --[[schmonz]]
+
+>> A specific example I know of is updo.debian.net, when used with
+>> rss2email. Without the author name there, one cannot see who posted
+>> an item. It's worth noting that planet.debian.org does the same thing
+>> with its rss feed. (That's probably what I copied.) Atom feeds may
+>> not have this problem, don't know. --[[Joey]]
+
+>>> Okay, that's easy to reproduce. It looks like this _might_ be
+>>> a simple matter of getting \[[!aggregate]] to populate author in
+>>> `add_page()`. I'll see what I can figure out. --[[schmonz]]
+
+ +++ b/templates/rsspage.tmpl
+ + xmlns:atom="http://www.w3.org/2005/Atom"
+ +<atom:link href="<TMPL_VAR FEEDURL>" rel="self" type="application/rss+xml" />
+
+Why is it using atom namespace inside an rss feed? What are the chances
+every crummy rss reader on earth is going to understand this? I'd put it at
+about 0%; I doubt ikiwiki's own rss reader understands such a mashup.
+--[[Joey]]
+
+> The validator I used (<http://validator.w3.org/>, I think) told me to.
+> Pretty sure it doesn't make anything work better in the podcatchers
+> I tried. Hadn't considered that it might break some readers.
+> Removed. --[[schmonz]]
+
+ +<generator>ikiwiki</generator>
+
+Does this added tag provide any benefits? --[[Joey]]
+
+> Consistency with the Atom feed, and of course it trumpets ikiwiki
+> to software and/or curious humans who inspect their feeds. The tag
+> arrived only in RSS 2.0, but that's already the version we're
+> claiming to be, and it's over a decade old. Seems much less risky
+> than the atom namespace bits. --[[schmonz]]
+
+>> Sounds ok then. --[[Joey]]
diff --git a/doc/todo/fastcgi_or_modperl_installation_instructions.mdwn b/doc/todo/fastcgi_or_modperl_installation_instructions.mdwn
new file mode 100644
index 000000000..ad7910956
--- /dev/null
+++ b/doc/todo/fastcgi_or_modperl_installation_instructions.mdwn
@@ -0,0 +1,18 @@
+There has got to be a way to run the CGI wrapper under fastcgi or modperl (apache 2). Are there easy to follow instructions describing how to set this up?
+
+> AFAIK no one has done this. One immediate problem would be permissions;
+> the CGI wrapper runs setuid to you so it can write to the wiki -- if
+> running in fastcgi/modperl I guess it would run as the web server, unless
+> there's some way to control that. So you'd need to set up the perms
+> differently, to let the web server commit changes to the wiki.
+>
+> I've not looked at what code changes fastcgi or modperl would require in
+> ikiwiki. --[[Joey]]
+
+> > Looking at nginx support in [[tips/dot_cgi]], I had to figure this out, and it's not so complicated. The hackish way that's documented there right now (and also supported by [answers on serverfault.com](http://serverfault.com/questions/93090/installing-ikiwiki-on-nginx-fastcgi-fcgi-wrapper) or [other](http://vilain.net/comp/ikiwiki_setup.html) [guides](https://library.linode.com/web-applications/wikis/ikiwiki/arch-linux)) involves starting up a fcgi wrapper, which I personally find quite weird.
+> >
+> > Otherwise the general idea would be to launch a daemon per site that would have a pool of fastcgi processes to answer requests. The common setup pattern here is that users have a fixed quota of processes running as their user, listening either on the network (hackish: a port needs to be allocated for each user) or on a socket (documented above, but then the webserver needs write access).
+> >
+> > Perl has had extensive support for FastCGI for quite a while. It seems to me a simple daemon could be written to wrap around the `.cgi`; it's a common way things are deployed. [RT](http://rt.bestpractical.com/) for example can run as a regular CGI, under `mod_perl`, or under `FastCGI` indiscriminately, the latter being more reliable and faster. They use [Plack](http://search.cpan.org/dist/Plack/) to set up that server (see the [startup script](https://github.com/bestpractical/rt/blob/stable/sbin/rt-server.in) for an example). But of course, [TIMTOWTDI](http://search.cpan.org/search?query=fastcgi&mode=all). --[[anarcat]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/feed_enhancements_for_inline_pages.mdwn b/doc/todo/feed_enhancements_for_inline_pages.mdwn
new file mode 100644
index 000000000..f13213dc2
--- /dev/null
+++ b/doc/todo/feed_enhancements_for_inline_pages.mdwn
@@ -0,0 +1,132 @@
+[[!template id=gitbranch branch=GiuseppeBilotta/inlinestuff author="[[GiuseppeBilotta]]"]]
+
+I rearranged my patchset once again, to clearly identify the origin and
+motivation of each patch, which is explained in the following.
+
+In my ikiwiki-based website I have the following situation:
+
+* `$config{usedirs}` is 1
+* there are a number of subdirectories (A/, B/, C/, etc)
+ with pages under each of them (A/page1, A/page2, B/page3, etc)
+* 'index pages' for each subdirectory: A.mdwn, B.mdwn, C.mdwn;
+ these are rather barebone, only contain an inline directive for their
+ respective subpages and become A/index.html, etc
+* there is also the main index.mdwn, which inlines A.mdwn, B.mdwn, C.mdwn,
+ etc (i.e. the top-level index files are also inlined on the homepage)
+
+With the upstream `inline` plugin, the feeds for A, B, C etc are located
+in `A/index.atom`, `B/index.atom`, etc; their title is the wiki name and
+their main link goes to the wiki homepage rather than to their
+respective subdir (e.g. I would expect `A/index.atom` to have a link to
+`http://website/A` but it actually points to `http://website/`).
+
+This is due to them being generated from the main index page, and is
+fixed by the first patch: ‘inline: base feed urls on included page
+name’. As explained in the commit message for the patch itself, this is
+a ‘forgotten part’ from a previous page vs destpage fix which has
+already been included upstream.
+
+> Applied. --[[Joey]]
+
+>> Thanks.
+
+The second patch, ‘inline: improve feed title and description
+management’, aligns feed title and description management by introducing
+a `title` option to complement `description`, and by basing the
+description on the page description if the entry is missing. If no
+description is provided by either the directive parameter or the page
+metadata, we use a user-configurable default based on both the page
+title and wiki name rather than hard-coding the wiki name as description.
+
+> Reviewing, this seems ok, but I don't like that
+> `feed_desc_fmt` is "safe => 0". And I question if that needs
+> to be configurable at all. I say, drop that configurable, and
+> only use the page meta description (or wikiname for index).
+>
+> Oh, and could you indent your `elsif` the same as I? --[[Joey]]
+
+>> I hadn't even realized that I was nesting ifs inside else clauses,
+>> sorry. I think you're also right about the safety of the key, after
+>> all it only gets interpolated with known, safe strings.
+
+>>> I did not mean to imply that I thought it safe. --[[Joey]]
+
+>>>> Sorry for assuming you implied that. I do think it is safe, though
+>>>> (I defaulted to not safe just to err on the safe side).
+
+>> The question is what to do for pages that do not have a description
+>> (and are not the index). With your proposal, the Atom feed subtitle
+>> would turn up empty. We could make it conditional in the default
+>> template, or we could have `$desc` default to `$title` if nothing
+>> else is provided, but at this point I see no reason to _not_ allow
+>> the user to choose a way to build a default description.
+
+>>> RSS requires the `<description>` element be present, it can't
+>>> be conditionalized away. But I see no reason to add the complexity
+>>> of an option to configure a default value for a field that
+>>> few RSS consumers likely even use. That's about 3 levels below useful.
+>>> --[[Joey]]
+
+>>>> The way I see it, there are three possibilities for non-index pages
+>>>> which have no description meta: (1) we leave the
+>>>> description/subtitle in feed blank, per your current proposal here
+>>>> (2) we hard-code some string to put there and (3) we make the
+>>>> string to put there configurable. Honestly, I think option #1 sucks
+>>>> aesthetically and option #2 is conceptually wrong (I'm against
+>>>> hard-coding stuff in general), which leaves option #3: however
+>>>> rarely used it would be, I still think it'd be better than #2 and
+>>>> less unaesthetic than #1.
+
+>>>> I'm also not sure what's ‘complex’ about having such an option:
+>>>> it's definitely not going to get much use, but does it hurt to have
+>>>> it? I could understand not wasting time putting it in, but since
+>>>> the code is written already … (but then again I'm known for being a
+>>>> guy who loves options).
+
+The third patch, ‘inline: allow assigning an id to postform/feedlink’,
+does just that. I don't currently use it, but it can be particularly
+useful in the postform case for example for scriptable management of
+multiple postforms in the same page.
+
+> Applied. --[[Joey]]
+
+>> Thanks.
+
+In one of my wiki setups I had a terminating '/' in `$config{url}`. You
+mention that it should not be present, but I have not seen this
+requirement described anywhere. Rather than restricting the user input,
+I propose a patch that prevents double slashes from appearing in links
+created by `urlto()` by fixing the routine itself.
+
+> If this is fixed I would rather not put the overhead of fixing it in
+> every call to `urlto`. And I'm not sure this is a comprehensive
+> fix to every problem a trailing slash in the url could cause. --[[Joey]]
+
+>> Maybe something that sanitizes the config value would be better instead?
+>> What is the policy about automatic changing user config?
+
+>>> It's impossible to do for perl-format setup files. --[[Joey]]
+
+>>>> Ok. In that case I think that we should document that it must be
+>>>> slash-less. I'll cook up a patch in that sense.
+
+The inline plugin is also updated (in a separate patch) to use `urlto()`
+rather than hand-coding the feed urls. You might want to keep this
+change even if you discard the urlto patch.
+
+> IIRC, I was missing a proof that this always resulted in identical urls,
+> which is necessary to prevent flooding. I need such a proof before I can
+> apply that. --[[Joey]]
+
+>> Well, the URL would obviously change if the `$config{url}` ended in
+>> slash and the `urlto` patch (or other equivalent) went into effect.
+
+>> Aside from that, if I read the code correctly, the only other extra
+>> thing that `urlto` does is to `beautify_url_path` the `"/".$to` part,
+>> and the only way this would cause the url to be altered is if the
+>> feed name was "index" (which can easily happen) and
+>> `$config{htmlext}` was set to something like `.rss` or
+>> `.rss.1`.
+
+>> So there is a remote possibility that a different URL would be
+>> produced.
diff --git a/doc/todo/fileupload.mdwn b/doc/todo/fileupload.mdwn
new file mode 100644
index 000000000..8c9b18b19
--- /dev/null
+++ b/doc/todo/fileupload.mdwn
@@ -0,0 +1,63 @@
+(I've written a [[proposal|todo/fileupload/soc-proposal]] for this feature --Ben).
+
+Support for uploading files is useful for many circumstances:
+
+* Uploading images.
+* Uploading local.css files (admin only).
+* Uploading mp3s for podcasts.
+* Etc.
+
+ikiwiki should have an easy-to-use interface for this, but the real meat of
+the work is in securing it. Several classes of controls seem appropriate:
+
+* Limits to the size of files that can be uploaded, to prevent someone
+  spamming the wiki with CD ISOs.
+* Limits to the type of files that can be uploaded, to prevent uploads of
+  viruses, css, raw html etc, and avoid file types that are not safe.
+  Should default to excluding all file types, or at least all
+  except a very limited set, and should be able to open it up to more
+  types.
+
+  Would checking for file extensions (.gif, .jpg) etc. be enough? Some
+  browsers are probably too smart for their own good and may ignore the
+  extension / mime info and process the file as its actual detected type. It
+  may be necessary to use `file` to determine a file's true type.
+* Optional ability to test a file using a virus scanner like clamav.
+* Limits to who can upload what type of files.
+* Limits to what files can be uploaded where.
+
+It seems that for maximum flexibility, rules should be configurable by the admin
+to combine these limits in different ways. If we again extend the pagespec
+for this, as was done for [[conditional_text_based_on_ikiwiki_features]],
+the rules might look something like this:
+
+ ( maxsize(30kb) and type(webimage) ) or
+ ( user(joey) and maxsize(1mb) and (type(webimage) or *.mp3) ) or
+ ( user(joey) and maxsize(200mb) and (*.mov or *.avi) and videos/*)
+
+With a small extension, this could even be used to limit the max sizes of
+normal wiki pages, which could be useful if someone was abusing an open wiki
+as a wikifs. Maybe.
+
+ ( type(page) and maxsize(32k) )
+
+And if that's done, it can also be used to lock users from editing a page
+or the whole wiki:
+
+ !(( user(spammer) and * ) or
+ ( user(42.12.*) and * ) or
+ ( user(http://evilopenidserver/*) and * ) or
+ ( user(annoying) and index) or
+ ( immutable_page ))
+
+That would obsolete the current simple admin prefs for banned users and
+locked pages. Suddenly all the access controls live in one place.
+Wonderbar!
+
+(Note that pagespec_match will now return an object that stringifies to a
+message indicating why the pagespec matched, or failed to match, so if a
+pagespec lock like the above prevents an edit or upload from happening,
+ikiwiki could display a reasonable message to the user, indicating what
+they've done wrong.)
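+A match-result object that stringifies to its reason might look roughly like this (a Python sketch for illustration only; ikiwiki itself is Perl, and the class and function names here are hypothetical):

```python
class MatchResult:
    """Truthy/falsy match result that stringifies to a human-readable reason."""

    def __init__(self, matched, reason):
        self.matched = matched
        self.reason = reason

    def __bool__(self):
        return self.matched

    def __str__(self):
        return self.reason


def check_maxsize(size, limit):
    # One pagespec-style test: fail with a message naming the limit.
    if size > limit:
        return MatchResult(False, "file is %d bytes, limit is %d" % (size, limit))
    return MatchResult(True, "size ok")


result = check_maxsize(5000, 1024)
if not result:
    print("Upload rejected: %s" % result)
```

+So a failed upload or edit can simply show the stringified result to the user as the error message.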
+
+[[!tag soc done]]
diff --git a/doc/todo/fileupload/discussion.mdwn b/doc/todo/fileupload/discussion.mdwn
new file mode 100644
index 000000000..01c0cc3fe
--- /dev/null
+++ b/doc/todo/fileupload/discussion.mdwn
@@ -0,0 +1,45 @@
+ * Limits to the size of files that can be uploaded, to prevent someone
+   spamming the wiki with CD ISOs.
+
+> CGI.pm has a limitation that you can't prevent someone uploading
+> something huge and filling up your server.
+> However it is obviously possible to not actually put it in to the
+> wiki if it's too large.
+> Presumably there is also a way to limit the size of POST requests
+> in the server.
+
+* Limits to the type of files that can be uploaded, to prevent uploads of
+  viruses, css, raw html etc, and avoid file types that are not safe.
+  Should default to excluding all file types, or at least all except
+  a very limited set, and should be able to open it up to more types.
+
+  Would checking for file extensions (.gif, .jpg) etc. be enough? Some
+  browsers are probably too smart for their own good and may ignore
+  the extension / mime info and process the file as its actual detected
+  type. It may be necessary to use `file` to determine a file's true type.
+
+> I think using the extension is too risky, and as much information as
+> possible should go in to the decision. Saving the file to disk, then
+> checking the type before using it seems like the best approach to me,
+> as long as the file is deleted properly.
+
+> Have you any thoughts on what the interface should be? I can see three
+> options. First add a box to the file creation page that allows you
+> to upload a file instead of the page. The second is an upload file
+> link that asks for a page. The last would be an attachments system
+> like the one e.g. TWiki uses, where the file could be uploaded as a subpage.
+
+> How about the limit setting etc.? Add it as a box on the admin's
+> preference page, allow it anywhere using preprocessor directives,
+> or have a configuration page that only the admin is allowed to edit
+> (and perhaps people named on the page?)
+
+> The syntax of the conditionals isn't too hard, as the things that
+> are being added fit in nicely. It might be nice to allow plugins
+> to register new functions for them, and provide callbacks to
+> provide a yes/no answer. I haven't looked at the code yet,
+> are the pagespecs uniform in all places, or is the conditional
+> usage an extended one? i.e. can I lock pages based on date etc?
+> --[[JamesWestby]]
+
+
diff --git a/doc/todo/fileupload/soc-proposal.mdwn b/doc/todo/fileupload/soc-proposal.mdwn
new file mode 100644
index 000000000..ca007e7e0
--- /dev/null
+++ b/doc/todo/fileupload/soc-proposal.mdwn
@@ -0,0 +1,71 @@
+# SoC Proposal for Implementation of a File Upload Interface
+
+I intend to extend Ikiwiki such that it accepts file uploads, subject to access
+control, and integrates said uploads with the interface. What
+follows is a **very rough draft** of my thoughts on the matter. Comments are
+welcomed, either on the discussion page or via e-mail (_me_ at _inelegant.org_).
+
+I suggest we adopt the Trac/Wikipedia concept of "attaching" files to a given
+page. In this scenario, each page for which file upload has been enabled, will
+sport an `<input type="file">` construct along with an _Attach_ button. Upon
+successfully attaching a file, its name will be appended to an _"Attachments"_
+list at the bottom of the page. The names in the list will link to the
+appropriate files. Architecturally, this means that after a file has been attached to a page, the
+page will have to be rebuilt.
+
+Files will be uploaded in a background thread via XMLHttpRequest. This allows us to provide visual indicators of upload status, support multiple uploads at a time, and reduce the amount of template code we must write.
+
+After an upload has been started, another text entry field will be rendered, enabling the user to commence a new upload.
+
+## Metadata
+
+It is necessary to associate metadata with the uploaded file. The IkiWiki index file already associates rudimentary metadata with the files it renders, but there has been interest from multiple sources in creating a general purpose metadata layer for IkiWiki which supports the association of arbitrary metadata with a file. This work is outside the scope of the file upload feature, but I will attempt a basic implementation nevertheless.
+
+A key decision involves the storage of the metadata. IkiWiki must be as usable from the CLI as from the web, so the data being stored must be easily manipulatable using standard command line tools. It is infeasible to expect users to embed arbitrary metadata in arbitrary files, so we will use a plaintext file consisting of name-value pairs for recording metadata. Each file in the IkiWiki source directory may have its own metadata file, but they are always optional. The metadata for a file, _F_, will be stored in a file named _F.meta_. For example, the metadata for this page would be in _todo/fileupload/soc-proposal.mdwn.meta_.
+
+For instance: `printf "license: gpl\n" >> software.tar.gz.meta`. It would be trivial to distribute a tool with IkiWiki that made this even easier, too, e.g. `ikiwiki-meta license gpl software.tar.gz`. An open issue is how this metadata will be added from the web interface.
+
+For source files, this approach conflicts with the [_meta_ plugin](http://ikiwiki.info/plugins/meta/), so there needs to be some integration between the two.
+
+In keeping with the current architecture of IkiWiki, we can make this metadata available to plugins by using a hash keyed on the filename, e.g. `$metadata{'software/software.tar.gz'}{'license'} eq 'gpl'`.
+
+In general, we will only use the _.meta_ files to store data that cannot be automatically determined from the file itself. For uploaded files this will probably include the uploader's IP address, for example.
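+Parsing such a name-value _.meta_ file takes only a few lines. A rough sketch (in Python for brevity; ikiwiki itself is Perl, and the exact format details here are assumptions):

```python
def parse_meta(text):
    """Parse 'Name: value' pairs, one per line, into a dict.

    Blank lines and '#' comment lines are skipped; names are
    lower-cased so lookups are case-insensitive.
    """
    meta = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        name, _, value = line.partition(":")
        meta[name.strip().lower()] = value.strip()
    return meta
```

+The resulting dict maps directly onto the proposed `$metadata{$file}{$key}` hash.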
+
+## Configuration
+
+In [[todo/fileupload]] it is specified that the upload feature must be highly
+configurable. Joey suggests the use of the preferences page to specify some of these options, but it is not yet clear which ones are important enough to expose in this way. All options will be configurable via the config file.
+
+We will (or do) support configuring:
+
+* The allowable MIME types of uploaded files.
+* The maximum size of the uploaded file.
+* The maximum size of the upload temporary directory.
+* The maximum size of the source directory.
+* The IP addresses allowed to upload.
+* The pages which can have files attached to them.
+* The users who are allowed to upload.
+* The users who are prohibited from uploading.
+
+etc.
+
+## Operation
+
+1. File upload forms will be rendered on all wiki pages which have been allowed
+in the global configuration file. By default, this will probably be none of
+them.
+2. The forms will interface with _ikiwiki.cgi_, passing it the filename, the
+file contents, and the name of the page to which it is being attached.
+3. The CGI will consult the config file and any embedded pagespecs in turn, to
+determine whether the access controls permit the upload. If they don't, an error
+message will be displayed to the user, and the process will abort.
+4. The uploaded file will be saved to a temporary upload directory.
+5. Access controls which work on the entire file will be run. The process will abort if they fail, or if the upload appears to have been aborted. Before the process is aborted, the file will be deleted from the temp directory.
+6. The file is moved to the appropriate directory.
+7. The _$file.meta_ file will be created and populated.
+8. The uploaded file will be committed to the RCS.
+9. _.ikiwiki/index_ will be modified to reflect the new upload (as above).
+10. The page to which the file is attached (and any other
+affected pages) will be regenerated.
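+Steps 4 through 6 of this flow might be sketched like this (Python for illustration; ikiwiki is Perl, and `handle_upload` and its `checks` interface are hypothetical):

```python
import os
import shutil
import tempfile


def handle_upload(filename, stream, checks, destdir):
    """Save an upload to a temp dir, run whole-file checks, then move it.

    `checks` is a list of callables taking the temp path and returning
    (ok, reason); on any failure the temp file is deleted and the
    upload aborts with the reason.
    """
    tmpdir = tempfile.mkdtemp(prefix="ikiwiki-upload-")
    tmppath = os.path.join(tmpdir, os.path.basename(filename))
    try:
        with open(tmppath, "wb") as f:
            shutil.copyfileobj(stream, f)
        for check in checks:
            ok, reason = check(tmppath)
            if not ok:
                raise ValueError("upload rejected: " + reason)
        dest = os.path.join(destdir, os.path.basename(filename))
        shutil.move(tmppath, dest)
        return dest
    finally:
        # Clean up the temp dir whether the upload succeeded or aborted.
        shutil.rmtree(tmpdir, ignore_errors=True)
```

+Committing to the RCS, writing the _.meta_ file, and regenerating affected pages would follow once `handle_upload` returns the destination path.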
+
+--Ben
diff --git a/doc/todo/fileupload/soc-proposal/discussion.mdwn b/doc/todo/fileupload/soc-proposal/discussion.mdwn
new file mode 100644
index 000000000..f85a956db
--- /dev/null
+++ b/doc/todo/fileupload/soc-proposal/discussion.mdwn
@@ -0,0 +1,46 @@
+There's nothing in [[fileupload]] that suggests putting the file upload limit in the body of a page. That would indeed be a strange choice. Ikiwiki already uses [[PageSpecs|ikiwiki/PageSpec]] in the Preferences page (for specifying locked pages, banned users, and subscriptions), and I had envisioned putting the file upload controls there, and possibly subsuming some of those other controls into them.
+
+> Thanks for clarifying; I clearly misunderstood the original text. -- Ben
+
+It's not clear to me that the concept of attaching files to a page fits ikiwiki very well; unlike most wikis, ikiwiki supports subdirectories and [[SubPages|ikiwiki/SubPage]], which allows for hierarchical placement of uploaded files, which is a much more flexible concept than simple attachment. Furthermore, the idea of listing all attached files at the bottom of a page seems somewhat inflexible. What if I want to make a podcast, using inline's existing support for that -- I won't want a list of every "attached" file at the bottom of my podcast's page then.
+
+> If a file was attached to _some-dir/some-page_, it would be stored in _some-dir/_ and linked from _some-page_. That would seem reasonably hierarchical to me. What do you suggest as an alternative?
+
+>> I'd suggest `some-dir/some-page/file`, which nicely makes clear that the file is "attached" to some-page, allows easy wikilinks to "file" from some-page, and has other nice properties.
+
+>>> So _some-dir/some-page_ would feature an upload form that stored its payload in _some-dir/some-page/file_? IOW, we'd still be attaching files, but making the relationship between attacher and attached more explicit? --Ben
+
+>>>> More explicit or less, I don't know.. :-) It seems to make sense for most of the use cases I've thought of to put the uploaded file there, but there might be use cases where it would go somewhere else, and so maybe the UI should allow specifying where it goes (similarly to how ikiwiki allows specifying where to put a page when creating a new page).
+
+>>>> Exactly where the upload form should be I don't know. Putting it directly on the page seems somewhat strange, I know that some wikis have an actions menu that includes file upload and deletion, I think others make the Edit form include support for uploading files. Maybe survey other wikis and base it on a UI that works well.
+
+> As for the attachment list, I envisaged that being optional. --Ben
+
+>> So some kind of preprocessor directive that is added to a page to generate the attachment list?
+
+>>> Absolutely.
+
+I don't understand why the file size would need to be stored in the index file; this information is available by statting the file, surely? Similarly, the mime type can be determined through inspection, unless there turns out to be a reason to need to cache it for speed.
+
+--[[Joey]]
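+The on-demand lookup described above is cheap. A minimal sketch (Python for illustration; here the MIME type is guessed from the filename, whereas a real implementation might inspect the content, e.g. via `file`):

```python
import mimetypes
import os


def file_info(path):
    """Size comes from a stat; the MIME type is guessed from the name
    (content inspection would be more robust, but this shows the idea)."""
    return {
        "size": os.stat(path).st_size,
        "mime": mimetypes.guess_type(path)[0] or "application/octet-stream",
    }
```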
+
+For images, videos, etc. it would be nice to have some kind of metadata file to
+go along with the file (e.g. image.jpg.meta), to store information like creator,
+title, description, tags, length, width, height, compression, etc., which could
+be initially created by 'ikiwiki --generate-meta-stuff'. Then PageSpec should be
+taught to use these. Galleries could then be generated by means of
+\[[!inline pages="type(image/*) and year(2007)" template="gallery"]]. It
+should of course be possible to edit this information via ikiwiki.cgi and with any
+text editor (Name: value). This should also allow for creation of default .html
+pages with the image/video/file/... and a discussion page, probably named
+image.mdwn and image/discussion. I think that would fit nicely with the ikiwiki
+concept. Comments? --enodev
+
+> Replying to myself. Just appending .meta gives problems when \$usedirs is enabled as the original file and the directory containing the html file will have the same name. Taking away the original extensions has problems with filenames having different extensions and the same basename. So something like 'about-image.jpg.meta'? That would require no changes to the core to support it and is reasonably easy to the eye.
+
+> I also pondered about putting this info in the rcs log, but that is problematic when you just 'cp'/'mv'/whatever the directory. Same goes for using something like svn properties, which git does not even support. Storing this info in the index file is problematic, because that isn't versioned. Major problem I see with this approach would be the disconnected nature of having two files. Posix extended attributes? ;-(
+
+> This could also be used to specify the license of the file.
+
+> I did a proof-of-concept implementation of this idea [here](http://ng.l4x.org/brainstorm/gallery/) last night, including the link to the source code. I'd really love to hear comments about this approach.
+> (Note 1: I'm really not interested in any kind of http interface to that thing, just testing ways of storing the meta
+> data. Note 2: I'm no perl programmer.)
diff --git a/doc/todo/filtering_content_when_inlining.mdwn b/doc/todo/filtering_content_when_inlining.mdwn
new file mode 100644
index 000000000..8a2326035
--- /dev/null
+++ b/doc/todo/filtering_content_when_inlining.mdwn
@@ -0,0 +1,16 @@
+It would help to allow filtering of content when
+[[inlining|plugins/inline]] pages. For example, given some way to filter
+out the display of inlines within other inlines, a blog post could allow
+easy inline commenting by putting an inline directive with post form at the
+bottom of the post.
+
+> That's trying to do the same thing as the todo item
+> [[discussion_page_as_blog]]. Difference is that you're suggesting
+> displaying the comments in the blog post that they comment on, instead
+> of on the separate discussion page. Which leads to the problem of those
+> comments showing up inlined into the blog.
+>
+> I know there are benefits to having the comments on the same page and not
+> a separate discussion page, but it does add complications and ikiwiki
+> already has discussion pages, so I'm more likely to go the route
+> described in [[discussion_page_as_blog]]. --[[Joey]]
diff --git a/doc/todo/finer_control_over___60__object___47____62__s.mdwn b/doc/todo/finer_control_over___60__object___47____62__s.mdwn
new file mode 100644
index 000000000..50c4d43bf
--- /dev/null
+++ b/doc/todo/finer_control_over___60__object___47____62__s.mdwn
@@ -0,0 +1,98 @@
+IIUC, the current version of [HTML::Scrubber][] allows for the `object` tags to be either enabled or disabled entirely. However, while `object` can be used to add *code* (which is indeed a potential security hole) to a document, reading [Objects, Images, and Applets in HTML documents][objects-html] reveals that the &ldquo;dangerous&rdquo; ones are not all `object`s, but rather those having the following attributes:
+
+ classid %URI; #IMPLIED -- identifies an implementation --
+ codebase %URI; #IMPLIED -- base URI for classid, data, archive--
+ codetype %ContentType; #IMPLIED -- content type for code --
+ archive CDATA #IMPLIED -- space-separated list of URIs --
+
+It seems that the following attributes are, OTOH, safe:
+
+ declare (declare) #IMPLIED -- declare but don't instantiate flag --
+ data %URI; #IMPLIED -- reference to object's data --
+ type %ContentType; #IMPLIED -- content type for data --
+ standby %Text; #IMPLIED -- message to show while loading --
+ height %Length; #IMPLIED -- override height --
+ width %Length; #IMPLIED -- override width --
+ usemap %URI; #IMPLIED -- use client-side image map --
+ name CDATA #IMPLIED -- submit as part of form --
+ tabindex NUMBER #IMPLIED -- position in tabbing order --
+
+Should the former attributes be *scrubbed* while the latter are left intact, the use of the `object` tag would seemingly become safe.
+
+Note also that allowing `object` (either restricted in such a way or not) automatically solves the [[/todo/svg]] issue.
+
+For Ikiwiki, it may be nice to be able to restrict [URI's][URI] (as required by the `data` and `usemap` attributes) to, say, relative and `data:` (as per [RFC 2397][]) ones as well, though it requires some more consideration.
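+The attribute whitelisting sketched above might look roughly like this (Python for illustration; HTML::Scrubber itself is Perl, and these helper names are hypothetical):

```python
# Attributes that can load code, per the HTML 4.01 object element.
UNSAFE_OBJECT_ATTRS = {"classid", "codebase", "codetype", "archive"}


def is_safe_uri(uri):
    """Allow only relative URIs and data: URIs, as proposed above."""
    if uri.startswith("data:"):
        return True
    return "://" not in uri and not uri.lower().startswith("javascript:")


def scrub_object_attrs(attrs):
    """Drop the code-loading attributes; restrict URI-valued ones."""
    out = {}
    for name, value in attrs.items():
        name = name.lower()
        if name in UNSAFE_OBJECT_ATTRS:
            continue
        if name in ("data", "usemap") and not is_safe_uri(value):
            continue
        out[name] = value
    return out
```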
+
+&mdash;&nbsp;[[Ivan_Shmakov]], 2010-03-12Z.
+
+[[wishlist]]
+
+> SVG can contain embedded javascript.
+
+>> Indeed.
+
+>> So, a more general tool (`XML::Scrubber`?) will be necessary to
+>> refine both [XHTML][] and SVG.
+
+>> &hellip; And to leave [MathML][] as is (?.)
+
+>> &mdash;&nbsp;[[Ivan_Shmakov]], 2010-03-12Z.
+
+> The spec that you link to contains
+> examples of objects that contain python scripts, Microsoft OLE
+> objects, and Java. And then there's flash. I don't think ikiwiki can
+> assume all the possibilities are handled securely, particularly WRT XSS
+> attacks.
+> --[[Joey]]
+
+>> I've scanned over all the `object` examples in the specification and
+>> all of those that hold references to code (as opposed to data) have a
+>> distinguishing `classid` attribute.
+
+>> While I won't assert that it's impossible to reference code with
+>> `data` (and, thanks to `text/xhtml+xml` and `image/svg+xml`, it is
+>> *not* impossible), throwing away any of the &ldquo;insecure&rdquo;
+>> attributes listed above together with limiting the possible URI's
+>> (i.&nbsp;e., only *local* and certain `data:` ones for `data` and
+>> `usemap`) should make `object` almost as harmless as, say, `img`.
+
+>>> But with local data, one could not embed youtube videos, which surely
+>>> is the most obvious use case?
+
+>>>> Allowing a &ldquo;remote&rdquo; object to render on one's page is a
+>>>> security issue by itself.
+>>>> Though, of course, having an explicit whitelist of URI's may make
+>>>> this issue more tolerable.
+>>>> &mdash;&nbsp;[[Ivan_Shmakov]], 2010-03-12Z.
+
+>>> Note that youtube embedding uses an
+>>> object element with no classid. The swf file is provided via an
+>>> enclosed param element. --[[Joey]]
+
+>>>> I've just checked a random video on YouTube and I see that the
+>>>> `.swf` file is provided via an enclosed `embed` element. Whether
+>>>> to allow those or not is a different issue.
+>>>> &mdash;&nbsp;[[Ivan_Shmakov]], 2010-03-12Z.
+
+>> (Though it certainly won't solve the [[SVG_problem|/todo/SVG]] being
+>> restricted in such a way.)
+
+>> Of the remaining issues I could only think of recursive
+>> `object` &mdash; the one that references its container document.
+
+>> &mdash;&nbsp;[[Ivan_Shmakov]], 2010-03-12Z.
+
+## See also
+
+* [Objects, Images, and Applets in HTML documents][objects-html]
+* [[plugins/htmlscrubber|/plugins/htmlscrubber]]
+* [[todo/svg|/todo/svg]]
+* [RFC 2397: The &ldquo;data&rdquo; URL scheme. L.&nbsp;Masinter. August 1998.][RFC 2397]
+* [Uniform Resource Identifier &mdash; the free encyclopedia][URI]
+
+[HTML::Scrubber]: http://search.cpan.org/~podmaster/HTML-Scrubber-0.08/Scrubber.pm
+[MathML]: http://en.wikipedia.org/wiki/MathML
+[objects-html]: http://www.w3.org/TR/1999/REC-html401-19991224/struct/objects.html
+[RFC 2397]: http://tools.ietf.org/html/rfc2397
+[URI]: http://en.wikipedia.org/wiki/Uniform_Resource_Identifier
+[XHTML]: http://en.wikipedia.org/wiki/XHTML
diff --git a/doc/todo/firm_up_plugin_interface.mdwn b/doc/todo/firm_up_plugin_interface.mdwn
new file mode 100644
index 000000000..c7553f7dd
--- /dev/null
+++ b/doc/todo/firm_up_plugin_interface.mdwn
@@ -0,0 +1,96 @@
+Reopening this for 3.0, to consider adding new functions.
+
+I don't want this interface to be too firm; it's ok for a plugin like
+`ddate` to redefine an internal function like IkiWiki::displaytime if it
+wants to.. But plugins that still access stuff through IkiWiki:: should be
+aware that that stuff can change at any time and break them. Possibly without
+perl's type checking catching the breakage, in some cases. Plugins that
+only use exported symbols should not be broken by future ikiwiki changes.
+
+## Most often used functions, by number of calls from plugin code
+
+ 27 IkiWiki::possibly_foolish_untaint
+
+Not very happy about exporting, it's not ikiwiki-specific,
+and plugins that need to untaint things should think about it, hard.
+
+ 12 IkiWiki::userinfo_get
+ 5 IkiWiki::userinfo_set
+
+Used by only 4 plugins, all of which are fairly core, so thinking
+don't export.
+
+ 11 IkiWiki::preprocess
+ 8 IkiWiki::filter
+ 4 IkiWiki::linkify
+ 4 IkiWiki::htmlize
+
+The page rendering chain. Note that it's very common to call `preprocess(filter(text))`,
+or `htmlize(linkify(preprocess(filter(text))))`, while `htmlize(linkify(preprocess(text)))`
+is called less frequently, and it's also not unheard of to leave out a step and do
+`htmlize(preprocess(text))`. (I haven't checked if any of those cases are bugs.)
+
+It would be nice if the api could avoid exposing the details of the render chain,
+by providing a way to say "I have filtered text, and would like html", or "I have raw
+text and would like to get it up to the preprocess stage".
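+Such an api might be sketched like this (Python for illustration; ikiwiki's real chain is Perl, and the `render` helper here is hypothetical):

```python
# The full chain, in order; each stage is a function from text to text.
STAGES = ["filter", "preprocess", "linkify", "htmlize"]


def render(text, stages, start="filter", stop="htmlize"):
    """Run `text` through the chain from `start` to `stop` inclusive,
    so callers say what they have and what they want, instead of
    composing the stage functions by hand."""
    i, j = STAGES.index(start), STAGES.index(stop)
    for name in STAGES[i:j + 1]:
        text = stages[name](text)
    return text
```

+"I have filtered text, and would like html" then becomes `render(text, stages, start="preprocess")`, with no stage names hard-coded in plugins.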
+
+Another problematic thing is that plugins often define functions named 'preprocess', etc.
+
+ 12 IkiWiki::linkpage
+ 11 IkiWiki::pagetitle
+ 6 IkiWiki::titlepage
+
+These go together; linkpage is needed by all link plugins, and the others are used widely.
+All should be exported. (Done)
+
+ 7 IkiWiki::saveindex
+ 5 IkiWiki::loadindex
+
+Still too internal to ever be exported?
+
+ 7 IkiWiki::redirect
+
+Only used by 4 plugins, and not in IkiWiki.pm itself, so probably not to be exported.
+
+ 7 IkiWiki::dirname
+ 4 IkiWiki::basename
+
+Not ikiwiki-specific, don't export.
+
+ 6 IkiWiki::refresh
+
+Very internal, not part of IkiWiki.pm, don't export.
+
+ 5 IkiWiki::yesno
+
+Not ikiwiki-specific, but worth exporting to get a consistent localised yes/no parser
+for directives.
+
+ 5 IkiWiki::showform
+ 4 IkiWiki::decode_form_utf8
+
+Only used by 3 fairly core plugins, not in IkiWiki.pm, don't export.
+
+ 5 IkiWiki::rcs_update
+ 4 IkiWiki::rcs_prepedit
+ 5 IkiWiki::is_admin
+ 5 IkiWiki::cgi_savesession
+ 4 IkiWiki::cgiurl
+
+Not enough use, I think, to export.
+
+ 5 IkiWiki::enable_commit_hook
+ 5 IkiWiki::disable_commit_hook
+
+Deep internal magic, if exported people will use it wrong, only used by core plugins.
+
+ 4 IkiWiki::check_canedit
+
+Probably needs to evolve more and be more widely used before being exported.
+
+## Variables used by plugins but not exported yet
+
+* %IkiWiki::pagecase (aggregate)
+* %IkiWiki::backlinks (pagestats)
+
+[[done]] (until 4.0)..
diff --git a/doc/todo/for_amazon_s3_pre-gzip-encode_safe_files.mdwn b/doc/todo/for_amazon_s3_pre-gzip-encode_safe_files.mdwn
new file mode 100644
index 000000000..c0c0c12ab
--- /dev/null
+++ b/doc/todo/for_amazon_s3_pre-gzip-encode_safe_files.mdwn
@@ -0,0 +1,17 @@
+Regarding the [[Amazon_S3_Plugin|plugins/amazon_s3]]:
+
+Amazon S3 doesn't seem to support automatically GZIP-encoding content (such as HTML, JavaScript, and CSS) that a full-capability webserver might compress. (I'll also note that NearlyFreeSpeech.NET doesn't support compressing outgoing files on-the-fly.) However, Amazon S3 does support setting some response headers, such as Transfer-Encoding and the like.
+
+One possibility of decreasing bandwidth costs/download sizes would be to GZIP all content on the site and set the necessary header... however there are certain browser compatibility issues to be navigated.
+
+Another side item that would potentially be useful is a config option to create a mapping from file extensions to gzipped alternate names...
+
+For example:
+
+ gzipped_files => {
+ js => "js.gz"
+ }
+
+This would take all .js files and gzip them with the altered extension. *This* could allow using JavaScript to customize what other JS/CSS code gets loaded, based on browser-detection JS code.
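+Producing such gzipped copies at build time is straightforward. A sketch (Python for illustration; the `pregzip` helper and its mapping format are hypothetical, mirroring the config option proposed above):

```python
import gzip
import pathlib
import shutil


def pregzip(srcdir, mapping):
    """For each extension in `mapping`, write a gzipped copy of every
    matching file alongside the original, with the mapped extension
    (e.g. {"js": "js.gz"} turns app.js into app.js.gz)."""
    made = []
    for path in list(pathlib.Path(srcdir).rglob("*")):
        new_ext = mapping.get(path.suffix.lstrip("."))
        if new_ext is None or not path.is_file():
            continue
        out = path.with_suffix("." + new_ext)
        with open(path, "rb") as fin, gzip.open(out, "wb") as fout:
            shutil.copyfileobj(fin, fout)
        made.append(out)
    return made
```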
+
+--[[harningt]]
diff --git a/doc/todo/format_escape.mdwn b/doc/todo/format_escape.mdwn
new file mode 100644
index 000000000..762f16646
--- /dev/null
+++ b/doc/todo/format_escape.mdwn
@@ -0,0 +1,292 @@
+Since some preprocessor directives insert raw HTML, it would be good to
+specify, per-format, how to pass HTML so that it goes through the format
+OK. With Markdown we cross our fingers; with reST we use the "raw"
+directive.
+
+I added an extra named parameter to the htmlize hook, which feels sort of
+wrong, since none of the other hooks take parameters. Let me know what
+you think. --Ethan
+
+Seems fairly reasonable, actually. Shouldn't the `$type` come from `$page`
+instead of `$destpage` though? Only other obvious change is to make the
+escape parameter optional, and only call it if set. --[[Joey]]
+
+> I couldn't figure out what to make it from, but thinking it through,
+> yeah, it should be $page. Revised patch follows. --Ethan
+
+>> I've updated the patch some more, but I think it's incomplete. ikiwiki
+>> emits raw html when expanding WikiLinks too, and it would need to escape
+>> those. Assuming that escaping html embedded in the middle of a sentence
+>> works.. --[[Joey]]
+
+>>> Revised again. I get around this by making another hook, htmlescapelink,
+>>> which is called to generate links in whatever language. In addition, it
+>>> doesn't (can't?) generate
+>>> spans, and it doesn't handle inlineable image links. If these were
+>>> desired, the approach to take would probably be to use substitution
+>>> definitions, which would require generating two bits of code for each
+>>> link/html snippet, and putting one at the end of the paragraph (or maybe
+>>> the document?).
+>>> To specify that (for example) Discussion links are meant to be HTML and
+>>> not rst or whatever, I added a "genhtml" parameter to htmllink. It seems
+>>> to work -- see <http://ikidev.betacantrips.com/blah.html> for an example.
+>>> --Ethan
+
+## Alternative solution
+
+[Here](http://www.jk.fr.eu.org/ikiwiki/format-escapes-2.diff) is a patch
+largely inspired from the one below, which is up to date and written with
+[[todo/multiple_output_formats]] in mind. "htmlize" hooks are generalized
+to "convert" ones, which can be registered for any pair of filename
+extensions.
+
+Preprocessor directives are allowed to return the content to be inserted
+as a hash, in any format they want, provided they provide htmlize hooks for it.
+Pseudo filename extensions (such as `"_link"`) can also be introduced,
+which aren't used as real extensions but provide useful intermediate types.
+
+--[[JeremieKoenig]]
+
+> Wow, this is in many ways a beautiful patch. I did notice one problem,
+> if a link is converted to rst and then from there to a hyperlink, the
+> styling info usially added to such a link is lost. I wonder if it would
+> be better to lose _link stuff and just create link html that is fed into
+> the rst,html converter. Other advantage to doing that is that link
+> creation has a rather complex interface, with selflink, attrs, url, and
+> content parameters.
+>
+> --[[Joey]]
+
+>> Thanks for the compliment. I must confess that I'm not too familiar with
+>> rst. I am using this todo item somewhat as a pretext to get the conversion
+>> stuff in, which I need to implement some other stuff. As a result I was
+>> less careful with the rst plugin than with the rest of the patch.
+>> I just updated the patch to fix some other problems which I found with
+>> more testing, and document the current limitations.
+
+>> Rst cannot embed raw html in the middle of a paragraph, which is why
+>> "_link" was necessary. Rst links are themselves tricky and can't be made to
+>> work inside of words without knowledge about the context.
+>> Both problems could be fixed by inserting marks instead of the html/link,
+>> which would be replaced at a later stage (htmlize, format), somewhat
+>> similar to the way the toc plugin works. When I get more time I will
+>> try to fix the remaining glitches this way.
+
+>> Also, I think it would be useful if ikiwiki had an option to export
+>> the preprocessed source. This way you can use docutils to convert your
+>> rst documents to other formats. Raw html would be lost in such a
+>> process (both with directives and marks), which is another
+>> argument for `"_link"` and other intermediate forms. I think I can
+>> come up with a way for rst's convert_link to be used only for export
+>> purposes, though.
+
+>> --[[JeremieKoenig]]
+
+> Another problem with this approach is when there is some html (say a
+> table), that contains a wikilink. If the link is left up to the markup
+> language to handle, it will never convert it to a link, since the table
+> will be processed as a chunk of raw html.
+> --[[Joey]]
+
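+The mark-and-replace idea discussed in the thread above can be shown in miniature. This is a Python sketch under invented names (`defer_html`, `restore_html`); it is not ikiwiki code, just the shape of the technique:

```python
import uuid

def defer_html(html, pending):
    # Stash raw html under an opaque mark that the markup
    # processor will pass through unchanged.
    mark = "ikiwiki-mark-%s" % uuid.uuid4().hex
    pending[mark] = html
    return mark

def restore_html(output, pending):
    # After htmlize, swap each mark back for the html it stands for.
    for mark, html in pending.items():
        output = output.replace(mark, html)
    return output
```

+A real implementation would also have to choose marks that survive the markup processor intact, much as the toc plugin does.
+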
+### Updated patch
+
+I've created an updated [patch](http://www.idletheme.org/code/patches/ikiwiki-format-escapes-rlk-2007-09-24.diff) against the current revision. No real functionality changes, except for a small test script, one minor bugfix (put a "join" around a scalar-context "map" in convert_link), and some wrangling to get it merged properly; I thought it might be helpful for anyone else who wants to work on the code.
+
+(With that out of the way, I think I'm going to take a stab at Jeremie's plan to use marks which would be replaced post-htmlization. I've also got an eye towards [[todo/multiple_output_formats]].)
+
+--Ryan Koppenhaver
+
+## Original patch
+[[!tag patch patch/core plugins/rst]]
+
+<pre>
+Index: debian/changelog
+===================================================================
+--- debian/changelog (revision 3197)
++++ debian/changelog (working copy)
+@@ -24,6 +24,9 @@
+ than just a suggests, since OpenID is enabled by default.
+ * Fix a bug that caused link(foo) to succeed if page foo did not exist.
+ * Fix tags to page names that contain special characters.
++ * Based on a patch by Ethan, add a new htmlescape hook, that is called
++ when a preprocssor directive emits inline html. The rst plugin uses this
++ hook to support inlined raw html.
+
+ [ Josh Triplett ]
+ * Use pngcrush and optipng on all PNG files.
+Index: IkiWiki/Render.pm
+===================================================================
+--- IkiWiki/Render.pm (revision 3197)
++++ IkiWiki/Render.pm (working copy)
+@@ -96,7 +96,7 @@
+ if ($page !~ /.*\/\Q$discussionlink\E$/ &&
+ (length $config{cgiurl} ||
+ exists $links{$page."/".$discussionlink})) {
+- $template->param(discussionlink => htmllink($page, $page, gettext("Discussion"), noimageinline => 1, forcesubpage => 1));
++ $template->param(discussionlink => htmllink($page, $page, gettext("Discussion"), noimageinline => 1, forcesubpage => 1, genhtml => 1));
+ $actions++;
+ }
+ }
+Index: IkiWiki/Plugin/rst.pm
+===================================================================
+--- IkiWiki/Plugin/rst.pm (revision 3197)
++++ IkiWiki/Plugin/rst.pm (working copy)
+@@ -30,15 +30,36 @@
+ html = publish_string(stdin.read(), writer_name='html',
+ settings_overrides = { 'halt_level': 6,
+ 'file_insertion_enabled': 0,
+- 'raw_enabled': 0 }
++ 'raw_enabled': 1 }
+ );
+ print html[html.find('<body>')+6:html.find('</body>')].strip();
+ ";
+
+ sub import {
+ hook(type => "htmlize", id => "rst", call => \&htmlize);
++ hook(type => "htmlescape", id => "rst", call => \&htmlescape);
++ hook(type => "htmlescapelink", id => "rst", call => \&htmlescapelink);
+ }
+
++sub htmlescapelink ($$;@) {
++ my $url = shift;
++ my $text = shift;
++ my %params = @_;
++
++ if ($params{broken}){
++ return "`? <$url>`_\ $text";
++ }
++ else {
++ return "`$text <$url>`_";
++ }
++}
++
++sub htmlescape ($) {
++ my $html=shift;
++ $html=~s/^/ /mg;
++ return ".. raw:: html\n\n".$html;
++}
++
+ sub htmlize (@) {
+ my %params=@_;
+ my $content=$params{content};
+Index: doc/plugins/write.mdwn
+===================================================================
+--- doc/plugins/write.mdwn (revision 3197)
++++ doc/plugins/write.mdwn (working copy)
+@@ -121,6 +121,26 @@
+ The function is passed named parameters: "page" and "content" and should
+ return the htmlized content.
+
++### htmlescape
++
++ hook(type => "htmlescape", id => "ext", call => \&htmlescape);
++
++Some markup languages do not allow raw html to be mixed in with the markup
++language, and need it to be escaped in some way. This hook is a companion
++to the htmlize hook, and is called when ikiwiki detects that a preprocessor
++directive is inserting raw html. It is passed the chunk of html in
++question, and should return the escaped chunk.
++
++### htmlescapelink
++
++ hook(type => "htmlescapelink", id => "ext", call => \&htmlescapelink);
++
++Some markup languages have special syntax to link to other pages. This hook
++is a companion to the htmlize and htmlescape hooks, and it is called when a
++link is inserted. It is passed the target of the link and the text of the
++link, and an optional named parameter "broken" if a broken link is being
++generated. It should return the correctly-formatted link.
++
+ ### pagetemplate
+
+ hook(type => "pagetemplate", id => "foo", call => \&pagetemplate);
+@@ -355,6 +375,7 @@
+ * forcesubpage - set to force a link to a subpage
+ * linktext - set to force the link text to something
+ * anchor - set to make the link include an anchor
++* genhtml - set to generate HTML and not escape for correct format
+
+ #### `readfile($;$)`
+
+Index: doc/plugins/rst.mdwn
+===================================================================
+--- doc/plugins/rst.mdwn (revision 3197)
++++ doc/plugins/rst.mdwn (working copy)
+@@ -10,10 +10,8 @@
+ Note that this plugin does not interoperate very well with the rest of
+ ikiwiki. Limitations include:
+
+-* reStructuredText does not allow raw html to be inserted into
+- documents, but ikiwiki does so in many cases, including
+- [[WikiLinks|ikiwiki/WikiLink]] and many
+- [[Directives|ikiwiki/Directive]].
++* Some bits of ikiwiki may still assume that markdown is used or embed html
++ in ways that break reStructuredText. (Report bugs if you find any.)
+ * It's slow; it forks a copy of python for each page. While there is a
+ perl version of the reStructuredText processor, it is not being kept in
+ sync with the standard version, so is not used.
+Index: IkiWiki.pm
+===================================================================
+--- IkiWiki.pm (revision 3197)
++++ IkiWiki.pm (working copy)
+@@ -469,6 +469,10 @@
+ my $page=shift; # the page that will contain the link (different for inline)
+ my $link=shift;
+ my %opts=@_;
++ # we are processing $lpage and so we need to format things in accordance
++ # with the formatting language of $lpage. inline generates HTML so links
++ # will be escaped seperately.
++ my $type=pagetype($pagesources{$lpage});
+
+ my $bestlink;
+ if (! $opts{forcesubpage}) {
+@@ -494,12 +498,17 @@
+ }
+ if (! grep { $_ eq $bestlink } map { @{$_} } values %renderedfiles) {
+ return $linktext unless length $config{cgiurl};
+- return "<span><a href=\"".
+- cgiurl(
+- do => "create",
+- page => pagetitle(lc($link), 1),
+- from => $lpage
+- ).
++ my $url = cgiurl(
++ do => "create",
++ page => pagetitle(lc($link), 1),
++ from => $lpage
++ );
++
++ if ($hooks{htmlescapelink}{$type} && ! $opts{genhtml}){
++ return $hooks{htmlescapelink}{$type}{call}->($url, $linktext,
++ broken => 1);
++ }
++ return "<span><a href=\"". $url.
+ "\">?</a>$linktext</span>"
+ }
+
+@@ -514,6 +523,9 @@
+ $bestlink.="#".$opts{anchor};
+ }
+
++ if ($hooks{htmlescapelink}{$type} && !$opts{genhtml}) {
++ return $hooks{htmlescapelink}{$type}{call}->($bestlink, $linktext);
++ }
+ return "<a href=\"$bestlink\">$linktext</a>";
+ }
+
+@@ -628,6 +640,14 @@
+ preview => $preprocess_preview,
+ );
+ $preprocessing{$page}--;
++
++ # Handle escaping html if the htmlizer needs it.
++ if ($ret =~ /[<>]/ && $pagesources{$page}) {
++ my $type=pagetype($pagesources{$page});
++ if ($hooks{htmlescape}{$type}) {
++ return $hooks{htmlescape}{$type}{call}->($ret);
++ }
++ }
+ return $ret;
+ }
+ else {
+</pre>
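+For reference, the two escape hooks in the patch above amount to the following transformations, re-expressed in Python (the patch itself is Perl; these function names are illustrative only):

```python
def rst_escape_html(html):
    # Indent the chunk so it becomes the body of a reST
    # "raw html" directive.
    indented = "\n".join("    " + line for line in html.splitlines())
    return ".. raw:: html\n\n" + indented

def rst_escape_link(url, text, broken=False):
    # reST inline hyperlink syntax; a broken link becomes a "?"
    # creation link abutting the plain text (backslash-space is
    # reST's escaped whitespace, as in the patch).
    if broken:
        return "`? <%s>`_\\ %s" % (url, text)
    return "`%s <%s>`_" % (text, url)
```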
diff --git a/doc/todo/fortune:_select_options_via_environment.mdwn b/doc/todo/fortune:_select_options_via_environment.mdwn
new file mode 100644
index 000000000..ddacd91b5
--- /dev/null
+++ b/doc/todo/fortune:_select_options_via_environment.mdwn
@@ -0,0 +1,34 @@
+ diff -up fortune.pm.ORIG fortune.pm.MODIFIED
+ --- fortune.pm.ORIG 2008-01-11 19:07:48.000000000 +0100
+ +++ fortune.pm.MODIFIED 2008-01-12 07:58:44.000000000 +0100
+ @@ -1,5 +1,11 @@
+ #!/usr/bin/perl
+ -# Include a fortune in a page
+ +# Include a fortune in a page.
+ +# If the environment variable IKIWIKI_FORTUNE_COMMAND is defined, use it.
+ +# This allows to run e.g.:
+ +# $IKIWIKI_FORTUNE_COMMAND='fortune ~/.fortune/myfortunes' \
+ +# ikiwiki -setup ~/.ikiwiki/ikiwiki.setup
+ +# Combining this with cron could make regenerated wiki content.
+ +# This may or may not be a good thing wrt. version control.
+ package IkiWiki::Plugin::fortune;
+
+ use warnings;
+ @@ -12,7 +18,13 @@ sub import {
+
+ sub preprocess (@) {
+ $ENV{PATH}="$ENV{PATH}:/usr/games:/usr/local/games";
+ - my $f = `fortune 2>/dev/null`;
+ + my $f;
+ + if (exists ($ENV{'IKIWIKI_FORTUNE_COMMAND'})) {
+ + $f = `$ENV{'IKIWIKI_FORTUNE_COMMAND'} 2>/dev/null`
+ + }
+ + else {
+ + $f = `fortune 2>/dev/null`;
+ + }
+
+ if ($?) {
+ return "[[".gettext("fortune failed")."]]";
+
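+The fallback logic in the patch boils down to the following (a Python re-sketch; the helper name is made up):

```python
import os
import subprocess

def fortune_text(default_cmd="fortune"):
    # Use $IKIWIKI_FORTUNE_COMMAND when it is set, else fall back
    # to the default command; a failed command is reported as None
    # so the caller can emit "fortune failed".
    cmd = os.environ.get("IKIWIKI_FORTUNE_COMMAND", default_cmd)
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    if result.returncode != 0:
        return None
    return result.stdout
```
+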
+> An environment variable is not the right approach. Ikiwiki has a setup
+> file, and plugins can use configuration from there. --[[Joey]]
diff --git a/doc/todo/friendly_markup_names.mdwn b/doc/todo/friendly_markup_names.mdwn
new file mode 100644
index 000000000..f88e3c1c7
--- /dev/null
+++ b/doc/todo/friendly_markup_names.mdwn
@@ -0,0 +1,13 @@
+On the edit form when you are creating a new page, you are given an option of
+page types that can be used. The string presented to the user here is not
+particularly friendly: e.g., mdwn, txtl... it would be nice if the drop-down
+contents were "Markdown", "Textile", etc. (the values in the option tags can
+remain the same).
+
+I've written a first-take set of patches for this. They are in
+git://github.com/jmtd/ikiwiki.git in the branch "friendly_markup_names". [[!tag patch]]
+
+-- [[Jon]]
+
+[[merged|done]], TFTP! (I have not checked if any other format plugins
+would benefit from a longer name) --[[Joey]]
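+
+The change amounts to a small display-name mapping, sketched here in Python (the table entries are illustrative, not an exhaustive copy of the patch):

```python
FRIENDLY_NAMES = {
    "mdwn": "Markdown",
    "txtl": "Textile",
    "rst": "reStructuredText",
}

def friendly_name(ext):
    # Show a readable label in the page-type drop-down while the
    # option value stays the raw extension; unknown types fall
    # through unchanged.
    return FRIENDLY_NAMES.get(ext, ext)
```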
diff --git a/doc/todo/generated_po_stuff_not_ignored_by_git.mdwn b/doc/todo/generated_po_stuff_not_ignored_by_git.mdwn
new file mode 100644
index 000000000..29c017c5d
--- /dev/null
+++ b/doc/todo/generated_po_stuff_not_ignored_by_git.mdwn
@@ -0,0 +1,6 @@
+[[!tag patch]]
+
+The recent merge of the po branch didn't come with a .gitignore.
+It eventually annoyed me enough to fix it :-) --[[smcv]]
+
+[[done]]
diff --git a/doc/todo/generic___39__do__61__goto__39___for_CGI.mdwn b/doc/todo/generic___39__do__61__goto__39___for_CGI.mdwn
new file mode 100644
index 000000000..26c5202d0
--- /dev/null
+++ b/doc/todo/generic___39__do__61__goto__39___for_CGI.mdwn
@@ -0,0 +1,35 @@
+The [[plugins/recentchanges]] plugin has a `do=recentchanges_link` feature that will
+redirect to a given wiki page, or an error page with a creation link.
+
+In the [[plugins/contrib/comments]] plugin I've found that it would be useful to do
+the same for users. For now I've just cloned the functionality into the comments
+plugin, but perhaps this functionality could be renamed to `do=goto` or
+something, and moved to `IkiWiki/CGI.pm`?
+
+> Now implemented as the 'goto' branch in my git repository, along with
+> [[apache_404_ErrorDocument_handler]]. --[[smcv]]
+
+>> Looks good, the only things I wonder are:
+>> * Should it be a separate plugin? In particular `cgi_page_from_404()` is
+>> pretty big, and only works if apache is configured for it, so it seems
+>> somewhat suited to being a plugin.
+
+>>> I've split out `goto` and `apache404` plugins in the branch. I think
+>>> you're right that apache404 should be a plugin. If you think goto is small
+>>> and general enough to not be a plugin, just don't merge my most recent
+>>> patch! --[[smcv]]
+
+>> * I wish there were some way to generalize the workaround for the stupid
+>> MSIE behavior. Actually, I wish we could ignore the MSIE stupidity,
+>> as I tend to do, but perhaps it's too stupid in this case for that to
+>> fly..
+>> * Is there any reason to require do=goto before checking for
+>> `REDIRECT_STATUS`? Seems that if that code were moved
+>> out of the enclosing if block, the apache 404 handler could
+>> be set direct to the cgi, which seems simpler to remember.
+>> --[[Joey]]
+
+>>> No, good point - the `REDIRECT_STATUS` check is sufficiently unambiguous
+>>> already. Fixed. --[[smcv]]
+
+[[done]]
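+
+The `REDIRECT_STATUS` handling agreed on above can be sketched like so (Python; `page_from_404` is an invented name, though `REDIRECT_STATUS` and `REDIRECT_URL` are the variables Apache sets for an ErrorDocument handler):

```python
def page_from_404(environ, base="/"):
    # Only act when Apache invoked us as a 404 handler, then derive
    # the wiki page name from the originally requested URL.
    if environ.get("REDIRECT_STATUS") != "404":
        return None
    path = environ.get("REDIRECT_URL", "")
    if not path.startswith(base):
        return None
    return path[len(base):].strip("/") or "index"
```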
diff --git a/doc/todo/generic_insert_links.mdwn b/doc/todo/generic_insert_links.mdwn
new file mode 100644
index 000000000..050f32ee7
--- /dev/null
+++ b/doc/todo/generic_insert_links.mdwn
@@ -0,0 +1,24 @@
+The attachment plugin's Insert Links button currently only knows
+how to insert plain wikilinks and img directives (for images).
+
+[[wishlist]]: Generalize this, so a plugin can cause arbitrary text
+to be inserted for a particular file. --[[Joey]]
+
+Design:
+
+Add an insertlinks hook. Each plugin using the hook would be called,
+and passed the filename of the attachment. If it knows how to handle
+the file type, it returns a the text that should be inserted on the page.
+If not, it returns undef, and the next plugin is tried.
+
+This would mean writing plugins in order to handle links for
+special kinds of attachments. To avoid that for simple stuff,
+a fallback plugin could run last and look for a template
+named like `templates/embed_$extension`, and insert a directive like:
+
+ \[[!template id=embed_vp8 file=my_movie.vp8]]
+
+Then to handle a new file type, a user could just make a template
+that expands to some relevant html. In the example above,
+`templates/embed_vp8` could make a html5 video tag, possibly with some
+flash fallback code even.
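+
+The hook chain described above could look roughly like this (Python sketch; the function name and handler signature are invented for illustration):

```python
def insert_link_text(filename, handlers):
    # Ask each registered insertlinks handler in turn; the first
    # one that recognizes the file type wins.
    for handler in handlers:
        text = handler(filename)
        if text is not None:
            return text
    # Fallback: emit a template directive keyed on the extension,
    # so support for a new type is just a templates/embed_<ext> page.
    ext = filename.rsplit(".", 1)[-1]
    return "[[!template id=embed_%s file=%s]]" % (ext, filename)
```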
diff --git a/doc/todo/geotagging.mdwn b/doc/todo/geotagging.mdwn
new file mode 100644
index 000000000..65658d7c4
--- /dev/null
+++ b/doc/todo/geotagging.mdwn
@@ -0,0 +1,7 @@
+Would be nice to see a way of geotagging pages in an ikiwiki,
+and search/sort pages by distance to a given location, as well as
+showing page locations on a map (Google Map, OpenStreetMap, etc). -- [[users/vibrog]]
+
+[[!tag wishlist]]
+
+> [[!cpan Geo::Coordinates::UTM]] would probably be useful. --[[smcv]]
diff --git a/doc/todo/git-rev-list_requires_relative_path___40__fixes_git_ctime__41__.mdwn b/doc/todo/git-rev-list_requires_relative_path___40__fixes_git_ctime__41__.mdwn
new file mode 100644
index 000000000..ec7d61b90
--- /dev/null
+++ b/doc/todo/git-rev-list_requires_relative_path___40__fixes_git_ctime__41__.mdwn
@@ -0,0 +1,24 @@
+ Index: IkiWiki/Rcs/git.pm
+ ===================================================================
+ --- IkiWiki/Rcs/git.pm (revision 4532)
+ +++ IkiWiki/Rcs/git.pm (working copy)
+ @@ -275,6 +275,9 @@
+
+ my $file = shift || q{--};
+
+ + # Remove srcdir prefix to appease git-rev-list
+ + $file =~ s/^$config{srcdir}\/?//;
+ +
+ # Ignore error since a non-existing file might be given.
+ my ($sha1) = run_or_non('git-rev-list', '--max-count=1', 'HEAD', $file);
+ if ($sha1) {
+
+I actually see a bug in this patch. :-) If srcdir = "foo" and the wiki
+contains a "foo/bar" and a "bar", this will make it, in the non-ctime case,
+get the sha1 of the wrong file, "bar", when "foo/bar" is asked for.
+
+Better to strip the path out in getctime, I guess.
+
+--[[Joey]]
+
+[[!tag patch done]]
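+
+The strip itself is mechanical, as sketched below in Python (illustrative name); the point above is that it must only be applied where the argument is known to carry the srcdir prefix, i.e. in getctime, not unconditionally:

```python
def strip_srcdir(path, srcdir):
    # Remove the srcdir prefix when present; a path that merely
    # shares a leading component name is left alone because the
    # comparison includes the trailing slash.
    prefix = srcdir.rstrip("/") + "/"
    if path.startswith(prefix):
        return path[len(prefix):]
    return path
```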
diff --git a/doc/todo/git_attribution.mdwn b/doc/todo/git_attribution.mdwn
new file mode 100644
index 000000000..baa522adc
--- /dev/null
+++ b/doc/todo/git_attribution.mdwn
@@ -0,0 +1,9 @@
+When run with the [[rcs/Git]] backend, ikiwiki should use `GIT_AUTHOR_NAME`
+and `GIT_AUTHOR_EMAIL` rather than munging the commit message. Depending
+on the semantics you want to imply (does a web edit constitute a commit by
+the user or by the script?), it could also set `GIT_COMMITTER_NAME` and
+`GIT_COMMITTER_EMAIL` to the same values. --[[JoshTriplett]]
+
+> See [[!debbug 451023]] for a [[patch]] --[[Joey]]
+
+[[done]]
diff --git a/doc/todo/git_attribution/discussion.mdwn b/doc/todo/git_attribution/discussion.mdwn
new file mode 100644
index 000000000..6905d9b4b
--- /dev/null
+++ b/doc/todo/git_attribution/discussion.mdwn
@@ -0,0 +1,98 @@
+I'd strongly recommend this modification to ikiwiki. Any particular limitations that anyone can think of?
+
+I might even have a try at this patch, though I'd have to hack the user preferences page to include author name...
+
+As to the question of whether the committer was the 'script' or the wiki editor... I'm not sure. Marking it as the script somehow (`ikiwiki-cgi <ikiwiki@sitename>`?) seems to make sense and would make it easier to manage.
+
+[[harningt]]
+
+I've been thinking a bit about the GIT attribution in ikiwiki...
+
+If no email set, I think "$USERNAME" is reasonable... no point in the
+'<>' causing clutter.
+>> **adjustment wrt comments**: leave the '<>' in due to requirements in git
+
+If no username set... then something like '@[IPADDR]' makes sense...
+(not in email brackets).
+
+> Why not put it in email brackets? --[[Joey]]
+
+In the case of OpenID login.. I think that's a special case... I don't
+think attempting to munge something meaningful out of the OpenID makes
+sense... but I think some massaging might need to be done.
+
+Ex: I've noticed in the current mode that logging in w/
+harningt.eharning.us/ shows up in the logs w/o the http, and if I log in w/
+http://harningt.eharning.us/ it shows up w/ the http... causing some
+inconsistency. I think it ought to make sure that it has the properly
+discovered, canonicalized form (ex: if there's a redirect to another
+site (harningt.eharning.us -> www.eharning.us) then technically the
+target site is the 'real' openid (at least according to how most OpenID
+RPs take it).
+
+...
+
+For OpenID edits, I think there should be a way to tell it what
+username to show in the preferences dialog (so you can have a 'normal'
+$USER <$EMAIL> setup.) This could by default be filled in w/ sreg
+nickname value (as well as email for that matter)...
+
+To convey the openid used to make the edit, I think it would be
+important to add some sort of footer line along the lines of the
+Signed-off-by: $USER <$EMAIL> convention I've seen.
+
+Perhaps an OpenID: $OPENID_URL would make sense. This could help w/
+making sure that no one irrefutably spoofs a post by someone (since w/
+the setup where email and effective username are configurable, there's
+no determination of uniqueness)
+>> **adjustment re the git requirement**: "$OPENID_URL <>"
+
+[[harningt]]
+
+[[madduck]]: git requires `Name <Email@address>` format, as far as I know.
+
+> Yes, it does:
+>
+> joey@kodama:~/tmp/foo/bar>git commit --author "foo"
+> fatal: malformed --author parameter
+>
+> It seems to be happy with anything of the form "foo <foo>" -- doesn't seem to
+> do any kind of strict checking. Even "http://joey.kitenet.net <>" will be
+> accepted. --[[Joey]]
+>>
+>>Sounds good to me,
+>>
+>> --[[harningt]]
+
+> I think the thing to do is, as Josh suggested originally, use
+> GIT_AUTHOR_NAME and GIT_AUTHOR_EMAIL. Note that setting these
+> individually is best, so git can independently validate/sanitize both
+> (which it does do somewhat). Always put the username/openid/IP in
+> GIT_AUTHOR_NAME; if the user has configured an email address,
+> GIT_AUTHOR_EMAIL can also be set.
+>
+> There is one thing yet to be solved, and that is how to tell the
+> difference between a web commit by 'Joey Hess <joey\@kitenet.net>',
+> and a git commit by the same. I think we do want to differentiate these,
+> and the best way to do it seems to be to add a line to the end of the
+> commit message. Something like: "\n\nWeb-commit: true"
+>
+> For backwards compatibility, the code that parses the current stuff needs
+> to be left in. But it will need to take care to only parse that if the
+> commit isn't flagged as a web commit! Else web committers could forge
+> commits from others. --[[Joey]]
+>
+> BTW, I decided not to use the user's email address in the commit, because
+> then the email becomes part of project history, and you don't really
+> expect that to happen when you give your email address on signup to a web
+> site.
+>
+> The problem with leaving the email empty is that it confuses some things
+> that try to parse it, including:
+> * cia (wants a username in there)
+> * git pull --rebase (?)
+> * github pushes to twitter ;-)
+>
+> So while I tried that way at first, I'm now leaning toward encoding the
+> username in the email address. Like "user <user\@web>", or
+> "joey <http://joey.kitenet.net/\@web>".
diff --git a/doc/todo/git_recentchanges_should_not_show_merges.mdwn b/doc/todo/git_recentchanges_should_not_show_merges.mdwn
new file mode 100644
index 000000000..e65efdc81
--- /dev/null
+++ b/doc/todo/git_recentchanges_should_not_show_merges.mdwn
@@ -0,0 +1,20 @@
+The recentchanges page can currently display merge commits, such as "Merge
+branch 'master' of ssh://git.kitenet.net/srv/git/ikiwiki.info".
+
+It should filter these out somehow, but I'm not sure how to do that.
+
+A merge in general is a commit with two parents, right? But such a merge
+might be what gitweb calls a "simple merge", that is I think, just a
+fast-forward. Or it could be a merge that includes manual conflict resolution,
+and should be shown in recentchanges.
+
+Seems that the problem is that it's calling git-log with the -m option,
+which makes merges be listed with the diff from the perspective of each
+parent. I think it would be better not to use that (or possibly to use the
+-c option instead?). If the -m is left off, none of the changes in the
+merge are shown, even if it includes changes not in any of the parents
+(manual conflict resolution). With -c, it seems to show only the unique
+changes introduced by the merge.
+
+[[done]], using -c, hope that was the right choice --[[Joey]]
diff --git a/doc/todo/graphviz.mdwn b/doc/todo/graphviz.mdwn
new file mode 100644
index 000000000..3f2514a99
--- /dev/null
+++ b/doc/todo/graphviz.mdwn
@@ -0,0 +1,19 @@
+How about a plugin providing a
+[[preprocessor_directive|ikiwiki/directive]] to render a
+[[!debpkg graphviz]] file as an image via one of the graphviz programs
+("dot" by default) and include the resulting image on the page, using the
+"cmapx" image map format? graphviz files themselves could also render the
+same way into an HTML file with the same basename as the graphviz file;
+format and program could come either from an ikiwiki configuration option
+or comments/directives in the file. (For example, "digraph" could imply
+"dot", and "graph" could imply "neato".)
+
+To complement this, ikiwiki could support creating and editing graphviz files through the CGI interface, as a new page type; preview could render the file. It would also help to have some sort of graphviz extension attribute for linking to a wiki page, which would become a standard href or URL attribute in the input passed to the particular graphviz program.
+
+> Editing graphviz files safely online might be tricky. Graphviz would need
+> to be audited. --[[Joey]]
+
+>> I've added a [[graphviz_plugin|plugins/graphviz]] which adds a preprocessor
+>> directive to render inline graphviz graphs, addressing part of this todo
+>> item. It doesn't yet support graphviz files as a separate page type, image
+>> maps, or wikilinks. --[[JoshTriplett]]
diff --git a/doc/todo/hard-coded_location_for_man_pages_and_w3m_cgi_wrapper.mdwn b/doc/todo/hard-coded_location_for_man_pages_and_w3m_cgi_wrapper.mdwn
new file mode 100644
index 000000000..7cf37fbb9
--- /dev/null
+++ b/doc/todo/hard-coded_location_for_man_pages_and_w3m_cgi_wrapper.mdwn
@@ -0,0 +1,94 @@
+Hi,
+
+some operating systems use PREFIX/man instead of PREFIX/share/man as the base
+directory for man pages and PREFIX/libexec/ instead of PREFIX/lib/ for files
+like CGI programs.
+At the moment the location of the installed man pages and the w3m cgi wrapper
+is hard-coded in Makefile.PL.
+The patch below makes it possible to install those files to alternative directories
+while the default stays as it is now.
+
+> It should be possible to use the existing MakeMaker variables such as
+> INSTALLMAN1DIR (though MakeMaker lacks one for man8). I'd prefer not
+> adding new variables where MakeMaker already has them. --[[Joey]]
+
+[[!tag patch patch/core]]
+
+<pre>
+
+ - Introduce two variables, IKI_MANDIR and IKI_W3MCGIDIR, to be set from
+ the command line. This enables locations for man pages and the w3m
+ cgi wrapper other than the hard-coded defaults in Makefile.PL.
+
+--- Makefile.PL.orig 2007-05-20 03:03:58.000000000 +0200
++++ Makefile.PL
+@@ -3,9 +3,32 @@ use warnings;
+ use strict;
+ use ExtUtils::MakeMaker;
+
++my %params = ( 'IKI_MANDIR' => '$(PREFIX)/share/man',
++ 'IKI_W3MCGIDIR' => '$(PREFIX)/lib/w3m/cgi-bin'
++ );
++
++@ARGV = grep {
++ my ($key, $value) = split(/=/, $_, 2);
++ if ( exists $params{$key} ) {
++ $params{$key} = $value;
++ print "Using $params{$key} for $key.\n";
++ 0
++ } else {
++ 1
++ }
++} @ARGV;
++
++
+ # Add a few more targets.
+ sub MY::postamble {
+-q{
++ package MY;
++
++ my $scriptvars = <<"EOSCRIPTVARS";
++IKI_MANDIR = $params{'IKI_MANDIR'}
++IKI_W3MCGIDIR = $params{'IKI_W3MCGIDIR'}
++EOSCRIPTVARS
++
++ my $script = q{
+ all:: extra_build
+ clean:: extra_clean
+ install:: extra_install
+@@ -56,23 +79,24 @@ extra_install:
+ done; \
+ done
+
+- install -d $(DESTDIR)$(PREFIX)/share/man/man1
+- install -m 644 ikiwiki.man $(DESTDIR)$(PREFIX)/share/man/man1/ikiwiki.1
++ install -d $(DESTDIR)$(IKI_MANDIR)/man1
++ install -m 644 ikiwiki.man $(DESTDIR)$(IKI_MANDIR)/man1/ikiwiki.1
+
+- install -d $(DESTDIR)$(PREFIX)/share/man/man8
+- install -m 644 ikiwiki-mass-rebuild.man $(DESTDIR)$(PREFIX)/share/man/man8/ikiwiki-mass-rebuild.8
++ install -d $(DESTDIR)$(IKI_MANDIR)/man8
++ install -m 644 ikiwiki-mass-rebuild.man $(DESTDIR)$(IKI_MANDIR)/man8/ikiwiki-mass-rebuild.8
+
+ install -d $(DESTDIR)$(PREFIX)/sbin
+ install ikiwiki-mass-rebuild $(DESTDIR)$(PREFIX)/sbin
+
+- install -d $(DESTDIR)$(PREFIX)/lib/w3m/cgi-bin
+- install ikiwiki-w3m.cgi $(DESTDIR)$(PREFIX)/lib/w3m/cgi-bin
++ install -d $(DESTDIR)$(IKI_W3MCGIDIR)
++ install ikiwiki-w3m.cgi $(DESTDIR)$(IKI_W3MCGIDIR)
+
+ install -d $(DESTDIR)$(PREFIX)/bin
+ install ikiwiki.out $(DESTDIR)$(PREFIX)/bin/ikiwiki
+
+ $(MAKE) -C po install PREFIX=$(PREFIX)
+-}
++};
++ return $scriptvars.$script;
+ }
+
+ WriteMakefile(
+
+</pre>
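+
+The @ARGV filtering in the patch is the usual KEY=VALUE override idiom; in Python it would look something like this (names illustrative):

```python
def split_overrides(argv, defaults):
    # Pull recognized KEY=VALUE overrides out of the argument list,
    # leaving everything else for the normal option parser.
    params = dict(defaults)
    rest = []
    for arg in argv:
        key, sep, value = arg.partition("=")
        if sep and key in params:
            params[key] = value
        else:
            rest.append(arg)
    return params, rest
```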
diff --git a/doc/todo/headless_git_branches.mdwn b/doc/todo/headless_git_branches.mdwn
new file mode 100644
index 000000000..bedf21d0c
--- /dev/null
+++ b/doc/todo/headless_git_branches.mdwn
@@ -0,0 +1,113 @@
+Ikiwiki should really survive being asked to work with a git branch that has no existing commits.
+
+ mkdir iki-gittest
+ cd iki-gittest
+ GIT_DIR=barerepo.git git init
+ git clone barerepo.git srcdir
+ ikiwiki --rcs=git srcdir destdir
+
+I've fixed this initial construction case and, based on my testing, I've
+also fixed the cases of post-update executing on a new master, and of
+ikiwiki.cgi executing on a non-existent master.
+
+Please commit so my users stop whining at me about having clean branches to push to, the big babies.
+
+Summary: Change three scary loud failure cases related to empty branches into three mostly quiet success cases.
+
+[[!tag patch]]
+
+> FWIW, [[The_TOVA_Company]] apparently wants this feature (and I hope
+> they don't mind that I mention they were willing to pay someone for it,
+> but I told them I'd not done any of the work. :) )
+>
+> Code review follows, per hunk.. --[[Joey]]
+
+<pre>
+diff --git a/IkiWiki/Plugin/git.pm b/IkiWiki/Plugin/git.pm
+index cf7fbe9..e5bafcf 100644
+--- a/IkiWiki/Plugin/git.pm
++++ b/IkiWiki/Plugin/git.pm
+@@ -439,17 +439,21 @@ sub git_commit_info ($;$) {
+
+ my @opts;
+ push @opts, "--max-count=$num" if defined $num;
+-
+- my @raw_lines = run_or_die('git', 'log', @opts,
+- '--pretty=raw', '--raw', '--abbrev=40', '--always', '-c',
+- '-r', $sha1, '--', '.');
+-
++ my @raw_lines;
+ my @ci;
+- while (my $parsed = parse_diff_tree(\@raw_lines)) {
+- push @ci, $parsed;
+- }
++
++ # Test to see if branch actually exists yet.
++ if (run_or_non('git', 'show-ref', '--quiet', '--verify', '--', 'refs/heads/' . $config{gitmaster_branch}) ) {
++ @raw_lines = run_or_die('git', 'log', @opts,
++ '--pretty=raw', '--raw', '--abbrev=40', '--always', '-c',
++ '-r', $sha1, '--', '.');
++
++ while (my $parsed = parse_diff_tree(\@raw_lines)) {
++ push @ci, $parsed;
++ }
+
+- warn "Cannot parse commit info for '$sha1' commit" if !@ci;
++ warn "Cannot parse commit info for '$sha1' commit" if !@ci;
++ };
+
+ return wantarray ? @ci : $ci[0];
+ }
+</pre>
+
+My concern is that this adds a bit of slowdown (git show-ref is fast, but
+it's still extra work) to a very hot code path that is run to, e.g.,
+update recentchanges after every change.
+
+Seems not ideal to do extra work every time to handle a case
+that will literally happen a maximum of once in the entire lifecycle of a
+wiki (and zero times more typically, since the setup automator puts in a
+.gitignore file that works around this problem).
+
+So as to not just say "no" ... what if it always tried to run git log,
+and if it failed (or returned no parsed lines), then it could look
+at git show-ref to deduce whether to throw an error or not.
+--[[Joey]]
+
+> Ah, but then git-log would still complain "bad revision 'HEAD'"
+> --[[Joey]]
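+
+The existence test used in the hunk is the standard `git show-ref --verify` idiom; as a standalone Python sketch (hypothetical helper, not ikiwiki code):

```python
import subprocess

def branch_exists(repo, branch):
    # `git show-ref --verify` exits nonzero when the ref is absent,
    # which is exactly the headless-branch case being handled here.
    result = subprocess.run(
        ["git", "-C", repo, "show-ref", "--quiet", "--verify",
         "refs/heads/%s" % branch])
    return result.returncode == 0
```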
+
+<pre>
+@@ -474,7 +478,10 @@ sub rcs_update () {
+ # Update working directory.
+
+ if (length $config{gitorigin_branch}) {
+- run_or_cry('git', 'pull', '--prune', $config{gitorigin_branch});
++ run_or_cry('git', 'fetch', '--prune', $config{gitorigin_branch});
++ if (run_or_non('git', 'show-ref', '--quiet', '--verify', '--', 'refs/remotes/' . $config{gitorigin_branch} . '/' . $config{gitmaster_branch}) ) {
++ run_or_cry('git', 'merge', $config{gitorigin_branch} . '/' . $config{gitmaster_branch});
++ }
+ }
+ }
+
+</pre>
+
+Same concern here about extra work. Code path is nearly as hot, being
+called on every refresh. Probably could be dealt with similarly as above.
+
+Also, is there any point in breaking the pull up into a
+fetch followed by a merge? --[[Joey]]
+
+<pre>
+@@ -559,7 +566,7 @@ sub rcs_commit_helper (@) {
+ # So we should ignore its exit status (hence run_or_non).
+ if (run_or_non('git', 'commit', '-m', $params{message}, '-q', @opts)) {
+ if (length $config{gitorigin_branch}) {
+- run_or_cry('git', 'push', $config{gitorigin_branch});
++ run_or_cry('git', 'push', $config{gitorigin_branch}, $config{gitmaster_branch});
+ }
+ }
+
+</pre>
+
+This seems fine to apply. --[[Joey]]
diff --git a/doc/todo/hidden_links__47__tags.mdwn b/doc/todo/hidden_links__47__tags.mdwn
new file mode 100644
index 000000000..2a4749394
--- /dev/null
+++ b/doc/todo/hidden_links__47__tags.mdwn
@@ -0,0 +1,13 @@
+[[!tag wishlist]]
+
+I would like to have the possibility for hidden tags or links.
+Using the tag functionality I could group some news items for including them into other subpages. But I don't want the links or tags to show (and I don't want Tag lists like "Tags: ?mytag").
+The tagged items should not differ from the items, that are not tagged.
+I didn't find any way to hide the tag list or links, and I don't want to have to create a "hidden" page containing links to the pages and then use the backlink functionality, because that is more prone to errors: it's easier to forget adding a link on a second page than to forget a needed tag on a new news item.
+
+> I found out that using the [[meta plugin|plugins/meta]] it is possible to create the hidden link that I wanted.
+-- [[users/Enno]]
+
+>> Yes, [[meta link|ikiwiki/directive/meta]] will not show up as a visible link on the page, while
+>> also not showing up in the list of tags of a page, so it seems to be what you
+>> want. [[done]] --[[Joey]]
diff --git a/doc/todo/hook_to_detect_markdown_links_to_wiki_pages.mdwn b/doc/todo/hook_to_detect_markdown_links_to_wiki_pages.mdwn
new file mode 100644
index 000000000..ad9c7dda4
--- /dev/null
+++ b/doc/todo/hook_to_detect_markdown_links_to_wiki_pages.mdwn
@@ -0,0 +1 @@
+For an internal wiki, we occasionally get patches that link to internal wiki pages using the Markdown link syntax. I'd love to see an optional git hook to detect that and complain.
diff --git a/doc/todo/html.mdwn b/doc/todo/html.mdwn
new file mode 100644
index 000000000..4f4542be2
--- /dev/null
+++ b/doc/todo/html.mdwn
@@ -0,0 +1,6 @@
+Create some nice(r) stylesheets.
+
+Should be doable w/o touching a single line of code, just
+editing the [[templates]] and/or editing [[style.css]].
+
+[[done]] ([[css_market]] ..)
diff --git a/doc/todo/htmlvalidation.mdwn b/doc/todo/htmlvalidation.mdwn
new file mode 100644
index 000000000..e376b840e
--- /dev/null
+++ b/doc/todo/htmlvalidation.mdwn
@@ -0,0 +1,47 @@
+ * Doctype is XHTML 1.0 Strict
+
+ One consideration of course is that regular users might embed html
+ that uses deprecated presentational elements like &lt;center&gt;. At
+ least firefox seems to handle that mixture ok.
+ --[[Joey]]
+
+ * [ [inlinepage] ] gets wrapped in &lt;p&gt;...&lt;/p&gt; which has a high chance of invalidating the page.
+
+ Since markdown does this, the only way I can think to fix it is to
+ make the inlined page text start with &lt;/p&gt; and end with
+ &lt;p&gt;. Ugly, and of course there could be problems with
+ markdown enclosing it in other spanning tags in some cases.
+ I've implemented this hack now. :-/ --[[Joey]]
+
+ I used this 'hack' myself, but yesterday I came up with a better idea:
+ &lt;div class="inlinepage"&gt;
+ [ [inlinepage] ]
+ &lt;/div&gt;
+ This prevents markdown enclosing and even adds a useful css identifier. Problem is that this should be added to every page and not in the template(s). --[[JeroenSchot]]
+
+ I can make ikiwiki add that around every inlined page easily
+ enough. However, where is it documented? Came up dry on google.
+ --[[Joey]]
+
+ From <http://daringfireball.net/projects/markdown/syntax#html>:
+ > The only restrictions are that block-level HTML elements e.g. &lt;div&gt;, &lt;table&gt;, &lt;pre&gt;, &lt;p&gt;, etc. must be separated from surrounding content by blank lines, and the start and end tags of the block should not be indented with tabs or spaces. Markdown is smart enough not to add extra (unwanted) &lt;p&gt; tags around HTML block-level tags. [snip]
+ > Note that Markdown formatting syntax is not processed within
+ > block-level HTML tags. E.g., you can't use Markdown-style \*emphasis\* inside an HTML block.
+
+ Because [ [inlinepage] ] isn't separated by a blank line it gets treated as a block-level element. Hmm, will this stop all formatting, including *'s to em-tags? --[[JeroenSchot]]
+
+ Ah didn't realize you meant it fixed it at the markdown level. I'll
+ think about making postprocessor directives into
+ preprocessor directives instead, then I could use that fix (but I'm not
+ sure how feasible it is to do that). --[[Joey]]
+
+ Done.. inlining is now a preprocessor directive, happens before
+ markdown, and the inlinepage template uses div as suggested, this does
+ prevent markdown from doing any annoying escaping of the preprocessor
+    directives, as well as preventing it from wrapping subpages in &lt;p&gt;.
+ --[[Joey]]
+
+This page is now valid.
+Test: [validate this page](http://validator.w3.org/check?url=referer)
+
+[[todo/done]]
diff --git a/doc/todo/htpasswd_mirror_of_the_userdb.mdwn b/doc/todo/htpasswd_mirror_of_the_userdb.mdwn
new file mode 100644
index 000000000..e4a411780
--- /dev/null
+++ b/doc/todo/htpasswd_mirror_of_the_userdb.mdwn
@@ -0,0 +1,29 @@
+[[!tag wishlist]]
+
+Ikiwiki is static, so access control for viewing the wiki must be
+implemented on the web server side. Managing wiki users and access
+together, we can currently
+
+* use [[httpauth|plugins/httpauth/]], but some [[passwordauth|plugins/passwordauth]] functionality [[is missing|todo/httpauth_feature_parity_with_passwordauth/]];
+* use [[passwordauth|plugins/passwordauth]] plus [[an Apache `mod_perl` authentication mechanism|plugins/passwordauth/discussion/]], but this is Apache-centric and enabling `mod_perl` just for auth seems overkill.
+
+Moreover, when ikiwiki is just a part of a wider web project, we may want
+to use the same userdb for the other parts of this project.
+
+I think an ikiwiki plugin which would (re)generate an htpasswd version of
+the user/password database (better, two htpasswd files, one with only the wiki
+admins and one with everyone) each time a user is added or modified would
+solve this problem:
+
+* access control can be managed from the web server
+* user management is handled by the passwordauth plugin
+* htpasswd format is understood by various servers (Apache, lighttpd, nginx, ...) and languages commonly used for web development (perl, python, ruby)
+* htpasswd files can be mirrored on other machines when the web site is distributed
+
+-- [[nil]]
+
+> I think this is a good idea. Although unless the password hashes that
+> are stored in the userdb are compatible with htpasswd hashes,
+> the htpasswd hashes will need to be stored in the userdb too. Then
+> any userdb change can just regenerate the htpasswd file, dumping out
+> the right kind of hashes. --[[Joey]]
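For what it's worth, the core of such a plugin could be as simple as the following sketch. The `cryptpassword` field name and the `@only` filter are hypothetical, not actual ikiwiki API; it assumes, per Joey's comment, that htpasswd-compatible hashes are already stored in the userdb:

```perl
# Hypothetical sketch, not a real ikiwiki plugin: dump userdb entries
# into an htpasswd-format file each time a user is added or changed.
sub write_htpasswd {
	my ($userinfo, $file, @only) = @_;
	open(my $fh, '>', "$file.new") or die "open $file.new: $!";
	foreach my $user (sort keys %$userinfo) {
		# optionally restrict to a subset, e.g. the wiki admins
		next if @only && ! grep { $_ eq $user } @only;
		# assumes a crypt()-compatible hash is already stored
		my $hash = $userinfo->{$user}->{cryptpassword};
		print $fh "$user:$hash\n" if defined $hash;
	}
	close $fh or die "close: $!";
	rename("$file.new", $file) or die "rename: $!"; # atomic replace
}
```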
diff --git a/doc/todo/http_bl_support.mdwn b/doc/todo/http_bl_support.mdwn
new file mode 100644
index 000000000..f7a46ee6c
--- /dev/null
+++ b/doc/todo/http_bl_support.mdwn
@@ -0,0 +1,67 @@
+[Project Honeypot](http://projecthoneypot.org/) has an HTTP:BL API available to subscribers (it's free; they accept donations). There's a basic perl package someone wrote; I'm including a copy here.
+
+[from here](http://projecthoneypot.org/board/read.php?f=10&i=112&t=112)
+
+> The [[plugins/blogspam]] service already checks urls against
+> the surbl, and has its own IP blacklist. The best way to
+> support the HTTP:BL may be to add a plugin
+> [there](http://blogspam.repository.steve.org.uk/file/cc858e497cae/server/plugins/).
+> --[[Joey]]
+
+<pre>
+package Honeypot;
+
+use Socket qw/inet_ntoa/;
+
+my $dns = 'dnsbl.httpbl.org';
+my %types = (
+    0 => 'Search Engine',
+    1 => 'Suspicious',
+    2 => 'Harvester',
+    4 => 'Comment Spammer',
+);
+
+sub query {
+    my $key = shift || die 'You need a key for this, you get one at http://www.projecthoneypot.org';
+    my $ip = shift || do {
+        warn 'no IP for request in Honeypot::query().';
+        return;
+    };
+
+    my @parts = reverse split /\./, $ip;
+    my $lookup_name = join '.', $key, @parts, $dns;
+
+    my $answer = gethostbyname($lookup_name);
+    return unless $answer;
+    $answer = inet_ntoa($answer);
+    my (undef, $days, $threat, $type) = split /\./, $answer;
+    my @types;
+    while (my ($bit, $typename) = each %types) {
+        push @types, $typename if $bit & $type;
+    }
+    return {
+        days   => $days,
+        threat => $threat,
+        type   => join ',', @types,
+    };
+}
+
+1;
+</pre>
+
+From the page:
+
+> The usage is simple:
+
+> The usage is simple:
+
+>     use Honeypot;
+>     my $key = 'XXXXXXX'; # your key
+>     my $ip = '....';     # the IP you want to check
+>     my $q = Honeypot::query($key, $ip);
+
+>     use Data::Dumper;
+>     print Dumper $q;
+
+Any chance of having this as a plugin?
+
+I could give it a go, too. Would be fun to try my hand at Perl. --[[simonraven]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/httpauth_example.mdwn b/doc/todo/httpauth_example.mdwn
new file mode 100644
index 000000000..0c6268aa1
--- /dev/null
+++ b/doc/todo/httpauth_example.mdwn
@@ -0,0 +1,8 @@
+ikiwiki should supply an example .htaccess file for use with HTTP
+authentication (perhaps as a [[tip|tips]]), showing how to authenticate the
+user for edits without requiring authentication for the entire wiki. (Ideally,
+recentchanges should work without authentication as well, even though it goes
+through the CGI.) --[[JoshTriplett]]
+
+> (Now that recentchanges is a static page, it auths the same as other wiki
+> pages.) --[[Joey]]
diff --git a/doc/todo/httpauth_example/discussion.mdwn b/doc/todo/httpauth_example/discussion.mdwn
new file mode 100644
index 000000000..36a1630fd
--- /dev/null
+++ b/doc/todo/httpauth_example/discussion.mdwn
@@ -0,0 +1 @@
+I wrote my experiences dealing with authentication [here](http://wiki.swclan.homelinux.org/tech/). Does it fit the bill? It's not really authentication (though it's mentioned in my process) so much as just an attempt to eliminate the clear-text password problem. --[Andrew Sackville-West](mailto:andrew@swclan.homelinux.org) \ No newline at end of file
diff --git a/doc/todo/httpauth_feature_parity_with_passwordauth.mdwn b/doc/todo/httpauth_feature_parity_with_passwordauth.mdwn
new file mode 100644
index 000000000..eb71cf840
--- /dev/null
+++ b/doc/todo/httpauth_feature_parity_with_passwordauth.mdwn
@@ -0,0 +1,28 @@
+The only way to have a private ikiwiki, with a shared user database
+for static pages and CGI authentication, is to use
+[[plugins/httpauth]]. It would be good for httpauth to be on par with
+[[plugins/passwordauth]], i.e. to allow registering users, resetting
+passwords, and changing passwords; supporting some kind of
+`account_creation_password` configuration option would be nice, too.
+
+I'll probably propose patches implementing this at some point.
+I've not had a single look at the code yet, but it may be nice to factor out
+the relevant passwordauth code, instead of rewriting it completely in httpauth.
+
+-- [[intrigeri]]
+
+Well, on such a private wiki, one can neither register oneself nor
+reset one's password: the registration page, like any other page, would be
+forbidden to non-authenticated users. Admin users should then be
+able to:
+
+- register a new user
+- reset someone else's password
+
+In both cases, a brand new random password is sent by e-mail to the
+new user.
+
+An authenticated user should nevertheless be able to change their
+own password. -- [[intrigeri]]
+
+[[wishlist]]
diff --git a/doc/todo/hyphenation.mdwn b/doc/todo/hyphenation.mdwn
new file mode 100644
index 000000000..e7f6bc401
--- /dev/null
+++ b/doc/todo/hyphenation.mdwn
@@ -0,0 +1,32 @@
+[[!tag wishlist]]
+
+I recently found [Hyphenator](http://code.google.com/p/hyphenator/), which is quite cool ... but it should be possible to implement this functionality within ikiwiki itself, rather than relying on JavaScript in the client.
+
+A Perl implementation of the algorithm exists in [[!cpan TeX::Hyphen]].
+
+> I'd be inclined to say that Javascript run in the client is a better
+> place to do hyphenation: this is the sort of non-essential,
+> progressive-enhancement thing that JS is perfect for. If you did it
+> at the server side, to cope with browser windows of different sizes
+> you'd have to serve HTML sprinkled with soft-hyphen entities at
+> every possible hyphenation point, like
+>
+> pro&shy;gress&shy;ive en&shy;hance&shy;ment
+>
+> which is nearly twice the byte-count and might stop
+> search engines from indexing your site correctly.
+>
+> A browser that supports Javascript probably also supports
+> soft-hyphen marks, but I doubt all non-JS browsers support them
+> correctly.
+>
+> It might be good to have a plugin to insert a reference to the
+> hyphenation JS into the `<head>`, or a general way to enable
+> this sort of thing without writing a plugin or changing your
+> `page.tmpl`, though. Perhaps we should have a `local.js`
+> alongside `local.css`? :-)
+>
+> --[[smcv]]
+
+>> Thanks, I did not realize that the javascript does something more than add &amp;shy;s; I have taken a closer look at it now.
+>> I doubt, however, that adding them will increase the byte count more than transmitting the javascript does.
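For reference, the server-side approach using [[!cpan TeX::Hyphen]] could look roughly like this sketch (using the module's default US-English patterns; per smcv's comment above, sprinkling `&shy;` at every break point is exactly what inflates the byte count):

```perl
# Sketch of the server-side alternative: insert soft hyphens into a
# word at the break points TeX::Hyphen finds.
use TeX::Hyphen;

my $hyphenator = TeX::Hyphen->new;

sub soft_hyphenate {
	my $word = shift;
	my $offset = 0;
	# hyphenate() returns break positions in ascending order
	for my $pos ($hyphenator->hyphenate($word)) {
		substr($word, $pos + $offset, 0) = '&shy;';
		$offset += length('&shy;');
	}
	return $word;
}
```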
diff --git a/doc/todo/ikibot.mdwn b/doc/todo/ikibot.mdwn
new file mode 100644
index 000000000..cb1d53c4a
--- /dev/null
+++ b/doc/todo/ikibot.mdwn
@@ -0,0 +1,9 @@
+Random idea: create an ikiwiki IRC bot, which notices the use of ikiwiki syntax in the channel and translates. This would work nicely for "frequently-given answer" bots like dpkg on #debian, since you could give answers by linking to wiki pages. ikibot could also excerpt page content.
+
+ <newikiuser> How do I set up ikiwiki with Git?
+ <ikihacker> \[[setup]]
+ <ikibot> http://ikiwiki.info/setup.html: "This tutorial will walk you through setting up a wiki with ikiwiki. ..."
+
+--[[JoshTriplett]]
+
+[[wishlist]]
diff --git a/doc/todo/improve_globlists.mdwn b/doc/todo/improve_globlists.mdwn
new file mode 100644
index 000000000..86a4ba154
--- /dev/null
+++ b/doc/todo/improve_globlists.mdwn
@@ -0,0 +1,8 @@
+Need to improve globlists, adding more powerful boolean expressions.
+The current behavior is to check for negated expressions, and not match if
+there are any; then check for normal expressions, and match if any match.
+This fails if you want to do something like match only pages with tag foo
+that are under directory bar. I think we need parens for grouping, and
+probably also boolean OR.
+
+[[todo/done]]
diff --git a/doc/todo/improved_mediawiki_support.mdwn b/doc/todo/improved_mediawiki_support.mdwn
new file mode 100644
index 000000000..68cbcf7d4
--- /dev/null
+++ b/doc/todo/improved_mediawiki_support.mdwn
@@ -0,0 +1,9 @@
+[[!tag patch todo wishlist]]
+
+I several updates to the mediawiki plugin to improve compatibility, improving img and File: support. I'd love to get them upstream. Is there any interest? Patches are at [[http://www.isi.edu/~johnh/SOFTWARE/IKIWIKI/index.html]]
+
+> The mediawiki plugin has never been included in ikiwiki, it's
+> [provided by a third party](https://github.com/jmtd/mediawiki.pm) and
+> you should send your patches to them.
+> [[done]]
+> --[[Joey]]
diff --git a/doc/todo/improved_parentlinks_styling.mdwn b/doc/todo/improved_parentlinks_styling.mdwn
new file mode 100644
index 000000000..64130a616
--- /dev/null
+++ b/doc/todo/improved_parentlinks_styling.mdwn
@@ -0,0 +1,9 @@
+Use a styled ul for the parentlinks.
+
+On second thought, I've decided not to. Doing that would make ikiwiki less
+usable in browsers like w3m that don't support styled uls. ikiwiki does use
+styled uls for other things, such as the action bar, but displaying that as
+a simple unstyled list in a simple browser works well and makes sense. For
+parent links, it does not. --[[Joey]]
+
+[[done]]
diff --git a/doc/todo/index.html_allowed.mdwn b/doc/todo/index.html_allowed.mdwn
new file mode 100644
index 000000000..f5e6f8cd7
--- /dev/null
+++ b/doc/todo/index.html_allowed.mdwn
@@ -0,0 +1,126 @@
+This page used to be used for two patches, one of which is applied
+providing the usedirs option for output. The remaining patch, discussed
+below, concerns wanting to use foo/index.mdwn source files and get an
+output page name of foo, rather than foo/index. --[[Joey]]
+
+[[!tag patch]]
+
+---
+
+I independently implemented a similar, but smaller patch.
+(It's smaller because I only care about rendering; not CGI, for example.)
+The key to this patch is that "A/B/C" is treated as equivalent
+to "A/B/C/index".
+Here it is: --Per Bothner
+
+ --- IkiWiki/Render.pm~ 2007-01-11 15:01:51.000000000 -0800
+ +++ IkiWiki/Render.pm 2007-02-02 22:24:12.000000000 -0800
+ @@ -60,9 +60,9 @@
+ foreach my $dir (reverse split("/", $page)) {
+ if (! $skip) {
+ $path.="../";
+ - unshift @ret, { url => $path.htmlpage($dir), page => pagetitle($dir) };
+ + unshift @ret, { url => abs2rel(htmlpage(bestlink($page, $dir)), dirname($page)), page => pagetitle($dir) };
+ }
+ - else {
+ + elsif ($dir ne "index") {
+ $skip=0;
+ }
+ }
+
+ --- IkiWiki.pm~ 2007-01-12 12:47:09.000000000 -0800
+ +++ IkiWiki.pm 2007-02-02 18:02:16.000000000 -0800
+ @@ -315,6 +315,12 @@
+ elsif (exists $pagecase{lc $l}) {
+ return $pagecase{lc $l};
+ }
+ + else {
+ + my $lindex = $l . "/index";
+ + if (exists $links{$lindex}) {
+ + return $lindex;
+ + }
+ + }
+ } while $cwd=~s!/?[^/]+$!!;
+
+ if (length $config{userdir} && exists $links{"$config{userdir}/".lc($link)}) {
+
+Note I handle setting the url slightly differently.
+Also note that an initial "index" is ignored. I.e. a
+page "A/B/index.html" is treated as "A/B".
+
+> Actually, your patch is shorter because it's more elegant and better :)
+> I'm withdrawing my old patch, because yours is much more in line with
+> ikiwiki's design and architecture.
+> I would like to make one suggestion to your patch, which is:
+
+ diff -urX ignorepats clean-ikidev/IkiWiki/Plugin/inline.pm ikidev/IkiWiki/Plugin/inline.pm
+ --- clean-ikidev/IkiWiki/Plugin/inline.pm 2007-02-25 12:26:54.099113000 -0800
+ +++ ikidev/IkiWiki/Plugin/inline.pm 2007-02-25 14:55:21.163340000 -0800
+ @@ -154,7 +154,7 @@
+ $link=htmlpage($link) if defined $type;
+ $link=abs2rel($link, dirname($params{destpage}));
+ $template->param(pageurl => $link);
+ - $template->param(title => pagetitle(basename($page)));
+ + $template->param(title => titlename($page));
+ $template->param(ctime => displaytime($pagectime{$page}));
+
+ if ($actions) {
+ @@ -318,7 +318,7 @@
+ my $pcontent = absolute_urls(get_inline_content($p, $page), $url);
+
+ $itemtemplate->param(
+ - title => pagetitle(basename($p), 1),
+ + title => titlename($p, 1),
+ url => $u,
+ permalink => $u,
+ date_822 => date_822($pagectime{$p}),
+ diff -urX ignorepats clean-ikidev/IkiWiki/Render.pm ikidev/IkiWiki/Render.pm
+ --- clean-ikidev/IkiWiki/Render.pm 2007-02-25 12:26:54.745833000 -0800
+ +++ ikidev/IkiWiki/Render.pm 2007-02-25 14:54:01.564715000 -0800
+ @@ -110,7 +110,7 @@
+ $template->param(
+ title => $page eq 'index'
+ ? $config{wikiname}
+ - : pagetitle(basename($page)),
+ + : titlename($page),
+ wikiname => $config{wikiname},
+ parentlinks => [parentlinks($page)],
+ content => $content,
+ diff -urX ignorepats clean-ikidev/IkiWiki.pm ikidev/IkiWiki.pm
+ --- clean-ikidev/IkiWiki.pm 2007-02-25 12:26:58.812850000 -0800
+ +++ ikidev/IkiWiki.pm 2007-02-25 15:05:22.328852000 -0800
+ @@ -192,6 +192,12 @@
+ return $untainted;
+ }
+
+ +sub titlename($;@) {
+ + my $page = shift;
+ + $page =~ s!/index$!!;
+ + return pagetitle(basename($page), @_);
+ +}
+ +
+ sub basename ($) {
+ my $file=shift;
+
+
+> This way foo/index gets "foo" as its title, not "index". --Ethan
+
+I took another swing at this and subverted the dominant paradigm. Here goes:
+
+<pre>
+diff -ru ikiwiki-2.4/IkiWiki.pm ikiwiki/IkiWiki.pm
+--- ikiwiki-2.4/IkiWiki.pm 2007-06-26 15:01:57.000000000 -0700
++++ ikiwiki/IkiWiki.pm 2007-07-25 15:58:00.990749000 -0700
+@@ -239,6 +239,7 @@
+ my $type=pagetype($file);
+ my $page=$file;
+ $page=~s/\Q.$type\E*$// if defined $type;
++ $page=~s/\/index$// if $page =~ /\/index$/;
+ return $page;
+ }
+
+</pre>
+
+This just makes it so that all files named foo/index become pages called foo, which is the desired effect. I haven't tested everything so far, so be careful! But you can see it working at http://ikidev.betacantrips.com/one/ again, as before. --Ethan
+
+[[done]], the indexpages setting enables this.
diff --git a/doc/todo/inline:_numerical_ordering_by_title.mdwn b/doc/todo/inline:_numerical_ordering_by_title.mdwn
new file mode 100644
index 000000000..3d7424b3f
--- /dev/null
+++ b/doc/todo/inline:_numerical_ordering_by_title.mdwn
@@ -0,0 +1,254 @@
+Could you please add numerical ordering by title to the [[inline|plugins/inline]]
+plugin? Now I can only do alphabetical ordering by title, but sometimes that's not enough.
+
+> Implemented, see [[natural_sorting]] [[!tag done]] --[[Joey]]
+
+BTW, it seems that ordering by title is really ordering by the filename of the page.
+For me "title" means the title of a page, which I can set using the `title` parameter
+of the [[meta|plugins/meta]] plugin :)
+
+Why do I need that feature? I've just been migrating an info site of our university
+[mail system](http://poczta.uw.edu.pl/) to Ikiwiki from a very static, console-driven
+Makefile+[WML](http://thewml.org/)+XML+XSL=HTML solution. I have many news files
+(`1.mdwn`, `2.mdwn`, etc.) and unfortunately I did a very stupid thing: I committed
+all of them in the same revision of our Subversion repo...
+
+Now I have a problem with sorting these files using the inline plugin. I can't
+sort by age, because both old and young news files have the same age. I can't
+sort by title either. For example, when I sort them by title, the `9.mdwn` page is
+between the `90.mdwn` and `89.mdwn` pages... It sucks, of course. Sorting by mtime
+is also not a solution for me, because it means that I can't touch/fix old news
+anymore.
+
+Do you have any idea how to workaround that issue? --[[Paweł|ptecza]]
+
+> Delete all files. Add files back one at a time, committing after adding
+> each file. Sort by date. --[[Joey]]
+
+>> The simplest solutions are the best :D Thanks for the hint! I didn't
+>> want to do it before, because I was afraid that my Subversion would keep
+>> the old creation dates of the files. --[[Paweł|ptecza]]
+
+> Maybe you can rename `9.mdwn` to `09.mdwn`? See `rename(1)`, it renames multiple files
+> in one go. --[[buo]]
+
+>> Thanks for your suggestion! But what if the number of my news files grows to 100+?
+
+>> $ ls
+>> 09.mdwn 100.mdwn 101.mdwn 102.mdwn 89.mdwn 90.mdwn
+
+>> I don't want to rename all previous files to add `0` prefix. --[[Paweł|ptecza]]
+
+>>> Rather than adding 0's or a 'sorttype' parameter, I'd just fix the sort order.
+>>> Both MacOS and Windows use a smarter sort order than just lexical in their
+>>> file browsers (e.g. <http://support.microsoft.com/default.aspx?kbid=319827>,
+>>> <http://docs.info.apple.com/article.html?artnum=300989>).
+>>>
+>>> The [Unicode Collation algorithm](http://en.wikipedia.org/wiki/Unicode_collation_algorithm)
+>>> would seem to be a reasonable sort order. (See also <http://www.unicode.org/unicode/reports/tr10/>.)
+>>> Unfortunately the standard perl implementation, [Unicode::Collate](http://perldoc.perl.org/Unicode/Collate.html)
+>>> doesn't handle the optional [numbers](http://www.unicode.org/unicode/reports/tr10/#Customization)
+>>> extension which is what you want. --[[Will]]
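For the record, the `rename(1)` suggestion above amounts to something like this one-off Perl loop (padding to three digits is an arbitrary choice; run it in the srcdir and commit the result):

```perl
# Zero-pad numeric page names so that plain lexical sorting matches
# numeric order, e.g. 9.mdwn -> 009.mdwn. One-off helper, sketch only.
foreach my $file (glob('*.mdwn')) {
	if ($file =~ /^(\d+)\.mdwn$/ && length($1) < 3) {
		rename($file, sprintf('%03d.mdwn', $1))
			or warn "rename $file: $!";
	}
}
```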
+
+---
+
+Below is my simple patch. Feel free to use it or comment!
+
+I also have 2 considerations for inline sorting:
+
+1. Maybe changing the name of the `sort` parameter to `sortby` or `sortkey` would
+   be a good idea?
+
+ > No, that would break existing wikis. --[[Joey]]
+   >> It's no problem. You just have the `ikiwiki-transition` utility :D --[[Paweł|ptecza]]
+
+1. Maybe you should use the `title` sort key for the title from the meta plugin, and `name`,
+   `filename`, `page` or `pagename` for page names? In the future you could also
+   sort by meta author, license or another key.
+
+ > There are many places in ikiwiki that do not use meta title info and
+ > could. I'd prefer to deal with that issue as a whole, not here,
+ > --[[Joey]]
+
+--[[Paweł|ptecza]]
+
+ --- inline.pm-orig 2008-09-02 09:53:20.000000000 +0200
+ +++ inline.pm 2008-09-02 10:09:02.000000000 +0200
+ @@ -186,7 +186,15 @@
+ }
+
+ if (exists $params{sort} && $params{sort} eq 'title') {
+ - @list=sort { pagetitle(basename($a)) cmp pagetitle(basename($b)) } @list;
+ + if (! $params{sorttype} || $params{sorttype} eq 'lexical') {
+ + @list=sort { pagetitle(basename($a)) cmp pagetitle(basename($b)) } @list;
+ + }
+ + elsif ($params{sorttype} eq 'numeric') {
+ + @list=sort { pagetitle(basename($a)) <=> pagetitle(basename($b)) } @list;
+ + }
+ + else {
+ + return sprintf(gettext("unknown sort type %s"), $params{sorttype});
+ + }
+ }
+ elsif (exists $params{sort} && $params{sort} eq 'mtime') {
+ @list=sort { $pagemtime{$b} <=> $pagemtime{$a} } @list;
+ @@ -195,7 +203,7 @@
+ @list=sort { $pagectime{$b} <=> $pagectime{$a} } @list;
+ }
+ else {
+ - return sprintf(gettext("unknown sort type %s"), $params{sort});
+ + return sprintf(gettext("unknown sort key %s"), $params{sort});
+ }
+
+ if (yesno($params{reverse})) {
+
+> To users, "sort" already determines the type of sort. It can be by title,
+> or by date, etc. Adding a separate "sorttype" value is thus fairly
+> confusing. --[[Joey]]
+
+>> OK. I will be more careful when I play with inline plugin :) --[[Paweł|ptecza]]
+
+---
+
+Joey, have you forgotten about that request? ;) --[[Paweł|ptecza]]
+
+> Okie. Here is a different [[patch]] based on my comment above. It doesn't introduce
+> a new key, but rather changes the title sorting order. Two caveats:
+
+ * I've only tested this in `inline`, not the other places I changed the sort order.
+ * I'm unsure if the regexp used in the split should be `/(-?\d+)/` instead of `/(\d+)/`.
+ As written, '-' is interpreted as a hyphen rather than a minus sign.
+
+> --[[Will]]
+
+>> I"m not comfortable with tossing out perl's default collator and trying
+>> to maintain some other one going forward. Especially not for such an
+>> edge case. --[[Joey]]
+
+>> Hi Will! Your idea looks interesting to me, but I'm afraid that it's too big
+>> a change in Ikiwiki... Maybe I'm wrong? ;) What do you think, Joey? --[[Paweł|ptecza]]
+
+>>> It isn't that big a change. It is just supplying a sort order to the sort. The
+>>> patch is a little larger because I then went through and made that sort
+>>> order available in other places where it makes sense. (Looking at the
+>>> patch again briefly, I should have also used it in the `map` plugin.)
+>>>
+>>> If you wanted a simple patch, you could just move the `titlecmp` function
+>>> into the inline plugin and only use it there. The problem with that is that
+>>> it only fixes the inline plugin. -- [[Will]]
+
+>>>> Will, I agree with you that it's an improved sort order. But on the other
+>>>> hand I prefer to be careful when I change something in several places,
+>>>> because I don't want to break any working things when I fix one thing.
+>>>> I hope that Joey agrees with us too and all Ikiwiki users will be happy
+>>>> after applying your patch ;) --[[Paweł|ptecza]]
+
+----
+
+ diff --git a/IkiWiki.pm b/IkiWiki.pm
+ index c0f5dea..d001f8d 100644
+ --- a/IkiWiki.pm
+ +++ b/IkiWiki.pm
+ @@ -20,7 +20,7 @@ use Exporter q{import};
+ our @EXPORT = qw(hook debug error template htmlpage add_depends pagespec_match
+ bestlink htmllink readfile writefile pagetype srcfile pagename
+ displaytime will_render gettext urlto targetpage
+ - add_underlay
+ + add_underlay titlecmp
+ %config %links %pagestate %renderedfiles
+ %pagesources %destsources);
+ our $VERSION = 2.00; # plugin interface version, next is ikiwiki version
+ @@ -835,6 +835,42 @@ sub titlepage ($) {
+ return $title;
+ }
+
+ +sub titlecmp ($$) {
+ + my $titleA=shift;
+ + my $titleB=shift;
+ +
+ + my @listA=split(/(\d+)/,$titleA);
+ + my @listB=split(/(\d+)/,$titleB);
+ +
+ + while (@listA && @listB) {
+ + # compare bits of text
+ + my $a = shift @listA;
+ + my $b = shift @listB;
+ + my $c = ($a cmp $b);
+ + return $c if ($c);
+ +
+ + if (@listA && @listB) {
+ + # compare numbers
+ + $a = shift @listA;
+ + $b = shift @listB;
+ + $c = $a <=> $b;
+ + return $c if ($c);
+ +
+ + # 01 is different to 1
+ + $c = (length($a) <=> length($b));
+ + return $c if ($c);
+ +
+ + $c = ($a cmp $b);
+ + return $c if ($c);
+ + }
+ + }
+ +
+ + return 1 if (@listA);
+ + return -1 if (@listB);
+ +
+ + return 0;
+ +}
+ +
+ sub linkpage ($) {
+ my $link=shift;
+ my $chars = defined $config{wiki_file_chars} ? $config{wiki_file_chars} : "-[:alnum:]+/.:_";
+ diff --git a/IkiWiki/Plugin/brokenlinks.pm b/IkiWiki/Plugin/brokenlinks.pm
+ index 37752dd..ccaa399 100644
+ --- a/IkiWiki/Plugin/brokenlinks.pm
+ +++ b/IkiWiki/Plugin/brokenlinks.pm
+ @@ -59,7 +59,7 @@ sub preprocess (@) {
+ map {
+ "<li>$_</li>"
+ }
+ - sort @broken)
+ + sort titlecmp @broken)
+ ."</ul>\n";
+ }
+
+ diff --git a/IkiWiki/Plugin/inline.pm b/IkiWiki/Plugin/inline.pm
+ index 8efef3f..263e7a6 100644
+ --- a/IkiWiki/Plugin/inline.pm
+ +++ b/IkiWiki/Plugin/inline.pm
+ @@ -192,7 +192,7 @@ sub preprocess_inline (@) {
+ }
+
+ if (exists $params{sort} && $params{sort} eq 'title') {
+ - @list=sort { pagetitle(basename($a)) cmp pagetitle(basename($b)) } @list;
+ + @list=sort { titlecmp(pagetitle(basename($a)),pagetitle(basename($b))) } @list;
+ }
+ elsif (exists $params{sort} && $params{sort} eq 'mtime') {
+ @list=sort { $pagemtime{$b} <=> $pagemtime{$a} } @list;
+ diff --git a/IkiWiki/Plugin/orphans.pm b/IkiWiki/Plugin/orphans.pm
+ index b910758..10a1d87 100644
+ --- a/IkiWiki/Plugin/orphans.pm
+ +++ b/IkiWiki/Plugin/orphans.pm
+ @@ -56,7 +56,7 @@ sub preprocess (@) {
+ htmllink($params{page}, $params{destpage}, $_,
+ noimageinline => 1).
+ "</li>"
+ - } sort @orphans).
+ + } sort titlecmp @orphans).
+ "</ul>\n";
+ }
+
+ diff --git a/IkiWiki/Render.pm b/IkiWiki/Render.pm
+ index ceb7c84..00798e1 100644
+ --- a/IkiWiki/Render.pm
+ +++ b/IkiWiki/Render.pm
+ @@ -89,7 +89,7 @@ sub genpage ($$) {
+ $template->param(have_actions => 1);
+ }
+
+ - my @backlinks=sort { $a->{page} cmp $b->{page} } backlinks($page);
+ + my @backlinks=sort { titlecmp($a->{page}, $b->{page}) } backlinks($page);
+ my ($backlinks, $more_backlinks);
+ if (@backlinks <= $config{numbacklinks} || ! $config{numbacklinks}) {
+ $backlinks=\@backlinks;
diff --git a/doc/todo/inline_directive_should_support_pagination.mdwn b/doc/todo/inline_directive_should_support_pagination.mdwn
new file mode 100644
index 000000000..eafe6ee11
--- /dev/null
+++ b/doc/todo/inline_directive_should_support_pagination.mdwn
@@ -0,0 +1,8 @@
+Ikiwiki should support pagination for index pages. Something like showing only 10 items on the first page, and then having the other items on the other pages.
+
+Basically, the same page would be rendered multiple times:
+
+- The index page: rendered normally, but item list is truncated to N items
+- The separate pages: rendered with a slice of the item list containing N items (or less for the last page)
+
+This I think breaks one major assumption: that source pages only generate one page in the output directory.
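The slicing itself is trivial; a sketch (the function name is illustrative, nothing here is existing ikiwiki API):

```perl
# Sketch only: split an inlined item list into pages of $per_page
# items. The first slice would render into the index page itself, the
# remaining slices into the separate pages described above.
sub paginate {
	my ($per_page, @items) = @_;
	my @pages;
	push @pages, [ splice(@items, 0, $per_page) ] while @items;
	return @pages;
}
```

The hard part, as noted, is not the slicing but making one source page emit several output pages.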
diff --git a/doc/todo/inline_option_for_pagespec-specific_show__61__N.mdwn b/doc/todo/inline_option_for_pagespec-specific_show__61__N.mdwn
new file mode 100644
index 000000000..6ee6e046f
--- /dev/null
+++ b/doc/todo/inline_option_for_pagespec-specific_show__61__N.mdwn
@@ -0,0 +1,3 @@
+inline could have a pagespec-specific show=N option, to say things like "10 news items (news/*), but at most 3 news items about releases (news/releases/*)".
+
+This should eliminate the need for wikiannounce to delete old news items about releases. \ No newline at end of file
diff --git a/doc/todo/inline_plugin:_ability_to_override_feed_name.mdwn b/doc/todo/inline_plugin:_ability_to_override_feed_name.mdwn
new file mode 100644
index 000000000..df5bf9194
--- /dev/null
+++ b/doc/todo/inline_plugin:_ability_to_override_feed_name.mdwn
@@ -0,0 +1,29 @@
+If RSS and Atom are enabled by default, the [[plugins/contrib/comments]]
+plugin generates a feed, perhaps `/sandbox/index.atom` for comments on the
+sandbox. If a blog is added to the page, the blog will steal the name
+`/sandbox/index.atom` and the comments plugin's feed will change to
+`/sandbox/index.atom2`.
+
+If `\[[!inline]]` gained a parameter `feedname` or something, the comments
+plugin could use `feedname=comments` to produce `/sandbox/comments.atom`
+instead (this would just require minor enhancements to rsspage(),
+atompage() and targetpage()).
+
+As a side benefit, [my blog](http://smcv.pseudorandom.co.uk/) could go back
+to its historical Atom feed URL of `.../feed.atom` (which is currently a
+symlink to `index.atom` :-) )
+
+On sites not using `usedirs` the current feed is `/sandbox.atom`, and we
+could perhaps change it to `/sandbox-comments.atom` or
+`/sandbox/comments.atom` if `feedname=comments` is given.
+
+--[[smcv]]
+
+> This is slightly hard to do, because you have to worry about
+> conflicting pages setting feedname, which could cause ikiwiki to blow up.
+>
+> Particularly for the non-usedirs case, where a page `sandbox/comments`
+> would produce the same feed as sandbox with `feedname=comments`.
+> --[[Joey]]
+
+> [[done]] as feedfile option --[[Joey]]
diff --git a/doc/todo/inline_plugin:_hide_feed_buttons_if_empty.mdwn b/doc/todo/inline_plugin:_hide_feed_buttons_if_empty.mdwn
new file mode 100644
index 000000000..d046c0cd0
--- /dev/null
+++ b/doc/todo/inline_plugin:_hide_feed_buttons_if_empty.mdwn
@@ -0,0 +1,7 @@
+ < joeyh> 03:49:19> also, I think it may be less visually confusing to
+ drop the rss/atom buttons for comments when there are none yet
+
+This seems to me like something that applies to the [[plugins/inline]] plugin in general, rather than the [[plugins/contrib/comments]] plugin specifically. --[[smcv]]
+
+>> [[done]] as emptyfeeds option, not on by default for inline, but I think
+>> it should be for comments --[[Joey]]
diff --git a/doc/todo/inline_plugin:_specifying_ordered_page_names.mdwn b/doc/todo/inline_plugin:_specifying_ordered_page_names.mdwn
new file mode 100644
index 000000000..85bb4ff5a
--- /dev/null
+++ b/doc/todo/inline_plugin:_specifying_ordered_page_names.mdwn
@@ -0,0 +1,19 @@
+A [[!taglink patch]] in my git repository (the inline-pagenames branch) adds
+the following parameter to the [[ikiwiki/directive/inline]] directive:
+
+> * `pagenames` - If given instead of `pages`, this is interpreted as a
+> space-separated list of links to pages (with the same
+> [[ikiwiki/SubPage/LinkingRules]] as in a [[ikiwiki/WikiLink]]), and they are inlined
+> in exactly the order given: the `sort` and `pages` parameters cannot be used
+> in conjunction with this one.
+
+This is on my [[wishlist]] for my [[plugins/contrib/album]] plugin, which currently
+uses it internally (as it has already collected the pages in order). It could also
+be useful for other things, like [[todo/wikitrails]]. --[[smcv]]
+
+[[!tag plugins/inline]]
+
+> It's sort of a pity that a pagespec like "a or b or c" doesn't somehow
+> match to (a, b, c) in that order, but I don't see how that would be
+> generally possible. While this feels a bit like bloat and inline already
+> has far too many parameters, I have [[merged|done]] it. --[[Joey]]
diff --git a/doc/todo/inline_postform_autotitles.mdwn b/doc/todo/inline_postform_autotitles.mdwn
new file mode 100644
index 000000000..39713eb5f
--- /dev/null
+++ b/doc/todo/inline_postform_autotitles.mdwn
@@ -0,0 +1,67 @@
+[[!tag wishlist patch plugins/inline]]
+[[!template id=gitbranch branch=chrysn/patches author="[[chrysn]]"]]
+
+for postforms in inlines of pages which follow a certain scheme, it might not
+be required to set the title for each individual post, but to automatically set
+the title and show no input box prompting for it.
+this can either be based on timestamp formatting, or use the already existing
+munging mechanism, which appends numbers to page titles in case that page
+already exists.
+
+two patches ([1], [2]) set inline up for that, adding an additional `autotitle`
+parameter. if that is given, the regular input of the inline postform will be
+replaced with a hidden input of that text. in addition, the empty title is
+permitted (both for autotitle and regular titles, as they go in the same GET
+parameter, `title`). as the empty page title is illegal, munging is used,
+resulting in ascending numeric page titles being created.
+
+the second patch is actually a one-liner, filtering the title through strftime.
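+
The combined behaviour of the two patches might look something like this (an illustrative Python sketch of the proposal, not the actual Perl code; the munging details are simplified):

```python
from time import gmtime, strftime

def autotitle(fmt, existing, when=None):
    """Sketch: run the autotitle through strftime, then apply
    ikiwiki-style munging (appending 2, 3, ...) on clashes; an empty
    title degrades to ascending bare numbers."""
    base = strftime(fmt, when if when is not None else gmtime())
    if base == "":
        n = 1
        while str(n) in existing:
            n += 1
        return str(n)
    if base not in existing:
        return base
    n = 2
    while base + str(n) in existing:
        n += 1
    return base + str(n)
```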
+
+> Something similar was requested in [[todo/more_customisable_titlepage_function]],
+> in which [[Joey]] outlined a similar solution.
+>
+> What's your use-case for not prompting for the title at all? I can see
+> [[madduck]]'s requirement for the title he typed in (say, "foo")
+> being transformed into 2009/07/26/foo or something (I name blog posts
+> like that myself), but I can't quite see the use for *entirely* automatic
+> titles.
+>
+> However, if that's really what you want, I suspect your code could be
+> extended so it also solves madduck's second request on
+> [[todo/more_customisable_titlepage_function]].
+>
+> --[[smcv]]
+
+### potential user interaction issues
+
+this has two side effects which have to be considered: first, the empty page
+title is accepted also in normal postforms (previously, this resulted in a "bad
+page name" error); second, entering a percent sign in that field might result
+in unexpected strftime substitution (strftime might not even substitute for
+common uses of percent as in "reach 10% market share", but might in others as
+in "the 10%-rule").
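+
The hazard is easy to demonstrate (Python's `strftime` here stands in for POSIX::strftime; doubling the percent signs is the usual escape):

```python
from time import gmtime, strftime

# User-entered text with a stray percent sign.
title = "reach 10%Y market share"

# %Y silently expands to the year: "reach 101970 market share"
mangled = strftime(title, gmtime(0))

# Escaping percent signs preserves the text verbatim.
safe = strftime(title.replace("%", "%%"), gmtime(0))
```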
+
+both can be circumvented by using another GET parameter for autotexts, as
+implemented in [3].
+> this patch still does not work perfectly; especially, it should make a
+> distinction between "autotitle is set but equal ''" (in which case it
+> should create a page named `1.mdwn`), and "autotitle is not set, and title is
+> equal ''" (in which case it should display the old error message) --[[chrysn]]
+
+### potential security issues
+
+* the autotitle's value is directly output through the template (but that's
+ done in other places as well, so i assume it's safe)
+* i don't know if anything bad can happen if unfiltered content is passed to
+ POSIX::strftime.
+
+### further extension
+
+having a pre-filled input field instead of an unchangeable hidden input might be
+cool (eg for creating an entry with yesterday's date), but would be a bit of a
+problem with static pages. javascript could help with the date part, but name
+munging would be yet another thing.
+
+[1]: http://github.com/github076986099/ikiwiki/commit/b568eb257a3ef5ff49a84ac00a3a7465b643c1e1
+[2]: http://github.com/github076986099/ikiwiki/commit/34bc82f232be141edf036d35e8ef5aa289415072
+[3]: http://github.com/github076986099/ikiwiki/commit/40dc10a4ec7809e401b4497c2abccfba30f7a2af
diff --git a/doc/todo/inline_raw_files.mdwn b/doc/todo/inline_raw_files.mdwn
new file mode 100644
index 000000000..52a4be726
--- /dev/null
+++ b/doc/todo/inline_raw_files.mdwn
@@ -0,0 +1,115 @@
+[[!template id=gitbranch branch=wtk/raw_inline author="[[wtk]]"]]
+
+summary
+=======
+
+Extend inlining to handle raw files (files with unrecognized extensions).
+
+Also raise an error in `IkiWiki::pagetype($file)` if `$file` is blank, which avoids trying to do much with missing files, etc.
+
+I'm using the new code in my [blog][].
+
+[blog]: http://blog.tremily.us/posts/yacc2dot/
+
+usage
+=====
+
+ \[[!inline pagenames="somefile.txt" template="raw" feeds="no"]]
+
+
+> But inline already supports raw files in two ways:
+>
+> * setting raw=yes will cause a page to be inlined raw without
+> using any template, as if it were part of the page at the location
+> of the inline
+> * otherwise, the file becomes an enclosure in the rss feed, for use with
+> podcasting.
+>
+> So I don't see the point of your patch. Although since your text
+> editor seems to like to make lots of whitespace changes, it's possible
+> I missed something in the large quantity of noise introduced by it.
+> --[[Joey]]
+
+>> As I understand it, setting `raw=yes` causes the page to be inlined
+>> as if the page contents had appeared in place of the directive. The
+>> content is then processed by whatever `htmlize()` applies to the
+>> inlining page. I want the inlined page to be unprocessed, and
+>> wrapped in `<pre><code>...</code></pre>` (as they are on the blog
+>> post I link to above).
+>>
+>> Enclosures do not include the page contents at all, just a link to
+>> them. I'm trying to inline the content so I can comment on it from
+>> the inlining page.
+>>
+>> Apologies for my cluttered version history, I should have branched my
+>> earlier changes off to make things clearer. I tried to isolate my
+>> whitespace changes (fixes?) in c9ae012d245154c3374d155958fcb0b60fda57ce.
+>> 157389355d01224b2d3c3f6e4c1eb42a20ec8a90 should hold all the content
+>> changes.
+>>
+>> A list of other things globbed into my master branch that should have
+>> been separate branches:
+>>
+>> * Make it easy to select a Markdown executable for mdwn.pm.
+>> * Included an updated form of
+>> [[Javier Rojas' linktoimgonly.pm|forum/link_to_an_image_inside_the_wiki_without_inlining_it]].
+>> * Included an updated form of
+>> [Jason Blevins' mdwn_itex.pm](http://jblevins.org/git/ikiwiki/plugins.git/plain/mdwn_itex.pm).
+>> * Assorted minor documentation changes.
+>>
+>> --[[wtk]]
+
+>>> I haven't heard anything in a while, so I've reorganized my version
+>>> history and rebased it on the current ikiwiki head. Perhaps now it
+>>> will be easier to merge or reject. Note the new branch name:
+>>> `raw_inline`. I'll open separate todo items for items mentioned in my
+>>> previous comment. --[[wtk]]
+
+----
+
+Reviewing your patch, the first thing I see is this:
+
+<pre>
++ if (! $file) {
++ error("Missing file.");
++ }
+</pre>
+
+This fails if the filename is "0". Also, `pagetype()`
+currently cannot fail; allowing it to crash the entire
+wiki build if the filename is somehow undefined seems
+unwise.
+
+I didn't look much further, because it seems to me what you're trying to do
+can be better accomplished by using the highlight plugin. Assuming the raw
+file you want to inline and comment on is some source-code-like thing,
+which seems likely.
+
+Or, another way to do it would be to use the templates plugin, and make
+a template there that puts an inline directive inside pre tags.
+ --[[Joey]] [[!tag reviewed]]
+
+----
+
+If `pagetype()` cannot fail, then I suppose that check has to go ;).
+
+I was under the impression that [[plugins/highlight]] didn't support
+inlining code. It looks like it supports highlighting stand-alone
+files or embedded code. Perhaps I should extend it to support inlined
+code instead of pushing this patch?
+
+> If you configure highlight to support standalone files, then you can
+> inline the resulting pages and get nicely highlighted source code
+> inlined into the page. --[[Joey]]
+
+The `raw.tmpl` included in the patch *does* include the inlined
+content inside `pre` tags. The problem is that the current inline
+code insists on running `htmlize()` on the content before inserting it
+in the template. The heart of my patch is an altered
+`get_inline_content()` that makes the `htmlize()` call dependent on a
+`$read_raw` flag. If the flag is set, the raw (non-htmlized) content
+is used instead.
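+
In Python terms, the altered helper would look something like this (a sketch of the described `$read_raw` switch, not the actual patch):

```python
import html

def inline_content(source, htmlize, read_raw=False):
    """Sketch of the proposed $read_raw flag: when set, skip htmlize()
    and emit the text escaped inside <pre><code> instead."""
    if read_raw:
        return "<pre><code>%s</code></pre>" % html.escape(source)
    return htmlize(source)
```

The point of the flag is visible in the escaped output: `inline_content("a < b", htmlize, read_raw=True)` keeps the literal `<` rather than handing it to the htmlizer.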
+
+I just rebased my patches against the current Ikiwiki trunk (no major
+changes) to make them easier to review.
+ --[[wtk]]
diff --git a/doc/todo/inlines_inheriting_links.mdwn b/doc/todo/inlines_inheriting_links.mdwn
new file mode 100644
index 000000000..c53da51c5
--- /dev/null
+++ b/doc/todo/inlines_inheriting_links.mdwn
@@ -0,0 +1,39 @@
+[[!tag wishlist]]
+
+Continuing the ideas in [[bugs/Inline doesn't wikilink to pages]].
+
+I thought of a use case for another feature: making [[ikiwiki/directive/inline]] inherit the link relations of the included pages (optionally, say, with `inheritlinks=yes`). For example, if I want to list `elements/*` that have been linked to in any of `new_stuff/*`, I could try to write a [[ikiwiki/pagespec]] like
+`elements/* and backlink(new_stuff/*)`.
+
+This is not yet possible, as discussed in [[todo/tracking_bugs_with_dependencies]].
+
+It would be possible to work around this limitation of pagespecs if it was possible to create a page `all_new_stuff` with `\[[!inline pages="new_stuff/*" inheritlinks=yes]]`: then the desired pagespec would be expressed as `elements/* and backlink(all_new_stuff)`.
+
+> Or, instead of specifying whether to inherit at the place of the inline, add more relations (`inline`, `backinline`) and relation composition (say, `*`, or haskell-ish `$` in order not confuse with the glob `*`) and explicitly write in the pagespecs that you want to follow the inline relation backwards: `elements/* and backlink$backinline(all_new_stuff)` or, equivalently, if [["classes"|todo/tracking_bugs_with_dependencies]] are implemented in pagespecs: `elements/* and backlink(backinline(all_new_stuff))`. Of course, this suggestion requires the powerful extension to pagespecs, but it gives more flexibility, and the possibility to avoid redundant information: the same pagespec at two places -- the inline and the other matching construction.
+>
+> BTW, adding more relations -- the `inline` relation among them -- would satisfy [[the other feature request|bugs/Inline doesn't wikilink to pages]]. --Ivan Z.
+
+This is not just an ugly workaround. There is a reason to want this feature: the classes of pages you want to refer to "recursively" (in that kind of complex pagespec) tend to have some meaning themselves. So I might indeed want to have a page like `all_new_stuff`; it would be useful in itself. And at the same time I would like to write pagespecs like `elements/* and backlink(all_new_stuff)` -- using the proposed feature in [[todo/tracking_bugs_with_dependencies/]] instead would be less clean, because then I would have to enter the same information in two places: in the inline and in the possibly complex pagespec. And having redundant information leads to inconsistency.
+
+So in a sense, in some or most cases, it would indeed be cleaner to "store" the definition of a class of pages referred to in complex pagespecs as a separate object. And the most natural representation for this definition of a class of pages (adhering to the principle of wiki that what you mean is entered/stored in its most natural representation, not through some hidden disconnected code) is making a page with an inline/map/or the like, so that at the same time you store the definition and you see what it is (the set of pages is displayed to you).
+
+I would actually use it in my current "project" in ikiwiki: I actually edit a set of materials as a set of subpages `new_stuff/*`, and I also want to have a combined view of all of them (made through inline), and at another page, I want to list what has been linked to in `new_stuff/*` and what hasn't been linked to.--Ivan Z.
+
+> I see where you're coming from, but let's think about
+> implementation efficiency for a second.
+>
+> In order for inline inheritlinks=yes to work,
+> the inline directive would need to be processed
+> during the scan pass.
+>
+> When the directive was processed there, it would need
+> to determine which pages get inlined (itself a moderately
+> expensive operation), and then determine which pages
+> each of them link to. Since the scan pass is unordered,
+> those pages may not have themselves been scanned yet.
+> So to tell what they link to, inline would have to load
+> each of them, and scan them.
+>
+> So there's the potential for this to slow
+> down a wiki build by about a factor of 2.
+> --[[Joey]]
diff --git a/doc/todo/integration_with_Firefox_and_Iceweasel_feed_subscription_mechanism.mdwn b/doc/todo/integration_with_Firefox_and_Iceweasel_feed_subscription_mechanism.mdwn
new file mode 100644
index 000000000..16d33b0bb
--- /dev/null
+++ b/doc/todo/integration_with_Firefox_and_Iceweasel_feed_subscription_mechanism.mdwn
@@ -0,0 +1,13 @@
+Firefox and Iceweasel, when encountering a news feed, display a page that
+allows the user to subscribe to the feed, using Live Bookmarks, Google Reader,
+Bloglines, My Yahoo!, or an external reader program. The list of available
+applications comes from URIs and titles in the preferences, under
+`browser.contentHandlers.types.*`. For the benefit of people who use
+[[plugins/aggregate]] as their feed reader, the ikiwiki CGI could expose a URI
+to directly add a new feed to the aggregated list; this would allow users to
+configure their browser to subscribe to feeds via [[plugins/aggregate]] running
+on their site. We could then provide the manual configuration settings as a
+[[tip|tips]], and perhaps provide an extension or other mechanism to set them
+automatically. --[[JoshTriplett]]
+
+[[wishlist]]
diff --git a/doc/todo/interactive_todo_lists.mdwn b/doc/todo/interactive_todo_lists.mdwn
new file mode 100644
index 000000000..22b92953a
--- /dev/null
+++ b/doc/todo/interactive_todo_lists.mdwn
@@ -0,0 +1,49 @@
+This is a fleshed out todo based on discussions at
+[[forum/managing_todo_lists]].
+
+I would like to have TODO lists inside ikiwiki wikis. This would mean:
+
+* a new markup plugin to support a language suitable for TODO lists (OPML,
+ XOXO are two possible candidates)
+* some javascript to provide interactive editing.
+
+As [[chrysn]] pointed out on the forum page, this has some crossover with
+[[structured page data]]. In particular, if the markup language chosen had a
+concept of invalid markup (existing plugins just tend to ignore stuff that
+isn't explicitly part of their markup) we would need to sensibly handle that.
+Perhaps rejecting web edits and providing context help on why the edit was
+rejected, although that sounds like a significant headache.
+
+I have started working on this, albeit slowly. A proof of concept is at
+<http://dev.jmtd.net/outliner/>.
+
+There are two git repositories associated with my WIP: one contains the
+javascript, the plugin, the changes made to page templates; the other contains
+the contents of that wiki-site (so the test todos and the contents of bugs/
+which forms a sort-of todo list for the todo list :) ) I will endeavour to get
+mirrors of those repos up on github or similar asap.
+
+-- [[Jon]]
+
+----
+
+Just to report the WIP plugin for this is now in a reasonably good state. I ended
+up just inventing a new markup language -- for now, items are divided by newlines
+and lists are one-dimensional, for simplicity. I got fed up thinking about how to
+handle the structured data issues / needing a lot of boilerplate around items and
+the implications for the "new item" dialogue.
+
+Still quite a lot to do though!
+
+-- [[Jon]]
+
+I've pushed a copy of the work in progress which consists of
+
+ * A change to page.tmpl
+ * A javascript underlay directory + javascript file
+ * a few CSS bits in a local.css
+ * a plugin
+
+to <http://github.com/jmtd/ikiwiki_todolist/>
+
+-- [[Jon]]
diff --git a/doc/todo/internal_definition_list_support.mdwn b/doc/todo/internal_definition_list_support.mdwn
new file mode 100644
index 000000000..f87dc653f
--- /dev/null
+++ b/doc/todo/internal_definition_list_support.mdwn
@@ -0,0 +1,54 @@
+While ikiwiki can support definition lists (`dl/dt/dd`) through [[multimarkdown|plugins/mdwn]], it doesn't actually *do* anything with those valuable definitions. It would be interesting for third party plugins to have access to this stuff as a proper data structure. This is what allows MoinMoin to have plugins that collect that data across multiple pages and tabulate it, for example.
+
+What I am proposing here is that the [[variables exported to plugins|plugins/write/#index6h2]] be extended to include a `%dictionaries` hash. For a markup like this:
+
+[[!format txt """
+Apple
+: Apple is a fruit
+: It's also a computer company
+Orange
+: Orange is a fruit
+"""]]
+
+would result in a data structure like this:
+
+[[!format txt """
+%dicts = {
+ 'Apple' => [ "Apple is a fruit", "It's also a computer company" ],
+ 'Orange' => [ "Orange is a fruit" ],
+}
+"""]]
+
+Now, I know I can write myself a `format()` parser that would do this on all pages in my own plugin, but then it would need to be adapted to all markups, while markup formatters should be the ones implementing this directly, if possible.
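+
Such a harvesting pass could be as simple as this (a naive Python sketch, assuming definitions are lines starting with `:`; a real implementation would hook into each format plugin rather than re-parse the text):

```python
def harvest_definitions(text):
    """Collect markdown-style definition lists into the proposed
    structure: a hash mapping each term to its list of definitions."""
    dicts, term = {}, None
    for line in text.splitlines():
        stripped = line.strip()
        if stripped.startswith(":") and term is not None:
            dicts.setdefault(term, []).append(stripped.lstrip(":").strip())
        elif stripped:
            term = stripped
    return dicts
```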
+
+My first use case for this would be to extend the [[plugins/osm]] plugin to tap into those lists, so that I could have this data in the page, visible to the user:
+
+[[!format txt """
+Longitude
+: -45.30
+Latitude
+: 73.67
+"""]]
+
+and then reuse that data in the plugin.
+
+Then for us running the humongous [[koumbit wiki|https://wiki.koumbit.net/]], it is a necessary step to be able to migrate away from MoinMoin to Ikiwiki as we have a lot of pages that tabulate information like this. For example, see our [[ServerList|https://wiki.koumbit.net/ServerList]] ([[source|https://wiki.koumbit.net/ServerList?action=raw]]), being generated from pages like [[this one|https://wiki.koumbit.net/metis.koumbit.net]].
+
+If there are no objections to that concept, I may try to start coding patches. Otherwise this is really just a [[wishlist]]. --[[anarcat]]
+
+> Have you looked at the [[/plugins/contrib/field]] plugin? This gives you the infrastructure, and all you need is to write a plugin that parses the definition list format. Then you could use [[/plugins/contrib/getfield]], [[/plugins/contrib/ftemplate]] and/or [[/plugins/contrib/report]] to do what you like with the data.
+> --[[KathrynAndersen]]
+
+> ----
+
+> with the recent inclusion of discount to the [[plugins/mdwn]] module, definition lists can be used by default (instead of, as with multimarkdown, after an option is enabled), and look like this:
+>
+> =Apple=
+> Apple is a fruit.
+> Apple is also a company.
+> =Orange=
+> Orange is a fruit.
+>
+> (indented with four spaces). this makes definition lists a bit more attractive for definition harvesting.
+>
+> personally, i'd prefer a solution that works from the markup'ed plain text instead of invisible directives, as it integrates more naturally in the flow of designing a document, even though a plugin for explicitly stating invisible facts certainly has its purpose too. (think [[!wikipedia RDFa]] here ;-) ) --[[chrysn]]
diff --git a/doc/todo/l10n.mdwn b/doc/todo/l10n.mdwn
new file mode 100644
index 000000000..a539f0424
--- /dev/null
+++ b/doc/todo/l10n.mdwn
@@ -0,0 +1,84 @@
+ikiwiki should be fully internationalized.
+
+----
+
+As to the hardcoded strings in ikiwiki, I've internationalized the program,
+and there is a po/ikiwiki.pot in the source that can be translated.
+--[[Joey]]
+
+----
+
+> The now merged po plugin handles l10n of wiki pages. The only missing
+> piece now is l10n of the templates.
+> --[[Joey]]
+
+----
+
+## template i18n
+
+From [[Recai]]:
+> Here is my initial work on ikiwiki l10n infrastructure (I'm sending it
+> before finalizing, there may be errors).
+
+I've revised the patches (tested OK):
+
+- $config{lang} patch:
+
+ <http://people.debian.org/~roktas/patches/ikiwiki/ikiwiki-lang.diff>
+
+ + Support for CGI::FormBuilder.
+ + Modify Makefile.PL for l10n.
+
+- l10n infrastructure from Koha project. (This patch must be applied with
+ '-p1', also, it needs a 'chmod +x l10n/*.pl' after patching.)
+
+ + Leave templates dir untouched, use a temporary translations directory
+ instead.
+ + Fix Makefile (it failed to update templates).
+
+ <http://people.debian.org/~roktas/patches/ikiwiki/ikiwiki-l10n.diff>
+
+However...
+
+> fine. Also a final note, I haven't examined the quality of generated
+> templates yet.
+
+It looks like tmpl_process3 cannot preserve line breaks in template files.
+For example, it processed the following template:
+
+ Someone[1], possibly you, requested that you be emailed the password for
+user
+ <TMPL_VAR USER_NAME> on <TMPL_VAR WIKINAME>[2].
+
+ The password is: <TMPL_VAR USER_PASSWORD>
+
+ --
+ ikiwiki
+
+ [1] The user requesting the password was at IP address <TMPL_VAR
+REMOTE_ADDR>
+ [2] Located at <TMPL_VAR WIKIURL>
+
+as (in Turkish):
+
+Birisi[1], ki muhtemelen bu sizsiniz, <TMPL_VAR WIKINAME>[2] üzerindeki
+<TMPL_VAR USER_NAME> kullanıcısına ait parolanın epostalanması isteğinde
+bulundu. Parola: <TMPL_VAR USER_PASSWORD> -- ikiwiki [1] Parolayı isteyen
+kullanıcının ait IP adresi: <TMPL_VAR REMOTE_ADDR>[2] <TMPL_VAR WIKIURL>
+
+This could be easily worked around in tmpl_process3, but I wouldn't like to
+maintain a separate utility.
+
+----
+
+Another way to approach this would be to write a small program that outputs
+the current set of templates. Now i18n of that program is trivial,
+and it can be run once per language to generate localized templates.
+
+Then it's just a matter of installing the templates somewhere, and having
+them be used when a different language is enabled.
+
+It would make sense to make the existing `locale` setting control which
+templates are used. But the [[plugins/po]] plugin would probably want to do
+something more, and use the actual language the page is written in.
+--[[Joey]]
diff --git a/doc/todo/language_definition_for_the_meta_plugin.mdwn b/doc/todo/language_definition_for_the_meta_plugin.mdwn
new file mode 100644
index 000000000..44fcf32bb
--- /dev/null
+++ b/doc/todo/language_definition_for_the_meta_plugin.mdwn
@@ -0,0 +1,118 @@
+Here is a patch for the [[plugins/meta]] plugin. It adds the possibility to define the language
+used for a page, with \[[!meta lang="ja"]]
+
+It doesn't insert the language information in the xhtml meta elements, but defines a LANG
+variable to use in the templates, for example with
+
+ <html xmlns="http://www.w3.org/1999/xhtml"
+ lang="<TMPL_IF NAME="LANG"><TMPL_VAR LANG><TMPL_ELSE>fr</TMPL_IF>"
+ xml:lang="<TMPL_IF NAME="LANG"><TMPL_VAR LANG><TMPL_ELSE>fr</TMPL_IF>">
+
+This way it is also possible to define a language for a subset of the final page, with custom
+templates and inclusion.
+
+This may be useful for sites with a few pages in different languages, but no full i18n.
+
+> Looks good, but the need to modify the template and include a default
+> language in it is a bit problematic, I think. --[[Joey]]
+
+>> --lang=XX could be a setup option, with a default value, then the template would be
+
+ <html xmlns="http://www.w3.org/1999/xhtml" lang="<TMPL_VAR LANG>" xml:lang="<TMPL_VAR LANG>">
+
+>>> Yes, that seems reasonable. I guess there's no problem with defaulting
+>>> to en if it can be overridden in the setup. --[[Joey]]
+
+>>>> Yes, english default makes sense. I guess we should use the `$config{lang}`,
+>>>> defined from the setup file or command-line options to define the default language
+>>>> (`$config{lang}` defaults to `en`, which is fine) of the html pages, and override
+>>>> it from the `meta` directive.
+>>>> — [[NicolasLimare]]
+
+>>>>> ikiwiki already has a $config{locale}, which is a full locale (ie,
+>>>>> "en_US.UTF-8"). This just needs to be parsed for the lang. --[[Joey]]
+
+>>>>>> My mistake, I meant $config{locale} --[[NicolasLimare]]
+
+> So the patch below could be changed to parse `$config{locale}` for the
+> language, and pass it if no specific lang was set for the page. The only
+> problem with that would be that this is all done inside the meta plugin,
+> so if that plugin were disabled, the lang would be empty. To avoid that,
+> I guess that the template needs to look like:
+
+ <html xmlns="http://www.w3.org/1999/xhtml"
+ <TMPL_IF NAME="LANG">lang="<TMPL_VAR LANG>" xml:lang="<TMPL_VAR LANG>"</TMPL_IF>>
+
+> Now it just needs to be finished up.. --[[Joey]]
+
+<pre>
+--- meta.orig.pm 2007-07-27 00:19:51.000000000 +0200
++++ meta.pm 2007-08-05 22:37:40.000000000 +0200
+@@ -11,6 +11,7 @@
+ my %permalink;
+ my %author;
+ my %authorurl;
++my %lang;
+
+ sub import {
+ hook(type => "preprocess", id => "meta", call => \&preprocess, scan => 1);
+@@ -100,6 +101,11 @@
+ $meta{$page}.='<link href="'.encode_entities($value).
+ "\" rel=\"openid.delegate\" />\n";
+ }
++ elsif ($key eq 'lang') {
++ if ($value =~ /^[A-Za-z]{2}$/) {
++ $lang{$page}=$value;
++ }
++ }
+ else {
+ $meta{$page}.=scrub("<meta name=\"".encode_entities($key).
+ "\" content=\"".encode_entities($value)."\" />\n");
+@@ -131,6 +137,8 @@
+ if exists $author{$page} && $template->query(name => "author");
+ $template->param(authorurl => $authorurl{$page})
+ if exists $authorurl{$page} && $template->query(name => "authorurl");
++ $template->param(lang => $lang{$page})
++ if exists $lang{$page} && $template->query(name => "lang");
+
+ }
+</pre>
+
+> Please resolve lang somewhere reusable rather than within the meta plugin: it is certainly usable outside
+> the scope of the meta plugin as well. --[[JonasSmedegaard]]
+
+>> I don't see any problem with having this in meta? meta is on by default, and
+>> other plugins are free to use it or even depend on it (e.g. inline does).
+>>
+>> My only comments on this patch beyond what Joey said are that the page
+>> language could usefully go into `$pagestate{$page}{meta}{lang}` for other
+>> plugins to be able to see it (is that what you meant?), and that
+>> restricting to 2 characters is too restrictive (HTML 4.01 mentions
+>> `en`, `en-US` and `i-navajo` as possible language codes).
+>> This slightly complicates parsing the locale to get the default language:
+>> it'll need `tr/_/-/` after the optional `.encoding` is removed.
+>> --[[smcv]]
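+
That parsing is small enough to sketch (illustrative Python; the real code would be Perl reading `$config{locale}`, and the `en` fallback is an assumption from the discussion above):

```python
def lang_from_locale(locale):
    """Derive an HTML lang value from a locale string, as smcv
    describes: drop the optional '.encoding', map '_' to '-'."""
    lang = locale.split(".", 1)[0].replace("_", "-")
    return lang or "en"
```

So `en_US.UTF-8` becomes `en-US`, which is a valid HTML language code, unlike the raw locale string.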
+
+>>> Now that po has been merged, this patch should probably also be adapted
+>>> so that the po plugin forces the meta::lang of every page to what po
+>>> thinks it should be. --[[smcv]]
+
+>>>> Agreed, users of the po plugin would greatly benefit from it.
+>>>> Seems doable. --[[intrigeri]]
+
+>>> Perhaps [[the_special_po_pagespecs|ikiwiki/pagespec/po]] should
+>>> also work with meta-assigned languages? --[[smcv]]
+
+>>>> Yes. But then, these special pagespecs should be moved outside of
+>>>> [[plugins/po]], as they could be useful to anyone using the
+>>>> currently discussed patch even when not using the po plugin.
+>>>>
+>>>> We could add these pagespecs to the core and make them use
+>>>> a simple language-guessing system based on a new hook. Any plugin
+>>>> that implements such a hook could decide whether it should
+>>>> overrides the language guessed by another one, and optionally use
+>>>> the `first`/`last` options (e.g. the po plugin will want to be
+>>>> authoritative on the pages of type po, and will then use
+>>>> `last`). --[[intrigeri]]
+
+[[!tag wishlist patch plugins/meta translation]]
diff --git a/doc/todo/latex.mdwn b/doc/todo/latex.mdwn
new file mode 100644
index 000000000..4b9413ca2
--- /dev/null
+++ b/doc/todo/latex.mdwn
@@ -0,0 +1,244 @@
+How about a plugin adding a
+[[preprocessor_directive|ikiwiki/directive]] to render some given LaTeX
+and include it in the page? This could either render the LaTeX as a PNG via
+[[!debpkg dvipng]] and include the resulting image in the page, or perhaps
+render via [HeVeA](http://pauillac.inria.fr/~maranget/hevea/index.html),
+[TeX2page](http://www.ccs.neu.edu/~dorai/tex2page/tex2page-doc.html), or
+similar. Useful for mathematics, as well as for stuff like the LaTeX version
+of the ikiwiki [[/logo]].
+
+> [[users/JasonBlevins]] has also a plugin for including [[LaTeX]] expressions (by means of `itex2MML`) -- [[plugins/mdwn_itex]] (look at his page for the link). --Ivan Z.
+
+>> I've [[updated|mdwn_itex]] Jason's plugin for ikiwiki 3.x. --[[wtk]]
+
+>>> I've updated [[Jason's pandoc plugin|users/jasonblevins]] to permit the TeX processing to be managed via Pandoc. See <https://github.com/profjim/pandoc-iki> for details. --Profjim
+
+----
+
+ikiwiki could also support LaTeX as a document type, again rendering to HTML.
+
+> [[users/JasonBlevins]] has also a [[plugins/pandoc]] plugin (look at his page for the link): in principle, [Pandoc](http://johnmacfarlane.net/pandoc/) can read and write [[LaTeX]]. --Ivan Z.
+
+----
+
+Conversely, how about adding a plugin to support exporting to LaTeX?
+
+>> I did some tests using Markdown and a customized HTML::Latex and html2latex
+>> and it appears it will work for me now. (I hope to use ikiwiki for many people
+>> to collaborate on a printed book that will be generated at least once per day in PDF format.)
+>>
+>> --JeremyReed
+
+>>> Have a look at [pandoc](http://code.google.com/p/pandoc/). It can make PDFs via pdflatex. --[[roktas]]
+
+>>>> Interesting, just yesterday I was playing with pandoc to make PDFs from my Markdown. Could someone advise me on how to embed these PDFs into ikiwiki? I need some guidance in implementing this. --[[JosephTurian]]
+
+>>>> [[users/JasonBlevins]] has a [[plugins/pandoc]] plugin (look at his page for the link). --Ivan Z.
+
+----
+
+[here](http://ng.l4x.org/gitweb/gitweb.cgi?p=ikiwiki.git/.git;a=blob;f=IkiWiki/Plugin/latex.pm) is a first stab at
+a latex plugin. Examples [here](http://ng.l4x.org/latex/). Currently without image support for hevea. And the latex2html
+output has the wrong charset and no command line switch to change that. Dreamland.
+
+As this link is not working, I set up a mirror here: <a href="http://satangoss.sarava.org/ikiwiki/latex.pm">http://satangoss.sarava.org/ikiwiki/latex.pm</a>.
+
+
+----
+
+Okay, I think now is the time for a mid-term report.
+
+The LaTeX plugin for ikiwiki is now usable, except for the security check. This means that at the moment the latex code is not validated, but only inserted into a very basic latex template, and the image is generated via this path: latex -> dvips -> convert (.tex -> .dvi -> .ps -> .png).
+The resulting .png is moved into the image folder. The name of this image is the md5 hash of the code the user wrote into the wiki, so on a second run there is no need to recreate the image if it already exists. This speeds up all but the first generation of the page.
+The image is generated in a temporary folder in /tmp, created with tempdir from File::Temp. The tmp folder name is something like $md5sumdigest.XXXXXXXX. If there is already a .tex file in this dir, it will be overwritten.
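+
The caching scheme described above amounts to this (a Python sketch of the idea; the plugin itself is Perl):

```python
import hashlib

def cached_image(latex_source, image_dir):
    """The PNG is named after the md5 hash of the snippet, so an
    unchanged snippet is rendered only once; returns the filename
    and whether it already exists in the image folder."""
    name = hashlib.md5(latex_source.encode("utf-8")).hexdigest() + ".png"
    return name, name in image_dir
```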
+
+So far I have finished the basic things; now the most important task is good input validation.
+This is eased a bit, since the embedded code cannot execute shell commands. Furthermore, adding additional packages won't work, since the code coming from the user is inserted after \begin{document}; trying it will result in an error (and that stops latex from working, so no image is created).
+
+So my task for the next few weeks is to write good input validation.
+I think this progress is okay, given that I had to study for my final university exams during the last 5-6 weeks and therefore couldn't do anything. From now on I have 3 months of free time, and I'll use them to work heavily on this plugin.
+So I think I'm within my own timetable. :)
+
+PS: Since I found no way to upload a file here, here is a link to my page where you can have a look. Comments are very welcome ;-)
+https://www.der-winnie.de/~winnie/gsoc07/tex.pm
+
+You'll find a demo site here:
+https://www.der-winnie.de/wiki/opensource/gsoc2007/
+I'll add some more complex formulas over the next few days. But this is basically just pure latex. ;-)
+
+-- Patrick Winnertz
+
+> Looks like you're very well on schedule.
+
+> But I wonder: Do you have any plans to work on the latex to html side of
+> things too? This page kinda combines both uses of latex; creating gifs
+> for formulas etc, and creating html. Dreamland already has a latex2html
+> using plugin submitted above, but it still needs some work, and
+> particularly, needs the same input validation stuff. And then there's the
+> idea of using html2latex, to convert an ikiwiki site into a book. Any
+> plans to work on that too? I guess I'm not sure what the scope is of what
+> you plan to do as part of GSoC.
+
+>> Yes, I plan to write an html -> tex (or pdf) plugin as well. But I think it is better to finish the first one before starting on the second. Whether this is in the scope of GSoC I don't know, but I'll write it anyway, since it is fun to work on an open source project ;-)
+
+>> For latex -> html:
+>> I don't really see the point of generating html (that is, text or charts) from latex code. But if you like, I can also add that variant. In my eyes it is much more important to be able to put complex chemical/physical and mathematical formulas on the website without having to use external programs (and upload the pictures manually).
+
+>>> Well, I suppose someone might just like latex and want to use it as the
+>>> markup for their wiki, rather than learning markdown. I guess Midnight
+>>> wanted it enough to write his plugin. But the use case is not too
+>>> compelling to me either, honestly. --[[Joey]]
+
+### code review
+
+> The main problem I see with the code is that you seem to unnecessarily create a dummy div tag
+> in preprocess, and then in format you call create(), which generates an img tag. So, why not
+> just create the img tag in preprocess?
+
+>> Mh okay, I'll improve this.
+
+Fixed
+
+> Another problem: Looks like if latex fails to create the image, the user won't be shown any
+> of its error message, but just "failed to generate image from code". I suspect that in this
+> case being able to see the error message would be important.
+
+>> Yes, that's true. It would be _very_ hard to show the user the output of latex, since there is really a lot of it. For a formula as simple as \frac{1}{2} this could be 2 pages printed out.
+
+>>> YM 2 printed pages? Yes, I'm familiar with latex's insane errors. :-)
+>>> However, IMHO, it's worth considering this and doing something. Perhaps
+>>> put the error inside some kind of box in the html so it's delimited
+>>> from the rest of the page. (Actually, ikiwiki preprocessor directives in
+>>> general could mark up their errors better.)
+
+Okay, I'll provide the log as a link in the wiki. But there should be some mechanism for removing the logs; otherwise this could lead to a DoS (a bot creating so much nonsense code that the disk fills up).
+
+Fixed; the log is now provided if latex fails.
+
+> The url handling could stand to be improved. Currently it uses $config{url}, so it depends on that being set. Some ikiwiki builds don't have an url set. The thing to do is to use urlto(), to generate a nice relative url from the page to the image.
+
+>> Mh... I chose a single directory deliberately: if you use the same formula on several pages, this really improves the time to generate the formulas, and storing every formula 3-4 times would waste extra space. But if you really want, I'll change this behaviour.
+
+>>> No, that makes sense! (My comments about $config{url} still stand
+>>> though.)
+
+Yes, of course, I'll improve the url handling. My comment was only about the separate folders ;-)
+
+Fixed. Now I use urlto and will_render.
+
+> Another (minor) problem with the url handling is that you put all the images in a "teximages" directory in the toplevel of the wiki. I think it would be better to put each image in the subdirectory for the page that created it. See how the `img` and `sparkline` plugins handle this.
+
+> It looks like if the tempdir already exists, tempdir() will croak(), thus crashing ikiwiki. It would be good to catch a failure there and fail more gracefully.
+
+>> Okay, I'll improve this behaviour. Maybe: if tempdir() croaks, rerun it to get another non-existing dir? (But only x times, so that this is no endless loop; with x maybe = 3.)
+>> Then it would not be necessary to inform the user that the image wasn't generated.
+
+>>> Or just propagate up an error message. If it's failing, someone is
+>>> probably trying to DOS ikiwiki or something. :-)
+
+Fixed. I now use eval { create_tmp } and then: if ($?) { $returncode = 0 } else { save .tex file ... } ...
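+For reference, a minimal sketch of catching a tempdir() failure gracefully (note that eval reports errors via $@; the message text here is just an example):

```perl
use File::Temp qw(tempdir);

my $tmp = eval { tempdir("teximg.XXXXXXXX", TMPDIR => 1, CLEANUP => 1) };
if ($@ || ! defined $tmp) {
    # tempdir() croaked: report an error instead of crashing ikiwiki
    print "failed to create temporary directory\n";
}
else {
    # save the .tex file into $tmp and run latex here
    print "working in $tmp\n";
}
```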
+
+
+> I'm not sure why you're sanitising the PATH before calling latex. This could be problematic on systems where latex is not in /bin or /usr/bin.
+
+>> Okay, what do you suggest to use as PATH?
+>> I'll have to change the default settings, since ikiwiki runs in taint mode (which is good ;-)).
+
+>>> But ikiwiki already sanitises PATH and deletes IFS and CDPATH etc.
+>>> See ikiwiki.in.
+
+Fixed. I removed these two lines completely.
+
+-----
+Okay, here is a short timetable of how I want to proceed:
+
+* Until the weekend (21-22 July) I'll try to fix all the errors above. (done)
+* From 22 July until 29 July I'll try to set up a first security check.
+  My plan is a security check in two parts:
+  * One with an array of blacklisted regular expressions. This would blacklist all the well-known and easy-to-match things, like \include{/path/to/something}, and things like closing the math formula environment ($$). (done)
+  * The second step will be based on Tom::latex, which will help to parse the code and get a tree view of it.
+
+Okay, what do you think of this procedure?
+
+> --[[Joey]]
+
+>> -- [[PatrickWinnertz]]
+
+----
+
+> It would be nice if it would output image tags with style="height:1em;" so that the formulas scale
+> with the rest of the text if you change the font size in your browser (ctrl + +/-).
+
+
+Thanks for the comment... fixed.
+Mh... not really fixed :S I added it to the return value, but it is somehow ignored. I'll figure out why.
+
+-----
+
+Okay, the latest version of the tex plugin for ikiwiki can be downloaded [here](https://www.der-winnie.de/~winnie/gsoc07/tex.pm).
+
+> I've looked this over, fixed the indenting, fixed some variable names
+> ("$foo" is a bad variable name), removed a gratuotuous use of `tie`,
+> fixed a bug (the first time it was run, it tried to write the png file
+> before the teximages/ directory existed) and checked the result in.
+>
+> Can you please flesh out [[plugins/teximg]] with
+> whatever documentation people who know tex will expect to see?
+
+Okay, I think I'll fill this in today with information about the plugin.
+
+Done. Is that documentation fine with you?
+
+>> Perhaps add some documentation about the kind of tex code that can be
+>> used, or a link to some documentation so people who don't know latex
+>> well can figure this out?
+
+> Also, please review my changes. In particular, I changed the @badthings
+> array to use qr//, which is much clearer, but it needs to be tested that
+> I didn't break the checking code when I did it. It would be nice to write
+> a test case that tries to feed it bad code and makes sure it rejects it.
+
+I'll test this now on my server and report back here.
+Okay, checked: it works fine. My blacklist tests were successful.
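+For the record, a test along the lines suggested above might look roughly like this (the patterns and the checking function are illustrative stand-ins, not the plugin's real blacklist):

```perl
# Hypothetical stand-ins for the plugin's qr// blacklist and checker.
my @badthings = (
    qr/\\include\s*\{/,   # pulling arbitrary files off the disk
    qr/\\input\s*\{/,
    qr/\$\$/,             # closing the math environment early
);

sub code_is_ok {
    my ($code) = @_;
    for my $bad (@badthings) {
        return 0 if $code =~ $bad;   # reject blacklisted constructs
    }
    return 1;
}

# Feed it bad and good code, and check it rejects/accepts correctly.
die "blacklist broken" if code_is_ok('\include{/etc/passwd}');
die "blacklist broken" if code_is_ok('x $$ \evil $$ y');
die "false positive"   unless code_is_ok('\frac{1}{2}');
```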
+
+>
+> Does it really make sense to have an alt tag for the image
+> that contains the tex code? Will that make any sense when browsing
+> without images?
+
+Mh... for people who know latex very well, this would be enough to imagine what the image looks like.
+That is of course a minority of people (but I guess only a minority of people use non-GUI browsers, too).
+
+
+
+> I'm thinking about renaming the preprocessor directive to teximg.
+> \[[!teximg code="" alt="foo"]] makes sense.. Would it make sense to rename
+> the whole plugin, or do you think that other tex stuff should go in this
+> same plugin?
+
+I'll think this over until I'm at work ;) Since it's only for rendering images, not for generating .tex files for the wiki,
+the name is all the same, I think. If you like teximg better, then switch :)
+
+
+> Note: I removed the style= attribute, since as I've told you, the
+> htmlsanitizer strips those since they can be used to insert javascript. I
+> put in a class=teximage instead; the style sheet could be modified to
+> style that, if you want to send a patch for that.
+
+Ah yes... sorry, I forgot to update the plugin in my public_html folder %-). That was my last change to this plugin :) Sorry.
+
+
+>
+> --[[Joey]]
+
+-----
+
+I'm using a [plugin](http://metameso.org/~joe/math/tex.pm) created by [Josef Urban](http://www.cs.ru.nl/~urban) that gets LaTeX into ikiwiki by using [LaTeXML](http://dlmf.nist.gov/LaTeXML). This could well be "the right way" to go (long term) but the plugin still does not render math expressions right, because ikiwiki is filtering out requisite header information. Examples (I recommend you use Firefox to view these!) are available [here](http://li101-104.members.linode.com/aa/math/) and [here](http://li101-104.members.linode.com/aa/simple/). Compare that last example to the [file generated by LaTeXML directly](http://metameso.org/~joe/math/math.xml). I posted the sources [here](http://metameso.org/aa/sources/) for easy perusal. How to get ikiwiki to use the original DOCTYPE and html fields? I could use some help getting this polished off. --[[jcorneli]]
+
+> update: it seems important to force the browser to think of the content as xml, e.g. [http://metameso.org/~joe/math/example.xml](http://metameso.org/~joe/math/example.xml) has the same source code as [http://metameso.org/~joe/math/example.html](http://metameso.org/~joe/math/example.html) and the former shows math working, but the latter doesn't. --[[jcorneli]]
+
+>> Looking at the source code, it seems Ikiwiki is doing more than filtering header information - it is filtering out all HTML formatting around MathML constituent objects. In the first example, we see that formatting for tables and such is preserved. --[[jcorneli]]
+
+
+[[!tag soc]]
+[[!tag wishlist]]
diff --git a/doc/todo/latex/discussion.mdwn b/doc/todo/latex/discussion.mdwn
new file mode 100644
index 000000000..9c99c436f
--- /dev/null
+++ b/doc/todo/latex/discussion.mdwn
@@ -0,0 +1,6 @@
+Okay, moving the discussion from the soc page to here :)
+
+I'll have a look at the MediaWiki plugin and how they do it.
+(I think the parser I want to use can also handle \newcommand{...} and similar things very well.)
+
+The bad point is: the parser is not yet in Debian... I'm packaging it at the moment. ;-) \ No newline at end of file
diff --git a/doc/todo/let_inline_plugin_use_pagetemplates.mdwn b/doc/todo/let_inline_plugin_use_pagetemplates.mdwn
new file mode 100644
index 000000000..b220c8f6b
--- /dev/null
+++ b/doc/todo/let_inline_plugin_use_pagetemplates.mdwn
@@ -0,0 +1,5 @@
+Is there any reason why the inline plugin's template parameter couldn't take any pagetemplate templates, meaning those in use by the template plugin? Right now it seems that inline templates have to be `.tmpl` files on the filesystem.
+
+--[[madduck]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/limit_the_markup_formats_available_for_editing.mdwn b/doc/todo/limit_the_markup_formats_available_for_editing.mdwn
new file mode 100644
index 000000000..c3eb09e52
--- /dev/null
+++ b/doc/todo/limit_the_markup_formats_available_for_editing.mdwn
@@ -0,0 +1,8 @@
+For `aggregate` to work, I have to have the `html` plugin enabled, and this allows users to create `html` pages via the standard edit form. It would be good if I could tell IkiWiki that I don't want certain page types to be editable (but still enabled, so that e.g. aggregate/inline keep working). By telling IkiWiki in the setup file that e.g. `html` pages are uneditable, people would no longer
+
+- choose the `html` (or `htm`) page type in the edit form
+- bring up the edit form for `html` (or `htm`) pages in the first place.
+
+--[[madduck]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/link_map.mdwn b/doc/todo/link_map.mdwn
new file mode 100644
index 000000000..a02a792e4
--- /dev/null
+++ b/doc/todo/link_map.mdwn
@@ -0,0 +1,6 @@
+An idea: Use graphviz to generate a map of all the links between pages.
+(Could it be made clickable somehow?)
+
+Graphviz can output image maps. -- ChristofferSawicki
+
+[[todo/done]]
diff --git a/doc/todo/link_plugin_perhaps_too_general__63__.mdwn b/doc/todo/link_plugin_perhaps_too_general__63__.mdwn
new file mode 100644
index 000000000..8a5fd50eb
--- /dev/null
+++ b/doc/todo/link_plugin_perhaps_too_general__63__.mdwn
@@ -0,0 +1,25 @@
+[[!tag wishlist blue-sky]]
+(This isn't important to me - I don't use MediaWiki or Creole syntax myself -
+but just thinking out loud...)
+
+The [[ikiwiki/wikilink]] syntax IkiWiki uses sometimes conflicts with page
+languages' syntax (notably, [[plugins/contrib/MediaWiki]] and [[plugins/Creole]]
+want their wikilinks the other way round, like
+`\[[plugins/write|how to write a plugin]]`). It would be nice if there was
+some way for page language plugins to opt in/out of the normal wiki link
+processing - then MediaWiki and Creole could have their own `linkify` hook
+that was only active for *their* page types, and used the appropriate
+syntax.
+
+In [[todo/matching_different_kinds_of_links]] I wondered about adding a
+`\[[!typedlink to="foo" type="bar"]]` directive. This made me wonder whether
+a core `\[[!link]]` directive would be useful; this could be a fallback for
+page types where a normal wikilink can't be done for whatever reason, and
+could also provide extension points more easily than WikiLinks' special
+syntax with extra punctuation, which doesn't really scale?
+
+Straw-man:
+
+ \[[!link to="ikiwiki/wikilink" desc="WikiLinks"]]
+
+--[[smcv]]
diff --git a/doc/todo/linkbase.mdwn b/doc/todo/linkbase.mdwn
new file mode 100644
index 000000000..5dcef3c3d
--- /dev/null
+++ b/doc/todo/linkbase.mdwn
@@ -0,0 +1,16 @@
+[[!template id=gitbranch branch=GiuseppeBilotta/linkbase author="[[GiuseppeBilotta]]"]]
+
+This patchset enables the user to specify additional paths (“link bases”)
+that can be used by ikiwiki when trying to resolve links. The list of
+link bases is built as follows:
+
+* the page itself (as ikiwiki currently does)
+* all link bases specified for this page
+* all link bases specified for pagespecs matched by this page
+
+To specify the link bases, the only way made available presently by the
+patchset is a linkbase plugin that works similarly to the shortcut
+plugin (link bases are specified in a linkbases.mdwn file at the
+document root). However, other ways are potentially possible.
+
+This is still work in progress. Comments and suggestions are welcome.
diff --git a/doc/todo/linkify_and_preprocessor_ordering.mdwn b/doc/todo/linkify_and_preprocessor_ordering.mdwn
new file mode 100644
index 000000000..2936d74f0
--- /dev/null
+++ b/doc/todo/linkify_and_preprocessor_ordering.mdwn
@@ -0,0 +1,24 @@
+Currently ikiwiki linkifies text, then runs preprocessor directives. This
+allows a directive to contain a wikilink inside a parameter, but since the
+wikilink expands to some arbitrary html, the parameter needs to be
+triple-quoted to avoid quotes in the expanded text from leaking out. This
+is rather non-obvious.
+
+One fix would be to switch the order, since linkification and preprocessing
+are relatively independent. Some directives, like inline, would need to keep
+linkifying the inlined pages, to make the links be resolved correctly,
+but that's ok. Any directives that output stuff that looked like a
+wikilink, but wasn't, would need to be changed.
+
+> This solution has been implemented and _seems_ ok.
+
+An alternative would be to change the wikilink regexp so it doesn't apply
+to wikilinks that are embedded inside preprocessor directives. I haven't
+found a way to do that yet, since perl doesn't allow variable-width
+negative lookbehind.
+
+Maybe processing wikilinks and preprocessor directives
+as part of the same loop would work, but that probably has its own
+issues.
+
+[[todo/done]]
diff --git a/doc/todo/linktitle.mdwn b/doc/todo/linktitle.mdwn
new file mode 100644
index 000000000..6df3bfdce
--- /dev/null
+++ b/doc/todo/linktitle.mdwn
@@ -0,0 +1,19 @@
+Pages could have a `linktitle` (perhaps via [[plugins/meta]]), and
+[[wikilinks|ikiwiki/wikilink]] could use that title by default when linking
+to the page. That would allow pages to have a simple, easily linkable name
+(without spaces, for instance), but use the proper title for links. For
+example, [[ikiwiki/Directive]] could use the `linktitle`
+"preprocessor directive", and pages for [[users]] could have `linktitle`s
+that put spaces in their names.
+
+Ideally, perhaps two versions of the title could exist, one for general
+use, and an optional one for if the case in the actual link starts with an
+uppercase letter. That would allow [[ikiwiki/directive]] to
+use the link text "preprocessor directive", but
+[[ikiwiki/Directive]] to use the link text "Preprocessor
+Directive", for use at the beginnings of sentences. If the second version
+did not exist, the first version would apply to both cases. However, that
+also seems like potential overkill, and less important than the basic
+functionality of `linktitle`. --[[JoshTriplett]]
+
+[[wishlist]]
diff --git a/doc/todo/lists.mdwn b/doc/todo/lists.mdwn
new file mode 100644
index 000000000..4fc3e68b4
--- /dev/null
+++ b/doc/todo/lists.mdwn
@@ -0,0 +1,3 @@
+* list of all missing pages
+
+ [[todo/done]]
diff --git a/doc/todo/location_of_external_plugins.mdwn b/doc/todo/location_of_external_plugins.mdwn
new file mode 100644
index 000000000..c28003e74
--- /dev/null
+++ b/doc/todo/location_of_external_plugins.mdwn
@@ -0,0 +1,24 @@
+Would it be possible to make the installation location for the external
+plugins (those talked to via xmlrpc) configurable? Currently, they are
+installed into (and later expected to be in) /usr/lib/ikiwiki/plugins. For
+the Fedora package (which I maintain), I move them to
+/usr/libexec/ikiwiki/plugins. While not covered by the FHS, this seems to
+be a more appropriate place, see:
+https://fedoraproject.org/wiki/Packaging/Guidelines#Libexecdir.
+
+> This would need to be a build time configuration setting so the directory
+> is built into ikiwiki for use at runtime. --[[Joey]]
+
+As a side note, the accompanying proxy.py might better be placed into some directory on the python path.
+
+> If someone can show how to do so without needing a Setup.py and all the
+> pain that using one entails.. --[[Joey]]
+
+>> At the very least I don't think proxy.py should be on the `sys.path`
+>> under its current name. If it was renamed to ikiwiki_proxy or some such,
+>> possibly; but I think it's more appropriate to have it in an
+>> ikiwiki-specific directory (a "private module") since it's not useful for
+>> anything outside ikiwiki, and putting it in the same directory as the
+>> external plugins means it's automatically in their `sys.path` without
+>> needing special configuration. --[[smcv]]
+>> (a mostly-inactive member of Debian's Python modules packaging team)
diff --git a/doc/todo/location_of_ikiwiki-w3m.cgi.mdwn b/doc/todo/location_of_ikiwiki-w3m.cgi.mdwn
new file mode 100644
index 000000000..8ca925bee
--- /dev/null
+++ b/doc/todo/location_of_ikiwiki-w3m.cgi.mdwn
@@ -0,0 +1,3 @@
+The `ikiwiki-w3m.cgi` script is installed (hard-coded) into `/usr/lib/w3m/cgi-bin`. On Fedora however, the w3m package expects it in `/usr/libexec/w3m/cgi-bin`. So, it would be nice if the destination for this script could be configured.
+
+> You can use `W3M_CGI_BIN` now. [[done]] --[[Joey]]
diff --git a/doc/todo/logo.mdwn b/doc/todo/logo.mdwn
new file mode 100644
index 000000000..616720e44
--- /dev/null
+++ b/doc/todo/logo.mdwn
@@ -0,0 +1,4 @@
+ikiwiki needs a logo. I'm thinking something simple like the word "ikiwiki"
+with the first "k" backwards; drawn to show that it's "wiki" reflected.
+
+[[todo/done]]
diff --git a/doc/todo/lucene_search_engine.mdwn b/doc/todo/lucene_search_engine.mdwn
new file mode 100644
index 000000000..bac9f9130
--- /dev/null
+++ b/doc/todo/lucene_search_engine.mdwn
@@ -0,0 +1 @@
+There are [some issues](http://www.branchable.com/bugs/Exception:_Cannot_open_tables_at_consistent_revisions_at___47__usr__47__lib__47__perl5__47__Search__47__Xapian__47__WritableDatabase.pm_line_41./#comment-c159ea3f9be35fcd9ed0eeedb162e816) with the current search engine. Sometimes the database gets corrupted, and it's not very good at weighing, say, the title against the content. For example, [searching for pagespec](http://ikiwiki.info/ikiwiki.cgi?P=pagespec) in this wiki doesn't lead to the [[ikiwiki/pagespec]] page on the first page of results, but only on the third. In [[different_search_engine]] there was the idea of using Lucene; is there any reason why we should have both, or at least let Lucene live in contrib?
diff --git a/doc/todo/mailnotification.mdwn b/doc/todo/mailnotification.mdwn
new file mode 100644
index 000000000..37fe9a55a
--- /dev/null
+++ b/doc/todo/mailnotification.mdwn
@@ -0,0 +1,59 @@
+Should support mail notification of new and changed pages.
+
+ Hmm, should be easy to implement this.. it runs as an svn post-commit hook
+ already, so just look at the userdb, svnlook at what's changed, and send
+ mails to people who have subscribed.
+
+ A few details:
+ 1. [[Joey]] mentioned that being able to subscribe to globs as well as
+ explicitly named pages would be desirable.
+ 2. I think that since we're using Perl on the backend, being able to
+ let users craft their own arbitrary regexes would be good.
+
+ Joey points out that this is actually a security hole, because Perl
+ regexes let you embed (arbitrary?) Perl expressions inside them. Yuck!
+
+(This is not actually true unless you "use re 'eval';", without which
+(?{ code }) is disabled for expressions which interpolate variables.
+See perldoc re, second paragraph of DESCRIPTION. It's a little iffy
+to allow arbitrary regexen, since it's fairly easy to craft a regular
+expression that takes unbounded time to run, but this can be avoided
+with the use of alarm to add a time limit. Something like
+
+ eval { # catches invalid regexen
+ no re 'eval'; # to be sure
+ local $SIG{ALRM} = sub { die };
+ alarm(1);
+ ... stuff involving m/$some_random_variable/ ...
+ alarm(0);
+ };
+ if ($@) { ... handle the error ... }
+
+should be safe. --[[WillThompson]])
+
+ It would also be good to be able to subscribe to all pages except discussion pages or the SandBox: `* !*/discussion !sandbox`, maybe --[[Joey]]
+
+ 3. Of course if you do that, you want to have form processing on the user
+ page that lets them tune it, and probably choose literal or glob by
+ default.
+
+ I think that the new globlist() function should do everything you need.
+ Adding a field to the prefs page will be trivial --[[Joey]]
+
+ The first cut, I suppose, could use one sendmail process to batch-mail all
+ subscribers for a given page. However, in the long run, I can see users
+ demanding a bit of feature creep:
+
+ 4. Each user should be able to tune whether they see the actual diff parts or
+ not.
+ 5. Each user should be able to set a maximum desired email size.
+ 6. We might want to support a user-specified shibboleth string that will be
+ included in the email they receive so they can easily procmail the messages
+ into a folder.
+
+ --[[BrandenRobinson]]
+
+ I'm deferring these niceties until there's some demonstrated demand
+ --[[Joey]].
+
+[[todo/done]]
diff --git a/doc/todo/mailnotification/discussion.mdwn b/doc/todo/mailnotification/discussion.mdwn
new file mode 100644
index 000000000..b8af29f05
--- /dev/null
+++ b/doc/todo/mailnotification/discussion.mdwn
@@ -0,0 +1,14 @@
+I think I would like mail notifications. Though it kinda comes out of this general fear of vandalisation.
+
+So if some 'evil doer' turned my wiki into a porn site, I would like to rectify it ASAP. So I would like:
+
+1. Mail notifications of edits not made by me (or established contributors)
+2. If there is something fishy, the steps I would need to revert the changes
+
+Mail notifications are probably not required. For example I get lots of comments on my blog, but I don't get mailed about them. They go through the (proprietary) [Akismet](http://akismet.com/) filter.
+
+Perhaps a powerful little UNDO feature on RecentChanges is all that is needed.
+
+> Um, if you'll look at the [[mailnotification]] page, ikiwiki has
+> supported mail notifications for > 1 year, with a powerful [[ikiwiki/PageSpec]]
+> to allow choosing which pages you want to be notified about. --[[Joey]]
diff --git a/doc/todo/make_html-parser_use_encode_entities_numeric.mdwn b/doc/todo/make_html-parser_use_encode_entities_numeric.mdwn
new file mode 100644
index 000000000..2c3344003
--- /dev/null
+++ b/doc/todo/make_html-parser_use_encode_entities_numeric.mdwn
@@ -0,0 +1,19 @@
+Hi,
+
+Using encode_entities makes this sort of thing happen:
+
+XML Parsing Error: undefined entity
+Location: http://XXX.YYY.ZZZ/
+
+and points to the relevant entity.
+
+I think using encode_entities_numeric would help a lot with this. This is just a naïve assessment, but this would prevent xml-like pages being non-xml.
+
+[[wishlist]]
+
+> I suppose you mean a html generator, and not a html parser.
+>
+> ikiwiki uses numeric entities where required, but not otherwise.
+>
+> It seems valid for xhtml to have eg, `&lt;` in it. Do you have a specific
+> example? --[[Joey]]
diff --git a/doc/todo/make_link_target_search_all_paths_as_fallback.mdwn b/doc/todo/make_link_target_search_all_paths_as_fallback.mdwn
new file mode 100644
index 000000000..f971fd8e0
--- /dev/null
+++ b/doc/todo/make_link_target_search_all_paths_as_fallback.mdwn
@@ -0,0 +1,27 @@
+[[!tag wishlist]]
+
+## Idea
+
+After searching from the most local to the root for a wikilinkable page, descend into the tree of pages looking for a matching page.
+
+For example, if I link to \[\[Pastrami\]\] from /users/eric, the current behavior is to look for
+
+ * /users/eric/pastrami
+ * /users/pastrami
+ * /pastrami
+
+I'd like it to find /sandwiches/pastrami.
+
+## Issues
+
+I know this is a tougher problem, especially to scale efficiently. There is also not a clear ordering unless it is the recursive dictionary ordering (ie the order of a breadth-first search with natural ordering). It would probably require some sort of static lookup table keyed by pagename and yielding a path. This could be generated initially by a breadth-first search and then updated incrementally when pages are added/removed/renamed. In some ways a global might not be ideal, since one might argue that the link above should match /users/eric/sandwiches/pastrami before /sandwiches/pastrami. I guess you could put all matching paths in the lookup table and then evaluate the ordering at parse-time.
+
+## Motivation
+
+Since I often access my documents using a text editor, I find it useful to keep them ordered in a hierarchy, about 3 levels deep with a branching factor of perhaps 10. When linking, though, I'd like the wiki to find the document for me, since I am lazy.
+
+Also, many of my wiki pages comprise the canonical local representation of some unique entity; for example, I might have /software/ikiwiki. The nesting, however, is only to aid navigation, and shouldn't be considered part of the resource's name.
+
+## Alternatives
+
+If an alias could be specified in the page body (for example, /ikiwiki for /software/ikiwiki) which would then stand in for a regular page when searching, then the navigational convenience of folders could be preserved while selectively flattening the search namespace.
diff --git a/doc/todo/manpages.mdwn b/doc/todo/manpages.mdwn
new file mode 100644
index 000000000..4120b8432
--- /dev/null
+++ b/doc/todo/manpages.mdwn
@@ -0,0 +1,4 @@
+ikiwiki could support manpages (or general groff input files) and convert them
+to HTML. --[[JoshTriplett]]
+
+[[wishlist]]
diff --git a/doc/todo/mark_edit_as_trivial__44___identify__47__filter_on_trivial_changes.mdwn b/doc/todo/mark_edit_as_trivial__44___identify__47__filter_on_trivial_changes.mdwn
new file mode 100644
index 000000000..2b2b0242e
--- /dev/null
+++ b/doc/todo/mark_edit_as_trivial__44___identify__47__filter_on_trivial_changes.mdwn
@@ -0,0 +1,11 @@
+One feature of mediawiki which I quite like is the ability to mark a change as 'minor', or 'trivial'. This can then be used to filter the 'recentchanges' page, to only show substantial edits.
+
+The utility of this depends entirely on whether the editors use it properly.
+
+I currently use an inline on the front page of my personal homepage to show the most recent pages (by creation date) within a subsection of my site (a blog). Blog posts are rarely modified much after they are 'created' (or published - I bodge the creation time via meta when I publish a post. It might sit in draft form indefinitely), so this effectively shows only non-trivial changes.
+
+I would like to have a short list of the most recent modifications to the site on the front page. I therefore want to sort by modified time rather than creation time, but exclude edits that I self-identify as minor. I also only want to take a short number of items, the top 5, and display only their titles (which may be derived from filename, or set via meta again).
+
+I'm still thinking through how this might be achieved in an ikiwiki-suitable fashion, but I think I need a scheme to identify certain edits as trivial. This would have to work via web edits (easier: could add a check box to the edit form) and plain changes in the VCS (harder: scan for keywords in a commit message? in a VCS-agnostic fashion?)
+
+[[!tag wishlist]]
diff --git a/doc/todo/matching_different_kinds_of_links.mdwn b/doc/todo/matching_different_kinds_of_links.mdwn
new file mode 100644
index 000000000..da3ea49f6
--- /dev/null
+++ b/doc/todo/matching_different_kinds_of_links.mdwn
@@ -0,0 +1,196 @@
+[[!tag wishlist]]
+
+As noted in [[todo/tag_pagespec_function]], there is a "misbehavior" of a `tagged()` pagespec: it matches even pages which have plain links to the tag page.
+
+And in general, it would be quite useful to be able to distinguish different kinds of links: one more kind, in addition to "tag", is "bug dependency" noted in [[todo/structured_page_data#another_kind_of_links]] and [[todo/tracking_bugs_with_dependencies#another_kind_of_links]].
+
+It could distinguish the links by the `rel=` attribute. ([[Tags already receive a special rel-class|todo/rel_attribute_for_links]].) This means there is a general need for a syntax to specify user-defined rel-classes on wikilink (then bug deps would simply use their special rel-class, either directly, or through a special directive like `\[[!depends ]]`), and to refer to them in pagespecs (in forward and backward direction).
+
+Besides pagespecs, the `rel=` attribute could be used for styles. --Ivan Z.
+
+> FWIW, the `add_link` function introduced in a recent
+> release adds an abstraction that could be used to get
+> part of the way there to storing data about different types of
+> links. That function could easily be extended to take an optional
+> third parameter specifying the link type.
+>
+> Then there's the question of how to store and access the data. `%links`
+> does not offer a good way to add additional information about links.
+> Now, we could toss `%links` entirely and switch to an accessor function,
+> but let's think about not doing that..
+>
+> The data that seems to be needed is basically a deep hash, so
+> one could check `$linktype{$page}{tag}{$link}` to see if
+> the page contains a link of the given type. (Note that pages could
+> contain links that were duplicates except for their types.)
+>
+> There would be some data duplication, unfortunately, but if `%linktype`
+> is not populated for regular wikilinks, it would at least be limited to
+> tags and other unusual link types, so not too bad.
+>
+> `%linktype` could be stored in `%pagestate`.. if so
+> the actual use might look like `$pagestate{$page}{linktype}{tag}{$link}`.
+> That could be implemented by the tag plugin right now
+> with no core changes. (BTW, when I originally wrote tag, pagestate
+> was not available, which is why I didn't make it differentiate from
+> normal links.) Might be better to go ahead and add the variable to
+> core though. --[[Joey]]
+
+>> I've implemented this with the data structure you suggested, except that
+>> I called it `%typedlinks` instead of `%linktype` (it seemed to make more
+>> sense that way). I also ported `tag` to it, and added a `tagged_is_strict`
+>> config option. See below! --[[smcv]]
+
+I saw somewhere else here some suggestions for the wiki-syntax for specifying the relation name of a link. One more suggestion---[the syntax used in Semantic MediaWiki](http://en.wikipedia.org/wiki/Semantic_MediaWiki#Basic_usage), like this:
+
+<pre>
+... the capital city is \[[Has capital::Berlin]] ...
+</pre>
+
+So a part of the effect of [[`\[[!taglink TAG\]\]`|plugins/tag]] could be represented as something like `\[[tag::TAG]]` or (more understandable relation name in what concerns the direction) `\[[tagged::TAG]]`.
+
+I don't have any opinion on this syntax (whether it's good or not)...--Ivan Z.
+
+-------
+
+>> [[!template id=gitbranch author="[[Simon_McVittie|smcv]]" branch=smcv/ready/link-types]]
+>> [[!tag patch]]
+
+## Documentation for smcv's branch
+
+### added to [[ikiwiki/pagespec]]
+
+* "`typedlink(type glob)`" - matches pages that link to a given page (or glob)
+ with a given link type. Plugins can create links with a specific type:
+ for instance, the tag plugin creates links of type `tag`.
+
+### added to [[plugins/tag]]
+
+If the `tagged_is_strict` config option is set, `tagged()` will only match
+tags explicitly set with [[ikiwiki/directive/tag]] or
+[[ikiwiki/directive/taglink]]; if not (the default), it will also match
+any other [[WikiLinks|ikiwiki/WikiLink]] to the tag page.
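In a setup file this might look something like the following (only the `tagged_is_strict` option itself comes from the documentation above; the surrounding context is omitted):

```perl
# excerpt from an ikiwiki setup file: only count explicit
# tag/taglink directives as tags
tagged_is_strict => 1,
```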
+
+### added to [[plugins/write]]
+
+#### `%typedlinks`
+
+The `%typedlinks` hash records links of specific types. Do not modify this
+hash directly; call `add_link()`. The keys are page names, and the values
+are hash references. In each page's hash reference, the keys are link types
+defined by plugins, and the values are hash references with link targets
+as keys, and 1 as a dummy value, something like this:
+
+ $typedlinks{"foo"} = {
+ tag => { short_word => 1, metasyntactic_variable => 1 },
+ next_page => { bar => 1 },
+ };
+
+Ordinary [[WikiLinks|ikiwiki/WikiLink]] appear in `%links`, but not in
+`%typedlinks`.
+
+#### `add_link($$;$)`
+
+This adds a link to `%links`, ensuring that duplicate links are not
+added. Pass it the page that contains the link, and the link text.
+
+An optional third parameter sets the link type (`undef` produces an ordinary
+[[ikiwiki/WikiLink]]).
+
+## Review
+
+Some code refers to `oldtypedlinks`, and other to `oldlinktypes`. --[[Joey]]
+
+> Oops, I'll fix that. That must mean missing test coverage, too :-(
+> --s
+
+>> A test suite for the dependency resolver *would* be nice. --[[Joey]]
+
+>>> Bug fixed, I think. A test suite for the dependency resolver seems
+>>> more ambitious than I want to get into right now, but I added a
+>>> unit test for this part of it... --s
+
+I'm curious what your reasoning was for adding a new variable
+rather than using `pagestate`. Was it only because you needed
+the `old` version to detect change, or was there other complexity?
+--J
+
+> You seemed to be more in favour of adding it to the core in
+> your proposal above, so I assumed that'd be more likely to be
+> accepted :-) I don't mind one way or the other - `%typedlinks`
+> costs one core variable, but saves one level of hash nesting. If
+> you're not sure either, then I think the decision should come down
+> to which one is easier to document clearly - I'm still unhappy with
+> my docs for `%typedlinks`, so I'll try to write docs for it as
+> `pagestate` and see if they work any better. --s
+
+>> On reflection, I don't think it's any better as a pagestate, and
+>> the contents of pagestates (so far) aren't documented for other
+>> plugins' consumption, so I'm inclined to leave it as-is, unless
+>> you want to veto that. Loose rationale: it needs special handling
+>> in the core to be a dependency type (I re-used the existing link
+>> type), it's API beyond a single plugin, and it's really part of
+>> the core parallel to pagestate rather than being tied to a
+>> specific plugin. Also, I'd need to special-case it to have
+>> ikiwiki not delete it from the index, unless I introduced a
+>> dummy typedlinks plugin (or just hook) that did nothing... --s
+
+I have not convinced myself this is a real problem, but..
+If a page has a typed link, there seems to be no way to tell
+if it also has a separate, regular link. `add_link` will add
+to `@links` when adding a typed, or untyped link. If only untyped
+links were recorded there, one could tell the difference. But then
+typed links would not show up at all in eg, a linkmap,
+unless it was changed to check for typed links too.
+(Or, regular links could be recorded in typedlinks too,
+with an empty type. (Bloaty.)) --J
+
+> I think I like the semantics as-is - I can't think of any
+> reason why you'd want to ask the question "does A link to B,
+> not counting tags and other typed links?". A typed link is
+> still a link, in my mind at least. --s
+
+>> Me neither, let's not worry about it. --[[Joey]]
+
+I suspect we could get away without having `tagged_is_strict`
+without too much transitional trouble. --[[Joey]]
+
+> If you think so, I can delete about 5 LoC. I don't particularly
+> care either way; [[Jon]] expressed concern about people relying
+> on the current semantics, on one of the pages requesting this
+> change. --s
+
+>> Removed in a newer version of the branch. --s
+
+I might have been wrong to introduce `typedlink(tag foo)`. It's not
+very user-friendly, and is more useful as a backend for other plugins
+than as a feature in its own right - any plugin introducing a link
+type will probably also want to have its own preprocessor directive
+to set that link type, and its own pagespec function to match it.
+I wonder whether to make a `typedlink` plugin that has the typedlink
+pagespec match function and a new `\[[!typedlink to="foo" type="bar"]]`
+directive, though... --[[smcv]]
+
+> I agree, per-type matchers are more friendly and I'm not enamored of the
+> multi-parameter pagespec syntax. --[[Joey]]
+
+>> Removed in a newer version of the branch. I re-introduced it as a
+>> plugin in `smcv/typedlink`, but I don't think we really need it. --s
+
+----
+
+I am ready to merge this, but I noticed one problem -- since `match_tagged`
+now only matches pages with the tag linktype, a wiki will need to be
+rebuilt on upgrade in order to get the linktype of existing tags in it
+recorded. So there needs to be a NEWS item about this and
+the postinst modified to force the rebuild.
+
+> Done, although you'll need to plug in an appropriate version number when
+> you release it. Is there a distinctive reminder string you grep for
+> during releases? I've used `UNRELEASED` for now. --[[smcv]]
+
+Also, the ready branch adds `typedlink()` to [[ikiwiki/pagespec]],
+but you removed that feature as documented above.
+--[[Joey]]
+
+> [[Done]]. --s
diff --git a/doc/todo/mbox.mdwn b/doc/todo/mbox.mdwn
new file mode 100644
index 000000000..a6af0c3c5
--- /dev/null
+++ b/doc/todo/mbox.mdwn
@@ -0,0 +1,18 @@
+I'd like to be able to drop an unmodified RFC2822 email message into ikiwiki, and get it formatted to HTML. Something like this: <http://lwn.net/Articles/287056/>
+
+> We're discussing doing just that (well, whole mailboxes, really) over in
+> [[comment_by_mail]] --[[Joey]]
+>> The
+>> [[plugins/contrib/mailbox]]
+>> plugin is roughly feature complete at this point. It can read mbox, maildir, and
+>> MH folders, does threading, and deals with MIME (now with
+>> pagespec based sanity checking). No doubt lots of things could be
+>> improved, and it hasn't been tested a great deal. Formatting of the body could be attempted
+>> as well. -- [[DavidBremner]]
+>>> One hitch I noticed was that it is not currently possible to treat a maildir
+>>> or an MH directory as a page (i.e. just call it foo.mh and have it transformed
+>>> to page foo). I'm not sure if this is possible and worthwhile to fix.
+>>> It is certainly workable
+>>> to use a \[[!mailbox ]] directive. -- [[DavidBremner]]
+
+[[done]]
diff --git a/doc/todo/mdwn_itex.mdwn b/doc/todo/mdwn_itex.mdwn
new file mode 100644
index 000000000..ae9a8f37a
--- /dev/null
+++ b/doc/todo/mdwn_itex.mdwn
@@ -0,0 +1,22 @@
+[[!template id=gitbranch branch=wtk/mdwn_itex author="[[wtk]]"]]
+
+summary
+=======
+
+Extend the [[plugins/mdwn]] plugin to support [itex][] using Jacques
+Distler's [itex2MML][].
+
+notes
+=====
+
+This is an updated form of [[users/JasonBlevins]]' plugin. You can
+see the plugin [in action][example] on my blog. The blog post lists a
+few additional changes you may need to make to use the plugin,
+including changing your page template to a MathML-friendly doctype and
+disabling plugins like [[plugins/htmlscrubber]] and
+[[plugins/htmltidy]] which would otherwise strip out the generated
+MathML.
+
+[itex]: http://golem.ph.utexas.edu/~distler/blog/itex2MMLcommands.html
+[itex2MML]: http://golem.ph.utexas.edu/~distler/blog/itex2MML.html
+[example]: http://blog.tremily.us/posts/mdwn_itex/
diff --git a/doc/todo/mdwn_preview.mdwn b/doc/todo/mdwn_preview.mdwn
new file mode 100644
index 000000000..fa69bad47
--- /dev/null
+++ b/doc/todo/mdwn_preview.mdwn
@@ -0,0 +1,339 @@
+ikiwiki needs a wysiwyg markdown editor. While there have been tries using
+WMD etc, they are not fully satisfactory, and also the license of
+everything around WMD is [[unclear|plugins/wmd/discussion]].
+
+[Hallo](https://github.com/bergie/hallo) is the closest to a solution
+I've seen.
+The user can edit the page by clicking on the html part they want to change
+and typing. Selecting text pops up a toolbar to modify it.
+
+[Demo of Hallo with live WYSIWYG markdown editing](http://bergie.github.com/hallo/markdown.html)
+This demo uses showdown, and I still don't know what the license of
+showdown is. However, the showdown part seems to only be to handle the live
+conversion from the markdown source in the edit field to the html. The
+(edited) html to markdown conversion is accomplished by Hallo.
+
+So, ikiwiki could use this in a page edit UI that does not show the
+markdown at all. The user would edit the live page, entirely in wysiwyg
+mode, and on saving hallo's generated markdown would be saved. Probably
+there would need to be a button to bring up the current markdown editor
+too, but without showdown, changes in it would not immediately preview, so
+it'd make sense to disable hallo when the editor is visible.
+
+Issue: Ikiwiki directives can generate html. We would not want that html to
+be editable by hallo and converted back to markdown. Also, the directives
+need to appear in the html so users can edit them. This seems to call for a
+special page rendering mode for editing, in which directives are either not
+expanded, or are expanded but the generated html wrapped in some tag that
+makes hallo refuse to edit it (which would probably require that feature be
+added to hallo, currently it acts on all blocks with `class=editable`),
+or otherwise allows it to be stripped out at save time. --[[Joey]]
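One possible shape for the "wrapped generated html" option is sketched below; the class name is invented here, and hallo would need a corresponding feature to skip such blocks:

```html
<!-- expanded directive output, marked so the wysiwyg editor leaves it alone
     and so it can be stripped back out at save time -->
<div class="ikiwiki-directive" contenteditable="false">
  <!-- html generated by the directive would go here -->
</div>
```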
+
+### old discussion
+
+
+The [StackOverflow](http://stackoverflow.com/) site uses markdown for markup.
+It has a fancy javascript thing for showing a real-time preview of what the user
+is editing. It would be nice if ikiwiki could support this, too. The thing they
+use on StackOverflow is supposed to be free software, so it should be easy to
+add to ikiwiki.
+
+> See [[wikiwyg]]. Note that I do not have a copy of the code for that, or
+> it'd be in ikiwiki already. --[[Joey]]
+
+>> I just had a brief look at the [[wikiwyg]] page and the link to the plugin was
+>> broken. The StackOverflow site uses the [WMD](http://wmd-editor.com/) editor,
+>> which seems to be related to the [ShowDown](http://attacklab.net/showdown/)
+>> javascript port of Markdown. Interestingly, [WMD source](http://wmd.googlecode.com/)
+>> is now available under an MIT license, though it is supposedly undergoing heavy
+>> refactoring. It looks like there was previous discussion ( [[todo/Add_showdown_GUI_input__47__edit]] )
+>> about a showdown plugin. Maybe a WMD plugin would be worthwhile. I might
+>> look into it if I have time on the weekend. -- [[Will]]
+
+[[!tag wishlist]]
+
+>>> Below is a simple plugin/[[patch]] to make use of the WMD editor.
+
+>>>> Now added to ikiwiki, thanks! --[[Joey]]
+
+>>> Turns out it isn't hard at all to
+>>> get a basic version going (which doesn't handle directives at all, nor does it switch itself off when you're
+>>> editing something other than Markdown source). I've
+>>> removed the done tag so this is visible as a patch. -- [[Will]]
+
+>>>> Hmm, it would be good if it turned off for !mdwn. Although this could
+>>>> be difficult for a new page, since there is a dropdown selector to
+>>>> choose the markup language then. But it should be doable for editing an
+>>>> existing page.
+
+>>>>> I agree. I'm working on this for both new pages and existing pages.
+>>>>> It shouldn't be hard once I get WMD going through the javascript API.
+>>>>> At the moment that is inexplicably failing, and I haven't had time to have a good look at why.
+>>>>> I may not get a chance to look at this again for a few weeks.
+
+>>>> Can I get a license statement (ie, GPL-2+) from you for the plugin?
+>>>> --[[Joey]]
+
+>>>>> Certainly. You're free to use the code I posted below under the GPL-2+ license. You'll note
+>>>>> however that I haven't said anything about the WMD code itself. The WMD web page says:
+
+>>>>>> "I'm refactoring the code, and will be releasing WMD under the MIT license soon. For now you can download the most recent release (wmd-1.0.1.zip) and use it freely."
+
+>>>>> It might be best to contact <support@attacklab.net> for an explicit license on that if you want to include it.
+>>>>> -- [[Will]]
+
+> So, I wonder if I should add a copy of the WMD source to ikiwiki, or rely
+> on the user or distribution providing it. It does not seem to be packaged
+> for Debian yet. Hmm, I also can't find any copyright or license info in
+> the zip file. --[[Joey]]
+
+>> This is a good question. My thought is that it will probably not be packaged any time soon,
+>> so you're better off adding it to IkiWiki. I'd contact the author of WMD and ask them. They
+>> may have more insight. -- [[Will]]
+
+Note that the WMD plugin does **not** handle directives. For this reason the normal `preview` button
+remains. Some CSS to clean up the display of the live WMD preview would be good.
+
+> Can you elucidate the CSS comment -- or will it be obvious what you mean
+> when I try it? Is it what's needed for the live preview? --[[Joey]]
+
+>> In the version of the plugin below, a new `div` is added just below the form. WMD
+>> populates this div with the HTML it generates from the Markdown source. This is not very
+>> pretty at the moment - it appears in the same place as the preview used to, but with no
+>> header or anything. Any standard IkiWiki preview will appear below the WMD live preview.
+>> I recommend having a look at <http://wmd-editor.com/examples/splitscreen>
+>> for what a little CSS could achieve. -- [[Will]]
+
+> Hmm, now that I've tried it, I notice that it does live preview by
+> default, below the edit window. Which is nice, but then if I hit the
+> preview button, I get two previews.. which is confusing. (Also, minor,
+> but: the live preview is missing the "Page Preview:" header.) --[[Joey]]
+
+> I wonder how annoying it would be to add some kind of simplistic wikilink
+> support to wmd's preview? And/or a wikilink button? While not supporting
+> directives is fine, not supporting wikilinks in a wiki seems a bit
+> lacking. It may also entice novice users to not use wikilinks and instead
+> use the hyperlinks that wmd does support. --[[Joey]]
+
+> Bug: When I preview, all the text in the edit field seems to be
+> converted from mdwn to html. I think that wmd is converting the mdwn
+> into html when the form is posted, so it would also save like that.
+> I assume that is designed for websites that do not use markdown
+> internally. Doesn't it have a setting to leave it as markdown?
+>> Found setting, fixed. --[[Joey]]
+
+>>> As I noted above, I've been working on the non-markdown page issue.
+>>> Below is my a new javascript file that I'm using, and below that a patch
+>>> to enable it. This patch makes the normal usage prettier - you get
+>>> a side panel with the live preview in it. It also adds a new config
+>>> option, `wmd_use101api`, which turns on code that tries to use the
+>>> wmd api. At the moment this code doesn't seem to work - moreover the
+>>> code that uses the new API dies early, so any code after that point is
+>>> completely untested. I will not
+>>> get a chance to look at this again soon though, so I thought I'd post
+>>> my progress so far. -- [[Will]]
+
+
+Place the following file in `underlays/wmd/wmd-ikiwiki.js`.
+
+----
+
+ // This is some code to interface the WMD interface 1.0.1 with IkiWiki
+ // The WMD interface is planned to change, so this file will likely need
+ // updating in future.
+
+ if (useWMDinterface) {
+ wmd_options = { autostart: false, output: "Markdown" };
+ var instance = null;
+
+ hook("onload", initwmd);
+ } else {
+ var typeSelector = document.getElementById("type");
+
+ var currentType = getType(typeSelector);
+
+ if (currentType == "mdwn") {
+ wmd_options = { output: "Markdown" };
+ document.getElementById("wmd-preview-container").style.display = 'none';
+ } else {
+ wmd_options = { autostart: false };
+ document.getElementById("wmd-preview-container").style.display = 'block';
+ }
+ }
+
+ function initwmd() {
+ if (!Attacklab || !Attacklab.wmd) {
+ alert("WMD hasn't finished loading!");
+ return;
+ }
+
+ var typeSelector = document.getElementById("type");
+
+ var currentType = getType(typeSelector);
+
+ if (currentType == "mdwn") {
+ window.setTimeout(enableWMD,10);
+ }
+
+ typeSelector.onchange=function() {
+ var docType=getType(this);
+
+ if (docType=="mdwn") {
+ enableWMD();
+ } else {
+ disableWMD();
+ }
+ }
+ }
+
+ function getType(typeSelector)
+ {
+ if (typeSelector.nodeName.toLowerCase() == 'input') {
+ return typeSelector.getAttribute('value');
+ } else if (typeSelector.nodeName.toLowerCase() == 'select') {
+ return typeSelector.value;
+ // return typeSelector.options[typeSelector.selectedIndex].innerText;
+ }
+ return "";
+ }
+
+ function enableWMD()
+ {
+ var editContent = document.getElementById("editcontent");
+ var previewDiv = document.getElementById("wmd-preview");
+ var previewDivContainer = document.getElementById("wmd-preview-container");
+
+ previewDivContainer.style.display = 'block';
+ // editContent.style.width = previewDivContainer.style.width;
+
+ /***** build the preview manager *****/
+ var panes = {input:editContent, preview:previewDiv, output:null};
+ var previewManager = new Attacklab.wmd.previewManager(panes);
+
+ /***** build the editor and tell it to refresh the preview after commands *****/
+ var editor = new Attacklab.wmd.editor(editContent,previewManager.refresh);
+
+ // save everything so we can destroy it all later
+ instance = {ta:editContent, div:previewDiv, ed:editor, pm:previewManager};
+ }
+
+ function disableWMD()
+ {
+ document.getElementById("wmd-preview-container").style.display = 'none';
+
+ if (instance != null) {
+ instance.pm.destroy();
+ instance.ed.destroy();
+ // inst.ta.style.width='100%'
+ }
+ instance = null;
+ }
+
+
+----
+
+ diff --git a/IkiWiki/Plugin/wmd.pm b/IkiWiki/Plugin/wmd.pm
+ index 9ddd237..743a0b8 100644
+ --- a/IkiWiki/Plugin/wmd.pm
+ +++ b/IkiWiki/Plugin/wmd.pm
+ @@ -17,6 +17,13 @@ sub getsetup () {
+ return
+ plugin => {
+ safe => 1,
+ + rebuild => 1,
+ + },
+ + wmd_use101api => {
+ + type => "boolean",
+ + description => "Use the advanced, but unstable, WMD api for markdown preview.",
+ + safe => 0,
+ + rebuild => 0,
+ },
+ }
+
+ @@ -24,29 +31,25 @@ sub formbuilder_setup (@) {
+ my %params=@_;
+ my $form=$params{form};
+
+ - return if ! defined $form->field("do");
+ + return unless defined $form->field("do");
+
+ return unless $form->field("do") eq "edit" ||
+ - $form->field("do") eq "create" ||
+ - $form->field("do") eq "comment";
+ -
+ - $form->tmpl_param("wmd_preview", "<div class=\"wmd-preview\"></div>\n".
+ - include_javascript(undef, 1));
+ -}
+ -
+ -sub include_javascript ($;$) {
+ - my $page=shift;
+ - my $absolute=shift;
+ -
+ - my $wmdjs=urlto("wmd/wmd.js", $page, $absolute);
+ - return <<"EOF"
+ -<script type="text/javascript">
+ -wmd_options = {
+ - output: "Markdown"
+ -};
+ -</script>
+ -<script src="$wmdjs" type="text/javascript"></script>
+ -EOF
+ + $form->field("do") eq "create" ||
+ + $form->field("do") eq "comment";
+ +
+ + my $useAPI = $config{wmd_use101api}?'true':'false';
+ + my $ikiwikijs = urlto("ikiwiki.js", undef, 1);
+ + my $wmdIkiwikijs = urlto("wmd-ikiwiki.js", undef, 1);
+ + my $wmdjs = urlto("wmd.js", undef, 1);
+ +
+ + my $previewScripts = <<"EOS";
+ + <script type="text/javascript">useWMDinterface=$useAPI;</script>
+ + <script src="$ikiwikijs" type="text/javascript"></script>
+ + <script src="$wmdIkiwikijs" type="text/javascript"></script>
+ + <script src="$wmdjs" type="text/javascript"></script>
+ +EOS
+ +
+ + $form->tmpl_param("wmd_preview", $previewScripts);
+ }
+
+ 1
+ diff --git a/doc/style.css b/doc/style.css
+ index a6e6734..36c2b13
+ --- a/doc/style.css
+ +++ b/doc/style.css
+ @@ -76,9 +76,16 @@ div.tags {
+ float: right;
+ }
+
+ +/*
+ #editcontent {
+ width: 100%;
+ }
+ +*/
+ +
+ +#wmd-preview-container {
+ + width: 49%;
+ + float: right;
+ +}
+
+ img {
+ border-style: none;
+ diff --git a/templates/editpage.tmpl b/templates/editpage.tmpl
+ index b1cf015..1d2f080 100644
+ --- a/templates/editpage.tmpl
+ +++ b/templates/editpage.tmpl
+ @@ -15,6 +15,14 @@ Page type: <TMPL_VAR FIELD-TYPE>
+ <TMPL_VAR FIELD-PAGE>
+ <TMPL_VAR FIELD-TYPE>
+ </TMPL_IF>
+ +<TMPL_IF NAME="WMD_PREVIEW">
+ +<div id="wmd-preview-container">
+ +<div class="header">
+ +<span>Live preview:</span>
+ +</div>
+ +<div class="wmd-preview" id="wmd-preview"></div>
+ +</div>
+ +</TMPL_IF>
+ <TMPL_VAR FIELD-EDITCONTENT><br />
+ <TMPL_IF NAME="CAN_COMMIT">
+ Optional comment about this change:<br />
diff --git a/doc/todo/mdwn_preview/discussion.mdwn b/doc/todo/mdwn_preview/discussion.mdwn
new file mode 100644
index 000000000..4fb30adf9
--- /dev/null
+++ b/doc/todo/mdwn_preview/discussion.mdwn
@@ -0,0 +1 @@
++1, not sure where this feature is going. I'm keen to see this!
diff --git a/doc/todo/mercurial.mdwn b/doc/todo/mercurial.mdwn
new file mode 100644
index 000000000..de1f148e5
--- /dev/null
+++ b/doc/todo/mercurial.mdwn
@@ -0,0 +1,129 @@
+* Is the code sufficiently robust? It just warns when mercurial fails.
+* When rcs_commit is called with a $user that is an openid, it will be
+ passed through to mercurial -u. Will mercurial choke on this?
+ * Nope. Mercurial doesn't expect any particular format for the username,
+ though "Name <address@domain>" is standard. --[[bma]]
+* The way `-u $user` is passed to `hg commit`, there's no way to tell
+ if a given commit came in over the web or was done directly. So
+ rcs_recentchanges hardcodes 'committype => "mercurial"'. See the monotone
+ backend for an example of one that does this right.
+* The rcs_commit implementation seems not to notice if the file has been
+ changed since a web edit started. Unlike all the other frontends, which
+ use the rcstoken to detect if the web commit started editing an earlier
+ version of the file, and if so, merge the two sets of changes together.
+ It seems that with the current mercurial commit code, it will always
+ blindly overwrite the current file with the web edited version, losing
+ any other changes.
+* `rcs_commit_staged`, `rcs_rename`, `rcs_remove`, and `rcs_diff` are not
+ implemented for mercurial, and so attachments, remove and rename plugins
+ and recentchangesdiff cannot be used with it. (These should be fairly
+ easy to add..)
+
+Posthook: in `$srcdir/.hg/hgrc`, I have the following
+
+ [hooks]
+ incoming.update = hg up
+ update.ikiwiki = ikiwiki --setup /path/to/ikiwiki.setup --refresh
+
+This should update the working directory and run ikiwiki every time a change is recorded (someone who knows mercurial better than I do may be able to suggest a better way, but this works for me.)
+
+> Try running it with --post-commit instead of --refresh. That should
+> work better, handling both the case where the edit was made via the web
+> and then committed, and the case where a commit was made directly.
+> It can deadlock if the post-commit hook runs with --refresh in the
+> former case. --[[Joey]]
+
+The problem with --post-commit is that if you delete some pages in $SRC, ikiwiki --setup setupfile --post-commit will not delete them in $DEST. --[[users/weakish]]
+
+> You should really be using a setup file that has `mercurial_wrapper`
+> set, and running the wrapper generated by that from your hook.
+> That will work. I think that the `--setup --post-commit` on the command
+> line is currently broken and does the same expensive rebuild process as --setup
+> alone (which doesn't delete files from $DEST either). Will fix that.
+> (fixed)
+> --[[Joey]]
+
+>> Mercurial doesn't support putting hooks in .hg/hooks/* (like git). In Mercurial, the only way to run
+>> your own hooks is specifying them in the hgrc file. (Or write a new extension.)
+>> I guess using a very long command will work.
+>> (e.g. ikiwiki --post-commit --a-lot-of-switches --set var=value $SRC $DEST)
+>> (Fortunately ikiwiki supports --set var=value, so it works without --setup.)
+>>
+>> Alternative is always editing via cgi or pushing. Never work on the $SRC/repo directly.
+>> --[[users/weakish]]
+
+>>> I don't see anything preventing you from using a setup file with
+>>> `mercurial_wrapper => ".hg/ikiwiki-hook",` and then modifying the hgrc
+>>> to run that wrapper. --[[Joey]]
+
+>> Thanks for pointing out this. I have some stupid misunderstanding on the
+>> usage of mercurial_wrapper before. The wrapper works nicely! --[[weakish]]
+
+I add the following to .hg/hgrc: (I use changegroup since I don't think we need a refresh per changeset; please point out if I am wrong.)
+
+ [hooks]
+ changegroup = hg update >&2 && ikiwiki --setup path.to.setup.file --refresh
+ post-commit = path.to.the.mercurial.wrapper
+
+-----
+
+I have no idea when the deadlock will happen. --[[users/weakish]]
+
+> For the deadlock to occur, a edit has to be made via the web.
+>
+> Ikiwiki,
+> running as a CGI, takes a lock on the wiki, and commits the edit,
+> continuing to run in the background, with the lock still held.
+> When the edit is committed, the hg hook runs, running `ikwiki --refresh`.
+> Nearly the first thing that process does is try to lock the wiki..
+> which is already locked. This lock is taken in a blocking manner,
+> thus the deadlock -- the cgi is waiting for the commit to finish before
+> dropping the lock, and the commit is blocked waiting for the lock to be
+> released.
+>
+> --post-commit avoids this problem by checking if the cgi is running
+> and avoiding doing anything in that case. (While still handing the
+> refresh if the commit was not made by the CGI.)
+> So in that case, the commit finishes w/o ikiwiki doing anything,
+> and the ikiwiki CGI handles the wiki refresh.
+> --[[Joey]]
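The non-blocking check that `--post-commit` relies on can be sketched with `flock(1)` from util-linux (this is an illustration of the mechanism, not ikiwiki's actual code; the lock path and messages are made up):

```shell
# Try to take the wiki lock without blocking; if another process (the
# CGI) already holds it, skip the refresh instead of deadlocking.
lock="$(mktemp)"
if flock --nonblock "$lock" true; then
    echo "lock free: refresh the wiki"
else
    echo "lock held by CGI: skip refresh, the CGI will handle it"
fi
rm -f "$lock"
```

Since a blocking `flock` would wait forever for the CGI to finish, and the CGI is itself waiting for the commit to return, the non-blocking probe is what breaks the cycle.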
+
+
+***
+
+I have a few notes on mercurial usage after trying it out for a while:
+
+1. I have been using ikiwiki's `--post-commit` option without apparent problems. I'm the only current user of my wiki, though.
+
+1. The `ikiwiki.setup` file included in ikiwiki works with mercurial's `hg serve`, which is not the preferred solution. Mercurial's `hgwebdir.cgi` is more flexible and doesn't require running a server. I have this in my .setup file:
+
+ # Mercurial stuff.
+ rcs => "mercurial",
+ historyurl => "http://localhost/cgi-bin/hgwebdir.cgi/ikiwiki/log/tip/\[[file]]",
+ diffurl => "http://localhost/cgi-bin/hgwebdir.cgi/ikiwiki/diff/tip/\[[file]]",
+
+1. I have noticed that running `ikiwiki` after a change to the wiki adds files to a directory called `recentchanges` under `$srcdir`. I don't understand why such files are needed; worse, they are not added to mercurial's list of tracked files, so they pollute the output of `hg log`. Is this a bug? Should mercurial's commit hook be modified to add these files before the commit?
+
+--buo
+
+> No, those files should not be added to revision control. --[[Joey]]
+
+>> OK. I see two problems:
+
+>> 1. If I clone my wiki, I won't get an exact copy of it: I will lose the recentchanges history. This could be an acceptable limitation but IMO this should be documented.
+
+>>> The history is stored in mercurial. How will it be lost?
+
+>> 2. The output of `hg status` is polluted. This could be solved trivially by adding a line containing `recentchanges` to `.hgignore`. Another alternative would be to store the `recentchanges` directory inside `$srdcir/.ikiwiki`.
+
+>> I think the ideal solution would be to build `$destdir/recentchanges/*` directly from the output of `hg log`. --[[buo]]
+
+>>>> That would be 100 times as slow, so I chose not to do that. --[[Joey]]
+
+>>>> Since this is confusing people, allow me to clarify: Ikiwiki's
+>>>> recentchanges generation pulls log information directly out of the VCS as
+>>>> needed. It caches it in recentchanges/* in the `srcdir`. These cache
+>>>> files need not be preserved, should never be checked into VCS, and if
+>>>> you want to you can configure your VCSignore file to ignore them,
+>>>> just as you can configure it to ignore the `.ikiwiki` directory in the
+>>>> `srcdir`. --[[Joey]]
diff --git a/doc/todo/mercurial/discussion.mdwn b/doc/todo/mercurial/discussion.mdwn
new file mode 100644
index 000000000..de3670d9b
--- /dev/null
+++ b/doc/todo/mercurial/discussion.mdwn
@@ -0,0 +1,9 @@
+How does the lack of a post-commit hook for mercurial affect my ikiwiki installation?
+I want to use ikiwiki with one of the distributed scm systems, and mercurial appears to have the best balance of mature ikiwiki support and windows support.
+
+> Without a post-commit hook, changes committed to the wiki (either via
+> mercurial or via the web) will not automatically cause ikiwiki to run to
+> rebuild the changed pages. The parent page has an example of how to
+> configure mercurial to run ikiwiki as a post-commit hook. Someone just
+> needs to test this (including my suggested change) and then we could
+> document it in the setup page. --[[Joey]]
diff --git a/doc/todo/meta_rcsid.mdwn b/doc/todo/meta_rcsid.mdwn
new file mode 100644
index 000000000..9e112317f
--- /dev/null
+++ b/doc/todo/meta_rcsid.mdwn
@@ -0,0 +1,51 @@
+The following patch adds an 'rcsid' parameter to the [[!taglink plugins/Meta]] plugin, to allow inclusion
+of CVS/SVN-style keywords (like '$Id$', etc.) from the source file in the page template.
+
+> So the idea is you'd write something like:
+>
+> \[[!meta rcsid="$Id$"]]
+>
+> And this would be put at the bottom of the page or somewhere like that by
+> the template?
+>
+> I wonder if it wouldn't be just as clear to say:
+>
+> <span class="rcsid">$Id$</span>
+>
+> And then use a stylesheet to display it as desired.
+> --[[Joey]]
+
+>> That's possibly true; my reasoning was that I wanted it to be more independent
+>> of the page content, and independent of any stylesheet.
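For what it's worth, the stylesheet treatment Joey suggests might be a rule like this (a sketch; the class name comes from the `<span class="rcsid">` markup above):

```css
/* push the RCS keyword out of the way of the page content */
.rcsid {
    display: block;
    text-align: right;
    font-size: 80%;
    color: #666;
}
```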
+
+ --- meta.pm.orig 2007-10-10 19:57:04.000000000 +0100
+ +++ meta.pm 2007-10-10 20:07:37.000000000 +0100
+ @@ -13,6 +13,7 @@
+ my %authorurl;
+ my %license;
+ my %copyright;
+ +my %rcsid;
+
+ sub import {
+ hook(type => "preprocess", id => "meta", call => \&preprocess, scan => 1);
+ @@ -110,6 +111,9 @@
+ $meta{$page}.="<link rel=\"copyright\" href=\"#page_copyright\" />\n";
+ $copyright{$page}=$value;
+ }
+ + elsif ($key eq 'rcsid') {
+ + $rcsid{$page}=$value;
+ + }
+ else {
+ $meta{$page}.=scrub("<meta name=\"".encode_entities($key).
+ "\" content=\"".encode_entities($value)."\" />\n");
+ @@ -142,6 +146,8 @@
+ if exists $author{$page} && $template->query(name => "author");
+ $template->param(authorurl => $authorurl{$page})
+ if exists $authorurl{$page} && $template->query(name => "authorurl");
+ + $template->param(rcsid => $rcsid{$page})
+ + if exists $rcsid{$page} && $template->query(name => "rcsid");
+
+ if ($page ne $destpage &&
+ ((exists $license{$page} && ! exists $license{$destpage}) ||
+
+[[patch]]
diff --git a/doc/todo/metadata.mdwn b/doc/todo/metadata.mdwn
new file mode 100644
index 000000000..361f00351
--- /dev/null
+++ b/doc/todo/metadata.mdwn
@@ -0,0 +1,19 @@
+There should be a way to add metadata to a page. Probably a plugin could do
+this, for example:
+
+ \[[!meta foo="bar"]]
+
+Uses for this include:
+
+* Setting a page title that's not tied to the filename.
+* Any metadata that's generally useful on html pages.
+* Maybe as an alternate way to tag a page, like linking to the tag,
+ except it doesn't have to show up in the page text.
+* Recording page licenses.
+
+[[!meta link=done]]
+[[!meta title="supporting metadata..."]]
+[[!meta author="Joey Hess"]]
+[[!meta link="foo.css" rel="stylesheet" type="text/css"]]
+
+[[todo/done]]
diff --git a/doc/todo/minor_adjustment_to_setup_documentation_for_recentchanges_feeds.mdwn b/doc/todo/minor_adjustment_to_setup_documentation_for_recentchanges_feeds.mdwn
new file mode 100644
index 000000000..c2dd5fbf4
--- /dev/null
+++ b/doc/todo/minor_adjustment_to_setup_documentation_for_recentchanges_feeds.mdwn
@@ -0,0 +1,28 @@
+Expand a comment so you know which bit to uncomment if you want to
+turn on feeds for recentchanges.
+
+ diff --git a/doc/ikiwiki.setup b/doc/ikiwiki.setup
+ index 99c81cf..7ca7687 100644
+ --- a/doc/ikiwiki.setup
+ +++ b/doc/ikiwiki.setup
+ @@ -91,9 +91,9 @@ use IkiWiki::Setup::Standard {
+ #},
+ ],
+
+ - # Default to generating rss feeds for blogs?
+ + # Default to generating rss feeds for blogs/recentchanges?
+ #rss => 1,
+ - # Default to generating atom feeds for blogs?
+ + # Default to generating atom feeds for blogs/recentchanges?
+ #atom => 1,
+ # Allow generating feeds even if not generated by default?
+ #allowrss => 1,
+
+[[!tag patch]]
+
+> Hmm, recentchanges is just a blog. Of course the word "blog" is perhaps
+> being used in too broad a sense here, since it tends to imply personal
+> opinions, commentary, not-a-journalist, sitting-in-ones-underwear-typing,
+> and lots of other fairly silly stuff. But I don't know of a better word
+> w/o all these connotations. I've reworded it to not use the term "blog"..
+> [[done]] --[[Joey]]
diff --git a/doc/todo/mirrorlist_with_per-mirror_usedirs_settings.mdwn b/doc/todo/mirrorlist_with_per-mirror_usedirs_settings.mdwn
new file mode 100644
index 000000000..5701d8e2b
--- /dev/null
+++ b/doc/todo/mirrorlist_with_per-mirror_usedirs_settings.mdwn
@@ -0,0 +1,103 @@
+I've got a wiki that is built at two places:
+
+* a static copy, aimed at being viewed without any web server, using
+ a web browser's `file:///` urls => usedirs is disabled to get nice
+ and working links
+* an online copy, with usedirs enabled in order to benefit from the
+ language negotiation using the po plugin
+
+I need to use mirrorlist on the static copy, so that one can easily
+reach the online, possibly updated, pages. But as documented, "pages are
+assumed to exist in the same location under the specified url on each
+mirror", so the generated urls are wrong.
+
+My `mirrorlist` branch contains a patch that allows one to configure usedirs
+per-mirror. Note: the old configuration format is still supported, so this should
+not break existing wikis.
+
+OT: as a bonus, this branch contains a patch to support {hashes,arrays} of
+{hashes,arrays} in `$config`, which I missed when writing the po plugin;
+this time I decided it was really needed to implement this feature.
+
+--[[intrigeri]]
+
+> Ping. --[[intrigeri]]
+
+[[!tag patch]]
+
+>> (I'm not an ikiwiki committer, opinions may vary.)
+>>
+>>> In my opinion, you're an ikiwiki committer! --[[Joey]]
+>>
+>> This would be easier to review if there weren't a million merges from
+>> master; perhaps either leave a branch as-is, or rebase it, or merge
+>> only at "significant" times like after a release?
+>>
+>> I believe Joey's main objection to complex $config entries is that
+>> it's not at all clear what [[plugins/websetup]] would do with them.
+>> Would something like this make a reasonable alternative?
+>>
+>> $config{mirrorlist} = ["nousedirs|file:///home/intrigeri/wiki",
+>> "usedirs|http://example.com/wiki", "http://example.net"];
+>>
+>> From how I understand tainting, this:
+>>
+>> $untainted{$_} = possibly_foolish_untaint($tainted->{$_})
+>>
+>> probably needs to untaint the key too:
+>>
+>> my $key = possibly_foolish_untaint($_);
+>> $untainted{$key} = possibly_foolish_untaint($tainted->{key});
+>>
+>> --[[smcv]]
+
+>>> You are fully right about the complex `$config` entries. I'll
+>>> convert this to use what you are suggesting, i.e. what we ended up
+>>> choosing for the `po_slave_languages` setting.
+>>>
+>>> About the merges in this branch: Joey told me once he did not care
+>>> about this; moreover the `--no-merges` git log option makes it
+>>> easy to filter these out. I'll try merging tagged releases only in
+>>> the future, though.
+>>>
+>>> --[[intrigeri]]
+
+>>>> FWIW, I don't care about merge commits etc because I review
+>>>> `git diff ...intrigeri/mirrorlist` -- and if I want to dig deeper
+>>>> into the why of some code, I'll probably checkout the branch and
+>>>> use git blame.
+>>>>
+>>>> I agree with what smcv said, my other concern though is that
+>>>> this is such an edge case, that supporting it just adds clutter.
+>>>> Have to wonder if it wouldn't perhaps be better to do something
+>>>> using the goto plugin and cgiurl, so that the mirror doesn't have
+>>>> to know about the configuration of the other mirror. --[[Joey]]
+
+>>>>> I have implemented something using the cgi + goto in my (history
+>>>>> rewrite warning) mirrorlist branch. Please review, please pull.
+>>>>> --[[intrigeri]]
+
+>>>>>> Ping? I've merged 3.20110321 in my `mirrorlist` branch and
+>>>>>> checked it still works properly. --[[intrigeri]]
+
+>>>>>>> Joey: ping? I've rebased my `mirrorlist` branch on top of
+>>>>>>> 3.20120419, and checked it still works properly. I really
+>>>>>>> would like to see this functionality merged in time
+>>>>>>> for Wheezy. --[[intrigeri]]
+
+>>>>> concerning goto/cgiurl, what about having that as the default in
+>>>>> mirrorlist, but keeping ``nousedirs|file:///home/intrigeri/wiki`` and
+>>>>> ``usedirs|http://example.com/wiki`` valid for cgi-less cases?
+>>>>> that would keep typical installation with a clutter-less configuration,
+>>>>> and support more individual setups too.
+>>>>> --[[chrysn]]
+
+>>>>>> I would not mind. On the other hand Joey was concerned about
+>>>>>> cluttering the code to support edge cases, which I fully
+>>>>>> understand. The case you (chrysn) are describing being even
+>>>>>> more specific than the one I was initially talking of, I think
+>>>>>> this should not block the merge of the branch I have been
+>>>>>> proposing. Support for the usecase you are suggesting can
+>>>>>> always be added later if needed. --[[intrigeri]]
+
+>>>>>>> Well, that came out nice and clean. [[done]] --[[Joey]]
diff --git a/doc/todo/missingparents.pm.mdwn b/doc/todo/missingparents.pm.mdwn
new file mode 100644
index 000000000..cecac7a94
--- /dev/null
+++ b/doc/todo/missingparents.pm.mdwn
@@ -0,0 +1,261 @@
+This is another blogging support thing, and it relies on
+[[pagespec_relative_to_a_target]] (but only to figure out whether a given page
+has a child). Basically, you give it a page called missingparents.mdwn,
+something like this:
+
+<pre>
+[[!missingparents pages="posts/* and !posts/*/*" generate="""[[!template id=year text="$page"]]"""]]
+[[!missingparents pages="posts/*/* and !posts/*/*/*" generate="""[[!template id=month text="$page"]]"""]]
+[[!missingparents pages="posts/*/*/* and !posts/*/*/*/*" generate="""[[!template id=day text="$page"]]"""]]
+</pre>
+
+And it scans the whole wiki for pages that match the pagespecs but are missing
+parents. If any are found, they are generated automatically using the text in
+the "generate" parameter (with the page name substituted for $page).
+*These generated pages aren't kept in version control*, but of course they're
+ordinary wiki pages and can be edited by the web form or otherwise added, at
+which point the missingparents plugin lets go of them. (TODO: CGI.pm needs to
+know to rcs_add these pages if they are edited, and it doesn't.) If all of the
+children of a missingparent page go away, the missingparent itself is
+unlinked automatically, and all missingparents are deleted on wiki rebuild.
+
+To implement this, I needed to tell ikiwiki that pages were being added and
+removed in a non-standard way, and so created functions newpage and delpage
+in the IkiWiki namespace to do these things. delpage is modeled on the
+Render.pm code that deletes pages, so I re-used it in Render.pm. I also
+needed a way to add files to be deleted on a refresh(), so I added a
+needsdelete hook, parallel in form to needsbuild.
+
+This patch, or one like it, would enable better blogging support, by adding
+the ability to hierarchically organize blog posts and automatically generate
+structural pages for year, month, or day. Please apply. --Ethan
+
+<pre>
+Index: IkiWiki/Render.pm
+===================================================================
+--- IkiWiki/Render.pm (revision 3926)
++++ IkiWiki/Render.pm (working copy)
+@@ -322,17 +322,7 @@
+ if (! $exists{$page}) {
+ debug(sprintf(gettext("removing old page %s"), $page));
+ push @del, $pagesources{$page};
+- $links{$page}=[];
+- $renderedfiles{$page}=[];
+- $pagemtime{$page}=0;
+- prune($config{destdir}."/".$_)
+- foreach @{$oldrenderedfiles{$page}};
+- delete $pagesources{$page};
+- foreach (keys %destsources) {
+- if ($destsources{$_} eq $page) {
+- delete $destsources{$_};
+- }
+- }
++ delpage($page);
+ }
+ }
+
+@@ -377,6 +367,10 @@
+ }
+ }
+
++ if (@del) {
++ run_hooks(needsdelete => sub { shift->(\@del) });
++ }
++
+ if (%rendered || @del) {
+ # rebuild dependant pages
+ foreach my $f (@files) {
+Index: IkiWiki/Plugin/missingparents.pm
+===================================================================
+--- IkiWiki/Plugin/missingparents.pm (revision 0)
++++ IkiWiki/Plugin/missingparents.pm (revision 0)
+@@ -0,0 +1,142 @@
++#!/usr/bin/perl
++# missingparents plugin: detect missing parents of pages and create them
++package IkiWiki::Plugin::missingparents;
++
++use warnings;
++use strict;
++use IkiWiki 2.00;
++use IkiWiki::Plugin::relative;
++
++my %ownfiles;
++my @pagespecs;
++
++sub import {
++ hook(type => "checkconfig", id => "missingparents", call => \&checkconfig);
++ hook(type => "needsdelete", id => "missingparents", call => \&needsdelete);
++ hook(type => "needsbuild", id => "missingparents", call => \&needsbuild);
++ hook(type => "savestate", id => "missingparents", call => \&savestate);
++ hook(type => "preprocess", id => "missingparents", call => \&preprocess_missingparents);
++}
++
++sub checkconfig () {
++ IkiWiki::preprocess("missingparents", "missingparents",
++ readfile(srcfile("missingparents.mdwn")));
++ loadstate();
++ if ($config{rebuild}){
++ foreach my $file (keys %ownfiles) {
++ unlink $config{srcdir}.'/'.$file;
++ }
++ }
++}
++
++sub preprocess_missingparents (@) {
++ my %params=@_;
++
++ if (! defined $params{pages} || ! defined $params{generate}) {
++ return "[[!missingparents ".gettext("missing pages or generate parameter")."]]";
++ }
++
++ push @pagespecs, \%params;
++
++ #translators: This is used to display what missingparents are defined.
++ #translators: First parameter is a pagespec, the second
++ #translators: is text for pages that match that pagespec.
++ return sprintf(gettext("missingparents in %s will be %s"),
++ '`'.$params{pages}.'`', '`\\'.$params{generate}.'`');
++}
++
++my $state_loaded=0;
++sub loadstate() {
++ my $filename = "$config{wikistatedir}/missingparents";
++ if (-e $filename) {
++ open (IN, $filename) ||
++ die "$filename: $!";
++ while (<IN>) {
++ chomp;
++ $ownfiles{$_} = 1;
++ }
++
++ close IN;
++
++ $state_loaded=1;
++ }
++}
++
++sub savestate() {
++ my $filename = "$config{wikistatedir}/missingparents.new";
++ my $cleanup = sub { unlink ($filename) };
++ open (OUT, ">$filename") || error("open $filename: $!", $cleanup);
++ foreach my $data (keys %ownfiles) {
++ print OUT "$data\n" if $ownfiles{$data};
++ }
++ rename($filename, "$config{wikistatedir}/missingparents") ||
++ error("rename $filename: $!", $cleanup);
++}
++
++sub needsdelete (@) {
++ my $files=shift;
++
++ my @mydel;
++ my $pruned = 1;
++ do {
++ $pruned = 0;
++ foreach my $file (keys %ownfiles) {
++ my $page = pagename($file);
++ if (! IkiWiki::PageSpec::match_has_child($page, "")) {
++ # No children -- get rid of it
++ push @mydel, $page;
++ delete $ownfiles{$file};
++ IkiWiki::delpage($page);
++ unlink $config{srcdir}."/".$file;
++ $pruned = 1;
++ }
++ }
++ } while($pruned);
++ foreach my $page (@mydel){
++ push @{$files}, $page;
++ }
++}
++
++sub check_matches($) {
++ my $page = shift;
++ return if $IkiWiki::pagesources{$page};
++
++ foreach my $miss (@pagespecs) {
++ next unless pagespec_match($page, $miss->{pages});
++ my $text = $miss->{generate};
++ $text =~ s/\$page/$page/;
++ my $output = $page.".mdwn";
++ writefile($output, "$config{srcdir}/", $text);
++ IkiWiki::newpage($output, $page);
++ return $output;
++ }
++ return "";
++}
++
++sub needsbuild ($) {
++ my $files=shift;
++ my @new;
++
++ foreach my $file (@{$files}) {
++ if ($ownfiles{$file}) {
++ # someone edited our file, making it the
++ # user's problem
++ delete $ownfiles{$file};
++ next;
++ }
++ my $page = pagename $file;
++ my $newfile = "";
++ foreach my $parent (split '/', $page) {
++ $newfile .= $parent;
++ my $output = check_matches($newfile);
++ push @new, $output if $output;
++ $newfile .= "/";
++ }
++ }
++ foreach my $file (@new) {
++ $ownfiles{$file} = 1;
++ push @{$files}, $file;
++ }
++}
++
++1
+Index: IkiWiki.pm
+===================================================================
+--- IkiWiki.pm (revision 3926)
++++ IkiWiki.pm (working copy)
+@@ -16,7 +16,7 @@
+ use Exporter q{import};
+ our @EXPORT = qw(hook debug error template htmlpage add_depends pagespec_match
+ bestlink htmllink readfile writefile pagetype srcfile pagename
+- displaytime will_render gettext urlto targetpage
++ displaytime will_render gettext urlto targetpage newpage delpage
+ %config %links %renderedfiles %pagesources %destsources);
+ our $VERSION = 2.00; # plugin interface version, next is ikiwiki version
+ our $version='unknown'; # VERSION_AUTOREPLACE done by Makefile, DNE
+@@ -330,6 +336,30 @@
+ error("failed renaming $newfile to $destdir/$file: $!", $cleanup);
+ }
+
++sub newpage($$) {
++ my $file=shift;
++ my $page=shift;
++
++ $pagemtime{$page} = $pagectime{$page} = time;
++ $pagesources{$page} = $file;
++ $pagecase{lc $page} = $page;
++}
++
++sub delpage($) {
++ my $page=shift;
++ $links{$page}=[];
++ $renderedfiles{$page}=[];
++ $pagemtime{$page}=0;
++ prune($config{destdir}."/".$_)
++ foreach @{$oldrenderedfiles{$page}};
++ delete $pagesources{$page};
++ foreach (keys %destsources) {
++ if ($destsources{$_} eq $page) {
++ delete $destsources{$_};
++ }
++ }
++}
++
+ my %cleared;
+ sub will_render ($$;$) {
+ my $page=shift;
+</pre>
+
+[[!tag patch patch/core]]
diff --git a/doc/todo/modify_page_filename_in_plugin.mdwn b/doc/todo/modify_page_filename_in_plugin.mdwn
new file mode 100644
index 000000000..a13c8b62f
--- /dev/null
+++ b/doc/todo/modify_page_filename_in_plugin.mdwn
@@ -0,0 +1,35 @@
+I'm writing a plugin to wikify c/c++ code.
+
+By default ikiwiki generates xxx.html for a file called xxx.c.
+
+The problem is that I occasionally have xxx.c and xxx.h in the same directory and there's a filename collision.
+
+My solution is to allow plugins to provide a hook that sets the pagename. --[[/users/bstpierre]]
+
+> You might also find the solution to [[bugs/multiple_pages_with_same_name]] helps you. That patch is already applied. -- [[Will]]
+
+ --- /usr/share/perl5/IkiWiki.pm.ORIG 2008-10-03 14:12:50.000000000 -0400
+ +++ /usr/share/perl5/IkiWiki.pm 2008-10-07 11:57:26.000000000 -0400
+ @@ -196,11 +196,32 @@
+
+ sub pagename ($) {
+ my $file=shift;
+
+ my $type=pagetype($file);
+ +
+ + if(defined $type &&
+ + exists $hooks{pagename} &&
+ + exists $hooks{pagename}{$type}) {
+ +
+ + return $hooks{pagename}{$type}{call}($file);
+ +
+ + } else {
+ +
+ my $page=$file;
+ $page=~s/\Q.$type\E*$// if defined $type;
+ return $page;
+ + }
+ }
+
+ sub htmlpage ($) {
+
diff --git a/doc/todo/monochrome_theme.mdwn b/doc/todo/monochrome_theme.mdwn
new file mode 100644
index 000000000..eaf51c080
--- /dev/null
+++ b/doc/todo/monochrome_theme.mdwn
@@ -0,0 +1,48 @@
+[[!template id=gitbranch branch=jmtd/monochrome_theme author="[[Jon]]"]]
+
+[As requested](http://jmtd.net/log/goodreads/), please find a new theme named
+'monochrome' in the listed git repo/branch. [Here's the screenshot of what it looks like](https://github.com/jmtd/ikiwiki/blob/30af2437cd41d394930864e93b3c2319d1ec2b06/doc/themes/monochrome.png). — [[Jon]]
+
+Perhaps controversially, I think that this would be a good basis for a default theme for the ikiwiki website. (I suspect more work is needed; I have not tested the theme against every plugin which provides theme-able bits and pieces, nor with e.g. HTML5 mode turned on, etc. etc.) Whilst the anti-theme is the best default for an ikiwiki instance (although an argument could be made against that, too!), the site needs to try to advertise some of the potential of ikiwiki to visitors, and serve as an example of what can be done. I'd appreciate thoughts from frequent ikiwiki contributors on this proposal ☺ — [[Jon]]
+
+> I appreciate you putting that branch together. I was ready to merge it,
+> but `themes/monochrome/style.css` seems to contain a lot of redundant
+> things that are in ikiwiki's normal style.css. This is especially
+> redundant since ikiwiki's style.css gets prepended to the theme's stylesheet
+> at build time! Can you remove those redundant bits please? (PITA I know,
+> but it will make maintaining this much easier.) --[[Joey]]
+
+>> Sure I'll sort that out. Sorry, I didn't realise the prepending was an automatic process. I did it manually. It should be quick for me to fix. — [[Jon]]
+
+>>> Fixed. I rebased the branch; hopefully that won't cause your script issues. — [[Jon]]
+
+>>>> I've merged your branch.
+>>>>
+>>>> Looking more closely at the css, I do have a few questions:
+>>>>
+>>>> * Is the google-provided font really necessary? I consider that a sort
+>>>> of web bug, I would prefer users of ikiwiki not need to worry that
+>>>> their referer information is being sent to some third party.
+>>>> I'd also prefer for ikiwiki sites to always be functional when
+>>>> viewed offline.
+>>>> * The external link markup needs the local url to be put into
+>>>> local.css to work right, correct? I wonder if this is too much of a
+>>>> complication to ask of users. It seems to be it could either be left
+>>>> out of the theme, or perhaps ikiwiki could be made to expand
+>>>> something in the css to the site's url at build time.
+>>>>
+>>>> --[[Joey]]
+
+>>>>>Thanks for merging!
+>>>>>
+>>>>> * the font is not necessary. I will check, it might be license-compatible
+>>>>> and thus could be bundled. As things stand, if people have no 'net connection
+>>>>> or the font fails to load, the theme still "works". Good point RE the referral
+>>>>> situation.
+>>>>>
+>>>>> * The external link markup works without customizing the CSS, but if something
+>>>>> generates a non-relative link within the content area of a page, it will be
+>>>>> styled as an external link. By default, nothing does this in ikiwiki afaik,
+>>>>> so the impact is pretty small. (except perhaps if someone specifies an absolute
+>>>>> `cgiurl` path?) The additional customization is belt-and-braces.
+>>>>> — [[Jon]]
diff --git a/doc/todo/more_class__61____34____34___for_css.mdwn b/doc/todo/more_class__61____34____34___for_css.mdwn
new file mode 100644
index 000000000..cace27d63
--- /dev/null
+++ b/doc/todo/more_class__61____34____34___for_css.mdwn
@@ -0,0 +1,83 @@
+I'm writing my own CSS for ikiwiki. During this effort I often found the need to add more class="" attributes to the default ikiwiki templates. This way more presentational aspects of visual formatting can be delegated to CSS and removed from the HTML structure.
+
+In this patch I plan to collect changes in this direction.
+
+The first, one-liner patch is to use a "div" element with a
+class="actions" attribute for inlined pages, as is done with non-inlined pages.
+This way the same CSS formatting can be applied to div.actions in the CSS,
+while at the moment it must be duplicated for a span.actions (which I
+believe is also incorrect, since it will contain a "ul" element, not sure
+though). In case the markup should be differentiated it will still be
+possible relying on the fact that a div.actions is contained or not in a
+div.inlinepage.
+
+Here's the one-liner:
+
+> applied --[[Joey]]
+
+----
+
+The following adds a div element with class="trailer" around the meta-information
+added after an inlined page (namely: the post date, the tags, and the actions):
+
+ --- inlinepage.tmpl.orig 2006-12-28 16:56:49.000000000 +0100
+ +++ inlinepage.tmpl 2006-12-28 17:02:06.000000000 +0100
+ @@ -17,6 +17,8 @@
+ </span>
+ <TMPL_VAR CONTENT>
+
+ +<div class="trailer">
+ +
+ <span class="pageinfo">
+ Posted <TMPL_VAR CTIME>
+ </span>
+ @@ -44,3 +46,5 @@
+ </TMPL_IF>
+
+ </div>
+ +
+ +</div>
+
+[[!tag patch]]
+
+> Unfortunately, the inlinepage content passes through markdown, and markdown
+> gets confused by these nested div's and puts p's around one of them, generating
+> broken html. If you can come up with a way to put in the div that passes
+> the test suite, or a fix to markdown, I will accept it, but the above patch
+> fails the test suite. --[[Joey]]
+
+>> Just a note... This discrepancy doesn't exist in [pandoc](http://code.google.com/p/pandoc/) as
+>> demonstrated in the relevant [page](http://code.google.com/p/pandoc/wiki/PandocVsMarkdownPl).
+>> Pandoc is a _real parser_ for markdown (contrasting the regexp based implementation of
+>> markdown.pl). I've almost finished the Debian packaging. John is working on a `--strict` mode
+>> which will hopefully make pandoc a drop-in replacement for markdown. I'll upload pandoc after
+>> his work has finished. Whether it could be used in IkiWiki is an open question, but having
+>> alternatives is always a good thing and perhaps, the fact that pandoc can make markdown->LaTeX
+>> conversion may lead to new possibilities. --[[Roktas]]
+
+>>> I confirm that this ([[!debbug 405058]]) has just been fixed in markdown
+>>> [`1.0.2b7`](http://packages.debian.org/experimental/web/markdown) (BTW, thanks to your bug
+>>> report Joey). FYI, I've observed some performance drop with `1.0.2b7` compared to `1.0.1`,
+>>> especially noticable with big files. This was also confirmed by someone else, for example,
+>>> see this [thread](http://six.pairlist.net/pipermail/markdown-discuss/2006-August/000152.html)
+>>> --[[Roktas]]
+
+>>>> 1.0.2b7 is slower, but ok, and parses much better. I'm waiting for it
+>>>> to at least get into debian testing before I make ikiwiki depend on it
+>>>> though. --[[Joey]]
+
+>> This Markdown issue seems to have been worked around by the optimization
+>> in which \[[!inline]] is replaced with a placeholder, and the
+>> placeholder is later replaced by the HTML. Meanwhile, this patch
+>> has been obsoleted by applying a similar one (wrapping things in a div
+>> with class inlinefooter). That was the last remaining unapplied patch
+>> on this page, so I think this whole page can be considered [[done]].
+>> --[[smcv]]
+
+----
+
+I'd like a class attribute on the `<span>` tag surrounding wikilinks
+that refer to non-existent pages, in Ikiwiki.pm:htmllink, so that such
+broken links can be styled more dramatically with CSS. --Jamey
+
+> added --[[Joey]]
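A local.css rule for such links might then look like this (a sketch; the class name is assumed, so adjust it to whatever class the span actually gets):

```css
/* make wikilinks to nonexistent pages stand out */
.createlink {
    color: #a00;
    font-style: italic;
}
```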
diff --git a/doc/todo/more_customisable_titlepage_function.mdwn b/doc/todo/more_customisable_titlepage_function.mdwn
new file mode 100644
index 000000000..97fefbafc
--- /dev/null
+++ b/doc/todo/more_customisable_titlepage_function.mdwn
@@ -0,0 +1,42 @@
+I understand the `IkiWiki::titlepage` function is used to generate filenames from titles. It would be nice if there were an easier way to override what it does. I suppose I could write an *external* plugin and call `inject`, but maybe this could instead be done via the configuration file?
+
+I imagine two things: a lookup hash and a template.
+
+Since `IkiWiki::titlepage` basically translates characters, it would be cool to be able to define a lookup hash in the configuration, which would be consulted before falling back to the generic `__xx__` `ord()` representation of a letter. For instance, in German, I might prefer to have 'ä' become 'ae' instead of something illegible.
+
+> This is [[todo/unaccent_url_instead_of_encoding]]. --[[smcv]]
+
+Second, maybe a template could be honoured. The template could have a slot `%s` where the calculated title goes, and it could contain `strftime` symbols as well as variables, which get interpolated on use.
+
+Another option would be a function I could define in the setup file, or an external script, though that would be pretty expensive.
+
+-- [[madduck]]
+
+> This somewhat resembles [[todo/inline_postform_autotitles]].
+> Another way to do this, suggested in that todo, would be to
+> pre-fill the title field with YYYY/MM/DD/ using Javascript.
+> --[[smcv]]
+
+I don't think that changing titlepage is a good idea; there are
+compatibility problems.
+
+Instead, I think that in the specific case of the blogpost form, there
+should be an interface to allow plugins to do arbitrary transformations of
+the page name.
+
+So, add a hidden field to blogpost.tmpl, something like blogpost=1. Then in
+`editpage`, if blogpost is set, call the blogpost hooks, which are passed
+a page name and return a transformed version.
+
+If the page name is changed by those, then the user's original title might
+need to be preserved via a meta title directive. This could just be
+inserted if any changes are made to the page name. Only problem with this
+is that having the directive appear in the edit box for a new page could
+confuse the user. The title could be passed on in a hidden field, and
+prepended to the page when it's saved.
+
+--[[Joey]]
+
+> I'll pass on these comments to the two similar todo items. --[[smcv]]
+
+[[wishlist]]
diff --git a/doc/todo/more_flexible_inline_postform.mdwn b/doc/todo/more_flexible_inline_postform.mdwn
new file mode 100644
index 000000000..414476bd7
--- /dev/null
+++ b/doc/todo/more_flexible_inline_postform.mdwn
@@ -0,0 +1,23 @@
+Using the [[plugins/inline]] plugin, you can get an inline-postform for
+creating new pages.
+
+It would be quite nice to have the flexibility to do this outside of the
+inline directive.
+
+I've got a proof-of-concept hacked inline comment submission example at
+<http://dev.jmtd.net/comments/> for example. I've just copied the HTML from
+the post form and stuck it inside a [[plugins/toggle]].
+
+(Before Simon completed the comments plugin, I thought this would be a
+logical first step towards doing comment-like things with inlined pages).
+
+-- [[Jon]]
+
+> Perhaps what we need is a `postform` plugin/directive that inline depends
+> on (automatically enables); its preprocess method could automatically be
+> invoked from preprocess_inline when needed. --[[smcv]]
+
+>> I've been looking at this stuff again. I think you are right, this would
+>> be the right approach. The comments plugin could use it similarly, allowing
+>> sites which desire it to have an inline comment submission form on all
+>> pages with comments enabled. I'm going to take a look. -- [[Jon]]
diff --git a/doc/todo/mtime.mdwn b/doc/todo/mtime.mdwn
new file mode 100644
index 000000000..22d4cd4ff
--- /dev/null
+++ b/doc/todo/mtime.mdwn
@@ -0,0 +1,16 @@
+It'd be nice if the mtime of the files ikiwiki renders matched the mtime of
+the source files.
+
+However, this turns out to be more complex than just calling utime() a few
+times. If a page inlines other, younger pages, then having an older mtime
+means that an old version of it will be kept in web caches, forcing
+annoying shift-reloads to see the changed content (for example).
+
+And it's not just inline. The template plugin means that a change to a
+template can result in changes to how a page gets rendered. The version
+plugin changes page content without any younger page being involved. And
+editing one of the html templates and rebuilding the wiki can change every
+page. All of these need to be reflected in the file mtime to avoid caching
+problems.
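For a concrete picture of the naive approach (which, as just explained, is insufficient whenever inlining or templates make the output newer than its source), here is a sketch with hypothetical paths:

```shell
# set up a fake source page and its rendered output
mkdir -p src dest/page
echo 'some page' > src/page.mdwn
touch -d '2007-01-01 00:00' src/page.mdwn
echo '<html>rendered</html>' > dest/page/index.html
# the naive fix: copy the source file's mtime onto the rendered file
touch -r src/page.mdwn dest/page/index.html
```

After this, web servers would send a 2007 Last-Modified for the rendered page even if an inlined post changed yesterday, which is exactly the caching problem described above.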
+
+[[!tag wishlist]]
diff --git a/doc/todo/multi-thread_ikiwiki.mdwn b/doc/todo/multi-thread_ikiwiki.mdwn
new file mode 100644
index 000000000..358185a22
--- /dev/null
+++ b/doc/todo/multi-thread_ikiwiki.mdwn
@@ -0,0 +1,89 @@
+[[!tag wishlist]]
+
+[My ikiwiki instance](http://www.ipol.im/) is quite heavy. 674M of data in the source repo, 1.1G in its .git folder.
+Lots of \[[!img ]] (~2200), lots of \[[!teximg ]] (~2700). A complete rebuild takes 10 minutes.
+
+We could use a big machine, with plenty of CPUs. Could some multi-threading support be added to ikiwiki, by forking out all the external heavy plugins (imagemagick, tex, ...) and/or by processing pages in parallel?
+
+Disclaimer: I know nothing of the Perl approach to parallel processing.
+
+> I agree that it would be lovely to be able to use multiple processors to speed up rebuilds on big sites (I have a big site myself), but, taking a quick look at what Perl threads entails, and taking into account what I've seen of the code of IkiWiki, it would take a massive rewrite to make IkiWiki thread-safe - the API would have to be completely rewritten - and then more work again to introduce threading itself. So my unofficial humble opinion is that it's unlikely to be done.
+> Which is a pity, and I hope I'm mistaken about it.
+> --[[KathrynAndersen]]
+
+> > I have much less experience with the internals of Ikiwiki, much
+> > less Multi-threading perl, but I agree that to make Ikiwiki thread
+> > safe and to make the modifications to really take advantage of the
+> > threads is probably beyond the realm of reasonable
+> > expectations. Having said that, I wonder if there aren't ways to
+> > make Ikiwiki perform better for these big cases where the only
+> > option is to wait for it to grind through everything. Something
+> > along the lines of doing all of the aggregation and dependency
+> > heavy stuff early on, and then doing all of the page rendering
+> > stuff at the end quasi-asynchronously? Or am I way off in the deep
+> > end?
+> >
+> > From a practical perspective, it seems like these massive rebuild
+> > situations represent a really small subset of ikiwiki builds. Most
+> > sites are pretty small, and most sites need full rebuilds very
+> > very infrequently. In that scope, 10-minute rebuilds don't seem that
+> > bad. In terms of performance challenges, it's the one page
+> > with 3-5 dependencies that takes 10 seconds (say) to rebuild that's
+> > a larger challenge for Ikiwiki as a whole. At the same time, I'd
+> > be willing to bet that performance benefits for these really big
+> > repositories for using fast disks (i.e. SSDs) could probably just
+> > about meet the benefit of most of the threading/async work.
+> >
+> > --[[tychoish]]
+
+>>> It's at this point that doing profiling for a particular site would come
+>>> in, because it would depend on the site content and how exactly IkiWiki is
+>>> being used as to what the performance bottlenecks would be. For the
+>>> original poster, it would be image processing. For me, it tends to be
+>>> PageSpecs, because I have a lot of maps and reports.
+
+>>> But I sincerely don't think that Disk I/O is the main bottleneck, not when
+>>> the original poster mentions CPU usage, and also in my experience, I see
+>>> IkiWiki chewing up 100% of one CPU, while the others remain idle. I
+>>> haven't noticed slowdowns due to waiting for disk I/O, whether that be a
+>>> system with HD or SSD storage.
+
+>>> I agree that large sites are probably not the most common use-case, but it
+>>> can be a chicken-and-egg situation with large sites and complete rebuilds,
+>>> since it can often be the case with a large site that rebuilding based on
+>>> dependencies takes *longer* than rebuilding the site from scratch, simply
+>>> because there are so many pages that are interdependent. It's not always
+>>> the number of pages itself, but how the site is being used. If IkiWiki is
+>>> used with the absolute minimum number of page-dependencies - that is, no
+>>> maps, no sitemaps, no trails, no tags, no backlinks, no albums - then one
+>>> can have a very large number of pages without having performance problems.
+>>> But when you have a change in PageA affecting PageB which affects PageC,
+>>> PageD, PageE and PageF, then performance can drop off horribly. And it's a
+>>> trade-off, because having features that interlink pages automatically is
+>>> really nifty and useful - but they have a price.
+
+>>> I'm not really sure what the best solution is. Me, I profile my IkiWiki builds and try to tweak performance for them... but there's only so much I can do.
+>>> --[[KathrynAndersen]]
+
+>>>> IMHO, the best way to get a multithreaded ikiwiki is to rewrite it
+>>>> in haskell, using as much pure code as possible. Many avenues
+>>>> then would open up to taking advantage of haskell's ability to
+>>>> parallelize pure code.
+>>>>
+>>>> With that said, we already have some nice invariants that could be
+>>>> used to parallelize page builds. In particular, we know that
+>>>> page A never needs state built up while building page B, for any
+>>>> pages A and B that don't have a dependency relationship -- and ikiwiki
+>>>> tracks such dependency relationships, although not currently in a form
+>>>> that makes it very easy (or fast..) to pick out such groups of
+>>>> unrelated pages.
+>>>>
+>>>> OTOH, there are problems.. building page A can result in changes to
+>>>> ikiwiki's state; building page B can result in other changes. All
+>>>> such changes would have to be made thread-safely. And would the
+>>>> resulting lock contention result in a program that ran any faster
+>>>> once parallelized?
+>>>>
+>>>> Which is why [[rewrite_ikiwiki_in_haskell]], while pretty insane, is
+>>>> something I keep thinking about. If only I had a spare year..
+>>>> --[[Joey]]
diff --git a/doc/todo/multiple_output_formats.mdwn b/doc/todo/multiple_output_formats.mdwn
new file mode 100644
index 000000000..0538f894c
--- /dev/null
+++ b/doc/todo/multiple_output_formats.mdwn
@@ -0,0 +1,17 @@
+Ikiwiki could support building some pages into additional output formats,
+such as PostScript or plain text, and provide links to those on the HTML output.
+
+This would provide true "printable versions" of the wiki pages supporting it.
+
+--[[JeremieKoenig]]
+
+Could this be done by making the output format a plugin, similar to the way
+pyblosxom works? Atom and RSS could then possibly be moved into plugins.
+
+Presumably they'd have to work by converting HTML into some other format, as
+trying to force all input languages to generate more than one output language
+would be impractical to say the least.
+
+--[[bma]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/multiple_repository_support.mdwn b/doc/todo/multiple_repository_support.mdwn
new file mode 100644
index 000000000..0625dba7e
--- /dev/null
+++ b/doc/todo/multiple_repository_support.mdwn
@@ -0,0 +1,15 @@
+I'd like to be able to use one git repository for my basic website, and
+another one for the big files (pictures, videos), and another one for temp
+files. This way I'd not bloat the basic repo, and I could toss temp files
+up, and throw the temp repo away periodically.
+
+For this to work really well, ikiwiki would need multiple repository
+support. Possibly it could be tied into 'mr'?
+
+Another thought is that it would be good if ikiwiki could determine the
+type of repo a subdirectory is in by itself, eliminating the need to
+manually configure it in the setup file.
+
+--[[Joey]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/multiple_simultaneous_rcs.mdwn b/doc/todo/multiple_simultaneous_rcs.mdwn
new file mode 100644
index 000000000..f371bb1f4
--- /dev/null
+++ b/doc/todo/multiple_simultaneous_rcs.mdwn
@@ -0,0 +1,26 @@
+One useful item would be support for multiple RCS systems simultaneously.
+
+Example:
+
+* root refs: git://.... branch wiki
+* master refs: git://..... branch master
+* dev refs: git://.... branch dev
+
+I'm not sure how the mechanics would work out for choosing which refs to use and managing branch self-references (though doesn't ikiwiki do something like this:
+
+* 1
+ * a
+ * j
+ * b
+ * Page : refs a/j
+* 2
+ * a
+ * j
+ * b
+* a
+ * j
+
+Ikiwiki checks: 1/b/a/j, then 1/a/j and succeeds...
+
+
+The new git subprojects might work, but I've had trouble making them work sanely in the past...
diff --git a/doc/todo/multiple_simultaneous_rcs/discussion.mdwn b/doc/todo/multiple_simultaneous_rcs/discussion.mdwn
new file mode 100644
index 000000000..a655c9fd1
--- /dev/null
+++ b/doc/todo/multiple_simultaneous_rcs/discussion.mdwn
@@ -0,0 +1,15 @@
+Gee... hadn't noticed [mr](http://kitenet.net/~joey/code/mr/) by the same author as ikiwiki :-P
+
+This would probably be the ideal backend for 'multi repo' I suppose.
+
+~[[harningt]] 2007-11-06
+
+> mr's interface to revision control systems is much shallower than
+> ikiwiki's. mr only needs to know how to run a "pull" or a "commit"
+> command, it doesn't matter if these commands behave very differently
+> for different systems. Ikiwiki needs to be able to commit a change,
+> handling conflicts, in a way tuned to the revision control system being
+> used, among other things. So mr isn't a very good fit for use by ikiwiki.
+> --[[Joey]]
+
+----
diff --git a/doc/todo/multiple_template_directories.mdwn b/doc/todo/multiple_template_directories.mdwn
new file mode 100644
index 000000000..6a474b4f3
--- /dev/null
+++ b/doc/todo/multiple_template_directories.mdwn
@@ -0,0 +1,73 @@
+It would be nice to be able to override a single template without keeping a
+private copy of the entire template directory.
+
+A setup option like
+
+ templatedirs => [ "my/dir", "/usr/share/ikiwiki/templates" ]
+
+ought to do the trick.
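+
+A minimal sketch of the lookup such an option implies (hypothetical code;
+it assumes the template search lives in a function like `template_file`
+and that the global directory is the usual `/usr/share/ikiwiki/templates`):
+
+    sub template_file ($) {
+        my $name = shift;
+        # search each configured dir in order; first match wins
+        foreach my $dir (@{$config{templatedirs}},
+                         "/usr/share/ikiwiki/templates") {
+            return "$dir/$name" if -e "$dir/$name";
+        }
+        return undef;
+    }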
+
+> Now that I look at the source, I see that ikiwiki already falls back to the
+> global dir when it cannot find a template. For me, this is good enough.
+> And it is even documented in the man page. Sigh. I guess this could be
+> considered [[done]].
+
+I have a use case for this: a site composed of blogs and wikis, with templates divided into three categories: common, blog, and wiki. The only solution I found is maintaining hard links; being able to have multiple template dirs would obviously be better. -- Changaco
+
+> [[plugins/underlay]] used to allow adding extra templatedirs, but Joey
+> removed that functionality when he made templates search the wiki's
+> own `templates` directory.
+>
+> You can get a 3-level hierarchy like this:
+>
+> * instance-specific overrides: $srcdir/templates
+> * common to the entire site: a directory that is the value of all
+> instances' `templatedir` parameters
+> * common to every ikiwiki in the world: /usr/share/ikiwiki/templates
+> (implicitly searched)
+>
+> (by "instance" I mean an instance of ikiwiki - a .setup file, basically.)
+>
+> For a more complex hierarchy you'd need the old [[plugins/underlay]]
+> functionality, i.e. you'd need to (ask Joey to) revert the patch that
+> removed it. For instance, if anyone has a hierarchy like this, then
+> they need the old functionality back in order to split the template
+> search path for the things marked `(???)`:
+>
+> every ikiwiki in the world (/usr/share/ikiwiki/templates)
+> \--- your site (???)
+> \--- your blogs (???)
+> \--- travel blog ($srcdir/templates)
+> \--- code blog ($srcdir/templates)
+> \--- your wikis (???)
+> \--- travel wiki ($srcdir/templates)
+> \--- code wiki ($srcdir/templates)
+>
+> This looks pretty hypothetical to me, though...
+> --[[smcv]]
+
+>> The reason I removed it is because the same functionality of having
+>> multiple template directories is still present. Just put them in
+>> the templates/ subdirectory of multiple underlay directories instead.
+>> --[[Joey]]
+
+>>> Thanks, I didn't realize this was possible. Problem solved. -- Changaco
+
+>>>> We can consider this [[done]], then. For reference, the solution
+>>>> to the hierarchy I mentioned above would be:
+>>>>
+>>>> all your sites have $your_underlay as an underlay
+>>>>
+>>>> the blogs and wikis all have $blog_underlay or $wiki_underlay
+>>>> (as appropriate) as a higher priority underlay
+>>>>
+>>>> every ikiwiki in the world (/usr/share/ikiwiki/templates)
+>>>> \--- your site ($your_underlay/templates, or templatedir)
+>>>> \--- your blogs ($blog_underlay/templates)
+>>>> \--- travel blog ($srcdir/templates)
+>>>> \--- code blog ($srcdir/templates)
+>>>> \--- your wikis ($wiki_underlay/templates)
+>>>> \--- travel wiki ($srcdir/templates)
+>>>> \--- code wiki ($srcdir/templates)
+>>>>
+>>>> --[[smcv]]
diff --git a/doc/todo/multiple_templates.mdwn b/doc/todo/multiple_templates.mdwn
new file mode 100644
index 000000000..30fb8d6ee
--- /dev/null
+++ b/doc/todo/multiple_templates.mdwn
@@ -0,0 +1,13 @@
+> Another useful feature might be to be able to choose a different [[template|templates]]
+> file for some pages; [[blog]] pages would use a template different from the
+> home page, even if both are managed in the same repository, etc.
+
+Well, that would probably be fairly easy to add if it used [[pagespecs|ikiwiki/pagespec]] to
+specify which pages use the non-default template.
+
+Hmm, I think the [[pagetemplate|plugins/pagetemplate]] hook should allow one to get close enough to
+this in a plugin now.
+
+See also: [[Allow_per-page_template_selection]] -- same thing, really.
+
+[[done]]
diff --git a/doc/todo/natural_sorting.mdwn b/doc/todo/natural_sorting.mdwn
new file mode 100644
index 000000000..3c42a4f94
--- /dev/null
+++ b/doc/todo/natural_sorting.mdwn
@@ -0,0 +1,21 @@
+[[!tag wishlist]]
+[[!tag patch]]
+
+the inline plugin's sorting is plain lexical, which may not be appropriate for
+page titles if they have numeric components. the
+[Sort::Naturally](http://search.cpan.org/dist/Sort-Naturally/) perl module
+provides an algorithm for that.
+
+there is a patch (55b83cb7bd1cd7c60bb45dc22c3745dd80a63fed)
+attached that makes the [[plugins/inline]] plugin use Sort::Naturally if sort
+is set to "title_natural".
+
+the current patch uses `require Sort::Naturally`, so
+[libsort-naturally-perl](http://packages.debian.org/libsort-naturally-perl)
+does not become a dependency; it might be worth suggesting, though.
+
+> See also: [[inline:_numerical_ordering_by_title]] (I probably prefer your
+> approach..) --[[Joey]]
+
+> [[applied|done]]
diff --git a/doc/todo/need_global_renamepage_hook.mdwn b/doc/todo/need_global_renamepage_hook.mdwn
new file mode 100644
index 000000000..e3cec4a9b
--- /dev/null
+++ b/doc/todo/need_global_renamepage_hook.mdwn
@@ -0,0 +1,115 @@
+As documented in [[plugins/write]], the current `renamepage` hook is
+heavily oriented towards updating links in pages' content: it is run
+once per page linking to the renamed page.
+
+That's fine, but it can't be used to trigger more general actions on
+page rename. E.g. it won't be run at all if the page being renamed is
+an orphan one.
+
+This is a real issue for the [[plugins/contrib/po]] development: what
+I'm about to achieve is:
+
+- when a master page is renamed, the plugin takes notice of it (using
+ the `rename` hook), and later renames the translation pages
+ accordingly (in the `change` hook)
+- when a master page is deleted, the plugin deletes its translations
+ (using the `delete` hook)
+
+With the current `renamepage` hook behavior, combining these two goals
+has an annoying drawback: a plugin can't notice an orphan master page
+has been renamed, so instead of renaming (and preserving) its
+translations, it considers the oldpage as deleted, and deletes its
+translations. Game over.
+
+It may seem like a corner case, but I want to be very careful when
+deleting files automatically in `srcdir`, which is not always under
+version control.
+
+As a sad workaround, I can still disable any deletion in `srcdir`
+when it is not under version control. But I think ikiwiki deserves
+a global `renamepage` hook that would be run once per rename
+operation.
+
+My proposal is thus:
+
+- keep the documented `renamepage` hook as it is
+- use something inspired by the trick `preprocess` uses: when `hook`
+ is passed an optional "global" parameter, set to a true value, the
+ declared `renamepage` hook is run once per rename operation, and is
+ passed named parameters: `src`, `srcfile`, `dest` and `destfile`.
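+
+For illustration, a hook registration under this proposed API might look
+like the following (hypothetical: the `global` parameter is exactly what
+is being proposed above, and the body is a stand-in):
+
+    hook(type => "renamepage", id => "po", global => 1, call => sub {
+        my %params = @_;    # src, srcfile, dest, destfile
+        # e.g. rename the translations of $params{src} to match
+        # $params{dest}, even if no page links to the renamed page
+    });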
+
+I'm of course volunteering to implement this, or anything related that
+would solve my problem. Hmmm? --[[intrigeri]]
+
+> I think it would be better to have a different hook that is called for
+> renames, since the two hook actions are very different (unlike the
+> preprocess hook, which does a very similar thing in scan mode).
+>
+> Just calling it `rename` seems like a reasonable name, by analogy with
+> the `delete` and `change` hooks.
+>
+> It might make sense to rename `renamepage` to `renamelink` to make it
+> clearer what it does. (I'm not very worried about this breaking things, at
+> this point.) --[[Joey]]
+
+>> In my `po` branch, I renamed `renamepage` to `renamelink`, and
+>> created a `rename` hook that is passed a reference to `@torename`.
+>> --[[intrigeri]]
+
+>>> As Joey highlights it on [[plugins/contrib/po]], it's too late to
+>>> merge such a change, as the 3.x plugin API is released and should
+>>> not be broken. I will thus keep the existing `renamepage` as it
+>>> is, and call `rename` the global hook I need. --[[intrigeri]]
+
+>>>> [[Done]] in my `po` branch. --[[intrigeri]]
+
+I think I see a problem in the rename hook. The hook is called
+before the plugin adds any subpages to the set of pages to rename.
+So, if the user chooses to rename subpages, po will not notice
+they are moving, and will not move their po files.
+
+Perhaps the hooks should be moved to come after subpages are added.
+This would, though, mean that if the hook somehow decides to add
+entirely other pages to the list, their subpages would not be
+automatically added.
+
+I also have some qualms about the design of the hook. In particular,
+passing the mutable array reference probably makes it impossible
+to use from external plugins. Instead it could return any additional
+rename hashes it wants to add. Or, if the ability to modify existing
+hashes is desired, it could return the full set of hashes.
+
+--[[Joey]]
+
+> I fixed the last part, i.e. a rename hook function now returns the
+> full set of hashes. As I also converted it to take named parameters,
+> such a function still is passed a reference to the original array,
+> though, because one can't build a hash containing an array of hashes
+> as a value, without passing this array as a reference.
+>
+>> Sure.
+>
+> I'm not entirely sure about your first concern. Calling the hook
+> before or after the subpages addition both have their own problems.
+>
+> What about running the hook before *and* after the subpages
+> addition, with an additional `when` named parameter, so that
+> a given hook function can choose to act only before or after, or both?
+>
+> --[[intrigeri]]
+>>
+>> Have you thought about making the hook be run once *per* file that is
+>> selected to be renamed? This would even handle the case where two
+>> plugins use the hook; plugin A would see when plugin B adds a new file
+>> to be renamed. And the subpage renaming stuff could probably be moved
+>> into the rename hook too. --[[Joey]]
+>>>
+>>> I've implemented this nice solution in my po branch, please review.
+>>> I'm slowly coming back to do the last bits needed to get my po and
+>>> meta branch merged. --[[intrigeri]]
+
+>>>> It looks good. I made some small changes to it in my own po branch.
+>>>> Nothing significant really. If this were not tied up in the po branch,
+>>>> I'd have merged it to master already. --[[Joey]]
+
+>>>> Thanks, this is great :) --[[intrigeri]]
diff --git a/doc/todo/nested_preprocessor_directives.mdwn b/doc/todo/nested_preprocessor_directives.mdwn
new file mode 100644
index 000000000..433fc6f37
--- /dev/null
+++ b/doc/todo/nested_preprocessor_directives.mdwn
@@ -0,0 +1,69 @@
+Ikiwiki's preprocessor parser cannot deal with arbitrary nested preprocessor
+directives. It's possible to nest a directive with single quoted values
+inside a triple-quoted value of a directive, but that's all.
+
+It's not possible to unambiguously parse nested quotes, so to support
+nesting, a new syntax would be needed. Maybe something xml-like?
+
+> You can, however, unambiguously parse nested square brackets, and I think
+> that would solve the problem, as long as you never allow the contents of a
+> directive to contain a *partial* directive, which seems reasonable to me.
+>
+> For example, I *think* you can unambiguously parse the following:
+>
+> \[[!if test="enabled(template) and templates/foo" then="""
+> [[!template id=foo content="""Flying Purple People Eater"""]]
+> """]]
+>
+> --[[JoshTriplett]]
+
+>> Yes it's definitely possible to do something like that. I'm not 100%
+>> sure if it can be done in perl regexp or needs a real recursive descent
+>> parser though.
+>>
+>> [[!template id=gitbranch branch=timonator/heredoc_triplequote author="\[[timonator]]"]]
+>>
+>> In the meantime, this is an interesting approach:
+>> <https://github.com/timo/ikiwiki/commit/410bbaf141036164f92009599ae12790b1530886>
+>> (the link has since been fixed twice)
+>>
+>> \[[!directive text=<<FOO
+>> ...
+>> FOO]]
+>>
+>> Since that's implemented, I will probably just merge it,
+>> once I satisfy myself it doesn't blow up in any edge cases.
+>> (It also adds triple single quotes as a third, distinct type of quotes,
+>> which feels a bit redundant given the here docs.) --[[Joey]]
+>>
+>> Hmm, that patch changes a `m///sgx` to a `m///msgx`. Meaning
+>> that any '^' or '$' inside the regexp will change behavior from matching
+>> the start/end of string to matching the start/end of individual lines
+>> within the string. And there is one legacy '$' which must then
+>> change behavior; the "delimiter to next param".
+>>
+>> So, I'm not sure what behavior that will cause, but I suspect it will
+>> be a bug. Unless the `\s+|$` already stops matching at a newline within
+>> the string like it's whitespace. That needs more analysis.
+>> Update: seems it does, I'm fairly satisfied that is not a bug.
+>>
+>> Also, the patch seems incomplete, only patching the first regexp
+>> but not the other two in the same function, which also are quoting-aware. --[[Joey]]
+>>
+>> Yes, I'm terribly sorry. I actually did edit the other two regexps, but
+>> I apparently missed copying it over as well. Should have been doing this
+>> in a git repo all along. Look at the new commit I put atop it that has
+>> the rest as well:
+>> (redacted: is now part of the commit linked to from above)
+>> Also: I'm not sure any more, why I added the m modifier. It was very
+>> late at night and I was getting a bit desperate (turned out, the next
+>> morning, I put my extra regexes after the "unquoted value" one. heh.)
+>> So, feel free to fix that. --Timo
+>>
+>> I've fixed the patch by rebasing, fixed the link above. I'm still not
+>> sure if the m modifier for the regex is still needed (apparently I
+>> didn't put it in the other regexes. Not completely sure about the
+>> implications.) Am now trying to wrap my head around a test case to
+>> test the new formats for a bit. --Timo
+
+[[done]]!!! --[[Joey]]
diff --git a/doc/todo/online_configuration.mdwn b/doc/todo/online_configuration.mdwn
new file mode 100644
index 000000000..02a8c6e5f
--- /dev/null
+++ b/doc/todo/online_configuration.mdwn
@@ -0,0 +1,28 @@
+It should be possible to configure ikiwiki online, in the wiki admin's
+preferences form. Rather than the current situation where most settings are
+in ikiwiki.setup, and one or two (like locked pages and upload limits) in
+the admin preferences.
+
+In theory, every setting could be configured there. In practice, some
+settings, like `srcdir` and `destdir` are ones you want to keep far away
+from editing via the web.
+
+The underlying work has been done to provide metadata about all options via
+getsetup hooks, so it's just a matter of writing a web interface plugin.
+
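+For reference, a plugin's `getsetup` hook already returns metadata of
+roughly this shape (the option name and values here are made up):
+
+    sub getsetup () {
+        return
+            option_foo => {
+                type => "string",
+                example => "bar",
+                description => "what option_foo does",
+                safe => 1,      # ok to expose in a web setup form
+                rebuild => 0,   # changing it needs no wiki rebuild
+            };
+    }
+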
+The plugin could have these config options:
+
+ # list of options to include in web setup (safe = all things with safe = 1)
+ websetup_include => [qw{safe}],
+ # list of options to exclude from web setup
+ websetup_exclude => [qw{option_baz}],
+ # list of plugins that cannot be enabled/disabled via the web
+ # interface
+ websetup_force_plugins => [qw{git svn bzr mercurial monotone tla}]
+
+I'm leaning toward just making it write out to the same setup file, rather than
+writing to a subsidiary setup file. However, this would mean that any
+comments in the file would be lost, and that it couldn't be used if the
+setup file had weird stuff (perl code, etc).
+
+[[!tag wishlist done]]
diff --git a/doc/todo/openid_enable_cache.mdwn b/doc/todo/openid_enable_cache.mdwn
new file mode 100644
index 000000000..b09d4d9f2
--- /dev/null
+++ b/doc/todo/openid_enable_cache.mdwn
@@ -0,0 +1,4 @@
+Per <http://ur1.ca/46p7z>, ikiwiki's use of openid can cause some
+problems. While they have worked around it, we should look into providing a
+cache parameter to OpenID::Consumer to avoid the problem and possibly be
+more secure and/or faster. --[[Joey]]
diff --git a/doc/todo/openid_user_filtering.mdwn b/doc/todo/openid_user_filtering.mdwn
new file mode 100644
index 000000000..6a318c4c0
--- /dev/null
+++ b/doc/todo/openid_user_filtering.mdwn
@@ -0,0 +1,13 @@
+As mentioned on IRC, I think a cheap form of [[todo/ACL]] can be maintained using [OpenID in ikiwiki](http://packages.qa.debian.org/libn/libnet-openid-consumer-perl.html).
+
+Say I want to limit edits to [wiki.webvm.net](http://wiki.webvm.net/) to users of that machine. For the user 'hendry' I create a http://hendry.webvm.net/ OpenID (which actually delegates to http://hendry.myopenid.com/). And likewise for other users.
+
+So I suggest an ikiwiki configuration like:
+
+ users => ["*.webvm.net"],
+
+This would only allow edits from OpenIDs of that form.
+
+> This kind of thing can be [[done]] now: --[[Joey]]
+>
+> locked_pages => "* and !user(http://*.webvm.net/)"
diff --git a/doc/todo/optimisations.mdwn b/doc/todo/optimisations.mdwn
new file mode 100644
index 000000000..b8c4fa0da
--- /dev/null
+++ b/doc/todo/optimisations.mdwn
@@ -0,0 +1,15 @@
+Ikiwiki has already been optimised a lot, however..
+
+* Look at splitting up CGI.pm. But note that too much splitting can slow
+ perl down.
+
+ > It's split enough, or possibly more than enough, now. :-)
+
+* The backlinks calculation code is still O(N^2) on the number of pages.
+ If backlinks info were stored in the index file, it would go down to
+ constant time for iterative builds, though still N^2 for rebuilds.
+
+ > Seems to be O(Num Pages * Num Links in Page), or effectively O(N)
+ > pages for most wikis.
+
+[[done]]
diff --git a/doc/todo/optimize_simple_dependencies.mdwn b/doc/todo/optimize_simple_dependencies.mdwn
new file mode 100644
index 000000000..6f6284303
--- /dev/null
+++ b/doc/todo/optimize_simple_dependencies.mdwn
@@ -0,0 +1,95 @@
+I'm still trying to optimize ikiwiki for a site using
+[[plugins/contrib/album]], and checking which pages depend on which pages
+is still taking too long. Here's another go at fixing that, using [[Will]]'s
+suggestion from [[todo/should_optimise_pagespecs]]:
+
+> A hash, by itself, is not optimal because
+> the dependency list holds two things: page names and page specs. The hash would
+> work well for the page names, but you'll still need to iterate through the page specs.
+> I was thinking of keeping a list and a hash. You use the list for pagespecs
+> and the hash for individual page names. To make this work you need to adjust the
+> API so it knows which you're adding. -- [[Will]]
+
+If you have P pages and refresh after changing C of them, where an average
+page has E dependencies on exact page names and D other dependencies, this
+branch should drop the complexity of checking dependencies from
+O(P * (D+E) * C) to O(C + P*E + P*D*C). Pages that use inline or map have
+a large value for E (e.g. one per inlined page) and a small value for D (e.g.
+one per inline).
+
+Benchmarking:
+
+Test 1: a wiki with about 3500 pages and 3500 photos, and a change that
+touches about 350 pages and 350 photos
+
+Test 2: the docwiki (about 700 objects not excluded by docwiki.setup, mostly
+pages), docwiki.setup modified to turn off verbose, and a change that touches
+the 98 pages of plugins/*.mdwn
+
+In both tests I rebuilt the wiki with the target ikiwiki version, then touched
+the appropriate pages and refreshed.
+
+Results of test 1: without this branch it took around 5:45 to rebuild and
+around 5:45 again to refresh (so rebuilding 10% of the pages, then deciding
+that most of the remaining 90% didn't need to change, took about as long as
+rebuilding everything). With this branch it took 5:47 to rebuild and 1:16
+to refresh.
+
+Results of test 2: rebuilding took 14.11s without, 13.96s with; refreshing
+three times took 7.29/7.40/7.37s without, 6.62/6.56/6.63s with.
+
+(This benchmarking was actually done with my [[plugins/contrib/album]] branch,
+since that's what the huge wiki needs; that branch doesn't alter core code
+beyond the ready/depends-exact branch point, so the results should be
+equally valid.)
+
+--[[smcv]]
+
+> Now [[merged|done]] --[[smcv]]
+
+----
+
+> We discussed this on irc; I had some worries that things may have been
+> switched to `add_depends_exact` that were not pure page names. My current
+> feeling is it's all safe, but who knows. It's easy to miss something.
+> Which makes me think this is not a good interface.
+>
+> Why not, instead, make `add_depends` smart. If it's passed something
+> that is clearly a raw page name, it can add it to the exact depends hash.
+> Else, add it to the pagespec hash. You can tell if it's a pure page name
+> by matching on `$config{wiki_file_regexp}`.
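+>
+> For illustration, a sketch of that auto-detection (hypothetical code,
+> not the actual commit):
+>
+>     sub add_depends ($$) {
+>         my ($page, $pagespec) = @_;
+>         if ($pagespec =~ /^$config{wiki_file_regexp}$/) {
+>             # plain page name: cheap hash lookup at refresh time
+>             $depends_exact{$page}{lc $pagespec} = 1;
+>         }
+>         else {
+>             # real pagespec: must be matched against each changed page
+>             $depends{$page}{$pagespec} = 1;
+>         }
+>     }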
+
+>> Good thinking. Done in commit 68ce514a 'Auto-detect "simple dependencies"',
+>> with a related bugfix in e8b43825 "Force %depends_exact to lower case".
+>>
+>> Performance impact: Test 2 above takes 0.2s longer to rebuild (probably
+>> from all the calls to lc, which are, however, necessary for correctness)
+>> and has indistinguishable performance for a refresh.
+>>
+>> Test 1 took about 6 minutes to rebuild and 1:25 to refresh; those are
+>> pessimistic figures, since I've added 90 more photos and 90 more pages
+>> (both to the wiki as a whole, and the number touched before refreshing)
+>> since testing the previous version of this branch. --[[smcv]]
+
+> Also I think there may be little optimisation value left in
+> 7227c2debfeef94b35f7d81f42900aa01820caa3, since the "regular" dependency
+> lists will be much shorter.
+
+>> You're probably right, but IMO it's not worth reverting it - a set (hash with
+>> dummy values) is still the right data structure. --[[smcv]]
+
+> Sounds like inline pagenames has an already extant bug WRT
+> pages moving, which this should not make worse. Would be good to verify.
+
+>> If you mean the standard "add a better match for a link-like construct" bug
+>> that also affects sidebar, then yes, it does have the bug, but I'm pretty
+>> sure this branch doesn't make it any worse. I could solve this at the cost
+>> of making pagenames less useful for interactive use, by making it not
+>> respect [[ikiwiki/subpage/LinkingRules]], but instead always interpret
+>> its paths as relative to the top of the wiki - that's actually all that
+>> [[plugins/contrib/album]] needs. --[[smcv]]
+
+> Re coding, it would be nice if `refresh()` could avoid duplicating
+> the debug message, etc in the two cases. --[[Joey]]
+
+>> Fixed in commit f805d566 "Avoid duplicating debug message..." --[[smcv]]
diff --git a/doc/todo/optional_underlaydir_prefix.mdwn b/doc/todo/optional_underlaydir_prefix.mdwn
new file mode 100644
index 000000000..06900a904
--- /dev/null
+++ b/doc/todo/optional_underlaydir_prefix.mdwn
@@ -0,0 +1,46 @@
+For security reasons, symlinks are disabled in IkiWiki. That's fair enough, but it means that some problems, which one could otherwise solve by using a symlink, cannot be solved. The specific problem in this case is that all underlays are placed at the root of the wiki, when it could be more convenient to place some underlays in specific sub-directories.
+
+Use-case 1 (to keep things tidy):
+
+Currently IkiWiki has some javascript files in `underlays/javascript`; that directory is given as one of the underlay directories. Thus, all the javascript files appear in the root of the generated site. But it would be tidier if one could say "put the contents of *this* underlaydir under the `js` directory".
+
+> Of course, this could be accomplished, if we wanted to, by moving the
+> files to `underlays/javascript/js`. --[[Joey]]
+
+Use-case 2 (a read-only external dir):
+
+Suppose I want to include a subset of `/usr/local/share/docs` on my wiki, say the docs about `foo`. But I want them to be under the `docs/foo` sub-directory on the generated site. Currently I can't do that. If I give `/usr/local/share/docs/foo` as an underlaydir, then the contents of that will be in the root of the site, rather than under `docs/foo`. And if I give `/usr/local/share/docs` as an underlaydir, then the contents of the `foo` dir will be under `foo`, but it will also include every other thing in `/usr/local/share/docs`.
+
+Since we can't use symlinks in an underlay dir to link to these directories, then perhaps one could give a specific underlay dir a specific prefix, which defines the sub-directory that the underlay should appear in.
+
+I'm not sure how this would be implemented, but I guess it could be configured something like this:
+
+ prefixed_underlay => {
+ 'js' => '/usr/local/share/ikiwiki/javascript',
+ 'docs/foo' => '/usr/local/share/docs/foo',
+ }
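For illustration, here is a minimal Perl sketch of how such a prefix map could decide where an underlay file appears in the generated site. This is purely hypothetical: the `prefixed_page` helper and its behaviour are assumptions from this proposal, not actual ikiwiki code.

```perl
#!/usr/bin/perl
# Hypothetical: map a file found in an underlay dir to the page path it
# should appear at, based on the proposed prefixed_underlay setting.
use strict;
use warnings;

my %prefixed_underlay = (
    'js'       => '/usr/local/share/ikiwiki/javascript',
    'docs/foo' => '/usr/local/share/docs/foo',
);

sub prefixed_page {
    my ($underlaydir, $file) = @_;
    for my $prefix (keys %prefixed_underlay) {
        if ($prefixed_underlay{$prefix} eq $underlaydir) {
            return "$prefix/$file";
        }
    }
    return $file;    # unprefixed underlays keep the current behaviour
}

print prefixed_page('/usr/local/share/docs/foo', 'intro.mdwn'), "\n";
# docs/foo/intro.mdwn
```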
+
+> So, let me review why symlinks are an issue. For normal, non-underlay
+> pages, users who do not have filesystem access to the server may have
+> commit access, and so could commit eg, a symlink to `/etc/passwd` (or
+> to `/` !). The guards are there to prevent ikiwiki either exposing the
+> symlink target's contents, or potentially overwriting it.
+>
+> Is this a concern for underlays? Most of the time, certainly not;
+> the underlay tends to be something only the site admin controls.
+> Not all the security checks that are done on the srcdir are done
+> on the underlays, either. Most checks done on files in the underlay
+> are only done because the same code handles srcdir files. The one
+> exception is the test that skips processing symlinks in the underlay dir.
+> (But note that the underlay directory can be a symlink to elsewhere,
+> which the srcdir, by default, cannot.)
+>
+> So, one way to approach this is to make ikiwiki follow directory symlinks
+> inside the underlay directory. Just a matter of passing `follow => 1` to
+> find. (This would still not allow individual files to be symlinks, because
+> `readfile` does not allow reading symlinks. But I don't see much need
+> for that.) --[[Joey]]
+
+>> If you think that enabling symlinks in underlay directories wouldn't be a security issue, then I'm all for it! That would be much simpler to implement, I'm sure. --[[KathrynAndersen]]
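Joey's `follow => 1` suggestion refers to Perl's File::Find; this standalone demo (not ikiwiki's actual refresh code) shows the behaviour difference when an underlay contains a directory symlink:

```perl
#!/usr/bin/perl
# With follow => 1, find() descends into a directory symlink inside the
# underlay; by default it does not -- which is the change Joey proposes.
use strict;
use warnings;
use File::Find;
use File::Path qw(make_path);
use File::Temp qw(tempdir);

my $dir = tempdir(CLEANUP => 1);
make_path("$dir/real", "$dir/underlay");
open my $fh, '>', "$dir/real/page.mdwn" or die $!;
close $fh;
symlink "$dir/real", "$dir/underlay/link" or die "symlink: $!";

my (@without, @with);
find(sub { push @without, $_ if /\.mdwn$/ }, "$dir/underlay");
find({ wanted => sub { push @with, $_ if /\.mdwn$/ }, follow => 1 },
    "$dir/underlay");

print scalar(@without), " vs ", scalar(@with), "\n";    # 0 vs 1
```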
+
+[[!taglink wishlist]]
diff --git a/doc/todo/org_mode.mdwn b/doc/todo/org_mode.mdwn
new file mode 100644
index 000000000..bb3b7dec0
--- /dev/null
+++ b/doc/todo/org_mode.mdwn
@@ -0,0 +1,36 @@
+[[!template id=gitbranch branch=wtk/org author="[[wtk]]"]]
+
+summary
+=======
+
+Add a plugin for handling files written in [org-mode][].
+
+notes
+=====
+
+This is an updated form of [Manoj Srivastava's plugin][MS]. You can
+see the plugin [in action][example] on my blog.
+
+For reasons discussed in the [[reStructuredText plugin|plugins/rst]],
+wikilinks and other ikiwiki markup that inserts raw HTML can cause
+problems. Org-mode provides a [means for processing raw HTML][raw],
+but Ikiwiki currently (as far as I know) lacks a method to escape
+inserted HTML depending on which plugins will be used during the
+[[htmlize phase|plugins/write#index11h3]].
+
+new plugin
+==========
+
+A complete rewrite of the plugin can be found
+[here][chrismgray-rewrite]. It uses a
+dedicated emacs instance to parse the org-mode files. Thus, it should
+be a bit faster than the older plugin, as well as properly handling
+[[wikilinks|ikiwiki/wikilink]] and images, two features not present in the older
+plugin. An example of its use can be found at my [blog][chrismgray-blog].
+
+[org-mode]: http://orgmode.org/
+[MS]: http://www.golden-gryphon.com/blog/manoj/blog/2008/06/08/Using_org-mode_with_Ikiwiki/
+[example]: http://blog.tremily.us/posts/Git/notes/
+[raw]: http://orgmode.org/manual/Quoting-HTML-tags.html
+[chrismgray-rewrite]: https://github.com/chrismgray/ikiwiki-org-plugin
+[chrismgray-blog]: http://chrismgray.github.com
diff --git a/doc/todo/org_mode/Discussion.mdwn b/doc/todo/org_mode/Discussion.mdwn
new file mode 100644
index 000000000..726590bf4
--- /dev/null
+++ b/doc/todo/org_mode/Discussion.mdwn
@@ -0,0 +1,7 @@
+I can't find org.pm in the git branch. The steps are:
+
+1. git clone git://github.com/wking/ikiwiki.git
+2. cd ikiwiki
+3. find . | grep org.pm
+
+I've taken the name from <http://www.golden-gryphon.com/software/misc/org.pm.html>
diff --git a/doc/todo/osm__95__optimisations__95__and__95__fixes.mdwn b/doc/todo/osm__95__optimisations__95__and__95__fixes.mdwn
new file mode 100644
index 000000000..fa74d3126
--- /dev/null
+++ b/doc/todo/osm__95__optimisations__95__and__95__fixes.mdwn
@@ -0,0 +1,27 @@
+[[!template id=gitbranch branch=anarcat/osm_kml_formatting author="[[anarcat]]"]]
+[[!template id=gitbranch branch=anarcat/osm_openlayers_misc author="[[anarcat]]"]]
+
+I have accumulated a small series of patches to the OSM plugin along with the [[other|todo/osm_arbitrary_layers]] [[fixes|bugs/osm_KML_maps_do_not_display_properly_on_google_maps]] I have submitted here. They have lived in a tangled mess on my master branch so far, but not anymore!
+
+I have two main branches that need merging (on top of [[todo/osm_arbitrary_layers]]):
+
+ * `osm_kml_formatting` - indentation of the KML, optimisation: remove duplicate style declarations, folders support (even though [[it's not supported by openlayers just yet|https://trac.osgeo.org/openlayers/ticket/2195]])
+
+> If it's not supported yet, does it break something? Seems it must be hard
+> to test the change at least if it's not supported. --[[Joey]]
+
+> > Good point. Maybe that can be skipped for now, it sure doesn't look like it will be merged any time soon anyways. I do think that the optimisation needs to be merged; it's quite important because it halves the size of the resulting KML file. --[[anarcat]]
+
+> > > The merge you just did is fine, the only thing missing is folder support, I'll keep it in a separate branch for now, maybe it will be useful later! This is [[done]]. --[[anarcat]]
+
+ * `osm_openlayers_misc` - do not override the sorting of layers (so that the order defined in [[todo/osm_arbitrary_layers]] takes effect) and tell Emacs about the non-default indentation policies of the file.
+
+> I prefer not to pollute files with editor-specific garbage, and that goes
+> doubly for files served over the network. Cherry-picked the layer sorting
+> change. --[[Joey]]
+
+> > Alright, I am fine with that, thanks. -- [[anarcat]]
+
+Those two branches are also merged directly on my master branch... along with [[todo/osm_arbitrary_layers]].
+
+I am filing this as one todo to simplify matters, but I can also split it further if need be. --[[anarcat]]
diff --git a/doc/todo/osm_arbitrary_layers.mdwn b/doc/todo/osm_arbitrary_layers.mdwn
new file mode 100644
index 000000000..d59f394ee
--- /dev/null
+++ b/doc/todo/osm_arbitrary_layers.mdwn
@@ -0,0 +1,43 @@
+[[!template id=gitbranch branch=anarcat/osm_arbitrary_layers author="[[anarcat]]"]]
+
+I got tired of hacking at the osm.pm every time I wanted to change the layers, so I made it so the layers can be changed in the .setup file. In my master branch, there are now two new configuration settings: `osm_layers` and `osm_layers_order` which replace the hackish `osm_mapurl`. The variables are a hash and an array that allow the operator to define the list of URLs to be loaded as layers and also to change the order of layers. -- [[users/anarcat]]
+
+> I try to avoid adding hashes to config, because websetup does
+> not allow configuring hashes.
+>
+> The example for `osm_layers_order` is confusing, it makes
+> it look like a perl hash, but it appears to really be a javascript
+> code fragment string, and one that is tightly bound to other
+> configuration too. Why not generate that javascript code from
+> data in a robust way?
+>
+> Does it even make sense to configure this stuff globally?
+> Could the layers be passed as parameters to the osm directive? --[[Joey]]
+>
+> > The reason for `osm_layers_order` is that order is important in the layers: the default layer is the first one, and it's not possible to force Perl to return hash keys in a reliable, reproducible order. Maybe an alternative would be to just set the default layer.
+> >
+> > That said - maybe you're right and this should be passed as an argument to the OSM directive. The problem then is that you need to pass this stuff around the waypoint directive too. It also makes it hard to have a consistent set of maps all across the wiki. On our site, we have map inserts here and there, and it's nice to have them consistent all around.
+> >
+> > In closing, I would say that I agree that `.._order` is confusing: maybe I should just have a `_default` to choose the first one? -- [[anarcat]]
+
+>>> If there's no reason to order the other layers, that makes some sense.
+>>> --[[Joey]]
+
+>>>> The layers are ordered because that's the way they are displayed in the menu. Take a look at the base layers on the top left here for an idea: <http://wiki.reseaulibre.ca/ikiwiki.cgi?map=map&do=osm&zoom=12&lat=45.5227&lon=-73.59554>. -- [[anarcat]]
+
+>>>> After sleeping over this - maybe it would be simpler if `osm_layers` was just an array. First, it would get rid of the duplication with `osm_layers_order`. Then I do not feel that having the keys in that hash is worth the duplication anymore. The only reason this is a hash is to provide an arbitrary string description for the layers. We could replace this with an automated description based on the path to the tiles provided.
+>>>>
+>>>> If that's an acceptable solution for you, I'll go right ahead and rewrite this in a separate branch for merging. Note that on my master branch, there are now 3 main changes that are not merged: arbitrary OSM layers (includes Google Maps support), KML formatting improvements (indentation, non-duplication of tags), minor OpenLayers improvements (don't sort layers arbitrarily, folders support, higher default zoom level and projection fixes). I can either make a branch for those three things or leave it on my master branch, but be warned that it will be hard to separate those as distinct/orthogonal patches as they mangle each other quite a bit.
+>>>>
+>>>> So basically, I need to know two things from you:
+>>>>
+>>>> 1. on the layers design: a) hash (which include arbitrary descriptions) + default value or b) a simple array with automated descriptions
+>>>> 2. the above changes on a single branch or on 3 different ones?
+>>>>
+>>>> Thanks for your time. -- [[anarcat]]
+
+>>>>> I have implemented 1.b) and 2. (ie. it's a simple array now, and I split this stuff in different branches.) I'll open another todo for the other branches. --[[anarcat]]
+
+>>>>>> [[merged|done]] --[[Joey]]
+
+Confirmed, thanks!! --[[anarcat]]
diff --git a/doc/todo/overriding_displayed_modification_time.mdwn b/doc/todo/overriding_displayed_modification_time.mdwn
new file mode 100644
index 000000000..160d31519
--- /dev/null
+++ b/doc/todo/overriding_displayed_modification_time.mdwn
@@ -0,0 +1,27 @@
+Some aggregators, like Planet, sort by mtime rather than ctime. This
+means that posts with modified content come to the top (which seems odd
+to me, but is presumably what the aggregator's author or operator
+wants),
+
+> Hah! That's so charitable I hope you can deduct it from your taxes. ;-)
+> --[[Joey]]
+
+but it also means that posts with insignificant edits (like
+adding tags) come to the top too. Atom defines `<updated>` to be the date
+of the last *significant* change, so it's fine that ikiwiki defaults to
+using the mtime, but it would be good to have a way for the author to
+say "that edit was insignificant, don't use that mtime".
+
+> Yes, this is a real limitation of ikiwiki's atom support. --[[Joey]]
+
+See smcv's 'updated' branch for a basic implementation, which only affects
+the Atom `<updated>` field or the RSS equivalent.
+
+Other places the updated metadata item could be used (opinions on whether
+each should use it or not, please):
+
+* sorting by mtime in the inline directive
+* displaying "last edited" on ordinary pages
+
+> Tending toward no for both, but willing to be convinced otherwise..
+> [[merged|done]] --[[Joey]]
diff --git a/doc/todo/page_edit_disable.mdwn b/doc/todo/page_edit_disable.mdwn
new file mode 100644
index 000000000..a6977e7f7
--- /dev/null
+++ b/doc/todo/page_edit_disable.mdwn
@@ -0,0 +1,53 @@
+Disabling some of the action URLs is not possible now without creating one's
+own version of the `templates/page.tmpl` file. For example, how to disable
+displaying `EDITURL`, `RECENTCHANGESURL` or `PREFSURL` without
+touching original `page.tmpl` template?
+
+Now I can only enable/disable `HISTORYURL` and `DISCUSSIONLINK`.
+It's not hard for me, but I think that the way to do it can be
+confusing for another Ikiwiki users. For example, if I don't
+want `HISTORYURL`, then I need to comment `historyurl` hash
+in setup file. But if I want to disable discussions, then I need
+to set `discussion=0` there. So, you can see that we don't have
+one common method here.
+
+Maybe the Ikiwiki setup file should have more settings for action URLs,
+for example `edit=[01]`, `recentchanges=[01]`, `prefs=[01]`
+and `history=[01]`?
+
+If you are curious why I need that features, I can clarify it.
+I'm building a "parallel" version of my site. That means I want
+to have one editable version for internal users and a second,
+read-only version (+ search feature) for external users. I build
+both versions on a secure, internal machine from the same pages,
+of course, but with separate setup files and different templates.
+The read-only version of the site will be rsynced to clustered WWW
+front-ends immediately via a `post-commit` hook or periodically
+by cron. I haven't decided how to do it yet. --[[Paweł|ptecza]]
+
+> You disable display of recentchanges by disabling that plugin.
+
+>> Thanks for the hint! I didn't think about it :)
+
+> You disable edit and preferences by not enabling a cgiurl at all.
+
+>> Yes, I've just discovered it. Unfortunately I need cgiurl,
+>> because I would like to use the search feature also for read-only
+>> pages.
+
+> Maybe page editing will become a plugin some day, or be made
+> configurable -- there are a few things like searching and websetup
+> (and possibly the poll plugin, aggregate webtrigger, and pingee)
+> that it may make sense to enable a cgi for even if you don't want to
+> allow page editing. --[[Joey]]
+
+>> I'm glad you agree that it may make sense :) --[[Paweł|ptecza]]
+
+>> We're in a similar situation with http://web.monkeysphere.info
+>> - wanting cgiurl so that our recentchanges page displays links,
+>> but not wanting to enable editing of pages (since we're also
+>> rsync'ing the html pages to mirrors) --[[Jamie]]
+
+editpage plugin implemented, [[done]] --[[Joey]]
+
+>> Joey, you're great! Thank you very, very, very much! :D --[[Paweł|ptecza]]
diff --git a/doc/todo/pagedeletion.mdwn b/doc/todo/pagedeletion.mdwn
new file mode 100644
index 000000000..6f67769f4
--- /dev/null
+++ b/doc/todo/pagedeletion.mdwn
@@ -0,0 +1,3 @@
+It would be nice to be able to delete pages online.
+
+> [[done]] through the [[plugins/remove]] plugin. --[[Will]]
diff --git a/doc/todo/pagedown_plugin.mdwn b/doc/todo/pagedown_plugin.mdwn
new file mode 100644
index 000000000..cd49063e1
--- /dev/null
+++ b/doc/todo/pagedown_plugin.mdwn
@@ -0,0 +1,5 @@
+[[!template id=gitbranch branch=git://github.com/yds/ikiwiki.git/pagedown/ author="[[yds]]"]]
+
+Here's a [PageDown](http://Code.Google.com/p/pagedown/wiki/PageDown) plugin I put together based on the [WMD](http://WMD-Editor.com/) plugin source. In `editpage.tmpl` I moved `<TMPL_VAR WMD_PREVIEW>` to the top of the template. Makes it look like the edit `textarea` pops up below the content when hitting the edit link. Should work the same with the WMD plugin as well.
+
+I also wrote a couple of `makefile`s to make fetching and installing the sources for the `pagedown` and `wmd` `underlaydir`s simpler. And updated `doc/plugins/wmd.mdwn` to reflect these changes.
diff --git a/doc/todo/pageindexes.mdwn b/doc/todo/pageindexes.mdwn
new file mode 100644
index 000000000..cf28bec96
--- /dev/null
+++ b/doc/todo/pageindexes.mdwn
@@ -0,0 +1,5 @@
+Might be nice to support automatically generating an index based on headers
+in a page, for long pages. This could be done as a sanitize hook that
+parsed the html, with a directive that controlled it.
+
+[[todo/done]]
diff --git a/doc/todo/pagespec_aliases.mdwn b/doc/todo/pagespec_aliases.mdwn
new file mode 100644
index 000000000..748444a2f
--- /dev/null
+++ b/doc/todo/pagespec_aliases.mdwn
@@ -0,0 +1,169 @@
+[[!template id=gitbranch branch=jon/pagespec_alias author="[[Jon]]"]]
+[[!tag patch wishlist]]I quite often find myself repeating a boiler-plate
+[[ikiwiki/pagespec]] chunk, e.g.
+
+ and !*.png and !*.jpg...
+
+it would be quite nice if I could conveniently bundle them together into a
+pagespec "alias", and instead write
+
+ and !image()...
+
+I wrote the following plugin to achieve this:
+
+ <snip old patch; see git branch outlined above>
+
+I need to reflect on this a bit more before I send a pull request. In
+particular I imagine the strict/warnings stuff will make you puke. Also, I'm
+not sure whether I should name-grab 'alias' since [[todo/alias_directive]] is
+an existing wishlist item.
+
+> I think it would make sense to have "pagespec" in the name somehow.
+
+>> Good idea, how about `pagespecalias`? — [[Jon]]
+
+> No, the strict/warnings does not make me puke. Have you read my perl
+> code? :-P
+>
+> Note that your XXX is right. It would be a security hole to not validate
+> `$key`, as anyone with websetup access could cause it to run arbitrary
+> perl code.
+>
+> Well, except that websetup doesn't currently support configuring hashes
+> like used here. Which is a pity, but has led me to try to avoid using
+> such hashes in the setup file.
+
+> > If I removed the `getsetup` subroutine, it would not be exposed via
+> > websetup, is that right? I suppose it doesn't hurt to validate the key, even if
+> > this risk was not there. Is the use of a hash here a blocker for adoption?
+> > — [[Jon]]
+
+> Have you considered not defining the pagespec aliases in the setup file, but
+> instead as directives on pages in the wiki? Using pagestate could store
+> up the aliases that have been defined. It could however, be hard to get
+> the dependencies right; any page that uses a pagespec containing
+> an alias `foo` would need to somehow depend on the page where the alias
+> was defined. --[[Joey]]
+
+> > I haven't thought the dependency issue through beyond "that might be hard".
+> > Personally, I don't like defining stuff like this in pages, but I appreciate
+> > some do. There could be some complex scenarios where some pages rely on a
+> > pagespec alias defined on others; and could have their meanings changed by
+> > changing the definition. A user might have permission to edit a page with a
+> > definition on it but not on the pages that use it, and similar subtle permission
+> > bugs. I'm also not sure what the failure mode is if someone redefines an alias,
+> > and whether there'd be an unpredictable precedence problem.
+> > How about both methods? — [[Jon]]
+
+Here's an example setup chunk:
+
+ pagespec_aliases:
+ image: "*.png or *.jpg or *.jpeg or *.gif or *.ico"
+ helper: "*.css or *.js"
+ boring: "image() or helper()"
+
+The above demonstrates self-referential dynamic pagespec aliases. It doesn't work,
+however, to add ' or internal()' to `boring`, for some reason.
+
+-- [[Jon]]
+
+> Probably needs to be `or internal(*)` --[[Joey]]
+
+> > Ah yes, could be, thanks. — [[Jon]]
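One simplified way to model such aliases is textual expansion. Note this is not how the branch implements it (it defines per-alias `match_` functions); `expand_aliases` is a hypothetical helper for illustration only:

```perl
#!/usr/bin/perl
# Rewrite alias calls like "image()" into their configured pagespec,
# repeating so aliases such as "boring" may refer to other aliases.
use strict;
use warnings;

my %aliases = (
    image  => '*.png or *.jpg or *.jpeg or *.gif or *.ico',
    helper => '*.css or *.js',
    boring => 'image() or helper()',
);

sub expand_aliases {
    my $spec  = shift;
    my $names = join '|', map quotemeta, keys %aliases;
    for (1 .. 10) {    # bounded, in case of circular definitions
        last unless $spec =~ s/\b($names)\(\)/($aliases{$1})/g;
    }
    return $spec;
}

print expand_aliases('!boring() and !*.mdwn'), "\n";
```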
+
+> another useful pagespec alias for large maps:
+
+ basewiki: "sandbox or templates or templates/* or ikiwiki or ikiwiki/* or shortcuts or recentchanges or wikiicons/*"
+
+> -- [[Jon]]
+
+>> Useful indeed! --[[Joey]]
+
+
+>>> I've tweaked my patch in light of your above feedback: The plugin has been
+>>> renamed, and I now validate keys. I've also added documentation and tests
+>>> to the branch. I haven't read rubykat's code properly yet, and don't have
+>>> access at the time of writing (I'm on a beach in Greece ☺), but I expect it
+>>> would be possible to extend what I've got here to support defining the
+>>> aliases in a PageSpec, once the dependency stuff has been reasoned out
+>>> properly.
+>>>
+>>> I'd like to solve the issue of this not being web-configurable by
+>>> implementing support for more nested datatypes in [[plugins/websetup]]. —
+>>> [[Jon]]
+
+>>>> Well, it's a difficult problem. websetup builds a form using
+>>>> CGI::FormBuilder, which makes it easy to build the simple UI we have
+>>>> now, but sorta precludes anything more complicated. And anything with
+>>>> a nested datatype probably needs a customized UI for users to be able
+>>>> to deal with it. I don't think websetupability need be a deal-breaker
+>>>> for this patch. I personally like special pages like Kathryn is doing
+>>>> more than complex setup files. --[[Joey]]
+
+>>>>> I've run out of time to keep working on this, so I'm just going to
+>>>>> submit it as a 'contrib' plugin and leave things at that for now.
+>>>>> — [[Jon]]
+
+---------------------------
+
+Based on the above, I have written an experimental plugin called "subset".
+It's in my "ikiplugins" repo on github, in the "experimental" branch.
+<https://github.com/rubykat/ikiplugins/blob/experimental/IkiWiki/Plugin/subset.pm>
+
+It takes Joey's suggestion of defining the subsets (aliases) as directives;
+I took the example of the [[plugins/shortcut]] plugin and designated a single special page as the one where the directives are defined,
+though unlike "shortcut" I haven't hardcoded the name of the page; it defaults to "subsets" but it can be re-defined in the config.
+
+I've also added a feature which one might call subset-caching; I had to override `pagespec_match_list` to do it, however.
+An extra parameter called `subset` was added to `pagespec_match_list`, which:
+
+* limits the result to look *only* within the set of pages defined by the subset (uses the "list" option to pagespec_match_list to do this)
+* caches the result of the subset search so that the second time subset "foo" is used, it uses the stored result of the first search for "foo".
+
+This speeds things up if one is using a particular subset more than once, which one probably is if one bothered to define the subset in the first place.
+The speed increase is most dramatic when the site has a large number of pages and the number of pages in the subset is small.
+(this is similar to the "trail" concept I used in my [[plugins/contrib/report]] plugin, but not quite the same)
+
+Note that things like [[plugins/map]] can't make use of "subset" (yet) because they don't pass along all the parameters they're given.
+But [[plugins/contrib/report]] actually works without alteration because it does pass along all the parameters.
+
+Unfortunately I haven't figured out how to do the dependencies - I'd really appreciate help on that.
+
+--[[KathrynAndersen]]
+
+> > Cool! I like the caching idea. I'm not sure about the name. I don't like defining
+> > stuff in pages, but I appreciate this is a matter of taste, and would be happy with
+> > supporting both. — [[Jon]]
+
+>>> I've now gone and completely re-done "subset" so that it is less like an alias, but is a bit clearer and simpler:
+>>> instead of having a separate "match_" function for every alias, I simply have one function, "match_subset"
+>>> which takes the name of the subset. Thus a \[[!subset name="foo"...]] would be called `subset(foo)` rather than `foo()`.
+
+>>> There are a few reasons for this:<br/>
+>>> (a) it's more secure not to be evaluating code on the fly<br/>
+>>> (b) it's simpler<br/>
+>>> (c) (and this was my main reason) it makes it possible to do caching without having to have a separate "subset" argument.
+>>> I've done a bit of a hack for this: basically, the PageSpec is checked to see if the very start of the PageSpec is `subset(foo) and` or if the whole pagespec is just `subset(foo)` and if either of those is true, then it does the subset caching stuff.
+>>> The reason I check for "and" is that if it is "subset(foo) or something" then it would be an error to use the subset cache in that case.
+>>> The reason I just check the start of the PageSpec is because I don't want to have to do complex parsing of the PageSpec.
+
+>>> As for defining subsets in the config rather than on pages, I perfectly understand that desire, and I could probably add that in.
+
+>>> As for the name "subset"... well, it's even less like an alias now, and "alias" is already a reserved name. What other names would you suggest?
+
+>>>--[[KathrynAndersen]]
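The prefix check Kathryn describes can be sketched like this (a simplified model with a hypothetical helper name, not the actual subset.pm code):

```perl
#!/usr/bin/perl
# A cached subset may only be reused when the pagespec is exactly
# "subset(foo)" or begins with "subset(foo) and"; "subset(foo) or ..."
# must not use the cache, as its matches can fall outside the subset.
use strict;
use warnings;

sub cacheable_subset {
    my $spec = shift;
    return $1 if $spec =~ /^\s*subset\((\w+)\)\s*(?:$|and\b)/;
    return undef;
}

print cacheable_subset('subset(foo) and *.mdwn') // 'no cache', "\n";  # foo
print cacheable_subset('subset(foo) or news/*')  // 'no cache', "\n";  # no cache
```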
+
+>>>> Regarding my comments: I wasn't clear what you are/were intending to
+>>>> achieve with your modifications. I've aimed for a self-contained plugin
+>>>> which could be merged with ikiwiki proper. I think I initially took your
+>>>> developments as being an evolution of that with the same goal, which is
+>>>> why I commented on the (change of) name. However, I guess your work is
+>>>> more of a fork than a continuation, in which case you can call it
+>>>> whatever you like ☺ I like some of the enhancements you've made, but
+>>>> having the aliases/subsets/"things" work in any pagespec (inside map, or
+>>>> inline) is a deal-breaker for me. — [[Jon]]
+
+>>>>> I'm a bit confused by your statement "having the aliases/subsets/"things" work in any pagespec (inside map, or inline) is a deal-breaker for me".
+>>>>> Do you mean that you want them to work in any pagespec, or that you *don't* want them to work in any pagespec? -- [[KathrynAndersen]]
+
+>>>>>> I mean I would want them to work in any pagespec. — [[Jon]]
diff --git a/doc/todo/pagespec_aliases/discussion.mdwn b/doc/todo/pagespec_aliases/discussion.mdwn
new file mode 100644
index 000000000..abbe80e6a
--- /dev/null
+++ b/doc/todo/pagespec_aliases/discussion.mdwn
@@ -0,0 +1,13 @@
+Something which is similar to aliases is the "trail" concept I use in the [[plugins/contrib/report]] plugin. (Also my "pmap" plugin, but that's only in my "experimental" branch on github). One can define a "trail" by making a report with the "doscan" option (I should probably change the name of that) and then that page has a "trail" which matches the pagespec in that report.
+Then one can reference that page as a "trail" without having to reuse that pagespec.
+(It's also very useful in speeding up the processing, because the matching pages have been remembered, and one doesn't have to search for them again).
+
+So, for example, one could make a page "all_images" and have a report (or pmap, which is simpler) like so:
+
+ \[[!pmap pages="*.png or *.jpg or *.jpeg or *.gif or *.ico"]]
+
+And then later, somewhere else
+
+ \[[!report template="images.tmpl" trail="all_images" pages="album/*"]]
+
+and that would show all the images under "album".
diff --git a/doc/todo/pagespec_expansions.mdwn b/doc/todo/pagespec_expansions.mdwn
new file mode 100644
index 000000000..6107f5489
--- /dev/null
+++ b/doc/todo/pagespec_expansions.mdwn
@@ -0,0 +1,151 @@
+A couple of suggestions for improving the usefulness of pagespecs:
+
+* @ to match [^/]* (i.e. all pages exactly one level below this one)
+* initial ./ to mean "from the page the pagespec is running from".
+ This would require some architectural changes and a change to the
+ interface for pagespec_match. What do you think? I have
+ lots of pages a/b/c.mdwn that inline "a/b/c/*".
+
+--Ethan
+
+> I want this too, so that the [[examples]] can use pagespecs that don't
+> have to be changed when the pages are moved around. I don't know how I
+> feel about the "@" thing, but "./" seems good. I take it you've looked at
+> how to implement it?
+>
+> It's worth mentioning that there's a bit of an inconsistency; wikilinks
+> match relative by default and absolute if prefixed with a "/", while
+> pagespecs match absolute by default. It would be nice to clear up that
+> inconsistency somehow, it's on my little list of things in ikiwiki that
+> aren't ideal. :-) --[[Joey]]
+
+I've looked at how to implement "./", yes, and I was a little hesitant
+to disturb the elegant implementation of pagespecs as it is now. That's
+why I wrote this todo item rather than just a patch. :) As I see it,
+the simplest thing to do is check globs when building the pagespec
+expression and translate "./foo" to "$from.'/foo'" in the resulting
+expression, and then add the $from parameter to pagespec_match. This does
+require an API change for all plugins which use pagespecs but hopefully
+it should be minor. I will work on a patch tomorrow.
+
+My use case for "@" (which is kind of a crummy symbol, but whatever) is
+my [projects page](http://www.betacantrips.com/projects/). I want to inline
+"summary" or "introduction" pages that are exactly one level below the
+inlining page, but not tarballs or other junk that might be in
+subdirectories. (The issue is confounded here because of my index.mdwn
+patch, but the principle is the same.) I chose "@" because it's similar in
+physical shape to "*" but enclosed, suggesting limitations. I also thought
+it would be useful in simplifying hacks like in [[plugins/map]] but I see
+now that I was mistaken.. "four or fewer levels deep" would be
+"@ or @/@ or @/@/@ or @/@/@/@". Well, I think it has a certain appeal but
+I can see why it might not be much of an improvement. :) --Ethan
+
+> Seems to me that ".." would be the natural thing to use, not "@". --[[Joey]]
+
+>> I don't understand.. "a/b/.." matches a/b/c but not a/b/c/d ? That doesn't
+>> seem natural to me at all. --Ethan
+
+>>> Ah.. in that case, why not use "a/b/* and !a/b/*/*" ? No need for a new
+>>> symbol. --[[Joey]]
+
+>>>> I know it's not necessary, but it would be helpful. --Ethan
+
+>>>>> I don't see the need for a new syntax since it's only a little long
+>>>>> using the old one. And of course even that can now be shortened:
+>>>>> "./* and !./*/*" --[[Joey]]
+
+OK, I took a shot at implementing the changes. I was thinking about making
+pagespecs relative by default but I couldn't decide whether page
+`foo/bar` inlining `*` should match `foo/bar/*` or `foo/*`.
+So I punted and left things as absolute, with `./*` matching
+`foo/bar/*`, which I think is pretty clear.
+The patch is at [ikidev](http://ikidev.betacantrips.com/patches/pagespec_enhancements.patch)
+and you can see it work at
+[this page](http://ikidev.betacantrips.com/one/two/three/index.html) or
+[this page](http://ikidev.betacantrips.com/one/two/three/princess.html) --Ethan
+
+> Nice patch, though I see the following problems with it:
+> * The sole pagespec_match in IkiWiki::Render probably should have `$p`
+> as its third parameter. This will allow add_depends to add a
+> dependency on a pagespec that matches relative to the page. I made this
+> changes and it seems to work, new pages are noticed in updates.
+
+>> OK, word.
+
+> * `! $from` fails to match pages named "0" :-)
+
+>> I don't understand. How did you even get $from into the
+>> translated pagespec?
+
+> * `/./` matches any letter, not just "." :-) :-)
+
+>> Oof, thanks for catching that.
+
+> * One other major problem. If you look at the doc/examples/blog/index.mdwn
+> I changed it to use relative globs like "./posts/*", but they didn't work,
+> because it looked for examples/blog/indexposts/* instead of
+> examples/blog/index/posts/*. And, of course, what I really expected it to
+> look for was examples/blog/posts/*. I think you may have made the wrong
+> choice about that, so I changed it to go the other way. What do you think?
+
+>> I could have sworn I made a change like that -- I was gonna make a call to
+>> basename() or something .. wait, I might have decided not to, because it
+>> would interfere with my index patch. Yeah, I guess my code was wrong.
+>> Don't "nice patches" usually work? :) My test cases were mostly "./*",
+>> so it slipped under the radar.
+
+>> As for what it should have done, that's much harder! My gut feeling is that
+>> "a/b/c.mdwn" inlining `./*` wants `a/b/c/*` and not `a/b/*`, and this is
+>> what I programmed for. I also feel that "a/b/c" inlining `./d/*` could go
+>> either way. Ideally we'd check for both, maybe using bestlink?
+
+>> The issue might be confounded some by your use of an index page, and
+>> ikiwiki doesn't have good support for those yet :).
+>> I think ideally your index page would be treated as inlining from
+>> examples/blog/. To resolve this issue we should consider, for example:
+>> clothes/pants inlines `./jeans/*` -- probably means clothes/pants/jeans
+>> vacation/bermuda/blog inlines `./pics/*` -- probably vacation/bermuda/pics
+
+>>> What strikes me about your examples is that the "right thing" is
+>>> utterly context dependent. Unfortunately, I don't think that using
+>>> bestlink inside a pagespec is possible. bestlinks change as pages are
+>>> added/removed, and dealing with the matches of a pagespec changing when
+>>> some page is added or removed seems Hard.
+>>>
+>>> Since it seems we have to arbitrarily pick one of the two behaviors, I
+>>> prefer the one I picked for two reasons:
+>>> 1. The other behavior can be obtained easily from it, for example,
+>>> use ./c/* to limit the matches to that subdir.
+>>> 2. The common case is a bunch of pages in a single directory, not lots
+>>> of deeply nested subdirs.
+>>> --[[Joey]]
+
+>>>> Context-dependence was my conclusion too. My feeling is that inlining
+>>>> in a subdirectory of the current page is more common, but I don't
+>>>> really know. However, I think the changes as written should work OK
+>>>> with my index patch and allowing inlining from a/b/c/, so I'm
+>>>> satisfied. --Ethan
+
+> I've committed support for ./ to ikiwiki now, based on your patch.
+> [[todo/done]]
+> --[[Joey]]
+
+>> Cool! I haven't played with it yet, but looking over the patch, I see that
+>> you added another parameter to match_glob, which is an approach that didn't
+>> occur to me. I like it, it's more flexible. --Ethan
+
+One last thing -- could you either change:
+
+ $from=~s!/?[^/]+$!!;
+
+to
+
+ $from=~s!/?[^/]*$!!;
+
+Or could you put in:
+
+ $glob =~ s!//!/!g;
+
+somewhere? Or should I just add this to my index patch? --Ethan
+
+> If it's specific to your index patch, let's put it in there. --[[Joey]]
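A quick standalone sketch of the difference (hypothetical `$from` values, not
ikiwiki's actual code): the two substitutions only disagree when `$from`
already ends in a slash, which is exactly when the double slash appears.

```perl
#!/usr/bin/perl
use strict;
use warnings;

sub dir_v1 { my $f = shift; $f =~ s!/?[^/]+$!!; return $f }  # current regexp
sub dir_v2 { my $f = shift; $f =~ s!/?[^/]*$!!; return $f }  # proposed change

# Without a trailing slash, both behave the same:
print dir_v1("examples/blog/index"), "\n";  # examples/blog
print dir_v2("examples/blog/index"), "\n";  # examples/blog

# With a trailing slash, [^/]+ cannot match, so the slash survives and
# appending a glob produces "//"; [^/]* strips it instead:
my $glob = dir_v1("examples/blog/") . "/posts/*";
print "$glob\n";                       # examples/blog//posts/*
$glob =~ s!//!/!g;                     # the alternative cleanup
print "$glob\n";                       # examples/blog/posts/*
print dir_v2("examples/blog/"), "\n";  # examples/blog
```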
diff --git a/doc/todo/pagespec_relative_to_a_target.mdwn b/doc/todo/pagespec_relative_to_a_target.mdwn
new file mode 100644
index 000000000..00030cce6
--- /dev/null
+++ b/doc/todo/pagespec_relative_to_a_target.mdwn
@@ -0,0 +1,101 @@
+Sometimes you want to match a page only if it has certain properties. The use
+case I have in mind is this: show me all the pages that have children. You
+can't do that with a pagespec, so I created a plugin that adds some pagespec
+functions.
+
+`match_relative(blah)` will match a page x if a pagespec from x would match
+`blah`. This is only actually useful with relative pagespecs.
+
+`match_has_child(blah)` will match a child if it has a descendant named
+`blah`. If blah is empty, any child will match.
+
+So if I have:
+
+* foo
+* foo/blah
+* foo/bar
+* foo/bar/blah
+* foo/bar/bahoo
+* foo/baz
+* foo/baz/goo
+* foo/baz/goo/blah
+
+A pagespec `match_relative(./blah)` will match `foo/bar/bahoo`, because
+a pagespec of `./blah` from `bahoo` would match `foo/bar/blah`. A
+pagespec of `match_has_child(blah)` would match `foo`, `foo/bar`,
+`foo/baz`, and `foo/baz/goo`.
+
+Note that if you try to inline `*/blah` you will match `foo/blah`,
+`foo/bar/blah`, and `foo/baz/goo/blah` -- that is, the blah pages
+themselves rather than any relatives of theirs.
+
+This patch is useful for (among other things) constructing blogging
+systems where leaf nodes are organized hierarchically; using `has_child`,
+you can inline only leaf nodes and ignore "intermediate" nodes.
+`match_relative` can be used recursively to match properties of arbitrary
+complexity: "show me all the pages who have children called foo that
+have children called blah". I'm not sure what use it is, though.
+
+You can see the patch in action at
+<http://ikidev.betacantrips.com/conditionaltest/>,
+so named because I had hoped that something in conditional.pm could
+help me. I know the name "relative" sucks; feel free to come up with a
+better one. --Ethan
+
+<pre>
+diff -urNX ignorepats ikiwiki/IkiWiki/Plugin/relative.pm ikidev/IkiWiki/Plugin/relative.pm
+--- ikiwiki/IkiWiki/Plugin/relative.pm 1969-12-31 16:00:00.000000000 -0800
++++ ikidev/IkiWiki/Plugin/relative.pm 2007-07-26 21:48:10.642686000 -0700
+@@ -0,0 +1,39 @@
++#!/usr/bin/perl
++# relative.pm: support for pagespecs on possible matches
++package IkiWiki::Plugin::relative;
++
++use warnings;
++use strict;
++use IkiWiki 2.00;
++
++package IkiWiki::PageSpec;
++
++sub match_relative($$;@) {
++ my $parent = shift;
++ my $spec = shift;
++ my %params = @_;
++
++ foreach my $page (keys %IkiWiki::pagesources) {
++ next if $page eq $parent;
++ if (IkiWiki::pagespec_match($page, $spec, location => $parent)) {
++ return IkiWiki::SuccessReason->new("$parent can match $spec against $page");
++ }
++ }
++ return IkiWiki::FailReason->new("$parent can't match $spec against anything");
++}
++
++sub match_has_child($$;@) {
++ my $page = shift;
++ my $childname = shift;
++ my $spec;
++ if ($childname) {
++ $spec = "$page/$childname or $page/*/$childname";
++ }
++ else {
++ $spec = "$page/*";
++ }
++
++ return match_relative($page, $spec, @_);
++}
++
++1
+</pre>
+
+[[!tag patch]]
+
+> This looks really interesting. It reminds me of [[!wikipedia XPath]] and its conditionals.
+> Those might actually work well adapted to pagespecs. For instance, to write
+> "match any page with a child blah", you could just write *[blah] , or if you
+> don't want to use relative-by-default in the conditionals, *[./blah].
+> -- [[JoshTriplett]]
+
+> And it [[!taglink also_reminds_me|pagespec_in_DL_style]] of [[!wikipedia description logics]]: of course, given the relation `subpage` one could write a description-logic-style formula which would define the class of pages that are ("existentially") in a given relation (`subpage` or `inverse(subpage)*subpage`) to a certain other class of pages (e.g., named "blah") ("existentially" means there must exist a page, e.g., named "blah", which is in the given relation to the candidate).
+
+> Probably the model behind XPath is similar (although I don't know enough to say this definitely).--Ivan Z.
diff --git a/doc/todo/pagespec_to_disable_ikiwiki_directives.mdwn b/doc/todo/pagespec_to_disable_ikiwiki_directives.mdwn
new file mode 100644
index 000000000..4211c2d10
--- /dev/null
+++ b/doc/todo/pagespec_to_disable_ikiwiki_directives.mdwn
@@ -0,0 +1,5 @@
+I would like some pages (identified by pagespec) to not expand ikiwiki directives (wikilinks, or \[[!, or both, or perhaps either).
+
+I will tag this [[wishlist]]. It's something I might try myself. It's part of my thinking about how to handle [[comments]], as I'm still ruminating on alternatives to [[smcv]]'s approach. (with the greatest of respect to smcv!) (Perhaps my attempt will try to factor out the no-directives-allowed logic from the comments plugin).
+
+ -- [[Jon]]
diff --git a/doc/todo/pagestats_among_a_subset_of_pages.mdwn b/doc/todo/pagestats_among_a_subset_of_pages.mdwn
new file mode 100644
index 000000000..33f9258fd
--- /dev/null
+++ b/doc/todo/pagestats_among_a_subset_of_pages.mdwn
@@ -0,0 +1,28 @@
+[[!tag patch plugins/pagestats]]
+
+My `among` branch fixes [[todo/backlinks_result_is_lossy]], then uses that
+to provide pagestats for links from a subset of pages. From the docs included
+in the patch:
+
+> The optional `among` parameter limits counting to pages that match a
+> [[ikiwiki/PageSpec]]. For instance, to display a cloud of tags used on blog
+> entries, you could use:
+>
+> \[[!pagestats pages="tags/*" among="blog/posts/*"]]
+>
+> or to display a cloud of tags related to Linux, you could use:
+>
+> \[[!pagestats pages="tags/* and not tags/linux" among="tagged(linux)"]]
+
+I use this on my tag pages on one site, with the following template:
+
+ \[[!pagestats pages="tags/* and !tags/<TMPL_VAR raw_tag>
+ and !tags/photogallery"
+ among="tagged(<TMPL_VAR raw_tag>)"]]
+
+ \[[!inline pages="tagged(<TMPL_VAR raw_tag>)"
+ archive="yes" quick="yes" reverse="yes" timeformat="%x"]]
+
+--[[smcv]]
+
+> [[merged|done]] thanks --[[Joey]]
diff --git a/doc/todo/pal_plugin.mdwn b/doc/todo/pal_plugin.mdwn
new file mode 100644
index 000000000..0313becf4
--- /dev/null
+++ b/doc/todo/pal_plugin.mdwn
@@ -0,0 +1,9 @@
+A plugin to generate calendars using [pal](http://palcal.sourceforge.net/).
+
+Main issues:
+
+* pal's HTML output is not valid (fixed in pal SVN)
+* make sure it's secure
+* calendars change with time, so ikiwiki would need to be run from cron
+ daily to update them, and the plugin would need to somehow mark the page as
+ needing a rebuild after time had passed. Similar to [[plugins/aggregate]].
diff --git a/doc/todo/parse_debian_packages.mdwn b/doc/todo/parse_debian_packages.mdwn
new file mode 100644
index 000000000..2425645d9
--- /dev/null
+++ b/doc/todo/parse_debian_packages.mdwn
@@ -0,0 +1,70 @@
+A parser, similar in functionality to [[plugins/inline]] that would find
+and parse debian packages from a repository and include links to
+them. Functionality would be similar to the
+[PHP Apt-file parser](http://php-apt-parser.alioth.debian.org/)
+(for an example of the output, see
+[my repository](http://debian.camrdale.org/)). This would create
+a helpful index page to a small repository, listing all the
+packages, and possibly their descriptions as well, with links to
+download them or their sources.
+
+--Cameron
+
+> It's a good idea, I think there are probably several ways to approach it
+> that would all yield good, though differing, results. Maybe with
+> something like this I'd actually get around to posting ikiwiki debs to
+> the repo. ;-) --[[Joey]]
+
+I think this is easily possible (and I might be able to work on
+it myself, though Perl is not my strong suit). The trickiest
+part is probably figuring out how and when to parse the packages.
+The packages could be included in the ikiwiki RCS repository,
+which might be difficult when the Packages/Release files need to
+be generated (especially if it's via an external tool like
+reprepro/debarchiver/etc.). Or, the packages could be kept
+separate, with only a link given to the plugin, though changes
+would then not be picked up until the ikiwiki is recompiled.
+
+> This could be done by adding a hook to reprepro/whatever that calls
+> ikiwiki --refresh at the end of updating a repo. (I don't
+> remember if reprepro has such hooks; mini-dinstall certainly does.)
+
+>> reprepro doesn't seem to have one, :( though of course creating a
+>> script to do both would work (but it's not optimal). --Cameron
+
+>>> reprepro has two kinds of hooks that could be used. One is called
+>>> whenever a Packages file is changed (normally used to generate
+>>> Packages.diff files, but it does not need to add new files).
+>>> The other (though only available since 2.1) is called whenever
+>>> a package is added or removed (there is an example in the docs
+>>> for extracting changelogs using this).
+
+> For ikiwiki to notice that the Packages file outside its tree has
+> changed and things need to be updated, a `needsbuild` hook could be
+> used. This seems very doable.
+
+Perhaps a better (though infinitely more complicated) solution
+would be to include the reprepro/debarchiver functionality in
+ikiwiki. Packages could be posted, like blog entries, and tagged
+with the target distribution (sid/lenny/etc.). Then compiling
+ikiwiki would generate the needed Packages/Release files
+automatically.
+
+> I like the idea of
+> using packages as "source" and spitting out apt repos, though I'd not
+> want to use it for a big repo, and I'd ideally want to keep the packages
+> in a different svn repo, pulled in via svn:externals.
+
+>> I like it too, more than the easier options. Why are the most
+>> interesting solutions always the most complicated? ;)
+
+>> Parsing the files sounds like it might require some outside
+>> dependencies, and given the complexity maybe this should be
+>> a separate package from ikiwiki. Is it possible to package
+>> plugins separately? --Cameron
+
+>>> Sure, a plugin is just a perl library so can easily be packaged
+>>> separately.
+
+[[!tag wishlist]]
diff --git a/doc/todo/passwordauth:_sendmail_interface.mdwn b/doc/todo/passwordauth:_sendmail_interface.mdwn
new file mode 100644
index 000000000..556240964
--- /dev/null
+++ b/doc/todo/passwordauth:_sendmail_interface.mdwn
@@ -0,0 +1,61 @@
+[[!tag wishlist plugins/passwordauth]]
+
+For sending out password reminder emails, the [[plugins/passwordauth]] plugin currently uses
+the *[Mail::Sendmail](http://search.cpan.org/perldoc?Mail::Sendmail)* module.
+This module, however, has the limitation that it can only talk *SMTP*,
+but can't use the standard Unix *sendmail* (command-line) interface.
+I don't want to have an MTA with an SMTPd running on my web server system.
+Would it be possible to switch to using one of the existing Perl modules that support
+the *sendmail* interface?
+
+From doing a quick search, these might be some candidates:
+
+ * <http://search.cpan.org/perldoc?Mail::Transport::Sendmail>
+ * <http://search.cpan.org/perldoc?Email::Send::Sendmail>
+ * <http://search.cpan.org/perldoc?Mail::SendVarious>
+ * <http://search.cpan.org/perldoc?EasyMail>
+
+None of them are packaged for Debian so far, but that should be doable easily, as far as I know.
+
+ikiwiki might perhaps even try to use all of them in turn until it finds a working one.
+
+I'd offer to work on a patch for the [[plugins/passwordauth]] plugin and other places
+in the ikiwiki source code, where emailing is done.
+
+--[[tschwinge]]
+
+> One that is in Debian is [[!cpan Email::Send]], which can do SMTP and
+> sendmail and some other methods and falls back through methods until one
+> succeeds. I haven't tried to use it, but it looks like a feasible
+> candidate.
+>
+> I don't much like the idea of supporting a lot of different email sending
+> modules. --[[Joey]]
+
+OK, so I'll have a look at replacing all email handling with *Email::Send*.
+
+[[!tag patch]]
+*<http://schwinge.homeip.net/~thomas/tmp/ikiwiki-sendmail.patch>*
+
+Remaining TODOs:
+
+ * Resolve TODOs as denoted inside the patch.
+ * Is it worthwhile to use and depend on [[!cpan Return::Value]]
+ just for this bit of functionality?
+ * Debian news file.
+ * ikiwiki news file.
+
+--[[tschwinge]]
+
+> BTW, I think you recently sent a patch improving mail logging, but I've
+> lost it. --[[Joey]]
+
+Resent. --[[tschwinge]]
+
+> Debian now has Mail::Sender, Mail::SendEasy, and Email::Sender
+> (which, according to its dpkg description, "replaces the old and sometimes
+> problematic Email::Send library, which did a decent job at handling very
+> simple email sending tasks, but was not suitable for serious use, for a
+> variety of reasons"). Are any of those any better? It's unfortunate that
+> there doesn't seem to be a clear "best practice"... --[[smcv]]
diff --git a/doc/todo/paste_plugin.mdwn b/doc/todo/paste_plugin.mdwn
new file mode 100644
index 000000000..83384a8d7
--- /dev/null
+++ b/doc/todo/paste_plugin.mdwn
@@ -0,0 +1,36 @@
+It was suggested that using ikiwiki as an alternative to pastebin services
+could be useful, especially if you want pastes to not expire and be
+cloneable.
+
+All you really need is a special purpose ikiwiki instance that you commit
+to by git. But a web interface for pasting could also be nice.
+
+There could be a directive that inserts a paste form onto a page. The form
+would have a big textarea for pasting into, and might also have a file
+upload button (for uploading instead of pasting). It could also copy the
+page edit form's dropdown of markup types, which would be especially useful
+if using the highlight plugin to support programming languages. The default
+should probably be txt, not mdwn, if the txt plugin is enabled.
+
+(There's a lot of overlap between that and editpage of course .. similar
+to the overlap between the comment form and editpage.)
+
+When posted, the form would just come up with a new, numeric subpage
+of the page it appears on, and save the paste there.
+
+Another thing that might be useful is a "copy" (or "paste as new") action
+on the action bar. This would take an existing paste and copy it into the
+paste edit form, for editing and saving under a new id.
+
+---
+
+A sample wiki configuration using this might be:
+
+* enable highlight and txt
+* enable anonok so anyone can paste; lock anonymous users down to only
+ creating new pastes, not editing other pages
+* disable modification of existing pastes (how? disabling editpage would
+ work, but that would disallow setting up anonymous git push)
+* enable comments, so that each paste can be commented on
+* enable getsource, so the source to a paste can easily be downloaded
+* optionally, enable untrusted git push
diff --git a/doc/todo/pastebin.mdwn b/doc/todo/pastebin.mdwn
new file mode 100644
index 000000000..66dac0e28
--- /dev/null
+++ b/doc/todo/pastebin.mdwn
@@ -0,0 +1,11 @@
+ikiwiki could support a pastebin (requested by formorer on `#ikiwiki` for http://paste.debian.net/).
+
+Desired features:
+
+* expiration
+* [[plugins/contrib/syntax]] highlighting with line numbering
+* Password protection?
+
+-- [[JoshTriplett]]
+
+[[wishlist]]
diff --git a/doc/todo/pdf_output.mdwn b/doc/todo/pdf_output.mdwn
new file mode 100644
index 000000000..a0f324054
--- /dev/null
+++ b/doc/todo/pdf_output.mdwn
@@ -0,0 +1,22 @@
+Some time ago there was a [question](http://ikiwiki.info/forum/how_to_get_nice_pdf_from_ikiwiki_pages__63__/?updated#comment-05381634f89629ad26298a1af4b1d5f9) in the forum about how to get nice PDF output from an ikiwiki page. However, there were no answers, so I put it on the todo list, because I think this would be a nice feature.
+
+Note that, for example, dokuwiki has a [nice plugin](http://danjer.doudouke.org/tech/dokutexit) which converts the wiki page to LaTeX and then to PDF, and lets you customize the LaTeX preamble.
+
+> I've actually written one; it's just not publicly released. You can check it out from the "experimental" branch of my <a href="https://github.com/rubykat/ikiplugins">ikiplugins github repo</a>. It's called "html2pdf" and it depends on the static version of <a href="http://code.google.com/p/wkhtmltopdf/">wkhtmltopdf</a> rather than requiring a whole LaTeX setup. It's only been used on Ubuntu, so I can't say what problems there might be on other setups, but it works for me. It's not properly documented; I'd appreciate some help with that.
+> -- [[KathrynAndersen]]
+
+>> Thanks, I downloaded the git repo and did `sudo cp html2pdf.pm /usr/share/perl5/IkiWiki/Plugin/`, then added html2pdf to the addplugins line in my setup file (`mywiki.setup`) as well as a new line `html2pdf_pages=>"/*",`. Then I did `sudo ikiwiki --setup mywiki.setup`. However, there is no button or anything like that that lets me create the PDFs.
+>> -- [[micheal]]
+
+>>> That is because they are created automatically as part of the page-build process. That's what the "html2pdf_pages" option is for: it defines which pages have PDFs generated from them. If a PDF is generated for page "foo", then the PDF itself will be in "foo/foo.pdf".
+
+>>> I also notice you didn't mention installing wkhtmltopdf - it won't create PDFs without that!
+>>> -- [[KathrynAndersen]]
+
+>>>> Yes, wkhtmltopdf is installed and works; however, there are no PDF files in my /var/www/myiki directory or in the srcdir.
+
+>>>>> Have you tried running it with "verbose" turned on, and noting the output? That could give some clues.
+>>>>> And no, the PDFs are not placed in the source dir, only in the destination dir.
+>>>>> -- [[KathrynAndersen]]
+
+**Edit (17.02.2012)**: I have put an extended version of the question on webmasters.stackexchange: <http://webmasters.stackexchange.com/questions/24905/run-external-application-on-markdown-source-in-ikiwiki>; perhaps one of the ikiwiki programmers is interested in having this feature too...
diff --git a/doc/todo/pdfshare_plugin.mdwn b/doc/todo/pdfshare_plugin.mdwn
new file mode 100644
index 000000000..40a162bfd
--- /dev/null
+++ b/doc/todo/pdfshare_plugin.mdwn
@@ -0,0 +1 @@
+Given an ikiwiki with a PDF checked into it, how about a plugin that uses [pdfshare](http://ejohn.org/blog/easy-pdf-sharing/) to embed a version of the PDF that uses images and Javascript navigation? Great for presentations, or any other PDF you want to provide an easy preview of. The pdfshare plugin should regenerate the images and navigation if you update the PDF. --[[JoshTriplett]] \ No newline at end of file
diff --git a/doc/todo/pedigree_plugin.mdwn b/doc/todo/pedigree_plugin.mdwn
new file mode 100644
index 000000000..1977c3d4d
--- /dev/null
+++ b/doc/todo/pedigree_plugin.mdwn
@@ -0,0 +1,194 @@
+After realizing (thanks to
+[[Allow_TITLE_to_include_part_of_the_path_in_addition_to_the_basename]])
+that I needed some kind of "parentlinks on steroids", I wrote a new
+plugin, called pedigree.
+
+This plugin provides a bunch of loops that one can use in one's
+`HTML::Template` templates to iterate over all or a subset of a page's
+parents. Inside these loops, half a dozen variables are made
+available, in addition to `PAGE` and `URL` that are already provided
+by parentlinks.
+
+Amongst many possibilities, one can, for example, use this plugin to
+give every parent link a different `class=` attribute, depending
+either on its depth in the path leading to the current page, or on its
+distance to it.
+
+The code and documentation (including simple and complex usage
+examples) are in the 'pedigree' Git branch in this repo:
+
+ git://repo.or.cz/ikiwiki/intrigeri.git
+
+Seems there is also a [gitweb](http://repo.or.cz/w/ikiwiki/intrigeri.git).
+
+> Ok, I'll take a look. BTW, could you allow user joey on repo.or.cz
+> push access to the main ikiwiki repo you set up there? --[[Joey]]
+
+>> I did not. The main ikiwiki repo on repo.or.cz seems to have
+>> been set up by johannes.schindelin@gmx.de ; mine is what they call
+>> a "fork" (but it's not, obviously). -- intrigeri
+
+Any opinions on the idea/design/implementation?
+
+> Seems that there should be a more generic way to do `PEDIGREE_BUT_ROOT`
+> and `PEDIGREE_BUT_TWO_OLDEST` (also `is_second_ancestor`,
+> `is_grand_mother` etc). One way would be to include in `PEDIGREE`
+> a set of values like `depth_1`, `depth_2`, etc. The one corresponding
+> to the `absdepth` would be true. This would allow a template like this:
+
+ <TMPL_LOOP NAME="PEDIGREE">
+ <TMPL_IF NAME="depth_1">
+ </TMPL_ELSE>
+ <TMPL_IF NAME="depth_2">
+ </TMPL_ELSE>
+ <TMPL_VAR PAGE> /* only showing pages 2 levels deep */
+ </TMPL_IF>
+ </TMPL_IF>
+ </TMPL_LOOP>
+
+> The only missing information would be `reldepth`, but in the above
+> example the author of that template knows that it's `absdepth - 1`
+> (Things would be a lot nicer if `HTML::Template` had equality tests!)
+>
+> Since this would make it more generic and also fix your one documented
+> bug, I can see no reason not to do it. ;-) --[[Joey]]
+
+>> Thanks for your comments. I'll answer soon. (Grrr, I really
+>> need to find a way to edit this wiki offline, every minute
+>> online costs bucks to me, my old modem gently weeps,
+>> and I hate web browsers.) -- intrigeri
+
+>>> Well, I maybe didn't get your idea properly; I may be missing
+>>> something obvious, but:
+
+>>> * I don't understand how this would replace `is_grand_mother`. As a template
+>>> writer, I don't know, given an absolute array index (and this is the only
+>>> piece of data your solution gives me), whether it will be e.g. the
+>>> next-to-last element of an array whose (variable) size is unknown
+>>> to me.
+>>> * Knowing that `reldepth`'s value is, in a given loop, always equal to
+>>> `absdepth - 1` is of little use to me (as a template writer): how do I use
+>>> this piece of information programmatically in my templates, if I want all
+>>> links with `reldepth==2` to be given the same style? I guess some bits of
+>>> Javascript might do the trick, but if it's getting so complicated, I'll
+>>> just style my parentlinks another way.
+
+>>>> Perhaps I misunderstood what `is_grand_mother` is supposed to do. The
+>>>> docs were not very clear to me. If it's supposed to be 2 down from
+>>>> the page, (and not from the root), this could be achieved by reversing
+>>>> the `depth_n` variables. So the page gets `depth_1` set, its parent gets
+>>>> `depth_2` set, etc. If you want to be able to include/exclude
+>>>> from both ends, you could also have a `height_n` that is 1 for the
+>>>> root, and counts upwards. --[[Joey]]
+
+>>> In my understanding, your suggestion gives us little more than can already
+>>> be achieved anyway with `HTML::Template`'s `loop_context_vars` (i.e.
+>>> `__first__`, `__last__` and `__counter__`). The only added bonus is doing
+>>> custom stuff for an arbitrary element in the loop, chosen by its absolute
+>>> depth. Please correct me if needed.
+
+>>> (Intermezzo: in the meantime, to suit my personal real-world needs, I added
+>>> a `DISTANCE` loop-variable. Quoting the documentation, it's "the distance,
+>>> expressed in path elements, from the current page to the current path
+>>> element; e.g. this is 1 for the current page's mother, 2 for its
+>>> grand-mother, etc.".)
+
+>>> Anyway, your comments have made me think of other ways to simplify a bit
+>>> this plugin, which admittedly provides too much overlapping functionality.
+>>> Below is my reasoning.
+
+>>> In one of my own real world examples, my two main use cases are :
+
+>>> * the "full-blown example" provided in the documentation (i.e.
+>>> displaying every parent but mother and grand'ma as a group, and giving
+>>> every of these two last ones their dedicated div);
+>>> * skipping the two oldest parents, and inside what's left, displaying the
+>>> three youngest parents (i.e. mother, grand'ma and grand'grand'ma), each
+>>> one with a dedicated style;
+
+>>> Both of these can be achieved by combining `PEDIGREE`, `DISTANCE`, and some
+>>> CSS tricks to hide some parts of the list. `IS_MOTHER` and
+>>> `IS_GRAND_MOTHER`, as well as `PEDIGREE_BUT_TWO_OLDEST`, would be convenient
+>>> shortcuts, but I do not formally need them.
+
+>>> So... it seems things can be simplified greatly:
+
+>>> * I initially added `RELDEPTH` for completeness, but I'm not sure anyone
+>>> would use it. Let's give it up.
+>>> * Once `RELDEPTH` is lost (modulo Git tendencies to preserve history), the
+>>> known bug is gone as well, and `PEDIGREE_BUT_ROOT` and
+>>> `PEDIGREE_BUT_TWO_OLDEST` are now only convenient shortcuts functions;
+>>> they could as well disappear, if you prefer to.
+
+>>> It appears then that I'd be personally happy with the single `PEDIGREE` loop
+>>> (renamed to `PARENTLINKS`), providing only `PAGE`, `URL`, `ABSDEPTH` (maybe
+>>> renamed to `DEPTH`), and `DISTANCE`. This would make my templates a bit more
+>>> complicated to write and read, but would also keep the plugin's code to the
+>>> bare minimum. Let's say it is my up-to-date proposal. (Well, if the various
+>>> shortcuts don't really annoy you, I'd be glad to keep them ;)
+
+>>>> This sounds fairly similar to what I just described above. (I called
+>>>> DISTANCE "height".) I don't know about the CSS tricks; seems like if
+>>>> `DEPTH_n` and `DISTANCE_n` are provided, you can test for them inside
+>>>> the loop using HTML::Template's lame testing, and isolate any page or
+>>>> range of pages. --[[Joey]]
+
+>>>>> Ok, I definitely like this idea, as an effective and generic
+>>>>> page-range selection tool; this seems the way to go to me.
+
+>>>>> But if you discard the `DEPTH` and `HEIGHT`
+>>>>> counters, we lack a way to **style**, for example, every parent link
+>>>>> depending on its depth or height; one can do this for arbitrary
+>>>>> parents (chosen by their height or depth), but *not* for *any* parent,
+>>>>> since there is no way to express, with HTML::Template, something like
+>>>>> "display the name of the only `DEPTH_n` variable that is currently
+>>>>> true". So I am in favor of keeping the `DEPTH` and `HEIGHT` counters,
+>>>>> to allow constructs like:
+
+ <TMPL_LOOP NAME="PARENTLINKS">
+ <a href="<TMPL_VAR NAME="URL">" class="parentdistance<TMPL_VAR NAME="DISTANCE">">
+ <TMPL_VAR NAME="PAGE">
+ </a> /
+ </TMPL_LOOP>
+
+>>>>> This seems to me a nice functionality bonus, and should not
+>>>>> imply too bloated code. I'm thus going to rewrite the plugin
+>>>>> with only `PEDIGREE`, `DEPTH`, `HEIGHT`, `DEPTH_n` and
+>>>>> `HEIGHT_n`. -- intrigeri
+
+>>>>>> Done, and pushed in my pedigree branch. Update: I've also done and
+>>>>>> pushed two commits that rename the plugin and replace
+>>>>>> the core parentlinks with this one. --[[intrigeri]]
+
+(I'll try never to rebase this branch, but writing this plugin has
+been a pretext for me to start learning Git, so...)
+
+To finish with, it seems no plugin bundled with ikiwiki uses the current
+parentlinks implementation, so one could even think of moving it from the
+core to this plugin (which should then be enabled by default, since the
+default templates do use parentlinks ;).
+
+> I think that moving parentlinks out to a plugin is a good idea.
+> However, if it's done, I think the plugin should be named parentlinks,
+> and should continue to use the same template variables as are used now,
+> to avoid needing to change custom templates. Pedigree is a quite nice
+> name, but renaming it to parentlinks seems to be the way to go to me.
+> --[[Joey]]
+
+>> Agreed. -- intrigeri
+
+>> Just committed a testsuite for this plugin, BTW. It's nearly twice as
+>> big as the plugin itself; I'm wondering... -- intrigeri
+
+Merged, nice work. (Overkill having a test suite. ;-) --[[Joey]]
+
+> Thanks. If the testsuite turns out to be harder to maintain than
+> the plugin, my ego won't be offended to see it removed. It's been
+> nice to find a way, step by step, to work with you on this small
+> plugin thing. I'm starting to feel a bit at home in the ikiwiki
+> source tree, which is great since I may have to start working on some
+> more ambitious ikiwiki stuff, such as the ~multilingual wiki
+> (master language + translations) support. Expect news from me on
+> this front in the next weeks. --[[intrigeri]]
+
+[[!tag patch done]]
diff --git a/doc/todo/per_page_ACLs.mdwn b/doc/todo/per_page_ACLs.mdwn
new file mode 100644
index 000000000..82acac215
--- /dev/null
+++ b/doc/todo/per_page_ACLs.mdwn
@@ -0,0 +1,18 @@
+This is about going beyond the current [[ACL]] system and allow not only readonly pages (through [[plugins/lockedit]]) but also read protection, and per page. To quote that other page:
+
+> [[!acl user=joe page=.png allow=upload]]
+> [[!acl user=bob page=/blog/bob/ allow=]]
+> [[!acl user= page=/blog/bob/ deny=]]
+> [[!acl user=http://jeremie.koenig.myopenid.com/ page=/todo/* deny=create
+> reason="spends his time writing todo items instead of source code"]]
+>
+> Each would expand to a description of the resulting rule.
+>
+> a configurable page of the wiki would be used as an ACL list. Possibly could refer to other ACL pages, as in:
+>
+> [[!acl user= page=/subsite/ acl=/subsite/acl.mdwn]]
+
+I think this would be perfectly possible in ikiwiki, provided of course that access to the full repository is not allowed, as that cannot be made granular. The way I would see that happening would be by dropping .htaccess files in the right directories, with clever configuration of the virtual host containing the ikiwiki install. Apache has plenty of methods for doing such authentication, and we could simply rely on [[plugins/httpauth/]] for that. *But* the key missing piece is having ACLs per page, or improving the httpauth plugin to support "noread" pagespecs... --[[anarcat]]
+
+Agreed with anarcat; I'm experimenting with it. Moreover, after sketching out some kind of "private area" and "public area" with [[plugins/httpauth/]], I realized that the *backlinks* generated on a public page can actually link to private pages. In the end, users navigating via backlinks will frequently hit HTTP 401 errors, which deters browsing and creates false positives for the admin watching the logs.
+So a plus would be a visual indication that a link is denied (why not with the reason in a mouseover popup?). [[mathdesc]]
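To make the .htaccess approach concrete, here is a sketch of the kind of file ikiwiki could drop into a read-protected destination directory (the paths, realm, and user are hypothetical, and it assumes the virtual host permits `AllowOverride AuthConfig` and an htpasswd file maintained outside the wiki):

```apacheconf
# hypothetical: destdir/blog/bob/.htaccess, generated from an acl directive
AuthType Basic
AuthName "bob's private blog"
AuthUserFile /srv/ikiwiki/.htpasswd
Require user bob
```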
diff --git a/doc/todo/pingback_support.mdwn b/doc/todo/pingback_support.mdwn
new file mode 100644
index 000000000..7b3b158ee
--- /dev/null
+++ b/doc/todo/pingback_support.mdwn
@@ -0,0 +1,41 @@
+A "pingback" is a system whereby URLs you might reference in a blog post are
+contacted by the blog publishing software at publishing time (i.e., once) so
+that they might update a list of "pingbacks" to the URL. The originating
+URL's blog software might then display a list of pingbacks, or an excerpt of
+the text from your blog, perhaps interleaved with comments, etc.
+
+At a technical level, external URLs are extracted from your blog post by the
+blogging software, fetched, inspected for information to determine whether the
+remote server is configured to support pingbacks (look for link tags, or HTTP
+headers) and the relevant pingback URL sent an XML-RPC packet.
+
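A rough sketch of the sending side described above, in Perl (assuming `LWP::UserAgent` and `RPC::XML::Client` from CPAN; not an ikiwiki plugin, just the mechanics):

```perl
use strict;
use warnings;
use LWP::UserAgent;
use RPC::XML::Client;

# Discover the pingback server for a target URL, per the spec:
# the X-Pingback header wins, else the first matching link element.
sub pingback_url {
	my $target=shift;
	my $res=LWP::UserAgent->new->get($target);
	return unless $res->is_success;
	return $res->header('X-Pingback') if $res->header('X-Pingback');
	return $1 if $res->decoded_content =~
		m{<link rel="pingback" href="([^"]+)" ?/?>};
	return;
}

# Send the actual pingback.ping XML-RPC call.
sub send_pingback {
	my ($source, $target)=@_;
	my $server=pingback_url($target) or return;
	my $resp=RPC::XML::Client->new($server)
		->send_request('pingback.ping', $source, $target);
	return ref $resp ? $resp->value : undef;
}
```

An `editcontent` hook could then extract external URLs from the htmlized page and call `send_pingback` for each.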
+There are other technologies to achieve the same thing: trackbacks predate
+pingbacks but are more vulnerable to spam due to design problems.
+
+The spec for pingbacks is at <http://www.hixie.ch/specs/pingback/pingback>.
+
+I would like to somehow use pingbacks in conjunction with ikiwiki. I suppose
+this could be achieved using a commit hook and some external software in which
+case I will consider this done with an entry in [[tips]]; otherwise a
+[[plugins|plugin]] to implement pingbacks would be great.
+
+-- [[Jon]] (Wed Jan 14 13:48:47 GMT 2009)
+
+> I think it's now possible to implement trackback and pingback receiving
+> support in ikiwiki. One easy way to do it would be to hook it into the
+> existing [[plugins/comments]] plugin -- each pingback/trackback that
+> ikiwiki receives would result in the creation of a new comment, which
+> would be subject to the usual comment filtering (ie, blogspam) and
+> moderation and would then show up among the other, regular comments on
+> the page.
+>
+> (One wrinkle: would need to guard against duplicate pings. Maybe by
+> checking existing comments for any that have the same url?)
+>
+> As for sending trackbacks and pingbacks, this could fairly easily be
+> implemented using a `editcontent` hook. Since this hook is called
+> whenever a page is posted or edited, and gets the changed content, it can
+> simply scan it for urls (may have to htmlize first?), and send pings to
+> all urls found. --[[Joey]]
+
+>> Is there any update on this? This would be highly useful and is the main reason why I am not using my blog more regularly, yet. (And yes, now that git-annex is doing everything I need and more, I thought I should revisit this one, as well). -- RichiH
diff --git a/doc/todo/please_add_some_table_styles.mdwn b/doc/todo/please_add_some_table_styles.mdwn
new file mode 100644
index 000000000..1308e1950
--- /dev/null
+++ b/doc/todo/please_add_some_table_styles.mdwn
@@ -0,0 +1,8 @@
+[[!template id=gitbranch branch=jmtd/tablestyle author="[[Jon]]"]]
+
+The [[plugins/table]] plugin's "`class`" argument is a pretty useful
+shortcut, and it would be nice to provide at least one example class
+designed for use with tables pre-defined in ikiwiki. I've written a
+quick, minimal one that makes the table full-width (and some very
+minimal, useful table styling) called `fullwidth_table` — please
+consider merging it. Thanks! — [[Jon]][[!tag wishlist patch]]
diff --git a/doc/todo/pluggablerenderers.mdwn b/doc/todo/pluggablerenderers.mdwn
new file mode 100644
index 000000000..d54259765
--- /dev/null
+++ b/doc/todo/pluggablerenderers.mdwn
@@ -0,0 +1,3 @@
+Should be able to plug in support for rst or other markup formats.
+
+[[todo/done]]
diff --git a/doc/todo/plugin.mdwn b/doc/todo/plugin.mdwn
new file mode 100644
index 000000000..b3e3a7889
--- /dev/null
+++ b/doc/todo/plugin.mdwn
@@ -0,0 +1,118 @@
+Suggestions of ideas for plugins:
+
+* enable editable, non-htmlized files
+
+ Some months ago, before upgrading my wiki, I used svn to check in an XML file
+ and a companion XSL file for client-side styling. That was cool, ikiwiki
+ copied them over unchanged and the file could be linked to as `\[[foo|foo.xml]]`.
+
+ I even had the XSL produce an `Edit` link at the top, because I wanted a simple
+ way for a web user to edit the XML. But I had to hack stuff to make the edit CGI
+ not say `foo.xml is not an editable page`.
+
+ I did that in a kind of slash-and-burn way, and apparently that's the one change
+ that was uncommitted when I upgraded ikiwiki, so now it's in the same place
+ as the wikiwyg project. On the bright side, that's a chance to think about how to
+ do it better.
+
+ Any suggestions for appropriate uses of existing plugins, or the plugin API,
+ to selectively add to the set of files in the working copy that the edit CGI
+ will consider editable? --ChapmanFlack 17July2008
+
+ > It looks like 80% of the job would be accomplished by hooking `htmlize` for
+ > the `.xml` extension. That would satisfy the `pagetype` test that causes
+ > the edit CGI to say `not an editable page`. (That happens too early for a
+ > `canedit` hook.) The `htmlize` hook could just
+ > copy in to out unchanged (this is an internal wiki, I'm not thinking hard
+ > about evil XML content right now). For extra credit, an `editcontent` hook
+ > could validate the XML. (Can an `editcontent` hook signal a content error?)
+
+ > The tricky bit seems to be to register the fact that the target file should
+ > have extension `.xml` and not `.html`. Maybe what's needed is a generalized
+ > notion of an `htmlize` hook, one that specifies its output extension as well
+ > as its input, and isn't assumed to produce html? --ChapmanFlack 17July2008
+
+ > Belay that, there's nothing good about trying to use `htmlize` for this; too
+ > many html-specific assumptions follow. For now I'm back to an embarrassing quick
+ > hack that allows editing my xml file. But here's the larger generalization I
+ > think this is driving at:
+
+ > IkiWiki is currently a tool that can compile a wiki by doing two things:
+ > 1. Process files of various input types _foo_ into a single output type, html, by
+ > finding suitable _foo_->html plugins, applying various useful transformations
+ > along the way.
+ > 1. Process files of other input types by copying them with no useful transformations at all.
+
+ > What it could be: a tool that compiles a wiki by doing this:
+ > 1. Process files of various input types _foo_ into various output types _bar_, by
+ > finding suitable _foo_->_bar_ plugins, applying various useful transformations along
+ > the way, but only those that apply to the _foo_->_bar_ conversion.
+ > 1. The second case above is now just a special case of 1 where _foo_->_foo_ for any
+ > unknown _foo_ is just a copy, and no other transformations apply.
+
+ > In some ways this seems like an easy and natural generalization. `%renderedfiles`
+ > is already mostly there, keeping the actual names of rendered files without assuming
+ > an html extension. There isn't a mechanism yet to say which transformations for
+ > linkification, preprocessing, etc., apply to which in/out types, but it could be
+ > easily added without a flag day. Right now, they _all_ apply to any input type for
+ > which an `htmlize` hook exists, and _none_ otherwise. That rule could be retained
+ > with an optional hook parameter available to override it.
+
+ > The hard part is just that right now the assumption of html as the one destination
+ > type is in the code a lot. --ChapmanFlack
+
+ >> Readers who bought this also liked: [[format_escape]], [[multiple_output_formats]]
+ >> --[[JeremieKoenig]]
+
+* list of registered users - tricky because it sorta calls for a way to rebuild the page when a new user is registered. Might be better as a cgi?
+> At best, this could only show the users who have logged in, not all
+> permitted by the current auth plugin(s). HTTP auth would need
+> web-server-specific code to list all users, and openid can't feasibly do so
+> at all. --[[JoshTriplett]]
+
+* For PlaceWiki I want to be able to do some custom plugins, including one
+ that links together subpages about the same place created by different
+ users. This seems to call for a plugin that applies to every page w/o any
+ specific marker being used, and pre-or-post-processes the full page
+ content. It also needs to update pages when related pages are added,
+ so it needs to register dependencies pre-emptively between pages,
+ or something. It's possible that this is a special case of backlinks and
+ is best implemented by making backlinks a plugin somehow. --[[Joey]]
+
+* random page (cgi plugin; how to link to it easily?)
+
+* How about an event calendar. Events could be sub-pages with an embedded
+  code to detail recurrence and/or event date/time
+
+* rcs plugin ([[JeremyReed]] has one he has been using for over a month, with over 850 web commits from 13 users, each with more than ten commits.)
+
+* asciidoc or txt2tags format plugins
+
+ Should be quite easy to write, the otl plugin is a good example of a
+ similar formatter.
+
+>>Isn't there a conflict between ikiwiki using \[\[ \]\] and asciidoc using the same?
+>>There is a start of an asciidoc plugin at <http://www.mail-archive.com/asciidoc-discuss@metaperl.com/msg00120.html>
+>>-- KarlMW
+
+* manpage plugin: convert **"ls(1)"** style content into Markdown like **\[ls(1)\]\(http://example.org/man.cgi?name=ls&sect=1\)** or into HTML directly.
+
+> With a full installation of groff available, man offers HTML output. Might
+> take some fiddling to make it fit into the ikiwiki templates, and you might
+> or might not want to convert pages in the SEE ALSO as
+> well. --[[JoshTriplett]]
+
+* As I couldn't find another place to ask, I'll try here. I would like to install some contributed plugins, but cannot find anywhere to download them.
+
+ > Not sure what you mean, the [[plugins/contrib]] page lists contributed plugins, and each of their pages tells where to download the plugin from.. --[[Joey]]
+
+* I wrote a very crude wrapper around tex4ht to render TeX files. I hesitate to give it a contrib/plugins page in its current state, but if someone wants to play, [here](http://www.cs.unb.ca/~bremner/wiki/software/ikiwiki/tex4ht.pm) it is.--[[DavidBremner]]
+
+* Setting default values for the meta plugin in the setup file, particularly author, license, and copyright, would be useful.
+There is work in progress at
+[[plugins/contrib/default_content_for___42__copyright__42___and___42__license__42__]]
+-- [[DavidBremner]]
+
+* Would it make sense to have a hook to set the page name? This would solve a problem I see with
+[[source_code_highlighting|plugins/contrib/sourcehighlight]]
+-- [[DavidBremner]]
diff --git a/doc/todo/plugin_data_storage.mdwn b/doc/todo/plugin_data_storage.mdwn
new file mode 100644
index 000000000..21e925b5b
--- /dev/null
+++ b/doc/todo/plugin_data_storage.mdwn
@@ -0,0 +1,94 @@
+ikiwiki currently stores some key data in .ikiwiki/index. Some plugins need a
+way to store additional data, and ideally it would be something managed by
+ikiwiki instead of ad-hoc because:
+
+* consistency is good
+* ikiwiki knows when a page is removed and can stop storing data for that
+ page; plugins have to go to some lengths to track that and remove their
+ data
+* it's generally too much code and work to maintain a separate data store
+
+The aggregate plugin is a use-case: of 324 lines, 70 are data storage and
+another 10 handle deletion. Also, it's able to use a format very like
+ikiwiki's, but it does need to store some lists in there, which complicates
+it some and means that a very naive translation between a big per-page hash
+and the .index won't be good enough.
+
+The current ikiwiki index format is not very flexible, although it is at
+least fairly easy and inexpensive to parse as well as hand-edit.
+
+Would this do?
+
+* Plugins can register savestate and loadstate hooks. The hook id is the
+ key used in the index file that the hook handles.
+* loadstate hooks are called and passed a list of all values for a page
+  for the registered key, and the page name, and should store the data
+ somewhere
+* savestate hooks are called and passed a page, and should return a list of
+ all values for that key for that page
+* If they need anything more complex than a list of values, they will need
+ to encode it somehow in the list.
+
+Hmm, that's potentially a lot of function calls per page on each load/save
+though.. For fewer function calls, only call each hook *once* per load/save,
+and it is passed/returns a big hash of pages and the values for each page.
+(Which probably means `%state=@_` for load and `return %state` for save.)
+
+It may also be better to just punt on lists, and require plugins that need
+even lists to encode them. Especially since in many cases, `join(" ", @list)`
+will do. Er hmm, if I do that though, I'm actually back to a big global
+%page_data that plugins can just toss data into, aren't I? So maybe that's
+the right approach after all, hmm.. Except that needing to decode/encode list
+data all the time when using it would quite suck, so no, let's not do that.
+
+Note that for the aggregate plugin to use this, it will need some changes:
+
+* guid data will need to be stored as part of the data for the page
+  that was aggregated from that guid. Except, expired pages don't exist, but
+ still have guid data to store. Hmm. I suppose the guid data could be
+ considered to be associated with the page that contains the aggregate
+ directive then.
+* All feeds will need to be marked as removable in loadstate, and only
+ unmarked if seen in preprocess. Then savestate will need to not only
+ remove any feeds still marked as such, but do the unlinking of pages
+ aggregated from them too.
+
+If I do this, I might as well also:
+
+* Change the link= link= stuff to just links=link+link etc.
+* Change the delimiter from space to comma; commas are rare in index files,
+ so less ugly escaped delimiters to deal with.
+
+---
+
+The [[plugins/calendar]] plugin could use plugin data storage to record
+which pages have a calendar for the current time. Then ensure they are
+rebuilt at least once a day. Currently, it needs a cron job to rebuild
+the *whole* wiki every day; with this enhancement, the cron job would only
+rebuild the few pages that really need it.
+
+
+---
+
+New design:
+
+`%Ikiwiki::state` is an exported hash that stores per-page state.
+Set with `$state{$page}{id}{key}=$value`. The `id` is the same `id` passed
+to `hook()`.
+
+This is stored in the index like:
+
+    src=foo.mdwn dest=bar.mdwn id_key=value [...]
+
+The underscore ensures that there's no conflict with ikiwiki's own
+state variables. (Note that `id` and `key` need to be encoded here.)
+
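Under this design, a plugin's use of the state hash might look like the following sketch (assuming `%state` is exported as described; `lastedit` is a made-up plugin):

```perl
package IkiWiki::Plugin::lastedit;
# Made-up plugin illustrating the proposed per-page state storage.
use warnings;
use strict;
use IkiWiki;

sub import {
	# Registering a hook also marks the plugin as active, so
	# ikiwiki preserves its saved state across runs.
	hook(type => "scan", id => "lastedit", call => \&scan);
}

sub scan {
	my %params=@_;
	# Stored in the index as lastedit_time=<epoch> on this page's line.
	$state{$params{page}}{lastedit}{time}=time;
}

1
```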
+Plugins are responsible for deleting old state info, though ikiwiki will
+handle deleting it if a page is removed.
+
+Ikiwiki needs to know when it can drop state for plugins that are no longer
+enabled. This is done via `hook()` -- if a plugin registers a hook
+ikiwiki knows it's still active, and preserves the state for the hook id.
+If not, that state will be dropped.
+
+[[done]]!! Now to use it..
diff --git a/doc/todo/plugin_dependency_calulation.mdwn b/doc/todo/plugin_dependency_calulation.mdwn
new file mode 100644
index 000000000..28b36fc81
--- /dev/null
+++ b/doc/todo/plugin_dependency_calulation.mdwn
@@ -0,0 +1,24 @@
+A few plugins need more complex dependency calculations than ikiwiki can do
+on its own:
+
+* Use of a version plugin should only make the page rebuild when it's built
+ with a new version of ikiwiki.
+* Some plugin might want to _always_ rebuild the page that uses it.
+* If backlinks were turned into a plugin, it would need to make a page
+ rebuild when its backlinks changed.
+
+These suggest there should be a way for plugins to have hooks that tweak
+the list of pages to rebuild.
+
+Which in turn suggests that there should *be* a list of pages to rebuild;
+currently there's not, and the best such an interface could do would be to
+rebuild the pages even if they were already going to be rebuilt for some
+other reason. (See [[optimisations]].)
+
+It also suggests that plugins will want to examine pages and/or
+[[store_data|plugin_data_storage]] about them to use in the dependency
+calculations. For example, the version plugin would need to store info
+about what pages use it.
+
+> I [[fixed|done]] this without realizing it when I added the needsbuild hook!
+> --[[Joey]]
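For reference, the needsbuild hook lets a plugin add pages to the list being rebuilt; a version-style plugin might be sketched like this (hypothetical `%pages_using_version` and `$last_built_with`, recorded via [[plugin_data_storage]]):

```perl
use IkiWiki;

# Hypothetical bookkeeping, populated at scan time and saved
# via the plugin data storage mechanism.
our (%pages_using_version, $last_built_with);

sub needsbuild {
	my $needsbuild=shift;	# arrayref of source files already slated
	if (defined $last_built_with &&
	    $IkiWiki::version ne $last_built_with) {
		foreach my $page (keys %pages_using_version) {
			my $file=$pagesources{$page};
			push @$needsbuild, $file
				unless grep { $_ eq $file } @$needsbuild;
		}
	}
	return $needsbuild;
}
```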
diff --git a/doc/todo/po:_add_lang_name_and_code_template_variables.mdwn b/doc/todo/po:_add_lang_name_and_code_template_variables.mdwn
new file mode 100644
index 000000000..1617e4914
--- /dev/null
+++ b/doc/todo/po:_add_lang_name_and_code_template_variables.mdwn
@@ -0,0 +1,7 @@
+My po branch adds two new template variables: `lang_code` and
+`lang_name`, that respectively display the current page's language
+codename and pretty name. Please review and pull. --[[intrigeri]]
+
+> [[done]] --[[Joey]]
+
+[[patch]]
diff --git a/doc/todo/po:_avoid_rebuilding_to_fix_meta_titles.mdwn b/doc/todo/po:_avoid_rebuilding_to_fix_meta_titles.mdwn
new file mode 100644
index 000000000..9bb9c72c4
--- /dev/null
+++ b/doc/todo/po:_avoid_rebuilding_to_fix_meta_titles.mdwn
@@ -0,0 +1,60 @@
+Re the meta title escaping issue worked around by `change`.
+
+> I suppose this does not only affect meta, but other things
+> at scan time too. Also, handling it only on rebuild feels
+> suspicious -- a refresh could involve changes to multiple
+> pages and trigger the same problem, I think. Also, exposing
+> this rebuild to the user seems really ugly, not confidence inducing.
+>
+> So I wonder if there's a better way. Such as making po, at scan time,
+> re-run the scan hooks, passing them modified content (either converted
+> from po to mdwn or with the escaped stuff cheaply de-escaped). (Of
+> course the scan hook would need to avoid calling itself!)
+>
+> (This doesn't need to block the merge, but I hope it can be addressed
+> eventually..)
+>
+> --[[Joey]]
+>>
+>> I'll think about it soon.
+>>
+>> --[[intrigeri]]
+>>
+>>> Did you get a chance to? --[[Joey]]
+
+>>>> I eventually did, and got rid of the ugly double rebuild of pages
+>>>> at build time. This involved adding a `rescan` hook. Rationale
+>>>> and details are in my po branch commit messages. I believe this
+>>>> new way of handling meta title escaping to be far more robust.
+>>>> Moreover this new implementation is more generic, feels more
+>>>> logical to me, and probably fixes other similar bugs outside the
+>>>> meta plugin scope. Please have a look when you can.
+>>>> --[[intrigeri]]
+
+>>>>> Glad you have tackled this. Looking at
+>>>>> 25447bccae0439ea56da7a788482a4807c7c459d,
+>>>>> I wonder how this rescan hook is different from a scan hook
+>>>>> with `last => 1` ? Ah, it comes *after* the preprocess hook
+>>>>> in scan mode. Hmm, I wonder if there's any reason to have
+>>>>> the scan hook called before those as it does now. Reordering
+>>>>> those 2 lines could avoid adding a new hook. --[[Joey]]
+
+>>>>>> Sure. I was fearing to break other plugins if I did so, so I
+>>>>>> did not dare to. I'll try this. --[[intrigeri]]
+
+>>>>>>> Done in my po branch, please have a look. --[[intrigeri]]
+
+>>>>>>>> I've merged it. Didn't look at the po.pm changes closely;
+>>>>>>>> assume they're ok. [[done]] --[[Joey]]
+>>>>>>>>
+>>>>>>>> My thinking about the reordering being safe is that
+>>>>>>>> the relative ordering of scan and preprocess in scan mode hooks
+>>>>>>>> has not been defined before, so it should be ok to define it. :)
+>>>>>>>>
+>>>>>>>> And as to possible breakage from things that assumed the old
+>>>>>>>> ordering, such a thing would need to have a scan hook and a
+>>>>>>>> preprocess in scan mode hook, and the two hooks would need to
+>>>>>>>> populate the same data structure with conflicting information,
+>>>>>>>> in order for there to be a problem. That seems highly unlikely
+>>>>>>>> and would be pretty broken on its own. And no plugin in ikiwiki
+>>>>>>>> itself has both types of hooks. --[[Joey]]
diff --git a/doc/todo/po:_better_documentation.mdwn b/doc/todo/po:_better_documentation.mdwn
new file mode 100644
index 000000000..6e9804df4
--- /dev/null
+++ b/doc/todo/po:_better_documentation.mdwn
@@ -0,0 +1,3 @@
+Maybe write separate documentation for the po plugin, depending on the
+people it targets: translators, wiki administrators, hackers. This
+plugin may be complex enough to deserve this.
diff --git a/doc/todo/po:_better_links.mdwn b/doc/todo/po:_better_links.mdwn
new file mode 100644
index 000000000..af879a56a
--- /dev/null
+++ b/doc/todo/po:_better_links.mdwn
@@ -0,0 +1,12 @@
+Once the fix to
+[[bugs/pagetitle_function_does_not_respect_meta_titles]] from
+[[intrigeri]]'s `meta` branch is merged into ikiwiki upstream, the
+generated links' text will be optionally based on the page titles set
+with the [[meta|plugins/meta]] plugin, and will thus be translatable.
+It will also allow displaying the translation status in links to slave
+pages. Both were implemented, and reverted in commit
+ea753782b222bf4ba2fb4683b6363afdd9055b64, which should be reverted
+once [[intrigeri]]'s `meta` branch is merged.
+
+An integration branch, called `meta-po`, merges [[intrigeri]]'s `po`
+and `meta` branches, and thus has these additional features.
diff --git a/doc/todo/po:_better_translation_interface.mdwn b/doc/todo/po:_better_translation_interface.mdwn
new file mode 100644
index 000000000..d2ae2ed5c
--- /dev/null
+++ b/doc/todo/po:_better_translation_interface.mdwn
@@ -0,0 +1,5 @@
+Add a message-by-message translation interface to the PO plugin,
+with automatic escaping of special chars.
+
+[[Integrating with transifex|todo/po: transifex integration]] or with
+Pootle would be another way to go.
diff --git a/doc/todo/po:_remove_po_files_when_disabling_plugin.mdwn b/doc/todo/po:_remove_po_files_when_disabling_plugin.mdwn
new file mode 100644
index 000000000..5d0318ae1
--- /dev/null
+++ b/doc/todo/po:_remove_po_files_when_disabling_plugin.mdwn
@@ -0,0 +1,13 @@
+ikiwiki now has a `disable` hook. Should the po plugin remove the po
+files from the source repository when it has been disabled?
+
+> pot files, possibly, but the po files contain work, so no. --[[Joey]]
+
+>> I tried to implement this in my `po-disable` branch, but AFAIK, the
+>> current rcs plugins interface provides no way to tell whether a
+>> given file (e.g. a POT file in my case) is under version control;
+>> in most cases, it is not, thanks to .gitignore or similar, but we
+>> can't be sure. So I just can't decide whether to call
+>> `rcs_remove` rather than a good old `unlink`. --[[intrigeri]]
+
+>>> I guess you could call `rcs_remove` followed by `unlink`. --[[Joey]]
diff --git a/doc/todo/po:_rethink_pagespecs.mdwn b/doc/todo/po:_rethink_pagespecs.mdwn
new file mode 100644
index 000000000..98c7ff655
--- /dev/null
+++ b/doc/todo/po:_rethink_pagespecs.mdwn
@@ -0,0 +1,40 @@
+I was surprised that, when using the map directive, a pagespec of "*"
+listed all the translated pages as well as regular pages. That can
+make a big difference to an existing wiki when po is turned on,
+and seems generally not wanted.
+(OTOH, you do want to match translated pages by
+default when locking pages.) --[[Joey]]
+
+> Seems hard to me to sort apart the pagespec whose matching pages
+> list must be restricted to pages in the master (or current?)
+> language, and the ones that should not. The only solution I can see
+> to this surprising behaviour is: documentation. --[[intrigeri]]
+
+>> Well, a sorting criteria might be that if a PageSpec is used
+>> with a specified location, as happens whenever a PageSpec is
+>> used on a page, then it should match only `currentlang()`. If it
+>> is used without a location, as in the setup file, then no such limit.
+
+>>> Ok. --[[intrigeri]]
+
+>> Note that
+>> `match_currentlang` currently dies if called w/o a location -- if
+>> it instead was always true w/o a location, this would just mean that
+>> all pagespecs should have `and currentlang()` added to them. How to
+>> implement that? All I can think of doing is wrapping
+>> `pagespec_translate`.
+
+>>> Seems doable. --[[intrigeri]]
+
+>> The only case I've found where it does make sense to match other
+>> language pages is on `l10n.ikiwiki.info` when listing pages that
+>> need translation.
+>>
+>> Otherwise, it can be documented, but that's not really enough;
+>> a user who makes a site using auto-blog.setup and enables po will
+>> get a really screwed up blog that lists translations as separate posts
+>> and needs significant work to fix. I have thought about making
+>> `match_currentlang` a stub in IkiWiki (done in my currentlang branch),
+>> so I can use it in all the PageSpecs in the example blog etc, but I
+>> can't say I love the idea.
+>> --[[Joey]]
diff --git a/doc/todo/po:_should_cleanup_.pot_files.mdwn b/doc/todo/po:_should_cleanup_.pot_files.mdwn
new file mode 100644
index 000000000..e8032c54d
--- /dev/null
+++ b/doc/todo/po:_should_cleanup_.pot_files.mdwn
@@ -0,0 +1,8 @@
+[[!tag wishlist]]
+
+The [[plugins/po]] plugin litters the wiki srcdir with .pot files, but when pages are removed, the corresponding .pot files are not.
+
+intrigeri says
+
+> Calling rcs_remove followed by IkiWiki::prune against the POT file,
+> in po.pm:deletetranslations, should be enough to get rid of it.
diff --git a/doc/todo/po:_transifex_integration.mdwn b/doc/todo/po:_transifex_integration.mdwn
new file mode 100644
index 000000000..dbdb4017b
--- /dev/null
+++ b/doc/todo/po:_transifex_integration.mdwn
@@ -0,0 +1,13 @@
+[[!tag wishlist]]
+
+[Debian bug #627693](http://bugs.debian.org/627693)
+
+I would like the [[plugins/po]] plugin to optionally run, from the working directory, the transifex-client command equivalent to
+
+`tx set --auto-local -r $transifex_project_name.$mungedpagename $mungedpagename'.<lang>.po' --source-lang en --source-file $page.pot --execute`
+
+for any new .pot files that are added in that particular ikiwiki run, and then run
+
+`tx push -s`
+
+each time.
diff --git a/doc/todo/po:_translation_of_directives.mdwn b/doc/todo/po:_translation_of_directives.mdwn
new file mode 100644
index 000000000..89fc93620
--- /dev/null
+++ b/doc/todo/po:_translation_of_directives.mdwn
@@ -0,0 +1,8 @@
+If a translated page contains a directive, it may expand to some english
+text, or text in whatever single language ikiwiki is configured to "speak".
+
+Maybe there could be a way to switch ikiwiki to speaking another language
+when building a non-english page? Then the directives would get translated.
+
+(We also will need this in order to use translated templates, when they are
+available.)
diff --git a/doc/todo/po_needstranslation_pagespec.mdwn b/doc/todo/po_needstranslation_pagespec.mdwn
new file mode 100644
index 000000000..45b7377ea
--- /dev/null
+++ b/doc/todo/po_needstranslation_pagespec.mdwn
@@ -0,0 +1,12 @@
+Commit b225fdc44d4b3d in my po branch adds a `needstranslation()`
+PageSpec. It makes it easy to list pages that need translation work.
+Please review. --[[intrigeri]]
+
+> Looks good, cherry-picked. The only improvement I can
+> think of is that `needstranslation(50)` could match
+> only pages less than 50% translated. --[[Joey]]
+
+>> This improvement has been implemented as 98cc946 in my po branch.
+>> --[[intrigeri]]
+
+[[!tag patch done]]
diff --git a/doc/todo/preprocessor_directive_for_proposed_changes.mdwn b/doc/todo/preprocessor_directive_for_proposed_changes.mdwn
new file mode 100644
index 000000000..1542f39ae
--- /dev/null
+++ b/doc/todo/preprocessor_directive_for_proposed_changes.mdwn
@@ -0,0 +1,60 @@
+There are some kind of changes to the underlying repository
+which can't be made through the web interface:
+
+ * changes to files outside the wiki, to locked pages;
+ * advanced RCS operations such as merge, move, copy or del;
+ * changes you're not confident enough to apply outright.
+
+Of course in these cases, you can add your request to a discussion page
+and wait for someone with the access/confidence to apply them.
+Maybe this can be enhanced with a [[ikiwiki/Directive]]:
+
+<pre>
+\[[!suggest op=merge dstfile=trunk srcfile=branches/jk oldrev=1234 newrev=1342]]
+
+\[[!suggest op=move srcpage=/blog dstpage=/blog_support]]
+
+\[[!suggest patch="""
+Index: IkiWiki/CGI.pm
+===================================================================
+--- IkiWiki/CGI.pm (revision 4119)
++++ IkiWiki/CGI.pm (working copy)
+@@ -497,9 +497,11 @@
+(...)
+"""]]
+</pre>
+
+These would expand to a description of the changes,
+and provide "apply these changes", "preview changes", and maybe
+"show diff" buttons. When those would be clicked,
+an rcs_ function would be called to apply the changes in
+the working copy, and depending on the request they would
+be svn diff'ed or rendered and shown, and kept.
+(all the affected pages would be inlined for the preview)
+
+Ultimately my planned [[review_mechanism]] would manage pages
+with such directives by itself.
+
+Thinking about it, describing changes inside a directive rather
+than as pages of their own is a bad remedy for the temporary
+lack of web-based file upload in ikiwiki.
+
+Implementing this as new pages formats would be simpler,
+and combined with inlining and file uploading it would be
+at least as powerful. It would be easier to handle changes
+automatically (for instance, moving the change pages once
+they have been applied). There would still be associated
+discussion pages in markdown.
+
+Regular pages could be used as change pages as well,
+if they provide subpages in a format describing changes.
+This would allow grouping and documenting changes.
+
+I'm still uncertain about many things, so please anyone feel free to comment.
+Specifically:
+
+ * Would it be possible to detect already applied changes
+ (without extra state, that is), and propose to "revert
+ changes" in that case?
+
+--[[JeremieKoenig]]
diff --git a/doc/todo/pretty-print_OpenIDs_even_if_not_enabled.mdwn b/doc/todo/pretty-print_OpenIDs_even_if_not_enabled.mdwn
new file mode 100644
index 000000000..3d4338a78
--- /dev/null
+++ b/doc/todo/pretty-print_OpenIDs_even_if_not_enabled.mdwn
@@ -0,0 +1,29 @@
+A feature I originally requested on
+[[a_related_bug|bugs/openid_no_longer_pretty-prints_OpenIDs]]:
+
+ Allow the openid plugin to be loaded but disabled, for its side-effect of defining IkiWiki::openiduser
+
+ On various sites I have two IkiWiki instances running from the same
+ repository: one accessible via http and only accepting openid logins,
+ and one accessible via authenticated https and only accepting httpauth.
+ Ideally, the https version should still pretty-print OpenIDs seen in
+ git history.
+
+--[[smcv]]
+
+> I wonder if an option is the best approach. Maybe it would be better to
+> simply move `openiduser` into `userlink`, and thus always support openid
+> usernames whether the plugin is enabled or not. --[[Joey]]
+
+>> OK, implemented that as 'smcv/always-openid'; if you don't think that's
+>> bloating the IkiWiki core too much, please consider merging. The poll on
+>> [[news/openid]] indicates fairly strong support for *only* accepting OpenID
+>> logins, so I think recognising OpenIDs can reasonably be considered core
+>> functionality! --[[smcv]]
+
+>>> That seemed easier than expected, [[done]].
+>>> (I do wonder if the call to openiduser still needs to be evaled --
+>>> it was probably only evaled before in case it was not available, but
+>>> I have not carefully checked it to make sure it doesn't ever die.) --[[Joey]]
+
+[[!tag patch]]
diff --git a/doc/todo/preview_changes.mdwn b/doc/todo/preview_changes.mdwn
new file mode 100644
index 000000000..28797b452
--- /dev/null
+++ b/doc/todo/preview_changes.mdwn
@@ -0,0 +1,14 @@
+When editing a page, it would help to have a "preview changes" or "show diff" button, which brings up a diff from the current page content to the proposed new page content. --[[JoshTriplett]]
+
+Some discussion from the main [[/index/discussion]] page:
+
+>It would be nice to be able to have a button to show "Differences" (or "Show Diff") when
+>editing a page. Is that an option that can be enabled?
+>
+>> It's doable, it could even be done by a [[todo/plugin]], I think.
+>> --[[Joey]]
+>>
+
+See the [[plugins/editdiff]] plugin --[[JeremieKoenig]]
+
+[[done]]
diff --git a/doc/todo/preview_changes_before_git_commit.mdwn b/doc/todo/preview_changes_before_git_commit.mdwn
new file mode 100644
index 000000000..187497cf4
--- /dev/null
+++ b/doc/todo/preview_changes_before_git_commit.mdwn
@@ -0,0 +1,17 @@
+ikiwiki allows committing changes to the doc wiki over the `git://...` protocol.
+It would be nice if there were a uniform way to view these changes before `git
+push`ing. For the GNU Hurd's web pages, we include a *render_locally* script,
+<http://www.gnu.org/software/hurd/render_locally>, with instructions on
+<http://www.gnu.org/software/hurd/contributing/web_pages.html>, section
+*Preview Changes*. With ikiwiki, one can use `make docwiki`, but that excludes
+a set of pages, as per `docwiki.setup`. --[[tschwinge]]
+
+> `ikiwiki -setup some.setup --render file.mdwn` will build the page and
+> dump it to stdout. So, for example:
+
+ ikiwiki -setup docwiki.setup --render doc/todo/preview_changes_before_git_commit.mdwn | w3m -T text/html
+
+> You have to have a setup file, though it suffices to make up your own
+> if you don't have the real one. Using ikiwiki.info's real setup file
+> won't actually work since it uses a search plugin that gets unhappy
+> if this is not in `/srv/web/ikiwiki.info`. --[[Joey]]
diff --git a/doc/todo/progressbar_plugin.mdwn b/doc/todo/progressbar_plugin.mdwn
new file mode 100644
index 000000000..12aef5ebb
--- /dev/null
+++ b/doc/todo/progressbar_plugin.mdwn
@@ -0,0 +1,132 @@
+I would like to add a new plugin to Ikiwiki: `progressbar`, or simply `progress`.
+I'm not sure which plugin name is better, probably the shorter one ;) I know that
+[DokuWiki](http://wiki.splitbrain.org/plugin:progressbar) has a similar plugin,
+so I think it can be useful for Ikiwiki users too.
+
+Here is a proposal for the plugin syntax:
+
+ \[[!progress done=50]]
+
+Of course, the `done` argument is an integer from 0 to 100.
+
+And here is its HTML result:
+
+ <div class="progress">
+ <div class="progress-done" style="width: 50%">50%</div>
+ </div>
+
+Note: I tried `<span>` tags too, but that tag is inline, so I can't set the
+`width` property on it.
+
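As a sanity check, the markup above can be produced by a couple of lines of code; here is a hedged sketch (a hypothetical `progress_html` helper, not the proposed plugin's actual code) that also clamps out-of-range values:

```python
def progress_html(done):
    """Render the progress bar markup, clamping done to the 0-100 range."""
    done = max(0, min(100, int(done)))
    return ('<div class="progress">\n'
            '<div class="progress-done" style="width: {0}%">{0}%</div>\n'
            '</div>'.format(done))
```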
+> In the poll plugin, I ended up using a `<hr>` for the progress-like
+> thing. One reason I did so is because it actually works in text-mode
+> browsers (w3m, lynx), that do not support css or colorized
+> divs. Since the hr is an element they display, just setting its width can
+> make a basic progress-type display. The style then makes it display
+> better in more capable browsers.
+>
+> The other advantage to that approach is that the htmlscrubber lets
+> through the `class` and `width` fields, which are all that is needed for
+> it to work. No need to work around the htmlscrubber.
+>
+> So I suggest adapting this to use similar html. --[[Joey]]
+
+>> I just had a brief play with this. It seems there are some trade-offs involved.
+>> The `width` attribute of an `<hr>` tag is deprecated, but that's not the big one.
+>> I can't see how to place text next to an `<hr>` tag. I note that in the
+>> [[plugins/poll]] plugin there is text above and below the 'graph line', but none
+>> on the same line as the graph. I prefer the way the current code renders,
+>> with the percentage complete appearing as text inside the graph.
+>>
+>> So, if we use `hr` we get:
+>>
+>> - Graph line on text / non-css browsers
+>> - No percentage complete text on the same line as the graph line
+>> - Deprecated HTML
+>>
+>> If we use `div` we get:
+>>
+>> - Need to clean up after HTMLScrubber (which is not hard - already implemented)
+>> - Get the percentage written as text on text / non-css browsers
+>> - Get the percentage on the same line as the graph in css browsers
+>>
+>> I'm strongly in favour of having the percentage text label on the graph, and on
+>> text based browsers I think having the text label is enough -- the lack of the line
+>> in that case doesn't bother me.
+>> So, given the choice between the two suggested techniques, I'd take the second and
+>> stay with div... unless you know how to get text next to (or within) an `<hr>` tag. -- [[Will]]
+
+Default CSS styles for the plugin could look like this:
+
+ div.progress {
+ border: 1px solid #ddd;
+ /* border: 2px solid #ddd; */
+ width: 200px;
+ background: #fff;
+ padding: 2px;
+ /* padding: 0px; */
+ border: 2px solid #aaa;
+ background: #eee;
+ }
+ div.progress-done {
+ height: 14px;
+ background: #ff6600;
+ font-size: 12px;
+ text-align: center;
+ vertical-align: middle;
+ }
+
+You can use the alternative, commented-out CSS for `div.progress` if you dislike
+the padding around the done strip.
+
+Any comments? --[[Paweł|ptecza]]
+
+> Please make sure to always set a foreground color if a background color is
+> set, and use '!important' so the foreground color can be overridden. (CSS
+> best practices) --[[Joey]]
+
+>> Below is the CSS I've been using -- [[Will]]
+
+ div.progress {
+ margin-top: 1ex;
+ margin-bottom: 1ex;
+ border: 1px solid #888;
+ width: 400px;
+ background: #eee;
+ color: black !important;
+ padding: 1px;
+ }
+ div.progress-done {
+ background: #ea6 !important;
+ color: black !important;
+ text-align: center;
+ padding: 1px;
+ }
+
+> This looks like a nice idea. If I could add one further suggestion: Allow your
+> ratio to be a pair of pagespecs. Then you could have something like:
+
+ \[[!progress totalpages="bugs/* and backlink(milestoneB)" donepages="bugs/* and backlink(milestoneB) and !link(bugs/done)"]]
+
+> to have a progress bar marking how many bugs are complete for a
+> particular milestone. -- [[Will]]
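Will's done/total idea boils down to counting two sets of matched pages; a minimal sketch of that arithmetic, with plain lists standing in for the pages a real pagespec would match:

```python
def progress_from_pages(total_pages, done_pages):
    """Percentage for the bar, where the arguments stand in for the pages
    matched by totalpages/donepages; an empty total yields 0 rather than
    dividing by zero."""
    total = set(total_pages)
    if not total:
        return 0
    done = set(done_pages) & total
    return int(100 * len(done) / len(total))
```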
+
+>> Thanks a lot for your comment, Will! It seems very interesting to me.
+>> I need to think more about improving that plugin. --[[Paweł|ptecza]]
+
+>> Attached is a [[patch]] (well, source) for this. You also need to add the proposed CSS above to `style.css`.
+>> At the moment this plugin interacts poorly with the [[plugins/htmlscrubber]] plugin.
+>> The HTMLScrubber plugin removes the `style` attribute from the `progress-done` `div` tag, and so it defaults
+>> to a width of 100%. -- [[Will]]
+
+>>> Thank you for the code! I know how to fix that problem, because I had
+>>> the same issue while writing [[todo/color_plugin]] :) --[[Paweł|ptecza]]
+
+>>>> Ahh - good idea. Patch updated to work with HTMLScrubber. --[[Will]]
+
+>>>>> I like it, but I think that Joey should take a look at that patch too :)
+>>>>> --[[Paweł|ptecza]]
+
+>>>>>> Reviewed, looks excellent, added. [[done]] --[[Joey]]
+
+>>>>>>> Thanks a lot to you and Will! :) [[Paweł|ptecza]]
diff --git a/doc/todo/provide_a_mailing_list.mdwn b/doc/todo/provide_a_mailing_list.mdwn
new file mode 100644
index 000000000..6e0cd51e6
--- /dev/null
+++ b/doc/todo/provide_a_mailing_list.mdwn
@@ -0,0 +1,40 @@
+I am aware that you (Joey) [[made_a_choice_against_a_mailing_list|contact]],
+but I would like to see you reconsider.
+
+The wiki works well for bugs and
+collaborating on the software, but not for simple user support issues. For
+instance, I spent the last three days waiting for any form of reply to my
+question on IRC (none so far):
+
+ 09 15:49 < madduck> any ideas how to implement a two-level menu
+ 09 15:40 < madduck> like http://www.abacons.ch/leistungen/treuhand.html ?
+ 09 15:50 < madduck> the top bar are the main sections
+ 09 15:50 < madduck> and then i might need subsections (further down) under each
+ 09 15:50 < madduck> i'd like to do this without hardcoding sections in the template
+ 09 15:50 < madduck> but at least the main sections are to appear on all pages
+ 09 15:50 < madduck> so the template needs a slot, i just wonder how best to fill it
+ 09 15:51 < madduck> i can only ever use one navbar it seems
+ 09 15:51 < madduck> and i already need the sidebar for something else
+
+I would not know where to take this question on the wiki itself.
+
+A mailing list is made for this kind of question, and as we pick up even
+more users, the number of such requests will also increase. --[[madduck]]
+
+> I wouldn't mind a mailing list myself. For many people, e-mail is more
+> efficient to process, particularly offline, than wiki pages. Also,
+> ikiwiki's discussion pages require a fair amount of discipline from
+> users to make it easy to follow a long discussion. On the other hand,
+> it would be interesting to make improvements to ikiwiki
+> (read: plugins) to see if it's possible to accommodate both
+> mail people and online people.
+> --[liw](http://liw.fi)
+
+>> The [Zwiki](http://zwiki.org) project has ideas to mine in this area. We explored various permutations of wiki and mail list.
+>> Currently Zwiki's [email features](http://zwiki.org/Mail) are good enough that we use it as the mail list and don't run separate
+>> ml software. This is simple and mostly satisfying. Now and then we miss the familiarity and industrial
+>> strength of a standard mail list and its simple time-based archive. We do gateway with [gmane](http://news.gmane.org/gmane.comp.web.zope.zwiki) so people
+>> can use that as an alternative. I'm happy to chat about this, ping me..
+>> --[sm](http://joyful.com)
+
+[[!tag wishlist]]
diff --git a/doc/todo/provide_inline_diffs_in_recentchanges.mdwn b/doc/todo/provide_inline_diffs_in_recentchanges.mdwn
new file mode 100644
index 000000000..3bf1bdc33
--- /dev/null
+++ b/doc/todo/provide_inline_diffs_in_recentchanges.mdwn
@@ -0,0 +1,27 @@
+[[!template id=gitbranch branch=anarcat/inline_diffs author="[[anarcat]]"]]
+
+It would rock if I could view diffs from the web without going via feeds. I envision toggle-style buttons on the recentchanges page, or just links to the CGI, which then displays the diff... --[[madduck]]
+
+> The diffs are actually there, enabled by the [[plugins/recentchangesdiff]]
+> plugin, but they are hidden in the XHTML version by the stylesheet.
+> You might try a user stylesheet with `div.diff { display: block }`.
+> --[[JasonBlevins]]
+
+> > I have implemented this in a branch in my repository (see the side box).
+> >
+> > Unfortunately it has some issues:
+> >
+> > 1. <del>it assumes the toggle.js code is loaded somehow</del> - now loaded manually
+> > 2. <del>if the toggle code isn't loaded the diffs are displayed (which is arguably better than showing nothing since we ship the diff to the UA anyways...)</del> - i actually think that's fine
+> > 3. <del>it will show only if there's a revert URL, which is backwards, but otherwise the display is weird, with each button on its own line</del> fixed!
+> > 4. <del>if the diffurl parameter is set in the template, we'd actually see two sets of glasses, which is silly.</del> - just added a tmp_unless to fix this.
+> >
+> > I feel this should nevertheless be implemented because if we're going to compile all this crap in the page anyways and send it to the client, why not allow the user to show it? I also feel that showing it by default is a lesser evil for non-javascript users.
+> >
+> > -- [[anarcat]] 2012-03-03
+
+> > > I have pushed a new version of this patch to my branch, which fixes all the above issues. I think this is ready to be merged now. -- [[anarcat]] 2012-07-19
+
+>>>> [[done]] --[[Joey]]
+
+[[!tag wishlist patch]]
diff --git a/doc/todo/provide_sha1_for_git_diffurl.mdwn b/doc/todo/provide_sha1_for_git_diffurl.mdwn
new file mode 100644
index 000000000..01aa512f8
--- /dev/null
+++ b/doc/todo/provide_sha1_for_git_diffurl.mdwn
@@ -0,0 +1,26 @@
+This [[patch]] allows for `\[[sha1]]` substitution in the `diffurl`
+for git repositories. This is useful for use with [cgit][] which has
+diffurls of the following form:
+
+ /project.git/diff/\[[file]]?id=\[[sha1_commit]]
+
+ [cgit]: http://hjemli.net/git/cgit/
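The substitution the patch adds is plain template expansion; here is a sketch of the same idea outside ikiwiki (a hypothetical helper, using the `sha1_commit` name from the example URL above):

```python
def expand_diffurl(template, file, sha1_commit):
    """Fill a cgit-style diffurl template with the file path and commit id."""
    return (template.replace("[[file]]", file)
                    .replace("[[sha1_commit]]", sha1_commit))
```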
+
+ diff --git a/IkiWiki/Plugin/git.pm b/IkiWiki/Plugin/git.pm
+ index 5bef928..164210d 100644
+ --- a/IkiWiki/Plugin/git.pm
+ +++ b/IkiWiki/Plugin/git.pm
+ @@ -518,6 +518,7 @@ sub rcs_recentchanges ($) {
+
+ my $diffurl = defined $config{'diffurl'} ? $config{'diffurl'} : "";
+ $diffurl =~ s/\[\[file\]\]/$file/go;
+ + $diffurl =~ s/\[\[sha1\]\]/$sha1/go;
+ $diffurl =~ s/\[\[sha1_parent\]\]/$ci->{'parent'}/go;
+ $diffurl =~ s/\[\[sha1_from\]\]/$detail->{'sha1_from'}/go;
+ $diffurl =~ s/\[\[sha1_to\]\]/$detail->{'sha1_to'}/go;
+
+> [[done]], but I called it `sha1_commit` since I think that's what it's
+> actually a sha1 of. --[[Joey]]
+
+>> I was at a loss for something more descriptive...I like that much
+>> better :) Thanks! --[[JasonBlevins]]
diff --git a/doc/todo/publishing_in_the_future.mdwn b/doc/todo/publishing_in_the_future.mdwn
new file mode 100644
index 000000000..55fe3aa1f
--- /dev/null
+++ b/doc/todo/publishing_in_the_future.mdwn
@@ -0,0 +1,127 @@
+[[!tag wishlist]]I would quite like the ability to write a page (blog post in
+practice) but for the page to not be displayed until a date and time after it
+is added to the wiki. I've thought this through a bit, but would appreciate
+feedback from people before I go any further. Would anyone else find this
+useful?
+
+Thinking about how to implement this in ikiwiki, perhaps a conditional
+pagespec would be best (which could be tidied up into a template)
+
+ \[[!if test="current_date_before(<TMPL_VAR date>)"
+ then="""[[!tag draft]]"""
+ else="""[[!meta date="<TMPL_VAR date>"]]"""
+ ]]
+
+…pre-supposing a scheme whereby tagging 'draft' hides the page from an
+aggregation somewhere. With a template, this could collapse to
+
+ \[[!template id=publishafter date="Thu Aug 30 14:13:06 BST 2012"]]
+
+This would require implementing the `current_date_before` pagespec.
+
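That pagespec reduces to a single comparison against the current time; a minimal Python sketch of the idea, assuming ISO 8601 date strings rather than the free-form dates Perl's `str2time` accepts:

```python
from datetime import datetime

def current_date_before(date_str):
    """True if the given ISO 8601 timestamp lies in the past."""
    return datetime.fromisoformat(date_str) < datetime.now()
```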
+You would also need a regularly scheduled wiki refresh and a way of marking the
+unpublished pages as 'dirty' so that they are always scanned on refresh until their
+publish date has passed. That could perhaps be implemented via a small plugin
+which defined a pagespec which ensured the page was 'dirty':
+
+ \[[!meta date="<TMPL_VAR date>"]]
+ \[[!if test="!current_date_before(<TMPL_VAR date>)"
+ then="""[[!tag draft]][[!dirty]]"""
+ ]]
+
+The following is an attempt at the dirty part:
+
+ #!/usr/bin/perl
+ package IkiWiki::Plugin::dirty;
+ # provides a pagespec 'dirty' which ensures the page will always be
+ # re-scanned for content on wiki refresh.
+
+ use warnings;
+ use strict;
+ use IkiWiki 3.00;
+
+ hook(type => "preprocess", id => "dirty", call => \&preprocess);
+ hook(type => "needsbuild", id => "dirty", call => \&needsbuild);
+
+ sub preprocess (@) {
+ my %params = @_;
+ $pagestate{$params{page}}{dirty}{dirty} = 1;
+ return '';
+ }
+
+ sub needsbuild (@) {
+ my $pages= shift;
+ my $deleted= shift || []; # second hook argument; absent in older ikiwiki
+ my %p2 = map { $_ => 1 } @$pages;
+ my %d2 = map { $_ => 1 } @$deleted;
+
+ foreach my $page (keys %pagestate) {
+ if(exists $pagestate{$page}{dirty}{dirty}) {
+ push @$pages, $pagesources{$page} unless
+ (exists $p2{$pagesources{$page}} or exists $d2{$pagesources{$page}});
+ delete $pagestate{$page}{dirty}{dirty};
+ }
+ }
+
+ return $pages;
+ }
+
+ 1
+
+Although it doesn't fit, the `current_date_before` pagespec could be implemented
+in the same plugin. I tried the following (before the trailing `1`):
+
+ package IkiWiki::PageSpec;
+ use Date::Parse;
+
+ sub match_current_date_before ($$;@) {
+ shift;
+ my $date = shift;
+ my $out = str2time($date);
+ if(defined $out) {
+ return IkiWiki::SuccessReason->new("time before now") if $out < time();
+ return IkiWiki::FailReason->new("time not before now");
+ } else { return IkiWiki::ErrorReason->new("couldn't parse time $date")};
+ }
+
+I always hit the `ErrorReason` branch when I try to use it, even with strings
+which work fine in test scripts. If anyone can help me debug that, I'd be very
+grateful.
+
+Thoughts on the whole idea? — [[Jon]]
+
+> There is an old todo about it: [[tagging_with_a_publication_date]].
+> I feel my idea there about making a pagespec that is limited to
+> items in the present/past, combined with setting the meta data, is a good
+> way.. --[[Joey]]
+
+>> Thanks for your response Joey. Should I merge these two TODOs, then?
+>> So if I understand you correctly, you would prefer some new pagespecs
+>> to match future/past dates, and a plugin which kept track of pages with
+>> a future date and kept them 'dirty' (similar to the above), which means
+>> avoiding the need for a `dirty` pagespec in the page itself. Is that
+>> about right?
+>>
+>> I came up with the following, but I haven't adapted `dirty.pm` inline
+>> with my understanding above, yet.
+
+ sub match_infuture ($$;@) {
+ my $page = shift;
+ return IkiWiki::SuccessReason->new("page time is in the future")
+ if $IkiWiki::pagectime{$page} > time;
+ return IkiWiki::FailReason->new("page time is not in the future");
+ }
+
+>> I've managed to get my original suggestion working. The problem was
+>> I was using quotes when invoking the pagespec, which stopped `str2time`
+>> working.
+>>
+>> Let me know if I've understood your POV correctly and I'll see about
+>> tidying this up and putting it in a branch.
+>>
+>> Finally, a way of scheduling future runs of ikiwiki *within ikiwiki
+>> itself* might be useful for other things too, and would avoid the
+>> need for a cron job in this case. (I'm thinking of a plugin that
+>> implemented itself in terms of cron, or at, or both, or possibly
+>> other things depending on what people want to support). But that would
+>> be substantially more work, more than I can afford atm at least. — [[Jon]]
diff --git a/doc/todo/quieten-bzr.mdwn b/doc/todo/quieten-bzr.mdwn
new file mode 100644
index 000000000..c6d83a519
--- /dev/null
+++ b/doc/todo/quieten-bzr.mdwn
@@ -0,0 +1,28 @@
+The _bzr_ plugin echoes "added: somefile.mdwn" when it adds somefile.mdwn to the repository. As a result, the redirect performed after a new article is created fails, because the _bzr_ output comes before the HTTP headers.
+
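For context, a CGI response has to begin with its headers; a hedged Python sketch (not ikiwiki's code) of why any stray output printed first breaks the redirect:

```python
def cgi_redirect(stray_output=""):
    """Build a CGI redirect response; stray_output models text a VCS
    command printed to stdout before the headers were emitted."""
    headers = "Status: 302 Found\r\nLocation: /new_page/\r\n\r\n"
    return stray_output + headers

# A clean response starts with the Status header; one polluted by
# "added: somefile.mdwn" no longer does, so the browser never redirects.
```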
+The fix is simply to call `bzr` with the _--quiet_ switch. Something like this applied to _bzr.pm_ works for me:
+
+ 46c46
+ < my @cmdline = ("bzr", $config{srcdir}, "update");
+ ---
+ > my @cmdline = ("bzr", "update", "--quiet", $config{srcdir});
+ 74c74
+ < my @cmdline = ("bzr", "commit", "-m", $message, "--author", $user,
+ ---
+ > my @cmdline = ("bzr", "commit", "--quiet", "-m", $message, "--author", $user,
+ 86c86
+ < my @cmdline = ("bzr", "add", "$config{srcdir}/$file");
+ ---
+ > my @cmdline = ("bzr", "add", "--quiet", "$config{srcdir}/$file");
+ 94a95,97
+ > eval q{use CGI 'escapeHTML'};
+ > error($@) if $@;
+ >
+
+
+[[!tag patch]]
+
+> [[done]], although I left off the escapeHTML thing which seems to be in
+> your patch by accident.
+>
+> (Please use diff -u BTW..) --[[Joey]]
diff --git a/doc/todo/rcs.mdwn b/doc/todo/rcs.mdwn
new file mode 100644
index 000000000..3cc572516
--- /dev/null
+++ b/doc/todo/rcs.mdwn
@@ -0,0 +1,25 @@
+Here is the beginning of an rcs plugin that uses rcsmerge, rcs, ci, co and rlog.
+I have used it probably over a hundred times, but it needs some work.
+
+<http://www.reedmedia.net/~reed/tmp-sfhkcjkfrfh/rcs.pm>
+
+[[!tag patch]]
+
+> Clearly needs some cleanup and perhaps some of the missing stubs
+> implemented, before it can be included into ikiwiki.
+>
+> Notes on individual functions:
+>
+> * rcs_prepedit - I'm not sure why you do the locking since the comment
+> notes that the locking does no good..
+>
+> * rcs_getctime - You ask why this would be better than mtime. It's
+> because with something like subversion, a file's modification time or
+> ctime is not necessarily accurate WRT when the file was first checked
+> into the repo.
+>
+--[[Joey]]
+
+Also here is a quick script to browse the RCS history to use for "historyurl".
+
+<http://www.reedmedia.net/~reed/tmp-sfhkcjkfrfh/rcshistory.txt>
diff --git a/doc/todo/rcs__95__diff_implementation_for_Mercurial_backend__44___based_on_Git_backend.mdwn b/doc/todo/rcs__95__diff_implementation_for_Mercurial_backend__44___based_on_Git_backend.mdwn
new file mode 100644
index 000000000..42cfe1a19
--- /dev/null
+++ b/doc/todo/rcs__95__diff_implementation_for_Mercurial_backend__44___based_on_Git_backend.mdwn
@@ -0,0 +1,40 @@
+(**Note:** this patch is built on top of [[Attempt to extend Mercurial backend support]] and [[rcs__95__get__123__c__44__m__125__time_implementation_for_Mercurial_backend__44___based_on_Git_backend]]. The former is needed for the `safe_hg()` definition. The latter only shows up in the very last line matching of this patch.)
+
+A copy of the `rcs_diff` implementation in `git.pm` with a few changes. Mercurial provides the `hg diff -g` switch, which outputs the diff in Git format, making the implementation easy. I think it's a good idea to base `mercurial.pm` as much as possible on `git.pm`, to simplify things and benefit from the maintenance of `git.pm`, which is probably more widely used.
+
+[Patch at my hg repo](http://510x.se/hg/program/ikiwiki/diff/cc73d670bf99/Plugin/mercurial.pm) ([raw format](http://510x.se/hg/program/ikiwiki/raw-file/cc73d670bf99/Plugin/mercurial.pm)).
+
+--[[Daniel Andersson]]
+
+> Guess that makes sense, [[done]] --[[Joey]]
+
+---
+
+ diff -r 1b6c46b62a28 -r cc73d670bf99 Plugin/mercurial.pm
+ --- a/Plugin/mercurial.pm Tue Jul 19 13:35:17 2011 +0200
+ +++ b/Plugin/mercurial.pm Tue Jul 19 13:35:37 2011 +0200
+ @@ -307,7 +307,23 @@
+ }
+
+ sub rcs_diff ($;$) {
+ - # TODO
+ + my $rev=shift;
+ + my $maxlines=shift;
+ + my @lines;
+ + my $addlines=sub {
+ + my $line=shift;
+ + return if defined $maxlines && @lines == $maxlines;
+ + push @lines, $line."\n"
+ + if (@lines || $line=~/^diff --git/);
+ + return 1;
+ + };
+ + safe_hg(undef, $addlines, "hg", "diff", "-c", $rev, "-g");
+ + if (wantarray) {
+ + return @lines;
+ + }
+ + else {
+ + return join("", @lines);
+ + }
+ }
+
+ {
diff --git a/doc/todo/rcs__95__get__123__c__44__m__125__time_implementation_for_Mercurial_backend__44___based_on_Git_backend.mdwn b/doc/todo/rcs__95__get__123__c__44__m__125__time_implementation_for_Mercurial_backend__44___based_on_Git_backend.mdwn
new file mode 100644
index 000000000..54ab4ad3a
--- /dev/null
+++ b/doc/todo/rcs__95__get__123__c__44__m__125__time_implementation_for_Mercurial_backend__44___based_on_Git_backend.mdwn
@@ -0,0 +1,157 @@
+(**Note:** this patch is built on top of the patch discussed at [[Attempt to extend Mercurial backend support]]. The `run_or_die()` function declared therein is needed for this patch to run.)
+
+Patch to change the Mercurial entries for `rcs_getctime` and `rcs_getmtime` from "slow"/"no" to "fast"/"fast" in [[/rcs]].
+
+The patch is mostly a slightly modified copy of the code in `git.pm`. The exception is that a Mercurial style file is needed to get reasonable output from `hg log`. To make the file self-contained in its current state, this was solved with a generated temp file, but that section could and should be replaced with just setting `$tmpl_filename` to a path to a static file `map-cmdline.ikiwiki-log` (to conform with Mercurial's naming of its default styles) in the Ikiwiki distribution, with contents
+
+ changeset = "{date}\n{files}\n"
+ file = "{file}\n"
+
+which is based on an [example](http://hgbook.red-bean.com/read/customizing-the-output-of-mercurial.html#id417978) in [Mercurial: The Definitive Guide](http://hgbook.red-bean.com/) (and otherwise fascinatingly undocumented). A style *file* is required for this kind of formatting. There is a switch `hg log --template` to directly control simple output formatting, but in this case, the `{file}` directive must be redefined, which can only be done with `hg log --style`.
+
+If `{file}` is not redefined, all filenames are output on a single line separated with a space. It is not possible to conclude if the space is part of a filename or just a separator, and thus impossible to use in this case. Some output filters are available in hg, but they are not fit for this cause (and would slow down the process unnecessarily).
+
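The parsing this style file enables is straightforward; a hedged Python sketch of the findtimes logic (an illustration, not the Perl patch itself), assuming newest-first `hg log` order so the first date seen per file is its mtime and the last its ctime:

```python
import re

def parse_hg_times(log_lines):
    """Turn 'changeset = "{date}\\n{files}\\n"' styled output into a
    {file: (mtime, ctime)} map; a blank line ends each changeset."""
    times = {}
    date = None
    for line in log_lines:
        m = re.match(r'^(\d+)\.\d[+-]\d*$', line)
        if date is None and m:
            date = int(m.group(1))  # UTC timestamp, tz offset discarded
        elif not line:
            date = None             # changeset boundary
        else:
            mtime, _ = times.get(line, (date, date))
            times[line] = (mtime, date)
    return times
```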
+In the patch listing below, I've marked the parts of the patch that should be removed when the tempfile replacement is done with **Marker# start** and **Marker# end**.
+
+[Patch at pastebin](http://pastebin.com/QBE4UH6n).
+
+[Patch at pastebin with tempfile code replaced by a path to a static file (change path accordingly)](http://pastebin.com/dmSCRkUK).
+
+[My `mercurial.pm` in raw format after this and the aforementioned patches (tempfile code present)](http://510x.se/hg/program/ikiwiki/raw-file/1b6c46b62a28/Plugin/mercurial.pm).
+
+--[[Daniel Andersson]]
+
+> I have applied this, but I left the temp file in.
+> The overhead seems small since it will only be run once per ikiwiki run,
+> and only when `ikiwiki --gettime` is run, or the first time
+> ikiwiki runs. Thanks for this! [[done]] --[[Joey]]
+
+---
+
+ diff -r 78a217fb13f3 -r 1b6c46b62a28 Plugin/mercurial.pm
+ --- a/Plugin/mercurial.pm Sat Jul 16 03:19:25 2011 +0200
+ +++ b/Plugin/mercurial.pm Tue Jul 19 13:35:17 2011 +0200
+ @@ -310,28 +310,91 @@
+ # TODO
+ }
+
+ -sub rcs_getctime ($) {
+ - my ($file) = @_;
+ +{
+ +my %time_cache;
+
+ - my @cmdline = ("hg", "-R", $config{srcdir}, "log", "-v",
+ - "--style", "default", "$config{srcdir}/$file");
+ - open (my $out, "-|", @cmdline);
+ +sub findtimes ($$) {
+ + my $file=shift;
+ + my $id=shift; # 0 = mtime ; 1 = ctime
+
+ - my @log = (mercurial_log($out));
+ + if (! keys %time_cache) {
+ + my $date;
+
+ - if (@log < 1) {
+ - return 0;
+
+**Marker1 start**
+
+ + # The tempfile logic should be replaced with a file included
+ + # with ikiwiki containing
+ + # --
+ + # changeset = "{date}\n{files}\n"
+ + # file = "{file}\n"
+ + # --
+ + # to avoid creating a file with static contents every time this
+ + # function is called. The path to this file should replace
+ + # $tmpl_filename in run_or_die() below.
+ + #
+
+**Marker1 end**
+
+ + # It doesn't seem possible to specify the format wanted for the
+ + # changelog (same format as is generated in git.pm:findtimes(),
+ + # though the date differs slightly) without using a style
+ + # _file_. There is a "hg log" switch "--template" to directly
+ + # control simple output formatting, but in this case, the
+ + # {file} directive must be redefined, which can only be done
+ + # with "--style".
+ + #
+ + # If {file} is not redefined, all files are output on a single
+ + # line separated with a space. It is not possible to conclude
+ + # if the space is part of a filename or just a separator, and
+ + # thus impossible to use in this case.
+ + #
+ + # Some output filters are available in hg, but they are not fit
+ + # for this cause (and would slow down the process
+ + # unnecessarily).
+ +
+
+**Marker2 start**
+
+ + use File::Temp qw(tempfile);
+ + my ($tmpl_fh, $tmpl_filename) = tempfile(UNLINK => 1);
+ +
+ + print $tmpl_fh 'changeset = "{date}\\n{files}\\n"' . "\n";
+ + print $tmpl_fh 'file = "{file}\\n"' . "\n";
+ +
+
+**Marker2 end**
+
+ + foreach my $line (run_or_die('hg', 'log', '--style',
+ + $tmpl_filename)) {
+ + # {date} gives output on the form
+ + # 1310694511.0-7200
+ + # where the first number is UTC Unix timestamp with one
+ + # decimal (decimal always 0, at least on my system)
+ + # followed by local timezone offset from UTC in
+ + # seconds.
+ + if (! defined $date && $line =~ /^\d+\.\d[+-]\d*$/) {
+ + $line =~ s/^(\d+).*/$1/;
+ + $date=$line;
+ + }
+ + elsif (! length $line) {
+ + $date=undef;
+ + }
+ + else {
+ + my $f=$line;
+ +
+ + if (! $time_cache{$f}) {
+ + $time_cache{$f}[0]=$date; # mtime
+ + }
+ + $time_cache{$f}[1]=$date; # ctime
+ + }
+ + }
+
+**Marker3 start**
+
+ + close ($tmpl_fh);
+
+**Marker3 end**
+
+ }
+
+ - eval q{use Date::Parse};
+ - error($@) if $@;
+ -
+ - my $ctime = str2time($log[$#log]->{"date"});
+ - return $ctime;
+ + return exists $time_cache{$file} ? $time_cache{$file}[$id] : 0;
+ +}
+ +
+ +}
+ +
+ +sub rcs_getctime ($) {
+ + my $file = shift;
+ +
+ + return findtimes($file, 1);
+ }
+
+ sub rcs_getmtime ($) {
+ - error "rcs_getmtime is not implemented for mercurial\n"; # TODO
+ + my $file = shift;
+ +
+ + return findtimes($file, 0);
+ }
+
+ 1
diff --git a/doc/todo/rcs_updates_needed.mdwn b/doc/todo/rcs_updates_needed.mdwn
new file mode 100644
index 000000000..472a5800f
--- /dev/null
+++ b/doc/todo/rcs_updates_needed.mdwn
@@ -0,0 +1,10 @@
+I've added three new functions to the ikiwiki VCS interface to support
+renaming and removing files using the web interface. The mercurial and
+tla [[rcs]] backends need implementations of these functions.
+
+(The maintainers of these backends have been mailed. --[[Joey]])
+
+Also, currently git is the only VCS to have support for
+[[untrusted_push|tips/untrusted_git_push]]. It _may_ be possible to
+implement it for other DVCS, if they offer a hook that can be used to check
+incoming pushes early.
diff --git a/doc/todo/recentchanges.mdwn b/doc/todo/recentchanges.mdwn
new file mode 100644
index 000000000..25a8ea4db
--- /dev/null
+++ b/doc/todo/recentchanges.mdwn
@@ -0,0 +1,144 @@
+* Why isn't it statically generated, but generated dynamically by CGI? It
+ seems like it could be beneficial to have it rendered in the post-commit
+ hook, just like everything else in the wiki.
+
+ > I hope to statically generate it eventually; currently the problem is
+ > that it takes at least several seconds to generate the recentchanges
+ > page, and adding several seconds to every page edit is not desirable. If
+ > the time can be reduced it could be done; I'm also not averse to
+ > adding an optional way to statically render it even at the current
+ > speed. --[[Joey]]
+
+* Also, is it planned/desired that recent changes generate the same
+ information in RSS feed format? This seems like it could be a useful way
+ to keep track of the wiki as a whole.
+
+ > This is used by various interwiki type things, I think, so should be
+ > done.. --[[Joey]]
+
+* Lastly, would it be possible to use the recent changes code with a
+ pagespec? I understand this sort of infringes on territory covered by the
+ inline plugin, but the inline plugin only puts a page in the RSS feed
+ once, when it's created, and I imagine some people -- some deranged,
+ obsessive-compulsive people like myself -- would like to know about the
+ changes made to existing pages as well as newly-created pages.
+
+ > That would work rather well for pages like [[todo]] and [[bugs]], where
+ > you want to know about any updates, not just initial
+ > creation. --[[JoshTriplett]]
+
+ > Of course you can use email subscriptions for that too.. --[[Joey]]
+
+ >> I have more thoughts on this topic which I will probably write
+ >> tomorrow. If you thought my other patches were blue-sky, wait until
+ >> you see this. --Ethan
+
+OK, so here's how I see the RecentChanges thing. I write blog posts and
+the inline plugin generates RSS feeds. Readers of RSS feeds are notified
+of new entries but not changes to old entries. I think it's rude to change
+something without telling your readers, so I'd like to address this.
+To tell the user that there have been changes, we can tell the user which
+page has been changed, the new text, the RCS comment relating to
+the change, and a diff of the actual changes. The new text probably isn't
+too useful (I have a very hard time rereading things for differences),
+so any modifications to inline to re-inline pages probably won't help,
+even if it were feasible (which I don't think it is). So instead we
+turn to creating diffs automatically and (maybe) inlining them.
+
+I suggest that for every commit, a diff is created automagically
+but not committed to the RCS. The page containing this diff would be
+a "virtual page", which cannot be edited and is not committed.
+(Committing here would be bad, because then it would create a new
+commit, which would need a new diff, which would need to be committed,
+etc.) Virtual pages would "expire" and be deleted if they were not
+depended on in some way.
+
+Let's say these pages are created in edits/commit_%d.mdwn. RecentChanges
+would then be a page which did nothing but inline the last 50 `edits/*`.
+This would give static generation and RSS/Atom feeds. The inline
+plugin could be optionally altered to inline pages from `edits/*`
+that match any pages in its pagespec, and through this we could get
+a recent-changes+pagespec thing. You could also exclude edits that have
+"minor" in the commit message (or some other thing that marks them as
+unremarkable).
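+
+The RecentChanges page itself, under this proposal, might then contain
+little more than an inline directive over the hypothetical `edits/*`
+namespace (a sketch of the idea, not an implemented feature):
+
```
\[[!inline pages="edits/*" show=50 archive=yes]]
```
+
+Since inline already generates RSS/Atom feeds, readers would get change
+notifications without any new feed machinery.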
+
+You could make an argument that I care way too much about what amounts
+to edits anyhow, but like Josh says, there are use cases for this.
+While this could be done with mail subscriptions, I can think of sites
+where you might want to disable all auth so that people can't edit
+your pages. --Ethan
+
+> I really dislike all Wiki engine recentchanges pages. They all tend to be
+> fairly machine readable, but confusing for non-wiki users to grok. And I've
+> yet to see an _attractive_ recentchanges implementation. IkiWiki's is no
+> better or worse than the others.
+>
+> I really like the frontpage of [Bill
+> Seitz](http://webseitz.fluxent.com/wiki/FrontPage) as a recentchanges
+> format. Note how he uses some clever css to show changes in different
+> sections of the website. I modeled my own
+> [recentchanges](http://xtermin.us/recentchanges) page on his ideas. This
+> probably isn't appropriate for non-WikiLog style setups, but is this
+> something closer to what was requested?
+>
+> BTW: My recentchanges plugin does not seem to add a lot of processing time
+> to compiling. Then again, I'm not pulling changelog messages from the RCS
+> backend.
+>
+> -- CharlesMauch
+
+----
+
+Here's a full design for redoing recentchanges, based on Ethan's ideas:
+
+* Add a recentchanges plugin that has a preprocessor directive:
+ \[[!recentchanges num=100 pages=* template=recentchanges.tmpl]]
+ If put on the [[recentchanges]] page, this would result in up to 100
+ recentchanges/change_$id.mdwn files being created.
+* Which means the plugin has to store state and use a checkconfig hook
+ or the like to create the requested pages (and delete old ones) when
+ the wiki is rebuilt and when the post_commit hook is run.
+* Then it's a simple matter of using inline on the recentchanges page
+ to display the changes. (With a special template to display nicely.)
+* Rss/atom comes for free..
+* So drop mail notifications.
+* If someone wants to subscribe to notifications for only a subset
+ of pages, they can either filter the recentchanges in their rss
+ aggregator, or they can set up their own page that uses the recentchanges
+ directive for only the pages they want.
+* The `rcs_notify` functions will be removed.
+* To add diffs, another plugin can add a pagetemplate hook that calls
+ a `rcs_diff`. (optional)
+* So to update the changes files, just call `rcs_recentchanges`, create
+ files for each new id, and delete files for each id that is no longer
+ included.
+* The cgi support for recentchanges can be dropped, or moved to a different
+ plugin.
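+
+As a sketch, one generated `recentchanges/change_$id.mdwn` file might look
+something like this (the template fields shown are illustrative, not a
+final format):
+
```
\[[!template id=recentchanges
  rev="1234" user="joey" when="2007-07-21 12:00"
  message="fix typo" pages="index"]]
```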
+
+I'm unsure how fast this will all be, but by using regular pages, there's
+caching, at least. The main slowdown might turn out to be the inlining and
+not the generation of the changes pages. The current cgi recentchanges
+code saves a tenth of a second or so by memoizing htmllink, an optimisation
+that won't be available when using the more general inlining code.
+
+An obvious optimisation, and one implied by this design, is that each change
+file is only written once. This assumes that the data in them doesn't ever
+change, which actually isn't true (svn commit messages can be changed), but
+is probably close enough to true for our purposes.
+
+Another optimisation would be to htmlize the change files when they're
+written out -- avoids re-rendering a given file each time a new change is
+made (thus doing 1/100th the work).
+
+Links in the change files to the changed pages will need special handling.
+These links should not generate backlinks. They probably shouldn't be
+implemented as wikilinks at all. Instead, they should be raw, absolute
+html links to the pages that were changed.
+
+The only problem with this approach is that the links break if the changed
+page later gets deleted. I think that's acceptable. It could link to
+`ikiwiki.cgi?do=redir&page=foo`, but that's probably overkill.
+
+--[[Joey]]
+
+[[done]] !! (in this branch at least :-)
diff --git a/doc/todo/recentchanges_feed_with_comment.mdwn b/doc/todo/recentchanges_feed_with_comment.mdwn
new file mode 100644
index 000000000..4c32b9ca9
--- /dev/null
+++ b/doc/todo/recentchanges_feed_with_comment.mdwn
@@ -0,0 +1,5 @@
+There is currently no clean way to extract the actual "description" the user provided for a change in the recent changes. They get displayed in the "description" blob of the recent changes, but only as HTML and alongside the diff blob and other things.
+
+It would be nice if the user's "description" (the git commitlog, really) were a first-class citizen, because right now the RSS feed titles only say "user: change to page/blah", which is really not informative. We should at least have the commitlog available as a field. -- [[anarcat]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/recentchanges_path.mdwn b/doc/todo/recentchanges_path.mdwn
new file mode 100644
index 000000000..186047918
--- /dev/null
+++ b/doc/todo/recentchanges_path.mdwn
@@ -0,0 +1,9 @@
+# RecentChanges should show path to wiki sub-pages?
+
+RecentChanges only shows the final file name of a recently changed file,
+for example "discussion". It would be more useful to see "index/discussion",
+i.e. the path to the sub-page. I think this is handled by the htmllink() routine.
+
+> Went ahead and did this, IMHO the display is ok as-is. --[[Joey]]
+
+[[todo/done]]
diff --git a/doc/todo/recommend_libtext-markdown-discount_instead_of_depending.mdwn b/doc/todo/recommend_libtext-markdown-discount_instead_of_depending.mdwn
new file mode 100644
index 000000000..bb953ef8d
--- /dev/null
+++ b/doc/todo/recommend_libtext-markdown-discount_instead_of_depending.mdwn
@@ -0,0 +1,25 @@
+Would you consider Recommending `libtext-markdown-discount` instead of
+depending on it? It isn't available in wheezy, and the sid version is the
+wrong side of a perl transition, unfortunately. This is the only
+dependency preventing the stock sid version of ikiwiki from being
+installable on a wheezy host. -- [[Jon]]
+
+> That's a temporary problem. Actually, I just checked and it's already
+> resolved; discount is in testing now, as is the latest ikiwiki.
+>
+> Ikiwiki depends on enough perl modules
+> to make sure it will work, and since it uses discount by default,
+> it needs to depend on it. If I make the dependency on
+> `libtext-markdown-discount-perl | libtext-markdown-perl`,
+> then users will not automatically transition to using discount, which
+> I want them to do. [[done]] --[[Joey]]
+
+>> Sorry, I made a mistake in the phrasing of my original request. It's
+>> not installable on *squeeze*, which is what I care about, rather than
+>> *wheezy*. Someone needs to backport `libtext-markdown-discount` I
+>> guess. — [[Jon]]
+
+>>> For squeeze, it will be appropriate for an ikiwiki backport to
+>>> still depend on the old markdown. Although a discount backport would be
+>>> nice! I don't want the current ikiwiki to be held back by the requirement
+>>> that it be installable as-is on squeeze. --[[Joey]]
diff --git a/doc/todo/redirect_automatically_after_rename.mdwn b/doc/todo/redirect_automatically_after_rename.mdwn
new file mode 100644
index 000000000..1cbb824d2
--- /dev/null
+++ b/doc/todo/redirect_automatically_after_rename.mdwn
@@ -0,0 +1,10 @@
+In some wikis (e.g. Mediawiki), after [[renaming|plugins/rename]]
+a page, the old page still exists but only redirects to the
+new page. This is convenient since external sites may
+have links pointing to the old url.
+
+If the [[plugins/meta]] plugin is enabled, users can manually edit the
+page and put in '\[[!meta redir=newpage]]', but this is
+not very convenient.
+
+
diff --git a/doc/todo/refreshing_recentchanges_page.mdwn b/doc/todo/refreshing_recentchanges_page.mdwn
new file mode 100644
index 000000000..4846236fe
--- /dev/null
+++ b/doc/todo/refreshing_recentchanges_page.mdwn
@@ -0,0 +1,20 @@
+What do you think about refreshing the RecentChanges page (via a Meta Refresh tag)?
+It can be useful for users like me who prefer watching the latest changes
+in a browser tab to subscribing to the page. --[[Paweł|ptecza]]
+
+> Depends, if it were done the time period should be made configurable.
+> Unwanted server load due to refreshing could be a problem for some.
+> --[[Joey]]
+
+>> Yes, it should be configurable by the ikiwiki admin. I believe he's not
+>> stupid and will not set a refresh period short enough to kill his server :)
+>> I propose to add a `recentchanges_refresh` variable to the ikiwiki setup
+>> to set the refresh period. If it's not defined, then ikiwiki doesn't put
+>> a refresh meta tag into `recentchanges.tmpl`. Do you like it? ;) --[[Paweł|ptecza]]
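+
+For concreteness, the proposed option and the tag it would emit might look
+like this (hypothetical option name and default, per the suggestion above):
+
```
# in the ikiwiki setup file -- proposed, not implemented
recentchanges_refresh => 300,   # refresh period in seconds; omit to disable
```
+
+which would make `recentchanges.tmpl` emit something like
+`<meta http-equiv="refresh" content="300">` in the page head.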
+
+>>> Seems reasonable --[[Joey]]
+
+> Sounds like a client-side issue, not an ikiwiki issue. Grab the
+> [ReloadEvery](https://addons.mozilla.org/firefox/115/) extension for
+> <del>Firefox</del>Iceweasel, and use that to periodically refresh any page you
+> want. --[[JoshTriplett]]
diff --git a/doc/todo/rel__61__nofollow_on_external_links.mdwn b/doc/todo/rel__61__nofollow_on_external_links.mdwn
new file mode 100644
index 000000000..bbf51fa91
--- /dev/null
+++ b/doc/todo/rel__61__nofollow_on_external_links.mdwn
@@ -0,0 +1,4 @@
+Ikiwiki could optionally use rel=nofollow on all external links, or on all those from a configurable subset of pages (such as */discussion if using [[plugins/opendiscussion]]). --[[JoshTriplett]]
+
+> This is tricky, and would need a html parser, since external links can be
+> added in lots of ways including raw in markdown. --[[Joey]]
diff --git a/doc/todo/rel_attribute_for_links.mdwn b/doc/todo/rel_attribute_for_links.mdwn
new file mode 100644
index 000000000..3b4cea436
--- /dev/null
+++ b/doc/todo/rel_attribute_for_links.mdwn
@@ -0,0 +1,19 @@
+A rel="" attribute is desirable for links, for example to
+
+* limit the interest of comment spam with rel="nofollow" for anonymous wiki contributions (see [Google](http://googleblog.blogspot.com/2005/01/preventing-comment-spam.html))
+* identify page tags with rel="tag" (see [microformats](http://microformats.org/wiki/rel-tag))
+* define a social network with rel="friend co-worker met ..." for contacts (see [XFN](http://www.gmpg.org/xfn/))
+* define a license with rel="license" (see [microformats](http://microformats.org/wiki/rel-license))
+
+This patch adds this possibility to htmllink().
+
+This one uses it for tags:
+
+> Both applied, thanks. Leaving the bug open since other parts are not
+> implemented yet. See also [[rel=nofollow_on_external_links]] --[[Joey]]
+
+This can also help for css decoration. An example of these patches in use: http://poivron.org/~nil/iki/japonesie/horizon_large/
+
+— NicolasLimare
+
+[[!tag wishlist]]
diff --git a/doc/todo/relative_pagespec_deficiency.mdwn b/doc/todo/relative_pagespec_deficiency.mdwn
new file mode 100644
index 000000000..4500581c7
--- /dev/null
+++ b/doc/todo/relative_pagespec_deficiency.mdwn
@@ -0,0 +1,8 @@
+While a relative pagespec like `./posts/*` will work when used in a page
+such as `bdale/blog`, you cannot do
+`created_after(./posts/foo)` -- only `glob()` supports relative page
+references.
+
+The other pagespec functions should too, where appropriate.
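+
+With that fixed, a directive like the following (paths are illustrative)
+works as expected from a page such as `bdale/blog`:
+
```
\[[!inline pages="created_after(./posts/foo) and ./posts/*"]]
```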
+
+[[done]]
diff --git a/doc/todo/remove_basewiki_redir_pages.mdwn b/doc/todo/remove_basewiki_redir_pages.mdwn
new file mode 100644
index 000000000..fe9e49ce2
--- /dev/null
+++ b/doc/todo/remove_basewiki_redir_pages.mdwn
@@ -0,0 +1,4 @@
+In version 2.16, several redir pages were put in for [[basewiki]] pages
+that were moved. These redirs should be removed later. --[[Joey]]
+
+[[done]]
diff --git a/doc/todo/replace_HTML::Template_with_Template_Toolkit.mdwn b/doc/todo/replace_HTML::Template_with_Template_Toolkit.mdwn
new file mode 100644
index 000000000..9616f724f
--- /dev/null
+++ b/doc/todo/replace_HTML::Template_with_Template_Toolkit.mdwn
@@ -0,0 +1,115 @@
+[[!tag wishlist]]
+
+HTML::Template is an okay templating kit, but it lacks a lot of powerful
+features and thus makes it rather hard to give an ikiwiki site a consistent
+look. If you browse the templates provided in the tarball, you'll notice that
+more than one of them contain the `<html>` tag, which is unnecessary.
+
+> Note that is no longer true, and I didn't have to do such an intrusive
+> change to fix it either. --[[Joey]]
+
+Maybe it's just me, but I also find HTML::Template cumbersome to use, due in part
+to its use of capital letters.
+
+> Its entirely optional use of capital letters? --[[Joey]]
+
+Finally, the software seems unmaintained: the mailing list and searchable
+archives linked from
+<http://html-template.sourceforge.net/html_template.html#frequently%20asked%20questions>
+are broken and the author has not replied to my query in months.
+
+I would love to see ikiwiki use the [Template
+Toolkit](http://template-toolkit.org/) as templating engine.
+
+One major reason for TT is its use of slots, a concept I first encountered
+with Zope Page Templates and never wanted to miss again. Let me quickly
+illustrate, using the HTML::Template syntax for simplicity. Traditionally,
+templating is done with includes:
+
+ Page A Page B
+ <TMPL_INCLUDE header> <TMPL_INCLUDE header>
+ this is page A this is page B
+ <TMPL_INCLUDE footer> <TMPL_INCLUDE footer>
+
+This involves four pages, and if you mistype "footer" on page B,
+it'll be broken in potentially subtle ways.
+
+Now look at the approach with slots:
+
+ MainTemplate
+ This is the header
+ <TMPL_SLOT content>
+ This is the footer
+
+ Page A Page B
+ <TMPL_USE MainTemplate> <TMPL_USE MainTemplate>
+ <TMPL_FILL content> <TMPL_FILL content>
+ This is page A This is page B
+ </TMPL_FILL> </TMPL_FILL>
+ </TMPL_USE> </TMPL_USE>
+
+As soon as you think about more structured pages with various slots
+to fill, I am sure you can see the appeal of that approach. If not,
+here is some more documentation: <http://wiki.zope.org/ZPT/METALSpecification11>
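+
+For comparison, Template Toolkit expresses the same slot idea natively with
+its WRAPPER mechanism (standard TT syntax; file names are illustrative):
+
```
[%# main.tt -- the shared layout %]
This is the header
[% content %]
This is the footer

[%# pagea.tt -- a page using the layout %]
[% WRAPPER main.tt %]
This is page A
[% END %]
```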
+
+I would be glad to volunteer time to make this switch happen, such as
+rewriting the templates. I'd prefer not having to touch Perl though...
+
+
+-----
+
+Yes, Template::Toolkit is very powerful. But I think it's somewhat overkill for a wiki. HTML::Template can keep things simple, though. --[weakish](http://weakish.int.eu.org/blog/)
+
+I'd have to agree that Template::Toolkit is overkill and personally I'm not a fan, but it is very popular (there is even a book) and the new version (3) is alleged to be much more nimble than the current version. --[[ajt]]
+
+HTML::Template's HTML-like markup prevents me from editing templates in KompoZer or other WYSIWYG HTML editors. The editor tries to render the template markup rather than display it verbatim, and large parts of the template become invisible. A markup syntax that doesn't confuse editors (such as Template::Toolkit's "[% FOO %]") may promote template customization. The ability to replace the template engine would be within the spirit of ikiwiki's extensibility. --Rocco
+
+> HTML::Template allows the use of `<!-- TMPL_SOMETHING ... -->`
+> instead of `<TMPL_SOMETHING ...>`, see
+> <http://search.cpan.org/~samtregar/HTML-Template-2.6/Template.pm#NOTES>
+> for details. I used this Perl regexp to convert my own templates:
+>
+> s{<\s*(/?TMPL_[A-Z]+)((\s+\w+(=(['"]?)\w+\5)?)+)?\s*/?>}{<!-- $1$2 -->}gi;
+>
+> (Quoting it properly to use from the shell command-line is
+> nightmarish, write a script with it.)
+> --[[RiccardoMurri]]
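+
+A minimal script wrapping that regexp (a sketch; run it on copies of your
+templates first) might be:
+
```perl
#!/usr/bin/perl -p
# tmpl-convert: rewrite <TMPL_*> tags into the HTML-comment style
# <!-- TMPL_* --> that WYSIWYG editors leave alone.
# Usage: perl tmpl-convert.pl in.tmpl > out.tmpl
s{<\s*(/?TMPL_[A-Z]+)((\s+\w+(=(['"]?)\w+\5)?)+)?\s*/?>}{<!-- $1$2 -->}gi;
```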
+
+I agree that being able to replace the template toolkit would be a great piece of modularity, and one I would use. If I could use the slot-based filling and the conditional logic from Template::Toolkit, we could build much more flexible inline and archivepage templates that would look different depending on where in the wiki we use them. Some of this can currently be accomplished with separate templates for each use case and a manual call to the right template in the !inline directive, but this is limited, cumbersome, and makes it difficult to reuse bits of formatting by trapping all of that information in multiple template files. -Ian
+
+> I don't wish HTML::Template to be *replaced* by Template::Toolkit - as
+> others have said above, it's overkill for my needs. However, I also
+> agree that HTML::Template has its own problems too. The idea of making
+> the template system modular, with a choice of which backend to use - I
+> really like that idea. It would enable me to use some other template
+> system I like better, such as Text::Template or Text::NeatTemplate. But I
+> think it would be a lot of work to implement, though perhaps no more work
+> than making the revision-control backend modular, I guess. One would
+> need to write an IkiWiki template interface that didn't care what the
+> backend was, and yet is somehow still flexible enough to take advantage
+> of special features of different backends. There are an *awful lot* of
+> things that use templates - not just the `pagetemplate` and `template`
+> plugins, but a number of others which have specialized templates of their
+> own. -- [[KathrynAndersen]]
+
+>> A modular template system in ikiwiki is unlikely, as template objects
+>> are part of the API, notably the `pagetemplate` hook. Unless the other
+>> system has a compatible template object. --[[Joey]]
+
+>>> I hacked an adapter that exposes the HTML::Template API but uses
+>>> Template::Toolkit for the template rendering. Very rough, but it
+>>> works: my Wikis compile mostly ok. The code includes a `.tmpl`
+>>> converter script. Get it from: <http://github.com/riccardomurri/ikiwiki>
+>>> --[[RiccardoMurri]]
+
+---
+
+>>> I found this thing yesterday:
+>>>
+>>> http://search.cpan.org/~rhandom/Template-Alloy-1.016/lib/Template/Alloy.pod
+>>> I think (hope) this can solve all the problems related to this feature request.
+>>>
+>>> From the url above:
+>>> "With Template::Alloy you can use your favorite template interface and syntax
+>>> and get features from each of the other major template systems."
+>>> --[[Cstamas]]
diff --git a/doc/todo/require_CAPTCHA_to_edit.mdwn b/doc/todo/require_CAPTCHA_to_edit.mdwn
new file mode 100644
index 000000000..83ba07eb0
--- /dev/null
+++ b/doc/todo/require_CAPTCHA_to_edit.mdwn
@@ -0,0 +1,327 @@
+I don't necessarily trust all OpenID providers to stop bots. I note that ikiwiki allows [[banned_users]], and that there are other todos such as [[todo/openid_user_filtering]] that would extend this. However, it might be nice to have a CAPTCHA system.
+
+I imagine a plugin that modifies the login screen to use <http://recaptcha.net/>. You would then be required to fill in the captcha as well as log in in the normal way.
+
+-- [[users/Will]]
+
+> I hate CAPTCHAs with a passion. Someone else is welcome to write such a
+> plugin.
+>
+> If spam via openid (which I have never ever seen yet) becomes
+> a problem, a provider whitelist/blacklist seems like a much nicer
+> solution than a CAPTCHA. --[[Joey]]
+
+>> Apparently there has been openid spam (you can google for it). But as for
+>> white/black lists, were you thinking of listing the openids, or the content?
+>> Something like the moinmoin global <http://master.moinmo.in/BadContent>
+>> list?
+
+>>> OpenID can be thought of as pushing the problem of determining if
+>>> someone is a human or a spambot back from the openid consumer to the
+>>> openid provider. So, providers that make it possible for spambots to
+>>> use their openids, or that are even set up explicitly for use in
+>>> spamming, would be the ones to block. Or, providers that are known to
+>>> use very good screening for humans would be the ones to allow.
+>>> (Openid delegation makes it a bit harder than just looking at the
+>>> openid url though.) --[[Joey]]
+
+>>>> Well, OpenID only addresses authentication issues, not authorisation issues.
+>>>> Given that it is trivial to set up your own OpenID provider (a full provider, not
+>>>> just a forward to another provider), I can't see a
+>>>> blacklist working in the long term (it would be like blacklisting email).
+>>>> A whitelist might work (it would not be quite as bad as whitelisting email). In any case,
+>>>> there is now a captcha plugin for those that want it. It is accessible
+>>>> (there is an audio option) and serves a social purpose along with
+>>>> keeping bots out (the captcha is used to help digitise hard to read
+>>>> words in books for [Carnegie Mellon University](http://www.cs.cmu.edu/) and
+>>>> [The Internet Archive](http://www.archive.org/) ). Finally, because the actual captcha is outsourced
+>>>> it means that someone else is taking care of keeping it ahead of
+>>>> the bot authors.
+
+Okie - I have a first pass of this. There are still some issues.
+
+Currently the code verifies the CAPTCHA. If you get it right then you're fine.
+If you get the CAPTCHA wrong then the current code tells formbuilder that
+one of the fields is invalid. This stops the login from going through.
+Unfortunately, formbuilder is caching this validity somewhere, and I haven't
+found a way around that yet. This means that if you get the CAPTCHA
+wrong, it will continue to fail. You need to load the login page again so
+it doesn't have the error message on the screen, then it'll work again.
+
+> fixed this - updated code is attached.
+
+A second issue is that the OpenID login system resets the 'required' flags
+of all the other fields, so using OpenID will cause the CAPTCHA to be
+ignored.
+
+> This is still not fixed. I would have thought the following patch would
+> have fixed this second issue, but it doesn't.
+
+(code snipped as a working [[patch]] is below)
+
+>> What seems to be happening here is that the openid plugin defines a
+>> validate hook for openid_url that calls validate(). validate() in turn
+>> redirects the user to the openid server for validation, and exits. If
+>> the openid plugin's validate hook is called before your recaptcha
+>> validator, your code never gets a chance to run. I don't know how to
+>> control the order in which FormBuilder validates fields, but the only fix I
+>> can see is to somehow influence that order.
+>>
+>> Hmm, maybe you need to move your own validation code out of the validate
+>> hook. Instead, just validate the captcha in the formbuilder_setup hook.
+>> The problem with this approach is that if validation fails, you can't
+>> just flag it as invalid and let formbuilder handle that. Instead, you'd
+>> have to hack something in to redisplay the captcha by hand. --[[Joey]]
+
+>>> Fixed this. I just modified the OpenID plugin to check if the captcha
+>>> succeeded or failed. Seeing as the OpenID plugin is the one that is
+>>> abusing the normal validate method, I figured it was best to keep
+>>> the fix in the same place. I also added a config switch so you can set if
+>>> the captcha is needed for OpenID logins. OpenID defaults to ignoring
+>>> the captcha.
+>>> Patch is inline below.
+>>> I think this whole thing is working now.
+
+>>>> Ok, glad it's working. Not thrilled that it needs to modify the
+>>>> openid plugin, especially as I'm not sure if I will integrate the
+>>>> captcha plugin into mainline. Also because it's not very clean to have
+>>>> the openid plugin aware of another plugin like that. I'd like to
+>>>> pursue my idea of not doing the captcha validation in the validate
+>>>> hook.
+
+--- a/IkiWiki/Plugin/openid.pm
++++ b/IkiWiki/Plugin/openid.pm
+@@ -18,6 +18,7 @@ sub getopt () {
+ error($@) if $@;
+ Getopt::Long::Configure('pass_through');
+ GetOptions("openidsignup=s" => \$config{openidsignup});
++ GetOptions("openidneedscaptcha=s" => \$config{openidneedscaptcha});
+ }
+
+ sub formbuilder_setup (@) {
+@@ -61,6 +62,7 @@ sub formbuilder_setup (@) {
+ # Skip all other required fields in this case.
+ foreach my $field ($form->field) {
+ next if $field eq "openid_url";
++ next if $config{openidneedscaptcha} && $field eq "recaptcha";
+ $form->field(name => $field, required => 0,
+ validate => '/.*/');
+ }
+@@ -96,6 +98,18 @@ sub validate ($$$;$) {
+ }
+ }
+
++ if ($config{openidneedscaptcha} && defined $form->field("recaptcha")) {
++ foreach my $field ($form->field) {
++ next unless ($field eq "recaptcha");
++ if (! $field->validate) {
++ # if they didn't get the captcha right,
++ # then just claim we validated ok so the
++ # captcha can cause a fail
++ return 1;
++ }
++ }
++ }
++
+ my $check_url = $claimed_identity->check_url(
+ return_to => IkiWiki::cgiurl(do => "postsignin"),
+ trust_root => $config{cgiurl},
+
+
+Instructions
+=====
+
+You need to go to <http://recaptcha.net/api/getkey> and get a key set.
+The keys are added as options.
+
+ reCaptchaPubKey => "LONGPUBLICKEYSTRING",
+ reCaptchaPrivKey => "LONGPRIVATEKEYSTRING",
+
+You can also use "signInSSL" if you're using ssl for your login screen.
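+
+For example (`signInSSL` is the boolean checked by the plugin code below):
+
```
reCaptchaPubKey => "LONGPUBLICKEYSTRING",
reCaptchaPrivKey => "LONGPRIVATEKEYSTRING",
signInSSL => 1,
```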
+
+
+The following code is just inline. It will probably not display correctly, and you should just grab it from the page source.
+
+----------
+
+#!/usr/bin/perl
+# Ikiwiki reCAPTCHA plugin.
+package IkiWiki::Plugin::recaptcha;
+
+use warnings;
+use strict;
+use IkiWiki 2.00;
+
+sub import {
+ hook(type => "formbuilder_setup", id => "recaptcha", call => \&formbuilder_setup);
+}
+
+sub getopt () {
+ eval q{use Getopt::Long};
+ error($@) if $@;
+ Getopt::Long::Configure('pass_through');
+ GetOptions("reCaptchaPubKey=s" => \$config{reCaptchaPubKey});
+ GetOptions("reCaptchaPrivKey=s" => \$config{reCaptchaPrivKey});
+}
+
+sub formbuilder_setup (@) {
+ my %params=@_;
+
+ my $form=$params{form};
+ my $session=$params{session};
+ my $cgi=$params{cgi};
+ my $pubkey=$config{reCaptchaPubKey};
+ my $privkey=$config{reCaptchaPrivKey};
+ debug("Unknown Public Key. To use reCAPTCHA you must get an API key from http://recaptcha.net/api/getkey")
+ unless defined $config{reCaptchaPubKey};
+ debug("Unknown Private Key. To use reCAPTCHA you must get an API key from http://recaptcha.net/api/getkey")
+ unless defined $config{reCaptchaPrivKey};
+ my $tagtextPlain=<<EOTAG;
+ <script type="text/javascript"
+ src="http://api.recaptcha.net/challenge?k=$pubkey">
+ </script>
+
+ <noscript>
+ <iframe src="http://api.recaptcha.net/noscript?k=$pubkey"
+ height="300" width="500" frameborder="0"></iframe><br>
+ <textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
+ <input type="hidden" name="recaptcha_response_field"
+ value="manual_challenge">
+ </noscript>
+EOTAG
+
+ my $tagtextSSL=<<EOTAGS;
+ <script type="text/javascript"
+ src="https://api-secure.recaptcha.net/challenge?k=$pubkey">
+ </script>
+
+ <noscript>
+ <iframe src="https://api-secure.recaptcha.net/noscript?k=$pubkey"
+ height="300" width="500" frameborder="0"></iframe><br>
+ <textarea name="recaptcha_challenge_field" rows="3" cols="40"></textarea>
+ <input type="hidden" name="recaptcha_response_field"
+ value="manual_challenge">
+ </noscript>
+EOTAGS
+
+ my $tagtext;
+
+ if ($config{signInSSL}) {
+ $tagtext = $tagtextSSL;
+ } else {
+ $tagtext = $tagtextPlain;
+ }
+
+ if ($form->title eq "signin") {
+ # Give up if module is unavailable to avoid
+ # needing to depend on it.
+ eval q{use LWP::UserAgent};
+ if ($@) {
+ debug("unable to load LWP::UserAgent, not enabling reCaptcha");
+ return;
+ }
+
+ die("To use reCAPTCHA you must get an API key from http://recaptcha.net/api/getkey")
+ unless $pubkey;
+ die("To use reCAPTCHA you must get an API key from http://recaptcha.net/api/getkey")
+ unless $privkey;
+ die("To use reCAPTCHA you must know the remote IP address")
+ unless $session->remote_addr();
+
+ $form->field(
+ name => "recaptcha",
+ label => "",
+ type => 'static',
+ comment => $tagtext,
+ required => 1,
+ message => "CAPTCHA verification failed",
+ );
+
+ # validate the captcha.
+ if ($form->submitted && $form->submitted eq "Login" &&
+ defined $form->cgi_param("recaptcha_challenge_field") &&
+ length $form->cgi_param("recaptcha_challenge_field") &&
+ defined $form->cgi_param("recaptcha_response_field") &&
+ length $form->cgi_param("recaptcha_response_field")) {
+
+ my $challenge = "invalid";
+ my $response = "invalid";
+ my $result = { is_valid => 0, error => 'recaptcha-not-tested' };
+
+ $form->field(name => "recaptcha",
+ message => "CAPTCHA verification failed",
+ required => 1,
+ validate => sub {
+ if ($challenge ne $form->cgi_param("recaptcha_challenge_field") or
+ $response ne $form->cgi_param("recaptcha_response_field")) {
+ $challenge = $form->cgi_param("recaptcha_challenge_field");
+ $response = $form->cgi_param("recaptcha_response_field");
+ debug("Validating: ".$challenge." ".$response);
+ $result = check_answer($privkey,
+ $session->remote_addr(),
+ $challenge, $response);
+ } else {
+ debug("re-Validating");
+ }
+
+ if ($result->{is_valid}) {
+ debug("valid");
+ return 1;
+ } else {
+ debug("invalid");
+ return 0;
+ }
+ });
+ }
+ }
+}
+
+# The following function is borrowed from
+# Captcha::reCAPTCHA by Andy Armstrong and is under the Perl Artistic License.
+
+sub check_answer {
+ my ( $privkey, $remoteip, $challenge, $response ) = @_;
+
+ die
+ "To use reCAPTCHA you must get an API key from http://recaptcha.net/api/getkey"
+ unless $privkey;
+
+ die "For security reasons, you must pass the remote ip to reCAPTCHA"
+ unless $remoteip;
+
+ if (! ($challenge && $response)) {
+ debug("Challenge or response not set!");
+ return { is_valid => 0, error => 'incorrect-captcha-sol' };
+ }
+
+ my $ua = LWP::UserAgent->new();
+
+ my $resp = $ua->post(
+ 'http://api-verify.recaptcha.net/verify',
+ {
+ privatekey => $privkey,
+ remoteip => $remoteip,
+ challenge => $challenge,
+ response => $response
+ }
+ );
+
+ if ( $resp->is_success ) {
+ my ( $answer, $message ) = split( /\n/, $resp->content, 2 );
+ if ( $answer =~ /true/ ) {
+ debug("CAPTCHA valid");
+ return { is_valid => 1 };
+ }
+ else {
+ chomp $message;
+ debug("CAPTCHA failed: ".$message);
+ return { is_valid => 0, error => $message };
+ }
+ }
+ else {
+ debug("Unable to contact reCaptcha verification host!");
+ return { is_valid => 0, error => 'recaptcha-not-reachable' };
+ }
+}
+
+1;
diff --git a/doc/todo/review_mechanism.mdwn b/doc/todo/review_mechanism.mdwn
new file mode 100644
index 000000000..66ed58144
--- /dev/null
+++ b/doc/todo/review_mechanism.mdwn
@@ -0,0 +1,35 @@
+Basically, what I need is a two-sided wiki:
+
+* one side would be the published version, with the ikiwiki CGI disabled;
+* another would be the development version, which would be editable online.
+
+These two sides would correspond to branches in the repository.
+Each time someone makes a change to the development version,
+the created revision number would be added to a list of changes to be reviewed,
+maybe by a pre/post-commit hook. This would be done only if a published version of
+the page exists, and could be requested when a new page needs to be published.
+Some kind of privileged user could then move the change around,
+from the "review needed" queue to the "accepted" or "rejected" ones.
+This would be done in a way that would trigger the appropriate VCS merge operations.
+
+A generic "change queue" mechanism could be used for translations or other stuff as well.
+Each change would have its own wiki page under changes/revNNNN.
+Change queues would be wiki pages as well (probably using [[inlines|plugins/inline]]);
+[[Pagespecs|ikiwiki/Pagespec]] and [[tags]] would be used to control the queues to which a given change would belong.
+
+--[[JeremieKoenig]]
+
+> You can achieve something like this right now, by using Git. The
+> development and published versions each have their own repository, with
+> remotes set up so they push either to two backend repositories or to two
+> different branches of the same backend repository. You can then merge from
+> one to the other whenever you want.
+>
+> You could theoretically do this with SVN as well.
+>
+> I do like the idea you suggest of reviewing and merging changes through the
+> web interface, though.
+>
+> -- [[JoshTriplett]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/rewrite_ikiwiki_in_haskell.mdwn b/doc/todo/rewrite_ikiwiki_in_haskell.mdwn
new file mode 100644
index 000000000..e48765b0e
--- /dev/null
+++ b/doc/todo/rewrite_ikiwiki_in_haskell.mdwn
@@ -0,0 +1,65 @@
+[[!tag wishlist blue-sky]]
+
+In the long term, I have been considering rewriting ikiwiki in haskell.
+It's appealing for a lot of reasons, including:
+
+* No need to depend on a C compiler and have wrappers. Instead, ikiwiki
+ binaries could be built on demand to do the things wrappers are used for
+ now (cgi, post-commit, etc).
+* Potentially much faster. One problem with the now very modular ikiwiki is
+ that it has to load up dozens of perl modules each time it runs, which
+ means both opening lots of files and evaluating them. A haskell version
+ could run from one pre-compiled file. Other speed efficiencies are also
+ likely with haskell. For example, pandoc is apparently an order of
+ magnitude faster than perl markdown implementations.
+* Many plugins could be written in pure functional code, with no side
+ effects. Not all of them, of course.
+* It should be much easier to get ikiwiki to support parallel compilation
+ on multi-core systems using haskell.
+* A rewrite would be an opportunity to utterly break compatibility and
+ redo things based on experience. Since the haskell libraries used for
+ markdown, templates, etc, are unlikely to be very compatible with the perl
+ versions, and since perl plugins obviously wouldn't work, and perl setup
+ files wouldn't be practical to keep, a lot of things would unavoidably
+ change, and at that point changing everything else I can think of
+ probably wouldn't hurt (much).
+
+ - Re templates, it would be nice to have a template library that
+ doesn't use html-ish templating tags, since those are hard for users to
+ edit in html editors currently.
+ - This would be a chance to make WikiLinks with link texts read
+ "the right way round" (ie, vaguely wiki creole compatibly).
+ *[See also [[todo/link_plugin_perhaps_too_general?]] --[[smcv]]]*
+ - The data structures would probably be quite different.
+ - I might want to drop a lot of the command-line flags, either
+ requiring a setup file be used for those things, or leaving the
+ general-purpose `--set var=value` flag.
+ - Sometimes the current behavior of `--setup` seems confusing; it might
+ only cause a setup file to be read, and not force rebuild mode.
+ - Hard to say how the very high level plugin interface design would change,
+ but at the least some of the names of hooks could stand a rename, and
+ their parameter passing cleaned up.
+
+We know that a big, break-the-world rewrite like this can be a very
+bad thing for a project to attempt. It would be possible to support
+external plugins written in haskell today, without any rewrite; and a few
+of the benefits could be obtained by, eg, making the mdwn plugin be a
+haskell program that uses pandoc. I doubt that would be a good first step
+to converting ikiwiki to haskell, because such a program would have very
+different data structures and intercommunication than a pure haskell
+version.
+
+Some other things to be scared about:
+
+* By picking perl, I made a lot of people annoyed (and probably turned
+ several people away from using ikiwiki). But over time there turned out
+ to be a lot of folks who knew perl already (even if rustily), and made
+ some *very* useful contributions. I doubt there's as large a pool of haskell
+ programmers, and it's probably harder for a python user to learn haskell
+ than perl if they want to contribute to ikiwiki.
+* It might be harder for users of hosting services to install a haskell based
+ ikiwiki than the perl version. Such systems probably don't have ghc and
+ a bunch of haskell libraries. OTOH, it might be possible to build a
+ static binary at home and upload it, thus avoiding a messy installation
+ procedure entirely.
+ --[[Joey]]
diff --git a/doc/todo/rewrite_ikiwiki_in_haskell/discussion.mdwn b/doc/todo/rewrite_ikiwiki_in_haskell/discussion.mdwn
new file mode 100644
index 000000000..e19ceaa8f
--- /dev/null
+++ b/doc/todo/rewrite_ikiwiki_in_haskell/discussion.mdwn
@@ -0,0 +1,61 @@
+Ok, I have to admit, I have no idea if this is an April fool's joke or not.
+Congratulations for demonstrating that April fools jokes can still be subtle
+(whether intentionally or not!) -- [[Jon]]
+
+> Having said all that, have you looked at erlang? Have you heard of couchdb?
+> I'd strongly recommend looking at that. -- [[Jon]]
+
+>> I've glanced at couchdb, but don't see how it would tie in with ikiwiki.
+>> --[[Joey]]
+
+
+>>> It doesn't really. I recently (re-)read about couchdb and thought that
+>>> what it was trying to do had some comparisons with the thinking going on
+>>> in [[todo/structured_page_data]]. -- [[Jon]]
+
+-----
+
+I'm torn about this idea, if it's actually serious. I'm very comfortable
+programming in Perl, and have written quite a few modules for IkiWiki, and
+it would be a huge pain to have to start from scratch all over again. On
+the other hand, this could be a motivation for me to learn Haskell. My
+only encounter with Haskell has been a brief time when I was using the
+Xmonad window manager, but it looks like an interesting language.
+Functional programming is cool.
+
+There are a lot of interesting plusses for Haskell you note (in the parent
+page), but it's true that the idea is horribly daunting (as [[Joey]] said
+"If only I had a spare year"). Is there any way that you could "start
+small"? Because nothing will ever happen if the task is too daunting to
+even start.
+
+> This seems destined to remain a thought experiment unless something like
+> that can be done, or I get a serious case of second system disease.
+>
+> I've considered doing things like using the external plugin interface
+> to run a separate haskell program, which would allow implementing
+> arbitrary plugins in haskell (starting with a pandoc plugin..),
+> and could perhaps grow to subsume the perl code. However, this would
+> stick us with the perl data structures, which are not a very good fit
+> for haskell. --[[Joey]]
+
+On further thought... perhaps it would be easier to fork or contribute to
+an existing Haskell-based wiki, such as <a
+href="http://jaspervdj.be/hakyll">Hakyll</a>?
+
+--[[KathrynAndersen]]
+
+> As far as I know there are no other wikis (haskell or otherwise)
+> that are wiki compilers. Since we know from experience that dealing
+> with static compilation turns out to be one of the trickiest parts of
+> ikiwiki, I'm doubtful about trying to bolt that into one. --[[Joey]]
+
>> Hakyll isn't a wiki but it does do static compilation. The missing
+>> parts are: the web interface, the wiki link processing, and page
+>> dependency stuff. -- [[tychoish]]
+
+>>> (nods) Which is why I suggested it. I'm not sure whether it would be easier to "bolt on" those things than static compilation, but it could be worth looking at, at least. -- [[KathrynAndersen]]
+
+-----
+
+Rather than coding plugins for the Perl ikiwiki in Haskell, I wonder how easily a Haskell ikiwiki could still support plugins written in Perl? The (old and apparently stale) [HsPerl5](http://hackage.haskell.org/package/HsPerl5) package might provide a helpful starting point there. -- [[JoshTriplett]]
diff --git a/doc/todo/rss_title_description.mdwn b/doc/todo/rss_title_description.mdwn
new file mode 100644
index 000000000..f0138cb72
--- /dev/null
+++ b/doc/todo/rss_title_description.mdwn
@@ -0,0 +1,35 @@
+There could be a better way of setting the title and description of the rss feeds. Perhaps
+through the meta plugin, or extra options to the inline plugin.
+
+At the moment the description seems to be the same for all feeds from
+a single wiki, and the title is forced to be one word,
+though I don't think it needs to be.
+
+A few pointers and I might be able to implement this myself. -- JamesWestby
+
+> I don't see any problem with the title, it's the same as the title
+> of the wiki page that the rss feed comes from, which can be set
+> using the meta plugin. There are no restrictions to one word or
+> anything like that. Just made ikiwiki emit the following in a test
+> feed:
+
+> > <title>billy bob&#39;s news</title>
+
+> Now, the description field currently defaults to the wiki name,
+> and that could indeed stand to be made configurable. Since the
+> current (svn) version of ikiwiki supports long, word-wrapped
+> blocks of text as parameters to [[ikiwiki/Directive]]s, seems
+> to me the best way would be to simply modify inline.pm to make the
+> description configurable by such a parameter, with a fallback to the
+> wiki name. You'll need to modify rsspage.tmpl to use whatever new
+> template variable you define, that should be all.
+
+> --[[Joey]]
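A minimal sketch of the fallback behaviour Joey outlines above (the names here are illustrative, and `%config` merely stands in for IkiWiki's real configuration hash):

```perl
#!/usr/bin/perl
# Sketch only: an optional description parameter for inline.pm,
# falling back to the wiki name when no description is given.
use strict;
use warnings;

our %config = (wikiname => 'MyWiki');   # stand-in for IkiWiki's %config

sub feed_description {
    my %params = @_;
    return (defined $params{description} && length $params{description})
        ? $params{description}
        : $config{wikiname};
}

print feed_description(description => 'billy bob\'s news'), "\n";
print feed_description(), "\n";   # falls back to the wiki name
```

The template side would then just use whatever variable this value is assigned to in rsspage.tmpl.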
+
+Apologies for the title thing. I tried it yesterday, and it only used the first word.
+I must have done something wrong. I'll have a look at implementing the description.
+Thanks. -- JamesWestby
+
+My patch can be found at <http://jameswestby.net/scratch/blog-desc.diff> -- JamesWestby
+
+> Thanks, [[todo/done]] --[[Joey]]
diff --git a/doc/todo/rst_plugin_python_rewrite.mdwn b/doc/todo/rst_plugin_python_rewrite.mdwn
new file mode 100644
index 000000000..222fdb177
--- /dev/null
+++ b/doc/todo/rst_plugin_python_rewrite.mdwn
@@ -0,0 +1,7 @@
+The [[plugins/rst]] plugin is slow because it forks python for each page
+rendered. Now that ikiwiki supports plugins written in
+[[other_languages|plugins/write/external]], it would be excellent if someone
+could rewrite the rst plugin as a pure python external plugin. It would
+then run nice and quick.
+
+[[done]], thanks to madduck!
diff --git a/doc/todo/salmon_protocol_for_comment_sharing.mdwn b/doc/todo/salmon_protocol_for_comment_sharing.mdwn
new file mode 100644
index 000000000..1e56b0a8b
--- /dev/null
+++ b/doc/todo/salmon_protocol_for_comment_sharing.mdwn
@@ -0,0 +1,21 @@
+The <a href="http://www.salmon-protocol.org/home">Salmon protocol</a>
+provides for aggregating comments across sites. If a site that syndicates
+a feed receives a comment on an item in that feed, it can re-post the
+comment to the original source.
+
+> Ikiwiki does not allow comments to be posted on items it aggregates.
+> So salmon protocol support would only need to handle the comment
+> receiving side of the protocol.
+>
+> The current draft protocol document confuses me when it starts talking
+> about using OAuth in the abuse prevention section, since their example
+> does not show use of OAuth, and it's not at all clear to me where the
+> OAuth relationship between aggregator and original source is supposed
+> to come from.
+>
+> Their security model, which goes on to include Webfinger,
+> thirdparty validation services, XRD, and Magic Signatures, looks sorta
+> like they kept throwing technology at it, hoping something would stick. :-P
+> --[[Joey]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/search.mdwn b/doc/todo/search.mdwn
new file mode 100644
index 000000000..79342f286
--- /dev/null
+++ b/doc/todo/search.mdwn
@@ -0,0 +1,5 @@
+* page name substring search
+* full text (use third-party tools?)
+ - hyperestraier looks nice
+
+[[todo/done]]
diff --git a/doc/todo/search_terms.mdwn b/doc/todo/search_terms.mdwn
new file mode 100644
index 000000000..cf1708c34
--- /dev/null
+++ b/doc/todo/search_terms.mdwn
@@ -0,0 +1,7 @@
+The [[plugins/search]] plugin could use xapian terms to allow some special
+searches. For example, "title:foo", or "link:somepage", or "author:foo", or
+"copyright:GPL".
+
+Reference: <http://xapian.org/docs/omega/termprefixes.html>
+
+[[done]] for title and link, which seem like the really useful ones.
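The mapping from search fields to term prefixes might be sketched like this (illustrative only, not the search plugin's actual code; "S" is Xapian/Omega's conventional title prefix, while "XLINK" is an assumed custom prefix for links):

```perl
#!/usr/bin/perl
# Sketch: turn a "field:value" search into a prefixed Xapian term.
use strict;
use warnings;

my %prefix = (title => 'S', link => 'XLINK');

sub query_term {
    my ($field, $value) = @_;
    die "unknown field: $field" unless exists $prefix{$field};
    return $prefix{$field} . lc $value;   # terms are indexed lowercased
}

print query_term('title', 'Foo'), "\n";       # Sfoo
print query_term('link', 'somepage'), "\n";   # XLINKsomepage
```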
diff --git a/doc/todo/section-numbering.mdwn b/doc/todo/section-numbering.mdwn
new file mode 100644
index 000000000..3a2d232a8
--- /dev/null
+++ b/doc/todo/section-numbering.mdwn
@@ -0,0 +1,7 @@
+[[!tag wishlist]]
+
+Optional automatic section numbering would help reading: otherwise, a reader (like me) gets lost in the structure of a long page.
+
+I guess it is implementable with complex CSS... but one has first to compose this CSS in any case. So, this wish still has a todo status. --Ivan Z.
+
+Another reason this is related to ikiwiki, and not just a matter of authoring CSS, is that the style of the numbers (generated by CSS probably) should match the style of the numbers in ikiwiki's [[plugins/toc]]. --Ivan Z.
diff --git a/doc/todo/selective_more_directive.mdwn b/doc/todo/selective_more_directive.mdwn
new file mode 100644
index 000000000..2a9998205
--- /dev/null
+++ b/doc/todo/selective_more_directive.mdwn
@@ -0,0 +1,28 @@
+I'm setting up a blog for NaNoWriMo and other story-writing, which means long posts every day. I want to have excerpts on the front page, which link to the full length story posts. I also want a dedicated page for each story which inlines the story in full and in chronological order. I can use the "more" directive to achieve this effect on the front page but then it spoils the story page. My solution was to add a pages= parameter to the more directive to make it more selective.
+
+ --- /usr/share/perl5/IkiWiki/Plugin/more.pm 2010-10-09 00:09:24.000000000 +0000
+ +++ .ikiwiki/IkiWiki/Plugin/more.pm 2010-11-01 20:24:59.000000000 +0000
+ @@ -26,7 +26,10 @@
+
+ $params{linktext} = $linktext unless defined $params{linktext};
+
+ - if ($params{page} ne $params{destpage}) {
+ + if ($params{page} ne $params{destpage} &&
+ + (! exists $params{pages} ||
+ + pagespec_match($params{destpage}, $params{pages},
+ + location => $params{page}))) {
+ return "\n".
+ htmllink($params{page}, $params{destpage}, $params{page},
+ linktext => $params{linktext},
+
+I can now call it as
+
+ \[[!more pages="index" linktext="Chapter 1" text="""
+ etc
+ """]]
+
+I'm not entirely happy with the design, since I would rather put this information in the inline directive instead of in every story post. Unfortunately I found no way to pass parameters from the inline directive to the inlined page.
+
+-- [[dark]]
+
+> Me neither, but nor do I see a better way, so [[applied|done]]. --[[Joey]]
diff --git a/doc/todo/shortcut_link_text.mdwn b/doc/todo/shortcut_link_text.mdwn
new file mode 100644
index 000000000..952e84608
--- /dev/null
+++ b/doc/todo/shortcut_link_text.mdwn
@@ -0,0 +1,19 @@
+[[plugins/shortcut]] creates link shortcut [[ikiwiki/Directive]]s,
+which substitute their argument into the specified shortcut URL to generate
+the link target, and use the argument as the link text. For example, given
+the example [[shortcuts]], `\[[!wikipedia ikiwiki]]` generates a link to
+<http://en.wikipedia.org/wiki/ikiwiki>, with the link text "ikiwiki". This
+works well in many cases; however, for things like the `debbug` example, it
+simply uses the number as the link text, which does not always provide
+enough context to understand the link at first glance. For example,
+`\[[!debbug 397501]]` generates a link to <http://bugs.debian.org/397501>,
+with just "397501" as the link text. While [[plugins/template]] provides a
+general solution for arbitrary cases, it would help to have a simple option
+via the shortcut plugin to set the link text, with a `%s` substitution.
+Thus, something like `\[[!shortcut name=debbug
+url="http://bugs.debian.org/%s" desc="bug #%s"]]` might suffice on a
+Debian-specific wiki to indicate a bug number, while a more general wiki
+might use something like `\[[!shortcut name=debbug
+url="http://bugs.debian.org/%s" desc="Debian bug #%s"]]`.
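The `%s` substitution proposed here might be sketched as follows (the helper name is assumed, not the shortcut plugin's real internals; the real plugin also url-encodes the argument before substituting it into the URL):

```perl
#!/usr/bin/perl
# Sketch: substitute the shortcut argument into both the URL and the
# optional desc, falling back to the bare argument as link text.
use strict;
use warnings;

sub expand_shortcut {
    my ($url, $desc, $arg) = @_;
    (my $link = $url) =~ s/\%s/$arg/g;
    my $text = defined $desc ? $desc : $arg;
    $text =~ s/\%s/$arg/g;
    return "<a href=\"$link\">$text</a>";
}

print expand_shortcut('http://bugs.debian.org/%s', 'bug #%s', '397501'), "\n";
# -> <a href="http://bugs.debian.org/397501">bug #397501</a>
```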
+
+> [[todo/done]] --[[Joey]]
diff --git a/doc/todo/shortcut_optional_parameters.mdwn b/doc/todo/shortcut_optional_parameters.mdwn
new file mode 100644
index 000000000..445404315
--- /dev/null
+++ b/doc/todo/shortcut_optional_parameters.mdwn
@@ -0,0 +1,46 @@
+Consider the "All files in this package search" on
+<http://packages.debian.org>. The URL for such a search looks like this:
+
+ http://packages.debian.org/cgi-bin/search_contents.pl?word=packagename&searchmode=filelist&case=insensitive&version=unstable&arch=i386
+
+To create a "debfiles" [[shortcut|shortcuts]] that takes a package name, you
+could just hardcode the architecture and distribution:
+
+ \[[!shortcut name=debfiles url="http://packages.debian.org/cgi-bin/search_contents.pl?word=%s&searchmode=filelist&case=insensitive&version=unstable&arch=i386"]]
+ \[[!debfiles ikiwiki]]
+
+But what if you could have them as optional parameters instead? The syntax
+for the invocation should look like this:
+
+ \[[!debfiles ikiwiki dist=testing]]
+
+Some possible syntax choices for the shortcut definition:
+
+ \[[!shortcut name=debfiles url="http://packages.debian.org/cgi-bin/search_contents.pl?word=%s&searchmode=filelist&case=insensitive&version=%(dist)s&arch=%(arch)s" dist="unstable" arch="i386"]]
+ \[[!shortcut name=debfiles url="http://packages.debian.org/cgi-bin/search_contents.pl?word=%s&searchmode=filelist&case=insensitive&version=%(dist=unstable)s&arch=%(arch=i386)s"]]
+ \[[!shortcut name=debfiles url="http://packages.debian.org/cgi-bin/search_contents.pl?word=%s&searchmode=filelist&case=insensitive&version=%{dist=unstable}&arch=%{arch=i386}"]]
+ \[[!shortcut name=debfiles url="http://packages.debian.org/cgi-bin/search_contents.pl?word=$*&searchmode=filelist&case=insensitive&version=${dist=unstable}&arch=${arch=i386}"]]
+
+--[[JoshTriplett]]
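The third syntax proposal, `%{key=default}`, might expand roughly like this (a hypothetical sketch, not existing shortcut plugin code):

```perl
#!/usr/bin/perl
# Sketch: expand each %{key=default} placeholder from the supplied
# parameters, falling back to the default embedded in the template.
use strict;
use warnings;

sub expand_url {
    my ($template, %params) = @_;
    $template =~ s/\%\{(\w+)=([^}]*)\}/
        exists $params{$1} ? $params{$1} : $2/ge;
    return $template;
}

my $tpl = 'http://example.org/search?version=%{dist=unstable}&arch=%{arch=i386}';
print expand_url($tpl, dist => 'testing'), "\n";
# -> http://example.org/search?version=testing&arch=i386
```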
+
+Well, you can already do this kind of thing with templates. Invocation does
+look different:
+
+ \[[!template id=debfiles package=ikiwiki dist=testing]]
+
+--[[Joey]]
+
+> I think I would find templates sufficient, if:
+>
+> 1. I could use the name of the template as a preprocessor directive
+> (`\[[!templatename ...]]`), rather than using the `template` directive
+> with an `id` argument (`\[[!template id=templatename]]`).
+> 2. Template invocation allowed bare values in addition to `key=value`
+> arguments, and template definition supported some means to access the
+> value. This would allow `\[[!debfiles ikiwiki]]` rather than
+> `\[[!debfiles package=ikiwiki]]`.
+> 3. I could use ikiwiki syntax in the template, not just HTML and
+> HTML::Template. (If I can already do that, then [[/plugins/template]]
+> should make that more clear.)
+>
+> --[[JoshTriplett]]
diff --git a/doc/todo/shortcut_with_different_link_text.mdwn b/doc/todo/shortcut_with_different_link_text.mdwn
new file mode 100644
index 000000000..8615b2754
--- /dev/null
+++ b/doc/todo/shortcut_with_different_link_text.mdwn
@@ -0,0 +1,67 @@
+I'd like the ability to use a shortcut, but declare an explicit link text
+rather than using the link text defined on [[/shortcuts]]. For example, if I
+create a shortcut `protogit` pointing to files in the xcb/proto.git gitweb
+repository, I don't always want to use the path to the file as the link text;
+I would like to link to src/xcb.xsd, but use the link text "XML Schema for the X
+Window System protocol". --[[JoshTriplett]]
+
+> If I understand you correctly, you can use Markdown \[your link text\]\(the path or URL\) . Using your example:
+> [XML Schema for the X Window System protocol](src/xcb.xsd)
+>
+> If I don't understand this, can you give an HTML example? --[[JeremyReed]]
+
+>> The problem is like that in [[bugs/shortcuts_don't_escape_from_Markdown]]. We would like to use
+>> the shortcuts plugin but add a descriptive text -- in this case \[[!xcbgit src/xcb.xsd|XML Schema...]]
+>> The file src/xcb.xsd could be any url, and the point of shortcuts is that you get to shorten it.
+>> --Ethan
+
+>>> Some clarifications:
+>>> You can always write something like
+>>> `[XML Schema for the X Window System Protocol](http://gitweb.freedesktop.org/?p=xcb/proto.git;a=blob;hb=HEAD;f=src/xcb.xsd)`
+>>> to get [XML Schema for the X Window System Protocol](http://gitweb.freedesktop.org/?p=xcb/proto.git;a=blob;hb=HEAD;f=src/xcb.xsd).
+>>> However, I want to define a [[plugins/shortcut]] to save the typing. If I
+>>> define something like `protogit` pointing to
+>>> `http://gitweb.freedesktop.org/?p=xcb/proto.git;a=blob;hb=HEAD;f=%s`, then
+>>> I can write `\[[!protogit src/xcb.xsd]]`; however, I then can't change the
+>>> link text to anything other than what the shortcut defines as the link
+>>> text. I want to write something like
+>>> `\[[XML Schema for the X Window System Protocol|protogit src/xcb.xsd]]`,
+>>> just as I would write a wikilink like
+>>> `\[[the_shortcuts_on_this_wiki|shortcuts]]` to get
+>>> [[the_shortcuts_on_this_wiki|shortcuts]]. (The order you suggest, with the
+>>> preprocessor directive first, seems quite confusing since wikilinks work
+>>> the other way around.) --[[JoshTriplett]]
+
+> How about [xcbgit XML_Schema|src/xcb.xsd]. That's the same way round
+> as a wikilink, if you look at it the right way. The syntax Josh suggests
+> is not currently possible in ikiwiki.
+>
+> However.. [[Short_wikilinks]] has some similar objectives in a way, and
+> over there a similar syntax to what Josh proposes was suggested. So maybe
+> I should modify how ikiwiki preprocessors work to make it doable.
+> Although, I seem to have come up with a clear alternative syntax over
+> there. --[[Joey]]
+
+---
+
+One possible alternative, would be a general `\[[!url ]]` scheme for all kinds of links. As mentioned in [[Short_wikilinks]], I have wanted a way to enter links to the wiki with markdown-style references,
+specifying the actual target away from the text, with just a short reference in the text. To facilitate automatic conversion from an earlier (already markdownised) "blog", I finally ended up writing a custom plugin that simply gets the location of a wiki page, and uses markdown mechanisms:
+
+ Here [is][1] a link.
+
+ [1]: [[!l a_page_in_the_wiki]]
+
+ Obviously [this]([[!l another_page]]) also works, although the syntax is quite cumbersome.
+
+So the 'l' plugin inserts the location of the page there, and markdown does the rest. My plugin currently fails if it can't find the page, as that is sufficient for my needs. Differing colouring for non-existing pages is not doable in a straightforward manner with this approach.
+
+For external links, that is no concern, however. So you could define for each shortcut an alternative directive, that inserts the URL. Perhaps `\[[!url shortcutname params]]` or `\[[@shortcutname params]]` (if the preprocessor supported the @), and this could be extended to local links in an obvious manner: `\[[!url page]]` or `\[[@page]]`. Now, if you could just get rid of the parentheses for markdown, for the short inline links --[[tuomov]] (who'd really rather not have two separate linking mechanisms: ikiwiki's heavy syntax and markdown's lighter one).
+
+---
+
+I've added code to make the \[[!foo 123]] syntax accept a _desc_
+parameter. I've named it like this to signal that it overrides the
+_desc_ provided at definition time. `%s` is expanded here as well.
+
+[[todo/done]] -- Adeodato Simó
+
diff --git a/doc/todo/shortcut_with_no_url_parameter__44___only_desc.mdwn b/doc/todo/shortcut_with_no_url_parameter__44___only_desc.mdwn
new file mode 100644
index 000000000..56a74029e
--- /dev/null
+++ b/doc/todo/shortcut_with_no_url_parameter__44___only_desc.mdwn
@@ -0,0 +1,23 @@
+Currently, [[shortcuts]] must have the `url` parameter, and can optionally
+have the `desc` parameter. If the `shortcut` directive instead required at
+least one of `url` or `desc`, then shortcuts could just supply a description
+without an URL. Since desc can contain arbitrary wiki markup, this would
+allow shortcuts with multiple links, such as the mmlist shortcut proposed on
+[[simple_text_parsing_or_regex_in_template_or_shortcut]], or a comprehensive
+Debian package shortcut which linked to the package page and parenthetically
+to the BTS and PTS.
+
+--[[JoshTriplett]]
+
+It sounds like you're looking for templates, not shortcuts. --[[Joey]]
+
+> Perhaps true (see my issues with template syntax on
+> [[todo/shortcut_optional_parameters]]), but allowing a `shortcut` without an
+> `url` still seems reasonable, and simple. You could also use such shortcuts
+> without markup at all, as an abbreviation mechanism:
+>
+> \[[!shortcut name=spi desc="Software in the Public Interest, Inc."]].
+> \[[!shortcut name=sosp desc="Symposium on Operating System Principles"]].
+> \[[!shortcut name=cacm desc="Communications of the ACM"]].
+>
+> --[[JoshTriplett]]
diff --git a/doc/todo/should_optimise_pagespecs.mdwn b/doc/todo/should_optimise_pagespecs.mdwn
new file mode 100644
index 000000000..728ab8994
--- /dev/null
+++ b/doc/todo/should_optimise_pagespecs.mdwn
@@ -0,0 +1,313 @@
+I think there is a problem in my "dependency graph". As an example,
+[here](http://poivron.org/~nil/misc/ikiwiki_buggy_index) is the index
+ikiwiki generated for [my site](http://poivron.org/~nil/misc/ikiwiki_buggy_index)
+(note that the site changed since this index was generated).
+
+Some **HUGE** dependencies appear, clearly non-optimal, like
+
+ depends = A | B | A | C | A | D | A | E | A | F | A | G | ....
+
+or
+
+ depends = A | B | C | D | A | B | C | D | A | B | C | D | ....
+
+Couldn't isolate the cause, but some sources for this problem may be:
+
+* related to the img module
+* easily observable in my site because one of my pages includes 80 resized images
+
+Other special things in my templates and site:
+
+* a sidebar with \[[!include pages="notes/\*" template=foo]] while notes.mdwn has
+ a \[[!include pages="notes/*"]] and uses the sidebar; removed it, doesn't change
+* a template (biblio.tmpl) calling the "img" plugin with a template parameter as the
+ image filename; removed it, doesn't change
+* some strange games with tags whose page calls a "map" directive to show other tags
+ while tags are also used in tagclouds (in the sidebar and in the main pages)
+* ...
+
+I observed these problems (same *kind*, I didn't check in details) on
+
+* ikiwiki 2.00gpa1 + v5.8.4 + Debian 3.1
+* ikiwiki 2.3 + v5.8.8 + Ubuntu 7.04
+
+I can think about reducing the size of my wiki source and making it available online for analysis.
+
+-- NicolasLimare
+
+> As long as these dependencies don't grow over time (ie, when a page is
+> edited and nothing changed that should add a dependency), I wouldn't
+> worry about them. There are many things that can cause non-optimal
+> dependencies to be recorded. For one thing, if you inline something, ikiwiki
+> creates a dependency like:
+>
+> (PageSpec) or (file1 or file2 or file3 ...)
+>
+> Where fileN are all the files that the PageSpec currently matches. (This
+> is necessary to detect when a currently inlined file is deleted, and know
+> the inlining page needs an update.) Now consider what it does if you have
+> a single page with two inline statements, that inline the same set of
+> stuff twice:
+>
+> ((PageSpec) or (file1 or file2 or file3 ...) or (PageSpec) or (file1 or file2 or file3 ...))
+>
+> Clearly non-optimal, indeed.
+>
+> Ikiwiki doesn't bother to simplify complex PageSpecs
+> because it's difficult to do, and because all they use is some disk
+> space. Consider what ikiwiki uses these dependencies for.
+> All it wants to know is: does the PageSpec for this page it's considering
+> rebuilding match any of the pages that have changed? Determining this is
+> a simple operation -- the PageSpec is converted to perl code. The perl
+> code is run.
+>
+> So the total impact of an ugly dependency like this is:
+>
+> 1. Some extra data read/written to disk.
+> 2. Some extra space in memory.
+> 3. A bit more data for the PageSpec translation code to handle. But that
+> code is quite fast.
+> 4. Typically one extra function call when the generated perl code is run.
+> Ie, when the expression on the left-hand side fails, which typically
+> happens after one (inexpensive) function call, it has to check
+> the identical expression on the right hand side.
+>
+> So this is at best a wishlist todo item, not a bug. A PageSpec simplifier
+> (or improved `pagespec_merge()` function) could be written and improve
+> ikiwiki's memory and disk usage, but would it actually speed it up any?
+> We'd have to see the code to the simplifier to know.
+>
+> --[[Joey]]
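The point about the cheapness of a redundant clause can be sketched like this (illustrative only, not ikiwiki's actual pagespec translation): a dependency expression compiles to code that ORs its clauses, so a duplicated clause costs at most one extra check after the left-hand side fails to match.

```perl
#!/usr/bin/perl
# Sketch: compile a list of dependency clauses into a single test
# that short-circuits on the first match.
use strict;
use warnings;

sub compile_depends {
    my @clauses = @_;   # each clause: a sub returning true on a match
    return sub {
        my $changed = shift;
        for my $clause (@clauses) {
            return 1 if $clause->($changed);   # short-circuit
        }
        return 0;
    };
}

my $spec  = sub { $_[0] =~ m{^blog/} };                  # stands in for a PageSpec
my $files = sub { $_[0] eq 'blog/a' or $_[0] eq 'blog/b' };

# The same pair recorded twice, like the non-optimal index above:
my $depends = compile_depends($spec, $files, $spec, $files);
print $depends->('blog/a') ? "rebuild\n" : "skip\n";
```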
+
+>> I've been looking at optimizing ikiwiki for a site using
+>> [[plugins/contrib/album]] (which produces a lot of pages) and it seems
+>> that checking which pages depend on which pages does take a significant
+>> amount of time. The optimize-depends branch in my git repository
+>> avoids using `pagespec_merge()` for this (indeed it's no longer used
+>> at all), and instead represents dependencies as a list of pagespecs
+>> rather than a single pagespec. This does turn out to be faster, although
+>> not as much as I'd like. --[[smcv]]
+
+>>> [[Merged|done]] --[[smcv]]
+
+>>> I just wanted to note that there is a whole long discussion of dependencies and pagespecs on the [[todo/tracking_bugs_with_dependencies]] page. -- [[Will]]
+
+>>>> Yeah, I had a look at that (as the only other mention of `pagespec_merge`).
+>>>> I think I might have solved some of the problems mentioned there,
+>>>> actually - `pagespec_merge` no longer needs to exist in my branch (although
+>>>> I haven't actually deleted it), because the "or" operation is now done in
+>>>> the Perl code, rather than by merging pagespecs and translating. --[[smcv]]
+
+>>>>> I've now added a patch to the end of that branch that deletes
+>>>>> `pagespec_merge` almost entirely (we do need to keep a copy around, in
+>>>>> ikiwiki-transition, but that copy doesn't have to be optimal or support
+>>>>> future features like [[tracking_bugs_with_dependencies]]). --[[smcv]]
+
+---
+
+Some questions on your optimize-depends branch. --[[Joey]]
+
+In saveindex the depends list is still or'd together, but the `{depends}`
+field seems only useful for backwards compatibility (ie, ikiwiki-transition
+still uses it), and otherwise just bloats the index.
+
+> If it's acceptable to declare that downgrading IkiWiki requires a complete
+> rebuild, I'm happy with that. I'd prefer to keep the (simple form of the)
+> transition done automatically during a load/save cycle, rather than
+> requiring ikiwiki-transition to be run; we should probably say in NEWS
+> that the performance increase won't fully apply until the next
+> rebuild. --[[smcv]]
+
+>> It is acceptable not to support downgrades.
+>> I don't think we need a NEWS file update since any sort of refresh,
+>> not just a full rebuild, will cause the indexdb to be loaded and saved,
+>> enabling the optimisation. --[[Joey]]
+
+>>> A refresh will load the current dependencies from `{depends}` and save
+>>> them as-is as a one-element `{dependslist}`; only a rebuild will replace
+>>> the single complex pagespec with a long list of simpler pagespecs.
+>>> --[[smcv]]
+
+Is an array the right data structure? `add_depends` has to loop through the
+array to avoid dups, it would be better if a hash were used there. Since
+inline (and other plugins) explicitly add all linked pages, each as a
+separate item, the list can get rather long, and that single add_depends
+loop has suddenly become O(N^2) to the number of pages, which is something
+to avoid..
+
+> I was also thinking about this (I've been playing with some stuff based on the
+> `remove-pagespec-merge` branch). A hash, by itself, is not optimal because
+> the dependency list holds two things: page names and page specs. The hash would
+> work well for the page names, but you'll still need to iterate through the page specs.
+> I was thinking of keeping a list and a hash. You use the list for pagespecs
+> and the hash for individual page names. To make this work you need to adjust the
+> API so it knows which you're adding. -- [[Will]]
+
+> I wasn't thinking about a lookup hash, just a dedup hash, FWIW.
+> --[[Joey]]
+
+>> I was under the impression from previous code review that you preferred
+>> to represent unordered sets as lists, rather than hashes with dummy
+>> values. If I was wrong, great, I'll fix that and it'll probably go
+>> a bit faster. --[[smcv]]
+
+>>> It depends, really. And it'd certainly make sense to benchmark such a
+>>> change. --[[Joey]]
+
+>>>> Benchmarked, below. --[[smcv]]
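+
+For reference, the two representations under discussion can be sketched as
+follows (Python standing in for the Perl; the point is only the asymptotic
+cost of de-duplication):

```python
def add_depends_list(depends, pagespec):
    # list-based unordered set: each call scans the whole list,
    # so building N dependencies costs O(N^2) comparisons
    if pagespec not in depends:
        depends.append(pagespec)

def add_depends_hash(depends, pagespec):
    # hash-based unordered set (dummy values): amortized O(1) per call
    depends[pagespec] = 1

deps_list, deps_hash = [], {}
for spec in ["index", "blog/*", "index", "blog/*", "sandbox"]:
    add_depends_list(deps_list, spec)
    add_depends_hash(deps_hash, spec)
# both end up holding the same three unique dependencies
```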
+
+Also, since a lot of places are calling add_depends in a loop, it probably
+makes sense to just make it accept a list of dependencies to add. It'll be
+marginally faster, probably, and should allow for better optimisation
+when adding a lot of depends at once.
+
+> That'd be an API change; perhaps marginally faster, but I don't
+> see how it would allow better optimisation if we're de-duplicating
+> anyway? --[[smcv]]
+
+>> Well, I was thinking that it might be sufficient to build a `%seen`
+>> hash of dependencies inside `add_depends`, if the places that call
+>> it lots were changed to just call it once. Of course the only way to
+>> tell is benchmarking. --[[Joey]]
+
+>>> It doesn't seem that it significantly affects performance either way.
+>>> --[[smcv]]
+
+In Render.pm, we now have a triply nested loop, which is a bit
+scary for efficiency. It seems there should be a way to
+rework this code so it can use the optimised `pagespec_match_list`,
+and/or hoist some of the inner loop calculations (like the `pagename`)
+out.
+
+> I don't think the complexity is any greater than it was: I've just
+> moved one level of "loop" out of the generated Perl, to be
+> in visible code. I'll see whether some of it can be hoisted, though.
+> --[[smcv]]
+
+>> The call to `pagename` is the only part I can see that's clearly
+>> run more often than before. That function is pretty inexpensive, but..
+>> --[[Joey]]
+
+>>> I don't see anything that can be hoisted without significant refactoring,
+>>> actually. Beware that there are two pagename calls in the loop: one for
+>>> `$f` (which is the page we might want to rebuild), and one for `$file`
+>>> (which is the changed page that it might depend on). Note that I didn't
+>>> choose those names!
+>>>
+>>> The three loops are over source files, their lists of dependency pagespecs,
+>>> and files that might have changed. I see the following things we might be
+>>> doing redundantly:
+>>>
+>>> * If `$file` is considered as a potential dependency for more than
+>>> one `$f`, we evaluate `pagename($file)` more than once. Potential fix:
+>>> cache them (this turns out to save about half a second on the docwiki,
+>>> see below).
+>>> * If several pages depend on the same pagespec, we evaluate whether each
+>>> changed page matches that pagespec more than once: however, we do so
+>>> with a different location parameter every time, so repeated calls are,
+>>> in the general case, the only correct thing to do. Potential fix:
+>>> perhaps special-case "page x depends on page y and nothing else"
+>>> (i.e. globs that have no wildcards) into a separate hash? I haven't
+>>> done anything in this direction.
+>>> * Any preparatory work done by pagespec_match (converting the pagespec
+>>> into Perl, mostly?) is done in the inner loop; switching to
+>>> pagespec_match_list (significant refactoring) saves more than half a
+>>> second on the docwiki.
+>>>
+>>> --[[smcv]]
+
+Very good catch on img/meta using the wrong dependency; verified in the wild!
+(I've cherry-picked those bug fixes.)
+
+----
+
+Benchmarking results: I benchmarked by altering docwiki.setup to switch off
+verbose, running "make clean && ./Makefile.PL && make", and timing one rebuild
+of the docwiki followed by three refreshes. Before each refresh I used
+`touch plugins/*.mdwn` to have something significant to refresh.
+
+I'm assuming that "user" CPU time is the important thing here (system time was
+relatively small in all cases, up to 0.35 seconds per run).
+
+master at the time of rebasing: 14.20s to rebuild, 10.04/12.07/14.01s to
+refresh. I think you can see the bug clearly here - the pagespecs are getting
+more complicated every time!
+
+> I can totally see a bug here, and it's one I didn't think existed. Ie,
+> I thought that after the first refresh, the pagespec should stabilize,
+> and what it stabilized to was probably unnecessarily long, but not
+> growing w/o bounds!
+>
+> a) Explains why ikiwiki.info has been so slow lately. Well that and some
+> other things that overloaded the system.
+> b) Suggests to me we will probably want to force a rebuild on upgrade
+> when fixing this (via the mechanism in the postinst).
+>
+> I've investigated why the pagespecs keep growing: When page A changes,
+> its old depends are cleared. Then
+> page B that inlines A gets rebuilt, and its old depends are also cleared.
+> But page B also inlines page C; which means C gets re-rendered. And this
+> happens w/o its old depends being cleared, so C's depends are doubled.
+> --[[Joey]]
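+
+That growth can be illustrated with a toy model (hypothetical page names
+and pagespecs; `refresh` mimics one refresh cycle in which C is re-rendered
+without its old depends being cleared):

```python
def refresh(deps):
    # A changed: its depends are cleared and recomputed
    deps["A"] = ["spec_A"]
    # B inlines A, so B is rebuilt; its depends are also cleared first
    deps["B"] = ["spec_B"]
    # B also inlines C, so C is re-rendered -- but WITHOUT clearing its
    # old depends first, so the same pagespecs are appended again
    deps["C"] = deps["C"] + deps["C"]

deps = {"A": ["spec_A"], "B": ["spec_B"], "C": ["spec_C"]}
for _ in range(3):
    refresh(deps)
# after three refreshes, C carries 8 copies of the same dependency
```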
+
+After the initial optimization: 14.27s to rebuild, 8.26/8.33/8.26 to refresh.
+Success!
+
+Not pre-joining dependencies actually took ~0.2s more; I don't know why.
+I'm worried that duplicates will just build up (again) in less simple cases,
+though, so 0.2s is probably a small price to pay for that not happening (it
+might well be experimental error, for that matter).
+
+> It's weird that the suggested optimisations to
+> `add_depends` had no effect. So, the commit message to
+> b6fcb1cb0ef27e5a63184440675d465fad652acf is actually wrong.. ? --[[Joey]]
+
+>> I'll try benchmarking again on the non-public wiki where I had the 4%
+>> speedup. The docwiki is so small that 4% is hard to measure... --[[smcv]]
+
+Not saving {depends} to the index, using a hash instead of a list to
+de-duplicate, and allowing add_depends to take an arrayref instead of a single
+pagespec had no noticeable positive or negative effect on this test.
+
+> I see e4cd168ebedd95585290c97ff42234344bfed46c is still in your branch
+> though. I don't like using an arrayref; it could just take `($page, @depends)`,
+> and I don't see the need to keep it if it doesn't currently help.
+
+>> I'll drop it. --[[smcv]]
+
+> Is there any reason to keep 7227c2debfeef94b35f7d81f42900aa01820caa3
+> if it doesn't improve speed?
+> --[[Joey]]
+
+>> I'll try benchmarking on a more complex wiki and see whether it has a
+>> positive or negative effect. It does avoid being O(n**2) in the number
+>> of dependencies. --[[smcv]]
+
+Memoizing the results of pagename brought the rebuild time down to 14.06s
+and the refresh time down to 7.96/7.92/7.92, a significant win.
+
+> Ok, that seems safe to memoize. (It's a real function and it isn't
+> called with a great many inputs.) Why did you choose to memoize it
+> explicitly rather than adding it to the memoize list at the top?
+
+>> It does depend on global variables, so using Memoize seemed like asking for
+>> trouble. I suppose what I did is equivalent to Memoize though... --[[smcv]]
+
+Refactoring to use pagespec_match_list looks more risky from a code churn
+point of view; rebuild now takes 14.35s, but refresh is only 7.30/7.29/7.28,
+another significant win.
+
+--[[smcv]]
+
+> I had mostly convinced myself that
+> `pagespec_match_list` would not lead to a speed gain here. My reasoning
+> was that you want to stop after finding one match, while `pagespec_match_list`
+> checks all pages for matches. So what we're seeing is that
+> on a rebuild, `@changed` is all pages, and not short-circuiting leads
+> to unnecessary work. OTOH, on refresh, `@changed` is small and I suppose
+> `pagespec_match_list`'s other slight efficiencies win out somehow.
+>
+> Welcome to the "I made ikiwiki twice as fast
+> and all I got was this lousy git sha1sum" club BTW :-) --[[Joey]]
+
+[[!tag wishlist patch patch/core]]
diff --git a/doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn b/doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn
new file mode 100644
index 000000000..a454d7da5
--- /dev/null
+++ b/doc/todo/should_use_a_standard_encoding_for_utf_chars_in_filenames.mdwn
@@ -0,0 +1,61 @@
+It seems that I can't use Polish characters in a post title.
+When I try to do it, then I can see error message: "Błąd: bad page name".
+
+I hope it's a bug, not a feature and you fix it soon :) --[[Paweł|ptecza]]
+
+> ikiwiki only allows a very limited set of characters raw in page names,
+> this is done as a deny-by-default security thing. All other characters
+> need to be encoded in __code__ format, where "code" is the character
+> number. This is normally done for you, but if you're adding a page
+> manually, you need to handle it yourself. --[[Joey]]
+
+>> Assume I have my own blog and I want to send a new post with Polish
+>> characters in a title. I think it's totally normal and common thing
+>> in our times. Do you want to tell me I shouldn't use my native
+>> characters in the title? It can't be true ;)
+
+>> In my opinion encoding of title is a job for the wiki engine,
+>> not for me. Joey, please try to look at a problem from my point
+>> of view. I'm only a user and I don't have to understand
+>> what the character number is. I only want to blog :)
+
+>> BTW, why don't you use the modified-UTF7 coding for page names
+>> as used in IMAP folder names with non-Latin letters? --[[Paweł|ptecza]]
+
+>>> Joey, do you intend to fix that bug or it's a feature
+>>> for you? ;) --[[Paweł|ptecza]]
+
+>>>> Of course you can put Polish characters in the title. But the page
+>>>> title and filename are not identical. Ikiwiki has to place some limits
+>>>> on what filenames are legal to prevent abuse. Since
+>>>> the safest thing to do in a security context is to deny by default and
+>>>> only allow a few well-defined safe things, that's what it does, so
+>>>> filenames are limited to basic alphanumeric characters.
+>>>>
+>>>> It's not especially hard to transform your title into a legal
+>>>> ikiwiki filename:
+
+ joey@kodama:~>perl -MIkiWiki -le 'print IkiWiki::titlepage(shift).".mdwn"' "Błąd"
+ B__197____130____196____133__d.mdwn
+
+>>>>> Thanks for the hint! It's good for me, but rather not for common users :)
+
+>>>>>> Interesting... I have another result:
+>>>>>>
+>>>>>>     perl -MIkiWiki -le 'print IkiWiki::titlepage(shift).".mdwn"' "Błąd"
+>>>>>>     B__179____177__d.mdwn
+>>>>>>
+>>>>>> What's your locale? I have both pl\_PL (ISO-8859-2) and pl\_PL.UTF-8,
+>>>>>> but I use pl\_PL. Is it wrong? --[[Paweł|ptecza]]
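+
+The `__N__` escaping seen above can be sketched as follows (an
+approximation, not the real `titlepage`: it assumes each byte of the
+title's UTF-8 encoding outside basic alphanumerics is escaped as its
+decimal value; under an ISO-8859-2 locale, ł and ą are the single bytes
+179 and 177, which is why the second result above differs):

```python
def titlepage_sketch(title):
    # escape each non-alphanumeric byte of the UTF-8 encoding as
    # __N__, where N is the decimal byte value
    out = []
    for byte in title.encode("utf-8"):
        ch = chr(byte)
        if ch.isascii() and ch.isalnum():
            out.append(ch)
        else:
            out.append("__%d__" % byte)
    return "".join(out)

print(titlepage_sketch("Błąd") + ".mdwn")
# B__197____130____196____133__d.mdwn
```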
+
+>>>> Now, as to UTF7, in retrospect, using a standard encoding might be a
+>>>> better idea than coming up with my own encoding for filenames. Can
+>>>> you provide a pointer to a description to modified-UTF7? --[[Joey]]
+
+>>>>> The modified form of UTF7 is defined in [RFC 2060](http://www.ietf.org/rfc/rfc2060.txt)
+>>>>> for IMAP4 protocol (please see section 5.1.3 for details).
+
+>>>>> There is a Perl [Unicode::IMAPUtf7](http://search.cpan.org/~fabpot/Unicode-IMAPUtf7-2.01/lib/Unicode/IMAPUtf7.pm)
+>>>>> module on CPAN, but it probably hasn't been debianized yet :( --[[Paweł|ptecza]]
+
+[[wishlist]]
diff --git a/doc/todo/sigs.mdwn b/doc/todo/sigs.mdwn
new file mode 100644
index 000000000..f4f8edf13
--- /dev/null
+++ b/doc/todo/sigs.mdwn
@@ -0,0 +1,25 @@
+Need a way to sign name in page that's easier to type than "--\[[Joey]]"
+and that includes the date.
+
+What syntax do other wikis use for this? I'm considering "\[[--]]"
+as it has a nice mnemonic.
+
+OTOH, adding additional syntax for this would be counter to one of the
+design goals for ikiwiki: keeping as much markup as possible out of the
+wiki and not adding nonstandard markup. And it's not significantly hard to
+type "--\[[Joey]]", and as to the date, we do have page history.
+
+I'm also unsure how to possibly implement this. Seems ikiwiki would need to
+expand the rune to the user's name when a page is saved, but that leaves
+out svn commits.
+
+---
+
+Or, just make a sig plugin that expands `~~~~` and `~~~` as wikipedia does.
+The plugin could be an editcontent hook, so it would take effect only when a
+page was edited via the web.
+
+I tried implementing this, but to make the link to the user, I wanted to
+use `userlink()`, which generates html. But the right thing to generate is
+really a wikilink. Except for openid, when the best thing to generate is a
+markdown link. Except when the page isn't formatted in markdown..
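+
+A sketch of such an editcontent hook (Python for illustration; it
+sidesteps the `userlink()` problem by always emitting a plain wikilink,
+which is only right for non-openid users on markdown pages):

```python
from datetime import date

def expand_sigs(content, user):
    # hypothetical editcontent hook: expands the Wikipedia-style runes;
    # running only on web edits means direct VCS commits are unaffected
    sig = "-- [[%s]]" % user
    # replace the four-tilde form first so it isn't eaten by the three-tilde one
    content = content.replace("~~~~", "%s %s" % (sig, date.today().isoformat()))
    content = content.replace("~~~", sig)
    return content
```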
diff --git a/doc/todo/sigs/discussion.mdwn b/doc/todo/sigs/discussion.mdwn
new file mode 100644
index 000000000..4ae2aae87
--- /dev/null
+++ b/doc/todo/sigs/discussion.mdwn
@@ -0,0 +1 @@
+TWiki has the signature text in the edit page. You manually copy and paste it into the text area. \ No newline at end of file
diff --git a/doc/todo/simple_text_parsing_or_regex_in_template_or_shortcut.mdwn b/doc/todo/simple_text_parsing_or_regex_in_template_or_shortcut.mdwn
new file mode 100644
index 000000000..3ff8b9ef6
--- /dev/null
+++ b/doc/todo/simple_text_parsing_or_regex_in_template_or_shortcut.mdwn
@@ -0,0 +1,32 @@
+Either [[plugins/template]] or [[plugins/shortcut]] should support some form
+of very simple text parsing or regex application, to make it possible to write
+shortcuts like these:
+
+    [[!mmlist listname@lists.example.org]] -> <listname@lists.example.org> ([mailman page](http://lists.example.org/mailman/listinfo/listname))
+ [[!debcl packagename]] -> [packagename changelog](http://packages.debian.org/changelogs/pool/main/p/packagename/current/changelog)
+
+For shortcut definitions, a `match` parameter could supply a regex, and then the `url` and `desc` parameters could make use of the named or numbered groups from the match.
+
+--[[JoshTriplett]]
+
+I'm not comfortable with exposing regexps to web editing. At the very least
+it's trivial to construct regexps that take indefinitely long to match
+certain strings, which could be used to DOS ikiwiki. At worst, perl code
+can be embedded in regexps in a variety of ways that are painful to filter
+out, and perl's regexp engine could also potentially have bugs that could
+be exploited by user-supplied regexps.
+
+It seems that a better place to put this kind of text munging is in
+special-purpose plugins. It should be very simple to write plugins for the
+above two examples that look to the user just like what you described.
+
+--[[Joey]]
+
+Fair enough. I only proposed regexes for the purposes of generality.
+
+That said, some simple text substitution mechanisms might handle many of these
+cases without the need for a specialized plugin beyond [[plugins/shortcut]].
+For instance, substring extraction would suffice for the `debcl` shortcut, and
+something like a split function would work for the `mmlist` shortcut.
+
+--[[JoshTriplett]]
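+
+For the record, both examples really do reduce to a split and a substring;
+a sketch (hypothetical helper names, and ignoring Debian's `lib*` pool
+special case):

```python
def mmlist(address):
    # split-based: listname@host -> address plus link to its mailman page
    listname, host = address.split("@", 1)
    return "<%s> ([mailman page](http://%s/mailman/listinfo/%s))" % (
        address, host, listname)

def debcl(package):
    # substring-based: the Debian pool section is the package's first letter
    return ("[%s changelog](http://packages.debian.org/changelogs/"
            "pool/main/%s/%s/current/changelog)" % (package, package[0], package))
```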
diff --git a/doc/todo/skip_option_for_inline_plugin.mdwn b/doc/todo/skip_option_for_inline_plugin.mdwn
new file mode 100644
index 000000000..f37d75ccb
--- /dev/null
+++ b/doc/todo/skip_option_for_inline_plugin.mdwn
@@ -0,0 +1,8 @@
+How about a skip option for [[plugins/inline]]? This would allow things like the following:
+
+ \[[!inline pages="news/*" show="5"]]
+ \[[!inline pages="news/*" skip="5" show="5" archive="yes"]]
+
+> I just wrote a patch. --Ethan
+
+[[todo/done]] --[[Joey]]
diff --git a/doc/todo/smarter_sorting.mdwn b/doc/todo/smarter_sorting.mdwn
new file mode 100644
index 000000000..901e143a7
--- /dev/null
+++ b/doc/todo/smarter_sorting.mdwn
@@ -0,0 +1,141 @@
+I benchmarked a build of a large wiki (my home wiki), and it was spending
+quite a lot of time sorting; `CORE::sort` was called only 1138 times, but
+still flagged as the #1 time sink. (I'm not sure I trust NYTProf fully
+about that FWIW, since it also said 27238263 calls to `cmp_age` were
+the #3 timesink, and I suspect it may not entirely accurately measure
+the overhead of so many short function calls.)
+
+`pagespec_match_list` currently always sorts *all* pages first, and then
+finds the top M that match the pagespec. That's inefficient when M is
+small (as for example in a typical blog, where only 20 posts are shown,
+out of maybe thousands).
+
+As [[smcv]] noted, it could be flipped, so the pagespec is applied first,
+and the smaller matching set sorted afterwards. But, checking pagespecs is
+likely more expensive than sorting. (Also, influence calculation
+complicates doing that.)
+
+Another option, when there is a limit on M pages to return, might be to
+cull the M top pages without sorting the rest.
+
+> The patch below implements this.
+>
+> But, I have not thought enough about influence calculation.
+> I need to figure out which pagespec matches influences need to be
+> accumulated for, in order to determine that all possible influences of a
+> pagespec are known.
+>
+> The old code accumulates influences from matching all successful pages
+> up to the num cutoff, as well as influences from an arbitrary (sometimes
+> zero) number of failed matches. New code does not accumulate influences
+> from all the top successful matches, only an arbitrary group of
+> successes and some failures.
+>
+> Also, by the time I finished this, it was not measurably faster than
+> the old method. At least not with a few thousand pages; it
+> might be worth revisiting this sometime for many more pages? [[done]]
+> --[[Joey]]
+
+<pre>
+diff --git a/IkiWiki.pm b/IkiWiki.pm
+index 1730e47..bc8b23d 100644
+--- a/IkiWiki.pm
++++ b/IkiWiki.pm
+@@ -2122,36 +2122,54 @@ sub pagespec_match_list ($$;@) {
+ my $num=$params{num};
+ delete @params{qw{num deptype reverse sort filter list}};
+
+- # when only the top matches will be returned, it's efficient to
+- # sort before matching to pagespec,
+- if (defined $num && defined $sort) {
+- @candidates=IkiWiki::SortSpec::sort_pages(
+- $sort, @candidates);
+- }
+-
++ # Find the first num matches (or all), before sorting.
+ my @matches;
+- my $firstfail;
+ my $count=0;
+ my $accum=IkiWiki::SuccessReason->new();
+- foreach my $p (@candidates) {
+- my $r=$sub->($p, %params, location => $page);
++ my $i;
++ for ($i=0; $i < @candidates; $i++) {
++ my $r=$sub->($candidates[$i], %params, location => $page);
+ error(sprintf(gettext("cannot match pages: %s"), $r))
+ if $r->isa("IkiWiki::ErrorReason");
+ $accum |= $r;
+ if ($r) {
+- push @matches, $p;
++ push @matches, $candidates[$i];
+ last if defined $num && ++$count == $num;
+ }
+ }
+
++	# We have num matches, but they may not be the best.
++ # Efficiently find and add the rest, without sorting the full list of
++ # candidates.
++ if (defined $num && defined $sort) {
++ @matches=IkiWiki::SortSpec::sort_pages($sort, @matches);
++
++ for ($i++; $i < @candidates; $i++) {
++ # Comparing candidate with lowest match is cheaper,
++ # so it's done before testing against pagespec.
++ if (IkiWiki::SortSpec::cmptwo($candidates[$i], $matches[-1], $sort) < 0 &&
++ $sub->($candidates[$i], %params, location => $page)
++ ) {
++ # this could be done less expensively
++ # using a binary search
++ for (my $j=0; $j < @matches; $j++) {
++ if (IkiWiki::SortSpec::cmptwo($candidates[$i], $matches[$j], $sort) < 0) {
++ splice @matches, $j, $#matches-$j+1, $candidates[$i],
++ @matches[$j..$#matches-1];
++ last;
++ }
++ }
++ }
++ }
++ }
++
+ # Add simple dependencies for accumulated influences.
+- my $i=$accum->influences;
+- foreach my $k (keys %$i) {
+- $depends_simple{$page}{lc $k} |= $i->{$k};
++ my $inf=$accum->influences;
++ foreach my $k (keys %$inf) {
++ $depends_simple{$page}{lc $k} |= $inf->{$k};
+ }
+
+- # when all matches will be returned, it's efficient to
+- # sort after matching
++ # Sort if we didn't already.
+ if (! defined $num && defined $sort) {
+ return IkiWiki::SortSpec::sort_pages(
+ $sort, @matches);
+@@ -2455,6 +2473,12 @@ sub sort_pages {
+ sort $f @_
+ }
+
++sub cmptwo {
++ $a=$_[0];
++ $b=$_[1];
++ $_[2]->();
++}
++
+ sub cmp_title {
+ IkiWiki::pagetitle(IkiWiki::basename($a))
+ cmp
+</pre>
+
+This would be bad when M is very large, and particularly, of course, when
+there is no limit and all pages are being matched on. (For example, an
+archive page shows all pages that match a pagespec specifying a creation
+date range.) Well, in this case, it *does* make sense to flip it, limit by
+pagespec first, and do a (quick)sort second. (No influence complications,
+either.)
+
+> Flipping when there's no limit is implemented, and it knocked 1/3 off
+> the rebuild time of my blog's archive pages. --[[Joey]]
+
+Adding these special cases will be more complicated, but I think the best
+of both worlds. --[[Joey]]
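+
+The patch's strategy can be sketched in Python (a sketch only:
+`matches_spec` stands in for the compiled pagespec test and `key` for the
+sort key; a `bisect` insertion replaces the linear splice loop the patch
+comment flags as a candidate for binary search):

```python
import bisect

def top_matches(candidates, matches_spec, key, num=None):
    # With no limit, flip: filter by pagespec first, then sort.
    if num is None:
        return sorted((p for p in candidates if matches_spec(p)), key=key)
    # Phase 1: take the first num matches, in whatever order they come.
    matches = []
    it = iter(candidates)
    for p in it:
        if matches_spec(p):
            matches.append(p)
            if len(matches) == num:
                break
    matches.sort(key=key)
    keys = [key(p) for p in matches]
    # Phase 2: a later candidate can only displace the current worst
    # match, so the cheap key comparison guards the expensive pagespec
    # test; bisect keeps matches sorted without a full resort.
    for p in it:
        k = key(p)
        if k < keys[-1] and matches_spec(p):
            i = bisect.bisect_left(keys, k)
            keys.insert(i, k)
            matches.insert(i, p)
            del keys[-1], matches[-1]
    return matches
```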
diff --git a/doc/todo/smileys_do_not_work_in_PreprocessorDirective_arguments.mdwn b/doc/todo/smileys_do_not_work_in_PreprocessorDirective_arguments.mdwn
new file mode 100644
index 000000000..06c06e191
--- /dev/null
+++ b/doc/todo/smileys_do_not_work_in_PreprocessorDirective_arguments.mdwn
@@ -0,0 +1,18 @@
+Several [[ikiwiki/Directive]]s take ikiwiki-formatted text as arguments,
+such as the `then` and `else` arguments of the new `if` directive, or the
+`desc` argument of the `shortcut` directive. However, smileys do not work in
+these arguments.
+
+Since the arguments to [[ikiwiki/Directive]]s might use the same syntax as
+smileys for a different meaning, smiley substitution should not happen until
+after [[ikiwiki/Directive]]s.
+
+--[[JoshTriplett]]
+
+> Sorry, I should have filed this under [[bugs]], not [[todo]].
+>
+> Also, for an example of this issue, consider the sample conditional on [[plugins/conditional]].
+>
+> --[[JoshTriplett]]
+
+[[todo/done]] --[[Joey]]
diff --git a/doc/todo/softlinks.mdwn b/doc/todo/softlinks.mdwn
new file mode 100644
index 000000000..9c5e3fac1
--- /dev/null
+++ b/doc/todo/softlinks.mdwn
@@ -0,0 +1,14 @@
+If I have a filesystem soft-link, e.g. "foo.mdwn" links to "bar.mdwn", it doesn't work.
+The page "foo/" does not exist.
+
+This is too bad, because sometimes it is convenient to have several different names for the same page.
+
+Could softlinks be handled gracefully by ikiwiki?
+
+> Soft links are explicitly not handled by IkiWiki as a [[security]] measure. If you want several names for
+> the same page, I suggest using the [[ikiwiki/directive/meta]] directive to make one page redirect to
+> another. -- [[Will]]
+
+>> Will is right. I don't plan to support symlinks. [[done]] --[[Joey]]
+
+>> With the appropriate template, inline can also help copy pages around. --[[DavidBremner]]
diff --git a/doc/todo/sort_parameter_for_map_plugin_and_directive.mdwn b/doc/todo/sort_parameter_for_map_plugin_and_directive.mdwn
new file mode 100644
index 000000000..b07ea33f1
--- /dev/null
+++ b/doc/todo/sort_parameter_for_map_plugin_and_directive.mdwn
@@ -0,0 +1,53 @@
+## sort= parameter
+
+Having a `sort=` parameter for the map plugin/directive would be real nice; like `inline`'s parameter, with `age`, `title`, etc.
+
+I may hack one in from `inline` if it seems within my skill level.
+
+> this could leverage the [[sorting mechanism|ikiwiki/pagespec/sorting]] already in place. as it's not sorting a flat list, there's a number of different ways to sort, which should be configurable imo.
+>
+> as an example, i'll consider pages created in the sequence c/1, a, b, c, a/1, c/2.
+>
+> sorting could:
+>
+> * sort within each level:
+>
+> sorting order of child nodes would only matter internally in the groups
+>
+> that would create a (a/1) b c (c/1 c/2) sequence in our example.
+>
+> * sort by maximum
+>
+> the highest ranking page in a group would pull the parent to its own position
+>
+> that would create b a (a/1) c (c/1 c/2).
+>
+> * sort by minimum
+>
+> the lowest ranking page in a group would pull the parent to its own position
+>
+> here, that would give c (c/1 c/2) a (a/1) b
+>
+> * forced sequence
+>
+> all deepest-level items are forced to their positions, even if that means their parents are repeated at positions where they wouldn't occur naturally. parent nodes that don't have child nodes that occur directly before or after them are shown without the child nodes.
+>
+> that'd be c (c/1) a b c a (a/1) c (c/2) in our example.
+>
+> admittedly, the use cases for that are not too obvious, but think of a travel diary, for example, where you'd have the entries chronologically but grouped by the country you've visited. when you visit the same country twice, it should show up twice too.
+>
+> --[[chrysn]]
+
+------
+
+> i now do have two thirds of the solution:
+>
+> * i've patched the map plugin to accept a sort parameter (as usual in pagespec directives) and a strategy parameter, which is used to choose how the tree should be sorted. it turned out that the changes required were minimal; even precautions for having to display a node's parents although they are not supposed to be shown by themselves are present (they're decorated with the mapparent css class).
+> * i've implemented algorithms for the described strategies, but in python -- i tried in perl, but i'm not versed well enough in perl for such things. the "force" strategy works in perl but i'm afraid it depends on more than the perl sort algorithm to be just stable.
+> * if someone could port the three strategies implemented in python to perl, we'd have a complete patch for this.
+>
+> when comparing the implementation to my notes above, you'll see that there is a minor difference in the "force" algorithm -- my code doesn't generate the "parent" entries (**c** (c/1) a b c **a** (a/1) **c** (c/2) in the example), but they're generated by the already existing output code.
+>
+> the code can be found at [[incomplete_patch.pl.pl]] and [[python_algorithms.py]]. --[[chrysn]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/sort_parameter_for_map_plugin_and_directive/incomplete_patch.pl.pl b/doc/todo/sort_parameter_for_map_plugin_and_directive/incomplete_patch.pl.pl
new file mode 100644
index 000000000..1297be663
--- /dev/null
+++ b/doc/todo/sort_parameter_for_map_plugin_and_directive/incomplete_patch.pl.pl
@@ -0,0 +1,77 @@
+diff --git a/IkiWiki/Plugin/map.pm b/IkiWiki/Plugin/map.pm
+index 38f090f..6b884cd 100644
+--- a/IkiWiki/Plugin/map.pm
++++ b/IkiWiki/Plugin/map.pm
+@@ -25,6 +25,42 @@ sub getsetup () {
+ },
+ }
+
++sub strategy_byparents (@) {
++ # Sort by parents only
++ #
++ # With this strategy, children are sorted *under* their parents
++ # regardless of their own position, and the parents' positions are
++ # determined only by comparing the parents themselves.
++
++ # FIXME this is *not* what's described above, but the old behavior (for
++ # testing/comparison)
++ use sort 'stable';
++ my (@sequence,) = @_;
++ @sequence = sort @sequence;
++ return @sequence;
++}
++
++sub strategy_forcedsequence (@) {
++ # Forced Sequence Mode
++ #
++ # Using this strategy, all entries will be shown in the sequence; this
++ # can cause parents to show up multiple times.
++ #
++	# The only reason why this is not the identity function is that
++ # parents that are sorted between their children are bubbled up to the
++ # top of their contiguous children to avoid being repeated in the
++ # output.
++
++ use sort 'stable';
++
++ my (@sequence,) = @_;
++ # FIXME: i'm surprised that this actually works. i'd expect this to
++ # work with bubblesort, but i'm afraid that this may just not yield the
++ # correct results with mergesort.
++ @sequence = sort {($b eq substr($a, 0, length($b))) - ($a eq substr($b, 0, length($a)))} @sequence;
++ return @sequence;
++}
++
+ sub preprocess (@) {
+ my %params=@_;
+ $params{pages}="*" unless defined $params{pages};
+@@ -37,8 +73,11 @@ sub preprocess (@) {
+
+ # Get all the items to map.
+ my %mapitems;
++ my @mapsequence;
+ foreach my $page (pagespec_match_list($params{page}, $params{pages},
+- deptype => $deptype)) {
++ deptype => $deptype,
++ sort => exists $params{sort} ? $params{sort} : "title")) {
++ push(@mapsequence, $page);
+ if (exists $params{show} &&
+ exists $pagestate{$page} &&
+ exists $pagestate{$page}{meta}{$params{show}}) {
+@@ -88,7 +127,15 @@ sub preprocess (@) {
+ $map .= "<ul>\n";
+ }
+
+- foreach my $item (sort keys %mapitems) {
++ if (!exists $params{strategy} || $params{strategy} eq "parent") {
++ @mapsequence = strategy_byparents(@mapsequence);
++ } elsif ($params{strategy} eq "forced") {
++ @mapsequence = strategy_forcedsequence(@mapsequence);
++ } else {
++ error("Unknown strategy.");
++ }
++
++ foreach my $item (@mapsequence) {
+ my @linktext = (length $mapitems{$item} ? (linktext => $mapitems{$item}) : ());
+ $item=~s/^\Q$common_prefix\E\///
+ if defined $common_prefix && length $common_prefix;
diff --git a/doc/todo/sort_parameter_for_map_plugin_and_directive/python_algorithms.py b/doc/todo/sort_parameter_for_map_plugin_and_directive/python_algorithms.py
new file mode 100644
index 000000000..e89c54fae
--- /dev/null
+++ b/doc/todo/sort_parameter_for_map_plugin_and_directive/python_algorithms.py
@@ -0,0 +1,86 @@
+testdata = "c/3 a b d b/1 c/1 c/2/x c/2 c".split(" ")
+
+def strategy_byearlychild(sequence):
+ """Sort by earliest child
+
+ When this strategy is used, a parent is displayed with all its children as
+ soon as the first child is supposed to be shown.
+
+ >>> strategy_byearlychild(testdata)
+ ['c', 'c/3', 'c/1', 'c/2', 'c/2/x', 'a', 'b', 'b/1', 'd']
+ """
+
+ # first step: pull parents to top
+ def firstchildindex(item):
+ childindices = [i for (i,text) in enumerate(sequence) if text.startswith(item + "/")]
+ # distinction required as min(foo, *[]) tries to iterate over foo
+ if childindices:
+ return min(sequence.index(item), *childindices)
+ else:
+ return sequence.index(item)
+ sequence = sorted(sequence, key=firstchildindex)
+
+ # second step: pull other children to the start too
+ return strategy_byparents(sequence)
+
+def strategy_byparents(sequence):
+ """Sort by parents only
+
+ With this strategy, children are sorted *under* their parents regardless of
+ their own position, and the parents' positions are determined only by
+ comparing the parents themselves.
+
+ >>> strategy_byparents(testdata)
+ ['a', 'b', 'b/1', 'd', 'c', 'c/3', 'c/1', 'c/2', 'c/2/x']
+ """
+
+ def partindices(item):
+        """Convert an entry to a tuple of the indices of the entry's parts.
+
+ >>> sequence = testsequence
+ >>> assert partindices("c/2/x") == (sequence.index("c"), sequence.index("c/2"), sequence.index("c/2/x"))
+ """
+ return tuple(sequence.index(item.rsplit('/', i)[0]) for i in range(item.count('/'), -1, -1))
+
+ return sorted(sequence, key=partindices)
+
+def strategy_forcedsequence(sequence):
+ """Forced Sequence Mode
+
+ Using this strategy, all entries will be shown in the sequence; this can
+ cause parents to show up multiple times.
+
+    The only reason why this is not the identity function is that parents
+    that are sorted between their children are bubbled up to the top of
+    their contiguous children to avoid being repeated in the output.
+
+ >>> strategy_forcedsequence(testdata)
+ ['c/3', 'a', 'b', 'd', 'b/1', 'c', 'c/1', 'c/2', 'c/2/x']
+ """
+
+ # this is a classical bubblesort. other algorithms wouldn't work because
+ # they'd compare non-adjacent entries and move the parents before remote
+ # children. python's timsort seems to work too...
+
+ for i in range(len(sequence), 1, -1):
+ for j in range(1, i):
+ if sequence[j-1].startswith(sequence[j] + '/'):
+ sequence[j-1:j+1] = [sequence[j], sequence[j-1]]
+
+ return sequence
+
+def strategy_forcedsequence_timsort(sequence):
+    # cmp variant for a stable sort: a parent sorts before its own
+    # children; unrelated entries compare equal and keep their order
+    sequence.sort(lambda x, y: -1 if y.startswith(x + '/') else (1 if x.startswith(y + '/') else 0))
+    return sequence
+
+if __name__ == "__main__":
+ import doctest
+ doctest.testmod()
+
+ import itertools
+
+    for perm in itertools.permutations(testdata):
+        perm = list(perm)
+        if strategy_forcedsequence(perm[:]) != strategy_forcedsequence_timsort(perm[:]):
+            print "difference for sequence", perm
+            print "normal", strategy_forcedsequence(perm[:])
+            print "timsort", strategy_forcedsequence_timsort(perm[:])
diff --git a/doc/todo/sortable_tables.mdwn b/doc/todo/sortable_tables.mdwn
new file mode 100644
index 000000000..8e7e6fe25
--- /dev/null
+++ b/doc/todo/sortable_tables.mdwn
@@ -0,0 +1 @@
+It would be nice if ikiwiki's table plugin could create sortable tables like Mediawiki does.
diff --git a/doc/todo/sortbylastcomment_plugin.mdwn b/doc/todo/sortbylastcomment_plugin.mdwn
new file mode 100644
index 000000000..84cf86e21
--- /dev/null
+++ b/doc/todo/sortbylastcomment_plugin.mdwn
@@ -0,0 +1,13 @@
+This plugin provides the `last_comment` [[ikiwiki/pagespec/sorting]] order, which sorts pages by the modification time of their last comment. It also updates the mtime of the page to this value.
+
+For example, it could be useful to make active threads of discussion appear on top of the list of threads in a forum.
+
+You'll find it in this repository, in the 'sortbylastcomment' branch:
+
+<https://un.poivron.org/~sajolida/ikiwiki.git/>
+
+[[!tag wishlist patch]]
+
+> Reviewed, tested: looks good to me. We need it for the [Tails forum](https://tails.boum.org/forum/). --[[intrigeri]]
+
+>> Hi, is there a chance of seeing this plugin getting included in a release at any point soon? --sajolida
diff --git a/doc/todo/sorting_by_path.mdwn b/doc/todo/sorting_by_path.mdwn
new file mode 100644
index 000000000..a483c331a
--- /dev/null
+++ b/doc/todo/sorting_by_path.mdwn
@@ -0,0 +1,18 @@
+[[!tag patch]]
+[[!template id=gitbranch branch=smcv/trail3 author="[[smcv]]"]]
+
+My branch for [[plugins/contrib/trail]] also includes `path`
+and `path_natural` sort orders, which sort the entire page name,
+e.g. "a a/z ab ab/c b", much like [[ikiwiki/directive/map]].
+I used `path` as the default order for the
+[[plugins/contrib/ikiwiki/directive/trailitems]] directive,
+since it seemed the most sensible.
+([[plugins/contrib/ikiwiki/directive/trailinline]] uses
+`age` as its default, to be consistent with `inline`.)
+
+It's one commit (including a regression test) which can be
+cherry-picked if you don't want the rest of `trail`.
+
+--[[smcv]]
+
+> [[done]] --[[Joey]]
diff --git a/doc/todo/source_link.mdwn b/doc/todo/source_link.mdwn
new file mode 100644
index 000000000..cf3e69487
--- /dev/null
+++ b/doc/todo/source_link.mdwn
@@ -0,0 +1,135 @@
+How about a direct link from the page header to the source of the latest version, to avoid the need to either use edit or navigate to the current version via the history link?
+
+ I'd like this too (and might try to implement it). -- [[users/jon]]
+
+I just implemented this. There is one [[patch]] to the default page template, and a new plugin. -- [[Will]]
+
+All of this code is licensed under the GPLv2+. -- [[Will]]
+
+> The use of sessioncgi here seems undesirable: on wikis where anonymity is
+> not allowed, you'll be asked to log in. Couldn't you achieve the same thing
+> by loading the index with IkiWiki::loadindex, like [[plugins/goto]] does?
+> --[[smcv]]
+
+[[done]]
+
+>> I've applied the patch below in a git branch, fixed my earlier criticism,
+>> and also fixed a couple of other issues I noticed:
+>>
+>> * missing pages could be presented better as a real 404 page
+>> * the default Content-type should probably be UTF-8 since the rest of
+>> IkiWiki tends to assume that
+>> * emitting attachments (images, etc.) as text/plain isn't going to work :-)
+>>
+>> Any opinions on my branch? I think it's ready for merge, if Joey approves.
+>>
+>> --[[smcv]]
+
+>>> I need a copyright&license statement, so debian/copyright can be updated for
+>>> the plugin, before I can merge this. Otherwise ready. --[[Joey]]
+
+>>> That looks like a nice set of fixes. One more that might be worthwhile: instead of reading the page source into a var, and then writing it out later, it might be nice to just
+>>> `print readfile(srcfile($pagesources{$page}));` at the appropriate point. -- [[Will]]
+
+>>>> OK, I've committed that. --[[smcv]]
+
+----
+
+ diff --git a/templates/page.tmpl b/templates/page.tmpl
+ index f2f9c34..3176bed 100644
+ --- a/templates/page.tmpl
+ +++ b/templates/page.tmpl
+ @@ -46,6 +46,9 @@
+ <TMPL_IF NAME="HISTORYURL">
+ <li><a href="<TMPL_VAR HISTORYURL>">History</a></li>
+ </TMPL_IF>
+ +<TMPL_IF NAME="GETSOURCEURL">
+ +<li><a href="<TMPL_VAR GETSOURCEURL>">Get Source</a></li>
+ +</TMPL_IF>
+ <TMPL_IF NAME="PREFSURL">
+ <li><a href="<TMPL_VAR PREFSURL>">Preferences</a></li>
+ </TMPL_IF>
+
+----
+
+ #!/usr/bin/perl
+ package IkiWiki::Plugin::getsource;
+
+ use warnings;
+ use strict;
+ use IkiWiki;
+ use open qw{:utf8 :std};
+
+ sub import {
+ hook(type => "getsetup", id => "getsource", call => \&getsetup);
+ hook(type => "pagetemplate", id => "getsource", call => \&pagetemplate);
+ hook(type => "sessioncgi", id => "getsource", call => \&cgi_getsource);
+ }
+
+ sub getsetup () {
+ return
+ plugin => {
+ safe => 1,
+ rebuild => 1,
+ },
+ getsource_mimetype => {
+ type => "string",
+ example => "application/octet-stream",
+ description => "Mime type for returned source.",
+ safe => 1,
+ rebuild => 0,
+ },
+ }
+
+ sub pagetemplate (@) {
+ my %params=@_;
+
+ my $page=$params{page};
+ my $template=$params{template};
+
+ if (length $config{cgiurl}) {
+ $template->param(getsourceurl => IkiWiki::cgiurl(do => "getsource", page => $page));
+ $template->param(have_actions => 1);
+ }
+ }
+
+ sub cgi_getsource ($$) {
+ my $cgi=shift;
+ my $session=shift;
+
+ # Note: we use sessioncgi rather than just cgi
+ # because we need $IkiWiki::pagesources{} to be
+ # populated.
+
+ return unless (defined $cgi->param('do') &&
+ $cgi->param("do") eq "getsource");
+
+ IkiWiki::decode_cgi_utf8($cgi);
+
+ my $page=$cgi->param('page');
+
+ if ($IkiWiki::pagesources{$page}) {
+
+ my $data = IkiWiki::readfile(IkiWiki::srcfile($IkiWiki::pagesources{$page}));
+
+ if (! $config{getsource_mimetype}) {
+ $config{getsource_mimetype} = "text/plain";
+ }
+
+ print "Content-Type: $config{getsource_mimetype}\r\n";
+
+ print ("\r\n");
+
+ print $data;
+
+ exit 0;
+ }
+
+ error("Unable to find page source for page: $page");
+
+ exit 0;
+ }
+
+ 1
+
+[[done]] --[[smcv]]
diff --git a/doc/todo/spell_check_plug-in.mdwn b/doc/todo/spell_check_plug-in.mdwn
new file mode 100644
index 000000000..61f7f9a1d
--- /dev/null
+++ b/doc/todo/spell_check_plug-in.mdwn
@@ -0,0 +1,12 @@
+A speel chek plug-in woold be fantaztik. Anyone working on this?
+
+----
+
+Knot adz fair ass eye no --[[Joey]]
+
+----
+
+Firefox 2 (or whatever it will be in Debian) does this for you, and then there's the mozex extension
+
+> Yeah, IMHO gecko's spellchecker nailed this. Spellcheckers server-side
+> are now passe. Calling this [[done]] --[[Joey]]
diff --git a/doc/todo/strftime.mdwn b/doc/todo/strftime.mdwn
new file mode 100644
index 000000000..3c854391f
--- /dev/null
+++ b/doc/todo/strftime.mdwn
@@ -0,0 +1,4 @@
+There should be a --strftime switch that controls how all the dates are
+formatted.
+
+[[todo/done]]
diff --git a/doc/todo/structured_page_data.mdwn b/doc/todo/structured_page_data.mdwn
new file mode 100644
index 000000000..9f21fab7f
--- /dev/null
+++ b/doc/todo/structured_page_data.mdwn
@@ -0,0 +1,633 @@
+This is an idea from [[JoshTriplett]]. --[[Joey]]
+
+* See further discussion at [[forum/an_alternative_approach_to_structured_data]].
+
+Some uses of ikiwiki, such as for a bug-tracking system (BTS), move a bit away from the wiki end
+of the spectrum, and toward storing structured data about a page or instead
+of a page.
+
+For example, in a bug report you might want to choose a severity from a
+list, enter a version number, and have a bug submitter or owner recorded,
+etc. When editing online, it would be nice if these were separate fields on
+the form, rather than the data being edited in the big edit form.
+
+There's a tension here between remaining a wiki with human-editable source
+files, containing freeform markup, and more structured data storage. I
+think that it would be best to include the structured data in the page,
+using a directive. Something like:
+
+ part of page content
+ \[[data yaml="<arbitrary yaml here>"]]
+ rest of page content
+
+As long as the position of the directive is not significant, it could be
+stripped out when web editing, the yaml used to generate/populate form fields,
+and then on save, the directive regenerated and inserted at top/bottom of
+the page.
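That round trip could be sketched like this (hypothetical helper names; assumes a single `[[data yaml="..."]]` directive per page, as in the example above):

```python
import re

# Hypothetical: matches a single [[data yaml="..."]] directive in page source.
DATA_DIRECTIVE = re.compile(r'\[\[data yaml="(?P<yaml>[^"]*)"\]\]\n?')

def strip_data_directive(source):
    """On web edit: remove the directive, returning (body, yaml-or-None)."""
    match = DATA_DIRECTIVE.search(source)
    if match is None:
        return source, None
    return source[:match.start()] + source[match.end():], match.group('yaml')

def reinsert_data_directive(body, yaml):
    """On save: regenerate the directive at the bottom of the page."""
    if yaml is None:
        return body
    return body.rstrip('\n') + '\n[[data yaml="%s"]]\n' % yaml
```

Since the directive's position is not significant, the round trip is free to move it to the bottom of the page without changing the page's meaning.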
+
+Josh thinks that yaml is probably a good choice, but the source could be a
+`.yaml` file that contains no directives, and just yaml. An additional
+complication in this scenario is that, if the yaml included wiki-formatted
+content, ikiwiki would have to guess or be told what markup language it used.
+
+Either way, the yaml on the page would encode fields and their current content.
+Information about data types would be encoded elsewhere, probably on a
+parent page (using a separate directive). That way, all child pages could
+be forced to have the same fields.
+
+There would be some simple types like select, boolean, multiselect, string, wiki markup.
+Probably lists of these (i.e., lists of strings). Possibly more complex data
+structures.
+
+It should also be possible for plugins to define new types, and the type
+definitions should include validation of entered data, and how to prompt
+the user for the data.
+
+This seems conceptually straightforward, if possibly quite internally
+complex to handle the more complicated types and validation.
+
+One implementation wrinkle is how to build the html form. The editpage.tmpl
+currently overrides the standard [[!cpan CGI::FormBuilder]] generated form,
+which was done to make the edit page be laid out in a nice way. This,
+however, means that new fields cannot be easily added to it using
+[[!cpan CGI::FormBuilder]]. The attachment plugin uses the hack of building
+up HTML by hand and dumping it into the form via a template variable.
+
+It would be nice if the type implementation code could just use
+FormBuilder, since its automatic form generation and nice field-validation
+model are a perfect match for structured data. But this problem with
+editpage.tmpl would have to be sorted out to allow that.
+
+Additional tie-ins:
+
+* Pagespecs that can select pages with a field with a given value, etc.
+ This should use a pagespec function like field(fieldname, value). The
+ semantics of this will depend on the type of the field; text fields will
+ match value against the text, and link fields will check for a link
+ matching the pagespec value.
+* The search plugin could allow searching for specific fields with specific
+ content. (xapian term search is a good fit).
+
+See also:
+
+[[tracking_bugs_with_dependencies]]
+
+> I was also thinking about this for bug tracking. I'm not sure what
+> sort of structured data is wanted in a page, so I decided to brainstorm
+> use cases:
+>
+> * You just want the page to be pretty.
+> * You want to access the data from another page. This would be almost like
+> a database lookup, or the OpenOffice Calc [VLookup](http://wiki.services.openoffice.org/wiki/Documentation/How_Tos/Calc:_VLOOKUP_function) function.
+> * You want to make a pagespec depend upon the data. This could be used
+> for dependency tracking - you could match against pages listed as dependencies,
+> rather than all pages linked from a given page.
+>
+>The first use case is handled by having a template in the page creation. You could
+>have some type of form to edit the data, but that's just sugar on top of the template.
+>If you were going to have a web form to edit the data, I can imagine a few ways to do it:
+>
+> * Have a special page type which gets compiled into the form. The page type would
+> need to define the form as well as hold the stored data.
+> * Have special directives that allow you to insert form elements into a normal page.
+>
+>I'm happy with template based page creation as a first pass...
+>
+>The second use case could be handled by a regular expression directive. eg:
+>
+> \[[regex spec="myBug" regex="Depends: ([^\s]+)"]]
+>
+> The directive would be replaced with the match from the regex on the 'myBug' page... or something.
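That lookup could be sketched as follows (hypothetical names; `pages` stands in for a map of page names to their source text):

```python
import re

def regex_lookup(pages, spec, regex):
    """Expand to the first capture group of `regex` matched on page `spec`.

    Returns '' when the page or the match is missing, mimicking a
    directive that expands to empty output.
    """
    match = re.search(regex, pages.get(spec, ''))
    if match is None:
        return ''
    return match.group(1) if match.groups() else match.group(0)

pages = {'myBug': 'Severity: high\nDepends: myOtherBug\n'}
regex_lookup(pages, 'myBug', r'Depends: ([^\s]+)')  # -> 'myOtherBug'
```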
+>
+>The third use case requires a pagespec function. One that matched a regex in the page might work.
+>Otherwise, another option would be to annotate links with a type, and then check the type of links in
+>a pagespec. e.g. you could have `depends` links and normal links.
+>
+>Anyway, I just wanted to list the thoughts. In none of these use cases is straight yaml or json the
+>obvious answer. -- [[Will]]
+
+>> Okie. I've had a play with this. A 'form' plugin is included inline below, but it is only a rough first pass to
+>> get a feel for the design space.
+>>
+>> The current design defines a new type of page - a 'form'. The type of page holds YAML data
+>> defining a FormBuilder form. For example, if we add a file to the wiki source `test.form`:
+
+ ---
+ fields:
+ age:
+ comment: This is a test
+ validate: INT
+ value: 15
+
+>> The YAML content is a series of nested hashes. The outer hash is currently checked for two keys:
+>> 'template', which specifies a parameter to pass to FormBuilder as the template for the
+>> form, and 'fields', which specifies the data for the fields on the form.
+>> Each 'field' is itself a hash. The keys and values are arguments to the FormBuilder `field` method.
+>> The most important one is 'value', which specifies the value of that field.
+>>
+>> Using this, the plugin below can output a form when asked to generate HTML. The Formbuilder
+>> arguments are sanitized (need a thorough security audit here - I'm sure I've missed a bunch of
+>> holes). The form is generated with default values as supplied in the YAML data. It also has an
+>> 'Update Form' button at the bottom.
+>>
+>> The 'Update Form' button in the generated HTML submits changed values back to IkiWiki. The
+>> plugin captures these new values, updates the YAML and writes it out again. The form is
+>> validated when edited using this method. This method can only edit the values in the form.
+>> You cannot add new fields this way.
+>>
+>> It is still possible to edit the YAML directly using the 'edit' button. This allows adding new fields
+>> to the form, or adding other formbuilder data to change how the form is displayed.
+>>
+>> One final part of the plugin is a new pagespec function. `form_eq()` is a pagespec function that
+>> takes two arguments (separated by a ','). The first argument is a field name, the second argument
+>> a value for that field. The function matches forms (and not other page types) where the named
+>> field exists and holds the value given in the second argument. For example:
+
+ \[[!inline pages="form_eq(age,15)" archive="yes"]]
+
+>> will include a link to the page generated above.
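Stripped of the IkiWiki plumbing, the match itself is simple; a sketch with plain dicts standing in for the parsed YAML:

```python
def form_eq(form_data, field, value):
    # Match only when the named field exists; compare as strings, since
    # pagespec arguments arrive as text.
    fields = form_data.get('fields', {})
    if field not in fields:
        return False
    return str(fields[field].get('value')) == str(value)

# The test.form example from above, as parsed data:
test_form = {'fields': {'age': {'comment': 'This is a test',
                                'validate': 'INT',
                                'value': 15}}}
form_eq(test_form, 'age', '15')  # -> True
```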
+
+>>> Okie, I've just made another plugin to try and do things in a different way.
+>>> This approach adds a 'data' directive. There are two arguments, `key` and `value`.
+>>> The directive is replaced by the value. There is also a match function, which is similar
+>>> to the one above. It also takes two arguments, a key and a value. It returns true if the
+>>> page has that key/value pair in a data directive. e.g.:
+
+ \[[!data key="age" value="15"]]
+
+>>> then, in another page:
+
+ \[[!inline pages="data_eq(age,15)" archive="yes"]]
+
+>>> I expect that we could have more match functions for each type of structured data,
+>>> I just wanted to implement a rough prototype to get a feel for how it behaves. -- [[Will]]
+
+>> Anyway, here are the plugins. As noted above these are only preliminary, exploratory attempts. -- [[Will]]
+
+>>>> I've just updated the second of the two patches below. The two patches are not mutually
+>>>> exclusive, but I'm leaning towards the second as more useful (for the things I'm doing). -- [[Will]]
+
+I think it's awesome that you're writing this code to explore the problem
+space, [[Will]] -- and these plugins are good stabs at at least part of it.
+Let me respond to a few of your comments.. --[[Joey]]
+
+On use cases, one use case is a user posting a bug report with structured
+data in it. A template is one way, but then the user has to deal with the
+format used to store the structured data. This is where an edit-time form
+becomes essential.
+
+> This was the idea with the 'form' plugin. With the 'data' plugin I was exploring
+> a different approach: try to keep the markup simple enough that the user can edit
+> the markup directly, and still have that be ok. I admit it is a stretch, but I thought
+> it worth exploring.
+
+Another use case is, after many such bugs have been filed,
+wanting to add a new field to each bug report. To avoid needing to edit
+every bug report it would be good if the fields in a bug report were
+defined somewhere else, so that just that one place can be edited to add
+the new field, and it will show up in each bug report (and in each bug
+report's edit page, as a new form field).
+
+> If I was going to do that, I'd use a perl script on a checked out
+> workspace. I think you're describing a rare operation and
+> so I'd be happy not having a web interface for it. Having said that,
+> if you just wanted to change the form for *new* pages, then you
+> can just edit the template used to create new pages.
+
+Re the form plugin, I'm uncomfortable with tying things into
+[[!cpan CGI::FormBuilder]] quite so tightly as you have.
+
+> Yeah :). But I wanted to explore the space and that was the
+> easiest way to start.
+
+CGI::FormBuilder
+could easily change in a way that broke whole wikis full of pages. Also,
+needing to sanitize FormBuilder fields with security implications is asking
+for trouble, since new FormBuilder features could add new fields, or
+add new features to existing fields (FormBuilder is very DWIM) that open
+new security holes.
+
+> There is a list of allowed fields. I only interpret those.
+
+I think that having a type system, that allows defining specific types,
+like "email address", by writing code (that in turn can use FormBuilder),
+is a better approach, since it should avoid becoming a security problem.
+
+> That would be possible. I think an extension to the 'data' plugin might
+> work here.
+
+One specific security hole, BTW, is that if you allow the `validate` field,
+FormBuilder will happily treat it as a regexp, and we don't want to expose
+arbitrary perl regexps, since they can at least DOS a system, and can
+probably be used to run arbitrary perl code.
+
+> I validate the validate field :). It only allows validate fields that match
+> `/^[\w\s]+$/`. This means you can really only use the pre-defined
+> validation types in FormBuilder.
+
+The data plugin only deals with a fairly small corner of the problem space,
+but I think does a nice job at what it does. And could probably be useful
+in a large number of other cases.
+
+> I think the data plugin is more likely to be useful than the form plugin.
+> I was thinking of extending the data directive by allowing an 'id' parameter.
+> When you have an id parameter, then you can display a small form for that
+> data element. The submission handler would look through the page source
+> for the data directive with the right id parameter and edit it. This would
+> make the data directive more like the current 'form' plugin.
+
+> That is making things significantly more complex for less significant gain though. --[[Will]]
+
+> Oh, one quick other note. The data plugin below was designed to handle multiple
+> data elements in a single directive. e.g.
+
+ \[[!data key="Depends on" link="bugs/bugA" link="bugs/bugB" value=6]]
+
+> would match `data_eq(Depends on,6)`, `data_link(Depends on,bugs/bugA)`, `data_link(Depends on,bugs/bugB)`
+> or, if you applied the patch in [[todo/tracking_bugs_with_dependencies]] then you can use 'defined pagespecs'
+> such as `data_link(Depends on,~openBugs)`. <a id="another_kind_of_links" />The ability to label links like this allows separation of
+> dependencies between bugs from arbitrary links.
+>> This issue (the need for distinguished kinds of links) has also been brought up in other discussions: [[tracking_bugs_with_dependencies#another_kind_of_links]] (deps vs. links) and [[tag_pagespec_function]] (tags vs. links). --Ivan Z.
+
+>>> And multiple link types are now supported; plugins can set the link
+>>> type when registering a link, and pagespec functions can match on them. --[[Joey]]
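The glob side of `data_link()` can be sketched with the standard library's `fnmatch` standing in for ikiwiki's `match_glob` (the nested dict mirrors the `%pagestate` layout used by the plugin below):

```python
from fnmatch import fnmatchcase

def data_link(pagestate, page, key, glob):
    """True if any link recorded under `key` on `page` matches `glob`."""
    links = pagestate.get(page, {}).get('data', {}).get(key, {}).get('link', {})
    return any(fnmatchcase(link, glob) for link in links)

pagestate = {'bugs/bugC': {'data': {'Depends on': {'link': {'bugs/bugA': 1,
                                                            'bugs/bugB': 1}}}}}
data_link(pagestate, 'bugs/bugC', 'Depends on', 'bugs/*')  # -> True
```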
+
+----
+
+ #!/usr/bin/perl
+ # Interpret YAML data to make a web form
+ package IkiWiki::Plugin::form;
+
+ use warnings;
+ use strict;
+ use CGI::FormBuilder;
+ use IkiWiki 2.00;
+
+ sub import {
+ hook(type => "getsetup", id => "form", call => \&getsetup);
+ hook(type => "htmlize", id => "form", call => \&htmlize);
+ hook(type => "sessioncgi", id => "form", call => \&cgi_submit);
+ }
+
+ sub getsetup () {
+ return
+ plugin => {
+ safe => 1,
+ rebuild => 1, # format plugin
+ },
+ }
+
+ sub makeFormFromYAML ($$$) {
+ my $page = shift;
+ my $YAMLString = shift;
+ my $q = shift;
+
+ eval q{use YAML};
+ error($@) if $@;
+ eval q{use CGI::FormBuilder};
+ error($@) if $@;
+
+ my ($dataHashRef) = YAML::Load($YAMLString);
+
+ my @fields = keys %{ $dataHashRef->{fields} };
+
+ unshift(@fields, 'do');
+ unshift(@fields, 'page');
+ unshift(@fields, 'rcsinfo');
+
+ # print STDERR "Fields: @fields\n";
+
+ my $submittedPage;
+
+ $submittedPage = $q->param('page') if defined $q;
+
+ if (defined $q && defined $submittedPage && ! ($submittedPage eq $page)) {
+ 		error("Submitted page doesn't match current page: $page, $submittedPage");
+ }
+
+ error("Page not backed by file") unless defined $pagesources{$page};
+ my $file = $pagesources{$page};
+
+ my $template;
+
+ if (defined $dataHashRef->{template}) {
+ $template = $dataHashRef->{template};
+ } else {
+ $template = "form.tmpl";
+ }
+
+ my $form = CGI::FormBuilder->new(
+ fields => \@fields,
+ charset => "utf-8",
+ method => 'POST',
+ required => [qw{page}],
+ params => $q,
+ action => $config{cgiurl},
+ template => scalar IkiWiki::template_params($template),
+ wikiname => $config{wikiname},
+ header => 0,
+ javascript => 0,
+ keepextras => 0,
+ title => $page,
+ );
+
+ $form->field(name => 'do', value => 'Update Form', required => 1, force => 1, type => 'hidden');
+ $form->field(name => 'page', value => $page, required => 1, force => 1, type => 'hidden');
+ $form->field(name => 'rcsinfo', value => IkiWiki::rcs_prepedit($file), required => 1, force => 0, type => 'hidden');
+
+ my %validkey;
+ foreach my $x (qw{label type multiple value fieldset growable message other required validate cleanopts columns comment disabled linebreaks class}) {
+ $validkey{$x} = 1;
+ }
+
+ while ( my ($name, $data) = each(%{ $dataHashRef->{fields} }) ) {
+ next if $name eq 'page';
+ next if $name eq 'rcsinfo';
+
+ while ( my ($key, $value) = each(%{ $data }) ) {
+ next unless $validkey{$key};
+ next if $key eq 'validate' && !($value =~ /^[\w\s]+$/);
+
+ # print STDERR "Adding to field $name: $key => $value\n";
+ $form->field(name => $name, $key => $value);
+ }
+ }
+
+ # IkiWiki::decode_form_utf8($form);
+
+ return $form;
+ }
+
+ sub htmlize (@) {
+ my %params=@_;
+ my $content = $params{content};
+ my $page = $params{page};
+
+ my $form = makeFormFromYAML($page, $content, undef);
+
+ return $form->render(submit => 'Update Form');
+ }
+
+ sub cgi_submit ($$) {
+ my $q=shift;
+ my $session=shift;
+
+ my $do=$q->param('do');
+ return unless $do eq 'Update Form';
+ IkiWiki::decode_cgi_utf8($q);
+
+ eval q{use YAML};
+ error($@) if $@;
+ eval q{use CGI::FormBuilder};
+ error($@) if $@;
+
+ my $page = $q->param('page');
+
+ return unless exists $pagesources{$page};
+
+ return unless $pagesources{$page} =~ m/\.form$/ ;
+
+ return unless IkiWiki::check_canedit($page, $q, $session);
+
+ my $file = $pagesources{$page};
+ my $YAMLString = readfile(IkiWiki::srcfile($file));
+ my $form = makeFormFromYAML($page, $YAMLString, $q);
+
+ my ($dataHashRef) = YAML::Load($YAMLString);
+
+ if ($form->submitted eq 'Update Form' && $form->validate) {
+
+ #first update our data structure
+
+ while ( my ($name, $data) = each(%{ $dataHashRef->{fields} }) ) {
+ next if $name eq 'page';
+ 			next if $name eq 'rcsinfo';
+
+ if (defined $q->param($name)) {
+ $data->{value} = $q->param($name);
+ }
+ }
+
+ # now write / commit the data
+
+ writefile($file, $config{srcdir}, YAML::Dump($dataHashRef));
+
+ my $message = "Web form submission";
+
+ IkiWiki::disable_commit_hook();
+ my $conflict=IkiWiki::rcs_commit($file, $message,
+ $form->field("rcsinfo"),
+ $session->param("name"), $ENV{REMOTE_ADDR});
+ IkiWiki::enable_commit_hook();
+ IkiWiki::rcs_update();
+
+ require IkiWiki::Render;
+ IkiWiki::refresh();
+
+ IkiWiki::redirect($q, "$config{url}/".htmlpage($page)."?updated");
+
+ } else {
+ error("Invalid data!");
+ }
+
+ exit;
+ }
+
+ package IkiWiki::PageSpec;
+
+ sub match_form_eq ($$;@) {
+ my $page=shift;
+ my $argSet=shift;
+ my @args=split(/,/, $argSet);
+ my $field=shift @args;
+ my $value=shift @args;
+
+ my $file = $IkiWiki::pagesources{$page};
+
+ if ($file !~ m/\.form$/) {
+ return IkiWiki::FailReason->new("page is not a form");
+ }
+
+ my $YAMLString = IkiWiki::readfile(IkiWiki::srcfile($file));
+
+ eval q{use YAML};
+ error($@) if $@;
+
+ my ($dataHashRef) = YAML::Load($YAMLString);
+
+ if (! defined $dataHashRef->{fields}->{$field}) {
+ return IkiWiki::FailReason->new("field '$field' not defined in page");
+ }
+
+ my $formVal = $dataHashRef->{fields}->{$field}->{value};
+
+ if ($formVal eq $value) {
+ return IkiWiki::SuccessReason->new("field value matches");
+ } else {
+ return IkiWiki::FailReason->new("field value does not match");
+ }
+ }
+
+ 1
+
+----
+
+ #!/usr/bin/perl
+ # Allow data embedded in a page to be checked for
+ package IkiWiki::Plugin::data;
+
+ use warnings;
+ use strict;
+ use IkiWiki 2.00;
+
+ my $inTable = 0;
+
+ sub import {
+ hook(type => "getsetup", id => "data", call => \&getsetup);
+ hook(type => "needsbuild", id => "data", call => \&needsbuild);
+ hook(type => "preprocess", id => "data", call => \&preprocess, scan => 1);
+ hook(type => "preprocess", id => "datatable", call => \&preprocess_table, scan => 1); # does this need scan?
+ }
+
+ sub getsetup () {
+ return
+ plugin => {
+ safe => 1,
+ rebuild => 1, # format plugin
+ },
+ }
+
+ sub needsbuild (@) {
+ my $needsbuild=shift;
+ foreach my $page (keys %pagestate) {
+ if (exists $pagestate{$page}{data}) {
+ if (exists $pagesources{$page} &&
+ grep { $_ eq $pagesources{$page} } @$needsbuild) {
+ # remove state, it will be re-added
+ # if the preprocessor directive is still
+ # there during the rebuild
+ delete $pagestate{$page}{data};
+ }
+ }
+ }
+ }
+
+ sub preprocess (@) {
+ my @argslist = @_;
+ my %params=@argslist;
+
+ my $html = '';
+ my $class = defined $params{class}
+ ? 'class="'.$params{class}.'"'
+ : '';
+
+ if ($inTable) {
+ $html = "<th $class >$params{key}:</th><td $class >";
+ } else {
+ $html = "<span $class >$params{key}:";
+ }
+
+ while (scalar(@argslist) > 1) {
+ my $type = shift @argslist;
+ my $data = shift @argslist;
+ if ($type eq 'link') {
+ # store links raw
+ $pagestate{$params{page}}{data}{$params{key}}{link}{$data} = 1;
+ my $link=IkiWiki::linkpage($data);
+ add_depends($params{page}, $link);
+ $html .= ' ' . htmllink($params{page}, $params{destpage}, $link);
+ } elsif ($type eq 'data') {
+ $data = IkiWiki::preprocess($params{page}, $params{destpage},
+ IkiWiki::filter($params{page}, $params{destpage}, $data));
+ $html .= ' ' . $data;
+ # store data after processing - allows pagecounts to be stored, etc.
+ $pagestate{$params{page}}{data}{$params{key}}{data}{$data} = 1;
+ }
+ }
+
+ if ($inTable) {
+ $html .= "</td>";
+ } else {
+ $html .= "</span>";
+ }
+
+ return $html;
+ }
+
+ sub preprocess_table (@) {
+ my %params=@_;
+
+ my @lines;
+ push @lines, defined $params{class}
+ ? "<table class=\"".$params{class}.'">'
+ : '<table>';
+
+ $inTable = 1;
+
+ foreach my $line (split(/\n/, $params{datalist})) {
+ push @lines, "<tr>" . IkiWiki::preprocess($params{page}, $params{destpage},
+ IkiWiki::filter($params{page}, $params{destpage}, $line)) . "</tr>";
+ }
+
+ $inTable = 0;
+
+ push @lines, '</table>';
+
+ return join("\n", @lines);
+ }
+
+ package IkiWiki::PageSpec;
+
+ sub match_data_eq ($$;@) {
+ my $page=shift;
+ my $argSet=shift;
+ my @args=split(/,/, $argSet);
+ my $key=shift @args;
+ my $value=shift @args;
+
+ if (! exists $IkiWiki::pagestate{$page}{data}) {
+ return IkiWiki::FailReason->new("page does not contain any data directives");
+ }
+
+ if (! exists $IkiWiki::pagestate{$page}{data}{$key}) {
+ return IkiWiki::FailReason->new("page does not contain data key '$key'");
+ }
+
+ if ($IkiWiki::pagestate{$page}{data}{$key}{data}{$value}) {
+ return IkiWiki::SuccessReason->new("value matches");
+ } else {
+ return IkiWiki::FailReason->new("value does not match");
+ }
+ }
+
+ sub match_data_link ($$;@) {
+ my $page=shift;
+ my $argSet=shift;
+ my @params=@_;
+ my @args=split(/,/, $argSet);
+ my $key=shift @args;
+ my $value=shift @args;
+
+ if (! exists $IkiWiki::pagestate{$page}{data}) {
+ return IkiWiki::FailReason->new("page $page does not contain any data directives and so cannot match a link");
+ }
+
+ if (! exists $IkiWiki::pagestate{$page}{data}{$key}) {
+ return IkiWiki::FailReason->new("page $page does not contain data key '$key'");
+ }
+
+ foreach my $link (keys %{ $IkiWiki::pagestate{$page}{data}{$key}{link} }) {
+ # print STDERR "Checking if $link matches glob $value\n";
+ if (match_glob($link, $value, @params)) {
+ return IkiWiki::SuccessReason->new("Data link on page $page with key $key matches glob $value: $link");
+ }
+ }
+
+ return IkiWiki::FailReason->new("No data link on page $page with key $key matches glob $value");
+ }
+
+ 1
diff --git a/doc/todo/structured_page_data/discussion.mdwn b/doc/todo/structured_page_data/discussion.mdwn
new file mode 100644
index 000000000..bc7f39277
--- /dev/null
+++ b/doc/todo/structured_page_data/discussion.mdwn
@@ -0,0 +1 @@
+How about using JSON? YAML is popular in the Perl world, but the Web 2.0 world seems more excited about using JSON for data serialization. I find it easier to edit JSON by hand than YAML. -- [[Edward|/users/Edward_Betts]]
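For comparison, the `test.form` example from the parent page expressed as JSON; a sketch only, since nothing in ikiwiki currently reads this format:

```python
import json

# The same structure the YAML version describes: one 'age' field.
form = {'fields': {'age': {'comment': 'This is a test',
                           'validate': 'INT',
                           'value': 15}}}

text = json.dumps(form, indent=2, sort_keys=True)
assert json.loads(text) == form  # the round trip is lossless
```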
diff --git a/doc/todo/stylesheet_suggestion_for_verbatim_content.mdwn b/doc/todo/stylesheet_suggestion_for_verbatim_content.mdwn
new file mode 100644
index 000000000..b73155c98
--- /dev/null
+++ b/doc/todo/stylesheet_suggestion_for_verbatim_content.mdwn
@@ -0,0 +1,33 @@
+I suggest the attached change for verbatim contents. Paddings/margins are
+optional, but IMHO, we should at least define a monospace font.
+
+-- Recai (via email)
+
+AFAICS, my web browser already has a built-in monospace font, which I can
+see in action in the preformatted patch below. So I don't see why the
+default style sheet should do this. --[[Joey]]
+
+[[!tag patch]]
+
+<pre>
+diff --git a/basewiki/style.css b/basewiki/style.css
+index 6ec6f89..1970561 100644
+--- a/basewiki/style.css
++++ b/basewiki/style.css
+@@ -198,3 +198,15 @@ li.L7 {
+ li.L8 {
+ list-style: upper-alpha;
+ }
++
++/* verbatim content */
++pre, tt, code {
++ font-family: "Courier", monospace;
++ color: black;
++}
++
++pre {
++ margin-left: 1.5em;
++ padding: 0.5em;
++ overflow: hidden;
++}
+</pre>
diff --git a/doc/todo/submodule_support.mdwn b/doc/todo/submodule_support.mdwn
new file mode 100644
index 000000000..d6a7edb03
--- /dev/null
+++ b/doc/todo/submodule_support.mdwn
@@ -0,0 +1,15 @@
+I would love to be able to publish my theme in my personal wiki. The theme is in a separate git repository, and I feel it would be pretty awesome if it were rendered within my main ikiwiki site. I have tried the following:
+
+ git submodule add /usr/share/ikiwiki/themes/night_city/
+ git commit -m"add the theme to my site" ; git push
+
+But this did some really weird things on the other side: the files from the theme ended up flat in the parent directory. Now I have reverted the above change and ikiwiki *still* generates those files. Not sure what is going on.
+
+To be really clear here: this is an arbitrary source code repository that I want to include. I do not mean to enable the theme with this; it could very well be presentation material, C source code or whatever else... In fact, I think this would be a powerful way to do syntax highlighting for source code published on your website...
+
+Has anyone else had experience with this? Or other suggestions on how to publish repositories within my site? -- [[anarcat]]
+
+> Ikiwiki does not support git submodules.
+>
+> You can use the [[plugins/underlay]] plugin to merge the
+> contents of other directories into your wiki's source. --[[Joey]]
diff --git a/doc/todo/support_creole_markup.mdwn b/doc/todo/support_creole_markup.mdwn
new file mode 100644
index 000000000..5a1e1286d
--- /dev/null
+++ b/doc/todo/support_creole_markup.mdwn
@@ -0,0 +1,18 @@
+Creole is a would-be standard markup for all wikis.
+
+It's an agreement arrived at by many wiki engine developers.
+
+Currently MoinMoin and Oddmuse support it, and a lot of wikis (dokuwiki, tiddlywiki, pmwiki, podwiki, etc) have partial support. More info on support: <http://www.wikicreole.org/wiki/Engines>
+
+
+Some useful information:
+
+There is also a perl module: Text::WikiCreole
+
+Syntax file for vim: <http://www.peter-hoffmann.com/code/vim/> (Since a typical ikiwiki user usually uses external editors. :))
+
+> Should be pretty easy to add a plugin to do it using [[!cpan
+> Text::WikiCreole]]. --[[Joey]]
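+
+A minimal htmlize plugin along those lines might look something like
+this (a hypothetical sketch, not a tested implementation; `creole_parse`
+is the function exported by Text::WikiCreole, but the hook details may
+need adjusting):
+
+    #!/usr/bin/perl
+    package IkiWiki::Plugin::creole;
+
+    use warnings;
+    use strict;
+    use IkiWiki;
+    use Text::WikiCreole;
+
+    sub import {
+        # render *.creole source files to html
+        hook(type => "htmlize", id => "creole", call => \&htmlize);
+    }
+
+    sub htmlize (@) {
+        my %params = @_;
+        return creole_parse($params{content});
+    }
+
+    1;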
+
+[[done]]
diff --git a/doc/todo/support_dicts_in_setup.mdwn b/doc/todo/support_dicts_in_setup.mdwn
new file mode 100644
index 000000000..68242fda8
--- /dev/null
+++ b/doc/todo/support_dicts_in_setup.mdwn
@@ -0,0 +1,26 @@
+It would be nice for some plugins to use hashes as setup data structures
+(which ones? pagespec aliases for one. Any others?), but these cannot
+currently be adequately described in `getsetup()`, nor represented in
+`websetup()`. It would be nice to extend ikiwiki to support this.
+
+I've had an initial go at how to represent this in a nice way within a HTML
+page. An initial mock up is available at
+<https://github.com/jmtd/ikiwiki/blob/websetup_hashes/hash.html>. The
+approach taken is to use a javascript hash/dictionary as the canonical copy of
+the data; to express that in the form elements, and to capture all relevant
+events to update the main data structure (and the HTML representations
+thereof).
+
+I imagine packing the js structure into a form element which is posted, and
+ignoring the other form element data.
+
+This would mean mandating javascript support for editing such hashes.
+
+— [[Jon]]
+
+> I really don't like mandating javascript for anything in ikiwiki.
+>
+> Ikiwiki's websetup is built using CGI::FormBuilder, which makes it easy
+> to create forms for simple stuff, but does not allow custom UI for
+> complex stuff. This does not seem compatible with that, unless your
+> idea is to have a separate form for these more complex things. --[[Joey]]
diff --git a/doc/todo/support_for_SDF_documents.mdwn b/doc/todo/support_for_SDF_documents.mdwn
new file mode 100644
index 000000000..18ce4e106
--- /dev/null
+++ b/doc/todo/support_for_SDF_documents.mdwn
@@ -0,0 +1,8 @@
+I think it would be useful for ikiwiki to support [[!debpkg sdf]] input,
+which can be converted and rendered to many formats.
+I should add, however, that SDF allows executing arbitrary perl code
+from its documents, which means some sanitization would need to occur
+before the document is fed to sdf.
+--[[JeremieKoenig]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/support_for_plugins_written_in_other_languages.mdwn b/doc/todo/support_for_plugins_written_in_other_languages.mdwn
new file mode 100644
index 000000000..006b6fd5e
--- /dev/null
+++ b/doc/todo/support_for_plugins_written_in_other_languages.mdwn
@@ -0,0 +1,56 @@
+ikiwiki should support [[writing plugins in other languages|plugins/write/external]]
+
+> [[done]] !!
+
+While it should be possible to call ikiwiki from C, doing the callbacks in C is
+probably hard. And accessing perl at all from C is ugly. It also doesn't
+make it very easy to write plugins in an interpreted language, since that
+would mean linking perl and eg, python in one binary. (Been there, done
+that, never again.)
+
+Instead, I'm considering using XML RPC to let ikiwiki communicate with a
+child process that it can spawn. The child could then be any program,
+written in any language. It could talk XML RPC via stdio. (This assumes
+that most languages allow easily serialising XML::RPC calls and responses
+to a file descriptor. Some XML RPC implementations may be hardcoded to use
+http..) For ease of implementation, each rpc request sent via stio should
+end with a newline, and begin with "<?xml ..>".
+
+Here's how it would basically look, not showing the actual XML RPC used to
+pass values.
+
+ -> call import
+ <- call hook type => preprocess, id => foo, call => plugin_preprocess
+ -> result 1
+ <- result 1
+
+ -> call plugin_preprocess page => bar
+ <- call getconfig url
+ -> result "http://example.com", ...
+ <- call debug "foo"
+ -> result 1
+ <- result done "my return value"
+
+From ikiwiki's POV:
+
+* ikiwiki always initiates each conversation with a command
+* After sending a command, ikiwiki reads commands, dispatches them, and
+ returns the results, in a loop, until it gets a result for the command it
+ called.
+
+From the plugin's POV:
+
+* It's probably sitting in an XML::RPC loop.
+* Get a command from ikiwiki.
+* Dispatch the command to the appropriate function.
+* The function can use XML::RPC to communicate with ikiwiki to get things
+ like config values; and to call ikiwiki functions.
+* Send the function's return value back to ikiwiki.
+
+Simple enough, really. ikiwiki would need to add accessor functions for
+all important variables, such as "getconfig" and "setconfig". It would
+probably be easiest for ikiwiki to dispatch a command by just evaling
+IkiWiki::$command.
+
+Plugin programs could be dropped into /usr/share/ikiwiki/plugins/, and
+load_plugin() would just open2 the plugin program and call import.
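+
+To make the framing concrete, here is a rough sketch of the child
+(plugin) side of such a loop in perl, using RPC::XML for the
+serialisation (assumptions: RPC::XML is available, and `dispatch` is a
+hypothetical function mapping command names to plugin functions):
+
+    use RPC::XML;
+    use RPC::XML::Parser;
+    use IO::Handle;
+
+    sub rpc_send {
+        my $xml = shift->as_string;
+        $xml =~ s/\n/ /g;    # one message per line, per the framing above
+        print STDOUT "$xml\n";
+        STDOUT->flush;
+    }
+
+    # read a command from ikiwiki, dispatch it, send back the result
+    while (my $line = <STDIN>) {
+        my $req = RPC::XML::Parser->new->parse($line);
+        my $result = dispatch($req->name, map { $_->value } @{$req->args});
+        rpc_send(RPC::XML::response->new($result));
+    }
+
+A real plugin would also have to notice when the incoming message is a
+nested call from ikiwiki (e.g. `getconfig`) rather than a result, as
+described above.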
diff --git a/doc/todo/support_includes_in_setup_files.mdwn b/doc/todo/support_includes_in_setup_files.mdwn
new file mode 100644
index 000000000..50afb2b6b
--- /dev/null
+++ b/doc/todo/support_includes_in_setup_files.mdwn
@@ -0,0 +1,10 @@
+I have a client/server setup so I can edit/preview on my laptop/desktop and push to a server. I therefore have two almost identical setup files that I reasonably often let get out of sync. I'd like to be able to include the common parts into the two setup files. Currently the following works, but it relies on knowing the implementation of IkiWiki::Setup::Standard:
+
+    use IkiWiki::Setup::Standard { specific stuff };
+    require "/path/to/common_setup";
+
+where common_setup contains a call to IkiWiki::Setup::merge
+
+To see that this is fragile, note that the require must come second, or ikiwiki will try to load a module called IkiWiki::Setup::merge
+
+-- [[DavidBremner]]
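+
+For illustration, the shared file might look something like this (a
+hypothetical sketch; it leans on the same IkiWiki::Setup::merge
+implementation detail noted above, and the option names are just
+examples):
+
+    # /path/to/common_setup
+    IkiWiki::Setup::merge({
+        templatedir => "/usr/share/ikiwiki/templates",
+        add_plugins => [qw{goodstuff sidebar}],
+    });
+
+    1;    # a require'd file must return a true value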
diff --git a/doc/todo/support_link__40__.__41___in_pagespec.mdwn b/doc/todo/support_link__40__.__41___in_pagespec.mdwn
new file mode 100644
index 000000000..653db1ff2
--- /dev/null
+++ b/doc/todo/support_link__40__.__41___in_pagespec.mdwn
@@ -0,0 +1,21 @@
+[[!tag wishlist]]
+
+It would be nice to have pagespecs support "link(.)" as syntax.
+This would match pages that link to the page that invokes the pagespec.
+The use case is a blog with tags, and having a page for each tag
+which uses !inline to list all posts with the tag.
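+
+For instance, with the proposed syntax a tag page could then inline its
+posts without hard-coding the tag name or tagbase (hypothetical usage):
+
+    \[[!inline pages="link(.) and ./posts/*" archive=yes]]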
+
+Joey said on IRC that "probably changing the derel() function in
+IkiWiki.pm is the best way to do it".
+
+> I implemented this suggestion in the simplest possible way, [[!taglink patch]] available [[here|http://git.oblomov.eu/ikiwiki/patch/f4a52de556436fdee00fd92ca9a3b46e876450fa]].
+> An alternative approach, very similar, would be to make the empty page parameter mean current page (e.g. `link()` would mean pages linking here). The patch would be very similar.
+> -- GB
+
+>> Thanks for this, and also for your recent spam-fighting.
+>> Huh, I was right about changing derel, didn't realize it would be
+>> so obvious a change. :) Oh well, I managed to complicate it
+>> somewhat in an optimisation pass.. ;)
+>>
+>> Note that your git-daemon on git.oblomov.eu seems down.
+>> I pulled the patch from gitweb, [[done]] --[[Joey]]
diff --git a/doc/todo/support_multiple_perl_libraries.mdwn b/doc/todo/support_multiple_perl_libraries.mdwn
new file mode 100644
index 000000000..2869b5033
--- /dev/null
+++ b/doc/todo/support_multiple_perl_libraries.mdwn
@@ -0,0 +1,11 @@
+It would be useful to have
+
+ libdir=>[ qw{libdir1 libdir2 libdir3} ]
+
+as a setup option. I have a couple of different directories that e.g. come from different git repos, so merging them is a bit messy.
+
+I think the change is a one-liner, but I put this here for discussion before attempting a patch. If some more confident person wants to have a go, feel free.
+
+[[DavidBremner]]
+
+[[!taglink wishlist]]
diff --git a/doc/todo/supporting_comments_via_disussion_pages.mdwn b/doc/todo/supporting_comments_via_disussion_pages.mdwn
new file mode 100644
index 000000000..420ae4a7e
--- /dev/null
+++ b/doc/todo/supporting_comments_via_disussion_pages.mdwn
@@ -0,0 +1,222 @@
+I would love to see more traditional support for comments in ikiwiki. One
+way would be to structure data on the discussion page in such a way that a
+"comment" plugin could parse it and yet the discussion page would still be
+a valid and usable wiki page.
+
+For example if the discussion page looked like this:
+
+ # Subject of First Comment
+ Posted by [Adam Shand](http://adam.shand.net/) at 10:34PM on 14/04/2007
+
+ Lorem ipsum dolor sit amet, consectetuer adipiscing elit. Morbi consectetuer nunc quis
+ magna. Etiam non est eget sapien vulputate varius. Vivamus magna. Sed justo. Donec
+ pellentesque ultrices urna.
+
+ # Subject of the Second Comment
+ Posted by [Foo Bar](http://foobar.net/) at 11:41PM on 14/04/2007
+
+ Quisque lacinia, lorem eget ornare facilisis, enim eros iaculis felis, id volutpat nibh
+ mauris ut felis. Vestibulum risus nibh, adipiscing volutpat, volutpat et, lacinia ut,
+ pede. Maecenas dolor. Vivamus feugiat volutpat ligula.
+
+Each header marks the start of a new comment and the line immediately
+following is the comment's metadata (author, email/url, datestamp).
+Hopefully you could structure it in such a way that the scope
+
+This would allow:
+
+ * A comment plugin to render the comments in "traditional blog" format.
+ * Possibly even support nesting comments by the header level?
+ * A comment plugin to create a form at the bottom of the page for people to add comments in the appropriate format to the discussion page
+ * Still remain usable and readable by people who work via svn.
+ * When there is ACL support you could mark the discussion page as read only so it could only be updated by the comment plugin (if that's what you wanted)
+
+Is this simple enough to be sensible?
+
+-- [[AdamShand]]
+
+> Well, if it's going to look like a blog, why not store the data the same
+> way ikiwiki stores blogs, with a separate page per comment? As already
+> suggested in [[discussion_page_as_blog]] though there are some things to
+> be worked out also discussed there.
+> --[[Joey]]
+
+>> I certainly won't be fussy about how it gets implemented, I was just trying to think of the lightest weight most "wiki" solution. :-) -- Adam.
+
+>>> As a side note, the feature described above (having a form not to add a page but to expand it in a formatted way) would be useful for other things when the content is short (timetracking, sub-todo list items, etc..) --[[hb]]
+
+# [[MarceloMagallon]]'s implementation
+
+I've been looking into this. I'd like to implement a "blogcomments"
+plugin. Looking at the code, I think the way to go is to have a
+formbuilder_setup hook that uses a different template instead of the
+standard editpage one. That template would not display the editcontent
+field. The problem that I'm running into is that I need to append the new
+content to the old one.
+
+-- [[MarceloMagallon]]
+
+> Anything I can do to help? --[[Joey]]
+
+>> Figured it out. Can you comment on the code below? Thanks. -- [[MarceloMagallon]]
+
+So, I have some code, included below. For some reason that I don't quite get, it's not updating the wiki page after a submit. Maybe it's something silly on my side...
+
+What I ended up doing is write something like this to the page:
+
+ [[!blogcomment from="""Username""" timestamp="""12345""" subject="""Some text""" text="""the text of the comment"""]]
+
+Each comment is processed to something like this:
+
+ <div>
+ <dl>
+ <dt>From</dt><dd>Username</dd>
+ <dt>Date</dt><dd>Date (needs fixing)</dd>
+ <dt>Subject</dt><dd>Subject text</dd>
+ </dl>
+
+ <p>Text of the comment...</p>
+ </div>
+
+In this way the comments can be styled using CSS.
+
+-- [[MarceloMagallon]]
+
+## Code
+
+ #!/usr/bin/perl
+ package IkiWiki::Plugin::comments;
+
+ use warnings;
+ use strict;
+ use IkiWiki '1.02';
+
+ sub import {
+ hook(type => "formbuilder_setup", id => "comments",
+ call => \&formbuilder_setup);
+ hook(type => "preprocess", id => "blogcomment",
+ call => \&preprocess);
+ }
+
+ sub formbuilder_setup (@) {
+ my %params=@_;
+ my $cgi = $params{cgi};
+ my $form = $params{form};
+ my $session = $params{session};
+
+ my ($page)=$form->field('page');
+ $page=IkiWiki::titlepage(IkiWiki::possibly_foolish_untaint($page));
+
+ # XXX: This needs something to make it blog specific
+ unless ($page =~ m{/discussion$} &&
+ $cgi->param('do') eq 'edit' &&
+ ! exists $form->{title})
+ {
+ return;
+ }
+
+ if (! $form->submitted)
+ {
+ $form->template(IkiWiki::template_file("makeblogcomment.tmpl"));
+ $form->field(name => "blogcomment", type => "textarea", rows => 20,
+ cols => 80);
+ return;
+ }
+
+ my $content="";
+ if (exists $pagesources{$page}) {
+ $content=readfile(srcfile($pagesources{$page}));
+ $content.="\n\n";
+ }
+ my $name=defined $session->param('name') ?
+ $session->param('name') : gettext('Anonymous');
+ my $timestamp=time;
+ my $subject=defined $cgi->param('comments') ?
+ $cgi->param('comments') : '';
+ my $comment=$cgi->param('blogcomment');
+
+ $content.=qq{[[!blogcomment from="""$name""" timestamp="""$timestamp""" subject="""$subject""" text="""$comment"""]]\n\n};
+ $content=~s/\n/\r\n/g;
+ $form->field(name => "editcontent", value => $content, force => 1);
+ }
+
+ sub preprocess (@) {
+ my %params=@_;
+
+ my ($text, $date, $from, $subject, $r);
+
+ $text=IkiWiki::preprocess($params{page}, $params{destpage},
+ IkiWiki::filter($params{page}, $params{text}));
+ $from=exists $params{from} ? $params{from} : gettext("Anonymous");
+ $date=localtime($params{timestamp}) if exists $params{timestamp};
+ $subject=$params{subject} if exists $params{subject};
+
+ $r = qq{<div class="blogcomment"><dl>\n};
+ $r .= '<dt>' . gettext("From") . "</dt><dd>$from</dd>\n" if defined $from;
+ $r .= '<dt>' . gettext("Date") . "</dt><dd>$date</dd>\n" if defined $date;
+ $r .= '<dt>' . gettext("Subject") . "</dt><dd>$subject</dd>\n"
+ if defined $subject;
+ $r .= "</dl>\n" . $text . "</div>\n";
+
+ return $r;
+ }
+
+ 1;
+
+# [[smcv]]'s implementation
+
+I've started a smcvpostcomment plugin (to be renamed to postcomment if people like it, but I'm namespacing it while it's still experimental) which I think more closely resembles what Joey was after. The code is cargo-culted from a mixture of editpage and inline's "make a blog post" support - it has to use a lot of semi-internal IkiWiki:: functions (both of those plugins do too). It doesn't fully work yet, but I'll try to get it into a state where it basically works and can be published in the next week or two.
+
+My approach is:
+
+* Comments are intended to be immutable after posting (so, only editable by direct committers), so they go on internal pages (*._comment); these internal pages are checked in to the RCS (although later I might make this optional)
+
+* ?do=smcvpostcomment (in the CGI script) gives a form that lets logged-in users (later, optionally also anonymous users) create a new comment
+
+* \[[!smcvpostcomment]] just inserts a "Post comment" button into the current page, which goes to ?do=smcvpostcomment - it's intended to be used in conjunction with an \[[!inline]] that will display the comments
+
+* The title (subject line), author and authorurl are set with \[[!meta]] directives, just like the way aggregate does it (which means I'll probably have to disallow the use of those \[[!meta]] directives in the body of the comment, to avoid spoofing - obviously, spoofing can be detected by looking at RecentChanges or gitweb, but the expectation for blog-style comments is that the metadata seen in the comment can be trusted)
+
+* The initial plan is to have comments hard-coded to be in Markdown, with further directives not allowed - I'll relax this when I've worked out what ought to be allowed!
+
+I've also updated Marcelo's code (above) to current ikiwiki, and moved it to a "marceloblogcomment" namespace - it's in the "marcelocomments" branch of my repository (see <http://git.debian.org/?p=users/smcv/ikiwiki.git;a=log;h=refs/heads/marcelocomments>). I had to reconstitute the .tmpl file, which Marcelo didn't post here.
+
+--[[smcv]]
+
+OK, the postcomment branch in my repository contains an implementation. What
+do you think so far? Known issues include:
+
+* The combination of RSS/Atom links and the "post new comment..." button is
+ ugly - I need a way to integrate the "new comment" button into the feed links
+ somehow, like the way inline embeds its own "new blog post..." feature
+ (I don't think the current way really scales, though)
+
+* There are some tweakables (whether to commit comments into the VCS, whether
+ wikilinks are allowed, whether directives are allowed) that are theoretically
+ configurable, but are currently hard-coded
+
+* The wikilink/directive disarming doesn't work unless you have
+ prefixdirectives set (which I just realised)
+
+* \[[!smcvpostcomment]] now displays the comments too, by invoking \[[!inline]]
+ with suitable parameters - but it does so in a very ugly way
+
+* Start-tags in a comment with no corresponding end-tag break page formatting
+ (unless htmltidy is enabled - inline and aggregate have the same problem)
+
+* There is no access control, so anonymous users can always comment, and so
+ can all logged-in users. Perhaps we need to extend canedit() to support
+ different types of edit? Or perhaps I should ignore canedit() and make the
+ access control configurable via a parameter to \[[!smcvpostcomment]]?
+ I'd like to be able to let anonymous (or at least non-admin) users comment
+ on existing pages, but not edit or create pages (but perhaps I'm being too
+ un-wikiish).
+
+--[[smcv]]
+
+I've updated smcvpostcomment and publicised it as [[plugins/contrib/comments]]. --[[smcv]]
+
+> While there is still room for improvement and entirely other approaches,
+> I am calling this done since smcv's comments plugin is ready. --[[Joey]]
+
+[[done]]
diff --git a/doc/todo/svg.mdwn b/doc/todo/svg.mdwn
new file mode 100644
index 000000000..274ebf3e3
--- /dev/null
+++ b/doc/todo/svg.mdwn
@@ -0,0 +1,77 @@
+We should support SVG. In particular:
+
+* We could support rendering SVGs to PNGs when compiling the wiki. Not all browsers support SVG yet.
+
+* We could support editing SVGs via the web interface. SVG can contain unsafe content such as scripting, so we would need to whitelist safe markup.
+ * I am interested in seeing [svg-edit](http://code.google.com/p/svg-edit/) integrated -- [[EricDrechsel]]
+
+--[[JoshTriplett]]
+
+[[wishlist]]
+
+I'm allowing for inline SVG on my own installation. I've patched my
+copy of htmlscrubber.pm to allow safe MathML and SVG elements (as
+implemented in html5lib). <del datetime="2008-03-20T23:04-05:00">Here's a patch
+if anyone else is interested.</del>
+<ins datetime="2008-03-20T23:05-05:00">Actually, that patch wasn't quite
+right. I'll post a new one when it's working properly.</ins> --[[JasonBlevins]]
+
+* * *
+
+I'd like to hear what people think about the following:
+
+1. Including whitelists of elements and attributes for SVG and MathML in
+ htmlscrubber.
+
+2. Creating a whitelist of safe SVG (and maybe even HTML) style
+ attributes such as `fill`, `stroke-width`, etc.
+
+ This is how the [sanitizer][] in html5lib works. It shouldn't be too
+ hard to translate the relevant parts to Perl.
+
+ --[[JasonBlevins]], March 21, 2008 11:39 EDT
+
+[sanitizer]: http://code.google.com/p/html5lib/source/browse/trunk/ruby/lib/html5/sanitizer.rb
+
+* * *
+
+Another problem is that [HTML::Scrubber][] converts all tags to lowercase.
+Some SVG elements and attributes, such as `viewBox`, are mixed case. It seems that
+properly handling SVG might require moving to a different sanitizer.
+It seems that [HTML::Sanitizer][] has functions for sanitizing XHTML.
+Any thoughts? --[[JasonBlevins]], March 21, 2008 13:54 EDT
+
+[HTML::Scrubber]: http://search.cpan.org/~podmaster/HTML-Scrubber-0.08/Scrubber.pm
+[HTML::Sanitizer]: http://search.cpan.org/~nesting/HTML-Sanitizer-0.04/Sanitizer.pm
+
+I figured out a quick hack to make HTML::Scrubber case-sensitive by
+making the underlying HTML::Parser case-sensitive:
+
+ $_scrubber->{_p}->case_sensitive(1);
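+
+In context, that amounts to something like this (hypothetical
+surrounding code; only the `{_p}` poke is from the hack above, and it
+reaches into HTML::Scrubber's internals, so it may break with other
+versions of the module):
+
+    use HTML::Scrubber;
+
+    my $_scrubber = HTML::Scrubber->new(
+        allow => [qw(svg g rect circle path text title desc)],
+    );
+    # {_p} is the wrapped HTML::Parser; make it case-sensitive so
+    # mixed-case SVG markup like viewBox survives scrubbing
+    $_scrubber->{_p}->case_sensitive(1);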
+
+So now I've got a version of [htmlscrubber.pm][] ([diff][])
+which allows safe SVG and MathML elements and attributes (but no
+styles&mdash;do we need them?). I'd be thrilled to see this
+in the trunk if other people think it's useful.
+--[[JasonBlevins]], March 24, 2008 14:56 EDT
+
+[htmlscrubber.pm]:http://xbeta.org/gitweb/?p=xbeta/ikiwiki.git;a=blob;f=IkiWiki/Plugin/htmlscrubber.pm;h=3c0ddc8f25bd8cb863634a9d54b40e299e60f7df;hb=fe333c8e5b4a5f374a059596ee698dacd755182d
+[diff]: http://xbeta.org/gitweb/?p=xbeta/ikiwiki.git;a=blobdiff;f=IkiWiki/Plugin/htmlscrubber.pm;h=3c0ddc8f25bd8cb863634a9d54b40e299e60f7df;hp=3bdaccea119ec0e1b289a0da2f6d90e2219b8d66;hb=fe333c8e5b4a5f374a059596ee698dacd755182d;hpb=be0b4f603f918444b906e42825908ddac78b7073
+
+> Unfortunately these links are broken. --[[Joey]]
+
+* * *
+
+Actually, there's a way to embed SVG into MarkDown sources using the [data: URI scheme][rfc2397], [like this](data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBzdGFuZGFsb25lPSJubyI/Pgo8c3ZnIHdpZHRoPSIxOTIiIGhlaWdodD0iMTkyIiB4bWxuczp4bGluaz0iaHR0cDovL3d3dy53My5vcmcvMTk5OS94bGluayIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj4KIDwhLS0gQ3JlYXRlZCB3aXRoIFNWRy1lZGl0IC0gaHR0cDovL3N2Zy1lZGl0Lmdvb2dsZWNvZGUuY29tLyAtLT4KIDx0aXRsZT5IZWxsbywgd29ybGQhPC90aXRsZT4KIDxnPgogIDx0aXRsZT5MYXllciAxPC90aXRsZT4KICA8ZyB0cmFuc2Zvcm09InJvdGF0ZSgtNDUsIDk3LjY3MTksIDk3LjY2OCkiIGlkPSJzdmdfNyI+CiAgIDxyZWN0IHN0cm9rZS13aWR0aD0iNSIgc3Ryb2tlPSIjMDAwMDAwIiBmaWxsPSIjRkYwMDAwIiBpZD0ic3ZnXzUiIGhlaWdodD0iNTYuMDAwMDAzIiB3aWR0aD0iMTc1IiB5PSI2OS42Njc5NjkiIHg9IjEwLjE3MTg3NSIvPgogICA8dGV4dCB4bWw6c3BhY2U9InByZXNlcnZlIiB0ZXh0LWFuY2hvcj0ibWlkZGxlIiBmb250LWZhbWlseT0ic2VyaWYiIGZvbnQtc2l6ZT0iMjQiIHN0cm9rZS13aWR0aD0iMCIgc3Ryb2tlPSIjMDAwMDAwIiBmaWxsPSIjZmZmZjAwIiBpZD0ic3ZnXzYiIHk9IjEwNS42NjgiIHg9Ijk5LjY3MTkiPkhlbGxvLCB3b3JsZCE8L3RleHQ+CiAgPC9nPgogPC9nPgo8L3N2Zz4=).
+Of course, with this approach one needs to click a link to display the image, but that may be considered a feature.
+&mdash;&nbsp;[[Ivan_Shmakov]], 2010-03-12Z.
+
+[rfc2397]: http://tools.ietf.org/html/rfc2397
+
+> You can do the same with img src actually.
+>
+> If svg markup allows unsafe elements (ie, javascript),
+> which it appears to,
+> then this is a security hole, and the htmlscrubber
+> needs to lock it down more. Darn, now I have to spend my afternoon making
+> security releases! --[[Joey]]
diff --git a/doc/todo/syntax_highlighting.mdwn b/doc/todo/syntax_highlighting.mdwn
new file mode 100644
index 000000000..3d122829b
--- /dev/null
+++ b/doc/todo/syntax_highlighting.mdwn
@@ -0,0 +1,120 @@
+There's been a lot of work on contrib syntax highlighting plugins. One should be
+picked and added to ikiwiki core.
+
+We want to support both converting whole source files into wiki
+pages, as well as doing syntax highlighting as a preprocessor directive
+(which is either passed the text, or reads it from a file). But,
+the [[ikiwiki/directive/format]] directive makes this easy enough to
+do if the plugin only supports whole source files. So, syntax plugins
+do not really need their own preprocessor directive, unless it makes
+things easier for the user.
+
+## The big list of possibilities
+
+* [[plugins/contrib/highlightcode]] uses [[!cpan Syntax::Highlight::Engine::Kate]],
+ operates on whole source files only, has a few bugs (see
+ [here](http://u32.net/Highlight_Code_Plugin/), and needs to be updated to
+ support [[bugs/multiple_pages_with_same_name]]. (Currently a 404 :-( )
+* [[!cpan IkiWiki-Plugin-syntax]] only operates as a directive.
+ Interestingly, it supports multiple highlighting backends, including Kate
+ and Vim.
+* [[plugins/contrib/syntax]] only operates as a directive
+ ([[not_on_source_code_files|automatic_use_of_syntax_plugin_on_source_code_files]]),
+ and uses [[!cpan Text::VimColor]].
+* [[plugins/contrib/sourcehighlight]] uses source-highlight, and operates on
+ whole source files only. Needs to be updated to
+ support [[bugs/multiple_pages_with_same_name]].
+* [[sourcecode|todo/automatic_use_of_syntax_plugin_on_source_code_files/discussion]]
+ also uses source-highlight, and operates on whole source files.
+ Updated to work with the fix for [[bugs/multiple_pages_with_same_name]]. Untested with files with no extension, e.g. `Makefile`.
+* [[users/jasonblevins]]'s code plugin uses source-highlight, and supports both
+ whole file and directive use.
+
+* [hlsimple](http://pivot.cs.unb.ca/git/?p=ikiplugins.git;a=blob_plain;f=IkiWiki/Plugin/hlsimple.pm;hb=HEAD) is a wrapper for the perl module [[!cpan Syntax::Highlight::Engine::Simple]]. This is pure perl, pretty simple, and uses CSS. It ought to be pretty fast (according to the author, and because it is not external).
+On the other hand, there are not many predefined languages yet. Defining language syntaxes is about as much
+work as source-highlight, but in perl. I plan to package the base module for debian. Perhaps after the author
+releases the 5 or 6 language definitions he has running on his web site, it might be suitable for inclusion in ikiwiki. [[DavidBremner]]
+
+* [[plugins/highlight]] uses [highlight](http://www.andre-simon.de) via
+ its swig bindings. It optionally supports whole files, but also
+ integrates with the format directive to allow formatting of *any* of
+ highlight's supported formats. (For whole files, it uses either
+ keepextension or noextension, as appropriate for the type of file.)
+
+## General problems / requirements
+
+* Using non-perl syntax highlighting backends is slower. All things equal,
+ I'd prefer either using a perl module, or a multiple-backend solution that
+ can use a perl module as one option. (Or, if there's a great highlighter
+ python module, we could use an external plugin..)
+
+ Of course, some perl modules are also rather slow.. Kate, for example
+ can only process about 33 lines of C code, or 14 lines of
+ debian/changelog per second. That's **30 times slower than markdown**!
+
+ By comparison, source-highlight can do about 5000 lines of C code per
+ second... And launching the program 100 times on an empty file takes about
+ 5 seconds, which isn't bad. And, it has a C++ library, which it
+ seems likely perl bindings could be written for, to eliminate
+ even that overhead.
+ > [highlight](http://www.andre-simon.de) has similar features to source-highlight, and swig bindings
+ > that should make it trivial in principle to call from perl. I like highlight a bit better because
+ > it has a pass-through feature that I find very useful. My memory is unfortunately a bit fuzzy as to how
+ > well the swig bindings work. [[DavidBremner]]
+
+* Engines that already support a wide variety of file types are of
+ course preferred. If the engine doesn't support a particular type
+ of file, it could fall back to doing something simple like
+ adding line numbers. (IkiWiki-Plugin-syntax does this.)
+* XHTML output.
+* Emitting html that uses CSS to control the display is preferred,
+ since it allows for easy user customization. (Engine::Simple does
+ this; Kate can be configured to do it; source-highlight can be
+ made to do it via the switches `--css /dev/null --no-doc`)
+* Nothing seems to support
+ [[wiki-formatted_comments|wiki-formatted_comments_with_syntax_plugin]]
+ inside source files. Doing this probably means post-processing the
+ results of the highlighting engine, to find places where it's highlighted
+ comments, and then running them through the ikiwiki rendering pipeline.
+ This seems fairly doable with [[!cpan Syntax::Highlight::Engine::Kate]],
+ at least.
+* The whole-file plugins tend to have a problem that things that look like
+ wikilinks in the source code get munged into links by ikiwiki, which can
+ have confusing results. Similar problem with preprocessor directives.
+ One approach that's also been requested for eg,
+ [[plugins/contrib/mediawiki]] is to allow controlling which linkification
+ types a page type can have on it.
+
+ > The previous two points seem to be related. One thought: instead of
+ > getting the source from the `content` parameter, the plugin could
+ > re-load the page source. That would stop directives/links from
+ > being processed in the source. As noted above, comments
+ > could then be parsed for directives/links later.
+ >
+ > Would it be worth adding a `nodirectives` option when registering
+ > an htmlize hook that switches off directive and link processing before
+ > generating the html for a page?
+
+* The whole-file plugins all get confused if there is a `foo.c` and a `foo.h`.
+ This is trivially fixable now by passing the keepextension option when
+ registering the htmlize hooks, though. There's also a noextension option
+ that should handle the
+ case of source files with names that do not contain an extension (ie,
+"Makefile") -- in this case you just register the whole filename
+ in the htmlize hook.
+* Whole-file plugins register a bunch of htmlize hooks. The wacky thing
+ about it is that, when creating a new page, you can then pick "c" or
+ "h" or "pl" etc from the dropdown that normally has "Markdown" etc in it.
+ Is this a bug, or a feature? Even if a feature, plugins with many
+ extensions make the dropdown unusable..
+
+ Perhaps the thing to do here is to use the new `longname` parameter to
+ the format hook, to give them all names that will group together at or
+ near the end of the list. Ie: "Syntax: perl", "Source code: c", etc.
+
+---
+
+I'm calling this [[done]] since I added the [[plugins/highlight]]
+plugin. There are some unresolved issues touched on here,
+but they either have their own bug reports, or are documented
+as semi-features in the docs to the plugin. --[[Joey]]
diff --git a/doc/todo/syntax_highlighting/discussion.mdwn b/doc/todo/syntax_highlighting/discussion.mdwn
new file mode 100644
index 000000000..27cb7084b
--- /dev/null
+++ b/doc/todo/syntax_highlighting/discussion.mdwn
@@ -0,0 +1,28 @@
+sourcehighlight is annoyingly slow, but it does support wiki directives
+in comments. See [here](http://www.cs.unb.ca/~bremner/teaching/java_examples/snippet/ListMerge/)
+for an example (tags).
+
+> I think that is just a result of it expanding directives, and wikilinks,
+> everywhere in the file, which is generally a possible problem..
+> --[[Joey]]
+
+* * * * *
+
+I think having the option to choose source code page types from the
+dropdown list is definitely a feature. This gives users an easy way
+to contribute programs (say `.pl` files) or code snippets (like, for
+example, the Elisp area of the EmacsWiki). Actually, would there be any
+other way to create a `.pl` file without write access to the
+repository? --[[JasonBlevins]]
+
+> Well, you can upload them as an attachment if the wiki is configured to
+> allow it. Having them in the drop down becomes a problem when there are
+> so many wacky extensions in there that you can't find anything.
+> --[[Joey]]
+
+>> I should just note that the
+>> [[sourcecode|todo/automatic_use_of_syntax_plugin_on_source_code_files/discussion]]
+>> plugin only adds the file extensions listed in the config. This shouldn't cause
+>> massive drop-down menu pollution. -- [[Will]]
+
+>>> That seems to be the way to go! --[[Joey]]
diff --git a/doc/todo/syslog_should_show_wiki_name.mdwn b/doc/todo/syslog_should_show_wiki_name.mdwn
new file mode 100644
index 000000000..068979514
--- /dev/null
+++ b/doc/todo/syslog_should_show_wiki_name.mdwn
@@ -0,0 +1,8 @@
+We run many ikiwikis on wiki.cs.pdx.edu, and we occasionally get messages like this in the syslog `user.log` file:
+
+ Apr 26 07:31:41 svcs ikiwiki: bad page name
+ Apr 26 07:43:26 svcs ikiwiki: bad page name
+
+Those don't give us any information about which ikiwiki they came from. ikiwiki needs to provide the wiki name in syslog messages. --[[JoshTriplett]]
+
+[[done]]
diff --git a/doc/todo/table_with_header_column.mdwn b/doc/todo/table_with_header_column.mdwn
new file mode 100644
index 000000000..e9fad9c54
--- /dev/null
+++ b/doc/todo/table_with_header_column.mdwn
@@ -0,0 +1,7 @@
+Tables support a header row or no header, but do not support a header column.
+
+> I have proposed a patch to the table plugin that enables such behaviour: [[table/discussion|plugins/table/discussion]].
+>
+> -- [[AlexandreDupas]]
+
+>> [[applied|done]]
diff --git a/doc/todo/tag_pagespec_function.mdwn b/doc/todo/tag_pagespec_function.mdwn
new file mode 100644
index 000000000..3604a83d9
--- /dev/null
+++ b/doc/todo/tag_pagespec_function.mdwn
@@ -0,0 +1,41 @@
+Implementing tags in terms of links is clever, but it would be nice if it was
+opaque in both directions: tagging and matching tags. Writing pagespecs to
+find out which pages are tagged with a given name means that the pagespec is
+tied to whatever the tagbase is.
+
+This patch adds a pagespec function 'tag' which lets you write pagespecs to
+match tagged pages independent of whatever the tagbase is set to.
+
+ -- [[users/Jon]] 2009/02/17
+
+> So, this looks good, appreciate the patch.
+>
+> The only problem I see is it could be confusing if `tag(foo)` matched
+> a page that just linked to the tag via a wikilink, w/o actually tagging it.
+
+>> (My [[!taglink wishlist]].) Yes, this is confusing and not nice. I observed this misbehavior, because I wanted to match two different lists of pages (only tagged or linked in any way), but it didn't work. Would this feature require a complex patch? --Ivan Z.
+
+>>> If you link to a page 'foo' which happens to be a tag then the page you link from will turn up in the set of pages returned by tagged(foo). The only way to avoid this would be for the tag plugin to not use wikilinks as an implementation method. That itself would not be too hard to do, but there might be people relying on the older behaviour. A better alternative might be to have a "tag2" plugin (or a better name) which implements tagging entirely separately. -- [[Jon]]
+>>>> I see; at least, your response is encouraging (that it's not hard). I could even find some work that can give similar features: [[structured page data#another_kind_of_links]] -- they envisage a pagespec like `data_link(Depends on,bugs/bugA)`, thus a "separation of dependencies between bugs from arbitrary links".
+
+>>>> Indeed, having many relations that can be used in the formulas defining classes of objects (like pagespecs here) is a commonly imagined thing, so this would be a nice feature. (I'll be trying out the patches there first, probably.) In general, extending the language of pagespecs to something more powerful (like [[!wikipedia description logics]]) seems to be a nice possible feature. I saw more discussions of ideas [[!taglink about_the_extension_of_the_pagespec_language_in_the_direction_similar_to_description_logics|pagespec_in_DL_style]] somewhere else here. --Ivan Z.
+
+> One other thing, perhaps it should be called `tagged()`? --[[Joey]]
+
+[[!tag patch done]]
+
+ --- a/plugins/IkiWiki/Plugin/tag.pm 2009-02-16 11:30:11.000000000 +0000
+ +++ b/plugins/IkiWiki/Plugin/tag.pm 2009-02-17 15:40:03.000000000 +0000
+ @@ -125,4 +125,12 @@
+ }
+ }
+
+ +package IkiWiki::PageSpec;
+ +
+ +sub match_tag ($$;@) {
+ + my $page = shift;
+ + my $glob = shift;
+ + return match_link($page, IkiWiki::Plugin::tag::tagpage($glob));
+ +}
+ +
+ 1
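For reference, with the `match_tag` function in the patch above (later renamed `tagged()`, per Joey's comment), a pagespec can select tagged pages without hard-coding the tagbase, e.g.:

```
\[[!inline pages="tag(patch) and !*/Discussion" archive="yes"]]
```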
diff --git a/doc/todo/tagging_with_a_publication_date.mdwn b/doc/todo/tagging_with_a_publication_date.mdwn
new file mode 100644
index 000000000..39fc4e220
--- /dev/null
+++ b/doc/todo/tagging_with_a_publication_date.mdwn
@@ -0,0 +1,71 @@
+Feature idea: I'd like to be able to tag pages in an ikiwiki blog with a
+publication date, and have the option of building a blog that excludes
+items whose publication date is in the future. (meta pubdate= ?)
+
+I'm using ikiwiki on git for a "tip of the day" RSS feed, and I'd like to
+be able to queue up a bunch of items instead of literally putting in one
+tip per day. In the future I think this will come in handy for other
+Mainstream Media-oriented requirements such as "embargo dates" and "editor
+on vacation".
+
+> The problem with implementing a feature like this is that, since ikiwiki
+> is a wiki compiler, if something causes content to change based on the
+> date, then the wiki needs to be rebuilt periodically. So you'd need a
+> cron job or something.
+>
+> Thinking about this some more, if you're going to have a cron job, you
+> could just set up a branch containing the future post. The branch could
+> have a name like 20080911. Then have the cron job git merge the day's
+> branch, if any, into master each day. And voila, the post is completely hidden
+> until published. You'd want to avoid merge conflicts in your cron job ..
+> but they'd be unlikely if you limited yourself to just adding new
+> pages. Alternatively, for larger organisations wishing to deploy more
+> sweeping changes on a given date, replace cron job with intern.. ;-)
+> --[[Joey]]
+
+> > Good approach if you have one day on which a big change goes through, but
+> > often the reason for tagging with a publication date is so that you can
+> > dribble out articles one per day when you're gone for a week. Branches are easy
+> > in git, but it would still be an extra step to switch branches every time
+> > you edit a different day's article.
+> >
+> > And just to make it a little harder, some sites might want an internal
+> > copy of the wiki that _does_ build the future pages, just tags them with the publication
+> > date, for previewing.
+> >
+> > One more reason to have publication date: if you move a page from your old CMS to ikiwiki
+> > and want to have it show up in the right order in RSS feeds.
+> >
+> > I no longer have the original wiki for which I wanted this feature, but I can
+> > see using it on future ones. -- [[DonMarti]]
+
+>>> FWIW, for the case where one wants to update a site offline,
+>>> using an ikiwiki instance on a laptop, and include some deferred
+>>> posts in the push, the ad-hoc cron job type approach will be annoying.
+>>>
+>>> In modern ikiwiki, I guess the way to accomplish this would be to
+>>> add a pagespec that matches only pages posted in the present or past.
+>>> Then a page can have its post date set to the future, using meta date,
+>>> and only show up when its post date rolls around.
+>>>
+>>> Ikiwiki will need to somehow notice that a pagespec began matching
+>>> a page it did not match previously, despite said page not actually
+>>> changing. I'm not sure what the best way is.
+>>>
+>>> * One way could be to
+>>> use a needsbuild hook and some stored data about which pagespecs
+>>> exclude pages in the future. (But I'm not sure how evaluating the
+>>> pagespec could lead to that metadata and hook being set up.)
+>>> * Another way would be to use an explicit directive to delay a
+>>> page being posted. Then the directive stores the metadata and
+>>> sets up the needsbuild hook.
+>>> * Another way would be for ikiwiki to remember the last
+>>> time it ran. It could then easily find pages that have a post
+>>> date after that time, and treat them the same as it treats actually
+>>> modified files. Or a plugin could do this via a needsbuild hook,
+>>> probably. (Only downside to this is it would probably need to do
+>>> a O(n) walk of the list of pages -- but only running an integer
+>>> compare per page.)
+>>>
+>>> You'd still need a cron job to run ikiwiki -refresh every hour, or
+>>> whatever, so it can update. --[[Joey]]
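The pagespec idea above can be sketched in a few lines of plain Perl. This is an illustrative, self-contained sketch, not ikiwiki code: the `%pagectime` hash and the `match_posted` name are made up for the example, and real ikiwiki pagespec functions return match objects rather than plain booleans.

```perl
#!/usr/bin/perl
# Illustrative sketch of the pagespec idea above: a match function that
# only matches pages whose post date has arrived. NOT ikiwiki code;
# %pagectime and match_posted are made-up names for the example.
use strict;
use warnings;

my %pagectime = (
    'blog/old_post'    => time() - 86400,   # dated yesterday
    'blog/queued_post' => time() + 86400,   # dated tomorrow, via meta date
);

sub match_posted {
    my $page = shift;
    return 0 unless exists $pagectime{$page};
    # Only "published" once the stored post date has rolled around.
    return $pagectime{$page} <= time() ? 1 : 0;
}

# An inline could then use a pagespec along these lines;
# here we just filter the hash directly.
my @published = grep { match_posted($_) } sort keys %pagectime;
print "published: @published\n";
```

A cron-driven `ikiwiki -refresh` would still be needed so that pages crossing their post date actually get rebuilt, as noted above.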
diff --git a/doc/todo/tags.mdwn b/doc/todo/tags.mdwn
new file mode 100644
index 000000000..ee9151116
--- /dev/null
+++ b/doc/todo/tags.mdwn
@@ -0,0 +1,12 @@
+Stuff still needing to be done with [[/tags]]:
+
+* It's unfortunate that the rss category (tag) support doesn't include
+ a domain="" attribute in the category elements. That would let readers
+ know how to follow back to the tag page in the wiki. However, the domain
+ attribute is specified to be the base url, to which the category is just
+ appended. So there's no way to add ".html", so the url won't be right.
+
+ This is one good argument for changing ikiwiki so that pages are all
+ dir/index.html, then a link to just "dir" works.
+
+ So this could be implemented now for userdirs wikis..
diff --git a/doc/todo/target_filter_for_brokenlinks.mdwn b/doc/todo/target_filter_for_brokenlinks.mdwn
new file mode 100644
index 000000000..137277c21
--- /dev/null
+++ b/doc/todo/target_filter_for_brokenlinks.mdwn
@@ -0,0 +1,9 @@
+[[!tag wishlist]]
+
+Currently, [[plugins/brokenlinks]] supports filtering by the place where a broken wikilink is used.
+
+Filtering by the target of the broken link would also be useful, e.g.,
+
+ \[[!brokenlinks matching="tagbase/*"]]
+
+would list the tags not yet "filled out". --Ivan Z.
diff --git a/doc/todo/terminalclient.mdwn b/doc/todo/terminalclient.mdwn
new file mode 100644
index 000000000..b420a3d17
--- /dev/null
+++ b/doc/todo/terminalclient.mdwn
@@ -0,0 +1,10 @@
+Hack together a local ikiwiki w/o a web server using w3m's cgi-less mode
+and $EDITOR. Browse around a wiki, select pages to edit and get dropped
+right into the editor and have the page committed to svn automatically.
+
+[[todo/done]]
+
+Less grandiosely, a simple command line util to add a new page would be
+useful, especially if it made it easy to add blog entries to the wiki. I
+have a special purpose version of this in my [blog
+script](http://kitenet.net/~joey/code/bin.html).
diff --git a/doc/todo/test_coverage.mdwn b/doc/todo/test_coverage.mdwn
new file mode 100644
index 000000000..56e11e01a
--- /dev/null
+++ b/doc/todo/test_coverage.mdwn
@@ -0,0 +1,24 @@
+[[!tag patch]]
+[[!template id=gitbranch branch=smcv/coverage author="[[smcv]]"]]
+
+It would be nice for `make coverage` (or something) to produce a HTML
+test-coverage report. I found this very useful for test-driven development of
+[[plugins/contrib/trail]].
+
+Limitations of the current branch, which uses [[!cpan Devel::Cover]]:
+
+* Some tests use `./blib` and some use `.` so coverage gets split between
+ the two copies of each module; not a problem for [[plugins/contrib/trail]]
+ which only has one test.
+
+> How annoying. I think in at least some cases there is reason to use
+> `./blib` -- perhaps everything that uses `.` should be changed to use
+> it. --[[Joey]]
+
+* The [[plugins/git]] and [[plugins/mercurial]] plugins want to `chdir`,
+ and so does [[!cpan Devel::Cover]], so they fight. For now, those tests
+ are disabled under `make coverage`.
+
+--[[smcv]]
+
+> [[merged|done]] --[[Joey]]
diff --git a/doc/todo/themes_should_ship_with_templates.mdwn b/doc/todo/themes_should_ship_with_templates.mdwn
new file mode 100644
index 000000000..875e5ce89
--- /dev/null
+++ b/doc/todo/themes_should_ship_with_templates.mdwn
@@ -0,0 +1,19 @@
+if i understand [[todo/multiple_template_directories]] correctly, i read it as template directories being separate from themes. shouldn't a themer be able to ship more than just CSS and instead override (say) the page.tmpl template? -- [[anarcat]]
+
+> A theme can ship any files it wants to, including templates (in the
+> templates/ directory).
+>
+> So far none of them do; I'd much rather have one version of page.tmpl to
+> maintain than one per theme.. --[[Joey]]
+
+> > that, dear author, is amazingly simple, intuitive and useful. why didn't i try this before! :) thank you very much!
+> >
+> > i do agree that having a completely different template file is tricky, but I think it's important for themes to be able to modify those files, otherwise making themes is *really* hard. besides, the burden of maintenance falls on the theme maintainer, and as long as it's not in ikiwiki core, you have nothing to worry about... right? ;)
+> >
+> > i have just ported the [night_city theme](http://www.openwebdesign.org/viewdesign.phtml?id=3318) to ikiwiki, and it was a nightmare when i wasn't modifying the page.tmpl... but now it's done! i'll try to publish my changes somewhere when i finish figuring out how to share small things like that easily. ;) ([[suggestions|submodule_support]]?)
+> >
+> > i guess the only thing to be done here is to update the documentation so that this is clearer - where do you suggest I do that? --[[anarcat]]
+> >
+> > > i have updated the [[plugins/theme]] documentation, hopefully that was the right place. so i think this is [[done]] --[[anarcat]]
+
+>>>> I would love it if you would share your theme! Thanks, Adam. :-)
diff --git a/doc/todo/tidy_git__39__s_ctime_debug_output.mdwn b/doc/todo/tidy_git__39__s_ctime_debug_output.mdwn
new file mode 100644
index 000000000..bfc130d69
--- /dev/null
+++ b/doc/todo/tidy_git__39__s_ctime_debug_output.mdwn
@@ -0,0 +1,15 @@
+ Index: IkiWiki/Rcs/git.pm
+ ===================================================================
+ --- IkiWiki/Rcs/git.pm (revision 4532)
+ +++ IkiWiki/Rcs/git.pm (working copy)
+ @@ -467,7 +467,7 @@
+ my $sha1 = git_sha1($file);
+ my $ci = git_commit_info($sha1);
+ my $ctime = $ci->{'author_epoch'};
+ - debug("ctime for '$file': ". localtime($ctime) . "\n");
+ + debug("ctime for '$file': ". localtime($ctime));
+
+ return $ctime;
+ }
+
+[[!tag patch done]]
diff --git a/doc/todo/tla.mdwn b/doc/todo/tla.mdwn
new file mode 100644
index 000000000..b6b082cfe
--- /dev/null
+++ b/doc/todo/tla.mdwn
@@ -0,0 +1,7 @@
+* Need to get post commit hook code working.
+* Need some example urls for web based diffs.
+* `rcs_commit_staged`, `rcs_rename`, `rcs_remove`, are not
+ implemented for tla, and so attachments, remove and rename plugins
+ cannot be used with it. (These should be fairly easy to add..)
+
+[[!tag rcs/tla]]
diff --git a/doc/todo/tmplvars_plugin.mdwn b/doc/todo/tmplvars_plugin.mdwn
new file mode 100644
index 000000000..2fe819682
--- /dev/null
+++ b/doc/todo/tmplvars_plugin.mdwn
@@ -0,0 +1,75 @@
+A simple plugin to allow per-page customization of a template by passing parameters to HTML::Template. For those times when a whole pagetemplate is too much work. --Ethan
+
+[[!tag patch]]
+
+> The implementation looks fine to me (assuming it works with current ikiwiki),
+> apart from the "XXX" already noted in the patch. The design could reasonably
+> be considered premature generalization, though - how often do you actually
+> need to define new tmplvars?
+>
+> As for the page/destpage/preview thing, it would be good if the preprocess
+> hook could distinguish between software-supplied and user-supplied
+> parameters (the [[plugins/tag]] plugin could benefit from this too). Perhaps
+> the IkiWiki core could be modified so that
+> `hook(type => "preprocess", splitparams => 1, ...)` would invoke preprocess
+> with { page => "foo", destpage => "bar", ... } as a special first argument,
+> and the user-supplied parameters as subsequent arguments? Then plugins like
+> tag could use:
+>
+> my $ikiparams = shift;
+> my %params = @_;
+>
+> add_tags($ikiparams->{page}, keys %params);
+>
+> --[[smcv]]
+
+----
+
+ #!/usr/bin/perl
+ package IkiWiki::Plugin::tmplvars;
+
+ use warnings;
+ use strict;
+ use IkiWiki 2.00;
+
+ my %tmplvars;
+
+ sub import {
+ hook(type => "preprocess", id => "tmplvars", call => \&preprocess);
+ hook(type => "pagetemplate", id => "tmplvars", call => \&pagetemplate);
+ }
+
+ sub preprocess (@) {
+ my %params=@_;
+
+ if ($params{page} eq $params{destpage}) {
+ my $page = $params{page};
+ if (! defined $tmplvars{$page}) {
+ $tmplvars{$page} = {};
+ }
+ # XXX: The only way to get at just the user-specified params is
+ # to try to remove all the Ikiwiki-supplied ones.
+ delete $params{page};
+ delete $params{destpage};
+ delete $params{preview};
+ foreach my $arg (keys %params){
+ $tmplvars{$page}->{$arg} = $params{$arg};
+ }
+ }
+
+ }
+
+ sub pagetemplate (@) {
+ my %params=@_;
+ my $template = $params{template};
+
+ if (exists $tmplvars{$params{page}}) {
+ foreach my $arg (keys %{$tmplvars{$params{page}}}){
+ $template->param($arg => $tmplvars{$params{page}}->{$arg});
+ }
+ }
+
+ return undef;
+ }
+
+ 1
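For context, a page would presumably use the plugin above with a directive like this (the variable names are made up):

```
\[[!tmplvars mood="cheerful" accent_color="blue"]]
```

A customized `page.tmpl` containing `<TMPL_VAR MOOD>` could then render that value on just that page.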
diff --git a/doc/todo/tmplvars_plugin/discussion.mdwn b/doc/todo/tmplvars_plugin/discussion.mdwn
new file mode 100644
index 000000000..93cb9b414
--- /dev/null
+++ b/doc/todo/tmplvars_plugin/discussion.mdwn
@@ -0,0 +1 @@
+I find this plugin quite useful. One thing I would like to be able to do is set a tmplvar in, e.g., a sidebar, so that it affects all pages the sidebar is used on. --martin
diff --git a/doc/todo/toc-with-human-readable-anchors.mdwn b/doc/todo/toc-with-human-readable-anchors.mdwn
new file mode 100644
index 000000000..0f358f4e6
--- /dev/null
+++ b/doc/todo/toc-with-human-readable-anchors.mdwn
@@ -0,0 +1,7 @@
+The [[/plugins/toc]] plugin is very useful but it creates anchors with names such as #index1h3
+
+In #ikiwiki today, another user and I were in agreement that an option for human readable anchors would be preferable.
+
+> +1 - i would love to see that happen too. Here's a patch I wrote a while back for similar functionality in moinmoin: https://svn.koumbit.net/koumbit/trunk/patches/moinmoin/nice_headings.patch -- [[anarcat]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/toc_plugin:_set_a_header_ceiling___40__opposite_of_levels__61____41__.mdwn b/doc/todo/toc_plugin:_set_a_header_ceiling___40__opposite_of_levels__61____41__.mdwn
new file mode 100644
index 000000000..07d2d383c
--- /dev/null
+++ b/doc/todo/toc_plugin:_set_a_header_ceiling___40__opposite_of_levels__61____41__.mdwn
@@ -0,0 +1,48 @@
+It would be nice if the [[plugins/toc]] plugin let you specify a header level "ceiling" above which (or above and including which) the headers would not be incorporated into the toc.
+
+Currently, the levels=X parameter lets you tweak how deep it will go for small headers, but I'd like to chop off the h1's (as I use them for my page title) -- [[Jon]]
+
+> This change to toc.pm should do it. --[[KathrynAndersen]]
+
+> > The patch looks vaguely OK to me but it's hard to tell without
+> > context. It'd be much easier to review if you used unified diff
+> > (`diff -u`), which is what `git diff` defaults to - almost all
+> > projects prefer to receive changes as unified diffs (or as
+> > branches in their chosen VCS, which is [[git]] here). --[[smcv]]
+
+> > > Done. -- [[KathrynAndersen]]
+
+> > > > Looks like Joey has now [[merged|done]] this. Thanks! --[[smcv]]
+
+ --- /files/git/other/ikiwiki/IkiWiki/Plugin/toc.pm 2009-11-16 12:44:00.352050178 +1100
+ +++ toc.pm 2009-12-26 06:36:06.686512552 +1100
+ @@ -53,8 +53,8 @@
+ my $page="";
+ my $index="";
+ my %anchors;
+ - my $curlevel;
+ - my $startlevel=0;
+ + my $startlevel=($params{startlevel} ? $params{startlevel} : 0);
+ + my $curlevel=$startlevel-1;
+ my $liststarted=0;
+ my $indent=sub { "\t" x $curlevel };
+ $p->handler(start => sub {
+ @@ -67,10 +67,16 @@
+
+ # Take the first header level seen as the topmost level,
+ # even if there are higher levels seen later on.
+ + # unless we're given startlevel as a parameter
+ if (! $startlevel) {
+ $startlevel=$level;
+ $curlevel=$startlevel-1;
+ }
+ + elsif (defined $params{startlevel}
+ + and $level < $params{startlevel})
+ + {
+ + return;
+ + }
+ elsif ($level < $startlevel) {
+ $level=$startlevel;
+ }
+
+[[!tag patch]]
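With this patch, chopping off the h1 page titles described above would presumably look like:

```
\[[!toc levels=3 startlevel=2]]
```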
diff --git a/doc/todo/toc_plugin_to_skip_one_level.mdwn b/doc/todo/toc_plugin_to_skip_one_level.mdwn
new file mode 100644
index 000000000..4891a1197
--- /dev/null
+++ b/doc/todo/toc_plugin_to_skip_one_level.mdwn
@@ -0,0 +1,23 @@
+It would be great if I could do this:
+
+ \[[!toc levels=3 skip=1]]
+
+I use h1 for a big title on each page, and don't want it in my toc on that page.
+
+That way I could have a toc for h2 and h3, while h1 is skipped because it is the big title on many pages.
+
+> I realize there is a lot of personal preference involved here but before
+> another option is added, I wonder why you're using a h1 for a title on
+> each page when the page name already appears at the top of each page. And
+> if the page name isn't right for the title, you can use
+> \[[!meta title="foo"]] to override it. And this purposefully doesn't show
+> up in the toc. --[[Joey]]
+
+>> aaaahhh, I made a mistake. I used some other page.tmpl and title was hidden,
+>> so I used h1 for it. Thanks, consider this [[closed|done]] :)
+
+P.S. I tried to indent "[[toc..." to make it monospaced but it does not work?
+
+> It's monospaced here.
+
+>> ehhh, CSS bug on my part :(
diff --git a/doc/todo/toggle_initial_state.mdwn b/doc/todo/toggle_initial_state.mdwn
new file mode 100644
index 000000000..dbad39c75
--- /dev/null
+++ b/doc/todo/toggle_initial_state.mdwn
@@ -0,0 +1,6 @@
+It would be nice if one could set the initial state of the toggleable area.
+--[[rdennis]]
+
+[[!tag plugins/toggle]]
+
+[[done]]
diff --git a/doc/todo/toplevel_index.mdwn b/doc/todo/toplevel_index.mdwn
new file mode 100644
index 000000000..92cef99ac
--- /dev/null
+++ b/doc/todo/toplevel_index.mdwn
@@ -0,0 +1,37 @@
+Some inconsistencies around the toplevel [[index]] page:
+
+* [[ikiwiki]] is a separate page; links to [[ikiwiki]] would arguably be better
+  pointing to the index, though.
+
+ > At least for this wiki, I turned out to have a use for [[ikiwiki]]
+ > pointing to a different page, though the general point might still
+ > stand.
+
+* The toplevel [[ikiwiki/Discussion]] page has some weird parentlinks
+ behavior. This could be special cased around with the following patch.
+ However, I'm unsure if I like the idea of more special cases around this.
+ It would be better to find a way to make the toplevel index page not be a
+ special case at all.
+
+Here is a patch:
+
+ --- IkiWiki/Render.pm (revision 1187)
+ +++ IkiWiki/Render.pm (working copy)
+ @@ -71,6 +71,7 @@
+ my $path="";
+ my $skip=1;
+ return if $page eq 'index'; # toplevel
+ + $path=".." if $page=~s/^index\///;
+ foreach my $dir (reverse split("/", $page)) {
+ if (! $skip) {
+ $path.="../";
+
+ > Came up with a better patch for this, [[done]] --[[Joey]]
+
+---
+
+> I would like to suggest another tack, namely a bigger, better special case.
+> The basic idea is that all indices of the form foo/bar/index get the wiki path foo/bar.
+> You could do this today using [[todo/index.html_allowed]], except that the toplevel
+> page "index" becomes "", which causes all sorts of chaos. The discussion page would
+> become /discussion, and the weird parentlinks behavior would go away. --Ethan
diff --git a/doc/todo/tracking_bugs_with_dependencies.mdwn b/doc/todo/tracking_bugs_with_dependencies.mdwn
new file mode 100644
index 000000000..456dadad0
--- /dev/null
+++ b/doc/todo/tracking_bugs_with_dependencies.mdwn
@@ -0,0 +1,680 @@
+[[!tag patch patch/core]]
+
+I like the idea of [[tips/integrated_issue_tracking_with_ikiwiki]], and I do so on several wikis. However, as far as I can tell, ikiwiki has no functionality which can represent dependencies between bugs and allow pagespecs to select based on dependencies. For instance, I can't write a pagespec which selects all bugs with no dependencies on bugs not marked as done. --[[JoshTriplett]]
+
+> I started having a think about this. I'm going to start with the idea that expanding
+> the pagespec syntax is the way to attack this. It seems that any pagespec that is going
+> to represent "all bugs with no dependencies on bugs not marked as done" is going to
+> need some way to represent "bugs not marked as done" as a collection of pages, and
+> then represent "bugs which do not link to pages in the previous collection".
+>
+> One way to do this would be to introduce variables into the pagespec, along with
+> universal and/or existential [[!wikipedia Quantification]]. That looks quite complex.
+>
+>> I thought about this briefly, and got about that far.. glad you got
+>> further. :-) --[[Joey]]
+
+>> Or, one [[!taglink could_also_refer|pagespec_in_DL_style]] to the language of [[!wikipedia description logics]]: their formulas actually define classes of objects through quantified relations to other classes. --Ivan Z.
+>
+> Another option would be go with a more functional syntax. The concept here would
+> be to allow a pagespec to appear in a 'pagespec function' anywhere a page can. e.g.
+> I could pass a pagespec to `link()` and that would return true if there is a link to any
+> page matching the pagespec. This makes the variables and existential quantification
+> implicit. It would allow the example requested above:
+>
+>> `bugs/* and !*/Discussion and !link(bugs/* and !*/Discussion and !link(done))`
+>
+> Unfortunately, this is also going to make the pagespec parsing more complex because
+> we now need to parse nested sets of parentheses to know when the nested pagespec
+> ends, and that isn't a regular language (we can't use regular expression matching for
+> easy parsing).
+>
+>> Also, it may cause ambiguities with page names that contain parens
+>> (though some such ambiguities already exist with the pagespec syntax).
+>
+> One simplification of that would be to introduce some pagespec [[shortcuts]]. We could
+> then allow pagespec functions to take either pages, or named pagespec shortcuts. The
+> pagespec shortcuts would just be listed on a special page, like current [[shortcuts]].
+> (It would probably be a good idea to require that shortcuts on that page can only refer
+> to named pagespecs higher up that page than themselves. That would stop some
+> looping issues...) These shortcuts would be used as follows: when trying to match
+> a page (without globs) you look to see if the page exists. If it does then you have a
+> match. If it doesn't, then you look to see if a similarly named pagespec shortcut
+> exists. If it does, then you check that pagespec recursively to see if you have a match.
+> The ordering requirement on named pagespecs stops infinite recursion.
+>
+> Does that seem like a reasonable first approach?
+>
+> -- [[Will]]
+
+>> Having a separate page for the shortcuts feels unwieldy.. perhaps
+>> instead the shortcut could be defined earlier in the scope of the same
+>> pagespec that uses it?
+>>
+>> Example: `define(~bugs, bugs/* and !*/Discussion) and define(~openbugs, ~bugs and !link(done)) and ~openbugs and !link(~openbugs)`
+
+>>> That could work. Parens are only ever nested one deep in that grammar, so it is regular and the current parsing would be OK.
+
+>> Note that I made the "~" explicit, not implicit, so it could be left out. In the case of ambiguity between
+>> a definition and a page name, the definition would win.
+
+>>> That was my initial thought too :), but when implementing it I decided that requiring the ~ made things easier. I'll probably require the ~ for the first pass at least.
+
+>> So, equivalent example: `define(bugs, bugs/* and !*/Discussion) and define(openbugs, bugs and !link(done)) and openbugs and !link(openbugs)`
+>>
+
+>> Re recursion, it is avoided.. but building a pagespec that is O(N^X) where N is the
+>> number of pages in the wiki is not avoided. Probably need to add DOS prevention.
+>> --[[Joey]]
+
+>>> If you memoize the outcomes of the named pagespecs you can make it O(N*X), no?
+>>> -- [[Will]]
+
+>>>> Yeah, guess that'd work. :-)
+
+> <a id="another_kind_of_links" />One quick further thought. All the above discussion assumes that 'dependency' is the
+> same as 'links to', which is not really true. For example, you'd like to be able to say
+> "This bug does not depend upon [ [ link to other bug ] ]" and not have a dependency.
+> Without having different types of links, I don't see how this would be possible.
+>
+> -- [[Will]]
+
+>> I saw that this issue is targeted at by the work on [[structured page data#another_kind_of_links]]. --Ivan Z.
+
+>>> It's fixed now; links can have a type, such as "tag", or "dependency",
+>>> and pagespecs can match links of a given type. --[[Joey]]
+
+Okie - I've had a quick attempt at this. Initial patch attached. This one doesn't quite work.
+And there is still a lot of debugging stuff in there.
+
+At the moment I've added a new preprocessor plugin, `definepagespec`, which is like a
+shortcut for pagespecs. To reference a named pagespec, use `~` like this:
+
+ [ [!definepagespec name="bugs" spec="bugs/* and !*/Discussion"]]
+ [ [!definepagespec name="openbugs" spec="~bugs and !link(done)"]]
+ [ [!definepagespec name="readybugs" spec="~openbugs and !link(~openbugs)"]]
+
+At the moment the problem is in `match_link()` when we're trying to find a sub-page that
+matches the appropriate page spec. There is no good list of pages available to iterate over.
+
+ foreach my $nextpage (keys %IkiWiki::pagesources)
+
+does not give me a good list of pages. I found the same thing when I was working on
+this todo [[todo/Add_a_plugin_to_list_available_pre-processor_commands]].
+
+> I'm not sure why iterating over `%pagesources` wouldn't work here, it's the same method
+> used by anything that needs to match a pagespec against all pages..? --[[Joey]]
+
+>> My unchecked hypothesis is that %pagesources is created after the refresh hook.
+>> I've also been concerned about how globally defined pagespec shortcuts would interact with
+>> the page dependency system. Your idea of internally defined shortcuts should fix that. -- [[Will]]
+
+>>> You're correct, the refresh hook is run very early, before pagesources
+>>> is populated. (It will be partially populated on a refresh, but will
+>>> not be updated to reflect new pages.) Agree that internally defined
+>>> seems the way to go. --[[Joey]]
+
+Immediately below is a patch which seems to basically work. Lots of debugging code is still there
+and it needs a cleanup, but I thought it worth posting at this point. (I was having problems
+with old style glob lists, so I just switched them off for the moment.)
+
+The following three inlines work for me with this patch:
+
+ Bugs:
+
+ [ [!inline pages="define(~bugs, bugs/* and ! */Discussion) and ~bugs" archive="yes"]]
+
+ OpenBugs:
+
+ [ [!inline pages="define(~bugs, bugs/* and ! */Discussion) and define(~openbugs,~bugs and !link(done)) and ~openbugs" archive="yes"]]
+
+ ReadyBugs:
+
+ [ [!inline pages="define(~bugs, bugs/* and ! */Discussion) and define(~openbugs,~bugs and !link(done)) and define(~readybugs,~openbugs and !link(~openbugs)) and ~readybugs" archive="yes"]]
+
+> Nice! Could the specfuncsref be passed in %params? I'd like to avoid
+> needing to change the prototype of every pagespec function, since several
+> plugins define them too. --[[Joey]]
+
+>> Maybe - it needs more thought. I also considered it when I was going through changing all those plugins :).
+>> My concern was that `%params` can contain other user-defined parameters,
+>> e.g. `link(target, otherparameter)`, and that means that the specFuncs could be clobbered by a user (or other
+>> weird security hole). I thought it better to separate it, but I didn't think about it too hard. I might move it to
+>> the first parameter rather than the second. Ikiwiki is my first real perl hacking and I'm still discovering
+>> good ways to write things in perl.
+>>
+>>>> `%params` contains the parameters passed to `pagespec_match`, not
+>>>> user-supplied parameters. The user-supplied parameter to a function
+>>>> like `match_glob()` or `match_link()` is passed in the second positional parameter. --[[Joey]]
+
+>>>>> OK. That seems reasonable then. The only problem is that my Perl-fu is not strong enough to make it
+>>>>> work. I really have to wonder what substance was influencing the designers of Perl...
+>>>>> I can't figure out how to use the %params. And I'm pissed off enough with Perl that I'm not going
+>>>>> to try and figure it out any more. There are two patches below now. The first one uses an extra
+>>>>> argument and works. The second one tries to use %params and doesn't - take your pick :-). -- [[Will]]
+
+>> What do you think is best to do about `is_globlist()`? At the moment it requires that the 'second word', as
+>> delimited by a space and ignoring parens, is 'and' or 'or'. This doesn't hold in the above example pagespecs (so I just hard wired it to 0 to test my patch).
+>> My thought was just to search for 'and' or 'or' as words anywhere in the pagespec. Thoughts?
+
+>>> Dunno, we could just finish deprecating it. Or change the regexp to
+>>> skip over spaces in parens. (`/[^\s]+\s+([^)]+)/`) --[[Joey]]
+
+>>>> I think I have a working regexp now.
+
+>> Oh, one more thing. In pagespec_translate (now pagespec_makeperl), there is a part of the regular expression for `# any other text`.
+>> This contained `()`, which has no effect. I replaced that with `\(\)`, but that is a change in the definition of pagespecs unrelated to the
+>> rest of this patch. In a related change, commands were not able to contain `)` in their parameters. I've extended that so they cannot
+>> contain `(` or `)`. -- [[Will]]
+
+>>> `[^\s()]+` is a character class matching all characters not spaces or
+>>> parens. Since the previous terminals in the regexp consume most
+>>> occurrences of an open paren or close paren, it's unlikely for one to
+>>> get through to that part of the regexp. For example, "foo()" will be
+>>> matched by the command matcher; "(foo)" will be matched by the open
+>>> paren literal terminal. "foo(" and "foo)" can get through to the
+>>> end, and would be matched as a page name, if it didn't exclude parens.
+>>>
+>>> So why exclude them? Well, consider "foo and(bar and baz)". We don't
+>>> want it to match "and(" as a page name!
+>>>
+>>> Escaping the parens in the character class actually changes nothing; the
+>>> changed character class still matches all characters not spaces or
+>>> parens. (Try it!).
+>>>
+>>> Re commands containing '(', I don't really see any reason not to
+>>> allow that, unless it breaks something. --[[Joey]]
+
+>>>> Oh, I didn't realise you didn't need to escape parens inside []. All else
+>>>> I understood. I have stopped commands from containing parens because
+>>>> once you allow that then you might have an extra level of depth in the parsing
+>>>> of define() statements. -- [[Will]]
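Joey's point about the character class is easy to verify outside Perl; this Python sketch uses the same character-class semantics (the behaviour is identical in Perl regexps):

```python
import re

# The character class, with and without escaped parens: identical behavior.
plain = re.compile(r'[^\s()]+')
escaped = re.compile(r'[^\s\(\)]+')

sample = "foo and(bar and baz)"
assert plain.findall(sample) == escaped.findall(sample)

# Excluding parens stops "and(" from being matched as a single page name:
no_paren_exclusion = re.compile(r'[^\s]+')
print(no_paren_exclusion.findall(sample))  # ['foo', 'and(bar', 'and', 'baz)']
print(plain.findall(sample))               # ['foo', 'and', 'bar', 'and', 'baz']
```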
+
+>>> Updated patch. Moved the specFuncsRef to the front of the arg list. Still haven't thought through the security implications of
+>>> having it in `%params`. I've also removed all the debugging `print` statements. And I've updated the `is_globlist()` function.
+>>> I think this is ready for people other than me to have a play. It is not well enough tested to commit just yet.
+>>> -- [[Will]]
+
+I've lost track of the indent level, so I'm going back to not indented - I think this is a working [[patch]] taking into
+account all comments above (which doesn't mean it is above reproach :) ). --[[Will]]
+
+> Very belated code review of last version of the patch:
+>
+> * `is_globlist` is no longer needed
+
+>> Good :)
+
+> * I don't understand why the pagespec match regexp is changed
+> from having flags `igx` to `ixgs`. Don't see why you
+> want `.` to match `\n` in it, and don't see any `.` in the regexp
+> anyway?
+
+>> Because you have to define all the named pagespecs in the pagespec, you sometimes end up with very long pagespecs. I found it useful to split them over multiple lines. That didn't work at one point and I added the 's' to make it work. I may have further altered the regex since then to make the 's' redundant. Remove it and see if multi-line pagespecs still work. :)
+
+>>> Well, I can tell you that multi-line pagespecs are supported w/o
+>>> your patch .. I use them all the time. The reason I find your
+>>> use of `/s` unlikely is because without it `\s` already matches
+>>> a newline. `/s` is typically only necessary if you want `.`
+>>> to match a newline as well. --[[Joey]]
+
+> * Some changes of `@_` to `%params` in `pagespec_makeperl` do not
+> make sense to me. I don't see where \%params is defined and populated,
+> except with `\$params{specFunc}`.
+
+>> I'm not a perl hacker. This was a mighty battle for me to get going.
+>> There is probably some battlefield carnage from my early struggles
+>> learning perl left here. Part of this is that @_ / @params already
+>> existed as a way of passing in extra parameters. I didn't want to
+>> pollute that top-level namespace - I just added my own parameter (a hash)
+>> which contained the data I needed.
+
+>>> I think I understand how the various `%params`
+>>> (there's not just one) work in your code now, but it's really a mess.
+>>> Explaining it in words would take pages.. It could be fixed by,
+>>> in `pagespec_makeperl` something like:
+>>>
+>>> my %specFuncs;
+>>> push @_, specFuncs => \%specFuncs;
+>>>
+>>> With that you have the hash locally available for populating
+>>> inside `pagespec_makeperl`, and when the `match_*` functions
+>>> are called the same hash data will be available inside their
+>>> `@_` or `%params`. No need to change how the functions are called
+>>> or do any of the other hacks.
+>>>
+>>> Currently, specFuncs is populated by building up code
+>>> that recursively calls `pagespec_makeperl`, and is then
+>>> evaluated when the pagespec gets evaluated. My suggested
+>>> change to `%params` will break that, but that had to change
+>>> anyway.
+>>>
+>>> It probably has a security hole, and is certainly inviting
+>>> one, since the pagespec definition is matched by a loose regexp (`.*`)
+>>> and then subject to string interpolation before being evaluated
+>>> inside perl code. I recently changed ikiwiki to never interpolate
+>>> user-supplied strings when translating pagespecs, and that
+>>> needs to happen here too. The obvious way, it seems to me,
+>>> is to not generate perl code, but just directly run perl code that
+>>> populates specFuncs.
+
+>>>> I don't think this is as bad as you make out, but your addition of the
+>>>> data array will break with the recursion my patch adds in pagespec_makeperl.
+>>>> To fix that I'll need to pass a reference to that array into pagespec_makeperl.
+>>>> I think I can then do the same thing to $params{specFuncs}. -- [[Will]]
+
+>>>>> You're right -- I did not think the recursive case through.
+>>>>> --[[Joey]]
+
+> * Seems that the only reason `match_glob` has to check for `~` is
+> because when a named spec appears in a pagespec, it is translated
+> to `match_glob("~foo")`. If, instead, `pagespec_makeperl` checked
+> for named specs, it could convert them into `check_named_spec("foo")`
+> and avoid that ugliness.
+
+>> Yeah - I wanted to make named specs syntactically different on my first pass. You are right in that this could be made a fallback - named specs always override pagenames.
+
+> * The changes to `match_link` seem either unnecessary, or incomplete.
+> Shouldn't it check for named specs and call
+> `check_named_spec_existential`?
+
+>> An earlier version did. Then I realised it wasn't actually needed in that case - match_link() already included a loop that was like a type of existential matching. Each time through the loop it would
+>> call match_glob(). match_glob() in turn will handle the named spec. I tested this version briefly and it seemed to work. I remember looking at this again later and wondering if I had misunderstood
+>> some of the logic in match_link(), which might mean there are cases where you would need an explicit call to check_named_spec_existential() - I never checked it properly after having that thought.
+
+>>> In the common case, `match_link` does not call `match_glob`,
+>>> because the link target it is being asked to check for is a single
+>>> page name, not a glob.
+
+>>>> A named pagespec should fall into the glob case. These two pagespecs should be the same:
+
+ link(a*)
+
+>>>> and
+
+ define(aStar, a*) and link(~aStar)
+
+>>>> In the first case, we want the pagespec to match any page that links to a page matching the glob.
+>>>> In the second case, we want the pagespec to match any page that links to a page matching the named spec.
+>>>> match_link() was already doing the existential part. The patches to this code were simply to remove the `lc()`
+>>>> call from the named pagespec name. Can that `lc` be removed entirely? -- [[Will]]
+
+>>>>> I think we could get rid of it. `bestlink` will lc it itself
+>>>>> if the uppercase version does not exist; `match_glob` matches
+>>>>> insensitively.
+>>>>> --[[Joey]]
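To make the existential semantics concrete: a page matches `link(~aStar)` if any page it links to satisfies the named spec. A minimal Python sketch of that idea (the function name and data shapes here are illustrative, not ikiwiki's API):

```python
def match_link_existential(page, spec_name, links, spec_funcs):
    # Existential check: page matches if it links to ANY page that
    # satisfies the named spec (a sketch of check_named_spec_existential).
    sub = spec_funcs.pop(spec_name)   # recursion guard, as in the patch
    try:
        return any(sub(target) for target in links.get(page, ()))
    finally:
        spec_funcs[spec_name] = sub   # restore the spec afterwards

links = {'index': ['a1', 'b2'], 'other': ['b2']}
specs = {'~aStar': lambda p: p.startswith('a')}  # define(~aStar, a*)

print(match_link_existential('index', '~aStar', links, specs))  # True
print(match_link_existential('other', '~aStar', links, specs))  # False
```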
+
+> * Generally, the need to modify `match_*` functions so that they
+> check for and handle named pagespecs seems suboptimal, if
+> only because there might be others people may want to use named
+> pagespecs with. It would be possible to move this check
+> to `pagespec_makeperl`, by having it check if the parameter
+> passed to a pagespec function looked like a named pagespec.
+> The only issue is that some pagespec functions take a parameter
+> that is not a page name at all, and it could be weird
+> if such a parameter were accidentally interpreted as a named
+> pagespec. (But, that seems unlikely to happen.)
+
+>> Possibly. I'm not sure which I prefer between the current solution and that one. Each have advantages and disadvantages.
+>> It really isn't much code for the match functions to add a call to check_named_spec_existential().
+
+>>> But if a plugin adds its own match function, it has
+>>> to explicitly call that code to support named pagespecs.
+
+>>>> Yes, and it can do that in just three lines of code. But if we automatically check for named pagespecs all the time we
+>>>> potentially break any matching function that doesn't accept pages, or wants to use multiple arguments.
+
+>>>>> 3 lines of code, plus the functions called become part of the API,
+>>>>> don't forget about that..
+>>>>>
+>>>>> Yes, I think that is the tradeoff; the question is whether to export
+>>>>> the additional complexity needed for that flexibility.
+>>>>>
+>>>>> I'd be surprised if multiple-argument pagespecs become necessary..
+>>>>> with the exception of this patch there has been no need for them yet.
+>>>>>
+>>>>> There are lots of pagespecs that take data other than pages,
+>>>>> indeed, that's really the common case. So far, none of them
+>>>>> seem likely to take data that starts with a `~`. Perhaps
+>>>>> the thing to do would be to check if `~foo` is a known,
+>>>>> named pagespec, and if not, just pass it through unchanged.
+>>>>> Then there's little room for ambiguity, and this also allows
+>>>>> pagespecs like `glob(~foo*)` to match the literal page `~foo`.
+>>>>> (It will make pagespec_merge even harder tho.. see below.)
+>>>>> --[[Joey]]
+
+>>>>>> I've already used multi-argument pagespec match functions in
+>>>>>> my data plugin. It is used for having different types of links. If
+>>>>>> you want to have multiple types of links, then the match function
+>>>>>> for them needs to take both the link name and the link type.
+>>>>>> I'm trying to think of a way we could have both - automatically
+>>>>>> handle the existential case unless the function indicates somehow
+>>>>>> that it'll do it itself. Any ideas? -- [[Will]]
+
+> * I need to check if your trick to avoid infinite recursion
+> works if there are two named specs that recursively
+> call one another. I suspect it does, but will test this
+> myself..
+
+>> It worked for me. :)
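For the record, the guard works because the named spec is removed from the table for the duration of the recursive call, so a mutually recursive pair bottoms out instead of looping. A minimal Python sketch of the same trick (function names and data shapes are illustrative, not ikiwiki's API):

```python
def check_named_spec(name, spec_funcs, page):
    # Evaluate a named spec, guarding against infinite recursion by
    # temporarily removing it from the table while it runs.
    if name not in spec_funcs:
        return False          # unknown spec -> an ErrorReason in the patch
    sub = spec_funcs.pop(name)          # remove before recursing
    try:
        return sub(page, spec_funcs)
    finally:
        spec_funcs[name] = sub          # restore afterwards

# Two mutually recursive specs, each referring to the other:
specs = {}
specs['~a'] = lambda page, s: check_named_spec('~b', s, page) or page == 'x'
specs['~b'] = lambda page, s: check_named_spec('~a', s, page) or page == 'y'

print(check_named_spec('~a', specs, 'y'))  # True: ~a -> ~b matches 'y'
print(check_named_spec('~a', specs, 'z'))  # False: the recursion bottoms out
```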
+
+> * I also need to verify if memoizing the named pagespecs has
+> really guarded against very expensive pagespecs DOSing the wiki..
+
+> --[[Joey]]
+
+>> There is one issue that I've been thinking about that I haven't raised anywhere (or checked myself), and that is how this all interacts with page dependencies.
+>>
+>>> I've moved the discussion of that to [[dependency_types]]. --[[Joey]]
+>>
+>> I'm not sure anymore that the `pagespec_merge` function will continue to work in all cases.
+
+>>> The problem I can see there is that if two pagespecs
+>>> get merged and both use `~foo` but define it differently,
+>>> then the second definition might be used at a point when
+>>> it shouldn't (but I haven't verified that really happens).
+>>> That could certainly be a show-stopper. --[[Joey]]
+
+>>>> I think this can happen in the new closure based code. I don't think this could happen in the old code. -- [[Will]]
+
+>>>> Even if that works, this is a good argument for having a syntactic difference between named pagespecs and normal pages.
+>>>> If you're joining two pagespecs with 'or', you don't want a named pagespec in the first part overriding a page name in the
+>>>> second part. Oh, and I assume 'or' has the right operator precedence that "a and b or c" is "(a and b) or c", and not "a and (b or c)" -- [[Will]]
+
+>>>>> Looks like it's bracketed in the code anyway... -- [[Will]]
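The precedence claim is easy to sanity-check; Python's `and`/`or` group the same way as the bracketing Will found in the pagespec code:

```python
# 'and' binds tighter than 'or', so "a and b or c" is "(a and b) or c":
a, b, c = False, True, True
assert (a and b or c) == ((a and b) or c)   # both evaluate to True
assert (a and b or c) != (a and (b or c))   # the other grouping gives False
```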
+
+>>>> Perhaps the thing to do is to have a `clear_defines()`
+>>>> function, then merging `A` and `B` yields `(A) or (clear_defines() and (B))`
+>>>> That would deal with both the cases where `A` and `B` differently
+>>>> define `~foo` as well as with the case where `A` defines `~foo` while
+>>>> `B` uses it to refer to a literal page.
+>>>> --[[Joey]]
+
+>>>>> I don't think this will work with the new patch, and I don't think it was needed with the old one.
+>>>>> Under the old patch, pagespec_makeperl() generated a string of unevaluated, self-contained, perl
+>>>>> code. When a new named pagespec was defined, a recursive call was made to get the perl code
+>>>>> for the pagespec, and then that code was used to add something like `$params{specFuncs}->{name} = sub {recursive code} and `
+>>>>> to the result of the calling function. This means that at pagespec testing time, when this code is executed, the
+>>>>> specFuncs hash is built up as the pagespec is checked. In the case of the 'or' used above, later redefinitions of
+>>>>> a named pagespec would have redefined the specFunc at the right time. It should have just worked. However...
+
+>>>>> Since my original patch, you started using closures for security reasons (and I can see the case for that). Unfortunately this
+>>>>> means that the generated perl code is no longer self-contained - it needs to be evaluated in the same closure it was generated
+>>>>> so that it has access to the data array. To make this work with the recursive call I had two options: a) make the data array a
+>>>>> reference that I pass around through the pagespec_makeperl() functions and have available when the code is finally evaluated
+>>>>> in pagespec_translate(), or b) make sure that each pagespec is evaluated in its correct closure and a perl function is returned, not a
+>>>>> string containing unevaluated perl code.
+
+>>>>> I went with option b). I did it in such a way that the hash of specfuncs is built up at translation time, not at execution time. This
+>>>>> means that with the new code you can call specfuncs that get defined out of order:
+
+ ~test and define(~test, blah)
+
+>>>>> but it also means that using a simple 'or' to join two pagespecs won't work. If you do something like this:
+
+ ~test and define(~test, foo) and define(~test, baz)
+
+>>>>> then the last definition (baz) takes precedence.
+>>>>> In the process of writing this I think I've come up with a way to change this back the way it was, still using closures. -- [[Will]]
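A toy model of why the last definition wins when defines are collected at translation time (purely illustrative, not ikiwiki's parser):

```python
def translate(spec_terms):
    # Defines are collected into spec_funcs during translation, so a
    # later define(~test, ...) overwrites an earlier one, regardless of
    # where the uses appear in the pagespec.
    spec_funcs = {}
    uses = []
    for term in spec_terms:
        if term[0] == 'define':
            _, name, func = term
            spec_funcs[name] = func
        else:
            uses.append(term[1])
    return lambda page: all(spec_funcs[n](page) for n in uses)

# ~test and define(~test, foo) and define(~test, baz)
matcher = translate([
    ('use', '~test'),
    ('define', '~test', lambda p: p == 'foo'),
    ('define', '~test', lambda p: p == 'baz'),
])
print(matcher('baz'), matcher('foo'))  # True False  (the last define wins)
```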
+
+>>> My [[remove-pagespec-merge|should_optimise_pagespecs]] branch has now
+>>> solved all this by deleting the offending function :-) --[[smcv]]
+
+
+Patch updated to use closures rather than inline generated code for named pagespecs. Also includes some new use of ErrorReason where appropriate. -- [[Will]]
+
+> * Perl really doesn't need forward declarations, honest!
+
+>> It complained (warning, not error) when I didn't use the forward declaration. :(
+
+> * I have doubts about memoizing the anonymous sub created by
+> `pagespec_translate`.
+
+>> This is there explicitly to make sure that runtime is polynomial and not exponential.
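The idea, sketched in Python with `lru_cache` standing in for the patch's use of `Memoize` (the predicate itself is hypothetical):

```python
from functools import lru_cache

calls = 0

@lru_cache(maxsize=None)   # plays the role of memoize() in the patch
def spec(page):
    # A potentially expensive pagespec predicate.
    global calls
    calls += 1
    return page.startswith('bugs/')

# A named spec referenced many times is only evaluated once per page,
# keeping evaluation polynomial rather than exponential:
results = [spec(p) for p in ['bugs/a', 'bugs/a', 'index', 'bugs/a']]
print(results, calls)  # [True, True, False, True] 2
```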
+
+> * Think where you wrote `+{}` you can just write `{}`
+
+>> Possibly :) -- [[Will]]
+
+----
+
+ diff --git a/IkiWiki.pm b/IkiWiki.pm
+ index 061a1c6..1e78a63 100644
+ --- a/IkiWiki.pm
+ +++ b/IkiWiki.pm
+ @@ -1774,8 +1774,12 @@ sub pagespec_merge ($$) {
+ return "($a) or ($b)";
+ }
+
+ -sub pagespec_translate ($) {
+ +# is perl really so dumb it requires a forward declaration for recursive calls?
+ +sub pagespec_translate ($$);
+ +
+ +sub pagespec_translate ($$) {
+ my $spec=shift;
+ + my $specFuncsRef=shift;
+
+ # Convert spec to perl code.
+ my $code="";
+ @@ -1789,7 +1793,9 @@ sub pagespec_translate ($) {
+ |
+ \) # )
+ |
+ - \w+\([^\)]*\) # command(params)
+ + define\(\s*~\w+\s*,((\([^()]*\)) | ([^()]+))+\) # define(~specName, spec) - spec can contain parens 1 deep
+ + |
+ + \w+\([^()]*\) # command(params) - params cannot contain parens
+ |
+ [^\s()]+ # any other text
+ )
+ @@ -1805,10 +1811,19 @@ sub pagespec_translate ($) {
+ elsif ($word eq "(" || $word eq ")" || $word eq "!") {
+ $code.=' '.$word;
+ }
+ - elsif ($word =~ /^(\w+)\((.*)\)$/) {
+ + elsif ($word =~ /^define\(\s*(~\w+)\s*,(.*)\)$/s) {
+ + my $name = $1;
+ + my $subSpec = $2;
+ + my $newSpecFunc = pagespec_translate($subSpec, $specFuncsRef);
+ + return if $@ || ! defined $newSpecFunc;
+ + $specFuncsRef->{$name} = $newSpecFunc;
+ + push @data, qq{Created named pagespec "$name"};
+ + $code.="IkiWiki::SuccessReason->new(\$data[$#data])";
+ + }
+ + elsif ($word =~ /^(\w+)\((.*)\)$/s) {
+ if (exists $IkiWiki::PageSpec::{"match_$1"}) {
+ push @data, $2;
+ - $code.="IkiWiki::PageSpec::match_$1(\$page, \$data[$#data], \@_)";
+ + $code.="IkiWiki::PageSpec::match_$1(\$page, \$data[$#data], \@_, specFuncs => \$specFuncsRef)";
+ }
+ else {
+ push @data, qq{unknown function in pagespec "$word"};
+ @@ -1817,7 +1832,7 @@ sub pagespec_translate ($) {
+ }
+ else {
+ push @data, $word;
+ - $code.=" IkiWiki::PageSpec::match_glob(\$page, \$data[$#data], \@_)";
+ + $code.=" IkiWiki::PageSpec::match_glob(\$page, \$data[$#data], \@_, specFuncs => \$specFuncsRef)";
+ }
+ }
+
+ @@ -1826,7 +1841,7 @@ sub pagespec_translate ($) {
+ }
+
+ no warnings;
+ - return eval 'sub { my $page=shift; '.$code.' }';
+ + return eval 'memoize (sub { my $page=shift; '.$code.' })';
+ }
+
+ sub pagespec_match ($$;@) {
+ @@ -1839,7 +1854,7 @@ sub pagespec_match ($$;@) {
+ unshift @params, 'location';
+ }
+
+ - my $sub=pagespec_translate($spec);
+ + my $sub=pagespec_translate($spec, +{});
+ return IkiWiki::ErrorReason->new("syntax error in pagespec \"$spec\"")
+ if $@ || ! defined $sub;
+ return $sub->($page, @params);
+ @@ -1850,7 +1865,7 @@ sub pagespec_match_list ($$;@) {
+ my $spec=shift;
+ my @params=@_;
+
+ - my $sub=pagespec_translate($spec);
+ + my $sub=pagespec_translate($spec, +{});
+ error "syntax error in pagespec \"$spec\""
+ if $@ || ! defined $sub;
+
+ @@ -1872,7 +1887,7 @@ sub pagespec_match_list ($$;@) {
+ sub pagespec_valid ($) {
+ my $spec=shift;
+
+ - my $sub=pagespec_translate($spec);
+ + my $sub=pagespec_translate($spec, +{});
+ return ! $@;
+ }
+
+ @@ -1919,6 +1934,68 @@ sub new {
+
+ package IkiWiki::PageSpec;
+
+ +sub check_named_spec($$;@) {
+ + my $page=shift;
+ + my $specName=shift;
+ + my %params=@_;
+ +
+ + return IkiWiki::ErrorReason->new("Unable to find specFuncs in params to check_named_spec()!")
+ + unless exists $params{specFuncs};
+ +
+ + my $specFuncsRef=$params{specFuncs};
+ +
+ + return IkiWiki::ErrorReason->new("Named page spec '$specName' is not valid")
+ + unless (substr($specName, 0, 1) eq '~');
+ +
+ + if (exists $specFuncsRef->{$specName}) {
+ + # remove the named spec from the spec refs
+ + # when we recurse to avoid infinite recursion
+ + my $sub = $specFuncsRef->{$specName};
+ + delete $specFuncsRef->{$specName};
+ + my $result = $sub->($page, %params);
+ + $specFuncsRef->{$specName} = $sub;
+ + return $result;
+ + } else {
+ + return IkiWiki::ErrorReason->new("Page spec '$specName' does not exist");
+ + }
+ +}
+ +
+ +sub check_named_spec_existential($$$;@) {
+ + my $page=shift;
+ + my $specName=shift;
+ + my $funcref=shift;
+ + my %params=@_;
+ +
+ + return IkiWiki::ErrorReason->new("Unable to find specFuncs in params to check_named_spec_existential()!")
+ + unless exists $params{specFuncs};
+ + my $specFuncsRef=$params{specFuncs};
+ +
+ + return IkiWiki::ErrorReason->new("Named page spec '$specName' is not valid")
+ + unless (substr($specName, 0, 1) eq '~');
+ +
+ + if (exists $specFuncsRef->{$specName}) {
+ + # remove the named spec from the spec refs
+ + # when we recurse to avoid infinite recursion
+ + my $sub = $specFuncsRef->{$specName};
+ + delete $specFuncsRef->{$specName};
+ +
+ + foreach my $nextpage (keys %IkiWiki::pagesources) {
+ + if ($sub->($nextpage, %params)) {
+ + my $tempResult = $funcref->($page, $nextpage, %params);
+ + if ($tempResult) {
+ + $specFuncsRef->{$specName} = $sub;
+ + return IkiWiki::SuccessReason->new("Existential check of '$specName' matches because $tempResult");
+ + }
+ + }
+ + }
+ +
+ + $specFuncsRef->{$specName} = $sub;
+ + return IkiWiki::FailReason->new("No page in spec '$specName' was successfully matched");
+ + } else {
+ + return IkiWiki::ErrorReason->new("Named page spec '$specName' does not exist");
+ + }
+ +}
+ +
+ sub derel ($$) {
+ my $path=shift;
+ my $from=shift;
+ @@ -1937,6 +2014,10 @@ sub match_glob ($$;@) {
+ my $glob=shift;
+ my %params=@_;
+
+ + if (substr($glob, 0, 1) eq '~') {
+ + return check_named_spec($page, $glob, %params);
+ + }
+ +
+ $glob=derel($glob, $params{location});
+
+ my $regexp=IkiWiki::glob2re($glob);
+ @@ -1959,8 +2040,9 @@ sub match_internal ($$;@) {
+
+ sub match_link ($$;@) {
+ my $page=shift;
+ - my $link=lc(shift);
+ + my $fullLink=shift;
+ my %params=@_;
+ + my $link=lc($fullLink);
+
+ $link=derel($link, $params{location});
+ my $from=exists $params{location} ? $params{location} : '';
+ @@ -1975,25 +2057,37 @@ sub match_link ($$;@) {
+ }
+ else {
+ return IkiWiki::SuccessReason->new("$page links to page $p matching $link")
+ - if match_glob($p, $link, %params);
+ + if match_glob($p, $fullLink, %params);
+ $p=~s/^\///;
+ $link=~s/^\///;
+ return IkiWiki::SuccessReason->new("$page links to page $p matching $link")
+ - if match_glob($p, $link, %params);
+ + if match_glob($p, $fullLink, %params);
+ }
+ }
+ return IkiWiki::FailReason->new("$page does not link to $link");
+ }
+
+ sub match_backlink ($$;@) {
+ - return match_link($_[1], $_[0], @_);
+ + my $page=shift;
+ + my $backlink=shift;
+ + my @params=@_;
+ +
+ + if (substr($backlink, 0, 1) eq '~') {
+ + return check_named_spec_existential($page, $backlink, \&match_backlink, @params);
+ + }
+ +
+ + return match_link($backlink, $page, @params);
+ }
+
+ sub match_created_before ($$;@) {
+ my $page=shift;
+ my $testpage=shift;
+ my %params=@_;
+ -
+ +
+ + if (substr($testpage, 0, 1) eq '~') {
+ + return check_named_spec_existential($page, $testpage, \&match_created_before, %params);
+ + }
+ +
+ $testpage=derel($testpage, $params{location});
+
+ if (exists $IkiWiki::pagectime{$testpage}) {
+ @@ -2014,6 +2108,10 @@ sub match_created_after ($$;@) {
+ my $testpage=shift;
+ my %params=@_;
+
+ + if (substr($testpage, 0, 1) eq '~') {
+ + return check_named_spec_existential($page, $testpage, \&match_created_after, %params);
+ + }
+ +
+ $testpage=derel($testpage, $params{location});
+
+ if (exists $IkiWiki::pagectime{$testpage}) {
diff --git a/doc/todo/transient_pages.mdwn b/doc/todo/transient_pages.mdwn
new file mode 100644
index 000000000..fe2259b40
--- /dev/null
+++ b/doc/todo/transient_pages.mdwn
@@ -0,0 +1,318 @@
+On [[todo/auto-create_tag_pages_according_to_a_template]], [[chrysn]]
+suggests:
+
+> Instead of creating a file that gets checked in into the RCS, the
+> source files could be left out and the output files be written as
+> long as there is no physical source file (think of a virtual underlay).
+> Something similar would be required to implement alias directive,
+> which couldn't be easily done by writing to the RCS as the page's
+> contents can change depending on which other pages claim it as an alias.
+
+`add_autofile` could be adapted to do this, or a similar API could be
+added.
+
+This would also be useful for autoindex, as suggested on
+[[plugins/autoindex/discussion]] and [[!debbug 544322]]. I'd also like
+to use it for [[plugins/contrib/album]].
+
+It could also be used for an [[todo/alias_directive]].
+
+--[[smcv]]
+
+> All [[merged|done]] --[[Joey]]
+
+--------------------------
+
+[[!template id=gitbranch branch=smcv/ready/transient author="[[smcv]]"]]
+[[!tag patch]]
+
+Related branches:
+
+* `ready/tag-test`: an extra regression test for tags
+ > merged --[[Joey]]
+* either `transient-relative` or `transient-relative-api`: avoid using `Cwd`
+ on initialization
+ > merged the latter --[[Joey]]
+* `ready/transient-aggregate`: use for aggregate
+ > merged --[[Joey]]
+* `ready/transient-autoindex`: optionally use for autoindex,
+ which is [[!debbug 544322]] (includes autoindex-autofile from
+ [[todo/autoindex should use add__95__autofile]])
+ > merged. I do note that this interacts badly with ikiwiki-hosting's
+ > backup/restore/branch handling, since that does not back up the
+ > transientdir by default, and so autoindex will not recreate the
+ > "deleted" pages. I'll probably have to make it back up the transientdir
+ > too. --[[Joey]]
+* `ready/transient-recentchanges`: use for recentchanges
+ > merged --[[Joey]]
+* `ready/transient-tag`: optionally use for tag (includes tag-test)
+ > merged --[[Joey]]
+
+I think this branch is now enough to be useful. It adds the following:
+
+If the `transient` plugin is loaded, `$srcdir/.ikiwiki/transient` is added
+as an underlay. I'm not sure whether this should be a plugin or core, so
+I erred on the side of more plugins; I think it's "on the edge of the core",
+like goto.
+
+Pages in the transient underlay are automatically
+deleted if a page of the same name is created in the srcdir (or an underlay
+closer to the srcdir in stacking order).
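The shadowing rule can be pictured as a simple lookup over a stack of directories; a hedged Python sketch (function and file names are illustrative, not ikiwiki's API):

```python
import os
import tempfile

def resolve(page, srcdir, underlays):
    # Underlay stacking: the srcdir wins, then the underlays in order.
    # A transient copy is shadowed (and can be deleted) once the page
    # exists closer to the srcdir.
    for d in [srcdir] + underlays:
        path = os.path.join(d, page)
        if os.path.exists(path):
            return path
    return None

srcdir = tempfile.mkdtemp()
transient = tempfile.mkdtemp()
with open(os.path.join(transient, 'tag.mdwn'), 'w') as f:
    f.write('autocreated')
assert resolve('tag.mdwn', srcdir, [transient]).startswith(transient)

# Creating the page in the srcdir shadows the transient copy:
with open(os.path.join(srcdir, 'tag.mdwn'), 'w') as f:
    f.write('edited')
assert resolve('tag.mdwn', srcdir, [transient]).startswith(srcdir)
```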
+
+With the additional `ready/transient-tag` branch,
+`tag` enables `transient`, and if `tag_autocreate_commit` is set to 0
+(default 1), autocreated tags are written to the transient underlay.
+There is a regression test.
+
+With the additional `transient-autoindex` branch,
+`autoindex` uses autofiles. It also enables `transient`, and if
+`autoindex_commit` is set to 0 (default 1), autoindexes are written to
+the transient underlay. There is a regression test. However, this branch
+is blocked by working out what the desired behaviour is, on
+[[todo/autoindex_should_use_add__95__autofile]].
+
+> I wonder why this needs to be configurable? I suppose that gets back to
+> whether it makes sense to check these files in or not. The benefits of
+> checking them in:
+>
+> * You can edit them from the VCS, don't have to go into the web
+> interface. Of course, files from the underlays have a similar issue,
+> but does it make sense to make that wart larger?
+> * You can know you can build the same site with nothing missing
+> even if you don't enable autoindex or whatever there. (Edge case.)
+
+>> I'm not sure that that's a huge wart; you can always "edit by
+>> overwriting". If you're running a local clone of the wiki on your laptop
+>> or whatever, you have the underlays already, and can copy from there.
+>> Tag and autoindex pages have rather simple source code anyway. --s
+
+> The benefit of using transient pages seems to just be avoiding commit
+> clutter? For files that are never committed, transient pages are a clear
+> win, but I wonder if adding configuration clutter just to avoid some
+> commit clutter is really worth it.
+
+>> According to the last section of
+>> [[todo/auto-create_tag_pages_according_to_a_template]], [[chrysn]] and
+>> Eric both feel rather strongly that it should be possible to
+>> not commit any tags; in [[plugins/autoindex/discussion]],
+>> lollipopman and [[JoeRayhawk]] both requested the same for autoindex.
+>> I made it configurable because, as you point out,
+>> there are also reasons why it makes sense to check these
+>> automatically-created files in. I'm neutral on this, personally.
+>>
+>> If this is a point of contention, would you accept a branch that
+>> just adds `transient` and uses it for [[plugins/recentchanges]],
+>> which aren't checked in and never have been? I've split the
+>> branch up in the hope that *some* of it can get merged.
+>>
+>>> I will be happy to merge transient-recentchanges when it's ready.
+>>> I see no obstacle to merging transient-tag either, and am not
+>>> really against using it for autoindex or aggregate either
+>>> once they get completed.
+>>> I just wanted to think through why configurability is needed.
+>>> --[[Joey]]
+>>
+>> One potentially relevant point is that configuration clutter only
+>> affects the site admin whereas commit clutter is part of the whole
+>> wiki's history. --[[smcv]]
+
+> Anyway, the configurability
+> appears subtly broken; the default is only 1 if a new setup file is
+> generated. (Correction: It was not even the default then --[[Joey]])
+> With an existing setup file, the 'default' values in
+> `getsetup` don't take effect, so it will default to undef, which
+> is treated the same as 0. --[[Joey]]
+
+>> Fixed in the branches, hopefully. (How disruptive would it be to have
+>> defaults take effect whenever the setup file doesn't set a value, btw?
+>> It seems pretty astonishing to have them work as they do at the moment.) --s
+
+>>> Well, note that default is not actually a documented field in
+>>> getsetup hooks at all! (It is used in IkiWiki.pm's own `getsetup()`, and
+>>> the concept may have leaked out into one or two plugins (comments,
+>>> transient)).
+>>>
+>>> Running getsetup at plugin load time is something I have considered
+>>> doing. It would simplify some checkconfig hooks that just set hardcoded
+>>> defaults. Although since dying is part of the getsetup hook's API, it
+>>> could be problematic.
+>>> --[[Joey]]
+
+autoindex ignores pages in the transient underlay when deciding whether
+to generate an index.
+
+With the additional `ready/transient-recentchanges` branch, new recent
+changes go in the transient underlay; I tested this manually.
+
+Not done yet (in that branch, at least):
+
+* `remove` can't remove transient pages: this turns out to be harder than
+ I'd hoped, because I don't want to introduce a vulnerability in the
+ non-regular-file detection, so I'd rather defer that.
+
+ > Hmm, I'd at least want that to be dealt with before this was used
+ > by default for autoindex or tag. --[[Joey]]
+
+ >> I'll try to work out which of the checks are required for security
+ >> and which are just nice-to-have, but I'd appreciate any pointers
+ >> you could give. Note that my branch wasn't meant to enable either
+ >> by default, and now hopefully doesn't. --[[smcv]]
+
+ >>> Opened a new bug for this, [[bugs/removal_of_transient_pages]]
+ >>> --[[Joey]]
+
+* Transient tags that don't match any pages aren't deleted: I'm not sure
+ that that's a good idea anyway, though. Similarly, transient autoindexes
+ of directories that become empty aren't deleted.
+
+ > Doesn't seem necessary, or really desirable to do that. --[[Joey]]
+
+ >> Good, that was my inclination too. --s
+
+* In my `untested/transient` branch, new aggregated files go in the
+ transient underlay too (they'll naturally migrate over time). I haven't
+ tested this yet, it's just a proof-of-concept.
+
+ > Now renamed to `ready/transient-aggregate`; it does seem to work fine.
+ > --s
+
+> I can confirm that the behavior of autoindex, at least, is excellent.
+> Haven't tried tag. Joey, can you merge transient and autoindex? --JoeRayhawk
+
+>> Here are some other things I'd like to think about first: --[[Joey]]
+>>
+>> * There's a FIXME in autoindex.
+>>
+>> > Right, the extra logic for preventing autoindex pages from being
+>> > re-created. This is taking a while, so I'm going to leave out the
+>> > autoindex part for the moment. The FIXME is only relevant
+>> > because I tried to solve
+>> > [[todo/autoindex should use add__95__autofile]] first, but
+>> > strictly speaking, that's an orthogonal change. --s
+
+>> * Suggest making recentchanges unlink the transient page
+>> first, and only unlink from the old location if it wasn't
+>> in the transient location. Ok, it only saves 1 syscall :)
+>>
+>> > Is an unlink() really that expensive? But, OK, fixed in the
+>> > `ready/transient-recentchanges` branch. --s
+
+>> >> It's not, but it's easy. :) --[[Joey]]
+
+>> * Similarly it's a bit worrying for performance that it
+>> needs to pull in and use `Cwd` on every ikiwiki startup now.
+>> I really don't see the need; `wikistatedir` should
+>> mostly be absolute, and ikiwiki should not chdir in ways
+>> that break it anyway.
+>>
+>> > The reason to make it absolute is that relative underlays
+>> > are interpreted as relative to the base underlay directory,
+>> > not the cwd, by `add_underlay`.
+>> >
+>> > The updated `ready/transient-only` branch only loads `Cwd` if
+>> > the path is relative; an extra commit on branch
+>> > `smcv/transient-relative` goes behind `add_underlay`'s
+>> > back to allow use of a cwd-relative underlay. Which direction
+>> > would you prefer?
+>> >
+>> > I note in passing that [[plugins/autoindex]] and `IkiWiki::Render`
+>> > both need to use `Cwd` and `File::Find` on every refresh, so
+>> > there's only any point in avoiding `Cwd` for runs that don't
+>> > actually refresh, like simple uses of the CGI. --s
+
+>> >> Oh, right, I'd forgotten about the horrificness of File::Find
+>> >> that required a chdir for security. Ugh. Can we just avoid
+>> >> it for those simple cases then? (demand-calculate wikistatedir)
+>> >> --[[Joey]]
+
+>> >>> The reason that transientdir needs to be absolute is that it's
+>> >>> added as an underlay.
+>> >>>
+>> >>> We could avoid using `Cwd` by taking the extra commit from either
+>> >>> `smcv/transient-relative` or `smcv/transient-relative-api`;
+>> >>> your choice. I'd personally go for the latter.
+>> >>>
+>> >>> According to git grep, [[plugins/po]] already wants to look at
+>> >>> the underlaydirs in its checkconfig hook, so I don't think
+>> >>> delaying calculation of the underlaydir is viable. (I also noticed
+>> >>> a bug,
+>> >>> [[bugs/po:_might_not_add_translated_versions_of_all_underlays]].)
+>> >>>
+>> >>> `underlaydirs` certainly needs to have been calculated by the
+>> >>> time `refresh` hooks finish, so `find_src_files` can use it. --s
+
+>> * Unsure about the use of `default_pageext` in the `change`
+>> hook. Is everything in the transientdir really going
+>> to use that pageext? Would it be better to look up the
+>> complete source filename?
+>>
+>> > I've updated `ready/transient` to do a more thorough GC by
+>> > using File::Find on the transient directory. This does
+>> > require `File::Find` and `Cwd`, but only when pages change,
+>> > and `refresh` loads both of those in that situation anyway.
+>> >
+>> > At the moment everything in the transientdir will either
+>> > have the `default_pageext` or be internal, although I
+>> > did wonder whether to make [[plugins/contrib/album]]
+>> > viewer pages optionally be `html`, for better performance
+>> > when there's a very large number of photos. --s
+
+>> >> Oh, ugh, more File::Find... Couldn't it just assume that the
+>> >> transient page has the same extension as its replacement?
+>> >> --[[Joey]]
+
+>> >>> Good idea, that'll be true for web edits at least.
+>> >>> Commit added. --s
+
+--------------------------
+
+## An earlier version
+
+I had a look at implementing this. It turns out to be harder than I thought
+to have purely in-memory pages (several plugins want to be able to access the
+source file as a file), but I did get this proof-of-concept branch
+to write tag and autoindex pages into an underlay.
+
+This loses the ability to delete the auto-created pages (although they don't
+clutter up git this way, at least), and a lot of the code in autoindex is
+probably now redundant, so this is probably not quite ready for merge, but
+I'd welcome opinions.
+
+Usage: set `tag_underlay` and/or `autoindex_underlay` to an absolute path,
+which you must create beforehand. I suggest *srcdir* + `/.ikiwiki/transient`.
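
A minimal setup-file sketch of that (the paths here are hypothetical, just following the *srcdir* + `/.ikiwiki/transient` suggestion above):

```
tag_underlay => '/srv/wiki/src/.ikiwiki/transient',
autoindex_underlay => '/srv/wiki/src/.ikiwiki/transient',
```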
+
+Refinements that could be made if this approach seems reasonable:
+
+* make these options boolean, and have the path always be `.ikiwiki/transient`
+* improve the `remove` plugin so it also deletes from this special underlay
+
+>> Perhaps it should be something more generic, so that other plugins could use it (such as "album" mentioned above).
+>> The `.ikiwiki/transient` would suit this, but instead of saying "tag_underlay" or "autoindex_underlay" have "use_transient_underlay" or something like that?
+>> Or to make it more flexible, have just one option "transient_underlay" which is set to an absolute path, and if it is set, then one is using a transient-underlay.
+>> --[[KathrynAndersen]]
+
+>>> What I had in mind was more like `tag_autocreate_transient => 1` or
+>>> `autoindex_transient => 1`; you might conceivably want tags to be
+>>> checked in but autoindices to be transient, and it's fine for each
+>>> plugin to make its own decision. Going from that to one boolean
+>>> (or just always-transient if people don't think that's too
+>>> astonishing) would be trivial, though.
+>>>
+>>> I don't think relocating the transient underlay really makes sense,
+>>> except for prototyping: you only want one, and `.ikiwiki` is as good
+>>> a place as any (ikiwiki already needs to be able to write there).
+>>>
+>>> For [[plugins/contrib/album]] I think I'd just make the photo viewer
+>>> pages always-transient - you can always make a transient page
+>>> permanent by editing it, after all.
+>>>
+>>> Do you think this approach has enough potential that I should
+>>> continue to hack on it? Any thoughts on the implementation? --[[smcv]]
+
+>>>> Ah, now I understand what you're getting at. Yes, it makes sense to put transient pages under `.ikiwiki`.
+>>>> I haven't looked at the code, but I'd be interested in seeing whether it's generic enough to be used by other plugins (such as `album`) without too much fuss.
+>>>> The idea of a transient underlay gives us a desirable feature for free: that if someone edits the transient page, it is made permanent and added to the repository.
+>>>>
+>>>> I think the tricky thing with removing these transient underlay pages is the question of how to prevent whatever auto-generated the pages in the first place from generating them again - or, conversely, how to force whatever auto-generated those pages to regenerate them if you've changed your mind.
+>>>> I think you'd need something similar to `will_render` so that transient pages would be automatically removed if whatever auto-generated them is no longer around.
+>>>> -- [[KathrynAndersen]]
diff --git a/doc/todo/translation_links.mdwn b/doc/todo/translation_links.mdwn
new file mode 100644
index 000000000..63e8d1010
--- /dev/null
+++ b/doc/todo/translation_links.mdwn
@@ -0,0 +1,46 @@
+This is an offshoot of [[this rant|translation/discussion/#index3h1]].
+
+Basically, while I can appreciate the [[plugins/po]] plugin for a more or less "static" site, or for more organised wikis, for certain wikis it's far too much overhead.
+
+## Stories
+
+The following stories should be answered by that plugin:
+
+ 1. a user browses the wiki, finds that the page is translated in another language and clicks on the language to see that page translated
+ 2. a user browses the wiki and gets automatically the right language
+ 3. an editor creates a wiki page, and it gets assigned a language
+ 4. a translator sees that page and translates it to another language, and that page is linked with the first one, both ways (so that stories 1 and 2 can work)
+ 5. (optional) a translator can see the list of pages needing translation and translate pages
+ 6. (optional) an editor changes a wiki page, the translated page is marked as "dirty" (ie. needing translation)
+
+## Fundamental constraints
+
+This issue is about creating a "wikipedia-like" translation structure where:
+
+ 1. there's no "master language"
+ 2. there's a loose connection between pages
+ 3. not all pages are necessarily translated, nor is it a goal
+
+Those are fundamental constraints that should be required by that plugin. That doesn't mean the plugin cannot be used otherwise, but those are all the constraints it needs to respect to fulfill the requirements here.
+
+## Optional constraints
+
+There can be more constraints that we may want to impose or not, which will make things more or less complicated:
+
+ 4. the page URLs need to be translatable - this would make [[!wikipedia Content_negotiation]] fail, so it would require the CGI for story 2. It would also make it harder to create the connection between pages, as metadata would be needed in each page
+ 5. the language must not be visible in the URL - same as #4
+ 6. translation system must also be usable from the commandline/git repository - #5 and #6 would be basically impossible to implement there
+
+## Basic spec
+
+ 1. a hook that looks for foo.la.mdwn pages, where la is a language code (defined where..?), and that lists available translations -
+ this is where most of the work needs to happen. We can probably reuse the builtin template stuff that got injected when the [[plugins/po]] plugin was imported, to start with
+ 2. instructions on how to setup [[!wikipedia Content_negotiation]] so that the above works out of the box - just documentation
+ 3. a button to create such translations - that would be through the [[pageactions hook|plugins/write/#index15h3]]
+ 4. a default language setting? - that's obviously the getsetup hook
+ 5. a set of language code settings? - same
+ 6. content-negotiation - the po module has good code for that
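
For item 2, a sketch of the kind of Apache configuration that could drive content negotiation (the directory path and language list are assumptions, not part of any existing ikiwiki documentation):

```
# Hypothetical fragment: let mod_negotiation pick foo.en.html or
# foo.fr.html based on the browser's Accept-Language header.
<Directory /srv/wiki/html>
    Options +MultiViews
    AddLanguage en .en
    AddLanguage fr .fr
    LanguagePriority en fr
    ForceLanguagePriority Fallback
</Directory>
```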
+
+## Authors
+
+ * [[anarcat]]
diff --git a/doc/todo/turn_edittemplate_verbosity_off_by_default.mdwn b/doc/todo/turn_edittemplate_verbosity_off_by_default.mdwn
new file mode 100644
index 000000000..14bb43782
--- /dev/null
+++ b/doc/todo/turn_edittemplate_verbosity_off_by_default.mdwn
@@ -0,0 +1,34 @@
+`edittemplate` replaces its directive with a note like "edittemplate person
+registered for people/*". It would be nice if this were dependent on
+a `verbose` parameter that defaults to off. I don't see the value in the
+note, and by disabling the output, I can keep template registration as
+close as possible to the action.
+
+I think this (untested) patch might just do the trick:
+
+ --- a/IkiWiki/Plugin/edittemplate.pm
+ +++ b/IkiWiki/Plugin/edittemplate.pm
+ @@ -46,8 +46,13 @@ sub preprocess (@) {
+
+ $pagestate{$params{page}}{edittemplate}{$params{match}}=$params{template};
+
+ - return sprintf(gettext("edittemplate %s registered for %s"),
+ - $params{template}, $params{match});
+ + if (yesno($params{verbose})) {
+ + return sprintf(gettext("edittemplate %s registered for %s"),
+ + $params{template}, $params{match});
+ + }
+ + else {
+ + return '';
+ + }
+ }
+
+ sub formbuilder (@) {
+
+--[[madduck]]
+
+[[!tags wishlist patch]]
+
+[[done]], though the patch I eventually applied uses "silent" as the
+parameter name. Sorry for forgetting about this patch until someone else
+implemented it too. --[[Joey]]
diff --git a/doc/todo/two-way_convert_of_wikis.mdwn b/doc/todo/two-way_convert_of_wikis.mdwn
new file mode 100644
index 000000000..170927431
--- /dev/null
+++ b/doc/todo/two-way_convert_of_wikis.mdwn
@@ -0,0 +1,18 @@
+[[!tag wishlist]]
+
+Ok, the vision is this: Some of you will know git-svn. I want something like
+git-svn, but for wikis. I want to be able to do the following:
+
+1. Convert a moinmoin (or whatever) wiki to a local ikiwiki on my laptop.
+2. Edit my local copy (offline).
+3. Preview the changes with my local ikiwiki installation + browser.
+4. Push the changes back to moinmoin (or whatever) wiki.
+
+I know, I know, ikiwiki wasn't designed for that, but it would be really cool
+and useful, and people ask for that kind of thing too.
+
+--[[David_Riebenbauer]]
+
+----
+
+I have worked on this a little, for Mediawiki. My script is able to incrementally import changes from mediawiki, but doesn't push them back in. One of the big issues I found in doing that is that most wikis have Captcha or anti-spam controls that will make automating those steps difficult. See [[tips/convert_mediawiki_to_ikiwiki]] for the script. -- [[anarcat]]
diff --git a/doc/todo/typography_plugin_configuration.mdwn b/doc/todo/typography_plugin_configuration.mdwn
new file mode 100644
index 000000000..dd162a084
--- /dev/null
+++ b/doc/todo/typography_plugin_configuration.mdwn
@@ -0,0 +1,6 @@
+The [[typography_plugin|plugins/typography]] could support configuration of
+which translations to make. [[!cpan Text::Typography]] supports fine-grained
+control of which translations to make, so [[plugins/typography]] just needs to
+expose this somehow. --[[JoshTriplett]]
+
+[[done]] --[[Joey]]
diff --git a/doc/todo/unaccent_url_instead_of_encoding.mdwn b/doc/todo/unaccent_url_instead_of_encoding.mdwn
new file mode 100644
index 000000000..e5ad34335
--- /dev/null
+++ b/doc/todo/unaccent_url_instead_of_encoding.mdwn
@@ -0,0 +1,24 @@
+If one puts localized chars in wikilinks ikiwiki will escape them.
+This works right from a technical point of view, but the URLs will become ugly.
+
+So I made a patch which unaccents chars: <http://users.itk.ppke.hu/~cstamas/code/ikiwiki/unaccentpagetitlenames/>
+This is a one-line change, but requires a bit of reordering in the code.
+
+--[[cstamas]]
+
+> This was previously requested in [[todo/more_customisable_titlepage_function]],
+> in which [[Joey]] said "I don't think that changing titlepage is a good idea,
+> there are compatibility problems".
+>
+> The problem is that altering titlepage changes the meaning of your wiki,
+> by resolving all wiki links to different page names. That means that:
+>
+> * unaccenting can't be automatic, it has to be a configuration option
+> (so you don't accidentally get different behaviour by installing
+> Text::Unaccent)
+> * upgrading Text::Unaccent becomes risky, as I doubt it guarantees to
+> have stable rules for how to transliterate into ASCII!
+>
+> --[[smcv]]
+
+[[!tag wishlist patch patch/core]]
diff --git a/doc/todo/underlay.mdwn b/doc/todo/underlay.mdwn
new file mode 100644
index 000000000..9bcfea62b
--- /dev/null
+++ b/doc/todo/underlay.mdwn
@@ -0,0 +1,13 @@
+Rather than copy the basewiki around everywhere, it should be configured to
+underlay the main srcdir, and pages be rendered from there if not in the
+srcdir. This would allow upgrades to add/edit pages in the basewiki.
+
+Implementation will be slightly tricky since currently ikiwiki is hardcoded
+in many places to look in srcdir for pages. Also, there are possible
+security attacks in the vein of providing a file ikiwiki would normally
+skip in the srcdir, and tricking it into processing this file instead of the
+one from the underlaydir. -- Fixed by scanning srcdir first, then
+underlaydir, and refusing to add any files from underlaydir if they also
+exist in the srcdir. However, see [[security]] for caveats.
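
The srcdir-first scan described above can be sketched like this (a hypothetical Python illustration of the logic, not ikiwiki's actual Perl implementation):

```python
import os

def find_src_files(srcdir, underlaydir):
    """Collect page files, preferring srcdir; refuse any underlay file
    that is shadowed by a file of the same relative name in srcdir."""
    files = {}
    # Scan srcdir first: these always win.
    for root, _dirs, names in os.walk(srcdir):
        for name in names:
            rel = os.path.relpath(os.path.join(root, name), srcdir)
            files[rel] = os.path.join(root, name)
    # Then the underlay: skip anything already provided by srcdir.
    for root, _dirs, names in os.walk(underlaydir):
        for name in names:
            rel = os.path.relpath(os.path.join(root, name), underlaydir)
            if rel not in files:
                files[rel] = os.path.join(root, name)
    return files
```

The point is the ordering: because srcdir is scanned first and underlay entries are only added when absent, a malicious file in the underlay can never displace a srcdir page.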
+
+[[todo/done]]
diff --git a/doc/todo/unified_temporary_file__47__directory_handling.mdwn b/doc/todo/unified_temporary_file__47__directory_handling.mdwn
new file mode 100644
index 000000000..ca63fbeea
--- /dev/null
+++ b/doc/todo/unified_temporary_file__47__directory_handling.mdwn
@@ -0,0 +1,19 @@
+Many plugins seem to create temporary files. Although it is not much code, it is duplicated, and a
+typical place for security bugs. Would it be worthwhile to have library functions for creating temporary files
+and directories? If nothing else, it would serve as documentation of the "official way".
+
+Off to cut and paste :-) --[[DavidBremner]]
+
+> Hmm, I see only three users of temp files in all ikiwiki:
+>
+> * hnb uses `File::Temp::mkstemp` to create two temp file handles.
+> * teximg uses `File::Temp::tempdir` to create a temporary directory.
+> * attachment retrieves a temp file from `CGI::tmpFileName`.
+>
+> These are three quite different uses of temp files, not subject to
+> unification. Using `File::Temp` (and avoiding the possibly insecure
+> `mktemp`, `tmpname`, and `tempnam` functions) is probably as unified as
+> can be managed. --[[Joey]]
+
+>> OK, fair enough. Somehow the code in teximg made me think it was
+>> all a bit complicated. But after I played with it a bit more (and used File::Temp)
+>> I tend to agree, there is no real problem there to fix.
+>> Feel free to mark [[done]]. --[[DavidBremner]]
diff --git a/doc/todo/untrusted_git_push_hooks.mdwn b/doc/todo/untrusted_git_push_hooks.mdwn
new file mode 100644
index 000000000..313078ce5
--- /dev/null
+++ b/doc/todo/untrusted_git_push_hooks.mdwn
@@ -0,0 +1,12 @@
+Re the canrename, canremove, and canedit hooks:
+
+Of the three, only canremove is currently checked during an untrusted
+git push (a normal git push is assumed to be from a trusted user and
+bypasses all checks).
+
+It would probably make sense to add the canedit hook to the checks done
+there. Calling the canrename hook is tricky, because after all, git does
+not record explicit file moves.
+
+The checkcontent hook is another hook not currently called there, that
+probably should be.
diff --git a/doc/todo/upgradehooks.mdwn b/doc/todo/upgradehooks.mdwn
new file mode 100644
index 000000000..47da73443
--- /dev/null
+++ b/doc/todo/upgradehooks.mdwn
@@ -0,0 +1,8 @@
+It's annoying to have to manually run --setup, especially for multiple
+blogs, on upgrade. If the deb is used, there could be a postinst hook to do
+this.
+
+Let there be an /etc/ikiwiki/wikis, which just lists setup files and the
+user who owns them. postinst loops through, su's, and runs --setup. Voila!
+
+[[todo/done]]
diff --git a/doc/todo/use_secure_cookies_for_ssl_logins.mdwn b/doc/todo/use_secure_cookies_for_ssl_logins.mdwn
new file mode 100644
index 000000000..194db2f36
--- /dev/null
+++ b/doc/todo/use_secure_cookies_for_ssl_logins.mdwn
@@ -0,0 +1,36 @@
+[[!template id=gitbranch branch=smcv/ready/sslcookie-auto author="[[smcv]]"]]
+[[!tag patch]]
+
+At the moment `sslcookie => 0` never creates secure cookies, so if you log in
+with SSL, your browser will send the session cookie even over plain HTTP.
+Meanwhile `sslcookie => 1` always creates secure cookies, so you can't
+usefully log in over plain http.
+
+This branch adds `sslcookie => 0, sslcookie_auto => 1` as an option; this
+uses the `HTTPS` environment variable, so if you log in over SSL you'll
+get a secure session cookie, but if you log in over HTTP, you won't.
+(The syntax for the setup file is pretty rubbish - any other suggestions?)
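
The decision the branch makes can be sketched like this (a Python illustration of the logic only, not the actual Perl patch; the accepted values of `HTTPS` are an assumption):

```python
import os

def cookie_secure_flag(sslcookie, sslcookie_auto=True):
    """Decide whether the session cookie should carry the Secure flag.
    Mirrors the behaviour described above: always secure when sslcookie
    is on; otherwise secure only for requests that arrived over SSL,
    as signalled by the HTTPS environment variable."""
    if sslcookie:
        return True
    if sslcookie_auto:
        return os.environ.get("HTTPS", "").lower() in ("on", "1")
    return False
```

So a login over https gets a cookie the browser will never send over plain http, while a plain-http login still works.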
+
+> Does this need to be a configurable option at all? The behavior could
+> just be changed in the sslcookie = 0 case. It seems sorta reasonable
+> that, once I've logged in via https, I need to re-login if I then
+> switch to http.
+
+>> Even better. I've amended the branch to have this behaviour, which
+>> turns it into a one-line patch. --[[smcv]]
+
+> And, if your change is made, the sslcookie option could probably itself
+> be dropped too -- at least I don't see a real use case for it if ikiwiki
+> is more paranoid about cookies by default.
+
+>> I haven't done that; it might make sense to do so, but I think it'd be
+>> better to leave it in as a safety-catch (or in case someone's
+>> using a webserver that doesn't put `$HTTPS` in the environment). --s
+
+> Might be best to fix
+> [[todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both]]
+> first, so that dual https/http sites can better be set up. --[[Joey]]
+
+>> Thanks for merging that! :-) --s
+
+[[merged|done]] --[[Joey]]
diff --git a/doc/todo/use_templates_for_the_img_plugin.mdwn b/doc/todo/use_templates_for_the_img_plugin.mdwn
new file mode 100644
index 000000000..1cee1b535
--- /dev/null
+++ b/doc/todo/use_templates_for_the_img_plugin.mdwn
@@ -0,0 +1,29 @@
+[[!template id=gitbranch branch=jmtd/img_use_template author="[[Jon]]"]]
+
+Not finished! :-)
+
+The patches in <http://github.com/jmtd/ikiwiki/tree/img_use_template> convert the `img.pm` plugin to use a template (by default, `img.tmpl`, varied using a `template=` parameter) rather than hard-code the generated HTML.
+
+I originally thought of this to solve the problem outlined in [[bugs/can't mix template vars inside directives]], before I realised I could wrap the `img` call in my pages with a template to achieve the same thing. I therefore sat on it.
+
+However, I since thought of another use for this, and so started implementing it. (note to self: explain this other use)
+
+----
+
+Ok, I have managed to achieve what I wanted with stock ikiwiki, so this branch might not have any more life left in it (but it has proven an interesting experiment to see how much logic could be moved from `img.pm` into a template relatively easily, although the template is not terribly legible).
+
+My ikiwiki page has a picture on the front page. I've changed that picture just once, but I would like to change it again from time to time. I also want to keep a "gallery", or at least a list, of previous pictures, and perhaps include text alongside each picture, but not on the front page.
+
+I've achieved this as follows:
+
+ * each index picture gets a page under "indexpics".
+ * the "indexpics" page has a raw inline to include them all[1]
+ * the front page has more-or-less the same inline, with show=1
+ * each index picture page has a [[plugins/conditional]]:
+ * if you are being included, show the resized picture only, and link the picture to the relevant indexpic page
+ * else, show the picture with the default link to a full-size image, and include explanatory text.
+ * most of the boilerplate is hidden inside a template
+
+It is not quite as I envisaged it: the explanatory text would probably make sense on the indexpics "gallery" page, but since that includes the page, the wrong trouser-leg of the conditional is used. But it works quite well. Introducing a new index picture involves creating an appropriate page under indexpics and the rest happens automatically.
+
+[1] lie #1: the pagespec is a lot more complex as it has to exclude raw image filetypes
diff --git a/doc/todo/usedirs__95__redir_proposed_additional_module.mdwn b/doc/todo/usedirs__95__redir_proposed_additional_module.mdwn
new file mode 100644
index 000000000..6e9f27af0
--- /dev/null
+++ b/doc/todo/usedirs__95__redir_proposed_additional_module.mdwn
@@ -0,0 +1,8 @@
+I wrote a new ikiwiki plugin to generate redirection files so that the URL http://example.com/wiki/foo.html turns into http://example.com/wiki/foo/.
+
+This plugin is particularly useful when converting old sites built with static wiki pages into shiny new ikiwiki ones, while preserving external links.
+
+I'm happy to contribute the module to ikiwiki if there's interest. Source is
+[here](http://www.isi.edu/~johnh/SOFTWARE/IKIWIKI/usedirs_redir.pm.txt).
+
+[[!tag wishlist todo patch]]
diff --git a/doc/todo/user-defined_templates_outside_the_wiki.mdwn b/doc/todo/user-defined_templates_outside_the_wiki.mdwn
new file mode 100644
index 000000000..1d72aa6a7
--- /dev/null
+++ b/doc/todo/user-defined_templates_outside_the_wiki.mdwn
@@ -0,0 +1,10 @@
+[[!tag wishlist]]
+
+The [[plugins/contrib/ftemplate]] plugin looks for templates inside the wiki
+source, but also looks in the system templates directory (the one with
+`page.tmpl`). This means the wiki admin can provide templates that can be
+invoked via `\[[!template]]`, but don't have to "work" as wiki pages in their
+own right. I think the normal [[plugins/template]] plugin could benefit from
+this functionality.
+
+[[done]] --[[Joey]]
diff --git a/doc/todo/user-subdir_mechanism_like_etc_ikiwiki_wikilist.mdwn b/doc/todo/user-subdir_mechanism_like_etc_ikiwiki_wikilist.mdwn
new file mode 100644
index 000000000..826990e9f
--- /dev/null
+++ b/doc/todo/user-subdir_mechanism_like_etc_ikiwiki_wikilist.mdwn
@@ -0,0 +1,3 @@
+Currently, ikiwiki has the configuration file `/etc/ikiwiki/wikilist`, which `ikiwiki-mass-rebuild` can use to rebuild all the ikiwikis on the system, such as when upgrading ikiwiki. This file includes usernames, and `ikiwiki-mass-rebuild` (which must run as root) changes to the specified user to rebuild their wiki. However, this means that adding new ikiwikis to the list must require administrator action, since editing the file would allow you to run ikiwiki as any user. What about a user-subdirectory mechanism for this? If each user could have their own `/etc/ikiwiki/users/$user/wikilist`, which only contained wikis (no users), `ikiwiki-mass-rebuild` could rebuild each wiki in this list as the corresponding user only. This would mean that an administrator need only create the directory and provide user or group write permission, and the user or group can then create wikis as needed.
+
+[[todo/Done]], though somewhat differently. --[[Joey]]
diff --git a/doc/todo/userdir_links.mdwn b/doc/todo/userdir_links.mdwn
new file mode 100644
index 000000000..02dbdaa65
--- /dev/null
+++ b/doc/todo/userdir_links.mdwn
@@ -0,0 +1,5 @@
+The userdir should be searched at the end of the search "path" for links,
+so that users can put their pages in the userdir, and still link to them
+easily when signing things, without giving a path.
+
+[[todo/done]]
diff --git a/doc/todo/utf8.mdwn b/doc/todo/utf8.mdwn
new file mode 100644
index 000000000..278fb9382
--- /dev/null
+++ b/doc/todo/utf8.mdwn
@@ -0,0 +1,18 @@
+ikiwiki should support utf-8 pages, both input and output. To test, here's a
+utf-8 smiley:
+
+# ☺
+
+Currently ikiwiki is believed to be utf-8 clean itself; it tells perl to use
+binmode when reading possibly binary files (such as images) and it uses
+utf-8 compatible regexps etc.
+
+There may be the odd corner where utf-8 still doesn't work; these are being
+fixed as they're found.
+
+Notes:
+
+* Apache "AddDefaultCharset on" settings will not play well with utf-8
+ pages. Turn it off.
+
+[[todo/done]]
diff --git a/doc/todo/varioki_--_add_template_variables___40__with_closures_for_values__41___in_ikiwiki.setup.mdwn b/doc/todo/varioki_--_add_template_variables___40__with_closures_for_values__41___in_ikiwiki.setup.mdwn
new file mode 100644
index 000000000..d292a1184
--- /dev/null
+++ b/doc/todo/varioki_--_add_template_variables___40__with_closures_for_values__41___in_ikiwiki.setup.mdwn
@@ -0,0 +1,272 @@
+varioki - Add variables for use in ikiwiki templates
+
+This plugin attempts to provide a means to add variables for use in ikiwiki templates, based on a hash variable set in the ikiwiki configuration file. The motivation for this plugin was to provide an easy way for end users to add information to be used in templates -- for example, my "Blosxom" blog entry template does fancy things with the date components of the entry, and there was no easy way to get that information into the template. Or one may want to have a different page template for the top level index page than for the rest of the pages in the wiki (for example, to put special content, like, say, "last.fm" play lists, only on the front page).
+
+This plugin hooks itself into the "pagetemplate" hook, and adds parameters to the appropriate templates based on the type. For example, the following inserted into "ikiwiki.setup" creates "TMPL_VAR MOTTO" and "TOPLVL" which can then be used in your templates.
+
+    varioki => {
+        'motto' => '"Manoj\'s musings"',
+        'toplvl' => 'sub {return $page eq "index"}'
+    },
+
+For every key in the configured hash, the corresponding value is evaluated. Based on whether the value was a stringified scalar, code, array, or hash, the value of the template parameter is generated on the fly. The available variables are whatever is available to "pagetemplate" hook scripts, namely, $page, $destpage, and $template. Additionally, the global variables and functions as defined in the Ikiwiki documentation (<http://ikiwiki.info/plugins/write/>) may be used.
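
The eval-and-dispatch described here can be sketched as follows (a Python analogue, purely illustrative; the real plugin is Perl and evaluates the strings with Perl's `eval`):

```python
def expand_varioki(varioki, page):
    """Evaluate each configured value string; if the result is a
    function, call it to compute the template parameter, otherwise
    use the resulting value directly."""
    params = {}
    for key, expr in varioki.items():
        value = eval(expr)  # values are strings that get eval'd, as in the setup file
        if callable(value):
            value = value(page)
        params[key.upper()] = value
    return params
```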
+
+ManojSrivastava
+
+> I think you could now implement "toplvl" using [[conditionals|/plugins/conditional]]:
+>
+> \[[!if test="destpage(/index)" then="""...""" else="""..."""]]
+>
+> --[[JoshTriplett]]
+
+> > Right. But how about some more complex stuff, for example, from my varioki settings below? --ManojSrivastava
+
+> Here's a dump of the file Manoj sent me, for reference.
+>
+> My take on this is that simple plugins can do the same sort of things, this is
+> kind of wanting to avoid the plugin mechanism and just use templates and
+> stuff in the config file. Not too thrilled about that. --[[Joey]]
+
+> > OK. How do you implement something like category I have in my varioki
+> > settings? As a user, I can just add new stuff to my config and my template;
+> > with a plugin I'll have to write a plugin, and install it in the ikiwiki plugin
+> > directory, which is not very easy for a plain ol' user. Not everyone is the
+> > sysadmin of their own machines with access to system dirs. --ManojSrivastava
+
+>>> It seems worth mentioning here that the `libdir` configuration parameter
+>>> lets you install additional plugins in a user-controlled directory
+>>> (*libdir*`/IkiWiki/Plugin`), avoiding needing root; indeed, a full local
+>>> ikiwiki installation without any involvement from the sysadmin is
+>>> [[possible|tips/DreamHost]]. --[[smcv]]
+
+<pre>
+ varioki => {'motto' => '"Manoj\'s musings"',
+ 'arrayvar' => '[0, 1, 2, 3]',
+ 'hashvar' => '{1, 1, 2, 2}',
+ 'toplvl' => 'sub {return $page eq "index"}',
+ 'isblog' => 'sub {return $page =~ m,blog/.*,}',
+ 'category' => 'sub { return " " unless $page=~ m,^blog/,; my $p=""; my $i="&lt;a href=\"$config{url}/blog\"&gt;Top::&lt;/a&gt;"; my @a=split ("/",$page); shift @a; pop @a; foreach my $dir (@a) { $p.=$dir; $i.="&lt;a href=\"$config{url}/tag/$p\"&gt;$dir&lt;/a&gt;::"; $p.="/"; }; return $i }',
+ 'date' => 'sub { return POSIX::strftime("%d", gmtime((stat(srcfile($pagesources{$page})))[9])); }',
+ 'year' => 'sub { return POSIX::strftime("%Y", gmtime((stat(srcfile($pagesources{$page})))[9])); }',
+ 'month' => 'sub { return POSIX::strftime("%B", gmtime((stat(srcfile($pagesources{$page})))[9])); }',
+ 'day' => 'sub { return POSIX::strftime("%A", gmtime((stat(srcfile($pagesources{$page})))[9])); }',
+ },
+</pre>
+
+> > I'd argue in favour of this plugin; it's true that a simple plugin can be
+> > used to set a template variable, but that makes it necessary to write a new
+> > plugin for every variable (or set of variables) that are needed. In that
+> > kind of situation, I don't think bypassing the plugin mechanism is a bad
+> > thing, unless an ever-growing collection of plugins to set one or two
+> > variables is a good thing.
+> >
+> > --[[bma]]
+
+----
+
+<pre>
+* looking for srivasta@debian.org--2006-misc/ikiwiki--upstream--1.0--patch-488 to compare with
+* comparing to srivasta@debian.org--2006-misc/ikiwiki--upstream--1.0--patch-488: ................................................................ done.
+
+* added files
+
+--- /dev/null
++++ mod/IkiWiki/Plugin/.arch-ids/varioki.pm.id
+@@ -0,0 +1 @@
++Manoj Srivastava <srivasta@debian.org> Thu Dec 7 12:59:07 2006 12659.0
+--- /dev/null
++++ mod/IkiWiki/Plugin/varioki.pm
+@@ -0,0 +1,190 @@
++#!/usr/bin/perl
++# -*- Mode: Cperl -*-
++# varioki.pm ---
++# Author : Manoj Srivastava ( srivasta@glaurung.internal.golden-gryphon.com )
++# Created On : Wed Dec 6 22:25:44 2006
++# Created On Node : glaurung.internal.golden-gryphon.com
++# Last Modified By : Manoj Srivastava
++# Last Modified On : Thu Dec 7 13:07:36 2006
++# Last Machine Used: glaurung.internal.golden-gryphon.com
++# Update Count : 127
++# Status : Unknown, Use with caution!
++# HISTORY :
++# Description :
++#
++# arch-tag: 6961717b-156f-4ab2-980f-0d6a973aea21
++#
++# Copyright (c) 2006 Manoj Srivastava <srivasta@debian.org>
++#
++# This program is free software; you can redistribute it and/or modify
++# it under the terms of the GNU General Public License as published by
++# the Free Software Foundation; either version 2 of the License, or
++# (at your option) any later version.
++#
++# This program is distributed in the hope that it will be useful,
++# but WITHOUT ANY WARRANTY; without even the implied warranty of
++# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
++# GNU General Public License for more details.
++#
++# You should have received a copy of the GNU General Public License
++# along with this program; if not, write to the Free Software
++# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
++#
++
++require 5.002;
++
++package IkiWiki::Plugin::varioki;
++
++use warnings;
++use strict;
++use IkiWiki '1.00';
++
++our $VERSION = "0.1";
++my $file = __FILE__;
++
++
++=head1 NAME
++
++varioki - Add variables for use in ikiwiki templates
++
++=cut
++
++=head1 DESCRIPTION
++
++This plugin attempts to provide a means to add variables for use in
++ikiwiki templates, based on a hash variable set in the ikiwiki
++configuration file. The motivation for this plugin was to provide an
++easy way for end users to add information to be used in templates --
++for example, my C<Blosxom> blog entry template does fancy things with
++the date components of the entry, and there was no easy way to get
++that information into the template. Or if one wants to have a
++different page template for the top level index page than for the rest
++of the pages in the wiki (for example, to only put special content,
++like, say, C<last.fm> play lists, only on the front page).
++
++This plugin hooks itself into the C<pagetemplate> hook, and adds
++parameters to the appropriate templates based on the type. For
++example, the following inserted into C<ikiwiki.setup> creates
++C<TMPL_VAR MOTTO>, C<ARRAYVAR>, C<HASHVAR> and C<TOPLVL> which can
++then be used in your templates. The array and hash variables are only
++for completeness; I suspect that the first two forms are all that are
++really required.
++
++ varioki => {
++ 'motto' => '"Manoj\'s musings"',
++ 'toplvl' => 'sub {return $page eq "index"}',
++ 'arrayvar' => '[0, 1, 2, 3]',
++ 'hashvar' => '{1, 1, 2, 2}'
++ },
++
++Please note that the values in the hash must be simple strings which
++are then eval'd, so a string value has to be double quoted, as above
++(the eval strips off the outer quotes).
++
++=cut
++
++
++sub import {
++ hook(type => "pagetemplate", id => "varioki", call => \&pagetemplate);
++}
++
++
++=pod
++
++For every key in the configured hash, the corresponding value is
++evaluated. Based on whether the value was a stringified scalar, code,
++array, or hash, the value of the template parameter is generated on
++the fly. The available variables are whatever is available to
++C<pagetemplate> hook scripts, namely, C<$page>, C<$destpage>, and
++C<$template>. Additionally, the global variables and functions as
++defined in the Ikiwiki documentation
++(L<http://ikiwiki.kitenet.net/plugins/write.html>) may be used.
++
++=cut
++
++sub pagetemplate (@) {
++ my %params=@_;
++ my $page=$params{page};
++ my $template=$params{template};
++
++ return unless defined $config{varioki};
++ for my $var (keys %{$config{varioki}}) {
++ my $value;
++ my $foo;
++ eval "\$foo=$config{varioki}{$var}";
++ if (ref($foo) eq "CODE") {
++ $value = $foo->();
++ }
++ elsif (ref($foo) eq "SCALAR") {
++ $value = $foo;
++ }
++ elsif (ref($foo) eq "ARRAY") {
++ $value = join ' ', @$foo;
++ }
++ elsif (ref($foo) eq "HASH") {
++ for my $i (values %$foo ) {
++ $value .= ' ' . "$i";
++ }
++ }
++ else {
++ $value = $foo;
++ }
++ warn "$page $var $value\n";
++ if ($template->query(name => "$var")) {
++ $template->param("$var" =>"$value");
++ }
++ }
++}
++
++1;
++
++=head1 CAVEATS
++
++This is very inchoate at the moment, and needs testing. Also, there
++is no good way to determine how to handle hashes as values --
++currently, the code just joins all hash values with spaces, but it
++would be easier for the user to just use an anonymous sub instead of
++passing in a hash or an array.
++
++=cut
++
++=head1 BUGS
++
++Since C<ikiwiki> evals the configuration file, the values all have to
++be on a single physical line. This is the reason we need to use strings
++and eval, instead of just passing in real anonymous sub references,
++since the eval pass converts the coderef into a string of the form
++"(CODE 12de345657)" which can't be dereferenced.
++
++=cut
++
++=head1 AUTHOR
++
++Manoj Srivastava <srivasta@debian.org>
++
++=head1 COPYRIGHT AND LICENSE
++
++This script is a part of the Devotee package, and is
++
++Copyright (c) 2002 Manoj Srivastava <srivasta@debian.org>
++
++This program is free software; you can redistribute it and/or modify
++it under the terms of the GNU General Public License as published by
++the Free Software Foundation; either version 2 of the License, or
++(at your option) any later version.
++
++This program is distributed in the hope that it will be useful,
++but WITHOUT ANY WARRANTY; without even the implied warranty of
++MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
++GNU General Public License for more details.
++
++You should have received a copy of the GNU General Public License
++along with this program; if not, write to the Free Software
++Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
++
++=cut
++
++1;
++
++__END__
++
+</pre>
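The string-then-eval convention the patch documents can be sketched outside Perl. The fragment below is purely illustrative (the plugin itself is Perl, and `template_value` is a made-up helper name): values are stored as plain strings in the setup file and only evaluated when a template is filled in, so code values survive the setup file itself being eval'd.

```python
# Illustrative sketch of varioki's convention, not ikiwiki code.
# Each value is a string; eval'ing it may yield a plain value or a callable.
config_varioki = {
    'motto': '"Manoj\'s musings"',             # eval yields a plain string
    'toplvl': 'lambda page: page == "index"',  # eval yields a callable
}

def template_value(expr, page):
    value = eval(expr)       # turn the stored string into a real value
    if callable(value):
        return value(page)   # code values are re-evaluated per page
    return value
```

This mirrors the plugin's `ref($foo) eq "CODE"` branch: a callable is invoked for the current page, anything else is used as-is.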
+
+[[!tag patch]]
diff --git a/doc/todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both.mdwn b/doc/todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both.mdwn
new file mode 100644
index 000000000..6ede7f91e
--- /dev/null
+++ b/doc/todo/want_to_avoid_ikiwiki_using_http_or_https_in_urls_to_allow_serving_both.mdwn
@@ -0,0 +1,376 @@
+## current status
+
+[[done]] again! :)
+
+Actually, there are two places where the configured url is still hardcoded:
+
+1. When searching, all the links will use it. This is annoying to fix,
+ and we deem it not a problem.
+2. When ikiwiki dies with an error, the links on the error page will
+ use it. Too bad :)
+
+------
+
+## semi-old
+
+
+* CGI pages, with the exception of edit pages, set `<base>` to
+ `$config{url}`
+
+ I had to revert using `baseurl(undef)` for that, because it needs
+ to be a full url.
+
+ Ideally, baseurl would return an absolute url derived from the url
+ being used to access the cgi, but that needs access to the CGI object,
+ which it does not currently have. Similarly, `misctemplate`
+ does not have access to the CGI object, so it cannot use it to
+ generate a better baseurl. Not sure yet what to do; may have to thread
+ a cgi parameter through all the calls to misctemplate. --[[Joey]]
+
+ > Fixed, cgitemplate is used now. --[[Joey]]
+
+* Using `do=goto` to go to a comment or recentchanges item
+ will redirect to the `$config{url}`-based url, since the
+ permalinks are made to be absolute urls now.
+
+ Fixing this would seem to involve making meta force permalinks
+> to absolute urls when filling out templates, while allowing them
+ to be left as partial urls internally, for use by goto. --[[Joey]]
+
+ > This reversion has now been fixed. --[[Joey]]
+
+## old attempt
+
+It looks like all links in websites are absolute paths; this has some limitations:
+
+* If connecting to website via https://... all links will take you back to http://
+* Makes it harder to mirror website via HTML version, as all links have to be updated.
+
+It would be good if relative paths could be used instead, so the transport method isn't changed unless specifically requested.
+
+-- Brian May
+
+> Er, which absolute links are you talking about? If you view the source
+> to this page, you'll find links such as "../favicon.ico", "../style.css",
+> "../../", and "../". The only absolute links are to CGIs and the w3c DTD.
+> --[[Joey]]
+
+>> The problem is within the CGI script. The links within the HTML page are all
+>> absolute, including links to the css file. Having http links within an HTML
+>> page retrieved using https upsets most browsers (I think). Also if I push cancel
+>> on the edit page in https, I end up at an http page. -- Brian May
+
+>>> Ikiwiki does not hardcode http links anywhere. If you don't want
+>>> it to use such links, change your configuration to use https
+>>> consistently. --[[Joey]]
+
+Errr... That is not a solution, that is a work around. ikiwiki does not hard
+code the absolute paths, but absolute paths are hard coded in the configuration
+file. If you want to serve your website so that the majority of users can see
+it as http, including in rss feeds (this allows proxy caches to cache the
+contents and has reduced load requirements), but editing is done via https for
+increased security, it is not possible. I have some ideas how this can be
+implemented (as ikiwiki has the absolute path to the CGI script and the
+absolute path to the destination, it should be possible to generate a relative
+path from one to the other), although some minor issues still need to be
+resolved. -- Brian May
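Brian's idea of deriving a relative path from the CGI's absolute URL to the destination's can be sketched as follows. This is an illustrative Python fragment, not ikiwiki code, and `relative_url` is a hypothetical helper:

```python
import posixpath
from urllib.parse import urlparse

def relative_url(from_url, to_url):
    # Derive a relative link from one URL to another on the same
    # scheme and host; keep the absolute form for other hosts.
    f, t = urlparse(from_url), urlparse(to_url)
    if (f.scheme, f.netloc) != (t.scheme, t.netloc):
        return to_url
    return posixpath.relpath(t.path, posixpath.dirname(f.path))
```

With such a helper, a page served over https would link relatively and so stay on https, regardless of which scheme the configured urls use.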
+
+I noticed the links to the images on <http://ikiwiki.info/recentchanges/> are
+also absolute, that is <http://ikiwiki.info/wikiicons/diff.png>; this seems
+surprising, as the change.tmpl file uses &lt;TMPL_VAR BASEURL&gt; which seems
+to do the right thing in page.tmpl, but not for change.tmpl. Where is BASEURL
+set? -- Brian May
+
+> The use of an absolute baseurl in change.tmpl is a special case. --[[Joey]]
+
+So I'm facing this same issue. I have a wiki which needs to be accessed on
+three different URLs(!) and the hard coding of the URL from the setup file is
+becoming a problem for me. Is there anything I can do here? --[[Perry]]
+
+> I remain puzzled by the problem that Brian is discussing. I don't see
+> why you can't just set the cgiurl and url to a https url, and serve
+> the site using both http and https.
+>
+> Just for example, <https://kitenet.net/> is an ikiwiki, and it is accessible
+> via https or http, and if you use https, links will remain on https (except
+> for links using the cgi, which I could fix by changing the cgiurl to https).
+>
+> I think it's possible ikiwiki used to have some
+> absolute urls that have been fixed since Brian filed the bug. --[[Joey]]
+
+[[wishlist]]
+
+----
+
+[[!toggle id="smcv-https" text="Some discussion of a rejected implementation, smcv/https."]]
+[[!toggleable id="smcv-https" text="""
+
+[[!template id=gitbranch branch=smcv/https author="[[smcv]]"]]
+
+For a while I've been using a configuration where each wiki has a HTTP and
+a HTTPS mirror, and updating one automatically updates the other, but
+that seems unnecessarily complicated. My `https` branch adds `https_url`
+and `https_cgiurl` config options which can be used to provide a HTTPS
+variant of an existing site; the CGI script automatically detects whether
+it was accessed over HTTPS and switches to the other one.
+
+This required some refactoring, which might be worth merging even if
+you don't like my approach:
+
+* change `IkiWiki::cgiurl` to return the equivalent of `$config{cgiurl}` if
+ called with no parameters, and change all plugins to indirect through it
+ (then I only need to change that one function for the HTTPS hack)
+
+* `IkiWiki::baseurl` already has similar behaviour, so change nearly all
+ references to the `$config{url}` to call `baseurl` (a couple of references
+ specifically wanted the top-level public URL for Google or Blogspam rather
+ than a URL for the user's browser, so I left those alone)
+
+--[[smcv]]
+
+> The justification for your patch seems to be wanting to use a different
+> domain, like secure.foo.com, for https? Can you really not just configure
+> both url and cgiurl to use `https://secure.foo.com/...` and rely on
+> relative links to keep users of `http://insecure.foo.com/` on http until
+> they need to use the cgi?
+
+>> My problem with that is that uses of the CGI aren't all equal (and that
+>> the CA model is broken). You could put CGI uses in two classes:
+>>
+>> - websetup and other "serious" things (for the sites I'm running, which
+>> aren't very wiki-like, editing pages is also in this class).
+>> I'd like to be able to let privileged users log in over
+>> https with httpauth (or possibly even a client certificate), and I don't
+>> mind teaching these few people how to do the necessary contortions to
+>> enable something like CACert.
+>>
+>> - Random users making limited use of the CGI: do=goto, do=404, and
+>> commenting with an OpenID. I don't think it's realistic to expect
+>> users to jump through all the CA hoops to get CACert installed for that,
+>> which leaves their browsers being actively obstructive, unless I either
+>> pay the CA tax (per subdomain) to get "real" certificates, or use plain
+>> http.
+>>
+>> On a more wiki-like wiki, the second group would include normal page edits.
+>>
+>>> I see your use case. It still seems to me that for the more common
+>>> case where CA tax has been paid (getting a cert that is valid for
+>>> multiple subdomains should be doable?), having anything going through the
+>>> cgiurl upgrade to https would be ok. In that case, http is just an
+>>> optimisation for low-value, high-aggregate-bandwidth type uses, so a
+>>> little extra https on the side is not a big deal. --[[Joey]]
+>>
+>> Perhaps I'm doing this backwards, and instead of having the master
+>> `url`/`cgiurl` be the HTTP version and providing tweakables to override
+>> these with HTTPS, I should be overriding particular uses to plain HTTP...
+>>
+>> --[[smcv]]
+>>>
+>>> Maybe, or I wonder if you could just use RewriteEngine for such selective
+>>> up/downgrading. Match on `do=(edit|create|prefs)`. --[[Joey]]
+
+> I'm unconvinced.
+>
+> `Ikiwiki::baseurl()."foo"` just seems to be asking for trouble,
+> ie being accidentially written as `IkiWiki::baseurl("foo")`,
+> which will fail when foo is not a page, but some file.
+
+>> That's a good point. --s
+
+> I see multiple places (inline.pm, meta.pm, poll.pm, recentchanges.pm)
+> where it will now put the https url into a static page if the build
+> happens to be done by the cgi accessed via https, but not otherwise.
+> I would rather not have to audit for such problems going forward.
+
+>> Yes, that's a problem with this approach (either way round). Perhaps
+>> making it easier to run two mostly-synched copies like I was previously
+>> doing is the only solution... --s
+
+"""]]
+
+----
+
+[[!template id=gitbranch branch=smcv/ready/localurl author="[[smcv]]"]]
+[[!tag patch]]
+
+OK, here's an alternative approach, closer in spirit to what was initially
+requested. I included a regression test for `urlto`, `baseurl` and `cgiurl`,
+now that they have slightly more complex behaviour.
+
+The idea is that in the common case, the CGI and the pages will reside on the
+same server, so they can use "semi-absolute" URLs (`/ikiwiki.cgi`, `/style.css`,
+`/bugs/done`) to refer to each other. Most redirects, form actions, links etc.
+can safely use this form rather than the fully-absolute URL.
+
+The initial version of the branch had config options `local_url` and
+`local_cgiurl`, but they're now automatically computed by checking
+whether `url` and `cgiurl` are on the same server with the same URL
+scheme. In theory you could use things like `//static.example.com/wiki/`
+and `//dynamic.example.com/ikiwiki.cgi` to preserve choice of http/https
+while switching server, but I don't know how consistently browsers
+support that.
+
+"local" here is short for "locally valid", because these URLs are neither
+fully relative nor fully absolute, and there doesn't seem to be a good name
+for them...
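The "locally valid" computation described above can be sketched like this (illustrative Python, not the actual Perl implementation; `local_urls` is a made-up name):

```python
from urllib.parse import urlparse

def local_urls(url, cgiurl):
    # If the static pages and the CGI share a scheme and host, both can
    # be referred to by "semi-absolute" (path-only) URLs such as
    # /ikiwiki.cgi or /style.css; otherwise keep the absolute URLs.
    u, c = urlparse(url), urlparse(cgiurl)
    if (u.scheme, u.netloc) == (c.scheme, c.netloc):
        return (u.path or '/', c.path)
    return (url, cgiurl)
```

Since path-only URLs carry no scheme, a visitor who arrives over https keeps following https links without the wiki needing to know which scheme is in use.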
+
+I've tested this on a demo website with the CGI enabled, and it seemed to
+work nicely (there might be bugs in some plugins, I didn't try all of them).
+The branch at [[todo/use secure cookies for SSL logins]] goes well with
+this one.
+
+The `$config{url}` and `$config{cgiurl}` are both HTTP, but if I enable
+`httpauth`, set `cgiauthurl` to a HTTPS version of the same site and log
+in via that, links all end up in the HTTPS version.
+
+New API added by this branch:
+
+* `urlto(x, y, 'local')` uses `$local_url` instead of `$config{url}`
+
+ > Yikes. I see why you wanted to keep it to 3 parameters (4 is too many,
+ > and po overrides it), but I dislike overloading the third parameter
+ > like that.
+ >
+ > There are fairly few calls to `urlto($foo, $bar)`, so why not
+ > make that always return the semi-local url form, and leave the third
+ > parameter for the cases that need a true fully-qualified url.
+ > The new form for local urls will typically be only a little bit longer,
+ > except in the unusual case where the cgiurl is elsewhere. --[[Joey]]
+
+ >> So, have urlto(x, y) use `$local_url`? There are few calls, but IMO
+ >> they're for the most important things - wikilinks, img, map and
+ >> other ordinary hyperlinks. Using `$local_url` would be fine for
+ >> webserver-based use, but it does stop you browsing your wiki's
+ >> HTML over `file:///` (unless you set that as the base URL, but
+ >> then you can't move it around), and stops you moving simple
+ >> outputs (like the docwiki!) around.
+ >>
+ >> I personally think breaking the docwiki is enough to block that.
+ >>
+ >>> Well, the docwiki doesn't have an url configured at all, so I assumed
+ >>> it would need to fall back to current behavior in that case. I had
+ >>> not thought about browsing wiki's html files though, good point.
+ >>
+ >> How about this?
+ >>
+ >> * `urlto($link, $page)` with `$page` defined: relative
+ >> * `urlto($link, undef)`: local, starts with `/`
+ >> * `urlto($link)`: also local, as a side-effect
+ >> * `urlto($link, $anything, 1)` (but idiomatically, `$anything` is
+ >> normally undef): absolute, starts with `http[s]://`
+ >>
+ >> --[[smcv]]
+ >>
+ >>> That makes a great deal of sense, bravo for actually removing
+ >>> parameters in the common case while maintaining backwards
+   >>> compatibility! --[[Joey]]
+ >>>
+ >>>> Done in my `localurl` branch; not tested in a whole-wiki way
+ >>>> yet, but I did add a regression test. I've used
+ >>>> `urlto(x, undef)` rather than `urlto(x)` so far, but I could
+ >>>> go back through the codebase using the short form if you'd
+ >>>> prefer. --[[smcv]]
+ >>>
+ >>> It does highlight that it would be better to have a
+ >>> `absolute_urlto($link)` (or maybe `absolute(urlto($link))` )
+ >>> rather than the 3 parameter form. --[[Joey]]
+ >>>
+ >>> Possibly. I haven't added this.
+
+* `IkiWiki::baseurl` has a new second argument which works like the
+ third argument of `urlto`
+
+ > I assume you have no objection to this --[[smcv]]
+
+ >> It's so little used that I don't really care if it's a bit ugly.
+ >> (But I assume changes to `urlto` will follow through here anyway.)
+ >> --[[Joey]]
+
+ >>> I had to use it a bit more, as a replacement for `$config{url}`
+ >>> when doing things like referencing stylesheets or redirecting to
+ >>> the top of the wiki.
+ >>>
+ >>> I ended up redoing this without the extra parameter. Previously,
+ >>> `baseurl(undef)` was the absolute URL; now, `baseurl(undef)` is
+ >>> the local path. I know you objected to me using `baseurl()` in
+ >>> an earlier branch, because `baseurl().$x` looks confusingly
+ >>> similar to `baseurl($x)` but has totally different semantics;
+ >>> I've generally written it `baseurl(undef)` now, to be more
+ >>> explicit. --[[smcv]]
+
+* `IkiWiki::cgiurl` uses `$local_cgiurl` if passed `local_cgiurl => 1`
+
+ > Now changed to always use the `$local_cgiurl`. --[[smcv]]
+
+* `IkiWiki::cgiurl` omits the trailing `?` if given no named parameters
+ except `cgiurl` and/or `local_cgiurl`
+
+ > I assume you have no objection to this --[[smcv]]
+ >
+ >> Nod, although I don't know of a use case. --[[Joey]]
+
+ >>> The use-case is that I can replace `$config{cgiurl}` with
+ >>> `IkiWiki::cgiurl()` for things like the action attribute of
+ >>> forms. --[[smcv]]
+
+Fixed bugs:
+
+* I don't think anything except `openid` calls `cgiurl` without also
+ passing in `local_cgiurl => 1`, so perhaps that should be the default;
+ `openid` uses the `cgiurl` named parameter anyway, so there doesn't even
+ necessarily need to be a way to force absolute URLs? Any other module
+ that really needs an absolute URL could use
+ `cgiurl(cgiurl => $config{cgiurl}, ...)`,
+ although that does look a bit strange
+
+ > I agree that makes sense. --[[Joey]]
+
+ >> I'm not completely sure whether you're agreeing with "perhaps do this"
+ >> or "that looks too strange", so please disambiguate:
+ >> would you accept a patch that makes `cgiurl` default to a local
+ >> (starts-with-`/`) result? If you would, that'd reduce the diff. --[[smcv]]
+
+ >>> Yes, I absolutely think it should default to local. (Note that
+ >>> if `absolute()` were implemented as suggested above, it could also
+ >>> be used with cgiurl if necessary.) --[[Joey]]
+
+ >>>> Done (minus `absolute()`). --[[smcv]]
+
+Potential future things:
+
+* It occurs to me that `IkiWiki::cgiurl` could probably benefit from being
+ exported? Perhaps also `IkiWiki::baseurl`?
+
+ > Possibly, see [[firm_up_plugin_interface]]. --[[Joey]]
+
+ >> Not really part of this branch, though, so wontfix (unless you ask me
+ >> to do so). --[[smcv]]
+
+* Or, to reduce use of the unexported `baseurl` function, it might make
+ sense to give `urlto` a special case that references the root of the wiki,
+ with a trailing slash ready to append stuff: perhaps `urlto('/')`,
+ with usage like this?
+
+    do_something(baseurl => urlto('/', undef, 'local'));
+ do_something_else(urlto('/').'style.css');
+ IkiWiki::redirect(urlto('/', undef, 1));
+
+ > AFACIS, `baseurl` is only called in 3 places so I don't think that's
+ > needed. --[[Joey]]
+
+ >> OK, wontfix. For what it's worth, my branch has 6 uses in IkiWiki
+ >> core code (IkiWiki, CGI, Render and the pseudo-core part of editpage)
+ >> and 5 in plugins, since I used it for things like redirection back
+ >> to the top of the wiki --[[smcv]]
+
+merged|done --[[Joey]] (But reopened, see above.)
+
+----
+
+Update: I had to revert part of 296e5cb2fd3690e998b3824d54d317933c595873,
+since it broke openid logins. The openid object requires a complete,
+not a relative cgiurl. I'm not sure if my changing that back to using
+`$config{cgiurl}` will force users back to eg, the non-https version of a
+site when logging in via openid.
+
+> Ok, changed it to use `CGI->url` to get the current absolute cgi url. --[[Joey]]
diff --git a/doc/todo/wanted_pages_plugin.mdwn b/doc/todo/wanted_pages_plugin.mdwn
new file mode 100644
index 000000000..4758090ef
--- /dev/null
+++ b/doc/todo/wanted_pages_plugin.mdwn
@@ -0,0 +1,3 @@
+[[plugins/orphans]] shows all pages that exist but have no links to them. We should also support the converse, wanted pages: pages that have links to them but do not exist. --[[JoshTriplett]]
+
+That's [[plugins/brokenlinks]] so [[done]] already I guess --[[Joey]]
diff --git a/doc/todo/wdiffs_in_recentchanges.mdwn b/doc/todo/wdiffs_in_recentchanges.mdwn
new file mode 100644
index 000000000..0f203b6c8
--- /dev/null
+++ b/doc/todo/wdiffs_in_recentchanges.mdwn
@@ -0,0 +1 @@
+Would be nice if the recentchangesdiff plugin could optionally provide wdiffs, or highlight the changes on each line. When people edit a wiki via the web interface, they often put each paragraph on a single line, like this one, and if someone later edits the paragraph, the diff only shows the whole line changed, making it difficult to figure out what the actual change is. --liw
diff --git a/doc/todo/web-based_image_editing.mdwn b/doc/todo/web-based_image_editing.mdwn
new file mode 100644
index 000000000..a03206091
--- /dev/null
+++ b/doc/todo/web-based_image_editing.mdwn
@@ -0,0 +1,3 @@
+We could support web-based image editing, using something like [Snipshot](http://snipshot.com/). Several comparisons of web-based image editors exist; we would need to choose which one(s) to support. --[[JoshTriplett]]
+
+[[wishlist]]
diff --git a/doc/todo/web_gui_for_managing_tags.mdwn b/doc/todo/web_gui_for_managing_tags.mdwn
new file mode 100644
index 000000000..c865bf738
--- /dev/null
+++ b/doc/todo/web_gui_for_managing_tags.mdwn
@@ -0,0 +1,12 @@
+Along the same lines as [[default_content_for_new_post]] and [[default_name_for_new_post]] it would be helpful if the default web edit page provided a gui for managing tags. Being able to manage tags directly via the text box is **wonderful**, but having a dropdown box of available tags can be very helpful in a couple of ways:
+
+ * as a visual reminder to add tags
+ * as a cheat sheet for which tags already exist (eg. does this site use "howto" or "tutorial", "hack" or "hacks")
+
+As with [[default_content_for_new_post]] I understand that this falls into a bit of a funny space because making changes via svn would bypass this functionality, but given that the web seems to be the way most non-admin changes are made, it still seems a valuable addition for encouraging people to accurately tag content.
+
+MoinMoin has a simple implementation of this, [click here to see an example](http://www.spack.org/wiki/SandBox?action=edit&editor=text).
+
+--[[AdamShand]]
+
+[[!tag wishlist]]
diff --git a/doc/todo/web_reversion.mdwn b/doc/todo/web_reversion.mdwn
new file mode 100644
index 000000000..841fc3703
--- /dev/null
+++ b/doc/todo/web_reversion.mdwn
@@ -0,0 +1,73 @@
+Goal: Web interface to allow reverting of changes.
+
+Interface:
+
+At least at first, it will be exposed via the recentchanges
+page, with revert icons next to each change. We may want a dynamic
+per-page interface that goes back more than 100 changes later.
+
+Limiting assumptions:
+
+* No support for resolving conflicts in reverts; such a revert would just
+ fail and not happen.
+* No support for reset-to-this-point; initially the interface would only
+ revert a single commit, and if a bunch needed to go, the user would have
+ to drive that one at a time.
+
+Implementation plan:
+
+* `rcs_revert` hook that takes a revision to revert.
+* CGI: `do=revert&rev=foo`
+* recentchanges plugin adds above to recentchanges page
+* prompt user to confirm (to avoid spiders doing reverts),
+ check that user is allowed to make the change, commit reversion,
+ and refresh site.
+
+Peter Gammie has done an initial implementation of the above.
+[[!template id=gitbranch branch=peteg/revert author="[[users/peteg]]"]]
+
+>> It is on a separate branch now. --[[users/peteg]]
+
+> Review: --[[Joey]]
+>
+> The revert commit will not currently say what web user did the revert.
+> This could be fixed by doing a --no-commit revert first and then using
+> rcs_commit_staged.
+>> Fixed, I think. --[[users/peteg]]
+>
+> So I see one thing I completely forgot about is `check_canedit`. Avoiding users
+> using reverting to make changes they would normally not be allowed to do is
+> tricky. I guess that an easy first pass would be to only let admins do it.
+> That would be enough to get the feature out there.
+>
+> I'm thinking about having a `rcs_preprevert`. It would take a rev and look
+> at what changes reverting it would entail, and return the same data
+> structure that `rcs_receive` does. This could be done by using `git revert
+> --no-commit`, and then examining the changes, and then `git reset` to drop
+> them.
+>> We can use the existing `git_commit_info` with the patch ID - no need to touch the working directory. -- [[users/peteg]]
+>
+> Then the code that is currently in IkiWiki/Receive.pm, that calls
+> `check_canedit` and `check_canremove` to test the change, can be
+> straightforwardly refactored out, and used for checking reverts too.
+>> Wow, that was easy. :-) -- [[users/peteg]]
+>
+> (The data from `rcs_preprevert` could also be used for a confirmation
+> prompt -- it doesn't currently include enough info for diffs, but at
+> least could have a list of changed files.)
+>
+> Note that it's possible for a git repo to have commits that modify wiki
+> files in a subdir, and code files elsewhere. `rcs_preprevert` should
+> detect changes outside the wiki dir, and fail, like `rcs_receive` does.
+>> Taken care of by refactoring `rcs_receive` in `git.pm`
+>> I've tested it lightly in my single-user setup. It's a little nasty that the `attachment` plugin
+>> gets used to check whether attachments are allowed -- there really should be a hook for that.
+>>> I agree, but have not figured out a way to make a hook work yet.
+>>> --[[Joey]]
+>>
+>> Please look it over and tell me what else needs fixing... -- [[users/peteg]]
+
+>>> I have made my own revert branch and put a few^Wseveral fixes in there.
+>>> All merged to master now! --[[Joey]]
+
+[[done]]
diff --git a/doc/todo/websetup_should_link_to_plugin_descriptions.mdwn b/doc/todo/websetup_should_link_to_plugin_descriptions.mdwn
new file mode 100644
index 000000000..8b15fb7d0
--- /dev/null
+++ b/doc/todo/websetup_should_link_to_plugin_descriptions.mdwn
@@ -0,0 +1,3 @@
+A [[wishlist]] item.
+
+It would be nice if the websetup plugin could link to plugin descriptions. When it refers to a plugin by name, the name could be a link to <http://ikiwiki.info/plugins/$NAME/> (or other suitable location). --liw
diff --git a/doc/todo/wiki-formatted_comments_with_syntax_plugin.mdwn b/doc/todo/wiki-formatted_comments_with_syntax_plugin.mdwn
new file mode 100644
index 000000000..7a4a295d4
--- /dev/null
+++ b/doc/todo/wiki-formatted_comments_with_syntax_plugin.mdwn
@@ -0,0 +1,9 @@
+[[Wishlist]] item: I'd love to see the ability to optionally switch back to
+wiki syntax within the comments of code pretty-printed with the
+[[plugins/contrib/syntax]] plugin. This would allow the use of links and
+formatting in comments.
+
+> You can do this using the [[plugins/highlight]] plugin, but you have
+> to explicitly put a format directive in the comment to do it. Thus,
+> I'm leaving this open for now.. ideally, comments would be detected,
+> and formatted as markdown. --[[Joey]]
diff --git a/doc/todo/wikilink_titles.mdwn b/doc/todo/wikilink_titles.mdwn
new file mode 100644
index 000000000..e47928ae2
--- /dev/null
+++ b/doc/todo/wikilink_titles.mdwn
@@ -0,0 +1,4 @@
+If a wikilink does not show the name of the page, because it's been
+overridden to show something else, it could put a title="pagename" in the
+link. This way users mousing over the wikilink would get a nice tooltip
+with some extra info.
diff --git a/doc/todo/wikilinkfeatures.mdwn b/doc/todo/wikilinkfeatures.mdwn
new file mode 100644
index 000000000..bf32dafbb
--- /dev/null
+++ b/doc/todo/wikilinkfeatures.mdwn
@@ -0,0 +1,4 @@
+- \[[John|Fred]] is a Wikipedia method for linking to the one page
+ while displaying it as the other, Kyle would like this.
+
+[[todo/done]]
diff --git a/doc/todo/wikitrails.mdwn b/doc/todo/wikitrails.mdwn
new file mode 100644
index 000000000..f9daea6b1
--- /dev/null
+++ b/doc/todo/wikitrails.mdwn
@@ -0,0 +1,49 @@
+## summary
+at times it is useful to have a guided tour or trail through a subset of the pages of a wiki; in pmwiki, this is implemented as [wikitrails](http://www.pmwiki.org/wiki/PmWiki/WikiTrails).
+
+### smcv's implementation
+
+... is the out-of-tree [[plugins/trail]] plugin, see there for details.
+
+> And will be the one landing in ikiwiki. So, I'm closing this bug report. [[done]] --[[Joey]]
+
+### chrysn's implementation
+
+i'm working on a python xmlrpc plugin for ikiwiki to support wikitrails, both as a navigation feature (have "forward" and "back" links based on a sequence) and a modified inline that includes all pages in the trail with appropriate modifications (suitable for printing if necessary).
+
+the current status is published on `git://github.com/github076986099/ikiwiki-plugins.git`; as of now, i don't have a public demo of it.
+
+feedback on both the concept and the code is very much appreciated by [[discussion]] or [email](mailto:chrysn@fsfe.org).
+
+#### usage
+
+two preprocessor commands are provided:
+
+##### \[[!trail index="my_indexpage"]]
+
+embeds a navigation object with forward and back links as well as an indicator of the current position in the trail.
+
+if index is not specified, a suitable page up the path is used.
+
+this works very well together with the [[sidebar|plugins/sidebar]] plugin if the pages in a directory are roughly the same as the pages in the trail and the `index` is the directory index page; just put the \[[!trail]] in the sidebar.
+
+##### \[[!trailinclude index="my_indexpage"]]
+
+all pages linked from the index page are included in the same way as \[[!inline]] does, but in the proper sequence, with headings according to the indent in the source page and adaptations of the headings inside the page (a level 2 heading in a page that is a sub-sub-chapter of the whole trail will become a level 5 heading when trailincluded).
+
+#### the index page
+
+the index page is parsed as markdown; numbered lists and "`*`" bulleted lists are discovered.
+
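The discovery step above could be sketched roughly like this (an illustrative Python approximation, not the plugin's actual parser), accepting numbered and "`*`" bulleted items that contain wikilinks:

```python
import re

# Illustrative sketch only: discover trail pages from a markdown index,
# accepting "1." numbered and "*" bulleted list items with wikilinks.
index = """\
 1. [[intro]]
 2. [[setup]]
 * [[faq]]
"""

# Capture the link target (the part before any "|" display text).
links = re.findall(r'^\s*(?:\d+\.|\*)\s*\[\[([^\]|]+)\]\]', index, re.M)
print(links)  # ['intro', 'setup', 'faq']
```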
+#### current issues
+
+ * rebuilding --- currently, there is no proper rebuilding of pages (will use `will_render` and `add_depends`). care has to be taken of how not yet created pages play into this.
+ * inline recursion --- there is simply no guard yet
+ * navigation layout --- has to be both flexible and usable-by-default
+ * heading shifting
+ * currently only works for markdown
+ * can break the limit of html's six heading levels
+ * search for index page is currently next to hardcoded
+ * reading the index --- markdown syntax parsing is currently on a it-can-use-what-i-produce level; maybe integrate with existing mdwn parser
+ * uses undocumented titlepage command
+ > Don't worry about that, titlepage isn't going anywhere, and will probably become a formal part of the api next time I consider api changes. --[[Joey]]
diff --git a/doc/todo/wikitrails/discussion.mdwn b/doc/todo/wikitrails/discussion.mdwn
new file mode 100644
index 000000000..1ceb51f0d
--- /dev/null
+++ b/doc/todo/wikitrails/discussion.mdwn
@@ -0,0 +1,84 @@
+(This mainly discusses the original implementation (chrysn's). --[[smcv]])
+
+----
+
+This is a nice idea, but I do have my gripes about the implementation.
+
+Assuming that the index's list is in mdwn format is not ideal. I guess the
+other way to do it would be to make the index be a directive, something
+like: \[[!trail pages="foo bar baz"]]. Assuming that a flat trail structure
+is enough, otherwise you'd have to get more fancy.
+
+The trailinclude seems a bit redundant with inline, and wanting to inline
+together all pages in a trail for printing or whatever seems like an
+unusual use case anyway?
+
+The !trail directive could be simplified to just \[[!trail my_indexpage]].
+But I wonder if needing to add this directive to every page is the best
+approach. Alternate approach would be to make the trail index cause
+breadcrumbs to be automatically inserted at the top of every page on the
+trail. (You'd have to use a directive to define the index for that to work.)
+
+--[[Joey]]
+
+----
+
+Revisiting this, after effectively reimplementing a small version of it
+in [[plugins/contrib/album]]: it occurs to me that there might be a more
+"ikiwiki-like" way we could get this functionality.
+
+In the index page, you either want an [[ikiwiki/directive/inline]], or
+a list of links. In the former case, maybe we could extend inline like
+this:
+
+ \[[!inline ... blah blah ... trail=yes]]
+
+to make it remember the pages it inlined, in order, in the pagestate;
+in the latter case, we could replace the wikilinks with a directive,
+an operation something like this in diff notation:
+
+ - \[[one]] - the unit
+ - \[[two]] - the base of binary
+ - \[[three|3]] - is a crowd
+ + \[[!trailitem one]] - the unit
+ + \[[!trailitem two]] - the base of binary
+ + \[[!trailitem three|3]] - is a crowd
+
+and have that directive remember the pages in order.
+
+In both cases, a scan() hook could clear the list before starting to
+scan, then the inline or trailitem preprocessor directive could run in
+the scan stage as well as the render stage (in the case of inline,
+there'd be a very early return if trail=yes was not given, and
+an early return after collecting and sorting the pages if not
+actually rendering).
+
+This would mean that the contents of the trail, and a list of
+trails in which each page can be found, would already be in
+the pagestate by the time any page was rendered, so we'd be able
+to use them for output, either in a pagetemplate() hook or
+a \[[!trail]] preprocessor directive.
+
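The scan-then-render idea above could be modelled like this (a hypothetical Python sketch, not ikiwiki's real Perl hook API): the scan pass clears and refills an ordered member list per trail, from which next/previous links fall out naturally.

```python
# Hypothetical sketch, not ikiwiki's actual API: a scan pass records
# trail members in order; next/previous links are then derived per page.
pagestate = {}

def scan(trail):
    pagestate[trail] = []            # clear the list before rescanning

def trailitem(trail, page):
    pagestate[trail].append(page)    # remember members in source order

def neighbours(trail, page):
    members = pagestate[trail]
    i = members.index(page)
    prev = members[i - 1] if i > 0 else None
    nxt = members[i + 1] if i + 1 < len(members) else None
    return prev, nxt

scan("index")
for p in ["one", "two", "three"]:
    trailitem("index", p)
print(neighbours("index", "two"))    # ('one', 'three')
```

Because the lists are filled in before any page is rendered, a pagetemplate hook or a \[[!trail]] directive can look up its neighbours cheaply at render time.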
+This way, my album plugin could be turned inside out: instead
+of precomputing the pages to be inlined, then using
+[[pagenames|todo/inline plugin: specifying ordered page names]]
+to get them into the inline, it could just do the inline, then
+incorporate the output of \[[!trail]] into the template rendered
+for \[[!albumimage]] on each viewer page. (Also, the viewers
+wouldn't necessarily need to reference the album, only the other
+way round.)
+
+Using a pagetemplate() hook to stuff the next/previous links
+into page.tmpl would actually be a bit unfortunate for \[[!album]],
+because that plugin definitely wants to style the next/previous
+links as a thumbnail, which means there'd have to be a way to
+affect the style - perhaps by arranging for album's pagetemplate
+hook to run *after* trail's, or perhaps by having trail's
+pagetemplate hook disable itself for pages that contain
+a \[[!trail]] directive.
+
+I have now implemented this at [[plugins/contrib/trail]].
+What do you think? I'm still not sure how it would relate
+to [[plugins/contrib/album]], but if trail is reviewed
+and approved in principle, I'll try to adapt album as
+outlined above. --[[smcv]]
diff --git a/doc/todo/wikiwyg.mdwn b/doc/todo/wikiwyg.mdwn
new file mode 100644
index 000000000..602a1b436
--- /dev/null
+++ b/doc/todo/wikiwyg.mdwn
@@ -0,0 +1,71 @@
+[Wikiwyg](http://www.wikiwyg.net/)
+is a WYSIWYG editor written in javascript for wikis. It allows editing in a
+gui or in wikitext and converts edits back to wiki format to be saved to
+the wiki.
+
+It would be awesome to use this in ikiwiki, but to take full advantage of
+it with ikiwiki, it would need to know about MarkDown. Wikiwyg does allow
+defining the text that is stuck on each side of a given html element to
+make it wikified, for example, it can add "# " for a h1, "[[" and "]]" for
+a link, etc. This seems easily doable.
+
+The other thing that would need doing is a `saveChanges` function would
+need to be implemented that saves the text back to ikiwiki.
+http://svn.wikiwyg.net/code/trunk/wikiwyg/share/Kwiki/lib/Wikiwyg/Kwiki.js
+seems like a good starting point for building a submit form on the fly.
+
+One other problem: Wikiwyg works by parsing html from a div, turning it
+back into the wiki markup, and editing/saving that. That seems to assume
+that there's a way of parsing a page's html and getting back to the underlying
+wiki markup, which is not always the case in ikiwiki. Unless there's some
+other way to feed it the actual source for a page, this seems like a
+problem. According to the developers, it is possible to do that, and start
+off in WikiText mode.
+
+[[!tag soc]]
+
+[[!tag wishlist]]
+
+[[!tag patch]]
+
+Project IkiWiki::WIKIWYG v1.6 - <http://ikiwiki.xbaud.com/>
+===========================================================
+
+[Wikiwyg][] is a "What you see is what you get" editor for wikis. It will allow you to double click on the text in a wiki and save it without reloading the page. The IkiWiki version will allow you to edit your wiki in Markdown or WYSIWYG. You will also be able to edit a page with the wikiwyg editor by clicking the "Edit" link at the top of the page.
+
+This plugin requires that you have Archive::Tar installed.
+
+The plugin can be downloaded from <http://ikiwiki.xbaud.com/wikiwyg-1.6.tar.gz>
+
+### Installation Instructions
+
+1. Copy the files from the wikiwyg tarball to the corresponding folders in your ikiwiki installation
+2. Add "wikiwyg" (no quotes) to the add_plugins section of your ikiwiki.setup
+3. Rebuild your wiki
+4. That's it!
+
+### Current Issues
+
+* Some versions of Mozilla Firefox require you to click Wysiwyg, then another mode, then Wysiwyg again to get Wysiwyg mode working in the Edit link
+* Double lists don't work
+
+### Upcoming Features
+
+* More supported ikiwiki plugins (such as img, smilies, etc.)
+* More wiki languages (such as wiki and otl)
+* Ability to upload directly through wikiwyg (pending the upload plugin)
+* Personalized settings
+
+[Wikiwyg]: http://www.wikiwyg.net/
+
+> As noted in [[discussion]], the url above doesn't work, and I stupidly
+> lost my copy of this before merging it. I hope that this plugin will turn
+> back up. In the meantime, there is a wmd plugin that accomplishes the
+> same basic task of WYSIWYG markdown editing. --[[Joey]]
+
+>> Seems the new place is now at Github: <https://github.com/audreyt/wikiwyg-js>
+>> FYI
+>> --[[users/Olea]]
+
+>>> No, that's the wikiwyg source, not the ikiwiki plugin to use it. The
+>>> latter is what's lost. --[[Joey]]
diff --git a/doc/todo/wikiwyg/discussion.mdwn b/doc/todo/wikiwyg/discussion.mdwn
new file mode 100644
index 000000000..11c96a8b5
--- /dev/null
+++ b/doc/todo/wikiwyg/discussion.mdwn
@@ -0,0 +1,181 @@
+Very nice! There are some rough spots yes, but this looks exactly as I'd
+hoped it would, and seems close to being ready for merging.
+
+A few observations, in approximate order of priority:
+
+* What's the copyright and license of showdown? Please include that from
+ the original zip file.
+ * Done. Check licences folder
+* What happens if there are concurrent edits? The CGI.pm modification to
+ save an edited wikiwyg part doesn't seem to check if the source file has
+ changed in the meantime, so if the part has moved around, it might
+ replace the wrong part on saving. I've not tested this.
+ * When you click the edit button, the exact same protocol is used for saving.
+ However when you double click to edit, this still is possibly an issue.
+* The stuff you have in destdir now really belongs in basewiki so it's
+ copied over to any destdir.
+ * Done.
+* Personally, I'm not sure if I need double-click to edit a section in my
+ wiki, but I'd love it if the edit form in the cgi could use wikiwyg. Seems
+ like both of these could be independent options. Doable, I'm sure?
+ * Done.
+* It would be good to move as much as possible of the inlined javascript in
+ wikiwyg.tmpl out to a separate .js file to save space in the rendered
+ pages.
+ * Done.
+* Both this plugin and the [[Gallery]] are turning out
+ to need to add a bunch of pages to the basewiki. I wonder what would be a
+ good way to do this, without bloating the basewiki when the plugins aren't
+ used. Perhaps the underlaydir concept needs to be expanded so it's a set
+ of directories, which plugins can add to. Perhaps you should work with
+ arpitjain on this so both plugins can benefit. (The smiley plugin would
+ also benefit from this..)
+ * Done. All plugin files are now stored in a tarball. IkiWiki checks for
+ <plugin name>.tar.gz in the basedir and if the plugin is being used, then
+ it extracts the files to destdir. Currently IkiWiki does not render these
+ files though (my plugin doesn't need them to be rendered). However it wouldn't
+ be too hard to modify it to render them.
+* Is there any way of only loading enough of wikiwyg by default to catch
+ the section double-clicks, and have it load the rest on the fly? I'm
+ thinking about initial page load time when visiting a wikiwyg-using wiki
+ for the first time. I count 230k or so of data that a browser downloads
+ in that case..
+ * Done-ish. I fixed it so that all of the javascript files (except for the main two)
+ are loaded after the content is loaded. It is possible to make it so that
+ the files are only loaded when you double click, however that is *a lot*
+ more work, plus it will slow the load time for wikiwyg. But if you would
+ prefer that the files only load after double clicking, I can do that. Also,
+ I'm working on reducing the file sizes via [Javascript Compression][]. Theoretically,
+ I can get the size down to about 70kb, I'm working out the kinks now.
+
+--[[Joey]]
+
+Oh, by the way, let me know if I forgot to tarball anything. --[[TaylorKillian]]
+
+[Javascript Compression]: http://javascriptcompressor.com/
+
+---
+
+Some more comments, on version 1.6. You seem to be making nice progress.
+
+changes.diff:
+
+* I don't really like the tarball approach. Doesn't feel like the right
+ approach somehow. A list of underlay directories feels to me like a
+ better approach. One reason is that it's more general than a tarball tied
+ to a given plugin. A list of underlay directories could also be used to
+ prefer a translated underlay, and use the english version of untranslated
+ pages, for example.
+ * I don't quite get what you want to do with the underlay directory, it sounds like
+ you have something pretty specific in mind. I can talk to you about that more
+ on IRC later (assuming my internet is working right).
+ * Basically the idea is to change `$config{underlaydir}` to an array..
+ Ok, take a look at the new `add_underlay()` function. You can now just
+ `add_underlay("wikiwyg")` and it'll look in
+ /usr/share/ikiwiki/wikiwyg/ for the files.
+* When is the WIKIWYG variable in misc.tmpl used?
+ * The WIKIWYG variable in misc.tmpl is used for the edit page. I believe that is what
+ you wanted me to do (Check Revision 3840).
+ * Ah, right.
+* Could you move the code that handles saving a part of the page into the
+ plugin? I just added an editcontent hook, which should allow you to do
+ that.
+ * Alright, np.
+* Your patch exports run_hooks, but I don't see the plugin using that.
+ * Yeah, that was from an earlier revision of my plugin, I just forgot to remove that.
+* I don't know about exporting pagetitle. So far, only the inline plugin
+ needs to use that function, I generally only export things after it's
+ clear a lot of plugins will need them.
+ * Just looked through the inline plugin. So if I want to use pagetitle in my code,
+ I have to use the IkiWiki package instead of IkiWiki::Plugin::Wikiwyg? Or would a
+ better approach be to just copy that function into the Wikiwyg plugin?
+ * You can just call `IkiWiki::pagetitle()`.
+ > Note: pagetitle is now exported.
+
+wikiwyg.tar.gz
+
+* Would it be possible to provide a diff between wikiwyg upstream and any
+ modifications you made to its files? I'm not sure which version you used,
+ so I'm seeing changes in diffing that I'm unsure if you made..
+ * <http://ikiwiki.xbaud.com/JavaScript_Diffs.tar.gz>, also emailed them to you
+ in case my internet goes down.
+ * Could you redo that with diff -u plz?
+ * Link is updated
+* If the files aren't modified, would it be better for users to get them
+ from the wikiwyg upstream, instead of including them in the plugin? (If so,
+ they'd go in their own Debian package..)
+ * The files *are* modified, but I doubt it will make a difference. There have
+ been no updates to Wikiwyg since 5/30/07 so I'm pretty sure it's unmaintained
+ now. Showdown is the same case, they haven't changed anything since SoC began.
+ I could separate the diffs though if you feel it is worth it.
+ * Well, from a packaging perspective, the question is whether some
+ other package might want to use the wikiwyg/showdown javascript
+ files. And whether your mods might break that. If the answers to
+ these questions are yes and no, then it would make sense to package
+ them as standalone packages rather than embedding them in ikiwiki.
+
+misc:
+
+* What are your thoughts on handling plugins? Just make preview do a
+ server-side callback?
+ * That is an option, however I was trying to avoid that due to bandwidth, cpu time
+ concerns (Two reasons I really like IkiWiki). I was planning on just manually
+ implementing some of the easier ones (such as img), however I'm still trying to
+ think of a way for the more complex ones.
+ * It just seems like it would never be able to support everything,
+ and would mean reimplementing stuff in javascript and would constantly
+ need to be kept up to date. Ikiwiki's preview is actually pretty
+ fast, the only real overhead being the cgi call.
+* How do I configure it to only support whole-page editing with wikiwyg and
+ not insert the javascript into html pages?
+ * There currently is no option to do that, however it is a 2 line change that I'll work
+ on after I finish typing this.
+* When editing a whole page with wikiwyg, I think it would be good to keep
+ the save, preview, cancel buttons at the bottom like they are in a
+ regular page edit. Also the comments box. Kind of a least surprise thing, so that enabling
+ wikiwyg for whole-page editing basically just changes how the edit box
+ behaves and keeps the rest of the behavior the same. And I think the preview
+ button should show a preview rendered server-side, like with a regular edit,
+ since such a preview is able to support all plugins.
+ * That's probably a good idea ;)
+
+Everything else looks fine and ready for merging. If, that is, you think
+I should include the plugin with all of its javascript code in ikiwiki. Thoughts?
+
+--[[Joey]]
+
+I'll start working on the changes... Let me know if you find anything else
+that needs to be changed. I'd be honored to have my code merged with IkiWiki :)
+
+--[[TaylorKillian]]
+
+I wonder if you've had a chance to make any of the remaining changes above?
+Even just some of the smaller changes would be much easier for you to
+do than for me, and it would be nice to get them sorted out before I
+merge it into ikiwiki. --[[Joey]]
+
+None of the links for the WYSIWYG editor work anymore. Does anyone have an up to date link?
+Thanks, [[Greg]]
+
+> There's a branch in [[git]] for the wikiwyg stuff, which includes
+> the latest version I sucked in from TaylorKillian's svn repository before
+> it went offline. Disappointed that nothing seems to be moving here.
+> --[[Joey]]
+
+>> How far from ready did this seem to be at that point? I find it a bit unclear
+>> in the above discussion what was completed and what remained. Also, to recover the
+>> wikiwyg-specific stuff from git, it looks like I'd need to ask git for
+>> a diff between the wikiwyg branch and its branch point; is there a nice way to do
+>> that with gitweb, or would I need to install a full-fledged git client? --Chapman Flack
+
+>>> I think that the largest missing thing was support for using ikiwiki
+>>> to render page previews.
+>>>
+>>> Erm.. I seem to have screwed up the creation or pushing out of the
+>>> wikiwyg branch. It doesn't seem to have any of the wikiwyg changes in
+>>> it, and at this point, I don't know where to find them anymore! Damn,
+>>> damn, damn. I suspect I did that right when I was learning git, and
+>>> screwed up pushing the branch. :-( --[[Joey]]
+>>>> Seems the new place is now at Github: <https://github.com/audreyt/wikiwyg-js>
+>>>> FYI
+>>>> --[[users/Olea]]
diff --git a/doc/todo/wmd_editor_live_preview.mdwn b/doc/todo/wmd_editor_live_preview.mdwn
new file mode 100644
index 000000000..d76fb2ba4
--- /dev/null
+++ b/doc/todo/wmd_editor_live_preview.mdwn
@@ -0,0 +1,11 @@
+Some time ago there was [[a question|http://ikiwiki.info/forum/wmd_editor_double_preview/]] in the forum about wmd editor and preview. However there were no answers:
+
+I use the wmd editor in my ikiwiki. However the live preview does not seem to be a fully correct preview, so I nevertheless have to hit the preview button to get a correct one. But then I have two previews, and I have to scroll down to see the correct one.
+
+Is it possible to disable the live preview or to replace the live preview with the correct one after pressing the preview button?
+
+> There's another page already tracking this UI problem: [[mdwn_preview]]
+> There is a patch there, but AFAIK nobody has done any more work on
+> WMD integration with ikiwiki. --[[Joey]]
+
+>> I tried to apply the patch via git apply, however I get an error: fatal: corrupt patch at line 63, any idea?
diff --git a/doc/todo/wrapperuser.mdwn b/doc/todo/wrapperuser.mdwn
new file mode 100644
index 000000000..4c42b046f
--- /dev/null
+++ b/doc/todo/wrapperuser.mdwn
@@ -0,0 +1,7 @@
+ikiwiki's .setup file can specify wrappergroup, and ikiwiki will set the group
+of the wrappers accordingly. Having had people encounter difficulty before
+when trying to do the same thing with users (for instance, making all wrappers
+6755 ikiwiki:ikiwiki), I think it would help to have "wrapperuser". This could
+only actually take effect if building the wrappers as root (not really the best
+plan), but ikiwiki could at least warn if wrapperuser does not match the user
+the wrapper will end up with.
diff --git a/doc/translation.mdwn b/doc/translation.mdwn
new file mode 100644
index 000000000..9d874d98e
--- /dev/null
+++ b/doc/translation.mdwn
@@ -0,0 +1,46 @@
+If you want to translate your wiki into another language, there are
+essentially three pieces needed for a complete translation:
+
+1. The messages in the ikiwiki program itself need to be translated.
+ Ikiwiki is internationalised, and most such messages are already marked
+ with `gettext()`. The source tarball includes a `po/ikiwiki.pot`
+ that can be copied and translated as a po file. All very standard.
+
+ Note that a few things in the source are not currently translated. These
+ include:
+
+ * Error messages of the "should never happen" variety.
+ * Certain info in commit messages, which is not visible from inside the
+ wiki, but is visible in the commit logs. This needs to stay in English
+ so that ikiwiki can parse it back out of the logs.
+ * Some parts of FormBuilder forms, which should be translatable by
+ adding templates. Note that these forms don't need templates for the
+ English version.
+ * The name of the `index` page, which has a special meaning to browsers
+ anyway.
+ * The names of some other pages, like `sidebar` and `openid`.
+ * The names and values of parameters, both to the program, in the setup
+ file, and in preprocessor directives.
+
+1. The [[basewiki]] needs to be translated. The
+ [[plugins/po]] ikiwiki plugin will allow translating
+ wikis using po files and can be used for this.
+
+ There is now a website, [l10n.ikiwiki.info](http://l10n.ikiwiki.info)
+ that both demos the translated basewiki, and allows easy translation of
+ it.
+
+ To generate the po and pot files for translating the basewiki,
+ get ikiwiki's source, edit the `po/underlay.setup` file,
+ adding your language. Then run `make -C po underlays`.
+ This will generate many po files under `po/underlays`. The first
+ ones you'll want to translate are in the `po/underlays/basewiki` directory,
+ which is really not very large, just a few thousand words.
+ After that is done, you can tackle those under
+ `po/underlays/directives`, which are much larger (tens of
+ thousands of words).
+
+1. The templates also need to be translated. Some work has been done on an
+ infrastructure for maintaining translated templates, as documented in
+ [[todo/l10n]], but until that's complete, you'd need to copy and
+ translate the templates by hand.
diff --git a/doc/translation/discussion.mdwn b/doc/translation/discussion.mdwn
new file mode 100644
index 000000000..b274317cd
--- /dev/null
+++ b/doc/translation/discussion.mdwn
@@ -0,0 +1,121 @@
+[[!toc]]
+
+# A few questions about translating PO file
+
+I have a few questions about translating PO file:
+
+1. Assume I copied `ikiwiki.pot` file to `pl.po` file and translated it
+from English to Polish. How can I check that my `pl.po` file works well?
+I have some experience with building Debian packages, but I don't know
+too much about working with PO files in Debian packages.
+
+ > Try putting it into the po/ directory and running make and make install
+ > in there, that should create the .mo and install it somewhere
+ > appropriate. ikiwiki should display translated messages when building the
+ > wiki (with -v).
+
+2. I'll send you my translation when I finish it, of course. But what about
+updating my PO file? Should I send it to you for every ikiwiki issue?
+Maybe you should give write access to ikiwiki repository for translators
+of PO files?
+
+ > We use git now, so you can clone my repo, commit to your clone, and
+ > use git to mail me patches. --[[Joey]]
+
+3. What is the best way to update my PO file when you do some changes in
+`ikiwiki.pot` file? Should I translate my PO file from scratch or
+can I do diff for old and new `ikiwiki.pot` file and update only differences?
+
+ > There are standard tools for working with po files, and the po file
+ > should be updated as part of the wiki build process so that any fuzzy
+ > strings are so marked.
+
+ >> Could you please point me any good references or write a quick start
+ >> for translators? I think it can be very useful for me and other people.
+
+ >>> I'm not a translator, so I don't really know..
+
+ >>>> OK, I hope I can handle it :)
+
+4. What about "gettexting" button titles and link names? Do you really
+think that they should be hardcoded in ikiwiki templates? --[[Paweł|ptecza]]
+
+ > I don't know, really. Recai's approach seems to show promise.
+
+ >> BTW, why does ikiwiki number my questions wrongly (1., 1., 1., 1.,
+ >> instead of 1., 2., 3., 4.)? Where have I made a Markdown mistake? --[[Paweł|ptecza]]
+
+ >>> My indentation mistake, I think. --[[Joey]]
+
+ >>>> Now it's perfect :) Thank you very much! --[[Paweł|ptecza]]
+
+----
+
+# Less laconic gettext messages
+
+I'm just translating `ikiwiki.pot` file to Polish and I have
+problems with some gettext messages, because unfortunately
+there are very laconic, for example "update of %s's %s by %s".
+
+Sometimes I don't understand background well, because I don't use
+all ikiwiki plugins and I have to check ikiwiki source code.
+Besides in Polish language we have conjugation of a verb and
+I can't do it correctly if I don't know what subject of
+a message is. Then I have to check the sources again...
+
+So I have a request to Joey and the rest of ikiwiki coders:
+please write more verbose gettext messages and don't fear using
+subject there. It will be huge help for me and other ikiwiki
+translators. Thank you! :) --[[Paweł|ptecza]]
+
+> Well, those messages are mostly laconic because they're output by
+> ikiwiki running in unix program mode and other tight situations, and
+> it should be clear from context when you see the expanded message what
+> the various bits are.
+>
+> For example, "update of foowiki's MooBar by joey" seems to say enough to
+> be clear (and fit in mutt's subject line display), while the corresponding
+> "actualizado el wiki foowiki y la página MooBar por el usuario joey" feels
+> a bit verbose. (And should it say "updated foowiki *and* the MooBar page"
+> like that? My Spanish sucks though..) In my crappy Spanish I might instead
+> say something like "actualizado MooBar de foowiki por joey". Or maybe
+> "actualizado página Moobar por joey"?
+
+>> But you know that "update of %s's %s by %s" string can be "update of
+>> foowiki's MooBar by joey", but I can only guess it :)
+
+> Anyway, to get back to your point, it's true that translators often
+> need additional context about things like what variables expand to, and
+> size limits. This is generally done by adding comments in the pot file,
+> and I've turned that on, and added a few. --[[Joey]]
+
+>> Thank you very much! It also will be a big help for me. --[[Paweł|ptecza]]
+
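A small Python illustration of the problem with such format strings (an assumed example; ikiwiki itself is Perl): with anonymous `%s` placeholders a translation must keep the same argument order, whereas named placeholders would let the translator reorder words to fit the target language's grammar.

```python
# Illustrative only (ikiwiki is Perl; this mimics its C-style strings).
msg = "update of %s's %s by %s"
print(msg % ("foowiki", "MooBar", "joey"))
# -> update of foowiki's MooBar by joey

# With anonymous %s the translated string must consume arguments in the
# same order; named placeholders would let a translator move them around:
named = "update of %(page)s on %(wiki)s by %(user)s"
print(named % {"wiki": "foowiki", "page": "MooBar", "user": "joey"})
# -> update of MooBar on foowiki by joey
```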
+# Why .po files? Designing a more user-friendly alternative
+
+I think this plugin is very powerful, and very useful, especially for frontend or vanity sites. However, for live and dynamic sites, edited through the web interface (ie. a wiki!!), the .po file format is very limiting and counter-intuitive for newbies. I know I can fire up emacs and get a nice interface to translate it, but it's not really what I am looking for. .po formats aim to completely translate certain data, while I would expect a wiki to be a bit more lax.
+
+Basically, the problem is that the current plugin assumes the user is familiar with .po files and has a master language, two assumptions that I think are invalid in a lot of cases, especially in "bilingual" or multilingual countries.
+
+One of the canonical examples of functional translation tracking in a wiki is the LizzyWiki, a screencast of which can be seen here: <https://www.youtube.com/watch?v=42kHzyVKsZw> Some of those ideas were implemented in wikis like [[TikiWiki|http://tiki.org/i18n]]
+
+I believe that there are some parts the po plugin that could be reused for such a flexible translation system, in which all languages could be in any format. But basically to implement such a system, those things would be required:
+
+ 1. mapping between pages - this is accomplished by the current po plugin
+ 2. allow the user to flip between pages (other languages links...) - in the current po plugin
+ 3. create a translation based on another page - this is in the current po plugin, but the resulting page is a .po, it should be a regular wiki page
+ 4. track required translations - this is done by gettext in the current po plugin, so this would need to be redone if we change the format
+
+Step 4 is obviously the hard part - tracking changes between pages would involve extra metadata (maybe in the .ikiwiki directory?) to mark which commits have been translated. The metadata is not so much an issue as the user interface problems.
+
+So say when a user edits page foo.fr.mdwn that is a translation from page foo.en.mdwn, how does he/she tell that the translation is finished or not? The web UI could show the changes that have been done in foo.en.mdwn that need to be translated, and when the user saves the page, he/she is asked to confirm if the page is now completely translated.
+
+Through git only, this would seem to be harder. Maybe if the two translations are committed together they can be assumed to be completely translated, for one. Then maybe there is a way commits could be amended so that they are linked - maybe some tree merging magic here? My knowledge of git internals is a bit too rusty to elaborate on that, and anyway I feel that ikiwiki aims to be RCS-agnostic, so relying too much on those internals doesn't sound like a good idea.
+
+Obviously, this is a lot more work, diverging in a different direction than the current po-based approach, but it seems to me this is a much more natural system for users.
+
+Also, the thing with the above is that if functionalities 1 and 2 (mapping and page flipping) are stripped out of the po plugin and made reusable, functionalities 3 and 4 can be made optional and a wiki is still translatable, giving the user the responsibility of tracking the translations...
+
+So basically, what I think should happen is to have ikiwiki be able to use the .po plugin without .po files - just allow for pages to be linked together. Detecting foo.fr.mdwn when parsing foo.mdwn and creating links to it would already be a huge start... -- [[anarcat]]
+
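The page-mapping part of that could look roughly like this (an illustrative Python sketch, under the assumption that language variants are encoded in the filename, e.g. `foo.mdwn` alongside `foo.fr.mdwn`; not an ikiwiki API):

```python
import re

# Illustrative sketch (assumed filename convention, not an ikiwiki API):
# group pages that differ only by a language code before ".mdwn".
def variants(pages):
    groups = {}
    for p in pages:
        m = re.match(r'(.+?)(?:\.([a-z]{2}))?\.mdwn$', p)
        base, lang = m.group(1), m.group(2) or "en"
        groups.setdefault(base, {})[lang] = p
    return groups

pages = ["foo.mdwn", "foo.fr.mdwn", "bar.mdwn", "foo.es.mdwn"]
print(variants(pages)["foo"])
# {'en': 'foo.mdwn', 'fr': 'foo.fr.mdwn', 'es': 'foo.es.mdwn'}
```

From such a mapping, generating the "other languages" links (functionality 2) is just a lookup of the sibling entries in the same group.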
+> I have a hopefully clearer spec for a plugin called [[todo/translation_links]]. -- [[anarcat]]
diff --git a/doc/usage.mdwn b/doc/usage.mdwn
new file mode 100644
index 000000000..427a51f3b
--- /dev/null
+++ b/doc/usage.mdwn
@@ -0,0 +1,389 @@
+# NAME
+
+ikiwiki - a wiki compiler
+
+# SYNOPSIS
+
+ikiwiki [options] source destination
+
+ikiwiki --setup setupfile
+
+# DESCRIPTION
+
+`ikiwiki` is a wiki compiler. It builds static HTML pages for a wiki, from
+`source` in the [[ikiwiki/Markdown]] language (or others), and writes it out to
+`destination`.
+
+Note that most options can be shortened to single letters, and boolean
+flags such as --verbose can be negated with --no-verbose.
+
+# MODE OPTIONS
+
+These options control the mode that ikiwiki operates in.
+
+* --refresh
+
+ Refresh the wiki, updating any changed pages. This is the default
+ behavior so you don't normally need to specify it.
+
+* --rebuild
+
+ Force a rebuild of all pages.
+
+* --setup setupfile
+
+ The default action when --setup is specified is to automatically generate
+ wrappers for a wiki based on data in a setup file, and rebuild the wiki.
+ If you only want to build any changed pages, you can use --refresh with
+ --setup.
+
+* --changesetup setupfile
+
+ Reads the setup file, adds any configuration changes specified by other
+ options, and writes the new configuration back to the setup file. Also
+ updates any configured wrappers. In this mode, the wiki is not fully
+ rebuilt, unless you also add --rebuild.
+
+ Example, to enable some plugins:
+
+ ikiwiki --changesetup ~/ikiwiki.setup --plugin goodstuff --plugin calendar
+
+* --dumpsetup setupfile
+
+ Causes ikiwiki to write to the specified setup file, dumping out
+ its current configuration.
+
+* --wrappers
+
+ If used with --setup --refresh, this makes it also update any configured
+ wrappers.
+
+* --clean
+
+ This makes ikiwiki clean up by removing any files it generated in the
+ `destination` directory, as well as any configured wrappers, and the
+ `.ikiwiki` state directory. This is mostly useful if you're running
+ ikiwiki in a Makefile to build documentation and want a corresponding
+ `clean` target.
+
+* --cgi
+
+ Enable [[CGI]] mode. In cgi mode ikiwiki runs as a cgi script, and
+ supports editing pages, signing in, and registration.
+
+ To use ikiwiki as a [[CGI]] program you need to use --wrapper or --setup
+ to generate a wrapper. The wrapper will generally need to run suid 6755 to
+ the user who owns the `source` and `destination` directories.
+
+* --wrapper [file]
+
+ Generate a wrapper binary that is hardcoded to do action specified by
+ the other options, using the specified input files and `destination`
+ directory. The filename to use for the wrapper is optional.
+
+ The wrapper is designed to be safely made suid and be run by untrusted
+ users, as a [[post-commit]] hook, or as a [[CGI]].
+
+ Note that the generated wrapper will ignore all command line parameters.
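+
+  For example, to generate a CGI wrapper by hand (the paths here are
+  hypothetical, and using --setup is usually more convenient):
+
+      ikiwiki --wrapper ~/public_html/ikiwiki.cgi --cgi \
+          --url http://example.org/wiki/ ~/wiki ~/public_html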
+
+* --aggregate
+
+ If the [[plugins/aggregate]] plugin is enabled, this makes ikiwiki poll
+ configured feeds and save new posts to the srcdir.
+
+ Note that to rebuild previously aggregated posts, use the --rebuild option
+ along with this one. --rebuild will also force feeds to be polled even if
+ they were polled recently.
+
+* --render file
+
+ Renders a single file, outputting the resulting html. Does not save state,
+ so this cannot be used for building whole wikis, but it is useful for
+ previewing an edited file at the command line. Generally used in conjunction
+ with --setup to load in a wiki's setup:
+
+ ikiwiki --setup ~/ikiwiki.setup --render foo.mdwn
+
+* --post-commit
+
+ Run in post-commit mode, the same as if called by a [[post-commit]] hook.
+ This is probably only useful when using ikiwiki with a web server on one host
+ and a repository on another, to allow the repository's real post-commit
+ hook to ssh to the web server host and manually run ikiwiki to update
+ the web site.
+
+* --version
+
+ Print ikiwiki's version number.
+
+# CONFIG OPTIONS
+
+These options configure the wiki. Note that [[plugins]] can add additional
+configuration options of their own. All of these options and more besides can
+also be configured using a setup file.
+
+* --wikiname name
+
+ The name of the wiki, default is "wiki".
+
+* --templatedir dir
+
+ Specify the directory that [[templates|templates]] are stored in.
+ Default is `/usr/share/ikiwiki/templates`, or another location as configured at
+ build time. If the templatedir is changed, missing templates will still
+ be searched for in the default location as a fallback. Templates can also be
+ placed in the "templates/" subdirectory of the srcdir.
+
+ Note that if you choose to copy and modify ikiwiki's templates, you will need
+ to be careful to keep them up to date when upgrading to new versions of
+ ikiwiki. Old versions of templates do not always work with new ikiwiki
+ versions.
+
+* --underlaydir dir
+
+ Specify the directory that is used to underlay the source directory.
+ Source files will be taken from here unless overridden by a file in the
+ source directory. Default is `/usr/share/ikiwiki/basewiki` or another
+ location as configured at build time.
+
+* --wrappermode mode
+
+ Specify a mode to chmod the wrapper to after creating it.
+
+* --wrappergroup group
+
+ Specify what unix group the wrapper should be owned by. This can be
+ useful if the wrapper needs to be owned by a group other than the default.
+ For example, if a project has a repository with multiple committers with
+ access controlled by a group, it makes sense for the ikiwiki wrappers
+ to run setgid to that group.
+
+* --rcs=svn|git|.., --no-rcs
+
+ Enable or disable use of a [[revision_control_system|rcs]].
+
+  The `source` directory will be assumed to be a working copy, or clone, or
+  whatever the equivalent is for the revision control system you select.
+
+ In [[CGI]] mode, with a revision control system enabled, pages edited via
+ the web will be committed.
+
+ No revision control is enabled by default.
+
+* --svnrepo /svn/wiki
+
+ Specify the location of the svn repository for the wiki.
+
+* --svnpath trunk
+
+ Specify the path inside your svn repository where the wiki is located.
+ This defaults to `trunk`; change it if your wiki is at some other path
+ inside the repository. If your wiki is rooted at the top of the repository,
+ set svnpath to "".
+
+* --rss, --norss
+
+ If rss is set, ikiwiki will default to generating RSS feeds for pages
+ that inline a [[blog]].
+
+* --allowrss
+
+ If allowrss is set, and rss is not set, ikiwiki will not default to
+ generating RSS feeds, but setting `rss=yes` in the inline directive can
+ override this default and generate a feed.
+
+* --atom, --noatom
+
+ If atom is set, ikiwiki will default to generating Atom feeds for pages
+ that inline a [[blog]].
+
+* --allowatom
+
+  If allowatom is set, and atom is not set, ikiwiki will not default to
+ generating Atom feeds, but setting `atom=yes` in the inline directive can
+ override this default and generate a feed.
+
+* --pingurl URL
+
+ Set this to the URL of an XML-RPC service to ping when an RSS feed is
+ updated. For example, to ping Technorati, use the URL
+ http://rpc.technorati.com/rpc/ping
+
+ This parameter can be specified multiple times to specify more than one
+ URL to ping.
+
+* --url URL
+
+ Specifies the URL to the wiki. This is a required parameter in [[CGI]] mode.
+
+* --cgiurl http://example.org/ikiwiki.cgi
+
+ Specifies the URL to the ikiwiki [[CGI]] script wrapper. Required when
+ building the wiki for links to the cgi script to be generated.
+
+* --historyurl URL
+
+ Specifies the URL to link to for page history browsing. In the URL,
+ "\[[file]]" is replaced with the file to browse. It's common to use
+ [[ViewVC]] for this.
+
+* --adminemail you@example.org
+
+ Specifies the email address that ikiwiki should use for sending email.
+
+* --diffurl URL
+
+ Specifies the URL to link to for a diff of changes to a page. In the URL,
+ "\[[file]]" is replaced with the file to browse, "\[[r1]]" is the old
+ revision of the page, and "\[[r2]]" is the new revision. It's common to use
+ [[ViewVC]] for this.
+
+* --exclude regexp
+
+  Specifies a regexp of source files to exclude from processing.
+  May be specified multiple times to add to the exclude list.
+
+* --include regexp
+
+  Specifies a regexp of source files that would normally be excluded,
+  but that you wish to include in processing.
+  May be specified multiple times to add to the include list.
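+
+  For example, to skip all wav files except one (a hypothetical combination
+  of the two options; the patterns are Perl regexps):
+
+      ikiwiki --exclude '\.wav$' --include 'intro\.wav$' source destination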
+
+* --adminuser name
+
+ Specifies a username of a user (or, if openid is enabled, an openid)
+ who has the powers of a wiki admin. Currently allows locking of any page,
+ and [[banning|banned_users]] users, as well as powers granted by
+ enabled plugins (such as [[moderating comments|plugins/moderatedcomments]]
+  and [[plugins/websetup]]). May be specified multiple times for multiple
+ admins.
+
+ For an openid user specify the full URL of the login, including "http://".
+
+* --plugin name
+
+ Enables the use of the specified [[plugin|plugins]] in the wiki.
+ Note that plugin names are case sensitive.
+
+* --disable-plugin name
+
+ Disables use of a plugin. For example "--disable-plugin htmlscrubber"
+ to do away with HTML sanitization.
+
+* --libdir directory
+
+ Makes ikiwiki look in the specified directory first, before the regular
+ locations when loading library files and plugins. For example, if you set
+ libdir to "/home/you/.ikiwiki/", you can install a foo.pm plugin as
+ "/home/you/.ikiwiki/IkiWiki/Plugin/foo.pm".
+
+* --discussion, --no-discussion
+
+ Enables or disables "Discussion" links from being added to the header of
+ every page. The links are enabled by default.
+
+* --numbacklinks n
+
+ Controls how many backlinks should be displayed at the bottom of a page.
+ Excess backlinks will be hidden in a popup. Default is 10. Set to 0 to
+ disable this feature.
+
+* --userdir subdir
+
+  Optionally, allows links to users of the wiki to point to pages inside a
+  subdirectory of the wiki. The default is to link to pages in the toplevel
+  directory of the wiki.
+
+* --htmlext html
+
+ Configures the extension used for generated html files. Default is "html".
+
+* --timeformat format
+
+ Specify how to display the time or date. The format string is passed to the
+ strftime(3) function.
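+
+  For example, to display dates like "2013-07-11 02:49" (one possible
+  format; see strftime(3) for the full list of escapes):
+
+      ikiwiki --timeformat '%Y-%m-%d %H:%M' --setup ~/ikiwiki.setup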
+
+* --verbose, --no-verbose
+
+ Be verbose about what is being done.
+
+* --syslog, --no-syslog
+
+ Log to syslog(3).
+
+* --usedirs, --no-usedirs
+
+ Toggle creating output files named page/index.html (default) instead of page.html.
+
+* --prefix-directives, --no-prefix-directives
+
+ Toggle new '!'-prefixed syntax for preprocessor directives. ikiwiki currently
+ defaults to --prefix-directives.
+
+* --w3mmode, --no-w3mmode
+
+ Enable [[w3mmode]], which allows w3m to use ikiwiki as a local CGI script,
+ without a web server.
+
+* --sslcookie
+
+ Only send cookies over an SSL connection. This should prevent them being
+ intercepted. If you enable this option then you must run at least the
+ CGI portion of ikiwiki over SSL.
+
+* --gettime, --no-gettime
+
+  Extract creation and modification times for each new page from the
+  revision control log. This is done automatically when building a
+  wiki for the first time, so you normally do not need to use this option.
+
+* --set var=value
+
+ This allows setting an arbitrary configuration variable, the same as if it
+ were set via a setup file. Since most commonly used options can be
+ configured using command-line switches, you will rarely need to use this.
+
+* --set-yaml var=value
+
+ This is like --set, but it allows setting configuration variables that
+ use complex data structures, by passing in a YAML document.
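+
+  For example, to set a list-valued option using a YAML flow sequence
+  (the user names here are hypothetical):
+
+      ikiwiki --changesetup ~/ikiwiki.setup --set-yaml banned_users='[spammer, troll]'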
+
+# EXAMPLES
+
+* ikiwiki --setup my.setup
+
+ Completely (re)build the wiki using the specified setup file.
+
+* ikiwiki --setup my.setup --refresh
+
+ Refresh the wiki, using settings from my.setup, and avoid
+ rebuilding any pages that have not changed. This is faster.
+
+* ikiwiki --setup my.setup --refresh --wrappers
+
+  Refresh the wiki, including regenerating all wrapper programs,
+ but do not rebuild all pages. Useful if you have changed something
+ in the setup file that does not need a full wiki rebuild to update
+  all pages, but that you want to take effect immediately.
+
+# ENVIRONMENT
+
+* CC
+
+ This controls what C compiler is used to build wrappers. Default is 'cc'.
+
+* CFLAGS
+
+ This can be used to pass options to the C compiler when building wrappers.
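+
+  For example, to build wrappers with a different compiler and extra flags
+  (assuming clang is installed):
+
+      CC=clang CFLAGS="-O2 -Wall" ikiwiki --setup ~/ikiwiki.setup --refresh --wrappers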
+
+# SEE ALSO
+
+* [[ikiwiki-mass-rebuild]](8)
+* [[ikiwiki-update-wikilist]](1)
+* [[ikiwiki-transition]](1)
+
+# AUTHOR
+
+Joey Hess <joey@ikiwiki.info>
+
+Warning: Automatically converted into a man page by mdwn2man. Edit with care
diff --git a/doc/usage/discussion.mdwn b/doc/usage/discussion.mdwn
new file mode 100644
index 000000000..189d74eb0
--- /dev/null
+++ b/doc/usage/discussion.mdwn
@@ -0,0 +1 @@
+Man page does not document "account\_creation\_password". I started to add it, then noticed other configurations are not documented in the manual page either. --[[JeremyReed]]
diff --git a/doc/users.mdwn b/doc/users.mdwn
new file mode 100644
index 000000000..cc3cf5268
--- /dev/null
+++ b/doc/users.mdwn
@@ -0,0 +1,11 @@
+See [[IkiwikiUsers]] for the list of sites using ikiwiki.
+
+Users of this wiki, feel free to create a subpage of this one and talk
+about yourself on it, within reason. You can link to it to sign your
+comments.
+
+List of users
+=============
+[[!inline pages="users/* and !users/*/* and !*/Discussion"
+feeds=no archive=yes sort=title template=titlepage
+rootpage="users" postformtext="Add yourself as an ikiwiki user:"]]
diff --git a/doc/users/BerndZeimetz.mdwn b/doc/users/BerndZeimetz.mdwn
new file mode 100644
index 000000000..cf21dc585
--- /dev/null
+++ b/doc/users/BerndZeimetz.mdwn
@@ -0,0 +1,8 @@
+See [wiki.debian.org/BerndZeimetz](http://wiki.debian.org/BerndZeimetz) for details.
+
+<pre>
+ Bernd Zeimetz Debian GNU/Linux Developer
+ http://bzed.de http://www.debian.org
+ GPG Fingerprints: 06C8 C9A2 EAAD E37E 5B2C BE93 067A AD04 C93B FF79
+ ECA1 E3F2 8E11 2432 D485 DD95 EB36 171A 6FF9 435F
+</pre>
diff --git a/doc/users/Christine_Spang.mdwn b/doc/users/Christine_Spang.mdwn
new file mode 100644
index 000000000..223e9739d
--- /dev/null
+++ b/doc/users/Christine_Spang.mdwn
@@ -0,0 +1 @@
+Running ikiwiki on her [homepage](http://spang.cc/) and [blog](http://blog.spang.cc/).
diff --git a/doc/users/DamianSmall.mdwn b/doc/users/DamianSmall.mdwn
new file mode 100644
index 000000000..75177d7b3
--- /dev/null
+++ b/doc/users/DamianSmall.mdwn
@@ -0,0 +1 @@
+New Ikiwiki user.
diff --git a/doc/users/Daniel_Andersson.mdwn b/doc/users/Daniel_Andersson.mdwn
new file mode 100644
index 000000000..8565be17f
--- /dev/null
+++ b/doc/users/Daniel_Andersson.mdwn
@@ -0,0 +1,3 @@
+Heard about ikiwiki through Planet Debian and Joey Hess. Started playing with it, and am trying to report bugs and maybe even fix them, though my Perl experience is equal to zero.
+
+At the moment I'm fiddling with ikiwiki at <http://510x.se/notes>.
diff --git a/doc/users/DavidBremner.mdwn b/doc/users/DavidBremner.mdwn
new file mode 100644
index 000000000..8462b1b4f
--- /dev/null
+++ b/doc/users/DavidBremner.mdwn
@@ -0,0 +1 @@
+<http://www.cs.unb.ca/~bremner>
diff --git a/doc/users/David_Riebenbauer.mdwn b/doc/users/David_Riebenbauer.mdwn
new file mode 100644
index 000000000..d7469696e
--- /dev/null
+++ b/doc/users/David_Riebenbauer.mdwn
@@ -0,0 +1,8 @@
+Runs ikiwiki on his [homepage](http://liegesta.at/) and can be reached through
+<davrieb@liegesta.at>
+
+## Branches in his [[git]] repository ##
+
+* `autotag`
+([browse](http://git.liegesta.at/?p=ikiwiki.git;a=shortlog;h=refs/heads/autotag))
+See [[todo/auto-create_tag_pages_according_to_a_template]]
diff --git a/doc/users/Edward_Betts.mdwn b/doc/users/Edward_Betts.mdwn
new file mode 100644
index 000000000..61d6150ef
--- /dev/null
+++ b/doc/users/Edward_Betts.mdwn
@@ -0,0 +1,4 @@
+My watchlist:
+
+[[!inline archive="yes" sort="mtime" atom="yes" pages="todo/allow_wiki_syntax_in_commit_messages* or todo/shortcut_with_different_link_text* or todo/structured_page_data* or tips/convert_mediawiki_to_ikiwiki*"]]
+
diff --git a/doc/users/Erkan_Yilmaz.mdwn b/doc/users/Erkan_Yilmaz.mdwn
new file mode 100644
index 000000000..070a3a45a
--- /dev/null
+++ b/doc/users/Erkan_Yilmaz.mdwn
@@ -0,0 +1,2 @@
+* [[recentchanges]]
+* my site: [here](http://iaskquestions.com)
diff --git a/doc/users/Gianpaolo_Macario.mdwn b/doc/users/Gianpaolo_Macario.mdwn
new file mode 100644
index 000000000..203d75d28
--- /dev/null
+++ b/doc/users/Gianpaolo_Macario.mdwn
@@ -0,0 +1,14 @@
+Just started learning Ikiwiki...
+
+I have been using [MediaWiki](http://www.mediawiki.org/wiki/MediaWiki) for a long time, now I would like to switch to [Ikiwiki](http://ikiwiki.info/) because of:
+
++ git backend
++ MarkDown syntax
+
+I have browsed <http://www.wikimatrix.org/compare/ikiwiki+MediaWiki>, but I am still not sure about which useful features (if any) I would miss from MediaWiki...
+
+So the best way is actually trying it, isn't it?
+
+I would also try integrating Ikiwiki with [Infinote](http://infinote.org/) (or perhaps [jinfinote](http://www.jinfinote.com/)), in order to allow concurrent editing of pages.
+
+-- [Gianpaolo Macario](http://gmacario.altervista.org/)
diff --git a/doc/users/GiuseppeBilotta.mdwn b/doc/users/GiuseppeBilotta.mdwn
new file mode 100644
index 000000000..7b15da959
--- /dev/null
+++ b/doc/users/GiuseppeBilotta.mdwn
@@ -0,0 +1,6 @@
+Custom patches to IkiWiki can be found in select branches of
+<http://git.oblomov.eu>.
+
+Patches proposed/discussed:
+[[!map pages="link(users/GiuseppeBilotta) and (todo/* or bugs/*) and
+link(branches) and !link(todo/done)"]]
diff --git a/doc/users/HenrikBrixAndersen.mdwn b/doc/users/HenrikBrixAndersen.mdwn
new file mode 100644
index 000000000..bdf4c00a8
--- /dev/null
+++ b/doc/users/HenrikBrixAndersen.mdwn
@@ -0,0 +1,3 @@
+Henrik Brix Andersen AKA `brix` is <henrik@brixandersen.dk> - his weblog is available on <http://blog.brixandersen.dk/>.
+
+Brix maintains the FreeBSD port of ikiwiki, available in the FreeBSD ports collection under [www/ikiwiki](http://www.freshports.org/www/ikiwiki/). \ No newline at end of file
diff --git a/doc/users/Jamie.mdwn b/doc/users/Jamie.mdwn
new file mode 100644
index 000000000..12aa518a5
--- /dev/null
+++ b/doc/users/Jamie.mdwn
@@ -0,0 +1 @@
+[Jamie](http://current.workingdirectory.net/) is trying to unlearn php so he can write ikiwiki Plugins.
diff --git a/doc/users/JeremieKoenig.mdwn b/doc/users/JeremieKoenig.mdwn
new file mode 100644
index 000000000..5dc9edd57
--- /dev/null
+++ b/doc/users/JeremieKoenig.mdwn
@@ -0,0 +1,3 @@
+I'm planning to push for ikiwiki as a supplement for SPIP at some site.
+I will especially need good review and translations support in order to succeed.
+My email address is `jk@jk.fr.eu.org`. \ No newline at end of file
diff --git a/doc/users/Jimmy_Tang.mdwn b/doc/users/Jimmy_Tang.mdwn
new file mode 100644
index 000000000..a1402bcae
--- /dev/null
+++ b/doc/users/Jimmy_Tang.mdwn
@@ -0,0 +1 @@
+<http://www.sgenomics.org/~jtang>
diff --git a/doc/users/JoshBBall.mdwn b/doc/users/JoshBBall.mdwn
new file mode 100644
index 000000000..ed4b5b294
--- /dev/null
+++ b/doc/users/JoshBBall.mdwn
@@ -0,0 +1,3 @@
+My name is Joshua B. Ball. I can be contacted at [JoshBBall@gmail.com](mailto:JoshBBall@gmail.com). I love haskell.
+
+[Github](http://github.com/sciolizer)
diff --git a/doc/users/Kai_Hendry.mdwn b/doc/users/Kai_Hendry.mdwn
new file mode 100644
index 000000000..3a865762d
--- /dev/null
+++ b/doc/users/Kai_Hendry.mdwn
@@ -0,0 +1,5 @@
+I use ikiwiki on a bunch of sites:
+
+* [For co-ordinating a Web kiosk distribution](http://webconverger.org/)
+* [at work](http://wiki.webvm.net/)
+* [my tips collection](http://dabase.com/tips/)
diff --git a/doc/users/KarlMW.mdwn b/doc/users/KarlMW.mdwn
new file mode 100644
index 000000000..c058b13b0
--- /dev/null
+++ b/doc/users/KarlMW.mdwn
@@ -0,0 +1,3 @@
+[[!meta title="Karl Mowatt-Wilson"]]
+
+Working on an [asciidoc](http://www.methods.co.nz/asciidoc/) plugin for ikiwiki so I can use it for my [website](http://mowson.org/karl), where I'm documenting how to run linux on the HP/Compaq Evo T20 'thin client'. \ No newline at end of file
diff --git a/doc/users/KarlMW/discussion.mdwn b/doc/users/KarlMW/discussion.mdwn
new file mode 100644
index 000000000..4a111a3f9
--- /dev/null
+++ b/doc/users/KarlMW/discussion.mdwn
@@ -0,0 +1,27 @@
+When you edited [[ikiwikiusers]] all the utf-8 on the page was removed. Is this an issue with the web browser you used? I've fixed the utf-8 damage. --[[Joey]]
+
+> Ooops! Sorry - yes, my fault - I was using w3m, which spawned nano to edit the text, and I'm not setup for utf-8. I'll be more careful in future. --[[KarlMW]]
+
+## Asciidoc
+
+While I have your attention... I got my asciidoc plugin going, at least
+well enough to render my website without complaint. If you want to have a
+look at it, the plugin is at <http://mowson.org/karl/colophon>. Is it
+worthy of adding to the ikiwiki plugin collection? This is my first ever
+perl programming, so I may well have made absurd mistakes - if there are
+things that need changing then I will probably need help/guidance.
+--[[KarlMW]]
+
+> The main problem I see is the html escaping issue. This is not really
+> unique to asciidoc, see [[todo/format_escape]]. I wonder if the
+> technique provided by that patch could be used to let your plugin
+> automatically handle the escaping. Unfortunately, I have not yet gotten
+> around to reviewing/applying the patch. --[[Joey]]
+
+>> Escaping is indeed a problem with asciidoc - it seems to me that asciidoc still processes some things which have supposedly been escaped, although that may just be a matter of me misunderstanding something. Inline escaping is done with both prefix and suffix of "+++" - no way to nest it. Block escaping starts and ends with a line of "++++" (4 or more of "+").
+
+>> I suspect that asciidoc can't really be made to play nice to the extent that I would want casual users/abusers to have it as a markup option on a live wiki - it's fine for a personal site where you can look at the output before putting it online, but I think it would be a hideously gaping integrity hole for anything more than that. However, for a personal site (as I am using it), it does seem to have its uses.
+
+>> I'll keep an eye on the format_escape plugin, and assuming it is accepted into ikiwiki, will see if I can apply it to asciidoc. --[[KarlMW]]
+
+Is there any way to enable latexmath rendering? It seems that ikiwiki strips the necessary javascript and/or style sheet information from the HTML page generated by asciidoc. --Peter
diff --git a/doc/users/KathrynAndersen.mdwn b/doc/users/KathrynAndersen.mdwn
new file mode 100644
index 000000000..8e827b0da
--- /dev/null
+++ b/doc/users/KathrynAndersen.mdwn
@@ -0,0 +1,8 @@
+* aka [[rubykat]]
+* <http://kerravonsen.dreamwidth.org>
+* <http://www.katspace.org> (uses IkiWiki!)
+* <http://github.com/rubykat>
+* Also an active [PmWiki](http://www.pmwiki.org) user, interested in having the best of both worlds.
+
+Has written the following plugins:
+[[!map pages="!*/Discussion and ((link(users/KathrynAndersen) or link(users/rubykat)) and plugins/*) "]]
diff --git a/doc/users/KathrynAndersen/discussion.mdwn b/doc/users/KathrynAndersen/discussion.mdwn
new file mode 100644
index 000000000..4f2790c39
--- /dev/null
+++ b/doc/users/KathrynAndersen/discussion.mdwn
@@ -0,0 +1,20 @@
+Had a look at your site. Sprawling, individualistic, using ikiwiki in lots of
+ways. Makes me happy. :) I see that I have let a lot of contrib plugins
+pile up. I will try to get to these. I'm particularly interested in
+your use of yaml+fields. Encourage you to go ahead with any others you
+have not submitted here, like pmap. (Unless it makes more sense to submit
+that as a patch to the existing map plugin.) --[[Joey]]
+
+> Thanks. I would have put more up, but I didn't want to until they were properly documented, and other things have taken a higher priority.
+
+> I think pmap is probably better as a separate plugin, because it has additional dependencies (HTML::LinkList) which people might not want to have to install.
+
+>> One approach commonly used in ikiwiki is to make such optional features
+>> be enabled by a switch somewhere, and `eval q{use Foo}` so the module
+>> does not have to be loaded unless the feature is used. --[[Joey]]
+
+>>> Unfortunately, HTML::LinkList isn't an optional feature for pmap; that's what it uses to create the HTML for the map. --[[KathrynAndersen]]
+
+> The "includepage" plugin I'm not sure whether it is worth releasing or not; it's basically a cut-down version of "inline", because the inline plugin is so complicated and has so many options, I felt more at ease to have something simpler.
+
+> --[[KathrynAndersen]]
diff --git a/doc/users/Larry_Clapp.mdwn b/doc/users/Larry_Clapp.mdwn
new file mode 100644
index 000000000..ffa7f3c8a
--- /dev/null
+++ b/doc/users/Larry_Clapp.mdwn
@@ -0,0 +1,3 @@
+Added so I could sign my [bug report](bugs/map_doesn__39__t_calculate___34__common__95__prefix__34___correctly/) correctly.
+
+My email address is <larry@theclapp.org>. I'm trying to use ikiwiki for my home page, blog, and a small software project. I learned about it from Reddit.
diff --git a/doc/users/LucaCapello.mdwn b/doc/users/LucaCapello.mdwn
new file mode 100644
index 000000000..5ddccbf3d
--- /dev/null
+++ b/doc/users/LucaCapello.mdwn
@@ -0,0 +1,5 @@
+[Debian Developer](http://wiki.debian.org/LucaCapello)
+
+[homepage](http://luca.pca.it)
+
+[write me](mailto:luca@pca.it)
diff --git a/doc/users/MatthiasIhrke.mdwn b/doc/users/MatthiasIhrke.mdwn
new file mode 100644
index 000000000..0d54f27f0
--- /dev/null
+++ b/doc/users/MatthiasIhrke.mdwn
@@ -0,0 +1,4 @@
+*Love* ikiwiki!
+
+## Done here: ##
+* [[plugins/contrib/bibtex]]-plugin
diff --git a/doc/users/Mick_Pollard.mdwn b/doc/users/Mick_Pollard.mdwn
new file mode 100644
index 000000000..9c558e357
--- /dev/null
+++ b/doc/users/Mick_Pollard.mdwn
@@ -0,0 +1 @@
+.
diff --git a/doc/users/NeilSmithline.mdwn b/doc/users/NeilSmithline.mdwn
new file mode 100644
index 000000000..27370248e
--- /dev/null
+++ b/doc/users/NeilSmithline.mdwn
@@ -0,0 +1 @@
+Neil Smithline, that's me!
diff --git a/doc/users/NicolasLimare.mdwn b/doc/users/NicolasLimare.mdwn
new file mode 100644
index 000000000..56a950f7e
--- /dev/null
+++ b/doc/users/NicolasLimare.mdwn
@@ -0,0 +1 @@
+[[!meta redir="nil"]]
diff --git a/doc/users/Oblomov.mdwn b/doc/users/Oblomov.mdwn
new file mode 100644
index 000000000..be6e666cb
--- /dev/null
+++ b/doc/users/Oblomov.mdwn
@@ -0,0 +1 @@
+Getting started with Ikiwiki, like the git backend a lot, would like to see a dynamic version of it.
diff --git a/doc/users/Olea.mdwn b/doc/users/Olea.mdwn
new file mode 100644
index 000000000..1db2a7cf5
--- /dev/null
+++ b/doc/users/Olea.mdwn
@@ -0,0 +1,4 @@
+[[!meta title="Ismael Olea"]]
+
+Ismael Olea is <a href="mailto:ismael@olea.org">ismael@olea.org</a>.
+His web page is [here](http://olea.org/diario/).
diff --git a/doc/users/OscarMorante.mdwn b/doc/users/OscarMorante.mdwn
new file mode 100644
index 000000000..7ece36db0
--- /dev/null
+++ b/doc/users/OscarMorante.mdwn
@@ -0,0 +1,3 @@
+Oscar Morante <oscar@morante.eu>
+
+[homepage](http://oscar.morante.eu)
diff --git a/doc/users/Perry.mdwn b/doc/users/Perry.mdwn
new file mode 100644
index 000000000..d10b8621f
--- /dev/null
+++ b/doc/users/Perry.mdwn
@@ -0,0 +1 @@
+Just another IkiWiki user.
diff --git a/doc/users/Ramsey.mdwn b/doc/users/Ramsey.mdwn
new file mode 100644
index 000000000..696f00bf6
--- /dev/null
+++ b/doc/users/Ramsey.mdwn
@@ -0,0 +1,3 @@
+Hi everyone, I'm Ramsey. I am a web developer/aerospace engineer in order of preference.
+
+I am using ikiwiki to create a [[blog|http://blog.coderfly.com]] and [[wiki|http://wiki.coderfly.com]]
diff --git a/doc/users/Remy.mdwn b/doc/users/Remy.mdwn
new file mode 100644
index 000000000..5cde4c43d
--- /dev/null
+++ b/doc/users/Remy.mdwn
@@ -0,0 +1 @@
+Test page
diff --git a/doc/users/RickOwens.mdwn b/doc/users/RickOwens.mdwn
new file mode 100644
index 000000000..c619569d8
--- /dev/null
+++ b/doc/users/RickOwens.mdwn
@@ -0,0 +1 @@
+I'm a Systems Analyst in Montana. I use ikiwiki as a private notebook/journal/worklog/etc.
diff --git a/doc/users/Simon_Michael.mdwn b/doc/users/Simon_Michael.mdwn
new file mode 100644
index 000000000..3fe11aaad
--- /dev/null
+++ b/doc/users/Simon_Michael.mdwn
@@ -0,0 +1,8 @@
+Simon Michael (<simon@joyful.com>, sm on freenode) is a free software developer and consultant.
+His site is at <http://joyful.com>.
+He is the lead developer of the [Zwiki](http://zwiki.org) zope-based wiki engine, and also an ikiwiki fan.
+
+Favourite ikiwiki features: efficient/robust static html and rcs integration.
+Least favourite ikiwiki features: unstable hierarchical urls and setup complexity.
+He is [interested](http://ikiwiki.info/rcs/details/#index3h2) in getting a
+robust [darcs](http://ikiwiki.info/todo/darcs/) back end working.
diff --git a/doc/users/Stefano_Zacchiroli.mdwn b/doc/users/Stefano_Zacchiroli.mdwn
new file mode 100644
index 000000000..b874df9a3
--- /dev/null
+++ b/doc/users/Stefano_Zacchiroli.mdwn
@@ -0,0 +1 @@
+<http://upsilon.cc/~zack>
diff --git a/doc/users/StevenBlack.mdwn b/doc/users/StevenBlack.mdwn
new file mode 100644
index 000000000..ea7a6a97a
--- /dev/null
+++ b/doc/users/StevenBlack.mdwn
@@ -0,0 +1,5 @@
+It feels like there are a lot of people named Steven Black. While I'm just one of many with my name, sometimes it is actually just me and I've forgotten that I had an account somewhere.
+
+I'm not a doctor, though I would certainly trust any doctor, dentist, or philosopher named Steven Black. (There are several.)
+
+I *am* a huge Ikiwiki fan. I've had my eye on it for many years for personal projects (though I never quite got around to installing it). Recently, however, I managed to convince my coworkers that it would be a good idea for an internal wiki. Boy was I right. The thing is practically designed to be the perfect developer-centered wiki.
diff --git a/doc/users/TaylorKillian.mdwn b/doc/users/TaylorKillian.mdwn
new file mode 100644
index 000000000..71e257db4
--- /dev/null
+++ b/doc/users/TaylorKillian.mdwn
@@ -0,0 +1,9 @@
+[[!meta title="Taylor Killian"]]
+
+Hi,
+
+I'm Taylor Killian, I got selected to work on IkiWiki as part of [Google Summer of Code](http://code.google.com/soc). My goal is to implement [[todo/wikiwyg]] as a plugin for IkiWiki. I have my own test IkiWiki set up at <http://ikiwiki.xbaud.com/> and will be doing most of my work there.
+
+I am currently enrolled at [Columbus State Community College](http://www.cscc.edu/) as a freshman and plan on attending [Wright State](http://www.wright.edu/) next year in pursuit of a degree in Computer Science & Engineering.
+
+Taylor Killian \ No newline at end of file
diff --git a/doc/users/TaylorKillian/discussion.mdwn b/doc/users/TaylorKillian/discussion.mdwn
new file mode 100644
index 000000000..b96a21b8e
--- /dev/null
+++ b/doc/users/TaylorKillian/discussion.mdwn
@@ -0,0 +1,5 @@
+You're in Ohio.. I wonder if you've any plans to attend the Ohio LinuxFest this year. I've been thinking about going, and it would be nice to meet ikiwiki people. --[[Joey]]
+
+Yeah, definitely! I went last year, it was pretty cool. The only part I was disappointed about was the fact that one of the speakers didn't show up (but we got to learn about mod_rewrite instead ;) . I'm only about 15 minutes away, so I have no excuse not to attend ;) Oh, just registered btw. --[[TaylorKillian]]
+
+I'm still thinking about going, maybe I'll see you there. --[[Joey]]
diff --git a/doc/users/The_TOVA_Company.mdwn b/doc/users/The_TOVA_Company.mdwn
new file mode 100644
index 000000000..24cd4dec9
--- /dev/null
+++ b/doc/users/The_TOVA_Company.mdwn
@@ -0,0 +1,32 @@
+We're a small medical software and hardware company. We're based in LA, but all of our developers are up here in Portland. We chose ikiwiki for both our [public web site](http://www.tovatest.com/) and for our internal wikis. We love it, especially its being based on git: it lets us merge our document repository with our wiki, which is kind of neat.
+
+We're looking for the following "standard" wiki features for some of our git-disabled users:
+
+- [[Attachments|todo/fileupload]]
+ - Must have an attachment link (of whatever kind) on each page.
+ - May have the link be disabled by some security setting.
+ - Must be able to upload a file to the same directory as the source file (e.g., the .mdwn file).
+ - Might be able to choose arbitrary location.
+ - Might scan for file information, and disallow security risks (if so, this must be configurable).
+ - Must be able to delete files.
+ - Might be able to rename them and/or move attached files.
+ - Must be able to list attachments (that is, files in the directory)
+ - Should list file names, sizes, and date uploaded in a small table.
+
+- Page renaming/moving/deleting
+ > There is an (old) [[patch|todo/Moving_pages]] for this, still needing
+ > significant cleanups and improvements to be rcs-agnostic, etc. --[[Joey]]
+
+- inter-wiki links (specifically for having to-do lists for people across multiple wikis)
+ > Something more than the shortcut plugin? Maybe you should spell out
+ > exactly what you want in a [[todo]] item. --[[Joey]]
+
+- Some cleanups on other plugins:
+ - Add the ability to use the meta title to the map plugin.
+ > Patch [[exists|plugins/map/discussion]], just needs some cleanup.
+ > Now done. --[[Joey]]
+
+----
+Looking for a part-time ikiwiki developer
+
+The TOVA Company, a small medical software and hardware company in Portland, Oregon, is looking for developers to add functionality to ikiwiki. We're looking for developers who are already familiar with ikiwiki development, including plugins, and who would be willing to work on a part-time, non-employee, project-based basis for each of the small features that we want. The [[features_we're_interested_in|users/The_TOVA_Company]] would obviously be GPL'd, and released to the community (if they'll have them :) ). Please contact Andrew Greenberg (andrew@thetovacompany) if you're interested. Thanks!
diff --git a/doc/users/TimBosse.mdwn b/doc/users/TimBosse.mdwn
new file mode 100644
index 000000000..bd459cc80
--- /dev/null
+++ b/doc/users/TimBosse.mdwn
@@ -0,0 +1 @@
+<http://bosboot.org>
diff --git a/doc/users/Tim_Lavoie.mdwn b/doc/users/Tim_Lavoie.mdwn
new file mode 100644
index 000000000..90df011c6
--- /dev/null
+++ b/doc/users/Tim_Lavoie.mdwn
@@ -0,0 +1 @@
+Hey... I'm just starting to use ikiwiki, but am happy to find it repeatedly doing the sorts of things in a way which makes sense to me. (e.g. most pages are static, DVCS for file store etc.)
diff --git a/doc/users/Will.mdwn b/doc/users/Will.mdwn
new file mode 100644
index 000000000..1956263e0
--- /dev/null
+++ b/doc/users/Will.mdwn
@@ -0,0 +1,28 @@
+I started using Ikiwiki as a way to replace [Trac](http://trac.edgewall.org/) when using [Monotone](http://monotone.ca/). Version control has been an interest of mine for a while and I wrote most of the ikiwiki [[rcs/monotone]] plugin. I'm not actively working on the Monotone plugin any more.
+
+Lately I've been using Ikiwiki for other things and seem to be scratching a few itches here and there. :)
+
+I generally use my [[ikiwiki/openid]] login when editing here: <http://www.cse.unsw.edu.au/~willu/> or <http://www.google.com/profiles/will.uther>.
+
+I have a git repository for some of my IkiWiki code: <http://www.cse.unsw.edu.au/~willu/ikiwiki.git>.
+
+Generic License Grant
+-----------------
+
+Unless otherwise specified, any code that I post to this wiki I release under the GPL2+. Any non-code patches I post are released under [[standard ikiwiki licenses|freesoftware]].
+
+------
+
+Disabling these as I'm not using them much any more...
+
+### Open Bugs:
+
+\[[!inline pages="link(users/Will) and bugs/* and !bugs/done and !bugs/discussion and !link(patch) and !link(bugs/done) and !bugs/*/*" archive="yes" feeds="no" ]]
+
+### Open ToDos:
+
+\[[!inline pages="link(users/Will) and todo/* and !todo/done and !todo/discussion and !link(patch) and !link(todo/done) and !bugs/*/*" archive="yes" feeds="no" ]]
+
+### Unapplied Patches:
+
+\[[!inline pages="link(users/Will) and (todo/* or bugs/*) and !bugs/done and !bugs/discussion and !todo/done and !todo/discussion and link(patch) and !link(bugs/done) and !link(todo/done) and !bugs/*/*" archive="yes" feeds="no" ]]
diff --git a/doc/users/acathur.mdwn b/doc/users/acathur.mdwn
new file mode 100644
index 000000000..fc3768ee1
--- /dev/null
+++ b/doc/users/acathur.mdwn
@@ -0,0 +1,3 @@
+Today I finally managed to set up and use ikiwiki the way I intended to, after thinking about it for 3 years or more!
+This is to celebrate that, and to hopefully contribute to ikiwiki in any possible way.
+
diff --git a/doc/users/adamshand.mdwn b/doc/users/adamshand.mdwn
new file mode 100644
index 000000000..acb2290ca
--- /dev/null
+++ b/doc/users/adamshand.mdwn
@@ -0,0 +1,7 @@
+[[!meta title="Adam Shand"]]
+
+New ikiwiki user (well not really "new" anymore), long time wiki user. :-)
+
+<http://adam.shand.net/>
+
+[[!map pages="link(AdamShand)"]]
diff --git a/doc/users/ajt.mdwn b/doc/users/ajt.mdwn
new file mode 100644
index 000000000..61efbf8b6
--- /dev/null
+++ b/doc/users/ajt.mdwn
@@ -0,0 +1,20 @@
+[[!meta title="Adam Trickett"]]
+
+# Adam Trickett
+
+## "ajt"
+
+I'm a long-time hacker of sorts; I like to program in Perl on Debian systems, but work pays me to program in ABAP (COBOL) on SAP.
+
+I like wikis and I'm currently in love with ikiwiki, having moved my home intranet from a home-made template solution to ikiwiki over a weekend. I'm using ikiwiki more as a web content management system (e.g. RedDot) than as a traditional wiki.
+
+### My Links
+
+* [iredale dot net](http://www.iredale.net/) my web server and main blog
+* [ajt](http://www.perlmonks.org/index.pl?node_id=113686) my PerlMonks home node
+* [ATRICKETT](http://search.cpan.org/~atrickett/) my CPAN folder
+* [ajt](http://www.debian-administration.org/users/ajt) my Debian-Administration home (good site btw)
+* [ajt](http://www.blipfoto.com/ajt) my blipfoto photo blog
+* [drajt](http://www.linkedin.com/in/drajt) my LinkedIn profile
+* [drajt](http://www.slideshare.net/drajt) my "Slidespace" on SlideShare
+* [AdamTrickett](http://www.hants.lug.org.uk/cgi-bin/wiki.pl?AdamTrickett) my wiki page on my LUG's site
diff --git a/doc/users/aland.mdwn b/doc/users/aland.mdwn
new file mode 100644
index 000000000..5980675a9
--- /dev/null
+++ b/doc/users/aland.mdwn
@@ -0,0 +1 @@
+# aland
diff --git a/doc/users/alexander.mdwn b/doc/users/alexander.mdwn
new file mode 100644
index 000000000..b2894a90c
--- /dev/null
+++ b/doc/users/alexander.mdwn
@@ -0,0 +1 @@
+I use ikiwiki to organize information - projects, reading notes, outlines, todo lists, etc.
diff --git a/doc/users/alexandredupas.mdwn b/doc/users/alexandredupas.mdwn
new file mode 100644
index 000000000..f85775d37
--- /dev/null
+++ b/doc/users/alexandredupas.mdwn
@@ -0,0 +1,7 @@
+# Alexandre Dupas
+
+Soon to become a user of ikiwiki!
+
+Currently working on a plugin to display a BibTeX file for my publication list using [Text::BibTeX](http://search.cpan.org/dist/Text-BibTeX/).
+
+More news soon!
diff --git a/doc/users/anarcat.mdwn b/doc/users/anarcat.mdwn
new file mode 100644
index 000000000..2bd50c76b
--- /dev/null
+++ b/doc/users/anarcat.mdwn
@@ -0,0 +1,31 @@
+See <https://wiki.koumbit.net/TheAnarcat>
+
+[[!toc]]
+
+My todos
+========
+
+... or the ones I commented on, to be more precise.
+
+[[!inline pages="todo/* and !todo/done and !link(todo/done) and
+link(users/anarcat) and !todo/*/*" sort=mtime feeds=no actions=yes archive=yes show=0]]
+
+Done
+----
+
+[[!inline pages="todo/* and !todo/done and link(todo/done) and
+link(users/anarcat) and !todo/*/*" feeds=no actions=yes archive=yes show=0]]
+
+My bugs
+=======
+
+... same.
+
+[[!inline pages="bugs/* and !bugs/done and !link(bugs/done) and
+link(users/anarcat) and !bugs/*/*" sort=mtime feeds=no actions=yes archive=yes show=0]]
+
+Fixed
+-----
+
+[[!inline pages="bugs/* and !bugs/done and link(bugs/done) and
+link(users/anarcat) and !bugs/*/*" feeds=no actions=yes archive=yes show=0]]
diff --git a/doc/users/anarcat.wiki b/doc/users/anarcat.wiki
new file mode 100644
index 000000000..7ef474ed6
--- /dev/null
+++ b/doc/users/anarcat.wiki
@@ -0,0 +1 @@
+Hello! I'm anarcat. See [[https://wiki.koumbit.net/TheAnarcat]] to learn more about me.
diff --git a/doc/users/arpitjain.mdwn b/doc/users/arpitjain.mdwn
new file mode 100644
index 000000000..5632806b4
--- /dev/null
+++ b/doc/users/arpitjain.mdwn
@@ -0,0 +1,7 @@
+[[!meta title="Arpit Jain"]]
+Hi,
+
+I am Arpit Jain. I am a final-year B.Tech/M.Tech (Dual Degree) student in the Department of Computer Science and Engineering, Indian Institute of Technology, Kharagpur.
+I will be working on the [[todo/Gallery]] project for ikiwiki this summer.
+
+More info about me can be found at <http://www.arpitjain.com>. \ No newline at end of file
diff --git a/doc/users/bartmassey.mdwn b/doc/users/bartmassey.mdwn
new file mode 100644
index 000000000..55e64fdd0
--- /dev/null
+++ b/doc/users/bartmassey.mdwn
@@ -0,0 +1,5 @@
+# Bart Massey
+bart@cs.pdx.edu
+Assoc Prof Computer Science, Portland State University
+
+See http://www.cs.pdx.edu/~bart for details. \ No newline at end of file
diff --git a/doc/users/bbb.mdwn b/doc/users/bbb.mdwn
new file mode 100644
index 000000000..933ba78e1
--- /dev/null
+++ b/doc/users/bbb.mdwn
@@ -0,0 +1,5 @@
+[[!meta title="Bruno Beaufils"]]
+
+Bruno Beaufils is **<bruno@boulgour.com>**.
+
+You can find me at [work](http://www.lifl.fr/~beaufils) or at [home](http://bruno.boulgour.com).
diff --git a/doc/users/blipvert.mdwn b/doc/users/blipvert.mdwn
new file mode 100644
index 000000000..7c4a24ba1
--- /dev/null
+++ b/doc/users/blipvert.mdwn
@@ -0,0 +1 @@
+<http://github.com/blipvert>
diff --git a/doc/users/bstpierre.mdwn b/doc/users/bstpierre.mdwn
new file mode 100644
index 000000000..327d25016
--- /dev/null
+++ b/doc/users/bstpierre.mdwn
@@ -0,0 +1 @@
+Brian St. Pierre is **<brian@bstpierre.org>**
diff --git a/doc/users/cfm.mdwn b/doc/users/cfm.mdwn
new file mode 100644
index 000000000..4feab9601
--- /dev/null
+++ b/doc/users/cfm.mdwn
@@ -0,0 +1 @@
+I maintain a [home page](http://www.panix.com/~cfm/ "Cory Myers").
diff --git a/doc/users/chris.mdwn b/doc/users/chris.mdwn
new file mode 100644
index 000000000..ec0e1451e
--- /dev/null
+++ b/doc/users/chris.mdwn
@@ -0,0 +1,7 @@
+[[!meta title="Chris Green"]]
+
+Chris is Chris Green, an ancient C/C++/Java programmer; I started around 1982 or 1983.
+
+I was programming even before that, in assembler and the like; I first programmed for a living in about 1970.
+
+I'm considering using ikiwiki for keeping notes and maybe a ToDo list etc. \ No newline at end of file
diff --git a/doc/users/chrismgray.mdwn b/doc/users/chrismgray.mdwn
new file mode 100644
index 000000000..b0084830d
--- /dev/null
+++ b/doc/users/chrismgray.mdwn
@@ -0,0 +1,4 @@
+I'm Chris Gray. I have an ikiwiki-based blog at
+[[http://chrismgray.github.com]]. I wrote a plugin for
+[[org-mode|todo/org_mode]] files that is probably the first ikiwiki
+plugin written mostly in emacs lisp.
diff --git a/doc/users/chrysn.mdwn b/doc/users/chrysn.mdwn
new file mode 100644
index 000000000..0daa3b2b9
--- /dev/null
+++ b/doc/users/chrysn.mdwn
@@ -0,0 +1,4 @@
+* **name**: chrysn
+* **website**: <http://christian.amsuess.com/>
+* **uses ikiwiki for**: a bunch of internal documentation / organization projects
+* **likes ikiwiki because**: it is a distributed organization tool that pretends to be a web app for the non-programmers out there
diff --git a/doc/users/cord.mdwn b/doc/users/cord.mdwn
new file mode 100644
index 000000000..c8775d607
--- /dev/null
+++ b/doc/users/cord.mdwn
@@ -0,0 +1 @@
+http://Cord.de
diff --git a/doc/users/cstamas.mdwn b/doc/users/cstamas.mdwn
new file mode 100644
index 000000000..64a173feb
--- /dev/null
+++ b/doc/users/cstamas.mdwn
@@ -0,0 +1,4 @@
+Csillag Tamas's (cstamas) web pages are available here:
+
+* <http://digitus.itk.ppke.hu/~cstamas/> (mostly hungarian)
+* <http://users.itk.ppke.hu/~cstamas/code/ikiwiki/> (patches I made for ikiwiki or hacks I use for my own wiki)
diff --git a/doc/users/dark.mdwn b/doc/users/dark.mdwn
new file mode 100644
index 000000000..e1d06d0b0
--- /dev/null
+++ b/doc/users/dark.mdwn
@@ -0,0 +1,3 @@
+[[!meta title="Richard Braakman"]]
+
+Lars Wirzenius convinced me to try ikiwiki for blogging :)
diff --git a/doc/users/dato.mdwn b/doc/users/dato.mdwn
new file mode 100644
index 000000000..87b49ebf9
--- /dev/null
+++ b/doc/users/dato.mdwn
@@ -0,0 +1,3 @@
+[[!meta title="Adeodato Simó"]]
+
+<http://chistera.yi.org/~adeodato>
diff --git a/doc/users/dirk.mdwn b/doc/users/dirk.mdwn
new file mode 100644
index 000000000..80c2725b0
--- /dev/null
+++ b/doc/users/dirk.mdwn
@@ -0,0 +1 @@
+Dirk is a computer consultant living in West Sussex who does a fair amount of Perl and a lot of C.
diff --git a/doc/users/dom.mdwn b/doc/users/dom.mdwn
new file mode 100644
index 000000000..c75435679
--- /dev/null
+++ b/doc/users/dom.mdwn
@@ -0,0 +1,3 @@
+<http://www.larted.org.uk/~dom>
+
+Just another ikiwiki user.
diff --git a/doc/users/donmarti.mdwn b/doc/users/donmarti.mdwn
new file mode 100644
index 000000000..bafec7170
--- /dev/null
+++ b/doc/users/donmarti.mdwn
@@ -0,0 +1,2 @@
+Don Marti home page: <http://zgp.org/~dmarti/> email: <dmarti@zgp.org>
+
diff --git a/doc/users/emptty.mdwn b/doc/users/emptty.mdwn
new file mode 100644
index 000000000..08ef7d0f3
--- /dev/null
+++ b/doc/users/emptty.mdwn
@@ -0,0 +1,2 @@
+My professional homepage is [here](http://www.loria.fr/~quinson/). I'm currently (09/09) trying to move it from WML to ikiwiki.
+I'm sure I'll have a bunch of ideas, requests and maybe even patches in the process.
diff --git a/doc/users/ericdrechsel.mdwn b/doc/users/ericdrechsel.mdwn
new file mode 100644
index 000000000..2efb7039c
--- /dev/null
+++ b/doc/users/ericdrechsel.mdwn
@@ -0,0 +1 @@
+[My homewiki profile](http://wiki.shared.dre.am/people/eric/)
diff --git a/doc/users/fil.mdwn b/doc/users/fil.mdwn
new file mode 100644
index 000000000..b17399383
--- /dev/null
+++ b/doc/users/fil.mdwn
@@ -0,0 +1 @@
+http://hands.com/~phil
diff --git a/doc/users/fmarier.mdwn b/doc/users/fmarier.mdwn
new file mode 100644
index 000000000..ecf342697
--- /dev/null
+++ b/doc/users/fmarier.mdwn
@@ -0,0 +1,6 @@
+# François Marier
+
+Free Software and Debian Developer. Lead developer of [Libravatar](http://www.libravatar.org)
+
+* [Blog](http://feeding.cloud.geek.nz)
+* [Identica](http://identi.ca/fmarier) / [Twitter](http://twitter.com/fmarier)
diff --git a/doc/users/harishcm.mdwn b/doc/users/harishcm.mdwn
new file mode 100644
index 000000000..292a3bfad
--- /dev/null
+++ b/doc/users/harishcm.mdwn
@@ -0,0 +1 @@
+Using ikiwiki for my personal website <http://harish.19thsc.com>
diff --git a/doc/users/harningt.mdwn b/doc/users/harningt.mdwn
new file mode 100644
index 000000000..d4ef07658
--- /dev/null
+++ b/doc/users/harningt.mdwn
@@ -0,0 +1,11 @@
+[[!meta title="Thomas Harning Jr"]]
+
+I began using ikiwiki because it ties into git... and so far it's working great!
+
+<http://www.eharning.us/> is my homepage
+
+OpenID:
+
+<http://openid.eharning.us/> or <http://harningt.eharning.us/>
+
+Refs <https://openid.trustbearer.com/harningt>
diff --git a/doc/users/hb.mdwn b/doc/users/hb.mdwn
new file mode 100644
index 000000000..c3e52da6d
--- /dev/null
+++ b/doc/users/hb.mdwn
@@ -0,0 +1,11 @@
+[[!meta title="Hugues Bernard"]]
+
+For now I'm using ikiwiki just for my personal needs:
+
+* gtd/todo management
+* time tracking
+* projects notes
+
+on 2 different laptops, not always connected or in sync ;)
+
+--hb (Hugues Bernard) \ No newline at end of file
diff --git a/doc/users/hb/discussion.mdwn b/doc/users/hb/discussion.mdwn
new file mode 100644
index 000000000..3214e1521
--- /dev/null
+++ b/doc/users/hb/discussion.mdwn
@@ -0,0 +1,6 @@
+I'd love to see any notes you have on using ikiwiki for GTD. Would you
+consider documenting them? Perhaps we could turn the result into a
+[[tip|tips]]. -[[JoshTriplett]]
+> Well, certainly. Basically it's just inline + tag feature. I'm going to have more time in May for ikiwiki, I hope.
+> > Any news about that ?
+> > > I am also interested if you do not mind to share with us. [[cstamas]]
diff --git a/doc/users/hendry.mdwn b/doc/users/hendry.mdwn
new file mode 100644
index 000000000..8deaaaf8d
--- /dev/null
+++ b/doc/users/hendry.mdwn
@@ -0,0 +1 @@
+[Kai Hendry](http://hendry.iki.fi/)
diff --git a/doc/users/intrigeri.mdwn b/doc/users/intrigeri.mdwn
new file mode 100644
index 000000000..8fa9965a5
--- /dev/null
+++ b/doc/users/intrigeri.mdwn
@@ -0,0 +1,4 @@
+intrigeri AT boum.org, already loving ikiwiki.
+
+* [gnupg key](http://gaffer.ptitcanardnoir.org/intrigeri/intrigeri.asc)
+* Git repository with various ikiwiki {feature, bugfix}-branches: `git://gaffer.ptitcanardnoir.org/ikiwiki.git`
diff --git a/doc/users/iustin.mdwn b/doc/users/iustin.mdwn
new file mode 100644
index 000000000..db8cae218
--- /dev/null
+++ b/doc/users/iustin.mdwn
@@ -0,0 +1 @@
+I use ikiwiki to maintain my [personal blog](http://k1024.org/) and also for some private wikis.
diff --git a/doc/users/ivan_shmakov.mdwn b/doc/users/ivan_shmakov.mdwn
new file mode 100644
index 000000000..3dd297086
--- /dev/null
+++ b/doc/users/ivan_shmakov.mdwn
@@ -0,0 +1,54 @@
+&hellip; To put it short: an Ikiwiki newbie.
+
+[Emacs]: http://www.gnu.org/software/emacs/
+[Lynx]: http://lynx.isc.org/
+
+## Wikis
+
+Currently, I run a few Ikiwiki instances. Namely:
+
+* <http://lhc.am-1.org/lhc/>
+ &mdash; to hold random stuff written by me, my colleagues,
+ students, etc.
+
+* <http://rsdesne.am-1.org/rsdesne-2010/>
+ &mdash; for some of the materials related to the
+ &ldquo;Remote Sensing in Education, Science and National
+ Economy&rdquo; (2010-03-29 &hellip; 2010-04-10, Altai State
+ University) program I've recently participated in as
+ an instructor.
+
+* <http://nets.asu591.ru/networks-2011/>
+ &mdash; for bits &amp; pieces related to the course on computer
+ networks I taught in 2011.
+
+## Preferences
+
+I prefer to use [Lynx][] along with [Emacs][] (via
+`emacsclient`) to work with the wikis. (Note the &ldquo;Local
+variables&rdquo; section below.)
+
+The things I dislike in the wiki engines are:
+
+* the use of home-brew specialized version control systems
+ &mdash; while there're a lot of much more developed general
+ purpose ones;
+
+* oversimplified syntax
+ &mdash; which (to some extent) precludes more sophisticated
+ forms of automated processing; in particular, this forces one
+ to reformat the material, once complete, to, say, prepare a
+ book, or an article, or slides.
+
+Out of all the wiki engines I'm familiar with, only Ikiwiki is
+free of the first of these. I hope that it will support more
+elaborate syntaxes eventually.
+
+----
+
+ Local variables:
+ mode: markdown
+ coding: utf-8
+ fill-column: 64
+ ispell-local-dictionary: "american"
+ End:
diff --git a/doc/users/jasonblevins.mdwn b/doc/users/jasonblevins.mdwn
new file mode 100644
index 000000000..e4a459e30
--- /dev/null
+++ b/doc/users/jasonblevins.mdwn
@@ -0,0 +1,89 @@
+[[!meta title="Jason Blevins"]]
+
+I am a former Ikiwiki user who wrote several plugins and patches
+related to MathML, [[SVG|todo/svg]], and [[todo/syntax highlighting]].
+Some related links and notes are archived below.
+
+Homepage: <http://jblevins.org/>
+
+## Plugins
+
+The following [plugins](http://jblevins.org/projects/ikiwiki/)
+are no longer maintained, but please feel free to use, modify, and
+redistribute them. Read the corresponding perldoc documentation for
+more details.
+
+ * [mdwn_itex][] - Works with the [[`mdwn`|plugins/mdwn]] plugin to convert
+ inline [[todo/LaTeX]] expressions to MathML using `itex2MML`.
+
+ * [h1title][] - If present, use the leading level 1 Markdown header to
+ set the page title and remove it from the page body.
+
+ * [code][] - Whole file and inline code snippet [[todo/syntax highlighting]]
+ via GNU Source-highlight. The list of supported file extensions is
+ configurable.
+
+ * [metamail][] - a plugin for loading metadata from email-style
+ headers at top of a file (e.g., `title: Page Title` or
+ `date: November 2, 2008 11:14 EST`).
+
+ * [pandoc][] - [[ikiwiki/Markdown]] page processing via
+ [Pandoc](http://johnmacfarlane.net/pandoc/) (a Haskell library for
+ converting from one markup format to another). [[todo/LaTeX]] and
+ [[reStructuredText|plugins/rst]] are optional.
+
+ * [path][] - Provides path-specific template conditionals such as
+ `IS_HOMEPAGE` and `IN_DIR_SUBDIR`.
+
+ [mdwn_itex]: http://jblevins.org/git/ikiwiki/plugins.git/plain/mdwn_itex.pm
+ [h1title]: http://jblevins.org/git/ikiwiki/plugins.git/plain/h1title.pm
+ [code]: http://jblevins.org/projects/ikiwiki/code
+ [metamail]: http://jblevins.org/git/ikiwiki/plugins.git/plain/metamail.pm
+ [pandoc]: http://jblevins.org/git/ikiwiki/plugins.git/plain/pandoc.pm
+ [path]: http://jblevins.org/git/ikiwiki/plugins.git/plain/path.pm
+
+## MathML and SVG support
+
+So far, I've made some notes on sanitizing MathML and SVG via
+htmlscrubber on the [[todo/svg]] todo item.
+
+I've also worked out some content-negotiation issues. First of all,
+one needs to modify the default templates to use the
+XHTML+MathML+SVG doctype (see e.g., this [patch][template-patch]).
+For most browsers, the content type of the pages should be
+`application/xhtml+xml`. The solution is easy if you want to
+send `application/xhtml+xml` to everybody:
+just change the content type of `.html` files across the board.
+
+However, if you want to support browsers that don't accept
+`application/xhtml+xml` (and those that will but say they
+don't, such as IE with the MathPlayer plugin), then one
+needs a `mod_rewrite` rule like the following:
+
+ RewriteCond %{HTTP_ACCEPT} application\/xhtml\+xml [OR]
+ RewriteCond %{HTTP_USER_AGENT} (W3C.*Validator|MathPlayer)
+ RewriteRule \.html$ - [T=application/xhtml+xml]
+
+This solves the problem of MathML and inline SVG in static pages
+but some additional work is required for dynamically generated
+pages, like page previews, that are generated by `ikiwiki.cgi`.
+We need to allow `ikiwiki.cgi` to set the content type dynamically
+based on the `HTTP_CONTENT_TYPE` environment variable
+(e.g., with the following [patch][cgi-patch]). Then, the following
+rewrite rules can pass the correct content type to ikiwiki:
+
+ RewriteCond %{HTTP_ACCEPT} application\/xhtml\+xml [OR]
+ RewriteCond %{HTTP_USER_AGENT} (W3C.*Validator|MathPlayer)
+ RewriteRule ikiwiki.cgi$ - [T=application/xhtml+xml]
+
+One final critical issue is that a production-ready setup needs to
+implement some sort of on-the-fly error handling. If a user submits
+an invalid LaTeX expression or SVG code (not malicious, just invalid)
+and saves the page, then browsers like Firefox will halt processing of
+the page, preventing any further viewing or editing. A less than
+optimal solution is to force users to preview the page before saving.
+That way if someone introduces invalid XHTML then they can't save the
+page in the first place (unless they post directly to the right URL).
+
+ [template-patch]: http://jblevins.org/git/ikiwiki.git/commit/?h=xbeta&id=416d5d1b15b94e604442e4e209a30dee4b77b684
+ [cgi-patch]: http://jblevins.org/git/ikiwiki.git/commit/?id=fa538c375250ab08f396634135f7d79fce2a9d36
diff --git a/doc/users/jasonriedy.mdwn b/doc/users/jasonriedy.mdwn
new file mode 100644
index 000000000..c94e8e4be
--- /dev/null
+++ b/doc/users/jasonriedy.mdwn
@@ -0,0 +1 @@
+I'm over [thattaway](http://lovesgoodfood.com/jason), although sometimes more easily caught [on identi.ca](http://identi.ca/jasonriedy).
diff --git a/doc/users/jaywalk.mdwn b/doc/users/jaywalk.mdwn
new file mode 100644
index 000000000..31b3e0c4d
--- /dev/null
+++ b/doc/users/jaywalk.mdwn
@@ -0,0 +1,5 @@
+Jonatan Walck. Home page: [jonatan.walck.se](http://jonatan.walck.se)
+
+## Contact ##
+* jonatan at walck dot se
+* [I2P-Bote key](http://jonatan.walck.i2p/docs/i2p-bote.txt) [what is this?](http://i2pbote.i2p.to/)
diff --git a/doc/users/jcorneli.mdwn b/doc/users/jcorneli.mdwn
new file mode 100644
index 000000000..6c11eac09
--- /dev/null
+++ b/doc/users/jcorneli.mdwn
@@ -0,0 +1,3 @@
+I am a Ph.D. student at the Knowledge Media Institute of The Open University, UK. I'm also on the board of directors of [PlanetMath.org](http://planetmath.org), and a contributor to the [Planetary](http://trac.mathweb.org/planetary) project, where we are rebuilding PlanetMath's backend (some features will draw significant inspiration from ikiwiki).
+
+My personal homepage is here: [http://metameso.org/~joe](http://metameso.org/~joe)
diff --git a/doc/users/jeanprivat.mdwn b/doc/users/jeanprivat.mdwn
new file mode 100644
index 000000000..4d75a9867
--- /dev/null
+++ b/doc/users/jeanprivat.mdwn
@@ -0,0 +1 @@
+Jean Privat is <jean@pryen.org>.
diff --git a/doc/users/jelmer.mdwn b/doc/users/jelmer.mdwn
new file mode 100644
index 000000000..1f2f71aad
--- /dev/null
+++ b/doc/users/jelmer.mdwn
@@ -0,0 +1 @@
+[Jelmer Vernooij](http://samba.org/~jelmer/)
diff --git a/doc/users/jeremyreed.mdwn b/doc/users/jeremyreed.mdwn
new file mode 100644
index 000000000..8cfa5fc59
--- /dev/null
+++ b/doc/users/jeremyreed.mdwn
@@ -0,0 +1,3 @@
+[[!meta title="Jeremy Reed"]]
+
+I am testing ikiwiki. I made an RCS plugin. \ No newline at end of file
diff --git a/doc/users/jerojasro.mdwn b/doc/users/jerojasro.mdwn
new file mode 100644
index 000000000..e2e620d3f
--- /dev/null
+++ b/doc/users/jerojasro.mdwn
@@ -0,0 +1,3 @@
+Javier Rojas
+
+I keep a personal [wiki](http://devnull.li/~jerojasro/wiki) and my [blog](http://devnull.li/~jerojasro/blog) in ikiwiki.
diff --git a/doc/users/jmtd.mdwn b/doc/users/jmtd.mdwn
new file mode 100644
index 000000000..a816baf6f
--- /dev/null
+++ b/doc/users/jmtd.mdwn
@@ -0,0 +1 @@
+[[!meta redir=users/jon]]
diff --git a/doc/users/joey.mdwn b/doc/users/joey.mdwn
new file mode 100644
index 000000000..134aa21d1
--- /dev/null
+++ b/doc/users/joey.mdwn
@@ -0,0 +1,8 @@
+[[!meta title="Joey Hess"]]
+
+Joey Hess is <a href="mailto:joey@kitenet.net">joey@kitenet.net</a>.
+His web page is [here](http://kitenet.net/~joey/).
+
+Joey hates programming web crap, and hates being locked into a web browser
+to do something, and this probably shows in the design choices made in
+ikiwiki.
diff --git a/doc/users/jogo.mdwn b/doc/users/jogo.mdwn
new file mode 100644
index 000000000..e8068a10f
--- /dev/null
+++ b/doc/users/jogo.mdwn
@@ -0,0 +1,5 @@
+ * An [economic game](http://sef.matabio.net/) in French, which [uses](http://sef.matabio.net/wiki/) IkiWiki.
+ * Some [plugins](http://www.matabio.net/tcgi/hg/IkiPlugins/file/).
+ * An alternative [base wiki](http://www.matabio.net/tcgi/hg/FrIkiWiki/file/) in French.
+
+email: `jogo matabio net`.
diff --git a/doc/users/jon.mdwn b/doc/users/jon.mdwn
new file mode 100644
index 000000000..3d5304365
--- /dev/null
+++ b/doc/users/jon.mdwn
@@ -0,0 +1,65 @@
+[[!meta title="Jon Dowland"]][[!toc levels=2]]
+
+## intro
+
+I'm looking at ikiwiki both for my personal site but also as a
+team-documentation management system for a small-sized group of UNIX
+sysadmins.
+
+* my edits should appear either as 'Jon' (if I've used
+ [[tips/untrusted_git_push]]); 'jmtd.net', 'jmtd.livejournal.com',
+ 'jmtd' if I've forgotten to set my local git config properly,
+ or once upon a time 'alcopop.org/me/openid/' or 'jondowland'.
+* My [homepage](http://jmtd.net/) is powered by ikiwiki
+
+I gave a talk at the [UK UNIX User's Group](http://www.ukuug.org/) annual
+[Linux conference](http://www.ukuug.org/events/linux2008/) in 2008 about
+organising system administrator documentation. Roughly a third of this talk
+was discussing IkiWiki in some technical detail and suggesting it as a good
+piece of software for this task.
+
+ * slides at <http://www.staff.ncl.ac.uk/jon.dowland/unix/docs/>.
+
+I am also working on some ikiwiki hacks:
+
+* [[todo/allow site-wide meta definitions]]
+* Improving the means by which you can migrate from mediawiki to
+ IkiWiki. See [[tips/convert mediawiki to ikiwiki]] and the
+ [[plugins/contrib/mediawiki]] plugin.
+
+I am mostly interested in ikiwiki usability issues:
+
+ * [[bugs/the login page is unclear when multiple methods exist]]
+ * [[bugs/backlinks onhover thing can go weird]]
+ * [[todo/CSS classes for links]]
+ * [[todo/adjust commit message for rename, remove]]
+
+The following I have been looking at, but are on the back-burner:
+
+* an alternative approach to [[plugins/comments]] (see
+ [[todo/more flexible inline postform]] for one piece of the puzzle;
+ <http://dev.jmtd.net/comments/> for some investigation into making the post
+ form more integrated); possibly also [[todo/pagespec to disable ikiwiki directives]]
+* a system for [[forum/managing_todo_lists]] (see also
+ [[todo/interactive todo lists]] and <http://dev.jmtd.net/outliner/> for the
+ current WIP).
+* a `tag2` plugin, which does the same thing as [[plugins/tag]], but
+ does not sit on top of [[ikiwiki/wikilink]]s, so does not result in
+ bugs such as [[bugs/tagged() matching wikilinks]]. Code for this lives
+ in my github `tag2` branch: <http://github.com/jmtd/ikiwiki>
+
+Penultimately, the following are merely half-formed thoughts:
+
+ * adding and removing tags to pages via the edit form by ticking and
+ unticking checkboxes next to a tag name (rather than entering the
+ directive into the text of the page directly)
+ * perhaps the same for meta
+ * I'd like to make profiling ikiwiki in action very easy for newcomers.
+ Perhaps even a plugin that created a file /profile or similar on build.
+
+## backlinks
+
+Finally, backlinks (since I have issues with the current backlinks
+implementation, see [[bugs/backlinks onhover thing can go weird]]):
+
+[[!map pages="link(users/Jon)"]]
diff --git a/doc/users/jonassmedegaard.mdwn b/doc/users/jonassmedegaard.mdwn
new file mode 100644
index 000000000..6119e7d49
--- /dev/null
+++ b/doc/users/jonassmedegaard.mdwn
@@ -0,0 +1,5 @@
+[[!meta title="Jonas Smedegaard"]]
+
+Jonas Smedegaard is a Debian developer, like joey. A big fan of this novel approach to wikis: serving pages as static!
+
+JonasSmedegaard maintains the packaging of another wiki - MoinMoin - for Debian, but is personally tired of the heavy burden of Python on his web servers.
diff --git a/doc/users/josephturian.mdwn b/doc/users/josephturian.mdwn
new file mode 100644
index 000000000..5ad68290d
--- /dev/null
+++ b/doc/users/josephturian.mdwn
@@ -0,0 +1,10 @@
+Joseph Turian is a scientist. You can email him at
+ lastname at gmail dot com
+
+* Academic: <http://www-etud.iro.umontreal.ca/~turian/>
+
+He hopes to set up ikiwiki and organize his thoughts.
+
+He also imagines adding wacky NLP and machine learning to ikiwiki.
+
+In more hazy dreams, he hopes that ikiwiki is someday ported to Python.
diff --git a/doc/users/joshtriplett.mdwn b/doc/users/joshtriplett.mdwn
new file mode 100644
index 000000000..29178057c
--- /dev/null
+++ b/doc/users/joshtriplett.mdwn
@@ -0,0 +1,16 @@
+[[!meta title="Josh Triplett"]]
+
+Email: `josh@{joshtriplett.org,freedesktop.org,kernel.org,psas.pdx.edu}`.
+
+[Josh Triplett's homepage](http://joshtriplett.org)
+
+Proud user of ikiwiki.
+
+Worked on scripts to convert MoinMoin and TWiki wikis to ikiwikis backed by a
+git repository, including full history. Used for a couple of wikis, and now no
+longer maintained, but potentially still useful. Available from the following
+repositories, though not well-documented:
+
+ git clone git://svcs.cs.pdx.edu/git/wiki2iki/moin2iki
+ git clone git://svcs.cs.pdx.edu/git/wiki2iki/html-wikiconverter
+ git clone git://svcs.cs.pdx.edu/git/wiki2iki/twiki
diff --git a/doc/users/joshtriplett/discussion.mdwn b/doc/users/joshtriplett/discussion.mdwn
new file mode 100644
index 000000000..bbe0ed7c1
--- /dev/null
+++ b/doc/users/joshtriplett/discussion.mdwn
@@ -0,0 +1,68 @@
+Can we please have a very brief HOWTO?
+
+I have a Moin wiki in /var/www/wiki and want to create an ikiwiki clone of it in /var/www/ikiwiki backed by a git repo in /data/ikiwiki.
+
+I tried:
+
+ mkdir /var/www/ikiwiki
+ mkdir /data/ikiwiki
+ PATH=.:/usr/lib/git-core:$PATH ./moin2iki /data/ikiwiki http://localhost/wiki
+
+but this failed. (BTW, I don't usually put . in my PATH.) The failure appears to be that the converter doesn't actually create an ikiwiki instance, but appears to want to update one:
+
+ fatal: ambiguous argument 'master': unknown revision or path not in the working tree.
+ Use '--' to separate paths from revisions
+ fatal: ambiguous argument 'master': unknown revision or path not in the working tree.
+ Use '--' to separate paths from revisions
+ fatal: Not a valid object name master
+ Traceback (most recent call last):
+ File "/home/peterc/src/moin2iki/git-map", line 125, in <module>
+ if __name__ == "__main__": sys.exit(main(sys.argv[1:]))
+ File "/home/peterc/src/moin2iki/git-map", line 117, in main
+ print git_map_file('commit', new_head)
+ File "/home/peterc/src/moin2iki/git-map", line 33, in git_map_file
+ f(inproc.stdout, outproc.stdin, sha, arg)
+ File "/home/peterc/src/moin2iki/git-map", line 64, in handle_commit
+ string, tree = lines.pop(0).split()
+ IndexError: pop from empty list
+
+OK, so I created one:
+
+ ikiwiki --setup /etc/ikiwiki/auto.setup
+ .....
+
+This process created several files and directories in my home directory:
+
+ wiki.git/
+ public_html/wiki/
+ wiki.setup
+ .ikiwiki/
+
+Following the instructions on the setup page, I did:
+
+ mv wiki.git /data/ikiwiki
+ ( cd /data/ikiwiki; git clone -l wiki.git wiki; )
+ mv .ikiwiki /data/ikiwiki/ikiwiki
+ mv ~/public_html/wiki /var/ikiwiki/
+
+then ran again
+
+ PATH=.:/usr/lib/git-core:$PATH ./moin2iki /data/ikiwiki/wiki http://www/wiki
+
+and saw no output, and no change to the filesystem.
+
+I'm totally confused. It looks as though the script calls moin2git iff the target directory isn't there, but the script fails in interesting ways if it is.
+
+The other thing I saw was:
+
+ 2009-12-04 09:00:31,542 WARNING MoinMoin.log:139 using logging configuration read from built-in fallback in MoinMoin.log module!
+ Traceback (most recent call last):
+ File "./moin2git", line 128, in <module>
+ if __name__ == '__main__': main(*sys.argv[1:])
+ File "./moin2git", line 43, in main
+ r = request.RequestCLI()
+ AttributeError: 'module' object has no attribute 'RequestCLI'
+
+Moin version is 1.8.5
+
+Help please!
+
+> Please take a look at [[tips/Convert_moinmoin_to_ikiwiki]] again, the code has radically changed and should now be easier to use *and* work with 1.8.x. --[[anarcat]]
diff --git a/doc/users/jrblevin.mdwn b/doc/users/jrblevin.mdwn
new file mode 100644
index 000000000..4eb250bfa
--- /dev/null
+++ b/doc/users/jrblevin.mdwn
@@ -0,0 +1 @@
+[[!meta redir=users/jasonblevins]]
diff --git a/doc/users/justint.mdwn b/doc/users/justint.mdwn
new file mode 100644
index 000000000..23db51566
--- /dev/null
+++ b/doc/users/justint.mdwn
@@ -0,0 +1 @@
+Casual ikiwiki user.
diff --git a/doc/users/jwalzer.mdwn b/doc/users/jwalzer.mdwn
new file mode 100644
index 000000000..e66ad1a52
--- /dev/null
+++ b/doc/users/jwalzer.mdwn
@@ -0,0 +1,3 @@
+Jan Walzer started looking at ikiwiki just recently.
+
+Read [here](http://wa.lzer.net/wiki/ikiwiki/whyikiwiki/) why he uses ikiwiki.
diff --git a/doc/users/kyle.mdwn b/doc/users/kyle.mdwn
new file mode 100644
index 000000000..7960b3b21
--- /dev/null
+++ b/doc/users/kyle.mdwn
@@ -0,0 +1,2 @@
+[[!meta title="Kyle MacLea"]]
+[Kyle MacLea](http://kitenet.net/~kyle) was an early adopter of **ikiwiki**. He really likes it, especially for his [FamilyWiki](http://kitenet.net/~kyle/family/wiki) and [Emigration Registry](http://kitenet.net/~kyle/family/registry). \ No newline at end of file
diff --git a/doc/users/madduck.mdwn b/doc/users/madduck.mdwn
new file mode 100644
index 000000000..c423703af
--- /dev/null
+++ b/doc/users/madduck.mdwn
@@ -0,0 +1,9 @@
+My sites:
+
+- [Homepage](http://madduck.net)
+- [Debian stuff](http://people.debian.org/~madduck)
+
+I track this site with the following feed:
+
+[[!inline pages="internal(recentchanges/change_*) and !author(http://madduck.net/)"
+feedonly=yes atom=no]]
diff --git a/doc/users/marcelomagallon.mdwn b/doc/users/marcelomagallon.mdwn
new file mode 100644
index 000000000..f59e9a1ae
--- /dev/null
+++ b/doc/users/marcelomagallon.mdwn
@@ -0,0 +1,3 @@
+[[!meta title="Marcelo E. Magallon"]]
+
+Marcelo E. Magallon &lt;marcelo dot magallon in Google Mail&gt; \ No newline at end of file
diff --git a/doc/users/mathdesc.mdwn b/doc/users/mathdesc.mdwn
new file mode 100644
index 000000000..acb2a0756
--- /dev/null
+++ b/doc/users/mathdesc.mdwn
@@ -0,0 +1,190 @@
+mathdesc-at-scourge.biz
+## Profiling a slow render: is [[plugins/filecheck]] buggy?
+
+Does saving an article from the ikiwiki editor take long?
+Is <tt>ikiwiki --setup wiki.setup --rebuild</tt> slow?
+
+Of course it depends on the size of the wiki, but if the wiki is tiny and a
+rebuild still takes more than two minutes, that's annoying. And if it takes a
+**dozen minutes**, it's plainly buggy.
+
+With a verbose rebuild one can narrow down which pages "lag":
+
+<code>
+ private/admin.mdwn
+ tag/admin
+ tag/private
+</code>
+
+It's also possible to measure render time on one of these pages like this:
+
+<code>
+time ikiwiki --setup wiki.setup --render private/admin.mdwn
+</code>
+
+Indeed, for such a simple page something fishy is going on.
+
+Even a simple, superficial profiling test requires a statement-level Perl
+profiler.
+
+## Using SmallProf
+
+[[tips/optimising_ikiwiki/#index10h2]] proposes [[!cpan Devel::NYTProf]].
+
+I tried hard to make it spit out realistic numbers, or even a trend pointing
+at the bottleneck in the code. In short: nothing valuable or coherent; it is
+far too sophisticated to be handy in my situation (virtual machine, SMP
+system, long runs, clock drift, etc.).
+
+[[!cpan Devel::SmallProf]] is simple and just works(c)
+
+<pre>
+export PERL5OPT=-d:SmallProf
+time ikiwiki --setup wiki.setup --rebuild
+sort -k 2nr,2 -k 3nr,3 smallprof.out | head -n 6
+</pre>
+
+
+### Results: the top 6 hotspots
+
+Total rebuild time:<br/>
+real 5m16.283s<br/>
+user 2m38.935s<br/>
+sys 2m32.704s<br/>
+
+
+Total rebuild time (under profiling) : <br/>
+real 19m21.633s<br/>
+user 14m47.831s<br/>
+sys 4m11.046s<br/>
+
+
+<pre>
+[num] [walltime] [cputime] [line]: [code]
+3055 114.17165 15.34000 149: $mimetype=<$file_h>;
+1626527 69.39272 101.4700 93: read($fh, $line, $$ref[1]); # read max
+3055 50.62106 34.78000 148: open(my $file_h, "-|", "file", "-bi",
+1626527 14.86525 48.50000 92: seek($fh, $$ref[0], SEEK_SET); # seek
+1626527 13.95613 44.78000 102: return undef unless $line =~ $$ref[3]; #
+3055 5.75528 5.81000 76: for my $type (map @$_, @rules) {
+</pre>
+
+Legend:
+*num* is the number of times the line was executed, *walltime* is the amount
+of wall-clock time (time according to the clock on the wall, as opposed to CPU
+time) spent executing it, *cputime* is the amount of CPU time expended on it,
+and *line* and *code* are the line number and the actual text of the executed
+line (read from the file).
+
+
+The three topmost hotspots are located in this file:
+
+<tt>/usr/lib/perl5/vendor_perl/5.12.3/IkiWiki/Plugin/filecheck.pm</tt>
+<pre>
+sub match_mimetype ($$;@) {
+ my $page=shift;
+ my $wanted=shift;
+
+ my %params=@_;
+ my $file=exists $params{file} ? $params{file} : IkiWiki::srcfile($IkiWiki::pagesources{$page});
+ if (! defined $file) {
+ return IkiWiki::ErrorReason->new("file does not exist");
+ }
+
+ # Get the mime type.
+ #
+ # First, try File::Mimeinfo. This is fast, but doesn't recognise
+ # all files.
+ eval q{use File::MimeInfo::Magic};
+ my $mimeinfo_ok=! $@;
+ my $mimetype;
+ if ($mimeinfo_ok) {
+ my $mimetype=File::MimeInfo::Magic::magic($file);
+ }
+
+ # Fall back to using file, which has a more complete
+ # magic database.
+ if (! defined $mimetype) {
+ open(my $file_h, "-|", "file", "-bi", $file);
+ $mimetype=<$file_h>;
+ chomp $mimetype;
+ close $file_h;
+ }
+ if (! defined $mimetype || $mimetype !~s /;.*//) {
+ # Fall back to default value.
+ $mimetype=File::MimeInfo::Magic::default($file)
+ if $mimeinfo_ok;
+ if (! defined $mimetype) {
+ $mimetype="unknown";
+ }
+ }
+
+ my $regexp=IkiWiki::glob2re($wanted);
+ if ($mimetype!~$regexp) {
+ return IkiWiki::FailReason->new("file MIME type is $mimetype, not $wanted");
+ }
+ else {
+ return IkiWiki::SuccessReason->new("file MIME type is $mimetype");
+ }
+}
+</pre>
+
+The next three are in this file:
+
+<tt>/usr/lib/perl5/vendor_perl/5.12.3/File/MimeInfo/Magic.pm</tt>
+<pre>
+sub _check_rule {
+ my ($ref, $fh, $lev) = @_;
+ my $line;
+
+ # Read
+ if (ref $fh eq 'GLOB') {
+ seek($fh, $$ref[0], SEEK_SET); # seek offset
+ read($fh, $line, $$ref[1]); # read max length
+ }
+ else { # allowing for IO::Something
+ $fh->seek($$ref[0], SEEK_SET); # seek offset
+ $fh->read($line, $$ref[1]); # read max length
+ }
+
+ # Match regex
+ $line = unpack 'b*', $line if $$ref[2]; # unpack to bits if using mask
+ return undef unless $line =~ $$ref[3]; # match regex
+ print STDERR '>', '>'x$lev, ' Value "', _escape_bytes($2),
+ '" at offset ', $$ref[1]+length($1),
+ " matches at $$ref[4]\n"
+ if $DEBUG;
+ return 1 unless $#$ref > 4;
+
+ # Check nested rules and recurs
+ for (5..$#$ref) {
+ return 1 if _check_rule($$ref[$_], $fh, $lev+1);
+ }
+ print STDERR "> Failed nested rules\n" if $DEBUG && ! $lev;
+ return 0;
+}
+</pre>
+
+*"It seems it's a unique cause, that snails it all"*
+
+## Conclusion
+
+This describes an issue in the attachment file checker's MIME type detection.
+The smallprof output file reveals that it always falls back to running `file`, which is very time-consuming.
+
+So what complex allowed-attachment rule did I set that defeats File::MimeInfo's fast (yet sparse) detection?
+It was set in the config this way:
+
+<tt>allowed_attachments => 'mimetype(image/*) or maxsize(5000kb) or mimetype(text/plain) or mimetype(text/css) or mimetype(video/*)'</tt>
+
+OK... maybe the wildcards are to blame. Let's try the simplest possible rule:
+
+<tt>allowed_attachments => 'mimetype(text/plain) or mimetype(text/css)'</tt>
+
+Same slowness: File::MimeInfo recognizes nothing, not even the simplest files.
+
+Disabling the plugin is obviously only a temporary cure, but with it the rebuild took only **30 seconds**:
+
+<tt>disable_plugins => [qw{filecheck}]</tt>
+
+I also tried upgrading [[!cpan File::MimeInfo]] to the current 0.16; that did not help either. :/
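For what it's worth, the quoted `match_mimetype` code contains a classic Perl scoping pitfall that would explain why detection always falls back to `file`: `my $mimetype = File::MimeInfo::Magic::magic($file);` inside the `if` block declares a *new* lexical variable, so the outer `$mimetype` stays undefined. A minimal, self-contained sketch of that pitfall (variable names are illustrative only):

```perl
use strict;
use warnings;

my $mimetype;          # outer variable, meant to receive the result
my $detector_ok = 1;   # stands in for "File::MimeInfo loaded OK"

if ($detector_ok) {
    # BUG: "my" creates a new lexical $mimetype scoped to this block;
    # the assignment never reaches the outer variable above.
    my $mimetype = "text/plain";
}

# The outer $mimetype is still undef, so the slow fallback would run.
my $result = defined $mimetype ? $mimetype : "fallback";
print "$result\n";     # prints "fallback"
```

Dropping the inner `my` would let the detection result reach the outer variable.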
+
+I opened a bug [[bugs/Slow_Filecheck_attachments___34__snails_it_all__34__]]
+
diff --git a/doc/users/michaelrasmussen.wiki b/doc/users/michaelrasmussen.wiki
new file mode 100644
index 000000000..d0538254d
--- /dev/null
+++ b/doc/users/michaelrasmussen.wiki
@@ -0,0 +1 @@
+New to ikiwiki in October 2007. Longer term MoinMoin, phpWiki user. \ No newline at end of file
diff --git a/doc/users/neale.mdwn b/doc/users/neale.mdwn
new file mode 100644
index 000000000..5245c2c99
--- /dev/null
+++ b/doc/users/neale.mdwn
@@ -0,0 +1,10 @@
+I used IkiWiki to supplant some custom journal software. I like that it uses
+the filesystem; my intent is to make journal entries as future-proof as
+possible. I'll probably start using it for generating entire sites, soon.
+
+Things generated by IkiWiki with some fancypants stylesheets:
+
+* [woozle.org](http://woozle.org/)
+* [My page](http://woozle.org/~neale/)
+* [Amy's blog](http://woozle.org/~aim/blog/)
+* [Heidi's blog](http://woozle.org/~heidi/blog/)
diff --git a/doc/users/nil.mdwn b/doc/users/nil.mdwn
new file mode 100644
index 000000000..e1826cec6
--- /dev/null
+++ b/doc/users/nil.mdwn
@@ -0,0 +1,8 @@
+nil first used ikiwiki on a site/wiki/blog/something... and found this approach much more comfortable than the usual web-only ones.
+Since then, ikiwiki has been a kind of Swiss Army knife when it comes to building anything for the web.
+
+Can be reached at nicolas at limare.net
+
+The current big ikiwiki-powered project is <http://www.ipol.im>
+
+TODO: document "how to split public/edition interfaces"
diff --git a/doc/users/nolan.mdwn b/doc/users/nolan.mdwn
new file mode 100644
index 000000000..64b405e60
--- /dev/null
+++ b/doc/users/nolan.mdwn
@@ -0,0 +1 @@
+Hi, I'm Nolan. I'll add more later.
diff --git a/doc/users/patrickwinnertz.mdwn b/doc/users/patrickwinnertz.mdwn
new file mode 100644
index 000000000..979ce8973
--- /dev/null
+++ b/doc/users/patrickwinnertz.mdwn
@@ -0,0 +1,10 @@
+Okay, I would like to introduce myself.
+
+I'm Patrick Winnertz, 20 years old (born in 1986), and I was accepted into GSoC '07 to write a LaTeX plugin for ikiwiki.
+
+I study chemistry at RWTH Aachen in Germany, and in my free time I work on Debian-Edu and Debian (and now on ikiwiki). ;-)
+
+For some more details about me please visit my homepage at: https://www.der-winnie.de and have a look into the wiki where I develop the plugin:
+
+https://www.der-winnie.de/wiki.
+
diff --git a/doc/users/pdurbin.mdwn b/doc/users/pdurbin.mdwn
new file mode 100644
index 000000000..15ded8346
--- /dev/null
+++ b/doc/users/pdurbin.mdwn
@@ -0,0 +1 @@
+<http://greptilian.com>
diff --git a/doc/users/pelle.mdwn b/doc/users/pelle.mdwn
new file mode 100644
index 000000000..5475b67e8
--- /dev/null
+++ b/doc/users/pelle.mdwn
@@ -0,0 +1 @@
+Have migrated some company-internal wikis from MediaWiki to IkiWiki. Going from MySQL+PHP to git+Perl is a blessing; never going to look back!
diff --git a/doc/users/perolofsson.mdwn b/doc/users/perolofsson.mdwn
new file mode 100644
index 000000000..627a9fdc3
--- /dev/null
+++ b/doc/users/perolofsson.mdwn
@@ -0,0 +1,7 @@
+[[!meta title="Per Olofsson"]]
+
+Per Olofsson
+
+* <pelle@dsv.su.se>
+* <pelle@debian.org>
+* <http://people.dsv.su.se/~pelle/>.
diff --git a/doc/users/peteg.mdwn b/doc/users/peteg.mdwn
new file mode 100644
index 000000000..4e2face0e
--- /dev/null
+++ b/doc/users/peteg.mdwn
@@ -0,0 +1,7 @@
+I'm adding some plugins to Ikiwiki to support a bioacoustic wiki. See here:
+
+<http://bioacoustics.cse.unsw.edu.au/wiki/>
+
+Personal home page:
+
+<http://peteg.org/>
diff --git a/doc/users/peter_woodman.mdwn b/doc/users/peter_woodman.mdwn
new file mode 100644
index 000000000..62f31a5e2
--- /dev/null
+++ b/doc/users/peter_woodman.mdwn
@@ -0,0 +1 @@
+hello. i live [here](http://shortbus.org/).
diff --git a/doc/users/ptecza.mdwn b/doc/users/ptecza.mdwn
new file mode 100644
index 000000000..3f6fd39e8
--- /dev/null
+++ b/doc/users/ptecza.mdwn
@@ -0,0 +1,21 @@
+[[!meta title="Paweł Tęcza"]]
+
+My name is Paweł Tęcza. Currently I work as a mail system administrator,
+C/Perl programmer and computer projects designer at Warsaw University, Poland.
+
+I've founded a few web sites for Polish Debian users and written many
+articles for them. The latest and the most famous of these sites is
+[www.debianusers.pl](http://www.debianusers.pl/). Unfortunately I'm now
+too busy (job, home, woman and child) to take care of it.
+
+I've used Debian for many years. My first Debian release was Potato,
+but now I rather prefer Ubuntu, because it has a faster release cycle
+than Debian and I don't want to wait more than a year for a new stable
+release.
+
+I'm also the author of unofficial ikiwiki backports. In the past I
+rebuilt the ikiwiki source package for Debian Sarge and Ubuntu Gutsy.
+Now I do the same for Ubuntu Hardy. You can find these and my other
+backports at the [public GPA's Ubuntu packages archive](http://gpa.net.icm.edu.pl/ubuntu/).
+
+I love using Ikiwiki and bug reporting ;)
diff --git a/doc/users/rubykat.mdwn b/doc/users/rubykat.mdwn
new file mode 100644
index 000000000..f37d13306
--- /dev/null
+++ b/doc/users/rubykat.mdwn
@@ -0,0 +1 @@
+See [[KathrynAndersen]].
diff --git a/doc/users/sabr.mdwn b/doc/users/sabr.mdwn
new file mode 100644
index 000000000..c5a1a2066
--- /dev/null
+++ b/doc/users/sabr.mdwn
@@ -0,0 +1,32 @@
+[[!toc ]]
+
+### My name
+
+Scott Bronson
+
+### My quest
+
+a wiki that doesn't suck.
+
+### My iki
+
+<http://iki.u32.net>
+
+### Feed
+
+Thanks to [[madduck]], I track this site with the following feed:
+
+[[!inline pages="internal(recentchanges/change_*) and !author(http://sabr.myopenid.com/)"
+feedonly=yes rss=no atom=yes]]
+
+### Tests
+
+* Does this bullet go through? • yes, of course.
+* Can I create this page? [[/root_page_test]]
+ * no, it's a bug: [[/bugs/Can__39__t_create_root_page]]
+* This page has a [[plugins/toc]]. Why doesn't it appear in the edit preview? [[plugins/toc/discussion]]
+* Add two subpages: [[sub1]] and [[sub2]] to try to produce a directory listing as discussed in [[todo/pagespec_expansions]]. Will it list?
+
+> [[!inline pages="./sabr/* and !./sabr/*/*" template="titlepage" archive="yes" feeds="no"]]
+
+> How very strange that I need to put the name of my own page in there. I don't understand that. But it does seem to work.
diff --git a/doc/users/sabr/sub1.mdwn b/doc/users/sabr/sub1.mdwn
new file mode 100644
index 000000000..85bfe26bc
--- /dev/null
+++ b/doc/users/sabr/sub1.mdwn
@@ -0,0 +1 @@
+one sabr sub
diff --git a/doc/users/sabr/sub2.mdwn b/doc/users/sabr/sub2.mdwn
new file mode 100644
index 000000000..f63487d30
--- /dev/null
+++ b/doc/users/sabr/sub2.mdwn
@@ -0,0 +1 @@
+two sabr sub
diff --git a/doc/users/schmonz-web-ikiwiki.mdwn b/doc/users/schmonz-web-ikiwiki.mdwn
new file mode 100644
index 000000000..6b0dbed88
--- /dev/null
+++ b/doc/users/schmonz-web-ikiwiki.mdwn
@@ -0,0 +1 @@
+[[!meta redir=users/schmonz]]
diff --git a/doc/users/schmonz.mdwn b/doc/users/schmonz.mdwn
new file mode 100644
index 000000000..97fa1cbd6
--- /dev/null
+++ b/doc/users/schmonz.mdwn
@@ -0,0 +1,32 @@
+[Amitai Schlair](http://www.schmonz.com/) has contributed code to ikiwiki...
+
+[[!map
+pages="!*/Discussion and ((link(users/schmonz) and plugins/*) or rcs/cvs or todo/fancypodcast)"
+]]
+
+...and uses ikiwiki for all sorts of things:
+
+## Public
+
+* [A major open-source project's wiki](http://wiki.netbsd.org) (with
+ the [[rcs/cvs]] plugin)
+* [An undergraduate group's university-provided-static-hosted
+ site](http://www.columbia.edu/cu/philo/) (with [[plugins/rsync]] and a [WIND
+ authentication](http://www.columbia.edu/acis/rad/authmethods/wind/) plugin)
+* [A small personal site](http://www.anglofish.net/) (happily hosted at
+ [Branchable](http://www.branchable.com/))
+
+## Non-public
+
+* At work, team documentation and project planning: product and sprint
+ backlogs, burndown charts, release plans/procedures/announcements,
+ aggregating feeds of shared interest, etc. (with the
+ [[plugins/contrib/dynamiccookies]] and [[plugins/contrib/proxies]] plugins)
+* On my laptop, personal to-do and scratch space
+* [A small personal site](http://podcast.schmonz.com/) (happily hosted at
+ [Branchable](http://www.branchable.com/))
+
+## Non-yet-ikiwiki
+
+* [My personal web site](http://www.schmonz.com/) (pending
+ [[todo/fancypodcast]] integration)
diff --git a/doc/users/seanh.mdwn b/doc/users/seanh.mdwn
new file mode 100644
index 000000000..d093c2f32
--- /dev/null
+++ b/doc/users/seanh.mdwn
@@ -0,0 +1 @@
+seanh is an ikiwiki user.
diff --git a/doc/users/simonraven.mdwn b/doc/users/simonraven.mdwn
new file mode 100644
index 000000000..13681a674
--- /dev/null
+++ b/doc/users/simonraven.mdwn
@@ -0,0 +1,7 @@
+## personal/site info
+
+Have several ikiwiki-based sites at my web site, blog, kisikew.org home site, for indigenews, and our indigenous-centric wiki (mostly East Coast/Woodlands area).
+
+## ikiwiki branch at github
+
+Maintain my own branch, partly to learn about VCS, git, ikiwiki, Debian packaging, and Perl. Thinking of removing most 3rd-party plugins (found in contrib/). Have some custom plugins to support dual bottom-of-the-page "sidebars" and an attempt at supporting HTTPBL (see projecthoneypot.org).
+
diff --git a/doc/users/smcv.mdwn b/doc/users/smcv.mdwn
new file mode 100644
index 000000000..59d1affba
--- /dev/null
+++ b/doc/users/smcv.mdwn
@@ -0,0 +1,10 @@
+Website: [pseudorandom.co.uk](http://www.pseudorandom.co.uk/)
+
+Blog: [smcv.pseudorandom.co.uk](http://smcv.pseudorandom.co.uk/)
+
+My repository containing ikiwiki branches:
+
+* gitweb: http://git.pseudorandom.co.uk/smcv/ikiwiki.git
+* anongit: git://git.pseudorandom.co.uk/git/smcv/ikiwiki.git
+
+Currently thinking about a [[users/smcv/gallery]] plugin.
diff --git a/doc/users/smcv/gallery.mdwn b/doc/users/smcv/gallery.mdwn
new file mode 100644
index 000000000..3d40b069d
--- /dev/null
+++ b/doc/users/smcv/gallery.mdwn
@@ -0,0 +1,4 @@
+This plugin has now been implemented as [[plugins/contrib/album]].
+
+This page's history contains some older thoughts about it;
+I've deleted them in an attempt to reduce confusion.
diff --git a/doc/users/smcv/gallery/discussion.mdwn b/doc/users/smcv/gallery/discussion.mdwn
new file mode 100644
index 000000000..676bdba0c
--- /dev/null
+++ b/doc/users/smcv/gallery/discussion.mdwn
@@ -0,0 +1,18 @@
+The examples linked to www.pseudorandom.co.uk do not exist.
+
+Does anyone have recent examples of any image album that works with ikiwiki?
+
+-- [[JeremyReed]]
+
+> I've put up a new demo at <http://ikialbum.hosted.pseudorandom.co.uk/>.
+
+> The current implementation is at [[plugins/contrib/album]],
+> but please note that this plugin is not production-ready - only use it
+> if you're comfortable with hacking on it yourself, and you don't mind
+> migrating your data for newer versions if it ever gets merged.
+> Improvements would be welcomed, of course!
+>
+> The `album-live` branch is probably the closest I have to working code
+> for this at the moment, although I'm now looking into integrating
+> album with my [[plugins/contrib/trail]] plugin, on the `album2` branch.
+> --[[smcv]]
diff --git a/doc/users/solofo.mdwn b/doc/users/solofo.mdwn
new file mode 100644
index 000000000..f53dc1b9a
--- /dev/null
+++ b/doc/users/solofo.mdwn
@@ -0,0 +1 @@
+ikiwiki recent user (2008).
diff --git a/doc/users/sphynkx.mdwn b/doc/users/sphynkx.mdwn
new file mode 100644
index 000000000..c161d77a8
--- /dev/null
+++ b/doc/users/sphynkx.mdwn
@@ -0,0 +1 @@
+[My infopage](http://sphynkx.org.ua)
diff --git a/doc/users/sunny256.mdwn b/doc/users/sunny256.mdwn
new file mode 100644
index 000000000..faf829358
--- /dev/null
+++ b/doc/users/sunny256.mdwn
@@ -0,0 +1,15 @@
+I'm Øyvind A. Holm, a Norwegian guy who's been in love with \*NIX-like operating systems since I first tried [QNX](http://www.qnx.com/) in 1987.
+Then, after playing around with [Coherent](http://en.wikipedia.org/wiki/Coherent_%28operating_system%29) for a while, I finally got on the Linux bandwagon at kernel 1.2.8 in 1995.
+
+I live in Bergen, Norway, at [N 60.37436° E 5.3471°](http://www.openstreetmap.org/?mlat=60.374252&mlon=5.34722&zoom=16&layers=M), to be specific.
+I'm quite passionate about Open Source in general, freedom of speech, music and science.
+I'm a photo enthusiast, musician now and then, atheist and some kind of anarchist.
+
+Most of the places on the Net I hang around are listed on my [Google profile](http://www.google.com/profiles/sunny256).
+
+I discovered ikiwiki on 2011-02-15, and immediately clicked with it.
+One week later it had replaced everything on [my web server](http://www.sunbase.org), after years of using a homegrown CMS written in Perl, [Mediawiki](http://www.mediawiki.org), [Drupal](http://drupal.org) and whatnot.
+It seems I've found the perfect system at last.
+Thanks for creating it, Joey.
+
+I have a clone of the ikiwiki repository at <https://github.com/sunny256/ikiwiki> where patches go.
diff --git a/doc/users/svend.mdwn b/doc/users/svend.mdwn
new file mode 100644
index 000000000..f7a88d125
--- /dev/null
+++ b/doc/users/svend.mdwn
@@ -0,0 +1,4 @@
+[[!meta title="Svend Sorensen"]]
+
+* [website](http://svend.ciffer.net)
+* [blog](http://svend.ciffer.net/blog)
diff --git a/doc/users/tbm.mdwn b/doc/users/tbm.mdwn
new file mode 100644
index 000000000..cd7491c56
--- /dev/null
+++ b/doc/users/tbm.mdwn
@@ -0,0 +1,3 @@
+[[!meta title="Martin Michlmayr"]]
+
+Currently trying to convert [my homepage](http://www.cyrius.com/) to ikiwiki.
diff --git a/doc/users/tjgolubi.mdwn b/doc/users/tjgolubi.mdwn
new file mode 100644
index 000000000..edf871843
--- /dev/null
+++ b/doc/users/tjgolubi.mdwn
@@ -0,0 +1,3 @@
+[[!meta title="Terry Golubiewski"]]
+Terry Golubiewski likes to program in C++ and manages software projects.
+Terry is just starting to use ikiwiki.
diff --git a/doc/users/tschwinge.mdwn b/doc/users/tschwinge.mdwn
new file mode 100644
index 000000000..435208a71
--- /dev/null
+++ b/doc/users/tschwinge.mdwn
@@ -0,0 +1,151 @@
+[[!meta title="Thomas Schwinge"]]
+# Thomas Schwinge
+
+<thomas@schwinge.name>
+<http://schwinge.homeip.net/~thomas/>
+
+I have converted the [GNU Hurd](http://www.gnu.org/software/hurd/)'s previous
+web pages and previous wiki pages to a *[[ikiwiki]]* system; and all that while
+preserving the previous content's history, which was stored in a CVS repository
+for the HTML web pages and a TWiki RCS repository for the wiki; see
+<http://www.gnu.org/software/hurd/colophon.html>.
+
+# Issues to Work On
+
+## Stability of Separate Builds
+
+The goal is that separate builds of the same source files should yield the
+exactly the same HTML code (of course, except for changes due to differences in
+Markdown rendering, for example).
+
+ * Timestamps -- [[forum/ikiwiki__39__s_notion_of_time]], [[forum/How_does_ikiwiki_remember_times__63__]]
+
+   Git sets the current *mtime* when checking out files. The result is that
+ <http://www.gnu.org/software/hurd/contact_us.html> and
+ <http://www.bddebian.com:8888/~hurd-web/contact_us/> show different *Last
+ edited* timestamps.
+
+ This can either be solved by adding a facility to Git to set the
+ checked-out files' *mtime* according to the *AuthorDate* / *CommitDate*
+ (which one...), or doing that retroactively with the
+ <http://www.gnu.org/software/hurd/set_mtimes> script before building, or
+   with an ikiwiki-internal solution.
+
+ * HTML character entities
+
+ <http://www.gnu.org/software/hurd/purify_html>
+
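The retroactive approach above essentially needs a helper that applies an epoch timestamp to a checked-out file with `utime`, the way a `set_mtimes`-style script would after asking `git log -1 --format=%ct -- $file` for the commit date. A minimal sketch, demonstrated on a temporary file rather than a real checkout (the helper name is hypothetical):

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Stamp a file's atime/mtime from an epoch timestamp, as a
# set_mtimes-style script would do with a commit date obtained
# from `git log -1 --format=%ct -- $file`.
sub set_mtime_from_commit {
    my ($file, $epoch) = @_;
    utime $epoch, $epoch, $file
        or die "utime $file: $!";
}

# Demonstrate on a temporary file instead of a real checkout.
my ($fh, $tmp) = tempfile(UNLINK => 1);
set_mtime_from_commit($tmp, 1234567890);
print +(stat $tmp)[9], "\n";   # prints 1234567890
```

Run over `git ls-files` output before building, this would make separate checkouts agree on *Last edited* timestamps.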
+### \[[!map]] behavior
+
+The \[[!map]] on, for example,
+<http://www.gnu.org/software/hurd/tag/open_issue_hurd.html>, should not show
+the complete hierarchy of pages, but instead just the pages that actually *do*
+contain the \[[!tag open_issue_hurd]].
+
+> `tagged(open_issue_hurd)` in its pagespec should do that. --[[Joey]]
+
+>> Well, that's exactly what this page contains: \[[!map
+>> pages="tagged(open_issue_hurd) and !open_issues and !*/discussion"
+>> show=title]]
+>>
+>> This is currently rendered as can be seen on
+>> <http://www.gnu.org/software/hurd/tag/open_issue_hurd.html>, but I'd imagine
+>> it to be rendered by **only** linking to the pages that actually do contain
+>> the tag, (**only** the outer leaf ones, which are *capturing stdout and
+>> stderr*, *ramdisk*, *syncfs*, ...; but **not** to *hurd*, *debugging*,
+>> *translator*, *libstore*, *examples*, ...). Otherwise, the way it's being
+>> rendered at the moment, it appears to the reader that *hurd*, *debugging*,
+>> *translator*, *libstore*, *examples*, ... were all tagged, too, and not only
+>> the outer ones.
+
+## Anchors -- [[ikiwiki/wikilink/discussion]]
+
+## Default Content for Meta Values -- [[plugins/contrib/default_content_for___42__copyright__42___and___42__license__42__]]
+
+This will become less relevant, as we're going to add copyright and
+licensing headers to every single file.
+
+## [[bugs/img vs align]]
+
+## Texinfo -- [[plugins/contrib/texinfo]]
+
+Not very important. Have to consider external commands / files / security (see
+[[plugins/teximg]] source code)?
+
+## Shortcuts -- [[plugins/shortcut/discussion]]
+
+## \[[!meta redir]] -- [[todo/__42__forward__42__ing_functionality_for_the_meta_plugin]]
+
+Implement a checker that makes sure that no pages that use \[[!meta redir]]
+redirect to another page (and are thus considered legacy pages for providing
+stable URLs, for example) are linked to from other wiki pages. This is useful
+w.r.t. backlinks. Alternatively, the backlinks to the \[[!meta redir]]-using
+pages could perhaps be passed on to the referred-to page?
+
+> I found that backlinks was an easy way to find such links to such pages.
+> (Although the redirection made it hard to see the backlinks!) --[[Joey]]
+
+## \[[!meta redir]] -- tell what's going on
+
+Add functionality that a text like *this page's content has moved to [new
+page]; in a few seconds you'll be redirected thither* is displayed on every
+page that uses \[[!meta redir]].
+
+## Sendmail -- [[todo/passwordauth:_sendmail_interface]]
+
+## [[bugs/Broken Parentlinks]]
+
+## Modifying [[plugins/inline]] for showing only an *appetizer*
+
+Currently ikiwiki's inline plugin will either show the full page or nothing of
+it. Often that's too much. One can manually use the [[plugins/toggle]] plugin
+-- see the *News* section on <http://www.gnu.org/software/hurd/>. Adding a new
+mode to the inline plugin to only show an *appetizer* ending with *... (read
+on)* after a customizable amount of characters (or lines) would be a another
+possibility. The *... (read on)* would then either toggle the full content
+being displayed or link to the complete page.
+
+> You're looking for [[plugins/more]] (or possibly a way to do that
+> automatically), I suppose. --[[Joey]]
+
+## Prefix For the HTML Title
+
+The title of each page (as in `<html><head><title>`...) should be prefixed with
+*GNU Project - GNU Hurd -*. We can either do this directly in `page.tmpl`, or
+create a way to modify the `TITLE` template variable suitably.
+
+## [[plugins/inline]] feedfile option
+
+Not that important. Git commit b67632cdcdd333cf0a88d03c0f7e6e62921f32c3. This
+would be nice to have even when *not* using *usedirs*. Might involve issues as
+discussed in *N-to-M Mapping of Input and Output Files* on
+[[plugins/contrib/texinfo]].
+
+## Unverified -- these may be bugs, but have yet to be verified
+
+ * ikiwiki doesn't change its internal database when \[[!meta date]] /
+   \[[!meta updated]] are added / removed, and thus these meta values are
+   not propagated in RSS / Atom feeds.
+
+ > I would rather see this filed as a bug, but FWIW, the problem
+ > is probably that meta does not override the mdate_3339
+ > template variable used by the atom and rss templates.
+ > (Meta does store ctime directly in the ikiwiki database, but cannot
+ > store mtime in \%pagemtime because it would mess up detection of when
+ > actual file mtimes change.) --[[Joey]]
+
+ * Complicated issue w.r.t. *no text was copied in this page*
+ ([[plugins/cutpaste]]) in RSS feed (only; not Atom?) under some conditions
+ (refresh only, but not rebuild?). Perhaps missing to read in / parse some
+ files?
+ [[Reported|bugs/Error:_no_text_was_copied_in_this_page_--_missing_page_dependencies]].
+
+ * [[plugins/recentchanges]]
+
+ * Creates non-existing links to changes.
+
+ * Invalid *directory link* with `--usedirs`.
+
+ * Doesn't honor `$timeformat`.
+
+   * Does create `recentchanges.*` files even if that is overridden.
diff --git a/doc/users/ttw.mdwn b/doc/users/ttw.mdwn
new file mode 100644
index 000000000..03caad6c6
--- /dev/null
+++ b/doc/users/ttw.mdwn
@@ -0,0 +1 @@
+n0gOoi3
diff --git a/doc/users/tupyakov_vladimir.mdwn b/doc/users/tupyakov_vladimir.mdwn
new file mode 100644
index 000000000..95f85adc2
--- /dev/null
+++ b/doc/users/tupyakov_vladimir.mdwn
@@ -0,0 +1 @@
+Hi everyone!
diff --git a/doc/users/tychoish.mdwn b/doc/users/tychoish.mdwn
new file mode 100644
index 000000000..75b285e4a
--- /dev/null
+++ b/doc/users/tychoish.mdwn
@@ -0,0 +1,10 @@
+I'm Sam Kleinman ([tychoish](http://www.tychoish.com/)), and I use ikiwiki.
+
+I used to use ikiwiki a bunch, in a purely local setup, for various projects and personal note taking of various kinds. I've since switched to using [orgmode](http://www.orgmode.org) and a number of just regular old git repositories with some textfiles in them, which seems to work just as well for my particular use case.
+
+Currently, I'm using ikiwiki on the [cyborg institute wiki](http://www.cyborginstitute.com/wiki/) and I'm working on helping someone develop a collaborative fiction project at the [critical futures wiki](http://www.criticalfutures.com/wiki/)... and I'm thinking about other projects, but nothing that exists enough to mention here.
+
+I hang out in #ikiwiki on oftc, and I'm working on getting some of my templates in order to share with you all. Soon. For sure.
+
+Cheers,
+tychoish
diff --git a/doc/users/ulrik.mdwn b/doc/users/ulrik.mdwn
new file mode 100644
index 000000000..02e1c1cbd
--- /dev/null
+++ b/doc/users/ulrik.mdwn
@@ -0,0 +1,3 @@
+I use ikiwiki for a small personal website at <http://kaizer.se/wiki/>
+
+I have proposed some patches for ikiwiki: <http://kaizer.se/wiki/hacks/ikiwiki_rst/> (subject to change)
diff --git a/doc/users/undx.mdwn b/doc/users/undx.mdwn
new file mode 100644
index 000000000..9b62209ce
--- /dev/null
+++ b/doc/users/undx.mdwn
@@ -0,0 +1,7 @@
+![undx](http://www.undx.net/images/site/undx_original.png)
+
+IRL, I'm Emmanuel GALLOIS.
+
+This is my temporary ikiwiki user's page.
+
+Ciao.
diff --git a/doc/users/victormoral.mdwn b/doc/users/victormoral.mdwn
new file mode 100644
index 000000000..fe5d860b1
--- /dev/null
+++ b/doc/users/victormoral.mdwn
@@ -0,0 +1,6 @@
+[[!meta title="Victor Moral"]]
+
+I'm a Spanish Perl programmer and Linux system administrator.
+
+My personal page is <http://taquiones.net/victor.html> and my email is <mailto:victor@taquiones.net>.
+
diff --git a/doc/users/weakish.mdwn b/doc/users/weakish.mdwn
new file mode 100644
index 000000000..a5f252e75
--- /dev/null
+++ b/doc/users/weakish.mdwn
@@ -0,0 +1,3 @@
+email: weakish@gmail.com
+
+website: <http://weakish.github.com>
diff --git a/doc/users/weakishjiang.mdwn b/doc/users/weakishjiang.mdwn
new file mode 100644
index 000000000..0cafb4653
--- /dev/null
+++ b/doc/users/weakishjiang.mdwn
@@ -0,0 +1,4 @@
+[My blog](http://millenniumdark.blog.ubuntu.org.cn)
+
+> So, you're learning Haskell. You know, I want to add support for Haskell
+> external plugins to ikiwiki.. :-) --[[Joey]]
diff --git a/doc/users/wentasah.mdwn b/doc/users/wentasah.mdwn
new file mode 100644
index 000000000..3363c1d8f
--- /dev/null
+++ b/doc/users/wentasah.mdwn
@@ -0,0 +1,9 @@
+My homepage: <http://rtime.felk.cvut.cz/~sojka/>
+
+My other ikiwikis:
+
+- <http://support.dce.felk.cvut.cz/osp/>
+- <http://support.dce.felk.cvut.cz/psr/>
+- <http://frsh-forb.sourceforge.net/>
+- <http://orte.sourceforge.net/>
+- <http://ortcan.sourceforge.net/>
diff --git a/doc/users/wiebel.mdwn b/doc/users/wiebel.mdwn
new file mode 100644
index 000000000..4d6997a9e
--- /dev/null
+++ b/doc/users/wiebel.mdwn
@@ -0,0 +1,5 @@
+Started my [Homepage](http://wiebel.scorpius.uberspace.de/) with ikiwiki and added some plugins, such as:
+
+* [hlinclude](https://wiebel.scorpius.uberspace.de/space/hlinclude/) -> an addition to highlight that renders a file's content and provides it as a download; an example can be seen at [favlinks](https://wiebel.scorpius.uberspace.de/code/favlink/). Sadly, it seems unable to render itself.
+* [favlinks](https://wiebel.scorpius.uberspace.de/code/favlink/) -> includes the favicons of linked pages
+
diff --git a/doc/users/wtk.mdwn b/doc/users/wtk.mdwn
new file mode 100644
index 000000000..d00911a76
--- /dev/null
+++ b/doc/users/wtk.mdwn
@@ -0,0 +1,6 @@
+[[!meta title="W. Trevor King"]]
+
+* Git branch: `wtk`.
+* [Ikiwiki-based blog][blog]
+
+[blog]: http://blog.tremily.us/
diff --git a/doc/users/xma/discussion.mdwn b/doc/users/xma/discussion.mdwn
new file mode 100644
index 000000000..34adbf821
--- /dev/null
+++ b/doc/users/xma/discussion.mdwn
@@ -0,0 +1,18 @@
+How do you edit this wiki (I mean [ikiwiki]) without a web browser? Is there a way to git clone/pull/push and thus use our favorite [text editor](http://www.gnu.org/software/emacs)? --[[xma]]
+
+> You can clone ikiwiki's [[git]] repo. I have not implemented a way to
+> allow users to push doc-wiki-only changesets anonymously, but you can
+> mail changesets to me. --[[Joey]]
+> > How can I send you the changesets ? (git command) --[[xma]]
+> > > `git-format-patch` --[[Joey]]
+
+> > > > Glad to hear I can mail changesets to you, since I wrote the [[todo/applydiff_plugin]] wishlist entry. --[[intrigeri]]
+
+> It would be nice to have a git receive hook that
+> checked that a commit contained only changes to .mdwn or other allowed
+> extensions... if someone writes up a good one, I'd be willing to deploy it
+> for ikiwiki. --[[Joey]]
+
+> > I'll think about it. It may solve some of my issues with being offline. --[[intrigeri]]
+
+>>>> Now developed! --[[Joey]]
diff --git a/doc/users/xtaran.mdwn b/doc/users/xtaran.mdwn
new file mode 100644
index 000000000..1c08ff452
--- /dev/null
+++ b/doc/users/xtaran.mdwn
@@ -0,0 +1,5 @@
+[Homepage](http://noone.org/abe/), [Blog](http://noone.org/blog)
+
+Currently I only use it to play around (so no link to any ikiwiki yet), but I really like it and [I see it as a potential WML killer](http://noone.org/blog/English/Computer/Web/WML/Is%20ikiwiki%20a%20WML%20killer%3f.futile). It also reminds me of Blosxom somehow, although I think it won't be a Blosxom killer. OTOH I know of people who moved from Blosxom to ikiwiki as their blogging engine. :)
+
+Probably will use it for some parts of my website revamp and for wikiizing my [[/plugins/hnb]] note files.
diff --git a/doc/users/yds.mdwn b/doc/users/yds.mdwn
new file mode 100644
index 000000000..f081f4134
--- /dev/null
+++ b/doc/users/yds.mdwn
@@ -0,0 +1 @@
+[yds.CoolRat.org](http://yds.CoolRat.org)
diff --git a/doc/w3mmode.mdwn b/doc/w3mmode.mdwn
new file mode 100644
index 000000000..04e37ba04
--- /dev/null
+++ b/doc/w3mmode.mdwn
@@ -0,0 +1,11 @@
+It's possible to use all of ikiwiki's web features (page editing, etc) in
+the [`w3m`](http://w3m.sourceforge.net/) web browser without using a web server. `w3m` supports local CGI
+scripts, and ikiwiki can be set up to run that way. This requires some
+special configuration:
+
+ * `w3mmode` must be enabled
+ * A CGI wrapper must be created, in `~/.ikiwiki/wrappers/`
+ * `cgiurl` should be set to just the base of the filename of the CGI
+ wrapper.
+
+This [[ikiwiki.setup]] is an example of setting up a wiki using w3mmode.
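The steps above can be sketched as a short shell session. This is an illustration only, not part of the patched page; the setup filename and the current-directory layout are assumptions:

```shell
# ikiwiki-w3m.cgi only looks for wrappers in ~/.ikiwiki/wrappers/
mkdir -p ~/.ikiwiki/wrappers

# Generate the CGI wrapper and build the wiki from the setup file
# (one with w3mmode => 1 and a relative cgiurl, as shown below).
ikiwiki --setup ikiwiki.setup

# Browse and edit locally; w3m runs the wrapper as a local CGI script,
# so no web server is needed.
w3m ~/.ikiwiki/wrappers/ikiwiki.cgi
```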
diff --git a/doc/w3mmode/ikiwiki.setup b/doc/w3mmode/ikiwiki.setup
new file mode 100644
index 000000000..5f5cbbff9
--- /dev/null
+++ b/doc/w3mmode/ikiwiki.setup
@@ -0,0 +1,31 @@
+#!/usr/bin/perl
+# Configuration file for ikiwiki (w3m mode).
+# Passing this to ikiwiki --setup will make ikiwiki generate wrappers and
+# build the wiki.
+#
+# Remember to re-run ikiwiki --setup any time you edit this file.
+
+use IkiWiki::Setup::Standard {
+ wikiname => "ikiwiki",
+
+ # Be sure to customise these..
+ srcdir => "doc",
+ destdir => "html",
+
+ # Enable w3m mode.
+ w3mmode => 1,
+ cgiurl => 'ikiwiki.cgi',
+ rcs => "",
+
+ # The wrapper must be put in ~/.ikiwiki/wrappers/, since
+ # ikiwiki-w3m.cgi only looks in this one location.
+ # The wrapper can be given any name as long as it's
+ # in that directory.
+ cgi_wrapper => "$ENV{HOME}/.ikiwiki/wrappers/ikiwiki.cgi",
+ cgi_wrappermode => "0755",
+
+ add_plugins => [qw{anonok}],
+ rss => 1,
+ atom => 1,
+ discussion => 1,
+}
diff --git a/doc/whyikiwiki.mdwn b/doc/whyikiwiki.mdwn
new file mode 100644
index 000000000..04d5db4aa
--- /dev/null
+++ b/doc/whyikiwiki.mdwn
@@ -0,0 +1,15 @@
+Why call it ikiwiki? Well, partly because I'm sure some people will find
+this a pretty Iky Wiki, since it's so different from other Wikis. Partly
+because "ikiwiki" is a nice palindrome. Partly because its design turns
+the usual design for a Wiki inside-out and backwards.
+
+(BTW, I'm told that "iki" is Finnish for "forever" so ikiwiki is "forever
+wiki". It's also nice to note that ikiwiki contains a kiwi.)
+
+Oh, maybe you wanted to know why you'd want to choose ikiwiki instead of
+all the other wikis out there? Unless your personal strangeness
+significantly aligns with [[Joey]]'s, so that keeping everything in
+subversion, compiling websites to static html, and like design [[features]]
+appeal to you, you probably won't.
+
+Hmm, the above paragraph is less true today than it was when I wrote it.
diff --git a/doc/wikiicons/diff.png b/doc/wikiicons/diff.png
new file mode 100644
index 000000000..0b98d79ac
--- /dev/null
+++ b/doc/wikiicons/diff.png
Binary files differ
diff --git a/doc/wikiicons/openidlogin-bg.gif b/doc/wikiicons/openidlogin-bg.gif
new file mode 100644
index 000000000..a3bfe1098
--- /dev/null
+++ b/doc/wikiicons/openidlogin-bg.gif
Binary files differ
diff --git a/doc/wikiicons/revert.png b/doc/wikiicons/revert.png
new file mode 100644
index 000000000..c39e65c33
--- /dev/null
+++ b/doc/wikiicons/revert.png
Binary files differ
diff --git a/doc/wikiicons/search-bg.gif b/doc/wikiicons/search-bg.gif
new file mode 100644
index 000000000..02f9da4a7
--- /dev/null
+++ b/doc/wikiicons/search-bg.gif
Binary files differ
diff --git a/doc/wishlist.mdwn b/doc/wishlist.mdwn
new file mode 100644
index 000000000..627503760
--- /dev/null
+++ b/doc/wishlist.mdwn
@@ -0,0 +1,6 @@
+These [[todo]] tagged 'wishlist' encompass all kinds of features and
+improvements people would like to see in ikiwiki. Good patches for any of
+these will likely be accepted.
+
+[[!inline pages="todo/* and !todo/done and !link(todo/done) and
+link(wishlist) and !link(patch) and !todo/*/*" archive=yes show=0]]
diff --git a/doc/wishlist/watched_pages.mdwn b/doc/wishlist/watched_pages.mdwn
new file mode 100644
index 000000000..d943571d7
--- /dev/null
+++ b/doc/wishlist/watched_pages.mdwn
@@ -0,0 +1 @@
+Is there a way to mark pages I edit as "watched"? A way of doing this through git would be acceptable too. Right now I link to my homepage on every page I comment on, but this doesn't tell me if the pages were updated... --[[anarcat]]