path: root/IkiWiki
Commit message (Author, Age)
* tidy up new page_types code (Jon Dowland, 2009-05-16)
* add a long name for textile/txtl ("Textile") (Jon Dowland, 2009-05-16)
* add a long name for mdwn ("Markdown") (Jon Dowland, 2009-05-16)
* check for longname for each syntax plugin (Jon Dowland, 2009-05-16)
  We build an array of [plugin name, long name] pairs, where the long name is
  an optional argument to hook(). So, a syntax plugin could define a long
  "friendly" name, such as "Markdown" instead of mdwn, and we would then pass
  this array to formbuilder to populate the drop-down on the edit page.
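The drop-down construction described above can be sketched as follows (an illustrative Python sketch, not ikiwiki's actual Perl; the `page_type_options` name and the hash shape are assumptions based on the commit message):

```python
def page_type_options(hooks):
    """Build (value, label) pairs for the edit page's page-type drop-down.

    `hooks` maps plugin name -> optional long name, as registered via
    hook(); fall back to the short name when no long name was given.
    """
    return [(name, longname or name) for name, longname in hooks.items()]
```

A form library can then render each pair as an option whose value is the short plugin name and whose visible label is the friendly name.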
* remove pagespec_match_list override for external (Joey Hess, 2009-05-06)
  Not needed since it returns a list of pages, not a fail/success object.
* external: Fix pagespec_match and pagespec_match_list. Closes: #527281 (Joey Hess, 2009-05-06)
* Avoid %links accumulating duplicates. (For TOVA) (Joey Hess, 2009-05-06)
  This is sorta an optimisation, and sorta a bug fix. In one test case I have
  available, it can speed a page build up from 3 minutes to 3 seconds.
  The root of the problem is that $links{$page} contains arrays of links,
  rather than hashes of links. And when a link is found, it is just pushed
  onto the array, without checking for dups. Now, the array is emptied before
  scanning a page, so there should not be a lot of opportunity for lots of
  duplicate links to pile up in it. But, in some cases, they can, and if
  there are hundreds of duplicate links in the array, then scanning it for
  matching links, as match_link and some other code does, becomes much more
  expensive than it needs to be.
  Perhaps the real right fix would be to change the data structure to a hash.
  But, the list of links is never accessed like that; you always want to
  iterate through it. I also looked at deduping the list in saveindex, but
  that does a lot of unnecessary work, and doesn't completely solve the
  problem. So, finally, I decided to add an add_link function that handles
  deduping, and make ikiwiki-transition remove the old dup links.
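The dedup-on-insert approach described in that commit can be sketched like this (an illustrative Python sketch, not ikiwiki's Perl; the `links` structure and `add_link`/`scan_page` names follow the commit message, the parallel `_link_seen` set is an assumption):

```python
# Links are stored per page as an ordered list (callers iterate over it),
# while membership is tracked in a parallel set so each insert is O(1)
# and duplicates never accumulate.
links = {}            # page -> list of links, in discovery order
_link_seen = {}       # page -> set of links, for O(1) dup checks

def add_link(page, link):
    """Record that `page` links to `link`, skipping duplicates."""
    seen = _link_seen.setdefault(page, set())
    if link not in seen:
        seen.add(link)
        links.setdefault(page, []).append(link)

def scan_page(page, found_links):
    """Re-scan a page: empty its link list first, then re-add each link."""
    links[page] = []
    _link_seen[page] = set()
    for l in found_links:
        add_link(page, l)
```

Keeping the list (rather than switching to a hash outright) preserves the existing iteration-based access pattern the commit message mentions.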
* inline: Minor optimisation. (Joey Hess, 2009-05-05)
  When finding the pageurl, it was calling bestlink unnecessarily. Since at
  this point $page contains the full name of the page that is being inlined,
  there is no need to do bestlink's scan for it. This is only a minor
  optimisation, since bestlink is only called once per displayed, inlined
  page.
* simplify (Joey Hess, 2009-04-23)
* Revert "pagespec_match_list * optimisation" (Joey Hess, 2009-04-23)
  This reverts commit 2f96c49bd1826ecb213ae025ad456a714aa04863. I forgot
  about internal pages. We don't want * matching them! I left the
  optimisation in pagecount, where it used to live. Internal pages probably
  don't matter when they're just being counted.
* avoid using pagespec_match_list here (Joey Hess, 2009-04-23)
  I forgot to check if it was called from preprocess, and it is not; it's
  called by a format hook. If an error is thrown from a format hook, the
  wiki build fails, so we don't want that.
* simplify (Joey Hess, 2009-04-23)
* pagespec_match_list * optimisation (Joey Hess, 2009-04-23)
  Add an optimisation for the semi-common case of a "*" pagespec. Can avoid
  doing any real processing in this case.
* formatting (Joey Hess, 2009-04-23)
* typo (Joey Hess, 2009-04-23)
* pagespec_match_list added and used in most appropriate places (Joey Hess, 2009-04-23)
  * pagespec_match_list: New API function; matches pages in a list and
    throws an error if the pagespec is bad.
  * inline, brokenlinks, calendar, linkmap, map, orphans, pagecount,
    pagestate, postsparkline: Display a handy error message if the pagespec
    is erroneous.
* comments: Add link to comment post form to allow user to sign in if they wish... (Joey Hess, 2009-04-23)
* pagespec error/failure distinction and error display by inline (Joey Hess, 2009-04-23)
  * Add IkiWiki::ErrorReason objects, and modify pagespecs to return them in
    cases where they fail to match due to a configuration or syntax error.
  * inline: Display a handy error message if the inline cannot display any
    pages due to such an error.
  This is perhaps somewhat incomplete, as other users of pagespecs do not
  display the error, and will eventually need similar modifications to
  inline. I should probably factor out a pagespec_match_all function and
  make it throw ErrorReasons.
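The failure-versus-error distinction above can be sketched as follows (an illustrative Python sketch; the class names mirror the commit message, but the matcher logic and `inline_pages` helper are hypothetical, not ikiwiki's Perl API):

```python
class FailReason:
    """A pagespec simply did not match this page; not a user-visible problem."""
    def __init__(self, reason):
        self.reason = reason
    def __bool__(self):
        return False  # falsy, so callers can treat it as "no match"

class ErrorReason(FailReason):
    """The pagespec itself is broken (syntax/config error); worth surfacing."""
    pass

def pagespec_match(page, spec):
    # Hypothetical matcher: "*" matches everything; any other spec must be
    # a plain identifier-like name, otherwise it is a syntax error.
    if spec == "*":
        return True
    if not spec.isidentifier():
        return ErrorReason("syntax error in pagespec %r" % spec)
    return page == spec or FailReason("%s does not match %s" % (spec, page))

def inline_pages(pages, spec):
    """Collect matches, but report a message if the spec itself is bad."""
    matches = []
    for p in pages:
        r = pagespec_match(p, spec)
        if isinstance(r, ErrorReason):
            return "cannot match pages: " + r.reason
        if r:
            matches.append(p)
    return matches
```

Because an ErrorReason is still falsy, code that only cares about match/no-match keeps working, while callers like inline can check the type and display the error.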
* fix id (Joey Hess, 2009-04-22)
* websetup: If setup fails, restore old setup file. (Joey Hess, 2009-04-22)
* blogspam: Load RPC::XML library in checkconfig, so that an error can be print... (Joey Hess, 2009-04-22)
* websetup: Display stderr in browser if ikiwiki setup fails. (Joey Hess, 2009-04-22)
* remove unnecessary variable (Joey Hess, 2009-04-04)
* remove debugging (Joey Hess, 2009-04-04)
* Merge branch 'darcs' (Joey Hess, 2009-04-04)
  Conflicts: debian/changelog
|\
| * fix display of web commits in recentchanges (Joey Hess, 2009-04-04)
|   The darcs backend appends @web to the names of web committers, so remove
|   it when extracting.
| * fix name of wrapper (Joey Hess, 2009-04-04)
| * fix bug I introduced (Joey Hess, 2009-04-04)
| * support darcs in setup automator (Joey Hess, 2009-04-04)
|   Use a consistent name for the ikiwiki wrapper file.
| * move comments to copyright and changelog (Joey Hess, 2009-04-04)
| * formatting, layout, indentation, coding style (Joey Hess, 2009-04-04)
| * Merge branch 'master' (Joey Hess, 2009-04-04)
|   Conflicts: doc/ikiwiki-makerepo.mdwn
| |\
| * | only darcs add files not yet in version control (Joey Hess, 2008-10-16)
| * | updated from pesco's darcs repo, current to Oct 11 version (Joey Hess, 2008-10-15)
| * | Merge branch 'master' into darcs (Joey Hess, 2008-10-15)
| |\ \
| * | | add pesco's darcs plugin (Joey Hess, 2008-10-01)
* | | | Add missing newline to Confirm Password prompt. (Joey Hess, 2009-04-04)
| |_|/
|/| |
* | | recentchanges: change to using do=goto links. (Joey Hess, 2009-04-01)
* | | use md5sum for page_to_id (Joey Hess, 2009-03-27)
| | |   The munged ids were looking pretty nasty, and were not completely
| | |   guaranteed to be unique. So an md5sum seems like a better approach.
| | |   (Would have used sha1, but md5 is in perl core.)
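The idea behind this change can be sketched as follows (an illustrative Python sketch; ikiwiki's page_to_id is Perl, and the `comment-` prefix here is an assumption for the example):

```python
import hashlib

def page_to_id(page):
    """Derive a legal, stable element id from an arbitrary page name.

    Page names may contain characters that are illegal in ids, and naive
    munging (stripping or replacing them) can map two different names to
    the same id. Hashing the name sidesteps both problems: the hex digest
    uses only [0-9a-f], and collisions are vanishingly unlikely.
    """
    digest = hashlib.md5(page.encode("utf-8")).hexdigest()
    # ids must not start with a digit, so prefix with a letter.
    return "comment-" + digest
```

The same name always hashes to the same id, so anchors stay stable across rebuilds.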
* | | comments: Fix anchor ids to be legal xhtml. Closes: #521339 (Joey Hess, 2009-03-26)
| | |   Well, that was a PITA. Luckily, this doesn't break guids to comments
| | |   in rss feeds, though it does change the links. I haven't put in a
| | |   warning about needing to rebuild to get this fix. It's probably good
| | |   enough for new comments to get the fix, without a lot of mass
| | |   rebuilding.
* | | comments: Fix too loose test for comments pages that matched normal pages wit... (Joey Hess, 2009-03-26)
* | | fix rcs_getctime to return first, not last, change time (Joey Hess, 2009-03-20)
| | |   This was being buggy and returning the file's last change time, not
| | |   its creation time. (I checked all the others (except tla) and
| | |   they're ok.)
* | | fix rcs_getctime to return first, not last, change time (Joey Hess, 2009-03-20)
| | |   This was being buggy and returning the file's last change time, not
| | |   its creation time.
* | | inline: Fix urls to feed when feedfile is used on an index page. (Joey Hess, 2009-03-19)
| | |   It would be better to use urlto() here, but will_render has not yet
| | |   been called on the feed files at this point, so it won't work. (And
| | |   reorganizing so it can be is tricky.)
* | | avoid crashing if Sort::Naturally is not installed (Joey Hess, 2009-03-19)
* | | implement sort=title_natural for inline (chrysn, 2009-03-19)
| | |   Adds a new sorting order, title_natural, that uses Sort::Naturally's
| | |   ncmp function to provide better sorting for inlines.
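Natural sorting compares embedded digit runs numerically rather than lexically, so "page10" sorts after "page2". A rough Python equivalent of the behaviour Sort::Naturally's ncmp provides (a sketch, not the module's actual algorithm):

```python
import re

def natural_key(title):
    """Split a title into alternating text/number chunks so digit runs
    compare numerically: 'page10' > 'page2', unlike plain string order."""
    return [int(chunk) if chunk.isdigit() else chunk.lower()
            for chunk in re.split(r"(\d+)", title)]

def natural_sort(titles):
    """Sort titles in natural order."""
    return sorted(titles, key=natural_key)
```

Because re.split with a capturing group keeps text at even positions and digits at odd ones, keys always compare str-to-str or int-to-int.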
* | | git: Manually decode git output from utf-8, avoids warning messages on invali... (Joey Hess, 2009-03-09)
* | | git: Fix utf-8 encoding of author names. (Joey Hess, 2009-03-09)
| | |   I guess what's happening here is that since the name is passed to
| | |   git via an environment variable, perl's normal utf-8 IO layer stuff
| | |   doesn't work. So we have to explicitly decode the string from perl's
| | |   internal representation into utf-8.
* | | avoid uninitialized value warnings (Joey Hess, 2009-03-09)
* | | When loading a template in scan mode, let preprocess know it only needs to scan. (Joey Hess, 2009-03-08)
| | |   This makes wikis such as zack's much faster in the scan pass. In
| | |   that pass, when a template contains an inline, there is no reason to
| | |   process the entire inline and all its pages. I'd forgotten to pass
| | |   along the flag to let preprocess() know it was in scan mode, leading
| | |   to much unnecessary churning.