| Commit message | Author | Age |
| |
|
|
|
|
|
|
|
|
|
|
|
| |
When the wiki is in a subdir of the git repo, a web revert would show
in recentchanges as e.g. doc/index, instead of just index.
This happened because decode_git_file caches a $prefix that is dependent
on the $git_dir setting, and the revert code runs with a different
$git_dir, which polluted the $prefix for later.
Fix this by adding a with_git_dir that juggles the variables properly.
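(A rough sketch of what such a helper could look like; the names $git_dir, $prefix
and with_git_dir come from the description above, and the body is illustrative
rather than the actual ikiwiki code.)

    # Illustrative only: run $code with $git_dir temporarily switched,
    # saving and restoring both $git_dir and the cached $prefix so the
    # cache cannot leak from one repository into another.
    sub with_git_dir {
        my ($dir, $code) = @_;
        my ($old_dir, $old_prefix) = ($git_dir, $prefix);
        $git_dir = $dir;
        undef $prefix;    # force decode_git_file to recompute it
        my @ret = eval { $code->() };
        my $err = $@;
        ($git_dir, $prefix) = ($old_dir, $old_prefix);
        die $err if $err;
        return @ret;
    }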
|
|
|
|
|
| |
Its strftime is from Date::Format and doesn't have the problem; using the
POSIX one breaks its %o.
|
|\ |
|
| | |
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
strftime is a C function; it does not return decoded utf8.
Several places in ikiwiki manually decoded it, but at least two
forgot to.
Also, strftime might not even return encoded utf8, if LC_TIME is set
to a non-utf8 value. Went ahead and supported decoding whatever encoding
it uses.
The remaining direct calls to strftime() are all ones that first set
LC_TIME=C, in order to get times that are not for human display.
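(For illustration, a hedged sketch of decoding strftime output according to the
LC_TIME locale's charset; the helper name is made up, while I18N::Langinfo and
Encode are standard perl modules.)

    use POSIX ();
    use Encode ();
    use I18N::Langinfo qw(langinfo CODESET);

    # Made-up helper: call strftime and decode its bytes using whatever
    # charset the current LC_TIME locale produces.
    sub strftime_decoded {
        my $str = POSIX::strftime(@_);
        return Encode::decode(langinfo(CODESET()), $str);
    }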
|
|\ \ |
|
| | | |
|
| | |
| | |
| | |
| | |
| | | |
Extract cvs_keyword_subst_args() and ensure it runs in $config{srcdir}.
Using Perl's -T operator appears to work equally well, perhaps switch?
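(If that switch were made, a hypothetical version might look roughly like this;
the return convention, error() helper, and $config{srcdir} use are assumptions
based on the text above, not the actual cvs plugin code.)

    # Hypothetical switch to Perl's -T heuristic: treat files that do not
    # look like text as binary, so cvs gets -kb for them.
    sub cvs_keyword_subst_args {
        my $file = shift;
        chdir($config{srcdir}) || error("chdir: $!");
        return -T $file ? () : ('-kb');
    }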
|
| | | |
|
| |\ \ |
|
| | | |
| | | |
| | | |
| | | |
| | | |
| | | |
| | | |
| | | |
| | | |
| | | |
| | | |
| | | |
| | | | |
In the code:
* general plugin API calls (in plugins/write order),
* VCS plugin API calls (in plugins/write order), then
* internal support routines (in alphabetical order).
In the tests:
* general meta-behavior (in no particular order, yet),
* general plugin API calls (in plugins/write order),
* VCS plugin API calls (in plugins/write order), then
* internal support routines (in semi-logical order).
|
| |/ /
|/| |
| | |
| | |
| | |
| | |
| | |
| | |
| | | |
https://rt.cpan.org/Ticket/Display.html?id=74487
Gave up trying to support multiple YAML backends. The XS one requires ugly
manual encoding to get unicode right, and doesn't allow dumping yaml
fragments w/o the yaml header, but at least it doesn't randomly crash
on import like YAML::Mo has started to.
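(A hedged sketch of the manual encoding dance the XS backend needs; YAML::XS and
Encode are real modules, while the fragment-stripping helper is illustrative.)

    use YAML::XS ();
    use Encode ();

    # Illustrative fragment dumper: YAML::XS::Dump returns utf-8 bytes,
    # so decode them, and strip the "---" document header by hand since
    # the XS backend cannot omit it.
    sub dump_yaml_fragment {
        my $yaml = Encode::decode_utf8(YAML::XS::Dump(shift));
        $yaml =~ s/^---\n//;
        return $yaml;
    }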
|
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | | |
A diff was already truncated after 200 lines. But it could still be
arbitrarily enormous, if a spammer or other random noise source likes long
lines. That could use a lot of memory to html-encode etc. the diff and fill
it into the template. Truncating after 100kb seems sufficient; it allows
for 200 lines of up to 512 characters each.
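(Illustrative only; the $diff variable and names are made up, but the two limits
are the ones described above.)

    # Illustrative limits: at most 200 lines, and at most 100kb overall,
    # before the diff is html-encoded and filled into the template.
    my $maxlines = 200;
    my $maxbytes = 100 * 1024;
    my @lines = split(/^/m, $diff);
    splice(@lines, $maxlines) if @lines > $maxlines;
    my $diff_shown = substr(join('', @lines), 0, $maxbytes);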
|
|/ / |
|
| |
| |
| |
| | |
markdown discount engine, when maximum compatibility is needed.
|
| | |
|
|/ |
|
|
|
|
|
| |
Empty input, or input consisting solely of whitespace,
caused an uninitialized value warning.
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
mdwn: Can use the discount markdown library, via the
Text::Markdown::Discount perl module.
This is preferred if available since it's the fastest currently supported
markdown library, speeding up markdown rendering by a factor of 40.
That is to say, when only rendering a lot of markdown, discount is 40x
faster. When building an ikiwiki site, ikiwiki's other overhead gets in the
way, but I still see significant speedups. Building the ikiwiki docwiki
dropped from 62 to 45 seconds, for example.
However, when multimarkdown is enabled, Text::MultiMarkdown is
still used.
While discount contains some nonstandard markdown extensions,
including tables and footnotes, AFAICS most of them are not
enabled by default in the perl bindings.
I consider sticking to non-extended markdown a desirable thing, since this
is probably not the last markdown engine. In particular, sundown is waiting
in the wings to get packaged and get a perl binding.
----
Reviewing all the discount extensions, here are the ones that are enabled:
* centered paragraphs: ->centered<-
* image sizes: [dust mite](http://dust.mite =150x150)
* <style>..</style> blocks are eaten. The perl binding does not provide
  access to the gathered CSS. This is not legal html anyway, so unlikely
  to cause breakage.
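(A hedged sketch of preferring the discount binding with a fallback; the modules
are real CPAN modules, the surrounding logic and $content variable are
illustrative and not the mdwn plugin verbatim.)

    my $impl;
    if (eval { require Text::Markdown::Discount; 1 }) {
        # fastest currently supported backend
        $impl = \&Text::Markdown::Discount::markdown;
    }
    elsif (eval { require Text::Markdown; 1 }) {
        $impl = \&Text::Markdown::markdown;
    }
    my $html = $impl->($content);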
|
|
|
|
| |
how did that get set?
|
|
|
|
| |
This ensures that RSS/Atom feeds produced are valid XML.
|
|
|
|
| |
(cherry picked from commit 272e0b2f17c33c625b494b07f581da400066a216)
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
We had a weird problem where, after moving to a new, faster server,
"git push" would sometimes fail like this:
Unpacking objects: 100% (3/3), done.
fatal: The remote end hung up unexpectedly
fatal: The remote end hung up unexpectedly
What turned out to be going on was that git-receive-pack was dying due
to an uncaught SIGPIPE. The SIGPIPE occurred when it tried to write to
the pre-receive hook's stdin. The pre-receive hook, in this case, was
able to do all the checks it needed to do without the input, and so did
exit(0) without consuming it.
Apparently that causes a race. Most of the time, git forks the hook,
writes output to the hook, and then the hook runs, ignores it, and exits.
But sometimes, on our new faster server, git forked the hook, and it
ran, and exited, before git got around to writing to it, resulting in
the SIGPIPE.
write(7, "c9f98c67d70a1cfeba382ec27d87644a"..., 100) = -1 EPIPE (Broken pipe)
--- SIGPIPE (Broken pipe) @ 0 (0) ---
I think git should ignore SIGPIPE when writing to hooks. Otherwise,
hooks may have to go out of their way to consume all input, and as I've
seen, the races when they fail to do this can lurk undiscovered.
I have written to the git mailing list about this.
As a workaround, consume all stdin before exiting.
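(A minimal sketch of that workaround in perl, for a hook that does not otherwise
need its input; illustrative, not the actual ikiwiki wrapper code.)

    # Workaround: drain stdin before exiting, so git-receive-pack never
    # takes a SIGPIPE writing ref updates to a hook that already exited.
    1 while read(STDIN, my $buf, 8192);
    exit 0;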
|
| |
|
|
|
|
| |
(Sponsored by The TOVA Company.)
|
|
|
|
|
|
|
|
|
|
|
|
| |
Using a file was sorta not right.
Note that when previewing, %pagestate is not saved, so
it has to rebuild the graph every time until that graph is saved;
then previews can use the cached data until the next time the graph
is changed.
Also note that it's stored in the destpage's pagestate. The imagemap
could vary between a page and an inlined page if wikilinks were supported.
|
|
|
|
|
|
|
|
| |
imagemap.
Also, I let preview mode write real files, rather than using data: uri.
Which is ok these days, since ikiwiki tracks files created during
previewing, and cleans them up later.
|
|
|
|
|
|
|
| |
In 875d550f1278215e6c87d3b78ff87db24c6d76b3 I for some reason
made $page be changed when creating a discussion page, which
broke the link on the edit page. Changing $page seems unnecessary,
so reverted that part of the change.
|
| |
|
| |
|
| |
|
|
|
|
|
|
| |
the right place and with the right case.
Broken by page case preservation feature added in 3.20110707.
|
| |
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Involved dropping some checks for .svn which didn't add anything, since if
svn is enabled and you point it at a non-svn checkout, you get both pieces.
The tricky part is add and rename; in both cases the new file can be in
some subdirectory that is not added to svn.
For add, turns out svn has a --parents that will deal with this by adding
the intermediate directories to svn as well.
For rename though, --parents fails if the directories exist but are not
yet in svn -- which is exactly the case, since ikiwiki makes them
by calling prep_writefile. So instead, svn add the parent directory,
recursively.
tl;dr: svn made a reasonable change in dropping the .svn directories from
everywhere, but the semantics of other svn commands, particularly their
pickiness about whether parent directories are in svn or not, means
that without the easy crutch of checking for those .svn directories,
code has to tiptoe around svn to avoid pissing it off.
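(A hedged sketch of the rename-side handling; the $srcfile/$destfile names and
error handling are assumptions, not the exact rcs_rename code.)

    use File::Basename qw(dirname);

    # Illustrative: "svn mv --parents" fails when the destination's parent
    # directory already exists on disk but is not yet versioned, so add
    # that directory first (svn add of a directory recurses by default),
    # then do the move.
    my $parent = dirname($destfile);
    system('svn', 'add', '--quiet', '--parents', $parent) == 0
        || warn "svn add $parent failed";
    system('svn', 'mv', '--quiet', $srcfile, $destfile) == 0
        || warn "svn mv failed";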
|
|
|
|
| |
When reverting, an add is a remove, and a remove is an add.
|
|
|
|
|
|
|
|
| |
installed. Closes: #637606
There's a nice message if the plugin is loaded and used and highlight is
not available, and a nice fallback. So no need for this other warning,
which can happen any time all plugins are loaded to generate a setup file.
|
|
|
|
| |
inlining page.
|
| |
|
|
|
|
|
|
| |
This kind of change is scary, but this particular lock is very simply
used and so it seems ok to make it even just for better portability to
SunOS. (People still use that?)
|
|
|
|
| |
before Image::Magick.
|
| |
|
| |
|
|
|
|
|
| |
Example case was a tag with & in its name, which resulted in a malformed
rss feed.
|
| |
|
| |
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
regexp blowup.
Complex regular subexpression recursion limit (32766) exceeded at
/home/joey/src/ikiwiki/IkiWiki.pm line 1532.
This doesn't fix the blowup potential itself, it just fixes the typo. :)
A sample page that causes the blowup is attached below for future
reference. The first directive is not terminated. Contributing are the
additional quotes around the following directives, which mean that they can
each be processed as a parameter to the first directive, or as an
individual directive. In resolving this ambiguity, the regexp blows up.
Happily, perl contains the explosion, so I don't think there is an exploit
here.
"[[!shortcut name=wiktionary url=\"https://secure.wikimedia.org/wiktionary/en/"
"[[!shortcut name=debss url=\"http://snapshot.debian.net/package/%s\"]]"
"[[!shortcut name=debwiki url=\"http://wiki.debian.org/%s\"]]"
"[[!shortcut name=fdobug url=\"https://bugs.freedesktop.org/show_bug.cgi?id=%s\" desc=\"freedesktop.org bug #%s\"]]"
"[[!shortcut name=fdolist url=\"http://lists.freedesktop.org/mailman/listinfo/%s\" desc=\"%s@lists.freedesktop.org\"]]"
"[[!shortcut name=cpanrt url=\"https://rt.cpan.org/Ticket/Display.html?id=%s\" desc=\"CPAN RT#%s\"]]"
"[[!shortcut name=novellbug url=\"https://bugzilla.novell.com/show_bug.cgi?id=%s\" desc=\"bug %s\"]]"
"[[!shortcut name=fdolist url=\"http://lists.freedesktop.org/mailman/listinfo/%s\" desc=\"%s@lists.freedesktop.org\"]]"
"[[!shortcut name=gnomebug url=\"http://bugzilla.gnome.org/show_bug.cgi?id=%s\" desc=\"GNOME bug #%s\"]]"
"[[!shortcut name=linuxbug url=\"http://bugzilla.kernel.org/show_bug.cgi?id=%s\" desc=\"Linux bug #%s\"]]"
"[[!shortcut name=gmane url=\"http://dir.gmane.org/gmane.%s\" desc=\"gmane.%s\"]]"
"[[!shortcut name=gmanemsg url=\"http://mid.gmane.org/%s\"]]"
"[[!shortcut name=cpan url=\"http://search.cpan.org/search?mode=dist&query=%s\"]]"
"[[!shortcut name=ctan url=\"http://tug.ctan.org/cgi-bin/ctanPackageInformation.py?id=%s\"]]"
"[[!shortcut name=hoogle url=\"http://haskell.org/hoogle/?q=%s\"]]"
"[[!shortcut name=iki url=\"http://ikiwiki.info/%S/\"]]"
"[[!shortcut name=ljuser url=\"http://%s.livejournal.com/\"]]"
"[[!shortcut name=rfc url=\"http://www.ietf.org/rfc/rfc%s.txt\" desc=\"RFC %s\"]]"
"[[!shortcut name=c2 url=\"http://c2.com/cgi/wiki?%s\"]]"
"[[!shortcut name=meatballwiki url=\"http://www.usemod.com/cgi-bin/mb.pl?%s\"]]"
"[[!shortcut name=emacswiki url=\"http://www.emacswiki.org/cgi-bin/wiki/%s\"]]"
"[[!shortcut name=haskellwiki url=\"http://haskell.org/haskellwiki/%s\"]]"
"[[!shortcut name=dict url=\"http://www.dict.org/bin/Dict?Form=Dict1&Strategy=*&Database=*&Query=%s\"]]"
"[[!shortcut name=imdb url=\"http://imdb.com/find?q=%s\"]]"
"[[!shortcut name=gpg url=\"http://pgpkeys.mit.edu:11371/pks/lookup?op=vindex&exact=on&search=0x%s\"]]"
"[[!shortcut name=perldoc url=\"http://perldoc.perl.org/search.html?q=%s\"]]"
"[[!shortcut name=whois url=\"http://reports.internic.net/cgi/whois?whois_nic=%s&type=domain\"]]"
"[[!shortcut name=cve url=\"http://cve.mitre.org/cgi-bin/cvename.cgi?name=%s\"]]"
"[[!shortcut name=cia url=\"http://cia.vc/stats/project/%s\"]]"
"[[!shortcut name=ciauser url=\"http://cia.vc/stats/user/%s\"]]"
"[[!shortcut name=flickr url=\"http://www.flickr.com/photos/%s\"]]"
"[[!shortcut name=man url=\"http://linux.die.net/man/%s\"]]"
"[[!shortcut name=ohloh url=\"http://www.ohloh.net/projects/%s\"]]"
"[[!shortcut name=cpanrt url=\"https://rt.cpan.org/Ticket/Display.html?id=%s\" desc=\"CPAN RT#%s\"]]"
"[[!shortcut name=novellbug url=\"https://bugzilla.novell.com/show_bug.cgi?id=%s\" desc=\"bug %s\"]]"
|
|\ |
|
| |\ |
|
| | | |
|