Commit messages
The cgi already shows a fullscreen map, so having this other option to
do it seems redundant. Also, layering a fullscreen map over an existing
wiki page doesn't look very good to me (and prevents editing the page,
etc.).
This was not set anywhere, which caused their javascript to crash.
It *seems* the idea is that this is the url used to view the map
fullscreen, which uses ikiwiki.cgi.
* fix will_render calls to pass proper relative filenames
* fix urls to kml etc files to not assume wiki's top is at /
* avoid building the javascript to display the map in two different
ways between the cgi and on-page maps
* refactor duplicate code
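A sketch of what the first two fixes imply, with hypothetical names
(will_render takes a destdir-relative filename; urlto builds links that
don't assume the wiki's top is at /):

    my $map = "map/pois.kml";          # hypothetical output file
    will_render($page, $map);          # relative filename, not absolute
    my $kmlurl = urlto($map, $page);   # relative url, works under any prefix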
This hook involves urlto, and that needs to have state loaded to work
in all situations.
Note that I can see no reason for the osm plugin to use a cgi hook at all.
This could just as well be a static html page!
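A sketch of the ordering this implies for such a hook; the hook body
here is hypothetical, with IkiWiki::loadindex as the state-loading
call:

    sub cgi ($) {
        my $cgi = shift;
        IkiWiki::loadindex();  # urlto needs state loaded in all situations
        # ... emit the fullscreen map page ...
    }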
Foo::Bar->can("method") works just as well, even if Foo::Bar is not
loaded. Using UNIVERSAL::can as a function is deprecated.
But I was unable to easily eliminate conditional.pm's use of
UNIVERSAL::can.
In principle, building any pages affected by links, backlinks etc.
could work the same way.
it is put into kml files, etc
Build links the right way.
This also involved dropping the leading slash from osm_default_icon.
And since that would have required changing the old osm_tag_icons too,
I just removed that relic.
It just didn't work; it also didn't use writefile, which is
undesirable for security. Fixed both issues.
Also removed some unnecessary debug messages.
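For illustration, writing through ikiwiki's writefile instead of a raw
open lets it vet the filename and create the file safely under the
destination directory (the path and content variable are hypothetical):

    writefile("map/pois.kml", $config{destdir}, $kml);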
ikiwiki source files can contain at least one character that
needs to be escaped in a url: +
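For example, with URI::Escape (the module choice is illustrative, not
necessarily what ikiwiki uses here):

    use URI::Escape ();
    # "+" would otherwise decode as a space in query strings
    my $e = URI::Escape::uri_escape("foo+bar.mdwn");   # "foo%2Bbar.mdwn"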
Add an underlay for the osm plugin.
Update links to the right path to the icon. Note that the osm plugin
has a pervasive bug in how it links to icons: it assumes the site is
at /. I did not attempt to fix that; it should be using urlto() to
make a correct relative link.
no code changes
I think it's the wrong encoding (it looks like mojibake to me), but it
works now. Closes: #661198
Removing the version means that rebuilds are reproducible over time.
Both the generator tag and its version attribute are optional:
http://tools.ietf.org/html/rfc4287#section-4.2.4
When the wiki is in a subdir of the git repo, a web revert would show
up in recentchanges as e.g. doc/index, instead of just index.
This happened because decode_git_file caches a $prefix that is
dependent on the $git_dir setting, and the revert code runs with a
different $git_dir, which polluted the $prefix for later.
Fix this by adding a with_git_dir that juggles the variables properly.
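A sketch of the shape with_git_dir could take; the save/restore
details here are assumed, not copied from the actual fix:

    sub with_git_dir ($$) {
        my ($dir, $code) = @_;
        my ($olddir, $oldprefix) = ($git_dir, $prefix);
        ($git_dir, $prefix) = ($dir, undef);  # force $prefix to be re-derived
        my @ret = eval { $code->() };
        my $err = $@;
        ($git_dir, $prefix) = ($olddir, $oldprefix);
        die $err if $err;
        return @ret;
    }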
Its strftime comes from Date::Format and doesn't have the problem;
using the POSIX one breaks its %o.
strftime is a C function; it does not return decoded utf8.
Several places in ikiwiki manually decoded it, but at least two
forgot to.
Also, strftime might not even return encoded utf8, if LC_TIME is set
to a non-utf8 value. Went ahead and supported decoding whatever
encoding it uses.
The remaining direct calls to strftime() are all ones that first set
LC_TIME=C, in order to get times that are not for human display.
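One way to do that, sketched from the description: take the encoding
from LC_TIME and decode strftime's output with it, falling back to the
raw output for locales like C that name no encoding:

    use POSIX ();
    use Encode ();

    my $strftime_encoding;
    sub strftime_utf8 {
        # e.g. "en_US.UTF-8" yields "UTF-8"; stays undef for locales like "C"
        ($strftime_encoding) = POSIX::setlocale(POSIX::LC_TIME()) =~ /\.([^@]+)/
            unless defined $strftime_encoding;
        return defined $strftime_encoding
            ? Encode::decode($strftime_encoding, POSIX::strftime(@_))
            : POSIX::strftime(@_);
    }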
Extract cvs_keyword_subst_args() and ensure it runs in $config{srcdir}.
Using Perl's -T operator appears to work equally well, perhaps switch?
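The suggested -T variant might look like this; File::chdir's $CWD is
used here as an assumed convenience for running in $config{srcdir}:

    use File::chdir;   # provides the $CWD tied variable

    sub cvs_keyword_subst_args ($) {
        my $file = shift;
        local $CWD = $config{srcdir};
        # perl's -T heuristic: text files get default keyword
        # substitution, binary files get cvs's -kb
        return (-T $file) ? () : ('-kb');
    }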
In the code:
* general plugin API calls (in plugins/write order),
* VCS plugin API calls (in plugins/write order), then
* internal support routines (in alphabetical order).
In the tests:
* general meta-behavior (in no particular order, yet),
* general plugin API calls (in plugins/write order),
* VCS plugin API calls (in plugins/write order), then
* internal support routines (in semi-logical order).
https://rt.cpan.org/Ticket/Display.html?id=74487
Gave up trying to support multiple YAML backends. The XS one requires ugly
manual encoding to get unicode right, and doesn't allow dumping yaml
fragments w/o the yaml header, but at least it doesn't randomly crash
on import like YAML::Mo has started to.
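The manual encoding in question looks roughly like this: YAML::XS
deals in UTF-8 bytes, so its output has to be decoded back to perl
characters (the data variable is hypothetical):

    use YAML::XS ();
    use Encode ();

    my $yaml = Encode::decode_utf8(YAML::XS::Dump($data));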
A diff was already truncated after 200 lines. But it could still be
arbitrarily enormous, if a spammer or other random noise source likes
long lines. That could use a lot of memory to html-encode the diff,
etc., and fill it into the template. Truncating after 100kb seems
sufficient; it allows for 200 lines of up to 512 characters each.
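A sketch of the resulting double cap, using the numbers from the
message (variable names hypothetical):

    my $maxlines = 200;
    my $maxbytes = 100 * 1024;   # 200 lines of up to 512 characters each
    my @lines = split /\n/, $diff;
    splice @lines, $maxlines if @lines > $maxlines;
    $diff = join("\n", @lines);
    $diff = substr($diff, 0, $maxbytes) if length($diff) > $maxbytes;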
markdown discount engine, when maximum compatibility is needed.
Empty input, or input consisting solely of whitespace,
caused an uninitialized value warning.
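The usual guard for that (the function name is hypothetical):

    sub normalize_input {
        my $text = shift;
        # empty or all-whitespace input is treated as empty instead of
        # tripping an uninitialized value warning later
        return "" unless defined $text && $text =~ /\S/;
        return $text;
    }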
mdwn: Can use the discount markdown library, via the
Text::Markdown::Discount perl module.
This is preferred if available, since it's the fastest currently
supported markdown library, speeding up markdown rendering by a factor
of 40. That is to say, when only rendering a lot of markdown, discount
is 40x faster. When building an ikiwiki site, ikiwiki's other overhead
gets in the way, but I still see significant speedups. Building the
ikiwiki docwiki dropped from 62 to 45 seconds, for example.
However, when multimarkdown is enabled, Text::MultiMarkdown is still
used.
While discount contains some nonstandard markdown extensions,
including tables and footnotes, AFAICS most of them are not enabled
by default in the perl bindings.
I consider sticking to non-extended markdown a desirable thing, since
this is probably not the last markdown engine. In particular, sundown
is waiting in the wings to get packaged and get a perl binding.
----
Reviewing all the discount extensions, here are the ones that are
enabled:
* centered paragraphs: ->centered<-
* image sizes: [dust mite](http://dust.mite =150x150)
* <style>..</style> blocks are eaten. The perl binding does not
  provide access to the gathered CSS. This is not legal html anyway,
  so unlikely to cause breakage.
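Usage is intended as a drop-in for Text::Markdown's function
interface; a minimal sketch (the input variable is hypothetical):

    use Text::Markdown::Discount ();

    my $html = Text::Markdown::Discount::markdown($mdwn_source);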