Avoid adding the page matched against as an influence for currently
failing pagespec matches, while still adding any other influences.
This avoids bloating depends_simple with lots of bogus influences when
matching e.g. "!link(done)". The page being tested only needs to be an
influence of such a pagespec if it actually matches.
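For illustration, a rough sketch of what this means for a pagespec match
function; page_links_to() is a hypothetical helper, and the constructor
arguments are only meant to show where influences are (and are not) recorded:

    package IkiWiki::PageSpec;

    # Illustrative only; not the real match_link code.
    sub match_example_link ($$;@) {
        my $page = shift;    # the page being tested against the pagespec
        my $link = shift;

        if (page_links_to($page, $link)) {    # hypothetical helper
            # A successful match: the tested page is a real influence,
            # since a change to its links could change the result.
            return IkiWiki::SuccessReason->new("$page links to $link",
                $page => $IkiWiki::DEPEND_LINKS);
        }
        else {
            # A failing match: do not record $page itself as an influence;
            # only pass along influences gathered from elsewhere (none here),
            # so depends_simple is not bloated by every non-matching page.
            return IkiWiki::FailReason->new("$page does not link to $link");
        }
    }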
With this change, the <span> with class createlink is always created
around the link text, even when no CGI URL is defined. This allows
styling of these 'links' in that case too. The same class is used as when
a CGI URL is defined, so that e.g. clones of the same ikiwiki, one with
CGI and one without, display in the same way (modulo the missing question
mark link).
(cherry picked from commit 290d1b498f00f63e6d41218ddb76d87e68ed5081)
Many calls to file_prune were incorrectly calling it with 2 parameters.
In cases where the filename being checked is relative to the srcdir,
that is not needed.
Made absolute filenames be pruned. (This won't work for the 2 parameter call
style.)
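As an illustration of the calling convention (the function name and
signature here are taken from the commit message above, so treat the details
as assumptions rather than API documentation):

    # Hypothetical scan loop over srcdir-relative filenames.
    foreach my $file (@files) {
        # Old style passed the base directory as a second argument:
        #   next if file_prune($file, $config{srcdir});
        # For srcdir-relative names the second argument is not needed,
        # and absolute filenames are now pruned as well:
        next if file_prune($file);
        render($file);    # hypothetical; whatever processing follows
    }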
* Automatically run --gettime the first time ikiwiki is run on
a given srcdir.
* Optimise --gettime for git, so it's appropriately screamingly
fast. (This could be done for other backends too.)
* However, --gettime for git no longer follows renames.
* Use above to fix up timestamps on docwiki, as well as ensure that
timestamps on basewiki files shipped in the deb are sane.
* Rename --getctime to --gettime. (The old name still works for
backwards compatibility.)
* --gettime now also looks up last modification time.
* Add rcs_getmtime to plugin API; currently only implemented
for git.
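A self-contained sketch of the "one pass over git history" idea behind a
fast rcs_getmtime for git. This is not the actual IkiWiki::Plugin::git code;
the cache layout and parsing are simplified, and, as noted above, a single
pass like this does not follow renames:

    #!/usr/bin/perl
    use warnings;
    use strict;

    my %mtime_cache;

    sub build_mtime_cache {
        # %ct is the committer date as a UNIX timestamp; --name-only lists
        # the files touched by each commit. Commits come newest first, so
        # the first time a file is seen gives its last modification time.
        open(my $log, "-|", "git", "log", "--pretty=format:%ct", "--name-only")
            or die "git log failed: $!";
        my $time;
        while (my $line = <$log>) {
            chomp $line;
            if ($line =~ /^\d+$/) {
                $time = $line;                 # start of a new commit
            }
            elsif (length $line) {
                $mtime_cache{$line} //= $time; # keep the newest commit only
            }
        }
        close $log;
    }

    sub rcs_getmtime_sketch {
        my $file = shift;                      # path relative to the srcdir
        build_mtime_cache() unless %mtime_cache;
        return $mtime_cache{$file};
    }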
This can be a lot faster, since huge numbers of pages are not sorted
only to mostly be thrown away. It sped up a build of my blog by at least
5 minutes.
Conflicts:
debian/NEWS
Also rename cmpspec_translate (internal function) to sortspec_translate
for consistency.
tagged()
Plugins that introduce a link type should also introduce pagespec syntax
for it.
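A hypothetical example of what that looks like for a plugin that records
links of type "tag": the pagespec function simply narrows an ordinary link()
match down to that link type. The names, the tag page location, and the
linktype parameter are illustrative assumptions here, not the real plugin:

    package IkiWiki::PageSpec;

    # Illustrative sketch, not the actual tag plugin.
    sub match_tagged ($$;@) {
        my $page = shift;
        my $glob = shift;
        # Tags are assumed to be stored as links to pages under "tags/";
        # the real location depends on the wiki's configuration.
        return match_link($page, "tags/$glob", linktype => "tag", @_);
    }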
Both markdown and tidy add paragraph tags around text; these need to be
stripped when the text is a short, one-line fragment that is being inserted
into a larger page. tidy also adds several newlines to the end, and this
broke removal of the paragraph tags.
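A small, self-contained sketch of the kind of stripping involved (the real
code differs; this only illustrates why the trailing newlines from tidy
matter):

    use warnings;
    use strict;

    # Sketch only; the real ikiwiki code differs.
    sub strip_paragraph {
        my $html = shift;
        $html =~ s/\n+$//;                     # tidy appends several newlines
        $html =~ s/^<p[^>]*>(.*)<\/p>$/$1/si;  # unwrap one wrapping paragraph
        return $html;
    }

    # With the trailing newlines still in place, the unwrapping pattern
    # would not match; stripping them first is what makes it work again.
    print strip_paragraph("<p>a short fragment</p>\n\n\n"), "\n";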
pagespec_translate may set $@ if it fails to parse a pagespec, but
due to memoization, this is not reliable. If a memoized call is repeated,
and $@ is already set for some other reason previously, it will remain
set through the call to pagespec_translate.
Instead, just check if pagespec_translate returns undef.
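In other words, callers should look roughly like this sketch (the
surrounding names are illustrative; the point is relying on the return value
rather than $@):

    sub match_pagespec_sketch {
        my ($spec, $page, %params) = @_;

        my $sub = IkiWiki::pagespec_translate($spec);
        if (! defined $sub) {
            # The pagespec failed to parse; report that directly instead
            # of trusting $@, which memoization can leave stale.
            return IkiWiki::ErrorReason->new("syntax error in pagespec \"$spec\"");
        }
        return $sub->($page, %params);
    }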
could lead to bad dependency handling in certain situations.
Finally removed the last hardcoding of IkiWiki::Setup::Standard.
Take the first "IkiWiki::Setup::*" in the setup file to define the
setuptype, and remember that type to use in dumping later. (But it can be
overridden using --set, etc.)
Also, support setup file types that are not evaled.
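A rough sketch of the detection rule described above (not the actual
IkiWiki::Setup code):

    # Take the first "IkiWiki::Setup::Foo" mentioned in the setup file as
    # its type, falling back to the traditional Standard format.
    sub detect_setuptype {
        my $content = shift;    # raw text of the setup file
        if ($content =~ /(IkiWiki::Setup::\w+)/) {
            return $1;
        }
        return "IkiWiki::Setup::Standard";
    }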
The POSIX perl module exports a huge number of functions by default, so
make sure all imports are qualified. (And remove one that was not
necessary.)
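For example, with an empty import list POSIX exports nothing, and each call
is written fully qualified:

    use POSIX ();

    my $stamp = POSIX::strftime("%Y-%m-%d %H:%M", localtime(time));
    my $pages = POSIX::ceil(11 / 4);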
Precompile the regexp, rather than rebuilding on every call.
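The difference, in a small self-contained example (the pattern here is made
up; only the qr// precompilation is the point):

    use warnings;
    use strict;

    my @banned = ("spammer", "troll");

    # Compiled once, when the module loads...
    my $banned_regexp = qr/^(@{[join "|", map quotemeta, @banned]})$/i;

    sub is_banned {
        return $_[0] =~ $banned_regexp;
    }

    # ...instead of rebuilding and recompiling the same pattern on each call:
    sub is_banned_slow {
        my $re = "^(" . join("|", map quotemeta, @banned) . ")\$";
        return $_[0] =~ /$re/i;
    }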
files, such as .htaccess, that would normally be skipped for security or other reasons. Closes: #447267 (Thanks to Aaron Wilson for the original patch.)
used by Yahoo and Google URLs.
destdir, as well as wrappers and the .ikiwiki directory.
As I was adding ngettext support, I realized I could optimize the gettext
functions by memoizing the creation of the gettext object. Note that
the object creation is still deferred until a gettext function is called,
to avoid unnecessary startup penalties on code paths that do not need
gettext.
A side benefit is that separate stub functions are no longer needed to
handle the C language case.
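A sketch of the shape of that change, assuming Locale::gettext's object
interface (the domain name and fallbacks here are illustrative, not the
actual ikiwiki code):

    my $gettext_obj;

    sub _gettext_obj {
        unless (defined $gettext_obj) {
            # Deferred until the first gettext call, then cached; a false
            # value is cached too, so a missing Locale::gettext is only
            # probed once.
            $gettext_obj = eval {
                require Locale::gettext;
                Locale::gettext->domain("ikiwiki");
            } || 0;
        }
        return $gettext_obj;
    }

    sub gettext {
        my $obj = _gettext_obj();
        return $obj ? $obj->get($_[0]) : $_[0];
    }

    sub ngettext {
        my ($singular, $plural, $count) = @_;
        my $obj = _gettext_obj();
        # No separate stub needed for the untranslated case: the same
        # functions fall back to returning their arguments.
        return $obj ? $obj->nget($singular, $plural, $count)
                    : ($count == 1 ? $singular : $plural);
    }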
only being edited via users authed with httpauth.
Not yet exported, as only 4 quite core plugins use it.
That was dead code; changes to lockedit and recentchanges removed the last
callers.
This was probably not noticed because it only results in a warning, and in
the checkcontent diff having some unchanged lines in it.
bestlink was looking at whether %links existed for a page in order to tell
if the page exists, but just-deleted pages still have entries in there (for
reasons it may be best not to explore). So bestlink would return
just-deleted pages. Instead, make bestlink use %pagesources.
Also, when finding a deleted page, %pagecase was not cleared of that page.
This, again, made bestlink return just-deleted pages. Now that is cleared.
Fixing bestlink exposed another issue though. The backlink calculation code
uses bestlink. So when a page was deleted, no backlinks to it were found,
and pages that really did backlink to it were not updated, and had broken
links.
To fix that, the code that actually removes deleted pages had to be split
out from find_del_files, so it can run a bit later. It is run just after
backlinks are calculated. This way, backlink calculation still sees the
deleted pages, but everything afterwards does not.
However, this does not address the original bug report that started this
whole thing, [[bugs/bestlink_returns_deleted_pages]], because there
bestlink is run in the needsbuild hook. That happens before backlink
calculation, so bestlink still returns deleted pages at that point; the
same applies in the scan hook.
If bestlink needs to work consistently during those hooks, a more involved
fix will be needed.
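The core of the existence test change, as a simplified sketch (the real
bestlink also walks up the directory hierarchy and consults userdir):

    sub page_exists_sketch {
        my $page = shift;
        # %links can still hold entries for just-deleted pages, so it is not
        # a reliable existence test; %pagesources only lists pages that
        # currently have a source file, and %pagecase is now cleared of
        # deleted pages too.
        return exists $IkiWiki::pagesources{$page}
            || exists $IkiWiki::pagecase{lc $page};
    }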
This avoids unnecessary influences being recorded from pagespecs
such as "link(done) and bugs/*", when a page cannot ever possibly
match.
A pagespec term that returns a value without influence is an influence
blocker. If such a blocker has a false value (possibly due to being
negated) and is ANDed with another term, it blocks that term's influence
from propagating out.
If the term is ORed, or has a true value, it does not block influence.
(Consider "link(done) or bugs/*" and "link(done) and !nosuchpage")
In the implementation in merge_influence, I had to be careful to never
negate $this or $other when testing if they are an influence blocker,
since negation mutates the object. Thus the slightly weird if statement.
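A paraphrase of that rule as code (this is not the real merge_influence;
has_influences, add_influences and clear_influences are stand-in method
names for however the result objects expose their influence lists):

    sub merge_influence_sketch {
        my ($this, $other, $anded) = @_;

        # The objects are tested directly ("$this ||"), never negated,
        # because negating these result objects mutates them.
        if (! $anded                                   # ORed terms always merge
            || (($this  || $this->has_influences)
             && ($other || $other->has_influences))) {
            $this->add_influences($other->influences);
        }
        else {
            # A false term with no influences of its own, ANDed in: it
            # blocks the other term's influences from propagating out.
            $this->clear_influences();
        }
    }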
I made match_* functions whose influences can vary depending on the page
matched set a special "" influence to indicate this.
Then add_depends can try just one page, and if static influences are found,
stop there.