Diffstat (limited to 'doc/todo/inlines_inheriting_links.mdwn')
-rw-r--r--  doc/todo/inlines_inheriting_links.mdwn  25
1 files changed, 25 insertions, 0 deletions
diff --git a/doc/todo/inlines_inheriting_links.mdwn b/doc/todo/inlines_inheriting_links.mdwn
index 12531990c..56f18418d 100644
--- a/doc/todo/inlines_inheriting_links.mdwn
+++ b/doc/todo/inlines_inheriting_links.mdwn
@@ -18,3 +18,28 @@ This is not just an ugly workaround. The availability of this feature has some r
So in a sense, in some or most cases, it would indeed be cleaner to "store" the definition of a class of pages referred to in complex pagespecs as a separate object. And the most natural representation for this definition of a class of pages (adhering to the principle of wiki that what you mean is entered/stored in its most natural representation, not through some hidden disconnected code) is making a page with an inline/map/or the like, so that at the same time you store the definition and you see what it is (the set of pages is displayed to you).
I would actually use it in my current "project" in ikiwiki: I actually edit a set of materials as a set of subpages `new_stuff/*`, and I also want to have a combined view of all of them (made through inline), and at another page, I want to list what has been linked to in `new_stuff/*` and what hasn't been linked to.--Ivan Z.
+
+> I see where you're coming from, but let's think about
+> implementation efficiency for a second.
+>
+> In order for inline inheritlinks=yes to work,
+> the inline directive would need to be processed
+> during the scan pass.
+>
+> When the directive was processed there, it would need
+> to determine which pages get inlined (itself a moderately
+> expensive operation), and then determine which pages
+> each of them links to. Since the scan pass is unordered,
+> those pages may not have themselves been scanned yet.
+> So to tell what they link to, inline would have to load
+> each of them, and scan them.
+>
+> And that would happen on *every* build of the wiki,
+> even if the page with the inline didn't change. So
+> there's the potential for this to really badly slow
+> down a wiki build.
+>
+> Maybe there's the potential to add some really smart
+> caching code that avoids unnecessary re-scanning
+> and is really quick.. but I suspect it would be *very*
+> complex too. --[[Joey]]
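
To make the cost concern in the comment above concrete, here is a rough, illustrative sketch (Python rather than ikiwiki's actual Perl; the page names, the pagespec predicate, and the hypothetical inheritlinks=yes behaviour are assumptions made up for the example) of the extra work the scan pass would have to do:

    # Rough model of the scan pass described above.  Illustrative only:
    # not ikiwiki code; names and behaviour are hypothetical.
    import re

    def scan_links(source):
        """Extract [[wikilink]] targets from page source (directives excluded)."""
        return [l for l in re.findall(r'\[\[([^\]|]+)\]\]', source)
                if not l.startswith('!')]

    def scan_pass(pages, inline_specs):
        """pages: {name: source}; inline_specs: {inlining page: pagespec predicate}."""
        links = {name: set() for name in pages}
        scanned = set()

        # The scan pass visits pages in no particular order.
        for name, source in pages.items():
            links[name].update(scan_links(source))
            scanned.add(name)

            spec = inline_specs.get(name)
            if spec is None:
                continue

            # A hypothetical inheritlinks=yes inline must already know, at scan
            # time, which pages it inlines: a pagespec match over the whole wiki.
            matched = [p for p in pages if spec(p)]

            # It must also know what each matched page links to.  Because the
            # scan pass is unordered, a matched page may not have been scanned
            # yet, so its source has to be loaded and scanned here as well --
            # and this happens on every build, even if the inlining page
            # itself is unchanged.
            for target in matched:
                inherited = (links[target] if target in scanned
                             else set(scan_links(pages[target])))
                links[name].update(inherited)

        return links

    pages = {
        "new_stuff/index": '[[!inline pages="new_stuff/*" inheritlinks=yes]]',
        "new_stuff/a": "see [[topic_one]]",
        "new_stuff/b": "see [[topic_two]]",
    }
    specs = {"new_stuff/index":
             lambda p: p.startswith("new_stuff/") and p != "new_stuff/index"}
    print(scan_pass(pages, specs)["new_stuff/index"])
    # -> the links inherited from the inlined pages: topic_one, topic_two

The pagespec match plus the per-page re-scan is extra work proportional to the number of inlined pages on every refresh, whether or not the inlining page changed; a cache of the inlined pages' links could avoid it, but keeping that cache correct across edits is where the complexity mentioned above would come from.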