From 2bb08aa5896e65036e457ccf911e308868438310 Mon Sep 17 00:00:00 2001
From: "https://www.google.com/accounts/o8/id?id=AItOawlSFgIlytGZgMLh_Cw4IA011V8pLKk5dVg"
Date: Thu, 28 Nov 2013 16:56:28 -0400
Subject: Possible code for converting static assets

---
 doc/plugins/write/discussion.mdwn | 44 ++++++++++++++++++++++++---------------
 1 file changed, 27 insertions(+), 17 deletions(-)

diff --git a/doc/plugins/write/discussion.mdwn b/doc/plugins/write/discussion.mdwn
index d4d8f57a3..5afe95f2a 100644
--- a/doc/plugins/write/discussion.mdwn
+++ b/doc/plugins/write/discussion.mdwn
@@ -45,23 +45,33 @@ distributed wiki.
 
 Since there's no mailing list, I'll post my request for help here :-)
 
-I would like to use ikiwiki to build a static site which needs some transformations to be made on binary assets. A simple example is to translate a .odp presentation to .pdf using (e.g.) unoconv. I'd probably make a plugin with a config which maps extensions to shell commands. But what's the right place to hook in to do this?
-
-I can see that binary assets are normally hardlinked or copied verbatim. The logic from `sub render` in `IkiWiki/Render.pm` is:
-
-* If the private hash $rendered{$file} is already set, skip
-* If the extension is known to pagetype(), i.e. it has been registered for the htmlize hook, send content through the full cycle of `genpage(htmlize(linkify(preprocess(filter(readfile)))))`
-* ...except for extensions which start with underscore, in which case the processing is aborted before the write
-* Any file whose extension is unknown to pagetype() is either hardlinked or copied directly to the target directory
-
-Options I can see are:
-
-* Register .odp as a htmlize extension, use the scan hook(), inside there write out the file and alter the page name so that it has an underscore (xxx.odp -> xxx._odp)
-* Use the scan() hook, write out the file, directly manipulate the private %rendered hash to stop `sub render` handling it
-* use needsbuild to build the page as a side effect and at the same time remove it from the list of pages to be built
-* other way??
-
-It's not clear to me which of these is the right way to go, taking into account all the existing logic for rebuilding pages on demand. (For example: if I git add and push a new .odp to the repository, I want the .pdf to be generated automatically in the output site through the post-commit hook)
+I would like to use ikiwiki to build a static site which needs some transformations applied to binary assets. A simple example is translating a .odp presentation to .pdf using (e.g.) unoconv. If I add a new .odp attachment, or push one into the repo, I want the corresponding .pdf to appear in the generated site. What's the right place to hook in to do this?
+
+I've made an experimental prototype which hooks into needsbuild, converts the files then and there, and at the same time removes them from the list of pages to be built.
+
+~~~
+sub needsbuild {
+	my $files=shift;
+	my $nfiles=[];
+	foreach my $f (@$files) {
+		if ($f =~ /\.odp$/) {
+			my $g = $f;
+			$g =~ s/\.odp$/.pdf/;
+			debug("building $f to $g");
+			will_render($f, $g);
+			if (system("unoconv","-f","pdf","-o",IkiWiki::dirname("$config{destdir}/$g"),srcfile($f)) != 0) {
+				error("unoconv: failed to translate $f to $g");
+			}
+		}
+		else {
+			push @$nfiles, $f;
+		}
+	}
+	return $nfiles;
+}
+~~~
+
+It appears to work, but is this the right way to do it, bearing in mind ikiwiki's dependency tracking and the like? And is the usage of will_render() correct?
 
 [[BrianCandler]]
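
A needsbuild hook like the one in this patch is normally registered from a plugin's `import` sub via ikiwiki's standard `hook()` call. Below is a minimal sketch of that wiring, assuming a hypothetical plugin name `odp2pdf`; the hook body is elided in favour of the version in the patch above.

~~~
#!/usr/bin/perl
# Hypothetical wrapper showing how the needsbuild hook from the patch
# could be packaged as a plugin; the name "odp2pdf" is illustrative only.
package IkiWiki::Plugin::odp2pdf;

use warnings;
use strict;
use IkiWiki 3.00;

sub import {
	# Register the hook so ikiwiki passes in the list of changed source
	# files before it starts rendering them.
	hook(type => "needsbuild", id => "odp2pdf", call => \&needsbuild);
}

sub needsbuild {
	my $files=shift;
	# ...convert *.odp files and filter them out of @$files, as in the
	# patch above, then hand the remaining list back to ikiwiki...
	return $files;
}

1
~~~

With this registration in place, pushing a new .odp triggers ikiwiki's refresh via the post-commit wrapper, the hook runs before any pages are rendered, and the converted .pdf is written into the destdir.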