path: root/doc/todo/aggregation.mdwn
author    joey <joey@0fa5a96a-9a0e-0410-b3b2-a0fd24251071>  2006-07-30 04:31:08 +0000
committer joey <joey@0fa5a96a-9a0e-0410-b3b2-a0fd24251071>  2006-07-30 04:31:08 +0000
commit    e49ff966a39d1037ccbf168b8dbd12618cf1b41e (patch)
tree      cedd803bbefbef0f8fded40fa084bd1eeb9394b1 /doc/todo/aggregation.mdwn
parent    7bb447675fced4861eeec9e687935b9eedf98033 (diff)
download  ikiwiki-e49ff966a39d1037ccbf168b8dbd12618cf1b41e.tar
          ikiwiki-e49ff966a39d1037ccbf168b8dbd12618cf1b41e.tar.gz
* ikiwiki can now download and aggregate feeds with its new aggregate
  plugin, so it's possible to implement a Planet using ikiwiki!
* --setup --refresh no longer rebuilds wrappers. Use --setup --refresh
  --wrappers to do that.
* Add %IkiWiki::forcerebuild to provide a way for plugins like aggregate
  to update pages that haven't changed on disk.
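The wrapper-rebuild change affects the usual refresh invocation. A sketch of the two command lines as described in the commit message (the setup file name `ikiwiki.setup` is only a placeholder for your own setup file):

```shell
# Refresh the wiki; with this change, --refresh alone no longer
# rebuilds the CGI/post-commit wrappers
ikiwiki --setup ikiwiki.setup --refresh

# Pass --wrappers explicitly to rebuild the wrappers as well
ikiwiki --setup ikiwiki.setup --refresh --wrappers
```

This makes periodic refreshes (e.g. from a cron job driving the aggregate plugin) cheaper, since wrapper compilation is skipped unless requested.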
Diffstat (limited to 'doc/todo/aggregation.mdwn')
-rw-r--r--  doc/todo/aggregation.mdwn | 25 +------------------------
1 file changed, 1 insertion(+), 24 deletions(-)
diff --git a/doc/todo/aggregation.mdwn b/doc/todo/aggregation.mdwn
index 7d765f9e9..53b3133e2 100644
--- a/doc/todo/aggregation.mdwn
+++ b/doc/todo/aggregation.mdwn
@@ -1,24 +1 @@
-Here's a scary idea.. A plugin that can aggregate feeds from other
-locations. Presumably there would need to be a cron job to build the wiki
-periodically, and each time it's built any new items would be turned into
-pages etc. There might also need to be a way to expire old items, unless
-you wanted to keep them forever.
-
-This would allow ikiwiki to work as a kind of a planet, or at least a
-poor-man's news aggregator.
-
-* XML::Feed has a very nice interface, may require valid feeds though.
-* How to store GUIDs? Maybe as meta tags on pages, although that would need
- caching of such metadata somewhere.
-* How to configure which feeds to pull, how often, and where to put the
- pulled entries? One way would be command line/config file, but I think
- better would be to use preprocessor directives in a wiki page, probably
- the same page that inlines all the pages together.
-* Where to store when a feed was last pulled?
-
-So I need:
-
-* A way to store info from the preprocessor directives about what pages
- to pull and expiry.
-* A way to store info on last pull time, guids, etc.
-* Switch for a mode that a) pulls b) expires old c) rebuilds wiki (for cron)
+* Still need to support feed expiry.
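The configuration question raised in the removed todo text ("how to configure which feeds to pull") was answered with a preprocessor directive on a wiki page. A hedged sketch of such a directive, with parameter names taken from the aggregate plugin's documentation and a placeholder feed (the values here are illustrative, not from this commit):

```
[[aggregate name="example blog" dir="example"
  feedurl="http://example.com/index.rss"
  url="http://example.com/" updateinterval="15"]]
```

Placing the directive on the same page that inlines the aggregated pages keeps the feed list and its presentation together, as the todo text suggested.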