Here's a scary idea: a plugin that can aggregate feeds from other
locations. Presumably there would need to be a cron job to build the wiki
periodically, and each time it's built, any new items would be turned into
pages, etc. There might also need to be a way to expire old items, unless
you wanted to keep them forever.
This would allow ikiwiki to work as a kind of planet, or at least a
poor man's news aggregator.
* XML::Feed has a very nice interface, though it may require valid feeds
(see the sketch after this list).
* How to store GUIDs? Maybe as meta tags on pages, although that would need
caching of such metadata somewhere.
* How to configure which feeds to pull, how often, and where to put the
pulled entries? One way would be a command line option or config file, but
I think it would be better to use preprocessor directives in a wiki page,
probably the same page that inlines all the pages together.
* Where to store when a feed was last pulled?
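A quick sketch of what pulling a single feed with XML::Feed might look
like (the URL is just a placeholder and error handling is minimal; the
point is only that each entry already exposes everything a page-creating
plugin would need):

    #!/usr/bin/perl
    # Sketch: fetch one feed with XML::Feed and dump the bits a plugin
    # would turn into pages (guid, title, link, date).
    use strict;
    use warnings;
    use XML::Feed;
    use URI;

    my $url  = URI->new("http://example.com/index.rss");   # placeholder
    my $feed = XML::Feed->parse($url)
        or die "cannot parse feed: " . XML::Feed->errstr;

    print "Feed: ", $feed->title, "\n";
    foreach my $entry ($feed->entries) {
        # Each entry has a GUID, title, link and timestamp -- enough to
        # decide whether it has been seen before and to create a page.
        print "guid:  ", ($entry->id || $entry->link), "\n";
        print "title: ", $entry->title, "\n";
        print "link:  ", $entry->link, "\n";
        print "date:  ", ($entry->issued ? $entry->issued->epoch : "unknown"), "\n\n";
    }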
So I need:
* A way to store info from the preprocessor directives about what pages
to pull and expiry (a rough sketch follows this list).
* A way to store info on last pull time, guids, etc.
* A switch for a mode that a) pulls, b) expires old items, and c) rebuilds
the wiki (for cron)
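To make that concrete, here is a very rough sketch of how a hypothetical
"aggregate" directive could record what to pull, and how the pull state
(last pull time, seen GUIDs) could be kept. The directive name, its
parameters, and the state file location are all invented for illustration,
not an existing interface:

    #!/usr/bin/perl
    # Sketch only: an invented "aggregate" directive handler plus a state
    # file for last pull times and seen GUIDs.  Directive syntax, parameter
    # names, and the state file location are all made up for illustration.
    use strict;
    use warnings;
    use Storable qw(retrieve nstore);

    my $statefile = "/var/lib/ikiwiki/aggregate.state";    # invented path
    my $state = -e $statefile ? retrieve($statefile) : {};

    # Would be called for something like:
    #   [[aggregate feedurl="http://example.com/index.rss" updateinterval=60 expireage=90]]
    sub preprocess_aggregate {
        my %params = @_;
        my $url = $params{feedurl} or return "[aggregate: feedurl missing]";

        # Remember what to pull and when to expire it; the pulling itself
        # would happen later, in the cron-driven mode.
        $state->{feeds}{$url} = {
            updateinterval => $params{updateinterval} || 60,    # minutes
            expireage      => $params{expireage},               # days, optional
            lastpull       => $state->{feeds}{$url}{lastpull} || 0,
            guids          => $state->{feeds}{$url}{guids} || {},
        };
        nstore($state, $statefile);
        return "";    # the directive itself renders as nothing
    }

A cron job would then run ikiwiki with some new switch (say, --aggregate)
that walks the stored feed list, pulls anything whose updateinterval has
passed, writes new entries out as pages, records their GUIDs, expires
anything older than expireage, and finally rebuilds the wiki.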