Feed the Federation

We consider the best way to feed live data into the federation without making a nuisance of oneself.

See Active Feeds

First of all, understand that the federation is a network of gardens tended by thoughtful humans. The journal reflects the evolution of that thought, not the mechanical heartbeat of a data stream.

A site is, of course, free to produce pages by any means it sees fit and to store them as recognizable JSON. We encourage this in Federating Foreign Servers.

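For concreteness, here is a minimal sketch of the page shape such a site might emit, written out as an object literal. The title, id, and text are invented for illustration.

```typescript
// A minimal sketch of the JSON shape a foreign server might emit for one page.
// The title, id, and text here are invented for illustration.

const page = {
  title: 'Rainfall This Week',
  story: [
    {
      type: 'paragraph',
      id: 'a1b2c3d4e5f60718',
      text: 'Totals gathered from our rooftop sensors.'
    }
  ],
  journal: [
    {
      type: 'create',
      item: { title: 'Rainfall This Week', story: [] },
      date: Date.now()
    }
  ]
}

// Serialize and store wherever this server keeps its pages, as ordinary JSON.
console.log(JSON.stringify(page, null, 2))
```
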
We've tried various mechanisms for feeding fresh data into JSON pages and have found some of them to be obnoxious. A better approach is the one adopted by the JSON plugin, where update history is contained within the plugin item and reported upon viewing. See About JSON Plugin

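We haven't reproduced the plugin's exact fields here; the sketch below only illustrates the general idea of an item that carries its own dated update history so the page's journal stays quiet. Fields beyond type, id, and text are hypothetical.

```typescript
// Hypothetical sketch only: an item that carries its own dated update
// history instead of writing a journal entry for every refresh. Fields
// other than type, id, and text are invented for illustration.

const item = {
  type: 'json',
  id: 'b2c3d4e5f6071829',
  text: 'Gauge readings, refreshed hourly.',
  history: [
    { date: Date.parse('2024-05-01T06:00:00Z'), size: 1280 },
    { date: Date.parse('2024-05-01T07:00:00Z'), size: 1312 }
  ]
}

// Automation updates the item in place and appends to its private history,
// which the plugin can then report when someone views the page.
function refresh(payload: string) {
  item.text = payload
  item.history.push({ date: Date.now(), size: payload.length })
}

refresh('Gauge readings as of this hour.')
```
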
We've also built libraries that construct whole pages through a series of calls, each adding an element to the page along with a corresponding dated journal entry, often only a millisecond after the previous one. This has never proven useful.

Best practice for one-time constructions is to add a complete story to the page and to record its completeness at creation by including that story as the story field of a single create action in the journal. Subsequent human edits are then correctly recorded.
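
A sketch of what that looks like, with the whole story repeated inside the single create action; ids and text are invented.

```typescript
// Sketch of a one-time construction: the complete story appears both as the
// page's story and as the story field of a single create action, so later
// human edits append to the journal as usual. Ids and text are invented.

const story = [
  { type: 'paragraph', id: 'c3d4e5f607182a3b', text: 'Survey results for spring.' },
  { type: 'paragraph', id: 'd4e5f607182a3b4c', text: 'Collected once by the field script.' }
]

const page = {
  title: 'Spring Survey',
  story,
  journal: [
    { type: 'create', item: { title: 'Spring Survey', story }, date: Date.now() }
  ]
}

console.log(JSON.stringify(page, null, 2))
```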

Aside: We will need to check that we do the right thing in all cases now that forking back to the create is interpreted as delete. Theory: if it was created whole once, it can be created whole again. If not, fork a copy and revert to this fork to find the original.

Now, consider this new approach for wrapping computer-generated content in curated pages. Create and revise the wrapper as one would for a plugin's about pages. A temporary wiki on localhost works great for this. Leave a Factory plugin where fresh data will be inserted. Then have automation replace this item with original content on each update.
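
One way the replacement step might look, assuming the automation reads and writes the page file directly; the path, placeholder id, and generated text are assumptions.

```typescript
import { readFileSync, writeFileSync } from 'fs'

// Sketch: swap the placeholder for freshly generated content, reusing the
// placeholder's id so each later run finds the same slot. The path, id,
// and generated text are assumptions for illustration.

const path = '/path/to/pages/rainfall-this-week'   // hypothetical page file
const page = JSON.parse(readFileSync(path, 'utf8'))

const slotId = 'e5f607182a3b4c5d'                   // id of the placeholder item
const slot = page.story.findIndex((item: any) => item.id === slotId)

if (slot >= 0) {
  page.story[slot] = {
    type: 'paragraph',
    id: slotId,
    text: `Fresh totals as of ${new Date().toISOString()}`
  }
  writeFileSync(path, JSON.stringify(page, null, 2))
}
```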

Conceptually, the automation is forking the curated page and might add a dated fork action to the journal. We ignore forks when dating pages for sitemaps and recent changes, so this mechanical timestamp remains polite.
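
If the automation does record its work, the entry might look like the dated fork action sketched below, and dating code could stay polite by skipping it; the site value is invented.

```typescript
// Sketch of the dated fork action such automation might append, and of how
// dating code could stay polite by skipping it. The site value is invented.

const journal: Array<{ type: string; site?: string; date: number }> = []

journal.push({ type: 'create', date: Date.parse('2024-04-01T12:00:00Z') })
journal.push({ type: 'fork', site: 'sensors.example.com', date: Date.now() })

// One reading of the rule: date the page by its last non-fork action.
const lastCuratedAction = [...journal].reverse().find(action => action.type !== 'fork')
console.log(lastCuratedAction?.date)
```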

This might not work so well for automation that produces a variable number of items, as is done with Recent Farm Activity. That is an old script that predates Rosters and would be much better handled by creating and replacing a single Roster in an otherwise human-curated page.
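
A sketch of regenerating that single Roster, assuming the roster item lists one site per line in its text field; the heading, domains, and id are invented and should be checked against the Roster plugin's actual conventions.

```typescript
// Sketch: regenerate a single roster item in an otherwise human-curated page,
// assuming the roster's text lists one site per line under a heading. The
// heading, domains, and id are invented for illustration.

const activeSites = ['alice.example.com', 'bob.example.com', 'carol.example.com']

const rosterItem = {
  type: 'roster',
  id: 'f607182a3b4c5d6e',
  text: ['Recent Farm Activity', '', ...activeSites].join('\n')
}

console.log(rosterItem.text)
```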

Aside: But who will feed new sites into the scrape if a sitemap somewhere isn't incremented upon their arrival? Perhaps an exception to this date logic could be made when a new site is detected?

We can now offer server admins enhancements in the form of server-side plugins. Let us consider what we might expect of a Residents Plugin.

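As a rough sketch of what such a plugin could offer, the handler below exposes one read-only endpoint; the startServer hook, route, and directory layout are assumptions rather than a documented interface.

```typescript
import { readdirSync } from 'fs'

// Rough sketch of a server-side enhancement: one read-only route reporting
// who has pages on this server. The startServer signature, route, and
// directory layout are assumptions, not a documented interface.

export function startServer(params: { app: any; argv: { db: string } }) {
  const { app, argv } = params

  app.get('/plugin/residents/summary', (_req: any, res: any) => {
    const slugs = readdirSync(argv.db)      // assume one file per page
    res.json({ pages: slugs.length, sample: slugs.slice(0, 10) })
  })
}
```
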
We could encourage feeds that update a specific item in place, much as an edit action might, but with journal modification optional or omitted entirely. Perhaps a Wiki Edit Command Line would do.

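Such a feed might update one story item in place and record an edit action only when asked. A hedged sketch, with any command-line wrapper left aside; the function name and flag are invented.

```typescript
// Sketch of an in-place item update with journal modification made optional.
// The page shape follows the usual story/journal convention; the function
// name and flag are invented, as a command-line wrapper might call them.

type Item = { type: string; id: string; text?: string }
type Action = { type: string; id?: string; item?: Item; date: number }
type Page = { title: string; story: Item[]; journal: Action[] }

function updateItem(page: Page, replacement: Item, recordEdit = false): Page {
  const index = page.story.findIndex(item => item.id === replacement.id)
  if (index < 0) return page

  page.story[index] = replacement
  if (recordEdit) {
    page.journal.push({ type: 'edit', id: replacement.id, item: replacement, date: Date.now() })
  }
  return page
}
```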