User:Nsh/temp/plan

Here are some ideas / grandiose schemes I'm thinking about.

Brief
Load a wiki page and have changes to it sent over a persistent HTTP connection, with diff(1)-like patches applied by JavaScript in real time

Mechanism

 * Browser loads the page with an embedded "update" JavaScript function. After the page loads, the function polls a pseudo-CGI script
 * The CGI script tracks updates via RC, or polls for dynamic pages (**see active-wiki**). When a page change is detected, the script requests a render of the new version
 * The script diffs the new revision's HTML against the old revision's HTML
 * Each HTML change is sent to the browser over the persistent HTTP connection to the update JavaScript function
 * The update JavaScript function parses the diff(1)-like patch, replaces the affected HTML and (if necessary) reloads the page
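The client side of the steps above can be sketched as follows. This is a minimal sketch, not a spec: the hunk format (`start`/`del`/`insert`), the endpoint URL, and the function names are all assumptions for illustration.

```javascript
// Minimal sketch of the client-side "update javascript function".
// A patch is a list of hunks; each hunk says: at line `start` (0-based),
// delete `del` lines and insert the `insert` lines instead.
function applyPatch(lines, hunks) {
  const out = lines.slice();
  // Apply hunks bottom-up so earlier line indices stay valid.
  const sorted = hunks.slice().sort((a, b) => b.start - a.start);
  for (const h of sorted) {
    out.splice(h.start, h.del, ...(h.insert || []));
  }
  return out;
}

// Long-poll loop: repeatedly open a connection to a (hypothetical) CGI
// endpoint that blocks until a change occurs, then apply the patch it sends.
async function streamUpdates(url, getLines, setLines) {
  for (;;) {
    const resp = await fetch(url);   // server holds the request until a change
    const hunks = await resp.json();
    setLines(applyPatch(getLines(), hunks));
  }
}
```

A real implementation would also handle reconnects and out-of-order patches (e.g. by tagging each hunk set with a revision id).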

Brief
Wiki which is generated dynamically by a program (generic, or specifically in this context **see paradigm-wiki**) instead of static content which changes at discrete intervals. An open-ended building block

Mechanism
A generator program pipes its output to the wiki rendering engine (unless raw wiki code is requested), and the result is displayed as normal
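The request path is a one-line pipeline. A sketch, where `generator` and `renderWikitext` are illustrative stand-ins for the generator program and the wiki rendering engine:

```javascript
// Sketch of the Active Wiki request path: run the generator, then either
// return its raw wikitext or pipe it through the renderer.
function serveActivePage(generator, renderWikitext, wantRaw) {
  const wikitext = generator();   // dynamic content, regenerated per request
  return wantRaw ? wikitext : renderWikitext(wikitext);
}
```

The design choice here is that the generator never emits HTML directly; it always produces wikitext, so the normal rendering (and raw-source viewing) machinery keeps working unchanged.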

Brief
An entirely new programming paradigm. Code modularity is enforced, but the language is open. May be used to run external code in a sandboxed environment. Programs are dynamically generated prior to execution by creating the source code through the paradigm system. (MORE)

Mechanism

 * Pick the intended language(s). Each must be able to take external arguments and generate text output.
 * Allow editors to create and freely edit code segments (modules) which take arguments in accordance with the argument system and either create data to be used as further variables, or output text.
 * Allow editors to create "program maps": using only links (with metadata to specify argument order) to form any complete program from the existing community-edited code segment "articles". Program maps return all text output from their module execution.
 * Program maps may be used to directly create **Active Wiki** pages, or may be included in other program maps. See *Wikivolution*.
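The module/program-map split above can be sketched in a few lines. All structures here are assumptions: a module is a named function, and a program map is an ordered list of steps wiring module arguments to named results.

```javascript
// Community-edited "modules": named functions registered by editors.
const modules = new Map();
function defineModule(name, fn) { modules.set(name, fn); }

// A program map step: { module: name, args: [names of earlier results],
// out: name-to-bind (data step) or absent (text-output step) }.
function runProgramMap(steps, initialArgs = {}) {
  const env = { ...initialArgs };   // named values available to later steps
  const output = [];
  for (const step of steps) {
    const result = modules.get(step.module)(...step.args.map(a => env[a]));
    if (step.out) env[step.out] = result;   // data to be used as further variables
    else output.push(String(result));       // text output, collected in order
  }
  return output.join('');                   // map returns all outputted text
}
```

Since maps are just data (lists of links plus argument metadata), a map can itself be stored as a wiki article and included in other maps.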

Brief
Idea for an intelligent, learning article-generation system implemented in Paradigm Wiki.

Mechanism

 * Multiple program maps guess whether an article edit will be reverted within some time period.
 * A master program map tracks these guesses and checks their accuracies.
 * The component maps are informed if they guessed correctly or not and can update their constituent algorithms accordingly.
 * A "food scarcity" algorithm determines which component maps are "fed" more or less edits per unit time.
 * Successful maps will be fed more and thus receive more feedback than crap ones, but if the successful maps stop improving, the feeding division tends back to equal.
 * The most successful map at any time (determined by an editable algorithm (map), as is everything) can be used for outward-facing wikis with high vandalism, etc. A fallback system could be implemented where the second most successful map takes over if #1 hits an algorithm glitch.
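The "food scarcity" division might look like this. The plan doesn't pin down the formula, so everything here is an assumption: shares are proportional to accuracy, and a map that has stopped improving has its share pulled halfway back toward an equal split.

```javascript
// Sketch of the "food scarcity" algorithm. Each component map reports an
// accuracy in [0,1] and whether it is still improving; the returned array
// is each map's share of incoming edits, summing to 1.
function feedShares(maps) {
  const total = maps.reduce((s, m) => s + m.accuracy, 0) || 1;
  const equal = 1 / maps.length;
  const raw = maps.map(m => {
    const merit = m.accuracy / total;
    // Stalled maps drift back toward an equal share (blend factor assumed).
    return m.improving ? merit : (merit + equal) / 2;
  });
  const sum = raw.reduce((s, x) => s + x, 0);
  return raw.map(x => x / sum);   // renormalise so shares sum to 1
}
```

With two maps at 80% and 20% accuracy, both improving, the split is 0.8/0.2; once both stall, it softens to 0.65/0.35.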

Brief
Creates data from an article's history tables.

Mechanism

 * Data contains all text that has ever been included in the article (unimportant text can be omitted for memory/space reasons).
 * The smallest discrete text string is the largest continuous string which has never appeared only in part.
 * Data entries define concatenations of discrete strings which have appeared.
 * Each entry has the following data:
 * - Originating editor
 * - Total time it appeared in the article, broken into each continuous segment
 * - Editor(s) who have committed an edit with this string
 * - Edit summaries of edits which have added or removed the string
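One entry per discrete string could be represented like this. Field names mirror the list above; the record layout and helper names are assumptions.

```javascript
// One entry in the derived history data for a single discrete string.
function makeEntry(text, originatingEditor) {
  return {
    text,                                  // the discrete string itself
    originatingEditor,
    segments: [],                          // continuous periods of presence
    committingEditors: [originatingEditor],
    editSummaries: [],                     // summaries of adding/removing edits
  };
}

// Record an edit that added or removed the string at time `time`.
function recordEdit(entry, { editor, summary, time, added }) {
  if (added) entry.segments.push({ from: time, to: null });  // open a segment
  else entry.segments[entry.segments.length - 1].to = time;  // close latest
  if (!entry.committingEditors.includes(editor)) entry.committingEditors.push(editor);
  entry.editSummaries.push(summary);
}

// Total time the string was present; open segments run to `now`.
function totalTime(entry, now) {
  return entry.segments.reduce((s, g) => s + ((g.to ?? now) - g.from), 0);
}
```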

Applications

 * Produce history maps (like the history-flow tool): shows the article as strata of discrete strings over time, with each string given a different colour (strings produced by the same author may all be similarly coloured for clarity, etc.).
 * Provide input for wikivolution-style Active Wiki program maps which need to consider whether the addition or removal of discrete strings is likely to be accepted or reverted.

Create PoC(s) to demonstrate:

 * 1) Real time Wiki	difficulty: easyish
 * 2) Active Wiki		difficulty: easy
 * 3) Paradigm Wiki	difficulty: hardish
 * 4) Wikivolution		difficulty: hardish + hard
 * 5) History Analyser	difficulty: medium

1

 * (a) CGI script: Perl/PHP/whatever. Uses some feed to track recent changes for the article being "streamed". Asks for new wikitext (or diff! implement raw diff output in index.php!). Uses some renderer (own MediaWiki or external) to create an HTML diff. Sends the diff to the JavaScript function


 * (b) JavaScript function: Wrapper around the page. Loads the page and keeps a connection open waiting for updates. Applies update "patches" as they appear.
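For the (a) side, the diff itself can start out very naive. A sketch, assuming the CGI script compares old and new rendered lines and emits hunks of the form `{start, del, insert}` (at line `start`, delete `del` lines, insert the `insert` lines); a real diff(1) would produce multiple hunks, but a single-hunk version is enough for a PoC streaming small edits:

```javascript
// Naive single-hunk line diff: strip the common prefix and suffix, and
// emit one hunk covering whatever changed in the middle.
function computeHunks(oldLines, newLines) {
  let p = 0;   // length of common prefix
  while (p < oldLines.length && p < newLines.length && oldLines[p] === newLines[p]) p++;
  let s = 0;   // length of common suffix (not overlapping the prefix)
  while (s < oldLines.length - p && s < newLines.length - p &&
         oldLines[oldLines.length - 1 - s] === newLines[newLines.length - 1 - s]) s++;
  const del = oldLines.length - p - s;
  const insert = newLines.slice(p, newLines.length - s);
  if (del === 0 && insert.length === 0) return [];   // no change
  return [{ start: p, del, insert }];
}
```

If the "diff raw output in index.php" idea pans out, this function disappears and the CGI script just forwards MediaWiki's own diff instead.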