Rudimentary support for building wiki archives. The content is dumped to HTML, but the wikitext isn't parsed yet.
mwparserfromhell is used to parse the wikitext, but it has no support for rendering to HTML, so we'll have to build the renderer ourselves.
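
A minimal sketch of what that manual rendering could look like. The node classes are mwparserfromhell's real ones, but only headings, wikilinks, and plain text are handled here; the example titles and the `.html` link scheme are illustrative, not what the archive builder will necessarily use.

```python
# Sketch only: covers three node types; a real renderer also needs
# templates, tags, lists, and link targets resolved to archive paths.
import html

import mwparserfromhell
from mwparserfromhell.nodes import Heading, Text, Wikilink


def render(wikitext):
    out = []
    for node in mwparserfromhell.parse(wikitext).nodes:
        if isinstance(node, Heading):
            title = html.escape(str(node.title).strip())
            out.append(f"<h{node.level}>{title}</h{node.level}>")
        elif isinstance(node, Wikilink):
            label = html.escape(str(node.text or node.title))
            out.append(f'<a href="{node.title}.html">{label}</a>')
        elif isinstance(node, Text):
            out.append(html.escape(node.value))
        else:
            # Unhandled nodes fall back to their wikitext source for now.
            out.append(html.escape(str(node)))
    return "".join(out)


print(render("== Moves ==\nSee [[Tackle]] and [[Thunder Shock|its signature move]]."))
```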
@@ -8,8 +8,11 @@ This repository contains the tickets, scripts, and documentation for the end of
#### `deploy_archives`
Run this once the archives have been built to tar them up and scp them to the server.
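
The diff doesn't include the script body; as a rough illustration of the tar-and-scp step it describes, with a placeholder archive directory and destination host:

```python
# Illustrative only: the directory, tarball name, and destination are
# placeholders, not the values deploy_archives actually uses.
import subprocess
import tarfile

ARCHIVE_DIR = "archives"                      # assumed output of the build step
DEST = "user@example.org:/var/www/archives/"  # placeholder server path

with tarfile.open("archives.tar.gz", "w:gz") as tar:
    tar.add(ARCHIVE_DIR)                      # tar up the built archives

subprocess.run(["scp", "archives.tar.gz", DEST], check=True)  # copy to the server
```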
#### Wiki Data
##### `find_data`
#### Wiki Data (`wiki` directory)
##### `wiki_pages`
Not a script, just a listing of all the pages in the wiki (as of the 27 July 2020 lockdown). Use this and Special:Export to create an XML dump of wiki pages and place it in the `wiki` directory.
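
If you'd rather script that step than feed the list to the Special:Export form by hand, something like this should work. The base URL is a placeholder, and `wiki_pages` is assumed to hold one title per line (Special:Export accepts newline-separated titles in the `pages` field):

```python
# Sketch of driving Special:Export from the wiki_pages listing;
# the wiki URL below is a placeholder.
import requests

WIKI_INDEX = "https://wiki.example.org/index.php"  # placeholder

with open("wiki_pages") as f:
    titles = f.read()  # newline-separated page titles

resp = requests.post(
    WIKI_INDEX,
    params={"title": "Special:Export"},
    data={"pages": titles, "curonly": 1},  # latest revision of each page only
)
resp.raise_for_status()

with open("wiki/pages.xml", "wb") as out:
    out.write(resp.content)  # the XML dump the build step expects
```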
##### `find_pages`
Run this locally (it uses the MediaWiki HTTP API). Finds all pages in categories related to Pokémon generations 1-4 that have been edited since 31 March 2020.
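
Roughly what that boils down to against the API. The endpoint and category name are placeholders, and continuation handling is omitted for brevity:

```python
# Sketch of the find_pages query; endpoint and category are assumptions.
import requests

API = "https://wiki.example.org/api.php"  # placeholder endpoint
CUTOFF = "2020-03-31T00:00:00Z"           # the 31 March 2020 cutoff


def pages_edited_since(category, cutoff=CUTOFF):
    """Yield category members whose latest revision is newer than cutoff."""
    params = {
        "action": "query",
        "generator": "categorymembers",
        "gcmtitle": category,
        "gcmlimit": "max",
        "prop": "revisions",    # with rvprop=timestamp this returns the
        "rvprop": "timestamp",  # latest edit time for each page
        "format": "json",
    }
    data = requests.get(API, params=params).json()
    for page in data.get("query", {}).get("pages", {}).values():
        if page["revisions"][0]["timestamp"] > cutoff:  # ISO 8601 sorts lexically
            yield page["title"]


for title in pages_edited_since("Category:Generation IV"):  # placeholder category
    print(title)
```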
#### Forum Data (`forum` directory)