Robowiki mirroring
Hi mates,
I am starting to be convinced that we need a mirror of Robowiki. I am not suggesting that we migrate away, but that we have a read-only version for the infrequent glitches. The last one took two months to fix, and the scary part was that Robowiki was not available even for browsing, i.e. we could not even contact the admins, since the contact info was unavailable.
I am investigating backup options:
- dumpgenerator from wikiteam
  - pros:
    - seems to work and pulls a full archive with the history of revisions
  - cons:
    - re-downloads the whole wiki every time (about a 700 MB dump)
    - it is not clear what to do with this dump; I cannot find a reasonable howto for converting it into a useful web-accessible instance, i.e. it seems to be good only for local searches. Am I right about that? (One possible conversion is sketched after this list.)
- git-mediawiki
  - pros:
    - no need to re-download the whole wiki, thanks to Git's incremental changes
    - apparently it can export entries to another MediaWiki-hosted instance
    - it can be used for local editing of the wiki (this is super cool)
  - cons:
    - I cannot reliably make a replica of Robowiki: it downloads about 1600 pages and then only gives the appearance of working. I think it is a wiki API limit or something similar, because sometimes it gets further. The main problem is the discussion threads, which are full of useful info, but apparently we have more than 10000 of them and git-mediawiki fails there. (A possible workaround that talks to the API directly is also sketched after this list.)
- the distributed web projects
  - pros:
    - it would be super cool to do something along these lines
  - candidates:
    - IPFS
    - Dat Project, with which we could even host the mirror from our browsers with something like Beaker
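
To make the dumpgenerator con less hand-wavy, here is a rough sketch of the kind of thing I imagine we could do with the dump: unpack the MediaWiki XML export into plain HTML files so it is at least browsable. The file and directory names are made up, it needs Python 3.8+, and it does not render wikitext, it just escapes it:

 # Rough sketch: unpack a MediaWiki XML dump into plain, browsable HTML files.
 # Assumptions: Python 3.8+ (for the {*} namespace wildcards), a standard
 # <mediawiki><page><revision><text> export; file/directory names are placeholders.
 import html
 import pathlib
 import xml.etree.ElementTree as ET

 DUMP_FILE = "robowiki-dump.xml"        # hypothetical dump from dumpgenerator
 OUT_DIR = pathlib.Path("mirror_html")
 OUT_DIR.mkdir(exist_ok=True)

 index_lines = []
 root = ET.parse(DUMP_FILE).getroot()
 for page in root.iterfind(".//{*}page"):
     title = page.findtext("{*}title", default="")
     # a full-history dump has many <revision> elements per page; keep the last one
     revisions = page.findall("{*}revision")
     text = revisions[-1].findtext("{*}text", default="") if revisions else ""

     safe_name = title.replace("/", "_").replace(":", "_") + ".html"
     body = "<h1>%s</h1>\n<pre>%s</pre>\n" % (html.escape(title), html.escape(text))
     (OUT_DIR / safe_name).write_text(body, encoding="utf-8")
     index_lines.append('<li><a href="%s">%s</a></li>' % (safe_name, html.escape(title)))

 # crude index page so the whole thing can be browsed with any static web server
 (OUT_DIR / "index.html").write_text(
     "<ul>\n" + "\n".join(index_lines) + "\n</ul>\n", encoding="utf-8")

Serving the result with something like python -m http.server from inside mirror_html would already give a read-only browsable copy; proper wikitext rendering would need a real parser, which I left out.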
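
And for the git-mediawiki con: since the limit looks like an API/paging issue, maybe we could skip git-mediawiki and talk to api.php ourselves with the standard continuation mechanism, only re-downloading pages whose latest revision changed. This is just a sketch, not something I have run against Robowiki: the api.php URL is my guess, it needs the requests library, and it assumes a reasonably recent MediaWiki continuation format.

 # Sketch: incremental pull of page text through the MediaWiki API, following the
 # server's "continue" tokens instead of grabbing one huge dump every time.
 import json
 import pathlib
 import requests

 API = "http://robowiki.net/w/api.php"           # assumed endpoint, not verified
 STATE_FILE = pathlib.Path("mirror_state.json")  # title -> last seen revision timestamp
 PAGES_DIR = pathlib.Path("mirror_wikitext")
 PAGES_DIR.mkdir(exist_ok=True)

 state = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}
 session = requests.Session()

 def api_get(**params):
     params.update(action="query", format="json")
     return session.get(API, params=params, timeout=60).json()

 cont = {}
 while True:
     # Walk the full page list in batches, following the continuation tokens.
     listing = api_get(list="allpages", aplimit="max", **cont)
     for entry in listing["query"]["allpages"]:
         title = entry["title"]
         # Ask only for the latest revision of this page.
         data = api_get(titles=title, prop="revisions",
                        rvprop="content|timestamp", rvlimit=1)
         page = next(iter(data["query"]["pages"].values()))
         rev = (page.get("revisions") or [{}])[0]
         stamp = rev.get("timestamp", "")
         if state.get(title) == stamp:
             continue  # unchanged since the last run, nothing to re-download
         safe_name = title.replace("/", "_").replace(":", "_") + ".wiki"
         (PAGES_DIR / safe_name).write_text(rev.get("*", ""), encoding="utf-8")
         state[title] = stamp
     if "continue" not in listing:
         break
     cont = listing["continue"]

 STATE_FILE.write_text(json.dumps(state, indent=2))

Two caveats: list=allpages walks one namespace at a time (the main one by default), so the thread namespaces would need extra passes with apnamespace set, and fetching page by page is slow for 10000+ pages, so batching titles would be worth it.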
The distributed web seems to be good for read-only hosting, but I am not sure how to push updates in; IPFS seems to be designed for static content (one possible update flow is sketched below).
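
About pushing updates into IPFS: as far as I understand, the usual answer is IPNS, i.e. you re-add the directory after every refresh and re-publish the new hash under a stable name. A sketch of what that could look like, assuming a local ipfs daemon is running and re-using the hypothetical mirror_html directory from the earlier sketch (I have not tried this for real):

 # Sketch: publish the updated mirror directory to IPFS and point a stable IPNS
 # name at it. Assumes go-ipfs ("ipfs") is installed and its daemon is running;
 # "mirror_html" is the hypothetical output directory from the dump-conversion sketch.
 import subprocess

 MIRROR_DIR = "mirror_html"

 # "ipfs add -r -Q" adds the directory recursively and prints only the root hash.
 root_hash = subprocess.run(
     ["ipfs", "add", "-r", "-Q", MIRROR_DIR],
     check=True, capture_output=True, text=True,
 ).stdout.strip()
 print("new content hash:", root_hash)

 # Re-publishing under our node's IPNS name keeps the mirror's address stable
 # across updates, so readers do not need a new link after every refresh.
 subprocess.run(["ipfs", "name", "publish", "/ipfs/" + root_hash], check=True)

Whoever holds that IPNS key effectively controls the mirror address, so it should probably live on the machine that runs the periodic backup. I have not looked at the Dat/Beaker side yet.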