Migration and upgrade

Fragment of a discussion from Robowiki talk:About

Hi. Thanks a lot for resurrecting the wiki from the ashes. It must have been a hell of a lot of work.

I have a git-wiki backup, with the last commit dated Mon Mar 16 18:17:51 2020 +0000 and the commit message "Raven 1.1.1 -> 1.1.2" by Dsekercioglu. It is not exactly MediaWiki format, but it has the pages as files: normal pages, discussion pages, and attached files.

I will be more than happy to feed it back into the wiki.

Speaking of backups: I have raised this question before, but maybe we got a strong enough message this time. We need some way to maintain mirrors of the wiki. My study of MediaWiki a couple of years ago showed that it has no built-in mechanism for incremental backups. I was able to download a GB-sized XML dump of the wiki, but I did not dare to do it more than once a year. So the full dump is stale.

Is there a way to have a shadow "read only" wiki?

Beaming (talk) 15:13, 20 April 2020

I'm not sure how to reintegrate this kind of backup. If there is some process you know of, I'd be happy to try it. I still have a local MediaWiki running on my Raspberry Pi that I practiced the transfer on first, so we don't risk messing anything up. But the internal state is also missing, including things like user accounts. I have more hope of being able to integrate the more recent backups from the original server; we just need to wait for the old hosting provider to bring the image back.

For the future, I've also set up a daily backup to my local NAS of the database, the mediawiki/oldwiki install, and the Apache config. This would let me restore from scratch in less than a day. Hopefully it never needs to be used.
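As an illustration, a daily job like that can be a couple of cron entries. This is only a sketch; the paths, database name, and schedule below are assumptions, not the actual setup:

```
# /etc/cron.d/wiki-backup -- hypothetical schedule; adjust paths to taste.
# Dump the database, then archive the wiki install and Apache config to the NAS mount.
# (In cron, '%' must be escaped as '\%'.)
30 2 * * * root mysqldump --single-transaction wikidb | gzip > /mnt/nas/wiki-db-$(date +\%F).sql.gz
40 2 * * * root tar czf /mnt/nas/wiki-files-$(date +\%F).tar.gz /var/www/mediawiki /etc/apache2
```

`--single-transaction` gives a consistent InnoDB snapshot without locking the wiki while the dump runs.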

Skilgannon (talk) 17:39, 20 April 2020

Well, on my side the gitted wiki looks like a collection of files corresponding to pages in the old wiki. For example, I have the latest versions of the RoboRumble Participants pages for the rumbles. I can either send them to you as patches, or as the final versions of the pages changed since your backup time.

I can even try to push them into the wiki, but the authorship will not be preserved.

How large are the daily backups? I would love to provide some redundancy.

Beaming (talk) 20:52, 20 April 2020

So you have the raw wikitext? Let's hold off a little longer - if we get the original back, that would be ideal of course, but this is good to have as an alternative. Maybe a few key pages should be updated ASAP - the rumbles, for example. And if we don't get the originals back, then I'm not too concerned about authorship - the wiki already went through a migration once.

Daily backups are ~250 MB. I keep dailies for a week, then weeklies for a month, then monthlies. So the total will stay under 10 GB for a long time. Extra backups can't hurt :-)
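As a quick sanity check of that estimate: a daily-for-a-week / weekly-for-a-month / monthly-thereafter rotation keeps 7 + 4 snapshots plus 12 more per year, and at the ~250 MB quoted above the arithmetic works out as:

```shell
# Retention estimate: 7 daily + 4 weekly snapshots, plus 12 monthly per year,
# at ~250 MB per snapshot (sizes taken from this thread).
SIZE_MB=250
for YEARS in 1 2; do
  TOTAL_MB=$(( (7 + 4 + 12 * YEARS) * SIZE_MB ))
  echo "after ${YEARS} year(s): ${TOTAL_MB} MB"
done
```

After one year that is 5750 MB and after two years 8750 MB; only in the third year does the total pass 10 GB, and by then the oldest monthlies can be thinned out.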

Skilgannon (talk) 21:01, 20 April 2020

Yes, I have the raw wikitexts. They also come with authorship, but I guess there is no way to stitch it back in.

OK, I can manage 10 GB, no problem. Though it sounds very wasteful for a few kB of text files that change weekly.

Beaming (talk) 21:39, 20 April 2020

I don't know if there is an easy way to get diffs - it must be possible with mysql slave mirrors, but the wiki is small, so we can be lazy with our backup mechanisms ;-)

Skilgannon (talk) 21:55, 20 April 2020

I've been using mysql slaves for years, and a slave is not only easy to set up, but also a perfect candidate for a read-only mirror of the wiki.

And the slave can be promoted to master at any time, making the mirror the main wiki when the original goes down. With automatic failover, this keeps downtime close to zero.

Anyway, this is not a replacement for the monthly backups. If a hacker managed to run DROP DATABASE, the slave would be destroyed as well.
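For anyone wanting to try it, the classic binlog-based setup looks roughly like the following sketch; server IDs, hostnames, database name, and credentials here are all placeholders:

```
# my.cnf on the master (all values are placeholders)
[mysqld]
server-id    = 1
log-bin      = mysql-bin
binlog-do-db = wikidb

# my.cnf on the slave
[mysqld]
server-id    = 2
read-only    = 1

# Then, on the slave, point it at the master and start replicating:
#   CHANGE MASTER TO MASTER_HOST='master.example.org', MASTER_USER='repl',
#     MASTER_PASSWORD='secret', MASTER_LOG_FILE='mysql-bin.000001', MASTER_LOG_POS=4;
#   START SLAVE;
```

The binary log on the master records every write, which is what makes the mirror incremental; the read-only flag blocks stray edits on the slave, though it does not apply to the replication thread or to SUPER-privileged users.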

Xor (talk) 07:22, 25 April 2020

And yes, a mysql slave is perfectly incremental.

Xor (talk) 07:23, 25 April 2020

Hi Xor,

Could you elaborate on mysql slaves? I would like to try it, and maybe even host a read-only replica.

Beaming (talk) 16:39, 25 April 2020

I've set this up several times; maybe I can help set it up here. I'm also willing to help host a read-only version ;)

Xor (talk) 13:45, 27 April 2020