Migration and upgrade

Hi, sorry for the downtime! The wiki crashed again, so I took the last backup I had and put it onto an upgraded MediaWiki on a new host. Hopefully no more stability issues!

There are a few known issues:

  • The backup is only from October last year :( All new users, pages and edits since then are lost, at least for now. Working on this.
  • The RumbleStats extension uses deprecated and removed MediaWiki functions, and the changes I made to at least stop it crashing the wiki still don't get it working. If anybody wants to help, I'm happy to give out server credentials.
  • No Twitter in the tools sidebar.
  • Images on the old wiki are still broken.
  • The URLs now have index.php in them, and my attempts at fixing this so far break everything. I'm probably missing some Apache setup somewhere (see the sketch below).
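
For reference, the standard short-URL recipe from the MediaWiki manual looks roughly like this, assuming the wiki stays under /w/ and articles move to /wiki/ (a sketch of the goal, not what is currently live):

    # Apache vhost: route /wiki/PageName (and the bare root) to MediaWiki
    RewriteEngine On
    RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/w/index.php [L]
    RewriteRule ^/*$ %{DOCUMENT_ROOT}/w/index.php [L]

    # LocalSettings.php: tell MediaWiki about the pretty article path
    $wgScriptPath = "/w";
    $wgArticlePath = "/wiki/$1";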

On the other hand:

  • Upgraded MediaWiki
  • Emails working again
  • New Robocode-specific captcha, so I've re-opened signups.

Anyway, welcome back, and happy Robocoding!

    Skilgannon (talk) 12:04, 19 April 2020

    Fixed:

    • Images on the old wiki
    • Short URLs again, same as before, so the rumble clients should be happy.

    Still missing:

      Skilgannon (talk) 21:06, 19 April 2020

      Fixed:

      • Images (they were broken in the short URL migration)

      New:

        Skilgannon (talk) 10:24, 20 April 2020

        Fixed:

        • http now redirects to https, except when the URL contains action=raw, for roborumble compatibility.
        • This fixes a bug where logging in over https would redirect you back to http, which then rejected your login token for security reasons.
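
        Roughly, the redirect rule now looks like this (a sketch, not the exact config on the server):

            RewriteEngine On
            # send plain-http requests to https...
            RewriteCond %{HTTPS} off
            # ...unless raw wikitext was requested (roborumble clients)
            RewriteCond %{QUERY_STRING} !(^|&)action=raw(&|$)
            RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
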
          Skilgannon (talk) 17:32, 20 April 2020

          Can we not forcefully redirect? I think it breaks my RSS reader, and probably others too.

          If someone wants to use https, I understand them. But one day the certificate will expire, and stupidly paranoid modern browsers will not even let you see pages behind expired certificates.

          Our edits are public anyway; I see no harm in someone sniffing them.

          Yes, I know that page content sometimes gets modified by evil providers, but I think that is less harm than not being able to see the content of a page at all.

            Beaming (talk) 20:58, 20 April 2020

            It fixed a bug with the login, which redirected back to http for some reason. I'll add an exception for the RSS URL so it doesn't redirect, like I did for the roborumble. Can you think of any other URLs which need exceptions?

              Skilgannon (talk) 21:03, 20 April 2020

              Exception added for anything using api.php
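
              That is, one more condition in front of the redirect rule (sketch):

                  # also skip the https redirect for API requests
                  RewriteCond %{REQUEST_URI} !/api\.php$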

                Skilgannon (talk) 21:51, 20 April 2020

                There are some differences between the new and old wikis.

                The old one had some special API enabled. Specifically, two special URLs were working:

                http://robowiki.net/w/
                http://robowiki.net/api.php

                They allowed checking for incremental changes.

                They also gave access to an RSS reader.

                  Beaming (talk) 21:54, 20 April 2020

                  Ok, I'm not sure how that was set up exactly. It is better not to have php files in the root namespace, or to use directory defaults for file access. But there is now: http://robowiki.net/w/api.php

                  This should have what you need.
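
                  For example, both of the old use cases map onto standard MediaWiki API endpoints:

                      # recent changes as JSON, for incremental syncing
                      http://robowiki.net/w/api.php?action=query&list=recentchanges&format=json
                      # recent changes as an RSS feed
                      http://robowiki.net/w/api.php?action=feedrecentchanges&feedformat=rss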

                    Skilgannon (talk) 22:05, 20 April 2020

                    This seems to work. Let's keep it during the migration stage. Once users are back and have rediscovered the new mechanism, we can put it back to the wiki's pristine way of handling APIs.

                      Beaming (talk) 22:09, 20 April 2020

                      I would certainly like to help with RumbleStats.

                      Btw, we still have the data since October last year on the original host, right? Can we merge the databases so we don't lose the changes from here?

                        Xor (talk) 00:34, 20 April 2020

                        Hopefully the old data can be recovered. But it might take some time.

                          Skilgannon (talk) 10:20, 20 April 2020

                          Btw, LiteRumble seems to crash right after I started 8 roborumble instances...

                          It used to work OK, but now it returns HTTP 500.

                            Xor (talk) 02:45, 20 April 2020

                            LiteRumble shouldn't be affected; it doesn't know that the robowiki exists except for the link on the landing page. I'll take a look.

                              Skilgannon (talk) 10:21, 20 April 2020

                              Hi. Thanks a lot for resurrecting the wiki from the ashes. It must have been a hell of a lot of work.

                              I have a git-wiki backup, last dated Mon Mar 16 18:17:51 2020 +0000, with the commit message "Raven 1.1.1 -> 1.1.2" by Dsekercioglu. It is not exactly MediaWiki format, but it has the pages as files: normal pages, discussions, and attached files.

                              I will be more than happy to feed it back to the wiki.

                              Speaking of backups: I have raised this question before, but maybe we got a strong enough message this time. We need some way to replicate mirrors of the wiki. My study of MediaWiki a couple of years ago showed that it has no built-in mechanism for incremental backups. I was able to download a GB-sized XML dump of the wiki, but I did not dare to do it more than once a year, so the full dump is stale.

                              Is there a way to have a shadow "read only" wiki?

                                Beaming (talk) 15:13, 20 April 2020

                                I'm not sure how to reintegrate these kinds of backups. If there is some process you know of, I'd be happy to try. I still have a local MediaWiki running on my Raspberry Pi that I practiced the transfer on first, so we don't risk messing anything up. But the internal state is also missing, including things like users. I have more hope of being able to integrate the more recent backups from the original server; we just need to wait for the old hosting provider to bring the image back.
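
                                One route that might work for your raw wikitext files is the importTextFiles.php maintenance script that ships with MediaWiki; roughly like this (untested; the user, summary and paths are placeholders, and page titles are taken from the file names):

                                    php maintenance/importTextFiles.php --user=Maintenance \
                                        --summary="Restore from git-wiki backup" \
                                        --overwrite pages/*.wiki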

                                For the future, I've also set up a daily backup to my local NAS of the database, the mediawiki/oldwiki install and the Apache config. This would let me restore from scratch in less than a day. Hopefully it never needs to be used.

                                  Skilgannon (talk) 17:39, 20 April 2020

                                  Well, on my side the gitted wiki looks like a collection of files corresponding to pages in the old wiki. For example, I have the latest version of RoboRumble Participants in the rumbles. I can either send them to you as patches, or as the final versions of the pages changed since your backup time.

                                  I can even try to push them into the wiki, but the authorship will not be preserved.
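
                                  Roughly what I have in mind is the standard API edit flow; a minimal Python sketch (untested; the bot name, password, title and file name are placeholders):

                                      import requests

                                      API = 'https://robowiki.net/w/api.php'
                                      s = requests.Session()

                                      # 1. fetch a login token, then log in (e.g. with a bot password)
                                      t = s.get(API, params={'action': 'query', 'meta': 'tokens',
                                                             'type': 'login', 'format': 'json'}).json()
                                      s.post(API, data={'action': 'login', 'lgname': 'Beaming@restore',
                                                        'lgpassword': 'secret', 'format': 'json',
                                                        'lgtoken': t['query']['tokens']['logintoken']})

                                      # 2. fetch a csrf token and push the saved wikitext
                                      t = s.get(API, params={'action': 'query', 'meta': 'tokens',
                                                             'format': 'json'}).json()
                                      text = open('RoboRumble_Participants.wiki').read()
                                      s.post(API, data={'action': 'edit', 'title': 'RoboRumble/Participants',
                                                        'text': text, 'summary': 'Restore from git-wiki backup',
                                                        'token': t['query']['tokens']['csrftoken'],
                                                        'format': 'json'})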

                                  How large are the daily backups? I would love to provide some redundancy.

                                    Beaming (talk) 20:52, 20 April 2020

                                    So you have the raw wikitext? Let's hold off a little longer - if we get the original that would be ideal of course, but this is good to have as an alternative. Maybe a few key pages should be updated ASAP - rumbles for example. And if we don't get the originals back, then I'm not too concerned about authorship - the wiki already went through a migration once.

                                    Daily backups are ~250MB. I have daily for a week, then weekly for a month, then monthly, so the total will stay under 10GB for a long time. Extra backups can't hurt :-)

                                      Skilgannon (talk) 21:01, 20 April 2020

                                      Yes, I have the raw wikitexts. They also come with authorship, but I guess there is no way to stick it back on.

                                      Ok, I can manage 10GB, no problem. Though it sounds very wasteful for a few kB of text files changed weekly.

                                        Beaming (talk) 21:39, 20 April 2020

                                        I don't know if there is an easy way to get diffs - it must be possible with mysql slave mirrors, but the wiki is small, so we can be lazy with our backup mechanisms ;-)
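
                                        If we ever wanted it, the standard MySQL binlog replication setup is roughly this (a sketch, untested; wikidb stands in for the actual database name):

                                            # primary my.cnf - write a binary log of all changes
                                            [mysqld]
                                            server-id    = 1
                                            log_bin      = mysql-bin
                                            binlog_do_db = wikidb

                                            # replica my.cnf - pull and replay that log
                                            [mysqld]
                                            server-id = 2
                                            read_only = 1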

                                          Skilgannon (talk) 21:55, 20 April 2020