Robowiki talk:About

Contents

  • Migration and upgrade (36 replies, last modified 13:47, 27 April 2020)
  • RoboWiki:Copyright page (1 reply, last modified 17:30, 6 May 2019)
  • Robowiki mirroring (1 reply, last modified 16:27, 14 June 2018)
  • Finally Recovered!!! (0 replies, last modified 17:34, 9 October 2017)

Migration and upgrade

Hi, sorry for the downtime! The wiki crashed again, so I took the last backup I had and put it onto an upgraded MediaWiki on a new host. Hopefully there will be no more stability issues!

There are a few known issues:

  • The backup is only from October last year :( All new users, pages and edits since then are lost, at least for now. Working on this.
  • The RumbleStats extension uses functions that were deprecated and then removed from MediaWiki, and the changes I made to at least stop it crashing the wiki don't result in it working. If anybody wants to help, I'm happy to give out server credentials.
  • No Twitter feed in the tools sidebar.
  • Images on the old wiki are still broken.
  • The URLs now have index.php in them, and my attempts at fixing this so far break everything. I'm probably missing some Apache setup somewhere.
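For reference, the usual fix for index.php showing up in URLs is MediaWiki's "short URL" setup. A minimal Apache sketch, assuming the wiki is installed under /w/ and articles are served from /wiki/ (both paths are assumptions, not necessarily this server's layout):

```apache
# Route pretty /wiki/Page_title URLs to the real entry point.
RewriteEngine On
RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/w/index.php [L]
RewriteRule ^/*$ %{DOCUMENT_ROOT}/w/index.php [L]
```

This pairs with `$wgScriptPath = "/w";` and `$wgArticlePath = "/wiki/$1";` in LocalSettings.php.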

On the other hand:

  • Upgraded MediaWiki
  • Emails working again
  • New Robocode-specific CAPTCHA, so I've re-opened signups.

Anyway, welcome back, and happy Robocoding!

Skilgannon (talk) 12:04, 19 April 2020

Fixed:

  • Images on the old wiki
  • Short URLs again, same as before, so the rumble clients should be happy.

Still missing:

Skilgannon (talk) 21:06, 19 April 2020

Fixed:

  • Images (they were broken in the short URL migration)

New:

Skilgannon (talk) 10:24, 20 April 2020

Fixed:

  • http redirects to https, except when action=raw is in the URL, for RoboRumble compatibility.
  • This also fixes a login bug: logging in over https would redirect you back to http, which then rejected your login token for security reasons.
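A redirect with that kind of exception usually looks something like this in Apache (a sketch under assumptions about the setup, not the wiki's actual config):

```apache
RewriteEngine On
RewriteCond %{HTTPS} off
# Leave raw page fetches (used by the roborumble clients) on plain http.
RewriteCond %{QUERY_STRING} !(^|&)action=raw(&|$)
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```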
Skilgannon (talk) 17:32, 20 April 2020

Can we not redirect forcefully? I think it breaks my RSS reader, and probably others too.

If someone wants to use https, I understand them. But one day the certificate will expire, and stupidly paranoid modern browsers will not even let you see pages behind expired certificates.

Our edits are public anyway; I see no harm in someone sniffing them.

Yes, I know that sometimes page content is modified by evil providers, but I think that is less harm than not being able to see the content of a page at all.

Beaming (talk) 20:58, 20 April 2020

It fixed a bug with the login, which redirected back to http for some reason. I'll add an exception so the RSS URL doesn't get redirected, like I did for the RoboRumble. Can you think of any other URLs which need exceptions?

Skilgannon (talk) 21:03, 20 April 2020
 
 
 
 


Hopefully the old data can be recovered. But it might take some time.

Skilgannon (talk) 10:20, 20 April 2020
 

Btw, LiteRumble seems to crash, right after I started 8 RoboRumble instances...

It used to work OK, but now it returns HTTP 500.

Xor (talk) 02:45, 20 April 2020

LiteRumble shouldn't be affected; it doesn't know the RoboWiki exists, except for the link on the landing page. I'll take a look.

Skilgannon (talk) 10:21, 20 April 2020
 

Hi. Thanks a lot for resurrecting the wiki from the ashes. It must have been a hell of a lot of work.

I have a git-wiki backup, with the last commit dated Mon Mar 16 18:17:51 2020 +0000 and the commit message "Raven 1.1.1 -> 1.1.2" by Dsekercioglu. It is not exactly MediaWiki format, but it has the pages as files: normal pages, discussions, and attached files.

I will be more than happy to feed it back into the wiki.

Speaking of backups: I have raised this question before, but maybe we got a strong enough message this time. We need some way to replicate mirrors of the wiki. My study of MediaWiki a couple of years ago showed that it has no built-in mechanism for incremental backups. I was able to download a GB-sized XML dump of the wiki, but I did not dare to do it more than once a year, so the full dump is stale.

Is there a way to have a shadow "read only" wiki?
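One way to approximate incremental backups without built-in MediaWiki support is to poll the API's standard `recentchanges` list and re-fetch only the pages edited since the last sync. A minimal sketch (the `/w` script path is an assumption about this wiki's layout):

```python
# Build a MediaWiki API query for pages changed since a given timestamp.
# A mirror job could run this periodically and export only those pages.
from urllib.parse import urlencode

def recent_changes_url(base, since, limit=500):
    """URL listing edits newer than `since` (ISO 8601), newest first."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcend": since,            # older bound: stop at the last sync time
        "rcprop": "title|timestamp",
        "rclimit": limit,
        "format": "json",
    }
    return base + "/api.php?" + urlencode(params)

url = recent_changes_url("https://robowiki.net/w", "2020-04-19T00:00:00Z")
```

A cron job fetching this URL and then exporting the listed titles would keep a read-only mirror fresh for a few kB per run instead of a full dump.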

Beaming (talk) 15:13, 20 April 2020

I'm not sure how to reintegrate this kind of backup. If there is some process you know of, I'd be happy to try. I still have a local MediaWiki running on my Raspberry Pi that I practiced the transfer on first, so we don't risk messing anything up. But there is also internal state missing, including things like users. I have more hope of being able to integrate the more recent backups from the original server; we just need to wait for the old hosting provider to bring the image back.

For the future, I've also set up a daily backup to my local NAS of the database, the mediawiki/oldwiki install, and the Apache config. This would let me restore from scratch in less than a day. Hopefully it never needs to be used.
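A daily job of that shape can be sketched as follows (the database name, paths, and destination are assumptions, not the wiki's actual setup):

```shell
#!/bin/sh
# Nightly MediaWiki backup sketch: one consistent DB dump plus a tarball
# of the install and the Apache config, stamped by date.
STAMP=$(date +%Y%m%d)
DEST=${DEST:-/backup}
# 1. Database dump in a single transaction (skipped if mysqldump is absent).
if command -v mysqldump >/dev/null 2>&1; then
    mysqldump --single-transaction wikidb | gzip > "$DEST/wikidb-$STAMP.sql.gz" || true
fi
# 2. MediaWiki tree and Apache config (errors ignored in this sketch).
tar czf "$DEST/files-$STAMP.tar.gz" /var/www/mediawiki /etc/apache2 2>/dev/null || true
echo "backup stamp: $STAMP"
```

Rotating the stamped files (daily for a week, weekly for a month, then monthly) gives exactly the retention scheme described below.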

Skilgannon (talk) 17:39, 20 April 2020

Well, on my side the gitted wiki looks like a collection of files corresponding to pages in the old wiki. For example, I have the latest version of the RoboRumble participants in the rumbles. I can either send them to you as patches, or as the final versions of the pages changed since your backup time.

I can even try to push them into the wiki, but the authorship will not be preserved.

How large are the daily backups? I would love to provide some redundancy.

Beaming (talk) 20:52, 20 April 2020

So you have the raw wikitext? Let's hold off a little longer - if we get the original that would be ideal of course, but this is good to have as an alternative. Maybe a few key pages should be updated ASAP - rumbles for example. And if we don't get the originals back, then I'm not too concerned about authorship - the wiki already went through a migration once.

Daily backups are ~250MB. I have daily for a week, then weekly for a month, then monthly. So total will be under 10GB for a long time. Extra backups can't hurt :-)

Skilgannon (talk) 21:01, 20 April 2020

Yes, I have the raw wikitexts. They also come with authorship, but I guess there is no way to stick it back.

Ok, I can manage 10GB, no problem. Though it sounds very wasteful for a few kB of text files changed weekly.

Beaming (talk) 21:39, 20 April 2020
 
 
 
 

The new captcha is awesome! I barely passed the exam :)

Though, could it be disabled for logged-in users? I will probably do some automatic restores from my backup, and it would stop my bots (they will use my login and password).

Beaming (talk) 22:38, 20 April 2020

Oh, I didn't realize it was on for everything. I will change it to apply only to account creation.

However, could you make a separate user for your bot, and we can set it as a Bot user? Then it is easier to filter out on the Recent Changes page.

Skilgannon (talk) 16:58, 21 April 2020

I checked: it is only set to show a CAPTCHA when an edit adds a URL, which is critical for fighting spam. However, I have set it to skip the CAPTCHA for bots. So you should create a user for your bot, and then we can give it 'Bot' permissions.
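In ConfirmEdit terms, that combination would look roughly like this in LocalSettings.php (a sketch; `addurl`, `createaccount` and the `skipcaptcha` right are standard ConfirmEdit settings, but the exact values here are assumptions about this wiki):

```php
// CAPTCHA only when an edit adds external links, and on account creation.
$wgCaptchaTriggers['addurl'] = true;
$wgCaptchaTriggers['createaccount'] = true;
$wgCaptchaTriggers['edit'] = false;
// Let bot (and sysop) accounts skip the CAPTCHA entirely.
$wgGroupPermissions['bot']['skipcaptcha'] = true;
$wgGroupPermissions['sysop']['skipcaptcha'] = true;
```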

Skilgannon (talk) 17:15, 21 April 2020

It is probably not a big issue, but every regular will see the captcha when updating the participants page. That is where I experienced it.

Beaming (talk) 18:39, 21 April 2020
 
  Your edit includes new external links. To protect the wiki against automated edit spam, we kindly ask you to answer the question that appears below

maybe we should whitelist literumble,

or rumblestats will cause a captcha each time an edit is done.

also dropbox, github, google drive etc. should get whitelisted, which helps when editing the rumble participants.

I also suggest adding wikipedia/mediawiki, which are a frequent source of quotes.

Xor (talk) 08:46, 25 April 2020

I've whitelisted some of these - feel free to add more as/when you think of them: MediaWiki:Captcha-addurl-whitelist.

Skilgannon (talk) 09:02, 25 April 2020

I have no permission ;)

btw, I think the \.jar line has no effect, as only the host will get matched

Xor (talk) 09:07, 25 April 2020
 

per en.wikipedia.org/wiki/MediaWiki:Captcha-addurl-whitelist

it seems that the lines should not begin with \., but rather with \b, as a word boundary, for matching domains.
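The word-boundary point is easy to check with an ordinary regex engine (Python here, purely for illustration; ConfirmEdit applies each whitelist line as a regex against the URL's hostname, as described above):

```python
import re

# \b anchors at a word boundary, so the bare domain and its subdomains
# both match a line like \bgithub\.com.
assert re.search(r"\bgithub\.com", "github.com")
assert re.search(r"\bgithub\.com", "gist.github.com")

# A path-style pattern such as \.jar never fires, because only the
# hostname is tested and no whitelisted host contains ".jar".
assert not re.search(r"\.jar", "www.dropbox.com")
```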

now it shows an error message each time the captcha is shown ;(

Xor (talk) 09:11, 25 April 2020
 

and I think error/warning messages should be turned off now, as there is no need to debug anymore

or precious information will be leaked.

Xor (talk) 09:15, 25 April 2020
 
 
 
 

The RumbleStats extension should be fixed now ;)

I fixed it by following the guidance on https://www.mediawiki.org/wiki/Manual:Parser_functions

It was broken just because MediaWiki changed the format of extensions.
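For the record, the modern shape of a parser-function extension is a ParserFirstCallInit hook that registers the function with the parser. A sketch (the class, method, and output below are illustrative, not the actual RumbleStats source):

```php
// In extension.json:
//   "Hooks": { "ParserFirstCallInit": "RumbleStatsHooks::onParserFirstCallInit" }
class RumbleStatsHooks {
    public static function onParserFirstCallInit( Parser $parser ) {
        // Makes {{#rumblestats:BotName}} in wikitext call self::render().
        $parser->setFunctionHook( 'rumblestats', [ self::class, 'render' ] );
    }

    public static function render( Parser $parser, $botName = '' ) {
        // Fetch and format the rumble scores for $botName (details omitted).
        return "stats for $botName";
    }
}
```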

Xor (talk) 08:59, 25 April 2020
 

Hi. I think I was able to recover the normal pages (except some which are just redirect pages).

The main problem is the discussions. They use a different API for web injection, and this API is much too elaborate for my quick parser. A quick glance at the missing threads shows that they are not that important for robocoders.

The one which we probably need to recreate, or at least summarize, is devoted to RobocodeGL. User:Xor, would you mind reposting links to the latest releases? I will try to contribute my command-line knowledge so it runs on Linux with missing OpenGL components.

Beaming (talk) 21:04, 25 April 2020

I almost forgot this ;)

I thought I had no activity during the non-backed-up period.

Anyway, the version in the history doesn't matter. I'll put out a new release with bug fixes soon ;)

Xor (talk) 13:47, 27 April 2020
 
 

RoboWiki:Copyright page

It seems that

http://robowiki.net/wiki/RoboWiki:Copyrights

is not gone; it was deleted by an admin instead.

Is it possible to recover a page deleted by an admin?

Or maybe the original content was just renamed, and the deleted page was actually recreated by spam.

Xor (talk) 14:07, 6 May 2019

Hi

There was nothing useful on that page, it was created by a spammer.

Skilgannon (talk) 17:30, 6 May 2019
 

Robowiki mirroring

Hi mates,

I am starting to be convinced that we need a mirror of RoboWiki. I am not suggesting we migrate away, but that we have a "read only" version during the infrequent glitches. The last one took 2 months to fix, but the scary part is that RoboWiki was not available even for browsing. I.e. we would not even have been able to contact the admins, since the contact info was unavailable.

I am investigating backup options:

  • dumpgenerator from wikiteam
    • pros:
      • seems to work, and pulls a full archive with revision history
    • cons:
      • re-downloads the whole wiki every time (about a 700 MB dump)
      • not clear what to do with this dump; I cannot find a reasonable howto for converting it into a useful web-accessible instance. I.e. it is only good for local searches. Am I right about this?
  • git-mediawiki
    • pros:
      • no need to re-download the whole wiki, thanks to Git's incremental changes
      • apparently it can export entries to another hosted MediaWiki instance
      • it can be used for local editing of the wiki (this is super cool)
    • cons:
      • I cannot reliably make a replica of RoboWiki; it downloads about 1600 pages and then just makes an appearance of working. I think it is a wiki API limit or something similar, because sometimes it does more. The main problem is the discussion threads, which are full of useful info, but apparently we have more than 10000 of them and git-mediawiki fails here.
  • it would be super cool to do something along the lines of the distributed web projects

The distributed web seems to be good for read-only use, and I am not sure how to push updates in. IPFS seems to be designed for static content.
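For reference, a typical wikiteam dumpgenerator run looks like this (the API URL is an assumption about this wiki; --xml, --images and --resume are the tool's standard flags):

```shell
# Full-history XML dump plus images; add --resume to continue an
# interrupted download.
python dumpgenerator.py --api=http://robowiki.net/w/api.php --xml --images
```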

Beaming (talk) 16:24, 14 June 2018

Cool, I've been thinking the same thing for ages, but you took action!

btw, we can currently use archive.org to find contact info, since that isn't updated frequently.

Xor (talk) 16:27, 14 June 2018
 

Finally Recovered!!!

Thanks for fixing the robowiki! I could not update bots for almost a day.

Xor (talk) 17:34, 9 October 2017