[05:47:10] is the searchindex table supported by InnoDB yet? a forum about the matter saying it does not is from 6 years ago
[05:47:55] c: MW does not create the table as innodb, but it should work fine as an innodb table
[05:48:16] bawolff: i wanted to make sure, since for some reason that's the /only/ table not on InnoDB already
[05:48:36] The search algorithms used are slightly different for innodb - so you may have slightly different results. The internet seems divided on whether that's better or worse
[05:48:56] c: Yeah, I think that's mostly laziness, and also back-compat for really old versions of mysql
[05:49:01] c: there's a bug for it somewhere
[05:49:55] we're running maria 10.0.24 so not too old, and i wanted to double check before i attempted to convert because fulltext search wasn't supported yet
[05:50:38] RDS prefers InnoDB for all tables for a smoother point-in-time restore process if needed
[05:51:28] yeah, innodb is better by basically every measure
[05:54:20] * c scratches head at a notice bot
[05:57:17] I mostly find it annoying when i know the answer, because I don't really want to go to another site - but if I could just reply to the bot that would be great
[05:57:54] * bawolff not really sold on the whole discourse thing, but other people seem to like it
[05:59:12] it just makes more sense to do PRIVMSG over NOTICE
[05:59:32] especially because some clients cause annoyance with NOTICE
[06:12:03] I actually think it should just do a normal message to the channel
[06:14:58] that's what PRIVMSG is
[06:18:08] bawolff: you can reply here if you like :)
[06:18:14] (i'll transcribe over there)
[06:18:26] (or, rather, update wm.o docs if they need it)
[06:18:40] samwilson: I meant in general. I don't know the answer to this question in particular, but there have been other times
[06:18:50] ah cool :)
[06:19:11] yeah, seems like a cool thing, to be able to talk to discourse from here
[06:19:20] Well actually I sort of know where to look.
there is a clone db class that does stuff related to that
[06:19:38] yeah, that's where i've been looking, but it just does structure, no data
[06:19:53] which makes sense, because one doesn't really want all data... hard for it to know what's what...
[06:21:35] have you looked at MediaWikiTestCase::setupDatabaseWithTestPrefix() ?
[06:23:10] Hmm, how come my global preferences don't seem to work
[06:23:11] hm, no, seems to just end up with `CREATE $tmp TABLE $newName (LIKE $oldName)`
[06:23:31] hm, that's odd. all of em?
[06:23:56] Oh wait, I just didn't realize that it's not enabled yet on wikipedia
[06:23:58] GP isn't on wikipedias yet
[06:24:03] ah, yup that's it.
[06:25:04] While for parser tests, there's a whole complex thing where parser tests can define new "pages" to insert into the db
[06:25:39] as in, a page and any links/categories etc?
[06:26:04] yeah
[06:26:50] hm. seems maybe that extension data is never cloned
[06:27:34] maybe the insert statements should be moved into their own sql file, and then the test can run that when required
[06:27:40] ParserTestRunner::addArticle()
[06:28:12] It should just use MW's normal linksupdate stuff
[06:28:49] it doesn't seem all that different to MediaWikiTestCase::insertPage()
[06:29:09] MediaWikiTestCase probably copied it
[06:29:22] ParserTestRunner is much older than the other unit test stuff
[06:29:27] ah cool
[06:29:48] and insertPage() says it should be called from saddDBData, but I can't see why really
[06:29:54] *addDBData()
[08:02:57] We need a phab admin to block https://phabricator.wikimedia.org/p/238482n375/
[08:03:37] Amir1: you a phab admin?
[08:04:41] How do i even get a list of phab admins?
[08:10:40] https://phabricator.wikimedia.org/people/query/kdjggddrQv.X/#R
[08:10:45] ^ list of admins
[09:55:24] bawolff: yup, I am. Anything I can help with?
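As an aside on the table-cloning discussion above: the `CREATE $tmp TABLE $newName (LIKE $oldName)` statement the clone-db code ends up running only copies the table definition, never the rows. A minimal sketch of the difference (the `page` / `page_copy` names here are just for illustration):

```sql
-- Clones structure, column types, and indexes; the new table starts empty.
CREATE TEMPORARY TABLE page_copy (LIKE page);

-- Copying data is a separate, explicit step, which the clone-db code
-- deliberately skips ("it just does structure, no data").
INSERT INTO page_copy SELECT * FROM page;
```

This is why test frameworks layer their own fixture mechanisms (addDBData(), ParserTestRunner::addArticle()) on top of the structural clone.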
[09:56:13] Amir1: It's mostly dealt with now, although people are still working on the cleanup
[10:04:20] Hello
[10:23:59] Amir1: https://phabricator.wikimedia.org/T184485
[10:24:19] eh, wrong task. https://phabricator.wikimedia.org/T197456
[11:04:27] Vulpix: there are waaay more. Fixing it is hard; is there an easy way to roll back someone's actions in phabricator?
[11:07:23] Amir1: sadly no. Wondering if that's possible... https://phabricator.wikimedia.org/T84#3592783
[11:08:18] Vulpix: that's technically possible
[14:35:17] hi, let's say my wiki is supposed to display students. i have a few hundred photos, some for each student. i want to upload all of them and somehow set a category for each photo (by its name or something like that). is that possible somehow?
[14:46:02] anyone? :S
[15:29:16] Hello
[15:29:31] I'm trying to upgrade to MediaWiki 1.31.0
[15:30:17] I can't start the installation as I get the following error: "Your session data was lost! Check your php.ini and make sure session.save_path is set to an appropriate directory."
[15:30:44] /var/lib/php/sessions was already writable, in fact the sessions were indeed generated inside of it
[15:30:52] I even tried setting /tmp as the sessions folder
[15:30:59] They are generated, but I get the same error
[15:31:02] Any ideas?
[15:33:22] davidebeatrici: is the error on the installer? or when logging in?
[15:33:30] Vulpix: On the installer
[15:33:38] On the language selection
[15:36:14] davidebeatrici: from what version are you upgrading?
[15:38:38] Vulpix: 1.23
[15:41:25] this gives some tips: https://www.mediawiki.org/wiki/Manual:How_to_debug/Login_problems. The session.auto_start setting could be an issue, but I'm not sure it could affect the installer
[15:42:43] It's set to 0
[15:50:12] Vulpix: Unfortunately the log file is not generated
[15:51:50] yeah, the installer doesn't use LocalSettings.php yet, so it's hard to debug
[15:52:36] Can you try private browsing?
just to rule out a problem with existing cookies
[15:52:51] Yes, same problem
[15:53:28] is it possible to run something like that on cargo? select (first-second) as diff from tbl orderby=diff? I'm getting an error :S
[15:53:43] Vulpix: I noticed that every time I press the "Continue" button, it creates two session files
[15:53:49] Is that normal?
[15:54:14] The first weighs 64 bytes, the second 2113
[15:54:53] mediawiki handles sessions on its own, but not in the installer IIRC, so this shouldn't happen
[15:55:45] I'm using PHP 7.2 and Apache 2 on Ubuntu 14.04, if that helps
[15:56:48] MediaWiki 1.30 had issues with PHP 7.2, I'd expect them to be fixed in 1.31 now that it requires PHP 7+
[15:57:18] Yeah, I upgraded from PHP 5 to 7 in order to upgrade MediaWiki
[15:59:51] if you can, try to downgrade PHP to 7.1 and see if that helps
[16:00:47] 1.31 is new and I'm not confident all php7.2 bugs have been resolved, especially if it was unusable in 7.2 on 1.30
[16:06:46] Vulpix: Tried right now with 7.1
[16:06:48] Same problem
[16:24:57] Vulpix php7.2 has been working for us with redis when using mw 1.31 :)
[16:25:30] good to know!
[16:30:05] Vulpix: The test code I took from here fails: https://dodona.wordpress.com/2008/08/26/how-can-i-check-if-php-on-my-webserver-supports-sessions/
[16:31:06] Weird...
[16:36:27] davidebeatrici: is your browser accepting cookies correctly? Hit F12, go to the network tab, and check the HTTP headers of the request and response. See if a cookie is set in the response, and whether, on reload, the browser sends that same cookie value back in the request
[16:38:48] Vulpix: It changes every time
[16:39:27] The session in the header weighs 64 bytes
[16:41:02] with the test code, do you also get 2 session files for each reload?
[16:41:50] 4
[16:42:05] omg
[16:42:35] well, or maybe other users are hitting your webserver while you're doing the test
[16:43:03] I don't think so, the file is quite hidden...
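On the Cargo question at [15:53:28]: the `orderby=diff` part mixes SQL syntax with Cargo's parameter syntax, which alone can cause an error. In plain MySQL/MariaDB the intended query is valid, since `ORDER BY` may reference a select-list alias; a sketch, using the table and column names from the question (whether Cargo's `#cargo_query` accepts an arithmetic expression in its `fields=` parameter depends on the Cargo version):

```sql
-- ORDER BY can reference the alias of a computed column.
SELECT (first - second) AS diff
FROM tbl
ORDER BY diff;
```

In Cargo itself, `order by` is a separate named parameter rather than part of the fields list, so `orderby=diff` inside the query string would not be understood.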
[16:43:19] And I see the files appearing only when I load the page
[16:43:37] any PHP file on your server will trigger the session, I guess
[16:43:46] Right
[16:50:55] PHP doesn't throw any errors
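Circling back to the searchindex/InnoDB question at the top of the log: the conversion itself is a single DDL statement, sketched below. InnoDB gained FULLTEXT index support in MySQL 5.6 (inherited by the MariaDB 10.0 series), so the MariaDB 10.0.24 server mentioned above should accept it; the rebuild uses InnoDB's own fulltext tokenizer, which is why search results can differ slightly afterwards, as noted in the discussion. Test on a copy of the database first.

```sql
-- Convert MediaWiki's searchindex table from MyISAM to InnoDB.
-- This rewrites the table and rebuilds its FULLTEXT indexes, so it can
-- take a while and lock the table on a large wiki.
ALTER TABLE searchindex ENGINE=InnoDB;
```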