[05:47:02] any reason why a mysqldump of a database wouldn't match the output of a query? e.g. the sql file is 7.2GB but a query in mysql returns 9.57 GB as the size? or is it possible for it to time out but still present no error and partially dump
[05:48:29] That's quite a large gap
[05:49:12] i guess i could run it again with --verbose
[05:49:33] c: I found some comment online about mysqldump not dumping indices, not sure how true that is (I don't really work much with mysql)
[05:56:35] would this include indices?
[05:56:37] ROUND(SUM(data_length + index_length) / 1024 / 1024, 1) AS 'DB Size in MB'
[05:58:54] c: I wouldn't worry about it.
[05:59:07] Unless you have reason to believe that the dump is incomplete or corrupt or missing data.
[06:00:05] A mysqldump-generated dump is always going to be larger than the raw data.
[06:00:32] mysql said the database was larger, the dump is actually smaller
[06:01:33] An SQL dump file would not include indices, as that is not part of the data (well, it might include CREATE INDEX statements)
[06:03:30] But an SQL dump and the native db format are going to look very different. The db would certainly be larger if you are including indices
[06:15:11] okay well i ran it again with verbose and it completed with no errors and is the same size. oh well
[08:48:00] Hi! I was wondering how to apply to be a Google Code-in mentor for MediaWiki
[15:01:24] Technical Advice IRC meeting starting in 60 minutes in channel #wikimedia-tech, hosts: @Lucas_WMDE - all questions welcome, more info: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[15:51:22] Technical Advice IRC meeting starting in 10 minutes in channel #wikimedia-tech, hosts: @Lucas_WMDE - all questions welcome, more info: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[16:06:54] So. I was wondering why MediaWiki is running a space for pedophiles on https://www.boywiki.org/en/Main_Page.
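The size gap discussed above can be broken down by querying data and index sizes separately: the query quoted at 05:56:37 sums data_length and index_length, but a mysqldump file contains serialized row data plus CREATE statements, not the index contents themselves, so comparing the dump against data_length alone is closer to apples-to-apples. A sketch against information_schema (the database name 'mydb' is a placeholder; note that for InnoDB, data_length also includes unreclaimed free space, so even it can exceed the dump size):

```sql
-- Per-table breakdown; index_length is on-disk space that mysqldump
-- does not re-serialize into the dump file.
-- 'mydb' is a placeholder for the actual database name.
SELECT table_name,
       ROUND(data_length  / 1024 / 1024, 1) AS data_mb,
       ROUND(index_length / 1024 / 1024, 1) AS index_mb
FROM information_schema.tables
WHERE table_schema = 'mydb'
ORDER BY data_length DESC;
```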
There are pages connected which have onion links on them, labelled 'Chats'. Do you like such content on your page?
[17:40:08] Anyone used pywikibot?
[20:40:41] oh-noes
[20:42:25] got some CAPTCHA-passing bots at two wikis appearing from multiple IPs
[20:44:21] jubo2: on mediawiki.org or a different wiki?
[20:45:09] we can't really help much here if it's not a wikimedia wiki; this is a general support channel for the MediaWiki software, so we won't necessarily have admin rights on third-party wikis
[20:46:04] That seemed more a rant than a request for support. Or at least I'm missing the question
[20:46:45] I was assuming it was a request to block the bots/rollback spam/etc.
[20:48:29] Skizzerz: No, not mediawiki wiki. I realize this is not a Mediawiki issue, sorry about talking here, when this is a question of the weakness of the NoCAPTCHA
[20:48:51] ah, well if you control the wiki or have a say in it, I'd suggest AbuseFilter as a good second line of defense
[20:51:06] s/mediawiki/Wikimedia/
[21:01:07] QuestyCaptcha is the best, if you can install it. Most of the spambots don't even bother to submit the login form when they encounter that extra field
[21:01:43] NoCaptcha was being bypassed daily until I changed it
[21:05:09] hi
[21:05:24] is it possible to upgrade a db without a LocalSettings.php?
[21:05:33] since the web updater gives a timeout?
[21:06:04] update.php won't run if there is no LocalSettings.php
[21:06:43] You can install on a new empty database, get the LocalSettings, then change it to point to the old database and run update.php
[21:06:52] oh
[21:06:53] i can do that
[21:07:00] Questy is easily bypassed if your wiki is large enough to attract the attention of dedicated spammers, as opposed to drive-by ones
[21:10:42] At least it requires human intervention
[21:11:09] I agree having AbuseFilter is a must, anyway
[21:11:15] thanks Vulpix, will try
[21:11:28] But at least the AbuseLog is not so heavily spammed
[21:12:21] Since I replaced ReCaptcha with Questy, I've seen more human registrations than before
[21:30:42] Vulpix: i'm getting "did you run schema updater?"
[21:31:00] did you run it?
[21:31:09] which?
[21:31:24] the schema updater
[21:31:31] which is?
[21:31:56] please, i forgot
[21:32:20] i did php update.php
[21:32:41] that's it
[21:33:23] but it said that
[21:33:58] let me paste
[21:34:44] https://pastebin.com/PwtbLyaB
[21:36:06] Sorry, that's T229092
[21:36:10] T229092: Investigate migrateActors script failing due to duplicate empty actor_name - https://phabricator.wikimedia.org/T229092
[21:36:42] let me check, thanks
[21:37:10] so no solution
[21:37:10] :\
[21:38:02] It would be good if you could raise your problem on that task... apparently I'm the only one reporting these bugs for other people, and if nobody else comments, developers may think it's not worth fixing/investigating, or that this was a one-time error
[21:38:23] i just installed it today
[21:38:36] you said you were upgrading
[21:38:58] yes
[21:38:59] if your database is so big it can't handle the web updater, I guess you're upgrading an old and big wiki
[21:39:04] yes
[21:39:22] i mean i haven't used mediawiki for a long time
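For reference, the QuestyCaptcha setup recommended above lives in LocalSettings.php. A minimal sketch, assuming a current ConfirmEdit bundle where QuestyCaptcha ships as a submodule; the question/answer pair is a placeholder you would replace with wiki-specific questions:

```php
// LocalSettings.php — minimal QuestyCaptcha sketch (placeholder Q/A).
wfLoadExtension( 'ConfirmEdit' );
wfLoadExtension( 'ConfirmEdit/QuestyCaptcha' );
$wgCaptchaClass = 'QuestyCaptcha';
$wgCaptchaQuestions[] = [
    'question' => 'What software does this wiki run?', // placeholder question
    'answer'   => 'MediaWiki',                          // placeholder answer
];
```

As noted in the discussion, site-specific questions work because generic spambots don't handle the extra form field, but a dedicated spammer can hard-code the answers, so pairing it with AbuseFilter is the suggested second line of defense.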