[08:54:54] Hi everyone! I'm trying to get tokens in order to access some functions (https://www.mediawiki.org/wiki/Extension:OAuth), but it seems that before that I need to contact a MediaWiki admin in order to get one. Who should I contact?
[08:55:58] Polochon_street: The list of Wikimedia OAuth admins is at https://meta.wikimedia.org/w/index.php?title=Special%3AListUsers&username=&group=oauthadmin&limit=50
[08:56:29] Polochon_street: anomie, csteipp and yuvipanda can often be found on IRC
[08:58:16] okay, thanks very much :)
[09:08:15] Hi, I just want to tell you that there is a problem when downloading a MediaWiki extension: --2015-09-15 10:57:29-- https://extdist.wmflabs.org/dist/extensions/LdapAuthentication-REL1_25-d4db6f0.tar.gz Resolving extdist.wmflabs.org (extdist.wmflabs.org)... 208.80.155.156 Connecting to extdist.wmflabs.org (extdist.wmflabs.org)|208.80.155.156|:443... connected. ERROR: cannot verify extdist.wmflabs.org's certificate, issued by
[09:08:33] Issued certificate has expired.
[09:09:41] Guest97153: Can you report that in the #wikimedia-operations channel?
[09:09:49] But I ignored the certificate error and was able to download the file via wget.
[09:10:06] Ok, how can I switch?
[09:10:19] Guest97153: That's ok, I can just pass it on
[09:10:20] Not After : Sep 15 00:43:52 2015 GMT
[09:10:58] this is for all labs
[09:11:01] Ok, then I will pass it.
[09:11:46] Thanks for your help.
[09:12:34] I've been told the cert is in the process of being renewed right now
[09:13:15] https://phabricator.wikimedia.org/T112608
[09:31:23] #wikimedia-operations
[10:39:39] Hello, I am importing an XML dump into a MW with sqlite - which is slow and misses stuff like user accounts, edit logs, ... :-(
[10:39:40] Is there a way I could import the full mysql export into sqlite?
[10:40:20] Mysql syntax is very close to sqlite syntax. There are probably scripts on the internet to convert
[10:40:43] that would be perfect, let me extend my initial search
[10:40:55] Is the new wiki/db still sqlite or is it MySQL? or was the old db mysql?
[10:41:15] I've tried using conversion scripts and (maybe I just messed up, but) I couldn't get most of them to work right, even some of the paid apps :p
[10:41:56] I just upgraded a wiki using mysql by installing a new version of mediawiki with the same sqlite db name, and then I copied the old db over the new one and mediawiki just used it :p I hate sqlite tho
[10:42:14] on a related note, I broke shit! http://prntscr.com/8gjx1i
[10:42:20] the old wiki is using mysql and the new wiki will be using sqlite
[10:42:32] If I may ask, why are you switching from MySQL to sqlite?
[10:43:06] I don't want to run mysql all the time as a service, I am here on a device with not-so-good resources
[10:43:43] Ah, alright
[10:44:37] Hi #mediawiki, does anyone know how to delete a "File:" page for an image/file that doesn't exist on the server? When I try to do this I currently get on-wiki MWExceptions about files not existing (including the image itself and lock files)
[10:45:42] Erkan_Yilmaz: did you follow https://meta.wikimedia.org/wiki/Data_dumps/Tools_for_importing ?
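
For reference, the importDump.php run being discussed above looks roughly like the sketch below; the dump file name and wiki path are placeholders, while the maintenance scripts themselves ship with MediaWiki core.

    # Rough sketch of importing an XML dump with MediaWiki's maintenance script.
    # "dump.xml" and /path/to/mediawiki are placeholders for this example.
    cd /path/to/mediawiki
    php maintenance/importDump.php --conf LocalSettings.php dump.xml
    # importDump.php only imports pages and revisions, so rebuild the
    # derived data afterwards:
    php maintenance/rebuildrecentchanges.php
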
[10:46:12] oh, sqlite
[10:46:36] yes, I was using importDump.php since I thought Special:Export might time out
[10:47:12] Often most of the time goes into l10n lookups and stuff, make sure to check https://www.mediawiki.org/wiki/Manual:Performance_tuning
[10:47:58] ok, will do
[10:48:12] I am currently importing to see how large the sqlite DB will get
[10:48:39] * NDK|Cloud is curious as to what "sudo rm -rf /boot" will do on the machine hosting his wiki.
[10:50:07] -bash: line 96: /bin/ls: No such file or directory.
[10:50:52] NDK|Cloud: you could possibly delete rows from the database...
[10:51:17] XD saper: yeah I'm just going to go ahead and rebuild this VM from a fresh debian 8 image :p
[10:51:51] deleted sudo, tried to reinstall sudo, "/usr/bin/apt-get: no such file or directory" whoops :p eheheh
[10:52:03] I still have my backups but I kinda broke shit on wiki :p woo
[10:57:48] I guess I could try mysql2sqlite.sh: https://gist.github.com/esperlu/943776
[10:59:58] NDK|Cloud: running sudo rm -rf on random directories generally ends poorly ;)
[11:02:02] bawolff: sudo stopped working. :p
[11:02:16] Rebuild in progress lmao.. Whoopsies
[11:49:02] that mysql2sqlite.sh doesn't look good, running it throws too many "too many terms in compound SELECT" errors
[11:59:10] :P installing mediawiki from git master
[12:46:21] anyone noticed a slight increase in real people creating accounts on your wiki with external links on their User pages?
[12:46:54] I had about 2 in 6 months, then 10 or so this week alone :(
[12:47:14] and that's using a dynamic captcha
[12:48:11] it amazes me that someone trying to sell knockoff Gucci handbags would think anyone on a technical site would click on their links
[12:51:55] Well, you know, automated computer programs tend not to think things through
[12:52:03] ^ yea
[12:53:48] bawolff: well, if it's a bot, it's a good one, as my captcha is dynamic and random
[12:54:22] Well you know, sometimes there are hybrid bots
[12:54:58] it saves just the captcha, then re-uses the captcha on a porn site to get the answer, then answers the captcha. Or they just pay some poor person in a third world country to answer captchas all day
[12:55:16] the latter is what I figure is happening
[12:55:34] I laughed way too hard
[12:56:12] Re-using captchas on porn sites to get the answer is a real thing that spammers sometimes do
[13:42:00] Hi everybody. I just wanted to confirm a hunch - if I use the latest CLDR with an older version of MediaWiki, is it going to cause a bunch of preg_match errors?
[13:43:11] there is an incompatibility between some versions of PCRE and MW
[13:44:29] On recent versions of MW, I think we test for that incompatibility in the installer (not 100% sure)
[13:45:18] malaverdiere: this was it: https://phabricator.wikimedia.org/T60213
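
A side note on the PCRE/MediaWiki incompatibility tracked in T60213: PCRE 8.34+ rejects named capture groups whose names start with a digit, which older versions of MagicWord.php generated. A rough command-line probe for it (this exact one-liner is an assumption, not from the log) would be:

    # Prints int(1) on older PCRE builds; prints bool(false) (compilation
    # failure, warning suppressed by @) on PCRE 8.34+, which no longer
    # allows group names that start with a digit.
    php -r 'var_dump(@preg_match("/(?P<1_foo>a)/", "a"));'
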
[13:45:37] hm. I'm using MW as a case study for my PhD project and I have to use older versions - all the way back to 1.15
[13:45:44] oh, that is older
[13:46:11] malaverdiere: If you were saying something like 1.19, that probably wouldn't be a good idea, but 1.15 is a long time ago
[13:46:29] right now I'm trying to get 1.22 to work :)
[13:47:19] Most extensions have REL1_XX branches that work with that version of MW (prior to 1.19, the branches aren't in git, only svn)
[13:47:33] so for 1.22, use the REL1_22 branch of whatever extension
[13:48:38] last time I checked, they weren't in the composer repo - I'm guessing that it's on purpose
[13:48:49] * malaverdiere starts tweaking his script
[13:49:37] right now I'm using a Debian Jessie setup - but I can use an older Debian setup for the older versions as well - are there other incompatibilities to expect?
[13:50:17] I wouldn't switch OS for different mediawikis
[13:51:02] it's in Docker - not a big deal
[13:53:03] well, it adds variables to the equation
[13:53:08] oh wait, composer has tags for dev-REL1_21 and up :)
[13:53:32] so that's gonna be easier to manage :)
[13:54:42] I'm planning to open-source that benchmark environment when I'm done getting it to work
[13:54:56] That's going to be very useful for future researchers
[13:57:57] I'm surprised anything is in the composer repo...
[14:00:15] malaverdiere: remember to use basic settings suitable for your use case if you want to measure something realistic: https://www.mediawiki.org/wiki/Manual:Performance_tuning
[14:01:16] I have 4-5 different configurations
[14:01:51] MW has good documentation - I didn't have to dig into the code too much
[15:51:44] So I'm still getting preg_match compilation failed errors in 1.22 with PHP 5.6.12-0+deb8u1. Happens both with and without cldr
[15:52:27] I know this is not very important - since it's older - but I wouldn't mind a hint or two :)
[16:00:42] malaverdiere: not sure if it's the same bug, but I remember something about preg_match failing with some versions of PCRE, let me see if I can find it
[16:17:55] https://www.mediawiki.org/wiki/Manual:Errors_and_symptoms
[16:17:58] PHP Warning: preg_replace(): Compilation failed: group name must start with a
[16:18:00] non-digit at offset 4 in /var/www/wiki/htdocs/includes/MagicWord.php
[16:18:09] "You need to downgrade PCRE, or update MediaWiki to version 1.22.1 or newer"
[16:18:14] lemme try on 1.22.1 first
[16:19:32] malaverdiere: sorry, I got distracted; that's the one I was thinking of. IIRC the patch applies successfully to a number of older MW versions people tried it with.
[16:20:45] gud
[16:23:28] so far so good, I'm not seeing those errors when running the installer
[16:23:43] I'll tweak my script to cherry-pick that commit when working with an older version
[16:24:24] Gotta love people who don't read the API docs
[17:39:30] Hi, what are you using to make a static copy of your MW site?
[17:40:00] I tried some extensions, but I think wget is the best of them, and it's still not satisfactory
[17:40:11] wget is not an extension, ofc :-(
[17:41:06] btw: I gave up on using sqlite as the DB; it may be fine for a fresh wiki, but with my existing wiki it's just consuming time
[17:41:09] back to mysql
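
On the static-copy question a few messages up, a wget-based mirror is typically along the lines below; the host name is a placeholder and the options are a common starting point rather than a guaranteed recipe for MediaWiki's URL layout.

    # Crawl the wiki into a local static snapshot: fetch pages and their
    # images/CSS, rewrite links to point at the saved files, and add
    # .html extensions where needed.
    wget --mirror --convert-links --adjust-extension --page-requisites \
         --no-parent https://wiki.example.org/
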
[17:41:55] trying to run update.php and I get Fatal error: Uncaught exception 'Exception' with message '/home/wiki/web_docs/mediawiki-1.25.2/extensions/ConfirmEdit/extension.json does not exist!'; I've confirmed that there was no extension.json file at the root level in the tarball, but there are for the four captcha subdirectories
[18:06:19] "A simple functioning solution to produce static HTML from MediaWiki doesn't currently exist!" https://www.mediawiki.org/wiki/Extension:DumpHTML
[18:11:26] Pennth: hmm. can you file a bug?
[18:11:33] !fileabug
[18:11:36] !newbug
[18:11:36] https://phabricator.wikimedia.org/maniphest/task/create/
[18:12:07] !fileabug alias newbug
[18:12:08] Created new alias for this key
[18:12:36] Pennth: you can leave all fields other than "Title" and "Description" empty/unchanged
[18:18:59] Thanks, MatmaRex. I used the old-style require_once, and the update ran, but that does seem like an extension bug, since it's packaged with the base install
[18:26:52] MatmaRex, https://phabricator.wikimedia.org/T88047
[19:23:13] Question about CentralNotice - for campaigns that link to external sites, should it link in a new window (target=_blank) or use a regular link?
[20:03:33] Hey guys, hopefully a quick question. Maybe I'm crazy, but in 1.24 I used to use array_merge() to make a group inherit rights from multiple groups at the same time, and I thought this worked in 1.25 also. I just checked out the git master earlier today and installed mediawiki, copy/pasted some group permissions, and according to Special:ListGroupRights it appears to not be working
[21:00:03] Hello, I'm looking for advice
[21:59:39] mew? =^.^=
[22:00:35] anyone here?
[22:00:35] Hi bumm13_, I am here, if you need anything, please ask, otherwise no one is going to help you... Thank you
[22:01:03] just have a somewhat-minor template/infobox issue
[22:02:34] I can't get entry columns to line up using certain parameters in the Infobox TV channel template
[22:04:17] (using "online serv" / "online chan")
[22:09:37] anyway, if anyone is interested at all, the article is "AWE (TV network)"
[22:09:47] I left a note on the article's talk page
[22:38:34] Still having difficulty working out how to create a page from form data -- details (and links to code) here, in case anyone is masochistic enough^W^W^W happens to know how to do this: https://plus.google.com/u/0/102282887764745350285/posts/Mj2zaY7J7Sv
[23:45:38] I'm playing in the API sandbox on the en wiki and I see that with action=query&prop=revisions&rvdiffto=prev, the diff property is escaped HTML. Is this how it's stored in the DB? Can I get it in a UNIX-like diff? Just wondering. Thanks.
[23:59:49] robertlabrie: it's not stored, it's generated on demand.
[23:59:52] https://phabricator.wikimedia.org/T15209 <-- explained everything
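
The API request described in the last few messages looks roughly like this; the page title is a placeholder, and as noted above the diff comes back as escaped HTML table markup generated on demand, not as a unified diff.

    # Fetch the latest revision of a page diffed against the previous one.
    # "Main_Page" is just a placeholder title for this sketch.
    curl 'https://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvdiffto=prev&rvlimit=1&titles=Main_Page&format=json'
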