[00:09:03] Hi.
[00:09:33] I know it's a weird request, but can someone set up a tool on wmflabs to redirect to an arbitrary wmflabs subdomain?
[01:04:28] Anyone?
[01:04:55] Why don't some wmflabs subdomains work with https?
[01:05:16] Is there a list of those that do?
[01:28:14] PiRSquared: It depends whether there is a webserver listening on the https port. What domain do you mean?
[01:28:31] Any. In general.
[01:28:40] like https://ganglia.wmflabs.org
[01:53:41] Someone set an https server up for that.
[02:24:33] !log tools chmod g-w ~whym/.forward
[02:24:36] Logged the message, Master
[02:36:06] !log tools Disabled terminatord on tools-login and tools-dev
[02:36:07] Logged the message, Master
[04:19:45] (PS2) Tim Landscheidt: Simple tool to simplify using the backup snapshots [labs/toollabs] - https://gerrit.wikimedia.org/r/76313 (owner: Platonides)
[04:21:34] (CR) Tim Landscheidt: [C: -1] "The man page still needs tweaking; lintian isn't satisfied with it:" [labs/toollabs] - https://gerrit.wikimedia.org/r/76313 (owner: Platonides)
[06:10:01] [bz] (RESOLVED - created by: Daniel Zahn, priority: Low - enhancement) [Bug 47906] install Extension:Interwiki on wikitech/labs wiki - https://bugzilla.wikimedia.org/show_bug.cgi?id=47906
[06:20:10] !zhuyifei1999 del
[06:20:11] Successfully removed zhuyifei1999
[06:45:18] [bz] (NEW - created by: Daniel Kinzler, priority: Unprioritized - normal) [Bug 52693] Allow login using mosh as an alternative to ssh to make labs usable at Wikimania - https://bugzilla.wikimedia.org/show_bug.cgi?id=52693
[06:56:01] yuvipanda: ^
[06:58:04] legoktm: hmm?
[06:58:07] legoktm: it works for me
[06:58:14] legoktm: I'm logged into tools-login from mosh *right* now
[06:58:16] on tools yes
[06:58:20] not on the rest of labs
[06:58:22] same for me :P
[06:58:23] aaah
[06:58:24] :P
[06:58:25] right
[06:58:44] legoktm: yeah, mosh is fucking awesome
[06:58:49] <3
[07:16:42] (PS1) Yuvipanda: Remove random debugging statement [labs/tools/gerrit-to-redis] - https://gerrit.wikimedia.org/r/78492
[07:16:43] (PS1) Yuvipanda: Open a new MySQL connection for each registration action [labs/tools/gerrit-to-redis] - https://gerrit.wikimedia.org/r/78493
[07:17:54] &ping
[07:17:54] Pinging all local filesystems, hold on
[07:17:55] Written and deleted 4 bytes on /tmp in 00:00:00.0008600
[07:18:00] grr
[07:18:42] still down
[07:19:18] (CR) Yuvipanda: [C: 2 V: 2] Remove random debugging statement [labs/tools/gerrit-to-redis] - https://gerrit.wikimedia.org/r/78492 (owner: Yuvipanda)
[07:19:18] Written and deleted 4 bytes on /data/project in 00:01:24.1982080
[07:23:05] (CR) Yuvipanda: [C: 2 V: 2] Open a new MySQL connection for each registration action [labs/tools/gerrit-to-redis] - https://gerrit.wikimedia.org/r/78493 (owner: Yuvipanda)
[07:26:37] (PS1) Yuvipanda: Do not keep redis connections open all the time either [labs/tools/gerrit-to-redis] - https://gerrit.wikimedia.org/r/78494
[07:28:13] (CR) Yuvipanda: [C: 2 V: 2] Do not keep redis connections open all the time either [labs/tools/gerrit-to-redis] - https://gerrit.wikimedia.org/r/78494 (owner: Yuvipanda)
[09:01:06] andrewbogott: ping? how hard is it to backport nginx-extras from 13.04 to precise?
[09:01:17] andrewbogott: I was trying to get someone else to do it but looks like that is not happening
[09:01:37] I can certainly add the package to the precise repo if it works as is...
[09:01:42] do you know if it needs tweaking?
[09:01:56] andrewbogott: It just needs to be a newer versions
[09:01:58] *version
[09:02:05] no need to rebuild, or change options
[09:02:19] Ah, what version is up now and what version do you need?
[09:02:47] andrewbogott: let me check
[09:03:14] (and, do you happen to know if the existing package is used by any WMF systems currently?)
[09:03:49] andrewbogott: nginx-extras is not, but it replaces nginx, which is used
[09:03:53] so we can't add this to official repo
[09:03:59] Ooh, I see. Hm.
[09:04:17] I'm not positive I know how to handle that, let me think
[09:05:07] Does it install as apt-get nginx?
[09:07:15] yuvipanda, looks like the nginx-extras package in production is a custom wmf build. That's concerning...
[09:07:19] andrewbogott: apt-get install nginx-extras
[09:07:27] andrewbogott: oh, we have an nginx-extras in production?
[09:07:33] 1.1.19
[09:07:38] It should just install in labs if you ask for it
[09:07:54] andrewbogott: I think I need at least 1.2
[09:08:11] andrewbogott: it does install 1.1.x now, but it is missing a core feature for my work
[09:09:00] &ping
[09:09:00] Pinging all local filesystems, hold on
[09:09:01] Written and deleted 4 bytes on /tmp in 00:00:00.0003470
[09:09:02] Written and deleted 4 bytes on /data/project in 00:00:00.0064110
[09:11:26] andrewbogott: to be more specific, I need at least v0.5.0 of nginx-lua, and it is packaged only as -extras
[09:12:03] yuvipanda, do you know, was 1.1.9 added for you at your request or has it been there for a while? I don't know who built/installed the package we have now
[09:12:12] andrewbogott: no, nothing has been done at my request so far
[09:12:18] hm, ok
[09:12:31] andrewbogott: Ryan was talking about how this can be done as a local repo of some sort
[09:13:15] Hm, I don't know what that means :) Let's ask him in a minute
[09:13:37] andrewbogott: alright
[09:18:18] andrewbogott: hmm, I can't really just compile from source, can I? :)
[09:19:07] Well, getting the package isn't necessarily hard, is it? The question is the install path.
[09:20:25] yuvipanda, just like last time I offered to help with this, I may have to run off before the problem is resolved :(
[09:21:34] andrewbogott: heh
[09:22:08] I'm going to send an email to the ops list to try to find out why we have a custom version of that package -- hopefully someone will respond. There's nothing in the logs about it :/
[09:22:29] apt doesn't install it as a dependency of nginx does it?
[09:22:43] andrewbogott: shouldn't
[09:22:51] nginx-extras conflicts nginx, even
[09:27:06] yuvipanda: ok, emailed… nag me tomorrow if I haven't followed up, ok?
[09:27:14] andrewbogott: deal!
[09:27:28] andrewbogott: I'm wondering if I should just do a manual compile now so I can continue developing
[09:27:36] (Which, of course, you can ask Ryan in the meantime but he and I will be rushing to the same 6:00 appointment)
[09:27:42] haha
[09:27:43] ok
[09:27:57] Are you working in your own project? You can just download and dpkg-install on an instance if you're blocked.
[09:28:06] andrewbogott: oooh, yeah, i should be able to do that
[09:28:09] yeah, I can do that
[09:28:15] Obviously that's terrible for a running tool, but seems fine for development/tinkering.
[09:28:29] andrewbogott: yeah, it is in instance project-proxy on project proxy-project
[09:28:30] :P
[09:28:39] 'k
[09:28:42] later!
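For reference, the manual install andrewbogott suggests would look roughly like this on a throwaway development instance (the filenames and version are placeholders, not the actual raring/13.04 build, and this is only for tinkering, not for a running tool):

    # Placeholder filenames -- substitute the nginx-common/nginx-extras .debs
    # actually fetched from a raring (13.04) mirror for the instance's architecture.
    sudo dpkg -i nginx-common_<version>_all.deb nginx-extras_<version>_amd64.deb
    sudo apt-get -f install   # pull in any dependencies dpkg could not resolve
    nginx -V                  # confirm the version and that the lua module is compiled in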
[09:28:43] so I can just dpkg
[09:33:50] Change on mediawiki a page Wikimedia Labs/status was Sharihareswara (WMF), changed by https://www.mediawiki.org/w/index.php?diff=760768 link latest edit summary: $6
[09:34:53] Change on mediawiki a page Wikimedia Labs/status was Sharihareswara (WMF), changed by https://www.mediawiki.org/w/index.php?diff=760771 link /* 2013-07-29 */ details edit summary: $6
[13:26:19] Cyberpower678: Why not -offtopic?
[16:41:00] hey all, I'm trying to find some way to connect to the replicated wikipedia databases, but I'm having trouble finding any documentation on the wiki
[17:26:09] !toolshelp | Hersfold
[17:26:28] !ping
[17:26:28] !pong
[17:26:36] !toolsdoc | Hersfold
[17:26:37] Hersfold: https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/Help
[18:14:11] scfc_de: thanks!
[18:14:18] Hersfold: np
[18:14:55] legoktm: BTW, re Pywikipediabot, "git clone ssh://scfc@gerrit.wikimedia.org:29418/pywikibot/core.git" creates a 25 MB repo for me. Are there other repos needed for Pywikipediabot as well?
[18:20:54] legoktm: Ah, he said pywikibot/*compat*. Okay, that solves that mystery.
[18:39:08] scfc_de: another reason to switch to core ;-)
[18:39:33] I actually tried hard to get the size of compat down, but there's just too much history
[18:43:05] valhallasw: I don't think in this day and age 200 MByte is something to be concerned about :-). Also, if you're on the same filesystem, clones can share history. But as I said, for 200 MByte I wouldn't scratch my head too often :-).
[18:45:20] it's somewhat annoying for my 8GB-disk-VM
[18:45:32] but mediawiki is a worse offender iirc
[19:00:25] scfc_de: it's only 14M after git gc --aggressive O_o
[19:06:50] valhallasw: :-)
[19:18:28] and now I'm burning some cycles for mediawiki-core. I'm still wondering whether we couldn't just get the small version from gerrit in the first place...
[19:24:23] IIRC ^d had done something like that? Although, IIRVC, he almost wiped the repo clean in one of his first tries, as Gerrit doesn't store all data in the way "git gc" recognizes as "important, do not delete" :-).
[19:28:50] *grin*
[19:33:12] valhallasw: small version of mediawiki core? :O
[19:34:11] addshore: pywikibot-compat, but mw-core would also be good
[19:34:20] I agree
[19:34:22] <:
[19:34:32] but my computer chokes on git gc
[19:34:47] not enough memory -_-'
[19:35:51] :<
[20:01:11] addshore, scfc_de - mw-core gets down to 188M from 370M
[20:10:00] valhallasw: Impressive.
[20:29:01] valhallasw: rather nice
[21:34:40] Is it possible for someone to do a FLUSH TABLES page, redirect, pagelinks; on enwiki_p? I'm having problems with a query that's been 'copying to tmp tables' for weeks now; it seems to be related to http://bugs.mysql.com/bug.php?id=14070 or so.
[21:36:27] wolfgang42: you should be able to kill it yourself
[21:37:06] Betacommand: I killed it, that's not the problem. They still won't run.
[21:37:15] use the terminal, connect to the database and run show processlist
[21:37:44] Betacommand: I did. They were copying to tmp tables for some five weeks.
[21:38:03] (Seven for an identical query I tried earlier.)
[21:38:38] The query simply refuses to run, and the bug report states that flushing the tables fixes the problem temporarily, which is good enough for me.
[21:38:45] I only need it to work once.
[22:07:07] wolfgang42: Only binasher and Coren have access to the replica DB servers. Probably best to file a bug at https://bugzilla.wikimedia.org/enter_bug.cgi?product=Wikimedia%20Labs&component=tools.
[22:19:54] scfc_de: Thanks, I'll do that shortly.
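For anyone following the same troubleshooting path, the self-service part of the advice above (inspecting and killing your own stuck query on the replica) looks roughly like this from a Tool Labs shell; the credentials file and host name below are the usual Tool Labs conventions and may differ in your setup, and the thread id is a placeholder. The FLUSH TABLES itself still needs a replica-server admin, as scfc_de notes:

    # Assumed conventions: ~/replica.my.cnf holds the replica credentials and
    # enwiki.labsdb is the replica host for enwiki_p; adjust if yours differ.
    mysql --defaults-file="$HOME/replica.my.cnf" -h enwiki.labsdb enwiki_p \
          -e 'SHOW FULL PROCESSLIST;'
    # Note the Id of the query stuck in "copying to tmp table", then kill it:
    mysql --defaults-file="$HOME/replica.my.cnf" -h enwiki.labsdb enwiki_p \
          -e 'KILL 12345;'    # 12345 is a placeholder thread id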