[08:54:13] anyone happen to know how to add a mediawiki template to the mw instance that runs with the jenkins tests?
[10:43:23] Hi. Could somebody please tell me if we still should use tools-login.wmflabs.org, or if it has changed ?
[10:43:28] Thanks by advance
[12:00:40] quentinv57: Yes, it's still a bastion.
[12:47:09] thanks a930913... I was confused because the fingerprint changed
[13:03:07] quentinv57: Moved from pmtpa to eqiad a while back.
[13:03:44] a930913, yes, I noticed, but I hadn't the time to connect via SSH before now
[13:42:18] * Coren returns from vacation with ~1200 email to catch up to. Yeay.
[13:44:01] welcome back!
[13:44:13] * andrewbogott has caught up on most email thanks to jet-lag insomnia
[13:44:30] * a930913 bets that over half of Coren's emails are from various people here rage asking where he is.
[13:44:58] hi Coren and welcome back
[13:45:07] andrewbogott: You get jet lag insomnia? I get jet lag coma instead. :-)
[13:45:39] Coren: both! alternating
[13:47:32] Bah, you guys have it lucky. You try keeping American times in the UK.
[14:03:55] I'm submitting a job, a python script, with a $ jsub -N .... it works fine, but if I do $ jsub -once -continuous -N ... I get a : ImportError: No module named pywikibot. In the tools ~/.profile I've export PYTHONPATH=/shared/....
[14:03:55] is it normal the job env look like different with -once -continous ?
[14:05:41] Wikimedia Labs / tools: Expose revision.rev_content_format on replicated wikidatawiki - https://bugzilla.wikimedia.org/54164#c5 (Marc A. Pelletier) REO>ASS Ah, hm. It fails to apply because my view management system right now does not properly deal with the concept that the same schema doesn't a...
[14:08:12] phe: Not specifically, but there is an extra level of indirection that might confuse things is your .bashrc is different from your .profile (one often wants to source the former from the latter to be certain)
[14:08:51] phe: That is, you probably want to set your environment in .bashrc, and source that with ". ~/.bashrc" in .profile
[14:09:00] Coren, ok
[14:09:38] Yes, IMO, that's silly behaviour -- but it's bash's normal behaviour.
[14:29:13] Wikimedia Labs / deployment-prep (beta): beta labs mysteriously goes read-only overnight - https://bugzilla.wikimedia.org/65486 (Chris McMahon) NEW p:Unprio s:normal a:None I've been seeing this in the overnight runs of the browser tests in recent times. The build for VisualEditor will fail...
[14:32:41] Wikimedia Labs / tools: root's crontab on tools-submit produces daily error - https://bugzilla.wikimedia.org/65027#c1 (Marc A. Pelletier) NEW>RES/FIX Indeed it was. Fix't.
[14:34:17] !log deployment-prep Restarted logstash service on deployment-logstash1; it stopped logging new events at 10:37:13Z
[14:34:19] Logged the message, Master
[14:35:58] Wikimedia Labs / tools: Create views for user_daily_contribs table - https://bugzilla.wikimedia.org/61300#c6 (Marc A. Pelletier) NEW>ASS a:Marc A. Pelletier>Sean Pringle Dumping in Sean's capable hands; once the actual data has been replicated I'll add a view to it.
[14:47:12] Wikimedia Labs / Infrastructure: filearchive table not available on labs - https://bugzilla.wikimedia.org/61813#c9 (Marc A. Pelletier) There is, IMO, a plausible issue with the SHA but I don't know whether it is relevant for legal: its primary use case is (of course) to note files which have been previ...
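A minimal sketch of the arrangement Coren describes above for phe's PYTHONPATH problem, assuming the shared pywikibot checkout lives under /shared/pywikipedia (the exact paths are whatever the tool actually uses):

    # ~/.bashrc -- keep the real environment here
    export PYTHONPATH=/shared/pywikipedia/core:/shared/pywikipedia/core/scripts

    # ~/.profile -- source .bashrc so login and non-login shells see the same environment
    if [ -f "$HOME/.bashrc" ]; then
        . "$HOME/.bashrc"
    fi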
[14:48:33] hi all :)
[14:49:54] i've created a new instance, and 15 minutes after booting, in Special:NovaInstance it says 'Puppet status: failed'. in 'get console output', i see lots of messages like 'May 19 14:40:18 cgtest nslcd[1120]: [6d8d3c] error writing to client: Broken pipe'
[14:49:59] anybody seen this before?
[14:51:02] mabye you need to wait for the next puppet run?
[14:52:13] Wikimedia Labs / tools: create a node for dexbot service group which runs the user:dexbot - https://bugzilla.wikimedia.org/65199#c1 (Marc A. Pelletier) NEW>ASS Sounds legit. :-)
[14:53:48] JohannesK_WMDE: nscld is the ldap lookup service. I have seen it flake out on occasion. My first attempt to fix would be to reboot the instance. Crude, but often effective.
[15:02:10] JohannesK_WMDE: try to force a puppet run before the reboot
[15:14:54] puppet was also disabled for some reason... ran puppet agent --enable, puppetd -tv, then rebooted... things look ok now. thanks for the tips matanya, bd808
[15:15:19] :)
[15:24:33] Down to ~400 email
[15:41:03] Coren, I can't make it working, even a simple $ jsub -once -continuous -l h_vmem=512M -N test_env /bin/bash -c "env"
[15:41:03] doesn't show in test_env.out anything coming from .profile or .bashrc
[15:41:03] w/o -once -continuous it works, the environment seems altered by -once -continuous after .profile is read, SHELL and SGE_O_SHELL seems ok == "/bin/bash"
[15:42:16] phe: ... that's really odd. Please open a bugzilla for it, I'll need to look deeper into it.
[15:42:37] I'll try to take a look at it in a few hours after I caught up on mail and urgent stuff.
[15:43:13] Coren: phe: I gave up trying to use shared pywikibot and just installed it locally.
[15:44:32] a930913, it works for me, it even works with jsub, the only pb come from jsub -once -continuous, the env doesn't contains the needed PYTHONPATH=
[15:45:42] phe: And, just as a silly check, you do /export/ that variable and not just set it, right?
[15:46:05] Coren, yeps export PYTHONPATH=/shared/pywikipedia/core:/shared/pywikipedia/core/externals/h\
[15:46:05] ttplib2:/shared/pywikipedia/core/scripts
[15:46:19] Yeah, that should work.
[16:00:44] Anybody know how to parse wikitext for image captions?
[16:04:37] @notify Earwig|away
[16:04:37] This user is now online in #wikimedia-labs. I'll let you know when they show some activity (talk, etc.)
[16:14:29] Wikimedia Labs / tools: running job with SGE doesn't get the right environment when using -once -continuous - https://bugzilla.wikimedia.org/65491 (Philippe Elie) NEW p:Unprio s:major a:Marc A. Pelletier When I run a python job using pywikibot it works with a simple jsub but fails to import...
[16:15:16] Coren: How many mails now? :)
[16:22:45] (Abandoned) Incola: Add .gitignore [labs/tools/maintgraph] - https://gerrit.wikimedia.org/r/133770 (owner: Incola)
[16:31:56] Hi all. How can I access a tools webservice from tools-logon?
[16:32:20] curl -m 5 'http://tools.wmflabs.org/zoomviewer/index.php' > /dev/null 2>&1 always times out
[16:33:14] dschwen: Use tools-webproxy I think.
[16:33:52] dschwen: So, curl -m 5 'http://tools-webproxy/zoomviewer/index.php' > /dev/null 2>&1
[16:38:48] thx, that works
[16:39:07] but why in god's name is curl on tools-login pestering me to "Supply a filename!" ?
[16:39:16] that is not curl's standard behavior
[16:39:26] I cannot even specify "-o -"
[16:40:01] Is somebody trying to "protect" me from "accidentally" spilling data to my termonal?
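For reference, the fix a930913 suggests to dschwen is just a hostname change: from inside the project the public name times out while the internal proxy name answers. A quick check along those lines (tool path taken from the conversation) might look like:

    # From tools-login: the public URL hangs, the internal proxy responds.
    curl -m 5 'http://tools-webproxy/zoomviewer/index.php' > /dev/null 2>&1 \
        && echo "reachable via tools-webproxy" \
        || echo "timed out"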
[16:54:25] Hi, Tool-Labs question, does anyone know about the connectivity tool, or Lvova? (or an alternate tool to show most linked disambiguations not just for enwiki)
[17:00:14] petan: Poke?
[17:38:41] Wikimedia Labs / tools: running job with SGE doesn't get the right environment when using -once -continuous - https://bugzilla.wikimedia.org/65491#c1 (Philippe Elie) typo, I meant /data/project/phetools/test_env.out.good and /data/project/phetools/test_env.out.bad
[17:52:57] (PS1) Yuvipanda: Move multimedia related bugs to #wikimedia-multimedia [labs/tools/pywikibugs] - https://gerrit.wikimedia.org/r/134140
[17:55:53] (PS2) Yuvipanda: Move multimedia related bugs to #wikimedia-multimedia [labs/tools/pywikibugs] - https://gerrit.wikimedia.org/r/134140
[18:05:30] (CR) MarkTraceur: [C: 1] Move multimedia related bugs to #wikimedia-multimedia [labs/tools/pywikibugs] - https://gerrit.wikimedia.org/r/134140 (owner: Yuvipanda)
[18:06:39] valhallasw: ^ +2? I don't have rights :'(
[18:07:17] YuviPanda: let me fix that for you
[18:07:37] YuviPanda: you should have +2?
[18:07:42] valhallasw: yeah
[18:07:44] i hope? :)
[18:07:46] oh
[18:07:46] according to group labs-tools-pywikibugs
[18:07:47] now
[18:08:03] (CR) Yuvipanda: [C: 2 V: 2] Move multimedia related bugs to #wikimedia-multimedia [labs/tools/pywikibugs] - https://gerrit.wikimedia.org/r/134140 (owner: Yuvipanda)
[18:08:04] valhallasw: hmm, I do
[18:08:06] I think qchris added you somewhere today
[18:08:09] valhallasw: dunno why it didn't show up the last time
[18:08:59] also added you and legoktm as group admins
[18:09:09] valhallasw: cool! :D
[18:09:20] valhallasw: I'm hopping around talking to the different teams moving things off -dev
[18:09:37] Sure. Maybe we should also move stuff back, as legoktm suggested?
[18:09:46] should be as easy as adding it explicitly to the -dev channel
[18:10:10] valhallasw: which stuff? and where was it suggested?
[18:10:10] * YuviPanda looks at list archives
[18:10:25] coreextensions stuff
[18:10:27] iir
[18:10:28] c
[18:11:00] valhallasw: ah, in that case I think it makes sense to change grrrit-wm's behavior
[18:11:07] less spam on -dev the better
[18:11:21] sure
[18:11:27] legoktm: ^
[18:13:27] (PS1) Yuvipanda: Add GWToolset to #wikimedia-multimedia [labs/tools/pywikibugs] - https://gerrit.wikimedia.org/r/134147
[18:14:42] marktraceur: quick review? https://gerrit.wikimedia.org/r/#/c/134147/
[18:15:56] (CR) MarkTraceur: [C: 1] Add GWToolset to #wikimedia-multimedia [labs/tools/pywikibugs] - https://gerrit.wikimedia.org/r/134147 (owner: Yuvipanda)
[18:16:10] (CR) Yuvipanda: [C: 2 V: 2] Add GWToolset to #wikimedia-multimedia [labs/tools/pywikibugs] - https://gerrit.wikimedia.org/r/134147 (owner: Yuvipanda)
[18:21:05] hi this is palerdot. new to both wikitech and this channel. I recently signed up for wikitech developer access and got a message that I can log into bastion
[18:21:13] i need some help on how to proceed further
[18:21:54] I tried to join a commons table with a table from dewiki, but had no success (on Tool Labs). I connected to dewiki ("sql dewiki") and tried a simple query ("SELECT * FROM page LIMIT 0,1;"). This worked fine. Now I wanted to test the same with the foreign table, but this gives no result ("SELECT * FROM commonswiki_f_p.page LIMIT 0,1;"). The syntax seems to work, because I get an error message, if I write something different, but this query
[18:21:54] gives no result (at least for some minutes now...)
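A sketch of the kind of cross-database query being attempted above, assuming the Commons views really are reachable from the dewiki replica under the commonswiki_f_p alias used in the question, and that the `sql` wrapper passes stdin through to mysql (both assumptions; the join column is purely illustrative):

    sql dewiki <<'SQL'
    SELECT p.page_id, p.page_title
    FROM page AS p
    JOIN commonswiki_f_p.page AS cp
      ON cp.page_title = p.page_title
    WHERE p.page_namespace = 6 AND cp.page_namespace = 6  -- 6 = File namespace
    LIMIT 1;
    SQL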
[18:22:25] i ssh ed into palerdot@bastion.wmflabs.org and i'm clueless on what to do next.
[18:25:15] (PS1) Yuvipanda: Move Growth related extensions to #wikimedia-growth [labs/tools/pywikibugs] - https://gerrit.wikimedia.org/r/134150
[18:26:19] (CR) Swalling: [C: 1] "Please do." [labs/tools/pywikibugs] - https://gerrit.wikimedia.org/r/134150 (owner: Yuvipanda)
[18:26:59] palerdot: I'm in a meeting at the moment but can help in ~30 mins.
[18:27:12] (CR) Mattflaschen: [C: 1] "Channels are correct." [labs/tools/pywikibugs] - https://gerrit.wikimedia.org/r/134150 (owner: Yuvipanda)
[18:27:33] (CR) Yuvipanda: [C: 2 V: 2] "Thanks!" [labs/tools/pywikibugs] - https://gerrit.wikimedia.org/r/134150 (owner: Yuvipanda)
[18:28:06] andrewbogott: thank you for the help. I will wait till you are available. thanks again
[18:28:50] (PS1) Yuvipanda: Move growth things to #wikimedia-growth [labs/tools/grrrit] - https://gerrit.wikimedia.org/r/134153
[18:30:33] (CR) Swalling: [C: 1] "LGTM" [labs/tools/grrrit] - https://gerrit.wikimedia.org/r/134153 (owner: Yuvipanda)
[18:31:01] (CR) Yuvipanda: [C: 2 V: 2] Move growth things to #wikimedia-growth [labs/tools/grrrit] - https://gerrit.wikimedia.org/r/134153 (owner: Yuvipanda)
[18:31:47] I got a result... But shouldn't a query like "SELECT * FROM commonswiki_f_p.page LIMIT 0,1;" be faster than 2 minutes from dewiki?
[18:35:16] (PS1) Yuvipanda: Remove duplication of -multimedia extensions from -dev [labs/tools/grrrit] - https://gerrit.wikimedia.org/r/134154
[18:38:57] Wikimedia Labs / tools: running job with SGE doesn't get the right environment when using -once -continuous - https://bugzilla.wikimedia.org/65491#c2 (Philippe Elie) Looking the man page http://cf.ccmr.cornell.edu/cgi-bin/w3mman2html.cgi?qsub%281B%29 environment var are not exported to the job, it's th...
[18:56:26] palerdot: ok, I have a few minutes now… can you tell me what you're hoping to do?
[19:03:26] andrewbogott: i just want to get to know about developer access and how one can use it for contributing
[19:03:51] palerdot: OK… labs is divided into projects, each of which has its own access policy.
[19:04:04] So until you know what you want to do/who you're doing it with… I'm not sure where to start :)
[19:04:21] If you want shell access to write a tool or a bot, that would be the 'tools' project.
[19:04:29] oh ok. i signed up to clone the mediawiki repository from git
[19:05:17] so that i can browse around the code and get to know the wikipedia development stack.
[19:05:53] ok! That you can do locally without logging into labs. Or you can set up 'vagrant' and get a mediawiki dev environment set up locally.
[19:06:12] (or we can set you up with a mw dev environment on labs, but that's probably the most trouble and I don't know that that helps you.)
[19:06:18] Lemme find some vagrant docs
[19:07:00] palerdot: what are you running on your local machine? windows/linux/mac/???
[19:07:06] linux ubuntu
[19:07:43] let us say if i set up a local environment what should i do to push some code to the project ?
[19:08:12] If you want a local checkout of the source (without a running server or anything) just do this:
[19:08:13] git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git
[19:08:38] The account that you just now created will provide you with 'gerrit' access, which is the system we use to submit and review patches.
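The qsub man page Philippe Elie cites in the bug comment above is the crux of the -once -continuous problem: SGE does not copy the submitting shell's environment into the job by default. One possible workaround, assuming jsub forwards unrecognised options on to qsub (not verified here), is to hand the variable to the job explicitly:

    # qsub's -v passes a named variable into the job; -V would export the whole environment.
    jsub -once -continuous -N test_env \
        -v PYTHONPATH="/shared/pywikipedia/core:/shared/pywikipedia/core/scripts" \
        /bin/bash -c "env"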
[19:09:07] So in order to submit a patch, you'd want to log into gerrit.wikimedia.org (using the same credentials as on wikitech) and upload an ssh public key there.
[19:09:21] ok! i just now did that
[19:09:35] Then you can submit patches with just 'git review'
[19:09:54] presuming everything goes well with auth on gerrit, patches will appear publicly for others to view, review, merge on gerrit.wikimedia.org
[19:10:11] ok! i believe i still need to know a lot more things before i could contribute anything meaningful to mediawiki
[19:10:16] palerdot: to set up a local test/dev box, try this: https://www.mediawiki.org/wiki/MediaWiki-Vagrant#Quick_start
[19:10:35] It's really slick, gets you a local VM running a mediawiki web server on your local machine.
[19:10:52] So you can muck around with the MW source and break things :)
[19:10:58] ok! thanks! will definitely try that.
[19:11:04] and by the way thanks for the help
[19:11:07] sure thing.
[19:11:09] this is my first irc chat
[19:11:14] :)
[19:11:25] i cannot believe people here took their time to help me out
[19:11:34] Labs comes into play if you have a project that you want to host in a public place so other people can see your work.
[19:11:51] So for instance if you write a custom extension or something like that (rather than just a minor bugfix) and want to demonstrate it
[19:11:54] oh ok ! i will set up a local environment and see how it goes
[19:11:59] you'd set up a labs server so that other people can visit.
[19:12:09] Yep, local env will be easier in the meantime.
[19:12:32] is it something like a google appspot
[19:13:00] hm, maybe? I don't know what that is :)
[19:13:22] ok! thanks! will start with setting up a local environment
[19:13:47] is php knowledge enough to get to know mediawiki ?
[19:15:13] yep, should be.
[19:15:27] There's some javascript in there but I'm always able to ignore it
[19:16:37] ok! great! thanks again for the help! i'm going off for a while now. Hope we will have more great conversations here in future
[19:16:41] bye !
[19:16:45] good luck!
[19:16:51] thanks !
[19:44:41] Tool Labs tools / Matthewrbowker's tools: MATTHEWRBOWKER-8 On-IRC help system for WikiWelcomer - https://bugzilla.wikimedia.org/59067 (Andre Klapper) s:critic>enhanc
[20:19:12] YuviPanda: if you're still awake, should we try to upgrade nginx again?
[20:19:19] (And, I guess by 'we' I mean 'you')
[20:19:27] andrewbogott: sure
[20:20:29] andrewbogott: tools first?
[20:20:36] Either way.
[20:20:41] ok tools first
[20:20:50] sshing in now
[20:20:58] http://tools.wmflabs.org/ uses the tools proxy?
[20:21:21] yup
[20:21:23] YuviPanda: please tests if the pywikibot nightlies work afterwards ;-)
[20:21:33] or poke me to do that
[20:22:50] valhallasw: sure
[20:24:26] valhallasw: test now?
[20:24:48] seems to work
[20:24:53] andrewbogott: ^
[20:24:58] everything seems alright
[20:25:00] not very fast though
[20:25:02] scfc_de: just upgraded nginx on tools
[20:25:06] yep, looks ok to me.
[20:25:07] valhallasw: slower than before?
[20:25:07] ~ 130 KB/s
[20:25:10] instead of 2MB/s
[20:25:27] I'm on terrible internet so would be a terrible person to test, can someone else check?
[20:25:31] valhallasw: can you paste a link?
[20:25:48] http://tools.wmflabs.org/pywikibot/core.tar.gz
[20:26:11] andrewbogott: do you have a fast enough connection to check speed on ^?
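Pulling andrewbogott's pointers together, the local workflow he outlines for palerdot might look roughly like this (the branch name and commit message are placeholders; git-review has to be installed separately):

    # Get a local checkout of MediaWiki core.
    git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git
    cd core

    # One-time setup: point git-review at the gerrit remote defined in .gitreview.
    git review -s

    # Make a change on a topic branch, commit it, and push it for review on gerrit.wikimedia.org.
    git checkout -b my-first-fix
    git commit -a -m "Describe the change here"
    git review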
[20:26:27] basically, gzipping gzipped content makes no sense, and makes the proxy unhappy, I think :-p
[20:26:35] I don't think we're doing that
[20:26:41] mime types are explicitly whitelisted
[20:26:43] for gzipping
[20:26:46] I'm getting around 1.5 M/s
[20:27:31] valhallasw: is it still slower if you try again? Maybe restarting nginx invalidated a cache someplace?
[20:28:06] ~600KB/s now. Maybe it's actually my connection that's crappy :/
[20:28:11] andrewbogott: while you are here, care to merge https://gerrit.wikimedia.org/r/#/c/133265/
[20:28:11] ?
[20:28:17] andrewbogott: and https://gerrit.wikimedia.org/r/#/c/133266/
[20:31:32] YuviPanda: you upgraded nginx on the labs proxy as well?
[20:31:49] andrewbogott: nope, thinking of letting it simmer in tools for a while
[20:31:55] ok
[20:32:10] andrewbogott: tomorrow perhaps?
[20:32:13] sure
[20:32:43] !log tools upgraded nginx on tools-webproxy... again.
[20:39:51] I get about 1.5 MByte/s for core.tar.gz.
[20:40:14] YuviPanda: What would be the downgrade procedure (just in case)?
[20:40:58] scfc_de: 1. rm 1.7 nginx from /data/project/.system/debs/all and amd64 2. run update-repos.sh in debs, 3. do apt-get update 4. apt-get install nginx-extras 5. run puppetd -tv
[20:42:08] Does that downgrade the nginx package?
[20:42:17] yup, since that'll leave the 1.5 packages there
[20:42:43] But if 1.7 is already installed, wouldn't it just keep that installed (and running)?
[20:43:34] (Never tried if Puppet => latest downgrades if that would be the conclusion.)
[20:43:34] scfc_de: I think the install downgrades it? if not you can always just do a dpkg -i of both common (in all) and extras (in amd64)
[20:44:05] For the moment, let's hope it works :-). What were the symptoms last time?
[20:45:27] scfc_de: :)
[20:45:33] scfc_de: but those were from redis, as we later discovered
[20:50:14] (Abandoned) Adamw: Reformat using a slightly more yaml-y style [labs/tools/grrrit] - https://gerrit.wikimedia.org/r/112311 (owner: Adamw)
[21:07:19] bits.beta.wmflabs.org is bitchy :(
[21:44:51] andrewbogott, with regards to the trusty image; have you done any further testing in order to remove the provisional status of that image? paravoid and I brokered a deal with greg-g that I would either release PDF into betalabs on a non testing image or as a beta feature using the real hardware I already have
[21:45:00] but that relies on a non provisional image
[21:45:15] alternatively; if you think it's stable enough to work with I can just use it
[21:45:32] mwalker: Should be relatively close to release; ask me again tomorrow :)
[21:45:37] kk
[21:45:44] Best not to create anything based on the 'testing' image though, as I'm hoping to replace it.
[21:45:52] And VMs die if their parent image is removed.
[21:45:56] makes sense
[22:11:08] Coren: is wikitech treating you ok just now?
[22:12:14] andrewbogott: 15:11 <+icinga-wm> PROBLEM - LDAP on virt1000 is CRITICAL: Connection refused
[22:12:16] Ah, that'd be it
[22:12:19] andrewbogott: It logged me out and won't log me back in.
[22:12:23] Ah, clearly.
[22:12:28] andrewbogott: Want me to go look?
[22:13:04] Coren: Sure… I'm looking too.
[22:13:24] Hm. It's running.
[22:13:49] But apparently not listening.
[22:15:04] * andrewbogott wonders for the 10th time why having a backup on virt0 does not help
[22:15:24] Coren: Shall I restart opendj or do you think there are other interesting things to look for first?
[22:15:36] I'm not sure the openldap auth extension to mediawiki actually uses a backup.
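A rough transcription of the five downgrade steps YuviPanda lists above, assuming the 1.7 packages sit in the local repo under /data/project/.system/debs, that update-repos.sh lives in that directory, and that the glob patterns match the actual package file names (none of this verified here; sudo added where package operations usually need it):

    # 1. drop the 1.7 packages from the local repo
    rm /data/project/.system/debs/all/nginx*1.7* /data/project/.system/debs/amd64/nginx*1.7*
    # 2. rebuild the repo indexes
    (cd /data/project/.system/debs && ./update-repos.sh)
    # 3-4. refresh apt and reinstall, which should pick up the remaining 1.5 packages
    sudo apt-get update
    sudo apt-get install nginx-extras
    # 5. let puppet settle the rest
    sudo puppetd -tv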
[22:15:38] Though most other things will.
[22:16:07] andrewbogott: I had already scoured the logs to no avail and I just restarted it.
[22:16:19] ok
[22:16:33] andrewbogott: Hoping to see something interesting during its startup -- or more precisely hoping to /not/ see anything. :-)
[22:16:47] I guess it can't be a firewall issue since it's the same host
[22:16:57] aaand... that worked.
[22:17:07] * Coren growls at opendj.
[22:18:32] ok, I'm going to restart pdns since it usually freaks out when opendj restarts
[22:47:23] so I supposedly have bastion shell access but the bastion host seems to be expecting a key that I don't have. do I need to submit my ssh public key somewhere to have it added to authorized_keys on labs hosts?
[22:55:08] twentyafterfour, you need to add it to your public keys list in wikitech preferences
[22:55:20] (iirc)
[23:00:59] Krenair: I see, thanks! I'm in
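Once the public key has been added under wikitech preferences, as Krenair suggests, a basic check from the local machine is simply to try the bastion again; this is a sketch with placeholder values (yourusername and the key path are whatever you actually use):

    # Test that the key now lets you onto the labs bastion.
    ssh -i ~/.ssh/id_rsa yourusername@bastion.wmflabs.org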