[00:03:06] Labs: Enable nfs to access /data/project for the wikimetrics project on labs - https://phabricator.wikimedia.org/T122258#1899730 (madhuvishy) NEW a:yuvipanda
[03:12:50] I hope somebody can help me connect to a labs instance
[03:13:43] SPage was helping me last time
[03:13:58] and I'm running into some trouble again
[03:14:20] it's the instance for livingstyleguide.wmflabs.org/wiki/Main_Page
[03:20:52] https://www.irccloud.com/pastebin/YezJo6f3/ssh-error
[06:47:55] Change on wikitech.wikimedia.org a page Nova Resource:Tools/Access Request/Arlitt was created, changed by Arlitt link https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/Access_Request/Arlitt edit summary: Created page with "{{Tools Access Request |Justification=I am working on the oa project and would like to access the recitation-bot project here. |Completed=false |User Name=Arlitt }}"
[09:08:15] Labs: New instance creation broken - https://phabricator.wikimedia.org/T122281#1900424 (yuvipanda) NEW
[09:36:51] Labs: nfs-exports.service is failing on labstore1001 often - https://phabricator.wikimedia.org/T122250#1900442 (yuvipanda)
[09:39:33] Labs: nfs-exports.service is failing on labstore1001 often - https://phabricator.wikimedia.org/T122250#1900443 (yuvipanda) p: High→Normal I am pretty sure this is all LDAP flakiness, since this hits LDAP pretty heavily. It's also a low-criticality service; if it fails, new instances in projects that have...
[09:48:09] PAWS, Patch-For-Review: PAWS network error: - https://phabricator.wikimedia.org/T120561#1900465 (yuvipanda) Open→Resolved Pretty sure the flannel fix fixed it :) Instance lockup is being tracked in T121998
[09:48:37] Labs: New instance creation broken - https://phabricator.wikimedia.org/T122281#1900469 (yuvipanda) Trusty seems fine, Jessie's broken.
[09:49:55] PAWS: Add a way to expose static files to the internet from PAWS - https://phabricator.wikimedia.org/T119859#1900471 (yuvipanda) I've split up the proxy and the hub into separate pods so they can be scaled separately now. Now to add register-with-proxy support to nbserve...
[09:50:43] Labs: New instance creation broken - https://phabricator.wikimedia.org/T122281#1900472 (yuvipanda) oooh, fascinating. Another instance created worked out fine. wut.
[09:51:30] Labs: New instance creation broken - https://phabricator.wikimedia.org/T122281#1900473 (yuvipanda) sacrificial-puppy was a trusty instance, sacrificial-kitten was a jessie that didn't work, and sacrificial-red-hair was a jessie that does work.
[10:06:58] PROBLEM - Puppet failure on sacrificial-red-hair is CRITICAL: CRITICAL: 12.50% of data above the critical threshold [0.0]
[10:09:11] PROBLEM - Puppet failure on congratulatory-green-hair is CRITICAL: CRITICAL: 16.67% of data above the critical threshold [0.0]
[10:12:05] RECOVERY - Puppet failure on sacrificial-red-hair is OK: OK: Less than 1.00% above the threshold [0.0]
[10:14:03] RECOVERY - Puppet failure on congratulatory-green-hair is OK: OK: Less than 1.00% above the threshold [0.0]
[10:17:14] Labs: Enable nfs to access /data/project for the wikimetrics project on labs - https://phabricator.wikimedia.org/T122258#1900491 (yuvipanda) Open→Declined So I talked to @madhuvishy in person today, and I think I've convinced her that NFS is evil and requires blood sacrifices that we do not have the cap...
[10:35:23] Labs: Instances locking up randomly - https://phabricator.wikimedia.org/T121998#1900510 (faidon) So, tools-worker-07 is currently "stuck". Kernel and userland both seem to be responsive, but logins over SSH time out. I logged in over VNC and Yuvi ran "passwd -d root" over salt (which still works), so I could l...
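For the SSH trouble reported at 03:12 above, a common culprit is a missing bastion hop: labs instances generally have no public address, so SSH has to be tunneled through a bastion host. A minimal ~/.ssh/config sketch, assuming the standard bastion.wmflabs.org jump host and the internal *.eqiad.wmflabs instance naming; the shell name is a placeholder, not taken from the log:

```
# Sketch only. <shell-name> stands for the user's wikitech shell account.
Host bastion.wmflabs.org
    User <shell-name>

Host *.eqiad.wmflabs
    User <shell-name>
    # Instances are not directly reachable; tunnel through the bastion.
    ProxyCommand ssh -a -W %h:%p <shell-name>@bastion.wmflabs.org
```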
[10:43:09] Labs, Tool-Labs, Patch-For-Review: strace sshd to figure out which filesystems are hit on login - https://phabricator.wikimedia.org/T104327#1900514 (yuvipanda) Boom some more!
[10:44:58] Labs: Instances locking up randomly - https://phabricator.wikimedia.org/T121998#1900516 (yuvipanda) Ok, I'm going to move tools-worker-01 to -05 to the 3.19 kernel, and -06 to -09 to 4.2
[10:46:26] !log tools migrate tools-worker-01 to 3.19 kernel
[10:46:30] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/SAL, Master
[11:35:10] Labs: Instances locking up randomly - https://phabricator.wikimedia.org/T121998#1900612 (faidon) @valhallasw raised the good point that root SSH logins should be working even with NFS failed/stuck. I ran strace and found out that ssh-key-ldap-lookup was trying to access /home/ssh-key-ldap-lookup/.local/lib/py...
[13:50:29] hi
[13:50:30] I have a project on GitHub; how can I update the source on wmflabs automatically?
[13:57:04] hoo: can I change the autologout time on toolserver?
[13:57:11] The_Photographer: I suppose you could have some sort of post-receive hook on github, but most people just deploy in a separate step
[13:57:52] valhallasw`cloud: I don't know if I can log in to wmflabs from a post-receive hook
[13:57:54] I don't think we have an autologout?
[13:58:11] ssh logout time
[13:58:30] autologout because the session is inactive
[13:58:32] ssh session
[13:58:38] ah. Send keepalive packets.
[13:58:54] http://www.howtogeek.com/howto/linux/keep-your-linux-ssh-session-from-disconnecting/
[13:59:43] thanks
[13:59:43] The_Photographer: I don't think there's a simple way to do post-receive from github, no.
[14:00:39] valhallasw`cloud: do you know of a tool for continuous integration?
[14:00:47] on wmflabs, of course
[14:01:08] what are you trying to do?
[14:01:20] (I don't see how CI is related to github post-receive hooks)
[14:02:18] I want my code ready to test on wmflabs for each commit on GitHub
[14:02:46] I don't have time to do a git pull to test everything
[14:02:53] I can't test it locally
[14:03:41] again, no easy way to do that.
[14:04:24] you can push to a repo on tool labs, and use a post-commit hook there, but it's always fiddly.
[14:04:39] the easiest way is to just have a terminal open and do a git pull
[14:06:44] ok, thanks
[14:07:12] (or make a shell script that does that immediately after pushing; e.g. git push && ssh tools-login.wmflabs.org "become wikibugs /bin/sh -c 'cd src/wikibugs2 && git pull'")
[14:08:22] yes, a local script
[14:31:52] valhallasw`cloud: could you add gerrit for wikiradio?
[14:32:02] The_Photographer: no, please follow the on-wiki process.
[14:32:10] ok
[14:32:55] PROBLEM - ToolLabs Home Page on toollabs is CRITICAL: CRITICAL - Socket timeout after 10 seconds
[14:37:19] hm. works for me
[14:37:43] it's mostly just flaky, it seems
[14:37:52] RECOVERY - ToolLabs Home Page on toollabs is OK: HTTP OK: HTTP/1.1 200 OK - 966126 bytes in 8.114 second response time
[14:42:27] !log tools toollabs homepage is unhappy because tools.xtools-articleinfo is using a lot of cpu on tools-webgrid-lighttpd-1409. Checking to see what's happening there.
[14:42:31] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/SAL, Master
[14:43:08] valhallasw`cloud: thank you for investigating! I'm still hoping to eat breakfast before I get in to this :)
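On the 13:58 autologout exchange: the suggested fix (keepalive packets) is a client-side SSH setting, and the linked howtogeek article describes the same knob. A minimal ~/.ssh/config sketch; the 60-second interval is an arbitrary example value, not a recommendation from the log:

```
Host *
    # Send an application-level keepalive every 60 seconds on idle
    # sessions, so NAT gateways and firewalls don't drop the connection.
    ServerAliveInterval 60
    # Give up after this many unanswered keepalives (optional).
    ServerAliveCountMax 3
```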
[14:43:46] it looks like someone is scraping GET /xtools-articleinfo/?article=Category:University_of_California,_San_Diego&project=en.wikipedia.org&uselang=*
[14:47:57] and that IP is also firing lots of requests to geohack
[14:48:53] valhallasw`cloud: Wouldn't be the first idiot spider that crawls internal links into a loop.
[14:49:18] if it's a spider, it's also malicious -- the user agent is "Mozilla/5.0 (Macintosh; Intel Mac OS X) AppleWebKit (KHTML, like Gecko)"
[14:49:24] (also not too uncommon)
[14:50:32] Also.
[14:50:51] What does the IP whois to?
[14:51:04] ipXX-YY-ZZ-AA.sd.sd.cox.net
[14:51:32] sorry, whois. Eh, cox communications ;-)
[14:52:09] End user. Huh. Either a zombie, or some client "read offline" insanity. I'm pretty sure safari doesn't do that by default.
[14:52:23] I'm betting the former
[14:52:27] google suggests it's apple mail, which makes even less sense
[14:52:56] I'm thinking someone with an email-scraper bit of malware.
[14:53:14] I'd blackhole it at the proxy, at least for a while.
[14:57:35] How can I add a remote github repository?
[14:58:03] git remote add origin git@github.com:emijrp/wikiradio.git -->> Please make sure you have the correct access rights
[14:58:11] I am in my public_html tool folder
[15:07:20] Coren: any issues with having the IP address in puppet?
[15:07:36] and if so, how can we set them out-of-band on both servers?
[15:23:39] andrewbogott: ^?
[15:34:49] * Coren ponders.
[15:35:19] valhallasw`cloud: It can't be associated with a user, so it's not PII.
[15:35:42] valhallasw`cloud: There's a bit of BEANS, but I don't think that can't be helped. At least, not without uglies.
[15:36:01] * Coren counts negatives.
[15:36:03] BEANS?
[15:36:32] valhallasw`cloud: https://en.wikipedia.org/wiki/Wikipedia:BEANS
[15:36:39] Sorry, enwiki alphabet soup. :-)
[15:37:14] I'm still fiddling with nginx to get it to actually return sane data so chrome doesn't puke. Fun.
[15:38:13] ah, extra newlines. I can do that.
[16:24:48] hi
[16:26:00] Hello!
[16:49:54] valhallasw`cloud: I'm back, sorry — are you still fighting with tools.xtools-articleinfo?
[16:50:07] andrewbogott: fighting with getting nginx to do what I want, mostly
[16:50:26] 'I'll make sure they get a nice 403, that can't be too hard?'
[16:50:33] #famouslastwords
[16:50:53] ok — let me know if I can help :)
[16:53:55] Hello
[16:54:34] The_Photographer: what's up?
[16:54:39] how can I solve this CORS problem with https://tools.wmflabs.org/wikiradio/get_content.html ? I am trying to make a test
[16:54:57] thanks andrewbogott
[16:57:07] The_Photographer: Ah, dang, I'm probably not the right person to ask. Shouting out the question here is a reasonable start… you could also write to labs-l
[16:57:31] I am sorry
[16:58:15] You're not in the wrong place :) A few of us here can troubleshoot labs infrastructure (that's me) but there are also often tools developer folks as well
[16:59:42] andrewbogott: https://tools.wmflabs.org/.error/banned.html :-D
[16:59:51] ok, now let me puppetize this
[17:00:58] That's a very polite page! I assume that the linked page is a wip and not that you meant a different link?
[17:01:26] andrewbogott: writing content on that page is next on the to-do
[17:01:30] I am sorry, why are you banned :(
[17:01:52] The_Photographer: it's an example page, not normally shown
[17:02:06] ok
[17:14:42] andrewbogott: what's your assessment on having the banned ip addresses in puppet/phabricator?
[17:15:13] I *think* it's possible to use a separate unpuppetized file, but it's a bit of a mess
[17:15:38] valhallasw`cloud: it doesn't freak me out that much but… is this really the first time we've ever banned an IP?
[17:15:41] Coren: ^ ?
[17:16:01] andrewbogott: No. It's the first time we did it cleanly in puppet though.
[17:16:19] andrewbogott: Every other time was emergency filtering directly in iptables because outage-causing.
[17:16:32] ah, ok
[17:16:36] (Also, before clean ferm rules)
[17:16:49] well… I don't think that putting an IP in puppet leaks any information other than 'this is an IP address'
[17:17:08] Labs, Tool-Labs: 70.179.6.X is hammering xtools-articleinfo - https://phabricator.wikimedia.org/T122307#1901159 (valhallasw) NEW
[17:17:13] ok, then I'll just hop it in there.
[17:17:25] andrewbogott: That was my opinion as well - it's not PII since it is not linked to any user.
[17:18:00] Probably good to send an email to labs-l saying that you blocked it, so that potential innocent victims know what is happening
[17:18:11] *nod*
[17:18:18] although that error page should be clear enough ;-)
[17:19:40] oh, that's true
[17:22:36] guys, I am using the example from https://www.mediawiki.org/wiki/Manual:CORS in https://tools.wmflabs.org/wikiradio/get_content.html. What happened?
[17:23:29] it shows "Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://en.wikipedia.org/w/api.php?action=query&meta=tokens&format=json&origin=https%3A%2F%2Fwww.mediawiki.org. (Reason: CORS header 'Access-Control-Allow-Origin' missing)."
[17:25:02] I added "Header set Access-Control-Allow-Origin "*"" to .htaccess
[17:25:04] Coren: does my question on https://gerrit.wikimedia.org/r/#/c/257411/ make sense or am I misunderstanding what that patch does?
[17:25:20] * Coren goes read said question.
[17:27:25] The question makes sense, but the answer is "Nah." :-) Just explained on the patch.
[17:29:04] (But the patch is outdated for another reason)
[17:29:51] how can I read a wikipedia article's content from php? where should I begin?
[17:30:11] The_Photographer: You want to use the API for that.
[17:30:23] Coren: show me the way
[17:30:40] Coren: from a wmflabs application
[17:31:12] The_Photographer: That's rather a wide scope. :-) Lemme see if I can find some example PHP code for you.
[17:31:20] thanks
[17:35:05] I am trying to do it with javascript, but it's showing a CORS problem, so php could be easier
[17:41:08] Could somebody DROP DATABASE u2815_p;? I'm not using it--and can't even touch it anymore due to security permission changes.
[17:41:48] * Dispenser taps Coren on the shoulder
[17:41:55] Dispenser: please create a task in phab
[17:42:01] #DBA project
[17:51:20] valhallasw`cloud: are you ready for me to merge that now?
[17:51:24] andrewbogott: yep!
[17:51:58] 'abr: always be rebasing'
[17:52:02] :D
[17:52:14] 'git abr' sounds like something that could exist
[17:52:28] watch 'git fetch && git rebase origin/production'
[17:54:04] Yeah, we could turn git-review into a service that runs in the background, detects a change in the tip and automatically resubmits patches. The spam potential would be limitless!
[17:55:27] merged
[17:57:37] service paladox start
[17:58:33] * valhallasw`cloud prods labs-puppetmaster
[18:01:41] Dispenser, please create a phabricator task to verify you are the owner of that database. I can rename it for you, but I have to confirm it is owned by you
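For context on the IP ban being puppetized above (16:59 through 17:19): the usual nginx pattern for blackholing a client at the proxy while still serving a friendly error page looks roughly like the sketch below. This is not the actual puppet template; the paths are illustrative, the address range is taken from the T122307 task title, and in the real setup the address list would be generated by puppet:

```nginx
# Sketch of the technique, not the production config.
geo $banned {
    default        0;
    70.179.6.0/24  1;   # illustrative range, per T122307
}

server {
    listen 80;

    # Banned clients get the custom static page with a 403 status.
    error_page 403 /.error/banned.html;
    location /.error/ {
        root /var/www;   # hypothetical location of the static error pages
    }

    if ($banned) {
        return 403;
    }
}
```

This matches the flow in the log: the static banned.html page already works, and the remaining trouble (18:17 below) is getting the generated address list to render.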
[18:02:18] how can I get a wikipedia article's content?
[18:03:02] In any language
[18:03:07] I don't care
[18:03:19] I only need the article content
[18:03:27] The_Photographer: I've had good luck using https://github.com/mwclient/mwclient
[18:03:28] in python
[18:03:35] The_Photographer: https://en.wikipedia.org/w/api.php?action=parse&page=foobar
[18:03:45] There's also 'pywikibot' which I think does similar things, but I'm unsure if that's the right scope
[18:04:40] it also depends on whether you want wiki code or HTML
[18:05:31] http://dispenser.homenet.org/~dispenser/sources/make_dabs_gz.sh Script to mass download parsed disambiguation pages and package em into a tar file
[18:06:17] The_Photographer: here's a variant for plain text https://en.wikipedia.org/w/api.php?format=xml&action=query&prop=extracts&titles=Cat&redirects=true
[18:06:22] mutante: the raw code
[18:06:25] mostly
[18:06:37] mutante: I can't use the api
[18:06:55] because of the CORS problem
[18:07:08] There's also ?action=render
[18:07:12] wouldn't that be the same problem whether you use the API or not?
[18:07:13] andrewbogott: any idea how to install it on wmflabs?
[18:07:43] mutante: I can't use the api from javascript
[18:07:43] valhallasw`cloud: ok, that's merged finally
[18:07:51] The_Photographer: so you want the mediawiki syntax?
[18:08:04] as you would use it when editing a page?
[18:08:16] mutante: I don't care, I use this page like javascript
[18:08:25] The_Photographer: Yes you can. http://dispenser.homenet.org/~dispenser/view/Dab_solver does it all the time
[18:08:26] The_Photographer: http://stackoverflow.com/questions/1625162/get-text-content-from-mediawiki-page-via-api
[18:08:42] You need to set up callbacks
[18:09:18] https://en.wikipedia.org/w/api.php?action=parse&format=jsonfm&callback=mosdab_callback&redirects&page=/\
[18:09:50] mutante: I am sorry, I can't use the fucking api because of the CORS problem
[18:10:11] The_Photographer: i think Dispenser has the solution to that
[18:10:24] https://tools.wmflabs.org/wikiradio/get_content.html
[18:11:16] Dispenser: I am using the same example from https://www.mediawiki.org/wiki/Manual:CORS and it simply doesn't work
[18:11:30] http://dispenser.homenet.org/~dispenser/view/Dabfix#Statistics That table's loaded remotely
[18:11:59]
[18:12:01]
[18:13:29] Dispenser: please, could you use pastebin?
[18:14:06] because I am not understanding you
[18:14:22] it's just about the URL in there
[18:14:30] the api.php link uses callback=
[18:14:33] that's the difference
[18:14:50] I am using callback
[18:15:08] sorry, then i don't know
[18:15:20] no problem
[18:16:20] Anyway, I'd prefer to do it from the server side
[18:16:23] The_Photographer: http://pastebin.com/eLnf2AGq
[18:17:22] valhallasw`cloud: working now?
[18:17:36] andrewbogott: almost. The for loop is doing something other than what I expected
[18:17:49] andrewbogott: https://gerrit.wikimedia.org/r/260775 fixed that -- I think :/
[18:18:00] but puppet runs correctly, and the .error subdir works
[18:18:12] it's just the list of IPs that fails to display
[18:18:14] hm
[18:18:30] there's also no 1, though, so that suggests the for loop fails to run at all
[18:18:33] oops, yeah, that looks necessary
[18:18:45] Dispenser: thanks
[18:20:19] andrewbogott: I'm going to make dinner and look at it later
[18:20:27] see ya
[18:20:46] valhallasw`cloud: ok, I'll be around on and off for a while
[18:20:47] So was it what you were looking for?
[18:21:31] Dispenser: do you have some documentation?
[18:21:47] Dispenser: well, I am looking for ajax
[18:21:56] I just read through https://en.wikipedia.org/w/api.php?action=help&recursivesubmodules=1
[18:22:18] What are you specifically looking for?
[18:22:57] BTW, AJAX doesn't work cross domain. You need to create a new
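Dispenser's api.php link above (the one with callback=mosdab_callback) shows the workaround he is describing: JSONP. A <script> tag is exempt from the same-origin policy, and when the API is given callback=, it wraps the JSON response in a call to that function. A minimal sketch; the callback name, element id, and page title are all illustrative, not taken from the log:

```javascript
// JSONP sketch: fetch rendered article HTML from the MediaWiki API
// without hitting the CORS restriction that blocks XMLHttpRequest.
function showArticle(data) {
    // action=parse puts the rendered HTML under parse.text['*'].
    document.getElementById('content').innerHTML = data.parse.text['*'];
}

var s = document.createElement('script');
// The API responds with showArticle({...}) because of callback=.
s.src = 'https://en.wikipedia.org/w/api.php' +
        '?action=parse&format=json&callback=showArticle' +
        '&redirects&page=Coffee';   // example page title
document.head.appendChild(s);
```

The same pattern should apply to the prop=extracts query mentioned at 18:06 if plain text rather than HTML is wanted.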