[00:05:52] !log tools.stashbot Restarted bot to clear busted phabricator session [00:05:54] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stashbot/SAL, Master [00:06:05] !log tools.stashbot Added Greg Grossmeier as co-maintainer [00:06:08] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stashbot/SAL, Master [00:06:24] greg-g: you has the power! [00:07:44] yay [00:08:11] huh, I don't get an Echo notification about that [00:08:26] it's all LDAP I think [00:08:37] also echo and wikitech aren't the best of friends [00:08:53] yeah yeah [00:10:03] 6Labs, 6Operations, 10wikitech.wikimedia.org, 13Patch-For-Review: Update wikitech-static OS/PHP version - https://phabricator.wikimedia.org/T126385#2101565 (10Krenair) a:3Krenair * Copied my usual files across (basically what I have in prod puppet modules/admin/files/home/krenair), installed git (and bas... [00:11:42] testing the log tailing T123 [00:11:52] not in here eh [00:12:14] it should work everywhere [00:12:30] the errors take a while to show up 'cause NFS [00:12:35] * greg-g nods [00:12:36] gotcha [00:12:41] 'tis working [00:13:14] hmm.. T123 [00:13:29] weird [00:13:55] the bot is answering in ##stashbot [00:14:00] but not here [00:28:29] and in devtools [00:29:24] greg-g: hmm.. that seems suspicious [00:29:59] !log tools.stashbot Checking to see if bot is really listening in #wikimedia-labs [00:30:02] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stashbot/SAL, Master [00:30:34] that worked -- https://tools.wmflabs.org/sal/log/AVNYx8A4RVcO3hhfZoUt [00:33:47] 10PAWS, 6Research-and-Data: Build a pool of beta testers for PAWS - https://phabricator.wikimedia.org/T129297#2101483 (10jayvdb) What is CSCW? 
[00:34:28] T129283 [00:37:01] 10Wikibugs: Get icon and color from API instead of screen scraping - https://phabricator.wikimedia.org/T1176#2101622 (10mmodell) [00:37:29] 10Wikibugs: Get icon and color from API instead of screen scraping - https://phabricator.wikimedia.org/T1176#20337 (10mmodell) Can this be closed now that 1b6bbd391ad1f23a merged? [00:38:22] greg-g: are you an op in here? I wonder if somebody +b'd stashbot? [00:38:42] I'm not [00:38:57] 10Wikibugs: Get icon and color from API instead of screen scraping - https://phabricator.wikimedia.org/T1176#20337 (10Legoktm) No? That patch didn't make it stop screen-scraping. [00:39:48] 10Wikibugs: Wikibugs links sometimes to the creation event, not to the mentioned comment - https://phabricator.wikimedia.org/T129246#2101648 (10mmodell) FYI: I want to build a unified conduit api so that you wouldn't have to screenscrape anything, and hopefully it won't even be necessary to make multiple api cal... [00:49:30] !bash Just a test [00:50:15] greg-g: stashbot must be muted by channel flags here somehow [00:56:02] bd808: :( [00:56:41] RECOVERY - Puppet failure on tools-services-02 is OK: OK: Less than 1.00% above the threshold [0.0] [01:22:33] 6Labs, 6Operations, 10wikitech.wikimedia.org, 13Patch-For-Review: Update wikitech-static OS/PHP version - https://phabricator.wikimedia.org/T126385#2101749 (10Krenair) * Copied the wikitech import cron across * Copied MW config * Set up new copy of MediaWiki * Installed memcached and mysql-server-core-5.5/... 
[01:52:19] 10Quarry: Quarry should preserve the protocol when logging in through OAuth - https://phabricator.wikimedia.org/T92600#2101766 (10Huji) 5Invalid>3Open [02:25:32] 10Quarry: Quarry should preserve the protocol on redirects - https://phabricator.wikimedia.org/T92600#2101895 (10Huji) [02:49:56] 6Labs, 6Operations, 10wikitech.wikimedia.org, 13Patch-For-Review: Update wikitech-static OS/PHP version - https://phabricator.wikimedia.org/T126385#2101906 (10Dzahn) once we are ready to switch i was going to do this https://gerrit.wikimedia.org/r/#/c/276088/ [02:56:22] RECOVERY - Puppet failure on tools-exec-1406 is OK: OK: Less than 1.00% above the threshold [0.0] [03:08:18] RECOVERY - Puppet failure on tools-webgrid-lighttpd-1412 is OK: OK: Less than 1.00% above the threshold [0.0] [03:15:10] 10PAWS: PAWS 404 for users with special characters in their names - https://phabricator.wikimedia.org/T120066#2101919 (10yuvipanda) After messing around and debugging, I believe https://github.com/jupyter/jupyterhub/pull/472 fixes it! \o/ The terminal does not work yet, which I believe is fixed by https://githu... [03:15:37] /buff/buffer 28 [03:15:38] bah [03:16:08] yuvipanda: I'm more used to seeing "w/in 23" [03:17:37] 10PAWS, 6Research-and-Data: Build a pool of beta testers for PAWS - https://phabricator.wikimedia.org/T129297#2101920 (10yuvipanda) @jayvdb CSCW was a conference where we ran a workshop on using PAWS to do wikimedia related research - you can see more info at https://meta.wikimedia.org/wiki/Research:Breaking_i... 
[03:17:39] bd808: heh [03:17:45] bd808: I switched to weechat rather than irssi [03:17:52] hopefully someday I can stop using IRC completely [03:18:01] * bd808 will miss yuvipanda [03:18:33] RECOVERY - Puppet failure on tools-webgrid-generic-1402 is OK: OK: Less than 1.00% above the threshold [0.0] [03:19:15] bd808: hopefully the new place will also be open and free and just better technology and contain commandline clients too for people who prefer that [03:19:22] unfortunately no such place exists atm [03:19:31] yeah. I get the hope [03:19:53] I just kinda don't believe that it is coming any time soon [03:20:10] I think it might exist in the next 10 years [03:20:23] but maybe all the l33t users of slack will start to work on it after microsfot buys them [03:20:27] * yuvipanda just got added to another slack 'team' today [03:20:31] *microsoft [03:37:48] 6Labs, 6Operations, 10wikitech.wikimedia.org, 13Patch-For-Review: Wikitechwiki has 4xx responses to requests for some static assets inc. poweredby_mediawiki_88x31.png and WikiEditor's button-sprite.svg - https://phabricator.wikimedia.org/T128747#2101935 (10Krenair) 5Open>3Resolved a:5Krenair>3None [03:45:35] 6Labs, 6Operations, 10wikitech.wikimedia.org, 13Patch-For-Review: Update wikitech-static OS/PHP version - https://phabricator.wikimedia.org/T126385#2101940 (10Krenair) * Fiddled with apache config some more (`Require all granted`) to make this work under Apache 2.4 * Installed php5-mysql * `mysqldump` on o... [03:51:33] 6Labs, 6Operations, 10wikitech.wikimedia.org, 13Patch-For-Review: Update wikitech-static OS/PHP version - https://phabricator.wikimedia.org/T126385#2101941 (10Krenair) a:5Krenair>3Andrew @Andrew, do you think this is OK to switch now? 
[04:22:39] 10PAWS, 13Patch-For-Review: Setup an icinga check for PAWS - https://phabricator.wikimedia.org/T129209#2101955 (10yuvipanda) Had to revert with https://gerrit.wikimedia.org/r/#/c/276093/ since icinga wouldn't recognize the host at all :| @Dzahn any idea what I did wrong? [04:24:21] 6Labs, 10Tool-Labs: Goal: Allow using k8s instead of GridEngine as a backend for webservices (Tracking) - https://phabricator.wikimedia.org/T129309#2101957 (10yuvipanda) [04:26:05] 6Labs, 10Labs-Sprint-100, 10Tool-Labs: Unify / simplify webservice code - https://phabricator.wikimedia.org/T98440#2101972 (10yuvipanda) [04:26:06] 6Labs, 10Tool-Labs: Goal: Allow using k8s instead of GridEngine as a backend for webservices (Tracking) - https://phabricator.wikimedia.org/T129309#2101971 (10yuvipanda) [04:27:32] 6Labs, 10Tool-Labs: Setup a proper deployment strategy for Kubernetes - https://phabricator.wikimedia.org/T129311#2101987 (10yuvipanda) [04:28:33] 6Labs, 6Operations, 13Patch-For-Review: Setup private docker registry with authentication support in tools - https://phabricator.wikimedia.org/T118758#2102004 (10yuvipanda) [04:28:35] 6Labs, 10Tool-Labs: Goal: Allow using k8s instead of GridEngine as a backend for webservices (Tracking) - https://phabricator.wikimedia.org/T129309#2102003 (10yuvipanda) [04:28:37] 6Labs, 10Tool-Labs: Setup DNS for kubernetes services - https://phabricator.wikimedia.org/T111914#2102005 (10yuvipanda) [04:28:48] 6Labs, 10Tool-Labs: Define base Wikimedia Docker container - https://phabricator.wikimedia.org/T118446#2102009 (10yuvipanda) [04:28:50] 6Labs, 10Tool-Labs: Goal: Allow using k8s instead of GridEngine as a backend for webservices (Tracking) - https://phabricator.wikimedia.org/T129309#2101957 (10yuvipanda) [04:29:15] 6Labs, 10Tool-Labs: Define base Wikimedia Docker container - https://phabricator.wikimedia.org/T118446#1800488 (10yuvipanda) https://github.com/wikimedia/operations-docker-images-debian is where it's at, btw. 
[04:31:35] 6Labs, 10Tool-Labs: Setup a supported HTTP Ingress solution for Kubernetes - https://phabricator.wikimedia.org/T129312#2102015 (10yuvipanda) [05:13:49] 6Labs, 10Tool-Labs: Setup a supported HTTP Ingress solution for Kubernetes - https://phabricator.wikimedia.org/T129312#2102061 (10yuvipanda) [05:14:42] 6Labs, 10Tool-Labs: Setup a supported HTTP Ingress solution for Kubernetes - https://phabricator.wikimedia.org/T129312#2102015 (10yuvipanda) We could also totally decide to not do this for the goal - this is probably a fair chunk of work - and just use our current setup (which will work just as well) for this... [08:34:53] PROBLEM - Host tools-bastion-01 is DOWN: CRITICAL - Host Unreachable (10.68.17.228) [10:11:16] 6Labs, 6Operations, 10wikitech.wikimedia.org, 13Patch-For-Review: Update wikitech-static OS/PHP version - https://phabricator.wikimedia.org/T126385#2102371 (10Southparkfan) Not very important, but is it possible to enable OPcache in PHP? Even if you only allow 64MB of cache usage, this should make the wiki... [14:48:36] andrewbogott: I am all happy with Horizon :-} [14:48:48] I can spawn/delete instances, look at boot logs etc [14:49:39] hashar: great! DNS is going to be messy :( [14:49:58] andrewbogott: I think a couple instances lost DNS entries [14:50:04] havent investigated further though [14:50:15] newish instances? [14:50:49] a few months old maybe [14:50:54] will try to find the affected ones [14:51:03] that prevent ssh access to it iirc [14:51:09] but we do most commands via salt nowadays [14:57:27] hashar: hm… I purged a bunch of leaked/duplicate DNS entries a few months ago. 
And I must’ve hit an awful lot of valid entries by mistake because this issue keeps coming up :( [15:03:14] 6Labs, 6Operations, 10wikitech.wikimedia.org, 13Patch-For-Review: Update wikitech-static OS/PHP version - https://phabricator.wikimedia.org/T126385#2103178 (10Andrew) I'm happy to switch over, as soon as daniel removes the WIP from https://gerrit.wikimedia.org/r/#/c/276088 [15:36:43] (03PS1) 10MarcoAurelio: CSS and Elections [labs/tools/stewardbots] - 10https://gerrit.wikimedia.org/r/276184 (https://phabricator.wikimedia.org/T128742) [15:42:40] (03PS1) 10Youni Verciti: Stage 1 - Cleaning code, using explict names [labs/tools/vocabulary-index] - 10https://gerrit.wikimedia.org/r/276187 [15:44:40] (03CR) 10MarcoAurelio: [C: 032] CSS and Elections [labs/tools/stewardbots] - 10https://gerrit.wikimedia.org/r/276184 (https://phabricator.wikimedia.org/T128742) (owner: 10MarcoAurelio) [15:45:12] (03Merged) 10jenkins-bot: CSS and Elections [labs/tools/stewardbots] - 10https://gerrit.wikimedia.org/r/276184 (https://phabricator.wikimedia.org/T128742) (owner: 10MarcoAurelio) [15:46:48] !log tools.stewardbots Merged https://gerrit.wikimedia.org/r/276184 [15:46:51] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools.stewardbots/SAL, Master [16:12:52] 6Labs, 10Horizon: Horizon dashboard for managing http proxies for labs instances - https://phabricator.wikimedia.org/T129245#2103360 (10Andrew) Having thought about this for 30 seconds, I'm remembering that the SpecialNovaProxy interface actually manages /two/ things: the actual proxy (via invisible unicorn)... [16:18:09] andrewbogott: would it be worth running a script to verify all instances are properly in DNS backend (ldap?) and can resolve fine? [16:20:09] hashar: if you are offering to write that script then absolutely! 
[16:20:13] (It’s not ldap, mercifully) [16:21:37] * hashar ducks [16:21:52] hashar: we have a ticket for this somewhere if you threw a note in there on teh diea [16:21:54] idea even [16:21:57] andrewbogott: if you get the raw data, I dont mind handling the compare [16:22:29] which is really all about: colordiff -u <(sort -f instances.txt) <(sort -f dns.txt) [16:22:38] taking a break before next meeting [16:35:48] 6Labs, 10Tool-Labs: Linux Error: libgcc_s.so.1 - https://phabricator.wikimedia.org/T129361#2103417 (10doctaxon) [16:36:13] 6Labs, 10Tool-Labs: Linux Error: libgcc_s.so.1 - https://phabricator.wikimedia.org/T129361#2103431 (10doctaxon) p:5Triage>3High [16:39:30] 6Labs, 10Tool-Labs: Linux Error: libgcc_s.so.1 - https://phabricator.wikimedia.org/T129361#2103417 (10valhallasw) You're not requesting enough memory for the job, so malloc() fails, causing this error. Pass e.g. -mem 512M (or a higher number) to jsub to request more memory than the default. [16:42:57] 6Labs, 10Tool-Labs: Linux Error: libgcc_s.so.1 - https://phabricator.wikimedia.org/T129361#2103449 (10doctaxon) I gave it -mem 1g jsub -once -j y -quiet -v LC_ALL=en_US.UTF-8 -mem 1g -l release=trusty ldw.tcl [16:57:09] (03PS1) 10Glaisher: Centralize db credentials config file [labs/tools/stewardbots] - 10https://gerrit.wikimedia.org/r/276205 [17:02:01] (03PS1) 10MarcoAurelio: General overhaul of CSS/JS content [labs/tools/stewardbots] - 10https://gerrit.wikimedia.org/r/276206 [17:05:17] 10Tool-Labs-tools-stewardbots, 13Patch-For-Review: Make elections.php work again - https://phabricator.wikimedia.org/T128742#2103537 (10MarcoAurelio) CSS/JS imported, working again. But some files are also missing which I think this change (https://gerrit.wikimedia.org/r/#/c/276206/) will solve. 
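hashar's `colordiff -u <(sort -f instances.txt) <(sort -f dns.txt)` one-liner above can be wrapped into a small function. This is a sketch under assumptions: the file names `instances.txt` and `dns.txt` are hypothetical (one name per line, however you dump them from OpenStack and the DNS backend), and plain `diff` stands in for `colordiff` so it runs anywhere.

```shell
# Compare an instance list against a DNS-entry list; a sketch of hashar's
# one-liner from the log. Both input files are hypothetical: one name per
# line. Lines prefixed "-" are instances with no DNS entry; "+" lines are
# stale DNS records with no matching instance.
dns_drift() {
    diff -u <(sort -f "$1") <(sort -f "$2")
}
```

How the two lists get generated (OpenStack API on one side, the DNS backend on the other) is exactly the open question in the log; the compare step itself is the easy part.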
[17:19:01] kaldari, ping [17:19:09] Howdy [17:19:59] Cyberpower678: BTW, we have the logging API for Cyberbot finished: http://tools.wmflabs.org/deadlinks/ https://meta.wikimedia.org/wiki/Fixing_dead_links/Deadlink_logging_app [17:20:01] kaldari, I don't want to put a damper on things, but how's the checkIfDead class coming. It seems the current focus is the central API. [17:20:41] Cyberpower678: The checkifDead improvements will probably be done this week [17:20:48] Yay. :-) [17:21:01] Then I can probably launch another BRFA. [17:21:13] Cyberbot is almost finished for enwiki. :D [17:21:31] Then I will move it to InternetArchiveBot [17:23:20] Cool. Let me know when you have a chance to look at the deadlink logging API. If you have suggestions for improving it, feel free to ping me. The next step on that is to provide some output APIs: 1 for finding the date that an article was last processed, and 1 for finding out the last article that a bot worked on. [17:24:38] Cyberpower678: We could also add the code to Cyberbot to interface with the API or we can leave that to you if you want. [17:25:17] (03PS1) 10MarcoAurelio: Adding .lighttpd.conf file [labs/tools/stewardbots] - 10https://gerrit.wikimedia.org/r/276209 [17:25:18] Hopefully this will provide a better view into Cyberbot's progress than the tagging system [17:26:09] As long as the response time from the API is almost instantaneous, I should be fine. [17:27:06] But then again, if the bot runs on tools and so does the API, you could create an internal API for the bot to access too. So in case of a common webservice failure, Cyberbot can continue to log. [17:27:36] (03PS2) 10MarcoAurelio: Adding .lighttpd.conf file [labs/tools/stewardbots] - 10https://gerrit.wikimedia.org/r/276209 [17:28:01] kaldari, ^ [17:29:43] Cyberpower678: That's not a bad idea [17:30:30] The API should be super fast. 
It doesn't have to query anything to do the logging, so it should be nearly instant [17:30:39] kaldari, but get the checkIfDead finished first. That's a little higher on my priority list at the moment. I've been itching to turn that feature on and see how Cyberbot will handle it [17:31:14] Cyberpower678: Will do. I'll see if we can have it ready by tomorrow actually [17:31:38] kaldari, I just deployed a serious bug. :/ [17:31:47] uh oh [17:32:23] Some URL's query string has stuff in it that needs to be escaped before I can feed that into a cite template. [17:32:34] But I ended up escaping the entire URL [17:32:46] Now all the links it converts are broken. :/ [17:33:12] oops [17:33:29] This is why you need unit tests ;) [17:33:51] I thought it was a simple enough fix to not need any tests. [17:33:58] Shows what I know. :p [17:35:29] (03PS3) 10MarcoAurelio: Adding .lighttpd.conf file [labs/tools/stewardbots] - 10https://gerrit.wikimedia.org/r/276209 [17:38:42] (03PS4) 10MarcoAurelio: Adding .lighttpd.conf file [labs/tools/stewardbots] - 10https://gerrit.wikimedia.org/r/276209 [17:42:42] andrewbogott: are you an op in this channel? I think that stashbot got muted here somehow. The bot isn't acking T1234 phab references anymore, but it works in all the other channels I've tested in. [17:43:09] The bot is here and seeing new messages and not logging any errors about sending things [17:43:17] 10Wikibugs: Get icon and color from API instead of screen scraping - https://phabricator.wikimedia.org/T1176#2103733 (10mmodell) project.search and maniphest.search return the needed info... Though it'd be nice to have it combined into one api call [18:16:54] RECOVERY - Puppet failure on tools-exec-1215 is OK: OK: Less than 1.00% above the threshold [0.0] [18:17:53] bd808: There are only two quiets: one at *!*@198.199.82.216 and one at *!*@wikimedia/bot/helpmebot [18:18:15] yeah those were the only ones I could see too.
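The escaping bug discussed above (escaping the whole URL instead of just the characters that break a cite template) can be illustrated with a sketch. Cyberbot itself is not shell, and the helper name and character set here are hypothetical; the point is only that the percent-encoding must leave the URL's structure (`?`, `=`, `&`, `/`) intact and touch only wikitext-hostile characters like `|`.

```shell
# Hypothetical illustration of the fix for the bug above: escape only the
# characters that break a wikitext cite template, never the whole URL.
escape_query() {
    local url=$1 out='' c i
    for ((i = 0; i < ${#url}; i++)); do
        c=${url:i:1}
        case $c in
            '|') out+='%7C' ;;  # pipe ends a template parameter in wikitext
            '{') out+='%7B' ;;  # braces open/close templates
            '}') out+='%7D' ;;
            *)   out+=$c ;;
        esac
    done
    printf '%s\n' "$out"
}
```

Escaping everything (including `?`, `=`, `&`) is exactly what produced the broken links in the log; a unit test asserting that an already-valid URL round-trips unchanged would have caught it.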
But I think there are ways to +q that don't show to non-ops (might be wrong on that) [18:19:12] I don't think there are non-visible quiets? I don't know of such a possibility [18:19:24] btw, is the topic still up to date? [18:19:56] it isn't [18:20:05] 'some tools may need to be manually restarted' is not true [18:20:22] however my client sucks and I can't copy paste the title to just remove that bit, can someone else do it? [18:20:24] Ok, I will remove it [18:20:45] thanks Luke081515 [18:20:58] (I'm using a second client for that too ;)) [18:21:23] heh :D [18:21:54] This client has a lot of "extra functions", but normally I don't need them ;) [18:36:15] yuvipanda, once when I was looking over your shoulder, I saw you type "labs " and that was interpreted as a "ssh .eqiad.wmflabs" [18:36:27] Was wondering how you did that. Alias line? Bash script? [18:36:59] halfak: alias line :) [18:37:08] Care to share your line? :) [18:38:35] I don't have it anymore unfortunately [18:38:36] but let me see [18:38:57] Oh! I made it work with a function [18:39:06] * halfak pastes his function and alias [18:39:19] cool :D [18:39:28] aliases usually don't take args; I wonder how you did it? [18:39:28] https://gist.github.com/halfak/c29724704fc99f6c2241 [18:39:34] A function! [18:39:40] http://stackoverflow.com/questions/7131670/make-bash-alias-that-takes-parameter [18:39:43] Thanks SO :D [18:40:38] I typo "eqiad.wmflabs" as "eqiad.wmnet" so often. I'm looking forward to not doing that anymore. [18:40:50] :D [18:41:06] Sorry to bug you yuvipanda. Do you think this would be a nice email to labs-l? [18:42:04] halfak: indeed! [18:47:46] woaaah [18:47:50] that would save me so much time [18:49:01] OK. Email sent :) [18:49:19] I just made one for accessing my virtualenvs too.
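The `labs <host>` shortcut discussed above works because bash functions take arguments while plain aliases do not. A minimal sketch in the spirit of the function halfak pasted (the exact gist is not reproduced here); the `SSH_CMD` hook is an addition of this sketch, not part of the original, so the expansion can be dry-run without actually connecting.

```shell
# "labs tools-bastion-01" -> "ssh tools-bastion-01.eqiad.wmflabs".
# SSH_CMD is a hypothetical override hook (defaults to ssh) for dry-runs.
labs() {
    ${SSH_CMD:-ssh} "$1.eqiad.wmflabs" "${@:2}"
}
```

With the function in your `.bashrc`, `labs tools-bastion-01` expands to `ssh tools-bastion-01.eqiad.wmflabs`, which also removes the `eqiad.wmflabs` / `eqiad.wmnet` typo class halfak mentions.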
:) [18:49:36] This is a useful trick :) [18:51:13] I have `py2` and `py3` aliased to different virtualenvs so it's easy to switch [18:51:51] I just set up an alias that handles "venv " [18:52:06] so "venv 3.4" starts up my general 3.4 env [18:52:07] Only downside to `labs host` is loss of tab complete... [18:52:15] But I have some specific ones -- like 'ores' [18:52:23] legoktm, good point [18:53:02] I usually don't list out big sets of instances in my .ssh/config, but I suppose that if you did, tab complete could be really powerful. [18:53:12] You just have to maintain the list [18:53:14] woah, there's tab complete? [18:54:58] oh, I have mine set up to remember any host I've ever ssh'd into using .ssh/known_hosts [18:55:22] you could pretty easily do bash completion for labs [18:55:35] chasemp, cool. how do you set something like that up? [18:55:36] copy the normal ssh one almost and put in bash.completion.d I would imagine [18:56:15] Woah. [18:56:20] * halfak did not know this was here :) [18:56:45] bash just uses bash to do tab completion, even contextual like certain options for a command [18:56:56] so the magic is usually just a shell script in the .d with a package [18:58:48] I think w/ homebrew on osx you can peek in ls /usr/local/etc/bash_completion.d/ [18:58:50] to kind of get an idea
:) [19:07:28] I fell down this hole writing my prompt line a few years ago :) [19:07:34] bash completion that is [19:11:34] yuvipanda: ssh does have tab completion by default, but ssh hashes known_hosts by default so it doesn't work [19:11:49] you can do "HashKnownHosts no" in your .ssh/config [19:11:50] ah [19:12:10] and/or you can use a more authoritative list for your known_hosts [19:12:17] than typing yes when prompted with fingerprints [19:12:39] for production, I copy known_hosts from bast1001 [19:12:50] so I have a chain of trust, essentially [19:13:01] (which unfortunately right now involves puppet, but ok) [19:13:13] yeah [19:13:16] a bit harder for labs tho [19:14:50] oh and halfak, including /etc/bash_completion should suffice [19:21:17] 6Labs, 6Operations, 10wikitech.wikimedia.org, 13Patch-For-Review: Update wikitech-static OS/PHP version - https://phabricator.wikimedia.org/T126385#2104213 (10Dzahn) Done, removed the WIP and merged DNS switch. [19:22:09] yuvipanda, hey [19:22:10] 6Labs, 6Operations, 10wikitech.wikimedia.org, 13Patch-For-Review: Update wikitech-static OS/PHP version - https://phabricator.wikimedia.org/T126385#2104214 (10Dzahn) the sync between wikitech and wikitech-static .. does that need any change because the IP changed? or all just hostname based? 
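The tab-completion idea discussed above (completing `labs <host>` from `~/.ssh/known_hosts`) can be sketched as a small completion function. It only works with unhashed host entries, i.e. `HashKnownHosts no` in `~/.ssh/config` as legoktm notes; the `LABS_KNOWN_HOSTS` override is a hypothetical hook added here so the path is not hardcoded.

```shell
# Bash completion for a "labs <host>" helper, fed from known_hosts.
# Requires "HashKnownHosts no"; hashed entries yield no hostnames.
_labs_complete() {
    local cur=${COMP_WORDS[COMP_CWORD]}
    local kh=${LABS_KNOWN_HOSTS:-$HOME/.ssh/known_hosts}
    local hosts
    # First field may be "host,ip"; keep only names ending in the labs
    # suffix and strip it, so bare instance names are completed.
    hosts=$(cut -d' ' -f1 "$kh" 2>/dev/null | tr ',' '\n' \
        | sed -n 's/\.eqiad\.wmflabs$//p' | sort -u)
    COMPREPLY=($(compgen -W "$hosts" -- "$cur"))
}
complete -F _labs_complete labs 2>/dev/null || true
```

Dropping a file like this into `/etc/bash_completion.d/` (or sourcing it from `.bashrc`) is the mechanism chasemp describes: completion is just a shell function registered with `complete -F`.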
[19:22:17] hey Krenair [19:22:35] I set up an instance called horizon-proxy-dashboard.openstack.eqiad.wmflabs yesterday [19:22:47] 6Labs, 6Operations, 10wikitech.wikimedia.org: Update wikitech-static OS/PHP version - https://phabricator.wikimedia.org/T126385#2104215 (10Dzahn) [19:22:54] but it's now decided to prompt me for a password before allowing me to sudo [19:23:22] 6Labs, 6Operations, 10wikitech.wikimedia.org: Update wikitech-static OS/PHP version - https://phabricator.wikimedia.org/T126385#2012874 (10Dzahn) a:5Andrew>3Krenair [19:23:54] 6Labs, 6Operations, 10wikitech.wikimedia.org: Update wikitech-static OS/PHP version - https://phabricator.wikimedia.org/T126385#2104219 (10Krenair) I don't think anything needs to be changed - the access restriction was removed recently and the dumps became public [19:24:03] Krenair: ah [19:24:10] Krenair: andrewbogott would have a better idea about that, probably [19:24:15] ok [19:24:36] I suppose I could always add myself to the root authorized keys for the project and see what's up [19:25:18] 6Labs, 6Operations, 10wikitech.wikimedia.org: Update wikitech-static OS/PHP version - https://phabricator.wikimedia.org/T126385#2104222 (10Dzahn) 5Open>3Resolved eh, true :) i just did that in T54170 i will just claim you resolved this [19:30:35] 6Labs, 6Operations, 10wikitech.wikimedia.org: decom old wikitech-static machine - https://phabricator.wikimedia.org/T129391#2104237 (10Dzahn) [19:31:27] 6Labs, 6Operations, 10wikitech.wikimedia.org: decom old wikitech-static machine - https://phabricator.wikimedia.org/T129391#2104237 (10Dzahn) [19:34:25] yuvipanda, mind running puppet there? [19:36:04] Krenair: sure [19:36:31] > The last Puppet run was at Tue Mar 8 21:05:42 UTC 2016 (1350 minutes ago). 
[19:36:33] hmm [19:36:44] Error: Could not retrieve catalog from remote server: Error 400 on SERVER: Failed to determined $::labsproject at /etc/puppet/manifests/realm.pp:24 on node horizon-proxy-dashboard.openstack.eqiad.wmflabs [19:36:46] Warning: Not using cache on failed catalog [19:36:48] Error: Could not retrieve catalog; skipping run [19:36:50] Krenair: ^ [19:37:29] PROBLEM - Puppet failure on tools-worker-1009 is CRITICAL: CRITICAL: 16.67% of data above the critical threshold [0.0] [19:40:22] labsprojectfrommetadata looks... interesting [19:41:32] metadata = Facter::Util::Resolution.exec("curl -f http://169.254.169.254/openstack/2013-10-17/meta_data.json/ 2> /dev/null").chomp [19:41:33] what. [19:42:35] RECOVERY - Puppet failure on tools-worker-1009 is OK: OK: Less than 1.00% above the threshold [0.0] [19:42:46] okay [19:42:57] krenair@horizon-proxy-dashboard:~$ hostname -d [19:42:57] krenair@horizon-proxy-dashboard:~$ [19:43:12] fun [19:43:33] other instances print something like bastion.eqiad.wmflabs [19:44:22] of course, I can't change anything useful because no root [19:45:20] Krenair: I just ran a [19:45:27] usermod -a -G sudo krenair [19:45:31] ty [19:45:35] Krenair: so logout and log back in and try again? [19:45:58] no luck [19:46:04] am in the group but sudo still prompts for password [19:46:40] could you just apply https://wikitech.wikimedia.org/w/index.php?title=Hiera:Openstack&diff=356786&oldid=191986 manually? [19:47:29] Krenair: just did [19:47:50] works, thanks [19:48:08] np [19:54:20] Hi all, I keep getting warning emails about failed puppet runs on one of my instances [19:54:31] I think, however, that I fixed the issue [19:54:44] manual runs work without any problems [19:54:59] is there something that needs to be reset for the emails to stop coming? [19:57:56] dschwen, you get no errors when running puppet, but you still get emails about failures? 
[20:03:30] PROBLEM - Puppet failure on tools-worker-1009 is CRITICAL: CRITICAL: 40.00% of data above the critical threshold [0.0] [20:22:14] yuvipanda, got it working again [20:22:29] yuvipanda, had to re-set the hostname and remove that IP from the loopback interface [20:22:37] fun [20:23:55] 10PAWS: PAWS 404 for users with special characters in their names - https://phabricator.wikimedia.org/T120066#2104474 (10yuvipanda) The notebook PR got merged upstream too. [20:27:42] Krenair, that's correct [20:27:49] no errors and $? is 0 [20:31:06] dschwen: can you file a bug? [20:31:12] dschwen: and cc andrewbogott :D [20:32:23] dschwen: if it works manually it really shouldn’t be sending you emails. Do please file a bug. [20:33:31] RECOVERY - Puppet failure on tools-worker-1009 is OK: OK: Less than 1.00% above the threshold [0.0] [20:40:47] k, will do [20:45:53] done [20:46:54] assigned to andrewbogott (I assume that is what you meant by CC, yuvipanda?) [20:48:08] ah, there's a 'subscribers' field, which is what I meant :) [20:48:16] but andrewbogott can probably unassign/reassign as he sees fit [20:48:26] you should also usually associate the 'Labs' project [20:48:33] thanks for the bug, dschwen! [20:56:53] 6Labs, 10Tool-Labs: Linux Error: libgcc_s.so.1 - https://phabricator.wikimedia.org/T129361#2103417 (10yuvipanda) give it more memory? [20:59:39] 10PAWS: Implement a sane way to access mysql replicas from PAWS - https://phabricator.wikimedia.org/T120471#2104639 (10yuvipanda) a:3yuvipanda [21:00:06] ^ is what I'm going to work on today! [21:06:41] 6Labs, 7Puppet: Receiving puppet run failure alert for instance where manual puppet runs complete fine - https://phabricator.wikimedia.org/T129403#2104657 (10dschwen) a:5Andrew>3yuvipanda [21:07:10] Oh, I guess that's necessary to have wikibugs post the announcements here :-) [21:07:47] dschwen: yup [21:08:07] dschwen: also in general, you don't assign things to people. 
people pick and assign things to themselves [21:08:16] 6Labs, 7Puppet: Receiving puppet run failure alert for instance where manual puppet runs complete fine - https://phabricator.wikimedia.org/T129403#2104660 (10yuvipanda) a:5yuvipanda>3None [21:21:09] 6Labs, 10Tool-Labs, 10pywikibot-core: Tool Labs Pywikibot does not work with new shared Pywikibot config files - https://phabricator.wikimedia.org/T129406#2104686 (10Ato_01) [21:21:56] Hello, can someone tell me how to add a new wiki to Wikidata if you're using the wikidata role in vagrant? [21:34:57] 10PAWS, 6Research-and-Data: Build a pool of beta testers for PAWS - https://phabricator.wikimedia.org/T129297#2104733 (10DarTar) [21:35:05] !log rcm deleted rcm-4 (was unused) [21:35:14] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Rcm/SAL, Master [21:35:28] labs-morebots is slow today [21:35:28] I am a logbot running on tools-exec-1221. [21:35:28] Messages are logged to wikitech.wikimedia.org/wiki/Server_Admin_Log. [21:35:29] To log a message, type !log . [21:35:42] 6Labs, 10Tool-Labs, 10pywikibot-core: Tool Labs Pywikibot does not work with new shared Pywikibot config files - https://phabricator.wikimedia.org/T129406#2104734 (10Ato_01) p:5Triage>3Normal [21:40:34] Luke081515: Hi, please try the #wikidata channel; you may get a faster response. You can also ping the admins there and they are fast at replying. [21:40:56] ok, I will try it [21:42:19] Luke081515: Ok thanks. [22:23:14] !ping [22:23:14] !pong [22:23:16] ok [22:37:04] 10Tool-Labs-tools-Other, 6Community-Tech, 7Community-Wishlist-Survey, 7Milestone: Pageview Stats tool - https://phabricator.wikimedia.org/T120497#2105003 (10kaldari) [23:06:59] 6Labs: Sort out labs user privs in Horizon vs. Wikitech - https://phabricator.wikimedia.org/T91830#2105124 (10Andrew) 5Open>3Resolved a:3Andrew I think this is reasonably well-addressed now, but for odd corner cases.
[23:07:01] 6Labs, 7Tracking: Make OpenStack Horizon useful for production labs - https://phabricator.wikimedia.org/T87279#2105127 (10Andrew) [23:09:51] 6Labs: Switch to using Horizon/Designate for labs public dns - https://phabricator.wikimedia.org/T124184#2105132 (10Andrew) There are a few conflicts between our security model and designates. Specifically, domains are 'owned' by a project, and subdomains cannot be created in other projects. E.g. wmflabs.org w... [23:12:26] PROBLEM - ToolLabs Home Page on toollabs is CRITICAL: CRITICAL - Socket timeout after 10 seconds [23:13:13] yuvipanda: ^ [23:13:40] ori: yup, load is going back down now [23:13:50] we should also get rid of the shinken check, since we've a paging icinga one [23:13:53] I'll try to do that later [23:17:22] RECOVERY - ToolLabs Home Page on toollabs is OK: HTTP OK: HTTP/1.1 200 OK - 801037 bytes in 5.241 second response time [23:35:56] 10PAWS, 13Patch-For-Review: Setup an icinga check for PAWS - https://phabricator.wikimedia.org/T129209#2105209 (10Dzahn) works now, after moving it into the icinga module service: https://icinga.wikimedia.org/cgi-bin/icinga/status.cgi?host=paws.wmflabs.org&nostatusheader host: https://icinga.wikimedia.org/cg... [23:36:07] 10PAWS, 13Patch-For-Review: Setup an icinga check for PAWS - https://phabricator.wikimedia.org/T129209#2105210 (10Dzahn) 5Open>3Resolved [23:42:08] 10PAWS, 13Patch-For-Review: Setup an icinga check for PAWS - https://phabricator.wikimedia.org/T129209#2105217 (10Dzahn) So yea, when using @monitoring::host to create a virtual host in Icinga, that needs to happen within the icinga module or something that is applied on neon. we have this structure that wor...