[00:03:37] 6Labs, 10Labs-Infrastructure, 10Tool-Labs, 6operations: failed backups on labstore? - https://phabricator.wikimedia.org/T125749#1996375 (10Dzahn) - /usr/local/bin/nrpe_check_systemd_unit_state on labstore is puppetized, but the check_replicate commands and NRPE config seem to be missing - check_systemd...
[00:11:01] 6Labs, 10Labs-Infrastructure, 10Tool-Labs, 6operations: failed backups on labstore? - https://phabricator.wikimedia.org/T125749#1996400 (10Dzahn) nevermind, the script actually just gets this from systemctl like this: `/bin/systemctl show replicate-maps | grep Result` → `Result=exit-code` This is what is execu...
[00:11:41] 6Labs, 10Labs-Infrastructure, 10Tool-Labs, 6operations: labstore - replication to codfw broken or not working yet - https://phabricator.wikimedia.org/T125749#1996409 (10Dzahn)
[00:12:49] RECOVERY - SSH on tools-worker-1002 is OK: SSH OK - OpenSSH_6.7p1 Debian-5+deb8u1 (protocol 2.0)
[00:18:48] PROBLEM - SSH on tools-worker-1002 is CRITICAL: Server answer
[00:43:37] 6Labs, 10Labs-Infrastructure, 10Tool-Labs, 6operations: labstore - replication to codfw broken or not working yet - https://phabricator.wikimedia.org/T125749#1996500 (10yuvipanda) Old snapshot on labstore2001 had gotten full, causing LVs to fail, causing the backup script to fail. I've cleaned them out on...
[00:44:02] chasemp: ^ fyi. I cleaned it up but am not starting replication manually since I think the timer will trigger at some point anyway
[00:48:05] YuviPanda, so I'm getting a weird bug when testing out new code on the labs infrastructure
[00:48:08] PHP Fatal error: Call to undefined function mysqli_fetch_all() in /data/project/cyberbot/bots/cyberbot-ii/IABot/DB.php on line 27
[00:48:30] you're probably on precise, and that's running php 5.3 which doesn't have mysqli (I think?)
[00:48:43] if it's using jsub, add -l release=trusty to your jsub call
[00:48:57] if it's webservice, use 'webservice --release=trusty restart' to move it to trusty
[00:48:58] I'm actually running it on-login to get some quick output
[00:49:17] Before I send it to the grid
[00:49:54] ah
[00:49:58] so that's already trusty
[00:50:42] YuviPanda, ANY IDEAS?
[00:50:49] sorry caps lock
[00:51:29] php5-mysql is already the newest version.
[00:51:31] so it is installed
[00:51:54] php -i | grep mysqli
[00:51:55] Well it's not working apparently
[00:51:57] shows it's configured
[00:53:13] Cyberpower678: file a bug, I'll look at it in maybe an hour?
[00:53:57] YuviPanda, I should point out that if it's able to get to mysqli_fetch_all then the other mysqli functions are working
[00:54:51] indeed, but I'm in the middle of something else and can't quite dive into it atm :) if you file a bug, I'll remember to look when I'm done with this in an hour or so :)
[00:55:08] * Cyberpower678 files a bug
[00:55:28] ty
[00:58:46] RECOVERY - SSH on tools-worker-1002 is OK: SSH OK - OpenSSH_6.7p1 Debian-5+deb8u1 (protocol 2.0)
[00:59:32] 6Labs, 10Tool-Labs: mysqli_fetch_all causes a Fatal error in PHP - https://phabricator.wikimedia.org/T125758#1996562 (10Cyberpower678) 3NEW a:3yuvipanda
[01:00:08] YuviPanda, This is very important as the bot I'm developing is for the top requested community wishlist request
[01:00:24] So I set it to Needs Triage.
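For context on Dzahn's check at [00:11:01] above: the check reads a systemd unit's `Result` property and turns it into a monitoring status. A minimal Python sketch of that idea — hypothetical, since the actual nrpe_check_systemd_unit_state script's contents aren't shown in this log:

```python
#!/usr/bin/env python
"""NRPE-style systemd unit check (illustrative sketch only).

Mirrors the `systemctl show <unit> | grep Result` approach discussed
above; this is NOT the puppetized nrpe_check_systemd_unit_state script.
"""
import subprocess
import sys

OK, CRITICAL, UNKNOWN = 0, 2, 3  # standard Nagios exit codes


def check_unit(unit):
    try:
        out = subprocess.check_output(
            ['/bin/systemctl', 'show', '-p', 'Result', unit])
    except (OSError, subprocess.CalledProcessError):
        print('UNKNOWN: could not query systemd for %s' % unit)
        return UNKNOWN
    # Output looks like "Result=success" or "Result=exit-code".
    result = out.decode('utf-8').strip().partition('=')[2]
    if result == 'success':
        print('OK: %s Result=success' % unit)
        return OK
    print('CRITICAL: %s Result=%s' % (unit, result or 'unknown'))
    return CRITICAL


if __name__ == '__main__':
    sys.exit(check_unit(sys.argv[1]))
```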
[01:00:36] Hope you don't mind
[01:02:19] kaldari, ^
[01:03:50] lol, looks like Phabricator just went down :P
[01:04:07] and it's back
[01:04:28] kaldari, I just deployed the latest code
[01:04:38] yeah, server needed rebooting
[01:04:46] PROBLEM - SSH on tools-worker-1002 is CRITICAL: Server answer
[01:04:49] kaldari, but toollabs has a broken mysqli extension
[01:05:02] So the bot crashes immediately
[01:07:22] Cyberpower678: would it be possible to use mysqli_fetch_array instead and iterate through it? Just suggesting a possible workaround :)
[01:08:22] kaldari, meh, it doesn't change the fact that toollabs has a broken extension installed
[01:08:29] true
[01:08:52] Because it works on my local installation
[01:09:07] guess we'll see what Yuvi says
[01:18:32] kaldari, I wonder if IA could optimize their API in any way.
[01:29:50] RECOVERY - SSH on tools-worker-1002 is OK: SSH OK - OpenSSH_6.7p1 Debian-5+deb8u1 (protocol 2.0)
[01:40:04] Cyberpower678: am back. taking a look
[01:40:11] YuviPanda, yay
[01:40:16] :-)
[01:41:12] Cyberpower678: can you paste a snippet of your code that is misbehaving in the bug?
[01:41:57] YuviPanda, PHP Fatal error: Call to undefined function mysqli_fetch_all() in /data/project/cyberbot/bots/cyberbot-ii/IABot/DB.php on line 27
[01:42:20] YuviPanda, https://github.com/cyberpower678/Cyberbot_II/blob/test-code/IABot/DB.php
[01:45:35] YuviPanda, does that help?
[01:51:20] Cyberpower678: found the issue, should be fixed on tools-login just now
[01:51:27] I'm going to submit a puppet patch to fix it elsewhere now
[01:52:31] once I verify it won't break anything, at least
[01:53:57] YuviPanda, it works on -login
[02:02:58] Cyberpower678: cool. I merged a patch that should make it work on the exec hosts too
[02:03:36] Cyberpower678: I also emailed labs-l about the switch to a different mysql php provider, so if anyone else's code runs into issues with it I'll have to revert. but shouldn't cause any issues
[02:04:01] cyberbot exec is still failing I think
[02:04:19] it'll take about 20 mins for it to propagate
[02:04:30] Oh
[02:04:35] Then I'll wait.
[02:04:37] it's also a precise instance
[02:04:42] so I don't know if it'll work there or not
[02:04:47] we'll find out, I guess
[02:04:49] kaldari, When the change merges Cyberbot will automatically start up
[02:04:52] it's running an ancient version of PHP
[02:05:07] Cyberbot's exec node wasn't updated?
[02:05:15] What version is it running?
[02:05:20] 5.3
[02:05:26] Oh that's fine
[02:05:29] that exec node was precise from the start
[02:05:36] This bot is compatible with 5
[02:05:39] and we're not doing any new 'special' exec nodes
[02:06:01] I think it was tested with 5.4
[02:23:14] YuviPanda, let me know when the deploy is complete
[02:23:26] it happens automatically
[02:23:51] so you should run your code and check to see if it works :)
[02:27:12] There it goes. Cyberbot just started up
[02:28:52] ok :)
[02:35:27] valhallasw`cloud: I sent an announcement for this just in case it did affect someone
[02:35:53] (03PS1) 10Tim Landscheidt: Switch list.php to proxymanager's new API [labs/toollabs] - 10https://gerrit.wikimedia.org/r/268343
[02:39:32] (03CR) 10Tim Landscheidt: [C: 04-2] "Depends on I2d643fc902208eafaaa0d7814e586f0c326f16b5 being deployed. I didn't actually test this due to T109807, but I'm pretty sure it w" [labs/toollabs] - 10https://gerrit.wikimedia.org/r/268343 (owner: 10Tim Landscheidt)
[04:21:40] tools-webgrid-lighttpd-1208 is dead
[04:22:05] ermm
[04:22:58] not dead but not healthy at least
[04:51:48] PROBLEM - SSH on tools-worker-1002 is CRITICAL: Server answer
[05:08:04] PROBLEM - SSH on tools-webgrid-lighttpd-1208 is CRITICAL: Server answer
[05:13:05] RECOVERY - SSH on tools-webgrid-lighttpd-1208 is OK: SSH OK - OpenSSH_6.6.1p1 Ubuntu-2ubuntu2~wmfprecise2 (protocol 2.0)
[05:19:52] wikibugs seems to be down
[05:31:05] twentyafterfour, legoktm, valhallasw`cloud, Krenair: I think wikibugs is down. Maybe the phab reboot made it sad?
[05:46:47] RECOVERY - SSH on tools-worker-1002 is OK: SSH OK - OpenSSH_6.7p1 Debian-5+deb8u1 (protocol 2.0)
[05:52:47] PROBLEM - SSH on tools-worker-1002 is CRITICAL: Server answer
[06:52:40] hey, my webservice is not working at all. Doesn't respond in time
[06:52:52] and I can't restart it
[07:19:16] RECOVERY - SSH on tools-worker-1002 is OK: SSH OK - OpenSSH_6.7p1 Debian-5+deb8u1 (protocol 2.0)
[07:23:48] PROBLEM - SSH on tools-worker-1002 is CRITICAL: Server answer
[07:34:10] 10Wikibugs: wikibugs offline, hangs on json decoding - https://phabricator.wikimedia.org/T125769#1996951 (10valhallasw) 3NEW
[07:34:21] bd808: hm, seems online again?
[07:35:04] 10Wikibugs: wikibugs offline, hangs on json decoding - https://phabricator.wikimedia.org/T125769#1996958 (10valhallasw) 5Open>3Invalid a:3valhallasw Hm, but this is reported so it seems up again? Maybe someone else already restarted it?
[07:36:03] Amir1: which one?
[07:36:29] Dexbot
[07:36:33] thanks :)
[07:38:23] 6Labs, 10Tool-Labs: tools-webgrid-lighttpd-1208.eqiad.wmflabs hangs - https://phabricator.wikimedia.org/T125770#1996965 (10valhallasw) 3NEW
[07:38:49] RECOVERY - SSH on tools-worker-1002 is OK: SSH OK - OpenSSH_6.7p1 Debian-5+deb8u1 (protocol 2.0)
[07:41:58] 6Labs, 10Tool-Labs: tools-webgrid-lighttpd-1208.eqiad.wmflabs hangs - https://phabricator.wikimedia.org/T125770#1996981 (10valhallasw) I force-rescheduled jobs using `qhost -j -h tools-webgrid-lighttpd-1208.eqiad.wmflabs | grep -e 'MASTER' | cut -b1-11 | xargs qmod -rj`, but left the host up (unrebooted) for f...
[07:42:10] Amir1: should be ok now
[07:42:32] although https://tools.wmflabs.org/dexbot/ 404s, but that's probably something on your side?
[07:42:46] yeah, thanks :)
[07:43:13] No, you can test it using https://tools.wmflabs.org/dexbot/tools/mass_translate.php
[07:43:17] or several other tools
[07:44:17] I'm not going to remember that ;-)
[07:44:47] PROBLEM - SSH on tools-worker-1002 is CRITICAL: Server answer
[07:49:18] I put something out for the main page :D
[08:04:19] 6Labs, 10Tool-Labs: mysqli_fetch_all causes a Fatal error in PHP - https://phabricator.wikimedia.org/T125758#1996683 (10yuvipanda) 5Open>3Resolved This was caused by us using php5-mysql rather than php5-mysqlnd. I've switched them over (and informed labs-l). Works ok now.
[08:12:35] YuviPanda: https://stackoverflow.com/questions/29980884/proxy-pass-does-not-resolve-dns-using-etc-hosts
[08:12:37] (sigh)
[08:29:19] 6Labs, 10Tool-Labs: mysqli_fetch_all causes a Fatal error in PHP - https://phabricator.wikimedia.org/T125758#1996713 (10Ricordisamoa)
[09:29:02] 6Labs, 10Labs-Infrastructure, 10DBA, 6operations: db1069 is running low on space - https://phabricator.wikimedia.org/T124464#1997110 (10jcrespo) p:5Triage>3Normal
[09:30:54] 6Labs, 10Labs-Infrastructure, 10DBA, 6operations: db1069 is running low on space - https://phabricator.wikimedia.org/T124464#1997115 (10jcrespo) There were many innodb tables there, but mostly not on purpose, as they come from imports/backup recovery/alters/new deployment. Doing a (slow) batch conversion of...
[10:17:25] 10Wikibugs: wikibugs offline, hangs on json decoding - https://phabricator.wikimedia.org/T125769#1997191 (10Legoktm) Not me! :)
[10:24:49] RECOVERY - SSH on tools-worker-1002 is OK: SSH OK - OpenSSH_6.7p1 Debian-5+deb8u1 (protocol 2.0)
[10:30:47] PROBLEM - SSH on tools-worker-1002 is CRITICAL: Server answer
[10:32:27] 6Labs, 10DBA, 6operations, 5Patch-For-Review: Set up additional filters for Echo tables - https://phabricator.wikimedia.org/T125591#1997200 (10jcrespo) Filters are both puppetized and applied live on sanitarium/labs. They have been tested to work successfully. Now I have to delete all echo tables from there.
[10:53:06] PROBLEM - SSH on tools-webgrid-lighttpd-1208 is CRITICAL: Server answer
[10:58:05] RECOVERY - SSH on tools-webgrid-lighttpd-1208 is OK: SSH OK - OpenSSH_6.6.1p1 Ubuntu-2ubuntu2~wmfprecise2 (protocol 2.0)
[12:05:28] 6Labs, 10DBA, 6operations, 5Patch-For-Review: Set up additional filters for Echo tables - https://phabricator.wikimedia.org/T125591#1997494 (10jcrespo) Dropping is ongoing, it will take some time as I do not want to affect labs' replication lag.
[12:05:48] RECOVERY - SSH on tools-worker-1002 is OK: SSH OK - OpenSSH_6.7p1 Debian-5+deb8u1 (protocol 2.0)
[12:11:48] PROBLEM - SSH on tools-worker-1002 is CRITICAL: Server answer
[12:47:06] 6Labs, 10Beta-Cluster-Infrastructure: Disable /data/project for instances in deployment-prep that do not need it - https://phabricator.wikimedia.org/T125624#1997565 (10hashar) Last time I checked, various backend services wrote their logs to /data/project so they can be read from anywhere. Though that might j...
[13:29:26] PROBLEM - Puppet staleness on tools-webgrid-lighttpd-1208 is CRITICAL: CRITICAL: 66.67% of data above the critical threshold [43200.0]
[14:12:45] 6Labs, 10DBA, 6operations, 5Patch-For-Review: Set up additional filters for Echo tables - https://phabricator.wikimedia.org/T125591#1997835 (10jcrespo) These tables were deleted in db1069 and labsdb*: {F3310163} {F3310164}
[14:35:51] 6Labs, 10DBA, 6operations, 5Patch-For-Review: Set up additional filters for Echo tables - https://phabricator.wikimedia.org/T125591#1997943 (10jcrespo) 5Open>3Resolved @Mattflaschen This is done, please continue reporting any potential issue regarding privacy && labs, even if data was not really expose...
[16:07:21] 6Labs, 10wikitech.wikimedia.org, 5Patch-For-Review: Decide on future of Semantic extensions on Wikitech - https://phabricator.wikimedia.org/T123599#1998225 (10Aklapper) All Gerrit patches merged or abandoned for this "Unbreak now" priority task. What is left to do here? Someone please lay out what else is ne...
[16:23:19] Hi, all. I wanted to point out this thread: https://github.com/openstreetmap/openstreetmap-website/issues/1146
[16:25:01] i.e. "Add Wikipedia authentication" to OpenStreetMap
[16:27:30] as of now you can use several third-party services to log in to OSM
[16:27:31] including Google, Facebook, and Wordpress
[16:27:31] would it be possible to add Wikipedia logins among these services?
[16:27:31] CristianCantoro, with OAuth, yes
[16:39:59] https://www.mediawiki.org/wiki/OAuth/For_Developers
[16:40:15] that's the page for MediaWiki OAuth
[16:46:05] but isn't the goal of OAuth to give a user permission to edit Wikipedia/Wikimedia wikis with external services? In this case the point is to authenticate the user with Wikimedia credentials and let him/her edit OSM?
[16:49:49] I think you can use /identify: https://www.mediawiki.org/wiki/Extension:OAuth#Identify_the_User_.28optional.29
[16:51:09] CristianCantoro: ?
[16:51:21] let me check
[16:51:41] there are several kinds of oauth grants here
[16:52:08] some are api access via oauth
[16:52:20] some are login only
[16:52:59] check https://meta.wikimedia.org/wiki/Special:OAuthConsumerRegistration/propose
[16:54:34] is mwoauth for python installed on tool labs and i'm just being an idiot? or should I file a request?
[16:54:58] AmandaNP: you can install it locally
[16:55:15] on virtualenv
[16:55:33] zhuyifei1999_: just using the --install-dir right?
[16:55:52] on virtualenv?
[16:56:37] https://wikitech.wikimedia.org/wiki/Help:Tool_Labs#My_tool_requires_a_package_that_is_not_currently_installed_in_Tool_Labs._How_can_I_add_it.3F
[16:56:49] thanks, looking
[16:57:27] you can "$ python setup.py install --user" if you don't want to use virtualenv
[17:00:05] AmandaNP: if you have issues with virtualenv not activating automatically on every login, try renaming the mentioned .bashrc to .bash_profile
[17:01:12] zhuyifei1999_: will virtualenv work when i invoke python through jsub?
[17:01:25] hmm idk
[17:01:30] or should I just install user...
[17:01:43] me thinks install for user may be best
[17:01:55] in that case install user seems more reliable
[17:02:29] ya. i'll do that.
[17:08:12] CristianCantoro: the "authentication only" OAuth grant is for things that just want to know that a user has been authenticated by a Wikimedia wiki. See https://meta.wikimedia.org/w/index.php?title=Special:OAuthListConsumers/view/aea31746a1e5d5b3e7514952f70e7035&name=&publisher=BDavis+%28WMF%29&stage=1
[17:08:38] 6Labs, 10wikitech.wikimedia.org: Old versions of Semantic* extensions break Wikitech (after removing long-living deprecated functions from core) - https://phabricator.wikimedia.org/T123599#1998456 (10Florian) 5Open>3Resolved a:3Reedy
[17:10:27] 6Labs, 10wikitech.wikimedia.org: Old versions of Semantic* extensions break Wikitech (after removing long-living deprecated functions from core) - https://phabricator.wikimedia.org/T123599#1933483 (10Florian) Seems that {T53642} is, in fact, the underlying problem of this task, while this task tries to solve...
[17:12:27] AmandaNP: yes, but make sure to pass --release=trusty
[17:12:37] (re: jsub)
[17:13:03] 6Labs, 10wikitech.wikimedia.org: Get rid of SMW/SRF/SF - https://phabricator.wikimedia.org/T53642#1998491 (10Florian) p:5Normal>3High I know increasing the priority doesn't result in a faster resolve if no-one is working on the task. But with a look at {T123599} I think normal isn't the appropriate pri...
[17:13:45] valhallasw`cloud: so I could call virtualenv in the jsub argument and add that?
[17:14:06] AmandaNP: if you use /path/to/virtualenv/bin/python, it will automatically use the virtualenv
[17:15:13] bd808: https://tools.wmflabs.org/bash/quip/AU8FCPz66snAnmqnLHDj omg I love this from your tool
[17:15:38] :)
[17:16:08] arbitrary resource access is on the OWASP Top 10 list you know ;)
[17:35:04] https://tools.wmflabs.org/bash/quip/AU7VT5c76snAnmqnK_qL <= just can't help myself from laughing
[17:39:27] (03PS1) 10Jcrespo: Add (fake) mysql client pass for access from ruthenium to m5-master [labs/private] - 10https://gerrit.wikimedia.org/r/268423 (https://phabricator.wikimedia.org/T125435)
[17:40:32] (03PS2) 10Jcrespo: Add (fake) mysql client pass for access from ruthenium to m5-master [labs/private] - 10https://gerrit.wikimedia.org/r/268423 (https://phabricator.wikimedia.org/T125435)
[17:42:41] (03PS3) 10Jcrespo: Add (fake) mysql client pass for access from ruthenium to m5-master [labs/private] - 10https://gerrit.wikimedia.org/r/268423 (https://phabricator.wikimedia.org/T125435)
[17:56:28] thanks bd808 and zhuyifei1999
[18:08:16] Cyberpower678: how would i do that?
[18:08:53] It doesn't ask for them and they are in the user config
[18:08:54] When you registered your OAuth Consumer, it should've given a consumer key/secret as well as an access token/secret
[18:09:13] So 4 keys in total
[18:09:18] Yes i got those.
[18:09:30] But login.py doesn't ask for them
[18:10:05] I'm not sure how Pywikibot works, but it should ask for those when logging in with OAuth. If it doesn't then it isn't built correctly.
[18:10:52] It only asks for consumer stuff unless im missing an argument
[18:11:08] Can you point me to the code?
[18:11:14] of the login function?
[18:14:25] AmandaNP: using pywikibot?
[18:15:23] see https://www.mediawiki.org/wiki/Manual:Pywikibot/OAuth
[18:15:34] zhuyifei1999_: yes. And i did.
[18:16:02] what's your output?
[18:16:02] Cyberpower678: on my phone atm.
[18:16:28] AmandaNP: see w:WP:BOTN under breaking change
[18:16:40] Err zhuyifei1999_
[18:16:43] zhuyifei1999_, AmandaNP is trying to logon via OAuth
[18:16:57] But apparently pywikibot only asks for the consumer keys and secrets
[18:17:08] Cyberpower678: pretty sure the complexity of pywikibot will drive you crazy
[18:18:07] If pywikibot only asks for those keys, then it's designed wrong
[18:18:14] 6Labs, 10Labs-Infrastructure, 6Phabricator: can't log in to phab-01.eqiad.wmflabs - https://phabricator.wikimedia.org/T125666#1998922 (10Luke081515) p:5Triage>3High
[18:18:36] AmandaNP: which step are you getting the error on?
[18:18:51] zhuyifei1999_, when pywikibot tries to identify
[18:19:17] zhuyifei1999_: post entering the consumer key/token
[18:19:26] Cyberpower678: my tool generally has few issues with pywikibot oauth
[18:19:43] I haven't seen the code myself, but I'm guessing a developer who uses OAuth misprogrammed pywikibot.
[18:19:44] AmandaNP: which command?
[18:20:24] Cyberpower678: I think I know what the issue is, just confirming
[18:20:43] zhuyifei1999_, cool. Then I'll leave it in your capable hands
[18:21:30] Standby getting on comp
[18:21:35] k
[18:21:46] * Cyberpower678 works on college stuff in the meantime
[18:21:54] * Cyberpower678 opens his assembler
[18:22:35] AmandaNP: you should set all four tokens in user-config.py
[18:22:40] assembler?
[18:22:49] valhallasw`cloud: yep, that's done
[18:23:04] then try editing
[18:23:13] right, so why would login.py need to ask for anything?
[18:23:19] https://en.wikipedia.org/wiki/Assembler_(computing) ?
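What valhallasw`cloud describes at [18:22:35] above ("set all four tokens in user-config.py") looks roughly like the following, per the Manual:Pywikibot/OAuth page linked earlier. 'ExampleBot' and the token strings are placeholders:

```python
# user-config.py -- OAuth needs all four tokens from the consumer
# registration. 'ExampleBot' and the token values are placeholders.
usernames['wikipedia']['en'] = 'ExampleBot'

authenticate['en.wikipedia.org'] = (
    'consumer_key',
    'consumer_secret',
    'access_token',
    'access_secret',
)
```

With this in place, scripts authenticate on their own, which is why login.py has nothing left to prompt for.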
[18:23:27] zhuyifei1999_, I'm writing a program with assembly
[18:23:47] that would be my question
[18:23:51] because he might be using python login.py -oauth ?
[18:24:04] I am
[18:24:04] Cyberpower678: cool
[18:24:06] well
[18:24:14] pwb.py login -oauth
[18:25:18] AmandaNP: after adding tokens to user-config, I think you don't have to run that script again
[18:26:58] try doing an edit with pywikibot
[18:29:28] one sec i've f'd up my directory earlier by moving my bot scripts around
[18:42:41] (03CR) 10Jcrespo: [C: 032] Add (fake) mysql client pass for access from ruthenium to m5-master [labs/private] - 10https://gerrit.wikimedia.org/r/268423 (https://phabricator.wikimedia.org/T125435) (owner: 10Jcrespo)
[18:42:51] (03CR) 10Jcrespo: [V: 032] Add (fake) mysql client pass for access from ruthenium to m5-master [labs/private] - 10https://gerrit.wikimedia.org/r/268423 (https://phabricator.wikimedia.org/T125435) (owner: 10Jcrespo)
[18:54:49] zhuyifei1999_: i'm going to be a while. while updating pywikibot, apparently there are breaking changes from long ago.
[18:55:31] Cyberpower678: got any pywikibot examples I can read? I need to figure out the new way to write things
[18:56:21] afaik, CP678 works with PHP and not pywikibot
[18:58:14] looks like it.
[18:58:19] * AmandaNP looks up another user
[19:24:18] valhallasw`cloud: yeah, is kinda nuts
[19:24:43] valhallasw`cloud: I wrote a blog post about it, and convinced jupyter to switch: https://github.com/jupyter/jupyterhub/pull/411
[19:30:15] (03PS3) 10Addshore: Add wikidata/.* to wikidata-feed [labs/tools/grrrit] - 10https://gerrit.wikimedia.org/r/247831
[19:30:24] 6Labs, 10Beta-Cluster-Infrastructure: Disable /data/project for instances in deployment-prep that do not need it - https://phabricator.wikimedia.org/T125624#1999364 (10yuvipanda) If anything is writing logs to NFS we must make sure it stops as soon as possible - I'll check with the parsoid people :) ALL insta...
[19:34:16] zhuyifei1999_, correct
[19:34:30] AmandaNP, sorry, my knowledge of Pywikibot is limited
[19:34:44] I only know the theory of its operation
[19:35:11] AmandaNP: asking in the #pywikibot channel will probably help :)
[19:37:55] 6Labs, 10Beta-Cluster-Infrastructure: Disable /data/project for instances in deployment-prep that do not need it - https://phabricator.wikimedia.org/T125624#1999407 (10hashar)
[19:39:34] 6Labs, 10Beta-Cluster-Infrastructure: Disable /data/project for instances in deployment-prep that do not need it - https://phabricator.wikimedia.org/T125624#1992944 (10hashar) Listed the NFS types via salt/df/magic: ``` root@deployment-salt:~ # salt -v --out=txt '*' cmd.run "df -t nfs 2>/dev/null|grep -v ^File...
[20:23:06] 6Labs, 10Labs-Infrastructure, 10Tool-Labs, 6operations, 7Monitoring: Ensure mysql credential creation for tools users is running - https://phabricator.wikimedia.org/T125874#1999618 (10Dzahn) 3NEW
[20:44:00] 6Labs: Create Labs project for ifttt - https://phabricator.wikimedia.org/T124131#1946760 (10bd808) Has the service outgrown running as a tool? Running a project takes more admin work than keeping things running in the tools grid. In any case we need the template details I added to the summary filled out so we k...
[20:49:21] 6Labs, 15User-bd808: Labs project for MW fuzzing - https://phabricator.wikimedia.org/T125340#1999766 (10bd808) 5Open>3Resolved a:3bd808 https://wikitech.wikimedia.org/wiki/Nova_Resource:Mwfuzz
[20:49:23] 6Labs, 7Tracking: New Labs project requests (tracking) - https://phabricator.wikimedia.org/T76375#1999769 (10bd808)
[20:50:20] 6Labs, 6Parsing-Team, 7Tracking: Create labs project for parsing team to run experimental mediawiki installs - https://phabricator.wikimedia.org/T125882#1999774 (10ssastry) 3NEW
[20:51:26] 6Labs, 6Parsing-Team, 7Tracking: Create labs project for parsing team to run experimental mediawiki installs - https://phabricator.wikimedia.org/T125882#1999774 (10ssastry)
[20:51:32] 6Labs: Create Labs project for ifttt - https://phabricator.wikimedia.org/T124131#1999785 (10yuvipanda) 5Open>3Resolved a:3yuvipanda I, uh, actually already created the project and forgot to close this task... I've been appropriately shamed!
[20:51:34] 6Labs, 7Tracking: New Labs project requests (tracking) - https://phabricator.wikimedia.org/T76375#1999788 (10yuvipanda)
[20:51:57] 6Labs, 6Stewards-and-global-tools: Create a Labs project for admin tooling enhancements - https://phabricator.wikimedia.org/T123993#1999790 (10bd808) @lfaraone any objection to calling the project "cuos" so the overloaded "tools" word isn't part of the name?
[20:56:06] 6Labs, 6Parsing-Team, 15User-bd808: Create labs project for parsing team to run experimental mediawiki installs - https://phabricator.wikimedia.org/T125882#1999798 (10bd808) a:3bd808
[20:57:04] 6Labs: Create a labs project for apt repository for mediawiki-vagrant installs - https://phabricator.wikimedia.org/T125884#1999811 (10yuvipanda) 3NEW
[21:01:21] 6Labs, 7Tracking: New Labs project requests (tracking) - https://phabricator.wikimedia.org/T76375#1999852 (10yuvipanda)
[21:01:23] 6Labs: Create a labs project for apt repository for mediawiki-vagrant installs - https://phabricator.wikimedia.org/T125884#1999849 (10yuvipanda) 5Open>3Resolved a:3yuvipanda Done!
[21:02:26] 6Labs, 6Parsing-Team, 15User-bd808: Create labs project for parsing team to run experimental mediawiki installs - https://phabricator.wikimedia.org/T125882#1999854 (10bd808) 5Open>3Resolved https://wikitech.wikimedia.org/wiki/Nova_Resource:Wikitextexp
[21:02:28] 6Labs, 7Tracking: New Labs project requests (tracking) - https://phabricator.wikimedia.org/T76375#1999857 (10bd808)
[21:02:34] mobrovac: bd808 I'm setting up the aptly now
[21:02:55] nice!
[21:03:06] mobrovac: what version of node did you want me to import?
[21:03:13] YuviPanda: 4.2.4
[21:03:15] ok
[21:03:28] mobrovac: is there a preferred trusty repo you'd like me to import it from? :)
[21:03:42] oh
[21:03:49] found it
[21:03:56] https://deb.nodesource.com/node_4.x/
[21:04:29] YuviPanda: https://github.com/nodesource/distributions#installation-instructions
[21:04:34] YuviPanda: hehe, that's the one
[21:04:58] these aren't the packages that we have in prod though
[21:05:28] paravoid: yeah, but we don't have trusty packages for nodejs in prod do we?
[21:05:38] so this is the policy stuff someone involved with MWV has to figure out :)
[21:05:45] MWV?
[21:06:11] paravoid: mediawiki vagrant
[21:06:21] oh
[21:06:28] https://phabricator.wikimedia.org/T125760
[21:06:34] this is for MWV
[21:07:03] dare I ask why mediawiki-vagrant still uses trusty? :)
[21:07:22] no, you may not! :)
[21:07:23] because MW in prod runs on trusty
[21:08:15] I'd be willing to start working on converting it to jessie. We need at least an HHVM build for that
[21:09:20] * bd808 may have signed himself up for too many side projects lately
[21:10:13] bd808: mobrovac ok, imported, available as mwv-apt.wmflabs.org
[21:10:18] there's no useful http interface there
[21:10:24] but you can add it to sources.list and it should work
[21:10:36] no signing yet
[21:10:38] need to figure that out
[21:11:27] 6Labs: Create a labs project for apt repository for mediawiki-vagrant installs - https://phabricator.wikimedia.org/T125884#1999928 (10bd808)
[21:11:38] deb [trusted=yes] http://tools-services-01/repo trusty-tools main
[21:11:42] except
[21:12:02] deb [trusted=yes] http://mwv-apt.wmflabs.org/repo trusty-mwv-apt main
[21:12:07] we can get rid of the trusted=yes
[21:12:10] once I figure out signing
[21:13:24] ok
[21:13:26] brb
[21:18:42] YuviPanda: signing is a lot of effort for very little gain
[21:18:51] valhallasw`cloud: yeah, but this is for MWV
[21:18:59] valhallasw`cloud: so we can push a gpg key just there and see how it goes
[21:43:33] bd808: mobrovac do you care about signing?
[21:43:35] for this?
[21:43:39] https works too...
[21:45:00] We can probably survive with just https unless csteipp hates that.
[21:45:05] (03PS1) 10Tim Landscheidt: Import sql from operations/puppet [labs/toollabs] - 10https://gerrit.wikimedia.org/r/268563
[21:45:36] mobrovac already figured out how to stand on his head and get the https apt transport installed in the VMs
[21:45:43] bd808: yeah
[21:45:47] bd808: and https works for this so...
[21:45:57] bd808: so you need to have root in the project to update packages
[21:46:09] bd808: and by default there's a precise, jessie and trusty repo too
[21:51:43] hi guys! i'm finding lately that a python script of mine is having a "segmentation fault" when it tries to run a mysql query that actually works in the mysql command line. does anyone know why this happens?
[21:53:30] a python bug?
[21:53:52] I would check the memory usage
[21:54:09] it's possible that your script is using more memory than allowed, and getting killed
[21:57:49] mmm must be a bug, it seems strange that the query works in the command line
[21:59:32] is your script somewhere that can be seen?
[22:00:01] wait a sec, pastebin
[22:00:15] http://pastebin.com/RxzyTY0t
[22:03:20] your problem is mysql_cur.fetchall()
[22:03:41] you are trying to get into memory the whole revision table
[22:03:41] mmm, it dies before that
[22:03:44] which is HUGE
[22:03:50] it dies after the print
[22:03:52] query print
[22:04:12] sorry
[22:04:16] i mean before the query print
[22:04:21] so it actually doesn't get to fetch it
[22:04:54] maybe the execute in python-mysql already does an implicit fetch?
[22:05:20] i'm not sure about that
[22:05:48] the query in the command line takes 2 min or 3
[22:05:56] which means it's manageable
[22:06:21] it's indexed
[22:06:31] but still, you are retrieving the full revision table
[22:06:35] why do you need that?
a study i'm doing
[22:07:12] i cannot do with less unfortunately
[22:07:34] you're probably better off just parsing the dumps for something like this
[22:08:25] it's another option yes
[22:08:40] but it annoys me a lot that the query actually works and runs all the lines in the mysql command line
[22:09:10] you can always run it from the mysql binary
[22:09:18] well, you should be able to loop over the rows
[22:09:35] rather than getting all at the same time
[22:09:46] he has a loop below
[22:09:53] valhallasw`cloud: it doesn't die when fetching them, but in the execute
[22:10:15] I bet it's pre-fetching and getting too many results to hold in ram
[22:10:48] I don't know the guts of the python mysql bindings but the php bindings would do that (pre-fetch results)
[22:10:55] I would be surprised -- mysqldb is pretty low level, and matches the mysql C library's logic with cursors etc
[22:10:57] but it's possible
[22:11:04] marmick: is this on a bastion or on the grid?
[22:11:14] bastion
[22:11:23] but that fetchall() will die no matter what really
[22:11:29] hm.
[22:11:43] in any case, it should never segfault -- I don't get why it does that
[22:11:58] allocation error imho
[22:12:00] maybe my mysqldb connection has sth wrong
[22:12:24] Platonides: 'allocation error'?
[22:13:04] as in: tries to malloc() too much and crashes when that fails?
[22:13:53] the attributeslist parameter is unused
[22:14:18] Platonides: right, i didn't use it this time, going to delete it
[22:14:21] valhallasw`cloud: it malloc()s, and the kernel gives it a virtual page
[22:14:32] when trying to actually use it, it turns out there's no memory for that
[22:14:34] so it's killed
[22:16:09] Platonides: overcommit is disabled on tools hosts
[22:16:28] so malloc() would fail instead
[22:16:50] I wasn't aware of that
[22:18:18] * Platonides tries running it
[22:20:11] it does load something in memory, at least
[22:20:26] memory usage went up to ~800M without actually reading rows
[22:20:39] for ~2M rows
[22:20:50] what wiki are you trying the query with?
[22:20:56] nlwiktionary
[22:21:19] it succeeds, but requires 800M memory, so it's likely a larger wiki will need more memory, cause malloc() to fail, etc.
[22:23:53] then, how come it works when it is used in the command line? python mysqldb?
[22:27:10] marmick: dunno, it might skip the cursor altogether
[22:27:12] valhallasw`cloud: in iswiki it works perfectly
[22:27:21] but in cawiki it dies
[22:27:21] (the cursor has to store some sort of reference to the retrieved rows)
[22:29:04] oh, actually
[22:29:19] http://mysql-python.sourceforge.net/MySQLdb.html#using-and-extending
[22:29:40] apparently the normal cursor indeed immediately retrieves rows
[22:30:33] ah, bad cursor...
[22:31:51] what you just sent is omitting the cursor, right?
[22:32:26] probably
[22:32:33] should be using SSCursor
[22:32:34] SSCursor
[22:32:36] A "server-side" cursor. Like Cursor but uses CursorUseResultMixIn. Use only if you are dealing with potentially large result sets.
[22:32:41] Yeah, it works with SSCursor
[22:32:54] it also tells me there are 18446744073709551615 results which is maybe a bit much
[22:33:51] and you might run into sql server disconnects
[22:34:03] again, the database is not really suited for things like this
[22:35:20] 18446744073709551615?
[22:35:29] wow
[22:36:36] that seems… too much
[22:36:44] on which wiki are you trying it, valhallasw`cloud ?
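To make the resolution above concrete: MySQLdb's default cursor buffers the entire result set client-side during execute() — which is why the script died before ever reaching fetchall() — while SSCursor leaves the result on the server and streams rows. A sketch, with an illustrative hostname and query since the original pastebin script isn't preserved here:

```python
# Sketch of the SSCursor fix discussed above. Host/db/query are
# illustrative; the key change is the cursorclass argument.
import MySQLdb
import MySQLdb.cursors

conn = MySQLdb.connect(
    host='cawiki.labsdb',
    db='cawiki_p',
    read_default_file='~/replica.my.cnf',  # standard Tool Labs credentials
    cursorclass=MySQLdb.cursors.SSCursor,  # "use result": stream, don't buffer
)
cur = conn.cursor()
cur.execute('SELECT rev_page, rev_timestamp FROM revision')
for rev_page, rev_timestamp in cur:  # rows arrive one at a time
    pass  # process each row here; never call fetchall() on a huge table

# Two caveats raised in the discussion: cur.rowcount is not meaningful
# until the result set is exhausted (hence the bogus 18446744073709551615),
# and a consumer that processes rows too slowly can hit server disconnects.
```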
even for enwiki, there aren't more than 703338990 revisions
[22:37:57] Nlwikt
[22:38:55] max rev_id is 2543158, count() revision is 2495900
[22:39:24] Yeah, the number is clearly nonsense
[22:39:31] it's a -1
[22:39:49] (uint64_t)-1
[22:45:10] Ah, that makes sense
[22:47:05] funny, nl wiktionary has 38 pageless revisions
[22:50:08] apparently it took 5 min to fetch cawiki revisions when there are 15938019 results
[22:50:08] actually, it misses some page entries in labs
[22:50:24] in the command line i mean!
[22:50:30] in python it still dies
[22:52:42] 10Labs-Other-Projects: Configure Single Sign On at discourse.wmflabs.org - https://phabricator.wikimedia.org/T124691#2000367 (10Tgr) >>! In T124691#1993035, @EBernhardson wrote: > another potential blocker though is that we don't disclose email addresses over oauth1 from mediawiki. We do, you need to select //A...
[23:03:01] anyone know of a pageview API tool that supports deeplinking? the demo is pretty awesome, except you can't create deeplinks to certain pages... https://analytics.wmflabs.org/demo/pageview-api/
[23:05:26] Cyberpower678: you up for the task? :)
[23:05:54] I would do it, but I do Ruby and no one else does
[23:05:58] MusikAnimal, unfortunately no.
[23:06:05] I already have a large task
[23:06:17] yeah me too, got a backlog of tech work
[23:06:28] Its name is InternetArchiveBot
[23:06:34] I can at least create something very simple
[23:06:55] yes, an exciting task that is!
[23:08:22] And quite a time-demanding one too
[23:08:39] And a lot more fun than xTools
[23:09:25] Surprisingly, I'm getting a lot more comments and input than I ever did with xTools
[23:10:12] glad to hear it
[23:10:32] having something like that has been a long time coming. There used to be another bot that did it way back when; guess it wasn't meant to be
[23:11:39] This bot will last a while
[23:13:03] MusikAnimal, someone I can't remember mentioned that this would be the next big thing after ClueBot NG
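On the "(uint64_t)-1" exchange above: 18446744073709551615 is exactly what -1 looks like when reinterpreted as an unsigned 64-bit integer, which is why SSCursor's rowcount is nonsense until the result set has been fully consumed. A quick check:

```python
# 18446744073709551615 is -1 reinterpreted as an unsigned 64-bit integer.
print(2**64 - 1)      # 18446744073709551615
print((-1) % 2**64)   # same value: -1 wrapped into the uint64 range
```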