[00:46:44] 10MediaWiki-extensions-OpenStackManager: Can't delete NovaProxy instance with malformed DNS hostname - https://phabricator.wikimedia.org/T69927#1718662 (10Krenair) No, I made one of these proxy entries and tried to delete it by sending requests to dynamicproxy-api. It's the `/`s which break everything. ```krena...
[00:49:11] 6Labs, 10MediaWiki-extensions-OpenStackManager, 10Labs-Infrastructure: Can't delete NovaProxy instance with malformed DNS hostname - https://phabricator.wikimedia.org/T69927#1718663 (10Krenair) https://github.com/mitsuhiko/flask/issues/900 https://github.com/mitsuhiko/werkzeug/issues/477
[00:52:06] 6Labs, 10MediaWiki-extensions-OpenStackManager, 10Labs-Infrastructure: Can't delete NovaProxy instance with malformed DNS hostname - https://phabricator.wikimedia.org/T69927#1718665 (10Krenair) So I think we should change invisible-unicorn to validate domain names, and delete (on the server-side) those that...
[00:57:33] 6Labs, 10Labs-Infrastructure: Can't delete NovaProxy instance with malformed DNS hostname - https://phabricator.wikimedia.org/T69927#1718666 (10Krenair)
[00:59:01] It seems /data/scratch/dumps/ got moved.
[00:59:10] There's now /public/dumps, but it's not writeable?
[00:59:25] If I want to download https://dumps.wikimedia.org/enwiki/20151002/enwiki-20151002-pages-meta-current.xml.bz2 somewhere, where should I put it?
[01:03:10] I went with /data/scratch/dumps again.
[01:03:19] But if there's a better place, please let me know.
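For illustration, server-side validation of the kind proposed for invisible-unicorn could look like the minimal sketch below. The function and regex are hypothetical (not the actual Gerrit change): each DNS label is checked against the RFC 1123 letter/digit/hyphen rule, which also rejects the `/` characters that broke deletion via dynamicproxy-api.

```python
import re

# Hypothetical sketch, not the real invisible-unicorn patch: each label may
# contain only letters, digits, and hyphens (1-63 chars), and may not start
# or end with a hyphen.  A '/' therefore never passes validation.
LABEL = re.compile(r"^(?!-)[A-Za-z0-9-]{1,63}(?<!-)$")

def is_valid_hostname(name: str) -> bool:
    """Return True if name is a syntactically valid DNS hostname."""
    if not name or len(name) > 253:
        return False
    return all(LABEL.match(label) for label in name.rstrip(".").split("."))
```

Rejecting such names at creation time also sidesteps the Flask/Werkzeug routing issue linked above, since a DELETE request's URL path then never has to carry a literal `/` inside a hostname segment.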
[01:27:13] (03PS1) 10Alex Monk: Don't allow creation of invalid domains [labs/invisible-unicorn] - 10https://gerrit.wikimedia.org/r/245200 (https://phabricator.wikimedia.org/T69927)
[01:54:44] 6Labs, 10MediaWiki-extensions-OpenStackManager, 10wikitech.wikimedia.org: https://wikitech.wikimedia.org/w/api.php?action=novaprojects&subaction=getall times out - https://phabricator.wikimedia.org/T115034#1718700 (10Krenair) See {T102404}
[01:56:47] 6Labs: Kill action=novaprojects - https://phabricator.wikimedia.org/T102404#1718703 (10Krenair) Doesn't give you user roles though. :/
[01:57:02] 10MediaWiki-extensions-OpenStackManager: Kill action=novaprojects - https://phabricator.wikimedia.org/T102404#1718704 (10Krenair)
[02:02:29] 6Labs: Automate population of floating->internal IP aliasing in labs pdns recursor. - https://phabricator.wikimedia.org/T108063#1718706 (10Krenair)
[02:02:30] 6Labs, 10Labs-Infrastructure: Automate generation of floating/private dns aliases in the labs recursor - https://phabricator.wikimedia.org/T100990#1718707 (10Krenair)
[02:02:49] 6Labs, 10Labs-Infrastructure: Automate generation of floating/private dns aliases in the labs recursor - https://phabricator.wikimedia.org/T100990#1718709 (10Krenair) a:3Krenair Being handled in https://gerrit.wikimedia.org/r/#/c/243357/
[02:03:07] 6Labs, 10Labs-Infrastructure, 5Patch-For-Review: Automate generation of floating/private dns aliases in the labs recursor - https://phabricator.wikimedia.org/T100990#1718711 (10Krenair)
[02:05:53] Hi - how can I request more memory for my PHP web tool?
[02:18:21] 6Labs, 10MediaWiki-extensions-OpenStackManager, 10wikitech.wikimedia.org: Provide wikitech API for list of OpenStack regions - https://phabricator.wikimedia.org/T115237#1718712 (10scfc) 3NEW
[02:27:32] 6Labs, 5Patch-For-Review: Convert all ldap globals into hiera variables instead - https://phabricator.wikimedia.org/T101447#1718721 (10Krenair) These are the puppetVar attributes in the hosts database right?
There's some remaining ssh_hba entries which haven't been cleaned up, as well as a [[ https://gerrit.wi...
[02:27:49] 6Labs: Convert all ldap globals into hiera variables instead - https://phabricator.wikimedia.org/T101447#1718722 (10Krenair)
[02:47:33] 6Labs, 6Discovery, 10Wikidata, 10Wikidata-Query-Service: Wikidata Metrics - https://phabricator.wikimedia.org/T115120#1718727 (10Smalyshev) It is possible, but I don't think it is a good idea for the service as we run it, since it would just consume space/memory and we don't use them much. Still, for stand...
[03:19:55] liangent: file a bug!
[03:22:34] yuvipanda: do you mean filing a bug for requesting this customization feature or setting a higher limit for my tool?
[03:22:47] if it's the latter I don't really know how much is enough
[03:22:58] liangent: yeah, only admins can set a higher limit
[03:23:07] so if you file a bug I can raise it tomorrow
[03:23:10] (gotta go sleep now :()
[03:23:29] good night
[03:34:10] 6Labs, 10Tool-Labs: Expose wb_items_per_site table on replicated database on Tool Lab - https://phabricator.wikimedia.org/T115239#1718785 (10liangent) 3NEW
[03:36:02] 6Labs, 10Tool-Labs: Expose wb_items_per_site table on replicated database on Tool Lab - https://phabricator.wikimedia.org/T115239#1718793 (10liangent) 5Open>3Invalid a:3liangent Wrong database used I guess; sorry.
[03:38:31] This dump is still downloading.
[03:38:32] Hmm.
[09:46:38] PROBLEM - Puppet staleness on tools-k8s-bastion-01 is CRITICAL: CRITICAL: 100.00% of data above the critical threshold [43200.0]
[10:48:16] (03CR) 10BBlack: [C: 031] Add blank mx secrets [labs/private] - 10https://gerrit.wikimedia.org/r/245139 (https://phabricator.wikimedia.org/T87848) (owner: 10Alex Monk)
[11:11:56] Katie: yay, fast dump downloads!
[11:12:08] !krrrit-wm tell me about yourself!
[11:12:42] (03CR) 10Addshore: "Has this been deployed?
:O" [labs/tools/grrrit] - 10https://gerrit.wikimedia.org/r/241060 (owner: 10Addshore)
[11:12:53] ^^ for anyone that knows how to check.. :D
[13:39:07] 6Labs, 10Labs-Other-Projects, 10The-Wikipedia-Library: Create Cyberbot Project on Labs - https://phabricator.wikimedia.org/T112881#1719218 (10Addshore) From experience of running a fast editing global bot http://meta.wikimedia.org/wiki/User:Addbot 10 million articles can be edited in a month across all sites...
[13:59:20] 6Labs, 10Labs-Other-Projects, 10The-Wikipedia-Library: Create Cyberbot Project on Labs - https://phabricator.wikimedia.org/T112881#1719265 (10yuvipanda) I agree with @addshore too. I also don't think ops will be very happy with you making millions of API requests for page contents either.
[14:38:52] sitic: Crosswatch gives a 404. What's up?
[14:39:09] I had started relying on it much...
[14:58:02] legoktm: yuvipanda ^^
[15:03:20] Niharika: in meetings all day :(
[15:03:37] yuvipanda: Whenever you get time then. :)
[15:03:58] I'm actually in Puerto Rico for the next week
[15:04:00] :(
[15:04:03] so technically not all of this week
[15:05:11] Puerto Rico. Nice!
[15:05:26] addshore: It was slow.
[15:05:26] It's okay. I'll manage the week.
[15:05:29] addshore: It took like 4 hours.
[16:16:32] 6Labs: Ignored file /etc/apt/apt.conf.d/20auto-upgrades.ucf-dist - https://phabricator.wikimedia.org/T110055#1719522 (10brion) I get this on a fresh instance.
[16:34:25] 6Labs: Document labs SSH Fingerprints in sha256 format - https://phabricator.wikimedia.org/T112993#1719560 (10He7d3r) Looks good.
[16:38:07] hi - I visited here about 2 days ago.. to ask about my account, and see what services I could access.. (I am writing a proposal that may include some Wikimedia component) .. I appreciate the pointers I got at that time
[16:38:39] I have some of the links in the history..
a kid in the household here closed some of my open tabs, so I might not have all of them
[16:39:34] I would like to see if I can login to the 'bastion' base .. which appears to be the start of more detailed machine access
[16:39:51] (I run some linux machines myself, but I don't know the whole setup on -labs)
[16:44:32] dbb: it's complicated :-)
[16:45:01] dbb: let me start by saying that if you want to use tool labs, you're better off just connecting to login.tools.wmflabs.org
[16:45:36] having said that, tool labs was built on top of wikimedia labs, which is essentially a virtual machine host. All tool labs hosts are VMs running there
[16:45:47] and there's a few hundred VMs also running there
[16:46:23] most of those VMs don't have an external IP address, so you can't just directly SSH in. Instead, you SSH to a bastion, and use that to connect to the internal IP address of the VM you're trying to reach
[16:52:18] (and I don't think all of those which do have external IPs will let you SSH from outside the network)
[17:22:46] Change on 12wikitech.wikimedia.org a page Nova Resource:Tools/Access Request/Scoopfinder was created, changed by Scoopfinder link https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/Access_Request/Scoopfinder edit summary: Created page with "{{Tools Access Request |Justification=Run pywikipedia |Completed=false |User Name=Scoopfinder }}"
[17:27:17] ok - this is how far I got..
[17:28:20] from a stable ubuntu 14.04 machine, I tried to login to a few of these machines.. using the same password auth I used at wikitech.wikimedia.org
[17:28:43] it appeared to want an ssh key..
I do not recall using keys a year ago, but my memory is dim
[17:29:14] so, logged in, going to this page https://wikitech.wikimedia.org/wiki/Special:Preferences#mw-prefsection-openstack
[17:29:42] I generated two key pairs on my side, with ssh-keygen, using slightly different settings
[17:30:06] I uploaded the .pub portion to the OpenStack tab in Prefs, as indicated
[17:30:24] but, using ssh -i mykey remote, I am not being let in
[17:30:34] I could certainly be missing something on my side
[17:34:19] dbb, okay so you definitely shouldn't be logging in with passwords
[17:34:51] you can SSH into bastions, correct?
[17:36:08] hi Krenair
[17:36:23] I have not successfully ssh'd into anything this year
[17:36:48] okay, so you should get that sorted first
[17:36:55] try sshing to bastion.wmflabs.org
[17:36:56] when I supply the private part of the key via ssh -i I still am denied
[17:37:02] trying
[17:38:22] I had requested a login of darkblueb, to match my wiki identity, but it is unclear whether I was init'd with dbb or darkblueb or what
[17:38:35] so that's a variable
[17:39:06] aha - dbb worked
[17:39:15] this is new behavior -- good news
[17:40:08] Linux bastion-01 3.16.0-4-amd64 #1 SMP Debian 3.16.7-ckt4-3 (2015-02-03) x86_64 GNU/Linux
[17:40:12] have you found the page about setting up your ssh config to go through the bastions?
[17:40:29] where is that?
[17:43:25] dbb, oh, first: what instance are you going to try to access? tools?
[17:43:35] something in tools*
[17:44:37] I do not know yet.. I am trying to a) clear up my login creds from last year; b) get a 'geohack' sort of response via some service; c) understand how that might get expanded relevant to the proposal I am writing
[17:44:48] we seem to be making progress :-)
[17:45:27] do you know what I mean by 'geohack' ?
basically retrieve context info, given a POINT or BBOX
[17:45:28] no point setting up ssh client config to access instances if you haven't been granted any project memberships
[17:45:52] ok good - fewer steps at this time, is better
[17:46:13] I guess you just want tools access?
[17:47:02] this sort of info to start.. https://tools.wmflabs.org/geohack/geohack.php?pagename=Berkeley%2C_California&params=37_52_18_N_122_16_22_W_type:city%28112580%29_region:US-CA
[17:47:18] then, referencing things there, get pages in some condensed format perhaps
[17:47:43] valhallasw`cloud, ^
[17:48:00] sorry, having dinner now. will be back in an hour or so
[17:49:35] thx - I will be around for about three hours or so
[17:49:42] :-)
[17:56:18] 6Labs: Document labs SSH Fingerprints in sha256 format - https://phabricator.wikimedia.org/T112993#1719672 (10Krenair) Done, posted the script I used to generate the pages to https://wikitech.wikimedia.org/wiki/Help:SSH_Fingerprints Still got a couple of production hosts to sort out
[17:56:26] 6Labs: Document labs SSH Fingerprints in sha256 format - https://phabricator.wikimedia.org/T112993#1719673 (10Krenair) 5Open>3Resolved
[18:13:33] 6Labs: Document labs SSH Fingerprints in sha256 format - https://phabricator.wikimedia.org/T112993#1719722 (10Krenair) Done those stat ones (by hand using ssh-keyscan on a production bastion and ssh-keygen in my VM, then copying everything manually into the template), added git-ssh (I also added bast2001/bast400...
[18:19:09] Is it possible to share a database between two tools?
[18:20:48] Matthew_, I guess ops could allow you to do that
[18:21:43] Hm, OK. I want to split my bot away from the web interface into separate tools, as the bot will not be tied to this web interface alone.
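For anyone following along, the ssh client config mentioned at [17:40:12] usually looks something like the sketch below. This is a guess at a typical setup, not the wikitech help page's exact text: the username, key path, and instance domain are placeholders, and it uses ProxyCommand with `-W` (era-appropriate OpenSSH; ProxyJump came later).

```
# ~/.ssh/config -- hypothetical sketch; adjust User, IdentityFile, and domains
Host bastion.wmflabs.org
    User dbb
    IdentityFile ~/.ssh/labs_key

# Internal labs VMs have no public IP, so hop through the bastion
Host *.eqiad.wmflabs
    User dbb
    IdentityFile ~/.ssh/labs_key
    ProxyCommand ssh -W %h:%p bastion.wmflabs.org
```

With something like this in place, `ssh <instance>.eqiad.wmflabs` tunnels through the bastion automatically (assuming you have been granted membership in the instance's project).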
[18:33:32] 6Labs: Ignored file /etc/apt/apt.conf.d/20auto-upgrades.ucf-dist - https://phabricator.wikimedia.org/T110055#1719773 (10hashar) 5Open>3stalled Yup I broke it with https://gerrit.wikimedia.org/r/#/c/210024/1/modules/labs_vmbuilder/files/postinst.copy which causes the image building script to do a recursive co...
[18:34:37] Krenair: What do you think the procedure would be to do that? Phabricator?
[18:47:10] Matthew_: if the name of your database ends with _p, every other user can also read it
[18:47:22] Matthew_: you might also be able to give grants to specific users, but I'm not sure about that.
[18:47:40] valhallasw`cloud: I don’t want every user to read it, as it will contain usernames. I just want these specific tools to be able to.
[18:49:04] hi valhallasw`cloud
[18:49:48] dbb: you indeed have access to the bastion and nothing else. OK.
[18:50:12] valhallasw`cloud: did you see the summary of what I am trying to accomplish ?
[18:50:23] dbb: yes, but I don't understand it
[18:50:29] a) clear up my login creds from last year; b) get a 'geohack' sort of response via some service; c) understand how that might get expanded relevant to the proposal I am writing
[18:50:42] so a) seems to have worked?
[18:50:56] I don't get what you mean with b) (you can access https://tools.wmflabs.org/geohack/geohack.php already?)
[18:51:07] I am now able to log into bastion with username dbb, and a key I generated, yes
[18:51:27] hmm I would like to avoid parsing HTML pages
[18:52:29] are the interface/params/result sets of geohack.php documented ?
[18:54:05] Matthew_: you can use GRANT to give specific other users access
[18:54:11] e.g. GRANT ALL on u1092__testing_grants TO s51328;
[18:54:22] really, I would like to get wikipedia pages .. more than a list of other map systems.. I make map systems myself
[18:54:39] .. but geohack.php already knows a lot of things
[18:54:42] dbb: I'm confused. What data are you looking for?
[18:54:53] valhallasw`cloud: Thank you
[18:55:00] geohack doesn't know anything, it just reformats what's in the URL query string
[18:55:09] hmmm
[18:55:11] well, mostly
[18:55:57] in this proposal, I have a POINT of interest, or BBOX of interest.. some kind of TIME of interest.. I want to use wikimedia services to get context information on that location
[18:56:31] I don't see how geohack provides you that
[18:56:42] ok maybe I am asking the wrong thing then
[18:59:18] dbb: if you want to know which articles are close to a location, you can use the api
[18:59:24] however, there is some system in wikipedia, where *some* pages are marked with a location
[18:59:40] which api(s) ?
[18:59:42] the inverse (from page to location) also works with the api
[18:59:48] https://en.wikipedia.org/w/api.php?action=help&recursivesubmodules=1#query+geosearch
[18:59:55] ah oh hm
[18:59:57] eh, https://en.wikipedia.org/w/api.php?action=help&modules=query%2Bgeosearch is quicker to load
[19:00:15] I don't care about performance yet
[19:00:34] as long as it is somewhat sane
[19:01:09] you can use https://en.wikipedia.org/w/api.php?action=help&modules=query%2Bcoordinates to get coordinates for a given page
[19:15:28] ok - thx! I have to go, back later.. thank you for the pointers
[19:31:38] 6Labs, 10Tool-Labs: Webservice on Tools Labs fails repeatedly - https://phabricator.wikimedia.org/T115231#1719874 (10valhallasw) p:5Triage>3Unbreak!
[19:34:18] 6Labs, 10Tool-Labs: Webservice on Tools Labs fails repeatedly - https://phabricator.wikimedia.org/T115231#1719876 (10valhallasw) At the moment this seems to work, so I'm closing this. If it happens again, please re-open it (with the same priority).
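The two API lookups suggested above (articles near a point via `list=geosearch`, and the inverse via `prop=coordinates`) can be sketched as follows. This only builds the request URLs with the standard library; the coordinates are Berkeley's, echoing the geohack link earlier, and the defaults (radius, limit) are illustrative choices, not API requirements.

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def geosearch_url(lat: float, lon: float, radius_m: int = 10000, limit: int = 10) -> str:
    """URL for pages near a point (list=geosearch)."""
    return API + "?" + urlencode({
        "action": "query",
        "list": "geosearch",
        "gscoord": f"{lat}|{lon}",   # latitude|longitude
        "gsradius": radius_m,        # search radius in metres
        "gslimit": limit,
        "format": "json",
    })

def coordinates_url(title: str) -> str:
    """URL for the inverse lookup: coordinates of a given page (prop=coordinates)."""
    return API + "?" + urlencode({
        "action": "query",
        "prop": "coordinates",
        "titles": title,
        "format": "json",
    })

# Example: articles near Berkeley, California
url = geosearch_url(37.8716, -122.2727)
```

Fetching either URL (with urllib, requests, or curl) returns JSON, so no HTML parsing is needed, which addresses the "avoid parsing HTML pages" concern above.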
[19:34:29] 6Labs, 10Tool-Labs: Webservice on Tools Labs fails repeatedly - https://phabricator.wikimedia.org/T115231#1719877 (10valhallasw)
[20:56:55] (03PS1) 10: Added a wikidata-based "chart of the nuclides" under the periodic table app (link to /nuclides) [labs/tools/ptable] - 10https://gerrit.wikimedia.org/r/245591
[21:15:46] 6Labs: Two small instances: for WikiToLearn development - https://phabricator.wikimedia.org/T115282#1720069 (10CristianCantoro) 3NEW
[22:30:47] Change on 12wikitech.wikimedia.org a page Nova Resource:Tools/Access Request/Scoopfinder was modified, changed by Tim Landscheidt link https://wikitech.wikimedia.org/w/index.php?diff=192236 edit summary:
[22:31:55] Change on 12wikitech.wikimedia.org a page Nova Resource:Tools/Access Request/Vaggelisifa was modified, changed by Tim Landscheidt link https://wikitech.wikimedia.org/w/index.php?diff=192239 edit summary: