[00:02:23] Getting dab solver running involves about 3/4 of the work of a simple port of my tools [00:05:15] It's kind of specifically architected for Toolserver, Web browsers, and the weird rules [00:06:19] If you need help porting, I'd be more than willing to spend some time on it [00:07:11] Have you ported TS query-killer yet? Yeah, it relies on that. [00:07:30] labs has its own query killer iirc [00:07:35] or did you need something specific? [00:07:48] in which case, I'd be willing to help port that. [00:10:27] hi [00:10:42] I am new to wikimedia tool labs [00:11:17] I am trying to access the tool labs but getting this connection error: debug1: Authentications that can continue: publickey,hostbased debug1: Next authentication method: publickey debug1: Offering RSA public key: /Users/sayantanm/.ssh/id_rsa debug1: Authentications that can continue: publickey,hostbased debug1: Trying private key: /Users/sayantanm/.ssh/id_dsa debug1: read PEM private key done: type RSA debug1: Authentications t [00:11:33] Really, I thought I read "Toolserver is the only place where read queries increase replication lag" someplace. It needs to support /* related.py LIMIT:3 NM */ [00:12:34] I wrote my own query-killer (TS's wasn't quick enough at killing), but screen daemons aren't allowed here [00:12:49] I regenerated my RSA keys but no luck [00:17:37] Dispenser: what does the full query look like? [00:18:14] That's just the relevant part [00:18:27] can your daemon just run in the grid? [00:22:46] May be able to. It was in an auto-restarting bash loop and printed out the current status in case I needed to diagnose things. [01:57:46] 3Wikimedia Labs / 3deployment-prep (beta): Template search API request in Beta Labs finds nothing - 10https://bugzilla.wikimedia.org/66981#c3 (10James Forrester) 5REOP>3NEW a:3None Yes, appears to be the same Beta Labs flakiness again; moving there. [04:43:56] Where is the socket for mysql located?
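The `/* related.py LIMIT:3 NM */` convention and home-grown query-killer discussed above can be sketched roughly as follows. This is a hedged illustration, not Dispenser's actual daemon: the comment grammar is inferred from the one quoted snippet, the limit unit (minutes) is an assumption, and in a real deployment the tuples would come from `information_schema.PROCESSLIST`, with kills issued via `KILL QUERY <id>`.

```python
import re

# Assumed Toolserver-style convention: a query may embed a comment such
# as /* related.py LIMIT:3 NM */ giving a per-query time limit, taken
# here to be in minutes (an assumption, not documented in the log above).
LIMIT_RE = re.compile(r"/\*.*?LIMIT:(\d+).*?\*/")


def parse_limit_minutes(sql, default=10):
    """Return the embedded per-query limit in minutes, or `default`."""
    m = LIMIT_RE.search(sql)
    return int(m.group(1)) if m else default


def overdue(queries):
    """Yield (thread_id, sql) for queries that have outrun their limit.

    `queries` is an iterable of (thread_id, seconds_running, sql) tuples,
    e.g. as read from information_schema.PROCESSLIST; a real killer would
    then issue KILL QUERY for each yielded thread id.
    """
    for thread_id, seconds, sql in queries:
        if seconds > parse_limit_minutes(sql) * 60:
            yield thread_id, sql
```

Running the scan in a loop (instead of a screen daemon) is what would let it live on the grid as a continuous job.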
[05:11:04] never mind, found it [09:06:22] !log deployment-prep Adding cawiki and eswiki for cxserver testing {{gerrit|Ibbcbd43d43f878099788518d9e952fa7b20d8ca3}} [09:06:24] Logged the message, Master [09:10:39] !log deployment-prep used addwiki.php to create the wiki. manually triggered the Jenkins job that update the databases https://integration.wikimedia.org/ci/job/beta-update-databases-eqiad/2319/ [09:10:40] Logged the message, Master [09:11:16] kart_: the dbs are upgrading [09:11:16] https://integration.wikimedia.org/ci/job/beta-update-databases-eqiad/2319/label=deployment-bastion-eqiad,wikidb=eswiki/console [09:11:20] https://integration.wikimedia.org/ci/job/beta-update-databases-eqiad/2319/label=deployment-bastion-eqiad,wikidb=cawiki/console [09:11:31] nice :) [09:12:19] beer for hashar! [09:16:04] the stupid addwiki script did not create the index [09:16:05] ba [09:17:27] :( [09:22:38] !log deployment-prep Blow up ElasticSearch indices for cawiki and eswiki with: mwscript extensions/CirrusSearch/maintenance/updateOneSearchIndexConfig.php --wiki cawiki --startOver --indexType content && mwscript extensions/CirrusSearch/maintenance/updateOneSearchIndexConfig.php --wiki cawiki --startOver --indexType general [09:22:40] Logged the message, Master [09:24:14] !log deployment-prep Reindexed ElasticSearch index for cawiki/eswiki with: mwscript extensions/CirrusSearch/maintenance/forceSearchIndex.php --wiki {cawiki,eswiki} --batch-size=50 [09:24:16] Logged the message, Master [09:24:18] kart_: should be good now [09:24:31] http://ca.wikipedia.beta.wmflabs.org/wiki/Pàgina_principal http://es.wikipedia.beta.wmflabs.org/wiki/Página_principal [09:40:40] hasteur: Thanks, again! 
[10:09:13] DispenserAFK: printing out things land up in the 'out' log on the tool's home dir [11:12:00] 3Wikimedia Labs / 3tools: Provide wiki metadata in the databases similar to toolserver.wiki - 10https://bugzilla.wikimedia.org/48626 (10merl) [11:12:01] 3Wikimedia Labs / 3tools: Missing Toolserver features in Tools (tracking) - 10https://bugzilla.wikimedia.org/58791 (10merl) [11:12:02] 3Wikimedia Labs / 3tools: Add is_sensitive to meta_p.wiki - 10https://bugzilla.wikimedia.org/67476 (10merl) 3NEW p:3Unprio s:3enhanc a:3Marc A. Pelletier on toolserver.wiki there was a column is_sensitive. This is needed by tools parsing user input or langlinks. E.g. first letter of titles for de... [11:14:53] !log [11:21:46] 3Wikimedia Labs / 3tools: Add some of the missing tables in commonswiki_f_p - 10https://bugzilla.wikimedia.org/59683#c12 (10nosy) Seems to be a general problem. [11:57:21] How can I restrict a query to pages created within the last month? Whatever I do seems to be slow. "SELECT page_title, (SELECT COUNT(*) FROM imagelinks WHERE il_to = page_title) AS imagelinks, (SELECT COUNT(*) FROM pagelinks WHERE [11:57:25] pl_namespace = 6 AND pl_title = page_title) AS links FROM page WHERE page_namespace = 6 AND page_is_redirect = 1 HAVING imagelinks + links <= 1;" [12:07:01] 3Wikimedia Labs / 3tools: Add some of the missing tables in commonswiki_f_p - 10https://bugzilla.wikimedia.org/59683#c13 (10Silke Meyer (WMDE)) Any chance we get this resolved? Please increase priority so that Merl can finish his migration, too. [12:07:34] meh [12:13:31] a930913: lazy man's solution: first look up the page id of a page created a month ago (from revision, I guess?) then use that as min(page_id)? [12:14:15] valhallasw: Now /that/ is programming :D [12:14:40] well, let's see if it's fast enough first :-p [12:16:52] valhallasw: Oh, 4 seconds to go back a day :( [12:18:09] a930913: maybe use recentchanges?
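valhallasw's "lazy man's" trick above can be sketched like this, assuming page IDs grow roughly with creation time. The sqlite tables and rows are invented stand-ins for the MediaWiki `page` and `revision` tables; note the caveat in the comment — on a real replica, old pages also get new revisions, so you would anchor on a page you *know* was created at the cutoff, as the suggestion says.

```python
import sqlite3

# Toy stand-ins for the MediaWiki `page` and `revision` tables; the rows
# are invented and each page here has exactly one (creation) revision,
# which is what makes the MIN() shortcut below valid in this toy data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE page (page_id INTEGER, page_title TEXT);
    CREATE TABLE revision (rev_page INTEGER, rev_timestamp TEXT);
    INSERT INTO page VALUES (1, 'Old'), (50, 'Borderline'), (90, 'New');
    INSERT INTO revision VALUES (1, '20140501000000'),
                                (50, '20140601000000'),
                                (90, '20140620000000');
""")

cutoff_ts = "20140601000000"  # one month before "now" in this example

# Step 1: one cheap lookup to get a page id near the cutoff.
(min_page_id,) = conn.execute(
    "SELECT MIN(rev_page) FROM revision WHERE rev_timestamp >= ?",
    (cutoff_ts,)).fetchone()

# Step 2: the expensive query then only range-scans page_id upward,
# instead of filtering every row by timestamp.
rows = conn.execute(
    "SELECT page_title FROM page WHERE page_id >= ? ORDER BY page_id",
    (min_page_id,)).fetchall()
```

As the log notes, this is only a heuristic lower bound, which is why the conversation moves on to `recentchanges`.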
[12:40:56] a930913: I would also suggest what valhallasw says but also use rev_prev or whatever it's called [12:41:37] Coren: enwiki is about 45 minutes lagging [13:04:00] Betacommand: Help? :p [13:06:10] a930913: sure if you give me some help :P [13:06:24] * a930913 hands Betacommand some help. [13:06:53] a930913: Give me a sec and I'll dig up the RC query [13:07:14] a930913: there is a new query at VPT about a whois [13:08:16] http://tools.wmflabs.org/betacommand-dev/cgi-bin/SIL?ip= 101.60.209.117 or more generic http://tools.wmflabs.org/betacommand-dev/SIL.html [13:08:25] http://tools.wmflabs.org/betacommand-dev/cgi-bin/SIL?ip=101.60.209.117 or more generic http://tools.wmflabs.org/betacommand-dev/SIL.html [13:08:54] a930913: can you post a note for me? [13:12:08] a930913: select * from recentchanges where rc_last_oldid = 0 ; gives you all new pages in the last 30 days [13:12:30] 3Wikimedia Labs / 3deployment-prep (beta): Import some content on beta.wmflabs.org wikis - 10https://bugzilla.wikimedia.org/66402 (10Antoine "hashar" Musso) [13:16:38] Betacommand: I should stop this whole meatsocking attribution thing, and just claim the glory for myself :p [13:16:54] a930913: might cause fewer headaches [13:17:05] Betacommand: I have a query, and I want to limit it to pages created in the last month. [13:17:09] a930913: I really don't care, as long as that info is given [13:17:19] a930913: see above [13:17:36] that query just gives new pages in the last month [13:17:38] I don't want a list of new pages though :/ [13:17:45] Oh! [13:17:52] You mean like that. [13:17:59] a930913: join it to your query [13:18:10] Sorry, I was still thinking of valhallasw's hack :) [13:18:25] a930913: which is what I used :P [13:18:51] Betacommand: Erm, what type of join is that? [13:19:18] pastebin your current query [13:19:42] http://tools.wmflabs.org/betacommand-dev/SIL.html doesn't provide full WHOIS details... [13:20:06] Glaisher: what is it missing? [13:20:22] lots...
[13:20:24] http://whois.domaintools.com/127.0.0.1 [13:20:37] something similar to overlordq's would be nice [13:20:42] Betacommand: https://en.wikipedia.org/wiki/Wikipedia:Database_reports/Unused_file_redirects/Configuration [13:22:31] a930913: what are you trying to do [13:23:41] Betacommand: Limit that query to those created within the $last_month. [13:25:34] Coren: What should my licensing be? :p [13:25:37] a930913: Deleting unused file redirects is kind of silly. [13:26:02] I'm not sure what the virtue of that database report is. [13:26:06] a930913: Public domain! [13:26:20] Carmela: It's just cruft, no? [13:26:25] a930913: That's really a personal choice. Most people either go GPL of some version or the 2-clause BSD, with dozens of options in between. [13:26:55] a930913: Maybe. [13:27:10] Personally, I tend to slap a 2-clause BSD on standalone stuff and GPL on large collaborative projects. [13:27:27] Coren: What licence are you going to assign to all my works? :p [13:27:40] I use public domain for most code. Copyright is evil. [13:28:14] Carmela: My code is essentially public domain, in that I have never specified any limitation :p [13:28:31] That's not quite how it works. [13:28:33] a930913: I'm not going to assign any. I'm talking with Legal about providing a default license for stuff that doesn't *have* an explicit one in Labs, but you can always override that default with licenses for any file, project or globally just by stating so. [13:28:49] Works are assumed to be copyrighted unless explicitly marked otherwise. [13:29:23] Coren: I haven't explicitly licensed any :p [13:29:24] In the U.S., anyway. [13:29:37] a930913: You should, regardless of whether we implement a default or not. [13:29:42] ^ [13:29:56] Coren: I know, but I can never decide which :) [13:30:05] So if you chose for me... :D [13:30:10] Heh. [13:31:03] Carmela: It's marked as open source, I've given the code to anybody who asked, with no limitations specified. So where does that put it?
[13:31:43] a930913: Nowhere specific, and it puts reusers in trouble potentially. :-) [13:32:07] a930913: If you really don't care what people do with your code, the simplest is the 2-clause BSD: http://opensource.org/licenses/BSD-2-Clause [13:34:27] a930913: http://pastebin.com/8TE87U8A [13:35:21] coren enwiki replication is stuck [13:39:25] Betacommand: <3 [13:40:50] a930913: had to look up rc_cur_id, not sure why it's called that [13:41:00] rc_page_id [13:41:09] would be a better name [13:42:24] * a930913 hands Betacommand a trout to slap somebody with. [13:42:43] a930913: I have enough of them [13:42:59] a930913: Remember when I used to hand them out on AN/ANI? [14:23:31] 3Wikimedia Labs / 3tools: Add some of the missing tables in commonswiki_f_p - 10https://bugzilla.wikimedia.org/59683#c14 (10Marc A. Pelletier) There's a complicated issue with schema variation between databases that is causing problems. The one caused by rc_moved_to_ns is gone, I'm working on the other one n... [14:37:36] !log tools replication for enwiki is halted, current lag is at 9876 [14:37:38] Logged the message, Master [14:45:47] a930913: if you're looking for a license, I recommend WTFPL [14:48:04] YuviPanda: :) [15:12:39] Hmm, [[toollabs:external_link_made_to_look_internal/?http://google.com|Google]] ? [15:25:53] YuviPanda: The WTFPL isn't (strangely enough) a proper open source license. [15:29:07] legoktm: Make ^ for {{noping}}? [15:30:21] Oh, noping got Lua'd. [15:35:28] Coren: Can a tool have a colon in its name? [15:36:03] a930913: No; [a-zA-Z_][a-zA-Z_0-9]* [15:36:26] Because [[toollabs:http://foo.bar]] -> /data/project/http:/public_html//foo.bar :p [15:36:29] Though I would recommend against using uppercase. [15:37:18] Dispenser: o/ [15:37:47] * a930913 mooos at Dispenser ;) [15:38:00] hi a930913 [15:38:17] * a930913 sees what Dispenser did. :) [15:38:50] Dispenser: Can I see a photo of your 24TB array?
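The join Betacommand describes above (recentchanges keeps roughly 30 days of events, `rc_last_oldid = 0` marks page creations, and `rc_cur_id` carries the page id) might look like the following against a toy schema. The sample rows are invented and the real MediaWiki tables are far wider; this is a sketch of the pattern, not his pastebinned query.

```python
import sqlite3

# Minimal stand-ins for the MediaWiki `page` and `recentchanges` tables,
# keeping only the columns the join needs. Sample rows are invented:
# page 2 was created recently (rc_last_oldid = 0), page 1 merely edited.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE page (page_id INTEGER, page_title TEXT,
                       page_namespace INTEGER, page_is_redirect INTEGER);
    CREATE TABLE recentchanges (rc_cur_id INTEGER, rc_last_oldid INTEGER);
    INSERT INTO page VALUES (1, 'Old_redirect.jpg', 6, 1),
                            (2, 'New_redirect.jpg', 6, 1);
    INSERT INTO recentchanges VALUES (2, 0), (1, 1234);
""")

# Joining on rc_cur_id = page_id with rc_last_oldid = 0 restricts any
# page query to pages created within the recentchanges window (~30 days).
rows = conn.execute("""
    SELECT page_title
    FROM page
    JOIN recentchanges ON rc_cur_id = page_id AND rc_last_oldid = 0
    WHERE page_namespace = 6 AND page_is_redirect = 1
""").fetchall()
```

The same `JOIN ... ON rc_cur_id = page_id AND rc_last_oldid = 0` clause drops straight into the slow unused-file-redirects query quoted earlier.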
:3 [15:41:34] Only got 6 TB on my home computer and 81 % full :-( [15:42:07] Dispenser: So where is the other 19TB? [15:42:41] Not yet purchased and I might have to revise that 24 TB figure upward [15:43:30] Dispenser: I thought you had 24TB of data? [15:44:10] Sorry [15:44:50] The request was for 24 TB to improve Checklinks and Reflinks [15:48:40] Can I help you revise a plan, so it can be submitted to Coren to review, and hopefully approve? Because, while you might need 24TB of data one day, you don't actually need that much yet, do you? [15:51:52] Likely take 3 months to fill. I'm sourcing numbers (likely deduplication and compression rates). I'm thinking of going independent, already got Reflinks somewhat running in a VM. [15:52:49] Dispenser: how are you forwarding to your remote server from labs? [15:52:58] php script [15:53:59] There are other strategies and we could include a purge-non-essential data if Labs quickly need the space. [15:56:39] aude: Do you know much about the wikidata-test instance? Or can you refer me to someone who does? [15:56:58] why? [15:57:31] Two questions: 1) is it OK if I force a refresh of the puppet config there? and 2) does that box really need to be self-hosted puppet? [15:57:45] 1) maybe/probably [15:57:48] 2) is curiosity, not an action item [15:57:49] Dispenser: What about the legality of it? Running your own removes liability from wikimedia, but what's its status? [15:57:57] 2) need to make a role so it doesn't need to be self-hosted [15:58:11] do you need me to pull latest puppet [15:58:12] ? [15:58:27] oh, it has custom patches now? Paravoid just claimed that it didn't but I haven't verified for myself. [15:58:38] Anyway, yes please, if you're able to fetch/rebase/rerun puppet [15:58:40] that'd be great. 
[15:58:44] they are on github at the moment [15:59:14] idk if it makes sense maintaining them in wmf operations/puppet [15:59:21] That box is running with a slightly broken exim config (which I don't understand the details of yet, but it should be easy to fix) [15:59:25] ok [15:59:51] we may want to use labsvagrant (probably) at some point [15:59:55] it's totally fine for it to be self-hosted. Just wondering if it was a mistake or an obsolete change. [16:00:09] ok [16:00:11] one sec [16:00:28] aude: yeah, I'm trying to train myself to use labsvagrant for my labs MW installs. Seems to work fine in most cases. [16:00:44] we don't have vagrant for wikibase yet [16:00:55] that's the issue but definitely would like to have it [16:01:31] aude, if you don't mind, maybe update puppet on the other two as well? Looks like wdjenkins and wdjenkins-node1 are self-hosted as well. [16:01:42] a930913: In America anyone can sue for any reason, valid or invalid. There are plenty of web crawlers on the net; standard practice is to respect robots.txt. I imagine that most webmasters will be happy about some of the features I have planned. [16:01:46] might have to do tomorrow [16:01:56] aude: ok, no rush on the other two [16:01:58] how urgent? [16:02:16] i might need tobi to do those, since i'm afraid there might be important stuff overwritten [16:02:25] and he's on holiday [16:02:41] aude: that's fine -- you can do wikidata-test right now, right? [16:02:51] Coren: That's a point, google and archival sites cache this stuff, so why can't we? [16:02:53] ok [16:03:09] doing [16:03:20] many thanks! [16:03:53] thanks aude! [16:04:27] !log deployment-prep Updated scap to ff04431 [16:04:29] Logged the message, Master [16:04:40] Error 400 on SERVER: Could not find class passwords::mysql::phabricator [16:04:45] what to do? [16:04:54] huh. [16:05:00] Oh, you need to rebase the private repo too [16:05:03] a930913: There are a bazillion caveats in how it's done and why.
Like I said, none of it is insurmountable, but it does need to be addressed beyond "need X terabytes". Who has access, what are the archival criteria, who handles takedown notices (and how), etc. [16:05:07] it's in /var/lib/git/labs/private I believe. [16:05:12] k [16:10:35] Dispenser: Sounds quite reasonable, being an official body and all. I'm sure the community would approve of your application being fast tracked. What say you? [16:11:32] sounds good [16:14:17] Dispenser: So, moving forward, can we get the ball rolling at the end of the day? [16:14:50] Mind you, I can't write proposals [16:15:03] andrewbogott: done [16:15:22] there's some stuff incompatible in our puppet code but can deal with it later [16:15:34] and can probably remove the self thing [16:15:55] Coren: What format of a proposal are you looking for? [16:20:17] a930913: I'm already on that [16:20:32] Betacommand: On what? [16:20:48] the proposal [16:21:37] Betacommand: So what do we have so far? [16:21:55] Not much [16:22:05] waiting to hear from Dispenser [16:25:29] aude: actually, switching a host from self-hosted to master-hosted is… hard. Like, I don't think anyone has ever done it without breaking the instance. [16:25:47] So it's only a good option if you can safely build a fresh replacement. [16:25:49] * a930913 stares intently at Dispenser. [16:26:33] we can scrap it and make a new one [16:26:46] our puppet stuff is designed to just work [tm] [16:27:10] just not now and not tomorrow probably [16:27:45] aude: sure; there's no pressing reason to do that, it's just good housekeeping. [16:27:45] 3Wikimedia Labs / 3deployment-prep (beta): Template search API request in Beta Labs finds nothing - 10https://bugzilla.wikimedia.org/66981#c4 (10Chad H.) 5NEW>3RESO/FIX Config error on beta labs, fixed in gerrit 143902.
[16:45:25] !ping [16:45:25] !pong [16:49:11] Coren: enwiki replication has stopped [16:49:31] I'll go see (fully expecting catscan2 to be at it again) [16:49:49] looks like commonswiki is down too [16:52:10] transactions over 30ks killed; the DB should start catching up soon. [16:53:38] It may take a while though; what the catscan queries were blocking is a DDL query changing the schema, and that pauses replication. [16:55:13] Ah, and I see the same 'alter table' in the queue for commons. [16:55:56] * Coren expects the same will hold for a couple of other databases as well. Yeah, schema changes. [16:58:16] Coren: those are the only two with major lag [16:58:54] !log tools Coren: transactions over 30ks killed; the DB should start catching up soon. [16:58:56] Logged the message, Master [16:59:11] !log tools Coren: It may take a while though; what the catscan queries were blocking is a DDL query changing the schema, and that pauses replication. [16:59:12] Logged the message, Master [17:00:05] Betacommand: Possibly the schema changes already went through on the other databases; they're not as problematic when the DB is much smaller. [17:11:13] Dispenser: Dude, you can't just redirect from a Tool Labs tool to a server you own without an interstitial explaining that by proceeding forward users are giving you their IP and browser info. [17:11:40] https://wikitech.wikimedia.org/wiki/Wikitech:Labs_Terms_of_use#If_my_tools_collect_Private_Information... [17:11:49] ^^ this has the message you have to display [17:12:32] Well, actually, you need to adapt it somewhat because it's not a Labs server but yours, but you get the point. [17:14:03] I'm not getting IP info [17:14:17] And are you saying hot linking is forbidden? [17:14:45] in a way, yes. [17:14:55] you can't use Google Analytics or Google Maps, for example. [17:15:07] or linking to favicons [17:15:09] or hot linking anything from a 3rd party server (including yours?)
[17:15:24] that as well, assuming they are in 3rd party domains, directly from the client [17:17:15] Dispenser: You have a web server -> you are getting IP info (and logging it, by default). So yeah, hotlinking is specifically forbidden because it discloses third parties' IPs. [17:17:48] Once a Labs tool, always a Labs tool? Cuz there are no redirects out [17:18:33] Dispenser: You're not getting the point. People follow a link to the Tool Labs, which has the WMF privacy policy, and they end up quietly disclosing their IPs in violation of that policy. [17:19:13] Once a Labs tool, always a Labs tool [17:19:15] Dispenser: You /can/ redirect users to a server you own, but you gotta warn them first. [17:20:01] (I.e.: beyond this point the WMF privacy policy doesn't apply and you're giving your IP to something not the WMF) [17:21:04] Well it's a proxy, so unless everyone's IP address is 208.80.155.255 there shouldn't be a problem [17:21:45] Coren: I think because it's all done on the back end and all the tool sees is the webserver address info [17:22:03] Ah, a proxy would be okay if there are no inline images, etc, that hit your server directly. [17:22:23] (I.e.: all relative paths, or rewriting). [17:23:21] So you're telling me my tools are boned anyway [17:23:36] Dispenser: no [17:24:05] No, I'm telling you that you can't quietly break the privacy policy without warning the editors that follow a link to your tool. If that doesn't happen, then you're golden. [17:24:33] Because they hotlink favicons, open web pages in iframes, use the Google Charts API [17:24:43] Dispenser: as long as IP info is stripped before it reaches your server (which I think is already done) it shouldn't be an issue [17:25:06] There's a different, interesting question, about whether a tool that depends on a service you keep closed source on a third party server complies with the open source requirement, but I'm not interested in debating that one.
[17:25:21] Coren: I'm in the process of writing that proposal; keep in mind it's going to be rough and may lack some [17:25:49] And if someone uses the Google Charts API in a way that the endusers' IPs end up at Google on project please tell me, because that's very very prohibited. [17:25:57] Dispenser: well, you could use wmflabs as a proxy, by getting the data from your server when the user requests a page on wmflabs [17:26:03] Coren: Sure, google is not open source, but I can still use their results can't I? [17:26:30] but the question is how much sense it makes to use a wmflabs domain in that case [17:26:31] specific details as those will be discovered as we develop the process [17:26:31] a930913: depends on how you use them. if you're doing a serverside API call that is ok, but doing a *client* side call is a no-no, I think. [17:26:33] a930913: If you do it in a way that obeys /their/ TOS sure. Just don't send the endusers there quietly. :-) [17:27:05] Betacommand: Obviously; there's nothing wrong with prototypes and exploratory coding. In fact, that's kinda expected. [17:27:13] Dispenser: and in the end, it's all allowed, as long as you show the user a warning before doing that [17:27:24] You have to think of Dispenser's tools as a large API :p HTTP request goes in, data comes out :) [17:28:08] valhallasw: Allowed but /discouraged/ unless you have to; but that's philosophical not technical. But yeah, if you have to do things that disclose client info you gotta warn. [17:29:44] Coren: grep api.charts.google.com / -R :-) [17:36:33] whoops, grep chart.apis.google.com / -R [18:01:21] chrismcmahon et al: beta labs failure "http://en.wikipedia.beta.wmflabs.org/wiki/Talk:Sandbox" [18:01:33] DB error Error: 1146 Table 'centralauth.renameuser_status' doesn't exist (10.68.16.193) [18:01:47] Coren: ^^ [18:02:17] spagewmf: Coren there were some db changes being rolled out a little while ago I think?
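The "all relative paths, or rewriting" proxy approach discussed above can be sketched minimally: the Labs tool fetches pages from the backend server-side, then rewrites absolute references so browsers never contact the backend directly and client IPs stay on-project. `BACKEND` and the `/proxy/` prefix are hypothetical names, and a robust version would parse the HTML rather than do string replacement.

```python
# Hypothetical backend that the Labs tool fetches from server-side;
# nothing in the log above names the real host.
BACKEND = "http://backend.example.org"


def rewrite(html, prefix="/proxy/"):
    """Route every absolute reference to BACKEND through the local proxy.

    After this, inline images, iframes, favicons, etc. resolve to paths
    on the Labs tool itself, so the client's browser (and IP) never
    touches the third-party server.
    """
    return html.replace(BACKEND + "/", prefix)
```

Anything the rewrite misses (or any third-party widget such as client-side analytics) would still leak client IPs, which is exactly the case the interstitial warning is required for.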
[18:03:16] Wait, if every external link /was/ cached, any deadlink could be replaced with the cached version :o [18:03:19] chrismcmahon: I've seen a couple schema changes go through recently; I don't know where in the process the beta labs lives. [18:07:03] 3Wikimedia Labs / 3deployment-prep (beta): db error on beta labs "centralauth.renameuser_status' doesn't exist" - 10https://bugzilla.wikimedia.org/67485 (10Chris McMahon) 3NEW p:3Unprio s:3major a:3None Seen at http://en.wikipedia.beta.wmflabs.org/wiki/Talk:Sandbox [18:12:30] 3Wikimedia Labs / 3deployment-prep (beta): db error on beta labs "centralauth.renameuser_status' doesn't exist" - 10https://bugzilla.wikimedia.org/67485#c1 (10Tomasz W. Kozlowski) p:5Unprio>3Normal Also affecting http://commons.wikimedia.beta.wmflabs.org/ A database query error has occurred. This ma... [18:37:50] Coren: what address should I send the proposal to? [18:39:15] Betacommand: It'd probably be even better on-wiki somewhere; easy to point people at and easy to comment. [18:39:33] Otherwise, you can mail it to me (marc@wikimedia.org) [18:40:21] Coren: sent [18:41:07] I'll see about finding a good on-wiki place for it later this afternoon. You okay with it? [18:42:10] yeah [18:42:22] It's a rough draft [18:42:37] kk. I'll post it and point the relevant people at it; I'm pretty sure many people in Engineering will offer refinements to it. [18:43:10] Coren: tools should really have their own wiki [18:43:31] merging into wikitech is confusing as hell at times [18:43:39] Betacommand: It's been considered; just not beyond the "toying with the idea" phase yet. [18:43:59] and using a combination of meta and wikitech...... [18:44:32] Betacommand: Also, this proposal is probably going to revive the archive.org discussion which, IIRC, kinda stalled because we didn't really have the resources to develop it fully back then.
[18:45:06] Coren: this is basically doing just that [18:45:28] Along with a crapton of metadata harvesting [18:45:52] Yeah, but by using the WMF resources to host the mirror, which may or may not be the best scenario. Restarting that discussion is worthwhile in itself IMO [18:46:35] Coren: it does solve a lot of issues though if we bring it in house [18:47:16] Potentially yes. Personally, I like the idea, but it has impact and (opportunity) cost and requires consideration. [18:48:08] Coren: drop me an email with the link in case I'm afk, I'm going to be busy [18:48:18] kk [18:59:38] !log deployment-prep manually created centralauth.renameuser_status table [18:59:41] Logged the message, Master [19:05:05] If I'm correctly reading the rules, Google Analytics is allowed if the Private Information message is shown milliseconds before it's loaded [19:05:48] oh god, techies trying to find 'smart ways to circumvent rules' [19:06:28] Dispenser: No [19:09:39] I've never gotten a chance to use Google Analytics. If I get my users to click "I agree to Google Analytics snooping on me" can I use it then? [19:11:18] Dispenser: Yes, that's the point of the rule. [19:11:31] Dispenser: yes if you notify them that their IP address is being exposed [19:11:49] Dispenser, what are you using Google Analytics for? [19:12:23] Nothing yet! But I'd sure like to know what it provides! [19:12:54] Dispenser: not much [19:13:33] <^d> I always enjoy when I visit a site and Ghostery shows me that I'm not contributing to people's google or facebook analytics :) [19:15:53] ^d: :) me too. [19:16:40] <^d> Also, I get a warm fuzzy feeling when it visits WMF wikis and shows that there's nothing to block :D [19:17:33] <^d> wikia has 8 :( [19:17:52] I don't. I get a disturbing feeling they're using something custom and more invasive [19:18:33] <^d> Well, maybe $randomSite is. [19:18:36] <^d> WMF wikis don't. [19:18:58] yeah, we don't even have 'how many people have visited the site?'
info [19:19:16] <^d> We do, based on the aggregate squid logs. [19:19:33] <^d> *varnish, whatev [19:19:33] ^d: not very accurate, especially for mobile [20:46:16] 3Wikimedia Labs / 3tools: Move wiki.toolserver.org to WMF - 10https://bugzilla.wikimedia.org/60220#c33 (10Krinkle) I too would recommend against setting up a new wiki. A wiki named "Toolserver" certainly wouldn't make sense as that as of late is no longer going to be operational. Perhaps set up its content a... [20:47:25] I'm getting an odd permission error when trying to copy a file from my tool's directory to my own, anyone around who could help with that? [20:51:18] Nettrom: What file are you trying to copy from where to where? [20:52:06] scfc_de: opentasks.py, from /data/project/suggestbot/projects/opentask to /home/nettrom [20:52:58] scfc_de: somehow I get a "Permission denied" error when trying to do that [20:53:23] Nettrom: Are you executing the copy as your user account or your tool account? The former should work, the latter not. [20:53:34] scfc_de: as my user account [20:53:52] hmm... is the problem that I've used "become" to be the tool account first? [20:53:56] let me check [20:54:14] nope, still permission denied [20:56:18] Nettrom: Ah, okay, one moment. [20:58:09] So the permission error is that user nettrom can't read the file. nettrom is member of tools.suggestbot and that group has read access for file and x rights for all directories leading up to it. Strange. [20:58:50] scfc_de: that's what I found too and why I couldn't understand it [21:00:18] User nettrom can read /data/project/suggestbot/access.log ... [21:00:55] And /data/project/suggestbot/logs/inlink-update.err ... [21:01:40] And it can read the containing directory, so the perm problem should be with the file only?! [21:02:40] Nettrom: Ah, you went for "foo" for testing :-). 
[21:02:40] yep, I can create files in the directory [21:02:46] :D [21:03:46] and even if I take the 'foo' file I can edit it with the right permissions [21:03:51] must be something weird with this file [21:04:21] copy the file to a new filename and I can read that [21:04:48] Yep, stat for /data/project/suggestbot/projects/opentask/{opentasks.py,test} are the same (0664/51172/51172), yet it can only read test. Perhaps NFS does some weird negative caching? [21:05:33] yeah, I have no idea... but looks like the copy is readable just fine, so I solved it that way [21:05:33] thanks for looking into it scfc_de ! [21:07:08] Nettrom: No problem. If the issue disappears in an hour or so, just drop a note here, then I think it might have been the NFS server caching some failure. Otherwise I have no idea why. [21:17:01] 3Wikimedia Labs / 3tools: Move wiki.toolserver.org to WMF - 10https://bugzilla.wikimedia.org/60220#c34 (10Tim Landscheidt) Eh, the intention was always to set it up as a public, (almost) read-only archive. Of course it would describe obsolete infrastructure, just as https://wikimania2005.wikimedia.org/ is o... [21:20:29] Coren: on labstore1001, is /usr/local/sbin/sync-exports a hotfix or is it part of a package or a manifest someplace? [21:20:52] ... it /should/ be in a manifest. [21:21:32] and it's clearly not. I probably have the commit stashed in my repo. @&#^ [21:22:04] :) [21:23:08] Yeah, there it is. Hang on, lemme swap it out with my WIP and review [21:24:26] * andrewbogott hangs on [21:25:00] Coren: Sorry to interrupt, I was just going to offer to puppetize it :) [21:25:25] Yeah, don't worry about it, I already have it stashed. It's just a short rebase away. [22:21:20] Coren: seems like the beta labs db has been read-only for some time now, do you know an ETA for that to be over? 
[22:21:40] (03PS1) 10Yuvipanda: Move analytics product into #wikimedia-analytics [labs/tools/pywikibugs] - 10https://gerrit.wikimedia.org/r/144075 [22:21:48] milimetric: ^ thought I should let you know :) [22:23:14] No; I know of schema changes (possibly still in progress) but not their nature nor duration. legoktm did/does them afaik, asking him might help? [22:23:28] chrismcmahon: ^^ [22:24:48] thanks Coren [22:26:39] valhallasw: I don't know your gerrit id :| https://gerrit.wikimedia.org/r/#/c/144075/ [22:26:58] valhallasw: aha, foudn you [22:27:01] added as reviewer [22:39:12] oh, thanks for letting me know Yuvi, I wonder if the new noise will be annoying [22:40:38] (03CR) 10Merlijn van Deen: [C: 032 V: 032] Move analytics product into #wikimedia-analytics [labs/tools/pywikibugs] - 10https://gerrit.wikimedia.org/r/144075 (owner: 10Yuvipanda) [22:41:09] YuviPanda: ^ done [22:41:09] milimetric: you'll find out soon enough :) Am just cleaning up -dev, since you guys seem to be using it for stories as well [22:41:30] milimetric: and there's no analytics people on -dev anyway, and people on -dev don't know much about that either [22:41:37] * valhallasw is off to bed [22:41:40] valhallasw: ty [22:41:47] but YuviPanda, you have +2 rights there, right? [22:42:28] merging simple stuff is fine with me, as long as you fix it when it breaks :-p [22:42:44] valhallasw: yeah :) [22:42:47] * valhallasw is off to bed [22:43:15] YuviPanda: no, totally fair, I was feeling bad about the noise there, but yeah, we're using Bugzilla for Everything now, it's crazy [22:43:39] milimetric: :) yeah. we can suppress things easily based on conditionals if you want. if you see the patch, it's just a plain python lambda [22:44:12] yep, i like it [22:44:14] thanks again [22:44:53] milimetric: :) yw! 
[22:45:02] milimetric: feel free to poke me or valhallas.w for review/merge/deploys [22:47:08] lol stop pinging the poor guy he's been trying to get to sleep for like 6 minutes [22:48:37] milimetric: I didn't ping him! I put a . there, to not ping :) [23:28:40] I love how the upstream proxy hijacks my custom 404 and 503 error messages [23:34:17] chrismcmahon: errr, my schema change took about half a second, I was just adding a table. I don't think that's caused by me... [23:36:46] legoktm: bummer. thanks for the notice [23:51:11] sge can't start job, I got: [23:51:16] Unable to run job: error writing object "2054684" to spooling database [23:51:16] aborting transaction (rollback) [23:51:17] job 2054684 was rejected cause it couldn't be written. [23:51:17] Exiting. [23:55:47] mh, i have the same