[00:33:42] gifti: hashar is working on that IIRC.
[00:36:56] aha
[03:42:18] Project CirrusSearch-en.wikipedia.beta.wmflabs.org-linux-firefox build #11: STILL FAILING in 3 min 21 sec: https://integration.wikimedia.org/ci/job/CirrusSearch-en.wikipedia.beta.wmflabs.org-linux-firefox/11/
[03:42:19] * neverett: Don't add a filter if not needed
[03:42:19] * neverett: Quoted searches with accents only find accented
[08:53:12] Yippee, build fixed!
[08:53:12] Project CirrusSearch-en.wikipedia.beta.wmflabs.org-linux-firefox build #12: FIXED in 31 sec: https://integration.wikimedia.org/ci/job/CirrusSearch-en.wikipedia.beta.wmflabs.org-linux-firefox/12/
[09:01:27] Project UploadWizard-api-commons.wikimedia.beta.wmflabs.org build #1: FAILURE in 33 sec: https://integration.wikimedia.org/ci/job/UploadWizard-api-commons.wikimedia.beta.wmflabs.org/1/
[09:04:46] Project CirrusSearch-en.wikipedia.beta.wmflabs.org-linux-firefox build #13: FAILURE in 3 min 8 sec: https://integration.wikimedia.org/ci/job/CirrusSearch-en.wikipedia.beta.wmflabs.org-linux-firefox/13/
[09:06:18] Yippee, build fixed!
[09:06:18] Project CirrusSearch-en.wikipedia.beta.wmflabs.org-linux-firefox build #14: FIXED in 33 sec: https://integration.wikimedia.org/ci/job/CirrusSearch-en.wikipedia.beta.wmflabs.org-linux-firefox/14/
[09:11:34] Project CirrusSearch-en.wikipedia.beta.wmflabs.org-linux-firefox build #15: SUCCESS in 2 min 17 sec: https://integration.wikimedia.org/ci/job/CirrusSearch-en.wikipedia.beta.wmflabs.org-linux-firefox/15/
[09:28:35] Project CirrusSearch-test2.wikipedia.org-linux-firefox build #1: SUCCESS in 43 sec: https://integration.wikimedia.org/ci/job/CirrusSearch-test2.wikipedia.org-linux-firefox/1/
[09:48:52] Project browsertests-CirrusSearch-test2.wikipedia.org-linux-firefox build #1: FAILURE in 5.7 sec: https://integration.wikimedia.org/ci/job/browsertests-CirrusSearch-test2.wikipedia.org-linux-firefox/1/
[09:48:55] Project browsertests-CirrusSearch-en.wikipedia.beta.wmflabs.org-linux-firefox build #1: FAILURE in 5.8 sec: https://integration.wikimedia.org/ci/job/browsertests-CirrusSearch-en.wikipedia.beta.wmflabs.org-linux-firefox/1/
[10:44:52] Project browsertests-Translate-meta.wikimedia.org-linux-firefox build #1: SUCCESS in 1 min 8 sec: https://integration.wikimedia.org/ci/job/browsertests-Translate-meta.wikimedia.org-linux-firefox/1/
[10:49:03] Project browsertests-UniversalLanguageSelector-sandbox.translatewiki.net-linux-firefox build #1: SUCCESS in 1 min 6 sec: https://integration.wikimedia.org/ci/job/browsertests-UniversalLanguageSelector-sandbox.translatewiki.net-linux-firefox/1/
[10:51:25] Project browsertests-TwnMainPage-sandbox.translatewiki.net-linux-firefox build #1: FAILURE in 19 sec: https://integration.wikimedia.org/ci/job/browsertests-TwnMainPage-sandbox.translatewiki.net-linux-firefox/1/
[10:58:26] Project browsertests-TwnMainPage-sandbox.translatewiki.net-linux-firefox build #2: STILL FAILING in 1 min 13 sec: https://integration.wikimedia.org/ci/job/browsertests-TwnMainPage-sandbox.translatewiki.net-linux-firefox/2/
[10:58:27] zeljko.filipin: All Selenium tests should run in Firefox
[11:10:46] Project browsertests-TwnMainPage-sandbox.translatewiki.net-linux-firefox build #3: STILL FAILING in 5 min 41 sec: https://integration.wikimedia.org/ci/job/browsertests-TwnMainPage-sandbox.translatewiki.net-linux-firefox/3/
[11:18:44] Project browsertests-Translate-sandbox.translatewiki.net-linux-firefox build #1: FAILURE in 1 min 13 sec: https://integration.wikimedia.org/ci/job/browsertests-Translate-sandbox.translatewiki.net-linux-firefox/1/
[11:30:13] Project browsertests-UniversalLanguageSelector-commons.wikimedia.beta.wmflabs.org-linux-firefox build #1: FAILURE in 2 min 2 sec: https://integration.wikimedia.org/ci/job/browsertests-UniversalLanguageSelector-commons.wikimedia.beta.wmflabs.org-linux-firefox/1/
[11:35:09] Project browsertests-Translate-sandbox.translatewiki.net-linux-firefox build #2: STILL FAILING in 14 min: https://integration.wikimedia.org/ci/job/browsertests-Translate-sandbox.translatewiki.net-linux-firefox/2/
[11:37:56] hashar, why does mw-jenkinsbot output here?
[11:44:15] Project browsertests-UniversalLanguageSelector-commons.wikimedia.beta.wmflabs.org-linux-firefox build #2: STILL FAILING in 13 min: https://integration.wikimedia.org/ci/job/browsertests-UniversalLanguageSelector-commons.wikimedia.beta.wmflabs.org-linux-firefox/2/
[11:48:22] valhallasw: ah yeah, we should probably drop it
[11:48:27] sorry
[11:49:21] valhallasw: done (hopefully)
[11:49:46] hashar: \o/
[11:50:10] now, let's take a further look at wikibugs 2.0
[11:50:39] oh, I ignored it long ago
[11:50:44] I am not even in #mediawiki nowadays :D
[11:51:02] I guess wikibugs 2.0 can be revisited whenever we switch to phabricator
[11:52:53] Is that going to happen anytime soon? I thought it was on the long-long timescale
[11:52:57] (i.e. > 1 year)
[11:59:58] valhallasw: the RFC is still being discussed
[12:00:14] then we need to figure out a .plan and allocate staff resources to the phabricator project
[12:00:24] it will definitely come in several phases
[12:00:34] the first one probably being to migrate Mingle
[12:00:44] then possibly importing bugzilla history
[12:00:53] and finally dropping Gerrit (if we ever do that)
[12:01:25] right. In any case, migrating wikibugs to something not-on-a-mailserver and not-perl is still a good idea ;-)
[12:03:04] yup
[12:03:16] I don't think Bugzilla has any way to emit events
[12:03:28] petan wrote something consuming the RSS feeds
[12:04:10] Bugzilla has an API you can query
[12:04:47] anyway, if one wants to start wikibugs 2, it should be done against phabricator
[12:05:43] https://bugzilla.wikimedia.org/show_bug.cgi?id=40970
[12:05:48] hashar: wikibugs-l ;-)
[12:06:05] yeah, we know the drama already when it comes to parsing changing email headers :-D
[12:06:07] maybe wait until there is a way to import bugs to phab
[12:06:19] but +1 for not reading mails
[12:06:21] combined with calls to the api
[12:06:48] the RSS also sounds interesting, though.
[12:07:15] here are all the wikibugs bugs: https://bugzilla.wikimedia.org/buglist.cgi?component=wikibugs%20IRC%20bot&list_id=306368&product=Wikimedia&resolution=---
[12:07:42] in any case, the IRC backend will be reusable
[12:08:19] don't waste your time honestly :-]
[12:08:24] https://secure.phabricator.com/book/phabdev/article/chatbot/ !
[12:09:06] hashar: well, push that output to redis and everyone will be happy :-)
[12:09:36] still a waste of time since we will phase out bugzilla by the end of this year
[12:09:56] much better to invest time in enhancing the Phabricator chatbot.
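A minimal sketch of the "push that output to redis" idea floated at 12:09: one process publishes bug or build events to a Redis channel, and the IRC-facing bot just subscribes and relays them. The channel name and the message fields below are invented for illustration; the sketch uses the redis-py client.

    # Minimal sketch of the "push events to Redis" relay idea discussed above.
    # Assumptions (not from the log): the channel name "wikibugs-events" and
    # the JSON message shape are made up for illustration.
    import json
    import redis

    CHANNEL = "wikibugs-events"  # hypothetical channel name

    def publish_event(r, component, summary, url):
        """Producer side: whatever consumes Bugzilla/Phabricator pushes here."""
        r.publish(CHANNEL, json.dumps({
            "component": component,
            "summary": summary,
            "url": url,
        }))

    def relay_to_irc(r, send_line):
        """Consumer side: the IRC bot subscribes and forwards formatted lines."""
        pubsub = r.pubsub()
        pubsub.subscribe(CHANNEL)
        for message in pubsub.listen():
            if message["type"] != "message":
                continue  # skip subscribe confirmations
            event = json.loads(message["data"])
            send_line("[%s] %s - %s" % (event["component"], event["summary"], event["url"]))

    if __name__ == "__main__":
        r = redis.Redis(host="localhost", port=6379)
        publish_event(r, "wikibugs IRC bot", "example bug", "https://example.invalid/bug/1")

The appeal over mail or RSS parsing is that Redis pub/sub gives the push semantics asked for above without any polling on the IRC side.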
[12:10:14] it might not support directing different components to different channels, which is definitely something we can use
[12:10:22] yeah, except a phabricator chatbot will, *again*, be on some WMF-administered server
[12:10:35] and we'll just have the same problems all over again
[12:10:47] 'no, we can't fix that because no-one has time to review and deploy that patch'
[12:10:53] which is?
[12:11:11] if the chatbot is configurable from the web interface, we can grant rights to volunteers
[12:12:02] apparently the bot runs off a json file; we can have it in a repo, get it deployed, and reload the chatbot daemon on post-merge
[12:16:30] well, in that case it'll be for learning asyncio ^__^
[13:03:48] Cyberpower678: Hi CP, I made some minor changes to the web-script and added an anti-spider entry to lighty (as a quick workaround)
[13:09:45] hedonil, thanks.
[13:10:00] Cyberpower678: yw ;)
[13:10:09] Hi
[13:10:25] Are the tools servers lagging behind the main db?
[13:21:57] http://goo.gl/91ILas hasn't updated despite me KNOWING I've tagged images so they shouldn't be on the list
[13:22:01] generated
[13:54:34] @replag
[13:54:35] Replication lag is approximately 05:40:44.1278910
[14:08:51] !topic
[14:08:59] !ping
[14:08:59] !pong
[14:09:05] @search topic
[14:09:06] Results (Found 2): StoneB, channels,
[14:09:11] !channels
[14:09:12] | #wikimedia-labs-nagios #wikimedia-labs-offtopic #wikimedia-labs-requests
[14:09:29] petan, rumor has it you wrote something to track bug changes using bugzilla's rss feeds
[14:09:38] yes I did
[14:09:54] it might be broken now, because it hasn't produced any messages for some time
[14:10:04] hmkay
[14:10:04] I think the latest bugzilla upgrade broke it
[14:10:11] I might fix it soon
[14:10:44] it basically uses the RSS feed produced by BZ, parses it and relays it to IRC
[14:12:43] Ok. I'm fiddling with a 'wikibugs 2.0', but using the mail infra.
[14:12:58] which has the advantage of not having to poll, but the disadvantage of being mail
[14:17:06] I don't want that
[14:17:33] if they want a solution that works, the information needs to be stored in some queue with push abilities
[14:17:57] which I am emulating through RSS, which sucks, but sucks much less than e-mail parsing
[14:19:05] I'm thinking of a mixed approach (email as trigger, BZ API to get details)
[14:20:10] but at the moment it's mainly a fiddle project, as apparently BZ will be killed in favor of phab
[14:25:38] poor BZ
[14:25:50] I don't know phab, but BZ is so far the best bug tracker I know of
[14:26:04] petan: hrm, you parse the HTML tables in the RSS feed?
[14:26:07] written in perl <3
[14:26:12] yes
[14:26:19] * valhallasw shudders
[14:28:49] primary phabricator features:
[14:28:53] meme generators
[14:29:03] that thing really looks useful
[14:29:21] hiding stuff from coworkers LOL
[14:29:27] that is what the open source community really needs
[14:29:43] meme generators and hidden code
[14:30:09] valhallasw: who made the decision to kill bugzilla with that thing?
[14:30:19] see the RFC on wikitech-l
[14:31:05] I only see some phabricator meeting mail
[14:31:10] it wasn't recently?
[14:31:38] RfC on Product Management Tools and Development Toolchain
[14:31:49] oh
[14:32:05] if the title wasn't so cryptic I would have noticed earlier :P
[14:41:27] It's not decided yet, but it looks as if this will be the way.
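For contrast, a rough sketch of the polling approach petan describes at 14:10: consume a Bugzilla Atom/RSS feed and relay new entries to IRC. The feed URL and the seen-ID bookkeeping are illustrative assumptions, not how petan's Perl bot actually works; this uses the feedparser package.

    # Rough sketch of "poll the Bugzilla feed and relay to IRC" as described
    # above. The feed URL and message format are illustrative only.
    import time
    import feedparser

    # Hypothetical feed URL; a Bugzilla buglist can emit Atom via ctype=atom.
    FEED_URL = "https://bugzilla.wikimedia.org/buglist.cgi?product=Wikimedia&ctype=atom"

    def poll_forever(send_line, interval=60):
        seen = set()  # entry IDs already relayed (everything is "new" on the first pass)
        while True:
            for entry in feedparser.parse(FEED_URL).entries:
                if entry.id in seen:
                    continue
                seen.add(entry.id)
                send_line("%s - %s" % (entry.title, entry.link))
            time.sleep(interval)

    if __name__ == "__main__":
        poll_forever(print)

The mixed approach mentioned at 14:19 would keep the same relay loop but use incoming mail only as the trigger and fetch the details from the Bugzilla API instead of the feed.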
[14:53:49] meh, the phab feed is also fugly
[14:53:53] http://fab.wmflabs.org/api/feed.query
[14:56:26] but it's certainly enterprise
[14:58:31] omg, looks like JSON, have fun
[14:59:10] yeah, it's JSON all right. JSON that refers to 'PHIDs'
[14:59:12] ugh.
[15:00:52] oh, goodie goodie
[15:00:56] if you ask for text format
[15:01:04] you get doesn't
[15:02:18] Can I use Labs to host a tool like this? http://rawgit.com/whym/gdc/master/gdc.html?user=whym
[15:02:45] I'm wondering about privacy issues.
[15:03:29] iirc the rule for edit counters is 'only opt-in', but I'm not 100% sure
[15:05:02] Actually, it's not that clear-cut. Tools doesn't have a rule about data aggregation, but we /do/ have a rule that local project policies have to be obeyed.
[15:05:33] My understanding is that there is consensus on dewiki that opt-in is required for editcounter-like things.
[15:06:09] So you may have to enable some sort of opt-in for data from dewiki, or at least be prepared to do so if that community objects.
[15:08:12] valhallasw: At least it looks like true JSON, and not "Oh, just chop off the first line"-Gerrit-JSON.
[15:08:57] valhallasw: Coren: Or maybe I could whitelist wikis that are free from these issues and allow only them.
[15:09:25] scfc_de: not very RESTful JSON, though :-(
[15:09:31] whym: That's another option.
[15:11:14] valhallasw: Coren: thanks for the prompt information. :)
[15:16:15] scfc_de: isn't the point of JSON that it's meant to be readable by humans as well as machines?
[15:16:26] this thing looks less readable than raw binary to me
[15:19:08] I thought it was meant to be read by JavaScript :-). "curl -s http://fab.wmflabs.org/api/feed.query | jq ." looks quite alright.
[15:20:15] bash: jq: command not found
[15:20:41] http://stedolan.github.io/jq/ - better than sliced bread.
[15:21:52] And it's even available in Ubuntu.
[17:42:20] scfc_de: I see that I can add any user to a project, but can they ssh into an instance if they're not in the shell group?
[17:46:20] andrewbogott surely knows too
[17:48:00] If you don't have shell access you can't ssh to anything. That's what 'shell' means.
[17:49:41] Talkin' bout my s-s-s-s-shelleration
[17:49:50] andrewbogott: sure, that's what I supposed, but I wanted to confirm :)
[17:49:53] thanks
[17:51:14] oh, "All new labs accounts automatically file a request for shell access"; cute
[17:51:35] I didn't know this yet :)
[18:54:01] Coren: High Replication Lag: Oh dear. The database appears to be lagging behind. I won't be able to present you with information newer than 38353 seconds. For enwiki
[18:56:08] springle: ping
[18:59:05] Coren: enwiki is lagging 10:43:45
[19:00:48] Perhaps we could give access to SHOW PROCESSLIST so naming and shaming would get easier :-).
[19:23:54] !log rebuilding Cirrus indexes to pick up auxiliary fields and smarter accent matching
[19:23:55] rebuilding is not a valid project.
[19:24:02] !log deployment-prep rebuilding Cirrus indexes to pick up auxiliary fields and smarter accent matching
[19:24:04] Logged the message, Master
[19:24:07] much better
[20:16:12] Hello
[20:16:15] Anyone awake?
[20:49:03] Qcoder00: What's up?
[20:49:27] Some catscan queries aren't updating when I re-run them
[20:49:43] Is the tools server working from an older db dump?
[20:50:29] Hi, I would like to work with the "analytics" project team... What is the procedure?
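The "curl -s http://fab.wmflabs.org/api/feed.query | jq ." one-liner from 15:19, redone in Python for anyone hitting "jq: command not found". It assumes the endpoint answers unauthenticated GET requests the way the fab.wmflabs.org test instance appears to in the log; a production Conduit API normally expects an API token and POST parameters.

    # Python equivalent of the curl | jq one-liner above: fetch the Phabricator
    # feed.query endpoint and pretty-print whatever JSON comes back.
    # Assumption: the test instance answers unauthenticated GETs as the log suggests.
    import json
    import requests

    FEED_URL = "http://fab.wmflabs.org/api/feed.query"

    def dump_feed(url=FEED_URL):
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()
        print(json.dumps(resp.json(), indent=2, sort_keys=True))

    if __name__ == "__main__":
        dump_feed()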
[20:56:46] Qcoder00: I think CatScan caches the tree structure for a few hours; but enwiki also has a replication lag of over a day now, so if you're querying it, that might be what you're seeing.
[20:57:21] And when will the replication lag go down?
[20:59:30] Qcoder00: I don't know.
[21:15:10] Heh. Chances are, catscan is the /cause/ of the current lag. Ima go see if that's the case as usual.
[21:16:22] Ah, no, for once it isn't. :-)
[21:17:44] This will need the loving care of our DBA
[21:51:13] Do you people create new ssh keys for each site that requires a key?
[21:51:20] or do you just use one everywhere?
[22:05:01] SigmaWP: I use one "everywhere".
[22:09:12] SigmaWP: it's fairly easy to set per-site keys in .ssh/config, though, as well as running an agent that adds all keys
[22:10:07] SigmaWP: I have two keys, one for WMF use and one for work use
[22:10:50] and then an extra set on labs to do inter-host login (so it has the private keys stored /on/ labs)
[22:11:16] hmmm
[22:12:32] Oh
[22:12:35] I see
[22:24:45] replag is 14 hours, 10 minutes, 45 seconds.
[22:24:47] Anyone know why?
[22:37:58] some blocking query again?
[22:39:39] hmm... looks like someone's looking into it already
[23:40:03] SigmaWP: Known issues; our DBA will look into it shortly.
[23:40:30] Thank you
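On the recurring replag questions above: one way to estimate the lag yourself is to compare the newest recentchanges timestamp on a replica with the current UTC time. The host name, database name, and replica.my.cnf credentials file below reflect the Tool Labs conventions of the time and are assumptions; the @replag bot may well measure lag differently.

    # Hedged sketch: estimate replica lag by comparing MAX(rc_timestamp) on the
    # replica with the current UTC time. Host/db names and the credentials file
    # are assumptions about the Tool Labs setup, not a definitive recipe.
    import datetime
    import os
    import pymysql

    def estimate_replag(host="enwiki.labsdb", db="enwiki_p"):
        conn = pymysql.connect(host=host, db=db,
                               read_default_file=os.path.expanduser("~/replica.my.cnf"))
        try:
            with conn.cursor() as cur:
                cur.execute("SELECT MAX(rc_timestamp) FROM recentchanges")
                (ts,) = cur.fetchone()
        finally:
            conn.close()
        if isinstance(ts, bytes):
            ts = ts.decode()
        # MediaWiki timestamps are UTC in YYYYMMDDHHMMSS format.
        newest = datetime.datetime.strptime(ts, "%Y%m%d%H%M%S")
        return datetime.datetime.utcnow() - newest

    if __name__ == "__main__":
        print("replag is approximately", estimate_replag())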