[00:11:07] I kinda need help here.. [00:11:16] techman224: Could you help with the confirm? [00:11:27] Seems we all forgot aboit it [00:11:38] Riley, with what? [00:11:48] techman224: SRP and the Round 4 page [00:13:02] Just updated the Main page and finished the nominations for Round 5 [00:13:42] Updated the s.itenotice as well. [00:14:00] Riley, you forgot the german sitenotice [00:14:05] I got that [00:15:36] Oh, didn't know there was one. Thanks [00:15:57] What can I do to help now? :p [00:18:29] fixed the formatting and such on the main confirm page :p [00:19:02] Thanks :P [00:19:36] :) [00:20:45] techman224: Want me to do the Round 4 page? [00:20:56] Riley, I got SRP [00:21:06] You can do the page [00:21:17] Kk [00:29:25] Done [00:39:38] erg [00:39:57] someone want to set up a spamblacklist/abusefilter rule to block spambots? [00:40:28] Does the abusefilter work? [00:40:35] yes [00:40:38] spamblacklist does not [00:40:43] (for items) [00:40:58] I'm an idiot, I have been using the log to block spambots for days -.- [00:41:03] * Riley smacks himself [00:41:07] the uiuc.edu (ironically thats my university) is on the spamblacklist, but isnt being stopped [00:41:43] * Jasper_Deng wishes people would format the confirmation votes using a numbered list [00:42:58] * Moe_Epsilon chuckles a little [00:48:03] * duh is waiting…. https://www.wikidata.org/wiki/Special:Log/abusefilter [01:06:58] duh: You do know that we have https://www.wikidata.org/wiki/Special:AbuseFilter/history which is much more handy, don't you? [01:07:17] hoo: that wont show me private filter changes though [01:08:08] duh: You aren't admin? :/ [01:08:17] not yet [01:08:40] Have you opened an RfA yet? /me is eager to vtoe [01:08:44] he has [01:09:00] (though remember this channel is publicly logged. Don't get accused of canvassing) [01:09:13] which is why i didnt mention it :P [01:09:31] Jasper_Deng: Ah, I see.... I got that page on my watch list and still I don't notice most of the requests :/ [01:09:53] hoo: also, do you know if there is a bug to make spamblacklist compatible with non-text items? [01:10:12] duh: No and that wont happen [01:10:17] why not? [01:10:50] Spam blacklist works on top of the parser (it only applies it's regexp against links the parser actually links... tags) [01:10:56] * its [01:11:11] So plain text links aren't covered [01:11:16] oh right [01:12:19] duh: Voted, btw ;) [01:12:36] :) [02:00:24] We need a faster archiving of http://www.wikidata.org/wiki/Wikidata:Requests_for_deletions [02:00:31] It's getting too big [02:01:37] yeah [02:02:00] ideally anything marked with {{done}} and no response in 4 hours should be archived [02:02:12] or {{not done}} [02:05:17] I can't find the archive settings on the page [02:05:26] its hardcoded into the bot [02:05:28] i think [02:09:35] I made a note on Hazard-SJ's user page [02:10:23] he originally set it to 12 hours [02:10:30] * Jasper_Deng htinks consensus might be needed to change this [02:12:45] nah [02:12:56] theres no reason to have the page that long [02:13:15] on-wiki discussion needed, most likely [02:14:11] why? [02:14:14] it seems rather logical [02:14:37] well the 12-hour time was something setup by consensus (in the bot request I think) [02:14:52] * Jasper_Deng thinks he gets a bit paranoid sometimes [02:15:42] Well 12-hours right now is too long, given the amount of user and bot requests [02:16:19] right [02:16:33] how about we split bots and user requests? 
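hoo's explanation above (around 01:10), that SpamBlacklist only applies its regexes to the external links the parser actually emits, so plain-text URLs in item data are never checked, can be illustrated with a small Python sketch. The blacklist pattern, URLs and texts here are made up for illustration; this is a toy model of the behaviour, not the extension's code.

    import re

    # Toy stand-in for one SpamBlacklist entry (illustrative pattern, not the real list).
    BLACKLISTED = re.compile(r"uiuc\.edu", re.IGNORECASE)

    def parser_external_links(wikitext):
        # SpamBlacklist only tests URLs that end up as external links in parser
        # output, e.g. [http://example.org label] or bare URLs in running wikitext.
        return re.findall(r"https?://\S+", wikitext)

    # A wikitext edit: the URL becomes a parsed external link, so it gets checked.
    wikitext = "Buy stuff at [http://spam.uiuc.edu/deals here]."
    print(any(BLACKLISTED.search(u) for u in parser_external_links(wikitext)))   # True -> blocked

    # An item edit: the spam link sits in a plain-text label or description, which is
    # stored as structured data and never parsed as wikitext, so there is nothing for
    # the parser-based check to look at...
    description = "best deals at spam.uiuc.edu"
    print(parser_external_links(""))                     # [] -- nothing for SpamBlacklist to test

    # ...whereas an AbuseFilter rule can match on the raw submitted text itself,
    # which is why the filter route works for items while the blacklist does not.
    print(bool(BLACKLISTED.search(description)))         # True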
[02:16:36] my other alternative idea is do a category like CSD thing [02:16:41] just tag the talk page with a template [02:17:01] duh, then we are creating more pages [02:17:14] More things to delete [02:17:17] true [02:20:45] duh: glad to see you finally caught on to {{Rfd group}} [02:21:14] Heh [02:21:33] Just be glad my bot isnt adding the dupes it finds to WD:RFD [02:21:34] Also, [02:21:37] While I have you [02:21:55] Theres a french interwiki conflict I'm stuck on [02:22:02] https://www.wikidata.org/wiki/Wikidata:Interwiki_conflicts#Q329397.2FQ3120398 [02:22:06] Can you figure it out? [02:22:11] wait... i thought your bot *is* adding the dupes it finds... [02:22:31] No, it posts them on [[User:Legobot/Dupes]] [02:22:56] Its already found over 3k dupes, so it would just destroy WD:RFD. [02:23:15] * Riley still thinks WD:RFD/B should exist [02:23:24] (B = Bots) [02:23:42] so it's just doing exact dupes or something? [02:23:49] or did you deactivate that task? [02:24:03] Exact dupes and normal dupes [02:24:05] (and I'll look at that conflict in a bit) [02:24:45] ahh. see, you thought it was a dumb idea to support up to 90 items at a time :P [02:25:17] Pink|Winter: The problem with RfD group is that It only allows one delete reason for all the items. Each dupe needs its own delete reason. [02:26:07] duh: ahhh... want me to fix that? [02:26:24] Nah [02:26:28] Probably not worth it [02:26:33] a {{{reason1}}}, {{{reason2}}}, etc., should do the trick [02:26:48] Nah [02:26:55] My bot constructs it manually [02:26:58] Its much easier [02:27:11] yeah but RfD is illegible right now [02:27:47] in terms of size? [02:28:02] yeah [02:29:33] O.O http://toolserver.org/~vvv/adminstats.php?wiki=wikidatawiki_p&tlimit=none [02:30:11] as I suspected, I have by far the most blocks...... [02:30:26] So 5k dupes then? [02:30:44] yEP [02:33:35] duh: but like... idk, if you could do something to make all the consecutive reports take up a little less display space, that'd be ncie [02:33:51] er for what? [02:34:25] Make reports inside of collapsible templates. [02:34:32] Simple as that [02:35:26] Do the same format you are doing now but just add a section header and a collapsible template for the report to go in. [02:35:40] yeah that'd work [02:35:51] header like "Bulk deletion request from Legobot" [02:35:58] meh [02:35:59] ok [02:36:01] Riley: done yet? [02:36:09] almost [02:37:56] duh: Done [02:37:59] Riley: do you have some script you're doing this with or something? [02:38:05] ok starting the next run [02:38:13] Pink|18HourNap: Riley just clicks fast. [02:38:31] ahh. so does Legobot do the merges herself? [02:38:58] Legobot isn't female. I don't think. Not sure. And it doesn't do any merges. [02:39:03] It just finds dupes. [02:39:20] all bots are female. like ships. [02:39:20] And it makes sure that all sitelinks in the smaller one are also in the bigger one. [02:39:29] open in new tab, click enter on each one, close all other tabs using http://puu.sh/20E9j and then repeat. [02:40:04] Riley: but what are you doing with the links from the smaller one? [02:40:18] what do you mean links? [02:40:28] sitelinks [02:40:33] * Riley doesn't even know which is the smaller one [02:40:38] theyre already in the bigger one.... [02:40:44] Riley: the smaller one when the dupe isnt exact [02:40:51] which is the one the bot has you delete [02:41:03] i thought the software doesn't let you add identical sitelinks.... [02:41:12] bug [02:41:44] Pink|18HourNap: not anymore. 
it used to though [02:42:03] cleanup is messy [02:42:05] ahh. why not? [02:43:53] because it shouldnt? [02:45:48] how so? [02:47:31] wait are you asking why duplicate sitelinks arent allowed? [02:50:29] yeah [02:50:32] no [02:50:44] i'm asking why the software doesn't prevent them any more [02:50:54] for good reason [02:51:16] for example some wikis have a single article where others have multiple [02:52:06] wait no [02:52:08] it does prevent them [02:52:16] Pink|18HourNap: we're cleaning up from when it used to allow them [02:52:23] ^^ [02:52:40] oh ok [02:52:46] ironically, most of these were created by the same bot [02:52:52] :P [02:54:01] actually let me rephrase [02:54:03] its not one bot [02:54:11] its usually the same bot creating both items [02:58:27] oh Jasper_Deng_away: can you start a discussion somewhere to turn on $wgAbuseFilterNotifications and $wgAbuseFilterNotificationsPrivate for wikidata? [02:59:33] duh: First one is enabled per default AFAIR [03:00:02] hoo: its not :/ [03:00:13] only enwiki has it turned on [03:00:28] it should be default for all imo though [03:00:36] duh: You're right... commons got it enabled per default as well, btw [03:00:43] - per default [03:00:44] 04Error: Command “per” not recognized. Please review and correct what you’ve written. [03:00:47] actually [03:01:03] the problem is that some wikis restrict abusefilter-log to autoconfirmed i think [03:01:09] or w/e the permission is [03:01:18] i think zhwiki has the strictest [03:01:22] We don't, do we? [03:01:31] no wikidata uses a sane configuration :P [03:02:46] duh: Open a brief discussion on wiki (shouldn't really be controversial), then someone (I can do that) can push a change to gerrit and we'll get that merge [03:02:46] d [03:03:12] for just public? or private too? [03:03:46] duh: Ask for both and mention that most (all?) of our filters are for spam bots anyway [03:03:55] but leave them the option to only get one [03:05:34] http://pastebin.com/RDyXj4vX [03:05:36] ok [03:05:38] uh oh, error [03:06:19] Riley: Try it again, this should be temporary [03:07:08] hoo: Ik, just thought I would report it [03:08:03] Riley: Well... that error isn't very meaningful [03:08:07] what were you doing? [03:09:06] Deleting a page [03:10:18] Its worth mentioning that you were deleting a page that had duplicate sitelinks [03:10:36] That too ^ [03:13:41] Riley: Looking at the page the deadlock happened on: That has nothing to do with wikibase and was just a temporary hiccup [03:13:57] hoo: Okay, thanks. [03:13:59] if that appears more often report it, but I think this is "fine" [03:14:24] * Riley wonders how many server kittens he is hurting by making these mass deletions [03:14:26] https://www.wikidata.org/wiki/User:Legobot/Dupes Probably two pages of these were deleted at nearly the same time [03:14:46] hoo: Maybe even more than two pages.. [03:15:19] Possible... [03:18:06] hoo / everyone: https://www.wikidata.org/wiki/Wikidata:Project_chat#Enable_AbuseFilter_notifications_to_IRC [03:19:57] duh: Commented [03:23:43] :) [03:33:28] Wtf [03:33:34] Disconnected from services? [03:33:45] O.O [03:33:49] by* [03:33:54] That means staff did it.. [03:34:14] legoktm: What happened? O.o [03:34:27] Riley: Not necessary ... maybe /squit gives that as well [03:34:28] o_O [03:35:16] I think my bouncer ping timed out or something [03:35:17] So I ghosted myself [03:35:35] :P [03:36:06] DONE. 1000 pages deleted! *phew* [03:36:20] Riley: You didn't do that per hand, did you? :P [03:36:27] Yes.. 
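For reference, the safety check described earlier (around 02:39), making sure every sitelink on the smaller of two suspected duplicates is also present on the bigger one before it is nominated for deletion, only needs the wbgetentities module. A minimal sketch using the requests library; the item IDs below are arbitrary placeholders, and this is not Legobot's actual code, which the log does not show.

    import requests

    API = "https://www.wikidata.org/w/api.php"

    def sitelinks(item_id):
        """Return {site: title} for one item via wbgetentities."""
        r = requests.get(API, params={
            "action": "wbgetentities",
            "ids": item_id,
            "props": "sitelinks",
            "format": "json",
        })
        entity = r.json()["entities"][item_id]
        return {s["site"]: s["title"] for s in entity.get("sitelinks", {}).values()}

    def safe_to_delete(smaller, bigger):
        """True if every sitelink on `smaller` also exists, unchanged, on `bigger`."""
        small, big = sitelinks(smaller), sitelinks(bigger)
        return all(big.get(site) == title for site, title in small.items())

    # Placeholder pair; substitute the two halves of a suspected duplicate.
    print(safe_to_delete("Q123", "Q456"))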
[03:36:43] They all have different deletion reasons :/ [03:40:27] can someone inform me what to do when a place of birth is not available when adding a statement, and clicking the icon does not save it [03:41:08] sDrewth: need to create an item for the place of birth [03:41:19] i can import it using a script, link? [03:41:27] prior to adding it? [03:41:29] yes [03:41:32] adding the place of birth [03:41:39] bah humbug [03:41:46] duh: ? [03:41:55] once you have the right gadgets installed it takes like 10 seconds. [03:41:56] to do that I have to know how all the places are being entered [03:42:06] Jasper_Deng: check the WD:PC [03:42:18] sDrewth: they should have reasonably sane names... [03:42:30] Ditchley, Oxfordshire [03:42:40] or is it just Ditchley [03:42:48] or is it ..., England [03:43:03] link me to the en.wp item? [03:43:14] s/item/page/ [03:43:29] oh fair dinkum, just tell me the naming hierarchy [03:43:57] it depends? normally i see Town, State [03:49:24] it isn't evident what statements one would add for a place [03:49:41] and there is no help to assist [03:53:40] oh crumbs this is like peeling onions [03:53:46] with your freaking eyelids [03:55:35] naming policy was pretty hard fought on wikipedia [03:55:58] I'm sure wikidata will provide some fun opportunities to reopen some debates [03:56:13] * Jasper_Deng doesn't want drama [03:56:28] I like Wikidata the way it is.... spam-free, no edit warring, no POV-pushing.... [03:56:45] "spam-free" [03:56:53] ever since we got abusefilter enabled [03:56:56] https://en.wikipedia.org/w/index.php?title=Special%3ALog&type=move&user=&page=Perth&year=&month=-1&tagfilter=&hide_patrol_log=1&hide_review_log=1 [03:57:09] yes Tim, and why at Wikisource, I fought for the root name to be the disambig page, and everybody gets disambiguated [03:57:32] try finding a Woodstock among all of these [03:57:57] at least with Wikidata, the placename is just a label [03:58:15] though, they are going to need to deal with duplicates [04:02:22] huh: my recommendation would be that everyplace should be a two step name, no matter where in the world that it is [04:03:11] well [04:03:15] I think NYC has a one step name [04:03:26] * duh goes to fix that [04:03:38] there may be a few onsies, but make it the practice [04:03:43] onesies [04:03:45] vatican city, vatican city [04:03:47] right [04:04:06] https://www.wikidata.org/wiki/Q60 [04:04:28] for US, you will probably want to look to drill down to county level, much data is in the four step process [04:05:08] especially for genealogical data, four step is the default for US [04:05:49] but you can keep county implicit [04:05:51] Brooklyn, Kings, New York, USA [04:06:35] right [04:06:43] you should probably mention this on Project chat somewhere [04:07:00] old UK, is Parish, County, Country, though often (property name), Parish, County, Country [04:07:21] no point in starting behind the ball and having to fix later [04:07:28] train people right in the first place [04:08:10] and wait until you see how many Washington counties there are in the US [04:08:20] they are so underdeveloped in originality! [04:08:30] * sDrewth ducks the shoes [04:10:25] does wikidata populate the remaining interwikis based on an article? [04:10:36] add one link, it fills the rest?
[04:12:15] no [04:12:21] you need to add them manually [04:12:21] crap [04:12:22] but [04:12:32] there are many scripts that will autoimport the rest [04:14:24] there is a gadget, good [04:14:28] yeah [04:14:40] theres also a gadget that lets you do it upon creation [04:15:01] some of those would do well to be default [04:15:35] probably [04:16:41] oh, so many of these Woodstock have been imported by bot [04:16:47] yes [04:17:56] and will the geographic have geocodes? [04:18:08] you mean coordinates? eventually [04:18:25] https://www.wikidata.org/wiki/Wikidata:Property_proposal [04:18:46] I can see the value in sucking in officially sourced geographic places [04:20:09] like https://services.land.vic.gov.au/vicnames/placeName.html?method=exportList [04:20:29] :o [04:21:04] that seems pretty cool [04:21:09] im going to sleep now though [04:21:12] night all [04:21:17] probably a federal govt too [04:21:49] worn the poor dear out :-p [04:30:50] sDrewth: http://www.wikidata.org/wiki/Q2124420 is rather problematic, but bots are not smart enough to put in translations [04:31:11] oh, I recognise the page [04:31:13] NOT! [04:31:24] * Jasper_Deng_busy just saw you post @ project chat [04:32:03] oh, that was the ref was it, I didn't look, it was a copy and paste only [04:33:58] Jasper_Deng_busy: then flag it for a fix, or don't have it in english, and flag it [04:34:05] empty is more problematic [04:34:16] empty is dick useless [04:35:00] sDrewth: I agree that empty is useless, but "then flag it for a fix" = bugzilla? (b/c we have no provisions for placing tags directly on items, perhaps you want tags on talk pages?) [04:35:19] and "don't have it in english" <--- ? [04:35:25] not tags, just have the label [04:35:49] * Jasper_Deng_busy is confused [04:35:58] no flag it for a fix, can be a bot recording [04:36:27] hey, I know nothing, I am just a user bashing my way through, with some knowledge of wikis, and geographic data [04:37:09] there are bazillions of such items..... it always annoys me how non-English items show up in my search results for English terms [04:39:05] * Jasper_Deng_busy invites sDrewth to idle on #cvn-wikidata [04:43:26] sDrewth: Could you finish https://www.wikidata.org/wiki/Q4454258 ? [04:44:32] Ah, it is a duplicate. [04:45:05] sDrewth: Before creating an item, use https://www.wikidata.org/wiki/Special:ItemByTitle to see if it already exists. [04:46:03] Since it was a duplicate of https://www.wikidata.org/wiki/Q752111 , I have deleted it. Try a new creating a new item but make sure to use Special:ItemByTitle first to see if it already exists. :) [04:46:31] yep, was getting back to it, and got distracted by other thoughts [04:46:48] * sDrewth smacks Riley upside to the head [04:46:59] search the term Woodstock, and tell me that again [04:47:00] :) [04:47:32] What? [04:47:50] sDrewth: Special:ItemByTitle isn't the search bar [04:50:17] can I suggest that you add the link to [[MediaWiki:Search-summary]] or one of the other sections then [04:50:51] ah, it is in the sidebar, but tucked in among others [04:51:05] "Item by title" [04:51:08] :P [04:51:27] try "Lookup item by title" [04:51:43] Moe_Epsilon: it is in the noise of all the standard links, and hence ignored [04:51:54] I expect to see it with search [04:52:29] I suppose the search results summary can show a link to it [04:53:03] That link that you pointed to is sitting under 'Navigation', and not where I would peek for a lookup or search [04:53:56] are you sure it's MediaWiki:Search-summary? 
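Riley's advice above, to run a title through Special:ItemByTitle before creating anything, can also be done programmatically: wbgetentities accepts a site/title pair instead of an item ID. A small sketch; the site and titles are just examples.

    import requests

    API = "https://www.wikidata.org/w/api.php"

    def item_for(site, title):
        """Return the item ID holding the sitelink `site`/`title`, or None if no item exists."""
        r = requests.get(API, params={
            "action": "wbgetentities",
            "sites": site,       # e.g. "enwiki"
            "titles": title,     # e.g. "Woodstock, Oxfordshire"
            "props": "info",
            "format": "json",
        })
        for entity_id, entity in r.json().get("entities", {}).items():
            if "missing" not in entity:
                return entity_id
        return None

    print(item_for("enwiki", "Woodstock, Oxfordshire"))          # existing item -> "Q..."
    print(item_for("enwiki", "A page that has no item yet"))     # -> None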
[04:54:11] not 100% [04:54:20] I forget, let me go look at what I did at enWS [04:56:25] I found http://www.wikidata.org/wiki/MediaWiki:Searchmenu-new [04:56:28] and put it in there [04:58:05] k, thx [04:58:20] http://www.wikidata.org/w/index.php?title=Special%3ASearch&profile=default&search=Something&fulltext=Search [04:58:22] seems alright [04:59:30] yes, though nothing shows for http://www.wikidata.org/w/index.php?title=Special%3ASearch&search=&fulltext=Search [05:01:00] hmm, if you can find what mediawiki page the grey bar is on, I can fix that [05:01:18] stick &uselang=qqx [05:01:42] (searchprofile-articles) [05:01:42] (searchprofile-images) [05:01:42] (searchprofile-project) [05:01:42] (translate-searchprofile) [05:01:42] (searchprofile-everything) [05:01:42] (searchprofile-advanced) [05:01:48] so they look set [05:01:59] the line is set, but labels can change [05:05:08] hmm [05:05:21] http://www.wikidata.org/w/index.php?title=Special%3AAllMessages&prefix=Search&filter=all&lang=en&limit=1000 [05:05:32] might pay to go and play on a test wiki [05:07:42] Moe_Epsilon: http://test2.wikipedia.org/w/index.php?title=Special%3ASearch&search=&fulltext=Search [05:07:58] see the line(s) of text above the search box [05:08:18] [[MediaWiki:Search-summary]] [05:08:43] oh, yeah [05:08:51] is that where you preferred it to be? [05:09:03] so, that is the page to prepend any text that always appears with a search box [05:09:14] * sDrewth shrugs [05:10:21] just cannot see where else one can add to the page [05:10:51] yeah, I see what you mean [05:11:47] Moe_Epsilon: let's see what happens if we change "searchprofile-articles" [05:12:49] &uselang=qqx is one of the best tools ever :D [05:13:51] riley, there is a gadget as well [05:14:12] I am aware :). [05:14:24] Moe_Epsilon: nada, it squashes it into the text box [05:14:31] http://www.wikidata.org/w/index.php?search=&button=&title=Special%3ASearch [05:14:31] http://test2.wikipedia.org/w/index.php?title=Special%3ASearch&search=&fulltext=Search [05:15:11] looks good Moe, thx [05:15:16] :) [05:26:42] Moe_Epsilon: go to http://www.wikidata.org/wiki/Special:ItemByTitle and enter the terns enwiki and Woodstock [05:27:08] result isn't helpful [05:27:32] it defaults to one page [05:28:07] cf. http://www.wikidata.org/w/index.php?title=Special:Search&limit=100&offset=0&redirs=1&profile=default&search=woodstock [05:28:54] sDrewth: the point of that search item, is to find the exact article based on what you search. Searching en.wiki and "Universe" will always pull up [[Q1]] because the English Wikipedia article on that topic is titled [[Universe]] [05:30:04] got a oversight, bbs [05:30:06] an [05:40:17] yes Moe_Epsilon, I understand that, but when I had twenty + articles all name Woodstock, finding the right one is the issue [05:41:05] though I understand it is a back and forth issue [05:41:29] that said, I started this to fix something at Commons, which is preparing its Creator: ns for the data [05:41:53] and that was >> 1 hour ago, and I am still hacking away [05:42:03] * sDrewth rolls his eyes [05:42:09] ack dust balls [05:43:54] ItemByTitle isn't really for searching for items anyways, the traditional search bar will be used for that once that is worked on. 
ItemByTitle was created IIRC to find if an article on a particular wiki already has an item, so that we can see if we need to create an entry [05:44:48] since the search was even more terrible when we started [05:44:48] :P [05:45:15] apologies if that sounded like a criticism [05:45:25] oh, not at all [05:45:25] :D [05:45:30] more expressing issues as I trip over them and my feet [05:47:11] wikidata even through phase III probably won't be heavily user-friendly like these two phases. Lots of bugs :) [05:53:50] hmm, help me here then, http://www.wikidata.org/wiki/Q3569843, from the (type ahead text) I don't get enough options [05:54:11] oh, yes I do, never mind [05:55:23] oh, I think it's agreed to not have disambiguation in the labels [05:55:33] (since I think I saw you add it) [05:56:02] you mean the word? [05:56:09] (novel) [05:56:24] well, how else do you tell it apart? [05:56:25] We need an admin bot.. [05:56:53] Descriptions? [05:57:16] hmm, none that I have touched had them [05:57:35] Yeah, there's a taskforce for filling them in [05:57:45] they were added by bots without descriptions [05:57:57] that becomes hard for placenames [05:58:32] I started at http://www.wikidata.org/wiki/Q358912 [05:58:41] and went to add a place of death [05:59:09] and without fuller descriptive labels, I am not going to get close enough [05:59:54] I have to get "Woodstock, O" to get a hit on the list [06:00:20] anything shorter doesn't present the right option [06:00:33] so you are going to need to have means to disambiguate [06:01:50] description will allow you to choose, but you have to have the right selection from which to make the choice first [06:02:40] Woodstock, Oxfordshire is fine, I wouldn't change that to just "Woodstock" [06:03:10] they were all just "Woodstock" [06:03:23] which is what led me on the crusade [06:03:47] and I have found a couple with no label [06:04:16] so you will need a means to work out how one would find Woodstock the novel [06:05:12] phase II isn't complete :P there are bugs fixed on a daily basis it seems. You can contact the developers and bring it up to them :) [06:05:40] I sort of did on the chat page [06:06:14] though you may wish to add to that with your perspective of how loony I am, and need better "edumahkation" [06:06:35] lots betterer [06:06:51] there's also http://www.wikidata.org/wiki/Wikidata:Contact_the_development_team [06:06:58] (if you never saw it) [06:08:08] you bring up a valid point, it is hard to see beyond a certain number of items if you are adding properties [06:08:18] it may help to have it able to enter by Q number [06:08:21] that would solve that [06:08:31] nooooooo [06:08:52] I like the lookahead function, and if things are allowed to have a longer name, it will be fine [06:08:59] For clarification, it would display the label, but you can enter it by Q number :P [06:09:28] Moe_Epsilon: I doubt that I will ever know the Q number, it is what I came to get for that article [06:09:59] and interestingly the only place that you see it is in the url, it isn't in the article [06:15:06] Yeah, I personally find things easier to get by just looking in the URL and getting the Q number, but that's just me I suppose :P It's something that is being worked on. The Lookahead feature is nice, but if both ways of inserting data are there, then it would be nice.
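The problem sDrewth runs into above, a type-ahead list full of identical "Woodstock" labels with nothing to tell them apart, is what descriptions are meant to solve, and the entity search returns both. A sketch using the wbsearchentities API module; the module is not named in the log, so treat this as an illustration of the idea rather than the interface being discussed.

    import requests

    API = "https://www.wikidata.org/w/api.php"

    def candidates(label, lang="en", limit=20):
        """List (id, label, description) for items whose label matches `label`."""
        r = requests.get(API, params={
            "action": "wbsearchentities",
            "search": label,
            "language": lang,
            "type": "item",
            "limit": limit,
            "format": "json",
        })
        return [(m["id"], m.get("label", ""), m.get("description", ""))
                for m in r.json().get("search", [])]

    for qid, label, description in candidates("Woodstock"):
        # With descriptions filled in, the Oxfordshire town and the Walter Scott
        # novel become distinguishable even though both are labelled "Woodstock".
        print(qid, "-", label, "-", description or "(no description yet)")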
[06:15:29] I'm also pretty sure that there is a bot we are having developed to remove all the disambiguation from labels [06:21:13] I think it's just that the interface of Wikidata changing so frequently, everything isn't going to be as practical right now :p The intention is to have labels with descriptions disambiguating them. Properties are about a week old, so that search is going to evolve [06:25:07] yep, maybe too organic for me who wants to have a standard [06:30:33] sDrewth: sorry if I'm not getting you the answers you want :< [06:30:59] no issue [06:31:32] this stuff is still in the early development [06:32:14] yeah, I'm learning something new every day on this project :P [06:32:16] and I have enough development in other places, and D: isn't yet just drop in and do (with ease) [06:32:36] * sDrewth puts out a velcrose landing pad for Oren_Bochman  [06:34:10] how strange, you're a steward, an admin on commons and on en.wiki and I've never heard or spoke to you before. :o [06:34:56] mostly b/c I am a hack from Wikisource, and part of the reasons for the other two adminships was due to gettnig thins fixed for enWS [06:35:09] oh, it's Moe_Epsilon! gotten your sight back? [06:35:20] yes :D thanks [06:35:56] I edited for a few days with one eye, too addicted x_o [06:36:39] i feel ya. got some ice in a tear duct during the blizzard, and it's only just stopped hurting ;) plus i'm used to not being able to see in general, what with family histories of bad eyesight from all 4 grandparents [06:37:29] I've just got terrible luck :p I had perfect vision, but I've been hit in the eyes repeatedly, and have scratched my pupils on both eyes [06:38:23] Moe_Epsilon: need to get a girlfriend who doesn't mind when you wander [06:38:34] xD [06:39:32] actually, my sister is the one who scracted my pupils, twice in fact [06:40:03] scratched* [06:40:07] Moe_Epsilon: leave that bit out, just say "a female" it makes for a better story [06:40:29] leave a bit of mystery, by saying "I'd prefer not to go into detail" [06:40:29] well I wasn't going for bonus points :p [06:42:25] but that isn't the latest thing that happened to me, that was before I was a teenager. I'll leave what happened this time a mystery [06:42:29] :D [06:54:47] * duh pokes PinkAmpersand  [06:55:04] * PinkAmpersand yelps [06:55:12] check your talk page plz :) [06:55:30] WD or WP? [06:56:10] WD [06:56:21] otherwise i would have pinged you in the other channel [06:57:08] :P [06:57:11] replying [06:58:16] {{talkback}} [06:58:45] random question: is there any template or magic word to make something display differently based on a user's skin? [06:59:02] no [06:59:09] no, not as far as I know [06:59:10] you would have to set css in the global css files [06:59:19] like set it one way in monobook.js [06:59:22] er *.css [06:59:27] and set it another in vector.css [06:59:56] you could also probably do some js detection, but css is better [07:15:01] :( [07:15:26] i ask because of the ampersand topicon on my talk page. it works fine for me, but shows up wrong in vector [07:15:48] oh, and Moe_Epsilon, if you think all that's bad, I know a girl who took a firework to the eye [07:16:38] I've had a close call with fireworks when one fell towards me and launched, but it thankfully missed :p [07:17:32] i don't know what fireworks to the eye feels like, but what I went through has to be comparable :p [09:17:46] Moe_Epsilon: :D [09:18:01] Isn't this deleting so fun? -.- [09:18:12] I think it is [09:18:13] :) [09:19:24] Enjoy the fun while you can. 
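The bot mentioned at the top of this exchange (06:15), which would "remove all the disambiguation from labels", is only described in passing, so the following is a guess at the core of such a cleanup: stripping a trailing parenthetical qualifier from a Wikipedia-style title so it can go into the description or aliases instead. The rule and the examples are assumptions, not the actual bot's code.

    import re

    # Matches a trailing parenthetical such as " (novel)" or " (film)".
    QUALIFIER = re.compile(r"\s*\(([^()]+)\)\s*$")

    def split_label(label):
        """Return (clean_label, qualifier_or_None) for a Wikipedia-style title."""
        m = QUALIFIER.search(label)
        if not m:
            return label, None
        return label[:m.start()], m.group(1)

    print(split_label("Woodstock (novel)"))       # ('Woodstock', 'novel')
    print(split_label("Woodstock, Oxfordshire"))  # ('Woodstock, Oxfordshire', None), comma forms left alone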
[09:20:00] why, is there a limited amount? [09:20:00] :P [09:20:22] Moe_Epsilon: based on my totally random estimate, i'm guessing there are about 8k left. [09:20:33] oh, that's still a lot [09:20:33] lol [09:31:09] Moe_Epsilon: What I mean is, once you start pressing that deletion button for the 6911th time, it gets kinda tiring [09:31:26] BACKLOG OMNOMNOMNOM [09:33:07] LOL [09:35:17] haha [09:35:24] guys you're awesome [09:36:14] :) [09:38:02] is Wikidata allready deployed in enwiki? [09:40:56] lbenedix: no there were technical problems apparently and it got delayed [09:51:56] * Moe_Epsilon burps [09:53:31] Moe_Epsilon!!! manners please ;-) [09:54:00] sorry, I'm full of deletions ;p [09:54:06] lbenedix: yeah as duh said we ran into issues last night - probably next try tonight or tomorrow night [09:54:08] Moe_Epsilon: haha [09:54:31] duh: your nick is lovely and confusing at the same time... [09:54:44] haha [09:54:54] just wait till "huh" joins :P [09:55:04] :D [09:56:31] * hoo is here to make the confusion perfect :D [09:58:20] Lydia_WMDE: You just cheered up my morning, thanks. [09:58:32] Riley: :) [09:58:35] always... [09:58:58] :D [09:59:01] Lydia_WMDE: you cheered up my evening, thanks ;-) [09:59:05] :D [09:59:30] Riley, you ruined my deleted spree (ec X 50) XD [09:59:49] deletion* [09:59:53] Moe_Epsilon: maybe you should start from the bottom? [09:59:59] I did! [10:00:07] xD [10:00:07] Moe_Epsilon: Mine lasted 1000 pages [10:00:09] Beat that! [10:00:28] * Moe_Epsilon no gusta [10:00:48] :p [10:03:37] Riley: is there some kind of schedule when the bot updates? Its updates seem random [10:03:49] Moe_Epsilon: Its random [10:03:52] No schedule [10:03:56] oh :< [10:04:01] jkjk -.- [10:04:12] It starts every hour on the dot [10:04:19] oh.. :D [10:04:21] * Riley was trying to hog them [10:04:30] And then it takes time to process them all, and them posts. [10:04:34] then* [10:05:05] oh, okay, that makes sense. :) Want me to start from an opposite side next time? :P [10:06:01] Sure [10:06:27] * Moe_Epsilon starts on the same side and trolls Riley [10:06:39] Moe_Epsilon: I was doing that last round... [10:06:43] Purposely :P [10:06:47] |: [10:06:49] xD [10:06:52] Hehe [10:07:23] I do have a t-shirt with trollface on it, I need to take a picture and make that my userpage image :) [10:07:38] [10:07:51] I will be :3 [10:08:47] * Jeblad_TROLL goes to troll office to do the daily trolling [10:09:16] :P [10:09:47] Jeblad_WMDE: LOL [10:10:41] Trolling is a way of fishing btw.. [10:10:56] Quite common, but now, off I go.. [10:12:18] Haha [10:13:08] PinkAmpersand: not in a rush, but when do you think you can have the template done? I'd like to have my code ready by then [10:13:55] also random question [10:14:11] hmm. probably gonna finish voting at RfA and go grab an early-morning snack first (somehow slept 18 hours, so I'm on yet another fucked-up sleep schedule), so... hour or two? [10:14:14] do all disambig pages get description of "Wikipedia disamgibuation page"? [10:14:30] PinkAmpersand: oh ok i was thinking more of # of days, but that works too :D [10:14:39] if so, can a bot do it? [10:14:55] yes. there's even a bot that replicates the alias across a bunch of languages [10:15:12] * duh rephrases [10:15:17] *is* there a bot doing it? [10:15:51] hmm. don't recall if that bot (whichever it is) also adds the alias itself, or if it just checks to see if it's listed in any language [10:18:12] are we talking about https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Innocent_bot ? 
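On the question above of whether every page with a disambig template should get the description "Wikipedia disambiguation page": one way a bot could detect such pages is the "disambiguation" page prop exposed by the Disambiguator extension. That detection route is an assumption about approach, not something stated in the log, and the write itself (action=wbsetdescription) is only named here, not performed.

    import requests

    EN_API = "https://en.wikipedia.org/w/api.php"

    def is_disambiguation(title):
        """True if the enwiki page carries the 'disambiguation' page prop."""
        r = requests.get(EN_API, params={
            "action": "query",
            "titles": title,
            "prop": "pageprops",
            "ppprop": "disambiguation",
            "format": "json",
        })
        pages = r.json()["query"]["pages"]
        return any("pageprops" in page for page in pages.values())

    # The title would normally come from the item's sitelinks (wbgetentities);
    # if the check passes, a bot would set the English description via
    # action=wbsetdescription (that step needs an account and an edit token).
    print(is_disambiguation("Woodstock (disambiguation)"))   # True
    print(is_disambiguation("Woodstock, Oxfordshire"))       # False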
[10:23:34] hmm not sure [10:25:18] * duh whips up a quick bot [10:26:34] btw, now that you say that you were thinking more days, not hours, i should note that i tend to underestimate template coding times by ~75%... always find more things to add, then more things to fix, then more things to add... etc... [10:28:13] >_> [10:28:14] <_< [10:28:15] https://www.wikidata.org/wiki/User:Moe_Epsilon [10:29:10] SO MANY DATA [10:29:40] are you really planning on keeping a list of how many items you create? [10:29:47] yes :D [10:30:10] but I posted the link, because I have a new picture of me xD [10:30:34] :) [10:30:38] Moe_Epsilon: that's a lot of items :) [10:30:50] I know x_x i'm not even done [10:30:55] so is there *any* exception to setting the description to "Wikipedia disambiguation page" if it has a disambig template? [10:30:58] i'm through January 6 [10:31:09] Moe_Epsilon: you know we have scripts that do these things for you? [10:31:10] :o [10:31:27] >: I looked for one and I couldn't find a tool! [10:32:04] silly [10:32:06] https://toolserver.org/~tparis/pages/ [10:32:18] I tried that and it didn't work [10:32:19] s: [10:32:52] No database selectedMySQL ERROR! Table 'toolserver.user' doesn't exist [10:32:58] oh. [10:34:05] New review: Anja Jentzsch; "Patch Set 1: Verified+1 Code-Review+1" [mediawiki/extensions/Wikibase] (master); V: 1 C: 1; - https://gerrit.wikimedia.org/r/48481 [10:35:11] Moe_Epsilon / PinkAmpersand / etc: so is there *any* exception to setting the description to "Wikipedia disambiguation page" if it has a disambig template? [10:35:42] I haven't seen one other than "Wikipedia disambiguation page" [10:38:26] ok [10:38:29] translate https://www.wikidata.org/wiki/User:Legoktm/disambig.js please [10:39:05] * Moe_Epsilon sadface [10:39:27] just add it in yourself [10:39:44] http://24.media.tumblr.com/3ecf94fd5fd75420f9d822e7f9ccbbc0/tumblr_mhawq5dZd81qll34mo1_400.gif [10:39:45] no [10:39:59] can't we use the is-a property, can have an item for "Wikipedia disambiguation page"? [10:40:00] okay, i'll add a couple if i can [10:40:04] this seems to be done for categories already [10:40:18] that would be less error prone and much easier to query and process later [10:40:39] DanielK_WMDE: Probably, but I don't think anyone has written a wikipedia article about that [10:40:45] duh: so? [10:40:46] oh [10:40:51] we could use https://en.wikipedia.org/wiki/Wikipedia:Disambiguation ? [10:40:59] aka https://www.wikidata.org/wiki/Q4167410 [10:41:00] for example, yes [10:41:07] sounds good [10:41:41] i can have my properties bot mass import that [10:42:20] using is-a properties like that to mark items that are disambiguation pages and categories would be very helpful i think [10:42:31] (same for wikipedia policy pages, user pages, etc...) [10:42:43] Wikipédia page d’homonymie is french [10:42:44] >: [10:42:50] Moe_Epsilon: too late! [10:42:53] oh [10:43:00] * Moe_Epsilon sadface again [10:43:07] if you can delete that page in my userspace plz [10:43:20] what page [10:43:28] the disambig.js [10:43:39] oh [10:43:59] Moe_Epsilon: Come do some dupes! [10:44:12] D: [10:44:13] OH [10:45:04] my bot is now spitting out 1k items at a time [10:47:30] DanielK_WMDE: Having some issues with the EntityContent/View/ViewAction mess, do you have a minute? [10:48:38] duh & Moe_Epsilon: hold on. I believe that English is the only language that uses "Wikipedia"... the rest just say "page d'homonymie", etc. 
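DanielK_WMDE's suggestion just above, to mark disambiguation items with an "is a" statement pointing at Q4167410 instead of (or as well as) a description string, would go through the claim API. The sketch below only builds the wbcreateclaim parameters and prints them; no property ID is fixed in the discussion, so the one used here is a placeholder (on today's Wikidata this role is played by "instance of", P31), and authentication and the edit token are omitted.

    import json

    P_IS_A = "P31"               # assumption: whichever "is a" property gets adopted
    ITEM_TO_TAG = "Q123456789"   # placeholder: the disambiguation item to tag

    params = {
        "action": "wbcreateclaim",
        "entity": ITEM_TO_TAG,
        "property": P_IS_A,
        "snaktype": "value",
        # Target: Q4167410 ("Wikipedia disambiguation page") as a wikibase-entityid value.
        "value": json.dumps({"entity-type": "item", "numeric-id": 4167410}),
        "format": "json",
    }
    print(params)   # a bot would POST this to /w/api.php together with a csrf token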
[10:48:50] right [10:49:00] but we moved on [10:49:10] we're going to use DanielK_WMDE's idea of properties [10:49:20] _is a_ Wikipedia disambiguation page [10:49:48] that is a good idea [10:49:49] :P [10:50:53] I am wondering why ItemContent::getParserOutput or even EntityView::getParserOutput does not include the JS variables? This is done in Vi ewEntityAcdtion::show() instead. The variables are just generated with a second EntityView in ViewEntityAcdtion::show(), put apparently with the wrong entity content (wrong revision) [10:51:05] DanielK_WMDE: ^^ [10:53:04] Moe_Epsilon: Nice team work :) [10:53:19] I was late xD [10:53:43] Very late.. [10:53:45] fun fact, Riley has more deletions than all other sysops combined. [10:54:34] http://en.wikipedia.org/wiki/This_is_Moe_3.JPG <- guess that explains that, Riley [10:54:48] https://en.wikipedia.org/wiki/File:This_is_Moe_3.JPG * [10:54:52] Fun fact: I have deleted 0.21042% of Wikidata's items. [10:55:06] [10:55:11] XD [10:55:36] * Moe_Epsilon offers to help, takes pictures instead [10:55:50] lool [10:59:05] New review: Anja Jentzsch; "Patch Set 2: Verified+2 Code-Review+2" [mediawiki/extensions/Wikibase] (mw1.21-wmf9); V: 2 C: 2; - https://gerrit.wikimedia.org/r/48570 [10:59:05] Change merged: Anja Jentzsch; [mediawiki/extensions/Wikibase] (mw1.21-wmf9) - https://gerrit.wikimedia.org/r/48570 [11:00:04] if self.source.startswith(tuple(self.local_site.namespaces()[14])) <-- i18n ftw. [11:00:15] ok, my bot is ready to use templates now :) [11:01:36] :) [11:02:22] of course, this probably makes your concerns even more important [11:04:00] slightly [11:04:14] ultimately users just have to be careful [11:04:40] and by requiring sysops to approve requests, it adds in a double check system [11:09:09] as long as we're super careful with it, then I don't see a problem :) [11:24:56] --.- [11:25:01] Seriously guys.. [11:25:04] I claimed top! Moe_Epsilon [11:25:26] :P [11:25:29] i got the bottom [11:25:43] Oh, then its Wiki13 [11:48:15] Lydia_WMDE: is there an estimated time at which point enwiki deployment will be reattempted? [11:49:07] duh: looks like around 17utc [11:49:15] ok thanks :) [11:49:19] np [11:50:40] embrace for collisions on project chat [11:58:02] DanielK_WMDE: Why is Article::fetchContentObject or anything related not public? [12:02:20] Goodnight folks! [12:06:03] good night Riley [12:06:25] night Riley! [12:06:25] :D [12:06:38] Moe_Epsilon: You get all the deletions for the next hours... [12:06:41] :3 [12:06:56] The bot is even bumped up to 2k.. [12:07:04] * Moe_Epsilon locks Riley in his room [12:07:38] Moe_Epsilon: I hope that will give you enough time to learn how to catch up to me ;) [12:07:49] *cough*20p/m*cough* [12:07:50] XD haha [12:07:50] jkjk [12:07:53] Goodnight :) [12:07:59] later :) [12:21:25] * Jeblad_WMDE went hunting for some dead animal [12:22:31] aude: Is this sign => called a "hash rocket"??? :D [12:23:24] Silke_WMDE_: no idea but sure wikipedia has the answer somewhere [12:23:33] :) [12:23:41] it's not googleable [12:23:48] it's a bad title on wikipedia [12:23:57] might be a bit harder to find out [12:24:19] ah http://en.wikipedia.org/wiki/Fat_comma [12:24:31] ah, okay [12:24:32] "In Ruby, the fat comma is occasionally called a hash rocket." 
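The one-liner quoted above at 11:00, self.source.startswith(tuple(self.local_site.namespaces()[14])), checks whether a title is in the Category namespace using the wiki's own localized namespace names (namespace 14 is Category). Outside pywikibot the same information comes from meta=siteinfo; a sketch, with frwiki picked arbitrarily as the example wiki.

    import requests

    def category_prefixes(api_url):
        """All localized names and aliases of namespace 14 (Category) on a wiki."""
        r = requests.get(api_url, params={
            "action": "query",
            "meta": "siteinfo",
            "siprop": "namespaces|namespacealiases",
            "format": "json",
        })
        data = r.json()["query"]
        ns14 = data["namespaces"]["14"]
        names = [ns14["*"], ns14.get("canonical", "Category")]
        names += [a["*"] for a in data["namespacealiases"] if a["id"] == 14]
        return tuple(n + ":" for n in names if n)

    prefixes = category_prefixes("https://fr.wikipedia.org/w/api.php")
    print(prefixes)                               # ('Catégorie:', 'Category:', ...)
    print("Catégorie:Rock".startswith(prefixes))  # True, whatever the UI language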
[12:24:45] * aude learns something new every day [12:24:54] * Silke_WMDE_ too [12:25:04] New review: Daniel Kinzler; "Patch Set 1:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/48612 [12:37:45] New review: Siebrand; "Patch Set 6: Code-Review+1" [mediawiki/extensions/Wikibase] (master) C: 1; - https://gerrit.wikimedia.org/r/47817 [14:16:39] New review: Tobias Gritschacher; "Patch Set 3: Code-Review-1" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/48426 [14:27:56] New review: Tobias Gritschacher; "Patch Set 1:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/48612 [14:51:08] New review: Tobias Gritschacher; "Patch Set 6:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/47716 [15:05:49] whazzup [15:26:26] So, what are/were the errors we were getting yesterday? [15:26:37] I've been asked to dig back through the error logs we had to see if they show any signs of them [15:26:54] Though, since Tim fixed the syslog stuff (~sunday), I'd not noticed any errors [15:29:26] Reedy: it was because we added minwiki (and kicked the caches) to wikidata [15:29:42] the clients still had cached sites data with no minwiki and freaked out when they saw "minwiki" [15:30:11] we've 1) patched the code to avoid that 2) refreshed the caches and updated sites table on the clients [15:30:40] * aude can provide a specific error log line if you need [15:37:38] I suspect there's not much point looking for it then [15:39:04] Reedy: as long as it's not appearing now, then good [15:39:34] we'll want to update wikibase submodule again before trying deployment again for enwiki [15:39:47] AnjaJ_WMDE: https://bugzilla.wikimedia.org/44225 [15:40:02] * aude enjoying the new gerrit features of editing commit summaries :) [15:40:41] New review: Tobias Gritschacher; "Patch Set 1: Verified+1 Code-Review+1" [mediawiki/extensions/Wikibase] (master); V: 1 C: 1; - https://gerrit.wikimedia.org/r/48569 [15:41:21] New patchset: Aude; "Add debug point for sites data" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/48481 [15:41:39] ok, i can only edit stuff that is not merged :( [15:43:28] We might want to do that a bit earlier [15:43:58] sure [15:44:17] Reedy: when roughly? [15:44:40] it might be nice to inform the community, though [15:44:50] they know roughly 17:00 UTC [15:45:00] (i need to know so i can get home before or not) [15:45:01] so like another hour [15:45:15] Lydia_WMDE: go home now! :) [15:45:21] hurry [15:46:34] heh [15:47:21] Yeah, 73 minutes [15:47:31] i'd say update the submodule (which i'm doing now) [15:47:47] and then just for sanity let it sit a little while before enabling it for enwiki [15:47:48] ok then i will go home soonish [15:47:57] * aude sure things are good this time [15:51:28] Reedy: how is the dispatching changes doing? [15:51:57] did we ever repopulate sites table for test2 wiki? 
[15:52:49] * aude sees my wikidata edits appearing in huwiki/ itwiki but not in test2wiki [15:53:17] not since yesterday for test2 [15:53:58] Errr [15:54:11] * aude wonders why [16:01:50] That's stuff deployed [16:09:38] the stuff deployed seems fine, except we are experiencing an unrelated issue with bits (js/css) [16:09:45] so js might not work at the moment [16:09:48] FYI [16:10:48] including on wikidata, unfortunately :( [16:12:49] eqiad bits apaches, so it's likely to affect most people :( [16:13:12] * aude proxying via the US, so yes it's broken for me [16:13:49] good news, the wikidata rc i18n preference message is good now [16:13:53] yay [16:13:59] localisation update hasn't finished yet [16:14:04] think it's done wmf9 though [16:15:22] it's done and we have the submodule update, which all helps [16:24:38] re [16:24:52] sorry, family duties [16:24:53] so... [16:25:28] np [16:25:36] Danwe, Tobi_WMDE: Article is evil. Don't use it. Use WikiPage and Revision instead. [16:25:50] article should die [16:26:00] in a fire :) [16:26:10] Needs to be formally deprecated first [16:26:18] which it still AFAIK isn't ... [16:26:32] along with the ChangesList class (replaced by something new and better) [16:26:34] Long live $wgArticle [16:26:43] * aude hates the RC and watchlist code in core [16:26:44] hoo: right, that needs some more refactoring (rendering stuff needs to move into ViewAction, etc) [16:27:13] see what we can do to fix it but unfortunatley need to leave the old classes there for deprecation / bc reasons [16:27:19] aude: ChangesList isn't *so* bad. But EditPage... [16:27:24] oh well, let's not start that :P [16:27:24] it's bad [16:27:39] it mixes a lot of logic and a challenge for extensions to hook into [16:27:44] yea [16:28:04] * aude has a bunch of hacks like to filter the watchlist, and the core only encourages such hacks [16:28:20] cause there's no other way at the moment to do things [16:28:28] grep -r 'horrible' core | wc -l [16:28:28] 15 [16:28:29] :P [16:28:29] anyway, ..... [16:28:32] heh [16:28:43] * aude poking around on itwiki / huwiki with the updated code [16:28:48] hewiki too [16:32:06] aude: so, what's the schedule for tonight, except activating wikidata on enwiki? will enwiki also go from wmf8 to wmf9? [16:32:25] hewiki etc already already on wmf9, right? [16:32:30] * DanielK_WMDE is a bit out of the loop [16:32:30] DanielK_WMDE: yes enwiki [16:32:35] enwiki is already on wmf9 [16:32:44] but right now, the bits that serve js are down [16:32:47] hewiki, itwiki, huwiki as well [16:32:54] When did we freeze wmf9? [16:33:05] last week i think [16:33:06] e.g. no JS on wikidata and all the wikis.... nothing to do with us but obviously highest priority [16:33:09] * hoo wonders whether the client dialog will be live [16:33:14] last monday [16:33:22] hoo: it's still experimental [16:33:46] aude: Yep, I have some things I wanted to improve, but I really didn't find the time yet [16:33:47] soon as we can polish it up a little bit, then maybe 2 weeks after we launch to all the wikiepdias [16:33:50] sure [16:34:00] * aude busy too but willing to poke at the code [16:34:35] anyway, my watchlist looks fine with enhanced changes now on itwiki [16:35:25] aude: do we have a sites table on enwiki now? [16:35:43] yes, although we'll double check [16:35:49] ok [16:36:03] js down, that's the big moment for the non-js view. yeah! 
:-) [16:36:14] DanielK_WMDE: speaking of which, i'd like to talk with dab sometime and see about the sites table views being added on toolserver [16:36:35] aude: just file a ticket on jira [16:36:35] or a way to see a copy of the db and check stuff like this [16:36:41] mmmm ok [16:36:56] it helped to see the sites table for wikidata to see minwiki was indeed there [16:37:50] it would also be cool to have an api for the sites table data [16:38:00] that would be another way to poke and verify stuff [16:38:06] true [16:38:12] though it would give you the memcached data [16:38:20] true [16:38:52] re [16:39:20] Lydia_WMDE: so there is no JS on wikidata and all the wikipedias [16:39:32] nothing to do with us, but a wmf-wide issue at the moment [16:39:37] aude: -.- [16:39:38] ok [16:39:43] thx [16:39:49] deployment status? [16:40:04] Lydia_WMDE: once this more critical issue is resolved.... [16:40:13] right [16:40:13] ok [16:40:25] we updated the code that's on it / he /hu wiki and i confirm the watchlist and rc look fine with enhanced changes [16:40:34] e.g. nothing there that shouldn't be [16:41:07] i also want us to check the dispatching changes stuff before enabling enwiki [16:41:30] it's working fine on he / hu / it as far as i can tell but perhaps not on test2 [16:41:52] * aude suspects the sites table needs updating and cache kicked there [16:43:18] Lydia_WMDE: you broke en! [16:43:32] PinkAmpersand: you can't prove it! :D [16:43:39] PinkAmpersand: what's broken? [16:43:51] PinkAmpersand: the JS is broken but nothing to do with wikidata [16:44:04] still waiting to deploy on enwiki :) [16:44:34] New review: Daniel Kinzler; "Patch Set 1:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/48612 [16:46:04] JS seems to be back (and I almost reverted Lydia_WMDE due to the CSS suddenly reappearing) [16:47:00] oh, good [16:47:37] although if you get bits from amsterdam then you might not have the same problem as folks that get bits from eqaid [16:48:01] still very much broken for me [16:48:19] I don't use any proxy atm, dunnp [16:48:33] ah [16:49:22] * aude sees nostalgia skin on test2 wikipedia :D [16:49:33] everyone is using the same bits apaches [16:49:38] ah, ok [16:49:49] i suppose it depends on stuff in browser cache [16:49:58] * aude disabled browser cache [16:52:30] my South Korean public proxy is dead as it seems [16:52:34] * hoo heads of for Pizza [16:56:58] Lydia_WMDE: idk the folks at VPT are blaming y'all [16:57:19] PinkAmpersand: yes i saw - already commented there [16:58:07] PinkAmpersand: if you see it anywhere else please let me know quickly [16:58:11] ahh sorry. i've been drafting a response to my first XfD'ed page [16:58:28] i'd like to kick this as soon as possible so people don't get any wrong ideas [16:58:33] bad enough as is... [16:59:11] yikes [16:59:17] we are not deploying before the bits issue is resolved i reckon? [16:59:21] aude: ? [16:59:22] Denny_WMDE: right [16:59:33] obviously reedy, etc. 
are busy [16:59:38] that is good [16:59:53] yes, first resolve one, then the other [17:00:00] but we do have updated code on it / he /hu wiki and the watchlist looks good with enhanced changes [17:00:27] the dispatching is fine , although it seems to have stopped on test2 since yesterday [17:00:48] * aude would like to know why (probably minwiki is still missing there) [17:00:49] I'm not so busy ;) [17:01:01] ah, okay but still we don't deploy when stuff is broken [17:01:04] ^ [17:01:06] still, i guess we shouldn't deploy while the bits are hiccupping [17:01:09] I was just going to say the same [17:01:21] if you want to look into dispatching changes on test2, that would be great [17:01:28] does test2 have minwiki in the sites table? [17:01:56] * aude sees my wikidata changes in my watchlist on the wikipedias, except test2 [17:02:32] it would be nice to still have test2 for testing rather than have to "vandalise" some item that is on itwiki, for example [17:02:39] Yeah, we can do that in the meantime [17:02:48] * aude trying to verify https://bugzilla.wikimedia.org/44225 [17:04:54] test2wiki db: [17:05:01] | 786 | minwiki | mediawiki | wikipedia | local | min | http | gro.aidepikiw.nim. | a:1:{s:5:"paths";a:2:{s:9:"file_path";s:29:"http://min.wikipedia.org/w/$1";s:9:"page_path";s:32:"http://min.wikipedia.org/wiki/$1";}} | 0 | a:0:{} [17:05:01] hmmm [17:05:10] * aude tries to link to minwiki on test2 [17:05:43] > var_dump( Sites::singleton()->getSite( 'minwiki' ) ); [17:05:50] ^ gives valid info on test2wiki itself [17:05:56] ok [17:06:11] * aude checks our settings [17:07:26] 786 doesnt sound right [17:07:34] oh, that is not right [17:07:34] it's 790 on enwiki, no? [17:07:38] nah, that's the id [17:07:43] we also have new wikivoyages [17:08:00] wikivoyages might be 787, 788, 789, 790 [17:08:01] 443 on enwiki [17:08:07] ok [17:08:15] huh? [17:08:22] i thought it would be the newest on both, and 790 on them [17:08:24] 443 what [17:08:25] nmd [17:08:28] | 443 | minwiki | mediawiki | wikipedia | local | min | http | gro.aidepikiw.nim. | a:1:{s:5:"paths";a:2:{s:9:"file_path";s:29:"http://min.wikipedia.org/w/$1";s:9:"page_path";s:32:"http://min.wikipedia.org/wiki/$1";}} | 0 | a:0:{} | [17:08:39] Denny_WMDE: i think the script does them by site group [17:08:45] ok [17:08:48] it will populate all the wikipedias, then the wikibooks, etc. [17:09:00] enwiki should be fine, then [17:09:39] 786 on itwiki too [17:09:46] doesnt explain why test2 doesnt work, still [17:09:48] that's fine [17:09:56] aude: it does them in the order thy come from the site matrix api query. [17:10:02] so i can't add more siteslinks for wikidata [17:10:12] DanielK_WMDE: ok, and maybe it's by group? [17:10:13] * DanielK_WMDE thinks we should kill the internal id for sites [17:10:14] or something [17:10:17] aude: maybe [17:10:19] agree [17:10:30] probably "or something" [17:10:30] it should use the global ids [17:10:35] yes [17:10:36] have a mapping of global -> local id [17:10:38] We still use language codes as identifiers in JS, btw [17:10:46] right [17:10:51] hoo: ugh?! [17:11:00] DanielK_WMDE: Yep, that's super ugly [17:11:02] i'm sure they are used in tons of places [17:11:24] I hacked a function to work with global ids though [17:11:37] but still the object is organized with langcodes [17:11:42] well, language codes can be used as "local identifiers" (aka interwiki prefixes). that's fine. 
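Two things come together in the exchange above: the populate script takes sites in whatever order the site matrix API returns them, and Wikibase is expected to key things on global site IDs rather than internal row IDs. A sketch of pulling that list, assuming the SiteMatrix module on Meta; it just reports where a given global ID (minwiki here) lands in the ordering.

    import requests

    META_API = "https://meta.wikimedia.org/w/api.php"

    def wikipedia_global_ids():
        """Global site IDs (dbnames) of all Wikipedias, in site-matrix order."""
        r = requests.get(META_API, params={"action": "sitematrix", "format": "json"})
        matrix = r.json()["sitematrix"]
        ids = []
        for key, group in matrix.items():
            if key in ("count", "specials"):
                continue
            for site in group.get("site", []):
                if site.get("code") == "wiki":     # the Wikipedia entry for that language
                    ids.append(site["dbname"])
        return ids

    ids = wikipedia_global_ids()
    position = ids.index("minwiki") if "minwiki" in ids else None
    print(len(ids), "Wikipedias; minwiki at position", position)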
[17:12:02] DanielK_WMDE: The wbSites object [17:12:03] ...at least for core [17:12:13] in Wikibase, we should use the global ids [17:12:49] Finger of blame might be heading this way [17:12:55] aude: ok, I'm off fixing dinner etc. ping me via sms or mail when deployment starts. [17:12:58] :o [17:13:04] Reedy: o_O [17:13:08] mw.config.get( 'wbSiteDetails' ); [17:13:25] Problems seemingly started after l10nupdate and the sites/wikibase pushes [17:13:26] * aude checks the village pump on enwiki :) [17:14:30] is there a place where the discussion is going on? i dont see anything in -tech or -operations? [17:14:42] Reedy: there's a custom module providing the list of sites to JS. ask Danwe about it. [17:14:47] that's the only connection i can see [17:14:54] and it should only be used where wikibase is enabled. [17:15:49] http://en.wikipedia.org/wiki/Wikipedia:Village_pump_(technical)#Page_loading_issues :/ [17:18:14] I do love the uninformed making comments [17:18:25] :D [17:18:31] one of my favourite hobbies [17:19:00] happy belated birthday Reedy btw! [17:19:23] Reedy: so http://test2.wikipedia.org/wiki/Jakarta [17:19:35] it has a link on wikidata to min wiki but it does not appear in test2 [17:19:40] in the sidebar [17:19:51] oh, nevermind... [17:19:56] * aude sees a bug  [17:21:29] Might also be mobiles fault [17:21:45] aude: link to min is there [17:21:51] it is at the end, though [17:22:08] aude says this needs fixing before ;) [17:22:16] more cake!!! [17:22:23] one sec [17:27:43] New review: Anja Jentzsch; "Patch Set 2: Verified+2 Code-Review+2" [mediawiki/extensions/Wikibase] (mw1.21-wmf9); V: 2 C: 2; - https://gerrit.wikimedia.org/r/48641 [17:27:43] Change merged: Anja Jentzsch; [mediawiki/extensions/Wikibase] (mw1.21-wmf9) - https://gerrit.wikimedia.org/r/48641 [17:29:33] The parrot in my browser window, is that also due to problems with bits? [17:31:38] Reedy: https://gerrit.wikimedia.org/r/#/c/48643/ (sorry!) [17:31:51] that puts minwiki in correct sort order [17:32:00] Roan is currently reverting that code update [17:32:07] oh [17:32:09] So we shall wait and see.. [17:32:12] hrm [17:32:17] https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/extensions/Wikibase.git;a=commit;h=f10bfdc25c70d7e7fb8c3d0e86b350845c2a32ae https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/extensions/Wikibase.git;a=commit;h=fe4cef1047c9cb4aeceb62e215db8bdb861875b3 [17:32:28] Neither of those have anything obviously relevant (ie what was newly deployed) [17:35:19] * aude confirms, despite no JS, the wikibase code updates looked fine on the wikiepdais [17:35:23] wikipedia [17:35:32] slightly confused [17:35:58] It's a big stabbing around in the dark contest atm it seems [17:36:04] sure [17:37:16] [17:36:53] load went down before my rollback [17:37:17] [17:37:00] So I think mark did it [17:37:21] Pointing at mobile again [17:37:30] ah [17:37:57] so we can rollback the rollback... [17:38:09] is there a habit of writing postmortems after events like these? [17:38:21] Denny_WMDE: i think so [17:38:30] good. looking fwd to see this one [17:41:24] Yeah, for bigger outages, they usually happen [17:42:30] i bet it's the update to gerrit [17:42:44] Poor Chad [17:42:53] or the fact that we are popeless. this leads to a few dangling references, i guess. 
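A way to automate the check aude keeps doing by hand around here, whether Wikidata edits are actually arriving on a given client wiki, is to ask that wiki's recent changes for external entries. The assumption in this sketch is that Wikibase-injected changes are exposed with RC type "external"; if dispatching to the wiki is stuck, the list simply stays empty or stale.

    import requests

    def latest_wikidata_changes(client_api, limit=5):
        """Most recent 'external' recent-changes entries on a client wiki."""
        r = requests.get(client_api, params={
            "action": "query",
            "list": "recentchanges",
            "rctype": "external",
            "rcprop": "title|timestamp|comment",
            "rclimit": limit,
            "format": "json",
        })
        return r.json()["query"]["recentchanges"]

    for rc in latest_wikidata_changes("https://test2.wikipedia.org/w/api.php"):
        print(rc["timestamp"], rc["title"], rc.get("comment", ""))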
[17:43:11] itwiki looks better [17:43:15] js and all [17:44:09] so i see my wikidata edits to jakarta on itwiki watchlist [17:44:15] checking test2 [17:44:21] pretty fast, btw [17:44:39] good [17:44:43] nothing on test2, although it could take time [17:45:16] * aude suspects dispatching changes broken for test2, for some reason [17:45:45] Do we log the dispatcher? [17:45:49] we do [17:45:54] dispatcher.log [17:45:57] I'll have a look and see if it shows anything interesting [17:47:10] it should say something like Posted 2 changes to test2wiki [17:47:34] hmm, where is said log.. [17:47:48] * aude finds the puppet config [17:48:38] /var/log/wikidata/dispatcher.log [17:49:06] oh, on hume presumably? [17:49:13] on hume or whatever is doing cron jobs now [17:49:21] yup [17:49:40] brb then I'll have a look [17:49:51] ok [17:53:24] right [17:54:03] 17:53:41 Posted 1 changes to test2wiki [17:54:03] 17:53:42 Processing changes for test2wiki [17:54:03] 17:53:42 Posted 0 changes to test2wiki [17:54:03] 17:53:47 Processing changes for test2wiki [17:54:03] 17:53:48 Posted 0 changes to test2wiki [17:54:03] 17:53:50 Processing changes for test2wiki [17:54:05] 17:53:50 Posted 0 changes to test2wiki [17:54:08] oh, ok [17:54:13] it found something [17:54:38] reedy@hume:/var/log/wikidata$ grep test2wiki dispatcher.log -c [17:54:39] 12566 [17:54:39] reedy@hume:/var/log/wikidata$ grep itwiki dispatcher.log -c [17:54:39] 12874 [17:54:52] it would look to be doing a correct number of updates [17:57:03] and how's the job queue? [17:57:21] last thing i see that is wikidata is from yesterday [17:57:22] (diff | hist) . . New York City (Q60); 20:49 . . Aude (Talk | contribs) (Language link added: an:Nueva York) [17:57:46] i since did 2 more edits to that item, and 2 to jakarta and some other edits that should appear [17:58:17] and they did appear on it / hu / he [18:00:58] giddity giddity, so, is it time to switch the wicked wikidata on? [18:01:02] reedy@fenari:/home/wikipedia/log$ mwscript showJobs.php test2wiki [18:01:02] 79166 [18:01:06] That looks suspect :) [18:01:29] it, hu and he are all < 100 [18:01:38] what cluster is itwiki on? [18:01:45] do jobs run by cluster or something? [18:02:05] found itwiki -- s2 [18:02:05] itwiki is s2 [18:02:19] he and huwiki are s7 [18:02:22] test2 is s3 [18:02:34] * aude no idea if clusters matter for job queue [18:02:37] Oldest job is [18:02:38] 20130201185823 [18:02:41] ChangeNotification [18:03:00] sounds "wrong" ... sounds like the problem [18:03:19] mysql:wikiadmin@db1019 [test2wiki]> select job_cmd, count(job_cmd) from job group by job_cmd; [18:03:20] +--------------------+----------------+ [18:03:20] | job_cmd | count(job_cmd) | [18:03:20] +--------------------+----------------+ [18:03:20] | ChangeNotification | 143134 | [18:03:21] | webVideoTranscode | 2 | [18:03:23] now that test2 has minwiki ( i believe we kicked the cache ) [18:03:24] +--------------------+----------------+ [18:03:26] Very suspect :D [18:03:30] can we restart the job runner for test2? [18:03:48] that's a lot of jobs for test2 :o [18:03:49] Ah [18:03:52] I think they're failing [18:04:06] nothing in the error logs for test2? 
[18:04:06] http://p.defau.lt/?d6ymw6SLR7a9BPADLp0HVA [18:04:12] * aude click [18:04:24] * Reedy runs jobs on fenari [18:04:36] oh, ugh [18:04:44] failed to load revision [18:04:49] the page has been deleted [18:04:56] http://www.wikidata.org/wiki/Q1056299 [18:05:00] oh, not good [18:05:12] indeed [18:05:21] test2 behaves as enwiki [18:05:26] Failed to load revision 6341950 of q1252879 [18:05:37] * aude goes to view the deleted item [18:06:13] Should I log a bug to make that fail a bit more cleanly? [18:06:13] i assume if the item is deleted, the revision is inaccessible to the script, which is good [18:06:37] yes, thanks, Reedy. or not to fail at all and deal with it appropriately :) [18:06:50] it had just one link to huwiki,and [18:07:03] deleted at 15:17 today [18:07:27] doesn't match the last change notification unless there are more such errors [18:07:42] and huwiki job runner seems fine [18:10:22] Wikibase should be back on the previously deployed revision [18:10:32] * aude poking at the code [18:11:38] previous, as in when? [18:11:49] what I deployed for you earlier this afternoon [18:12:00] ah, good [18:12:01] before roan rolled back earlier [18:14:11] checking my watchlists [18:14:36] < 72000 now [18:14:45] hmmm, better [18:16:27] watchlist looks good [18:16:51] Shall we bring in that extra minwiki change? [18:17:14] sure [18:18:47] https://gerrit.wikimedia.org/r/48648 [18:19:22] ah. [18:19:24] try again [18:20:45] Gerrit is being a little too clever [18:21:34] reedy@fenari:/home/wikipedia/log$ mwscript showJobs.php test2wiki [18:21:35] 61645 [18:21:49] hmmm. ok [18:22:04] they're going down pretty quickly [18:22:16] i've still got the manual runner on fenari [18:27:11] * aude waits to see if gerrit likes https://gerrit.wikimedia.org/r/#/c/48651/ [18:27:31] ah, nevermind :) [18:27:45] Submit Type Merge if Necessary [18:27:46] Status Review in Progress [18:27:46] Can Merge Yes [18:27:59] * aude confused by gerrit sometimes [18:28:42] * aude just waits [18:29:46] reedy@fenari:/home/wikipedia/common$ mwscript showJobs.php test2wiki [18:29:46] 48994 [18:30:36] https://bugzilla.wikimedia.org/44911 <-- made a bug ticket for this [18:31:03] thx aude [18:31:08] Hahas [18:31:14] I thought I had done it [18:31:25] didn't see one [18:31:26] but internet fail and chrome timed out [18:32:40] Oooh [18:32:49] Not sure if all of these are your faults.. [18:32:59] Is there allready a interwiki shortcut for data? (de:wp... e.f.) 
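Reedy reads the backlog above with showJobs.php and a direct GROUP BY on the job table; without shell access, the total queue length is also visible through siprop=statistics. A minimal sketch: it only gives the overall number, not the per-type breakdown that identified ChangeNotification as the culprit.

    import requests

    def job_queue_length(api_url):
        r = requests.get(api_url, params={
            "action": "query",
            "meta": "siteinfo",
            "siprop": "statistics",
            "format": "json",
        })
        return r.json()["query"]["statistics"]["jobs"]

    # Watching test2's backlog drain, as in the log above:
    print("test2wiki jobs:", job_queue_length("https://test2.wikipedia.org/w/api.php"))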
[18:33:22] hmmm [18:33:43] Warning: call_user_func_array() expects parameter 1 to be a valid callback, class 'Wikibase\ClientHooks' does not have a method 'onWikibaseDefaultSettings' in /usr/local/apache/common-local/php-1.21wmf9/includes/Hooks.php on line 256 [18:33:43] Fatal error: Call to a member function getPageLanguage() on a non-object in /usr/local/apache/common-local/php-1.21wmf9/includes/OutputPage.php on line 1777 [18:33:43] Warning: call_user_func_array() expects parameter 1 to be a valid callback, class 'Wikibase\LibHooks' does not have a method 'onWikibaseDefaultSettings' in /usr/local/apache/common-local/php-1.21wmf9/includes/Hooks.php on line 2 [18:33:43] Warning: include_once(/usr/local/apache/common-local/php-1.21wmf9/extensions/Wikibase/lib/config/WikibaseLib.default.php) [function.include-once]: failed to open stream: No such file or directory in /usr/local/apache/common-local/php-1.21wmf9/extensions/Wikibase/lib/WikibaseLib.php on line 274 [18:33:43] Warning: include_once() [function.include]: Failed opening '/usr/local/apache/common-local/php-1.21wmf9/extensions/Wikibase/lib/config/WikibaseLib.default.php' for inclusion (include_path='/usr/local/apache/common-local/php-1.21wmf9/extensions/TimedMediaHandler/handlers/OggHandler/PEAR/File_Ogg:/usr/local/apache/common-local/php-1.21wmf9:/usr/local/lib/php:/usr/share/php') in /usr/local/apache/common- [18:33:43] local/php-1.21wmf9/extensions/Wikibase/lib/WikibaseLib.php on line 274 [18:33:45] Fatal error: Call to a member function disable() on a non-object in /usr/local/apache/common-local/php-1.21wmf9/includes/GlobalFunctions.php on line 2160 [18:33:47] o.o [18:33:52] oh [18:34:09] ugh, really [18:34:23] One in the middle might just be file permission fail [18:35:00] OH SHIT# [18:35:23] the settings thing shouldn't bet there [18:36:02] I updated it to master master... [18:36:11] oh, we want the branch [18:36:19] yeah, just reverting now [18:36:47] yeah, that is brand new stuff from the weekend that is still experimental [18:36:50] won't work :o [18:36:50] heh [18:37:29] Can you do a proper update revision for me? [18:37:35] Just going to check these fatals [18:38:05] ? [18:38:37] Can you update wikibase in wmf9 to whatever revision you actually wanted? [18:38:42] Not what I updated it to... [18:38:43] sure [18:39:13] DanielK_WMDE: killed my first Article: https://gerrit.wikimedia.org/r/#/c/48612/ R.I.P. [18:40:41] reedy@fenari:/home/wikipedia/common$ mwscript showJobs.php test2wiki [18:40:41] 16577 [18:40:56] https://gerrit.wikimedia.org/r/#/c/48654/ [18:41:51] Will wait for those errors to filter out of the logs [18:41:55] Will confirm it was just me failing [18:42:13] ok [18:42:32] interwiki: we have d ^^ :) [18:42:32] That's deployed [18:45:52] hmm, minwiki is still last, although it's not critical [18:46:59] please tell me when it's good to publish blog post and so on [18:47:07] Lydia_WMDE: ok [18:47:09] not quite yet [18:47:12] k [18:48:00] sorry disconnected [18:48:02] reedy@fenari:/home/wikipedia/common$ mwscript showJobs.php test2wiki [18:48:02] 1 [18:48:06] nice [18:48:07] One problem dealt with for now [18:48:32] the stuff is in my test2 watchlist now [18:49:04] Ohh [18:49:18] ? [18:49:43] Now minwiki should be sorted properly [18:49:58] newer git is more confusing with its output about submodules [18:50:11] * aude happy! 
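For context on the fatals pasted above: the wmf9 checkout was briefly pointed at master, which expects config/WikibaseLib.default.php, a file the deployment branch does not have, and the failed include cascaded into the other errors. A defensive guard like the following is a sketch only, not how the entry point is actually written, but it shows how such a mismatch could degrade to a logged warning rather than a fatal chain:

    // Hypothetical guard around the default-settings include in the extension
    // entry point: a checkout that lacks the file (e.g. an older deployment
    // branch) logs the problem instead of triggering a chain of fatals.
    $defaults = __DIR__ . '/config/WikibaseLib.default.php';
    if ( is_readable( $defaults ) ) {
        include_once $defaults;
    } elseif ( function_exists( 'wfDebugLog' ) ) {
        wfDebugLog( 'WikibaseLib', "Default settings file missing: $defaults" );
    }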
[18:50:17] http://test2.wikipedia.org/wiki/Jakarta?action=purge looks good [18:50:28] next question is whether the AFT deploy is doing anything [18:50:33] sure [18:51:18] Matthais apparently isn't online [18:51:19] Nice time... cu [18:51:22] * Reedy grins [18:52:12] i think we just keep an eye on the job queues, if that error happens again [18:52:29] yeah, I'll check on it in a few hours [18:52:40] i don't see an obvious fix for why it crashed but daniel can look at it [18:52:54] * aude needs to dig into that part of the code more, and maybe can tonight [18:53:00] DanielK_WMDE: ^ fyi [18:53:17] error logs are clear again, so it was my deploy of the wrong code [18:53:25] ok [18:54:19] Sooo, enwiki [18:54:51] i'm happy with test2 and the other wikipedias, so think we are good [18:55:17] good to post for me then? [18:55:23] Lydia_WMDE: no [18:55:27] Lydia_WMDE: well, it's not enabled yet ;) [18:55:32] ohhh [18:55:32] ok [18:55:35] let's wait and see it works :) [18:55:38] sure [18:55:41] i thought it was [18:55:54] Lydia_WMDE: we're ready but not done yet [18:55:59] k [18:56:07] i'll go get food then - back in 5 [18:56:25] ok [18:57:19] wtf... [18:57:21] https://www.wikidata.org/wiki/Special:Undelete/Q4488843 [18:57:29] how in hell did that happen... [18:57:40] huh [18:57:46] thats my ip [18:57:50] and im logged in [18:57:53] oh no [18:57:53] o.O [18:58:12] Memcached failures can cause that [18:58:20] * hoo hides just for using the evil word [18:58:25] * aude can't see it as non-admin [18:58:36] aude, Wiki13: as Danwe pointed out, when switching between www.wikidata.org and plain wikidata.org, cookies are lost... [18:58:40] can take a look after we handle enwiki [18:58:40] aude: Just two random IP edits [18:58:48] ah [18:58:53] hoo: nope, it is my IP... [18:59:00] and Im logged in right now [18:59:01] if you are viewing the site as wikidata.org, but accessing the api as www.wikidata.org, you are editing as an anon [18:59:03] that sucks... [18:59:05] didnt do anything for it [18:59:25] DanielK_WMDE: That's more likely, yes... I've only ever noticed a buggy memcached on WMF once [18:59:27] yea DanielK_WMDE [18:59:31] Wiki13: you are logged in... what's the domain you see? with or without www? [18:59:38] with [18:59:43] Wiki13: try without [18:59:46] k [18:59:47] still logged in? [18:59:54] aude: Good to go now? ;) [19:00:02] Reedy: sure [19:00:16] aude: i'm afk again in a minute, just dropping by [19:00:18] * aude has "vandalised" Q60 to see that the change appears [19:00:19] ping me if you need me [19:00:30] DanielK_WMDE: i think we are good, though note one bug about change notification [19:00:42] doesn't seem a blocker though but we'll keep an eye on the job queue [19:01:09] DanielK_WMDE: https://bugzilla.wikimedia.org/show_bug.cgi?id=44911 [19:01:28] yay [19:01:32] And we should be live... [19:01:56] https://en.wikipedia.org/wiki/Wikidata [19:02:02] Edit links look weird [19:02:52] * aude purging page [19:03:36] * aude sees [19:04:11] Failed to load resource: the server responded with a status of 503 (Service Unavailable) http://bits.wikimedia.org/en.wikipedia.org/load.php?debug=false&lang=en&mod…ttingStarted%7Cwikibase.client.init&skin=vector&version=20130212T161905Z&* [19:04:14] huh? [19:04:39] /* No modules requested.
Max made me put this here */ [19:04:49] GET http://bits.wikimedia.org/en.wikipedia.org/load.php?debug=false&lang=en&mod…GettingStarted%7Cwbclient.watchlist&skin=vector&version=20130212T190144Z&* 503 (Service Unavailable) load.php:152 [19:04:58] They're making noise again [19:05:03] oh [19:06:02] edit links ok [19:06:08] it's quite slow at the moment to load a page [19:06:15] for a while... [19:06:37] \o/ [19:06:41] i'm sure this has been asked elsewhere (and probably here), but when will it be possible to add values in other languages than your interface language? (e.g. when the item you want doesn't have a label in your language) [19:08:32] Here it looks ok: https://en.wikipedia.org/wiki/The_Hague [19:09:16] Request: POST http://en.wikipedia.org/w/index.php?title=New_York_City&action=submit, from 69.164.222.250 via cp1006.eqiad.wmnet (squid/2.7.STABLE9) to 10.64.0.138 (10.64.0.138) [19:09:20] Error: ERR_READ_TIMEOUT, errno [No Error] at Tue, 12 Feb 2013 19:08:53 GMT [19:09:24] not cool [19:13:45] disabled? [19:13:51] yep [19:14:11] i can't say it's wikibase that's an issue but happened at the same time [19:14:39] shouldn't be a problem, but maybe mixed with other factors, enwiki is not liking it at the moment :( [19:18:16] * aude still unable to save [[en:New York City]] on enwiki [19:18:45] aude: ah, changes to deleted pages/revisions. they fail to load. should be handled gracefully, with a wfDebug [19:18:59] i can look into that later. or did anyone already do that? [19:19:03] ok [19:19:16] * aude didn't really look at it in depth [19:19:42] Jhs: soon, that feature is already in testing. [19:19:51] DanielK_WMDE, awesome :) [19:19:53] It seems very co-incidental [19:20:01] aude: what i said above is just a guess, but it sure looks like it. [19:20:15] make sense [19:20:31] * aude able to save http://en.wikipedia.org/wiki/Computable_topology [19:20:35] it's a smaller page [19:20:44] so... what's slow? page rendering? [19:21:02] * DanielK_WMDE is only here for a couple of minutes, unless enwiki falls over [19:21:04] saving [19:21:11] purging, too? [19:21:12] sure [19:21:14] preview, too? [19:21:17] everything is slow for me all across Wikimedia. not just wikidata [19:21:18] wikidata is off for enwiki [19:21:20] still problems [19:21:32] hm. cold caches? [19:21:38] * aude tries http://en.wikipedia.org/w/index.php?title=New%20York%20City&action=purge [19:22:00] slowhohoho... [19:22:26] but wikibase isn't enabled on enwiki? so it can't really be our fault, can it? [19:22:30] right [19:22:35] i think it's a coincidence [19:22:37] ok, the page purged and loaded now. [19:22:54] but regardless, we already got blamed for JS / CSS being broken WMF wide [19:22:58] [19:23:03] cause it was a coincidence and people suspected us [19:23:35] [19:23:45] lols [19:23:47] although, a big page like this may always be slow anyway [19:23:48] hm, total render time would be useful output too, in the html. [19:23:59] bazillion templates, etc [19:24:28] Preprocessor visited node count: 262644/1000000 [19:24:31] ugh :P [19:24:34] yes [19:25:11] On the larger ones, it wouldn't suprise me if they were borderline already.. And wikidata juts tips it over slightly [19:25:18] could be [19:25:30] * aude tries the same page on itwiki again [19:25:36] Lydia_WMDE: we might need cake, got some ingredients at home? [19:25:50] AnjaJ_WMDE: nope -.- [19:26:19] alright, no wikidata on enwiki tonight :( [19:26:37] too many issues, regardless of wikidata [19:27:04] [[Ted Poe]] is reasonably quick: [19:27:11] anyway. 
afk again [19:27:35] yeah, that's a small page [19:27:51] argh... [19:27:52] Wikibase doesn't even get a mention in the combined profiling stats [19:27:58] this is starting to make us look like fools [19:27:59] [19:28:05] for [[it:New York]] [19:28:10] on action=purge [19:28:24] that's a huge difference [19:28:40] ok so i get to send out more "not today" notes now? [19:28:46] Lydia_WMDE: yes :( [19:28:47] It still feels highly co-incidental [19:28:50] people understand [19:28:59] with the other problems with CSS / JS [19:29:13] that it's best not to try to add something else new into the mix [19:29:22] until ops really understand what the problem was / is [19:29:36] yes i understand but.. argh [19:29:38] oh well [19:29:43] nothing to be done now i guess [19:29:45] if we deploy and theres till are coincidental issues, then we get blamed [19:29:50] * Lydia_WMDE goes and does her stuff [19:29:54] Lydia_WMDE: ok :/ [19:30:38] oh *crap*, my removing the links did save [19:30:55] * aude reverting [19:31:46] trying to load old revision of New York City but that's taking fooooooooooooorrrrrrrrreeeeeeeevvvvvvvver [19:32:18] Just use revert? [19:32:34] i can try [19:33:33] ok, that took a long time but did work [19:34:01] heh [19:34:44] trying to load an old revision again [19:34:50] just see if it's possible or not [19:35:10] 75% of all requests hitting it are for mobile.startup [19:35:17] ok, it worked [19:35:19] finally [19:38:10] aude: so, it's off for today? [19:38:16] [19:36:27] look at 327 modules=mobile.startup..&version=$VERSION requests hitting a single bits apache in 20 sec, there are 158 different $VERSION numbers [19:38:16] [19:36:51] so, that's pretty broken [19:38:18] DanielK_WMDE: yes [19:38:22] seems like i'll be staying in berlin tomorrow night, then :/ [19:38:23] too many other issues [19:38:29] was planning to go back home in the evening [19:38:31] oh, well [19:38:35] assuming rob is happy to try again [19:38:45] we need to see that the other issues are totally resolved [19:38:51] and they understand what went wrong [19:38:59] I can't see why not, if mobile fixes the issues that Asher is pointing out [19:39:03] * aude thinks it's likely we can try again but can't promise [19:39:07] right [19:39:30] so, we blame mobile for now? [19:39:31] [[New York City]] is always a bit slow but it seems quite excessive tonight [19:39:37] seems so :o [19:39:41] k [19:40:38] * aude wants to go to the English pub [19:40:40] reedy@fenari:/home/wikipedia/common$ mwscript showJobs.php test2wiki [19:40:40] 182 [19:40:43] once we deploy successfully [19:40:45] That looks more sensible [19:40:47] Reedy: looks fine [19:47:18] what's up with our non-mainspace diffs? [19:53:07] PinkAmpersand: on wikidata.org? [19:53:24] yeah [19:53:41] looks fine to me: https://www.wikidata.org/w/index.php?title=Wikidata%3AMain_Page&diff=5777586&oldid=5498506 [19:54:07] could be the bits? is css just formatted wrong or something [19:54:23] h. me too now. had it displaying w/o the color highlighting a second back [19:54:29] *oh [19:54:40] yeah, that's bis related [19:54:42] *bits [19:55:26] seems so [19:55:28] DanielK_WMDE: haha bonus points for using a diff of an edit i made [19:55:42] try a hard control+R or whatever to clear your cache [19:55:52] maybe append a "&debug=true" [19:56:49] ok. [20:00:11] Reedy, aude: when a wikibase client gets added, the client gets swamped with change notifications for the last couple of days. we could fake an entry in wb_changes_dispatch to avoid this, if you like. 
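DanielK_WMDE's suggestion just above — faking an entry in wb_changes_dispatch so a freshly added client is not flooded with days of old change notifications — amounts to telling the dispatcher that everything up to the newest change has already been posted. A rough sketch of that idea; the chd_* column names are assumptions about the dispatch bookkeeping table rather than verified schema, so treat this as illustration only:

    // Hypothetical pre-seeding of the dispatch state for a new client wiki so
    // the dispatcher starts at the newest change instead of replaying backlog.
    // Table layout (chd_* columns) is assumed, not verified.
    $dbw = wfGetDB( DB_MASTER );
    $newest = (int)$dbw->selectField( 'wb_changes', 'MAX(change_id)', '', __METHOD__ );
    $dbw->insert(
        'wb_changes_dispatch',
        array(
            'chd_site'     => 'enwiki',          // global id of the new client
            'chd_db'       => 'enwiki',
            'chd_seen'     => $newest,           // pretend this was already dispatched
            'chd_touched'  => $dbw->timestamp(),
            'chd_lock'     => null,
            'chd_disabled' => 0,
        ),
        __METHOD__
    );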
[20:00:29] that would be a good thing to do [20:00:47] * aude wonders if enwiki is still being notified despite being off [20:00:52] we could already do that now, except we'd have to guess the change ID we want it to start processing at.- [20:00:52] actually [20:01:00] * aude checks the configs [20:01:09] oh.... [20:01:13] the repo wouldn't know [20:01:18] unless we disable that line in the config [20:01:59] aude, Reedy: if enwiki has ChangeNotificationJobs in the queue, and the client extension is turned off, it will fail to instantiate these jobs, because it doesn't know the class. [20:02:06] i hope the job runner is robust enough to handle this [20:02:06] heh [20:02:07] true [20:02:08] let me see [20:02:08] but [20:02:22] aude: can we test what happens when that happensß [20:02:24] ? [20:02:31] | ChangeNotification | 11323 | [20:02:36] Shall I just delete them? [20:02:37] \o/ [20:02:42] Reedy: yes. [20:03:01] done [20:03:06] config patch coming [20:03:54] https://gerrit.wikimedia.org/r/#/c/48668/ [20:05:43] ahh [20:05:48] aude: good call. let's remember to undo that tomorrow, though! [20:05:58] sure [20:06:05] we could re-enable the extension, poke around [20:06:10] and then if good, re-enable the jobs [20:06:17] the jobs aren't 100% necessary [20:06:21] true [20:06:26] but very good to have [20:06:41] yes, but we can prod and test without them [20:06:46] yep [20:07:08] stuff doesn't go into the rc table then [20:07:13] right [20:07:18] it's not horrible [20:07:32] although very nice to be notified in recentchanges [20:07:58] after all, almost all the links will just appear and never be seen in recent changes [20:08:15] since they've been added long ago to wikidata and the changes pruned [20:10:15] mobile should've just fixed the broken ness [20:10:21] Shall we try again? ;) [20:11:18] aude, Reedy: -->#wikimedia-tech [20:11:49] we could easily drop the respective entries from recentchanges [20:11:55] they have a special rc_type [20:13:17] DanielK_WMDE: link? [20:13:19] https://test2.wikipedia.org/w/index.php?title=Special:RecentChanges&hidewikidata=0 looks fine [20:13:34] unless there are some entries in enwiki [20:13:40] * aude would drop those [20:14:40] i could be completely wrong, https://www.wikidata.org/wiki/Wikidata:Administrators/Confirm_2013/5#Bin.C3.A1ris_.28talk_.E2.80.A2_contribs_.E2.80.A2_log_actions.29 [20:14:47] that seems kind of suspicious? [20:18:10] Moe_Epsilon: suspicious how? [20:19:35] DanielK_WMDE: the candidate is active daily on the wiki that is basing all his support on from the past couple hours, new users and IPs supporting [20:20:10] DanielK_WMDE: I'm off, cya tomorrow [20:20:16] not saying that they are wrong or they have no right to vote, but it's a lot of similarity coming from the same group of hu.wiki users [20:20:22] Moe_Epsilon: don't know about that, but he *is* well known. assuming no impersonation is going on. [20:20:34] Abraham_WMDE: cu [20:20:35] AFK for 15-20 [20:21:31] DanielK_WMDE: I was suspecting more along the lines of possibly canvassing a vote on hu.wiki :s I have no evidence of that but when users are joining to support en mass [20:21:35] it makes you wonder [20:22:57] Moe_Epsilon: my unqualified impression is: looks like a good candidate was inactive for too long, got shot down for it, and called home for support. wouldn't call that canvassing. 
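Back on the job queue side of this exchange: with the client extension switched off, the roughly 11k queued ChangeNotification jobs on enwiki could only fail to instantiate, which is why Reedy simply dropped them above. A minimal sketch of that cleanup, assuming the stock MediaWiki job table; not necessarily the exact statement that was run:

    // Drop queued ChangeNotification jobs that can no longer be instantiated
    // because the client extension is disabled on this wiki (see above).
    $dbw = wfGetDB( DB_MASTER );
    $dbw->delete( 'job', array( 'job_cmd' => 'ChangeNotification' ), __METHOD__ );
    echo $dbw->affectedRows() . " ChangeNotification jobs removed\n";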
[20:23:01] *shrug* [20:24:33] as long as he's not, then there's no problem with hu.wiki users supporting [20:26:31] possibly just a coincidence :) [20:28:25] * aude wonders what i can get to eat at this hour [20:29:00] not feel like cooking when i get home, of course [20:29:22] oh, I see why people are coming to his confirmation RFA now, a link to Wikidata is in his signature and he's been very active there the past day on a forum [20:29:24] makes sense [20:29:48] aude: you are in berlin! in kreuzberg! food should not be a problem [20:30:15] hello, what do you think about this: https://hu.wikipedia.org/w/index.php?title=Wikip%C3%A9dia%3AKocsmafal_%28egy%C3%A9b%29&diff=13024277&oldid=13023719 user Pallerti invites Hungarian users to vote Binaris [20:30:30] |: [20:30:32] yeah, but not the same stuff i ate at midnight last night :D [20:30:43] aude: if all else fails, take a detour via cotbusser tor and go to maroush [20:30:49] true [20:31:02] sushi place probably is closed by the time i get there [20:31:20] DanielK_WMDE: that kind of confirms my canvassing concern, unfortunately :( [20:31:27] aude: on oranienstraße? unlikely. [20:31:40] no, different one [20:31:58] anyway, i think we are good for tonight [20:32:59] we at least know the change propagation and all works for enwiki [20:33:17] * aude pretty sure the issue was unrelated but a coincidence [20:33:17] Moe_Epsilon: maybe. i think he doesn't mean any harm, quite the opposite. make of that what you will :P [20:33:30] * aude does not want wikidata associated with whatever other technical problem / coincidence though [20:34:50] no, no harm is intended, but I think it's worth mentioning since I believe others would frown upon it. I know I do, because it is asking for support :/ [20:35:20] aude: btw, Nollendorfplatz is another good place on U1/U2 for finding foot late. [20:35:41] sure [20:37:49] err. food. [20:37:59] it's late [20:38:12] you can also find feet there, but i wouldn't trust them. [20:38:23] heh [20:40:47] see folks tomorrow (or later on irc) [20:41:43] cu [21:09:59] mh gotta find a certain message on translate wiki [21:10:08] since I found some mistaken use of {{plural}} [21:17:54] Vito, it's not mistaken, it's a bug [21:18:03] that appeared in the last couple of hours [21:18:08] cool [21:18:11] so don't change the message [21:18:17] yep Jhs [21:18:26] actually I cannot add acertain page [21:18:30] i don't know if it's been reported or anything, i'm just assuming someone who can fix it will see it soon :P [21:19:01] Vito, where can't you add something? [21:19:37] Jhs: http://it.wikidata.org/wiki/Q573631 [21:19:45] it:Operatore di Fredholm [21:20:06] it gives a really generical error message [21:20:17] An error occurred while trying to perform save and because of this, your changes could not be completed. [21:21:14] hmm, that was a strange one [21:21:50] it may be because the page is so new and there is some sort of database lag [21:21:58] but that's just an unqualified guess [21:21:59] mh [21:22:43] btw the error messages are supposed to be a bit more explicative [21:22:50] *-the [21:23:31] New patchset: John Erling Blad; "(Bug 41163, Bug 41165) Changed Autocomment class Summary" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/47716 [21:24:48] Vito, i was able to save it now [21:24:57] weird [21:25:01] definitely weird [21:25:26] i wasn't a couple of minutes ago, i had the same problem as you. but just now it worked [21:25:42] so probably some kind of lag. 
or some invisible pink unicorn fixed it [21:26:01] likely to be a synch issue [21:26:21] anyway now we have a new cool bug with plural magic word :D [21:26:22] There are some strange bugs during save [21:26:37] Some of them will disappear on Wednesday [21:27:25] Jeblad_WMDE, could you maybe fix this PLURAL thing that appeared a few hours ago? It now says «Sider som lenkes til dette datasettet(4 {{PLURAL:4|oppføring|oppføringer}})» [21:27:52] apparently the way the message is parsed has changed [21:28:04] In addition to a weird save issue in our magic diff-patch stuff there is this really generic error that should be more specific [21:28:11] It's a core bug in some JS library [21:28:17] we want more errors :D [21:28:20] Jeblad_WMDE: mh I think an announcement in the channel's topic + mailing list could be useful [21:28:23] Hopefully it will be fixed soon [21:28:46] people will complain a lot and the smartest one will try fixing it on translatewiki :D [21:28:58] Try reloading your browser cache, it could be that it is already fixed [21:29:25] But please don't "fix" it by changing the message, the bug can't be fixed that way [21:30:24] Latest news: it is fixed in core but I'm not sure if the prod servers are updated [21:31:12] There are hundreds of failing messages, so it will be a lot of fixing.. ;) [21:32:48] 805 in the i18n file for Wikibase repo.. [21:52:26] Vito: as a quick fix, maybe hide the counter with site css or something? [21:52:39] at least it would hide the ugliness :) [21:52:44] hope to have it fixed shortly [21:53:06] aude: I can live without the fix just the same :p [21:53:17] ok :) [22:34:44] Riley: i need you to delete https://www.wikidata.org/wiki/Q371239 so i can merge it into another item [22:35:01] ([[Q2509321]]) [22:35:07] Done [22:35:42] wtf [22:35:45] theres a third dupe [23:42:45] New patchset: John Erling Blad; "Add dt/dd msgs for of the datatypes in Special:ListDatatypes" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/47817