[01:18:52] !admin
[01:19:06] https://www.wikidata.org/w/index.php?title=Q316013&action=history needs some watching because of recent death
[01:24:28] Lydia_WMDE: okay
[01:24:40] thanks :)
[01:26:11] Lydia_WMDE: semi protected actually
[01:26:20] great
[01:29:14] good night
[10:10:29] Jonas_WMDE: I'm working today. Let's stay in contact if possible, ok?
[10:17:53] *waves* I am also here! (still working on API serialization / DM serialization stuff / indexed tags stuff and XML stuff)
[10:31:40] addshore: hi!
[10:31:46] hiya!
[10:32:53] internetz problemz here. should work now.
[10:33:13] urgent review requests from anybody?
[10:33:17] :)
[10:37:20] nope! :)
[10:42:47] i have one review request that bugs me: https://github.com/wmde/ValueView/pull/176
[11:24:15] addshore: https://gerrit.wikimedia.org/r/#/c/224060/ is quite horrible. my local wiki is unusable because of this.
[11:50:28] heck, what's wrong with ULS? I'm again wasting hours and hours fixing my local environment.
[12:10:43] JeroenDeDauw: give me a ping when you're around?
[12:45:44] Hi, is there a way to request all country name translations from Wikidata?
[12:47:23] Can you please rephrase this question? I do not understand.
[12:48:48] I want all the country names in all languages
[12:50:16] So, you want all labels of a Wikidata item?
[12:50:49] https://www.wikidata.org/w/api.php?action=wbgetentities&props=labels&ids=Q55
[12:50:51] Freed, you can request IDs of all countries with WDQ and then request labels with the API.
[12:54:18] ok, but do I have to request country by country?
[12:54:39] https://tools.wmflabs.org/autolist/autolist1.html?start=1250&q=CLAIM%5B31%3A%28tree%5B6256%250A%5D%5B%5D%5B279%5D%29%5D - like this (if you really want ALL countries :))
[12:56:11] Freed, if you have a bot flag, you can get 500 items per request.
[12:57:29] hum, thank you. I wasn't thinking about old countries ;)
[13:00:37] Where is hoo? I hope not on Hoo-liday.
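A sketch (not from the log) of the batching the conversation above implies: the `wbgetentities` API linked at 12:50:49 accepts up to 50 IDs per request (500 with a bot flag, per 12:56:11), so a country ID list from WDQ has to be chunked. The helper names are illustrative.

```python
def chunk_ids(item_ids, batch_size=50):
    """Split item IDs into API-sized batches (50, or 500 with a bot flag)."""
    return [item_ids[i:i + batch_size] for i in range(0, len(item_ids), batch_size)]

def build_label_requests(item_ids, batch_size=50):
    """Build one wbgetentities params dict per batch, asking for all labels."""
    return [
        {
            "action": "wbgetentities",
            "props": "labels",       # only labels, as in the linked example URL
            "ids": "|".join(batch),  # e.g. "Q55|Q31|Q183"
            "format": "json",
        }
        for batch in chunk_ids(item_ids, batch_size)
    ]
```

Each params dict would then be sent as a GET request to https://www.wikidata.org/w/api.php.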
;)
[13:02:36] * JohnFLewis laughs then gives sjoerddebruin a 'why' look
[13:03:01] Would like an entity suggester database update (ESDU)
[13:10:45] WDQ seems broken... :/
[14:22:45] Jonas_WMDE: Are you online now?
[14:22:56] yep
[14:23:22] Your comment from 24 minutes ago: "It works like in the description, but is not intuitive." Is there anything to do for me? Do you want an answer?
[14:23:40] nope
[14:23:54] it's better than before I think
[14:24:20] shouldn't you be on the beach or something?
[14:24:50] or at least drunk on tequila?
[14:27:54] Confused my flight, it's tomorrow morning.
[14:28:35] When was the moment you found that out?
[14:43:42] Jonas_WMDE: I rebased the looong chain of patches and moved the controversial patch to the end. Now the chain starts here: https://gerrit.wikimedia.org/r/#/c/223544/ Would be cool if you could continue reviewing this.
[15:29:09] * hoo waves
[16:18:55] * addshore waves at hoo
[16:19:00] hi addshore :)
[16:19:05] ohia!
[16:19:44] hoo, any way of knowing what an API module is going to output (other than parsing $params looking for the format)?
[16:20:48] Mh... you mean the output data structure?
[16:21:12] yup
[16:22:10] trying to get rid of the use of $api->getResult()->getIsRawMode(). MW should ignore metadata tags for formats that don't need them, afaik, but unfortunately the difference between our JSON and XML serialization arrays is not just metadata tags... (also different keys / no keys at all etc.)
[16:26:45] hoo: "Hoo, I'm afraid you are confusing something." (2 comments) [extensions/Wikibase] - https://gerrit.wikimedia.org/r/224060 (https://phabricator.wikimedia.org/T105211)
[16:26:59] legoktm: ^^ any ideas about my thing either? :)
[16:27:13] Thiemo_WMDE: Let me test it again
[16:37:01] hoo: There you are.
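An illustrative sketch of the serialization problem addshore describes above, transposed to Python: an XML-oriented result array carries metadata keys (here keys starting with `_`, such as a hypothetical `_element`) that a JSON consumer must never see, but, as noted in the chat, the two structures also differ in real keys, so stripping metadata alone cannot convert one into the other.

```python
def strip_metadata(value):
    """Recursively drop metadata keys (starting with '_') from nested data."""
    if isinstance(value, dict):
        return {k: strip_metadata(v) for k, v in value.items()
                if not k.startswith("_")}
    if isinstance(value, list):
        return [strip_metadata(v) for v in value]
    return value
```

The residual key/structure differences are exactly why a simple filter like this is not enough to unify the two serializations.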
:D
[16:37:51] hi sjoerddebruin o/
[16:38:04] I'm going to update the suggester today
[16:38:10] after more than two months ;)
[16:38:10] !lydia
[16:38:10] \o/
[16:38:27] Yeah, every two months sounds like an ideal thing to me.
[16:39:28] If needed, we can create it more often... but in the end it's quite time consuming to do that
[16:40:05] There are not many notable differences then.
[16:40:53] And people need time to adapt to a new sorting.
[16:41:09] Think of a supermarket that changes the place of a product every week.
[16:43:06] Yeah... but on the other hand, in a perfect world it would be close to real time, thus things would change slightly every now and then
[16:43:16] which would be expected
[16:43:57] I think we should focus on things like fixing reference suggestions first. :)
[16:45:29] Yeah... especially as that's a problem which is easier to fix than implementing live updates
[17:18:28] hoo: what's the reason for the -1 on https://gerrit.wikimedia.org/r/#/c/224060/ ???
[17:21:52] hoo|away: Wikibase is broken and unusable because of this. Why does nobody merge this patch? I do not understand. Am I the only one that sees this? How is this possible?
[17:44:17] addshore: can you have a look into https://gerrit.wikimedia.org/r/#/c/224060/ ? this drives me crazy.
[17:44:32] how is it possible that the world is burning ONLY FOR ME?
[17:59:06] Thiemo_WMDE: Maybe you forced ResourceLoader debug
[17:59:16] In that case it's broken, otherwise it should work
[18:01:00] May I amend it?
[18:01:43] amend what?
[18:02:53] i would like to understand what's wrong and why.
[18:03:08] I just checked my install with and without the patch, and everything seems to work for both
[18:03:14] amending my patch does not make sense because it works as it is. amending means it will again be broken.
[18:03:18] I want it to be in line with the other resource definitions we have
[18:03:23] it is.
[18:03:33] that's exactly what i do not understand.
[18:03:44] it is in line with 69 other identical places in the code.
[18:03:47] Thiemo_WMDE: it was never broken in the first place for me :/ as far as i can tell
[18:03:48] No, we usually avoid paths in the file names, and move all of that into the paths
[18:03:56] addshore: Only in debug mode
[18:03:58] that's not correct.
[18:04:21] and it does not work.
[18:05:10] i do not see how a subset of my patch can fix the problem. it will again not work.
[18:05:23] do you have $wgResourceLoaderDebug set?
[18:05:28] I amended it
[18:06:01] and no, "we" do *not* avoid paths in file names and "move them into the paths". as I said, 69 places in the code base do *not* do this.
[18:06:33] you basically reverted everything i did.
[18:06:38] I'm sorry??????
[18:06:58] Well, but it will work both with debug mode and without it
[18:06:58] i just asked you to *not* do this.
[18:07:04] and that's what you need
[18:07:06] meh
[18:07:09] revert, then
[18:07:18] but that fixes your bug in a way I perceive as nicer
[18:08:21] "nicer" by listing the path twice instead of once?
[18:08:31] argggggg!
[18:08:35] Getting the base path right
[18:13:03] fuuuu gerrit
[18:13:05] the base path was right.
[18:13:12] It was
[18:13:16] I just didn't like the style
[18:13:34] ???
[18:14:14] I'm out of here for today. please revert your change. thanks.
[18:14:35] Done and +2
[18:14:54] * hoo still thinks the path should be in the base path etc.
[18:14:55] but whatever
[18:58:58] DanielK_WMDE: would you like to review https://gerrit.wikimedia.org/r/#/c/222215/ ? It's about cleaning up the XSD 1.0/1.1 mess (https://phabricator.wikimedia.org/T99795)
[19:16:07] SMalyshev: i'm packing for Wikimania, flying out tomorrow
[19:16:54] DanielK_WMDE: ah, I see. OK, if you'll have to occupy some time on the plane... ;)
[19:17:16] if they get me internet on the plane, sure :)
[19:17:31] anyway, whenever you have time.
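A hypothetical illustration of the resource-definition style dispute earlier in the log, transposed to Python dicts: a module can name its directory once in the base path, or repeat it inside each script path. Both styles resolve to the same files; the module and file names here are made up, and the disagreement was purely about which style to use.

```python
import posixpath

style_a = {  # directory repeated in the file name (Thiemo's 69 existing places)
    "localBasePath": "resources",
    "scripts": ["jquery.wikibase/jquery.wikibase.entityview.js"],
}
style_b = {  # directory moved into the base path (hoo's preference)
    "localBasePath": "resources/jquery.wikibase",
    "scripts": ["jquery.wikibase.entityview.js"],
}

def resolve(module):
    """Resolve a module's script entries against its base path."""
    return [posixpath.join(module["localBasePath"], s) for s in module["scripts"]]
```

Since both resolve identically, the -1 and the amend were about consistency, not correctness.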
there's also some smaller fixes coming from Markus' comments soon
[19:18:02] on a first look: ugh, boolean constructor param. what does new DateTimeValueCleaner( true ) mean??
[19:18:59] DanielK_WMDE: I see your point, but not sure how to make it better. $cleaner->setXSD11(true)?
[19:19:53] also, do you think the default should be 1.1 or 1.0? PHP/Java are all on 1.0 semantics.
[19:19:57] const XSD10 = "1.0" maybe. Or just take string params, and check that it's one of the allowed values
[19:20:16] I think 1.0 should still be the default
[19:20:36] or... well... no. wrong
[19:20:53] we shouldn't change the default later. so the default should be 1.1 now
[19:20:54] I don't like string ones since there are only two values that can ever be there
[19:42:59] sjoerddebruin: Lydia_WMDE: FYI: Property suggester updated! :)
[19:43:10] \o/
[19:43:11] thx hoo
[20:20:23] Anybody know if "reject reference" in the primary sources tool is supposed to do anything? Because for me it does nothing at all
[20:36:02] SMalyshev: it is supposed to remember that someone rejected it and not suggest it again
[20:36:38] * hoo just profiled dispatching a bit... sadness
[20:41:10] We save memory there, but that costs us a lot of CPU time
[20:41:25] I guess just putting that stuff on HHVM would make it so much nicer
[20:41:37] or just PHP 5.6 :P
[20:42:51] jzerebecki: doesn't look like it does. I.e. on https://www.wikidata.org/wiki/Q28614#P106 it suggests a link to smh.com.au which is a 404. Rejecting does not remove it
[20:43:35] hoo: PHP 7? ;)
[20:44:15] Well... given how fast we adopt new PHP versions: Let's stay realistic :P
[20:45:18] that's why it makes sense to start early - there's a chance it'd be adopted before it's EOLed ;)
[20:46:25] SMalyshev: it does not suggest that for me. do you use it via https://www.wikidata.org/wiki/Special:Preferences#mw-prefsection-gadgets or some other variant/version? or perhaps it is a caching bug.
[20:47:16] jzerebecki: yes, via gadgets. Maybe caching...
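DanielK's suggestion above (validated named constants instead of an opaque boolean constructor argument), sketched in Python rather than PHP: call sites read `DateTimeValueCleaner(XSD_1_1)` instead of `DateTimeValueCleaner(True)`, and the default is 1.1 as settled at 19:20:53. The constant names are illustrative, not the patch's actual identifiers.

```python
# Allowed XSD datetime semantics versions (illustrative names).
XSD_1_0 = "1.0"
XSD_1_1 = "1.1"

class DateTimeValueCleaner:
    """Cleans date values according to a chosen XSD semantics version."""

    def __init__(self, xsd_version=XSD_1_1):  # 1.1 default, per the chat
        if xsd_version not in (XSD_1_0, XSD_1_1):
            raise ValueError("unknown XSD version: %r" % xsd_version)
        self.xsd_version = xsd_version
```

This also addresses SMalyshev's objection to bare strings: the constructor rejects anything outside the two allowed values.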
if you don't see it then it's fine, probably a cache issue
[20:49:09] Easy fix
[20:59:48] SMalyshev: jzerebecki: Hot fixed it...
[20:59:52] Not nice, but should work now
[21:00:06] (you may need to clear your cache / give RL its 5 minutes)
[21:00:07] hoo: cool, thanks
[21:18:56] SMalyshev: WMF is ~18 months into converting to HHVM and not done yet. I don't know that there will be much enthusiasm for starting on PHP 7
[21:20:10] bd808: I'd be so happy if we had an HHVM version of terbium
[21:20:13] bd808: yeah I know it's not easy... but PHP 7 should be much faster than 5
[21:20:48] hoo: as soon as we finish the imagescalers....
[21:20:49] But I guess this topic is going to come up once terbium cries because it's under load
[21:21:22] HHVM and a lot of cron jobs won't be good friends
[21:21:39] HHVM has a startup penalty due to JIT
[21:21:54] so if your task is <30s or so, HHVM will make it slower
[21:22:01] bd808: Well, but for CPU bound cron jobs, I think they can get along quite well
[21:22:20] If there are many edits on Wikidata, terbium goes 70%+ CPU
[21:22:33] Probably more now, as we threw more CPU time at that problem
[21:22:44] I saw you folks talking about that over the weekend. why has it not been pushed out to the job runners?
[21:23:19] we have a whole fleet of boxes for running background jobs for this reason
[21:23:19] bd808: Well, it's not a job
[21:23:34] could it be?
[21:23:41] And if it were one, we would need to make sure we have a couple of dedicated runners
[21:23:52] that's easy enough
[21:24:22] After the HHVM switchover we have plenty of mw boxes
[21:24:49] and it is easy to partition by job type
[21:25:05] We have a bug for that...
and it's only a bit more than two years old https://phabricator.wikimedia.org/T48643
[21:25:59] I think it may still be worth it to check out PHP 7, if it's about performance it may provide a serious improvement
[21:26:22] everything is better than the PHP 5.3 we are running now :)
[21:26:35] bd808: Well, given we want to use delayed jobs, it might be that we end up needing https://phabricator.wikimedia.org/T97909
[21:27:38] until you have a job it won't matter how nice the runners are
[21:28:03] Creating a job shouldn't be hard by now
[21:28:10] given we already did most of the refactoring
[21:28:19] maybe I can hack that up during Wikimania
[21:28:29] yeah, 5.3 has been EOL for like a year now :) and no real bugfixes for 2 years.
[21:28:59] Good thing it's not buggy...
[21:29:06] lol
[21:29:29] yeah I know PHP doesn't have bugs, not at all
[21:30:09] We have various workarounds in the code base guarding against PHP 5.3 either messing up serialization or completely segfaulting on stuff
[21:30:41] The 14.04 boxes have php 5.5.9-1ubuntu4.11
[21:31:08] which is much much faster than the php 5.3.10-1ubuntu3.19+wmf1 on tin/terbium
[21:31:36] 5.5 is better, but 7 would be even better :) just sayin'... ;)
[21:31:41] the big problem with PHP 7 will be porting all the extensions
[21:32:20] scribunto, fss, wikidiff2, ...
[21:32:35] bd808: true, but it's not _that_ hard.
unless they do hardcore PHP stuff it's mostly legwork
[21:33:31] The contributed port for yaml seems to work nicely but it is a full fork of the codebase
[21:33:41] which makes me very sad
[21:39:20] jzerebecki: I'm working on fixing https://phabricator.wikimedia.org/T93375 but need to understand what Wikidata expects to find in the iwcache
[21:40:03] I have a python script working that I can use to edit interwiki.cdb
[21:40:19] I doubt we look into the iwcache at all
[21:40:23] but I need to figure out what to stick in the cache for each wiki that mw-vagrant creates
[21:40:42] well, when it is used there is no interwiki db, right?
[21:40:50] sites and site_identifiers
[21:40:59] Can have both
[21:41:13] * jzerebecki has no knowledge of what actually happens there
[21:41:23] The way stuff works together is weird...
[21:41:34] I thought the interwiki.cdb would be nicer for mw-vagrant since it can be shared among all the sites
[21:41:37] I think we still require iw data to be configured for the actual display in the UI
[21:41:47] but everything else should work despite that
[21:42:12] bd808: iw data can be in the table or the file, doesn't matter
[21:42:23] but we need the sites and site_identifiers tables to be in shape as well
[21:42:43] the annoying thing about normal iw db tables is that they are per-wiki. Puppet isn't very nice for doing things that need to iterate
[21:43:26] Well, you can call out to ruby... but indeed, that's ugly
[21:43:41] meaning when I add a new wiki I need to update all the other wiki dbs and also update the new wiki with the info on the rest of the farm
[21:43:55] yeah
[21:44:06] but with the interwiki.cdb I can just make sure the new wiki is added
[21:44:14] and point all of the wikis at it
[21:49:22] hoo: do you call directly to the db to get the sites or do you go through some internal abstraction in MW?
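A hedged sketch of the per-wiki interwiki problem bd808 describes above: with db tables, every wiki needs a row for every other wiki, so adding one wiki means touching N tables, while a single shared interwiki.cdb only needs the new entry. The field names follow MediaWiki's interwiki table; the wiki names, URL pattern, and helper are made up for illustration.

```python
def interwiki_rows(wikis):
    """Per wiki, the interwiki rows it needs for the rest of the farm.

    Illustrates why the table approach scales as N*(N-1) rows across the
    farm, whereas a shared interwiki.cdb holds each entry exactly once.
    """
    rows = {}
    for wiki in wikis:
        rows[wiki] = [
            {"iw_prefix": other,
             "iw_url": "http://%s.example/wiki/$1" % other,  # made-up pattern
             "iw_local": 1}
            for other in wikis if other != wiki
        ]
    return rows
```

Adding a wiki to this farm changes every wiki's row list, which is exactly the iteration Puppet handles badly.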
[21:50:10] bd808: We go through abstraction, also stuff is cached in MC
[21:58:03] hoo: I think I tracked it down. You need a working SiteStore
[21:58:44] Yeah
[21:58:57] It might be that there is a working file based site store in core
[21:59:05] but if there is one (dunno), it's not production tested
[21:59:21] HashSiteStore is sort of one
[22:00:47] I think I could populate a HashSiteStore from the data I have in the MultiWiki layer
[22:02:04] hoo, bd808: we need a replacement for what we currently use in production, one that is e.g. file based or per-node cached instead of in memcached; that is tracked in https://phabricator.wikimedia.org/T47532
[22:32:52] hoo: currently thinking of changing https://www.wikidata.org/wiki/Wikidata:Bots as it mentions dispatch lag. what should be the suggested value of dispatch lag at which bots should wait? and should they look at the median like the alert, or something else?
[22:34:20] I guess median is fine
[22:34:36] Is it just late or is stuff very flaky?
[22:34:45] 14 edits in the last minute
[22:34:53] But ~900 the minute before
[22:35:43] how/where do you look at that? an sql query?
[22:35:51] Yeah
[22:36:39] hoo: at how many seconds of median dispatch lag should bots wait?
[22:37:02] 60-100s maybe?
[22:38:06] hoo: the alert is at >=60
[22:38:37] Wasn't it 120?
[22:38:40] 60 is too low
[22:38:55] I think the check is at 120s/2m
[22:39:01] We only send out alerts for db lag after 300s
[22:39:04] and that is way more serious
[22:39:20] it says so but it actually checks if it is a number below 60
[22:39:32] so message and check mismatch
[22:39:45] Awesome
[22:39:59] still, I guess having it match 301+ would be better
[22:40:10] It's not *that* important after all
[22:40:23] the site's not going read only due to that (but it can due to slave lag)
[22:40:40] ok
[23:14:04] * jeblad Zzz
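An illustrative helper for the Wikidata:Bots guidance discussed above: given per-client dispatch lag samples in seconds, a bot pauses while the median exceeds a threshold. The 120 s default matches the "120s/2m" check mentioned in the chat (the numbers were still being debated); how the samples are obtained is left out.

```python
import statistics

def should_pause(lag_seconds, threshold=120):
    """True if the median dispatch lag (in seconds) exceeds the threshold.

    Uses the median, as hoo suggested, so a single badly lagged client
    does not make every bot stop editing.
    """
    if not lag_seconds:
        return False  # no data: don't block edits on a missing metric
    return statistics.median(lag_seconds) > threshold
```

A bot loop would check this between batches and sleep while it returns True.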