[07:18:55] * Silke_WMDE has to reboot the demo system after upgrades... [07:21:18] New review: Henning Snater; "I guess it would be better to put the line into the horribly named but more generic wikibase.utiliti..." [mediawiki/extensions/Wikibase] (master); V: 0 C: -1; - https://gerrit.wikimedia.org/r/27876 [07:37:53] New review: Henning Snater; "The site id does not necessarily have to be the language code, it is just a coincidence regarding Wi..." [mediawiki/extensions/Wikibase] (master); V: 0 C: -1; - https://gerrit.wikimedia.org/r/27873 [08:21:44] Change merged: John Erling Blad; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27519 [08:35:02] New patchset: Aude; "per bug 40353, inject recent changes nicely [DO NOT MERGE]" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27392 [08:36:35] Change merged: John Erling Blad; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27871 [08:39:04] New patchset: Anja Jentzsch; "fix for SiteListTest (for 32bit php versions)" [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta) - https://gerrit.wikimedia.org/r/27979 [08:43:34] New patchset: Amire80; "(bug 41005) Define tag editor direction" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27876 [08:50:43] New review: John Erling Blad; "john@loke[master] ~/Workspace/core/tests/phpunit> php phpunit.php --verbose --group Site --conf /var..." [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta); V: -1 C: 0; - https://gerrit.wikimedia.org/r/27979 [08:51:38] New review: John Erling Blad; "Sorry, this was tried against master." [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta); V: 0 C: 0; - https://gerrit.wikimedia.org/r/27979 [08:56:25] moin [08:56:27] hi DanielK_WMDE [08:56:38] thanks for the email about the changes stuff [08:57:05] AnjaJ_WMDE: do you perchance have the bug or change ID for the $egWB... to $wgWB... rename? [08:57:10] DanielK_WMDE! 
we have sea-monkeys at the office! we will bring them to life in an aquarium after the daily. [08:57:28] DanielK_WMDE: if you want, i can poke at the wb_changes stuff... [08:57:38] ("YPS Urzeitkrebse") [08:57:38] it's a prerequisite for my recent changes patch to get it right [08:57:39] Jens_WMDE: someone bought the new Yps? [08:57:50] DanielK_WMDE: Abraham_WMDE did. [08:57:54] hehehe :) [08:59:09] aude: one very important bit for this is: the poll script accesses wb_changes (and only that) on the repo. the poll script then creates jobs for updating the local wikis. For now, these jobs can be executed directly, but we need them as jobs, so we can later queue them for each wiki. [08:59:33] that update job writes to recentchanges, invalidates (or optionally rerenders) the page, etc. [09:00:39] the client wiki does not (apart from the poll script) access the repo's database. We should pretend to not be able to access the repo directly at all, so it will be easy to replace wb_changes later, with PubHub or something [09:00:59] or NSQ. http://word.bitly.com/post/33232969144/nsq [09:01:01] New review: Denny Vrandecic; "Naming was confusing, the comment was inconsistent, and it should have been false but used the other..." [mediawiki/extensions/Wikibase] (master); V: 0 C: -2; - https://gerrit.wikimedia.org/r/27742 [09:01:10] AnjaJ_WMDE: if you don't, i'll dig it up. just wanted to avoid duplicate work on that [09:01:31] Change abandoned: Denny Vrandecic; "Comment was wrong that is why this change is invalid." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27742 [09:01:35] AnjaJ_WMDE: yep [09:01:37] oops [09:01:39] DanielK_WMDE: yep [09:02:05] DanielK_WMDE: my one concern is that the client needs a cache of the changes i think [09:02:29] maybe [09:02:48] Jens_WMDE: we can discuss the push (or poll) mechanism again, no problem. i just want to have a solid draft and a working implementation using the currently deployed platform.
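[Editor's note: the poll-script flow DanielK describes above (read wb_changes on the repo, create one update job per client wiki, run the jobs directly for now so they can later be queued) might be sketched roughly as follows. All names, the schema, and the job shape are illustrative, not the real Wikibase code.]

```python
# Sketch of the described poll-based change propagation, with a
# hypothetical wb_changes layout. The real job would write to the
# client's recentchanges table and invalidate (or re-render) pages.

def poll_changes(repo_db, last_seen_id):
    """Fetch changes newer than last_seen_id from the repo's wb_changes."""
    return [c for c in repo_db["wb_changes"] if c["change_id"] > last_seen_id]

def make_update_jobs(change, client_wikis):
    """One job per client wiki; for now run directly, later queued per wiki."""
    return [{"wiki": wiki, "change": change} for wiki in client_wikis]

def run_job(job):
    # Stand-in for the real update: recentchanges entry + cache purge.
    return f"updated {job['wiki']} for change {job['change']['change_id']}"

repo_db = {"wb_changes": [{"change_id": 1}, {"change_id": 2}, {"change_id": 3}]}
new = poll_changes(repo_db, last_seen_id=1)
jobs = [j for c in new for j in make_update_jobs(c, ["enwiki", "dewiki"])]
results = [run_job(j) for j in jobs]
```

The point of the job indirection is the one Daniel makes at 09:04:46: de-coupling, so the web request that makes a change never blocks on client wikis.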
[09:02:55] Change merged: Anja Jentzsch; [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta) - https://gerrit.wikimedia.org/r/27979 [09:03:05] I did backport tobi's fix for the current unit test errors [09:03:16] * aude will think about it more.... [09:03:21] DanielK_WMDE can we talk for a minute? [09:03:31] after the daily [09:03:38] aude: you said that before, but I never understood why we would need that [09:03:43] AnjaJ_WMDE: sure [09:04:46] Jens_WMDE: there are two major factors: 1) de-coupling, i.e. no blocking on other services during the web request that makes the change. 2) robustness against temporary failure (i.e. the ability to replay changes). [09:05:36] Jens_WMDE: but practically, there's also the "will it run on the wmf cluster as it is now" factor. I don't want to depend on another technology that will be deployed "really soon now". [09:08:19] aude: but anyway, if we do need a permanent log of the changes on the client, nothing keeps us from just adding a table for that. that would not change the architecture. [09:09:53] AnjaJ_WMDE: nvm, found bug 40552 [09:15:29] New review: Daniel Kinzler; "This is part of the fix of bug 40552." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/26023 [09:16:17] DanielK_WMDE: i'm just wondering about a way to rebuild changes stuff i think [09:16:40] it could be another way to do that, but clearing the wb_changes table could be a problem [09:16:56] with OSM, the "client" needs to be rebuilt sometimes [09:17:03] for various reasons [09:17:40] aude: but for that, the *current* version of the data items is sufficient. and we do cache these. [09:17:44] and to show diffs, etc. for changes older than the recent changes or wb_changes table.... [09:17:52] if we want to do them client side at some point [09:18:12] DanielK_WMDE: maybe [09:18:55] for now we can just link to the repo [09:18:59] when would it not be sufficient? Also, we always *can* go to the repo for the data if we really need it.
Via a dump, API requests or direct database connection. [09:19:21] For the diffs... well, we'd need to store the full history of all data items locally. that kind of sucks. [09:19:23] ok [09:19:51] the diffs will be harder and can be considered later.... [09:19:59] I'm still thinking about how to best do the diffs. Currently, I'm inclined to call the repo's API to get a diff. That's not totally pretty, but the best I can currently think of. [09:20:05] yea, exactly :) [09:20:15] * aude thinks it'd be awesome to be able to see a diff that takes into account changes of templates, etc. also [09:20:24] hehe [09:20:26] yea [09:20:37] you'll never be able to see an old version of the main page, for example [09:20:44] way it is now [09:20:55] unless you subst all the templates and archive them [09:21:09] and when we have inline stuff, etc...... [09:21:31] aude: http://www.mediawiki.org/wiki/Extension:Memento [09:21:33] * aude thinks one cache of the changes stuff per database section [09:21:46] it doesn't work 100% and isn't very efficient, afaik. [09:21:52] but it would be a start, maybe. [09:22:00] yes i have seen that [09:22:02] interesting proof of concept, anyway :) [09:22:24] for now, linking to the repo is okay and i think wb_changes is okay [09:22:47] aude: caching the changes wouldn't really help with the diffs. we'd need to duplicate the full history. [09:22:53] it would be nice to have periodic dumps and changeset stuff, maybe, like osm does [09:23:00] even with the actual data in external storage, that would still suck [09:23:07] DanielK_WMDE: right. cache the entities [09:23:30] osm has minutely diffs, hourly, etc. and daily full dumps [09:23:45] and tools to handle those. not totally perfect but works :) [09:23:45] yes, periodic dumps of changes would be nice. I want to look at OAI-PMH for that [09:23:57] and easy to rebuild, whatever.
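[Editor's note: the fallback discussed above — the client fetches current item data from the repo's API instead of its database — could look like the sketch below. `wbgetentities` is the Wikibase API module for fetching entities; the repo URL and the surrounding helper are illustrative.]

```python
# Sketch: a client asking the repo's web API for an item's current
# data rather than reading the repo database directly.
from urllib.parse import urlencode

def build_repo_request(repo_api, item_id, props=("sitelinks", "labels")):
    """Build the URL a client would use to fetch current item data."""
    params = {
        "action": "wbgetentities",  # Wikibase API entity-fetch module
        "ids": item_id,
        "props": "|".join(props),
        "format": "json",
    }
    return repo_api + "?" + urlencode(params)

url = build_repo_request("https://repo.example/w/api.php", "Q42")
```

This keeps the "pretend we cannot touch the repo database" discipline from earlier in the log, so wb_changes (or the whole transport) can be swapped out later.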
[09:24:08] ok [09:24:35] for today, use the ORMTable foreign db stuff for wb_changes and make sure all the info we need is put into wb_changes [09:25:29] aude: ah, cool! if you want to take a shot at that, i'll not touch wb_changes today [09:25:47] ok [09:26:08] it's essential for merging my recent changes stuff (which still needs further polishing to make nice) [09:26:15] the i18n stuff [09:27:48] aude: please make small patches. they don't need to be complete working features. much better to make small incremental changes. [09:28:10] it's more work up front, but makes the team more efficient as a whole. [09:28:23] * DanielK_WMDE tries to do the same [09:28:50] AnjaJ_WMDE: i have put the change IDs of the stuff that needs backporting into the google docs. [09:29:21] AnjaJ_WMDE: i was about to start to dig up the commit hashes and timestamps. But I'm not eager to do that, if you want to :) [09:29:37] DanielK_WMDE: ok [09:29:38] I could then fix jeroen's patch for the Sites stuff. [09:29:40] DanielK_WMDE: can we skype for a moment? [09:29:59] yea, let me dig out my headset. i'll call you [09:30:18] AnjaJ_WMDE: did you backport https://gerrit.wikimedia.org/r/#/c/25511/ ?
($wgWBStores) [09:30:56] fwiw, i put it on the list, but didn't touch it yet [09:35:59] New patchset: Henning Snater; "renaming wikibaseAutocomplete" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27986 [09:37:14] New patchset: Jeroen De Dauw; "Backport 4145c4f7037e036bc6ffc2fc5397827d4cde2620" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27987 [09:38:14] New review: Jeroen De Dauw; "I did an amend but somehow it ended up in a separate commit (dafuiq): Ife650576" [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta); V: 0 C: 0; - https://gerrit.wikimedia.org/r/27728 [09:41:02] New patchset: Anja Jentzsch; "rename $wbStores to $wgWBStores and $wbcStores to $wgWBClientStores, per bug 40552" [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta) - https://gerrit.wikimedia.org/r/27988 [09:41:17] Change merged: Anja Jentzsch; [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta) - https://gerrit.wikimedia.org/r/27988 [09:42:04] New patchset: John Erling Blad; "(Bug 41032) Remove configuration of write mode from api" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27989 [09:44:45] New patchset: John Erling Blad; "(Bug 41032) Remove configuration of write mode from api" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27989 [09:48:27] New patchset: Daniel Kinzler; "Backport 4145c4f7037e036bc6ffc2fc5397827d4cde2620" [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta) - https://gerrit.wikimedia.org/r/27728 [09:52:38] New review: Daniel Kinzler; "merging after minor fix." 
[mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta); V: 1 C: 2; - https://gerrit.wikimedia.org/r/27728 [09:53:04] New patchset: Jens Ohlig; "Create test script to create properties" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27720 [09:54:33] New patchset: Anja Jentzsch; "Changed egWBSettings to wgWBSettings" [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta) - https://gerrit.wikimedia.org/r/27990 [09:55:02] Change merged: Anja Jentzsch; [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta) - https://gerrit.wikimedia.org/r/27990 [09:58:36] Change merged: Daniel Werner; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27986 [09:58:53] New patchset: Daniel Kinzler; "Backport 4145c4f7037e036bc6ffc2fc5397827d4cde2620" [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta) - https://gerrit.wikimedia.org/r/27728 [09:59:48] New review: Daniel Kinzler; "rebased" [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta); V: 1 C: 2; - https://gerrit.wikimedia.org/r/27728 [09:59:48] Change merged: Daniel Kinzler; [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta) - https://gerrit.wikimedia.org/r/27728 [10:02:30] Change abandoned: Daniel Kinzler; "wrong branch" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27987 [10:11:29] New patchset: Jens Ohlig; "Create test script to create properties" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27720 [10:30:58] New patchset: Daniel Werner; "Introduced new $.NativeEventHandler and its documentation and tests" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27994 [10:52:03] New patchset: Daniel Kinzler; "Check for actual content type, not namespace, in hook" [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta) - https://gerrit.wikimedia.org/r/27997 [10:53:24] New patchset: Daniel Kinzler; "Check for actual content type, not namespace, in hook" [mediawiki/extensions/Wikibase] 
(wikidata-wmfphase1beta) - https://gerrit.wikimedia.org/r/27997 [11:04:14] New patchset: John Erling Blad; "(Bug 41034) Fix for strange timeout during test" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27998 [11:08:56] JeroenDeDauw: you really think arguing about instanceof is worthwhile for a temporary fix? this code doesn't exist on master any more anyway... [11:09:08] I'll amend using the master version, just for consistency [11:09:21] just for the record, i believe instanceof is actually The Right Thing (tm) [11:09:31] ...though it should be instanceof EntityContent, not ItemContent [11:12:17] gah! [11:12:36] Utils::getEntityContentModels is utterly broken. [11:13:31] New review: John Erling Blad; "Doublechecked the tests, still works, self-submitted." [mediawiki/extensions/Wikibase] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/27989 [11:13:32] Change merged: John Erling Blad; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27989 [11:13:43] ...or not. hmpf. anyway. [11:19:49] New patchset: Daniel Kinzler; "Check for actual content type, not namespace, in hook" [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta) - https://gerrit.wikimedia.org/r/27997 [11:20:44] Denny_WMDE1: I13793613 needs to be merged, but then we should be good to go. [11:21:32] Some core tests are still failing with items in the main namespace, but most of these failures have been fixed (most notably, all parser tests are skipped for now if the main namespace is not wikitext) [11:22:01] Some fixes for test cases are still on gerrit, and i think there are still a few i haven't yet looked at [11:22:08] will go through my gerrit backlog next.
[11:27:47] plop [11:28:45] New patchset: Aude; "include rc info in wb_changes info field" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28002 [11:29:26] Nikerabbit: blub [11:31:11] New patchset: Aude; "include rc info in wb_changes info field" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28002 [11:32:18] New patchset: Aude; "include rc info in wb_changes info field" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28002 [11:34:33] New patchset: John Erling Blad; "(Bug 41034) Fix for strange timeout during test" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27998 [11:37:31] New patchset: Aude; "kill duplicate test, same as DiffChangeTest" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28003 [11:42:00] New review: John Erling Blad; "Seems like the timeout problem is still there..." [mediawiki/extensions/Wikibase] (master); V: -1 C: 0; - https://gerrit.wikimedia.org/r/27998 [11:50:03] Change merged: Anja Jentzsch; [mediawiki/extensions/Wikibase] (wikidata-wmfphase1beta) - https://gerrit.wikimedia.org/r/27997 [12:31:58] DanielK_WMDE: How much of an issue is https://bugzilla.wikimedia.org/show_bug.cgi?id=41030 ? [12:33:06] Reedy: it's an annoying bug, not sure how often it is actually hit. the bug report doesn't give instructions to reproduce. [12:33:22] heh [12:33:28] Reedy: i just write a test and verified. will submit the fix in a minute [12:33:32] Oh, cool [12:33:38] s/write/wrote/ [12:33:45] I guess it wasn't really a blocker for deployment [12:34:23] well - it causes a fatal error when a nonexistent page is rendered. that does suck. though usually, mediawiki will notice that the page isn't there and not try to render it [12:34:33] so - not sure when it is hit in operation [12:34:37] but would be nicer to fix it [12:34:42] it's 3 lines.
[12:37:56] Reedy: https://gerrit.wikimedia.org/r/#/c/28012/ [12:53:47] Reedy: I sent you updated setup hints compared to Friday. I hope it is helpful. [13:07:08] Lydia_WMDE_: which api will break? wikidata modules on live system compared to current test system or "only" wikidata api on test system because of phase 2 [13:07:33] Merlissimo: unfortunately both if i understood it correctly [13:08:24] so the live production wikidata api will not be available on test system before? [13:08:34] Merlissimo: it is [13:08:40] Merlissimo: no it will go live with what we have now on the test system [13:08:40] but it is not stable yet [13:08:53] i.e. it will change after release [13:10:28] why should the api not be stable BEFORE it is released? [13:11:00] a breaking api change on production system is much more complicated [13:14:49] well, we could either delay release [13:14:59] or change the API after release [13:15:12] we decided for the second option [13:15:22] and tell people explicitly that it is going to change [13:15:45] basically, at some point in the near future the API will change on the live version [13:15:50] it will be rather sooner than later [13:16:03] Change merged: Jens Ohlig; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28003 [13:16:18] and then we will declare that as the rather stable API [13:16:47] is there any ETA for release on huwiki? [13:18:37] New patchset: Henning Snater; "Introduced new $.NativeEventHandler and its documentation and tests" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27994 [13:19:04] Merlissimo: no [13:19:43] "two weeks after testing of the client on beta" is the current plan [13:20:02] ok [13:29:40] how do i know if langlinks are included in source code or at wikidata?
[13:30:56] Change merged: Henning Snater; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27994 [13:31:17] if my bot cannot find langlinks returned by the api in the page source, it starts searching on included pages. And i don't want my bot to do this in case of wikidata langlinks [13:32:46] Merlissimo: you can ask wikidata via the api if there is an item for the given page and then check which language links that item has? [13:34:48] Lydia_WMDE_: what if langlinks exist in page source and in wikidata? [13:35:23] does mediawiki know where the langlinks come from? [13:36:25] Denny_WMDE1: ^ [13:39:37] if mediawiki knew it, an indicator flag on prop=langlinks could be added easily [13:39:58] if the langlink has the same target, only one is displayed [13:40:20] if the langlink from Wikidata and the one local have different targets, both are displayed [13:40:46] wrt the API, I am actually not 100% sure how it works [13:41:05] but you can test the behaviour on test-client and see how it works there [13:42:03] using test client api is what i am currently doing. [13:44:57] and does it behave as you expect? [13:47:04] http://wikidata-test-client.wikimedia.de/w/api.php?action=query&titles=Helium&prop=langlinks returns the langlinks. then my bot would search the page source for those links and cannot find them. Then it starts scanning include parts of all included pages. and i am searching for a solution to prevent my bot from scanning all these pages [13:47:43] Merlissimo: do you really need to search includes? [13:47:47] for language links [13:47:49] something like ..
would be great [13:48:20] Lydia_WMDE_: langlinks are often included from subpages like on template and wikipedia namespace [13:48:43] hmm ok that's bad then [13:48:50] or fiwiki has a namespace for langlinks only which are included by pages on different namespaces [13:49:00] -.- [13:49:44] there is no general rule how langlinks are included [13:49:58] Merlissimo: i think the best would be if you can send an email describing your problem to the mailing list and then i'll get you an answer as soon as possible [14:00:29] * DanielK_WMDE finished reading his mail backlog [14:01:02] Merlissimo: I understand the point [14:01:11] what about checking Wikidata? [14:01:30] i.e. if you cannot find it in the wikitext, check wikidata for it, and only then go for the subpages? [14:01:34] should be cheaper this way [14:02:20] so i have to create a relative complement of wikidata locally on my own [14:02:43] that could double the api requests [14:03:14] New patchset: Jeroen De Dauw; "Fixed undeletion on non-entity-content pages" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28018 [14:04:53] I'll try to figure out how easy it is to add the info to the query/langlinks module [14:05:05] but i think it won't be completely trivial [14:05:22] btw how do i know the script url of the wikidata repository used by a wiki? [14:05:24] and what you just described might be the more efficient solution [14:05:39] good point [14:05:46] we need to add that to the api [14:05:55] i have to leave for now, will be back in 30-50 minutes [14:06:08] Merlissimo: can you add this to an Email, please? that would be appreciated [14:08:05] maybe later this week i have time to write an email [14:09:38] mediawiki must know somewhere which langlinks are local or from wikidata because of database updates needed after pagemove [14:15:21] Merlissimo: after pagemove? why? i don't see the connection. [14:16:12] DanielK_WMDE: are sitelinks at wikidata not updated after local pagemoves?
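[Editor's note: the workflow Daniel suggests above — before scanning template subpages, compare the langlinks the client API reports against the sitelinks of the Wikidata item, and only search wikitext for the ones Wikidata does not provide — might look like this sketch. All data and names below are illustrative.]

```python
# Sketch: split the langlinks reported by the client API into those
# explained by the Wikidata item (no wikitext search needed) and
# those that must come from local wikitext or includes.

def classify_langlinks(api_langlinks, wikidata_sitelinks):
    """Partition API-reported langlinks into wikidata-provided vs local."""
    from_wikidata, local = [], []
    for lang, title in api_langlinks:
        if wikidata_sitelinks.get(lang) == title:
            from_wikidata.append((lang, title))
        else:
            local.append((lang, title))
    return from_wikidata, local

# Illustrative data, loosely following the Helium example in the log.
api_links = [("de", "Helium"), ("fr", "Hélium"), ("simple", "Helium gas")]
item_links = {"de": "Helium", "fr": "Hélium"}
wd, local = classify_langlinks(api_links, item_links)
```

As Merlissimo notes, this roughly doubles the API requests per page, which is why an indicator flag on prop=langlinks itself would be the nicer solution.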
[14:16:33] New patchset: Aude; "add selenium/Gemfile.lock to .gitignore" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28021 [14:16:53] the database of the local wiki uses pageids, which do not need updates on moves [14:17:02] Merlissimo: ah, you mean whether the sitelinks on wikidata are updated automatically when a page is moved? [14:17:23] we talked about that at some point, but i think it's still open. [14:17:43] at the moment, a page move loses the interwiki links, the sitelink must be updated manually. [14:17:50] oh i thought they were already updated [14:17:52] i hope we have a ticket open for that... [14:18:06] no, they are not, and I'm not convinced that they should be [14:18:19] well... maybe. it wouldn't be terribly hard. [14:18:51] there are many page moves on articles having langlinks [14:19:03] anyway, it would also be nice to have an api where you can ask which of the links on the page come from wikidata. though... well, effectively, you can. just ask wikidata for the links for that page. [14:19:36] yea, pages get moved. at the moment, we'd just hope for the interwiki bot to fix that. [14:19:59] though... well, it can't really, can it? [14:20:42] Change merged: Aude; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28018 [14:20:44] Merlissimo: https://bugzilla.wikimedia.org/show_bug.cgi?id=36729 [14:21:02] you can lobby denny to change the prop from "low"... [14:21:09] *prio [14:22:00] i think it is more like a blocker? [14:23:26] DanielK_WMDE: how should an interwiki bot update that? e.g. if a page is moved without creating a redirect?
[14:29:12] New patchset: Tobias Gritschacher; "adjusted create_property tests" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28023 [14:29:31] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28023 [14:37:10] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28021 [14:38:26] aude: have some geo+wikidata questions for a GLAM meeting this afternoon [14:39:11] so... wikidata will allow polygons and currently we only allow points? [14:40:05] no, i don't think a bot should do that [14:40:07] and points have no radius or zoom level or anything? [14:40:28] Denny_WMDE: i assume you're not talking to me? [14:40:34] no i am not, sorry [14:40:39] my scroll bar was wrong [14:40:42] heh [14:40:56] Merlissimo: DanielK_WMDE: no, i don't think a bot should update after a page move [14:41:08] Denny_WMDE: met someone yesterday that tried to apply for your job after you were already hired... will try to remember his name [14:42:05] Denny_WMDE: i think so, too. There must be a fix before release. That's why i think it's a blocker bug. [14:42:54] aude: also, how is (and how will be) support for having geo info for just part of an article or more than one per article? [14:43:03] Merlissimo: A fix for what? [14:43:20] Denny_WMDE: the problem i see is that after you move page X to Y on the client wiki, there's no way any more to even find the corresponding data item. the association is effectively broken. [14:43:22] pagemove wikidata repository updates [14:43:30] aude: i guess maybe you're gone... highlight me when you're back ;-) [14:43:53] Denny_WMDE: hm... one simple solution would be to periodically check all links on the repo and resolve redirects. that's kind of slow and expensive, though. [14:44:16] stop thinking that machines have to solve everything [14:44:16] DanielK_WMDE: that's why i told you months ago: use pageids.
[14:44:33] a user will rename a page [14:44:35] DanielK_WMDE: why do you think there are always redirects left? [14:44:35] Denny_WMDE: i see no good way for a human to solve this either. [14:44:48] then see that the language links are gone [14:44:50] Merlissimo: there usually are. [14:44:52] then go to wikidata [14:44:59] search for the item [14:45:02] and correct it there [14:45:05] Merlissimo: pageids would create more problems than they would solve. [14:45:37] DanielK_WMDE: most of the time they are deleted very quickly by speedy deletion requests, and bots and admins can move pages without redirects [14:45:38] Denny_WMDE: that turns a two click operation into a 10 click operation. people will simply not do it. they have no hint that they even could or should do it. [14:46:27] *if* this turns out to be a problem [14:46:37] Merlissimo: most? really? but anyway, you don't have to convince me. I do see the problem [14:46:48] we can on the "page move succeeded" page offer a link to allow that [14:46:57] Denny_WMDE: i think this will be a fairly big problem, and one that actually causes data loss. [14:47:20] it takes quite a bit of manual labor to fix this after it has been going wrong for a couple of weeks. [14:48:34] i do not see potential for data loss [14:48:43] DanielK_WMDE: maybe you can store pageids for a maintenance script so that they can be restored after wikidata was temporarily broken on the live system [14:49:42] the globalimageusage table on commons not being updated after temporary s4 database problems is causing big problems atm [14:49:48] Denny_WMDE: Item 23 links page en:X. en:X is moved to en:Y, the redirect is deleted. Now, en:Y does not have any language links any more, the association is broken. it's lost. [14:50:00] you can only try to find it again using a full text search and human judgement [14:50:24] Merlissimo: that is one possibility, though I don't like it much.
[14:50:26] and wikidata should be much more reliable than the globalimageusage table [14:51:42] well, we can hook into the page move and call the wikidata api. it's straightforward, but a question of performance, operational dependencies and also of identity management [14:51:57] DanielK_WMDE: think of page moves because of creating disambig pages on main title [14:52:02] to whom shall the change on wikidata be attributed if the user who moved the page doesn't have an account there? [14:52:20] Merlissimo: yes, as i said, you don't have to convince me. [14:53:09] i only try to give you more arguments for future conferences ;-) [14:54:04] New review: Daniel Kinzler; "please add unit tests" [mediawiki/extensions/Wikibase] (master); V: 0 C: -1; - https://gerrit.wikimedia.org/r/28002 [14:59:58] DanielK_WMDE: you have to compare to the current workflow of doing a move [15:00:10] we have the same "data loss" as currently [15:00:41] i would strongly favor a not-automatic approach of updating sitelinks [15:01:12] Denny_WMDE: currently langlinks are stored in db by pageids key which is not changed on moves [15:01:55] New patchset: Tobias Gritschacher; "changes to watch and unwatch in selenium tests" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28028 [15:01:56] so there is nor workflow for that atm [15:01:56] Merlissimo: no, but links *to* the moved pages are broken. [15:02:06] s/nor/no [15:02:14] Merlissimo: if i move page X to Y, how are language links that point to X fixed at the moment? [15:02:15] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28028 [15:02:22] DanielK_WMDE: that is fixed by bots, yes [15:02:27] Merlissimo: how? [15:02:34] Merlissimo: how does the bot know the new page name? [15:03:25] Denny_WMDE: doesn't have to be automatic, but it would sure be nice to offer some help and guidance [15:04:03] let's say a page on aawiki is moved without a redirect.
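[Editor's note: the "hook into the page move and call the wikidata api" idea above amounts to updating the repo's sitelink association when a client page moves. A minimal sketch, with a plain dict standing in for the repo's sitelink table; function and key names are illustrative, not the real Wikibase hook API.]

```python
# Sketch: keep the repo's (wiki, title) -> item association intact
# across a client page move, so the item does not get orphaned.

def on_page_move(sitelinks, wiki_id, old_title, new_title):
    """Update the repo's sitelink mapping after a client page move."""
    item_id = sitelinks.get((wiki_id, old_title))
    if item_id is None:
        return None  # association already broken; nothing to update
    del sitelinks[(wiki_id, old_title)]
    sitelinks[(wiki_id, new_title)] = item_id
    return item_id

# Mirrors the en:X -> en:Y example from the discussion.
links = {("enwiki", "X"): "Q23"}
moved = on_page_move(links, "enwiki", "X", "Y")
```

The open questions raised in the log still apply to any real version of this: performance, operational coupling between client and repo, and to whom the repo edit is attributed.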
then an interwiki bot starting at aawiki updates all links. if there are bots starting at other wikis they simply delete this link and a bot starting at aawiki later re-adds it. [15:04:10] but you are right, as far as I can see it's not worse than the current situation. Well, somewhat: at the moment, the page's outgoing links still work. [15:04:30] New patchset: Jeroen De Dauw; "Added GeoCoordinateFormatter" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/28029 [15:04:43] Merlissimo: so, basically, the problem heals itself based on the page's outgoing links? [15:04:48] that's interesting. [15:04:49] if there is a redirect left bots do know the new name and can even update the page if starting at another wiki [15:04:55] DanielK_WMDE: yes [15:05:08] and something that would indeed not work with wikidata. i have never thought about this case in detail. [15:05:28] but after pagemove and wikidata langlinks there are no outgoing links anymore [15:05:41] Denny_WMDE: I think Merlissimo has a point there. The problem is that the outgoing language links can no longer be used to heal the broken links. [15:05:48] Merlissimo: indeed. [15:05:55] as i said earlier: what about on the "the page has been moved" page display a link that would allow the user to mirror the change on wikidata? [15:06:22] DanielK_WMDE: that's why we currently need interwiki bots starting a every wiki [15:07:11] at every wiki [15:07:26] Denny_WMDE: sounds like a good start, and might possibly be enough. [15:08:23] Denny_WMDE: what about people ignoring this "mirror at wikidata" link, should they be blocked? [15:09:20] well, what about people who don't create redirects when they should? [15:09:23] are they blocked?
[15:09:47] no because it does not create data loss [15:10:15] bbl [15:10:17] (and that is also the reason why only admins can move pages without a redirect) [15:11:57] people moving articles to their user namespace without a good reason are blocked [15:12:18] moving pages without good reason is not constructive behaviour anyway [15:12:39] i am not sure we should be spending too much effort to support that [15:13:32] with wikidata, it will further be possible to do disruptive actions [15:15:26] Denny_WMDE: but they can be easily reverted by a single click. fixing langlinks later is much more complicated [15:16:02] In the end about one third of the memory is eaten in the attemptedSave, that was a surprise.. but it is where most of the storing happens .. [15:17:40] Merlissimo: depends. If we have a good mover, they would likely just update the language link [15:18:02] Merlissimo: if we have a bad mover, their action will probably be reverted with one click, i.e. by reverting the page move [15:21:46] JeroenDeDauw: PHP Fatal error: Can't inherit abstract function Wikibase\Claim::getPropertyId() (previously declared abstract in Wikibase\Statement) in /DATA/var/www/daniel/wikidata/extensions/Wikibase/lib/includes/statement/Statement.php on line 32 [15:21:47] o_O [15:22:44] * Merlissimo only knows the practice done by authors on dewiki [15:23:17] maybe that will become a dewiki only problem - i don't know [15:26:09] Merlissimo: sorry, I don't understand what you mean. Reverting a move is a one click action on the special:logs on de too, so I guess you mean something else? [15:26:43] i mean the case if a move is not reverted [15:27:41] you mean, the move is not reverted *and* the mover did not update the link on wikidata after they were invited to? [15:28:06] many automatically created redirects have a speedy deletion request and the admins have to take care that pagelinks are updated before deletion.
so people do not even care about those links [15:28:41] Denny_WMDE: never mind this discussion: we need a ticket for that "invite to update wikidata after move" function. [15:28:47] i think having it is pretty important [15:28:50] New patchset: Jeroen De Dauw; "Remove now duplicate method in interface" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28037 [15:28:57] DanielK_WMDE: ^ [15:28:59] i don't know if we need more, i'm inclined to wait & see [15:29:09] most people do a single action without taking care of anything [15:29:18] DanielK_WMDE: that's the same issue you ran into before with a more failing PHP :) [15:29:33] JeroenDeDauw: i got that stuff muted, it makes too many popups when i work on gerrit :) [15:29:40] and yea, php is stupid [15:29:57] At least it converts ints to floats on windows, that's a nice troll [15:32:28] * aude will read the logs later but notes the issue [15:33:10] Merlissimo: DanielK_WMDE: https://bugzilla.wikimedia.org/show_bug.cgi?id=41038 [15:33:46] this together with https://bugzilla.wikimedia.org/show_bug.cgi?id=36729 should ensure that it doesn't fall off our table [15:34:53] Denny_WMDE: thanks [15:36:09] Hola. [15:36:18] Change merged: Daniel Kinzler; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28037 [15:36:25] hi aharoni [15:36:27] hm... [15:36:40] There are some files with Windows line endings, for example lib/resources/wikibase.ui.PropertyEditTool.EditableValue.SiteIdInterface.js . [15:36:40] DanielK_WMDE: we need a way to resolve claims based on their guid. So I'm about to create a new table that maps statement guid to entity id. Anything else that should be in there that you can think of? [15:36:42] Is it OK? [15:36:48] is it just me, or is Wikibase\Test\SnakTest::testSerialize failing for others, too? 
[15:37:04] aharoni: the UI people are using Windows [15:37:28] But we should avoid the line endings I guess, they make some people mad :) [15:37:37] JeroenDeDauw: the entity type, maybe? don't know. but i don't understand why we need this anyway. [15:37:39] DanielK_WMDE: I had some issue with it a while back, but fixed it [15:37:48] JeroenDeDauw, it doesn't make me very mad :) [15:37:55] Wikibase\Test\SnakTest::testSerialize with data set #5 (Wikibase\PropertyValueSnak) [15:37:56] Exception: Wikibase\PropertyValueSnak::serialize() must return a string or NULL [15:37:58] JeroenDeDauw: --^ [15:38:01] if it doesn't spoil anything else, let it remain. [15:38:09] heh wtf [15:38:26] JeroenDeDauw: same with testGetHash [15:38:40] DanielK_WMDE: working fine for me [15:38:46] don't see how it could return null [15:39:11] JeroenDeDauw: anyway... if i have the statement ID, i should always also have the item ID, no? Can't we just require that? Just like sections don't have global IDs, you always need the page title AND the section... [15:39:19] DanielK_WMDE: you don't understand why we need to be able to resolve claims based on their guid? [15:39:45] hmm [15:39:54] Denny_WMDE: ^ what do you think? [15:40:04] yes. on their guid alone. item id + statement id should be sufficient, and should always be known. can't think of a situation where that is not the case [15:40:23] JeroenDeDauw: re SnakTest: it says it *doesn't* return null. [15:40:44] but... why is this an exception? not just a failure? [15:40:48] something is quite odd there [15:41:34] JeroenDeDauw, Denny_WMDE: by requiring the item id to be given along with the statement id, we can avoid one large table that needs to be kept in sync. [15:41:40] DanielK_WMDE: your PHP probably killed itself after your harsh comment earlier ? 
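The Windows line-endings complaint above is commonly handled with a repository-level policy rather than per-editor discipline; a minimal `.gitattributes` sketch (a suggestion only, not what the Wikibase repository actually ships):

```
# Normalize line endings in the repository; contributors on Windows
# can keep CRLF in their working copies while commits stay LF.
*       text=auto
*.php   text eol=lf
*.js    text eol=lf
```

With this in place, git converts line endings on checkin, so stray CRLF files stop showing up in review.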
[15:42:29] possibly ;) [15:44:00] DanielK_WMDE: Denny_WMDE: if we require the entity id to be provided, it will be needed in a lot of places, just look at how many times "Statement" is passed as argument here https://meta.wikimedia.org/wiki/Wikidata/Development/Phase_2_API [15:45:45] * JeroenDeDauw pokes Denny_WMDE [15:45:46] i still agree with DanielK_WMDE here. always pass the item too. [15:45:55] ok [15:46:29] Still agree? If you both wanted this, then why does the draft just have statement? :/ [15:46:56] there might be other reasons for keeping such a mapping... [15:47:07] the draft meant "the object of a statement" [15:47:17] not necessarily how it will be addressed through the api [15:47:46] this might always be the "item,key" way as stated in the prio1-methods [15:48:10] JeroenDeDauw: NumberValue::serialize() returns an int. The magic method serialize is apparently not allowed to return an int. [15:48:11] oh wait [15:48:16] look in the last line: Statement ::= item#key [15:48:58] DanielK_WMDE: I know [15:49:05] DanielK_WMDE: are you using an old version? [15:49:08] Denny_WMDE, JeroenDeDauw: that would be one way, just make the statement ID a parsable composite [15:49:25] JeroenDeDauw: i just pulled.... oh, crap. too many repos! [15:49:57] JeroenDeDauw: fairly old, it seems - sorry :) [15:50:07] but i learned something about serialize(). good. [15:50:36] Denny_WMDE: DanielK_WMDE oh right, now I remember, will use entityId#guid [15:50:52] no idea why we need a guid then, but hey ;) [15:51:12] New patchset: Amire80; "(bug 40238) Apply correct lang and dir to language names" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/27873 [15:51:26] DanielK_WMDE: do you suggest not having it then? [15:52:09] could just be a counter in the entity. _next_statement_id or whatever. but that's just a different kind of nasty... [15:52:28] guids are the nicer solution, really. just a lot of overhead. 
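The serialize() surprise discussed above is PHP-specific (the magic serialize method must return a string or NULL, so NumberValue returning an int blows up), but Python's pickle protocol enforces an analogous type contract. This sketch is an analogy only, not Wikibase code:

```python
import pickle


class BadValue:
    # Like a PHP serialize() that returns an int: __reduce__ may only
    # return a string or a tuple, so pickling this object is rejected.
    def __reduce__(self):
        return 42


try:
    pickle.dumps(BadValue())
except pickle.PicklingError as exc:
    print("pickling failed:", exc)
```

In both languages the serializer validates the return type of the hook itself, which is why the problem surfaces as an exception rather than an ordinary test failure.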
[15:52:40] i don't care much either way [15:52:53] entityid#guid sounds fine [15:53:06] they should be less overhead than keeping a keynumber etc. [15:53:18] Denny_WMDE: perhaps we should have used guids for the items too. no conflicts during import, etc. [15:53:21] anyway. [15:53:35] DanielK_WMDE: perhaps. Too late now, I'd say :) [15:53:44] what, already stopping the id bikeshed? :/ [15:53:53] I am dissapoint [15:54:46] * jeblad_WMDE has been bikeshedding with himself the whole day [15:55:27] jeblad_WMDE: you sure that counts as bikeshedding? [15:55:41] [15:56:26] I wonder if we have been fixing several bugs without knowing they existed.. [16:01:29] New patchset: Jeroen De Dauw; "Removed unused use statements :)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28039 [16:04:14] New patchset: Jeroen De Dauw; "Remove unused local vars" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28041 [16:07:59] jeblad_WMDE: i hope so :D [16:22:30] DanielK_WMDE: talking about bikeshedding [16:22:31] New patchset: John Erling Blad; "(Bug 40804) Kick the garbage collectors lazy butt" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28042 [16:22:43] do you mind if i change "perlocate" to "synch" or something? [16:24:26] New review: John Erling Blad; "The timeout seems to be very random and could be something with my setup. In a later changeset I inc..." 
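The "entityId#guid" composite ID the discussion settled on can be handled with a tiny parser; `parse_claim_id` and the example IDs below are hypothetical illustrations, not the actual Wikibase implementation:

```python
def parse_claim_id(claim_id: str) -> tuple:
    """Split a composite claim ID of the form 'entityId#guid'."""
    entity_id, sep, guid = claim_id.partition("#")
    if not sep or not entity_id or not guid:
        raise ValueError("malformed claim ID: %r" % claim_id)
    return entity_id, guid


# The entity ID travels with the statement GUID, so no separate
# guid-to-entity mapping table has to be kept in sync.
entity, guid = parse_claim_id("q42#8aeefe52-0000-1111-2222-333344445555")
print(entity, guid)
```

This is the "parsable composite" idea: callers always know which entity a claim belongs to without an extra lookup.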
[mediawiki/extensions/Wikibase] (master); V: 0 C: 0; - https://gerrit.wikimedia.org/r/27998 [16:34:42] Change merged: John Erling Blad; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28039 [16:41:14] Change merged: John Erling Blad; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28041 [17:43:32] Lydia_WMDE_ & folks: https://gerrit.wikimedia.org/r/#/c/28040/ :) [17:43:49] setting up our wikidata stuff [18:32:17] DanielK_WMDE: I has bug for j000 [18:32:18] https://bugzilla.wikimedia.org/41043 [18:32:30] New patchset: Jeroen De Dauw; "Cleanup of entity serialization in the API" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28068 [18:42:58] Reedy: got some more context? full stack trace? method to reproduce? [18:43:29] Reedy: oh, please use the ContentHandler component, so I can find this stuff. [18:44:40] It should be in our fatal logs... [18:45:02] yup [18:45:16] stack trace pasted into the bug [18:45:23] http://www.mediawiki.org/w/index.php?title=Help:Preferences/es&action=edit&section=6?printable=yes [18:45:28] Replicable at least! ;) [18:47:34] Reedy: let me guess - section 6 doesn't exist. [18:47:52] Reedy: can this wait until tomorrow, or do I have to put in a night shift? [18:48:28] let me just check [18:48:35] if normal editing works, it's not urgent IMHO [18:48:44] Which, I know it does [18:48:50] https://www.mediawiki.org/w/index.php?title=MediaWiki_1.21%2FRoadmap&diff=593873&oldid=593603 [18:49:13] oh... [18:49:18] It doesn't like the 2 question marks in the url... [18:49:20] Reedy: that URL is broken. it has two "?" [18:49:24] http://www.mediawiki.org/w/index.php?title=Help:Preferences/es&action=edit&section=6&printable=yes [18:49:25] indeed :) [18:49:38] it doesn't like any malformed section id [18:49:39] http://www.mediawiki.org/w/index.php?title=Help:Preferences/es&action=edit&section=1x [18:49:51] I wonder if someone was being given that url... [18:50:03] makes sense, i guess. 
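The broken URL above can be reproduced outside MediaWiki: everything after the first "?" is the query string, so the stray second "?" ends up inside the section value. A quick check using only the Python standard library:

```python
from urllib.parse import urlsplit, parse_qs

# The reported URL, with a second "?" where "&" was intended.
url = ("http://www.mediawiki.org/w/index.php"
       "?title=Help:Preferences/es&action=edit&section=6?printable=yes")

params = parse_qs(urlsplit(url).query)
print(params["section"])                # ['6?printable=yes']
print(params["section"][0].isdigit())   # False: not a usable section id
```

The section parameter arrives as "6?printable=yes", which is exactly the kind of malformed section id ("1x" triggers it too) that the edit code then fails on.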
try to grab section content -> get false -> try using it -> barf. [18:50:34] DanielK_WMDE: looks like a weird enough corner case that it's probably not a huge issue [18:52:04] robla: well, it would also happen if the section you are trying to edit was just removed. [18:52:10] but yea, it doesn't happen that often [18:52:20] i'll fix it first thing tomorrow, ok? [18:55:15] that's fine, thanks! [19:03:38] New patchset: Jeroen De Dauw; "Remove duplicate of file. Current version is at includes/values/IriValue.php" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/28074 [19:11:03] DanielK_WMDE is best to review, but https://gerrit.wikimedia.org/r/#/c/28076/ gets rid of the error [19:22:20] thanks aude! What does returning false in that context do? does it edit the whole page, give some other (hopefully more helpful) error, or something else? [19:24:17] New patchset: Aude; "add change row tests" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28078 [19:25:37] New review: Aude; "see tests in https://gerrit.wikimedia.org/r/#/c/28078/" [mediawiki/extensions/Wikibase] (master); V: 0 C: 0; - https://gerrit.wikimedia.org/r/28002 [19:26:49] robla: the wiki gives me an error saying i'm trying to edit a non-existing section [19:29:00] ah, that seems sensible, thanks [19:30:46] i'm not sure it totally works.... [19:31:11] i fiddled around with the getContentObject function and that seemed to work, in addition [19:32:10] New patchset: Jeroen De Dauw; "Added simple getArrayValue method to the DVs" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/28079 [19:37:47] New review: Jeroen De Dauw; "Yay! Tests!" 
[mediawiki/extensions/Wikibase] (master); V: 0 C: -1; - https://gerrit.wikimedia.org/r/28078 [19:50:46] robla: new patch, although ideal if DanielK_WMDE would review it [19:50:47] https://gerrit.wikimedia.org/r/#/c/28076/ [19:59:37] !help [19:59:45] @info [19:59:46] http://bots.wmflabs.org/~wm-bot/dump/%23wikimedia-wikidata.htm [20:00:25] !officehours [20:00:25] Next office hours on 4th (German) and 5th (English) of April at 16:30 UTC. [20:00:29] hehehe [20:00:33] Lydia_WMDE_: ^ [20:00:42] !officehours del [20:00:43] Successfully removed officehours [20:00:43] -.- [20:00:45] thx [20:01:21] !nyan is ~[,,_,,]:3 [20:01:21] Key was added [20:01:25] !nyan [20:01:25] ~[,,_,,]:3 [20:01:28] * JeroenDeDauw is happy [20:01:39] :D [20:01:56] New patchset: Aude; "add change row tests" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28078 [20:02:55] New review: Aude; "all the issues fixed. " [mediawiki/extensions/Wikibase] (master); V: 0 C: 0; - https://gerrit.wikimedia.org/r/28078 [20:04:18] heh :) [20:04:20] !nyan [20:04:20] ~[,,_,,]:3 [20:32:56] New patchset: Jeroen De Dauw; "use DV::getArrayValue" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28133 [20:33:14] New review: Jeroen De Dauw; "Depends on https://gerrit.wikimedia.org/r/#/c/28079/" [mediawiki/extensions/Wikibase] (master); V: 0 C: 0; - https://gerrit.wikimedia.org/r/28133 [22:03:04] New patchset: Aude; "add change row tests" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28078