[08:00:30] hi [08:00:39] i think i found a duplicated object [08:01:14] http://www.wikidata.org/wiki/Q231 http://www.wikidata.org/wiki/Q83078 [08:01:21] can someone merge it? [08:06:30] Schisma - Are you sure? What to do with [[Región Valona]] and [[Valonia (tierra romance)]] in eswiki, for example? [08:06:31] [7] 04https://www.wikidata.org/wiki/Regi%C3%B3n_Valona13 => [08:06:33] [8] 04https://www.wikidata.org/wiki/Valonia_%28tierra_romance%29 [08:06:49] no i'm not sure ^^ [08:08:02] but it looks like [08:08:21] we should ask someone who can read all these languages :D [08:09:07] at least the german and the english page share the same commons page [08:09:38] same flag [08:09:57] according to the map they describe the same region [08:10:11] same capital [08:10:29] same ISO code [08:10:30] etc. [08:21:49] * reosarevok checks [08:23:51] So, Región Valona is for the actual "official region" so to say, and Valonia (tierra_romance) is for the... "idea" somehow? [08:23:56] * reosarevok reads the merge discussion [08:25:37] Yeah, it claims that it's "political entity" vs "identitary concept" in the same way http://en.wikipedia.org/wiki/Catalonia and http://en.wikipedia.org/wiki/Pa%C3%AFsos_Catalans are separate pages [08:26:26] Whether that makes any sense or not, I don't know :) [08:36:21] okay [08:37:26] So a wikidata page for Valonia (tierra romance) probably shouldn't have any ISO codes, flags and the like [09:00:19] Abraham_WMDE: hi! struggling to get hangout working on the replacement box... [09:08:14] two easy reviews: [09:08:17] https://gerrit.wikimedia.org/r/#/c/74394/ [09:08:21] https://gerrit.wikimedia.org/r/#/c/75886/ [09:08:27] would be great to get those in [09:21:27] (03CR) 10Daniel Werner: [C: 032] Limiting number of registered event handlers in toolbar button widget [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/72529 (owner: 10Henning Snater) [10:20:20] aude: hello! [10:44:52] (03PS5) 10Henning Snater: (bug 48937) ValueFormatter for Time DataValue [extensions/DataValues] - 10https://gerrit.wikimedia.org/r/74599 [10:53:23] Tobi_WMDE: https://gerrit.wikimedia.org/r/#/c/76508/ [10:56:12] (03CR) 10Tobias Gritschacher: [C: 032] Do not install satooshi/php-coveralls for now, since we are not using coveralls for this repo yet [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/76508 (owner: 10Jeroen De Dauw) [10:56:15] hi pragunbhutani [10:56:19] (03Merged) 10jenkins-bot: Do not install satooshi/php-coveralls for now, since we are not using coveralls for this repo yet [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/76508 (owner: 10Jeroen De Dauw) [10:57:10] (03PS3) 10Jeroen De Dauw: Implementation work on QueryEntityDeserializer [extensions/WikibaseQuery] - 10https://gerrit.wikimedia.org/r/76096 [10:57:30] Denny_WMDE: https://gerrit.wikimedia.org/r/#/q/status:open+project:mediawiki/extensions/WikibaseQuery,n,z [10:57:37] . [10:59:57] DanielK_WMDE: hey :-] [11:00:39] DanielK_WMDE: was replying to your error_reporting detection in PHPUnit. Seems the root cause is wfSuppressWarnings() not being matched by a wfRestoreWarnings(). Same happens with wfProfileIn() / wfProfileOut(). [11:01:35] DanielK_WMDE: I am wondering whether we should phase out those global functions with a class of static methods.
That would let us track the last caller and report back an error whenever a caller is still set and one calls wfProfileIn() or wfSuppressWarnings() [11:02:00] JeroenDeDauw: should have pinged you as well for above text ^^^ :-] [11:02:35] hashar: could do that, but just tracking the level in unit tests would already help a lot with finding such "leaks". [11:02:46] yup I agree [11:02:51] looking for a long term solution [11:05:38] hashar: i still think my patch is a good short term solution :)= [11:10:18] (03PS2) 10Daniel Werner: Performance improvement for jQuery.wikibase.entityview edit toolbars handling [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/76490 [11:15:52] DanielK_WMDE: for creating a mobile view of wikibase, i'm wondering where the most suitable place is to store the mobile text? [11:16:47] i see stuff (result of EntityView->getHtml() ) is put into parser output text [11:17:07] then output page gets that and it can be displayed [11:44:32] hashar: where does one need those methods anyway? [11:44:36] or rather functions [11:45:14] that got made to workaround @ slowness and evilness [11:45:44] how is @ evil again? I forgot [11:45:56] can't remember [11:46:03] I think it was plain sold back in php4 day or something [11:46:18] and could cause some weird error when the @ed function throw an exception [11:46:42] hashar: I have not seen other php apps use this [11:46:55] So perhaps there is no need for evil global functions or static methods [11:47:01] that is because our code base sucks :] [11:47:03] That need an opening and closing call [11:47:07] Which clearly is error prone [11:47:28] yup [11:47:37] yeah well, then fi the code in question, rather then introducing an equaly bad though more complex way of hacking around it [11:47:46] and lot of error hiding are made without even a comment line explaining why it is required [11:48:02] mhh [11:48:16] hashar: we are also not using Exceptions properly in MW [11:48:21] See my last mail to smwdevel [11:48:31] mind linking to archive ? :D [11:48:36] I am not subscribed to there [11:49:15] perhaps I should blog about this ;p [11:49:19] yup :-] [11:49:58] we should also list all those issues and start dealing with them [11:50:05] or at least write plans to get rid of the oddities [11:54:35] hashar: easier said then done ofc [11:54:43] Since people will not agree on the issues being issues [11:54:47] Or on how to solve them ;p [11:55:21] hashar: did you see http://www.bn2vs.com/blog/2013/07/23/profiling-and-logging-in-mediawiki/ ? [11:56:35] hashar: http://sourceforge.net/mailarchive/message.php?msg_id=31231379 [12:00:01] separation of concerns is a big issue [12:00:25] and us maintaining ton of reinvented wheels [12:00:43] I am out to grab a snack, will look at your blog post while eating :-] [12:15:46] aude: i suppose the mobile version would also be handled via ParserOutput, with the "mobile" flag somehow getting used in the ParserCache key. [12:16:03] (we currently have the parser cache disabled, but that shouldn't stay that way) [12:20:37] DanielK_WMDE: ok, we can see about putting a flag [12:23:14] would that be with ParserOptions? [12:23:36] addExtraKey( $key ) [12:34:00] hmmmm.... just when i want to go out to get food... big thunderstorm! [12:52:54] aude: i guess so, i'm blurry on the details. 
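A minimal sketch of the ParserOptions idea floated above (varying the ParserCache key for mobile rendering); the helper name and the $isMobileView flag are hypothetical stand-ins, not actual Wikibase or MobileFrontend code:

```php
<?php
// Minimal sketch (assumption, not actual Wikibase code): vary the parser cache
// entry for mobile rendering by adding an extra fragment to the cache key.
// $isMobileView is a hypothetical flag; how mobile requests are detected is out of scope here.
function addMobileParserCacheKey( ParserOptions $parserOptions, $isMobileView ) {
	if ( $isMobileView ) {
		// Mobile and desktop HTML now end up under different parser cache keys,
		// so re-enabling the parser cache later won't serve one to the other.
		$parserOptions->addExtraKey( 'wikibase-mobile' );
	}
}
```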
[12:53:10] about the parser cache, not thunderstorms :P [12:55:43] DanielK_WMDE: we have EntityContent (or whatever type of content) :: getParserOutput() [12:55:54] that returns EntityView::getParserOutput [12:57:29] when we are viewing a page (like at skin or other level), then i'm trying to figure out how to request what [12:58:20] * aude came up with https://gist.github.com/filbertkm/66223517a4e158a0b5af (to have parser output store the entire html) [12:58:50] or it could store maybe the data [12:59:06] although output page does extra stuff to check permissions, so not sure [13:13:14] (03PS1) 10Tobias Gritschacher: Added new ChangeOp for claim operations [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/76710 [13:25:46] (03CR) 10Denny Vrandecic: [C: 032] Make use of MWQueryInterfaceBuilder [extensions/WikibaseQuery] - 10https://gerrit.wikimedia.org/r/76088 (owner: 10Jeroen De Dauw) [13:26:58] (03CR) 10Denny Vrandecic: [C: 032] Implementation work on QueryEntityDeserializer [extensions/WikibaseQuery] - 10https://gerrit.wikimedia.org/r/76096 (owner: 10Jeroen De Dauw) [13:27:14] (03Merged) 10jenkins-bot: Make use of MWQueryInterfaceBuilder [extensions/WikibaseQuery] - 10https://gerrit.wikimedia.org/r/76088 (owner: 10Jeroen De Dauw) [13:28:31] (03Merged) 10jenkins-bot: Implementation work on QueryEntityDeserializer [extensions/WikibaseQuery] - 10https://gerrit.wikimedia.org/r/76096 (owner: 10Jeroen De Dauw) [13:28:42] Denny_WMDE: http://ec2-54-234-76-44.compute-1.amazonaws.com/clouds.php :) [13:30:44] Granjow: cool [13:32:33] [travis-ci] wikimedia/mediawiki-extensions-WikibaseQuery#25 (master - 4371d89 : jeroendedauw): The build has errored. [13:32:33] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-WikibaseQuery/compare/5c9ebaeeb40e...4371d8994a84 [13:32:33] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-WikibaseQuery/builds/9647904 [13:32:36] why do we collect Wikipedia disambiguation pages as entities? [13:32:56] shouldn't they be generated by wikidata itself? [13:34:28] [travis-ci] wikimedia/mediawiki-extensions-WikibaseQuery#26 (master - 3857086 : jeroendedauw): The build has errored. [13:34:28] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-WikibaseQuery/compare/4371d8994a84...38570864619f [13:34:28] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-WikibaseQuery/builds/9647998 [13:37:04] Tobi_WMDE: easy review is easy https://gerrit.wikimedia.org/r/#/c/76535/ [13:37:40] Schisma: I think for some of these pages it makes sense to have language links [13:40:28] (03PS1) 10Jeroen De Dauw: Fix TravisCI build by including example repo config [extensions/WikibaseQuery] - 10https://gerrit.wikimedia.org/r/76716 [13:42:12] JeroenDeDauw: I love your blog links "By the way, do you know what the strpbrk PHP function does? No? You didn’t even know it existed? I’m not surprised. " from STUPID http://nikic.github.io/2011/12/27/Dont-be-STUPID-GRASP-SOLID.html [13:44:48] JeroenDeDauw: the more I think about it the more I guess we could even get rid of wfProfile entirely [13:44:57] or any profiling tools in Mediawiki [13:47:11] Krinkle|detached: In case you read this, is submitting a patch for jQuery the right way to get the docs updated? 
(https://github.com/Granjow/api.jquery.com/compare/jquery:master...patch-1?quick_pull=1) [13:58:37] (03PS1) 10Jeroen De Dauw: Finish initial implementation of IndexDefinition [extensions/WikibaseDatabase] - 10https://gerrit.wikimedia.org/r/76717 [14:01:08] (03CR) 10Daniel Kinzler: "(2 comments)" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/76710 (owner: 10Tobias Gritschacher) [14:03:31] hashar: that would be ideal from a code quality perspective yes [14:04:49] JeroenDeDauw: there is a bunch of other stuff we could drop as well :D [14:13:59] (03PS1) 10Jeroen De Dauw: Remove index field from FieldDefinition [extensions/WikibaseDatabase] - 10https://gerrit.wikimedia.org/r/76718 [14:13:59] (03PS1) 10Jeroen De Dauw: Readability imporvements to FieldDefinitionTest [extensions/WikibaseDatabase] - 10https://gerrit.wikimedia.org/r/76719 [14:14:44] hashar: we might want to switch to using private chat and OTR, else you might get assassinated ;p [14:15:10] I am powerful enough to get the killers diverted to some other task :-] [14:15:13] such as writing unit tests [14:16:21] Denny_WMDE: sure, they should simply link to the same word in another language [14:16:38] transliterated of course [14:17:31] (03PS6) 10Daniel Kinzler: Adding tests for claims to EntityTest. [extensions/WikibaseDataModel] - 10https://gerrit.wikimedia.org/r/72974 [14:18:26] hashar, JeroenDeDauw: when testing one of the lib repos that Wikibase depends on, we should really test all of wikibase with it, right? [14:18:45] otherwise, we could merge something into DataValues that is internally consistent but breaks Wikibase [14:19:31] (03PS1) 10Jeroen De Dauw: Match changes in WikibaseDatabase [extensions/WikibaseQueryEngine] - 10https://gerrit.wikimedia.org/r/76720 [14:19:40] (03PS1) 10Henning Snater: Cleaned up siteselector and entityselector QUnit test files [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/76721 [14:19:41] hashar, JeroenDeDauw: we had that case last week, i'd like to avoid this in the future. the purpose of CI is to avoid this by making sure everything is tested with everything, right?... [14:21:26] Denny_WMDE, aude: fyi, deployment of the url data type is blocked on https://gerrit.wikimedia.org/r/#/c/75867/ [14:21:31] needs approval from the core team [14:22:10] (03CR) 10Jeroen De Dauw: "(1 comment)" [extensions/WikibaseDataModel] - 10https://gerrit.wikimedia.org/r/72974 (owner: 10Daniel Kinzler) [14:23:23] JeroenDeDauw: i don't understand that comment ---^ [14:24:05] JeroenDeDauw: you want me to move the claim stuff to ItemTest, even though the Entity class defines the functions? [14:24:41] also, I still think it's bad to make rank_truth the default (and why "truth"? is that the same as "preferred"?) [14:26:56] hashar: death by unit test? [14:28:01] (03CR) 10Daniel Kinzler: "(1 comment)" [extensions/WikibaseDataModel] - 10https://gerrit.wikimedia.org/r/72974 (owner: 10Daniel Kinzler) [14:28:43] JeroenDeDauw, hashar: i suggest we set up the same test jobs for all the extensions involved in wikibase, so everything is tested with everything before getting merged. [14:28:47] what do you think? [14:29:12] One job that runs it all, and is part of all builds would be good [14:29:25] I hope we can avoid having to duplicate the config all over [14:29:35] I have no clue what Wikibase is nor do I have a clue about what the libs contain :D [14:29:44] they are merely git projects that need to have phpunit tests run.
[14:29:49] Since it would really suck to update the config for Diff because some extension to Wikibase has some new dependency or something [14:30:48] DanielK_WMDE: you can always set this up on Travis yourself, though that is going to make the config more complex [14:31:02] but i want gerrit integration. [14:31:05] potentially we might want to run the wikibase test suite whenever one of its dependencies received a new patch [14:31:08] the important bit is the gateway check [14:31:30] hashar: that's exactly what i mean. not "potentially" but "urgently". [14:32:04] there are two main problems there which both relate to how we manage the dependencies: [14:32:12] i broke master last week, by merging something into DataValues that broke a test in Wikibase. not good. [14:32:33] 1) the Zuul layout would need to be updated to trigger a Wikibase test suite whenever a change is merged in a dependency. So when a dependency is added Zuul needs to be edited. [14:33:24] can't zuul use a dependency file maintained in the target repo? [14:33:36] 2) the jenkins jobs define the dependency version as being the latest master version. Which makes the tests somehow unpredictable [14:33:45] yeah that is called submodules :-D [14:33:57] but submodules suck for deployment [14:34:07] not really [14:34:11] especially if multiple extensions have the same dependencies. they can't share submodules [14:34:11] (03CR) 10Jeroen De Dauw: "(1 comment)" [extensions/WikibaseDataModel] - 10https://gerrit.wikimedia.org/r/72974 (owner: 10Daniel Kinzler) [14:34:31] but what sucks is that Diff could end up being shipped as a submodule of Wikibase and as a submodule of another extension, or directly as an extension [14:34:40] then you end up with two versions of Diff and a duplicate class [14:34:56] hashar: which will cause php to scream and die [14:35:09] if we only look at Wikibase, you could have a specific version of each library bound to a certain version [14:35:17] that doesn't help [14:35:34] if you wanted to update the dependency you would update the composer.json versions for travis and the submodule pointers and craft a commit "update dependencies" [14:35:39] and we don't want to bind to a certain version [14:35:51] yup I understood that [14:36:21] so what could be done is that on each dependency we had a 'postmerge' that triggers the Wikibase job [14:36:26] hashar: and the README ;p [14:36:36] that should fetch out Wikibase + all latest master versions and report back in Gerrit whether something weird happened. [14:36:43] how about this (based on my blurry understanding of zuul): when a commit for repo X comes in, zuul looks at X/composer.json to find out all the dependencies. [14:37:24] nop [14:37:31] it then checks out the master branch for all the dependencies, and runs the tests. [14:37:32] ideally it would do that one day :-] [14:37:34] no?= [14:37:48] (it still needs a way to hook them all up with mediawiki, or at least register the tests) [14:37:50] the workaround is that we declare in the Jenkins jobs the list of extensions which are dependencies [14:38:05] and pass the list of extensions to a lame shell script that clones each dependency in the extensions/ directory [14:38:21] that is lame :/ [14:38:26] the poor man's dependency system [14:38:37] hashar: but that's what i don't understand...
basically, all (deployed) extensions have to be compatible with each other, so they all "depend" on each other [14:38:58] to support composer we would need to have it packaged for Ubuntu and probably setup a composer repository hosted on wmf wiki (unless composer is able to fetch from a git repo) [14:39:01] there is nop need to define dependencies for individual extensions [14:39:11] hashar: never mind composer. [14:39:18] yeah lets skip composer [14:39:18] make a single list of all deployed extensions [14:39:26] if any of them changes, test all (and core) [14:39:29] so extensions are simply tested against the latest mediawiki core [14:39:35] we do not have any integration tests for the wmf branches [14:39:37] ...and all other extensions! [14:39:49] well, you can have those too [14:39:54] i'm thinking of the gateway job [14:39:59] we would need to fetch mediawiki-config , fetch both wmf branches, update all submodules and then run the phpunit tests of both wmf branch [14:40:12] and the gateway job should be an integration test against the master of all other extensions (and core) [14:40:13] that would need to be done on every single patchset being merged :-] [14:40:25] oh yeah + master [14:40:52] and we might want to test master of extension with the REL1_19 / REL1_20 / REL1_21 branches of core whenever the extension is only having a master branch [14:40:59] so yeah there is still a lot of job to be done there [14:41:02] hashar: why would it need bundling with ubuntu? [14:41:05] we *might* want to do all of that [14:41:10] but that's not the point [14:41:17] JeroenDeDauw: we can't install software on production without a debian package [14:41:31] hashar: when merging something into core, it's tested against core and a bunch of extensions, right? [14:41:35] hashar: you dont need to "install" it [14:41:44] hashar: just get the phar, and run using the phar ;p [14:41:58] you can even get it at the start of the job and then ditch it again [14:41:58] I don't need, I HAVE to ! [14:42:30] which need one have to figure out whether composer works with the years old PHP version we are using [14:42:33] hashar: ...all i'm saying is that the same should happen for changes to any of these extensions. [14:42:33] write the debian package [14:42:49] convince some ops to review it and finally get the puppet manifest reviewed to actually install it [14:42:52] hashar: jenkins is "installing" MW extensions just as much as it would be "installing" composer. So it seems to me this ought to not be a problem [14:42:58] that would takes 2 - 3 days of work over a couple months. [14:43:13] hashar: it does work with 5.3 [14:43:23] hashar: we have this working on Travis rmemeber? ;p [14:43:45] hashar: that all seems besides the point of what i'm asking.... [14:44:01] DanielK_WMDE: when a change is merged in core, we only run the core unit tests. Not all the extensions (we have 600 of them). [14:44:13] hashar: deployed extensions. [14:44:16] DanielK_WMDE: and some extensions are incompatible with each others (at least last time I checked) [14:44:28] ick. [14:44:35] deployed extensions? [14:45:03] DanielK_WMDE: we don't do that either. 
That will also need to setup the mediawiki-config repo and run the tests using different database id (such as commonswiki, dewiki, wikidata, testwiki and some other important roles) + vary by mobile maybe [14:45:33] o_O [14:45:36] *sigh* [14:45:44] yeah :( [14:45:57] but you know all of that is mostly half a person time (aka me half the time) [14:46:06] the other half I got to handle beta / ops [14:46:13] ok, so let's try a simpler approach: one list of extensions that gets tested whenever one of the extensions on the list gets a commit. [14:46:30] and the last half I do review and realease management [14:46:41] heh :P [14:47:01] anyway... is that doable? the list is already there (it's the dependency list of Wikibase). [14:47:13] [1..3].clone(hashar)->setName( randomfolk() ) [14:47:14] All I want is for all the extensions on that list to have the same list of dependencies [14:47:38] that would end up triggering the same job [14:47:47] so we could get a "wikibase dependencies integration test" [14:47:47] hashar: yea, spawning clones is cool, but i'd like to join & merge later :) [14:47:51] that would be triggered by each project [14:47:57] yes [14:48:00] that's the idea [14:48:17] doesn't even need to trigger for each PS [14:48:20] that is potentially easy to do :-] [14:48:20] but before merging [14:48:26] hashar: that would be great! [14:48:58] hashar: it would also make maintenance easier: only one list of dependencies to manage [14:49:02] * hashar digs in Zuul mess [14:49:30] so on wikibase we have three jobs triggered on each patchset: [14:49:31] - mwext-Wikibase-testextensions-master [14:49:32] - mwext-Wikibase-client-tests [14:49:33] - mwext-Wikibase-repo-tests [14:49:43] but before merge we only have mwext-Wikibase-repo-tests :D [14:50:20] eek [14:50:23] that's wrong :) [14:50:36] mwext-Wikibase-testextensions-master comes with all the dependencies apparently : 'Ask,Serialization,Diff,DataValues,DataTypes,WikibaseDataModel,Validator' [14:50:42] we want mwext-Wikibase-testextensions-master there [14:50:50] yea [14:50:51] bah that job is duplicated ho ho [14:51:03] * hashar needs vacations [14:51:26] that job should also cause the repo and client extension of wikibase to be active at once. [14:51:31] DanielK_WMDE: do you know your way in zuul config ? [14:51:41] i have no clue about it whatesoever [14:51:44] never looked at it [14:52:19] DanielK_WMDE: https://github.com/wikimedia/integration-zuul-config/blob/master/layout.yaml#L2155 [14:52:54] that defines Zuul behavior whenever something happens on mediawiki/extensions/Wikibase [14:52:54] test: is when a whitelisted user submit a patchset [14:53:03] gate-and-submit: the jobs triggered on CR+2 that will lead to a merge [14:53:42] I am not sure whether the client/repo jobs are still needed since there is the "do it all" mwext-Wikibase-testextensions-master [14:53:43] yea, looks streight forward [14:53:55] but can i re-use bits of this? macros/references/inclues? [14:54:02] the job mwext-Wikibase-testextensions-master is defined in Jenkins Job Builder which gives out dependencies. [14:54:20] so potentially we could add to each Wikibase dependencies project a new job to be triggered in gate-and-submit: that would run the mwext-Wikibase-testextensions-master [14:54:31] hashar: the client job is definitly needed, to check the client works without the repo. the repo-without-client thing... is nice to have. 
[14:54:49] but that would only run the master version of them, not the actual patchset being submitted [14:55:02] (03PS1) 10Tobias Gritschacher: Use ChangeOps for wbsetclaimvalue and wbcreateclaim [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/76725 [14:55:23] hashar: is there no "get master of all these, then apply the current patch to this repo" thingy? [14:56:13] that's basically the logic we need: when submitting for B, test A,B*,C. When submitting for C, test A,B,C*. [14:56:34] will need to write something for it [14:56:55] hmm [14:57:21] might not even work since the mwext-Wikibase-testextensions-master is made to fetch the patchset from Wikibase extension [14:57:33] so if the job is triggered by another extension, it will not find the patchset submitted in Wikibase [14:57:43] that need a bunch of hacking / new jobs defined [15:01:14] (03PS7) 10Daniel Kinzler: Adding tests for claims to ItemTest. [extensions/WikibaseDataModel] - 10https://gerrit.wikimedia.org/r/72974 [15:01:19] aude: sorry, I was away [15:01:19] JeroenDeDauw: --^ [15:01:47] hashar: it seems so easy, yet it's so hard... [15:02:10] hashar: considering that this is what CI is all about, I'm surprised there's no tools for this in place [15:04:49] I filled a bug https://bugzilla.wikimedia.org/show_bug.cgi?id=52278 [15:04:54] with my own random thought in it [15:06:35] (03PS2) 10Tobias Gritschacher: Use ChangeOps for wbsetclaimvalue and wbcreateclaim [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/76725 [15:11:05] (03PS2) 10Henning Snater: Cleaned up QUnit test files [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/76721 [15:16:28] (03PS1) 10Tobias Gritschacher: Get rid of unused Autocomment code [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/76727 [15:18:17] (03PS1) 10Jeroen De Dauw: Added indexes field to TableDefinition [extensions/WikibaseDatabase] - 10https://gerrit.wikimedia.org/r/76728 [15:24:58] (03PS2) 10Tobias Gritschacher: Get rid of unused Autocomment code [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/76727 [15:28:58] (03CR) 10Daniel Kinzler: [C: 032] Initial work on IndexDefinition [extensions/WikibaseDatabase] - 10https://gerrit.wikimedia.org/r/76535 (owner: 10Jeroen De Dauw) [15:29:04] (03Merged) 10jenkins-bot: Initial work on IndexDefinition [extensions/WikibaseDatabase] - 10https://gerrit.wikimedia.org/r/76535 (owner: 10Jeroen De Dauw) [15:30:46] hi pragunbhutani [15:31:01] (03CR) 10Daniel Kinzler: [C: 031] "seems fine, but we don't have any tests for this, and I didn't test manually." [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/74636 (owner: 10Umherirrender) [15:32:30] aude: hello [15:32:42] we still don't have a consensus on how to proceed? [15:32:55] pragunbhutani: not sure [15:33:14] I did read the mail you sent [15:33:27] if we do everythign in the skin that that means transforming the standard html, right? [15:33:54] could be possible but i'm not as familiar with the approach [15:34:26] I think that's how mobile frontend handles wikipedia stuff [15:34:39] we could wait for Jon to respond to your mail [15:34:42] yeah, which seems to work but [15:34:45] that'll shed some light on the matter [15:35:03] i'm thinking option 2 would be nice, if there are no issues with it [15:35:32] to have the data in the parser output and then would make it flexible for skins to do things [15:36:18] is there any reason that we shouldn't do so? [15:36:24] i don't know [15:36:37] like to make sure permissions are still handled correctly, etc. 
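A minimal sketch of what "option 2" (data stored in the ParserOutput and handed to the skin) could look like; the class name, method names and the 'wikibase-mobile-html' key are hypothetical placeholders, not the actual Wikibase implementation:

```php
<?php
// Sketch only: hypothetical hooks class, not the actual Wikibase code.
class ExampleMobileViewHooks {

	/**
	 * Called from wherever the ParserOutput is built (e.g. from within
	 * EntityContent::getParserOutput()): stash the mobile rendering alongside
	 * the normal HTML so it ends up in the parser cache as well.
	 */
	public static function stashMobileHtml( ParserOutput $parserOutput, $mobileHtml ) {
		$parserOutput->setExtensionData( 'wikibase-mobile-html', $mobileHtml );
	}

	/**
	 * OutputPageParserOutput hook handler: copy the stashed data onto the
	 * OutputPage so a (mobile) skin can pick it up. Permission checks are
	 * untouched, since this only moves already-rendered output around after
	 * Article::view() has done its usual work.
	 */
	public static function onOutputPageParserOutput( OutputPage $out, ParserOutput $parserOutput ) {
		$mobileHtml = $parserOutput->getExtensionData( 'wikibase-mobile-html' );
		if ( $mobileHtml !== null ) {
			$out->setProperty( 'wikibase-mobile-html', $mobileHtml );
		}
		return true;
	}
}
```

A skin could then read the value via $out->getProperty( 'wikibase-mobile-html' ) and fall back to the standard body HTML when nothing was stashed.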
[15:36:55] out for today, catching my daughter. [15:37:10] it might be worth another wikitech mail, but meanwhile [15:37:26] we can still work on the layout, etc. [15:38:58] pragunbhutani: but then the way i obtained the parser output is a bit hacky and not sure the best way [15:39:40] aude: oh okay [15:39:48] it worked, though! [15:40:00] and doing that again would mean replicating the hacky stuff [15:40:01] you can put anything into the extension data [15:40:07] not sure that's good for the long run? [15:40:15] right [15:40:46] yeah, true [15:41:21] what about just parsing the dom and adjusting the layout? [15:41:33] that does solve this problem [15:41:34] that can work [15:41:37] yeah [15:42:15] [travis-ci] wikimedia/mediawiki-extensions-WikibaseDatabase#31 (master - 257cfca : jeroendedauw): The build passed. [15:42:15] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-WikibaseDatabase/compare/213e62e8738e...257cfca1611b [15:42:15] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-WikibaseDatabase/builds/9653321 [15:42:30] some of the issues are that the current php entity view does not render all the data [15:42:38] it leaves some stuff like q3 [15:42:46] i think we are working to fix that [15:43:22] option 2 would be nice because if we have data in a serialized format [15:43:34] it can be re-used for other purposes [15:43:38] yeah [15:43:56] then it's just a matter of sticking templates (however they are done) in the right place? [15:44:17] yeah that sounds about right to me [15:45:00] okay, so the bottlenecks with option 2 are 1) tacky parser output and 2) permission issues [15:45:02] anyway, key pieces of code to look at are ViewEntityAction which then calls Article::view() [15:45:04] *hacky [15:45:29] it gets the parser output from either cache or otherwise it's generated [15:45:54] there is EntityContent (and related content classes in core) which has getParserOutput [15:46:02] which then gets stuff from EntityView [15:46:23] or if it's a wikitext page (talk page), then it does the normal stuff [15:46:31] (03PS1) 10Henning Snater: Adjusted site selector widget to site groups [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/76735 [15:46:59] then Article::view puts stuff into output page, which skin can access [15:48:04] so doesn't that mean that we already have our data in parser output? [15:48:11] issue is to either make use of what's in output page by default (standard html) or access parseroutput [15:48:33] we do, but skin / output page don't have direct access to parseroutput [15:48:59] if it's sane to access it how i did it, then option 1 or 2 might be okay [15:49:05] ahh so that's the permission issue that you were talking about? [15:49:15] i'm not sure where permissions are handled [15:49:42] it seemed to work in my snippet (e.g. if i set my wiki to read only) [15:50:09] and if there was no mobile-html or whatever in parser output, then it defaults to the standard html [15:50:29] probably super too simplistic though [15:51:17] aude, pragunbhutani: you'd need to hook into OutputPage and make it take some "extra data" from ParserOutput, so it becomes available to the skin. [15:51:24] DanielK_WMDE: yes [15:51:30] PHP lets you glue random info to any object... 
hackish but handy :) [15:51:35] that's what i tried [15:51:37] well it works [15:51:45] DanielK_WMDE: there is extension data now in parser output [15:51:54] which is designed to avoid the most hackish [15:52:29] aude, DanielK_WMDE : what 'extra data' do we need? [15:52:29] aude: yes, guess who added that :) [15:52:34] heh :D [15:52:44] i don't know what extra data you need [15:52:58] if all you need in the skin is the HTML, then there is no need to hack around [15:53:15] otherwise, i recommend to use the OutputPageParserOutput hook to transfer whatever you need from the PArserOutput into the OutputPage object [15:53:28] right, like how we make entity id available [15:53:39] kind of.... [15:53:48] I think the html is all we need [15:53:49] anyway, off to fix dinner :) [15:54:01] because a skin is essentially just a different layout, isn't it? [15:54:13] $pout->getExtensionData [15:54:45] pragunbhutani: the skin usually does the different layout and then plop the body content in [15:54:55] (with whatever special css and javascript, applied) [15:55:30] which is what we need [15:55:38] true [15:56:16] how about see what can be done with transforming the html? [15:56:24] without any extra data [15:56:57] btw, no js (since not all mobile devices have js enabled) [15:57:23] so then do we go the mobile frontend way? [15:57:56] if we can make it work well [15:58:20] if it becomes too hackish, then lets consider providing some extra data in the parseroutput / output page [15:58:44] whatever is needed to make things work well [16:03:45] let's see what Jon says then [16:04:14] I'll ask him to comment on how feasible he thinks it is [16:04:27] meanwhile, I can do an html mockup of the layout with dummy entities [16:05:15] aude: DanielK_WMDE: I found an easteregg in the API! \o/ [16:05:42] :) [16:06:18] hah [16:06:31] pragunbhutani: sounds good [16:07:07] wbremoveclaims has the ability to remove multiple claims (from whatever entity they're from) - ok, the name implies this - by concatenating multiple guids with a "|" - but this is not documented anywhere in the API [16:07:46] so, I ask myself what would be a usecase for that functionality? [16:07:53] * aude new that [16:07:56] knew [16:08:04] aude: I knew that you knew.. ;) [16:08:06] that's why it's removeclaims (plural) [16:08:13] bizarre, nonetheless [16:08:37] aude: but the parameter says "claim" which is singular and it is not documented [16:08:51] surprised! [16:09:08] i wonder if anyone is using remove claims in that way? [16:10:40] aude: me too. and it's extremely hard to construct meaningful summaries for that [16:11:02] beacause the claims do not have to belong to the same entities [16:11:35] hmmm, i would make it into multiple change ops? [16:12:07] do we really want to support removing multiple claims, though? [16:12:18] aude: but I cannot apply those multiple changeOps as a patch of changeOps because they all affect different entities [16:12:32] true [16:13:10] what does addshore say? [16:13:11] aude: DanielK_WMDE: so, do you think I can remove this (hidden) functionality and just make it a "removeclaim" (singular) [16:13:12] ? [16:13:28] changing the name of the module itself is quite significant [16:13:40] removing the functionality.... 
not a problem, imho [16:13:50] if it can still remove multiple claims of a single entity [16:14:06] just not for any number of different entities [16:14:23] aude: yes, that would be a solution [16:14:26] ok [16:14:39] * aude says just do that [16:14:55] but think we'll need to announce breaking change [16:15:15] aude: we would have to check if all given claims belong to the same entity. or we would have to introduce a param "entity" [16:15:31] i'd say both [16:15:50] or maybe a new param is ok [16:16:00] like the new=item one [16:16:51] aude: would just be a param "entity" which has to be the entityId [16:17:21] i can't imagine it's a problem, but that *is* an important breaking change [16:17:57] aude: or we don't add a new parameter and just check whether all the claims are from the same entity and throw an error otherwise [16:18:28] which is better? [16:18:37] aude: then it would even not be a real breaking change [16:18:44] true [16:18:58] aude: because we just removed undocumented functionality [16:19:00] i would be okay with either but ask addshore [16:19:06] sure [16:19:11] and the "normal" behaviour would stay the same [16:19:16] ok [16:19:37] aude: will do it in that way [16:19:44] ok [16:20:05] don't expect many complains as I don't think this was used at all [16:20:17] i think it would be bad for bots to do that [16:20:31] and no edit summary can properly explain that [16:20:57] aude: bad for bots to do what? [16:22:29] Tobi_WMDE: to be removing claims from multiple entities in a single api call [16:22:51] it would be a challenge to monitor those edits [16:22:52] aude: yes [16:23:22] aude: it would be impossible to see that in the diff, right? [16:23:52] it's evil [16:24:00] ugh [16:27:08] (03PS1) 10Daniel Werner: (bug 48611) valueview expert test setups won't create instances anymore [extensions/DataValues] - 10https://gerrit.wikimedia.org/r/76741 [16:28:15] Danwe_WMDE: Can you explain what an 'expert' is in this context (just curious) - or a link to doc. [16:28:27] * Krinkle wonders what experts we are destroying (or not) that cause tests to fail [16:28:58] https://bugzilla.wikimedia.org/show_bug.cgi?id=48611 [16:31:38] Krinkle: jQuery.valueview.valueview is our widget for displaying and editing data values. There is a composition relation between that widget and an "Expert" instance. A specific expert implementation takes care of displaying/editing a certain kind of data value. [16:37:34] Danwe_WMDE: ok [16:38:08] I updated the bug as well [16:41:12] Danwe_WMDE: Not sure if this is an option here, but you may want to use setup/teardown. It looks like you're trying to implement that manually. [16:41:59] Krinkle: thought about it but it didn't seem to integrate with the qunit parameterize thing we are using [16:42:38] have to look into it again, could be adjusted perhaps [16:43:06] Yeah, not sure if that's needed but oh well. [16:44:17] it saves some code if you want to do the same assertions on instances created with different constructor arguments [16:45:32] aude: also, please remember to fill the evaluation report! :) [16:45:40] pragunbhutani: done! [16:46:31] oh cool, I've done it as well [16:46:35] I'll update it in the table then [16:46:40] ok [16:46:53] thanks ;) [16:46:59] and don't forget the monthly status update for wmf [16:47:05] (03PS2) 10Daniel Werner: (bug 48611) valueview expert test setups won't create instances anymore [extensions/DataValues] - 10https://gerrit.wikimedia.org/r/76741 [16:47:15] you mean the monthly report? 
[16:47:18] yes [16:47:25] yep, I've linked that in the table already :) [16:47:30] for july [16:47:31] perfect [16:49:04] so i think at this point, in terms of layout, we have mockup 2 and 3 [16:49:55] maybe send a followup email to wikidata-l / mobile lists? [17:01:56] yep, and I really like mockup 3 [17:02:06] okay, I'll send a mail through [17:04:49] pragunbhutani: i like it too :) [17:06:39] maybe we can use something like mockup 2 for the smaller devices where the width isn't enough to accommodate two boxes [17:06:39] and use mockup 3 on larger devices and tabs [17:06:44] because the boxes can't be too small, or they won't be readable [17:07:00] true [17:07:16] i'd be interested in seeing a demo of both (hard coded) [17:07:27] it's difficult to to decide from mockups (for me) [17:07:43] yeah, the HTML mockups will make that easier [17:07:46] you can do this entirely within the skin [17:08:13] where i assign the body text in https://gist.github.com/filbertkm/66223517a4e158a0b5af [17:08:20] anything can be assigned from anywhere [17:08:44] without the parser output stuff [17:10:09] you mean hard-coding a demo of the layouts? [17:10:32] yeah [17:10:39] i think that would help [17:11:16] how can I do that, again? [17:11:19] and then which one we decide (with or without subsequent tweak), it's only a matter of hooking up the pipes (to get the real data) [17:11:25] in the skin? [17:12:03] ah okay you mean we can supply any html to that 'bodytext' [17:12:05] you could make two skins (derived from SkinMobile)? [17:12:08] yeah [17:12:14] (03PS1) 10Daniel Werner: valueview experts have a clear definition of destroy now [extensions/DataValues] - 10https://gerrit.wikimedia.org/r/76744 [17:14:03] i would start with one of them [17:14:51] i'm not sure how to have two mobile skins and toggle between them [17:15:25] is it possible to make the skin responsive? [17:15:39] sure [17:15:53] together with skin mobile, not sure exactly (ask jon) [17:16:05] * aude made a bootstrap skin and not difficult [17:16:36] aude: okay, I will ask Jon [17:16:45] ok [17:16:55] i think he's on vacation this week, though closer to your timezone :) [17:17:06] so hope he is responsive [17:17:25] (03PS1) 10Daniel Werner: Improved jQuery.ui.inputextender/jQuery.ui.listrotator destroy implementations [extensions/DataValues] - 10https://gerrit.wikimedia.org/r/76745 [17:17:39] yes, he isn't very far! [17:17:44] :) [17:17:45] he has been responding to mails [17:17:51] good [17:19:06] I'm slightly unclear on one thing, should I make HTML mockups first or were you talking about creating skins with hardcoded data (not sure?) [17:19:33] i think skin with hardcoded data is just as easy [17:19:46] and allows to see in context of the skin header, etc. [17:20:53] yeah, I think that should work [17:21:19] you still have your test wiki, right? on labs? [17:21:20] passing html into a skin is just as easy [17:21:25] yep I do [17:21:26] yes [17:31:10] aude: on a different note, will you be attending SMWCon? [17:32:19] pragunbhutani: it's in berlin, so assume so [17:33:21] do you know if they only give scholarships to people who're giving talks? [17:33:23] * aude remembers previous one being mostly business & government people, though [17:33:41] the european smwcons might be different/better [17:33:47] ah okay, so it's not like a hackathon? 
[17:33:48] ask JeroenDeDauw or Denny_WMDE [17:33:54] not really [17:34:14] hmm I don't know if it makes sense for me to apply then [17:34:42] if there's a mediawiki hackathon in india (or the next one in the spring), that would be better [17:35:19] the one in the spring will be some place, don't know where [17:35:39] there was one in India at the beginning of the year [17:35:45] yeah [17:35:46] don't know about the next [17:35:53] but yeah, that would probably be better [17:36:10] it depends on someone organizing them :) [17:36:45] we're trying to hold an OSScamp at my Uni as well [17:36:51] cool :) [17:37:02] but now that I've graduated, it really depends on how active we can get my juniors to be [17:37:11] ok [17:48:32] aude: I think I'll still go ahead and apply [17:48:41] I'd really like to meet people from the Wikidata team [18:09:10] pragunbhutani: sounds good [18:45:30] addshore: around? or any other bot folks? [18:48:03] aude: Addshore should be around. legoktm might be here. [18:52:44] seems to be just me, but on some machines, i am getting old json for certain entities [18:53:01] if i try on different machines, or with xml on the same machine, i get new data [18:53:13] something is stuck cached or something [18:53:38] means my bot cannot operate until i resolve this :( [18:56:22] huh, http and https, i get different results [18:57:07] http://www.mediawiki.org/w/index.php?title=Manual:Sites_table&action=edit&redlink=1 [18:57:12] aude: Addshore gets a few errors on wikidata with bot editting. Also https has been made the default login protocal now. [18:57:55] Reedy: heh [18:58:03] i could do https for my bot [19:03:26] alright, i am going to try later or tomorrow to see if it's a fluke [19:03:39] or if anyone else has an issue and whatnot [19:04:11] if i change my request in any manner, like add an extra bogus parameter (with json), then it gives me fresh data [19:04:28] my bot could just give a bogus new random param every time! [19:04:48] hey aude [19:05:00] hi, have you seen any issues like i describe? [19:05:22] i dont know but at about 12:31 today my bot stoped editing wikidata [19:05:29] oh, really [19:05:37] thought I have broken anything , kept getting a 'bad token' result from the api [19:05:40] * aude checks server logs [19:05:57] they switched north america, http traffic to use varnish text cache [19:06:07] requesting a new token gave me the same token, eveything seemed correct from my side [19:06:10] 12:21 mark: Put all non-European wikidata traffic on cp1052 and cp1065 [19:06:22] *checks his last wikidata edit [19:06:43] this morning, my bot operated without incident [19:06:50] (show/hide) 12:16, 30 July 2013 (diff | hist) . . (+62)‎ . . Category:Cities in Alabama (Q5653749) ‎ (‎Updated item) [19:06:59] (03PS7) 10Daniel Werner: Introduces GenericSet.js [extensions/DataValues] - 10https://gerrit.wikimedia.org/r/76079 [19:07:03] thats the last edit it managed to make :P [19:07:08] wow [19:07:13] i assume that's utc [19:07:43] i think my token should be fine [19:07:47] and unrelated [19:07:47] yup [19:07:58] aude: what were you experiencing? 
[19:08:21] i'm requesting https://www.wikidata.org/w/api.php?action=wbgetclaims&entity=q3457234&format=json [19:08:24] old json for some entries, heh [19:08:26] how odd [19:08:32] yes [19:08:53] if i add a bogus param to the end, or xml, or such, i get fresh data [19:09:04] or try on my mac, on my other linode it works [19:09:07] i wonder if the cache change broke my bot O_o [19:09:12] could be [19:09:28] err, my bot is using http [19:09:32] if it uses https, it works [19:09:37] hhmm [19:09:41] im not sure what mine uses [19:09:48] it gets everything dynamically :P [19:09:51] and it did the check before i edited the item, so it would have the old data [19:10:02] I give it meta.wikimedia.org and it works everything else out xD [19:10:06] then it saw the item again in my list (another category or something) and checked again [19:10:18] and thought it was still missing the properties, so added duplicate snaks [19:10:19] I think its probably http [19:10:27] meta is not varnish [19:10:42] i could just give a bogus parameter each request :D [19:10:48] heh! xD [19:10:54] but that's ignoring a possible serious bug [19:11:01] &qwerty=uiop! [19:11:04] yes [19:11:23] aude, addshore: not at karoke / sleeping :O ? [19:11:26] i even have a random string function for my bot to work on test wiki [19:11:34] Danwe_WMDE: well, i would love to :) [19:11:45] so, go, I go now ;) [19:11:52] maybe.... :D [19:12:07] cu there [19:12:11] xD I didnt manage to sleep :P too much clubamtte [19:12:12] :) [19:12:25] i don't know that i can fix my bot tonight, but shall ask mark tomorrow [19:12:32] addshore: then move out now ;p [19:12:41] it means no adding hong kong to wikidata [19:12:44] no adding anything else [19:12:45] Danwe_WMDE: need to wait for my phone to have a bit more charge ;p [19:12:57] lol [19:13:07] aude: wonder if changing to https or http would fix my issue [19:13:18] i have to add hong kong like in the next day or it won't appear in denny's map in time :) [19:13:22] addshore: it might [19:13:29] https is not in varnish [19:13:29] aude: bug me tomorow if you need help :) [19:13:40] :) [19:13:42] mhhm, but what has varnish broken it! [19:13:57] addshore: We all prefer bugging you today :P [19:14:01] *why [19:14:01] i don't know... [19:14:12] if varnish is sending me cached data in the api? [19:14:20] i don't think api results should be cacehd [19:14:22] cached like this [19:14:29] no :/ [19:14:41] maybe mark can fix it [19:14:41] *goes to find where he has to add https [19:14:59] aude: my framework is coming along nicely ;p [19:15:05] cool [19:15:29] my bot is rather customized to it's task but maybe can contribute or use your code at some point [19:15:46] aude: https://github.com/addshore/addwiki/blob/master/scripts/HelloWorld/EditPage.php [19:15:49] yeah [19:16:06] aude: tell me what features it needs / what the task is and ill make sure it is all there :) [19:16:18] well, it needs postgres / postgis :D [19:16:21] to geocode stuff [19:16:28] all possible ;p [19:16:36] eventually on toollabs [19:16:40] coren is working on it [19:16:42] infact, link me to your code? ;p if its easy to link too ^^ [19:17:32] not yet [19:18:03] postgres part is not open source yet [19:18:16] alright, if i am going to karaeoke (sp) [19:18:21] * aude shall leave now [19:20:19] aude: karaoke. [19:20:23] yeah [20:13:52] Denny_WMDE: Around? [20:14:07] or maybe aude can answer [20:14:16] just a simple question [20:14:16] :) [20:14:27] Hahc21: am around [20:14:32] ok [20:14:58] When phase 2 will go live on Wikivoyage? 
[20:15:09] so that we can invoke properties [20:15:16] actually we currently have no plans for that [20:15:19] it's up next [20:15:31] but due to wikimania, the next deployment won't be before end of august [20:15:40] oh [20:15:42] it might well be, that it will just go live then [20:15:51] oh [20:15:53] nice [20:15:55] thanks :) [20:15:57] if our tests turn out no unexpected issues [20:16:06] let's hope they don't [20:16:07] but we haven't tested yet [20:16:11] yes, i hope so too [20:16:28] thank you again [20:16:31] yw [20:37:43] hello! So, in wikipedia articles about musicians you can see an "associated acts" section. what would be the corresponding data item? [21:03:03] addshore: ping [21:03:09] pong [21:03:32] would it possible to get you for another interview? [21:03:42] I dont see why not :) [23:53:43] Hi. Does wikidata store wiki pages information? [23:54:22] The actual wiki page I mean. [23:54:33] I'm confused [23:58:30] indio: Mind explaining what you mean? [23:59:36] I think I have a wrong idea of what wikidata is.