[01:22:43] Jasper_Deng_busy: Is "disclaim" even a word? [01:23:05] if disclaimer is, that certainly is [05:59:57] Denny_WMDE1: ah there's already bug 36430 + 37461 and I saw it just now [06:39:28] New patchset: Tobias Gritschacher; "Puts twisted dependencies from I4873abce509e391cbe3104883965c91dd0b592d1 where they belong" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60297 [07:12:47] pear channel-discover pear.phpunit.de [07:12:52] ooops :) [07:58:55] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60297 [07:59:04] Change merged: Tobias Gritschacher; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/60296 [08:03:16] New patchset: Tobias Gritschacher; "info about approved copyright message in one language will remain" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60281 [08:08:23] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60281 [08:12:13] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60210 [08:16:50] Change merged: Tobias Gritschacher; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/59126 [08:28:09] 18:25 < Reedy> http://p.defau.lt/?WWRduRwL4GnQxlhjlSozLw [08:28:16] 18:08 < Reedy> http://p.defau.lt/?F66Mt8Ry3MmDqB1VUzltfA [08:28:54] https://bugzilla.wikimedia.org/47415 [08:29:02] not that bug.... [08:30:45] https://bugzilla.wikimedia.org/47022 <-- this one [08:31:45] New review: Tobias Gritschacher; "This needs some attention again. Rebase fails & when trying it without rebasing, it gives errors in ..." [mediawiki/extensions/Wikibase] (master); V: -1 - https://gerrit.wikimedia.org/r/57843 [08:34:56] Change abandoned: Aude; "too outdated and stale" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57843 [08:35:13] New review: Tobias Gritschacher; "As per discussion with Jeroen yesterday, ClaimSummary should not inherit from Summary, and rather be..." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58880 [08:45:51] New patchset: Aude; "(Bug 47125) ChunkCache for speeding up dispatchChanges." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/59388 [08:46:17] New review: Aude; "rebased: patchset 4" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/59388 [08:53:20] New patchset: Tobias Gritschacher; "Autosummary for setClaim" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58880 [08:54:29] New review: Aude; "would be nice to look into using memcached or redis to put some of this stuff in memory, but this st..." [mediawiki/extensions/Wikibase] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/59388 [08:54:31] Change merged: Aude; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/59388 [09:51:59] New patchset: Tobias Gritschacher; "Autosummary for setClaim" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58880 [09:54:44] "Delete. This lady is HOOOOOOOT but not notable." Oh, Wikipedia. [09:54:58] heh :) [09:55:30] that was actually supposed to go in #wikipedia-en, but it's all good. 
[09:55:57] * aude enjoys the humor [10:00:51] New review: Aude; ""return new EntityUsageIndex"" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/59413 [10:01:26] New review: Aude; "retract my comment :)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/59413 [10:02:01] New patchset: Aude; "(Bug 47288) Abstraction layer for usage tracking." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/59412 [10:02:34] tommorris: WP:HOTTIE :P [10:03:14] New patchset: Tobias Gritschacher; "Autosummary for setClaim" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58880 [10:28:36] New review: Daniel Werner; "(1 comment)" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/58223 [10:30:47] New review: Daniel Kinzler; "(6 comments)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/59746 [10:37:56] New review: Daniel Kinzler; "Ok for merging, but would be nice to see my new comments addressed." [mediawiki/extensions/Wikibase] (master); V: 2 C: 1; - https://gerrit.wikimedia.org/r/59746 [11:02:24] Lydia_WMDE: when was the global rollout of wikidata phase 1? [11:02:50] lbenedix: best check the news section on the main page of wikidata.org [11:02:53] it has all the dates [11:03:10] okay [11:03:25] New review: Daniel Kinzler; "(6 comments)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58880 [11:03:30] btw. wikidata could use a human-understandable homepage... [11:03:54] in my usability tests no one was able to say what wikidata is about when I showed them the main page [11:04:25] thx for the hint with the news-page! http://www.wikidata.org/wiki/Wikidata:News [11:04:38] i'd suggest bringing this up on the discussion page of the main page including suggestions for how to improve it [11:04:54] meh... [11:04:55] and where exactly the issues were [11:05:16] the main issue was that the page is full of text and links [11:05:26] ok [11:05:52] a landing page with a little picture that shows what wikidata is and links to little "how to use" videos would be great [11:06:03] and a link to the main page [11:06:06] of course [11:07:12] bring it up on-wiki - that's the only way to get it changed ;-) [11:07:32] i am sure people would be interested in suggestions about how to make it easier to understand [11:07:53] I think the landing page should not be a mediawiki-page [11:08:21] it looks too much like wikipedia [11:09:43] * lbenedix doesn't like writing... maybe he can show you the recordings of the test-sessions in May after he finishes his thesis [11:09:52] im importing the entire musicbrainz database onto my laptop right now, and should be ready to add it to wikidata once its done :D [11:09:57] ~150k artists [11:10:10] is that fine with the licences? [11:10:23] yup, their database is licensed cc-0 [11:10:32] lbenedix: it's really not me you need to show it though - the content and design of the main page is completely up to the community and they need to understand the issues with it to improve it [11:10:48] lbenedix: draft something in your sandbox and then propose it [11:11:03] people will tweak it and whatnot, and then itll get implemented :D [11:11:08] s/sandbox/userspace/ [11:13:58] * lbenedix thinks about this in May [11:14:26] one week until the thesis-deadline [11:15:11] I thought phase 2 is live in all wikipedias as well...
[11:26:57] Tobi_WMDE: Danwe_WMDE Uncaught TypeError: Cannot read property 'wikiUrlencode' of undefined [11:27:00] experts.CommonsMediaType.js:76 [11:27:05] i've updated everything [11:27:16] is it just me or do you get that? [11:27:29] mediawiki.util is undefined [11:35:13] bleh [11:35:16] i imported it all [11:35:21] then didnt call connection.commit() [11:41:11] legoktm: ouch [11:43:47] :( [11:44:09] im also trying to figure out which program on my laptop has a memory leak [11:44:16] probably all [11:45:07] haha [13:11:45] New review: Tobias Gritschacher; "(6 comments)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58880 [13:12:09] New patchset: Tobias Gritschacher; "Autosummary for setClaim" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58880 [13:20:21] New review: Hoo man; "(6 comments)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/59746 [13:20:34] DanielK_WMDE: https://gerrit.wikimedia.org/r/#/c/59746/4/repo/includes/api/SetAliases.php is that ok for you? [13:21:39] * legoktm is chatting with the musicbrainz folks in #musicbrainz-devel about getting data into wikidata :) [13:42:51] New patchset: Hoo man; "Refactor wikibase.store - introduce AbstractedRepoApi" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58223 [13:44:27] New review: Hoo man; "Reordered functions in RepoApi" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58223 [13:44:39] and it needs another rebase -.- [13:47:28] New patchset: Hoo man; "Refactor wikibase.store - introduce AbstractedRepoApi" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58223 [13:47:55] New review: Hoo man; "Rebased" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58223 [13:53:50] legoktm: when you have time, want to chat with me about OmegaWiki data? [13:54:05] New patchset: Hoo man; "Refactor wikibase.store - introduce AbstractedRepoApi" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58223 [13:54:06] sure :P [13:54:37] just not now, i gotta run to class :) [13:55:26] New review: Hoo man; "Fixed some dependencies" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58223 [13:56:00] kk [13:56:52] New patchset: Hoo man; "Slightly overhaul SetAliases" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/59746 [14:15:15] Warning: Invalid argument supplied for foreach() in /usr/local/apache/common-local/php-1.22wmf2/extensions/Wikibase/repo/includes/api/SetReference.php on line 116 [14:15:26] Warning: preg_match() [function.preg-match]: Unknown modifier 'F' in /usr/local/apache/common-local/php-1.22wmf2/extensions/Wikibase/repo/includes/api/SearchEntities.php on line 148 [14:15:37] I think I've reported that 2nd one before [14:19:31] Reedy: https://gerrit.wikimedia.org/r/60314 [14:19:33] * hoo hides [14:21:33] Reedy: Do you have bugs for the above? [14:22:05] found one [14:22:07] [09:16:56 AM] (NEW) Wikibase Repo Invalid argument supplied for foreach() - https://bugzilla.wikimedia.org/47553 minor; MediaWiki extensions: WikidataRepo; () [14:22:07] [09:17:33 AM] (mod) Wikidate Repo preg_match() Unknown modifiers - https://bugzilla.wikimedia.org/45287 summary (Sam Reed (reedy)) [14:22:10] hoo: ^
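The preg_match() warning filed above as bug 45287 is a classic unescaped-input bug: SearchEntities.php builds a regular expression out of the user's search term, so a term containing the pattern delimiter terminates the pattern early and PHP tries to read the rest as modifiers (hence "Unknown modifier 'F'"). The fix, merged further down as r/60400 ("Quote regexp delimiter in SearchEntities"), escapes the term with preg_quote(). A minimal Python sketch of the same bug class, assuming nothing about the actual PHP code beyond the above; re.escape plays the role of preg_quote, and in Python the failure mode is an exception rather than a warning:

    import re

    labels = ['foo(bar)', 'foobar', 'baz']

    def prefix_search_unsafe(term):
        # BUG: the user-supplied term goes into the pattern unescaped.
        # A term like 'foo(' raises re.error here -- the Python analogue
        # of preg_match()'s "Unknown modifier" warning when the term
        # contains the pattern delimiter.
        return [l for l in labels if re.match(term, l, re.IGNORECASE)]

    def prefix_search_safe(term):
        # Fix: escape the term before embedding it, as preg_quote()
        # does in the merged change (r/60400).
        return [l for l in labels if re.match(re.escape(term), l, re.IGNORECASE)]

    print(prefix_search_safe('FOO('))  # ['foo(bar)']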
[14:22:20] legoktm: I'm too stupid today... found it [14:22:24] :) [14:22:27] * legoktm hugs hoo  [14:23:33] :) [14:24:23] New patchset: Hoo man; "Quote regexp delimiter in SearchEntities" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60400 [14:33:31] tomorrow is deployment of phase 2 in all wikipedias? [14:33:53] lbenedix: AFAIK, yes [14:34:13] * lbenedix has to update the current state of wikidata in his thesis [14:36:48] I cant see that the conflict with risker was solved [14:51:57] New patchset: Hoo man; "Basic validation for snaks JSON in SetReference" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60410 [15:03:21] is it possible to say that wikidata phase 1 is completely finished? [15:04:08] eh [15:04:26] i wouldnt say its finished until all the links that can be, are stored in wikidata [15:04:39] is anything ever really finished? :) [15:04:44] but look, it is possible! [15:05:55] Im talking about the development [15:05:59] and the deployment [15:06:57] well, I'd say the whole idea of phases is historic [15:07:06] that was how we structured year 1 [15:07:31] ;) [15:07:56] lets ask again... is the whole interwiki-link-thingy done? [15:08:23] for wikipedia, yes [15:08:31] we could do badges [15:08:35] we could do sister projects [15:08:47] badges? [15:08:54] we are doing more work on reducing the server load of keeping the links in sync [15:09:05] badges = featured articles, good articles, etc. [15:09:11] badges are the little FA icon that shows up next to the link [15:09:15] aye [15:09:20] ahh [15:09:24] that would be nice [15:09:33] look at the links on https://en.wikipedia.org/wiki/Barack_obama [15:09:34] but its in the world, people are using it and it works [15:09:47] it looks like [15:10:28] ok [15:21:35] legoktm: the whole userfication thing seems like a bit of a hurdle to me :/ [15:22:20] yeah :( [15:22:49] as potentially things could be moved to userspace as vandalism [15:23:08] and if I were to make the bot remove a sitelink if moved to userspace all traces would be lost if it was later reverted [15:23:23] we might just have to take that on the chin? :P [15:23:37] no [15:23:40] i would just log it [15:23:41] and ignore it [15:23:54] thats what betabot does iirc [15:24:20] so log the fact that the page has moved to the user space [15:24:25] and dont do anything to the sitelink? [15:24:29] yeah [15:24:40] the problem then is [15:24:41] if [15:24:49] A-->B (where B is userspace) [15:24:52] C-->A [15:25:06] mhhm [15:26:03] or even A->B(User) C->D(User) E->A A->E B->C D->A :O [15:26:06] just liberally log conflicts :P [15:26:11] omg too many page moves [15:26:13] xD [15:26:25] or even A->B(User) C->D(User) E->A A->E B->C D->A :O [15:26:32] im not really sure what that is [15:26:51] why do you E-->A --> E ?? [15:26:59] thats userfication of 2 good articles and moving an article out of userfication to replace one of those, then restoring things but accidentally swapping the pages over? :P [15:27:14] lolwut [15:27:18] xD [15:27:21] its not gonna happen ;p [15:27:26] gonna go and code this log ;p
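The strategy being settled on above — log a move into userspace and leave the sitelink alone, so nothing is lost if the move was vandalism and gets reverted, and liberally log the A->B(User) then C->A collisions for a human — could look roughly like the sketch below. Purely hypothetical: handle_move, move_log, and the return values are illustrative, not betabot's or any real bot's code.

    USERSPACE = (2, 3)  # User: and User talk: namespace numbers

    def handle_move(move_log, old_title, new_title, new_namespace):
        """Decide what to do with a sitelink when its client page moves."""
        if new_namespace in USERSPACE:
            # Userfication: log it and ignore it. The sitelink stays,
            # so a reverted vandalism-move loses nothing.
            move_log.append((old_title, new_title))
            return 'logged'
        if any(old == new_title for old, _ in move_log):
            # A page moved into the slot vacated by a userfied page
            # (the A->B(User) then C->A case): log the conflict for a
            # human to sort out instead of guessing.
            move_log.append((old_title, new_title))
            return 'conflict'
        return 'update-sitelink'  # the normal case; the edit itself not shown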
[15:27:43] what about that one admin who moved all of his talk page archives into some random usersubpage and then "histmerged" them all? :PPP [15:28:38] none of them should have wikidata entries anyway so nothing should happen ;p [15:31:54] you know legoktm, when you're logged into every rc feed, there is a hell of a lot of stuff to try and look at before it vanishes xD [15:32:07] yeahhhhhhhhhhh [15:39:28] Denny_WMDE: hello [15:45:48] New patchset: Tobias Gritschacher; "Autosummary for setClaim" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58880 [15:46:00] New patchset: Tobias Gritschacher; "Selenium tests for autocomments/autosummaries" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60252 [15:50:50] New patchset: Tobias Gritschacher; "Autosummary for setClaim" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58880 [15:52:07] Change merged: Daniel Werner; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60400 [15:52:43] New patchset: Tobias Gritschacher; "Selenium tests for autocomments/autosummaries" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60252 [16:02:13] DanielK_WMDE: as soon as you have time again: https://gerrit.wikimedia.org/r/#/c/58880/ :-) [16:12:58] ok i remember seeing a page that had a list of various databases that we could import phase 2 data from [16:12:58] and now i cant find it [16:13:53] https://www.wikidata.org/wiki/Wikidata:Third-party_databases [16:14:14] oh my https://www.wikidata.org/wiki/Wikidata:Book_sources [16:17:47] but thats not what i was looking for [16:17:52] it was a bunch of external links [16:17:53] hmmmmm [16:18:45] legoktm: I copied that from en.wp [16:18:57] ok [16:19:10] im looking for a page that had sections and had external links [16:19:18] might have been a WT page [16:19:19] bleh [16:19:37] I don't know [16:20:28] https://www.wikidata.org/wiki/Wikidata:Data_collaborators [16:20:28] AHA [16:26:39] liangent: hi [16:32:29] Denny_WMDE: sorry but I gotta go now (again) ... [16:52:52] !admin7 [16:52:56] !admin [16:52:57] hi [16:53:00] whats up? [16:53:01] hi [16:53:04] whats up? [16:53:06] hello [16:53:16] Pyfisch: please tell us your request [16:53:46] [[Wikidata:Requests_for_permissions/Bot/FischBot_2]] [16:53:46] Addshore and legoktm with the same message... me suspects something. [16:54:15] was approved by Ymblanter yesterday [16:54:45] o.O [16:54:58] this would be so much easier if i knew german [16:55:28] legoktm: should I translate something for you? [16:55:39] legoktm: Wins the award for the only Crat who doesn't know German :) [16:55:45] nah, google translate is working good enough right now [16:56:45] * Pyfisch will start a rfc about "every admin and crat must speak german" ;-) [16:57:16] * SannitAway will start a rfc about "over my cold dead body" :P [16:57:24] Pyfisch: What about the oversighters too? They are at RFP :P [16:58:50] legoktm: If you flag the bot (assuming you are) maybe reclose the request or restore the close as it was overwritten due to an EC. [16:59:19] JohnLewis: bot is already flagged [16:59:26] Oh. [16:59:38] legoktm: Mind reclosing it then? [16:59:40] Pyfisch: Has there been a recent discussion on the german "project chat" about this? [16:59:57] legoktm: yes [17:00:20] oh duh, i scrolled up and found it :) [17:00:28] but Polarys had not discussed there [17:01:43] legoktm: yes?
[17:01:43] I am a bit discontent about overwriting admin edits and I don't see that it was already approved :-( [17:02:05] Vogone: we're looking at https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/FischBot_2 right now [17:05:11] i gtg, class just ended [17:05:23] Pyfisch: ill take a look in a bit [17:05:28] Enjoy legoktm. [17:05:40] hai JohnLewis [17:05:46] Hai [17:06:16] Pyfisch: In my personal opinion, both task and test edits are okay [17:08:23] so I think it can be reapproved [17:17:25] hoo: http://192.168.2.251:8080/job/wikibase-selenium-win7-grid-chrome/55/console looks like some site-link tests are failing with your change set, didn't have time to investigate yet. Perhaps you can have a look already. Other than that, looks fine for the merge. [17:17:37] How are you Vogone? [17:18:50] fine [17:21:18] Danwe_WMDE: That's an internal IP and I have no VPN into the office :/ [17:21:29] hoo: just noticed, i'll use pastebin [17:21:42] :) [17:23:05] hoo: http://pastebin.com/ESmuBfzG [17:24:17] hoo: there are also tests failing on your client widget, but that's ok I guess [17:24:24] more ascii art! [17:24:28] mhhhm [17:24:43] who made Special:UnconnectedPages ? :P [17:24:51] i think the tests are broken [17:24:57] the selenium ones and tobi knows [17:25:27] aude: the site link ones? [17:25:31] yes [17:25:43] aude: He said there are some broken for the edit detection [17:25:50] and the client site site links thing. [17:26:00] hmmmm [17:26:57] We've updated the client widget w/o the tests, indeed... but that should be minor issues (like renamed ids or so) [17:28:43] hoo: /aude: the failures I am concerned about are just 4 to 10 [17:31:25]      Watir::Wait::TimeoutError: [17:31:28] Sounds scary [17:33:04] oh, it could be just me (even though i updated), but had problem with js on the repo [17:33:30] 11:26 < aude> Tobi_WMDE: Danwe_WMDE Uncaught TypeError: Cannot read property 'wikiUrlencode' of undefined [17:33:56] 11:26 < aude> experts.CommonsMediaType.js:76 [17:34:05] is that just me? [17:34:13] if not, could cause test failure [17:34:19] that's on the repo [17:34:47] * aude tries updating everything again [17:35:15] aude: I changed some dependencies, Tobi merged that this morning [17:35:20] yes, i updated [17:35:24] aude: concerns DataValues and WB [17:35:24] looks fine on master [17:36:05] hoo: could be that the selenium tests just failed randomly, happens sometimes. I'd have to check your changeset out and look for myself [17:36:09] hoo: will do so later [17:36:39] Ok, great... i tested a lot of things by hand, but who knows [17:38:43] ok, my js is ok now :) [17:42:39] aude: yeah, tobi even wrote a mail about that this morning :P [17:42:48] yes [17:48:53] Pyfisch: vogone took care of it :) [17:59:15] aude: hi. I see the "Add links" link on interwiki-less English Wikipedia articles, but not in the Hebrew Wikipedia. Are the deployed versions different? [17:59:39] aharoni: It's only enabled on enwiki now...
[17:59:50] it's going to be deployed further if no problems occur [18:04:47] legoktm, Vogone: thanks for helping me [18:04:54] np :) [18:04:56] yw [18:05:16] :) [18:13:49] hoo: thanks [18:17:41] New patchset: Hoo man; "Attempt fix for the linkitem selenium tests" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60448 [18:18:27] Danwe_WMDE: ^ No idea about the repo ones, though [18:38:29] New patchset: Umherirrender; "Use RecentChange::getTitle in ExternalChangesLine" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60452 [18:39:06] New review: Umherirrender; "Untested" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60452 [18:47:57] New patchset: Hoo man; "Use RecentChange::getTitle in ExternalChangesLine" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60452 [19:01:13] :/ [19:01:29] ? [19:01:30] {"error":{"code":"add-sitelink-failed","info":"The external client site did not provide page information."}} [19:02:03] huh? [19:02:16] On my system [19:02:37] :( [19:05:10] ah, I know [19:05:12] Danwe_WMDE: [19:05:12] dah [19:05:13] ReferenceError: event is not defined [19:05:14] [Break on this error] [19:05:16] $( event.target ).parents( '.wb-claim-section' ).addClass( 'wb-edit' ); [19:05:33] guh, where does that silly second line come from? sorry for the clutter [19:05:57] hoo: you asked a remote wiki for a page it doesn't know [19:06:13] (yes, that error message is somewhat obscure) [19:06:30] maybe sites table not populated? [19:06:46] DanielK_WMDE: Just realized that my setup even ask en.wikipedia if I link smth. on my local client which "is" enwiki -.- [19:06:51] * asks [19:07:38] * hoo goes to find a page w/ an obscure enough title on enwiki, then... [19:07:59] hoo: check new page log [19:08:13] http://en.wikipedia.org/wiki/Special:NewPagesFeed [19:08:24] aude: I need smth. like Category:File:Foo [19:08:34] oh, good luck :) [19:08:45] I'm testing that one: https://gerrit.wikimedia.org/r/60452 it looks sane, but I want to test it [19:09:00] * aude would create a test page on enwiki :O [19:09:03] :o [19:09:04] only dewiki is this insane [19:09:10] i can delete it for you [19:09:39] Danwe_WMDE: that's line 103 in claimlistview.js [19:09:40] * aude thinks nobody would care [19:10:19] or set your client wiki siteid to dewiki [19:10:52] aude: Is that a one line change or can it turn out more difficult? [19:11:04] one line i think [19:11:13] in local settings [19:11:14] * hoo will do that then [19:11:31] $wgWBClientSettings['globalSiteID'] i think [19:11:45] yep, got it already... let's see [19:12:05] it's siteGLobalID [19:12:09] siteGlobalID [19:12:10] :D [19:12:29] might also need siteLocalID = 'de' [19:13:38] At least I can use the widget to link the new page :D [19:13:46] :) [19:14:51] probably I have to set the site lang to de as well [19:15:03] that would work [19:16:48] Reproduced the bug \o/ [19:16:56] yay [19:17:01] Could not parse json query response: u'Verified' [19:17:07] I hate git-review [19:17:35] never seen that one [19:18:00] I've seen that one once before... also on Wikibase [19:18:10] manually fetched the change via git now [19:19:58] New review: Hoo man; "Thanks for the patch... works like a charm and only uses data available in the client RC entries."
[mediawiki/extensions/Wikibase] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/60452 [19:19:58] Change merged: Hoo man; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60452 [19:20:04] \o/ [19:20:22] the method is just like Title:makeSafe [19:26:53] aude: : wikibase-conflict-patched, Your edit was patched into the latest version <- wut? [19:27:35] it means your edit was merged [19:27:40] * aude hides :) [19:27:58] legoktm: You know that that crashes the rewrite/ [19:27:59] ? [19:28:01] basically pwb sends the revid to avoid edit conflicts [19:28:09] I know! [19:28:16] I haven't had the time to debug it [19:28:27] probably could say "merged into the latest version" [19:28:30] I haven't been able to reproduce the warning reliably [19:28:42] It's easy [19:29:05] {u'messages': {u'0': {u'type': u'warning', u'name': u'wikibase-conflict-patched'}, u'html': {u'*': u'
Your edit was patched into the latest version.\n'}}} [19:29:24] result["warnings"][mod]["*"]) <- fails [19:29:37] Because there is u'html' as a key [19:30:48] is this a pwb issue, or is wikibase sending non-standard warnings? [19:30:57] Combi [19:32:36] New patchset: Hoo man; "Update the README" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60469 [19:32:39] wanna fix it? :) [19:33:07] aude: ^ [19:33:25] I could merge it myself, as it's doc only, but "code" review is nicer :P [19:33:50] legoktm: It's the sources thing [19:33:59] sources? [19:34:32] Yes, since I enabled that in claimit.py I'm getting the warnings on every second edit [19:35:44] hmm [19:36:01] I think its because we send the revid that we got in the .get(), and it isnt updated when the source is added [19:36:46] Does a save return a new revision id? [19:36:59] yes [19:37:19] So we should update the internal state of an object after a save [19:37:39] And we should fix the error handling [19:37:51] in Site.addClaim [19:37:52] item.lastrevid = data['pageinfo']['lastrevid'] [19:38:19] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60469 [19:38:42] legoktm: Is that already in there? [19:38:45] yes [19:38:56] the issue is that when you add a claim, you know which item it is associated with [19:39:14] but you add a source in reference to a claim [19:39:23] so you dont know what the item is, which means you dont have the revid stored [19:39:33] so when you go to add a new claim and send the revid, its now out of date [19:40:12] So return item.lastrevid [19:40:42] And in page.py [19:40:51] if self.repo.addClaim(self, claim, bot=bot) >0 : [19:42:02] New patchset: Daniel Werner; "(hotfix) Fixes broken editing introduced in 55e6d866" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60471 [19:42:13] legoktm: In what var do you store the revision id? [19:42:30] item.lastrevid [19:42:50] Right _revid [19:43:20] er, no [19:43:24] thats for a normal Page [19:43:30] ItemPage is lastrevid [19:43:35] Why do we do something different in ItemPage? [19:44:18] because you cant get the history of ItemPages [19:44:46] So we just update self.lastrevid ? [19:44:54] and probably because i wasnt thinking of trying to keep it the same as Page [19:44:54] yeah [19:46:14] You seem to be doing that in the site object, but that doesn't work. Any idea why? [19:47:15] no it does work [19:47:29] the problem is [19:47:35] item.get() <-- got revid [19:48:06] item.addClaim(claim) <-- new change, fetched revid, item.lastrevid is updated [19:48:30] claim.addSource <-- new change, we're sent a revid, but we don't have an item object to update [19:48:55] item.addClaim(claim) <-- new change, the revid we sent is one revision old, warning appears [19:52:28] legoktm: In the state of claim the itempage is included, you can use that [19:52:47] what do you mean? [19:53:05] In the claim you have self.target [19:53:23] thats a different item
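The diagnosis above boils down to: every save returns a fresh revision id; addClaim records it on the item, but addSource edits the same item without refreshing it, so the next addClaim sends a baserevid that is one revision old and the server replies with the wikibase-conflict-patched warning (whose html-keyed payload is also what trips up result["warnings"][mod]["*"] in the rewrite). A sketch of the fix under discussion — invented names, with repo.save standing in for the actual API call; not pywikibot's real code:

    class ItemPage:
        def __init__(self, repo, qid):
            self.repo = repo
            self.id = qid
            self.lastrevid = None  # set by .get(), per the log

        def addClaim(self, claim):
            data = self.repo.save(self, claim, baserevid=self.lastrevid)
            # Every save returns a new revision id; record it so the
            # next edit doesn't send a stale baserevid.
            self.lastrevid = data['pageinfo']['lastrevid']
            claim.on_item = self  # remember which item owns the claim

    class Claim:
        def __init__(self):
            self.on_item = None

        def addSource(self, source):
            item = self.on_item
            data = item.repo.save(self, source, baserevid=item.lastrevid)
            # The missing step: addSource edits the same item, so it
            # must refresh the item's lastrevid too -- otherwise the
            # following addClaim triggers wikibase-conflict-patched.
            item.lastrevid = data['pageinfo']['lastrevid']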
[19:53:43] Hey all. According to the stats, Wikidata currently has 40 bots with bot flag. Wikidata:List of bots lists 38 of them. This is a problem. [19:53:46] target is the value of the claim. it could be a string, item, commonsMedia, etc [19:54:18] Sven_Manguard: I purged the page and we're at 45 :P [19:54:43] umm [19:54:46] https://www.wikidata.org/wiki/Special:ListUsers/bot [19:54:50] Sven_Manguard: You really worry too much [19:55:08] http://www.wikidata.org/wiki/Wikidata:List_of_bots [19:55:14] So we might have some unflagged bots, boohooo [19:55:22] multichill: not the issue [19:55:33] Also, I think addshore sent notices to everyone to get added to the new table https://www.wikidata.org/wiki/Wikidata:List_of_bots#footer [19:55:34] the issue is that we want the flagged bots to be on the list of flagged bots [19:56:05] * addshore waves [19:56:23] Sven_Manguard: then make it so? ;p [20:00:37] if not I will do it once my new list has filled out a bit more :) [20:03:04] legoktm: The language code to Q id thing shouldn't be stored in code. It should be on Wikidata itself [20:04:07] sure, where? [20:05:34] legoktm: Like https://www.wikidata.org/wiki/Q328? [20:06:03] oh [20:06:20] using the Wikipedia language edition qualifier thing? [20:06:34] Got a better idea? :P [20:06:49] https://www.wikidata.org/wiki/Special:WhatLinksHere/Q10876391 [20:06:56] Could cache it for some time [20:07:42] hmm [20:07:44] i guess so [20:07:47] Or without the qualifiers it's also possible [20:07:57] how? [20:07:59] It includes https://www.wikidata.org/wiki/Q1860 and that includes the language code [20:08:34] oh [20:23:51] aude: Reedy: do you know why https://gerrit.wikimedia.org/r/60314 is still open? Someone on en asked why it's not working :P [20:31:24] quick question [20:31:41] to get WikibaseClient to work [20:31:52] do I need to include WikibaseLib? [20:32:02] or repo/Wikibase? [20:32:04] or both? [20:32:04] pragunbhutani: Yes, it's needed for both repo and client [20:32:20] You need repo + lib on repo and lib + client on the client [20:32:59] hmm [20:33:16] I'm afraid I don't quite understand [20:34:19] pragunbhutani: Did you read http://www.mediawiki.org/wiki/Extension:Wikibase yet? [20:34:30] yes [20:34:38] hoo: that's where I got a little confused [20:35:12] The documentation is a bit thin, I guess... [20:35:19] I don't install both repo and client on the same mw installation [20:35:23] (and I'm new) :) [20:35:26] pragunbhutani: In a nutshell: You need WikibaseLib everywhere [20:35:58] hoo: to give you a little context [20:36:08] this is something I'm trying out for my gsoc proposal [20:36:23] Mobilize wikidata [20:36:40] so I want to try and see if mobileFrontend works with Wikidata [20:37:01] if it doesn't (and where it breaks), that's what I'll propose to contribute [20:37:19] [20:37:48] pragunbhutani: Do you even need a client then? [20:37:54] Probably it's handy to have it anyway [20:38:07] hoo: I'm not sure, tbqh [20:38:38] if I just get Wikibase set up and then install mobile frontend [20:38:38] pragunbhutani: In the Wikimedia Setup the Repo is wikidata.org and the clients are the wikipedias I guess you know that?!
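Back on the language-code question above: a rough sketch of the "keep the mapping on Wikidata and cache it for some time" idea. Everything here is an assumption for illustration — it uses today's wbgetentities response shape and P424 ("Wikimedia language code"), which may or may not match what was available at the time of this log:

    import time
    import requests  # third-party HTTP library, assumed available

    API = 'https://www.wikidata.org/w/api.php'
    TTL = 24 * 3600   # "could cache it for some time"
    _cache = {}       # qid -> (language code, fetched-at timestamp)

    def language_code(qid):
        """Resolve e.g. Q328 (English Wikipedia) to its language code."""
        code, fetched = _cache.get(qid, (None, 0.0))
        if code is not None and time.time() - fetched < TTL:
            return code
        data = requests.get(API, params={
            'action': 'wbgetentities', 'ids': qid,
            'props': 'claims', 'format': 'json',
        }).json()
        # P424 ("Wikimedia language code") is an assumption, see above.
        claim = data['entities'][qid]['claims']['P424'][0]
        code = claim['mainsnak']['datavalue']['value']
        _cache[qid] = (code, time.time())
        return code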
[20:39:04] You need Wikibase, Diff, DataValue and ULS [20:39:35] I didn't know that, sorry [20:39:43] okay so I'm done installing the dependencies [20:39:50] I'll install Wikibase now [20:47:56] New patchset: Hoo man; "Fix an error in the README" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60570 [20:48:06] shame on me ^ [20:51:22] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60570 [21:30:29] Change merged: Daniel Werner; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58223 [21:30:47] \o/ [21:35:28] is the dispatch-lag-problem solved? [21:37:52] I see ~800 edits/min on my wikidata-meter [21:40:50] is anyone here? [21:41:29] TOS; Many. [21:43:12] What exactly is Phase 2? [21:47:18] hi [21:49:48] legoktm: Is the bot finding empty items still running ? [21:56:25] TOS: sorry, the person who responded to you seems to have departed [21:56:29] oh, oops [21:57:31] Sven_Manguard: I have not :P [21:58:08] Hi. [21:58:14] One moment, this will take a sec to type [21:59:14] So Phase I was interwiki links. Phase II is, I guess the best way to describe it, "the data". It's a lot of the information that currently exists in Infoboxes on Wikipedisa [21:59:19] Wikipedias* [21:59:41] So birth date, geocoordinates, time ranges, etc [22:00:03] It's sort of hard for me to describe [22:01:40] https://www.wikidata.org/wiki/Wikidata:Introduction the chart on the right hand side [22:01:53] "Phase 2 of Wikidata: [22:01:55] Information for infoboxes [22:01:56] of all languages [22:01:58] on one central point" [22:01:59] I guess that's it [22:02:53] And this information comes from all wikipedias, or just en [22:03:11] from all WPS [22:03:15] *WPs [22:04:04] Sven_Manguard, basically, anything that can be made a datapoint [22:04:11] but also semantic relationships [22:04:24] I thought semantic relationships was phase III [22:04:26] "X is the son of Y", "A is part of B", etc. [22:04:47] How do you translate all the data then? [22:04:58] an army of volunteers [22:04:59] phase III is dynamically doing stuff with the data :-) [22:06:10] TOS: for a lot of things, they're either a) numeric data (no translation needed), b) a reference to another item [22:06:27] (the X is son of Y, for example; article X refers to Y) [22:06:51] and if it's a reference to another item, we can use the translation from that item, which we've taken from the relevant Wikipedia :-) [22:10:22] How much of this has been done? [22:10:36] And how can I, as an editor, help? [22:12:51] More relevant question: how can editors edit the data? (assuming the infoboxen will eventually be fed from wikidata, and not the other way around.) [22:14:18] TOS, three ways
[22:14:56] a) manually adding properties - find an item and start adding information to it (note that only some data elements are working just now) [22:15:19] b) working out how to migrate infoboxes (and then getting bots to do it) [22:15:29] c) translating "labels" for properties and items [22:15:57] a) is a good way to start understanding it [22:16:15] here's all the currently working properties (data elements that can be created for an item) [22:16:15] https://www.wikidata.org/wiki/Wikidata:Properties [22:17:07] here's the first item that appeared on my watchlist (truly random :-) ) [22:17:08] https://www.wikidata.org/wiki/Q737720 [22:17:37] so you can see that type is set as person, sex as male, and there's a value for his VIAF identity [22:18:58] scroll to the bottom of the "statements" section, and you can add one - hit the [add] which is sitting out on its own, and you'll be asked for what property you want [22:19:08] Where can I find a list of properties [22:19:09] possible ones we could add: [22:19:14] https://www.wikidata.org/wiki/Wikidata:Properties [22:19:17] I cant find dob [22:19:29] dates aren't running yet, weirdly! [22:19:43] Why not? [22:19:47] 13. September 1857 [22:20:01] there's something complex being developed to handle dates - they want the field to be able to interpret it as date/time not just as a string [22:20:46] (otherwise we have the whole 13 September vs. September 13, etc etc, all over again) [22:20:55] hmm, how about "country of citizenship" [22:20:56] shimgray: php has an app for that. [22:20:58] that makes sense [22:21:59] put that in the property box, it'll give you another box to enter the value; values for this sort of item can be defined as anything with a Wikidata element [22:22:27] so we could have him as a citizen of ice-cream. let's skip that option, plug in United States of America, save, and you're done. [22:22:39] want to give it a shot? [22:24:21] Amgine_, doesn't help us with 4/3/2013 ;-) [22:24:32] Yes, it does. [22:24:52] (I think they're also trying to make it handle fuzzy dates, specific ones, etc. not following this too closely, all I know is dates are "not quite yet") [22:25:47] TOS, occupation was you? [22:25:49] :-) [22:25:51] http://ca2.php.net/manual/en/function.strtotime.php <- most any english representation; not sure how perfect it is with other scripts. [22:25:53] No, me [22:26:14] aha [22:28:52] New patchset: Daniel Werner; "Added IDs for entity page's sitelink and claim headings" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60596 [22:35:21] shimgray: are these edits 'live'? also, I tried to edit the occupation, which was 'autocorrected' to confectionery from confectioner, and cannot. [22:35:48] Amgine_, these are all live. [22:36:26] occupation is semantic, so it has to match an item in WD. at the moment, there's only one item for confectionery to which confectioner redirects (as with Wikipedia, in fact...) [22:37:12] (oops, my error, there's no redirect at confectioner. either way, still only one entry) [22:37:50] there's a "field of work" which probably works better for non-occupational things like that [22:39:31] Actually, confectioner is an occupation. Confectionery is the field of work. 
[22:28:52] New patchset: Daniel Werner; "Added IDs for entity page's sitelink and claim headings" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60596 [22:35:21] shimgray: are these edits 'live'? also, I tried to edit the occupation, which was 'autocorrected' to confectionery from confectioner, and cannot. [22:35:48] Amgine_, these are all live. [22:36:26] occupation is semantic, so it has to match an item in WD. at the moment, there's only one item for confectionery to which confectioner redirects (as with Wikipedia, in fact...) [22:37:12] (oops, my error, there's no redirect at confectioner. either way, still only one entry) [22:37:50] there's a "field of work" which probably works better for non-occupational things like that [22:39:31] Actually, confectioner is an occupation. Confectionery is the field of work. [22:40:04] yeah, that's what I meant [22:40:27] (confectionary - non-occupational term) [22:40:44] it's a little patchy, as the Wikipedias it's using for sources are also idiosyncratic [22:41:08] we have confectionery but not confectioner, butcher but not butchery [22:41:26] Not to mention often wrong when relating to words/terms. [22:42:59] http://en.wikipedia.org/w/index.php?title=Butcher&oldid=551103363 - I am baffled but delighted by the infobox caption here [22:43:17] it contrives to indicate that the image below is what Canadian butchers look like [22:44:18] [22:44:33] Not at all like the abattoir up the street from me. [22:46:51] that, presumably, is the *wholesale* butcher [22:48:35] Yes and no; it means more than just 'slaughterhouse'. They have whole carcass butchering. [22:52:10] hmm, midnight [22:52:13] to bed, I think! [22:52:15] later, all... [22:52:28] shimgray: So, effectively, only terms which are from en.Wikipedia can be used? [22:52:40] Amgine_, terms/concepts from any and all wikipedias [22:53:04] but if de,fr,ru, etc all have an article and en doesn't, it may not have got an en label yet so may not be easily findable [22:53:05] okay. Thanks. [22:53:39] No, I'm thinking about the proposal to use wikidata for OmegaWiki. [22:53:39] there's some db reports trying to identify the most widely used non-en (or non-en-labelled) topics [22:53:57] anyway, speak later :-) [22:56:46] I just installed wikibase on my installation of mediawiki [22:57:06] when i access it on my localhost, i see the same default mediawiki page as before [22:57:25] I just want to confirm if everything's running all right?