[00:00:42] * lbenedix realized that it's 2am and falls asleep
[00:24:54] New patchset: Aude; "Remove redundant bloat from API modules." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49862
[01:41:08] So did we get qualifiers? i'd rather not read scrollback
[01:41:14] nope
[01:41:21] aww :(
[01:41:34] hopefully tomorrow
[01:41:50] good, will give me some time to catch up :D
[04:18:06] Change on mediawiki a page Extension:Wikibase was modified, changed by Llfanyll link https://www.mediawiki.org/w/index.php?diff=674763 edit summary: [+55] /* Configuration */
[04:18:52] Change on mediawiki a page Extension:Wikibase was modified, changed by Llfanyll link https://www.mediawiki.org/w/index.php?diff=674764 edit summary: [+55] /* rebuildAllData.php */
[04:19:41] Change on mediawiki a page Extension:Wikibase was modified, changed by Llfanyll link https://www.mediawiki.org/w/index.php?diff=674765 edit summary: [+55] /* External links */
[04:36:06] Change on mediawiki a page Extension:Wikibase was modified, changed by Peachey88 link https://www.mediawiki.org/w/index.php?diff=674766 edit summary: [-165] Rejected the last 3 text changes (by [[Special:Contributions/Llfanyll|Llfanyll]]) and restored revision 663872 by Silke WMDE
[11:09:37] addshore: I have a question about http://bots.wmflabs.org/~addshore/addbot/index.php
[11:43:51] aude: I want to play with the widget to add links to an item.
[11:44:04] I have Wikibase set up and I can add items and site links to existing Wikipedias.
[11:45:04] Now how can I create a page in my own wiki that uses these links?
[11:45:42] I suppose that I need to add my own wiki to the sites table and probably do some more configuration. What's the right way to do it?
[11:46:16] DanielK_WMDE: ^ you'll probably know
[12:02:08] http://www.wikidata.org/wiki/Property:P416 some desc should be provided
[12:02:13] Denny_WMDE: go for it
[12:04:01] i am not sure I understand what the columns mean
[12:04:22] haha it's funny you should say that
[12:04:29] does pending mean: not processed yet
[12:05:01] does checked mean: processed, but addbot couldn't replace existing links
[12:05:03] Pending is meant to mean the DB row doesn't have a value for 'links', meaning the bot hasn't looked at it yet
[12:05:23] checked means the bot has passed over the page and there is now a value for 'links' (the links on the page)
[12:05:28] if links = 0 the page is removed
[12:05:48] note I wrote at the top the other day that I'm not really sure if this page is right or not (the totals should be, but for the columns you're best off checking the on-wiki pages!)
[12:06:14] also when i rewrote my bot a few days ago I happened to totally forget to update the db rows after the bot had checked them
[12:06:40] so although the bot would move through the articles and remove where 0 and add to wd etc. it wouldn't update the db saying how many links it had left on the page ;p
[12:07:40] hmm. sorry, i am still confused
[12:07:48] let's take azwiki
[12:07:48] so am I
[12:07:54] * addshore logs into mysql
[12:07:56] ok :)
[12:08:24] is there another view to see for a specific wiki what the status is?
[12:08:35] no :/
[12:08:44] although I could make that
[12:08:51] I think I need to remake all of the pages tbh
[12:10:15] select count(*) from iwlinked where links=0;
[12:10:22] 381,551
[12:10:33] that is how many the bot is yet to check as far as I can tell
[12:10:51] so 2588001 in the loop
[12:10:53] individual pages over all wikis?
[12:10:58] yes
[12:11:07] so 2.5 million or 380k?
[12:11:09] so 2588001 have been checked where links remain
[12:11:21] 381,551 haven't even been checked yet
[12:11:27] ah, ok
[12:11:29] 15:02:09 - Base-w: http://www.wikidata.org/wiki/Property:P416 some desc should be provided
[12:11:30] 15:02:09 - Base-w: http://www.wikidata.org/wiki/Property:P416 some desc should be provided
[12:11:30] 15:02:09 - Base-w: http://www.wikidata.org/wiki/Property:P416 some desc should be provided
[12:11:30] :)
[12:11:35] wait
[12:11:39] that's totally wrong..?
[12:11:43] and 2.5 M articles still have iwlinks
[12:11:46] as the db has 9721260 rows...
[12:11:46] don't know :)
[12:12:05] i'm honestly rather confused
[12:12:09] it should be closer to 27 million
[12:12:37] the db should have 27 million articles?
[12:12:47] it started off with 25ish
[12:13:06] yah, that's close enough
[12:13:11] 25ish is probably right :)
[12:13:19] i think I need to migrate databases :/
[12:13:26] aw
[12:13:41] I started off doing this task just for en (2.5 million rows ish to begin with) so i didn't think much about the db design
[12:13:55] then i increased this to 25 million and performance shot down
[12:14:14] problem is migrating from one to the other will likely take a long time :P
[12:15:00] I also think I broke the bot when adding the update queries back in :/ > http://bots.wmflabs.org/~addshore/addbot/status
[12:31:45] Aha
[12:32:04] DanielK_WMDE: Running rebuild search term key is a LOT faster on terbium
[12:32:21] like one batch of 500 in roughly a second
[12:32:41] sweet
[12:33:06] 33642512
[12:33:14] 25 million / 500 = 10k, that's 3 hours
[12:33:29] oh, it's 33 million by now?
[12:33:50] i got my math completely wrong
[12:34:29] 33.6 M / 500 = 67k -> 19 hours
[12:35:53] Just under 37.5% done
[12:36:22] so 12 more hours
[12:40:40] 36111474
[12:40:43] It's certainly speeding up
[12:41:29] 39%
[12:41:58] oh
[12:42:09] so it will have about 100 million rows in the end?
[12:42:18] max is currently 92381538
[12:42:35] oy
[12:42:40] Not sure at what rate it's growing..
[12:43:04] yo, yoo, that's a little bit bigger than i expected…
[12:43:09] http://p.defau.lt/?InQBY4K_QqMCjXyrSXCfyA
[12:43:40] 65.5GB total size
[12:44:05] that's more right
[12:44:08] 40 M, no 90 M
[12:44:20] wonder why the id is so much higher
[12:44:25] I was just wondering the same thing
[12:44:57] I have an idea, but that could be considered a bug in the respective module
[12:45:02] I'll make a note and check it
[12:46:19] Oh, I see
[12:46:26] | 92388373 | 8306522 | item | en | label | Category:Books by Jonathan Green | |
[12:46:26] | 92388372 | 8306522 | item | es | description | categoría de Wikipedia | |
[12:46:26] | 92388371 | 8306522 | item | fr | description | page de catégorie de Wikipédia | |
[12:46:35] Of course it's going to be higher if each item has >= 1 rows
[12:47:46] nah, that's not the issue
[12:48:02] there are 40 M labels, descriptions, and aliases
[12:48:09] and each of them has one line in that table
[12:48:16] there are only 10 M items
[12:48:29] Eh?
[12:48:30] but the rowid is at 90 M
[12:48:44] I wonder where the 50 M rowids have gone
[12:48:46] and I have an idea
[12:48:55] In the example I pasted above the same item number has at least 3 rows
[12:49:23] yes, but that is one label and two descriptions
[12:49:36] and every term (i.e. label, description, alias) has one row
[12:49:43] not every item has one row
[12:49:53] every item should have many terms
[12:49:57] Hmm
[12:50:33] makes no sense?
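(An editorial aside on the COUNT-vs-MAX puzzle above: one plausible explanation, consistent with DanielK's "bug in the respective module" hunch, is ordinary auto-increment churn, since ids are consumed by every insert, including term rows that are later deleted and rewritten when an item is edited. A minimal sketch of the check, assuming direct read access to the repo database; the table and column names, wb_terms and term_row_id, match the Wikibase schema of this era but are assumptions here, as is the pymysql client.)

```python
# Sketch: compare live term rows against the highest auto-increment id.
import pymysql  # assumption: any MySQL client would do

conn = pymysql.connect(host="db-host", user="reader", password="...",
                       database="wikidatawiki")
with conn.cursor() as cur:
    # How many term rows actually exist (labels + descriptions + aliases).
    cur.execute("SELECT COUNT(*) FROM wb_terms")
    (count,) = cur.fetchone()
    # Highest id ever handed out by auto-increment.
    cur.execute("SELECT MAX(term_row_id) FROM wb_terms")
    (max_id,) = cur.fetchone()
    # Ids burned by deleted-and-rewritten rows never come back, so max_id
    # can legitimately run far ahead of the live row count.
    print(f"live rows: {count}, max row id: {max_id}, gap: {max_id - count}")
conn.close()
```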
[13:38:38] can someone create a list of templates on nl-wiki where the interwikis are still on the template itself instead of Wikidata?
[13:44:46] they have templates adding langlinks? or they have templates with langlinks for the templates?
[13:45:08] Romaine: http://bots.wmflabs.org/~addshore/addbot/index.php <- what you want but not limited to templates
[13:45:18] addshore might be able to hel cut it down
[13:45:20] *help
[13:46:35] I notice a lot of bots left one interwiki on the templates I checked
[13:47:06] Romaine: what interwiki was it?
[13:47:39] mostly war and os languages
[13:47:58] while they were on Wikidata on the item page
[13:49:40] I would like to solve interwiki conflicts related to iw's on nl-wiki
[13:49:58] I hoped already almost all iw's would have been moved
[13:50:29] Lydia_WMDE: you have a moment?
[13:50:39] Denny_WMDE: yes
[13:50:51] where is the current state of the discussion about sources?
[13:51:00] getting a link - sec
[13:51:10] i tried reading http://www.wikidata.org/wiki/Help:Sources http://www.wikidata.org/wiki/Help_talk:Sources http://www.wikidata.org/wiki/Wikidata:Requests_for_comment/References_and_sources
[13:51:16] is that it?
[13:51:21] http://www.wikidata.org/wiki/Wikidata:Requests_for_comment/References_and_sources
[13:51:39] ^ is the most current one
[13:51:50] Updated 500 search keys, up to row 60517944.
[13:53:09] why do they make it so complicated?
[13:53:15] it is even hard to comment on that RFC
[14:10:20] addshore: and tl
[14:11:40] well, I am about 3% of the way through migrating to a new DB currently :)
[14:12:29] Lydia_WMDE: I tried to add some comments on http://www.wikidata.org/wiki/Wikidata:Requests_for_comment/References_and_sources
[14:12:33] Romaine: in nl what is Template:?
[14:12:50] Denny_WMDE: thanks
[14:13:32] Lydia_WMDE: maybe we should make an out-of-schedule office hour for that, in order to answer technical questions if they have them? Also about what we were thinking about when creating that part? I don't think you need to attend, but would you mind suggesting it and seeing if they want it?
[14:14:01] Denny_WMDE: will do
[14:14:04] thx
[14:14:05] makes sense
[14:14:15] anyone here who would like to have such a discussion?
[14:14:19] (just to get a feeling)
[14:16:31] ok, thanks :) I get a feeling for that ;)
[14:17:03] ;p
[14:17:24] it could be good for people :P
[14:22:35] addshore: several in a serial set, clicked them away already
[14:23:33] war example: https://nl.wikipedia.org/w/index.php?title=Sjabloon:Navigatie_district_Dielsdorf&curid=785889&diff=37185840&oldid=33354914
[14:24:09] I think the namespace is not exactly the one on war-wiki, which causes the bot to skip it
[14:24:48] ahhh
[14:24:57] im trying to modify my bot to account for that right now :)
[14:25:51] might even be done today :)
[14:26:45] that's probably the reason my bot is struggling to remove as many interwikis as it should :P
[14:27:12] this one is also strange: https://nl.wikipedia.org/w/index.php?title=Sjabloon:Navigatie_werelderfgoed_Australi%C3%AB&curid=139655&diff=37185476&oldid=34516158
[14:27:20] half the iw's were not moved
[14:27:33] I just added them to the item page, but they were missing there too
[14:29:40] that edit was made before my bot was correctly moving as many interwikis as it could to wikidata (I think)
[14:30:20] ok
[14:32:24] addshore: tl example: https://nl.wikipedia.org/w/index.php?title=Sjabloon:Navigatie_Superman&curid=1073095&diff=37184643&oldid=33376103
[14:32:28] also namespace related
[14:33:26] Romaine: take a lovely look at this which is the next thing I am trying to add to my bot :) http://en.wikipedia.org/w/index.php?title=User:Addshore/Sandbox&action=edit&oldid=550455416
[14:33:53] every version of the namespace should be in there for every lang :)
[14:34:43] how much time do you think it will take to have all interwikis transferred to Commons?
[14:34:54] is that within a month?
[14:35:31] any idea?
[14:35:52] hmm, how do you mean?
[14:37:04] the goal is to add interwikis to Wikidata and remove them from the pages on Wikipedias, I don't know how fast it is going?
[14:37:36] hmmm, i'd say we are over half way
[14:38:29] I started with a list of about 25 million articles and I am now down to about 10 million
[14:39:07] in about 36 hours I will have an exact figure as of 2 days ago :)
[14:39:29] ok
[14:39:38] an os example: https://nl.wikipedia.org/w/index.php?title=Sjabloon:Positiekaart_ABC-eilanden&curid=1902230&diff=37184360&oldid=33300722
[14:44:25] a pa example: https://nl.wikipedia.org/w/index.php?title=Sjabloon:Infobox_India_op_Olympische_Spelen&curid=931584&diff=37184047&oldid=33353883
[14:44:30] also namespace issue
[14:45:26] pfl also namespace issue:
[14:45:27] https://nl.wikipedia.org/w/index.php?title=Sjabloon:Gebruiker_pdc-2&curid=1499551&diff=37183608&oldid=33390705
[14:46:19] pdc also namespace issue: https://pfl.wikipedia.org/w/index.php?title=Vorlach%3AUser_pdc-2&diff=46047&oldid=34591
[14:52:13] Reedy: aude: do we plan to try the deployment again?
[15:14:34] Updated 500 search keys, up to row 79904401.
[15:19:21] wohoo, almost done :) 10 million more
[15:22:36] 81465919.
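(An editorial aside on the namespace problem addshore describes above: the localized namespace names and aliases that trip up interwiki parsing, "Sjabloon:", "Vorlach:", and so on, can be collected per wiki via the real MediaWiki siteinfo API. The helper name and the wiki list below are illustrative only; this is a minimal sketch, not addbot's actual code.)

```python
# Sketch: collect every local name and alias for each namespace on a wiki.
import requests

def namespace_names(api_url):
    """Map namespace id -> set of every known name/alias on one wiki."""
    params = {
        "action": "query",
        "meta": "siteinfo",
        "siprop": "namespaces|namespacealiases",
        "format": "json",
    }
    data = requests.get(api_url, params=params).json()["query"]
    names = {}
    for ns in data["namespaces"].values():
        ns_id = ns["id"]
        names.setdefault(ns_id, set()).add(ns["*"])  # local name, e.g. "Sjabloon"
        if "canonical" in ns:
            names[ns_id].add(ns["canonical"])        # canonical name, e.g. "Template"
    for alias in data.get("namespacealiases", []):
        names.setdefault(alias["id"], set()).add(alias["*"])
    return names

# e.g. the wikis from Romaine's diffs; namespace 10 is Template:
for wiki in ("nl", "war", "pfl"):
    ns = namespace_names(f"https://{wiki}.wikipedia.org/w/api.php")
    print(wiki, sorted(ns.get(10, set())))
```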
[15:24:14] Max is 92556240
[15:24:59] Romaine: just to let you know I have just finished coding the change to my bot to make it account for the use of other versions of namespaces in interwiki links :0
[15:25:12] should see more removals as soon as my database has updated later today :)
[15:25:15] :)
[15:29:59] i will get aude and DanielK_WMDE in place about what to do once the script is through :)
[15:30:24] 83M
[15:33:21] yay for terbium :)
[15:33:58] Denny_WMDE1, Reedy: re rebuildTermSearchkey: https://bugzilla.wikimedia.org/show_bug.cgi?id=46378#c6
[15:34:13] Reedy: would you know how much faster it runs on the new server?
[15:34:20] roughly
[15:35:18] aude: can you prepare a changeset for https://bugzilla.wikimedia.org/show_bug.cgi?id=46378#c6 step 2?
[15:35:49] * aude breaking wikidata :o
[15:35:52] Error: 1205 Lock wait timeout exceeded; try restarting transaction (10.64.32.28)
[15:35:57] it's not a big issue though
[15:36:33] Denny_WMDE1: what do you want first?
[15:36:39] deploy qualifiers, etc.
[15:36:48] or enable term search key
[15:38:09] http://www.youtube.com/watch?v=FOdWxf1tRmI
[15:38:38] aude: actually, I have no preference — whatever you think makes more sense
[15:38:57] DanielK_WMDE: There's no sort of rate reporting in it
[15:39:17] It's probably around 1s per 500, which could've been a minute or more on hume
[15:39:28] Denny_WMDE1: i think deployment first
[15:39:31] woot, factor 50-100?
[15:39:32] sweet
[15:39:38] 84514359
[15:39:40] then search key and we can try enabling changes as json
[15:39:52] i hope the dispatcher has a similar factor :)
[15:59:23] meeeetings
[15:59:53] :)
[16:03:45] finding the hangout url is always the hardest part
[16:04:52] <^demon> I'm rebooting my laptop so I can join.
[16:04:57] <^demon> Been having hangout problems since yesterday
[16:07:14] <^demon> Seriously hangout...no video still...
[16:15:23] 90037969.
[16:15:35] (drumroll)
[16:17:52] It's nearly finished
[16:17:55] 2.5M or so
[16:21:47] yeah, aude is in some meeting right now, so she'll help you later with the change :)
[16:25:19] 91377474.
[16:25:30] Denny_WMDE1: I'm in the same meeting ;)
[16:27:39] 91686682.
[16:29:19] hehe :)
[16:37:24] Done!
[16:37:37] wohoo
[16:37:42] there seems to be a few hundred coming in new every second
[16:37:59] or maybe not so much
[16:47:15] Thu Apr 18 16:46:13 UTC 2013
[16:47:15] reedy@terbium:~$ date && mwscript extensions/Wikibase/repo/maintenance/rebuildTermsSearchKey.php wikidatawiki --only-missing --force --batch-size=500 && date
[16:47:15] Thu Apr 18 16:47:02 UTC 2013
[16:47:15] Updated 500 search keys, up to row 92637837.
[16:47:15] Updated 500 search keys, up to row 92638339.
[16:47:16] Updated 58 search keys, up to row 92638397.
[16:47:18] Done. Updated 1058 search keys.
[16:47:20] Thu Apr 18 16:47:03 UTC 2013
[16:47:21] :)
[16:47:41] Thu Apr 18 16:47:03 UTC 2013
[16:47:41] reedy@terbium:~$ date && mwscript extensions/Wikibase/repo/maintenance/rebuildTermsSearchKey.php wikidatawiki --only-missing --force --batch-size=500 && date
[16:47:41] Thu Apr 18 16:47:30 UTC 2013
[16:47:41] Updated 500 search keys, up to row 92638532.
[16:47:42] Updated 500 search keys, up to row 92639032.
[16:47:44] Updated 111 search keys, up to row 92639143.
[16:47:46] Done. Updated 1111 search keys.
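(An editorial aside: Reedy is re-running the maintenance script by hand, "pressing up and enter". A sketch automating that, assuming the mwscript command copied from the log and the "Done. Updated N search keys." output format shown there. The stop-when-stuck guard matters: rows the script cannot normalise would otherwise make this loop forever, as the recurring "365" later in the log shows.)

```python
# Sketch: re-run rebuildTermsSearchKey until a pass updates nothing,
# or until the count stops shrinking (rows the script can't fix).
import re
import subprocess

CMD = [
    "mwscript",
    "extensions/Wikibase/repo/maintenance/rebuildTermsSearchKey.php",
    "wikidatawiki",
    "--only-missing", "--force", "--batch-size=500",
]

last = None
while True:
    out = subprocess.run(CMD, capture_output=True, text=True).stdout
    m = re.search(r"Done\. Updated (\d+) search keys\.", out)
    updated = int(m.group(1)) if m else 0
    print(f"updated {updated} search keys this pass")
    if updated == 0 or updated == last:
        break  # done, or stuck on un-normalisable rows (cf. the recurring 365)
    last = updated
```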
[16:47:49] Thu Apr 18 16:47:30 UTC 2013
[16:47:53] Over 1000 in 30 seconds
[16:54:38] Reedy greg-g https://gerrit.wikimedia.org/r/#/c/59755/
[16:54:44] for safety
[16:54:50] 2) https://gerrit.wikimedia.org/r/#/c/59667/
[16:54:56] 3) switch wikidata to wmf2
[17:03:52] * aude waves
[17:11:04] wait, there are zero sources for Barack Obama's place of birth ?!?!?!?!?! https://www.wikidata.org/wiki/Q76
[17:11:10] heh
[17:11:16] well, birthers.
[17:11:33] alright, core is updated with new versions of the extension
[17:11:33] you'd think there'd be like 10, all saying different things ;)
[17:11:47] i think it's ready to switch over anytime...
[17:11:59] "also known as Barry Obama"
[17:12:01] HAHA
[17:12:05] greg-g: shocking, isn't it?
[17:12:12] then we'll do config change for the terms table to use the search key (and can use json format for changes)
[17:12:20] Denny_WMDE1: I had more faith in the birthers, I guess ;)
[17:12:33] they're all over on conservadata
[17:12:51] hah
[17:13:08] nifty thing is, their time-datatype only needs to go back 6 millennia. lucky bastards.
[17:13:44] lol
[17:14:12] hey, socking on wikidata isn't illegal.
[17:15:30] It's oh so quiet..
[17:16:09] hi all, are there full dumps of the wikidata db/xml available? i'm particularly interested in the interlanguage links.
[17:16:32] heh
[17:16:35] They have been for a while..
[17:16:46] :)
[17:16:51] elplatt: http://dumps.wikimedia.org/wikidatawiki/20130417
[17:19:02] Great! I saw those, but I think I'm missing something. The langlinks file is only 25KB. That can't be complete, can it?
[17:19:37] Reedy: we need the localisation cache updated
[17:21:35] elplatt: it's in the content of the page
[17:21:40] not in the langlinks table
[17:21:53] * aude can happily edit but awaits localisation update
[17:22:07] so you need e.g. pages-articles
[17:22:18] until then, folks might see "wikibase-addqualifier"
[17:22:22] Ah, that makes sense. Thanks!
[17:23:27] 750,000 rows done, one can only guess how many remain
[17:23:47] addshore: over 9000
[17:24:02] Reedy: unfortunately so ;p
[17:24:45] heh
[17:26:20] any admin here?
[17:26:30] would someone like to create this property? http://www.wikidata.org/wiki/Wikidata:Property_proposal/Creative_work#as
[17:41:02] hey ^demon
[17:41:19] <^demon> hi.
[17:44:51] assume reedy is handling it but we need the localisation cache updated (due to wikidata update)
[17:45:01] * aude patient
[17:45:39] everything else seems fine
[17:45:47] New patchset: Daniel Kinzler; "(Bug 47125) ChunkCache for speeding up dispatchChanges." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/59388
[17:50:59] https://www.wikidata.org/w/index.php?title=Q2275950&diff=prev&oldid=26782407
[17:51:02] e.g.
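(An editorial aside on elplatt's dump question above: as aude says, the sitelinks live in the JSON body of each item page inside the pages-articles dump, not in the langlinks table. A minimal streaming sketch follows; the dump filename, the export schema version in the XML namespace, and the exact entity JSON shape (a top-level "links" map of site to title, as in this era) are all assumptions.)

```python
# Sketch: stream a wikidatawiki pages-articles dump and count sitelinks per item.
import bz2
import json
import xml.etree.ElementTree as ET

NS = "{http://www.mediawiki.org/xml/export-0.8/}"  # schema version may differ
DUMP = "wikidatawiki-20130417-pages-articles.xml.bz2"

with bz2.open(DUMP, "rb") as f:
    for _, elem in ET.iterparse(f):
        if elem.tag != NS + "page":
            continue
        title = elem.findtext(NS + "title")
        text = elem.findtext(f"{NS}revision/{NS}text") or ""
        if title and title.startswith("Q") and text.startswith("{"):
            try:
                links = json.loads(text).get("links", {})
            except ValueError:
                links = {}
            if links:
                print(title, len(links), "sitelinks")
        elem.clear()  # keep memory bounded while streaming
```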
[17:53:15] aude: I started it when you asked me at 18:19
[17:53:28] It's not finished doing both of the cache rebuilds yet :/
[17:54:08] ok
[17:54:25] On a go slow
[17:54:46] 27906 new entries in about 20 minutes
[17:55:00] * Reedy keeps pressing up and enter
[17:55:11] no hurry
[17:55:25] whenever ready, then my config patch can be deployed
[17:56:21] then i think we do the rebuild term search key only for any terms that were created / updated since and don't have the field populated
[17:56:51] Yay, it's syncing to the apaches now
[17:56:56] :)
[18:04:39] 20688 www-data 30 10 985m 422m 4992 R 99 10.7 0:17.11 apache2
[18:04:40] 20705 root 20 0 229m 143m 4188 R 99 3.6 0:11.89 puppet
[18:04:43] I guess that's why fenari is slow
[18:05:12] ah
[18:05:57] load average: 1.03, 0.87, 0.97
[18:20:12] I just made an edit in wikidata and can't see the new sitelink in wikipedia... Is the lag back?
[18:20:38] lbenedix: let's see
[18:21:09] pending is ~100 so should not take long
[18:21:10] is the delay time available over the api?
[18:21:15] sadly not yet
[18:21:25] next deployment i'd hope and expect
[18:21:28] would be a nice feedback when a user clicks save
[18:21:57] "to see your edit in wikipedia you have to wait 2 days and 16 hours" ;)
[18:22:24] * aude sees "(diff | cron) . . New York (Q60); 17:18 . . Aude (Discussione | contributi) (3 modifiche)"
[18:22:34] * aude tries new edit
[18:22:41] now it's there
[18:22:45] good
[18:23:02] but I can't see the edit in the history of the article
[18:23:06] http://en.wikipedia.org/w/index.php?title=Social_news&action=history
[18:23:24] no, we don't support that yet
[18:23:24] http://www.wikidata.org/w/index.php?title=Q3963243&action=history
[18:23:34] only in the watchlist and recent changes (for now)
[18:23:41] creating multiple links in one request is possible now?
[18:23:50] in RC of wikipedia?
[18:23:57] putting it in the article history introduces new complexities and possibility of breaking stuff :o
[18:24:07] lbenedix: yes, in rc of wikipedia
[18:24:11] nice
[18:24:23] multiple links is possible via the api. not the ui
[18:24:51] but this is new, isn't it?
[18:24:59] lbenedix: no not new
[18:25:22] rc does not work yet if you have the "enhanced" group changes with javascript preference
[18:25:28] * aude working on it......
[18:25:59] but this is creating a bias when looking at bot edits vs. human edits
[18:26:15] the multiple links per edit
[18:26:19] true
[18:30:55] btw. I did some usability testing the last three days and my testers found some issues with the wikidata-ui
[18:38:23] aude: l10nupdate done
[18:38:33] yay
[18:47:52] Reedy: whenever ready we can fill in the missing search keys with "--only-missing"
[18:48:00] i think that's the last thing
[18:48:17] edsu: ping
[18:52:24] Reedy: it still says ""
[18:52:33] did we miss anything?
[18:53:23] huh http://www.wikidata.org/wiki/MediaWiki:Wikibase-addqualifiers seems missing
[18:55:44] lbenedix: pong
[18:57:02] I had a question regarding wikichanges but I found everything
[18:57:20] lbenedix: oh good, any bugs?
[18:57:21] maybe you could add a list of dependencies on the github page
[18:57:28] no bug
[18:58:03] lbenedix: did the `apt-get install nodejs npm` not cover it?
[18:58:05] took me about 2min to install all needed dependencies
[18:58:32] npm install should grab all of them for you
[18:58:53] npm install underscore && npm install irc-js
[18:59:03] oh?
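(An editorial aside on the wikichanges discussion here: wikichanges is a node library, but as edsu notes below it simply listens to IRC channels for updates, so a minimal Python listener makes the mechanics concrete. The irc.wikimedia.org relay and the #wikidata.wikipedia channel name are assumptions for this era, and real code would need the reconnect handling lbenedix later asks about.)

```python
# Sketch: listen to the recent-changes IRC relay the way wikichanges does.
import socket

HOST, PORT = "irc.wikimedia.org", 6667  # assumed relay host
CHANNEL = "#wikidata.wikipedia"         # assumed channel name

sock = socket.create_connection((HOST, PORT))
sock.sendall(b"NICK rc-sketch\r\nUSER rc-sketch 0 * :rc sketch\r\n")
buf = b""
while True:
    buf += sock.recv(4096)
    *lines, buf = buf.split(b"\r\n")  # keep any trailing partial line
    for line in lines:
        if line.startswith(b"PING"):
            sock.sendall(line.replace(b"PING", b"PONG", 1) + b"\r\n")
        elif b" 376 " in line:  # end of MOTD: safe to join now
            sock.sendall(f"JOIN {CHANNEL}\r\n".encode())
        elif b"PRIVMSG" in line:
            # each PRIVMSG is one recent change; parse title/diff URL from it
            print(line.decode("utf-8", "replace"))
```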
[18:59:15] they're right here https://github.com/edsu/wikichanges/blob/master/package.json
[18:59:32] if you `npm install wikichanges` it should grab the dependencies
[18:59:33] right
[18:59:44] ohh... i cloned the github repo
[18:59:53] oh i see
[19:00:03] first time using nodejs and npm
[19:00:09] yeah, you don't really need to clone the repo, maybe i should make that clearer
[19:00:15] lbenedix: gotcha
[19:00:38] btw. it's great!
[19:00:45] wikichanges is great
[19:01:26] I just want to monitor the changes for wikidata.org, do you have an approximation for the traffic wikichanges will generate?
[19:08:32] Reedy: the localisation looks fixed now, fyi
[19:08:43] * aude happy
[19:09:01] going home.... back later
[19:22:36] wikidata-RC will be a lot of data, if I want raw data ;)
[19:23:09] lbenedix: thnx!
[19:23:29] how do you store the changes for your bot-graph?
[19:23:31] lbenedix: wikichanges itself doesn't generate any traffic, it just listens to irc channels for updates
[19:24:09] by traffic I mean bandwidth
[19:24:13] lbenedix: unfortunately you don't get tons of details about the change, so you might want to do api calls to get more info, that's when traffic becomes an issue i guess
[19:24:19] oh, i haven't noticed
[19:24:38] at the moment i'm storing the activity using the stathat service
[19:24:55] that is something like rrdtool "in the cloud"?
[19:25:10] it was good enough for my purposes at the moment, but won't be good longterm since they are moving to a payment model at the end of the year
[19:25:21] yes, exactly like rrdtool in the cloud
[19:25:35] i think if i had the server space i'd probably use cube
[19:25:40] * lbenedix thinks about making another hardware visualisation like the pediameter for bots vs. humans
[19:25:51] http://square.github.io/cube/
[19:26:48] thx
[19:46:06] aude: I guess 59844 needs deploying soon
[19:46:06] ?
[19:46:22] Done. Updated 158557 search keys.
[20:19:36] hi
[20:19:48] so I see you removed all iw from wp
[20:19:58] but when will you link wikidata?
[20:22:02] edsu: is wikichanges recovering after a disconnect?
[20:33:09] Juandev: What do you mean?
[20:34:49] I had a short look at RC and it seems that KLBot2 has no botflag
[20:35:09] sry
[20:35:15] looked at the wrong field
[20:35:40] https://www.wikidata.org/w/index.php?title=Q4115189&diff=26855236&oldid=26679007 :>>>>>
[20:40:54] legoktm: Claimit.py seems to work quite well. You might want to advertise it to get more feedback
[20:41:05] Or do you want to do all the wikidata edits yourself? :P
[20:41:10] Heheheh
[20:41:21] I have a few hundred thousand sitting in queue right now
[20:41:29] 1839583 1.00311 wd_prop legoktm qw 04/18/2013 07:00:09 1
[20:41:35] Still queued !
[20:42:01] And I will fix that category pagegenerator bug
[20:42:05] That one is my fault
[20:46:45] Reedy: yes we can deploy the settings
[20:46:59] done, i see :)
[20:47:07] or merged at least
[20:47:40] multichill: we also need to add sources to that, and once i code it into the framework, qualifiers
[20:50:35] Reedy: I mean, this: https://en.wikipedia.org/wiki/Rubens_Ometto_Silveira_Mello
[20:51:04] aude: deployed too
[20:51:07] Reedy: there is no easy way to add iw, and iw are not displayed even though the article exists in pt
[20:51:13] legoktm: I still think that "Dutch Wikipedia" sucks as a source
[20:51:27] I do too
[20:51:38] But it's better than nothing
[20:51:39] Juandev: You mean when there are no language links, telling people to do it on wikidata?
[20:52:10] So implement it, you already have the code, you just have to commit it
[20:52:25] yay
[20:52:42] after my midterm :)
[20:52:49] will be back in a few hours
[20:52:55] Reedy: if you want to fill in any missing term search keys, then we can do that with "--only-missing"
[20:53:04] aude: I have been :)
[20:53:08] Juandev: There is [[Q7376024]] and [[Q10365110]] for Rubens_Ometto_Silveira_Mello, I'll merge them on Wikidata
[20:53:08] good
[20:53:09] [1] https://www.wikidata.org/wiki/Q7376024 =>
[20:53:11] [2] https://www.wikidata.org/wiki/Q10365110
[20:53:13] that's the last thing to do
[20:53:14] reedy@terbium:~$ date && mwscript extensions/Wikibase/repo/maintenance/rebuildTermsSearchKey.php wikidatawiki --only-missing --force --batch-size=500 && date
[20:53:15] Up
[20:53:16] Enter
[20:53:19] Up
[20:53:20] ;)
[20:53:21] nice
[20:53:34] Reedy: probably
[20:53:35] aude: Though, there's a bug
[20:53:42] reedy@terbium:~$ date && mwscript extensions/Wikibase/repo/maintenance/rebuildTermsSearchKey.php wikidatawiki --only-missing --force --batch-size=500 && date
[20:53:42] Thu Apr 18 20:53:25 UTC 2013
[20:53:42] Updated 365 search keys, up to row 92923466.
[20:53:42] Done. Updated 365 search keys.
[20:53:42] Thu Apr 18 20:53:25 UTC 2013
[20:53:50] huh
[20:53:52] As many times as I press up and enter, it's always 365
[20:53:58] They're not being populated for some reason
[20:54:15] hrm
[20:54:16] http://p.defau.lt/?Qzt3dpKhjAIOGG_YrrIvNw
[20:54:29] otherwise, the search is case insensitive now
[20:54:43] It's probably the bug I reported before..
[20:55:04] hmm
[20:55:06] db query time
[20:56:15] Yup, funky entries
[20:56:20] ugh
[20:56:56] < 400 entries with problems out of over 90 million is pretty good going
[20:57:14] sure
[20:58:06] Comments cannot be longer than 65535 characters.
[20:58:08] Damn bugzilla
[20:58:25] http://p.defau.lt/?R6bnyOoKgSyCXvU7kZ_m6A
[20:59:06] strange
[20:59:35] Reedy: hey, that's pretty good compared to Twitter. Imagine if you could only do a bugzilla request in 140 characters
[21:00:53] something like http://www.wikidata.org/wiki/Q93863?uselang=de seems non-trivial to normalise
[21:01:18] all the ones in the list seem odd like that
[21:02:03] if those are not normalized, not sure whether it's an issue or not
[21:06:55] aude: Denny_WMDE: hey :) at home now - what's the status?
[21:08:39] Lydia_WMDE: hola
[21:08:49] everything great
[21:08:51] ;-)
[21:08:53] qualifiers are deployed
[21:08:56] cool
[21:09:02] i will send notes then
[21:09:11] what about the termsearch fix?
[21:09:16] Lydia_WMDE: can you as a community manager help me with the registration for the hackathon? I wonder if it is possible to get assistance with the travel and the accommodation if I'm looking for the accommodation by myself?
[21:09:28] search is improved too
[21:09:33] termsearch is in
[21:09:41] * lbenedix has to try qualifiers
[21:09:57] lbenedix: Contact Sumana... the Hackathon isn't a WMDE event (this year)
[21:10:02] lbenedix: that is a question i can't answer sorry - it would be best to ask sumana for that or the orga team
[21:10:11] ok
[21:10:12] Denny_WMDE: sweet - will add that to the news
[21:10:20] is in the Netherlands
[21:10:25] is there an example for the qualifiers?
[21:10:33] Usually accommodation is wherever they've reserved, not a random hotel you find...
[21:11:05] * hoo still hasn't booked train tickets...
[21:11:30] hoo: fixit! :D
[21:11:34] :D
[21:11:41] I need to book a crossing..
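(An editorial aside on the recurring "365" above: a quick way to eyeball the rows rebuildTermsSearchKey keeps skipping, like aude's Q93863 example, is to pull the terms whose search key is still empty. The table and column names, wb_terms and term_search_key, match this era's Wikibase schema but are assumptions here, as are the connection details.)

```python
# Sketch: list term rows whose normalised search key was never populated.
import pymysql  # assumption: any MySQL client would do

conn = pymysql.connect(host="db-host", user="reader", password="...",
                       database="wikidatawiki")
with conn.cursor() as cur:
    cur.execute(
        "SELECT term_entity_id, term_language, term_type, term_text "
        "FROM wb_terms WHERE term_search_key = '' LIMIT 50"
    )
    for entity_id, lang, term_type, text in cur.fetchall():
        # Terms the normaliser maps to an empty string (odd unicode,
        # control characters, ...) will reappear here on every pass.
        print(f"Q{entity_id} {lang} {term_type}: {text!r}")
conn.close()
```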
[21:11:51] alright guys - less chatting for me and more writing - back in a few
[21:12:05] Lydia_WMDE: there's a few odd terms that don't normalize
[21:12:25] aude: ok but nothing major i assume?
[21:12:59] not an issue but there might be a few entries missing perhaps
[21:13:03] http://www.wikidata.org/wiki/Q93863?uselang=de
[21:13:05] for example
[21:13:12] ok
[21:13:15] good to know
[21:13:33] not sure if it means anything valid is missing or not
[21:14:31] a question that you should be able to answer: what is the cheapest way to get to amsterdam from berlin?
[21:14:52] train likely
[21:14:57] aude: Do you consider the wikibase.store refactoring a blocker for the sitelink widget?
[21:15:03] hoo: no
[21:15:08] Great
[21:15:21] it's a little bit slow to load but might be better in production
[21:15:28] and it's on demand, so fine i think
[21:15:30] I hope to get it into a mergeable state by monday, though
[21:15:37] good
[21:15:48] think we can enable it on monday for enwiki and others on wednesday
[21:16:07] \o/
[21:16:11] :)
[21:16:21] I guess it would be best if I'm here at that time
[21:16:39] sure, if you can be
[21:16:58] it's already on test2
[21:17:15] Depends on the deploy times... if it's late as usual, I'm pretty sure I can make it
[21:17:16] no issues afaik
[21:17:18] * lbenedix loves bahn.de
[21:17:24] lbenedix ü1
[21:17:26] * +1
[21:17:28] sure, should be as usual
[21:17:40] hoo: it is planned for late as usual again
[21:17:46] hoo: having you around would be great :)
[21:18:17] hoo: and i just got another bunch of requests for the widget today so it is awesome to finally be able to tell them it is coming
[21:18:19] :D
[21:18:33] * lbenedix could watch for hours at https://www.img-bahn.de/es/v1000/img/loading_grau_66x66.gif
[21:18:42] :)
[21:19:31] Lydia_WMDE: Ehm, just another question
[21:19:38] ^^
[21:19:46] Hahc21: shoot
[21:20:00] Monolingual text is planned to arrive this year?
[21:20:35] and what are the differences between monolingual text and string (from a technical standpoint)
[21:20:47] Hahc21: for all i know yes but Denny_WMDE has the last word on that
[21:20:57] the difference is that you can't translate the former
[21:21:09] from Thursday to Monday easyjet is cheaper than one way with DB
[21:21:18] Ehm
[21:21:30] lbenedix: Get yourself a Bahncard
[21:21:32] then what is the difference between monolingual and multilingual?
[21:21:37] * Hahc21 is noe confused
[21:21:41] now*
[21:21:47] mono = one? multi = more than one
[21:22:00] legoktm: from a technical standpoint
[21:22:04] :P
[21:22:31] Hahc21: multilingual: you can translate it
[21:22:39] string: doesn't have a language associated with it
[21:22:44] like a fiaf id
[21:22:44] oh
[21:22:51] makes more sense now?
[21:22:55] so
[21:23:01] let me write what I understood
[21:23:13] string = value not associated with any language
[21:23:44] monolingual = a value that is associated to a single, unique language, and appears as is to all users, regardless of their language
[21:24:38] multilingual = translatable value that is not associated to a single, unique language, and appears translated to all users, depending on their language, à la labels?
[21:25:20] Hahc21: strings are for stuff like identifiers
[21:25:27] e.g. don't really have a language
[21:25:32] yeah
[21:25:41] a string is like a postal code, or a number
[21:25:45] Hahc21: Well, string means a sequence of characters. Monolingual means one for all languages, the others are translatable
[21:25:47] not a number
[21:25:51] postal code, sure
[21:26:03] I mean, a number that is not set to change
[21:26:04] we'll have something else for numbers
[21:26:06] A number can be a string too, but usually shouldn't
[21:26:07] sure
[21:26:08] like a postal code, or a game review
[21:26:21] Hahc21: correct
[21:26:29] I got it then :D
[21:26:32] Thanks
[21:26:41] and yes, planned in this year
[21:26:47] and now
[21:26:52] I proposed a property
[21:26:52] omg
[21:26:53] (show/hide) 21:26, 18 April 2013 (diff | hist) . . (+252)‎ . . (Q2934910) ‎ (‎Created claim: place of birth (P19), Frameries (Q666751))
[21:26:58] That is awesome!
[21:27:02] "recorded at"
[21:27:08] legoktm: it's still rough
[21:27:19] and the proposed values are the names of the recording studios
[21:27:19] and only works for creating claims with wbcreateclaim
[21:27:25] we're working on the rest of the summaries
[21:27:32] can you create claims in other ways?
[21:27:35] and make them nicer
[21:27:45] legoktm: yes, with other api modules like set claim (i think)
[21:27:46] those names are tied to a language, so the property's type should be monolingual?
[21:27:47] It would be nice if I could override the autosummary with summary=
[21:27:50] Yes... summaries are todo :P
[21:27:53] Oh yeah, right
[21:27:58] Lydia_WMDE: ^
[21:28:02] i am not sure that the ui uses create claim :/
[21:28:04] (my comments only)
[21:28:14] legoktm: I think the autosummaries are quite nice, if set correctly
[21:28:36] for example, separating the key value with a comma is not ideal
[21:28:46] no better way to do it at the moment
[21:28:56] aude: why not an arrow?
[21:29:07] legoktm: then it has to be internationalized
[21:29:11] hoo: True, but my bot takes requests from users, and I'd like to include the user's name in the summary so people don't complain at my talk page
[21:29:14] Hahc21: not sure tbh right now - it is late -.-
[21:29:16] point the arrow to the left for hebrew
[21:29:22] heh >.>
[21:29:39] the comma is internationalized using a core function
[21:29:41] heh
[21:30:26] Hahc21: I'd say the studio should be an item
[21:30:51] Most studios are not relevant enough to have an item
[21:31:06] shall we create items for them?
[21:31:31] (They would meet WD:N#3)
[21:31:48] "needed to make statements made in other items more useful."
[21:32:58] up to the community, Hahc21.
[21:33:33] k
[21:33:39] Thanks for all the help :)
[21:33:41] Hahc21: It's less work and arguing if you just write an article for it on a 'pedia.
[21:33:56] It'll be deleted :L
[21:35:13] * legoktm points to simple
[21:35:17] I never said that fwiw.
[21:36:05] I'd prefer to use monolingual
[21:36:13] I'm against items without interwikis
[21:37:21] Denny_WMDE: Are there bidirectional relations in wikidata? such as "Klaus Wowereit" -- is_mayor_of --> "Berlin" is the same as "Berlin" -- mayor_is --> "Klaus Wowereit"?
[21:37:46] lbenedix: there are some
[21:37:56] like mother / child
[21:37:59] "Could not parse json query response: u'Verified'"
[21:39:01] If I add a Statement "Klaus Wowereit" -- is_mother_of --> "Berlin", I get "Berlin" -- is_child_of --> "Klaus Wowereit" automatically?
[21:39:08] no
[21:39:19] so there are no bidirectional relations
[21:39:28] not automatically no
[21:39:43] write a bot :)
[21:40:04] (that will make it impossible to remove such a relation again...)
[21:40:38] interesting
[21:40:55] what is the reason not to have this?
[21:41:01] DanielK_WMDE: the community got all "omg vandals will abuse it" and magnus gave up trying to get it approved :(
[21:41:24] https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/ImplicatorBot
[21:42:00] will there be bidirectional relations in the future?
[21:42:39] My testers were confused when they entered KW as the mayor of berlin and didn't find the mayor_is relation for the berlin item
[21:42:39] possibly, but it's nothing we are working on
[21:42:48] i don't think it has been ruled out, but it's not easy to do
[21:44:03] any kind of automatically implied stuff has to be thought of carefully. did you consider qualifiers? how do they translate?
[21:44:08] what about sources?
[21:44:24] will the other page possibly grow too big?
[21:44:33] like, should germany contain all people born there?
[21:44:41] why not?
[21:45:01] have fun loading the page :)
[21:45:28] have fun finding the useful info on the page
[21:45:29] ;-)
[21:45:35] is scalability a problem of the users or the developers ;)
[21:45:49] both in the end
[21:45:50] of the system
[21:46:21] maybe not every statement should be loaded
[21:46:22] of the whole socio-technical system that is the Wikidata project, embedded in the Wikimedia projects
[21:46:35] maybe you could have some properties reciprocal, some not? like we have a distinction on-wiki between things to categorise vs. things not to
[21:46:53] could be
[21:46:53] "is father of", yes. "lives in X", no.
[21:46:56] there are a lot of relations like "is mother of" vs. "is child of"
[21:46:58] plenty of possibilities
[21:47:04] we will figure it out with time
[21:47:41] the implicator bot looks good
[21:48:23] btw. I did some usability testing on wikidata and would like to share my insights with you
[21:48:34] but not until may
[21:48:44] uhm
[21:48:46] * lbenedix has to write his thesis...
[21:48:50] ok, sure
[21:48:53] would love to hear that
[21:49:13] a written report that we could publish would be even better :)
[21:49:33] mehh...
[21:50:01] that could take more time than showing the recordings to you ;)
[21:50:43] And I think seeing how different users struggle with different tasks is much better than reading a report
[21:54:44] * lbenedix thinks he has to come to wikidata-office to talk about the wikidata-meter as well...
[22:05:35] New review: Hoo man; "This even adds a few help URIs" [mediawiki/extensions/Wikibase] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/49862
[22:05:35] Change merged: Hoo man; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49862
[22:12:07] would love to see the videos
[23:26:41] edsu: hey your wikidata-bot graph stopped http://inkdroid.org/wikidata-bots/#period=hour
[23:41:32] I ended up setting up my own graph for legobot's mainspace edits: https://www.stathat.com/stats/K5Vs
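(A closing editorial aside on the bidirectional-relations thread: the wbcreateclaim API module named earlier in the log is enough to sketch what ImplicatorBot-style mirroring could look like. This is a sketch only; login and token handling are omitted, the item ids are placeholders, and the property ids and the handling of qualifiers and sources, which DanielK points out do not translate automatically, are all assumptions.)

```python
# Sketch: mirror lbenedix's mother/child example with wbcreateclaim.
import json
import requests

API = "https://www.wikidata.org/w/api.php"
session = requests.Session()
# ... log in and fetch a real edit token first (omitted) ...
TOKEN = "+\\"  # placeholder; a real token comes from the API

def create_claim(item_id, property_id, target_item_id):
    """Add <item> --property--> <target> as a new claim."""
    value = {"entity-type": "item",
             "numeric-id": int(target_item_id.lstrip("Q"))}
    return session.post(API, data={
        "action": "wbcreateclaim",
        "entity": item_id,
        "property": property_id,
        "snaktype": "value",
        "value": json.dumps(value),
        "token": TOKEN,
        "format": "json",
    }).json()

# Both property ids below are illustrative assumptions for mother/child.
create_claim("Q_CHILD", "P25", "Q_MOTHER")   # child item: mother = mother item
create_claim("Q_MOTHER", "P40", "Q_CHILD")   # mother item: child = child item
```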