[00:29:04] legoktm: if they show up in the irc channel yes, i think so
[00:29:10] ok
[08:13:21] * legoktm pokes benestar
[08:13:35] there's an RfA ready for closing
[08:13:54] legoktm: hi
[08:14:03] hi :)
[08:14:19] I see ;)
[08:15:27] New patchset: Nemo bis; "Expand legend on Special:DispatchStats" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/59030
[08:18:54] lol since you're one of two crats now you've gotta do all the work :P
[08:19:45] :P
[08:23:38] thx
[08:29:01] legoktm: just a question: is your bot approved for this task? https://www.wikidata.org/w/index.php?title=Wikidata:Requests_for_deletions&curid=454&diff=24781270&oldid=24781239
[08:31:16] oops
[08:31:21] that was supposed to be through my account
[08:31:36] abuse!@
[08:31:47] legoktm: you know I almost got an oppose at my RFB for that...
[08:31:55] meh
[08:32:30] to the person who wants to do that by hand, i wish them the best of luck
[08:32:53] https://www.wikidata.org/w/index.php?title=Wikidata:Requests_for_deletions&diff=prev&oldid=24781626
[08:32:59] Riley: ;)
[10:04:35] aude: how was your talk? i saw some great comments on twitter
[10:24:33] Lydia_WMDE: are you around?
[10:24:53] benestar: hey :)
[10:24:55] wasup?
[10:25:08] Lydia_WMDE: found this: https://www.wikidata.org/wiki/Wikidata:Google_Summer_of_Code
[10:25:16] yes
[10:25:22] looks quite interesting :)
[10:25:26] :D
[10:25:32] interested?
[10:25:41] Lydia_WMDE: what does "student" mean in this context?
[10:26:02] let me get you the eligibility criteria
[10:26:04] one sec
[10:26:46] benestar: http://www.google-melange.com/gsoc/document/show/gsoc_program/google/gsoc2013/help_page#2._Whos_eligible_to_participate_as_a
[10:27:13] also the question before that
[11:03:51] i wanted to be awake when the dispatch lag hit 0 :/
[11:03:53] oh well, good night
[11:11:19] legoktm: good night :)
[11:13:15] * tommorris is in Katie's session at GLAM-WIKI
[11:13:36] tommorris: yay!
[11:13:38] how is it?
[11:14:24] fine. I sorta know about Wikidata already. ;)
[11:15:11] hehe yeah
[11:16:39] here's a new fact (because I just calculated it for the first time)… Wikidata has had more than 10000 registered users editing it so far
[11:17:27] Denny_WMDE: have you subtracted the number of bots?
[11:17:40] yep
[11:17:44] just 39 bots anyway
[11:18:00] * tommorris should really apply for bot approval.
[11:18:39] I get 9832 users with >0 edits
[11:20:12] https://www.wikidata.org/w/api.php?action=query&list=allusers&format=xml&auprop=editcount&auwitheditsonly=true&auexcludegroup=bot&aulimit=500
[11:21:24] oh, interesting
[11:21:33] i took the dumps and analyzed them with python
[11:21:39] New patchset: Jeroen De Dauw; "Work on implementing NoValueSnakStore::storeSnak [DO NOT MERGE]" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58705
[11:21:55] and get 10678 users including bots
[11:22:21] I made 20 requests to the api with python
[11:22:34] funny, wonder where those 800 users disappeared
[11:23:02] are you trying to figure out if wikidata-users are all wikipedians?
[11:23:22] I'm working on it
[11:23:24] http://lb.bombenlabor.de/ba/users_raw.txt
[11:23:31] that is the list of users I got
[11:23:42] Denny_WMDE: I can tell you just now… no, not all… for ex. I'm a wikisourcian ;)
[11:24:15] hsarrazin: cool
[11:24:15] hello, by the way :)
[11:24:37] The exception proves the rule.
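A minimal Python sketch of the counting lbenedix describes above, paging through the same list=allusers query. It assumes the `requests` library and the API's present-day `continue` pagination; in April 2013 the API still used `query-continue`, so the loop would need adjusting for that era.

```python
import requests

API = "https://www.wikidata.org/w/api.php"

def fetch_editcounts():
    """Page through list=allusers (same parameters as the query URL
    above) and collect the edit count of every registered non-bot
    user that has at least one edit."""
    params = {
        "action": "query",
        "format": "json",
        "list": "allusers",
        "auprop": "editcount",
        "auwitheditsonly": "true",
        "auexcludegroup": "bot",
        "aulimit": "500",
    }
    editcounts = []
    while True:
        data = requests.get(API, params=params).json()
        editcounts += [u["editcount"] for u in data["query"]["allusers"]]
        if "continue" not in data:  # no more pages
            break
        params.update(data["continue"])  # carries the next aufrom offset
    return editcounts

counts = fetch_editcounts()
print(len(counts), "registered non-bot users with >0 edits")
```

With aulimit=500 and roughly 10k users this takes about 20 requests, which matches what lbenedix reports below.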
[11:24:37] what i wonder is if we also pull in completely new contributors to the wikimedia world
[11:24:50] and I also work on authority control on Commons…
[11:25:08] and I'd like to see data on how we constitute our contributors
[11:25:40] hsarrazin: then you sure love the viaf ids and all these nice ids in wikidata :)
[11:25:54] we're working on an api to ask wikidata for the item with a given viaf id
[11:26:00] (or any id)
[11:26:08] hsarrazin: "frwikisource": 70246 edits, "wikidatawiki": 4100
[11:26:11] well, for now, I have not really looked into the viaf ids on wikidata…
[11:26:34] thanx lbenedix :)
[11:26:49] and "commonswiki": 43817
[11:27:10] < 500 on other wikis
[11:27:44] lbenedix: on other wikis, I occasionally correct an error when I stumble upon it :D
[11:28:04] lbenedix: i think your utf-8 encoding on that file is a bit broken
[11:28:30] but fundamentally I'm a librarian, which means I like data to be clean and neat… :D
[11:28:41] looks good in sublimeedit
[11:29:26] http://lbenedix.monoceres.uberspace.de/screenshots/k3wsge666t_(2013-04-13_13.29.06).png
[11:32:07] "Denny": "dewiki": 697, "enwiki": 1858, "hrwiki": 9345, "metawiki": 533, "wikidatawiki": 1819
[11:32:07] "Denny Vrandečić (WMDE)": "metawiki": 833, "wikidatawiki": 461
[11:33:28] Denny_WMDE: is the api giving me accurate results or is your dump-analysis better?
[11:33:58] hum, i have no idea
[11:34:10] i always trusted both
[11:34:22] logically, the dump should be better
[11:34:36] the api can always have shortcuts
[11:35:31] but one way or the other, we have about 10k contributors so far :)
[11:35:58] * lbenedix agrees
[11:37:31] and ~500 with only one edit
[11:38:04] only? sweet
[11:38:28] do you have numbers for >=5, >=10, >=50?
[11:39:34] ~1000 < 10
[11:39:55] nope
[11:40:55] 1500
[11:41:28] btw... is it possible to get the right to make bigger api requests?
[11:41:45] no, bots already have the rights for bigger api requests
[11:41:51] the first wikis have a dispatch lag of zero :) https://www.wikidata.org/wiki/Special:DispatchStats
[11:42:06] * lbenedix thinks he is a bot
[11:42:12] Sk1d: yay!
[11:42:16] *beep* *beep* *beep*
[11:42:38] :D
[11:42:38] are the other wikis getting faster to zero now? bpywiki seems to have a bug
[11:42:54] yeah, I am wondering about bpywiki too
[11:42:59] the others should get faster, yes
[11:43:13] ok so we should wait until all wikis are at zero
[11:43:32] at least for the median
[11:43:33] can you see all wiki lags?
[11:43:38] and then a bit longer
[11:43:49] and if bpywiki doesn't disappear then, it is a bug
[11:44:07] if the median is at zero it means more than half of the wikis are at zero
[11:44:12] no i don't have further stats
[11:44:42] would be great if there was the possibility to show all wikis at once
[11:46:15] agreed
[11:46:28] over the last hour or so the position of bpywiki has not changed from 22680649
[11:47:05] hmhm
[11:47:14] DanielK_WMDE: ^
[11:49:11] total non-bot edits on wikidata: 2.616.425
[11:49:40] not much on ~10.000.000 items
[11:49:54] Sk1d: i had it on 22654642 like an hour ago
[11:50:13] and now it is on 22683650
[11:50:40] it is moving
[11:51:11] lbenedix: well, I am not sure how interesting those items are
[11:51:40] but 2.6 M human edits in 5 months on a new project is pretty cool I'd say ;)
[11:51:49] definitely more than we expected
[11:52:23] show me a wikimedia wiki that has more human edits in that time frame, besides maybe english wikipedia, commons, and, hm, maybe german wikipedia :)
[11:52:28] i think there are still bots without botflags
[11:52:50] let's see if we can discover them
[12:07:18] Denny_WMDE: more editcount numbers: "=1: 2678, >=5: 4608, >=10: 3434, >=50: 1672, >=100: 1200, >=500: 549, >=1000: 345"
[12:08:39] oh, i thought you said <500 with 1
[12:08:52] pity. but this looks more like what I expected
[12:08:54] I started excel
[12:09:15] these are great numbers!
[12:09:49] >=5.000: 90, >=10.000: 46
[12:10:17] sounds like the usual exponential distribution, not very surprising
[12:10:40] but nice to see so many editors with many edits. i was a bit worried about having many more with <5
[12:10:55] cool
[12:11:11] did you already correlate with sister project edits?
[12:11:31] still working on it
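The ">= N edits" buckets lbenedix quotes above can be reproduced from the editcount list without Excel; a short sketch, reusing `counts` from the earlier snippet (the "=1" bucket compares with == rather than >=):

```python
def bucket_counts(editcounts, thresholds=(5, 10, 50, 100, 500, 1000, 5000, 10000)):
    """For each threshold N, count how many users have at least N edits."""
    return {n: sum(1 for c in editcounts if c >= n) for n in thresholds}

exactly_one = sum(1 for c in counts if c == 1)  # the "=1" bucket
print(exactly_one, bucket_counts(counts))
```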
[12:15:27] how is it possible to make >10.000 edits in 5 months?
[12:16:45] lbenedix: why not? i made almost 2000 edits, and I really didn't try hard
[12:17:41] if I'd spend every second evening half an hour on Wikidata, I would easily have more than 10k edits
[12:18:24] i've made 18,000 in one month before on en.wiki
[12:18:40] (I had no life at that particular point)
[12:18:54] impressive contribution
[12:19:34] (at least you didn't just watch a simpsons marathon, but contributed your cognitive surplus to a greater good (™) )
[12:19:51] (although I would love to watch a simpsons marathon at some point)
[12:20:04] XD haha, true
[12:20:05] 2011/12 17016
[12:20:06] 2012/01 18630
[12:20:19] https://www.wikidata.org/w/api.php?action=query&list=allusers&format=xml&auprop=editcount&auwitheditsonly=true&auexcludegroup=bot&aulimit=1&aufrom=Denny_Vrandečić_(WMDE)
[12:20:20] there's supposed to be a space in there
[12:20:24] name="Denny Vrandečić (WMDE)" editcount="461"
[12:20:34] try Denny
[12:20:40] that's my non-work account
[12:20:48] ok
[12:20:53] name="Denny" editcount="1819"
[12:22:30] * lbenedix doesn't trust his excel magic
[12:23:09] the name to editcount relation is wrong in excel...
[12:23:36] * Moe_Epsilon hopes to make a million global edits one day :3
[12:24:06] there is one contributor who made that so far, right?
[12:24:16] two, I believe
[12:24:37] knowledge sharing millionaires :)
[12:24:48] Koavf and Rich_Farmbrough, but Rich is banned for a year
[12:24:49] :<
[12:25:02] aww
[12:25:45] I'd love to see a website that, given a Wikimedia login, calculates how many people have seen your contributions
[12:25:47] Moe_Epsilon: 121.537 edits over all projects
[12:25:59] o:
[12:26:03] good way :)
[12:26:04] yep, seems about right
[12:26:30] only took me since 2005 XD at this rate, I will get to a million when I'm in my sixties
[12:26:43] keep up the good work!
[12:26:56] by then we might have a badge for that ;)
[12:27:19] I request a good 401K and a retirement plan at that point
[12:27:20] ;p
[12:27:44] my poor arthritic hands..
[12:28:05] about 100.000 edits on enwiki
[12:28:28] :D
[12:28:52] ~6000 on wikidata
[12:29:03] ~3000 on commons
[12:29:12] the rest is <200
[12:29:36] yep lol all other projects is mostly me starting userpages xD
[12:29:38] or Commons works
[12:29:41] work*
[12:30:56] or asking for a unified login in my case … :P
[12:31:36] if they ever do get around to creating a global userpage, it would be nice, because I don't want to create 700 :<
[12:32:19] https://en.wikipedia.org/wiki/User:Moe_Epsilon/Sitematrix I've got 8 created xD
[12:37:20] i think it's on the todo list
[12:37:31] for athena
[12:37:37] neat :)
[12:37:52] do you think we could get it by 2014?
[12:38:03] i thought the plan was actually 2015
[12:38:07] but i might be wrong
[12:38:17] need to ask jorm
[12:38:41] hmm, either way, it will save oodles of time for me xD
[12:38:49] not just for you :)
[12:38:58] yeah :)
[12:39:44] I would particularly like it so people can stop requesting we get a new consensus on adding userpages to Wikidata every month lol
[12:41:52] Anyone care to take a rather speedy look and comment at http://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/Addbot_1 ? I would love to get it approved rather speedily so as not to slow my crosswiki bot down :)
[12:45:00] looks good to me, but i am no admin actually :)
[12:45:08] New patchset: Raimond Spekking; "Consistency tweak: article -> page" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/59043
[12:51:12] haha Denny_WMDE I don't really see how anything can go wrong, I only use 1 api function :D
[12:51:25] i agree :)
[12:55:02] more than 1000 users made more than 100 edits, and the other way around
[12:55:07] playing with numbers is fun
[12:55:17] xD
[12:56:18] one could define an H-index for a wiki: the highest H for which the statement "H users have at least H edits" holds
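A sketch of the wiki H-index Denny defines here, assuming the same per-user editcount list as in the earlier snippets:

```python
def wiki_h_index(editcounts):
    """Largest H such that at least H users each have >= H edits."""
    h = 0
    for rank, c in enumerate(sorted(editcounts, reverse=True), start=1):
        if c >= rank:
            h = rank  # the top `rank` users all have >= rank edits
        else:
            break
    return h

print(wiki_h_index(counts))
```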
[12:57:02] I don't make enough edits to Wikidata :< I spend too much time doing admin-ly things
[12:58:09] I would disagree that the number of edits is the sole or even the most important indicator of contribution to the project
[12:59:36] I agree, I just get sad when my edit count hasn't changed a whole bunch in a month and I'm there every day xD but at least I've got over 2,000 admin actions now
[13:03:08] Ironically, I'm doing a school paper right now and the topic in this section is procrastination, so I should probably be focusing xD
[13:04:51] :D
[13:05:00] lbenedix: do you know how many anon edits we had?
[13:07:14] Denny_WMDE: I don't have actual numbers, but I think ~0
[13:08:20] "All together 15,798 article edits were made by anonymous users, out of a total of 8,146,309 article edits (0%)"
[13:08:22] good guess
[13:08:27] this was in february
[13:08:29] between 2013-01-01 and 2013-01-25 there were 1.952.584 bot, 179.623 logged in, 2.019 anon
[13:09:05] I asked Daniel if he could give me these numbers 2 months ago
[13:09:17] it's here: http://stats.wikimedia.org/wikispecial/EN/draft/TablesWikipediaWIKIDATA.htm
[13:09:20] select count(*) as n, count( if( rc_user = 0, 1, NULL ) ) as anon, count( if( rc_bot = 1, 1, NULL ) ) as bot from recentchanges where rc_timestamp >= "20130101000000" AND rc_timestamp <= "20130125000000"
[13:09:44] i can't run stuff on the database
[13:09:47] and probably better so
[13:09:48] :)
[13:09:57] another of those beautiful yellow tables
[13:10:08] yeah, I love them :)
[13:11:41] * lbenedix feels like he's back in the '90s
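Roughly the same counts the SQL above produces can be approximated over the public API with list=recentchanges; a hedged sketch only, since paging through ~2M rows this way is very slow (which is presumably why Daniel ran SQL), and the recentchanges table only retains recent weeks, so the January window only works near that date:

```python
import requests

API = "https://www.wikidata.org/w/api.php"

def count_recent_changes(show=None):
    """Count recentchanges entries between 2013-01-01 and 2013-01-25,
    optionally filtered with rcshow ("anon", "bot", ...)."""
    params = {
        "action": "query",
        "format": "json",
        "list": "recentchanges",
        "rcstart": "2013-01-25T00:00:00Z",  # rc lists newest first
        "rcend": "2013-01-01T00:00:00Z",
        "rclimit": "500",
    }
    if show:
        params["rcshow"] = show
    n = 0
    while True:
        data = requests.get(API, params=params).json()
        n += len(data["query"]["recentchanges"])
        if "continue" not in data:
            break
        params.update(data["continue"])
    return n

print("anon:", count_recent_changes("anon"), "bot:", count_recent_changes("bot"))
```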
[13:15:40] the median lag is gone, yay
[13:16:13] so let's see if we can make the average and the stalest go away too
[13:31:09] will the lag go back to 2 days if the bots start editing with >1 edit/min?
[13:34:03] if there is no sudden increase in bots, no
[13:34:45] we seem to be able to deal with about 200-250 epm. since we have <40 bots, 1 epm per bot should be no issue at all. but 1 epm for a bot is far too low
[13:35:06] the editcount right now is at ~80, it was ~800 on tuesday
[13:35:20] you mean epm, right?
[13:35:24] yepp
[13:35:40] yeah, 800 epm would increase the lag again
[13:35:40] Denny_WMDE: we'll need to look into bpywiki on monday
[13:35:57] something's very broken, for some unknown reason, with that wiki
[13:36:02] aude: let's wait another hour. i think it will simply drop. if it doesn't, yes, we need to check it on tuesday
[13:36:10] that's not the new one, is it?
[13:36:13] nah, bpywiki seems special
[13:36:14] that was minwiki?
[13:36:19] ok
[13:36:21] it's always been stuck last
[13:36:26] no not minwiki
[13:36:27] isn't it possible to place the human edits ahead of the queue to avoid frustrating human editors?
[13:36:30] minwiki is fine
[13:36:48] bishnupriya manipuri
[13:36:51] yes
[13:36:57] minwiki is an indonesian language
[13:37:37] and bpy? looks southeast asian too
[13:37:55] bots don't care if they can't see their edits, but I would say "Wikidata is crap" if my edit takes two days to be seen on wikipedia
[13:41:09] lbenedix: yep. that's why we are having the moratorium
[13:41:28] and it's not possible to get human edits to the front of the queue
[13:41:43] there are 24 changes in wb_changes (according to the toolserver) that have bpywiki in them
[13:43:35] Change merged: Jeroen De Dauw; [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/58677
[13:43:58] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58682
[13:44:24] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58683
[13:44:51] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58685
[13:45:32] max timestamp rc_type = 5 in bpywiki is 20130410024345
[13:46:12] lbenedix: that's a great idea
[13:46:37] i've got to write it down so as not to forget it
[13:46:56] lbenedix: interesting idea
[13:49:59] aude: is bpywiki the only one like this?
[13:50:05] would you know?
[13:50:21] Denny_WMDE: need to look more
[13:50:40] if you have the time
[13:50:48] New patchset: Shirayuki; "Spell-check: occured -> occurred" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/59049
[13:53:36] i don't know if 24 is a number that is reasonable (perhaps)...
[13:54:04] there are 132 changes that contain minwiki in the change
[13:55:15] it only contains the diff, but the change dispatcher also considers if an item is connected (via the site links table)
[13:55:29] shall look tomorrow or monday
[13:55:53] ok, thx
[13:58:08] alright, out of battery at any minute
[13:59:32] * aude listening to barbara and other panelists about maps :)
[13:59:56] aude: ping. ;)
[13:59:57] http://www.bl.uk/maps/ is quite nice
[14:00:02] hah! :D
[14:04:54] aude: ohhh that is a beautiful idea
[14:04:57] i am a sucker for maps
[14:52:32] now bpywiki is also quickly reducing its lag
[15:05:59] Denny_WMDE: hmmm, ok
[15:06:04] Hello, is it me or is there a BIG mistake on P107 here: http://www.wikidata.org/wiki/Q5053460
[15:06:23] this is an album, not a person… but I cannot modify it :(
[15:06:55] hsarrazin: (unrelated) the zhwiki article seems to be about a person
[15:07:36] so, there is a wrong link, because the enwiki one is about the album…
[15:08:03] I don't read chinese, sorry :DD
[15:09:11] corrected the wrong link
[15:09:12] oh sorry
[15:09:17] i should have left that to you, hsarrazin
[15:10:05] * aude thinks bots are imperfect
[15:10:32] agrees with aude
[15:10:36] sure there are plenty of errors like that and people will need to find and fix them
[15:11:00] Yes... it's especially bad if the bots then "revert" the user action
[15:11:06] * aude found the movie "Good Morning, Vietnam" was labelled as "Good Morning"
[15:11:07] would it be possible to develop a tool/query/bot/etc to search for possible islands of articles which aren't linked to another group of articles about the same subject
[15:11:35] Romaine: I guess it would
[15:11:53] stuff after the comma gets chopped off, as for a town, state (might be appropriate)
[15:12:12] but an error in some cases
[15:12:28] so, the bots have started again .. let's keep track of the lag
[15:12:38] ok
[15:12:45] thanks to everyone for helping with the moratorium
[15:12:59] thnx Denny_WMDE :)
[15:13:10] that's one more edit for you ;)
[15:13:22] 200-300 edits per minute over the past hour
[15:13:43] aude: https://gerrit.wikimedia.org/r/#/c/58223/2/lib/resources/wikibase.RepoApi/wikibase.AbstractedRepoApi.js do you have an idea about that? (2nd comment from /me)
[15:14:05] hoo: probably a question for Danwe but looking
[15:14:13] and bpywiki is under 100k
[15:15:15] Denny_WMDE: good
[15:15:30] aude: Probably, but I guess I won't be able to catch him before Monday
[15:15:38] Denny_WMDE: nothing seemed obviously wrong looking at the toolserver
[15:15:55] hoo: he might be around later
[15:16:05] * aude wouldn't be surprised but can't promise (of course)
[15:16:11] maybe it is an off-by-one error that made bpywiki not be selected so often for some reason
[15:16:23] I'll idle around, probably ;)
[15:16:25] * aude not good with naming
[15:16:34] Denny_WMDE: could be
[15:17:12] with the dispatch stats, i'll talk with daniel more, but would like to get the entire data in the dispatches table via the api
[15:17:20] how many iw's are left on local Wikipedia projects
[15:17:21] e.g. ask about any particular client
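Until those stats are exposed through the API, aude's screen-scraping approach (mentioned below) might look something like this sketch; the regex is a guess at the special page's table markup rather than its real structure, and would need adjusting to whatever Special:DispatchStats actually emits:

```python
import re
import time
import requests

STATS = "https://www.wikidata.org/wiki/Special:DispatchStats"

def scrape_cells():
    """Fetch the special page and pull out the raw table cell contents."""
    html = requests.get(STATS).text
    return [c.strip() for c in re.findall(r"<td[^>]*>([^<]*)</td>", html)]

while True:
    print(int(time.time()), scrape_cells())  # append to your own stats log
    time.sleep(60)
```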
[15:18:18] * aude checks how many sitelinks there are for bpywiki
[15:18:30] Romaine: I don't know, would love to check it out
[15:18:43] I have the scripts, but I need to set them up somewhere and run them
[15:19:01] we need to know to fix interwiki conflicts too
[15:19:28] bpywiki is under 10k
[15:19:37] hmmm, bpywiki has 25k sitelinks
[15:19:46] for comparison, minwiki has 3k
[15:19:49] Denny_WMDE: good
[15:19:53] nlwiki?
[15:19:58] that's quick
[15:21:07] and done. bpywiki is dispatched
[15:21:19] Romaine: wikidata is connected to 1245003 nlwiki pages
[15:21:26] it includes stuff like categories, though
[15:21:36] not sure how many are not connected
[15:21:59] Denny_WMDE: happy to see a different wiki as stalest :)
[15:22:05] yeah
[15:22:14] now all dispatch lag is gone
[15:22:17] let's keep it this way
[15:22:18] bpywiki was stuck for a while
[15:22:22] yeah
[15:23:23] quick handwavy question to bot owners here: what would be a reasonable server-side limit for a throttle for bots?
[15:23:36] I was suggesting 20 epm… (hopes not to be stoned)
[15:23:39] fyi https://gerrit.wikimedia.org/r/#/c/58708/ (moving dispatcher to ashburn, new hardware) is pending
[15:23:45] currently i suggested 60 edits/min
[15:23:53] * aude thinks you will be stoned
[15:24:18] got negative comments on my patch, saying that, tools and gadgets or not, humans can edit faster than 20 per minute
[15:24:29] we can have a higher limit for humans
[15:24:31] we can apply the limit to bots only
[15:24:32] sure
[15:24:33] not every bot is this fast and I think there is a consensus among the bot owners that the dispatch lag should be at zero
[15:24:50] we currently cannot implement a server side throttle anyway
[15:25:04] if the bot owners can keep it low themselves, I would be more than happy not to do anything
[15:25:13] not sure when it will be deployed but a high-importance todo is to have dispatch stats via the api
[15:25:27] the bots will be able to know what the lag is
[15:25:33] yes, that would be very helpful
[15:25:37] yea that would be great
[15:25:48] * aude screen scraping the special page now :(
[15:25:56] to keep my own stats
[15:26:12] yeah … :P
[15:27:44] alright, time to log off computer
[15:28:10] tomorrow is the hackathon so I can help people more specifically with wikidata and using the api
[15:28:13] stalest lag of 0 minutes, yay. where's the success kid meme picture? :)
[15:28:18] heh
[15:28:20] aude: cool
[15:28:37] the crowd here is pretty technical or consists of many wikidata users
[15:28:50] alright, signing off...
[15:30:00] aude: http://tools.wmflabs.org/legobot/dispatchstats.log
[15:35:54] epm is at 275
[15:49:37] DanielK_WMDE: Quick question, what does https://www.mediawiki.org/wiki/Extension:DataTypes do exactly?
[15:56:25] multichill|work: that goes more to Jeroen
[15:56:42] Denny_WMDE: Was afraid of that
[15:56:51] a data type is composed of several components
[15:57:16] so if you want to have a data type, you take appropriate parsers, formatters, validators, value containers, stick them together, and voila
[15:58:10] Ok. Thank you, I have Katie here so we'll have a look at it tomorrow
[16:15:09] quick question before I engage in a wrong mass action, please?
[16:16:00] hsarrazin: What's the question?
[16:16:44] what is the exact use of P373 (Commons cat.)? is it to link the item to the cat. with the same value, like https://www.wikidata.org/wiki/Q731852 ?
[16:16:45] hi
[16:16:56] or to group items that would go "into" a cat?
[16:17:18] https://www.wikidata.org/wiki/Property:P373 says "name of the Wikimedia Commons category containing files related to this item (without the prefix "Category")"
[16:17:19] should I enter items that don't have a wikipedia article at all?
[16:17:45] I'm trying to link commons creator categories to the corresponding wikidata items :)
[16:18:33] addshore: Am I correct?
[16:18:59] hsarrazin: https://www.wikidata.org/wiki/Q731852 looks correct to me
[16:19:54] Schisma: Please see the Notability policy https://www.wikidata.org/wiki/Wikidata:Notability
[16:20:19] that answers my question, thanks ^^
[16:20:20] Riley: thx - we have a problem on Commons: no articles, but categories are used to identify authors, places, etc. - typically data :)
[16:21:09] You're welcome. And sorry that I wasn't much help, hsarrazin. I don't work with properties much
[16:21:10] Schisma: if you really want to create an item, you may create a pedia article first ;)
[16:21:18] okay
[16:22:10] Riley: me either… I would have liked to have the ability to add a link to Commons… but hey, not before a few months at least, if I understood well…
[16:22:54] and the phase II deployment creates a problem for commons categories: the tool that collected wikilinks does not work anymore…
[16:22:57] :(
[16:24:09] Yea :/
[16:39:32] hsarrazin: there actually is already a lot of stuff among the language versions
[16:43:55] any more comments on my proposal? http://www.wikidata.org/wiki/Wikidata:Property_proposal/Creative_work#MusicBrainz_Identifier_.2F_MusicBrainz_Identifikator_.2F_num.C3.A9ro_d.27identification_MusicBrainz
[17:24:31] Riley: ?
[17:25:10] addshore: I was asking for your opinion on something I wasn't sure about.
[17:25:18] still need it? :p
[17:25:19] Doesn't matter now
[17:25:21] no :)
[17:25:31] awesome ;p
[17:25:39] care to take a look at my bot request on wd? :)
[17:25:45] i'd love to get it approved soonish
[17:25:52] sure. link?
[18:42:13] Anyone that sees the ko-language sitelink as a readable string? http://www.wikidata.org/wiki/Q5508899
[18:42:20] Here it is jammed together
[21:21:17] Question please?
[21:23:41] is it possible to search for items that have (or do not have) a specific property?
[21:28:54] hsarrazin: you can use "what links here" on the property page
[22:15:29] hsarrazin: or magnus' query tool
[22:15:40] i am off
[22:15:43] good night
[22:17:19] Lydia_WMDE: any update on the bug?
[22:18:15] Jasper_Deng: i talked to tobi and he said he'd look at it as soon as he can
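Returning to hsarrazin's question above: the "what links here" suggestion can also be driven over the API, since every item using a property links to its property page. A sketch under that assumption; finding items *without* a given property still needs the dumps or Magnus' query tool, as said in the channel:

```python
import requests

API = "https://www.wikidata.org/w/api.php"

def items_linking_to_property(prop="P373"):
    """List item pages that link to Property:<prop> -- the API flavour
    of "what links here" on the property page."""
    params = {
        "action": "query",
        "format": "json",
        "list": "backlinks",
        "bltitle": "Property:" + prop,
        "blnamespace": "0",  # item namespace only
        "bllimit": "500",
    }
    titles = []
    while True:
        data = requests.get(API, params=params).json()
        titles += [b["title"] for b in data["query"]["backlinks"]]
        if "continue" not in data:
            break
        params.update(data["continue"])
    return titles

print(len(items_linking_to_property("P373")), "items link to P373")
```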