[02:35:50] damn, Magnus's script doesn't run via ResourceLoader
[03:08:20] What did I do wrong to set off the abuse filter? https://www.wikidata.org/w/index.php?title=Q6830775&action=history
[03:10:39] gsdsghs: the "unexpected value"?
[03:11:17] is male an expected value?
[03:11:37] for gender, yes
[03:11:45] so I think the filter is broken
[03:11:52] or doesn't detect it right
[03:12:18] okay
[03:33:34] * Hazard-SJ looks up
[07:33:00] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56377
[07:36:01] New patchset: Tobias Gritschacher; "(Bug 44876) Add toc to EntityView" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52796
[08:09:03] New patchset: Henning Snater; "Changed null parser to be useful." [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57048
[08:17:55] New review: Tobias Gritschacher; "This changeset causes 2 errors in the dataTypes QUnit tests." [mediawiki/extensions/DataValues] (master); V: -1 - https://gerrit.wikimedia.org/r/57050
[08:22:18] New patchset: Henning Snater; "(bug 45002) jQuery.valueview is now a single Widget using composition rather than inheritance" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57050
[08:22:28] New patchset: Henning Snater; "(bug 45002) Created new extension folder for "ValueView" extension." [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57049
[08:22:36] New patchset: Henning Snater; "(bug 45002) jQuery.valueview is now a single Widget using composition rather than inheritance" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57050
[08:22:43] New patchset: Henning Snater; "(bug 45002) additional experts for jQuery.valueview" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57051
[08:24:05] New review: Tobias Gritschacher; "This changeset causes 2 errors in the dataTypes QUnit tests." [mediawiki/extensions/DataValues] (master); V: -1 - https://gerrit.wikimedia.org/r/57050
[09:41:16] New review: Henning Snater; "(18 comments)" [mediawiki/extensions/DataValues] (master); V: -1 C: -1; - https://gerrit.wikimedia.org/r/57050
[09:42:19] hi
[09:42:51] is it possible to have a property that points to two different entities?
[09:42:56] yes
[09:43:26] hi
[09:44:16] just add the property pointing to one
[09:44:22] and then add another claim within that property
[10:10:51] Lydia_WMDE: are you there? got a couple of questions and I'm in a bit of a hurry
[10:11:04] Sannita: yes i'm here
[10:11:05] wasup?
[10:11:24] got a couple of clarifications needed
[10:11:30] sure
[10:11:32] shoot :)
[10:12:26] 1) are Linked Open Data considered for integration in WD?
[10:12:37] (there are people at a meeting and they need those answers)
[10:13:00] yes
[10:13:07] 2) is it possible to consider data as permanent URIs in the database?
[10:13:11] New patchset: Tobias Gritschacher; "(testing) Selenium tests for qualifiers UI (WIP, DNM)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57270
[10:13:25] your questions are a bit unclear, though
[10:13:39] you will be able to use URIs as data types
[10:13:50] I know, they're not mine
[10:13:57] i.e. you will be able to say that Q64 is the same thing as http://example.org/id/Berlin
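A hedged illustration of the kind of link Denny describes above: once entities have stable URIs, an external dataset can assert equivalence with owl:sameAs. This is a minimal sketch using Python's rdflib; the entity namespace shown follows the URI-scheme notes linked later in this log, and http://example.org/id/Berlin is the placeholder from the chat, not a real identifier.

```python
# Minimal sketch (rdflib assumed installed): linking a Wikidata entity URI
# to an external LOD identifier via owl:sameAs, as described above.
from rdflib import Graph, Namespace, URIRef
from rdflib.namespace import OWL

WD = Namespace("http://www.wikidata.org/entity/")  # concept URIs per the URI-scheme notes

g = Graph()
# "Q64 is the same thing as http://example.org/id/Berlin"
g.add((WD.Q64, OWL.sameAs, URIRef("http://example.org/id/Berlin")))

print(g.serialize(format="turtle"))  # rdflib >= 6 returns a str here
```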
[10:13:58] and I can't talk to them, just chat
[10:14:26] all entities in wikidata have persistent URIs
[10:14:36] and we will export everything as LOD
[10:14:43] we are currently working on it, in fact
[10:14:58] and then you can easily link to it
[10:15:21] I know that, they're asking if the data right now are "de facto" URIs, which possibly means that they can be retrieved at any time, anywhere in the world
[10:15:49] there is a bit of a misunderstanding here
[10:15:58] the data is not a URI, entities are identified by URIs
[10:16:03] ok
[10:16:05] thanks
[10:16:17] see also http://meta.wikimedia.org/wiki/Wikidata/Notes/URI_scheme
[10:16:23] this is probably what you are looking for
[10:16:33] I'm just translating their questions, even I have problems understanding what the hell they mean :)
[10:16:45] :) just curious, who are they?
[10:16:51] i know the LOD world a bit
[10:18:21] people from Wikimedia Italy
[10:18:42] ok. so what are they currently doing with LOD? working on DBpedia It?
[10:18:43] they're at a meeting here in Rome, which I couldn't attend because *officially* I'm working :P
[10:19:00] no, they're involved in a project about archaeology
[10:19:18] interesting. and they are publishing their data as LOD?
[10:19:30] and they wonder whether they can link to Wikidata data in their LOD data?
[10:19:31] AFAIK there's the possibility that a huge database of Greek and Latin inscriptions is going to be merged into Europeana
[10:19:46] AND we're trying to obtain the data and merge it into WD
[10:19:46] sweet. Europeana does a lot of LOD publishing.
[10:20:03] because we can link those inscriptions to the WP articles
[10:20:15] and starting an interaction with Europeana
[10:20:18] yes, this is a possibility
[10:20:31] it is still at a very early stage
[10:20:44] so Europeana already has identifiers for their items
[10:20:56] one possibility is to add those to Wikidata at the respective items
[10:21:22] once this mapping is done, it can be further considered to take the actual data and merge it piecewise, respecting the Wikidata community and license
[10:21:43] yep
[10:21:49] that's what we're discussing
[10:21:51] whenever they publish their data as LOD, a mapping to the respective URIs and identifiers can be given in Wikidata
[10:21:56] and that's exactly what I told them
[10:22:07] so there is also no need to put the whole dataset into Wikidata, it can be linked from it
[10:22:14] Wikidata isn't about original research after all
[10:22:36] so basically, what I would tell them is to not worry about the technical issues of this too much
[10:22:43] but rather about the social and community issues
[10:22:44] well, there's a lot of interesting things
[10:23:02] btw, the main objective right now is to import photos to Commons
[10:23:03] just unloading a big dataset of original research into Wikidata might not be completely appropriate
[10:23:07] and text to 'source
[10:23:18] that is probably a good idea
[10:23:25] then we're discussing IF it is possible to involve WD too
[10:23:52] mapping the URIs and IDs, that would be the hardest and most rewarding part
[10:23:56] everything else is rather easy
[10:24:11] yeah, but we've got some workforce ;)
[10:24:30] got another question
[10:25:00] can single statements be linked to as single URIs? I assume not, so how will this turn into LOD?
[10:25:14] actually, single statements have a single URI
[10:25:38] as you can see here: http://meta.wikimedia.org/wiki/Wikidata/Development/RDF
[10:25:42] every statement is reified
[10:25:55] which means it has an identity when exposed as RDF
[10:26:05] so, not only entities have URIs, but also statements in Wikidata
[10:26:16] usually, people dislike that, though
[10:26:19] good, thanks
[10:26:21] and we are quiet about this fact
[10:26:30] :)
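To make Denny's point about reification concrete: each statement gets its own URI and can itself be the subject of triples, which is how references and qualifiers attach to it. A minimal rdflib sketch; the statement URI and the `value`/`reference` predicates here are hypothetical stand-ins, the actual mapping is on the Wikidata/Development/RDF page linked above.

```python
# Sketch of statement reification (hypothetical URIs and predicates for
# illustration; the real scheme is documented on the RDF notes page above).
from rdflib import Graph, Namespace, URIRef, Literal

WD = Namespace("http://www.wikidata.org/entity/")
P = Namespace("http://www.wikidata.org/prop/")       # assumed property namespace
S = Namespace("http://www.wikidata.org/statement/")  # assumed statement namespace

g = Graph()
stmt = S["Q64-P373-abc123"]  # hypothetical statement URI: the statement is itself a resource

# The statement node links the item to its value...
g.add((WD.Q64, P["P373"], stmt))
g.add((stmt, P["value"], Literal("Berlin")))
# ...and because the statement has its own URI, a reference can hang off it.
g.add((stmt, P["reference"], URIRef("http://example.org/some/source")))

print(g.serialize(format="turtle"))
```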
[10:27:25] Denny_WMDE: I owe you a couple of beer I think :)
[10:27:31] *beers
[10:27:44] thanks. I'll change them into Gin&tonics if you don't mind :)
[10:28:21] LOL, it appears we're soulmates :D
[10:29:15] so, I am off. meeting. cu
[10:29:20] cheers
[10:38:37] Lydia_WMDE: knock knock
[11:04:38] lbenedix: hey
[11:06:09] Lydia_WMDE: is there any easy-to-understand explanation of what wikidata is for and how to use (edit) it?
[11:06:25] not yet, no
[11:06:26] "Wikidata for dummies"
[11:06:30] we are working on a video
[11:06:40] that will not explain editing though
[11:06:43] just what wikidata is
[11:06:59] i think it is too early yet to do a proper video for editing
[11:07:04] because it will still evolve
[11:07:18] so i don't want to personally spend too much time on that part on a great video now
[11:07:29] there are millions of edits
[11:07:34] but if anyone wants to do a quick video/screencast please go ahead
[11:07:39] yes i know
[11:08:03] but i can't spend 20h on such a video now when i have to redo it in 2 months
[11:10:42] I totally understand the point, but I don't think that the wikidata interface will change completely
[11:10:45] (that of course doesn't mean anyone else can't - by all means...)
[11:11:00] well right now there are no qualifiers
[11:11:08] there are only 3 data types
[11:11:15] there are no sister projects
[11:11:40] it's not decided how sources are going to be used
[11:11:50] but the first key features are there: labels, descriptions, lang-links
[11:12:37] yeah but those are not the complicated-to-understand ones
[11:12:47] in comparison
[11:13:08] but as i said - if anyone wants to do something right now please go ahead
[11:13:19] it is just not something i can spend time on
[11:15:30] okay
[11:18:53] I have another question ;) Is it possible to get some elements from the live system to the test system?
[11:19:21] I still want to do some usability studies and having only the chemical elements is kind of boring
[11:25:10] lbenedix: http://www.wikidata.org/wiki/Special:Export is probably what you want
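Special:Export is a standard MediaWiki page, so copying a few items over to a test wiki can be scripted. A sketch, assuming the requests library and an arbitrary item list; the exported XML can then be loaded on the test system via Special:Import or importDump.php.

```python
# Sketch: pull a few items off the live site as export XML (requests assumed).
import requests

items = ["Q64", "Q1825830"]  # arbitrary example items

resp = requests.get(
    "https://www.wikidata.org/wiki/Special:Export",
    params={"pages": "\n".join(items), "curonly": 1},  # newline-separated titles, latest revision only
    headers={"User-Agent": "export-sketch/0.1 (example)"},
)
resp.raise_for_status()

with open("items.xml", "w", encoding="utf-8") as f:
    f.write(resp.text)  # feed this to Special:Import on the test wiki
```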
[11:47:15] hi
[11:47:58] hello
[12:23:28] hi
[12:59:25] aude_: Is https://gerrit.wikimedia.org/r/#/c/56367/ ok to be deployed whenever?
[13:01:02] 22441319/72788166 * 100 = 30.83%
[13:08:56] Lydia_WMDE: ^
[13:09:12] looking
[13:10:06] Reedy: katie is on vacation but as far as denny and i know it is ok to go
[13:10:23] but maybe wait for katie?
[13:10:32] otoh
[13:10:38] worst case you can just undeploy it, right?
[13:10:41] then go ahead i'd say
[13:11:49] New patchset: Tobias Gritschacher; "(testing) Selenium tests for qualifiers UI" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57270
[13:12:32] Reedy: Lydia_WMDE: Even if https://gerrit.wikimedia.org/r/#/c/52417/ isn't live it would totally blow AFAIK
[13:13:00] hoo: ?
[13:13:07] sorry - not sure i understand
[13:14:10] Lydia_WMDE: The change Reedy asked about... it probably wouldn't even break w/o that change
[13:14:40] oh yeah it doesn't but the sort order of their language links is wrong (TM)
[13:14:42] :P
[13:14:56] and since they've been waiting for it for quite a bit i'd say let's go for it
[13:15:01] and undeploy if there are issues
[13:15:38] Reedy: ^ and yay @ 30%
[13:21:57] multichill: you added some commons property the other day - can you remind me what it was?
[13:22:05] i can't seem to find it
[13:22:10] 30% of what?
[13:23:53] mabdul: the script that runs on the database to make the search case-insensitive
[13:26:02] New patchset: Tobias Gritschacher; "(testing) Selenium tests for qualifiers UI" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57270
[13:26:09] oh, great. I hate the case sensitivity, esp. the first letter
[13:26:40] Lydia_WMDE: commons category?
[13:27:01] mabdul: jep :) should be fixed once the script has finished and we've flipped a switch
[13:27:08] legoktm: possibly
[13:27:10] hmm i can't find it
[13:27:12] New patchset: Tobias Gritschacher; "(minor) removed commented out stuff from selenium test" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57293
[13:27:26] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57293
[13:27:53] oh here
[13:27:54] https://www.wikidata.org/wiki/Property:P373
[13:28:00] thank you!
[13:29:12] np
[13:29:18] it's not listed on WD:P for some reason
[13:53:37] here's my 1 day of wikidata bot stats stathat report, a screenshot alas http://inkdroid.org/tmp/wikidata-bots-stathat.png
[13:53:51] hmmm
[13:53:57] clearly my bot is slacking off
[13:54:04] * legoktm checks
[13:54:13] was going to use it to figure out who to invite to the botparty
[13:54:23] indeed it is
[13:56:21] legoktm: which is yours again?
[13:56:27] Legobot :)
[13:57:08] one of my fav names :)
[13:57:39] up there w/ FuzzyBot and BetaBot
[13:58:07] I should look at the database size again
[14:00:03] Has anyone worked out roughly how long it will take for the growth to plateau?
[14:00:17] So much as, all the articles of all the wikipedias are there?
[14:01:01] And/or graphed the rate of new item creation?
[14:01:09] Danwe_WMDE: did you see the comments on your changes?
[14:02:06] there is http://stats.wikimedia.org/wikispecial/EN/draft/TablesWikipediaWIKIDATA.htm
[14:06:51] now that's more like it https://www.wikidata.org/wiki/Special:Contributions/Legobot :)
[14:06:57] Reedy: yeah lemme find the chart
[14:07:13] https://www.wikidata.org/wiki/File:Wikidata_item_creation_progress.png
[14:08:32] Shouldn't it be on commons? :p
[14:08:41] Oh, it is
[14:09:21] yeah i just linked off a userpage
[14:11:08] i hope you're tracking now edsu, legobot's contributions are spiking :)
[14:11:16] i *might* have removed the throttle :P
[14:19:17] Tobi_WMDE: taking care of it right now, almost through
[14:32:33] are edit summaries when using wbeditentity broken?
[14:34:02] legoktm: wbsetreference is broken (there's a patch in gerrit)
[14:34:08] but editentity... got an example?
[14:34:28] lemme find one
[14:35:03] https://www.wikidata.org/w/index.php?title=Q8067817&diff=prev&oldid=19061496
[14:35:09] the edit summary should have been
[14:35:16] "Importing link from xxwiki"
[14:35:20] where xxwiki was the source wiki
[14:36:23] I believe the edit summaries of editing statements have already been broken for a long time..
[14:36:33] not statements
[14:36:35] "/* wbsetsitelink-set:1|ruwiki */ Сжимаемость"
[14:36:37] this is sitelinks
[14:36:42] that's what's in the DB
[14:36:42] oh right
[14:36:49] it's not using wbeditentity
[14:36:52] that's wbsetsitelink
[14:36:56] yep
[14:36:58] but that allows an edit summary
[14:37:02] last i checked
[14:37:04] New patchset: Daniel Werner; "(bug 45002) additional experts for jQuery.valueview" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57051
[14:37:15] summary - Summary for the edit.
[14:37:15] Will be prepended by an automatically generated comment.
[14:37:30] and i'm 100% sure my bot is sending the parameter
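The `summary` parameter quoted above is the documented Wikibase API parameter. A hedged sketch of the kind of call legoktm's bot would be making, assuming the requests library; a real edit also needs an authenticated session and an edit token, which are elided here.

```python
# Sketch: setting a sitelink with a custom summary via the Wikibase API.
# A real bot needs a logged-in session and a valid edit token; both are
# placeholders here so the shape of the request stays visible.
import requests

params = {
    "action": "wbsetsitelink",
    "id": "Q8067817",
    "linksite": "ruwiki",
    "linktitle": "Сжимаемость",
    "summary": "Importing link from ruwiki",  # prepended by the auto-generated comment
    "format": "json",
    "token": "...",  # fetch via the API's token mechanism (action=tokens in the 2013 API)
}
r = requests.post("https://www.wikidata.org/w/api.php", data=params)
print(r.json())
```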
[14:38:32] legoktm: never mind me, I was confused with something else that is broken atm
[14:38:33] :P
[14:39:44] :)
[14:40:27] your bot is like going at the speed of light legoktm :p
[14:40:35] at the moment
[14:41:09] heheheh
[14:43:39] New review: Daniel Werner; "(18 comments)" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57050
[14:55:14] New patchset: Daniel Werner; "(bug 45002) additional experts for jQuery.valueview" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57051
[15:08:27] Lydia_WMDE: P373, see https://www.wikidata.org/wiki/Special:Contributions/BotMultichill
[16:26:10] Lydia_WMDE: still in the office?
[16:26:20] Danwe_WMDE: no, at home already
[16:26:27] ok
[16:34:58] New patchset: Daniel Werner; "(bug 45002) jQuery.valueview is now a single Widget using composition rather than inheritance" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57050
[16:40:13] legoktm: Online?
[16:41:24] hi
[16:43:46] Were you working on an easy tagging bot?
[16:43:57] * multichill doesn't want to reinvent the wheel
[16:44:06] easy tagging?
[16:44:39] claimThis.py - P123 Q4567 P533 Q14523452
[16:44:57] oh
[16:44:58] yeah
[16:45:03] And then loops over all the pages in the generator and adds the claims
[16:45:08] Change abandoned: Daniel Werner; "I created a mess when adding a new patch set with changes that should have gone into If387a45e76b7aa..." [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57051
[16:45:15] yes, that will be done by the end of the week
[16:45:23] Probably some simple and smarter duplicate checking, etc
[16:45:35] there is a less-featured version ready, though, if you're using categories or a template as your gen
[16:45:52] https://www.wikidata.org/wiki/User:Legobot/properties.js/Documentation
[16:46:14] That's web-based, I'm talking about a command-line pywikipedia program
[16:46:32] oh
[16:46:45] i can spend some time tonight turning it into a command-line program if you want
[16:47:10] Should probably be less than 200 lines of code (including documentation)
[16:47:24] And should be exactly the same for trunk and rewrite
[16:47:30] eh
[16:47:34] If you could do that, would be nice
[16:47:43] problem is that the implementation for trunk / rewrite of wikidata is sooo different
[16:47:53] New review: Daniel Werner; "(1 comment)" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57051
[16:48:18] New patchset: Daniel Werner; "(bug 45002) jQuery.valueview is now a single Widget using composition rather than inheritance" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57050
[16:48:26] grmbl, I thought people were working on that
[16:48:28] i did implement some caching into rewrite so it's marginally faster
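A sketch of what multichill's claimThis.py could look like. This is a hypothetical reconstruction using today's pywikibot core API for clarity; as the chat notes, the 2013 trunk and rewrite interfaces differed, so treat this as the shape of the tool, not the historical code. The property/item IDs are the throwaway examples from the discussion.

```python
# claimThis.py sketch, e.g.: python claimThis.py -cat:Foo P123 Q4567 P533 Q14523452
# Hypothetical reconstruction with the modern pywikibot core API.
import sys
import pywikibot
from pywikibot import pagegenerators

def main():
    local_args = pywikibot.handle_args(sys.argv[1:])
    gen_factory = pagegenerators.GeneratorFactory()
    # Anything the generator factory doesn't consume is a P/Q pair.
    rest = [a for a in local_args if not gen_factory.handle_arg(a)]
    pairs = list(zip(rest[0::2], rest[1::2]))  # [("P123", "Q4567"), ...]

    repo = pywikibot.Site("wikidata", "wikidata")
    for page in gen_factory.getCombinedGenerator():
        item = pywikibot.ItemPage.fromPage(page)
        item.get()
        for prop, target in pairs:
            if prop in item.claims:  # the simple duplicate check discussed above
                continue
            claim = pywikibot.Claim(repo, prop)
            claim.setTarget(pywikibot.ItemPage(repo, target))
            item.addClaim(claim, summary="Adding claim %s -> %s" % (prop, target))

if __name__ == "__main__":
    main()
```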
[16:54:19] I found strange font rendering on wikidata.org: http://lbenedix.monoceres.uberspace.de/screenshots/nj51n74243_(2013-04-03_18.53.17).png
[16:54:44] lbenedix: i bet that's webfonts
[16:55:03] I bet it looks strange
[16:55:12] you see the same thing at https://am.wikipedia.org/wiki/%E1%8B%8B%E1%8A%93%E1%8B%8D_%E1%8C%88%E1%8C%BD right?
[16:55:45] I'm talking about the word "Vorschau"
[16:56:20] there are many more fonts for the word "Vorschau"
[16:57:24] er
[16:57:28] pretty sure it's still webfonts
[16:57:44] What do you mean?
[16:57:58] https://am.wikipedia.org/wiki/%E1%8A%A0%E1%89%A3%E1%88%8D:Legoktm
[16:58:01] see the weird fonts?
[16:58:16] I think so
[16:58:39] the website (wikipedia) is specifying a font to use
[16:58:46] but is there any good reason to show the same word in different fonts on wikidata.org?
[16:58:52] because if it didn't use that font
[16:59:00] it's unlikely you would be able to see those characters
[16:59:36] used to be https://www.mediawiki.org/wiki/Extension:WebFonts and apparently it got merged with ULS
[16:59:51] I think I would see the word "Vorschau"
[17:01:50] I'm not talking about the article name
[17:02:10] ahhh... I think it's a gadget that adds the preview function
[17:02:42] btw, are there bots that remove the langlinks from the wikipedia articles?
[17:04:01] lbenedix: yup. i run one, addshore does too
[17:04:40] New patchset: Daniel Werner; "(bug 45002) additional experts for jQuery.valueview" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57321
[17:04:58] how many articles in dewiki are cleaned already?
[17:05:42] not sure, but this is how many are left: bots.wmflabs.org/~addbot/
[17:06:25] my bot runs off the latest dumps so it's slower but it has more features
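For a sense of what these cleanup bots do: once an article's language links live on Wikidata, the local [[xx:Title]] lines in the wikitext become redundant and can be stripped. A toy sketch of the text transformation only; real bots like Addbot first verify that every link they remove actually exists on the connected Wikidata item.

```python
# Toy sketch of the langlink-stripping step. `lang_codes` stands in for the
# full list of interwiki language prefixes; real bots confirm the links are
# present on the Wikidata item before removing anything.
import re

lang_codes = ["en", "de", "fr", "ru"]  # illustrative subset

def strip_langlinks(wikitext: str) -> str:
    pattern = re.compile(r"\[\[(%s):[^\]]+\]\]\n?" % "|".join(lang_codes))
    return pattern.sub("", wikitext)

text = "Some article text.\n\n[[de:Beispiel]]\n[[fr:Exemple]]\n"
print(strip_langlinks(text))  # -> "Some article text.\n\n"
```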
[17:07:07] what happens if someone removes the links by hand?
[17:08:27] Editing the statements is quite difficult
[17:08:47] New review: Daniel Werner; "(1 comment)" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57048
[17:09:45] I have no idea what the available properties are
[17:09:59] lbenedix: https://www.wikidata.org/wiki/Wikidata:P
[17:10:26] New patchset: Daniel Werner; "(bug 45002) additional experts for jQuery.valueview" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57321
[17:11:40] How should my grandmother get from http://www.wikidata.org/wiki/Q1825830 to the list of properties?
[17:12:27] um
[17:12:31] you don't?
[17:12:48] file a bug maybe? :)
[17:14:01] I'm not a friend of bugzilla
[17:15:07] post on project chat then?
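The WD:P page linked above is maintained by hand (and, as noted earlier in the log, P373 was missing from it at the time). The authoritative list is the Property namespace itself, which can be enumerated via the standard MediaWiki API. A sketch assuming the requests library; namespace 120 is the Property namespace on Wikidata.

```python
# Sketch: enumerate all property pages straight from the API instead of the
# hand-maintained WD:P listing (requests assumed; ns 120 = Property).
import requests

params = {
    "action": "query",
    "list": "allpages",
    "apnamespace": 120,
    "aplimit": "max",
    "format": "json",
}
session = requests.Session()
while True:
    data = session.get("https://www.wikidata.org/w/api.php", params=params).json()
    for page in data["query"]["allpages"]:
        print(page["title"])  # e.g. "Property:P373"
    if "continue" not in data and "query-continue" not in data:
        break
    # handle both the current and the 2013-era continuation formats
    params.update(data.get("continue") or data["query-continue"]["allpages"])
```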
[17:17:23] If you have to, go to WMDE and graffiti it on the wall.
[17:17:45] haha
[17:18:02] New patchset: Daniel Werner; "(bug 45002) jQuery.valueview is now a single Widget using composition rather than inheritance" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57050
[17:18:18] New patchset: Daniel Werner; "(bug 45002) additional experts for jQuery.valueview" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57321
[17:19:02] WMDE is not far away...
[17:19:26] lbenedix: Get some spray paint and go then :)
[17:19:27] maybe for you :P
[17:20:06] Not far for me either. Just a quick plane ticket
[17:20:20] :x
[17:20:38] together we could paint a big nyancat over the whole building
[17:20:42] AHAHAHA
[17:20:43] YES
[17:21:01] Indeed, that would be fun.
[18:00:39] Change merged: Jeroen De Dauw; [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/56766
[18:02:41] New review: Jeroen De Dauw; "(2 comments)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56183
[18:03:04] New patchset: Jeroen De Dauw; "Change DataValue serialization of EntityId to use prefixed id" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56183
[18:05:28] New patchset: Jeroen De Dauw; "Change DataValue serialization of EntityId to use prefixed id" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56183
[18:07:21] Change abandoned: Jeroen De Dauw; "Abandoning this as I cannot in good conscience provide this "improvement"." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56183
[18:42:50] legoktm: not gonna add your new scans to my bot yet
[18:42:56] :(
[18:43:03] will do tomorrow!
[18:43:14] and then we will have a lovely idea of how many articles there are left :p
[18:43:22] addshore: OVER 9000
[18:43:26] I love refreshing http://bots.wmflabs.org/~addbot/ ;p
[18:43:59] That's quite neat
[18:44:17] :D
[18:44:19] Then categories and stuff that don't have links now
[18:44:31] There's another 10 million items to go to wikidata then, at least
[18:44:50] Maybe, at a push, halfway migrated
[18:44:56] categories are included in those numbers
[18:45:05] Oh? Nice
[18:45:08] Reedy: my bot is importing categories from all 500k+ wikis as we speak :)
[18:45:25] Why doesn't someone do the small ones so the list is shorter :p
[18:45:34] https://www.wikidata.org/wiki/Special:NewPages hit "show bots" :P
[18:45:42] Reedy: i'm gonna do those tonight
[18:45:56] ^^
[18:46:04] Yeah, I check recentchanges now and again
[18:46:05] hmm legoktm maybe I will start importing your new lists now :P
[18:46:15] Looks like at least a 3rd of that list are < 1000 items
[18:47:09] yep :)
[18:47:11] we will have a perfectly up-to-date picture after I import these last dump lists :)
[18:47:25] heh
[18:47:59] So we'll have in the region of 16-18 million items when the 'pedias are done?
[18:48:16] 348 epm muahahah
[18:48:25] Nooice
[18:48:38] legoktm: that's nothing ;p
[18:48:46] Just over 600 for 9.5M
[18:49:28] Reedy: my bot has created 6m, 7m, and 9m. I'm going for 10m
[18:50:13] well, we're over 9.5M now
[18:51:22] :D
[18:53:14] legoktm: where are your txt files?
[18:53:27] It's going to be amusing when the mass editing pretty much just stops
[18:53:31] /~legoktm/dump_scans?
[18:53:32] http://bots.wmflabs.org/~legoktm/dump_scans/
[18:53:34] yeah
[18:53:34] :)
[18:53:36] * Reedy goes to find out database sizes
[18:53:47] or /data/project/legoktm/wikidata/*wiki.txt
[18:54:00] are they still in the old place too? :D
[18:54:06] [TXT] enwiki.txt 03-Apr-2013 16:21 49M
[18:54:07] yeah
[18:54:10] :D
[18:54:11] i used cp instead of mv
[18:54:18] I don't really have to modify my script then :P
[18:54:20] [TXT] dewiki.txt 03-Apr-2013 16:21 19M
[18:54:29] that doesn't seem to agree with your count?
[18:54:38] we will see :)
[18:54:38] probably cuz the enwiki dump is old...
[18:54:41] dewiki = 228.708226770163 GB
[18:54:46] wikidatawiki = 50.339569091797 GB
[18:54:47] Slackers.
[18:54:58] I'll import them all and then review the new total :P
[18:55:18] i'm probably just about to overload the labs bots instances again :P
[18:55:55] heheh
[18:56:05] http://p.defau.lt/?yZ2PjPMqTD1ZG1LexrWmqQ
[18:56:07] billinghurst already asked me if i was overloading bots ;P
[18:56:13] ^ db size breakdowns for anyone that is interested
[18:56:22] hmmm
[18:56:25] so we need more labels
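The paste with the per-table breakdown has long since expired. Breakdowns of that shape are typically produced from information_schema; a sketch of one common way to do it (PyMySQL assumed, host/credentials are placeholders, and this is not necessarily the exact query Reedy ran):

```python
# Sketch: per-table size breakdown like the one pasted above, built from
# information_schema (PyMySQL assumed; connection details are placeholders).
import pymysql

conn = pymysql.connect(host="db35", user="wikiadmin", password="...",
                       database="information_schema")
sql = """
    SELECT table_name,
           ROUND(data_length  / 1024 / 1024 / 1024, 2) AS data_gb,
           ROUND(index_length / 1024 / 1024 / 1024, 2) AS index_gb
    FROM tables
    WHERE table_schema = 'wikidatawiki'
    ORDER BY data_length + index_length DESC
"""
with conn.cursor() as cur:
    cur.execute(sql)
    for name, data_gb, index_gb in cur.fetchall():
        print(f"{name}\t{data_gb}G\t{index_gb}G")
```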
[18:56:50] i wonder if anyone has tried copying all the en labels to en-gb with simple regex rules...
[18:58:37] no :P
[18:58:40] I doubt it xD
[18:59:11] * Reedy glares at legoktm
[18:59:33] It's an idea!
[18:59:58] I could also write a bot to fill up the revision table by making an edit and reverting itself constantly
[19:00:04] plus that would fill up recentchanges too
[19:00:08] and checkuser
[19:00:09] :PPPP
[19:00:32] | wikidatawiki.wb_id_counters | 0.00M | 0.00G | 0.00G | 0.00G | 0.00 |
[19:00:36] Reedy: what's that table?
[19:01:02] nfi
[19:01:06] It's a wikidata table
[19:01:13] i realize :P
[19:01:22] can you take a look at what's in it?
[19:01:31] nothing I bet
[19:01:40] looks like that way :P
[19:01:45] Oh
[19:01:46] | 9504197 | wikibase-item |
[19:01:46] | 386 | wikibase-property |
[19:02:00] 9504277
[19:02:24] oh
[19:02:31] is that just
[19:02:41] keeping track of the highest id?
[19:02:45] to assign the next one?
[19:03:10] don't think we can make that table much bigger :P
[19:07:24] why would we have more log entries than edits?
[19:07:30] is that because of the patrol log?
[19:18:01] legoktm: improved the statistics for my DB
[19:18:02] Total in DB = 6,904,071, Total bot checked = 3,453,595, Total to go = 3,450,476
[19:18:06] yay
[19:18:36] addshore: You're importing new items, right?
so hopefully your dumps have those 3.5 million articles in them ;p
[19:19:23] multichill: no :P
[19:19:43] Your nickname suggests otherwise ;-)
[19:33:57] I know :P
[19:34:55] it's amazing how many people thought the 'add' bit of my bot name meant it should be adding things; they messaged me saying that my bot was removing things :P
[19:35:56] I have a strange bug in Wikidata: I wanted to add WLM in capitals, but doing that the page makes it Wlm, and after refreshing it is WLM
[19:36:05] I have seen such before
[19:38:06] New patchset: Jeroen De Dauw; "Get rid of GenericArrayObject usage" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/57347
[19:40:14] New patchset: Jeroen De Dauw; "Specify that PHPUnit should be 3.7.*" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/57349
[20:05:45] Lydia_WMDE: when will the datatypes be available in Wikidata?
[20:24:24] Hi.
[20:24:37] Is Wikidata primarily for interwikis or for facts?
[20:25:12] Somebody has merged items about Mozilla add-ons and Mozilla extensions. Maybe no Wikipedia has articles on both, but the things are different.
[20:28:17] AVRS: what item do you mean? please give the id
[20:28:40] Mozilla extension (Q2467894)
[20:28:46] (Deletion log) . . Vogone (A) (talk | contribs) deleted page Q7593080 (RfD: Duplicate of Mozilla extension (Q2467894).)
[20:29:07] https://www.wikidata.org/wiki/Q604841 ?!
[20:29:28] the one above is about add-ons
[20:29:32] oh, https://www.wikidata.org/wiki/Wikidata:Interwiki_conflicts#Q2467894.2FQ7593080
[20:29:57] benestar: no, that's about the repository
[20:30:04] ah
[20:30:34] AVRS: I think you found the right place to discuss that
[20:30:46] New patchset: Jeroen De Dauw; "Removed Diff\Exception" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/57407
[20:30:47] New patchset: Jeroen De Dauw; "Made Diff fully standalone from MediaWiki" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/57408
[20:30:47] New patchset: Jeroen De Dauw; "Improved docs and code style" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/57409
[20:31:32] benestar: I have commented at https://www.wikidata.org/wiki/Wikidata_talk:Interwiki_conflicts#Sibling_conflicts with an idea on what could be done with such things, so that interwikis do not override facts
[20:31:47] benestar: although my idea only really concerned 2-article
[20:32:16] That is, a property which would make Wikipedias show interwikis from both.
[20:34:48] Thanks, I've reopened the request.
[20:57:06] Empress? =)
[21:02:26] New patchset: Jeroen De Dauw; "Made Diff fully standalone from MediaWiki" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/57408
[21:24:45] ..a greasy burger...
[21:25:05] ..I want a greasy burger..
[21:25:15] ..go get me a greasy burger...
[21:25:41] ..no? ..anyone? darn
[21:41:58] New review: Daniel Werner; "(1 comment)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57073
[22:20:07] FYI, the running of the rebuildTermsSearchKey.php script has been stopped.
[22:25:36] Reedy: why?
[22:26:08] Most lagged slave was nearly 18,000 seconds lag
[22:26:15] *lagged
[22:26:16] [1] https://www.wikidata.org/wiki/
[22:26:29] And was only going to get worse for the next 7-8 hours
[22:26:51] DB dump time, I think
[22:27:18] ah ok
[22:27:19] I've noticed a line that looks slightly screwy
[22:27:20] | 247135 | 41253 | item | bn | alias | � | |
[22:27:29] but term_search_key is ''
[22:27:31] hmmm that looks strange indeed
[22:27:53] I see it as a square in term_text on shell
[22:27:56] can't say more about it unfortunately - fear that'll have to wait for daniel
[22:27:59] or katie maybe
[22:28:02] select min(term_row_id) from wb_terms where term_search_key = '';
[22:28:11] ^ So the query to find out where we need to work from won't work :D
[22:28:26] heh
[22:29:02] Shouldn't be a big deal... As it's indexed, using --only-missing it should attempt to update that row, then jump to find the next ''
[22:29:49] *nod*
[22:30:10] * Reedy grumbles
[22:30:11] mysql:wikiadmin@db35 [wikidatawiki]> select min(term_row_id) from wb_terms where term_row_id > 247140 AND term_search_key = '';
[22:30:11] +------------------+
[22:30:11] | min(term_row_id) |
[22:30:11] +------------------+
[22:30:11] | 254476 |
[22:30:33] Another square and ''
[22:30:45] I'll log a bug as a reminder this may need poking
[22:32:06] thx
[22:49:49] Reedy, the script does some normalization, basically stripping off whatever seems to not be a char
[22:51:11] I don't remember exactly what it does, check it out.. but if something prints out in your term as squares it is not unlikely it will be stripped.
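jeblad's description of the normalization suggests why those rows stay at '': stripping everything that doesn't look like a word character can reduce a term to the empty string, so a row that has already been processed still matches `term_search_key = ''` and defeats the min() query. A rough model of that behaviour in Python (not the actual rebuildTermsSearchKey.php logic, which is PHP and not quoted in this log):

```python
# Rough model of the search-key normalization being discussed: lowercase the
# term and strip non-word characters. NOT the real PHP implementation - just
# enough to show how a lone replacement character collapses to ''.
import re

def search_key(term_text: str) -> str:
    key = term_text.lower().strip()
    return re.sub(r"[^\w\s]", "", key)

print(repr(search_key("Berlin")))   # 'berlin'
print(repr(search_key("\ufffd")))   # '' - already processed, yet still matches term_search_key = ''
```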
[23:06:47] New patchset: Daniel Werner; "(bug 45002) jQuery.valueview is now a single Widget using composition rather than inheritance" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57050
[23:16:17] [05:26:08 PM] Most lagged slave was nearly 18,000 seconds lag <-- really???
[23:16:25] Yeah
[23:16:32] did that show up in the maxlag?
[23:16:36] in pmtpa so you muggles wouldn't notice
[23:16:41] oh
[23:16:52] otherwise my bot would have slept for 18000 seconds :PPP
[23:17:05] Reedy: Toolserver can do much better... beginners :P
[23:17:33] ouch, the Esperanto pages in wikidata are in need of considerable corrections
[23:18:09] tempodivalse: It's a wiki
[23:18:14] Yup, {{sofixit}}
[23:18:14] [3] https://www.wikidata.org/wiki/Template:sofixit
[23:18:18] which is exactly what I'm doing :)
[23:18:33] It's fun to contribute to new projects
[23:18:38] help them get on their feet
[23:20:55] :)
[23:22:57] Wikidata and eo.wikivoyage should keep me busy for a while yet
[23:23:52] well let us know if you have any questions :D
[23:24:09] Thanks!
[23:26:22] So only 11 wikipedias can use info from Wikidata so far?
[23:26:37] How can we help expand this functionality to other projects?
[23:27:23] it's a slow rollout
[23:27:38] i'm not sure when the next deployment is
[23:28:00] since the project is still in beta, only projects that opted in have it right now
[23:28:20] soon it will be available to all projects though
[23:30:34] ok
[23:30:44] when exactly did Wikidata start? November 2012, I think it was?
[23:32:59] https://www.wikidata.org/?diff=1
[23:36:16] 20:00, 26 October 2012
[23:36:21] it went online a bit later
[23:36:30] nice!
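A footnote on the maxlag exchange above: well-behaved bots send maxlag=N with each API request, and when replication lag exceeds N seconds the API refuses the request and the bot waits, which is why legoktm's bot "would have slept for 18000 seconds" had the lagged slave been in the pool the API reports on. A minimal sketch of the standard retry loop, assuming the requests library:

```python
# Sketch of the standard maxlag retry loop (requests assumed).
import time
import requests

def api_get(session, params, maxlag=5):
    params = dict(params, maxlag=maxlag, format="json")
    while True:
        data = session.get("https://www.wikidata.org/w/api.php", params=params).json()
        err = data.get("error", {})
        if err.get("code") != "maxlag":
            return data
        # If the server reports its current lag, wait that long; otherwise
        # fall back to a short fixed sleep rather than piling on requests.
        time.sleep(max(5.0, float(err.get("lag", 5))))

session = requests.Session()
print(api_get(session, {"action": "query", "meta": "siteinfo"}))
```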