[02:28:54] * Sven_Manguard throws legos at legoktm
[02:29:08] hey!
[02:29:20] I'd throw something at you but I'm not sure what a sven is.
[04:11:24] * legoktm hugs Hazard-SJ
[04:11:56] * Hazard-SJ sighs
[04:12:05] did you ever get a chance to write that RfBot archiver?
[04:12:17] Not yet
[04:12:45] ok
[04:12:55] I came online fairly late and was working on https://commons.wikimedia.org/wiki/Commons:Bots/Requests/Hazard-Bot_7
[04:13:04] i started working on an implicatorbot clone
[04:13:54] oh
[04:14:03] Hazard-SJ: you just need to update the "Commons category" property
[04:16:31] legoktm: That exists :O
[04:16:36] * Hazard-SJ sighs and looks
[04:16:54] look at the linked item again
[04:18:25] Found it
[04:18:51] * Hazard-SJ tries to figure out how to find it via the API
[04:19:30] the problem is
[04:19:37] there's no way to go from commons cat --> item
[04:19:52] you could o
[04:20:01] commons cat --> wp article --> item
[04:26:43] legoktm: What's the best way to do that?
[04:27:59] ok, so you start with https://commons.wikimedia.org/wiki/Category:Mr._Hankey,_the_Christmas_Poo
[04:28:10] then you go to the en article at https://en.wikipedia.org/wiki/Mr._Hankey,_the_Christmas_Poo
[04:28:20] at which point you end up at https://www.wikidata.org/wiki/Q927930
[04:29:27] i *think* multichill has a mysql db of all the commons categories
[04:29:34] with their items
[04:31:50] legoktm: From either the {{w}} or {{Sisterwikipedia}}, I'm assuming, but in which case, they don't exist for all cats.
[04:31:57] Yeah
[04:32:02] So basically it's going to a pain
[04:32:20] You should talk to multichill though
[04:32:27] He imported 99% of all the commonscat properties
[04:33:27] legoktm: And speaking of mysql, I think I might have found a task you might be able to do via MySQL, otherwise I'll do it via API
[04:33:39] Oh what is it?
[04:34:22] https://simple.wikipedia.org/wiki/Wikipedia_talk:Bots#Bot_job_request_--_identify_long_stub_articles
[04:38:27] bleh
[04:38:31] I don't help simple
[04:38:32] But
[04:38:37] I'll show you where the query is
[04:38:53] https://github.com/mzmcbride/database-reports/blob/master/enwiki/longstubs.py
[04:41:18] Heh.
[04:43:30] Hi Susan
[04:44:05] OK, thanks, legoktm. I assume "mzmcbride" pinged Susan? :P
[04:46:27] Or database-reports.
[04:48:31] :P
[07:55:25] hey all i need to report a problem
[07:55:30] sup?
[07:55:43] http://www.wikidata.org/wiki/Q3050758 needs to be merged with http://www.wikidata.org/wiki/Q2481769
[07:56:39] {{done}}
[07:57:20] t
[07:57:21] tx
[07:57:38] in the future you can do it yourself :P
[07:57:48] :x
[10:58:02] anyone want to make an adhoc task force and annotate all wikipedia languages with the wikimedia language property?
[10:58:08] i am 80 down, 200 to go :)
[12:36:39] mhm, I get Saving the translation failed: Unknown error: "tpt-translation-restricted" when I want to save one particular translation of as page...
[12:37:12] does anyone here know what the error means and why I can´t save the transaltion?
[12:37:17] translation*
[12:37:30] whats the page and what language?
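
An illustrative sketch, not part of the chat: the commons cat --> wp article --> item hop that legoktm walks through at 04:27-04:28, done over the public action APIs. It assumes the Commons category page carries a plain en: interlanguage link (as noted at 04:31, categories that only use {{w}} or {{Sisterwikipedia}} would need wikitext parsing instead) and that the Python requests library is available; the category and the expected item Q927930 are the examples from the log.

    import requests

    COMMONS_API = "https://commons.wikimedia.org/w/api.php"
    WIKIDATA_API = "https://www.wikidata.org/w/api.php"

    def enwiki_title_for_commons_cat(category):
        """Follow the en: interlanguage link on a Commons category, if it has one."""
        data = requests.get(COMMONS_API, params={
            "action": "query", "prop": "langlinks", "titles": category,
            "lllang": "en", "format": "json",
        }).json()
        page = next(iter(data["query"]["pages"].values()))
        links = page.get("langlinks", [])
        return links[0]["*"] if links else None

    def item_for_enwiki_title(title):
        """Resolve an enwiki title to its Wikidata item id via wbgetentities."""
        data = requests.get(WIKIDATA_API, params={
            "action": "wbgetentities", "sites": "enwiki", "titles": title,
            "props": "info", "format": "json",
        }).json()
        for qid in data.get("entities", {}):
            if not qid.startswith("-"):  # unconnected titles come back keyed as "-1"
                return qid
        return None

    title = enwiki_title_for_commons_cat("Category:Mr. Hankey, the Christmas Poo")
    if title:
        print(title, "->", item_for_enwiki_title(title))  # expected: Q927930

Going the other way (item --> commons cat) is what the "Commons category" property covers, which is why multichill's imported mapping mentioned at 04:32 is the easier starting point when it is available.
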
[12:38:12] Dutch and it is the display title of [[Wikidata:Bots]]
[12:38:13] [1] https://www.wikidata.org/wiki/Wikidata:Bots
[12:38:37] oh yeah
[12:38:49] so that page wasnt finalized and someone marked it for translation
[12:38:53] and it got translated in italian
[12:38:56] -.-
[12:39:09] but since it wasnt finalized, we restricted translation to italian so we didnt lose the existing one
[12:39:35] oh
[12:39:38] :/
[12:40:09] but the error isn´t that informative to be hinest
[12:40:11] honest
[12:40:39] yeah....
[14:26:05] Our Qunit code coverage is rather poor :/
[15:18:58] New patchset: Hoo man; "Refactor wikibase.store - introduce AbstractedRepoApi" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58223
[15:34:18] New patchset: Hoo man; "Refactor wikibase.store - introduce AbstractedRepoApi" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58223
[15:55:59] hello Denny_WMDE
[16:14:16] aude: The whole refactor only gave us about 15-20% less data to load for linkItem :| I hoped for much more...
[16:19:14] New patchset: Jeroen De Dauw; "Factor ClientStoreFactory into WikibaseClient." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/59711
[16:25:40] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/59711
[17:04:48] New patchset: Jeroen De Dauw; "Moved class registration of WikibaseRepo extension into dedicated file" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60209
[17:07:12] New patchset: Jeroen De Dauw; "Remove obsolete test files" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60210
[17:13:54] New patchset: Jeroen De Dauw; "Remove obsolete test files" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60210
[17:33:13] hi liangent
[17:52:07] New patchset: Jeroen De Dauw; "Remove usage of LibRegistry from WBC extension" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60213
[17:57:26] New patchset: Jeroen De Dauw; "Remove usage of LibRegistry in ClaimSaver" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60214
[17:58:50] New patchset: Jeroen De Dauw; "Remove usage of LibRegistry in WBR API" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60215
[18:00:57] New patchset: Jeroen De Dauw; "Remove usage of LibRegistry in WBR SpecialPages" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60216
[18:05:31] New patchset: Jeroen De Dauw; "Use Property->setDataTypeId rather then ->setDataType so no DataType construction is needed" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60217
[18:13:39] New patchset: Jeroen De Dauw; "Moved class registration of WikibaseLib into dedicated file" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60218
[18:37:28] New patchset: Jeroen De Dauw; "Load only lib and repo settings for repo" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60219
[18:37:28] New patchset: Jeroen De Dauw; "Load only lib and client settings for client" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60220
[18:42:08] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60218
[18:48:22] New patchset: Jeroen De Dauw; "Move HashArray from lib to DataModel, as it is directly needed in DataModel and not in lib" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60221
[18:57:33] New patchset: Jeroen De Dauw; "Move classes from lib to DataModel, as they are directly needed in DataModel and not in lib" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60221
[19:18:12] can someone tell me how many subscribers the wikidata-mailinglist has?
[19:26:39] > 0
[19:42:27] lbenedix1: https://lists.wikimedia.org/mailman/roster/wikidata-l
[19:43:57] thanks! Is there such a site for all mailinglists?
[19:44:44] lbenedix1: http://lists.wikimedia.org/mailman/listinfo
[19:45:32] * lbenedix1 tries to remember his pass for wikitech-l
[19:46:08] (The subscribers list is only available to the list administrator.)
[19:46:58] gn
[19:47:13] tries to remember his pass for wikitech-l
[19:47:22] hehe
[19:49:03] nope, my password doesn't work
[19:50:18] I asked for the number of subscribers on wikitech yesterday and I was told that its 1555
[19:50:50] ask lydia tomorrow
[19:51:04] I got all I need
[19:51:09] she has this kind of numbers
[19:51:29] would have been great if the subsriberlist of wikitech was public
[19:51:56] * lbenedix1 dont understand why the list is only visible to admins... the mails with adresses are public...
[19:53:45] to prevent spamming, I'd assume
[19:54:18] my spambot would harvest the archive for mail-adresses
[19:55:49] the list of actual subscribers might be smaller than the list of all contributors to the list
[20:15:42] dunno why
[20:16:01] is it the subcriberlist for wikien-l or wikitech-l non-public
[20:16:03] ?
[20:27:18] yepp
[20:27:36] on listinfo of wikitech-l you can read: (The subscribers list is only available to the list administrator.)
[20:30:58] well, then it's for consistency
[21:07:52] Lydia_WMDE: https://en.wikipedia.org/wiki/Wikipedia:Requests_for_comment/Wikidata_Phase_2
[21:09:10] rschen7754: thx - will have a look
[21:10:47] rschen7754: looks good :) happy to finally see that coming together
[21:11:01] yeah, i didn't write it though :P
[21:11:22] hehe
[21:11:24] fair enough
[21:35:30] rschen7754, it's interesting, but I think there's a conceptual gap
[21:35:50] on the RFC?
[21:35:52] the RFC talks about infoboxes and article text, but doesn't really consider non-infobox article elements such as lists or tables
[21:36:26] Wikidata in running prose is crazy, I agree, but wikidata used to create, say, a list of all the towns in a county and their populations and so on, could be really valuable.
[21:36:31] *nods*
[21:36:47] shimgray: that's phase 3
[21:37:40] we can use it in phase 2, though. as soon as you can invoke the wikidata information for a different page to the one you're on, you'll be able to make lists/tables
[21:38:25] ie, line 1 is city [q100000], so population is [population:q100000], etc.
[21:38:38] phase 3 is more about the software generating it all for you :-)
[21:39:27] i suppose
[23:14:39] JohnLewis: what did I tell you about making mass requests?
[23:14:56] Whoops. Sorry.
[23:15:54] JohnLewis: Whats the problem?
[23:16:05] Riley: Huh?
[23:16:10] Jasper_Deng: *
[23:16:11] :P
[23:16:38] Riley: he posted all those requests individually, so now you get to go on a frenzy of marking them done. also idk why you requested deletion, being a sysop, but I guess you're unsure/involved.
[23:17:30] Jasper_Deng: No editor is required to make batch requests, besides, it takes much longer. As for me, I was testing how the tool worked since I have never used it.
[23:17:44] * Jasper_Deng wasn't /forcing/ JohnLewis to
[23:17:50] Riley: it just makes more work for you and I
[23:18:30] If it is too much work for you, don't delete the pages.
[23:18:37] that isn't it
[23:18:47] it's then marking them all as done
[23:19:15] If it is too much work for you, don't get involved in the process by deleting the pages*
[23:19:33] nothing is too much
[23:19:43] it's just a bit less efficient
[23:20:28] and?
[23:20:45] you know how horribly backlogged it can get
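
Again only a sketch, not from the chat: the kind of Wikidata-driven list/table that shimgray and legoktm discuss at 21:36-21:38, approximated off-wiki by pulling one property for a handful of items through wbgetentities and printing wikitext-style table rows. The property id P1082 (population), the example city items, and the quantity snak layout are assumptions of this illustration, not something stated in the log.

    import requests

    API = "https://www.wikidata.org/w/api.php"
    POPULATION = "P1082"  # assumed to be the population property

    def fetch_entities(ids):
        """Fetch labels and claims for a batch of items in one API call."""
        data = requests.get(API, params={
            "action": "wbgetentities", "ids": "|".join(ids),
            "props": "labels|claims", "languages": "en", "format": "json",
        }).json()
        return data["entities"]

    def population_of(entity):
        """Read the first population claim, if the item has one."""
        claims = entity.get("claims", {}).get(POPULATION, [])
        if not claims:
            return None
        value = claims[0]["mainsnak"]["datavalue"]["value"]
        return value["amount"].lstrip("+")

    # Example items standing in for "all the towns in a county":
    for qid, entity in fetch_entities(["Q64", "Q90", "Q1055"]).items():
        label = entity.get("labels", {}).get("en", {}).get("value", qid)
        print("| %s || %s" % (label, population_of(entity) or "?"))

The hypothetical [population:q100000] syntax in the log expresses the same idea as an on-wiki lookup; once that kind of access to other pages' items exists, the table rows can be generated in the article itself rather than by a script like this.
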