[00:00:00] New patchset: Daniel Werner; "(Bug 48622) [DataValues] Introduction of UnUnserializableValue" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/64543
[00:01:31] New patchset: Daniel Werner; "(Bug 48622) [DataValues] Introduction of UnUnserializableValue" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/64543
[00:03:08] New patchset: Daniel Werner; "Update of the jQuery.wikibase.Claimview's isValid behavior" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64544
[00:03:11] New patchset: Daniel Werner; "(Bug 48622) When unserializing Snaks, consider faulty data value JSON" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64545
[00:08:19] New patchset: Daniel Werner; "Change calendar IDs to match Denny's guidelines" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/64546
[00:08:47] New patchset: Daniel Werner; "(Bug 48622) When unserializing Snaks, consider faulty data values JSON" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64545
[00:09:14] New patchset: Daniel Werner; "[time.js/DataValues] Change calendar IDs to match Denny's guidelines" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/64546
[00:16:39] New patchset: Daniel Werner; "(48145) Moving "Time" data type out of experimental" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64547
[00:32:32] New patchset: Daniel Werner; "(48145) Moves "Time" data type out of experimental" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64547
[00:39:50] New patchset: Daniel Werner; "(Bug 48622) When unserializing Snaks, consider faulty data values JSON" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64545
[04:47:53] Hi, could someone please give me access to #wikidata-admin?
[04:49:07] The_Anonymouse: doing
[04:51:11] The_Anonymouse: check your invites
[04:51:48] rschen7754: thanks
[04:51:52] np
[10:43:54] hello, does anybody here know about the Authority control tool?
[10:50:58] hi Tpt_ :)
[11:02:42] hsarrazin: Hi!
[11:02:58] Tpt_: how are you?
[11:03:09] Fine, and you?
[11:03:27] ok…
[11:03:44] have you ever used Magnus's tool for authority control (AC) on wikidata?
[11:04:39] Not really.
[11:05:03] For now it's better to let the import bots do their work and fix the errors behind them?
[11:07:49] For that, lists like https://www.wikidata.org/wiki/Wikidata:Database_reports/Constraint_violations/P213 are useful.
[12:26:45] addshore: You online?
[12:55:16] multichill: yes
[12:55:22] sort of
[12:55:24] :D
[12:55:57] I was looking at http://tools.wmflabs.org/addshore/addbot/iwlinks/raw.html?lang=nl and I noticed a lot of articles that could easily be linked
[12:56:15] Take for example https://nl.wikipedia.org/wiki/Eupithecia_classicata
[12:56:20] Is that something your bot does?
[12:56:42] not right this second :/
[12:56:50] I had to turn off the importing to wikidata a bit
[12:56:58] but I hope to work on it at the hackathon this weekend
[12:58:48] can a bot owner here create 1800 items for person-related articles on the German Wikipedia?
[12:59:17] addshore: Would be nice to just have a bot in pywikipedia to do this so other people can help out
[12:59:27] I don't do python ;p
[13:01:15] are there any plans for a proper merge function that redirects old ids instead of breaking them?
[13:01:42] I have a list of articles without a wikidata item generated by my bot, but my bot only adds claims and descriptions, and it can't import sitelinks or create items
[13:02:20] I'm quite annoyed by the fact that everyone seems to be cooking their own code instead of sharing it
[13:03:17] * Reedy open sources multichill
[13:05:16] multichill: yes, I think this is one big problem, but just as big as the problem you mentioned is that we don't have coordination for our bots
[13:09:10] True Pyfisch. For one, the writing of the code and the running of the bots are two separate things that are now heavily mixed.
[13:10:49] multichill: mine is open
[13:11:03] and I welcome anyone to improve on it, or try and put it into python
[13:11:30] but currently it is messy php, as I have had no time since phase 1 really until the hackathon
[13:11:52] My goal would be to have all the basic bots in Pywikipedia so a lot more people can run them
[13:12:16] Besides that we probably have more advanced things like what you're doing addshore. That could be based off the framework or not
[13:12:42] my task could easily be done by pywikipedia
[13:12:45] I just don't know how :P
[13:13:41] multichill: If there are people interested in my bot I will publish it - after cleaning up the code. I think I have a good, extensible template and category wrapper part which can be used for all kinds of import bots.
[13:14:31] multichill: you any good with pywikipedia?
[13:14:48] Pyfisch: Lol, you know you're making the age-old mistake: "Clean up and then publish" ;-)
[13:15:54] addshore: Good enough
[13:16:16] well, I can write perfect pseudocode for the functions of my bot
[13:16:52] I'm just fairly sure I can't write a single line of py ;p
[13:17:24] addshore: python is easy to learn :-p
[13:17:37] teach me? :)
[13:19:11] addshore: Take for example http://svn.wikimedia.org/viewvc/pywikipedia/branches/rewrite/scripts/claimit.py?revision=11537&view=markup
[13:20:10] I think the problem might even be more the fact that I just don't know pywikipedia well enough to make it efficient..
[13:21:13] The whole Wikidata implementation in Pywikipedia is new to me too, but I'm getting the hang of it
[13:21:37] I could write api functions in php and make it do exactly what I want in about 5% of the time it would take me to read up on everything I need to do in pywikipedia and do it
[13:21:56] bbl :)
[13:22:26] Pyfisch: Take https://nl.wikipedia.org/wiki/Eupithecia_classicata, is your bot able to process this?
[13:23:17] multichill: yes!
[13:23:46] (but my bot doesn't speak nl) ;-)
[13:30:08] multichill: my bot does something similar with Personendaten on dewp.
[13:30:53] I meant linking the nl and enwiki articles in a wikidata item
[13:34:23] multichill: ok, my bot can't do this yet. It only works on claims, sources and descriptions
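[Editor's note: as an illustration of the "api functions in php" route addshore mentions above, here is a minimal sketch built on the wbgetentities and wbeditentity API modules: check whether a client page is already connected to an item, and if not, create a new item holding both sitelinks. The helper functions are invented for this example, login and edit-token handling are omitted, and this is a hedged sketch, not addshore's actual bot code.]

```php
<?php
// Minimal sketch: resolve a (site, title) pair to an existing item via
// wbgetentities, and create a new item with sitelinks via wbeditentity.
// Authentication and error handling are deliberately left out.

const WIKIDATA_API = 'https://www.wikidata.org/w/api.php';

/** GET request against the API, returning the decoded JSON result. */
function apiGet( array $params ) {
	$params['format'] = 'json';
	$json = file_get_contents( WIKIDATA_API . '?' . http_build_query( $params ) );
	return json_decode( $json, true );
}

/** Return the id (e.g. "Q1234") of the item connected to a page, or null. */
function findItemFor( $siteId, $pageTitle ) {
	$result = apiGet( array(
		'action' => 'wbgetentities',
		'sites' => $siteId,
		'titles' => $pageTitle,
		'props' => 'sitelinks',
	) );
	foreach ( $result['entities'] as $entity ) {
		// Unconnected pages come back as a stub entity with a 'missing' key.
		if ( !isset( $entity['missing'] ) ) {
			return $entity['id'];
		}
	}
	return null;
}

/** Create a new item whose sitelinks are given as site id => page title. */
function createItem( array $sitelinks, $editToken ) {
	$data = array( 'sitelinks' => array() );
	foreach ( $sitelinks as $siteId => $pageTitle ) {
		$data['sitelinks'][$siteId] = array( 'site' => $siteId, 'title' => $pageTitle );
	}
	$ch = curl_init( WIKIDATA_API );
	curl_setopt( $ch, CURLOPT_POST, true );
	curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
	curl_setopt( $ch, CURLOPT_POSTFIELDS, http_build_query( array(
		'action' => 'wbeditentity',
		'new' => 'item',
		'data' => json_encode( $data ),
		'token' => $editToken,
		'format' => 'json',
	) ) );
	$result = json_decode( curl_exec( $ch ), true );
	curl_close( $ch );
	return $result;
}

// Example: connect the nl and en articles multichill mentions, if unconnected.
if ( findItemFor( 'nlwiki', 'Eupithecia classicata' ) === null ) {
	// createItem( array( 'nlwiki' => 'Eupithecia classicata',
	//                    'enwiki' => 'Eupithecia classicata' ), $editToken );
}
```

[The same two API calls are what a pywikipedia port would wrap, which is what multichill is asking for; the claimit.py script linked above shows that framework's style for such bots.]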
[13:38:55] Pyfisch: :-(
[13:39:39] multichill: I think legoktm has published source code for this
[14:15:24] New patchset: Denny Vrandecic; "first draft of the RDF dump creating script (DO NOT MERGE)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64081
[14:15:41] https://gerrit.wikimedia.org/r/#/c/64081/
[14:26:56] multichill: here is my bot code: https://github.com/pyfisch/wikidata-import-bot
[14:57:10] New patchset: Aude; "Catch property not found exception in snak formatter" [mediawiki/extensions/Wikibase] (mw1.22-wmf4) - https://gerrit.wikimedia.org/r/64578
[14:57:13] New patchset: Aude; "(bug 48497) Check for property id exists in ByPropertyIdArray" [mediawiki/extensions/Wikibase] (mw1.22-wmf4) - https://gerrit.wikimedia.org/r/64579
[15:19:51] Change merged: Aude; [mediawiki/extensions/Wikibase] (mw1.22-wmf4) - https://gerrit.wikimedia.org/r/64578
[15:20:11] Change merged: Aude; [mediawiki/extensions/Wikibase] (mw1.22-wmf4) - https://gerrit.wikimedia.org/r/64579
[15:27:00] New patchset: Denny Vrandecic; "RDF dump creating script" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64081
[16:16:49] hello, I wanted to link the articles Fio de Ariadne (lógica) (pt) and Ariadnefaden (de), but it was impossible. Both articles have the same subject and are, apart from the language, identical.
[16:18:29] "Site link pt:Fio de Ariadne (lógica) already used by item Q3655970." That is ok, but why are they not linked together?
[16:22:47] ladmin
[16:23:38] Guest36185: You first have to remove the links from one of the items, as a link can only be used in one
[16:26:57] excuse me, it's the first time I have found such an error. Both articles are linked to other languages, de to Polish, pt to English and ?Russian; too much for a beginner. Could you please ask someone else to do that? Thank you for the assistance, and have a nice day
[16:28:38] good bye
[16:58:19] oh, the guest has gone
[16:58:25] I am not sure it is really the same item
[17:05:15] New patchset: Denny Vrandecic; "changed namespaces a bit, added license" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64538
[17:09:26] help… I found 2 items, with very different names, that seem to be the same person, as far as I can tell from the descriptions… how do you deal with that?
[17:10:08] Q560443 and Q65276 ??
[17:10:22] I don't understand enough German to be able to differentiate…
[17:26:41] hsarrazin: They both have articles on the English Wikipedia, and the article about Bernhard Knipperdolling explains that he is a follower of Jean de Leyde
[17:27:09] Tpt_: thanks… :))
[17:27:34] I stumbled on them via the list of duplicate ISNIs…
[17:28:27] it's funny… so far, all the ones that are not duplicates come from a VIAF error on the de wikipedia ;)
[17:30:12] hsarrazin: de? I have also seen a lot of problematic ones coming from en.
[17:30:22] Tpt_: it's odd, the two seem to have died on the same day, 22 January 2536…
[17:30:56] for now, the ones I have found came from de… but I have only done a few dozen in total…
[17:30:59] hsarrazin: They were probably executed on the same day, when the city was taken
[17:31:56] Tpt_: *1536 of course, not 2536… and you are probably right… I have not looked into their history in detail ;)
[17:34:25] Tpt_: my question earlier, about the AC tool, was about people who have no authority record on the reference wikipedias… so there is no way to fetch them by bot…
[17:34:56] yet we have them on Commons… but copying everything over by hand is… ughhh
[17:34:59] hsarrazin: Because of the import from VIAF that doesn't work?
[17:35:04] yes,
[17:38:04] Tpt_: if we could at least fetch the other values by entering the correct VIAF…
[17:38:29] hsarrazin: I will try to debug that.
[17:38:55] that would be really nice… I left a short message for Magnus earlier…
[17:41:32] New review: Tpt; "Some (maybe silly) questions." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64538
[17:49:46] New patchset: Aaron Schulz; "Removed useless root job params." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64599
[17:53:08] New review: Jeroen De Dauw; "Beware: the method touched does not have test coverage" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64599
[17:57:37] hsarrazin: His code relies mostly on a tool he developed on Wikimedia Labs, which makes the code hard to understand quickly. The simplest thing is therefore to tell him that the VIAF search in his script https://www.wikidata.org/wiki/User:Magnus_Manske/authority_control.js often never finishes when run with Firefox/Chrome/....
[17:58:10] Tpt_: ok, thanks… does it work for you?
[17:58:24] hsarrazin: No
[17:58:35] not with Firefox 23
[17:58:40] I will run a test with Safari, to see…
[18:00:45] Tpt_: it doesn't work with Safari either… :S
[18:04:54] 8 Warning: assert() [function.assert]: Assertion failed in /usr/local/apache/common-local/php-1.22wmf4/extensions/Wikibase/DataModel/DataModel/Claim/Claims.php on line 283
[18:04:57] More of those apparently
[18:06:39] not that again
[18:15:47] hello aude
[18:16:08] aude: Done. Not sure if you need a scap, but there's one coming anyway for login stuff
[18:16:16] hi pragunbhutani
[18:16:34] Reedy: not this time
[18:16:38] aude: does the site links table on the left use an entity id?
[18:17:29] on the left?
[18:19:27] Reedy: if the assertion thing appears more, then I can take more of a look at it
[18:19:36] It comes and goes
[18:19:41] seems a different issue
[18:19:42] aude: on a wikibase client article page
[18:20:01] pragunbhutani: ok, the "language links"?
[18:20:06] in the sidebar?
[18:20:12] aude: yes, exactly
[18:21:05] yes, when a page is parsed (e.g. saved), it finds out, based on the page title, whether it has a "connected" wikidata item
[18:21:09] then stores the entity id
[18:21:17] in the parser output
[18:21:55] it also injects links from wikidata into the language links list
[18:22:18] the entity id is used to construct the "edit links" link
[18:22:43] okay, so if it stores the entity ID, all I have to do is find where it's stored and then add it to the info items of the page through a hook
[18:23:18] for the info action, we might want to do a separate lookup in the site link table
[18:23:35] how do I do that?
[18:27:25] pragunbhutani: the SiteLinkTable class has methods for looking up an entity id by site link (client page title and site id)
[18:30:15] SiteLinkTable class in Wikibase client?
[18:31:15] ah okay, I've found it
[18:31:17] it's in wikibase lib (shared code that the repo uses)
[18:32:15] so then you need to figure out how to construct a "SiteLink" object
[18:32:36] from a "site id" and page title
[18:33:09] I think I see it
[18:33:19] I'll just pastebin a part of the code
[18:33:23] ok
[18:34:24] http://pastebin.com/Qqn1CrHk
[18:36:17] pragunbhutani: the right direction :)
[18:36:38] I don't think action=info has access to the $parser, but there are other ways there to get the title object
[18:37:04] I found that in parserhooks/PropertyParserFunction.php
[18:37:17] ok
[18:37:31] the parser function definitely has access to the Parser
[18:38:11] so $entityId should already have the information stored in it?
[18:38:24] even if we have to do a new lookup
[18:38:36] it can be done the same way, can't it?
[18:39:52] entityId is what we want, if that's what you are asking?
[18:40:08] yes
[18:40:49] ok
[18:41:22] EntityId is an object, but it's possible to get the prefixed id from it
[18:41:25] e.g. q60
[18:41:42] that's what should be added to action=info
[18:44:19] oh okay, I see what you mean
[18:44:44] we only want a particular data member
[19:47:58] aude: in Wikibase Lib, there's a function 'public function getEntityID' that returns the id as a string
[19:48:27] so if I can create an object and call that method on the object, we should have a string with the ID that can be added to the info actions
[19:51:30] pragunbhutani: where is that method?
[19:59:00] aude: extensions/Wikibase/client/includes/WikibaseLibrary.php
[19:59:19] Line 94 onwards
[20:03:35] back...
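[Editor's note: a minimal sketch of the action=info approach discussed above. The InfoAction hook and the $pageInfo row format are standard MediaWiki; the store accessor, the SiteLink constructor, the getEntityIdForSiteLink lookup, and the message key are hypothetical names following the chat's description, not verified against the Wikibase code of the time.]

```php
<?php
// Hedged sketch: add the connected item's prefixed id (e.g. "q60") to the
// action=info page of a Wikibase client article. Names marked hypothetical
// are assumptions made for illustration.

class EntityIdInfoAction {

	/**
	 * InfoAction hook handler. action=info has no $parser, but the
	 * IContextSource provides the Title object, as aude notes above.
	 */
	public static function onInfoAction( IContextSource $context, array &$pageInfo ) {
		// Hypothetical accessors: however the client wires up its lib store
		// and knows this wiki's global site id (e.g. 'enwiki').
		$siteLinkTable = WikibaseClient::getDefaultInstance()->getStore()->getSiteLinkTable();
		$siteId = Settings::get( 'siteGlobalID' );

		// Construct a SiteLink from the site id and page title, then look up
		// the connected entity id -- the SiteLinkTable lookup aude describes.
		$siteLink = new SiteLink( $siteId, $context->getTitle()->getFullText() );
		$entityId = $siteLinkTable->getEntityIdForSiteLink( $siteLink );

		if ( $entityId !== null ) {
			// Only one data member of the EntityId object is wanted here:
			// its prefixed string form.
			$pageInfo['header-basic'][] = array(
				$context->msg( 'wikibase-pageinfo-entity-id' ), // hypothetical key
				$entityId->getPrefixedId()
			);
		}

		return true; // let other InfoAction handlers run
	}
}

// Hypothetical registration in the extension's setup file:
// $wgHooks['InfoAction'][] = 'EntityIdInfoAction::onInfoAction';
```

[Doing a fresh site-link-table lookup from the Title, rather than digging the id out of the parser output, matches aude's suggestion above that action=info lacks a $parser but can still reach the Title through its context.]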