[00:08:43] nope, no way
[07:36:17] Aaarghhh https://www.wikidata.org/w/index.php?title=Q1275297&action=history
[07:38:12] sjoerddebruin: seems like an interwiki bot going wild
[07:40:49] No, users who have no idea what they are doing.
[07:41:20] It happens every week. https://www.wikidata.org/w/index.php?title=Q67528&action=history
[09:13:51] Hallo.
[09:16:52] Hi
[09:25:11] Amir1: thanks, humans with shit categories are hard to find. :) https://nl.wikipedia.org/w/index.php?title=Chi-A.D.&type=revision&diff=44870691&oldid=37603364
[09:26:15] Oh, it's not you. :O
[09:32:30] * aude rage.... what is wikidatawiki fatal ERROR: [4ae6f0f0] PHP Fatal Error: unknown exception?
[09:32:48] no details at all :o
[09:46:24] in the hangouts
[09:46:30] benestar|cloud: ^^
[10:37:34] aude: unknown exception? :D
[10:37:36] woo!
[10:38:55] addshore: helpful :o
[10:38:57] aude: https://phabricator.wikimedia.org/T93970
[10:39:38] i don't think it happens
[10:39:47] i would close it, saying "reopen if it happens again"
[10:40:08] addshore: maybe you could look at https://phabricator.wikimedia.org/T112003
[10:40:18] probably very easy to fix and easy to reproduce
[10:43:08] Lydia_WMDE: would https://phabricator.wikimedia.org/T111056 be a task?
[10:43:28] i imagine it would have multiple subtasks and shouldn't result in a change of user-facing behavior
[10:45:05] aude: I'll take a quick look at it!
[10:45:19] thanks
[10:45:28] * aude is triaging bugs so can't look right now
[10:54:38] DanielK_WMDE: https://phabricator.wikimedia.org/T110214
[11:05:07] aude: how regular are those exceptions?
[11:06:26] not often but easy to reproduce
[11:06:30] cool
[11:06:53] https://www.wikidata.org/w/index.php?title=Q123603&diff=192731313&oldid=53185603
[11:07:03] Katie: DanielK_WMDE Thiemo_WMDE jzerebecki asian food ? https://www.lieferheld.de/lieferservices-berlin/restaurant-big-bao-sushi-thai/16817/
[11:07:14] mmmmm
[11:10:58] * aude wants Gaeng Phed with tofu
[11:12:53] :D
[11:13:35] on https://www.wikidata.org/w/index.php?oldid=53185603
[11:13:40] P107 is deleted
[11:14:18] Jonas_WMDE: https://phabricator.wikimedia.org/T50143
[11:14:25] not sure that is the problem though
[11:15:49] Jonas_WMDE: Gaeng Sale with Shrimp
[11:17:12] more food wishes? aude jzerebecki Thiemo_WMDE?
[11:17:21] Jonas_WMDE: still choosing
[11:17:45] Jonas_WMDE: me food please! please wait.
[11:19:02] Jonas_WMDE: Hauptgerichte - Gaeng Sale, Hühnerfleisch, 6,50, please.
[11:21:51] aude: yeh, got the issue, making a patch now, it's in datamodelservices
[11:22:01] will need a minor release
[11:24:45] Jonas_WMDE: Cream Cheese Sake, Sake Avocado Maki, Hotbeef I. O., Fischrogen, Chicken I. O., Sesam
[11:25:06] thanks addshore
[11:26:20] easily reproducible locally, by deleting a property and then viewing a diff that has the property
[11:30:29] yeh, it would happen in all cases where that formatter was used for deleted entities
[11:30:41] * addshore didn't reproduce, just looked at the code ;)
[11:31:25] that too
[11:32:08] benestar: around? i rebased the relevant datamodel pull requests.
[11:32:16] thanks for the merge.
[11:36:44] aude: https://github.com/wmde/WikibaseDataModelServices/pull/65
[11:36:48] \o/
[11:37:24] looks sane but will try it
[11:37:30] :)
[11:40:38] * aude cries
[11:42:17] aude: ? :(
[11:44:09] * aude pokes at EntityInfoTermLookup
[11:44:58] aude: another thing?
[11:45:31] got a fix
[11:45:36] in wikibase
[11:45:52] ? :O
[11:46:16] for the same issue?
[11:47:34] related
[11:47:45] appears once i apply your patch on data model services
[11:48:11] hah!
[11:51:01] https://gerrit.wikimedia.org/r/#/c/237357/
[11:51:04] * aude wants to add tests
[11:52:57] jzerebecki: https://phabricator.wikimedia.org/T108101
[11:55:40] * aude runs phpunit tests on wikibase
[11:58:17] looks like things need more adjusting
[12:35:49] DanielK_WMDE: https://phabricator.wikimedia.org/T112003
[12:36:09] now i get https://phabricator.wikimedia.org/P2003 when viewing a diff with a deleted property
[12:36:25] have https://gerrit.wikimedia.org/r/#/c/237357/ but tests fail and think it causes stuff to break elsewhere
[12:36:59] * aude wants to be very sure that whatever we backport will not introduce new bugs and make things worse
[12:37:43] very easy to reproduce the bug
[12:41:35] DanielK_WMDE: https://phabricator.wikimedia.org/T92517
[12:53:17] aude: jzerebecki: https://phabricator.wikimedia.org/T92532 <- still relevant?
[12:53:59] it probably is
[13:04:09] aude: is this still open? https://phabricator.wikimedia.org/T92532
[13:07:25] Jonas_WMDE: it probably is
[13:07:56] i think there is a tag for needs checking
[13:08:29] or maybe a column
[13:09:24] aude: I'll let you deal with https://gerrit.wikimedia.org/r/#/c/237356/ etc ;) I made the patch but it might be better for it to go with your stuff!
[13:09:26] DanielK_WMDE: https://gerrit.wikimedia.org/r/#/c/237356/
[13:09:38] addshore: daniel is looking now
[13:09:44] cool!
[13:09:45] (so i can look at the quality extension)
[13:10:36] * aude got some of those failures in my patch also
[13:11:00] maybe if i rebase
[13:16:24] aude: https://phabricator.wikimedia.org/T91397 still relevant? can not find other languages on mobile ...
[13:41:13] Jonas_WMDE: probably benestar fixed it :)
[13:42:30] DanielK_WMDE: still want to help with https://gerrit.wikimedia.org/r/#/c/237357/
[13:48:58] aude: https://gerrit.wikimedia.org/r/237374
[13:49:21] DanielK_WMDE: enough discussion?: https://phabricator.wikimedia.org/T103346#1625236
[13:52:58] jzerebecki: why do we have a test that makes sure that json parsing fails on trailing commas?
[13:53:03] is that something we actively rely on?
[13:53:45] jzerebecki: why is this in the wikidata project, anyway?
[13:54:17] DanielK_WMDE: because scribunto is included in the wikidata build
[13:54:25] DanielK_WMDE: can this be linked to an epic? https://phabricator.wikimedia.org/T88986
[14:07:53] there is https://phabricator.wikimedia.org/T92357 which is tangentially related
[14:08:16] https://phabricator.wikimedia.org/T88445 looks like an epic
[14:08:22] Jonas_WMDE: ^
[14:08:32] T92357 is for specific issues
[14:09:03] maybe we want a wikibase epic for our issues and add it as a blocker to T88445
[14:09:07] DanielK_WMDE: ^
[14:10:24] is it known that the edit summary doesn't seem to link items for changed statements which point to items now? it just shows the target item's label as plain text, as if it were a string property, not an item
[14:10:32] (if that makes any sense...)
[14:11:17] nikki: do you mean this? https://www.wikidata.org/wiki/?diff=250029949&oldid=249601743
[14:12:16] ah, I have the ui in english, but yeah, that looks like the same thing
[14:12:19] thanks
[14:30:42] DanielK_WMDE: https://gerrit.wikimedia.org/r/#/c/237374/ and the follow up
[15:06:50] DanielK_WMDE: https://gerrit.wikimedia.org/r/#/c/237357/
[15:52:53] /window 35
[15:52:55] gah
[16:11:22] xD
[16:25:08] jzerebecki: https://phabricator.wikimedia.org/T112120
[16:27:30] hmm... https://fi.wikipedia.org/w/index.php?title=Toiminnot:Tuoreet_muutokset&hidewikidata=0 .... --> (ero | historia) . . D Wikipedia:Wikiprojekti Avoin kulttuuridata hyötykäyttöön/Tietojen vieminen Wikidataan (Q55); 19.21 . . Fnielsen (keskustelu | muokkaukset) (Wikidata-kohdetta muutettu)
[16:28:05] and the diff https://www.wikidata.org/w/index.php?title=Q55&curid=175&diff=250789571&oldid=247996280 says nothing about "Wikipedia:Wikiprojekti Avoin kulttuuridata hyötykäyttöön/Tietojen vieminen Wikidataan"
[16:29:18] Lydia_WMDE: I guess you already have a bug for the unit turning into a link to the entity instead of showing the right Qid in the interface?
[16:29:35] multichill: when editing?
[16:29:39] yup
[16:29:45] will be fixed with the next release :)
[16:30:01] great
[16:31:27] First food, then look for docs on how unit conversion is going to work out
[17:10:34] Lydia_WMDE: Does that also include that 44,5 turns into 445?
[17:10:46] Looks like only 44.5 stays 44.5
[17:11:01] that's a different thing. but also a ticket open already
[17:11:07] and independent of unit support
[17:23:34] DanielK_WMDE: https://phabricator.wikimedia.org/T103912
[17:36:45] Lydia_WMDE: I believe https://www.mediawiki.org/wiki/Wikibase/DataModel#Quantities needs updating?
[17:37:11] Or is that all still the case?
[17:39:51] multichill: looking
[17:40:16] Just filed the bug to implement this in Pywikibot, would suck if we implement it the wrong way :P
[17:40:46] yeah. from my side it still looks ok but aude, DanielK_WMDE or jzerebecki should have a look too
[17:41:43] * Harmonia_Amanda is pretty tired of the massive distrust some French Wikipedians have for Wikidata
[17:43:35] multichill: the documentation looks ok to me
[17:43:38] Harmonia_Amanda: See it as constructive scepticism pushing back a bit at the people who might be going a bit too fast ;-)
[17:44:00] multichill: if it was only that, it would be great actually
[17:44:27] but no, there are people here who genuinely think 95% of wikidata edits are about interwikis
[17:44:42] or that Wikipedia is the exclusive source for wikidata
[17:44:54] ergo wikipedia shouldn't use wikidata at all
[17:45:11] and that's not "constructive scepticism" in my book
[17:45:22] These days every statement I import is sourced. Much better than Wikipedia!
[17:45:42] ^^
[17:46:21] there were many great ideas, like a gadget to modify wikidata directly from a wikipedia infobox
[17:46:30] It's not only the French, Harmonia_Amanda.
[17:47:00] and there are these massive discussions about prejudices and falsities
[17:47:08] and of that, I'm tired
[17:48:01] multichill: actually, that's not reassuring at all
[17:48:11] Sorry :-(
[17:49:03] I would prefer it if it were only a dozen angry French people
[17:49:13] and not a "general" thing ;)
[17:49:38] They'll get over it
[17:49:51] I hope
[17:49:55] You know, the same thing happened with Commons when it started
[17:49:59] really?
[17:50:26] Yeah, of course. Some of the old farts still don't like Commons
[17:50:33] because we searched the french discussion pages and no big debate about commons was found
[17:50:48] Some of the other languages sure had!
[17:50:48] everyone kinda accepted it right away
[17:51:06] only some discussion on *how* to transfer things to commons
[17:51:10] In Dutch at some point we just transferred most of it and deleted everything
[17:51:17] well ok then
[17:51:23] multichill: how does pywikibot go from a string to a datavalue? does it use the wbparsevalue api module?
[17:51:41] DanielK_WMDE: In what context?
[17:51:53] multichill: when importing things into wikidata
[17:52:01] well, now that we have units, I'll start fabricating tables for the sled dog races to insert in articles
[17:52:16] wbeditentity requires DataValue structures in JSON. but the bot generally starts out with a plain string
[17:52:17] and I bet nobody will see it for months and months
[17:52:47] DanielK_WMDE: http://git.wikimedia.org/blob/pywikibot%2Fcore.git/67b2cca00956553e57c622cc4ecdc270ceaee32a/pywikibot%2F__init__.py
[17:52:50] multichill: so, how is the JSON that represents a QuantityValue constructed?
[17:53:03] Look for toWikibase
[17:53:58] DanielK_WMDE: http://git.wikimedia.org/blob/pywikibot%2Fcore.git/67b2cca00956553e57c622cc4ecdc270ceaee32a/pywikibot%2F__init__.py#L462
[17:54:36] multichill: no, that's not really what i'm looking for. How does it go from a string to a WbQuantity object?
[17:54:49] multichill: especially: how is the error parameter populated?
[17:55:18] i want to avoid pywikibot re-implementing the logic for parsing and guessing the precision
[17:55:30] we have code and a public api for that
[17:56:41] DanielK_WMDE: I guess you noticed https://phabricator.wikimedia.org/T112130 ?
[17:58:13] multichill: not until now... the ticket looks fine, but my question is really about the error margin, not the unit. i expect a lot of bots importing a lot of values. and i don't want them to do that with +/-0 per default, because that is nearly always wrong.
[17:58:57] DanielK_WMDE: http://git.wikimedia.org/blob/pywikibot%2Fcore.git/67b2cca00956553e57c622cc4ecdc270ceaee32a/pywikibot%2F__init__.py#L486 won't make you happy
[17:59:30] >grep wbparsevalue *.py doesn't return anything
[17:59:49] multichill: yea, i saw that. a bit worrying, but not totally horrible - as long as the error is usually set.
[18:00:01] my suggestion is to *require* the error to be set explicitly
[18:00:02] No, it won't be
[18:00:18] that's very bad.
[18:00:25] I'll file a bug
[18:00:45] that means lots of values imported with apparently mathematically absolute precision. which is never true for a measured quantity
[18:00:49] multichill: thanks
[18:01:36] actually, the only measurable quantity i can think of that has mathematically absolute precision is the speed of light - because the meter is *defined* relative to c :)
[18:01:50] Hmm, not sure how I would use it, DanielK_WMDE. Did you see the time implementation?
[18:02:03] ( http://git.wikimedia.org/blob/pywikibot%2Fcore.git/67b2cca00956553e57c622cc4ecdc270ceaee32a/pywikibot%2F__init__.py#L112 )
[18:02:25] Erhm, http://git.wikimedia.org/blob/pywikibot%2Fcore.git/67b2cca00956553e57c622cc4ecdc270ceaee32a/pywikibot%2F__init__.py#L328
[18:02:40] It sets the precision based on what the user set
[18:03:13] multichill: what do you mean?
[18:03:27] it seems fairly similar to how this works in php
[18:03:34] If you provide year and month the precision is set to month
[18:03:40] If you add the day too it's set to day, etc
[18:04:02] So it would probably be a wrapper helper function
[18:04:25] but you can also set the precision explicitly
[18:04:45] when you get something back from the API, you'd just split the ISO string, and pass in the precision directly
[18:05:03] the constructor shouldn't call the parse module
[18:05:17] there should be a separate object for going from string to value object
[18:05:20] Exactly. If an existing object is worked on, all the data is loaded from that object
[18:05:46] yes, that's how it should be
[18:05:55] i'm worried about the case when a number is taken from an infobox
[18:05:58] how is it parsed?
[18:07:04] DanielK_WMDE: Like data from an infobox? I'm not sure we have a bot already capable of doing that
[18:07:27] multichill: we have bots importing statements, no?
[18:07:57] the values come from wikitext somehow. so they have to be parsed
[18:08:14] http://git.wikimedia.org/blob/pywikibot%2Fcore.git/67b2cca00956553e57c622cc4ecdc270ceaee32a/scripts%2Fharvest_template.py#L168 <- that's the one I wrote
[18:08:37] It supports images, strings and items
[18:09:01] hm, Amir1 at least is importing quantities
[18:09:27] Like what for example? He probably built something around it
[18:11:30] aude: https://www.wikidata.org/w/index.php?title=Q4407383&diff=0&oldid=241486749
[18:13:08] multichill: ok, so, my suggestion is to file two bugs: a) require explicit error values (don't default to 0, 0 is *rare*) and b) implement a parsing helper based on wbparsevalue (because otherwise, people will roll their own, doing horrible things)
[18:13:43] Helper one is at https://phabricator.wikimedia.org/T112140
[18:14:09] thanks
[18:19:46] DanielK_WMDE: And filed https://phabricator.wikimedia.org/T112141 for a practical application
[18:20:45] For "a) require explicit error values (don't default to 0, 0 is *rare*)" I'm not sure how I would file that..... You could give it a shot
[18:22:52] addshore: Lydia_WMDE grafana just got put behind an authentication wall, btw
[18:22:58] :P
[18:23:03] yay
[18:23:07] whats the authentication?
[18:23:10] ah
[18:23:13] ldap / nda?
[18:23:13] same as gerrit
[18:23:16] yep
[18:23:56] YuviPanda: yeh,, https://phabricator.wikimedia.org/T108546#1626647
[18:27:34] aude: addshore btw, graphite's /render/ urls were never behind LDAP
[18:27:43] so you can still get data out if you know exactly what you want
[18:27:56] yeh :P
[18:28:08] or, lots of screenshots of static graphs ;)
[18:36:17] DanielK_WMDE / aude: How do I use wbparsevalue to get a valid quantity with unit? https://www.wikidata.org/w/api.php?action=wbparsevalue&datatype=quantity&values=50.4&validate&options={%22unit%22:%22Q174728%22}
[18:37:03] DanielK_WMDE: multichill hey I was afk
[18:37:14] Welcome back
[18:37:21] I built something using pywikibot but not from harvest
[18:37:26] It's pretty similar
[18:37:38] https://www.wikidata.org/w/api.php?action=wbparsevalue&format=json&parser=quantity&values=3&options=%7B%22lang%22%3A%22en%22%2C%22unit%22%3A%22http%3A%2F%2Fwww%2Ewikidata%2Eorg%2Fentity%2FQ11573%22%2C%22applyRounding%22%3Afalse%2C%22applyUnit%22%3Afalse%7D
[18:37:47] not too readable
[18:37:56] code review in pywikibot is kind of hard, I don't want the headache of adding a script to the repo
[18:37:57] but that is what the ui does
[18:38:22] Oh, crap, the entity id
[18:38:28] the unit needs to be the URI
[18:38:28] and pywikibot also needs an update (the most important one: pywikibot/__init__.py)
[18:38:39] http://www.wikidata.org/entity/Q11573
[18:38:54] https://www.wikidata.org/w/api.php?action=wbparsevalue&datatype=quantity&values=50.4&validate&options={%22unit%22:%22http://www.wikidata.org/entity/Q174728%22}
[18:38:56] Yah!
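The parsing helper DanielK_WMDE proposes above (and the call multichill just got working) can be sketched in a few lines of Python. The wbparsevalue parameters are taken from the URLs pasted in this discussion; the use of requests, the function name, and the exact handling of the response shape are assumptions, not pywikibot's actual implementation.

```python
import json
import requests

API = "https://www.wikidata.org/w/api.php"

def parse_quantity(value, unit_uri=None, lang="en"):
    """Let the wbparsevalue API module parse a plain string into a
    quantity DataValue, instead of guessing precision client-side."""
    options = {"lang": lang}
    if unit_uri is not None:
        # the unit must be the full entity URI, not a bare Q-id
        options["unit"] = unit_uri
    reply = requests.get(API, params={
        "action": "wbparsevalue",
        "format": "json",
        "datatype": "quantity",
        "values": value,
        "validate": "1",
        "options": json.dumps(options),
    }).json()
    if "error" in reply:
        raise ValueError(reply["error"].get("info", "parse failed"))
    # assumed response shape: {"results": [{"value": {...}, ...}]}
    return reply["results"][0]["value"]

# e.g. parse_quantity("50.4", unit_uri="http://www.wikidata.org/entity/Q174728")
```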
[18:38:57] like globe
[18:39:02] :)
[18:39:44] Amir1: You're using the deprecated parser field; it should be datatype
[18:39:58] https://www.wikidata.org/w/api.php?action=wbparsevalue&format=json&datatype=quantity&values=3&options={%22lang%22%3A%22en%22%2C%22unit%22%3A%22http%3A%2F%2Fwww.wikidata.org%2Fentity%2FQ11573%22%2C%22applyRounding%22%3Afalse%2C%22applyUnit%22%3Afalse}
[18:41:24] I don't know
[18:41:35] I don't use any field parsers
[18:41:44] I use mwparserfromhell to parse template data
[18:44:20] Amir1: So that parser (wbparsevalue) works in different languages? Hmm, that opens up some possibilities
[18:44:49] Lydia_WMDE: should tasks for Wikidata-Page-Banner really always be in Wikidata?: https://phabricator.wikimedia.org/herald/rule/30/
[18:45:25] I'll send you the really hacky, nasty code
[18:45:31] Amir1: https://www.wikidata.org/w/api.php?action=wbparsevalue&datatype=time&values=30%20maart%201853&options={%22lang%22:%22nl%22}
[18:45:33] and you can see how it works
[18:45:39] That doesn't work. How to force the language?
[18:45:53] never mind, made a typo
[18:46:06] That's pretty cool
[18:46:14] I'm harvesting data from Wikipedia
[18:46:36] I only check if the property exists in wikidata or not
[18:49:47] benestar: around?
[18:50:11] aude: yes
[18:50:23] wonder if you are still going to do https://phabricator.wikimedia.org/T111025 ?
[18:50:31] suppose it doesn't need to be today, but the sooner the better :)
[18:50:39] we haven't deployed just yet
[18:51:43] aude / benestar: What do you think of https://www.wikidata.org/w/index.php?title=Q11573&type=revision&diff=250900754&oldid=250728725 to relate units?
[18:51:56] yes, I will write a script for that but not sure if I get to that tomorrow :S
[18:54:10] benestar: ok
[18:54:15] if you want me to do it, let me know
[18:54:25] * aude only needs a list of pages
[18:54:32] but not urgent
[18:55:04] sjoerddebruin: You created a lot of new unit properties today?
[18:55:12] Nope.
[18:55:23] aude: I'd start with https://www.wikidata.org/wiki/Special:WhatLinksHere/Q17437798 and check the sitelinks
[18:56:32] sjoerddebruin: Do you think https://www.wikidata.org/w/index.php?title=Q11573&type=revision&diff=250900754&oldid=250728725 is the right direction to go?
[18:56:50] Sounds logical.
[18:57:25] benestar: i could start with generating a list of pages
[18:57:40] not right now but maybe later or tomorrow
[18:59:39] Amir1: Thanks! I'll probably hack up something to import dimensions of Van Gogh paintings. You like Van Gogh, right? ;-)
[18:59:56] yeah :)
[19:00:00] thanks
[19:17:49] Amir1: Families are still hard for Kian
[19:18:12] What do you mean by families?
[19:18:21] can you give me some examples?
[19:18:25] https://www.wikidata.org/wiki/Q525971
[19:20:06] multichill, Amir1: https://phabricator.wikimedia.org/T112156
[19:20:46] Shit @@#$@!#$@!, I just realized that Pywikibot crashes if it tries to load an item with units on it!!!
[19:21:09] yeah
[19:21:17] you need to change pywikibot/__init__.py
[19:21:36] in the WbQuantity class __init__ function
[19:21:50] multichill: Do you want to make a patch?
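For reference, the crash comes from the new unit field in quantity JSON that older pywikibot does not expect. The sketch below is a hypothetical stand-in for the patch being discussed, not pywikibot's real WbQuantity code; the field names follow the quantity serialization in the DataModel docs linked earlier (amount, unit, upperBound, lowerBound).

```python
# Illustration only: a quantity deserializer that tolerates the
# "unit" field instead of crashing on items that carry one.
class Quantity:
    def __init__(self, amount, unit="1", upper_bound=None, lower_bound=None):
        self.amount = amount
        self.unit = unit                # entity URI, or "1" for "no unit"
        self.upper_bound = upper_bound  # explicit error margin, per the
        self.lower_bound = lower_bound  # "don't default to +/-0" discussion

    @classmethod
    def from_wikibase(cls, data):
        # data is the "value" dict of a quantity snak
        return cls(
            amount=data["amount"],
            unit=data.get("unit", "1"),
            upper_bound=data.get("upperBound"),
            lower_bound=data.get("lowerBound"),
        )
```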
[19:21:57] or I will do it
[19:22:29] sjoerddebruin: I can't teach Kian to add such families
[19:22:30] https://www.wikidata.org/wiki/Special:WhatLinksHere/Q525971
[19:22:38] since it's not used at all
[19:22:45] It's shitty categorizing too btw
[19:23:41] Honestly, since ANNs can handle non-linear classifications, it wouldn't be a problem; usually I have the issue of low data
[19:23:55] and that's the problem
[19:24:37] Thanks DanielK_WMDE :)
[19:25:21] Amir1: If you can patch up pywikibot that would be nice. https://phabricator.wikimedia.org/T11213 is the bug
[19:26:35] sjoerddebruin: if you suggest some models to work on, I can run them and see if they work properly or not. three things define a model: a property, a value of that property, and a wiki. e.g. P31:Q5, enwiki
[19:26:44] it can be any wiki including wikinews, etc.
[19:26:50] multichill: sure
[19:26:53] thanks
[19:26:58] The other example also has family inside the label, so an autolist job could also do it
[19:27:08] Amir1: Maybe first a small patch so that it doesn't crash anymore?
[19:27:30] yeah
[19:27:35] Amir1: I could just update one of the Q items used for the testing to have all the tests fail :P
[19:27:43] multichill: writing it right now
[19:32:36] aargh, broken javascript shit drives me crazy
[19:32:48] have to refresh items like two times to get them working
[19:32:56] multichill: the error is not correct
[19:33:02] can you give it to me?
[19:33:07] *bug number
[19:33:31] Amir1: https://phabricator.wikimedia.org/T112130#1627447
[19:33:51] there was a zero missing :)
[19:35:11] patch is there
[19:39:06] multichill: does this look like a useful site for a property, multichill? http://zoeken.nai.nl/CIS/persoon/3419
[19:40:34] sjoerddebruin: Interesting. How many people are on there?
[19:40:54] Although the warning is not very nice
[19:40:56] 19119 it seems
[19:41:10] Oh, missed that
[19:41:57] "De collectie van Het Nieuwe Instituut is toegankelijk via het Collectie Informatiesysteem. Dit systeem is verouderd en wordt niet langer onderhouden. Het zal in de loop van 2015 worden vervangen door een nieuw zoekportaal." (The collection of Het Nieuwe Instituut is accessible via the Collection Information System. This system is outdated and no longer maintained. It will be replaced by a new search portal in the course of 2015.)
[19:42:15] Need to keep an eye on that.
[19:42:21] Maybe drop them an email?
[19:42:54] Will try that.
[19:44:36] Done
[19:46:28] Also 9074 objects.
[20:16:01] I merged https://www.wikidata.org/wiki/Q16671641 into https://www.wikidata.org/wiki/Q16269373
[20:16:05] Can someone make the redirect? Thank you
[20:17:40] harej: done, you can use the merge gadget to do it yourself ;)
[20:33:32] DanielK_WMDE: Why didn't you implement quantity as a tuple?
[20:34:42] ;-)
[20:39:16] aude: at which rate can I purge pages?
[20:44:20] sjoerddebruin / Amir1: Can you please complete the steps at https://www.wikidata.org/wiki/Wikidata:Property_creators#Steps_when_creating_properties for the properties you created?
[20:44:31] tia
[20:45:03] Trying my best
[20:45:12] sure :)
[20:49:46] Another question: how should I deal with https://www.wikidata.org/wiki/Q3314420 + https://www.wikidata.org/wiki/Q7116910 ?
[20:50:01] They refer to the same thing, but the latter item links to the English Wikipedia's "data page" on the chemical
[20:58:55] addshore: around?
[21:00:06] benestar: as fast as a bot would edit
[21:00:15] ok
[21:00:31] although you can put multiple pages in one api purge request
[21:01:27] aude: shall I?
[21:01:37] \o/ https://en.wikipedia.org/wiki/Special:PagesWithBadges?badge=Q17437796
[21:01:43] already has stuff
[21:01:49] benestar: go ahead
[21:01:58] I will do it per wiki
[21:02:01] * aude might put 10 pages at a time
[21:02:04] just a guess
[21:02:59] aude: is there a mapping of siteid to api url somewhere?
[21:03:25] in the sites table
[21:04:08] there is file_path
[21:04:43] sounds good
[21:40:17] benestar: think you need to use forcelinksupdate with api purge
[21:40:46] https://en.wikipedia.org/w/api.php?action=help&modules=purge
[21:41:22] aude: otherwise the page props won't get updated?
[21:41:24] thanks for the tip
[21:43:19] i think so
[21:52:06] aude: do you wanna review? https://github.com/Benestar/benebot/blob/master/src/PurgeBadgesPageProps.php
[21:53:06] * benestar is going to bed now
[21:53:15] I will run the bot tomorrow if no issues arise
[21:53:34] * aude looks
[21:54:28] looks sane at a glance
[21:54:37] I could perhaps save some api calls by using the database for loading entities
[21:54:43] but that's not implemented yet
[21:54:46] * benestar slaps addshore
[21:55:45] gn8 all :)
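The purge loop discussed above fits in a few lines: batches of around ten titles per request (aude's guess) with forcelinksupdate set so the page props get refreshed. The purge module and its forcelinksupdate flag are documented at the API help link above; the requests-based client, the function name, and the batch size are assumptions, not benebot's actual PHP code.

```python
import requests

def purge_in_batches(api_url, titles, batch_size=10):
    """POST batched purge requests, forcing a links update each time."""
    session = requests.Session()
    for i in range(0, len(titles), batch_size):
        batch = titles[i:i + batch_size]
        session.post(api_url, data={
            "action": "purge",
            "format": "json",
            "forcelinksupdate": "1",  # otherwise page props won't get updated
            "titles": "|".join(batch),
        })

# e.g. purge_in_batches("https://en.wikipedia.org/w/api.php", pages_with_badges)
```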