[00:09:57] http://toolserver.org/~magnus/ts2/geneawiki/?q=Q855749 <-ok, I killed it :P [00:13:44] Whoa. [00:14:29] Nice, Mr. Manske. [00:16:06] pretty awesome [00:19:42] Next round of confirmations began [00:20:27] did someone do the stuff? [00:20:57] seems wikidata site is good [00:21:11] and it's on meta [00:22:47] cool, that went pretty smoothly [00:22:52] duh, added :D [00:22:57] yay! [00:23:17] and just fyi, the hidden comments i added were just do i would know which is which, they arent required [00:23:39] yeah, i know, but i added them for fun :P [00:24:03] and i've quality checked all the country IDs (to make sure they're right) and categories (to make sure they don't have any unrelated crap) [00:24:47] awesome [00:24:50] rschen7754: Us admins split the work :P [00:24:57] :) [00:25:05] once the bot is approved by someone, ill copy them over and load them up [00:25:14] duh: Which bot? [00:25:18] Legobot 4 [00:25:25] all concerns have been addressed [00:25:37] and its been open for +1 week i think [00:25:38] was it clarified somewhere whether neutrals get counted in RfA? [00:25:47] neutrals aren't counted [00:25:57] duh: Shouldn't your RfA end later? [00:26:01] only on commons do neutrals count as a semi oppose or whatever [00:26:05] Ajraddatz: for what? [00:26:12] RfAs/RfPs [00:26:19] Hazard-SJ: yeah my RfA closes tomorrow i think. This is for a bot request. [00:26:22] Ajraddatz: ok, I was thinking in the Commons way [00:26:31] https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Legobot_4 [00:27:16] what's consensus for bots? everyone's happy? [00:27:25] yeah [00:27:27] duh, i assume that when the bot finds an article in more than one of those categories, it will just add the necessary item to the correct property? [00:27:30] I don't think it's been defined yet [00:28:14] Jhs: yes, it treats each template row thing as an individual request [00:28:22] so if you [00:28:33] |ignore an article from one, and not another, it only uses the list for that specific request [00:28:41] duh, closed as done [00:28:49] thanks [00:29:33] nice nice :D [00:31:09] ok gotta go afk for 10 min, and then ill come back and fire it up :D [00:32:12] yay! [00:33:47] Ajraddatz: I know [00:34:06] duh: Care to run another trial? :P [00:34:15] Hazard-SJ, then why did you ask O_o [00:34:18] Hazard-SJ: for which bot? [00:34:52] Ajraddatz: I was wondering if I made a mistake in closing one of the confirmations [00:34:55] duh: 4 [00:36:25] yeah once i go upstairs, i got stuck helping someone in #wikipedia [00:37:01] what a nice guy [00:37:06] OK [00:38:13] duh: Actually...since Ajraddatz closed it, it wouldn't exactly be a trial :P [00:38:39] when it comes to duh, trials are a bit pointless [00:39:00] he just sprinkles his fairy dust everywhere and it works :D [00:39:07] :D [00:39:12] my bot has only screwed up once on wikidata :P [01:04:58] Hazard-SJ: the CennoxX confirmaion caught my eyes, it would be unsuccessful if it were on Commons [01:05:39] don't know how others naturally expect [01:06:35] whym_away: I just closed them per https://www.wikidata.org/wiki/Wikidata:Administrators [01:06:55] At least 8 supports, and >= 75% support [01:07:27] I've never liked how commons does it with the neutrals. If I vote "neutral" on someone, I don't want it being counted against the person. 
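A rough sketch of the two counting conventions being compared here, purely to illustrate the arithmetic — the 8-support / 75% thresholds are the ones Hazard-SJ quotes above, and the function below is illustrative Python, not anything the wiki actually runs:

    def support_ratio(support, oppose, neutral, count_neutrals=False):
        # count_neutrals=False is the Wikidata reading described above
        # (neutrals are simply ignored); True approximates the Commons
        # reading, where neutrals are part of "those who voted" and so
        # drag the percentage down like weak opposes.
        voters = support + oppose + (neutral if count_neutrals else 0)
        return 100.0 * support / voters if voters else 0.0

    # Example: 9 support, 2 oppose, 3 neutral.
    # Neutrals ignored:  9 / 11 = 81.8%  -> passes an 8-support / 75% bar.
    # Neutrals counted:  9 / 14 = 64.3%  -> the same tally would fail.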
[01:07:34] duh: rivers of africa :/ [01:07:38] Feel free to propose modifications, whym_away [01:07:57] Ajraddatz: That's like counting them as weak opposes [01:08:18] yes, commons interepretation of "those who voted" include neutrals [01:08:25] a footnote on this might be a good idea [01:08:48] rschen7754: yeah i'm just reviewing the code since ive made quite a few modifications since i last ran it [01:08:53] ok [01:08:59] * Hazard-SJ blames duh's bot [01:09:30] Hazard-SJ: maybe we should schedule our bots to run at different times? [01:09:40] mine runs on the hour and apparently takes 3 minutes for the query [01:09:52] i can switch it to :30 [01:10:14] but [01:10:22] my bot does check that the item hasnt been reported yet [01:10:28] but what if it takes over an hour for the run? [01:10:35] oh, that's for the other task [01:10:59] rschen7754: since i use SGE, if a version of that script is already running, it wont start another process [01:11:05] oh [01:14:49] duh: Mine takes over 4 hours to run, and just stuffs on everything at once [01:14:56] ah [01:15:15] maybe check right before saving that the items havent been reported yet? [01:15:28] also, [01:15:34] why does it take that long? [01:16:07] duh: I'll ask it kindly to check that the item still exists just before requesting deletion [01:16:16] eh [01:16:16] duh: It has many wikis to check [01:16:20] i dont think thats a good idea [01:16:24] :/ [01:16:35] if you just check if it exists, you're not checking if it has any sitelinks [01:16:36] etc [01:16:49] and its possible a human already reported it [01:16:53] or my bot [01:17:08] its easier just to check if 'Q42' on rfd.get(): [01:17:16] duh: The sitelink check would have already been made by then [01:17:26] yes but its possible someone re-adds it in that much time [01:17:29] or something [01:17:36] im more concerned about the four hours part [01:17:54] my redirect bot takes 30 minutes to run an expensive query against all wikis [01:18:31] duh: Do you want to see the code? [01:18:41] yes please [01:26:04] * Hazard-SJ uploads code [01:27:10] duh: The statistics from the last run are in https://www.wikidata.org/?diff=6733513 [01:27:52] yeah thats a long time... [01:29:05] also [01:29:12] find a better spot than the sandbox to do that :P [01:31:53] duh: https://github.com/HazardSJ/Hazard-Bot/blob/master/Wikidata/delsitelinks.py [01:32:02] ok will look in a few minutes [01:32:07] OK [01:37:52] https://en.wikidata.org/wiki/Special:Contributions/Legobot :D [01:40:02] :D [01:40:40] great... [01:40:48] i just figured out theres a bug when the bot stops [01:41:56] How do you add a discription when there appears to be no edit button. I am able to type in the discription but have no way of saving it, closing the page loses the edit see "SpamCop" that has no description [01:42:27] dbiel: have you tried hitting "enter' [01:43:36] Yes, it appears to do nothing as well [01:43:44] dbiel: Just so you know, you can't be editing descriptions and, say sitelinks or labels at the same time [01:43:47] link to the specific item? [01:44:01] dbiel: Tried reloading the page? [01:47:24] Hazard-SJ: i think i found your problem [01:47:45] also your unicode support is…well bad. [01:48:21] if rfd != []: [01:48:22] ewww [01:48:24] if not rfd: [01:49:05] anyways [01:49:06] your main problem [01:49:07] I just found the problem as well, IE 8 does not seem to be supported. 
going over to Google Chrome the edit fields appear [01:49:14] dbiel: ah right yes [01:49:18] there are issues with IE8 [01:49:24] theres a thread on project chat about it i think [01:49:38] Thanks for the help [01:52:44] bye [01:54:53] anyways, Hazard-SJ your issue is with your generator [01:55:06] duh: You mean if rfd:? [01:55:08] you're just fetching say the last 200 deletion entries [01:55:14] Hazard-SJ: oh right, yeah if rfd: [01:55:58] you should check based on timestamp [01:56:12] like "get all entries since blah" [01:56:23] you should also filter undelete entries and such [01:58:53] duh: How do I even know if it is an undelete entry?:S [01:59:14] There isn't an undeletion log [01:59:27] you need to filter by logaction vs logtype [01:59:31] logtype="delete" [01:59:39] logaction="delete" or "undelete" [02:01:25] duh: pagegenerators.LogpagesPageGenerator(number = number, mode = 'delete', site = site) [02:01:48] pagegenerators.LogpagesPageGenerator doesn't support anything of the sort [02:02:05] yeah [02:02:19] you're goign to have to directly use the QueryGenerator [02:03:22] * yurik doesn't like pywiki implementation :( [02:03:25] duh: You mean query.GetData, right? [02:03:36] no [02:03:41] i mean the generator [02:03:42] let me find it [02:03:50] the generator automatically handles the query-continue for it [02:04:00] yurik: So fix it [02:04:27] Hazard-SJ: just like in biology, after a certain point, it is easier to have a new kid than to fix an old person :( [02:04:48] ….. [02:04:53] that was funny [02:05:09] there are many conceptual problems that i see with pywiki [02:05:40] a) pywiki overconcentrated on wrapping everything into objects and methods, at the expense of flexibility and efficiency [02:05:49] ^ [02:06:12] yurik: Would the new kid be the rewrite and the old person be the trunk? :P [02:06:16] it is almost impossible to use api to the fullest - like use the generator AND get langlinks AND get categories [02:06:23] unfortunatelly no [02:06:34] i looked closelly at it. it is much better, but the core problem remains [02:07:02] Hazard-SJ: https://github.com/legoktm/pywikipedia-rewrite/blob/master/pywikibot/data/api.py#L842 [02:07:04] yurik: the query.GetData allows that, doesn't it? [02:07:21] yeah but then you're just making direct API queries [02:07:22] Hazard-SJ: what do you mean? [02:08:51] duh: all those generators are not really generators - they simply do generator=list & prop=pageinfo, after which you have to do another query to actually get the properties you need [02:09:03] duh: That isn't in trunk as far as I'm aware [02:09:15] yurik: yeah but it does what Hazard-SJ needs [02:09:21] according to ori, most bots don't get more than one page at a time, resulting in a huge inefficiency [02:09:22] Hazard-SJ: ouch. well then GetData works I guess [02:09:27] yes! [02:09:36] thats the main issue [02:09:43] rewrite kinda helps with that [02:09:57] duh, i don't doubt that it "works", it is "how" it works is what causes grief :) [02:10:06] :P [02:10:21] the reason for all this is that pywiki has a huge legacy of dump processing and page scraping :( [02:10:50] rschen7754: can you "approve" your roads jobs and Jhs's rivers jobs (I checked them) by copying them over to the main page. I don't feel like logging into the bots account to do it [02:11:03] ok let me just vote for a bugzilla bug [02:11:13] …you do realize that does nothing right? [02:11:17] aww [02:11:23] it adds me to teh cc list [02:11:25] no one looks at those really [02:11:34] is there a log somehwere? 
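For reference, a sketch of the deletion-log filtering duh suggests above for Hazard-SJ's bot. It leans on the rewrite branch's api.QueryGenerator linked a few lines up and assumes its extra keyword arguments are forwarded as API parameters; the list=logevents parameters themselves (letype, lestart, ledir) are standard MediaWiki ones, and undeletions show up in that log with action "restore":

    # Rough sketch only: fetch deletion-log entries newer than the last run and
    # skip anything that is not an actual deletion. The 'wikidata' family and
    # the timestamp are placeholders.
    import pywikibot
    from pywikibot.data import api

    site = pywikibot.Site('wikidata', 'wikidata')
    last_run = '2013-02-17T00:00:00Z'   # timestamp of the previous bot run

    gen = api.QueryGenerator(site=site, list='logevents',
                             letype='delete',    # logtype
                             lestart=last_run, ledir='newer')

    for entry in gen:
        if entry['action'] != 'delete':  # filters out "restore" (undeletion) etc.
            continue
        title = entry['title']           # e.g. 'Q12345'
        # ...then re-check sitelinks/backlinks before reporting it anywhere...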
[02:11:38] of completed jobs [02:12:04] yes [02:12:15] IMO, the api should have been something like this: get a site object that self populates with the meta info, and expose a raw api object to do actions, and for query do an extra query object that returns merged page objects (in case the continue was called) [02:12:23] rschen7754: https://www.wikidata.org/wiki/User:Legobot/properties.js/Archive [02:13:43] duh: in other words: in python code it should look like this: [02:13:47] for result in pywiki.query( {'generator':'allpages', 'prop':'links'} ): [02:13:48] # process result data [02:13:51] yes [02:14:42] duh: done [02:14:48] where "result" is the dict returned from the server [02:15:10] rschen7754: er can you remove the html comments [02:15:24] this way there could be a wrapper that yields one *complete* page dictionary at a time [02:15:26] :( [02:15:38] just use find+replace [02:15:54] find: replace: '' [02:15:58] with regex enabled [02:16:47] i'll just do it by hand [02:17:02] api would handle continuing between requests, merging of incomplete page dicts (imagine if you request multiple properties and some get populated and some are only returned in the next call), and possibly even revision changes - api can detect if the page has changed between calls [02:17:26] but you get back a simple dict {} that describes everything you requested [02:17:42] hm [02:18:03] would that be done client side or server side? [02:18:17] https://www.mediawiki.org/wiki/Requests_for_comment/API_Future#Easy_continue [02:18:51] kinda both - the server will notify the client if the page is "incomplete" (although we can work around it even now with some extra python code) [02:18:51] oh right [02:18:56] thats on my to-read list. [02:19:07] see the light green samples [02:19:54] and the client will know how to handle basic things like errors (will raise exceptions), log warnings, but won't try to do complex object creation [02:20:29] i am seriously thinking of rewriting the framework from scratch, and simply reusing some components from pywiki [02:20:37] duh, should be good [02:20:45] thanks [02:20:53] np [02:20:56] soon ill write something in so the bot will ignore it [02:22:04] duh: since you wanted +2, and since i recently requested it too, it might be a good team project ;) [02:22:12] i dont want +2... [02:22:21] are you confusing me with hoo? [02:22:34] huh? [02:22:38] duh [02:22:45] hoo? [02:23:01] * hoo is now in confused mode [02:23:03] ok starting the bot again [02:23:27] * Hazard-SJ has to go [02:23:29] Bye [02:23:37] dope, there are two of you :( [02:23:41] :D [02:23:59] duh hoo! [02:24:04] i dont do much core hacking [02:24:37] mainly AbuseFilter stuff [02:25:04] duh: You could at some point go for +2 on AF... I'm atm doing that more or less alone [02:25:09] sorry for the confusion, although if you do bot hacking, you might have some feedback :) [02:26:11] hoo: eventually maybe. it would be bad to give +2 to someone who doesn't know php though :P [02:26:11] * yurik thinks it should be possible to comment on each revision people do... like in facebook :) [02:26:26] duh: You don't know php?! [02:26:33] php is evil, stay away [02:26:44] https://en.wikipedia.org/wiki/User:Legoktm/php-0 [02:26:55] * yurik is secretly planning to rewrite the entire MW in C# [02:27:06] yurik: someone is planning to do it in python actually [02:27:09] yurik: Go for Python and I'm in [02:27:11] :D [02:27:26] it was on labs-l [02:27:33] duh: Whoops, things I shouldn't have said loud :D [02:27:39] hahah [02:27:40] hehe. 
I spoke with .... the guy who is a dev in PHP, forgot his name, and he was thinking about it [02:27:59] i mean - he is a dev of the php itself [02:28:23] the idea was to start migrating by rewriting api in php [02:28:35] yurik: He was thinking of porting MW away from PHP? [02:28:36] and then some other services (parser, etc) [02:28:39] yep [02:28:44] :D [02:28:55] it is a dead end unfortunatelly :( [02:29:13] you just can't do enough CPR to bring to life :) [02:29:24] broken by design :) [02:29:31] my fav - the array() object [02:29:43] the most hated thing imo [02:29:56] * yurik stops ranting [02:31:00] i still think c# is better - it forces you to write proper code for large systems, and it handles upgrades more gracefully [02:31:06] * yurik stops ranting again [02:35:39] * Jeblad_WMDE yawns [02:36:45] rschen7754: https://www.wikidata.org/w/index.php?diff=6740311&oldid=6738657&rcid=6753075 [02:37:16] Jeblad_WMDE: hi, would you know when i can expect https://gerrit.wikimedia.org/r/#/c/49205/ to get deployed? [02:39:16] I'm not sure if it is on the list of things to get backported, if not it will take a little over two weeks [02:39:43] who do i need to talk to to get it backported? [02:40:06] because bots should be able to edit as bots [02:40:12] mine is filling up recentchanges right now [02:40:37] I don't think its on the prioritized list, .. [02:41:59] you're the one who marked it as "high" priority :P [02:46:35] duh: ok cool [02:46:37] th [02:46:38] x [02:47:32] woot [02:47:34] my bot didnt crash [02:47:50] i was afraid Jhs's name would kill it [02:48:02] yay :D [02:48:31] rschen7754: can you copy over one of my jobs please? i want to make sure the bot can handle other people editing the page when its running [02:48:43] ok [02:49:40] done [02:49:41] duh: setting it to high doesn't mean it goes in, just that someone should do the necessary job before other stuff that is flagged as normal [02:50:34] Jeblad_WMDE: ok makes sense. but so how do i get it backported? its a rather simple fix that is needed on the live wiki soon [02:54:00] aaah duh abort [02:54:13] uhoh.... [02:54:15] i killed it [02:54:19] :( [02:54:22] whats up? [02:54:58] i somehow put in the property "is a" instead of "country" in the field� :/ [02:55:03] duh, the list of things to be backported is very long [02:55:52] And it is other things with higher priority [02:56:05] But perhaps, I don't know [02:56:47] Jeblad_WMDE: well i'm pretty there are way more important things [02:56:51] "is a" is problematic and I don't think it should be used without some clearification [02:57:13] Jhs: had the job already run/was running? [02:57:17] rschen7754 can rollback the bot [02:57:22] yeah, but only on a few pages [02:57:23] :/ [02:57:25] i can fix it by hand [02:57:29] ok [02:57:33] we need detachable rollback [02:57:43] New patchset: Hoo man; "Log API errors in JavaScript to console if in debug mode" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/48968 [02:58:39] New review: Hoo man; "Patch Set 4:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/48968 [02:58:57] Jhs: ok can you let rschen7754 which jobs need to be fixed/removed before i start the bot up again? [02:59:00] New review: Hoo man; "Patch Set 3:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/48968 [03:00:53] :o jhs can edit .js pages? [03:01:06] couldn't i? [03:01:17] non admins cant... [03:01:21] oops, i just misused global sysop rights [03:01:22] maybe you have it globally? 
[03:01:23] :P [03:01:24] oh well [03:01:33] i dont care [03:01:48] and moving them to the bottom doesnt work since the bot loads the job page all at once [03:01:57] so just stick a # in front of them for now [03:02:00] to comment them out [03:02:27] jhs mostly do things well, mostly.. [03:02:31] oh, that's who jhs is… :P [03:02:49] * Jeblad_WMDE plan to kick jhs butt the next time he is in Norway [03:02:57] hæhæ [03:04:30] jhs, it would be nice if you could take time to proofread the translations to norwegian [03:04:43] Jeblad_WMDE, yup. tomorrow maybe [03:04:50] Some of the new english source messages are a bit weird [03:06:21] Jhs: can you comment out the jobs you moved to the bottom so the bot wont accidentally run them? [03:06:27] or are those safe to run? [03:07:05] they are safe, yeah [03:07:17] ok [03:08:15] rschen7754: propose detached rollback on PC? [03:08:25] duh: it already failed [03:08:30] oh [03:08:46] One of the extraordinary weird messages https://translatewiki.net/wiki/MediaWiki:Wikibase-snakview-variation-datavaluetypemismatch-details/en [03:09:03] Its kind of cute [03:10:44] heh [03:10:56] "property's data type's data value type" [03:10:58] Jeblad_WMDE, lol [03:12:24] Verdiens dataverditype «$1» passer ikke med egenskapens datatyps dataverditype «$2» [03:12:40] it sounds like an Øystein Sunde song [03:12:42] its a property that has a datatype, but the datatype of the datavalue does not match up [03:13:01] It needs rewording [03:13:36] indeed [03:13:48] rschen7754: is https://www.wikidata.org/wiki/Q408192 a property of itself? [03:13:51] er like [03:14:03] ooh [03:14:05] no [03:14:10] oops [03:14:11] * duh goes to fix recursion [03:14:20] lol [03:14:35] recursive items are recursive [03:16:05] is https://www.wikidata.org/wiki/Q179976 right? [03:16:26] oh [03:16:28] no that doesnt work [03:16:29] ugh [03:16:42] Just as a note; "is a" -type of properties are transitive, and we don't have code to support them [03:17:07] Recursion: See Self-Referencing [03:17:24] Self-Referencing: See Recursion [03:17:38] To handle transitive properties is difficult and it is not done in a few days to implement them [03:17:51] Error. Stack overflow.
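To make the transitivity point concrete: supporting an "is a"-type property means computing a transitive closure over item-to-item claims, and a naive recursive walk is exactly the stack-overflow joke above — you need cycle handling and, at Wikidata's scale, something iterative and bounded. A toy Python sketch over a made-up in-memory graph, not any Wikibase API:

    def is_a_closure(graph, start):
        # Everything reachable from `start` by following "is a" edges.
        # Iterative with a seen-set, so cycles (recursive items are
        # recursive) cannot blow the stack.
        seen, stack = set(), [start]
        while stack:
            item = stack.pop()
            for parent in graph.get(item, ()):
                if parent not in seen:
                    seen.add(parent)
                    stack.append(parent)
        return seen

    # Toy data: Q42 is a human, a human is a mammal, a mammal is an animal.
    graph = {'Q42': {'human'}, 'human': {'mammal'}, 'mammal': {'animal'}}
    assert is_a_closure(graph, 'Q42') == {'human', 'mammal', 'animal'}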
[03:18:30] It is necessary to implement that part to make "is a" work for some types of properties [03:18:50] or rather parameters in templates on wikipedia [03:19:08] Thats why "is a" should not be used [03:21:23] hm [03:21:31] i cant figure out a better way to make the recursion joke [03:22:37] public static void recurse(){recurse();} [03:22:46] duh, i really think Google nailed it http://www.google.co.uk/#hl=en&tbo=d&output=search&sclient=psy-ab&q=recursion&oq=recursion&gs_l=hp.3..0l4.978.2105.0.2177.9.8.0.0.0.0.158.850.5j3.8.0.les%3B..0.0...1c.1.3.psy-ab.NTfEcP5VAqo&pbx=1&bav=on.2,or.r_gc.r_pw.r_qf.&bvm=bv.42553238,d.bGE&fp=27da8d583881e653&biw=1366&bih=682 [03:23:00] yeah, i was trying something like that [03:23:09] (hmmm� i have no idea why the url for a simple google search is so long) [03:23:25] most of it is analytics, its just the &q=recursion [03:23:42] i was only able to come up with is a [03:23:45] which doesnt work [03:29:29] Imagine implementing sizzle for Wikidata [03:29:39] Not sure if it is possible at all [03:30:52] * Jeblad_WMDE plan to do some work on the sleep protocol [04:06:31] New patchset: Hoo man; "(bug 45043) Overhauled site input field" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49460 [04:10:14] New patchset: Hoo man; "(bug 45043) Overhauled site input field of wbclient.linkItem.js" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49460 [04:50:36] does wikidata parse titles with lang+site properly? Something like [[w:en:API]] --> enwiki / API [04:54:00] duh: i am thinking of your idea, and i can't come up with any scenario useful for the client [04:54:13] What do you mean? [04:54:31] well, what kind of request would a bot/user need? [04:54:38] translation. [04:54:40] Its easy [04:54:47] can you give an example? [04:54:49] You store each Qid as an "object" [04:54:54] user clicks on it [04:54:58] depending on the language set [04:55:03] you show them the content in that language [04:55:16] wait wait, is that in a client or repo? [04:55:27] which site? [04:55:46] i'm failing to see the possible usage [04:55:48] 3rd party [04:55:50] I dunno [04:55:54] Just an idea. [04:55:57] Mainly [04:56:07] no,it might be good, i am just trying to figure it out [04:56:09] If my bot has the item # [04:56:15] And I want enwiki's page content [04:56:25] I need to look up the enwiki sitelink, then another request for page content. [04:56:54] ok, but how would you have that Q#? [04:57:02] Because of properties [04:57:03] how would bot get that list [04:57:26] Say I want list of males born in DC or something stupid [04:57:33] That would return me IDs. [04:57:55] who would return the ids? 
wikidata will give you the list of titles in any lang you want [04:58:50] I'm not sure [04:58:58] I thought there was going to be a query interface for stuff like this, [04:59:19] there will be, i'm just trying to understand which queries will be useful [04:59:40] maybe the whole wbtsearch (as described https://www.mediawiki.org/wiki/Requests_for_comment/Wikidata_API ) [04:59:44] can be made remotable [05:00:04] so that anything it can do you can use from any site [05:00:12] and get pageids back [05:00:17] of the local site [05:00:30] (usually would work as a generator{) [05:02:32] hm [05:02:33] idk [05:02:42] personally i try and use the numbers as much as possible [05:02:49] because it means i dont have to worry about unicode [05:03:02] numbers is fine (although unicode should really be transparent for you) [05:03:07] :P [05:10:28] Jeblad_WMDE: you awake?> [05:31:36] Tedium: http://www.wikidata.org/wiki/Special:Contributions/Sven_Manguard [05:35:17] you know... [05:35:26] you could have gotten a bot to do that? [05:36:13] https://www.wikidata.org/wiki/Wikidata:Bot_requests [05:42:41] ahhh rschen7754 you screwed up!!!! [05:42:54] look at the first template line [05:42:55] https://www.wikidata.org/wiki/User:Legobot/properties.js [05:42:56] ? [05:43:05] its missing the second } [05:43:06] :( [05:43:14] haha i was like wtf kind of error is this [05:43:15] sorry lol [05:43:20] no worries [05:43:22] in the future [05:43:22] use [05:43:32] https://www.wikidata.org/wiki/User:Legobot/properties.js/Nice_looking [05:43:39] it shows it rather obviously :P [05:43:52] ok [05:45:25] rschen7754: so….can you fix it plx? :) [05:45:39] they need to sysop you :P [05:45:47] done [05:45:57] Thanks [05:46:07] and sven already put it on meta :P [06:29:18] My bot just edit-conflicted with itself. [06:29:27] Or at least pywikibot thinks it did. [06:29:30] Stupid stupid stupid. [06:33:37] force=True [06:33:54] Will that fix it? [06:34:04] * duh checks [06:34:14] something like that, yes [06:34:27] ugh [06:34:39] i shut down my IDE since it was using 1.5GB of ram [06:34:46] ! [06:34:54] get more ram, it's cheap [06:35:05] My laptop already has 8 [06:35:50] 8GB is like 640KB... [06:36:01] lolwut [06:36:56] nope [06:36:59] in rewrite its different [06:37:01] if not force and not self.botMayEdit(): [06:37:06] botMayEdit() == {{bots}} [07:16:00] Wat [07:16:03] This is stupid [07:16:06] It makes the edit. [07:16:11] And then thinks it editconflicted. [07:16:16] yurik: aklsdhfkjsdhfsdf [07:16:34] could you be a bit more specific? [07:16:47] oh, what you said... [07:16:49] hm... [07:16:59] yes [07:17:03] It just made https://www.wikidata.org/w/index.php?title=User:Legobot/properties.js/Archive/2013/02/17&diff=prev&oldid=6757596 [07:17:06] there was a discussion about it a bit ago [07:17:08] WARNING: Waiting 5 seconds before retrying. [07:17:09] WARNING: Waiting 10 seconds before retrying. [07:17:09] WARNING: Waiting 20 seconds before retrying. [07:17:15] no clue what that even means [07:17:20] The edit went through. [07:17:37] oh, it means that it doesn't like you [07:18:20] ok, so what's the issue [07:18:30] i'm a bit confused - the fact that it postponned the edit? [07:18:39] that seems to be a recent development [07:18:54] WARNING: Waiting 40 seconds before retrying. 
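One defensive pattern for the retry loop above: when the server times out after the write has already been committed, a blind retry produces exactly this "edit-conflicted with itself" symptom, so re-fetch and compare before trying again. A hedged pywikibot-flavoured sketch; method names follow the rewrite branch of that era and should be treated as approximate:

    import pywikibot

    def put_with_check(page, new_text, comment):
        # Save, but if the server response is lost, verify whether the
        # write actually landed before retrying it.
        try:
            page.put(new_text, comment=comment)
        except pywikibot.Error:                   # timeout / dropped response etc.
            fresh = page.get(force=True)          # bypass the cached copy
            if fresh == new_text:
                return                            # the first attempt succeeded
            page.put(new_text, comment=comment)   # genuine failure: retry once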
[07:19:02] definitly doesn't like you [07:19:03] The issue is that it thinks the edit hasn't gone through [07:19:11] ^C [07:19:11] i really don't know :( [07:19:14] i'm sorry :( [07:34:19] ugh its happening again [07:34:35] i bet its because its a large edit or something. [07:36:36] Though when I ^C [07:36:50] The traceback is waiting for a thread [07:36:55] Maybe something is timing out? [07:49:21] rschen7754: so apparently my method of archiving right now is making the bot timeout. [07:49:29] er mediawiki is timing out [07:49:33] :/ [07:49:33] any suggestions? [07:49:37] the long lists i take it? [07:49:51] yeah [07:50:00] i just tried removing a bunch of stuff manually [07:50:02] and got the timeout [07:50:13] how do pathoschild and hoo do it? [07:50:37] they probably use a better bot framework [07:50:44] Oh and they use html tables [07:50:50] Not wiki tables [07:50:53] But I hate that [07:51:19] maybe if i subst: the /rows? [07:52:27] on the archive page? [07:53:01] yeah [07:53:17] because each /row has 5? parser functions i think [07:53:20] maybe? [07:53:22] ! [07:53:32] which is like 5*1000 [07:53:55] Actually not [07:53:57] There are just [07:54:04] Too many rows to do it with right now [07:54:05] Ugh [07:54:45] I'll just use mysql. [07:56:37] also, [07:56:41] rollback is super useful [07:58:35] Heh [07:58:39] This would let me optimize so much more [07:59:50] also rschen7754, if you're not busy, what do you think of doing an announcment of this on project chat? right now we just need more people to find categories and write up the requests [08:00:06] yeah, project chat should be good [08:00:33] do you mind writing one up? [08:00:39] i want to get this stupid archive thing fixed [08:00:47] ok [08:02:20] done [08:03:12] * duh was thinking of something a bit longer, but that works too :P [08:26:17] is the deletion of properties common? [08:26:36] it's generally discussed a lot more [08:26:37] do we need a bot until https://bugzilla.wikimedia.org/show_bug.cgi?id=44639 is fixed? [08:26:45] because if "is a" gets deleted [08:26:47] yay...... [08:26:56] it's a known bug [08:27:02] you have to unlink before deleting [08:28:03] yeah [08:28:07] so if you have a bot do it [08:28:10] its easier [08:28:21] true [08:28:49] no clue if the API can even do that [08:28:54] removing sitelinks is a hack [08:28:58] you just set it to null [08:29:04] lol [09:07:46] hello? [09:07:52] hi [09:08:34] wikidata takes charge or interwiki [09:08:36] so [09:09:01] interwiki will override wikidata [09:09:05] yes [09:09:14] but, for example [09:09:25] when new article is created in other wikipedia [09:09:52] intewiki is not added [09:09:55] not mind [09:09:55] yes [09:09:56] mine [09:09:59] you need to do it on wikidata [09:09:59] but bot [09:10:01] is [09:10:03] manually [09:10:27] bot [09:10:31] doesn't work [09:10:34] right? [09:10:38] no, they wont [09:10:45] there are bots that import interwikis into wikidata though [09:10:53] but [09:10:56] for example [09:11:00] ko [09:11:14] wikipedia (korean wikipedia ) [09:11:32] wikidata isnt on korean wikipedia yet [09:11:36] [[라텐베르크]] (which is not created by me but anyway) [09:11:38] it will happen in a few weeks [09:11:45] doesn't added [09:11:55] in [[Rattenberg]] [09:12:02] en.wikipedia.org/Rattenberg [09:12:21] it needs to be added on wikidata [09:12:24] see the "edit links"? 
[09:12:25] yes [09:12:27] but [09:12:33] i mean [09:12:35] korean wiki [09:12:44] doesn't introduced [09:12:49] right [09:12:51] wikidata yet [09:13:07] how the new articles will then [09:13:11] https://www.wikidata.org/w/index.php?title=Q261778&diff=6765217&oldid=4085185 [09:13:16] it's impossible [09:13:17] you need to add them to wikidata [09:13:26] without the bot [09:13:32] but [09:13:46] before add myself [09:14:11] they will not add [09:14:13] right? [09:14:34] maybe you can ask on https://www.wikidata.org/wiki/Wikidata:%EC%82%AC%EB%9E%91%EB%B0%A9 ? [09:14:42] i dont think i understand your question [09:15:28] um wait [09:15:54] you're a korean speaker right? [09:16:07] i will arrange(?) my hought, [09:16:14] yes [09:16:35] ok [09:16:54] if you ask on that page, someone can answer your question in korean [09:18:39] but i think there are few people [09:19:14] wait [09:19:24] quick question: [09:19:36] before introduce wikidata [09:19:49] the bot added automatically [09:19:59] yes [09:20:03] but now [09:20:16] when new article is created [09:20:23] now you just edit wikidata, and it shows up on all projects at once [09:20:30] yes [09:20:31] but [09:20:49] why bot doesn't add in wikidata now? [09:21:04] new articles are created but not connected [09:21:12] lots of articles!! [09:21:16] yes [09:21:19] its a problem [09:21:27] there arent enough bot runners to cover all projects [09:22:46] so, from now, add passively [09:22:53] it's impossible [09:23:08] not impossible [09:23:15] just someone hasnt written a bot to do it yet [09:23:24] our resources are spread very thinly [09:23:58] there are around ~20 bot operators trying to cover over 4 million items and 286 languages [09:23:58] which wikipedia introduced wikidata except english wikipedia [09:24:09] english, he, hu, it [09:24:24] it means italian [09:24:30] hu is hungarian [09:24:34] he = hebrew [09:24:55] rschen7754: when are other projects going live? [09:25:01] mar 6 [09:25:18] so all other projects get added march 6 [09:27:22] then except that language wikipedia, it will be added ? [09:27:53] i dont understand your question… [09:28:55] uh i mean [09:29:19] except that 4 language wikipedias, intewiki will be addeD? [09:29:30] yes [09:29:36] they try to get all projects [09:29:36] uh [09:29:40] but it may not happen all the time [09:29:50] you have a bot [09:29:58] see how https://www.wikidata.org/wiki/Q261778 has ko [09:30:03] because i added it? [09:30:09] so the links just need to be added [09:30:30] can you run a bot [09:30:38] such as this problem [09:30:49] i'm working on a solution right now [09:30:53] its just not finished [09:31:30] it's just done by personally [09:31:35] not bot [09:32:32] ? [09:32:35] duh? [09:32:41] its done by both [09:32:53] the problem with wikidata is that everyone has todo lists a mile long [09:33:11] we don't even have bureaucrats [09:33:24] or half the policies that a normal WMF wiki has [09:35:36] what does mean "has todo lists a mile long" [09:35:55] we have things we want to do, we just dont have time to do it [09:36:08] there's a lot of stuff to do [09:36:10] more than we have time for [09:36:15] yes [09:36:20] so you'll have to be patient [09:36:23] i understand [09:36:56] but over 4 million articles of en.wiki [09:37:37] when the time takes 1 sec for one article, [09:37:45] 4mill sec [09:39:08] 2777 days? [09:39:15] is it? [09:42:04] maybe :( [09:42:12] many enwiki already have articles [09:42:18] but it will be a slow process [10:27:44] hello. do you have an analogue of {{db-self}}? 
[10:28:47] no [10:28:49] whats the item? [10:28:53] i can delete it for you [10:29:24] the problem is we cant categorize or add templates, so we just have WD:RFD [10:29:26] [[Q4804079]] (a duplicate) [10:29:49] done [10:30:46] thanks. [10:31:16] np [10:32:17] I realize that this media is not a template-controlled one, but I asked about a procedure analogous to WP:CSD G7, not about a template, of course. [10:32:41] ah yeah [10:32:49] its just ask for it to get deleted, and it will :P [10:33:20] when it comes to items, either something should be kept, or it should be deleted [10:33:30] there really is no questionable area [11:04:34] duh, i just added a huge job for your bot on the talk page :D [11:04:48] yay! [11:04:52] its currently broken right now [11:05:03] its rather funny actually [11:05:37] the logging mechanism puts everyday on its own page [11:05:45] however there were enough requests yesterday [11:05:54] that it now when you try and edit the page [11:05:57] the edit goes through [11:06:01] but you get the "timeout" error [11:06:06] so pywikibot tries saving again [11:06:07] and again [11:06:40] so im switching it to log to a mysql database [11:06:52] and will eventually write a webtool for people to look at the log [11:08:58] Jhs: hmmm my bot does not have apihighlimits on svwiki so it will take longer. do you think they would be willing to flag it as bot just to get data faster? [11:10:17] duh, probably. but why not get global bot status? it won't actually change anything on svwiki or other wikis, so the status will only be to get the limit, right? [11:10:34] oh right, that would be better :P [11:10:47] can i just make a request on the requests page? [11:11:15] or do i need approval/discussion first? [11:11:35] hmm, maybe even a second global group with only apihighlimits could be better. that would make it much easier for Wikidata bots that aren't going to do other stuff [11:12:21] that would work too [11:12:36] it would also work on wikis opted out of global bots [11:13:16] yeah! [11:13:30] i'll go ahead and suggest that [11:14:07] ok thanks :D [11:19:17] there already is https://meta.wikimedia.org/wiki/Special:GlobalGroupPermissions/API_High_Limit_Requestor [11:19:25] it's only used for one bot, FacebookBot though [11:19:29] and has no documentation really [11:21:58] ah [11:22:05] well should I make a request for that then? [11:22:15] thats for the Facebook community pages [11:23:45] yeah, i think you could try that. https://meta.wikimedia.org/wiki/Steward_requests/Global_permissions#Additional_global_rights [11:24:47] ok ill file a request in a bit [11:25:35] (Y) [11:43:48] duh: You were playing around with pywikidia + wikidata too right? [11:43:52] yes [11:43:57] my bot runs on it [11:44:17] I'm looking at the code and it's ehmm, ba [11:44:20] bad [11:44:34] https://github.com/legoktm/wikidata [11:44:52] * aude waves [11:45:06] i have a fork that ill merge back into rewrite once its clean https://github.com/legoktm/pywikipedia-rewrite/tree/wikidata [11:45:09] * duh waves to aude  [11:45:30] duh: i think we can backport your patch :) [11:45:30] If I look at say for example https://www.wikidata.org/w/api.php?action=wbgetentities&sites=enwiki&titles=Duke%20Maximilian%20Joseph%20in%20Bavaria&languages=en [11:45:43] And the datamodel I now get in pywikipedia than it's quite different [11:46:01] aude: awesome! 
Jhs just gave my bot 130k+ edits to make, so it will be appreciated [11:46:09] :D [11:46:29] multichill: yeah my bots just use the data straight from the wikidata api [11:46:41] duh, actually it's double the amount of edits since there are two properties to fill :P [11:46:48] unless you have some way of doing them at the same time [11:46:50] heh [11:46:54] im working on doing that [11:47:06] we can do both at once [11:47:09] because doing it twice over seems silly [11:47:14] usually [11:47:29] we, errr bots [11:47:32] I would expect it to be more like https://www.wikidata.org/w/api.php?action=wbgetentities&sites=enwiki&titles=Duke%20Maximilian%20Joseph%20in%20Bavaria&languages=en&format=json [11:47:51] duh: I want proper abstraction [11:48:07] aude: it would be nice if the API had some documentation… https://www.mediawiki.org/wiki/Extension:Wikibase/API is rather bare [11:48:25] :( [11:48:29] * multichill just reads the code [11:49:39] that's is something Jeblad_WMDE can probably help with or else i take a look when i can [11:50:21] btw, what do you think of about https://bugzilla.wikimedia.org/show_bug.cgi?id=45093 ? [11:52:19] duh: right now i don't think descriptions parse wikitext at all [11:52:38] it might take a bit of work to implement it but submitting a bug for this is helpful [11:52:40] yeah well i wasnt expecting it to be like real templates [11:52:46] but just a template-like system [11:53:04] duh: Anyway, too much core hacking for me. I just wanted to make a who's your daddy bot. [11:53:13] ahahaha [11:53:18] * aude thinks templates for items would be nice too, but maybe not trivial to implement and requires discussion [11:53:24] Extracting the data is easy, inserting is easy, just checking if it's not already there is a PITA [11:53:30] PITA? [11:53:33] e.g. a movie probably has the same properties, generally [11:53:42] duh: pain in the ass :D [11:53:46] :P [11:53:52] my bot does that [11:53:57] https://www.wikidata.org/wiki/Special:Contributions/BotMultichill [11:54:46] hm [11:55:19] multichill: https://github.com/legoktm/wikidata/blob/master/wikidata_properties.py#L175 [11:55:26] thats how my bot checks [12:13:47] New review: Aude; "Patch Set 9: Code-Review-1" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/46294 [12:15:50] New review: Aude; "Patch Set 9:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46294 [12:16:23] what's with gerrit :o [12:16:53] it doesn't like cut & paste of stack traces nor my pastebin link [12:17:00] heh [12:17:27] * aude sure stack traces were fine before [12:17:45] anyway, looking at autocomments now :) [12:40:08] duh: https://www.wikidata.org/wiki/Special:Contributions/BotMultichill <- seems to work well [12:40:17] New review: Aude; "Patch Set 9:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46294 [12:40:24] awesome! [12:40:36] where are you getting the data from? 
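The "is it already there?" check multichill calls a PITA above, reduced to its smallest form using the same wbgetentities module quoted earlier — raw urllib rather than any framework; 'Q42' and 'P17' are placeholder ids, and older serialisations used lower-case property keys, hence the .upper():

    import json
    import urllib
    import urllib2

    params = urllib.urlencode({
        'action': 'wbgetentities',
        'ids': 'Q42',
        'props': 'claims',
        'format': 'json',
    })
    data = json.load(urllib2.urlopen('https://www.wikidata.org/w/api.php?' + params))
    entity = list(data['entities'].values())[0]
    claims = entity.get('claims', {})

    if any(prop.upper() == 'P17' for prop in claims):
        print('claim already present, skipping')
    else:
        print('no claim for that property yet')   # safe to wbcreateclaim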
[12:40:51] English Wikipedia [12:41:06] The royal infobox [12:41:20] ah ok [12:41:39] i was going to implement that into my property bot next [12:42:14] This is just a proof-of-concept [12:42:23] ah ok [12:42:34] The code is really really awful [12:42:40] heheh [12:43:12] I'm only going up in the tree, not adding backlinks [12:44:16] LOL, all the biblical figures are also in there [12:44:35] :P [12:45:02] you should probably file a bot flag req too [12:45:11] i just gave it autopatrolled [12:45:14] oh right [12:46:24] Debugging is quite anoying without diffs btw [12:46:42] yeah [12:47:04] I noticed someone messed up a family tree, but I couldn't find out who did it [12:53:59] multichill: diffs should be available tomorrow night [12:54:21] oh yay :) [12:54:28] :) [12:56:51] That would be nice. Hunting down errors (or worse, vandalism) is hard right now [12:57:05] duh: https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/BotMultichill [12:57:19] requests go on the bottom :P [12:57:31] will comment in a minute [12:57:33] I always put new ones at top [12:57:45] Common(s) practice ;-) [13:00:18] Weird that https://en.wikipedia.org/wiki/Cleopatra isn't in wikidata [13:01:00] someone add it [13:01:32] https://www.wikidata.org/wiki/Q4812189 [13:01:48] importing the rest of the links [13:01:52] oh wait [13:02:03] ugh [13:02:04] dupes [13:02:09] https://www.wikidata.org/wiki/Q635 [13:02:09] fuck [13:02:41] [[Q4812189]] righto [13:02:43] to [13:02:44] trash [13:03:05] https://www.wikidata.org/wiki/Q635 is probably just an interwiki conflict [13:03:37] multichill: no it was a redirect on enwiki [13:05:40] multichill: commented [13:06:08] Ah, I see [13:06:53] Would be a pretty awesome graph when all these family ties are filled in :-) [13:07:20] New review: Daniel Kinzler; "Patch Set 9:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46294 [13:11:26] New patchset: Aude; "Make ItemChange more robust to handle diffOp objects or arrayized diffOps" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49482 [13:16:12] New patchset: Aude; "Make ItemChange more robust to handle diffOp objects or arrayized diffOps" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49482 [13:26:03] duh: Getting "createclaim-save-failed" errors from the api for Cleopatra [13:26:20] what happens if you do it manually? [13:26:27] Works [13:26:30] hm [13:26:45] i think theres an open bug for the fact that API errors dont give enough info [13:26:54] {u'servedby': u'mw1190', u'error': {u'info': u'Failed to save the change', u'code': u'createclaim-save-failed'}} [13:27:33] For some reason I think it has to do with the fact that we just changed it [13:28:16] i dont think my bot has run into any of those errors yet [13:47:50] multichill: your unflagged bot runs too fast at the moment [13:48:16] That's just me typing really fast [13:48:36] https://www.wikidata.org/wiki/Special:Contributions/BotMultichill <- this? [13:48:54] I don´t think so [13:49:03] :P [13:50:17] New review: Aude; "Patch Set 2:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49482 [13:51:15] I should be more careful not to get RSI. I added 30 second microbreaks. 
Bit anoying that throttling wasn't properly implemented [13:52:01] New review: Aude; "Patch Set 9:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46294 [13:52:09] haha [13:52:23] multichill: throttling is based on action [13:52:35] No, I mean in Pywikipedia [13:52:39] so you need to add "wbcreateclaim" to the throttle [13:52:42] me too :) [13:53:01] The function should check the throttle before doing the action, but it's not in there [13:53:26] Nothing some good old time.sleep(30) hacking can't solve [13:53:30] in the rewrite branch, its hidden in ./data/api.py [13:53:42] its just a list of actions = ['edit','move'] blah [13:54:26] Why do I see https://www.wikidata.org/wiki/User:Legobot in RC? [13:54:33] New review: Daniel Kinzler; "Patch Set 2:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49482 [13:54:59] multichill: because right now setting claims doesnt support marking as bot [13:55:05] cute [13:55:12] so i wrote a patch for it, which needs to get backported [14:02:17] Why did I see legobot here. Is it only on my page ? http://www.wikidata.org/w/index.php?title=Special:NewPages&hidebots=1 [14:02:30] err [14:02:32] hmmm [14:02:46] * duh looks [14:03:17] ah my bad [14:03:22] i think it wasnt sending &bot=1 [14:03:29] * duh punches himself [14:04:37] sorry about that [14:08:57] New review: Daniel Kinzler; "Patch Set 2:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49482 [14:13:01] New patchset: Daniel Kinzler; "(bug 43990) Robust serialization of change objects" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46294 [14:14:16] New review: Daniel Kinzler; "Patch Set 10:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46294 [14:26:19] rschen7754|away: fyi for the future https://www.wikidata.org/w/index.php?title=User_talk%3ALegobot%2Fproperties.js&diff=6789625&oldid=6788901 [14:32:04] duh: Do you plan on marking persons? [14:32:16] yes [14:32:33] i just forgot to add that type [14:32:42] New review: Jeroen De Dauw; "Patch Set 10:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46294 [14:32:54] The type description is horrible [14:33:00] though i think it will be easier to use larger categories [14:33:02] 'GND ontology high level entity' [14:33:06] like Category:Living people [14:33:12] yeah that makes no sense imo [14:33:25] Of course. It's like a pie, start with the big pieces [14:33:58] apparently svwiki has 2 big categories for males and females that Jhs found :D [14:34:50] oh and [14:34:55] I applied for global apihighlimits [14:35:13] because its likely we'll be dealing with big categories [14:35:30] duh, i think German has the same categories (with the same criteria: only real people), so there's ~500 000 more pages for you :) [14:35:35] i can add them to the request pile ;) [14:35:39] yes please! [14:37:41] apihighlimits ? [14:37:55] Did you ever hit the api limit? [14:38:03] if the category is over 500 pages yes [14:38:57] You don't get them in batches? [14:39:05] you do [14:39:11] but each batch can only be 500 [14:39:17] with apihighlimits i can go up to 5000 [14:41:08] Jhs: you know there are a ton more here: https://www.wikidata.org/wiki/Q1410641 …i wonder how many of those pages are going to be duplicates though [15:08:44] hello [15:08:53] hi [15:09:41] what is the right place to report a bug? There is a change that I should be able to make but I can't. 
[15:10:13] "Site link already used" false positive [15:10:31] New review: Jeroen De Dauw; "Patch Set 11:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/48459 [15:12:53] rinaku: link to the items? [15:13:31] duh: on http://www.wikidata.org/wiki/Q11465 I can't add [[fr:Vecteur vitesse]] [15:13:42] ok [15:13:47] because [[fr:Vitesse]] exists on http://www.wikidata.org/wiki/Q3711325 [15:14:18] hm [15:14:38] wait [15:14:42] Vecteur vitesse is a redirect. [15:15:05] oh [15:15:12] I didn't realize [15:15:13] sorry [15:15:20] does fr not have separate pages for speed and velocity? [15:15:21] no worries [15:15:28] duh: apparently not [15:16:02] weird. [15:36:33] SlurpInterwiki isn't a default, right? [15:38:13] Don't think so. [15:45:38] * duh grumbles about edit conflicts [16:11:26] duh, sorry, didn't see your previous message till now. yeah, there are a many more, but none are as disciplined as the German and Swedish ones [16:11:44] the German and Swedish ones _only_ contain articles, and only about real people [16:11:55] ah ok [16:12:02] and they span all biographies on those projects [16:12:07] i wonder [16:12:17] should we set the gender for fictional people? [16:12:27] New patchset: Aude; "Make ItemChange more robust to handle diffOp objects or arrayized diffOps" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49482 [16:12:31] i don't think so [16:12:49] many of the others are severely incomplete, have subcategories with unrelated articles (e.g. enwiki's Category:Men's studies) [16:13:27] well, for fictional people, i'm not sure [16:13:43] i prefer not to think about it :P [16:17:47] ill start a discussion on the talk page :P [16:23:42] duh: According to the enwp article, all royalty decents from http://www.wikidata.org/wiki/Q3044 . Wonder if we can proof that with Wikidata ;-) [16:23:55] woah. [16:24:00] that would be pretty cool :D [16:40:31] duh: I forked Magnus' tool to only include children. http://toolserver.org/~multichill/ts2/geneawiki/?q=Q3044 seems to be a bit incomplete ;-) [16:41:34] heh [16:41:36] :P [16:55:29] New review: Aude; "Patch Set 10: Verified+2 Code-Review+2" [mediawiki/extensions/Wikibase] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/46294 [16:55:31] Change merged: Aude; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46294 [16:55:52] Woohoo. https://toolserver.org/~legoktm/cgi-bin/wikidata/logs.py?job=2 [17:07:08] New patchset: Aude; "Move WikibaseDiffOpFactory class to own file" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49492 [17:10:40] Hey, how were the items numbers attributed? [17:10:58] I don't mean the easter eggs but the general order, starting from #1 [17:11:21] was it done algorithmically or chosen by someone? [17:13:53] rinaku: done in order [17:14:01] the easter eggs were special though [17:14:58] duh: but in what order? [17:15:11] increasing order [17:15:17] incrementing* [17:15:58] duh: I mean: how come Universe is #1, Earth is #2, etc. Who decided taht? 
[17:16:03] oh [17:16:09] those are the easter eggs [17:16:18] they were created specially [17:16:24] and i dont know who decided that [17:19:04] duh: well ok the first few have symbollic value, but how come that after them come countries and not whatever, that's what I would like to know [17:19:24] ok so the ones that were specially imported were [17:19:37] https://wikidata.org/wiki/Special:Contributions/127.0.0.1 [17:19:41] after that [17:19:42] There is a rather short list of easter eggs [17:20:05] in whatever order they were created, they got assigned a number [17:20:06] And they are spread out over several thousand entries [17:20:30] [[d:q42]] [17:20:34] :o [17:21:05] ok [17:22:07] * aude decided about Q44 :D [17:22:37] aude: why 44? [17:22:45] incremental at that point [17:22:49] ok [17:23:32] * aude knows how to say beer in 10+ languages [17:23:41] :) [17:23:48] so good item for testing adding links :) [17:30:56] New patchset: Aude; "(bug 43990) Robust serialization of change objects" [mediawiki/extensions/Wikibase] (mw1.21-wmf10) - https://gerrit.wikimedia.org/r/49500 [17:33:57] New review: Aude; "Patch Set 1: Code-Review-2" [mediawiki/extensions/Wikibase] (mw1.21-wmf10) C: -2; - https://gerrit.wikimedia.org/r/49500 [17:55:55] DanielK_WMDE: poke, any chance you are around? [17:58:20] New review: John Erling Blad; "Patch Set 4: Verified+2 Code-Review+2" [mediawiki/extensions/Wikibase] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/48968 [17:58:21] Change merged: John Erling Blad; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/48968 [18:03:38] New patchset: Jeroen De Dauw; "(bug 44095) visualize claim differences [DO NOT MERGE]" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49502 [18:06:47] yurik: so the discussion on deleting "is a" is split…can you figure out a way to get it all in one place? right now its on WD:RfD, WD:PC, and the talk page.... [18:07:02] New patchset: Jeroen De Dauw; "(bug 44095) visualize claim differences [DO NOT MERGE]" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49502 [18:07:18] can't right now, but yes, should be in one spot [18:07:25] where would you think is best? [18:09:20] i would go for project chat [18:09:35] yeah, that's fine [18:09:35] I think RfD should be more of a speedy deletion [18:10:42] New review: Aude; "Patch Set 1: Verified+2 Code-Review+2" [mediawiki/extensions/Wikibase] (mw1.21-wmf10); V: 2 C: 2; - https://gerrit.wikimedia.org/r/49500 [18:12:27] New patchset: Jeroen De Dauw; "(bug 44095) visualize claim differences" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49502 [18:12:44] New review: Jeroen De Dauw; "Patch Set 3: Code-Review-2" [mediawiki/extensions/Wikibase] (master) C: -2; - https://gerrit.wikimedia.org/r/49502 [18:14:43] DanielK_WMDE: nevermind [18:19:16] New review: John Erling Blad; "Patch Set 2: Verified+2 Code-Review+2" [mediawiki/extensions/Wikibase] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/49460 [18:19:18] Change merged: John Erling Blad; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49460 [18:25:24] editing the Statements is strange sometimes [18:25:54] i added a link to commons, after reloading the page it was gone and after adding it again it was doubled [18:26:26] lbenedix: link? [18:26:47] http://www.wikidata.org/wiki/Q149 [18:26:58] looking [18:27:13] i see links to 2 different pictures [18:27:18] yes [18:27:19] did you remove the duplicate/ [18:27:20] ? 
[18:27:26] the first is wrong [18:27:31] hmmm [18:27:33] i copied from enwiki [18:27:38] ok [18:27:40] its not on commons [18:27:56] it links to File:File:Nyan cat 250px frame.PNG [18:28:01] that for sure is wrong :( [18:28:22] http://commons.wikimedia.org/wiki/File:Nyan_cat_250px_frame.PNG has been deleted [18:28:24] like i said, i copied the link from http://en.wikipedia.org/wiki/File:Nyan_cat_250px_frame.PNG [18:28:40] and i cant remove it by myself [18:29:00] * aude the copyright police, errr commons admin, says http://commons.wikimedia.org/wiki/File:Flickr_nyan..jpg probably will be deleted too sometime [18:29:05] anyway, won't do it [18:29:23] let's see if ic an remove the first image [18:29:54] Flickr_nyan..jpg has cc by sa [18:30:08] otherwise, my best advice is to wait until we do code update tomorrow and then see what issues there still are [18:30:16] ok [18:30:44] lbenedix: no idea about the cc but hope that's right :) [18:30:51] the experience when adding it was not so good [18:31:31] ok, i removed it [18:31:39] still weird [18:31:46] what is the reason that i can only add things and not remove? [18:31:48] * aude try to add it again [18:32:08] no idea, but did you try reloading the page? [18:32:14] yepp [18:32:14] duh: Yeah, http://toolserver.org/~multichill/ts2/geneawiki/?q=Q3044 now goes to modern times ;-) [18:32:14] New patchset: Jeroen De Dauw; "(bug 44095) visualize claim differences: html generation" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49504 [18:32:46] hahah [18:32:48] that is awesome [18:32:49] multichill: omg [18:33:13] Do we check if files exists on commons before we link to them? [18:33:21] nope [18:33:23] ok, i added it again and it links to File:File.... [18:33:35] * aude opens bugzilla [18:33:55] now I see two links again [18:34:04] :o [18:34:20] i only see one [18:34:23] one for each [18:34:25] could be something wrong in the patching [18:36:13] JeroenDeDauw will be amused that we have a nyancat bug https://bugzilla.wikimedia.org/45110 [18:36:29] heh [18:36:31] no idea about fixing, but suggest see how it is tomororw [18:36:39] and then henning probably has to look at it [18:37:12] lbenedix: if you can provide a screenshot for the bug and elaborate more, that would help [18:37:27] only thing i see is the "File:File" twice [18:37:40] what do you mean? [18:38:03] lbenedix: seems you have more issues than the "File:File:" [18:38:18] multiple bugs it seems [18:38:19] when adding it first it dissapeared after reloading [18:38:38] i was able to remove it again [18:38:47] hmmm [18:38:59] aude: not sure it qualifies as a nyancat bug [18:39:01] which browser? 
[18:39:03] heh [18:39:16] !nyandata | Danwe_WMDE [18:39:16] Danwe_WMDE: https://bit.ly/nyandata [18:39:17] Chrome [18:39:21] Denny_WMDE: ^ [18:39:32] >.< [18:39:34] lbenedix: hmmm [18:39:38] i'm using chrome also [18:39:45] http://lb.bombenlabor.de/ba/nyan/ [18:40:25] hah [18:41:46] New patchset: Aude; "(bug 44857) Add &bot param to API modules that are missing it" [mediawiki/extensions/Wikibase] (mw1.21-wmf10) - https://gerrit.wikimedia.org/r/49505 [18:41:50] Should just display that instead of the claim difference crap and be done w/ it [18:42:20] aude: thank you :DDD [18:42:22] New review: Aude; "Patch Set 1: Verified+2 Code-Review+2" [mediawiki/extensions/Wikibase] (mw1.21-wmf10); V: 2 C: 2; - https://gerrit.wikimedia.org/r/49505 [18:42:24] Change merged: Aude; [mediawiki/extensions/Wikibase] (mw1.21-wmf10) - https://gerrit.wikimedia.org/r/49505 [18:42:46] duh: sure :) [18:48:23] New review: AzaToth; "Patch Set 3: Code-Review-1" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/49502 [18:53:58] New review: Aude; "Patch Set 3:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49502 [19:08:17] New review: Aude; "Patch Set 1:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49263 [19:20:19] New review: Daniel Kinzler; "Patch Set 10:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46294 [19:20:33] New review: AzaToth; "Patch Set 3:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49502 [19:22:58] New review: Aude; "Patch Set 10:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46294 [19:35:57] New review: John Erling Blad; "Patch Set 1:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49263 [20:05:58] it seems that all properties establish a "has a" relationship between a subject (e.g. Germany) and an object (e.g. Berlin, via the property 'capital'). are there plans to support "instance of" relations between two items beyond linking them with a special 'type' property, so that sets of properties can be applied to an item? [20:10:47] for example, if Germany has the properties population, type and prime minister, then how could a knowledge system deduce that Germany has a population and prime minister because it is has a type country, as opposed to Germany having a type and a prime minister because it has a population of 82,000,000? [20:33:22] Emw why is that important whether its from the type or the properties? [20:33:29] Danwe_WMDE: poke [20:36:43] because if instance-of relations are supported by only a special type of property (has-a relation) where the object is 'type', then it seems like knowledge representation systems would need to handle that property in a different way than all the others [20:38:32] i think making InstanceOfSnak (https://meta.wikimedia.org/wiki/Wikidata/Notes/Data_model#Snaks) available in the UI, like the PropertyValueSnak is currently available, would fix that issue [20:39:10] i do not undrstand the issue [20:39:51] why is "has a type" not a "has a" relation in your lingo? [20:41:37] 'has a type' is a 'has a' relation, but it's better known as an 'instance of' relation [20:42:14] Denny_WMDE: did you already fix the terms table language related bug or shall I have a look? 
[20:42:36] Danwe_WMDE: i tried to, i was looking at it, but i couldn't figure out what happened [20:43:03] if you want to do some work, I'd rather have you look at what jeroendedauw_ was emailing earlier, that is more urgent [20:43:16] Danwe_WMDE: are you helping with claim diffs? [20:43:19] Danwe_WMDE: but if you can easily fix the terms table bug i would be happy [20:43:26] if not, i'd go with tobi's patch #5 [20:43:36] it *did* work reasonably [20:43:52] but if we can polish something up with jeroen's code, then all the better [20:43:59] Emw: i do not see your argument. how is "state" a "has a"-relationship for a city? [20:44:25] I won't be able to do a lot of coding tonight. I will do the review of the diffing stuff now. I just thought the terms table bug is easy to fix so I would spend 15 to 30 minutes to try [20:44:41] what is even a has-a relation, and why is type different? [20:44:55] Danwe_WMDE: ok [20:45:17] i am poking at it and see if we can get the stuff tobi did to fit into the new code by jeroen [20:45:20] Danwe_WMDE: if it is the case, I wouldn't mind. i am very curious about the fix, though. [20:45:48] aude: Denny_WMDE: About the diff stuff (the HTML is missing?) you can ask Henning tomorrow perhaps, he is not working on critical stuff anymore. I won't be able to be in the office before noon tomorrow but I will write an email. [20:45:58] Danwe_WMDE: ok [20:46:17] New review: John Erling Blad; "Patch Set 1: Verified+2 Code-Review+2" [mediawiki/extensions/Wikibase] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/49068 [20:46:18] Change merged: John Erling Blad; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49068 [20:46:19] Denny_WMDE: i guess i'm thinking of 'has a' relations in the sense described in http://en.wikipedia.org/wiki/Has_a [20:46:56] Danwe_WMDE: if you can do one smallish thing, then it'd be to take a look at jens' patch [20:47:01] Denny_WMDE: you too [20:47:16] someone has to decide if it's better than the status quo or not [20:47:18] Emw: but that isn't true for the state or country relationship anyway, for cities [20:47:35] there is no possessive or mereological hierarchy [20:47:49] L10n-bot will have a bunch of changes.. Hope it doesn't collide with something [20:47:52] aude: does it work with vector? [20:47:57] Denny_WMDE: it does [20:48:04] it forces entity only search [20:48:15] you can click the magnifying glass and then do regular search [20:48:27] i'm also not sure about the use of wgServer in the js [20:48:41] that's good then, sounds exactly as it should be [20:48:46] it is set as www.wikidata.org which could be an issue (i'm not sure) if you are on wikidata.org [20:48:52] I think Jens' patch should be extended so it handles the ordinary namespaces also... but that is me.. ;) [20:49:05] won't we always be redirecting to wikidata.org soon? [20:49:09] Denny_WMDE: if the wgServer thing is not an issue, then i think it's an improvement, albeit it needs more work [20:49:14] Denny_WMDE: i hope so [20:49:30] Jeblad_WMDE: what ordinary namespaces? it is explicitly meant to only search for entities [20:49:54] Denny_WMDE: there's no way to search for non-entities [20:49:59] except the special page [20:50:00] I think that is the wrong way to do it, but as I said, it's what I think [20:50:17] you have to know to click the magnifying glass [20:50:57] aude: there is, by using the normal search, no? [20:51:02] blergh [20:51:12] Denny_WMDE: yes, but you have to get to the special page [20:51:16] people can do that [20:51:18] aude: ok.
I'll do [20:51:24] Danwe_WMDE: thanks [20:51:33] just see if, from a technical point of view, there are any major issues or not [20:51:46] like your opinion of using wgServer [20:53:04] review? https://gerrit.wikimedia.org/r/#/c/49066/ [20:54:30] Jeblad_WMDE: ok [20:55:43] aude: did you try out Jens's change set? I have two suggestion menus now... [20:55:57] Danwe_WMDE: i did [20:56:03] really? [20:56:14] really [20:56:17] huh [20:56:18] I tested jens changes and it works here [20:56:34] opera? :D [20:56:49] Chrome here [20:56:51] firefox? chrome? [20:57:06] both [20:57:10] hmmm [20:57:14] I wonder if it could be a timing issue [20:57:25] timing? [20:57:29] no idea [20:57:34] What is the deadline for stuff to be done for the next deployment? [20:57:39] jeroendedauw_: 11am [20:57:43] says denny [20:57:51] blah [20:57:55] heh [20:58:08] New review: John Erling Blad; "Patch Set 1:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49263 [20:58:09] we need time to get everything into the branch and fully tested [20:58:43] sure. someone still needs to do the actual claim difference visualization though [20:59:07] * aude would approve patchset 5 from tobi, if we can't get the visualization done [20:59:19] i can poke at it a bit more but no promises [20:59:34] also would like the autosummaries slightly nicer than a snak guid [20:59:48] it's somewhat the same problem and could possibly share some code [21:00:01] hoo, I added a bug for the dialog.. sorry! =) [21:00:18] Jeblad_WMDE: Already fixed it... only needs a bit of testing ;) [21:00:31] * some [21:00:36] Ok, drop me a note and I'll review [21:00:38] hoo: nice [21:01:07] aude: At some point we have to backport that all to wmf10 or we'll deliver a horribly broken thing :P [21:01:23] hoo: we just need to mark the widget as experimental [21:01:29] Or that [21:01:30] can you help us with that? [21:01:34] New review: Daniel Werner; "Patch Set 1: Code-Review-1" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/49263 [21:01:52] aude: I can do that...
I would like to see it live with wmf11 though [21:01:59] and many others, I guess [21:02:00] wmf11 i think we can do [21:02:11] :) [21:03:08] Downloaded Jens' changeset for a new review and now it doesn't work [21:03:14] oh no [21:03:18] * Jeblad_WMDE scratches head [21:04:20] re Jens [21:04:40] the search on wikidata.org looks very different than a normal mediawiki installation search [21:04:59] it uses some vector configuration options [21:05:09] yeah, but these can be relevant for the dom [21:05:16] yes [21:06:08] $wgVectorFeatures['simplesearch'] = array( 'global' => true, 'user' => false ); [21:06:14] $wgVectorFeatures['expandablesearch'] = array( 'global' => false, 'user' => false ); [21:06:18] $wgVectorUseSimpleSearch = true; [21:06:24] that's what is used in production [21:06:42] i think the Vector extension is needed on top of core for those [21:06:47] Yep [21:06:55] That stuff IMO should go into core [21:07:06] We had that on wikitech-l some weeks back [21:07:09] yes, it's been suggested a bazillion times :o [21:07:24] But last time Tim agreed with it :P [21:07:39] anyway, wrt jens' code, we can probably implement the JS part in wikidata.org for wmf10, as a widget or something, if this doesnt go in [21:07:56] really the top prios are the diffs and stuff, as i said in the email [21:08:12] Denny_WMDE: yep [21:08:22] worst case, one of tobi's earlier patches was fine [21:08:24] New review: Daniel Werner; "Patch Set 1:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49263 [21:08:37] not the most elegant in the way the code was implemented, but it worked [21:10:16] i am grumpy about not taking the newer patch sets and instead having workarounds, but we really have to have diffs tomorrow. i'd prefer jeroendedauw_ 's code for that. let us decide in the morning what the status is [21:10:20] Emw: still there? [21:10:44] Denny_WMDE: ok [21:10:45] Denny_WMDE: yes, sorry for the delayed response. there could be a 'has a' relation between 'city' and 'state' in that a state has a city. in UML, there would be an association between a city class and a state class like (State)<> 0..1 --- 0..* (City), as in pond and duck in http://en.wikipedia.org/wiki/File:AggregationAndComposition.svg [21:10:49] i'll keep poking at it [21:11:52] Jens' changeset works in some browser windows.. I guess it's a caching issue.. [21:13:12] * hoo kills the suggester [21:13:14] Emw: two arguments: 1. you could say the same about type -> instance, and 2. it does not cover truly symmetric properties, like "borders". there is simply no aggregation or composition for borders, it truly is a graph. [21:15:09] how could you say the same about type and instance? [21:15:38] New review: Daniel Werner; "Patch Set 1:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49263 [21:18:17] you can't. and neither can you about genealogy. but that does not make it particularly special. [21:18:48] there are many properties that are like this. i am trying to understand what is special about instance-type. [21:20:25] PinkAmpersand: https://toolserver.org/~legoktm/cgi-bin/wikidata/logs.py?job=14 [21:20:26] if an instance has a type, then the properties of a type should exist for the instance [21:22:22] e.g. if the instance Canada has a type country, then Canada should be assumed to have a population property without having to explicitly define it [21:23:19] in the Wikidata use case, if a contributor were to state 'Canada is an instance of a country', then properties like 'population', 'area', etc.
should appear on the page [21:24:22] so is this merely about suggestions for properties to be filled in the UI? [21:25:13] the nice thing in the current approach is that we do not have to discuss hard coded schemas for every type [21:25:21] just use any property anywhere [21:25:33] isn't that much more flexible? [21:26:10] i mentioned another important use case originally, e.g. for knowledge representation systems to be able to infer properties about an entity [21:27:09] it's flexible, but it seems that flexibility would decrease how structured the data is [21:28:18] Emw: they can do that as well with properties [21:28:22] think OWL. [21:28:50] simply have a description like hasValue for HeadOfState == country [21:29:00] you can infer whatever you want from properties [21:29:02] no need for types [21:29:46] New review: John Erling Blad; "Patch Set 1:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49263 [21:30:36] to my knowledge OWL has classes, which i'd consider synonymous with types [21:32:01] yes, but you don't need them [21:32:04] you can just use properties [21:32:28] i.e. what I am saying is, you do not need special handling for the type-instance property [21:32:35] you can very well just use the existing one [21:33:57] Denny_WMDE: doesn't InstanceOfSnak exist to specify 'instance of' relations between entities? https://meta.wikimedia.org/wiki/Wikidata/Notes/Data_model#InstanceOfSnak [21:34:42] we are not sure if we will drop the instanceOf snak [21:34:51] i am opposed to implementing it at all [21:34:57] but others are not [21:35:05] we will go public with that discussion soon [21:35:46] * Jeblad_WMDE takes the snacks and runs away [21:36:59] would you disagree with the statement that infoboxes are collections of properties, and all infobox instances that have the same 'type' have the same set of available properties? [21:37:05] yes [21:37:22] New review: Daniel Werner; "Patch Set 3:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49502 [21:37:29] with the second half, definitely [21:38:29] New review: Alex Monk; "Patch Set 1:" [mediawiki/extensions/Wikibase] (mw1.21-wmf10) - https://gerrit.wikimedia.org/r/49500 [21:38:50] the world is much more complex than that, and there are too many exceptions to assume that every instance of a type has the same set of properties [21:39:12] first, not everything has a single type [21:40:42] and second, creating a type with all and only the relevant properties is simply too hard and controversial to make this a precondition to collecting data. i believe it might even be futile [21:40:50] but this is personal opinion, mind you [21:40:53] Change merged: Aude; [mediawiki/extensions/Wikibase] (mw1.21-wmf10) - https://gerrit.wikimedia.org/r/49500 [21:41:13] New review: Aude; "Patch Set 1:" [mediawiki/extensions/Wikibase] (mw1.21-wmf10) - https://gerrit.wikimedia.org/r/49500 [21:43:09] i am indeed looking for arguments *for* type-instance, but so far I have found nothing convincing [21:44:25] how do you disagree with the second part of my statement above? [21:45:32] you said all infobox instances that have the same type have the same set of available properties [21:45:35] i say this is wrong [21:45:48] e.g. ronald reagan is an instance of actor, right? [21:46:01] yes [21:46:11] and still he would have a property "inauguration date", no?
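A toy Python sketch of the two positions being argued here; the schemas, property names, and item data are invented, and neither function is anything Wikidata implements. With an instance-of statement the expected properties come from a per-type schema, while the OWL-style hasValue argument infers a class from the properties an item already carries.

    # Toy illustration of the two approaches debated above. Everything here
    # (schemas, property names, item data) is made up for the example.

    # Approach 1: type -> expected properties (the instance-of view).
    # The item says "instance of: country" and a schema supplies the rest.
    TYPE_SCHEMAS = {
        "country": {"population", "area", "head of state"},
        "actor":   {"notable works", "awards received"},
    }

    def expected_properties(item):
        expected = set()
        for type_name in item.get("instance of", []):
            expected |= TYPE_SCHEMAS.get(type_name, set())
        return expected

    # Approach 2: properties -> inferred class (the OWL-style hasValue view).
    # No type statement is needed; having a "head of state" is the evidence.
    def infer_classes(item):
        classes = set()
        if "head of state" in item:
            classes.add("country")
        if "inauguration date" in item:
            classes.add("officeholder")
        return classes

    canada  = {"instance of": ["country"], "population": 35_000_000}
    germany = {"head of state": "Joachim Gauck", "population": 82_000_000}

    print(expected_properties(canada))   # properties a UI could suggest filling in
    print(infer_classes(germany))        # a class derived without any type statement

Reagan, raised just above, is the awkward case for the first approach: an item typed only as "actor" still needs "inauguration date", so a single schema per type either under- or over-constrains.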
[21:46:18] so he is in the set of actors [21:46:40] but other properties are useful for him, besides the ones available through that type "actor" [21:47:39] so why not allow multiple inheritance? [21:48:33] why restrict the set of available properties in the first place? [21:49:59] assume the following: anything is an instance of item, and all properties are in the set of available properties. this is the current system. what do you want to improve? [21:50:27] ( i will go offline in about 5-10 minutes) [21:51:10] i don't see how your example of reagan invalidates the statement "all infobox instances that have the same type have the same set of available properties" [21:51:27] (thanks for discussing this, btw) [21:51:56] because this is an instance of the type "actor", but it has properties available that are different than those of other instances of this type [21:52:07] so the statement is wrong [21:52:30] reagan is the subject, though, not the infobox [21:53:28] well, in that case it wouldnt matter at all [21:53:34] because we dont have infobox instances in wikidata [21:54:17] duh: what if the jobs take more than an hour to do? [21:55:51] we have items, and they have properties. infoboxes can in the near future call properties of the items associated with the given page. [21:58:50] rschen7754: oh the bot just keeps going, and whenever it finishes, and a new hour starts, the bot will start again [21:58:58] oh ok [21:59:55] and you can add new jobs to properties.js while it is running [22:00:18] they just wont get processed until the bot starts again [22:06:03] i'm off. see you tomorrow [22:07:05] thanks again, definite food for thought for me :) [22:07:18] glad to hear. [22:15:07] New patchset: Daniel Werner; "improvement for options handling of wb.ui.Base" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49594 [22:19:47] New patchset: Daniel Werner; "improvement for options handling of wb.ui.Base" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49594 [22:45:38] duh: you pinged? [22:45:43] yeah [22:45:49] * PinkAmpersand clicks link [22:45:59] ahh. c'est quoi? (what is that?) [22:46:08] íngles por favor (English, please) [22:46:25] hehe [22:46:59] * PinkAmpersand doesn't understand the context of that link [22:47:25] oh [22:47:26] basically [22:47:32] it replaces the onwiki logging using the table [22:47:38] which was overflowing the page size [22:48:32] Also once I finish it, it will provide more features, and possibly optimization for future requests [22:48:45] cool! [22:49:00] oh and [22:49:02] https://toolserver.org/~legoktm/cgi-bin/wikidata/logs.py?job=21 [22:49:06] logging in real time :) [22:49:20] refresh and it updates [22:49:41] btw did you ever make that change to the "reason" parameter with RFD that I suggested? [22:50:31] i think i did? [22:52:59] * PinkAmpersand checks [22:53:27] also, i need to talk to hazard about his bot... [22:54:31] oh I think the one remaining thing I suggested was to have the reason say "Legobot" not just "bot", since, like, 5 years from now there might not be any context for that [22:54:44] oh ok [22:54:46] i can fix that [22:55:36] {{fixed}} [22:55:58] and pushed [22:56:46] :D Wikidata sometimes feels like one of those giant freighters that runs on a crew of 6. just a very tranquil environment of humans quietly supervising robots [22:57:15] * Jasper_Deng likes it that way [22:57:31] oh right [22:57:38] what do people think of [22:57:44] https://www.wikidata.org/wiki/User_talk:Legoktm#bot_descriptions [22:57:45] ?
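For context on the scheduling described at 21:58:50, a rough sketch of that shape of job loop. It is a guess rather than Legobot's actual code: the properties.js file name comes from the chat, while the JSON job format and the process_job helper are invented.

    # Sketch of an hourly bot pass that re-reads its job file on every run.
    # Jobs added to properties.js while a pass is in progress are not touched;
    # they are simply picked up the next time the loop starts.
    import json
    import time

    JOB_FILE = "properties.js"   # file name taken from the chat; format assumed

    def load_jobs():
        with open(JOB_FILE) as f:
            return json.load(f)   # assume a JSON list of job descriptions

    def process_job(job):
        print("processing", job)  # placeholder for the real per-job work

    def run_forever():
        while True:
            started = time.time()
            for job in load_jobs():     # snapshot of the job list for this pass
                process_job(job)
            # Wait for the next full hour after the pass started; if the pass
            # itself ran longer than an hour, the next one starts at the next
            # hour boundary after it finishes.
            elapsed = time.time() - started
            time.sleep(3600 - (elapsed % 3600))

    if __name__ == "__main__":
        run_forever()

The sleep arithmetic is only there to mimic the "whenever it finishes, and a new hour starts" behaviour described above.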
[22:57:51] * PinkAmpersand likes it that way too [22:59:27] duh: sounds like a good idea. i know ValterVB is already running something that adds basic descriptions, though. is he going off of categories, or something else? [22:59:40] dunno [23:00:03] i dont know how the other bots work :P [23:00:08] theres no central framework either [23:03:42] yeah. someone feel like tracking down the RFP? [23:09:53] oh hey so... straw poll. be honest here, because it matters. if I put myself up for adminship, what do y'all think would happen? [23:11:25] people would vote. [23:11:54] I would be hesitant to support if you keep bringing over enwiki-like actions like requesting a potential block to solve off-wiki canvassing. [23:12:18] ..steal a lot and they throw you in jail, steal a lot and they make you a king.. #singing [23:12:19] Other than that, your request would probably pass. [23:12:30] ^ [23:12:44] (actually, I doubt that anyone else would have issues with what you did there either, but I do :P) [23:12:44] does enwiki even do blocks for offwiki canvassing? [23:12:53] New patchset: Hoo man; "(bug 45109) Error tooltip still visible after changing site" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49598 [23:13:02] rschen7754: if they feel like it, yes [23:13:10] ..steal a little and they throw you in jail, steal a lot and they make you a king.. #singing [23:13:25] Not the block itself, but that style of discussion. [23:13:36] "I don't like what you're doing so let's just threaten everyone with blocks." [23:13:43] well some rogue admins might… but it's hard to speak for the vast majority of admins on that [23:14:05] not admins, pretty much every discussion on WP:AN [23:14:57] well, realistically enwiki is the biggest and oldest wiki, but that means that we have the most long-term abusers [23:15:04] Jeblad_WMDE: Thanks for all the testing, btw :) Keep filing bugs :) [23:15:17] PinkAmpersand: you once suggested that Danny B back down for a BATTLE mentality [23:15:40] hihihi [23:15:41] guys [23:15:43] guess what [23:15:46] politics are stupid [23:15:47] now [23:15:51] get back to work [23:15:59] lol [23:16:18] https://www.wikidata.org/wiki/Wikidata:IC <-- needs help [23:16:20] I wish that I were a useful editor who could just ignore the political issue. [23:16:36] all i'm saying is that i don't agree with a lot of the way enwiki is right now, but going around saying that something is bad just because it's from enwiki is going to offend people [23:16:37] Instead, I'm useless at everything but mindless admin tasks and discussions (for the most part) [23:16:46] Ajraddatz: wanna do something useful? [23:16:51] discussions are fun [23:16:55] help generate categories for my properties bot. [23:17:17] especially when the other is a stubborn goat that can't be enlightened [23:17:27] Ajraddatz: yeah. in retrospect, while I don't think a suggestion like mine there was inappropriate per se, I'd like to keep Wikidata as drama-free as possible [23:17:29] I could help with that tomorrow night maybe. I'm pretty busy with university and developing a site ATM. [23:17:32] your prophecies bot? [23:17:34] =D [23:17:38] :P [23:17:44] Ajraddatz: help develop wikidata ;) [23:17:50] prophecies.. or whatever [23:18:03] Jasper_Deng: as I've pointed out in the past, I said he *could* be blocked, not that he should be, and I agreed it wasn't necessary at the time [23:18:12] rschen7754, I don't think that all things from enwiki are bad.... (you aren't that bad yourself :D)....
but I might just use that blanket statement when referring to a specific bad idea from enwiki. I'll need to watch that :P [23:18:16] Jeblad_WMDE: jhs found some huge categories for gender with like 500k people in them on dewiki [23:18:35] Ajraddatz: okay, understood :) [23:18:41] yes, they use male and female as category [23:18:41] and nemo_bis is going to help us use itwiki's persondata template since apparently it's very well defined [23:18:47] PinkAmpersand: but didn't you think that he wouldn't appreciate you hanging BATTLE, which to him is a foreign policy, over his head? [23:19:08] ... [23:19:17] /mode +q *!*@* [23:19:21] hah [23:19:40] duh, I couldn't help develop wikidata. Too dumb. :( [23:19:41] ...and his eyes just telling lies, but the woman, she's just sitting there, the woman.. #singing [23:20:17] I must stop drinking the Warsteiner-shit, getting so sad and starting to sing.. [23:20:20] it's not even a policy. it's just a concept, and I think it applied there. but, whatever, yeah, if i run, I'll throw in something about how I won't let my occasional taste for drama get the better of me [23:20:36] PinkAmpersand: take it to #wikimedia-wikidata-drama [23:20:38] in that case, I'd have no problem with supporting [23:20:40] this is a drama free zone. [23:20:51] My poor neighbors.. [23:20:52] * duh goes back to coding [23:20:57] * Moe_Epsilon comes in riding a drama llama [23:21:02] lol Jeblad_WMDE [23:21:18] * Jeblad_WMDE goes llama hunting [23:21:39] o: [23:21:42] * Moe_Epsilon trots away [23:22:02] oh yeah Moe_Epsilon, did you see https://toolserver.org/~legoktm/cgi-bin/wikidata/logs.py?job=3 [23:22:11] * duh will keep showing that off [23:22:16] * Moe_Epsilon looks [23:22:27] ...one hand on my beer, the other drumming the tune on my old guitar while I'm hunting the llama.. [23:22:43] oh, nice duh [23:23:17] unfortunately the table method was too big of a page [23:23:25] so i switched to db logging [23:23:26] duh, not sortable ;( [23:23:32] no [23:23:36] i need to figure out how to do that [23:23:43] its a jquery plugin or something [23:24:45] Jeblad_WMDE: Do you think I should make the linkItem thingy experimental for now? [23:25:14] I think it is important and we will include it as fast as possible [23:25:25] Ask Denny tomorrow [23:25:44] ok [23:26:07] You could make a changeset to move it to experimental, but if I get around to reviewing it I think the chances are that it goes in [23:26:23] You could also ask aude if she's still here [23:27:18] Jeblad_WMDE: you're lucky you're not saying this all on #wikipedia-en... the amount of times I've seen someone suggest we block a user for editing drunk.... [23:27:40] * Jeblad_WMDE clicks his heels [23:27:47] Schaahll! [23:28:35] As we say in Norway; Je' drikk bærre vatn! (I only drink water!) [23:29:05] It's a local joke.. Perhaps I tell it sometimes.. That is why it is fun.. ;) [23:29:59] Or actually.. hehe [23:30:20] Some places in Norway they drink something called "karsk" [23:30:32] Karsk is moonshine and coffee [23:31:28] It is made by first dropping a coin in the coffee cup, then filling it with coffee until the coin disappears, then continuing with moonshine until you see the coin again [23:32:00] The thing is, the old 1øre coin was made of brass and was very dark [23:32:36] So when someone poor made karsk it was very little coffee and very much moonshine [23:33:16] And then the joke was when someone asked about the veeery thin "coffee" he would reply "I only drink water".
[23:33:50] Usually he would then drink moonshine straight from the coffee cup without any coffee at all [23:34:31] So a local music group from Gjøvik in Norway made this into one of their themes, they only drink water [23:55:03] well, so, one person answered my question :P