[00:49:11] Wikidata is down
[00:49:19] !admin
[00:49:30] not for me
[00:49:39] what's up
[00:49:41] we can't do anything about the site being down
[00:49:45] http://www.wikidata.org/w/api.php?claim=%2484DE71B8-0612-4A78-A844-C6C1D388EECA&format=json&value={%22entity-type%22%3A%22item%22%2C%22numeric-id%22%3A2488794}&token=307640e3fea9868ad268fd836f9bcc28%2B\&snaktype=value&action=wbsetclaimvalue
[00:50:08] we can pray for a holy technician
[00:50:25] hmm, now down
[00:50:26] :p
[01:07:46] has anyone ever worked with the API of wikidata?
[01:22:02] Amir1: yes
[01:22:05] what's your question?
[01:22:22] there is a problem with a GUID
[01:22:43] My bot sends this:
[01:22:56] claim=q159%2484DE71B8-0612-4A78-A844-C6C1D388EECA&format=json&value=%7B%22entity-type%22%3A%22item%22%2C%22numeric-id%22%3A2488794%7D&token=307640e3fea9868ad268fd836f9bcc28%2B%5C&snaktype=value&action=wbsetclaimvalue
[01:23:18] the claim GUID is for "demonym"
[01:23:30] but the bot changes the last claim
[01:23:35] http://www.wikidata.org/w/index.php?title=Q159&diff=12538541&oldid=12536059
[01:23:54] legoktm: Can you help me?
[01:24:19] This problem has become a really annoying bug
[01:24:44] hmmm
[01:25:02] is there any problem with the GUID?
[01:25:11] i haven't had any issues
[01:25:19] but let me see
[01:25:25] Thanks
[01:25:55] "id": "q159$84DE71B8-0612-4A78-A844-C6C1D388EECA",
[01:25:59] "numeric-id": 2488794
[01:26:12] is that the wrong one?
[01:26:47] this is correct
[01:27:19] but when you do this the bot puts the value in the wrong place
[01:27:43] https://www.wikidata.org/wiki/Q2488794 == Irani
[01:27:59] are you trying to change the value?
[01:28:01] or add a new one?
[01:28:05] yes
[01:28:18] I want to change the "demonym" claim
[01:28:29] It's for a test
[01:28:40] I'm trying to write code for bots
[01:28:58] It's very complicated code
[01:30:33] oh
[01:30:35] well
[01:30:42] your problem is that you're using the wrong GUID
[01:30:53] oh wait
[01:30:55] o.O
[01:31:08] wait no
[01:31:09] "id": "q159$B5C14627-48C2-4EB8-AE44-71D151BD281F",
[01:31:14] that's the one you should be using
[01:33:00] sorry, i was disconnected
[01:33:11] where were we?
[01:33:16] [08:30:41 PM] your problem is that you're using the wrong GUID
[01:33:17] [08:31:09 PM] "id": "q159$B5C14627-48C2-4EB8-AE44-71D151BD281F",
[01:33:17] [08:31:13 PM] that's the one you should be using
[01:33:31] Are you sure?
[01:33:35] i checked that
[01:33:43] yes
[01:33:47] https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q159&format=jsonfm
[01:33:55] ctrl+f "p49"
[01:36:27] OH SHIT
[01:36:39] ?
[01:36:51] i guess you figured out your mistake? :P
[01:37:17] you can't believe the shit i did
[01:38:08] for claim in claims:
[01:38:09] if claim['m'][1]==propertyID:
[01:38:10] Buffer 1 is empty.
[01:38:11] theclaim=claim
[01:38:13] that's the part
[01:38:35] and when i wanted to send it i wrote 'claim': claim['g'],
[01:38:43] that must be 'claim': theclaim['g'],
[01:39:13] errr
[01:39:13] are you creating an object for claims?
[01:39:22] no
[01:39:33] I'm making something so much better
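What went wrong above, in short: the loop remembered the matching claim in theclaim, but the request was later built from the loop variable claim, which by then held the last claim iterated, so the wrong GUID was sent. A minimal sketch of a safer lookup, assuming the claim layout from Amir's paste (claim['m'][1] holding the property id, claim['g'] the claim GUID):

    def find_claim_guid(claims, property_id):
        # Return the GUID of the first claim for property_id, or None.
        # Returning immediately avoids keeping a second variable around
        # and later reading the loop variable by mistake.
        for claim in claims:
            if claim['m'][1] == property_id:
                return claim['g']
        return None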
[01:39:37] Another question
[01:39:47] have you worked with raw values?
[01:39:49] like
[01:39:57] http://www.wikidata.org/wiki/Property:P236
[01:40:07] how can I import them via the API?
[01:40:53] I haven't messed with strings yet
[01:40:58] Well, kinda
[01:41:07] My code looks like
[01:42:10] I want to mess with them :D
[01:42:58] http://dpaste.de/3zJGw/raw/
[01:43:36] I was going to commit that last night but didn't exactly finish it
[01:48:33] https://www.mediawiki.org/wiki/Special:Code/pywikipedia/11209
[01:51:42] you can't leave comments.
[01:51:46] err, edit summaries
[01:52:23] ehhh
[01:52:23] + value="{\"entity-type\":\"item\",\"numeric-id\":%s}" % value
[01:52:26] not a fan of that
[01:53:06] I'm not a very good programmer
[01:53:18] I'm a very hard-coded person
[01:53:35] In my field it's the best
[01:53:51] (Computational Physics)
[01:54:01] you can change it and make it better
[01:54:08] that's the point of teamwork
[01:54:35] and besides, can you patrol my change in the manual? http://www.mediawiki.org/w/index.php?title=Manual:Pywikipediabot/Wikidata&stable=0&shownotice=1
[01:56:55] done
[01:57:05] and i made you autochecked or whatever that means
[01:57:33] thanks
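On the two points above, string-valued ("raw") properties like P236 and the hand-built JSON that legoktm dislikes: wbsetclaimvalue takes its value parameter as JSON, so it can be produced with json.dumps instead of string interpolation. A hedged sketch; the helper name is made up:

    import json

    def claim_value_params(guid, value, token):
        # json.dumps handles quoting and escaping, so odd characters in
        # the value cannot silently corrupt the payload the way
        # '...%s...' % value interpolation can.
        # For a string-typed property, value is a plain str; for an item,
        # pass e.g. {'entity-type': 'item', 'numeric-id': 2488794}.
        return {
            'action': 'wbsetclaimvalue',
            'claim': guid,
            'snaktype': 'value',
            'value': json.dumps(value),
            'token': token,
            'format': 'json',
        }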
[05:35:52] Hello.
[05:36:00] hi
[05:36:23] I've got a small issue when I try to type a property name.
[05:36:31] There is auto-completion...
[05:36:43] And it works for most of the properties.
[05:36:55] But it doesn't accept "entity type"
[05:37:10] i think it got renamed
[05:37:15] to GND entity type
[05:37:34] what does GND stand for?
[05:37:42] idk
[05:37:48] I just added an alias for "entity type"
[05:38:50] Ok, thanks.
[05:39:14] Should I update [[Wikidata:List of properties]]?
[05:39:15] [1] https://www.wikidata.org/wiki/Wikidata:List_of_properties
[05:40:42] sure
[05:42:35] It would be more user-friendly if the auto-completion was not case-sensitive...
[05:43:00] it is???
[05:44:10] Well, if I start to type gnd instead of GND it doesn't offer the different options, for example...
[05:44:16] hmmm
[05:44:40] oh, i see
[05:44:47] lemme see if there's a bug for it
[05:46:57] legoktm: You want me to file a bug?
[05:47:14] https://bugzilla.wikimedia.org/show_bug.cgi?id=46185
[05:47:17] already did :)
[05:47:50] ok.
[05:47:59] :D
[08:20:28] hello
[08:20:48] o/
[08:20:53] hi legoktm :)
[08:21:05] I have a little technical issue with wikidata
[08:21:15] what's up?
[08:22:33] *when I add an interwiki which is used in another entry, I get a message... "Edit not allowed: ... it's used on [[QAAAA]]..." for example.
[08:22:33] the issue is,
[08:22:55] *those [[ ]] don't give me a link
[08:23:01] yeah
[08:23:02] they are just plain text
[08:23:05] you have to manually type it in
[08:23:13] ahhh :(
[08:23:22] i think there's a bug somewhere about it
[08:23:52] ohh well, I hope it gets fixed, at least it's not something about my browser, Opera
[08:24:50] nah, it's for everyone
[08:25:02] ok, thanks legoktm :) I'm going back to finish my edits
[08:25:04] greetings
[08:25:09] it was a general bug like "have better error messages"
[08:25:12] :)
[08:25:15] hehehe
[08:25:25] bye :)
[08:39:08] Is there any workaround for automatically updating the wikidata link when renaming a connected page at a local wikipedia?
[08:41:23] yes
[08:41:27] there are bots that will do it
[08:41:52] but it's not instant
[08:41:56] there's an open bug for it though
[08:42:18] thanks legoktm
[09:06:12] the status update has a link to this page: https://gerrit.wikimedia.org/r/#/q/%28status:open+project:mediawiki/extensions/Wikibase%29+OR+%28status:merged+project:mediawiki/extensions/Wikibase%29,n,z
[09:06:20] It shows a 500 Internal server error for me
[09:07:46] hmm
[09:07:50] i think gerrit is just acting up
[09:08:17] it's the "our commits" link
[09:08:24] yeah
[09:08:25] the "awaiting review" link works ok
[09:08:30] i tried constructing it manually and it didn't work
[09:31:31] New review: Daniel Kinzler; "(4 comments)" [mediawiki/extensions/Wikibase] (master) C: 1; - https://gerrit.wikimedia.org/r/53543
[09:36:59] New review: Daniel Kinzler; "(1 comment)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53986
[09:56:07] New review: Daniel Kinzler; "(4 comments)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53381
[10:00:08] DanielK_WMDE: do you have some minutes to have a look at an api module I'm writing?
[10:00:48] lbenedix: i'll have a look in a minute. i'll not be able to say much about all the JS stuff though.
[10:01:14] it's not about js stuff
[10:01:24] it's about internal api calls
[10:16:15] when I return $request->getValues() I get all required parameters: action="upload" filename="UIFeedback-screenshot-90.png" file="BINARY_DATA" token="TOKEN"
[10:16:32] but when executing the api call i get: code="missingparam" info="One of the parameters filekey, file, url, statuskey is required"
[10:20:33] Any ideas?
[10:42:34] New review: Daniel Kinzler; "looks sane, haven't tried." [mediawiki/extensions/Wikibase] (master) C: 1; - https://gerrit.wikimedia.org/r/53414
[10:58:25] Is it possible to see the raw request I'm creating with new ApiBase(...)?
[11:03:55] New patchset: Daniel Kinzler; "(bug 45568) propagate claim and label changes [DNM]" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53991
[11:58:11] New patchset: Daniel Kinzler; "(bug 45568) propagate claim and label changes [DNM]" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53991
[11:59:58] New patchset: Daniel Kinzler; "(bug 45568) propagate claim and label changes" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53991
[12:01:27] lbenedix: sorry, i'm at a symposium; my attention was unexpectedly taken by the actual presentation :)
[12:02:48] np
[12:02:50] lbenedix: i haven't used internal API calls. they are pretty clunky. are you sure you need them?
[12:03:01] what are you actually trying to do?
[12:03:23] (upload? what are you uploading?)
[12:03:30] one of the javascript libraries i use uploads the text feedback together with a png
[12:03:50] the png is either base64-encoded or in $_FILES
[12:03:58] lbenedix: the png will work, but uploading text files probably won't.
[12:04:24] but... why do you use an *internal* api call for uploading? I'm confused about the control flow
[12:04:33] i send the request to my api module
[12:04:33] the file is uploaded by the browser, no?
[12:05:35] Originally I sent it to a shabby specialpage_api and stored the binary data from the png in my table
[12:05:43] I was told not to do that
[12:06:00] can't you just upload the image directly using the upload API?
[12:06:18] looping it through your own module seems unnecessary
[12:06:37] but sending it to my api is already done
[12:06:56] but now it's causing problems :)
[12:06:56] ok, time for lunch, bbl
[12:07:12] i think the problem is that the internal api call is using GET
[12:07:28] $api->getRequest()->getMethod() == 'GET'
[12:07:39] I believe the upload api wants POST
[12:14:05] Lydia_WMDE: Hi. There was a post of yours in Project chat mentioning "if interwiki is not going to show up in say 5 days please let me know".
[12:15:36] the only one I can find now is http://www.wikidata.org/w/index.php?title=Wikidata:Project_chat&diff=10495284&oldid=10494314
[12:16:50] but there was another one which also IIRC said something like "automatic purging should be deployed together with the way to change interwikis directly from Wikipedia"
[12:17:14] Do you remember the second one? Can't find it now...
[12:18:22] And, by the way, I have a Wikipedia page which has not been autopurged in like six days...
[12:25:58] ok, back
[12:26:31] Lydia_WMDE: did you get those messages I wrote several minutes ago?
[12:35:47] New patchset: Aude; "(bug 45037) show edit link only if we have repo links" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53543
[12:36:22] New review: Aude; "(1 comment)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53543
[12:37:08] New patchset: Aude; "(bug 45037) show edit link only if we have repo links" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53543
[12:39:21] aude: I am restarting Jenkins
[12:56:44] hashar: ok :)
[12:57:49] Utar: Lydia_WMDE isn't currently online (only her irc client is), but she'll look into it.
[12:58:10] legoktm: I've just fixed the Wikidata summaries on Meta for es: again
[12:58:11] Utar: we know there are issues with purging, and we are looking into it.
[12:58:17] New review: Aude; "recheck" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53543
[12:58:36] DanielK_WMDE: it's http://www.wikidata.org/w/index.php?title=Q318328&diff=10761472&oldid=10577599 for your information
[12:58:59] still around a third of the Wikipedias are lacking the link
[12:59:11] I can't understand why addaway changed them back in https://meta.wikimedia.org?diff=5307132
[13:00:17] and I remember reading Lydia's comment somewhere, "it is autopurging but slow" and "faster autopurging is going to be deployed with direct changes of interwikis at Wikipedia"
[13:01:07] Utar: only autopurging for items that have been changed (sitelinks) since the client was deployed to those wikipedias
[13:01:35] when people or bots go around and remove links, then
[13:01:42] pages will get purged that way
[13:01:53] but not for adding new links?
[13:01:53] ah, yes
[13:01:55] that is slow
[13:02:07] we're working to make it scale better and faster
[13:02:29] good
[13:06:34] aude: so it should autopurge, just the function is quite slow
[13:06:50] and takes several days instead of minutes
[13:07:47] aude: jenkins restarted
[13:12:17] Utar: it should, but several days concerns me
[13:12:32] * aude looking at the item
[13:15:38] aude: there isn't any rc entry corresponding to that edit on sqwiki (which should have been updated, but still has the old version of the links cached)
[13:16:23] seems like some notifications get lost, or I got the filtering wrong, and the dispatcher decided that wiki doesn't need to be notified
[13:16:32] could be related to slave lag. seems unlikely, though
[13:16:37] i wonder if stuff gets purged
[13:16:48] err... pruned before the jobs get to clients
[13:16:55] changes.
[13:17:13] we keep 48 hours of changes. that should be plenty
[13:17:45] DanielK_WMDE: not so sure
[13:17:55] * aude looking
[13:19:30] aude: 24 hours
[13:19:38] > select min(change_time) from wb_changes;
[13:19:40] +------------------+
[13:19:41] | min(change_time) |
[13:19:43] +------------------+
[13:19:44] | 20130315111502 |
[13:19:46] +------------------+
[13:20:44] * aude looking for select * from wb_changes where change_object_id = 'q4705940';
[13:21:00] just made a new edit and edited that item very early on march 15 (yesterday)
[13:21:14] not in my eswiki watchlist.... seeing if the new edit appears
[13:21:24] nothing in the changes table
[13:22:39] > select max(rc_timestamp) from recentchanges where rc_type = 5;
[13:22:41] +-------------------+
[13:22:42] | max(rc_timestamp) |
[13:22:44] +-------------------+
[13:22:46] | 20130315131813 |
[13:22:47] +-------------------+
[13:22:48] aude: ---^
[13:23:03] it does seem like things are getting lost, but it's hard for me to see when, where, why or how.
[13:23:04] yep
[13:23:40] we'll need some support, or more privileges, to really investigate this
[13:23:57] obviously we need better tracking, and the scripts need to be more adjustable to whatever the lags are
[13:24:08] aude: we could ask for the pruning to be set to keep 3 days or so.
[13:24:14] yes
[13:24:22] but we should really not get 24 hours lagged. we shouldn't even get 1 hour lagged.
[13:24:22] one thing we can do
[13:24:26] yes
[13:24:45] boosting the dispatch process is the first thing for me to do in the next sprint
[13:24:49] not before the branch, though
[13:24:56] same here
[13:25:15] * aude not convinced oldest-first is sufficient, but better than random
[13:25:36] shall investigate, but in the meantime we can tweak the cron jobs
[13:26:45] jem-: I don't think I ever intended to make that change
[13:26:50] * aude wonders how long the toolserver lag is?
[13:27:00] I think that may have been a copy and paste from two different versions of stuff
[13:28:33] aude / DanielK_WMDE: Did you already think about https://www.wikidata.org/wiki/Special:MostInterwikis ?
[13:29:25] multichill: no
[13:29:27] multichill: hm?
[13:30:43] https://www.wikidata.org/wiki/Q5296 would be at the top
[13:32:19] ( SELECT ips_item_id, COUNT(ips_item_id) FROM wb_items_per_site GROUP BY ips_item_id ORDER BY COUNT(ips_item_id) DESC LIMIT 100; )
[13:45:47] multichill: a ranking by number of sitelinks would be kind of interesting, but would it be useful?
[13:46:43] I think so. It would be a nice feature to find items that have a lot of sitelinks and don't exist in a certain wiki
[13:47:30] You could also do some nice importance rankings of items (number of sitelinks * size of each article, for example)
[13:52:23] addaway: Yes, that would make sense
[13:52:41] multichill: yes, we are already considering this for ranking search results / suggestions in the item selector
[13:53:02] it's a useful measure, i'm just not sure if it's useful as a special page
[13:53:05] I'm wondering how SOLR could fit into this
[13:53:25] it fits well, we already have code for solr integration
[13:53:37] but we don't have the infrastructure to run it yet
[13:55:06] multichill: is there a request on bugzilla for this special page? go ahead and file it if you like
[13:55:14] sounds like a nice volunteer project
[13:55:30] Not sure yet what I would like. Just did http://toolserver.org/~multichill/temp/queries/wikidata/top_not_nlwiki.txt
[13:56:37] Lots of category and template crap
[13:56:54] multichill: ah, "lots of links, but not to X" is an interesting perspective!
[13:57:23] I used to run a query like that every once in a while to find interesting subjects to write an article about
[13:58:31] make an email service from this :)
[13:58:56] Weird, https://www.wikidata.org/wiki/Q223 doesn't have a nlwiki link
[13:59:54] Oh, right, https://www.wikidata.org/wiki/Q4148644
[14:01:09] ick!
[14:01:17] high time i finish the merge/redirect feature
[14:01:27] keeps getting pushed back :(
[14:02:24] multichill: it's something we have on the road map, it just needs a bit more coding. it's perhaps important for the community to know that this is on the horizon
[14:02:42] New special pages, what extension to file that under?
[14:07:56] DanielK_WMDE: https://bugzilla.wikimedia.org/show_bug.cgi?id=46217
[14:20:15] can you tell how many users on wikidata.org are using IE?
[14:28:54] New review: Daniel Kinzler; "(2 comments)" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/49066
[14:41:44] New review: Aude; "recheck" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53543
[14:56:00] lbenedix1: too many
[14:56:38] hehe
[14:56:45] and in %?
[14:57:10] i don't know
[14:57:41] lbenedix1: 18% for wikimedia overall, according to http://stats.wikimedia.org/wikimedia/squids/SquidReportClients.htm
[14:58:13] probably less for wikidata at the moment, because it's mostly interesting to the "early adopter" crowd, which is less likely to use IE
[14:58:30] * lbenedix1 loves that yellow background
[14:58:48] * DanielK_WMDE loves the beveled tables. they are so 90s.
[14:59:09] and there was an issue for IE8 that was "fixed" and reopened several times
[14:59:42] 15% mobile, that's a lot...
[16:11:33] New review: Daniel Werner; "(1 comment)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53986
[16:21:13] New review: Daniel Werner; "(1 comment)" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/50753
[16:27:55] Hello people !
[16:28:03] There is a problem in Wikidata
[16:28:53] When I change the case of an entry, the Save link is greyed out, so I cannot save !
[16:29:12] But without this change, the link is not the right one
[16:29:32] Link?
[16:32:45] which entry are you talking about?
[16:57:55] I am talking about this : http://www.wikidata.org/wiki/Q768650
[16:58:25] I wanted to put the correct case for the French name
[16:58:42] And in this case ( ! ) it is impossible
[17:01:56] I can change the case
[17:02:24] can you describe what exactly you are doing?
[17:02:49] maybe a screenshot could help to reproduce that error
[17:06:51] I was trying to change the French name "Le Roi est mort, vive le Roi !" to "Le roi est mort, vive le roi !"
[17:08:44] I can change the case too, but then I cannot save, "save" is greyed out
[17:09:25] Nnemo: so you are saying that if all you change is the capitalization of characters, the save button stays grey?
[17:09:43] Yes
[17:09:51] what browser are you using?
[17:09:58] I just tried it and can't confirm
[17:09:59] Safari on Mac
[17:10:31] did you copy-paste your string?
[17:11:06] I don't have a Mac and Safari here, but it works in Chrome for me
[17:11:59] * lbenedix1 thinks a feedback mechanism that sends all relevant data including the browser, operating system and a screenshot with highlights would be useful
[17:13:02] could be ^^
[17:15:24] * lbenedix1 realizes that he is developing such a feedback mechanism and needs some help with the upload of the rendered screenshot
[17:16:56] Well, I have the problem in Firefox too, this stuff is broken
[17:17:24] which version?
[17:17:40] What help do you need with uploading ?
[17:18:04] Nnemo: I'll check it in ff
[17:18:23] Yes, but there are synchronization problems
[17:18:24] Nnemo: No, works for me in FF
[17:18:46] Now, I see the sentence with "roi"
[17:18:55] the image is rendered into a canvas object and I want to send this to the fileupload api
[17:18:59] because I changed it, Nnemo
[17:19:53] No, not you. The robot did.
[17:20:04] ?
[17:20:11] Sure.
[17:20:21] works for me with FF 19.0.2
[17:21:03] can you have a look at the JavaScript console?
[17:21:22] Let's say I want to write "Roi". I don't, but let's test.
[17:21:44] In Firefox, I write "Roi" in both words in the French name
[17:22:11] "Save" is disabled, although this page exists in the French Wikipedia
[17:22:34] ahh, you are editing the links... I changed the title
[17:22:42] me too^^
[17:23:00] right, the save link is grey
[17:23:02] could be because of a redirect then
[17:23:11] Yes, of course, the links
[17:23:35] That's what this Wikidata thing is there for
[17:23:50] Changing the title works OK
[17:24:39] Ah ok, no redirect, but it seems like a bug in MW's autocompletion API module where the lowercase title is simply not returned.
[17:25:44] So what do I get ? I get a list which suggests "Le Roi est mort, vive le Roi !" and "Le Roi est mort, vive le Roi ! (album)". But no "Le roi est mort, vive le roi !", although this is the name of the article.
[17:25:57] Then, two more problems.
[17:26:09] yes, a bug in MW's "opensearch" API module.
[17:28:52] 1) If I leave my typed entry "Le Roi est mort, vive le Roi !", "Save" is greyed out. But if I choose it from the list of suggestions, "Save" becomes blue. *Exactly the same text.* This is stubborn, not very user-friendly.
[17:30:52] Can't confirm. For me it is always disabled as it should be (after clicking edit, when the same text is in the box already)
[17:31:02] 2) So I have chosen the entry from the list, and now "Save" is enabled. If I click "save", what gets saved is… "Le roi est mort, vive le roi !", with lowercase letters ! This is nonsense.
[17:31:42] Sounds like there is a redirect between the two pages, in that case it behaves like this
[17:32:01] when clicking on the suggestion it's blue for me too
[17:32:48] pasting "Le Roi est mort, vive le Roi !" into the box leads to a grey save link
[17:33:03] This is wrong.
[17:33:13] And it is not specific to pasting.
[17:33:42] selecting it from the suggestions, it is blue and the text in the box is "Le Roi est mort, vive le Roi !"
[17:34:18] Page names in Wikipedia are case-sensitive.
[17:34:22] i can't see any difference in the value of the box when selecting it from the list or typing/pasting it in
[17:36:36] Of course you can't see any diff. The engine does not care. It just stubbornly refuses the user's typed text until the user agrees to choose an entry from the engine's list.
[17:37:41] that is strange
[17:37:43] This list gives suggestions. If suggestions were meant to be a mandatory step, they should be radio buttons !
[17:38:00] Nnemo: About the first bug with case sensitivity: I now think the page "Le roi est mort, vive le Roi!" simply doesn't exist on the wikipedia you are using.
[17:38:39] I was confused at first because http://sv.wikipedia.org/wiki/Le_roi_est_mort%2C_vive_le_Roi! looks like there is a page, but it's just some generic info for empty pages
[17:38:39] Yes, it exists, but with a space before the "!"
[17:39:08] can you give me a link please?
[17:39:10] http://fr.wikipedia.org/wiki/Le_roi_est_mort,_vive_le_roi_!
[17:39:21] Great minds think alike :-)
[17:39:40] alright
[17:39:41] http://fr.wikipedia.org/w/api.php?action=opensearch&search=Le%20roi%20est%20mort,%20vive%20le%20roi%20!
[17:40:05] it's not there, I'll file a bug
[17:42:07] Thank you people for correcting that mess.
[17:42:11] Cheers !
[17:42:19] Thanks for reporting!
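The opensearch check legoktm just ran can be scripted for other titles too. A small sketch, assuming the standard opensearch module and Python 3:

    import json
    import urllib.parse
    import urllib.request

    def opensearch(site, query):
        # Ask MediaWiki's opensearch module for title suggestions; the
        # second element of the response is the list of suggested titles,
        # i.e. exactly what the site link widget's suggester gets to see.
        url = 'https://%s/w/api.php?action=opensearch&format=json&search=%s' % (
            site, urllib.parse.quote(query))
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)[1]

    # e.g. opensearch('fr.wikipedia.org', 'Le roi est mort, vive le roi')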
[17:44:54] Danwe_WMDE: can you help me with the file upload?
[17:45:12] https://bugzilla.wikimedia.org/show_bug.cgi?id=46228
[17:45:27] Nnemo ^^
[17:45:38] lbenedix: What's the issue?
[17:46:05] I have a canvas element which i want to send to the fileupload api
[17:46:36] ok
[17:47:12] I tried sending it to my api module and forwarding the file with DerivativeRequest, but that is not working
[17:47:38] the next try was sending it via js to the fileupload api, also not working
[17:52:22] mh, it's been a long time since I used that API module
[17:52:42] not even sure it was that one
[17:53:37] what does the fileupload api return?
[17:54:00] code="missingparam" info="One of the parameters filekey, file, url, statuskey is required"
[17:54:20] that was with the forwarding method
[17:54:45] printing $api->getRequest()->getValues() gives me: action="upload" filename="UIFeedback-screenshot-90.png" file="BINARY_DATA" token="TOKEN"
[17:58:22] My guess would be that something is wrong with what you submit in "file"
[17:58:49] saving that data to a file is working
[17:59:16] saving with fputs
[18:00:34] ok, weird
[18:01:05] sending the data as a data: URI in the url parameter leads to "permission denied"
[18:01:39] how can i find out if my user is allowed to upload files?
[18:02:15] it's only
[18:03:14] the easiest thing would be to just try to upload with that user via the special page for uploading
[18:03:57] i really don't know that much about the file upload (api). Just remember that dealing with the import api has been a pain for me in the past
[18:04:28] the datauri thing is not working:
[18:05:02] the missing permission was upload_by_url, right
[18:05:51] but you are one of the javascript guys, right?
[18:06:33] maybe you can help me with the "upload via js" approach
[18:07:58] lbenedix: Yes, I am. Didn't really think much about uploading stuff so far, but I can try ;)
[18:09:53] I need to build a request like this: http://www.mediawiki.org/wiki/API:Upload#Sample_Raw_Upload
[18:10:20] I think only action, filename, token and file are mandatory
[18:12:46] right, so what's the problem? you can use mw.Api.prototype.post
[18:13:33] Perhaps also take a look at the upload wizard of commons, I imagine that one has to deal with uploads a lot ;)
[18:14:35] I'm not a big javascript developer ;)
[18:15:23] this is not working: http://pastebin.com/FLjvk35B
[18:15:53] btw: I'm not a big mediawiki developer either
[18:21:43] mh, try to use a higher-level function, use mw.Api
[18:21:53] var api = new mw.Api(); api.post( ... )
[18:22:05] i don't know how to get the file into that right
[18:22:22] the same way you are trying there
[18:23:05] the example request is: Content-Disposition: form-data; name="file"; filename="Apple.gif" Content-Type: application/octet-stream; charset=UTF-8 Content-Transfer-Encoding: binary
[18:24:15] var api = new mw.Api(); params = { action: 'upload', filename: 'foo.jpg'... }; api.post( params )
[18:24:45] it will return a jQuery promise so you can use .don()
[18:24:46] *.done()
[18:28:53] "code":"badupload_file","info":"File upload param file is not a file upload; be sure to use multipart\/form-data for your POST and include a filename in the Content-Disposition header."
[18:29:26] the Content-Type is: Content-Type:application/x-www-form-urlencoded; charset=UTF-8
[18:29:58] sounds good, so now you only have to sort out how to get the file in, but at least your connection seems ok
[18:30:34] with the code from http://stackoverflow.com/questions/4998908/convert-data-uri-to-file-then-append-to-formdata I get binary data from the canvas object
[18:30:46] but how do i stuff it into the request?
[18:31:20] lbenedix: I am not sure, but where I would start looking is that extension: http://www.mediawiki.org/wiki/Extension:UploadWizard
[18:32:01] I have, but it's tons of js
[18:32:49] yeah, so just search for the usage of the API ;)
[18:33:18] okay.... i've found the api.post
[18:34:59] no... seems that they are using XMLHttpRequest
[18:36:10] and they are handling real files
[18:37:05] so they are not using the upload api at all?
[18:37:21] are they providing their own api?
[18:39:14] i think they use the upload api
[18:39:28] i found action: 'upload'
[18:39:39] they probably use both together
[18:40:15] yes, i think i have to build the request by myself and can't use api.post
[18:50:31] \o?
[18:50:37] \o/
[18:51:05] Danwe_WMDE: Thanks a million!
[18:51:07] http://lbenedix.monoceres.uberspace.de/mediawiki/index.php?title=File:AAAAAAA.png
[18:55:24] great :)
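For reference, what the working request boils down to: action=upload wants a POSTed multipart/form-data body whose "file" part carries a filename in its Content-Disposition header; that is exactly what the badupload_file error above complained about. A rough Python analogue of the browser-side request, with png_bytes standing in for the decoded canvas data and a session with upload rights assumed:

    import urllib.request
    import uuid

    def upload_png(api_url, filename, png_bytes, token):
        # Hand-built multipart/form-data body for MediaWiki's action=upload.
        boundary = uuid.uuid4().hex
        body = b''
        for name, value in (('action', 'upload'), ('filename', filename),
                            ('token', token), ('format', 'json'),
                            ('ignorewarnings', '1')):
            body += ('--%s\r\nContent-Disposition: form-data; name="%s"'
                     '\r\n\r\n%s\r\n' % (boundary, name, value)).encode()
        # The file part needs a filename in Content-Disposition, otherwise
        # the API answers with badupload_file.
        body += ('--%s\r\nContent-Disposition: form-data; name="file"; '
                 'filename="%s"\r\nContent-Type: application/octet-stream'
                 '\r\n\r\n' % (boundary, filename)).encode()
        body += png_bytes + ('\r\n--%s--\r\n' % boundary).encode()
        req = urllib.request.Request(api_url, body, {
            'Content-Type': 'multipart/form-data; boundary=' + boundary})
        with urllib.request.urlopen(req) as resp:
            return resp.read()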
[18:56:58] there is one more issue ;)
[18:57:17] the css of the extension is not loaded properly in that installation
[18:58:05] i don't know if it's a mediawiki or a chrome bug
[19:00:01] don't know, but it could be something missing/wrong in your extension's resources definition
[19:16:22] Danwe_WMDE: it's working perfectly well in one mediawiki installation and it's broken in another
[19:20:37] MW versions? Other extensions?
[19:21:07] i sent you the two links as a private message
[19:21:12] could be a missing resource loader module dependency which in the other wiki is simply loaded for some reason (e.g. because another extension requires it)
[19:21:29] the extensions are the same
[19:22:17] the working one is mediawiki 1.21alpha (bd6128d)
[19:22:43] the broken one is 1.21alpha (75f8f4c)
[19:23:41] and most of the css is working, only the placement of the feedback window and the size of one div are strange
[19:45:50] New review: Daniel Werner; "(3 comments)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53545
[19:55:14] New patchset: Hoo man; "(linkItem) Always dismiss tooltips on dialog click" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54142
[20:31:21] New review: Daniel Werner; "(1 comment)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53545
[20:40:21] New review: Daniel Werner; "(1 comment)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53734
[21:04:48] New patchset: Hoo man; "Use the watchlist options given by the hook" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54147
[21:05:11] New patchset: Aude; "Put sql classes in store/sql directory in lib" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54148
[21:08:50] Danwe_WMDE: get JeroenDeDauw and get out of the office! Now!!
[21:10:27] New review: Aude; "so simple :)" [mediawiki/extensions/Wikibase] (master) C: 1; - https://gerrit.wikimedia.org/r/54147
[21:12:51] New review: Aude; "but would like Tobi or Anja to look at the selenium tests" [mediawiki/extensions/Wikibase] (master) C: 1; - https://gerrit.wikimedia.org/r/53414
[21:13:11] * aude wants chocolate :)
[21:13:44] * Jeblad_WMDE has chocolate
[21:13:48] Jeblad_WMDE: https://gerrit.wikimedia.org/r/#/c/53606/
[21:13:56] do you still get those errors?
[21:13:58] 250g Ritter Sport
[21:14:16] * aude thinks whatever issue, if any, is unrelated to that change
[21:14:21] mmmm....
[21:15:00] 8) ...
[21:15:20] Man opened his eyes during autopsy! My kind of movie..
[21:15:45] heh
[21:16:14] Only problem, where is his body...
[21:17:29] * aude wonders how many beers Jeblad_WMDE has consumed
[21:17:29] "Why haven't you done the autopsy?"
[21:17:40] "Oh, well, he blinked at me..."
[21:17:43] lol
[21:17:54] None so far! =D
[21:17:58] :o
[21:21:24] hoo: https://bugzilla.wikimedia.org/46232
[21:21:26] what do you think?
[21:24:38] aude: I'll check now
[21:24:55] * Jeblad_WMDE decapitates the bug and IT IS STILL ALIVE!!
[21:25:22] heh
[21:25:57] aude: Well...
[21:26:38] we had a bug earlier saying the exact opposite (consistent user interface behavior)
[21:26:47] hmmmm
[21:26:51] which one?
[21:27:14] or we could let IPs edit?
[21:27:29] but if they are logged in on enwiki but not wikidata, then ask them to log in?
[21:29:13] Change merged: Aude; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54142
[21:29:39] mh... personally I would show a link to the repo for JS users and (maybe) anons, and use the widget for logged-in users
[21:29:44] but someone said we won't do that
[21:29:49] John, probably
[21:30:23] we could ask denny, but i think 99% of users (the anons, the readers) will get confused
[21:30:30] if they click and get the error
[21:30:37] yeah
[21:30:42] it's the logged-in users who will use it and understand
[21:31:32] Yes... IMO we could completely hide it for anons
[21:31:45] the interface is already inconsistent for logged-in vs. logged-out users
[21:31:53] (e.g. move page is only available for users)
[21:31:54] sure
[21:32:36] I have a little concern that i want to discuss
[21:32:38] http://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/Dexbot_2
[21:32:41] sending anons to Special:CreateItem is not a great idea either
[21:32:44] or it would be interesting
[21:33:04] Do you think it's correct to add the source "Imported from English Wikipedia"?
[21:33:06] that could be subject to a UI study with a limited user set or smth
[21:33:18] but it's not good for production
[21:33:19] WP is not a reliable source
[21:33:36] it's better to wait and add better sources later
[21:33:47] what do you think
[21:33:51] hoo: agree
[21:34:52] Amir1: i don't know about the bot discussions and how such properties are being handled
[21:37:16] aude, Unrecognized value for parameter 'action': wbgetentities
[21:37:22] It's still there
[21:37:45] hmmmm
[21:38:33] I wonder if it is a redirect that ends up in some weird place on my computer
[21:39:01] i get the nicer error bubble that says the link is already used
[21:39:41] Jeblad_WMDE: That error should really never occur... can you try to chase it down further?
[21:40:25] I purged the repo item, and it is still there
[21:40:55] Another thing: when there is an error and I close the dialog box, the error bubble is still there
[21:41:19] Well... this error comes from one of those: a serious bug in wikibase.RepoApi, a bug in the MediaWiki API (very unlikely) or a flawed configuration
[21:41:23] It probably has the wrong parent
[21:41:38] Jeblad_WMDE: That has just been fixed ;)
[21:41:53] (aude just merged the fix)
[21:42:30] i still get sent to Special%3AItemByTitle%2Fenwiki%2FPanama
[21:42:47] Oh, so she wanted chocolate.. xD
[21:43:18] aude: where?
[21:43:42] hoo: once i do link an item to the repo
[21:43:45] The JS wikitext parser is awry
[21:43:49] The requested URL /wiki/Special:ItemByTitle/enwiki/Panama was not found on this server.
[21:44:16] if i manually decode the %2F to slashes, then it works
[21:44:49] mw.config.get( 'wbRepoUrl' ) + mw.config.get( 'wbRepoArticlePath' ).replace( /\$1/g, encodeURIComponent( title ) );
[21:44:58] hmmm
[21:45:32] seems like the slashes are getting double-escaped or something
[21:46:21] ah, got it
[21:46:23] :)
[21:46:30] good
[21:46:30] mw.util to the rescue :)
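The root cause in the snippet above: encodeURIComponent escapes the whole title, slashes included, so Special:ItemByTitle/enwiki/Panama turns into ...%2Fenwiki%2FPanama, while mw.util builds the link the MediaWiki way. A rough Python illustration of that slash-preserving encoding (an approximation, not MediaWiki's exact escaping rules):

    from urllib.parse import quote

    def wiki_url(article_path, title):
        # Spaces become underscores and the title is percent-encoded, but
        # '/' and ':' are left alone, so titles like
        # Special:ItemByTitle/enwiki/Panama keep their path structure.
        return article_path.replace(
            '$1', quote(title.replace(' ', '_'), safe='/:'))

    # wiki_url('/wiki/$1', 'Special:ItemByTitle/enwiki/Panama')
    #   -> '/wiki/Special:ItemByTitle/enwiki/Panama'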
[21:47:36] * Hazard-SJ says hello
[21:47:56] Hazard-Bot to the rescue :P
[21:48:43] New patchset: Hoo man; "Fix title encoding for repo titles in linkItem" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54203
[21:48:48] aude: ^ ;)
[21:48:52] ok
[21:49:30] the dependency change is just for sanity.... mediawiki.util is already required by various other modules
[21:49:33] Something makes my client access itself, and that's why it can't find wbgetentities
[21:50:01] Jeblad_WMDE: :/
[21:51:24] Jeblad_WMDE: Could you check the JS variables wbRepoUrl and wbRepoScriptPath?
[21:51:40] they are generated from your client configuration... if that's fine, those should be as well
[21:52:43] It's the config that is broken
[21:54:29] Jeblad_WMDE: Your config or the way WB handles it?
[21:55:37] hoo: encoding works now, including complex page titles like Côte d'Ivoire :)
[21:55:50] * aude tries one with a question mark
[21:55:55] :)
[22:03:28] my config was wrong
[22:03:35] New review: Aude; "works correctly with various complex page titles" [mediawiki/extensions/Wikibase] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/54203
[22:03:37] Change merged: Aude; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54203
[22:03:42] \o/
[22:04:04] It seems like the js solution tries to use wbScriptPath no matter whether it is defined or not
[22:04:58] Jeblad_WMDE: Sure, that's the only way (apart from wild guessing)
[22:05:34] i think it's required to be defined in the client
[22:05:51] at some point, we'll likely switch to using the sites table but
[22:05:52] Yeah, the example config has it
[22:06:02] that means having a way to put test wikis in the sites table
[22:06:27] and might mess with the assumption that my test wiki is supposed to behave like enwiki
[22:06:34] and test2 wiki, etc.
[22:08:36] the repoScriptPath in the examples is not correct in a lot of cases
[22:09:24] Examples aren't meant to be correct... you can give them a more likely value, though
[22:09:29] especially two mw instances on one server.. that breaks a lot of assumptions.
[22:09:50] Jeblad_WMDE: they are just examples
[22:10:40] no they are not, if they are set up as part of a default working environment
[22:11:01] But I'm not sure if it is possible to test for the situation in a sane way
[22:11:41] Except testing for the error you get back, and that is kind of foolish when the error only emerges in a dev setup
[22:16:34] New review: John Erling Blad; "Removed the -1, this come from a mis-configuration in my server. Not sure, but I think we should add..." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53606
[22:21:06] DanielK_WMDE: to keep you updated: the upload is now done in JS
[22:22:03] the missing link was converting the dataURI to binary data
[22:30:26] New review: John Erling Blad; "Verified that previous errors are gone now." [mediawiki/extensions/Wikibase] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/53414
[22:31:20] there is still a css bug that wants to be squished
[22:36:28] Change merged: Aude; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/53975
[22:37:40] are there any standard (python) bot scripts that I can use to import articles? i have looked at the pywikipedia stuff, but it still seems rudimentary and not very simple to use...
[22:37:57] Jhs: import how? what?
[22:38:00] * aude thinks not yet
[22:38:33] except for various bot scripts, perhaps
[22:38:44] to do specific tasks
[22:39:11] aude, like User:Dexbot imports from Farsi, User:MerlIwBot imports from Indonesian, etc
[22:39:21] (MerlIwBot imports from lots of places)
[22:40:15] i don't know what dexbot uses, but it might be pywikipedia (+ other stuff)?
[22:40:43] I think Merlissimo uses some java scripts or something
[22:40:48] do you mean site links?
[22:40:52] or properties?
[22:41:07] sitelinks
[22:41:14] (not javascript -- he uses java)
[22:41:28] ah, there might be scripts but i don't know which ones
[22:41:50] it's just that i see so many bots importing sitelinks, so i thought maybe there was some shared script
[22:41:56] but maybe they all just write their own scripts
[22:42:18] i think there was some intention to have pywikipedia be capable
[22:43:22] not sure what benestar uses...
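Whatever frameworks the individual bots use, a sitelink import ultimately boils down to one wbsetsitelink call per page. A minimal sketch of that call (login cookies, error handling and throttling are all omitted here; a real bot needs them):

    import json
    import urllib.parse
    import urllib.request

    def set_sitelink(item_id, site, title, token):
        # Attach page `title` on wiki `site` to the given item.
        data = urllib.parse.urlencode({
            'action': 'wbsetsitelink',
            'id': item_id,        # e.g. 'Q42'
            'linksite': site,     # e.g. 'nowiki'
            'linktitle': title,
            'token': token,
            'format': 'json',
        }).encode()
        req = urllib.request.Request('https://www.wikidata.org/w/api.php', data)
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)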
[22:55:53] lbenedix: uploading the screenshot will reveal the IP of an anonymous user, right?
[22:56:11] or does it somehow use a fake user account?
[22:56:21] that would be great
[22:56:31] but i don't know how to do that
[22:57:32] damn... i missed that point...
[22:58:53] is it possible to get a static edit token for a fake user?
[22:59:06] and wouldn't it be bad to give users that edit token?
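Part of the answer to the token question: edit tokens are bound to the session that requests them, which is why a shared "static" token for a fake account would effectively let anyone edit as that account. For reference, a sketch of the per-session fetch with the current token API (the token endpoint differed back in 2013); the opener argument is assumed to carry the session cookies:

    import json
    import urllib.parse

    def get_csrf_token(opener, api_url):
        # opener is e.g. a urllib OpenerDirector built with an
        # HTTPCookieProcessor, so the token matches that login session.
        params = urllib.parse.urlencode({
            'action': 'query', 'meta': 'tokens',
            'type': 'csrf', 'format': 'json'})
        with opener.open(api_url + '?' + params) as resp:
            return json.load(resp)['query']['tokens']['csrftoken']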
[23:07:26] jhs, there are scripts for pywikipediabot to import sitelinks
[23:08:00] There are also scripts for importing properties, but only some limited ones
[23:08:31] MerlIWbot is using a special framework, at least for one of the bots
[23:08:38] Jeblad_WMDE, i know, i have tried one briefly ( http://wikidata.org/wiki/Special:Contributions/JhsBot ), but since I'm not a scripter I don't know how I can make it (semi-)automatic, i.e. fetch articles from categories or special:allpages or whatnot
[23:09:19] It is rather special and figures out how sitelinks fail
[23:09:59] Not sure who runs the property-importing bot
[23:10:54] Enhanced changes suck
[23:11:05] the code's scary
[23:18:16] hoo: yes
[23:18:48] no choice but to improve it enough to support it
[23:19:26] I think I've found my way through to be able to hook it up so that we can support it
[23:19:34] oooh
[23:19:41] I'll try that tomorrow... no promises, though
[23:20:07] there are issues then with formatting the changes
[23:20:07] We need two hooks, as it has different methods for grouped and single lines :P
[23:20:20] right
[23:20:55] aude: Have you worked on that already?
[23:20:55] i'd like there just to be a class or something that represents a line
[23:21:00] yes
[23:21:09] "# Classes to apply -- TODO implement"
[23:21:10] more granularity
[23:21:51] right now it does a mix of concat stuff and output stuff
[23:22:42] it's a bit challenging to do a big overhaul, however, and keep compatibility with all the extensions, etc.
[23:23:42] does one of you have an idea why ?debug=true leads to a completely broken extension?
[23:24:08] it's broken in chrome 27.0.1438.7 dev-m and working in ff 19
[23:24:08] lbenedix: Check your rewrite rules
[23:24:11] oh :P
[23:24:39] i'm still searching for the chrome css bug
[23:24:43] probably you didn't set the dependencies correctly, did you?
[23:25:06] aude: Yeah... so we can't just hook that up and inject our own HTML?
[23:25:29] if i did it wrong it shouldn't work without debug=true
[23:25:43] hoo: it's not a great solution, but for the regular format we "wipe out" the default html and replace it
[23:25:51] lbenedix: Depends... often things randomly work
[23:26:04] it would be better to replace specific parts like link href
[23:26:05] aude: I know... it's not good, but it works
[23:26:09] heh
[23:26:18] it's fragile and has no tests, as a result
[23:26:23] @todo fix that
[23:26:57] I would like to have the whole ChangesList thing rewritten... but I guess that's not going to happen
[23:27:53] and even if it would... who would merge a 1.5k line change breaking b/c
[23:28:22] maybe i hijacked some dependencies from wikibase...
[23:28:39] Jeblad_WMDE: Hello
[23:28:47] with all other extensions removed it's not working in ff
[23:29:11] lbenedix: Just get your dependencies straight and try it afterwards
[23:29:36] I was sure I had all dependencies in the list
[23:29:43] implicit dependencies are like asking for trouble
[23:29:58] hoo: what i've attempted was to use the FetchChangesList hook; we could introduce a new way of doing stuff there
[23:30:12] but keep the old stuff for backwards compatibility, somehow
[23:30:24] * aude needs to poke at it more but has other priorities
[23:30:40] if you can figure out something that would work for now, that would be a big help
[23:31:03] I'll do it tomorrow... it won't be super beautiful, but, well
[23:31:21] the clean changes extension does what i tried
[23:31:27] adds a third format
[23:31:30] can i get a list of all dependencies loaded?
[23:32:09] the code is not hugely better, but is improved over core
[23:32:28] lbenedix: Well, you need mw.loader.getState and mw.loader.getModuleNames
[23:32:38] loop the second one into the first one :P
[23:33:45] aude: mh... maybe I'll have a look at that extension also... but my current approach is to just overwrite the HTML like we do for the normal changes
[23:34:08] ok, so introduce hook points for that.....
[23:34:13] Yep
[23:34:44] it's a little complicated because of the way the grouping works
[23:35:09] * aude would group wikidata users separately from wikipedia users, even if they are the same user name
[23:35:43] if i get time, i can post some code, probably after monday
[23:37:04] amazing, my month+ old code rebased cleanly
[23:37:21] :D
[23:43:43] just 51 modules loaded
[23:45:32] 29 are loaded when no extension is active
[23:45:58] can i strike them off the list?
[23:46:15] lbenedix: list?
[23:46:43] list of possible missing dependencies
[23:49:24] Well, why don't you just look at the JS errors you get?
[23:49:36] lbenedix: ^
[23:51:09] is the order in the dependency array important?
[23:51:28] No
[23:52:04] i get an error for mw.util.wikiGetlink but mediawiki.util is in the dependency list
[23:53:37] and loaded
[23:53:57] for(var i=0;i<...) ... mediawiki.util - ready
[23:54:45] lbenedix: Do you have a link to the current code? Is it in gerrit?
[23:54:59] Hazard-SJ: I'm here now..
[23:55:26] All bot operators, please add your bot and your bot framework to http://www.wikidata.org/wiki/Wikidata:Bots_by_function
[23:55:27] nope
[23:55:38] i want to push it to gerrit once i've found this bug
[23:55:40] http://lbenedix.monoceres.uberspace.de/mediawiki/index.php?title=Main_Page&debug=true
[23:55:42] It would make it simpler for other bot operators to join in
[23:56:19] Jeblad_WMDE: I believe the Wikibase extension returns an error if one tries direct editing?
[23:56:28] * Jeblad_WMDE the evil plan is to see which bots are crap....
[23:56:31] #Muahahaha
[23:57:01] Hazard-SJ: I think I need some more context..
[23:57:22] lbenedix: I get "TypeError: mw.Api is not a constructor"
[23:57:32] and mw.Api in fact is undefined at that time
[23:57:35] I am extremely rotten at mind reading..
[23:57:40] do you have it in the dependencies?
[23:58:51] yes
[23:59:15] 'dependencies' => array( 'jquery.cookie', 'jquery.ui.draggable', 'mediawiki.util', 'mediawiki.api' )