[00:01:12] did you set position=top? [00:02:11] Jeblad_WMDE: Is https://www.wikidata.org/?diff=12857017 ok? The source code is outdated at the moment, and not all of the code is published as yet. I'm also working on some new things. [00:02:25] Jeblad_WMDE: Also, {u'servedby': u'mw1190', u'error': {u'info': u'Unknown error: "wikibase-no-direct-editing"', u'code': u'unknownerror'}} [00:02:30] nope [00:02:45] (when I try to undo edits via API) [00:02:55] that would not explain the js-errors [00:03:54] Seems fine to me [00:04:16] I guess a lot of operators are going to post links to old code [00:04:47] lbenedix: I guess resource loader needs to set async to the script modules [00:04:53] * elements [00:05:10] *async=false of course [00:05:14] ? [00:05:21] what do you mean? [00:05:28] Jeblad_WMDE: And about the error? [00:05:49] lbenedix: https://developer.mozilla.org/en-US/docs/HTML/Element/script ;) [00:06:14] i found position top [00:06:31] i thought you were talking about css [00:06:50] JS can also have position=top [00:07:10] i found position=top in $wgResourceModules [00:08:16] https://gerrit.wikimedia.org/r/#/c/50004/9/UiFeedback.php [00:08:52] You should only set position=top if it's unavoidable as that slows down the page load [00:09:02] remove it and it'll probably work like a charm [00:09:13] Hazard-SJ: You can't change the content of an item by using ordinary editing operations [00:09:36] \o/ [00:09:41] thanks a lot! [00:09:53] lbenedix: You found a resource loader bug, though :P [00:09:58] yay [00:10:15] It is blocked in several places, but most notably in a callback [00:11:43] Jeblad_WMDE: What's the easiest way to revert an item to an earlier version?
[00:12:20] action=reset [00:12:27] :| [00:12:37] * Hazard-SJ tries to find more details [00:12:38] * lbenedix has to take back the last \o/ [00:13:00] Could be that the API blocks that because it sees it as an ordinary edit [00:13:02] now it's working neither with debug=true nor without [00:13:25] * hoo checks [00:14:17] but the css looks better with debug=true [00:14:22] I think we only check that we can reset through the page itself, not the API call [00:14:44] Jeblad_WMDE: I can't find that on api.php nor on ME.org [00:14:49] * MW.org [00:16:13] * Hazard-SJ wonders what parameters to use [00:16:17] brb [00:18:01] hm.. could be that it is not exposed in the API [00:20:39] Hazard-SJ: reset is part of the action=edit which is blocked [00:21:06] I think it is necessary to file a bug for it and come up with a solution [00:22:42] hoo: i have the actual code here: https://bitbucket.org/lb42/uifeedback [00:27:30] adding ALL dependencies is working ;) [00:27:55] but only with position = top [00:27:58] That's what you're supposed to do :P [00:28:41] what's happening w/o position=top? [00:29:10] with ALL dependencies I mean literally all [00:29:28] Mh? [00:29:36] one of the libraries I use explodes without position top [00:30:02] it's badly written, then [00:30:10] turn it into its own module [00:30:43] i put that on the todolist [00:32:01] anyway...
good night ;) [00:32:09] I don't think that this is the required minimal dependency list [00:32:10] 'site','user.groups','user.options','user.tokens','skins.vector.js','jquery','jquery.autoEllipsis','jquery.heckboxShiftClick','jquery.client','jquery.cookie','jquery.hidpi','jquery.highlightText','jquery.json','jquery.makeCollapsible','jquery.mw-jump','jquery.mwExtension','jquery.placeholder','jquery.jStorage','jquery.suggestions','jquery.tipsy','jquery.ui.core','jquery.ui.widget','jquery.ui.mouse','jquery.idraggable','mediawiki' [00:32:22] oO [00:32:34] i told you that i added all modules [00:32:36] ;) [00:32:44] I thought that was all you need [00:33:04] all I need and more [00:33:19] "Total download size: 210 M" srsly Fedora -.- [00:48:09] can I block open proxies to prevent them from using this [00:48:10] ? [01:20:46] * Hazard-SJ is back [04:07:14] shouldn't there be an automatic link on Wikipedia to create a new item for interwikis on Wikidata? [04:14:05] good idea [04:15:19] at the bottom of existing interwikis [04:15:39] I have one on French Wikipedia [04:16:01] I guess they modified the whole gadget to add interwikis to work for Wikidata, it should be implemented everywhere [04:16:09] the old** (not the whole) [04:16:12] lol [04:16:26] my fingers type like I speak, very bad accent :P [04:22:29] I think it's semi-automatic if you enable something on Wikidata [04:22:37] to import interwikis [04:22:49] but you manually create the item and remove the interwikis from the page [04:23:02] ihaveamac: SlurpInterwiki?
[04:24:40] Hazard-SJ: yes [04:26:03] ihaveamac I mean when I create a new page on a given Wiki [04:26:30] I could click on this link on the same page on an existing Wiki to add my new one quicklu [04:26:33] quickly** [04:26:49] very useful for translators [04:27:52] the bad thing with the gadger on French Wiki is that the link doesn't exist if there isn't at least one interwiki already [04:27:56] gadget* [04:28:09] the link doesn't show** (in the left toolbar) [04:33:27] Amqui: I see [07:34:36] Amqui: we have something that fixes this [07:34:40] will be rolled out soon [08:08:19] yay "prev" and "next" work [08:09:04] actually [08:09:09] I was in the wrong namespace [10:41:53] I still have the css bug... it's fixed in every browser but chrome with debug!=true [11:47:31] DanielK_WMDE: Working 24x7 lalala ;-) [12:23:18] multichill: hi [12:23:27] hi [12:23:40] I sent you an email [12:23:46] Is it ok? the code [12:23:53] do you have any suggestions? [12:24:12] I want to go further! but i don't know where to go :D [12:25:05] Ok, I noticed your email, but didn't look into it yet. Sorry [12:25:21] I know you're busy [12:25:25] i see [12:25:35] whenever you can :) [12:27:04] multichill: but i think you must see this: http://www.mediawiki.org/wiki/Manual:Pywikipediabot/Wikidata#Changing_or_creating_claims.2Fstatements [12:51:16] !admins [12:51:25] the diff is still not working [12:51:37] on user contributions [12:51:38] http://www.wikidata.org/w/index.php?title=Q676940&diff=prev&oldid=12980289 [12:51:44] !admin [12:51:55] http://www.wikidata.org/wiki/Special:Contributions/Dexbot [13:18:27] Hello. [13:18:29] I have a friend who cannot edit Wikidata. He uses IE8. Is editing Wikidata known to be broken on it? [13:20:25] aharoni: there is a bug for ie8 in bugzilla [13:21:06] aharoni: https://bugzilla.wikimedia.org/show_bug.cgi?id=44228 [13:21:30] aharoni: http://lb.bombenlabor.de/tmp/ie8_langlinks.png [13:21:37] is that the behaviour?
[13:22:22] or is it looking like this: http://lb.bombenlabor.de/tmp/ie8.PNG [13:22:40] [[object Object]] [13:22:40] [4] https://www.wikidata.org/wiki/object_Object [13:22:46] lbenedix: yes! the second one [13:23:18] Thanks. Just wanted to know that it's known. [13:23:19] you can add a comment to the bug, that it's broken for you too [13:38:16] Amir1: Almost done compiling all the shit. I really had to do some upgrades on my little server [13:38:39] multichill: OK [13:38:41] I'm here [14:01:30] multichill: I must go [14:01:37] but i'll talk to you later [14:01:48] bye for now [14:38:20] hello [14:38:46] is there not a property like nationality for persons? [14:40:25] country of citizenship ? [14:40:34] https://www.wikidata.org/wiki/Property:P27 [14:41:46] Wardje: thx [15:32:31] is the german title of [[Q2098720]] correct? I think titles should not have parentheses, the content of this part should be in the description, correct? [15:32:32] [5] https://www.wikidata.org/wiki/Q2098720 [17:31:12] shouldn't we change the "adding language as description" filter to check new accounts?
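For anyone scripting the property lookup mentioned above (P27, "country of citizenship"), the standard wbgetentities API module can fetch a property's labels and descriptions. A minimal sketch that only assembles the request URL; actually sending it and parsing the JSON reply are left to the caller:

```python
from urllib.parse import urlencode

# Build a wbgetentities request for the "country of citizenship" property (P27).
# No network call is made here; only the query string is constructed.
params = {
    "action": "wbgetentities",
    "ids": "P27",
    "props": "labels|descriptions",
    "languages": "en",
    "format": "json",
}
url = "https://www.wikidata.org/w/api.php?" + urlencode(params)
print(url)
```

The pipe in `props` is percent-encoded by `urlencode`, which the API accepts the same as a literal `|`.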
[17:48:03] New patchset: Aude; "Second iteration of property parser function [WIP]" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54220 [17:49:28] New patchset: Aude; "Second iteration of property parser function [WIP]" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54220 [18:00:03] New patchset: Aude; "Second iteration of property parser function [WIP]" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54220 [18:12:18] New patchset: Aude; "Second iteration of property parser function [WIP]" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54220 [18:15:38] New patchset: Aude; "Second iteration of property parser function [WIP]" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54220 [18:15:44] New review: Aude; "patchset 5 rebased" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54220 [18:56:32] multichill: stewards have given the flags [18:56:34] *flag [19:21:49] New patchset: Aude; "Second iteration of property parser function [WIP]" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54220 [19:23:01] Hello, and good evening (for the French speakers) [19:24:24] I just noticed that the full deployment of wikidata (and removal of wikilinks on almost all pedias) has made a very useful Commons tool completely unable to work... Sum-it-up [19:52:42] 7,500,000 now [19:52:44] only a couple of weeks until the big 10 i reckon [19:52:48] https://www.wikidata.org/wiki/Q7500000 [19:53:38] I'd say by late April [19:57:52] Will qualifiers be able to be used for negation? For example, could a "not" qualifier be used for a hypothetical "caused by" property to support statements like "Autism not caused by vaccines"? [19:59:50] Or how else would such direct assertions of negative facts be supported? If not by qualifiers, then the next best option seems to be via ad-hoc "not" properties, e.g.
a property "not caused by". This approach has obvious drawbacks, like needing a new property for every property one wants to be able to negate. [20:02:08] hsarrazin: I expect that wikidata will first hurt a lot for Commons before it becomes helpful [20:02:36] yes, I fear so :( [20:04:16] multichill what would be better, do you think ? wait for wikidata to take commons into account, or try and fix (temporarily) new tools ? [20:04:40] ... or old tools (I think sum-it-up is quite a "classic" ;) [20:04:59] That's the question I have been asking myself too for quite some time [20:05:16] I think in the long run we have Wikibase in some form on Commons, but what to do in the meantime? [20:06:00] I operate https://commons.wikimedia.org/wiki/User:CategorizationBot . Not sure if the logic it's based on will continue to function [20:06:07] multichill: you're back now! [20:06:09] hi [20:06:54] Yeah, finally updated about 80 packages on my netbsd system. Compiled more in a weekend than in the last year I guess ;-) [20:07:38] :) [20:08:05] I upgraded my linux a while ago, that was like a nightmare [20:08:06] * multichill has been running BSD for over 10 years now [20:08:15] WOW [20:09:23] multichill: have you seen the manual? http://www.mediawiki.org/wiki/Manual:Pywikipediabot/Wikidata [20:09:43] I like the minimal philosophy. Everything is off and you have to turn things on and compiling from source makes it very fast on crappy old hardware [20:09:47] I tested it: [20:09:47] http://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/Dexbot_2 [20:09:53] http://www.mediawiki.org/wiki/Manual:Pywikipediabot/Wikidata#Changing_or_creating_claims.2Fstatements [20:10:27] I haven't developed the code for adding ref. but i think i'll do that tonight [20:12:19] I really want to add artists for singles, albums, etc. [20:12:37] but i don't have approval yet [20:12:39] :( [20:13:48] Amir1: How do you know the language of a claim?
[20:14:21] I think it's based on English [20:15:19] I used a function named search entity [20:15:23] or somthing like that [20:15:26] something [20:16:02] the function uses only English for search but it's very simple to change that [20:16:14] do you think i should change and develop it? [20:17:50] multichill: [20:22:16] You should probably have some logic based on what wiki the user is working from. Haven't looked that closely into the code I'm afraid [20:22:58] ok [20:23:01] I'll that [20:23:06] i'll do that [20:23:32] (typing with one hand, using another for dinner) [20:23:41] :P [21:23:20] has anyone worked with wbsetreference? [21:23:49] can somebody give me an example of a "snak" [21:26:14] Jeblad_WMDE: can you help me? [21:27:50] It's in the documentation, Amir1 [21:31:18] Some code using that is going to be published in a week or so [21:31:30] sorry i was disconnected [21:31:48] It's not in the API doc on Mediawiki.org [21:32:10] Amir1: you didn't lose anything [21:32:22] ok [21:32:41] i was working on http://wikidata.org/w/api.php [21:32:45] Anyhow, the only doc is in api.php for setreference, and in the code [21:32:54] on the page it is written: [21:33:03] api.php?statement=q586$57CE3C9F-37AF-42B5-B067-DADA198DD579&snaks={"p1":[{snak}, {snak}], "p2": [{snak}]}&token=foo&baserevid=42 [21:33:06] oh [21:33:11] let met find my code [21:33:13] me* [21:33:19] thanks [21:35:00] s_params = {'statement':result['claim']['id'], [21:35:00] 'token':self.token, [21:35:01] 'bot':1, [21:35:01] 'snaks':'{"p143":[{"snaktype":"value","property":"p143","datavalue":{"type":"wikibase-entityid","value":{"entity-type":"item","numeric-id":'+str(value)+'}}}]}' [21:35:01] } [21:35:11] The statement is the ID of the claim you are changing [21:35:19] and that adds property 143 [21:35:48] The reference is the ID of the reference you are changing. That one is a hash.
[21:36:11] snaks are the structures you set and are basically the same as for the claims [21:36:38] not, the snaks is not one single type of structure [21:36:42] hmm [21:36:45] i get that [21:36:46] note,... [21:36:47] thank [21:36:55] thanks [21:37:26] remember the token, remember the baserevid,..and the bot if appropriate [21:38:08] Jeblad_WMDE: I know [21:38:39] and as usual, if you give an id something is changed, if you do not give an id something is created [21:39:05] and remember to pay me for my very valuable time... xD [21:39:28] * Jeblad_WMDE will steal chocolate tomorrow and put it on Amir1's account [21:39:46] :))) [21:40:27] I guess I should write a better API doc.. [21:40:31] ;/ [21:42:54] Jeblad_WMDE: sometimes one just needs a good example [21:48:03] http://www.wikidata.org/w/index.php?title=Q7515887&action=history delete plz [22:00:15] base-w: done [22:02:11] legoktm: i want to talk to you but i'm busy right now [22:02:20] don't go! [22:02:48] ack, i'm probably going to go in 10-15min, but ill be back later and my talk page is always open :) [22:04:40] ok [22:17:56] how can I set it so that I see three extra languages instead of only two? [22:21:33] #babel [22:21:42] it lets you add as many as you want [22:22:01] I mean with editing a page [22:22:12] I want to give titles in nl, en, fr and de [22:22:26] labelLister? [22:22:46] oh [22:22:48] just [22:22:51] Romaine: add babel to your user page [22:22:52] https://www.wikidata.org/wiki/User:Romaine [22:22:57] add |fr-0 [22:23:05] or whatever it really is [22:23:05] like my user page [22:23:06] [[User:Ladsgroup]] [22:23:06] [6] https://www.wikidata.org/wiki/User:Ladsgroup [22:23:33] I have no need for babel in my userpage [22:23:44] I just want to add titles in four languages to items [22:23:45] but that's how it works.. [22:23:52] really?
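Putting Jeblad_WMDE's explanation above together: a hedged sketch of assembling the wbsetreference parameters. The claim GUID and token are placeholders copied from the api.php example in the log, not live values, and nothing here is sent over the wire; it only builds and serializes the payload:

```python
import json

# snaks is a JSON map of property id -> list of snaks, the same shape as for claims.
snaks = {
    "p143": [{
        "snaktype": "value",
        "property": "p143",
        "datavalue": {
            "type": "wikibase-entityid",
            "value": {"entity-type": "item", "numeric-id": 328},
        },
    }]
}
params = {
    "action": "wbsetreference",
    "statement": "q586$57CE3C9F-37AF-42B5-B067-DADA198DD579",  # GUID of the claim being changed
    "snaks": json.dumps(snaks),  # serialize with the json module, not repr()
    "token": "foo",              # placeholder; use a real edit token
    "baserevid": 42,             # placeholder revision id
    "bot": 1,
}
print(params["snaks"])
```

As noted above, giving an id changes an existing reference while omitting it creates a new one; this sketch would create one.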
[22:23:59] yeah :P [22:24:20] strange [22:24:28] really really strange :p [22:24:50] lol [22:25:11] I thought such a thing should be settable in the preferences [22:26:29] btw Amir1, why do you keep testing on Russia? [22:26:45] use the sandbox [22:26:46] https://www.wikidata.org/wiki/Wikidata:BOX [22:26:57] i use http://www.wikidata.org/wiki/Q1918044 [22:27:00] now [22:27:06] I left Russia [22:27:09] errr [22:27:09] :P [22:27:11] use the sandbox [22:27:15] it's meant for testing :) [22:27:32] I must use an article from English Wikipedia [22:28:08] it links to https://en.wikipedia.org/wiki/Wikipedia:Wikidata/Wikidata_Sandbox [22:28:19] or just add a link to somewhere in your userspace [22:28:36] no no [22:28:45] I mean the item must have an English WP link [22:29:05] the sandbox does? [22:29:22] no [22:29:27] I don't think so [22:29:48] yes it does [22:29:56] [05:28:07 PM] it links to https://en.wikipedia.org/wiki/Wikipedia:Wikidata/Wikidata_Sandbox [22:30:22] sorry i didn't know that [22:30:31] no worries [22:30:37] it should be advertised a bit more [22:31:46] shit [22:32:01] someone complained about you on WD:AN :P [22:32:21] the bot thinks the page is "Wikidata" lang in "Wikipedia" family [22:32:24] sorry [22:32:57] o.O [22:33:03] that's a bug then :P [22:33:18] shit [22:34:07] hello! the autocomplete is behaving a bit differently to how it used to. For example, "Neuilly-Saint-Front" would previously suggest "Canton of Neuilly-Saint-Front" as well as "Neuilly-Saint-Front" (a commune). Since you don't know what the item label is actually like, it's helpful to match at all positions in the label. Is this a known thing, or was it done on purpose? [22:34:17] inductiveload: is it possible that was an alias?
[22:34:26] =)) [22:34:37] users think I'm in an edit war with my bot [22:34:56] but that seems like a good idea [22:35:37] "Canton of Neuilly-Saint-Front" doesn't have any aliases [22:37:01] I'm fairly sure it used to match anywhere in the label, I think I found a lot of things that way [22:37:02] legoktm: let me work on Newton Township, Muskingum County [22:37:10] umm [22:37:15] that's better [22:37:17] can you create your own sandbox? [22:37:19] ;) [22:37:24] maybe link it to your userpage or something? [22:37:44] orrr [22:37:51] just use the test-repo [22:37:59] also should it be "Canton of Neuilly-Saint-Front" or just "Neuilly-Saint-Front" with a description "canton in XXX department, France"? [22:38:03] test repo is like hell [22:38:25] I worked on it before the main wiki went live [22:38:25] o.O [22:38:28] heh [22:39:11] inductiveload: are you talking about https://www.wikidata.org/wiki/Q988669 ? [22:40:05] that's the commune, there's a canton at [[Q565294]] [22:40:05] [7] https://www.wikidata.org/wiki/Q565294 [22:40:39] the commune is in the canton [22:41:49] when you go to enter "admin unit" (P131), and start writing "Neuilly-Sain...", you don't get a suggestion for the canton, only the commune [22:42:06] even though the canton label contains "Neuilly-Saint-Front", just not at the start [22:42:11] oh i see [22:42:14] tbh im not sure how Cities/etc have been standardized [22:42:30] but i do think autocomplete should match anywhere in the word [22:42:47] i'm sure it would have previously done that [22:43:09] legoktm: it returns save-failed.
where is the problem? api.php?snaks=%7B%22p143%22%3A%5B%7B%27datavalue%27%3A%20%7B%27type%27%3A%20%27wikibase-entityid%27%2C%20%27value%27%3A%20%7B%27entity-type%27%3A%20%27item%27%2C%20%27numeric-id%27%3A%20328%7D%7D%2C%20%27property%27%3A%20%27p143%27%2C%20%27snaktype%27%3A%20%27value%27%7D%5D%7D&format=json&bot=1&token=4ec4ac69807459fecd89ef63d44c47ce%2B%5C&statement=q41 [22:43:10] 15189%2403974134-CB48-4DC8-AA91-33EB1F6B1983&action=wbsetreference [22:43:28] errr [22:43:37] what is the error? [22:44:32] Failed to save the change [22:44:43] code-error:save failed [22:45:13] umm [22:45:21] huh [22:45:45] how can I encode this piece of sh** to understand it a little more [22:46:00] not sure, i gtg right now [22:46:03] maybe someone else can help [22:46:04] sorry [22:46:42] Amir1: what's the aim? [22:47:02] inductiveload: setting a reference via the API [22:47:17] ok, i've done that [22:47:40] https://www.wikidata.org/wiki/User:Inductiveload/scripts/draggableSitelinks.js [22:48:01] addImportedFrom is the important function here [22:48:21] what language are you working in? [22:50:01] python [22:50:54] yay :-D [22:51:51] it's very similar [22:52:24] i haven't had a chance to fiddle with a python bot yet, but the idea in the code above should be fairly portable [22:52:29] ^^ that :-) [23:07:14] New patchset: Daniel Werner; "Performance improvement for JS Statement.equals" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54312 [23:16:39] New patchset: Daniel Werner; "(hot fix) fix of two obvious mistakes in jQuery.wikibase.snaklistview" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54313 [23:21:13] inductiveload: legoktm : is order important in the API? [23:22:09] Amir1: if it's a hash (i.e. {key: val}), there is no defined ordering [23:22:44] so why does the code give me an error? [23:22:57] can i see the code? [23:22:58] let me show you what is sent [23:23:19] are you posting or getting?
[23:23:53] api.php?snaks=%7Bu%22%22p143%22%3A%5B%7B%22datavalue%22%3A%20%7B%22type%22%3A%20%22wikibase-entityid%22%2C%20%22value%22%3A%20%7B%22entity-type%22%3A%20%22item%22%2C%20%22numeric-id%22%3A%20328%7D%7D%2C%20%22property%22%3A%20%22p143%22%2C%20%22snaktype%22%3A%20%22value%22%7D%5D%22%7D&format=json&bot=1&token=4ec4ac69807459fecd89ef63d44c47ce%2B%5C&statement=q1918044%2472C29658-6491-43B1-9554-903F0AE [23:23:55] 9FD96&action=wbsetreference [23:24:01] posting [23:24:19] looks like a GET to me... [23:24:41] no it's a POST [23:25:11] for a POST, shouldn't the URL be ...api.php, and the data is separate [23:25:24] can i see the code? [23:25:46] for i in range(0,len(snak)/2): [23:25:48] if not i==0: [23:25:49] finalsnak=finalsnak+u"," [23:25:51] snaki = [{"snaktype":"value", [23:25:52] "property":"p"+str(snak[i*2]), [23:25:54] "datavalue":{"type":"wikibase-entityid","value":{"entity-type":"item","numeric-id":snak[(i*2)+1]}}}] [23:25:55] finalsnak=finalsnak+ "'p"+str(snak[i*2])+"':"+str(snaki) [23:25:57] finalsnak=re.sub(ur"\bu\'", u'"',repr(finalsnak).decode("unicode-escape")).replace("'", '"') [23:25:57] pastebin please... [23:25:58] params = { [23:26:00] 'action': 'wbsetreference', [23:26:01] 'statement' : guid, [23:26:03] 'snaks' :u"{%s}" % finalsnak, [23:26:04] 'bot' : '1' [23:26:06] } [23:26:07] print snaks [23:26:09] if token: [23:26:10] params['token'] = token [23:26:12] else: [23:26:13] params['token'] = self.site().getToken(sysop = sysop) [23:26:15] output(u"Adding references to %s" % self.title()) [23:26:16] data = query.GetData(params, self.site(), sysop=sysop) [23:26:47] yeah…. [23:28:11] i think a kick's the only way to stop a paste if there's congestion control on [23:32:00] "the bitch is back" :D [23:32:06] json doesn't have an order [23:32:20] I love Elton John [23:32:43] just a moment [23:32:57] inductiveload: legoktm: http://pastebin.ru/BsDsCkl0 [23:33:17] line 119 [23:35:01] Amir1: that looks a bit odd, are you sure you need to do that?
[23:36:31] inductiveload: no I'm not sure but it doesn't work either way [23:36:59] I think it needs a rewrite on that part [23:37:46] legoktm: what do you think [23:37:46] hello, could you read (and answer) this ? https://www.wikidata.org/wiki/Wikidata:Project_chat#Ability_to_include_links_to_redirects_or_anchors_.28aka_.22list_management.22.29 (sorry for my bad english) [23:37:51] can you pastebin an example of the finalsnak before that transform? [23:38:22] sorry bit busy atm [23:38:30] legoktm: ok [23:39:01] inductiveload: i'm on Windows right now and cmd doesn't let me copy output [23:39:13] :( [23:39:15] yes it does [23:39:21] look in the menus at the top [23:39:35] (i'm on linux, so I can't actually try it) [23:40:18] the interesting thing is there is no menu, linux is best for programming [23:40:22] longer term solution: ditch windoze [23:40:25] hmm [23:41:04] i think i can do something [23:41:37] I can make the bot write it to a file [23:41:48] what is the exact thing you want [23:42:07] https://www.google.co.uk/search?hl=en&source=hp&q=windows%20copy%20from%20cmd&aq=f&aqi=&aql=&oq=&gs_rfai= [23:42:40] ;-) [23:42:42] hmm [23:43:55] i want to see what "finalsnak" is before you do line 119 to it [23:46:50] Changing Q1918044 [23:46:52] 'p143':[{'datavalue': {'type': 'wikibase-entityid', 'value': {'entity-type': 'it [23:46:53] em', 'numeric-id': 328}}, 'property': 'p143', 'snaktype': 'value'}] [23:52:25] yeah, that's the wrong way to make JSON [23:53:18] inductiveload: where is the problem? [23:53:34] I think it worked [23:53:37] let me check [23:53:40] http://pastebin.ru/IZr5Pmvl [23:54:01] hooray [23:54:03] there's a json package, it's there for a reason :-) [23:54:05] it worked [23:54:35] trying to encode with repr will produce a lot of pain one day for some really weird corner case [23:54:45] i'm not familiar with json :( where is the documentation?
[23:54:55] Amir1: just make a dictionary/list [23:54:58] and then use [23:55:04] json.dumps(dict) [23:55:46] json is what you are passing to the API - it's a serialised form of a JS object, and Python has a library to make a Python dict or list into JSON as well [23:56:00] http://www.wikidata.org/w/index.php?title=Q1918044&action=history [23:57:13] Understood [23:57:28] but the problem is solved [23:57:34] ok... [23:57:36] i replaced line 119 with [23:57:53] finalsnak=finalsnak.replace("'", '"') [23:57:58] one day, that will break horribly, and you will have no idea why [23:58:36] i want to use it on this [23:58:48] but it's a dict in a list [23:58:51] is it ok? [23:59:01] can json handle this? [23:59:31] since you just faked some up and fed it to the API and it didn't choke, I'd say yes? [23:59:33] yes [23:59:58] finalsnak = json.dumps(finalsnak)
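The failure mode inductiveload warned about above is easy to demonstrate: repr() emits Python literal syntax (single quotes), which a JSON parser rejects, while json.dumps produces valid JSON that round-trips cleanly. A small standalone illustration, not Wikidata-specific:

```python
import json

snak = {"snaktype": "value", "property": "p143"}

# repr() produces Python literal syntax with single quotes, which is not JSON:
try:
    json.loads(repr(snak))
    parsed_ok = True
except json.JSONDecodeError:
    parsed_ok = False
print(parsed_ok)  # → False: the repr output is rejected by a JSON parser

# json.dumps() emits real JSON and round-trips without any string surgery:
assert json.loads(json.dumps(snak)) == snak
```

The `.replace("'", '"')` trick happens to work here, but it corrupts any value that itself contains a quote character, which is exactly the "weird corner case" mentioned in the log.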