[01:53:54] ah, just missed robla
[01:56:12] quick update: earlier today we merged the new apache config and pushed it to the cluster. so the en. and de. subdomains redirect to meta now, as opposed to the error message you got before. www. is the same. the DNS change will be looked at by Mark soon, i believe Reedy can install the wiki though without that being a blocker. out..
[01:56:41] yeahh
[01:56:55] ah, cool :)
[01:56:55] Though, I presume we can't actually test it? ;)
[01:56:55] hi Reedy
[01:57:09] hii
[01:57:15] Mark said "tomorrow" earlier today, so Europe tomorrow
[01:57:37] alright
[01:57:38] i mean, earlier than US tomorrow
[01:57:43] setting up a wiki is fairly quick
[01:58:05] great
[01:58:36] we have an option to do it earlier "if we absolutely need to"
[01:58:36] Then we have the wikidata fun and games ;)
[01:59:02] but then we are supposed to find somebody who understands DNS and our setup very well
[01:59:07] incl. geo stuff
[02:00:27] i mean, i have diff files sitting on sockpuppet
[02:01:58] but self-review for a major DNS change, the rare case of completely new projects, and a warning from both Ryan and Mark... so
[02:02:57] :)
[02:04:10] when it works we can copy it for wikivoyage, which needs the same stuff
[02:04:33] alright, ttyl. tomorrow i will be in the data center though
[02:05:15] pictures!
[02:05:58] yes! so true! thanks for the reminder
[02:06:07] i actually have a camera that would be better than the phone
[02:06:12] i guess... heh
[02:06:18] :D
[02:06:32] but.. you did not miss anything last time
[02:06:45] because we will have to take all the stuff out and back in again
[02:06:51] the cabinets were too small and are now fixed
[02:07:05] so we can take pictures like it is the first time, for real :)
[02:09:37] wheeeee
[07:28:19] ta-dah!
[07:56:22] Change merged: Henning Snater; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29789
[08:05:03] DanielK_WMDE: you may want to check https://bugzilla.wikimedia.org/show_bug.cgi?id=41298 and let sumanah know if we are satisfied that it's resolved
[08:05:09] or if we should check more
[08:20:02] New patchset: Henning Snater; "Improving centralised option handling via wikibase.ui.Base" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29933
[08:39:14] New patchset: Henning Snater; "Improving centralised option handling via wikibase.ui.Base" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29933
[09:07:53] aude: well, it seems fine to me, but the original reporter should state if he's satisfied.
[10:08:36] New patchset: Tobias Gritschacher; "fixed use of old method in selenium teardown" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29942
[10:09:47] New patchset: Henning Snater; "Refactoring of EditableAliases QUnit test" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29943
[10:20:19] New patchset: John Erling Blad; "(Bug 41383) Update wbgetentities to report statements for items" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29945
[10:20:27] New patchset: Henning Snater; "Refactoring of EditableSiteLink QUnit test" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29946
[10:24:07] New patchset: Henning Snater; "Refactoring of EditableAliases QUnit test" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29943
[10:28:22] New patchset: Henning Snater; "Refactoring of EditableSiteLink QUnit test" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29946
[10:30:10] New patchset: Henning Snater; "Refactoring of EditableAliases QUnit test" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29943
[10:43:14] New patchset: Henning Snater; "Refactoring of AutocompleteInterface QUnit tests" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29949
[10:45:50] New patchset: Henning Snater; "Refactoring of EditableAliases QUnit test" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29943
[10:48:08] New patchset: Henning Snater; "Refactoring of EditableSiteLink QUnit test" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29946
[10:59:39] New patchset: John Erling Blad; "Fix names for labels and descriptions in wbsearchentities" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29950
[11:03:17] New patchset: Tobias Gritschacher; "always use the prefixed entityID in the UI" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29953
[11:39:02] New patchset: Jens Ohlig; "(Bug #40391) Add continuation to wbsearchentities and return only matching aliases" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29956
[11:42:05] New patchset: Jens Ohlig; "(Bug #40391) Add continuation to wbsearchentities and return only matching aliases" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29956
[11:45:10] New patchset: Jens Ohlig; "(Bug #40391) Add continuation to wbsearchentities and return only matching aliases" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29956
[12:02:46] New review: Daniel Werner; "There should be a couple of other places where the ID is used somehow. Especially css classes. They ..." [mediawiki/extensions/Wikibase] (master); V: 0 C: 0; - https://gerrit.wikimedia.org/r/29953
[12:34:26] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29942
[12:36:32] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29949
[12:37:59] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29943
[12:39:10] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29946
[12:41:06] Change merged: Henning Snater; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29950
[12:48:29] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29933
[12:57:23] New patchset: John Erling Blad; "(Bug 41390) Temporary fix for item uniqueness" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29959
[13:15:25] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29959
[13:16:12] New review: Tobias Gritschacher; "Danwe, that's already done. Prefixed ID is now used everywhere." [mediawiki/extensions/Wikibase] (master); V: 0 C: 0; - https://gerrit.wikimedia.org/r/29953
[13:19:31] New patchset: Jens Ohlig; "(Bug #40391) Add continuation to wbsearchentities and return only matching aliases" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29956
[13:27:16] DanielK_WMDE: is the test repository now powerful enough to import every langlink group? Should i remove my huwiki-only condition?
[13:27:57] Merlissimo: no, please wait until we have the beta wiki on the production cluster. that should be Really Soon Now.
[13:28:33] i see Sk1d importing complete wikis
[13:29:36] New patchset: Jens Ohlig; "(Bug #40391) Add continuation to wbsearchentities and return only matching aliases" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29956
[13:43:43] New patchset: Jens Ohlig; "(Bug #40391) Add continuation to wbsearchentities and return only matching aliases" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29956
[13:46:05] New patchset: Jens Ohlig; "(Bug #40391) Add continuation to wbsearchentities and return only matching aliases" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29956
[14:02:33] New patchset: Tobias Gritschacher; "removed duplicated testfile" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29964
[14:03:07] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29964
[14:07:37] Change merged: Henning Snater; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29953
[14:13:10] DanielK_WMDE: should i stop my bot? i asked and Lydia_WMDE said it is ok to continue importing links: http://lists.wikimedia.org/pipermail/wikidata-l/2012-October/001088.html
[14:13:52] Sk1d, Merlissimo: it's not critical, i just expect things to become pretty slow if we import too much stuff into the VM
[14:14:06] database performance is bound by available RAM for caching, and that's rather low on labs
[14:14:28] so i'd prefer to wait with the full import until we have an instance running on the production cluster. which should be soon now.
[14:14:54] Sk1d: Merlissimo has been limiting the import to pages that have a link to huwiki, because that's where we will first deploy the client component of wikidata.
[14:15:19] Lydia_WMDE, Abraham_WMDE1: or have you guys decided to handle this differently?
[14:16:21] DanielK_WMDE: as long as it's not a problem i think it's fine to let the bots loose - and if the VM gets too slow we can nuke items
[14:16:38] or is it already causing issues, you think?
[14:16:50] if so maybe it is a good idea to nuke items
[14:17:20] i'd personally feel better if we can get as much testing done on the demo system as possible before it goes to the production system
[14:17:56] Lydia_WMDE: i'm not aware of any problems currently. but i haven't tested
[14:18:00] k
[14:18:01] DanielK_WMDE: no that's ok
[14:18:24] Lydia_WMDE: i agree that more testing is better. but "more items" may mean "fewer edits" because the database gets slower. so... what's "more testing"?
[14:18:39] hmm?
[14:18:42] sorry - not following :D
[14:19:08] Lydia_WMDE: full import -> more items -> slower database -> lower edit rate -> fewer edits total.
[14:19:13] at least in theory
[14:19:13] ah
[14:19:15] yeah
[14:19:23] jeblad_WMDE: is "page" or "title" preferred for sitelinks?
[14:19:25] well, as i said, if it gets slower we should definitely do something
[14:19:26] no idea if it's *actually* happening like this, depends on several factors
[14:19:55] Abraham_WMDE1: what's ok? sticking to the "hu only" policy, or importing everything?
[14:20:23] Lydia_WMDE: well, as long as someone has an eye on it and actually does something when the test site breaks - fine with me :)
[14:20:34] i just didn't see a good reason to tell Merlissimo to open the flood gates just yet.
[14:20:36] deal
[14:20:52] if i hear any complaints I'll let you know
[14:21:16] Merlissimo: i think "title" works and "page" doesn't :)
[14:21:45] changed my bot; it should now only post links containing huwiki
[14:21:59] Merlissimo: the target page is given by specifying the target wiki's id and the page's title.
[14:21:59] DanielK_WMDE: then https://www.mediawiki.org/wiki/Extension:Wikibase/API#wbsetitem "Add sitelinks" must be updated. page is used in the new array format
[14:22:17] Sk1d: cool, thanks! though i was just told that it wouldn't be necessary :)
[14:22:56] Merlissimo: ah, i was thinking of parameters. it's possible that we use "page" as a key in the JSON.
[14:23:10] that's actually an annoying inconsistency, perhaps file a bug about it.
[14:23:30] * DanielK_WMDE doesn't care much which we use, but it should be consistent
[14:24:07] Merlissimo: compare https://www.mediawiki.org/wiki/Extension:Wikibase/API#wbsetsitelink
[14:24:14] ugh :/
[14:25:05] DanielK_WMDE: that's what i have just read. and some examples contain "page" and some "title" as the key for the page title
[14:25:31] i thought perhaps one is only supported for backward compatibility
[14:27:08] according to the source, the key name is just thrown away. So perhaps i should replace it with "Daniel" :-)
[14:28:24] There should only be title
[14:29:01] A page is the content, a title is the string used to identify the page
[14:30:02] The page https://www.mediawiki.org/wiki/Extension:Wikibase/API was mostly written before implementation (that is, made manually) so it does contain some errors
[14:30:03] so only the documentation is wrong
[14:30:25] The actual examples used both title and page
[14:30:40] I thought I had removed all references to page
[14:30:54] is the array format already live?
[14:31:00] yes
[14:31:17] i am just rewriting my code to use arrays
[14:31:55] Test is a bit behind dev, but I think everything is on test now
[14:32:19] Slowly starting to get claims to work on dev
[14:36:46] New patchset: Jens Ohlig; "(Bug #41390) Fix violation of label-description uniqueness constraint on items" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29969
[14:54:35] aude, Abraham_WMDE1: btw, i'm still smashing my head against this conflict resolution stuff
[14:55:04] I'm under the impression that we just hit upon some underlying issue, and the tests i'm writing are uncovering more of those...
[14:55:21] that stuff is *strange* sometimes.
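Earlier in this log, jeblad and Merlissimo discuss an inconsistency in the sitelink JSON: some documented examples used "page" as the key for a sitelink's page title, others "title", with "title" being the intended one. A small, purely hypothetical helper (not part of any real library) illustrates how a bot author might tolerate both key names while the documentation was being cleaned up:

```python
# Hypothetical helper, not real pywikidata/Wikibase code. The key names
# "title" and "page" come from the chat; everything else is illustrative.

def sitelink_title(sitelink):
    """Return the page title from a sitelink dict, tolerating the old "page" key."""
    return sitelink.get("title", sitelink.get("page"))

# Both the documented forms quoted in the discussion would then work:
assert sitelink_title({"site": "huwiki", "title": "Budapest"}) == "Budapest"
assert sitelink_title({"site": "huwiki", "page": "Budapest"}) == "Budapest"
```

As DanielK_WMDE notes, the real fix is consistency in the API itself rather than defensive client code.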
[14:56:10] DanielK_WMDE: :-/
[14:56:26] New patchset: Tobias Gritschacher; "use prefixed ID in ItemViewTest" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29970
[14:56:58] Change merged: Henning Snater; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29970
[15:00:34] DanielK_WMDE: i am not surprised
[15:00:45] :/
[15:00:58] i am willing to poke at it on the weekend but need a mental break from it now
[15:01:26] aude: no problem. by then i'll have test cases, at the very least.
[15:01:32] good, and i can look at those
[15:03:22] * DanielK_WMDE sees 4 failures. goes and fixes silly mistake in test case. now sees 5 failures.
[15:03:22] >_<
[15:03:37] grrrr
[15:05:58] my impression is that the existing conflict detection code is horribly broken
[15:06:12] it doesn't detect all conflicts. and it reports conflicts that should have been resolved.
[15:06:40] ugh
[15:06:47] i think using wpStarttime for conflict resolution is simply wrong. wpStarttime shouldn't even exist. It's useless.
[15:06:54] but not surprised
[15:07:04] or can you think of any way wpStarttime would be helpful for detecting conflicts?
[15:08:01] you might also be interested in https://bugzilla.wikimedia.org/show_bug.cgi?id=41338 which just raises general concern about edit conflicts
[15:08:36] not sure what the right way to do it is, without poking at the code more
[15:13:57] aude: also, using a timestamp to identify the base revision is just WRONG. it means that in my tests, i have to wait 1 sec after every edit, so i get unique timestamps!
[15:13:57] gaaaahhhh!
[15:16:46] * aude nods
[15:16:53] timestamps might not be unique
[15:31:23] aude: for detecting conflicts, the base revision id should be used. and nothing else. grrr
[15:42:33] aude: ok, i can now reliably reproduce the problem programmatically, with a test case. only took me all day ;)
[15:42:39] now for fixing the bugger...
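DanielK_WMDE's point above can be sketched briefly: MediaWiki timestamps have one-second resolution, so two saves within the same second are indistinguishable by timestamp, while revision IDs are always unique. This is a hedged illustration of the argument, not the actual Wikibase conflict-detection code; all names are hypothetical:

```python
# Hypothetical sketch: why the base revision ID, not a timestamp, should
# identify the revision an edit was based on when detecting conflicts.

def conflicts_by_timestamp(base_ts, latest_ts):
    # One-second resolution: two edits in the same second look identical.
    return base_ts != latest_ts

def conflicts_by_revision_id(base_rev_id, latest_rev_id):
    # Revision IDs are unique, so any intervening edit is detected.
    return base_rev_id != latest_rev_id

# Two edits saved in the same second: same timestamp, distinct revision IDs.
base = {"rev_id": 100, "ts": "20121025150000"}
other = {"rev_id": 101, "ts": "20121025150000"}

assert not conflicts_by_timestamp(base["ts"], other["ts"])        # missed conflict!
assert conflicts_by_revision_id(base["rev_id"], other["rev_id"])  # detected
```

This is also why the test cases mentioned above had to sleep one second between edits when timestamps were used as identifiers.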
[15:45:52] Merlissimo: hey, it seems you're doing interwiki bots for wikidata. right now, what would be useful in pywikidata? when it is merged into pywikipedia, something like getting a list of interwikis joining the local & the remote ones could be nice, but I can't think of more things
[15:47:59] joancreus: my interwiki bot is written in java. do you know how the pwb iw bot will work after wikidata is live?
[15:48:56] Merlissimo: right now, the PWB+Wikidata integration by Amir doesn't do much
[15:48:56] DanielK_WMDE: YAY
[15:49:01] oops, yay
[15:49:40] it doesn't give a proper interface to the api, you can't access properties, it seems, & doesn't allow you to post more than one property change at once
[15:49:59] so it seems after wikidata goes live, PWB+PWD will have to be integrated
[15:50:12] but keeping the Item object, item.sitelinks["enwiki"] way of doing things
[15:50:15] abstracting it
[15:50:22] (i think it's better that way)
[15:51:05] is there any existing plan for how to develop wikidata-compatible pwd interwiki bots?
[15:51:14] not that I am aware of
[15:51:20] i'm only developing a library
[15:51:27] it's up to others what to do with it :) Merlissimo
[15:51:39] it would be interesting to create such bots
[15:51:58] would you be interested in doing so?
[15:52:30] they'll be really important, importing bazillions of interwikis manually isn't fun
[15:53:12] so we must ensure that pwb will stop working on those wikis until somebody develops a solution
[15:53:40] Merlissimo: by pwb you refer to interwiki.py, the bot adding interwikis?
[15:53:47] yes, or else it will really mess it all up
[15:53:48] joancreus: as i already wrote: i am running a java interwiki bot which is able to work with wikidata
[15:53:55] oh ok
[15:54:08] argh, a pity it is not python, you could reuse the library
[15:54:29] if i have time i'll try to translate it, but right now with changes constantly...
[15:55:04] joancreus: python is not powerful enough for my bot. i am doing 20000-50000 edits a day using a single process.
[15:55:11] Merlissimo: there's jythonc to "compile" python to java
[15:55:34] Merlissimo: but isn't it IO limited? no experience in interwiki bots, only general-use bots, but most things aren't CPU heavy
[15:55:46] and my bot is using completely different algorithms, so you cannot compare these frameworks
[15:56:10] Merlissimo: but pywikidata only gives you the interface, you write your algorithms on top of it so you don't have to write the "save the item" code
[15:56:15] why should io be limited?
[15:56:30] no, i meant, wikipedia bots aren't usually CPU-bound
[15:56:36] speed depends on the IO
[15:56:43] not CPU
[15:56:47] though in interwiki bots it might be heavier
[15:57:20] my bot is sleeping most of the time because of the wiki-internal edit limit
[15:57:51] DanielK_WMDE: ping
[15:58:06] so it would be pretty much the same in python in terms of speed
[15:58:19] but implementing tarjan in python could be heavy for big interwiki groups
[15:58:40] Merlissimo: do you already have some code to get an item, and save an item?
[15:59:03] yes, my bot has been working since june on wikidata.
[15:59:15] ok
[16:00:35] just received the email that says that http://www.wikidata.org is live.
[16:00:56] i only rewrote it today to use arrays for setitems instead of lists
[16:01:39] there have been some changes which broke my code recently :( already fixed though
[16:01:44] aharoni: ?
[16:02:01] The domain is live. Without a lot of content.
[16:02:01] aharoni: great! but will it be data.wikimedia.org and this a redirect, or wikidata.org directly?
[16:02:02] * aude sees just the landing page
[16:02:12] No idea.
[16:02:12] aharoni: that's not new :/
[16:02:20] New for me.
[16:02:26] ah, okay
[16:02:28] I get automatic emails about new domains.
[16:02:31] it should change soon to be our wiki :)
[16:02:35] beta wiki
[16:02:41] Interests me as a language committee member.
[16:02:49] * aude nods
[16:03:09] it might take longer to get the language redirects working since it's a non-standard setup for us
[16:03:22] wikidata.org, data.wikimedia.org or something else: which will redirect to which?
[16:03:31] i think data.wikimedia.org would be the most coherent one
[16:03:36] joancreus: it will be wikidata.org
[16:03:36] whoops
[16:03:45] joancreus: can you request the corresponding item for a local page?
[16:03:50] * aude is talking about de.wikidata.org, etc.
[16:03:54] oh
[16:04:08] which i think will point to wikidata.org/wiki/Project:Main_Page/de etc.
[16:04:13] since it was a meta-project like wikispecies or commons or meta, i thought it would be in wikimedia.org
[16:04:14] or something
[16:04:19] joancreus: no
[16:04:26] Merlissimo: wikidata.api.getItemByInterwiki
[16:04:32] in my bot
[16:04:34] data.wikimedia.org might redirect but not sure
[16:04:39] let me see what the code is underneath
[16:05:00] Merlissimo: resp = self.request.get({"action":"wbgetentities", "sites": "|".join(sites), "titles": "|".join(titles)})
[16:05:03] joancreus: how do you know the hostname?
[16:05:04] and de.wikidata.org/wiki/Q100 should point to de.wikidata.org/wiki/Q100?setlang=de or something
[16:05:07] it might be broken, let me check if it works
[16:05:14] magically
[16:05:21] Merlissimo: in config.py, there's api = "http://wikidata-test-repo.wikimedia.de/w/api.php"
[16:05:28] aude: ok, makes sense
[16:05:52] i think it will default to english initially but people can use the language selector to change their settings
[16:06:00] it should be sticky
[16:06:57] joancreus: ok, hardcoded like i have done.
[16:07:21] Merlissimo: well, not hardcoded, in a file config.py ;)
[16:07:46] ... which must be changed after wikidata is live.
[16:08:08] yep
[16:08:15] i only wanted to be sure that i haven't missed something.
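The `wbgetentities` call quoted above joins multiple site IDs and page titles with pipes. A minimal sketch of assembling such a request URL follows; the endpoint is the test-repo URL quoted from config.py in the log, the `action`, `sites`, and `titles` parameters are the ones shown in the chat, and the wrapper function itself is hypothetical:

```python
from urllib.parse import urlencode

# Endpoint quoted from config.py in the log above.
API = "http://wikidata-test-repo.wikimedia.de/w/api.php"

def build_wbgetentities_url(sites, titles):
    """Build (but do not send) a wbgetentities query URL."""
    params = {
        "action": "wbgetentities",
        "sites": "|".join(sites),    # multiple values are pipe-separated
        "titles": "|".join(titles),
        "format": "json",
    }
    return API + "?" + urlencode(params)

url = build_wbgetentities_url(["enwiki", "dewiki"], ["Hydrogen"])
# The pipe separator is percent-encoded as %7C in the final URL.
```

A bot would then fetch this URL and read the matching entity (or entities) out of the JSON response.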
[16:20:01] *awawy
[16:20:01] *away
[18:23:30] joancreus: Hi
[18:23:46] Amir1: hi
[18:25:15] joancreus: I think the problem is you try to do nothing
[18:25:15] try the page Hydrogen
[18:25:15] and what is the error?
[18:25:42] the page does not exist
[18:26:13] Amir1: pywikibot.exceptions.NoPage: (wikidata:wikidata, u'[[wikidata:Hydrogen]]', 'Page does not exist. In rare cases, if you are certain the page does exist, look into overriding family.RversionTab')
[18:26:13] [1] https://meta.wikimedia.org/wiki/wikidata:Hydrogen
[18:26:15] What are you trying to do? what's the exact code?
[18:26:29] wikipedia.Page(wikipedia.getSite('wikidata',fam='wikidata'),'Hydrogen').get()
[18:26:34] however, wikipedia.Page(wikipedia.getSite('wikidata',fam='wikidata'),'q320').get() works
[18:27:07] i know
[18:27:19] i'm working on changing that
[18:27:37] about the problem, i'm gonna fix it
[18:27:38] you try that
[18:28:37] page.put(u"",u"BOT: TESTING BOO",wikidata={'type':u'sitelink', 'site':'de', 'title':'BAR'})
[18:29:10] Amir1: okok
[18:29:15] Amir1: and there's no way to add multiple sites, right?
[18:29:31] (i mean, at once)
[18:30:16] not for now, but if it's possible i think i can change it
[18:30:35] i mean if it's possible to do via the API
[18:33:20] if you want, take a look at pywikidata's source
[18:33:20] it is handled in api.py
[18:33:25] Amir1:
[18:34:11] joancreus: how can i get information
[18:34:13] about a page
[18:34:27] https://github.com/jcreus/pywikidata
[18:34:27] information?
[18:34:32] something that looks like page.get() but via the API
[18:34:43] JSON?
[18:34:44] i mean the content
[18:34:45] yes
[18:35:32] one of these: wbgetentities, wbcreateclaim, wbsetlabel, wbsetdescription, wbsetsitelink, wbsetaliases, wbsetitem,
[18:35:32] wblinktitles, languagesearch, moodbar, feedbackdashboard, feedbackdashboardresponse,
[18:35:32] moodbarsetuseremail, sitematrix, login, logout, query, expandtemplates, parse, opensearch,
[18:35:32] feedcontributions, feedwatchlist, help, paraminfo, rsd, compare, tokens, purge,
[18:35:32] setnotificationtimestamp, rollback, delete, undelete, protect, block, unblock, move, edit, upload,
[18:35:33] filerevert, emailuser, watch, patrol, import, userrights, options
[18:35:33] ?
[18:35:33] those starting with wb are related to wikidata
[18:35:33] http://wikidata-test-repo.wikimedia.de/w/api.php
[18:35:36] (WikiBase)
[18:35:48] I know
[18:36:04] wbgetentities is used in wikidata.api.getItemById & wikidata.api.getItemByInterwiki
[18:36:04] it returns an item
[18:36:10] & its properties
[18:36:20] in fact it can return more than one
[18:36:25] wbsetitem is used to save (put)
[18:37:11] wbsetitem is used to save labels
[18:37:14] it's not important
[18:37:17] wbsetsitelink is more important i think
[18:37:56] not necessarily labels
[18:38:02] wbsetitem is to save contents
[18:38:11] you can save whatever
[18:38:16] by putting json in the &data= parameter
[18:38:43] {{"sitelinks": blablfaldfsa}, {"labels": fdsafdsa fdsafdsfdsa fsaf}}
[18:38:43] Invalid characters in the link "Template:"sitelinks": blablfaldfsa}, {"labels": fdsafdsa fdsafdsfdsa fsaf"; these are not permitted: <>[]{}
[20:06:23] New patchset: John Erling Blad; "Initial work on statements in ItemView" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30055
[20:07:48] hrmmm, where is the denny
[20:08:33] also, when is boston? where else?
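The `&data=` JSON that joancreus sketches above is garbled placeholder text in the log. As a hedged guess at the structure under discussion (a top-level "sitelinks" map and "labels" map, with "title" as the sitelink key per the earlier part of the log), a payload might look something like this; the field names and values are assumptions for illustration, not verified against the Wikibase API documentation:

```python
import json

# Assumed shape of a wbsetitem data payload, per the chat discussion.
# "title" (not "page") identifies the target page, as jeblad stated earlier.
data = {
    "sitelinks": {"dewiki": {"site": "dewiki", "title": "Wasserstoff"}},
    "labels": {"en": {"language": "en", "value": "hydrogen"}},
}

# This serialized string is what would go into the &data= parameter.
payload = json.dumps(data)
```

Saving "whatever" through one parameter this way is why wbsetitem can set sitelinks and labels in a single request, unlike the narrower wbsetsitelink and wbsetlabel modules.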
[20:09:43] denny is at the semantic mediawiki con
[20:09:58] boston is ~mid-november
[20:10:07] http://semantic-mediawiki.org/wiki/SMWCon_Fall_2012
[20:10:59] http://meta.wikimedia.org/wiki/Wikidata/Events
[20:11:19] New patchset: John Erling Blad; "Initial work on statements in ItemView" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30055
[22:07:40] Lydia_WMDE: Hi
[22:08:15] Amir1: hey
[22:08:45] Did you see the argument?
[22:09:05] i quickly skimmed it
[22:09:16] at a conference today so didn't have much time
[22:09:18] sorry
[22:09:18] This week i have more commits than edits!
[22:09:24] hehe
[22:09:26] nice
[22:17:06] Lydia_WMDE: Please see it, whenever you want