[07:51:03] New patchset: Henning Snater; "SiteLinksEditTool: Regenerating EditableValue prototype" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/70988 [08:35:17] DanielK_WMDE_: oin [08:35:25] DanielK_WMDE_: moin [09:02:08] DanielK_WMDE_: https://gerrit.wikimedia.org/r/#/c/70988/ [09:02:16] DanielK_WMDE_: daily time [09:34:38] https://bugzilla.wikimedia.org/show_bug.cgi?id=50062 [09:35:51] https://bugzilla.wikimedia.org/show_bug.cgi?id=50061 [09:36:31] https://bugzilla.wikimedia.org/show_bug.cgi?id=49982 [09:37:55] https://bugzilla.wikimedia.org/show_bug.cgi?id=49978 [09:39:43] https://bugzilla.wikimedia.org/show_bug.cgi?id=49425 [09:41:15] https://bugzilla.wikimedia.org/show_bug.cgi?id=49910 [09:42:28] https://bugzilla.wikimedia.org/show_bug.cgi?id=49880 [09:48:16] https://bugzilla.wikimedia.org/show_bug.cgi?id=49805 [09:49:25] https://bugzilla.wikimedia.org/show_bug.cgi?id=47114 [09:50:49] https://bugzilla.wikimedia.org/show_bug.cgi?id=45158 [09:51:19] https://bugzilla.wikimedia.org/show_bug.cgi?id=48047 [09:52:45] https://bugzilla.wikimedia.org/show_bug.cgi?id=44841 [09:56:11] https://bugzilla.wikimedia.org/show_bug.cgi?id=49367 [09:57:27] https://bugzilla.wikimedia.org/show_bug.cgi?id=49404 [09:58:16] https://bugzilla.wikimedia.org/show_bug.cgi?id=49079 [10:02:17] https://bugzilla.wikimedia.org/show_bug.cgi?id=49068 [10:03:14] Denny_WMDE: the Hardware-Meter is still standing on my desk showing a very small number of edits/min (~55) [10:04:17] lbenedix: shouldn't it be moving to our office? :) [10:04:29] https://bugzilla.wikimedia.org/show_bug.cgi?id=49100 [10:04:31] it should [10:04:54] what needs to happen for that? :) [10:05:24] https://bugzilla.wikimedia.org/show_bug.cgi?id=49120 [10:05:48] https://bugzilla.wikimedia.org/show_bug.cgi?id=48742 [10:05:56] we need to agree on a price and I have to come to your office [10:08:09] https://bugzilla.wikimedia.org/show_bug.cgi?id=49011 [10:08:44] how much do you want? [10:08:45] Change on mediawiki a page Extension:Wikibase Client was modified, changed by Aude link https://www.mediawiki.org/w/index.php?diff=719277 edit summary: [+34] lookup by property label not experimental now [10:09:08] * lbenedix is starting excel [10:10:37] the hardware was ~60EUR [10:11:59] abraham will negotiate with you, i need to run for a talk :) [10:12:08] see you later [10:33:13] aude: if you have some time to do reviews, please have a look at https://gerrit.wikimedia.org/r/#/c/70164/ [10:33:33] ok [10:34:53] lbenedix: hardware-meter? [10:36:42] YuviPanda: http://lb.bombenlabor.de/wikidatameter/ [10:37:04] New review: Daniel Kinzler; "Seems to work." [mediawiki/extensions/Wikibase] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/70988 [10:37:05] mmm that's pretty nice! [10:37:34] Abraham_WMDE: CR+2. Will you do the backport then? [10:38:41] Change merged: jenkins-bot; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/70988 [10:38:44] DanielK_WMDE_: I think aude will backport it [10:38:58] DanielK_WMDE_: aude will backport it :) [10:39:06] sure :) [10:39:33] * aude also looking into why localisation update choked last night on wikibase data model [10:41:05] there will be no wmf8 and no wmf9 on monday if localisation update can't run [10:41:15] anyone around to properly move http://meta.wikimedia.org/wiki/Wikidata/Events to wikidata?
[10:41:27] ideally with translations [10:48:01] New patchset: Daniel Kinzler; "Add @group WikibaseLib to lib tests" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71001 [10:50:02] New patchset: Daniel Kinzler; "Add @group WikibaseLib to lib tests." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71001 [11:08:50] Is there documentation somewhere that helps me find out how to use the mw.getEditToken correctly? [11:10:46] Granjow: I don't know much about the JS side, but I know what it should do on the API side... [11:10:50] Anyway: "This is a low-level method used by api.postWithEditToken to get tokens." [11:10:54] https://www.mediawiki.org/wiki/ResourceLoader/Default_modules#mw.Api.23getEditToken [11:11:07] does this help? it seems to imply that you don't really need to call it directly. [11:11:46] Granjow: or maybe this helps? https://www.mediawiki.org/wiki/Manual:Edit_token#Retrieving_via_Ajax [11:12:06] * DanielK_WMDE_ finds it interesting that nobody says "AJAX" any more. [11:12:38] DanielK_WMDE_: Ah. That may help. I tried to use api.getEditToken but kept on receiving a TypeError: api.getEditToken is not a function [11:13:00] DanielK_WMDE_: I'd like to use the JS api to insert data from my DB into Wikibase. [11:13:06] * lbenedix thinks Ajax has something to do with XML, most "Ajax" Requests send and receive JSON [11:13:27] Ajaj? [11:13:56] * lbenedix loves the misuse of the term REST for every HTTP-Api [11:16:28] Ajax is not completely wrong because the XMLHttpRequest-Object is used to send the requests [11:16:46] Ok, mw.user.tokens.get( 'editToken' ) returns null, which is at least better than an error message. How would I call getEditToken directly? Shouldn't this be possible too? [11:18:26] Granjow: are you logged in? [11:19:26] lbenedix: yes. [11:20:17] api.php?action=tokens&type=edit&format=jsonfm does return me a token, too. [11:21:14] where do you call mw.user.tokens.get( 'editToken' );? [11:22:25] lbenedix: http://codepad.org/bCrpeNOl line 26. Or, in a subdirectory of mediawiki. [11:23:01] so it's not running from a mediawiki-context? [11:23:31] What is this? [11:23:55] it is not a gadget or inside of an extension? [11:24:04] No. [11:24:16] so you are not logged in [11:24:40] Why? And why does the function not tell me this? [11:24:49] And how can I get or stay logged in? [11:26:01] I'm not sure [11:26:10] And, when I use the Ajax call, I do get a token. [11:26:23] Line 31 [11:26:53] you need to authenticate before getting your token [11:27:31] hm, wait a second -- [11:28:38] Is this wrong? http://codepad.org/31AHBOAZ [11:28:45] It does not get executed. [11:31:59] Is this all undocumented, or did I simply not find the docs yet? [11:36:05] i'm not sure if you can use the mw object from outside of the mediawiki context [11:36:32] What is the mediawiki context? [11:36:50] a mediawiki website [11:37:04] I mean, I loaded the modules (or at least I think I did), which is precisely what the extensions do as well [11:39:57] in an extension all requests have your auth-cookie [11:40:04] auth-cookie [11:44:43] The JS api needs a cookie, but my ajax request does not? [11:45:23] Did I load the module correctly, btw? When I call mw.user.getName, I receive a TypeError: mw.user.getName [11:45:29] is not a function [12:07:02] So I do this: http://codepad.org/RtTtD3xs [12:07:45] It says that the mediawiki.api module is ready. getModuleNames also contains mediawiki.api.parse. But api.parse ''is not a function''?
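To make the mw.Api discussion above concrete, here is a minimal sketch of the flow that was being attempted. It assumes the script runs inside a MediaWiki page context (a gadget, user script or extension-provided module) as a logged-in user; outside that context there is no session cookie, which is why the token calls above returned null. The page title and wikitext are made-up examples.

    // Load both the core API module and the parse plugin -- api.parse() only
    // exists once 'mediawiki.api.parse' has been loaded, which may be why it
    // showed up as "not a function" above.
    mw.loader.using( [ 'mediawiki.api', 'mediawiki.api.parse' ], function () {
        var api = new mw.Api();

        // postWithEditToken() fetches the edit token itself, so getEditToken()
        // rarely needs to be called directly.
        api.postWithEditToken( {
            action: 'edit',
            title: 'User:Example/Sandbox',
            appendtext: '\nAdded via the API'
        } ).done( function ( result ) {
            console.log( 'Edit saved', result );
        } ).fail( function ( code ) {
            console.log( 'Edit failed: ' + code );
        } );

        api.parse( "'''Hello'''" ).done( function ( html ) {
            console.log( html );
        } );
    } );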
[12:46:35] New patchset: Liangent; "New Utils::getLanguageFallbackChain() function" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/70871 [12:59:30] what does 'snak' mean in wikidata jargon? [13:00:11] liangent: http://meta.wikimedia.org/wiki/Wikidata/Data_model#Snaks [13:00:21] it's a collection of property + value [13:00:39] or can have "no value" or other thing than a "value" [13:03:42] New patchset: Aude; "SiteLinksEditTool: Regenerating EditableValue prototype" [mediawiki/extensions/Wikibase] (mw1.22-wmf9) - https://gerrit.wikimedia.org/r/71011 [13:03:50] aude: a single property-value pair, or a sum of some pairs? [13:04:00] Change merged: Aude; [mediawiki/extensions/Wikibase] (mw1.22-wmf9) - https://gerrit.wikimedia.org/r/71011 [13:05:23] liangent: a single pair [13:05:45] a collection of snaks (including references and qualifiers) is a statement [13:06:19] aude: got it thx [13:06:26] errr, it is a claim [13:06:37] claim is a superset of statements [13:11:15] Is any dev here working on the JavaScript part? [13:11:20] of Wikibase? [13:16:17] aude: claims include references? it doesn't seem so [13:17:18] or is https://meta.wikimedia.org/wiki/Wikidata/Data_model not up to date? [13:19:23] liangent: you are right [13:19:30] statements are a superset of claims [13:19:44] */ [13:19:47] class Statement extends Claim [13:26:57] aude: what's the guideline for changing a function definition? [13:28:33] i don't know if we have a guideline yet [13:29:02] Will I ever get help about development in this channel? [13:29:21] Or just about things that are documented anyway? [13:29:31] aude: simply replacing all existing function calls directly is fine? [13:29:40] Granjow: i know almost nothing about js [13:29:48] people are on vacation [13:30:25] another volunteer might be able to help, but seems they are on vacation also [13:37:41] https://gerrit.wikimedia.org/r/#/c/67399/ not being merged is really annoying [13:38:07] for gerrit / jenkins tests [13:39:55] aude: :( okay. [13:40:01] maybe DanielK_WMDE_ can look [13:40:07] although it's not his expertise [13:40:35] Nikerabbit: can you look at https://gerrit.wikimedia.org/r/#/c/67399/ [13:41:10] Nikerabbit: btw it's also needed for another patch of mine, in Extension:CentralNotice [13:42:32] aude: is the test suite of wikibase complete enough? so I can just depend on jenkins to find out function calls that I forgot to change? [13:42:54] i wouldn't depend on that [13:43:03] the tests are quite good but not 100% [13:47:32] liangent: is there no better name than "parent"? My initial guess was that this had nothing to do with language converter? [13:49:22] Nikerabbit: getBaseLanguage ? [13:51:30] liangent: can you embed language converter in the name somehow? [13:52:05] getBaseLanguageLanguageConverter [13:56:41] Reedy: yeah that would mean different thing [14:06:24] Nikerabbit: getParentLanguageToUseConverter [14:06:45] getParentLanguageForConverter a little more ambiguous [14:11:14] getLanguageConverterBaseLanguage? [14:16:11] Now I'm trying to load my module and wikibase.client.init (arbitrary choice) with $wgOut->addModules(), and nothing is loaded. $wgOut->addHTML() does add HTML though. [14:16:26] Is this normal? [14:17:20] Granjow: are you logged in? [14:17:39] i think that one is for the site link widget (for logged in users only currently) [14:17:50] aude: I'm about to log me out with a big hammer soon. [14:18:13] with addModules, though i think it should work [14:18:14] aude: Yes, I am, I'm now in a special page. 
Ordinary Mediawiki extension and so on. [14:18:36] what about a regular wiki page? [14:19:00] What about it? [14:19:03] i'm not sure with addModules, it should get loaded regardless of which conditions we have in our hooks [14:19:12] the widget is not available in special pages [14:19:18] not sure it matters here [14:19:44] But I have also defined my own module where I put the .js I previously tried to execute. Not loaded either. [14:19:52] hmmmm [14:20:04] then i am not sure [14:25:10] Okay, so when I do it twice, then it works. [14:25:38] not sure why [14:26:49] I guess I do ... the part that is documented is done so incorrectly. [14:27:09] where is it documented? [14:27:24] https://www.mediawiki.org/wiki/ResourceLoader/Developing_with_ResourceLoader [14:27:26] how very useful. [14:27:56] ah, ok [14:30:03] Indeed, for whatever undocumented reason, wb.user now works. [14:30:51] Parsing '''Hello''' still fails, but who cares, who uses that complex MediaWiki syntax anyway! [14:36:55] Change on meta_wikimedia a page Wikidata was modified, changed by Lydia Pintscher (WMDE) link https://meta.wikimedia.org/w/index.php?diff=5612752 edit summary: [+2] events page was moved to wikidata.org now [14:38:32] Change on meta_wikimedia a page Wikidata was modified, changed by Lydia Pintscher (WMDE) link https://meta.wikimedia.org/w/index.php?diff=5612793 edit summary: [-964] remove box asking for input as it is horribly outdated by now [14:41:28] DanielK_WMDE: Hi! Just to remind you https://www.wikidata.org/wiki/Wikidata:Wikisource . You have asked me to do so monday. [14:44:24] weeeee! It works! [14:44:47] :) [14:48:11] I could now fill my database with 42k empty items by just adding a single line of code! Now *that* is useful! :D [14:48:37] :D [14:50:01] liangent: the review situation is pretty bad at the moment... lots of people are on vacation. [14:50:05] i'll try to have a look [14:50:28] DanielK_WMDE_: in https://gerrit.wikimedia.org/r/#/c/69847/2/ValueValidators/includes/Result.php ... [14:50:39] you check $a is valid and has no errors and then return $b [14:50:58] is that really what you intend? if so, why? [14:51:35] aude: because if $a is valid and has no errors, it's empty. if there is anything interesting, it's contained in $b. [14:52:39] hmmm [14:52:41] aude: logically: valid( a ) => valid( merge( a, b ) ) = valid( b ) [14:53:09] similarly: empty( errors( a ) ) => errors( merge( a, b ) ) = errors( b ) [14:53:25] and of course vice versa [14:53:43] then $b may or may not contain errors [14:53:49] yes. [14:53:59] if both contain errors, then we merge [14:54:08] yes [14:54:55] seems okay then [14:55:08] i find the function name "merge" a little bit vague also [14:55:20] mergeResults() maybe [14:55:29] well, it merges the list of errors and adjusts the validity state accordingly [14:55:50] Result::merge isn't clear enough? Result::mergeResults() seems redundant... [14:56:23] newFromResults [14:56:45] as it's a static function [14:56:58] "new" implies that it'll always return a new object [14:57:00] that is not the case [14:57:01] * aude don't want to make you rebase your other patch though [14:57:19] $a.mergedWith($b) [14:57:23] ? [14:57:23] it returns $a or $b or new Result [14:57:38] when would it not return a Result [14:58:32] ? [14:58:51] am I required to write tests for all code paths I touched? [14:59:00] even if there were no tests before [14:59:12] strongly encouraged, i'd say [14:59:28] not sure we do that 100% though [15:00:03] aude: it does not always return a *new* object.
well, results are immutable, so it wouldn't make much of a difference [15:00:05] * aude would get stuff shot at me though if i don't provide enough tests [15:00:20] DanielK_WMDE_: hmmmm [15:00:20] Granjow: that would imply that $a gets modified [15:01:16] liangent: you can also try to make the person who wrote the code in question write the test :) [15:01:59] if it's something like the hooks file, then it's not very easy to test at this point [15:03:55] DanielK_WMDE_: i suppose merge() is good enough for now although i think it's somewhat vague [15:03:58] DanielK_WMDE_: true. I would then trade redundancy for clarity and vote for mergeResults(). [15:04:15] Tpt_: that proposal seems fine to me... but remind me please, what was i supposed to do there? [15:04:52] Tpt_: oh, because of the commons stuff? see https://commons.wikimedia.org/wiki/Commons:Wikidata_for_media_info [15:06:21] Granjow, aude: but it's a static function, so it will always be written as Result::merge(), never just merge(). That should make it clear, no? [15:07:41] How can I submit patches to Wikibase? [15:07:56] DanielK_WMDE_: i know but it's not Results::merge [15:08:17] Granjow: via gerrit: https://www.mediawiki.org/wiki/Gerrit [15:08:29] New review: Aude; "(1 comment)" [mediawiki/extensions/DataValues] (master) C: -1; - https://gerrit.wikimedia.org/r/69847 [15:08:34] aude: no, that would be merging two sets of results :) [15:08:34] alright problem with your tests [15:08:41] * aude confused! [15:09:18] Granjow: see also https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker [15:09:25] aude: what problem? [15:09:30] see my comments [15:09:34] https://gerrit.wikimedia.org/r/#/c/69847/ [15:09:47] i can perhaps review more later or during the weekend [15:09:52] if you can fix that [15:10:36] * aude done with triaging the localisation update thing and we're not doing deployments next week anyway [15:10:52] DanielK_WMDE_: merge() sounds too much like union(a,b) ... well, thinking of it, it is actually exactly that. I think I just lost my point ;) [15:11:24] i can live with merge but it took me more than a minute to look at the code and understand [15:11:34] mergeResults would have been more obvious [15:12:32] anyway, i'm off soon in search of lunch :) and an appointment [15:12:42] * aude stop eating chocolates [15:13:17] DanielK_WMDE: You were just supposed to review it to see if there are no big issues. Thanks, I'll communicate on it. [15:13:19] aude: hahaha! omg i must have been tired :P [15:13:26] yeah [15:13:36] fix that and i can merge later [15:13:36] Tpt_: the proposal looks good to me, yea [15:13:40] optionally rename the function [15:13:44] DanielK_WMDE_: We need to turn off the change propagation for testwiki [15:13:55] hoo: huh? [15:14:00] If we don't, moves there will make wd.org inconsistent [15:14:14] aude: client to repo change propagation, I mean [15:14:40] DanielK_WMDE_: If the community agrees on it, is it possible to see Wikisource support added this summer? [15:14:41] hoo: is there an option for that? i don't remember [15:14:44] does it directly insert the jobs or require a cron job? [15:14:48] DanielK_WMDE_: Not yet [15:15:02] Tpt_: if it's just about language links, yes. [15:15:15] hoo: add that, then :) [15:15:20] DanielK_WMDE_: Ok.
Thanks :-) [15:15:28] aude: it directly injects the jobs [15:15:29] we also need to setup test2 to be a client for test.wikidata [15:15:38] New patchset: Liangent; "Enable variant level fallback for {{#property: }}" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71072 [15:15:45] aude: we really need the cache key change :/ [15:15:53] DanielK_WMDE_: yes yes [15:15:58] aude: poke me about it on monday if i haven't done it by then [15:16:21] i suppose/hope nobody goes on a page move spree on test2 between now and monday? [15:16:57] i won't get around to configuring test2 today and finding someone willing to deploy it on a friday [15:17:27] aude: Do you have an example change where you introduce a new client config.? I haven't worked with these in ages [15:17:38] aude: what repo is hooked up to test2? should be test.wikidata.org now, no? [15:17:49] it would be in initialise settings i think [15:17:54] if not, we should make it so :) [15:17:57] or we could use a switch statement in common settings [15:18:03] DanielK_WMDE_: that's what i am saying [15:18:08] just never got around to it [15:19:13] and then we'd want to setup cronjobs for change propagation from test wikidata to test2 [15:19:34] not urgent [15:20:00] aude: Wouldn't that be a nice place to experiment with job only propagation? :P [15:20:05] it would be [15:20:28] deploying new cron jobs requires extra super powers [15:20:33] e.g. not ree-dy [15:21:00] New patchset: Daniel Kinzler; "Allow Result objects to be merged" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/69847 [15:21:41] * hoo thinks about a good and generic enough setting name [15:22:01] hoo: "propagateChangesToRepo" ? [15:22:19] aude: fixed. [15:22:31] That sounds nice and will also cover delete propagation, ... [15:22:36] DanielK_WMDE_: ok [15:22:39] ok, i'm off again, have to sort some stuff downtown and see the inlaws. [15:22:55] will be online a bit tomorrow. most stuff will have to wait until monday. [15:22:59] hoo: sounds okay [15:23:21] ok :) [15:23:41] alright, time for lunch :) [15:23:55] * aude around later  [15:31:20] New patchset: Liangent; "New LanguageWrapper class" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67453 [15:31:34] New patchset: Liangent; "New Utils::getLanguageFallbackChain() function" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/70871 [15:31:44] New patchset: Liangent; "Enable variant level fallback for {{#property: }}" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71072 [15:32:49] DanielK_WMDE_: I think I added you as reviewer here: https://gerrit.wikimedia.org/r/#/c/71074/1,publish [15:33:57] Granjow: You did yes [15:34:20] You can see reviewers on the left of the main change view https://gerrit.wikimedia.org/r/71074 [15:35:23] :) [15:41:38] New patchset: Hoo man; "Introduce the propagateChangesToRepo client setting" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71076 [15:42:33] away again [15:48:42] Hm. I pulled from Wikibase, and now it is broken, I only get a white screen. No entries in /var/log/apache2/error.log. [15:50:49] How would I fix that? Any standard procedures? (For now I just used the reflog and went back to the previous version.)
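For reference, the Result::merge() semantics discussed above (roughly 14:50-15:11) boil down to the sketch below. It is written in JavaScript purely as an illustration; the real implementation is the PHP ValueValidators Result class in the DataValues extension (change 69847), and the property names here are invented for the example.

    // Illustrative only: combine two validation results. A clean success
    // (valid, no errors) is "empty", so merging with it just yields the other
    // result -- which is why merge() can return $a, $b, or a new object.
    function mergeResults( a, b ) {
        if ( a.isValid && a.errors.length === 0 ) {
            return b; // valid( a ) => merge( a, b ) behaves like b
        }
        if ( b.isValid && b.errors.length === 0 ) {
            return a;
        }
        return {
            isValid: a.isValid && b.isValid,
            errors: a.errors.concat( b.errors ) // both error lists are kept
        };
    }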
[15:55:55] New patchset: Liangent; "Enable variant level fallback for {{#property: }}" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71072 [16:09:58] New patchset: Liangent; "Enable variant level fallback for {{#property: }}" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71072 [16:22:38] later, thanks for the help! [16:35:02] . [16:36:23] New patchset: Liangent; "Enable variant level fallback for {{#property: }}" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71072 [16:49:17] hey guys [16:49:48] I am trying to checkout the correct versions of Wikibase, but I am unable to do so [16:50:01] it appears that the master head is broken [16:50:54] It's probably not... did you also install all extension Wikibase depends on? [16:51:02] * extensions [16:53:01] pubsez: build is working https://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase [16:53:27] Like hoo said, you probably do not have the dependencies or perhaps you did not include the default config while also not specifying your own [16:53:43] in my case it complains about this line: wgValueFormatters[ \Wikibase\EntityId::getType() ] = 'Wikibase\Lib\EntityIdFormatter' [16:53:51] it appears that Wikibase\EntityId is missing [16:54:04] and when I comment that out, there are other errors in other locations [16:54:20] how can I checkout the specific version that is working in wikipedia [16:54:28] pubsez: You need the WikibaseDataModel extension [16:54:54] thanks hop, let me check it out [16:55:56] Change on mediawiki a page Extension:Wikibase was modified, changed by Hoo man link https://www.mediawiki.org/w/index.php?diff=719440 edit summary: [+11] /* Requirements */ + [[Extension:WikibaseDataModel|WikibaseDataModel]] [16:56:05] hoo: any tutorial about using composer in mw, to avoid such issues in the future? [16:56:16] JeroenDeDauw: We should create a page for that extension... people are confused a lot [16:56:38] liangent: Oh... I don't know... better ask Jeroen [16:57:16] Change on mediawiki a page Extension:Wikibase was modified, changed by Liangent link https://www.mediawiki.org/w/index.php?diff=719442 edit summary: [+28] /* Requirements */ [16:57:40] thanks, liangent... that page is pretty incomplete :/ [16:59:32] thanks hop, it works now [16:59:37] ops hop, not hop [16:59:51] arghh, damn you colloquy [17:00:44] :D At least it works now ;) [17:03:05] yes, at least the extension works just fine [17:03:23] although, I still have some work to do for setting up the wikidata :) [17:05:22] New patchset: Hoo man; "Drastically reduce the number of RL modules in clients" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/70115 [17:07:17] New review: Hoo man; "Rebased" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/70115 [17:10:08] hoo: we have a page https://www.mediawiki.org/wiki/Extension:Wikibase_DataModel [17:10:39] liangent: we do not have a tutorial, however if you know how to use composer, just install wikibase/wikibase [17:11:01] shame on me...
thanks, Jeroen [17:11:20] liangent: see for instance: composer create-project wikibase/wikibase:dev-master Wikibase --keep-vcs [17:11:24] here https://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/jobs/8541070 [17:11:33] Change on mediawiki a page Extension:Wikibase was modified, changed by Hoo man link https://www.mediawiki.org/w/index.php?diff=719451 edit summary: [+2] /* Requirements */ fix link [17:11:39] do the wikidata extensions retrieve the properties through database or through http interface through wikidata.org [17:11:57] I am asking this because there is repoUrl in the configuration [17:12:10] liangent: https://packagist.org/search/?q=wikidata [17:12:19] pubsez: That's for linking the wiki on which the repo runs [17:12:38] Do you want to use Wikidata.org as your repository wiki? That's not currently possible [17:12:53] nope, I was just trying to understand how wikidata overall works [17:13:02] JeroenDeDauw: I don't know how to use it now... [17:13:08] I am trying to setup one in my local machine [17:13:23] pubsez: Ah ok... that url is only used for front end things... javascript and links within the interface [17:13:29] and I'm having a cluster setup. can composer take care of it? [17:14:02] liangent: a cluster setup? [17:14:23] He probably means a wiki farm [17:14:28] with $wgConf and wmgUseWikibaseRepo / wmgUseWikibaseClient, like what WMF does [17:14:39] in the wikidatawiki dumps, I don't see two tables: wb_entity_per_page and wb_id_counters [17:14:44] how do we populate these tables [17:15:17] liangent: I don't know about these settings [17:15:28] liangent: for composer it seems you should just RTFM http://getcomposer.org/doc/ :) [17:15:48] pubsez: repo/maintenance/rebuildEntityPerPage.php [17:16:09] thanks hoo (this time I got it :) ) [17:16:38] lib/maintenance/rebuildAllData.php [17:23:45] JeroenDeDauw: it seems composer is targeted at final users instead of developers..? [17:24:09] New review: Jeroen De Dauw; "No tests, which explains why the bug in this code has not been found yet" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/70163 [17:25:41] liangent: I'm finding it pretty useful :) [17:25:58] liangent: also, who is the final user of some software component? [17:26:43] JeroenDeDauw: I mean sysadmins wanting to install it on their servers [17:39:05] aude: btw why is denny missing again these days ... or is it just because I have a different online time than him? [17:52:26] liangent: he was around this morning [17:52:44] hey hoo, how does the wikidata extension retrieve property:p154 for a particular page [17:52:51] or where can I find documentation that explains this overview [17:54:27] pubsez: you need to have your own properties [17:54:48] for wikidata and wikipedia, the databases "talk" to each other [17:54:57] aude: I am trying to understand how it works on wikipedia in general [17:55:02] we're hoping at some point to have 3rd party access [17:55:06] some mechanism [17:55:25] ok, wikipedia directly accesses the database, although property info is cached in memcached [17:55:41] all entities (items + properties) are cached [17:55:47] what I am trying to understand is on wikipedia, how does wikibase client turn property:p154 into the logo image value [17:56:10] oh, it loads the entity data (for the connected item) [17:56:32] then sees there is property p154 and returns the data value for it [17:56:45] how does it turn the title to the entity [17:56:54] e.g.
I see that Google => Q95 [17:57:35] ok, then it loads the data for q95 and gets the label [17:57:47] hopefully from the cache [17:57:54] but where does it find that Google is Q95 [17:58:14] aude: and he doesn't look responsive in gerrit emails either [17:58:18] I found that by searching wikidata, but how does the wikibase extension do that [17:58:50] google is the label... there is a "terms" table that stores the labels (and descriptions, aliases) for each item [17:59:08] that table is used in the lookup process [17:59:12] wb_terms? [17:59:13] liangent: sorry :( [17:59:16] pubsez: yes [17:59:23] now I got it, thanks [17:59:25] ok [17:59:42] and when items change, then wikidata inserts a "job" into all the clients and then the caches get purged [17:59:51] for relevant items [18:00:07] i think denny is around next weeke [18:00:14] week [18:00:30] liangent: did your core patch get merged yet? [18:01:16] when I search select * from wb_terms where term_text = 'Google'; I see term_row_id etc... [18:01:21] aude: yes [18:01:22] but I don't see Q95 [18:01:31] liangent: goo [18:01:32] d [18:01:40] do I need to do another search in another table? [18:01:45] i can take a look at your wikibase patches [18:01:55] although would prefer a +1 from denny or something [18:02:55] aude: how do you think about that naming question [18:03:23] LanguageWrapper [18:04:32] pubsez: there is 95 (for term entity id) [18:04:39] and term type = item [18:04:58] so if you know the item is q95, then you can find the label in this table [18:05:18] liangent: even though it's longer, i would go with denny's suggestion [18:05:24] more specific is usually better [18:05:58] pubsez: since we know it's an item, then the software can handle attaching the prefix or otherwise knows about that [18:06:32] aude: but exactly how does the extension find q95 [18:06:45] i understand that, once it finds q95, it can get the correct value [18:06:56] but what I don't understand is, exactly how do we go from Google to q95 [18:07:11] the search on wb_terms table gets: | 96524988 | 961680 | item | en | label | Google | google | [18:07:18] term_row_id | term_entity_id | term_entity_type | term_language | term_type | term_text | term_search_key | [18:07:31] but from here I don't see exactly how you get to q95 [18:07:43] is it wb_entity_per_page table? [18:07:50] pubsez: ok, if you are coming from the client (wikipedia)? [18:07:55] yes [18:07:57] then there is the wb_items_per_site table [18:08:07] you know the site is enwiki (english wikipedia) [18:08:15] the page title = Google [18:08:36] aude: you mean "with fallback" [18:08:44] but it's not exactly "with fallback" [18:08:52] actually it's more like "with conversion" [18:09:02] select * from wb_items_per_site where ips_site_page = 'Google' and ips_site_id = 'enwiki'; [18:09:17] liangent: ok, then maybe LanguageWithConversion ? [18:09:22] ok, got that [18:09:22] * aude not good at naming [18:09:26] 258140710 | 95 | enwiki | Google [18:09:31] yep [18:09:34] so 95 is Q95 [18:09:38] in that table, everything is an item [18:09:38] thanks aude [18:09:42] yes [18:10:56] aude: or LanguageMaybeWithConversion .. :p actually this class is an abstraction of two types of languages, one with conversion and another without [18:11:12] liangent: ok [18:19:05] what's the use of wb_entity_per_page?
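For anyone wanting the same Google → Q95 → label lookup described above without querying the wb_items_per_site and wb_terms tables directly, the wbgetentities API module exposes that chain. A rough JavaScript sketch, assuming the repo's JSONP support is reachable from wherever the script runs:

    // Resolve a client page (site + title) to its item id and English label.
    $.getJSON( 'https://www.wikidata.org/w/api.php?callback=?', {
        action: 'wbgetentities',
        sites: 'enwiki',    // the client wiki the sitelink belongs to
        titles: 'Google',   // the page title on that wiki
        props: 'labels',
        languages: 'en',
        format: 'json'
    }, function ( data ) {
        $.each( data.entities, function ( id, entity ) {
            // id is the item id, e.g. "Q95"
            console.log( id, entity.labels && entity.labels.en.value );
        } );
    } );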
[18:19:24] that's used in database lookup of entities [18:19:24] it appears that the extension is using this table to get the correct revision text [18:19:39] but in the wikidatawiki dumps, this table is not provided [18:19:46] i know :( [18:19:49] do we need to populate it ourselves [18:19:58] you can, although it takes a long time [18:20:11] * aude should poke the right people to get it added [18:20:32] is it excluded from dumps, because it is too large? [18:20:43] although I can't think of anything larger than enwiki [18:21:02] it's also used in SpecialItemsWithoutSitelinks and the entities without labels pages [18:21:05] special pages [18:21:09] no, it's not too large [18:21:17] the terms table is quite a bit larger [18:21:20] New review: Liangent; "Name candidates per discussion with Aude:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67453 [18:21:30] how can we get this table added to the dumps [18:21:37] or alternatively how can I populate this table [18:21:39] let's see if we have a bug report for it [18:21:52] we do have a script in the repo/maintenance folder [18:21:55] rebuildEntityPerPage [18:22:10] i think that's what it is called [18:22:14] we just used it for wikidata to fill in missing entries [18:22:27] do we run it from enwiki, or run it from wikidatawiki? [18:22:27] it took over 24 hours to run [18:22:31] from wikidata [18:23:09] wow [18:23:14] it does take a lot of time :) [18:25:53] ok, i think i can make a patch for it :) [18:25:55] https://gerrit.wikimedia.org/r/#/c/66570/1/xmldumps-backup/worker.py [18:26:02] cool, thanks aude [18:26:13] i don't see a reason not to include it [18:26:16] just an oversight [18:31:51] pubsez: ok, i might have to wait until monday and get help with that [18:32:04] the git repo that i get seems odd [18:32:59] thanks aude [18:36:57] ok, i figured out the problem :) [18:43:39] pubsez: https://gerrit.wikimedia.org/r/#/c/71087/ [18:43:54] assuming no issues with it, the table should be available soon [18:44:05] cool, thanks a lot [18:45:42] sure [18:46:42] if I understand it correctly, wikidata makes dbpedia redundant to a certain point, since info boxes will be made out of structured data [18:47:03] so there won't be any point in trying to extract data out of infoboxes [18:47:09] Hmm, how do I put the birthdate https://en.wikipedia.org/wiki/Frans_Hals in Wikidata? It's around 1580.... [18:47:44] pubsez: we won't be able to package it nicely in all the ways folks need [18:48:01] there still is need for others to help with things like that [18:48:47] yes, for sure dbpedia won't be obsolete [18:48:48] multichill: i am not sure but think it has either a precision parameter to use or before/after [18:48:55] where you can give a range [18:49:04] Month -> Year -> decade [18:49:09] yeah [18:49:10] Now ca. year [18:49:18] I'm missing a step [18:49:26] ca. i am not sure [18:49:55] i'd need to ask if that is supported yet or what [18:51:24] My assumption is that it's just not supported in the UI [18:51:42] aude: Do you know where the different steps in precision are documented? [18:52:17] multichill: i don't [18:52:50] unfortunately we are not great at maintaining documentation and it's an area where we need help [18:53:09] so i don't know if this is documented or not, and if so where [18:53:35] in the Special:Listdatatypes page it shows the parameters that are available for each data type [18:55:49] Ok.
Thanks for the help Aude, i'll figure it out [18:55:56] ok [18:56:08] if you have anything to add to documentation, that would be great [18:56:15] or any suggestions [18:56:31] we are kind of short staffed lately due to people on vacation [19:01:50] aude: for the weekly status update, should I edit the wiki page straight away? [19:11:58] hi pragunbhutani [19:12:00] yes you can [19:12:14] lydia has some stuff in a google doc that she will put in that page soon [19:12:21] but we can edit directly [19:12:30] also, I need a little help with something [19:12:33] ok [19:17:00] New patchset: Daniel Kinzler; "Change client defaults if repo is on same wiki." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/70163 [19:24:13] aude: sorry, the net was acting up a little. [19:24:18] so as I was saying, I need a little help [19:24:21] pragunbhutani: no problem [19:25:17] with college coming to an end, I've sort of lost all form of schedule, with respect to my day [19:25:23] and that's making me very unproductive [19:26:07] ok [19:26:12] so can we maybe decide to have a fixed meeting say, twice a week or something? [19:26:18] yes, that would be great [19:26:34] i think tuesday (with lydia and jon, if possible) [19:26:42] and thursday, sometime? [19:26:45] * aude flexible [19:27:13] i think it's important to have a goal for each week, and the weekly status update helps motivate (it does for me) [19:27:44] That's sort of exactly what I need, at least for a little while [19:27:54] accountability [19:27:54] i think it's good for all of us [19:28:09] so i know if there are any issues we can help with [19:28:37] meanwhile i can poke at the mobile extension some more to understand it [19:28:38] pragunbhutani: we also have daily standup meetings at 11am our time that you could call into if you want [19:28:41] better [19:28:46] via google hangout [19:28:47] +1 [19:28:57] danielk joins us virtually part of the time [19:29:00] what time zone are you on [19:29:00] ? [19:29:07] cet [19:29:18] okay let me see what that is in my time [19:29:19] it's 3 1/2 hours behind you, i believe [19:29:44] that's correct [19:29:45] the daily meetings are very motivating [19:29:48] so 2 30 pm [19:29:52] * aude always wants a lot to say that i did [19:29:58] i think so [19:30:04] yeah that suits me as well! [19:30:10] excellent, and we have these on weekends as well? 
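Coming back to the "around 1580" question above (Frans Hals): the time datatype carries an explicit precision, so approximate dates are expressed by picking a coarser precision and, if needed, the before/after range fields rather than a separate "circa" flag. A rough sketch of such a value as a JavaScript object — the field names follow the Wikibase time format as far as I understand it, so double-check against Special:ListDatatypes:

    // Approximate birth date "around 1580" as a Wikibase time value.
    // precision: 11 = day, 10 = month, 9 = year, 8 = decade, 7 = century.
    var birthDate = {
        time: '+00000001580-00-00T00:00:00Z', // only the year is significant here
        timezone: 0,
        before: 5,      // assumed: uncertainty of up to 5 years before...
        after: 5,       // ...and 5 years after the stated year
        precision: 9,   // year precision
        calendarmodel: 'http://www.wikidata.org/entity/Q1985727' // proleptic Gregorian
    };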
[19:30:13] we can also make up specific bug tickets for specific to do items [19:30:18] not on weekends [19:30:32] we use bugzilla tickets (bugzilla.wikimedia.org) to track everything [19:30:35] okay so I'm going to make it a point to attend that meeting as well [19:30:40] ok [19:30:42] pragunbhutani: cool :) [19:30:55] if we can figure out specific bug items and break down the tasks, that would help [19:31:10] okay, so maybe I could make a bug ticket for the mobile skin that I'm working on [19:31:21] what I need to do is this: [19:31:23] yes [19:31:30] pragunbhutani: it is a good idea to break it down into smaller parts [19:31:36] and have bugs for those things [19:31:37] and if there are any smaller pieces to that [19:31:42] and then one bug to track them all [19:31:48] yup [19:31:51] if wikidata is enabled on a MW installation and it's being accessed on a mobile device, I want it to use a particular skin [19:32:17] sure [19:32:19] pragunbhutani: i will later send an email about the daily standup meeting [19:32:35] also to liangent and nilesh to see if they want to join [19:34:04] thanks Lydia_WMDE [19:34:10] np [19:34:47] pragunbhutani: if you have anything to add to the weekly summary please add it within the next 30 minutes :) [19:40:48] I got disconnected again [19:40:56] I don't know why this is happening!! [19:41:00] it happens [19:41:06] Sorry, did you get my message about the first task? [19:41:16] yes [19:41:21] detecting when wikibase is enabled and mobile is being used [19:41:40] ok [19:41:55] so how would you suggest I begin? [19:42:13] probably first figure out the configurations for mobile [19:42:21] also, we need to get you a labs instance [19:43:01] oh yes, I'll go back to the mail you sent me [19:43:26] if you have applied for shell, then we need to add you to our project [19:43:44] no, I'll do that right away [19:44:44] ok, "All new labs accounts automatically file a request for shell access" [19:44:58] so you should have automatically been applied if you made an account on https://wikitech.wikimedia.org [19:45:38] aude: Managed to get most in, see http://tools.wmflabs.org/reasonator/?q=167654 :-) [19:46:24] multichill: nice :) [19:46:50] https://wikitech.wikimedia.org/wiki/Shell_Request/Pragunbhutani [19:47:00] the status is 'false' as of now [19:47:03] Went to http://en.wikipedia.org/wiki/Museum_of_Fine_Arts_%28Budapest%29 this weekend [19:47:56] pragunbhutani: you are added [19:48:23] aude: awesome [19:48:36] I haven't added a ssh key yet so I'll do that [19:49:02] it let me add you, so i assume you have access [19:49:43] next thing for labs is to try to login to one of the instances (wikidata-puppet-testrepo is one of mine) [19:49:52] then make an instance for yourself [19:50:20] https://wikitech.wikimedia.org/wiki/Help:Contents has all the help info [19:50:41] https://wikitech.wikimedia.org/wiki/Nova_Resource:Wikidata-dev additional info for the wikidata instances [19:51:06] (these mostly to be left alone and some are automated, but that explains what they are and what to do) [19:53:20] my internet again... [19:53:34] would I still have access if I have no ssh keys added?
[19:53:57] ssh keys are required [19:54:00] I did add a key on gerrit, but none on wikitech [19:54:25] need to do both, and it's not instant for the keys to be synched with the instances [19:54:55] okay so I'll add a key on the openstack tab as well [19:55:19] * aude does the steps in https://wikitech.wikimedia.org/wiki/Help:Access#Using_agent_forwarding [19:55:45] i have a shell script that has those steps, so it's one command [19:56:27] https://gist.github.com/filbertkm/d67af98fbac734b24466 [19:56:44] from bastion, i then do ssh wikidata-test-puppet or whatever [19:57:04] we might also want to get you a public ip for your demo system [19:57:41] until then or instead, i know there is some proxy magic to make stuff public [19:58:05] https://wikitech.wikimedia.org/wiki/Help:Proxy [20:00:22] hmm I'll read the documentation once [20:01:13] ok [20:01:27] it's quite a bit to read and learn [20:01:41] yes it is! [20:02:05] otherwise, once you have an instance, it's similar to amazon or whatnot [20:02:24] I didn't even understand half the things you just said, so I think I definitely need to read it once :p [20:02:29] ok [20:02:37] * aude around to answer questions and help [20:02:45] :) [20:20:23] what is a good algorithm to import p373? just to check if category exists at commons? [20:23:22] legoktm, ^ [20:23:49] aude: something didn't quite go as planned [20:24:22] huh? [20:25:03] I got as far as adding bastion to the list of known hosts [20:26:17] ok [20:26:42] I tried to ssh into an instance then and it says that the connection was refused [20:27:20] i think we need to wait for your ssh key to propagate [20:27:24] that's my guess [20:27:32] so try again in 30 minutes [20:28:10] (also, if i am ever not around, you can try asking in #wikimedia-labs) [20:28:19] ssh wikidata-test-puppet.pmtpa.wmflabs [20:28:25] is that the correct command to enter? [20:28:29] yes [20:28:46] okay I'll try again in a bit [20:29:02] ok, it's wikidata-puppet-testrepo [20:29:14] ah okay [20:29:15] try that [20:31:43] connection refused [20:32:40] hmmmm [20:32:53] you can do ssh -v [20:32:59] add the -v (for verbose) [20:34:13] okay [20:34:59] could not resolve hostname it says [20:35:09] nodename nor servname provided [20:35:18] ok, maybe it's not the right host name [20:36:04] ssh wikidata-puppet-testrepo.pmtpa.wmflabs worked for me [20:36:30] then I must not be doing something right [20:37:37] I did an eval [20:37:45] `ssh-agent` [20:37:52] followed by adding my key [20:38:20] followed by ssh -A pragunbhutani@bastion.wmflabs.org [20:38:30] that should work [20:40:12] when I do ssh -A pragunbhutani@bastion.wmflabs.org [20:40:24] I get 'Connection closed by 208.80.153.207' [20:40:31] ok, you don't get into bastion? [20:40:42] so it seems [20:40:43] that might mean we need to wait 30 min [20:40:50] key not propagated yet [20:43:00] Base-w: you should speak with multichill about that, he's been doing a lot of importing of commonscat [20:43:14] * multichill looks up [20:43:17] aude: noobish question, but I'd like to confirm [20:43:31] when I'm adding the key, do I need to add the ssh-rsa at the beginning? [20:43:33] or just the part after it? [20:43:52] Base-w: Ah, right, I wrote a bot for that [20:44:16] It takes the local version of {{commonscat}} and does all sorts of checking [20:44:17] [2] https://www.wikidata.org/wiki/Template:commonscat [20:44:57] multichill: what checking does it do?
[20:45:09] Does the category exist, is it a redirect, etc [20:45:51] pragunbhutani: yes [20:45:52] It's based on a Commons category bot I wrote some years ago to spread the templates in multiple languages [20:46:14] Base-w: See https://commons.wikimedia.org/wiki/User:Multichill/Commonscat_stats [20:46:31] So I already had quite stable code. Just modified it to put it on Wikidata [20:48:06] ssh-rsa pragun06@gmail.com [20:48:10] if that's not wrong [20:48:25] then I guess I'll wait for the key to propagate [20:48:50] seems right, although i have something like katie@mymachine [20:49:09] multichill: well so then it seems better to ask you to import such stuff than do it myself. can you manage https://uk.wikipedia.org/wiki/Категорія:Посилання_на_категорію_Вікісховища_відсутнє_на_Вікіданих , please? [20:49:11] katie being my login name on my machine and mymachine being the host name for my computer [20:49:15] something like that [20:49:19] aude because I'm getting a Permission denied (publickey) now [20:49:29] hmmm [20:49:45] but perhaps ignore cats prefixed with Зображення: [20:49:56] it's for images and very strange [20:50:49] https://wikitech.wikimedia.org/wiki/Help:Access#Permission_denied_.28publickey.29 [20:51:02] does any of the info on that page help? [20:51:43] i would add -i .ssh/wikidev to my ssh [20:51:50] or include it in my .ssh/config file [20:52:03] make sure it's sending the correct key [20:52:20] I did add the key [20:52:23] ok [20:52:24] wait, I'll check out that link [20:52:29] Base-w: Fired it up, haven't run it in a while so not sure if it still works ;-) [20:53:09] multichill: :) thanks :) [20:54:20] pragunbhutani: otherwise i see that coren is around in #wikimedia-labs [20:54:25] he might have a better idea [20:54:56] it helps to grab a staff person before they disappear for the weekend [20:55:01] maybe I should wait for a few more minutes before asking him? [20:55:04] ok [20:55:18] that could rule out the 'waiting for key to be propagated' scenario [20:55:22] ok [20:58:19] woah wikivoyage [20:58:54] im not sure we're ready for it... [20:59:30] Base-w: https://www.wikidata.org/wiki/Special:Contributions/BotMultichill <- seems to work [21:00:39] only 2 wikivoyages have automatic approval... [21:04:11] multichill: cool, thanks :) [21:17:06] aude: no luck still [21:17:09] I'll go ask coren [21:19:36] ok [21:19:57] * aude sure i had trouble when i started but forget exactly what i did to fix [21:27:46] Base-w: You might want to look into https://en.wikipedia.org/wiki/Category:Commons_category_Wikidata_tracking_categories [21:28:28] I introduced that at the Dutch and English Wikipedia to keep track of progress of import [21:28:34] multichill: well this one i gave you in ukwiki produced by commonscat [21:28:57] but i didn't create there positive cats (when links the same) [21:29:09] oh, didn't you update the interwiki links? ;-) [21:29:21] https://en.wikipedia.org/wiki/Category:Commons_category_without_a_link_on_Wikidata [21:29:34] just for when links different or no link [21:29:40] multichill: ah, yeah :D [21:30:46] aude: what was the instance name again? [21:31:58] wikidata-puppet-testrepo [21:33:11] http://wikidata-puppet-testrepo.instance-proxy.wmflabs.org/ [21:33:18] you can see my "test" wiki [21:33:38] I'm in! [21:33:54] yay [21:34:02] now how do I create one?
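As a rough illustration of the kind of checks multichill describes for the commons category (P373) import — does the category exist on Commons, and is it a redirect — something like the following against the standard query API would do. This is not the actual bot code, and the category name is just an example:

    // Check whether a Commons category exists and whether it redirects elsewhere.
    function checkCommonsCategory( name, callback ) {
        $.getJSON( 'https://commons.wikimedia.org/w/api.php?callback=?', {
            action: 'query',
            titles: 'Category:' + name,
            prop: 'info',
            redirects: 1,   // follow redirects so the real target is reported
            format: 'json'
        }, function ( data ) {
            var pages = data.query.pages,
                redirects = data.query.redirects || [],
                page = pages[ Object.keys( pages )[ 0 ] ];

            callback( {
                exists: !( 'missing' in page ),
                redirectedTo: redirects.length ? redirects[ 0 ].to : null
            } );
        } );
    }

    // checkCommonsCategory( 'Paintings by Frans Hals', console.log );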
[21:34:14] aude: Nice test wiki :P [21:34:15] on wikitech wiki [21:34:20] heh [21:34:31] it's just for testing our puppet scripts and me fiddling around [21:34:58] pragunbhutani: https://wikitech.wikimedia.org/wiki/Special:NovaInstance [21:35:12] select wikidata-dev in the project filter [21:35:17] then add instance [21:35:55] name it wikidata-mobile or something with "wikidata-" prefix [21:36:06] i'd choose small or medium [21:36:49] wikidata-mobile-tester [21:37:01] I think small should be sufficient, shouldn't it? [21:37:03] sure [21:37:05] agree [21:37:29] it's easy enough to make new ones [21:38:02] yeah, made mine now [21:38:06] cool [21:38:21] then you'll have to wait for puppet to sync everything [21:38:39] that might be the waiting part that i remember [21:39:16] how will I know when it's done? [21:39:28] just try logging in [21:39:37] it says "active" for status [21:39:46] so maybe it's done already [21:40:01] * aude amazed if it's that quick [21:40:10] umm, how do I log out of yours? [21:40:14] exit [21:40:25] back to bastion [21:41:31] it works! [21:42:15] now for the proxy bit? [21:43:17] that was after setting up apache (either manually or if you are interested, we can do with puppet at some point) [21:44:10] what's 'puppet'? [21:44:51] it's a tool for server configuration [21:45:10] http://en.wikipedia.org/wiki/Puppet_(software) -- general info and [21:45:17] https://wikitech.wikimedia.org/wiki/Puppet [21:45:32] so that way, wikimedia does not have to manually configure all the servers the same way [21:45:43] the configurations get applied automatically [21:46:12] https://blog.wikimedia.org/2011/09/19/ever-wondered-how-the-wikimedia-servers-are-configured/ [21:46:44] * pragunbhutani goes to read [21:46:50] it's a bit of a learning curve [21:46:57] be warned, but once learned then it's nice [21:47:14] so i can delete my test instance, recreate and in a few steps, have it setup the same way again [21:47:35] that sounds clever! [21:47:38] yeah [21:47:56] we can look at it later [21:48:06] manual is fine for now [21:49:27] can you tell me the list of things to do from here-on to get my instance completely set up and ready to work? [21:49:40] 1) install apache2 [21:49:45] 2) install mysql5 [21:49:51] 3) install php5 [21:50:07] then setup mediawiki [21:50:19] not sure how experienced you are with ubuntu? [21:50:22] hi, multichill [21:50:26] If you're there [21:54:18] of course [22:15:17] aude: back online [22:15:34] what else needs to be done to get the instance ready? [22:24:36] pragunbhutani: how much experience do you have with ubuntu? (vs. installing stuff on mac or whatever) [22:25:10] I used to use ubuntu a while back [22:25:16] not for very long, but a couple of months [22:25:18] it needs just the basic stuff to get any php / apache / mysql working [22:25:19] I should be okay [22:25:21] e.g. https://www.digitalocean.com/community/articles/how-to-install-linux-apache-mysql-php-lamp-stack-on-ubuntu [22:25:33] anyone up for adding the newest properties to http://www.wikidata.org/wiki/Wikidata:Status_updates/Next ? [22:25:41] i don't know if we need all that stuff in the tutorial [22:25:46] I think I can install a LAMP stack [22:25:57] ok [22:26:00] Lydia_WMDE: maybe we should have a bot do this?
[22:26:05] then install mediawiki like you did before [22:26:19] okay awesome [22:26:20] legoktm: if someone would write one i would totally owe that person a beer or something [22:26:26] I'll do that and get back :) [22:26:29] ok [22:26:45] Lydia_WMDE: give me 10 minutes :D [22:26:53] :) [22:26:55] legoktm: haha awesome [22:27:05] legoktm: wikimania - beer - you! [22:27:10] or whatever else you prefer [22:28:34] * Lydia_WMDE wonders if there is an easy way to find out if there was a new task force created within the last week [22:29:22] you can do it with a sql query i think. check if any new pages were added to the category since the last timestamp [22:29:48] right [22:29:54] easy! though ;-) [22:30:09] DPL, too. [22:30:09] i don't think this falls under easy :P [22:30:17] Amgine_: ? [22:30:29] !e DynamicPageList [22:30:35] Amgine_: lolol [22:30:42] wikidata doesnt use DPL [22:30:53] Not my fault. [22:31:05] * Lydia_WMDE sobs a bit in the corner [22:31:07] :P [22:31:41] Lydia_WMDE: Could always be worse... [22:32:03] JohnLewis: impossible! [22:32:16] Lydia_WMDE: How! [22:33:46] hah! [22:33:54] i think i have a reasonably ok way [22:33:56] \o/ [22:35:18] * legoktm stabs his text editor [22:35:19] ugh [22:35:31] Lydia_WMDE: what format do you want them in? [22:35:32] poor text editor [22:35:43] can i do {{p|##}}, {{p|##}}? [22:35:44] [3] https://www.wikidata.org/wiki/Template:p [22:36:14] legoktm: hmmm that'd be ok yes - however that doesn't work on the other wikis i am sending this to [22:36:24] so i always copy the plain text to the version i end up with [22:36:50] hm. [22:36:56] which is ok i think [22:37:12] ok [22:38:26] {{p|657}}, {{p|656}}, {{p|655}}, <-- thats id [22:38:26] [4] https://www.wikidata.org/wiki/Template:p [22:38:28] it* [22:39:17] only 3 new properties in the last 3 days? [22:39:22] ok [22:39:34] er, last 7 days [22:39:42] eh yeah [22:39:44] 7 [22:41:50] now lemme adjust it to post to the status update [22:41:56] at what time should it run every week? [22:44:57] let's say 7h ago from now [22:45:59] ok so thats... [22:46:16] every friday at 15:00 UTC [22:47:19] sounds good [22:55:34] Lydia_WMDE: ok, so it will just look for "\* Newest properties: .*?\n" and replace anything that might already be there [22:55:41] https://www.wikidata.org/w/index.php?title=Wikidata%3AStatus_updates%2FNext&diff=53125231&oldid=53125165 [22:55:49] \o/ [22:57:10] if anyone has anything to add to the next status update please add it asap [22:57:18] * aude wonder if my proposed property got created? [22:57:24] i just have the dev part to finish before i'm sending it out [22:58:15] it's "in progress"
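The "Newest properties" bot edit above comes down to a find-and-replace on the status update wikitext, using the very pattern legoktm quotes. A sketch of that replacement step in JavaScript — illustrative only, not the actual bot, with the property ids hard-coded:

    // Replace (or append) the "Newest properties" bullet in the status update.
    function updateNewestProperties( wikitext, propertyIds ) {
        var line = '* Newest properties: ' + propertyIds.map( function ( id ) {
            return '{{p|' + id + '}}';
        } ).join( ', ' ) + '\n';

        if ( /\* Newest properties: .*?\n/.test( wikitext ) ) {
            // overwrite whatever is already there
            return wikitext.replace( /\* Newest properties: .*?\n/, line );
        }
        return wikitext + line;
    }

    // updateNewestProperties( text, [ 657, 656, 655 ] ) inserts a line like
    // "* Newest properties: {{p|657}}, {{p|656}}, {{p|655}}"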