[03:45:18] Did anyone else's custom Wikidata signature get wiped out? [03:45:30] I don't use one [03:45:30] I had a custom one (I can verify it from my contribs), and now it's gone. [03:45:38] do tell [03:45:46] Some people are saying their preferences got wiped [03:45:53] Like gadgets and stuff [03:46:00] But it didn't happen to everyone [03:46:02] That would make sense, since it's a pref. [03:46:05] Has it been filed? [03:46:32] don't know, there was a brief thread on project chat about it [03:47:23] My custom signature is still there [03:48:04] legoktm, they removed some of my gadgets too (certain Move was removed for example), and I believe added others (90% sure). [03:48:36] well you should have at least 1 new gadget since that was enabled by default a few hours ago [03:49:02] legoktm, that's fine. [03:49:08] The problem is they unchecked ones I had checked. [03:49:21] Someone must have messed up the batch job. [03:49:37] I will file in case they have a broken script. [03:56:20] Filed as https://bugzilla.wikimedia.org/show_bug.cgi?id=47109 [03:58:10] thanks [03:59:40] o.O [03:59:46] i never connected your ircname with your real name [04:00:37] Same to you. [07:58:12] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58498 [08:00:28] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58499 [08:00:38] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58503 [08:05:36] Change merged: Tobias Gritschacher; [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/58511 [08:07:12] New review: Tobias Gritschacher; "needs manual rebasing" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58510 [08:13:51] New patchset: Aude; "Add more verbosity to dispatch changes" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58674 [08:22:02] Hi. My IE-browser shows "edit"-Buttons only for a second, when a page is loaded. Is this a general problem? [08:35:21] Panic! [08:39:34] JeroenDeDauw1: nooo [08:41:57] DanielK_WMDE: ping [08:42:33] Danwe_WMDE: jakob is my colleague [08:42:48] hi Danwe_WMDE :) [08:45:13] Abraham_WMDE: hi [08:45:28] hi jakob [09:13:25] I think I am going to restart Jenkins since it is really quiet right now. Takes roughly half an hour. [09:31:47] !nyan [09:31:47] ~=[,,_,,]:3 [09:35:58] [[File:Wikidata-Cat.gif]] [09:35:59] [1] https://www.wikidata.org/wiki/File:Wikidata%2DCat%2Egif [09:36:02] :D [09:59:46] New review: Daniel Kinzler; "(3 comments)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58674 [10:03:38] thanks for the graphs, aude! [10:03:45] DanielK_WMDE: sure [10:03:53] are you using rrd? [10:03:56] * aude has to reset some of the edit rate ones [10:04:02] edit rates but not the wikidata one [10:04:05] yes, rrd [10:04:09] which ganglia uses [10:04:11] neat [10:04:19] it could be put into ganglia [10:04:29] once i figure out how exactly [10:04:53] once we have an API :) [10:05:06] and only if i could stop screen scraping :) [10:05:07] evil [10:05:14] the DispatchStats stuff is blocked on a design discussion, see the list [10:05:19] i see [10:05:19] i guess we can resolve it today, though [10:05:25] would be important [10:05:29] yes [10:05:56] i'm setting up redis configuration now [10:05:57] which is why one of the questions to discuss is "should we block development on (this kind of) design discussions".
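A minimal sketch of the kind of Redis job queue configuration aude mentions setting up just above, assuming MediaWiki's JobQueueRedis class; the server address and option values below are placeholders, not the actual settings being worked on.

```php
// Route jobs through Redis instead of the default database-backed queue.
$wgJobTypeConf['default'] = array(
	'class'       => 'JobQueueRedis',
	'redisServer' => '127.0.0.1:6379',            // placeholder address
	'redisConfig' => array( 'password' => null ), // connection options
	'claimTTL'    => 3600, // seconds before an abandoned job is retried
);
```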
[10:06:19] oh, awesome. [10:06:31] maybe we can get an instance for labs too, soonish... [10:06:38] sure [10:06:56] Change merged: Henning Snater; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57449 [10:06:59] * aude really does *not* know the status of the labs instances [10:07:09] Change merged: Henning Snater; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57448 [10:07:10] next week can look at it perhaps [10:10:41] jenkins is back up [10:22:25] New patchset: Daniel Werner; "EntityIdInput expert keeps track of raw value when set to deleted entity" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58675 [10:26:14] JeroenDeDauw: regarding the email you just sent: i'll put that into my queue. need to take a look at ask, diff and datavalues now. will come back to the question later, if time permitts. just fyi. [10:28:55] tobyS: sure - I guess if what I am doing is bad, you might note it while looking at DataValues anyway :) [10:29:16] :) [10:30:09] tobyS: oh and, DIff is not only our best tested piece of code, it also has the most on wiki docs: https://www.mediawiki.org/wiki/Extension:Diff [10:32:25] New patchset: Jeroen De Dauw; "Move Query out of lib and QueryContent out of repo, both to Wikibase Query" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58510 [10:36:37] New patchset: Jeroen De Dauw; "Update INSTALL" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/58676 [10:38:26] New patchset: Jeroen De Dauw; "Fixed since tag" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/58677 [10:42:03] Change merged: Henning Snater; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58536 [10:42:54] New patchset: Jeroen De Dauw; "Added QueryStoreUpdater tests" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58524 [10:42:57] New patchset: Jeroen De Dauw; "Some implementation for in the SQLStore updater" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58525 [10:47:42] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58510 [10:51:47] New patchset: Jeroen De Dauw; "Added COPYING INSTALL README and RELEASE-NOTES to WikibaseQuery" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58679 [10:52:00] New patchset: Jeroen De Dauw; "Added QueryStoreUpdater tests" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58524 [10:52:04] New patchset: Jeroen De Dauw; "Some implementation for in the SQLStore updater" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58525 [11:03:05] Change merged: Henning Snater; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58675 [11:04:53] DanielK_WMDE: avail.? [11:05:34] Denny_WMDE: you asked, if you could get a Hardware Wikidatameter for WMDE-Office two days ago [11:06:17] still interested? [11:06:28] yes [11:08:09] I think that I dont have time until may for this [11:08:23] we are patient :) [11:11:44] New patchset: Jeroen De Dauw; "Generalized access to Claims by property ID." 
[mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56272 [11:12:23] New review: Jeroen De Dauw; "You are still not using assertCount ;p" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56272 [11:13:23] New review: Daniel Kinzler; "> You are still not using assertCount ;p" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56272 [11:13:28] 2 years ago I made this for Wikipedia edits: http://l3q.de/pediameter/ on this page you can find a simmulation of the Wikipedia-Meter that is visualizing edits/5sec for the 10 most active wikipedias (en,de,es,fr,ru,it,pt,nl,ja,pl) [11:15:06] the simulation has obviously disadvantages against a hardware meter [11:18:54] New review: Jeroen De Dauw; "(1 comment)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56272 [11:19:34] Abraham_WMDE: in a second [11:19:58] New patchset: Jeroen De Dauw; "Rename PropertySQLLookup to PropertyEntityLookup." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56319 [11:20:44] Tobi_WMDE: could you review my patch for adding edited stuff to the user's watchlist? Would be neat to get this deployed soon. https://gerrit.wikimedia.org/r/#/c/54704/ [11:21:22] Abraham_WMDE: what's up? [11:21:52] ...and... do we have bug 41573 on the board? [11:22:05] lbenedix: there's also this one: http://wikipulse.herokuapp.com/ but i also think the hardware thingy is simply cool [11:23:28] one of my dream is to build a wall of hardwaremeters [11:25:03] DanielK_WMDE: see pm [11:25:31] is there something like a control room deep under the WMFs office where such a Meter wall could be installed? [11:26:38] lbenedix: I'd prefer to have something in the Berlin office. And just very visible in the Wikidata part :) [11:26:42] DanielK_WMDE: I can put 41573 to the board, it is up for review, right? [11:26:55] lbenedix: I am not sure there are rooms in the cellar of the WMF building in SF [11:27:01] Abraham_WMDE: yes [11:27:07] would be good to get it in [11:27:12] i asked tobi to review [11:27:14] http://commons.wikimedia.org/wiki/File:Control_room_pt_tupper.jpg [11:28:13] New review: Jeroen De Dauw; "Errm. No strong objection to merging this, though I think you did not fix the actual issue. Sure the..." [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/56319 [11:28:22] btw. the measurements of the meter is ~10cm x 10cm [11:28:32] good [11:28:40] New review: Daniel Kinzler; "(1 comment)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56272 [11:28:42] New patchset: Jeroen De Dauw; "Implement PropertyLookup on top of TermIndex." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56330 [11:28:45] New patchset: Jeroen De Dauw; "Use the term based property lookup on the client." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56717 [11:29:09] DanielK_WMDE: dependency chain all the way across the sky? :p [11:29:21] 4 commits [11:29:32] didn't want to mix too many different issues [11:30:55] JeroenDeDauw: i'll fix the assertCount thing and the naming foo [11:31:03] i'll need to rebase again, but it should be quick [11:33:09] New review: Daniel Kinzler; "It's in the store because there was no WikibaseClient class when I wrote this. It would fit better t..." 
[mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56319 [11:33:41] New review: Daniel Kinzler; "naming issues" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/56272 [11:41:32] JeroenDeDauw: what is the GeoCoordinateParser (DataValues) actually good for? [11:41:49] is that user data that is parsed? [11:42:06] tobyS: this particular one is not used yet by Wikibase or any of our code [11:42:14] It is however used by some other extensions [11:42:29] JeroenDeDauw: doesn't answer my question ;) [11:42:48] tobyS: we need the parsers since we expose a web api via which people can provide "formatted values", ie a geo coord in DMS format or whatnot [11:43:25] JeroenDeDauw: ok, understood [11:43:26] tobyS: does that answer it? [11:43:32] yup, thx :) [11:44:00] tobyS: perhaps we ought to discuss the DataType interface at some point, which is the mechanism via which Wikibase will make use of these ValueParsers and ValueFormatters [11:44:06] ATM this interface is a mess though [11:44:10] And it is not really used yet [11:44:19] (Which is why it is still poorly defined) [11:44:26] JeroenDeDauw: which files are that? [11:45:12] JeroenDeDauw: uh, i'm getting Notice: Use of undefined constant CONTENT_MODEL_WIKIBASE_QUERY - assumed 'CONTENT_MODEL_WIKIBASE_QUERY' in /DATA/var/www/daniel/wikidata/conf/repo/LocalSettings.php on line 77 [11:45:20] did the constants get moved around? [11:45:45] tobyS: DataValues/DataTypes/includes/DataType.php - also see the readme of this component [11:45:53] hm... maybe related to experiemntal mode (or lack thereof) [11:45:57] Which apparently is inconsistent with the code, though the concept is the same [11:46:08] Let me check if we have some documentation on the DataType concept [11:49:29] tobyS: does not look like we have docs on the purpose of this interface or the intended usage in Wikibase, I can explain if you want though [11:49:51] JeroenDeDauw: what is the reason that the parsers return a result object instead of throwing an exception if the value could not be parsed? [11:50:13] JeroenDeDauw: yeah, please hit me a small mail on that interface, thx upfront [11:50:59] New patchset: Aude; "Add more verbosity to dispatch changes" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58674 [11:51:23] tobyS: oh, I knew we had _some_ stuff on it: https://meta.wikimedia.org/wiki/Wikidata/Notes/SMW_and_Wikidata [11:51:33] See the DataTypes section there [11:51:37] thx [11:52:38] JeroenDeDauw: ^ result objects? [11:53:19] tobyS: re result object vs exception: I see no reason not to change it to use exceptions. In fact after reading the chapter on error handling in Clean Code I now figure I indeed might want to revisit what is being done there [11:53:57] tobyS: I'm guessing you think the code probably should use exceptions? 
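For context on the result-object question above: a rough sketch of the two error-handling styles being compared. The names here ($parser, ParseException, isValid(), getValue(), getError()) are simplified placeholders, not the exact ValueParsers API.

```php
// Style 1: result object - roughly what the parsers did at the time.
// Callers must remember to check isValid() before using the value.
$result = $parser->parse( '42 N, 10 E' );
if ( $result->isValid() ) {
	$coordinate = $result->getValue();
} else {
	$error = $result->getError(); // report the parse failure
}

// Style 2: exceptions - what tobyS suggests would make things easier.
// Failures cannot be silently ignored and the happy path stays flat.
try {
	$coordinate = $parser->parse( '42 N, 10 E' );
} catch ( ParseException $ex ) {
	// report the invalid input
}
```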
[11:54:24] DanielK_WMDE: constants got moved yes [11:54:32] JeroenDeDauw: would make it easier, yes :) [11:54:39] DanielK_WMDE: this code is no longer loaded, even in experimental mode [11:54:46] thx for the info, just wanted to make sure i don't miss a point [11:54:51] It is now in its own extension, which you should load if you want to use the query stuff [11:55:28] I will however load it automatically from the experimental config in a minute to facilitate dev work for now - and kill it later before the initial version of the query extension [11:59:54] New patchset: Jeroen De Dauw; "Remove unused import" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58682 [12:00:30] New patchset: Jeroen De Dauw; "Remove unused imports" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58683 [12:02:50] New patchset: Daniel Kinzler; "Generalized access to Claims by property ID." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56272 [12:03:21] New review: Daniel Werner; "(4 comments)" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/58223 [12:03:26] New review: Daniel Kinzler; "fixed naming issues" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56272 [12:03:59] New patchset: Daniel Kinzler; "Rename PropertySQLLookup to PropertyEntityLookup." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56319 [12:04:04] New patchset: Jeroen De Dauw; "Remove unused imports" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58685 [12:05:22] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56272 [12:07:51] New patchset: Jeroen De Dauw; "Added QueryStoreUpdater tests" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58524 [12:07:55] New patchset: Jeroen De Dauw; "Some implementation for in the SQLStore updater" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58525 [12:10:12] JeroenDeDauw: we load other dependencies automatically, why not this one? Hm, I guess it's not really a dependency... [12:10:14] anyway [12:11:03] JeroenDeDauw: you complained about the name "PropertyLookup", and about it being managed by the store... I suppose I can fix the naming in that change, but would move it from Store to WikibaseCLient in a follow-up [12:11:13] ...since that's an unrelated change [12:11:21] does that sound ok to you? [12:11:56] DanielK_WMDE: client specific then? [12:16:21] DanielK_WMDE: scrap what I said earlier - I already did this yesterday apparently - so the stuff gets loaded when using exp mode [12:16:29] aude: the ClientStore is already client specific [12:17:00] so generally moving store stuff out of lib? [12:17:04] aude: but it's a good question... sould this *really* be client side functionality? shouldn't this go in lib? [12:17:04] DanielK_WMDE: sure, sounds good [12:17:10] is that the idea? [12:17:30] aude: no, moving non-store stuff out of the store [12:17:35] hmmm [12:17:37] ok [12:17:48] for example, if we want lua to be available in the repo..... [12:17:55] or parser functions [12:17:56] aude: PropertyLookup (or whatever we end up calling it) is not a storage class really. 
[12:17:59] DanielK_WMDE: that is, if you are talking about having the WikibaseClient factory return an instance of this thing - the logic itself is indeed not client specific [12:18:11] i think there's a need for something that both repo and client can use [12:18:15] JeroenDeDauw: yes, that's what i'm talking about [12:18:16] DanielK_WMDE: that's fine [12:18:34] aude: yes, and there's also need for a shared factory (interface). but that's for later [12:18:39] New patchset: Henning Snater; "Introducing toolbarbase widget" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58474 [12:18:51] DanielK_WMDE: there is [12:19:19] aude: there is? [12:20:09] JeroenDeDauw, aude: so, names. Instead of PropertyLookup, PropertyFinder? PropertyLabelResolver? What? [12:20:09] i think having something equivalent to the client factory for the repo, and shared interface [12:20:33] either suggestions are fine [12:20:46] there's a factory there, but no shared interface. anyway. [12:20:52] New patchset: Henning Snater; "Renamed/Moved toolbar jQuery widgets" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58475 [12:21:04] right..... [12:21:18] DanielK_WMDE: those names do not sound good to me - can you perhaps first describe the responsibility of the class? Since I am not sure what exactly you are thinking of putting where [12:22:03] JeroenDeDauw: it's really just one method: getClaimsByPropertyLabel( Entity $entity, $propertyLabel, $langCode ) [12:22:19] PropertyRetrievingByLabelFromEntityLookupFinder .... :D [12:22:23] it needs it's own service because it needs to access the property labels in the database somehow [12:22:25] * aude annoyed with class naming [12:22:37] aude: actually, it retrieves Claims, not Properties :) [12:22:47] k [12:24:14] JeroenDeDauw: I can imagine adding a method for getting a Property object given a label, though one of the two implementations we currently have would have trouble supplying that functionality. [12:24:29] so, what would you call it? [12:24:47] oh, one merged! yay! [12:25:49] ByPropertyLabelClaimFinder is the shortest clear name I can think off [12:28:03] DanielK_WMDE: what about having one "ByPropertyLabelClaimFinder" class (and no interface (interface as in language feature)) and compositing out the stuff that currently varies between both implementations of the existing interface? [12:28:32] JeroenDeDauw: like, everything? [12:28:53] they really work in completely different way. [12:28:54] s [12:29:01] :o [12:29:57] Maybe "ClaimFinder" would be sufficient?... [12:30:11] find by what? [12:30:19] right [12:30:49] But if ByPropertyLabelClaimFinder is the interface, we would get something like TermIndexBasedByPropertyLabelClaimFinder for the implementation [12:30:50] ugh [12:30:56] nooooooooooo [12:31:00] :P [12:32:35] DanielK_WMDE: I do not understand what we currently have two implementations for [12:32:49] MockRepository implementing PropertyLookup is weird [12:37:40] JeroenDeDauw: there's an old implementation, without using the terms table, and a new one that is using the terms table. [12:37:46] we coudl ditch the old one if that helps [12:38:47] MockRepository implementing PropertyLookup... well, to mock PropertyLookup, you need Items and Properties. And you need a way to enumerate them. [12:39:11] In the MockRepository, we have both. 
The MockPropertyLookup could be build on top, but only if MockRepo exposed the internal structure [12:39:17] which i suppose would be doable [12:39:29] i don't see a big problem with implementing the interface directly, though [12:40:15] JeroenDeDauw: but anyway, we should plan for havign at least two implementation of PropertyLookup: one using the repo's terms table, and one using some local table (for 3rd party clients) [12:40:20] JeroenDeDauw: i don't get the purpose of ValueParsers/includes/api/ApiParseValue.php. does it just receive values via api request, parse them and return them again to be returned by the api? *wonder* [12:40:34] even if we don't implement this now, we should still provide for that case [12:41:15] tobyS: i think so, yes - quite useful to have. [12:41:25] tobyS: yes, that is indeed what it does. So it takes some string version, for instance DMS coordinate, and returns it as a serialized DataValue [12:41:51] JeroenDeDauw, DanielK_WMDE: so, for the purpose of validating a value or something? [12:42:03] tobyS: no, just parsing [12:42:40] I'd say yes, also for validating :) [12:42:55] hehe ;) [12:43:01] tobyS: the view in the UI is based on the structured value, the input is a text field. [12:43:11] input: 42 N, 10 E [12:43:11] output: some JSON thing like { 'latitude': 42.0, 'longitude': 10.0, 'altitude': null } [12:43:13] ah, i see [12:43:21] when a new value is entered, the view needs to be updated (possibly before saving) [12:43:24] so it unifies different input representations [12:43:27] understood [12:43:33] m [12:43:39] mau [12:43:51] tobyS: we have the ValueParser interface also on the client side in JS [12:44:13] And this API allows us to be lazy and create parsers for stuff by just calling this API in the praser rather then implementing the logic again... [12:44:39] I'm not convinced this is a good idea, but this is why we created that API module [12:46:59] hmm, don't find it that bad [12:47:30] might result in performance issues, depending on how often the api is used to validate the field values [12:47:38] but otherwise it sounds logical [12:47:52] since the functionality must exist in the backend anyway [12:49:43] aude, JeroenDeDauw: i think i know now what's wrong with the PropertyLookup. WE really want something that looks up properties by label. But the old code doesn't work that way, it makes do without having access to all properties. [12:50:08] Oh [12:50:08] That reminds me [12:50:19] I started a rebuildTermSearchKey run again [12:50:21] * Reedy checks on it [12:50:26] :) [12:50:55] Ooh [12:50:57] It broek [12:50:57] A database error has occurred. Did you forget to run maintenance/update.php after upgrading? See: https://www.mediawiki.org/wiki/Manual:Upgrading#Run_the_update_script [12:50:58] Query: SELECT term_row_id,term_language,term_text FROM `wb_terms` WHERE (term_row_id > 25764381) AND (term_search_key = '') ORDER BY term_row_id ASC LIMIT 100 FOR UPDATE [12:50:58] Function: Wikibase\TermSearchKeyBuilder::rebuildSearchKey [12:50:58] Error: 1032 Can't find record in 'wb_terms' (10.64.16.28) [12:50:58] aude: JeroenDeDauw: So if we ditch the old code, we can have a much better interface. But then we are forced to provide indexed access to properties on the client side. Do you think that's acceptable? [12:51:28] Reedy: what does that mean? [12:51:37] I have no idea [12:52:53] 30.8% done apparently [12:52:56] Reedy: http://bugs.mysql.com/bug.php?id=27123 ?? [12:53:34] They aren't using FOR UPDATE [12:53:46] Seems to be carrying on fine.. 
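To make the naming debate a few messages up concrete: a hedged sketch of the one-method service under discussion. The method signature is the one DanielK_WMDE quoted; the interface name is the ByPropertyLabelClaimFinder candidate floated in the chat, and the doc comments are illustrative.

```php
interface ByPropertyLabelClaimFinder {

	/**
	 * Returns the claims of $entity whose property has the label
	 * $propertyLabel in the language $langCode.
	 *
	 * @param Entity $entity         the entity whose claims are searched
	 * @param string $propertyLabel  e.g. 'population'
	 * @param string $langCode       e.g. 'en'
	 *
	 * @return Claims
	 */
	public function getClaimsByPropertyLabel( Entity $entity, $propertyLabel, $langCode );
}
```

The two implementations mentioned in the chat would differ only in how they resolve the label to a property: one via the repo's terms table, the other by loading the Property entities used in the claims and comparing labels.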
[12:53:51] Reedy: does this happen all the time, or only ocasionally? seems to be an internal mysql issue... [12:54:02] First time I've seen it [12:54:10] And it stopped the script at some point [12:54:13] anyway, if it misses some for some odd reason, we can just re-run the script [12:54:25] mwscript extensions/Wikibase/repo/maintenance/rebuildTermsSearchKey.php wikidatawiki --only-missing --force [12:54:28] ...for the rows still missing the search key [12:54:42] I logged a bug about some getting '' [12:54:59] huh, that shouldn't happen [12:55:17] maybe malformed utf8 in the input?... whatever [12:55:22] mmm [12:55:37] It made looking for a starting point hard ;) [12:55:51] https://bugzilla.wikimedia.org/show_bug.cgi?id=46867 [12:56:02] well, if --only-missing isn't a performance problem, just use that [12:56:13] Reedy: also, next time, bump the batch size [12:56:18] 100 is too small i think [12:57:02] "too small"? [12:58:43] DanielK_WMDE: now I am unsure how the old code works - how can it possibly do this correctly without access to property labels? [12:59:34] JeroenDeDauw: it has access, but no indexed access. It loads all Property entities used by the claims in the provided entitiy, then looks at the labels. [12:59:45] A sane design for this seems pretty simple: have an interface to obtain the needed info, have an object to do the filtering task, and have it composit an instance of the first interface [12:59:54] That's slow, but not as slow as having to load *all* Property objects and looking for the label [13:00:10] DanielK_WMDE: yeah so what> [13:00:25] You can easily change this if you properly design the thing [13:00:56] i don't see how. what would the interface for getting the needed info look like? [13:01:07] the two implementations need completely different info. [13:01:08] CÖDE FREEZE! :) [13:01:27] Silke_WMDE: cool name for a band :P [13:01:40] yeah [13:02:00] JeroenDeDauw: ...also, the actual filtering is already factored out into the Claims class. [13:02:44] Reedy: "too small" as in "makes it slow". [13:02:51] lol [13:03:00] I made it bigger before and it didn't seem to help [13:03:13] oh? ok fine then. [13:04:11] Theres over 83 million rows in that table [13:05:05] sounds about right [13:05:49] 61GB for the wikidata database [13:05:53] DanielK_WMDE: perhaps I do not understand the situation correctly - if I do, then I stick with my conclusion, if I don't, then to bad, I'm not looking further into this now, so sorry, can't help you with it further then [13:05:56] Reedy: i guess at some point we need to find a different way to store this. [13:07:04] JeroenDeDauw: i'll propose a different approach. [13:08:20] https://www.mediawiki.org/wiki/MediaWiki_1.22/Roadmap [13:10:13] Denny_WMDE: i fear the terms table will outgrow mysql pretty soon - once people start adding lables in more languages, it will explode. [13:10:18] we need to think about an alternative [13:10:30] lbenedix: pong, sorry for the lag [13:10:41] DanielK_WMDE: agree [13:11:16] edsu: ACK [13:11:41] is it possible to get the number of bot-edits vs human-edits from wikipulse? [13:12:09] I know that its possible to filter in wikistream [13:13:34] lbenedix: it's definitely possible [13:14:04] lbenedix: at least registered bots [13:14:24] lbenedix: what are you interested in seeing? [13:14:48] yepp [13:14:49] lbenedix: proportion of bot edits to humans for wikidata, or wikipedia in general? 
[13:15:01] wikidata [13:15:58] I think that nearly all edits in wikidata are made by bots and I want to prove that assumption [13:16:39] ok, i am working on a page that will display stats for wikidata edits [13:16:56] so it's useful to hear that you would like to see this [13:18:07] lbenedix: i think you are right btw - lots of bot faces in http://wikistream.inkdroid.org/#wiki=wikidata.wikipedia :-D [13:22:17] I reckon we should delete https://wikidata.org/wiki/Property:P404 - it's no fun if Property 404 exists. [13:23:40] New patchset: Henning Snater; "Introducing toolbarbase widget" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58474 [13:24:23] tommorris: that's a slippery slope, what about p500 ? [13:24:30] tommorris: :-D [13:24:57] no, just randomly throw an error when someone tries to access it [13:25:19] also, now Bitcoin exists... P402 needs to have a paywall [13:29:31] Thanks! I think this is enough to see that my assumption is true... btw: have you seen the Hardware- WikidataMeter: http://www.youtube.com/embed/4A00Uf_qAbE ? [13:30:13] * lbenedix thinks about having two meters, one for bots and one for humans [13:32:20] the youtube link is for a live-stream of the meter on my table [13:38:56] New review: Daniel Kinzler; "will propose a different solution" [mediawiki/extensions/Wikibase] (master) C: -2; - https://gerrit.wikimedia.org/r/56319 [13:39:07] New review: Daniel Kinzler; "will propose a different solution" [mediawiki/extensions/Wikibase] (master) C: -2; - https://gerrit.wikimedia.org/r/56330 [13:39:18] New review: Daniel Kinzler; "will propose a different solution" [mediawiki/extensions/Wikibase] (master) C: -2; - https://gerrit.wikimedia.org/r/56717 [13:54:05] hi, i have a question regarding wikibase lua api. I want to get the local link to some id e.g: mw.wikibase.sitelink( id ) [13:54:20] it works fine- but when this entity dont have sitelinks it throws error [13:54:32] is there any workaround for it? [13:54:49] eranroz: i think the fix for that is due to be deployed on monday [13:54:51] eranroz: i think it's a bug and there's a bug fix coming [13:55:04] * aude checks [13:55:35] thanks [13:55:48] DanielK_WMDE: so, for the job queue, step #1 is to get redis working properly for handling the change notification jobs in the client [13:55:51] and inserting the jobs [13:56:12] is there anything special we need to do for that? [13:56:13] right now i have it setup but probably not configured 100% correct and it does not work [13:56:18] i thought it should Just Work... [13:56:18] figuring it out [13:56:43] BTW - is it ok to use Lua module to request page, and to get all the site links+labels of its associated properties? (performance question) [13:56:47] aaron seems to be online, perhaps ask him [13:56:50] sure [13:56:56] eranroz: yes [13:56:58] do you know if there is any documentation? [13:57:11] no idea [13:57:14] eranroz: https://gerrit.wikimedia.org/r/#/c/56737/ [13:57:19] the bug fix is there [13:57:20] New patchset: Henning Snater; "Renamed/Moved toolbar jQuery widgets" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58475 [13:57:21] not deployed yet [13:57:42] DanielK_WMDE: i'll poke at it a bit more this evening or later [13:57:51] * aude going home soon and be on the call [13:58:02] lbenedix: edsu: bots account for about 85-90% of the edits [13:58:06] * DanielK_WMDE got very confused about the PropertyLookup stuff and is giving up for today. crud.
heh [13:58:27] thanks - so i'll w8 for the deployment [14:00:59] DanielK_WMDE: aude: huh? are 90 Million lines in a MySQL DB a problem? [14:01:06] DanielK_WMDE: in terms of performance, the "13:57:52 Finding pending changes for vowiki " step in the dispatcher is what takes longest, by far [14:01:12] I mean, we are aiming there for about 2-4 Billion lines [14:01:27] Denny_WMDE: don't know at what point it's a problem [14:03:22] aude: well, that step reads some thousands of changes from the database. [14:03:37] * aude nods [14:04:20] Denny_WMDE: yes, that's what we are aiming for, and i fear it is a problem. my guess is that the practical limit is about 1 billion, but this is something to discuss with asher. [14:04:28] but we *should* discuss it with asher, and perhaps david [14:04:29] ok, truncated my changes table and restarting the bots :) [14:04:36] see how it works now [14:05:15] aude: you can play with dispatchBatchChunkFactor, may improve performance for small wikis (but is slower for big wikis). [14:05:31] DanielK_WMDE: my testwiki is not so small anymore [14:05:45] shall try various settings [14:05:53] aude: well, it really depends on what percentage of changes is relevant to the wiki [14:06:15] hmmm....k [14:06:37] * aude has also expanded the number of clients :) [14:08:45] aude: hm... there is one obvious way to improve performance, but it's not future proof [14:08:56] i'll mess with it for a bit and propose a patch [14:09:07] mmmm... k [14:09:15] what do you have in mind? [14:10:52] ok, reset my changes and stats collection for my test wiki [14:12:20] aude: currently, we list the changes, then for each change get the sitelinks, then look if the change is relevant to the target wiki. [14:12:31] that's nice and clean, flexible and future proof. [14:12:35] it's also dog slow. [14:12:48] we can join against sitelinks directly. [14:13:10] perhaps [14:13:52] won't work once we want to propagate changes for items referenced on different client pages [14:14:07] we'd need a way to track such usage [14:14:46] and filter based on usage at some point in the process... [14:15:54] JeroenDeDauw: regarding includes/diffop/diff/Diff.php, is the idea that a diff is a tree of diff-ops and sub-diffs? [14:22:28] DanielK_WMDE: once i reset my changes table the "14:20:43 Finding pending changes for hawwiki " is very quick [14:22:37] it degrades a bit with size of table [14:22:40] obviously [14:23:01] yes [14:23:05] * aude shall think about it and figure out redis jobs in the client [14:23:12] it seems actually like it degrades a LOT with the size of that table [14:23:19] it does [14:23:39] i probably had tens of thousands of changes in my table [14:23:45] didn't count :( [14:23:45] also with the size of the sitelinks table [14:23:54] didn't truncate my site links [14:24:00] * aude importing them from enwiki :) [14:24:36] aude: are sitelinks to your client wikis evenly distributed? i think the skewed distribution we have is part of the problem [14:24:57] they are distributed the same way as they are on wikidata and wikipedia [14:25:28] i don't have any site links that don't have enwiki there [14:25:51] could import from another wikipedia, but not important i think [14:25:53] Hi, I got an idea of feature request, I wanted to have feedback about it so I post here : some sort of namespaces for items [14:27:20] there seem to be different kinds of items; some of them are not really part of the knowledge database, but more "technical items", e.g.
items that are just here to identify a template on mediawiki [14:27:25] Denny_WMDE: is the 85-90% bot-edits an actual number? I thought the bots are actually throttled due to the dispatch lag? [14:27:49] oh, it was based on analysis of the dumps until two weeks ago [14:27:55] I'm not sure properties are the right way to classify those items [14:27:57] they are not current [14:28:09] ggaaahhh! [14:28:14] :( [14:28:56] so, i can't join wb_changes against wb_items_per_site, because one has two columns for entity type and id, and the other uses a single column for the prefixed ID [14:29:04] well, i can join on that, but not efficiently [14:29:10] yikes [14:29:34] so, we have to mess with the database schema to address that performance issue [14:29:47] *sigh* [14:29:50] let me file a bug [14:30:04] definitely need consistency there [14:30:07] * DanielK_WMDE has the feeling of taking one step forward and two steps back [14:30:22] * aude sighs [14:31:19] looking at http://wikistream.inkdroid.org/#wiki=wikidata.wikipedia&namespace=all makes me think that the number might be higher [14:31:21] it'd obviously be easier to make the changes table do it the same way (the *wrong* way) as site links [14:31:43] suppose it's doable but not so simple to fix the site links table [14:32:14] while at it make the page column store db_key form would be nice [14:32:24] ips_site_page [14:33:59] Ok, it seems there are more actual problems than my feature request :) [14:33:59] items per site really does not have entity type (except implied that it's an items per site table) [14:34:52] TomT0m: think that would be a question for the community [14:36:48] DanielK_WMDE: alright, going home for real.... back in an hour [14:38:23] aude, community does not really deal well with feature requests, it reasons mainly with what is as it is [14:38:58] aude, Denny_WMDE: https://bugzilla.wikimedia.org/show_bug.cgi?id=47125 [14:40:01] TomT0m: if I was king, items would only exist for "real" wikipedia articles. [14:41:22] TomT0m: but since we have "technical" items, i think they should be marked using some convention; perhaps with "instance of" -> "wikipedia template". I have seen something like that for categories, i think [14:41:49] Even better would be a special value for the "main type", even if that would be incompatible with the GND [14:42:19] actually... it doesn't even need to be a value, it can be the statement that such an item has *no* main type (set it to "no value"). [14:44:00] Tobi_WMDE: http://stackoverflow.com/questions/1732348/regex-match-open-tags-except-xhtml-self-contained-tags/1732454 [14:47:01] JeroenDeDauw: you should definitely not parse HTML with RegEx'... [14:47:24] lbenedix: yes, you need to explain this to Danwe_WMDE [14:47:26] please do [14:47:34] HTML is crap [14:47:43] can be crap [14:48:28] Only when written by Danwe_WMDE clearly [14:48:31] [14:48:38] most browsers don't care if you write <br>
or <br/>
[14:49:03] it's both valid [14:49:12] http://www.w3.org/wiki/HTML/Elements/br [14:50:14] only for xhtml (which is also crap) you have to close a br tag [14:50:56] for xhtml you can use an xml parser (if you can be sure that it's valid xml) and look for elements without children [14:51:02] DanielK_WMDE, hehe, actually we are doomed to have technical items since wikidata also solves the technical problem of interwikis, which exists for technical pages as well. I also saw the problem of identifying a Wikipedian using an item on mediawiki project [14:52:37] xpath for this would be //*[not(*) and not(normalize-space(.))] [14:53:27] but you'll also find
[14:54:08] I see you use GND as a reference also here, I saw a lot of discussion about that, a lot of wikidataans are not really happy about that, it seems a very poor solution to the problem I want to address, as well as a bunch of others [15:05:58] TomT0m: not everything that is "technical" in one Wikipedia is "technical" in another [15:06:28] TomT0m: i.e. there are items which represent an Article in Wikipedia A but a page in namespace "Anexos" in Wikipedia B [15:06:48] making a strict technical divisions would lead to problems for these items [15:11:09] Denny_WMDE, of course, I think this means wikidatas knowledge datas are not bound to wikipedias, there is perfectly valuable datas for humanity knowledge that are not meant to be the subject of articles in wikipedias [15:12:26] agreed. but how does this relate to your feature request? [15:14:36] I'm not sure how about how to solve the problem exactly, just checking to see if I'm the only one to think there is a problem in the first place [15:14:58] I don't really see a problem there [15:15:11] whatever you do, you will hardly do it on all items anyway [15:15:20] but over a set of them that you preselect [15:15:28] actually I saw discussions on wikidata about admissibility of such "technical" items, and community said "no, this is not valuable enough" [15:15:57] well, yeah, sure, the community is free to decide as they like [15:16:36] I though a clear way to say "this item is not humanity knowledge and not meant to" would open doors and avoid some discussions [15:17:44] there exists things that are not meant to be human knowledge? [15:18:01] ask the cylons [15:18:01] And I'm not sure it is an ontology problem as the same arise for possible pure "technical" properties ... [15:19:11] lbenedix, as there is things that are meant to get a page on mainspaces and wikipedia and help or discussion pages [15:19:26] *on wikipedias [15:22:41] New patchset: Henning Snater; "wikibase.ui.Toolbar: Fixed using incorrect scope reference" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58698 [15:22:48] I dont get it [15:23:04] do you have any example for things no human should know? [15:23:38] yeah, how to code templates in wikicode :) [15:24:24] anyone around with a php wikidata bot? :) [15:27:14] lbenedix: you can use recursive regular expressions on xhtml ;) [15:28:09] Danwe_WMDE: never use regex for (x)HTML [15:28:15] lbenedix, but the real thing is that items for wikitemplates are of a really different kinds compared to items of main space wikipedia articles, to the point I'm not sure they should really be really another way of discriminate them by their ontology "kinds" [15:28:44] Danwe_WMDE: "recursive regular expressions" is an oxymoron :P [15:28:47] New patchset: Henning Snater; "Introducing toolbarbase widget" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58474 [15:28:56] (and yes, i know they exist, even though they shouldn't) [15:29:36] http://www.catonmat.net/blog/wp-content/uploads/2009/12/yo-dawg-regex.jpg [15:31:05] DanielK_WMDE: I found them useful [15:32:29] DanielK_WMDE: e.g. 
if you want to match something within brackets while only getting the part surrounded by an equal number of opening/closing brackets [15:33:40] Danwe_WMDE: but as DanielK_WMDE says, it is not a "regular" expression [15:34:13] Denny_WMDE: DanielK_WMDE hehe :D [15:34:33] I don't care what they are as long as they exist and can be used :P [15:34:35] Danwe_WMDE: use a callback [15:34:53] I think that tobyS will agree you are doing it wrong if regular expressions show up very regular in your codebase? :p [15:35:11] Danwe_WMDE: ...or use a proper grammar to make a proper parser. [15:35:22] JeroenDeDauw: depends on what that codebase does... [15:35:31] Our codebase [15:36:20] I am not using them for this project anyhow. But I still like them, they can spare you a lot of code. [15:36:34] there are some cool implementations of yacc etc in JS, where the result is JS as well… that's recursive :) [15:36:52] JeroenDeDauw: i'm a fan of regex from a geek perspective. but in real-world projects they are mostly annoying, because many people cannot read them fluently. [15:37:31] there is one very important rule: code is ~50x more often read than written and should therefore be optimized for reading. [15:37:54] but if anyone can read and understand your code he will replace you if he's cheaper than you ;) [15:38:19] lbenedix: obviously not true in open source. [15:38:26] right [15:39:16] tobyS: Agreed, but what about documenting your regex? should solve the problem of people not being able to read them to a sufficient degree. [15:39:30] lbenedix: if that is the case where you are working, you need to find a place with non-retarded management :) [15:39:48] JeroenDeDauw: perhaps management can't read code ;) [15:39:50] ^([0-9a-zA-Z]([-\.\w]*[0-9a-zA-Z])*@([0-9a-zA-Z][-\w]*[0-9a-zA-Z]\.)+[a-zA-Z]{2,9})$ [15:39:58] Danwe_WMDE: depends, if you need to fix a bug in a regex that you don't understand documentation is worthless [15:40:50] Looks like we got management that can write evil regexes tho? ^^ [15:41:51] Denny_WMDE: this is not match all valid emailadresses ;) [15:42:09] lbenedix: you made my point :) [15:42:32] and no one can understand it [15:42:52] well, that one is still "easy" to read ;) [15:43:43] this one is nice, and maybe it matches all valid mailadresses: http://sleeksoft.co.uk/public/techblog/articles/20050121_3.html [15:44:48] quick question about the api if anyone knows, if I use wblinktitles will that create an entity if there is not an entity for either link I pass it? [15:45:05] addshore: it should according to the spec [15:45:10] :) [15:45:14] I hope that's the way it was implemented :) [15:45:16] should will do just perfectly ;p [15:45:54] something like that is way worse "((?&number)\ (?&op)\ (?&number)(?(DEFINE)(?<number>\d+)(?<op>\+|\*|-|/)))" [15:47:52] If someone submits such a regex, they are getting this article linked on code review http://bit.ly/16Pvv2k [15:52:13] New patchset: Jeroen De Dauw; "Added SnakStore and some initial implementation" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58701 [15:52:14] New patchset: Jeroen De Dauw; "Added NoValue and SomeValue SnakStores" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58702 [15:52:14] New patchset: Jeroen De Dauw; "refactor testCanStore method down" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58703 [15:55:01] Denny_WMDE: also another quick one, will wblinktitles overwrite pre existing links?
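A quick demonstration of the balanced-brackets trick Danwe_WMDE describes above, using PCRE's (?R) whole-pattern recursion; a generic example, not code from the project.

```php
// (?R) recurses into the entire pattern, so the group only matches when
// every "(" has a matching ")". The possessive [^()]++ avoids backtracking.
$pattern = '/\((?:[^()]++|(?R))*\)/';

preg_match( $pattern, 'f(a, g(b, (c)), d) plus h(e', $matches );
echo $matches[0]; // prints "(a, g(b, (c)), d)" - the unbalanced "(e" is not matched
```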
:O [15:56:06] addshore: that I do not know [15:56:49] New patchset: Henning Snater; "Renamed/Moved toolbar jQuery widgets" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58475 [15:56:57] addshore: no it will not, and it cannot [15:57:19] :D [15:57:22] perfect [15:57:50] but try it out first, please [15:58:44] New patchset: Raimond Spekking; "Add newline otherwise translatewiki.net scripts freaks out" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58704 [15:59:01] no one admitting they clicked the link :) [16:00:25] Change merged: Raimond Spekking; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58704 [16:03:19] New patchset: Jeroen De Dauw; "Work on implementing NoValueSnakStore::storeSnak [DO NOT MERGE]" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58705 [16:11:48] New patchset: Henning Snater; "Introducing toolbarbase widget" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58474 [16:12:26] New patchset: Henning Snater; "Renamed/Moved toolbar jQuery widgets" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58475 [16:38:54] aude: you there? [16:40:14] Change merged: Daniel Werner; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58474 [16:40:46] Change merged: Daniel Werner; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58475 [16:41:33] hi Denny_WMDE [16:41:50] DanielK_WMDE: are you around? [16:42:34] would you mind to do the http://www.mediawiki.org/wiki/Manual:$wgRateLimits rate limit thingy? [16:42:53] I would suggest to make the throttle at 20 per 60 s? [16:42:56] i'll take a look [16:43:07] for now. rather be a bit conservative. [16:43:12] Abraham_WMDE: for a few more minutes, yes [16:43:44] Denny_WMDE: 20 per 60s is pretty restrictive, but probably ok [16:43:54] yeah, we can increase later [16:44:01] let's first get the lag away [16:44:04] and then increase slowly [16:44:19] * DanielK_WMDE wonders whether this is using a leaky bucket implementation [16:44:29] the ratelimit? [16:45:19] https://gerrit.wikimedia.org/r/#/c/58708/ for moving the cron jobs [16:45:34] checking if they are ready to be moved [16:48:43] <^demon> aude: Should be ok, /var/log/wikidata/* already exists and is owned by mwdeploy. [16:49:02] ^demon: ok [16:50:37] ^demon: for the rate limit, there's just one default (with 25 lines of settings) [16:50:46] <^demon> looks like it. [16:50:52] do we really have to copy the entire thing, just to have one part of it different for wikidata? [16:50:59] or is there some trick? [16:57:49] aude: +wikidata and override should work... [17:01:44] hmmm, ok [17:06:36] denny is gone.... [17:06:55] i wonder about 20 per minute [17:06:59] DanielK_WMDE: ^ [17:07:39] we can try but given the atomic nature of edits, a human can edit quicker (yet the js ui is slowish, so maybe can't be that quick) [17:09:11] https://gerrit.wikimedia.org/r/#/c/58709/ [17:50:44] New patchset: Jeroen De Dauw; "Removing me as author since this code changed into something I do not want to be associated with" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58722 [17:51:01] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58722 [18:05:40] New patchset: Legoktm; "(bug 41573) Add entities to watchlist." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54704 [18:21:33] I just got a code review email... 
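The throttle discussed above, written out with the standard $wgRateLimits structure from the linked manual page; a sketch of the suggested numbers, not necessarily the change that was actually deployed.

```php
// Format: array( <number of actions>, <window in seconds> ), so Denny's
// suggestion of at most 20 edits per 60 seconds per user becomes:
$wgRateLimits['edit']['user'] = array( 20, 60 );
// Other group keys such as 'ip' or 'newbie' take the same format.
```

Per the "+wikidata and override" remark, on the WMF cluster this would presumably be added as a per-wiki override of the default settings block rather than a copy of the whole thing.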
oh? [18:21:58] Yes... [18:23:56] I know it is Wikidata as it is for Wikibase. <-- Well, close enough to Wikidata for me. [18:34:27] [[WD:STOPTHEBOTS]] [18:34:28] [2] https://www.wikidata.org/wiki/WD:STOPTHEBOTS [18:34:32] * legoktm smirks [18:36:03] New review: Hoo man; "(3 comments)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58223 [18:39:09] So, 24 hours of no bots? [18:41:59] Thats the goal [18:48:03] I just made a little search and realized 87 percent of wd edits are made by bots [18:48:11] that's great [18:48:14] cheers to bots [18:49:06] oh right [18:49:17] Amir1_: https://www.wikidata.org/wiki/WD:STOPTHEBOTS [18:50:37] legoktm: I was in the discussion dear [18:51:36] oops :/ [18:51:36] i searched for "ladsgroup" and didnt see it [18:51:36] didnt realize you changed your signature [18:51:46] I changed to "Amir" at first [18:52:11] but i don't know how my preference got reset [18:53:47] I didn't see the new topics [18:53:56] I watched the page [18:54:00] I stop my bot now [19:07:51] aude: re rate limit: yes, humans could quickly fire off 5 changes in 3 seconds. but do you think they can keep that up for another 9 seconds? Only then would the rate limit kick in. [19:08:12] ...at least if it's implemented correctly. [19:08:48] DanielK_WMDE: we'll see if it's a problem [19:08:50] probably not [19:08:59] unless they have some special gadget [19:29:12] Surely INT -> BIGINT should be quick at the database level? [19:29:21] Nothing needs to actually change.. [19:30:27] That and it has a PK so should work with OSC [19:31:51] DanielK_WMDE: I guess you can just update the schema, then move that bug to be a WMF ops bug [19:40:03] the bots are not stopped, are they? [19:40:31] Hoping to stop bots for 24 hours. [19:40:42] Are you asking for blocks? [19:41:03] still a lot of bot edits: http://wikistream.inkdroid.org/#wiki=wikidata.wikipedia&namespace=all [19:41:20] Jasper_Deng: no blocks [19:41:22] Jasper_Deng: Blocks are probably not the best thing. [19:41:25] lbenedix: not all [19:41:51] Legoktm; Legobot is still operating at WD:RFD correct? [19:42:14] yes but there is severe replag [19:43:10] mb you can config AbuseFilter for bots edits and define some rate filter. im not sure if abusefilter works on the api [19:43:14] New review: Jeroen De Dauw; "commit message = :D" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58704 [19:44:52] eranroz: it works on everything :) [19:45:07] but [19:45:11] we arent going to do that [19:50:24] Reedy: but alter table is always lock -> copy -> replace, no? i don't think an in-place conversion is possible. [19:51:04] legoktm, JohnLewis: we are working on getting an edit throttle deployed soon. That should tame the bots. [19:51:25] Nice DanielK_WMDE [19:51:29] sounds good [19:51:40] legoktm, eranroz: it was not trivial to make it work on everything, but it does now :) [19:51:53] :P [19:54:14] DanielK_WMDE: i think with their schema update tool, should be doable [19:54:33] * aude hopes so [19:56:14] maybe injecting some trojan capabilities to pywikipedia and to get control over this bot could be easier ;) [19:59:42] DanielK_WMDE: are you around tomorrow?
[20:04:19] DanielK_WMDE: ok [20:04:31] * aude neither or semi available (at a conference) [20:04:51] probably get the redis working though and experiment with it [20:05:33] for inserting jobs, i'm quite sure my settings are just wrong and it should just work [20:06:59] legoktm, no, but I can turn it off if need be [20:07:10] notconfusing: that would be preferred [20:07:22] ok, I'll do it now. [20:07:38] notconfusing, heya. I put in a request for an ORCID property today [20:08:01] http://en.wikipedia.org/wiki/WP:NORUSH :) [20:08:10] no idea if wikidata has a similar thing.... [20:08:15] WP:DEADLINE [20:08:53] looks like the backlog is slow but steady declining and hope it helps a lot [20:09:02] decline faster [20:09:32] in meantime, we're moving the dispatcher to ashburn (faster, better hardware and better to interact with the databases) [20:11:02] bot down [20:11:08] ty :) [20:11:58] DanielK_WMDE: https://bugzilla.wikimedia.org/show_bug.cgi?id=46763 [20:12:17] what should i set the batch size to? [20:16:21] trying 500 [20:18:36] https://gerrit.wikimedia.org/r/#/c/58753/ [20:37:46] * Jasper_Deng pokes Lydia_WMDE [20:39:04] Jasper_Deng: you're a lucky one! just got home :) [20:39:06] wasup? [20:39:51] https://bugzilla.wikimedia.org/show_bug.cgi?id=45140 [20:40:01] Lydia_WMDE: I think the culprit is the code at https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/extensions/Wikibase.git;a=commitdiff;h=384baeac31516857131553b80a5813b79588953f;hp=ac63d423b3f0fc133e5ad99a57479a561a7a30ee [20:40:28] it seems to me (in my limited understanding of JS) that blockItemPageActions is mistakenly called when the page is protected [20:41:35] Jasper_Deng: having a look [20:42:32] Jasper_Deng: possible yes - care to leave a comment? then i can bug tobi about it tomorrow [20:42:50] on the bug? [20:43:14] yes that'd be ideal [20:44:09] commented [20:44:38] thanks! [20:44:48] i'll see what tobi has to say tomorrow [21:47:37] 2 Warning: preg_match() [function.preg-match]: Unknown modifier 'c' in /usr/local/apache/common-local/php-1.22wmf1/extensions/W [21:47:37] ikibase/repo/includes/api/SearchEntities.php on line 148 [22:23:32] Reedy: eek, can you file that? [22:26:33] Haha [22:26:46] I've already logged bugs for 'n' and '2' [22:27:44] Same line [22:28:40] * Reedy consolidates [23:24:21] Moe_Epsilon: Stealing my deletions eh? ;) [23:24:49] Riley: no, I had just started at the opposite end :p [23:25:03] * Moe_Epsilon collided [23:25:16] pfft, same thing [23:25:24] :P On my bulk requests. [23:25:44] Moe_Epsilon; There's two more where that came from. [23:25:51] ooh [23:25:54] No there isn't.. [23:26:05] xD [23:26:08] jk :) [23:26:14] <___< [23:27:03] * Riley remembers the exact duplicate days [23:27:12] heh [23:31:54] >:) [23:32:29] >.< [23:34:36] don't give me that face, you don't need anymore http://toolserver.org/~vvv/adminstats.php?wiki=wikidatawiki_p&tlimit=none [23:35:24] Moe_Epsilon; Somehow Riley knows when I am doing deletion requests before I do xD [23:35:50] spidey senses [23:35:55] tingling [23:36:52] I only have 9,663 deletions.. its not like I have 10000.. [23:36:54] ;) [23:37:38] End of the week knowing you, you'll be at 10500. [23:38:17] I have a little over 2,100, but that's just because I sat and idled at RFD for the past week xD [23:38:37] Hey, I do that too! [23:38:49] Although, only for about 500 haha [23:38:51] not lately, you've been slacking [23:38:54] :) [23:39:18] Oh really? 
Good luck getting any more deletions tonight ;) [23:39:21] Riley; You do sometimes but the only time I see you there is when I say 'I might get a hundread or two requests at RfD' [23:39:51] * Moe_Epsilon pushes Riley into a closet and locks [23:39:56] * Moe_Epsilon strolls away [23:40:41] * Riley deletes the closet door [23:40:59] * Riley strolls away [23:42:48] I should have whacked you with my mop :< more efficient [23:43:05] * Sannita borrows his mop to Moe_Epsilon [23:43:18] o: [23:43:47] Moe_Epsilon: Your ACC account has been approved and you may now join #wikipedia-en-accounts. Please also subscribe to the mailing list at https://lists.wikimedia.org/mailman/listinfo/accounts-enwiki-l#Subscribing [23:43:56] oh o: [23:44:05] okay :) [23:46:05] * legoktm used to be subscribed to that list [23:46:38] Legoktm: I am subscribed :D [23:48:23] Your account Legoktm has been suspended by Stwalkerster because Inactive for 45 or more days. Please contact a tool admin if you wish to come back.. To contest this suspension please email accounts-enwiki-l@lists.wikimedia.org. [23:48:26] lolol [23:48:29] that was in [23:48:35] 8/21/2010 [23:48:44] Wow. [23:48:46] :P [23:49:25] o-o