[00:03:53] ProcReader: that's painful, I'd fetch the change and "git rebase master" and fix, fix, fix, fix...
[00:05:11] saper: as bad as I expected then :/ -- wish me luck :p
[00:08:26] ProcReader: I have a change open since 2012 with 27 patchsets :/
[00:17:41] saper: out of curiosity, is it hard to get changes reviewed reasonably quickly on this project?
[00:19:33] ProcReader: FlaggedRevs? no idea...
[00:47:33] you'd hope it'd be doable since FlaggedRevs is deployed on WMF wikis, but, well, I keep having to rebase a patchset of mine against a different WMF-deployed extension since nobody seems to be too keen on reviewing such patches, so, um, good luck, I suppose?
[02:00:25] that's encouraging haha =/
[02:03:05] WMF don't really care about FR
[02:03:12] It's basically unmaintained and has no real owner
[02:40:26] it should be disabled
[02:42:08] It'd make more sense
[03:21:30] ahh, just came across https://phabricator.wikimedia.org/T185664 =/
[03:21:54] well, I can see why nobody wants it
[03:41:35] Reedy, saper: Got a seemingly working merge, haven't yet checked all the functionality. Slight problem though: I did a merge rather than a rebase (PHPStorm's UI seemed to do better with matches this way), not sure how much that matters here?
[03:41:36] Second problem: when I push this as a patch, since the original patch goes off a different parent commit (which also happens to be unmerged), is Gerrit going to update the parent to show changes based off the current master, or is it gonna be screwed?
[19:34:57] Hi.
[19:35:16] Is it possible to list all the category pages that don't use a template?
[19:36:53] Anything is possible
[19:37:00] You'd need to do some list intersection though
[19:41:00] List intersecting?
[19:41:50] Does the template need to insert another category, so I first query the category, then the category emitted by the template, then I do set operation A - B?
[19:42:14] Get a list of all category pages
[19:42:19] Get a list of all pages that transclude the template
[19:42:32] The ones in A that are not in B are your answer
[19:44:41] Ah, transcluding. Why did I not recall that API?
[19:44:54] Well, thank you for the gentle reminder.
[19:45:02] heh
[19:45:10] Finding the right words is sometimes the hardest part :)
[19:45:51] Or maybe because I was just so deep into thinking about complex APIs that I forgot the simple elegant ones.
[19:50:35] Is there a namespace lookup table?
[19:50:49] easiest IMHO is the api siteinfo
[19:51:04] https://en.wikipedia.org/w/api.php?action=query&meta=siteinfo&siprop=namespaces|namespacealiases
[19:51:05] Or do I just mw.config.values and string search?
[19:54:29] For the search api, when I did srtitle=*, I got a warning.
[19:55:20] What's the warning?
[19:57:48] Okay, it somehow works when I use the escape method of objects.
[20:03:45] But I get zero results.
[20:04:05] Doing something wrong here, Reedy? w/api.php?action=query&list=search&srsearch=*&utf8=&format=json&srnamespace=14&srlimit=500
[20:04:28] Why are you searching for *?
[20:05:13] I want a list of all the category pages.
[20:05:24] Don't need to do a search
[20:06:07] Wow, is it possible to list all the cat pages without search?
[20:06:29] https://en.wikipedia.org/w/api.php?action=query&list=allpages&apnamespace=14&aplimit=max
[20:58:46] hey Reedy, are you about for me to bug you? sorry to disturb :)
[20:59:39] I dunno, do I want to be? :P
[20:59:55] hahaha, probably not, knowing my questions!
[21:00:06] maybe Reedy should bug you instead?
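Reedy's list-intersection recipe from 19:42 maps directly onto two API queries: list=allpages for the Category namespace and list=embeddedin for the template's transclusions. A minimal sketch in Node.js (run as an ES module on Node 18+ for the global fetch; "Template:Example" and the helper names are illustrative, not from the log):

```js
// Sketch of the A - B intersection described above, following API
// continuation so every batch is collected. Node 18+ global fetch assumed.
const API = 'https://en.wikipedia.org/w/api.php';

// Run one list query to completion, following the continue tokens.
async function queryList(params, listKey) {
  const out = [];
  let cont = {};
  while (cont) {
    const qs = new URLSearchParams({
      action: 'query', format: 'json', formatversion: '2', ...params, ...cont,
    });
    const data = await (await fetch(`${API}?${qs}`)).json();
    out.push(...data.query[listKey].map(p => p.title));
    cont = data.continue; // undefined once the list is exhausted
  }
  return out;
}

// A: every page in the Category namespace (14)
const categories = await queryList(
  { list: 'allpages', apnamespace: '14', aplimit: 'max' }, 'allpages');

// B: every category page that transcludes the template
const usingTemplate = new Set(await queryList(
  { list: 'embeddedin', eititle: 'Template:Example',
    einamespace: '14', eilimit: 'max' }, 'embeddedin'));

// A - B: category pages that don't use the template
console.log(categories.filter(t => !usingTemplate.has(t)));
```

The embeddedin query is restricted to namespace 14 here so both sets contain only category pages before the difference is taken.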
[21:00:36] Platonides: they could certainly give it a go :p
[21:00:44] basically I wanted to run this by someone more familiar with mw core
[21:00:58] I'm trying to use parser functions in an extension, but use them in a way that alters the database
[21:01:08] I don't want to alter the database when someone previews things, for obvious reasons
[21:01:41] why do you want to change the database?
[21:01:42] but the attemptSaveHook doesn't have a ParserOutput for ExtensionData, and the ContentAlterParserOutput hook only has OutputPage and ParserOutput, neither of which know their preview status
[21:02:08] Platonides: custom db table - recognising a parser function in a page and adding something to an extension table appropriately to track that tagging
[21:02:22] there's a table for that, I think..
[21:03:28] in this case it's using a custom table, bc it needs to be more than just tracking the use of the function. I'm building an extension that allows people to categorise pages based on markup on their corresponding talk page, so I need to add categories to a table of indexed page IDs along with indexed talk page IDs, together with the categories specified in the parser function
[21:03:46] hence the quagmire in which I sit
[21:04:42] if need be I can submit a patch to mw core to make this work, but I feel like I might well be missing something obvious that already exists
[21:05:01] Reedy, is there a throttle limit I need to respect?
[21:05:20] maybe page_props helps you here?
[21:05:51] I get a "scripted requests from your IP have been blocked" error.
[21:06:07] page_props is only indexed on one page, not two - but choosing the db table isn't the issue, it's knowing when to put something into a db table that is
[21:06:09] acagastya: there are throttle limits, yes
[21:06:28] I really don't want to waste the time implementing the login; so what is the cap limit?
[21:06:28] indexing on both pages is quite important bc of the potential performance impact of a lookup on a non-indexed column
[21:06:42] where do you want to show it?
[21:06:58] maybe it would be easier to categorize the talk page
[21:07:11] but then make the category page show the article rather than the talk?
[21:07:50] Platonides: the whole point of the extension is that the page categorisation is interoperable with standard mw categorisation - so it could be used with tools like Huggle, RecentChangesLinked, etc
[21:07:53] acagastya: Generally there's not
[21:08:42] Well, I am getting an error at around the 10th-12th query.
[21:08:56] Naypta: this is not how mediawiki is designed
[21:09:11] what's the exact error message?
[21:09:16] I am running a while (continue) loop, and each request gives 500 results.
[21:09:26] maybe you could make article pages automatically include the talk page
[21:09:39] Platonides: I mean, yeah, extensions by definition don't do what mediawiki is designed to do :D the code I have at present works fully to do this, it's only the preview problem that remains
[21:10:09] Reedy: "Scripted requests from your IP have been blocked, please see https://meta.wikimedia.org/wiki/User-Agent_policy. In case of further questions, please contact noc@wikimedia.org."
[21:10:27] if I edit the article page that will probably break, too
[21:10:38] acagastya: what user-agent are you using?
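Setting aside the user-agent question just asked (resolved below), a scripted while (continue) loop like acagastya's can also defer to server load with the API's maxlag parameter, which makes the server refuse requests while replication lag is high. A minimal sketch, assuming Node 18+ global fetch; the 5-second values follow the usual bot convention rather than anything stated in this log:

```js
// Sketch: add maxlag to every request; on a "maxlag" error the server is
// lagged, so back off and retry. The 5 s values are a common convention
// for bots, not a limit quoted in this conversation.
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

async function politeQuery(api, params) {
  for (;;) {
    const qs = new URLSearchParams({
      action: 'query', format: 'json', formatversion: '2',
      maxlag: '5', ...params,
    });
    const data = await (await fetch(`${api}?${qs}`)).json();
    if (data.error?.code === 'maxlag') {
      await sleep(5000); // server is lagged: wait, then retry
      continue;
    }
    return data;
  }
}
```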
[21:11:04] Platonides: nope, I've got the relevant hooks sorted so that it does the relevant lookups on edits and then adds the relevant categories automatically, with no changes to the wikitext
[21:11:29] this is all tested; the only problem I'm currently aware of is the preview issue, which is why I'm raising it in particular
[21:11:47] Reedy: I am using node-fetch, a Node.js module which acts exactly like JavaScript's fetch.
[21:11:54] Not setting anything right now.
[21:12:07] acagastya: you will need to provide your own user-agent
[21:12:11] as that page says
[21:12:31] Well, it does not complain for the first 10 requests.
[21:12:38] That explains it
[21:12:39] https://github.com/wikimedia/puppet/blob/2d939e6ba428c343a789c69d82fa3093eb59a497/modules/varnish/templates/vcl/wikimedia-frontend.vcl.erb#L606-L615
[21:12:53] We don't like generic node-fetch
[21:52:22] DannyS712: you said you'd consider reviewing but won't +2; any idea who might? or (code-wise) things I can do to make it less problematic for someone to agree to review / potentially +2?
[21:52:40] not sure, sorry
[22:21:52] Reedy: any suggestions? 🙊
[22:24:42] I hope this channel is an appropriate place for a MediaWiki-related open source job posting: https://www.fossjobs.net/job/10221/phppython-programmer-at-open-tech-strategies-llc/
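Returning to the block acagastya hit at 21:12: the fix prescribed above is the User-Agent policy linked at 21:10:09, i.e. scripted clients must send a descriptive User-Agent naming the tool and a contact. A sketch with node-fetch, where the tool name, version, URL, and email are placeholders to replace with real ones (run as an ES module; node-fetch v3 is ESM-only):

```js
// Sketch: node-fetch with a descriptive User-Agent, per
// https://meta.wikimedia.org/wiki/User-Agent_policy
// Tool name, URL, and email below are placeholders, not real values.
import fetch from 'node-fetch';

const USER_AGENT =
  'CategoryLister/0.1 (https://example.org/categorylister; user@example.org) node-fetch';

const res = await fetch(
  'https://en.wikipedia.org/w/api.php?action=query&list=allpages&apnamespace=14&aplimit=max&format=json&formatversion=2',
  { headers: { 'User-Agent': USER_AGENT } },
);
const data = await res.json();
console.log(`${data.query.allpages.length} category pages in the first batch`);
```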