[00:02:23] cariaso: you should generally use matching core and extension branches
[00:02:45] such as master for all, or REL1_26 for all
[00:19:24] tgr: 'git branch -l' shows me only master, even after a 'git fetch --all'
[00:20:14] but maybe not master at the same point in time?
[00:20:35] maybe, but I don
[00:21:07] I understand the need to pull particular branches à la REL1_26 for certain modules
[00:21:18] but I'm not seeing them, if they exist in this case
[00:22:12] well, if you use master for core, you should use master for everything else as well
[00:22:23] just make sure they are all updated at the same time
[00:22:57] if by core you mean the directory containing LocalSettings.php, that is on REL1_26
[00:23:11] http://52.23.170.224/index.php/Special:Version might answer other questions more clearly
[00:23:24] you are trying to use a version of Flow that's after grants were moved from OAuth to core, and a version of core that's before it
[00:24:20] oh, in that case you should be using REL1_26 of Flow and OAuth
[00:24:42] or be ready to deal with surprises
[00:31:40] I see no evidence of a REL1_26 for OAuth: https://phabricator.wikimedia.org/diffusion/EOAU/
[00:32:02] (I don't use Flow, and have no interest unless it's a requirement for OAuth)
[00:32:54] no, I just assumed because you linked to a Flow patch
[00:33:42] https://phabricator.wikimedia.org/diffusion/EOAU/browse/REL1_26/
[00:34:01] Phabricator's repo overview interface is a sad thing
[00:35:36] I linked only because I saw their tracker discussing moving grants
[00:36:03] re: sad, I can only agree
[00:38:57] thanks for the pointer. I'll pick up from there.
[02:06:28] https://www.mediawiki.org/wiki/Extension:ParserFunctions indicates it's bundled in https://www.mediawiki.org/wiki/Special:MyLanguage/MediaWiki_1.18#Bundled_extensions and above, and that I should wfLoadExtension( 'ParserFunctions' );
[02:06:28] however, after doing so, update.php complains: PHP Fatal error: Uncaught exception 'Exception' with message '/var/www/html/extensions/ParserFunctions/extension.json does not exist!' in /var/www/html/includes/registration/ExtensionRegistry.php:106
[02:06:43] and it's true that even /var/www/html/extensions/ParserFunctions isn't there
[02:18:43] same is true for ConfirmEdit: the wiki page suggests it's already installed, but I seem to need to do a manual checkout.
[02:24:26] cariaso: extensions are only bundled if you download the tarball. They aren't contained in the git repo
[02:30:12] ah, thx
[02:51:52] hey, is anyone around?
[03:02:28] !ask | ilera
[03:02:28] ilera: Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply. Don't ask for help or for attention before actually asking your question, that's just a waste of time – both yours and everybody else's. :)
[03:03:23] is it possible to export a specific version of a page?
[04:18:00] ilera: you mean create an XML dump of one revision of one page?
[04:19:14] more a past revision of a page
[04:19:38] like, getting {{documentation}} (spelling?) from wikipedia before it was using lua
[04:20:14] If you want XML for an import, the only way I know is the maintenance script with start and end revision
[04:20:21] Otherwise copy-paste, but still credit
[04:21:11] gah =/ cause I know templates like {{ambox}} use a bunch of other templates too
[04:21:14] but they have now moved to lua
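To make the export discussion above concrete: a minimal sketch, assuming Python 3 with the requests library, of fetching the wikitext of one past revision through the action API by its oldid. The revision id below is a placeholder, and this returns raw wikitext rather than importer-ready XML, which is often enough to recover an old version of a template.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

# Placeholder oldid; in practice, take it from the page's history view.
params = {
    "action": "query",
    "prop": "revisions",
    "revids": 123456789,
    "rvprop": "content",
    "format": "json",
}

r = requests.get(API, params=params)
r.raise_for_status()

# With format=json (formatversion 1), the revision text lives under "*".
for page in r.json()["query"]["pages"].values():
    print(page["revisions"][0]["*"])
```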
[13:22:18] Wanted to ask: from a wiki administrator's point of view, are the benefits of using the latest development version (master) through git significant enough to clone MediaWiki and extension releases through git, or are the stable tarball releases good enough?
[13:22:22] are the updates significant enough to go through the extra hassle of cloning and continuously updating?
[13:25:21] That depends on your personal judgement of "significant".
[13:25:53] as unstable master versions might have more features, but also more bugs.
[13:45:02] if you don't have a compelling reason to track master, then stick to the stable releases
[14:24:07] thanks for the help
[14:26:37] hey, anyone here familiar with SMW?
[14:38:26] !ask | Benzi-Junior
[14:38:26] Benzi-Junior: Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply. Don't ask for help or for attention before actually asking your question, that's just a waste of time – both yours and everybody else's. :)
[14:40:33] morning
[14:59:28] Benzi-Junior: you're probably better off asking questions about SMW on the #semantic-mediawiki channel.
[15:06:03] Extension:SpamBlacklist problem - the standard Meta-Wiki blacklist is blocking URLs on its list, but the English Wikipedia's spam blacklist is not blocking the URLs. I am using the method documented here: https://www.mediawiki.org/wiki/Extension:SpamBlacklist#Examples
[15:13:28] anyone know why the English Wikipedia's spam blacklist might not be loading?
[16:13:41] Hello. I am trying to use the MediaWiki API with action=query to retrieve some data from Wikipedia articles. For this purpose, I use the Python requests library. In order to see the rest of the results, I have to pass the value of the continue parameter to my next query. The problem is: this parameter is itself a dictionary with the following values: "continue":{"clcontinue": "736|American_Zionists", "continue": "||"}.
[16:14:40] Does the requests library encode this multidimensional dictionary in the URL by itself, or do I have to encode it manually in my code?
[16:21:22] Let me ask my question in another way: how do you usually pass that 2-dimensional continue parameter into a URL?
[16:21:46] Does passing the clcontinue not work?
[16:26:05] Do you mean passing it alone, not enclosed in the continue parameter?
[16:27:15] Guest3006: i think the API expects you to pass &clcontinue=736|American_Zionists&continue=||
[16:27:41] Guest73865: ^
[16:27:46] Guest73865: see also https://www.mediawiki.org/wiki/API:Query#Continuing_queries
[16:28:32] huh, that actually has an example using 'requests'. :D
[16:31:33] MatmaRex: Thank you so much for your response. Please give me 5 minutes to try it.
[16:55:07] MatmaRex: It works perfectly. Thank you so much.
[16:55:44] yay :)
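The continuation pattern MatmaRex points to works out to a short loop: merge the entire continue object from each response back into the next request's parameters, and let requests do the URL encoding of each key. A sketch of that loop, assuming Python 3; prop=categories is what produces the clcontinue key seen above, and the example title is an assumption.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "prop": "categories",
    "titles": "Albert Einstein",  # example title, page id 736 on enwiki
    "format": "json",
}

while True:
    data = requests.get(API, params=params).json()
    for page in data["query"]["pages"].values():
        for cat in page.get("categories", []):
            print(cat["title"])
    if "continue" not in data:
        break
    # e.g. {"clcontinue": "736|American_Zionists", "continue": "||"};
    # merging it back in sets both keys for the next request.
    params.update(data["continue"])
```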
[19:10:13] English language WUaS in MediaWiki - http://worlduniversityandschool.org/WUaS_En_Wiki - and German language WUaS in MediaWiki - http://worlduniversityandschool.org/WUaS_De_Wiki - are unavailable as of this morning with this error message: "Sorry, sorry this site is temporarily unavailable" - from here: http://worlduniversityandschool.org/. (Ryan K and Jan Z helped install these in MediaWiki at the SF WMF Developers' Summit over Jan
[19:10:14] 4, 5, 6). Can someone please help get these working again? Thank you.
[19:15:08] Scott_WUaS: Well, http://worlduniversityandschool.org/ is unavailable in general. Who maintains that page?
[19:15:28] Only the maintainer can tell what happened.
[19:15:36] Also see https://www.mediawiki.org/wiki/How_to_debug
[19:17:23] Thanks, Andre … I maintain it, and it's been up until this morning.
[19:17:52] Scott_WUaS, see https://www.mediawiki.org/wiki/How_to_debug
[19:18:06] Scott_WUaS, but the problem might already be on a webserver level.
[19:18:10] means: hosting provider
[19:18:46] WUaS received a similar error message from our host, and Wikidatan/Wikimedian JanZ read this message from WUaS's host and said he didn't agree with the error message.
[19:19:02] This was in January …
[19:20:05] I don't know how that is related to the current problem?
[19:20:07] am checking out https://www.mediawiki.org/wiki/How_to_debug
[19:59:47] der ?
[21:13:50] hello
[21:14:06] where can I report typos in PWB's translation?
[21:14:18] PyWikiBot's, I mean
[21:14:40] does PWB use translations from translatewiki.net?
[21:15:06] no, I think
[21:15:20] in scripts/i18n/
[21:15:37] but I'm too lazy to use git
[21:15:53] there's https://translatewiki.net/wiki/Translating:Pywikibot
[21:16:05] https://translatewiki.net/wiki/Special:Translate?group=out-pywikipedia-0-all
[21:16:37] oh
[21:16:40] thank you
[22:01:03] Hi. I am trying to request a number of Wikipedia pages using the MediaWiki API, but unfortunately some of these pages have accented Latin characters and it does not return any result. E.g., https://en.wikipedia.org/wiki/Jacques_Dr%C3%A8ze Would you please help me with this issue?
[22:02:05] Guest61622: use an API client that can do proper UTF-8 encoding
[22:03:51] Would you please give me more information about the proper encoding?
[22:05:37] Python, for example, can handle unicode strings; maybe the library you're using can't... believe it or not, in 2016 there is still software that can't handle unicode strings properly
[22:06:11] there are probably people who don't know there are more languages than English, or don't care...
[22:08:31] I am using Python, but what kind of encoding does MediaWiki accept? Sorry if I am not familiar with it.
[22:09:13] unicode/UTF-8; maybe the problem is the version of Python you're using, or the way you're constructing API URLs... without details there's not much we can do
[23:53:19] hi, anyone know why the VisualEditor git branch for version 1.26 is missing the lib/ve/src dir?
[23:54:00] #mediawiki-visualeditor might.
[23:54:21] ponyofdeath: It's a submodule. You have to do `git submodule update` to check it out.
[23:55:43] git submodule update --init
[23:55:47] ok, thanks!
[23:55:55] guess the instructions missed that on the extension page
[23:56:14] at least for the 1.26 branch
[23:56:28] No
[23:56:29] ?
[23:56:50] "The git submodule update --init command is vital, as MediaWiki-VisualEditor needs the core VisualEditor submodule to work. If you do not use this command, VisualEditor will fail to work."
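On the earlier Unicode question (22:01 onward): with requests, passing the title as an ordinary string through the params dict, instead of concatenating it into the URL by hand, gets correct percent-encoding for free. A minimal sketch, assuming Python 3.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

# The non-ASCII title from the discussion; requests encodes the
# è as %C3%A8 when building the query string.
params = {
    "action": "query",
    "titles": "Jacques Drèze",
    "prop": "info",
    "format": "json",
}

r = requests.get(API, params=params)
print(r.url)                       # shows the percent-encoded URL
print(r.json()["query"]["pages"])  # page info, not a "missing" result
```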