[00:07:10] zol_mike: composer is getting to be that tool, I think [00:08:30] I saw that and it seemed to be the only thing still alive and being developed. I was hoping somebody had another idea. [00:10:39] not that i know of. the good thing about composer is that it's good for managing extensions, but it's also good for extensions to manage their own dependencies at the same time [00:10:56] https://github.com/mooeypoo/MediaWiki-ExtensionStatus [02:11:12] Hi everyone [02:11:40] HELLO??? [02:11:51] IS ANYONE THERE???? [02:13:09] HELLO DEONTRAYISAWSOME, I THINK YOU'VE LEFT CAPSLOCK ON!!!!! [02:13:12] ;-) [02:16:43] So funny [02:17:56] well, it's friday :) [02:24:20] fyi, I've written https://ofswiki.org/wiki/Mediawiki_Setup_Guide, with automation in mind, and some recommendations, meant to supplement mediawiki.org [02:42:45] how fun! "[b43cd35e] 2014-08-01 02:41:06: Fatal exception of type PasswordError" on mw.org, yet can log in fine on other WMF wikis. [02:46:33] Instead, it's sDrewth's fault. [02:54:28] Nemo_bis: afaik, there is currently a wiki-based non-standardized lua transliteration module. [02:58:22] Hello Everyone! [02:59:19] Why does that sound like the doctor from the simpsons? [02:59:42] Ramallah is being invaded by the Zionist entitu [02:59:49] *entity [03:01:33] * sDrewth takes full blame, and then onwardly blames the weather [03:01:50] Their missiles have hit [03:08:01] In the Name of God, the beneficent, the Merciful, all praise be to Allah, the lord of the Universe [03:08:22] O God hasten the arrival of the Mahdi and make us his followers [03:08:36] and bless those who attest of his rightfulness. [03:09:07] Muhammad: Did you have a question about MediaWiki? [03:09:38] Carmela, I am getting reports the Palestinian settlement of Ramallah has been besieged [03:10:21] Sounds like you may need to install the ParserFunctions extension. [03:11:13] ParserFunctions [03:11:15] ? [03:11:57] Don't worry about it. 
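An aside on the Composer discussion at the top of the log (00:07 to 00:10): managing an extension through Composer typically means declaring it in the wiki's root composer.json. A minimal sketch, assuming the extension publishes a Composer package; the package name and version constraint below are only examples (Semantic MediaWiki was one of the first extensions installable this way):

```json
{
    "require": {
        "mediawiki/semantic-media-wiki": "~2.0"
    }
}
```

Running `composer update` from the wiki root then fetches the extension together with its own dependencies, which is the dual benefit described above.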
[03:12:45] Have the Zionists made an official statement regarding Nabluz and Ramallah? [03:13:15] Not yet. [03:14:07] sDrewth: Actually, I blame the tor exception. [03:15:02] At least 60 children were killed in Nabluz. [03:15:26] Hamas has vowed to recapture Jerusalem at all cost. [03:16:07] They can't afford Jerusalem. [03:17:36] Carmela, what makes you say that? [03:18:03] They have 14 year old boys fighting on their side. [03:18:07] Because the Israelis have a lot more money and artillery. [03:18:28] Right, 14-year-old boys really aren't a match for the Israeli army. [03:18:54] They are asking help from the Islamic Republic of Iran and Hezbollah. [03:19:09] And yet. [03:21:54] Carmela, Hamas will carry out a chemical siege on Jerusalem [03:28:10] I doubt it. [03:30:39] yeah chemical is quite expensive and Hamas just doesn't have the money.. [03:32:49] comets, they do [03:33:00] Iran is funding them [03:33:31] if they did, they would be firing actual rockets at jerusalem, not throwing rocks.. [03:36:45] comets, those are civilians [03:36:57] they need to upgrade [[Special:Nuke]] .. [03:37:23] Iran supplies them weapons [03:40:05] if Iran did supply them weapons, the death toll in Israel would at least be in double digits.. [03:40:35] comets, don't underestimate Iran [03:40:46] i don't .. [03:41:00] Iranians have the ability to wipe out several villages [03:41:18] they prefer to wipe out their OWN villages.. [03:41:57] Rouhani extends his hand on the issue [03:43:35] Hamas would have more luck if he extended his bank book instead... [04:47:03] Hi folks. Is there any way to perform a query where you can check multiple columns for one value? E.g., pull up all results where 'tk1', 'tk2' and 'tk3' = 'pi' [04:48:27] And by that I mean pull up any result that has any of those equaling 'pi'. [04:49:32] I've tried using an array: array('tk1','tk2','tk3') => 'pi', but that's 'illegal'. [04:50:55] CarHop: WHERE foo = bar AND bing = bar AND baz = bar [04:51:00] That should work, no? 
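The multi-column question above is ultimately about OR, not AND (as the follow-up clarifies). In MediaWiki's database abstraction, an array of conditions is implicitly joined with AND, but a disjunction can be built explicitly with makeList() and the LIST_OR constant. A hedged sketch against the 1.23-era API; the table name is a stand-in, the column names come from the question:

```php
// Sketch: match rows where any of tk1/tk2/tk3 equals 'pi'.
// An array of conditions is ANDed, so build the OR clause
// explicitly and pass it in as a single raw condition.
$dbr = wfGetDB( DB_SLAVE );
$orConds = $dbr->makeList( array(
	'tk1' => 'pi',
	'tk2' => 'pi',
	'tk3' => 'pi',
), LIST_OR ); // yields: tk1 = 'pi' OR tk2 = 'pi' OR tk3 = 'pi'
$res = $dbr->select(
	'some_table',              // hypothetical table name
	array( 'tk1', 'tk2', 'tk3' ),
	array( $orConds ),         // raw string condition
	__METHOD__
);
```

Conversely, `'tk1' => array( 'pi', 'tau' )` produces an IN list on a single column, which is the opposite of what was asked here.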
[04:52:31] Did I say and? I meant or (Sorry). As in, it pulls up every result where (In the above case) foo, bing or baz = bar. The issue is I'm not sure how to do that with MediaWiki's querying system. [05:02:55] Is there no way to swap 'AND' for 'OR'? [05:03:16] WHERE 'foo' IN (bar, baz, bing) [05:03:18] That might work? [05:04:04] Nope :/ [05:06:09] In MediaWiki: 'bar' => foo' 'baz' => 'foo' = WHERE 'bar' = 'foo' AND 'baz' = 'foo' [05:06:09] OR is supported in SQL. [05:06:38] *In MediaWiki: 'bar' => 'foo', 'baz' => 'foo' = WHERE 'bar' = 'foo' AND 'baz' = 'foo' [05:06:47] A comma (,) = AND [05:06:57] So how do I get OR? [05:10:59] Carmela: I'm writing my queries MediaWiki-style, not traditional style. [09:05:18] Hi, I am trying to make a link to an external site in the footer. I looked at https://www.mediawiki.org/wiki/Manual:Footer and added some links, but I can't add a link to an external site :( I can see that the poor guy here couldn't get an answer to his question for 3 years now https://www.mediawiki.org/wiki/Manual_talk:Footer#External_links.3F He posted in a few places, but no answer anywhere. [09:07:18] !wg CopyRightIcons | natbrown [09:07:18] natbrown: https://www.mediawiki.org/wiki/Manual:%24wgCopyRightIcons [09:07:20] I tried #REDIRECT etc... This looks ugly :( http://wikitranslate.org/wiki/MediaWiki:Poweredbyfgpage I know that I don't know a lot (php, mysql, ... list is too long... ) Please help!!! [09:07:21] covers it iirc [09:08:20] Nope, can't use the icons. I had them before; since I changed the skin, I can't use them. [09:09:35] p858snake: There is no text on the page at all :( https://www.mediawiki.org/wiki/Manual:%24wgCopyRightIcons [09:09:39] natbrown: https://github.com/wikimedia/mediawiki-extensions-WikimediaMessages/blob/master/WikimediaMessages.php#L98 that's what Wikimedia sites use to add a "Developers" link in the footer [09:10:33] legoktm: Thank you! I'll try it now! 
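The WikimediaMessages code legoktm linked adds its footer link from a skin hook. A hedged sketch of the same pattern for a 1.23-era LocalSettings.php; the link key, URL, and link text below are examples, not anything from the log:

```php
// Add an external link to the footer of every page.
// 'places' is the footer-link group holding "Privacy policy" etc.
$wgHooks['SkinTemplateOutputPageBeforeExec'][] = function ( &$skin, &$tpl ) {
	$tpl->set(
		'examplefooterlink', // hypothetical key
		Html::element( 'a', array( 'href' => 'https://example.org/' ), 'Example site' )
	);
	$tpl->data['footerlinks']['places'][] = 'examplefooterlink';
	return true;
};
```

For icon-style links, $wgFooterIcons (mentioned just below) is the other supported route.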
[09:21:10] !wg FooterIcons | natbrown [09:21:10] natbrown: https://www.mediawiki.org/wiki/Manual:%24wgFooterIcons [09:21:17] is the correct link I was referring to [09:25:21] p858snake: Thanks, I should try it without the image, don't know how to do it... [10:01:09] hi [10:01:28] i need some help here? anyone wanna help me? [10:02:28] !ask | drsalmanshah165 [10:02:28] drsalmanshah165: Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply. Don't ask for help or for attention before actually asking your question, that's just a waste of time – both yours and everybody else's. :) [10:02:30] wikipedia gives me an error of "book failed rendering" on creating a .zim file of 3000 articles [10:02:49] Have you tried doing a smaller one? [10:02:52] ie of 1000? [10:03:11] i can download 1053 articles [10:03:47] what can i do now? [10:04:25] what's the problem? book fetching gets stuck at 77% and then shows me a "book rendering failed" message [10:06:16] Reedy do you have any idea about this? [10:06:28] how do I make my user sysop? [10:07:24] what's the problem? book fetching gets stuck at 77% and then shows me a "book rendering failed" message [10:09:45] !createandpromote | pwca [10:09:45] pwca: To recreate the admin user, run "php maintenance/createAndPromote.php" on the command line. [10:10:00] which command line? [10:10:07] the one on the web server [10:10:14] !shell [10:10:15] Shell access (that is, SSH access, see http://en.wikipedia.org/wiki/SSH) is highly recommended if you want to run MediaWiki. You can install without it, and basic operation will work, but even creating backups or upgrading will be painful without it. Some more involved maintenance tasks may even be impossible. MediaWiki is not designed for minimal environments. [10:10:20] oh god. [10:10:26] I'm not a hacker? 
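A hedged sketch of the createAndPromote.php invocation the bot refers to, run from the MediaWiki installation directory on the web server. The username and password are placeholders; the flags are as of the 1.23-era script:

```shell
# Create (or promote) a user and grant sysop + bureaucrat rights in one go.
php maintenance/createAndPromote.php --sysop --bureaucrat WikiSysop 'a-strong-password'
```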
[10:11:09] pwca: generally, any user with "bureaucrat" rights can make any other user an admin using the web interface [10:11:12] it's just a few clicks [10:11:23] I don't have any bureaucrat users. [10:11:47] pwca: exactly. so since you don't have a user that has the right to make admins, what can you do? [10:11:56] by design and definition, you cannot change that via the web interface. [10:12:13] if you could, that would be a major security problem, right? [10:12:25] so, you have to do it via the command line, or directly in the database [10:12:39] hm... [10:13:05] doesn't the user created during setup have bureaucrat rights? it would be somewhat silly if they didn't... [10:13:28] this hacking stuff works well. [10:13:39] :P [10:15:37] so bureaucrat should be a more restricted category than administrator in general? [10:15:37] anyone solve my problem? [10:19:47] is there some similar way to delete users, DanielK_WMDE? [10:20:04] everything I see suggests going into the database and doing it manually. [10:20:09] I worry that might break something. [10:21:35] pwca: you should not delete users. users are referenced from page histories and logs. deleting them breaks referential integrity. [10:21:43] why would you delete users? what's the point? [10:22:13] the closest that can be done relatively safely is a) renaming users and b) merging user accounts. [10:22:23] i think there is an extension called MergeAndRename or some such [10:22:41] it's a private sandbox user and the user I want to delete was created automatically and isn't being used. [10:22:53] private sandbox wiki, I mean. [10:23:02] pwca: and yes, bureaucrats should be rare. essentially, bureaucrats can make admins. admins can delete, protect, and block. [10:23:59] maybe I should merge it to my account? [10:24:07] pwca: so what? just let it sit there. on any active wiki i have seen, there are more unused accounts than active ones. many with random or bogus names, insults, what have you. 
[10:24:17] who cares? it's not like they are visible unless you look for them [10:24:43] pwca: you could try that, yea. but as I said: who cares? just let it sit there and ignore it. [10:25:01] it's like a rotten tooth. [10:25:26] it's more like a single sock at the back of the drawer [10:25:37] it's a wiki. it will never be complete or clean. [10:25:39] get used to it. [10:28:16] hm, ok. [10:28:38] at least I want to strip it of sysop rights. [10:55:09] pwca: once you have a bureaucrat, that should be simple enough [10:55:58] use the Special:UserRights page to set a user's group membership. which groups have which rights can be changed in your LocalSettings [10:56:02] !access | pwca [10:56:02] pwca: For information on customizing user access, see . For common examples of restricting access using both rights and extensions, see . [10:56:25] p858snake: Thanks! I've added the links without the image. I have added some links to the bottom of the pages with https://www.mediawiki.org/wiki/Extension:FooterManager I wanted them to be on a different line. [10:58:07] legoktm: It was a bit difficult for me since I couldn't understand where to add the external link. I have done it the way suggested by p858snake. [11:59:32] hi [12:02:22] I have been using MediaWiki for the last month.. I want to know how you can replace the editor content into a structured format with different aspects like [[ ]] , * space etc ? [12:04:19] hi, anybody there ? [12:05:08] hi [12:05:15] kunalg [12:05:30] Vijayakumar: Hi [12:05:39] I have been using MediaWiki for the last month.. I want to know how you can replace the editor content into a structured format with different aspects like [[ ]] , * space etc ? [12:07:30] Vijayakumar: You can check out wiki markup here https://www.mediawiki.org/wiki/Help:Formatting [12:08:24] And if you aren't interested in doing it manually, have a look at VisualEditor https://www.mediawiki.org/wiki/VisualEditor [12:09:58] Vijayakumar: Does this solve your question? 
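To expand on "which groups have which rights can be changed in your LocalSettings" (10:55): group rights live in the $wgGroupPermissions array. A minimal LocalSettings.php sketch; the group and right names are standard ones except 'reviewer', which is a made-up custom group for illustration:

```php
// LocalSettings.php: grant or revoke rights per group.
$wgGroupPermissions['bureaucrat']['userrights'] = true;  // may use Special:UserRights
$wgGroupPermissions['sysop']['delete']          = true;  // admins can delete pages
$wgGroupPermissions['*']['edit']                = false; // anonymous users cannot edit
$wgGroupPermissions['reviewer']['patrol']       = true;  // hypothetical custom group
```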
[12:10:25] @kunalg no [12:11:21] Vijayakumar: So, can you rephrase it? [12:12:49] I would like to contribute to the wiki. so I just want to know: if I type [[vijayakumar]], how should it be converted to a link in the view page.. just replace the brackets and give a link, or something else [12:12:49] ? [12:18:16] Vijayakumar: Well, if you do, http://en.wikipedia.org/wiki/Special:Search?go=Go&search=vijaykumar it will be a link to wiki/index.php/vijaykumar if it exists. [12:19:09] Vijayakumar: If not, it will be a redirect to creating the page [12:21:22] Vijayakumar: I meant [[ vijaykumar ]] links to wiki/index.php /vijaykumar [12:21:37] Vijayakumar: I really don't know how to escape in IRC [12:41:56] thanks [12:42:13] hello, I have some trouble with the recent changes in my private wiki. It displays only the account creations, and nothing about new pages or page modifications. [12:48:45] statement: I just #commented the $wgGroupPermissions lines to see, and it works. But I would like to get the recent changes on the main page when set to private. Is that configurable? [12:51:57] !access [12:51:57] For information on customizing user access, see . For common examples of restricting access using both rights and extensions, see . [13:00:54] thanks Carmela, but it doesn't solve the problem. When the wiki is set to publicly readable, the recent changes are displayed, and the recent changes aren't when it is set to private. but I would like to get these recent changes and have the wiki set as private at the same time. I haven't seen anything that allows me that in the documentation. [13:03:42] furthermore, refreshing a page doesn't make a file that was broken appear, even when it was added in the meantime. [13:08:06] sorry, I think I've found why... :D [13:08:19] I was registered as a bot. [13:11:27] indeed, that was the cause [13:11:45] now it works. hard to be a newbie. [13:12:17] finally the links helped, thanks Carmela [13:17:40] monpauvrelieu: how did you accidentally become a bot?... 
[13:27:04] I created a separate account from the main one, for administration only; I changed my rights to administrator and bureaucrat, but I checked the bot flag at the same time. [15:02:14] Hey could someone help me, is there a way to tell if some page is using a specific extension? [15:02:37] Hi zol_mike! Do you mean a parser extension, or something else? [15:04:13] Sorry I'm rather new, we have this old extension called collist.php and I want to get rid of it, but I can't disable it and check all the pages by hand to make sure nothing is messed up. Is there a parser function out there for something like this? [15:04:52] I'm basically trying to clean up my extensions so I can switch over to Composer to manage extensions from here on out [15:08:08] Is it possible to have 2 MediaWiki installs in two separate datacenters that replicate all data so they are basically identical? I'm thinking a master-master database replication and perhaps an hourly rsync or remote NFS for the images. [15:09:18] kaotic: do you want edits made via both DCs at the same time to be allowed? [15:09:26] yes [15:09:44] not sure that's supported, really [15:09:52] I have 2 NOCs; we want a wiki hosted in both locations and the data to be replicated. [15:10:00] zol_mike_: no, there's nothing in core able to track parser function usage on pages [15:10:30] zol_mike_: best you can do is export an XML dump of the content (just the last revision of all pages) and manually search on it [15:10:41] marktraceur: ^ am I off base? [15:11:01] Alright thanks Vulpix [15:11:23] * marktraceur looks [15:12:00] marktraceur: the question here probably can be as easy as, "does WMF do this?" :P [15:12:11] I'm pretty sure we don't do Master/Master [15:12:17] But...I'm not the person to ask [15:12:36] Vulpix: we don't [15:12:55] Master/Master can cause conflicts in replication, right? 
[15:13:11] the app needs to be aware of the DCs, which I'm pretty sure MW isn't [15:13:49] Vulpix: Yeah, because race conditions will happen; all those "edit conflict" screens become basically impossible :) [15:13:49] kaotic: the good news is: we'll be adding this support Real Soon Now (TM) when we bring a new DC online [15:14:19] zol_mike_: You might want to look up mwgrep, which the Parsoid team has used to great effect [15:14:26] It's a fast way to grep for things over a dump. [15:14:27] Nice [15:14:39] kaotic: well, that would probably be chaotic after all (pun! :P ) [15:14:52] * marktraceur muzzles Vulpix [15:14:55] I'll explore another option. Basically the NOC managers are looking for a way for both locations to have a local wiki of the same data [15:14:56] Sorry about him [15:15:13] I don't see the need for a local wiki myself [15:15:28] Hrm. [15:15:49] kaotic: your first thoughts would be fine if you don't care about occasional data loss ;) [15:15:50] kaotic: Maybe you could have the two wikis both point at a Wikibase instance, or something equally convoluted [15:16:09] !wikibase [15:16:12] Too easy [15:16:14] !wikidata [15:16:14] https://meta.wikimedia.org/wiki/Wikidata [15:16:49] hmmm that's an option, thanks! [15:19:09] I just got clarification... they were looking for an easy way to get redundancy should one fail... wish they would have just asked me about that rather than come up with a possible solution themselves [15:19:12] lol [15:19:24] Master-slave will work for what they are trying to achieve. [15:19:37] kaotic: Time to point them at http://catb.org/~esr/faqs/smart-questions.html [15:21:00] kaotic: yep [15:21:24] kaotic: in the Grand Glorious Future you can do something way more complex and convoluted, but for now, yeah, that's easy :) [15:21:57] !xy [15:21:57] The XY problem is asking about your attempted *solution* rather than your *actual problem*. 
http://meta.stackoverflow.com/a/66378 [15:22:04] greg-g, in the future we will look at load balancing across multiple machines but they think they are going to put way more load on the servers than they really are. [15:22:21] kaotic: /me nods [15:22:24] (Are you sure it can handle 20-30 users!!!) [15:22:28] lol [15:22:32] yes... yes I am. [15:24:31] !smart [15:24:31] "It's better to be a smart ass than a dumb ass, but at the end of it all, you're still an ass." --- some smart ass. [15:24:37] Hm, no. [15:24:40] !smartqs [15:24:48] !smartqs is http://catb.org/~esr/faqs/smart-questions.html [15:24:48] Key was added [15:24:54] I'm sure I'll remember that. [15:25:22] Hi! Is anyone able to diagnose why the CSS on my wiki is failing to be compiled?: https://freestatechronicles.com/index.php?title=Main_Page&useskin=vector [15:25:26] https://freestatechronicles.com/load.php?debug=true&lang=en&modules=skins.vector.styles&only=styles&skin=vector&* [15:25:29] can anybody give me some help on fixing this code for 1.23 [15:25:31] http://pastie.org/9436906 [15:25:56] $wgArticle went bye-bye and the page about it doesn't say anything about how to replace it in a SkinTemplate [15:26:11] orion: Are you making modifications to the LESS styles of your wiki? [15:26:21] marktraceur: or you'll end up sending a !smart instead of !smartqs when needed :P [15:26:30] orion: Better question: What was the last thing you did before it borked? [15:27:57] QuasAtWork`: https://www.mediawiki.org/wiki/Manual:$wgArticle#Replacement [15:28:05] QuasAtWork`: SkinTemplate is a ContextSource. [15:28:50] ah. [15:29:35] marktraceur: The last thing I did was upgrade from 1.22.8 to 1.23.2 [15:29:41] QuasAtWork`: The doxygen inheritance diagrams are your friends :) [15:29:57] MatmaRex: are you around? [15:30:08] yeah I juuust opened that up [15:30:08] orion: OK. Did you modify any of the files in the wiki, or were they all stock? 
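On the master/slave conclusion reached just above (15:19): MediaWiki supports one master plus weighted read slaves out of the box via $wgDBservers, which fits the redundancy goal without the conflicts of master-master. A hedged LocalSettings.php sketch; the hostnames are placeholders:

```php
// First entry is the master; writes always go to it, and 'load' 0
// keeps reads off it. Remaining entries are read slaves weighted
// by their 'load' values.
$wgDBservers = array(
	array(
		'host'     => 'db-master.example.org', // placeholder
		'dbname'   => $wgDBname,
		'user'     => $wgDBuser,
		'password' => $wgDBpassword,
		'type'     => 'mysql',
		'load'     => 0,
	),
	array(
		'host'     => 'db-slave.example.org', // placeholder
		'dbname'   => $wgDBname,
		'user'     => $wgDBuser,
		'password' => $wgDBpassword,
		'type'     => 'mysql',
		'load'     => 1,
	),
);
```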
[15:30:16] ;) [15:30:17] sorta [15:30:56] MatmaRex: orion came here yesterday with that problem, I think it may be a LESS issue but I don't know how it's handled by MediaWiki, can you share some quick thoughts? [15:31:31] MatmaRex: https://freestatechronicles.com/load.php?debug=true&lang=en&modules=skins.vector.styles&only=styles&skin=vector&* [15:31:44] FWIW it looks like the last line has two braces, maybe by mistake. [15:31:46] * MatmaRex reads scrollback [15:32:05] marktraceur: They are all stock. [15:32:27] orion: You had trouble with the patch too, right? There not being a tests directory or something? [15:33:03] I didn't use the patch, I extracted a fresh tarball and copied my settings/images/extensions over. [15:33:07] Oh, K [15:33:14] That was someone else, sorry :) [15:33:52] orion: is it a new install, or an upgrade from an older version (prior to 1.23.x)? [15:34:03] orion: huh. [15:34:28] ah, I see, from 1.22.8 [15:34:47] never mind [15:35:30] orion: it looks like it's not loading all of the files for this request… can you verify that all of the vector skin files are actually there, and are readable? [15:35:57] for comparison, look at e.g. https://bits.wikimedia.org/en.wikipedia.org/load.php?debug=true&lang=en&modules=skins.vector.styles&only=styles&skin=vector&* [15:36:28] the contents of https://freestatechronicles.com/load.php?debug=true&lang=en&modules=skins.vector.styles&only=styles&skin=vector&* are there too, at the very end [15:38:05] https://freestatechronicles.com/skins/vector/screen.less redirects to https://freestatechronicles.com/w/Skins/vector/screen.less [15:38:09] Is that not right? [15:39:20] well, that setup may be problematic [15:39:21] IOW, should the entirety of the skins directory be accessible directly? [15:39:27] hmmm [15:39:33] yes [15:39:36] but shouldn't be a problem for server-side CSS concatenation/minification [15:39:42] but also this ^ [15:41:34] orion: do you have ./resources/src/mediawiki.ui/vector.less on your server? 
[15:42:11] SHA256 (./resources/src/mediawiki.ui/vector.less) = 546e6722d78466f0ee16b04d9f6a3882ae1e9b9c20a5b29101a2a23d277dbc37 [15:43:16] yep, that's correct [15:43:38] and does it have the same permissions as other files on that folder? [15:44:14] Yes, 644 [15:45:16] I guess the lessc.inc.php doesn't depend on any external library to run [15:47:37] ok it's not working. [15:48:13] Should I file a bug report? [15:48:14] i wonder how we could debug this [15:49:39] MatmaRex: Whatever changes you want me to make I'm happy to make them. [15:51:06] call to undefined method MonacoTemplate::getContext [15:51:07] orion: you have shell access, right? we could try poking some things [15:51:26] run `php maintenance/eval.php` [15:51:29] I have root access too. [15:51:58] and in it, run: var_dump( $wgOut->getResourceLoader()->getModule('skins.vector.styles') ) [15:52:31] just to verify that both files that should be listed there are actually listed [15:53:58] http://lpaste.net/7562032812216287232 [15:54:29] yeah, that's sane [15:54:36] and now let's try to have it generate the CSS: [15:55:25] $ctx = new ResourceLoaderContext( $wgOut->getResourceLoader(), $wgRequest ); [15:55:31] oh it's because this is a QuickTemplate... [15:55:44] var_dump( $wgOut->getResourceLoader()->getModule('skins.vector.styles')->getStyles( $ctx ) ); [15:56:29] http://lpaste.net/7161040650983440384 [15:56:49] huh. [15:57:49] indeed the styles for 'screen' are missing… [15:57:51] got it ;) [15:57:53] what about var_dump( $wgOut->getResourceLoader()->getModule('skins.vector.styles')->getStyleFiles( $ctx ) ); ? [15:57:57] just had to do getSkin() [15:59:11] (ah sorry, that won't work, nevermind) [15:59:28] ughhhhh, stupid protected methods [15:59:51] orion: also, have you already checked the debug logs for things like "style file not found"? [16:01:29] MatmaRex: What is the location of the log? 
[16:01:42] !debuglog [16:01:42] There is no such key, you probably want to try: !errors, !sqllog, [16:01:45] !debug [16:01:45] For information on debugging (including viewing errors), see http://www.mediawiki.org/wiki/Manual:How_to_debug . A list of related configuration variables is at https://www.mediawiki.org/wiki/Manual:Configuration_settings#Debug.2Flogging [16:01:57] you might need to enable it [16:02:16] https://www.mediawiki.org/wiki/Manual:$wgDebugLogFile [16:04:03] (the fix if anybody was curious -> https://github.com/haleyjd/monaco-port/commit/ac44db9c8682129d46cd16ab35acf9408407d3db ; feel free to criticize or berate) [16:04:55] MatmaRex: E486: Pattern not found: style [16:05:11] MatmaRex: (I searched the log file and couldn't find the word `style' anywhere) [16:05:29] Request ended normally [16:05:33] orion: are there any [resourceloader] entries? [16:06:03] no [16:06:23] hmm. [16:06:34] well, let's try some more magic in eval.php then… [16:10:22] right. do this: [16:11:03] $ctx = new ResourceLoaderContext( $wgOut->getResourceLoader(), $wgRequest ); [16:11:03] $module = $wgOut->getResourceLoader()->getModule('skins.vector.styles') [16:11:03] $method = new ReflectionMethod($module, 'readStyleFile') [16:11:03] $method->setAccessible(true) [16:11:03] var_dump( $method->invoke($module, 'vector/screen.less', false ) ) [16:12:02] > var_dump( $method->invoke($module, 'vector/screen.less', false ) ) [16:12:02] NULL [16:12:28] huh. [16:12:48] that's weirder than i expected [16:13:30] looks like some preg calls deep inside are failing [16:13:36] orion: run var_dump(preg_last_error())? [16:13:56] int(2) [16:14:14] i wonder which one it is [16:14:15] http://pl1.php.net/manual/en/pcre.constants.php [16:14:50] PREG_BACKTRACK_LIMIT_ERROR [16:15:04] yeah [16:15:04] see also http://php.net/manual/en/pcre.configuration.php#ini.pcre.backtrack-limit [16:15:49] (on my installation, PREG_SET_ORDER and PREG_SPLIT_DELIM_CAPTURE are also 2. 
silly php) [16:16:03] orion: but that's probably the backtrack limit. try raising it [16:16:29] do you know if you are using some non-default settings? i don't think i've seen anyone run into this before [16:16:52] (or at least, not in this particular context) [16:17:18] orion: well, what's your actual backtrack_limit ? [16:17:30] 10,000 [16:17:48] I've never heard of this limit before. [16:17:57] The default is 1,000,000 [16:18:05] according to php.net anyway [16:18:16] My php.ini file must be old [16:18:33] well, could be different depending on the package, I think? [16:18:34] backtracking limit defaults to 100000 for PHP < 5.3.7. [16:18:56] php was installed via freebsd ports [16:19:00] yeah, probably stupid package manager people setting stupid values… [16:19:52] All is well now. :) [16:19:55] Thank you everyone! [16:20:10] I literally would never have been able to figure this out on my own. [16:22:00] I'm gonna leave a note about this on https://www.mediawiki.org/wiki/Manual:Errors_and_symptoms [16:23:27] :) [16:26:39] Hi - this is kind of a tricky question: when a template is resaved, all the pages that call that template eventually get re-parsed, via jobs. Is there any way for a parser function within a page to know whether it's being called as a result of a template save or not? [16:27:15] orion, MatmaRex: I wonder if we should be throwing exceptions when preg errors like this [16:27:35] well, that would help debugging this at least [16:27:44] We really shouldn't have to investigate like that for when we hit configuration limits. [16:28:02] Parser functions are called with a $parser argument, and I could find one difference in $parser when it was called as a result of a template save - there was no value for $parser->mOptions->mUserLang->dateFormatStrings. However, I don't know if checking for that is a stable solution, going forward... 
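The failure mode diagnosed above can be reproduced in plain PHP: when pcre.backtrack_limit is exceeded, preg_match() returns false instead of throwing, and only preg_last_error() reveals why. A small self-contained demonstration; the pattern is a deliberately pathological one, not the regex from CSSMin::remap:

```php
<?php
// Artificially lower the limit, as an old php.ini (or the FreeBSD port
// default of 10000) effectively did for large stylesheets.
ini_set( 'pcre.backtrack_limit', '100' );

// A classic catastrophic-backtracking pattern against a non-matching
// subject pushes the engine past the limit almost immediately.
$result = preg_match( '/(a+)+$/', str_repeat( 'a', 1000 ) . 'b' );

var_dump( $result ); // bool(false), an error, not the 0 of "no match"
var_dump( preg_last_error() === PREG_BACKTRACK_LIMIT_ERROR ); // bool(true); this is the int(2) seen above
```

The fix orion applied was raising pcre.backtrack_limit in php.ini toward the modern default of 1000000.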
[16:28:32] Krenair: we'd have to add checks for preg_last_error() everywhere… [16:28:45] urgh :( [16:29:12] hmm [16:29:28] or maybe we could just check it at the end of request processing, and wfDebugLog() it? [16:29:45] i wonder what we do for other *_last_error functions [16:30:00] but it would work only if the *last* preg caused an error [16:31:04] better than nothing [16:31:55] ah [16:31:59] orion: what's your server's distro? [16:32:06] no, you're right, that'd be useless [16:32:21] "last" meant "last operation", not "last error" [16:34:08] preg_match() returns FALSE if an error occurred. We should probably check it on every call, instead of just discarding the return value [16:34:35] yeah [16:34:49] if it's false, we can check last_error etc. [16:37:33] this particular case was the preg_replace_callback() in CSSMin::remap [16:37:56] (which uses a horrible, horrible dynamically-built regex, by the way) [16:38:24] regexes? horrible? no! :p [16:38:34] i don't think it's practical to do such a check for every preg_* function call [16:38:53] Searching 3355 files for "preg_" [16:38:56] 1068 matches across 250 files [16:39:04] (that's on REL1_23) [16:39:48] but that particular one may be good to have that check, seeing it has failed here [16:40:12] yeah [16:40:32] these lines are from !g f3779e067fa6afaaf1de54f9e7cf2554b64261ad [16:40:35] !g f3779e067fa6afaaf1de54f9e7cf2554b64261ad [16:40:35] https://gerrit.wikimedia.org/r/#q,f3779e067fa6afaaf1de54f9e7cf2554b64261ad,n,z [16:41:13] no idea if the older version could cause this too [16:41:31] but we could add some helpful exceptions there and backport to 1.23 [16:47:04] well we upgraded to 1.23, and after working for ~ half an hour or so, our wiki is now a blank white page with no error messages and nothing echoed to the apache error log [16:47:28] !debugging [16:47:28] For information on debugging (including viewing errors), see http://www.mediawiki.org/wiki/Manual:How_to_debug . 
A list of related configuration variables is at https://www.mediawiki.org/wiki/Manual:Configuration_settings#Debug.2Flogging [16:47:34] !blankpage [16:47:34] A blank page or HTTP 500 error usually indicates a fatal PHP error. For information on debugging (including viewing errors), see . [16:47:38] :) [16:47:42] no HTTP 500 [16:47:47] HTTP 200 response. [16:47:59] > A blank page... [16:48:17] lemme try the error_reporting [16:48:20] I don't have that call in there [16:49:22] ok now I see some stuff. [16:49:23] thanks [17:26:09] Does anyone know how to add, say, an HTML wrapper to a subskin? [17:31:42] for storing data about log events, when would I use log_params, and when would I use the log_search table? [17:33:17] you can use both as well [17:33:58] (e.g. if people often don't filter on the information, then it's faster to not JOIN on log_search and just use the version of the info in log_params) [17:34:17] the latter is useful for filtering for live queries since it's indexed [17:34:21] AaronSchulz: there was a proposal to make log_params its own table. is that necessary, given that we have log_search? sorry, the log_search documentation page doesn't have a lot of info [17:34:33] for offline use (e.g. slow analytics) queries, then log_params is enough [17:35:04] RevisionDelete for example uses it to filter log events down to the revision ID (not just page) [17:36:16] hi Castile [17:36:26] Castile: you can email mediawiki-l and they might have answers [17:36:28] !lists | Castile [17:36:29] Castile: mediawiki-l and wikitech-l are the primary mailing lists for MediaWiki-related issues. See https://www.mediawiki.org/wiki/Mailing_lists for details. [17:54:20] anomie: when you are in London: http://www.ltmuseum.co.uk/ and http://www.clockmakers.org/museum-and-library/museum/ [17:54:43] Krenair: valhallasw`cloud Lydia_WMDE ^ [17:54:52] in fact every coder who is going to Wikimania [17:55:11] a clockmakers museum and a museum of London's subways/trains/etc. 
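A sketch of AaronSchulz's "you can use both" advice (17:33) in extension code: ManualLogEntry writes serialized data to log_params via setParameters(), and indexed key/value pairs to log_search via setRelations(). The log type, action, and parameter names below are invented for illustration:

```php
// Record a custom log event with both display data and filterable data.
$logEntry = new ManualLogEntry( 'mylog', 'myaction' ); // hypothetical type/action
$logEntry->setPerformer( $user ); // a User object
$logEntry->setTarget( $title );   // a Title object
// Serialized into log_params; cheap to read back, but not indexed.
$logEntry->setParameters( array( '4::revid' => $revId ) );
// Written to log_search; indexed, so live queries can filter on it.
$logEntry->setRelations( array( 'rev_id' => $revId ) );
$logEntry->insert();
```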
[17:56:46] Sumanah!!!!!!! [17:56:52] the London Transport Museum has, like, original Gill Sans stuff http://www.ltmcollection.org/museum/object/link.html?IXinv=1992/386 [17:57:03] valhallasw`cloud: !! yes! [17:57:11] I think I may have told you about the Transport Museum valhallasw`cloud [17:57:41] Yes! Thanks for the links [17:57:53] http://www.ltmcollection.org/museum/gallery/gallery_top.html?IXgallery=CGP.060 - "Frank Pick changed the face of London's public transport through good, progressive design. He saw this as essential to being 'fit for purpose'." [17:58:40] valhallasw`cloud: how was your flight? [18:03:17] I actually slept! [18:03:41] sumanah: who would I talk to about login issues related to mediawiki.org? [18:03:50] valhallasw`cloud: :D [18:03:58] Amgine: there are a couple bugs filed [18:04:14] Even though they put me in the seat with the least leg room of the entire plane :-P (just after a bulkhead) [18:04:39] valhallasw`cloud: wow. I'm glad you were able to get some sleep! It was lovely to get to hang out and host you. [18:04:52] And it was super fast - we left an hour late and we still arrived early [18:04:58] wow [18:05:10] :) it was great to visit, too [18:05:16] Amgine: speaking here is a reasonable step. greg-g might want to know [18:05:30] valhallasw`cloud: and now I know that Big Bird varies in color by country! [18:05:58] *grin* [18:06:01] Thanks. I suspect my issue is related to Tor use, but I haven't been able to log in to comment on the api roadmap. [18:06:16] [242b8202] 2014-08-01 18:02:56: Fatal exception of type PasswordError [18:08:37] Amgine: reported [18:08:51] https://bugzilla.wikimedia.org/show_bug.cgi?id=69007 [18:09:14] heh... thanks. Was working my way through a looooong list o' bugs. 
[18:11:24] https://www.mediawiki.org/wiki/Thread:Project:Current_issues/Can%27t_log_in +1 [18:11:52] Amgine: https://gerrit.wikimedia.org/r/#/c/151126/ should be better in a matter of minutes [18:24:38] thanks Nemo_bis Vulpix [18:27:18] * Nemo_bis never heard this proverb [18:27:37] hahaha [18:27:43] it shows up in "Y: The Last Man" [18:27:54] but not credited as Italian [18:30:45] <^d> We really shouldn't link to that page from the sidebar. [18:30:49] <^d> Amgine: ^ [18:32:19] I what, ^d? [18:32:34] <^d> [[Project:Current Issues]] [18:32:40] <^d> people always post bugs and so forth there. [18:32:46] <^d> because we link to it from mw.org's sidebar. [18:33:17] I was just chuckling about someone posting a can't-log-in bug there, whilst logged in of course. [18:34:06] They also post them to the support desk, ^d. https://www.mediawiki.org/w/index.php?diff=1084225 [18:34:44] <^d> More specifically. [18:34:49] <^d> The talk page of the support desk :p [18:34:49] on the *talk page* of the support desk! [18:35:01] [18:48:16] if I have a patch that fixes two bugs, how do I do the bug footer in the commit message? [18:49:30] leave an empty new line and then put: Bug: 00000 [18:49:39] well, the current bug number [18:49:50] It's in the commit message guidelines [18:50:09] https://www.mediawiki.org/wiki/Git/Commit_message_guidelines [18:50:16] leucosticte: You put two lines on the bottom. Bug: 12345 and Bug: 12346 on separate lines [18:50:35] leucosticte: also, go you for fixing bugs! [18:50:44] ah, 2 bugs in one patch! [18:50:51] RoanKattouw, sumanah: Thanks! [19:28:10] Hi guys. I can't get the internal links feature to work when trying to link to a particular revision from a talk page. https://en.wikipedia.org/wiki/Help:Diff#Internal_links [19:28:10] Is this feature always enabled on MediaWiki? [19:31:20] hi, I'm looking for a list of all hooks that can be used by mw.loader.using( ... )?
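[Editor's note: the footer format described above (a blank line, then one "Bug: NNNNN" trailer per bug on its own line) can be sketched like this. The message text and bug numbers are made up for illustration; the minimal trailer parse mimics what tools like Gerrit do with the footer block.]

```python
# Hypothetical commit message closing two bugs; per the guidelines linked
# above, each bug gets its own "Bug:" trailer line after a blank line.
commit_message = """Fix two related login failures

Longer explanation of the change goes here.

Bug: 12345
Bug: 12346"""

# Minimal parse of the trailers out of the footer.
bugs = [line.removeprefix("Bug: ") for line in commit_message.splitlines()
        if line.startswith("Bug: ")]
```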
[19:31:52] wget: if you're trying to use {{Diff}}, you need to copy the "Diff" template from Wikipedia into your wiki. [19:33:03] physikerwelt: a list of the most important ones is in https://www.mediawiki.org/wiki/RL/DM [19:33:19] wget: oh, oops, I was looking at the wrong thing. [19:33:29] Yaron: Not the {{Diff}} template obviously, just the link [[Special:Diff/oldId/newId]] [19:34:28] Right... that should work on your wiki... [19:35:26] Yaron: Hum. I'm trying with this link https://wiki.archlinux.org/index.php?title=VirtualBox&diff=316311&oldid=315152 [19:35:26] and my link is simply [[Special:Diff/316311]], but it remains desperately red :-( [19:35:55] though I didn't make a typo. [19:36:12] wget: that wiki is using MediaWiki 1.22.9 [19:36:41] That was added in 1.23 I think, or even current 1.24 [19:37:19] Hum Ok. Then I'm using the whole link. No problem. Thanks for the help. [19:37:33] Vulpix: I just see mw.loader.using( 'mediawiki.page.startup' but I'm looking for a later hook. Is there a way to print the list of hooks in the console? [19:38:40] physikerwelt: mw.loader.getModuleNames() seems to do the job [19:40:11] Vulpix: Thank you [19:43:41] hi, I need to hide part of an article from unauthorized users; how do I do it? [19:49:07] !cms [19:49:07] Wikis are designed for openness, to be readable and editable by all. If you want a forum, a blog, a web authoring toolkit or corporate content management system, perhaps don't use wiki software. There is a nice overview of free tools available at including the possibility to try each system. For ways to restrict access in MediaWiki, see !access. [19:49:23] !access [19:49:23] For information on customizing user access, see . For common examples of restricting access using both rights and extensions, see . [19:56:32] many thanks [20:01:12] Reedy: should these be filed http://p.defau.lt/?twMZrn0IKu4i1K2MIz5CcQ (found in my /var/log/hhvm/error.log )?
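[Editor's note: the version dependence discussed above ([[Special:Diff/...]] links work on MediaWiki 1.23+, so the 1.22.9 wiki needed the full index.php URL) can be sketched as a small helper. The function name, base URL, and signature are invented for illustration.]

```python
def diff_link(mw_version: str, base_url: str, title: str, oldid: int, newid: int) -> str:
    """Return a diff link suitable for the wiki's MediaWiki version.

    Hypothetical helper: [[Special:Diff/...]] works on MediaWiki >= 1.23;
    older wikis (like the 1.22.9 wiki above) need the full index.php URL.
    """
    major, minor = (int(p) for p in mw_version.split(".")[:2])
    if (major, minor) >= (1, 23):
        return f"[[Special:Diff/{oldid}/{newid}]]"
    return f"{base_url}/index.php?title={title}&diff={newid}&oldid={oldid}"
```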
[20:03:32] manybubbles: had a complaint the other day regarding Ancient Greek in search in en.WT. [20:04:45] Amgine: ok - can you file a bug? It's unfortunately the best way for me to track it. I can go hunting for the complaint but I've got my face buried in something at the moment. Sorry. [20:05:10] no worries, will try to get the complainant to file one. [20:05:19] [20:05:39] Amgine: thanks! [20:32:43] hi fhocutt thanks for the email [20:33:51] you're welcome, sumanah [20:36:24] Hi eldur! Thanks for working on the Java client library [21:10:48] fhocutt: I like your https://github.com/fhocutt/jwbf/blob/master/README.md [21:10:57] thanks sumanah! [21:11:18] fhocutt: It seems like a good level of detail and explanation [21:11:48] sumanah: eldur suggested this enhancement to work on https://github.com/eldur/jwbf/issues/18 [21:12:07] I'll email you all with a link to that, but at first glance it looks pretty reasonable. [21:12:36] * sumanah looks [21:14:04] fhocutt: Interesting! search, I understand. I'm not sure what the original requester meant about a way to work with revisions - do you understand? [21:14:30] and thanks for the feedback--if that is generally as I intended, I will jump in to making line notes and further tweaks. [21:14:43] I don't, but search is the one I would be working on [21:14:47] ok [21:16:05] I think I saw someone mention that they didn't see some things they expected a library to handle with revisions, but I don't recall who or where :P [21:16:42] :( fhocutt maybe it was in one of the talk pages about your evaluations? [21:16:54] or in IRC [21:17:43] fhocutt: nod. In which case you could grep through your XChat logs for it, if you could deal with the false positives for the search string "revisions" [21:18:18] <^d> search?
[21:18:26] looking at the code, I think that there aren't methods to deal with multiple revisions, just the most recent one [21:18:28] <^d> (i swear I don't stalk the word and just caught it) [21:18:50] * fhocutt grins [21:19:10] talking about me working on this enhancement for jwbf: https://github.com/eldur/jwbf/issues/18 [21:22:08] <^d> Ok cool. And yes, all the normal search syntax works in the API as well. [21:22:16] <^d> So insource://, etc. [21:22:22] oh, nice, thanks for the info [21:30:48] Reposting this question from before: [21:30:56] When a template is resaved, all the pages that call that template eventually get re-parsed, via jobs. Is there any way for a parser function within a page to know whether it's being called as a result of a template save or not? Parser functions are called with a $parser argument, and I could find one difference in $parser when it was called as a result of a template save - there was no value for $parser->mOptions->mUserLang->dateFormatSt [21:31:18] Yaron: Cut off after "dateFormatSt" [21:31:24] Oh, okay. [21:31:36] (...) there was no value for $parser->mOptions->mUserLang->dateFormatStrings. However, I don't know if checking for that is a stable solution, going forward... [21:32:15] I don't know if there is, and I feel like it might be designed such that there shouldn't be a way to tell [21:32:41] But I think that's a question for Aaron [21:32:42] RoanKattouw: ah - well, that's also good to know. [21:33:01] Why do you want to detect that, BTW? [21:33:50] Well, I'm working on an extension that saves data for a page, and I only want it to get called when either the page is saved, or it's re-parsed via a template. [21:34:22] From what I can tell, it appears that when you re-save a page normally, it gets parsed three (!) times. [21:35:07] And of course, pages are parsed when they're viewed (and there's no cache).
[21:36:13] So there are basically five different times parser functions can get called (counting a page save as three of them), and I only want this parser function to do something on two of those five. [21:37:01] Hmm and this is because in those other three cases the parameters to the parser function didn't change or something? [21:37:17] Yes. [21:37:26] does jenkins run tests for draft changes? [21:38:06] ^^the answer is yes [21:40:23] I guess you could try to detect the fact that there was no change, as opposed to the fact that the circumstances are such that you know there won't be a change? [21:40:39] You may also be able to use page properties to your advantage [21:40:51] They allow you to save data into the page_props table at parse time [21:41:27] The only restriction is that you have to recompute and re-save the data on every parse, otherwise the parser will get clever and remove the property [21:42:26] RoanKattouw: yes, I thought about a solution like that - it's possible to do that, I was just hoping there was a way to not have to touch the DB at all (other than for saving the real data), for maximum efficiency. [21:42:40] Right [21:42:51] I don't know exactly how page props are dealt with in this context [21:43:17] and whether they're stored in the ParserOutput object [21:44:46] Well, it's fine - one or a few more DB accesses aren't a big deal, especially for a version 0.1 [21:46:07] RoanKattouw: (or anyone else) do you know why, by the way, pages are parsed three times on save? I guess I'm just curious about that in general. [21:47:21] Do you have a parser cache? [21:47:35] The page being parsed twice I can understand [21:47:49] Once to save all the metadata, then again because you're viewing it [21:48:00] Ah, that makes sense. [21:48:06] Even with caching those might be different parses because your preferences might be different from the defaults [21:48:20] I don't know if I have a parser cache... I don't think I have any special caching set up.
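[Editor's note: the page_props idea suggested above (store a value at parse time, re-save it on every parse, and compare it to detect whether the arguments changed) could look roughly like this. The property name and the storage dict are stand-ins for the real page_props table, not MediaWiki API calls.]

```python
import hashlib

page_props = {}  # stand-in for MediaWiki's page_props table

def args_changed(page_id: int, args: dict) -> bool:
    """Return True when the parser-function arguments differ from the
    fingerprint stored during the previous parse of this page."""
    fingerprint = hashlib.sha256(
        repr(sorted(args.items())).encode()).hexdigest()
    changed = page_props.get((page_id, "myext-args")) != fingerprint
    # Re-save on every parse: as noted above, a property that isn't
    # re-set is treated as removed by the parser.
    page_props[(page_id, "myext-args")] = fingerprint
    return changed
```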
[21:48:38] And the metadata extraction parse uses the default prefs [21:49:00] The reasoning behind that is that you want to cache that result early because most people (including all anons) have default prefs, so they can use that result [21:49:06] Oh, that might be it. [21:49:45] So that explains two parses, but not three [22:47:02] In the Name of God, the beneficent, the merciful, all praise be to Allah, the lord of the universe, O God hasten the arrival of the Mahdi and make us his followers and bless those who attest of his rightfulness. [22:47:46] The Zionist entity has captured a few civilians in the city of Ramallah [22:48:25] It is reported by the Palestinian Authorities that the Zionists have taken 10 Palestinian children as hostages. [22:49:46] !ops [22:50:16] Hi OERIAS. [22:50:23] Hello Carmen [22:50:35] It's Muhammad again [22:50:40] Yes, of course. I remember. :-) [22:50:43] PBUH [22:50:50] PBUH. [22:51:17] RD, That is my name, I'm posing as the prophet. [22:51:25] RIPPBUH [22:51:29] p858snake|l: Simmer down. [22:51:56] *I'm not [22:51:58] OERIAS: There's probably a channel dedicated to Gaza and Israel discussion. [22:52:09] Maybe ##gaza ? [22:52:10] What channel? [22:52:13] I'm not sure. [22:52:22] Carmela, I am reporting [22:52:36] Right, but #mediawiki isn't the appropriate channel. [22:52:41] Oh, you could go to #wikinews. [22:52:56] <^d> Maybe they'll write an article in 3-4 weeks. [22:53:05] Or he can write one now. [22:53:09] I have to use a VPN to report as I fear that the Zionists are controlling the mobile towers and all other communications [22:53:23] Please note this down dear colleagues [22:53:42] I mean, [22:53:46] Israel is certainly spying. [22:55:30] <^d> I spy with my little eye. [22:55:34] <^d> Something that's....blue. [22:56:04] Are we having an op party? [22:56:12] <^d> Sure why not [22:56:57] It's a wild time in here. [22:57:02] <^d> Could op everyone [22:57:04] <^d> Would be fun. [22:57:35] I would be in favor of that.
[22:57:48] Ironically it took 18 hours for the news of the break of the ceasefire to reach the west. [22:58:11] needmoarops [22:58:38] Bottled water, toiletries, and medicine are beginning to be scarce. [22:59:12] Yeah, it's basically a war. [22:59:17] It's gonna be pretty bad for a while. [22:59:35] But it's really no different than the last 500 times this has happened in the past 2,000 years. So... [23:00:08] <^d> http://www.washingtonpost.com/blogs/worldviews/wp/2014/08/01/a-timeline-of-gazas-failed-forgotten-cease-fires/ [23:01:01] But ironically it is getting to the point where the Palestinian people are forming another intifada. People in Nablus are burning tires and setting fire to certain checkpoints. [23:02:22] OERIAS: Is there a reason you use "Zionist" in place of "Israeli"? [23:03:27] Do you expect unbiased comments from the prophet? [23:03:34] (PBUH) [23:04:31] Carmela, news reporters in the Arabic language use "Zionist". [23:04:38] is there any reason this discussion is still in here? [23:04:57] RD, I think the Prophet might have actually called for peace. [23:07:29] <^d> p858snake|l: Bored? Nothing else going on? [23:07:41] <^d> But no, not really a good reason. [23:18:56] We allow off-topic conversation as long as it doesn't disrupt on-topic conversation. [23:18:59] AIUI, anyway. [23:24:28] <^d> Considering nobody talks in here anymore :\ [23:28:16] Right.