[00:54:52] Is there a way I can change $wgTmpDirectory? I am getting "Warning: file_exists(): open_basedir restriction in effect. File(/root/tmp) is not within the allowed path(s)" and I don't want to change the value of TMP because this is a shared server.
[01:25:27] You should be able to set it in LocalSettings.php
[04:08:28] What does it mean when a user has an empty skin preference value?
[04:08:58] Is that a valid modern way to assume the default? Or is that a legacy behavior?
[04:14:28] GPHemsley: is there an empty string in the database row for the user?
[04:14:32] or is it not set..?
[04:14:58] if ( $key == '' || $key == 'default' ) {
[04:14:58] // Don't return the default immediately;
[04:14:58] // in a misconfiguration we need to fall back.
[04:14:58] $key = $defaultSkin;
[04:15:04] so it means default
[04:15:20] but MW shouldn't create a row if the user's preference value is equivalent to the default
[05:25:22] crhylove, hey
[05:48:17] crhylove, you should visit http://debconf14.debconf.org/
[09:22:35] Hi ... I am in search of an answer about which single sign-on plugin I need for our organisational wiki
[09:23:35] we have one domain only and I want to achieve single sign-on for a security group that is created for the IT department only
[09:24:31] And I want users from that particular security group to be able to log in to our MediaWiki through SSO authentication... can anyone please help
[09:31:20] can anyone help me with my question of binding an AD security group for wiki authentication
[09:34:27] Svetlana: Hope you can help me on that
[09:34:34] if you are not busy :)
[09:36:13] barely familiar topic to me but having it clarified could probably help: I understand it's through SSO authentication but I don't understand what the primary "group" or "domain" is
[09:37:20] domain is our local organisational network address, you can say
[09:38:13] and primary group is ... a group is an entity of Active Directory where you can add multiple objects
[09:38:18] "security group that is created for IT department only"
[09:38:21] "Active directory"
[09:38:23] "objects"
[09:38:25] what is all that?
[09:38:27] yes
[09:39:36] !ldap | Umair
[09:39:36] Umair: http://www.mediawiki.org/wiki/Extension:LDAP_Authentication To get support, open a new thread on the support page http://www.mediawiki.org/wiki/Extension_talk:LDAP_Authentication
[09:39:48] yes ldap
[09:39:54] I am already on this link
[09:40:52] but I am not sure which extension I need because there are 2 extensions: 1 - Single domain straight binding, 2 - Single domain search before binding
[09:42:33] Hello and big thanks to MW devels!
[09:42:40] and to any FLGOSS devels out there!
[09:46:27] hello? which AD authentication do I need to get AD authentication from a particular security group
[09:56:30] I'm trying to figure out a CAPTCHA to install
[09:56:54] Is it possible to have many CAPTCHAs and that one is chosen randomly at trigger time?
[09:57:14] I figure that'd make it more difficult for the spambots
[09:57:59] nope. Each captcha has a definition of when it should be triggered
[09:58:47] Vulpix: oh 'k
[09:59:24] even if you provide logic to randomize the inclusion of one or another captcha extension, you'll have to keep track of it to serve the same kind of captcha to the user; otherwise it will display captcha A, and when the user solves it, the result would be handled by a different captcha and would fail
[10:00:01] QuestyCaptcha is usually very effective against spambots
[10:17:20] Umair, I don't know what "security group" or "AD" is. Sorry.
:-(
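For the Active Directory question above, the Extension:LDAP_Authentication page linked by the bot is the usual route. A minimal "single domain straight binding" sketch for LocalSettings.php, hedged heavily: EXAMPLEDOMAIN, ad.example.org and the group DN are placeholder assumptions, not anything from this conversation; check each variable against the extension's documentation before relying on it.

    # Sketch only - domain name, server and group DN are placeholders
    require_once "$IP/extensions/LdapAuthentication/LdapAuthentication.php";
    $wgAuth = new LdapAuthenticationPlugin();
    $wgLDAPDomainNames    = array( 'EXAMPLEDOMAIN' );
    $wgLDAPServerNames    = array( 'EXAMPLEDOMAIN' => 'ad.example.org' );
    # "straight binding": bind directly as user@domain
    $wgLDAPSearchStrings  = array( 'EXAMPLEDOMAIN' => 'USER-NAME@example.org' );
    $wgLDAPEncryptionType = array( 'EXAMPLEDOMAIN' => 'tls' );
    # Only let members of one AD security group log in
    $wgLDAPGroupUseFullDN   = array( 'EXAMPLEDOMAIN' => true );
    $wgLDAPGroupObjectclass = array( 'EXAMPLEDOMAIN' => 'group' );
    $wgLDAPGroupAttribute   = array( 'EXAMPLEDOMAIN' => 'member' );
    $wgLDAPRequiredGroups   = array( 'EXAMPLEDOMAIN' => array(
        'cn=it-department,ou=groups,dc=example,dc=org'
    ) );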
[10:38:52] jubo2, iirc Wikimedia projects use https://www.mediawiki.org/wiki/Extension:ConfirmEdit apparently (for edits) and they might also use something for signups - maybe the same thing and maybe not.
[10:40:06] Svetlana: Thanks for the information
[10:40:21] you're welcome
[10:41:58] I think it would be preferable to have KittehAuth for 75% of hits, something else but simple and non-demanding for 25%, and QuestyCaptcha AND KittenAuth for account creations and logins
[10:42:25] Is it too much to ask for credentials and semi-proof-of-being-a-human at log-in?
[10:42:49] no.. no it's not.. it's most reasonable..
[11:05:42] anyone on my issue please
[11:05:56] AD authentication for MediaWiki users
[11:14:46] Umair, yes, I asked you to clarify
[11:16:05] oh damn, it's a MS thing. a /term/ - "Active Directory" - who'd guess.
[11:16:33] https://www.mediawiki.org/wiki/Extension:LDAP_Authentication/AD_Configuration_Examples
[11:16:36] http://www.pickysysadmin.ca/2013/05/13/how-to-configure-mediawiki-to-authenticate-against-active-directory-on-centos/
[11:16:42] http://noltechinc.com/miscellany/active-directory-authentication-with-mediawiki
[11:16:47] what have you tried so far?
[11:19:48] Yes
[11:20:11] Active Directory is a container for users and computers in MS Windows Server
[11:20:16] to authenticate users
[11:20:22] when they log in to the network
[11:20:55] hey thanks for these links
[11:21:01] let me check and I will get back
[11:21:02] ;)
[11:21:03] cheers
[11:33:13] Hi, I need to draw a flowchart, and though I can use LibreOffice it would be nice if this was directly editable in my MediaWiki. Are there any plugins or built-in tools for drawing flowcharts?
[11:42:35] DrSlony: there is one for graphviz
[11:42:53] https://www.mediawiki.org/wiki/Extension:GraphViz
[11:46:28] thanks
[11:46:36] is it possible to install that without shell access?
[12:00:17] I don't think there are any direct tools/plugins
[12:00:39] but if you do it as an SVG it will be pretty easy to update
[12:00:54] and there is an SVG editor tool I believe
[12:02:47] i think graphviz is the way to go
[12:03:09] i installed the extension but now i need to install the actual graphviz program, which is a bigger problem as this is a typical cheap server and i don't have root access
[12:03:12] trying customer support
[12:14:38] hi
[12:29:53] I wonder why the "tags" element seems to be always empty (like in http://fr.wiktionary.org/w/api.php?action=query&prop=revisions&titles=cruche&rvprop=ids|timestamp|user|comment|size|tags&rvlimit=500). Do you have an idea?
[12:35:55] Automatik: what makes you think there should be something there?
[12:36:05] "minor" is a tag, no?
[12:36:11] no
[12:36:20] ok, so what's a tag?
[12:36:35] something like "visualeditor"?
[12:36:42] Automatik: https://www.mediawiki.org/wiki/Manual:Change_tag_table
[12:37:28] Yes that's one
[12:37:35] ok thank you :)
[12:37:43] "minor" is a so-called flag and it's stored in https://www.mediawiki.org/wiki/Manual:Recentchanges_table#rc_minor
[12:48:45] hey folks, I'm trying to upgrade from 1.22.3 to 1.23.2, and when I run the update.php script, I get the following error: PHP Fatal error: Call to undefined function wfOut() in /production/wikiwork/extensions/TitleKey/TitleKey_body.php on line 126
[12:48:53] any insight?
[12:50:53] ecrist: you should also upgrade your extensions
[12:51:26] basically, download each extension again, selecting your current release when downloading (1.23)
[12:52:09] ok
[12:52:12] thanks
[13:02:55] hello
[13:03:05] is anyone here?
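On the wfOut() failure above: wfOut() had been deprecated for several releases and the old TitleKey copy calls a function that no longer exists in 1.23, which is why Vulpix says to re-download the extension. A sketch of that advice, assuming the wiki root is /production/wikiwork (from the error message) and anonymous Git access to the extension repositories - the REL1_23 tarballs from mediawiki.org's extension pages work just as well:

    cd /production/wikiwork/extensions
    mv TitleKey TitleKey.old   # keep the old copy until the new one works
    git clone -b REL1_23 https://gerrit.wikimedia.org/r/p/mediawiki/extensions/TitleKey.git TitleKey
    php ../maintenance/update.php   # re-run the updater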
[13:03:51] I think this is not a good time of day to be asking questions…
[13:04:09] you haven't asked any real questions yet
[13:04:10] Regardless, I guess I'll leave a question here and hope somebody answers it later.
[13:04:18] Oh hello there
[13:04:28] im not here
[13:04:51] then you're one of the more clever bots I've ever run into
[13:05:27] Here's my question: why is a link to a visited existing page turning up light red? Shouldn't it be purple?
[13:05:32] Fresh install
[13:06:14] explosionduck.com/dnd/wiki
[13:09:52] I feel kind of silly asking such a simple question, but it seems that googling for the words color and wiki doesn't turn up good results.
[13:10:35] "mediawiki visited link color" could be a more useful query :D
[13:10:53] depends on the skin you're using possibly, and please tell your version too
[13:11:14] try under another browser and see whether it's the same there
[13:11:38] be back later, please rant it all out to the channel without pinging me specifically
[13:12:45] Thank you for the suggestions. I'm using the default skin, version 1.23.2.
[13:13:09] I already tried it under Firefox. The results are the same.
[13:13:29] (I'm using Chrome, so Firefox is the other browser.)
[13:13:34] synkarius: Cache and/or slow job queue running. If you make a null edit like I just did, it fixes it.
[13:15:06] Lcawte: that's very good to know. What is a null edit?
[13:15:45] synkarius: Just click edit page then save page, not changing anything.
[13:16:04] Lcawte: thank you very much. Problem solved.
[13:34:08] Upgrade of MW and installation of a CAPTCHA are today's remaining goals
[13:59:19] I was wondering about different languages on wikis: How does one view/create the wiki's content in a different language? I can change the preferences to Japanese, but the pages still show in English.
[14:00:31] to the task at hand ( Upgrade MW and install CAPTCHA )
[14:00:40] TheRetroGamer: you can't. Content must exist in other languages in different pages/subpages, or even different wikis (en.wikipedia.org, de.wikipedia.org, etc)
[14:00:43] ssh palestinetunnel.org
[14:01:09] pro-skilzzz: can into command 'ssh' courtesy of Tatu Ylönen
[14:02:09] backup database. backup 'w/' directory, download, run upgrade..
[14:02:14] So, this would probably be something to ask the site's developers about?
[14:03:44] yes, probably
[15:01:16] Hello and thanks for the awesome MediaWiki
[15:01:52] I get error '[2814af62] 2014-08-21 14:59:35: Fatal exception of type MWException'
[15:02:34] when trying to move an installation by .tar.gz:ing the w/ directory and the SQL from virtualized Kubuntu 14 to native Kubuntu 14
[15:02:47] installed the APM to the L with apt-get
[15:02:57] something is wrong coz it gives an error
[15:04:41] I'll prlly try to install MW from the .tar.gz of the old MediaWiki ( the one that matches the database structure )
[15:05:36] If someone is reading these lines I will tell what I am doing: Making sure that the backups actually load into a MySQL and display the wiki on the localhost of my laptop
[15:17:06] got it
[15:17:22] Installed the MediaWiki and swapped the old database into the new installation
[15:17:39] now all looks nice but the Main Page did not change
[15:17:44] all the other pages are there
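A sketch of the dump-and-restore round-trip being verified here; the database name, user names and paths are placeholders, not the actual setup:

    # on the production server
    mysqldump -u wikiuser -p wikidb > wiki-backup.sql

    # on the test machine: recreate the database and load the dump
    mysql -u root -p -e 'DROP DATABASE IF EXISTS wikidb; CREATE DATABASE wikidb;'
    mysql -u root -p wikidb < wiki-backup.sql

    # then point a matching MediaWiki version's LocalSettings.php at it
    # and browse to http://localhost/secretdirectory/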
[15:18:23] so what.. all I need to know is that the .sql dump loads into MySQL and the wiki pages are accessible on the localhost/secretdirectory/
[15:18:24] Main Page shouldn't change, I presume, since you just moved your wiki
[15:19:19] Vulpix: I installed the same version of MW as the installation whose mysqldumps I'm trying to check. Then just dropped the database and loaded it from the .sql file backed up from the production server
[15:19:48] yes, that's what I've understood
[15:20:01] I'm ready to upgrade it and install a CAPTCHA
[15:20:07] or there is another route
[15:20:29] reading manuals is always a good route :)
[15:21:26] if you want to do both, you should upgrade first
[15:24:21] I'ma try both methods
[15:24:29] is scribunto master compatible with 1.23?
[15:25:58] omai.. the mysql root passwd is gonna go in plaintext... doesn't much matter. all servers have iptables configured to let nothing but 22, 80 and 443 pass
[15:26:56] why would you use the root password for mediawiki?
[15:27:24] brr, brr
[15:27:37] The convenience of letting MW set up its own account during setup. Presumably it forgets the root pw after that.
[15:28:30] well, it asks for an account with admin privileges (not necessarily root) during setup, but those credentials are not stored
[15:47:10] it's advancing somewhat..
[15:48:31] MediaWiki reloads LocalSettings.php each time the file is saved, yes?
[15:51:06] navboxen brokeh
[15:51:14] something something..
[15:51:54] I prlly forgot to import {{navbox}} or something like that
[15:52:17] the infoboxen I've imported and converted to link to 'pedia work
[16:22:26] shit..
[16:22:52] the DDoS attack continues as strong as it was when I took the http://GloBBA12.SI/wiki/ study wiki offline
[16:23:10] This is new to me..
[16:23:24] I don't intuitively know how to deal with this
[16:23:53] The most crappy solution would be just to move the site somewhere else and not tell the DDoSers where it went
[16:24:52] jubo2: what kind of DDoS attack are you having?
[16:33:50] Vulpix: I don't know exactly. There are many hosts that are rapidly requesting pages
[16:34:23] maybe google is indexing your site?
[16:34:53] Vulpix: it would say googlespider or something
[16:35:01] I've seen google take down a website by indexing pages at a gazillion pages per second
[16:35:18] Vulpix: and these bots are accessing non-existent pages mostly
[16:36:14] jubo2: if it's google, or bing, they'll include a URL saying that in the User-Agent string (usually visible in access_log)
[16:36:15] Vulpix: I have the old database where they ran spam links into the wiki for weeks and weeks before I figured out that it's the DDoS on http://GloBBA12.SI/wiki/ that's making http://develop.consumerium.org/wiki/ squat
[16:36:22] they share a server
[16:36:49] Vulpix: might find out something about the attacker by looking at what it is posting
[16:37:38] iirc they spammed RX pharmacy, get-papers-written-for-money and such content
[16:38:24] ah, spambots... well, there's nothing much to do other than installing a captcha
[16:38:34] QuestyCaptcha is usually a good option
[16:38:36] Vulpix: yeah.. I'm on that next..
[17:11:55] https://www.mediawiki.org/wiki/Extension:ConfirmEdit#ReCaptcha says..
[17:12:09] "Part of the weakness of the ReCaptcha module is that ConfirmEdit doesn't include any penalty mechanism, so spam bots can simply keep trying to bypass the CAPTCHA until they get through. This is an issue that is strongly worth addressing in some way."
[17:12:57] It would seem to me that I need something that blocks the spam botnet connections when it fails the CAPTCHA a few times
[17:16:28] jubolog2: cloudflare? :p
[17:18:30] Nemo_bis: what's that?
[17:18:40] I'm going for MSFT technology
[17:19:05] The kittehs and puppies are going to be the least annoying and hard as hell for the machines to pass
[17:19:49] fail2ban
[17:20:26] And humans...
[17:22:14] the CAPTCHA didn't activate.. damn..
[17:24:39] is there some interaction between $wgGroupPermissions['sysop']['edit'] = true; and the CAPTCHA activation?
[17:27:56] I've got a build process which has worked for ages (MW ~1.9) and am currently running 1.22. I've attempted to update to 1.23, but failed. I've since tried the bleeding edge also. In both cases it throws an "Exception from line 182 of /var/www/html/includes/Hooks.php: Invalid callback in hooks for ParserFirstCallInit". I also have the backtrace if that would be helpful. Can anyone suggest what might be causing this?
[17:28:23] cariaso: the problem is an extension
[17:28:36] well, one of your extensions
[17:29:30] Vulpix, I use Semantic MediaWiki, and many others. I'll try to piecemeal comment out and narrow it down. thanks.
[17:31:54] I cannot get the Asirra CAPTCHA to engage
[17:32:03] I'm missing something..
[17:32:50] If someone wants to help me rule out the possibilities of what is wrong, help would be appreciated, but now I watch the news but got a couple-three hours before I sleep
[17:34:43] There might be important information to be learned by analysing the access.log, no?
[17:35:24] just need someone who knows someone that has an ũba-h4x program that can find things out about the botnet by looking at the httpd access.log?
[17:36:42] got 296,405,826 bytes of access.log and the wiki is normally an extremely low traffic site that's been online for 2 yrs
[17:37:13] jubolog2: sadly, you can't do anything to the botnet except prevent it editing your wiki with captchas
[17:37:15] so 10 megs legit and 290 megs of botnet DDoS attack or sumpting liek that
[17:38:00] Vulpix: is there no way to refuse to serve IP addresses that have failed the CAPTCHA, like.. umm.. 3 times?
[17:38:01] basically, you can't prevent rain, but you can walk under an umbrella
[17:39:02] jubolog2: on linux, yes, install fail2ban, and you can configure it to read from a logfile every failed captcha attempt. You'll need to modify the captcha extension so it logs failures to a file
[17:39:39] Vulpix: 'kdänks, I'll look it up on https://startpage.com or https://duckduckgo.com
[17:40:15] it shouldn't matter, probably
[17:41:13] * jubolog2 reading https://en.wikipedia.org/wiki/Fail2ban
[17:41:40] so it's a way to zap hosts off using iptables to block the unwanted traffic
[17:42:19] yes
[17:42:31] I may have to use it..
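A sketch of what Vulpix describes. It assumes the captcha code has been patched to log failures in a form like "captcha-fail ip=1.2.3.4" to /var/log/mediawiki/captcha.log - both the log format and the paths are hypothetical, since ConfirmEdit does not write such a log out of the box:

    # /etc/fail2ban/filter.d/mediawiki-captcha.conf  (hypothetical filter)
    [Definition]
    failregex = captcha-fail ip=<HOST>

    # added to /etc/fail2ban/jail.local: ban after 3 failed captchas
    [mediawiki-captcha]
    enabled  = true
    filter   = mediawiki-captcha
    logpath  = /var/log/mediawiki/captcha.log
    maxretry = 3
    bantime  = 86400
    action   = iptables-allports[name=mediawiki-captcha]

This gives exactly the "refuse to serve IPs that failed the CAPTCHA 3 times" behaviour asked about above: fail2ban inserts iptables DROP rules for offending hosts and removes them when the ban time expires, so the rules don't pile up permanently.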
!e StopFormSpam
[17:42:50] https://www.mediawiki.org/wiki/Extension:StopFormSpam
[17:43:13] legoktm: does not exist
[17:43:20] oh, I spelled it wrong
[17:43:20] use rules to DROP packets coming from offending IPs, and they'll get timeouts accessing your site, which will never reply to their connections
[17:43:26] https://www.mediawiki.org/wiki/Extension:StopForumSpam
[17:43:40] Vulpix: that sounds very efficient for the problem at hand
[17:44:09] jubolog2: it's another anti-spam extension myself and another dev wrote that uses blacklists from stopforumspam.com
[17:44:15] Vulpix: also sounds like a way to potentially mess up your clean-cut iptables policies
[17:44:54] yes, you may end up filling your iptables with lots of IPs
[17:45:24] Vulpix: eventually isolated my problem to SemanticResultFormats. My thanks for your help.
[17:46:09] cariaso: those problems are usually caused by an outdated extension. Try downloading it again, choosing the version of MediaWiki you're using
[17:46:15] It's like with http://MaidSafe.net ... if it's a clean implementation it's gonna rock the world.. if it's made with cardboard pieces, used bubble gum and shit quality tape.. not so good..
[17:46:33] legoktm: thanks for that info
[17:47:28] My current problem is that the Asirra CAPTCHA is not being executed when it should be according to my LocalSettings.php
[17:47:39] prlly something wrong with the LocalSettings.php
[17:48:33] I could switch to some other CAPTCHA type and see if that works ... or I could find the MediaWiki log file(s) and look at them for clues why it didn't work
[17:48:37] Asirra only works for user registrations, not for page edits
[17:49:09] Vulpix: huh..? rly? why?
[17:49:48] idk, but https://gerrit.wikimedia.org/r/#/c/22550/
[17:50:13] Maybe I should just have the potential users ask me to give them an account
[17:51:44] $wgGroupPermissions['*']['createaccount'] = false;
[17:52:05] this is necessary so the botnet can't register any accounts
[17:53:18] The old database for http://GloBBA12.SI/wiki/ got so bloated with tens of thousands of edits and thousands of accounts ( estimates ) I just XML-exported all the actual content articles to a fresh MW installation
[17:54:10] jubolog2: use QuestyCaptcha, it's very good against bots, if you use questions that bots can't resolve by themselves :)
[17:54:19] just don't put in simple math questions
[17:54:56] but I'm still thinking that it could be useful if there was a way to donate the access.log to an anti-spamnet effort, like "I got hit by a botnet and I wanna upload the access.log."
[17:55:28] then the service would piece together information from the various access.logs of sites that have been hit by a botnet
[17:55:37] *about the botnets
[17:55:46] stopforumspam is basically that, except it's designed more for forums, not wikis
[17:55:48] that's what the StopForumSpam extension does, apparently
[17:55:57] also Project Honeypot
[17:56:24] honeypot.. sounds delicious.. anyone got any tea, lemon and cut Georgian brandy..?
[17:59:03] anyways the attack occurred after I made the mistake of linking to the site from https://en.wikipedia.org/wiki/User:Juxo/wikireader_into/Logistics_and_SCM , https://en.wikipedia.org/wiki/User:Juxo/wikireader_into/Economics etc.
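A minimal LocalSettings.php sketch for Vulpix's QuestyCaptcha suggestion, matching the ConfirmEdit bundle of that era; the question/answer pair below is a placeholder and should be replaced with something only your intended audience can answer:

    require_once "$IP/extensions/ConfirmEdit/ConfirmEdit.php";
    require_once "$IP/extensions/ConfirmEdit/QuestyCaptcha.php";
    $wgCaptchaClass = 'QuestyCaptcha';
    $wgCaptchaQuestions[] = array(
        'question' => 'What is the name of this wiki, without the .SI part?',
        'answer'   => 'GloBBA12'   # placeholder
    );
    # fire on account creation and on edits that add new external links
    $wgCaptchaTriggers['createaccount'] = true;
    $wgCaptchaTriggers['addurl']        = true;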
[17:59:36] Looks like someone doesn't like the free dissemination of information regarding business studies
[18:00:05] It'd be too bad for the capitalist if the grossly underprivileged weren't so underprivileged
[18:00:26] jubolog2: you're wrong, spambots like free dissemination of their spam business :P
[18:01:23] Vulpix: you're prlly right.. The nasty spammers are prlly always on the lookout for new wikis that have been left wide open
[18:07:49] the CAPTCHA not firing was my bad
[18:08:15] Didn't read the instructions on what to do to LocalSettings.php thoroughly enough
[18:08:40] the kittehs and the puppieh are too small
[18:08:45] gotta do something about that
[18:15:23] ah.. it shows a big pic on hover-over
[18:16:09] I need to figure out a correct combination of anti-vandal measures.. Better sleep on it.. been looking at the screens a lot today
[18:16:50] I've banned editing by others than sysops and banned account creation
[18:17:09] I'ma prlly want to be able to manually add accounts if some fellow student asks for one
[18:17:27] ... and hope the server holds under the load of the attack
[18:19:05] Currently I get 2 small servers for the price of one ( their list prices are middle 00's so they had no problem agreeing )
[18:19:43] and I've no need for the 2nd now that I've joined http://kapsi.fi as a private individual and get the 555GB of disk
[18:20:20] so I guess I could ask them to recycle the other server and give me one bigger one for the list price of the small one
[18:22:10] plus the most important ( http://develop.consumerium.org/wiki/ that I've been at since 2003 ) was recently XML-dumped and archived by the WikiTeam within the archive.org servers so that's safe even if catastrophic data loss by me happens
[18:25:30] that's definitely not the kind of safeness I want sysadmins to feel when I dump their wikis
[18:26:05] Nemo_bis: I have to admit
[18:26:24] drastic data loss was once one command away
[18:27:23] the backups, if you can call them backups, because they were not loaded on a test system to verify they actually load if the need arises
[18:28:18] That close call taught me to set up a test system to check that the mysqldumps actually load into mysql
[19:03:15] Monday, Tuesday, Wednesday, FLGOSS devel appreciation day, Friday, Saturday, Sunday. *leave it on repeat*
[19:04:44] G?
[19:04:52] "Gratis"?
[19:05:15] Great?
[19:05:21] tpyo?
[19:05:32] lydsxiae?
[19:05:40] lol
[19:06:26] marktraceurWMF: yes. gratis
[19:06:31] Ah.
[19:06:45] some nut in #debian-offtopic started yelling at me for putting the G there
[19:06:50] jubolog2: Since when is Thursday $movementName developer appreciation day?
[19:07:07] marktraceurWMF: since I declared it today, naturally
[19:07:23] jubolog2: Well, lots of free software developers are staunchly pro-business, others are staunchly anti-business, it's a mixed field
[19:07:37] jubolog2: And this is every Thursday?
[19:07:46] I've been saying we should have many FLGOSS devel appreciation days .. Now I've just locked on to having 52 of 'em per year
[19:08:25] Thursday, Thor's Dag
[19:13:01] Huh.
[19:13:10] jubolog2: Not going to lie, that will get old
[19:18:09] old..
[19:18:20] everyone under 1,000,000,000 s is a youngster
[19:22:03] Not what I meant
[19:22:09] It will not be fun after like two weeks
[19:22:24] Also, pretty sure nobody will sign on to the idea because of that reason.
[19:36:41] suppose that I need to select 100,000 rows from the archive table based on ar_rev_id (I can't use ar_page_id or anything like that) and then do some inserts using that data. what's the least expensive way to do this? one big query or 100,000 queries? or something in between?
[19:38:56] marktraceurWMF: I've been on freenode IRC for over 11 yrs now. What if I personally come into #mediawiki, #debian-offtopic and #kubuntu-offtopic every FLGOSS devel appreciation day and say a big "Thank You" for the next 11 yrs..?
[19:39:06] the WHERE clause would have to be a string with a bunch of ORs, I guess
[19:39:17] jubo2: Then good luck
[19:39:25] marktraceurWMF: merci
[19:39:30] I think you'll wind up getting banned from a few channels
[19:39:40] Depending on how you do it
[19:40:04] marktraceurWMF waving the hypothetical bannhammer, ehh?
[19:40:38] leucosticte: I don't know what MediaWiki uses for queries, but I'd probably use a "datareader" (fetches one row at a time, instead of loading all rows in memory)
[19:40:43] I'm not an op in this channel, so no, I'm not
[19:41:00] I would ban you from a project channel I was an op in if you came in and made a lot of noise every single week though
[19:41:24] marktraceurWMF: ah i c what is cooking your noodle.. hold on..
[19:41:26] as for the WHERE clause, I guess those ar_rev_id are not random, but based on some specific conditions, maybe do the proper query with joins with other tables?
[19:43:41] marktraceurWMF: channel traffic in a support channel indicates support demand. this is bad for us, members of support channel X. No channel traffic == no support demand == good for us
[19:44:11] marktraceurWMF: but this does not apply to non-support channels like #debian-offtopic and #kubuntu-offtopic
[19:44:20] Vulpix: the use case is bug 69049. originally, the data were going to be stored in log_search so that a JOIN could be used; but Brian Wolff objected so I changed it to storage in log_params. But I can't do a JOIN on that https://bugzilla.wikimedia.org/show_bug.cgi?id=69049
[19:50:04] that patchset doesn't look bad, but I don't know the performance implications of deleting a page with a huge number of revisions
[20:03:37] jubo2: Sure. Unless you do your thanks in a disruptive way.
[20:15:30] Is there a way in MediaWiki Vagrant to set the version we want of MediaWiki?
[22:45:55] ping
[22:46:17] !ask
[22:46:17] Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply. Don't ask for help or for attention before actually asking your question, that's just a waste of time – both yours and everybody else's. :)
[23:31:29] are there any limits on how much stuff should be put in log_params? E.g. is it okay to have 100,000 elements in there?
[23:32:10] probably limited by database field size…
[23:32:13] ok I'm reading the news, media viewer... wtf?
[23:32:28] trelane: Hi! What is confusing about it?
[23:32:52] why am I reading about it in the news?
[23:33:01] trelane: Where?
[23:33:03] it's a freaking image viewer, kind of a stupid reason for a revolt
[23:33:09] Well.
[23:33:15] cuz that's the situation set up by https://gerrit.wikimedia.org/r/#/c/151309/
[23:33:34] trelane: It's sort of reflective of the current political situation on the wikis in re WMF
[23:33:39] http://wikipediocracy.com/2014/08/19/the-battle-for-wikipedia-how-your-donations-may-be-destroying-the-crowd-sourced-encyclopedia/
[23:33:53] > news
[23:33:58] > wikipediocracy
[23:35:08] lol wikipediocracy
[23:35:36] "inserting random ASCII characters such as chess pawns" lol, ASCII
[23:36:59] MatmaRex: That's quality tech journalism right thar
[23:38:14] hahaha
[23:39:14] leucosticte: yeah, that's not feasible at all
[23:40:36] legoktm: well, Brian Wolff didn't like the log_search alternative either
[23:40:46] that would be even worse
[23:41:27] legoktm: so, is there an alternative to those two solutions, besides the status quo?
[23:41:31] doesn't special:mergehistory have an undo button?
[23:42:13] legoktm: I haven't done anything with mergehistory; I'm not sure. I wonder how that undo button is implemented (haven't dived into the source yet)
[23:42:31] it's in the merge log
[23:42:52] https://en.wikipedia.org/wiki/Special:Log/merge there's an "unmerge" button
[23:44:49] got a problem with wrong folder permissions on uploading images - 700 - permission denied ...
[23:50:26] oh, I see that the merge history just stores a page title and a timestamp in log_params
[23:51:22] so what makes a large number of elements in log_params unfeasible?
[23:52:26] *infeasible
[23:55:29] the way the SpecialMoveRevisions extension is implemented now, it uses a SELECT query with as many ORs as there are revisions (see lines 178-186 at https://github.com/Inclumedia/MoveRevisions/blob/master/SpecialMoveRevisions.php)
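On leucosticte's batching question: between one giant OR list and 100,000 single-row queries, a common middle ground is chunked IN() lists through MediaWiki's database layer, which expands an array condition into ar_rev_id IN (...) automatically. A sketch under that assumption for the 1.23-era API (the batch size, field list and variable names are illustrative, not the MoveRevisions extension's actual code):

    $dbr = wfGetDB( DB_SLAVE );
    foreach ( array_chunk( $arRevIds, 500 ) as $batch ) {
        $res = $dbr->select(
            'archive',
            array( 'ar_rev_id', 'ar_namespace', 'ar_title', 'ar_timestamp' ),
            array( 'ar_rev_id' => $batch ),  // expands to ar_rev_id IN ( ... )
            __METHOD__
        );
        foreach ( $res as $row ) {
            // accumulate rows here and do one batched insert per chunk,
            // e.g. $dbw->insert( ..., $rows, __METHOD__ ) after the loop
        }
    }

Chunking keeps any single query (and any single insert transaction) bounded, which is the usual reason neither extreme is recommended for replicated production databases.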