[00:41:43] friends
[00:42:01] has wikipedia started using vue.js yet?
[04:39:00] bleb, I don't think so, but that being said there was a seemingly successful RFC about it: https://phabricator.wikimedia.org/T241180
[05:03:16] ok
[05:04:15] wanna see something messed up though
[05:04:22] go to the desktop version of wikipedia
[05:04:58] narrow the window until the "talk" tab on the top is about to overlap with the "read" tab
[05:05:04] watch it dance
[05:05:41] yep, been there, seen that before :)
[05:06:09] when did it start
[05:10:04] When Vector was introduced
[05:10:17] ~2010-ish
[05:41:30] Is anyone familiar with Page Forms? I'm having trouble getting the preview button to work. It keeps throwing a session error
[05:42:13] I think I was able to fix it by adding wpUltimateParam to the data array in PF_AutoeditAPI.php in the setupEditPage function
[05:42:24] though I'm not sure if that's a proper fix
[06:59:49] hi, does anyone know if there is something that would make https://phabricator.miraheze.org/T5694 work?
[07:02:55] Reception123: weird, https://revit.miraheze.org/w/index.php?sort=relevance&search=%22Change+the+path+of+local+files+away+from+%27My+Documents%27+%5BARCHIVED%5D%22&title=Special%3ASearch&profile=advanced&fulltext=1&advancedSearch-current=%7B%7D&ns0=1&ns4=1&ns14=1&ns1736=1 works
[07:06:18] Reception123: I suspect some sort of bug is happening, although I don't know what.
[07:14:59] another random bug I noticed with Page Forms: if I mark a section as "hide if empty", and also have a free text input, the next time I edit a page with the form, the contents of the free text input will go into the first empty section
[07:39:22] bawolff: thanks for the answer!
[09:25:03] Hello folks. New MediaWiki user here, looking for info on upgrading from 1.26 to 1.31 LTS. I've been reading the release notes and upgrade docs, but just wondering - roughly how long does an upgrade take?
[09:59:16] legoktm: My plan is to clone the live install to a local dev server and run through the process locally before attempting it on the live server. I'm inheriting maintenance from another dev who's not been involved for some time, so I can't get info from them. There's quite a bit of data but it's not been modified much AFAIK.
[09:59:47] I mean the install hasn't been modified much.
[10:00:16] that sounds like a good plan :) if you get stuck or run into issues, we should be able to help you in here
[10:02:03] legoktm: cool, thanks. From what I have been reading, it looks like I'll need to apply patches incrementally to bring it up to 1.31?
[10:03:35] luxumbra: personally, I would just download the 1.31 tarball and extract it over the existing directory. see https://www.mediawiki.org/wiki/Manual:Upgrading#Using_a_tarball_package
[10:05:32] legoktm: what about patch files for the schema changes?
[10:06:00] those should also be in the new tarball?
[10:06:53] schema changes are applied by running maintenance/update.php
[10:07:55] even when jumping from 1.26 to 1.31?
[10:09:01] yes
[10:29:52] legoktm: thanks for your help.
[10:30:11] Vulpix: thanks.
[10:30:35] anytime!
[10:54:27] why does this channel ban riot.im users?
[10:59:30] Guys, how do I add a Google Tag Manager tag in the Timeless skin?
[11:13:28] Hello everyone. I'm currently having an issue, one of a few, with a wiki I've taken over, where the IP of the server is being shown instead of the user IP. Any suggestions on how to go about resolving this would be greatly appreciated.
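A note on the Page Forms workaround mentioned at 05:42: the change described amounts to sending the hidden wpUltimateParam field that core's EditPage expects along with the preview submission. A rough, untested sketch of that kind of edit; the surrounding array and variable names are paraphrased for illustration and are not the actual Page Forms source:

```php
// In PF_AutoeditAPI.php, inside setupEditPage() - paraphrased sketch only.
// The workaround described above adds the hidden wpUltimateParam field that
// core's EditPage uses to detect truncated POST data; without it the preview
// request can be rejected with a session/token error.
$data = [
	'action'          => 'submit',
	'title'           => $this->mOptions['target'],   // hypothetical option name
	'wpTextbox1'      => $content,                    // hypothetical variable
	'wpEditToken'     => $user->getEditToken(),
	'wpUltimateParam' => 1, // the added field; EditPage checks that it arrived last
];
```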
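For the 1.26 to 1.31 upgrade discussed from 09:25 onward, the tarball route legoktm describes boils down to a few shell steps. A minimal sketch, assuming a document root of /var/www/wiki and placeholder database credentials; LocalSettings.php and images/ are not in the tarball, so extracting over the top leaves them in place, and a full backup always comes first:

```bash
# Back up the database and the wiki files first (placeholder names).
mysqldump -u wikiuser -p wikidb > wikidb-backup.sql
tar czf wiki-files-backup.tar.gz /var/www/wiki

# Fetch the 1.31 LTS tarball (example version) and unpack it over the install.
wget https://releases.wikimedia.org/mediawiki/1.31/mediawiki-1.31.0.tar.gz
tar xzf mediawiki-1.31.0.tar.gz
cp -r mediawiki-1.31.0/* /var/www/wiki/

# Apply all accumulated schema changes in one go; this works across several
# releases at once, e.g. straight from 1.26 to 1.31, as noted at 10:09.
cd /var/www/wiki
php maintenance/update.php
```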
[11:27:46] first thought is to migrate to the current version and filter out whatever bolixing resulted in that behavior
[11:28:11] which will also establish control of the overall app/instance
[11:29:19] a good chance a priori any user-supplied content will be separable from the mediawiki bolixing, I mean customization
[11:31:02] In that case, my new error is that MediaWiki isn't recognising that I've upgraded to the latest PHP. The website loads up "MediaWiki 1.34 requires at least PHP version 7.2.9, you are using PHP 5.6.40-0+deb8u11.". Last time I tried to fix this, I took the website down for a small while.
[11:31:40] Going into a command line and running php -v shows version 7.3 iirc
[11:32:35] yeah, be careful, there's a lot of snafu around 7.x, golden versions for certain platforms and pkg combinations
[11:32:55] it is all biz as usual and you should be on 7.2 there or thereabouts
[11:33:44] e.g. meansofproduction.biz is the 1.32 instance I'm running on 7.2
[11:35:39] 7.2.19; there's no absolute reason that the CLI and the webserver module have the same version
[11:36:13] and in a pit of dirtball bolixing could be any ole kind of crazy
[11:36:44] You'll have to forgive me as this side of computers is definitely far from my forte. You're advising to downgrade from 7.3 to 7.2 for stability?
[11:38:04] no, not at all, unless he's already gotten something snagged by 7.3
[11:38:19] which is why I said 7.x
[11:38:37] > 7.3 will almost assure a problem
[11:39:12] not sure what mediawiki supports, you should go with that for the version you're using
[11:40:02] Anyone with a pointer to how the names of special pages are localized?
[11:40:04] Ah, well in that case I just have to find a way for MediaWiki to recognise that 7.3 is enabled. I was reading a week or so ago that the wiki not recognising the version may be attached to the Apache module, so I'll try and read into that a tad more
[11:42:18] i do have 7.3.14 running on one host but it's not that one and not heavily used yet
[11:44:17] jeblad: no, but I think it is documented and it's also evident in the file structure
[11:44:53] i18n is pretty much intrinsic in mediawiki
[11:45:38] ofc a lot of things can be expected to bottom out in english
[14:08:06] Hi, I am a little confused on how the review process on Gerrit actually works. I have the following repository on Gerrit: "https://gerrit.wikimedia.org/r/c/mediawiki/extensions/Expressions/". I committed my changes and submitted those for review: "https://gerrit.wikimedia.org/r/c/mediawiki/extensions/Expressions/+/602165". Should I wait until *someone* reviews this commit? Or should I assign
[14:08:09] someone who I think would be able to review it? What do I have to do to have my commit reviewed and merged with master? Thanks.
[14:12:51] In all honesty, unless it has other maintainers, or is WMF deployed or MW bundled, you can probably expect no one will drive-by review it
[14:14:03] I would suggest getting CI set up for it though, so automated tests and CR can be done
[14:24:40] Okay, thank you. I will look into that
[14:38:02] Reedy: Sorry for bothering you again, but I can't figure out how to exactly set up Jenkins for my repository :P Could you point me in the right direction?
[14:40:58] It's not really self-serve, but you can submit a change to integration/config
[14:44:02] I note you don't have anything in your repo currently in terms of tests of any kind
[14:44:14] So you might want to get a base composer.json to do things like PHP linting etc
[14:44:46] Okay. I will also write some PHPUnit tests.
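A note on the 11:31 exchange: `php -v` only reports the CLI binary, which can differ from the PHP module or FPM pool Apache actually runs, which is why MediaWiki can still see 5.6 after a CLI upgrade to 7.3. A quick way to check what the web server itself uses is a throwaway diagnostic page (the filename is hypothetical; remove it afterwards, since full phpinfo() output exposes configuration details):

```php
<?php
// version.php - temporary diagnostic page. Request it through the web server
// (not the CLI) to see which PHP interpreter Apache is really using.
echo 'Web server PHP version: ' . PHP_VERSION;
// phpinfo(); // uncomment for full module/FPM details, then delete the file
```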
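On jeblad's 11:40 question about localizing special page names: the usual mechanism for an extension is an aliases file registered in extension.json via ExtensionMessagesFiles, mapping the canonical name to per-language aliases. A minimal sketch with hypothetical names:

```php
<?php
// MyExtension.alias.php - hypothetical aliases file, registered in
// extension.json as:
//   "ExtensionMessagesFiles": { "MyExtensionAlias": "MyExtension.alias.php" }
$specialPageAliases = [];

// English (also the fallback for languages without their own entry)
$specialPageAliases['en'] = [
	'MySpecialPage' => [ 'MySpecialPage', 'My_Special_Page' ],
];

// German: Special:MeineSpezialseite resolves to the same special page
$specialPageAliases['de'] = [
	'MySpecialPage' => [ 'MeineSpezialseite' ],
];
```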
[14:44:58] I was going to do that anyway, but didn't get around to it
[14:45:20] (but it would probably have been better to first write those, then write the extension :))
[14:45:34] ...speaking of testing... *hides* is there an obvious way to get a page created in an integration test to have its output parsed? as far as I can tell, using editPage from MediaWikiIntegrationTestCase doesn't trigger OutputPageParserOutput hooks
[14:51:24] Xxmarijnw: Certainly, bare minimum would be getting PHP linting set up
[14:55:01] That is, using php-parallel-lint and a "test" script in the composer.json, right?
[14:55:08] So using: https://www.mediawiki.org/wiki/Continuous_integration/Entry_points
[14:55:30] yup
[14:56:31] Alright :)
[15:04:08] bpirkle: I'm thinking about multi-DC in the context of the new session store. Specifically whether (and if so, how) it handles the scenario of a logged-in user viewing pages and oscillating between DCs. Is there a mechanism by which MW can "wait" for writes from previous requests to have caught up? Or is the expectation that a given user remains sticky to a particular DC for their entire session? (Might be fine, just curious)
[15:07:12] Hi and thanks for the awesome software and awesome support! My understanding (which may be inaccurate) is that there is no way to have a WordPress at the root of the domain, but have a wiki in /wiki/, so I guess I'd like WordPress to be /wordpress/ and wiki to be /wiki/ so I need to put the /w/ (I use this for mediawikis) into /var/www/sitename.tld/wiki/w/ and then make the wiki appear in sitename.tld/wiki/ by setting $wgArticlePath =
[15:07:13] "/$1"; I'm
[15:07:14] looking at https://www.mediawiki.org/wiki/Manual:Short_URL. If it were possible to have WordPress in root, yet magically the wiki in /wiki/, that'd be optimal
[15:12:01] Krinkle: good question. urandom would be better prepared to answer it, and I don't see him in this channel.
[15:14:40] bpirkle: ack, will ask elsewhere then :)
[15:22:44] jukebohi: that would be possible only if the rewrite rules of MediaWiki would take precedence over WordPress
[15:32:49] Ok. Thanks for the info, Vulpix. Now searching for single landing page software. Mailchimp's "forever free" level could be the way to go, and then move the WordPress and the wiki to subdirs
[15:34:20] ... coz my HTML reeks of the 90's
[15:35:26] anyone here have experience with the MW performance profiler? am trying to coax it into showing me performance profiling for page edits and purges, and not having any luck - configuring it to use a file did nothing at all, and with ProfilerOutputText the output seems to be related to the page view, not to the edit
[16:08:33] Reedy: I have committed my composer.json and .lock files and I made sure `composer test` succeeded without failures. I have pushed that for review. https://gerrit.wikimedia.org/r/c/mediawiki/extensions/Expressions/+/602397/1
[16:08:40] But now I need that commit reviewed as well :(
[16:08:55] We don't generally commit composer.lock files fwiw
[16:09:15] Hmm... I thought that actually was good practice
[16:09:18] Ah well
[16:09:43] we do commit package.lock
[16:10:02] Okay
[16:12:51] I can make you a CI patch to test this too
[16:13:03] That'd be great!
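A note on the composer.json entry point discussed at 14:55 and 16:08: the CI entry-points page expects a `composer test` script. A minimal sketch of the usual shape for a MediaWiki extension; the version constraints here are only illustrative, and JSON has no comments, so treat every value as an example:

```json
{
	"require-dev": {
		"mediawiki/mediawiki-codesniffer": "30.0.0",
		"mediawiki/minus-x": "1.1.0",
		"php-parallel-lint/php-parallel-lint": "1.2.0",
		"php-parallel-lint/php-console-highlighter": "0.5.0"
	},
	"scripts": {
		"test": [
			"parallel-lint . --exclude vendor --exclude node_modules",
			"phpcs -sp",
			"minus-x check ."
		],
		"fix": [
			"phpcbf",
			"minus-x fix ."
		]
	}
}
```

With this in place, `composer install` followed by `composer test` runs the same linting locally that CI will run.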
[16:50:03] Xxmarijnw: tests running
[16:51:55] Okay, thank you :)
[16:54:16] Seems the test failed because my extension.json is not entirely valid
[16:54:41] I've whitelisted you for CI too
[16:54:49] So any further patches you make will automatically trigger CI
[16:57:50] Sounds good :)
[17:01:01] So, I have now submitted another patchset that fixes the issue in extension.json PHPUnit was complaining about: https://gerrit.wikimedia.org/r/c/mediawiki/extensions/Expressions/+/602429
[17:09:19] Reedy: For the change I mentioned above, CI doesn't seem to get triggered automatically yet.
[17:10:43] Curious
[17:10:45] https://gerrit.wikimedia.org/r/#/c/integration/config/+/602425/1/zuul/layout.yaml
[17:12:40] Seems okay, but the email address I use on Gerrit is `marijn@wikibase.nl`. Might that be the issue?
[17:13:02] Look at https://gerrit.wikimedia.org/r/#/c/mediawiki/extensions/Expressions/+/602429/
[17:13:07] Author
[17:13:08] Xxmarijnw
[17:13:08] Jun 4, 2020 5:55 PM
[17:13:08] Committer
[17:13:08] Xxmarijnw
[17:13:09] Jun 4, 2020 5:55 PM
[17:14:09] Hm, you're right
[17:15:41] When I go to settings on Gerrit however, it lists my email as `marijn@wikibase.nl` again. The e-mail I use on wikitech is `marijnvanwezel@gmail.com`.
[17:15:57] Oh, Jenkins seems to have noticed the review now
[17:16:04] I rechecked the commit
[17:16:11] git will use whatever you've set in your git config
[17:16:23] That makes sense
[17:19:17] Jenkins reports a bunch of errors on `PHPUnit extensions suite (without database or standalone)`. Most of them seem to be `include_once` warnings where it can't find `Invoker.php` or `.php`, which seems strange to me.
[17:20:55] https://github.com/wikimedia/mediawiki-extensions-Expressions/blob/master/src/Expressions.php#L12-L24
[17:21:01] We generally don't recommend doing autoloading yourself
[17:22:39] Would you recommend loading the files via `AutoloadClasses` in extension.json? I prefer not to use `AutoloadNamespaces` for compatibility reasons
[17:24:25] Yeah
[17:24:31] Namespaces is better, but if you can't use it for reasons...
[17:25:35] Some of the wikis where I work *still* use 1.27 :(
[17:25:51] Your extension.json doesn't make sense
[17:26:00] manifest version 2 isn't in REL1_27
[17:26:05] So it wouldn't load there
[17:26:35] That's not good. I'll update that as well. Thanks for pointing it out
[17:26:46] 1.29 starts having version 2
[17:27:07] I see now
[17:27:33] "requires" also doesn't support 1.27
[17:28:10] It should?
[17:28:10] https://github.com/wikimedia/mediawiki/blob/1.27.0/docs/extension.schema.json#L273-L282
[17:29:10] https://www.mediawiki.org/wiki/Manual:Extension.json#requires lists two version numbers.
[17:29:28] I see now that the 1.29 is only for specifying extension dependencies
[17:29:36] Then it's alright
[17:32:43] https://gerrit.wikimedia.org/r/c/mediawiki/extensions/Expressions/+/602434
[17:43:00] CI again does not seem to get triggered :(
[17:43:52] I wonder if your other email needs whitelisting too, even if your commits aren't using it as committer/author
[17:48:07] Done that now
[17:56:37] Let's hope it works :)
[17:58:45] Reedy: A button that says "Submit including parents" now appears, but it's greyed out. Hovering over it says "This change depends on other changes which are not ready.". How should I go about fixing that? Since those changes will probably never be ready.
[17:59:06] You have to submit parent patches first
[18:01:09] How would I go about doing that, since those are not verified by Jenkins?
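A note on the autoloading discussion at 17:21-17:24: instead of hand-rolled include_once calls, extension.json can declare the classes to autoload. `AutoloadNamespaces` is the nicer option on MediaWiki 1.31+, while `AutoloadClasses` also works on older branches such as REL1_27, which fits the compatibility concern raised above. A minimal sketch; the class and file names are hypothetical, not the actual Expressions code:

```json
{
	"name": "Expressions",
	"manifest_version": 1,
	"AutoloadClasses": {
		"Expressions\\Expressions": "src/Expressions.php",
		"Expressions\\Evaluator": "src/Evaluator.php"
	}
}
```

Keeping manifest_version at 1 matches the 17:26 point that manifest version 2 only exists from 1.29 onward.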
[18:01:48] Depends
[18:01:58] The ones fixing stuff maybe should be moved onto master instead
[18:02:06] And merged before your test patch (which should be rebased after)
[18:02:29] You can also just force them all through, but I wouldn't advise doing that as a normal workflow
[18:09:41] Okay
[18:21:10] I'm sorry, but I still can't seem to figure out how I should do it. I am quite new to Git, and have never used Gerrit before.
[18:28:19] You can use rebase in the gerrit interface, and tell it master
[18:28:40] I tried that, but I get `Could not perform action: The change could not be rebased due to a conflict during merge.`
[18:29:02] Under `Merge conflicts` it lists https://gerrit.wikimedia.org/r/c/mediawiki/extensions/Expressions/+/602397
[18:29:52] You'll need to rebase locally then
[18:29:59] The git implementation in gerrit is naive
[18:44:46] I have to leave now, I will continue it tomorrow
[20:23:03] Hi. I'm moving a wiki from a subdomain to the main domain, so that the wiki should show up in https://stop-synthetic-filth.org/wiki/ (where /wiki/ is a real directory), but navigating to that address just gives the directory listing. Putting a full address like https://stop-synthetic-filth.org/wiki/Main_Page does yield the main page. I'm a little confused with the rewrite rules that I need (they are pretty close now, but not quite). I
[20:23:04] utilized
[20:23:05] https://shorturls.redwerks.org/ to detect and generate the rewrite rules, which are now inside a Directory-directive
[20:32:21] jukebohi: you should configure "index.php" as the default document
[20:46:48] Vulpix: where do I do that? In the VirtualHost-directive?
[20:49:39] https://www.tecmint.com/change-root-directory-of-apache-web-server/
[20:54:40] jukebohi: https://stackoverflow.com/a/19322414/7268905
[20:56:51] Basically, instead of displaying the directory index when accessing a directory (something you should never enable on an application server, to avoid people seeing any backup or leftover file with potential private information), it will try to serve a document with that name if it exists in that directory
[21:04:34] the situation is that the wiki is in /var/www/stop-synthetic-filth.org/wiki/w/ ... I did some changes and the wiki no longer loads at all, instead it complains 404. I did use the short URL tool again and got slightly different results. Document root for the server is /var/www/stop-synthetic-filth.org. Here is the Directory-directive that was generated https://paste.debian.net/1150285/
[21:07:33] These lines don't look right
[21:07:36] RewriteRule ^/?wiki/wiki(/.*)?$ %{DOCUMENT_ROOT}/wiki/w/index.php [L]
[21:07:38] RewriteRule ^/?wiki$ %{DOCUMENT_ROOT}/wiki/w/index.php [L]
[21:09:59] Do you expect your pages to be under /wiki/wiki/?
[21:10:11] No. I don't know where it gets that from
[21:11:17] What is your desired URL structure?
[21:11:34] domainname.tld/wiki/Article_name
[21:11:34] Provide both short url and "ugly" (index.php) urls
[21:12:30] If your wiki is also under /wiki/, that's not recommended
[21:13:03] I mean, index.php should be on a non-conflicting path, either /w/index.php or even /index.php
[21:14:08] It's not impossible to put index.php on the same path as /wiki/, but it has some limitations and is only recommended for people who know what they're doing
[21:14:15] Vulpix: I'm a little clumsy at technical things. Previously I have used /var/www/domain.tld/wiki/w/ as the wiki directory and '/wiki/Article_name' as the URL, but now I want to have /wordpress/ and /wiki/ at the root of the domain, so I can't use what I've been using so far
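A note on the local rebase Reedy suggests at 18:29: the usual way to resolve a conflict Gerrit cannot rebase itself is to check out the change locally, rebase it onto master, fix the conflicts, and push it back as a new patchset. A rough sketch of that flow; the fetch ref shown is only illustrative, and the exact command is whatever Gerrit's "Download" menu shows for the change:

```bash
# Get an up-to-date master in your local clone of the extension.
git fetch origin

# Check out the change that needs rebasing (example ref; copy the real one
# from Gerrit's "Download" menu for the change).
git fetch origin refs/changes/97/602397/1 && git checkout FETCH_HEAD

# Rebase onto master and resolve any conflicts by hand.
git rebase origin/master
# ...edit the conflicting files, then:
git add -A
git rebase --continue

# Push the rebased commit back as a new patchset of the same change.
# The Change-Id footer in the commit message keeps it attached to the review.
git push origin HEAD:refs/for/master
```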
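On Vulpix's 20:32 advice about the "default document": in Apache that is the DirectoryIndex setting, which names the file to serve when a bare directory URL is requested. A minimal sketch using the document root mentioned in the log; everything else is illustrative:

```apacheconf
# Inside the site's configuration: serve index.php when a directory such as
# /wiki/ is requested, and keep directory listings off so stray backup or
# leftover files are not exposed.
<Directory /var/www/stop-synthetic-filth.org>
    DirectoryIndex index.php
    Options -Indexes
</Directory>
```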
[21:15:48] having index.php on /wiki/w/ and the pretty URL on /wiki/ would confuse the server. Think about this: Is /wiki/w/ the path to index.php or a page named [[w/]]?
[21:16:11] Vulpix: that seems to be the problem. the system gets confused
[21:17:00] Would you be comfortable having the wiki's index.php under /w/ and pretty URLs under /wiki/ ?
[21:18:22] Well, it also depends on how long you have had your old structure. If it's an established wiki you probably want to keep the old structure in place (the /wiki/w thing) to prevent broken links
[21:19:02] redirects can help with that, anyway
[21:20:21] Vulpix: I'd be happy with something that gives me the wiki in /wiki/ and WordPress in /wordpress/
[21:20:34] I can scrap the WordPress if it is getting too much in the way
[21:22:50] Vulpix: If I place the MediaWiki files directly into /var/www/stop-synthetic-filth.org/wiki/ instead of /var/www/stop-synthetic-filth.org/wiki/w/ could I get this arrangement to work?
[21:23:52] Yes. But would you want "short URLs" on /wiki/ or somewhere else? Because then the wiki files should be on a different path
[21:24:45] Vulpix: I'd like the short URLs to be https://stop-synthetic-filth.org/wiki/Article_name
[21:26:06] You can try moving your wiki to %{DOCUMENT_ROOT}/w/
[21:26:17] and the only rule you need is RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/w/index.php [L]
[21:26:59] $wgScriptPath = "/w";
[21:27:00] $wgArticlePath = "/wiki/$1";
[21:35:53] hmm.. the short URL builder stopped working (I have commented out all that would affect it from the VirtualHost and from LocalSettings.php) ... maybe I'll try manually
[21:41:42] * Naypta just published extension code to Gerrit for the first time! 🥳
[21:42:18] I'm following the "Writing an extension for deployment" guide - it says to either ask here or on the wikitech-l mailing list for people who might be willing to review; which is best to do? :D
[21:43:22] With 'RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/w/index.php [L]' the redirection from /w/ -> /wiki/ works, but the wiki is nowhere to be found (404). Something is slightly wrong somewhere..
[21:47:30] Butbutbut... $wgScriptPath used to be "/w" when the wiki was in /var/www/sitename.tld/wiki/w/ whereas now the MediaWiki files are in /var/www/sitename.tld/w/
[21:50:05] jukebohi: did you set $wgScriptPath and $wgArticlePath as I said? It looks like you didn't https://stop-synthetic-filth.org/w/api.php?action=query&meta=siteinfo
[21:50:23] At least not $wgScriptPath
[21:50:26] I set them but the thing would not work
[21:51:20] your script is on https://stop-synthetic-filth.org/w/load.php, so the script path is /w
[21:52:23] but now the wiki won't work
[21:52:25] the leading / indicates the URL starts at the domain level. (your domain)/w
[21:53:36] Is there any RewriteCond that would cause the RewriteRule to not run?
[21:54:29] where did you define that rule? in which .htaccess? I would expect it to be in the root folder and not in the /w path
[21:55:03] I defined it inside of the VirtualHost, inside of a Directory-directive. Is that wrong?
[21:55:55] it should be directly inside the VirtualHost, outside of any Directory directive
[21:56:14] Naypta: what would you need reviewed? got a link?
[21:57:01] cheers ashley! MarcoAurelio who created the gerrit repo has actually since added himself as a reviewer it seems, but the more the merrier of course :D the changeset is at https://gerrit.wikimedia.org/r/c/mediawiki/extensions/CatTalk/+/602500
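Pulling together the working configuration Vulpix describes between 21:26 and 21:55: one rewrite rule directly inside the VirtualHost, the MediaWiki files under /w/, and two path settings in LocalSettings.php. A sketch under those assumptions; the paths come from the log, the rest (ports, SSL, other directives) is omitted or illustrative:

```apacheconf
# VirtualHost for stop-synthetic-filth.org (SSL and other directives omitted).
# The rewrite rule lives directly in the VirtualHost, outside any <Directory>.
<VirtualHost *:443>
    ServerName stop-synthetic-filth.org
    DocumentRoot /var/www/stop-synthetic-filth.org

    RewriteEngine On
    # Map /wiki and /wiki/Anything to MediaWiki's entry point in /w/.
    RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/w/index.php [L]
</VirtualHost>
```

```php
// LocalSettings.php - matching path settings for the layout above.
$wgScriptPath  = "/w";        // where index.php, api.php, load.php live
$wgArticlePath = "/wiki/$1";  // pretty URLs: /wiki/Article_name
```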
[21:57:34] Vulpix: Now I changed the scope of the Directory-directive to instead of and now all seems to be working
[21:57:39] I'm new to gerrit so I'm not sure if him being added means he's actively taking an interest in reviewing it or whether it is just because he made the repo for me
[21:59:16] glad it works now :)
[22:02:43] Big thank you to you Vulpix for helping out
[22:23:01] Naypta: left you some comments and suggestions, please feel free to ask if(/when) some of 'em are unclear or whatnot
[22:23:16] ashley: awesome, tysm! will have a read :D
[22:24:42] happy to help (even if I don't exactly have experience in getting code WMF-deployed, though some of my things have ended up there regardless :)
[22:25:36] heh, your guesses are bound to be far better than mine - I've flung myself in at the deep end here having not worked with PHP in years, so it's going to be a learning experience whateverwhichway :p
[22:25:44] * Naypta didn't miss PHP, to be fair...
[23:33:58] Guys, how do I add a Google Tag Manager tag in the Timeless skin?
[23:48:54] g'nite and thanks to you awesome people. *ZZZZZ*