[04:17:26] how do I add a See Also to a page in MW? (own hosted instance)
[04:17:43] oh, I guess I can just write it in the page source
[07:26:55] After issues patching to 1.31.12 I did a reinstall, but trying to dry-run the .13 patch I'm getting "reversed patch detected". Do I just have to do a reinstall again?
[07:28:18] NGC_6205: well, you should go to 1.31.14
[07:28:29] what issues are you having?
[07:28:32] but I thought I had to patch to .13 first?
[07:28:46] ah, no, you're correct
[07:33:51] I was very much hoping this would be a quick and easy update :(
[07:42:07] maybe it's just two extensions, I'll just update them manually
[16:29:36] Hello, I have an issue with MediaWiki OAuth authentication. Has anyone successfully used OAuth 2.0 on their MediaWiki?
[16:33:30] Myst: tell us about the issues you have, we can't help you otherwise
[16:40:06] https://phabricator.wikimedia.org/T279056
[16:40:11] It's the same error
[16:40:50] *nearly the same; I don't have it in the same file, but the cause is most likely the same
[16:41:40] When I create an OAuth 2.0 consumer on my MediaWiki/Wikibase installation and submit the form, I get the error "Method MediaWiki\Extensions\OAuth\Entity\AccessTokenEntity::__toString() must not throw an exception, caught ParseError: syntax error, unexpected 'Parser' (T_STRING), expecting function (T_FUNCTION) or const (T_CONST)"
[16:42:30] which PHP version are you using?
[16:42:52] 7.3.27
[16:43:23] up-to-date Debian 10
[16:45:19] For my error, I think the issue comes from:
[16:45:19] " $accessToken = \Html::element( 'span', [
[16:45:19] 'style' => 'overflow-wrap: break-word'
[16:45:20] ], (string)$accessToken );
[16:45:20] "
[16:45:20] the "(string)$accessToken"
[16:47:16] It seems I don't have the latest MediaWiki; I'll install 1.35.2 and test again
[16:47:24] I'm running 1.35.1
[17:01:35] Ok, same issue with 1.35.2
[17:09:34] Is there any more to the error message than that?
[17:14:51] Hi. Does anyone know how a whole subpage space can be matched in AbuseFilter? For example, all pages under "Test/"?
[17:15:46] use rlike?
[17:16:35] or even just.. title contains "Test/"
[17:16:47] page_title contains "Test/" I guess
[17:28:46] thanks!
[17:51:17] Sorry for asking again, but is there any explanation for why editing abuse filters would not show up on Special:RecentChanges?
[17:51:29] (and yes, I don't have any bot flag or anything, and other edits work normally)
[18:04:44] Reception123: https://phabricator.wikimedia.org/T34959
[18:05:23] thanks, though the filter was public in this instance
[18:06:18] Majavah: or am I misunderstanding, and it was removed for all filters?
[18:06:53] Hi there, how do I prevent search engines from indexing non-existing pages?
[18:07:05] Reception123: I think it was removed for all filters as a side effect, yes
[18:07:11] thanks
[18:09:34] though I do have the abusefilter-log and abusefilter-view permissions, shouldn't that allow me to view it in RC or in "All public logs"? The only place I can see it is in the abuse log specifically
[18:11:52] For example, I make an internal link to [[Missing]], which doesn't exist. How do I prevent robots from visiting [[Missing]]?
[18:12:08] Reception123: Special:AbuseLog is for abuse filter hits, not for abuse filter changes
[18:12:33] sorry, I misspoke, I meant the abuse filter log
[18:12:46] I am able to see changes to abuse filters there, but not in "All public logs" or RC
[18:13:24] adfeno: those pages already have meta robots noindex. Do you want bots to not even attempt to open them to see that they're not indexable?
[18:13:54] Vulpix: that would help too.
[18:15:26] adfeno: if you have "pretty URLs" implemented, where pages are on /wiki/PAGE but editing goes to /index.php?title=PAGE, you can add a line to robots.txt disallowing /index.php
[18:20:29] Vulpix: Hm… OK, will check it out.
[18:21:01] On a related question: is there any way to prevent indexing of deleted pages?
[18:21:18] Indexing by search engine robots, I mean
[18:28:27] Vulpix: unless I'm mistaken, the noindex meta tag might not be enough to stop missing pages from being indexed (https://www.mediawiki.org/wiki/Manual:Robots.txt#Spidering_vs._indexing). I wonder what could be done then? Or is it enough to just trust the noindex meta tag in this case?
[18:30:57] adfeno: meta robots noindex is sufficient for preventing search engines from indexing those pages: https://developers.google.com/search/docs/advanced/crawling/block-indexing
[18:32:07] Note, however, that preventing bots from accessing those pages (for example, by disallowing them with robots.txt) may cause search engines to still list those pages, because they can't access the page to see if it has a noindex tag
[18:33:19] deleting a page automatically causes it to have a meta robots noindex tag. However, the search engine won't know the page has been deleted until it crawls it again
[18:35:06] Google and Bing provide ways to manually remove specific pages
[18:38:11] What about YaCy?
[18:44:23] meta robots is a standard way to tag pages as not indexable (or at least that's the webmaster's intention). Every crawler should support this
[18:50:15] Vulpix: thanks for the information ;)
[18:55:16] How do I add nofollow to links to non-existing wiki pages?
[19:27:06] With a hook, probably. This is one thing MediaWiki should do by default, though
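One way the hook approach mentioned above could look: a minimal, untested sketch for LocalSettings.php, assuming MediaWiki 1.28 or later, where the HtmlPageLinkRendererEnd hook runs for every internal link and reports non-existing targets via $isKnown being false.

    // Sketch: add rel="nofollow" to internal links whose target page does not exist.
    // Assumes the HtmlPageLinkRendererEnd hook (MediaWiki 1.28+); untested.
    $wgHooks['HtmlPageLinkRendererEnd'][] = function (
        $linkRenderer, $target, $isKnown, &$text, &$attribs, &$ret
    ) {
        if ( !$isKnown ) {
            // Append nofollow, preserving any rel value already set on the link.
            $attribs['rel'] = isset( $attribs['rel'] )
                ? $attribs['rel'] . ' nofollow'
                : 'nofollow';
        }
        return true; // continue with MediaWiki's normal link rendering
    };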
[21:51:14] addshore: regarding the mediawiki-docker-dev composer install command - I usually run it inside ./bash instead of on the host (or via another Docker container)
[21:51:23] do other people do that? should we recommend that?
[21:51:38] We could recommend that, but the docs currently don't
[21:51:54] mwcli is the future anyway ;)
[21:52:00] seems like it'd be harder to mess up and more closely under our control and testable.
[21:52:20] e.g. otherwise you'd need to make sure the user has the right PHP and Composer versions first, which is a pretty big hurdle for some
[21:52:29] I might even have a testable version of the new mwdd in mwcli this weekend
[21:52:40] k
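A rough sketch of the container-based workflow described above, assuming the ./bash helper from a mediawiki-docker-dev checkout and that the MediaWiki code is mounted inside the container:

    ./bash              # open a shell inside the mediawiki-docker-dev web container
    composer install    # now runs with the container's PHP and Composer, not the host's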
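For the earlier question about keeping crawlers away from non-existing pages, a minimal sketch of the robots.txt approach Vulpix describes, assuming pages are served from /wiki/PAGE and index.php sits at the web root:

    User-agent: *
    Disallow: /index.php

With that layout, links to non-existing pages point at /index.php?title=...&action=edit&redlink=1, so this keeps crawlers off them (and off edit and history URLs generally) while leaving /wiki/ pages crawlable; as noted above, a page a crawler cannot fetch may still be listed as a bare URL if something links to it.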
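And for the earlier AbuseFilter subpage question, a small sketch of a filter condition, assuming the "Test/..." pages live in the main namespace so page_title carries the full title:

    page_title rlike "^Test/"

The anchored regex matches only titles that start with "Test/", whereas page_title contains "Test/" would also catch titles such as "MyTest/Foo" that merely contain the substring.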