[00:02:49] Sazpaimon: sounds like it could be; I mean, the usual way to modify the hooks array is to define the "Hooks" section in the extension's extension.json file (or a skin's skin.json file)
[00:03:16] sure, if you're writing an extension
[00:03:28] but what if you just wanna make a simple hook in your LocalSettings.php file
[00:04:08] well it's PHP, so using $wgHooks should be doable (but depending on what you're trying to do... a mini-extension of some kind might still end up being the thing you want & need)
[00:04:44] I mean, all 3 methods work fine
[00:05:00] I was just curious if one would end up being deprecated/removed in the near future
[00:06:39] $wgHooks would be more stable than Hooks::register, probably
[00:06:48] in the context of LocalSettings.php
[00:06:54] the latter is mostly an implementation detail
[00:07:13] whereas the former was how hooks were defined for a (very) long time, long before extension.json/skin.json existed
[00:07:42] that being said, I can't predict the future very well :)
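For reference, a minimal sketch of the $wgHooks approach in LocalSettings.php. BeforePageDisplay is a real core hook, but the handler body and module name are just an illustration:

```php
# LocalSettings.php — minimal sketch of registering a hook handler
# via $wgHooks. BeforePageDisplay is a core hook; the module name
# below is a made-up example.
$wgHooks['BeforePageDisplay'][] = function ( OutputPage $out, Skin $skin ) {
	// Illustration only: load an extra ResourceLoader module on every page.
	$out->addModules( 'ext.myExample' );
	return true;
};
```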
[00:34:38] speaking of hooks, is there a way to hook into the sidebar rendering for the vector skin that allows me to edit some of the rendered html? It doesn't seem like there is
[00:35:17] for example, I want to add some font awesome icons next to some of the sidebar entries, which requires I add an <i> tag next to the text
[00:35:44] I can modify the skin directly if needed, but I wanted to see if there was a way to do it without messing with the skin files directly
[00:37:49] Sazpaimon: I would do that with css
[00:39:14] css won't help me add the needed markup to the sidebar elements
[00:39:51] are you sure?
[00:40:24] what do you want to do, exactly?
[00:40:31] (final result)
[00:43:00] I have a question about the 'pagelinks' and 'page' tables. If I find an entry in 'pagelinks' that is `pl_from(123) → pl_title("foo")` and I find in 'page' an entry `page_id(123), page_title("bar"), page_is_redirect(1)`. Is it possible for the 'page' entry to not be a redirect to "foo"?
[00:43:40] hmm
[00:43:58] I guess you could have #REDIRECT [[foobar]]\n[[foo]]
[00:44:06] that might produce such a result
[00:44:44] so say I have a link to a message board, I'll add `** http://mysite.org/forums | Forums` to MediaWiki:Sidebar. This will output `<li id="n-Forums"><a href="http://mysite.org/forums">Forums</a></li>`
[00:45:10] adrian_1908: have a look at the redirect table
[00:45:35] ok, Sazpaimon
[00:46:15] if I wanted to add a font awesome icon, I'd need to add something like `<i class="fa fa-comments"></i>` before the "Forums" text
[00:46:30] Platonides: For comparison it might be useful, but the redirect table is said to be incomplete and I'm not sure to what degree.
[00:46:33] see: https://fontawesome.com/icons/comments
[00:46:51] adrian_1908: which wiki are we talking about?
[00:47:20] Sazpaimon: should the icon be inside the <a>?
[00:47:27] Wikipedia, but it's on MediaWiki.org where I found a statement regarding that.
[00:47:44] Platonides, to make it clickable, yes
[00:47:46] “As of August 2007, database dumps for Wikipedia and other Wikimedia projects as provided on https://dumps.wikimedia.org/ have incomplete data in this table: only redirect pages that have been created or edited after summer 2007 are present. For older redirects, resort to using the pagelinks table.”
[00:48:04] "As of August 2007"
[00:48:21] Yes, but will all old redirects have been touched?
[00:48:23] every page will have been reparsed since
[00:49:26] Reparsed meaning what? I assumed the message says that some kind of edit had to have taken place. Or will automated actions have touched every page since?
[00:50:04] I think a cron will have run over it
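To make the cross-check concrete, a sketch of the sort of query involved — assuming direct database access, unprefixed table names, and the schema of that era (pagelinks still carrying pl_title):

```sql
-- Sketch: for the page in the example (id 123, marked as a redirect),
-- compare its pagelinks rows with its redirect-table entry.
-- Assumes direct DB access and default unprefixed table names.
SELECT p.page_title,
       pl.pl_namespace, pl.pl_title,
       rd.rd_namespace, rd.rd_title
FROM page AS p
JOIN pagelinks AS pl ON pl.pl_from = p.page_id
LEFT JOIN redirect AS rd ON rd.rd_from = p.page_id
WHERE p.page_id = 123
  AND p.page_is_redirect = 1;
```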
[00:50:25] I'm thinking the final html should be something like: `<li id="n-Forums"><a href="http://mysite.org/forums"><i class="fa fa-comments"></i> Forums</a></li>`
[00:50:49] you will also need some css using that fa-comments class
[00:51:11] and it can be changed to use #n-Forums a
[00:51:37] or even #n-Forums a:first-child
[00:52:07] hmm, not sure about the latter
[00:53:55] Sazpaimon: see https://fontawesome.com/how-to-use/on-the-web/advanced/css-pseudo-elements
[00:55:58] comments is f086
[00:57:11] so #n-Forums a::before { font-family: "Font Awesome 5 Free"; font-weight: 900; content: "\f086"; }
[00:57:21] that should do
[00:57:40] it may also need a display: inline-block;
[01:03:32] yeah I can give that a try
[01:04:06] I would have rather used the actual font awesome styles themselves, personally
[01:04:25] just so I don't have to go poking around for the actual character code for each icon
[01:06:45] if MediaWiki:Common.css supported a preprocessor like less, I could import the fa-comments class
[01:09:09] you could do that with an extension
[01:09:33] but really, looking up the few codes you will need will be much quicker
[01:10:08] plus, you'd also need to look up the names (is it comments, or speak bubble? no, but I want the one with the money...)
[01:10:18] you can put some comments with the code name alongside it
[01:12:08] yeah the only other alternative would be to edit the skin template directly
[01:12:45] and tweaking extension and template files was what got me in the mess of not updating MW for 5 years
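Putting the pieces together, the MediaWiki:Common.css rule might end up looking like this — assuming the Font Awesome 5 Free webfont is already loaded on the wiki, and with the spacing as a guess:

```css
/* Sketch for MediaWiki:Common.css — prepend the fa-comments glyph
   to the Forums sidebar link. Assumes Font Awesome 5 Free is loaded. */
#n-Forums a::before {
	font-family: "Font Awesome 5 Free";
	font-weight: 900;
	content: "\f086"; /* fa-comments */
	display: inline-block;
	margin-right: 0.3em; /* small gap before the label, adjust to taste */
}
```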
[01:56:39] I seem to have lost my editing toolbar
[01:56:46] how can I bring it back
[02:04:40] can you try in a new private window without logging in?
[02:04:51] just to see if it's an issue with the wiki configuration, or an account setting
[02:09:09] maybe you were using the classic toolbar, which was removed from wikimedia sites several months ago?
[02:09:33] mnathani: when did it stop working?
[02:09:38] mnathani: did you upgrade the wiki recently?
[02:15:35] I just upgraded it now
[02:15:48] how do I tell what version it is running?
[02:19:04] now I get a 500 server error
[02:22:34] see https://www.mediawiki.org/wiki/Manual:How_to_debug
[03:52:27] Thanks Gryllida
[03:52:33] you're welcome :-)
[03:52:42] it's quite a feat.. did you figure it out?
[04:26:24] wipe and reload
[04:26:30] joking ...
[04:26:47] restore from backup and upload new mediawiki using ftp
[04:26:58] could not figure out how to extract in the same folder
[04:27:03] using ssh
[04:27:12] all good now
[04:27:24] everything working, without 500, with toolbar? :-)
[04:27:30] yup
[04:27:53] no data loss either
[04:28:41] nice :-) did you consider the "timeless" skin at your wiki?
[04:29:00] it seems to me like heaven, I use it at Wikipedia and everything :-) a lot more ergonomic in my experience
[14:47:57] I use the MW API to query site statistics and CollectD to gather some info directly from MW databases. Semantic MW has its own API calls using smwinfo but it doesn't provide the "Outdated entities" count that shows on Special:Statistics. I've been reading through the source code trying to understand where that comes from, but in short, is there any relatively easy way to query that value so I can add it to my grafana wiki dashboard?
[14:50:10] DB query?
[14:50:22] Or feature request and ask the SMW guys to add it to the API?
[14:55:44] I just have to keep digging through the source to figure out how to query for that value. Just making sure I wasn't missing something obvious where it was directly accessible outside the database (without crude web scraping).
[14:56:02] I thought about a feature request; first I figured I'd ask here to be sure I wasn't missing something.
[14:59:01] so many layers of method calls!
[15:04:01] I mean, if it's shown on Special:Statistics, they're probably using some hook call to add stuff to the page
[15:04:56] https://github.com/wikimedia/mediawiki/blob/master/includes/specials/SpecialStatistics.php#L67-L69
[15:05:00] hook is SpecialStatsAddExtra
[15:05:22] https://github.com/SemanticMediaWiki/SemanticMediaWiki/blob/a564e4cb14d9f724af5f2272990cdbffff0a7eb7/src/MediaWiki/Hooks/SpecialStatsAddExtra.php#L21
[15:05:42] That's where I was looking but it seems Outdated entities is a bit more complicated than what gets added in that hook
[15:06:43] So, using https://wikifarm.wmflabs.org/cpt/index.php/Special:Statistics as an example
[15:06:48] - Outdated entities ⁱ 1,599
[15:06:53] https://wikifarm.wmflabs.org/cpt/index.php/Special:Statistics?uselang=qqx
[15:06:59] smw-statistics-delete-count
[15:07:05] https://github.com/SemanticMediaWiki/SemanticMediaWiki/blob/a564e4cb14d9f724af5f2272990cdbffff0a7eb7/src/MediaWiki/Hooks/SpecialStatsAddExtra.php#L57
[15:07:29] It's in $this->store->getStatistics()['DELETECOUNT']
[15:07:41] hmm, maybe the naming is throwing me off since I've been scouring for "outdated" through the code
[15:08:11] Yeah, I gave up with that after "outdated" wasn't in the file at all bar one comment
[15:08:18] I knew it had to be referred to as something else
[15:09:25] Sweet, it is delete-count, which is available in the api call, api.php?format=json&action=smwinfo&info=deletecount
[15:09:28] perfect, thanks!
[15:10:31] So I can add a collectd curl_json config to grab the smwinfo values from my wikis to easily get them into graphite
[15:12:18] I use collectd dbi for some other stuff that is also available through the api call, so I may move the ones that can into curl_json as well to eliminate the complexity of the curl_json.conf
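A sketch of what that curl_json block might look like — the wiki URL and Instance name are placeholders, and the JSON key path ("info/deletecount") should be verified against the actual api.php response:

```
# collectd.conf sketch: poll smwinfo and feed deletecount into graphite.
# URL and Instance are placeholders; verify the JSON path against
# your wiki's actual api.php output.
LoadPlugin curl_json
<Plugin curl_json>
  <URL "https://wiki.example.org/w/api.php?format=json&action=smwinfo&info=deletecount">
    Instance "smw_mywiki"
    <Key "info/deletecount">
      Type "gauge"
    </Key>
  </URL>
</Plugin>
```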
[15:12:48] Too bad database sizes can't be queried like that. I have to use the information_schema table to grab that info.
[15:14:56] It wouldn't be impossible to make MW show that
[15:15:31] the MW API for siteinfo has a few hooks
[15:15:46] You could subscribe to them (even in LocalSettings.php) with a hook that does a DB query against information_schema etc
[15:16:13] I would guess... at most 10-15 lines of PHP?
[15:16:25] Bit more if you wanted it on Special:Version too
[15:31:07] The database queries work, and I'm not much of a PHP coder, haven't really touched it significantly in years, like almost 20. Being able to simplify what I already have and adding in the smwinfo stats is a good enough win for me. :)
[15:31:57] I mean the db size queries specifically. The messy thing is that I've had to have multiple Database blocks in the collectd dbi config due to using different database users to connect.
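A rough sketch of that suggestion as it might look in LocalSettings.php. APIQuerySiteInfoGeneralInfo is a real core hook, but the 'dbsize' result key and the information_schema query are illustrative, and the wiki's DB user is assumed to be able to read information_schema:

```php
# LocalSettings.php sketch: add a database-size figure to the
# action=query&meta=siteinfo output. The 'dbsize' key is made up;
# adapt the query to your setup.
$wgHooks['APIQuerySiteInfoGeneralInfo'][] = function ( $module, &$results ) {
	$dbr = wfGetDB( DB_REPLICA );
	// Sum data + index size for all tables in the wiki's database.
	$res = $dbr->query(
		'SELECT SUM(data_length + index_length) AS total' .
		' FROM information_schema.tables' .
		' WHERE table_schema = ' . $dbr->addQuotes( $dbr->getDBname() ),
		__METHOD__
	);
	$results['dbsize'] = (int)$res->fetchObject()->total;
	return true;
};
```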
[16:27:09] andre__: do you have an NGINX reverse proxy running in front of mediawiki?
[16:27:32] hispeed_m: no, why?
[16:32:40] andre__: because I want to do it and maybe you have a working config?
[17:33:44] Hello, I have a question about webscraping and Wikipedia. Can anyone help me?
[17:33:54] Potentially
[17:34:31] !ask | frog59
[17:34:31] frog59: Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply. Don't ask for help or for attention before actually asking your question, that's just a waste of time – both yours and everybody else's. :)
[17:36:04] :) ok, thanks. I tried to scrape data from Polish Wikipedia using rvest but I was blocked (404 error). Is it because I tried to get too much data? They were references from articles in one category (a lot of them)
[17:36:43] frog59: what makes you think you're blocked?
[17:36:45] I don't think you'd get a 404
[17:37:12] (but https://www.mediawiki.org/wiki/API might make more sense anyway)
[17:37:43] Answer: 'open.connection(x, "rb")': HTTP error 404
[17:38:09] It seems I try to get too much data at once but I am not sure...
[17:38:33] What is x?
[17:38:42] beyond a variable
[17:38:47] what are you actually trying to request
[17:40:00] References and categories for a set of Polish Wikipedia articles (category: Polska pod zaborami)
[17:40:17] Yes, but what actual URLs?
[17:41:12] It was a set of 179278 articles. For instance https://pl.wikipedia.org/wiki/Zbigniew_Klimowski and so on
[17:42:31] Are you sure the 404 wasn't just an article that has now been deleted?
[17:42:32] $ curl -I https://pl.wikipedia.org/wiki/Lolnocanhas
[17:42:32] HTTP/2 404
[17:43:02] No, because I can get these articles in my browser
[17:43:58] Maybe you could suggest how to get categories and references for such a huge set of articles?
[17:44:15] The API can give you a list of categories on a page
[17:44:41] References aren't available that way, though
[17:45:24] Is the only solution to use Wikipedia dumps?
[17:45:28] TechCom is hosting an office hour to talk about the new and improved RFC process (and anything else you may want to discuss) in a bit over two hours. If you are interested, please join us at 20:00 UTC (1pm PST) in the #wikimedia-office channel.
[17:46:02] https://en.wikipedia.org/w/api.php?action=help&modules=query%2Bcategories
[17:47:03] Oops, correction. In a bit over three hours. 2pm PDT, 21 UTC. Sorry :)
[17:47:57] Thank you!
[17:52:28] frog59: Confirmed, you wouldn't get 404s
[17:53:16] Maybe it is simply a problem with this R package - rvest?
[17:53:38] Entirely possible
[17:55:46] But maybe here is a solution: https://en.wikipedia.org/wiki/Wikipedia:Database_download#Please_do_not_use_a_web_crawler
[17:56:17] It seems to me that this is my case but I wasn't sure so I asked you this question
[20:26:25] hello, I have a MW install that I've upgraded from 1.19 to 1.34
[20:26:58] now everything works except the Special:Categories page where I get a database error
[20:27:11] the debug log shows it's an encoding (utf8) error
[20:27:59] looking at the database, there's a mix of binary, latin1_swedish_ci, and utf8_general_ci encodings
[20:28:23] is there a maintenance script to convert all my tables to the correct encodings?
[20:30:21] Nope
[20:32:02] so... how can I fix the tables? what are they supposed to use? all binary?
[20:32:49] I'm no expert but there are Percona tools that can help make such changes.
[20:33:09] I have phpMyAdmin on hand
[20:33:10] https://www.percona.com/software/database-tools/percona-toolkit
[20:33:38] but I don't know what the end result is supposed to look like, e.g. what you get with a clean install
[20:34:07] Default is
[20:34:07] $wgDBTableOptions = 'ENGINE=InnoDB, DEFAULT CHARSET=binary';
[20:34:43] I'm going to try this https://www.winterrodeln.org/trac/browser/servermediawiki/trunk/maintenance/update_mediawiki_sql.py
[21:05:02] so it worked, those ALTER statements fixed my problem, and the wiki contents still look good
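For anyone landing here later: the fix boils down to per-table statements along these lines (shown for categorylinks, the table behind the Special:Categories error, and not necessarily exactly what the linked script runs) — back up first:

```sql
-- Sketch: convert one table to MediaWiki's default binary charset,
-- matching DEFAULT CHARSET=binary. CONVERT TO CHARACTER SET binary
-- turns CHAR/VARCHAR/TEXT columns into their binary counterparts.
-- Back up the database first; repeat per table.
ALTER TABLE categorylinks
  CONVERT TO CHARACTER SET binary;
```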
[21:48:29] I used to have "file:" in $wgUrlProtocols and that broke File: tags
[21:48:40] I removed it and that fixed things
[21:49:06] now I have the same problem in the VisualEditor, [[File:foo.bar]] tags being rendered incorrectly
[21:49:16] regardless of $wgUrlProtocols
[21:49:31] is there a config somewhere in VE that might need tweaking?
[21:55:02] lavamind: check if it may be a cache problem (try a page that you can be almost sure nobody would have tried to edit with VE)
[21:55:40] also try to inspect the image with F12 and see if the URL may be pointing to a wrong hostname or path because of a bad configuration
[21:57:13] Vulpix: it's the exact same problem I had before; if I change the link to [[Fichier:foo.bar]] then it works in VE
[21:57:37] it's almost like VE had cached the wgUrlProtocols parameter somewhere
[21:58:24] pretty sure it's not a *local* cache, since the problem remains even in a private browser window
[21:59:01] I'm not sure where it could be cached... I doubt Parsoid or RESTBase can cache that, but who knows
[22:00:08] Vulpix: aah, restarting Parsoid fixed it :)
[22:00:28] good to know!
[22:21:20] Huh, apparently {{DEFAULTSORT: }} does not do the same as [[Category:Foo| ]]
[22:21:23] e.g. to sort a page at the top
[22:21:47] not that I've ever written {{DEFAULTSORT: }} this way, but it is what VE/Parsoid produce by default.
[22:21:57] I guess it is trimmed and considered absent/ineffective
[23:35:52] Huh.