[09:15:56] Hey, guys. I have a bunch of images on a Synology NAS, how do I go about linking them in MediaWiki? I've already tried File:smb:// and Image:
[09:19:17] Jux: can you make a symlink in your uploads folder?
[09:19:37] Talking of uploads folders... Nothing I tried seemed to make it 'secure' according to the installer.
[09:19:58] I'm currently trying the .htaccess file, but HTML is still being rendered in that location.
[09:20:31] 1) what script does the installer use to check that?
[09:20:45] I can make a symlink, yes... So, I'd just $ ln -s and that's it?
[09:21:03] Jux... I'm sure I'm missing something, but then your wiki could at least see those files
[09:21:27] Hmm... okay. I'll do it and see what happens
[09:21:29] My first thought was 'import them into the wiki', but you don't want to duplicate them
[09:21:56] Yes, I have a colossal database of at least 20 TB worth of images
[09:21:59] I'm just reading this for my problem, it may help you too: https://www.mediawiki.org/wiki/Manual:
[09:22:10] You really need that much prawn?
[09:23:06] ROFL... Pr0n and wiki is a strange combination, don't you think?
[09:23:16] no comment
[10:31:37] Does anybody know if it's absolutely necessary to manually upload images into MediaWiki before they can be used? Can't they just be dropped in ./images?
[11:35:31] Jux: There is https://www.mediawiki.org/wiki/Manual:$wgHashedUploadDirectory which will keep the flat directory structure. The problem is that each image also has an entry in the database, so just dumping all the images in the images directory will probably not make them appear in the wiki.
[11:40:59] there is a maintenance script that can import a folder of images into MediaWiki
[11:41:04] There is an importImages.php script in the maintenance folder, so a course of action that might work could be to import all the images (which will copy them into the images dir) and then replace the copies with symlinks. I don't know if there is a way to just register the files in the db.
[11:55:20] hello
[11:55:38] Is there a simple way to check if my upload directory is protected correctly?
[11:55:51] The install script seems to do this
[11:57:59] Warning: Your default directory for uploads /var/www/html/mediawiki/images/ is vulnerable to arbitrary scripts execution.
[11:59:21] my .htaccess file isn't being respected (I don't think), but I'm not sure how to configure the vhost
[12:02:14] Do I put the directive within the VirtualHost directive in the /etc/apache2/sites-enabled/mediawiki.conf file?
[12:05:30] I'll ask in #apache
[12:14:32] Nikerabbit: hey! can you tell me what script you ran for the profiler analysis, and on what data? I'd like to reproduce and analyze what's happening.
[12:16:09] duesen: it's from production so you don't have the data unfortunately, details are in https://phabricator.wikimedia.org/T230100
[12:16:34] duesen: in theory you could set up mediawiki-core in your test wiki, but that would take a while
[12:18:54] I'm struggling to implement what's written here: https://www.mediawiki.org/wiki/Manual:Security#Upload_security
[12:18:57] Nikerabbit: I have never set up Translate before. What's the minimal setup I need to test this with just 100 messages or so?
[12:19:17] Is that Directory directive an absolute path, or relative to the DocumentRoot?
[12:20:31] duesen: you need ULS + Translate, then you can pretty easily set up a message group for core (or any set of json files), import that into the wiki and try to export it again
[12:21:32] faceface: absolute, see https://httpd.apache.org/docs/current/mod/core.html#directory
[12:21:57] thanks duesen
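For the upload-directory warning above, a minimal sketch of the kind of <Directory> block Manual:Security#Upload_security is getting at, placed inside the <VirtualHost> of /etc/apache2/sites-available/mediawiki.conf with an absolute path (as answered above). The path and the php_admin_flag line are assumptions: they presume mod_php; with PHP-FPM you would instead make sure no PHP handler applies to this path.

    <Directory "/var/www/html/mediawiki/images">
        # Never execute anything from the upload directory
        Options -ExecCGI -Indexes
        AllowOverride None
        # mod_php only: turn the PHP engine off here
        php_admin_flag engine off
        RemoveHandler .php .phtml .php5
    </Directory>

After reloading Apache, the installer's security check should stop warning once nothing under images/ can be executed as a script.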
[12:22:34] Nikerabbit: do you have documentation for "pretty easily set up a message group for core"?
[12:22:48] duesen: well, yes and no
[12:23:01] https://github.com/wikimedia/translatewiki/blob/master/groups/MediaWiki/MediaWiki.yaml but it's more complex than what you need
[12:23:02] :P
[12:23:44] and there is https://www.mediawiki.org/wiki/Help:Extension:Translate/Group_configuration and https://www.mediawiki.org/wiki/Help:Extension:Translate/Group_configuration_example
[12:23:57] and you don't even need to use YAML if you don't want to, but there aren't good examples of that
[12:24:50] duesen: basically configure $wgTranslateGroupRoot, create a simple YAML configuration file based on the model, and use scripts/processMessageGroupChanges.php to import it in batches (you need the translate-manage right)
[12:26:20] oh, and register the YAML file with $wgTranslateGroupFiles[]
[12:26:49] pick maybe some small extension with 20+ messages
[12:27:43] duesen: I can walk you through this in more detail if you want, but I need to find a time slot for that
[12:28:40] Nikerabbit: thanks, I'll see how far I get. If I get stuck, I'll send you a calendar invite
[12:29:01] duesen: good luck :D
[12:30:57] duesen: https://github.com/wikimedia/translatewiki/blob/master/groups/MediaWiki/mwgitlab.yaml is a more minimal example: the relevant parts are the BASIC and FILES sections
[12:46:48] Nikerabbit: Unable to open CDB file "false/messagechanges.default.cdb.tmp.1870497391" for write
[12:46:59] I guess "false" is the value of some config variable I need to set somewhere :)
[12:48:16] duesen: right, set $wgTranslateCacheDirectory to somewhere writable
[12:48:28] it defaults to $wgCacheDirectory, but that also defaults to false I think
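Putting the pieces of this exchange together, a minimal sketch of the two files involved. The group id, labels, and paths are placeholders, and the exact YAML keys should be checked against the Group_configuration help page linked above.

A file-based message group definition, e.g. translate-groups/my-small-extension.yaml:

    BASIC:
      id: my-small-extension
      label: My Small Extension
      namespace: NS_MEDIAWIKI
      class: FileBasedMessageGroup

    FILES:
      class: JsonFFS
      sourcePattern: "%GROUPROOT%/my-small-extension/i18n/%CODE%.json"
      definitionFile: "%GROUPROOT%/my-small-extension/i18n/en.json"

And the matching LocalSettings.php snippet:

    // Translate needs a writable cache directory; it falls back to $wgCacheDirectory,
    // which itself defaults to false -- hence the "false/..." path in the error above.
    $wgCacheDirectory = "$IP/cache";
    $wgTranslateCacheDirectory = "$IP/cache";

    // Root that %GROUPROOT% expands to, and the group definition to load (placeholder paths).
    $wgTranslateGroupRoot = "$IP/translate-groups";
    $wgTranslateGroupFiles[] = "$IP/translate-groups/my-small-extension.yaml";

The group is then imported in batches with the message-change processing script from Translate's scripts/ directory mentioned above (with the translate-manage right) and finished off via Special:ManageMessageGroups.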
[13:38:16] This is annoying, I see that /etc/apache2/sites-available/mediawiki.conf is being read, but the Directory directive is being ignored
[13:39:58] Ahhh... The virtual directory in my mediawiki.conf isn't being respected because of default.conf
[13:40:04] (I suspect)
[14:01:27] Technical Advice IRC meeting starting in 60 minutes in channel #wikimedia-tech, hosts: @Lucas_WMDE & @mutante - all questions welcome, more info: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[14:51:24] Technical Advice IRC meeting starting in 10 minutes in channel #wikimedia-tech, hosts: @amir1 & @Lucas_WMDE - all questions welcome, more info: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[16:37:49] Nikerabbit: Petr updated the patch against Translate. Give it a try when you have time.
[16:38:03] This should help a lot. But maybe we are missing another place that needs batching...
[16:46:04] duesen: yes, I saw. I may test it before the TechCom meeting today
[16:47:01] Nikerabbit: off topic, but aren't you a Finn?
[16:55:47] are Finns not allowed to test things before TechCom meetings
[17:09:01] <_H_K_T_> !admin ever know hat I have BANana of GLOBE's size: https://meta.wikimedia.org/wiki/Special:Log?type=&user=WMFOffice&page=User%3AWikinger%40global&wpdate=&tagfilter=
[17:09:01] To recreate the admin user, run "php maintenance/createAndPromote.php" on the command line.
[17:09:23] <_H_K_T_> ever know hat I have BANana of GLOBE's size: https://meta.wikimedia.org/wiki/Special:Log?type=&user=WMFOffice&page=User%3AWikinger%40global&wpdate=&tagfilter=
[17:10:04] <_H_K_T_> mail me at gogigeorgereeves@gmail.com OR join groups.google.com/forum/#!forum/wikimediacomputer
[17:11:42] <_H_K_T_> Do You Know: HKT stands for Hansemann-Kennemann-Tiedemann, and there are instances of https://oldthing.de/Patriotika-AK-Heil-und-Sieg-unseren-Waffen-1915-0024787605 that have nothing to do with fascism?
[17:12:48] harmaahylje: yes of course
[17:12:53] <_H_K_T_> Catholic Saint,named Anna Katharina Emmerick says: The first tongue, the mother tongue, spoken by Adam, Shem, and Noah [protoindogermanic], was different, and it is now extant only in isolated dialects. Its first pure offshoots [indoiranian] are the Zend, the sacred tongue of India, and the language of the Bactrians. In those languages, words may
[17:12:53] <_H_K_T_> be found exactly similar to the Low German of my native place. The book that I see in modern Ctesiphon, on the Tigris, is written in that language. Protoindogermanic itself, (borrowing-free100%, all attempts to contaminate of protoindogermanic with outside borrowings are strictly 100% banned) is in Gerhard Kobler - Indogermanisches Wörterbuch and
[17:12:54] <_H_K_T_> in Julius Pokorny - Indogermanisches etymologisches Wörterbuch , it is REAL, except laryngeal theory samples discernible by leetspeak spellings. Protoindogermanic is REAL, unlike fake quenya sindarin valaric and maiaric.
[17:16:09] <_H_K_T_> and yet PO-lish PO-lice mistakes https://oldthing.de/Patriotika-AK-Heil-und-Sieg-unseren-Waffen-1915-0024787605 with nazism national socialism and names national communism, not ever bothering to look that it belongs to 1914-1918 period!
[17:16:48] <_H_K_T_> explanation: NAMU = nationalcommunism / NAZI = nationalsocialism
[17:18:46] <_H_K_T_> WMF forced down my throat BANana with length of 40 megameter - circumference of GLOBE: https://meta.wikimedia.org/wiki/Special:Log?type=&user=WMFOffice&page=User%3AWikinger%40global&wpdate=&tagfilter=
[17:19:16] <_H_K_T_> WIKINGER is not fascist, Wikinger is HAKATIST
[17:21:10] <_H_K_T_> If https://it.wikipedia.org/wiki/Klan_Ausran would be NOT pagan but CATHOLIC instead, it would be 100% along Bicmarckien reINDOGERMANization line!
[17:47:41] Nikerabbit: I'm hitting "Notice: By-passing message group cache for test-babel [Called from MessageGroupBase::initCollection in /var/www/mediawiki/extensions/Translate/messagegroups/MessageGroupBase.php at line 247]"
[17:47:44] is that expected?
[17:48:26] seems like it aborts the export
[18:04:55] duesen: it won't abort, it just means it has not been imported with scripts/processMessageChanges.php
[18:06:51] Nikerabbit: but that's how I imported it. And I don't see the output file... I'll poke around. Just thought you might know why this happens.
[18:10:06] Hi, I've got some problems with the ReplaceText extension and i18n strings. When I go to Special pages and check
[18:10:46] duesen: jobqueue drained?
[18:10:48] ...out the ReplaceText extension in the list, the page title is in this weird format: <replacetext>
[18:11:20] Nikerabbit: ah! that might be it.
[18:11:58] hm, nope
[18:12:02] doesn't help
[18:12:33] duesen: did you finish the thing by going to Special:ManageMessageGroups and clicking through?
[18:13:35] can PHP, during a web request, write to the directory you set as the cache dir?
[18:14:21] Nikerabbit: I guess... no to both. I'll fix that, sorry
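Two quick checks for the questions just raised, i.e. whether the job queue is drained and whether the web-server user can write to the cache directory. The paths and the www-data user name are assumptions; adjust them to the actual setup.

    # From the wiki root: how many jobs are queued, and run whatever is left
    php maintenance/showJobs.php
    php maintenance/runJobs.php

    # Can the web-server user write to the Translate cache directory?
    sudo -u www-data test -w /var/www/mediawiki/cache && echo writable || echo "not writable"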
[18:34:00] duesen: testing the latest patchset now... still seeing a considerable slowdown, but not CPU bound. Lots of IO or idle
[18:42:43] Nikerabbit: where from in Finland, if you don't mind me asking?
[18:46:20] harmaahylje: Helsinki
[18:48:44] Nikerabbit: lots of IO is bad, we don't want that :/
[18:49:03] Petr is investigating as well
[18:49:54] duesen: yeah, looks like the DB is getting more load
[18:50:28] Nikerabbit: a *little* more is expected, but it shouldn't be a lot more.
[18:50:38] duesen: but on a quick look we are now talking about a 6x increase and not 20x
[18:50:48] in real running time of the script, that is
[18:51:00] Better, but still not good
[18:51:17] I'd settle for 2x, though that's still worse than I hoped
[18:51:35] duesen: okay, scrap that, I just got 2.5x on the second run
[18:51:50] Nikerabbit: you are running this on translatewiki, right? Does that use ExternalStore?
[18:51:57] duesen: no, we don't
[18:52:18] Hi, and thanks for the awesome software. I'm trying to upgrade a 1.32.0 to 1.33.0 and I get the error "[a4d70e703ecdbf3c153334b6] [no req] Error from line 35 of /var/www/consumerium.org/develop/w/extensions/Wikibase/lib/WikibaseLib.entitytypes.php: Class 'Wikibase\DataModel\Entity\ItemId' not found". It is a wiki family of two wikis I am trying to update
[18:52:22] ok. 2.5x doesn't sound too terrible, though we should still try to optimize
[18:52:31] ES would have offered an easy opportunity for optimizations
[18:53:10] jubo2: I think Wikibase uses submodules for dependencies. Make sure you pull them in
[18:53:30] duesen: ok. thank you for the info.
[18:53:35] duesen: ES: easy? how big an impact?
[18:53:50] how do I pull them? I got the newest Wikibase with the extension distributor
[18:54:11] Nikerabbit: a lot, potentially. But *adding* ES won't help. The optimization would make ES less slow :)
[18:54:42] hmm
[18:54:53] so it's a dead end?
[18:54:57] Nikerabbit: with ES, we are still doing one DB query for each revision. That's not new, that has always been the case. But we could fix that
[18:55:07] ES is a dead end for your case, yes
[18:55:13] we'll need something else
[18:55:15] okay, understood
[18:55:24] right now, I'm hitting OOM when trying to run Translate jobs :D
[18:55:57] the default limit is still like 100M??
[18:56:01] duesen: I'm not very techy. How would I pull the submodules?
[18:57:29] real 4m15.187s
[18:57:29] user 2m36.656s
[18:57:37] that's the lowest I got out of three runs with the patch
[18:57:42] jubo2: git submodule update --init --recursive
[18:57:55] ...if it's the first time you are pulling submodules for that repo
[18:57:55] real 1m55.024s
[18:57:56] user 0m45.048s
[18:57:58] this is without
[18:59:01] meh :/
[19:04:44] duesen: I made query dumps with and without the patch, if that helps
[19:04:55] duesen: I did not install Wikibase with .git, but with the extension distributor
[19:05:14] ... so it does not find a .git directory in there
[19:06:13] duesen: I still see an (unnecessary?) Title::newFromIDs query per language
[19:13:54] For some reason constructing a Title object is slow, and it really adds up when doing that hundreds of thousands of times
[19:16:05] jubo2: it's possible that the extension distributor bundle just doesn't work for Wikibase. Not sure.
[19:16:53] Nikerabbit: Title::newFromIDs, with an "s" at the end, should be fine. Without the "s" it is a problem.
[19:18:45] Hm... RevisionRecord needs a Title object. Revision didn't. Maybe there is more potential for optimization there...
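To illustrate the "s" distinction just made, a short sketch (the $pageIds array is a placeholder): Title::newFromIDs() resolves a whole batch of page IDs with a single query against the page table, while calling Title::newFromID() per revision issues one query each, which is what adds up over hundreds of thousands of rows.

    // One SELECT for the whole batch -- the acceptable pattern.
    $titles = Title::newFromIDs( $pageIds );

    // One SELECT per ID -- the pattern that hurts when repeated per revision.
    foreach ( $pageIds as $id ) {
        $title = Title::newFromID( $id );
    }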
[19:19:01] is it really the object construction, or is it the query against the page table?
[19:19:05] duesen: Extension Distributor is failing to include composer dependencies, maybe that's also true for git submodules... https://phabricator.wikimedia.org/T227362
[19:19:48] Vulpix: for composer, that's intentional. One vendor directory per extension would lead to duplication and conflicts.
[19:19:51] duesen: it doesn't show up high in the profile, so it's just Title::makeTitle(Safe) that is slow
[19:19:55] but it's quite possible that it doesn't do submodules
[19:20:27] Nikerabbit: makeTitle should be really fast. makeTitleSafe is pretty slow.
[19:20:36] maybe we can drop the "Safe"?
[19:20:51] duesen: that's what my optimization patch does
[19:20:57] I see
[19:22:13] Yeah, Title::newFromIDs only takes less than 5% in the new profile
[19:23:45] Various Wikimedia\Assert\Assert things still show up high in the profile, but maybe that's just a profiling artefact rather than a real issue
[19:24:28] Nikerabbit: I just realized that the query that fetches the revisions also pulls in the data from page. We could optimize for that, using the page_xxx fields from the revision row if present, and call Title::newFromRow instead of using newFromIDs
[19:25:10] yeah, that sounds nice
[19:28:20] what are you guys up to? just curious what the task at hand is
[19:28:54] harmaahylje: https://gerrit.wikimedia.org/r/c/mediawiki/extensions/Translate/+/537483
[19:29:20] harmaahylje: also https://phabricator.wikimedia.org/T230100
[19:29:55] Nikerabbit: but if newFromIDs is only 5%, then this will only gain us 5%. I assume these 5% include the time that newFromIDs spends waiting for the DB.
[19:30:01] If it doesn't, the gain may be much larger
[19:30:40] duesen: I really don't know much about xdebug profiling output, other than that it lies
[19:30:46] harmaahylje: something that is trivial in theory, and really hard in practice. Abstraction is bad for performance ;)
[19:32:34] Ok... getting Wikibase with .git I get an error saying I have a version of Wikibase that requires >=1.34
[19:32:51] I guess I need to figure out how to get a version suitable for 1.33
[19:33:36] https://yosefk.com/blog/how-profilers-lie-the-cases-of-gprof-and-kcachegrind.html
[19:34:05] jubo2: you might find that in the REL1_33 branch of Wikibase, if you are lucky
[19:35:35] Nikerabbit: Yes, now trying it
[19:38:38] Hmmhh... now getting "Class 'Wikibase\DataModel\Entity\ItemId' not found", even though I did the 'git submodule update --init --recursive'
[19:42:20] jubo2: maybe that's from composer?
[19:44:49] Nikerabbit: I'm failing to increase the memory limit... can't it be done with ini_set in LocalSettings?
[19:46:03] Nikerabbit: I need composer? That error is from /var/www/consumerium.org/develop/w/extensions/Wikibase/lib/WikibaseLib.entitytypes.php
[19:47:19] oh, right. $wgMemoryLimit.
[19:47:57] jubo2: composer is used to pull in dependencies. IIRC Wikibase doesn't use it. But maybe it does now.
[19:47:57] https://www.mediawiki.org/wiki/Composer
[19:48:23] jubo2: the error definitely means you are missing a library. We are all just trying to find out why, and what the "correct" way of installing it would be.
[19:49:20] this library, to be precise: https://packagist.org/packages/wikibase/data-model
[19:49:34] but if you are missing that, you are probably missing several more
[19:52:40] Now running 'composer install --no-dev'
[19:55:04] jubo2: you may need to set up composer.local.json for this to work for extensions, see https://www.mediawiki.org/wiki/Composer#Using_composer-merge-plugin
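A sketch of such a composer.local.json in the wiki root, along the lines of the composer.local.json-sample shipped with core (the skins line is optional):

    {
        "extra": {
            "merge-plugin": {
                "include": [
                    "extensions/*/composer.json",
                    "skins/*/composer.json"
                ]
            }
        }
    }

With that in place, running 'composer update --no-dev' (or the 'composer install --no-dev' used above) from the wiki root should also pull in each extension's composer dependencies.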
[19:55:21] I have "extensions/*/composer.json" in there
[19:55:31] no idea why we don't just do that by default
[19:55:42] would be too easy, I guess
[19:56:17] duesen: I copied the composer.local.json from the old MediaWiki
[19:56:56] Now the installer starts to run, but the migration of ipblocks.ipb_by and ipblocks.ipb_by_text to ipblocks.ipb_by_actor fails. Full error message here: https://paste.debian.net/1102589/
[20:00:09] you'll have to ask anomie about that one...
[20:02:33] "Unknown column 'ipb_by_actor' in 'where clause'" ... could this be something to do with the MediaWiki being upgraded being one member of a two-wiki wiki family? I have previously upgraded this wiki family, I just needed to disable the shared tables for the duration of the upgrade
[20:04:38] jubo2: Seems like https://phabricator.wikimedia.org/T227662#5325853 at first glance to me, but shared tables might cause it somehow too.
[20:07:35] anomie: but I have disabled the shared tables for now. So is this something that requires waiting for 1.33.1, or what should I do?
[20:07:42] Yes, shared tables will be a problem here, because the updater will try to migrate a column that has already been dropped in a previous run from another wiki
[20:08:14] Vulpix: But I disabled the shared tables before trying to run the update
[20:08:18] I don't know if the patches/maintenance scripts will take those shared tables into account
[20:09:02] jubo2: Basically, make sure that all the table alterations from maintenance/archives/patch-actor-table.sql actually happened. Chances are you'll find that it somehow only did the first few in the file and not the rest.
[20:11:00] jubo2: Also probably the same for maintenance/archives/patch-comment-table.sql
[20:11:03] ugh, yeah, having a bunch of tables in a single script may not work well when some of them are shared and others aren't
[20:11:22] * anomie should finish reviewing https://gerrit.wikimedia.org/r/c/mediawiki/core/+/530374 ...
[20:11:58] anomie: how do I check if those .sql were executed or not?
[20:12:37] jubo2: You'd have to connect to the database and look at the table structures.
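A sketch of that check in the MySQL client. Table names assume the defaults (no prefix, no shared-table remapping); compare the output against what patch-actor-table.sql and patch-comment-table.sql are supposed to create.

    -- Did the actor migration land? The table and the new ipblocks column should exist.
    SHOW TABLES LIKE 'actor';
    SHOW COLUMNS FROM ipblocks LIKE 'ipb_by%';
    SHOW CREATE TABLE ipblocks;

    -- Same idea for the comment migration mentioned above.
    SHOW TABLES LIKE 'comment';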
[20:26:05] I tried migrating the main wiki first instead of the test wiki and that ran fine. Switching to upgrading the test wiki, the updater fails with the same error as 'php maintenance/migrateActors.php'
[20:26:30] Here is the error I get from 'php maintenance/migrateActors.php': https://paste.debian.net/1102593/
[20:27:16] I am guessing that some previous upgrade of the test wiki left it in an inconsistent state, and that is why the installer will not run but runs into a DB structure it does not expect
[20:43:41] I rolled back the changes now. So it would seem the main wiki is fine, but the (very minor) test wiki has gotten mangled in some update
[20:54:41] I need to figure out how to solve this: the main wiki of the wiki family updates, but the other one doesn't
[23:28:05] hi everyone!
[23:28:16] Hi Sophivorus
[23:28:20] hi!
[23:28:26] I'm developing a gadget on Wikipedia
[23:28:35] it's pretty widespread already
[23:28:38] in many languages
[23:28:48] and I could really use some way
[23:28:59] of getting the latest translations from translatewiki
[23:29:00] ProveIt?
[23:29:03] yes!
[23:29:31] I found a magic API call
[23:29:34] in translatewiki
[23:29:40] that gets me the latest messages
[23:29:40] https://translatewiki.net/w/api.php?action=query&list=messagecollection&mcgroup=proveit&mclanguage=es&formatversion=2
[23:29:44] but the problem is
[23:30:00] that Wikimedia has all that CSP
[23:30:14] that issues a warning if I query translatewiki
[23:30:43] and I'm finding no way to get the latest messages from Diffusion or somewhere within the Wikimedia umbrella
[23:31:17] it even seems like there's a witch hunt going on
[23:31:20] yeah, I'm not sure that API is really intended to be used for programs to get translations at runtime
[23:31:49] I bet not
[23:31:56] but the current solution is
[23:32:02] manually creating JSON pages at Commons
[23:32:07] and querying the Commons API
[23:32:18] I don't think that's any better
[23:32:41] I mean, you could theoretically write a script that pulls the translations from translatewiki and builds them into your app's source
[23:33:32] yes, I guess
[23:33:37] Sophivorus, why do you think there's a witch hunt going on?
[23:33:43] ah, yes
[23:33:44] https://phabricator.wikimedia.org/T172065
[23:33:46] there
[23:34:01] oh, that
[23:34:51] yeah, making external API calls from within the context of someone's Wikimedia production browser session may leak private info
[23:35:00] I'm interpreting that as active resistance against the type of solution I was thinking of (using the translatewiki API)
[23:35:09] .. true *sigh*
[23:35:30] the script you mentioned earlier
[23:35:47] like the user's IP, UA, page they were looking at (?)
[23:36:08] the script you mentioned would have to be written in PHP, right?
[23:36:08] TWN is not part of Wikimedia production and would be whitelisted in the content security policy
[23:36:23] no, you could write it in whatever language you want
[23:36:26] would be blacklisted, you mean?
[23:36:42] maybe even LOLCODE
[23:36:47] hah
[23:37:13] Sophivorus, sorry, would *not* be whitelisted is what I meant to say :)
[23:37:19] good catch
[23:37:21] do you have a link to get me started on writing the script?
[23:37:46] (I value your help, so I pay attention)
[23:38:07] I guess it depends on what bit you don't know how to do
[23:38:35] ultimately the problem to solve is to go through that TWN API you linked and dump it as JSON that you copy+paste into your gadget?
[23:38:44] not the nicest solution in the world, but I don't know if we have a better one right now
[23:38:46] yes
[23:39:08] since you're a gadget author and GEI you presumably have some programming experience
[23:39:15] indeed
[23:39:28] and you have located an API to get the data you want
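A rough sketch of such a sync script, in PHP since that is what came up above (any language would do). The language list, output file names, and the 'key'/'translation' field names are assumptions to verify against the actual messagecollection API response; the resulting JSON would then be copied into the gadget or pasted onto the wiki.

    <?php
    // Pull the latest ProveIt messages from translatewiki.net and dump them as JSON.
    $languages = [ 'en', 'es', 'fr' ]; // placeholder list
    foreach ( $languages as $lang ) {
        $url = 'https://translatewiki.net/w/api.php?action=query&list=messagecollection'
            . '&mcgroup=proveit&mclanguage=' . urlencode( $lang )
            . '&format=json&formatversion=2';
        $data = json_decode( file_get_contents( $url ), true );
        $messages = [];
        foreach ( $data['query']['messagecollection'] ?? [] as $row ) {
            // Untranslated entries may have no translation; skip them.
            if ( isset( $row['translation'] ) ) {
                $messages[ $row['key'] ] = $row['translation'];
            }
        }
        file_put_contents(
            "proveit-$lang.json",
            json_encode( $messages, JSON_PRETTY_PRINT | JSON_UNESCAPED_UNICODE )
        );
    }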
[23:39:33] I would have no trouble at all writing the script, my question is how do I get it to run periodically
[23:39:50] or should I run it manually every now and then?
[23:40:43] it's up to you
[23:40:47] I wouldn't want to set up a cron job on my own computer, since I want it to run from Wikimedia servers to guarantee continuity over time
[23:40:50] that's what I mean
[23:40:56] well
[23:41:25] I don't want to bother you; if you're not sure what I'm asking or what the answer is, I'll figure it out
[23:41:35] no, I understand what you mean
[23:41:56] I guess the main problem is this thing basically has to come from a restricted namespace, and so whatever is automatically updating the wiki's copy will need privileges?
[23:42:13] though
[23:42:19] hm... true
[23:42:27] maybe if it's simple JSON, no JS execution
[23:43:02] and your application doesn't trust the translations to not be malicious
[23:43:12] you could put it in a normal namespace and have a bot account edit it?
[23:43:51] I don't know how many gadgets I'd be comfortable dealing with that securely
[23:43:59] that could work...
[23:44:05] it's things like
[23:44:12] if you render a translation as HTML instead of plain text
[23:44:42] and it was open to editing by ordinary users
[23:45:00] ah, I understand what you mean
[23:45:02] it'd essentially be a security vulnerability affecting anyone using your gadget
[23:45:17] I think I can get editinterface rights for the bot, maybe
[23:45:25] heh
[23:45:26] good luck
[23:45:35] :-P
[23:45:35] admin bots were controversial enough IIRC
[23:45:47] interface admin bots...
[23:46:12] well, there definitely are some, namely the ones pointing out all the JavaScript and CSS style errors
[23:46:49] so yeah
[23:46:53] thanks for the help
[23:46:56] you could have a cron that runs under your own account. I don't know how comfortable I'd be with people storing their privileged account BotPasswords on Toolforge or somewhere; if you want to go down that route I'd recommend asking in -cloud
[23:47:10] tbh someone else probably already does this
[23:47:22] yeah, I should research that first...
[23:48:21] the downside of using bots or even my own account is that someday I may die
[23:48:25] hah
[23:48:43] the APIs would continue to work...
[23:48:56] your Wikimedia LDAP account would probably remain open too :P
[23:49:22] hah, yeah, but editinterface rights expire every year :-P
[23:49:29] (at least mine)
[23:49:45] thanks again krenair
[23:49:56] I'll meditate and research a bit more and decide
[23:50:07] best of luck
[23:50:14] thanks again, bye!
[23:50:19] bye
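For the "run it periodically" question above, a generic crontab sketch. The schedule, tool path, and script name are hypothetical, and Toolforge's own job infrastructure may want a different invocation than a plain crontab line.

    # Refresh the ProveIt translations once a day at 03:17 UTC
    17 3 * * * php /data/project/mytool/sync-proveit-translations.php >> /data/project/mytool/sync.log 2>&1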