[10:03:52] Nikerabbit: what might be the reason that Translate fails to generate the /en page? Can the generating be forced? This is missing: https://wiki.documentfoundation.org/Feature_Comparison:_LibreOffice_-_Microsoft_Office/en
[11:44:38] buovjaga: usually it means that the job queue is stuck; I don't see it move from 15 https://wiki.documentfoundation.org/api.php?action=query&meta=siteinfo&siprop=statistics
[11:47:02] Nemo_bis: I have not seen it lower than 3 as I've observed it yesterday and today... I asked earlier today that our infra guy look into it
[11:47:55] ok
[11:48:04] probably just a broken cronjob somewhere
[11:48:15] I will keep you updated :)
[12:11:41] buovjaga: there is a script that can be run to force them
[12:12:30] Nikerabbit: ok, infra guy is looking into the problem right now "it complains that the job param can't be serialized and gives up"
[12:14:37] buovjaga: exact logs / traces are helpful
[12:14:39] it might be due to the VM experiencing rough space weather just at the time I marked the page for translation... "the blob seems truncated"
[12:16:36] buovjaga: I have a faint memory that has happened in the past when the job_params table was smaller
[12:16:56] I can't remember when exactly it was made larger
[12:19:48] We are running MW 1.31.4
[12:29:04] "Change job table params from blob to mediumblob" c3c22e9ae8b21db6a708c1ebb4ddd984c6018f94 is from April
[12:31:00] no mention of Translate at https://phabricator.wikimedia.org/T124196 though
[12:31:23] yep, our guy said "ah it's rdbms truncating this, job's column type is BLOB, should be larger (MEDIUMBLOB or LONGBLOB) to store longer stuff"
[12:32:28] there's a SQL patch in maintenance/archives/patch-job-params-mediumblob.sql
[12:32:42] pretty trivial :)
[12:34:34] thanks
[12:35:06] Nemo_bis: it will have no problematic effect on next upgrade (to the next LTS)?
[12:44:25] buovjaga: I've not checked the update.php code but AFAIK the worst that can happen is that it will run again the same patch patch-job-params-mediumblob.sql when upgrading to 1.33 or higher and have no effect
[12:44:43] buovjaga: possibly, but in theory the updater should skip schema changes that are already applied
[12:44:48] hi duesen_
[12:44:52] cheers
[12:44:54] btw the job table is entirely disposable, you can TRUNCATE it with no information loss AFAIK
[12:45:04] (my message was redundant, eh)
[12:45:36] Nemo_bis: why would you TRUNCATE it though
[12:45:52] because you hate up to date links tales
[12:45:54] tables
[12:46:28] I wouldn't if it has 5 lines :) ; just saying that not much can go wrong, I think
[12:46:52] Nemo_bis: update.php failing midway can go wrong....
[13:20:25] Nikerabbit: hey! Looks like the patch disabling caching for blob batches should go on swat, or at least on the next train. Would you give a +1? https://gerrit.wikimedia.org/r/c/mediawiki/core/+/542328
[13:20:31] all the american folks seem to be out
[13:21:47] uh, a +2 even. it already has a +1
[13:28:43] duesen_: yes, though, as I wrote on the bug, I am worried that it is not the (sole) cause
[13:29:20] Nikerabbit: I'm also worried about that, but it does seem like it's definitly *one* case.
[13:29:40] and it's really the only thing that's new...
[13:30:22] Hi, I've added $wgDebugDumpSql = true; but I don't see any log anywhere (e.g. in the page or in /var/log/httpd/*). Any idea where I should look or what I may have missed?
[13:31:13] or maybe there's some cache I should flush somewhere?
[13:31:19] Sarayan: check page source html. Or enable $wgDebugToolbar as well
[13:32:56] I see nothing in the source, let's try the toolbar
[13:33:26] ahh, debug log, nice
[13:37:41] oh great, retrieving a page is a double join
[13:38:18] errr, quintuple
[13:43:07] duesen_: postgres doesn't like the binary tests it seems: https://integration.wikimedia.org/ci/job/mediawiki-quibble-vendor-postgres-php72-docker/2939/console
[13:44:05] ok, I see my problem
[13:44:18] my revision_comment_temp table is empty, is that fixable?
[13:44:58] Nikerabbit: btw. that sounds like the postgres driver is just broken. that would cause data loss in production :(
[13:46:28] maintenance/migrateComments.php says comments already migrated, great
[13:48:11] duesen_: fortunately almost nobody is using it
[13:51:03] *sigh*
[13:54:03] Nikerabbit: i took out the \0
[13:58:47] YES! Fixed
[13:59:11] gross hack but I don't care
[14:01:03] okay
[14:01:33] let's see what jenkins thinks
[15:33:26] i need to submit my article but it gets rejected
[15:56:22] duesen_: thanks a lot for the help
[17:40:02] Yay, /en lives!! https://wiki.documentfoundation.org/Feature_Comparison:_LibreOffice_-_Microsoft_Office/en
[17:46:17] Well, the problem with raw tags being rendered remains: https://wiki.documentfoundation.org/Feature_Comparison:_LibreOffice_-_Microsoft_Office ie. Yes
[17:48:50] Any tips would be great. If this could be used to solve it https://www.mediawiki.org/wiki/Template:Translatable_template can someone please explain in concrete terms what needs to be done?
[17:49:48] that linked template is just so you can automatically transclude translated versions of templates
[17:50:06] e.g. {{tnt|foo}} will be {{foo/en}} on en, {{foo/de}} on de, etc.
[17:50:33] (well I think en is assumed the default actually so no subpage there)
[17:51:24] yeah, so it doesn't solve the problem of how to separate a string from the pile of markup in order to not annoy translators as in: https://wiki.documentfoundation.org/WikiAction/edit/Template:Yes
[17:51:43] it does not, sadly
[17:51:50] there's no way to get around that I think
[17:52:37] although I think that template you linked may fix the page itself? Maybe? not 100% sure
[17:52:38] Alrighty, well thanks for confirming :) Personally I don't think it's a deal breaker for such simple one-word templates
[17:52:59] (as in it'll prevent the tags from showing in the main page you're using {{yes}} in)
[17:53:28] I'm not super well-versed in the translate ext so I can't really say for sure
[17:53:51] hmm, ok I hope some satguru-level person can answer then
[17:57:03] buovjaga: https://phabricator.wikimedia.org/T47096 maybe (so that you don't need a bunch of lua modules to do the same), though the bug description is very bad
[17:59:24] Thanks. Ok, I will edit the yes/no/partial templates so translate tags are inside the noincludes and add a comment explaining it and pointing to the rfe
[18:34:08] In the LibreOffice conference last month, we held a presentation/discussion session about Translate with our wiki guy Dennis. Then last week I wrote down instructions for our wiki editors: https://wiki.documentfoundation.org/TDF_Wiki/Multilingual
[23:00:39] hello, im trying to take in an extract of a page in c# but im having trouble interpreting the json data thats returned
[23:02:38] why?
[23:06:01] im trying to get just the extract but when im setting up the classes they dont come out normal i guess?
[23:06:32] Normal?
[23:08:51] i dont really know how to setup the c# classes to take in the json and i tried to use an online c# json to class site but it came out with classes with names like "__invalid_type__2916579"
[23:09:33] I imagine you don't need it generating classes to represent it
[23:10:01] Try using https://www.newtonsoft.com/json
[23:10:12] It's the goto library for c# for .NET
[23:10:32] uh, json for .NET
[23:10:52] does it work with unity?
[23:10:59] Should do
[23:11:51] Maybe not from a quick google
[23:12:16] https://assetstore.unity.com/packages/tools/input-management/json-net-for-unity-11347
[23:14:17] But doing the classes yourself for simpler stuff...
[23:14:29] JavaScriptSerializer jss = new JavaScriptSerializer();
[23:14:29] var updaterData = jss.Deserialize(json);
[23:17:22] is there a way to return the json data in a simpler format?
[23:17:46] What query are you doing?
[23:18:08] https://en.wikipedia.org/w/api.php?format=json&action=query&prop=extracts&redirects=true&explaintext&titles=50_Cancri
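(For reference on the "simpler format" question above: the action API also accepts formatversion=2, which turns query.pages into a plain JSON array instead of an object keyed by page ID. A minimal, untested C# sketch of that variant using the Newtonsoft Json.NET library recommended earlier; the class name and URL handling are illustrative, not taken from the conversation.)

```csharp
// Minimal, untested sketch: the same extracts query with formatversion=2 added,
// so query.pages comes back as a JSON array rather than an object keyed by page ID.
// Assumes the Newtonsoft.Json (Json.NET) package is referenced.
using System;
using System.Net;
using Newtonsoft.Json.Linq;

class ExtractExample
{
    static void Main()
    {
        var url = "https://en.wikipedia.org/w/api.php?format=json&formatversion=2"
                + "&action=query&prop=extracts&redirects=true&explaintext&titles=50_Cancri";

        string json;
        using (var client = new WebClient())
        {
            json = client.DownloadString(url);
        }

        // With formatversion=2 each page is just an element of the array,
        // so no class needs a property named after the numeric page ID.
        foreach (var page in JObject.Parse(json)["query"]["pages"])
        {
            Console.WriteLine(page["title"] + ": " + page["extract"]);
        }
    }
}
```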
[23:22:30] * Reedy tests something in linqpad
[23:22:59] Hmm. No that example also uses json.net
[23:25:13] In theory, you could just use a regex :P
[23:26:37] aaaaaaaa
[23:26:48] thats probably what im gonna have to end up doing
[23:27:49] https://assetstore.unity.com/packages/tools/input-management/json-net-converters-simple-compatible-solution-58621
[23:30:00] This was something I did a while ago for a case where I couldn't use json.net https://p.defau.lt/?LQhHj3_5A748tCjDk0zeJA
[23:30:26] in theory, a simple set of nested objects you could get it to play ball
[23:31:30] Which is I guess whatever your online thing was trying to do
[23:32:37] im trying to see what i can do about the pageid because it changes with every page so i cant turn it into a class
[23:32:48] Just use a string or an int?
[23:33:48] Dictionary
[23:36:47] Have you tried either of those assetstore links?
[23:38:41] im trying the first one
[23:42:28] the name of the object that returns the pageid title and extract data changes between each query which i m not sure how to handle
[23:48:09] even if i use json.net and call the objects dynamically i think i still would need to know the pageid beforehand to get that object specifically but im not sure
[23:52:31] thanks for your help Reedy im gonna keep digging at it
[23:53:11] Just iterate all over all in the result?
[23:53:46] It should be able to present it in something with an indexed keyvaluepair (usually a dictionary in c#)
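(A rough illustration of the dictionary approach Reedy suggests in the last messages: with the default response shape, query.pages is an object whose property names are the page IDs, so deserializing it into a Dictionary keyed by that string avoids having to know the ID up front. Json.NET is assumed; the PageExtract class and the sample JSON are made up to match the response shape, not copied from the API.)

```csharp
// Rough sketch of the dictionary / key-value-pair approach for the default response,
// where query.pages looks like { "<pageid>": { "pageid": ..., "title": ..., "extract": ... } }.
// Assumes the Newtonsoft.Json package; PageExtract and the sample JSON are illustrative only.
using System;
using System.Collections.Generic;
using Newtonsoft.Json.Linq;

class PageExtract
{
    public long pageid { get; set; }
    public string title { get; set; }
    public string extract { get; set; }
}

class DictionaryExample
{
    static void Main()
    {
        // Fabricated sample with the same shape as the extracts response (fake page ID).
        string json = @"{""query"":{""pages"":{
            ""12345"": {""pageid"": 12345, ""ns"": 0, ""title"": ""50 Cancri"",
                        ""extract"": ""50 Cancri is a star in the constellation Cancer.""}}}}";

        // Deserialize query.pages into a dictionary keyed by the page-ID property name,
        // so the varying "12345"-style key needs no dedicated class member.
        var pages = JObject.Parse(json)["query"]["pages"]
            .ToObject<Dictionary<string, PageExtract>>();

        // Iterate every page in the result, whatever its ID happens to be.
        foreach (var entry in pages)
        {
            Console.WriteLine(entry.Key + " -> " + entry.Value.title + ": " + entry.Value.extract);
        }
    }
}
```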