[02:58:47] is there a better way to respect linebreaks in wikitext other than using <br>?
[03:04:03] 	 ningu: <pre> also
[03:04:24] 	 legoktm: that turns off wikitext though right?
[03:04:29] 	 But poem is often used for that case
[03:04:30] 	 Yes
[03:04:35] 	 ok
[03:04:39] 	 yeah so depends what the goal is
[10:23:20] 	 Is there any way to use $wgHooks['SidebarBeforeOutput'] ... or something similar... to put some arbitrary html at the top of the sidebar rather than at the bottom?
[10:23:48] 	 I get the same problem with SkinBuildSidebar too
[10:59:38] 	 nevermind - figured it out :D
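(The asker didn't share their fix, but here is a minimal sketch of one way to do it, assuming the SidebarBeforeOutput hook signature from MW 1.35+; the 'myfirstsection' key and link values are placeholders:)

    // LocalSettings.php — sketch only; section key and link values are made up
    $wgHooks['SidebarBeforeOutput'][] = function ( $skin, &$sidebar ) {
        $top = [
            'myfirstsection' => [
                [ 'text' => 'My link', 'href' => '/wiki/Some_Page', 'id' => 'n-my-link' ],
            ],
        ];
        // Array union with the new section first prepends it, so it renders
        // at the top of the sidebar instead of the bottom
        $sidebar = $top + $sidebar;
    };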
[13:26:05] 	 i suppose it's not possible to force (even temporarily) a slash in a file upload name? wgIllegalFileChars and wgLegalTitleChars title chars tweaks aren't helping
[13:33:42] 	 Forward or backward? I'd see it as very problematic. Forward slashes can be interpreted as a directory separator (and private info may be exposed if this can be used to traverse directories, with file names like ../../etc/passwd), and backward slashes can cause trouble in escape sequences
[13:36:44] 	 forward /
[13:38:17] 	 i don't see mediawiki actually allowing that in this scenario, since trying to manually traverse just redirects you to the wiki main page
[13:47:16] 	 If you happen to name a file "File:A/../B.png" it will probably end up in an unexpected location on the filesystem, and I assume the webserver will not map it correctly
[17:00:03] 	 Hi
[17:00:45] 	 ?
[17:43:08] 	 Amir1: addshore: the sqlblobstore cache vs the Wikibase one, in a nutshell: do you think they're more or less duplicates of the same thing, or is there a conceptual difference in the format (e.g. parsed vs unparsed) or backend (memc vs elsewhere)?
[17:43:20] 	 context being T255305
[17:43:21] 	 T255305: [Investigation] - Consider if the Wikibase cache for CachingEntityRevisionLookup is needed any more - https://phabricator.wikimedia.org/T255305
[17:44:07] 	 Krinkle: I think they are the same, the only conceptual difference is that sqlblobstore keeps the rev id while the wikibase one keeps the page title
[17:44:20] 	 which means dropping it would shift some more queries to s8
[17:44:24] 	 So tl;dr: this cache has existed in Wikibase for years, then the sqlblobstore cache came around, and now we need to delete ours
[17:44:56] 	 Amir1: hmm, where does that extra lookup happen? When I looked before I didn't spot that
[17:44:59] 	 another point is that sqlblobstore compresses them, which is great for network and memcached memory, but I'm slightly worried this might make the appservers' cpu go kaboom
[17:45:22] 	 WikipageMetadataAccessor or something
[17:45:45] 	 but it has an LRU cache (which we could turn into an APCu cache if it goes bad)
[17:46:06] 	 ok, good to keep an eye on. let serviceops know about that as well, as an FYI to look out for improved/reduced memc saturation but possible cpu load, given wikibase will no longer be bypassing the compress/decompress that we do elsewhere for revision text loads.
[17:46:49] 	 Effie is in the loop
[17:47:32] 	 first I'm deploying on repos, that's small enough
[17:47:40] 	 Amir1: I see! I didn't notice that when I started looking at it in https://gerrit.wikimedia.org/r/c/mediawiki/extensions/Wikibase/+/605333 but I didn't finish looking, really...
[17:48:00] 	 also the ttl of sqlblobstore is 7 days but ours is one hour
[17:49:09] 	 Amir1: got a pointer for where in the general entity access process WikipageMetadataAccessor comes into it?
[17:50:01] 	 addshore: PrefetchingWikiPageEntityMetaDataAccessor 
[17:50:33] 	 ::loadRevisionInformation
[17:50:59] 	 I'll also run xhgui to make sure nothing horrible happens
[17:52:26] 	 Amir1: did you look at my patch? I don't remember what direction it was going in, but use of WikipageMetadataAccessor was removed from the "critical path" in https://gerrit.wikimedia.org/r/c/mediawiki/extensions/Wikibase/+/605333/4/data-access/src/SingleEntitySourceServices.php
[17:52:54] 	 when exploring this topic
[17:53:21] 	 hmm, I missed it
[17:53:57] 	 but I think it's the right path; maybe we should add a cache for it somewhere, but in general I think it's the right path
[17:54:03] 	 Yeah, I think removing that bit ends up being more work, but then it ends up with less cache usage and fewer db calls. I don't remember either, as I haven't looked in a while
[17:54:46] 	 I ended up writing something to use RevisionStore fully though, rather than just remove the cache https://gerrit.wikimedia.org/r/c/mediawiki/extensions/Wikibase/+/605333/4/lib/includes/Store/Sql/RevisionStoreEntityRevisionLookup.php
[17:55:11] 	 but that's why we want the investigation first to figure out which path makes the most sense :)
[17:57:19] 	 This might not be related, but one of the things we added in core years ago was a cache for title->revID for the common case where you want to transclude the current/latest revision (such as templates and Lua module source code). This is part of Revision/getKnownCurrentRevision, which optimises for this case specifically and gets updated proactively on-edit by MW.
[18:01:12] 	 Parser and Lua use this method
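(A minimal sketch of calling that path; the current entry point is RevisionLookup::getKnownCurrentRevision, and the page name here is just an example:)

    use MediaWiki\MediaWikiServices;

    // Goes through the proactively-updated title->revID cache for the
    // common "give me the current revision" case
    $title = Title::newFromText( 'Module:Example' ); // example page
    $rev = MediaWikiServices::getInstance()
        ->getRevisionLookup()
        ->getKnownCurrentRevision( $title );
    if ( $rev ) {
        $revId = $rev->getId();
    }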
[18:01:15] * Krinkle meeting
[18:01:23] * addshore also in a call now
[18:03:39] 	 before and after: https://performance.wikimedia.org/xhgui/run/view?id=5f342c3cbb8544a80bdb2b91 and https://performance.wikimedia.org/xhgui/run/view?id=5f342e54bb8544b80b969e7e
[18:03:51] 	 the after one is the second request, so that the cache is warm
[18:07:16] 	 Krinkle: that sounds awesome, I will try to see if/when we can use it
[18:07:56] 	 Amir1: are they both after a warmup?
[18:09:32] 	 actually the first one is not
[18:09:43] 	 let me do it on the other mwdebug node
[18:09:50] 	 ah, it's synced
[18:10:07] 	 I'll do it on clients instead
[18:57:12] 	 1.34, a few extensions, trying to add a custom group. It shows on Special:UserRights, but toggling it there doesn't cause it to save (page reloads, looks fine, but setting doesn't stick and there's no change history saved). Any debugging suggestions?
[18:59:00] 	 Cancel that - missed "The group name cannot contain spaces". :|
[18:59:53] 	 Zebranky_: the database name can't contain spaces, but you can customize the display name to have them
[19:05:39] 	 Zebranky_: as Majavah said. The way you do that is e.g. define the group in your LocalSettings to 'somegroup' and then edit the page [[MediaWiki:Group-somegroup]] on-wiki to set the group's display name
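(Roughly like this, with a placeholder group name and right:)

    // LocalSettings.php — internal group name, no spaces
    $wgGroupPermissions['somegroup']['editinterface'] = true;
    // For a display name with spaces, edit [[MediaWiki:Group-somegroup]]
    // on-wiki (e.g. to "Some Group"); MediaWiki:Group-somegroup-member
    // sets the member label shown on Special:UserRights.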
[19:21:10] 	 Hello. Will categories automatically be created if I add category:foo in a page? 
[19:32:02] 	 Depends what you mean by automatically created
[19:32:10] 	 You don't have to create a description page for a category, no
[19:38:19] 	 bd808: Looking at https://phabricator.wikimedia.org/T225097#6272173 again - so BounceHandler tells Exim to invoke this curl script, which always goes to BounceHandler on metawiki, where it then forges a job for the right wiki and queues it. Short of switching the job queue, I suppose a quick fix could be to disable the extension there. Are you aware of it doing anything useful right now on wikitech?
[19:39:56] 	 Krinkle: well, it was handling bounces until the job queue was redesigned without considering wikitech. Now it is probably just generating these errors
[19:40:37] 	 I just want to get wikitech moved into the main hosting pool to make all this crap go away
[19:40:59] 	 bd808: wikitech has always had its own jobqueue handler, right? afaik in the old queue this wouldn't have worked either, since the redis jobchron.php handler couldn't execute those wikitech jobs
[19:41:23] 	 Exim has always been configured to curl/POST to Meta-Wiki.
[19:41:33] 	 (I think?)
[19:41:37] 	 bouncehandler used to work. I don't have graphs to show when it stopped, but it did work in the past
[19:41:44] 	 okay, interesting.
[19:42:07] 	 anyway, yeah, agreed on the main cluster thing. at least the job queue seems like an inessential thing for wikitech's emergency clause to not special-case.
[19:42:16] 	 and overall for that clause, I think wikitech-static suffices anyway
[19:42:55] 	 yeah, the disaster recovery use case should be supported by wikitech-static
[20:09:03] 	 Reedy: thanks. No I didn't intend to create description pages for each category, but am hoping that I can use something to find a list of all categories, like a tag cloud etc 
[23:13:03] 	 is it fair to say that there is no particularly simple way, in mediawiki core, to take a string of wikitext and locate any "|" characters that are used literally rather than to separate arguments in templates/parser functions?
[23:13:14] 	 or in tables
[23:14:19] 	 you could do $wgParser->parse() and search for the | in the resulting html
[23:14:25] 	 huh
[23:14:38] 	 but that's unlikely to be helpful for whatever you want
[23:14:41] 	 I suppose so, but then I wouldn't be able to locate it in the corresponding wikitext :)
[23:14:48] 	 right
[23:14:56] 	 what are you *really* trying to do?
[23:14:59] 	 I think people will just need to manually do {{!}} when they want |
[23:15:03] 	 don't give us an XY question
[23:15:30] 	 Platonides: so, I have pages in a namespace that will be rendered on parse using a template defined under MediaWiki:
[23:15:51] 	 it will look like it's just plain page content but that will be passed to the template, so if people use | there, it will screw stuff up
[23:16:19] 	 the template itself can't really be avoided. it's more important than the | issue
[23:16:29] 	 I'm just wondering if there's a way around it since it may confuse people
[23:16:35] 	 it = the | issue
[23:16:43] 	 how are you rendering it?
[23:16:50] 	 instead of copying the contents of that page
[23:16:55] 	 use a reference to that
[23:17:02] 	 hrm
[23:17:12] 	 so if you have Ningu:foobar containing "I like | bars"
[23:17:16] 	 that won't create an infinite loop?
[23:17:22] 	 this is in PageContent.php for the Page: namespace
[23:17:26] 	 in getParserOutput()
[23:17:46] 	 instead of replacing the text with {{template|1=I like | bars}}
[23:18:02] 	 you could use {{template|1={{Ningu:foobar}}}}
[23:18:18] 	 hmm
[23:18:27] 	 so you are wrapping the output of the page?
[23:18:31] 	 specifically, it's for the body portion of the Page: page
[23:18:33] 	 yes
[23:18:41] 	 well, I'm allowing users to do that by defining that template
[23:18:48] 	 I think you *may* be able to do this
[23:18:49] 	 I'm modifying ProofreadPage to do that
[23:18:53] 	 depending on how you hook
[23:19:25] 	 as the transclusion of the page will go fetch the wikitext
[23:19:37] 	 it's not transclusion though
[23:19:38] 	 the <noinclude>, <includeonly> etc. would not work on such pages, though
[23:19:52] 	 "fake transclusion" if you want :P
[23:20:26] 	 https://github.com/wikimedia/mediawiki-extensions-ProofreadPage/blob/master/includes/Page/PageContent.php#L278
[23:20:35] 	 right there, I'm wrapping $this->body->getText() 
[23:20:39] 	 if the template exists
[23:21:34] 	 hmm
[23:21:51] 	 you could probably create a preprocessor
[23:22:07] 	 that might allow you to do what you want without breaking the |
[23:22:52] 	 will the template need to view the contents of the parameter?
[23:23:22] 	 a simple solution would be to replace the $this->body->getText() with "{{$templatename|1=$marker}}"
[23:23:27] 	 it needs to be able to pass the contents on to something else
[23:23:36] 	 but not look at them
[23:23:44] 	 and then replace the $marker in the processed output with $this->body->getText()
[23:23:53] 	 huh, ok
[23:24:08] 	 this approach is used in the parser and multiple extensions
[23:25:02] 	 any chance you can show a code example?
[23:25:20] 	 I mean, an existing one would be fine
[23:26:35] 	 I am looking at Cite
[23:26:40] 	 as I expected it would do that
[23:26:46] 	 but is perhaps not too clear
[23:26:53] 	 mostly I just want to see the APIs
[23:26:57] 	 to generate the marker and replace it
[23:27:08] 	 seeing it in a real example would be easier
[23:27:50] 	 see Parser::insertStripItem()
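(For reference, a sketch of how that marker approach could look in PageContent::getParserOutput(), assuming a Parser instance is in scope; $parser and $templateName are illustrative names, not ProofreadPage's actual ones:)

    // Store the raw body in the parser's strip state; insertStripItem()
    // returns a unique marker string that survives parsing untouched.
    $marker = $parser->insertStripItem( $this->body->getText() );
    // Pass the marker, not the raw body, to the wrapping template, so a
    // literal "|" in the body can't be mistaken for a parameter separator.
    $wikitext = '{{' . $templateName . '|1=' . $marker . '}}';
    // After parsing, the parser's unstrip pass replaces the marker with
    // the stored text verbatim.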
[23:29:20] 	 ok, so that seems to make a new one
[23:29:26] 	 it "inserts" it in the state
[23:29:43] 	 and then I use the result in the wikitext?
[23:29:49] 	 yes
[23:30:04] 	 you want to use it outside the parser
[23:30:12] 	 oh, except right now it's calling $wikitextContent->getParserOutput
[23:30:16] 	 so perhaps it will be a bit different / break somewhere
[23:30:27] 	 I don't have a Parser object. well I probably do somewhere
[23:32:09] 	 well there's MediaWikiServices::getInstance()->getParser()
[23:32:12] 	 I dunno