
Wikimedia IRC logs browser - #wikimedia-tech


2018-01-31 01:41:38 <Ivy> Urbanecm: That's not what emergency means.
2018-01-31 01:43:34 <Ivy> And it's not really appropriate to spam multiple channels.
2018-01-31 01:56:11 <Vermont> is anyone here good with wikitext willing to help me make a signature? I'm bad at trying to make one
2018-01-31 02:00:48 <Ivy> Vermont: The default user signature is fine.
2018-01-31 02:01:17 <Vermont> k
2018-01-31 02:01:36 <Vermont> Ivy: Sorry to bother ya
2018-01-31 07:12:07 <Wm> Wm
2018-01-31 15:58:59 <addshore> =o
2018-01-31 16:02:22 <CFisch_remote> \o/
2018-01-31 16:02:32 <addshore> \o
2018-01-31 16:02:36 <addshore> It's that time again!
2018-01-31 16:02:46 <CFisch_WMDE> It's that time again indeed!
2018-01-31 16:03:32 <CFisch_WMDE> Technical Advice IRC meeting!
2018-01-31 16:03:35 <Hauskatze> which time?
2018-01-31 16:03:36 <Hauskatze> ah
2018-01-31 16:03:40 <Hauskatze> good good
2018-01-31 16:03:49 <Hauskatze> oh hi mainframe98 -- nice to see you
2018-01-31 16:04:01 <d3r1ck> Hey everyone! Glad to see you all around :)
2018-01-31 16:04:04 <mainframe98> Hi! Glad I can finally attend one of these meetings
2018-01-31 16:04:17 <CFisch_WMDE> Hi Hauskatze, hi d3r1ck, hi mainframe98 !
2018-01-31 16:04:35 <Hauskatze> mainframe98: I cannot believe the fix for https://gerrit.wikimedia.org/r/#/c/406988/ was that easy
2018-01-31 16:05:26 <d3r1ck> Sometimes I'm not online but want to access MW docs offline, is that possible? I'm aware that there is https://doc.wikimedia.org/. Is it possible to also have that bundled with MW offline?
2018-01-31 16:05:29 <mainframe98> This is why using a static code analysis tool is important. Unfortunately, Phan doesn't run with the Jenkins tests, or it would've caught it.
2018-01-31 16:06:03 <Hauskatze> I think I remember that they wanted to use phan at some point?
2018-01-31 16:06:20 <Hauskatze> I can't remember
2018-01-31 16:06:21 <d3r1ck> addshore: A little help here please? https://gerrit.wikimedia.org/r/#/c/406985/, https://gerrit.wikimedia.org/r/#/c/404400/
2018-01-31 16:06:38 <d3r1ck> CFisch_WMDE: any ideas if one can get offline docs?
2018-01-31 16:07:02 <mainframe98> d3r1ck: you could generate your own copy of the documentation with doxygen. Additionally, there is quite some documentation available in the docs/ folder.
2018-01-31 16:07:12 <CFisch_WMDE> Hauskatze: you can use phan on the CI already
2018-01-31 16:07:27 <CFisch_WMDE> d3r1ck: Not that I know of
2018-01-31 16:07:28 <Hauskatze> addshore / CFisch_WMDE -- not sure if this is the right time/way, but maybe you could review https://gerrit.wikimedia.org/r/#/c/406988/ and the associated task and see if the proposed patch fixes the issue?
2018-01-31 16:07:50 <d3r1ck> mainframe98: Yeah, thanks. I looked at the docs/ folder too but I was thinking of a way that one can access the docs via a web browser as if it's online
2018-01-31 16:08:12 <jubo2> Is it just me or is the enwiki Watchlist maybe slightly in a broken state?
2018-01-31 16:08:23 <jubo2> https://en.wikipedia.org/w/index.php?title=Special%3AWatchlist&days=180&hidecategorization=1&hideWikibase=1&namespace=&action=submit gives zero hits
2018-01-31 16:08:38 <d3r1ck> mainframe98: You get the point I'm trying to pass across? So I can look at docs/ which is great but can I access it offline via the web browser?
2018-01-31 16:08:55 <addshore> jubo2: I see things in my watchlist at that url
2018-01-31 16:09:42 <jubo2> addshore: hmm.. I cannot believe no-one edited any of the 429 pages I watch in the last half year
2018-01-31 16:09:56 <addshore> Hauskatze: I had added myself as a reviewer for that patch, although legoktm knows more about MassMessage things so it may be best to wait for him to take a look!
2018-01-31 16:10:14 <Hauskatze> addshore: ack, agree - thanks!
2018-01-31 16:10:26 <mainframe98> d3r1ck: I'm pretty sure that you could use doxygen to manually generate the docs that you see on docs.wikimedia.org. That way you have a copy of docs.wikimedia.org locally.
2018-01-31 16:10:28 <Hauskatze> addshore: fwiw I think https://gerrit.wikimedia.org/r/#/c/406824/ is +2able
2018-01-31 16:10:32 <addshore> jubo2: 30 days is the limit I'm pretty sure
2018-01-31 16:10:50 <addshore> Hauskatze: done ;)
2018-01-31 16:11:06 <CFisch_WMDE> :-D
2018-01-31 16:11:12 <jubo2> hmm.. I didn't know of a limit.. it does say 720 hours in the UI
2018-01-31 16:11:15 <Hauskatze> :D x2
2018-01-31 16:11:17 <d3r1ck> mainframe98: Yeah, thanks so much. But please I'm also asking if it's possible to get the docs (after generating) in the web browser. Is it accessible that way?
2018-01-31 16:11:44 <jubo2> Mediawiki changes and gets more features and naturally some special arrangements for WMF wikis' needs
2018-01-31 16:12:51 <addshore> jubo2: what sorts of pages do you have in your watchlist, something should probably be showing up
2018-01-31 16:12:52 <jubo2> So Watchlist is fine
2018-01-31 16:12:53 <mainframe98> d3r1ck: it should? At least JavaDoc would generate documentation that you can open in your web browser. I'll see if I can generate it
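Picking up mainframe98's suggestion, the workflow might look like the sketch below; MediaWiki core ships a doxygen wrapper script, though the checkout path and generated-output location here are assumptions:

```shell
# Sketch: build a local, browsable copy of the doxygen docs from a
# MediaWiki core checkout (assumes doxygen is installed; the checkout
# path is only an example).
cd ~/src/mediawiki
php maintenance/mwdocgen.php --output docs/html

# The generated HTML can then be opened directly in a web browser:
xdg-open docs/html/html/index.html   # exact subdirectory may differ
```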
2018-01-31 16:12:54 <jubo2> Ok
2018-01-31 16:13:03 <addshore> have you tried adding a page to your watchlist that you know has recently changed?
2018-01-31 16:13:15 <d3r1ck> mainframe98: Okay great! Walk me through when you are done please :)
2018-01-31 16:13:17 <jubo2> No and I need to get going somewhere soon
2018-01-31 16:13:24 <d3r1ck> CFisch_WMDE: Thanks for the +2 :)
2018-01-31 16:14:05 <CFisch_WMDE> You could have that jenkins job vote I guess
2018-01-31 16:15:46 <bam_> I am new to MediaWiki hacking, which tasks do you advise me to work on?
2018-01-31 16:15:59 <d3r1ck> CFisch_WMDE: :D
2018-01-31 16:16:11 <Hauskatze> mwext-php70-phan-docker is run on 'check experimental'
2018-01-31 16:16:17 <CFisch_WMDE> bam_: In Phabricator there are some tasks tagged as easy
2018-01-31 16:16:21 <Hauskatze> I'm running it on MassMessage now
2018-01-31 16:16:49 <bam_> CFisch_WMDE: Do pass me the link, pls
2018-01-31 16:17:01 <CFisch_WMDE> jepp give me a sec
2018-01-31 16:17:33 <CFisch_WMDE> https://phabricator.wikimedia.org/maniphest/query/jsySdXllltwv/#R
2018-01-31 16:17:51 <CFisch_WMDE> These are tasks from all over the place
2018-01-31 16:18:00 <d3r1ck> bam_: maybe you can help clear this? https://phabricator.wikimedia.org/T175794
2018-01-31 16:18:09 <d3r1ck> Even though Lego has a script that does that automatically
2018-01-31 16:18:15 <d3r1ck> But it can help you get started :)
2018-01-31 16:18:16 <CFisch_WMDE> you could take a look into them and see if you can make any sense out of it
2018-01-31 16:18:32 <CFisch_WMDE> bam_: and see if someone is assigned already
2018-01-31 16:18:54 <CFisch_WMDE> if you really want to work on one of the things, assign yourself first of course
2018-01-31 16:19:03 <d3r1ck> bam_: Yes, follow CFisch_WMDE advice :)
2018-01-31 16:19:18 <CFisch_WMDE> and best ask questions on the ticket, like if you need more input for a better understanding
2018-01-31 16:19:34 <CFisch_WMDE> you can also always ask here
2018-01-31 16:19:35 <bam_> alright, d3r1ck
2018-01-31 16:19:42 <d3r1ck> :)
2018-01-31 16:19:57 <CFisch_WMDE> or look at the board for devs https://discourse-mediawiki.wmflabs.org/
2018-01-31 16:20:23 <CFisch_WMDE> bam_: I guess you read the basics already about becoming a mediawiki hacker?
2018-01-31 16:20:31 <CFisch_WMDE> https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker
2018-01-31 16:20:47 <bam_> CFisch_WMDE: yes of course
2018-01-31 16:21:15 <bam_> I just need some task to work on my spare time
2018-01-31 16:21:21 <CFisch_WMDE> :-)
2018-01-31 16:21:28 <CFisch_WMDE> there is also the tag https://phabricator.wikimedia.org/tag/need-volunteer/
2018-01-31 16:21:47 <d3r1ck> bam_: I remember I merged your task last time right? Thanks for working on that :)
2018-01-31 16:21:55 <d3r1ck> Maybe some improvements on the docs can also help?
2018-01-31 16:22:07 <d3r1ck> CFisch_WMDE: Any help on this please? https://gerrit.wikimedia.org/r/#/c/406421/, thank you :)
2018-01-31 16:23:04 <CFisch_WMDE> d3r1ck: Added myself as a reviewer ... will probably look into it the next days :-)
2018-01-31 16:23:38 <bam_> d3r1ck: you're welcome. I was busy, I couldn't work on other tasks for AWMD. Things will be ready for them now
2018-01-31 16:23:53 <CFisch_WMDE> mainframe98: You had something posted on the meeting page. Anything you would like to know around that topic?
2018-01-31 16:24:12 <d3r1ck> bam_: Okay great! Anytime I mean. :)
2018-01-31 16:24:14 <CFisch_WMDE> ( not that I feel very prepared )
2018-01-31 16:24:19 <d3r1ck> CFisch_WMDE: Thanks :)
2018-01-31 16:25:20 <mainframe98> CFisch_WMDE: Sure. I have an extension with a service that implements SalvageableService. I was wondering how that works in a multi-wiki environment.
2018-01-31 16:25:32 <mainframe98> Are the values salvaged between wikis?
2018-01-31 16:25:54 <d3r1ck> CFisch_WMDE: Hey, and this too please :), https://gerrit.wikimedia.org/r/#/c/370370/
2018-01-31 16:25:58 <d3r1ck> Thank you
2018-01-31 16:27:14 <CFisch_WMDE> d3r1ck: as long as this meeting does not get flooded with review requests I guess it's ok ;-)
2018-01-31 16:27:17 <bam_> I want to work on this task https://phabricator.wikimedia.org/T175794, but I see GCI
2018-01-31 16:27:34 <CFisch_WMDE> mainframe98: ok, I will have a look ... addshore do you have any insights in that area?
2018-01-31 16:27:47 <bam_> Was it part of GCI?
2018-01-31 16:27:52 <d3r1ck> Is there a bot or tool that one can use to read changes of content on Google Spreadsheet and then updates a wiki table?
2018-01-31 16:28:04 <d3r1ck> bam_: Yes it was part of the Google Code-In contest
2018-01-31 16:28:19 <d3r1ck> But GCI is over and that needs to be cleaned, I mean if someone wants to :)
2018-01-31 16:28:23 <bam_> So, it's now free?
2018-01-31 16:28:27 <addshore> mainframe98: SalvageableServices are only salvagable on a per request basis
2018-01-31 16:28:36 <d3r1ck> CFisch_WMDE: Yeah, that's my last PS review request. Thanks :)
2018-01-31 16:28:46 <d3r1ck> bam_: Yeah it's free and you can work on it :)
2018-01-31 16:29:02 <addshore> mainframe98: what service are you creating (im curious)
2018-01-31 16:29:06 <CFisch_WMDE> bam_: so looking at the ticket I would say it's free ... there seems to be a lot of work done already
2018-01-31 16:29:06 <d3r1ck> bam_: ... and don't forget to keep in mind what CFisch_WMDE said if you have any issues :)
2018-01-31 16:29:07 <bam_> But I think I need some clarification.
2018-01-31 16:29:14 <Hauskatze> who can review https://gerrit.wikimedia.org/r/#/c/406816/ ?
2018-01-31 16:29:19 <CFisch_WMDE> but still some extensions left
2018-01-31 16:29:34 <d3r1ck> bam_: What clarifications, please feel free to ask here :)
2018-01-31 16:29:36 <bam_> Which ones exactly?
2018-01-31 16:30:01 <d3r1ck> bam_: Okay! If you read the ticket, you'll see that some have patches appended at the end of them, some don't
2018-01-31 16:30:04 <bam_> Tasks to work on, because some extensions are already done
2018-01-31 16:30:09 <addshore> Hauskatze: probably a combination of CFisch_WMDE and I
2018-01-31 16:30:20 <d3r1ck> So you just need to pick an extension that doesn't have the minus-x utility and then add that to it.
2018-01-31 16:30:29 <mainframe98> addshore: A service that creates wikiset objects.
2018-01-31 16:30:52 <addshore> hmm, mainframe98 which part of the service is salvagable ?
2018-01-31 16:31:12 <d3r1ck> bam_: So you can see this in the ticket "extensions/CommunityTwitter - no mediawiki/minus-x"
2018-01-31 16:31:18 <addshore> /what resources are salvagable / hard / expensive to recreate?
2018-01-31 16:31:21 <CFisch_WMDE> Hauskatze: I have that one on my monitor already will probably look into it tomorrow
2018-01-31 16:31:25 <bam_> Ok
2018-01-31 16:31:27 <d3r1ck> That extension doesn't yet have minus-x so you can add it
2018-01-31 16:31:52 <d3r1ck> So the ones that don't have patches are not yet done and the ones that have patches are already done and pending review I think
2018-01-31 16:32:09 <d3r1ck> bam_: So just read the task description and you'll see an example patch that solves the issue for each extension :)
2018-01-31 16:32:34 <d3r1ck> bam_: You can read what the utility is all about here: https://www.mediawiki.org/wiki/MinusX.
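For illustration, the MinusX workflow described on that page boils down to a few commands; the extension path below is just the example quoted from T175794, and is an assumption for any other extension:

```shell
# Sketch: running the MinusX utility over an extension to find and strip
# stray executable bits (extension path is the example from T175794).
cd extensions/CommunityTwitter
composer require --dev mediawiki/minus-x
vendor/bin/minus-x check .   # report files that should not be executable
vendor/bin/minus-x fix .     # remove the executable bit from those files
```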
2018-01-31 16:32:36 <bam_> It is what I am trying to do?
2018-01-31 16:33:11 <d3r1ck> bam_: I don't understand your question, can you rephrase please? Or does someone understand bam_? :)
2018-01-31 16:34:08 <mainframe98> addshore: The sets themselves. It's not so much that the sets are hard to create, but they're requested every request, for each set the wiki is part of. The problem stems from the fact that it is used in the MediaWiki Setup process, when caching is not available.
2018-01-31 16:34:44 <bam_> I mean I am trying to read task description
2018-01-31 16:35:00 <d3r1ck> bam_: Okay! Sure! Feel free :)
2018-01-31 16:35:11 <d3r1ck> I saw the question mark and so I thought it was a question
2018-01-31 16:35:18 <d3r1ck> 17:34:37 bam_ | It is what I am trying to do?
2018-01-31 16:35:49 <bam_> Sorry
2018-01-31 16:36:00 <addshore> hmmmm
2018-01-31 16:36:03 <bam_> It could be a period (.)
2018-01-31 16:36:17 <d3r1ck> Okay!
2018-01-31 16:36:26 <addshore> mainframe98: they will be recreated every request no matter if they implement SalvageableService or not
2018-01-31 16:37:19 <addshore> mainframe98: and in a regular request you will have the same service object for the whole life of the request
2018-01-31 16:37:19 <bam_> How do I assign myself a task on Phabrigator?
2018-01-31 16:37:26 <addshore> unless you're doing some really funky stuff
2018-01-31 16:38:01 <d3r1ck> bam_: That task is quite big to assign to yourself
2018-01-31 16:38:11 <d3r1ck> But to assign a task to yourself, you can follow 2 methods
2018-01-31 16:38:19 <bam_> Ok
2018-01-31 16:38:45 <bam_> What are those methods?
2018-01-31 16:38:49 <mainframe98> addshore: I knew that the service object is retained for the whole request, but how is anything in a service cached then?
2018-01-31 16:39:49 <d3r1ck> 1. Go to the text area below the task, click on "Add action" dropdown, click on "Assign/Claim" (this will auto assign to you) then hit submit
2018-01-31 16:40:00 <mainframe98> addshore: If I have an extension that obtains configuration from the database and provides that to a custom config factory, how should it cache that configuration so it isn't requested each request? (since putting configuration for each set together in one large config object is expensive)
2018-01-31 16:40:45 <d3r1ck> 2. On the right menu, click "Edit Task", on the "Assigned To" input field in the form, put your "phab name" and hit "Save changes"
2018-01-31 16:40:55 <d3r1ck> bam_: Method 1 is the fastest. I'll recommend that.
2018-01-31 16:40:56 <addshore> so the services themselves are 'cached' in memory for the whole request within the service container (MediaWikiServices)
2018-01-31 16:40:58 <addshore> mainframe98: ^^
2018-01-31 16:41:01 <d3r1ck> Do you understand? bam_ ? :)
2018-01-31 16:41:23 <addshore> mainframe98: if you have a service that gets config from the db, you will need to cache that within the service itself if you want some sort of cache / not to hit the db each time
2018-01-31 16:41:37 <addshore> mainframe98: do you have some code or a patch that I could look at?
2018-01-31 16:42:03 <d3r1ck> bam_: You can assign that task to yourself if you want to finish up adding minus-x to all the remaining extensions. But since it's many tasks in 1, you can just keep submitting patches :)
2018-01-31 16:42:41 <bam_> Can't see the text area
2018-01-31 16:42:58 <mainframe98> addshore: unfortunately, it's not public, since I keep it on my own offline git repo.
2018-01-31 16:43:14 <d3r1ck> bam_: It's because the task has many comments, scroll right at the bottom-most part of the page :)
2018-01-31 16:43:18 <addshore> mainframe98: okay
2018-01-31 16:43:38 <addshore> mainframe98: so yes, it sounds like your service that constructs these wikisets should have some sort of cache within it for caching the data
2018-01-31 16:44:08 <addshore> mainframe98: take a look at WANObjectCache (for a complex usecase that will work in most cases)
2018-01-31 16:44:31 <mainframe98> d3r1ck: Unfortunately, I can't seem to successfully generate the mediawiki documentation offline. There's something that fails within it.
2018-01-31 16:45:01 <addshore> mainframe98: you also have the implementations of BagOStuff which are much simpler; you can get a generic cache from MediaWikiServices::getMainObjectStash
2018-01-31 16:45:12 <bam_> Oh, I see. Thank you so much
2018-01-31 16:45:21 <d3r1ck> mainframe98: Hmmm.. Okay! Thanks very much :), I'll also try to see if I can do that after the meeting :)
2018-01-31 16:45:28 <d3r1ck> bam_: Yeah, you're welcome anytime :)
2018-01-31 16:45:30 <mainframe98> addshore: I had used WANObjectCache before, which seems to suit my needs. The only problem I ran into was getting that service injected since I need to have it before the config factory can be created.
2018-01-31 16:46:04 <addshore> hmmm
2018-01-31 16:46:08 <d3r1ck> Phantom42: You around sir? :)
2018-01-31 16:46:37 <Phantom42> d3r1ck: Yes, I am!
2018-01-31 16:46:39 <addshore> mainframe98: you say this was being run in setup? do you have an exact line i could look at? where is the service that creates wikiset objects for you being created?
2018-01-31 16:46:45 <d3r1ck> Phantom42: something interesting or that may interest you: https://phabricator.wikimedia.org/project/view/1424/.
2018-01-31 16:46:56 <d3r1ck> You may find some pleasure working on those? :)
2018-01-31 16:47:04 <mainframe98> addshore, I'll try to post a gist.
2018-01-31 16:47:12 <addshore> mainframe98: okay!
2018-01-31 16:47:14 <d3r1ck> Phantom42: Tony and I with others are always available to review :)
2018-01-31 16:47:27 <d3r1ck> Phantom42: In fact, you know the general process :)
2018-01-31 16:48:00 <d3r1ck> Phantom42: Just thought about you so was willing to share that with you please :)
2018-01-31 16:48:46 <Phantom42> d3r1ck: Oh, thanks for sharing! I would be glad to work on this!
2018-01-31 16:49:02 <d3r1ck> Phantom42: Thank you so much :)
2018-01-31 16:51:23 <mainframe98> addshore: https://gist.github.com/mainframe98/c449ef71bb87cdbc337dd7c55dc6defc
2018-01-31 16:51:28 <bam__> I'm sorry, but I have to leave you.
2018-01-31 16:51:58 <CFisch_WMDE> bam_: Thanks for checking in!
2018-01-31 16:52:01 <CFisch_WMDE> See you around
2018-01-31 16:52:03 <CFisch_WMDE> :-)
2018-01-31 16:52:24 <bam__> Thank you, too for hosting
2018-01-31 16:52:26 <CFisch_WMDE> And please always feel free to ask for help!
2018-01-31 16:52:41 <mainframe98> addshore: This doesn't use the wikiset factory service, I now realize. The project has gotten big enough that I don't always know what uses what, unfortunately.
2018-01-31 16:52:48 <bam__> Ok
2018-01-31 16:54:21 <addshore> so mainframe98 the $bootstrapConfig is a GlobalVarConfig instance
2018-01-31 16:54:52 <addshore> I guess that has config for caches in there
2018-01-31 16:55:29 <addshore> Also, if the caching config is general / doesn't change per site, then you can still get a caching service from $services there
2018-01-31 16:57:53 <mainframe98> addshore: Alright, that's good to know. The problem is that the caching service uses the MainConfig service, which calls the MediaWiki-provided ConfigFactory, which I'm trying to replace. Isn't that going to cause a conflict?
2018-01-31 16:58:47 <addshore> What sort of cache are you looking to use? file? apc? memcached or similar?
2018-01-31 17:00:12 <mainframe98> addshore: Anything that is shared between wikis, so probably apc, memcached or similar. Since configuration for sets is shared between wikis, it makes sense to share it between wikis.
2018-01-31 17:00:38 <addshore> Depending on the order that the service actually gets replaced in just calling $services->getMainObjectStash() could work
2018-01-31 17:01:34 <CFisch_WMDE> So despite the very active discussion here, the official part of the Technical Advice IRC meeting this week is over now. Thanks to all that participated, see you next week I guess :-)!
2018-01-31 17:01:44 <addshore> o/ CFisch_WMDE :D
2018-01-31 17:02:35 <mainframe98> addshore: True, but due to backwards compatibility, especially regarding extensions that do not use the Config factory yet, I have to export the configuration to $GLOBALS almost instantaneously for the extensions to work.
2018-01-31 17:03:03 <addshore> oooof
2018-01-31 17:03:14 <mainframe98> addshore: Regardless, you've helped me a great deal already. I can at least try and see if I can implement simple caching insertion and see where that brings me.
2018-01-31 17:03:30 <mainframe98> addshore: So, thanks!
2018-01-31 17:04:49 <addshore> mainframe98: I generally think that trying to redefine the ConfigFactory might result in even more errors further down the line
2018-01-31 17:05:39 <addshore> mainframe98: something to take note of is in includes/Setup.php on line 609 ish there is a call to MediaWikiServices::resetGlobalInstance( new GlobalVarConfig(), 'quick' );
2018-01-31 17:07:24 <addshore> it is indeed a sticky area :)
2018-01-31 17:10:14 <mainframe98> addshore: That could actually work in my advantage. Thanks for the explanations. I now understand SalvageableService a lot better. I'll see what I can achieve.
2018-01-31 17:13:30 <Hauskatze> CFisch_WMDE: thanks for the tips wrt phan and to mainframe98 for past/future tests on that patch :D
2018-01-31 17:13:43 <Hauskatze> I might create a task for that, but I'm not sure it's needed
2018-01-31 17:14:18 <CFisch_WMDE> you're welcome
2018-01-31 17:14:26 <Hauskatze> what a silly fix, adding 'use MWNamespace;' ;-)
2018-01-31 17:14:34 <Hauskatze> wonder how it didn't fail in the past
2018-01-31 17:39:50 <Hauskatze> is the tech advice meeting still on?
2018-01-31 17:40:06 <Hauskatze> 'cause I have https://gerrit.wikimedia.org/r/#/c/372803/ which could use some more eyes
2018-01-31 17:40:37 <Hauskatze> [not mine, from Melo-s, but for an extension I use a lot every day :) ]
2018-01-31 17:40:41 <CFisch_WMDE> Nope normally only 1 hour, but if you're lucky someone will pay attention anyway ;-)
2018-01-31 17:40:49 <Hauskatze> hah, well, not urgent
2018-01-31 17:44:07 <Hauskatze> thanks for merging the massmessage stuff
2018-01-31 17:44:14 <Hauskatze> hope that it resolves the issue :)
2018-01-31 19:21:29 <alfie> doing some sockhunting on en-wiki: does anyone know of a tool to compare the editor intersection on a set of articles?
2018-01-31 19:31:00 <Stryn> alfie: I think https://tools.wmflabs.org/intersect-contribs/
2018-01-31 19:32:23 <alfie> Stryn: this is the inverse - i want to look at a set of articles and see which users are common to them
2018-01-31 19:32:35 <alfie> has a suspicion there's some undisclosed paid editing going on
2018-01-31 19:32:47 <Stryn> ah then I don't know unfortunately
2018-01-31 19:33:00 <alfie> Stryn: no worries, this'll probably come in handy anyway!
2018-01-31 19:33:18 <alfie> Wikipedia: Actually just an exercise in set theory.
2018-01-31 20:23:05 <ankry> any hints how to force a 2-year-old thumbnail from an outdated version of a file to be regenerated?
2018-01-31 20:23:20 <ankry> https://upload.wikimedia.org/wikipedia/commons/thumb/e/ee/Karol_May_-_Old_Surehand_01.djvu/page256-854px-Karol_May_-_Old_Surehand_01.djvu.jpg
2018-01-31 20:23:34 <ankry> I see p. 230 instead of 232 here
2018-01-31 20:33:13 <ankry> Last-Modified: Fri, 08 Jan 2016 07:47:03 GMT
2018-01-31 20:33:13 <ankry> Etag: a81b9d5936394d1d2a1a7341c53058c9
2018-01-31 20:33:13 <ankry> X-Timestamp: 1452239222.85447
2018-01-31 20:33:13 <ankry> X-Trans-Id: txd5b25d42412b41cab95b1-005a721e98
2018-01-31 20:33:13 <ankry> X-Varnish: 499281636, 338202450 336658680, 921780112
2018-01-31 20:33:15 <ankry> Via: 1.1 varnish-v4, 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1)
2018-01-31 20:33:18 <ankry> Age: 2354
2018-01-31 20:33:21 <ankry> X-Cache: cp1048 pass, cp3034 hit/10, cp3035 miss
2018-01-31 20:33:23 <ankry> X-Cache-Status: hit-local
2018-01-31 20:35:30 <ankry> _08_ _Jan_ _2016_ while current version was uploaded 2016-08-23T23:28:43
2018-01-31 20:41:42 <bd808> ankry: it looks like it at least needs to be purged from the Varnish cache layer -- https://wikitech.wikimedia.org/wiki/Multicast_HTCP_purging#One-off_purge -- but it may or may not also need to be somehow purged from the long term image storage layer (swift).
2018-01-31 20:42:37 <bd808> ankry: filing a phabricator task and tagging it with #traffic <https://phabricator.wikimedia.org/tag/traffic/>; would probably be the way to start that process
2018-01-31 20:46:38 <ankry> bd808: so no way to do it online?
2018-01-31 20:47:09 <bd808> ankry: not that I can think of, primarily because that image size is non-standard
2018-01-31 20:47:36 <bd808> I think that means it would only exist in the Varnish cache
2018-01-31 20:49:04 <bd808> generating purges of all thumb sizes for a given image from the MediaWiki side is something that is not really possible today. This is in part a limitation of the way that Varnish keeps track of objects in its cache
2018-01-31 20:50:16 <bd808> the best we can do is to enumerate the images that are known to exist in the Swift object store and then issue Varnish HTCP purges for each unique image size
2018-01-31 20:51:04 <bd808> I'm not entirely sure if that even works for djvu and other multi-page media types
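For reference, the two purge routes discussed above might look like the sketch below. The API call is something any user can try; the `purgeList.php` invocation follows the one-off purge recipe bd808 linked on wikitech and needs server access. As bd808 explains, the client-side purge likely won't reach the non-standard 854px thumbnail cached only in Varnish:

```shell
# Client-side attempt: ask MediaWiki to purge the file page and its
# standard thumbnails via the action=purge API (POST required).
curl -s -X POST 'https://commons.wikimedia.org/w/api.php' \
  --data-urlencode 'action=purge' \
  --data-urlencode 'titles=File:Karol_May_-_Old_Surehand_01.djvu' \
  --data-urlencode 'format=json'

# Server-side one-off purge (needs shell access to the cluster):
# feed the exact thumbnail URL to purgeList.php to emit an HTCP purge.
echo 'https://upload.wikimedia.org/wikipedia/commons/thumb/e/ee/Karol_May_-_Old_Surehand_01.djvu/page256-854px-Karol_May_-_Old_Surehand_01.djvu.jpg' \
  | mwscript purgeList.php --wiki=commonswiki
```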
2018-01-31 21:05:38 <ankry> bd808: https://phabricator.wikimedia.org/T186153
2018-01-31 21:05:59 <ankry> can you help tagging?
2018-01-31 21:06:31 <bd808> ankry: I took some guesses :)

This page is generated from SQL logs; you can also download static txt files from here.