[00:40:46] !log codesearch sudo service hound-search restart (T240776)
[00:40:49] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Codesearch/SAL
[00:40:49] T240776: Codesearch "Everything" is missing MediaWiki core - https://phabricator.wikimedia.org/T240776
[01:30:50] !log codesearch rolling restart of all hound instances because of gerrit-replica outage (T240776, T240763)
[01:30:53] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Codesearch/SAL
[01:30:54] T240776: Codesearch "Everything" is missing MediaWiki core - https://phabricator.wikimedia.org/T240776
[01:30:54] T240763: gerrit-replica is returning 502 responses when trying to git clone, breaking libup - https://phabricator.wikimedia.org/T240763
[20:20:54] Hi, on horizon.wikimedia.org I see this: debian-10.0-buster (deprecated 2019-12-15)
[20:21:09] What should I do?
[20:21:11] !help
[20:21:12] Zoranzoki21: If you don't get a response in 15-30 minutes, please create a phabricator task -- https://phabricator.wikimedia.org/maniphest/task/edit/form/1/?projects=wmcs-team
[20:22:48] Zoranzoki21: that's OK. It just means a new image is provided for new instances on buster.
[20:23:21] Which one should I use? https://prnt.sc/qb82xj
[23:07:12] Zoranzoki21: that's me experimenting, sorry. Use debian-10-buster.
[23:39:09] Hello: not sure if this is the right channel to ask, but I was trying to download the hourly data dump files from https://dumps.wikimedia.org/other/pageviews/2019/2019-12/ for a project. However, I noticed that the most recent data is more than a day old. Is there a reason the dump is falling behind?
[23:43:13] mturk: interesting, that might actually be a #wikimedia-operations question.
[23:44:05] although...
[23:44:15] dumps.wm.o is served by the labstore boxes
[23:44:31] labstore100[67].wikimedia.org
[23:46:07] Thanks for the response. Not sure if I can access that server, though? Any particular way to investigate why the recent data is not available?
[23:46:17] No, you won't be able to
[23:46:29] I won't be able to either
[23:48:15] I don't know how files actually get there
[23:48:41] but I assume it's a convenient place to serve them from, given they have to be uploaded there anyway for access over NFS inside labs
[23:51:55] Looks like various analytics/statistics servers mount stuff from labstore100[67]
[23:58:23] Interesting. Can someone confirm whether the dump job is failing or delayed?
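(Editor's note: for anyone without shell access to the labstore hosts, the dump lag discussed above can be estimated from the public directory listing alone, since the pageviews filenames encode their hour, e.g. pageviews-20191215-230000.gz. A minimal sketch, with an illustrative hard-coded file list standing in for a real scrape of https://dumps.wikimedia.org/other/pageviews/2019/2019-12/:)

```python
from datetime import datetime
import re

# Illustrative sample; a real check would scrape these names from the
# directory index at dumps.wikimedia.org/other/pageviews/2019/2019-12/.
listing = [
    "pageviews-20191214-220000.gz",
    "pageviews-20191214-230000.gz",
    "pageviews-20191215-000000.gz",
]

def latest_dump_hour(names):
    """Return the most recent timestamp encoded in the dump filenames."""
    pattern = re.compile(r"pageviews-(\d{8})-(\d{6})\.gz")
    stamps = [
        datetime.strptime(m.group(1) + m.group(2), "%Y%m%d%H%M%S")
        for m in map(pattern.match, names) if m
    ]
    return max(stamps)

latest = latest_dump_hour(listing)
now = datetime(2019, 12, 16, 23, 0)  # fixed "current" time for the example
lag_hours = (now - latest).total_seconds() / 3600
print(f"latest dump hour: {latest:%Y-%m-%d %H:%M}, lag: {lag_hours:.0f}h")
# → latest dump hour: 2019-12-15 00:00, lag: 47h
```

(Anything over a couple of hours of lag would suggest the hourly dump job is indeed delayed or failing, which is what mturk was seeing.)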