[03:10:32] 10DBA, 10Community-Tech, 10MediaWiki-Database, 10MediaWiki-extensions-PageAssessments, 10Operations: cron spam from mwmaint1002 - https://phabricator.wikimedia.org/T211269 (10Mathew.onipe)
[10:11:32] 10DBA: labsdb1004 replication broken for linkwatcher_linklog table - https://phabricator.wikimedia.org/T211210 (10Banyek) The replication is caught up, doing the reimport now
[10:17:59] 10DBA: labsdb1004 replication broken for linkwatcher_linklog table - https://phabricator.wikimedia.org/T211210 (10Banyek) The operation would be: - reimport the data with the `/home/marostegui/reimport_from_master.sh` script. (It will take time as the table is ~400G) - when it's done, I'll remove the replicatio...
[10:27:28] 10DBA: labsdb1004 replication broken for linkwatcher_linklog table - https://phabricator.wikimedia.org/T211210 (10Banyek) Hm, I'll leave this as-is as I am not sure which user to use
[12:38:05] 10DBA, 10Community-Tech, 10MediaWiki-Database, 10MediaWiki-extensions-PageAssessments, 10Operations: cron spam from mwmaint1002 - https://phabricator.wikimedia.org/T211269 (10Banyek) I'll take a look at this
[12:56:16] 10DBA: labsdb1004 replication broken for linkwatcher_linklog table - https://phabricator.wikimedia.org/T211210 (10Marostegui) You should use root to run the script. If the table is that big you can either reimport it entirely or just diff the table and fix the inconsistencies manually.
[13:20:39] 10DBA, 10Community-Tech, 10MediaWiki-Database, 10MediaWiki-extensions-PageAssessments, 10Operations: cron spam from mwmaint1002 - https://phabricator.wikimedia.org/T211269 (10Banyek)
[13:28:40] marostegui: daniel says yes :)
[13:28:44] * addshore will file the ticket now
[13:32:18] 10DBA, 10Wikidata: Make a copy of the current wb_terms table on the MCR testing DB servers - https://phabricator.wikimedia.org/T211338 (10Addshore)
[13:32:22] ^^ tada
[13:32:59] 10DBA, 10Community-Tech, 10MediaWiki-Database, 10MediaWiki-extensions-PageAssessments, 10Operations: cron spam from mwmaint1002 - https://phabricator.wikimedia.org/T211269 (10Banyek) I checked the query `SELECT /* Wikimedia\Rdbms\Database::select www-data@mwmain... */ DISTINCT( pa_project_id ) FROM `pa...
[13:33:08] tx
[13:33:19] fyi Manuel is out today (bank holiday)
[13:34:15] 10DBA, 10Wikidata: Make a copy of the current wb_terms table on the MCR testing DB servers - https://phabricator.wikimedia.org/T211338 (10Banyek) p:05Triage>03Normal
[13:54:28] 10DBA, 10Community-Tech, 10MediaWiki-Database, 10MediaWiki-extensions-PageAssessments, 10Operations: cron spam from mwmaint1002 - https://phabricator.wikimedia.org/T211269 (10Banyek) 05Open>03Invalid This is a duplicate of T208231
[13:55:01] 10DBA, 10Operations: Issues with purgeUnusedProjects.php cron job on mwmaint1002 (Fri Oct 26) - https://phabricator.wikimedia.org/T208231 (10Banyek)
[14:21:52] 10DBA, 10Operations: Issues with purgeUnusedProjects.php cron job on mwmaint1002 (Fri Oct 26) - https://phabricator.wikimedia.org/T208231 (10Banyek)
[14:22:08] 10DBA, 10Operations, 10User-Banyek: Issues with purgeUnusedProjects.php cron job on mwmaint1002 (Fri Oct 26) - https://phabricator.wikimedia.org/T208231 (10Banyek)
[14:38:19] banyek: o/
[14:38:28] do you have a min for a couple of questions?
[14:39:02] (about dbstore)
[14:39:06] no :( , I'm just about to leave for a doctor - I have an appointment
[14:39:12] we can talk when I am back home
[14:39:16] or tomorrow
[14:40:17] elukey: I'll ping you, ok?
[14:49:13] sure!
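For the T211210 thread above, where Marostegui suggests either reimporting linkwatcher_linklog entirely or just diffing the table and fixing the inconsistencies manually, here is a minimal sketch of what the manual-diff option could look like. It is not the `/home/marostegui/reimport_from_master.sh` script; the bucket size and the column names (`ll_id`, `ll_timestamp`) are assumptions for illustration only.

```
-- Hypothetical diff sketch (not the actual reimport_from_master.sh script):
-- compute a per-bucket row count and checksum, run the same statement on the
-- master and on the replica, and re-copy only the id ranges whose results differ.
-- Column names ll_id / ll_timestamp are assumed, not the table's real schema.
SELECT
    FLOOR(ll_id / 1000000) AS id_bucket,
    COUNT(*)               AS row_count,
    BIT_XOR(CRC32(CONCAT_WS('#', ll_id, ll_timestamp))) AS bucket_checksum
FROM linkwatcher_linklog
GROUP BY id_bucket
ORDER BY id_bucket;
```

Comparing the two result sets narrows a ~400G reimport down to the id ranges that actually diverged; tools like pt-table-checksum automate essentially this idea.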
[14:49:24] Tx
[16:30:58] elukey: I am here
[16:31:01] we can talk
[16:35:41] hello!
[16:36:09] nothing really big, I just wanted to know if my understanding of the multi-instance settings is right
[16:36:56] so IIUC, each section will get (on the assigned dbstore host) a mysql instance only for that database
[16:37:21] and what we call 'staging' should likely follow the same path
[16:37:58] so people used to the 'multi source' world will not be able to join any database, not even on the same host
[16:38:12] (say if sX and sY are assigned to the same dbstore node)
[16:38:47] I don't want to rethink everything, just to be super sure about how to explain this to everybody using dbstore1002 now
[16:39:04] and think about how to reach out to people to migrate over etc..
[16:41:29] yes, exactly, that is the point; that was the reason I asked about putting a 'staging' db on each instance too
[16:42:10] but ofc anyone is still able to join data from those schemas which are in the same section
[16:42:31] yep yep
[16:43:20] ahhh wait, what we call sectionX in reality is not a single db, but could be multiple ones?
[16:43:23] * elukey super ignorant
[16:44:04] of course, it is only a convenient way to refer to a group of schemas
[16:44:05] silly me
[16:44:12] all right, got it
[16:44:29] but for example s1 only has enwiki inside
[16:45:15] here are the dblists:
[16:45:15] https://phabricator.wikimedia.org/source/mediawiki-config/browse/master/dblists/s1.dblist
[16:45:18] https://phabricator.wikimedia.org/source/mediawiki-config/browse/master/dblists/s2.dblist
[16:45:20] etc.
[16:45:22] until s8
[16:46:02] super, thanks a lot
[16:46:22] you're welcome! I am glad to help :)
[16:46:42] so I am now going to reach out to people about the proposal that you made in the task (about splitting the sections and staging)
[16:47:12] On Monday we'll declare the time for comments over and we'll be able to start anytime
[16:47:22] perfect
[16:47:51] ack :_
[16:47:52] I'll continue tomorrow digging into the host to see if there's anything that is set up there but not in puppet
[16:47:52] :)
[16:48:08] I am a bit concerned about the 'how to' but we'll see :)
[16:57:18] 10DBA, 10Analytics, 10Analytics-Kanban, 10Data-Services, and 3 others: Create materialized views on Wiki Replica hosts for better query performance - https://phabricator.wikimedia.org/T210693 (10Banyek) 150 GB seems acceptable to me, but we still have some time factor to think about, because those table...
[16:58:10] 10DBA, 10Analytics, 10Analytics-Kanban, 10Data-Services, and 3 others: Create materialized views on Wiki Replica hosts for better query performance - https://phabricator.wikimedia.org/T210693 (10Banyek) @Bstorm if you prepare a depool patch for me for tomorrow I can start creating the mat. view on the other...
[16:59:50] 10DBA, 10Patch-For-Review, 10User-Banyek, 10Wikimedia-Incident: Compare a few tables per section between hosts and DC - https://phabricator.wikimedia.org/T207253 (10Banyek) @Anomie it's worth a look, thanks
[17:02:50] 10DBA, 10Cloud-Services, 10User-Banyek, 10User-Urbanecm: Prepare and check storage layer for punjabiwikimedia - https://phabricator.wikimedia.org/T207584 (10Banyek) @Urbanecm Nope, I checked it on the labsdb instances and it was sanitized properly. @Bstorm you can create the views (if the fishbowl wikis...
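To make the multi-instance discussion above concrete: on a per-section dbstore instance, cross-database joins keep working as long as both schemas belong to the same section (and therefore live in the same mysqld), including joins against that instance's 'staging' database; joins across sections are no longer possible because the sections run as separate server processes. A minimal sketch follows, assuming for illustration that itwiki is in s2, that the s2 instance listens on its own port on a multi-instance host, and that `staging.my_scratch_pages` is a hypothetical user table.

```
-- Hypothetical example: connect to the (assumed) s2 instance on its own port,
-- e.g. mysql -h <dbstore-host> -P 3312, then join a wiki table against a
-- scratch table in that instance's staging database.
SELECT p.page_id, p.page_title, s.note
FROM itwiki.page AS p
JOIN staging.my_scratch_pages AS s   -- hypothetical scratch table
  ON s.page_id = p.page_id;

-- No longer possible: joining itwiki (s2 instance) with enwiki (s1 instance),
-- even on the same host; that data has to be exported from one instance and
-- loaded into the other's staging database, or combined client-side.
```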
[17:03:47] 10DBA, 10Cloud-Services, 10User-Urbanecm: Prepare and check storage layer for punjabiwikimedia - https://phabricator.wikimedia.org/T207584 (10Banyek)
[17:05:01] 10DBA, 10Operations, 10User-Banyek: Issues with purgeUnusedProjects.php cron job on mwmaint1002 (Fri Oct 26) - https://phabricator.wikimedia.org/T208231 (10Banyek) I'd like to add the owner of the script as a subscriber, but I don't know how to find out who it is
[17:12:34] 10DBA, 10Analytics, 10Data-Services, 10User-Banyek, 10User-Elukey: Hardware for cloud db replicas for analytics usage - https://phabricator.wikimedia.org/T210749 (10Banyek)
[17:15:35] 10DBA, 10Analytics, 10Data-Services, 10User-Banyek, 10User-Elukey: Hardware for cloud db replicas for analytics usage - https://phabricator.wikimedia.org/T210749 (10Banyek) >>! In T210749#4795368, @Milimetric wrote: > Ok, done and agreed. But instead of trying to find hardware that will keep up with rep...
[17:19:08] I'm leaving for today.
[18:52:18] 10DBA: labsdb1004 replication broken for linkwatcher_linklog table - https://phabricator.wikimedia.org/T211210 (10Marostegui) p:05Normal>03High
[20:23:58] 10DBA, 10Operations, 10ops-eqiad: rack/setup/install pc1007-pc1010 - https://phabricator.wikimedia.org/T207258 (10Cmjohnson) @marostegui and all, the system board that was replaced yesterday was faulty, showing errors on DIMM slots B4 and B1. After swapping DIMMs in B with DIMMs in A, the error remained B4...
[23:26:24] 10DBA, 10Operations, 10Research, 10Services (designing): Storage of data for recommendation API - https://phabricator.wikimedia.org/T203039 (10bmansurov)
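As a closing note on the T210693 comments above about materialized views on the Wiki Replica hosts: MariaDB has no native materialized-view feature, so the usual pattern is a regular table that caches the result of an expensive query and is rebuilt on a schedule. The sketch below shows that generic pattern only; every name in it is made up and it is not the view definition discussed in the task.

```
-- Illustrative materialized-view pattern for MariaDB (all names are assumed):
-- a real table caching an expensive aggregate, rebuilt atomically via a swap.
CREATE TABLE IF NOT EXISTS mat_page_link_counts (
    page_id    INT UNSIGNED NOT NULL PRIMARY KEY,
    link_count INT UNSIGNED NOT NULL
);

-- Periodic refresh step (run from cron or an event scheduler job):
CREATE TABLE mat_page_link_counts_new LIKE mat_page_link_counts;
INSERT INTO mat_page_link_counts_new (page_id, link_count)
    SELECT pl_from, COUNT(*) FROM pagelinks GROUP BY pl_from;
RENAME TABLE mat_page_link_counts     TO mat_page_link_counts_old,
             mat_page_link_counts_new TO mat_page_link_counts;
DROP TABLE mat_page_link_counts_old;
```

The RENAME TABLE swap keeps readers on a consistent copy while the rebuild runs; the trade-off is the extra disk space needed for the second copy during the refresh.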