[07:32:44] As per fast.com, my internet speed is 39 Mbps, yet I get the following:
```
Fetching /mnt/nfs/labstore-secondary-tools-project/campwiz-backend/mariadb.dump to mariadb.dump
mariadb.dump                 11%   10MB   0.3KB/s - stalled -
```
[07:32:50] for the last 120 minutes. Does that look in any way fair?
[07:32:51] Did I retry? Of course, many times.
[07:32:53] Is my internet the problem? I can use any other website very easily. I even included my speed from fast.com. Can you guys understand my frustration? Is there gonna be any solution?
[08:02:57] where is this output from? where are you running this?
[08:54:38] From my Toolforge to my laptop:
```
sftp toolforge
get /mnt/nfs/labstore-secondary-tools-project/campwiz-backend/mariadb.dump mariadb.dump
```
[08:57:51] Now it is showing `15% 14MB 0.1KB/s - stalled`. Very fast speed.
[08:59:19] Now, I don't know what is happening, but when I threaten to kill the connection (`Ctrl + C`), the speed temporarily increases up to 200 KBps
[09:05:27] I suspect what you are seeing is actually an issue with the sftp protocol/tool specifically, and not with the network bandwidth being limited in general
[09:05:42] you might want to try tuning its parameters: https://serverfault.com/a/843722
[09:08:29] or you might want to try the 'rsync' tool to transfer the file instead of 'sftp'
[09:17:37] so, using rsync over ssh,
```
1.11M   1%   66.07kB/s   0:23:46
```
[09:17:41] for the last couple of minutes (re @wmtelegram_bot: or you might want to try the 'rsync' tool to transfer the file instead of 'sftp')
[09:24:41] it is not helping either
[09:24:41] `2,227,872   2%   2.45kB/s   10:32:29`
[09:43:31] @nokibsarkar can you please add this info to T395135? I would also try a few things to identify where the issue is: 1) download a file (not too small) from a Toolforge URL with curl and see if the speed is the same, 2) same but from a non-Toolforge website, 3) same but from a non-WMF website
[09:43:32] T395135: [infra] Reports of slow connectivity from APAC - https://phabricator.wikimedia.org/T395135
[09:49:41] Actually, it does not even complete within 3 hours
[09:50:12] So, a full download is not an option here. Maybe a partial download might be an option
[09:57:41] From Toolforge to our droplet at tools.wikilovesfolklore.org:
```
campwiz@wlftools:~$ rsync --progress -e "ssh -i ~/.ssh/id_ecdsa.toolforge" "nokibsarkar@login.toolforge.org:/data/project/campwiz-backend/mariadb.dump" .
mariadb.dump
     95,367,325 100%    6.48MB/s    0:00:14 (xfr#1, to-chk=0/1)
```
[10:02:04] I just tried downloading the same file with your same command (I just modified the username and SSH key) and it downloaded the file in 20 seconds (I am based in Italy)
[10:03:14] it's not easy to identify exactly what is causing the slow speed for you, but one thing you can do is to compare with other websites, not just fast.com
[10:03:44] I think you mentioned that downloading from commons.wikimedia.org is fast? can you double-check that is still true?
[10:05:40] for example, how long does it take to download this random image with curl? `curl https://upload.wikimedia.org/wikipedia/commons/e/eb/SMS_Arcona_NH_65764_-_Restoration.jpg --output test.jpg`
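The parameter tuning suggested at 09:05:42 usually comes down to sftp's pipelining flags. A minimal sketch, assuming OpenSSH's sftp client and the same `toolforge` host alias used above; the exact values are illustrative, not taken from the thread:
```
# -B sets the copy buffer size in bytes (default 32768) and -R the number
# of outstanding requests (default 64); raising both can help on
# high-latency links. This form fetches the file into the current directory.
sftp -B 262144 -R 256 toolforge:/mnt/nfs/labstore-secondary-tools-project/campwiz-backend/mariadb.dump
```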
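On the partial-download idea from 09:50:12, one option is rsync's `--partial` flag, reusing the exact command that worked droplet-side:
```
# --partial keeps whatever was transferred when the connection drops, so
# rerunning the same command picks up from the partial file instead of
# starting over from zero.
rsync --partial --progress -e "ssh -i ~/.ssh/id_ecdsa.toolforge" \
  "nokibsarkar@login.toolforge.org:/data/project/campwiz-backend/mariadb.dump" .
```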
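The three-way comparison suggested at 09:43:31 can be scripted with curl's timing output; a sketch where the first URL is the image from above and the other two are placeholders to replace with a real Toolforge URL and a non-WMF one:
```
# -w prints transfer statistics after each download; speed_download is
# the average speed in bytes per second.
for url in \
  https://upload.wikimedia.org/wikipedia/commons/e/eb/SMS_Arcona_NH_65764_-_Restoration.jpg \
  https://some-tool.toolforge.org/some-large-file \
  https://example.org/some-large-file
do
  curl -s -o /dev/null -w "%{url_effective}: %{speed_download} B/s over %{time_total}s\n" "$url"
done
```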
"curl https://upload.wikimedia.org/wikipedia/commons/e/eb/SMS_Arcona_NH_65764_-_Restoration.jpg --output test.jpg" [10:27:00] !log quarry disable excel exports (T395237) and tweak redis resources (T396785) [10:27:04] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Quarry/SAL [10:27:05] T395237: quarry is leaking tmp files - https://phabricator.wikimedia.org/T395237 [10:27:06] T396785: Fix Quarry's Redis pod exiting causing frequent outages - https://phabricator.wikimedia.org/T396785 [10:37:49] !log quarry deploy patch adding line numbers to text areas (T315066) [10:37:52] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Quarry/SAL [10:37:52] T315066: Add line numbers in SQL input textarea - https://phabricator.wikimedia.org/T315066 [10:44:06] oooh shiny [11:05:26] i added comment on https://phabricator.wikimedia.org/T395135 (re @wmtelegram_bot: for example, how long does it take to download this random image with curl? "curl https://upload.wikimedia.org/wikipedi...) [18:13:18] !log devtools - deleted puppetmaster-1004 after confirming no instance uses it, only puppetmaster-1003 is used and they are the same version (T390948) [18:13:21] Logged the message at https://wikitech.wikimedia.org/wiki/Nova_Resource:Devtools/SAL [18:13:21] T390948: Cleanup collaboration-services WMCS hiera config - https://phabricator.wikimedia.org/T390948 [18:14:51] Hi, like thumbnail, is there any efficient transcoded version of video/audio available in commons? [18:18:32] nokibsarkar: yes, for example if you go to a file page of a video, scroll down to the section "Transcode status". like here: https://commons.wikimedia.org/wiki/File:After_Storm_(4K_Resolution).webm [18:18:53] you will see different formats / bitrates htere [18:20:39] are there any list of these standard formats? [18:21:42] I mean, are these encoding available for all type of video? is there a way to deterministically calculate these download? [18:23:01] I would hope the commons server supports byte range for buffering thing [18:23:43] bandwidth is a thing for my users, but also performance [18:27:09] nokibsarkar: I think the answer how to get the list of formats is: look here or git clone the repository mediawiki-config: https://gerrit.wikimedia.org/r/plugins/gitiles/operations/mediawiki-config/+/refs/heads/master/wmf-config/CommonSettings.php and then look for all lines with "$wgEnabledTranscodeSet" where a format is set to "true". [18:27:33] another irrelevant thing, gitlab ci cd cache does not work (at least i could not make it work) >> https://gitlab.wikimedia.org/nokibsarkar/campwiz-frontend/-/jobs/535588 [18:27:35] $wgEnabledTranscodeSet['360p.webm'] = true; [18:27:35] $wgEnabledTranscodeSet['144p.mjpeg.mov'] = true; [18:28:47] but not sure enough. maybe ask also here https://commons.wikimedia.org/wiki/Commons:Village_pump [18:28:57] there is a button to start a new discussion thread [18:30:26] you can get that info from the API: https://commons.wikimedia.org/wiki/Special:ApiSandbox#action=query&format=json&prop=transcodestatus&titles=File%3AAfter%20Storm%20(4K%20Resolution).webm&formatversion=2 [18:51:26] is it available via commons replica?