[04:32:15] [telegram] Is it common for Special:ActiveUsers to go out of sync or something? I fail to see this user in it: https://ps.wikipedia.org/wiki/ځانگړی:ونډې/Af420
[04:32:15] [telegram] https://ps.wikipedia.org/wiki/ځانگړی:ActiveUsers
[17:28:16] [telegram] https://i.imgur.com/JQeE2Sk.jpg
[17:29:39] [telegram] Curious (re @Thecladis: Is it common for Special:ActiveUsers to go out of sync or something? I fail to see this user in it: https://ps.wikipedia.org/wiki/ځانگړی:ونډې/Af420
[17:29:40] [telegram] https://ps.wikipedia.org/wiki/ځانگړی:ActiveUsers)
[19:02:53] [telegram] Hello, I am applying for a WikiCite grant to mass-upload the DBLP and OpenCitations scholarly databases to Wikidata.
[19:02:54] [telegram] I invite you to endorse the proposal at https://meta.wikimedia.org/wiki/Wikicite/grant/Adding_support_of_DBLP_and_OpenCitations_to_Wikidata
[19:03:04] [telegram] WikiCite requires several endorsements from the Wikimedia community to fund grant proposals.
[19:03:04] [telegram] This is mainly to ensure that the mass upload of data will not cause controversy within the Wikimedia community.
[19:39:06] [telegram] I don't get it. You're going to use the money to buy a computer with an internet connection? Why don't you just use Toolforge or the Wikimedia Cloud? (re @Csisc1994: Hello, I am applying for a WikiCite grant to mass-upload the DBLP and OpenCitations scholarly databases to Wikidata.
[19:39:06] [telegram] I invite you to endorse the proposal at https://meta.wikimedia.org/wiki/Wikicite/grant/Adding_support_of_DBLP_and_OpenCitations_to_Wikidata)
[19:55:12] [telegram] That is a fair question. We will apply deep learning algorithms to the data before including it in Wikidata. We do not want to overwhelm the cloud with data preprocessing (re @Maarten: I don't get it. You're going to use the money to buy a computer with an internet connection? Why don't you just use Toolforge or the Wikimedia Cloud?)
[19:56:40] [telegram] During this summer, the Wikimedia Cloud got blocked several times because of a high number of requests.
[19:57:59] [telegram] We are talking about editing over 30 million Wikidata items in a row.
[20:00:48] [telegram] Deep learning? What kind of model?
[20:00:49] [telegram] No fancy computer needed for edits. My 10+ year old laptop is doing around 500,000 edits a day.
[20:02:23] [telegram] I would be honoured to receive advice on that. (re @Maarten: Deep learning? What kind of model?
[20:02:24] [telegram] No fancy computer needed for edits. My 10+ year old laptop is doing around 500,000 edits a day)
[20:03:31] [telegram] It is a combination of RNN, CNN and LSTM (re @Maarten: Deep learning? What kind of model?
[20:03:33] [telegram] No fancy computer needed for edits. My 10+ year old laptop is doing around 500,000 edits a day)
[20:04:45] [telegram] A hybrid approach.
[20:06:27] [telegram] You're probably better (and cheaper) off just getting a VM from one of the cloud providers.
[20:07:04] [telegram] I will absolutely think about this. (re @Maarten: You're probably better (and cheaper) off just getting a VM from one of the cloud providers)
[20:07:18] [telegram] $4,000 is a lot of cloud credits.
[20:07:46] [telegram] However, renting cloud resources for many years is expensive as well.
[20:08:35] [telegram] 1 December 2020 - 30 April 2021, it says in the proposal.
[20:09:01] [telegram] That is for the development only. But the bot will run for years.
[20:10:23] [telegram] 500 USD for internet access is sufficient for two years where I live.
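For context on the "combination of RNN, CNN and LSTM" mentioned above: a hybrid CNN + LSTM text model of that general kind might look roughly like the sketch below. Every detail here (vocabulary size, sequence length, layer widths, the binary decision head) is an illustrative assumption, not something taken from the actual proposal.

```python
# Minimal sketch of a hybrid CNN + LSTM text model, assuming a token
# classification task such as deciding whether a bibliographic record
# should be imported. All hyperparameters are illustrative placeholders.
import tensorflow as tf
from tensorflow.keras import layers

VOCAB_SIZE = 20_000  # assumed vocabulary size
SEQ_LEN = 200        # assumed token-sequence length

model = tf.keras.Sequential([
    tf.keras.Input(shape=(SEQ_LEN,)),
    layers.Embedding(VOCAB_SIZE, 128),
    # CNN stage: extract local n-gram features from the embeddings.
    layers.Conv1D(64, kernel_size=5, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    # Recurrent (LSTM) stage: model longer-range order in those features.
    layers.LSTM(64),
    # Assumed binary decision head, e.g. "is this record a duplicate?"
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```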
[20:11:32] [telegram] We will just develop the bot until late April 2021. Then we will let the bots work on their own.
[20:13:08] [telegram] We will only adjust the bot code when there is a change in the data model of the source bibliographic databases.
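For reference, mass edits of the kind discussed above (the half-million-a-day runs, the 30-million-item upload) are commonly made with Pywikibot against the Wikidata repository. The sketch below shows one such edit; the target item, the property (P356 = DOI), the value, and the edit summary are all placeholders, not the proposal's actual matching or import logic.

```python
# Minimal sketch of a single Wikidata bot edit via Pywikibot; assumes a
# configured Pywikibot install with a logged-in bot account.
import pywikibot

site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()

item = pywikibot.ItemPage(repo, "Q42")  # hypothetical target item
item.get()                              # fetch the current item contents

# Skip items that already carry the statement, then add the claim.
if "P356" not in item.claims:
    claim = pywikibot.Claim(repo, "P356")  # P356 = DOI
    claim.setTarget("10.1000/EXAMPLE")     # placeholder DOI value
    item.addClaim(claim, summary="Adding DOI from bibliographic source")
```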