[01:03:46] [telegram] I was testing a few minutes ago to evaluate Wikidata based on external resources
[01:04:41] [telegram] I had this message: Access to XMLHttpRequest at 'https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4795929' from origin 'https://www.wikidata.org' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
[01:05:36] [telegram] This proves that Wikidata is a secure resource and that JavaScript tools cannot be used to hack personal information as many think.
[01:06:14] [telegram] Although my code did not run, I was honoured to see this fact.
[01:06:38] [telegram] ddd
[06:21:35] [telegram] pywikibot is funny
[06:21:53] [telegram] More precisely, the MediaWiki software development universe is funny
[06:21:59] [telegram] WARNING: /Users/aaharoni/dev/pywikibot-he/pywikibot/page/__init__.py:2068: FutureWarning:
[06:22:00] [telegram] Pywikibot needs a MediaWiki markup parser.
[06:22:01] [telegram] Please install the requested module with either
[06:22:03] [telegram] pip install "mwparserfromhell>=0.5.0"
[06:22:04] [telegram] or
[06:22:06] [telegram] pip install "wikitextparser>=0.47.0"
[06:22:07] [telegram] Using pywikibot without MediaWiki markup parser is deprecated for 21 days.
[06:22:09] [telegram] templates = textlib.extract_templates_and_params(
[06:23:16] [telegram] Like, what I'm thinking is: "OK, what should I choose? I don't think I want something from hell. I guess I'll take wikitextparser. But... maybe the one from hell is actually better? Why would anyone make a parser and name it 'a parser from hell'?"
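Both parsers named in the warning do the core job that the `textlib.extract_templates_and_params(` call above needs: pulling templates and their parameters out of wikitext. As a rough illustration only — this toy standard-library sketch ignores nesting, comments, and tag extensions, which the real libraries handle — template extraction looks roughly like this (the function name here mirrors pywikibot's but the implementation is made up):

```python
import re

def extract_templates_and_params(text):
    """Toy extraction of flat, non-nested {{Template|a|b=c}} calls.
    Real parsers (mwparserfromhell, wikitextparser) handle nesting,
    HTML comments, and tag extensions; this sketch does not."""
    results = []
    for match in re.finditer(r"\{\{([^{}]*)\}\}", text):
        parts = match.group(1).split("|")
        name = parts[0].strip()
        params = {}
        positional = 0  # MediaWiki numbers unnamed parameters separately
        for part in parts[1:]:
            if "=" in part:
                key, _, value = part.partition("=")
                params[key.strip()] = value.strip()
            else:
                positional += 1
                params[str(positional)] = part.strip()
        results.append((name, params))
    return results

print(extract_templates_and_params("{{Infobox person|name=Ada|2}}"))
# → [('Infobox person', {'name': 'Ada', '1': '2'})]
```

With mwparserfromhell installed, the equivalent entry point is `mwparserfromhell.parse(text).filter_templates()`, which is what pywikibot wants to delegate to.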
[06:48:11] [telegram] Ihihihih
[07:24:12] [telegram] I think the idea is that wikitext is a hellish format to parse
[07:24:26] [telegram] but mwparserfromhell is the one of the two that I’ve heard of and used before, it’s pretty good
[07:24:29] [telegram] *looks up wikitextparser*
[07:26:15] [telegram] ah, they have a comparison with mwparserfromhell https://pypi.org/project/wikitextparser/#compared-with-mwparserfromhell
[07:26:23] [telegram] I was quite sure about: «to develop that, the developer made a pact with the devil»
[08:56:22] [telegram] the devil is in the details, so use the one from hell. But maybe wikitextparser is even better? It looks like you can choose either one of them, or even both. mwparserfromhell can help you get parameters from templates, if you ever need that.
[09:26:02] [telegram] Feedback wanted from new and not-so-new technical folks: Wikimedia is exploring the idea of creating a developer portal to make it easier for technical contributors to find key documents. Would a single point of entry help you find what you need? Please share your thoughts: https://www.mediawiki.org/wiki/Developer_Advocacy/Developer_Portal/Content_Draft - thanks in advance!
[09:29:27] [telegram] Good initiative. Reminds me of https://xkcd.com/927/ 😊 (re @andreklapper: Feedback wanted from new and not-so-new technical folks: Wikimedia is exploring the idea of creating a developer portal to make it easier for technical contributors to find key documents. Would a single point of entry help you find what you need? Please share your thoughts: https://www.mediawiki.org/wiki/Developer_Advocacy/Developer_Portal/Cont
[09:50:34] [telegram] I also always have that XKCD one in mind. :) That's why that place does not create any new content and only links out to existing content. (re @MaartenDammers: Good initiative. Reminds me of https://xkcd.com/927/ 😊)
[09:52:45] [telegram] It's about discoverability. It is not about yet another place that offers/hosts docs.
Feedback on which use cases or docs might still be missing is very welcome.
[09:59:04] [telegram] Do we have any plan to move toward any other alternative? (re @lucaswerkmeister: I think the idea is that wikitext is a hellish format to parse)
[09:59:59] [telegram] my understanding is that the parsing team at the WMF is very slowly and incrementally improving wikitext
[10:00:23] [telegram] e.g. with the recent-ish change that signatures can’t contain “unbalanced” markup anymore (or something like that, I don’t quite remember)
[10:01:54] [telegram] and in addition to that, [[mw:Specs/HTML/2.2.0]] defines an HTML format for Wikitext, and Parsoid can convert back and forth between the two formats, so you can also make edits based on the HTML and let Parsoid take care of the Wikitext parts
[10:03:10] [telegram] (note: “very slowly” is a good thing – we don’t want articles to break all the time, after all) (re @lucaswerkmeister: my understanding is that the parsing team at the WMF is very slowly and incrementally improving wikitext)
[11:10:21] [telegram] Has anybody got an error starting with 'Undefined offset', as shown in the picture, when using the Author Disambiguator tool? : https://tools-static.wmflabs.org/bridgebot/90235ff9/image_2021_05_07_16_40_17.png
[12:23:26] [telegram] Well, the technical difficulty of not breaking functionality is indeed a significant challenge. But it's probably a piece of cake compared to human resistance to change. There are some change management tricks to deal with that, surely. (re @lucaswerkmeister: (note: “very slowly” is a good thing – we don’t want articles to break all the time, after all))
[12:27:31] [telegram] [[Cognitive inertia]]
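The Parsoid round-tripping mentioned at 10:01:54 is exposed through the Wikimedia REST API's `/api/rest_v1/transform/...` routes, so you can try it without running Parsoid yourself. A minimal sketch, assuming those public endpoints (the helper names are made up for illustration, and the actual HTTP call needs network access):

```python
import urllib.parse
import urllib.request

# Every Wikimedia wiki exposes the same REST routes; en.wikipedia.org is
# just an example base URL.
REST_BASE = "https://en.wikipedia.org/api/rest_v1"

def transform_url(rest_base, source, target):
    """Build a Parsoid transform route, e.g. .../transform/wikitext/to/html."""
    return f"{rest_base}/transform/{source}/to/{target}"

def wikitext_to_html(wikitext, rest_base=REST_BASE):
    """POST wikitext to the transform endpoint and return rendered HTML.
    This performs a real network request when called."""
    data = urllib.parse.urlencode({"wikitext": wikitext}).encode()
    req = urllib.request.Request(
        transform_url(rest_base, "wikitext", "html"), data=data
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

# Example (requires network access):
# html = wikitext_to_html("'''bold''' text")
```

The reverse route, `transform/html/to/wikitext`, is what lets HTML-based editors hand their changes back and have Parsoid take care of the wikitext, as described above.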