[08:54:58] when requesting a csrf token (for anonymous users) from the api, i get this token: +\\
[08:55:05] is it valid? :|
[09:57:27] Hey friends! Question: is it generally possible to update a wikipedia page (made by myself) from an Excel file which uses the "format_as_wikitable" macro? THX
[09:58:45] specifically I mean, when I'm clicking the publish button (in the team tab)?
[11:31:46] how can i allow .iso uploads to be used as a download, i.e. [[Media:xxxxxx|xxxxxx ?
[11:37:45] how can i allow .iso uploads to be used as a download, i.e. [[Media:xxxxxx.iso|xxxxxx]]
[11:38:51] deebee: https://www.mediawiki.org/wiki/Manual:Configuring_file_uploads
[11:57:28] Thank you Andre, I got it to work finally ...
[19:23:52] i've imported a wiki without being able to import any page redirects, so i have ~2,500 broken links in Special:WantedPages. is there a way i can unlink all of these items across all the pages, turning them into plain text?
[19:25:58] actually, if i could just unlink any "red" link across all 5,000+ pages, that would help a lot
[19:44:56] !e ReplaceText
[19:44:56] https://www.mediawiki.org/wiki/Extension:ReplaceText
[19:45:09] khalella: ^ that extension may be helpful
[19:45:19] although with as many links as you want to unlink, a bot would likely be better
[19:48:33] Skizzerz: can this extension match broken links, specifically?
[19:48:52] you'd have to tell it individually which pages to replace
[19:48:58] it can't match redlinks in general
[19:49:01] D:
[19:49:03] hence why I brought up the bot idea
[19:49:25] a bot would be fine, if i could find an example that at least gets me part of the way there
[19:51:08] i can also export, script python against it, and then import again
[19:51:15] but i don't know how to check if a link is broken or not
[19:51:46] maybe if i can export that list of 2,500 WantedPages
[20:02:31] khalella: you can use pywikibot's unlink.py script to remove links to the pages on that wanted-pages list (one link target at a time)
[20:03:26] you could export that list and generate the pywikibot command for every item on it, then execute them in batch
[20:03:27] Vulpix: awesome
[20:03:53] some of the items have 100+ instances though. what happens then?
[20:03:58] ill check it out
[20:05:44] it will replace all links that point to that page. The only drawback is that it can't handle multiple pages to unlink at once, which might cause the bot to run multiple times on the same page if a given page has links to more than one wanted page
[20:07:02] thank you Vulpix
[20:07:14] yw
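On the 08:54 csrf question: that token is valid. MediaWiki returns the literal token `+\` for every anonymous session, and it shows up as `+\\` in the raw JSON only because the backslash is escaped. A minimal sketch of fetching it, with a hypothetical `example.org` endpoint standing in for the real wiki:

```python
import requests

API = "https://example.org/w/api.php"  # hypothetical; use the target wiki's api.php

resp = requests.get(API, params={
    "action": "query",
    "meta": "tokens",
    "type": "csrf",
    "format": "json",
})
token = resp.json()["query"]["tokens"]["csrftoken"]
# Anonymous sessions always receive the literal token "+\";
# the raw JSON shows it as "+\\" because the backslash is escaped.
print(repr(token))  # -> '+\\', i.e. the two characters + and \
```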
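On the .iso upload question: the linked manual page boils down to one line in LocalSettings.php, `$wgFileExtensions[] = 'iso';`, which adds iso to the allowed upload extensions. Depending on the wiki's MIME-type checks, uploads may still be rejected, in which case the MIME-related settings described on that same manual page need adjusting as well.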
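On the 19:51 question of how to tell whether a link is broken: the API flags nonexistent pages with a `missing` key, so a script can batch-check titles without parsing any export. A sketch, again against a hypothetical endpoint:

```python
import requests

API = "https://example.org/w/api.php"  # hypothetical endpoint

def pages_exist(titles):
    """Map each title to True/False; action=query marks nonexistent
    pages with a "missing" key in their page object."""
    resp = requests.get(API, params={
        "action": "query",
        "titles": "|".join(titles),  # the API accepts up to 50 titles per call
        "format": "json",
    }).json()
    # Note: the API normalizes titles (e.g. underscores become spaces),
    # so the returned keys may differ slightly from the input.
    return {page["title"]: "missing" not in page
            for page in resp["query"]["pages"].values()}

print(pages_exist(["Main Page", "No such page anywhere"]))
```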
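And for Vulpix's batch suggestion at 20:03: Special:WantedPages is exposed through the `list=querypage` API module, so the export-and-generate step can be scripted. A sketch that prints one unlink.py invocation per wanted page, assuming a pywikibot version that still ships that script and, as before, a hypothetical endpoint:

```python
import requests

API = "https://example.org/w/api.php"  # hypothetical endpoint

params = {
    "action": "query",
    "list": "querypage",
    "qppage": "Wantedpages",  # the API name of Special:WantedPages
    "qplimit": "max",
    "format": "json",
}
wanted = []
while True:
    data = requests.get(API, params=params).json()
    wanted += [row["title"] for row in data["query"]["querypage"]["results"]]
    if "continue" not in data:  # keep paging until the list is exhausted
        break
    params.update(data["continue"])

# One pywikibot run per wanted page; pipe the output into a shell
# to execute the whole batch. -always suppresses per-edit prompts.
for title in wanted:
    print(f'python pwb.py unlink "{title}" -always')
```

As the 20:05 message notes, pages linking to several wanted pages will be edited once per target with this approach; that costs extra edits but keeps each run simple.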