[09:28:57] Hello there. I'm having some issues with running maintenance/update.php
[09:29:08] Failed to populate content table revision row batch starting at 1 due to exception: MediaWiki\Storage\BlobAccessException: Unable to fetch blob at tt:234 in /var/www/mediawiki/includes/Storage/SqlBlobStore.php:295
[10:29:09] 1.31 works fine, but 1.32 and above report this error
[11:15:10] Hello, everyone. I am trying to decompress part of a multistream wiki dump directly in the Ubuntu terminal. So far, I have managed to cut out a part of the archive using the dd command (as advised here: https://en.wikipedia.org/wiki/Wikipedia:Database_download#How_to_use_multistream? ), but when I try to decompress it, bzip2 doesn't recognize it as a bzip2 file. As far as I can tell, I am either using bzip2 wrong or I didn't properly isolate the part of the dump I want to extract (I just took out everything between two indexes given in the pages-articles-multistream-index.txt.bz2 file). Can someone give me a concrete example of how to decompress just part of the archive?
[13:53:55] https://tesseractnet.org/wiki/index.php?title=Main_Page <--- How do I remove all that and just have tesseractnet.org/wiki/Main_Page?
[14:06:47] I tried https://shorturls.redwerks.org//, but when I paste my wiki's URL, I get: "An unknown error has occurred. .query not returned by api"
[14:08:57] Hi. I recently moved my wiki from an Azure web app to an Azure virtual machine. The thumbnails for my images are no longer working; it shows the message "Error creating thumbnail: Unable to save thumbnail to destination". The permissions for my images folder are set recursively to 0777 (for now). Any idea what might be causing this?
[14:09:36] Did you restart the webserver?
[14:10:07] Not afaik. I'm not really in control, but I can see if I can do that
[14:10:22] Just throwing out ideas
[14:13:37] Restarted, no change :(
[14:14:47] I have been messing around in the image directory.
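The multistream question above has a common pitfall: the first field of each line in the index file is a byte offset into the compressed .bz2 file, and a slice must run from one stream's offset to the next stream's offset (not between two arbitrary index lines), since each stream is a standalone bzip2 stream. A minimal sketch in Python, assuming `start_offset` and `end_offset` are two such offsets taken from the index:

```python
import bz2

def extract_stream(dump_path, start_offset, end_offset):
    """Read the bytes between two index offsets from a multistream dump
    and decompress them. Each stream in a multistream dump is a complete,
    standalone bzip2 stream, so the slice decompresses on its own."""
    with open(dump_path, "rb") as f:
        f.seek(start_offset)
        chunk = f.read(end_offset - start_offset)
    return bz2.decompress(chunk).decode("utf-8")
```

The equivalent dd invocation would be `dd if=dump.bz2 bs=1 skip=START count=LENGTH | bzip2 -dc`; if bzip2 reports the data is not a bzip2 file, the slice almost certainly did not begin exactly at a stream offset.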
Are there some folders that have to be there? I deleted everything except the images (disabled hashing, so all in one pile). Then I recreated the temp and archive folders (and then set 0777 on everything)
[14:15:44] Could it have something to do with the owner? I'm listed as the owner of the files, but I'm not www-root
[14:25:09] or www-data or whatever
[14:25:52] Should www-data be the owner of everything in the mediawiki folder?
[14:34:27] Lol, I was missing the leading / in my path to temp... can't believe I spent hours on this
[14:46:27] Hi everyone, I'm a working student and new to MediaWiki. Maybe someone can help me; at least I can ask :) I want to make a top 10 of wiki editors and a top 10 of wiki article creators. I have found the MediaWiki page "How to become a MediaWiki hacker", and I already installed MediaWiki 1.34 on my notebook. Now I need to create an extension which listens to hooks/events (like ArticlePageEditedAndSaved), gets the information about the user from that hook/event, and stores it in the database. Is that the right direction to go?
[16:03:32] Damn, I got disconnected. Here is my question again:
[16:03:35] Hi everyone, I'm a working student and new to MediaWiki. Maybe someone can help me; at least I can ask :) I want to make a top 10 of wiki editors and a top 10 of wiki article creators. I have found the MediaWiki page "How to become a MediaWiki hacker", and I already installed MediaWiki 1.34 on my notebook. Now I need to create an extension which listens to hooks/events (like ArticlePageEditedAndSaved), gets the information about the user from that hook/event, and stores it in the database. Is that the right direction to go?
[16:12:47] That's one way to do it, yes.
It won't be able to capture historical data though
[16:13:33] The advantage of doing it that way is that the lists will be much faster to update, at the expense of extra storage space for the new table
[16:13:34] Also note that some code for this might already exist out there, e.g. the code providing the data for https://en.wikipedia.org/wiki/Wikipedia:List_of_Wikipedians_by_number_of_edits
[16:14:03] You can do it without an extension by using the MediaWiki API or by directly querying the revision table and running summations
[16:15:20] Via the extension route, it can be exposed via a special page you create or via wiki markup that can be inserted into a page. Via the API route, you'd need a bot account to update an on-wiki page, or to display the data elsewhere
[16:16:33] Hi Skizzerz. First, it doesn't need to capture historical data. Second, puuh, I don't understand the rest :(
[16:16:40] Do you have some links?
[16:16:59] The link andre__ provided is achieved via the second method I listed (no extension, just a bot querying the API)
[16:18:04] Ah ok, but I also want to capture some other stuff, so I don't want to be limited to only edits
[16:18:36] If you can view it on a wiki page somewhere (via recentchanges, logs, etc.) then it's available in the API
[16:18:53] I would personally suggest that route first, only falling back to an extension if it really doesn't suit your needs
[16:19:04] Ooh ok
[16:19:17] See https://www.mediawiki.org/wiki/API:Main_page for details on how to use the API
[16:20:00] And you can poke around at https://en.wikipedia.org/wiki/Special:ApiSandbox to see what the responses look like, to help you develop your program
[16:20:30] That
[16:20:36] That's nice
[16:21:44] If you have access to the revision table directly, you could run direct SQL queries against it (so you won't be able to do that on Wikipedia et al, but you can on your own wiki that you control).
This would likely get you faster results than the API, but at the expense of needing to learn how to tell apart things like an edit vs. a page creation
[16:22:26] Ooh ok
[16:23:22] So you mean if I have access to the database
[16:23:24] And the MediaWiki database schema does change (sometimes drastically) between versions, so your SQL would need to be adjusted as the wiki gets upgraded, whereas the API is very stable and rarely needs modification
[16:23:25] Yep
[16:24:08] That access I can request :)
[16:24:31] Puuh, so many new ideas; now I have a lot to read
[16:24:54] Best of luck :)
[16:25:07] (not related to the above) Is wikitext the only content model that can handle rvsection=0 with rvprop=content?
[16:25:40] =D thank you Skizzerz, and a bit more I need
[16:26:53] The SQL thing sounds very tempting, but I have 0 experience
[16:27:12] Hope it's not too hard to learn
[16:29:18] McJill: in core? Yes, only wikitext supports sections. Extensions can add content models with section support
[16:29:40] 👍
[16:29:50] You can see if any given content model supports it by checking if it defines the getSection() function in the Content class
[16:30:06] (if it doesn't, it'll fall back to the base class definition, which is usually "not supported")
[16:32:30] Cool, thanks
[16:33:22] I figure an easy regex like ^Module:|\.(js|css)$ on the page name should approximately cover most cases
[16:33:59] Not sure what you're trying to do, so I can't confirm nor deny that :)
[16:35:32] An API query that looks up the first revision of a page and gets the content of section 0, skipping content and section if not wikitext, but preferably without querying the content model, so testing the page name is probably robust enough
[16:35:38] But anyway, thanks for the confirmation
[16:45:51] Just spitballing, not even sure if it's possible, but could you use a generator to create page names that are only wikitext content model, then feed that to the revisions API?
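The no-extension API route suggested above can be sketched as follows. `list=allusers` with `auprop=editcount` is a real Action API module, but it does not sort by edit count, so the sketch fetches all users (following API continuation) and sorts client-side; the example.org endpoint is a placeholder for your own wiki's api.php:

```python
import json
import urllib.parse
import urllib.request

def top_n(users, n=10):
    """Sort user records by their 'editcount' field, highest first."""
    return sorted(users, key=lambda u: u.get("editcount", 0), reverse=True)[:n]

def fetch_all_users(api_url):
    """Page through list=allusers, collecting the edit count of every user."""
    users = []
    params = {
        "action": "query",
        "list": "allusers",
        "auprop": "editcount",
        "aulimit": "max",
        "format": "json",
    }
    while True:
        url = api_url + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        users.extend(data["query"]["allusers"])
        if "continue" not in data:
            return users
        params.update(data["continue"])

# Usage (replace with your own wiki's api.php endpoint):
# for user in top_n(fetch_all_users("https://example.org/w/api.php")):
#     print(user["name"], user.get("editcount", 0))
```

A "top article creators" list would need a different query (e.g. filtering the new-pages log via `list=logevents`), since allusers only exposes total edit counts.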
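The first-revision/section-0 query discussed above can be sketched as a parameter set for prop=revisions; `rvdir=newer` with `rvlimit=1` selects the oldest revision, and `rvsection=0` only succeeds for content models with section support (wikitext, in core). This is a sketch of one plausible query, not a complete tool:

```python
import urllib.parse

def first_revision_intro_query(title):
    """Build api.php parameters that fetch section 0 of a page's first
    revision."""
    return {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvlimit": "1",
        "rvdir": "newer",            # oldest revision first
        "rvsection": "0",            # lead section only (sectioned models)
        "rvprop": "content|contentmodel",
        "rvslots": "main",
        "format": "json",
    }

# Example request URL against any wiki's api.php endpoint:
url = ("https://www.mediawiki.org/w/api.php?"
       + urllib.parse.urlencode(first_revision_intro_query("MediaWiki")))
```

Requesting `contentmodel` alongside the content lets the caller skip non-wikitext pages without a separate page-name heuristic.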
[16:49:48] Eh, it's a one-off each time it's run; honestly, I could use wgPageContentModel for most cases, but occasionally there's a need for something other than the current page.
[17:34:26] Hello, World! rvexcludeuser (API) is not working when the user is anon?
[17:34:35] https://www.mediawiki.org/w/api.php?action=query&prop=revisions&titles=Extension:MultimediaViewer&rvprop=user&rvexcludeuser=27.56.243.177
[20:13:51] I don't know if this is the right channel, but I want to report a vandal: https://www.mediawiki.org/wiki/Special:Contributions/82.223.14.46
[20:15:58] Urbanecm: ^ per "MediaWiki admin"
[20:16:11] Sk4mp: fyi, I tried to match https://www.mediawiki.org/wiki/Special:ActiveUsers?username=&groups%5B%5D=sysop&wpFormIdentifier=specialactiveusers to IRC nicknames
[20:16:30] If you see other names in both lists, feel free to ping them
[20:17:06] It's not like a massive attack, so I don't want to ping and distract anyone :)
[20:17:26] Ok, that's nice
[20:36:44] mutante: {{done}}
[20:37:05] p858snake: thank you
[20:37:07] Sk4mp: ^
[20:37:18] Someone else will need to clean up
[20:37:35] That's a translatable page, and I can never get that to work
[20:39:46] I also gave you the powers on mwwiki, mutante
[20:39:52] Go forth, archive greatness, etc etc
[20:56:06] p858snake: oh, that was not expected. Thanks though!
[21:02:43] I like to empower power for future situations, or something
[21:08:18] I like to empower power for future situations, or something >.>
[21:52:41] Anyone?
[21:52:51] Hi
[21:53:55] SamiWey, at your Special:Version: 1. what is your MediaWiki version? 2. is the 'Nuke' extension listed?
[21:57:23] My MediaWiki version is 1.34.0
[21:58:11] SamiWey: good
[21:58:17] And Nuke version 1.34
[21:58:34] SamiWey: ok. And it is still not working today?
[22:01:28] https://dpaste.org/R7cu
[22:01:46] The same error appears, bro
[22:09:34] Bro, I lost connection
[22:10:06] What can I do?
[22:10:44] SamiWey: can you edit pages? Does that work?
[22:12:42] I can't save pages
[22:14:43] SamiWey: ok
[22:15:09] SamiWey: did you update your wiki recently, or is it a clean install?
[22:17:19] I've been updating for a year in a row
[22:18:05] Via Softaculous, PHP 7.4
[22:20:32] SamiWey: ok, one sec
[22:21:43] SamiWey: did you run "php maintenance/update.php"?
[22:23:17] I have no idea how to run php maintenance
[22:24:03] SamiWey: do you have shell access?
[22:24:22] SamiWey: if not, you can run the update through the web browser
[22:24:29] SamiWey: for example, if your wiki is at http://example.org/w/index.php, then navigate to http://example.org/w/mw-config/
[22:26:17] Yes, I have shell access
[22:26:38] What is the code?
[22:27:41] Terminal is the name
[22:28:46] SamiWey: do you speak Spanish?
[22:28:54] Yes
[22:29:11] Okay. If there is an awake Spanish speaker, I would appreciate assistance :)
[22:29:19] I don't speak English
[22:29:22] SamiWey: did you use a terminal before?
[22:30:41] Never
[22:40:25] SamiWey: do you know what your wiki installation directory is?
[22:40:43] public_html
[22:40:54] SamiWey: type "cd public_html" in the terminal
[22:41:12] SamiWey: then type "pwd" and tell me what it says
[22:44:40] SamiWey: hello?
[22:45:08] Hello
[22:45:12] Wait
[22:45:42] SamiWey: do you have a laptop?
[22:45:52] Yes
[22:46:11] SamiWey: are you using this live chat from your laptop?
[22:46:23] From Android
[22:46:46] I have TeamViewer
[22:47:04] SamiWey: can you please open the live chat on your laptop?
[22:47:05] On the laptop
[22:47:13] Ok
[22:47:56] SamiWey: thanks - this will be a lot easier
[22:48:51] SamiWey: please use your laptop only; please do not use a mobile for these tasks
[22:49:30] Hello
[22:49:47] Hi SamiWey46
[22:49:53] Type "cd public_html" in the terminal
[22:49:56] Then type "pwd" and tell me what it says
[22:52:42] The directory
[22:54:27] All good
[22:55:14] SamiWey46: what does "pwd" say?
[22:57:04] The directory is /home/user/public_html
[22:57:22] Good
[22:57:38] Now type "php maintenance/update.php"
[23:00:35] el_id 0 - 3613 of 3613 Done. 0 rows updated, 0 deleted. Purging caches... done. Done in 0.8 s.
[23:02:55] You are a genius, I am going to cry!!
[23:06:45] SamiWey46: did it start working now?
[23:14:27] Yes
[23:15:46] ok :)