[01:19:58] Hello! If a SPARQL endpoint gets me DBpedia infobox content, then where do I go to get media files hosted on Commons?
[01:21:25] I realize this is fishing in the wrong pond, but I'm still hoping for a nibble of direction. :)
[01:27:30] do you know the file names?
[01:30:21] Ricalsin: if you know the filename, replace spaces with underscores, then take the MD5 hash of the name. The file will be at https://upload.wikimedia.org/wikipedia/commons/a/ab/File_name.jpg (where "a" is the first character of the hex-encoded MD5 hash, and "ab" are the first two characters of the same)
[01:33:17] alternatively, https://commons.wikimedia.org/wiki/Special:Redirect/file/File_name.jpg will issue an HTTP redirect to that same location (and has less stringent formatting requirements for the file name)
[01:56:47] !worst
[01:56:48] Generally, the worst that can happen is that someone compromises your site, steals your data, deletes it and replaces it with kiddy porn. Then they start a spam service on your server advertising the kiddy porn to twenty million email addresses, daily. Then your home gets raided, you get fired and your wife leaves you. Then you get killed in your sleep by ninjas.
[02:01:38] has anyone had a problem with Google login letting people create users with invalid usernames?
[02:02:08] for some reason it's letting them create users with lowercase first letters, which in turn means they're unbannable unless manually renamed in the database
[03:51:04] Skizzerz: That is wonderful info! How come I could not find that in the countless docs I've been reading on SPARQL and DBpedia?
[03:51:36] Skizzerz: Thank you!
[05:43:11] Just loaded up the KVIrc chat client on an Ubuntu box. Seems okay. Kinda new to the server/channel settings. Sorry for the off-on-in-out of late. :)
[08:32:16] Hello everyone!
[08:32:29] I'm having some trouble, so maybe someone can help me?
[08:40:48] !ask | kobzar
[08:40:48] kobzar: Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply. Don't ask for help or for attention before actually asking your question, that's just a waste of time – both yours and everybody else's. :)
[08:44:31] OK. I moved my MediaWiki (MW) from one server to another. I moved all the files and created an SQL dump. That part is fine. But my new server only works with the cp1251 codepage. That's not a problem in itself: I created a new SQL dump in the cp1251 codepage and imported it on the new server.
[08:45:26] My new MW is working well at the moment. All article names display correctly. But the article bodies are in the wrong codepage.
[08:46:17] I tried to re-encode my XML backup file from UTF-8 to cp1251, but after that I can't import it into MW; I get an error.
[08:48:12] So, how can I convert my articles from UTF-8 to cp1251? Both servers are working at the moment.
[09:03:26] kobzar: have you seen http://stackoverflow.com/questions/16141416/is-encoding-cp1252-invalid-in-an-xml-file?lq=1 ?
[09:05:47] I'm not sure what encodings import supports; however, I wouldn't be surprised if it's only the encodings it can export as
[09:23:09] I am not sure I understand what I need to do.
[09:48:19] in which table is the article body saved?
[11:48:10] %(
[11:58:15] Hi!
[12:00:07] Can you help me change the headline font in MediaWiki?
[14:03:45] morning
[14:03:57] Niharika: Thanks for your help. I did a vagrant destroy -f && vagrant up, and had to reconfigure virtualbox-dkms for some reason. Got everything working properly.
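The MD5-based path scheme Skizzerz describes at 01:30 is straightforward to script. A minimal Python sketch using only the standard library; note that real Commons file-name normalization involves more rules (first-letter capitalization, for one) than the space-to-underscore step shown here:

    import hashlib
    from urllib.parse import quote

    def commons_url(filename: str) -> str:
        """Build the direct upload.wikimedia.org URL for a Commons file."""
        # Commons canonical form: spaces become underscores.
        name = filename.replace(" ", "_")
        # Hash the UTF-8 name; the first one and first two hex digits
        # pick the two shard directories.
        digest = hashlib.md5(name.encode("utf-8")).hexdigest()
        return ("https://upload.wikimedia.org/wikipedia/commons/"
                f"{digest[0]}/{digest[:2]}/{quote(name)}")

    print(commons_url("Example image.jpg"))

For anything trickier than a plain filename, the Special:Redirect/file URL mentioned at 01:33 sidesteps the normalization question entirely.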
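For kobzar's import error (08:46), one plausible cause, per the Stack Overflow link at 09:03, is that the XML declaration still claims UTF-8 after the bytes were re-encoded. A hedged Python sketch under that assumption; the file names are placeholders, and whether MediaWiki's importer accepts a cp1251 dump at all is not settled in the discussion above:

    # Re-encode a MediaWiki XML dump from UTF-8 to cp1251 and rewrite the
    # XML declaration so the parser's expectation matches the actual bytes.
    # "dump-utf8.xml" / "dump-cp1251.xml" are hypothetical file names.
    with open("dump-utf8.xml", encoding="utf-8") as src:
        text = src.read()

    text = text.replace('encoding="utf-8"', 'encoding="windows-1251"', 1)

    # errors="replace" substitutes characters that cp1251 cannot represent;
    # drop it to fail loudly on such characters instead.
    with open("dump-cp1251.xml", "w", encoding="cp1251", errors="replace") as dst:
        dst.write(text)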
[14:05:24] I'm facing another problem while submitting my first patch. git review -s gives: Using global/system git-review config files (/etc/git-review/git-review.conf) is deprecated. Using global/system git-review config files (/home/aashaka/.config/git-review/git-review.conf) is deprecated
[14:06:11] morning :)
[14:08:09] aashaka: that's just a warning :)
[14:08:46] just check whether a Change-Id is inserted into the commit message when you save a commit
[14:11:30] that would be after git review -R?
[14:11:57] aashaka: no, when you save a commit message, look at it again using "git log"; it should show a Change-Id at the bottom
[14:13:16] okay. It does show a Change-Id. Thanks.
[14:14:55] good to go! :D
[14:27:58] patch-for-review: https://gerrit.wikimedia.org/r/#/c/267250/. It was a nice experience!
[14:33:55] Hi, does anyone know how to use an SQL query on a wiki (/wiki farm) to see if an edit was made in the last 60 days? RC max age is 180.
[14:42:02] if it goes to 180 days, then why wouldn't you be able to see a change from 60 days ago?
[14:48:14] hi, I am new to MediaWiki. I want to know how MediaWiki works; where should I start reading the source code? I have already installed Vagrant and am able to run my own instance of MediaWiki.
[14:49:52] Niharika: please help me read the source code of MediaWiki; where should I start in the MediaWiki repository?
[14:58:17] RobotsOnDrugs: I can see the edits on Special:RecentChanges, but I'm looking for a database query that I can just run on every wiki's database without loading the website :P The recentchanges table should go back that far; I'm just not sure how to filter on the times, and I don't know the format.
[14:59:52] what about the API?
[15:00:48] Possible, but I don't know much about the API. Our MediaWiki server already uses database queries for several things (a custom cw_wikis table that is used to edit Sitename and close wikis on the farm).
[15:01:14] the manual on the recentchanges table says there's a timestamp :P I'm sure if I looked at the actual timestamps I'd see the format, but I can't connect to it right now :P
[15:01:37] I thought of this query: SELECT * FROM recentchanges ORDER BY rc_timestamp;
[15:01:47] somebody else said "SELECT DATE(rc_timestamp) > DATE_SUB(CURRENT_DATE, INTERVAL 1 MONTH) FROM recentchanges ORDER BY rc_timestamp DESC LIMIT 1"
[15:01:48] ideas?
[15:07:21] PuppyKun: the timestamp format is a string like YYYYMMDDHHMMSS
[15:07:46] e.g. now is 20160129150735
[15:08:39] why not just SELECT MAX(rc_timestamp) FROM recentchanges ?
[15:08:59] that would work :p
[15:09:11] I still have to somehow figure out if the highest timestamp is more than 60 days old
[15:09:45] https://mariadb.com/kb/en/mariadb/datediff/ ? idk if this would work with the timestamp format :p
[15:09:47] probably easier to do in your own code, in whatever language, than in SQL…
[15:10:04] MatmaRex: :P Alrighty X3 Thank you tho. tbh I didn't know about SELECT MAX().
[15:33:40] !hack
[15:33:40] https://www.mediawiki.org/wiki/Do_not_hack_MediaWiki_core
[15:34:49] praveenraj: https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker
[16:31:05] Hi, could someone help me with Composer? When an extension requires another extension through Composer, installing it produces a path like extensions/Example/extensions/Example2/. I'd like it to end up as extensions/Example/ and extensions/Example2/ instead. I'm aware there is an extra Composer plugin for this, but I'm not sure how to set it up. Could I have some help, please?
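MatmaRex's point at 15:09 is that the 60-day check is easier in application code than in SQL. A sketch of that approach in Python, assuming the pymysql driver (any DB-API MySQL client would look the same) and the YYYYMMDDHHMMSS string format noted at 15:07:

    import datetime
    import pymysql  # assumed driver; swap in any DB-API MySQL client

    def wiki_is_stale(conn, days: int = 60) -> bool:
        """True if the wiki's newest recent change is older than `days` days."""
        with conn.cursor() as cur:
            cur.execute("SELECT MAX(rc_timestamp) FROM recentchanges")
            (latest,) = cur.fetchone()
        if latest is None:  # empty recentchanges table
            return True
        if isinstance(latest, bytes):  # rc_timestamp is a binary column
            latest = latest.decode()
        # Because the timestamps are fixed-width YYYYMMDDHHMMSS strings,
        # a plain string comparison against a cutoff in the same format works.
        cutoff = (datetime.datetime.utcnow()
                  - datetime.timedelta(days=days)).strftime("%Y%m%d%H%M%S")
        return latest < cutoff

    # usage, with placeholder credentials:
    # conn = pymysql.connect(host="db", user="wiki", password="...", database="somewiki")
    # print(wiki_is_stale(conn))

The fixed-width timestamp format is what makes the string comparison safe; no DATEDIFF() gymnastics needed.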
[18:34:07] I've got an updated patch for review. It's just git commit --amend and then git review -R, is that about right?
[18:34:51] the commit I'm amending already has a Change-Id
[18:35:23] ah, git commit --amend --no-edit I suppose?
[18:36:32] JosefAssad: https://www.mediawiki.org/wiki/Git/TLDR
[18:39:51] yes, I read that. https://www.mediawiki.org/wiki/Gerrit/Tutorial#Amending_a_change_.28your_own_or_someone_else.27s.29 says git review -R instead
[18:40:00] man pages it is
[18:42:34] you don't need the --no-edit, just git commit --amend is fine
[18:46:28] you'd think so, but I have had a very strange "catatonic emacs" bug all day :) ty still, apergos
[18:47:41] you could alias emacs to emacs -nw maybe
[18:47:53] just to avoid that whole thing with the separate window and so on
[18:47:56] anyways
[18:49:20] while following the https://www.mediawiki.org/wiki/Extension:RDFIO tutorial, I tried svn checkout http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/WikiObjectMode and got: svn: E175013: Access to '/svnroot/mediawiki/trunk/extensions/WikiObjectModel' forbidden.
[18:49:30] I probably ought to see if magit will do Gerrit
[18:49:34] THAT would be useful
[18:50:33] jkale: SVN is shut down, I think, gone for good
[18:51:17] Vulpix: what can I do?
[18:52:55] jkale: instead of those steps, it seems https://www.mediawiki.org/wiki/Extension:Wiki_Object_Model can be downloaded from git
[18:57:00] Vulpix: it only offers a snapshot download. Could you please write the git command line for me?
[18:57:39] git clone https://gerrit.wikimedia.org/r/p/mediawiki/extensions/WikiObjectModel
[18:58:20] Vulpix: thank you :)
[19:14:16] I finally did get cross-wiki Echo notifications working. https://media.giphy.com/media/l0NwIhhV06Hk8GUrC/giphy.gif Unfortunately I had to edit EchoForeignNotifications::getApiEndpoints() to hook into our wiki farm system to return the correct information.
[19:18:50] Trela: Neat!
[19:19:24] Trela: Is that because your farm doesn't use the right interface to specify the right information, Notifications doesn't, or because said interface doesn't exist?
[19:20:21] James_F: We use a Redis back end and we can't quite override the SiteConfiguration class to fully implement a correct interface.
[19:20:55] Back on MW 1.21 and 1.23 we had Echo cross-wiki notifications using a custom Redis back end that pushed all notifications, local and global, into Redis.
[19:21:12] * James_F nods.
[19:21:57] Moving to the newer official code, the only piece we need from that is the wiki configuration. It can be pulled from the master database, but that is in an encrypted format.
[19:23:13] Sorry, my English skills are off today.
[20:06:10] Where/how can I suggest a new hook to add to core?
[20:10:01] Running MW 1.23.8 with Collection (1.7.0) and mwlib as my render server. When I try to download a page as PDF, I get an outdated version of the page. Any ideas on what might cause that?
[20:27:16] G_SabinoMullane: I would submit it as a patch, if you know how to do that.
[20:27:40] Thanks
[20:54:35] why do I need to add "index.php" (http://localhost/mywiki/index.php/Main_Page)? What did I do wrong?
[20:55:00] !shorturl | jkale
[20:55:00] jkale: To create simple URLs (such as the /wiki/PAGENAME style URLs on Wikimedia sites), follow the instructions at or try the new beta tool at . There are instructions for most different webserver setups. If you have problems getting the rewrite rules to work, see !rewriteproblem
[20:56:10] MaxSem: thanks
[22:35:26] I am seeing a wave of new spam on some wikis, e.g. at http://reprap.org/wiki/Special:RecentChanges
[22:35:52] People add phone numbers and words like "phone", "customer support" and so on.
[22:36:34] All of it is meant to trick people into thinking these numbers are the official numbers of companies like Microsoft, Norton, McAfee and a few more.
[22:37:10] Often the page text consists of a few sentences, no links in them, and these sentences are then duplicated via copy and paste until the page is nicely filled.
[22:37:46] What is the way to fight this kind of abuse?
[22:38:53] Joergi, AbuseFilter?
[22:43:04] MaxSem: In this situation, how would you use it? I only know it exists; I have never used it myself.
[22:43:48] create a regexp that catches these edits, use it in a filter
[22:44:54] ah, found a public filter for that: https://www.mediawiki.org/wiki/Special:AbuseFilter/48
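MaxSem's advice at 22:43 amounts to pattern-matching the added text. AbuseFilter rules are written in AbuseFilter's own syntax, so the following is only a plain Python illustration of the kind of regular expression such a filter might encode; the pattern and sample string are invented for the example:

    import re

    # A rough pattern for US-style phone numbers near tech-support bait words.
    # Entirely illustrative; a real rule would be tuned against actual spam
    # edits (the public filter linked at 22:44 is a better starting point).
    SPAM_RE = re.compile(
        r"(?i)(customer\s+support|tech(nical)?\s+support|helpline)"
        r".{0,40}?"                                   # short gap between bait and number
        r"\+?1?[\s.-]?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}"
    )

    added_text = "Call Norton customer support 1-800-555-0199 for instant help"
    if SPAM_RE.search(added_text):
        print("looks like support-number spam")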