[00:52:49] What would the mediawiki equivalent to mysql_fetch_assoc be? No documentation anywhere describes anything beyond reading/writing to the database. [00:54:19] https://dpaste.de/OiBX < Something like that [00:57:03] I'm trying to dump all of the results from a query; no luck. [00:59:44] LuckDuck: $dbr = wfGetDb(DB_SLAVE); $result = $dbr->select(…); var_dump($result) [01:01:49] MatmaRex: That dumps loads of random text - Include my SQL password! and username! [01:02:25] *ignore the random !* [01:02:42] LuckDuck: well, the result object contains more information than just the query result [01:02:55] foreach($result as $row) { var_dump($row); } [01:03:01] this might be closer to what you want i guess [01:05:30] That did the trick. I guess if you decide to do a straight up dump then it prints EVERYTHING related to getting that data? Ouch. [01:06:19] yeah [01:24:07] is there anyway to shorten file names or remove file sizes shown on the default category galleries, i.e., http://commons.wikimedia.org/wiki/Category:Invertebrates? [01:25:20] I'm getting errors with PDFs and MW 1.23 [01:33:09] ah, apparently, it's showBytes in http://www.mediawiki.org/wiki/Manual:$wgGalleryOptions [01:36:51] beautiful. [01:36:53] thanks! ;) [02:37:21] I've still been attempting away, no suck luck. Mediawiki's not accepting mysql_fetch_array & friends. Is there not any solution? [02:37:30] *such ... [06:56:48] Hi, I have a mainly English wiki, a few people would like to translate it into French and German. What is the easiest way to enable translations? Second question, should the translated pages be called e.g. Fr/Color Management or Color Management/Fr or does it not matter? [06:57:30] To be specific, i dont care about the language of the interface, i just want to give people access to the translated pages. [06:59:18] !e Translate [06:59:18] https://www.mediawiki.org/wiki/Extension:Translate [06:59:25] DrSlony: try using ^ [06:59:43] the extension uses "Page name/[langcode]" so Page name/fr [07:01:34] also wanted to mention i have no shell access [07:06:33] DrSlony: hm, do you have FTP access? [07:07:18] worst case, you can just create pages named "Page name/fr" and have users just manually translate them [07:38:38] Can anyone help with $wgHooks['PageContentSaveComplete'] hook [07:38:42] It does not working [07:38:54] Here is code for reference http://pastebin.com/xZk8Bmi7 [07:53:29] Ashish, after saving, you get redirected so doing anything UI related from this hook makes no sense [07:54:32] MaxSem, How then I can make it work ? [07:55:06] use some other hook or pass some URL parameter on redirection [07:59:19] MaxSem, I think this could help "ArticleUpdateBeforeRedirect" ? [07:59:40] probably [07:59:59] sorry, I need to go to bed soon [08:00:54] Okay Sure...thanks for help [09:28:17] Hi! [09:28:58] I am using MobileFrontend extension with the extension that I am working on : BookManagerv2 [09:29:27] I want to enable js and css modules for my extension [09:30:29] I enabled those modules using EnableMobileModules hook. [09:31:06] I am using jquery.ui.autocomplete as one of the dependencies [09:31:34] and I get this error: Uncaught Error: Unknown dependency: jquery.ui.autocomplete [09:32:02] Any help? 
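A minimal PHP sketch of the wfGetDB()/select() pattern MatmaRex suggests to LuckDuck above; the table and column names ('blog_posts', 'post_id', 'post_title', 'post_hidden') are placeholders for whatever schema the caller actually uses.

    $dbr = wfGetDB( DB_SLAVE );                // read-only connection
    $res = $dbr->select(
        'blog_posts',                          // table name, without the $wgDBprefix
        array( 'post_id', 'post_title' ),      // columns to fetch
        array( 'post_hidden' => 0 ),           // WHERE conditions
        __METHOD__
    );

    // Each $row is a plain object, so this loop is the rough equivalent of
    // while ( $row = mysql_fetch_assoc( $result ) ):
    foreach ( $res as $row ) {
        var_dump( $row->post_id, $row->post_title );
    }

Calling $dbr->fetchRow( $res ) instead returns each row as an array, the closest analogue to mysql_fetch_array(); dumping $res itself dumps the whole result wrapper, database connection details included, which is the wall of text (credentials and all) that LuckDuck ran into above.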
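For the gallery question answered just above, the LocalSettings.php line would simply be the showBytes switch from Manual:$wgGalleryOptions:

    $wgGalleryOptions['showBytes'] = false;   // hide the "(NN KB)" size line under gallery thumbnails

If memory serves, the same array also has a 'captionLength' entry that controls how much of the file name is shown, which would cover the other half of the question.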
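For Ashish's PageContentSaveComplete problem further up, a minimal sketch of how the hook is normally wired, with a made-up handler class and log group. As MaxSem points out, the browser is redirected right after the save, so the handler should do non-UI work (log, queue a job, add a URL parameter to the redirect) rather than try to print anything.

    $wgHooks['PageContentSaveComplete'][] = 'MyExtensionHooks::onPageContentSaveComplete';

    class MyExtensionHooks {
        // Signature as of MW 1.21+; unused parameters kept for completeness.
        public static function onPageContentSaveComplete(
            $wikiPage, $user, $content, $summary, $isMinor,
            $isWatch, $section, $flags, $revision, $status, $baseRevId
        ) {
            wfDebugLog( 'myextension', 'Saved: ' . $wikiPage->getTitle()->getPrefixedText() );
            return true;   // let other handlers run
        }
    }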
[10:02:13] Djain: in the RL module definition, it has to have 'targets' => array( 'desktop', 'mobile' ) [10:02:24] but they don't want jquery.ui loaded on mobile because its big [10:02:54] you might want to talk to someone in #wikimedia-mobile about it, they're pretty helpful [10:03:30] ok thanks [11:18:46] a question about the new magic work {{!}} (replacement for template. Why didn't they just create the magic word {{|}}? [11:19:22] seems weird way to hack to overcome a hack [11:19:35] s/work/word/ [11:22:30] sDrewthedoff: because | isn't a valid page title character, so it's not a valid magic word thing either. [11:25:00] I will just say 'okay', obviously some arcane knowledge there that escapes me [11:25:03] thx [12:19:36] question [12:20:12] Is there an extension which lets me use counters ? [12:20:33] I.E template/magic word like entites which automatically increment when used many times... [12:20:45] The thought was to have a syntax like {{ [12:22:14] *like {{#start:var}} ... {{#i:var}}..{{#i:var}}..{{#end:var}} ? [12:23:26] This would assist some tasks at Wikisource. [12:24:47] Qcoder00: http://www.mediawiki.org/wiki/Extension:Counter [12:26:26] That looks broadly like what I was lokign for thanks [16:06:07] Hi [16:07:22] hi BartlomiejB [16:07:32] welcome - how is your MediaWiki experience going? :) [16:26:10] anomie: Hi, did you have a look at the new patch, is this what is expected? [16:26:14] anomie: https://gerrit.wikimedia.org/r/#/c/143025/ [16:27:10] kunalg: Haven't had a chance to review it, but I see you rearranged the patches which is good. [16:27:25] anomie: Ok [17:27:04] fhocutt: a combination of weird wifi and slow things and headphones mean maybe we should continue here [17:27:07] I'm sorry [17:27:14] ah, ok, that works [17:28:49] How can I automatically create an article, when a user registers on my wiki? sorry for my bad english... [17:35:16] sorry fhocutt - ok, on to https://www.mediawiki.org/wiki/API:Client_code/Evaluations ! [17:37:52] fhocutt: so the simplemediawiki I think has not changed too much but I see you took my suggestions and the wording is a little kinder ;-) [17:38:00] yes :) [17:38:16] fhocutt: now looking at wikitools and mwclient and Pywikibot [17:38:35] as I look, fhocutt, I wonder - do you have a secret favorite? although I guess if you say it here, it's not a secret [17:38:55] "i love all my children equally" [17:39:28] hah! My secret favorite, which does not meet the standard, is wikitools, although the fact that it is GPL3 may cause license compatibility problems [17:39:43] mostly because the code is beautifully readable and it uses API calls very efficiently [17:39:43] wikitools is packaged for Fedora and not Debian! will wonders never cease. (more clearly: I find that unusual, but maybe I should not) [17:40:20] I would bet that it has to do with the maintainer's dev environment :) [17:40:25] I agree :) [17:40:34] fhocutt: the "Well documented" section in https://www.mediawiki.org/wiki/API:Client_code/Evaluations/wikitools seems to have some inconsistencies in the bullets? [17:40:44] oh no, now I get it [17:41:02] maybe if you use : to indent the "answers"/"grades" for each bullet I would understand more clearly [17:41:58] you can use multiple colons :::: to indent more and more [17:42:06] hm, what do you think now? [17:43:42] fhocutt: that feels easier for me to read, thanks! you may wish to make this change to all your sub-bulleted lists ..... did you ever use the "definition list" tag in HTML? 
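To make the 'targets' answer at the top of this exchange concrete, here is a hypothetical ResourceLoader module definition for the BookManagerv2 work being discussed; the module name, file paths and dependency list are illustrative only, not the extension's actual registration.

    $wgResourceModules['ext.BookManagerv2.mobile'] = array(
        'scripts'       => array( 'js/ext.BookManagerv2.js' ),
        'styles'        => array( 'css/ext.BookManagerv2.css' ),
        'dependencies'  => array( 'mediawiki.api' ),
        // Without 'mobile' in here, MobileFrontend will not serve the module at all.
        'targets'       => array( 'desktop', 'mobile' ),
        'localBasePath' => __DIR__,
        'remoteExtPath' => 'BookManagerv2',
    );

Declaring the target does not help with jquery.ui.autocomplete itself: that dependency is desktop-only (jquery.ui being considered too heavy for mobile), which is exactly what produces the "Unknown dependency" error above, so a mobile-targeted module needs a lighter autocomplete or none.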
(wow I am dating myself) [17:44:06] no, I don't actually have much html experience [17:44:25] https://developer.mozilla.org/en-US/docs/Web/HTML/Element/dl [17:44:44] basically it was good for pages that were, like, glossaries [17:44:46] ah, I see [17:45:07] I am not gonna make you yakshave by trying to figure out if there's a similar thing in wikitext, but I may do so at some point [17:45:48] fhocutt: I know you are still drafting, but I will mention that I personally like consistent use of markup to mark things like Wiki.setUserAgent as code [17:46:01] ty! [17:46:06] * sumanah is still on https://www.mediawiki.org/wiki/API:Client_code/Evaluations/wikitools [17:46:56] ok! overall I like https://www.mediawiki.org/wiki/API:Client_code/Evaluations/wikitools . Now on to /mwclient [17:48:52] "The documentation is not complete; the section on Generators, in particular, would be very helpful for a less experienced Python developer who is trying to understand the underlying structure of the library." fhocutt I find this confusing - is there a missing "not"? [17:49:29] sumanah: there are confusing gaps in the documentation [17:49:39] oh [17:50:19] fhocutt: you get why I was confused there? [17:50:32] I think so, will rework that [17:50:40] * sumanah could also give tiny Sumana-would-say-it-differently prose style suggestions but only when requested [17:50:50] they do not impair transmission of meaning or courtesy [17:51:53] your suggestions do not, you mean? [17:52:58] I mean the sentences I would talk about are currently fine - they convey info fine and they are courteous [17:53:02] sorry [17:53:12] "It is confusing that the user calls page.edit() to retrieve the text of a page, and page.save(...) to edit the text of a page. Consider deprecating/renaming these functions for clarity." - OMG WOW thank you for saying that. I had not known this. What a pain! [17:53:37] I know! Bad UX there. [17:54:03] that's the sort of thing you have to note so strongly in the docs that you really should save people the cognitive resources [17:54:04] fhocutt: the more I read these the more I appreciate the summary you give of what is good about each child, I mean, library [17:54:17] yeah? :) [17:54:18] "monday's child is fair of ux" [17:54:29] * sumanah is in a super silly mood [17:55:05] "Tuesday's child is clear and pep8y" [17:55:56] so fhocutt I figure you have the pywikibot TODOs up top for a reason (in that it's inconsistent with the other evals, where the TODOs are at the end) - I am curious [17:56:06] oh, actually I don't [17:56:18] thank you for pointing that out, the reason is that I was Done With This [17:56:27] hahahaha [17:56:42] * sumanah attempts to model Assume Good Faith ;-) [17:57:04] changed [17:57:12] * fhocutt grins [17:58:35] back in a moment [18:03:37] sorry fhocutt - was moving around [18:03:39] back [18:04:17] hi sumanah [18:04:51] anomie, does it work for you to push back looking at Perl libraries until sumanah and I are done here? [18:04:52] fhocutt: I like your pywiki suggestions and I think it's good [18:04:59] I think I personally am done! [18:05:05] ok, cool! [18:05:15] hi anomie and hope it's cooler where you are than where I am. [18:05:20] fhocutt: Sure [18:05:41] also anomie we should totally schedule a week for me to come visit you and cowork, if that's of interest to you ..... we do not have to do it now [18:05:45] thanks for looking those over. Could you also take a look at the template email I drafted to the library maintainers? 
[18:05:50] sumanah: 93°F according to my weather widget [18:05:58] aieeee! [18:06:06] anomie: as in, we don't have to schedule it now, and it certainly won't happen now [18:06:14] fhocutt: will do right now! [18:06:20] ty! [18:07:04] sumanah: Interesting suggestion, although we'd have to figure out logistics. [18:07:06] fhocutt: one more thing to add to that email: thank them for their work [18:07:08] anomie: certainly [18:07:21] right! [18:08:24] fhocutt: other than that, ""I have prepared a "gold standard"[1] to be used to [18:08:25] find and advertise" could be wordsmithed [18:08:58] mm, ok [18:10:06] fhocutt: because the standard is not actually what you will use to advertise...... youknowwhatimean [18:10:09] I have installed mediawiki and am attempting to connect to it using localhost/mediawiki/ I am receiving a 403 Forbidden error. Does anyone have any idea of what might be causing this? Any troubleshooting steps i should follow? [18:10:17] !debug | ehazen [18:10:17] ehazen: For information on debugging (including viewing errors), see http://www.mediawiki.org/wiki/Manual:How_to_debug . A list of related configuration variables is at https://www.mediawiki.org/wiki/Manual:Configuration_settings#Debug.2Flogging [18:10:24] those tips may help ehazen [18:10:35] thanks i'll check it out! [18:10:46] sumanah, gotcha [18:11:02] anything else? [18:11:04] fhocutt: and my last note: "the issue tracker" - linking or specifying (GitHub or whatever) will be good [18:11:06] that's it from me [18:11:25] great, thanks! [18:12:25] today I will make those changes, check in with Merlijn on these, and start looking at the Perl libraries [18:12:52] cooool [18:12:54] see ya! [18:12:57] later! [18:14:28] anomie, does looking at https://github.com/gitpan/MediaWiki-API and/or https://github.com/MediaWiki-Bot/MediaWiki-Bot in ~10 min or so work for you? [18:14:35] fhocutt: ok [18:15:28] anomie: thanks! [18:24:51] anomie: got some time now? [18:24:57] fhocutt: yes [18:28:23] legoktm yes, i have ftp, thank you for helping [18:34:33] looking at the codebase itself now [18:34:55] what's going on with the tests here? https://github.com/gitpan/MediaWiki-API/tree/master/t [18:36:31] fhocutt: Looks like the only tests there are checking whether everything is documented [18:36:42] ok [18:36:59] useful, if not for the code itself [18:38:24] looks like pretty much everything is in API.pm: https://github.com/gitpan/MediaWiki-API/tree/master/t [18:38:30] not that one [18:38:34] https://github.com/gitpan/MediaWiki-API/blob/master/lib/MediaWiki/API.pm [18:41:03] fhocutt: Yeah. MediaWiki-API looks like a relatively low-level interface (but still handling continuation and tokens at least), then MediaWiki-Bot has a bunch of higher-level functions. [18:41:22] yeah. [18:41:50] do they look like good Perl, at a first glance? [18:42:05] Well, continuation for list modules; it doesn't seem like it has support for continuing prop modules [18:43:42] Seems decent; I'm not really familiar with coding styles for Perl, really. Rather than various _hashref parameters I might want to see actual hashes, but that's not really a big issue. [18:44:11] what's the difference between the two? [18:44:43] Whether you call it as "foo( { Key1 => Value1, Key2 => Value2 } )" or "foo( Key1 => Value1, Key2 => Value2 )" [18:45:39] hm, ok [18:47:03] huh, this isn't actually that long, it just has lots and lots of comments [18:47:12] is that usual for Perl modules?
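Regarding ehazen's 403 a little further up: the standard debug switches from Manual:How_to_debug look roughly like the LocalSettings.php lines below. A 403 Forbidden is usually issued by the web server itself (directory permissions, access rules) before MediaWiki ever runs, so these settings only help once requests actually reach PHP.

    error_reporting( -1 );
    ini_set( 'display_errors', 1 );
    $wgShowExceptionDetails = true;
    $wgShowSQLErrors        = true;
    $wgDebugLogFile         = '/tmp/mw-debug.log';   // any path the web server can write to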
[18:47:49] If they use POD documentation it can be [18:49:30] ok, so it handles tokens, nice [18:52:09] could you help me unpack what's going on in list? [18:52:20] https://github.com/gitpan/MediaWiki-API/blob/master/lib/MediaWiki/API.pm#L556 [18:53:39] does it just make an API call with a given query, then see if there are continuations to be had? I'm not sure what hook is doing [18:54:32] looks like it's something that gets passed in to options? [18:54:45] It looks like it has the option for the caller to provide a callback function that gets called for each query, instead of accumulating the whole result set in an array and returning it. [18:55:16] could you expand on that? [18:55:34] i.e. you'd set 'hook' to some function foo(), and it calls "foo( $results )" for each batch fetched from the API. [18:56:19] If you don't pass a hook function, it just merges the results for all the batches into one array and returns that at the end. [18:56:40] so it's for if you want to process the results in small batches instead of having a giant array to deal with? [18:57:02] is that mostly a performance helper? [18:57:14] Yeah, looks like it [18:58:11] useful, ok [18:58:47] that's what I have for now on this one, anomie [18:58:52] thanks for your help. [18:58:56] You're welcome [18:59:30] is it ok if I email or ping you here when I run into confusing bits while I'm doing the Perl evaluations? [19:00:59] Yes, feel free [19:01:31] tyvm! [19:33:40] Is anyone familiar with the IRCColourfulRDFeedFormatter class? [19:51:03] !ask | Rosencra_ [19:51:03] Rosencra_: Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply. Don't ask for help or for attention before actually asking your question, that's just a waste of time – both yours and everybody else's. :) [19:52:23] Well, in the getLine function, what are $feed and $actionComment for… I looked around at the code, and I'm not quite sure myself. [19:52:37] Other than $feed looks like it does something with interwiki prefixes [19:53:22] IRCColourfulRDFeedFormatter doesn't exist ;) [19:53:33] IRCColourfulRCFeedFormatter [19:53:43] yeah yeah yeah, fat finger ;) [19:54:05] use copy-paste :) [19:54:20] Rosencra_: the documentation says " @see RCFeedFormatter::getLine", have you tried looking there? [19:54:24] !class IRCColourfulRCFeedFormatter [19:54:24] See https://doc.wikimedia.org/mediawiki-core/master/php/html/classIRCColourfulRCFeedFormatter.html [19:54:27] !class RCFeedFormatter [19:54:27] See https://doc.wikimedia.org/mediawiki-core/master/php/html/classRCFeedFormatter.html [19:54:47] https://doc.wikimedia.org/mediawiki-core/master/php/html/interfaceRCFeedFormatter.html * [19:55:18] It's various options [19:55:36] see RecentChanges.php line 389 onwards [19:56:02] Just the IRC formatter doesn't use most of them [19:56:16] https://github.com/wikimedia/mediawiki-core/blob/master/includes/changes/RecentChange.php#L389 [19:56:17] is wm-bot incognito? :P [19:56:56] And for the documentation https://github.com/wikimedia/mediawiki-core/blob/master/includes/DefaultSettings.php#L5600 [19:58:45] Ahhh ha! [19:59:26] @seen petan [19:59:26] MatmaRex: Last time I saw petan they were quitting the network with reason: *.net *.split N/A at 7/1/2014 4:22:14 AM (1d15h37m12s ago) [19:59:32] Thanks! [20:31:42] * As of MediaWiki 1.22, the only supported 'engine' parameter option in core [20:31:42] * is 'UDPRCFeedEngine', which is used to send recent changes over UDP to the [20:31:42] * specified server. 
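A hedged LocalSettings.php sketch of the $wgRCFeeds setting whose documentation is being quoted above; the feed name, URI and options are placeholders rather than a recommended setup.

    $wgRCFeeds['irc'] = array(
        'engine'               => 'UDPRCFeedEngine',               // core engine being discussed
        'uri'                  => 'udp://127.0.0.1:9390',           // where the UDP packets go
        'formatter'            => 'IRCColourfulRCFeedFormatter',
        'add_interwiki_prefix' => false,
        'omit_bots'            => true,
    );

The formatter class then turns each RecentChange row into one line of the feed, which is where getLine() and options such as add_interwiki_prefix come in.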
[20:31:44] sigh, bug 1 [20:32:13] ? [20:32:32] 1.22 has RedisPubSubFeedEngine [20:35:11] I blame https://gerrit.wikimedia.org/r/#/c/80958/ [20:59:42] sumanah: valhallasw has suggested separating out the TODOs into what are effectively "technical" and "social" [20:59:56] fhocutt: Hmmm! [21:00:06] so stuff that requires code changes vs. stuff like improving/deciding the level of maintenance, fostering a hospitable community [21:00:19] Hmmmmm [21:00:45] is he particularly optimistic about the chances of any of them happening? ;-) [21:00:47] I like it, but I'm concerned that the nontechnical suggestions will be automatically discounted. any suggestions on phrasings to avoid that? [21:00:58] hah! I am not sure about that. [21:01:00] is his reasoning that "this makes them easier to understand" or "easier to advocate" or what? [21:01:08] easier to understand in the list [21:01:15] in that like is grouped more with like [21:01:19] right [21:02:13] * sumanah is thinking [21:06:03] fhocutt: I say it's worth going, and that it is also worth asking Merlijn to specifically back you up onlist and say "yes we should be doing these things including (doc/communication/friendliness/etc)" [21:06:23] I mean, it's worth doing - it will make the list easier to understand if like is grouped iwth like [21:06:30] right, ok [21:06:44] but yes, you are right about the risk [21:07:36] I will ask all of the mentors, for when I'm emailing results to mediawiki-api-l etc. [21:08:28] all right [21:11:21] sumanah: any suggestions on titles for the social todos? [21:11:31] fhocutt: "process" [21:12:47] yess thank you [21:14:16] where do docs go? Process? [21:14:21] yeah [21:14:25] k [21:14:28] imo [21:14:32] that will beef it up, too [21:14:38] yuppppp [21:15:00] does this even count as deviousness [21:15:21] it's all ux design [21:15:23] everything [21:17:16] everything is, seriously [21:20:39] cscott: bounding boxes for default-sized thumbnails - the conversation I saw stopped Jun 2. should it restart on wikitech-l ? [21:21:09] i'm rather busy with other projects at the moment. can we defer it for a month or so? [21:21:13] I may be super out of touch and it may be obsolete, apologies [21:21:19] cscott: up to you! [21:23:54] cscott: I updated the thread and I'll check in again with you in a month. Best wishes re your other projects! [21:24:05] thanks [21:51:41] if I have an image, say Test.png, is there a way I can link directly to the image? like http://example.org/RawFile:Test.png [21:51:49] and that will return the image? [21:51:58] Seranok: [[Media:Test.png]] [21:52:11] If you need it in a url, User Special:FilePath [21:52:16] s/User/Use [21:53:24] hiya [21:53:35] what's the minimal linux distro i can run mediawiki on? [21:55:12] What do you mean? [21:56:34] I want to make a VM to run mediawiki in [21:56:43] and I'd like to make it as small as possible [21:57:42] in disk space? [21:57:58] yeah [21:58:38] pick a distro with a decent enough package manager [21:58:41] do a minimal install [21:58:57] then just install only the packages you need [21:59:46] ok thanks :) [22:00:18] bawolff: so like example.org/Special:Test.png? 
[22:00:30] not like that [22:01:03] like https://commons.wikimedia.org/wiki/Special:FilePath/Example.png [22:01:18] But if you're just writing in wikitext use media namespace [22:01:49] And if you're writing a script, you can figure out what the url for a file is by taking the first two digits of the md5 sum of the file name [22:02:02] https://commons.wikimedia.org/wiki/Special:redirect/file/Example.png also works [22:02:12] how do you know this [22:03:31] Seranok: you mean how do you know the md5 sum? [22:03:39] i don't want to do an md5 sum lol [22:03:50] i mean how do you know this stuff [22:03:59] i could not find it on google [22:04:42] * bawolff has spent a lot of time with MediaWiki [22:05:25] Seranok: This is the developer irc channel for mediawiki, so many of the people here are the people who make mediawiki, and we know lots of things [22:05:51] And we're also bad at documenting things, so they sometimes don't show up on google ;) [22:22:44] bawolff: want to squash a quick bug? [22:23:00] When people say that, it's never quick :P [22:23:05] But which bug? [22:23:10] bawolff: yes it is [22:23:18] bawolff: see bug 63326 [22:23:19] hey I think Wikimedia is an OpenID provider https://twitter.com/christi3k/status/484454739371782144 - am I right? [22:24:03] sorry, tweet is a friend asking who the non-Google OpenID providers are [22:24:08] recommended ones anyway [22:24:31] hmm, are we? There was talk about being one, but I don't remember if we actually ever did it [22:24:44] sumanah: no, we're not [22:24:47] ok thanks legoktm [22:24:55] sumanah, see https://bugzilla.wikimedia.org/show_bug.cgi?id=13631 [22:24:58] one day we will be hopefully [22:24:58] I was gonna suggest trying to use it to log into Stack Overflow or something [22:25:26] launchpad is an OpenID provider [22:27:44] Betacommand: Looking into it. Basically I just have to resurrect thedj's patch I think [22:27:58] bawolff: pretty much :P [22:31:50] bawolff: When I say a patch is easy it normally is [22:32:12] oh wow https://launchpad.net/ last blog post in 2012 :( my spouse used to work on that project [22:33:58] thank you SamB! https://help.launchpad.net/YourAccount/OpenID now I know [22:36:15] Too bad we're not an open id provider. That'd be really cool [22:37:02] nod [22:37:05] thx all [22:38:49] I'm assuming the OpenID URL would be basically your userpage on the wiki of your choice? [22:39:20] bawolff: Hi [22:39:45] Hi [22:40:23] SamB: Users can be renamed, so we might not want to do that [22:40:30] bawolff: Needed some suggestion here, https://gerrit.wikimedia.org/r/#/c/137915/ (I knew I will get a -1 there :P) [22:40:41] bawolff: oh, true [22:40:43] I think OpenID extension already does something else involving the user's id [22:41:09] so when I am trying to scrape a page on a MediaWiki site it's returning this HTML page which sets the cookie [22:41:22] how do I disable this cookie redirect page? [22:42:41] kunalg: I'll look after I finish looking at the block patch [22:42:54] Seranok: As a general rule, have you tried using the api instead [22:42:57] bawolff: Sure :) [22:43:46] * bawolff doesn't even want to know how a user named "Apitestuser" got registered on my wiki. Some test must not be using a cloned db...
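A small sketch of the hashed-path trick bawolff mentions above ("first two digits of the md5 sum"): MediaWiki stores an upload under /x/xy/Name.ext, where x and xy are the first one and two hex characters of the md5 of the file name in its database form (first letter capitalised, spaces as underscores). The base URL below is a placeholder for the wiki's upload directory.

    function guessFileUrl( $fileName, $base = 'https://upload.example.org/images' ) {
        $key  = str_replace( ' ', '_', ucfirst( $fileName ) );   // canonical db key form
        $hash = md5( $key );
        return $base . '/' . $hash[0] . '/' . substr( $hash, 0, 2 ) . '/' . rawurlencode( $key );
    }

    echo guessFileUrl( 'Test.png' );

Special:FilePath/Test.png or Special:Redirect/file/Test.png, both mentioned above, avoid the guesswork entirely.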
[22:45:51] bawolff: some test that extends ApiTestCase isn't setting @database [22:48:19] bawolff: i am trying to get the raw contents of a page [22:48:32] from my reading it doesn't seem like the API would help with that [22:49:31] Seranok: prop=revisions&rvprop=text [22:49:34] or something like that [22:56:29] bawolff: even if i use said api, still have the same problem: requires cookies [22:56:56] Default MW does not require cookies for logged out users [22:57:24] Other sites may do other things with extensions [23:03:15] Might as well try here - Would anybody happen to know the answer to this - The question on the top there with two replies?: https://www.mediawiki.org/wiki/Project:Support_desk - Both my partner and I have been0,00 going round and round with that, it's frustrating. [23:03:53] Er, did it just censor out a piece of my post or is that just my client? [23:04:03] TennaFox: Not just you [23:04:17] maybe you accidentally hit the colour button [23:04:26] yeah, colour [23:05:29] TennaFox: The "Mediawiki equivalent to mysql/mysqli fetch_array?" question? [23:09:15] bawolff: Yes. My partner is assisting me in porting over a simple blog applet I wrote to Mediawiki, and it's been a real pain getting that particular function to work. [23:09:41] The full (More or less main) question/explanation is in the 2nd reply. [23:10:36] Ok. I'll respond over there in case anyone else is looking at the question [23:11:37] Thank-you bunches, I'm rarely driven to complete insanity. [23:19:51] TennaFox: Does https://www.mediawiki.org/wiki/Thread:Project:Support_desk/Mediawiki_equivalent_to_mysql/mysqli_fetch_array%3F/reply_(3) help [23:20:26] Let me try that out... [23:21:26] Question - Where does $categories come from? [23:21:40] TennaFox: The previous person's answer makes slightly more sense if you assume that $out is an instance of OutputPage (e.g. like $wgOut or $context->getOutput() ), and $out->addHTML is somewhat close to an echo [23:22:05] I just declared it as a temporary variable before the loop ( $categories = array(); ) [23:22:50] It's just a holder to put all the results in one array [23:31:54] bawolff: It seems to be working - Yet all results appear as NULL when var_dumped. Oddly, it gets the count right - Just not the actual data. [23:33:31] TennaFox: What's the name of the column in the db for categories [23:34:05] Ah, I think I misread your post [23:34:14] At this point in time, it's no longer "category" but "tag". [23:34:40] (I made proper modifications to your example) [23:34:48] I thought it was post_categories for some reason [23:35:22] Anyways, the line $categories[] = $row->post_categories; should be $categories[] = $row->tag; (or whatever the column name is) [23:35:43] Oh, haha. My fault for not spotting that! [23:36:50] And there it is! I'll try to commit that to memory. Thanks to you, we're one step closer to proper news distribution. [23:37:12] Glad to help [23:37:28] Just for reference, MW has a category feature built in [23:38:24] And there's extensions that already exist for making an rss feed based on most recently added thing for a category [23:38:28] In case that's what you are doing [23:40:41] Yeah, but in this case I felt it should operate separately from the way pages do - That might get confusing when you view the category's page. I haven't tried yet, but I'm sure creating an RSS feed wouldn't be too difficult. It's not anything I've messed with yet though.
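A standalone PHP sketch of the api.php route bawolff suggests to Seranok above (prop=revisions, fetching the content of the latest revision); the wiki URL and page title are placeholders, and no cookies are needed for anonymous reads on a default install.

    $url = 'https://example.org/w/api.php?' . http_build_query( array(
        'action' => 'query',
        'titles' => 'Main Page',
        'prop'   => 'revisions',
        'rvprop' => 'content',    // "content", not "text"
        'format' => 'json',
    ) );

    $data = json_decode( file_get_contents( $url ), true );
    foreach ( $data['query']['pages'] as $page ) {
        echo $page['revisions'][0]['*'];   // raw wikitext of the latest revision
    }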
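And, pulling the fix TennaFox arrives at above into one place: a sketch with an invented table name ('blog_posts'); the 'tag' column and the $categories holder come straight from the conversation, and $out->addHTML() plays the role of echo.

    $dbr = wfGetDB( DB_SLAVE );
    $res = $dbr->select( 'blog_posts', 'tag', array(), __METHOD__ );

    $categories = array();                  // declared before the loop, as discussed
    foreach ( $res as $row ) {
        $categories[] = $row->tag;          // NULLs here usually mean the column name is wrong
    }

    $out = RequestContext::getMain()->getOutput();   // an OutputPage, e.g. $wgOut
    $out->addHTML( htmlspecialchars( implode( ', ', $categories ) ) );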