[00:06:32] what's the status of the enums? [00:08:24] the whats? [00:08:41] a property that may have only a limited set of possible values [00:09:49] ah [00:09:55] check bugzilla? [00:10:21] searching it for "wikidata enum / enums / enumerations" didn't find anything :) [00:10:25] love bugzilla! [00:11:26] can we switch to something like youtrack??? http://www.jetbrains.com/youtrack/index.jsp [00:12:52] https://www.mediawiki.org/wiki/Grand_plot_to_overthrow_Bugzilla [00:14:43] similarly related https://www.mediawiki.org/wiki/Grand_plot_to_overthrow_Gerrit :) [00:15:57] hear hear [00:16:06] duh: so have you seen anything about enums? [00:16:12] nope [00:16:16] i can search though [00:18:52] hm [00:18:54] cant find anything [00:21:50] exactly. which is very strange - it seems it would be a critical feature [00:22:03] i suspect it was in the design docs somewhere [00:22:08] aude: might know? [00:22:13] maybe its called something else? [00:22:32] restricted set? [00:22:39] i mean, its a programming term really [00:23:49] https://meta.wikimedia.org/wiki/Wikidata/Data_model [00:23:50] enjoy [00:24:32] duh: bleh :) [00:24:37] the next round of reconfirmations should be starting, anyone want to update the page? [00:24:41] sure [00:24:51] can i be the sysop? :) [00:24:54] can you change the sitenotice? [00:24:59] yurik: you dont even have a talk page! [00:25:11] i can change the site notice, sure [00:25:22] * yurik wonders if he needs the extra headache... proly not [00:25:39] duh: Yuriks don't need no stinking talk page! [00:25:48] :) [00:26:16] because otherwise people will start writing me, and I would have to reply, and i won't get any coding done [00:26:39] just looked at the output of the https://wikidata.org/w/api.php?action=wbgetentities&sites=enwiki&titles=Application%20programming%20interface&format=jsonfm [00:26:41] scary stuff [00:27:12] "language" repeated in every subbranch, etc [00:28:04] Moe_Epsilon: i think i did it right, check for me? [00:28:17] * Moe_Epsilon checks [00:29:00] let me fix up the bottom part though [00:29:12] looks good [00:30:12] lol, API has a label "api.php" in the "diq" language [00:31:37] ok done [00:31:41] someone has to close them now [00:31:48] duh: thought the next round started tomorrow [00:31:54] it is tomorrow [00:32:07] isnt it? [00:32:12] (UTC) [00:32:15] yes [00:32:22] it has been for 30 minutes now [00:32:22] :p [00:32:51] but they *end* on the 16th [00:32:56] and the next round starts the 17th [00:33:00] it's only the 16th UTC [00:33:00] :o [00:33:11] oh :o [00:33:13] my bad [00:33:17] * duh will revert [00:33:25] think i got everything [00:33:41] {{fixed}} [00:34:02] heh [00:34:08] i clicked revert on popups at the same time [00:34:14] ahh, well :p the beginning/end dates are a little confusing [00:34:32] time to finish my abusefilter bot [00:34:41] lol [00:35:08] when would it be appropriate to close https://www.wikidata.org/wiki/Wikidata:PC#Enable_AbuseFilter_notifications_to_IRC ? [00:35:14] someone else obviously [00:35:28] yeah, action=wbgetentities needs serious rework - too verbose [00:35:45] yurik: what i did this morning https://gerrit.wikimedia.org/r/#/c/49205/ [00:36:16] also, [00:36:21] isnt that how theyre stored anyways?
[00:36:28] but that's the value modifications [00:36:45] it doesn't matter how stuff is stored - api is separate from storage [00:36:54] no but i mean [00:36:58] it gives the data back how its stored [00:37:00] we need to make it A) small B) fast C) easy to work with [00:37:05] that way it can be passed back to the api [00:37:12] hmm [00:37:21] like i can take what i get from api [00:37:27] add 'en':'blah' [00:37:29] and pass it back [00:37:36] i don't think its a good idea [00:37:47] when you add stuff, you don't need to pass back what you had there before [00:37:52] right now the only way to create items is by passing a json object [00:37:53] right [00:37:54] you dont have to [00:37:57] when you remove, you need to identify it [00:38:00] its optional [00:38:08] to remove you just set the title param as '' [00:38:24] not very intuitive, we should use verbs, not hacks :) [00:38:44] again, in this example: https://wikidata.org/w/api.php?action=wbgetentities&sites=enwiki&titles=Application%20programming%20interface&format=jsonfm [00:39:00] {{sofixit}} :) [00:39:08] it should return: aliases : zh : [a,b,c], [00:39:19] brb in 10 [00:39:22] simple, much shorter [00:39:24] i will! [00:39:25] =) [00:39:47] and i will integrate it as part of the action=query where it belongs [00:47:33] yes! [02:57:41] rawr canvassing https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/EdwardsBot [02:59:11] hmmm [02:59:17] maybe everyone is sleeping? :( [03:01:38] you'd think canvassing people would be easier... [03:02:11] {{oppose}} [03:02:15] :P [03:03:54] but please, id like that one to get wrapped up quickly [03:09:54] * wctaiwan highfives Jasper_Deng [03:13:18] techman224: yeah i would think a speedy close is a good idea, but maybe give it 24h? [03:13:49] duh, That's what I was thinking [03:13:59] ok :) [03:15:18] I added that note [03:17:32] ty [03:58:02] Jasper_Deng: can you edit https://www.wikidata.org/wiki/MediaWiki:Uploaddisabledtext to give a link to the commons upload form? [03:59:13] duh: /me demurs unless he can be shown how non-free images would work [03:59:24] uh [03:59:25] actually, not until we have a policy on images [03:59:35] what does it matter? [03:59:48] non free images arent allowed because file uploads are disabled [04:00:02] b/c it might invite some to upload non-free images to Commons [04:00:08] thats their fault [04:00:09] link them to [04:00:17] https://commons.wikimedia.org/wiki/Commons:Upload [04:00:21] that makes it abundantly clear [04:01:03] but non-free images might prove useful on Wikidata [04:01:27] i dont get what that has to do with modifying a system message. [04:01:53] * Jasper_Deng doesn't want to mislead users [04:02:02] i dont see how you are. [04:02:24] Just say "to upload free images, please upload them at the Wikimedia Commons." [04:02:48] I won't oppose another admin doing this [04:03:44] techman224: ^? [04:04:55] I'll update it, uploads are disabled anyways [04:05:08] thanks [04:06:36] duh, done [04:06:48] ty :) [04:07:36] grrr [04:07:39] delete https://www.wikidata.org/wiki/MediaWiki:Abusefilter-edit-denied [04:07:46] Riley: what was that for? ^ [04:08:38] Well I have no clue, it was months ago. Let me see [04:09:05] suppress = oversight.
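A rough sketch of the leaner response shape yurik argues for in the wbgetentities discussion above: fetch the verbose output and collapse each alias list so the language code maps straight to its values, instead of repeating "language" in every sub-branch. The endpoint and parameters are the real ones from the quoted URL; the reshaping itself is only an illustration of the proposal, not anything the API does.

```python
import requests

# Real endpoint and parameters, matching the example URL quoted above.
resp = requests.get(
    "https://www.wikidata.org/w/api.php",
    params={
        "action": "wbgetentities",
        "sites": "enwiki",
        "titles": "Application programming interface",
        "format": "json",
    },
).json()

def compact_aliases(entity):
    """Collapse [{'language': 'zh', 'value': 'a'}, ...] into {'zh': ['a', ...]},
    dropping the 'language' key that is repeated per alias (assumed layout)."""
    return {
        lang: [e["value"] for e in entries]
        for lang, entries in entity.get("aliases", {}).items()
    }

for entity in resp.get("entities", {}).values():
    print(compact_aliases(entity))
```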
[04:09:12] the original meaning of the message is fine [04:09:44] * Jasper_Deng has previously chastised Riley for copy/pasting enwiki interface messages [04:09:54] meh [04:09:58] i got fixed on enwiki a while back [04:10:00] I agree, the interface uses hidden for hiding abuse filters [04:10:03] thats the only reason i noticed [04:10:08] Fixed. [04:10:16] Sorry. [04:10:24] no worries :P [04:10:27] * duh waves to Aranda56 [04:10:42] * Aranda56 checks this channel out [04:11:12] this is where all the cool people are [04:16:04] addshore: poke [04:25:49] so... [04:26:04] who wants to help generate automatic-property-thingies? :) [04:26:05] https://www.wikidata.org/wiki/User_talk:Legobot/properties.js [04:26:21] https://www.wikidata.org/wiki/User:Legobot/properties.js/Documentation [04:29:17] duh: is it live? [04:29:26] no the bot hasnt been formally approved yet [04:29:27] but [04:29:30] start adding things [04:30:10] when does it get formally approved? [04:30:18] whenever an admin decides to close it? [04:32:42] and i dont think https://gerrit.wikimedia.org/r/#/c/49205/ will get deployed until monday according to the roadmap [04:34:21] Marked http://www.wikidata.org/wiki/Property:P31 for deletion... let the discussion begin [04:34:35] hm [04:34:39] what happens if a property is deleted [04:34:42] and still in use? [04:34:45] do we need an orphan bot? [04:34:52] or does it just magically disappear? [04:34:56] no no, it should not be deleted before all instances are corrected [04:35:08] so we need an orphan bot? [04:35:34] i think we need a ref integrity check [04:35:47] a what? :P [04:39:13] duh: http://en.wikipedia.org/wiki/Referential_integrity [04:39:14] yurik: you might want to crosspost to WD:PC since many people might not see it [04:39:22] ah ok [04:39:47] as in - it must check master for the result to be empty before deleting [04:40:06] ok [04:40:19] believe it or not, but this is how it was done before wiki ;) [04:40:43] and domas came and ruined it all for us [04:44:45] duh: thanks, added to WD:PC [05:11:09] psst, yurik, i'm about to make it so {{Delete}} autosigns. mind signing your deletion nomination so it doesn't break things? [05:11:28] i didn't??? [05:11:42] nope :P [05:12:08] dope :( [05:12:38] Pink|TV: i signed the forum, not talk page. Is there a param for the template [05:13:14] huh? just throw in a ~~~~ after your comment in the template [05:13:38] hehe, i thought there was a param. I'm just slow today :) [05:13:45] lol 'sok. [05:13:53] signed [05:14:42] good :) https://www.wikidata.org/w/index.php?title=Template:Delete&diff=6661914&oldid=6427855 this should take care of it in the future [05:15:17] PinkAmpersand: how good are you with template magic? [05:15:37] 3.5-4 on the Babel scale, I think [05:15:48] or maybe just 3. idk. what you got? [05:15:51] i have been trying to fix https://www.mediawiki.org/wiki/Template:ApiEx [05:16:02] so that the sample code is hidden in a section [05:16:33] like so it's collapsed? [05:16:36] there is a {{collapse top}} [05:16:54] and a matching {{collapse bottom}}, but the moment i add it, it breaks the whole thing :( [05:17:10] hmm. lemme see what i can do [05:17:17] I want to keep the header [05:17:25] but not the results [05:17:39] i think having a
PinkAmpersand: i think only the {{{result}}} param needs to be collapsed [05:20:25] its been such a long time since i wrote that monster :( [05:22:37] actually no, include the {{{POST}}} too [05:22:42] that thing is a weird table [05:27:13] yurik: ok, think i got it. want me to save it? [05:27:26] yes please :) [05:27:33] thanks!!! [05:27:56] oh, want me to include a way that people can make it so it /doesn't/ collapse? [05:28:05] how? [05:28:08] did you save it? [05:28:24] no, not yet. and i mean like a "collapse=no" parameter one could add [05:28:44] it should be collapsed by default [05:28:49] oh [05:28:57] sure [05:29:17] is the POST inside or outside? [05:32:48] inside [05:33:13] if you want, I can also add a "collapse=result_only" [05:35:24] yurik: ^ [05:36:16] PinkAmpersand: ideally i would like the collapsed portion to be ONLY the results, but always [05:36:33] the header (the query as sent to the API) should always be visible [05:36:38] oh. i thought you said you wanted {{{Post}}} too [05:36:51] right right, post should also go inside [05:36:54] sorry [05:37:11] #mediawiki-template-support ;) [05:37:29] duh: didn't know there is a channel for that :) [05:37:30] ok. but i meant "result_only" would make it so that the post /doesn't/ collapse, but the result still does [05:37:35] there isnt :P [05:37:38] its now this channel [05:38:03] PinkAmpersand: btw, it seems that some users of that template don't include the results at all [05:38:12] not sure how it should handle them [05:38:16] very few [05:38:33] i am not sure its needed really, up to you [05:39:15] ok. and i'll throw in a parser so it's not collapsing empty space if people leave the {{{result}}} blank [05:44:06] PinkAmpersand: thanks! i can't wait for the Lua to make the whole thing readable :) [05:44:20] Lua scares me :( [05:44:37] PinkAmpersand: trust me, nothing can be scarier than template magic [05:44:48] well, maybe except dealing with the parser code [05:45:05] i like my {{#if:{{{foo|}}}||{{{#ifeq:{{{bar|}}}|{{{baz|}}}||quux}}}} [05:45:15] exactly my point! [05:45:28] and i thought perl was bad [05:45:52] can we all just switch to c# or python please? [05:46:18] i will even tolerate java, even though by now it's so far behind, its not even funny [05:47:02] but all that stuff is so... weird. idk. template markup is the only programming language I can actually consistently understand [05:47:20] its not a programming language [05:47:24] its just a markup language [05:47:36] well i suppose technically its programming [05:47:44] but horribly convoluted programming [05:49:29] idk, i find javascript way more complicated [05:50:56] I don't think it's about it being complicated, PinkAmpersand [05:51:19] Java is arguably more complicated. But it is much more orderly in its rules and syntax. [05:51:24] (for example) [05:56:35] PinkAmpersand: there are some weird artifacts - https://www.mediawiki.org/wiki/API:Query [05:57:05] how so? [05:57:07] the first box - "extended content" should be "API Result" or something similar [05:57:14] and the "The following content has been placed in a collapsed box for improved usability." text [05:57:16] is not needed [05:57:53] is it possible to remove it? [05:58:19] yurik: you could edit {{Collapse top}} or create an alternate version, but otherwise, no [05:58:32] (but ok, I'll add a heading to the boxes) [05:58:37] ok, its fine then [05:58:43] PinkAmpersand: thank you!!! [05:59:35] PinkAmpersand: just out of curiosity, do you think it is possible to make the actual api?... call as the header?
[05:59:54] yes, it would be [05:59:58] shall i? [06:00:07] that's what i tried to do myself [06:00:16] but it broke with the divs :( [06:00:21] lemme see what i can do, then [06:00:22] if you can [06:00:46] but if you are busy doing something else, i really don't want to take any more of your time :) [06:02:15] nah it's fine [06:10:45] btw Yurik I'm probably going to put your deletion request for P31 on hold [06:10:55] need consensus to develop [06:10:58] Jasper_Deng: i think his request isnt a CSD [06:11:02] its more of an XfD [06:11:10] since its not on WD:RfD, I don't see any rush. [06:11:12] yeah, exactly [06:11:38] Jasper_Deng: i didn't see any {{proposeToDelete}} - that's what my intention was [06:11:57] * duh commissions PinkAmpersand for a new template. [06:12:03] lol [06:12:19] http://www.mediawiki.org/wiki/Requests_for_comment/Wikidata_API#action.3Dwbgetentities [06:12:32] PinkAmpersand would make a fine admin [06:12:40] my initial thoughts about changing the wikidata API [06:12:48] hm [06:12:56] yurik: should that happen on mw or wikidata? [06:13:14] duh: it will be a change to the extension [06:13:24] true [06:13:31] although i suspect i will have to make a few modifications to the ApiPageSet class [06:13:44] to allow for the extension to specify which pages it wants to work with [06:14:05] because the titles & sites arguments specify a wikidata-specific search [06:14:46] btw did you see that bug in the module manager thing today? [06:14:55] with POSTing not being enforced? [06:15:02] no, where?!? [06:15:17] lemme find it [06:15:21] krenair already fixed it [06:15:22] not good, me like no bugs [06:15:52] i'll pull and see the latest [06:15:56] https://gerrit.wikimedia.org/r/#/c/49344/ [06:16:03] https://bugzilla.wikimedia.org/show_bug.cgi?id=45017 [06:17:38] good thing it got caught [06:17:53] need to see how it got removed [06:17:55] bleh :( [06:18:07] don't like bugs, especially when they relate to security [06:18:34] yurik: yeah i can't figure out how to make that thingy the header. i'll try to do it some other time, maybe [06:18:54] PinkAmpersand: i think its because of the class in div [06:19:08] that's what foiled my own attempt last time [06:19:44] PinkAmpersand: thanks for the change! [06:19:55] doc is soo much more readable now [06:20:33] duh: what do you think about the proposal? [06:20:41] I have to read it a bit more [06:20:50] just look at the results [06:20:53] the diff [06:21:27] I just finished a script that parses a dump, checks if the content is up to date, and fetches from the API if necessary, and yields all of it together in a generator. [06:21:54] sounds ... complicated :) [06:22:07] why are you playing with the dump? [06:22:30] I need to fetch the page content of ~2.5 million pages. [06:22:41] ...=gigabytes.... [06:22:54] why content though? [06:23:00] The bot is editing pages [06:23:04] Let me get the link [06:23:22] is that the bot that deletes langlinks? [06:23:25] https://en.wikipedia.org/wiki/Wikipedia:Bots/Requests_for_approval/Legobot_28 [06:23:25] yes [06:23:38] the rest of the script is super easy [06:23:46] fetching the page content was hard. [06:25:00] duh: is that a modification of the interwiki bot? [06:25:05] no [06:25:07] my own script [06:25:31] because that's where it handles all the possible cases with one->many [06:25:41] i mean - many pages in one to one in another [06:25:49] and all sorts of other weird cases [06:26:42] Eh [06:26:58] But around half the links already exist [06:27:11] do you do deletion in all sites at the same time?
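The dump-plus-API generator duh describes above boils down to this pattern: trust the dump copy when its revision id is still current, refetch otherwise, and yield everything through one interface. A minimal sketch, assuming the dump has already been parsed into (title, revid, text) triples (that part is elided) and each batch stays within the API's 50-title limit; the action=query parameters are real.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def latest_revids(titles):
    # Current revision ids for up to 50 titles in one request.
    r = requests.get(API, params={
        "action": "query", "prop": "revisions", "rvprop": "ids",
        "titles": "|".join(titles), "format": "json",
    }).json()
    return {p["title"]: p["revisions"][0]["revid"]
            for p in r["query"]["pages"].values() if "revisions" in p}

def fresh_pages(dump_pages):
    """dump_pages yields (title, revid, text) parsed from the dump.
    Yield dump text when current, otherwise refetch from the API."""
    batch = list(dump_pages)
    live = latest_revids([t for t, _, _ in batch])
    for title, revid, text in batch:
        if live.get(title) == revid:
            yield title, text  # dump copy is up to date
        else:
            r = requests.get(API, params={
                "action": "query", "prop": "revisions", "rvprop": "content",
                "titles": title, "format": "json",
            }).json()
            page = next(iter(r["query"]["pages"].values()))
            yield title, page["revisions"][0]["*"]
```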
[06:27:28] lol PinkAmpersand. [06:27:32] yurik: no, it only runs on enwiki [06:27:37] enwiki + wikidata [06:27:49] bleh... might run into major problems [06:27:53] how so? [06:28:00] it checks that wikidata == local before removing [06:28:00] because of interwiki conflicts of many->one [06:28:08] duh: re RFP? [06:28:14] PinkAmpersand: yeah :P [06:28:19] :D [06:28:22] yurik: but that gets handled on wikidata [06:28:27] there are already enough of those problems [06:28:55] that's true, but IMO the bot should be an extension of interwiki bot, and delete only when it detects that all sites have identical lists, and delete them everywhere at once [06:29:07] i don't think we should rush with the deletion until everything switches [06:29:38] in the mean time i will see if i can get the langlinks to return if the data is coming from local or the repo [06:29:57] no [06:30:00] this way we can quickly narrow the troubled ones [06:30:02] the problem is [06:30:06] humans are removing at their own pace [06:30:13] but not all humans verify its correct [06:30:21] they just assume "oh wikidata, i can delete these" [06:30:30] humans should not remove at all [06:30:31] which is why this is important to do it right [06:30:34] but they are. [06:30:35] why are they removing ?? [06:30:41] that we should stop [06:30:42] to make wikidata show up [06:30:45] no [06:30:47] thats the problem [06:30:51] since local suppresses wikidata [06:30:58] and interwiki bots are no longer running [06:31:03] our langlinks are becoming out of date [06:31:13] and have to be replaced with wikidata, via removal [06:31:14] does wikidata append to the local? [06:31:23] they do merge [06:31:27] it's not one or the other [06:31:34] in that case there is no problem [06:31:38] merge, with local overwriting wikidata [06:31:41] there is [06:31:42] exactly [06:31:44] say on local [06:31:50] you have [[de:Page1]] [06:31:58] on wikidata that gets updated to [[de:Page2]] [06:32:10] the link on enwiki will still show [[de:Page1]] [06:32:16] even though its been fixed on wikidata [06:32:22] and since no bots are running, it wont get fixed [06:33:13] ok, so what you are saying is that we need to get the langlinks info ASAP [06:33:23] yeah, as i edit road articles i'm removing the interwikis, since i know over 90% of US road articles have items [06:33:37] yurik: yes. [06:33:41] including all the ones in california [06:33:49] so my bot will do both [06:33:57] just make sure you monitor all those articles [06:33:57] it will create the wikidata item if needed [06:34:07] because if there is a rogue bot readding them, it has to be stopped [06:34:08] and then remove langlinks when confirmed [06:34:17] yurik: all interwiki bots on en have been disabled [06:34:34] i'm worried about international ones :) [06:34:44] with a global bot flag [06:34:45] wait [06:34:47] rschen7754: wtf [06:34:48] its editing [06:34:56] 06:19, 16 February 2013 Rschen7754 (talk | contribs) unblocked DarafshBot (talk | contribs) (updated) [06:35:00] 06:23, 16 February 2013 (diff | hist) . . (+33)‎ . . m Maritime republics ‎ (r2.7.1) (Robot: Adding fa:جمهوری دریایی) (top) [rollback: 1 edit] [rollback] [vandalism] [06:35:07] :( [06:35:12] just block it [06:35:17] yeah [06:35:33] page slow loading [06:35:41] yurik: because of pywikibot update, they wont check against en anymore [06:35:57] true, but they might not have updated yet :) [06:36:12] not much we can do [06:36:15] call me paranoid [06:36:20] we can monitor that [06:36:31] is there a list of known IW bots?
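The "wikidata == local" check discussed above is straightforward to sketch: compare the langlinks stored on the article itself with the sitelinks on its connected item, and only treat removal as safe when they agree. prop=langlinks and wbgetentities are real modules; the dbname-to-language mapping at the end is a simplifying assumption that only holds for plain Wikipedias.

```python
import requests

def local_langlinks(title):
    # Langlinks as stored in the enwiki article itself.
    r = requests.get("https://en.wikipedia.org/w/api.php", params={
        "action": "query", "prop": "langlinks", "lllimit": "max",
        "titles": title, "format": "json",
    }).json()
    page = next(iter(r["query"]["pages"].values()))
    return {ll["lang"]: ll["*"] for ll in page.get("langlinks", [])}

def wikidata_sitelinks(title):
    # Sitelinks on the item connected to the enwiki article.
    r = requests.get("https://www.wikidata.org/w/api.php", params={
        "action": "wbgetentities", "sites": "enwiki", "titles": title,
        "props": "sitelinks", "format": "json",
    }).json()
    entity = next(iter(r["entities"].values()))
    # 'dewiki' -> 'de'; good enough for plain Wikipedias (assumption).
    return {k[:-4]: v["title"]
            for k, v in entity.get("sitelinks", {}).items()
            if k.endswith("wiki")}

def safe_to_remove(title):
    local = local_langlinks(title)
    remote = wikidata_sitelinks(title)
    # Remove only when every local link already sits on Wikidata, unchanged.
    return all(remote.get(lang) == target for lang, target in local.items())
```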
[06:36:36] we set up https://en.wikipedia.org/w/index.php?title=Special:AbuseLog&wpSearchFilter=524 [06:36:38] yurik: no. [06:36:48] we created the list via database query and manual list [06:37:11] and then i (with help from rschen7754) went around the wikiverse and notified all bot owners on their home wiki about the change [06:38:01] https://en.wikipedia.org/w/index.php?title=Paris_Fashion_Week&diff=538467417&oldid=526339980 [06:38:04] bugaga [06:38:17] yurik: that bot has issues [06:38:30] 06:35, 16 February 2013 Rschen7754 (talk | contribs) blocked DarafshBot (talk | contribs) (account creation blocked) with an expiry time of indefinite (nope, still broken) [06:38:30] 06:19, 16 February 2013 Rschen7754 (talk | contribs) unblocked DarafshBot (talk | contribs) (updated) [06:38:30] 00:14, 16 February 2013 Rschen7754 (talk | contribs) blocked DarafshBot (talk | contribs) (account creation blocked, autoblock disabled) with an expiry time of indefinite (interwiki bot that was not updated) [06:38:42] please push the button [06:39:21] it took that op 3 tries to get through BRFA [06:39:40] technically it's not re-adding links [06:39:43] but still [06:39:45] doesnt matter [06:39:57] it had in the past [06:40:02] meaning its a liability [06:40:20] i really only trust Merlissimo's bot staying unblocked on enwiki [06:40:34] ok, is it dead? [06:40:38] blocked? [06:40:45] yeah, i'm an enwiki admin [06:41:04] oh, good [06:41:09] super! [06:41:54] ok [06:41:57] time to finish this bot [06:42:02] the hard part is done :P [06:42:32] good luck :) [06:42:43] just make sure it deletes only when the wikidata has a 100% match :) [06:42:49] well [06:42:53] it also will attempt merges [06:43:00] how so? [06:43:16] if en:page1 & page2 --> he:pageA [06:43:24] which one makes it to wikidata? [06:43:37] it checks for a 75% match [06:43:45] well at least right now it will [06:43:48] it will be such a mess :((( [06:43:52] i probably will need to change it [06:44:13] ugh i got that exception error again [06:46:15] Riley: have some free time? [06:46:31] duh: Yes [06:46:37] wanna help generate more stuff for https://www.wikidata.org/wiki/User_talk:Legobot/properties.js ? [06:46:51] https://en.wikipedia.org/wiki/Category:Cities_in_the_United_States_by_state is probably a good starting place [06:47:06] duh: some of the road ones are probably easy to do [06:47:10] at least in the US [06:47:10] rschen7754: add them! [06:47:22] duh: that would require work to figure out the cats :P [06:48:52] get started! [06:48:56] there are so many categories to exploit [06:50:29] yeah, i still have to get phase 2 set up for USRD [06:50:44] trying to finish up phase 1 too :/ [06:50:54] my bot will finish it up :) [06:52:02] i need a good way to resume from a dump if it dies... [06:52:42] well, unfortunately it can't add commons files with variables [06:53:11] is it a predictable patern? [06:53:14] pattern* [06:56:47] duh: typically [06:56:57] https://www.wikidata.org/wiki/Q19183 [06:57:06] there's a few other fields that i still need to add [06:58:18] hm [06:58:22] ill look at it another time [06:58:27] it looks possible [06:58:33] but probably easier with a standalone script [06:58:39] rather than modifying my current one [06:59:10] my next step will support parsing templates [07:03:33] do i need my own repo to use wikidata client? [07:04:56] no [07:05:02] you can technically use the wikidata repo [07:06:09] good.
but it seems like i do have to use a ton of extensions just to enable this baby [07:06:39] yeah [07:30:50] bleh, got a weird error after installing the client [07:31:13] http://pastebin.com/uaTczcdD [07:34:08] duh: does your bot support multiple statements for a property? [07:34:18] yes [07:34:27] ok [07:34:57] you just have to add it as multiple templates [07:35:01] do i need to setup remote db access? [07:35:12] i copied all the settings from http://www.mediawiki.org/wiki/Extension:Wikibase_Client [07:35:14] yurik: why not use the wikidata-vagrant? [07:35:19] i use it [07:35:28] ohh, wikidata-vagrant [07:35:29] sorry [07:35:37] i use the regular mvw one [07:35:43] link? [07:36:02] there are multiple ones [07:36:18] silkemeyer or abta [07:37:01] silkemeyer is the one i used [07:37:06] i just never finished setting it up [07:37:10] aude told me to use that one though [07:37:15] thx [07:37:43] bleh, but i already got the regular one setup :( [07:38:05] and it can't use multiple ones from the same repo [07:42:17] get another computer :P [07:43:10] https://www.wikidata.org/wiki/User_talk:Legobot/properties.js [07:43:13] so much potential [07:43:23] just need people to help gather categories. [07:46:01] duh: any ideas -- http://pastebin.com/EbjtpYJf [07:46:11] it complains on LoadBalancer->getConnection(-1, Array, 'wikidatawiki') [07:46:22] #2 /srv/mediawiki/includes/dao/DBAccessBase.php(61): LoadBalancer->getConnection(-1, Array, 'wikidatawiki') [07:46:24] yeah [07:46:33] are your foreign dbs set up properly? [07:46:42] or however that works [07:47:37] well, i followed the instructions on the ext page (link above) [07:47:46] and just copied all the settings from there [07:48:48] hm [07:48:54] what if [07:48:56] you remove [07:49:00] $wgWBSettings['repoDatabase'] = 'wikidatawiki'; [07:49:00] $wgWBSettings['changesDatabase'] = 'wikidatawiki'; [07:49:12] and include $wgWBSettings['repoUrl'] = 'http://wikidata-test-repo.wikimedia.de'; [07:49:18] duh: is your bot recursive? [07:49:50] rschen7754: no, however if theres a good reason, i can enable a recursion option [07:49:57] :/ [07:50:03] recursion is just a bad idea in general [07:50:10] for enwiki categories [07:50:13] i mean will you do subcats [07:50:22] do you want me to? [07:50:33] duh: http://pastebin.com/JJa9E7xw [07:50:34] how deep? [07:50:37] this is what i use [07:50:38] 1-2 [07:50:44] i think a few states may use it [07:50:47] yurik: get rid of [07:50:47] $wgWBSettings['repoDatabase'] = 'wikidatawiki'; [07:50:47] $wgWBSettings['changesDatabase'] = 'wikidatawiki'; [07:51:22] rschen7754: ok and do you want only articles at level 2, or articles at levels 0, 1, 2? [07:51:28] 0,1,2 [07:51:34] ok good, thats easy :) [07:51:47] just add |recursion=2 [07:51:53] ok [07:51:55] but it will only work for your name [07:51:59] as requestor [07:52:00] ooo [07:52:04] thx [07:52:15] ill whitelist people if they need it [07:52:20] i dont trust most people with recursion [07:52:36] had to re-run the interlang script again, but worked, thanks! [07:53:06] :D [07:54:32] duh: spoke too soon - the site works, but it doesn't show any interwiki links [07:54:37] hm [07:54:45] are the page names the same? [07:54:47] i tried creating a page that i know exists in wikidata.en [07:54:51] Childebert I [07:54:56] (was a random page) [07:55:02] oh right [07:55:05] null edit the page [07:55:06] it shows "add links" [07:55:13] its like a 5 minute delay caching or something [07:55:20] i created it from scratch [07:55:29] try null edit again?
[07:55:30] delay on the test machine?? [07:55:36] yeah caching or something [07:55:37] idk [07:55:47] it was on the village pump [07:55:50] it just showed "add links" [07:55:57] hm [07:55:58] not sure then [07:56:06] clicking it shows "unexpected error occurred" [07:58:20] no clue... [07:59:31] * Moe_Epsilon waves [08:01:06] duh: oooh... bleeeeh. it uses origin header (CORS) for some reason [08:01:14] * duh waves to Moe_Epsilon [08:04:23] when i click "add links", it issues an origin=localhost api call to the test wikidata [08:05:20] ah [08:35:15] duh: also for the statements does your bot create pages? [08:35:32] if you set |create=blah [08:35:37] ok [08:35:46] (where blah is anything) [08:42:06] duh: ok finally added one [08:43:38] :D [08:43:54] i think there's a few hundred [08:44:02] yay! [08:44:19] a few hundred more property sets, or a few hundred in that one? [08:44:29] what i think we'll do is once it gets approved [08:44:38] if you have a request, you stick it on the talk [08:44:40] there's a few hundred in there [08:44:52] then a sysop who is not the requestor, will copy it over after checking it [08:44:58] ok [08:44:59] yeah i looked at the category [08:45:18] it looks good to me [08:45:22] cool [08:45:59] i would have added the two-digit ones but there's too many ignores that would be needed [08:46:22] how so? [08:46:36] too many oddballs in the category [08:46:52] oh hmmmm [08:46:56] some of these pages are redirects [08:47:15] thats a quick fix [08:49:34] done [08:50:27] thx [08:50:47] though those probably go to other things in the category [08:52:28] yeah [08:57:23] rschen7754: https://www.wikidata.org/w/index.php?diff=6674588&oldid=6673766&rcid=6687157 :DDD [08:57:38] wow [08:57:52] you'll need a bot just to generate the list :P [08:58:36] i used a script to generate it [08:58:39] now im checking it [09:00:23] ahh stupid florida [09:00:33] those are all subcat by county [09:18:33] phew, checked them all [09:30:33] duh: can you do multiple ignore prefixes? [09:34:05] nvm figured it out [09:36:41] commas [09:36:51] ah yup [09:36:56] though some categories have commas :P [09:37:43] wait [09:37:47] some pages have commas [09:37:49] ugh [09:37:53] eh [09:38:03] its an edge case [09:38:09] lol [09:38:12] i'm not going to write a full proper parser [09:40:50] oh and rschen7754, can you give yurik autopatrolled please [09:40:57] k [09:41:10] almost gave it to him on enwiki :P [09:41:25] enwiki is fine too :) [09:41:28] :P [09:41:33] done [09:42:15] what's the difference between "concept" and "Representation term" [09:42:25] http://www.wikidata.org/wiki/Q1969448 [09:42:58] no clue, but i'm going to sleep now, goodnight everyone [09:43:27] gnight [09:50:48] how do we mark vandals? [09:50:56] http://www.wikidata.org/w/index.php?title=Q7207&curid=8404&diff=6677560&oldid=6373366&rcid=6690142 [09:53:55] just warn them and if it persists flag down an admin [09:54:03] i mean is there a template? [09:54:15] {{uw-vandalism1}} i think [09:54:28] even for IP? [10:06:53] i think we ought to have a plugin that does dynamic google translate for all labels for each language :) [10:14:12] even for IP [10:24:56] yurik: just curious, one question if i may? [10:25:05] sure [10:25:29] Denny_WMDE: that's very formal really :) [10:26:06] since you think "is a" should be deleted, and replaced with a constrained "entity type", i wonder about how to say that Brac, the place I come from, is an island?
[10:26:34] probably with a property that is specific to geographical objects [10:26:54] geographical feature = island [10:27:03] ah, ok [10:27:26] so instead of having one classification, we break it down into many classification systems per area? [10:27:35] i would think so [10:27:59] because when you have something like an english verb "is a", it does not really give any semantic meaning [10:28:13] na, I agree that "is a" as a term sucks [10:28:23] here having a prop already classifies the object as geo [10:28:41] and the island classifies it further [10:28:45] I would have gone for "instance of" or "type", but I am trying to keep out of the discussion [10:29:27] i guess the thought here is that the property itself is already an attribute, so when one adds property=value, in reality that's two values [10:29:41] one narrowing it somewhat, and the second narrowing it even further [10:30:24] hmm. ok. as said, I don't want to discuss it — I try to keep out, I just wanted to see the answer to that obvious question [10:30:29] and you have one, so I am fine :) [10:30:34] :) [10:30:35] hehe [10:30:50] time to apply for adminship. i have "the answer"! [10:31:02] well, an answer :) [10:31:14] ok, thanks [10:31:22] did you see my idea about the API? [10:31:39] http://www.mediawiki.org/wiki/Requests_for_comment/Wikidata_API [10:40:34] Denny_WMDE: sent it to the list, will see what people will say :) [11:34:00] anyone want to confirm for me, wikidata search returns nothing? :( [11:35:58] * aude doesn't know how to fix it :( [11:44:37] ok, seems search is broken on multiple wikis, not only wikidata [11:44:42] due to server issues it seems [12:14:30] is there a list of all languages wikidata is deployed on yet? [12:20:03] addshore: not sure of some "list" but enwiki, hewiki (hebrew), itwiki (italian) and hu (hungarian) [12:20:23] we do have a list actually, but only in the settings [12:20:36] yep, I just went and searched for the word 'live' through the wikidata new page :) [12:20:49] ah [12:40:41] aude: hi. have you seen my api idea? [12:40:52] * yurik is looking for some feedback [12:41:27] New patchset: Jeroen De Dauw; "(bug 44095) enhanced claim diff visualizaion including refs" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/48459 [12:43:51] yurik it's not so much my area but looks interesting [12:44:12] * aude suggests asking Denny_WMDE danielk and Jeroen [12:44:39] Denny_WMDE disappeared a bit earlier... I suspect I scared him away [12:44:52] nah, went shopping [12:45:38] at first glance the proposal looks sensible, but really need to take a closer look [12:57:35] Denny_WMDE: let me know if you have any questions [12:57:49] * yurik falling asleep while coding smart continue .... [12:58:44] ohh, Denny_WMDE quick question - is it possible to use wikidata repo (or maybe test wikidata) from my local client? [12:58:54] i tried to set it up, but it fails [12:59:26] the "edit links" link fails because it uses CORS origin= request [12:59:42] yurik: i don't think it's setup yet [12:59:46] we have a bug for that [12:59:57] +1 to aude [13:00:12] so at this point there is no way to locally test the client without having the server? [13:00:13] bleh [13:00:30] i was trying to patch it to have langlinks return origin [13:00:40] yurik: well, we usually have a test repo and a test client [13:00:48] both locally?
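Denny and yurik's "geographical feature = island" scheme above would be an ordinary claim once such a property existed. A hedged sketch of posting one via action=wbcreateclaim; the module and parameter names are real, but P999, Q123 and Q456 are placeholder ids rather than actual assignments, and a valid edit token from a logged-in session is required (elided here).

```python
import json
import requests

# Hypothetical ids: P999 = "geographical feature", Q123 = "island",
# Q456 = the item for Brac. None of these are real assignments.
r = requests.post("https://www.wikidata.org/w/api.php", data={
    "action": "wbcreateclaim",
    "entity": "Q456",
    "property": "P999",
    "snaktype": "value",
    # Item values are passed as JSON holding the numeric part of the Q-id.
    "value": json.dumps({"entity-type": "item", "numeric-id": 123}),
    "token": "...",  # real edit token needed; elided
    "format": "json",
})
print(r.json())
```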
[13:00:52] we the developers and the test repo and client talk to each other [13:01:09] except when it comes to suggesting sitelinks and such, we setup the client to behave as enwiki or some wikipedia [13:01:18] both locally [13:01:31] aude: i'm only interested in the langlinks [13:01:33] you might be able to use our test repo [13:02:00] it doesn't need CORS, right? [13:02:01] we had a patch to enable cors there, although i think it just enables our test client to talk to test repo [13:02:10] not sure [13:02:37] I tried running php extensions/Wikibase/lib/maintenance/pollForChanges.php [13:02:49] and it failed with 'testwiki.wb_changes' does not exist [13:02:51] hmmm, that is deprecated [13:02:55] it should still work though [13:02:56] (SQL table) [13:03:12] it will use the wb_changes table from the repo [13:03:40] right, but that's because you have shared db install [13:03:44] the client also uses remote db access to fetch the site links [13:03:54] no, it's not shared [13:04:09] but can be setup to give db access to the client [13:04:11] i mean - you use some sort of a remote db querying [13:04:19] the way that global usage, central auth do, etc [13:04:30] that's what tim starling, etc. recommended we do [13:04:42] but it's not ideal for third parties [13:04:48] the question is - can the client connect to the repo through public-only interfaces [13:04:59] via the api? (slow, yes) [13:05:15] slow is fine, doesn't work :) [13:05:17] but that needs cors, if it's editing i think [13:05:26] readonly [13:05:27] if it's just querying, it can be codone [13:05:30] can be done [13:06:04] we have gadget(s) for wikipedia that say if there is a related wikidata item, show the description, label, etc. [13:06:18] those must use the api [13:06:27] http://pastebin.com/Ki4dGfUe [13:06:49] this is how i configured my local client [13:07:09] the last 3 are commented out [13:07:30] (i copied all these settings from the client extension page in MW) [13:07:41] the changesDatabase and repoDatabase are the important ones [13:08:04] you need a second mw instance with the repo... recommended [13:09:19] hrm... looks like we need to update the docs [13:09:48] err http://www.mediawiki.org/wiki/Extension:WikidataClient#WikibaseLib_and_WikibaseClient [13:10:01] is important about the database access [13:10:20] aude: what do you mean? [13:10:27] aude: https://bugzilla.wikimedia.org/show_bug.cgi?id=43920 [13:10:31] i did the populateInterwiki & update [13:10:39] The client needs foreign access to the database of the Wikibase Repo. It uses MediaWiki's LoadBalancer functionality to do this, similar to the way foreign database access is handled for Wikimedia Foundation projects [13:10:47] In client/ExampleSettings.php you can see sample configurations for setting up the foreign database access. [13:10:57] hi multichill [13:11:03] aude: Is that what you're seeing at other wiki's too? [13:11:05] multichill: we know [13:11:28] maxsem is attempting to ping ops [13:11:39] but it's night time in SF [13:11:53] aude: Ahum, I meant https://bugzilla.wikimedia.org/show_bug.cgi?id=43920 [13:12:10] That has been open for over a month [13:12:29] Search is slowly getting worse and worse [13:12:37] https://bugzilla.wikimedia.org/45073 [13:12:43] Throwing "nothing found" on timeouts [13:12:57] oops [13:13:31] is commons having the problems in bug 45073?
[13:14:37] The symptoms look alike, but at Commons several reloads will give you the result [13:14:53] hmmm [13:15:06] http://commons.wikimedia.org/w/index.php?title=Special%3ASearch&profile=default&search=georgetown&fulltext=Search&uselang=en [13:15:07] Did anyone ever add wikidata to Lucene? [13:15:10] looks like the same outage [13:15:33] multichill: i think we did something but the existing search solution for wikipedia doesn't apply so well for wikidata [13:15:43] multichill: Not long after the AMS hackathon ;) [13:15:49] servers failed because all the devs were checking server status [13:15:51] we're looking into having solr or something better [13:15:57] by searching repeatedly... [13:16:11] Lucene and solr is family ;-) [13:16:24] Reedy: you know about https://bugzilla.wikimedia.org/45073 ? [13:16:31] although you can't fix it, i'm sure [13:16:38] multichill: yep [13:16:54] aude: I think feeding the wikidata items to a search engine could give very good results [13:17:13] multichill: they do not get sorted in a nice way [13:17:23] aude: It's not a duplicate btw [13:17:28] right [13:17:53] i can look on commons once search works at all again [13:19:50] alright, time to eat..... [13:19:52] back in a bit [14:15:24] how do i search properties, e.g. if i want to search for a property like "contains", how would i do that? [15:26:06] just great. Even though the API is marked on the front page as "will change", there is already a lib for it [15:26:32] and then people will start complaining that it has changed [15:31:13] Emw: did you try Special:Search's advanced search and check "Property"? [15:31:46] it seems to work for me https://www.wikidata.org/w/index.php?title=Special%3ASearch&profile=advanced&search=is+a&fulltext=Search&ns120=1&redirs=1&profile=advanced [15:39:47] whym_away: thanks! i've added some thoughts to https://www.wikidata.org/wiki/Wikidata:Project_chat#Searching_properties_by_title [15:50:23] New patchset: Daniel Kinzler; "(bug 43990) Robust serialization of change objects" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46294 [15:51:55] New review: Daniel Kinzler; "Patch Set 9:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46294 [17:57:53] Hi all. How can we enable Wikidata-based interwiki on ml.wikipedia? [18:04:21] vssun: It will be enabled for all wikipedias, soon [18:06:52] hoo: Thanks. [18:07:41] vssun: the current plan is to do it on 6th of march [18:08:39] Nice [18:37:47] we use Wikidata :) http://omegawiki.blogspot.de/2013/02/using-wikidata-to-display-links-to.html [19:22:09] kipcool: \o/ [19:23:18] :) [19:47:57] Wikidata office hour starts in about 10 mins in #wikimedia-office [19:56:25] Lydia_WMDE: let me know if you want me to change anything on that post [19:56:32] i'm pretty much done otherwise [19:56:53] yurik: ok - i'll have a look as soon as i can but it might not happen before monday tbh [19:56:56] thanks for working on it!
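The advanced-search tip whym gives above has a direct API equivalent: the standard list=search module restricted to the Property namespace, which is 120 on wikidata.org, mirroring the ns120=1 checkbox in the linked Special:Search URL. A minimal sketch:

```python
import requests

def search_properties(term):
    # Full-text search restricted to the Property: namespace (ns 120).
    r = requests.get("https://www.wikidata.org/w/api.php", params={
        "action": "query", "list": "search", "srsearch": term,
        "srnamespace": "120", "format": "json",
    }).json()
    return [hit["title"] for hit in r["query"]["search"]]

print(search_properties("is a"))  # e.g. ['Property:P31', ...]
```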
[19:57:05] no probs, and no rush [19:57:12] just get the wd deployed to all asap :) [19:57:32] :) [20:17:56] New review: John Erling Blad; "Patch Set 1: Verified+2 Code-Review+2" [mediawiki/extensions/Wikibase] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/49238 [20:17:57] Change merged: John Erling Blad; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49238 [20:32:50] yurik: it's mostly removing links it seems [20:32:52] hey [20:32:58] hi [20:33:07] that's fine, but it should only remove links in the main namespace [20:33:13] so here's the thing which I was talking about re: my father's technical issues on IE 8 [20:33:14] stop it to let the bot owner fix it [20:33:23] i'm not an admin [20:33:25] he tried while logged in, he tried while logged out [20:33:33] he tried the page for "Nitrogen" [20:33:36] got nothing [20:33:51] I tried the page and was able to edit, both while logged in and logged out [20:34:42] yurik: blocked [20:34:54] rschen7754: thx [20:34:58] np [20:36:21] anyone else having troubles saving labels and descriptions atm? [20:36:57] it makes up for the breakfast misfortune then [20:37:01] w/w [20:37:39] Dragonfly6-7: what windows version is your father using? i guess xp [20:39:47] mmm, blocked the bot operator, not the master :P [20:39:51] *bot [20:51:06] !nyan [20:51:06] ~=[,,_,,]:3 [20:51:31] lol [20:52:09] JeroenDeDauw: http://commons.wikimedia.org/wiki/File:Wikidata-Cat.gif [20:54:48] Dragonfly6-7: I just installed IE8 in a virtual machine running XP --> wikidata.org not working [20:55:03] IE 8.0.6001.18702 [20:55:05] :( [20:55:35] lbenedix1: in which way? [20:56:05] there are no edit-links, just [[object Object]] [20:56:17] JS problem [20:56:42] * Jasper_Deng thinks IE8 and before have never been nice w/ modern JS [20:56:53] yepp [20:57:01] but IE8 is the newest IE for XP [20:57:16] of course you can use Chrome or Firefox with XP [20:57:36] but not in every environment, for example in the office [20:58:02] :/ [21:00:03] I have IE8 at work, on some client-based Windows version (Windows 7 or Vista based I think, but not sure). I've only tried logged-out, and then I see edit fields for labels and descriptions, but no "Save" or "Cancel" buttons. mind you that my work's computer system is fucked up in many ways though, so I can't chalk it up to being Wikidata's fault [21:00:26] oh, and I see the [[object Object]] thing too, but only some times [21:02:01] on other pages there are no edit links and no [[object Object]] [21:12:21] [21:52:09] JeroenDeDauw: http://commons.wikimedia.org/wiki/File:Wikidata-Cat.gif [21:12:24] OMG!!! [21:12:25] :D [21:12:43] :D [21:12:57] Oh, what is this? [21:13:03] :) [21:13:09] o_O [21:13:11] omfg [21:13:17] approval, i has it [21:13:50] lbenedix1: you win one free internets! [21:13:54] :D [21:13:58] yay [21:15:42] [[Testing]] [21:16:17] lbenedix1: mind if I use this as twitter avatar? ;p [21:16:52] can you use an animated gif as twitter-avatar? [21:17:43] darn, totally forgot to ask (if anyone is still around): how can i help with implementing langlinks sourcing [21:17:55] omg... reminds me of my early days in the internet when was all over the internets [21:18:02] yurik: meaning?
[21:18:05] lbenedix1: unfortunately not, though one can use a static version [21:18:30] Lydia_WMDE: querying langlinks returns a list without specifying if it came from the local or the repo [21:18:43] this is crucial for the bots to know if they need to clean up the article [21:18:57] otherwise they need to download the whole content and analyze it [21:19:01] yurik: ohh there is a bug for that - poking there is best :) [21:19:06] do you have that one already? [21:19:11] ? [21:19:19] let me see if i can find it [21:20:09] yurik: https://bugzilla.wikimedia.org/show_bug.cgi?id=41345 [21:20:14] i might have seen the bug, but the important question is not "can i have this feature", but "how is it currently stored, and can i help you with implementing it" - considering that i wrote the prop=langlinks [21:20:38] yes then best is to post a question in that bug and i'll prod people to give you an answer [21:20:51] oki, will try [21:20:51] what >_> I can't even tweet an animated image ... daaah [21:20:53] :) [21:21:07] Lydia_WMDE is very animated! [21:21:12] haha [21:21:40] Lydia_WMDE: why don't property edits have diffs? http://www.wikidata.org/?diff=6717164&oldid=1395360&rcid=6729859 [21:21:41] yurik: ever tried to attach lydia to a tweet? [21:22:01] Jasper_Deng: missing feature - will come on monday :) [21:22:07] JeroenDeDauw: !!! [21:22:11] i'm sure it will be buggy :( [21:22:30] yurik: awww - but but but! [21:22:54] see - its already buggy - it should bee tweet tweet tweet [21:23:03] lol [21:23:47] tweet tweet tweet tweet [21:23:52] ^just for you yurik :P [21:23:53] Remember everyone; Wikidata:Administrators/Confirm 2013/5 ends in 3 hours and we need people to make the closure. (we forgot about it last time -.-) [21:24:08] Riley: ohh good point! [21:24:40] yurik: probably would end up being classified as spam anyway [21:24:55] * Lydia_WMDE marks JeroenDeDauw as spam [21:25:15] * Hazard-SJ wants to close them :) [21:25:16] * yurik remembers that spam has a different meaning [21:25:31] especially in germany [21:25:35] Just that I've never been around to do so :( [21:26:32] alright folks - /me is away for a bit - back sometime later again [21:29:11] OK [21:29:55] ta ta [21:32:47] Hazard-SJ, what's up with this? https://www.wikidata.org/w/index.php?title=Q386724&action=history [21:32:59] re. https://www.wikidata.org/wiki/Talk:Q386724 [21:33:15] that item is used as a main type property for many pages (see https://www.wikidata.org/wiki/Special:WhatLinksHere/Q386724 ) [21:35:35] Lydia_WMDE: yurik: ehh - I do hope no one is going to try to eat me now :0 [21:37:32] Jhs: Weren't they using Q574288? *facepalm* [21:38:14] Hazard-SJ, nope https://www.wikidata.org/wiki/Special:WhatLinksHere/Q574288 [21:38:17] :P [21:42:36] !nyandata is https://bit.ly/nyandata [21:42:36] Key was added [21:43:08] thank you wm-bot [21:43:34] Jhs: Might as well undo my changes and create a separate item for the disambig :/ [21:44:34] yeah, probably [21:48:40] Did anyone already write a "who is your daddy?" bot for Wikidata? [21:49:04] Jhs: Done [21:49:13] Hazard-SJ, nice :) [21:49:16] multichill: I didn't [21:49:40] That should be fun [21:50:20] Maybe I'll write it tomorrow. Just loop over all the royal infoboxes and grab father and mother [21:52:47] either there is a major incest going on, or this site is broken http://toolserver.org/~magnus/ts2/geneawiki/?q=Q1339 [21:53:26] yurik: Works for me [21:53:45] does it show two arrows between most names? [21:54:09] Oh wait, after reload it changes [21:54:30] oooh, me stupid.
they are going in the opposite dir [21:54:32] my bad :) [21:54:54] yurik: Check out http://toolserver.org/~magnus/ts2/geneawiki/?q=Q855749&lang=de [21:54:55] but in that case it is broken because some only have one link [21:55:12] guess it will be a good job for a bot to fix those [21:55:17] * Hazard-SJ will be back [21:55:24] multichill et al, do you know/have any scripts that are easy to reuse for adding labels and descriptions? i'm not much good at writing bots, but i'm pretty good at using them once they're written :P [21:55:42] welcome to another nightmare - where fixing one will require an immediate fix of the other, otherwise a bot will restore it :) [21:56:20] yurik, tell me about it. i was struggling with that for hours when fixing an interwiki conflict on ~10 Wikipedias about a year ago :P [21:56:46] hehe, but wikidata was supposed to fix this, not break it! [21:56:47] =) [21:56:55] afaik pywikipedia supports it now [21:57:03] problem was that there were three different subjects all interlinked in different languages, and the bots kept readding links that i was removing, even though I was fast as lightning [21:58:05] multichill, i've seen https://www.mediawiki.org/wiki/Manual:Pywikipediabot/Wikidata but i'm not sure what to make of it [22:22:11] Jhs: pywikibot wikidata support isnt very good at the moment [22:22:31] i just forked my own repo and added in what i needed [22:23:39] multichill: woah, that tool is pretty awesome [22:24:05] duh: Please merge with trunk to keep it in one place [22:24:37] multichill: its on the rewrite branch, and i submitted a patch 1, maybe 2? weeks ago and no one merged it yet [22:25:37] multichill: https://sourceforge.net/tracker/?func=detail&aid=3601600&group_id=93107&atid=603140 [22:25:55] fscking sourceforge [22:26:34] * duh hates it too [22:26:54] so i just stopped submitting patches if no one was going to merge them [22:34:23] actually multichill, dont merge that one ill submit a better patch in a bit [22:46:16] Hazard-SJ: thanks [22:55:48] https://www.wikidata.org/wiki/Property:P107 <- que? [22:58:14] Jhs: Do you speak (or should I say, write) PHP? [22:58:33] Hazard-SJ, nope. i can read some, but that's about it [22:59:03] duh: Thanks for what? :? [22:59:10] EdwardsBot [22:59:23] Oh :P You're welcome :D [23:27:51] yay, i now have 10 000 edits on wikidata :D [23:28:30] O_o [23:28:43] Three months and I have only a bit over 3k :P [23:29:28] most (~85 %) are from this week [23:29:31] :P [23:30:22] hehe [23:30:57] right now i'm going through https://en.wikipedia.org/wiki/Category:International_rivers_of_Africa and adding main type (place) and country to those rivers. plus label and description in Norwegian [23:30:59] * duh needs someone to approve his bot [23:31:07] Jhs: my bot can do all the properties [23:31:23] duh, but so can i :P [23:31:32] my bot does it automatically :P [23:31:37] you just feed it the category [23:31:43] property/item, and it does it [23:32:22] how about country? the categories for these articles are… diffuse [23:33:03] can it be parsed using a subcat? [23:33:09] right now it doesnt check article text [23:33:10] https://www.wikidata.org/wiki/User_talk:Legobot/properties.js [23:33:15] and https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Legobot_4 [23:33:47] Ajraddatz: you didnt comment on it..would you be willing to close the request? [23:34:25] yeah [23:34:29] in a bit, busy with something else now [23:34:35] ok thanks [23:40:18] duh, that's actually awesome.
i'm filling out a request for rivers in Africa now (might take some time…) [23:40:30] :D [23:40:46] last night i wrote a script to help me generate the requests [23:40:56] it's funny how i need a script to use a script [23:42:41] lol [23:43:19] i plan to expand it soon to be able to parse templates and stuff, but that needs some more optimization [23:46:31] question! i am new to wikidata and want to learn how to use it. so...i'm trying to add properties to the psychedelic frogfish entry. but it won't let me save Buck Randolph as the "discovered by" property [23:47:15] link me to the enwiki of Buck Randolph? [23:47:23] i don't think he has a page [23:47:26] ah [23:47:34] so he needs to have a wikidata item before you can add it [23:47:43] okay :) [23:47:48] and to have a wikidata item, he needs to have at least one sitelink on a wmf project [23:47:53] there is a proposal to change that though [23:47:59] aw [23:48:16] so i should leave buck randolph out then [23:48:32] or create the buck randolph article on enwiki :P [23:48:38] heh fat chance [23:48:44] or dewiki, simplewiki, etc [23:52:23] :o i am surprised there isn't a "has" property [23:52:30] "is a" [23:52:44] but yurik has started a deletion request on it [23:52:56] "is a" fish with [[binocular vision]] ? [23:53:00] because it really *is a* terrible property [23:53:17] heh [23:53:25] BobTheWikipedian: you can propose new properties [23:53:32] maybe "vision" ? [23:53:39] "is a vision" [23:53:41] nah [23:54:02] https://www.wikidata.org/wiki/Wikidata:PP [23:57:38] 'has a' != 'is a' [23:58:14] http://en.wikipedia.org/wiki/Has-a, http://en.wikipedia.org/wiki/Is-a [23:59:07] heh [23:59:11] did not know that, thanks [23:59:35] BobTheWikipedian: you forgot to sign lol. [23:59:40] yes thanks [23:59:43] lol [23:59:54] i hit save again right as you said that
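On Jhs's earlier question about reusable scripts for labels and descriptions: since pywikibot's wikidata support was still thin at the time, a bare-bones client talking straight to the real wbsetlabel/wbsetdescription modules needs very little code. A sketch assuming a logged-in session; the meta=tokens workflow shown here postdates this chat, and the item id and values in the usage comment are made up.

```python
import requests

API = "https://www.wikidata.org/w/api.php"
session = requests.Session()  # a logged-in session is assumed for real edits

def csrf_token():
    r = session.get(API, params={
        "action": "query", "meta": "tokens", "format": "json"}).json()
    return r["query"]["tokens"]["csrftoken"]

def set_label_and_description(qid, lang, label, description):
    token = csrf_token()
    for action, value in (("wbsetlabel", label),
                          ("wbsetdescription", description)):
        r = session.post(API, data={
            "action": action, "id": qid, "language": lang,
            "value": value, "token": token, "format": "json"}).json()
        if "error" in r:
            raise RuntimeError(r["error"])

# e.g. a Norwegian label/description for an African river item (made-up id):
# set_label_and_description("Q000000", "no", "Kunene", "elv i Afrika")
```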