[00:50:16] are any of our many many German users around? [00:50:49] At nearly 2am, I doubt it [00:51:08] * addshore waves [00:51:09] lol 2am is when i start my admin rounds someday ;) [00:51:37] addshore: vous êtes allemand ? [00:53:05] Why are you talking in french? [00:53:24] because I don't speak German :P [00:53:29] nein [00:53:45] Sprechen Sie Englisch? [00:53:55] DAS IST VERBOTEN. [00:54:17] sim! [00:55:21] addshore: btw there's and !_admin request in #wikipedia-en [00:55:24] *an [00:55:57] * addshore will wait 5 mins to see if anyone else deals with it :/ [00:56:03] i have my own backlog to deal with currently :/ [01:14:29] what language is nvwiki? O_o [01:15:12] 'nv' => 'Diné bizaad', # Navajo [01:17:16] cheers! [01:18:27] fyi, languages/Names.php in core [01:20:35] :) [01:28:00] btw rschen7754 https://www.wikidata.org/wiki/Wikidata:Bot_requests#Merge_P:P127_and_P:P165 [01:28:19] yeah that's what i proposed originally [01:28:27] but somebody had to rename it and create a separate one [01:28:29] *sigh* [01:29:42] remind me to update the requests remaining [01:31:02] ok [01:31:08] someone asked for a request to migrate them [01:58:09] Where can I translate messages for Wikidata? TWN? [01:58:30] yes [01:58:55] everything should be there except for some gadgets which are done locally [02:03:50] Please see the text "Also known as:" in every wikidata entry. The translation of this message to 'ml' language is incorrect. I want to change it. I didnt find it on TWN [02:04:12] legoktm [02:05:05] * legoktm looks [02:07:34] that should be the message "wikibase-aliases-label" ? [02:10:19] legoktm: please let me check [02:11:21] legktm: can you give the link please? 
[02:11:41] i just found https://www.wikidata.org/wiki/MediaWiki:Wikibase-aliases-label [02:11:50] lemme look on TWN [02:13:04] hmm [02:13:06] i dont see it [02:13:54] vssun: i asked in #mediawiki-i18n [02:17:29] got it and changed [02:18:26] I was looking at Wiki Lexical Data [02:18:36] Thank you very much legoktm [02:18:55] oh great :) [03:40:33] hey everyone, I just found out about wikidata and think its very promising...but the phase 1 confuses me. havent interlanguage links already done in wikipedia?? [03:40:50] they are done, but differently [03:41:13] instead of every wikipedia linking to every other wikipedia [03:41:13] what makes this one better? [03:41:25] you only have to update wikilinks in one place [03:41:28] not on 200+ wikis [03:41:32] now we have every wikipedia getting the same set of links from one place [03:41:33] *langlinks [03:41:42] if you add a page that is in another language on another projec [03:41:53] do you want to edit one wiki, or 200 wikis? [03:41:56] :) [03:42:09] it's simple [03:42:28] plus it has some other information, such as description and claims (like founder, owner, etc.) [03:43:09] i see, that makes sense. [03:43:19] are there any plans to machine translate wiki pages? [03:43:43] machine translation isn't completely accurate, but someone could take a page and run it through a translator [03:43:50] or have a human translate it [03:45:19] ok; so now the easiest way to contribute would be to improve a random item...does that simply mean translating words to a different language? 
[03:46:28] you need to change the language of the entire wiki for your view to add labels, descriptions, aliases, and claims in another language [03:46:34] it's simple, it's at the top right next to your name [03:46:50] then you add non-interwiki related data in another language [03:47:33] for example, i came across this page [03:47:33] http://www.wikidata.org/wiki/Q180767 [03:47:47] it doesnt have an english translation, so i can edit and add an english one? [03:48:44] yes [03:48:59] if you want to add an english page, you need to place it on the english wikipedia [03:49:07] but you can add the other info on wikidata [03:50:55] thanks...so i looked up the word "mathematics" [03:50:55] http://www.wikidata.org/wiki/Q1952935 [03:51:16] the list of languages is rather short...is there a reason why the wikipedia language translations arent directly used here? [03:51:44] i.e. when i go to page mathematics, i can go to the term in other languages where i can get a translation of the word [03:52:19] wait...is categorized as album? one sec lol [03:53:05] http://www.wikidata.org/wiki/Q1908348 mathematics as artist/person?? [03:53:39] http://www.wikidata.org/wiki/Q401540 [03:53:42] and one more [03:53:46] did you read https://en.wikipedia.org/wiki/Mathematics_%28producer%29 ? [03:54:35] i did not, thanks for pointing that out...i still find it odd that the singer ranks higher than the subject [03:54:57] how is the ranking algorithm executed? [03:55:02] yeah search is messed up [03:55:04] its broken [03:56:20] thanks, i will look into discussion page to see whats being plan to fix the search [03:56:46] also when a term has ambiguous meanings, will it always show up as separate terms as in the example of mathematics? [03:56:58] planned* [04:01:15] back...so ppl apparently know the search is broken but i didnt see any plan to get it fixed [04:06:11] they are still working on it [04:14:25] thanks guys! 
[04:52:29] Hey all [04:52:56] I see that the thread about opting out of global sysops has not been touched in almost 5 days now [04:54:53] in my opinion someone who didn't vote should just close it and dispose of it as they see fit [04:57:20] I did a conditional support [04:57:30] But as I see it, consensus was not achieved [04:58:15] Removing my vote, I see seven supports against three opposes [04:58:42] 70%…. right on the line [04:58:59] Yes [04:59:26] Although I don't consider that it is enough to make such a decision [04:59:31] What do you think? [04:59:43] i wonder if punting it to the stewards is a possibility [05:00:32] I was about to do this: Close it as no consensus and open an RFC [05:02:24] My bot has 5k edits to go before reaching a million! [05:03:29] I think its the best course of action :) [05:03:34] Epic legoktm, congrats [05:04:03] Someone told me my bot is editing at 100epm :P [05:04:15] I think it's slowed down now though [05:04:28] Btw legoktm... You owe me a bot :P [05:04:34] Oh yeah >.< [05:04:39] Sorry [05:04:43] Haha [05:04:45] I'm really backlogged right now [05:04:46] DOn't worry :) [05:04:52] Maybe you should ask on WP:Bot requests? [05:05:06] Vacation9 helped me [05:05:09] ^^ [05:05:54] oh good [05:07:23] Well, I will close the discussion and open an RFC about it tomorrow [05:08:43] ok [05:08:55] i guess what works [05:08:59] that* [05:09:51] Yeah [05:10:19] I've imported 46k people's data into my database from itwiki so far. [05:12:05] Good [05:20:54] I am starting the RFC page now [05:28:04] 1.5K left [06:01:31] [12:01:25 AM] <+legoktm> !count Legobot@wikidata.org [06:01:31] [12:01:26 AM] Edit count: 1002435 [06:01:35] :DDDDD [06:02:11] :) [06:06:14] legoktm: where did you do that?
[06:06:33] in my channel [06:06:43] ##legoktm [06:06:43] [1] https://www.wikidata.org/wiki/ [06:06:47] heh [06:06:53] am I able to do it [06:07:13] yeah [06:09:03] i thought i had it so you could pm the bot [06:09:05] I feel like using !count on everything but I'm going to bed soon because I don't feel good [06:09:07] but idk if that works [06:09:10] aww :( [06:09:12] bonus [06:09:19] !editcount User gives you global editcount [06:10:39] Congratulations legoktm [06:10:53] now i need to fix pywikipedia :D [06:12:23] "Thank you for reporting." [06:12:24] aw [06:13:20] a bot filled RC [06:20:50] Well, it is a pleasure to talk, but it's time for me to leave [06:22:05] o/ [06:22:09] legoktm: one should only attempt to fix the fixable... [06:22:18] heheheh [06:22:33] legoktm: are you going to be in amsterdam? [06:22:40] Possibly [06:22:47] i need to figure out whom i need to talk to @ hackathon [06:22:58] multichill mentioned scholarships so I would need one of those and talk to my parents [06:23:31] it sux to travel when you are under 12 :( [06:23:46] >.< [06:23:52] you need all the guardian permissions [06:23:54] bleh [06:23:56] I'm 18 tyvm. [06:23:57] :-P [06:24:19] teasing [06:24:35] i wonder whom i would see from wikidata team [06:24:53] * yurik feels old [06:25:07] * yurik doesn't like feeling old [06:48:29] benecloudstar: around? [07:25:41] New review: Jeroen De Dauw; "(1 comment)" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/46686 [07:31:23] New review: Jeroen De Dauw; "(1 comment)" [mediawiki/extensions/Wikibase] (master) C: -2; - https://gerrit.wikimedia.org/r/46686 [07:39:36] does wbgetentity not support an &oldid= parameter? [07:39:48] how do i get content of old revisions? [07:42:12] New review: Jeroen De Dauw; "(1 comment)" [mediawiki/extensions/Wikibase] (master) C: 2; - https://gerrit.wikimedia.org/r/52655 [07:42:15] New patchset: Jeroen De Dauw; "Improve term normalization."
[mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52655 [07:43:18] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52655 [07:43:38] Pink|sulking{jk}: are you now the SUL king? [07:46:30] he's sulking. [07:46:42] * legoktm pokes yurik [07:47:16] actually [07:48:57] JeroenDeDauw: do you know if there is anyway to use action=wbgetentities with an oldid? [08:28:25] ok [08:28:28] i'm somewhat confused now [08:28:41] if https://www.wikidata.org/w/api.php?action=query&prop=revisions&titles=Q42&rvprop=ids|content&format=jsonfm works fine [08:29:00] whats the point in having a separate wbgetentities module? [08:29:12] is it because you can't do stuff like props? [08:29:58] oh and the lookup with site+pagename [08:32:23] yurik: around? [09:50:30] legoktm: around? [09:50:39] hi [09:50:42] Hi :) [09:50:53] It's about the summaries of your bot on es.wiki [09:51:15] The translation sounds quite strange :) [09:51:21] does it? [09:51:25] i pulled it from addshores list [09:51:36] https://meta.wikimedia.org/wiki/User:Addbot/Wikidata_Summary [09:51:39] Well, Addbot's summaries are much, much better [09:51:45] let me check if it got updated in the meantime :/ [09:51:52] Example [09:52:01] 2013-03-07 23:07:36 - 64413595 - Addbot (-1286) Moviendo 55 enlaces interlingúisticos, ahora proporcionado(s) por [[d:|Wikidata]] en la página [[d:q131080]]. [09:52:01] [2] https://www.wikidata.org/wiki/13 => [09:52:03] [3] https://www.wikidata.org/wiki/q131080 [09:52:10] * 2013-03-07 23:28:03 - 64415545 - Legobot (-23) Quitando 1 enlaces entre-wiki, proviendo ahora por [[d:|Wikidata]] en la página [[d:q131080]].
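[Editor's note] wbgetentities indeed has no oldid parameter; the generic action=query route legoktm mentions above ([08:28:41]) is the way to read old revisions of an entity. A minimal sketch of building such a request — the helper function and its name are illustrative, while the API parameters (rvstartid, rvlimit) are standard MediaWiki prop=revisions parameters:

```python
from urllib.parse import urlencode

API = "https://www.wikidata.org/w/api.php"

def old_revision_query(title, oldid=None):
    """Build an action=query URL that fetches entity content for a
    specific (old) revision -- something wbgetentities cannot do."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "ids|content",
        "format": "json",
    }
    if oldid is not None:
        # Start listing at the given revision and take only that one.
        params["rvstartid"] = oldid
        params["rvlimit"] = 1
    return API + "?" + urlencode(params)
```

Since an entity page's content is its JSON serialization, the revision text fetched this way can be passed straight to json.loads().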
[09:52:13] oh yeah [09:52:17] looks like someone updated it [09:52:29] let me fix that [09:53:07] Ok, thanks :) [09:54:44] done [09:54:47] thanks for pointing that out [09:55:10] Np :) [09:55:59] i'm rather amazed at how interwiki bot runners manage to keep up to date when running bots on so many wikis [09:56:47] i cant even remember what wikis its running on without checking a list :/ [09:56:58] Yes, anyway those bots will stop working soon... [09:57:41] :) [09:59:18] * jem- waves again for Wikidata, never too much :) [10:03:07] Mmmmmm I see there's a little error on the final translation [10:03:27] interlingúisticos -> interlingüísticos [10:03:32] Well, not so little [10:03:42] I'll update the page on Meta [10:05:24] wait which should it be? [10:05:44] De second one [10:05:48] ok [10:05:48] *The [10:05:51] And... [10:06:01] if it says "proporcionado(s)" [10:06:14] It should also say "enlace(s) interlingüístico(s)" [10:06:32] The same "problem", singular o plural [10:06:34] or* [10:06:38] Argh :) [10:07:02] how about you update it on meta and ill just copy that :P [10:07:12] I'm on it :) [10:09:19] {{Done}} [10:09:19] How efficient, jem-! [10:10:24] ok updated and restarted [10:16:43] Thanks again :) [10:17:25] no problem [10:19:59] BTW, I'd like to ask if it is intended or known than in history pages the interwikis aren't visible but the "edit links" link is [10:20:32] that* [10:21:18] hmm [10:21:23] that might be a bug? [10:21:25] not sure [10:21:41] Well, I guess the coherent would be none or both [10:21:52] Maybe Lydia_WMDE or Denny_WMDE can tell ? 
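[Editor's note] The "(s)" problem jem- describes above disappears if the bot selects a whole summary per grammatical number instead of patching suffixes onto one template. A sketch of that idea — the Spanish strings here are illustrative, not Addbot's actual summary templates:

```python
def es_removal_summary(count):
    """Return a grammatically agreeing Spanish edit summary for
    migrating `count` interlanguage links to Wikidata
    (illustrative wording, not Addbot's real template)."""
    if count == 1:
        return ("Moviendo 1 enlace interlingüístico, ahora proporcionado "
                "por [[d:|Wikidata]]")
    return ("Moviendo {} enlaces interlingüísticos, ahora proporcionados "
            "por [[d:|Wikidata]]".format(count))
```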
[10:22:08] * Lydia_WMDE reads up [10:22:10] sec [10:22:23] :) [10:22:37] ah [10:22:42] i think we have a bug for that [10:22:44] let me look [10:23:21] we have https://bugzilla.wikimedia.org/show_bug.cgi?id=45838 [10:24:13] Perfect :) I'll follow it [10:25:18] :) [10:54:22] DanielK_WMDE_: ping [11:00:18] New patchset: Tobias Gritschacher; "(hotfix) adjusting selenium tests to use NewItem instead of CreateItem" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52783 [11:00:30] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52783 [11:00:50] Silke_WMDE: hm? [11:03:46] DanielK_WMDE_: I have contradictory OAI extension settings [11:04:08] 1) $oaiAgentRegex = '!.*!'; [11:04:09] $oaiAuth = false; [11:04:09] $oaiAudit = false; [11:04:18] and 2) [11:04:27] @include( $IP.'/extensions/OAI/OAIRepo.php' ); [11:04:27] $oaiAgentRegex = '/experimental/'; [11:04:27] $oaiAuth = true; [11:04:27] $oaiAudit = true; [11:04:27] $oaiAuditDatabase = 'oai'; [11:04:40] what do want on test? [11:07:05] New patchset: Jens Ohlig; "Lua support to access the repo data (DO NOT MERGE)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52784 [11:16:34] New patchset: Jens Ohlig; "Lua support to access the repo data (DO NOT MERGE)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52784 [11:27:01] DanielK_WMDE_: ^^ [11:29:54] Silke_WMDE: 2) is for production, 1) is for test and dev [11:30:11] ok, thanks! [11:30:24] we could use 2) on test, if we set up the audit database (or doe we already have that?), but it's pretty pointless. 
[11:30:45] I don't think we have it [11:30:58] Silke_WMDE: basically, 1) turns off authentication for the OAI interface, making testing easier [11:30:59] I'll use 1) [11:31:04] ah ok [11:31:24] perhaps add a comment to that effect [11:31:31] ok [11:35:24] DanielK_WMDE_: What is confusing: 1) doesn't have the @include line nor a require_once [11:36:02] * aude waves [11:36:22] * Silke_WMDE waves back to aude [11:37:33] anything urgent to do? [11:39:40] aude: sent you a PM with a question [11:39:47] ok [11:53:12] New patchset: Tobias Gritschacher; "(testing) added selenium tests for string datatype" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52785 [12:01:38] Silke_WMDE: it needs a require_once. maybe that is elsewhere in the file? [12:01:45] otherwise, we just forgot :) [12:03:02] ok, thx! [12:03:30] FOooooood! Anyone? [12:06:21] hi all, I want to help wikidata, more specifically remove local iw links from svwp, how do I most easily do this if I have a pywikipedia bot running with bot rights on svwp? I'm a tech noob so reasonably detailed instructions would be appreciated [12:28:00] Grillo: hey :) [12:28:11] Grillo: you'll want to talk to addsleep and legoktm i think [12:28:20] Hi [12:28:24] hi :) [12:28:30] see my message above [12:28:57] I'm guessing the addbot owner is asleep :) [12:29:10] He is :P [12:29:30] My pywikipediabot is a bit complicated to set up [12:29:38] but I can show you through the steps if you'd like [12:29:59] sure, I already have a pywikipediabot but I guess I'm missing a lot of stuff :) [12:30:17] https://github.com/legoktm/wikidata/blob/master/enwiki_removal.py (the filename is misleading) [12:30:35] do you have a toolserver or wmf labs account? [12:30:41] nope... [12:30:56] do you have a local dump of svwiki? [12:30:58] thats what my bot runs on [12:31:39] According to [[WD:Wikidata migration]], addbot should be running through svwiki soon... 
[12:31:40] [6] https://www.wikidata.org/wiki/WD:Wikidata_migration [12:32:03] it has been running, but only for an hour or so per day [12:32:15] hmmm [12:32:16] maybe it will start running faster soon [12:32:25] in that case I don't need to do anything [12:32:42] ill poke him when he wakes up [12:32:54] http://sv.wikipedia.org/wiki/Special:Bidrag/Addbot so far it's only done about 7000 edits [12:32:56] i think it runs a cron and dies after making a certain number of edits [12:33:03] just to make sure it doesnt overwhelm smaller wikis [12:33:06] ah [12:33:25] maybe it's better to wait than go into stuff that's more complicated than I'd like to get myself into :P [12:33:35] heh [12:33:51] my script overrides some of pywikipediabots internal checks [12:34:08] which is why it requires some additional setup [12:34:46] you know, I'll just wait and make some manual contributions here and there :P [12:35:09] you might find my tool https://toolserver.org/~legoktm/cgi-bin/wikidata/checker.py useful :) [12:35:27] it analyzes the local page text and tells you whether its ready to migrate to wikidata [12:36:28] its a bit slow but it also handles redirect checking [12:36:35] it would be nice if that page had an automated edit function, like wikicleaner [12:37:06] thats past my skillset unfortunately :/ [12:37:44] at least it's useful for me, thanks :) [12:37:45] i had it so Legobot would automatically clean the page on approved wikis, but it made it sooooo much slower and had a few un-reproducable bugs so i turned it off [12:37:52] yw :) [12:39:19] oh hmmm, does svwiki have autoapproval for globalbots? [12:39:42] if so i can run my bot over it [12:40:56] more wikis opting into phase 2 early? :D [12:42:56] Grillo: !
[12:43:01] * addrawr is awake now :) [12:43:05] :D [12:43:25] please see http://sv.wikipedia.org/wiki/Special:Bidrag/Addbot [12:43:28] ;p [12:43:42] legoktm, I don't think so, but it's usually easy to get at http://sv.wikipedia.org/wiki/Wikipedia:Robotans%C3%B6kan [12:43:46] I can't approve it though [12:44:03] sv is global bots :) [12:44:07] ok :) [12:44:24] I don't know my own version... [12:44:28] :D [12:44:33] hi [12:44:38] hi Base ! [12:44:40] New patchset: Aude; "(bug 43998) initial implementation of property parser hook" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46686 [12:44:53] New review: Aude; "rebased" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46686 [12:45:13] addrawr: auto-approval? [12:45:32] i'm filing the request anyways [12:45:34] hmm not sure, I have a local flag too :O [12:45:36] addrawr are you add bots operator? [12:45:42] Base: yes [12:45:49] ok [12:46:48] Grillo: https://sv.wikipedia.org/wiki/Wikipedia:Robotans%C3%B6kan#Legobot [12:47:04] What language it has been writen? [12:47:48] Base: addbot is PHP, legobot is python [12:47:59] legoktm, so your bot does merliwbot's job too? [12:48:17] that is https://sv.wikipedia.org/wiki/Special:Bidrag/MerlIwBot [12:48:41] errr no, merliwbot is just a standard interwiki bot i think. [12:48:51] How it works? [12:49:03] yeah, but now it just seems to clean up after migrated articles, where some links stayed behind :P [12:49:13] It removes if wd and local are same? [12:49:19] yes [12:49:33] If not same then what? [12:50:44] it leaves it for a human to figure out [12:50:58] my bot will generate lists in its userpage based on which wiki has the conflict [12:51:30] New patchset: John Erling Blad; "(Bug 44876) Add toc to EntityView" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52796 [12:51:39] addrawr, http://sv.wikipedia.org/wiki/Special:Bidrag/Addbot it's not currently running? 
[12:53:07] hmmm Grillo I will check O_o [12:53:51] addrawr, do you keep it running on several languages at the same time? because it's been stopped for 10 hours now :P [12:53:57] Grillo: cron seems to have broken [12:54:12] * addrawr thinks it may have stopped in many places [12:54:35] yep it stopped everywhere at 3:20 as I broke cron :p [12:54:53] * addrawr goes to fix [12:54:59] :) [12:55:04] it uses api, yep? I want to create smth like that too [12:56:05] Base: the sourcecode is all up on github, fine my repos at /addshore [12:56:18] and yes Grillo it runs on most wikis :P [12:56:39] ok, I thought you were switching, and thus I thought this would take years :P [12:56:59] i dont know php so it is good just to see algorythm :) [12:57:03] I am running the script every 20 mins for wikis without a translated summary [12:57:09] and every 5 mins for places with a nice summary [12:57:12] what wiki are you from? [12:57:24] enwiki :) [12:57:53] ah terrible wiki :D [12:58:13] and it's on again :) [12:59:29] when was the import of iw links to wikidata done? it seems most of that work is already done [12:59:30] Base: its not that bad ;p [12:59:40] Grillo: its still happening [12:59:46] Grillo: it has been going on for months and is still happening ;p [13:00:00] just whenever a bot that can do it find a page that isnt imported, it will [13:00:16] addrawr, how does your robot choose articles to edit? [13:00:32] it seems random... [13:00:33] It has more rules than some small wikis. Also there my bot is banned :D [13:00:34] Grillo: legoktm ran a script against the DB sumps finding pages with interwiki links on them [13:00:53] Grillo: they're probably in order by page_id or something [13:00:58] ah [13:02:00] addrawr, did it crash again or is it constantly searching for articles to edit, cause it hasn't done anything for 5 minutes? [13:02:10] (not trying to be impatient :P ) [13:02:26] * addrawr checks [13:02:35] Does you run it in uk? 
[13:02:44] *Do [13:03:05] its still going, I may increase the rate a bit [13:03:11] Base: check https://www.wikidata.org/wiki/Wikidata:Wikidata_migration [13:03:22] it runs on wikimedia labs [13:03:47] Ah idk how to make a server bots [13:03:53] New patchset: John Erling Blad; "(Bug 44876) Add toc to EntityView" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52796 [13:04:05] ive registrated on labs but idk to do there [13:04:28] * Base is going to learn wd api [13:05:12] legoktm, have you started importing data from the gender categories on svwp? :) [13:05:20] that already finished :DDD [13:05:22] noticed http://www.wikidata.org/wiki/Q5559872 [13:05:23] ah [13:05:25] nice :) [13:05:54] Grillo: https://www.wikidata.org/w/index.php?title=Q6000000&action=history [13:06:54] hmm, was that the first instance of wikidata vandalism? :) [13:10:15] hi from pc [13:11:57] Grillo: i wish. there was some football referee who blew a call. oh man :/ [13:12:21] ended up semi'ing the page and revdel'ing at least 10 edits [13:12:36] did 6 million get extra attention, considering all the edits? [13:12:42] probably, its linked on the main page [13:14:22] https://www.wikidata.org/wiki/Q3333333 now I'm thinking about creating that on svwp :P [13:15:14] Hello everyone. [13:15:32] hi nullzero [13:15:54] Why every link in my bot's page is broken? 
[13:16:13] https://www.wikidata.org/wiki/Special:Contributions/Nullzerobot [13:16:26] I mean diff link [13:16:35] nullzero: thats a known bug [13:16:43] it should be on project chat [13:16:43] Ahh [13:16:57] Thank you :) [13:17:08] if you go to the page history [13:17:14] the buttons will work [13:17:14] https://www.wikidata.org/w/index.php?title=Q572718&diff=10211078&oldid=3016597 [13:17:26] legoktm: 300,000 down ;p [13:17:32] damnnn [13:17:40] btw [13:17:47] my bot passed 1mil edits on wikidata :) [13:17:52] :D [13:18:11] thats easy legoktm to get :P [13:18:21] :/ [13:18:39] espacially if you have a category which haves ~75k pages in it [13:18:45] x2 mostly for the edits [13:18:50] and thats 150k :P [13:19:01] heheheh [13:19:27] your bots edit 2 times per pages mostly so yea.. :) [13:19:47] i'm working on a fix to that. [13:20:01] DanielK_WMDE_: poke [13:20:02] https://gerrit.wikimedia.org/r/#/c/52797/ [13:20:38] how to get to what data page is connected? [13:21:08] Base-w: ?action=wbgetentities&sites=enwiki&titles=Main Page [13:21:10] * aude not sure adding a second cron job is that simple to improve things or more we should do [13:21:35] legoktm: it is in wikidatawiki? [13:21:36] like maybe having a way to assign certain cron jobs a set of wikis to work on [13:21:54] Base-w: errr sorry, what are you asking? [13:22:15] this line to add to wikidata.org ? [13:22:31] legoktm: ^ [13:22:50] wait are you talking about API for for humans? [13:23:01] API [13:23:15] for bots sure [13:23:17] https://www.wikidata.org/w/api.php?action=wbgetentities&titles=Main%20Page&sites=enwiki [13:23:22] New review: Aude; "(1 comment)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52421 [13:24:24] legoktm: thanks. and what i need to check it is sitelink tags? [13:24:52] prop=langlings returns both local and wd links? 
[13:25:22] prop=langlinks on the client site will return both [13:25:26] New patchset: John Erling Blad; "(Bug 44876) Add toc to EntityView" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52796 [13:25:39] you need to use wikidata API to see what only wikidata sees [13:26:06] hi, can someone invite me to #wikidata-admin? [13:26:13] * #wikidata-admin [13:26:23] give me a seco [13:26:25] sec* [13:26:27] thanks legoktm [13:26:29] so if prop=langlings and ?action=wbgetentities&titles=Main%20Page&sites=enwiki are same i can remove links? [13:27:35] New patchset: John Erling Blad; "(Bug 44876) Add toc to EntityView" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52796 [13:28:12] Base-w: thats one way to check. its probably best if you check against the raw page text vs API i think. [13:29:19] review 52796 anyone? [13:30:01] legoktm: raw text and wikidata's API you mean? [13:30:10] yes [13:35:29] New patchset: Aude; "(bug 43998) initial implementation of property parser hook" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46686 [13:36:01] Henning_WMDE: Jeblad_WMDE: DanielK_WMDE_: weekly summary please. _now_ i need to get it finished [13:36:01] thanks [13:36:07] AnjaJ_WMDE: you too [13:37:00] interwikilink it is \[\[[a-z-]{2,}\:[^]] \]\] [13:37:00] [7] https://www.wikidata.org/wiki/a%2Dz%2D [13:37:00] ? [13:37:00] or there is some bettew regexp? [13:37:15] [^]]+ [13:37:16] sure [13:37:32] *better [13:37:58] Base-w: look and see what pywikipediabot uses? [13:38:15] could you give a link (again) [13:38:26] or you mean interwiki.py ? [13:39:03] i mean that [13:39:03] https://svn.wikimedia.org/svnroot/pywikipedia/branches/rewrite/pywikibot/textlib.py [13:39:11] ctrl+f on "def getLanguageLinks" [13:39:42] How do I start in Wikidata:Project chat in my language? [13:39:55] omg [13:40:12] is this one not interlangual? [13:40:25] Привет всем! :P [13:42:08] manzzzz: i'm not sure actually.
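[Editor's note] Base-w's pattern above, with the "[^]]+" correction applied, might look like this in Python. This is a deliberately simplified stand-in for pywikibot's textlib.getLanguageLinks, which handles many edge cases (comments, nowiki sections, exceptional prefixes) that a bare regex does not:

```python
import re

# Language code of two or more lowercase letters/hyphens, a colon,
# then the target title; an optional |label part is tolerated.
IW_RE = re.compile(r"\[\[([a-z-]{2,}):([^\]|]+)(?:\|[^\]]*)?\]\]")

def get_language_links(wikitext):
    """Return {lang: title} for plain interwiki links in wikitext."""
    return {m.group(1): m.group(2).strip()
            for m in IW_RE.finditer(wikitext)}
```

To answer the "can I remove links?" question above: only when the local set extracted this way matches the sitelinks that wbgetentities returns for the same page — otherwise leave the conflict for a human, as the bots in this discussion do.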
maybe just create it? [13:42:13] and add the relevant links [13:42:36] addrawr, I hate to be a pest, but it stopped 20 minutes ago again :P [13:42:48] >.< [13:42:52] * addrawr checks [13:43:26] * addrawr rarws [13:44:50] New patchset: John Erling Blad; "(Bug 44940) Add fragment to sitelink" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52799 [13:45:07] woot [13:45:26] Legobot is running fully on every wiki where it can starting with an "A". [13:45:43] New patchset: John Erling Blad; "(Bug 44940) Add fragment to sitelink" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52799 [13:46:39] Change abandoned: John Erling Blad; "Covered by" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49072 [13:47:30] New patchset: Aude; "(bug 43998) initial implementation of property parser hook" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46686 [13:48:02] legoktm: woop! [13:48:13] legoktm: can lines in cron have comments at the end of them using a #? [13:48:19] or does that break the cron line..? [13:48:25] idk [13:48:28] New review: Aude; "(6 comments)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/46686 [13:48:30] i add them on newlines usually [13:48:38] * addrawr has just moved all of his comments [13:48:44] Grillo: should keep working now :) [13:48:49] if not ill be back in like an hour! [13:48:54] nice :) [13:50:04] it would be interesting to know how much of the database consists of just iw links... [13:51:06] lol addrawr, addbot can't handle wikis with -'s in them. [13:51:21] ? [13:51:26] it should be able to.. 
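[Editor's note] On the crontab question above: the command field of a cron line is handed to /bin/sh -c, and POSIX sh starts a comment at any unquoted word beginning with #. So a trailing comment on the command usually works by courtesy of the shell, though addrawr's habit of putting comments on their own lines is the safer convention (and note that % is special inside crontab commands). The shell half of that claim is easy to check:

```python
import subprocess

# cron passes the command field to `sh -c`; sh discards everything
# after an unquoted '#', so a trailing comment is usually harmless.
result = subprocess.run(
    ["sh", "-c", "echo cron-job-ran # ignored by the shell"],
    capture_output=True, text=True,
)
output = result.stdout.strip()
```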
[13:51:31] I fixed that on en [13:51:46] Grillo: there are about 225,000,000 iw links [13:51:53] maybe 250,000,000 [13:51:59] addrawr: bat-smg, Addbot has no edits [13:52:14] also addrawr http://ganglia.wmflabs.org/latest/?c=bots&h=bots-bnr1&m=load_one&r=hour&s=by%20name&hc=4&mc=2 [13:52:16] hehehe [13:52:45] I'm guessing they are about 15 bytes each, so that would be almost 4 TB... [13:53:37] legoktm: wikidata api still returns languages with underscores right? [13:53:45] yeah [13:53:54] then it should remove them :P [13:54:19] bnr1 is taking a heavy beating xD [13:54:55] thats my fault [13:54:58] * addrawr may have to alter its usage slighty as at the moment it looks like it is just going up [13:55:00] its cuz im loading dumps [13:55:04] ahh xD [13:55:15] i'm done starting new wikis though [13:55:16] New review: Aude; "I'm still not sure about whether the item id is needed as part of the anchor tag. It's probably wor..." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52796 [13:55:17] its soo easy though [13:55:30] ./deploy.sh dbwiki [13:55:34] and it goes [13:55:42] xD [13:55:47] right ill be back soon :) [13:55:58] http://dpaste.de/X5yf8/raw/ [14:07:17] New review: John Erling Blad; "Try to implement a merge function with only one item on the page. ;)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52796 [14:09:20] Tobi_WMDE: i see JS breakage on master: "TypeError: this.setOptions is not a function" [14:09:24] is that known? [14:09:34] I also see "TypeError: "prototype" is read-only" [14:10:18] the first one is from serialization.Unserializer.js line 23 [14:10:56] the second one is from dataValues.util.js line 46 [14:11:21] Henning_WMDE: or maybe you have an idea? [14:11:44] I can edit labels and descriptions, but nothing else [14:13:25] DanielK_WMDE_: hm.. i will check in a minute [14:14:53] DanielK_WMDE_: have you pulled DataValues? 'cause I can't see those errors.. 
[14:16:33] it https://www.wikidata.org/w/api.php?action=wbgetentities&titles=Main%20Page&sites=enwiki has - replaced to _ isnt it? is in only one such thing? [14:17:48] (I mean wiki names) [14:18:03] all wikinames are like that [14:18:07] its the database name [14:18:10] Tobi_WMDE: ah, that could be it! [14:20:27] Tobi_WMDE: ok, that was it, sorry. But now I see a new problem: [14:20:31] "Ein Skript auf dieser Seite ist eventuell beschäftigt oder es antwortet nicht mehr." [14:20:40] DanielK_WMDE_ : i hope so, not having another idea.. [14:20:47] initializing the page is so slow, firefox wants to kill the skript [14:21:01] uhh [14:21:02] ...and it only has one statement. [14:21:25] somethings wrong then [14:21:56] well, if i tell it to continue, everything seems to work... [14:22:45] http://www.wikidata.org/wiki/Q76 runs smoothly for me, and it has a lot of statements [14:23:15] Tobi_WMDE: i had firebug on to investigate the other issue, maybe that slowed things down [14:23:35] hm.. but it's a bit slower in firefox than in chrome [14:24:03] Q76 took about 5 seconds to initialize for me. Not a big problem, but we really shouldn't get slower [14:25:34] yes, I agree.. we should have a look at performance tuning of JS soon.. [14:33:08] New patchset: John Erling Blad; "(Bug 44876) Add toc to EntityView" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52796 [14:34:02] New patchset: John Erling Blad; "(Bug 44876) Add toc to EntityView" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52796 [14:52:15] Legoktm, I have started the RFC. Do I have to add a note somewhere? [14:52:28] Create a new section on project chat with a link [14:55:02] Ok [14:55:27] we should try and do a round up of all RfCs and collect them on one central page at some point. 
[14:55:39] there is an RfC page [14:56:39] ooh [14:56:42] :DD [14:56:46] Haha indeed [14:56:47] thanks :) [14:56:54] There: https://www.wikidata.org/wiki/Wikidata:Requests_for_comment#Current.2FOpen_requests [14:57:08] Should I still add the link? [14:57:17] yes please [14:57:34] Ok, adding a section in the project chat [14:58:41] Added [14:58:45] Please go and comment [14:58:50] The RFC is so empty :'( [15:03:50] Yay someone voted :D [15:14:32] Hi [15:21:17] edit #10000000: https://www.wikidata.org/w/index.php?title=&diff=10000000 [15:31:03] Is Wikidata intended to be used on other projects such as Wiktionary in the near future? [15:31:58] Automatik: if your definition of near future is next month then no - otherwise yes :) it'll happen but it'll still take a while [15:32:09] Stryn: \o/ [15:33:18] Jeblad_WMDE: AnjaJ_WMDE: last call [15:35:24] uh.. [15:37:23] Automatik: to be honest, wiktionary is probably the most complex of the sister projects. It will take a while to support that. [15:37:35] other projects, like Wikivoyage, are much simpler. [15:38:40] "If a site does not appear this doesn't mean there are no pages with IW links or we are waiting for the next run!" -- http://www.wikidata.org/wiki/Wikidata:Wikidata_migration/Progress isn't it supposed to say "this means", that is, the opposite? [15:40:16] DanielK_WMDE: nevertheless, interwikis in Wiktionary are conventionally exactly the same as the current page, it's surprising [15:52:10] Automatik: which makes wikidata kind of pointless. just linking to the *same* title on other sites can be done a *lot* simpler.
[15:52:27] but eventually, i would want to see all the structured data in wiktionary in a wikidata-like system [15:52:39] so far, that's just a dream, though :) [15:52:52] * DanielK_WMDE_ hopes nobody says omegawiki [15:53:24] "omegawiki" [15:53:37] * Lydia_WMDE takes away legoktm's cookies [15:53:38] ;-) [15:53:42] :x [15:53:48] i know! mean, no? [15:54:01] * Lydia_WMDE can be mean - even if she doesn't look like it ;-) [15:54:03] improvement suggestion: http://www.freeimagehosting.net/newuploads/u6g9z.png [15:54:25] * Lydia_WMDE waits for site to load [15:54:35] yeah, it's bloody slow [15:54:44] hello [15:55:07] At the end [15:55:13] What would happen with OmegaWiki? [15:55:39] legoktm: How is your run at nlwp going? [15:55:48] hey multichill [15:55:51] its still running [15:56:06] ksuhku: not loading :/ [15:56:17] hey multichill [15:56:44] oh now it did [15:57:29] If it can be done a lot simpler, what prevents doing it? [15:57:49] i'll try to put it someplace else, it's a massive 6KB image [15:58:07] ksuhku: no it's ok - i got it now [15:58:34] ksuhku: i think this can be changed on-wiki? can someone confirm? if so feel free to propose it on the project chat [15:58:52] Automatik: we need to get one thing done first - it's bad to do things half-assed ;-) [16:00:28] Don't understand, sorry, my English isn't good enough [16:00:44] we have to finish one thing first [16:00:50] that is making it work for wikipedia [16:00:57] then we can move on to others [16:03:19] Ok, thanks [16:03:43] Lydia_WMDE, I thought this is the project chat? [16:04:00] ksuhku: there is also a page called project chat on the wiki [16:04:01] ksuhku: [[WD:Project chat]] [16:04:02] [1] https://www.wikidata.org/wiki/WD:Project_chat [16:04:05] ^ [16:04:07] It is a Sylvester Stallone movie at Cinestar #justsaying [16:04:16] Lydia_WMDE, legoktm thx! [16:04:28] i dont think we can link in the messages?
[16:04:39] hmmm possibly [16:04:59] i'd test it on test repo except i'm not an admin there… [16:06:30] I put the file in here FWIW http://commons.wikimedia.org/wiki/File:Wikidata_improvement_suggestion_463646.png and will add it to the chat page [16:10:07] I just learned about wikidata and I already love it! [16:10:17] :) [16:13:28] New patchset: Daniel Kinzler; "(bug 45097) Introducing the Summary class." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52807 [16:14:13] Change abandoned: Daniel Kinzler; "replaced by I90c9919" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/49219 [16:14:24] Change abandoned: Daniel Kinzler; "replaced by I90c9919" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/50154 [16:23:31] New review: Daniel Kinzler; "needs manual rebase" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/49862 [16:45:04] DanielK_WMDE_: https://bugzilla.wikimedia.org/show_bug.cgi?id=41586 <- good to close? [16:51:01] New patchset: Tobias Gritschacher; "(testing) adding selenium tests for delted-property/item-handling in UI" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52810 [16:51:42] New patchset: Tobias Gritschacher; "(testing) adding selenium tests for deleted-property/item-handling in UI" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52810 [16:58:25] New patchset: Tobias Gritschacher; "(bug 43997) adding Selenium tests for linking items on client" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/47597 [16:58:49] New review: Tobias Gritschacher; "Anja, please check again." 
[mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/47597 [16:59:19] wikidata office hour in german starting in #wikimedia-office now [17:03:26] New patchset: Jeroen De Dauw; "DO NOT MERGE" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52609 [17:13:57] legoktm: pong [17:33:15] New patchset: John Erling Blad; "(Bug 45881) Stop use of entity ids with wrong prefix" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52876 [17:35:10] New patchset: John Erling Blad; "(Bug 45881) Stop use of entity ids with wrong prefix" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52876 [17:47:02] New patchset: John Erling Blad; "(Bug 45881) Stop use of entity ids with wrong prefix" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52876 [17:48:12] New patchset: John Erling Blad; "(Bug 45881) Stop use of entity ids with wrong prefix" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/52876 [18:10:50] \NICK MichaelSchoenitzer [18:11:17] MichaelSchoenitz: with / :) [18:11:52] yes, thanks... and apparently MichaelSchoenitzer is too long... [18:11:57] :D [18:32:50] mmm [18:33:27] a page is moved to Wikidata with a bot, but when I try to use the link in the sidebar to get to Wikidata, Wikidata can't find the item [18:33:35] while it has its own Q [18:33:59] watch: https://nl.wikipedia.org/wiki/Wikipedia:Mededelingen [18:34:12] and Q https://www.wikidata.org/wiki/Q79790 [18:34:24] anyone having an explanation? [18:40:33] Romaine: weird [18:40:43] yups [18:41:32] oh, got to go, see you later [18:49:54] I have trouble installing the wb extension [18:51:38] addrawr ping [18:53:19] Romaine: looking [18:53:29] A database query syntax error has occurred.
The last attempted database query was: "SELECT page_id,page_latest FROM `wikipage` WHERE page_content_model IN ('wikibase-item','wikibase-property') LIMIT 1000 " from within function "Wikibase\SqlStore::rebuild". Database returned error "1054: Unknown column 'page_content_model' in 'where clause' (localhost)" [18:53:29] :) [18:54:06] Romaine: ah - known bug :( [18:54:16] ok, thanks for looking [18:54:18] the problem is the Wikipedia: ns [18:54:25] I suspected that [18:54:32] it'll hopefully be fixed in the next rollout [18:54:53] seems to me an urgent need to fix, as it makes a Q unfindable [18:55:20] yeah :/ [18:55:26] let me look for the bug so you can comment [18:56:14] Romaine: https://bugzilla.wikimedia.org/show_bug.cgi?id=44536 [18:56:27] Lydia_WMDE: my bug is known or the bug is something else? [18:56:42] Amir1: sorry that was for Romaine [18:56:51] Amir1: no idea about your problem - sorry :/ [18:56:53] thanks for looking [18:57:35] Amir1: if no-one else here can help you please send an email to the mailing list [18:57:49] Amir1: i just saw an email by Lydia_WMDE saying that vagrant distro has been cleaned up for wikidata... that could be an easy alternative? [18:57:54] oh ok [18:58:13] just a thought [18:58:47] btw, Amir1 did you do php extensions/Wikibase/lib/maintenance/populateSitesTable.php [18:59:13] yurik: what is vagrant distro [18:59:46] its a virtual machine with a bare-bones linux distro, and some scripts to configure it [19:00:03] the scripts will set up mediawiki + wikibase [19:01:44] hmmm [19:01:51] Amir1: have you gone through the http://www.mediawiki.org/wiki/Extension:Wikibase [19:02:09] yes [19:02:17] re vagrant - all the setup will be in the virtual box [19:02:54] the only pain with vagrant is that you have to configure your IDE to allow incoming connections from the VM [19:03:00] for debugging [19:03:09] I describe all the steps on my MW user page [20:06:03] Hi, anyone here?
[20:07:38] likely [20:07:58] I don't think this edit is correct: https://www.wikidata.org/w/index.php?title=Q1689171&diff=10272126&oldid=10075356 [20:08:06] But I'm not sure enough about it to revert [20:09:54] chinese article seems to be about a basketball player, so you're right, it's not the same [20:10:10] there are so many iw in some main pages, is there any way to filter them? [20:11:06] Stryn: so can you revert? [20:11:13] I did [20:11:28] Thanks :) [20:11:31] basketball player is now here: [[Q3808361]] [20:11:31] [1] https://www.wikidata.org/wiki/Q3808361 [20:12:13] I wonder why the bot made that mistake [20:15:49] Alchimista: you can use the noexternallanglinks magic word [20:15:56] used on english wikipedia i think [20:16:25] used on Finnish wikipedia too [20:16:46] http://fi.wikipedia.org/w/index.php?title=Wikipedia%3AEtusivu&diff=12785251&oldid=12515852 [20:17:10] Stryn: bestest! [20:17:13] :) [20:17:28] Thanks! [20:21:20] Lydia_WMDE: clicking the "diff" link at the top of https://www.wikidata.org/w/index.php?title=Q149973&oldid=8604082 doesn't work [20:21:29] (URL = https://www.wikidata.org/w/index.php?title=Q149973&diff=prev&oldid=8604082 ) [20:21:32] huh: known :( [20:22:46] https://bugzilla.wikimedia.org/show_bug.cgi?id=45821 ^^ [20:25:50] it annoyed me at first [20:25:52] well [20:26:00] as in, I thought it was only me [20:26:07] and if I made the latest edit [20:26:20] then I read more and it was "prev" and "next" breaking it [21:13:22] 1 problem: when clicking on "saving" a window pops up, asking me to accept the conditions [21:13:51] When I click on "don't show me again" it's okay as long as I have the browser window open [21:14:20] It does not seem to generate a cookie. So when I open the browser again it asks me again to accept the conditions [21:14:46] So that I have to click on this pop-up window 20 times a day. [21:16:53] Hurrah. 100 users logged on here and no answer to my problem. Long live WikiData. [21:16:55] Bye.
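For reference, the noexternallanglinks magic word mentioned above goes into the wikitext of the page on the client wiki. A sketch of the two forms (exact parameter syntax per the Wikibase client documentation; the language codes here are just illustrative):

```wikitext
{{noexternallanglinks}}        <!-- suppress all language links coming from Wikidata -->
{{noexternallanglinks:fr|it}}  <!-- suppress only the fr and it links -->
```

This is how the Finnish main page in the linked diff keeps its manually curated interwiki list.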
[21:33:58] my first baby steps: are these label + description correctly added? https://www.wikidata.org/wiki/Q2353813 [21:35:16] ksuhku: I'd say the label is correct, but I'm going to check the description further since there's no english page [21:35:33] yeah, thanks and sorry for the languages... [21:35:42] no, it's not your fault at all [21:36:02] a quick translation should be OK [21:36:27] it looks like it's right [21:37:35] thanks a lot ihaveamac [22:49:47] Hello, I need step-by-step help with creating a new item - interwiki for a category that exists in just two languages that are not connected on Wikipedia [22:50:35] Gumruch: the page Wikipedia:Wikidata on english wikipedia should help [22:50:35] What should be a proper label and description for a category and in which language? [22:51:02] language: the same language your user interface is set to on wikidata [22:51:36] [[Help:Descriptions]] should have more about the descriptions [22:51:36] [2] https://www.wikidata.org/wiki/Help:Descriptions [22:52:33] Lydia_WMDE: OK, thanks, I have read the Czech help and How to Edit Wikidata.pdf and I have not found anything about creating a new CATEGORY entry for creating interwiki outside Wikipedia [22:53:27] ok - a good way is usually looking at other similar pages then [23:11:41] so, I had time now and extended Denny_WMDE's script [23:13:31] Of the ~10 million edits, ~8 million are by bots; real users have so far made 1.5 million edits on Wikidata [23:14:16] 85% of the edits are by bots [23:16:03] bots ftw [23:16:15] (I hope the list on Wikidata:Bots is complete)