[00:01:25] what to do if in separate Q's the same subject is created, can it be merged? [00:03:11] Romaine: I was wondering the same thing [00:03:36] with only one, or copy-paste, it can work [00:03:41] there is a feature request for this. let's say: wikidata people have this feature planned before it goes live ;-) [00:03:48] There's no merge now AFAIK [00:03:52] but sometimes with islands of many ... [00:04:12] so we are now playing not live :p [00:04:31] Romaine: https://bugzilla.wikimedia.org/show_bug.cgi?id=38664 [00:05:01] and are renamings on the local wikis automatically changed in the system? [00:06:03] no, langlinks get lost in this case [00:06:17] but that will be changed in future I hope? [00:07:55] I already found some double creations :S [00:09:24] Yeah.. [00:09:59] Romaine: I found one [00:10:23] Q40, Q29 [00:12:17] Q191, Q39 [00:12:46] is there any way to mark pages for deletion? [00:14:12] I relocated some data and put another country in it [00:15:19] Q2013 and Q184 are duplicates [00:15:24] :S [00:16:08] too bad we can't add {{delete}} or something to the page [00:16:09] [3] https://meta.wikimedia.org/wiki/Template:delete [00:16:36] maybe to the talk pages? [00:16:55] I added "empty" as title in the duplicates [00:18:13] YairRand: I didn't even notice we had talk pages [00:19:31] every item has a talk page. they probably function like regular wiki pages. [00:20:08] YairRand: labels are not unique [00:20:20] so? [00:20:49] a sitelink can be connected to only one item [00:21:18] so...? [00:21:33] but you can have two items having the same label (e.g. two cities having the same name) [00:22:04] they have the same name, but different articles (=sitelinks) [00:22:24] I understand that [00:23:01] This is probably because the wiki existed previously but has been closed. [00:23:07] Merlissimo: sorry, which statement are you responding to?
[00:23:23] Q2013 and Q184 are duplicates [00:24:10] New patchset: Reedy; "Fix line endings to be consistent with everything else" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30736 [00:24:20] Merlissimo: so you're saying that they're for two different entities called "Wikidata"? [00:25:05] there can be two different entities sharing the same label [00:25:40] yes, but those two are probably just referring to the same entity... [00:25:46] but they are two different items which must have different sitelinks [00:26:04] one has no sitelinks at all [00:26:17] two different subjects (even if they have the same name) [00:26:33] (and has apparently just had its label changed to "empty") [00:26:43] without having an article connected you cannot know the content of an entity [00:27:41] you can if the description makes it clear that it refers to the same entity [00:28:44] if I set my language preference to something other than English, the labels in English disappear? [00:29:28] New patchset: Reedy; "Tidy up inconsistent returns" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30737 [00:31:16] New patchset: Reedy; "Tidy up inconsistent returns" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30737 [00:31:16] Romaine: yes, only labels in your user language are shown (or no label if there is not one). Solving this problem is also a feature request. [00:33:00] on wikis I set up we had the user language as text, unless that wasn't available, then English [00:33:11] showing at least something [00:34:16] New patchset: Reedy; "@param $foo false or @return false isn't valid" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30738 [00:40:40] wheee [00:40:41] 200 items [00:56:07] um, it is possible to delete items, right? [00:58:14] YairRand: only if you are an admin [00:58:28] ah, okay. [00:59:01] would there be any problem with adding a Template:Delete to wikidata?
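The merge feature requested above did not exist yet, but the rule stated in the log (a sitelink can be connected to only one item) implies when a merge is safe: only if the two items never claim different pages on the same wiki. A minimal sketch with items represented as hypothetical {site: title} sitelink maps; the example item contents are illustrative, not the real Q2013/Q184 data:

```python
# Sketch of the "can it be merged?" check discussed above. Items are
# modeled as plain dicts of sitelinks; this is not Wikibase's actual code.

def can_merge(a: dict, b: dict) -> bool:
    """Mergeable unless both items link the same site to different pages."""
    shared = a.keys() & b.keys()
    return all(a[site] == b[site] for site in shared)

def merge_sitelinks(a: dict, b: dict) -> dict:
    """Union of sitelinks; refuse when a human needs to review a conflict."""
    if not can_merge(a, b):
        raise ValueError("conflicting sitelinks, needs human review")
    return {**a, **b}

# Hypothetical duplicate items about the same subject:
q2013 = {"enwiki": "Wikidata"}
q184 = {"dewiki": "Wikidata"}
print(can_merge(q2013, q184))        # True
print(merge_sitelinks(q2013, q184))  # {'enwiki': 'Wikidata', 'dewiki': 'Wikidata'}
```

Disjoint sitelink sets merge cleanly; two items pointing the same wiki at different articles are the "two different subjects" case from the log and are rejected.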
[01:00:44] editing article source is blocked for entities. maybe you should ask at http://www.wikidata.org/wiki/Wikidata:Contact_the_development_team [01:01:58] on this channel all developers are away because the local time is 2 am [01:02:03] it could be added to article talk pages [01:02:49] blocking entities' article source in general probably makes sense [01:07:46] you create one new page for creating a deletion request. and then an admin must delete two pages. [01:08:17] but i do not know a better solution either. [01:08:22] yep [01:11:20] an alternative would be a "deletion requests" page [01:11:29] not sure if that would be better [01:15:15] Can we add things for non-Wikipedia projects? [01:16:10] I don't think so [01:28:34] PiRSquared17: only for wikipedia wikis. not even all wikis that have langlinks to a wikipedia wiki are possible (species, commons, incubator) [01:29:02] there are some feature requests for this [01:35:46] Romaine: ping [01:35:53] ? [01:35:58] The links for Brussels go to different pages [01:36:34] The English one is for "Brussels Hoofdstedelijk Gewest" [01:36:44] strange en-wiki [01:38:30] solved [01:38:38] also created Brussels Capital Region [01:38:58] :) [01:44:59] * Romaine goes to bed [01:45:58] almost 250 [01:48:09] Now there are 250 ;) [01:57:42] Reedy: https://www.wikidata.org/wiki/Wikidata:Project_chat [01:57:51] nvm, working now [01:57:57] :S [02:53:55] Reedy: ping [02:59:56] Will there be bots to automatically add all the interwiki links to wikidata? [03:00:16] and the data [03:07:31] PiRSquared17: yes. But not everything can be added automatically [03:08:03] Merlissimo: I read that things like birth dates will be added. How? [03:08:13] e.g. my bot is running on test wiki http://wikidata-test-repo.wikimedia.de/wiki/Special:Contributions/MerlIwBot [03:08:50] phase 1 is langlinks only, birth dates will be phase 2 which will be enabled much later [03:10:09] Merlissimo: "much later" over a year?
[03:11:50] i don't know, but development has only just started. [03:13:46] but you know that especially birth dates are not easy to implement in a data model? [03:14:13] because the same person has different dates on different wikis [03:14:21] *can have [03:17:37] so we need to check every one? :S [03:24:02] PiRSquared17: that must be done by the local community. my favorite example is Riaz Shahi. I would say he died in 2001, because many doctors checked his corpse and declared him dead. Everyone could also visit his corpse at a mausoleum. That's why many wikis have categorised him as dead. But on enwiki he is categorised in "Possibly living people" because before his death he said he would revive. After a long discussion the enwiki community [03:24:02] decided that he only "disappeared". So he has no date of death there. [03:27:00] * Merlissimo thinks that people who say they will revive should always tell the people the date of revival. Then they could finally be declared dead after this date ;-) [09:41:23] DanielK_WMDE: regarding your edit conflicts tests at https://gerrit.wikimedia.org/r/#/c/29973/ [09:41:36] DanielK_WMDE: I am not sure how we can get rid of sleep(1) :-] [09:41:55] hashar: me neither. seems to me we need to keep them. [09:41:58] is that needed to change the edit time ? [09:42:08] yes [09:42:22] the revision's timestamp is used to detect the edit conflicts (yes, that's stupid) [09:42:42] i could create revisions directly, with fake timestamps [09:42:51] but then i wouldn't be testing EditPage any more, would I... [09:43:55] does the conflict trigger only when the edit timestamps are equal? [09:44:28] no, it triggers if there was an edit made with a timestamp > the base timestamp [09:45:04] i guess for the simplest case, i could just fake the base timestamp. [09:45:27] that makes for less robust tests though, it needs internal knowledge.
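The detection being discussed compares second-granularity timestamps, which is why the tests need sleep(1): two edits landing in the same second are indistinguishable. A minimal Python illustration of that failure mode and of the revision-id comparison suggested later in the log (a simplified sketch, not MediaWiki's actual Revision::userWasLastToEdit() implementation):

```python
# Why second-granularity timestamps force sleep(1) in the tests, and why
# comparing revision ids would not. Illustration only, not MediaWiki code.

def conflict_by_timestamp(revisions, base_timestamp):
    """Conflict if any revision is strictly newer than the base timestamp.
    Two edits within the same second (equal timestamps) are NOT detected."""
    return any(rev["timestamp"] > base_timestamp for rev in revisions)

def conflict_by_revid(revisions, base_revid):
    """Conflict if any revision id is higher than the one we edited from.
    Ids are unique and monotonic, so tests would not need to sleep()."""
    return any(rev["id"] > base_revid for rev in revisions)

# Two edits landing within the same second:
revs = [
    {"id": 100, "timestamp": "20121030094222"},
    {"id": 101, "timestamp": "20121030094222"},  # same second!
]
print(conflict_by_timestamp(revs, "20121030094222"))  # False: conflict missed
print(conflict_by_revid(revs, 100))                   # True: conflict detected
```

With timestamps, the test must sleep a full second between edits so the second edit's timestamp actually exceeds the base; revision ids differ immediately.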
[09:45:56] that is, i would be *creating* an edit conflict, instead of *simulating* the situation in which a conflict should be detected. [09:48:32] 'rev_timestamp > ' . $db->addQuotes( $db->timestamp( $since ) ) [09:48:32] :( [09:48:36] in Revision::userWasLastToEdit() [09:49:21] yep [09:50:04] the $since is coming from EditPage->edittime [09:50:26] EditPage->edittime, can't it be forged/altered somehow instead of sleeping? [09:50:26] yes. it's a request parameter [09:50:55] in some cases, i could set a dummy time, yes. [09:51:16] i may be able to get rid of one or two sleeps that way. i don't think i can eliminate all of them without incorporating too much internal knowledge into the test [09:51:23] that would trigger the conflict without having to sleep() ;-] [09:51:38] yes, but it makes the test case less expressive [09:51:45] but faster [09:51:52] i'm then no longer testing the actual situation that should be detected. [09:51:53] isn't sleep() a bit cumbersome? [09:52:06] hashar: i could also write empty test cases, they are fastest ;) [09:52:08] indeed [09:52:23] that is advocated by Domas using the only worthwhile performance operator: // [09:52:23] of course it's cumbersome. it sucks. timestamps are the wrong way to do this anyway. [09:52:29] the detection should be based on revision ids. [09:52:43] hehehe.... nice [09:53:03] http://dom.as/2007/11/15/optimization-operator/ [09:53:04] ;) [09:54:04] DanielK_WMDE: so if you can get rid of some of the sleep() calls easily, go for it [09:54:31] hashar: yea, it's on the list. you put a -1 there, right? [09:54:44] there are also a few more modifications which are needed in the tests [09:54:45] hence the -1 I guess [09:55:03] which does not mean I reject the change :-] [09:55:29] I am very happy to see a test covering edit conflicts [09:58:47] is wikidata compatible with flaggedrevs? could flaggedrevs be installed on wikidata.org? [10:01:36] Hello. [10:01:46] No CentralAuth on wikidata.org yet?
[10:03:08] aharoni: should work [10:05:30] Lydia_WMDE: it works if I press "Log in" and type the password, but I would expect it to work immediately if I'm logged in in another project. [10:06:00] aharoni: ah i don't know the details of that [10:06:29] aharoni: maybe it's not working because you logged in last time when wikidata wasn't yet included in central auth? [10:06:44] no idea. [10:14:54] aharoni: i thought single login does not work across the projects due to different second-level domains? [10:15:20] i.e. if you are logged in in wikipedia, and move to commons or wiktionary, you also are not logged in there (if you have not been logged in before) [10:15:36] that is not a wikidata issue, i am afraid, but wmf-wide [10:15:39] can anyone confirm? [10:15:52] it works from https://he.wikipedia.org to https://hr.wikisource.org . [10:16:06] different language and different project. [10:16:26] Reedy: ^ you know a thing or two about CentralAuth, don't you? [10:20:07] Denny_WMDE: that has been my experience [10:21:13] so I guess the existing interwiki links have yet to be loaded? [10:22:44] edsu: we will not do that [10:22:47] and we asked bot authors to wait [10:22:50] aharoni: when did you log in last time? [10:22:59] to give people some time to play and actually do something themselves [10:23:03] to wikidata.org? that's the first time. [10:23:14] Lydia_WMDE: huh, ok -- i haven't been following recent developments, i thought i heard someone say that at wikimania [10:23:44] Merlissimo: oh - when did I log in to some other, older project? I don't know. A while ago ;) [10:23:56] Lydia_WMDE: it doesn't look like there's a whole lot to do other than add interwiki links, but I guess I'm missing something?
[10:24:11] hi edsu [10:24:12] no, that's it for now [10:24:14] edsu: no, we've always been clear that we won't be importing links and other data [10:24:18] edsu: no you're not missing anything [10:24:20] the bots will do most of the work [10:24:25] that's all there is for now [10:24:27] aude: hi! :) congrats on getting wikidata going [10:24:30] but people can have fun too :) [10:24:49] how's the hurricane? [10:25:22] aharoni: i just tried. i used safari [10:25:27] i logged into en.wp [10:25:36] and then went to en.wsource [10:25:45] and i was not logged in [10:25:50] aude: it was pretty strong for this area, but by some strange miracle i didn't lose power [10:25:50] which browser are you using? [10:26:20] Lydia_WMDE: sorry i wasn't paying attention at wikimania then :) [10:26:22] * aude nods and is amazed you have power still :) [10:26:52] edsu: phase 2 will be all the infobox data [10:26:56] aharoni: the login page sets cookies for all domains. so if you haven't loaded the login page on any wiki since yesterday, cookies are not set [10:27:04] but not ready to launch that just yet [10:27:33] aude: gotcha, it definitely makes sense to stage it [10:27:50] i was wondering if the wikidata mediawiki was configured to announce changes in irc [10:28:15] was going to add it to wikistream.inkdroid.org :) [10:28:46] edsu: probably is [10:29:47] Merlissimo, Denny_WMDE - OK, if I log in to another project, and then to Wikidata, then everything is fine. (tried with another account.) [10:31:29] aude: i wasn't sure what the name was since it didn't fit the #lang.project pattern [10:40:07] Reedy: what is the maximum item size allowed on wikidata.org? 2MB like for other projects? [10:40:51] * Merlissimo 's bot created an item of 65k [10:41:15] edsu: looking [10:43:06] edsu: it's #wikidata.wikipedia [10:43:12] thanks! [10:43:42] it's just how the database is set up that it gets the "wikipedia" in the name, but it works [10:51:41] aude: does phase 2 include the feature to link to commons?
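The recent-changes IRC feed edsu is asking about wraps each field in mIRC colour codes, so a consumer (like wikistream) first has to strip them. A small sketch with a hypothetical sample line; the exact field layout of the real #wikidata.wikipedia feed may differ, only the colour-code stripping is the point here:

```python
import re

# Hypothetical sample in the style of the irc.wikimedia.org RC feed
# (channel #wikidata.wikipedia); \x03NN ... \x03 are mIRC colour codes.
sample = ("\x0314[[\x0307Q42\x0314]]\x034 \x0310 "
          "\x0302https://www.wikidata.org/w/index.php?diff=123&oldid=120\x03 "
          "\x035*\x03 \x0303SomeUser\x03")

def strip_colors(line: str) -> str:
    """Remove mIRC colour and formatting codes from a feed line."""
    return re.sub(r"\x03\d{0,2}(?:,\d{1,2})?|[\x02\x0f\x16\x1d\x1f]", "", line)

print(strip_colors(sample))
```

After stripping, the page title, diff URL, and user name are plain text and can be split out with ordinary string handling.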
[10:52:59] Merlissimo: we will be able to link to commons media in phase 2 [10:54:23] i mean to have sitelinks to commons pages, and this value can be used as the target of the template {{commons}} on local pages [10:54:24] [4] https://meta.wikimedia.org/wiki/Template:commons [10:54:49] and also maintaining commons langlinks (which link to wikipedia) [10:57:08] this should be added at some point, I agree [10:59:24] if it is implemented later, we may have to add a template on commons pages which defines the related item, and a bot syncs the langlinks from wikidata to commons [10:59:59] the same for incubator [11:00:18] i would be surprised if it weren't implemented at some point [11:00:41] we will see [11:00:56] i am not promising that feature, but i have the feeling that it would not be too hard to do [11:08:12] i guess you all have noticed how some languages are capitalized and others not? [11:26:44] Hi, congrats to everyone for the good work, and it's nice that Wikidata's official birthday will be the same as mine :) [11:27:16] Lydia_WMDE: should we start a discussion on wikidata.org about which features a bot importing sitelinks must have / how it should work? [11:28:09] edsu: we noticed [11:28:27] e.g. today i implemented the feature that displaytitles instead of titles are used as labels. so we can collect all wanted features. [11:34:47] jem-: congratulations! :) [11:36:11] New review: John Erling Blad; "I think we need to discuss this as it will have implications a lot of places in our doc."
[mediawiki/extensions/Wikibase] (master); V: 0 C: -2; - https://gerrit.wikimedia.org/r/30738 [11:37:35] Denny_WMDE: #mediawiki-i18n [11:40:26] https://gerrit.wikimedia.org/r/#/c/30738/ [11:41:01] It's a good (and valid) point reedy makes but I think we need to discuss this first [11:46:01] New patchset: Jens Ohlig; "(Bug #40391) Add continuation to wbsearchentities and return only matching aliases" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29956 [11:54:05] Hello TorstenK [11:54:08] :D [11:54:13] Hello, when are you planning to have info for authors available on wikidata.org? [11:55:16] Lydia_WMDE, Denny_WMDE: TorstenK made a point on twitter that the Main Page lacks information for the community on how to get involved, especially how to add items etc. [11:55:45] Lydia_WMDE: a little timetable would be helpful [11:56:01] TorstenK: hey [11:56:13] TorstenK: timetable for what exactly? [11:56:27] Merlissimo: sounds good to me :) [11:59:16] Jens_WMDE: i was wondering how to add items too [12:00:33] Denny_WMDE: Thanks :) [12:00:53] Btw, is it OK to begin creating discussion / village pumps / etc. or should we wait until there's no risk of a full reset? [12:00:53] edsu: http://wikidata.org/wiki/Special:ItemByTitle is where i start [12:01:39] see if some wikipedia link is already included in an item (e.g. enwiki => Canada ) [12:01:39] Lydia_WMDE: when will there be manuals on how to use Wikidata? [12:01:50] if it doesn't exist, then it will show a create item link [12:01:50] Lydia_WMDE: the only way to find the repository yet is to go to "RecentChanges" [12:01:50] that's as soon as the community works on it - this is nothing the staff should work on right now [12:02:15] i am happy to answer questions anyone has working on it [12:02:42] TorstenK: actually, wikidata.org _is_ the repository. [12:02:43] aude: thanks!
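The lookup aude describes (Special:ItemByTitle with a site and a page title, e.g. enwiki => Canada) is also exposed through the API. The module and parameter names below (wbgetentities with sites/titles) match the current Wikibase API and may differ from the 2012 deployment, which the log notes was still on wbgetitems; the sample response is a trimmed-down illustration of the expected shape:

```python
from urllib.parse import urlencode

def item_by_title_url(site: str, title: str) -> str:
    """Build an API query that resolves a wiki page to its Wikidata item."""
    params = {
        "action": "wbgetentities",
        "sites": site,
        "titles": title,
        "props": "sitelinks",
        "format": "json",
    }
    return "https://www.wikidata.org/w/api.php?" + urlencode(params)

def extract_item_id(response: dict):
    """Pull the item id out of a wbgetentities-style response.
    A key of "-1" marks a missing entity, mirroring the "create item"
    case mentioned in the log."""
    for key, entity in response.get("entities", {}).items():
        if key != "-1":
            return entity.get("id", key)
    return None

print(item_by_title_url("enwiki", "Canada"))
sample = {"entities": {"Q16": {"id": "Q16",
                               "sitelinks": {"enwiki": {"title": "Canada"}}}}}
print(extract_item_id(sample))  # Q16
```

If the response only contains a "-1" entry, the page has no item yet, which is exactly when Special:ItemByTitle offers the create link.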
[12:03:05] Jens_WMDE: i think it's ok to do that [12:03:10] eh [12:03:12] jem-: ^ [12:03:19] i'm sure the workflow can be improved and i think we welcome feedback [12:04:53] Lydia_WMDE: I think at least an English manual on how to work with Wikidata is essential for actually being launched. [12:05:33] jem-: i think these pages should be created, yes. but it's up to you :) [12:05:47] the longer the pages stay, the lower the probability that we will restart [12:06:30] Lydia_WMDE: how should items be deleted by the community? on all other wikis they add {{delete}} to a page and then an admin deletes the page after reviewing whether the deletion request is valid. But this cannot be done with items [12:06:30] [3] https://meta.wikimedia.org/wiki/Template:delete [12:06:45] TorstenK: Wikipedia started without a manual [12:07:23] I have the hope that the manual will be created as we go [12:07:28] Merlissimo: http://www.wikidata.org/wiki/Wikidata:Requests_for_deletions [12:07:42] Denny_WMDE: Ok :) I'll try to catch up with things first, I have a lot of open battles, as usual [12:07:46] Merlissimo: i will go through them later today [12:07:56] it would be good i think if people start applying at the stewards page on meta for admin access [12:08:10] it would be nice for the community to handle deletions and not us [12:08:16] jep [12:08:57] http://meta.wikimedia.org/wiki/Steward_requests/Permissions [12:09:00] aude: more than nice, actually. it is mandatory that the community does that, not the software developers. [12:09:08] Jens_WMDE: sure :) [12:10:40] @RC- meta_wikipedia Wikidata [12:10:41] Deleted item from feed [12:10:48] @RC+ meta_wikimedia Wikidata [12:10:48] Inserted new item to feed of changes [12:10:54] aude: test pls [12:11:05] Denny_WMDE: No, it didn't. The manual was short, on the front page [12:12:33] okay, I'll just wait for the hu.wp launch [12:12:38] jeremyb: what?
[12:12:47] aude: edit [[m:wikidata]] ;) [12:12:47] [5] https://meta.wikimedia.org/wiki/wikidata [12:12:51] * aude can't  [12:12:52] !place [12:12:52] that's a lovely place! [12:12:57] you edit [12:13:01] oh, the meta wiki page [12:13:04] can't?! ok [12:13:17] * jeremyb wonders what aude thought ;P [12:13:40] !nyan [12:13:41] ~=[,,_,,]:3 [12:14:14] Change on meta_wikimedia: page Wikidata was modified, changed by Katie Filbert (WMDE) link https://meta.wikimedia.org/w/index.php?diff=4355992 edit summary: null edit [12:14:36] * aude shouldn't edit wikidata.org during working hours [12:15:08] TobiasG_WMDE: is your bug 41541 a dupe of my bug 41533? [12:16:19] aude: it's so tempting... [12:16:27] heh [12:17:10] aude: ahhh [12:17:45] aude: a null edit works even if you do not modify anything. Then no new revision is created, but tables are updated [12:18:13] Merlissimo: ok [12:19:13] Raymond: I think you mean https://bugzilla.mozilla.org/show_bug.cgi?id=41533 and https://bugzilla.wikimedia.org/show_bug.cgi?id=41541 [12:19:40] Raymond, sorry, https://bugzilla.wikimedia.org/show_bug.cgi?id=41533 and https://bugzilla.wikimedia.org/show_bug.cgi?id=41541 [12:19:59] Raymond: my bug is a more general description of yours [12:20:18] because there are more problems with the IME than what you described [12:20:37] TobiasG_WMDE: ok, then I will dupe mine in favour of yours :) [12:24:39] New patchset: Jens Ohlig; "(Bug #40391) Add continuation to wbsearchentities and return only matching aliases" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29956 [12:26:59] hey guys, I did a deletion of a duplicate on wikidata [12:27:14] but actually I don't know if I am allowed to do such things [12:27:24] does someone know? [12:28:30] got an answer on the wiki [12:28:54] jeblad_WMDE: https://gerrit.wikimedia.org/r/#/c/29956/ merge please?
[12:29:06] running the tests [12:36:11] Change merged: John Erling Blad; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29956 [12:37:16] Vito_away: thanks, btw [12:37:56] you're very welcome [12:42:36] Merlissimo: not for the reason she was editing... to make the bot talk! [12:42:53] Will the Hungarian Wikipedia be the first one to turn on Wikidata? [12:43:49] TorstenK: are you press? [12:44:03] yes [12:44:26] Lydia_WMDE: ping ? [12:44:47] Hi, is it possible to create links to pages other than Wikipedia in Wikidata right now? [12:45:13] Dakdada: i haven't edited it yet so idk... maybe you could check the interwiki map ? [12:46:18] Dakdada: no, check the allowed values of site at http://wikidata-test-repo.wikimedia.de/w/api.php?action=paraminfo&modules=wbsetitem [12:46:35] or that [12:47:08] the interwiki map is independent. these values are related to the sitematrix. [12:47:37] Yahoooo. Congrats everyone! [12:48:26] jeremyb: Lydia_WMDE is on lunch right now [12:48:40] Denny_WMDE: see TorstenK above... [12:49:10] TorstenK: that is the current plan, yes. They have volunteered to be the first one to try it out. [12:49:45] Thanks for the answers, that's what I thought. I guess Wikipedia takes priority for now. [12:50:43] Dakdada: yes [12:53:18] TorstenK? Norway? [12:56:44] that was short... [12:57:41] I got some "missing wiki" error messages [12:57:53] yes, we had those reports. not sure where it comes from [13:00:20] (maybe because someone added many whisky distilleries' entries .___.
) [13:03:13] jem-: pong [13:09:44] jeblad_WMDE: will you add wbgetentities to wikidata.org or will this wiki use wbgetitems only (my bot already uses wbgetentities) [13:10:17] jeblad_WMDE is at lunch [13:10:27] but it will move to wbgetentities [13:10:41] i think we will update the version of wikibase on wikidata.org at some point [13:10:41] wikidata.org runs a slightly older code base, that is all [13:10:48] some things will change but not sure when we will update [13:10:49] the API will break [13:10:56] oh, right, we should write that somewhere :/ [13:11:00] and when? [13:11:19] maybe when the client gets installed on huwp? [13:11:38] so should i roll back my code to the old version? [13:11:38] not sure if we need another security review or what the process will be [13:12:07] Merlissimo: not sure [13:12:22] not sure when there will be bots on wikidata.org vs. when we update [13:12:56] it is only a renaming which could be easily backported [13:13:20] don't know [13:14:58] Vito_away: the beer entity is confusing wikidata [13:15:50] The wikitech-l list has already been notified about breaking changes in the API [13:16:17] Merlissimo: i'd say no to the backporting [13:16:48] An updated branch of wikidata will suddenly appear.. ;) [13:16:53] * aude doesn't work on the api though [13:18:04] * Vito_away hates edit conflicts [13:19:17] jeblad_WMDE: does the wikidata.org branch already support arrays? [13:19:21] Vito_away: i think there's some room for improvement there but not sure when/what will happen [13:19:22] We haven't completed the code for patching items, but it will basically work as if there were no edit conflicts [13:19:45] if i edit a completely different site link from someone else, i think it should auto-merge the conflict [13:19:58] but it doesn't appear to do that yet [13:19:58] yep, there's much room for improving edit-conflict management :D [13:20:06] even on wikipedia too, though [13:20:12] Merlissimo: Not sure if that code got in ..
or if it was part of the backport [13:20:15] the code is a bit ugly for edit conflicts :o [13:20:18] actually the error message for a duplicate entry should give an easy way to solve it [13:20:28] at least a link to the other entry [13:20:33] * aude nods [13:21:00] if you want to note that here: http://www.wikidata.org/wiki/Wikidata:Contact_the_development_team [13:21:03] that would help [13:21:10] Oh,.. it is a way to find duplicate entries... [13:21:12] to make sure we don't forget [13:21:13] * jeblad_WMDE slaps head [13:21:24] anyway I can understand an edit conflict is quite a bit harder to manage on wikidata than on wikipedia [13:22:35] We have a quite nice piece of code to do diff/patching [13:23:24] The reason why the code is not already doing patching is because we needed something up and running and I messed around for too long [13:24:43] * jeblad_WMDE is master of fuckups [13:24:53] lol [13:24:58] oh please. [13:25:12] jeblad_WMDE: you're not the master :) [13:25:21] !nyan [13:25:21] ~=[,,_,,]:3 [13:25:27] :) [13:27:31] jeblad_WMDE: nonsense :) [13:27:38] as so often i disagree with you [13:28:20] but yes, it will be nice to see the merge code up and running, but we really need to get our prios settled [13:28:49] 48-hour days? [13:29:27] Can anybody explain to me what's going on here? https://gerrit.wikimedia.org/r/#/c/30736/1/lib/resources/wikibase.ui.PropertyEditTool.EditableValue.SiteIdInterface.js [13:29:47] cr-lf stuff? [13:31:43] line ending stuff, yes. not sure exactly what, though [13:41:34] guys, I don't know if you noticed that [13:41:45] but "Create an item" is missing in the left column [13:41:59] is it intentional? [13:42:25] Sannita: yes, for now [13:42:33] Lydia_WMDE: oh, thanks [13:43:45] * jeblad_WMDE has a "mess up gene" which is quite unique [13:53:40] Are Wikidata URIs currently dereferenceable? [13:54:56] csarven: if you mean in the linked data sense, no, not yet [13:55:04] Yes. Thanks.
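The auto-merge behaviour discussed above (edits to two *different* sitelinks should not produce an edit conflict) amounts to a three-way diff/patch over the item's sitelinks. A hypothetical sketch, not Wikibase's actual diff/patching code, with the Brussels items from earlier in the log as illustrative data:

```python
# Three-way merge of sitelink maps: apply both editors' diffs to the base
# revision; only flag a conflict when both changed the SAME sitelink.

def diff(base: dict, new: dict) -> dict:
    """Changed/added/removed keys between two sitelink maps (None = removed)."""
    out = {}
    for key in base.keys() | new.keys():
        if base.get(key) != new.get(key):
            out[key] = new.get(key)
    return out

def merge(base: dict, mine: dict, theirs: dict):
    """Apply both diffs to base; return None on a genuine conflict."""
    d_mine, d_theirs = diff(base, mine), diff(base, theirs)
    conflicts = {k for k in d_mine if k in d_theirs and d_mine[k] != d_theirs[k]}
    if conflicts:
        return None  # same sitelink changed differently: real conflict
    merged = dict(base)
    for d in (d_mine, d_theirs):
        for key, value in d.items():
            if value is None:
                merged.pop(key, None)
            else:
                merged[key] = value
    return merged

base = {"enwiki": "Brussels", "nlwiki": "Brussel"}
mine = {"enwiki": "Brussels", "nlwiki": "Brussel", "frwiki": "Bruxelles"}
theirs = {"enwiki": "Brussels Capital Region", "nlwiki": "Brussel"}
print(merge(base, mine, theirs))
```

One editor added frwiki while the other fixed enwiki; the diffs touch different keys, so both survive in the merged result with no conflict raised.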
[13:56:31] Do you have an ETA on that, or when do you reckon the URIs are /safe enough/ for interlinking? [13:57:12] no ETA [13:59:14] Change on meta_wikimedia: page Wikidata was modified, changed by Denny Vrandečić (WMDE) link https://meta.wikimedia.org/w/index.php?diff=4356408 edit summary: [14:01:16] I'm doing some config changes on the test client (en first, then he). Hope you won't notice it. ;) [14:02:41] Change merged: John Erling Blad; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30736 [14:10:41] Denny_WMDE: we need a plan for making canonical URIs discoverable [14:12:22] I'd just take some of the code from SMW for that, especially the URI dereferencer and the meta link in the header. [14:12:47] that should be sufficient [14:13:15] GERRIT!! =[ [14:13:34] MAILMAN! [14:13:38] -.- [14:14:20] Denny_WMDE: ok. [14:14:21] hm... so... [14:14:37] Denny_WMDE: i'm about done with my backlog, would look into the change propagation stuff now. [14:14:42] or do you have anything urgent? [14:15:38] i guess i could look into Bug 41491 [14:15:47] oh, also... [14:16:47] Denny_WMDE, aude, AnjaJ_WMDE: what's the status of the jenkins failures? [14:17:05] no idea [14:17:05] do we have any idea now why so many tests fail on the wmf install? do they still? [14:17:13] AnjaJ_WMDE is checking i think [14:19:03] DanielK_WMDE: besides the jenkins problems, as said: getting a new branch out that can be reviewed, that would be great, and i would love to prio that [14:19:45] there are a lot of db errors (transactions that is) but no clue yet why [14:19:50] Denny_WMDE: sure - i was asking whether you know of anything blocking the branch [14:20:13] AnjaJ_WMDE: have you tested on sqlite? [14:20:15] nope [14:20:30] that may be the cause [14:20:41] it is an item for next week I think [14:20:47] put it on bugzilla [14:21:27] AnjaJ_WMDE: i'm not sure what my request is... that we run jenkins on sqlite locally?
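Running the tests on sqlite, as discussed here, tends to surface MySQL-flavoured DDL, and that is exactly the `near "AUTOINCREMENT": syntax error` that turns up later in this log with the OAI extension: SQLite only accepts AUTOINCREMENT immediately after INTEGER PRIMARY KEY. A small demonstration with Python's built-in sqlite3 module (the table definitions are made up for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Valid SQLite: AUTOINCREMENT directly follows INTEGER PRIMARY KEY.
conn.execute("CREATE TABLE ok (id INTEGER PRIMARY KEY AUTOINCREMENT, t TEXT)")

# A mechanical MySQL-to-SQLite schema rewrite that keeps the keyword but
# not the required INTEGER PRIMARY KEY position fails to parse:
try:
    conn.execute("CREATE TABLE bad (id INTEGER AUTOINCREMENT, t TEXT)")
except sqlite3.OperationalError as e:
    print(e)  # e.g. near "AUTOINCREMENT": syntax error
```

This is why update.php can succeed on MySQL yet abort on sqlite for extensions whose .sql files were never adapted.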
[14:21:51] we could do that, but once jenkins works at the wmf for us, what's the point? [14:21:56] no, I meant it is on bugzilla already [14:21:56] see [14:22:08] * aude thinks just run the tests with sqlite [14:22:19] and fix any bugs related to that [14:22:19] https://bugzilla.wikimedia.org/show_bug.cgi?id=41420 [14:22:33] doesn't have to be with jenkins here [14:23:01] aude: true. and i'll try that now. but the question remains: do we want a jenkins run with sqlite? [14:23:08] i mean, at the office [14:23:15] it is not only those bugs, there are some that need looking into from others like jeblad_WMDE: https://integration.mediawiki.org/ci/job/MediaWiki-Tests-Extensions/1501/testReport/(root)/Wikibase_Test_ApiBotEditTest__testCreateItem/testCreateItem_with_data_set__1/ [14:23:15] if it's easy to do, sure [14:23:39] jeblad_WMDE thought there was an issue with the config also [14:23:47] i am not sure we need the local jenkins run on sqlite [14:23:48] something to do with bots ^ [14:24:04] as DanielK_WMDE says, once jenkins runs on wmf, what would be the point? [14:24:11] the config for wmf jenkins is on gerrit so someone could look at it [14:24:15] Denny_WMDE: agree [14:24:19] the selenium tests use a special bot user [14:24:30] Nag TobiasG_WMDE about it.. [14:24:34] jeblad_WMDE: but that's not from selenium [14:24:45] jenkins doesn't run selenium on the wmf cluster [14:24:53] jenkins using selenium? [14:24:53] wmf doesn't run any selenium tests afaik [14:25:03] nope [14:25:08] the selenium tests are not run at all by wmf, afaik [14:25:08] oki, then I have no idea [14:25:11] jeblad_WMDE: we have jenkins calling selenium locally, the wmf doesn't [14:25:24] AnjaJ_WMDE: https://gerrit.wikimedia.org/r/#/c/28499/ [14:25:31] they are doing some new QA stuff though [14:25:36] the bugs have to come from the phpunit tests [14:31:52] durrr. [14:31:52] Anyone who is familiar with Wikidata and is free now, do verify what I said on Help talk:Contents.
Thanks :) [14:31:52] from within function "DatabaseBase::sourceFile( /DATA/var/www/daniel/wikidata/extensions/OAI/update_table.sql )". [14:31:52] Database returned error "1: near "AUTOINCREMENT": syntax error" [14:32:36] hm, why is wikidata.org so slow? [14:32:53] because wikipedia keeps pushing it out of all the caches, i guess... [14:33:32] Hydriz: just replied [14:33:32] looks good :) [14:33:37] hehe, thanks :) [14:33:41] Hydriz: feel free to poke me directly for things like that [14:33:52] sure. [14:34:07] man, we need a temp admin [14:34:43] Hydriz: anything specific you need fixed by an admin? [14:34:50] Change merged: John Erling Blad; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30590 [14:34:58] nothing, but the occasional deletions and all [14:35:42] Hydriz: ah yeah - the option is still to just repurpose them to something that doesn't exist yet [14:35:54] not perfect but... [14:36:07] true... [14:36:22] Hydriz: your answers are pretty good... I don't really want to add another list of 13 points with my answers, though. [14:36:38] how about you start the FAQ page with these, and I edit when that is up? [14:36:40] DanielK_WMDE: Thanks :) [14:36:58] i'd love to add my 2¢, but this format is rather odd. [14:37:06] Hydriz: if you do a faq there is some stuff on [[Wikidata/FAQ]] that might be interesting [14:37:06] [6] https://meta.wikimedia.org/wiki/Wikidata/FAQ [14:37:19] Lydia_WMDE: Sure, thanks for that too [14:37:57] np [14:39:42] New patchset: John Erling Blad; "(Bug 41214) Changes to use variant length prefixes and fragments" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/28527 [14:43:46] *damn* is that slow. [14:44:34] Lydia_WMDE: do you know who is running AsimovBot? [14:44:44] AsimovBot: poke [14:44:45] Error: Command “poke” not recognized. Please review and correct what you’ve written. [14:44:55] AsimovBot: help [14:44:55] Asimov v. 2, by jem- (IRC) / -jem- (Wikimedia), 2010-12.
Commands (prefix with -): ab acad ad alias alusiones arroba art ascii ayuda ayudando ayudante ayudanteop biblio bibliotecario bot bufer bug cab cac cafe calc cas cb cdb char creadores demoda dest dominio dpd drae exclam expand fetch flames gallery gblock google guion hiperignore hora ide ignore ip l links lista log logs ls luxo mant msg mw nuke => [14:44:55] op otrs otrsteam patea ping pong proyecto random rank rb relevo reset revisar silencio sincat stats status sug sul tam ticket ticketid user v vec vot webchat wikinick wikirank wlm. More: -? command; https://toolserver.org/~jem/asimov [14:45:09] o_O [14:45:09] the fuque? [14:45:14] DanielK_WMDE: jem- iirc [14:45:42] jem-: can you make AsimovBot link to wikidata.org instead of meta, please? [14:47:19] * DanielK_WMDE sent mail [14:47:27] anyway [14:47:33] so. [14:47:53] AnjaJ_WMDE: i can't run update.php on sqlite. fails with a syntax error. looks like we don't have support for sqlite at all. [14:48:10] so... i wonder how jenkins is running that stuff at all ;) [14:50:23] * DanielK_WMDE blinks [14:50:36] they don't run update.php at wmf [14:51:12] is update.php broken for core or for wikibase? [14:51:55] sorry, i was wrong. should have looked more closely before yelling. [14:52:07] it's the OAI extension that fails on sqlite, not Wikibase [14:52:15] ah, not surprised [14:52:41] * aude thinks JeroenDeDauw uses sqlite with some tests or something [14:53:04] and i've used it via stuff in /maintenance/dev/ [14:53:23] * DanielK_WMDE is now running all wikibase tests on sqlite [14:53:23] People are throwing welcomes at each other :/ [14:53:28] \o/ [14:53:28] :) [14:53:41] i guess i should set up my private user page [14:53:58] aude: what's up with the favicon? [14:54:03] DanielK_WMDE: caching [14:54:34] try http://wikidata.org/favicon.ico or put a "?"
at the end if necessary to invalidate cache [14:55:16] DanielK_WMDE: There is one good reason to add back action=raw [14:55:31] It's a good way to verify the internal structure [14:56:04] Now we must implement ways around it by using wfDebug to see what's in there [14:56:39] aude: DanielK_WMDE: heh? what's the deal with sqlite? [14:57:53] New patchset: John Erling Blad; "(Bug 41383) Update wbgetentities to report statements for items" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/29945 [14:58:52] JeroenDeDauw: some of our tests fail on sqlite. i see 5 failures and 3 errors. will look into it [14:59:30] jeblad_WMDE: no, you can use the API. just ask for the revision text as you usually would. [15:00:14] jeblad_WMDE: action=raw doesn't have a good way to report the content model (http headers work for the format, but not the model) [15:00:14] the API response contains all the info necessary to interpret the data [15:01:25] aude: guh, firefox holds on to old favicons forever [15:02:39] * jeblad_WMDE wants a Spesial:Dump [15:03:22] point, click and drag and they will update [15:03:58] New patchset: Jeroen De Dauw; "added convenience function to check if something is a coordinates string" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/30786 [15:05:03] -group DataValueExtensions,Diff,Wikibase --verbose --exclude-group WikibaseAPI [15:05:03] 04Error: Command “group” not recognized. Please review and correct what you’ve written. [15:05:03] DanielK_WMDE: running fine for me on sqlite [15:05:23] lol AsimovBot [15:06:08] -moron [15:06:08] 04Error: Command “moron” not recognized. Please review and correct what you’ve written. [15:06:18] hehe [15:06:23] nerd humor [15:17:41] New review: Jeroen De Dauw; "Any reason to be particularly concerned about the performance of these methods?"
[mediawiki/extensions/Diff] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/30643 [15:17:41] Change merged: Jeroen De Dauw; [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/30643 [15:21:02] is it guaranteed that the id of an item is always equal to the lower case page title? [15:25:36] Merlissimo: the id? [15:25:40] that's a number [15:26:15] !nyan [15:26:16] ~=[,,_,,]:3 [15:26:24] no that's e.g. q321. wbgetitem always needs the item id, not the page_title which is Q321 [15:27:35] then when i want the sitelink of a page, do i always have to request the page info first (which contains title and item id), or can i convert it myself [15:27:35] ? [15:27:36] Prefixed ids are in lowercase, but in the title and URL the first letter will be uppercased [15:28:47] We are shifting to prefixed ids, but some places it will be valid to use an unprefixed id [15:29:20] so it is safe if i convert the page title to lowercase internally instead of sending an additional request for getting the item id [15:30:32] The prefixed id, that is the title, will always be the prefixed item id as you get them back from wbgetentities [15:31:04] But then you don't know if they actually point to an entity or just happen to be a title [15:31:08] ;) [15:31:57] Also, we don't assume anything about the namespace anymore, but I think it is still used in a couple of places in the code [15:32:05] It will probably go away with time [15:32:11] at toolserver i only know the page_title as it is stored in table page. [15:32:29] Some handlers still use the namespace I think [15:33:05] There should also be a field for content model [15:33:40] If the content model is an entity then you can use the title as the prefixed id [15:33:57] yes, which contains the string "wikibase-item" [15:34:41] what was the reason to not use page_title=item=id? [15:35:54] You mean a free string as id?
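The convention being discussed above (internal prefixed ids like q321, uppercase-first page titles like Q321) amounts to a one-character case flip. A minimal sketch, with helper names of my own invention; as the discussion notes, a title that merely looks like an id need not be an entity, so this only holds when the page's content model really is "wikibase-item" and the wiki uses MediaWiki's default uppercase-first title configuration:

```python
def title_to_prefixed_id(page_title):
    """Page title like 'Q321' -> internal prefixed id 'q321'.

    Only the first character differs: MediaWiki uppercases the first
    letter of titles by default, while the internal id is lowercase.
    """
    return page_title[:1].lower() + page_title[1:]


def prefixed_id_to_title(prefixed_id):
    """Inverse: internal id 'q321' -> page title 'Q321'."""
    return prefixed_id[:1].upper() + prefixed_id[1:]
```

On wikis that override the uppercase-first default (jvwiki is mentioned above), this conversion is not valid.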
[15:36:16] Mainly that we want a fixed url that never changes [15:36:33] no, only using lower case page titles, or uppercase ids [15:36:39] No wrangling internally over how something shall be written [15:36:59] No Denny.. [15:37:02] Software makes the prefixes uppercase [15:37:15] (i'm just thinking of the cpu power all scripts will use just for converting this :D) [15:38:11] uppercase-first is only the default config of mediawiki, but that can be changed, as e.g. on jvwiki [15:38:13] Personally I would prefer uuid and use a few bits to encode the type of entity [15:38:53] New patchset: Jens Ohlig; "Little fix for wbsearchentities with empty result list" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30790 [15:38:53] JeroenDeDauw: https://bugzilla.wikimedia.org/show_bug.cgi?id=41553 [15:40:19] Sometimes it feels like something has exploded in my brain.. [15:40:19] the problem is that wbgetitems has its own action and so no generators can be used. [15:40:21] What was I doing.. [15:40:50] Change merged: Jens Ohlig; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/30584 [15:41:21] Merlissimo: Good point, think I said that last spring.. ;) [15:41:45] yes [15:42:32] Change merged: Jens Ohlig; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/30587 [15:42:43] btw the wbgetentities docs link to wbgetitems on api.php at wikidata.org, and on the test repository wbgetitems links to wbsetentities [15:47:29] Change merged: John Erling Blad; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/30786 [15:55:27] how can i add redirects as sitelinks?
[15:55:56] the interface automatically added the redirect target [15:56:14] We only want to link to the target [15:56:55] but the redirect can be about a different item [15:57:23] Then it is wrong from our perspective [15:57:25] Merlissimo: yes then you have different items for them [15:57:37] It must be linked at the project itself [15:58:00] Lydia_WMDE: and how do i connect the item to the redirect? [15:58:23] Merlissimo: you don't as jeblad_WMDE said [15:58:24] you can't [15:58:31] can you give an example where you need it? [15:58:40] then maybe we can help more [15:59:32] In some cases several subtopics are linked to a common page with fragment identifiers [16:01:29] Lydia_WMDE: http://de.wikipedia.org/w/index.php?title=Clyde_Barrow&redirect=no . in these cases redirects also have categories [16:01:41] It's not that unusual on Wikipedia, but we don't support it, as it would break one of the constraints.. I think we could code around it, .. [16:01:57] the date of birth should not be added to an item connected to the redirect target [16:02:37] for bots these pages should be marked with __STATICREDIRECT__ [16:02:37] Merlissimo: in this case there should be an item for Clyde without a link to enwp atm i guess [16:07:14] I think they should be turned into stub articles [16:07:31] yeah that's an option [16:07:50] But it's not a wikidata problem, the projects should decide for themselves [16:08:47] this example won't be accepted as an article by the dewiki community [16:09:10] We could point to the redirects without resolving them in these cases as they pose no problems for us [16:09:31] But then the target page can't access the item either [16:09:38] just check if this page contains the magic word staticredirect [16:09:53] jeblad_WMDE: i think EntityFactory::getByID (or whatever it is called) could simply normalize the ID to lower case.
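The bot-side check Merlissimo proposes a few messages up — only treat a redirect as potentially item-worthy when it carries the __STATICREDIRECT__ magic word — could look roughly like this. is_static_redirect is a hypothetical helper operating on raw wikitext, not part of Wikibase:

```python
import re

# "#REDIRECT [[Target]]" is the English redirect syntax; localized
# keywords (e.g. #WEITERLEITUNG on dewiki) are ignored here for brevity.
REDIRECT_RE = re.compile(r'^\s*#REDIRECT\s*\[\[', re.IGNORECASE)


def is_static_redirect(wikitext):
    """True if the page is a redirect marked with __STATICREDIRECT__."""
    return bool(REDIRECT_RE.match(wikitext)) and '__STATICREDIRECT__' in wikitext
```

As DanielK_WMDE notes right after, parsing page text during title normalization is off the table server-side; a check like this would live in a bot or analysis script.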
[16:10:02] it would be fine for the API to accept the upper-case version [16:10:03] No we won't parse pages during normalization of titles [16:10:17] i'll put it on bugzilla to be discussed [16:10:34] bonnie has a different date of birth than clyde, so the item cannot be used for the combined article [16:11:03] BTW, I'm still getting the occasional 404 failure from the Apaches. [16:11:12] jem-: yeah :/ [16:11:12] It is possible to ask for data from specific items, but it will be a mess [16:11:12] same here [16:11:21] jem-: also still some missing wiki errors [16:11:27] eh [16:11:29] jem-: not you :D [16:11:32] James_F: ^ [16:11:46] Lydia_WMDE: :-( [16:11:48] It's a 0.176% fringe case and I don't think we will put any real effort into it [16:12:54] Hmm. Is there no entry for Angela Merkel yet? Obama and Cameron have entries.. [16:13:22] * James_F had expected that WMDE staffers would want to correct our Anglosphere-leaning imbalance. :-) [16:13:36] Basic idea is "link common items and leave the rest to the community" [16:13:42] Indeed. [16:14:18] I tried and got voted down. ;/ [16:14:36] Exists now; Q567. [16:14:51] It's not only anglocentric, it's also paternal. [16:15:18] sofixit community :D [16:16:21] Oh, no, I'm going to whine about it for the next month! [16:16:24] JeroenDeDauw: do you have the current id blacklist we are using on wikidata.org? [16:16:43] Jens_WMDE: or do you know where to find it? did we deploy it on the test system? [16:17:12] DanielK_WMDE: we did [16:17:26] Lydia_WMDE: i want to see the config for that :) [16:17:41] Silke_WMDE: do you know where it is? [16:17:58] what? [16:18:12] ah. [16:18:19] no. some e-mail thread from weeks ago. [16:19:10] DanielK_WMDE: spoiler...
http://media.tumblr.com/tumblr_ma1qc0Ow6T1qdldxh.png [16:21:25] http://ecomodder.com/imgs/nissan-wood-spoiler.jpg [16:22:42] DanielK_WMDE: no, ask Denny [16:23:12] or perhaps aude also knows [16:23:40] https://gerrit.wikimedia.org/r/#/c/30626/ [16:25:24] James_F|Away: created some imbalance yesterday [16:25:41] thanks aude [16:26:05] !nyan [16:26:05] ~=[,,_,,]:3 [16:26:10] heh [16:26:47] ok, so that blacklist doesn't seem to have an impact on the test failures [16:26:55] good [16:27:16] sqlite seems to trigger some errors, but not nearly as many as we see on jenkins [16:27:31] http://www.wikidata.org/wiki/Q149 :) [16:38:37] ugh... [16:39:02] our queries for search terms use on-the-fly charset conversion? gah! [16:39:14] that means a full table scan plus a disk sort. [16:39:20] that's *extremely* slow [16:39:55] (and also, sqlite doesn't support it) [16:42:11] DanielK_WMDE: what do you suggest as an alternative? [16:42:24] Jens_WMDE: nothing? [16:42:55] Jens_WMDE: text in the search index must be normalized upon insertion. then the search terms are also normalized before running the query. [16:43:04] that way, no on-the-fly conversion should ever be needed. [16:43:19] Jens_WMDE: i'm already hacking at it. [16:43:20] this is strange... [16:43:33] TermCacheTest should never have worked. [16:43:34] not for mysql either [16:43:48] Warning: Test method "testTermArrayStructure" in test class "Wikibase\Test\TermCacheTest" is not public. [16:48:32] Change merged: John Erling Blad; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30790 [16:48:55] Jens_WMDE: are we always doing case-insensitive searches? or do we need case sensitive too? [16:49:01] if we need both, we have to double the index. [16:51:47] Just for the record, I said that was slow but it was decided it was good enough for now [16:51:49] as for now, we always do case-insensitive matches [16:52:44] i don't understand why we'd need on-the-fly conversion, ever.
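The scheme DanielK_WMDE sketches above — normalize terms once on insertion, normalize each query the same way, never convert on the fly — might look like this in outline (an illustrative sketch, not the actual Wikibase term-cache code):

```python
import unicodedata


def normalize_term(term):
    """Canonical search form of a label or alias.

    NFC-transform first (as suggested in the discussion), then case-fold
    for case-insensitive matching. Store this form alongside the original
    text and apply the same function to every incoming query, so the
    lookup becomes a plain indexed equality match with no on-the-fly
    charset conversion.
    """
    return unicodedata.normalize('NFC', term).casefold()
```

If case-sensitive search were needed as well, a second stored form (or index) would be required, which is the index-doubling mentioned above.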
[16:52:59] I think it would be good enough to normalize to upper or lower case [16:53:27] Also on insertion we should do an nfc-transform [16:53:46] and search with the nfc-transform [16:54:41] we need conversion, but not necessarily on-the-fly conversion [16:59:06] you know that namespace prefixes are always case insensitive? [17:03:04] Is Denny in the office? [17:03:09] no [17:03:33] I'll look for him later [17:03:49] He said he would be online [17:04:47] I've found the source of the intermittent 404s and redirects to missing wiki on meta [17:04:54] As expected, 2 apaches are out of sync [17:04:55] nICE! [17:05:02] Reedy: \o/ [17:05:02] fixable? [17:05:10] Yeah, should be [17:05:11] should i add srv194 to mediawiki-installations? [17:05:16] At worst, we take them out of rotation [17:05:20] i am wondering if it is special [17:05:49] srv199 though seems all normal like it should have been synced [17:05:51] meh [17:06:08] sync-common is still running on 194 [17:06:16] ok, sounds a bit slow [17:06:31] i think wbc_items_per_site has a bug: ips_site_page VARCHAR(255). This stores the full title with namespace prefix. But a title without the namespace can already have 255 chars [17:06:44] ok, just finished [17:07:14] mutante: that looks to have fixed 194 [17:07:26] srv194.pmtpa.wmnet 200 OK 37884 [17:07:29] nice! [17:10:26] 199 done, still broken [17:10:26] srv199.pmtpa.wmnet 404 Not Found [17:10:26] sounds like it's possibly got an out of date apache config/apache not restarted [17:10:36] mutante: main.conf is out of date on 199 [17:10:36] no wikidata in it [17:10:57] hrmm, can fix, just wonder why it fails sync [17:11:11] is it possible this has not synced since January? :o [17:11:29] slightly out of date? =) [17:11:34] The last Puppet run was at Tue Oct 30 16:17:37 UTC 2012 (48 minutes ago). [17:11:34] Ubuntu 12.04 LTS auto-installed on Tue Sep 11 22:55:25 UTC 2012.
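Merlissimo's point about ips_site_page VARCHAR(255) can be made concrete: a bare title may already use the full 255 bytes (MediaWiki's page_title is a 255-byte varbinary), so title plus namespace prefix cannot be guaranteed to fit. A hypothetical check — the function is mine, and whether the column counts bytes or characters depends on the table's charset:

```python
def fits_ips_site_page(namespace_prefix, title, limit=255):
    """Would 'Prefix:Title' fit in a 255-byte column?

    Prepending any namespace prefix to a title that is itself allowed
    up to 255 bytes can overflow a column sized for a bare title.
    """
    full = namespace_prefix + ':' + title if namespace_prefix else title
    return len(full.encode('utf-8')) <= limit
```

Multibyte titles hit the limit even sooner when the column is byte-measured, which is the conservative assumption made here.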
[17:11:48] It can't be more than 2 months out of date ;) [17:12:34] ok, i said that because it got a new mainboard and MAC address back then [17:12:50] was checking if maybe it had just been worked on [17:13:01] but not the case [17:14:15] ah [17:15:32] New patchset: Aude; "store rc_bot in change info blob" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30803 [17:16:30] Merlissimo: do you want to make a bug for that? [17:16:37] or i can do it, if you prefer [17:16:59] maybe namespace should be a separate column? [17:17:45] aude: i have an old revision installed locally from september. i am not sure if the content of this field may have changed. the create table statement is still the same [17:17:53] ok [17:18:13] i do think DanielK_WMDE is looking at our strategy for handling changes so the table might even be deprecated [17:18:28] not sure it makes sense to have the table like that anyway [17:18:45] but if we keep it, we can look at the issue [17:21:01] aude: i would need that table for data analysis on the toolserver because there is no other way to get langlinks for a page, is there? [17:24:26] srv199 fixed [17:31:24] Merlissimo: langlinks for an ordinary page or sitelinks for an item?
[17:32:46] i "only" need langlinks for a page [17:32:54] like the langlinks table atm [17:35:10] I wonder if your bot should be reimplemented as a special page for analysis of the item [17:35:45] I wrote a few bugs about similar special pages [17:36:02] I think they could be really useful [17:45:34] jeblad_WMDE: if you like i could easily create a report on wikidata.org "items having sitelinks to nonexistent targets" [17:48:38] My point is that we should not create maintenance reports, we should provide the necessary information in place [17:49:50] A special page to produce a list of items and their state regarding a specific factor [17:50:19] Or perhaps even better, some way to trigger the test and provide the information on the page itself [17:50:52] A kind of "this sitelink is weird" marker [17:51:03] Merlissimo: you want to know from the langlinks table or somewhere, which ones are local and which are wikidata? [17:53:48] i need the langlinks connection from one page to another page for my script. as long as the items_per_site table is requestable at repo or client everything is fine for me [17:54:41] ok [18:01:48] New patchset: Aude; "store rc_bot in change info blob" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30803 [18:15:02] Hello all [18:15:31] PiRSquared17: hi! [18:15:39] PiRSquared17: thanks so much for your work so far [18:16:02] * Lydia_WMDE is sooo happy to see this coming together now and so many people helping :) [18:17:01] :) I'm excited to see this project going so well [18:17:13] \o/ [18:18:31] Is there a channel where all the edits are reported? [18:18:40] i don't think so [18:18:47] want to get that set up?
[18:19:09] sure [18:19:09] cool :) [18:19:09] let me know if you need anything [18:19:25] * Lydia_WMDE adds wikidata.org to the topic of the channel in the meantime [18:19:45] PiRSquared17: Do you not mean that: http://www.wikidata.org/wiki/Special:RecentChanges [18:20:01] PiRSquared17: irc://irc.wikimedia.org/wikidata.wikipedia [18:20:05] Is there a reason you don't use #wikidata for general chat about the project? [18:20:28] MF-W: ah. I tried #wikidata but it wasn't there [18:20:40] James_F: yes because it is not in the wikimedia namespace and a while ago freenode contacts for wikimedia did not have control over it [18:20:57] Lydia_WMDE: But that's fixed now. :-) [18:20:57] redirect it? [18:21:02] James_F: yeah but now everyone is here... :D [18:21:12] Merlissimo: it does [18:21:12] eh [18:21:15] mutante: ^ [18:21:18] you can make users autojoin this channel on join in the other if you ask "gc" nicely [18:21:21] why is my tabcompletion so fail today... [18:21:28] ah:) [18:21:33] New patchset: Aude; "cleaning up recent changes code, ExternalRecentChange stuff to handle more stuff" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30809 [18:21:34] Lydia_WMDE: We're using #wikimedia-wikivoyage for "technical" stuff about getting the project set up, but the community area is going to be #wikivoyage. [18:21:42] gotcha:) already works [18:21:46] Lydia_WMDE: Might be sensible to adopt the same pattern for here? [18:21:59] James_F: let's see how it goes like this for a few days [18:22:03] if we need to split it we will [18:22:12] * James_F nods. [18:22:57] PiRSquared17: or that: http://www.wikidata.org/w/index.php?title=Special:RecentChanges&feed=rss [18:23:18] is irc.wikimedia.org a freenode redirect or a separate server?
[18:24:05] Lydia_WMDE: it's a mw core process [18:24:15] Lydia_WMDE: irc.wikimedia is wikimedia-owned [18:24:37] k [18:24:47] * Lydia_WMDE sets up and connects [18:25:54] q694 [18:26:38] Lydia_WMDE: channel #wikidata.wikipedia [18:26:44] mutante: thx [18:26:55] bah! [18:26:57] hehe [18:27:00] Merlissimo: thanks! [18:27:13] tabcompletion is really failing me today :/ [18:27:30] for some irc clients just clicking on irc://irc.wikimedia.org/#wikidata.wikipedia will work [18:27:31] New patchset: Aude; "cleaning up recent changes code, ExternalRecentChange stuff to handle more stuff" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30809 [18:27:45] PiRSquared17: i have set that up with a different client unfortunately [18:27:50] so need to do it manually [18:28:24] hmmm and it seems to not work [18:28:31] ohhh [18:28:31] wait [18:28:33] it does [18:29:03] Lydia_WMDE: will we allow items like http://www.wikidata.org/w/index.php?diff=7665&oldid=7229 for userpages? [18:29:23] * Lydia_WMDE looks [18:29:37] New review: Aude; "I am already working on cleaning up code in the client extension, so there would be a bit of merge c..." [mediawiki/extensions/Wikibase] (master); V: 0 C: 0; - https://gerrit.wikimedia.org/r/30737 [18:29:39] or is it just for articles? [18:29:58] #pheh well in theory it is for everything so also this - up to the community to decide if it's ok or not [18:30:00] PiRSquared17: the question is: how to show these langlinks at his wikidata user page ;-) [18:30:09] Merlissimo: hehe [18:31:06] Lydia_WMDE: maybe you want to write "hello" on the other channel :D [18:31:51] Merlissimo: heh can't it seems [18:31:54] i think edsu may add wikidata to wikistream [18:32:21] http://wikistream.inkdroid.org/ [18:32:21] Lydia_WMDE: then you are normally kicked and banned for 5 minutes by the server [18:32:35] Oo [18:32:35] mean!
[18:32:35] :D [18:32:58] the irc channel is very cool but wikistream is a nice way to see the changes too [18:33:40] are you going to have recent changes channels on our own IRC server, btw? [18:33:43] the cvn networks need to know about this new channel, too [18:34:10] mutante: we already do [18:34:10] mutante: isn't that wikidata.wikipedia now? [18:34:15] k:) [18:34:15] Merlissimo: cvn? [18:34:25] Lydia_WMDE: #countervandalism [18:34:27] nice that it works out of the box [18:34:28] ohhh [18:34:30] ok [18:34:32] can someone do that? [18:34:41] they're the people who run channels like #cvn-sw [18:34:45] i see [18:34:57] and if you want RSS feeds added to planet, i can do that [18:35:13] mutante: planet wikimedia as in the blog aggregator? [18:36:04] yes [18:36:14] well, RSS aggregator [18:36:32] in different languages [18:36:53] mutante: ok we have a blog as part of wikimedia germany's blog [18:37:02] if that's already aggregated then it's fine [18:37:08] else i can get you a link to the rss feed [18:37:47] Wikimedia Deutschland Blog is in de.planet [18:38:02] but if the wikidata one has a different feed URL, sure [18:38:37] mutante: ok i'll look it up - might take some time - lots of other fires to put out [18:38:53] sure, just saying i am involved in planet anyways [18:39:03] :) [18:39:59] When will bots/mass-uploads be allowed on Wikidata? Is there any policy yet? [18:40:34] PiRSquared17: no policy but we've asked bot owners to not let their bots run yet because we want to give people a chance to first do stuff on their own [18:40:57] without having to mind bots already and having all the work done by them already [18:41:26] i think that's really important for the beginning [18:41:27] there's still years to come to run bots ;-) [18:41:50] Lydia_WMDE: bots cannot do "all the work".
e.g. descriptions and so on [18:42:02] Merlissimo: yeah of course [18:42:30] Merlissimo: and i fully expect people to become tired at some point of doing it by hand [18:42:54] and my bot is also spamming too many aliases (but i think removal by humans is easier than adding) [18:42:59] * jeblad_WMDE thinks Merlbot is a bit stupid, it must have an IQ less than 100 [18:43:02] but i really want to give people some days to play without bots interfering already [18:43:11] =D [18:43:44] a magic merge button is definitely needed [18:43:45] Can't write a description,.. ;p [18:44:08] Vito: yeah :/ there's a bug for it [18:44:11] It's good to see the software seems pretty stable [18:44:23] \o/ [18:44:24] I had to merge two entries by hand .__. [18:44:24] jeblad_WMDE: Merl_IW_Bot , i also have MerlBot but this bot really is stupid because it can only create reports. [18:44:37] Merlissimo: lol [18:44:37] switching language n-times [18:44:41] just looking at DB errors for wikidatawiki... wikidatawiki JobQueueDB::claim 10.0.6.44 1213 Deadlock found when trying to get lock; try restarting transaction (10.0.6.44) [18:44:45] Not anything specific to the site [18:44:47] Vito: :/ yeah not nice indeed [18:45:05] Lydia_WMDE: i also have Merl_Link_Bot for fixing weblinks [18:45:21] Merlissimo: what does it do exactly? [18:45:29] checking if links still work? [18:45:41] The bot Merlissimo made for fixing langlinks-mess is quite clever [18:46:10] if a domain changes or the site structure of a website changes i'll rewrite the url [18:46:27] reedy, i added profiler stuff, could be nice if you took a look when I post it on gerrit [18:48:00] jeblad_WMDE: but http://wikidata-test-repo.wikimedia.de/wiki/Q137109?uselang=en there were really too many aliases added.
i already added some code to reduce them [18:49:03] Ouch [18:49:36] Some languages have redirects for variations in upper-/lowercasing [18:49:38] :o [18:49:51] That will make a mess [18:50:29] jeblad_WMDE: but removing some is much less work for humans than adding e.g. "The City that Never Sleeps" [18:50:49] yes [18:52:46] jeblad_WMDE: have you already seen my biggest single edit of +65k? http://wikidata-test-repo.wikimedia.de/wiki/Q137159 [18:54:30] Interesting that only sitelinks can produce so much data [18:55:10] I like the interface [18:55:12] jeblad_WMDE: not much left until you hit the 2 MB limit ;-) [18:59:18] 65k that is nearly a complete early 5 1/4'' disc isn't it? (i think it was 80kb, but 5 were always needed after formatting) [19:00:13] single sided low density? [19:00:42] normal one was 320kB if I'm correct [19:04:31] i think my first msdos "boot" discs were 320 [19:06:35] windows for workgroups 3.11 are my oldest 3 1/2'' disks i can find atm. [19:06:51] almost up to 800! [19:11:10] I need some floppy disks actually [19:12:45] jeblad_WMDE: sure [19:15:09] hi there! I got a question, so for now interlinks are the 'only' data on wikidata is that right? [19:15:21] yes [19:15:26] mich2: and descriptions, aliases [19:15:31] sitelinks as we call them [19:15:34] but other than that, yes [19:15:53] so without it for example every wikipedia had to maintain its own list: http://www.wikidata.org/wiki/Q2 [19:16:03] statements and claims come next [19:16:52] of earth pages in other languages, and now it's all in place for the wikis listed on your page? [19:17:39] Merlissimo: right [19:17:49] Merlissimo: it's not used on the wikipedias yet however [19:17:49] bah [19:17:54] mich2: ^ [19:17:58] Lydia_WMDE: can you give me the bug number for merging entries? [19:18:04] Merlissimo: sorry -.- [19:18:06] Vito: let me check [19:18:34] it is definitely the most needed feature now!
[19:18:34] Vito: https://bugzilla.wikimedia.org/show_bug.cgi?id=38664 [19:18:40] it's hard to believe that all those wikipedias maintain their interlinks through bot networks [19:18:55] mich2: indeed! :) [19:19:05] it's amazing that it worked mostly so far [19:19:05] it seems much more reasonable to keep that in one place [19:20:16] It was hard to keep them in sync, especially when something went wrong [19:20:25] But then merlissimo made his bot [19:20:40] I wished for wikidata already back in 2007 [19:21:52] actually some years ago I wondered about a way to split data out of templates [19:25:43] really great work from the wikidata team [19:25:58] looking forward to the next phase of the project! [19:30:02] New patchset: John Erling Blad; "(Bug 41537) Added a number of calls to wfProfileIn and wfProfileOut" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30821 [19:32:17] alright peeps - time for me to go home [19:32:26] i'll be back online in 30 mins to an hour [19:32:33] thanks all for being awesome today :) [19:32:58] * jeblad_WMDE thinks Lydia needs some sleep now.. ;) [19:33:27] the rose never sleeps ;-) [19:33:29] well [19:33:31] maybe i do [19:33:35] ;-) [19:33:40] -> biab [19:35:12] How long before we hit Q1000? [19:36:07] couple of hours? [19:36:23] unleash the bots! [19:36:31] well, not yet but [19:36:53] * aude read that as Q10007 :o [19:37:08] Q10007 will be a bot, Q1000 will be a human [19:39:27] heh [19:40:01] New review: John Erling Blad; "Some entries are wrong, a new patchset is in the work."
[mediawiki/extensions/Wikibase] (master); V: 0 C: -2; - https://gerrit.wikimedia.org/r/30821 [19:45:18] https://www.wikidata.org/wiki/Special:RecentChanges [19:45:21] This is awesome/amusing [19:46:25] what about it [19:46:40] The activity is increasing and increasing [19:46:50] yes [19:46:59] thanks for taking on more of the elements [19:47:03] btw [19:47:36] @1337 should be saved [19:47:48] I think it is [19:48:09] My OCD makes me want to fix things when they have no english title... [19:48:21] https://www.wikidata.org/wiki/Q1337 [19:48:29] nice [19:50:17] https://www.wikidata.org/wiki/Q877 is my standard datapoint [19:50:51] http://www.wikidata.org/wiki/Q791 :D [19:50:51] it's not vandalism [19:51:29] hehe [19:51:41] aude: Yesterday I googled to find out what was the world's longest word... [19:52:06] https://www.wikidata.org/wiki/User:Aude/common.js < Useful Javascript for anyone. Add Special:CreateItem to your sidebar ;) [19:52:31] * aude shall add special:itembytitle there [19:52:31] There are several agglutinative languages where you can keep on adding words together [19:52:31] I am waiting for our first piece of spam/vandalism [19:53:25] It should only be possible to add spam and vandalism in labels, descriptions and aliases [19:53:26] don't tempt people :) [19:53:43] or links to wikipedia vandalism [19:53:43] *beans* [19:54:10] (diff | hist) . . Gangnam Style (Q890); 19:53 . . (+34) . . Stryn (Talk | contribs | block) (Added site-specific [nowiki] link: Gangnam Style) [rollback] [19:54:20] * Reedy facepalms [19:54:20] Perhaps it should be possible to suppress sitelinks in recent changes [19:54:53] hey can my bot Sk!dbot get botrights? [19:55:11] Not yet [19:55:14] Adding wrong links is an option now, but it will be harder and harder as there are fewer free sitelinks [19:56:51] Sk1d: no bots yet [19:57:17] is rollback or any of the nonadmin rights enabled?
[19:57:32] Guerillero|storm: should be [19:57:41] all the standard mediawiki config [19:57:50] ooh [19:57:57] wikimedia config [19:57:58] * PiRSquared17 can't wait for the first hat collector [19:57:59] heh, we're over 900 now [19:58:31] aude: when will bots be enabled? [19:59:14] I would put my neck on the line just to see what the RfA process on wikidata would look like [19:59:24] since it has never been tried [20:01:44] I'd half hope that admin should be given fairly freely to trusted community members [20:02:00] New patchset: John Erling Blad; "(Bug 41537) Added a number of calls to wfProfileIn and wfProfileOut" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30821 [20:02:44] I would hope so [20:03:05] Anybody with a decent history from other projects, or a few days and some hundred edits here? [20:03:27] Seems sensible [20:03:29] There will be many that grow tired and leave, so time is more important than number of edits [20:04:22] New review: John Erling Blad; "Should be fixed now" [mediawiki/extensions/Wikibase] (master); V: 0 C: 0; - https://gerrit.wikimedia.org/r/30821 [20:04:24] Hopefully we'll have bots doing this interwiki work again soon ;) [20:05:31] I am up to 70 edits or so [20:05:57] jeblad_WMDE: did you ask Merlissimo? he is a bureaucrat on de:wiki [20:06:47] I will not say anything about who I prefer (or not) as admin or bureaucrat [20:07:04] I'm staff..
[20:07:55] But I will probably do an RfA for my ordinary account when there is a community later on [20:08:16] the special:itembytitle workflow is much better [20:10:58] It half feels like we are only making an interwiki database [20:11:17] or am I missing something [20:11:56] That's correct [20:12:24] hello [20:12:46] But that is now, the sitelink database we are building will later be used for organizing dataflows [20:12:58] hello Mathonius [20:12:58] ok [20:13:04] that makes sense [20:14:27] there's no link to Special:CreateItem in the search results when I set the language to Dutch (nl), is this a problem I can report here? [20:15:16] Mathonius: use Special:ItemByTitle [20:15:24] the search isn't very good yet [20:16:29] I am starting to realise that [20:16:37] The title of that page is really off [20:16:51] It says the item is identified before it is found [20:16:55] when I get half way through a point I hit an error [20:16:59] * Lydia_WMDE is back [20:17:05] Aude: okay, that sort of works, thanks [20:17:36] Later versions are better [20:18:02] :) [20:18:22] Mathonius: yes, we will improve things but the special page is what i use [20:18:34] * aude wants to remind folks about http://meta.wikimedia.org/wiki/Steward_requests/Permissions [20:18:59] if you'd like to be an admin, request there (initially, until we have a local wiki process) [20:19:22] wikidata is currently a GS wiki, is that temporary? [20:19:29] * aude sees the request for deletion page but staff shouldn't handle those [20:19:34] GS?
global sysop [19:19:42] the crats want some sort of agreement that you should be an admin [19:19:45] i assume so [19:19:49] if it's automatic [19:19:51] Mathonius: until they get tired of us :P [19:20:06] haha :P [19:20:20] it won't be long until we have a few admins and crats [19:25:53] Like this aude https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions#Requests_for_adminship [19:26:36] Guerillero|storm: that works :) [19:26:57] the stewards can work with that i think [19:27:32] you might want to update http://www.wikidata.org/wiki/Wikidata:Community_portal [19:30:23] I added another link to the [[Wikidata:Requests for permissions]] page [19:30:23] 10[4] 10https://meta.wikimedia.org/wiki/Wikidata:Requests_for_permissions [19:32:36] Guerillero|storm: thanks [19:34:55] Q1000 [19:34:58] :p [20:35:33] I made Q999 [20:35:34] only 1 off [20:36:52] Romaine: :D have a cookie [20:37:01] Guerillero|storm: you too to make up for it ;-) [20:37:22] lol. danka [20:37:22] [20:35] How long before we hit Q1000?> [20:37:34] 1 hour [20:37:36] Oo [20:37:45] holy... [20:37:50] holy cookie [20:37:51] why do we need bots? [20:38:00] ;) [20:38:00] Denny_WMDE: lol [20:38:00] j/k [20:38:00] right [20:38:14] Romaine: ohhh holy cookie... i wonder how that one tastes [20:39:12] Denny_WMDE: to create edit conflicts and to create interwiki errors [20:39:19] \o/ [20:39:19] So Lydia_WMDE, we can now do the interlanguage thing, does that also include Commons? :-) [20:39:30] multichill: not yet [20:39:30] we absolutely must be sure we aren't without errors [20:39:40] Romaine: word! [20:39:59] grmbl, that would mean I could just trash a couple of my bots as useless :-D [20:40:18] multichill: heh useless bots are a good thing? [20:40:39] Of course, that means we have a better system [20:40:46] true tht [20:40:48] *that [20:41:08] well [20:41:09] we're getting there ;-) [20:41:31] Looking forward to your trip to Amsterdam?
[20:41:38] very much so [20:41:45] going to vienna after it [20:41:51] looking forward to that too [20:42:00] and utrecht will be the first talk after the launch [20:42:05] \o/ [20:43:33] New patchset: Reedy; "Tidy up inconsistent returns" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/30737 [20:44:37] Lydia_WMDE: And the whole wikidata thing should be really nice to replace https://commons.wikimedia.org/wiki/Commons:Monuments_database [20:44:37] what is in Utrecht? [20:44:55] multichill: indeed [20:45:00] indeed [20:45:15] multichill: yes! [20:45:23] Romaine: wikimedia nl conference where i am going to give a keynote [20:46:07] in Dutch [20:46:09] ? [20:46:22] hah [20:46:23] no [20:46:30] my dutch is non-existent [20:46:32] German then [20:46:37] It's nearly the same [20:46:43] i do speak english, german and some spanish and latin [20:46:43] but no dutch :/ [20:46:50] hehe nah - will do it in english [21:13:39] * PiRSquared checks to see if he is still the user with the most edits [21:14:47] third now [21:14:53] PiRSquared: noooooooooooo! :D [21:15:06] PiRSquared: who's first and second? [21:15:22] 1: Romaine 2: Electron [21:15:28] PiRSquared: cool - where do you see that? [21:16:12] I went through the whole list : https://www.wikidata.org/w/index.php?title=Special:ActiveUsers&limit=500 :P [21:16:47] hah [21:16:47] ok [21:16:49] I only added all the countries, continents, months, days of the week [21:17:44] Lydia_WMDE: can we test the current API?
[21:17:59] PiRSquared: on the demo system linked in the topic sure [21:18:07] PiRSquared: depending on what you want to do of course [21:18:14] * PiRSquared looks [21:18:30] PiRSquared: keep in mind though that we'll have to break the api once more [21:18:30] :/ [21:18:47] so what you see now on demo and live system isn't final [21:41:23] Sk1d: we should make a plan which features our bots must have to run on wikidata.org [21:44:46] hm, atm my bot is only able to check all interwiki links on a wikipedia and try to save all of them. if there is any conflict it skips the article [21:48:52] and what data is added by your bot? [21:49:25] my bot uses displaytitle as label, or the page title with brackets removed if there is no special display title. redirects are added as aliases [21:49:49] redirects that only have a different case are skipped [21:52:32] maybe we should create a page to request a bot flag and introduce our bots, so that the community can discuss them [21:54:27] * Reedy pets Merlissimo [21:55:36] * Merlissimo purrs [21:56:42] Reedy: zhwiki replag is really low since all old jobs were removed [21:56:48] haha [21:57:38] The name X is too similar to the existing account: Y [21:57:53] User account Y is not registered. [21:58:37] http://www.wikidata.org/wiki/Wikidata:Elements_Task_Force If anyone wants to work on elements [21:58:51] * kondi doesn't want just another username. :( [21:59:40] kondi: Don't you have an account on other wikis? Just log in [22:00:01] http://www.wikidata.org/wiki/Wikidata:Contact_the_development_team#Arabic_language_is_right-adjusted_instead_of_left-adjusted_in_tables [22:00:01] uhm [22:00:04] isn't that how it is supposed to be? [22:00:15] there is a User:Y on enwiki and i assume a global account [22:00:15] aude: ^ [22:00:15] http://en.wikipedia.org/wiki/User:Y [22:00:18] Guerillero|dinn: creating items including all langlinks would take only a few minutes.
so maybe you should wait until they are created and use your time for reviewing [22:00:58] aude: X and Y were placeholders. [22:01:05] Lydia_WMDE: it's supposed to be that way, though i suppose we could use the use/setlang to determine alignment [22:01:12] kondi: ok [22:01:19] aude: so it is correct as is? [22:01:47] Lydia_WMDE: correct unless we want to use the use/setlang to determine stuff like alignment and capitalization [22:01:59] aude: k thx - will reply [22:02:10] ok [22:02:28] Guerillero|dinn: time to create a page with a task forces listing? [22:02:30] :) [22:02:40] I added them to the community portal [22:02:52] ah cool [22:02:54] when the list gets too long we can fork [22:03:11] it is best to not have so many pages that people can't find things [22:03:43] we really need categories and a way to do redlinks [22:03:49] * aude thinks [22:04:03] Guerillero: yeah agreed :) didn't see you had added it there already [22:04:07] Guerillero: why do you not like Untribium [22:05:21] what number is it? [22:06:31] un-tri-bi-um 132 [22:06:38] * Guerillero can only find a de-wikipedia page about it [22:07:20] which has langlinks [22:07:55] frwiki has http://fr.wikipedia.org/wiki/Untrioctium [22:09:04] I wonder why there is nothing in english about the element [22:10:04] oh, it has yet to be synthesised [22:10:08] Guerillero: mind if I add a category to the task force pages? [22:10:15] sure [22:10:15] :) [22:11:55] Guerillero: do you know if there is a way to list all pages of the category on the community portal page? [22:11:58] umm [22:12:01] so this page doesn't need to be updated [22:12:10] I should know this [22:12:12] :D [22:15:30] I thought you could transclude it like a template [22:15:36] but it doesn't seem to work [22:17:13] :/ [22:17:13] i thought this was somehow possible [22:17:13] does anyone else know? [22:17:13] Lydia_WMDE: just include the category [22:17:28] Merlissimo: how exactly? :D sorry - being dense tonight...
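As an aside, Merlissimo's labelling heuristics from [21:49:25] and [21:49:49] above (use DISPLAYTITLE as the label, otherwise the page title with the bracketed disambiguator removed; add redirects as aliases, but skip redirects that differ from the label only in case) can be sketched as plain logic. This is a hypothetical helper for illustration, not actual pywikibot or Wikibase API code:

```python
import re

def derive_label_and_aliases(page_title, display_title, redirect_titles):
    """Derive a Wikidata-style label and alias list from a Wikipedia page.

    Rules as described in the log:
    - prefer DISPLAYTITLE if the page sets one, otherwise use the page
      title with a trailing "(disambiguator)" stripped
    - redirects become aliases, except those differing only in case
    """
    if display_title:
        label = display_title
    else:
        # strip a trailing parenthesised disambiguator, e.g. "Paris (Texas)"
        label = re.sub(r"\s*\([^)]*\)\s*$", "", page_title)

    aliases = []
    for redirect in redirect_titles:
        if redirect.lower() == label.lower():
            continue  # case-only variants are skipped
        aliases.append(redirect)
    return label, aliases
```

For example, "Paris (Texas)" with no display title and redirects ["PARIS", "Paris, TX"] yields the label "Paris" and the single alias "Paris, TX".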
[22:17:59] [[:Cat:foo]] [22:17:59] [7] https://meta.wikimedia.org/wiki/:Cat:foo [22:18:24] * Lydia_WMDE goes and tries [22:19:57] mmh, does not work. some years ago it was possible to include it simply by {{category:...}} [22:19:57] [5] https://meta.wikimedia.org/wiki/Template:category:%2E%2E%2E [22:20:16] [[:Cat:Foo]] would be linking [22:20:16] [6] https://meta.wikimedia.org/wiki/:Cat:Foo [22:20:19] yeah doesn't work :( [22:20:53] AsimovBot: sush! [22:20:53] Error: Command “sush!” not recognized. Please review and correct what you’ve written. [22:20:53] DPL? [22:21:00] Reedy: ? [22:21:17] Dynamic Page List/intersection [22:21:21] ah [22:21:28] or DanielK_WMDE's CategoryTree! [22:21:34] hehe [22:21:45] anything that works on wikidata.org now? [22:21:45] I can enable other extensions if needed/wanted [22:21:51] let's not do that for now [22:22:09] if it's not possible i'll leave it as is [22:22:18] that's good enough for now [22:22:18] even if not great [22:22:36] CategoryTree is enabled [22:22:52] oh [22:22:52] Foobar [22:24:00] i'm probably using it wrong [22:24:04] if someone else wants to give it a shot please do [22:24:09] http://www.wikidata.org/wiki/Wikidata:Community_portal [22:24:16] category is Task force [22:25:08] http://www.wikidata.org/w/index.php?title=Q64&action=history [22:25:09] -.- [22:25:22] anyone want to take care of that? [22:25:39] :/ [22:26:03] Wikidata has decided I want Italian [22:26:12] I would need rollback [22:26:16] I've got it [22:26:18] just having to login [22:26:28] https://www.wikidata.org/w/index.php?title=Q64&action=rollback&from=151.41.160.140&token=4b0242921a5f1519e327183383d2b25e%2B\ [22:26:28] oops [22:26:44] Lydia_WMDE: (cur | prev) 22:26, 30 October 2012‎ PiRSquared17 (Talk | contribs)‎ m . . (1,000 bytes) (+497)‎ . . (Reverted edits by 151.41.160.140 (talk) to last revision by Bobo11) (rollback 1 edit | undo) [22:26:52] PiRSquared: you're the best [22:27:01] :D [22:27:09] It's not showing it for me...
(the rollback revision) [22:27:39] https://www.wikidata.org/w/index.php?title=Q64&diff=13160&oldid=12956 [22:27:54] meh, replag seemingly [22:27:59] yeah [22:28:09] had a bit of that earlier too [22:28:15] Lydia_WMDE: you don't have admin/crat?! [22:28:32] PiRSquared: i do as staff but would like to not use it tbh where possible [22:28:58] if no-one else is around i'll of course take care of such things [22:30:22] Lydia_WMDE: http://www.wikidata.org/wiki/Wikidata:Community_portal#Task_forces [22:30:22] People should vote on people up at http://www.wikidata.org/wiki/Wikidata:Requests_for_permissions#Requests_for_adminship [22:31:20] Have we become reddit? [22:31:33] Reedy: i hope not! [22:31:39] it'd be way too early for that [22:31:46] reedyit [22:32:12] Reedy: also yay! [22:32:18] thanks [22:34:05] Page edits since Wikidata was set up 13,275 [22:34:07] Nice [22:35:42] oh my [22:35:51] Average edits per page 9.72 [22:40:37] I keep thinking of features in Translate it'd be really useful to have here [22:40:48] "All pages where no translation in your language exists" [22:45:33] Reedy: jep totally [22:45:40] we need that at some point [22:45:45] i think there's already a bug for it even [22:46:14] Lydia_WMDE: what about enabling translations for local messages like e.g. on incubatorwiki? [22:46:54] Merlissimo: can you give me details? sorry i'm not familiar with how it workd [22:46:56] *works [22:47:00] i wanted to create a bot template with some descriptions. but you said it should not have a default language [22:47:54] a bot template doing what? [22:48:03] * Lydia_WMDE is tired [22:48:49] for a bot userpage [22:48:59] ah [22:48:59] ok [22:49:22] e.g. http://incubator.wikimedia.org/wiki/Incubator:Main_Page contains parts that are translated into different languages. for en e.g.
http://incubator.wikimedia.org/wiki/MediaWiki:Featuredwikis-active/en is used [22:49:36] * Lydia_WMDE looks [22:49:56] http://incubator.wikimedia.org/w/index.php?title=Special%3ATranslate&taction=translate&group=page-Incubator%3AMain+Page&language=de&limit=100&task=view [22:50:17] Merlissimo: ok so this is basically the translate extension right? [22:50:43] Looks like it, yes [22:50:50] that's what it is used for on translatewiki.net. on incubator it is for translating local sites [22:50:54] => It is [22:51:01] as for that: i think we should totally enable that soon - however let's wait another day or two at least to see how everything is going [22:51:24] i'm afraid of adding more to the mix atm [22:51:37] unlike metawiki or commons, incubator is not enwiki-focused. i think that was also your intention for wikidata [22:51:37] And we probably need Niklas to help advise re config [22:51:52] Merlissimo: in principle yes [22:52:35] Reedy: maybe copying incubator's config for this would help [22:52:35] arrrgh, it's raining [22:53:32] The config is the same across all Wikimedia project usage [22:54:35] i think the extension is not enabled on any other wiki except incubatorwiki [22:54:43] it is [22:54:58] Soulkeeper is #1 now... [22:54:59] * Merlissimo looks at the config [22:55:12] wikimedia wikis, mediawikiwiki, metawiki, outreachwiki... [22:55:17] *wikimania wikis [22:55:55] oh yes [22:56:59] bewikimedia, that's Romaine ;-) [22:57:09] yes? [22:57:20] translateadmin [23:00:31] time for me to get some sleep - see you tomorrow folks :) [23:00:38] bye Lydia_WMDE ! [23:01:30] bye [23:26:49] Page edits since Wikidata was set up 14,877 [23:26:49] Average edits per page 10.21 [23:27:04] 50 minutes, 1600 esira [23:27:05] *edits [23:27:26] !! lots of activity already [23:27:49] Hmm [23:27:52] AllPages seems broken [23:27:55] http://www.wikidata.org/w/index.php?title=Special:AllPages&from=Q1&to=Q213 [23:29:43] Reedy: maybe alphanumeric order? [23:29:50] Maybe..
[23:29:58] But then the from/to is rubbish [23:30:29] Certainly, the sort-order could be improved. :-) [23:31:54] This wiki isn't good for people with OCD [23:33:44] it will be [23:36:54] Annoying [23:37:06] People keep creating pages, but with no links on them [23:37:17] and you can't see the description in other languages without changing [23:46:25] Edit conflicts are becoming very common :p [23:47:51] because a group of us are trying to correct them at once [23:47:58] yup [23:48:03] with no conflict handling [23:52:28] I have here 5 empty pages :S [23:52:49] I have here 5 edit conflicts xD [23:53:07] I have here 5 pages with no label [23:53:16] Romaine: means someone has only put a label in another language.. [23:53:29] I know [23:53:30] or emptied that page [23:53:34] PiRSquared: with no label at all or only with no label in your language? [23:53:44] Merlissimo: in English [23:53:59] * Romaine fills in all English labels [23:54:05] Is it possible for a page to have no label at all? [23:54:11] I have searched through AllPages [23:54:11] That's what I've been doing [23:54:17] for pages without an English label and added that [23:54:25] PiRSquared: english is needed for all items because ... [23:55:03] ENGLISH IS TEH LANGWAJ [23:55:14] CANONICAL BITCHES [23:55:16] ksh! [23:56:34] it is just a matter of fast working, then you have fewer edit conflicts :p [23:57:00] heh [23:57:14] pfff http://www.wikidata.org/wiki/Q101 [23:57:37] * Merlissimo doesn't know why people are doing so much unneeded work manually instead of focusing on the important parts [23:57:46] I added that [23:57:51] Merlissimo: For amusement [23:57:55] english seems to be the lingua franca of wikidata [23:57:58] What important parts?
:p [23:58:13] the focus on bots is an important part :p [23:58:34] English is the lingua franca of people in the world [23:58:38] things that cannot be done automatically: descriptions, creating items for langlink conflicts [23:58:52] but you've gotta create the pages [23:58:58] and the bot needs some links to exist..
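A closing note on the Special:AllPages complaint at [23:27:52] above: Q1, Q10, Q100 sorting ahead of Q2 is plain lexicographic string ordering of page titles. A minimal sketch of the difference, and of the numeric key that would give the humanly expected order (illustration only, not how MediaWiki actually pages Special:AllPages):

```python
def qid_key(title):
    """Numeric sort key for item IDs like "Q64": compare by the number."""
    return int(title[1:])

titles = ["Q1", "Q213", "Q2", "Q100", "Q10"]

# Lexicographic order, as Special:AllPages shows it:
print(sorted(titles))                # ['Q1', 'Q10', 'Q100', 'Q2', 'Q213']

# Numeric order, as a reader would expect:
print(sorted(titles, key=qid_key))   # ['Q1', 'Q2', 'Q10', 'Q100', 'Q213']
```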