[00:50:32] yayayay
[00:50:33] :D
[08:08:38] Hey there, I have officially announced that wikidata is enabled on beta (the bug request was https://bugzilla.wikimedia.org/show_bug.cgi?id=47827 )
[08:08:42] the site is http://wikidata.beta.wmflabs.org/wiki/Wikidata:Main_Page :)
[08:27:26] Change merged: jenkins-bot; [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/65817
[08:34:19] Change merged: jenkins-bot; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67104
[08:36:52] New patchset: Tobias Gritschacher; "(bug 49171) parse the copyright message in EntityView" [mediawiki/extensions/Wikibase] (mw1.22-wmf6) - https://gerrit.wikimedia.org/r/67241
[08:37:13] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (mw1.22-wmf6) - https://gerrit.wikimedia.org/r/67241
[09:29:18] aude: are you around?
[09:29:32] hi hashar
[09:29:35] I am not sure I got the point in your comment regarding wikidata on beta https://bugzilla.wikimedia.org/show_bug.cgi?id=47827#c6
[09:29:43] I remember I had stack traces on wikidata
[09:29:53] had to change the wiki main page to point to Wikidata:Main_page
[09:29:59] which seems to be what you are asking
[09:29:59] yes
[09:30:25] the default main page is wikitext
[09:30:42] so when we set the main namespace to be entity content
[09:30:50] then it fails to parse the wikitext
[09:31:07] looks okay now
[09:31:22] yeah I have changed the mainpage MediaWiki: message
[09:31:27] that works
[09:31:33] the next thing is the populateSitesTable maintenance script
[09:31:43] will look at how it is set up on production / in puppet
[09:31:47] we still have [[Main Page]] around but suppose it doesn't matter now
[09:31:48] [1] https://www.wikidata.org/wiki/Main_Page
[09:31:52] ack
[09:32:01] ah
[09:32:05] we have it puppetized for our test system
[09:32:08] got deleted by sam hehe
[09:32:12] ok
[09:32:31] it's exec ....
[09:32:44] does populateSitesTable need to be run once or is it a recurring task?
[09:32:57] good question...
[09:33:01] not at this time
[09:33:17] in the future we need a way to keep it synched when there are new wikipedias or whatnot added
[09:33:26] ahh
[09:34:30] next wikipedia and we'll probably have a solution
[09:34:48] until then, once is fine
[09:35:34] mwscript extensions/Wikibase/lib/maintenance/populateSitesTable.php --wiki=wikidatawiki
[09:35:34] done.
[09:35:39] good :)
[09:36:05] hopefully it populated using the all-labs.dblist
[09:36:06] :D
[09:36:30] $wiki = $this->getOption( 'load-from', 'https://meta.wikimedia.org/w/api.php' );
[09:36:31] or not
[09:36:43] hmm i still can't add links
[09:36:55] there is an option to get the data from http://meta....
[09:37:25] yeah, load-from
[09:37:31] need to find out which api I need to hit
[09:37:32] --load-from http://meta.wikimedia.org/w/api.php
[09:37:47] any wikimedia wiki should be fine
[09:37:54] it's just pulling the sitematrix data
[09:38:04] which is slightly different on beta
[09:38:10] oh
[09:38:46] http://deployment.wikimedia.beta.wmflabs.org/w/api.php would do it
[09:38:51] hmm, need a hack
[09:39:04] or we could make a sql dump available to load
[09:39:08] not as nice
[09:39:45] that api should work
[09:41:24] aude: do we have a script to dump the site table?
[09:41:36] not specific
[09:41:39] ugly: https://www.wikidata.org/wiki/Special:ItemDisambiguation?language=en&label=Chinatown&submit=Suchen
[09:41:47] pretty: http://tools.wmflabs.org/wikidata-terminator/?lang=en&term=Chinatown&doit=Do+it
[09:41:51] we should fix this :)
[09:41:51] but one of the standard maintenance scripts should be okay
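The load-from behaviour discussed above is just a maintenance-script option; a minimal sketch of the pattern (simplified, not the actual populateSitesTable.php, apart from the getOption line quoted in the log):

```php
<?php
// Minimal sketch of the option pattern discussed above; the real script is
// extensions/Wikibase/lib/maintenance/populateSitesTable.php and may differ.
require_once getenv( 'MW_INSTALL_PATH' ) . '/maintenance/Maintenance.php';

class PopulateSitesTableSketch extends Maintenance {
	public function __construct() {
		parent::__construct();
		// On beta one would pass e.g.
		// --load-from http://deployment.wikimedia.beta.wmflabs.org/w/api.php
		$this->addOption( 'load-from', 'Full URL of the API to fetch the sitematrix from', false, true );
	}

	public function execute() {
		// This default is the line quoted verbatim in the log above:
		$wiki = $this->getOption( 'load-from', 'https://meta.wikimedia.org/w/api.php' );
		$this->output( "Fetching sitematrix data from $wiki\n" );
		// ... query action=sitematrix on $wiki and fill the sites tables ...
	}
}

$maintClass = 'PopulateSitesTableSketch';
require_once RUN_MAINTENANCE_IF_MAIN;
```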
[09:43:47] aude do you know the database table? :-D
[09:43:49] name
[09:43:50] err
[09:43:58] aude: do you happen to know what the database table name is?
[09:44:00] site
[09:44:04] and there is site_identifiers
[09:44:12] * aude thinks it's singular
[09:44:23] plural :-)
[09:44:41] oh, ok
[09:44:55] yep, sites and site_identifiers
[09:46:16] site_domain: gro.sbalfmw.ateb.aidepikiw.eh.
[09:46:17] haha
[09:46:27] heh
[09:48:36] aude: sql dumps are at https://bugzilla.wikimedia.org/show_bug.cgi?id=47827#c9
[09:48:51] aude: ideally the remote API url should be a wg setting
[09:49:02] and addwiki.php should use it to run the maintenance script
[09:49:07] but I don't think addwiki has any hook to use
[09:49:30] makes sense
[09:49:37] filing bugs :-]
[09:50:06] thanks
[09:50:18] stupid addwiki has no wfRunHooks
[09:50:24] oooh
[09:52:29] did the script run yet?
[09:52:45] i have run it, yes
[09:53:02] https://bugzilla.wikimedia.org/show_bug.cgi?id=47827#c9 shows the content of the wikidatawiki tables
[09:53:11] ok, i still can't add site links
[09:54:10] "populateSitesTable.php load-from should be configurable via a $wg setting" https://bugzilla.wikimedia.org/show_bug.cgi?id=49236
[09:54:15] agree
[09:56:10] "addWiki.php should have some hooks" https://bugzilla.wikimedia.org/show_bug.cgi?id=49238
[09:56:14] ok
[09:56:29] just pasting there for information
[09:56:34] I am not going to handle them either
[09:56:42] that's fine
[09:56:46] but I like to file bugs to make sure one day we will fix them :-]
[09:56:59] they are not highest priority but we will surely fix them next time a new wikipedia gets added
[09:57:08] feel free to adjust the priority :)
[09:57:21] ok
[09:58:12] oh, is there a way to "kick" the memcached for the beta instance?
[09:58:31] there might be, with eval.php
[09:58:37] "populateSitesTables.php should be run automatically on new wiki addition" https://bugzilla.wikimedia.org/show_bug.cgi?id=49240
[09:58:41] yes
[09:58:43] made it a dependency of the addWiki.php hooking
[09:58:52] ah, memcached, true
[09:59:03] I guess populateSitesTables does not clear the memcache key hehe
[09:59:53] it should
[09:59:59] gah, another bug
[10:00:12] let me find out the memcached key
[10:00:15] ok
[10:02:14] Sites::singleton()->getSites( false );
[10:04:57] SiteSQLStore::newInstance()->getSites() gives me correct urls at least :D
[10:05:06] good
[10:05:18] Sites::singleton()->getSites( false ); does too
[10:05:31] (i love how we have different interfaces hehe)
[10:05:40] the singleton is deprecated
[10:06:08] http://wikidata.beta.wmflabs.org/w/api.php
[10:06:24] looks like it has some choices for action=wbsetsitelink
[10:07:43] ok, it works
[10:07:57] maybe some cache expired
[10:08:15] thanks for your help aude!
[10:08:36] it's not providing site link suggestions though
[10:09:09] could be the suggestions are not from en wikipedia but en beta wikipedia
[10:10:07] The external client site did not provide page information.
[10:10:29] meaning it's not able to verify my input
[10:12:06] need to do more debugging but the backend seems okay
[10:13:11] at least one thing working :-D
[10:13:15] yes
[10:13:15] do you have access on beta?
[10:13:23] what do you mean?
[10:13:24] we got some udp2log logs there that might help
[10:13:27] i'm not in the project
[10:13:30] * aude doesn't think so
[10:13:42] I should add you so you can look at the mw logs :)
[10:13:48] would be great
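The two calls pasted above are the handy way to check the sites data from eval.php; a sketch of how one might inspect it (the calls themselves are from the log, the iteration is an assumption about the returned SiteList):

```php
// Run via: mwscript eval.php --wiki=wikidatawiki
// SiteSQLStore is the non-deprecated interface; per the log, passing false
// to the deprecated Sites::singleton()->getSites() skips the cached copy.
$sites = SiteSQLStore::newInstance()->getSites();

foreach ( $sites as $site ) {
	echo $site->getGlobalId() . ' => ' . $site->getDomain() . "\n";
}
```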
[10:15:12] the project still lacks documentation :/
[10:15:26] I have added you as a root on the deployment-prep project
[10:15:34] the work instance is deployment-bastion.pmtpa.wmflabs
[10:15:57] all the mediawiki code base is maintained automatically under the user account 'mwdeploy'
[10:16:08] I got an alias: alias mwdeploy='sudo su --login --shell /bin/bash mwdeploy'
[10:16:25] then that is the usual stuff: mwscript eval.php --wiki=wikidatawiki
[10:16:38] and logs in /home/wikipedia/logs
[10:16:51] ok, thanks
[10:16:53] if you get access in production, you should feel at home
[10:17:07] i have most of this stuff in my wikifarm :)
[10:17:13] :-)
[10:17:32] the mediawiki-config on beta is updated whenever a change is merged (there is a jenkins job)
[10:17:38] right
[10:17:39] the extensions and mediawiki are updated every X minutes
[10:17:45] nice
[10:17:46] and once per hour we run upgrade.php on all wikis
[10:17:50] ok
[10:18:05] so if you need to do a config tweak, that needs to be done in mediawiki-config.git and merged
[10:18:15] ok
[10:18:17] labs uses specific files such as wmf-config/InitialiseSettings-labs.php
[10:18:28] yep
[10:18:55] we'll need to figure out which beta instance(s) can be clients
[10:23:34] aude: feel free to file bugs about it
[10:23:50] I am disconnecting for now, got to pack my luggage then I head off for some short vacations.
[10:23:55] will be back on monday
[10:24:38] ok, enjoy!
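For reference, a labs-only override in wmf-config/InitialiseSettings-labs.php follows the usual $wgConf per-wiki pattern; a rough sketch, with a hypothetical setting name and values (not the actual beta configuration):

```php
// Hypothetical example of a per-wiki override as it might appear in
// wmf-config/InitialiseSettings-labs.php; the setting name and the values
// here are illustrative only.
$wgConf->settings['wmgUseWikibaseRepo'] = array(
	'default'      => false,
	'wikidatawiki' => true, // wikidata.beta.wmflabs.org on the beta cluster
);
```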
[10:39:26] New review: Daniel Kinzler; "oops, forgot to post these comments last night" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/66280
[10:42:46] Abraham_WMDE: https://gerrit.wikimedia.org/r/#/c/65443/
[10:43:18] hi. i'd like to add an interlanguage link but have the following problem. the german article is more specialized than the english one. the english one has a redirect from the special one to the more general one. the more general one is already linked. when i try to link the german article with the english one, i get an error saying that there is already another link to it
[10:44:27] vmx: [Wikidata:Interwiki conflicts]] is a good place to put it then and have someone figure it out
[10:44:35] [[Wikidata:Interwiki conflicts]] even
[10:44:36] [2] https://www.wikidata.org/wiki/Wikidata:Interwiki_conflicts
[10:52:32] New patchset: Aude; "(bug 49242) clear cached sites data when running populateSitesTable script" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67243
[10:53:44] Lydia_WMDE: thanks
[10:57:37] Tobi_WMDE: do you have IE 9?
[10:57:42] IE 9
[10:58:06] aude: just IE10
[10:58:13] hmmm
[10:58:16] Henning_WMDE: ^^^
[10:58:18] * aude needs help with https://bugzilla.wikimedia.org/49139
[10:58:25] i have no way of investigating
[10:58:39] or knowing how to fix it in a way that works for IE 9
[11:00:26] * aude hopes hoo can help also
[11:00:43] hey, DanielK_WMDE - a few moments?
[11:00:51] moin Denny_WMDE
[11:01:13] aude: you are moin-ing? :) cool. you are getting germanized :)
[11:01:19] heh
[11:01:38] Denny_WMDE: in a few minutes
[11:02:36] ok
[11:05:02] Tobi_WMDE: Henning_WMDE see also https://bugzilla.wikimedia.org/49243
[11:05:24] * aude wonders if the test machine has IE 9 or how i can even see this issue
[11:13:55] Denny_WMDE: going through the agenda for tonight with abraham
[11:14:00] what's up?
[11:15:19] re javaservlet deployment
[11:15:32] just wanted to know about this item for the call tonight
[11:16:17] Denny_WMDE: nilesh's suggester engine is java based. we'd need a servlet to interface with that.
[11:16:42] we are already using some servlets (solr, gerrit), i just want to know about the requirements
[11:16:58] so it is about how this would work, not about actually deploying it already?
[11:17:07] yes
[11:17:16] exploring possibilities/constraints
[11:17:51] Denny_WMDE: if you don't have anything urgent, i'd like to go for food now :)
[11:18:12] DanielK_WMDE: enjoy :)
[12:58:52] Denny_WMDE: The wikidatameter is ready for shipping: http://lb.bombenlabor.de/wikidatameter/ (I have to write a short text about it...)
[13:00:05] Denny_WMDE is here?
[13:00:39] I see him in the list of people in this room...
[13:00:57] I have no idea if he is present
[13:07:53] New patchset: Daniel Kinzler; "EntityLookup should fail on bad revision" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/66280
[13:09:49] New patchset: Daniel Kinzler; "EntityLookup should fail on bad revision" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67251
[13:16:55] liangent: i am around
[13:20:16] Denny_WMDE: checked https://www.mediawiki.org/w/index.php?title=User%3ALiangent%2Fwb-lang%2Fdev&diff=704427&oldid=703737 ? Line 94 of the diff: this is already happening on zhwiki
[13:20:41] and https://www.mediawiki.org/w/index.php?title=User%3ALiangent%2Fwb-lang%2Fdev&diff=705672&oldid=704427 : commented on by someone else
[13:21:40] (I posted a link to this page on zhwiki once, and I believe Li3939108 followed that link)
[13:23:41] aude: is there any special combination which should be tested for today?
[13:24:11] hmmm, wmf5 client / wmf6 core
[13:24:18] I've tested wmf6 client&repo and am about to run tests for wmf5-client + wmf6-repo
[13:24:28] aude: ok
[13:24:29] ok
[13:24:43] we'll keep test2 the same as it is now, except core gets updated
[13:24:53] so wmf5 client with wmf6 core + what repo?
[13:24:53] we'll update test2 + wikidata on monday with our branch
[13:25:08] wmf5 repo (core and extensions)
[13:25:38] * aude can't imagine any problems, knowing what changed/not
[13:25:38] ok. will do that. thx
[13:25:38] but that's only an issue for test2, right?
[13:25:42] yes
[13:25:47] tests are important though
[13:27:16] hello
[13:27:31] does anyone know when/if the release dates of movies will be integrated into wikidata?
[13:32:01] Guest74146: you could help with that :) it is already possible
[13:32:47] i don't know at all how to do this, i will let the people who know do this ^^
[13:37:43] is Li3939108 in this channel?
[13:56:07] New patchset: Daniel Kinzler; "Add EntityRevision functionality" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67256
[13:58:24] New review: Daniel Kinzler; "@jeroen: there are several follow-ups now. Do you still see anything that needs to be fixed *here*?" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/66122
[14:02:31] New patchset: Daniel Kinzler; "EntityLookup should fail on bad revision." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67251
[14:07:28] Can an admin please pm me? :)
[14:15:45] New patchset: Daniel Kinzler; "EntityLookup should fail on bad revision" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67251
[14:18:59] New patchset: Daniel Kinzler; "Add EntityRevision functionality" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67256
[14:19:56] New patchset: Daniel Kinzler; "EntityLookup should fail on bad revision" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67251
[14:20:07] New patchset: Daniel Kinzler; "Add EntityRevision functionality" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67256
[14:30:24] aude: https://bugzilla.wikimedia.org/show_bug.cgi?id=49139#c7
[14:31:51] hoo: ok
[14:32:03] can you see if there are any additional issues such as reported in https://bugzilla.wikimedia.org/49243 ?
[14:32:12] * aude can't check on IE
[14:32:46] I checked it on IE9 and it never gave more than "Unexpected error" to me... I'd be surprised if someone got it to run
[14:32:51] hmmmm
[14:33:25] that's because it tried to do a CORS request and IE threw an exception AFAIR
[14:33:26] seems the ULS team etc. want the "Languages" p-lang section to always appear now
[14:33:34] so they are unhiding it
[14:33:36] don't we do that
[14:33:45] We only hide it via a css module
[14:33:55] hoo: for anons also, they want that
[14:34:08] and when the widget doesn't work, they still want the heading
[14:34:25] see http://en.wikipedia.beta.wmflabs.org/wiki/Think%20Like%20a%20Cat
[14:34:29] they can do #p-lang { display: block !important; } then
[14:34:36] ok
[14:35:11] Such a mess that I can't copy and paste into the remote windows (connected with Remmina)
[14:35:25] ok
[14:36:29] The lang section even appears for me logged out and in IE9 mode
[14:36:40] ok
[14:36:51] because of what uls is doing
[14:37:35] It throws errors in IE7 and alerts in IE8 mode, though... that's bad
[14:37:40] :(
[14:37:51] but not our issue
[14:37:55] ok
[14:39:03] alright, back in ~an hour
[14:50:23] New patchset: Tobias Gritschacher; "Split up switch-case into separate methods" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/66981
[14:51:29] New patchset: Tobias Gritschacher; "Split up switch-case into separate methods" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/66981
[14:52:04] hi, sorry for the noob question, but is the taxonomy of wikispecies already somehow covered by wikidata?
[14:52:51] (currently i downloaded the wikispecies dump and work my way from there, but it's a bit fragile)
[14:53:45] New patchset: Tobias Gritschacher; "Split up switch-case into separate methods" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/66981
[14:55:46] New patchset: Tobias Gritschacher; "Split up switch-case into separate methods" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/66981
[15:00:20] DanielK_WMDE: whenever you feel funny enough for review: https://gerrit.wikimedia.org/r/#/c/66981/ ;)
[15:10:59] New patchset: Henning Snater; "Fix for globeCoordinate's toDegree()" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67257
[15:11:39] kritzikratzi: hi
[15:11:43] hi
[15:11:58] no, currently Wikidata is not used on any Wikimedia sites other than Wikipedia
[15:12:07] ah, ok
[15:12:09] there are eventual plans to incorporate Wiktionary, Wikispecies, etc.
[15:12:33] but we do have a fair amount of content on taxonomy
[15:12:35] +commons +voyage
[15:12:53] in a way i'm happy,
[15:12:53] because it means my work wasn't for nothing :)
[15:13:17] :)
[15:13:32] taxonomy on animals? or taxonomy of other things?
[15:13:48] well, taxonomy of life in general
[15:13:58] it's a work in progress, like everything else, of course
[15:14:20] (reading the List_of_properties article atm, i can see that there are lots of taxo items, like subclass/instance/etc)
[15:14:33] http://www.flickr.com/photos/armadillu/sets/72157633770984985/
[15:14:41] this is the project a friend of mine and i are working on
[15:14:48] but there are certain editions of Wikipedia (Dutch and Vietnamese, especially) that have a *lot* of taxonomic data, so we can import a lot of stuff from them
[15:14:59] the idea is to visualize the taxo info of wikispecies
[15:15:05] oooh awesome!
[15:15:24] yea, i've started looking into the infoboxes today,
[15:15:33] yeah, if you're doing that, Wikispecies is probably better suited than Wikidata, for the time being
[15:15:49] eventually, though, something like what you're doing is exactly what we're aiming to help with
[15:15:58] but i'm not sure how/if i can reliably connect the en wikipedia articles with the wikispecies articles
[15:16:19] hmm
[15:16:24] (wikispecies has images for only ~30k of its articles, seems there's considerably more image material on wikipedia itself)
[15:17:10] you could go by title and who discovered it, though idk how easy that'd be to automate
[15:18:11] i think by title is the way to go too,
[15:18:33] not sure how it's gonna turn out though
[15:19:00] although in my experience from Wikidata, we /have/ occasionally had cases where taxonomic articles with the same title in different languages turn out to be different species
[15:19:11] ah, good to know
[15:19:17] from a scientific perspective that shouldn't happen, of course, but it occasionally does
[15:19:28] it's a good tip,
[15:19:37] i'll double check more than usual
[15:19:52] i think if a handful of articles are wrong in a 400k dataset it's not a big deal,
[15:19:58] still, it's a low enough false-positive rate that we have a bot that merges taxonimic articles with the same title automatically
[15:20:02] but if it's every second article or so it's another story :)
[15:20:02] yeah, exactly
[15:20:23] *taxonomic
[15:20:48] i see ...
[15:21:04] btw.,
[15:21:13] maybe this is interesting, maybe not
[15:21:19] http://www.airdates.tv
[15:21:24] this was my first wiki parsing experiment
[15:21:42] it's full of mad heuristics, a gazillion regular expressions
[15:21:54] huh, cool!
[15:21:56] but somehow it works okayish enough to be useful
[15:22:06] remind me to use that for future bootlegging
[15:22:16] i don't encourage that
[15:22:18] :P
[15:22:26] it's got links to torrents :P
[15:22:34] but it also has a disclaimer :)
[15:22:37] though i never use torrents. i just stream stuff online
[15:22:51] and the yellow background while hovering is... interesting...
[15:23:03] lol,
[15:23:15] yea... i made this thing in around two days' work time,
[15:23:20] just to see if it's possible
[15:23:34] now i only make changes if i have to
[15:23:37] where do you get the data from?
[15:23:40] wikipedia
[15:23:51] the {{episode}} tag
[15:23:51] [3] https://www.wikidata.org/wiki/Template:episode
[15:24:03] how cool, that's a handy bot!
[15:24:14] ah, but it links to the wrong site :)
[15:24:21] {{w:episode}}
[15:24:22] [4] https://www.wikidata.org/wiki/Template:w:episode
[15:24:24] grrr
[15:24:29] [[w:Template:episode]]
[15:24:30] [5] https://www.wikidata.org/wiki/Template:episode
[15:24:30] rofl
[15:24:41] * PinkAmpersand punts AsimovBot out a window
[15:24:48] http://en.wikipedia.org/wiki/Template:Episode_list
[15:24:52] :)
[15:24:53] [[en:Template:episode]]
[15:24:53] [6] https://en.wikipedia.org/wiki/Template:episode
[15:25:00] therreeeee we go
[15:26:15] * PinkAmpersand prefers Helpmebot to AsimovBot
[15:40:19] aude: according to the tests there should be no problem with switching wmf5-client's core to wmf6
[16:21:21] New patchset: Jeroen De Dauw; "Change composer file so the autoloading works" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67267
[16:24:25] Change merged: Jeroen De Dauw; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67267
[16:47:47] Change merged: Daniel Werner; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67257
[17:03:58] New patchset: Jeroen De Dauw; "Remove keyword from composer file" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67271
[17:04:07] Change merged: Jeroen De Dauw; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67271
[17:10:22] New patchset: Jeroen De Dauw; "Remove some whitespace" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67272
[17:10:46] Change merged: Jeroen De Dauw; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67272
[17:27:28] [travis-ci] wikimedia/mediawiki-extensions-Ask#6 (master - 87c5642 : jeroendedauw): The build is still failing.
[17:27:28] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Ask/compare/5f7a7b0f87ec...87c5642fa967
[17:27:28] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Ask/builds/7848511
[17:27:47] ahhh, packagist, y u provide old version?
[17:28:52] oh cool, another spambot :)
[17:28:55] PinkAmpersand: (and everyone else too) what kind of technologies do you use to extract data from existing wiki articles?
[17:29:20] * PinkAmpersand isn't a bot-op, so is a bad person to ask
[17:29:36] for me regular expressions seem the way to go, eg see here https://github.com/kritzikratzi/TreeOfLifeParser/blob/master/src/main/java/sd/wikitest/FileExtractor.java or https://github.com/kritzikratzi/TreeOfLifeParser/blob/master/src/main/java/sd/wikitest/PrunedCSV.java
[17:29:58] eh, have you tried the API, kritzikratzi?
[17:30:03] there's an api?
[17:30:05] :P
[17:30:11] yes, lego would know of it
[17:30:24] lego?
[17:30:29] (sorry, very new to all this)
[17:30:33] * legoktm pokes his head up
[17:30:38] ah
[17:30:40] kritzikratzi: the API is at https://www.wikidata.org/w/api.php
[17:30:51] https://www.wikidata.org/w/api.php?action=wbgetentities&ids=q42&format=jsonfm <-- example
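That wbgetentities example is easy to consume from code as well; a minimal sketch in plain PHP (no error handling; the response structure follows the API example above):

```php
// Fetch item Q42 from the Wikidata API, as in the example URL above
// (format=json instead of the human-readable jsonfm).
$url = 'https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q42&format=json';
$data = json_decode( file_get_contents( $url ), true );

// Entities are keyed by item id; print the English label.
echo $data['entities']['Q42']['labels']['en']['value'] . "\n";
```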
[17:31:14] are you trying to get data out of wikidata, or a wikipedia article?
[17:31:20] wikipedia,
[17:31:30] wikidata seems amazing, and i'm very much looking forward to it,
[17:31:43] but if i understand correctly there's very little actual data on it yet
[17:32:11] yeah
[17:32:21] the developers have been lagging w/ respect to quantitative data types like integers
[17:32:31] i'm asking here because i suspect the data is put into wikidata in one automated way or another,
[17:32:33] * Jasper_Deng turns his eye towards Lydia_WMDE
[17:32:40] https://svn.wikimedia.org/viewvc/pywikipedia/branches/rewrite/scripts/harvest_template.py?view=markup
[17:32:49] that's a script that multichill wrote (he's not online right now)
[17:33:05] Pyfisch also wrote xyr own script
[17:33:14] ah, very cool, that looks like it's along the lines of what i'm wondering about
[17:33:24] i have one that uses mwparserfromhell somewhere
[17:33:51] aude: we don't have wgWBRepoSettings['normalizeItemByTitlePageNames'] = true, do we?
[17:33:54] we should...
[17:34:08] * DanielK_WMDE should learn how to make the appropriate patch
[17:35:17] Jasper_Deng, kritzikratzi: integers would be easy, but *quantities* are hard. You need precision, units, dimensions, all that.
[17:35:54] DanielK_WMDE: still, we should get integers soon at least. For example, the atomic # of an element is a dimensionless, certain, integer.
[17:36:33] Change merged: Daniel Werner; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/65679
[17:36:54] Jasper_Deng: yes, we'll probably have unit-less quantities first. Still not a plain integer.
[17:37:04] i guess that's up next after geo-coordinates and URLs.
[17:37:07] why wouldn't that be a plain int?
[17:37:57] because then we'd have one more data type, which is probably rarely appropriate but very likely to be misused.
[17:38:23] if we have int, people will give the population of a city as an int. but it's a measured quantity.
[17:38:51] well, I'd assume users would have to be educated on precision/accuracy.
[17:39:03] but things like rankings and atomic # are bare ints.
[17:39:05] when will the int data type be ready?
[17:39:13] *sigh*
[17:39:38] Jasper_Deng: i don't think these cases justify an extra data type though
[17:39:56] lbenedix: unitless quantities? not too far off, but i'm not going to guess at a date.
[17:40:12] (everything i say can be used against me...)
[17:40:16] \o/
[17:40:26] * Jasper_Deng is assuming that for unit conversions, functions need to be defined
[17:40:40] geo-coords are up next. i see urls taking shape.
[17:41:13] Jasper_Deng: i don't think we'll support anything beyond linear conversion. maybe with an offset, to cover °C vs °F
[17:41:13] * lbenedix wonders why the 'complicated' types come first...
[17:41:38] there are also logarithmic quantities
[17:41:40] lbenedix: it's because there are a TON of them on wikipedia
[17:41:54] time and coords are first because about 60% of wikipedia content is people and places
[17:42:34] good point
[17:42:56] Jasper_Deng: but the need for automatic conversions involving logarithms is rather rare.
[17:50:15] conversion of length units: http://upload.wikimedia.org/wikipedia/commons/e/eb/English_length_units_graph.svg
[17:51:16] there is a nice graph for mass units: http://upload.wikimedia.org/wikipedia/commons/1/1e/English_mass_units_graph.svg
[17:53:47] lbenedix: check the second image,
[17:53:50] 1 troy = 12 troys
[17:54:01] can't help but love that unit system
[17:54:15] also 1 tower = 12 towers
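The "linear conversion, maybe with an offset" remark above, and the unit graphs just linked, boil down to y = a·x + b; a quick sketch (not Wikibase code) using the °C/°F case from the discussion:

```php
// Linear unit conversion with offset: y = a * x + b.
// Celsius to Fahrenheit uses a = 9/5 and b = 32.
function convertLinear( $value, $factor, $offset = 0.0 ) {
	return $value * $factor + $offset;
}

echo convertLinear( 100, 9 / 5, 32 ) . "\n";  // 212 (°C -> °F)
echo convertLinear( 212 - 32, 5 / 9 ) . "\n"; // 100 (°F -> °C, inverting a and b)
```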
[17:54:58] DanielK_WMDE: btw, what would be the domain of an int datatype?
[17:55:03] will there have to be a bigint?
[17:55:09] (etc.)
[17:55:24] there will be no int datatype
[17:55:32] there will be a quantity data type
[17:55:39] of unlimited range?
[17:56:01] more or less, yes.
[17:56:31] I mean, we're not storing all ~15 trillion digits of pi, are we?
[17:56:32] not quite sure yet how we will implement the "must be an integer" constraint.
[17:56:49] no.
[17:56:49] only 15 trillion?
[17:57:03] (all 15 trillion we know, that is)
[17:57:54] <^demon> There's only 16 trillion, couple more years and they'll find the last one.
[17:58:03] Jasper_Deng: no, more like 100 decimal digits before and after the comma or some such.
[17:58:07] ^demon: hehe
[17:58:23] 39 digits are enough to make a circle with radius = radius of the known universe with an error of the size of a helium atom
[17:58:25] Jasper_Deng: but again: just a guess. don't quote me on that
[17:58:47] <^demon> lbenedix: And I still can't draw a good circle ;-)
[17:58:49] I'm also guessing that in the database it'll be a 64-bit floating point.
[17:59:11] We need some of that there quantum storage
[17:59:14] Jasper_Deng: for the query index it might be. not for the primary representation
[17:59:21] as far as I know, the database is the same as for wikitext
[17:59:47] lbenedix: yea, but we can't run queries on that, so we need additional tables for that
[17:59:58] ok
[18:01:34] Pi and the size of the Universe: http://www.youtube.com/watch?v=FpyrF_Ci2TQ
[18:02:43] well, to put pi in perspective, the planck space is on the order of 10^-99 m.
[18:03:03] and that's the smallest measurable volume possible.
[18:06:12] New review: Daniel Kinzler; "I don't think this is needed: Utils::insertSitesFrom() calls SiteStore::saveSites, and SiteSQLStore::..." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67243
[18:06:13] to get to the planck distance you need 25 more digits
[18:14:49] New patchset: Daniel Kinzler; "(bug 38201) Support multiple site link groups." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/65768
[18:24:53] http://test.wikidata.org/wiki/Main_Page
[18:25:03] * Reedy breaks it further
[18:26:47] Yup, let me revert that for the moment
[18:35:06] aude: About?
[18:38:41] [b395da3e] 2013-06-06 18:38:35: Fatal exception of type MWContentSerializationException
[18:38:43] Sadface
[18:40:28] denny on the stream soon: https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings
[18:46:14] By clicking "save", you agree to the terms of use, and you irrevocably agree to release your contribution under the CC0 license.
[18:46:46] When it's a create button..
[18:48:52] Reedy: :D
[18:49:43] I think some of the apaches look to be out of sync...
[18:49:55] A database query syntax error has occurred. This may indicate a bug in the software. The last attempted database query was:
[18:49:55] (SQL query hidden)
[18:49:55] from within function "Wikibase\TermSqlIndex::getMatchingTermCombination". Database returned error "1146: Table 'testwikidatawiki.wb_terms' doesn't exist (10.64.16.24)".
[18:50:25] Now Labs, presenting remotely.
[18:50:56] ... wrong channel. :-)
[18:51:30] And apparently Q6 is the first Item I created :/
[18:51:31] http://test.wikidata.org/wiki/Special:RecentChanges
[18:55:45] Good
[18:55:54] The apaches seem to be singing from the same hymn sheet now
[18:56:32] https://test.wikidata.org/wiki/Main_Page
[18:56:32] [f36e32ed] 2013-06-06 18:56:19: Fatal exception of type MWContentSerializationException
[18:57:34] lego, Reedy: I did a page move
[18:57:43] gr.
[18:57:48] it was b/c that old main page was stuck in the mainspace
[18:57:51] where it doesn't belong.
[18:58:26] FTFY
[18:58:29] https://test.wikidata.org/wiki/Wikidata:Main_Page
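As an aside, the quantity storage split DanielK sketched earlier (exact primary representation, lossy 64-bit float only for the query index tables) can be illustrated roughly like this; the field handling here is hypothetical, not the actual Wikibase schema:

```php
// Hypothetical illustration of the split discussed above: keep the primary
// value as an exact decimal string (a PHP float silently loses anything
// beyond ~15-17 significant digits), and derive a lossy float only for the
// separate tables used to answer range queries.
$primary = '3.14159265358979323846264338327950288419716939937510';

$indexValue = (float)$primary; // good enough for sorting and range queries
var_dump( $indexValue );       // float(3.1415926535898)
```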
[19:00:38] Reedy: wanna give me +sysop? You do not have permission to create properties, for the following reason: The action you have requested is limited to users in one of the groups: Administrators, propertycreator.
[19:00:48] * Reedy pets legoktm
[19:01:20] * legoktm huggles Reedy
[19:01:22] Reedy: me too
[19:01:45] (partly b/c I would like to see if the fact that the main page was stuck in the main namespace was the reason why it started @ Q6)
[19:03:04] i don't think so
[19:03:27] it's the only reason I can think of.
[19:03:32] i think that's b/c on real wikidata 1-5 were specially imported
[19:03:48] Possibly
[19:03:53] / "reserved"
[19:04:59] Though, I did try to create Universe, which is at 1 on the live site
[19:07:00] now I'm going to test the bug
[19:07:20] Which bug?
[19:07:29] Noting it's running the same code as wikidatawiki
[19:07:29] the above one
[19:07:35] I see a load of console errors too
[19:08:12] Reedy: well, it also seems that the Wikibase extension isn't quite fully installed
[19:08:18] CreateItem = "no special page"
[19:08:22] It might be
[19:08:31] No one has been around to help me, so it's slightly cobbled together
[19:08:43] It should be using the same config
[19:08:47] hmm... I just made the second one and it's Q7
[19:09:11] i can haz zyzop?
[19:10:31] Reedy ^
[19:11:31] someone needs to get us a cool testwiki-type logo
[19:11:53] a blackened one?
[19:11:57] like testwiki's?
[19:12:23] yeah
[19:12:37] we could steal the WD OS logo
[19:12:58] http://commons.wikimedia.org/wiki/File:Wikidata_oversight.svg
[19:14:12] I still keep seeing the wikipedia logo periodically
[19:15:08] * Reedy wonders where the wikidata team are and what they are doing..
[19:15:26] I think it's b/c of some rotten caching.
[19:15:55] New patchset: Daniel Werner; "Added wb.Site.prototype.getGroup" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67289
[19:15:55] New patchset: Daniel Werner; "Refactor Sites related wikibase tests in frontend" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67290
[19:15:55] New patchset: Daniel Werner; "Added JavaScript wikibase.getSiteGroups()" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67291
[19:21:59] * Reedy throws things at Lydia_WMDE, DanielK_WMDE, aude
[19:22:37] Reedy: it's 9:30 pm...
[19:22:42] sitting on the couch and watching tv
[19:22:50] And? You're usually still online :p
[19:30:30] Reedy: just finished rating at anomie on the architecture page :)
[19:30:36] * DanielK_WMDE is about to go for a beer now
[19:30:41] Rating? Or ranting? ;)
[19:32:08] Reedy: ranting of course. see, time for a beer :)
[19:44:54] Reedy: we should probably get rid of the propertycreator right on test
[19:48:14] what's the interwiki prefix gonna be?
[19:48:39] i like testd
[19:48:44] or dtest
[19:48:46] wdtest
[19:48:48] idk
[19:48:49] bbl
[19:49:18] Reedy: do we want the same namespace license structure as on WD proper?
[19:49:33] the footer currently just says everything's CC-BY-SA
[19:49:34] Presumably
[19:50:16] It should be CC PD 0 (or whatever) for NS_MAIN
[19:50:17] if ( in_array( $wgDBname, array( 'wikidatawiki', 'testwikidatawiki' ) ) ) {
[19:51:33] updated the copyright footer
[19:51:48] though currently the message is "Wikimedia-copyright", when I think it should be "Wikidata-copyright"
[19:52:38] Sounds fishy
[19:53:21] PinkAmpersand: Ah, found it
[19:53:22] if ( $wgDBname === 'wikidatawiki' ) {
[19:53:22] $siteMessageKey = 'wikidata-copyright';
[19:53:22] }
[19:53:26] In WikimediaMessages
[19:53:52] as a programming concept, why do we need the "strictly equals" operator here?
[19:54:23] God knows
[19:54:27] Doesn't really matter
[19:54:52] ah. so the solution's to make it $wgDBname === array('wikidatawiki', 'testwikidatawiki'), right?
[19:55:48] No
[19:55:52] you can't do that
[19:55:56] https://gerrit.wikimedia.org/r/67301
[19:55:59] Well, you can do that
[19:56:04] It won't do what you think it might
[19:56:21] would it like try to compare the references by value?
[19:56:31] (as happens in Java)
[19:57:14] Wonder if I should just force that through
[20:03:55] legoktm: "Now that we have test.wikidata.org, that one should probably be scheduled with the other testwikis (Thursday), and then www.wikidata.org on Monday."
[20:03:58] That was the idea ;)
[20:05:43] grrr, no gitweb
[20:06:41] Reedy: so is one of the testwikis gonna run off of testwikidata, or is it not gonna have any client side?
[20:06:55] Yup, I think it was intended to use test2wiki
[20:07:04] New patchset: Daniel Werner; "Fixed assumption of SiteLinksEditTool of single edit tool per page (DO NOT MERGE)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67303
[20:07:04] New patchset: Daniel Werner; "Display site-link group specific heading per SiteLinksEditTool" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67304
[20:10:14] what's the difference between "Extension:Wikibase Database" and "Extension:Wikibase"? (I note the former's been deleted on mw.org)
[20:10:25] Change abandoned: Daniel Werner; "(no reason)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67304
[20:10:58] probably a rename?
[20:11:11] but we have both installed :/
[20:11:27] I guess the former is a component of the latter
[20:14:09] New patchset: Daniel Werner; "Display site-link group specific heading per SiteLinksEditTool in JS" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67308
[20:14:47] Jasper_Deng: but there really shouldn't be an extension installed on a Wikimedia wiki whose page on mw is deleted
[20:14:49] New patchset: Daniel Werner; "Fixed assumption of SiteLinksEditTool of single edit tool per page (DO NOT MERGE)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67303
[20:15:04] idk, Lydia would know
[20:15:07] https://www.mediawiki.org/wiki/Extension:Wikibase_Database is linked to from Special:Version
[20:16:25] PinkAmpersand: Jasper_Deng: i have no idea what that extension is
[20:16:34] or does
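To answer PinkAmpersand's strict-equality question from the copyright-footer exchange above: === between a string and an array is simply always false in PHP (there is no reference-by-value comparison as in Java), which is why the fix uses in_array() instead. A quick sketch:

```php
$wgDBname = 'testwikidatawiki';

// The suggested "$wgDBname === array( ... )" compares a string to an array,
// which is always false in PHP, so the branch would never be taken:
var_dump( $wgDBname === array( 'wikidatawiki', 'testwikidatawiki' ) ); // bool(false)

// The pattern actually used in the config, quoted earlier in the log:
if ( in_array( $wgDBname, array( 'wikidatawiki', 'testwikidatawiki' ) ) ) {
	$siteMessageKey = 'wikidata-copyright';
}
```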
[20:17:15] * PinkAmpersand pokes JeroenDeDauw to see if he's around [20:17:16] just an infobox, basically [20:17:27] nothing more [20:18:06] ah [20:18:12] author is JeroenDeDauw [20:18:17] so he should know [20:18:19] hence the poke [20:18:25] :P [20:19:37] New review: Daniel Werner; "(1 comment)" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/65768 [20:19:43] * PinkAmpersand finally got around to finishing last season of Breaking Bad, and keeps on mentally associating Lydia_WMDE with Lydia in that [20:20:13] PinkAmpersand: hah - now i wish i had watched it to tell if this is good or bad for me :P [20:20:33] do you know the premise of the show? :) [20:21:00] vaguely [20:21:22] lol basically she's a high-powered corporate executive who runs a meth distribution ring on the side [20:21:27] and occasionally has people killed [20:22:07] heh [20:22:11] ok... [20:22:20] plus she's German. hence the mental association. [20:22:37] that's all. i'm just rambling because i'm tired. [20:22:46] at 4 in the afternoon.... fail. [20:23:17] that's no fail [20:23:29] especially if you ate a big lunch. [20:24:18] ;-) [20:37:31] New patchset: Daniel Werner; "Fixed assumption of SiteLinksEditTool of single edit tool per page (DO NOT MERGE)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67303 [21:06:50] [travis-ci] wikimedia/mediawiki-extensions-Ask#7 (master - 7af1461 : jeroendedauw): The build was fixed. [21:06:50] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Ask/compare/87c5642fa967...7af14617b623 [21:06:51] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Ask/builds/7855879 [21:18:30] New patchset: Jeroen De Dauw; "Update composer file to include the entry point as autoloading" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67330 [21:19:01] New patchset: Jeroen De Dauw; "Update gitignore to ignore vendor/" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67331 [21:19:21] Danwe_WMDE: https://gerrit.wikimedia.org/r/#/c/67072/ [21:20:44] Change merged: Jeroen De Dauw; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67330 [21:21:50] Change merged: Jeroen De Dauw; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67331 [21:25:33] [travis-ci] wikimedia/mediawiki-extensions-Ask#8 (master - a6f7a91 : jeroendedauw): The build was broken. [21:25:33] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Ask/compare/7af14617b623...a6f7a917b9f7 [21:25:33] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Ask/builds/7856600 [21:29:25] [travis-ci] wikimedia/mediawiki-extensions-Ask#9 (master - 2fc1117 : jeroendedauw): The build was fixed. [21:29:25] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Ask/compare/a6f7a917b9f7...2fc1117cb1ba [21:29:25] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Ask/builds/7856704 [21:29:52] Reedy: :D [21:29:53] [travis-ci] wikimedia/mediawiki-extensions-Ask#10 (master - 3787689 : jeroendedauw): The build was fixed. 
[21:29:53] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Ask/compare/2fc1117cb1ba...3787689b9dea
[21:29:53] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Ask/builds/7856721
[21:31:44] Abraham_WMDE: check this out https://travis-ci.org/wikimedia/mediawiki-extensions-Ask/jobs/7856724
[21:31:49] Expand line 18
[21:31:50] :D
[21:31:58] Finally got the composer thing to work properly
[21:37:06] JeroenDeDauw: \o/
[21:37:23] that's awesome
[21:40:44] Abraham_WMDE: requesting git repos for all our components now
[21:40:54] So we can work in a more standard way
[21:41:07] Stuffing everything in a single repo apparently is just not done
[21:41:17] Both composer and travis assume you have only one component per repo
[21:41:33] oh ok
[21:42:58] but the travis setup is a real benefit anyway
[21:43:08] thx for that
[21:45:45] New patchset: Jeroen De Dauw; "Add composer.phar to gitignore" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67343
[21:46:24] New patchset: Jeroen De Dauw; "Updated gitignore file with composer stuff" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67344
[21:46:51] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67344
[21:46:58] Change merged: Jeroen De Dauw; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67343
[21:50:52] New patchset: Jeroen De Dauw; "Move globecoordinate registration to correct location" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67072
[21:55:38] Abraham_WMDE: it is going to be so cool once we can get Wikibase and then just run "composer install" and have all the dependencies be pulled and loaded
[21:59:59] that will be epic ;)
[22:01:26] really looking forward to it, and we'll set this up on our monitoring screen too \o/
[22:01:43] JeroenDeDauw: I'm off now, cya later
[22:04:32] ttyl
[22:10:08] New patchset: Daniel Werner; "Display site-link group specific heading per SiteLinksEditTool in JS" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67308
[22:27:56] New patchset: Jeroen De Dauw; "Do not spam Special:Version with irrelevant detail information" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/67357
[22:34:36] New patchset: Jeroen De Dauw; "Remove unused Item::setSiteLinks" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67358
[22:51:11] New patchset: Jeroen De Dauw; "Added SimpleSiteLink class so we can migrate DataModel code away from SiteLink" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67363
[22:53:56] New patchset: Jeroen De Dauw; "Remove dead code in SiteLink::newFromText" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67364
[23:22:51] Danwe_WMDE: https://en.wikiquote.org/wiki/Martin_Fowler
[23:41:17] Danwe_WMDE: askz! http://programmers.stackexchange.com/
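The "composer install" workflow described above means each component declares its dependencies in its own composer.json, and a consumer only has to load one generated file; a sketch of the consuming side (paths illustrative):

```php
// After "composer install" has fetched every declared dependency into
// vendor/, Composer's generated autoloader wires them all up at once;
// an entry point only needs to load that single file.
if ( is_readable( __DIR__ . '/vendor/autoload.php' ) ) {
	require_once __DIR__ . '/vendor/autoload.php';
}
```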
[23:45:06] New patchset: Jeroen De Dauw; "Provide alternative to methods using SiteLink in item and deprecate the SiteLink using ones" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67377
[23:51:08] New patchset: Jeroen De Dauw; "Remove/replace some MW specific code" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67378
[23:51:08] New patchset: Jeroen De Dauw; "Get rid of some SiteLink usage in Item" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67379
[23:57:52] New patchset: Jeroen De Dauw; "Automatically register DataModel tests with MediaWiki" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67380
[23:57:52] New patchset: Jeroen De Dauw; "Improvements to ItemDiffTest" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67381