[16:19:37] No objections to lurkers?
[17:00:44] hey everyone :)
[17:00:56] who is here for the wikidata office hour?
[17:00:59] Hi All :)
[17:01:05] _raises hand_
[17:01:05] hey scott!
[17:01:11] it's a denny!
[17:01:21] hi all
[17:01:23] I am, Lydia! Hi :)
[17:01:36] I am, Lydia! Hi :) Hey Denny
[17:01:40] how is everyone?
[17:01:41] hello, Lydia! I'm here for wikidata.
[17:01:51] yay
[17:02:22] alright
[17:02:49] i wanted to start with a bit of looking back at the past 3-4 months, then talking about what's coming up and then do questions
[17:02:52] * aude waves :)
[17:03:03] hey aude :)
[17:03:06] as always :)
[17:03:10] hi
[17:03:11] yes! :D
[17:03:18] nice to have you all here
[17:03:30] so let's take a look at what happened over the past 3-4 months
[17:03:48] i'm sure i'll miss some things but here's some of the cool stuff:
[17:04:19] we've had a pretty damn good increase in active editors: https://vital-signs.wmflabs.org/#projects=wikidatawiki/metrics=RollingActiveEditor
[17:04:39] any guesses why this increased so significantly recently?
[17:05:22] maybe new deployments in various Wikipedias
[17:05:24] ?
[17:05:29] vandalism?
[17:05:35] Ricordisamoa: i hope not!
[17:05:36] faster website?
[17:05:46] dennyvrandecic: possible! also: prettier website?
[17:05:54] Tpt: yeah also possible
[17:05:55] Lydia_WMDE: I'm getting a 504 from that link
[17:06:03] YairRand: -.-
[17:06:31] ok more stuff:
[17:06:38] we celebrated our 3rd birthday \o/
[17:06:47] :)
[17:07:05] the gene wiki people published a nice article about making infoboxes work with wikidata on english wikipedia: http://i9606.blogspot.de/2015/10/poof-it-works-using-wikidata-to-build.html
[17:07:05] it seems labs is down
[17:07:14] ohnoes
[17:07:17] we killed it
[17:07:42] http://scoms.hypotheses.org/498 <- here is a nice article about creating a network of actors based on wikidata's data
[17:07:58] the wikipedia gender index was released using wikidata's data: http://wigi.wmflabs.org/
[17:08:19] people are building very nice lists/galleries/etc using magnus' listeria bot: https://commons.wikimedia.org/wiki/User:Lockal/Paintings_by_Jacob_van_Ruisdael
[17:08:48] some wikipedias have taken the time to get down the number of articles without a wikidata item: https://tools.wmflabs.org/wikidata-todo/duplicity.php?wiki=itwiki&mode=stats
[17:08:59] sorry if this doesn't work for you now. should be back up soon i hope
[17:09:17] yay for italian wikipedia for example :) and all the other wikipedias who did that!
[17:09:42] i also loved this article about wikidata being the new rosetta stone: http://blogs.cccb.org/lab/en/article_la-nova-pedra-de-rosetta/
[17:09:43] i think a bunch of tools are down :/
[17:10:15] ok what happened on the technical side?
[17:10:21] we've made a mobile view work
[17:10:43] we improved how wikidata edit summaries are shown on wikipedia and co
[17:11:02] we made special:nearby work: https://www.wikidata.org/wiki/Special:Nearby
[17:11:19] we've cleaned up the user interface a bunch (more to come)
[17:11:56] we've made it possible to add a reference at the same time as you add the main part of the statement - hopefully leading to more people adding references
[17:12:20] we've created the dashboard of all dashboards for wikidata: https://grafana.wikimedia.org/dashboard/db/wikidata :D
[17:13:00] we've cleaned up query.wikidata.org and added examples and made them more visible so more people can hopefully use the query service
[17:13:26] we've given meta, mediawiki and wikispecies access to the data on wikidata
[17:13:43] oh and we attended a bunch of really cool events like the world health summit
[17:14:11] the biggest discussion lately probably went around data quality on wikidata though
[17:14:19] a lot has happened in that area as well:
[17:14:36] i wrote a good overview of the topic for the signpost: https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2015-12-09/Op-ed
[17:14:44] there is maintenance going on in all of labs, so we need to wait a bit for the tools :?
[17:14:47] :/
[17:14:53] thx aude
[17:15:01] http://addshore.com/2015/12/wikidata-references-from-microdata/ <- this is something awesome adam has done
[17:15:28] basically automatically extracting references for data we have from markup in the pages linked on wikipedia and wikidata
[17:15:40] so we can increase the number of references significantly
[17:16:26] and the other important step is ORES - a tool to help us automatically figure out which edits are likely good or bad to make it easier to find the ones that need human review: http://blog.wikimedia.de/2016/01/02/teaching-machines-to-make-your-life-easier-quality-work-on-wikidata/
[17:16:37] that was done by amir, aaron and their team
[17:17:01] and another important part of the debate: magnus looking at comparing the reference situation on wikidata and wikipedia: http://magnusmanske.de/wordpress/?p=378
[17:17:34] so we still have a long way to go in terms of data quality but i believe we've made really huge steps over the last quarter
[17:17:40] \o/
[17:17:46] \o/ indeed :)
[17:18:21] *thunderous applause*
[17:18:25] :D
[17:18:42] :D
[17:18:47] :)
[17:18:55] :p
[17:19:03] oh one thing i nearly forgot there: magnus worked on a nice gadget to just drag references over from wikipedia: https://www.youtube.com/watch?v=jP-qJIkjPf0
[17:19:14] you can find out how to use it in the description of the video
[17:19:34] speaking of magnus and his awesome tools: https://tools.wmflabs.org/mix-n-match/ got a lot of new catalogs
[17:19:52] and https://tools.wmflabs.org/wikidata-game and https://tools.wmflabs.org/wikidata-game/distributed/ are always awesome for when you have some free time to kill
[17:20:24] alright - so far for the "this is what happened" part. any questions about that?
[17:20:46] so when are you going to import say https://finds.org.uk/ironagecoins
[17:20:49] Lydia_WMDE: we've recently met with Internet Archive folks, I wonder if their catalog (they have a catalog, not only the wayback machine) could be hooked up to mix-n-match
[17:21:06] SMalyshev: very likely. get them connected to magnus
[17:21:21] Lydia_WMDE: yes, trying :)
[17:21:23] geniice: not up to the development team but the editing community :)
[17:21:27] how's the automatic reference generator for Wikipedia going ?
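(Aside - the revamped query.wikidata.org mentioned above accepts SPARQL such as the following minimal sketch; it is not one of the actual bundled examples. P31 "instance of", Q5 "human", P19 "place of birth" and Q64 "Berlin" are real ids; the wikibase: and bd: prefixes are predeclared by the query service.)

    PREFIX wd: <http://www.wikidata.org/entity/>
    PREFIX wdt: <http://www.wikidata.org/prop/direct/>

    # Sketch: people born in Berlin, with English labels
    SELECT ?person ?personLabel WHERE {
      ?person wdt:P31 wd:Q5 ;     # instance of: human
              wdt:P19 wd:Q64 .    # place of birth: Berlin
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
    }
    LIMIT 100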
There are still too many people writing content with crap references or no real references at all. We need more references within the WMF-sphere first, really.
[17:23:46] Lydia_WMDE so tools to mass import data from external databases and run regular verification and update checks already exist?
[17:23:58] NotASpy : assuming people use crap references or no references at all, how would using the WMF-sphere which relies on said crap references (or lack thereof) be any better than using the reference it's based on directly ?
[17:24:37] Alphos: it's not, we need better references first, before worrying too much about doing more with those references.
[17:25:35] NotASpy Couldn't wikidata just import things like the DOAJ journal index and report when an open access journal is or isn't on it?
[17:26:10] yes, you could do lots of things like that.
[17:26:31] hm
[17:26:34] NotASpy when we do have proper references, the software allows us from now on to use them, and that's a good thing ; the software can't however force people to use proper references : there's no way to determine the "properness" of a ref without a human looking at it ;-)
[17:26:36] sorry my irc client was giving me a headache
[17:26:47] back again now under this nick
[17:26:59] Alphos: of course there's a way to check for properness - URL, data, title, author data etc.
[17:27:36] geniice: there are some tools but i don't think it is perfect yet
[17:27:40] still quite some work to do
[17:28:01] that checks fields were filled. it doesn't check the reference is relevant, or that it's from a known reference source
[17:28:09] Alphos: which then can cross with some of the things geniice just mentioned - you could in fact prepare a reference, sort of in advance, for every journal article out there, and then substitute a bare URL reference for a proper fully formatted, ready to roll reference by virtue of having 'mined' journal catalogues etc.
[17:28:31] NotASpy, yes that is a good idea
[17:28:44] it could be a bot, improving existing references, right?
[17:28:58] NotASpy which would still require a human eye for checking the quality of the ref ; but on the principle, i wouldn't mind that kind of system :)
[17:29:00] so you find an URL to a journal paper as a reference, and replace it with a more complete reference
[17:29:10] very much that sort of thing, yes.
[17:29:21] dennyvrandecic: sounds like a plan!
[17:29:23] yes, that sounds like a good idea
[17:29:29] I've been toying with it for years and never really getting anywhere (it's way above my level of coding competence)
[17:29:30] :)
[17:29:54] Alphos there are already various automated approaches. exclude everything on Beall's List: http://scholarlyoa.com/publishers/
[17:30:10] NotASpy: if you could write down exactly what needs to happen I'm sure I would be able to find some time to help!
[17:31:01] sweet
[17:31:04] NotASpy: will you?
[17:31:31] geniice no blacklist can ever be complete, by the very definition of what they are ;-) it is a good start though
[17:31:59] Alphos highlight DOAJ Seal then other DOAJ stuff. Might have to throw some money in DOAJ's direction long term
[17:32:27] will do.
[17:32:34] :)
[17:32:38] ok. should we move on to what's coming up?
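(Aside - the bot dennyvrandecic and NotASpy sketch above, swapping bare URL references for fully formatted ones, could find its work queue with a query along these lines. A hypothetical sketch: P854 "reference URL" and P1476 "title" are real properties, and prov:/pr: are the reference-level prefixes used by the query service.)

    PREFIX prov: <http://www.w3.org/ns/prov#>
    PREFIX pr: <http://www.wikidata.org/prop/reference/>

    # Sketch: references that consist of a bare URL (P854) and carry no title (P1476)
    SELECT ?statement ?url WHERE {
      ?statement prov:wasDerivedFrom ?ref .
      ?ref pr:P854 ?url .
      FILTER NOT EXISTS { ?ref pr:P1476 ?anyTitle . }
    }
    LIMIT 100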
[17:32:47] Lydia_WMDE please :)
[17:32:52] cool
[17:33:13] so frimelle has been working on the articleplaceholder
[17:33:33] http://articleplaceholder.wmflabs.org/mediawiki/index.php/Special:AboutTopic/Q3
[17:33:33] it shows information on small wikipedias when they don't have info about a topic but wikidata does
[17:33:37] \o/
[17:33:48] we're currently looking for the first wikipedias to try it out
[17:34:05] i hope with this we'll really be able to make good on our promise of helping the smaller wikipedias
[17:34:24] by making them more useful for their readers and attracting more editors for them
[17:35:05] another student project we have running right now is charlie's. she's researching how to make editing of wikidata's data from wikipedia work
[17:35:18] so we remove yet another barrier for the wikipedias to use wikidata
[17:35:31] once we have her concept we can see how to implement it
[17:35:37] Lydia_WMDE: cool!
[17:35:39] :)
[17:35:42] i think so too!
[17:35:47] that would be really needed on frWP
[17:35:53] :D
[17:36:00] do we have a page about her project?
[17:36:07] hint hint: those are two of the most awesome student projects. if you're a student and looking for something to work on with impact let me know
[17:36:17] Harmonia_Amanda: yeah give me a sec
[17:36:26] Lydia_WMDE: thank you
[17:36:37] https://www.wikidata.org/wiki/Wikidata:Client_editing_input <- this is where she started
[17:36:56] dang, not a student ^^'
[17:36:57] she's currently in writing and prototyping mode
[17:37:02] Alphos: ;-) too bad
[17:37:15] Lydia_WMDE: thank you very much
[17:37:26] you're welcome!
[17:37:34] other things we're working on:
[17:38:03] a new datatype for identifiers should be ready in the next few weeks so we can put identifiers into their own section and properly link them in the exports
[17:38:11] this should clean up the items quite a bit
[17:38:27] another thing i want to do is add an image to the item's header so it is a bit more visual
[17:38:57] bene has been working on making it possible to show more languages in the "in other languages" box. this way we should be able to get rid of the label lister gadget
[17:39:11] aude has been working on making our search ranking better
[17:39:51] adrian has been working on cleaning up all languages around mediawiki so we can finally have more languages in the monolingual text datatype
[17:39:56] (it's a mess! -.-)
[17:40:45] hoo has made the last patch necessary for the "in other projects" sidebar. we want to take that out of beta features soon. my hope is that we'll give all the sister projects good exposure this way and can grow them
[17:41:08] 2 students have been working on a new datatype for mathematical expressions/formulas
[17:41:27] should also be coming in the next few weeks
[17:42:25] we'll be digging into Commons again and getting out a first prototype for structured data for media files (not to be deployed on commons but just a test system because it'll be too ugly/unusable at this point)
[17:42:59] and we want to improve performance yet some more
[17:43:29] i'd also love to finally get the constraints to show up next to statements but i'll need to see how soon we can get to it
[17:43:44] and later this year we'll get to automated lists on wikipedia etc based on wikidata queries
[17:43:59] lots to do!
[17:44:09] but i'm excited. good stuff is coming up.
[17:44:12] i hope you are too
[17:44:19] :D
[17:44:20] it might make sense to rework the structure of some constraints, btw. getting difficult to do some things.
[17:44:27] *nod*
[17:45:41] alright
[17:45:43] questions? :)
[17:45:44] good stuff indeed
[17:46:08] +1
[17:46:13] \o/
[17:46:22] I guess Esperanto WP is small enough for the placeholders
[17:46:30] Lydia_WMDE: you haven't mentioned the watchlist... do you consider it done at this point?
[17:46:32] but I'm not really active there
[17:46:35] Polyglot: how big is it? but yeah - likely
[17:46:51] matej_suchanek: on wikipedia? not done, no
[17:47:18] 223k articles
[17:47:20] on wikidata: also not done but i need to find someone to work on it - currently not possible with the small team
[17:47:25] I would hope that there is a Wikipedia language community who would ask for this
[17:47:32] Instead of one that is being approached
[17:47:40] like the Italian and Hebrew Wikipedias did for phase 1
[17:47:44] dennyvrandecic: we have a few already after we asked for volunteers \o/
[17:47:50] sweet
[17:47:56] more always welcome
[17:48:11] Polyglot: how many editors?
[17:48:49] are 1240k articles few enough?
[17:48:56] no idea really, not a lot of 'native' speakers, but the ones that do edit are probably very motivated :-)
[17:49:12] Ricordisamoa: i personally consider the number of editors more important
[17:49:21] Polyglot: you can check on special:statistics
[17:49:23] ic, active editors?
[17:49:33] Polyglot: either is fine
[17:49:35] ili havas 418 aktivaj uzantoj :) (Esperanto: "it has 418 active users")
[17:49:46] that totally works
[17:49:47] *aktivajn uzantojn
[17:50:04] 8075 utenti attivi (Italian: "8075 active users")
[17:50:21] Ricordisamoa: that is quite a few. which wikipedia?
[17:50:26] it
[17:50:30] hah
[17:50:32] ok
[17:50:42] 115k registered / 400 active in the last month
[17:50:45] i'm totally up for giving it to itwiki as well. but maybe in a second round?
[17:50:56] Polyglot: that works just fine
[17:50:59] I shall ask
[17:51:02] :)
[17:51:02] Polyglot There is an input page on ArticlePlaceholder on the esperanto Wikipedia already https://eo.wikipedia.org/wiki/Vikipedio:Diskutejo/Teknikejo#Anstata.C5.ADigilo_de_neekzistantaj_artikoloj
[17:51:44] bonege! :-)
[17:51:53] (real good)
[17:52:25] Ah thanks, my Esperanto isn't that good (or existing at all) yet :)
[17:52:47] frimelle: kial ne? :p (Esperanto: "why not?")
[17:53:22] Well, so many languages :D
[17:53:33] https://incubator.duolingo.com/courses/eo/en/status :)
[17:53:38] :D
[17:53:48] <3 duolingo
[17:53:54] Lydia_WMDE: Have you seen the discussion regarding union properties? is there any chance we'll get some kind of array datatype (or even a "valueless" datatype, which would also work)?
[17:53:58] Hello Lydia_WMDE - Where had the discussion reached..
[17:54:03] ?
[17:54:04] Okay, let's talk about Wikidata again instead of pushing me into Esperanto :D
[17:54:06] YairRand: i don't think i have
[17:54:18] ShakespeareFan00: we're in the questions part
[17:54:28] YairRand: can you summarize?
[17:54:49] I had a Wikidata-related question, which was to do with a timescale for numeric datatypes...
[17:55:00] ShakespeareFan00: ask away
[17:55:02] proposals are "union of" and "disjoint union of". needs a way to have a distinct group of items as a value
[17:55:15] There are some data sources for 'distance' data...
[17:55:30] and I wanted to put data about distances from a datum into Wikidata
[17:55:30] YairRand: can you give me an example of a statement this would be used for?
[17:55:40] Currently I don't think it's that easy to do
[17:56:10] Is there a timescale for Wikidata being able to have a 'distance from' datatype ?
[17:56:14] an example given there is "nucleon" disjointunionof [ "proton", "neutron" ]
[17:56:47] ShakespeareFan00: distance between points from coordinates?
[17:56:50] ShakespeareFan00: you are looking for the numbers datatype. you can already express it. i don't know if a property for it exists at this point. you can check the property proposals if not
[17:57:06] Thanks
[17:57:14] YairRand: hmmmm and that can't be made as two statements?
[17:57:17] My other question was slightly less serious
[17:57:37] ShakespeareFan00: if they're points on Earth you may want to get the distance from polar coordinates
[17:57:38] Were there plans to write a 'quiz generator' based on Wikidata?
[17:57:51] someone already did!
[17:57:53] Lydia_WMDE: no, because a class can be subdivided different ways, so you need clear groups
[17:57:57] let me see if i can find the paper
[17:58:07] Ricordisamoa: Wikidata isn't a GIS, that's a known limitation
[17:58:15] https://twitter.com/wikidata/status/462966192098258945
[17:58:28] I have another question: why not have a simple datatype for integers like "ranking" or "atomic number"?
[17:58:38] YairRand: i see. hmmmm. to be honest definitely not in the short term future because there are all kinds of hairy details in this one. like which datatypes do you allow in the individual parts
[17:59:07] Tpt: we might in the future. this is one solution we're looking at for the +-1 issue
[17:59:14] Tpt: Most numbers are dimensioned or have units
[17:59:16] Lydia_WMDE: just "item-array" would work for this
[17:59:26] Lydia_WMDE: ok, thank you :-)
[17:59:27] YairRand: ok
[17:59:33] At least in a scientific context I think
[17:59:35] https://github.com/FraBle/WikidataQuiz
[17:59:45] ShakespeareFan00: yes, but using the quantity datatype seems overkill in a lot of use cases
[17:59:51] Tpt ranking isn't an integer per se, it's an ordinal, not a cardinal. i'd love two distinct integer types, one for cardinals, one for ordinals, but maybe that's just me :p
[17:59:54] Lydia_WMDE: currently being discussed is just mandating always having "no value" for the value and filling in the real values as qualifiers
[17:59:56] Ricordisamoa: thanks!
[18:00:04] Also should Wikidata have error bounds on numeric data?
[18:00:10] here is a paper: http://www.tik.ee.ethz.ch/file/0d5ed5588540ee586ae420a8199e0852/wikidata-quiz.pdf by another student/team
[18:00:18] I mean some figures can have HUGE error bounds
[18:00:34] making them less useful
[18:00:52] ShakespeareFan00: we already have uncertainty. we'll be working on improving it
[18:01:12] Okay, sorry... I seem to be a bit behind :(
[18:01:18] don't worry
[18:01:19] Lydia_WMDE : maybe a +x/-y ?
[18:01:38] Alphos: yeah
[18:01:39] some uncertainty is weighed one way or the other
[18:01:43] \o/
[18:01:59] On numbers, I will also note that it's not easy in Wikidata to enter arrays / lists into a single item
[18:02:09] This is most likely a data model issue
[18:02:11] the more people help us the earlier we will get to such things :)
[18:02:33] ShakespeareFan00 could you give an example of an array/list used in a single property ?
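(Aside - on the uncertainty point: quantity values in the RDF export already carry lower/upper bounds, which is what a "+x/-y" notation would surface in the UI. A sketch, assuming P2044 "elevation above sea level" as the example property; wikibase:quantityAmount and the two bound terms are ontology terms documented for the query service.)

    PREFIX p: <http://www.wikidata.org/prop/>
    PREFIX psv: <http://www.wikidata.org/prop/statement/value/>
    PREFIX wikibase: <http://wikiba.se/ontology#>

    # Sketch: a quantity's amount plus its lower/upper uncertainty bounds
    SELECT ?item ?amount ?lower ?upper WHERE {
      ?item p:P2044 ?stmt .             # P2044 = elevation above sea level
      ?stmt psv:P2044 ?valueNode .
      ?valueNode wikibase:quantityAmount ?amount ;
                 wikibase:quantityLowerBound ?lower ;
                 wikibase:quantityUpperBound ?upper .
    }
    LIMIT 10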
[18:02:42] ShakespeareFan00: because wikidata isn't very suitable for long kinds of series of numbers for example
[18:02:55] at least at this point
[18:02:58] Alphos : A railway line listing the stations on it in increasing distance from the terminus
[18:03:21] oh, that's not just numbers then
[18:03:24] Yes you can put in connecting stations with qualifiers
[18:03:40] but you can't at present force an ordering by the qualifiers
[18:03:56] as for railway lines, i have to speak with a friend of mine for specifics to create a gadget to do just that ^^
[18:04:05] I accept this may be a data normalisation issue
[18:04:20] Given that Wikidata isn't fully relational in that sense
[18:05:07] From a data modelling perspective, I see railway lines as having a terminus, and intermediate stations some distance away.
[18:05:23] on wikidata it seems to be modelled differently
[18:05:28] meaning station to station
[18:05:28] ShakespeareFan00 i'm not a member of the office, but i'm planning on working on it ;-)
[18:05:50] Lydia_WMDE: For some applications that works, but for things like route planning it doesn't
[18:05:51] * Alphos has a thing for railways
[18:06:00] ShakespeareFan00: yeah
[18:06:18] Alphos: Don't forget that similar tactics can be used for highways, river networks, etc....
[18:06:22] https://www.wikidata.org/wiki/Q704716 <- this one for example is the station next to where i am right now
[18:06:28] But Wikidata as I said isn't a GIS
[18:06:34] jep :)
[18:06:45] ShakespeareFan00 not "similar". "identical". a network is a network is a network ;)
[18:07:03] alright. more questions?
[18:07:22] With your example, Lydia_WMDE, for adjacent stations, useful info to add would be inter-station distances and travel times for example
[18:07:23] questions, no. comments, yes !
[18:07:32] ShakespeareFan00: yeah
[18:07:39] Alphos: haha
[18:07:52] Lydia_WMDE: Would a future office hour on GIS in Wikimedia be a useful idea?
[18:08:18] ShakespeareFan00: probably (including yurik, maxsem, etc)
[18:08:30] we had past office hours about OSM
[18:08:36] ShakespeareFan00: likely! but i think the people aude mentioned should handle that as they're better equipped
[18:08:47] Although it's probably something you should talk with OSM people about
[18:09:01] there's been some discussion at least how to have a place for stuff like geojson
[18:09:01] ShakespeareFan00, i ran a hangout session with community once, but in reality - 24x7 is my GIS office hours :)
[18:09:16] and how to connect / make use of that in graphs, maps and wikidata
[18:09:19] The concern with OSM that has been expressed before is that it's not generally WP:RS, being peer generated
[18:09:21] there's yurik :)
[18:09:29] aude, funny, discussing it as part of https://www.mediawiki.org/wiki/Wikimedia_Discovery/RFC -- see the talk page
[18:09:55] aude, Yurik: You are familiar with online route planners?
[18:10:04] ShakespeareFan00: yes
[18:10:06] If Wikidata could do what they do :)
[18:10:08] i use google :)
[18:10:16] not sure this is the proper setting for my comments, just say so if not : 1) given the contribs/user ratio even for non-bot users, and given that reverting a large amount of contribs over a specific period of time for a given user can be incredibly tedious for one person, i've worked on a bot to do just that ; could be used to revert vandalism or mistakes, in effect improving data quality.
[18:10:29] can I speak of what Tpt and I did today?
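(Aside - on "you can't at present force an ordering by the qualifiers": the item UI indeed doesn't sort statements, but the query service can already sort on a qualifier wherever editors have entered one. A hypothetical sketch using P81 "connecting line" and the generic ordering qualifier P1545 "series ordinal"; wd:QLINE is a placeholder for a real line item, and the data may simply not carry such qualifiers yet. The v:/q: prefixes match the ones used in the query pasted later in this log.)

    PREFIX wd: <http://www.wikidata.org/entity/>
    PREFIX p: <http://www.wikidata.org/prop/>
    PREFIX v: <http://www.wikidata.org/prop/statement/>
    PREFIX q: <http://www.wikidata.org/prop/qualifier/>
    PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>

    # Sketch: stations on a given line, sorted by their "series ordinal" qualifier
    SELECT ?station ?ord WHERE {
      ?station p:P81 ?stmt .      # P81 = connecting line
      ?stmt v:P81 wd:QLINE .      # QLINE: placeholder for the railway line's item
      ?stmt q:P1545 ?ord .        # P1545 = series ordinal
    }
    ORDER BY xsd:integer(?ord)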
We created a fantastic template which uses Wikidata to build bibliographical entries. And it works and it's awesome https://fr.wikipedia.org/wiki/Mod%C3%A8le:Bibliographie
[18:10:36] e.g. http://wiki.openstreetmap.org/wiki/Open_Source_Routing_Machine
[18:10:44] There are some open source routing engines as you mention..
[18:11:01] But most do street map routing, not transport routing via public transport as such
[18:11:38] Alphos: nice!
[18:11:51] Harmonia_Amanda: \o/ did you already tweet it?
[18:11:56] Lydia_WMDE: no
[18:11:59] 2) while test-editing that bot, i stumbled upon wikilink conflicts that weren't detected by wikidata when they were first added ; addshore mentioned a period where constraint checks failed, i also thought of redirects that were created after they were added in wikidata ; either way, we need a way to find them
[18:11:59] Harmonia_Amanda: http://www.opentripplanner.org/
[18:12:00] you should! :D
[18:12:02] Alphos: Most excellent.
[18:12:04] I was thinking along the lines of: you take the Piccadilly line to Heathrow 4 -> flight to Berlin -> and then some U-Bahn into Berlin for example
[18:12:13] public transport routers need data from public transport providers, isn't that out of scope for wikidata?
[18:12:16] Alphos: write on project chat about it?
[18:12:21] anyway, we might be getting off topic and could schedule another chat about this
[18:12:26] ShakespeareFan00, routing is awesome, but the bigger question is if it should be prioritized over making maps available to the wikipedia articles with custom info layers, plus historical maps, etc.
[18:12:31] Ricordisamoa: Hmmm that is a concern
[18:12:32] Lydia_WMDE: yes but in french ? https://twitter.com/Harmonia_Amanda/status/690218339596161024
[18:12:42] that is fine
[18:12:42] so, as suggested by addshore, i just created a labs account, and will get started on writing a script that could be run periodically and produce conflict reports :)
[18:12:57] YairRand thank you ^^
[18:13:07] :)
[18:13:12] Yurik: I did mention earlier that being able to sort stuff in Wikidata entries based on qualifiers would be useful
[18:13:21] Lydia_WMDE already requested a bot perm, currently in progress :) will mention it in project chat as well, thanks for the suggestion
[18:13:26] For adjacent stations this might be something like travel time etc..
[18:13:41] Alphos: cool!
[18:13:57] I.e. Station X is 5 mins away, Station Y is 10 and so on
[18:14:02] <3 everyone!
[18:14:11] https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/RollBot please comment if you will :)
[18:14:31] Something else which may also be out of scope was using Wikidata to do comparative road signs...
[18:14:43] Rather complex, but do-able
[18:15:06] Assuming someone was willing to enter the various signs from respective road traffic rules...
[18:15:16] And figure out the equivalences...
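(Aside - a sketch of the kind of scan Alphos's planned conflict-report script could run. Sitelinks are exposed to the query service via schema:about; duplicates are normally blocked at edit time, which is exactly why the ones that slipped through, e.g. via later redirects, need a periodic scan. Hypothetical, not the actual script.)

    PREFIX schema: <http://schema.org/>

    # Sketch: two different items claiming the same sitelink
    SELECT ?item1 ?item2 ?article WHERE {
      ?article schema:about ?item1 .
      ?article schema:about ?item2 .
      FILTER(STR(?item1) < STR(?item2))   # report each conflicting pair once
    }
    LIMIT 100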
[18:15:34] ShakespeareFan00, it would be doable, just wondering about the usage - how big of an impact we could get with it
[18:15:44] * yurik is all about flashy stuff that affects tons of people :)
[18:15:52] Yurik: It might be very useful to Wikivoyage people
[18:16:09] * yurik agrees
[18:16:12] for some reason, everyone forgets about wikivoyage people
[18:16:21] And being able to have a comparative road traffic chart would be useful for others
[18:16:36] Most European road signs are broadly similar, but not identical
[18:16:37] <- always thinks of them in airplanes and when thinking about sandy beaches in the caribbean
[18:16:38] :D
[18:16:47] Lydia_WMDE: One more question: Is SPARQL going to be the query language used when we get queries on the client projects? If so, is it extendable? I've been fiddling with queries, and there are some pretty substantial gaps in what's possible...
[18:16:55] And of course the UK has ever so slightly different ones to Europe
[18:17:06] ShakespeareFan00, my first goal for wikivoyage is to get them off wmflabs hacky maps to a good solid platform with geojson stored directly on wiki. Afterwards, we want to expose that to all wikipedias
[18:17:28] If you want to go further, you might want to look into whether Wikidata can be used to do 'comparative' pictograms for things...
[18:17:31] YairRand: sparql will be what it is built on very likely. but it'll probably need to be considerably more user-friendly
[18:17:45] afterwards - we could think about adding cool new features like that. BUT!!! this is my priorities list, whereas this is an open platform, contributions in any areas are welcome :)
[18:17:52] YairRand: can you write down the limitations you ran into? then we can see what can be done still
[18:18:03] Road signs were my first thought, but you have other signs that have equivalent meaning but a different sign
[18:18:09] depending on where you are
[18:18:14] Hmmm...
[18:18:22] Pictogram translation in effect :)
[18:18:33] Google's probably working on it though :(
[18:18:39] Lua modules could be built to generate SPARQL queries with chainable syntax
[18:19:00] Yurik, Lydia: OK I think I've said what I wanted
[18:19:06] :)
[18:19:23] any other questions/topics/opinions?
[18:19:33] cool things to show off?
[18:19:47] Yurik: I'd strongly suggest looking into how to code pictogram equivalences into Wikidata :)
[18:19:52] Lydia_WMDE, any hope of a fun graph idea ;)
[18:19:57] i already talked about RollBot, what more cool things do you want from me ?! :p
[18:20:01] ArthurPSmith is working on a Wikidata-based "chart of the nuclides"
[18:20:02] Fun graph?
[18:20:02] https://gerrit.wikimedia.org/r/#/c/245591/
[18:20:06] yurik: ahhhhhhhh :P
[18:20:14] You mean Wikidata-derived infographics?
[18:20:15] Alphos: haha. fair!
[18:20:17] Lydia_WMDE: sparql can't, for example, check what city a former entity was located in. (can't do wdt:P* qualified each time by start/end times). should I write up a full list of major limitations somewhere?
[18:20:32] ShakespeareFan00, Lydia_WMDE has this huge desire to build amazing infographics in-wiki using the graph ext :D
[18:20:39] YairRand: that would be very useful
[18:20:44] and i just might be able to help... maybe
[18:20:51] ;-)
[18:21:07] yurik needs cool ideas for graphs to build on-wiki with wikidata
[18:21:12] because otherwise all that data goes nowhere :-P
[18:21:15] if you have some send them all his way :)
[18:21:17] unless visualized :-P
[18:21:18] Yurik: Can you build transit maps?
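(Aside - the gap YairRand describes above comes from the "truthy" wdt: triples dropping qualifiers; the statement-level form keeps them, it just can't be used inside a wdt:P* property path. A sketch with P131 "located in the administrative territorial entity" and the P580/P582 start/end time qualifiers, using the same v:/q: prefixes as the query pasted later in this log.)

    PREFIX p: <http://www.wikidata.org/prop/>
    PREFIX v: <http://www.wikidata.org/prop/statement/>
    PREFIX q: <http://www.wikidata.org/prop/qualifier/>

    # Sketch: location statements together with their validity period,
    # which plain wdt:P131 would collapse to the current "truthy" value
    SELECT ?entity ?place ?start ?end WHERE {
      ?entity p:P131 ?stmt .
      ?stmt v:P131 ?place .
      OPTIONAL { ?stmt q:P580 ?start . }   # start time
      OPTIONAL { ?stmt q:P582 ?end . }     # end time
    }
    LIMIT 100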
[18:21:20] Sorry
[18:21:29] I have this obsession with rail stuff
[18:21:40] Currently most of the transit maps on Wikipedia are hand drawn...
[18:21:45] ShakespeareFan00: it's ok. we don't discriminate based on data preference here ;-)
[18:21:48] YairRand: Yes, please! The more specific the feedback, the better! Then we know specifically what needs improving. :-)
[18:21:56] Surely there are ways to use Wikidata to draw transit maps?
[18:22:04] YairRand: (I'm the product manager at the WMF responsible for the Wikidata Query Service)
[18:22:06] Assuming you have the right data in Wikidata?
[18:22:16] ShakespeareFan00 as i said, going to work on it ^^
[18:22:30] lots of gadgets to be done for transportation networks ^^
[18:22:32] Of course you can also build idea maps... all graphs
[18:22:39] Deskana: I'll post it to WD:DEV, okay?
[18:22:47] * aude hopes not to lose power during the great snowpocalypse this weekend so i can keep hacking on pet projects like graphs :)
[18:22:52] ShakespeareFan00, transit maps tend to be hand-made because people want to overlay them with the schematic geo-representation of the location. On the other hand, it should be fairly easy to plot train stations with geo coordinates using the graph ext
[18:22:54] I mean you could draw a transit-map-like infographic of decay products :)
[18:22:55] YairRand: i'll make sure Deskana sees it
[18:23:01] endless ideas :P
[18:23:22] Yurik: True, but abstract transit maps tweak the geography for clarity...
[18:23:38] This is why transit companies pay a fortune to map designers ;)
[18:23:54] YairRand: Please ping me when you do, my username is [[User:Deskana (WMF)]]. I don't check Wikidata too much, 99% of my work is not directly Wikidata related. :-)
[18:24:04] and even then they don't get it right all the time.... look at diagrams of New York's subway :(
[18:24:07] Lydia_WMDE: Thanks for that. :-)
[18:24:14] np
[18:24:28] yep - graph ext is "data driven graphing" - you give it data, and the graph ext auto-converts it into an image. If it has to be tweaked by hand, it's not good
[18:24:58] alright folks. should we wrap this up for today?
[18:25:00] There are ways to lay out abstract graphs that aren't geographical though
[18:25:12] This probably needs a longer discussion at another time
[18:25:14] office hour-and-a-half
[18:25:17] :D
[18:25:18] ShakespeareFan00, https://www.mediawiki.org/wiki/Extension:Graph/Demo
[18:25:20] :)
[18:25:37] thanks so much everyone for coming and having a good lively hour and a half
[18:25:41] this was good!
[18:25:44] yurik : Cool..
[18:25:46] thanks Lydia_WMDE :)
[18:25:46] indeed it was
[18:25:58] So it could do things like resource mapping from USDA data?
[18:26:05] :)
[18:26:07] Assuming someone had the time :)
[18:26:19] I'll post the log unless someone else wants to
[18:26:39] "Good Morning Wikipetan, please show me where the largest wheat producers are..."
[18:26:42] XD
[18:26:47] And it brings up a map
[18:26:49] XD
[18:27:16] A Wiki-assistant is probably very much a wishlist item though
[18:27:24] Magnus Manske day is -4
[18:27:39] ShakespeareFan00: not that super far off: http://askplatyp.us/
[18:27:43] Ricordisamoa: ohhhhhhhhhhhhh
[18:27:47] we should do something!
[18:28:28] "Hi Joe, where do I see real cannibals?" being a more extreme question you could ask it?
[18:28:48] (For those that don't know, I am referencing a very old science fiction short story here)
[18:30:22] * ShakespeareFan00 jokingly wonders what a Wiki-sister would look like on screen :)
[18:30:30] Wiki (As)sister :)
[18:30:38] You can have that :)
[18:32:14] Lydia_WMDE: We have moved askplatyp.us to a new server so it should be faster ;-)
[18:32:56] ShakespeareFan00: If you are interested in contributing, askplatyp.us welcomes contributions ;-)
[18:32:59] Tpt: sweet!
[18:34:43] we don't have any orcs in wikidata
[18:34:51] they seem to be not notable.
[18:34:55] lol
[18:34:57] such speciesism
[18:36:05] :P
[18:36:22] dennyvrandecic: I count 9 orcs
[18:36:36] according to the reasonator
[18:36:37] YairRand: cool! how?
[18:36:50] sorry, 15
[18:37:26] see for example Q2566378
[18:37:52] Tpt: I might
[18:38:01] Not sure how to find Kasuga though
[18:38:15] dennyvrandecic: nine tolkien orcs, 6 wow orcs
[18:38:16] (so I can clarify the artwork licensing on the mascots)
[18:38:49] YairRand: ah, I see. But they are not typed as orc (Q194061) but as Orc (Q722547)
[18:39:28] Thanks...
[18:39:39] I assume this channel is logged?
[18:39:40] we need to make clear the distinction between "based on" and "subclass of" for mythical creatures...
[18:40:35] hmm. orc subclass of warcraft race seems wrong. i will change that to instance of.
[18:41:12] dennyvrandecic: no it's a class
[18:41:24] orc is a class
[18:41:35] ah
[18:41:40] it's "race"
[18:41:47] sure it is a class. but not a subclass of warcraft race.
[18:41:48] yes orc is "one" race
[18:41:54] so instance of
[18:41:57] ok :)
[18:42:05] instance of warcraft race
[18:42:12] subclass of warcraft character.
[18:43:08] conceptually, {group of "people"} can't be a subclass of {single "person"/"character"}
[18:43:39] meh, all the warcraft races are "subclass of warcraft race"
[18:44:10] https://query.wikidata.org/#PREFIX%20wd%3A%20%3Chttp%3A%2F%2Fwww.wikidata.org%2Fentity%2F%3E%0APREFIX%20wdt%3A%20%3Chttp%3A%2F%2Fwww.wikidata.org%2Fprop%2Fdirect%2F%3E%0APREFIX%20wikibase%3A%20%3Chttp%3A%2F%2Fwikiba.se%2Fontology%23%3E%0APREFIX%20p%3A%20%3Chttp%3A%2F%2Fwww.wikidata.org%2Fprop%2F%3E%0APREFIX%20v%3A%20%3Chttp%3A%2F%2Fwww.wikidata.org%2Fprop%2Fstatement%2F%3E%0APREFIX%20q%3A%20%3Chttp%3A%2F%2Fwww.wikidata.org%2Fprop%2Fqualifier%2F%3E%0APREFIX%20rdfs%3A%20%3Chttp%3A%2F%2Fwww.w3.org%2F2000%2F01%2Frdf-schema%23%3E%0A%0ASELECT%20%3FcatLabel%20WHERE%20%7B%0A%20%20%20%3Fcat%20%20wdt%3AP279%20wd%3AQ15839082%20.%20%0A%0A%20%20SERVICE%20wikibase%3Alabel%20%7B%0A%20%20%20%20bd%3AserviceParam%20wikibase%3Alanguage%20%22en%22%20.%0A%20%20%20%7D%0A%7D
[18:44:19] dennyvrandecic : tinyurl !
[18:44:21] if someone wants to go through and clean that up
[18:44:31] http://tinyurl.com/hpfusfj
[18:44:36] I'll go through it with autolist
[18:44:47] thanks YairRand
[18:52:34] endmeeting
[18:55:16] dennyvrandecic: wow species fix complete
[18:56:26] YairRand were you being impressed or abbreviating "world of warcraft" ? :p
[18:56:40] Alphos: Abbreviating :)
[18:56:49] wikidatans inbound for dinner, thanks for the office hour !
[18:57:28] YairRand: thanks, that was quick and great!
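(Aside - URL-decoded, the query dennyvrandecic pasted at 18:44:10 reads as follows; unused prefix declarations are omitted. P279 is "subclass of", and Q15839082 is the "warcraft race" class discussed above.)

    PREFIX wd: <http://www.wikidata.org/entity/>
    PREFIX wdt: <http://www.wikidata.org/prop/direct/>
    PREFIX wikibase: <http://wikiba.se/ontology#>

    SELECT ?catLabel WHERE {
      ?cat wdt:P279 wd:Q15839082 .   # subclasses of warcraft race
      SERVICE wikibase:label {
        bd:serviceParam wikibase:language "en" .
      }
    }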