[16:00:58] Hey everyone :)
[16:01:10] Hi!!! <3
[16:01:10] hi! :)
[16:01:16] who is here for the office hour
[16:01:23] o/
[16:01:29] Me!!
[16:01:35] hi, Lydia! I'm here for Wikidata office hour!
[16:01:42] Yay! :)
[16:01:49] nice
[16:01:59] * Ainali is here for that
[16:02:26] I'm here too
[16:02:34] Hello :)
[16:02:35] * JustMe_ listening :-)
[16:02:37] so i wanted to give an overview of what happened over the last 3 months and then see what is coming next
[16:02:57] but first of all I want to introduce LeaAuregann_WMDE
[16:03:05] 👍
[16:03:09] Yay \o/
[16:03:13] she joined the team to handle the community communication around Wikidata
[16:03:40] and I'm really happy to have her on board. that should help with me not having enough time to handle product management and communication
[16:03:42] So LeaAuregann_WMDE will run these office hours later on?
[16:04:10] yes ;-)
[16:04:27] 👍
[16:04:28] though i'll be around of course to support and answer all of your hairy questions ;-)
[16:04:30] If you're OK with that :-)
[16:05:00] alright then let's see where we are and what happened over the past 3 months
[16:05:16] right now we have over 16000 editors with more than one edit over the last 30 days
[16:05:38] and around 1230 with more than 100
[16:05:58] I am really happy about that
[16:06:06] Hi Lydia - yes, I am!
[16:06:27] and we have about 24 million items with 15 statements on average
[16:06:30] Hello Lea!
[16:06:39] also not too shabby ;-)
[16:07:18] Great, Lydia!
[16:07:26] a lot has happened since the last office hour. i'll surely forget some important things but you can add things if they come to your mind
[16:08:15] we've done interviews and so on to get a better understanding of how automated list generation should work with Wikidata.
you can see more on that at https://www.wikidata.org/wiki/Wikidata:List_generation_input and give more feedback
[16:08:33] feedback especially from wikipedia editors who work on lists would be extremely valuable
[16:09:24] the other big thing is that we are currently in the final steps of getting reviews and feedback for how we are going to model lexicographical data on Wikidata
[16:09:31] this is basically support for Wiktionary
[16:09:53] you can find the proposal on https://www.wikidata.org/wiki/Wikidata:Wiktionary
[16:10:16] i'm quite excited about being able to support this kind of data as well because it opens up so many more opportunities for both Wiktionary and Wikidata
[16:11:05] * bawolff thinks that sounds a lot like OmegaWiki...
[16:11:24] bawolff: not quite but if you are familiar with it it would be great to get your feedback on the proposal
[16:11:33] (Thanks for the update re lexicographical data - curious how the many different ways of structuring Wikidata further will inform different options for translation between any 2 given languages - e.g. how will pairing phrases, beyond pairing lexemes, work?)
[16:11:53] I'm not particularly familiar with OmegaWiki, I mean more from a "goals" perspective
[16:11:59] *nod*
[16:12:32] another area we've been spending more time on is the ArticlePlaceholder
[16:12:39] For that matter, I'm not particularly familiar with Wiktionary either
[16:12:48] we've fixed a lot of bugs and rolled it out on Kannada and Welsh Wikipedia as well
[16:12:56] bawolff: ok fair enough
[16:13:23] But it might be cool to have a FAQ entry comparing and contrasting it to OmegaWiki for people like me who just vaguely know what the two projects are :)
[16:13:33] yeah that makes sense
[16:14:27] But I did once try to write a screen scraper for extracting definitions from wiktionary
[16:14:34] Which was an utterly miserable experience
[16:14:51] heh yeah i can imagine
[16:14:57] And I mean utterly miserable (part of the problem was technology choice, but only part)
[16:14:58] that should hopefully become easier
[16:15:07] So more machine readability would be nice
[16:16:28] Lydia_WMDE: could you please remind us of the URL of the work done on ArticlePlaceholder?
[16:16:45] dachary: do you want to see one?
[16:16:52] yeah :-)
[16:16:56] ok one sec
[16:17:18] https://cy.wikipedia.org/wiki/Arbennig:AboutTopic/Q2673
[16:17:38] thanks !
[16:17:44] (in case anyone is interested my definition extracting thing was https://en.wiktionary.org/w/api.php?action=parse&format=xml&xslt=MediaWiki%3AextractFirst.xsl&prop=text&page=dog&lang=en&count=1&showWord=none&audio=none&redirects=on . It really sucks)
[16:18:09] it's done with https://www.mediawiki.org/wiki/Extension:ArticlePlaceholder ?
[16:18:26] one thing that _a lot_ of people have been complaining about is that we add ±1 to quantities a lot of times. we've been trying to figure out a solution and i think we have it now. i need to still find the time to write it up and get you all something for testing.
hope to get that done in the next month
[16:18:37] dachary: yes
[16:18:50] very cool
[16:18:53] :)
[16:19:30] we've also improved our browser tests so that hopefully fewer bugs ever find their way to your eyes ;-)
[16:19:40] (famous last words, i know)
[16:20:12] another huge thing from the last 3 months is that we finally were able to release the first prototype of support for Wikimedia Commons
[16:20:14] 👍
[16:20:17] you can read up more here: https://commons.wikimedia.org/wiki/Commons_talk:Structured_data#It.27s_alive.21
[16:20:48] there is still a lot of work to be done before we can get anything live on Commons but this was a huge milestone
[16:21:29] dare any time estimates? ;)
[16:21:41] shush dennyvrandecic ;-)
[16:21:52] *hiding in corner*
[16:21:56] awwwww
[16:22:04] nah i really don't know
[16:22:16] right now we are working on federation as a next step
[16:22:40] this basically means we enable you to use the items and properties from Wikidata to make statements on commons
[16:22:50] this is big task number 2
[16:23:09] puh. Federation could have meant so much more... Glad it is so restricted to the task at hand :)
[16:23:12] big task number 3 is multi-content-revisions in order to integrate the statements in the file page
[16:23:19] haha
[16:23:32] (Cool translation from Welsh to English pour moi using info boxes it would seem - https://cy.wikipedia.org/wiki/Arbennig:AboutTopic/Q2673 ! )
[16:23:35] once we have those 3 things in place we have the bare minimum
[16:23:47] and then we need integration in all the existing tools and workflows
[16:23:50] search for example
[16:23:53] and upload wizard
[16:24:19] but we'll get there
[16:24:26] and then we'll have an even more amazing Commons
[16:24:27] \o/
[16:25:26] more artsy things: our designer worked on new graphics for the data model as well as infographics for data flows and quality processes.
you can find them here: https://commons.wikimedia.org/wiki/Category:Wikidata_information_materials
[16:25:33] Will we use query.wikidata.org to get things in commons?
[16:25:37] might be useful for you at events or for talks
[16:26:09] Ainali: unclear still. i definitely want people to be able to query wikidata data and commons data at the same time
[16:26:19] how exactly that will look we'll have to see
[16:26:48] speaking of the query service. that also got some love.
[16:27:11] we now have map layers and show categories in the example queries so you can more easily find what you are looking for
[16:27:42] in queries you now also have access to page prop information like the number of sitelinks and statements on an item
[16:28:10] so you can easily query for items with at least 10 wikipedia articles about a given topic area
[16:28:43] (I am off for now - thanks, will read up later)
[16:28:47] cya
[16:29:02] also, the query examples have been moved to https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/queries/examples
[16:29:08] indeed!
[16:29:29] so if you have examples to add, please add them there
[16:29:38] See you, Denny! :)
[16:29:42] we've also made a bunch of usability improvements like better error messages
[16:30:16] when you add a new statement now you also get a message saying no item or property was found for your input if we don't have something in Wikidata
[16:31:39] we also reviewed and improved everything around calendar models. adam ran a bot to mark dates that will need to be reviewed because they might have a wrong calendar model. this sometimes happened because our UI was bad and people were confused
[16:32:09] how are they marked?
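A minimal sketch of the sitelink page-prop query described above, assuming the standard query service prefixes (wd:, wdt:, wikibase:); the restriction to humans (P31 = Q5) is only an illustrative choice:

```sparql
# Items that are instances of human (Q5) and have at least 10 sitelinks,
# i.e. articles about them in at least 10 wikis.
SELECT ?item ?sitelinks WHERE {
  ?item wdt:P31 wd:Q5 ;
        wikibase:sitelinks ?sitelinks .
  FILTER(?sitelinks >= 10)
}
LIMIT 100
```

The statement count mentioned alongside sitelinks is exposed the same way, via a wikibase:statements triple on the item.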
[16:32:18] they have a qualifier
[16:32:28] ok
[16:32:33] i can later look up the exact items
[16:40:45] :D
[16:40:49] technology is conspiring against me and Lea
[16:40:58] anyway
[16:41:13] https://www.wikidata.org/wiki/Wikidata:Requests_for_comment/Data_quality_framework_for_Wikidata is another important thing in my opinion
[16:41:39] Alessandro has spent some time with the dev team to work on the RfC in order to better understand what quality means on wikidata
[16:41:53] and then see how we can assess it
[16:42:06] we now have a page to request queries: https://www.wikidata.org/wiki/Wikidata:Request_a_query
[16:42:34] and there is a sunday query thing going on on twitter where you can ask for people to help you write cool queries
[16:43:23] #SundayQuery https://twitter.com/search?f=tweets&vertical=default&q=%23sundayquery&src=typd :)
[16:45:09] Request a query is a great development
[16:48:25] the property creation process was also improved by creating https://www.wikidata.org/wiki/Wikidata:Property_proposal/Overview and https://www.wikidata.org/wiki/Wikidata:Property_proposal/Attention_needed
[16:48:25] that gives you an idea of which property proposals need input
[16:48:25] another milestone: the Persondata template on English Wikipedia is gone in favor of Wikidata
[16:48:25] \o/
[16:48:25] it provided machine readable data for people
[16:48:25] and the Freebase API has now also been shut down
[16:48:26] this tutorial on how to write infoboxes with wikidata also seems really useful and could probably use some more love and spreading: https://www.wikidata.org/wiki/Wikidata:Infobox_Tutorial
[16:48:28] as well as these grant proposals related to Wikidata that are currently under review:
[16:48:29] * https://meta.wikimedia.org/wiki/Grants:Project/Putnik/Wikidata_module
[16:48:31] * https://meta.wikimedia.org/wiki/Grants:Project/StrepHit_IEG_renewal
[16:48:34] * https://meta.wikimedia.org/wiki/Grants:Project/Harej/Librarybase:_an_online_reference_library
[16:48:36] *
https://meta.wikimedia.org/wiki/Grants:Project/WikiFactMine
[16:48:38] alright
[16:48:40] that is it for the review :D
[16:48:42] questions on that or should we move on to what's coming next?
[16:48:51] seems that we encountered some problems with freenode, sorry for that ;)
[16:49:06] * Nightrose looks @freenode with a frowny face
[16:49:49] for this quarter i tried putting the big things i want the team to work on into one task on phabricator so you can more easily follow it: https://phabricator.wikimedia.org/T146637
[16:50:28] the biggest things we'll be working on are federation for Commons, automated sitelinks, and the first new entity type (like items and properties) for Wiktionary
[16:50:41] Lydia (Lydia_WMDE or Nightrose) and Lea (LeaAuregann_WMDE or Auregann), I'm hoping to attend the Wikimedia Developers' conference in SF in the first week of January, and it would be great if Ryan Kaldari and Jan Zerebecki (if he attends this year with the Wikidata team) could again re-install World University and School in MediaWiki as well as newly connect it with Wikidata. Would this be possible please, Lydia and Lea?
[16:50:57] we can certainly have a look yeah
[16:51:15] we don't yet know who of us exactly will be coming but should figure that out soon
[16:51:33] Great
[16:51:56] in addition to the things linked in the task above the following is coming:
[16:52:23] (Lydia, could we email further directly for a little planning in the interim, please?)
[16:52:24] Stas has worked on unit conversion for the query service. that should be coming in the next weeks for the first units
[16:52:32] Scott_WUaS: of course
[16:52:44] Thank you! :)
[16:53:04] and most importantly for the next quarter: we'll be celebrating Wikidata's 4th birthday!
\o/
[16:53:19] :-)
[16:53:19] (now is the time to think about presents :P)
[16:53:39] would this unit conversion make constructs like in https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/queries/examples#Objects_with_most_mass obsolete? or simplify them somehow?
[16:54:10] WikidataFacts: it would simplify them in that you can then search for them in one unit instead of having to look at all mass units
[16:54:23] you will then have everything in kg for example and order by that
[16:54:33] This is the part of my text from 9:50 which got cut off, for the office hour record, Lydia and Lea (and re this blog entry http://scott-macleod.blogspot.com/2016/09/bengal-tiger-wuas-in-mediawiki-and.html ). And how might we best explore this further? Thank you.)
[16:54:46] that’s what that example attempts to do as well, ?mass is always in kilograms
[16:55:09] ok then i don't think it'll change much for this one
[16:55:52] but it will make queries easier for the things where you have a lot of different units like km, miles, feet and so on
[16:56:11] a query for the longest rivers for example
[16:56:55] is there some link where I can find out more about this? :)
[16:56:56] Scott_WUaS: i'll have a look :)
[16:57:14] WikidataFacts: the ticket about it. let me try to find it.
[16:57:16] WikidataFacts: if I understand your SPARQL code correctly, unit conversion *would* make that a lot simpler.
[16:57:42] https://phabricator.wikimedia.org/T77978
[16:57:45] DanielK_WMDE_: it does the conversion “manually” using P2370 conversion to SI unit
[16:57:47] (Nightrose: And I'll email you something more up to date and specific. Thanks, Lydia!)
[16:57:47] WikidataFacts: you would not need to look at the unit item at all. Your baseMass would already be in kg.
[16:57:59] Nightrose: great, thanks
[16:59:27] more questions?
[16:59:31] birthday ideas?
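As a sketch of how the longest-rivers query mentioned above might look once unit normalization lands (the exact prefix names were still being settled at the time of this chat; this assumes the psn: statement-normalized-value form, with river (Q4022) and length (P2043) as illustrative choices):

```sparql
# Longest rivers, ordered by length normalized to the SI base unit (metres),
# so values given in km, miles, feet etc. need no manual conversion in the query.
SELECT ?river ?length WHERE {
  ?river wdt:P31 wd:Q4022 ;
         p:P2043/psn:P2043 ?valueNode .
  ?valueNode wikibase:quantityAmount ?length .
}
ORDER BY DESC(?length)
LIMIT 10
```

This replaces the "manual" approach from the mass example, which multiplied each value by its unit's P2370 (conversion to SI unit) factor inside the query.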
[16:59:32] :D
[16:59:35] WikidataFacts: you would use wdv instead of wdt - that's it (i hope i didn't get the prefix wrong, this is off the top of my head...)
[16:59:41] :))
[17:00:03] WikidataFacts: yea, no manual conversion needed any more, we'll do that during RDF export
[17:00:17] DanielK_WMDE_: that sounds awesome, thanks
[17:00:29] we use different prefixes for the normalized and the original value
[17:00:30] :)
[17:00:36] I'm giving a talk (and serving cupcakes in the outline of the wikidata logo) at Yale Univ. Libraries for the birthday!
[17:00:46] hweyl: excellent!
[17:00:56] hweyl: add it to the events page if it is open to the public?
[17:01:05] sure!
[17:01:13] In conjunction with the WMF language engineering team, who at Wikidata might specifically be working on ontologies for translation in Wikidata?
[17:01:50] no-one on the development team is
[17:02:01] and i am not sure if anyone is working on it on the data side
[17:02:48] I'm wondering how best to focus related data questions and overall trajectories in these regards.
[17:03:16] nightrose: sigh, it is not public. I have proposals in for 2 others that will be public, will add to events if they are accepted
[17:03:28] cool
[17:03:46] Especially with computational linguists developing Wikidata for translation and in a myriad of other directions, technically and end-user-wise, and for professional translation.
[17:03:54] Scott_WUaS: in a wiki project. i would check if one exists for languages already
[17:04:15] About the birthday, I will collect some short stories about cool things you do on Wikidata, things you're proud of, and so on :) If you have something to tell, please contact me!
[17:04:42] Scott_WUaS: our model is (roughly) based on the LEMON ontology. Any higher-level modelling will be left to the community - as always, Wikimedia will stay out of content decisions.
I expect there will be a lot of on-wiki discussion regarding what aspects of lexical information to model, and how, once we have Lexeme entities.
[17:05:15] So, check back in a year.
[17:05:30] OK ... it's the overall structure of Content Translation and related in terms of Wikidata developments ... thanks DanielK_WMDE_ and Nightrose Lydia_WMDE_
[17:05:52] would an extension like https://www.mediawiki.org/wiki/Extension:SemanticHistory for SMW ever be something Wikidata might do?
[17:06:04] what does it do?
[17:06:13] Scott_WUaS: ContentTranslation, the MediaWiki extension? That's unrelated.
[17:06:36] Thanks for the clarification, DanielK_WMDE_
[17:07:40] Wikidata modeling for professional machine translation, among a myriad of other language and translation development questions, will be a fascinating set of Wikidata developments.
[17:08:05] Happy 4th birthday, Wikidata
[17:11:22] alright folks :) thanks so much for your time
[17:11:22] think about things you want to do for the birthday and tell Auregann
[17:12:06] Thank you, Lydia, Wikidatans and All!
[17:12:29] Don't hesitate to contact me if you have any other questions. Bye :)
[17:12:43] hweyl: take pictures of the cupcakes!