[18:00:16] o/
[18:00:20] #startmeeting Wikidata
[18:00:20] Meeting started Wed Jun 28 18:00:19 2017 UTC and is due to finish in 60 minutes. The chair is Lydia_WMDE. Information about MeetBot at http://wiki.debian.org/MeetBot.
[18:00:20] Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
[18:00:20] The meeting name has been set to 'wikidata'
[18:00:32] Hello :)
[18:00:39] hey everyone! :)
[18:01:15] Welcome to the Wikidata IRC office hour :) We haven't done this for a while, so we have a lot to catch up on ^^
[18:01:39] Let's start with some numbers: we reached Q30000000 https://www.wikidata.org/wiki/Q30000000 and even Q31000000! We also reached 500,000,000 edits!
[18:02:12] Lucas_WMDE joined the team! He's doing amazing work on integrating constraint reports into the item UI, and of course lots of cool queries <3
[18:02:19] \o/
[18:02:27] \o/
[18:02:36] Upcoming: Wikimania (hackathon and conference), where a lot of Wikidata-related events will happen! https://wikimania2017.wikimedia.org/wiki/Programme
[18:03:18] great!
[18:03:35] so let's take a look at the development side of things
[18:03:56] We've done a lot of stuff. First, the things that happened on Wikidata itself:
[18:04:00] Yay!
[18:04:22] \o/
[18:04:29] As Auregann_WMDE said, we worked on improving the constraint reports. Go and test out the gadget if you haven't done so yet. We'd love your feedback.
[18:04:51] They should also be more understandable now
[18:05:05] the gadget is great
[18:05:08] We've improved the input for links to files on Commons. There is now a preview.
[18:05:20] the query service got some love as well
[18:05:26] it has more autocompletion now
[18:05:30] nice!
[18:05:32] a new tree view visualization
[18:06:06] yes!
[18:06:07] federation is possible, meaning you can combine data from Wikidata's query service and other SPARQL endpoints in one query
[18:06:29] and you can now do queries that include calls to the MediaWiki API
[18:06:52] There are also two new datatypes available to link to geoshape files and tabular data files
[18:07:19] https://www.wikidata.org/wiki/Special:AvailableBadges is a new special page showing all the badges you can add to sitelinks
[18:07:43] We've had requests for an N-Triples (.nt) truthy dump and made that available
[18:08:06] And the following new language codes are now available for monolingual text values: brx, chn, cop, gez, quc, kjh, nr, fkv, lag
[18:08:16] researchers in my lab love the truthy dump
[18:08:20] :D
[18:08:22] good!
[18:08:54] We're also working on improving the usage tracking in order to better understand how Wikidata's data is used on the other projects, and to make the change notifications in the watchlist better
[18:09:35] On Wikipedia you will soon have support for showing badges in the sidebar also for links to other projects. So far badges are only shown for links to the same project in another language.
[18:09:45] (Lydia_WMDE: How many language codes are there now in Wikidata then? Are there 358?)
[18:10:14] and we are almost ready to roll out support for showing Wikidata changes also in the enhanced recent changes mode. So far you could only see Wikidata changes on Wikipedia in the other mode.
[18:10:32] Scott_WUaS: i am not sure to be honest. i'll need to investigate
[18:11:08] Thanks!
[18:11:12] and the last thing: Special:EntityData now shows all the data formats we support for it
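(Aside: for readers who want to try the two query-service features mentioned above, here is a minimal sketch, assuming the public WDQS endpoint at https://query.wikidata.org/sparql. It runs a query that calls the MediaWiki API from inside SPARQL via the mwapi service; federation with other endpoints uses the same SERVICE keyword with a remote endpoint URL. The search term and result handling are purely illustrative.)

```python
import requests

# Searches Wikidata via the MediaWiki API from inside SPARQL (the new
# "mwapi" service). Federation works the same way, just with a remote
# endpoint, e.g. SERVICE <https://sparql.example.org/sparql> { ... }.
QUERY = """
SELECT ?item ?itemLabel WHERE {
  SERVICE wikibase:mwapi {
    bd:serviceParam wikibase:api "EntitySearch" ;
                    wikibase:endpoint "www.wikidata.org" ;
                    mwapi:search "cheese" ;
                    mwapi:language "en" .
    ?item wikibase:apiOutputItem mwapi:item .
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
LIMIT 10
"""

r = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "office-hour-example/0.1"},  # name illustrative
)
for row in r.json()["results"]["bindings"]:
    print(row["item"]["value"], row["itemLabel"]["value"])
```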
[18:11:33] that brings us to the second big project: Wiktionary and support for lexicographical data
[18:11:50] we've put quite a bit of work into that over the past quarter
[18:12:03] yay!
[18:12:11] \o/
[18:12:16] Wiktionary now has automated sitelinks for the main namespace, as well as sitelinks via Wikidata for the other namespaces
[18:12:33] that will hopefully take quite some work off the shoulders of the editors
[18:13:01] We've also written down some documentation of the data model we are working towards: https://www.mediawiki.org/wiki/Extension:WikibaseLexeme/Data_Model
[18:13:14] and published some examples to better understand it: https://www.wikidata.org/wiki/Wikidata:Wiktionary/Data_model_examples
[18:14:10] great! Thank you!
[18:14:13] our current goal is to have a prototype by Wikimania that we can get feedback on. It won't be pretty and it won't do persistent storing of data at that point, but it should give everyone the opportunity to give feedback and help us plan the next steps.
[18:14:35] a prototype of the new entity type for Lexeme (next to item and property)
[18:15:32] including forms and senses already?
[18:15:44] dennyvrandecic_: in a non-persistent form, yes
[18:15:59] so you can make edits but they won't survive a reload
[18:16:54] sweet
[18:17:02] Which brings me to the last big block: more support for Wikipedia and co
[18:17:25] there we worked on a prototype for editing Wikidata's data directly from Wikipedia. Feedback is very welcome. More information here: https://www.wikidata.org/wiki/Wikidata:Client_editing_prototype
[18:18:09] Lydia_WMDE, it looks very nice
[18:18:18] this tool seems to be a big improvement. I can't wait ...
[18:18:24] and we made more ArticlePlaceholder pages indexable by search engines to get more traffic and hopefully editors to the smaller Wikipedias. We're still in a trial with Welsh Wikipedia, but if your Wikipedia would like to have ArticlePlaceholder enabled, let me know.
[18:18:42] paucabot: Micru: :)
[18:18:47] (if your project isn't as large as ruwiki)
[18:19:05] very true, sjoerddebruin!
[18:19:24] unfortunately we still have some performance issues which mean we can't roll it out on large wikis at this point
[18:19:31] ruwiki and wikidata is already a recipe for melting servers
[18:19:36] hehe indeed
[18:20:03] any questions about that so far, or should we look at what is coming over the next quarter?
[18:20:23] no questions so far, go ahead
[18:20:41] alright
[18:21:15] so we'll spend some time getting to better understand how people are editing Wikidata and how Wikidata's data is used in the other Wikimedia projects.
[18:21:37] we'll do user testing sessions to figure out the next steps for improving our input widgets
[18:21:56] Lydia, there are still some datatypes missing, like the astronomical coordinates. Is there any plan for that?
[18:22:36] not for the next quarter at least. Beyond that I also don't have a concrete plan at this point, but if someone wants to look into developing those we'd be happy to help.
[18:22:53] there is just too much other stuff on our plate atm and it doesn't have high priority
[18:23:08] ok, thanks
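(Aside: to make the Lexeme entity type discussed above more concrete, here is a rough sketch of one lexeme as a plain Python dict, loosely following the linked data-model page: a lexeme has a lemma, a language, and a lexical category, plus Forms and Senses. Field names and item IDs below are illustrative, not the final storage format.)

```python
# Rough sketch of a single lexeme, loosely following
# https://www.mediawiki.org/wiki/Extension:WikibaseLexeme/Data_Model.
# Field names and item IDs are illustrative, not a final storage format.
lexeme = {
    "lemma": {"en": "hard"},
    "language": "Q1860",          # English
    "lexicalCategory": "Q34698",  # adjective (ID illustrative)
    "forms": [
        {
            "representations": {"en": "harder"},
            "grammaticalFeatures": ["comparative"],  # would be item IDs
        },
    ],
    "senses": [
        {"glosses": {"en": "resistant to pressure"}},
        {"glosses": {"en": "difficult to do"}},
    ],
}
```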
[18:23:39] Another thing we want to work on is graduating the constraints support from being a gadget to the core software
[18:23:53] making it available for everyone and properly integrated
[18:24:16] We'll put work into multi-content revisions as groundwork for structured data support for Wikimedia Commons
[18:24:38] and of course we'll work more on the demo for supporting Wiktionary/lexicographical data for Wikimania
[18:25:15] and the last thing is doing some more experiments with writing a query builder that people can later use to create automated list articles on Wikipedia
[18:25:31] and that's it for the upcoming part. Any questions about that?
[18:25:47] Anything coming up for a power user like me? :P
[18:26:21] Will the query builder support similar features to Listeria?
[18:26:41] sjoerddebruin: constraints improvements, hopefully better input widgets in the long run based on the user research we are doing this quarter, more usage of our work in the other projects :)
[18:27:12] any estimates on when the lexo stuff may launch?
[18:27:18] Micru: that is the end goal, but the query builder is just the first step towards it. It could be used to create the queries necessary to have Listeria-like functionality.
[18:27:46] ok, thanks for the answer
[18:27:50] dennyvrandecic_: depends on the feedback we get and how much work it is to address it
[18:28:14] assume little work to address it :) how does the timeline look in this case?
[18:28:38] My best guess is end of the year, but that is not a very confident guess
[18:29:16] that is six months! :O that's longer than it took wikidata to launch :(
[18:29:48] Developers, developers, developers.
[18:30:00] dennyvrandecic_: it is not the same to build from scratch as to integrate into old code
[18:30:11] ^ What Micru says ;-)
[18:30:17] I know, Wikidata was integrated into MediaWiki :)
[18:30:34] or rather, built on top of it :)
[18:31:10] yeah, considering ContentHandler, i'd say integrated :)
[18:31:48] hehe right. That was fun...
[18:32:04] Lydia_WMDE: from my experience, it would be nice to increase the visibility of the "deprecate a statement" action versus the "delete statement" action, because quite a few new users are deleting statements when they should deprecate them
[18:32:25] Lydia_WMDE: maybe in the new interface for wikipedia?
[18:32:29] reiga: thanks! i'll note that down
[18:33:06] yeah, the difference between deleting and deprecating/lowering the rank is one of the biggest issues there
[18:33:07] The same applies to sports club transfers, for example. A lot of people just change values instead of adding new ones and updating the old ones.
[18:33:17] *nod*
[18:33:35] (i've requested an abuse filter for that, but AbuseFilter still isn't the best thing for Wikidata)
[18:33:52] great :)
[18:34:16] Maybe something we can teach ORES? :P
[18:34:31] ah yeah. will talk to amir about it
[18:34:53] alright. more questions, or should we let Auregann_WMDE go on with her part?
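(Aside on the deprecate-versus-delete point raised above: in the Wikibase JSON format a statement's validity is carried by its rank field, so deprecating is a one-field change that preserves the statement and its history, while deleting loses it entirely. A sketch follows; the item and value IDs are illustrative.)

```python
# Sketch of a statement in the Wikibase JSON format. For the sports-club
# transfer example above, the better pattern is to keep the old statement
# (with an end-time qualifier, P582) rather than overwrite it, and a wrong
# statement gets "rank": "deprecated" instead of being removed.
old_club_statement = {
    "type": "statement",
    "mainsnak": {
        "snaktype": "value",
        "property": "P54",  # member of sports team
        "datavalue": {
            "type": "wikibase-entityid",
            "value": {"entity-type": "item", "id": "Q9616"},  # ID illustrative
        },
    },
    "rank": "normal",  # change to "deprecated" instead of deleting the claim
}
```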
[18:35:55] Ok then, let's talk about content now :)
[18:36:08] We had a few nice data donations: Songkick have donated 155k of their artist identifiers https://blog.songkick.com/combining-forces-with-the-wikipedia-universe-38b562ced1e8
[18:36:19] The famous GIF website Giphy also donated information; some help is needed to match the catalogue https://tools.wmflabs.org/mix-n-match/#/catalog/487
[18:36:39] Have you heard about #100wikidatadays, a challenge to improve 1 item per day for 100 days? https://www.wikidata.org/wiki/Wikidata:100wikidatadays
[18:37:07] Oh, and by the way, the Eurowings apps use "the magic power" of Wikidata https://twitter.com/pigsonthewing/status/867270004017909760
[18:37:40] Statements are still growing strong.
[18:37:59] Let's talk about some cool tools now! You can check out Monumental, displaying heritage data https://tools.wmflabs.org/monumental/#/
[18:38:07] But also Comprende!, a quiz interface on top of MediaWiki/Wikibase by Magnus Manske http://magnusmanske.de/wordpress/?p=446
[18:38:15] Have a look at Causegraph, a tool to visualize and analyze cause/influence relationships using Wikidata http://causegraph.org/
[18:38:25] Wikidata Diff is a very useful tool to compare the basic properties (no qualifiers/ranks yet) of two Wikidata items http://tools.dicare.org/wikidata-diff/
[18:38:33] My personal favorite is Wikidata Guessr, a game where you have to find the location of a picture. And you can even personalize it with your own categories! http://guessr.morr.cc/
[18:38:41] Last but not least, VizQuery is an experiment to query Wikidata with a visual interface that should inspire us for the query service https://tools.wmflabs.org/hay/vizquery/
[18:39:50] and now... some updates about WikidataCon :D
[18:39:57] WikidataCon (October 28th-29th, Berlin) is the conference dedicated to the Wikidata community, organized for and with the community. It's going to be amazing :)
[18:40:22] The scholarship application process runs until July 16th https://www.wikidata.org/wiki/Wikidata:WikidataCon_2017/Scholarships
[18:40:29] The call for submissions for the program runs until July 31st https://www.wikidata.org/wiki/Wikidata:WikidataCon_2017/Program/Submit
[18:40:40] Registration: don't wait! 94 of the 100 places are already taken (50 extra tickets will be released in September). Also, if you registered but finally can't come, please release your ticket on Eventbrite ^^ https://www.eventbrite.de/e/wikidatacon-2017-registration-35426998105
[18:40:55] You can also help us in one of the volunteer teams: https://www.wikidata.org/wiki/Wikidata:WikidataCon_2017/Volunteer
[18:41:39] So many registrations, positive feedback about WikidataCon, people ready to help... thanks all for that <3
[18:41:56] A lot of events happened during the last quarter, organized all around the world by the community :) Wikidata meetups in Berlin, conferences, workshops, the Datensummit in Berlin where the open data community discovered Wikidata and SPARQL, WikiCite, the Wahlsalon and Wahldaten hackathon about election data, a Wikidata+GLAM workshop, a keynote at foss-north...
[18:42:03] Most of the Wikidata team attended the Wikimedia Hackathon, worked on lots of stuff and discussed diverse issues with people. We also ran the documentation sprint, helping attendees to improve documentation about Wikidata and Wikibase. The documentation sprint will happen again at Wikimania!
[18:42:26] Wikidata has 4 new admins: welcome and thanks to Zolo, MisterSynergy, Queryzo and ChristianKl, and one new bureaucrat: Lymantria
[18:42:34] John Cummings has been applying for a grant to continue his work at UNESCO https://meta.wikimedia.org/wiki/Grants:Project/John_Cummings/Wikimedian_in_Residence_at_UNESCO_2017-2018
[18:42:41] Auregann_WMDE, I want to make you aware that I have complained about the stinginess of the WMF in not giving the Wikidata conference more funds >>> https://lists.wikimedia.org/pipermail/wikimedia-l/2017-June/087903.html
[18:42:43] Andy Mabbett is Wikimedian in Residence in the History of Modern Biomedicine Research Group http://www.histmodbiomed.org/blog/introducing-wikimedian-residence-mr-andy-mabbett
[18:43:01] mySociety is requesting a grant to fund their project EveryPolitician, dedicated to open data in politics https://meta.wikimedia.org/wiki/Grants:Project/mySociety/EveryPolitician
[18:43:21] Micru: I saw this
[18:43:29] I think it is important to raise this topic in the wider community, so that people are aware that the WMF is not contributing enough money to fulfill our mission
[18:44:23] To finish this office hour, before going through your questions, here are a few interesting links to browse
[18:44:31] Repurpos.us: A fully open and expandable drug repurposing portal, by Sebastian Burgstaller-Muehlbacher, PhD https://docs.google.com/presentation/d/1QWMBZNESPgQ_wio1kcnzv8rF74WezFl8L5v0tjwOQfc
[18:44:38] The Wikidata data model and your SPARQL queries, by Bob DuCharme http://www.snee.com/bobdc.blog/2017/04/the-wikidata-data-model-and-yo.html
[18:44:43] Documentation on using OpenRefine to match a data set you have to items in Wikidata https://github.com/OpenRefine/OpenRefine/wiki/Reconciliation
[18:44:49] Schema.org proposes to encourage the use of Wikidata as a common entity base for the target of the sameAs relation https://www.wikidata.org/wiki/Wikidata:Schema.org
[18:44:55] Six Degrees on Wikidata, by Andrew Gray http://6dfb.tumblr.com/post/161020960651/introducing-six-degrees-on-wikidata
[18:45:00] Some statistics on scholarly data in Wikidata, by Finn Årup Nielsen https://finnaarupnielsen.wordpress.com/2017/05/25/some-statistics-on-scholarly-data-in-wikidata/
[18:45:07] Paper: Becoming Wikidatians: evolution of participation in a collaborative structured knowledge base http://scholarspace.manoa.hawaii.edu/handle/10125/41688
[18:45:15] mySociety published a five-part series examining how to use Wikidata to answer the question: "What is the gender breakdown of heads of government across the world?":
[18:45:22] Help us find the offices of heads of governments across the world! https://medium.com/mysociety-for-coders/help-us-find-the-offices-of-heads-of-governments-across-the-world-4558124bcd24
[18:45:27] Help us to show who fills the role of head of government on Wikidata https://medium.com/mysociety-for-coders/now-help-us-to-show-who-fills-the-role-of-head-of-government-db6ebc3c6872
[18:45:32] Linking back in the other direction https://medium.com/mysociety-for-coders/task-3-linking-back-in-the-other-direction-105fd98facac
[18:45:38] Check that the data is consistent https://medium.com/mysociety-for-coders/task-4-check-that-the-data-is-consistent-238b3d00dc19
[18:45:45] Conclusion: So, what is the gender breakdown of heads of government? https://medium.com/mysociety-for-coders/conclusion-so-what-is-the-gender-breakdown-of-heads-of-government-654a0dde1fb
[18:46:04] And finally: Building communities of knowledge with Wikidata https://i9606.blogspot.de/2017/06/building-communities-of-knowledge-with.html
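(Aside: as a taste of what the mySociety series above is doing, here is a sketch of the kind of query involved. P6 = head of government, P21 = sex or gender, Q6256 = country; the real series digs into the modelling problems, such as current versus historical office holders, that this naive version glosses over.)

```python
import requests

# Count heads of government of countries by gender, the naive way.
QUERY = """
SELECT ?genderLabel (COUNT(DISTINCT ?head) AS ?count) WHERE {
  ?country wdt:P31 wd:Q6256 ;   # instance of: country
           wdt:P6 ?head .       # head of government
  ?head wdt:P21 ?gender .       # sex or gender
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
GROUP BY ?genderLabel
ORDER BY DESC(?count)
"""

r = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "office-hour-example/0.1"},  # name illustrative
)
for row in r.json()["results"]["bindings"]:
    print(row["genderLabel"]["value"], row["count"]["value"])
```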
[18:47:05] Lydia_WMDE: people are so silent... I think they're all playing Wikidata Guessr :D
[18:47:17] lol
[18:47:20] it's awesome!
[18:47:23] and pretty hard
[18:47:37] We should do a tournament at WikidataCon
[18:47:46] oh yeah
[18:48:26] my problem is, bugtracker-to-IRC bridge bots have conditioned me to ignore blocks of chat messages with links at the end :D
[18:48:29] I'll stop talking now, and we still have time for all of your questions :)
[18:49:55] Auregann_WMDE: I have a question: should we talk about issues like the one about the grant money in public, or do you prefer to have a one-to-one conversation?
[18:50:08] no question from me, just a cheer: keep doing such a good job \o/
[18:50:14] Micru: i'd prefer to discuss this one-on-one
[18:50:19] reiga: <3
[18:50:30] Ok, let's arrange a meeting for next week ;)
[18:51:23] and yes, I also agree that the work done so far is amazing, and that you guys should keep delivering awesome stuff like the announcements today. Really cool!
[18:51:37] :)
[18:51:39] We try! :D
[18:51:39] Micru: we already have a meeting with the scholarship committee on Sunday
[18:51:54] In the meantime, I really think that exposing this in public doesn't serve our cause
[18:51:55] The WMF can't underestimate Wikidata anymore.
[18:52:06] Oh, for those planning long-term: WikidataCon is also Wikidata's 5th birthday. You might want to think about presents ;-)
[18:52:24] 500M edits isn't a present? :P
[18:52:29] haha
[18:52:30] maybe
[18:52:32] Auregann_WMDE: true
[18:52:54] sjoerddebruin: I hope they stop doing that
[18:53:05] sjoerddebruin: that was too early, now we have to reach 1B in time ;)
[18:53:09] Micru: I suggest we wait for the answer of the grant committee now and see what happens
[18:53:27] Auregann_WMDE: Ok
[18:53:29] Well, I think that stopped when the WMF agreed to long-term fund Wikidata development at WMDE :)
[18:53:30] I actually have a goal for the birthday. ;)
[18:53:32] <3 for that
[18:53:44] Development, yeah. This is more community.
[18:53:53] sjoerddebruin: always included for me!
[18:54:00] Remember that we're not aiming to organize an event the size of Wikimania or something. We decided to have only 150 attendees for the first edition.
[18:54:13] Maybe, in the future, the event will happen again and will grow
[18:54:31] Still, events are expensive.
[18:55:25] Yeah, and we have quite some money for it now. Let's see how the scholarship grant goes and then decide on next actions. We have options to make it all work out.
[18:56:30] Any other topic for the last minutes?
[18:56:45] Wishes? Grumbles? Cool stuff? :D
[18:57:10] I wish for you guys to have an awesome week! :D
[18:57:53] Same to you!
[18:58:26] Wikidata is aswome
[18:58:27] Thanks to all of you for your support :)
[18:58:27] sorry, awsome
[18:58:41] awwsome? :P
[18:58:47] Yes! :P
[18:58:47] thx!
[18:58:49] wawsome
[18:58:54] yeah!
[18:58:57] miawsome?
[18:59:16] #Cats
[18:59:54] Now help me with the search for who added 2 million qualifiers last week...
[19:00:08] *lol* interesting...
[19:00:18] Platypus now speaks a bit of Spanish: https://askplatyp.us/?lang=es
[19:00:19] Have a nice day/night/evening and see you soon :)
[19:00:19] sjoerddebruin: what? :D
[19:00:30] Tpt[m]: yay!
[19:00:42] https://grafana.wikimedia.org/dashboard/db/wikidata-datamodel-statements?refresh=30m&panelId=11&fullscreen&orgId=1&from=now-6M&to=now (posted that in #wikidata as well)
[19:00:56] Wait, 1 million*
[19:01:07] Time for coffee. :P
[19:01:16] hehe
[19:01:25] thanks for coming everyone! <3
[19:01:33] #stopmeeting
[19:02:20] have a nice evening everyone
[19:02:26] * Lydia_WMDE looks @ wm-labs-meetbot`
[19:03:16] #endmeeting
[19:03:16] Meeting ended Wed Jun 28 19:03:16 2017 UTC. Information about MeetBot at http://wiki.debian.org/MeetBot . (v 0.1.4)
[19:03:16] Minutes: https://tools.wmflabs.org/meetbot/wikimedia-office/2017/wikimedia-office.2017-06-28-18.00.html
[19:03:16] Minutes (text): https://tools.wmflabs.org/meetbot/wikimedia-office/2017/wikimedia-office.2017-06-28-18.00.txt
[19:03:16] Minutes (wiki): https://tools.wmflabs.org/meetbot/wikimedia-office/2017/wikimedia-office.2017-06-28-18.00.wiki
[19:03:17] Log: https://tools.wmflabs.org/meetbot/wikimedia-office/2017/wikimedia-office.2017-06-28-18.00.log.html
[19:03:36] thanks sjoerddebruin xD
[19:03:54] Always there. <3
[19:05:09] aha!
[19:05:10] thx
[20:14:58] Jamesofur: here?
[21:00:29] #startmeeting RFC meeting
[21:00:30] Meeting started Wed Jun 28 21:00:30 2017 UTC and is due to finish in 60 minutes. The chair is TimStarling. Information about MeetBot at http://wiki.debian.org/MeetBot.
[21:00:30] Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
[21:00:30] The meeting name has been set to 'rfc_meeting'
[21:00:39] #link https://phabricator.wikimedia.org/T53736
[21:03:19] just waiting for Krinkle to join, he's driving this RFC at the moment
[21:05:27] Krinkle: I did the meetbot thing already
[21:05:34] Thanks
[21:05:47] I've just updated the task description again at https://phabricator.wikimedia.org/T53736
[21:06:06] Wrote up a more explicit and brief version of the proposal I want to put forth for feedback.
[21:06:29] It basically incorporates all prior feedback from the past few weeks.
[21:06:36] hmm, what's the difference between Solution 1 and Solution 2 - their descriptions seem to be the same?
[21:06:50] I'm still open to entirely new ideas of course, it's still pretty early days
[21:07:10] SMalyshev: Good point. They are indeed quite similar. The difference is in the links produced by the parser output.
[21:07:10] let me edit that description
[21:07:21] I scratched my head about that too for a minute
[21:07:39] SMalyshev: Solution 2 proposes to pre-resolve these server-side to point to /wiki/Target?rdfrom=Redirect directly, instead of letting an HTTP redirect happen when the user clicks the link.
[21:07:45] It's merely an optimisation.
[21:08:01] reload, see if that is better
[21:08:07] I don't like spot-the-difference either
[21:08:08] Thanks
[21:08:22] ah, ok, now it's much clearer
[21:08:28] thanks
[21:09:38] what happens in solution 2 if the redirect target changes - would it go back and fix the links?
[21:10:02] SMalyshev: Yes, regular update propagation, similar to what template and image backlinks do already.
[21:10:16] we would need to purge the parser cache and varnish cache of all pages that link to the redirect
[21:10:26] In the current implementation of the jobqueue refreshLinks job, that means all incoming pages are purged instantly, and updated in the background or on first visit (whichever comes first)
[21:11:18] Although there is a separate proposal to improve that: https://gerrit.wikimedia.org/r/#/c/295027/
[21:11:19] so in solution 2, the page would have incoming links for both the redirect page and the ultimate target, right?
[21:11:49] yeah, there are also incoming external links which can't be updated
[21:12:11] SMalyshev: No, our linking table tracks what page title you link to in the source; that doesn't change. But Special:WhatLinksHere does include links to redirects, and that feature remains unchanged.
[21:12:27] it will include both effectively, as it does now.
[21:12:45] but instead of linking to /wiki/Redirect it links to /wiki/Target?rdfrom=Redirect
[21:14:27] I think solution 1 is fine, I just wonder whether it is better for latency than the current solution
[21:14:27] would that also mean cache fragmentation due to ?rdfrom in URLs?
[21:14:36] that's the point, right?
[21:15:02] ah, because of the "redirected from" part?
[21:15:11] SMalyshev: We already fragment the cache for "Target" across "Target" and "Redirect", given that "/wiki/Redirect" currently renders HTTP 200 as Target + message
[21:15:13] I was replying to myself
[21:15:31] gwicke apparently proposed stripping the ?rdfrom in Varnish to avoid cache fragmentation
[21:15:49] Which requires moving the message to JavaScript.
[21:16:08] I don't like that
[21:16:15] yeah, that's what I was going to ask: who'll show the "redirected from" message?
[21:16:39] I don't think people realise how awesome redirects are for keyword stuffing
[21:16:58] I agree with moving it to JS being a problem. Not because of the audience (no-JS users not getting it is "fine" I suppose), but because of the performance/flash of adding content above the fold at run-time
[21:17:22] I've often done Google queries where the "redirected from" message has been the only instance of my search keyword on the page
[21:17:33] Interesting.
[21:18:18] I mean, we could do better
[21:18:33] we could have a collapsed box or something which lists all incoming redirects, that would give us lots of Google juice
[21:19:06] yeah, or at the bottom of the page.
[21:19:20] yup
[21:19:52] +1 for not making the page jump to inject the message
[21:20:24] Page jump can be avoided by reserving the space ahead of time
[21:20:31] there are a bunch of them though... I mean, Obama's page has 136, Trump's 64. Though I guess these articles are already big, maybe another 100 items won't hurt that much
[21:20:36] the .client-js selector is jump-free
[21:20:49] Anyway, let's not drift off-topic :)
[21:20:52] Krinkle: sure, but in the regular view, that would stay blank. not so nice
[21:21:10] DanielK_WMDE_: Oh, right, because we're not just moving text, we're folding the cache together.
[21:21:11] nevermind :)
[21:21:22] :)
[21:21:27] yeah, to me having an actual redirect seems more robust, unless we identify a performance issue with it
[21:21:34] any reply to my question about solution 1 versus solution 0 end-user latency?
[21:22:03] We could change rdfrom=.... to hasrd=1 in Varnish and still reduce cache fragmentation to +1 instead of +N
[21:22:34] TimStarling: The current experience depends on HTML + 1 JS + 2 JS requests + DOM ready + JS parsing/execution
[21:22:57] Which must be longer than 1 round-trip, given it depends on 3 round-trips that happen serially
[21:23:02] one unlocks the next etc.
[21:23:07] fair enough
[21:23:33] you know, when I introduced this feature I think I just put one line of javascript in an inline
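(Aside: a hypothetical sketch of the two mechanisms discussed above: the pre-resolved links of Solution 2, and the Varnish idea of normalizing rdfrom to a single hasrd=1 cache variant. The helper names and data here are invented for illustration; this is not MediaWiki's or Varnish's actual code.)

```python
from urllib.parse import quote

# Solution 2, sketched: at parse time, a link to a redirect page is
# rewritten to point at the target directly, with ?rdfrom= preserving
# the "redirected from" information.
redirects = {"Obama": "Barack Obama"}  # redirect title -> target title

def link_href(title: str) -> str:
    if title in redirects:
        target = redirects[title].replace(" ", "_")
        return f"/wiki/{quote(target)}?rdfrom={quote(title)}"
    return f"/wiki/{quote(title.replace(' ', '_'))}"

def cache_key(path: str, query: dict) -> str:
    # The Varnish idea from above: collapse every rdfrom=<title> variant
    # into a single hasrd=1 variant, capping fragmentation at +1 per page.
    q = dict(query)
    if "rdfrom" in q:
        del q["rdfrom"]
        q["hasrd"] = "1"
    qs = "&".join(f"{k}={v}" for k, v in sorted(q.items()))
    return path + ("?" + qs if qs else "")

print(link_href("Obama"))                                    # /wiki/Barack_Obama?rdfrom=Obama
print(cache_key("/wiki/Barack_Obama", {"rdfrom": "Obama"}))  # /wiki/Barack_Obama?hasrd=1
```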