[09:23:04] PROBLEM - puppet last run on wdqs2001 is CRITICAL: CRITICAL: Catalog fetch fail. Either compilation failed or puppetmaster has issues
[09:29:24] PROBLEM - puppet last run on wdqs2002 is CRITICAL: CRITICAL: Catalog fetch fail. Either compilation failed or puppetmaster has issues
[10:03:47] Lucas_WMDE: are you the right person to ask/bug about constraints stuff?
[10:04:35] RECOVERY - puppet last run on wdqs2001 is OK: OK: Puppet is currently enabled, last run 7 seconds ago with 0 failures
[10:04:48] the gadget is giving me a weird error I don't even understand for subclasses, like on https://www.wikidata.org/wiki/Q20162172
[10:07:52] nikki: looks like this fix hasn't been propagated yet https://www.wikidata.org/w/index.php?diff=488330267&oldid=486877681&title=Property_talk:P279
[10:08:45] Yeah, I wish the constraint table would be updated one last time before switching over to the statement-based constraints.
[10:13:15] matej_suchanek: ahh. thanks
[10:14:11] it would make sense to update it, yeah.
it's a bit hard to tell if it's working properly if fixes don't show up
[10:15:32] nikki: well, yes, but I was away just now, sorry :D
[10:15:53] ugh, another broken constraint template
[10:15:55] RECOVERY - puppet last run on wdqs2003 is OK: OK: Puppet is currently enabled, last run 18 seconds ago with 0 failures
[10:16:04] I already found https://phabricator.wikimedia.org/T169184 earlier today
[10:16:46] it wasn't an urgent question anyway :)
[10:17:37] bah, four matches for “QQ” in constraints.csv
[10:17:41] this is why we need constraint statements :(
[10:17:55] I'm sure people will find a way to break things using statements too :P
[10:20:54] PROBLEM - High lag on wdqs2003 is CRITICAL: CRITICAL: 100.00% of data above the critical threshold [1800.0]
[10:22:54] PROBLEM - High lag on wdqs2003 is CRITICAL: CRITICAL: 100.00% of data above the critical threshold [1800.0]
[10:26:40] nikki, matej_suchanek, sjoerddebruin: I’ve updated https://phabricator.wikimedia.org/T169184 to list all these problems as well, and okay, I guess we’ll have to do another import…
[10:27:16] I can claim it and fix them all if the list is complete
[10:29:02] hello, using SPARQL: how do I get a list of all the properties an item has?
[10:35:31] matej_suchanek: that would be awesome
[10:35:45] What to do with https://www.wikidata.org/wiki/Wikidata:Database_reports/Constraint_violations/P1766#Type_Q2221906? :/
[10:35:53] I think the list is complete
[10:37:48] matej_suchanek: if it helps, I can send you the wikidataconstraints.csv file I got from aude (should be the same one that’s also used on wikidata.org)
[10:39:53] Lucas_WMDE: what does the file contain?
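The SPARQL question at 10:29:02 went unanswered in the log. A minimal sketch of how it could be done against the Wikidata Query Service, listing the distinct properties used as direct claims on an item (Q42 is used here as an arbitrary example item, not one from the log); the code only builds the request URL rather than performing the network call:

```python
from urllib.parse import urlencode

WDQS_ENDPOINT = "https://query.wikidata.org/sparql"

def properties_of_item_url(item_id):
    """Build a WDQS request URL whose query lists the distinct
    properties used in direct claims on the given item."""
    query = (
        "SELECT DISTINCT ?prop WHERE {\n"
        f"  wd:{item_id} ?p ?value .\n"
        "  ?prop wikibase:directClaim ?p .\n"  # map the wdt: predicate back to the property entity
        "}"
    )
    return WDQS_ENDPOINT + "?" + urlencode({"query": query, "format": "json"})

url = properties_of_item_url("Q42")
```

Fetching `url` with any HTTP client returns the property list as SPARQL JSON results.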
[10:40:09] the constraint definitions, imported from templates
[10:40:17] which are then imported into the database and used by the extension
[10:40:42] but I see you’re already done :) so I’ll try reimporting the constraints for those properties locally and see if they’re all fixed
[10:52:54] RECOVERY - High lag on wdqs2003 is OK: OK: Less than 30.00% above the threshold [600.0]
[15:41:31] PROBLEM - puppet last run on wdqs1001 is CRITICAL: CRITICAL: Catalog fetch fail. Either compilation failed or puppetmaster has issues
[15:46:31] PROBLEM - puppet last run on wdqs1002 is CRITICAL: CRITICAL: Catalog fetch fail. Either compilation failed or puppetmaster has issues
[15:56:12] PROBLEM - puppet last run on wdqs1003 is CRITICAL: CRITICAL: Catalog fetch fail. Either compilation failed or puppetmaster has issues
[16:00:45] No edit links?
[16:05:10] Back.
[16:06:04] sjoerddebruin: weird, I just noticed that edit links are missing… on my local development instance o_O
[16:06:12] They are gone again.
[16:06:33] hm
[16:06:51] https://wikitech.wikimedia.org/wiki/Server_Admin_Log
[16:08:31] RECOVERY - puppet last run on wdqs1001 is OK: OK: Puppet is currently enabled, last run 5 seconds ago with 0 failures
[16:08:35] git bisect points at https://gerrit.wikimedia.org/r/#/c/361498/, which sounds plausible at first glance
[16:09:09] Aleksey_WMDE: ^ is it possible that your change broke edit links? some race condition in initialization perhaps
[16:09:39] wait, no, that change was only merged noon today, deployment was before that
[16:10:31] RECOVERY - puppet last run on wdqs1002 is OK: OK: Puppet is currently enabled, last run 14 seconds ago with 0 failures
[16:17:19] sjoerddebruin: are they completely gone or do they just take longer to appear?
[16:17:36] Lucas_WMDE: I'm getting "Invalid file type" on various JS pages
[16:17:54] in the console?
[16:18:01] Yes.
https://www.dropbox.com/s/2uq50lios147bri/Schermafdruk%202017-06-29%2018.16.46.png?dl=0
[16:18:48] wtf
[16:19:11] I think it's your gadget.
[16:19:33] I disagree :D
[16:19:40] but seriously wtf are those URLs
[16:19:48] resourceloader messing up?
[16:20:06] Well, this was done earlier today... https://www.wikidata.org/w/index.php?title=MediaWiki:Gadgets-definition&curid=21137&diff=509713313&oldid=501614621
[16:20:20] RECOVERY - puppet last run on wdqs1003 is OK: OK: Puppet is currently enabled, last run 14 seconds ago with 0 failures
[16:20:31] I have the gadget enabled, and it’s working for me
[16:20:42] so the new gadget version is definitely deployed
[16:20:55] do you get the same problem in a private window?
[16:21:03] Nope.
[16:21:35] Sorry, it's not your gadget.
[16:22:09] What are you getting on https://www.wikidata.org/w/extensions/PropertySuggester/modules/ext.PropertySuggester.EntitySelector.js?
[16:22:18] Unknown file path
[16:22:37] but I don’t think that path should be generated
[16:22:44] that looks like something you would have on a local wiki
[16:22:56] but ResourceLoader is supposed to bundle that JS file and a thousand more together in a single request
[16:23:22] I don’t see anything suspicious in your common.js…
[16:23:54] (the problem on my local wiki turned out to be unrelated btw)
[16:26:45] I'm using the debug mode btw
[16:27:07] oh, okay
[16:28:28] that makes a bit more sense
[16:28:50] 1. but still, with ?debug=true, the requests I see are all load.php?debug=true&…
[16:29:02] 2. on the other hand – I did get a 404 on ext.PropertySuggester.EntitySelector.js, and no edit buttons
[16:29:22] sorry, nevermind 1., I was stupid
[16:29:32] I get lots of /w/ URLs
[16:29:36] but EntitySelector is the only one with 404
[16:30:09] what about https://www.wikidata.org/w/extensions/WikibaseQualityConstraints/modules/gadget.css?
[16:30:38] Unknown file path
[16:30:45] I tried this from Chromium just now, where I’m not logged in
[16:30:51] let’s see what happens in Firefox
[16:34:16] Chrome is only giving me errors for the constraints gadget.
[16:34:26] no 404 in firefox, but also no edit links
[16:42:49] Buttons do appear in Chrome when I disable your gadget.
[16:43:00] uh oh
[16:45:10] Wait. https://meta.wikimedia.org/w/index.php?title=User%3ASjoerddebruin%2Fglobal.js&type=revision&diff=16945110&oldid=16945094
[16:46:56] by the way, I’m still not sure what the problem is – does it only happen in debug mode or always?
[16:47:10] because I only got it in debug mode
[16:47:13] Debug mode seems to never load.
[16:47:20] Commenting out that line seems to work.
[16:47:33] https://tools-static.wmflabs.org/ doesn’t load at all for me right now
[16:48:16] They are restarting stuff, it seems. https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/SAL
[16:48:45] It's quite annoying, I've had the same thing in the past with the labeling script
[16:48:58] They say to add this to your common.js, but your javascript doesn't load if labs is down.
[16:51:20] Since you are not in #wikimedia-operations Lucas_WMDE: "Warning: OutputPage::transformFilePath: Failed to hash /srv/mediawiki/php-1.30.0-wmf.7/extensions/WikibaseQualityConstraints/modules/gadget.js [Called from OutputPage::transformFilePath in /srv/mediawiki/php-1.30.0-wmf.7/includes/OutputPage.php at line 37"
[16:56:47] sjoerddebruin: unfortunately, I have to go home pretty soon… I’ll rejoin as WikidataFacts once I’m there, but I won’t have my work laptop with me
[16:57:04] ok
[18:48:28] anybody know about the discrepancy between results from an api wbsearchentities call and the same search on wikidata using the searchbox?
[20:03:43] juniordev: which search box? the one on the top right? that should be the same... though it may be using some special set of options for wbsearchentities.
[20:04:09] just figured it out, it isnt the same
[20:04:15] we plan to make the search box behave differently, though (to support searches for page titles in other namespaces)
[20:04:58] you can get the searchbox results using action=query list=search
[20:04:59] juniordev: the box should be using wbsearchentities. unless we are not talking about the same box.
[20:05:52] on the homepage of wikidata.org, top right there is a searchbox that says "Search Wikidata"
[20:06:27] yep
[20:06:31] and it provides results from the aforementioned api call, which can be different from wbsearchentities, because that api call only considers labels and aliases, I believe
[20:06:43] when i type there, this is the request i get: https://www.wikidata.org/w/api.php?action=wbsearchentities&search=xx&format=json&language=de&uselang=de&type=item
[20:06:45] got it all figured out tho
[20:07:02] that's action=wbsearchentities for me...
[20:07:08] weird
[20:07:09] if it isn't for you, something is wrong
[20:07:30] well im working off a very specific use case example
[20:07:33] on wikidata, we hack the skin to replace the normal search box (action=search) with an entity selector
[20:07:42] otherwise you can't search for labels properly
[20:07:47] what skin are you using?
[20:08:02] idk what a skin is
[20:08:23] heres my example: use wbsearchentities with the term "silento"
[20:08:24] then you are probably using the default (Vector)
[20:08:31] and it returns zero results
[20:08:38] try https://www.wikidata.org/wiki/Wikidata:Main_Page?useskin=modern :D
[20:08:49] but if you type the same thing into the searchbox you get what I am looking for
[20:09:08] oh yeah thats totally different
[20:09:13] guess that explains it
[20:09:21] while Ive got you here tho, can I ask a question?
[20:09:36] DanielK_WMDE: iiieech
[20:09:39] well no. "strange" skins may prevent our hack from working.
that would be an explanation
[20:09:53] but with everything set to default, you should be using wbsearchentities
[20:10:00] WikidataFacts: hehe ;)
[20:10:31] juniordev: when i type "silento" into the search box, i get no hits either.
[20:10:52] juniordev: ah - but your language setting matters. it decides what labels are searched.
[20:10:57] on the standard skin?
[20:11:10] yes, on the standard skin
[20:11:11] no hits
[20:11:20] use uselang=en to force the language to english.
[20:11:30] you can use uselang in the browser and with the api
[20:11:33] so heres the weird thing
[20:11:36] should work consistently.
[20:11:44] are you logged in?
[20:11:46] it says no results in the dropdown from the searchbox
[20:11:53] but if you still hit enter it gets results
[20:11:54] * DanielK_WMDE shouldn't be online any more and has to run away soon
[20:11:58] heres the url from the response: https://www.wikidata.org/w/index.php?search=&search=silento&title=Special:Search&go=Go&searchToken=bvo1fig0bph8uswufrh2z03rn
[20:12:13] * DanielK_WMDE thinks that juniordev can then bug WikidataFacts ;)
[20:12:27] lolwat is * an admin or something?
[20:12:34] what, that skin eyesore wasn’t punishment enough? :D
[20:12:50] juniordev: I’m an intern at WMDE, but this is a private account
[20:12:53] juniordev: no :) just type "/me is a new user"
[20:12:54] but I can still answer questions :)
[20:13:05] oh for sure im an intern too lol
[20:13:29] * juniordev is a new user
[20:13:31] * WikidataFacts checks juniordev’s info for “real name”
[20:13:46] not like an intern at wmde haha
[20:13:47] nice to meet you, VEVO-LLC.bar2.SanFrancisco1.Level3.net/*.*.*.* :P
[20:14:05] wow I only feel slightly violated
[20:14:14] nice to meet you too lol
[20:14:34] juniordev: oh, i get it. if you hit enter before the api request returns, you get forwarded to Special:Search. Which uses Elastic. wbsearchentities will use elastic too, soon, but that's another topic.
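To make the distinction discussed above concrete: the suggestion dropdown uses action=wbsearchentities (prefix match on labels/aliases in the uselang language), while hitting enter lands on Special:Search, whose Elastic-backed results can also be fetched via action=query with list=search. A sketch that only builds the two request URLs (the search term and parameter values are illustrative):

```python
from urllib.parse import urlencode

API = "https://www.wikidata.org/w/api.php"

def entity_search_url(term, language="en"):
    # Prefix match on entity labels/aliases, as used by the dropdown.
    return API + "?" + urlencode({
        "action": "wbsearchentities", "search": term,
        "language": language, "uselang": language,
        "type": "item", "format": "json",
    })

def fulltext_search_url(term):
    # Full-text search, matching what Special:Search returns.
    return API + "?" + urlencode({
        "action": "query", "list": "search",
        "srsearch": term, "format": "json",
    })
```

Comparing the two URLs for the same term reproduces the discrepancy juniordev saw: a term that is not a label/alias prefix gets zero entity-search results but may still get full-text hits.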
[20:14:51] for sure, that makes sense
[20:14:56] juniordev: try not hitting enter. wait for suggestions to show. you will see none. because this is not a prefix of a label or alias in your language.
[20:15:00] anyway.
[20:15:24] this is where the senior dev leaves the office, and lets the interns do their thing :)
[20:15:26] WikidataFacts: can I ask you a question about wdtk?
[20:15:42] DanielK_WMDE: oh, it’s your turn driving through the rain now?
[20:15:43] DanielK_WMDE: thanks for the help
[20:15:49] WikidataFacts: yes, indeed
[20:15:50] juniordev: you can ask, but I’m not familiar with it
[20:16:11] wdtk isn't done by Wikimedia. It's an unofficial tool, and (slightly) incompatible with WDQS
[20:16:26] oh huh, thats interesting
[20:16:58] my question is: does the wdtk-wikibaseapi package ApiConnection and all that automatically rate limit calls/queries?
[20:17:26] I got a 503 http response earlier and I was worried I might be making too many calls and get banned or something
[20:17:34] but then it didnt happen again
[20:19:35] I know that pywikibot takes care of that (sleeps for n seconds and then retries the action), and I heard that wdtk doesn’t… but that’s all I know, sorry
[20:20:20] oh okay. if im running a lengthy script do you have an estimate of how long I should wait between api calls/how many per second?
[20:20:48] I think the response includes that info, but I usually saw pywikibot sleep 8-10 seconds
[20:21:39] see e. g. https://paws-public.wmflabs.org/paws-public/46618563/Darwins-fox.ipynb
[20:21:45] hmm okay, I will look into it more. if you get a better sense of what protocol I should be following for using the API lemme know
[20:22:55] it would be great if I could just do these api calls on a local data dump, but wdtk doesnt provide tools for that
[20:23:03] :'(
[20:24:27] if a senior dev gets back on at some point, could you pass along my question? thanks
[20:24:39] ok
[21:22:30] Hi. I am not sure if I get wikidata.
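On the rate-limiting question: the log does not settle on a fixed calls-per-second budget, but the pattern pywikibot is described as using (sleep, then retry the action) can be sketched generically. The HTTP call and the sleep are injected as callables here, and the 8-second base delay merely mirrors the 8-10 second sleeps mentioned above; it is an assumption, not a documented server limit:

```python
import time

def call_with_backoff(request_fn, max_retries=5, base_delay=8, sleep_fn=time.sleep):
    """Retry request_fn when it signals a throttling failure (e.g. a 503),
    sleeping between attempts like pywikibot is said to do.

    request_fn: callable performing one API request; raises RuntimeError on
                a throttling response (how the caller maps a 503 is up to them).
    base_delay: assumed starting delay in seconds, based on the observed
                8-10 second pywikibot sleeps, not an official figure.
    """
    for attempt in range(max_retries):
        try:
            return request_fn()
        except RuntimeError:
            if attempt == max_retries - 1:
                raise  # give up after the final attempt
            sleep_fn(base_delay * (attempt + 1))  # back off: 8 s, 16 s, 24 s, ...
```

A well-behaved client would additionally honor a Retry-After header when the server sends one, rather than relying on a guessed delay.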
Is it meant as a db-like storage of table data from e.g. Wikipedia?
[21:25:50] in a rough sense
[21:29:04] I just stumbled across all those hand-written tables, which would be much better if they actually had a similar structure and could be aggregated somehow: https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units
[21:29:38] Is this the kind of data which should be put into Wikidata? Could those tables then be generated out of Wikidata?
[21:34:11] Platonides: ^