[00:25:57] (PS1) Hoo man: Remove hard coded item ids from SetSiteLinkTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102856
[00:26:05] damn... that was tough :D
[00:26:46] aude: ^
[01:06:28] (PS1) Hoo man: Remove hard coded entity ids from SetQualifierTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102863
[01:09:26] (CR) jenkins-bot: [V: -1] Remove hard coded entity ids from SetQualifierTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102863 (owner: Hoo man)
[01:32:35] (PS2) Hoo man: Remove hard coded entity ids from SetQualifierTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102863
[01:35:35] (CR) jenkins-bot: [V: -1] Remove hard coded entity ids from SetQualifierTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102863 (owner: Hoo man)
[01:41:20] (PS3) Hoo man: Remove hard coded entity ids from SetQualifierTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102863
[01:44:11] (CR) jenkins-bot: [V: -1] Remove hard coded entity ids from SetQualifierTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102863 (owner: Hoo man)
[02:13:36] (PS4) Hoo man: Remove hard coded entity ids from SetQualifierTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102863
[02:16:29] (CR) jenkins-bot: [V: -1] Remove hard coded entity ids from SetQualifierTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102863 (owner: Hoo man)
[02:41:19] (PS5) Hoo man: Remove hard coded entity ids from SetQualifierTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102863
[02:44:03] (CR) jenkins-bot: [V: -1] Remove hard coded entity ids from SetQualifierTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102863 (owner: Hoo man)
[03:09:50] Japan gov has launched a "data.gov" today. http://www.data.go.jp/?lang=english
[03:10:23] Is there a page to list this kind of external repositories?
[05:59:13] does wikidata use libapache2-mod-passenger / Phusion Passenger ?
[10:31:10] (CR) Aude: [C: 2] Remove hard coded item ids from SetSiteLinkTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102856 (owner: Hoo man)
[10:36:33] (Merged) jenkins-bot: Remove hard coded item ids from SetSiteLinkTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102856 (owner: Hoo man)
[10:42:24] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#1459 (master - 7db2ecd : Marius Hoch): The build was broken.
[10:42:24] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/088a0cd857f0...7db2ecd87917
[10:42:24] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/15760376
[11:27:44] (PS1) Aude: remove hardcoded ids in SetReferenceTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102917
[11:29:48] (PS2) Aude: remove hardcoded ids in SetReferenceTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102917
[11:49:51] (PS3) Aude: remove hardcoded ids in SetReferenceTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102917
[12:01:57] (PS1) Aude: Remove hardcoded ids from SetClaimTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102919
[12:01:58] (PS1) Aude: Remove hardcoded ids and cleanup in RemoveClaimsTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102920
[12:04:52] (CR) jenkins-bot: [V: -1] Remove hardcoded ids from SetClaimTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102919 (owner: Aude)
[12:07:43] (CR) jenkins-bot: [V: -1] Remove hardcoded ids and cleanup in RemoveClaimsTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102920 (owner: Aude)
[12:08:04] (PS2) Aude: Remove hardcoded ids and cleanup in RemoveClaimsTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102920
[12:11:22] :(
[12:11:39] I was so happy with KDE yesterday and now it's slow as hell
[12:11:42] * aude trying
[12:11:50] :/
[12:12:14] maybe the radeon power management is to blame... will reboot w/o it..
[12:15:23] (PS2) Aude: Remove hardcoded ids from SetClaimTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102919
[12:18:04] (CR) jenkins-bot: [V: -1] Remove hardcoded ids from SetClaimTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102919 (owner: Aude)
[12:24:07] wow... takes 5s to switch tabs in pidgin now
[12:24:24] and it can't follow my typing anymore :(
[12:24:33] :(
[12:28:14] u-n-u-s-e-a-b-l-e
[12:37:43] (PS3) Aude: Remove hardcoded ids from SetClaimTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102919
[12:40:25] (CR) jenkins-bot: [V: -1] Remove hardcoded ids from SetClaimTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102919 (owner: Aude)
[12:41:36] I actually wanted to get some work done today... and not wait an hour for stuff to render :)
[12:41:38] * :(
[12:42:03] * aude cries
[12:42:54] * aude see what travis says
[12:43:11] aude: Saw my other change which jenkins -1s?
[12:43:16] yeah
[12:43:28] I couldn't reproduce that locally... no matter what I tried
[12:43:35] set claim test passes for me, individually and with --group
[12:45:56] The command "bash ./build/travis/script.sh" exited with 255.
[12:45:58] no help
[12:46:24] aude: mh... make it run with --debug
[12:46:30] ok
[12:47:53] [travis-ci] filbertkm/Wikibase#101 (setclaimtest - d91af4a : aude): The build failed.
[12:47:53] [travis-ci] Change view : https://github.com/filbertkm/Wikibase/commit/d91af4a82acb
[12:47:53] [travis-ci] Build details : http://travis-ci.org/filbertkm/Wikibase/builds/15765002
[12:47:56] gah
[12:48:15] ok, that's more useful :)
[12:51:59] Lydia_WMDE: why KDE no more fast? :(
[12:52:41] going to restart once again....
[12:53:10] aude: cross your fingers, please
[12:58:12] [travis-ci] filbertkm/Wikibase#103 (setclaimtest - 2d521d4 : aude): The build is still failing.
[12:58:12] [travis-ci] Change view : https://github.com/filbertkm/Wikibase/compare/d91af4a82acb...2d521d45af62
[12:58:12] [travis-ci] Build details : http://travis-ci.org/filbertkm/Wikibase/builds/15765205
[13:12:11] aude: Figured it out... when I updated my grub this morning it somehow decided it would be smart to use a debug kernel as the default
[13:12:19] and the debug kernel is slow as hell :P
[13:46:55] hmmm, SuccuBot is editing quite rapidly
[13:52:22] aude: How much is that exactly? :P
[13:52:40] http://demos.filbertkm.com/wikistats/wikidata.php
[13:53:19] more than a thousand per minute, maybe more than 2 thousand
[13:53:47] oO wow
[13:54:02] seems the dispatcher might not be able to handle that much
[13:54:08] ewk
[13:54:29] It's been ages since we last had performance problems with the dispatcher
[13:54:45] yeah, it can probably handle quite a bit but maybe not this much
[13:55:02] which we should of course improve
[13:55:13] aude: job queue based dispatch!
[13:55:20] yeah!
[13:55:23] now that we use redis that should scale
[13:55:27] :)
[13:56:42] yeah... lag is growing :/
[13:57:35] I wonder how that bot can edit this many pages... probably running a lot of threads on a powerful system
[13:57:49] probably
[14:04:42] aude: You have quite nice stats on that page :)
[14:04:54] the wikipedia ones, I mean
[14:04:55] i'm leaving a note on his talk page
[14:05:03] * aude surprised my stats still work
[14:05:14] should move them to tool labs and have an api for it
[14:05:24] :)
[14:05:46] aude: Job queue size on wikidata would be nice...
[14:05:55] * aude nods
[14:06:36] left note
[14:07:22] :)
[14:12:47] [ 3752.731336] traps: soffice.bin[17723] trap divide error ip:7f04e51ef346 sp:7fffd00b0a70 error:0 in libvclplug_genlo.so[7f04e51bc000+89000]
[14:12:55] I didn't want to use libreoffice anyway
[14:12:57] huh
[14:12:59] * hoo install abiword
[14:13:02] * installs
[14:13:46] Lydia_WMDE: you missed the daily - fired!
[14:14:59] JeroenDeDauw: :P And you missed the fun I had... Go to the API tests, grep for PropertyId, cry a lot
[14:16:42] hoo: why is the "grep for PropertyId" needed? Seems redundant
[14:17:29] hoo: anything in particular to look at ?
[14:17:51] JeroenDeDauw: Well, we have a couple (really) of tests that use the same hard coded properties for testing
[14:18:13] oh the tests..
[14:18:23] https://gerrit.wikimedia.org/r/#/c/102863/
[14:18:35] I suspect that this fails because it relied on side effects from other tests
[14:18:38] can't reproduce
[14:21:21] alright, !admin
[14:21:44] huh?
[14:21:46] dispatch lag keeps growing :(
[14:21:54] shall I block?
[14:22:03] i think the bot needs to slow down
[14:22:20] aude: Half an hour?
[14:22:44] * aude see if he answers me
[14:22:50] k
[14:23:33] i'm sure it would catch up quickly if the bot slows down
[14:24:51] probably
[14:25:15] 500 per minute would be totally fine, maybe even 800 or so
[14:25:52] we also need to be concerned on the wikipedia side with the job executions and parsing that occurs
[14:25:54] aude: You know that we can't do that... we only have blocked or not blocked
[14:26:13] yeah, well i told him that on his talk page
[14:26:21] if he can change the bot
[14:26:25] (or she)
[14:26:25] well, AbuseFilter in theory could do that, but AbuseFilter sucks (I'm its maintainer, I'm allowed to say that :P)
[14:26:34] :)
[14:33:16] aude: Sorry... but the lag is almost 100k changes...
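The fix pattern behind all of the "remove hard coded ids" patches above, as a rough PHPUnit sketch (the class and helper names here are illustrative assumptions, not the actual Wikibase test code): each test creates its own entity and reads the generated id back, instead of hard coding an id that only exists because some earlier test happened to create it.

    <?php
    // Rough sketch of the pattern with assumed helper names; this is
    // not the actual Wikibase test code from the patches above.

    class SetQualifierTestSketch extends MediaWikiTestCase {

        /** @var string|null id of the property created for this run */
        private static $propertyId = null;

        protected function setUp() {
            parent::setUp();

            if ( self::$propertyId === null ) {
                // Create a fresh property instead of assuming an id
                // like "P42" already exists: the repo assigns the id,
                // so the test no longer relies on side effects from
                // other tests that ran first.
                $property = Wikibase\Property::newFromType( 'string' );
                $this->saveEntity( $property ); // hypothetical helper
                self::$propertyId = $property->getId()->getSerialization();
            }
        }

        public function testAddRequests() {
            // Use the generated id rather than a hard coded one.
            $params = array(
                'action' => 'wbsetqualifier',
                'property' => self::$propertyId,
                // ...
            );
            // ... perform the API request and assert on the result ...
        }
    }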
[14:33:24] * aude says block then
[14:34:06] when succu comes online, maybe he/she can limit the edit rate, but I'm not sure whether to wait for him/her
[14:34:21] aude: Ok... did half an hour now
[14:34:25] k
[14:34:25] (change visibility) 14:34, 20 December 2013 Hoo man (talk | contribs | block) blocked SuccuBot (talk | contribs) with an expiry time of 30 minutes (autoblock disabled) (Please lower the edits/minute to about 500-800 at the most) (unblock | change block)
[14:34:35] should help
[14:35:02] lag already decreasing :)
[14:35:07] wow
[14:42:04] Hello! Sorry to poke, but does anyone have an opinion on [[Wikidata:Requests_for_permissions/Bot/InductiveBot]]? It's been a week since I asked, and no-one has replied. Should I continue to wait, or ask somewhere else?
[14:42:11] Damn, Daniel went on holidays a day too early :P
[14:42:48] (PS4) Aude: Remove hardcoded ids in SetReferenceTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102917
[14:43:14] succu replied
[14:43:41] i think in half an hour, lag will come back to normal and the bot will be unblocked
[14:48:04] Yeah, we should give the dispatcher some time to breathe first :D
[14:48:14] Average--84,79144 minutes
[14:48:18] * 44 minutes
[14:48:21] way too high
[14:48:38] :(
[14:53:27] ooof that's what i get for fiddling with cinnamon - an X server crash to the face!
[14:54:47] inductiveload: heh... I had so much with Xorg in the last two days...
[14:55:55] + fun
[14:56:42] (PS1) Aude: mock Title in InfoActionHookHandlerTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102944
[15:00:20] hoo: i'd say -fun
[15:01:02] i'll do it in a VM next time and hopefully won't lose all my state when I poke it wrong
[15:02:12] are you f..... kidding me
[15:02:16] damn AbiWord
[15:02:48] i fail to see how office software can be so universally crap in 2013
[15:02:53] [travis-ci] filbertkm/Wikibase#107 (infoactionhook - 31b44cd : aude): The build failed.
[15:02:53] [travis-ci] Change view : https://github.com/filbertkm/Wikibase/compare/088a0cd857f0^...31b44cd7adf4
[15:02:53] [travis-ci] Build details : http://travis-ci.org/filbertkm/Wikibase/builds/15770723
[15:02:58] grrrr
[15:03:17] smell of gas
[15:03:17] gtg
[15:10:42] aude: DispatchStats now reports a lag of 58 minutes
[15:10:50] only 65k changes, though
[15:10:57] the lag will go higher
[15:11:00] changes down
[15:11:19] when "pending" changes is down to 0, then the time will decrease
[15:11:41] ah, I see...
[15:11:54] lag is the oldest change not processed
[15:11:59] :)
[15:12:00] * aude thinks
[15:12:24] "Lag" is the time between the change last dispatched to the wiki, and the last change performed on Wikidata.
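A toy calculation under the definition just quoted, with invented timestamps (this is only illustrative arithmetic, not how the DispatchStats code computes it):

    <?php
    // Toy illustration of the lag definition above; the timestamps
    // are invented.

    // Newest change recorded on wikidata.org:
    $newestChange = strtotime( '2013-12-20 14:48:00' );

    // Newest change already dispatched to a given client wiki:
    $lastDispatched = strtotime( '2013-12-20 14:04:00' );

    // Everything in between is still pending for that client.
    $lagMinutes = ( $newestChange - $lastDispatched ) / 60;
    echo $lagMinutes . " minutes\n"; // prints "44 minutes"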
[15:12:42] Enough for now... totally exhausted... the day started with a broken grub, went on with a slow KDE and finished with a broken LibreOffice and a crashy AbiWord...
[15:12:47] ok
[15:12:52] * aude going home
[15:13:09] aude: :) Happy holidays!
[15:13:13] you too!
[15:13:45] Got the card, btw :) You people are awesome (sadly I can't thank Lydia :P)
[15:38:02] (CR) Jeroen De Dauw: [C: 2] mock Title in InfoActionHookHandlerTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102944 (owner: Aude)
[15:46:10] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#1460 (master - b56f16f : Jeroen De Dauw): The build is still failing.
[15:46:10] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/7db2ecd87917...b56f16fd9283
[15:46:10] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/15772747
[15:58:43] [travis-ci] wmde/DataValuesJavascript#7 (master - d959355 : jeroendedauw): The build is still failing.
[15:58:43] [travis-ci] Change view : https://github.com/wmde/DataValuesJavascript/compare/5d5d479b3e47...d959355c4ce6
[15:58:43] [travis-ci] Build details : http://travis-ci.org/wmde/DataValuesJavascript/builds/15773865
[16:08:14] [travis-ci] wmde/DataValuesJavascript#8 (master - 64590dc : jeroendedauw): The build is still failing.
[16:08:14] [travis-ci] Change view : https://github.com/wmde/DataValuesJavascript/compare/d959355c4ce6...64590dc27fc3
[16:08:14] [travis-ci] Build details : http://travis-ci.org/wmde/DataValuesJavascript/builds/15774387
[16:09:11] so, apparently the main gas valve in a building can be loose for years and then one day start leaking
[16:09:21] that's encouraging...
[16:10:12] (PS4) Aude: Remove hardcoded ids from SetClaimTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102919
[16:13:08] (CR) jenkins-bot: [V: -1] Remove hardcoded ids from SetClaimTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/102919 (owner: Aude)
[16:19:56] aude: Same as I had
[16:20:04] really?
[16:20:06] * failures
[16:20:07] yep
[16:20:11] travis approve
[16:20:12] s
[16:20:21] aude: Also my machine approved
[16:20:29] me too
[16:20:30] 02:44:01 1) Wikibase\Test\Api\SetQualifierTest::testAddRequests with data set #2 (Wikibase\PropertyValueSnak) 02:44:01 UsageException: Malformed input: oO
[16:20:41] (changed the string to test, but that didn't alter anything)
[16:20:49] and I got
[16:20:49] 02:44:01 2) Wikibase\Test\Api\SetQualifierTest::testChangeRequests with data set #2 (Wikibase\PropertyValueSnak) 02:44:01 UsageException: Claim does not have a qualifier with the given hash
[16:21:28] I don't really get that error, but I also don't have much insight into the internals of the snak validation stuff
[16:21:30] did you run locally with sqlite?
[16:21:43] aude: Both MariaDB and SQLite passed locally
[16:21:54] * aude will have to poke more when my mind is fresh
[16:22:09] I ran the tests by group and by folder (the api folder)
[16:22:10] looking more will not help
[16:22:15] more now*
[16:22:41] aude: Yeah... want a different thing to think about? :P LinkItem on commons...
[16:22:54] :)
[16:23:00] does it work?
[16:23:08] Yes, but not the way we want it :P
[16:23:13] oh
[16:23:45] The problem is that it tries to link pages in the same namespace on other wikis... so that we can only link files on the wikipedias
[16:25:23] aude: The widget itself can do that, but it has to be constructed with the right option set... and I'm not sure how to do that in a manner which doesn't fully suck
[16:25:34] hmmm
[16:27:39] aude: Or I could change it to fully disable the namespace hinting, so that you can link whatever namespace you want (at least on commons)
[16:27:56] i think it's a setting
[16:28:13] or could be
[16:28:24] I doubt we already have a setting for that
[16:28:53] maybe i don't quite understand
[16:30:10] aude: Wait, I'll give you an example
[16:30:58] go to e.g. https://commons.wikimedia.org/wiki/File:Karlsruhe_BVerfG_05.jpg (hint: I'm on that one)
[16:31:04] and run
[16:31:04] $( '#t-info' ).addClass( 'wbc-editpage' ); mw.loader.load( 'wikibase.client.linkitem.init' );
[16:31:11] k
[16:31:43] Then you can open the addlink widget, and play with it
[16:36:47] for other namespaces, it seems to work fine
[16:36:49] e.g. https://www.wikidata.org/wiki/Q7111843#sitelinks-wikipedia
[16:38:01] aude: Oh sure
[16:38:14] For categories the assumption is corrent
[16:38:16] having links to files is a bit odd
[16:38:26] * corrent
[16:38:28] * correct
[16:38:29] duh
[16:42:28] [travis-ci] wmde/DataValuesJavascript#9 (master - 1fcf2a1 : jeroendedauw): The build is still failing.
[16:42:28] [travis-ci] Change view : https://github.com/wmde/DataValuesJavascript/compare/64590dc27fc3...1fcf2a105992
[16:42:28] [travis-ci] Build details : http://travis-ci.org/wmde/DataValuesJavascript/builds/15776385
[16:42:46] [travis-ci] wmde/Wikibase#32 (newcomponents - 281da60 : jeroendedauw): The build failed.
[16:42:46] [travis-ci] Change view : https://github.com/wmde/Wikibase/compare/d6a24e5badee...281da609786d
[16:42:46] [travis-ci] Build details : http://travis-ci.org/wmde/Wikibase/builds/15775444
[16:43:00] * aude away
[16:43:59] aude: ;) Maybe we can come up with a nice christmas present for the commons users later :D
[17:17:06] JeroenDeDauw: i did not ;-) vacation!
[17:48:33] * hoo|away hugs Lydia_WMDE!
[17:49:15] hoo: *hug* ;-)
[17:49:52] Lydia_WMDE: then you are fired from vacation!
[17:50:28] JeroenDeDauw: ;-)
[19:57:13] aude: Around?
[20:15:21] https://www.wikidata.org/wiki/Q15304676 mh
[21:04:15] New RFC! https://www.wikidata.org/wiki/Wikidata:Requests_for_comment/Interwiki_links_for_special_pages
[21:07:22] woah
[21:07:27] how fast was that bot going?
[21:08:26] legoktm: SuccuBot? :P
[21:08:32] yeah
[21:08:52] It peaked at far over 1k edits per minute...
[21:10:59] :P
[21:11:03] block it!
[21:11:05] wow
[21:11:05] :P
[21:11:13] at least it wasn't my bot this time!
[21:11:18] legoktm: Hi. Can you please revive your patch? Yours was far better than mine
[21:11:40] ebraminio: heh. alright
[21:11:40] perhaps an upper limit should be set so we don't break the servers...
[21:12:00] Ask him
[21:13:02] ebraminio: done
[21:13:15] rschen7754: you realize that bots have the noratelimit right for a reason? ;)
[21:13:29] well yeah
[21:14:27] Once we've switched to using the job queue for change dispatching this shouldn't be a big issue after all
[21:15:04] but right now over 1k edits/min by a single bot are a bit too much
[21:15:36] * DangSunM is away from now
[21:18:32] it was more like 2k a minute or a bit more
[21:18:58] almost 3k
[21:32:48] hoo: legoktm https://www.mediawiki.org/wiki/Requests_for_comment/Linker_service
[21:33:02] i'm trying to come up with a better name for the rfc
[21:33:55] mh... +1 on the content, btw
[21:33:57] I can't think of one, but +1 to the idea :D
[21:34:22] :)
[21:34:38] that class is incredibly annoying
[21:35:03] aude: Indeed... I hacked a bit on it in the past (nofollow stuff)... it *sucks*
[21:35:24] alright, "linker class refactor"
[21:35:44] i'd like to keep it specific and focused
[21:35:53] if it goes well, similar could be done elsewhere in mediawiki
[21:37:37] aude: Yeah :)
[21:37:55] aude: Will you attend the Architecture Summit now, btw?
[21:38:08] yeah
[21:38:13] * aude will be in the us anyway
[21:38:14] \o/
[21:38:47] I'll fly there just for it... but I'll also take a few days after to see the city a bit
[21:39:10] cool
[21:39:13] will it be your first time in SF?
[21:39:27] i'm just there a few days
[21:39:27] legoktm: Yep... never been there
[21:39:34] :D
[21:39:39] You two probably already know it...
[21:39:45] I live an hour away :P
[21:39:52] :)
[21:40:52] * aude prays for no snow storm that week in DC
[21:41:06] that interferes with travel
[21:41:27] aude: how many days in advance will you leave DC?
[21:41:33] My biggest aim is: Kill AbuseFilter...
[21:41:35] 1 day
[21:41:43] very early in the morning
[21:42:01] hoo: And Create HooFilter? :)
[21:42:46] JohnLewis: I'm not this selfish ... but it should at least be named "TheOneWhoHadTheIdeaIsAwesomeReallyFilter" :P
[21:43:11] aude: Did you really just mock a Title object in a unit test? oO
[21:43:16] just as in today
[21:43:26] yeah
[21:43:39] oh, it does Title::exists
[21:43:42] I see
[21:43:44] seems much better and not dififcult
[21:43:51] difficult*
[21:44:05] But no, there's absolutely no reason for having a TitleValue object at all...
[21:44:16] Yeah, also faster
[21:44:25] ?
[21:44:58] aude: If we had TitleValues you wouldn't have to mock the whole Title but probably only a Title database lookup class
[21:45:09] right
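The mocking technique being discussed, as a minimal PHPUnit sketch (the general shape of the approach, not the actual test code from the patch above):

    <?php
    // Minimal sketch of stubbing Title::exists in a PHPUnit test; the
    // technique under discussion, not the actual InfoActionHookHandlerTest.

    class InfoActionHookHandlerTestSketch extends PHPUnit_Framework_TestCase {

        public function testHandlerWithExistingTitle() {
            // Build a Title double without touching the database.
            $title = $this->getMockBuilder( 'Title' )
                ->disableOriginalConstructor()
                ->getMock();

            // Stub the one method the code under test actually calls,
            // so no page has to really exist in the wiki.
            $title->expects( $this->any() )
                ->method( 'exists' )
                ->will( $this->returnValue( true ) );

            // The handler under test would now run against the mock:
            // $handler = new InfoActionHookHandler( /* ... */ );
            // $pageInfo = $handler->handle( $context, array(), $title );
            // ...assertions on $pageInfo...
        }
    }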
[22:44:31] Are there any guidelines for which wikipedia page a given wikidata item should be linked to? e.g. it seems to me that https://www.wikidata.org/wiki/Q15221256 and https://www.wikidata.org/wiki/Q34273 both are equally applicable for the latter’s wikipedia page.
[22:46:45] the latter is a smaller number and contains links to pages on other wikipedias
[22:47:01] not sure if that is included in any policy, though
[22:47:48] Should the be merged?
[22:47:55] *they
[22:48:05] yeah
[22:48:29] done :)
[22:48:50] I guess it’s unclear to me when an entity should be created for a wikipedia page (e.g. there are plenty of 'instance of wikipedia disambiguation pages') and when they should be separate.
[22:49:53] And in fact… it seems that 'list of cities in wisconsin' (the wikimedia list article) is a separate thing from the concept of a city in wisconsin.
[22:49:58] So…
[22:50:33] …maybe they shouldn’t be merged? :-/
[22:52:06] the concept of a city in wisconsin doesn't seem consistent with what items generally are on wikidata
[22:52:17] Is it required for some other property to be used?
[22:52:31] not sure what you mean?
[22:52:48] Like, the notability policy (WD:N) lists the reasons for an item to exist
[22:53:05] OK…
[22:53:19] generally items need either a link to a wikipedia page, or to make a statement in a property
[22:54:40] OK…well, the 'link to a wikipedia page' part seems to be overtaken by the 'list of cities in wisconsin' entity…
[22:54:57] and I’m not sure what 'making a statement in a property' means.
[22:55:11] hmm
[22:55:14] the 'fulfills some structural need' seems possibly relevant
[22:55:25] yeah, that would be filling a statement in a property
[22:55:37] so, for example Property:sex has two possible values, male or female
[22:55:55] but rather than using the items with interwiki links on them, we have specially created items for that property
[22:56:25] sorry, this is hard for me to explain because I still don't fully understand it myself
[22:56:56] For this specific case, I *think* that a city in wisconsin could use the property "instance of" = city, with a qualifier "in administrative unit" = wisconsin
[22:57:29] OK…it (was) serving as a link between 'political subdivision of wisconsin' and 'nth-class city' (which is a classification of city used in Wisconsin ;-) )
[22:58:01] Problem being that 'city' is a type of human settlement…while 'city (wisconsin)' is a type of government
[22:58:22] the former is dependent on population (larger than a town), the latter is not.
[22:58:35] hm, interesting
[22:58:43] I guess it does fill a structural need?
[22:58:46] I can undelete it
[22:58:53] I *think* it does…
[22:59:27] I mean, it seems to me that ideally we’d just have the list linked to that, and no need for a wikidata entry for the list article…
[22:59:51] (but then, I don’t really get the purpose of a wikidata entry saying that a wikipedia article exists.)
[23:00:15] heh
[23:13:59] (if you’re curious, http://208.80.153.172/wdq/?q=tree[1537][][131]_AND_claim[31:(TREE[15221256][][279])]_AND_noclaim[31:515] — things that Wisconsin considers cities which would probably not be considered cities otherwise )
[23:14:48] (that’s 155 out of the 190 cities-by-government)