[02:05:14] How should multidisc games be imported into Wikidata? https://www.wikidata.org/w/index.php?title=Q609139&diff=700841502&oldid=630379374
[08:33:13] hey
[08:34:03] is there a way to assign a statement to all the results of a sparql query?
[10:28:05] frickel: with some scripting, yes: some examples https://github.com/maxlath/wikidata-scripting
[10:56:41] can an editor/admin please look at User:Lilimoscu ?
[10:56:57] I just corrected a vandalism edit by this account
[10:58:48] there's not a lot to look at
[10:59:27] and based on those two edits, it could be someone who's just confused by the interface
[11:01:35] agreed, that's why I asked...
[11:01:52] we've recently had this kind of edit in another MediaWiki installation...
[11:02:00] new account, simple edits
[11:02:21] I have the impression someone is preparing a new spambot and is testing it
[11:02:33] because the edit just doesn't make sense
[11:02:42] note that the second edit is a correction of the first...
[11:03:06] (the other install is wikipathways.org)
[11:04:03] searching for the text, I found https://www.wikidata.org/wiki/Q55093731
[11:04:52] where someone has used that property for a url for whatever reason
[11:07:47] ah, there's your spam!
[11:07:50] good catch!
[11:08:00] ok, something to flag with the spam team, then
[11:08:02] the other edits from the IP look reasonable though
[11:08:51] sure, that's the only way to make it hard to get picked up automatically
[11:09:19] are you restoring Q55093731 ?
[11:09:47] so, I've often had problems with URLs for genuine things...
[11:09:58] so, how come this account can freely add random URLs?
[11:11:08] well, you may be right...
[11:11:50] but I don't trust it...
[11:15:33] I've removed the pubmed id and the title (which was the title of the newspaper, not the book); the urls aren't exactly random, they all mention the book
[11:15:50] also removed the aliases, which were genres
[11:16:06] +1
[11:19:08] thanks for checking
[11:32:20] is there a consensus about how different editions with the same ISBN should be handled?
[11:33:04] I have two books with essentially the same content and the same ISBN, but the cover is different
[11:36:54] maxlath[m]: so far I'm using JS to write QuickStatements. your approach seems a bit more sophisticated
[11:37:01] ^^
[11:54:30] addshore or DanielK_WMDE_: do you know why the same checks are implemented in TimeValue.php (function normalizeIsoTimestamp) and in IsoTimestampParser.php (function splitTimeString)?
[15:06:13] Hi, I haven't had any success using www.wikidata.org/wiki/Q32088211, so I came here hoping to find some help. I want the list of articles from Wikipedia in the category "À illustrer". So I tried "SELECT ?articleLabel WHERE {
[15:06:13] ?article wdt:P31 wd:Q32088211
[15:06:13] SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
[15:06:13] }" but I get "No matching records found". Does anyone understand what the issue is?
[15:07:13] (with a '.' at the end of the second line)
[17:08:20] Can someone help with this query: https://tinyurl.com/y8aguju2
[17:09:04] Question in query, basically it's about how to work with properties…
[17:54:46] frickel: if you're in JS, you might want to have a look at https://github.com/maxlath/wikidata-edit too :)
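
On the 08:34 question (adding the same statement to every result of a SPARQL query) and the QuickStatements-via-JS approach mentioned at 11:36: below is a minimal sketch in Node.js (version 18 or later, run as an ES module, e.g. a *.mjs file) that fetches results from the public query endpoint and prints one QuickStatements v1 command per result. The selection query and the P31 -> Q3331189 statement are placeholder examples chosen for illustration, not values from the discussion.

    // Sketch: turn SPARQL results into QuickStatements v1 lines.
    // Assumptions: Node >= 18 (global fetch), ES module (top-level await).
    const endpoint = 'https://query.wikidata.org/sparql'

    // Placeholder selection: items that have an ISBN-13 (P212).
    const query = `
      SELECT ?item WHERE {
        ?item wdt:P212 ?isbn .
      } LIMIT 10
    `

    const url = `${endpoint}?query=${encodeURIComponent(query)}&format=json`
    const res = await fetch(url, { headers: { 'User-Agent': 'qs-sketch/0.1' } })
    const { results } = await res.json()

    for (const binding of results.bindings) {
      // binding.item.value looks like http://www.wikidata.org/entity/Q42
      const qid = binding.item.value.split('/').pop()
      // QuickStatements v1 line: subject <TAB> property <TAB> value
      console.log([qid, 'P31', 'Q3331189'].join('\t'))
    }

The printed lines can be pasted into QuickStatements as a batch; for writing directly from JS without that intermediate step, the wikidata-edit library linked at 17:54 talks to the Wikidata API itself.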
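
On the 15:06 question: the query most likely returns nothing because membership in a Wikipedia category such as "À illustrer" is recorded on Wikipedia itself, not as wdt:P31 statements on the articles' Wikidata items, so the triple pattern has nothing to match. A sketch (same Node.js assumptions as above, and assuming the frwiki category title is literally "Catégorie:À illustrer"; the exact title may differ) that lists the category's members through the MediaWiki API instead:

    // Sketch: list members of a French Wikipedia category via the MediaWiki API.
    const api = 'https://fr.wikipedia.org/w/api.php'
    const params = new URLSearchParams({
      action: 'query',
      list: 'categorymembers',
      cmtitle: 'Catégorie:À illustrer',   // assumption: exact frwiki category title
      cmlimit: '500',                      // larger categories need continuation
      format: 'json',
      origin: '*'                          // allows anonymous cross-origin calls from a browser
    })

    const res = await fetch(`${api}?${params}`)
    const data = await res.json()
    for (const page of data.query.categorymembers) {
      console.log(page.title)
    }

If the category members need to be joined with Wikidata data, the query service's MWAPI feature (SERVICE wikibase:mwapi) can run a similar categorymembers generator from inside SPARQL.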