[02:12:57] loftyabyss: thanks for the assist
[12:02:45] ping duesen
[12:30:09] IRELATIVISM: hi
[15:05:45] hello
[15:10:10] hello, please ask your question if you have one
[15:20:25] What is the easiest way to install MediaWiki as an "internal" document/note/resource management system?
[15:21:20] A year or so ago I was experimenting with setting up some sort of internal intranet/resource page for myself. I tried it in a virtual machine, but I got tired of going in and out of different config files
[15:22:52] I mean, it's a wiki
[15:23:08] If you want it internal, you can lock down anon editing
[15:23:44] And just don't expose it to the wider internet, or firewall it
[15:24:04] yeah, that was not the problem. It was just a lot of configuration to set it up
[15:24:15] What do you mean?
[15:24:21] like trying to install Linux using only Gentoo
[15:24:49] We can't really help you install Linux
[15:25:05] Most cloud providers should already have existing images, though
[15:34:09] Bitnami might be a solution?
[16:50:15] hey duesen
[17:04:24] I have been trying to reach you
[17:04:24] a couple weeks ago we briefly talked
[17:04:25] and you said to ping you the following day
[17:04:25] but you must have been busy since then
[19:38:18] IRELATIVISM: maybe try reaching him via a less-async method than IRC? email or a talk page?
[23:00:48] hi ppl. I'm using the search API, and responses seem to be limited to a max of 10K results. Do you know if there is a way to get the full response?
[23:08:39] dsaez: which search API calls are you making?
[23:09:04] legoktm: srsearch
[23:09:11] search by text
[23:11:02] if I had to guess, that's a limitation from the search backend (CirrusSearch)
[23:11:35] that was my guess too
[23:11:56] i was reading the CirrusSearch docs, but couldn't find anything
[23:12:05] I don't see anything in the API code itself that would institute such a limit
[23:15:23] got it. thx for the info legoktm
[23:15:48] I'm guessing that's 10K results after some pagination?
[23:16:35] yep, with srlimit=500, sroffset=9500
[23:18:37] in fact, if you put sroffset over 10K you get "cirrussearch-offset-too-large"
[23:18:42] >The value of the 'max_inspect' key is the maximum number of pages to recheck the regex against. Its optional and defaults to 10000 which seems like a reasonable compromise to keep regexes fast while still producing good results.
[23:19:03] also
[23:19:03] /**
[23:19:03]  * Maximum offset + limit depth allowed. As in the deepest possible result
[23:19:04]  * to return. Too deep will cause very slow queries. 10,000 feels plenty
[23:19:04]  * deep. This should be <= index.max_result_window in elasticsearch.
[23:19:04]  */
[23:19:06] private const MAX_OFFSET_LIMIT = 10000;
[23:19:20] So yeah, definitely a limit (if maybe "arbitrary") from CirrusSearch
[23:19:38] yep, makes sense.
[23:19:57] I think if someone wants a larger response, the only solution is to go with the dumps
[23:20:25] Yeah, I guess
[23:20:33] But obviously that's potentially quite a bit more work :)
[23:20:43] "10K results ought to be enough for anyone..."
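
For reference, the lockdown suggested around 15:23 amounts to a few lines in LocalSettings.php. A minimal sketch, assuming a stock MediaWiki install; these are MediaWiki's standard $wgGroupPermissions settings, but the exact policy is up to you:

    // Anonymous users cannot read, edit, or register themselves.
    $wgGroupPermissions['*']['read'] = false;
    $wgGroupPermissions['*']['edit'] = false;
    $wgGroupPermissions['*']['createaccount'] = false;

    // Logged-in users get full read/edit access.
    $wgGroupPermissions['user']['read'] = true;
    $wgGroupPermissions['user']['edit'] = true;

Paired with the firewalling advice above, this makes the wiki effectively internal without touching many config files.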
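
The 10K cap from the search discussion can be reproduced by paging list=search with the srlimit/sroffset parameters mentioned above. A minimal sketch in PHP, assuming allow_url_fopen is enabled; the endpoint URL and search term are placeholders:

    // Page through full-text search results until CirrusSearch's
    // offset cap (MAX_OFFSET_LIMIT = 10,000) is reached.
    $endpoint = 'https://en.wikipedia.org/w/api.php';
    $offset = 0;
    do {
        $url = $endpoint . '?' . http_build_query([
            'action'   => 'query',
            'list'     => 'search',
            'srsearch' => 'example',   // placeholder search term
            'srlimit'  => 500,         // maximum page size
            'sroffset' => $offset,
            'format'   => 'json',
        ]);
        $data = json_decode(file_get_contents($url), true);
        foreach ($data['query']['search'] ?? [] as $hit) {
            echo $hit['title'], "\n";
        }
        // The API returns 'continue.sroffset' while more results exist;
        // requesting sroffset >= 10000 fails with "cirrussearch-offset-too-large".
        $offset = $data['continue']['sroffset'] ?? null;
    } while ($offset !== null && $offset < 10000);

Past that point, as noted in the channel, the dumps are the only way to get the complete result set.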