[WIKI] Please update the static wiki once a week
-
- Goblin
- Posts: 254
- Joined: Tue Feb 13, 2007 5:33 am
I have my internet connection down for some time, and I thought I could grab a static copy of the wiki and the manual to carry on with my work, but I found the tarred wiki is very dated... No updates since may 2007 or so.
I know most users wouldn't care if it was updated every week, but I hope you do it anyway for the ones who do.
- sinbad
- OGRE Retired Team Member
- Posts: 19265
- Joined: Sun Oct 06, 2002 11:19 pm
- Location: Guernsey, Channel Islands
- x 2
- Contact:
It's been turned off because MediaWiki has been the cause of several site-wide outages in recent times, due to some performance issues it has in certain edge cases, the static build was yet another overhead. It's low priority at least until I know we're over these issues. To be honest I've never particularly liked the idea anyway - it's out of date the second we build it anyway and is there anyone who doesn't have broadband these days? I caved in and reluctantly created the static copy but given the resources it sucks down and the performance issues with the wiki I'm strongly considering canning the static copy and telling people to live with the online version.
- SpaceDude
- Bronze Sponsor
- Posts: 822
- Joined: Thu Feb 02, 2006 1:49 pm
- Location: Nottingham, UK
- Contact:
Why not use a web crawler like HTTrack (http://www.httrack.com/) to create your own off-line version of the site? Although this may not go down well if you suck up too much bandwidth.
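Something along these lines, perhaps — the flags are from memory of HTTrack's manual and the URL filters are illustrative, so double-check against `httrack --help` before running it:

```shell
# -O: output directory; "+"/"-" patterns: URL filters (stay inside the
# wiki, skip edit/history/diff views); -c2: at most 2 connections;
# -A25000: cap transfer rate at ~25 KB/s to be polite to the server.
httrack "http://www.ogre3d.org/wiki/" -O ./ogre-wiki-mirror \
    "+www.ogre3d.org/wiki/*" "-*action=*" "-*diff=*" \
    -c2 -A25000
```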
- Kojack
- OGRE Moderator
- Posts: 7152
- Joined: Sun Jan 25, 2004 7:35 am
- Location: Brisbane, Australia
- x 19
Or don't use a web crawler, if you don't want to get IP-blocked (as happens on the OGRE site).
(I actually like mass downloaders; I use one I won't mention to download web comics so I can read them in an image browser instead of waiting for them to load and scrolling around the page to find the next button. With careful configuration the bandwidth hit is smaller than actually going to the site and reading them one by one, since I can filter out stuff, generate URLs procedurally, or do an update so only new things are grabbed. But some sites don't like it and will ban you. That's what happens here.)
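A rough sketch of that kind of careful configuration in Python — the URL pattern and directory layout are invented for illustration, but the two ideas are the ones described above: generate URLs procedurally instead of crawling link-by-link, and skip anything already on disk so an update run only grabs new pages:

```python
import os

def comic_urls(start, end, pattern="https://example.com/comics/page-{:04d}.png"):
    """Generate page URLs procedurally instead of crawling link-by-link."""
    return [pattern.format(n) for n in range(start, end + 1)]

def pages_to_fetch(urls, download_dir):
    """Incremental update: keep only URLs whose file isn't on disk yet."""
    return [u for u in urls
            if not os.path.exists(os.path.join(download_dir, u.rsplit("/", 1)[-1]))]
```

Feeding `pages_to_fetch` into an actual downloader (with a polite delay between requests) is left out here; the point is that neither step touches the site's navigation pages at all.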
- sinbad
- OGRE Retired Team Member
- Posts: 19265
- Joined: Sun Oct 06, 2002 11:19 pm
- Location: Guernsey, Channel Islands
- x 2
- Contact:
The main problem with a web crawler is that it will crawl through everything - on a wiki that's a major problem, as it tends to crawl through all the history and, worse, all the 'diff' links. There are a lot of those, and they're not cheap on large pages. Bandwidth isn't the main issue; it's crawlers hitting diff after diff and putting the site under heavy load.
I've tried various tactics to stop this but none of them are foolproof, especially given the number of crawler tools out there. They are my arch-enemy.
Tuning continues, the situation seems to be improving but we're still monitoring to try to isolate remaining spikes. With the amount of traffic going on it's hard to see the wood for the trees and I don't have a lot of time to spend on this.
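For anyone attempting a polite mirror anyway, the kind of URL filter needed looks roughly like this. The `action`, `diff`, and `oldid` query parameters are standard MediaWiki; everything else here (names, URLs) is illustrative:

```python
from urllib.parse import urlparse, parse_qs

# MediaWiki views a mirror should never fetch.
SKIP_ACTIONS = {"history", "edit", "delete", "raw"}

def is_plain_article(url):
    """True if the URL is a normal page view, not a history/diff/edit view."""
    qs = parse_qs(urlparse(url).query)
    if "diff" in qs or "oldid" in qs:
        return False  # revision diffs and old revisions
    action = qs.get("action", ["view"])[0]
    return action not in SKIP_ACTIONS
```

Applied before each fetch, a check like this keeps a crawler off exactly the expensive diff/history pages described above.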
-
- Gnoblar
- Posts: 3
- Joined: Tue Jan 20, 2009 8:51 pm
Re: [WIKI] Please update the static wiki once a week
It will be great if Sinbad makes a offline copy of wiki once a month or two...
- xavier
- OGRE Retired Moderator
- Posts: 9481
- Joined: Fri Feb 18, 2005 2:03 am
- Location: Dublin, CA, US
Re: [WIKI] Please update the static wiki once a week
DaMan wrote: It will be great if Sinbad makes a offline copy of wiki once a month or two...
Apart from digging up a year-old thread... did you not read anything he posted above? I mean, really -- a minimum of effort put forth on your part is required to have your posts taken seriously.
-
- Halfling
- Posts: 85
- Joined: Sun Sep 23, 2007 7:58 pm
Re:
sinbad wrote: The main problem with a webcrawler is that they will crawl through everything - on a Wiki, that's a major problem, as it tends to crawl through all the history and worse, all the 'diff' options. There's a lot of those, and they're not cheap on large pages. Bandwidth isn't the main issue, it's crawlers hitting diff after diff and putting the site under heavy load....
I am fairly sure there are crawlers designed specifically for downloading wikis. I am not sure how much it would be worth, but I did use one when I didn't have my internet connection for a week.
- sinbad
- OGRE Retired Team Member
- Posts: 19265
- Joined: Sun Oct 06, 2002 11:19 pm
- Location: Guernsey, Channel Islands
- x 2
- Contact:
Re: [WIKI] Please update the static wiki once a week
Honestly, this is WAY low on my priority list. None of the tools I've tried work properly, or they put too much strain on the web server. Newer versions of MediaWiki require a server upgrade, which goes beyond what I'm willing to undertake right now. You may just have to live with this for the rare occasions you don't have internet access; it's just too much hassle for the tiny amount of utility it brings.
Locked, that's my final word on this topic for now.