Is there some software that does this? (Wikipedia)

Wolfdale75

Limp Gawd
Joined: Jun 17, 2010
Messages: 133
Hi Guys,

I am looking for some software that lets you download entire Wikipedia pages (including images) to your hard disk, and then continuously (or on demand) checks for updated versions of those particular pages and downloads them, so you can keep your collection of stored Wikipedia pages up to date.

Is there a program that can do this?

And can you recommend any other Wiki-related software?

Many thanks!
 
Actually, I might have found the best thing out there. See here: http://xowa.sourceforge.net/

It doesn't rip and check individual pages for updates, but it does let you download the entire Wikipedia each month, which is still very useful (if you have enough disk space :))
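If anyone does want per-page update checking, a rough sketch of one way to do it (Python, using the standard MediaWiki revisions API; the page title below is just an example) is to poll each page's latest revision timestamp and only re-download when it changes:

```python
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def revision_url(title):
    """Build the MediaWiki API query for a page's latest revision timestamp."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp",
        "format": "json",
    }
    return API + "?" + urllib.parse.urlencode(params)

def latest_timestamp(title):
    """Fetch the timestamp of the newest revision of the given page."""
    with urllib.request.urlopen(revision_url(title)) as resp:
        data = json.load(resp)
    # The API keys pages by numeric page ID, so take the first (only) entry.
    page = next(iter(data["query"]["pages"].values()))
    return page["revisions"][0]["timestamp"]

# Example usage (hits the live API, so not run here):
# latest_timestamp("Python (programming language)")
```

You'd store the timestamp alongside your saved copy of the page and re-fetch the page (and its images) whenever the API reports a newer one.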
 
All current browsers let you save pages completely ("Save Page As...") or print them to a file (PDF or similar). The second option is decent enough and easier to deal with, at least for me.
 
Looks like Wikipedia themselves provide something:

http://en.wikipedia.org/wiki/Wikipedia:Database_download
http://en.wikipedia.org/wiki/Wikipedia:Duplication_detector

Lots of information in there. So how big is Wikipedia currently? I can't find current numbers.

It says so in your first link if you scroll down. With only the current revision of each article and no talk pages or images, it's about 10 GB compressed and 40 GB uncompressed.

The link to that version is here: http://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2
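And if you'd rather grab that dump from a script instead of a browser, a minimal sketch (Python, streaming in 1 MiB chunks so the ~10 GB file never has to fit in memory):

```python
import urllib.request

DUMP_URL = ("http://dumps.wikimedia.org/enwiki/latest/"
            "enwiki-latest-pages-articles.xml.bz2")

def download(url, dest, chunk_size=1 << 20):
    """Stream a (possibly huge) file to disk in fixed-size chunks."""
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        while True:
            chunk = resp.read(chunk_size)
            if not chunk:
                break
            out.write(chunk)

# Example usage (downloads ~10 GB, so not run here):
# download(DUMP_URL, "enwiki-latest-pages-articles.xml.bz2")
```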
 