Mailing List Archive

Media Wiki offline?
Hello everybody,

I am looking for a way to make my MediaWiki usable even when I do
not have an internet connection.

Is there a way to make some kind of static snapshot in HTML format,
or anything else? It is not necessary to change pages in offline mode; I
just want to have read access to my wiki.

I found a tool called wiki2static, but I could not get it to work.

Regards

Florian Taeger
Re: Media Wiki offline? [ In reply to ]
On Thursday, 2 December 2004 at 18:56, Florian Taeger wrote:
> Hello everybody,
hi

>
> I am looking for a way to make my MediaWiki usable even when I do
> not have an internet connection.
You can install the actual MediaWiki software locally and download the current
database dumps for local use.
>
> Is there a way to make some kind of static snapshot in HTML format,
> or anything else? It is not necessary to change pages in offline mode; I
> just want to have read access to my wiki.
>
> I found a tool called wiki2static, but I could not get it to work.
For me it nearly works ;) I just have trouble with French accents, and with the
first page, which has disappeared!
>
> Regards
>
> Florian Taeger
>
Regards
Alain
Re: Media Wiki offline? [ In reply to ]
>You can install the actual MediaWiki software and download the current
>database dumps for local use.

That's another problem. My colleagues are using Windows and don't want to
use/install a complete WAMP server. It's too complicated to tell them to click
on an icon ... Windows users - you know? :)

Regards

Florian
Re: Media Wiki offline? [ In reply to ]
On Thursday, 2 December 2004 at 19:20, Florian Taeger wrote:
> >You can install the actual MediaWiki software and download the current
> >database dumps for local use.
>
> That's another problem. My colleagues are using Windows and don't want to
> use/install a complete WAMP server.
You can download static snapshots of Wikipedia at the same place
you found wiki2static. The snapshots work fine :)

regards
Alain
Re: Media Wiki offline? [ In reply to ]
>You can download static snapshots of Wikipedia at the same place
>you found wiki2static. The snapshots work fine :)


The problem is: I don't need a snapshot of Wikipedia ... I need a snapshot
of my internal corporate wiki :)
Re: Media Wiki offline? [ In reply to ]
On Thursday, 2 December 2004 at 19:42, Florian Taeger wrote:
> >You can download static snapshots of Wikipedia at the same place
> >you found wiki2static. The snapshots work fine :)
>
>
> The problem is: I don't need a snapshot of Wikipedia ... I need a snapshot
> of my internal corporate wiki :)

If you have your wiki, use it, or use wiki2static :)

I don't know if this is the right place to talk about wiki2static; I hope so.
For me it works 99%, and I think the 1% of trouble is my own fault...

What doesn't work with it for you?

Alain

PS: how do you launch a PHP script on the command line?
Re: Media Wiki offline? [ In reply to ]
>I don't know if this is the right place to talk about
>wiki2static; I hope so.


Where is the place to discuss MediaWiki stuff, if not here?

wiki2static was - as I understand it - created to make a static dump of Wikipedia,
which is based on MediaWiki.

Even though my wiki is also based on MediaWiki, it is quite different.

I think wiki2static has problems grabbing my wiki. But I need a way
to use this wiki on a Windows PC without running software like
Apache, MySQL etc.

Alain - what did you use wiki2static for? Grabbing a dump of Wikipedia, or
dumping your own wiki?

Does anybody have an idea how I can create a static dump of my own wiki? I
need a way to use my personal wiki offline, without running
Apache/MySQL.

Regards

Florian
Re: Media Wiki offline? [ In reply to ]
> PS: how do you launch a PHP script on the command line?

Just do "php index.php" or "php4 index.php" to execute the PHP code in
index.php. You need the php4-cgi package for command-line execution.
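A minimal check along these lines, assuming a `php` binary is on the PATH (the guard just reports if it is missing):

```shell
# Run a PHP one-liner from the command line if the php binary is installed.
if command -v php >/dev/null 2>&1; then
  OUT=$(php -r 'echo 2 + 2;')
else
  OUT="php binary not found"
fi
echo "$OUT"
```

The same invocation works with a file instead of a one-liner: `php index.php`.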

Regards

Florian
Re: Media Wiki offline? [ In reply to ]
WAMP is not necessarily an obstacle.

I wrote http://meta.wikimedia.org/wiki/Wiki_on_a_stick to describe a
quick and small set-up of a tree containing all software & data that
can then be installed by a simple copy, even to a USB memory stick.
The total initial size is about 20MB. Data distributions could be
most easily generated by stopping the wiki servers briefly and just
copying the wiki database folder.
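The "stop the servers and copy the database folder" step can be sketched as below; the real source would be something like the MySQL data directory, but this demo uses temporary directories as hypothetical stand-ins:

```shell
# Sketch: snapshot the wiki's database folder by plain copying.
# In real use: stop MySQL first, then copy e.g. /var/lib/mysql/wikidb.
SRC=$(mktemp -d)   # stands in for the MySQL data directory
DEST=$(mktemp -d)  # stands in for the USB stick
echo "dummy table data" > "$SRC/cur.MYD"
cp -R "$SRC"/. "$DEST"/
ls "$DEST"
```

Copying the data folder only gives a consistent snapshot while the database server is stopped, which is why the brief shutdown is part of the recipe.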
Re: Media Wiki offline? [ In reply to ]
On Thu, 2 Dec 2004 19:20:39 +0100, Florian Taeger <mail@konfu.de> wrote:
> That's another problem. My collegues are using windows and don't want to
> use/install a complete wamp server.

Try XAMPP. It's the easiest way (I think) to set up AMP on Windows. It's
what I used to set up the AMP server I'm using to rewrite my site on
a computer that's offline. From there, you only need to copy the MySQL
databases. (The included phpMyAdmin can back up a server to an SQL script.)

> It's too complicated to tell them to click
> on an icon ... Windows users - you know? :)

Hey! I take that personally! (But I know what you mean)

--
-------------------------------------------------------------------
http://endeavour.zapto.org/astro73/
Thank you to JosephM for inviting me to Gmail!
Re: Media Wiki offline? Dump & wiki2static [ In reply to ]
On Thursday, 2 December 2004 at 23:31, Florian Taeger wrote:
> >I don't know if this is the right place to talk about
> >wiki2static; I hope so.
>
>
> Where is the place to discuss MediaWiki stuff, if not here?
>
> wiki2static was - as I understand it - created to make a static dump of Wikipedia,
> which is based on MediaWiki.
>
> Even though my wiki is also based on MediaWiki, it is quite different.
>
> I think wiki2static has problems grabbing my wiki. But I need a way
> to use this wiki on a Windows PC without running software like
> Apache, MySQL etc.
>
> Alain - what did you use wiki2static for? Grabbing a dump of Wikipedia, or
> dumping your own wiki?
>
> Does anybody have an idea how I can create a static dump of my own wiki? I
> need a way to use my personal wiki offline, without running
> Apache/MySQL.
>
> Regards
>
> Florian

Hi

I understand your trouble now: I tried to dump my modified Wikipedia
and then tried to "2static" it.
The big question is: how was that dump generated, with which options precisely?

I use wiki2static to generate a static dump, in order to
save resources (CPU and RAM) on the light servers we plan to put in some
schools here in Sénégal :)
As the connections are _very_ erratic (when it is not the power!) and _very_ expensive
(100 euro/month for 1024 kbit ADSL in Dakar, 50 euro/month for 56 kbit in the countryside!),
we really need that.

I have spent 3 days on testing and "reverse understanding by brute force"
(wc cur.sql, head -45 cur.sql, cat cur.sql | cut -c1-80, and the same with mydump.sql ;)

Here are my conclusions:
- I lack documentation about setting up the DB and tuning it, but
I probably missed something ;)

- I am missing only one option to reproduce the dump of the database:
mysqldump --opt wikidb cur > mycur.sql
is _nearly_ the same as the downloaded one.

The ONLY difference is the backticks ` around the name of the table cur:
INSERT INTO cur VALUES (146617,0,'Crazy_Day',
INSERT INTO `cur` VALUES (138898,0,'Antoine-Laure......

I don't know how to dump without the backticks! (I'm on Debian Sarge.)

And that breaks wiki2static, which is looking for
marker1="INSERT INTO cur VALUES" with no backticks around cur, and
uses that marker to delimit fields!
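One workaround is to post-process the dump with sed before feeding it to wiki2static, so the INSERT marker matches again (mysqldump also documents a --skip-quote-names option to suppress the quoting, though I have not checked it on every version). A sketch on a tiny one-line stand-in for the real, much larger dump file:

```shell
# Create a one-line stand-in for the real dump (the real cur.sql is huge).
printf 'INSERT INTO `cur` VALUES (138898,0,...);\n' > mycur.sql
# Rewrite the INSERT line so wiki2static's marker1 matches it.
sed 's/INSERT INTO `cur` VALUES/INSERT INTO cur VALUES/' mycur.sql > mycur-fixed.sql
cat mycur-fixed.sql
```

Since wiki2static only anchors on the literal "INSERT INTO cur VALUES" string, rewriting that one marker is enough; the field data itself is untouched.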

I had trouble with a few rare _very_ big special pages (upload logs and
recent changes) which were more than 1 MB each and forced me to raise
the limits everywhere (Apache, PHP, MySQL; buffer sizes and timeouts => 300 s).

I also ran into a weirdness:
the table I upload is supposed to be of InnoDB type, but
when I load it into MySQL it comes out as MyISAM, and then I need
to change it to InnoDB (in phpMyAdmin, because 10 days ago I didn't know
anything about LAMP).
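The MyISAM-to-InnoDB switch can also be done without phpMyAdmin, from the mysql command-line client; a dry-run sketch, assuming a local database named wikidb (older MySQL 4.0 releases spell the clause TYPE= instead of ENGINE=):

```shell
# Convert the imported cur table to InnoDB via the mysql client.
# Shown as a dry run; remove the echo to actually execute it.
SQL='ALTER TABLE cur ENGINE=InnoDB;'
if command -v mysql >/dev/null 2>&1; then
  echo "would run: mysql wikidb -e \"$SQL\""
else
  echo "mysql client not found; run by hand: $SQL"
fi
```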

I also did some "optimisations" following what phpMyAdmin suggested.
If I'm right, there are huge speed improvements, but I don't know exactly what happened:
the special page Allpages took 5 min the first time (with the wrong MyISAM type)
and only 12 s after the InnoDB conversion and optimisations (and a reboot, to be sure
no cache was in use ;)

After that I also ran a "benchmark" with htdig (a search engine, reading the entire
encyclopedia, either the static dump or the dynamic wiki):
1 hour to index the static version (French text table, 270 MB);
an estimated 10 hours for the wiki version (I stopped it after 1 hour ;)

++
Alain