this is not likely the best place to ask this, but i'm not sure of where else
would be. i've noticed lately that a lot of non-profit organisations are
running into trouble paying their bills. the costs come mainly from
bandwidth, driven by readers migrating from mainstream sources (cnn.com) to
independent ones (indymedia.org). the problem, however, is that all this
traffic makes a non-profit site very expensive to run.
some have added advertising to the site, others have just plain shut down, but
i was wondering how difficult it would be (or if such a thing already exists)
to run a distributed webserver. something that would farm each request for a
page out to multiple low-bandwidth satellite servers... like you and me
running boxes at home. some data (say, large video files) would only be
available from one pool of sources, while other data (html, jpg, png files)
would be available from a larger set.
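to make the idea concrete, here's a minimal sketch of the pool-selection part in python. the hostnames, pool membership, and the "heavy" extension list are all made up for illustration; a real setup would load these from config:

```python
import random

# hypothetical mirror pools -- big files live on a small pool of
# well-connected hosts, small static assets are spread across a
# wider set that could include home boxes
HEAVY_POOL = ["http://mirror-a.example.org", "http://mirror-b.example.org"]
LIGHT_POOL = HEAVY_POOL + ["http://home-box-1.example.org",
                           "http://home-box-2.example.org"]

HEAVY_EXTENSIONS = {".avi", ".mpg", ".iso"}

def pick_mirror(path, healthy=None):
    """Return a full redirect URL for `path`, chosen from the pool
    that matches the file type.  `healthy`, if given, restricts the
    choice to mirrors that passed a recent health check."""
    ext = path[path.rfind("."):].lower() if "." in path else ""
    pool = HEAVY_POOL if ext in HEAVY_EXTENSIONS else LIGHT_POOL
    if healthy is not None:
        # fall back to the full pool rather than serving nothing
        pool = [m for m in pool if m in healthy] or pool
    return random.choice(pool) + path
```

the front node would just answer with an HTTP redirect to whatever `pick_mirror()` returns, so it only pays for tiny responses while the satellites carry the actual bytes.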
of course there would have to be lots of checking to make sure that a server
doesn't blow up or anything, but how doable is this? and why haven't i seen
more non-profits &amp; ngo's doing this sort of thing? personally, i wouldn't
mind giving up a little bandwidth for a couple of sites i want to support but
don't have the cash to donate to...
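the "make sure a server doesn't blow up" part can start out very simple: probe each mirror periodically and drop the ones that don't answer. a sketch, assuming a hypothetical `/health` path on each mirror:

```python
import urllib.request

def is_alive(mirror, timeout=5):
    """Probe a mirror with a cheap request; any HTTP answer below 500
    counts as alive, any network error counts as dead."""
    try:
        with urllib.request.urlopen(mirror + "/health", timeout=timeout) as resp:
            return resp.status < 500
    except OSError:
        return False
```

a cron job could run this over the pool every minute or so and hand the surviving list to the redirector; anything fancier (load reporting, content checksums so a rogue home box can't serve tampered files) can be layered on later.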
--
travel is fatal to prejudice, bigotry, and narrow-mindedness, and many of our
people need it sorely on these accounts. broad, wholesome, charitable views
of man and things cannot be acquired by vegetating in one little corner of
the earth all one's lifetime.
- mark twain
--
gentoo-user@gentoo.org mailing list