Mailing List Archive

Important Question About Recommended Environment For The Tutorial

Hi Everyone,

After talking to MST I realized that I have been way overdue getting a note
out to the group on an issue that's pretty fundamental to the tutorial (I
had some one-on-one discussions about it, but I don't think I did anything
to the whole group). Sorry for not getting something like this out to the
group sooner and sorry for this email being so long (but it's kind of a big
topic with a somewhat long history and potentially important ramifications
for the future).

Two of the big issues I have been grappling with since I first did the
tutorial almost 3 years ago are:

1) How to get an environment where users can quickly get up and running to
work through the tutorial (IOW, mostly how to avoid problems with CPAN)

2) How to have a "known good" set of modules that the tutorial should work
against.


Another key question is: "What is the purpose of the tutorial?" My thought
has been that it needs to provide a good, stable environment where newcomers
can learn the basics of Catalyst. It should try to show relatively current
best practices and features, but IMHO, having it "just work" is more
important than having the latest and greatest of everything -- if people get
frustrated in the early learning phases they will never "stick around" to
learn the finer points. Any comments on this point of view?

In terms of trying to address 1 & 2 above, my initial plan of attack was to
encourage the use of MST's really cool "cat-install" script (while still
mentioning other options like "cat in a box"). I stuck
Catalyst::Manual::Installation::CentOS4 out there as a way that people could
start from scratch with a relatively popular distro and build a Catalyst
environment. Unfortunately, using cat-install to always pull the latest
versions of everything seemed to create more problems with #1 and #2 above
than I expected. More times than not it ran into at least one module that
failed to install correctly. And, even if people did get past the install,
there was no way to know if the tutorial was now broken once they got into
the details.

I have talked to MST over the years about trying to have a way to
automatically regression test the tutorial on an ongoing basis. If such a
thing existed, it would be a great solution -- we could have a nightly
cron job automatically tell us if some new module broke either the install
*or* the tutorial itself. However, I'm not aware of any frameworks or tools
to automatically test all that. Does anyone else know of one? I spent some
time thinking about how to use some =for tags in the POD to automate the
testing, but I didn't get very far with it -- unless I'm missing something,
it seems like a big job to do it right. So far I have done it manually: I
start with a "minimal" install of Linux in VMWare, I run cat-install
following the exact directions in Catalyst::Manual::Installation::CentOS4
(almost always updating them because of some dependency or module change
requiring a hack), manually walking through every command and cutting
and pasting every chunk of code into the right place. It's obviously very
slow (even though I have done it so many times I can almost do it in my
sleep), :-) but it's the only way I know to make sure it works (at least I
know it worked the day I finished my testing... all bets were off as soon as
the underlying modules get updated the next day).
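
To sketch what I mean by markers in the POD (this is purely an illustration -- the "tutorial_test" region name is invented and nothing extracts it today; I'm also using a =begin/=end region rather than =for so multi-line chunks work):

```shell
# Mark tutorial chunks with a custom =begin/=end region (the name
# "tutorial_test" is made up for this sketch)
cat > Tutorial.pod <<'EOF'
=head1 GETTING STARTED

Run the helper script:

=begin tutorial_test

script/myapp_create.pl controller Books

=end tutorial_test

=cut
EOF

# Extract everything between the markers into a numbered chunk file
# that a smoke script could later feed to a shell or to perl
awk '/^=end tutorial_test/ {f=0} f {print} /^=begin tutorial_test/ {f=1}' \
    Tutorial.pod > chunk-001.txt

cat chunk-001.txt
```

A nightly job could then walk the chunks in order inside a fresh VM and fail loudly on the first one that errors.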

Then I happened to try Catalyst on Ubuntu 8.04 back when that first came out
earlier this year. I couldn't believe how fast and easy it was! In about 2
minutes I was able to boot the Live CD, uncomment the universe repositories
in /etc/apt/sources.list and run one apt-get command. Boom, I was done and
it "just worked". I could do the entire tutorial in that environment. No
more waiting an hour for cat-install to finish only to realize it failed on
some module 50 minutes earlier (don't get me wrong, I think cat-install is
terrific, but it does have to download, compile and test each and every
module). Yeah, Ubuntu didn't give me the latest and greatest of every
module, but there was an advantage to that -- it gave me a known environment
for the next 6 months until Ubuntu did their next release. So while it
wasn't perfect, it seemed like the lesser evil to me -- the setup was quick,
painless, and pretty darn bulletproof; Ubuntu is obviously very popular, so
lots of people are familiar with it (not that they need to know anything
about Ubuntu for the tutorial); and, because they release a new version of
Ubuntu every 6 months, it stays fairly current. All I have to do is
test and update everything every six months as a new version of Ubuntu comes
out (I'm currently working on an update for Ubuntu 8.10).
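
For anyone who wants to try it, the whole Ubuntu setup boils down to something like the following (the package names are from memory -- double-check them against the universe repository for your release before relying on this):

```shell
# Uncomment the universe repositories, refresh the package index, and
# pull Catalyst in one go (package names are from memory and may
# differ by Ubuntu release)
sudo sed -i 's/^# *\(deb .*universe.*\)/\1/' /etc/apt/sources.list
sudo apt-get update
sudo apt-get install libcatalyst-perl libcatalyst-modules-perl \
    libdbd-sqlite3-perl sqlite3
```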

Thoughts on this? Barring an automated testing methodology that really
exercises all parts of the tutorial process (the install, running the helper
commands, copying code from the pod files, etc.), it seems like something
along the lines of Ubuntu is a pretty good compromise. Especially if we
don't have a volunteer to automate the testing process. :-)

Thoughts? Comments? Suggestions?

Thanks,
Kennedy

PS -- Note that with the Ubuntu approach, we do have the option of having
them use CPAN for one or more modules if we want to get around a serious bug
and/or pull in a module that is newer than what can be found in Ubuntu universe.
And, because apt-get would do the heavy lifting of getting the 172
modules/packages installed first, the job would still be a lot faster,
simpler, and more tightly controlled than a raw build against CPAN.

PPS - Another idea I have thought about and seen discussed is having a
VMWare virtual appliance image available. It sounds like a great way to go,
but we would need to find a way to maintain it.

Re: Important Question About Recommended Environment For The Tutorial
On Mon, Dec 8, 2008 at 12:12 PM, <hkclark@gmail.com> wrote:

> Hi Everyone,
>
> After talking to MST I realized that I have been way overdue getting a note
> out to the group on an issue that's pretty fundamental to the tutorial (I
> had some one-on-one discussions about it, but I don't think I did anything
> to the whole group). Sorry for not getting something like this out to the
> group sooner and sorry for this email being so long (but it's kind of a big
> topic with a somewhat long history and potentially important ramifications
> for the future).
>
> <snip>

I forgot to mention that I haven't done any recent testing with
Task::Catalyst::Tutorial. I could see where doing something like having
cat-install create just a basic Catalyst environment and then using
T::C::Tutorial to complete the install might be less error-prone than the
method in Catalyst::Manual::Installation::CentOS4, but unless I'm missing
something, it still seems like it leaves lots of room for some of the same
issues I describe in my original note. Let me know if I'm looking at it
the wrong way.

Thanks!
Kennedy

Re: Important Question About Recommended Environment For The Tutorial
On Mon, Dec 8, 2008 at 9:12 AM, <hkclark@gmail.com> wrote:
> Hi Everyone,
>
> After talking to MST I realized that I have been way overdue getting a note
> out to the group on an issue that's pretty fundamental to the tutorial (I
> had some one-on-one discussions about it, but I don't think I did anything
> to the whole group). Sorry for not getting something like this out to the
> group sooner and sorry for this email being so long (but it's kind of a big
> topic with a somewhat long history and potentially important ramifications
> for the future).
>
> Two of the big issues I have been grappling with since I first did the
> tutorial almost 3 years ago are:
>
> 1) How to get an environment where users can quickly get up and running to
> work through the tutorial (IOW, mostly how to avoid problems with CPAN)
>
> 2) How to have a "known good" set of modules that the tutorial should work
> against.
>
>
> Another key question is: "What is the purpose of the tutorial?" My thought
> has been that it needs to provide a good, stable environment where newcomers
> can learn the basics of Catalyst. It should try to show relatively current
> best practices and features, but IMHO, having it "just work" is more
> important than having the latest and greatest of everything -- if people get
> frustrated in the early learning phases they will never "stick around" to
> learn the finer points. Any comments on this point of view?
>

I 100% agree. The biggest thing for the tutorial is to show new users
how easy it is. We cannot reasonably expect them to know how to use
CPAN (or even local::lib).

I'm a seasoned developer, and was just infuriated by the old version
of gems on Debian. It wouldn't work and was throwing a very cryptic
error message. It took about 15 minutes to solve, and if I had been
experimenting with Ruby rather than trying to install something, I
would have given up and just forgotten about it. This is what we
cannot have.

> In terms of trying to address 1 & 2 above, my initial plan of attack was to
> encourage the use of MST's really cool "cat-install" script (while still
> mentioning other options like "cat in a box"). I stuck
> Catalyst::Manual::Installation::CentOS4 out there as a way that people could
> start from scratch with a relatively popular distro and build a Catalyst
> environment. Unfortunately, using cat-install to always pull the latest
> versions of everything seemed to create more problems with #1 and #2 above
> than I expected. More times than not it ran into at least one module that
> failed to install correctly. And, even if people did get past the install,
> there was no way to know if the tutorial was now broken once they got into
> the details.
>
> I have talked to MST over the years about trying to have a way to
> automatically regression test the tutorial on an ongoing basis. If such a
> thing existed, it seems like a great solution -- we could have a nightly
> cron job automatically tell us if some new module broke either the install
> *or* the tutorial itself. However, I'm not aware of any frameworks or tools
> to automatically test all that. Does anyone else know of one? I spent some
> time thinking about how to use some =for tags in the POD to automate the
> testing, but I didn't get very far with it -- unless I'm missing something,
> it seems like a big job to do it right. So far I have done it manually: I
> start with a "minimal" install of Linux in VMWare, I run cat-install
> following the exact directions in Catalyst::Manual::Installation::CentOS4
> (almost always updating them because of some dependency or module change
> requiring a hack), manually walking through every command and cutting
> and pasting every chunk of code into the right place. It's obviously very
> slow (even though I have done it so many times I can almost do it in my
> sleep), :-) but it's the only way I know to make sure it works (at least I
> know it worked the day I finished my testing... all bets were off as soon as
> the underlying modules get updated the next day).
>

Can't we have the Cat tutorial as a CPAN dist that runs a t/ suite?
Then let CPAN Testers have at it.

> Then I happened to try Catalyst on Ubuntu 8.04 back when that first came out
> earlier this year. I couldn't believe how fast and easy it was! In about 2
> minutes I was able to boot the Live CD, uncomment the universe repositories
> in /etc/apt/sources.list and run one apt-get command. Boom, I was done and
> it "just worked". I could do the entire tutorial in that environment. No
> more waiting an hour for cat-install to finish only to realize it failed on
> some module 50 minutes earlier (don't get me wrong, I think cat-install is
> terrific, but it does have to download, compile and test each and every
> module). Yeah, Ubuntu didn't give me the latest and greatest of every
> module, but there was an advantage to that -- it gave me a known environment
> for the next 6 months until Ubuntu did their next release. So while it
> wasn't perfect, it seemed like the lesser evil to me -- the setup was quick,
> painless, and pretty darn bulletproof, Ubuntu is obviously very popular so
> lots of people are familiar with it (not that they need to know anything
> about Ubuntu for the tutorial), and, because they release a new version of
> Ubuntu every 6 months it stays fairly current, etc. All I have to do is
> test and update everything every six months as a new version of Ubuntu comes
> out (I'm currently working on an update for Ubuntu 8.10).
>

I think that starting with packages (via apt or whatever other package
manager) is fantastic, but I think there -has- to be a Task:: for
fixing any out-of-date modules that will cause problems. I guess this
is where people have to put on their volunteer hats, but with the
prevalence of various virtualization packages I think this can be
smoked and automated fairly easily. Imagine if someone could install
Task::Catalyst::Debian and it listed the modules that need to be
updated (JSON and JSON::XS come to mind). I haven't thought this idea
out in full to see how it would really work, but I really think this
is the path we will need to go down. It would also foster sentiments
of, "They care about us $distro users!". Brownie points!

> Thoughts on this? Barring an automated testing methodology that really
> exercises all parts of the tutorial process (the install, running the helper
> commands, copying code from the pod files, etc.), it seems like something
> along the lines of Ubuntu is a pretty good compromise. Especially if we
> don't have a volunteer to automate the testing process. :-)
>
> Thoughts? Comments? Suggestions?
>

Thank you very much for the write-up, it was refreshing to read!

-Jay

_______________________________________________
Catalyst-dev mailing list
Catalyst-dev@lists.scsys.co.uk
http://lists.scsys.co.uk/cgi-bin/mailman/listinfo/catalyst-dev

Re: Important Question About Recommended Environment For The Tutorial
On 8 Dec 2008, at 17:12, hkclark@gmail.com wrote:
> I have talked to MST over the years about trying to have a way to
> automatically regression test the tutorial on an ongoing basis. If
> such a thing existed, it seems like a great solution -- we could
> have a nightly cron job automatically tell us if some new module
> broke either the install *or* the tutorial itself.

Assuming that the tutorial and all of the code it uses lives in
subversion, then I've got a smoke-testing solution already setup
which could be extended a little to do (some of) this.

> However, I'm not aware of any frameworks or tools to automatically
> test all that. Does anyone else know of one? I spent some time
> thinking about how to use some =for tags in the POD to automate the
> testing, but I didn't get very far with it -- unless I'm missing
> something, it seems like a big job to do it right.

Yeah, pulling the tutorial fragments out is somewhat harder..

> So far I have done it manually: I start with a "minimal" install of
> Linux in VMWare, I run cat-install following the exact directions
> in Catalyst::Manual::Installation::CentOS4 (almost always updating
> them because of some dependency or module change requiring a hack),
> manually walking through every command and and cutting and pasting
> every chunk of code into the right place. It's obviously very slow
> (even though I have done it so many times I can almost do it in my
> sleep), :-) but it's the only way I know to make sure it works (at
> least I know it worked the day I finished my testing... all bets
> were off as soon as the underlying modules get updated the next day).

It would be nice to be able to use some of the CPAN testers smoke
test tools to do this for you, but privately - i.e. build all the
dependencies but report privately.

I'd certainly be interested in helping to work on some sort of
general project-wide smoking solution, as having an aggressive
Catalyst (and dependencies) specific testing setup would be good for
the general quality / perceived quality of the project..

We're good at (and people have put a lot of effort into) making the
dependency stack work well, but it'd be nice to be able to be
proactive rather than reactive about these issues..

> Thoughts on this? Barring an automated testing methodology that
> really exercises all parts of the tutorial process (the install,
> running the helper commands, copying code from the pod files,
> etc.), it seems like something along the lines of Ubuntu is a
> pretty good compromise. Especially if we don't have a volunteer to
> automate the testing process. :-)

I think we should try to do all of the above ;-)

> PS -- Note that with the Ubuntu approach, we do have the option of
> having them use CPAN for one or more modules if we want to get
> around a serious bug and/or pull in a module that is newer than what
> can be found in Ubuntu universe. And, because apt-get would do the
> heavy lifting of getting the 172 modules/packages installed first,
> the job would still be a lot faster, simpler, and more tightly
> controlled than a raw build against CPAN.

I'd say that you show pulling everything using apt-get, but then
building a local::lib, and install Task::Catalyst::Tutorial into it,
so that you get any modules you need updating.. Does that sound like
a sane compromise?
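
Roughly like this (untested, and assuming local::lib is already installed -- the exact bootstrap incantation is in the local::lib docs):

```shell
# Distro packages do the heavy lifting first...
sudo apt-get install libcatalyst-perl libcatalyst-modules-perl

# ...then a local::lib in ~/perl5 shadows anything the tutorial needs
# newer than the packaged versions (assumes local::lib is bootstrapped)
eval $(perl -Mlocal::lib)

# Install just the updated modules into ~/perl5, leaving the system
# perl untouched
cpan Task::Catalyst::Tutorial
```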

> PPS - Another idea I have thought about and seen discussed is
> having a VMWare virtual appliance image available. It sounds like
> a great way to go, but we would need to find a way to maintain it.

Ah, well that's easy.

You can script vmware from perl, so you just make a new disk, attach
it to a pre-existing machine, boot, login, debootstrap the new disk,
install everything and do some setup, shut down the vmware machine..

I'm not volunteering for this, as basically I don't have the hardware
to mess about with it outside of work time, but having done something
similar, I can say it's possible and not too hard..
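
As a concrete (if hypothetical) shape for it, using VMware's vmrun utility rather than the Perl bindings -- every path, credential, and guest command below is a placeholder:

```shell
# All paths, credentials, and guest scripts here are placeholders
VMX=/vm/builder/builder.vmx

vmrun start "$VMX" nogui
vmrun -gu root -gp secret runProgramInGuest "$VMX" \
    /usr/sbin/debootstrap hardy /mnt/newdisk
vmrun -gu root -gp secret runProgramInGuest "$VMX" \
    /root/install-catalyst.sh        # hypothetical setup script
vmrun stop "$VMX" soft
```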

Cheers
t0m

