Re: sorry if you got (sort of) dup mail

: > It would be extremely difficult unless all your Unix utilities were
: > already available as relocatable or shared code. It would also need
: > multi-threading of some sort to emulate pipes. And unless everything
: > was a shared library, you wouldn't get to share program memory between
: > different processes running the same Unix utility, so performance
: > might not be so great.
: >
: > Larry
: >
:
: Yes -- but I am not advocating even *touching* the Unix utilities.
: I am advocating putting the Unix utilities used in the code, lock,
: stock and barrel, into the byte-compiled executable itself (including
: 'sh', to run the executables, and a small lookup table that knows
: where in the code to find a given command).

Er. It would be *extremely difficult* unless all your Unix utilities were
*already* available as *relocatable* or shared code.

I wasn't advocating touching them either. The simple fact is that for
most Unix utilities, the relocation information is already stripped, so
what you want is effectively impossible (which is, I guess, one variety
of "extremely difficult").

It might just be barely possible if you moved Perl up to high memory
and treated all the executables as overlays sharing the same virtual
address. But emulating the OS is bound to be exceedingly tedious, not
to mention slow. How would you keep your emulated pipelines from
deadlocking? (You can't hope to run your subprocesses to completion
all the time.) How do you share signals? What do you do if one of
your subprocesses gets a signal that can't be caught? Where do you
keep everybody's heap, and stack? What if your Unix executables depend
on shared libraries, as most of them do nowadays?

Urque.
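
(For concreteness, a minimal sketch in present-day Perl, not anything
from the thread itself, of why both stages of an emulated
"stage1 | stage2" have to run at the same time. If the producer below
were run to completion before the consumer was started, it would block
for good as soon as the kernel's pipe buffer filled.)

    #!/usr/bin/perl
    # Sketch only: the producer writes far more than one pipe buffer's
    # worth of data, so it can only finish if a reader is draining the
    # pipe concurrently.  Remove the reader and you have the deadlock.

    pipe(my $rd, my $wr) or die "pipe: $!";

    if (my $pid = fork) {
        # Parent plays the producing stage.
        close $rd;
        print {$wr} ("x" x 1024) . "\n" for 1 .. 10_000;
        close $wr;
        waitpid $pid, 0;
    } else {
        # Child plays the consuming stage, reading until EOF.
        defined $pid or die "fork: $!";
        close $wr;
        1 while <$rd>;
        exit 0;
    }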

Alternately, you could copy all the embedded files to disk and execute
them as real processes. Leaving aside some vicious licensing issues
for the moment, this would run like a pig. Each *invocation* of the
program would have the overhead of "untarring" all its embedded files.
It would have to find disk space for its files. For each invocation.
It would run separate executables to do the same thing, so different
invocations of the same program would not be able to share program code
in memory.
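
(A rough sketch of what that de-embed-and-exec scheme looks like, with
placeholder strings standing in for the real bundled binaries; the
unpacking loop and the scratch directory are the per-invocation costs
in question:)

    #!/usr/bin/perl
    # Sketch only: unpack the bundled utilities to scratch space, then
    # run them as ordinary processes.  Every invocation repeats this.

    use File::Temp qw(tempdir);

    my %embedded = (                    # placeholder payloads; a real
        ls  => "<binary image of ls>",  # bundle would carry executables
        cat => "<binary image of cat>",
    );

    my $dir = tempdir(CLEANUP => 1);    # fresh disk space, every run

    for my $name (keys %embedded) {     # the "untarring" overhead
        open my $fh, '>', "$dir/$name" or die "$dir/$name: $!";
        binmode $fh;
        print {$fh} $embedded{$name};
        close $fh or die "close: $!";
        chmod 0755, "$dir/$name";
    }

    # Each command then runs as a separate real process, so concurrent
    # invocations of the same script share no program text in memory.
    system("$dir/ls", "-l") == 0 or warn "embedded ls failed: $?";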

Even if you could find some way to share such de-embedded executables, you're
potentially forcing every installation that uses your programs to maintain
every version of nearly every Unix utility on disk. I know disk is
getting cheaper, but not that fast.

The cost is too high for the benefit. I see no way to reduce the cost.
Sorry 'bout that...

Larry

Re: sorry if you got (sort of) dup mail

: Hmm. Then I would say that one of two things should happen.
:
: 1) Function stubs should be made for the common Unix commands (like
: ls and cat and pwd) and included (as options?) in perl. I would think
: that could be done, with admittedly some pain, by looking at
: public-domain versions. I would also think that this would be a
: speedup over the current 'system' or '``' implementation.
:
: 2) The Unix utilities themselves should be rewritten as Perl
: functions. I have already seen an ls done this way (and it was
: horrid), and this would be probably
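
(The stubs proposed in 1) might look roughly like the following sketch:
plain Perl stand-ins for pwd, cat, and ls that never start an external
process or an sh. Nothing like this ships with perl; it is only an
illustration.)

    #!/usr/bin/perl
    # Sketch only: pure-Perl replacements for a few common commands.

    use Cwd qw(getcwd);

    # pwd: ask for the current directory directly.
    sub my_pwd { return getcwd() }

    # cat: slurp and concatenate the named files.
    sub my_cat {
        my $out = '';
        for my $file (@_) {
            open my $fh, '<', $file or die "$file: $!";
            local $/;                  # slurp mode
            $out .= <$fh>;
            close $fh;
        }
        return $out;
    }

    # ls: return the non-dot entries of a directory, sorted.
    sub my_ls {
        my $dir = shift || '.';
        opendir my $dh, $dir or die "$dir: $!";
        my @entries = sort grep { !/^\./ } readdir $dh;
        closedir $dh;
        return @entries;
    }

    print my_pwd(), "\n";
    print join("\n", my_ls('.')), "\n";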

Certainly, portable interfaces should be made available for those who
want to do portable programming. It's just that it's a little hard to
tell Unix programmers they can't use Unix... :-\

Larry