Mailing List Archive

Best video capture resolution for output to TV?
Hi All,

I have a fairly middle of the range JVC TV set that will be my ultimate
display for my Myth box. I wanted to get some thoughts on what the optimum
capture resolution would be, being that TV is the main viewing
platform. My PVR 250 can do up to 720x480 but I have a feeling anything
over 400-ish is probably overkill because I won't see it on the TV. Thoughts?

-Jeff
RE: Best video capture resolution for output to TV? [ In reply to ]
I've played with settings a lot, and I've found I can't really tell a
difference between 640x480 and 320x480. I record at MPEG4 320x480 4400
bitrate. This gives me quality in line with a regular cable connection
and takes about 1Gig per hour of recording. Watching LiveTV uses about
33% of my Athlon XP 2000. This leaves me plenty of processor for a
second recording. Watchin Live TV and Recording anoth Show uses about
75% processor. When I watch TV, Record and Play a movie on a remote
front end, I'm pretty much max out my CPU.

Also worth noting is that I have my Live TV buffering on a separate disk
from my recorded TV storage.

My system is Athlon XP 2000+, 256 DDR, 80G 8MB Cache ATA133, 40G 2MB
Cache ATA 100
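[Editor's note: the storage figure above can be sanity-checked with a little arithmetic. A rough sketch, assuming the "4400" is an average bitrate in kbit/s, which is how MythTV's MPEG4 settings are usually quoted:]

```python
# Disk usage for a constant-bitrate capture.
# Assumption: "4400" is an average bitrate in kbit/s (kbit = 1000 bits).

def gb_per_hour(kbits_per_sec: float) -> float:
    """Gigabytes of disk used per hour of recording."""
    bits_per_hour = kbits_per_sec * 1000 * 3600  # bits recorded in one hour
    return bits_per_hour / 8 / 1e9               # bits -> bytes -> GB

print(round(gb_per_hour(4400), 2))  # 1.98
```

Taken at face value, 4400 kbit/s works out to nearly 2 GB/hour, so the ~1 GB/hour observed here suggests the encoder's effective average bitrate is well under that cap.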

Ben (Brown)

-----Original Message-----
From: mythtv-users-bounces@snowman.net
[mailto:mythtv-users-bounces@snowman.net] On Behalf Of Jeff C
Sent: Thursday, May 15, 2003 11:26 PM
To: mythtv-users@snowman.net
Subject: [mythtv-users] Best video capture resolution for output to TV?


Hi All,

I have a fairly middle of the range JVC TV set that will be my ultimate
display for my Myth box. I wanted to get some thoughts on what the
optimum
capture resolution would be, being that TV is the main viewing
platform. My PVR 250 can do up to 720x480 but I have a feeling anything
over 400-ish is probably overkill because I won't see it on the TV.
Thoughts?

-Jeff


_______________________________________________
mythtv-users mailing list
mythtv-users@snowman.net
http://lists.snowman.net/cgi-bin/mailman/listinfo/mythtv-users
Re: Best video capture resolution for output to TV? [ In reply to ]
> Van: Ben Brown <ben@handcoder.com>
>
> Watching LiveTV uses about 33% of my Athlon XP 2000.

Are you sure you have set up X correctly? AFAIK that is quite a high value.
What's the video card, mobo (chipset), system memory?

Henk Poley <><
RE: Best video capture resolution for output to TV? [ In reply to ]
Really? I thought it was fairly low. One thing I forgot to mention:
I'm running with debug; it runs at about 22% without debug. Video is a
Geforce 2 MX, motherboard is an Epox 8kmm, 256M RAM. Also, I'm not
talking about mythbackend alone, I'm referencing total machine usage.

-----Original Message-----
From: mythtv-users-bounces@snowman.net
[mailto:mythtv-users-bounces@snowman.net] On Behalf Of Henk Poley
Sent: Friday, May 16, 2003 12:50 AM
To: Discussion about mythtv
Subject: Re: [mythtv-users] Best video capture resolution for output to
TV?


> Van: Ben Brown <ben@handcoder.com>
>
> Watching LiveTV uses about 33% of my Athlon XP 2000.

Are you sure you have set up X correctly? AFAIK that is quite a high
value. What's the video card, mobo (chipset), system memory?

Henk Poley <><
Re: Best video capture resolution for output to TV? [ In reply to ]
On Fri, May 16, 2003 at 12:25:55AM -0300, Jeff C wrote:
> I have a fairly middle of the range JVC TV set that will be my ultimate
> display for my Myth box. I wanted to get some thoughts on what the optimum
> capture resolution would be, being that TV is the main viewing
> platform. My PVR 250 can do up to 720x480 but I have a feeling anything
> over 400-ish is probably overkill because I won't see it on the TV.
> Thoughts?

It all depends on your personal preference. Some people think VHS
quality is good enough, while some people won't settle for anything less
than DVD quality (full NTSC resolution). If it were me, I would be
recording at 720x480, but I want the best quality I can get.

Also, keep in mind that the smaller the TV, the lower resolution you can
get away with. 480x480 on a 32" TV will look terrible, but 480x480 on a
20" TV might not...

Mike.
Re: Best video capture resolution for output to TV? [ In reply to ]
At 06:49 AM 5/16/2003 +0200, Henk Poley wrote:
> > Van: Ben Brown <ben@handcoder.com>
> >
> > Watching LiveTV uses about 33% of my Athlon XP 2000.
>
>Are you sure you have setup X correctly? AFAIK that is a quite high value.
>What's the video card, mobo (chipset), system memory?

Is this just a misunderstanding about live versus "live"?

If one were really watching live TV (using xawtv, for example), this would
be very high CPU use ... I'd expect to see about 2%, not 33%, in this instance.

But Myth's "live" TV isn't *really* live -- it's buffered, which means the
system is both recording (encoding) and playing back (decoding) all the
time. For that activity, 33% CPU use seems quite moderate with this
hardware. My system (Cel 1.7 GHz) runs at about 85% CPU use for "live" TV
(now that it has an xv-capable video card), a value roughly consistent with
what I see if the system is recording in the background while I watch a
different, previously recorded show. (This using MPEG4, otherwise the Myth
default values.)
RE: Re: Best video capture resolution for output to TV? [ In reply to ]
> -----Original Message-----
> From: mythtv-users-bounces@snowman.net
> [mailto:mythtv-users-bounces@snowman.net]On Behalf Of Mike Frisch
> Sent: Friday, May 16, 2003 10:14 AM
> To: mythtv-users@snowman.net
> Subject: [mythtv-users] Re: Best video capture resolution for output to
> TV?
>
>
> It all depends on your personal preference. Some people think VHS
> quality is good enough, while some people won't settle for anything less
> than DVD quality (full NTSC resolution). If it were me, I would be
> recording at 720x480, but I want the best quality I can get.
>


Why capture at 720x480 when NTSC resolution is not that high? Anything
beyond 640x480 is a waste (actually, even 480x480 is just fine... see below)


> Also, keep in mind that the smaller the TV, the lower resolution you can
> get away with. 480x480 on a 32" TV will look terrible, but 480x480 on a
> 20" TV might not...

480x480 looks great on my 60" TV... I can hardly tell the difference from
640x480.

-JAC
RE: Re: Best video capture resolution for output to TV? [ In reply to ]
At 11:06 AM 5/16/2003 -0400, Joseph A. Caputo wrote:
[...]
> > It all depends on your personal preference. Some people think VHS
> > quality is good enough, while some people won't settle for anything less
> > than DVD quality (full NTSC resolution). If it were me, I would be
> > recording at 720x480, but I want the best quality I can get.
>
>Why capture at 720x480 when NTSC resolution is not that high? Anything
>beyond 640x480 is a waste (actually, even 480x480 is just fine... see below)

Please forgive my confusion here ...but in what sense does NTSC even *have*
a horizontal "resolution"? I understand that the vertical resolution -- 480
real, out of 525 theoretical, lines, in two interlaced frames -- is well
defined. But I thought the horizontal signal was continuous, not discrete,
making the relevant question the appropriate rate at which to *sample* it
for digitizing. (There is probably a theoretical answer to the
sampling-rate question too, but I've never seen it explained.)
RE: Re: Best video capture resolution for output to TV? [ In reply to ]
> Why capture at 720x480 when NTSC resolution is not
> that high? Anything
> beyond 640x480 is a waste (actually, even 480x480 is
> just fine... see below)

What resolution do NTSC analog cable channels come in
at? I know they are supposed to be less than say the
digital quality channels you get over satellite. I
have Comcast Digital Cable and the last technician I
talked to said your first 80 or so channels are just
analog and of lesser quality then the rest of the
channels available. Can anyone elaborate on what
those differences are in terms of resolution?


> 480x480 looks great on my 60" TV... I can hardly
> tell the difference from
> 640x480.

I agree ... I only record the analog channels and
480x480 seems fine to me. I used to record at 720x480
but found no improvement with the higher resolution.
Of course I didn't change the bitrate when I lowered
the resolution, so technically the picture would be
better on the lower res capture.

__________________________________
Do you Yahoo!?
The New Yahoo! Search - Faster. Easier. Bingo.
http://search.yahoo.com
RE: Re: Best video capture resolution for output to TV? [ In reply to ]
Sorry, I misspoke...

Based on what I've read in the archives, 640 is a practical maximum width
for capturing NTSC. Anything higher will simply emphasize the shortcomings
of the NTSC source (be it broadcast or VHS). Just because you're capturing
at 720x480 doesn't mean you're getting a DVD-quality picture. The same
applies to PAL, with slightly different numbers.

Basically, crap in == crap out. Capturing at 720x480 would probably just
highlight some of the crappiness that is less noticeable at lower
resolutions.

Again, this is based on previous posts in the archives. I have no
first-hand knowledge in this area, other than to say that I see very little
(if any) quality difference between 640x480 and 480x480 on my 60" TV.

-JAC

> -----Original Message-----
> From: mythtv-users-bounces@snowman.net
> [mailto:mythtv-users-bounces@snowman.net]On Behalf Of Ray Olszewski
> Sent: Friday, May 16, 2003 11:39 AM
> To: Discussion about mythtv
> Subject: RE: [mythtv-users] Re: Best video capture resolution for output
> to TV?
>
>
> Please forgive my confusion here ...but in what sense does NTSC
> even *have*
> a horizontal "resolution"? I understand that the vertical
> resolution -- 480
> real, out of 525 theoretical, lines, in two interlaced frames -- is well
> defined. But I thought the horizontal signal was continuous, not
> discrete,
> making the relevant question the appropriate rate at which to *sample* it
> for digitizing. (There is probably a theoretical answer to the
> sampling-rate question too, but I've never seen it explained.)
Re: Re: Best video capture resolution for output to TV? [ In reply to ]
On Fri, May 16, 2003 at 08:39:29AM -0700, Ray Olszewski wrote:
> Please forgive my confusion here ...but in what sense does NTSC even *have*
> a horizontal "resolution"?

It must have a horizontal resolution when it's being stored in a digital
medium. Look at DVD, for example.

> I understand that the vertical resolution -- 480 real, out of 525
> theoretical, lines, in two interlaced frames -- is well defined. But I
> thought the horizontal signal was continuous, not discrete, making the
> relevant question the appropriate rate at which to *sample* it for
> digitizing. (There is probably a theoretical answer to the
> sampling-rate question too, but I've never seen it explained.)

Sorry, I cannot comment here. I do not know the mathematics involved in
this approximation.
Re: Re: Best video capture resolution for output to TV? [ In reply to ]
On Fri, May 16, 2003 at 08:51:10AM -0700, Allen T. Gilliland IV wrote:
> > Why capture at 720x480 when NTSC resolution is not
> > that high? Anything
> > beyond 640x480 is a waste (actually, even 480x480 is
> > just fine... see below)
>
> What resolution do NTSC analog cable channels come in
> at? I know they are supposed to be less than say the
> digital quality channels you get over satellite.

I believe it has been approximated to be ~300 lines.
RE: Re: Re: Best video capture resolution for output to TV? [ In reply to ]
It's something like 240 scan lines times 2 interlaced fields = 480 scan
lines.
(Actually, I think it's more than that, but the extra lines contain stuff
like the VBI data)

The confusing issue here is what defines the horizontal resolution? The
technical answer seems to be "as high as you want to sample it", however
there is a practical limit beyond which there is no visible difference in
picture quality. This is due to the relatively poor quality (compared to
digital) of the source signal as well as the limitations of a typical NTSC
television set (unless you have an HDTV & a DVI output on your frontend).

-JAC

> -----Original Message-----
> From: mythtv-users-bounces@snowman.net
> [mailto:mythtv-users-bounces@snowman.net]On Behalf Of Mike Frisch
> Sent: Friday, May 16, 2003 12:17 PM
> To: Discussion about mythtv
> Subject: [mythtv-users] Re: Re: Best video capture resolution for output
> to TV?
>
>
> > What resolution do NTSC analog cable channels come in
> > at? I know they are supposed to be less than say the
> > digital quality channels you get over satellite.
>
> I believe it has been approximated to be ~300 lines.
Re: Re: Re: Best video capture resolution for output to TV? [ In reply to ]
At 12:12 PM 5/16/2003 -0400, Mike Frisch wrote:
>On Fri, May 16, 2003 at 08:39:29AM -0700, Ray Olszewski wrote:
> > Please forgive my confusion here ...but in what sense does NTSC even
> *have*
> > a horizontal "resolution"?
>
>It must have a horizontal resolution when it's being stored in a digital
>medium. Look at DVD, for example.

You are correct, of course. I meant to pose my request for clarification in
the context of what MythTV actually does, which is to digitize an analog
signal (I think it's even analog if you have digital service, since the
capture is then done over a Composite, or maybe sVideo, feed from the
digital-interface box). In that respect, Joseph's response was to the point.

But his reply got me wondering ... to what extent is the benefit of
increased horizontal dot density limited by encoding parameters? For
example, if the encoding quality is set at, say, 10 Mb/minute, I'd expect
that increasing horizontal dot density will at some point hit the limit of
encoding quality (or maybe CPU speed, but let's put this part aside for
now). In such a case (if the CPU permits it), would increasing encoding
quality to, say, 20 Mb/minute allow increased horizontal dot density to
show improved quality? At what point does the resolution of the NTSC signal
*itself* impose a limit?
Re: Re: Re: Best video capture resolution for output to TV? [ In reply to ]
Well ... based on my understanding of sampling there
is no actual limit to the amount of sampling you can
do of an analog signal. So for the purposes of our
resolution debate I would say that the higher your
horiz resolution is the better.

Of course there are mathematical formulas which detail
the suggested number of samples necessary to represent
an analog signal. And practically speaking you
wouldn't want to exceed 720 because your playback
image would be in a strange ratio.

> But his reply got me wondering ... to what extent is the benefit of
> increased horizontal dot density limited by encoding parameters? For
> example, if the encoding quality is set at, say, 10 Mb/minute, I'd expect
> that increasing horizontal dot density will at some point hit the limit
> of encoding quality (or maybe CPU speed, but let's put this part aside
> for now). In such a case (if the CPU permits it), would increasing
> encoding quality to, say, 20 Mb/minute allow increased horizontal dot
> density to show improved quality? At what point does the resolution of
> the NTSC signal *itself* impose a limit?

I don't think there is any limit on the number of
samples you can take of an analog signal. But of
course the higher the horiz res (dot density) the more
pixels you have per frame in your video. And so to
maintain quality you would have to increase your
bitrate as you increase resolution. I am sure at some
point your computer wouldn't be able to handle all the
data ... at least not in real time.

RE: Re: Re: Best video capture resolution for outputto TV? [ In reply to ]
> -----Original Message-----
> From: mythtv-users-bounces@snowman.net
> [mailto:mythtv-users-bounces@snowman.net]On Behalf Of Allen T. Gilliland
> IV
> Sent: Friday, May 16, 2003 2:35 PM
> To: Discussion about mythtv
> Subject: Re: [mythtv-users] Re: Re: Best video capture resolution for
> outputto TV?
>
>
> Well ... based on my understanding of sampling there
> is no actual limit to the amount of sampling you can
> do of an analog signal. So for the purposes of our
> resolution debate I would say that the higher your
> horiz resolution is the better.
>

This is assuming the quality of the analog data is perfect. At some point,
the quality of your capture will either (a) contain more precision than is
necessary to 100% faithfully represent the original image (due to signal
noise, quality of the videotape at the transmitting end, etc), and/or (b)
exceed the ability of the TV to represent the full quality of the
captured image. If I'm not mistaken, TVs also display pixels, although
they're not perfectly round like a CRT's. So there's really no point in
capturing more pixels across than your TV can show.

Again, what's the good in having a full-DVD resolution representation of a
grainy image?

Just my $.02

-JAC
Re: Re: Re: Best video capture resolution for outputto TV? [ In reply to ]
On Fri, May 16, 2003 at 03:15:06PM -0400, Joseph A. Caputo wrote:
> So there's really no point in capturing more pixels across than your
> TV can show.

Agreed, but anybody with a modern TV can display a full resolution NTSC
signal.

> Again, what's the good in having a full-DVD resolution representation of a
> grainy image?

For me, it's simply being able to eke out that last bit of signal
quality. I generally have not fooled around with lower resolutions.

Mike.
Re: Re: Re: Re: Best video capture resolution for outputto TV? [ In reply to ]
Since I started this thread, I'll toss in that now that I have TV out
working, my 320x480 recordings look about the same to me as my 720x480s.
FWIW...

-Jeff

>For me, it's simply being able to eke out that last bit of signal
>quality. I generally have not fooled around with lower resolutions.
>
>Mike.
Re: Re: Best video capture resolution for output to TV? [ In reply to ]
Ray Olszewski wrote:
> At 11:06 AM 5/16/2003 -0400, Joseph A. Caputo wrote:
> [...]
>
>> > It all depends on your personal preference. Some people think VHS
>> > quality is good enough, while some people won't settle for anything
>> less
>> > than DVD quality (full NTSC resolution). If it were me, I would be
>> > recording at 720x480, but I want the best quality I can get.
>>
>> Why capture at 720x480 when NTSC resolution is not that high? Anything
>> beyond 640x480 is a waste (actually, even 480x480 is just fine... see
>> below)
>
>
> Please forgive my confusion here ...but in what sense does NTSC even
> *have* a horizontal "resolution"? I understand that the vertical
> resolution -- 480 real, out of 525 theoretical, lines, in two interlaced
> frames -- is well defined. But I thought the horizontal signal was
> continuous, not discrete, making the relevant question the appropriate
> rate at which to *sample* it for digitizing.

Exactly. Each scan line is an analog wave. The faster the
clock is set to latch on samples for the capture, the more
samples per scan line. The higher the sample rate, the more
detailed the digital representation of the analog signal.
However, there is the law of diminishing returns and there
is a trade off for CPU and disk space.


Perceived Quality

 10 |
  9 |                         *   *   *   *   *   *   *
  8 |                     *
  7 |                 *
  6 |             *
  5 |         *
  4 |     *
  3 | *
  2 |
  1 |
    +----------------------------------------------------
      240 280 320 360 400 440 480 520 560 600 640 680 720
                     Horizontal resolution

If recordings look okay at 352x480 but better at 720x480,
does it look enough better to justify files twice as big
and only half the maximum recording time? TiVo maxes out
at 544x480.
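[Editor's note: the "files twice as big" figure checks out at a fixed quality-per-pixel budget; a quick sketch (a simplification, since real encoders don't scale perfectly linearly with resolution):]

```python
# Relative file size at a fixed bits-per-pixel budget is just the
# ratio of pixel counts.

def pixels(width: int, height: int) -> int:
    return width * height

size_ratio = pixels(720, 480) / pixels(352, 480)
print(round(size_ratio, 2))  # 2.05 -- i.e., files roughly twice as big
```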

Signal quality is also a factor. A DVD player or digital
channel from a digital cable box over s-video may look
slightly better at 640 or 720 but for broadcast TV,
anything over 480 looks pretty much the same.

> (There is probably a theoretical answer to the sampling-
> rate question too, but I've never seen it explained.)

This doesn't cover digital sampling rates but has gads
of authoritative information about how TV signals work:
http://www.ntsc-tv.com/ . One relevant section is:

http://www.ntsc-tv.com/ntsc-index-04.htm

which shows test patterns for approximating the effective
horizontal resolution of a display device and how Kell
factor limits the maximum horizontal resolution of a
display.
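[Editor's note: on the theoretical sampling-rate question, a back-of-the-envelope Nyquist estimate. The 4.2 MHz and 52.6 µs figures are the commonly cited NTSC luma bandwidth and active-line duration, not from this thread:]

```python
# Nyquist estimate for NTSC horizontal sampling.
LUMA_BANDWIDTH_HZ = 4.2e6   # approximate NTSC luma bandwidth
ACTIVE_LINE_S = 52.6e-6     # approximate active portion of one scan line

nyquist_rate = 2 * LUMA_BANDWIDTH_HZ         # minimum sample rate, Hz
min_samples = nyquist_rate * ACTIVE_LINE_S   # minimum samples per active line
print(round(min_samples))  # 442

# For comparison, ITU-R BT.601 samples at 13.5 MHz, which over the
# active line gives roughly the familiar 720 samples.
print(round(13.5e6 * ACTIVE_LINE_S))  # 710
```

So anything above roughly 440 samples per line captures all the detail an NTSC luma signal can carry, which is consistent with the observation that resolutions beyond ~480 look much the same for broadcast sources.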

-- bjm
Re: Re: Best video capture resolution for output to TV? [ In reply to ]
Joseph A. Caputo wrote:
> Sorry, I misspoke...
>
> Based on what I've read in the archives, 640 is a practical maximum width
> for capturing NTSC. Anything higher will simply emphasize the shortcomings
> of the NTSC source (be it broadcast or VHS). Just because you're capturing
> at 720x480 doesn't mean you're getting a DVD-quality picture. The same
> applies to PAL, with slightly different numbers.

True. Taking the thought further, there is nothing magic
about 640 either. The only thing unique about it is that
640x480 happens to be 4:3 without adjusting the aspect
ratio. For sampling, it's just another clock rate like
720 or 544. For some (bad ;-) signal/display combinations,
beyond 480 may be just capturing more crappiness. For
clean signals and a high def display, 1024 might look a
little better but bt8x8 chips don't go beyond 720.

-- bjm
Re: Re: Re: Best video capture resolution for output to TV? [ In reply to ]
Mike Frisch wrote:
> On Fri, May 16, 2003 at 08:39:29AM -0700, Ray Olszewski wrote:
>
>>Please forgive my confusion here ...but in what sense does NTSC even *have*
>>a horizontal "resolution"?
>
>
> It must have a horizontal resolution when it's being stored in a digital
> medium. Look at DVD, for example.

A DVD or MythTV file or any digital recording has to have
a horizontal sample resolution. However, television signals
are analog. The amplitude ramps up and down with no specific
digital rate. Ray is correct that *NTSC* does not have a
horizontal resolution.

-- bjm
Re: Re: Re: Best video capture resolution for output to TV? [ In reply to ]
Ray Olszewski wrote:
...
> But his reply got me wondering ... to what extent is the benefit of
> increased horizontal dot density limited by encoding parameters? For
> example, if the encoding quality is set at, say, 10 Mb/minute, I'd
> expect that increasing horizontal dot density will at some point hit the
> limit of encoding quality (or maybe CPU speed, but let's put this part
> aside for now). In such a case (if the CPU permits it), would increasing
> encoding quality to, say, 20 Mb/minute allow increased horizontal dot
> density to show improved quality? At what point does the resolution of
> the NTSC signal *itself* impose a limit?

Good point (if I understood you correctly =). The bit rate
also has an impact on how much detail is preserved during
compression. In testing I found that given a medium res and
medium bit rate, raising the bitrate improved the picture
quality more than raising the resolution to hit a target
file size.

> At what point does the resolution of the NTSC signal
> *itself* impose a limit?

You lost me on this question. However, you may want to
walk up to your TV and look at it from a few inches away.
You'll see that the picture isn't nearly as good as our
eyes are fooled into believing when we watch from a distance.
There are limits to the acuity of the eye so broadcast
equipment and TVs are designed to a tolerance where flaws
aren't apparent from a viewing distance of about seven
times the screen height away.

-- bjm
Re: Re: Re: Best video capture resolution for output to TV? [ In reply to ]
Allen T. Gilliland IV wrote:
> Well ... based on my understanding of sampling there
> is no actual limit to the amount of sampling you can
> do of an analog signal. So for the purposes of our
> resolution debate I would say that the higher your
> horiz resolution is the better.

At the expense of CPU and disk space.

> Of course there are mathematical formulas which detail
> the suggested number of samples necessary to represent
> an analog signal. And practically speaking you
> wouldn't want to exceed 720 because your playback
> image would be in a strange ratio.

But 480x480 or 544x480 aren't 4:3 either. MythTV scales
to fullscreen so assuming your TV or monitor is 4:3,
any recording resolution will look correct. For TV in a
window, there is a Setup->Playback checkbox to force
4:3 regardless of the recording resolution.
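[Editor's note: a small sketch of why non-4:3 capture sizes still display correctly. The player scales to the 4:3 screen, which implies a non-square pixel aspect ratio (PAR) in the stored file. This is my own illustration, not MythTV code:]

```python
# PAR = how wide one stored pixel must be drawn, relative to its
# height, so that the frame fills a 4:3 display without distortion.

def pixel_aspect_ratio(width: int, height: int,
                       display_aspect: float = 4 / 3) -> float:
    return display_aspect / (width / height)

for w, h in [(640, 480), (720, 480), (544, 480), (480, 480)]:
    print(f"{w}x{h}: PAR = {pixel_aspect_ratio(w, h):.3f}")
# 640x480 gives PAR 1.000 (square pixels); 480x480 gives 1.333.
```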

-- bjm
Re: Re: Re: Best video capture resolution for output to TV? [ In reply to ]
> Good point (if I understood you correctly =). The bit rate
> also has an impact on how much detail is preserved during
> compression. In testing I found that given a medium res and
> medium bit rate, raising the bitrate improved the picture
> quality more than raising the resolution to hit a target
> file size.

Actually, the bitrate is exactly what controls the
quality of your recording. Basically, uncompressed
video = video at max bitrate. Essentially, when you
compress your videos you can think of it in terms of
compressing each frame individually. The bitrate
controls how much data or storage space you are
willing to commit to a single second of video. So if
your bitrate is 3000 kbit/s @ 30 fps, then you are
using 100 kbits per frame.

Resolution is a factor because it affects your bits
per pixel. So obviously taking 2 images, 480x480 and
720x480 each only allowed 100 kbits (because of the
bitrate you chose) then the smaller image is going to
come out better because it is not compressed as much.
So when you are choosing your settings you should
consider how many bpp each frame will have, the higher
the better.

Now obviously compression is tricky business and there
are many more factors involved, but bitrate/bpp is the
most important variable.
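[Editor's note: the bits-per-frame and bits-per-pixel arithmetic above, written out. Assumes kbit = 1000 bits and a constant bitrate:]

```python
# 3000 kbit/s at 30 fps -> 100 kbits per frame; dividing by pixel
# count gives the bits-per-pixel budget for each resolution.

def bits_per_frame(kbit_per_s: float, fps: float) -> float:
    return kbit_per_s * 1000 / fps

def bits_per_pixel(kbit_per_s: float, fps: float, w: int, h: int) -> float:
    return bits_per_frame(kbit_per_s, fps) / (w * h)

print(bits_per_frame(3000, 30))                       # 100000.0
print(round(bits_per_pixel(3000, 30, 480, 480), 3))   # 0.434
print(round(bits_per_pixel(3000, 30, 720, 480), 3))   # 0.289
```

At the same bitrate, the 720x480 frame gets about a third fewer bits per pixel, which is the compression penalty the post describes.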

__________________________________
Do you Yahoo!?
The New Yahoo! Search - Faster. Easier. Bingo.
http://search.yahoo.com
Re: Re: Re: Best video capture resolution for output to TV? [ In reply to ]
At 04:12 PM 5/16/2003 -0700, Bruce Markey wrote:
[...]
>Allen T. Gilliland IV wrote:
>>And practically speaking you
>>wouldn't want to exceed 720 because your playback
>>image would be in a strange ratio.
>
>But 480x480 or 544x480 aren't 4:3 either. MythTV scales
>to fullscreen so assuming your TV or monitor is 4:3,
>any recording resolution will look correct. For TV in a
>window, there is a Setup->Playback checkbox to force
>4:3 regardless of the recording resolution.

This raises a related but distinct issue -- to what extent does encoding in
a non-4:3 ratio complicate playback? I ask because I've noticed that on my
Myth system, if I run xine and play back DivX video I've encoded at a 4:3
ratio, playback uses about 5% of CPU. But if in Myth I play back video that
Myth has recorded at its default 1:1 ratio (480x480), rescaling it to 4:3, the
decoding uses almost 35% of CPU.

Why the difference? Is it connected to the rescaling? Is the Myth codec
just that much less efficient than xine's? Is the larger image size
(480x480 in Myth, 320x240 in xine, so Myth is 3 times the pixels of xine)
enough to cause a 7x increase in CPU use? Is it a combination of all these
things?
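[Editor's note: the puzzle in numbers, using the figures quoted above. Pixel count alone explains a 3x difference, so roughly a further 2.3x must come from rescaling, codec efficiency, or both:]

```python
# Compare pixel counts against the observed CPU usage ratio.
myth_pixels = 480 * 480   # MythTV recording
xine_pixels = 320 * 240   # DivX file played in xine

pixel_ratio = myth_pixels / xine_pixels  # 3.0
cpu_ratio = 35 / 5                       # ~35% vs ~5% CPU

print(pixel_ratio)                          # 3.0
print(round(cpu_ratio / pixel_ratio, 2))    # 2.33 -- factor not explained by pixels
```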
