Mailing List Archive

Enlighten me about 1080 "interlace"
I thought I understood this interlace stuff but Myth
is challenging my concept of interlace vs.
non-interlace video.

I have a TV capable of displaying 1920x1080
interlaced. It's a CRT based RPTV from Pioneer, if
you care. I have watched terrestrial broadcasts on
HDTV and it looks breathtaking. The image consists of
two alternating 540-line frames, each 1/60 second,
displayed at even and odd scan lines for 1080 unique
lines of information.

Now I get my Myth box running and it looks very, very
good. But not quite as good as my TV with a direct
OTA receiver. Why not? I've set everything up as
carefully as possible. I'm on stock 0.16 Myth, I have
an Intel 3GHz processor running an nVidia 5500 card
with Xv. The modeline is correct and the card is
doing interlaced out, as evidenced by the flickering
on 1-pixel lines with an xterm on screen.
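
For reference, the modeline is essentially the commonly posted ATSC
1080i timing -- I'm quoting it here from memory, so treat the exact
sync and porch numbers as a sketch rather than gospel:

  # 1920x1080 interlaced, 60 Hz field rate: 74.25 MHz pixel clock,
  # 2200x1125 total raster (33.75 kHz horizontal)
  ModeLine "1920x1080i" 74.250  1920 2008 2052 2200  1080 1084 1094 1125  Interlace +HSync +VSync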

Some people on the list whose advice I trust very much
have suggested that 540p and 1080i output are
_identical_ as played by Myth. This certainly seems
to be true. I have played with the "deinterlace"
setting and the "bob and weave" deinterlace (which I
understand duplicates one of the frames and tosses the
other) looks the same as interlaced on my set.

Is this an inherent limitation of Myth? Is there no
way to recover and show both frames of the 1080i
signal on Myth, or am I missing a key setting? It
bothers me that my set could be showing me twice the
picture I'm getting now!




Re: Enlighten me about 1080 "interlace"
> I thought I understood this interlace stuff but Myth
> is challenging my concept of interlace vs.
> non-interlace video.
>
> I have a TV capable of displaying 1920x1080
> interlaced. It's a CRT based RPTV from Pioneer, if
> you care. I have watched terrestrial broadcasts on
> HDTV and it looks breathtaking. The image consists of
> two alternating 540-line frames, each 1/60 second,
> displayed at even and odd scan lines for 1080 unique
> lines of information.
>
> Now I get my Myth box running and it looks very, very
> good. But not quite as good as my TV with a direct
> OTA receiver. Why not? I've set everything up as
> carefully as possible. I'm on stock 0.16 Myth, I have
> an Intel 3GHz processor running an nVidia 5500 card
> with Xv. The modeline is correct and the card is
> doing interlaced out, as evidenced by the flickering
> on 1-pixel lines with an xterm on screen.

I'm no expert on this at all, but I had always assumed the picture
will never be as good through MythTV as OTA direct to your TV, since
the video has to be encoded and decoded along the way. Inevitably some
of the original picture quality will be lost in that process, I would
have thought. If it's _almost_ as good, perhaps that's all you can hope for.
Re: Enlighten me about 1080 "interlace"
On Tue, Dec 21, 2004 at 03:45:42PM +1100, Phill Edwards wrote:
> > I thought I understood this interlace stuff but Myth
> > is challenging my concept of interlace vs.
> > non-interlace video.
> >
> > I have a TV capable of displaying 1920x1080
> > interlaced. It's a CRT based RPTV from Pioneer, if
> > you care. I have watched terrestrial broadcasts on
> > HDTV and it looks breathtaking. The image consists of
> > two alternating 540-line frames, each 1/60 second,
> > displayed at even and odd scan lines for 1080 unique
> > lines of information.
> >
> > Now I get my Myth box running and it looks very, very
> > good. But not quite as good as my TV with a direct
> > OTA receiver. Why not? I've set everything up as
> > carefully as possible. I'm on stock 0.16 Myth, I have
> > an Intel 3GHz processor running an nVidia 5500 card
> > with Xv. The modeline is correct and the card is
> > doing interlaced out, as evidenced by the flickering
> > on 1-pixel lines with an xterm on screen.
>
> I'm no expert on this at all, but I had always assumed the picture
> will never be as good through MythTV as OTA direct to your TV, since
> the video has to be encoded and decoded along the way. Inevitably some
> of the original picture quality will be lost in that process, I would
> have thought. If it's _almost_ as good, perhaps that's all you can hope for.

No. In theory there is no reason why Myth's decoder would not be as
good as (or better than) the one in the TV. If you have DVI, the
results should be identical.

In practice, I am not sure Myth's decoder is as good (I see more MPEG
artifacts). And your modeline may never be exactly the one the TV uses
internally. Inside, the TV is just writing directly into its own frame buffer
for the DLP. In theory, you could get your DVI stream to also just be
fed into that frame buffer, if you did it exactly right.

I'm surprised the TV doesn't figure this out for you with DVI, saying,
"Hmmm, here come 1920 pixels, why don't I do with them just what I would
do with stuff I decompressed myself." Perhaps some TVs do.

What would give you identical results would be streaming MPEG out over
FireWire or, if an RF modulator were available, streaming out ATSC or
QAM to a TV ready to receive it. I think it would be cool if the pcHDTV
card came with an ATSC RF modulator on it as well as a demodulator --
like the very first VCRs sending out their signal on channel 3, but
this time doing _better_ than component video.

Though you would have to write an MPEG-streaming X driver, which is no
small feat.
Re: Enlighten me about 1080 "interlace"
--- Brad Templeton <brad+myth@templetons.com> wrote:

> Inside, the TV is just writing directly into its own frame buffer
> for the DLP. In theory, you could get your DVI stream to also just
> be fed into that frame buffer, if you did it exactly right.

Just to get back on track with this example: no DLP, no frame buffer,
no DVI. Just a big ol' analog CRT projection set that runs at 1080i
native. And it looks like I'm losing half my picture through Myth.



Re: Enlighten me about 1080 "interlace"
Joe Barnhart wrote:
> I thought I understood this interlace stuff but Myth
> is challenging my concept of interlace vs.
> non-interlace video.
>
> I have a TV capable of displaying 1920x1080
> interlaced. It's a CRT based RPTV from Pioneer, if
> you care. I have watched terrestrial broadcasts on
> HDTV and it looks breathtaking. The image consists of
> two alternating 540-line frames, each 1/60 second,
> displayed at even and odd scan lines for 1080 unique
> lines of information.
>
> Now I get my Myth box running and it looks very, very
> good. But not quite as good as my TV with a direct
> OTA receiver. Why not? I've set everything up as
> carefully as possible. I'm on stock 0.16 Myth, I have
> an Intel 3GHz processor running an nVidia 5500 card
> with Xv. The modeline is correct and the card is
> doing interlaced out, as evidenced by the flickering
> on 1-pixel lines with an xterm on screen.
>
> Some people on the list whose advice I trust very much
> have suggested that 540p and 1080i output are
> _identical_ as played by Myth. This certainly seems
> to be true. I have played with the "deinterlace"
> setting and the "bob and weave" deinterlace (which I
> understand duplicates one of the frames and tosses the
> other) looks the same as interlaced on my set.

There is no "bob and weave" deinterlace in MythTV. The relevant
algorithms are "onefield," which behaves as you say -- tosses half the
fields; and "bob," which shows each field sequentially (one field each
1/60 sec.).

The other two algorithms are "linearblend," which retrieves resolution
from low-motion scenes at the expense of ghosting during horizontal
motion; and "kerneldeint," which does better but is too
resource-intensive for HDTV on today's hardware.
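
Roughly, the ideas look like this (a sketch of the concepts only, not
Myth's actual code; it operates on a single 8-bit luma plane of width w
and height h, with field 0 = the even lines and field 1 = the odd lines):

/* Illustrative sketches only -- not MythTV's implementation. */
#include <stdint.h>
#include <string.h>

/* "onefield": keep one field and line-double it to fill the frame.
   Half the temporal and vertical information is simply thrown away. */
void onefield(uint8_t *dst, const uint8_t *src, int w, int h, int field)
{
    for (int y = 0; y < h; y += 2) {
        const uint8_t *line = src + (y + field) * w;
        memcpy(dst + y * w,       line, w);
        memcpy(dst + (y + 1) * w, line, w);
    }
}

/* "bob": pull each field out as its own half-height (540-line) frame
   and display the two in sequence, one per 1/60 sec.  Nothing is lost
   in time, but each displayed frame carries only half the lines. */
void bob_field(uint8_t *dst, const uint8_t *src, int w, int h, int field)
{
    for (int y = field; y < h; y += 2)
        memcpy(dst + (y / 2) * w, src + y * w, w);
}

/* "linearblend": average vertically adjacent lines.  Still areas keep
   more detail; anything moving horizontally ghosts and blurs. */
void linearblend(uint8_t *dst, const uint8_t *src, int w, int h)
{
    memcpy(dst, src, w);
    for (int y = 1; y < h; y++)
        for (int x = 0; x < w; x++)
            dst[y * w + x] = (uint8_t)((src[(y - 1) * w + x] + src[y * w + x] + 1) / 2);
}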

It's worth noting once again that if you're using XvMC, none of these
algorithms are used. If you choose any of them, it will use the only
method XvMC is capable of: bob implemented in hardware. This is not a
software limitation.

There are more-advanced algorithms out there, notably from the
open-source (but Windows-only <boggle>) DScaler, and from TVTime. The
software architecture of how Myth displays video does not lend itself
to easily adapting these filters, which is a shame, because the TVTime
ones seem to be developing into something of a standard.

When displayed in a 540p frame, bob deinterlacing should look identical
to native 1080i, at least on a CRT with relatively long-persistence
phosphor. Make sure you're using a 1920x540 video mode (not 960x540),
or you *are* tossing out half your horizontal resolution.
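
For the record, the 1920x540 mode people use for this is just the 1080i
timing with the Interlace flag dropped and the vertical numbers halved,
so the set keeps scanning at its native 33.75 kHz. Something along
these lines (a sketch -- exact porch values are set-dependent):

  # 1920x540 progressive at ~60 Hz; same pixel clock and horizontal
  # timing as 1080i, vertical values roughly halved (approximate --
  # adjust for your set)
  ModeLine "1920x540" 74.250  1920 2008 2052 2200  540 542 547 562  +HSync +VSync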

I believe that on some sets, native 1080i output would look better than
the fake 540p that we use. However, there appears to be a bug in
nVidia's drivers, where *video* is not displayed at the full resolution,
but instead undergoes some sort of deinterlacing. Pause some
horizontally-panning scene when in 1080i and you should see flicker, but
you don't. Normal GUI stuff is fine; it's the Xv overlay that's screwed up.

In conclusion, I think that for a 1080i-native set, 1920x540 with bob
deinterlacing is the best you'll get out of Myth right now. If your set
does 1080*p*, the results could be better still, and that's where some
of those more-advanced algorithms would really help.

-Doug
Re: Enlighten me about 1080 "interlace"
--- Doug Larrick <doug@ties.org> wrote:


Thanks for the informative reply. Now I feel like I'm
getting somewhere.

> There is no "bob and weave" deinterlace in MythTV. The relevant
> algorithms are "onefield," which behaves as you say -- tosses half the
> fields; and "bob," which shows each field sequentially (one field each
> 1/60 sec.).

Yes, "bob" is what I interpreted as "bob and weave".
So it displays 540p by unraveling the interlaced
frames and showing each one for 1/60 sec. That's why
I appear to lose half of the resolution of the set.

> It's worth noting once again that if you're using XvMC, none of these
> algorithms are used.

No XvMC here. I learned my lesson!

> When displayed in a 540p frame, bob deinterlacing should look identical
> to native 1080i, at least on a CRT with relatively long-persistence
> phosphor.

This would only be true if the frame were displaced
one scan line to "fill in" the resolution instead of
overwriting it. From what I see, the displacement
does not occur, which makes diagonal lines more jaggy
than they should be.

> I believe that on some sets, native 1080i output would look better than
> the fake 540p that we use. However, there appears to be a bug in
> nVidia's drivers, where *video* is not displayed at the full resolution,
> but instead undergoes some sort of deinterlacing. Pause some
> horizontally-panning scene when in 1080i and you should see flicker, but
> you don't. Normal GUI stuff is fine; it's the Xv overlay that's screwed up.

You are exactly correct -- pausing should let me see the two interlaced
fields flicker, and that does not happen. Has anyone mentioned this to
nVidia? Is it their driver, or do you suppose it's some weakness in
XFree86? (If this is nVidia's fault, I could walk over to their
building some night and spray-paint helpful suggestions on their wall.
It's only about a mile from where I live.) (Just kidding, nVidia.)

> In conclusion, I think that for a 1080i-native set, 1920x540 with bob
> deinterlacing is the best you'll get out of Myth right now.

Yes, and while it does look *good*, it doesn't quite measure up to what
a hi-def receiver puts out. And that's what has me looking to improve
things.

Thanks for your cogent answer and your insight.




Re: Enlighten me about 1080 "interlace"
--- Joe Barnhart <joebarnhart@yahoo.com> wrote:

>
> --- Doug Larrick <doug@ties.org> wrote:
>
> > When displayed in a 540p frame, bob deinterlacing should look identical
> > to native 1080i, at least on a CRT with relatively long-persistence
> > phosphor.
>
> This would only be true if the frame were displaced one scan line to
> "fill in" the resolution instead of overwriting it. From what I see,
> the displacement does not occur, which makes diagonal lines more jaggy
> than they should be.

No, wait. I think I get it now. The TV SET is doing
the displacement of every other "540p" frame. It
doesn't know any better, so it always shows
alternating frames, each shifted up or down.
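
Or, to put my mental model in concrete terms (this is just how I'm
picturing it, not anything Myth actually does):

/* Toy model: Myth sends a new 540-line frame every 1/60 sec, and the
   interlaced set paints successive frames onto alternating scan-line
   sets on its own. */
#include <stdio.h>

int main(void)
{
    for (int frame = 0; frame < 4; frame++)      /* four 1/60-sec frames */
        for (int line = 0; line < 3; line++)     /* first few of 540 lines */
            printf("frame %d, line %d -> CRT scan line %d\n",
                   frame, line, 2 * line + (frame & 1));
    return 0;
}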

So why does my resolution seem to suffer? Could it be
because the "top" and "bottom" fields lose their
identity and get swapped? Would that degrade the
picture or be harmless?



