Mailing List Archive

My experience with HD video and the Nvidia 6200TC
(This is part of an occasional series on how I've successfully [or
unsuccessfully] accomplished something in MythTV on my Fedora Core
4-based system using the ATrpms packages. I will assume that readers
can do basic Linux tasks, such as install RPMs, edit /etc files, and
generate xorg.conf. I hope to cover the mystery areas where others
most often go astray.)

I am able to happily view high-definition (1080i and 720p) video
through MythTV. Here are the ingredients:

* A powerful-enough frontend. As I understand it, my Pentium 4 3.0GHz
frontend/backend is just over the line in terms of having enough
horsepower to display HD without XvMC. This is important, as I'll
get to later.
* A supported video card. For HD, this generally means an Nvidia 5200
(AGP) or 5300 (PCI Express; rare), or later; I have an Nvidia 6200
TC with 128MB. ATI owners who want to display HD (as opposed to
standard-definition) are, from what I understand, out of luck due to
driver-support issues. Recent Intel integrated-graphics chipsets (my
MythTV box has one, actually) and the Via UniChrome also work, but I
don't have any firsthand experience with them.
* An HD display. I have a 1920x1080p panel now but previously was
happy with a 1280x1024 monitor; MythTV scaled high-definition
pictures into a 1280x720 rectangle in the middle of the
monitor. I'll briefly speak on modelines later.

I use ATrpms' prepackaged Nvidia drivers on my Fedora Core 4 MythTV
box. The RPM packages are nvidia-graphics-devices,
nvidia-graphicsxxxx-libs, nvidia-graphicsxxxx,
nvidia-graphicsxxxx-kmdl, and nvidia-graphics-helpers. General
consensus is that only the 7676 and the most recent 8756 versions are
worthwhile. For me, xxxx=7676: the -kmdl package is kernel-specific, I
still use the 2.6.15-1.1833-FC4smp kernel RPM [1], and there is no
8756 kmdl package for 2.6.15. I'll proceed from here using 7676 as the
example.
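
For reference, the install looks roughly like this, assuming ATrpms is
already configured as a yum repository. The kmdl package name is
kernel-specific; the $(uname -r) suffix below is my guess at the
naming pattern, so check the actual package list rather than copying
blindly:

# yum install nvidia-graphics-devices nvidia-graphics7676-libs \
    nvidia-graphics7676 nvidia-graphics7676-kmdl-$(uname -r) \
    nvidia-graphics-helpers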

After installing the relevant RPM packages, type

# nvidia-graphics-switch 7676

This modifies various things inside the system to suit your driver;
with multiple Nvidia driver versions installed, this command switches
from one to another. It's probably a good idea to reboot the system
now. Sometimes (I say sometimes because this doesn't always happen at
this stage) the driver will, after the reboot, swap the existing
/etc/X11/xorg.conf file for a generic one. This is typically obvious
because the resolution will be set very low (800x600, for example).
Just rename xorg.conf.backup (the original file) back to xorg.conf,
then restart X (Ctrl-Alt-Backspace).
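
In other words, assuming the switcher saved the original as
/etc/X11/xorg.conf.backup (check the exact name it wrote), the
recovery is just:

# cd /etc/X11
# mv xorg.conf xorg.conf.generic    # set the generic file aside
# mv xorg.conf.backup xorg.conf     # restore the original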

Key configuration changes occur in three places: xorg.conf, the
nvidia-settings utility, and mythfrontend.

* xorg.conf
Some excerpts:

Section "Module"
Load "dbe"
Load "extmod"
Load "fbdevhw"
Load "glx"
Load "record"
Load "freetype"
Load "type1"
Load "v4l"
EndSection

Section "Monitor"
Identifier "Monitor0"
VendorName "Westinghouse"
ModelName "LVM-37W1"
HorizSync 30 - 80
VertRefresh 50 - 75
Option "dpms"

# DisplaySize 820 460 # My monitor's actual dimensions
DisplaySize 488 274 # Forces 100x100 DPI (1920 / (488mm/25.4) = ~100), which MythTV likes

Modeline "1920x1080-59.94p"
148.352
1920 1960 2016 2200
1080 1082 1088 1125
+hsync -vsync
# The modeline I use

Modeline "1920x1080-60p"
148.5
1920 1960 2016 2200
1080 1082 1088 1125
# Another modeline I've used

Modeline "1920x1080-59.94i"
74.176
1920 1960 2016 2200
1080 1082 1088 1125
Interlace
# What I'd use if I had a 1080i display, or if I wanted it to
# handle deinterlacing

Modeline "1920x1080"
138.50
1920 1968 2000 2080
1080 1082 1087 1111
+hsync -vsync
# Modeline my display suggests to the driver. I have this here
# because certain versions of the Nvidia driver properly read
# it from the display and others don't; I can't remember what
# the case is for 7676.

EndSection

Section "Device"
Identifier "Videocard0"
Driver "nvidia"
VendorName "eVGA"
BoardName "Geforce 6200TC"

# Option "ExactModeTimingsDVI" "1" # Doesn't work for me
Option "XvmcUsesTextures" "Yes" # I don't think this does anything on a 6200
# Option "NoDDC"
# Option "ConnectedMonitor" "DFP"

EndSection
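
Two quick sanity checks after editing xorg.conf, assuming the stock
Fedora log path: the X log shows which driver and modes the server
actually accepted, and the gtf utility that ships with X.org can
generate a starting modeline for a different display (its output may
still need hand-tweaking to match broadcast-exact timings like those
above):

$ grep -iE 'nvidia|mode' /var/log/Xorg.0.log
$ gtf 1920 1080 59.94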

* nvidia-settings (X = check, _ = no check)
In X Server XVideo Settings:
X Video Texture Adapter|Sync to VBlank (Highly important!)
_ Video Blitter Adapter|Sync to VBlank

In OpenGL Settings:
X Sync to VBlank
X Allow Flipping
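
These checkboxes can also be set non-interactively, which is handy in
a session-startup script. The attribute names below are what I believe
nvidia-settings uses for them; verify against the output of
nvidia-settings -q all on your driver version:

$ nvidia-settings -a XVideoTextureSyncToVBlank=1 \
                  -a XVideoBlitterSyncToVBlank=0 \
                  -a SyncToVBlank=1 \
                  -a AllowFlipping=1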

* mythfrontend's Setup|TV Settings|Playback
X Deinterlace playback | Bob (2x framerate)
Preferred MPEG2 Decoder: libmpeg2 (I can't say I've seen a difference
here between this and Standard.)
X Enable OpenGL vertical sync for timing (RTC timing, used when
unchecked, also works fine, as long as dev.rtc.max-user-freq=1024
appears in /etc/sysctl.conf; see the snippet after this list)
X Enable realtime priority threads (irrelevant, as I haven't actually taken
the steps to let mythfrontend run in realtime)
_ Use video as timebase
X Extra audio buffering
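
For the RTC-timing alternative mentioned in the OpenGL vertical sync
item above, this is all it takes (sysctl -p reloads /etc/sysctl.conf
without a reboot):

# echo 'dev.rtc.max-user-freq=1024' >> /etc/sysctl.conf
# sysctl -p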

What happens with different settings:
* _ Video Texture Adapter|Sync to VBlank (Video tearing, so that the
top 1/6 of the picture appears slightly behind or ahead of the rest)
* Kernel, one field, and linear blend filters (Inferior to Bob, in my
experience. But if I do use them, I have to turn OpenGL sync off in
mythfrontend to avoid stuttering video.)
* Preferred MPEG Decoder: XvMC (Video looks great, as it always uses
Bob deinterlacing. However, XvMC doesn't save very much CPU
horsepower on my setup, and in any case, displaying the OSD always
causes the video to stutter. My understanding is that on a 5200/5300
that XvmcUsesTextures line above would solve the issue; I actually
have a 5300 I bought for cheap off eBay and mean to try this
sometime.)

With the above settings, I get what I can only describe as
jaw-droppingly good video output on my 37" panel. For example, about 70
seconds into a Discovery HD program called HD Traveler: New York City,
there is a right-left pan looking down into Times Square. If I use any
filter besides Bob, the streetscape blurs during the pan; with Bob, I
can make out enough detail to tell that one of the storefronts across
the street from the camera is a Chase Bank.

[1] I'd use 2.6.16 if I could because I could then use the 8756 driver
in ATrpms packaging. However, with all FC4 2.6.16 kernel packages I've
tried, including the latest 2108, I can reliably hang my MythTV system
in a few seconds by making a Samba or NFS connection to another system
on the network (not the NAS I keep my MythTV recordings on, but
another Linux box running FC3). I believe it has something to do with
2.6.16's included sky2 and sk98lin kernel modules that support my
D-Link (SysKonnect) gigabit Ethernet card. When I try to manually
compile the sk98lin driver I downloaded from the SysKonnect site (what
I have to use for pre-2.6.16 kernels), I get an error.

--
Yeechang Lee <ylee@pobox.com> | +1 650 776 7763 | San Francisco CA US
Re: My experience with HD video and the Nvidia 6200TC
Nice writeup; maybe worthwhile to place in the wiki?

Just a comment on your closing text:

On Sun, May 14, 2006 at 04:47:41AM -0700, Yeechang Lee wrote:
> [1] I'd use 2.6.16 if I could because I could then use the 8756 driver
> in ATrpms packaging. However, with all FC4 2.6.16 kernel packages I've
> tried, including the latest 2108, I can reliably hang my MythTV system
> in a few seconds by making a Samba or NFS connection to another system
> on the network (not the NAS I keep my MythTV recordings on, but
> another Linux box running FC3). I believe it has something to do with
> 2.6.16' included sky2 and sk98lin kernel modules that support my
> D-Link (SysKonnect) gigabit Ethernet card. When I try to manually
> compile the sk98lin driver I downloaded from the SysKonnect site (what
> I have to use for pre-2.6.16 kernels), I get an error.

There are sk98lin kmdls at ATrpms which you could try out. I've had
some reports that the kernel's version eats your system, while the
kmdls seem to work fine.
--
Axel.Thimm at ATrpms.net
Re: My experience with HD video and the Nvidia 6200TC
>I am able to happily view high-definition (1080i and 720p) video
>through MythTV. Here are the ingredients:
>
>* A powerful-enough frontend. As I understand it, my Pentium 4 3.0GHz
> frontend/backend is just over the line in terms of having enough
> horsepower to display HD without XvMC. This is important, as I'll
> get to later.
>* A supported video card. For HD, this generally means an Nvidia 5200
> (AGP) or 5300 (PCI Express; rare), or later; I have an Nvidia 6200
> TC with 128MB. ATI owners who want to display HD (as opposed to
> standard-definition) are, from what I understand, out of luck due to
> driver-support issues.

ATI owners with a 9600 AGP or above and 128 MB video RAM will find
HD works just fine using the fglrx drivers. There is much documentation
on the wiki and elsewhere showing that no problem exists.
Re: My experience with HD video and the Nvidia 6200TC
James C. Dastrup wrote:
>> I am able to happily view high-definition (1080i and 720p) video
>> through MythTV. Here are the ingredients:
>>
>> * A powerful-enough frontend. As I understand it, my Pentium 4 3.0GHz
>> frontend/backend is just over the line in terms of having enough
>> horsepower to display HD without XvMC. This is important, as I'll
>> get to later.
>> * A supported video card. For HD, this generally means an Nvidia 5200
>> (AGP) or 5300 (PCI Express; rare), or later; I have an Nvidia 6200
>> TC with 128MB. ATI owners who want to display HD (as opposed to
>> standard-definition) are, from what I understand, out of luck due to
>> driver-support issues.
>
> ATI owners with a 9600 AGP or above and 128 MB video RAM will find
> HD works just fine using the fglrx drivers. There is much documentation
> on the wiki and other places that no problem exists.

I am currently using a Radeon 8500 with the X.org 7.0 radeon driver and
an LCD TV connected via DVI. It works fine except that the Athlon64
3700+ CPU is sometimes overloaded playing and deinterlacing 1080i. I
expect that to change when I switch the CPU to an Athlon 64 X2 3800+.

Jonathan Rogers
Re: My experience with HD video and the Nvidia 6200TC
On 5/15/06, Jonathan Rogers <jonner@teegra.net> wrote:
> James C. Dastrup wrote:
> >> I am able to happily view high-definition (1080i and 720p) video
> >> through MythTV. Here are the ingredients:
> >>
> >> * A powerful-enough frontend. As I understand it, my Pentium 4 3.0GHz
> >> frontend/backend is just over the line in terms of having enough
> >> horsepower to display HD without XvMC. This is important, as I'll
> >> get to later.
> >> * A supported video card. For HD, this generally means an Nvidia 5200
> >> (AGP) or 5300 (PCI Express; rare), or later; I have an Nvidia 6200
> >> TC with 128MB. ATI owners who want to display HD (as opposed to
> >> standard-definition) are, from what I understand, out of luck due to
> >> driver-support issues.
> >
> > ATI owners with a 9600 AGP or above and 128 MB video RAM will find
> > HD works just fine using the fglrx drivers. There is much documentation
> > on the wiki and other places that no problem exists.
>
> I am currently using a Radeon 8500 with the X.org 7.0 radeon driver and
> an LCD TV connected via DVI. It works fine except that the Athlon64
> 3700+ CPU is sometimes overloaded playing and deinterlacing 1080i. I
> expect that to change when I switch the CPU to an Athlon 64 X2 3800+.

Jonathan, what percentage of the CPU does X use and what percentage is
Myth using? My 3200+ handles deinterlacing 1080i fine, but I've got an
FX5200; I'm trying to get a handle on how much CPU the ATI Xorg driver
uses compared to the nvidia drivers.

thanks!

--
Steve
Before you ask, read the FAQ!
http://www.mythtv.org/wiki/index.php/Frequently_Asked_Questions
then search the Wiki, and this list,
http://www.gossamer-threads.com/lists/mythtv/
Mailinglist etiquette -
http://www.mythtv.org/wiki/index.php/Mailing_List_etiquette
Re: My experience with HD video and the Nvidia 6200TC
> I am currently using a Radeon 8500 with the X.org 7.0 radeon driver and
> an LCD TV connected via DVI. It works fine except that the Athlon64
> 3700+ CPU is sometimes overloaded playing and deinterlacing 1080i. I
> expect that to change when I switch the CPU to an Athlon 64 X2 3800+.

Depending on what threads myth is using, and how, you might find that
playback is WORSE with the 3800 X2. A simple test would be to play back
1080i now and run 'top' on the FE machine. If you only see a single
"mythfrontend" process in the top 10, then hit "H" (capital) to tell
top to show threads. If it's a single thread that's eating most of the
CPU, then having 2 cores (each slower than your 3700) won't help. If
it's multiple threads each eating about the same amount of CPU, then
the dual core might help.

As a side-note, on my machine (X2 3800) I get choppy playback if I
turn off XvMC and leave Bob deinterlacing on. Of course, other deint
filters might eat less CPU...
Re: My experience with HD video and the Nvidia 6200TC
Steven Adeff wrote:
> On 5/15/06, Jonathan Rogers <jonner@teegra.net> wrote:
>> James C. Dastrup wrote:
>>>> I am able to happily view high-definition (1080i and 720p) video
>>>> through MythTV. Here are the ingredients:
>>>>
>>>> * A powerful-enough frontend. As I understand it, my Pentium 4 3.0GHz
>>>> frontend/backend is just over the line in terms of having enough
>>>> horsepower to display HD without XvMC. This is important, as I'll
>>>> get to later.
>>>> * A supported video card. For HD, this generally means an Nvidia 5200
>>>> (AGP) or 5300 (PCI Express; rare), or later; I have an Nvidia 6200
>>>> TC with 128MB. ATI owners who want to display HD (as opposed to
>>>> standard-definition) are, from what I understand, out of luck due to
>>>> driver-support issues.
>>> ATI owners with a 9600 AGP or above and 128 MB video RAM will find
>>> HD works just fine using the fglrx drivers. There is much documentation
>>> on the wiki and other places that no problem exists.
>> I am currently using a Radeon 8500 with the X.org 7.0 radeon driver and
>> an LCD TV connected via DVI. It works fine except that the Athlon64
>> 3700+ CPU is sometimes overloaded playing and deinterlacing 1080i. I
>> expect that to change when I switch the CPU to an Athlon 64 X2 3800+.
>
> Jonathan, what percentage of the CPU does X use and what percentage is
> Myth using? My 3200+ handles deinterlacing 1080i fine, but I've got a
> FX5200, I'm trying to get a handle on how much CPU the ATI Xorg driver
> uses compared to the nvidia drivers.

Playing a 720p movie, the "X" process takes about 50% CPU time and
"mythfrontend" about 30%. Playing a 1080i show with Bob, "X" takes about
53% and mythfrontend about 30% to 40%. Those percentages can vary quite
a bit with the program and even the scene, which is why it usually keeps
up but sometimes stutters.

I've also tried a Radeon 9000 with the "radeon" driver because it is
fanless, but it performed much worse and put a pink bar on the right
side of 1080i video. I'm hoping to find a fanless card that's slightly
better than the 8500, still works with the X.org driver, and uses
somewhat less CPU time.

Jonathan Rogers
Re: My experience with HD video and the Nvidia 6200TC
Jonathan Rogers wrote:
>
> I've also tried a Radeon 9000 with the "radeon" driver because it is
> fanless, but it performed much worse and has a pink bar on the right
> side of a 1080i video. I'm hoping to find a card that's slightly better
> than the 8500 without a fan that will still work with the X.org driver
> and will use somewhat less CPU time.
>
> Jonathan Rogers
>
I run a Radeon 9250 and it has the pink bar. This weekend I finally got
around to installing Linux (dual-boot) on my laptop. After installing
mythfrontend I was surprised to see no pink bar on 1080i programs with
its Mobility 9600.


Lee

Re: My experience with HD video and the Nvidia 6200TC
On 5/15/06, Jonathan Rogers <jonner@teegra.net> wrote:
> Steven Adeff wrote:
> > On 5/15/06, Jonathan Rogers <jonner@teegra.net> wrote:
> >> James C. Dastrup wrote:
> >>>> I am able to happily view high-definition (1080i and 720p) video
> >>>> through MythTV. Here are the ingredients:
> >>>>
> >>>> * A powerful-enough frontend. As I understand it, my Pentium 4 3.0GHz
> >>>> frontend/backend is just over the line in terms of having enough
> >>>> horsepower to display HD without XvMC. This is important, as I'll
> >>>> get to later.
> >>>> * A supported video card. For HD, this generally means an Nvidia 5200
> >>>> (AGP) or 5300 (PCI Express; rare), or later; I have an Nvidia 6200
> >>>> TC with 128MB. ATI owners who want to display HD (as opposed to
> >>>> standard-definition) are, from what I understand, out of luck due to
> >>>> driver-support issues.
> >>> ATI owners with a 9600 AGP or above and 128 MB video RAM will find
> >>> HD works just fine using the fglrx drivers. There is much documentation
> >>> on the wiki and other places that no problem exists.
> >> I am currently using a Radeon 8500 with the X.org 7.0 radeon driver and
> >> an LCD TV connected via DVI. It works fine except that the Athlon64
> >> 3700+ CPU is sometimes overloaded playing and deinterlacing 1080i. I
> >> expect that to change when I switch the CPU to an Athlon 64 X2 3800+.
> >
> > Jonathan, what percentage of the CPU does X use and what percentage is
> > Myth using? My 3200+ handles deinterlacing 1080i fine, but I've got a
> > FX5200, I'm trying to get a handle on how much CPU the ATI Xorg driver
> > uses compared to the nvidia drivers.
>
> Playing a 720p movie, the "X" process takes about 50% CPU time and
> "mythfrontend" about 30%. Playing 1080i show with Bob, "X" takes about
> 53% and mythfrontend about 30% to 40%. Those percentages can vary quite
> a bit with the program and even scene, which is why it usually keeps up,
> but sometimes stutters.

Strange to me that X takes up so much CPU on your system while it
takes very little on mine. Are you sure Xv is enabled? Your Myth usage
seems a bit lower than mine, but I'm running a 3200+ compared to your
3700+. Have you tried the fglrx drivers to see if they improve your X
usage?


> I've also tried a Radeon 9000 with the "radeon" driver because it is
> fanless, but it performed much worse and has a pink bar on the right
> side of a 1080i video. I'm hoping to find a card that's slightly better
> than the 8500 without a fan that will still work with the X.org driver
> and will use somewhat less CPU time.

My laptop (Xpress 200M) has the pink bar issue with Xv as well, though
lately I've noticed it hasn't been there in Myth. I haven't poked
around to see whether it was the driver update or whether Myth is
reverting to X11...

--
Steve
Before you ask, read the FAQ!
http://www.mythtv.org/wiki/index.php/Frequently_Asked_Questions
then search the Wiki, and this list,
http://www.gossamer-threads.com/lists/mythtv/
Mailinglist etiquette -
http://www.mythtv.org/wiki/index.php/Mailing_List_etiquette
Re: My experience with HD video and the Nvidia 6200TC
Lee Koloszyc wrote:
> Jonathan Rogers wrote:
>>
>> I've also tried a Radeon 9000 with the "radeon" driver because it is
>> fanless, but it performed much worse and has a pink bar on the right
>> side of a 1080i video. I'm hoping to find a card that's slightly better
>> than the 8500 without a fan that will still work with the X.org driver
>> and will use somewhat less CPU time.
>>
>> Jonathan Rogers
>>
> I run a Radeon 9250 and it has the pink bar. This weekend I finally got
> around to installing linux (dual-boot) on my laptop. After installing
> mythfrontend I was surprised to see no pink bar on 1080i programs with
> it's Mobility 9600.

Yeah, I failed to research the Radeon 9000 before I bought it; it's
inferior to the 8500. The 9250 is very similar to the 9000, I believe. I
should have read the following page before buying. It doesn't talk about
limitations of the video overlay, but it does talk about relative
performance.

<URL:http://dri.freedesktop.org/wiki/ATIRadeon>

Jonathan Rogers
Re: My experience with HD video and the Nvidia 6200TC
Steven Adeff wrote:
> On 5/15/06, Jonathan Rogers <jonner@teegra.net> wrote:
>> Steven Adeff wrote:
>>> On 5/15/06, Jonathan Rogers <jonner@teegra.net> wrote:
>>>> James C. Dastrup wrote:
>>>>>> I am able to happily view high-definition (1080i and 720p) video
>>>>>> through MythTV. Here are the ingredients:
>>>>>>
>>>>>> * A powerful-enough frontend. As I understand it, my Pentium 4 3.0GHz
>>>>>> frontend/backend is just over the line in terms of having enough
>>>>>> horsepower to display HD without XvMC. This is important, as I'll
>>>>>> get to later.
>>>>>> * A supported video card. For HD, this generally means an Nvidia 5200
>>>>>> (AGP) or 5300 (PCI Express; rare), or later; I have an Nvidia 6200
>>>>>> TC with 128MB. ATI owners who want to display HD (as opposed to
>>>>>> standard-definition) are, from what I understand, out of luck due to
>>>>>> driver-support issues.
>>>>> ATI owners with a 9600 AGP or above and 128 MB video RAM will find
>>>>> HD works just fine using the fglrx drivers. There is much documentation
>>>>> on the wiki and other places that no problem exists.
>>>> I am currently using a Radeon 8500 with the X.org 7.0 radeon driver and
>>>> an LCD TV connected via DVI. It works fine except that the Athlon64
>>>> 3700+ CPU is sometimes overloaded playing and deinterlacing 1080i. I
>>>> expect that to change when I switch the CPU to an Athlon 64 X2 3800+.
>>> Jonathan, what percentage of the CPU does X use and what percentage is
>>> Myth using? My 3200+ handles deinterlacing 1080i fine, but I've got a
>>> FX5200, I'm trying to get a handle on how much CPU the ATI Xorg driver
>>> uses compared to the nvidia drivers.
>> Playing a 720p movie, the "X" process takes about 50% CPU time and
>> "mythfrontend" about 30%. Playing 1080i show with Bob, "X" takes about
>> 53% and mythfrontend about 30% to 40%. Those percentages can vary quite
>> a bit with the program and even scene, which is why it usually keeps up,
>> but sometimes stutters.
>
> strange to me that X takes up so much CPU on your system while it
> takes very little on mine. Are you sure Xv is enabled? Your Myth usage
> seems a bit lower than mine, but I'm running a 3200+, compared to your
> 3700+. Have you tried the fglrx drivers to see if they improve your X
> usage?
>

Yes, Xvideo is certainly working, as confirmed by xvinfo and mplayer.
You have an Nvidia card, right? The load split may be different due to
hardware and driver differences. In the end, the most important thing in
determining overall functionality is total system load. The fglrx driver
did seem to use a little less CPU time, but Bob deinterlace was
completely broken and my goal is to use all Free software if possible.
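
For anyone wanting to run the same check, this is roughly it; any
recording will do for the mplayer test, and the path below is just a
placeholder:

$ xvinfo | grep -i Adaptor
$ mplayer -vo xv /path/to/some/recording.mpg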

Jonathan Rogers
Re: My experience with HD video and the Nvidia 6200TC
>>> I've also tried a Radeon 9000 with the "radeon" driver because it is
>>> fanless, but it performed much worse and has a pink bar on the right
>>> side of a 1080i video. I'm hoping to find a card that's slightly better
>>> than the 8500 without a fan that will still work with the X.org driver
>>> and will use somewhat less CPU time.
>>>
>>> Jonathan Rogers
>>>
>> I run a Radeon 9250 and it has the pink bar. This weekend I finally got
>> around to installing linux (dual-boot) on my laptop. After installing
>> mythfrontend I was surprised to see no pink bar on 1080i programs with
>> it's Mobility 9600.
>

That's because you are now using a 9600 or above.

>
>Yeah, I failed to research the Radeon 9000 before I bought it; it's
>inferior to the 8500. The 9250 is very similar to the 9000 I believe. I
>should have read the following page before buying. It doesn't talk about
>limitations of the video overlay, but it does talk about relative
>performance.
>

Avoid these problems by using a 9600 or above. The 9600 and above are
the cards that are advertised by ATI to support HDTV.
Re: My experience with HD video and the Nvidia 6200TC
Gary Dezern wrote:
>
>> I am currently using a Radeon 8500 with the X.org 7.0 radeon driver and
>> an LCD TV connected via DVI. It works fine except that the Athlon64
>> 3700+ CPU is sometimes overloaded playing and deinterlacing 1080i. I
>> expect that to change when I switch the CPU to an Athlon 64 X2 3800+.
>
> Depending on what threads myth is using, and how, you might find that
> playback is WORSE with the 3800 X2. A simple test would be to play back
> 1080i now and run 'top' on the FE machine. If you only see a single
> "mythfrontend" process in the top 10, then hit "H" (capital) to tell
> top to show threads. If it's a single thread that's eating most of the
> CPU, then having 2 cores (each slower than your 3700) won't help. If
> it's multiple threads each eating about the same amount of CPU, then
> the dual core might help.

Yeah, I'm fully aware that the application needs multiple threads to
take advantage of two CPUs. Even though I'm seeing that one mythfrontend
thread takes the lion's share of CPU time, I do know that the X and
mythfrontend processes each take significant CPU time, so they should
split between CPUs nicely. I currently have the X2 3800 in my desktop
machine, but I'm pretty sure it's more needed in the mythbox, so I plan
to swap them.
>
> As a side-note, on my machine (X2 3800) I got
> choppy playback if I turn off XvMC and leave bob deinterlace on. Of
> course, other deint filters might eat less CPU...

Playing video on my desktop (with a 17" monitor connected via VGA),
which currently has the X2 3800, an Nvidia 6800GS, and the proprietary 8756
Nvidia driver, seems to work fine. Each CPU has 50% or more time free
playing 1080i with Bob. The X process only takes about 10% of a CPU's
time. I'm not entirely sure how to tell if mythfrontend is using XvMC,
just Xv or what. I am using current SVN and selected "Standard" decoder
in the config. Selecting "libmpeg2" decoder uses more CPU time, but
still plays smoothly.

Jonathan Rogers
Re: My experience with HD video and the Nvidia 6200TC
Steven Adeff wrote:
> On 5/15/06, Jonathan Rogers <jonner@teegra.net> wrote:
>> Steven Adeff wrote:
>>> On 5/15/06, Jonathan Rogers <jonner@teegra.net> wrote:
>>>> James C. Dastrup wrote:
>>>>>> I am able to happily view high-definition (1080i and 720p) video
>>>>>> through MythTV. Here are the ingredients:
>>>>>>
>>>>>> * A powerful-enough frontend. As I understand it, my Pentium 4 3.0GHz
>>>>>> frontend/backend is just over the line in terms of having enough
>>>>>> horsepower to display HD without XvMC. This is important, as I'll
>>>>>> get to later.
>>>>>> * A supported video card. For HD, this generally means an Nvidia 5200
>>>>>> (AGP) or 5300 (PCI Express; rare), or later; I have an Nvidia 6200
>>>>>> TC with 128MB. ATI owners who want to display HD (as opposed to
>>>>>> standard-definition) are, from what I understand, out of luck due to
>>>>>> driver-support issues.
>>>>> ATI owners with a 9600 AGP or above and 128 MB video RAM will find
>>>>> HD works just fine using the fglrx drivers. There is much documentation
>>>>> on the wiki and other places that no problem exists.
>>>> I am currently using a Radeon 8500 with the X.org 7.0 radeon driver and
>>>> an LCD TV connected via DVI. It works fine except that the Athlon64
>>>> 3700+ CPU is sometimes overloaded playing and deinterlacing 1080i. I
>>>> expect that to change when I switch the CPU to an Athlon 64 X2 3800+.
>>> Jonathan, what percentage of the CPU does X use and what percentage is
>>> Myth using? My 3200+ handles deinterlacing 1080i fine, but I've got a
>>> FX5200, I'm trying to get a handle on how much CPU the ATI Xorg driver
>>> uses compared to the nvidia drivers.
>> Playing a 720p movie, the "X" process takes about 50% CPU time and
>> "mythfrontend" about 30%. Playing 1080i show with Bob, "X" takes about
>> 53% and mythfrontend about 30% to 40%. Those percentages can vary quite
>> a bit with the program and even scene, which is why it usually keeps up,
>> but sometimes stutters.
>
> strange to me that X takes up so much CPU on your system while it
> takes very little on mine. Are you sure Xv is enabled? Your Myth usage
> seems a bit lower than mine, but I'm running a 3200+, compared to your
> 3700+. Have you tried the fglrx drivers to see if they improve your X
> usage?

Another issue is that my 8500 card may be a little buggy. I've had it
for several years and already replaced the fan with a makeshift CPU fan.
I've seen messages about firmware problems in X logs. Strangely, moving
through MythTV menus is very slow, though playback is fine. The new 9000
card I tried navigated menus quickly, but played video slowly. I'm
thinking that maybe I should get a 9600 or 9700 if it is quiet.

Jonathan Rogers
Re: My experience with HD video and the Nvidia 6200TC
James C. Dastrup wrote:
> Avoid these problems by using a 9600 or above. The 9600 and above are
> the cards that are advertised by ATI to support HDTV.

ATI's advertising probably has to do with the capabilities of the
Windows driver, which supports hardware acceleration for various
operations similar to the Nvidia ones. But if ATI says 9600s and above
support HD, that probably means that X.org's Xvideo support will display
1080i properly without any pink bars. However, I also know from
experience that older cards (at least the 8500 and by extension,
probably any r200) also work for HD.

Jonathan Rogers