Mailing List Archive

Nvidia card choice for mythtv and gaming
Being stuck at home gives me time to come up with computer projects. So
I'm thinking about taking an old Core i7 2600 and moving it to a closet
near my 4K UHD TV in the family room. I would need to add a GFX card,
even for mythtv, as the internal graphics are unacceptable to me. And
lower-end gaming would require a better card than mythtv needs.

So I'm looking for what works for y'all at an acceptable price. My current
card in another desktop that seems fine for mythtv is a GeForce GT 710,
but that's old. That would be my bottom-end choice. Fanless would be
great, but since it's in a closet, not required.

Jim A


_______________________________________________
mythtv-users mailing list
mythtv-users@mythtv.org
http://lists.mythtv.org/mailman/listinfo/mythtv-users
http://wiki.mythtv.org/Mailing_List_etiquette
MythTV Forums: https://forum.mythtv.org
Re: Nvidia card choice for mythtv and gaming
On Fri, May 8, 2020 at 6:03 AM Jim Abernathy <jfabernathy@gmail.com> wrote:

> Being stuck at home gives me time to come up with computer projects. So
> I thinking about taking an old Core i7 2600 and moving it to a closet
> near my 4K UHD TV in the family room. I would need to add a GFX card,
> even for mythtv as the internal graphics are unacceptable to me. And
> lower end gaming would require a better card than mythtv.
>
> So I looking for what works for y'all at an acceptable price. My current
> card in another desktop that seems fine for mythtv is a GeForce GT 710,
> but that's old. That would be my bottom end choice. Fanless would be
> great, but since it's in a closet, not required.
>

I had a GT 710 in my combined FE/BE box in the living room and it worked
fine for Minecraft for the kids. However, I recently installed my old
GT 1050 in it, mostly for Folding@Home, and the fan is pretty quiet. It's
of course a few generations old, but that should keep the price down. As a
bonus, since it's a mid-to-lower-end card, it doesn't need any external
power; it runs off the PCIe bus.

That being said, since AMD support is built in these days, I moved to an
AMD RX 580 on my main desktop (got it at a great price on eBay via Newegg)
and it works fine for MythTV using the NVDEC profile.

Checking eBay, you can get an RX 580 4GB for the same as or less than a GT
1050... I'm running Folding@Home on both, and my RX 580 is about 5 times
faster than the GT 1050.

Thanks,
Richard
Re: Nvidia card choice for mythtv and gaming
On Fri, 8 May 2020 07:02:35 -0400, you wrote:

>Being stuck at home gives me time to come up with computer projects. So
>I thinking about taking an old Core i7 2600 and moving it to a closet
>near my 4K UHD TV in the family room. I would need to add a GFX card,
>even for mythtv as the internal graphics are unacceptable to me.  And
>lower end gaming would require a better card than mythtv.
>
>So I looking for what works for y'all at an acceptable price. My current
>card in another desktop that seems fine for mythtv is a GeForce GT 710,
>but that's old. That would be my bottom end choice. Fanless would be
>great, but since it's in a closet, not required.
>
>Jim A

For MythTV, a fanless Nvidia GT1030 card is fine. You do need two
slots for the fanless versions; if you get a GT1030 with a fan, there
are single-slot cards. I do not do gaming, so I cannot tell you about
that. But if you want long life, you really do want the fanless cards.
All of mine are still working after 8-9 years of mostly 24/7 operation.
Re: Nvidia card choice for mythtv and gaming
I would second the Nvidia GT1030. I've had mine for a few years now and it
has worked well for me. It is the "lowest" card in the Nvidia lineup that
supports Nvidia's newer NVDEC API, and it handles all the HEVC/H.265
content I've been able to throw at it. Brand new, they are available for
around $85-100 US on Amazon.

-Erik

On Fri, May 8, 2020 at 7:10 AM Stephen Worthington <stephen_agent@jsw.gen.nz>
wrote:

> On Fri, 8 May 2020 07:02:35 -0400, you wrote:
>
> >Being stuck at home gives me time to come up with computer projects. So
> >I thinking about taking an old Core i7 2600 and moving it to a closet
> >near my 4K UHD TV in the family room. I would need to add a GFX card,
> >even for mythtv as the internal graphics are unacceptable to me. And
> >lower end gaming would require a better card than mythtv.
> >
> >So I looking for what works for y'all at an acceptable price. My current
> >card in another desktop that seems fine for mythtv is a GeForce GT 710,
> >but that's old. That would be my bottom end choice. Fanless would be
> >great, but since it's in a closet, not required.
> >
> >Jim A
>
> For MythTV, a fanless Nvidia GT1030 card is fine. You do need two
> slots for the fanless versions. If you get a fan version GT1030,
> there are single slot cards. I do not do gaming, so I can not tell
> you about that. But if you want long life, you really do want the
> fanless cards. All of mine are still working 8-9 years on after
> mostly 24/7 operation for that time.
Re: Nvidia card choice for mythtv and gaming
On Fri, May 8, 2020 at 9:33 AM Erik Merkle <mythtv@emerkle.net> wrote:

> I would second the NVidia GT1030. I've had mine for a few years now and it
> has worked well for me. It is the "lowest" card in the NVidia lineup that
> supports NVidia's newer NVDEC api, and handles all the HEVC/H.265 content
> I've been able to throw at it. Brand new, they are available for around
> $85-100 US on Amazon.
>
> -Erik
>
> On Fri, May 8, 2020 at 7:10 AM Stephen Worthington <
> stephen_agent@jsw.gen.nz> wrote:
>
>> On Fri, 8 May 2020 07:02:35 -0400, you wrote:
>>
>> >Being stuck at home gives me time to come up with computer projects. So
>> >I thinking about taking an old Core i7 2600 and moving it to a closet
>> >near my 4K UHD TV in the family room. I would need to add a GFX card,
>> >even for mythtv as the internal graphics are unacceptable to me. And
>> >lower end gaming would require a better card than mythtv.
>> >
>> >So I looking for what works for y'all at an acceptable price. My current
>> >card in another desktop that seems fine for mythtv is a GeForce GT 710,
>> >but that's old. That would be my bottom end choice. Fanless would be
>> >great, but since it's in a closet, not required.
>> >
>> >Jim A
>>
>> For MythTV, a fanless Nvidia GT1030 card is fine. You do need two
>> slots for the fanless versions. If you get a fan version GT1030,
>> there are single slot cards. I do not do gaming, so I can not tell
>> you about that. But if you want long life, you really do want the
>> fanless cards. All of mine are still working 8-9 years on after
>> mostly 24/7 operation for that time.
>>
>
I think I'll go with the GT 1030. I've had good luck with the GT 710, so I
have my fingers crossed.

Thanks, all
Jim A
Re: Nvidia card choice for mythtv and gaming
On 5/8/20 3:11 PM, James Abernathy wrote:
>
>
> On Fri, May 8, 2020 at 9:33 AM Erik Merkle <mythtv@emerkle.net
> <mailto:mythtv@emerkle.net>> wrote:
>
> I would second the NVidia GT1030. I've had mine for a few years
> now and it has worked well for me. It is the "lowest" card in the
> NVidia lineup that supports NVidia's newer NVDEC api, and handles
> all the HEVC/H.265 content I've been able to throw at it. Brand
> new, they are available for around $85-100 US on Amazon.
>
> -Erik
>
> On Fri, May 8, 2020 at 7:10 AM Stephen Worthington
> <stephen_agent@jsw.gen.nz <mailto:stephen_agent@jsw.gen.nz>> wrote:
>
> On Fri, 8 May 2020 07:02:35 -0400, you wrote:
>
> >Being stuck at home gives me time to come up with computer
> projects. So
> >I thinking about taking an old Core i7 2600 and moving it to
> a closet
> >near my 4K UHD TV in the family room. I would need to add a
> GFX card,
> >even for mythtv as the internal graphics are unacceptable to
> me.  And
> >lower end gaming would require a better card than mythtv.
> >
> >So I looking for what works for y'all at an acceptable price.
> My current
> >card in another desktop that seems fine for mythtv is a
> GeForce GT 710,
> >but that's old. That would be my bottom end choice. Fanless
> would be
> >great, but since it's in a closet, not required.
> >
> >Jim A
>
> For MythTV, a fanless Nvidia GT1030 card is fine.  You do need two
> slots for the fanless versions.  If you get a fan version GT1030,
> there are single slot cards.  I do not do gaming, so I can not
> tell
> you about that.  But if you want long life, you really do want the
> fanless cards.  All of mine are still working 8-9 years on after
> mostly 24/7 operation for that time.
>
>
> I think I'll go with GT 1030. I've had good look with the GT710, so I
> have my fingers crossed,
>
> Thanks, all
> Jim A

I forgot to ask about the proper upgrade process for adding a new GT1030
to a system using the internal Intel GFX. Part of me thinks it would be
best to install the Nvidia drivers and utilities prior to popping in the
new card.

Any advice?

Jim A
Re: Nvidia card choice for mythtv and gaming
On Sun, 10 May 2020 10:08:58 -0400, you wrote:

>
>On 5/8/20 3:11 PM, James Abernathy wrote:
>>
>>
>> On Fri, May 8, 2020 at 9:33 AM Erik Merkle <mythtv@emerkle.net
>> <mailto:mythtv@emerkle.net>> wrote:
>>
>> I would second the NVidia GT1030. I've had mine for a few years
>> now and it has worked well for me. It is the "lowest" card in the
>> NVidia lineup that supports NVidia's newer NVDEC api, and handles
>> all the HEVC/H.265 content I've been able to throw at it. Brand
>> new, they are available for around $85-100 US on Amazon.
>>
>> -Erik
>>
>> On Fri, May 8, 2020 at 7:10 AM Stephen Worthington
>> <stephen_agent@jsw.gen.nz <mailto:stephen_agent@jsw.gen.nz>> wrote:
>>
>> On Fri, 8 May 2020 07:02:35 -0400, you wrote:
>>
>> >Being stuck at home gives me time to come up with computer
>> projects. So
>> >I thinking about taking an old Core i7 2600 and moving it to
>> a closet
>> >near my 4K UHD TV in the family room. I would need to add a
>> GFX card,
>> >even for mythtv as the internal graphics are unacceptable to
>> me.  And
>> >lower end gaming would require a better card than mythtv.
>> >
>> >So I looking for what works for y'all at an acceptable price.
>> My current
>> >card in another desktop that seems fine for mythtv is a
>> GeForce GT 710,
>> >but that's old. That would be my bottom end choice. Fanless
>> would be
>> >great, but since it's in a closet, not required.
>> >
>> >Jim A
>>
>> For MythTV, a fanless Nvidia GT1030 card is fine.  You do need two
>> slots for the fanless versions.  If you get a fan version GT1030,
>> there are single slot cards.  I do not do gaming, so I can not
>> tell
>> you about that.  But if you want long life, you really do want the
>> fanless cards.  All of mine are still working 8-9 years on after
>> mostly 24/7 operation for that time.
>>
>>
>> I think I'll go with GT 1030. I've had good look with the GT710, so I
>> have my fingers crossed,
>>
>> Thanks, all
>> Jim A
>
>I forgot to ask about the proper upgrade process for adding a new GT1030
>into a system using internal Intel GFX. Part of me thinks it would be
>best to install the nvidia drivers and utilities prior to popping in the
>new card.
>
>Any advice??

I have never done that, but I have recently done an upgrade where I
went from a GT220 to a GT1030. The Nvidia drivers needed for the
GT220 will not run a GT1030 and the drivers needed for a GT1030 will
not run a GT220. So what I did was to change the video card, then
when I rebooted, I used the grub menu to select recovery mode (text
mode only boot) and from the recovery menu I selected going to a root
prompt. From there, I ran the "ubuntu-drivers devices" command to get
a list of devices and the drivers the system thought were needed. The
suggested driver package was Nvidia 440, which is right for a GT1030,
so then I just ran "ubuntu-drivers install" and it downloaded and
installed the 440 drivers. Then I did a "reboot" command and the
system came up on the new drivers just fine. I have yet to look at
the options I have in my /etc/X11/xorg.conf file - there may be
something to adjust there. But I have not got much spare time at the
moment, so since it was working, I have not tried any adjustments yet.
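
For anyone who just wants the short version, the recovery-mode part boils
down to roughly this (assuming Ubuntu's ubuntu-drivers tooling; the package
it recommends may differ on your hardware):

# from the recovery mode root prompt
ubuntu-drivers devices     # list the GPU and the recommended driver package
ubuntu-drivers install     # install the recommended package (nvidia-driver-440 here)
reboot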

I am not sure if it is possible to pre-load a driver set when you do
not have any hardware installed that it can run on. I suspect that
the install would fail.

If you want to be paranoid, then I would suggest using a clonezilla
boot to do a full backup of your system partition before upgrading.
That way if anything goes badly wrong, you can just restore the backup
(and go back to the Intel video).

Since you will still have the Intel video hardware installed after you
add the GT1030, you should be able to boot using the Intel drivers
once you have the GT1030 installed, and then just install the Nvidia
440 drivers. Then reboot into the BIOS and switch its settings to
booting using the PCIe video rather than the motherboard video. Shut
down, move the video cable to the GT1030 card and reboot and it should
all work. And after that, it should also work to use the Intel video
for a second monitor if you want. Or have two monitors on the GT1030
and a third on the Intel.
Re: Nvidia card choice for mythtv and gaming
On 5/10/20 11:04 AM, Stephen Worthington wrote:
> On Sun, 10 May 2020 10:08:58 -0400, you wrote:
>
>> On 5/8/20 3:11 PM, James Abernathy wrote:
>>>
>>> On Fri, May 8, 2020 at 9:33 AM Erik Merkle <mythtv@emerkle.net
>>> <mailto:mythtv@emerkle.net>> wrote:
>>>
>>> I would second the NVidia GT1030. I've had mine for a few years
>>> now and it has worked well for me. It is the "lowest" card in the
>>> NVidia lineup that supports NVidia's newer NVDEC api, and handles
>>> all the HEVC/H.265 content I've been able to throw at it. Brand
>>> new, they are available for around $85-100 US on Amazon.
>>>
>>> -Erik
>>>
>>> On Fri, May 8, 2020 at 7:10 AM Stephen Worthington
>>> <stephen_agent@jsw.gen.nz <mailto:stephen_agent@jsw.gen.nz>> wrote:
>>>
>>> On Fri, 8 May 2020 07:02:35 -0400, you wrote:
>>>
>>> >Being stuck at home gives me time to come up with computer
>>> projects. So
>>> >I thinking about taking an old Core i7 2600 and moving it to
>>> a closet
>>> >near my 4K UHD TV in the family room. I would need to add a
>>> GFX card,
>>> >even for mythtv as the internal graphics are unacceptable to
>>> me.  And
>>> >lower end gaming would require a better card than mythtv.
>>> >
>>> >So I looking for what works for y'all at an acceptable price.
>>> My current
>>> >card in another desktop that seems fine for mythtv is a
>>> GeForce GT 710,
>>> >but that's old. That would be my bottom end choice. Fanless
>>> would be
>>> >great, but since it's in a closet, not required.
>>> >
>>> >Jim A
>>>
>>> For MythTV, a fanless Nvidia GT1030 card is fine.  You do need two
>>> slots for the fanless versions.  If you get a fan version GT1030,
>>> there are single slot cards.  I do not do gaming, so I can not
>>> tell
>>> you about that.  But if you want long life, you really do want the
>>> fanless cards.  All of mine are still working 8-9 years on after
>>> mostly 24/7 operation for that time.
>>>
>>>
>>> I think I'll go with GT 1030. I've had good look with the GT710, so I
>>> have my fingers crossed,
>>>
>>> Thanks, all
>>> Jim A
>> I forgot to ask about the proper upgrade process for adding a new GT1030
>> into a system using internal Intel GFX. Part of me thinks it would be
>> best to install the nvidia drivers and utilities prior to popping in the
>> new card.
>>
>> Any advice??
> I have never done that, but I have recently done an upgrade where I
> went from a GT220 to a GT1030. The Nvidia drivers needed for the
> GT220 will not run a GT1030 and the drivers needed for a GT1030 will
> not run a GT220. So what I did was to change the video card, then
> when I rebooted, I used the grub menu to select recovery mode (text
> mode only boot) and from the recovery menu I selected going to a root
> prompt. From there, I ran the "ubuntu-drivers devices" command to get
> a list of devices and the drivers the system thought were needed. The
> suggested driver package was Nvidia 440, which is right for a GT1030,
> so then I just ran "ubuntu-drivers install" and it downloaded and
> installed the 440 drivers. Then I did a "reboot" command and the
> system came up on the new drivers just fine. I have yet to look at
> the options I have in my /etc/X11/xorg.conf file - there may be
> something to adjust there. But I have not got much spare time at the
> moment, so since it was working, I have not tried any adjustments yet.
>
> I am not sure if it is possible to pre-load a driver set when you do
> not have any hardware installed that it can run on. I suspect that
> the install would fail.
>
> If you want to be paranoid, then I would suggest using a clonezilla
> boot to do a full backup of your system partition before upgrading.
> That way if anything goes badly wrong, you can just restore the backup
> (and go back to the Intel video).
>
> Since you will still have the Intel video hardware installed after you
> add the GT1030, you should be able to boot using the Intel drivers
> once you have the GT1030 installed, and then just install the Nvidia
> 440 drivers. Then reboot into the BIOS and switch its settings to
> booting using the PCIe video rather than the motherboard video. Shut
> down, move the video cable to the GT1030 card and reboot and it should
> all work. And after that, it should also work to use the Intel video
> for a second monitor if you want. Or have two monitors on the GT1030
> and a third on the Intel.

It was actually easier going from the internal Intel GFX to the Nvidia
GT1030.

Since my motherboard is old, I could not see the BIOS or GRUB display on
the 4K monitor prior to this. I added the card, moved the HDMI cable from
the internal port to the GT1030, and booted up to a 4K desktop, which makes
everything too small for me.

I used the Additional Drivers app and found it was using the Nouveau
open source driver. I selected nvidia-440 and rebooted. Then I used
the Nvidia X Server Settings app to set the display to 1920x1080@60Hz.

Works great. I had to change the audio and video settings in mythfrontend,
and I had to change the default sound output for the system to the HDMI
port on the GT1030 card.

So no hassle at all.

Jim A


Re: Nvidia card choice for mythtv and gaming
Now that I have the GT 1030 working in Xubuntu 20.04 with the Nvidia-440
driver, most things like Mythtv are working great.

However, I can't seem to keep the display at 1920x1080@60Hz. It wants to
switch back to 4K, which is the EDID/automatic choice by default.

So on boot it uses the /etc/X11/xorg.conf file that the Nvidia X Server
Settings program created to force it to 1080p. But if I switch the AV
receiver away from the computer port to the DVD player or anything else
for some period of time and then switch back to the computer, it comes
back at 4K, as if it forgot how it booted. If I switch away for a short
time, like 3-5 minutes, it comes back in 1080p as I want.

I'm thinking Ubuntu gets an event triggered when the AVR switches back to
the computer port and then uses auto settings for the display. If I run
the Nvidia X Server app at this point, the configuration settings are on
auto instead of 1920x1080@60. I can either change it again or reboot.

Or the display power management mode is getting in the way. I had it
in Presentation mode with Display power management off. I'm running a
test now where Presentation mode is off, but I turned on Display power
management and set the timer settings to 'never'. I'll see if that works.

Anyone know what's happening?

Jim A


Re: Nvidia card choice for mythtv and gaming
On Mon, 11 May 2020 07:00:40 -0400, you wrote:

>Now that I have the GT 1030 working in Xubuntu 20.04 with the Nvidia-440
>driver, most things like Mythtv are working great.
>
>However, I can't seem to keep the display at 1920x1080@60hz. It wants to
>switch back to 4K which is the EDID or automatic choice by default.
>
>So on boot it uses the /etc/X11/xorg.conf file that the Nvidia X server
>setting program created to force it to 1080p. But if I switch the AV
>Receiver away for some period of time from the computer port to DVD
>player or anything else and then switch back to the computer it comes
>back as 4K like it forgot how it booted. If I switch away for a short
>time like 3-5 minutes it comes back in 1080p as I want.
>
>I'm thinking Ubuntu is getting an event triggered by the switching back
>to computer port of the AVR and using auto settings for the display. If
>I run Nvidia X server app at this point the configuration settings are
>in auto instead of 1920x1080@60. I can either change it again or reboot.
>
>Or, the display power management mode is getting in the way.  I had it
>in Presentation mode and Display power management off.  I'm running a
>test now where Presentation mode is off, but I turned on the Display
>power management but set the timer settings to 'never'. I see if that works?
>
>Anyone know what's happening?
>
>Jim A

I do not get problems like that since my TV is only 1080p. When I
play a 4k file, the GPU scales it down to 1080p nicely.

As a workaround, you can use xrandr from a command prompt to switch
the mode manually back to the one you want, maybe something like
"xrandr 60" which would work on my system to put it into 1080p@60Hz.

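If the short form does not work for you, the fully spelled-out version
would be something like this (the output name HDMI-0 is only an example;
run "xrandr --query" to see what your output is actually called):

xrandr --output HDMI-0 --mode 1920x1080 --rate 60
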
I think that if you want it to never go to 4k you will need to do some
more xorg.conf setup to specify the allowed modes, rather than just
the mode at startup. But if you prevent it from going to 4k modes,
then if you want to play a 4k video file, it will scale it down to
1080p which is a complete waste of your 4k TV.
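
I have not needed this myself, but the usual sketch is a Modes line in the
Display subsection of the Screen section, roughly like the following (the
identifiers here are placeholders, and with the Nvidia driver a "metamodes"
option may be the better way to express the same thing):

Section "Screen"
Identifier "Screen0"
Device "Nvidia GT1030"
SubSection "Display"
Modes "1920x1080"
EndSubSection
EndSection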

So maybe you should be considering actually getting it to use 4k mode,
but changing the DPI settings so that text is readable. I had to do
that for my TV:

Section "Device"
Identifier "Nvidia GT1030"
Driver "nvidia"
Option "DPI" "100x100"
Option "NoLogo" "1"
EndSection
Re: Nvidia card choice for mythtv and gaming
On Mon, May 11, 2020 at 9:22 AM Stephen Worthington <
stephen_agent@jsw.gen.nz> wrote:

> On Mon, 11 May 2020 07:00:40 -0400, you wrote:
>
> >Now that I have the GT 1030 working in Xubuntu 20.04 with the Nvidia-440
> >driver, most things like Mythtv are working great.
> >
> >However, I can't seem to keep the display at 1920x1080@60hz. It wants to
> >switch back to 4K which is the EDID or automatic choice by default.
> >
> >So on boot it uses the /etc/X11/xorg.conf file that the Nvidia X server
> >setting program created to force it to 1080p. But if I switch the AV
> >Receiver away for some period of time from the computer port to DVD
> >player or anything else and then switch back to the computer it comes
> >back as 4K like it forgot how it booted. If I switch away for a short
> >time like 3-5 minutes it comes back in 1080p as I want.
> >
> >I'm thinking Ubuntu is getting an event triggered by the switching back
> >to computer port of the AVR and using auto settings for the display. If
> >I run Nvidia X server app at this point the configuration settings are
> >in auto instead of 1920x1080@60. I can either change it again or reboot.
> >
> >Or, the display power management mode is getting in the way. I had it
> >in Presentation mode and Display power management off. I'm running a
> >test now where Presentation mode is off, but I turned on the Display
> >power management but set the timer settings to 'never'. I see if that
> works?
> >
> >Anyone know what's happening?
> >
> >Jim A
>
> I do not get problems like that since my TV is only 1080p. When I
> play a 4k file, the GPU scales it down to 1080p nicely.
>
> As a workaround, you can use xrandr from a command prompt to switch
> the mode manually back to the one you want, maybe something like
> "xrandr 60" which would work on my system to put it into 1080p@60Hz.
>
> I think that if you want it to never go to 4k you will need to do some
> more xorg.conf setup to specify the allowed modes, rather than just
> the mode at startup. But if you prevent it from going to 4k modes,
> then if you want to play a 4k video file, it will scale it down to
> 1080p which is a complete waste of your 4k TV.
>
> So maybe you should be considering actually getting it to use 4k mode,
> but changing the DPI settings so that text is readable. I had to do
> that for my TV:
>
> Section "Device"
> Identifier "Nvidia GT1030"
> Driver "nvidia"
> Option "DPI" "100x100"
> Option "NoLogo" "1"
> EndSection
>

Great suggestion on changing the DPI. My problem with leaving the card in
4K mode is that the AV receiver is only 4K@30Hz capable. I think I've
found my problem though. There seem to be a bunch of places where screen
locking and sleeping are controlled in Xubuntu. I think I've found them
all. A few days of testing will prove it one way or the other.

Jim A
Re: Nvidia card choice for mythtv and gaming
I've not run into that situation before. I have my xorg set to pin the
resolution with this line in my Screen Section:

Option "metamodes" "1920x1080 +0+0 { ForceFullCompositionPipeline=On }"

The only issue I have is when I reboot my machine, if the TV is not on and
connected to the machine, my display is blank. I think this is because it
can't get the EDID info from the TV, so it doesn't know how to initialize
the output. At one time I had extracted the EDID and used my xorg.conf to
specify it, but I lost that bit along the way. For me, it isn't that big an
inconvenience as I don't reboot often and it's easy to just ensure the TV
is on when I do.
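
If you want to go that route, the Nvidia driver options involved look
roughly like this in the Device section (the connector name and file path
are placeholders; the EDID file itself has to be dumped from the TV
beforehand, e.g. with nvidia-settings):

Option "ConnectedMonitor" "HDMI-0"
Option "CustomEDID" "HDMI-0:/etc/X11/tv-edid.bin"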

I believe you can pin the output resolution to 60Hz refresh with a line
like this:

Option "metamodes" "1920x1080_60 +0+0 { ForceFullCompositionPipeline=On }"

I don't do that myself because I watch content with different frame rates
(some 24, some 30, some 60 FPS) and I want the TV to switch to whatever
the content is using. Also, I want to say that I had some issues with some
content being detected as 59.XXX FPS, and my TV/1030 can switch to that
output. So I opted for just pinning the resolution and letting the refresh
rate change with the content.

Hope that helps,
Erik

On Mon, May 11, 2020 at 6:00 AM Jim Abernathy <jfabernathy@gmail.com> wrote:

> Now that I have the GT 1030 working in Xubuntu 20.04 with the Nvidia-440
> driver, most things like Mythtv are working great.
>
> However, I can't seem to keep the display at 1920x1080@60hz. It wants to
> switch back to 4K which is the EDID or automatic choice by default.
>
> So on boot it uses the /etc/X11/xorg.conf file that the Nvidia X server
> setting program created to force it to 1080p. But if I switch the AV
> Receiver away for some period of time from the computer port to DVD
> player or anything else and then switch back to the computer it comes
> back as 4K like it forgot how it booted. If I switch away for a short
> time like 3-5 minutes it comes back in 1080p as I want.
>
> I'm thinking Ubuntu is getting an event triggered by the switching back
> to computer port of the AVR and using auto settings for the display. If
> I run Nvidia X server app at this point the configuration settings are
> in auto instead of 1920x1080@60. I can either change it again or reboot.
>
> Or, the display power management mode is getting in the way. I had it
> in Presentation mode and Display power management off. I'm running a
> test now where Presentation mode is off, but I turned on the Display
> power management but set the timer settings to 'never'. I see if that
> works?
>
> Anyone know what's happening?
>
> Jim A
>
>
Re: Nvidia card choice for mythtv and gaming
On Mon, May 11, 2020 at 9:40 AM Erik Merkle <mythtv@emerkle.net> wrote:

> I've not run into that situation before. I have my xorg set to pin the
> resolution with this line in my Screen Section:
>
> Option "metamodes" "1920x1080 +0+0 {
> ForceFullCompositionPipeline=On }"
>
> The only issue I have is when I reboot my machine, if the TV is not on and
> connected to the machine, my display is blank. I think this is because it
> can't get the EDID info from the TV, so it doesn't know how to initialize
> the output. At one time I had extracted the EDID and used my xorg.conf to
> specify it, but I lost that bit along the way. For me, it isn't that big an
> inconvenience as I don't reboot often and it's easy to just ensure the TV
> is on when I do.
>
> I believe you can pin the output resolution to 60Hz refresh with a line
> like this:
>
> Option "metamodes" "1920x1080_60 +0+0 {
> ForceFullCompositionPipeline=On }"
>
> I don't do that myself because I watch different content that have
> different FPS (some 24, some 30, some 60) and I want the TV to switch to
> whatever the content is using. Also, I want to say that I had some issues
> with some content being detected as 59.XXX FPS and my TV/1030 can switch to
> that output. So I opted for just pinning the resolution and letting the
> refresh rate change with the content.
>
> Hope that helps,
> Erik
>

Good to know, thanks.

I'm going to look into a 4K HDMI switch that will allow me to plug the PC
and the AVR into it and switch the UHD 4K@60 signal between the two. That
way I can let the PC run at 4K@60, i.e. leave it in auto mode and let the
apps change the resolution as needed. I can solve the desktop issue with
the DPI setting in xorg.conf.

Jim A
Re: Nvidia card choice for mythtv and gaming
On 5/11/20 4:00 AM, Jim Abernathy wrote:
> Now that I have the GT 1030 working in Xubuntu 20.04 with the Nvidia-440
> driver, most things like Mythtv are working great.
>
> However, I can't seem to keep the display at 1920x1080@60hz. It wants to
> switch back to 4K which is the EDID or automatic choice by default.
>
> So on boot it uses the /etc/X11/xorg.conf file that the Nvidia X server
> setting program created to force it to 1080p. But if I switch the AV
> Receiver away for some period of time from the computer port to DVD
> player or anything else and then switch back to the computer it comes
> back as 4K like it forgot how it booted. If I switch away for a short
> time like 3-5 minutes it comes back in 1080p as I want.
>
> I'm thinking Ubuntu is getting an event triggered by the switching back
> to computer port of the AVR and using auto settings for the display. If
> I run Nvidia X server app at this point the configuration settings are
> in auto instead of 1920x1080@60. I can either change it again or reboot.
>
> Or, the display power management mode is getting in the way.  I had it
> in Presentation mode and Display power management off.  I'm running a
> test now where Presentation mode is off, but I turned on the Display
> power management but set the timer settings to 'never'. I see if that
> works?
>
> Anyone know what's happening?
>
> Jim A
>

Your window manager/desktop of choice is controlling the desktop
resolution and ignoring the xorg settings. The display settings in many
window managers do not change your xorg settings; they override them
when you log in.

I had been running Xfce for years on my frontend before buying a 4K TV.
While Xfce will honor your existing setting on initial boot, it will
automatically switch based on EDID when a monitor is plugged in (such as
when changing the input on an AVR). The only way I could find to prevent
that in Xfce was by patching and compiling, which is not something I
wanted to maintain.

Other window managers behave the same way, but thankfully, some of them
allow you to define what you actually want when a new monitor is plugged
in. I switched to Mate, and it lets me define exactly what I want it set
to; it doesn't automatically choose when it gets the signal that a new
display was attached.


Re: Nvidia card choice for mythtv and gaming
On 5/12/20 2:19 PM, BP wrote:
> On 5/11/20 4:00 AM, Jim Abernathy wrote:
>> Now that I have the GT 1030 working in Xubuntu 20.04 with the
>> Nvidia-440 driver, most things like Mythtv are working great.
>>
>> However, I can't seem to keep the display at 1920x1080@60hz. It wants
>> to switch back to 4K which is the EDID or automatic choice by default.
>>
>> So on boot it uses the /etc/X11/xorg.conf file that the Nvidia X
>> server setting program created to force it to 1080p. But if I switch
>> the AV Receiver away for some period of time from the computer port
>> to DVD player or anything else and then switch back to the computer
>> it comes back as 4K like it forgot how it booted. If I switch away
>> for a short time like 3-5 minutes it comes back in 1080p as I want.
>>
>> I'm thinking Ubuntu is getting an event triggered by the switching
>> back to computer port of the AVR and using auto settings for the
>> display. If I run Nvidia X server app at this point the configuration
>> settings are in auto instead of 1920x1080@60. I can either change it
>> again or reboot.
>>
>> Or, the display power management mode is getting in the way.  I had
>> it in Presentation mode and Display power management off. I'm running
>> a test now where Presentation mode is off, but I turned on the
>> Display power management but set the timer settings to 'never'. I see
>> if that works?
>>
>> Anyone know what's happening?
>>
>> Jim A
>>
>
> Your Window Manager/Desktop of choice is controlling the desktop
> resolution and ignoring the xorg settings. The Display settings in
> many Window Managers does not change your xorg settings, it over-rides
> it when you login.
>
> I had been running Xfce for years on my frontend before buying a 4k
> tv. While Xfce will honor your existing setting on initial boot, it
> will automatically switch based on edid when a monitor is plugged in
> (such as changing the input on an AVR).  The only way I could find to
> prevent that in Xfce was by patching and compiling. Not something I
> wanted to maintain.
>
> Other Window Managers behave the same way, but thankfully, some of
> them allow you to define what you actually want when a new monitor is
> plugged in.  I switched to Mate and it allows me to hard define what I
> want it set to and doesn't automatically choose when getting the new
> display attached signal.

Interesting.  I had nothing but trouble with the Ubuntu 18.04 and 20.04
default desktop and xorg.conf fighting each other, so I went back to
Xubuntu 20.04.  I used the settings that you can get to with the
Presentation mode button on the top right panel.

I'll play with that a while, but I have used Mate before in testing. Can I
just install the Mate desktop over or alongside Xfce so I can choose at
login?

And where do I find the Mate settings you were talking about?

Jim A


Re: Nvidia card choice for mythtv and gaming
On 5/12/2020 1:19 PM, BP wrote:
> I had been running Xfce for years on my frontend before buying a 4k tv.
> While Xfce will honor your existing setting on initial boot, it will
> automatically switch based on edid when a monitor is plugged in (such as
> changing the input on an AVR).  The only way I could find to prevent
> that in Xfce was by patching and compiling.  Not something I wanted to
> maintain.

I had this issue on my Mythbuntu 16.04 BE/FE when we got a new TV and it
was annoying. I finally gave up and wrote a script that runs
continuously to check and run xrandr if the resolution isn't 1080p, so
now it fixes itself when I switch inputs. Turning my soundbar on or off
also tended to screw up the video connection and the script fixes that
as well. My Kubuntu desktop/FE connected to a different TV tends to
have handshake issues coming out of sleep, so I mapped an xrandr script
to a global hotkey. If it doesn't connect I can type my password and
hit ctrl-alt-z and fix it right up.
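
The script is nothing fancy; a simplified sketch of the idea would be
something like this (not my exact script, and the output name HDMI-0 is a
placeholder - check xrandr for what yours is called):

#!/bin/bash
# Re-assert 1080p whenever the current mode drifts away from it.
export DISPLAY=:0
while true; do
    current=$(xrandr --query | awk '/\*/ {print $1; exit}')
    if [ "$current" != "1920x1080" ]; then
        xrandr --output HDMI-0 --mode 1920x1080 --rate 60
    fi
    sleep 10
done
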
Re: Nvidia card choice for mythtv and gaming
On 5/12/20 11:44 AM, Jim Abernathy wrote:

>
> And where do I find the Mate settings your were talking about??
>
> Jim A
>

I don't remember everything I eventually did. Lots of memories of
editing config files trying to make Xfce work muddying my recollection.
Nothing obvious in my .bash_history either.

I know my xorg.conf is set to only use the display mode
(resolution/refresh frequency) I wanted. For Mate, I might have only used
the Preferences > Display app to set it, and it worked right, just that
easily. I do have a ~/.config/monitors.xml file which has the desired
setting in it, and I know Mate does use that file. A quick Google search
implies that the Mate display settings app will create it.

The only reason I went with Mate is because I hate Unity and Gnome3 and
it's what I generally use in my Ubuntu virtual machines since it works
well without hardware acceleration. If there's something you're used to
using other than Xfce I'd just test using the window manager's display
setting, then unplugging the monitor and plugging it back in to see if
it resets.


Re: Nvidia card choice for mythtv and gaming
> On May 12, 2020, at 3:58 PM, BP <lists@qucae.com> wrote:
>
> On 5/12/20 11:44 AM, Jim Abernathy wrote:
>
>> And where do I find the Mate settings your were talking about??
>> Jim A
>
> I don't remember everything I eventually did. Lots of memories of editing config files trying to make Xfce work muddying my recollection. Nothing obvious in my .bash_history either.
>
> I know my xorg.conf is set to only use the display mode (resolution/refresh freq) I wanted. For Mate, I might have only used the Preferences > Display app to set it and it worked right that easily. I do have a ~/.config/monitors.xml file which has the desired setting in it and I know Mate does use that file. A quick google search implies that the Mate display settings app will create it.
>
> The only reason I went with Mate is because I hate Unity and Gnome3 and it's what I generally use in my Ubuntu virtual machines since it works well without hardware acceleration. If there's something you're used to using other than Xfce I'd just test using the window manager's display setting, then unplugging the monitor and plugging it back in to see if it resets.
>

I think the problem has something to do with the fact that xscreensaver and Display settings are both installed and you have to make sure that you turn off the display management power settings and screen locking in both places. I’ve made all those changes; now I’ll test a bunch of cases.

Jim A


Re: Nvidia card choice for mythtv and gaming
If you disable DPMS in your Desktop environment, and disable xscreensaver
and find that DPMS and/or screensaver is still kicking in, you can try
running "xset q" to query things. For example, my system sometimes shows
this:

Screen Saver:
prefer blanking: no allow exposures: no
timeout: 1800 cycle: 900
DPMS (Energy Star):
Standby: 2700 Suspend: 0 Off: 3600
DPMS is Enabled
Monitor is On

To turn things off, I can run "xset s off -dpms":

Screen Saver:
prefer blanking: no allow exposures: no
timeout: 0 cycle: 900
DPMS (Energy Star):
Standby: 2700 Suspend: 0 Off: 3600
DPMS is Disabled

The "s off" option turns off the screensaver (timeout: 1800 to timeout: 0)
and the "-dpms" option disables DPMS. On some of my systems, I put that
command in my user's $HOME/.xprofile and it essentially disables them at
login. I have done this with Xfce as well as LXDE.

Hope that helps (YMMV),
Erik

On Tue, May 12, 2020 at 3:29 PM James Abernathy <jfabernathy@gmail.com>
wrote:

>
> > On May 12, 2020, at 3:58 PM, BP <lists@qucae.com> wrote:
> >
> > On 5/12/20 11:44 AM, Jim Abernathy wrote:
> >
> >> And where do I find the Mate settings your were talking about??
> >> Jim A
> >
> > I don't remember everything I eventually did. Lots of memories of
> editing config files trying to make Xfce work muddying my recollection.
> Nothing obvious in my .bash_history either.
> >
> > I know my xorg.conf is set to only use the display mode
> (resolution/refresh freq) I wanted. For Mate, I might have only used the
> Preferences > Display app to set it and it worked right that easily. I do
> have a ~/.config/monitors.xml file which has the desired setting in it and
> I know Mate does use that file. A quick google search implies that the
> Mate display settings app will create it.
> >
> > The only reason I went with Mate is because I hate Unity and Gnome3 and
> it's what I generally use in my Ubuntu virtual machines since it works well
> without hardware acceleration. If there's something you're used to using
> other than Xfce I'd just test using the window manager's display setting,
> then unplugging the monitor and plugging it back in to see if it resets.
> >
>
> I think the problem has something to do with the fact that xscreensaver
> and Display settings are both installed and you have to make sure that you
> turn off the display management power settings and screen locking in both
> places. I’ve made all those changes; now I’ll test a bunch of cases.
>
> Jim A
>
>
Re: Nvidia card choice for mythtv and gaming
On 5/12/20 1:28 PM, James Abernathy wrote:

>
> I think the problem has something to do with the fact that xscreensaver and Display settings are both installed and you have to make sure that you turn off the display management power settings and screen locking in both places. I’ve made all those changes; now I’ll test a bunch of cases.
>
> Jim A
>

A couple of the links I found in my browsing history (actually links
marked as read using a persisted search for "xfce resets screen size
when monitor off") that point directly to Xfce:

https://bbs.archlinux.org/viewtopic.php?id=243115
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=842536
https://www.reddit.com/r/xfce/comments/9cgf2o/how_to_prevent_desktop_from_being_reconfigured/

I remember trying to disable the display daemon, but I think I found that
that does not apply to newer versions of Xfce.

Having a persistently running script, as suggested in another reply,
looks like another solution that would work well without changing window
managers.

Re: Nvidia card choice for mythtv and gaming
On 5/12/20 4:45 PM, Erik Merkle wrote:
> If you disable DPMS in your Desktop environment, and disable
> xscreensaver and find that DPMS and/or screensaver is still kicking
> in, you can try running "xset q" to query things. For example, my
> system sometimes shows this:
>
> Screen Saver:
>   prefer blanking:  no    allow exposures:  no
>   timeout:  1800    cycle:  900
> DPMS (Energy Star):
>   Standby: 2700    Suspend: 0    Off: 3600
>   DPMS is Enabled
>   Monitor is On
>
> To turn things off, I can run "xset s off -dpms":
>
> Screen Saver:
>   prefer blanking:  no    allow exposures:  no
>   timeout:  0    cycle:  900
> DPMS (Energy Star):
>   Standby: 2700    Suspend: 0    Off: 3600
>   DPMS is Disabled
>
> The "s off" option turns off the screensaver (timeout: 1800 to
> timeout: 0) and the "-dpms" option disables DPMS. On some of my
> systems, I put that command in my user's $HOME/.xprofile and it
> essentially disables them at login. I have done this with Xfce as well
> as LXDE.
>
> Hope that helps (YMMV),
> Erik

Great tips.  xset q showed that the screensaver was off; 'standby' and
'off' were 0 (never), but DPMS was enabled. So I turned it off with the
settings app, and xset q now shows DPMS disabled.

It seems to be working at this point, but I need to switch from the PC
to another device for an hour or so and see if it comes back correctly.
I've left the PC at the MythTV top menu.

Thanks,

Jim A
Re: Nvidia card choice for mythtv and gaming
On 5/12/20 5:33 PM, BP wrote:
> On 5/12/20 1:28 PM, James Abernathy wrote:
>
>>
>> I think the problem has something to do with the fact that
>> xscreensaver and Display settings are both installed and you have to
>> make sure that you turn off the display management power settings and
>> screen locking in both places. I’ve made all those changes; now I’ll
>> test a bunch of cases.
>>
>> Jim A
>>
>
> A couple of the links I found in my browsing history (actually links
> marked as read using a persisted search for "xfce resets screen size
> when monitor off") that point directly to Xfce:
>
> https://bbs.archlinux.org/viewtopic.php?id=243115
> https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=842536
> https://www.reddit.com/r/xfce/comments/9cgf2o/how_to_prevent_desktop_from_being_reconfigured/
>
>
> I remember trying to disable the display daemon but I think I found
> that that does not apply to newer versions of Xfce.
>
> The having a persistently running script as suggested in another reply
> looks like another solution that would work well without changing
> window managers.
>
I remember that on the Raspbian desktop you had to install xscreensaver
and then disable it to get blanking fully turned off.

Jim A


Re: Nvidia card choice for mythtv and gaming
> On May 12, 2020, at 5:38 PM, Jim Abernathy <jfabernathy@gmail.com> wrote:
>
>
> On 5/12/20 4:45 PM, Erik Merkle wrote:
>> If you disable DPMS in your Desktop environment, and disable xscreensaver and find that DPMS and/or screensaver is still kicking in, you can try running "xset q" to query things. For example, my system sometimes shows this:
>>
>> Screen Saver:
>> prefer blanking: no allow exposures: no
>> timeout: 1800 cycle: 900
>> DPMS (Energy Star):
>> Standby: 2700 Suspend: 0 Off: 3600
>> DPMS is Enabled
>> Monitor is On
>>
>> To turn things off, I can run "xset s off -dpms":
>>
>> Screen Saver:
>> prefer blanking: no allow exposures: no
>> timeout: 0 cycle: 900
>> DPMS (Energy Star):
>> Standby: 2700 Suspend: 0 Off: 3600
>> DPMS is Disabled
>>
>> The "s off" option turns off the screensaver (timeout: 1800 to timeout: 0) and the "-dpms" option disables DPMS. On some of my systems, I put that command in my user's $HOME/.xprofile and it essentially disables them at login. I have done this with Xfce as well as LXDE.
>>
>> Hope that helps (YMMV),
>> Erik
> Great tips. The xset q showed that screensaver was off; 'standby' and 'off' where 0 (never), but DPMS was enabled. So I set it off with the settings app. So xset q now shows DPMS disabled.
> It seems to be working at this point, but I need to switch from the PC to another device for an hour or so and see if it comes back correctly. I've left the PC at the Mythtv top menu.
>
> Thanks,
> Jim A
>

This morning I powered up the TV and AVR. It was not on the PC input. But when I switched it to the PC input it was blank until I clicked the mouse. At that point it was back at the Mythtv Frontend main menu.

So for me this is as good as fixed. No login required, no script to execute, just a mouse click.

Jim A
Re: Nvidia card choice for mythtv and gaming
On Wed, May 13, 2020 at 9:30 AM James Abernathy <jfabernathy@gmail.com>
wrote:

>
> On May 12, 2020, at 5:38 PM, Jim Abernathy <jfabernathy@gmail.com> wrote:
>
>
> On 5/12/20 4:45 PM, Erik Merkle wrote:
>
> If you disable DPMS in your Desktop environment, and disable xscreensaver
> and find that DPMS and/or screensaver is still kicking in, you can try
> running "xset q" to query things. For example, my system sometimes shows
> this:
>
> Screen Saver:
> prefer blanking: no allow exposures: no
> timeout: 1800 cycle: 900
> DPMS (Energy Star):
> Standby: 2700 Suspend: 0 Off: 3600
> DPMS is Enabled
> Monitor is On
>
> To turn things off, I can run "xset s off -dpms":
>
> Screen Saver:
> prefer blanking: no allow exposures: no
> timeout: 0 cycle: 900
> DPMS (Energy Star):
> Standby: 2700 Suspend: 0 Off: 3600
> DPMS is Disabled
>
> The "s off" option turns off the screensaver (timeout: 1800 to timeout: 0)
> and the "-dpms" option disables DPMS. On some of my systems, I put that
> command in my user's $HOME/.xprofile and it essentially disables them at
> login. I have done this with Xfce as well as LXDE.
>
> Hope that helps (YMMV),
> Erik
>
> Great tips. The xset q showed that screensaver was off; 'standby' and
> 'off' where 0 (never), but DPMS was enabled. So I set it off with the
> settings app. So xset q now shows DPMS disabled.
>
> It seems to be working at this point, but I need to switch from the PC to
> another device for an hour or so and see if it comes back correctly. I've
> left the PC at the Mythtv top menu.
>
> Thanks,
>
> Jim A
>
>
> This morning I powered up the TV and AVR. It was not on the PC input. But
> when I switched it to the PC input it was blank until I clicked the mouse.
> At that point it was back at the Mythtv Frontend main menu.
>
> So for me this is as good as fixed. No login required, no script to
> execute, just a mouse click.
>
> Jim A
>

I did discover that some of the required settings do not make sense. If my
system is not running mythfrontend, then I can leave the power management
in Presentation mode, which turns off DPMS. However, in mythfrontend this
will not work, and I can't get MythTV to come out of blank mode. I have to
hit Ctrl-Alt-F1, log in to the CLI, and reboot.

What I have found works in all cases is the screensaver installed but
disabled, and DPMS enabled with all the timers set to Never.

Jim A
Re: Nvidia card choice for mythtv and gaming
On Thu, 14 May 2020 08:47:31 -0400, you wrote:

>I did discover that some of the required settings do not make sense. If my
>system is not running mythfrontend then I can leave the power management in
>Presentation mode which turns off DPMS. However, in Mythfrontend, this will
>not work and I can't get mythtv to come out of blank mode. I have to do
>Ctl-Alt-F1 and login to CLI and reboot.

You could probably use xrandr or xset from Ctrl-Alt-F1 to unblank it.
Something like:

xrandr -d :0 --output DFP-0 --auto

or

xset -display :0 dpms on

>What I have found works in all cases is Screensaver installed and disabled,
>and DPMS enabled with all the timers set to Never.
>
>Jim A

I am guessing this is why I have these in my /etc/X11/xorg.conf:

Section "Monitor"
Identifier "Sony TV"
Option "DPMS" "true"
EndSection

Section "ServerFlags"
Option "BlankTime" "0"
Option "StandbyTime" "0"
Option "SuspendTime" "0"
Option "OffTime" "0"
EndSection

I have had them there for many years, but recently I was scratching my
head wondering why. I can see why I had the times set to 0, but just
why I still had DPMS enabled was bothering me.
