Mailing List Archive

Problem with TV resolution after upgrading to 20.04
I've recently upgraded (via a fresh install) to Xubuntu 20.04. I'd been
using Ubuntu (desktop) for almost 10 years, but recently the GNOME
desktop has been giving me fits. I had Xubuntu on my remote frontend,
and that installation was a charm, fixing all the problems with menu
bars.

So, the installation went well, with the exception of losing a power
supply in one of my disk arrays (fixed, and the RAID arrays are all
back online), but I still have problems with the part of the system
that everyone looks at: the TV. I set the display to 3840x2160 in the
desktop, told it to hide the menu bar, and it looked great! Until I
turned the TV off and back on, at which point the resolution had
changed to 1920x1080.

After some digging I found the issue with X. In the Xorg.0.log file I
see this after switching the display to the proper resolution:

[ 52764.156] (--) NVIDIA(GPU-0): DFP-0: disconnected
[ 52764.156] (--) NVIDIA(GPU-0): DFP-0: Internal TMDS
[ 52764.156] (--) NVIDIA(GPU-0): DFP-0: 165.0 MHz maximum pixel clock
[ 52764.156] (--) NVIDIA(GPU-0):
[ 52764.186] (--) NVIDIA(GPU-0): DENON, Ltd. DENON-AVR (DFP-1): connected
[ 52764.186] (--) NVIDIA(GPU-0): DENON, Ltd. DENON-AVR (DFP-1): Internal TMDS
[ 52764.186] (--) NVIDIA(GPU-0): DENON, Ltd. DENON-AVR (DFP-1): 600.0 MHz maximum pixel clock
[ 52764.186] (--) NVIDIA(GPU-0):
[ 52764.241] (II) NVIDIA(0): Setting mode "HDMI-0: 3840x2160 @3840x2160 +0+0 {ViewPortIn=3840x2160, ViewPortOut=3840x2160+0+0}"

Then after turning the TV off I see this:

[ 52827.478] (--) NVIDIA(GPU-0): DFP-0: disconnected
[ 52827.478] (--) NVIDIA(GPU-0): DFP-0: Internal TMDS
[ 52827.478] (--) NVIDIA(GPU-0): DFP-0: 165.0 MHz maximum pixel clock
[ 52827.478] (--) NVIDIA(GPU-0):
[ 52827.508] (--) NVIDIA(GPU-0): DENON, Ltd. DENON-AVR (DFP-1): connected
[ 52827.508] (--) NVIDIA(GPU-0): DENON, Ltd. DENON-AVR (DFP-1): Internal TMDS
[ 52827.508] (--) NVIDIA(GPU-0): DENON, Ltd. DENON-AVR (DFP-1): 600.0 MHz maximum pixel clock
[ 52827.508] (--) NVIDIA(GPU-0):
[ 52827.543] (II) NVIDIA(0): Setting mode "HDMI-0: nvidia-auto-select @1920x1080 +0+0 {ViewPortIn=1920x1080, ViewPortOut=1920x1080+0+0}"

And there it stays until I go back into the display properties and put it back.

I also noticed this in the Xorg.0.log file:

[ 52762.997] (--) NVIDIA(GPU-0): DFP-0: disconnected
[ 52762.997] (--) NVIDIA(GPU-0): DFP-0: Internal TMDS
[ 52762.997] (--) NVIDIA(GPU-0): DFP-0: 165.0 MHz maximum pixel clock
[ 52762.997] (--) NVIDIA(GPU-0):
[ 52763.027] (--) NVIDIA(GPU-0): DENON, Ltd. DENON-AVR (DFP-1): connected
[ 52763.027] (--) NVIDIA(GPU-0): DENON, Ltd. DENON-AVR (DFP-1): Internal TMDS
[ 52763.027] (--) NVIDIA(GPU-0): DENON, Ltd. DENON-AVR (DFP-1): 600.0 MHz maximum pixel clock
[ 52763.027] (--) NVIDIA(GPU-0):
[ 52763.031] (==) NVIDIA(0):
[ 52763.032] (==) NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
[ 52763.032] (==) NVIDIA(0): will be used as the requested mode.
[ 52763.032] (==) NVIDIA(0):
[ 52763.032] (II) NVIDIA(0): Validated MetaModes:
[ 52763.032] (II) NVIDIA(0): "DFP-1:nvidia-auto-select"
[ 52763.032] (II) NVIDIA(0): Virtual screen size determined to be 1920 x 1080
[ 52763.039] (--) NVIDIA(0): DPI set to (30, 30); computed from "UseEdidDpi" X config
[ 52763.039] (--) NVIDIA(0): option
[ 52763.039] (II) NVIDIA: Using 24576.00 MB of virtual memory for indirect memory
[ 52763.039] (II) NVIDIA: access.
[ 52763.059] (II) NVIDIA(0): Setting mode "DFP-1:nvidia-auto-select"

Other people with a 4K monitor must have discovered this issue, but my
searching has only turned up some older issues fixed with modelines or
fake EDID files that force X not to ask the display for what's
available. I've tried some of these without success, but I figured
someone must have come across this before, so I'm humbly asking for
help.

I'm using an NVIDIA GeForce GT 1030 card, NVIDIA driver 450.51.06, a
Denon AVR-X2600H, and an LG OLED65C9PUA 4K HDR TV.

Regards,

Ken Emerson
_______________________________________________
mythtv-users mailing list
mythtv-users@mythtv.org
http://lists.mythtv.org/mailman/listinfo/mythtv-users
http://wiki.mythtv.org/Mailing_List_etiquette
MythTV Forums: https://forum.mythtv.org
Re: Problem with TV resolution after upgrading to 20.04 [ In reply to ]
On Fri, 9 Oct 2020 12:11:36 -0500, you wrote:

> [quoted message snipped]

The standard fix is to use nvidia-settings to download the EDID data
for the TV, then set up xorg.conf to use the downloaded copy of the
EDID data rather than fetching it from the TV. This seems to be a
problem that happens with AV amps more often than with the TVs
themselves, so it may not happen if you connect the TV directly to the
MythTV box. I would recommend having it directly connected when you
download its EDID data.
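As a rough sketch of what that xorg.conf Device section looks like
(the connector name DFP-1 comes from the Xorg log in the original
message; the Identifier and the /etc/X11/edid.bin path are just
example choices, so adjust them to your setup):

```
Section "Device"
    Identifier "Device0"
    Driver     "nvidia"
    # Use the saved EDID instead of querying the TV/AVR over HDMI.
    # "DFP-1" matches the connector name shown in Xorg.0.log;
    # the file path is an example, adjust it to where edid.bin lives.
    Option     "CustomEDID" "DFP-1:/etc/X11/edid.bin"
    # Treat DFP-1 as always connected, so a powered-off TV or an AVR
    # input switch does not trigger a mode change.
    Option     "ConnectedMonitor" "DFP-1"
EndSection
```

With both options set, X keeps seeing the same display with the same
modes whether or not the TV is actually on.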

It is likely that you could use an xrandr command to switch the
display back to 4K mode, if you would rather do that each time it
happens than mess around with xorg.conf.
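A minimal sketch of that xrandr approach (the output name HDMI-0 and
the 3840x2160 mode are taken from the Xorg log in the original
message, so check yours with `xrandr --query`; the guard just makes it
safe to run when no X display is available):

```shell
#!/bin/sh
# Force the TV back to 4K after a power cycle.
OUTPUT="HDMI-0"
MODE="3840x2160"
if [ -n "$DISPLAY" ] && command -v xrandr >/dev/null 2>&1; then
    # Switch the named output back to the wanted mode.
    xrandr --output "$OUTPUT" --mode "$MODE"
else
    echo "No X display; would run: xrandr --output $OUTPUT --mode $MODE"
fi
```

This could be bound to a remote key or run from a hotplug hook each
time the TV comes back on.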
Re: Problem with TV resolution after upgrading to 20.04 [ In reply to ]
> [quoted messages snipped]
Thank you. That worked. For those searching for this problem with
EDIDs and audio/video receivers (AVRs), here is the link I used to
create the edid.bin file for my TV. Plugging the TV directly into the
video card is a key point.

https://kodi.wiki/view/Archive:Creating_and_using_edid.bin_via_xorg.conf

Regards,

Ken Emerson

Re: Problem with TV resolution after upgrading to 20.04 [ In reply to ]
> [quoted message snipped]

Another reason to do this is if you connect the MythTV frontend
through an HDMI switch or have a TV with multiple inputs.

I was always having a frontend forget its resolution or just stay
blank. This was especially important on a Raspberry Pi frontend.

The method is slightly different, but creating the EDID.dat file is
what solves it.
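On the Pi (with the legacy, pre-KMS firmware video stack) the
equivalent is roughly the following; `tvservice -d` and these
config.txt options are the firmware's documented mechanism, but treat
the exact file locations as an example:

```
# /boot/config.txt - use a saved EDID instead of reading it from the TV.
# First dump it once, while the TV is on and directly connected:
#   sudo tvservice -d /boot/edid.dat
hdmi_edid_file=1
# Optional: pretend the display is always attached, so a powered-off
# TV or a switched-away HDMI input doesn't blank the frontend.
hdmi_force_hotplug=1
```

After a reboot the firmware uses the saved EDID regardless of what the
TV or switch reports.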

Jim A
Re: Problem with TV resolution after upgrading to 20.04 [ In reply to ]
On Fri, 9 Oct 2020 17:02:04 -0500, you wrote:

> [quoted messages snipped]

There is an easier way to get the EDID data than those on that page.
If you run nvidia-settings, it has an option to read the EDID data
from a device and store it directly to a file.
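If nvidia-settings isn't handy, the kernel also exposes each
connector's EDID under sysfs, so something like this can grab it. Note
that connector names (e.g. card0-HDMI-A-1) vary between machines, and
with the proprietary NVIDIA driver these sysfs files are sometimes
empty, in which case nvidia-settings remains the reliable route:

```shell
#!/bin/sh
# Copy the first non-empty EDID the kernel exposes to ./edid.bin.
FOUND=0
for f in /sys/class/drm/card*/edid; do
    # Skip missing or empty EDID files (disconnected outputs).
    [ -s "$f" ] || continue
    cp "$f" ./edid.bin
    echo "Saved EDID from $f"
    FOUND=1
    break
done
[ "$FOUND" -eq 1 ] || echo "No connected display EDID found under /sys/class/drm"
```

The resulting edid.bin can then be referenced from xorg.conf just like
one exported by nvidia-settings.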