Hi Mark
Here are a few observations.
**VAAPI acceleration
OpenGL-HW - no deinterlacer is offered.
Resolution change is OK - it would fail on the old VAAPI.
1080i - the ticker is jerky due to single-frame deinterlacing.
So it is inferior to the old vaapi in deinterlacing, but superior in
handling resolution changes.
**VAAPI decode only
The VA-API deinterlacers are not being used - if you select one of them
the output is still interlaced.
Video output is corrupt after a change of resolution.
1080i - the ticker is jerky due to interlacing.
So it is inferior to the old vaapi2 both in deinterlacing and in
handling resolution changes.
**OpenGL Slim (s/w decode)
Video output is corrupt after a change of resolution.
Screenshot of the corrupted video after a resolution change:
https://imgur.com/LbACR9c
I have not yet taken much of a look at the code in the render branch. I
think that tracing through it during playback would help my understanding.
I have invented my own deinterlace shader and tested it with a simple
program - I am sure it is inferior, but I am trying out various things as
a learning exercise.
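For what it is worth, the sort of thing I have been experimenting with is
roughly the following - a minimal one-field (bob-style) fragment shader.
The texture and uniform names are my own invention, not anything taken
from the render branch:

    // Minimal one-field (bob) deinterlace fragment shader.
    uniform sampler2D s_texture0;   // full interlaced frame
    uniform float     u_lineheight; // 1.0 / texture height, in texture coords
    uniform float     u_field;      // 0.0 = top field, 1.0 = bottom field
    varying vec2      v_texcoord;

    void main(void)
    {
        // Which scan line are we on, and does it belong to the field
        // currently being displayed?
        float line = v_texcoord.y / u_lineheight;
        if (mod(floor(line) + u_field, 2.0) < 0.5)
        {
            // Line is in the current field - use it directly.
            gl_FragColor = texture2D(s_texture0, v_texcoord);
        }
        else
        {
            // Line belongs to the other field - interpolate from the
            // lines above and below, which are in the current field.
            vec2 above = vec2(v_texcoord.x, v_texcoord.y - u_lineheight);
            vec2 below = vec2(v_texcoord.x, v_texcoord.y + u_lineheight);
            gl_FragColor = 0.5 * (texture2D(s_texture0, above) +
                                  texture2D(s_texture0, below));
        }
    }

It simply keeps the lines of the current field and interpolates the
missing ones, so it throws away half the vertical resolution - nothing
clever, but enough to see the principle.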
I suppose the main complex things that need to be done in shaders are
deinterlacing and conversion from the various YUV encodings to RGB for
display. I plan to look at the code and try to understand how those work.
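My rough understanding of the conversion side, taking NV12 with
limited-range BT.709 as an example, is something along these lines. The
names are mine, and I am assuming the UV plane is uploaded as a
two-channel texture - I have not checked how the render branch actually
arranges it:

    // Minimal NV12 (limited-range BT.709) to RGB fragment shader.
    uniform sampler2D s_luma;    // full-resolution Y plane
    uniform sampler2D s_chroma;  // half-resolution interleaved UV plane
    varying vec2      v_texcoord;

    void main(void)
    {
        float y  = texture2D(s_luma,   v_texcoord).r;
        vec2  uv = texture2D(s_chroma, v_texcoord).rg;

        // Expand limited range (16-235 luma, 16-240 chroma) and
        // centre the chroma around zero.
        y = (y - 0.0625) * 1.164;
        float u = uv.r - 0.5;
        float v = uv.g - 0.5;

        // BT.709 conversion coefficients.
        gl_FragColor = vec4(y + 1.793 * v,
                            y - 0.213 * u - 0.533 * v,
                            y + 2.112 * u,
                            1.0);
    }

I assume other encodings (BT.601, full range, 10-bit) just mean different
constants and scaling, which is presumably where it gets fiddly.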
At some point I hope to try integrating direct NVDEC display.
Let me know if I should look into the issues reported above, and if I
should make updates to the render branch. I don't want to step on what
you are doing.
It seems that debugging a shader is not easy - you can't do printf in
the code or trace through it with gdb to see where something is going wrong.
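The only substitute for printf I have found so far is to write whatever
intermediate value looks suspicious straight out as the fragment colour
and inspect the picture - again just my own experiment, not anything from
the branch:

    uniform sampler2D s_texture0;
    varying vec2      v_texcoord;

    void main(void)
    {
        // "printf" by eye: dump the value being checked (here the raw
        // chroma samples) to the output colour and look at it.
        vec2 uv = texture2D(s_texture0, v_texcoord).rg;
        gl_FragColor = vec4(uv.r, uv.g, 0.0, 1.0);
    }

Crude, but it at least shows whether the inputs are what you think they
are.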
Peter