Mailing List Archive

Re: judder problems with nvidia 358.16
Thank you, Mark. From the log entries, it keeps coming back to MythTV
having parsing and matching issues between all the contradictory XRandr and
NVIDIA data. I have pretty much given up trying to figure out what is going
wrong there.

Happily, I am able to report that I have gotten the following working in a
proof of concept patch:

1. Report all the non-integer refresh rates in the MythTV GUI accurately
for a monitor, using only XRandr (see the sketch after this list).
2. Switch to the exact modeline for that selection when playing a video.
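
For anyone curious where the exact rates come from, here is a minimal
sketch of the calculation that the RandR 1.2 mode info makes possible; it
is the same arithmetic the xrandr utility uses, shown as an illustration
rather than a copy of the patch:

    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    // Exact refresh rate of a mode: pixel clock divided by the total
    // number of pixels scanned out per frame.
    static double ModeRefreshRate(const XRRModeInfo *mode)
    {
        double vTotal = mode->vTotal;

        if (mode->modeFlags & RR_DoubleScan)
            vTotal *= 2.0;  // each scanline is scanned out twice
        if (mode->modeFlags & RR_Interlace)
            vTotal /= 2.0;  // report the field rate, as xrandr does

        if (mode->hTotal == 0 || vTotal == 0.0)
            return 0.0;

        // e.g. 148352000 / (2200 * 1125) = 59.94 for a 1080p59.94 modeline
        return (double)mode->dotClock / ((double)mode->hTotal * vTotal);
    }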

I have some follow-ups to do on a couple of points (besides cleanup) before
I think this is usable:

1. Prevent X from panning the screen around when we do switch to a lower
resolution (see the panning sketch after this list).
   - This should be simple, but for some strange reason it seems to require
special steps, including disabling the monitor before the attempt, under
some circumstances (possibly NVIDIA related).
2. Figure out what MythTV is doing (or should do) on the attempts to switch
to a GUI resolution.
   - This is reported as an attempt to switch to a refresh rate of 0.
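
On the panning point, this is roughly what I have in mind; a hypothetical
sketch using the RandR 1.3 panning calls, assuming that zeroing the panning
area is enough (the NVIDIA quirk above may need more than this):

    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    static void DisablePanning(Display *dpy, XRRScreenResources *res,
                               RRCrtc crtc)
    {
        XRRPanning *pan = XRRGetPanning(dpy, res, crtc);
        if (!pan)
            return;

        // A zero-sized panning/tracking area tells the server not to pan.
        pan->width = pan->height = 0;
        pan->track_width = pan->track_height = 0;

        XRRSetPanning(dpy, res, crtc, pan);
        XRRFreePanning(pan);
    }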

I have also not had a chance to sync up with 0.29 yet.

Thanks,
Alex
Re: judder problems with nvidia 358.16
On August 9 you wrote:

> Happily, I am able to report that I have gotten the following working in
> a proof of concept patch [...]

VDPAU scaling and deinterlacing work pretty well. My MythTV box looks way
better than my Firestick running Kodi, for example: that shows lots of
interlacing artifacts no matter what I set.

I would not worry so much about the resolution; it is the frame rate that
is critical for getting rid of judder. If it stayed at the native TV
resolution and scaled everything using VDPAU, I think that would be good
enough, as long as the frame rate matches the source.

Thanks again for looking into this.

Mark
Re: judder problems with nvidia 358.16
I have opened a ticket with my attempted patch at
https://code.mythtv.org/trac/ticket/13100. It *should* work on any recent
version of MythTV and X.Org: the affected file hasn't otherwise changed in
years.

Thanks,
Alex
Re: judder problems with nvidia 358.16
I just realized this assumed you have set a primary monitor, which isn't
something one would usually bother to do with a single display, and X
doesn't consider a single display primary by default. I will try to fix
that.
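
The fix will probably amount to a fallback along these lines; a
hypothetical sketch, assuming that falling back to the first connected
output is acceptable when no primary is configured:

    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    static RROutput PickOutput(Display *dpy, Window root,
                               XRRScreenResources *res)
    {
        // Use the configured primary output when there is one.
        RROutput primary = XRRGetOutputPrimary(dpy, root);
        if (primary != None)
            return primary;

        // No primary set (the common single-display case): take the
        // first output that actually has a monitor connected.
        for (int i = 0; i < res->noutput; ++i)
        {
            XRROutputInfo *out = XRRGetOutputInfo(dpy, res,
                                                  res->outputs[i]);
            if (!out)
                continue;
            bool connected = (out->connection == RR_Connected);
            XRRFreeOutputInfo(out);
            if (connected)
                return res->outputs[i];
        }
        return None;
    }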

-Alex
Re: judder problems with nvidia 358.16
On 08/13/2017 11:40 PM, Alex Halovanic wrote:
> I just realized this assumed you set a primary monitor [...]

Hi Alex

It is best to add comments like this to the ticket, so that if somebody
is looking at the ticket they know the status.

Peter
Re: judder problems with nvidia 358.16
Thanks; I only meant it as a quick status update before actually fixing
the patch, which I wasn't able to get to as quickly as I hoped.

I have updated the ticket with a newer version. Any testing anyone is
willing to do would be helpful; I tried my best with the monitors and
videos available to me.

-Alex
Re: judder problems with nvidia 358.16
On August 14 you wrote:

> I have updated the ticket with a newer version. Any testing anyone is
> willing to do would be helpful [...]

Do you think you could write up a short description of the algorithm you
are trying to implement? In particular, what is selected for interlaced
sources? Maybe put it in comments? I believe the previous code chose the
matching progressive refresh rate and let the deinterlacer do its job.

I will try to test the patch, but it would really help to know what it is
supposed to do. I will admit I do not have the time (or the expertise) to
go through your code. It has been a while since I compiled a custom MythTV,
so it may be a few weeks, but I will try to test it out.

Thanks for this work,

Mark
Re: judder problems with nvidia 358.16
Hi Mark,

I should clarify that I did *not* change the algorithm that MythTV uses to
select the best refresh rate for a video's FPS when you use the Auto
setting. I only changed the lookup of which "Width + Height + Refresh"
combinations are *available*. You can see these when selecting options
other than Auto in the menu. As part of that, I also necessarily had to
change how one switches to a given modeline (whether chosen by hand or
automatically).
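
The mode switch itself boils down to something like the following; a
minimal sketch assuming a single output on a single CRTC, with the
primary-output lookup and error handling of the real patch left out:

    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    static bool SwitchToMode(Display *dpy, Window root, RRMode mode)
    {
        XRRScreenResources *res = XRRGetScreenResources(dpy, root);
        if (!res)
            return false;

        bool ok = false;
        XRROutputInfo *out = (res->noutput > 0)
            ? XRRGetOutputInfo(dpy, res, res->outputs[0])
            : nullptr;
        if (out && out->crtc)
        {
            XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, out->crtc);
            if (crtc)
            {
                // Keep position, rotation, and output list; change only
                // the mode itself.
                ok = XRRSetCrtcConfig(dpy, res, out->crtc, CurrentTime,
                                      crtc->x, crtc->y, mode,
                                      crtc->rotation, crtc->outputs,
                                      crtc->noutput) == RRSetConfigSuccess;
                XRRFreeCrtcInfo(crtc);
            }
        }
        if (out)
            XRRFreeOutputInfo(out);
        XRRFreeScreenResources(res);
        return ok;
    }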

The reason I did not have to change the Auto refresh rate algorithm is that
it already took more precise decimal values into account. The actual
difficulty was that the most straightforward XRandr API implementation for
looking up and setting refresh rates only used integers (for that matter,
so does the OS X one, I believe; there is a sketch of the integer-only
lookup after this list). This resulted in either:

- Not having a complete list of rates available (59.94 and 60.00 both
appear rounded out as 60).
   - This was all non-NVIDIA drivers.
- Or having fake (but distinct) values that involve quite a bit of
guesswork to figure out what they really were, both by MythTV and by users
in the Settings screen.
   - This was NVIDIA, but the guesswork MythTV was able to use broke
somewhere along the way.
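
For illustration, the legacy RandR 1.0 query looks roughly like this; a
minimal sketch, and note that XRRRates() hands rates back as shorts, so
59.94 and 60.00 are indistinguishable:

    #include <cstdio>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main()
    {
        Display *dpy = XOpenDisplay(nullptr);
        if (!dpy)
            return 1;

        int nsizes = 0;
        XRRScreenSize *sizes = XRRSizes(dpy, DefaultScreen(dpy), &nsizes);

        for (int s = 0; s < nsizes; ++s)
        {
            int nrates = 0;
            // Rates come back as integer Hz: 59.94 and 60.00 both
            // print as "60".
            short *rates = XRRRates(dpy, DefaultScreen(dpy), s, &nrates);
            for (int r = 0; r < nrates; ++r)
                printf("%dx%d @ %d Hz\n", sizes[s].width,
                       sizes[s].height, rates[r]);
        }

        XCloseDisplay(dpy);
        return 0;
    }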

For those interested in what the actual "best rate" algorithm that I did
not touch does: it mostly involves looking for an available refresh rate
(for a given screen size) that seems "close enough". This is likely because
frames per second is not a precise measure and fluctuates a bit. It does
prioritize a refresh rate that is double the frame rate for videos in the
24.5-30.5 FPS range. This is because an interlaced video will come out to
something like ~29.97 FPS but needs a refresh rate of 59.94 to display its
alternating fields, whether that is 59.94i or 59.94p. The progressive
option is usually the better choice, since the interlaced mode will only
work if the resolution is also identical AND the TV has a good
deinterlacer. But, as near as I can tell, MythTV does not currently make
any comparisons between interlaced and non-interlaced modes: there are no
flags passed around indicating that a mode is interlaced and, for that
matter, no indication that the video itself was, beyond some guesses around
those FPS ranges.
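
In rough terms, that matching behaves like the following; a hypothetical
sketch of the behaviour described above, not a copy of the real
DisplayResScreen code, which differs in its details:

    #include <cmath>
    #include <vector>

    double FindBestRefreshRate(double videoFps,
                               const std::vector<double> &rates)
    {
        const double tolerance = 0.01;  // FPS is fuzzy; allow a little slop

        // For ~25-30 FPS material (typically interlaced sources), prefer
        // a display rate of exactly double the frame rate,
        // e.g. 29.97 -> 59.94.
        if (videoFps > 24.5 && videoFps < 30.5)
        {
            for (double rate : rates)
                if (std::fabs(rate - 2.0 * videoFps) < tolerance)
                    return rate;
        }

        // Otherwise fall back to whichever available rate is closest.
        double best = 0.0;
        for (double rate : rates)
            if (std::fabs(rate - videoFps) < std::fabs(best - videoFps))
                best = rate;
        return best;
    }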

It's quite likely there is some room for improvement in the auto-matching
algorithm as well, but I didn't want to bite off too much on this attempt.
For those who want a closer look, it is implemented here:
https://github.com/MythTV/mythtv/blob/master/mythtv/libs/libmythui/DisplayResScreen.cpp#L119

Thanks,
Alex
