Yes, I did indeed have a setting of width > 720 and height > 576 for the
profile, and no default setting. The default setting used to be software
decoding. I have removed the width and height restrictions, and videos of
all resolutions now play with deinterlacing. OK.
But, and this is the crucial part, now that I have removed the width/height
restrictions, live TV also uses deinterlacing!
This suggests that live TV can select the wrong profile, i.e. it does not
look at the width and height settings in the profile. Or it selects a
profile while width and height are still zero (because no video has been
decoded yet).
I agree about using all available CPU cores as default value.
About software decoding: this has always been my preferred setting when the
CPU was fast enough, because software decoding is usually more capable of
handling errors in the input stream, in the sense that its error
concealment is better, so you do not see the errors. At least with
MPEG-2/SD this was the case. But given that I have been watching via the
VDPAU decoder for the last few years, this is not much of an issue for me.
My cable signal is perfect; for terrestrial and satellite reception the
software decoding option would be useful.
On Tue, 18 Feb 2020 at 13:19, Mark Kendall <firstname.lastname@example.org> wrote:
> On Mon, 17 Feb 2020 at 22:30, Mark Kendall <email@example.com> wrote:
>> I *think* the root problem is something like:
>> - the above profile is chosen
>> - your SD channels then do not meet the criteria for this profile and it
>> is filtered out
>> - there is no other profile - and everything defaults to 'off' (i.e. no
>> hardware accel, no deint etc)
> OK I've reproduced that profile and it does indeed drop back to no
> deinterlacing. There are a couple of issues here:
> - when I updated the database video profiles (as part of the render
> merge), I missed the vdpau and openglvaapi render options when using ffmpeg
> decoding - which is why your (I'm assuming) fallback profile of software
> decode is rejected. Not quite sure how to deal with that at this stage -
> slightly reluctant to push another database update though I could just
> handle it in the code for now. Profiles using software decode and
> vdpau/vaapi rendering should have been switched to software decode and
> opengl rendering (the deinterlacers should have been updated correctly).
> - we really should have a suitable 'default' profile when everything else
> is rejected. It will default to ffmpeg decoding and opengl output but it
> would be better if it also defaulted to the number of CPUs present for
> decoding and some basic software deinterlacing.
> - I pretty much forgot about the previous option of using software
> decoding but vdpau/vaapi for rendering. Is this something that people need?
> The only benefit now would be to make the most of VDPAU/VAAPI/NVDEC etc
> deinterlacers. I'm just not sure how useful it is for the extra code it
> would require. Most VDPAU, VAAPI and NVDEC hardware will decode anything
> that is interlaced. VideoToolbox has no deinterlacing support (at least not
> via FFmpeg), drmprime, mmal and v4l2 are similar - and mediacodec is a law
> unto itself.
> With respect to your current issue, I will put something in place to fix
> the vdpau rendering case. Otherwise you could just remove the resolution
> check for using VDPAU decoding or add a new profile entry that explicitly
> sets the fallback to software decoding and the deinterlacing options you
> want.
> mythtv-dev mailing list
> MythTV Forums: https://forum.mythtv.org