Author Topic: nVidia, VSync/Blank and jerky video...  (Read 6613 times)

colinjones

  • Alumni
  • LinuxMCE God
  • *
  • Posts: 3003
    • View Profile
nVidia, VSync/Blank and jerky video...
« on: May 17, 2008, 04:00:49 pm »
Does anybody know if this is good advice? http://wiki.linuxmce.org/index.php/Nvidia_Card_Tweaks_For_Better_MythTV_and_UI_Performance

And if so, is it superseded by the UseEvents option in xorg.conf?

I can't see how you would avoid tearing if output is not synced with the vertical blanking period. Also, the page doesn't specify what is meant by "high CPU". Playing media files and ripped DVDs I get anywhere between 20 and 30% CPU for the xine process, which doesn't sound high to me (AMD 5200+ in 64-bit mode), but I still get jerky video....
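For anyone following along, the UseEvents option mentioned here lives in the Device section of xorg.conf. This is a sketch only - the Identifier should match whatever your existing Device section uses, option availability depends on the NVIDIA driver version, and whether it actually helps is exactly what's in question in this thread:

```
Section "Device"
    Identifier "nvidia0"        # use your existing identifier
    Driver     "nvidia"
    # Make the driver sleep waiting for events rather than polling;
    # often suggested as a fix for high CPU usage
    Option     "UseEvents" "true"
EndSection
```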

skerit

  • Veteran
  • ***
  • Posts: 56
    • View Profile
Re: nVidia, VSync/Blank and jerky video...
« Reply #1 on: May 18, 2008, 12:03:52 pm »
I have the same problem.

I tried the "vsync" option (which was already on, actually) but it didn't help at all..

colinjones

  • Alumni
  • LinuxMCE God
  • *
  • Posts: 3003
    • View Profile
Re: nVidia, VSync/Blank and jerky video...
« Reply #2 on: May 18, 2008, 03:06:25 pm »
By the way, I tried glxgears before and after turning off that option. With it on (synced) I was getting 30% CPU on ISO playback, as I said, and roughly 30-40 fps in glxgears. With it off, I get 100% CPU and about 1100+ fps! So it is obviously making a big difference. ISO playback CPU comes down to about 15%, which I guess is consistent with vsyncing being inefficient (although I don't see why it should be if it is written properly - threaded/sleeping/non-blocking...), so I have left it like that....
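If anyone wants to reproduce this comparison, the NVIDIA OpenGL driver honours the `__GL_SYNC_TO_VBLANK` environment variable, so both cases can be run back to back without touching nvidia-settings or xorg.conf. A sketch - it needs a running X session on the NVIDIA binary driver:

```shell
# vsync off: frame rate limited only by the GPU/CPU (expect hundreds of fps
# or more, and high CPU, as described above)
__GL_SYNC_TO_VBLANK=0 glxgears

# vsync on: frame rate capped at the display refresh (expect roughly 50-60 fps)
__GL_SYNC_TO_VBLANK=1 glxgears
```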

However, it definitely does NOT solve the jerkiness, and now I have been watching video for a while with that setting, as expected, the tearing has come back (albeit reasonably minor)...

Col

jeff_rigby

  • Regular Poster
  • **
  • Posts: 46
    • View Profile
Re: nVidia, VSync/Blank and jerky video...
« Reply #3 on: May 21, 2008, 04:22:07 pm »
Try using HDMI or DVI at 1080i or 720p to a good TV (transparency turned off).  This has the TV doing the video processing and not the video card.  This eliminates all problems.

colinjones

  • Alumni
  • LinuxMCE God
  • *
  • Posts: 3003
    • View Profile
Re: nVidia, VSync/Blank and jerky video...
« Reply #4 on: May 22, 2008, 01:26:37 am »
That's all I use - HDMI to HDMI with a 46" LCD TV capable of 1080p - and it has always stuttered, even on just the screen saver zooming and panning! (With the screensaver, the pans just look uneven/not smooth.)

1audio

  • Addicted
  • *
  • Posts: 552
    • View Profile
Re: nVidia, VSync/Blank and jerky video...
« Reply #5 on: May 23, 2008, 06:06:22 pm »
The 1080i vs. 1080p difference may be significant. At 1080p the CPU is doing the deinterlacing, something PCs are not good at; the display may be much better at it. There are no 1080p 60 Hz sources: Blu-ray is 1080p 24 Hz, and broadcast is 1080i or 720p. I will try it as well, since the tearing is annoying. However, we are hoping the CPU won't be doing much beyond shuffling decoded bits to the video output. What happens with 480i remains to be seen.

colinjones

  • Alumni
  • LinuxMCE God
  • *
  • Posts: 3003
    • View Profile
Re: nVidia, VSync/Blank and jerky video...
« Reply #6 on: May 24, 2008, 01:28:49 am »
Hmmm, this brings up the question again: what on earth is the "deinterlace" setting doing? It defaults to off, and I have it set to off - when I turn it on, there is a noticeable CPU hit. But I don't think the CPU is doing deinterlacing in the off position, as the CPU usage doesn't suggest that. Also, most of my sources would be progressive (TV torrents) but smaller than 1080, so it's just scaling.

Either way, the bit rates of these sources vary widely, from fairly low compression to high compression, and bit rate usually matters much more to CPU usage (unless you're comparing 480i with 1080p!), as it can be anywhere between 1Mb/s and 15Mb/s depending on how the show was ripped...

I will have a think about switching to 1080i, but it could be difficult getting a modeline for that which works with my TV (I've had all sorts of problems!) - and it is likely I will end up increasing the load if the CPU really is doing this... as I say, most of my content is progressive, so it would end up having to interlace it :)
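On the modeline front: a starting point can be generated rather than derived by hand - `cvt` (or `gtf` on older systems) prints a candidate modeline that can be pasted into the Monitor section of xorg.conf. A sketch for the 1080p50 case discussed here; interlaced modes usually still have to be written by hand, and the TV may reject CVT timings over HDMI:

```shell
# Print a candidate CVT modeline for 1920x1080 at 50 Hz, then copy the
# "Modeline ..." line it outputs into the Monitor section of xorg.conf
cvt 1920 1080 50
```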

In any case, even if this is a solution for others, it is unlikely to be for me, because as I mentioned, the jerkiness occurs even on the screen saver pics... and interlacing isn't relevant there... damn :(

1audio

  • Addicted
  • *
  • Posts: 552
    • View Profile
Re: nVidia, VSync/Blank and jerky video...
« Reply #7 on: May 24, 2008, 01:42:15 am »
The interlace settings are an arbitrary mapping of the DScaler settings (http://deinterlace.sourceforge.net/index.php) that are part of Xine. The settings relate mostly to CPU usage. If you are going from an interlaced source to a progressive display, the deinterlacing must be done, BUT the quality matters a lot. Just weaving the two fields' lines together makes for a very degraded picture - visible as a Venetian-blind effect on moving objects. This is because the progressive frame has two images superimposed where the object moved between fields. A good deinterlacer will figure out where each pixel is going between frames - not easy.

colinjones

  • Alumni
  • LinuxMCE God
  • *
  • Posts: 3003
    • View Profile
Re: nVidia, VSync/Blank and jerky video...
« Reply #8 on: May 24, 2008, 01:59:28 am »
Understood and agreed - I realise how deinterlacing works, and I have played extensively with the options in VDR, because live SDTV is often a problem for me with the combing/Venetian-blind effect. But I never get this with my torrent sources, as they come either from progressive sources or the person who grabbed them has already deinterlaced (using a good deinterlacer!).

So on balance, I believe using a progressive display mode reduces (almost eliminates) any deinterlacing that needs to be done in my case. That, the low CPU, and the fact that the jerkiness is still apparent in the Pluto screen saver suggest to me that interlacing isn't the issue here - although I did agonise over it initially :)

1audio

  • Addicted
  • *
  • Posts: 552
    • View Profile
Re: nVidia, VSync/Blank and jerky video...
« Reply #9 on: May 24, 2008, 02:15:12 am »
This is a little off-the-wall, but are you running the display at the same frame rate as the source? Otherwise you may be dropping frames somewhere in the chain. There are ways to look for this, but they made no sense to me - way beyond my Linux skills.

I notice some jerkiness, but only on really demanding material (HD at 20Mbps+ going in).
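On the frame-rate point: even with no frames dropped, a rate mismatch between source and display causes visible judder, because each source frame must be held for a whole number of display refreshes. A quick sketch of why 25 fps material on a 60 Hz display can never be perfectly smooth:

```shell
# Each 25 fps source frame on a 60 Hz display should get 60/25 = 2.4 refreshes,
# which is impossible; integer rounding produces an uneven 2,2,3,2,3 cadence.
fps=25; hz=60
for i in 0 1 2 3 4; do
  start=$(( i * hz / fps ))        # refresh index where frame i first appears
  end=$(( (i + 1) * hz / fps ))    # refresh index where frame i+1 takes over
  echo "frame $i held for $(( end - start )) refreshes"
done
```

At 50 Hz every 25 fps frame gets exactly 2 refreshes, which is why matching the display rate to the source (e.g. 1080p50 for DVB material) matters.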

colinjones

  • Alumni
  • LinuxMCE God
  • *
  • Posts: 3003
    • View Profile
Re: nVidia, VSync/Blank and jerky video...
« Reply #10 on: May 24, 2008, 03:28:36 am »
Not at the moment - I was trying for ages to work out how to get it running at 1080p50. It just wouldn't do it through HDMI, even though this is a DVB country and so 50Hz, and the TV worked happily at 50Hz when watching broadcast directly. Eventually, I threatened Samsung with demanding my money back unless they got someone on the case. They called me straight away and found an option, "Just Scan", that changed it from "Mode Not Supported" to working perfectly at 50Hz. Obviously, one of my main suspicions was frame dropping, as you say, between source and display. It made absolutely no difference!

Now that I have rebuilt 0710, I didn't bother setting it to 50Hz and let it go to 60Hz, because the "Just Scan" option doesn't hold between reboots of the core, and resetting it every time is a pain when it doesn't fix the problem!

It's incredibly annoying being so close but not able to fix that final problem. I have asked all over the place about how to troubleshoot the "chain", as you put it, as that is exactly what I want to do, but it well outstrips my Linux abilities too... so far nobody has even replied, so it's not looking good :(

totallymaxed

  • LinuxMCE God
  • ****
  • Posts: 4660
  • Smart Home Consulting
    • View Profile
    • Dianemo - at home with technology
Re: nVidia, VSync/Blank and jerky video...
« Reply #11 on: May 24, 2008, 10:36:16 am »
Quote from: colinjones on May 24, 2008, 03:28:36 am

Not at the moment - I was trying for ages to work out how to get it running at 1080p50. It just wouldn't do it through HDMI, even though this is a DVB country and so 50Hz, and the TV worked happily at 50Hz when watching broadcast directly. Eventually, I threatened Samsung with demanding my money back unless they got someone on the case. They called me straight away and found an option, "Just Scan", that changed it from "Mode Not Supported" to working perfectly at 50Hz. Obviously, one of my main suspicions was frame dropping, as you say, between source and display. It made absolutely no difference!

Now that I have rebuilt 0710, I didn't bother setting it to 50Hz and let it go to 60Hz, because the "Just Scan" option doesn't hold between reboots of the core, and resetting it every time is a pain when it doesn't fix the problem!

It's incredibly annoying being so close but not able to fix that final problem. I have asked all over the place about how to troubleshoot the "chain", as you put it, as that is exactly what I want to do, but it well outstrips my Linux abilities too... so far nobody has even replied, so it's not looking good :(

Colin, I have to think that it must be your hardware causing this issue... we don't see it on our systems, and I am pretty sure that if it were a general problem, plenty of people would be reporting it here too.

Apart from a couple of situations, with a specific range of hardware, we just use the AVwizard to do the configs on our systems.

Andrew
Andy Herron,
CHT Ltd

For Dianemo/LinuxMCE consulting advice;
@herron on Twitter, totallymaxed+inquiries@gmail.com via email or PM me here.

Get Dianemo-Rpi2 ARM Licenses http://forum.linuxmce.org/index.php?topic=14026.0

Get RaspSqueeze-CEC or Raspbmc-CEC for Dianemo/LinuxMCE: http://wp.me/P4KgIc-5P

Facebook: https://www.facebook.com/pages/Dianemo-Home-Automation/226019387454465

http://www.dianemo.co.uk

cobradevil

  • Regular Poster
  • **
  • Posts: 48
    • View Profile
Re: nVidia, VSync/Blank and jerky video...
« Reply #12 on: May 24, 2008, 11:07:43 am »
Hello all,

I have a samsung 37" 1080p screen and had a lot of troubles setting it up.

The AV wizard did not configure my TV correctly, so then everybody talks about custom modelines and stuff, but what I saw later is that LinuxMCE disables EDID information in the xorg config.
Does anyone know why?

===========
--use-edid, --no-use-edid
    Enable or disable use of the EDID (Extended Display Identification Data) from your display device(s). The EDID will be used for driver operations such as building lists of available modes, determining valid frequency ranges, and computing the DPI (Dots Per Inch). This option defaults to TRUE (the NVIDIA X driver will use the EDID, when available). It is NOT recommended that you use this option to globally disable use of the EDID; instead, use '--no-use-edid-freqs' or '--no-use-edid-dpi' to disable specific uses of the EDID.
===========

When the AV wizard had configured my TV as standard 1024x768, UI mask 2, alpha blending zero (for better video playback), I edited my xorg.conf manually:

        Section "Device"
                ...
                Option "UseEdidDpi" "false"
                Option "UseEDID" "true"
        EndSection

These options are for NVIDIA drivers only!

Then I rebooted my MD and voila, it was running perfectly at 1080p.

I have also seen that alpha blending is a real performance penalty!
So disable that and see what happens.
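A quick way to check whether the driver actually picked up the EDID after a change like this is the X server log (path and log contents vary by distro and driver version - a diagnostic sketch, not a guaranteed recipe):

```shell
# Show what the NVIDIA driver reported about EDID and mode validation
# on the last X start
grep -iE 'edid|modeline' /var/log/Xorg.0.log
```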

I guess that within the next year video playback support on Linux will get better!

Good luck

William van de Velde