Author Topic: Improve ATI/INTEL UI2 Support  (Read 5880 times)

hari

  • Administrator
  • LinuxMCE God
  • *****
  • Posts: 2428
    • View Profile
    • ago control
Improve ATI/INTEL UI2 Support
« on: December 03, 2008, 12:55:53 am »
this is a follow up from http://forum.linuxmce.org/index.php?topic=5797.30

I got an ATI-based board for exactly that reason. We also got an Intel-based TEO-X unit for testing. It's not as if the core devs ever really liked the "go nvidia" recommendation. Tschak has already given pointers to the code sections in question. It would help if somebody could isolate the needed GL stuff and write a small reproducer exercising the functions we need.

A summary from danielk:
The X.org extensions that UI2 uses beyond the usual suspects are:
  XDamage
  XComposite
  XRender

The UI is rendered to an OpenGL surface and the video is rendered to
an XVideo surface, so both need to be supported with XComposite. Also,
OpenGL must support the GLX_RGBA_BIT render type; basically it needs
to support RGB buffers with an alpha channel. We don't need any alpha
for the XVideo since it's always on the bottom of the stack and has
full opacity.
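
For reference, the "small reproducer" could start with a capability probe along these lines (an untested sketch; only the checks themselves matter, not the exact attribute values):

// Probe the X extensions UI2 relies on and ask GLX for an RGBA config
// with a real alpha channel (GLX_RGBA_BIT render type + 8-bit alpha).
// Build, roughly: g++ probe.cpp -o probe -lX11 -lGL
#include <X11/Xlib.h>
#include <GL/glx.h>
#include <cstdio>

int main()
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    const char *exts[] = { "DAMAGE", "Composite", "RENDER", "XVideo" };
    for (unsigned i = 0; i < sizeof(exts) / sizeof(exts[0]); ++i) {
        int op, ev, err;
        printf("%-10s %s\n", exts[i],
               XQueryExtension(dpy, exts[i], &op, &ev, &err) ? "present" : "MISSING");
    }

    // RGBA framebuffer config with an alpha channel, suitable for the UI.
    int attribs[] = {
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
        GLX_RED_SIZE, 8, GLX_GREEN_SIZE, 8, GLX_BLUE_SIZE, 8,
        GLX_ALPHA_SIZE, 8,
        GLX_DOUBLEBUFFER, True,
        None
    };
    int n = 0;
    GLXFBConfig *cfgs = glXChooseFBConfig(dpy, DefaultScreen(dpy), attribs, &n);
    printf("GLX RGBA configs with alpha: %d\n", n);
    if (cfgs) XFree(cfgs);

    XCloseDisplay(dpy);
    return 0;
}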

XVideo must support buffers of at least 1920x1088. The 8 at the end
of 1088 is not a mistake; at least one driver for Intel chips supported
video frames only up to 1920x1080, which meant it couldn't play ATSC
1920x1080 video, since the actual size of the video buffer is 1920x1088
due to the MPEG-2 16x16 macroblock size (1080 lines round up to 68
macroblocks, i.e. 68 x 16 = 1088). XVideo also needs to respect the
dimensions of the source frame and the output frame passed to
XvShmPutImage, including negative x & y output position values (for
when the user zooms into the video), and it must support showing only a
portion of the input buffer (this is used, for instance, for bob
deinterlacing, where the top half of the buffer holds the even fields
and the bottom half holds the odd fields). It doesn't matter if only
I420_PLANAR or YV12_PLANAR is supported for the video buffers, but if
both are advertised by the driver they should both work without funky
colors (some ATI drivers pretend to support I420 but actually always
display the buffer as a YV12 buffer, so the U & V planes are swapped
when an application selects I420_PLANAR).
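
A quick XVideo check for the above could dump each adaptor's limits and image formats, roughly like this (an untested sketch; the "XV_IMAGE" encoding entry is where the maximum XvImage size is normally reported):

// Report each XVideo adaptor's maximum image size (needs to be at least
// 1920x1088 for HD MPEG2) and the FourCC formats it advertises
// (looking for I420 and YV12).
// Build, roughly: g++ xvprobe.cpp -o xvprobe -lX11 -lXv
#include <X11/Xlib.h>
#include <X11/extensions/Xvlib.h>
#include <cstdio>
#include <cstring>

int main()
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    unsigned int ver, rel, req, ev, err, nadaptors = 0;
    if (XvQueryExtension(dpy, &ver, &rel, &req, &ev, &err) != Success) {
        fprintf(stderr, "XVideo extension not available\n");
        return 1;
    }

    XvAdaptorInfo *adaptors = NULL;
    XvQueryAdaptors(dpy, DefaultRootWindow(dpy), &nadaptors, &adaptors);
    for (unsigned int a = 0; a < nadaptors; ++a) {
        XvPortID port = adaptors[a].base_id;
        printf("adaptor \"%s\", first port %lu\n",
               adaptors[a].name, (unsigned long)port);

        // The "XV_IMAGE" pseudo-encoding carries the maximum XvImage size.
        unsigned int nenc = 0;
        XvEncodingInfo *enc = NULL;
        XvQueryEncodings(dpy, port, &nenc, &enc);
        for (unsigned int e = 0; e < nenc; ++e)
            if (strcmp(enc[e].name, "XV_IMAGE") == 0)
                printf("  max image size: %lux%lu\n", enc[e].width, enc[e].height);
        if (enc) XvFreeEncodingInfo(enc);

        // FourCC image formats (stored little-endian in the id field).
        int nfmt = 0;
        XvImageFormatValues *fmt = XvListImageFormats(dpy, port, &nfmt);
        for (int f = 0; f < nfmt; ++f)
            printf("  image format: %.4s\n", (const char *)&fmt[f].id);
        if (fmt) XFree(fmt);
    }
    if (adaptors) XvFreeAdaptorInfo(adaptors);

    XCloseDisplay(dpy);
    return 0;
}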

We currently use xcompmgr for the compositing, but I'm open to changing
this. What I really want, which we don't have with nvidia, is foolproof
V-Sync when using XComposite. Currently, if we try to enable
a video player's use of OpenGL V-Sync we actually lose V-Sync when
XComposite is in use. If we enable global V-Sync in the nVidia driver
we get a little judder when it works, and sometimes it doesn't work
at all.
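
For context, the player-side request for OpenGL V-Sync is usually nothing more than a swap-interval call along these lines (a sketch, not the actual player code; GLX_SGI_swap_control is just one of the extensions drivers expose for this):

// Ask the driver to sync glXSwapBuffers to the vertical retrace via
// GLX_SGI_swap_control, if the extension is advertised. Once XComposite
// redirects the window, the swap only targets an off-screen pixmap, which
// is roughly why this stops giving real on-screen V-Sync.
#include <GL/glx.h>
#include <cstring>

typedef int (*SwapIntervalSGIFn)(int);

static bool enable_gl_vsync(Display *dpy, int screen)
{
    const char *exts = glXQueryExtensionsString(dpy, screen);
    if (!exts || !strstr(exts, "GLX_SGI_swap_control"))
        return false;

    SwapIntervalSGIFn swap_interval = (SwapIntervalSGIFn)
        glXGetProcAddressARB((const GLubyte *)"glXSwapIntervalSGI");
    if (!swap_interval)
        return false;

    // 1 = wait for one vertical retrace per buffer swap.
    return swap_interval(1) == 0;
}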

best regards,
Hari
rock your home - http://www.agocontrol.com home automation

samuelmukoti

  • Regular Poster
  • **
  • Posts: 49
    • View Profile
Re: Improve ATI/INTEL UI2 Support
« Reply #1 on: January 07, 2009, 02:53:22 pm »
I'm excited about this thread. ATI & Intel have some good offerings, and there are many small form-factor boxes on sale that use VIA Chrome, Intel and ATI chipsets.

If only I knew how to help :-(

br

Sam

jondecker76

  • Alumni
  • wants to work for LinuxMCE
  • *
  • Posts: 763
    • View Profile
Re: Improve ATI/INTEL UI2 Support
« Reply #2 on: January 07, 2009, 07:42:36 pm »
Speaking out of ignorance, but wouldn't it make more sense to do all rendering (video and UI) to a GL surface? I don't see the advantage of the XVideo surface and compositing.
Look at video games, for example: you rarely see problems with alpha blending, framerate or vsync. The only problems are usually associated with the z-buffer, which does not apply in this case.
Sure, it may take some extra processor overhead, but UI2 by design is for higher end systems.
Can't it be as simple as copying the XVideo buffer to an OpenGL buffer, doing your alpha blending, then bringing the entire result back to XVideo for rendering?

IMO, compositing Xv and GL together is an ugly practice that requires specialized driver support. Blending multiple surfaces purely in OpenGL, however, would be much better and would eliminate the need for driver support for the composite extensions and such (it should run on any decent card with OpenGL support). It would likely also be a good start towards getting things to run on Gnome. I guess the trick would be benchmarking the performance hit of such an approach.
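
Roughly, the GL-only compositing order I have in mind would look like this (a very rough, untested sketch: texture allocation and the YV12-to-RGB conversion, via swscale or a fragment shader, are assumed to happen elsewhere):

// Draw the decoded video frame first, then alpha-blend the UI on top,
// all inside one OpenGL context; glXSwapBuffers would then present the
// composited result. draw_fullscreen_quad() is just a helper for this sketch.
#include <GL/gl.h>

static void draw_fullscreen_quad()
{
    glBegin(GL_QUADS);
    glTexCoord2f(0, 1); glVertex2f(-1, -1);
    glTexCoord2f(1, 1); glVertex2f( 1, -1);
    glTexCoord2f(1, 0); glVertex2f( 1,  1);
    glTexCoord2f(0, 0); glVertex2f(-1,  1);
    glEnd();
}

void draw_frame_and_ui(GLuint video_tex, GLuint ui_tex,
                       const unsigned char *rgb_frame, int w, int h)
{
    // Upload the new video frame into the (already allocated) texture.
    glBindTexture(GL_TEXTURE_2D, video_tex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                    GL_RGB, GL_UNSIGNED_BYTE, rgb_frame);

    glEnable(GL_TEXTURE_2D);

    // 1. Video layer: opaque, no blending needed.
    glDisable(GL_BLEND);
    draw_fullscreen_quad();

    // 2. UI layer: drawn over the video with per-pixel alpha.
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glBindTexture(GL_TEXTURE_2D, ui_tex);
    draw_fullscreen_quad();
}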

edit:
here's a small related article:
http://lurkertech.com/lg/ogl.video.html
« Last Edit: January 08, 2009, 12:46:09 am by jondecker76 »

samuelmukoti

  • Regular Poster
  • **
  • Posts: 49
    • View Profile
Re: Improve ATI/INTEL UI2 Support
« Reply #3 on: January 08, 2009, 07:06:37 am »
I have to agree with you, jondecker - I'm also talking from ignorance, but I believe you make a really good point. If I look at what XBMC has done, they've simply set the minimum hardware requirement at "OpenGL v2 support required" and are effectively supporting all the major GPUs: Intel, ATI & NVidia. They also do a lot of effects similar to what we do in LMCE, overlaying UIs on background media, be it pictures or video.

You also mention Gnome - what's the plan there? Are you planning to move away from KDE?
Is there some sort of roadmap for 0810?

Just curious,

br

Sam

jondecker76

  • Alumni
  • wants to work for LinuxMCE
  • *
  • Posts: 763
    • View Profile
Re: Improve ATI/INTEL UI2 Support
« Reply #4 on: January 08, 2009, 07:21:17 am »
The comment about Gnome was just from a portability standpoint - I don't believe it is on the roadmap at all. But if the ties to compositing and KDE were broken, LMCE could be ported to other distributions much more easily.

I think a good starting point would be to research gstreamer a bit more and see whether it can render to an OpenGL surface - if so, that's half of the battle right there.
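
A minimal test of that could be as simple as pointing playbin at an OpenGL sink (an untested sketch against GStreamer 0.10; it assumes the gst-plugins-gl package, which provides glimagesink, is installed, and the file path is just a placeholder):

// Play a file with playbin, but hand it an OpenGL video sink so the
// frames end up on a GL surface instead of an Xv overlay.
// Build, roughly: g++ gltest.cpp -o gltest $(pkg-config --cflags --libs gstreamer-0.10)
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GstElement *play = gst_element_factory_make("playbin", "player");
    GstElement *sink = gst_element_factory_make("glimagesink", "glsink");
    if (!play || !sink) {
        g_printerr("playbin or glimagesink not available\n");
        return 1;
    }

    g_object_set(play, "uri", "file:///tmp/test.mpg",
                       "video-sink", sink, NULL);
    gst_element_set_state(play, GST_STATE_PLAYING);

    // Spin the GLib main loop so the pipeline keeps running.
    GMainLoop *loop = g_main_loop_new(NULL, FALSE);
    g_main_loop_run(loop);
    return 0;
}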

tschak909

  • LinuxMCE God
  • ****
  • Posts: 5549
  • DOES work for LinuxMCE.
    • View Profile
Re: Improve ATI/INTEL UI2 Support
« Reply #5 on: January 08, 2009, 04:11:40 pm »
you're going to find gstreamer to be a big pile of shit to work with.

-Thom

samuelmukoti

  • Regular Poster
  • **
  • Posts: 49
    • View Profile
Re: Improve ATI/INTEL UI2 Support
« Reply #6 on: January 10, 2009, 08:26:32 am »
My prayer is that Phonon improves: it supports various back-ends, so we wouldn't need to worry about whether mplayer plays HD better than Xine and whatnot.

I haven't looked at Phonon from trunk lately, but since (K)ubuntu 0810 comes with the Qt4 and KDE4 libs, I believe talking Phonon rather than GStreamer natively could be a good thing.

But then maybe I say that because I dislike having to deal with GObjects and the GTK event loop. Phonon is a breath of fresh air, API-wise, but its practicality remains to be seen.
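
For what it's worth, the backend-agnostic part is about this small (an untested sketch against the Qt4 Phonon API; the file path is just a placeholder):

// Minimal Phonon playback sketch: the application only talks to the Phonon
// API, and whichever backend is installed (xine, gstreamer, ...) does the
// actual decoding - which is the whole point of the abstraction.
// Build with Qt4: QT += phonon
#include <QApplication>
#include <Phonon/VideoPlayer>
#include <Phonon/MediaSource>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    Phonon::VideoPlayer player(Phonon::VideoCategory);
    player.resize(1280, 720);
    player.show();
    player.play(Phonon::MediaSource("/tmp/test.mpg"));

    return app.exec();
}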

Oh, and I remembered a blog entry I saw a while ago: Qt/KDE widgets in 3D space. With our own theming we could make nice, consistent buttons, grids, text edits, etc. Exciting!

http://labs.trolltech.com/blogs/2008/12/02/widgets-enter-the-third-dimension-wolfenqt/ also http://www.vtk.org/pipermail/vtkusers/2008-October/097529.html and for a pic: http://www.flickr.com/photos/ariyahidayat/2621182494/
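
The underlying trick there is Qt's graphics view framework: ordinary widgets become transformable scene items via QGraphicsProxyWidget, and the whole scene can be drawn through an OpenGL viewport. A bare-bones, untested example (not taken from WolfenQt itself):

// A plain QPushButton placed in a QGraphicsScene, tilted around the Y axis
// and rendered through a QGLWidget viewport.
// Build with Qt4: QT += opengl
#include <QApplication>
#include <QGraphicsScene>
#include <QGraphicsView>
#include <QGraphicsProxyWidget>
#include <QPushButton>
#include <QTransform>
#include <QGLWidget>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QGraphicsScene scene;
    QGraphicsProxyWidget *proxy = scene.addWidget(new QPushButton("Play"));
    proxy->setTransform(QTransform().rotate(30, Qt::YAxis));

    QGraphicsView view(&scene);
    view.setViewport(new QGLWidget);    // render the scene with OpenGL
    view.show();

    return app.exec();
}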
 

Best tools for the job, I say.

Sam
« Last Edit: January 10, 2009, 08:39:51 am by samuelmukoti »