Author Topic: [SOLVED] Integrated ATI Radeon™ 1250 and UI2 issue  (Read 20150 times)

hari

  • Administrator
  • LinuxMCE God
  • *****
  • Posts: 2428
Re: [SOLVED] Integrated ATI Radeon™ 1250 and UI2 issue
« Reply #30 on: December 03, 2008, 12:53:52 am »
I got an ATI-based board for exactly that reason. We also got an Intel-based TEO-X unit for testing. It is not as if the core devs really liked the "go nvidia" recommendation. Tschak already gave pointers to the code sections in question. It would help if somebody could isolate the needed GL stuff and write a small reproducer using the functions we need.

A summary from danielk:
The X.org extensions UI2 uses, beyond the usual suspects, are the following (a quick runtime check is sketched just after the list):
  XDamage
  XComposite
  XRender
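
Whether the running X server advertises these can be checked up front. This is only a minimal sketch under my own assumptions (not LinuxMCE code); the extension name strings are the ones the server registers for XDamage, XComposite, XRender and XVideo:

Code:
/* Sketch: verify the X server advertises the extensions UI2 relies on.
 * Compile with: gcc check_ext.c -o check_ext -lX11 */
#include <stdio.h>
#include <X11/Xlib.h>

int main(void)
{
    const char *needed[] = { "DAMAGE", "Composite", "RENDER", "XVideo" };
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    int missing = 0;
    for (unsigned i = 0; i < sizeof(needed) / sizeof(needed[0]); i++) {
        int opcode, event, error;
        if (XQueryExtension(dpy, needed[i], &opcode, &event, &error)) {
            printf("%-10s present\n", needed[i]);
        } else {
            printf("%-10s MISSING\n", needed[i]);
            missing = 1;
        }
    }
    XCloseDisplay(dpy);
    return missing;
}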

The UI is rendered to an OpenGL surface and the video is rendered to
an XVideo surface, so both need to be supported in combination with
XComposite. Also, OpenGL must support the GLX_RGBA_BIT render type;
basically, it needs to support RGB buffers with an alpha channel. We
don't need any alpha for the XVideo surface, since it's always at the
bottom of the stack and has full opacity.
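
As a rough illustration of that requirement (a sketch, not the code path UI2 actually uses), an RGBA framebuffer configuration with an 8-bit alpha channel can be requested through glXChooseFBConfig; if no such FBConfig comes back, the driver cannot provide what UI2 needs:

Code:
/* Sketch: ask GLX for an RGBA visual with a real alpha channel.
 * Compile with: gcc check_glx.c -o check_glx -lGL -lX11 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    static const int attribs[] = {
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,   /* RGBA, not color-index      */
        GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
        GLX_RED_SIZE,      8,
        GLX_GREEN_SIZE,    8,
        GLX_BLUE_SIZE,     8,
        GLX_ALPHA_SIZE,    8,              /* alpha channel for compositing */
        GLX_DOUBLEBUFFER,  True,
        None
    };

    int count = 0;
    GLXFBConfig *configs =
        glXChooseFBConfig(dpy, DefaultScreen(dpy), attribs, &count);

    if (!configs || count == 0) {
        fprintf(stderr, "no RGBA FBConfig with alpha available\n");
        return 1;
    }
    printf("found %d suitable FBConfig(s)\n", count);
    XFree(configs);
    XCloseDisplay(dpy);
    return 0;
}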

XVideo must support buffers of at least 1920x1088. The 8 at the end
of 1088 is not a mistake; at least one driver for Intel chips only
supported video frames up to 1920x1080, which meant it couldn't play
ATSC 1920x1080 video, since the actual size of the video buffer is
1920x1088 due to the MPEG2 16x16 macroblock size (1080 rounded up to
the next multiple of 16).

XVideo also needs to respect the dimensions of the input video frame
and the output frame passed to XvShmPutImage, including negative x & y
output position values (for when the user zooms into the video), and
it must support showing only a portion of the input buffer (this is
used, for instance, for bob deinterlacing, where the top half of the
buffer holds the even fields and the bottom half holds the odd fields).

It doesn't matter if only I420_PLANAR or YV12_PLANAR is supported for
the video buffers, but if both are advertised by the driver, they
should both work without funky colors (some ATI drivers pretend to
support I420 but actually always display the buffer as a YV12 buffer,
so the U & V planes are reversed when an application selects
I420_PLANAR).
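
A minimal sketch of what those requirements look like at the Xv API level; the display, window, GC, port, SHM segment and XvImage are assumed to already exist, the image FourCC is assumed to have been picked from what XvListImageFormats() actually advertised (I420/YV12), and the helper names below are mine, not LinuxMCE's:

Code:
/* Sketch only: the driver must honour distinct source and destination
 * rectangles in XvShmPutImage, including a half-height source (bob
 * deinterlacing) and negative destination offsets (zoom). */
#include <X11/extensions/Xvlib.h>

#define VID_W 1920
#define VID_H 1080   /* displayed height; the buffer is padded to 1088 */

/* Bob deinterlace: show only the top half of the buffer (even fields),
 * scaled to the full output size. Odd fields use src_y = VID_H / 2. */
void put_even_field(Display *dpy, Window win, GC gc, XvPortID port,
                    XvImage *image, int out_w, int out_h)
{
    XvShmPutImage(dpy, port, win, gc, image,
                  0, 0, VID_W, VID_H / 2,   /* source: top half only  */
                  0, 0, out_w, out_h,       /* dest: full output area */
                  False);
}

/* Zoom: the output rectangle is larger than the window, so the
 * destination x/y go negative; the driver must accept that. */
void put_zoomed(Display *dpy, Window win, GC gc, XvPortID port,
                XvImage *image, int out_w, int out_h,
                int zoom_x, int zoom_y)
{
    XvShmPutImage(dpy, port, win, gc, image,
                  0, 0, VID_W, VID_H,               /* whole frame       */
                  -zoom_x, -zoom_y, out_w, out_h,   /* shifted/oversized */
                  False);
}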

We currently use xcompmgr for the compositing, but I'm open to changing
this. What I really want, which we don't have with nvidia, is foolproof
V-Sync when using XComposite. Currently, if we try to enable
a video player's use of OpenGL V-Sync we actually lose V-Sync when
XComposite is in use. If we enable global V-Sync in the nVidia driver
we get a little judder when it works, and sometimes it doesn't work
at all.
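
For reference, the swap-interval mechanism a GL video player would typically use looks roughly like the sketch below (GLX_SGI_swap_control; purely illustrative, and exactly the kind of setting that stops being honoured once the window is redirected by XComposite):

Code:
/* Sketch: request one buffer swap per vertical retrace.
 * A current GLX context must already be bound on this thread. */
#include <GL/glx.h>

typedef int (*swap_interval_fn)(int interval);

int enable_vsync(void)
{
    swap_interval_fn swap_interval =
        (swap_interval_fn)glXGetProcAddressARB(
            (const GLubyte *)"glXSwapIntervalSGI");
    if (!swap_interval)
        return -1;              /* extension not available */
    return swap_interval(1);    /* returns 0 on success */
}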

Maybe we should split this off into the dev section.

best regards,
Hari
rock your home - http://www.agocontrol.com home automation