Can I make a suggestion? Recently I was speculating on a new way of using Xine for audio and video playback:
http://forum.linuxmce.org/index.php?topic=7367.msg47277#msg47277
I realise that this would be some major re-plumbing, but I would like at least to have this idea posted for the record in the Features forum....
In short, the idea is to move to a model where each MD has two Xine DCE devices rather than one: a "Xine Server" and a "Xine Client". When the media plugin wants to start a piece of media, it would send a DCE message to the Xine Server on the MD in question, telling it to pick up the stream object from the foundry and start playing it in server/broadcast mode. It would then send another DCE message to the Xine Client on the same MD, telling it to connect to the Xine Server's broadcast. In this way a new piece of media would start playing, with no apparent difference from how things are done currently.
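To make that concrete, roughly what I have in mind for the "Xine Server" half is something like the sketch below. I'm assuming xine-lib's broadcaster parameter (XINE_PARAM_BROADCASTER_PORT) can be used for this; the media path and port are placeholders, and a real DCE device would of course sit in its normal command loop rather than main():

    #include <xine.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        xine_t *xine = xine_new();
        xine_init(xine);

        /* No local output on the server side; the local Xine Client does the playing. */
        xine_audio_port_t *ao = xine_open_audio_driver(xine, "none", NULL);
        xine_video_port_t *vo = xine_open_video_driver(xine, "none", XINE_VISUAL_TYPE_NONE, NULL);
        xine_stream_t *stream = xine_stream_new(xine, ao, vo);

        /* Placeholder path; in LinuxMCE this would be the stream object handed over by the media plugin. */
        if (!xine_open(stream, "/home/public/data/videos/example.avi")) {
            fprintf(stderr, "failed to open media\n");
            return 1;
        }

        /* Offer the running stream to slave xine instances on TCP port 4711 (arbitrary). */
        xine_set_param(stream, XINE_PARAM_BROADCASTER_PORT, 4711);
        xine_play(stream, 0, 0);

        pause();   /* a real DCE device would handle Play/Pause/Stop commands here */
        return 0;
    }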
However, if you then want to bifurcate the stream using the media map, extending that media to play on other MDs, the media plugin simply sends a DCE message to the Xine Client on each of the other MDs, telling it to connect to the same broadcast stream from the original MD's Xine Server. In this way we could guarantee that any media (audio or video) is perfectly synced no matter how many MDs are playing it: they would all be playing a real-time stream with no need for buffering, so the sync could only be out by the latency of the internal network, which is far lower than is noticeable in media. This would remove the need to rely on the timestamp events for synchronisation. Those would still be needed for resuming playback, but relying on them for syncing media streams, as we all know, doesn't work very well.
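The "Xine Client" half, on the original MD and on any MD the media is extended to, would then be little more than the following. Again this is only a sketch under the assumption that the slave:// MRL is how a xine instance attaches to a broadcaster; the host and port are placeholders, and a real client would set up an X11 visual for video output rather than the null driver used here for brevity:

    #include <xine.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        xine_t *xine = xine_new();
        xine_init(xine);

        /* Local output; a real MD would open an X11 video driver here. */
        xine_audio_port_t *ao = xine_open_audio_driver(xine, "alsa", NULL);
        xine_video_port_t *vo = xine_open_video_driver(xine, "none", XINE_VISUAL_TYPE_NONE, NULL);
        xine_stream_t *stream = xine_stream_new(xine, ao, vo);

        /* Connect to the Xine Server's broadcast; host and port are placeholders. */
        if (!xine_open(stream, "slave://192.168.80.1:4711")) {
            fprintf(stderr, "failed to connect to the broadcast\n");
            return 1;
        }
        xine_play(stream, 0, 0);

        pause();   /* keep playing until the DCE device is told to stop */
        return 0;
    }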
This could even be extended to use a multicast address, which would save hugely on network bandwidth in environments with large numbers of MDs (the "broadcast" mode isn't a real TCP/IP broadcast; rather it would be multiple unicasts, each consuming its own bandwidth).
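As far as I know xine's broadcaster is point-to-point TCP, so true multicast would need extra plumbing (e.g. an RTP-style sender), but the client side of joining a group is standard POSIX socket code, along these lines. The group address and port are placeholders:

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <string.h>
    #include <sys/socket.h>

    /* Join a multicast group; 239.255.42.42:5004 is a placeholder address/port. */
    int join_media_multicast(void)
    {
        int sock = socket(AF_INET, SOCK_DGRAM, 0);

        struct sockaddr_in addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(5004);
        bind(sock, (struct sockaddr *)&addr, sizeof(addr));

        struct ip_mreq mreq;
        mreq.imr_multiaddr.s_addr = inet_addr("239.255.42.42");
        mreq.imr_interface.s_addr = htonl(INADDR_ANY);
        setsockopt(sock, IPPROTO_IP, IP_ADD_MEMBERSHIP, &mreq, sizeof(mreq));

        return sock;   /* packets read from this socket would be fed to the decoder */
    }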
Further, this could be the nucleus of separating audio and video streams, so that you could play the video from one piece of media and the audio from another on the same MD, potentially with many different audio streams mixed together, and possibly even the screen of an MD split up to play multiple video streams simultaneously. We could have as many Xine Client DCE devices as we want in each MD to allow for this. For audio they just need to play nicely together so the streams can be mixed (quite possible); the video idea is, of course, much more complex....
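For the audio-mixing part at least, I believe xine-lib allows several independent streams in one process, so two Xine Clients on one MD could come down to something like this (relying on ALSA dmix or a sound server to do the actual mixing; the slave:// hosts and ports are placeholders, and I haven't verified how well two streams coexist in practice):

    #include <xine.h>
    #include <unistd.h>

    int main(void)
    {
        xine_t *xine = xine_new();
        xine_init(xine);

        /* Two independent audio ports; mixing is left to ALSA dmix or a sound server. */
        xine_audio_port_t *ao1 = xine_open_audio_driver(xine, "alsa", NULL);
        xine_audio_port_t *ao2 = xine_open_audio_driver(xine, "alsa", NULL);
        xine_video_port_t *vo  = xine_open_video_driver(xine, "none", XINE_VISUAL_TYPE_NONE, NULL);

        xine_stream_t *music    = xine_stream_new(xine, ao1, vo);
        xine_stream_t *announce = xine_stream_new(xine, ao2, vo);

        xine_open(music,    "slave://192.168.80.1:4711");   /* broadcast from one Xine Server */
        xine_open(announce, "slave://192.168.80.2:4711");   /* broadcast from another         */
        xine_play(music, 0, 0);
        xine_play(announce, 0, 0);

        pause();
        return 0;
    }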
Does this strike any of the developers as a good idea?