Anyway, it's not in a slick, finished state.
Also, because I like to make people's lives hard, I have some quick questions if you don't mind.
*How do you see the path of cross-compiling Qt (Necessitas) vs. fleshing out the JavaDCE?
*Do you envision a way for the two apps to talk to each other? Is there some type of interface I would need to provide on the Qt side?
The main thrust of these questions is that I like to have a big-picture view of things in my head. With the desktop variants, I have a fairly defined idea of the parts involved. Android is still something that's in flux from that perspective, and I'm just wondering if it's time to start learning some Java on my end.
Thanks for all your hard work on this and qOrbiter, it's really appreciated.
I would put the device on the core if it does not really need to be on the MD.
I would always try to expose the audio as a stream; this allows us to seamlessly send audio to both squeezeboxes and MDs.
There is a Spotify library, IIRC? Is that what you intend to use? Either way, it should be possible to implement several Spotify clients within the same Spotify device, right? So several devices might not be necessary.
If possible, I'd use xine to play the stream on MDs, and then there would be no reason for the Spotify device to be on the MD itself. All of this depends on a Spotify device that allows us to access the audio stream in some way, of course.
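To illustrate the "expose the audio as a stream" idea: a minimal sketch of a device serving raw audio bytes over HTTP, so any player that can open a URL (xine on an MD, a squeezebox) consumes the same stream. The audio source here is a stand-in; a real Spotify device would feed decoded data from its client library, and the real LinuxMCE device plumbing is not shown.

```python
# Minimal sketch: serve an audio stream over HTTP so multiple players
# (xine on MDs, squeezeboxes, ...) can all pull from the same source.
# audio_chunks() is a placeholder; a real device would yield data coming
# out of the Spotify client library instead of silence.
import http.server
import threading

def audio_chunks():
    # Stand-in audio source: three 4 KiB blocks of silence.
    for _ in range(3):
        yield b"\x00" * 4096

class StreamHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "audio/L16")
        self.end_headers()
        for chunk in audio_chunks():
            self.wfile.write(chunk)

    def log_message(self, *args):
        pass  # keep the sketch quiet

def serve(port=0):
    # port=0 lets the OS pick a free port; the real port is in
    # server.server_address after binding.
    server = http.server.HTTPServer(("127.0.0.1", port), StreamHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

With something like this running on the core, the MD side would just hand xine a URL such as `http://core:PORT/` rather than running any Spotify code itself.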
Don't know the exact process of adding screens, but they need to be added to the Screen table in the database. I think this is used, at least as a reference, in qOrbiter too.
After adding an entry to the table and committing it to sqlCVS, you go ahead and create a new Screen_xx.xml file in qOrbiter (from how I understand qOrbiter).
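A rough sketch of those two steps follows. It uses sqlite3 purely as a stand-in for the real pluto_main MySQL database, and the Screen table columns and the XML layout qOrbiter expects are assumptions here; check the actual schema before committing anything to sqlCVS.

```python
# Hypothetical walk-through of the two steps described above, using an
# in-memory sqlite3 database as a stand-in for the real MySQL schema.
import sqlite3
import textwrap

conn = sqlite3.connect(":memory:")
# Assumed shape of the Screen table -- the real columns may differ.
conn.execute(
    "CREATE TABLE Screen (PK_Screen INTEGER PRIMARY KEY, Description TEXT)"
)

# Step 1: add the new screen to the Screen table (then commit to sqlCVS).
cur = conn.execute("INSERT INTO Screen (Description) VALUES ('My New Screen')")
screen_id = cur.lastrowid

# Step 2: create the matching Screen_xx.xml skeleton for qOrbiter.
# The element names here are illustrative, not qOrbiter's actual format.
xml = textwrap.dedent("""\
    <screen id="%d">
      <!-- qOrbiter UI elements go here -->
    </screen>""" % screen_id)
print("would write Screen_%d.xml" % screen_id)
```

The point is just that the database row comes first, and the XML file name is keyed to the new screen's ID.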