After a five-month hiatus from the project to collect my thoughts and recover from intense burnout, I am back in the saddle again.
We are all squashing bugs as best we can, and I am also trying to finish up some things I started over two years ago. One of them is Nintendo Wii Remote support.
After much deliberation and study of the existing solutions out there, none of them was adequate for what LinuxMCE really needed. They were all nasty hacks on top of nasty hacks, with side effects such as only working in one room of the house, not supporting more than one Wiimote properly, insanely inconvenient button mappings, and so on.
With my initial work, I tried to use the existing Bluetooth infrastructure to detect the Wiimotes. While this did work, and the Wiimotes were detected by the orbiter, they were incorrectly detected as mobile phones, and the system would attempt to generate an orbiter for them. This wastes time and resources generating an orbiter that will never be used. In addition, the Bluetooth device detection loop currently in place to detect mobile phones ties up the Bluetooth dongle so that other programs can't detect anything. This presents itself most commonly as not being able to pair the remotes "on the first...second...fifth...thirty-seventh try," but just fine on the "thirty-eighth." Very inconvenient!
The initial attempt used the wminput program from the CWiiD project to map the buttons, directional pad, IR pointer, and accelerometers to kernel input events, so that to the rest of the system the wireless Wiimote looks just like a wired joystick and/or mouse. While this did cover more than 60% of the use cases, it is not sufficient for daily use.
The reason is quite simple: the Wiimote is a hybrid device. It presents itself as both a game pad and a remote control with a very zen-like button layout.
You may think the solution is to force all of our connected programs to use the same "button" layout universally, but this would only work if a single Wiimote were ever used with a single media director in the house. Historically, workstations have had only a single keyboard and mouse attached, and that assumption bears itself out in the various programs that re-map keys.
Why do we need to do this in the first place, when we only have one UI to deal with?
Wrong: we have multiple UIs to deal with.
We need to deal with:
* Standard orbiter displays
* Media playback displays
* On-screen navigation displays such as when MythTV displays menus
* File Browser displays
* Game Player displays
Each of these needs its own button mappings.
For example, on standard orbiter displays the trigger and A buttons should map to "OK" and the left mouse button simultaneously. (The reason for this is historical, and it's quite silly, but yes, it needs to be this way.)
In the File Browser, the 1 and 2 buttons (with the remote held vertically) should map to Page Up and Page Down.
While playing media, the 1 and 2 buttons should map to the skip buttons (which change channels or tracks, depending on the media being played).
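To make the idea concrete, here is a minimal sketch of such a per-screen mapping table. All of the names here (the enums, the action strings, `MakeMapForScreen`) are hypothetical illustrations, not LinuxMCE's actual identifiers:

```cpp
#include <cassert>
#include <map>
#include <string>

// Hypothetical sketch: one button map per UI screen type.
enum class ScreenType { Orbiter, MediaPlayback, OnScreenNav, FileBrowser, GamePlayer };
enum class WiimoteButton { A, B, One, Two };

// Wiimote button -> abstract action name for the active screen.
using ButtonMap = std::map<WiimoteButton, std::string>;

ButtonMap MakeMapForScreen(ScreenType screen) {
    switch (screen) {
        case ScreenType::Orbiter:
            // Trigger (B) and A both act as OK / left-click on orbiter screens.
            return {{WiimoteButton::A, "OK+LeftClick"},
                    {WiimoteButton::B, "OK+LeftClick"}};
        case ScreenType::FileBrowser:
            return {{WiimoteButton::One, "PageUp"},
                    {WiimoteButton::Two, "PageDown"}};
        case ScreenType::MediaPlayback:
            return {{WiimoteButton::One, "SkipBack"},
                    {WiimoteButton::Two, "SkipForward"}};
        default:
            return {};  // other screens get their own maps eventually
    }
}
```

The point is that the same physical button resolves to a different action purely based on which screen type is active.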
The Game Player presents an additional challenge. There aren't multiple keyboards or mice, so distinguishing between concepts like Player 1 and Player 2 is more difficult, ESPECIALLY since Wiimotes can connect and disconnect at any given time. I can't simply hard-code button mappings across multiple applications cleanly. It's too brittle.
To solve this problem, we need to literally present a "joystick" and a "trackball" to the Game Player, and to the MAME and MESS programs it controls. When the Game Player is running, the button, d-pad, and IR mappings change to present a joystick-like device and a trackball.
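As a rough sketch of the joystick half of that, the d-pad state could be translated into the absolute axis values a uinput joystick would report; the actual ioctl/write plumbing against /dev/uinput is omitted here, and the struct and function names are invented for illustration:

```cpp
#include <cassert>

// Hypothetical sketch: translate Wiimote d-pad state into the absolute
// axis values a uinput joystick device would report (-1, 0, or +1 per axis).
struct DpadState { bool up, down, left, right; };
struct JoystickAxes { int x, y; };

JoystickAxes DpadToAxes(const DpadState& d) {
    JoystickAxes a{0, 0};
    if (d.left)  a.x -= 1;
    if (d.right) a.x += 1;
    if (d.up)    a.y -= 1;   // evdev convention: up is negative Y
    if (d.down)  a.y += 1;
    return a;
}
```

The trackball side would work similarly, feeding IR pointer deltas as relative motion events instead of absolute axes.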
So how do we do this, while providing a mechanism to detect WiiMotes?
The solution is to provide a DCE device that emits device detected/removed events as Wiimotes are paired and unpaired, while also running a thread that uses the excellent WiiUse library to talk to the Wiimote. This thread uses the uinput kernel driver, which gives regular user-space programs access to the kernel input subsystem, and, depending on which screen is visible, changes the events sent along uinput appropriately.
How do we determine when the screen changes, and what type it needs to be?
The answer here has already been solved by Pluto for their infra-red remote controls, as they do exactly the same thing: see the Set Screen Type command. Orbiter emits this command to devices so that they can change their internal state appropriately. Normally this is handled by IRBase, but since we're a bit outside that scope, we'll handle it all internally ourselves, especially since the button combinations we need are already known and won't need to change. So modifying Orbiter to find devices in the Wii Remote category, and to send the Set Screen Type message to them, is enough.
With this, we can maintain a map of the "buttons" we need to map and when to change them, and when we receive the Set Screen Type command, we simply make the change.
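In skeleton form, the device's reaction to the command is just a state change; everything downstream consults that state. The class, method, and string names below mimic DCE conventions but are illustrative, not the actual generated DCE bindings:

```cpp
#include <cassert>
#include <string>

// Hypothetical sketch: react to Orbiter's Set Screen Type command by
// recording the active screen type; button events are then resolved
// against whatever map corresponds to that type.
class WiiRemoteDevice {
public:
    // Called when a DCE command arrives; we only care about Set Screen Type.
    void ReceivedCommand(const std::string& command, const std::string& param) {
        if (command == "Set Screen Type")
            m_activeScreenType = param;   // subsequent button events use this
    }
    std::string ActiveScreenType() const { return m_activeScreenType; }
private:
    std::string m_activeScreenType = "orbiter";  // sensible default
};
```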
This leaves Follow Me, and using the Wiimote during the AV Wizard.
The former is possible because each Wii Remote has its own distinct address. We can utilize the existing infrastructure in place for the Fiire Remote, create a device for each family member's Wii Remote, and couple it with a Wii Remote Dongle device on each media director, so that when the vertical 1+2 buttons are pressed together, the system triggers a Follow Me media event.
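Conceptually, that is just a lookup from the remote's Bluetooth address to its owner, triggered by the 1+2 chord. The addresses, user IDs, and class names here are invented for illustration:

```cpp
#include <cassert>
#include <map>
#include <string>

// Hypothetical sketch: associate each Wiimote's Bluetooth address with a
// family member so a 1+2 chord fires Follow Me for the right user.
struct ChordEvent { bool one, two; };

class FollowMeDispatcher {
public:
    void RegisterRemote(const std::string& bdaddr, int userId) {
        m_owners[bdaddr] = userId;
    }
    // Returns the user to follow, or -1 if this isn't the Follow Me chord
    // or the remote is unknown.
    int OnButtons(const std::string& bdaddr, const ChordEvent& e) const {
        if (!(e.one && e.two)) return -1;
        auto it = m_owners.find(bdaddr);
        return it == m_owners.end() ? -1 : it->second;
    }
private:
    std::map<std::string, int> m_owners;  // bdaddr -> user id
};
```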
The latter is solved by implementing Local Mode in the DCE device driver. In Local Mode, a DCE socket isn't opened; instead, a standard UNIX socket is opened that the AV Wizard can read, so the AV Wizard can use the Wiimote even though the entire system isn't completely powered up.
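A rough sketch of that Local Mode handoff, assuming a datagram UNIX socket; the socket path and the event format are made up for illustration, not the driver's actual protocol:

```cpp
#include <cassert>
#include <cstring>
#include <string>
#include <sys/socket.h>
#include <sys/un.h>
#include <unistd.h>

// Hypothetical path; the real driver would pick its own well-known location.
static const char* kLocalSockPath = "/tmp/wiimote_local.sock";

// Reader side (AV Wizard): bind a datagram UNIX socket to the path.
int OpenLocalModeSocket() {
    int fd = socket(AF_UNIX, SOCK_DGRAM, 0);
    sockaddr_un addr{};
    addr.sun_family = AF_UNIX;
    std::strncpy(addr.sun_path, kLocalSockPath, sizeof(addr.sun_path) - 1);
    unlink(kLocalSockPath);  // clean up any stale socket file
    bind(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));
    return fd;
}

// Writer side (driver thread): send one button event as a datagram.
void SendButtonEvent(const std::string& event) {
    int fd = socket(AF_UNIX, SOCK_DGRAM, 0);
    sockaddr_un addr{};
    addr.sun_family = AF_UNIX;
    std::strncpy(addr.sun_path, kLocalSockPath, sizeof(addr.sun_path) - 1);
    sendto(fd, event.data(), event.size(), 0,
           reinterpret_cast<sockaddr*>(&addr), sizeof(addr));
    close(fd);
}
```

Because no DCE router is involved, this path works even before the rest of the system is up, which is exactly the AV Wizard situation.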
All this, just to make everything appear nice and consistent and "just work."
I will be posting more in this thread as this develops.