LinuxMCE Forums
General => Feature requests & roadmap => Topic started by: Medrawd on July 11, 2012, 05:47:54 pm
-
Not really a feature request, but I stumbled upon their demo and I think it might be useful on a mobile orbiter.
http://youtu.be/PkehW3fkpLQ (http://youtu.be/PkehW3fkpLQ)
http://www.indooratlas.com/technology.html (http://www.indooratlas.com/technology.html)
-
It looks like it is:
A) Proprietary, and therefore not easily incorporated into LinuxMCE, due to potential licensing/cost issues;
B) Of limited use in a home setting: a person knows where they are in their own home (Alzheimer's cases aside), and it's of limited use for tracking others, because they'd need their phone on them at all times; and
C) Redundant: there are other floor plan drawing apps, including the open source Sweet Home 3D, that are just as good at creating nice floor plan layouts for use with LinuxMCE's lighting and other home controls.
It's interesting tech, though. But it looks like it would be more useful for places like airports, shopping malls, casino/resort complexes, etc.
-
I think the major use of this technology would be to allow a tablet / mobile device to know which room it is in, and automatically show all the controls for that room, etc. With so many possible controls available, UIs that intuitively display the most appropriate controls for their context are key. Obviously the point of this tech is not to tell someone where they are in their own house.
I'm likely going to do something like this using Bluetooth and WiFi signal strengths. Almost every room in my house will have a Bluetooth dongle of some sort, and a few of the main rooms will have their own WiFi access points. Room movement gestures would be nice. E.g. I've just put music on in the dining room, which has French doors leading out to the garden; within 30 seconds I go out into the garden with the device I just used to turn the music on, and my system knows, based on my preferences, that I'd like the music to follow me out there.
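A crude version of that room inference can be sketched entirely in shell: sample the RSSI seen from each room's dongle and pick the strongest reading. This is only an illustration of the idea; the room names and the "room rssi" input format are made up, and on real hardware the readings would come from something like `hcitool rssi <addr>` run against each room's dongle.

```shell
# Pick the room whose dongle reports the strongest (least negative) RSSI.
# Input lines on stdin: "<room> <rssi>", e.g. "kitchen -72".
nearest_room() {
    sort -k2,2 -n -r | head -n 1 | cut -d' ' -f1
}

# Hypothetical usage with real readings (needs Bluetooth hardware):
#   for room in kitchen dining garden; do
#       printf '%s %s\n' "$room" "$(rssi_for_room "$room")"
#   done | nearest_room
```

In practice you would sample several times and average, since single RSSI readings bounce around a lot.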
I just got my Ubuntu 12.04 LTS Xen server up and running last night, so will hopefully be installing my live/development/test system on it tonight. I recently got an HP TouchPad that now has ICS on it, so I'll be getting qOrbiter onto that and trying to implement the above soon enough.
Steve
-
There are better Bluetooth technologies for doing this.
One such... Bluemon:
http://www.matthew.ath.cx/projects/bluemon/
-
The Bluetooth Dongle code needs someone who is willing to:
(1) improve it,
(2) fix it to support things other than mobile orbiters, so we can provide an entry point into the system for all sorts of other Bluetooth devices.
-Thom
-
About as much as I am able to do is install/configure the apps... I can't actually code (maybe someday I will). I've toyed with bluemon a little, but I'd be lying if I said I knew it well, and I haven't touched it in years (I don't think the developer has either, actually). I can do a little shell scripting, though...
One architecture question: how does LinuxMCE go about releasing a device that has been captured for a particular job, once it is done? For instance, with Bluetooth, how would we set it up so that the system could grab the Bluetooth services for a Bluetooth headset, but then release it once it's done being used, to later do a brief scan for known signals in proximity with bluemon, for instance? Not in detail, mind you, just the overall function...
-
Currently, there is no such mechanism in place.
Bluetooth_Dongle uses the VIPShared routines (shared with Pluto's other project, PlutoVIP), which run in a constant runloop that scans for Bluetooth devices, followed by inquiries to find RFCOMM channels. This happens constantly, and while the inquiry is in progress, BlueZ will simply return an error.
This is an issue because, if you simply run scanning loops in parallel, your scanning loop will fail at seemingly random times while VIPShared does the inquiry for the mobile orbiters.
So, at the very least, a mechanism needs to be coded in Bluetooth_Dongle so that devices can register to have scanning time, and DCE commands can be sent to do scanning to target devices, while they emit events asynchronously.
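The core of that "register for scanning time" idea can be shown in miniature: funnel every adapter operation through one lock, so a proximity scan can never overlap the inquiry. This is just a sketch of the serialization concept, not actual Bluetooth_Dongle or VIPShared code; the lock file path is an assumption.

```shell
# Hypothetical: one lock per adapter serializes all operations on it,
# so a proximity scan never runs while an inquiry is in flight.
LOCKFILE=${LOCKFILE:-/tmp/bt-hci0.lock}

with_adapter() {
    # Queue on the lock, then run the requested adapter command.
    flock "$LOCKFILE" "$@"
}

# Hypothetical usage on real hardware:
#   with_adapter hcitool scan     # proximity scan
#   with_adapter hcitool inq      # VIPShared-style inquiry
```

The real fix would live inside Bluetooth_Dongle and emit DCE events asynchronously, but the same queue-for-the-radio discipline applies.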
This is my frustration: this project needs actual people willing and able to code, and all of us who CAN code are tied up either (1) fixing bugs or (2) working on our respective corners of the system, which demand our full attention. We need people who can either code, or are willing to put in the energy and time to learn, to accomplish what they want to do with the system.
-Thom
-
This may be a dumb question... but here it goes anyway... Could you suspend the runloop by killing it when a menu item is triggered, launch a different Bluetooth service, and then restart the runloop afterwards, by wrapping the call to the desired Bluetooth-enabled program in a script? It's hardly elegant, but is the system able to do this without automatically trying to restart the runloop?
Essentially:
kill -15 [whatever the runloop's PID is]
[whatever Bluetooth-enabled app you want to run]
[command for restarting the runloop]
It has the subtlety of a sledgehammer, but would it work?
-
You would not need to do this if some things were refactored in VIPShared, but that will take some investigation, and somebody will just need to sit down and do some serious elbow-grease research. The fundamental problem is not simple, and can't be solved by killing things like that.
-Thom
-
There are better Bluetooth technologies for doing this.
One such... Bluemon:
http://www.matthew.ath.cx/projects/bluemon/
I don't believe Bluemon is a better technology than IndoorAtlas at all for devices that fulfil the requirements. It is however more device agnostic. I actually implemented FollowMe media using Bluemon and MythTV back in 2006.
After I've got qOrbiter onto my Jogglers, location change gestures will likely be one of the first things I look at, in combination with interfacing my PIRs into LinuxMCE.
Steve
-
Given a PIR's incredibly coarse detection of motion, I do not believe a PIR would be the best way to implement motion gestures.
The more effective route would be to use a Kinect (or its OEM counterpart, I forget the name), as it provides enough information to feed to a computer vision library like OpenCV to recognize not only gestures, but faces and other unique shapes.
-Thom
-
@stedaniels:
To do the kind of thing that I think you're talking about would require a number of different sensors, all located in strategic positions. Frankly, I don't see the benefit, at least until the motion sensing and visual/auditory recognition technology gets better than it is now.
Note: Bill Gates has a system whereby a person wears a pin containing an RFID chip, and the environment changes to suit that occupant or guest. Now look at the WAF problems associated with that!!!
PS: The Nokia N900 has an experimental app that allows you to control the phone by SMS. Combined with something like bluemon, LinuxMCE could send an SMS message to the phone that would cause it to launch a particular app or command, based on location derived from Bluetooth signal strength... If you could do that on an Android device, and/or even run an SMS-like service locally (or over D-Bus?!), you could achieve something like what you want.
-
Presence detection has always been a major pain in the arse.
The first usable system I saw was an echolocation mesh in the building at the AT&T (née Olivetti) Research Lab in the UK, where they had placed arrays of echolocation transducers in the raised ceilings of each floor, creating specific harmonic signatures when they bounced off each badge. Its initial cost was in the six figures.
Pluto's approach made sense for its integrated proof of concept: make the media directors clients, constantly pinging for the phones, which were servers advertising an RFCOMM service on a known channel. (This was allowed on Symbian; however, as Hari discovered, the J2ME port required that the service advertise itself via SDP.) This created a semi-reliable way for the system to figure out which room you were in.
It worked well in large houses because of the nature of Bluetooth signals and their transmission classes versus signal attenuation, but in smaller living spaces it was hell: the clouds would overlap, and it took so much logic in the system just to figure out where a device actually was. Not to mention, the constant use of Bluetooth was hell on battery life. But it DID work, if you understood how Bluetooth's signals propagate.
As time has gone on, though, this method has become much more difficult with modern phones, as the platforms want developers to use WiFi instead for "network-like" things. We can still utilize it, but it needs to be brought to the present reality of smartphones with multiple network interfaces. And that's not even mentioning that the Bluetooth interface was horrendously slow for how it was being used (typically 30 seconds between binding phases, and typically 1 second between button presses to show the new screen), and that the target was phones with hard buttons only, no touch screen ability, and insanely low resolution (176x208 was the initial target).
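The scan-then-inquire loop Thom describes can be roughly sketched with the BlueZ command-line tools. The `parse_scan` helper below is hypothetical (not part of LinuxMCE); it just pulls device addresses out of `hcitool scan`-style output (a header line followed by address/name rows), and the outer loop is shown commented out because it needs real hardware.

```shell
# Extract BD addresses from `hcitool scan`-style output: a header line
# ("Scanning ...") followed by "<addr> <name>" rows.
parse_scan() {
    awk 'NR > 1 { print $1 }'
}

# Illustrative outer loop (requires a Bluetooth adapter, so not run here):
#   while true; do
#       hcitool scan | parse_scan | while read -r addr; do
#           sdptool browse "$addr"   # look for the orbiter's RFCOMM service
#       done
#       sleep 30
#   done
```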
Fiire sidestepped this problem by making remotes with a fixed, known MAC address, and sending that address each time the remote connected to a target media director via its dongle. The other advantage was that the radio chips in these remotes had a rampable transmit-power adjustment, coupled with a directional antenna, which let them quickly find a machine to bind to. It worked, and it worked well, but Fiire contracted with Gyration to build the custom device, of which only a limited run was made, and no more of these devices are available.
The approach of using a definite camera with a depth imaging sensor is the most reliable and cost effective option at present, as it costs $120 or so for a device, and the protocol for it has now been decoded and well understood. The data emitted by the camera is literally a 1 to 1 mapping of RGB pixel to depth, making this shit _really_ easy to package up and send to recognition engines. The DCE device just needs to be done!
Follow Me is merely an event that is fired to the Orbiter plugin. Anything can fire it, and if you can definitively provide a PK_User, the system will do the right thing.
But please, forget the PIR nonsense. It will trigger far too many false positives at best.
-Thom
-
I haven't got time for a lengthy reply, but my mentioning my PIRs was an aside that I probably shouldn't have brought up, given its closeness to the topic at hand. The PIRs are for my alarm system: dual mode, internal & external; I've also got the perimeter covered with door/window alarm sensors, topped off with a healthy dose of fire/smoke/CO detectors.
JaseP, I think you're overcomplicating what I said I was going to do. I almost always have my mobile in my pocket; it would be very easy for it to know which room it is in if I had just used it to turn on music in one location, and when I then move to another location, it is relatively easy to process that logic and have the media follow me. That's all I'm saying.
I'm not talking about human presence detection, but device presence detection, two quite different things in my opinion.
I'm going to be sticking a tōd on the dog for dog presence detection ;-)
Steve
-
As tschak already said, the follow-me part is the easy one; you just need to send the proper events. The hard part is reliably detecting where a user is located.
-
As I said before, bluemon could do that part... but someone would have to throw a DCE router wrapper around it. It might also be possible to write a basic phone app to do it, and have it send a signal to the Core about where it is...
-
and that would give us what exact benefit over the currently working Bluetooth_Dongle DCE device with built in Orbiter for RA solution?
-
Hari, the trick is finding a way to extend the Bluetooth_Dongle to handle more than the classic Pluto orbiter, which really is very long in the tooth, and not supported at all by anything new. At all. Ever. I know of almost nobody using it.
-Thom
-
and that would give us what exact benefit over the currently working Bluetooth_Dongle DCE device with built in Orbiter for RA solution?
I have to admit, since I don't have auto-detect follow-me support set up on my system, I didn't actually know how that works until I looked at the wiki just now...
The only advantages, I would guess, over the current Bluetooth support would be:
1) Auto-detecting the device and automatically transferring the media from room to room, without the user having to select the follow-me function or otherwise acknowledge the transfer to the new room;
2) A potential for more accuracy, by tuning the action to signal strength; and
3) Potentially more compatibility with cheap USB adapters that all share the same MAC address (a common problem with those $1-$5 knock-off adapters), since the detection on the LinuxMCE side would be passive.
The downside is that you might have to tune/train the system based on signal strength (particular to specific devices). From what I've read, that may be related to the problem of signal overlap where rooms are close to each other and adapters have long-ish range.
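One common way to damp the overlap/flapping problem described above is hysteresis: only switch rooms when the new room's signal clearly beats the current one. The sketch below is purely illustrative (the function name, margin, and input format are made up, not anything in LinuxMCE or bluemon).

```shell
# pick_room CURRENT MARGIN
#   stdin: "<room> <rssi>" lines (one averaged reading per room).
#   Prints the room to act on: switches away from CURRENT only when
#   the strongest room beats it by at least MARGIN dB.
pick_room() {
    awk -v cur="$1" -v m="$2" '
        { rssi[$1] = $2 }
        END {
            if (!(cur in rssi)) rssi[cur] = -999   # unseen room loses
            best = cur
            for (r in rssi) if (rssi[r] + 0 > rssi[best] + 0) best = r
            # Stay put unless the winner clearly beats the current room.
            if (best != cur && rssi[best] + 0 < rssi[cur] + m) best = cur
            print best
        }'
}
```

With a 10 dB margin, a neighbouring room that is only 5 dB stronger (pure overlap noise) doesn't steal the media, while a genuinely closer room does.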
I'm not saying it's worth making a wholesale change or addition to LinuxMCE to do it that way... I'm just saying it might be closer to the "automagic" functionality that stedaniels is looking for, without importing a whole new, complex, unknown and proprietary system.
The idea of uninitiated (free from user activation) auto-sensing capabilities is a very cool one, though... When I got turned onto the idea of home automation, I imagined a system where I would drive up to my house, the garage door would open, the system would detect I was inside, and after I closed the door it would unlock the inside garage door, as well as announce any messages I might have and/or set lighting, media, etc. to my desired presets. LinuxMCE has all the backbone for that to happen... it's the details that need to be worked out. But, again, I don't know that changing anything major in the system would be helpful at this point, and additions are all speculative, too.
I'm guessing that a good coder, familiar with USB, could harvest code from bluemon and other projects to augment/clean up LinuxMCE's current Bluetooth functionality (with attribution and release under the appropriate licenses, of course). Unfortunately, I don't code (at least, yet). And I don't personally know anyone interested in home automation who codes.