Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Topics - tschak909

Pages: 1 2 [3] 4 5 ... 11
31
Developers / Changes to support Capture Card Audio Port.
« on: May 19, 2011, 03:22:57 pm »
If you wish to see all the appropriate changes, see Bug #1063 in Trac. But basically:

* The addition of a new DesignObj in Designer.

A new DesignObj (6063) has been added to support the new screen. It looks like all the rest of the Setup Wizard screens, except that this one has a new data grid, dgCaptureCardAudioPort, which gets its information from the Media Plugin.

The datagrid generator for this grid gathers all the devices of category Capture Card Audio Port, for a given PK_Device (specified in a previous screen), and puts them in this grid.

Selecting an entry from the datagrid on this DesignObj will fire a Set Capture Card Audio Port command in the Media Plugin, to assign a number to the FK_Capture_Card_Audio_P device data. This number is the device # of the corresponding Capture Card Audio Port device template.
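
In rough, simplified terms, the flow from picking a grid entry to storing the port looks something like the sketch below (purely illustrative C++; these are stand-in names, NOT the actual Media Plugin symbols):

Code: [Select]
// Purely illustrative; stand-in names, not the actual Media Plugin code.
#include <iostream>
#include <string>

// Hypothetical helper: persist one piece of device data for a device.
void SetDeviceData(int pkDevice, const std::string &sDataName, int iValue)
{
    std::cout << "device " << pkDevice << ": " << sDataName << " = " << iValue << std::endl;
}

// What the Set Capture Card Audio Port command boils down to: the grid row the
// user picked carries the device template # of the chosen audio port, and that
// number gets written onto the capture card's device data.
void SetCaptureCardAudioPort(int pkCaptureCard, int iPortTemplate)
{
    SetDeviceData(pkCaptureCard, "FK_Capture_Card_Audio_P", iPortTemplate);
}

int main()
{
    SetCaptureCardAudioPort(123 /* the capture card from the previous screen */,
                            456 /* template # of the selected audio port */);
    return 0;
}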

As such, I have made a number of new templates in this category:

* Front Analog Port
* Rear Analog Port
* Default Port
* S/PDIF

More can be added as needed to best describe tuner card audio ports, and they are added in exactly the same way as the Capture Card Video Ports.

Right now, the Extra Parameters device data is used by the Media Plugin to specify what to pass to the CaptureCards_SelectInput_Foo.sh script. As of now, we call this script as:

Code: [Select]
CaptureCards_SelectInput_HDPVR.sh /dev/video0 0 0

where
$1 is the block device (translated from <%BLOCK%>)
$2 is the desired video port as seen by v4l2-ctl
$3 is the desired audio port as seen by v4l2-ctl

This is called from Media_Plugin::StartCaptureCard(), by the App Server, before the call to Xine Player to start the stream.
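
For illustration, assembling that call looks roughly like this (a minimal sketch, not the actual StartCaptureCard() code; the paths and values are just the ones from the example above):

Code: [Select]
// Minimal sketch of assembling that call; NOT the actual Media_Plugin code.
#include <cstdlib>
#include <string>

int main()
{
    std::string sScript   = "/usr/pluto/bin/CaptureCards_SelectInput_HDPVR.sh";
    std::string sBlockDev = "/dev/video0";   // $1: translated from <%BLOCK%>
    int iVideoPort = 0;                      // $2: video port as seen by v4l2-ctl
    int iAudioPort = 0;                      // $3: audio port as seen by v4l2-ctl

    std::string sCmd = sScript + " " + sBlockDev + " "
                     + std::to_string(iVideoPort) + " "
                     + std::to_string(iAudioPort);

    return system(sCmd.c_str());
}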

Hopefully this will help some people who are making device templates for tuner cards.

-Thom

32
Users / New Setup Wizard Screen: Set Capture Card Audio Port
« on: May 19, 2011, 03:14:37 pm »
Good morning.

A new screen has been added to the Setup Wizard.

Some devices, like the Hauppauge HD-PVR tuner, have more than one set of audio plugs which are selectable for input. Until now, these were ignored. I have added all the necessary device templates, design objects, and other bits to the system to allow tuners with more than one audio input to be selected right from the Setup Wizard.

This screen sits between the A/V Destination for a Video Capture Port screen and the Rooms You Wish To Use This A/V Device In screen in the Setup Wizard.

A video showing the new screen is here: http://www.youtube.com/watch?v=hGS11BexPFo

If you want to see the screens clearly, I suggest selecting the 720p or 480p resolution as appropriate.
 
Let me know what you think,
-Thom

p.s. There is a corresponding developer thread here: http://forum.linuxmce.org/index.php?topic=11689.0 talking about the internals.

33
Developers / Solicitation for Development: Desktop / Workstation Agent
« on: April 09, 2011, 07:20:44 pm »
Hello, Everyone.

I have heard quite a few times over the years the wish for greater integration with Desktop / Workstation computers within the home. Things like:

* Being able to start a piece of media on a workstation, or move a piece of media between a TV and a workstation, from either an orbiter or the file manager
* Having a convenient orbiter available on a workstation to do the above, or control other house functions
* Having a way to turn the monitor on/off automatically, in response to changes in house mode (when going Away for example, or waking up, or presence detection)

etc.

I have put together a quick prototype of this, and it is in SVN in a few pieces, for those interested in finishing development of such a component.

Wiki Page is here: http://wiki.linuxmce.org/index.php/Workstation_Agent

A demo video is here: http://www.youtube.com/watch?v=P1pedtpuYzA

No, I am not putting up instructions on the wiki for how to install the components. Why? Because I want to foster development and discussion here, to force the issue towards making it easily installable, rather than letting setup instructions languish on a wiki that people just muddle through.

Thoughts?
-Thom

34
Users / PadOrbiter 2.0
« on: March 10, 2011, 06:18:45 pm »
After working on it off and on for the last year, I have prepared the first version of PadOrbiter 2.

What's different this time?

Primarily, PadOrbiter was a distribution that was hand-crafted specifically for a single device, with no allowance for different hardware. It was designed to run on a specific run of WebDT 366 tablets that found their way onto the market in the latter half of 2008. They had very limited amounts of memory and very constrained resources, so I had to be extremely creative with what I put on the devices.

Times have changed.


Now, we are amidst ever more capable portable devices, which contain much faster processors, more RAM, and more storage space than ever before. These devices also contain video acceleration cores as part of their system design to aid in the playback of high-definition content. These devices take on many forms, but we are starting to see a mass commoditisation across some common axes:

* Intel Based
- Atom Z or N series CPU
- GMA3150, GMA500, GMA600 GPU

* ARM Based
- Cortex A8/A9 based CPU, such as OMAP3/OMAP4 or NVIDIA Tegra
- SGX5535 GPU and TI video acceleration (found in the N900), or NVIDIA Tegra

More designs are on the way, so the question becomes how do we handle these devices effectively?

While the Proxy Orbiter based designs have allowed developers to quickly get existing orbiter designs onto existing iOS and Android platforms, I see this as a stop-gap. This goes against one of the primary goals of LinuxMCE, to unify virtually every single piece of technology in the home.

Doing this properly will take a two-pronged approach:

* For native systems, such as Android and iOS, native DCE implementations need to be ported (in the case of Android) and/or built (in the case of iOS)
* For devices that will just run Orbiter, a version of the Orbiter based on MeeGo technologies can be used. This thread addresses that particular case.

Why? Isn't Orbiter just a glorified touch menu?

Yes, but since these devices have far more advanced video playback and GPU processing features, we can actually make every portable device not only a control point, but also a media end point.

Also, consider PlutoStorageDevices. Once this is running on PadOrbiter v2, the host device will suddenly be able to share USB storage devices with the rest of the house, transparently. Imagine being able to conveniently plug a friend's portable USB disk into a nearby Joggler. It is not only certainly possible, but it will be done.

PadOrbiter v2 also contains, in addition to the existing Orbiter code, libraries for Qt Quick/QML and Clutter, which I have been investigating as a base to build a new Orbiter on top of. So the entire stack is FORWARD COMPATIBLE with not only the newer, less expensive hardware coming out, but it also provides the software platform bits needed to carry the Orbiter idea forward and bring in new ideas. Add to this the additional capabilities of media playback and control, and advanced features like pluto storage device points, and you have a solution you can't get anywhere else.

For now, I am placing kickstart files in the following place:
http://svn.linuxmce.org/svn/branches/LinuxMCE-0810/src/MeeGo

These can be built using meego-image-creator, which can be found by your friendly google.

Joggler images must be built with -f raw; for everything else, I typically use -f livecd.

I will release images later, as I still am refining some bits.

p.s. Oh, there's a video of it running on the Joggler, here: http://www.youtube.com/watch?v=ZRKkqxWcytI

Enjoy,
-Thom

35
Developers / ONGOING: WiiMote support
« on: September 11, 2010, 09:03:21 pm »
Hello everyone.

After a 5 month hiatus from the project so I could collect my thoughts and recover from the intense burnout, I am back in the saddle again.

We are all squashing bugs as best we can, and I am also trying to finish up some things I had started over two years ago. One of them being Nintendo Wii Remote support.

After much deliberation and study of the existing solutions out there, none of them were adequate for what LinuxMCE really needed. They were all nasty hacks on top of nasty hacks, which manifested side effects in various ways, such as only being usable in one room of the house, not being able to use more than one WiiMote properly, insanely inconvenient button mappings, etc.

With my initial work, I tried to use the existing Bluetooth infrastructure to detect the WiiMotes. While this did work, and the WiiMotes were detected, they were incorrectly detected as mobile phones, and the system would attempt to generate an orbiter for them. This causes undue waste of time and resources generating an orbiter which will never be used. In addition, the Bluetooth device detection loop currently in place to detect mobile phones clogs up the Bluetooth dongle so that other programs can't detect things. This presents itself most commonly as not being able to pair the remotes "on the first... second... fifth... thirty-seventh try," but pairing just fine on the "thirty-eighth." Very inconvenient!

The initial try used the wminput program from the CWiiD project to map buttons, direction pads, the IR pointer, and accelerometers to kernel input events, so that to the rest of the system, the wireless WiiMote looks just like a wired joystick and/or mouse. While this did allow it to work for more than 60% of the use cases, it is not sufficient for daily use.

The reason is quite simple: the WiiMote is a hybrid device. It presents itself as both a game pad and a remote control device with a very zen-like button layout.

You may think the solution is to force all of our connected programs to use the same "button" layout universally, but this would only work if a single WiiMote were only ever used with a single media director in the house. Historically, only a single keyboard and mouse have been connected to a computer workstation, and that assumption bears itself out in the various programs that re-map keys.

Why do we need to do this in the first place, when we only have one UI to deal with?

Wrong, we have multiple UIs to deal with.

We need to deal with:

* Standard orbiter displays
* Media playback displays
* On-screen navigation displays such as when MythTV displays menus
* File Browser displays
* Game Player displays

etc.

And each of these needs its own button mappings.

For example, the trigger and A buttons in standard orbiter displays should map to "OK" and the left mouse button simultaneously. (The reason for this is historical, and it's quite silly, but yes, it needs to be this way.)

In the File Browser, the vertical 1 and 2 buttons should map to Page Up and Page Down.

While playing media, the 1 and 2 buttons should map to the skip buttons (which change channels or change tracks, depending on the media being played).

The Game Player presents an additional challenge. There aren't multiple keyboards or mice, so distinguishing between concepts like Player 1 and Player 2 is more difficult, ESPECIALLY since WiiMotes can connect and disconnect at any given time. I can't simply hard-code button mappings across multiple applications cleanly. It's too brittle.

To solve this problem, we need to literally present a "joystick" and a "trackball" to the Game Player, and to the MAME and MESS programs it is controlling. When the Game Player is playing, we change the button, d-pad, and IR mappings to present a joystick-like device and a trackball.

So how do we do this, while providing a mechanism to detect WiiMotes?

The solution is to provide a DCE device which will emit device detected/removed events in response to WiiMotes being paired/unpaired, while also providing a thread which utilizes the excellent WiiUse library to talk to the WiiMote. This thread will utilize the uinput kernel driver, which allows regular user-space programs access to the kernel input subsystem, and, depending on which screen is visible, will change the events that get sent along uinput appropriately.
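
For those curious what the uinput half looks like, here is a bare-bones sketch of injecting a single button press (illustrative only; error handling and the WiiUse side are left out, and the device name is just a placeholder):

Code: [Select]
// Bare-bones illustration of one button press going out over uinput.
#include <linux/input.h>
#include <linux/uinput.h>
#include <sys/ioctl.h>
#include <fcntl.h>
#include <unistd.h>
#include <cstring>

static void emit(int fd, int type, int code, int value)
{
    struct input_event ev;
    memset(&ev, 0, sizeof(ev));
    ev.type = type; ev.code = code; ev.value = value;
    write(fd, &ev, sizeof(ev));
}

int main()
{
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);

    // Declare which events this virtual device can send.
    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, KEY_ENTER);      // e.g. WiiMote "A" mapped to OK/Enter

    struct uinput_user_dev uidev;
    memset(&uidev, 0, sizeof(uidev));
    strncpy(uidev.name, "Virtual WiiMote (sketch)", UINPUT_MAX_NAME_SIZE);
    uidev.id.bustype = BUS_VIRTUAL;
    write(fd, &uidev, sizeof(uidev));
    ioctl(fd, UI_DEV_CREATE);

    emit(fd, EV_KEY, KEY_ENTER, 1);           // press
    emit(fd, EV_SYN, SYN_REPORT, 0);
    emit(fd, EV_KEY, KEY_ENTER, 0);           // release
    emit(fd, EV_SYN, SYN_REPORT, 0);

    ioctl(fd, UI_DEV_DESTROY);
    close(fd);
    return 0;
}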

How do we determine if a screen is changed, and what type it needs to be?

The answer here has already been solved by Pluto for their infra-red remote controls, as they do the same exact thing. See the Set Screen Type command. This is emitted by Orbiter to devices so that they can change their internal state appropriately. Normally, this is handled by IRBase, but since we're a bit outside that scope, we'll handle all this stuff internally ourselves, especially since the button combinations that we need are already known and won't need to be changed. So modifying Orbiter to find devices in the Wii Remote category, and to send the Set Screen Type message to them, is enough.

With this, we can now maintain a map of "buttons" that we need to map, and when to change them; when we receive the Set Screen Type command, we simply make the change. :D
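
Something like this, conceptually (the screen type strings and key codes below are only placeholders to show the shape of the map, and the orbiter entry is simplified; the real mapping also sends the mouse click):

Code: [Select]
// Sketch of the per-screen button map; names and key codes are placeholders.
#include <linux/input.h>
#include <map>
#include <string>

enum WiiButton { WII_A, WII_TRIGGER, WII_1, WII_2 };

// For each screen type: which kernel key code each WiiMote button should emit.
std::map<std::string, std::map<WiiButton, int> > g_ScreenMaps = {
    { "orbiter",     { { WII_A, KEY_ENTER }, { WII_TRIGGER, KEY_ENTER } } },
    { "filebrowser", { { WII_1, KEY_PAGEUP }, { WII_2, KEY_PAGEDOWN } } },
    { "media",       { { WII_1, KEY_PREVIOUSSONG }, { WII_2, KEY_NEXTSONG } } },
};

std::string g_sCurrentScreen = "orbiter";

// Called when the Set Screen Type command arrives from Orbiter.
void SetScreenType(const std::string &sType) { g_sCurrentScreen = sType; }

// Look up what a button should emit on the currently visible screen.
int MapButton(WiiButton button)
{
    std::map<WiiButton, int> &m = g_ScreenMaps[g_sCurrentScreen];
    std::map<WiiButton, int>::iterator it = m.find(button);
    return it == m.end() ? -1 : it->second;
}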

This leaves Follow Me, and using the WiiMote during the AV Wizard.

The former is possible because each Wii Remote has its own distinct address. We can utilize the existing infrastructure in place for the Fiire Remote, coupled with this, and create a device for each family member's Wii Remote, coupled with a Wii Remote Dongle device on each media director, so that when we press the vertical 1+2 buttons together, the system will do a Follow Me media event. :)

The latter is solved by implementing Local Mode in the DCE device driver. In Local Mode, a DCE socket isn't opened; rather, a standard UNIX socket is opened that the AV Wizard can read, so that the AV Wizard can use the WiiMote even though the entire system isn't completely powered up.
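
Conceptually, Local Mode boils down to something like the sketch below (the socket path and payload format are placeholders, not the real protocol):

Code: [Select]
// Sketch of the Local Mode idea: no DCE socket, just a plain UNIX domain socket
// the AV Wizard can read from. Path and payload are placeholders.
#include <sys/socket.h>
#include <sys/un.h>
#include <unistd.h>
#include <cstring>
#include <string>

int OpenLocalModeSocket(const std::string &sPath)
{
    int fd = socket(AF_UNIX, SOCK_STREAM, 0);

    sockaddr_un addr;
    memset(&addr, 0, sizeof(addr));
    addr.sun_family = AF_UNIX;
    strncpy(addr.sun_path, sPath.c_str(), sizeof(addr.sun_path) - 1);

    unlink(sPath.c_str());                    // clear any stale socket file
    bind(fd, (sockaddr *)&addr, sizeof(addr));
    listen(fd, 1);                            // the AV Wizard is the only client
    return accept(fd, NULL, NULL);            // blocks until the wizard connects
}

int main()
{
    int wizardFd = OpenLocalModeSocket("/tmp/wiimote_local_mode.sock");
    const char *msg = "BUTTON 1\n";           // placeholder payload
    write(wizardFd, msg, strlen(msg));
    close(wizardFd);
    return 0;
}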

All this, just to make it all appear nice and consistent and "just work"

I will be posting more in this thread, as this develops.

-Thom

36
Users / ANN: Podcast on FLOSS Weekly - 1 Sep - LinuxMCE
« on: August 25, 2010, 04:05:41 pm »
LinuxMCE will be featured on Twit.TV's "FLOSS Weekly" podcast on Wednesday, September 1st, at 8:30 AM Pacific time. I will be interviewed by Randal Schwartz; an additional host may also be present (not quite sure, as guest hosts tend to change at the last minute).

http://twit.tv/floss/

Hope you all enjoy. I am asking for a temporary pin on this article.

-Thom

edited: wrong date in body. I am an ID10T.
edited again: they pushed me up an hour :)

37
Developers / Programming Task: Detect VDPAU
« on: July 02, 2010, 10:21:02 pm »
We need some way to detect VDPAU.

My thought is to build a small C program that links to libvdpau and does some basic initialization. If that is successful, it exits with an error code of 0; otherwise, it exits with another error code that we can decode, or that at the very least indicates it wasn't successful.
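
Something along these lines, perhaps (untested sketch; the build flags and exit codes are only a suggestion):

Code: [Select]
/* Untested sketch of the idea. Build with something like:
   gcc -o vdpau_detect vdpau_detect.c -lvdpau -lX11 */
#include <vdpau/vdpau.h>
#include <vdpau/vdpau_x11.h>
#include <X11/Xlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 2;                   /* no X display available */

    VdpDevice device;
    VdpGetProcAddress *get_proc_address;
    VdpStatus st = vdp_device_create_x11(dpy, DefaultScreen(dpy),
                                         &device, &get_proc_address);
    XCloseDisplay(dpy);

    return (st == VDP_STATUS_OK) ? 0 : 1;   /* 0 = VDPAU usable, 1 = not */
}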

Any volunteers?

-Thom

38
http://tschak.homelinux.com/downloads/Media_Plugin.so

Take this and put it in your /usr/pluto/bin.

Let the router reload itself.

This fix will be in the repository on the next pass.

-Thom

39
Developers / Work in Progress: Pandora Player
« on: June 05, 2010, 06:12:47 am »
I am drawing screens for a Pandora Player, which will play Pandora content.

Since Pandora is a purely internet feed that dynamically changes, the standard remote simply isn't appropriate here, so a new set of screens is being designed from scratch.

Here is the first one, the playlist remote (as seen from UI1 and/or touchscreen orbiters.)

http://tschak.homelinux.com/screen-previews/pandora-player/Pandora%20Remote_v3.png

Hopefully the different buttons are intuitive. I had to go through a few passes to lay things out well.

-Thom

40
Developers / Re: Clutter Exercise: Media List - [X] DONE
« on: June 05, 2010, 03:21:12 am »
As a first experiment, let's try to build a media list, using Clutter or Qt.

How about the following:

* A grid of thumbnails that stretches from left to right and scrolls off the bottom
* As you mouse over one of them, it pushes forward, becomes larger, and expands into a rectangle containing the thumbnail and some text
* Kinetic scrolling

Go.

-Thom

41
Feature requests & roadmap / Meego (was: Attitude of COOPERATION.)
« on: June 03, 2010, 09:39:14 pm »
We intend to move to 1004 for the immediate future. I am doing research to move the stack to MeeGo.

-Thom

42
Developers / Programming Documentation Task: Document DCERouter.
« on: May 31, 2010, 08:15:49 am »
I have a task for someone interested in creating programming and implementation detail documentation:

Start with documenting the DCERouter.

The source is here: http://svn.linuxmce.org/svn/branches/LinuxMCE-0810/src/DCERouter

There is some start to Doxygen metadata in the code, but perhaps we can improve on this.
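
For example, something in this direction (purely illustrative, not an actual DCERouter method):

Code: [Select]
// Purely illustrative: just the shape of the Doxygen blocks that would be
// useful on the message-handling entry points.
class Message;   // the real class lives in the DCE sources

/**
 * \brief Route an incoming DCE message to its destination device(s).
 * \param pMessage The parsed message, including from/to device IDs and type.
 * \return true if the message was delivered or queued, false otherwise.
 */
bool RouteMessage(Message *pMessage);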

Anyone want to take this on?

-Thom

43
Developers / Guys, seriously, are there no coders in here?
« on: May 23, 2010, 03:33:37 am »
Really, I'm fed up.

Am I really the only one man enough to do any serious additions to the system?

-Thom

44
Developers / Programming Task: Create new Bluetooth DCE Device
« on: May 16, 2010, 07:51:27 am »
A new Bluetooth DCE device should be created.

This should use the Bluez4 stack. Initially it should perform:

* Hardware detection of Bluetooth devices, emitting Device Detected events.
** Device Detected events should fill out as many of the event's fields as possible, including device category, so that we can detect based on device TYPE rather than looking for specific devices.
* Proximity detection of Bluetooth devices, emitting Sensor Tripped events.

Use BluetoothProximity's logic as a basis, but please, rewrite it as a C++ device. The idea here is to use the same techniques in that program, but do it in C++, and emit DCE events from it.

Ideally this should be structured so that we can add additional classes to handle more device types. This will become the replacement for Bluetooth Dongle.

Once this works, we can pave the way for other pieces, such as proper presence detection, follow me, etc. to work on a wider variety of phones, and not have to be paired to the orbiter architecture.
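
To give a feel for the discovery half, here is a rough sketch using the standard BlueZ inquiry calls (the EmitDeviceDetected() function is a hypothetical stand-in for firing the real DCE event; the rest is ordinary libbluetooth usage):

Code: [Select]
// Rough sketch of the discovery half using the standard BlueZ inquiry calls.
// EmitDeviceDetected() is a hypothetical stand-in for firing the DCE event.
#include <bluetooth/bluetooth.h>
#include <bluetooth/hci.h>
#include <bluetooth/hci_lib.h>
#include <cstdio>
#include <cstdlib>

void EmitDeviceDetected(const char *addr, const char *name, int deviceClass)
{
    // Placeholder: the real device would build and send the Device Detected event here.
    printf("Device Detected: %s (%s) class 0x%06x\n", addr, name, deviceClass);
}

int main()
{
    int dev_id = hci_get_route(NULL);         // first available dongle
    int sock   = hci_open_dev(dev_id);

    int max_rsp = 255;
    inquiry_info *ii = (inquiry_info *)malloc(max_rsp * sizeof(inquiry_info));
    int num_rsp = hci_inquiry(dev_id, 8, max_rsp, NULL, &ii, IREQ_CACHE_FLUSH);

    for (int i = 0; i < num_rsp; i++) {
        char addr[19] = {0}, name[248] = {0};
        ba2str(&ii[i].bdaddr, addr);
        if (hci_read_remote_name(sock, &ii[i].bdaddr, sizeof(name), name, 0) < 0)
            snprintf(name, sizeof(name), "[unknown]");

        // Include the class of device, so callers can match on device TYPE.
        int devClass = (ii[i].dev_class[2] << 16) |
                       (ii[i].dev_class[1] << 8)  |  ii[i].dev_class[0];
        EmitDeviceDetected(addr, name, devClass);
    }

    free(ii);
    hci_close_dev(sock);
    return 0;
}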

Anyone want to take this on?

-Thom

45
Hello everyone.

We currently keep the name of the system inside the web admin's network configuration panel.

It is, by default, set to DCERouter, and we need it to be resolvable by name on every machine on the network, so somebody can go to a Windows, Mac, or Linux machine and type

http://dcerouter/

into a browser, and have it just work.
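
(For reference, a trivial check of that requirement from any Linux box might look like the sketch below; actually making the name resolve is the task itself.)

Code: [Select]
// Illustrative check only; making the name resolve is the real task.
#include <sys/types.h>
#include <sys/socket.h>
#include <netdb.h>
#include <cstdio>

int main()
{
    struct addrinfo *res = NULL;
    int rc = getaddrinfo("dcerouter", "80", NULL, &res);
    if (rc != 0) {
        printf("dcerouter does not resolve: %s\n", gai_strerror(rc));
        return 1;
    }
    printf("dcerouter resolves; http://dcerouter/ should just work.\n");
    freeaddrinfo(res);
    return 0;
}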

Any takers?

-Thom
