Author Topic: Installing a LinuxMCE Core with Nvidia 8500GT...  (Read 2047 times)

cirion

Installing a LinuxMCE Core with Nvidia 8500GT...
« on: February 22, 2008, 04:55:56 pm »
I have installed LinuxMCE on several machines now and learned from every one of them. This one was the hardest to install...

Before trying the DVD installer as I usually do, I tried the CD version, since I did not want to wipe the drive.

1. I started by installing Kubuntu 7.10 64-bit with no Internet connection, so no updates were applied.
2. After a reboot, I connected the Internet cable to my modem and a cable to my switch.
3. I inserted the first LinuxMCE 0710 Beta 3 64-bit CD, installed the LinuxMCE installer from it, and started it.
4. It asked to install the Nvidia driver and I selected YES.

After a few minutes of installing, it rebooted to a black screen...
I had not used CD2 yet.
The install was not finished and had not reached the wizard.
There was no picture on VGA or DVI (I had 2 monitors connected).
CTRL + ALT + 1 did not work, and I had no idea what to do next.
So I went with the DVD install instead.

I tried installing 0710Beta3 DVD 64bit with a Nvidia 8500GT card installed in my server.

1. Prepared a 200 GB HD that could be wiped.
2. Booted from the DVD
3. I chose to install LinuxMCE 0710 from the menu
4. Black screen!

I had not even selected a HD yet, and there was no picture at all, with both VGA and DVI connected.
I have installed from this DVD on several servers and know that it works well and that the install process is solid.
Could this be the problem many people are having with these cards?

This should be the point where the DVD asks which drive to use.
Having installed from this disk before, I knew what to do next.

5. I pressed 0 + Enter: nothing.
6. I pressed 1 + Enter: the DVD started installing, but still no picture.
(Knowing well how the DVD works I know this part takes time)
7. Still no picture, but the DVD ejected, so I removed it and pressed Enter.
8. It rebooted, and the picture was lost after the BIOS screen.
9. It rebooted again and now I could hear the AVWizard beeps (still no picture on VGA or DVI)
10. I pressed 1 for DVI and it beeped once and showed me the AVWizard!
11. I tried changing the resolution to 1280x1024 and 1024x768, but everything came up as 640x480?!? I went with that for now.
12. The setup wizard came up in 640x480 so I just followed it.
13. The GUI came up in UI2, as I had chosen, still at 640x480.
14. I pressed CTRL + ALT + 1 and logged in with linuxmce/linuxmce
15. I edited xorg.conf with the following command:
Code: [Select]
sudo vi /etc/X11/xorg.conf
For those who do not know vi: the Insert key lets you edit; press Esc to exit insert mode, then press : to type commands.
Code: [Select]
:wq
is the command for write + quit (press Enter after typing it).
16. To stop LinuxMCE from overwriting my new xorg.conf, I edited Xconfigure.sh and added exit at the top:
Code: [Select]
sudo vi /usr/pluto/bin/Xconfigure.sh
17. I rebooted with:
Code: [Select]
sudo shutdown -r now
18. After the rebuild, the UI came up at 1600x1200.
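For anyone not comfortable with vi, step 16 can also be done non-interactively. This is only a sketch: it assumes GNU sed and that Xconfigure.sh begins with a shebang line, and it is demonstrated on a scratch copy rather than the real /usr/pluto/bin/Xconfigure.sh (which you would edit with sudo).

```shell
# Scratch copy standing in for /usr/pluto/bin/Xconfigure.sh:
printf '#!/bin/bash\n# rest of Xconfigure.sh ...\n' > /tmp/Xconfigure.sh
# Insert "exit 0" right after the shebang, so the script returns
# before it can regenerate xorg.conf (GNU sed syntax):
sed -i '1a exit 0' /tmp/Xconfigure.sh
head -n 2 /tmp/Xconfigure.sh
```

On the real system the sed line would be run with sudo against /usr/pluto/bin/Xconfigure.sh.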

jgaffney

Re: Installing a LinuxMCE Core with Nvidia 8500GT...
« Reply #1 on: February 22, 2008, 07:47:39 pm »
I think you forgot to tell us what changes you made to your xorg.conf

jimmejames

Re: Installing a LinuxMCE Core with Nvidia 8500GT...
« Reply #2 on: February 22, 2008, 10:05:10 pm »
That's the same issue lightkeeper and I were having here:  http://forum.linuxmce.org/index.php?topic=4236.0   I didn't get as far as you and have been playing with it for the last couple days...

cirion

Re: Installing a LinuxMCE Core with Nvidia 8500GT...
« Reply #3 on: February 22, 2008, 11:06:41 pm »
Here you go, my xorg.conf:

Code: [Select]
# xorg.conf (xorg X Window System server configuration file)
#
# This file was generated by dexconf, the Debian X Configuration tool, using
# values from the debconf database.
#
# Edit this file with caution, and see the xorg.conf manual page.
# (Type "man xorg.conf" at the shell prompt.)
#
# This file is automatically updated on xserver-xorg package upgrades *only*
# if it has not been modified since the last upgrade of the xserver-xorg
# package.
#
# If you have edited this file but would like it to be automatically updated
# again, run the following command:
#   sudo dpkg-reconfigure -phigh xserver-xorg

Section "Files"
EndSection

Section "InputDevice"
Identifier "Generic Keyboard"
Driver "kbd"
Option "CoreKeyboard"
Option "XkbRules" "xorg"
Option "XkbModel" "pc105"
Option "XkbLayout" "us"
EndSection

Section "InputDevice"
Identifier "Configured Mouse"
Driver "mouse"
Option "CorePointer"
Option "Device" "/dev/input/mice"
Option "Protocol" "ImPS/2"
Option "ZAxisMapping" "4 5"
Option "Emulate3Buttons" "true"
EndSection

Section "InputDevice"
Driver "wacom"
Identifier "stylus"
Option "Device" "/dev/input/wacom"
Option "Type" "stylus"
Option "ForceDevice" "ISDV4" # Tablet PC ONLY
EndSection

Section "InputDevice"
Driver "wacom"
Identifier "eraser"
Option "Device" "/dev/input/wacom"
Option "Type" "eraser"
Option "ForceDevice" "ISDV4" # Tablet PC ONLY
EndSection

Section "InputDevice"
Driver "wacom"
Identifier "cursor"
Option "Device" "/dev/input/wacom"
Option "Type" "cursor"
Option "ForceDevice" "ISDV4" # Tablet PC ONLY
EndSection

Section "Device"
Identifier "nVidia Corporation G80 [GeForce 8500 GT]"
Driver "nvidia"
BusID "PCI:3:0:0"
# Option "XvmcUsesTextures" "true"
# Option "renderAccel" "true"
# Option "NoDDCValue"
# Option "UseEDID" "false"
# Option "ExactModeTimingsDVI" "true"
# Option "NoLogo" "true"
# Option "NoBandWidthTest" "true"
# Option "ModeValidation" "NoDFPNativeResolutionCheck, NoEdidMaxPClkCheck, NoMaxPClkCheck, AllowInterlacedModes, AllowNon60HzDFPModes"
# Option "DynamicTwinView" "false"
# Option "AllowGLXWithComposite" "true"
# Option "AddARGBGLXVisuals" "true"
# Option "ConnectedMonitor" "DFP"
EndSection

Section "Monitor"
Identifier "Generic Monitor"
Option "DPMS"
# Modeline "640x480" 25.18 640 656 752 800 480 490 492 525
# HorizSync 20-500
# VertRefresh 59-61
EndSection

Section "Screen"
Identifier "Default Screen"
Device "nVidia Corporation G80 [GeForce 8500 GT]"
Monitor "Generic Monitor"
DefaultDepth 24
SubSection "Display"
Modes "1600x1200"
Virtual 1600 1200
EndSubSection
# Option "XvmcUsesTextures" "true"
# Option "renderAccel" "true"
# Option "NoDDCValue"
# Option "UseEDID" "false"
# Option "ExactModeTimingsDVI" "true"
# Option "NoLogo" "true"
# Option "NoBandWidthTest" "true"
# Option "ModeValidation" "NoDFPNativeResolutionCheck, NoEdidMaxPClkCheck, NoMaxPClkCheck, AllowInterlacedModes, AllowNon60HzDFPModes"
# Option "DynamicTwinView" "false"
# Option "AllowGLXWithComposite" "true"
# Option "AddARGBGLXVisuals" "true"
Option "TVStandard" "1600x1200"
EndSection

Section "ServerLayout"
Identifier "Default Layout"
Screen "Default Screen"
InputDevice "Generic Keyboard"
InputDevice "Configured Mouse"

# Uncomment if you have a wacom tablet
# InputDevice     "stylus" "SendCoreEvents"
# InputDevice     "cursor" "SendCoreEvents"
# InputDevice     "eraser" "SendCoreEvents"
EndSection
Section "Extensions"
Option "Composite" "false"
Option "RENDER" "true"
EndSection

cirion

Re: Installing a LinuxMCE Core with Nvidia 8500GT...
« Reply #4 on: February 22, 2008, 11:20:25 pm »
As you can see from my xorg.conf, all I have done is comment out every option except TVStandard, plus the Modeline, HorizSync and VertRefresh lines.
Then I changed the values for Modes, Virtual and TVStandard to 1600x1200.

Why did I comment them out?

Well, I had huge problems getting 1366x768 to work on my LCD TV.
I was not getting help anywhere, and not knowing what to do, I just deleted every line I did not understand, intending to add them back one at a time...
But that turned out to be exactly what made 1366x768 work :)

Testing the same on this 8500GT card worked great :)

Modelines have not been needed on any Nvidia card I have tried, as long as all the options are commented out or deleted.
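Commenting lines out one at a time can also be scripted. A sketch (GNU sed), shown on a scratch file, that comments out every Option line the way my xorg.conf above does:

```shell
# Scratch file standing in for a section of xorg.conf:
printf 'Option "NoLogo" "true"\nOption "UseEDID" "false"\n' > /tmp/options.demo
# Prefix every Option line with "# " to disable it:
sed -i 's/^Option/# Option/' /tmp/options.demo
cat /tmp/options.demo
```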

colinjones

Re: Installing a LinuxMCE Core with Nvidia 8500GT...
« Reply #5 on: February 22, 2008, 11:25:36 pm »
Quote from: cirion
Modelines are not needed on any nvidia card I have tried when all options are commented out or deleted.

That's because you are using a standard mode definition that is built in ("1600x1200"), which means you are using the internal "modeline" for that mode. If you want to control it properly, you need to define a modeline with a different, non-standard name, e.g. "1600x1200_cirion", and then select it.
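To make that concrete, a custom-named mode might look like this in xorg.conf. This is only a sketch: the timing numbers below are the standard VESA 1600x1200@60 values rather than anything probed from cirion's monitor, and the "_custom" name is made up.

```
Section "Monitor"
Identifier "Generic Monitor"
Option "DPMS"
# Custom-named modeline (VESA 1600x1200@60 timings, for illustration only)
Modeline "1600x1200_custom" 162.00 1600 1664 1856 2160 1200 1201 1204 1250 +hsync +vsync
EndSection

Section "Screen"
Identifier "Default Screen"
Device "nVidia Corporation G80 [GeForce 8500 GT]"
Monitor "Generic Monitor"
DefaultDepth 24
SubSection "Display"
# Select the custom mode by its non-standard name
Modes "1600x1200_custom"
EndSubSection
EndSection
```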

Use the UseEDID option to probe your monitor and read the results in /var/log/Xorg.0.log: you will see it validating all the modes, and you should get a better idea of what the monitor supports. But definitely uncomment all those options; with them commented out you have no clue what it is doing...
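A quick way to pull those lines out of the log. The grep pattern is just a guess at the interesting keywords, and the two sample lines only mimic (they are not copied from) nvidia driver output; on a real system you would point grep at /var/log/Xorg.0.log itself.

```shell
# Sample standing in for /var/log/Xorg.0.log:
printf '(II) NVIDIA(0): Validating Mode "1600x1200"\n(II) NVIDIA(0): Mode "1280x1024" is rejected\n' > /tmp/Xorg.0.log.sample
# Filter for EDID probing and mode validation chatter:
grep -iE 'edid|valid|reject|modeline' /tmp/Xorg.0.log.sample
```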

cirion

Re: Installing a LinuxMCE Core with Nvidia 8500GT...
« Reply #6 on: February 23, 2008, 12:04:56 am »
Quote from: colinjones
That's because you are using a standard mode definition that is built in ("1600x1200").
Is 1366x768 a standard mode too?

Not sure what you are trying to say, since I have tried every modeline generator and a lot of wikis to get 1366x768 to work. Nvidia is no help either. It took me several months to get it working; I almost gave up on Linux...

When I removed all the options and stopped LinuxMCE from changing the xorg.conf, 1366x768@66 works... even smoothly in UI2 Overlay.
I would like to have control over every aspect of my GFX card, but if that means a black screen, I see no point...

Are you saying I can get 1366x768 on an Nvidia GFX card with UI2 and alpha blending and no tearing if I set the options right? If so, I would definitely try anything you suggest!

I have the following Nvidia GeForce cards in my setups:
2MX, 7050pv, 8500GT, 8600GT

colinjones

Re: Installing a LinuxMCE Core with Nvidia 8500GT...
« Reply #7 on: February 23, 2008, 01:48:17 am »
I would say: if it's working and you are happy with the result, don't play with it! Take a backup copy of your xorg.conf and leave it at that.

But if you do want to play, take a copy of xorg.conf, then start by restoring the commented-out lines and adding Option "UseEDID" "true" to the Device and Screen sections. Boot with that; if it comes back to a black screen, try to kill X (CTRL-ALT-Backspace) or reboot in recovery mode, then read /var/log/Xorg.0.log. You should see it interrogating the monitor, and you will learn a lot more about which resolution it is selecting and which modes it believes are valid and not valid.
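The backup step is worth making explicit. A sketch, demonstrated on a scratch file; on the real machine the source would be /etc/X11/xorg.conf and the cp would need sudo.

```shell
# Scratch file standing in for /etc/X11/xorg.conf:
cfg=/tmp/xorg.conf.demo
printf 'Section "Files"\nEndSection\n' > "$cfg"
# Dated backup, so several experiments can each keep their own copy:
cp "$cfg" "$cfg.$(date +%Y%m%d)"
ls "$cfg".*
```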

I don't think you will ever get UI2 Alpha working completely without tearing, even if it is only occasional; look up Totallymaxed's comments on this subject.