Display settings are reset when monitor is disconnected
Hi,

The Nvidia GTX 295 has 2 GPUs, and when the HDMI output is connected to a monitor, Nvidia CUDA sees both GPUs. The generic VGA and HDMI displays are extended onto each other, and the generic VGA channel ("Display device on: VGA") is set as the main display. Despite this, Windows disables the second GPU (the HDMI/physical display) when the monitor is disconnected, so CUDA can no longer utilize both GPUs.

How do I make these settings permanent, so that CUDA can still use both GPUs when the computer is no longer connected to a monitor? Better still: how do I enable both GPUs by default, with no monitors attached at all? Some Nvidia cards, like the Tesla, do not have any monitor outputs...

Regards
Morten Ross
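PS: For reference, a minimal CUDA C sketch along these lines (just an illustration, assuming the CUDA toolkit is installed) shows how many devices the CUDA runtime can actually see. If the behaviour described above is in play, the count should drop from 2 to 1 as soon as the HDMI monitor is pulled:

#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        printf("cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    printf("CUDA sees %d device(s)\n", count);

    /* list what the runtime reports for each visible GPU */
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("  device %d: %s, %d multiprocessor(s)\n",
               i, prop.name, prop.multiProcessorCount);
    }
    return 0;
}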
April 8th, 2009 1:35pm

Morten,

I'm not entirely certain you CAN do what you're trying to do. When Windows detects hardware, it makes note of that hardware's capabilities and puts the information into the Device Manager database. With the "miracle" of plug and play, when a hardware item is removed, it's removed from the database; when it's plugged back in, it's restored. The Nvidia drivers are likely designed to look in Device Manager to see what monitors are there and what they can do, and to adjust power usage accordingly. Green, for better or worse, is the "new black"... power conservation is the popular thing. If the driver doesn't see a monitor, it doesn't see a need to keep the 2nd GPU enabled, since that GPU literally has nothing to do, and it puts it into sleep mode.

Exactly what ARE you trying to do, anyhow?
April 8th, 2009 2:11pm

Hi,

I'm trying to use all GPU cores for scientific computing: http://www.nvidia.com/object/cuda_home.html

In this case on BOINC (Berkeley Open Infrastructure for Network Computing) - SETI@Home: http://www.nvidia.com/object/cuda_home.html#state=detailsOpen;aid=c39d46a0-0e15-11de-8c30-0800200c9a66

There is a lot of discussion and confusion on this matter, as some people do not have this problem (with no monitors attached), while others experience a lot of problems and have to create a dummy plug (made with resistors to simulate an attached monitor) - see http://www.overclock.net/overclock-net-folding-home-team/384733-30-second-dummy-plug.html.

The problem is that I have not found any means to forcibly make both/all GPUs accessible to CUDA. This affects everyone who uses cards with more than one GPU for computing... As far as Windows Device Manager is concerned, both GPUs ARE there to use - always.

Morten
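PS: To make the problem concrete, here is a small sketch (illustrative only, not part of BOINC or SETI@Home) that tries to allocate memory and launch a trivial kernel on every device the CUDA runtime enumerates. A GPU can be present in Device Manager and still fail at this step once Windows has shut down its display channel:

#include <stdio.h>
#include <cuda_runtime.h>

__global__ void touch(int *p) { *p = 42; }

int main(void)
{
    int count = 0;
    cudaGetDeviceCount(&count);

    for (int i = 0; i < count; ++i) {
        int result = 0;
        int *d_value = NULL;
        cudaSetDevice(i);
        cudaError_t err = cudaMalloc((void **)&d_value, sizeof(int));
        if (err == cudaSuccess) {
            touch<<<1, 1>>>(d_value);
            /* cudaMemcpy synchronizes, so a failed launch surfaces here */
            err = cudaMemcpy(&result, d_value, sizeof(int), cudaMemcpyDeviceToHost);
            cudaFree(d_value);
        }
        if (err == cudaSuccess && result == 42)
            printf("device %d: usable for compute\n", i);
        else
            printf("device %d: NOT usable (%s)\n", i, cudaGetErrorString(err));
    }
    return 0;
}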
April 8th, 2009 4:02pm

This suggestion will cost you some money, and I can't guarantee the results... Consider looking into a pair of media devices such as Extron DVI/HDMI switches or extenders, which would stay permanently connected to the video outputs while the monitors could be disconnected. I don't understand why the monitors are disconnected to begin with, but I don't need to know. You will need to research their white papers or contact them to see how a particular device behaves. Hopefully, the monitor will appear transparent. Perhaps you only disconnect one of the monitors and therefore merely need one box. Expect to find a device that handles more equipment than your needs require.

T.R.
April 8th, 2009 4:47pm

Hi TR,

The GPUs are used to extend the computing capabilities of the machine. In fact, the computing power of a GPU is sometimes 10 times that of any of today's Intel or AMD processors. So if a top-of-the-line Intel processor completes a work unit in 60 minutes, the GPU will complete it in 10 minutes. To many people, the latest generation of video cards is not just for gaming/video, but a means to hugely increase the GFLOPS of any given computer.

The monitors are redundant in a computing setting - in this scenario the computers with their GPUs are just another box in a server farm. Alas, so far a CPU/GPU server farm seems rather difficult to achieve.

I have noticed that even with two monitors attached, the setting reverts back to one GPU once the computer is restarted. Only if I re-insert the HDMI monitor will the second GPU become active again. This is not working as expected and seems to be Windows 7-related. How can this be remedied?

Thanks for your input on Extron, but a dummy plug does the same.

Morten
April 8th, 2009 8:51pm

Ah, I see... going for the home supercomputer. ;-) Very cool effort.

OK... then the solution would seem to be to check with Nvidia. Remember, today's Nvidia drivers are, like Windows 7, in a beta state. While they're more or less equal to the Vista drivers as far as the driver model is concerned, given the issues reported here they're not quite perfected. As such, Nvidia is probably working on getting them to display things properly first, and will worry about extending their cards' capabilities later. On that note, do you happen to know if Vista is on the list of supported OSes that can pull off this trick successfully? If so, you might try an older Vista-compatible driver and see if that does any better. Keep in mind that since it is a Vista driver, it may not be 100% compatible with Windows 7 and may cause other issues.
April 8th, 2009 9:48pm

Nvidia is pointing to Microsoft, and vice versa, so this is very frustrating... Nvidia CUDA is supported on Vista and Server 2008, but this is a problem on Vista and Server 2008 as well - though not for everyone - and that's what is so frustrating, as we have not found any means to control this behaviour. If we had some utility, registry setting or other way of controlling this manually, we would be a lot closer to an acceptable situation.

For the time being I must use a dummy plug in the HDMI port and hope there are no power outages, as a power outage requires manual intervention in the form of pulling the dummy plug out and inserting it again in order for Windows to see it's there. The same happens if a real monitor is attached, so this is not a dummy plug issue. Windows should know there is a monitor attached to the HDMI port during startup, but alas...

Morten
April 9th, 2009 2:39pm

I have an Nvidia QNVS 290 in a PCIe x1 slot and an NVIDIA Tesla in my only PCIe x16 2.0 slot. Windows 7 correctly recognises both cards. My problem is simply that I have no monitor connected to the Tesla; my monitors are connected to the QNVS 290. At boot time the Tesla in the PCIe x16 slot is always selected as primary, with or without a connected monitor. The Tesla works fine, of course, but it has to have a monitor connected.

Q1: Is there any way of ensuring that at boot-up Windows 7 always selects the QNVS 290 in the PCIe x1 slot? It works perfectly as long as the Tesla is not in the PCIe x16 slot. I have tried this with a GeForce GTX 460 in the PCIe x16 slot, also with no monitor connected, and I still cannot force the OS to use the QNVS 290 as the primary graphics card for the monitors. Any ideas?

T.
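PS: For what it's worth, on the CUDA side a compute application can pick the Tesla explicitly, independently of which card Windows treats as the primary display adapter. A sketch along these lines (illustrative only - which card drives the monitors is still decided by the OS and driver, not by this code):

#include <string.h>
#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    int count = 0;
    cudaGetDeviceCount(&count);

    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        /* pick the first device whose name contains "Tesla" */
        if (strstr(prop.name, "Tesla") != NULL) {
            cudaSetDevice(i);   /* subsequent CUDA calls in this thread target it */
            printf("Using device %d: %s\n", i, prop.name);
            return 0;
        }
    }
    printf("No Tesla found among %d visible device(s)\n", count);
    return 1;
}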
October 1st, 2010 8:16am

This topic is archived. No further replies will be accepted.
