BigD 2008-08-07 07:54
I have a remote computer with 4 monitor ports. I believe it has 2 graphics cards with a VGA and DVI port on each. I do not have physical access to the machine at this moment.
In desktop properties, I see 4 displays, but I am not able to "extend the desktop" to any of the additional 3 displays. I believe the problem occurs because there is no physical monitor attached to these other 3 ports.
Is there a way to fake out the computer into thinking that a monitor is attached, perhaps via a little software utility or some other little trick?
This is a rack-mounted server, and we want 4 displays on it (we use radmin.com remote software), but we don't want to physically attach 4 monitors to it.
BigD
|
Christian Studer 2008-08-07 08:45
To do this via software it would probably need to be supported by the display driver. I remember old Nvidia drivers having an option to always detect a monitor, but I haven't seen this option in any recent drivers.
I don't know if there's a hardware device which would trick the display driver into detecting a monitor.
Christian Studer - www.realtimesoft.com
|
BigD 2008-08-07 12:01
The cards are ATI, using Catalyst drivers. This concept seemed to work a while back on some other machine with other cards (I don't remember if they were Nvidia or ATI), but I don't remember any details.
|
David DeRolph 2008-08-09 07:31
"Is there a way to fake-out the computer into thinking that a monitor is attached, perhaps via a little software utility or other little trick."
What would be the point of this?
|
BigD 2008-08-11 11:46
We have a special application that requires at least 4 monitors and lots of pixels per monitor. The server is on a rack and we don't want to physically attach any monitors for space and cost reasons. In fact, we want the server to just have a power cable and ethernet cable plugged into the back and that's it.
We have multiple users who will access this computer, possibly at the same time, via our remote access software (radmin, in our case). If the remote users have the same multiple-monitor setup (2 x 2 in this case) and the same screen size, the connection will feel transparent: each remote screen displays on each local screen, with no window borders, scroll bars, etc.
Does that make sense?
|
ecarlson 2008-08-11 13:51
If the server is running something newer than Windows 2000, you could use Terminal Services (Remote Desktop) instead of RAdmin, along with the /span parameter (and the latest Remote Desktop client) to get the virtual client console to span all the local monitors. It won't even matter if the server only has an ancient single-output video card with 1 MB of memory, since it won't be using the server's video at all.
You won't be able to have multiple people seeing the same local console at the same time as you can with RAdmin, though you can access the local console if necessary using either the /console switch or the newer /admin switch (which supersedes it).
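For example, with a recent Remote Desktop client (SERVER01 is just a placeholder for your server's name):

mstsc /v:SERVER01 /span
mstsc /v:SERVER01 /admin

As far as I know, spanning only works cleanly when the local monitors all run the same resolution and are arranged side by side.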
- Eric, www.InvisibleRobot.com
|
BigD 2008-08-12 04:37
Thanks for the tip Eric, but we need one application to be visible to many people at the same time. Remote Desktop/Terminal Services is great for multiple separate desktops but is terrible for our needs. I sure wish there were a product that combined the best of both worlds: local desktop sharing and no reliance on server graphics hardware...and at a reasonable price for a small business.
|
Rick 2008-08-23 15:13
You can build, or have built, a VGA dummy load.
Using a DVI-to-VGA adapter, connect three 75-ohm resistors: from pin 1 to pin 6, pin 2 to pin 7, and pin 3 to pin 8. These are the red, green, and blue video lines.
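For reference, assuming the standard VGA (HD15) pinout, the wiring looks like this:

pin 1 (red video)   --[75 ohm]-- pin 6 (red return)
pin 2 (green video) --[75 ohm]-- pin 7 (green return)
pin 3 (blue video)  --[75 ohm]-- pin 8 (blue return)

The resistors mimic the 75-ohm termination of a real monitor's RGB inputs, which is what the card's load detection is looking for.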
RICK
|
BigD 2008-08-27 12:27
Thank you very much Rick. I'll give that a try.
BigD
|
BigD 2008-09-10 04:33
Rick,
I haven't tried the dummy load trick, but I know someone who has, and they said it will not maintain the necessary resolution. The resolution always goes back to 640x480. Have you seen this?
BigD
|