Ed 2013-08-07 08:50
I've heard so much about UltraMon that I decided to pair an old 19" monitor with my new one on an XP system, but I can't get it working.
The new monitor is not recognised by UltraMon.
It is a generic video card (I'm not that concerned about great graphics, as I use the computer for writing) and it came with the adapters. I have tried all the usual strategies of swapping leads: the old and new monitors both work on the VGA port, but not on the DVI port. So I tried another video card (with up-to-date drivers) which I know to be good, and the same thing happened: VGA okay with both monitors, but neither monitor works on DVI. I'm no expert, but I think I've learned my way around computers over ten years or so of using them and have fixed most things; here, though, I'm stumped.
I read that a multi-monitor setup would not work with on-board graphics, and that on-board graphics would need to be disabled if present. I'm fairly sure I don't have that, as I reasoned there would be a VGA socket on the motherboard itself if I did - or am I wrong there? Could the problem be somewhere in this area?
With your vast experience I'm sure you have come across every instance that would prevent UltraMon from working; I hope you have come across this one and can help me.
Cheers,
Ed
|
Christian Studer 2013-08-07 12:19
Does the monitor have a DVI port as well so that you can connect it directly to the video card's DVI port, or are you using a DVI-to-VGA adapter?
Christian Studer - www.realtimesoft.com
|
Ed 2013-08-07 13:10
It has both.
Ed
|
Christian Studer 2013-08-07 13:34
If the monitor still doesn't work when it is connected via its DVI port to the video card's DVI port, go to Display Properties or UltraMon menu > Display Settings and make sure it is enabled.
This is either a configuration or hardware issue; UltraMon can't help, as basic multi-monitor functionality is handled by the operating system. You don't need UltraMon to use two monitors; it adds additional multi-monitor features on top of what the operating system provides.
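If you want to see what the operating system itself has detected, one way (beyond Display Properties) is the EnumDisplayDevices Win32 call. Here's a minimal C sketch of that check; it assumes you have a Win32 compiler around, which may well not be the case, so treat it as optional diagnostics rather than a required step:

    /* Lists the display devices Windows knows about and whether each
       is attached to the desktop. Build with e.g. cl listdisp.c user32.lib */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DISPLAY_DEVICE dd;
        DWORD i;

        for (i = 0; ; i++) {
            dd.cb = sizeof(dd);   /* must be set before each call */
            if (!EnumDisplayDevices(NULL, i, &dd, 0))
                break;
            printf("%lu: %s (%s)%s\n", i, dd.DeviceName, dd.DeviceString,
                   (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
                       ? " [attached to desktop]" : " [not attached]");
        }
        return 0;
    }

If the second output never shows up in that list even after a reboot, the problem is below UltraMon, in the driver or the hardware.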
Christian Studer - www.realtimesoft.com
|
Ed 2013-08-07 13:39
I can't enable it because it doesn't show up.
Ed
|
Ed 2013-08-07 13:45
Sorry, I meant to say that I thought it would be a simple job to hook up the second monitor, and if it was, I thought I'd have real fun and add my original 15" monitor as well (as I have a spare VGA-only graphics card), making a three-monitor system.
Ed
|
Christian Studer 2013-08-07 14:32
Not sure what might cause this. If you have a DVI-to-VGA adapter around, I would check whether the second monitor works when connected via its VGA port to the video card's DVI port.
Christian Studer - www.realtimesoft.com
|
DT 2013-08-07 17:01
Have you tried looking in the BIOS settings? Perhaps something in there needs to be enabled?
|
Ed 2013-08-08 01:11
Hi,
I've tried the whole combination of leads, both the DVI-to-VGA adapter connection and a straight DVI-to-DVI cable, but the second screen doesn't even flicker.
I'm always a little unsure of meddling with the BIOS. Any idea what I should be looking for?
Ed
|
Christian Studer 2013-08-08 08:38
Did you reboot after changing the monitor cabling? Windows may only detect monitors if they're connected and turned on during startup.
You also may not get a signal immediately after connecting a monitor, as the monitor may need to be enabled via Display Properties first.
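For reference, the 'enable via Display Properties' step is equivalent to an attach through the ChangeDisplaySettingsEx API. A minimal C sketch of that call follows; the device name \\.\DISPLAY2 and the 1280-pixel primary width are assumptions you would need to adjust for your system:

    /* Attaches the assumed secondary display \\.\DISPLAY2 to the desktop,
       positioned to the right of a primary assumed to be 1280 px wide. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DEVMODE dm;
        LONG res;

        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);
        dm.dmFields = DM_POSITION;   /* DM_POSITION attaches the display */
        dm.dmPosition.x = 1280;      /* adjust to your primary's width   */
        dm.dmPosition.y = 0;

        /* Stage the change in the registry, then apply staged changes */
        res = ChangeDisplaySettingsEx("\\\\.\\DISPLAY2", &dm, NULL,
                                      CDS_UPDATEREGISTRY | CDS_NORESET, NULL);
        if (res == DISP_CHANGE_SUCCESSFUL)
            res = ChangeDisplaySettingsEx(NULL, NULL, NULL, 0, NULL);
        printf("result: %ld\n", res);
        return 0;
    }

This only works if Windows has detected the monitor in the first place, so it won't fix a detection problem, but it is a way to rule out the enable step.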
With only a single video card I don't think you'll need to change BIOS settings; usually this is only necessary if you have multiple video cards and want to change which one is treated as the primary card.
Even if the system had an onboard video card you shouldn't have to change settings: the onboard card either gets disabled when you install the video card, or both cards work fine.
Christian Studer - www.realtimesoft.com
|
Ed 2013-08-09 15:31
I've tried everything that has been suggested here, and I thank you for your help. However, a family emergency has just taken precedence, so I will have to leave this for a while. Thanks again.
Ed
|