
Connecting PC by HDMI?

Crewman47

I've got a GeForce GT220 with an HDMI output and my monitor has an HDMI input. I was wondering if anyone has their PC connected to their monitor this way and sees much of a difference compared with a standard VGA connection, especially with video and graphics on the Web, video games or any other apps you might run?

Also, I have tried it myself and noticed that for resolution settings it'll give you various options at 480p, 576p, 720p and one 1080i, but I find that the refresh rates I choose make text and images too sharp and too shaky. It's a 22-inch widescreen monitor, by the way.
 
Between HDMI and SVGA connections you should certainly see a difference.

As for the refresh rates, either you're selecting ones that are too high for your monitor/card to handle, or ones at which your eyes can still perceive flicker. Just select the one your monitor supports that looks best to you.
 
Er, if it's an LCD it should only be running at 60 Hz. Additionally, you absolutely should not be outputting an interlaced resolution (1080i) to a PC monitor.
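
If you're on Windows, you can check exactly which modes the driver exposes and weed out the interlaced ones yourself. Here's a rough Python sketch using the pywin32 package (assuming you have it installed; the DEVMODE field names and the DM_INTERLACED flag come from pywin32's wrapping of the Win32 API, so treat it as a starting point rather than gospel):

```python
# Rough sketch: list the progressive (non-interlaced) display modes
# Windows reports for the primary monitor. Requires the pywin32 package.
import win32api
import win32con

def progressive_modes():
    """Yield (width, height, refresh_hz) for every non-interlaced mode."""
    i = 0
    while True:
        try:
            dm = win32api.EnumDisplaySettings(None, i)  # None = primary display
        except win32api.error:
            break  # pywin32 raises once the mode index runs out
        i += 1
        if dm.DisplayFlags & win32con.DM_INTERLACED:
            continue  # skip 1080i-style interlaced modes
        yield dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency

if __name__ == "__main__":
    for w, h, hz in sorted(set(progressive_modes())):
        print(f"{w}x{h} @ {hz} Hz")
```

Whatever shows up in that list at 60 Hz is what the panel can genuinely do over the connection you're using.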
 
DVI and HDMI are the same signal, just a different connector (and HDMI can carry audio as well). However, your video card might only output standard TV resolutions over the HDMI connection, since that is what that connection is intended for. Therefore, you might not be able to run your monitor at the appropriate resolution if you use HDMI.

Concerning VGA, I've tried using both DVI and VGA with my monitor and any differences are minimal.
 
Not quite - DVI supports both analogue (VGA) and digital signals.

In any setup, one or the other or both signals may be present in the cable.

If you see no difference between using the VGA and DVI cables, then it may be that you're just passing the same analogue signal through the DVI cable.

With any LCD monitor, the most important thing is to match the panel's native resolution and refresh rate at true 32-bit colour. If you don't do that, the image will be blurry or fuzzy, and a digital signal won't show any improvement over the analogue VGA. :)

Also, most LCD monitors have a sharpness control alongside contrast and brightness, which you can reduce slightly to make text more comfortable on the eye.
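
On the Windows side, you can also set that native mode explicitly rather than trusting the driver's default. A minimal sketch with pywin32 (again assuming the package is installed; the 1680x1050 @ 60 Hz figures are just a guess at a typical 22-inch widescreen panel, so substitute your own native numbers):

```python
# Sketch: switch the primary display to a given mode, testing it first.
# Requires pywin32. 1680x1050 @ 60 Hz / 32-bit is only an example value.
import win32api
import win32con

NATIVE_W, NATIVE_H, NATIVE_HZ, BPP = 1680, 1050, 60, 32

# Start from the current settings so the rest of the DEVMODE stays valid.
dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
dm.PelsWidth = NATIVE_W
dm.PelsHeight = NATIVE_H
dm.DisplayFrequency = NATIVE_HZ
dm.BitsPerPel = BPP
dm.Fields = (win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT |
             win32con.DM_DISPLAYFREQUENCY | win32con.DM_BITSPERPEL)

# CDS_TEST validates the mode without applying it; only switch if it passes.
if win32api.ChangeDisplaySettings(dm, win32con.CDS_TEST) == win32con.DISP_CHANGE_SUCCESSFUL:
    win32api.ChangeDisplaySettings(dm, 0)
else:
    print("That mode isn't available over this connection.")
```

If the test fails over HDMI but passes over DVI, that's a good sign the card really is restricting HDMI to TV modes.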
 
Not quite - DVI supports both analogue (VGA) and digital signals.

In any setup, one or the other or both signals may be present in the cable.

Totally correct, but I'd be surprised if a GeForce GT220 card carried a DVI-A signal alongside the DVI-D signal.

But I digress.
 
Not quite - DVI supports both analogue (VGA) and digital signals.

In any setup, one or the other or both signals may be present in the cable.

If you see no difference between using the VGA and DVI cables, then it may be that you're just passing the same analogue signal through the DVI cable.
Yes, it's true that the DVI connector can carry the VGA signal as well. However, that's typically only used when using a DVI-VGA adapter. In my case, the only way I can tell a difference is in the alignment of the image while the system is booting. If it's VGA, it's not usually properly centered, but with DVI it is.

 