I got a new monitor.

Discussion in 'Science and Technology' started by Rincewiend, Oct 10, 2009.

  1. Rincewiend

    Rincewiend Admiral Admiral

    Joined:
    Mar 7, 2001
    Location:
    The Netherlands
    An iiyama PLE2208HDS, 22" widescreen...
    Native resolution is 1920 x 1080, which is a 16:9 ratio...
    My question is: what 16:9 resolution should I pick for games to keep a nice framerate?


    I have it hooked up to my graphics card with a DVI cable...
    The card is a GeForce 9500 GT...
    And I run Vista with 4GB of RAM...
     
  2. Supreme Admiral

    Supreme Admiral Captain

    Joined:
    May 22, 2009
    Location:
    North Carolina
    It's Full HD, yes?
     
  3. Rincewiend

    Rincewiend Admiral Admiral

    Joined:
    Mar 7, 2001
    Location:
    The Netherlands
    1080p, cost me €169.00.
    So I guess if I pick the max resolution in games that support it, I'll be playing them in HD?!?
     
  4. sojourner

    sojourner Admiral In Memoriam

    Joined:
    Sep 4, 2008
    Location:
    Just around the bend.
    Resolution vs. framerate depends on the game. World of Warcraft? Probably max resolution. How much RAM does your video card have?
     
  5. Brent

    Brent Admiral Admiral

    Joined:
    Apr 24, 2003
    Location:
    TARDIS
    A 9500 GT is not a powerful video card; you might find it difficult to play newer games at 1920x1080. Like the above poster stated, older games like WoW would be able to do it easily, but with newer games you may not be able to run at the native res with that card.

    You'll see what's available in games when you launch them, though.

    Going back just now and looking at the specs on a 9500 GT, it is really slow. I doubt you'll be able to play many games well at the native resolution, except maybe very old ones.
     
  6. Rincewiend

    Rincewiend Admiral Admiral

    Joined:
    Mar 7, 2001
    Location:
    The Netherlands
    It's from EVGA, here is a review for it:
    http://www.youtube.com/watch?v=WJPkJyDmH10

    And I wasn't planning on playing games at that resolution; that's why I asked what lower 16:9 resolutions would be better...
    I needed a new monitor, but a new graphics card can wait a couple of months...
     
  7. Brent

    Brent Admiral Admiral

    Joined:
    Apr 24, 2003
    Location:
    TARDIS
    That review is crap; here are some better reviews that might give you an idea of performance at several resolutions:

    http://www.guru3d.com/article/geforce-9500-gt-review/1

    http://www.legitreviews.com/article/760/1/

    http://www.pcper.com/article.php?aid=596&type=expert

    http://www.anandtech.com/video/showdoc.aspx?i=3401

    Like I said, games will show you what resolutions are available for your display ratio when you launch them. In a lot of games you can select the format (16:9) and it will show the resolutions available at that ratio in-game; just pick one lower than 1920x1080 and see how performance goes.

    It also depends on the games you will be playing obviously.
     
  8. Brent

    Brent Admiral Admiral

    Joined:
    Apr 24, 2003
    Location:
    TARDIS
    BTW, here is a list of 16:9 resolutions I just found through googling:

    List of 16:9 widescreen resolutions:
    852x480
    1280x720
    1365x768
    1600x900
    1920x1080
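If you want to double-check a list like that yourself, here's a quick Python sketch that reduces each resolution to its simplest ratio. Worth knowing: a couple of the common "16:9" entries (852x480 and 1365x768) are actually only approximately 16:9, because the exact 16:9 widths (853.33 and 1365.33) don't land on whole pixels.

```python
from math import gcd

# Reduce each resolution to its simplest aspect ratio to see
# which entries in the list are exactly 16:9.
def aspect(w, h):
    g = gcd(w, h)
    return (w // g, h // g)

for w, h in [(852, 480), (1280, 720), (1365, 768), (1600, 900), (1920, 1080)]:
    ratio = aspect(w, h)
    print(f"{w}x{h}: {ratio[0]}:{ratio[1]}")
```

1280x720, 1600x900, and 1920x1080 come out as exactly 16:9; the other two are rounded approximations, which is fine in practice.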
     
  9. Mr. B

    Mr. B Vice Admiral Admiral

    Joined:
    Dec 28, 2002
    Location:
    New Orleans
    Anything other than the native resolution of any LCD display is going to appear rather sub-standard.
     
  10. Brent

    Brent Admiral Admiral

    Joined:
    Apr 24, 2003
    Location:
    TARDIS
    This I agree on. When you use a lower resolution at the same ratio and have the image stretched across the screen, it will indeed appear more blurry. If you have to run at a lower resolution than native, I suggest selecting "Do Not Scale" in the NVIDIA Control Panel under "Adjust desktop size and position". This way the game won't be stretched across the screen at a lower resolution, but it does mean the game will be smaller than the display. Another option is to run in windowed mode.
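To give an idea of how much smaller an unscaled image ends up, here's the quick arithmetic, using 720p on a 1080p panel as an example (the specific resolutions are just for illustration):

```python
# Fraction of a 1920x1080 panel's pixels covered by an unscaled 1280x720 image.
native_w, native_h = 1920, 1080
game_w, game_h = 1280, 720

frac = (game_w * game_h) / (native_w * native_h)
print(f"{frac:.1%} of the panel area")  # 44.4% of the panel area
```

So with "Do Not Scale", unscaled 720p fills less than half the pixels of a 1080p panel, with black borders around it.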
     
  11. Rincewiend

    Rincewiend Admiral Admiral

    Joined:
    Mar 7, 2001
    Location:
    The Netherlands
    My old monitor was 1280x1024...
    Games ran well at that res, so it looks like I will have to go with 1365x768 or 1440x900...
    Hm, just looked, but if I want a new, more powerful graphics card or a second one (another 9500 GT for SLI), I will also need a new PSU, since I have a 350W one right now...
    So that won't happen this year...

    1440x900 still looks sharp and crisp, so I think I will go with that...
     
    Last edited: Oct 11, 2009
  12. Brent

    Brent Admiral Admiral

    Joined:
    Apr 24, 2003
    Location:
    TARDIS
    That is indeed a low-wattage PSU these days. I would suggest not even bothering with SLI on a 9500 GT; if you really want an upgrade, I would suggest a newer PSU and then a single graphics card of the latest generation.

    Have fun gaming with your new display, let us know how it goes.
     
  13. Subspace

    Subspace Lieutenant Junior Grade Red Shirt

    Joined:
    Sep 3, 2009
    :drool:
    FULL HD MMMMMMM, Delish.
     
  14. DiSiLLUSiON

    DiSiLLUSiON Commodore Commodore

    Joined:
    Apr 7, 2004
    Location:
    The Netherlands
    That does not have to be true. Some games, like World of Warcraft, have an inferior engine that isn't well optimized, while other games (even newer ones) might have better engines. This translates into better framerates and higher resolutions.
     
  15. Brent

    Brent Admiral Admiral

    Joined:
    Apr 24, 2003
    Location:
    TARDIS
    That isn't how it works.
     
  16. DiSiLLUSiON

    DiSiLLUSiON Commodore Commodore

    Joined:
    Apr 7, 2004
    Location:
    The Netherlands
    It cannot be denied that it's part of the equation. So in part, it is.

    Take two competing games, for example World of Warcraft and Guild Wars. World of Warcraft pushes far fewer vertices and effects onto the screen at any given moment than Guild Wars, yet its system requirements are more demanding. On a completely identical system, screen resolution and effects begin to impact the framerate in World of Warcraft far sooner than in Guild Wars, making it possible in Guild Wars to run a higher resolution with better effects. In other words: with Guild Wars you get more for less, all due to how well the engine is optimized.

    You can test this for yourself. Take a mid-range graphics card, set all game settings to maximum, turn off your video card's anti-aliasing, and try the highest resolution that still has a playable framerate. If the game engine had no effect, every game would begin to stutter at exactly the same resolution, or you would see a clear line where a graphically more complex game hits a lower limit. But that's not always the case.

    As such, saying that newer (and/or more complex) games always need better video cards is a generalization. In general it is true, but there are huge differences between games.
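That test-it-yourself approach boils down to measuring average framerate at each resolution. A minimal sketch of how you could time that in Python, using a sleep as a stand-in for a real render call (the 10 ms per-frame workload is just an assumption for illustration):

```python
import time

def measure_fps(render_frame, seconds=1.0):
    """Call render_frame in a loop for `seconds` and return the average FPS."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < seconds:
        render_frame()
        frames += 1
    return frames / (time.perf_counter() - start)

# Stand-in workload: ~10 ms per "frame", so roughly 100 fps at best.
fps = measure_fps(lambda: time.sleep(0.010))
print(f"~{fps:.0f} fps")
```

Run the same loop at each resolution and compare the averages; the engine with the higher number at the same settings is doing more with the same hardware.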
     
    Last edited: Oct 12, 2009
  17. Brent

    Brent Admiral Admiral

    Joined:
    Apr 24, 2003
    Location:
    TARDIS
    I think it's incorrect to call the WoW engine "inferior" and "unoptimized". The engine is old and simple, and given how long it has been out, it is very optimized by now. I also think it's incorrect to say that just because a newer engine is in use, it will instantly have better framerates because it's better "optimized". Most games I've played prove that wrong. Newer engines allow newer effects, game devs take advantage of this, and newer graphics cards can push them faster. That's a simplistic explanation, but it works generally.