How does the GTX 690 Perform in 2017?

Views: 31315 | Rating: 4.58 | View Time: 5:02 Minutes | Likes: 328 | Dislikes: 30

Let's see how the GTX 690 has held up over the past five years.

Everybody Bounce – Waterflame

Waterflame Newgrounds Page

My Twitter:

Business Enquiries:
[email protected]

This Post Has 32 Comments

  1. Now I wonder how much the GTX Titan Z is still worth nowadays.

  2. New to the gaming PC world; I bought a used PC with an Intel Core 2 Quad QX6600 and a Radeon HD 6850. Is this a good upgrade? I'm struggling to get good FPS, so let me know any other suggestions. I want to leave the console world.

  3. I've had that card since it was released; I'm finally upgrading to a 1080 Ti 😀

    Yeah, bragging

    I've had a spectacular run with it, although about a year ago I had to switch from ultra settings to high, and from max resolution to 1900×1200 in windowed mode.

    Hence the upgrade. 🙂

    I will find a prime spot for it in my old-hardware resting place; who knows, maybe one day I'll rebuild a retro PC with it for old times' sake 🙂

    It has served me well. Definitely worth the money I paid. Think about it: for six years I had no worries about performance or where to upgrade next; everything was smooth. My programs, 3ds Max, SolidWorks, just worked at max settings. What a glorious card.

  4. I ran PUBG at all-ultra settings in 4K at just a little over 420 FPS.

  5. What the hell is this, when Ghost Recon with a GTX 1080 and an i7 4770K drops to 39 FPS?

  6. 74 FPS in The Witcher 3???? That's on par with a 1060, or better. I don't understand the world anymore. In Wildlands it beats my RX 580 Nitro. WTF

  7. Fake!! The Wildlands test shows 6GB of VRAM!!!!! The 690 doesn't have 6GB!!!!!

  8. Good review, but the music is soooo noisy.

  9. Just buy an R9 295X2. I bought one myself for $180 USD on eBay. It blows a GTX 1070 away: 4K gaming with near-1080 performance.

  10. Dude, this is a kick-ass card!! Why did they stop building these kinds of cards?!!?

  11. If they hadn't gimped the VRAM down to 2GB when they could've easily made it 4GB, this card would've been a serious problem for the 980; while the 980 does perform better than the 690, it isn't by a whole lot. If the 690 had 4GB, I'd imagine it would be at least neck and neck with the 980, meaning a GPU from two generations back could do what their current tech could. The biggest downside to the 690 is that it's an absolute power hog at 300W all by itself; running two in SLI is fucking 600W, which is more than a lot of modern power supplies are even rated to output. Hardware has become far more efficient in terms of TDP, which is why the average gaming PSU is now in the 450-650W range, as opposed to a few years ago when most power supplies were 850-1000W+. (A rough version of this power math is sketched below the comments.)

    That makes me think they intentionally handicapped the card to 2GB of VRAM; otherwise it just would've been too damned powerful. It already beat the 700-series flagship, but you could justify buying a 780 / 780 Ti over a 690 because back then 690 prices were still fairly high. By the time the 980 hit, the 690's price had sunk a lot, so I'm thinking maybe NVIDIA foresaw this possibility and gimped the card on purpose, realizing that unlocking its full potential could screw them financially later on. I STILL see rigs with 690s in SLI (technically quad-SLI, I suppose) just smashing games, though the 2GB of VRAM typically limits them to 1080p tops. Imagine this card at 4GB; truly amazing that this is tech from fucking 2012.

    I think if NVIDIA really wanted to, they could release a GPU that would stand the test of time for 12+ years or something, but it would likely be expensive and would fuck up their business model substantially.

  12. What should I buy: this card for $100-110 or a GTX 1050 for $135? Both used.

  13. I have to disagree about the frame buffer. I have tested cards from the HD 4870X2 and HD 5970 to an HD 5970 + HD 5870 CrossFire setup (3GB total), in both modern and older games, and they use 2GB or more depending on the game.

    I'm currently waiting on a GTX 590 with 3GB, and I'm going to do some testing to see if that card also uses all of its VRAM instead of just 1.5GB. (A simple way to log this is sketched below the comments.)

    So I don't really know about this "you can't use all the VRAM" claim that goes around.

    If so, please explain 🙂
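
A rough sanity check on the power math in comment 11, as a minimal sketch in Python. The 300W-per-card figure comes from the comment itself; the system draw and the headroom factor are illustrative assumptions, not measured values.

    # Rough PSU sizing for two GTX 690s in SLI (quad-SLI).
    # The GPU TDP comes from comment 11; the other figures are assumptions.

    GPU_TDP_W = 300        # per GTX 690, as stated in comment 11
    NUM_GPUS = 2           # "running two in SLI"
    SYSTEM_DRAW_W = 150    # assumed CPU + motherboard + drives
    HEADROOM = 1.4         # assumed margin so the PSU isn't run at its limit

    total_draw = GPU_TDP_W * NUM_GPUS + SYSTEM_DRAW_W
    recommended_psu = HEADROOM * total_draw

    print(f"Estimated peak draw:  {total_draw} W")            # 750 W
    print(f"Suggested PSU rating: {recommended_psu:.0f} W")   # 1050 W

Which lines up with the commenter's point: a dual-690 rig sits well above what a typical modern 450-650W gaming PSU is rated to deliver.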
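
And for anyone wanting to reproduce the kind of test described in comment 13, here is a minimal sketch that polls per-GPU VRAM usage with nvidia-smi while a game runs. The query flags are standard nvidia-smi options, but very old cards such as the GTX 590 have been dropped from current NVIDIA drivers, so treat this as illustrative rather than guaranteed to work on that hardware.

    import subprocess
    import time

    # Ask nvidia-smi for per-GPU memory usage; on a dual-GPU card such as
    # the GTX 690, each GPU shows up as its own entry with its own VRAM.
    QUERY = [
        "nvidia-smi",
        "--query-gpu=index,name,memory.used,memory.total",
        "--format=csv,noheader,nounits",
    ]

    def sample_vram():
        out = subprocess.check_output(QUERY, text=True)
        for line in out.strip().splitlines():
            idx, name, used, total = [f.strip() for f in line.split(",")]
            print(f"GPU {idx} ({name}): {used} MiB / {total} MiB used")

    if __name__ == "__main__":
        # Sample once per second while the game is running; stop with Ctrl+C.
        while True:
            sample_vram()
            time.sleep(1)

If both GPUs report their buffers filling to the same level, that is consistent with SLI/CrossFire mirroring the frame buffer rather than pooling it, which is the usual explanation behind the "you can't use all the VRAM" claim.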
